Abstract

Objectives

To examine how residents are trained and assessed in musculoskeletal US (MSUS) and in MSUS-guided and landmark-guided joint aspiration and injection, and to present the available assessment tools together with their supporting validity evidence.

Methods

A systematic search of PubMed, the Cochrane Library and Embase was conducted in accordance with the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines, and studies published from 1 January 2000 to 31 May 2021 were included. Two independent reviewers performed the search and data extraction. The studies were evaluated using the Medical Education Research Quality Instrument (MERSQI).

Results

A total of 9884 articles were screened and 43 were included: 3 randomized studies, 21 pre- and post-test studies, 16 descriptive studies and 3 studies developing assessment tools. The studies used various theoretical training modalities, e.g. lectures, anatomical quizzes and e-learning. The practical training models varied from mannequins and cadavers to healthy volunteers and patients. The majority of studies used subjective ‘comfort level’ for assessment; others used practical and/or theoretical examinations. All training programs increased trainees’ self-confidence, theoretical knowledge and/or practical performance; however, few used validated assessment tools to measure the effect. Only one study met the MERSQI high methodological quality cut-off score of 14.

Conclusion

The included studies were heterogeneous, and most were of poor methodological quality and not based on contemporary educational theories. This review highlights the need for educational studies using validated theoretical and practical assessment tools to ensure optimal MSUS training and assessment in rheumatology.

Rheumatology key messages
  • Studies examining training and assessment of musculoskeletal US and injection competences have a low level of evidence.

  • No validated rating scale assessing competencies in US-guided joint injections exists.

  • International uniform competency-based training programs using validated assessment tools are needed.

Introduction

The use of musculoskeletal US (MSUS) has expanded significantly during recent decades, and MSUS is widely implemented as a clinical tool for the diagnosis, monitoring and treatment of patients with rheumatological diseases [1–3]. MSUS also provides guidance for invasive procedures such as joint aspirations and injections, which are core procedures for a rheumatologist [4]. Joint aspirations and injections can be performed landmark-guided or MSUS-guided. However, MSUS-guided procedures are more difficult to learn, and the quality and value of MSUS, including MSUS-guided aspiration and injection, are highly dependent on the competencies of the individual operator [5, 6].

Mastering MSUS and MSUS-guided aspiration and injection requires theoretical knowledge and extensive hands-on training [7–9]. Additionally, hand–eye coordination training ensures the ability to control the US probe and the needle at the same time [4, 7]. Evidence-based and structured training is essential when residents acquire MSUS competences, and MSUS training programs are already available through different scientific societies of rheumatology [10, 11]. The programs typically require both theoretical and practical skills in the competency assessment but differ in the minimum number of MSUS examinations required to obtain a given competency level [12, 13]. Because learning curves differ between trainees, a pre-defined number of examinations may not result in a sufficient competence level, or may exceed the training actually needed [5, 14]. A more individualized approach, in which certification is granted based on an assessment of competences, would be efficient and could ensure that all trainees reach the necessary level of proficiency before proceeding to independent practice.

Once training is completed, it is crucial to certify resident competences [6, 15] using dedicated assessment tools supported by validity evidence, to ensure reliable assessments with minimal bias [16–21].

However, before implementing an assessment tool it is essential to examine different sources of validity, thereby ensuring that the tool measures what it is supposed to measure. The classic validity framework comprising construct, content and criterion validity has been replaced by Messick’s modern, unified framework [22]. Messick’s framework is recommended as the standard by the American Educational Research Association and is one of the most described and recognized frameworks for validity testing in medical education and procedural training [23]. It consists of five sources of validity evidence: content, response process, internal structure, relations to other variables and consequences [24]. Ideally, all five sources of validity evidence should be examined before using a new assessment tool.
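For illustration, the five sources can be treated as a simple audit checklist when considering a new assessment tool. The sketch below is our own minimal example (the function name and the example evidence set are hypothetical, not taken from any published tool):

```python
# Minimal, illustrative sketch (not from any published tool): treating
# Messick's five sources of validity evidence as an audit checklist for
# a new assessment tool.

MESSICK_SOURCES = (
    "content",
    "response process",
    "internal structure",
    "relations to other variables",
    "consequences",
)

def missing_validity_evidence(examined: set[str]) -> list[str]:
    """Return the sources of validity evidence not yet examined."""
    return [source for source in MESSICK_SOURCES if source not in examined]

# Hypothetical example: a new rating scale for which only content validity
# and internal structure have been examined so far.
examined_sources = {"content", "internal structure"}
print(missing_validity_evidence(examined_sources))
# -> ['response process', 'relations to other variables', 'consequences']
```

Such a checklist only tracks coverage; each missing source would still need to be examined through an actual validation study.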

The importance of international harmonization in MSUS education has been highlighted in several studies, including the need for standardized training programs comprising competency-based training, objective assessment and certification [1, 7, 9, 25–30]. Comprehensive work has already been done to promote this harmonization across several European countries [10, 11, 28, 31].

In this review we sought to answer the following research questions:

  • In residents, is systematic training of MSUS, MSUS-guided and landmark-guided joint aspiration and injection skills in comparison with no formal training associated with improved knowledge, skills or behaviour?

  • What educational content and types of competencies are trained to achieve MSUS, MSUS-guided and landmark-guided joint aspiration and injection skills?

  • What tools are available to assess competences of residents in MSUS, MSUS-guided and landmark-guided joint aspiration and injection, and what validity evidence supports them?

The overall aim of this systematic literature review was to examine how residents are trained and assessed in MSUS, including MSUS-guided and landmark-guided joint aspiration and injection. Additionally, we aimed to present the available assessment tools and examine their supporting validity evidence.

Methods

Design, eligibility criteria and study selection

We followed the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines [32] and illustrated the selection process in a flow diagram according to their recommendations (Fig. 1). We included all original learning or educational studies on how residents are trained and assessed in MSUS, including MSUS-guided and landmark-guided joint aspiration and injection. Moreover, studies reporting the development, evaluation or implementation of tools to assess resident competencies in MSUS, MSUS-guided and landmark-guided joint aspiration and injection were included. Studies were excluded only if they (i) were published in a non-English language or (ii) were abstracts, case reports, case series, letters or review articles. Two authors (S.M.D.C. and M.J.V.) independently identified all relevant articles by screening titles and abstracts and read all potentially eligible studies in full. Any disagreement was resolved by discussion. Additionally, the reference lists of included studies were examined for further eligible studies.

Fig. 1: Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) flow diagram of the study selection process.

Search strategy

We searched PubMed, Embase and the Cochrane Library using a search strategy developed in collaboration with a research librarian from the medical research library of Copenhagen, Denmark. We included studies from the past two decades, i.e. studies published from 1 January 2000 to 31 May 2021. The search strategy is available as supplementary material (supplementary Fig. S1, available at Rheumatology online). The Covidence systematic review software (Veritas Health Innovation, Melbourne, Australia) was used to manage the identified studies, to search for duplicates and to screen abstracts.

Data collection and extraction

Data extraction was performed independently by the two authors using a data extraction chart in Microsoft Excel (Microsoft Corp., Redmond, WA, USA) with the following headings: study design, participants, theoretical and practical training, practical training models, assessment, outcome measures, statistical analysis methods and study conclusion.

Assessment of methodological quality

The studies were evaluated using the Medical Education Research Quality Instrument (MERSQI), which was developed and validated in 2007 to evaluate the methodological quality of medical educational research [33, 34]. The MERSQI includes six domains: study design, sampling, type of data (subjective or objective), validity, data analysis and outcomes. The maximum MERSQI score is 18 points. A study with a mean MERSQI score below 9 is predicted to be rejected by journal editors, whereas a mean score of 10.7 indicates likely acceptance. Moreover, studies with a score of 14 or above are rated as ‘high quality’ [33, 35–37].
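To make the scoring arithmetic concrete, the following minimal sketch (our own illustration; the domain scores shown are hypothetical, and the ‘borderline’ label for scores between 9 and 10.7 is our convenience, not part of the instrument) totals the six domain scores and classifies a study against the published cut-offs:

```python
# Minimal, illustrative sketch (our own, hypothetical): totalling the six
# MERSQI domain scores (maximum 18 points) and applying the published
# cut-offs: <9 predicted editor rejection, >=10.7 predicted acceptance and
# >=14 rated 'high quality'.

MERSQI_DOMAINS = ("study design", "sampling", "type of data",
                  "validity", "data analysis", "outcomes")

def classify_mersqi(domain_scores: dict[str, float]) -> tuple[float, str]:
    """Sum the six domain scores and classify against the cut-offs."""
    missing = set(MERSQI_DOMAINS) - set(domain_scores)
    if missing:
        raise ValueError(f"missing MERSQI domains: {sorted(missing)}")
    total = sum(domain_scores.values())
    if total >= 14:
        label = "high quality"
    elif total >= 10.7:
        label = "predicted editor acceptance"
    elif total >= 9:
        label = "borderline"
    else:
        label = "predicted editor rejection"
    return total, label

# Hypothetical single-group pre- and post-test study scored on each domain:
example_study = {"study design": 1.5, "sampling": 1.5, "type of data": 3,
                 "validity": 0, "data analysis": 2, "outcomes": 1.5}
print(classify_mersqi(example_study))  # -> (9.5, 'borderline')
```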

Results

After the removal of duplicates, a total of 9884 articles were screened for eligibility, and 76 papers were read in full; the articles excluded at screening covered topics other than training and assessment of competences in MSUS, MSUS-guided or landmark-guided aspiration and injection. Of the 76 full-text papers, 33 were excluded because they did not meet the inclusion criteria or because only the conference abstracts were available (Fig. 1). No further studies were identified from the reference lists of included studies. Of the 43 included studies, 3 examined MSUS-guided aspiration and injection training [38–40], 14 examined MSUS training [41–54], 23 examined landmark-guided aspiration and injection training [55–77], and 3 described the development of assessment tools [78–80].

The study design, participants, practical training models and assessment are described in Table 1 and in more detail in supplementary Table S1, available at Rheumatology online. Studies developing and validating assessment tools are described in Table 2. Study outcome measures, statistical analysis methods, study conclusions and individual MERSQI evaluations are described in supplementary Table S2 (available at Rheumatology online), whereas a concise MERSQI evaluation comparing MSUS, MSUS-guided and landmark-guided joint aspiration and injection studies is presented in Table 3.

Table 1

Description of included studies examining training of MSUS, MSUS-guided or landmark-guided joint aspiration and injection skills

Study characteristics | MSUS | MSUS-guided | Landmark-guided
No. of studies^a | 14 | 3 | 23
Study design
 Randomized | 0 | 0 | 3
 Pre- and post-test | 2 | 2 | 17
 Descriptive | 12 | 1 | 3
Participants^b
 Residents | 408 | 38 | 1388
 Experts | 18 | – | –
 Medical students | – | – | 149
 Others | – | – | 47
Practical training models^c
 Patients | 6 | 1 | –
 Healthy volunteers | 1 | – | –
 Images from patients | 1 | – | –
 Low fidelity models | – | 1 | –
 Cadavers | – | 1 | 2
 Mannequins | – | – | 11
 Mixed | 6 | – | 10
Assessment^c
 Hands-on skills | 7 | 1 | –
 Theoretical | – | – | 2
 Questionnaire | 2 | 1 | 13
 Mixed | 5 | 1 | 8

^a Studies developing assessment tools are not included in this table. ^b Accumulated number of participants enrolled in the studies. ^c Number of studies. MSUS: musculoskeletal US.

Table 2

Description of included studies developing assessment tools to evaluate practical MSUS, MSUS-guided or landmark-guided joint aspiration and injection skills

Study characteristics | MSUS | MSUS-guided | Landmark-guided
No. of studies | 1 | 1 | 1
Type of assessment tool
 Rating scale | x | – | x
 Checklist | – | x | –
Sources of validity
 Messick’s (5 sources)^a | – | – | –
Other validity sources
 Content | x | x | x
 Face | – | – | x
 Concurrent | x | – | x
 Construct | x | – | x
 Predictive | – | – | x

^a Messick’s framework is recommended as the standard by the American Educational Research Association and includes five sources of validity: content, response process, internal structure, relations to other variables and consequences. MSUS: musculoskeletal US.

Table 3

Assessment of methodological quality of the included studies using the Medical Education Research Study Quality Instrument (MERSQI), which was developed to appraise the methodological quality of studies of medical education

Item | Options and scores^a | MSUS^b | MSUS-guided^b | Landmark-guided^b
Study design | Single-group cross-sectional or single-group post-test only (1); single-group pre-test and post-test (1.5); nonrandomized, 2 group (2); randomized controlled trial (3) | 1.0 | 1.2 | 1.5
Sampling | Institutions: 1 institution (0.5); 2 institutions (1); >2 institutions (1.5). Response rate (%): not applicable; <50% or not reported (0.5); 50–74% (1); ≥75% (1.5) | 1.1 | 0.5 | 1.4
Type of data | Assessment by study participant (1); objective (3) | 2.7 | 1.7 | 1.8
Validity evidence for evaluation instrument scores | Not applicable; internal structure (1); content (1); relationships to other variables (1) | 0.4 | 0 | 0.2
Data analysis | Appropriateness of analysis: data analysis appropriate for study design and type of data (1). Complexity of analysis: descriptive analysis only (1); beyond descriptive analysis (2) | 2.4 | 3 | 2.8
Outcome | Satisfaction, attitudes, perceptions, opinions, general facts (1); knowledge, skills (1.5); behaviours (2); patient/healthcare outcome (3) | 1.4 | 1.2 | 1.2
Total score | Maximum 18 | 9.2 | 7.5 | 8.9

^a Max score for each item is 3, which gives a total max score of 18.
^b Mean score for each MERSQI item in studies examining training and assessment of MSUS (n = 14), MSUS-guided (n = 3) or landmark-guided joint aspiration and injection skills (n = 23). MSUS: musculoskeletal US; MERSQI: Medical Education Research Quality Instrument.

Study design and participants

Of the 43 included studies, 16 were descriptive, 21 were pre- and post-test studies and only 3 were randomized; the three randomized studies all examined landmark-guided aspiration and injection training (Table 1). The remaining three studies developed new assessment tools. The participants in the included studies were residents in rheumatology, family medicine and internal medicine with varying experience levels, and the number of participants per study ranged from 1 to 474 (supplementary Table S1, available at Rheumatology online).

Theoretical skill training

Training of theoretical skills in MSUS, MSUS-guided and landmark-guided aspiration and injection was heterogeneous across the included studies. The theoretical part of the training curriculum used many different modalities, e.g. reading material, lectures, videos, anatomical quizzes, interactive games or e-learning (supplementary Table S1, available at Rheumatology online). Furthermore, the time devoted to the theoretical part of the curriculum varied from 30 min [39] to 5 h [68].

E-learning was used in one study, by Filippucci et al. [52]: after a 3-day course followed by 6 months of web-based tutoring, agreement on image interpretation (i.e. synovitis yes/no) between the trainees and their trainer improved.

Practical skill training

Training of practical skills in MSUS, MSUS-guided and landmark-guided aspiration and injection was also heterogeneous. The practical part of the training curriculum varied between direct or indirect supervised training, individual or team-based training, and practical demonstrations on patients, healthy models, mannequins or cadavers. In addition, the time devoted to the practical part of the curriculum varied from a 1-day course [38, 57, 67] to 10 months of clinical practice [51].

The practical training models

MSUS-guided aspiration and injection training

The three studies examining MSUS-guided aspiration and injection training used patient models, fresh frozen cadaver models or a low fidelity model (Table 1). Charnoff et al. presented a low fidelity model [38] made from agar powder dissolved in water, with a chicken leg simulating muscle and tendon and two tomatoes simulating blood vessels. The participants were required to identify structures in the agar model using the US equipment and perform an MSUS-guided procedure. They were then given a short lecture and instructed to complete the same task again. The residents demonstrated improved MSUS-guided skills; however, no validated assessment tools were used.

MSUS training

The 14 studies examining MSUS training used healthy models, patients, US images from patients or a combination of these (Table 1). One study, by Miguel et al., used images from patients with SpA and from normal controls in its training program [47]; the residents showed improved image interpretation after the theory session (30 min) and the reading session (60 min).

Landmark-guided aspiration and injection training

Twenty-three studies examined landmark-guided aspiration and injection training, and the majority used mannequins (n = 19, 83%). Three of these studies used a combination of mannequins and cadavers [55, 63, 76], and three used a combination of mannequins and patients in their training programs [68, 70, 74]. Stroud et al. presented a study examining hybrid simulation in knee arthrocentesis training [75]. This was the only study training residents in both the technical aspects of the procedure and the communicative aspects, using standardized patients. The residents’ performances were video-taped and assessed by two independent physicians using a procedure-specific modified version of the direct observation of procedural skills (DOPS) rating form, which has established evidence of validity [81]. The communicative skills were rated by both the physicians and the standardized patient.

Three randomized studies examined the efficacy of different training curricula on resident performance and self-confidence. Michels et al. compared three training programs using residents in general practice as participants [55]. The training program included five anatomical sites: glenohumeral, subacromial, lateral epicondyle, carpal tunnel and the knee. Skills improved when a theoretical lecture was followed by hands-on training, but no statistically significant differences in skills were found between training on cadavers and training on mannequins.

Leopold et al. compared three training programs using residents, nurses and physician assistants as participants [56]. The program examined only one anatomical site: the knee. There was no statistically significant difference in outcome between groups randomized to receive instruction through a printed manual, a video or live hands-on instruction.

Gormley et al. compared two training programs using residents in general practice as participants [57]. The program examined one anatomical site: the shoulder. After a 1-day course, the intervention group received additional training on patients in a joint injection clinic, which improved confidence and increased injection activity 6 months post-course.

Assessment of the theoretical and practical skills

The assessment of participants’ skills varied from study to study: the majority of studies used subjective comfort level (23 studies), while others used theoretical assessment, e.g. multiple-choice questions (10 studies), and/or practical examination (13 studies) (supplementary Table S1, available at Rheumatology online). The subjective comfort scales used in the 23 studies ranged from 4 points [39] to 10 points [74].

Thirteen studies assessed participants’ practical skills, using a spectrum of different methods to measure and evaluate them, e.g. procedure time, structures identified, image interpretation (± synovitis or pathology present/absent), image quality (1–10), objective structured clinical examination (OSCE) checklist score (yes/no), OSCE rating scale score or DOPS rating form (supplementary Table S1, available at Rheumatology online).

In general, validity evidence supporting the outcomes of the assessment tools was limited. Only three studies reported or discussed the validity of the assessment tool used [45, 51, 75]. Two of them used the DOPS rating form, either in the original version or in a modified procedure-specific version [51, 75], and briefly stated that this assessment tool had demonstrated evidence of validity.

Development of assessment tools

MSUS-guided aspiration and injection skills assessment

Kunz et al. developed a checklist for US-guided arthrocentesis using the Delphi method but examined only one source of validity evidence: content validity [78] (Table 2). They therefore concluded that further validity evidence should be gathered before using the tool in the clinical or simulated environment. Furthermore, it is noteworthy that checklists have been shown to have lower sensitivity, reliability and validity measures compared with rating scales [82].

MSUS skills assessment

Kissin et al. developed and gathered validity evidence for written multiple-choice questions and established a passing score [45]. Ten faculty members developed the multiple-choice questions, and the test could discriminate between operators with different levels of experience.

In a more recent study, Kissin et al. focused on gathering validity evidence for a practical examination for assessing MSUS skills, i.e. an OSCE [79]. The participants were assessed by trained and blinded raters using a newly developed 5-point rating scale, and their performances were evaluated strictly on image characteristics and quality. A pass/fail score was established using the borderline methodology. The research group examined concurrent and construct validity and tested reliability using interrater reliability (Table 2).

Landmark-guided aspiration and injection skills assessment

Singh et al. developed a procedure-specific rating scale to assess resident competences in landmark-guided joint aspiration and injection [80]. The 5-point rating scale consisted of nine items in total. The research group examined the following sources of validity: construct, content, predictive, face and concurrent validity (Table 2).

Evaluation of the methodological quality

The detailed MERSQI evaluation of the included studies is presented in supplementary Table S2 (available at Rheumatology online), and a concise MERSQI evaluation comparing MSUS, MSUS-guided and landmark-guided joint aspiration and injection studies is presented in Table 3. Only five studies were above the ‘editor acceptance’ score of 10.7 points, and just one of these reached the ‘high methodological quality’ cut-off of 14 points [45]. Twenty-one studies were below the ‘editor rejection’ score of 9 points. The mean MERSQI quality score (maximum 18 points) was 7.5 in MSUS-guided aspiration and injection educational studies, 9.2 in studies examining MSUS education and 8.9 in landmark-guided aspiration and injection educational studies (Table 3). This translates to an overall low quality of studies with a high risk of bias, especially due to the pre- and post-test study designs, the lack of objective competence assessment using validated assessment tools and the absence of patient-related outcomes.

Discussion

Despite general agreement on the importance of MSUS, MSUS-guided and landmark-guided joint aspiration and injection skills, no international consensus exists on how to optimally train and assess them. This systematic literature review revealed substantial heterogeneity in the published studies regarding the theoretical and practical parts of the curriculum and the assessment methods; hence, no guidelines can be given. Despite the heterogeneity, all included studies found a favourable effect on confidence or competence regardless of the training intervention or assessment method. However, the majority of the included studies were of low methodological quality, with correspondingly low MERSQI scores. The main methodological problem was study design: we found a total lack of randomized studies comparing training strategies in MSUS or MSUS-guided aspiration and injection, and several studies used a single-group pre- and post-test design. This set-up, which compares ‘something’ with ‘nothing’, has been discussed and criticized for several years, as it primarily supports the fact that any training intervention has an effect [83–85]. Additionally, the majority of studies used self-assessment even though there is solid scientific evidence that self-assessment is an unreliable measure of competence [86–88]. Validated assessment tools assessing residents’ practical competences were used in only three studies [45, 51, 75].

Looking deeper into the results of this systematic review, we found three central topics of interest in relation to curriculum development: the theoretical part, the practical part and the assessment methods. The majority of included studies used a blended training approach with both a theoretical and a practical part; however, the time spent on each part varied greatly between the studies.

For the theoretical part, none of the studies used a strictly web-based learning approach, even though similar competences can be obtained this way compared with lecture-based learning [89]. One research group used a web-based learning environment for continued learning after a 3-day course with lectures [52] but did not replace the pre-course lecture-based theoretical teaching with a web-based approach. Web-based learning can reduce the cost, time and staff required for lecture-based learning, thereby freeing more time for hands-on training [29, 90, 91]. The flexibility and accessibility of a web-based approach, as well as the possibility of repeating the online training after a course, are major advantages.

Concerning the practical part of the curriculum, it is important to recognize that different hands-on training models contribute to different aspects of the learning process: healthy models are suitable for introduction to the US equipment and learning basic anatomy; mannequins and/or cadaver models are suitable for practicing MSUS-guided aspiration and injection; and supervised scanning of patients allows the resident to learn US pathology and to practice documentation. However, there are several challenges when using actual patients in a training curriculum for interventional US. The primary challenge is the higher risk of inaccurate treatment or side effects when a novice performs the procedure, as seen in other medical procedures [92–94]. The secondary challenge is the limited access and time to practice in the clinic under supervision and the waiting time between procedures relevant for training, which make the apprenticeship training method less effective [95]. The alternative approach is to practice in a patient-free environment using a mannequin or cadaver. Fresh frozen cadavers maintain sonographic quality, can be manipulated when performing the procedures and give access to multiple anatomical areas in the same training session [39]. However, it may be more feasible to use a mannequin because of its low cost, portability and accessibility, which make repeated practice possible (i.e. distributed practice). Two randomized studies compared landmark-guided aspiration and injection training in a cadaver-based program with a mannequin-based program [55, 63]. Michels et al. found that training on both mannequins and cadavers improved skill performance, with cadaver training giving better results at the majority of anatomical sites [55], whereas Berman et al. found no significant difference in residents’ comfort level, although the participants considered cadaver training the most effective in a survey [63].

In relation to assessment, it is essential to recognize that subjective confidence level is considered obsolete for assessing the effect of educational interventions [87]. Previous studies of self-assessment have shown that self-assessed abilities are uncorrelated with resident performance [77, 96]; moreover, it would be surprising if participants did not feel more comfortable after an educational intervention [83, 84]. A substantial body of evidence emphasizes the importance of skill assessment in medical education and procedural training, and numerous studies have shown that assessment drives learning [97, 98]. However, the studies included in this review show that only a few used validated assessment tools to measure resident competences.

To assess participants’ theoretical and/or practical competences, development and validation of assessment tools are needed. Three studies included in this review developed assessment tools; however, none of them examined all five validity sources of the contemporary framework of Messick. The use of outdated frameworks, or frameworks in which only a few sources of validity are examined, may result in incorrect trainer decisions when assessing trainees’ competences [99, 100]. Across the other medical specialties using US, two assessment tools have been developed according to the contemporary framework: the Objective Structured Assessment of Ultrasound Skills (OSAUS) [101], which was developed by a multispecialty expert panel and has already been validated and implemented in several medical fields [17, 102, 103], and the Interventional Ultrasound Skills Evaluation (IUSE) [20], developed by an international expert panel of radiologists. However, the novel IUSE tool has only established content validity, and the other sources of validity need to be examined before implementation. Like the tool used by Stroud et al. [75], the IUSE tool also includes assessment of competences in patient communication, which is highly relevant given today’s increased focus on patient-centred treatment. The validation and application of these tools for assessing resident competences in MSUS and MSUS-guided aspiration and injection have yet to be determined.

Because of the aforementioned heterogeneity of the included studies with respect to study designs, the theoretical and practical parts of the curriculum and the assessment methods used, it was not possible to perform a quantitative meta-analysis, which is a limitation of the current review. Another limitation is the poor methodological quality of the majority of included studies. Furthermore, we cannot exclude the possibility of publication bias: researchers may not wish to publish studies showing that their intervention (e.g. a training program) had no effect on outcomes (e.g. resident knowledge, skills or behaviour), and negative findings may also be difficult to get published.

In conclusion, we found substantial heterogeneity in the current published studies, most of which were of poor methodological quality and not based on contemporary educational theories. Therefore, there is a need for carefully designed educational studies using validated theoretical and practical assessment tools to define the optimal MSUS training curriculum and competence assessment in rheumatology.

Acknowledgements

We thank Tove Margit Svendsen, research librarian at the Medical Library at Rigshospitalet, Denmark, for her assistance with developing the search string.

Funding: No specific funding was received from any bodies in the public, commercial or not-for-profit sectors to carry out the work described in this article.

Disclosure statement: The authors have declared no conflicts of interest.

Data availability statement

All data generated or analysed during this study are included in this article and its supplementary information files, available at Rheumatology online.

Supplementary data

Supplementary data are available at Rheumatology online.

References

1. Naredo E, D’Agostino MA, Conaghan PG et al. Current state of musculoskeletal ultrasound training and implementation in Europe: results of a survey of experts and scientific societies. Rheumatology (Oxford) 2010;49:2438–43.
2. Carstensen SMD, Terslev L, Jensen MP, Østergaard M. Future use of musculoskeletal ultrasonography and magnetic resonance imaging in rheumatoid arthritis. Curr Opin Rheumatol 2020;32:264–72.
3. Filippucci E, Cipolletta E, Mashadi Mirza R et al. Ultrasound imaging in rheumatoid arthritis. Radiol Med 2019;124:1087–100.
4. Kane D, Grassi W, Sturrock R, Balint PV. Musculoskeletal ultrasound – a state of the art review in rheumatology. Part 2: clinical indications for musculoskeletal ultrasound in rheumatology. Rheumatology (Oxford) 2004;43:829–38.
5. Jang TB, Ruggeri W, Dyne P, Kaji AH. The learning curve of resident physicians using emergency ultrasonography for cholelithiasis and cholecystitis. Acad Emerg Med 2010;17:1247–52.
6. Brown AK, O’Connor PJ, Roberts TE et al. Recommendations for musculoskeletal ultrasonography by rheumatologists: setting global standards for best practice by expert consensus. Arthritis Rheum 2005;53:83–92.
7. Mandl P, Naredo E, Conaghan PG et al. Practice of ultrasound-guided arthrocentesis and joint injection, including training and implementation, in Europe: results of a survey of experts and scientific societies. Rheumatology (Oxford) 2012;51:184–90.
8. Bruyn GAW, Schmidt WA. How to perform ultrasound-guided injections. Best Pract Res Clin Rheumatol 2009;23:269–79.
9. Brown AK, O’Connor PJ, Wakefield RJ et al. Practice, training, and assessment among experts performing musculoskeletal ultrasonography: toward the development of an international consensus of educational standards for ultrasonography for rheumatologists. Arthritis Rheum 2004;51:1018–22.
10. Terslev L, Hammer HB, Torp-Pedersen S et al. EFSUMB minimum training requirements for rheumatologists performing musculoskeletal ultrasound. Ultraschall der Medizin 2013;34:475–7.
11. Naredo E, Bijlsma JWJ, Conaghan PG et al. Recommendations for the content and conduct of European League Against Rheumatism (EULAR) musculoskeletal ultrasound courses. Ann Rheum Dis 2008;67:1017–22.
12. EULAR MSUS COURSE. https://esor.eular.org/enrol/index.php?id=304 (1 July 2021, date last accessed).
13. EFSUMB MSUS COURSE. 2021. https://efsumb.org/wp-content/uploads/2020/09/Programm_A5_Ultraschallkurs_2020.pdf (1 July 2021, date last accessed).
14. Barsuk JH, Cohen ER, Feinglass J, McGaghie WC, Wayne DB. Residents’ procedural experience does not ensure competence: a research synthesis. J Grad Med Educ 2017;9:201–8.
15. Brown AK, O’Connor PJ, Roberts TE et al. Ultrasonography for rheumatologists: the development of specific competency based educational outcomes. Ann Rheum Dis 2006;65:629–36.
16. Strøm M, Lönn L, Konge L et al. Assessment of EVAR competence: validity of a novel rating scale (EVARATE) in a simulated setting. Eur J Vasc Endovasc Surg 2018;56:137–44.
17. Todsen T, Melchiors J, Charabi B et al. Competency-based assessment in surgeon-performed head and neck ultrasonography: a validity study. Laryngoscope 2018;128:1346–52.
18. Russell L, Østergaard ML, Nielsen MB, Konge L, Nielsen KR. Standardised assessment of competence in Focused Assessment with Sonography for Trauma. Acta Anaesthesiol Scand 2018;62:1154–60.
19. Pietersen PI, Konge L, Graumann O, Nielsen BU, Laursen CB. Developing and gathering validity evidence for a simulation-based test of competencies in lung ultrasound. Respiration 2019;97:329–36.
20. Kahr Rasmussen N, Nayahangan LJ, Carlsen J et al. Evaluation of competence in ultrasound-guided procedures – a generic assessment tool developed through the Delphi method. Eur Radiol 2021;31:4203–11.
21. Østergaard ML, Rue Nielsen K, Albrecht-Beste E et al. Simulator training improves ultrasound scanning performance on patients: a randomized controlled trial. Eur Radiol 2019;29:3210–8.
22. Borgersen NJ, Naur TMH, Sørensen SMD et al. Gathering validity evidence for surgical simulation: a systematic review. Ann Surg 2018;267:1063–8.
23. Yudkowsky R, Soon Park Y, Downing SM. Assessment in health professions education: edition 2. New York and London: Routledge (Taylor and Francis Group), 2019.
24. Messick S. Foundations of validity: meaning and consequences in psychological assessment. ETS Res Rep Ser 1993;1993:i–18.
25. Kane D, Balint PV, Sturrock R, Grassi W. Musculoskeletal ultrasound – a state of the art review in rheumatology. Part 1: current controversies and issues in the development of musculoskeletal ultrasound in rheumatology. Rheumatology (Oxford) 2004;43:823–8.
26. Widener BB, Cannella A, Martirossian L, Kissin EY. Modern landscapes and strategies for learning ultrasound in rheumatology. Rheum Dis Clin North Am 2020;46:61–71.
27. Ike R, Arnold E, Arnold W et al. Ultrasound in American rheumatology practice: report of the American College of Rheumatology Musculoskeletal Ultrasound Task Force. Arthritis Care Res (Hoboken) 2010;62:1206–19.
28. Sivera F, Ramiro S, Cikes N et al.; Working Group on Training in Rheumatology across Europe. Rheumatology training experience across Europe: analysis of core competences. Arthritis Res Ther 2016;18:213.
29. Brown AK, Roberts TE, O’Connor PJ et al. The development of an evidence-based educational framework to facilitate the training of competent rheumatologist ultrasonographers. Rheumatology (Oxford) 2007;46:391–7.
30. Janta I, Terslev L, Ammitzbøll-Danielsen M et al. EFSUMB COMPASS for Rheumatologists dissemination and implementation – an international survey. Med Ultrason 2016;18:42–6.
31. Mandl P, Ciechomska A, Terslev L et al. Implementation and role of modern musculoskeletal imaging in rheumatological practice in member countries of EULAR. RMD Open 2019;5:e000950.
32. Moher D, Liberati A, Tetzlaff J, Altman DG; for the PRISMA Group. Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. BMJ 2009;339:b2535.
33. Reed DA, Cook DA, Beckman TJ et al. Association between funding and quality of published medical education research. JAMA 2007;298:1002–9.
34. Cook DA, Reed DA. Appraising the quality of medical education research methods: the Medical Education Research Study Quality Instrument and the Newcastle–Ottawa Scale-Education. Acad Med 2015;90:1067–76.
35. Wasson LT, Cusmano A, Meli L et al. Association between learning environment interventions and medical student well-being: a systematic review. JAMA 2016;316:2237–52.
36. Lin H, Lin E, Auditore S, Fanning J. A narrative review of high-quality literature on the effects of resident duty hours reforms. Acad Med 2016;91:140–50.
37. Reed DA, Beckman TJ, Wright SM et al. Predictive validity evidence for medical education research study quality instrument scores: quality of submissions to JGIM’s medical education special issue. J Gen Intern Med 2008;23:903–7.
38. Charnoff J, Naqvi U, Weaver M, Price C. Resident education of ultrasound-guided procedures: a homemade practice model pilot study. Am J Phys Med Rehabil 2019;98:e116–8.
39. Amini R, Camacho LD, Valenzuela J et al. Cadaver models in residency training for uncommonly encountered ultrasound-guided procedures. J Med Educ Curric Dev 2019;6:1–5.
40. Atchia I, Birrell F, Kane D. A modular, flexible training strategy to achieve competence in diagnostic and interventional musculoskeletal ultrasound in patients with hip osteoarthritis. Rheumatology (Oxford) 2007;46:1583–6.
41. Wu WT, Chang KV, Han DS, Özçakar L. Musculoskeletal ultrasound workshops in postgraduate physician training: a pre- and post-workshop survey of 156 participants. BMC Med Educ 2019;19:362.
42. Gulati G, Alweis R, George D. Musculoskeletal ultrasound in internal medicine residency – a feasibility study. J Community Hosp Intern Med Perspect 2015;5:27481.
43. Irwin RW, Smith J, Issenberg SB. Long-term retention of musculoskeletal ultrasound training during residency. Am J Phys Med Rehabil 2018;97:523–30.
44. Gould SJ. Action-observation in fine motor learning applying neuropsychology principles to enhance academic teaching in the field of rheumatology: a pilot study. Bull Hosp Joint Dis 2020;78:144–5.
45. Kissin EY, Niu J, Balint P et al. Musculoskeletal ultrasound training and competency assessment program for rheumatology fellows. J Ultrasound Med 2013;32:1735–43.
46. Gutiérrez M, Di Geso L, Rovisco J et al. Ultrasound learning curve in gout: a disease-oriented training program. Arthritis Care Res (Hoboken) 2013;65:1265–74.
47. Miguel C, De Miguel E, Batlle-Gualda E, Rejón E, Lojo L; Entheses Ultrasound Workshop Group. Teaching enthesis ultrasound: experience of an ultrasound training workshop. Rheumatol Int 2012;32:4047–52.
48. Ellegaard K, Torp-Pedersen S, Christensen R et al. Feasibility of a standardized ultrasound examination in patients with rheumatoid arthritis: a quality improvement among rheumatologists cohort. BMC Musculoskelet Disord 2012;13:35.
49. Gutierrez M, Filippucci E, Ruta S et al. Inter-observer reliability of high-resolution ultrasonography in the assessment of bone erosions in patients with rheumatoid arthritis: experience of an intensive dedicated training programme. Rheumatology (Oxford) 2011;50:373–80.
50. Kissin EY, Nishio J, Yang M et al. Self-directed learning of basic musculoskeletal ultrasound among rheumatologists in the United States. Arthritis Care Res (Hoboken) 2010;62:155–60.
51. Taggart AJ, Wright SA, Ball E, Kane D, Wright G. The Belfast musculoskeletal ultrasound course. Rheumatology (Oxford) 2009;48:1073–6.
52. Filippucci E, Meenagh G, Ciapetti A et al. E-learning in ultrasonography: a web-based approach. Ann Rheum Dis 2007;66:962–5.
53. D’Agostino M-A, Maillefert J-F, Said-Nahal R et al. Detection of small joint synovitis by ultrasonography: the learning curve of rheumatologists. Ann Rheum Dis 2004;63:1284–7.
54. Filippucci E, Unlu Z, Farina A, Grassi W. Sonographic training in rheumatology: a self teaching approach. Ann Rheum Dis 2003;62:565–7.
55. Michels NR, Vanhomwegen E. An educational study to investigate the efficacy of three training methods for infiltration techniques on self-efficacy and skills of trainees in general practice. BMC Fam Pract 2019;20:133.
56. Leopold SS, Morgan HD, Kadel NJ et al. Impact of educational intervention on confidence and competence in the performance of a simple surgical task. J Bone Jt Surg Ser A 2005;87:1031–7.
57. Gormley GJ, Steele WK, Stevenson M et al. A randomised study of two training programmes for general practitioners in the techniques of shoulder injection. Ann Rheum Dis 2003;62:1006–9.
58. Gould S, Knowling E, Smola R, Titer K, Martin K. Efficacy of a cadaver-based procedural skills lab for internal medicine residents. Cogent Med 2020;7:1780065.
59. Sattler LA, Schuety C, Nau M et al. Simulation-based medical education improves procedural confidence in core invasive procedures for military internal medicine residents. Cureus 2020;10:12.
60. Seifert MK, Holt CT, Haskins A, Dexter W. Improving internal medicine resident comfort with shoulder and knee joint injections using an injection workshop. MedEdPORTAL 2020;16:10979.
61. Denizard-Thompson N, Feiereisel KB, Pedley CF, Burns C, Campos C. Musculoskeletal basics: the shoulder and the knee workshop for primary care residents. MedEdPORTAL 2018;14:10749.
62. Blake T, Marais D, Hassell AB, Stevenson K, Paskins Z. Getting back to the dissecting room: an evaluation of an innovative course in musculoskeletal anatomy for UK-based rheumatology training. Musculoskelet Care 2017;15:405–12.
63. Berman JR, Ben-Artzi A, Fisher MC, Bass AR, Pillinger MH. A comparison of arthrocentesis teaching tools: cadavers, synthetic joint models, and the relative utility of different educational modalities in improving trainees’ comfort with procedures. J Clin Rheumatol 2012;18:175–9.
64. Sterrett AG, Bateman H, Guthrie J et al. Virtual rheumatology: using simulators and a formal workshop to teach medical students, internal medicine residents, and rheumatology subspecialty residents arthrocentesis. J Clin Rheumatol 2011;17:121–3.
65. Lenchus J, Issenberg SB, Murphy D et al. A blended approach to invasive bedside procedural instruction. Med Teach 2011;33:116–23.
66. Lenchus JD. End of the “see one, do one, teach one” era: the next generation of invasive bedside procedural instruction. J Am Osteopath Assoc 2010;110:340–6.
67. Barilla-Labarca ML, Tsang JC, Goldsmith M, Furie R. Design, implementation, and outcome of a hands-on arthrocentesis workshop. J Clin Rheumatol 2009;15:275–9.
68. Lenhard A, Moallem M, Marrie RA, Becker J, Garland A. An intervention to improve procedure education for internal medicine residents. J Gen Intern Med 2008;23:288–93.
69. Jolly M, Hill A, Mataria M, Agarwal S. Influence of an interactive joint model injection workshop on physicians’ musculoskeletal procedural skills. J Rheumatol 2007;34:1576–9.
70. Wilcox T, Oyler J, Harada C, Utset T. Musculoskeletal exam and joint injection training for internal medicine residents. J Gen Intern Med 2006;21:521–3.
71. Alguire PC. Teaching physicians procedural skills at a national professional meeting. Med Educ Online 2004;9:4346.
72. Oxentenko AS, Ebbert JO, Ward LE, Pankratz VS, Wood KE. A multidimensional workshop using human cadavers to teach bedside procedures. Teach Learn Med 2003;15:127–30.
73. Vogelgesang SA, Karplus TM, Kreiter CD. An instructional program to facilitate teaching joint/soft-tissue injection and aspiration. J Gen Intern Med 2002;17:441–5.
74. Fortuna RJ, Marston B, Messing S et al. Ambulatory training program to expand procedural skills in primary care. J Med Educ Curric Dev 2019;6:2382120519859298.
75. Stroud L, Cavalcanti RB. Hybrid simulation for knee arthrocentesis: improving fidelity in procedures training. J Gen Intern Med 2013;28:723–7.
76. Bakewell CJ, Gardner GC. A survey of arthrocentesis and soft-tissue injection procedures performed in primary care practice: effect of resident training and using data to shape curriculum. J Rheumatol 2011;38:1986–9.
77. MacKenzie MS, Berkowitz J. Do procedural skills workshops during family practice residency work? Can Fam Physician 2010;56:8.
78. Kunz D, Pariyadath M, Wittler M et al. Derivation of a performance checklist for ultrasound-guided arthrocentesis using the modified Delphi method. J Ultrasound Med 2017;36:1147–52.
79. Kissin EY, Grayson PC, Cannella AC et al. Musculoskeletal ultrasound objective structured clinical examination: an assessment of the test. Arthritis Care Res (Hoboken) 2014;66:2–6.
80. Singh S, Kaur A, Singh H. Objective assessment of orthopaedic skills. Int J Med Public Health 2017;7:187–90.
81. Kneebone R, Nestel D, Yadollahi F et al. Assessing procedural skills in context: exploring the feasibility of an Integrated Procedural Performance Instrument (IPPI). Med Educ 2006;40:1105–14.
82. Ma IWY, Zalunardo N, Pachev G et al. Comparing the use of global rating scale with checklists for the assessment of central venous catheterization skills using simulation. Adv Health Sci Educ 2012;17:457–70.
83. Cook DA, Beckman TJ. Reflections on experimental research in medical education. Adv Health Sci Educ 2010;15:455–64.
84. Yarris LM, Gruppen LD, Hamstra SJ, Anders Ericsson K, Cook DA. Overcoming barriers to addressing education problems with research design: a panel discussion. Acad Emerg Med 2012;19:1344–9.
85. Cook DA. How much evidence does it take? A cumulative meta-analysis of outcomes of simulation-based education. Med Educ 2014;48:750–60.
86. Nayahangan LJ, Clementsen PF, Konge L. Career development training for interventional pulmonary fellows. J Bronchology Interv Pulmonol 2020;27:39.
87. Norman G. Data dredging, salami-slicing, and other successful strategies to ensure rejection: twelve tips on how to not get your paper published. Adv Health Sci Educ Theory Pract 2014;19:1–5.
88. Davis DA, Mazmanian PE, Fordis M et al. Accuracy of physician self-assessment compared with observed measures of competence: a systematic review. JAMA 2006;296:1094–102.
89. Cook DA, Levinson AJ, Garside S et al. Internet-based learning in the health professions: a meta-analysis. JAMA 2008;300:1181–96.
90. Kang TL, Berona K, Elkhunovich MA et al. Web-based teaching in point-of-care ultrasound: an alternative to the classroom? Adv Med Educ Pract 2015;6:171–5.
91. Platz E, Goldflam K, Mennicke M et al. Comparison of web- versus classroom-based basic ultrasonographic and eFAST training in 2 European hospitals. Ann Emerg Med 2010;56:6.
92. McGee DC, Gould MK. Preventing complications of central venous catheterization. N Engl J Med 2003;348:1123–33.
93. Barsuk JH, McGaghie WC, Cohen ER, Balachandran JS, Wayne DB. Use of simulation-based mastery learning to improve the quality of central venous catheter placement in a medical intensive care unit. J Hosp Med 2009;4:397–403.
94. Barsuk JH, Cohen ER, Williams MV et al. Simulation-based mastery learning for thoracentesis skills improves patient outcomes: a randomized trial. Acad Med 2018;93:729–35.
95. Konge L, Clementsen PF, Ringsted C et al. Simulator training for endobronchial ultrasound: a randomised controlled trial. Eur Respir J 2015;46:1140–9.
96. Eva KW, Regehr G. Self-assessment in the health professions: a reformulation and research agenda. Acad Med 2005;80:S46–54.
97. Vilmann AS, Norsk D, Svendsen MBS et al. Computerized feedback during colonoscopy training leads to improved performance: a randomized trial. Gastrointest Endosc 2018;88:869–76.
98. Strandbygaard J, Bjerrum F, Maagaard M et al. Instructor feedback versus no instructor feedback on performance in a laparoscopic virtual reality simulator: a randomized trial. Ann Surg 2013;257:839–44.
99. Ghaderi I, Manji F, Soo Park Y et al. Technical skills assessment toolbox: a review using the unitary framework of validity. Ann Surg 2015;261:251–62.
100. Cook DA, Brydges R, Zendejas B, Hamstra SJ, Hatala R. Technology-enhanced simulation to assess health professionals: a systematic review of validity evidence, research methods, and reporting quality. Acad Med 2013;88:872–83.
101. Tolsgaard MG, Todsen T, Sorensen JL et al. International multispecialty consensus on how to evaluate ultrasound competence: a Delphi consensus survey. PLoS One 2013;8:e57687.
102. Todsen T, Tolsgaard MG, Olsen BH et al. Reliable and valid assessment of point-of-care ultrasonography. Ann Surg 2015;261:309–15.
103. Tolsgaard MG, Ringsted C, Dreisler E et al. Reliable and valid assessment of ultrasound operator competence in obstetrics and gynecology. Ultrasound Obstet Gynecol 2014;43:437–43.
