CONFERD-HP: recommendations for reporting COmpeteNcy FramEwoRk Development in health professions

Abstract

Background: Competency frameworks outline the perceived knowledge, skills, attitudes, and other attributes required for professional practice. These frameworks have gained in popularity, in part for their ability to inform health professions education, assessment, professional mobility, and other activities. Previous research has highlighted inadequate reporting of their development, which may jeopardize their defensibility and utility.

Methods: This study aimed to develop a set of minimum reporting criteria for developers and authors of competency frameworks, in an effort to improve the transparency, clarity, interpretability, and appraisal of the development process and its outputs. Following guidance from the Enhancing the QUAlity and Transparency Of health Research (EQUATOR) Network, an expert panel was assembled, and a knowledge synthesis, a Delphi study, and workshops were conducted with individuals experienced in developing competency frameworks, to identify and achieve consensus on the essential items for a competency framework development reporting guideline.

Results: An initial checklist was developed by the 35-member expert panel and the research team. Following the steps listed above, a final reporting guideline of 20 essential items across five sections (title and abstract; framework development; development process; testing; and funding/conflicts of interest) was developed.

Conclusion: The COmpeteNcy FramEwoRk Development in Health Professions (CONFERD-HP) reporting guideline permits a greater understanding of relevant terminology, core concepts, and key items to report for competency framework development in the health professions.


Introduction
Competency frameworks provide a description of the perceived knowledge, skills, attitudes, and other attributes required to enact professional practice competently within a given context 1 . In health professions, these are developed for several reasons, which can include defining the profession, ensuring a competent workforce, and facilitating curriculum development, systems of assessment, or professional mobility. While a significant number of competency frameworks are available in medicine, surgery, nursing, and other health professions (a previous scoping review identified 190 frameworks), and more continue to be developed, their quality may be compromised due to variability in reporting that describes clearly and sufficiently how the competency framework was developed 2 . Poor, inconsistent, or insufficient reporting practices have led to concerns regarding the validity of outputs, and threaten their utility and trustworthiness in informing downstream activities such as curriculum design and descriptions of practice 2,3 . If competency frameworks are to meet their goal, consistent and sufficient reporting is one way to improve their use and judgements of quality.
Reporting guidelines are developed to help researchers improve the completeness and transparency of their research reports 4 . A reporting guideline is defined as 'a checklist, flow diagram, or explicit text to guide authors in reporting a specific type of research, developed using explicit methodology' 4 . Examples of reporting guidelines include the Preferred Reporting Items for Systematic reviews and Meta-Analyses (PRISMA) 5 and the Guidance for Reporting Involvement of Patients and the Public 2 (GRIPP2) 6 . As part of a broader programme of research on competency framework development, the need for improved reporting guidance for those developing competency frameworks was identified 2 . This guidance, in the form of a reporting guideline, could help to improve the utility and validity of reported competency frameworks 7 . It will also help those who wish to use competency frameworks (as a form of quality appraisal), as well as those undertaking peer review and editorial review of manuscripts describing the development of competency frameworks. Evidence demonstrates that the use of reporting guidelines increases methodological transparency and the uptake of research findings 7 , which is a key component in the translation of research findings into clinical practice and education.
The aim of this study was to develop an evidence- and consensus-based reporting guideline to aid the evaluation of health professions-focused competency frameworks.

Methods
This study followed the steps outlined by the 'Enhancing the QUAlity and Transparency Of health Research' (EQUATOR) network for developing reporting guidelines in health research 4 . This framework has been used successfully to develop reporting guidelines for other areas of health research 5,8 . The reporting guideline was developed in four stages: stage 1, project launch (research team established, expert advisor recruited, protocol drafted, and ethics approval obtained); stage 2, knowledge synthesis (comprehensive literature and guideline review to inform and identify potential reporting items); stage 3, consensus process (a panel of developers, journal editors, regulators, and end users recruited to complete a Delphi study and virtual workshops); and stage 4, development of the reporting guideline (initial reporting guideline drafted to solicit feedback from the expert panel, with feedback used to create the final reporting guideline).
See Fig. 1 for a schematic of these stages. Details on the activities at each stage are described below.

Stage 1: establish team, expert advisor, and protocol
A core project team (A.M.B., W.T., J.V.R., and B.W.) was established and was responsible for drafting the protocol and for the day-to-day operations of the project. The team members provided expertise in competency framework development and evaluation, and in health professions competency-based education. The team engaged a methodological advisor (T.H.) to guide both the consensus methodology and the development of the reporting guideline. This advisor guided the team in the conduct of the study, including the literature review; the nomination of participants for the Delphi study; reviewing the checklist items for inclusion in the Delphi study; providing feedback after each round of the survey (e.g. interpreting results of the previous round and approving content for the next round); and contributing to the production and reporting of the final reporting guideline. The development of the COmpeteNcy FramEwoRk Development in Health Professions (CONFERD-HP) reporting guideline was registered on the EQUATOR Network website in December 2019 9 .

Ethics approval
The Monash University Human Research Ethics Committee provided ethics approval for this study in June 2021 (#27484).

Stage 2: knowledge synthesis
Results from a scoping review of competency frameworks were used to inform this study 2 . The scoping review aimed to understand how health professions developed competency frameworks and then to consider these activities against existing development guidance 2 . The major aspects and characteristics of the competency framework development process reported across 190 studies were reviewed and extracted using a standardized data-collection form. This included items potentially relevant for inclusion in a reporting guideline (e.g. rationale for development, rationale for selection of methods, and declaration of funding source) 2 . All published checklists and guidelines on the EQUATOR Network website (421 at the time of the search in January 2020) were then reviewed to identify reporting guidelines related to qualitative and mixed-methods data collection. Several guidelines were extracted in full and entered into a spreadsheet. Finally, elements of a contemporary six-step competency framework development model that did not feature in the existing literature were leveraged. This six-step model furthers existing development guidance through the use of theoretical or conceptual approaches, as well as the use of mixed or multiple methods of data collection 2 . The three sources of extracted data were combined into a single spreadsheet, and a key list was generated that was cross-referenced with each source.
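
The cross-referencing step described above can be sketched as a small worked example. This is a hypothetical illustration only (the actual synthesis was performed manually in a spreadsheet, and all item wording and source labels below are invented for illustration):

```python
# Potential reporting items extracted from each of the three sources
# (item wording invented; the real extraction used three spreadsheets).
sources = {
    "scoping_review": ["rationale for development", "funding source", "methods used"],
    "equator_guidelines": ["methods used", "ethics approval", "stakeholder involvement"],
    "six_step_model": ["theoretical approach", "methods used", "rationale for development"],
}

# Build a key list of unique items, cross-referenced with the
# source(s) in which each item appeared.
key_list = {}
for source, items in sources.items():
    for item in items:
        key_list.setdefault(item, []).append(source)

# Print each unique item with its provenance.
for item, found_in in sorted(key_list.items()):
    print(f"{item}: {', '.join(found_in)}")
```

Collapsing duplicates while retaining provenance in this way makes it possible to see which candidate items are supported by multiple sources.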

Delphi panel
The consensus approach was guided by traditional Delphi methodology. A purposive sampling strategy was employed, aimed at recruiting panel members who represented users of the guideline 10,11 . This meant attending to diversity of representation across anticipated intended users and end users, professions, contexts, and stakeholders 10,12 . To begin, intended users were defined as those who will use the reporting guideline (e.g. authors, those developing competency frameworks, journal editors, and regulators) and end users as those who will use the reported competency framework (e.g. health professionals and educators). Broad representation was sought by targeting authors of competency frameworks, editors of journals that publish competency frameworks, and health profession regulators (described in further detail below). Finally, a virtual approach was employed to allow experts who were dispersed globally to participate 13,14 , and to accommodate restrictions imposed by the ongoing COVID-19 pandemic (the guideline was developed throughout 2021). Recruitment of 30 panel members was anticipated, with the detailed sampling strategy outlined below.

Intended users: authors
All authors of reports of competency framework development identified in the scoping review 2 that had a corresponding email address were invited to participate. Additional authors who were not represented in the scoping review but had authored a report of competency framework development since the publication of the review were also invited. These authors were identified through database searches for papers related to competency framework development in health professions. Authors from all countries, disciplines, and organizations identified were invited with no exclusion criteria.

Intended users: journal editors
Journals in which competency frameworks were published were identified and ranked according to the frequency of publication of frameworks. Editors from the top-five most productive journals (having published five or more competency frameworks) were invited to participate.

Intended users: health professions regulators
Representatives of a number of national regulatory bodies across health professions who developed or provided oversight for the development of competency frameworks were invited. These were identified through the scoping review.

End users
Representatives of professional associations, individual health professionals (including physicians, surgeons, nurses, pharmacists, dietitians, therapists, and others), and educators/faculty members, identified through the research team and the results of the scoping review, were invited (e.g. associations that endorsed guidelines).
All participants were recruited via email invitation in July 2021 which outlined the objective of the study, study design, participation details, and level of commitment. Consent forms were also included in the invitation.

Survey development
The Delphi study included three rounds: an initial generative, open-ended survey-based round, and two subsequent online item-scoring rounds conducted using the edelphi.org website (with Google Form and Excel versions available in case of technical issues). Research team members tested the surveys but did not contribute data. Prior to deployment, surveys were piloted for content and clarity with the core project team and expert advisor. Participants were sent an initial invitation to each round, a reminder at 1 week, and a final reminder with 3 days remaining, to optimize participation. Each survey round was open for 3 weeks, with 2 weeks between rounds for analysis and synthesis.

Research question: What are the essential items that should be included when reporting on the development of a competency framework in the health professions?

Objective: To develop an evidence- and consensus-based guideline for the reporting of competency framework development in health professions using explicit, systematic, and transparent methods.

Round 1 of Delphi
Delphi panel members were asked one non-identifying demographic question related to their role in competency framework development. This was asked to gain insight into the composition of the panel in relation to their roles, and to ensure diversity in responses. Panel members were then asked to indicate (via free-text responses) the essential items that should be included when reporting the development of a competency framework. Specifically, they were asked: 'As a consumer, when reading a report/publication about the development of a competency-related framework, at a minimum, what elements would require reporting in order to imbue trust in the competency framework itself?' In keeping with traditional approaches, and as a means of reducing bias, this open-ended approach was implemented rather than providing a list of items informed by the knowledge synthesis (described above).

Round 1 analysis
Results from round 1 were analysed using conventional content analysis, whereby the data were analysed to inductively generate lists and categories of minimum required reporting items 15 . For example, items suggested by panel members related to target audience, title, and definitions were categorized under 'Background information'. One member of the research team (A.M.B.) then collapsed redundant or repetitive concepts or items within categories into a single reporting item (see Table S1). The potential reporting items generated during the knowledge synthesis were then added to the list presented to the panel in round 2 of the Delphi process. The revised list of reporting items and brief explanations was distributed to the research team and expert advisor for review and feedback prior to distribution to Delphi panel members for round 2.

Round 2 of Delphi
Using a 7-point Likert scale (1 = strongly disagree (exclude item), 7 = strongly agree (include item)), Delphi panel members were asked to indicate their agreement with inclusion of each proposed reporting item in a reporting guideline. Each proposed reporting item included detail on whether it originated from the knowledge synthesis exercise (stage 2) or round 1 of the Delphi, and, if it was from round 1, how many panel members suggested the item. An optional text-box response option was also included where respondents could provide feedback and suggestions to improve the clarity of wording for each item. This feedback was visible to other members of the panel in an anonymous format. The feedback was used to clarify wording for round 3, and to identify items considered duplicates by panel members. This feedback was distributed to all panel members in advance of round 3, the virtual workshops, and the final feedback stage.

Round 3 of Delphi
Delphi panel members completed a third round of scoring of reworded and clarified discrepant items based on round 2 feedback. Delphi panel members were asked to rate their agreement with each of the reworded items using the same 7-point Likert scale as above. Each survey item included an optional text box where respondents could provide further comments.

Round 2 and round 3 analysis
The approach to round 2 and 3 survey analysis involved calculating the average scores for the group to feed back to individual participants in the following Delphi round. Seventy per cent agreement was established a priori for each of the reporting items as a threshold for consensus among the Delphi panel members. This rule required that at least 70 per cent of the respondents indicated that they either 'agreed' or 'strongly agreed' (values of 6 or 7 on the Likert scale) with the inclusion of the item as an essential requirement within the reporting guideline. If agreement was less than 70 per cent the item was considered discrepant and brought forward for discussion at the workshops. If agreement was just over 70 per cent or the distribution of responses was not clearly in favour of agreement (e.g. not mainly values of 6 or 7 on the Likert scale), the item was brought forward for discussion at the virtual workshops. In addition, the free-text comments provided by Delphi panel members were analysed for content and used to inform decisions for either the merging or rewording of draft items generated at the end of each round. The Delphi panel members were provided with a detailed report at the end of each round, which contained the level of agreement for each item and a summary of the free-text comments.
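
The a priori consensus rule can be expressed as a simple calculation. The sketch below is a hypothetical illustration (item names and ratings are invented, and this is not the study's analysis code); it applies the rule that at least 70 per cent of respondents must rate an item 6 ('agree') or 7 ('strongly agree') for it to reach consensus:

```python
# Hypothetical Likert ratings (1-7) for two draft reporting items.
ratings = {
    "item_rationale": [7, 6, 6, 7, 5, 6, 7, 6, 6, 7],  # broad agreement
    "item_timeline":  [4, 5, 6, 3, 7, 6, 2, 5, 6, 4],  # mixed views
}

CONSENSUS_THRESHOLD = 0.70  # set a priori: >=70% must score the item 6 or 7

def agreement(scores):
    """Proportion of respondents rating the item 6 or 7."""
    return sum(1 for s in scores if s >= 6) / len(scores)

for item, scores in ratings.items():
    prop = agreement(scores)
    status = "consensus" if prop >= CONSENSUS_THRESHOLD else "discrepant (to workshops)"
    print(f"{item}: {prop:.0%} -> {status}")
```

In practice, items sitting just above the threshold, or with a response distribution not clearly favouring agreement, were also flagged for workshop discussion rather than being accepted automatically.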

Workshops
The aim of this stage in reporting guideline development was to discuss and gain consensus on the items for inclusion in the final reporting guideline 4 . All individuals who had participated in at least one of the steps above were invited. Attendees reviewed the items resulting from the Delphi rounds and provided feedback on the wording and sequence of each item to promote clarity. In the workshops, participants and the research team worked in a live synchronous environment in which edits were made as suggested by participants. Disagreement was resolved by discussion. Workshops were held via Zoom to facilitate attendance across multiple time zones and to comply with COVID-19 restrictions. Workshops were recorded and detailed notes were taken by the research team.

Draft guideline
A draft reporting guideline was developed based on the results of the knowledge synthesis, consensus process, feedback from workshop participants, and notes taken by the research team. The draft was circulated to all Delphi panel members to receive further feedback on item wording, clarity, sequence of items, and categories within the checklist. The research team then used the feedback provided by Delphi panel members on the draft reporting guideline to inform the final version of the guideline.

Measures to promote trustworthiness
To promote trustworthiness the research team strived to demonstrate dependability, credibility, confirmability, and transferability 16 . Diversity was sought in representation across users, professions, contexts, and stakeholders, to improve dependability 10 . Credibility in the process was ensured by providing ongoing feedback from Delphi panel members to each other as a form of member checking 10,17 . The Delphi study was reported in detail in line with the Guidance on Conducting and REporting DElphi Studies (CREDES) reporting guideline 18 , to improve confirmability.

Participant stipends and payments
Participants were not remunerated for their participation in this study.

Results

Stage 2: knowledge synthesis
Analysis of the scoping review findings related to reporting quality revealed several inconsistencies in the reporting of competency framework development (Table 1) 2 . The search of the EQUATOR Network website for mixed-methods and qualitative reporting guidelines resulted in the identification of six relevant guidelines 18-23 . A total of 157 individual items were abstracted from these guidelines. Finally, the competency framework development guidance was reviewed, which resulted in the identification of 31 potential reporting items 1,2 . When cross-referenced and collapsed, this resulted in a synthesized list of 17 potential reporting items for mapping against the panel responses from round 1 of the Delphi study (see Fig. 2).

Round 1
An initial list of 117 suggested reporting items was generated. Following categorization and synthesis, a list of 27 reporting items was identified. This list was mapped against the knowledge synthesis findings (stage 2), which resulted in an additional six unique items for a total of 33 potential reporting items.

Round 2
A panel of 30 expert participants scored the 33 potential reporting items from round 1; five participants did not complete the survey. Rounds 2 and 3 were anonymous and no demographic information was gathered. Overall, 19 items reached 70 per cent agreement, five items were considered duplicates and were merged with agreed items (e.g. the panel suggested merging four items on the reporting of different methods into one tiered item addressing 'methods', item 9c), and nine items were considered discrepant and were brought forward for consideration in round 3. Edits were suggested to the wording of all but one item (item 1a) (Table S2).

Round 3
A panel of 17 expert participants scored the nine discrepant reporting items from round 2. Overall, two items (both rephrased based on round 2 feedback) reached 70 per cent agreement (items 2 and 7). Seven items remained discrepant and were brought forward for discussion in the workshops (Table S2).

Workshops and initial draft
A total of nine participants attended one of two virtual workshops. Participants were exclusively female, largely held senior academic ranks (e.g. Associate Professor, Professor, and Chair), and represented multiple health professions (e.g. medicine, nursing, dietetics, and pharmacy). In these workshops, which were facilitated by the expert advisor and members of the research team, an initial draft guideline was presented and revisions to the wording of items were suggested. Across both workshops, revisions were suggested to the wording of all but four items (items 1a, 3, 11, and 14). Two items were merged based on workshop feedback.

Revised draft guideline and feedback
A revised draft guideline containing 20 reporting items was generated and circulated to the Delphi panel for feedback. Feedback was received from 17 panel members across four roles and eight disciplines, and 16 members further elected to be identified as collaborators on the guideline. A sample of four participants, who volunteered, 'tested' the reporting guideline for usability by applying it with development, peer-review, and quality-appraisal contexts in mind. All items were reviewed and edited for clarity and sequence based on the feedback from the Delphi panel, and a copy editor edited the final checklist to ensure clarity, consistency, and grammatical accuracy.

Fig. 2 Results of the Delphi study and virtual workshops

Discussion
The CONFERD-HP reporting guideline was developed using an evidence-informed approach as outlined by the EQUATOR Network 4 . The CONFERD-HP reporting guideline is intended to guide the essential items that should be reported when describing the development of competency frameworks in the health professions, including medical specialties, surgical specialties, nursing, allied health, and others. The final reporting guideline checklist is presented in Table 3 and includes 20 items: 18 items that reached agreement in round 2, along with two additional items that reached agreement after rewording in round 3. The checklist items encompass the following categories: basic information (e.g. title); background information (e.g. purpose); development process (e.g. methods); testing; funding; and conflicts of interest. An explanation and elaboration statement is being prepared, which will provide readers with a comprehensive explanation and rationale, as well as examples of good reporting, for each item in the reporting guideline. The reporting guideline can assist those developing competency frameworks in the development process, support journal editors and peer reviewers when considering frameworks for publication, and help end users (e.g. educators, professionals, and regulators) understand the scope, rigour, and trustworthiness of a competency framework, and inform decisions about its utility and applicability for their intended purposes. When the reporting guideline is applied, it results in a clear, explicit description of the processes and procedures used when developing a competency framework, and access to the resources and evidence used to formulate each recommended item. The CONFERD-HP reporting guideline is not intended to be a prescriptive or linear format for reporting competency framework development.
Rather, each item should be presented, and sufficient elaboration provided somewhere in the reporting of the competency framework development process.
The CONFERD-HP study should be interpreted in light of some limitations. Firstly, the Delphi panel response rate represented 23 per cent of those invited. While there is no sample size calculation for the Delphi method, it is generally accepted that a larger sample may increase the reliability of the group's judgements, and more than 12 participants are recommended 24 , a threshold that was met in all three rounds. More importantly, panel members represented a diverse sample of all potential end users of the reporting guideline. With input from experts in medicine, surgery, nursing, allied health professions, and multidisciplinary fields such as public health, the study gained input from the full range of the population the guidance is intended to influence 24 . Secondly, not all Delphi panel members were able to participate in all rounds of the Delphi; scheduling and COVID-19 demands placed on panel members made this a challenge. In an effort to reduce the effects of attrition, all members of the Delphi panel were engaged at several points: Delphi rounds, workshops, and an opportunity to provide final feedback via email.

Table 3 CONFERD-HP checklist

Title and abstract
- Title: 1a Identification as a competency framework in the title; 1b Identification of the intended profession and level of practice/stage of training in the title
- Structured abstract/summary: 2 Structured summary that includes the intended user(s) and use(s) of the framework, the purpose of the framework, and the development process and methods used
- Definition(s): 3 Defined or referenced definitions for competence, competency, and other key terms used, to promote understanding of the framework

Framework development
- Rationale and justification: 4 Description of the rationale and justification for the development of the framework, including supportive references where possible
- Purpose and use: 5a Description of the purpose of the framework; 5b Description of the intended use(s) of the framework; 5c Description of the intended user(s) of the framework
- Developer group: 6 Description of the qualifications and expertise of those leading the development of the framework
- Oversight/governance group: 7 Description of the group that had oversight of the framework, the purpose and expertise of the group members, and how they reviewed the work and/or contributed to the development
- Theoretical/conceptual approach(es): 8 Description of the theoretical/conceptual approach(es) used to develop the framework, including references and rationale for their use

Development process
- Process and methods: 9a Description of each step of the development process; 9b Description of how existing literature was gathered and used to inform the competency framework development, with a list of the references used; 9c Description of all methods used throughout the development process, including associated reference(s) and details of any modifications
- End-user contributions: 10 Description of all stakeholders, including end users of both the framework (e.g. the professional group consulted) and the services (e.g. patients/consumers and other healthcare professionals), who contributed to the development process, how they were selected (with considerations of equity, diversity, and inclusion), and how they participated
- Ethics: 11 Description of ethical considerations and approvals obtained where applicable

Evaluation and implementation
- Evaluation: 12 Description of the approach for evaluating the draft competency framework, including how feedback from stakeholders was gathered and used
- Implementation: 13 Suggestion for how the framework should be implemented and in what settings

Funding/conflicts of interest
- Description of how COI were considered and managed in the development process
Thirdly, important items may have been missed in the initial literature review (stage 2), but every effort was made to minimize this possibility by examining relevant reporting guidelines and engaging the expertise of the Delphi panel in round 1. Fourthly, some items identified from the knowledge synthesis were not considered relevant by the Delphi panel. In addition, the panel suggested 10 more items than the knowledge synthesis had generated. This may be partly due to wording (as evidenced by reworded items gaining agreement in round 3), but is most likely because the perspectives of the panel are informed by their real-world experiences of developing, reporting on, and publishing competency frameworks. The expert insight provided by the Delphi panel ensures the reporting guideline will be usable and useful. Finally, like any other reporting guideline, the CONFERD-HP reporting guideline is an evolving document that will require ongoing evaluation, improvement, and updating. Adaptations may also be needed across professions as the framework is used; for example, some professions (or specialties within professions) may require consideration of items not included in this guideline. The statement will be revised in the future based on user feedback, results of evaluations, and improved guidance on reporting guidelines. Those who use the CONFERD-HP reporting guideline are encouraged to submit comments via the CONFERD-HP website (www.conferd-guideline.org). Use of the CONFERD-HP reporting guideline will be encouraged by contacting the editors of journals that have published competency frameworks to elicit their support, and by informing national and international competency framework developers about the reporting guideline.

Funding
No funding was received for this study.