Abstract

Before problems can be solved, they must be defined. In global public policy, problems are defined in large part by institutions like the World Bank, whose research shapes our collective understanding of social and economic issues. This article examines how research is produced at the World Bank and deemed to be worthwhile and legitimate. Creating and capturing research on global policy problems requires organizational configurations that operate at the intersection of multiple fields. Drawing on an in-depth study of the World Bank research department, this article outlines the structures and technologies of evaluation (i.e., the measurements and procedures used in performance reviews and promotions) and the social and cultural processes (i.e., the spoken and unspoken things that matter) in producing valuable policy research. It develops a theoretically informed account of how the conditions of measurement and evaluation shape the production of knowledge at a dominant multilateral agency. In turn, it unpacks how the internal workings of organizations can shape broader epistemic infrastructures around global policy problems.

Introduction

Expert research informs and legitimizes global governance. It encompasses information created by scientific, technical, and professional methods of inquiry (van Kerkhoff & Lebel, 2006), supported by the agreed practices and preferences of specialized peers. Overarching policy concepts, such as poverty or development, are produced at different sites comprising a range of practices. In particular, international organizations play a key role in the creation and dissemination of knowledge on global policy problems, given the substantial resources, data, and incentives required (Boräng et al., 2018). In these sites, science, technology and politics are combined toward solving complex, interdependent problems (Haas, 1975). Classic conceptualizations of knowledge production around global policy issues tend to focus on the ways knowledge bestows authority and engenders influence, rather than on the everyday structures and forms through which knowledge is created, substantiated, and maintained (Bueger, 2015). The complexities of how knowledge on specific objects of inquiry (“epistemic objects”) originates and how the value of such knowledge is established remain underexamined. In particular, there is little work that considers contexts where multiple “ways of knowing,” each with attendant origins and objectives, are engaged simultaneously within a specific site.

The sites that define and constitute global policy problems are connected by epistemic infrastructures, which are interwoven structures through which knowledge is created and dispersed. These are the structural, technological, and social patterns of relations between actors, institutions, and material forms (e.g., indicators and templates) that give shape and substance to policy paradigms (Star, 1999). Epistemic infrastructures involve knowledge tools that become entrenched institutional practices. For example, specific multidimensional indicators become embedded in the epistemic infrastructure of the Sustainable Development Goals (SDGs), which are key pillars of global policy (Tichenor et al., 2022). The objects of poverty or development are thus produced at different sites within a broader overarching system, including international agencies, think tanks, private companies, universities, and the media, each with specific ways of knowing and generalizing the object. Within the broader system, there are sites that play particularly important roles in producing knowledge and exerting control, described in science and technology studies as “laboratories” (Bueger, 2015). Laboratories are mediating sites of practice that combine elements from different contexts to create and maintain orders of meaning. These sites often perform a maintenance function that holds together epistemic infrastructures. A key function of these sites is creating receptive audiences for their knowledge claims (Jasanoff, 2005).1 For example, dominant development research actors create the conditions for evidence and arguments around multidimensional indicators to be understood as legitimate by the development community as well as wider fields. Thus, the circumstances of knowledge production in laboratories shape the way specific global problems are defined and enacted and, more broadly, the way that knowledge is received.

In global public policy, one key laboratory is the World Bank, where much of the knowledge that underpins the SDGs is produced, compiled, transcribed, condensed, and compared (Law, 2003). Brought together in one place, everything that is salient (i.e., concepts, indicators, data, and networks) can be seen. Operating in a space where multiple boundaries intersect, the Bank is a site of integration and hybridization. It contains the logics and agendas of specific professional and disciplinary communities (Markauskaite & Goodyear, 2017), such as economics, politics, and development practice. These communities and orientations each come with a distinct set of intangible and material tools, forms of language, and patterns of interaction conducive to certain kinds of knowing. These logics and agendas are not always perfectly aligned; there can be conflict and trade-offs within and between them. It is thus necessary to examine the hybrid practices of knowledge production and evaluation in these sites. Yet the practices through which their epistemic objects become knowable have not been adequately addressed.

This article examines the World Bank’s Development Research Group (DECRG).2 DECRG is a dominant site in development economics, with unrivaled financial resources, staff numbers, and influence (Donovan, 2018; Ravallion, 2013). Its research team actively manages the logics of various social worlds in the pursuit of legitimacy. For example, DECRG researchers operate within academia, but also politics and policy, application and practice, the media, and commerce (Williams, 2020), each of which shapes how knowledge is produced and evaluated. This article seeks to divert understandings of knowledge production away from the influence of the World Bank as an actor, toward the practical infrastructures through which its knowledge is created, legitimated, and maintained. Considering practices directs us to the everyday maintenance of structures or orders of meaning (Bueger, 2015). Practices are relational dynamics that are enacted and re-enacted. They are made up of subjective and bodily elements, material and symbolic artifacts and technologies, along with objects and their use (Markauskaite & Goodyear, 2017). This understanding of practice enables us to examine the arrangements and procedures of everyday forms of organization, such as DECRG’s performance evaluation, as well as the social configurations that facilitate and sustain these structures. This, in turn, illuminates how these arrangements and procedures shape the concept of research value and make particular forms of knowledge more or less likely.

This article examines how the World Bank’s research department makes known its epistemic objects through engaging the practices of different fields such as academia, media, and politics. The article considers patterns of organization around evaluation using the lenses of boundary objects (Star, 1999), infrastructures (Star & Ruhleder, 1996), and fields (Bourdieu, 1986). It argues that the systems of knowledge production and evaluation in hybrid sites that produce dominant epistemic objects and infrastructures reflect hybrid practices at the intersection of multiple fields. It highlights quantification as a key part of the process of infrastructuralizing signifiers of value in policy knowledge production and, in turn, the policy problems they aim to define, contribute to, and legitimize. The article provides a missing piece in the literature by showing how the internal workings of organizations can shape the types of knowledge that make up the broader epistemic infrastructures around global policy problems.

Past research and theoretical departure points

Boundaries in hybrid sites

Available approaches for studying the role of science in global policy tend to suggest a straightforward relationship—either linear, such as the knowledge utilization model in policy science, or unified, such as the co-producer model in science and technology studies (Hoppe, 2009). However, both approaches tend to rely on the logic of either science or politics, which does not account for the complexity of the practices of knowledge production, evaluation, and use in global policy. More fruitful conceptions focus on movement across boundaries. The literature on boundary-work—where divisions between fields of knowledge are established, maintained, or destabilized—continues to grow (Pachucki et al., 2007), yet there is scarce empirical research on the intersection of multiple boundaries between fields. Much of the literature focuses on “brokers,” “boundary-spanners,” or “intermediaries” who act as go-betweens for multiple stakeholders. This type of work often considers the role of specific actors in convening, translating, facilitating, and mediating between stakeholders (Tribbia & Moser, 2008). Examples include actors bringing together organizations, sectors, governments, and constituents (Gerlak & Heikkila, 2011) and collaborations between the state, universities, and industry (Boardman & Bozeman, 2007). There is also a specific literature that considers movement at the science–policy interface, such as co-production between scientists and others in the policy process (van den Hove, 2007). Other work has looked at the combinations of two distinct logics or institutional cultures. For example, research managers negotiate between the logic of independence, sustainability, and freedom and the logic of integration, relevance, and predictability (Adler et al., 2009). Another example is how open science proponents seek to apply the logic of the media and public space to improve the accessibility, openness, and reproducibility of research (Mirowski, 2018). Yet, the literature has largely neglected instances where various logics are negotiated by actors who are beholden to multiple requirements at the same time. There is a need to study hybridization across boundaries as practiced by those who must simultaneously manage combinations of logics from diverse social worlds.

Despite recognizing the importance of boundary work, the literature does not sufficiently analyze the practices, structures, or conditions that shape knowledge production and determine which “ways of knowing” are valuable in hybrid spaces. Existing work offers limited understanding of the practices through which actors negotiate the boundaries between established fields to produce work that is valuable in multiple contexts. Further research is needed to understand the systems of knowledge production and evaluation in these hybrid spaces. One useful theoretical lens focuses on boundary objects—enacted entities that “allow different groups to work together without consensus” (Star, 2010, p. 602)—and how they crystalize into infrastructures. In this view, boundary objects are sets of work arrangements (i.e., procedures and tools) that link different social worlds. These objects operate at the local level within each world as well as at the collective level within the space between social worlds, with actors toggling between the local and collective forms of the object. Star (2010) observes that, in some cases, the movement across boundaries in this communal space happens at such scale or frequency that the objects come to form infrastructures. These infrastructures are observable configurations embedded in social arrangements and technologies that connect multiple events or locations. Members of social worlds come to understand them through socialization, and they rely on existing conventions for their shape and function. Infrastructures are transparent, standardized forms of organization, both material and enacted, that shape and are shaped by the practices of a community.

Evaluation is a key facet of the materiality that shapes the practice of science. The observable forms of assessment within communities alert us to the ways knowledge is collected, disciplined, and coordinated. Evaluation systems, particularly metrics, promise accountability, transparency, and clarity. They often take the form of material technologies (i.e., templates, indicators, and data) that come to be imbued with significant power (Derrick & Pavone, 2013; Espeland & Sauder, 2007). Research has also highlighted their tendency to encourage strategic behaviors (de Rijcke et al., 2016) or promote risk avoidance in research agendas (Butler, 2007). Evaluation systems shape and are shaped by institutional practices, and there is a need for research on their effects on knowledge production. The role of evaluation processes in supporting infrastructures across social worlds is thus a key area for investigation. Key here is how these processes shape the concepts of value and worth and apportion forms of legitimacy and power. Insights into how value and worth are managed within hybrid research contexts, including organizational practices and individual responses, are needed. Further theoretical development is required to account for the intersection of multiple social spheres involved in the production of expert research that shapes overarching epistemic objects. This article will make visible how patterns of evaluation bring together several fields and the effects this has on the practices of policy research.

The shared space between multiple fields

Following an orientation to examine epistemic practices within particular contexts leads us to consider the social worlds in which actors operate. A fruitful way of understanding these worlds is offered by Bourdieu (1993), who describes fields of struggle, which are produced and reproduced when actors seek to control certain resources or “capitals” (such as scarce economic resources, cultural tastes, social networks, and recognition and prestige). Actors have a shared understanding of the rules that govern the field, a sense of what can be gained or lost, and an interest in gaining the upper hand. It is through actors’ acceptance of the implicit and explicit rules and their investment in the struggle that the field is created (Lamont & Lareau, 1988). Actors engage within the logic of the field by seeking various capitals that afford power. For example, academics aim to establish credibility by producing research that advances knowledge, including journal articles, invited talks, and conference papers (Williams & Lewis, 2021). Collegial input is gained through informal and formal peer review, and value is measured and understood with reference to academic reviews, citations, and publication data. By contrast, policy specialists seek to establish authority by facilitating policymaking, political action, and public debate through reports, talks, and workshops. They often gain feedback from a range of stakeholders, and the value of their outputs is measured and understood with reference to policymaker engagement or use in drafting legislation or shifting political discourse. The spread of capitals is not insignificant; it determines actors’ strategies, practices, and affordances (Medvetz, 2008). Thus, even where actors might share similarities, their practices are shaped by the available capitals and logics of relevant fields.

Taking this approach, attention is given to power, competition and struggle within and between fields, as well as to individual socialization. This orientation allows for an examination of the practices of knowledge production without linking automatically to broad or national interests and influence, or to narrow descriptions of particular organizations. It therefore allows for scrutiny of interaction between social structures and cultural forms (Benson & Neveu, 2005). However, there is a lack of attention to the relations between social spaces (Liu, 2020), which is crucial in understanding how actors negotiate the messy boundaries between them. For example, a think tank researcher might be expected to publish in a range of formats (Gonzalez Hernando, 2019; Ruser, 2018), such as journal articles and blogs, with potentially competing emphases on academic citations and wider public engagement. Of some use here is work that has articulated the spaces between fields (Eyal, 2012), their variations and heterogeneity (Krause, 2017), and their relations to one another (Liu, 2020). This goes some way to rectify the impression that fields are distinct spheres that have clear boundaries separating them (Eyal, 2013). The present study is concerned with the practices that arise to facilitate movement across multiple boundaries in overlapping spaces. In these cases, rather than being tangentially linked or one nested in the other, overlapping spaces merge and share a communal area. The processes of connection and interaction of actors within the communal area give it its shape.

Eyal (2012) offers a novel view of the shared area between overlapping spaces, whereby the boundary between fields is “a thick boundary zone” made up of interactions and links between them. Inspired by actor network theory, he highlights expertise networks in the space between three fields (i.e., bureaucratic, academic, and media). A key insight is that the thick boundary is more porous and unregulated than areas at the center of each space. Yet the focus on the networks within the shared space does not offer much insight into the positions of the actors and how these are negotiated (Liu, 2020). For example, in Eyal’s analysis, actors are located within government, academia, or commerce, rather than operating as hybrid actors simultaneously beholden to all three spaces. Thus, understanding overlapping spaces requires attention to the distances and connections between those inhabiting the shared area. In his study of American think tanks, Medvetz (2012) shows that although the boundaries are often ill-defined, it is possible to map the relations between the actors and practices in such interstitial spaces. As “hybrid intellectuals,” think tank researchers must borrow from the suite of practices of a range of more fixed professions to create their signature mode of policy knowledge. Further insight can be gained from Evans and Kay’s (2008) work on brokerage of rules, alliances, resources, and frames across activist, political, and legal fields. A crucial element in their analysis is actors’ strategic practices to integrate fields. The authors thus put strategy at the center of the analysis of field overlap, focusing on how actors use leverage across fields. From here, what is required is an examination of how leverage (in this case, research value) is created by the strategic development of practices that draw on the capitals of diverse fields for different objectives.

In a study of how research actors accrue legitimacy, Williams and Lewis (2021) show how policy researchers increasingly create “impact” in fields outside of academia, including politics and policy, media and public, application and service, and economic and enterprise. The resources of each field are juggled to establish worth and legitimacy from different sources. The authors show that research actors must balance contributing to existing knowledge and ways of knowing (academic), contributing to policy, politics, or governance (politics and policy), enabling changes in practice, processes, or procedure (application and service), facilitating changes in public debate and perception (media and public), and facilitating economic changes or resources (economic and enterprise). The communal space between fields makes available new configurations (Eyal, 2012). For example, academics can combine academic practices with media or policy skills to establish legitimacy as an impactful actor. From this theoretical base, this paper will contribute an empirical account of the everyday practices of knowledge production and evaluation in a hybrid site that operates in the space between fields.

In hybrid sites of knowledge production, the scale and frequency of movement across boundaries often leads to structured attempts to control the toggling back and forth between local and shared boundary objects (Star, 2010). One key attempt takes the form of evaluation, which tries to standardize and commensurate those elements that link different communities or groups. The practices of knowledge production and evaluation at sites that operate in the shared space between fields are thus a key site of tension “between the formal and informal, the ill structured and the well-structured, the standardized and the wild” (Star, 2010, p. 614). This article offers a novel, in-depth examination of the structures, technologies, and social configurations through which knowledge is produced and evaluated at the hybrid site of DECRG. Examining the knowledge practices of DECRG provides a lens into the infrastructuralization of the nexus of actors, institutions, and tools through which global problems are defined and enacted. This is important because World Bank research is influential in the creation of epistemic objects, such as poverty or development, and epistemic infrastructures, such as the SDGs, that shape global governance.

Methods

Case study: context and selection

As a global organization, the World Bank aims to assist countries to overcome a range of pressing market and societal problems. In 1995, the Bank styled itself as an impartial “knowledge bank,” a major rhetorical shift from capital provider and development agency. This change was couched in the vision of creating knowledge for development (and with it, returns on capital investments) (Everett, 2004). Since then, reforms have been directed toward providing an evidence base for financial and policy operations and knowledge for countries’ self-determination. The Bank implemented an Open Knowledge strategy in 2010, making its World Development Indicators and all Bank publications accessible. The World Bank, and DECRG specifically, has been an essential legitimizing force behind the neoliberal free-market paradigm (Broad, 2006). Thus, although the financial authority of the Bank has waned, it maintains its principal position as a source of intellectual influence (Donovan, 2018).

The case study focused on the Bank’s research unit, DECRG, which aims to generate new knowledge through publication in scholarly journals as well as wider dissemination. The research department aims to set the path for aid and development agencies and contribute knowledge and expertise to policymakers, practitioners, and academics. Its primary role is to conduct academic research and analysis that spans countries and sectors. It also provides support to the Bank’s regional centers to ensure synthesis between official policy advice and on-the-ground knowledge. DECRG was selected because it has enormous power in producing, legitimizing, and infrastructuralizing dominant epistemic objects that are acted upon by different organizations and countries (Bandola-Gill, 2022). A key example is the department’s integral role in the development of the World Bank’s Multidimensional Poverty Measure, which was one of the key drivers of the epistemic infrastructure of the SDGs.3 DECRG also contributes to a diverse evidence base that informs and is informed by the SDGs. In particular, it and other Bank departments are major producers of research on SDG 1: “End Poverty.”4 The organizational arrangements around valuable research in global public policy, and associated relations of power in the overlapping space between fields, are thus a critical area for investigation.

Case study approach

The study utilized an instrumental case design (Stake, 1995), drawing on a descriptive and explanatory approach (Yin, 2017). It focused on how DECRG organizes knowledge production and evaluation and the related practices taken up by actors. This study examined the structures and technologies of evaluation (i.e., official measurements and procedures) and the social and cultural processes (i.e., everyday inter-subjective elements) in producing valuable policy research. Through a 3-month ethnographic period undertaken in 2017 at the World Bank headquarters in Washington, DC, I examined the context and processes of knowledge production and evaluation at the institutional and individual levels, including the acquisition of various capitals and the existence and emergence of evaluative practices. Data collection included interviews, as well as fieldnotes and digital observations (e.g., emails, website content, and online documents). This allowed for triangulation of the case, which helped corroborate participants’ accounts and clarify objective information (Guba & Lincoln, 1994). Consent was obtained to report the name of the organization, but interview transcripts were de-identified at the point of transcription and all personal information was removed from interview excerpts. I conducted 36 interviews with World Bank staff, 28 of whom were researchers within DECRG, representing each of DECRG’s eight thematic research teams and every level of seniority. The remainder comprised staff in related areas of the Bank, including the business intelligence, library, and communication teams, who provided broader insights into the production and dissemination of DECRG’s research.

Analysis of data was undertaken using Atlas.ti software. The body of data was coded following the tradition of qualitative content analysis (Mayring, 2004). The process began with categorization of similarities and differences in the data. Patterns that displayed the properties of value or worth that guided thinking and action were identified, with evidence compiled into groups of similar excerpts via an inductive coding structure. The second phase of analysis identified the symbolic effects of the above patterns in terms of what is deemed valuable or legitimate. The application of content analysis provided insights into the consequences of structures and strategies for certain forms of knowledge. Analysis focused on describing how boundaries are negotiated in everyday research activities and processes. This process showed how researchers engage in everyday practices in a hybrid site that values multiple forms of knowing. The case study demonstrates the processes that are engaged in defining, producing, and evaluating valuable research.

Results

The following section outlines the results of the analysis of interview, document, and fieldnote data from the World Bank. The findings first describe DECRG’s research character, before examining the structures and technologies of evaluation and the social and cultural processes around research value. Analysis of internal and external organizational materials illuminates how DECRG positions its research within its broader context.

Knowledge production and research character

DECRG’s research focuses on multiple facets of development, broadly covering evaluation, finance, environment, agriculture, human development, poverty, growth, and trade. Researchers produce outputs for diverse audiences through academic journals, working papers, books, blogs, and reports. The language and concepts of the discipline of economics, and academia more broadly, dominate institutional materials. However, policy concerns are also paramount, with the website and other materials emphasizing DECRG’s applied nature. As stated on the website:

Bank research, in contrast to academic research, is directed toward recognised and emerging policy issues and is focused on yielding better policy advice. Although motivated by policy problems, Bank research addresses longer-term concerns rather than the immediate needs of a particular Bank lending operation or of a particular country or sector report (DECRG, 2015a).

Thus, there is a temporal and epistemic tension in representations of the Bank’s research, where its intellectual work seeks to improve policy but not at the expense of a long-term focus.

DECRG also emphasizes media strategies and accessibility, emulating think tanks and advocacy tanks by making all materials freely available and readily accessible via a range of online tools that enable easy searching, filtering, and downloading. It also demonstrates the commercial value of its research and analysis products by emphasizing knowledge wholesaling and consulting. These multiple demands are shown in the excerpt from an official document below:

Research produced at the World Bank - and particularly the work produced by the research department - must satisfy a variety of demands. It must meet standards of technical quality on par with any major research university. It must push forward the frontier of development research, both by adding to the available data on development topics and advancing research techniques. And it must achieve influence by expanding the base of policy-relevant knowledge for the World Bank’s operational staff and for policy makers in developing countries (DECRG, 2015b).

Thus, DECRG reflects the hybridity of the communal space between fields, balancing the language and resources of a range of fields. The research unit demonstrates a hybrid capital profile, where the logics of multiple fields are held in tension. The next section considers structural elements, before examining the social and cultural practices at play.

Structures and technologies of evaluation

Analyzing organizational documents and interviews with management at DECRG allowed for identification of how the overlapping space between fields plays out in practice in the structures and technologies of evaluation. A key way that the World Bank governs the knowledge it produces is through quantification. DECRG uses numbers to standardize and categorize forms of knowledge in a way that makes them comparable between researchers. This takes the form of a performance-based salary increase tool, whereby the size of an increase depends on a performance evaluation rating, called the salary review increase (SRI, ranging from a low of 1 to a high of 5), and on starting salary. The SRI ratings are given relative to the performance of colleagues in a department. For DECRG members, the annual objectives that the evaluations are conducted against are set out in an Overall Performance Evaluation (OPE) template, shown in Table 1.

Table 1.

World Bank Research Department’s Overall Performance Evaluation template.

Research outputs (accepted in last year): Journal articles (A, B, C list); revise and resubmits; chapters in books; authored books; edited volumes; working papers; flagship reports (World Development Report, etc.)

Research wholesaling activities: Data sets (large survey or new primary data at country level, new compilation of existing data, or significant update to a past data set); software (publicly available, callable, and reusable programs with a user manual, FAQ materials, reference materials, updating procedures, and provision of training, or with a help file and instructions); courses developed and organized; presentations at training courses

Outreach/dissemination: Research seminars and conference presentations; broader outreach activities (blog entries, op-eds, briefings to country offices, media articles); working paper downloads (WB + RePEc, last 2.5 years); working paper downloads (SSRN, last 2.5 years)

Citations: Citations in Google Scholar of papers published during the last 5.5 years

Cross-support: Weeks in the last year

The template is organized according to counts of research outputs, research wholesaling activities, outreach or dissemination activities, citations, and weeks of consulting to other parts of the Bank. This process of quantification links individual researchers to the epistemic objectives of the system, determines what constitutes satisfactory performance, and assigns responsibility to the individual researcher for achieving valuable intellectual work.
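
To make concrete how such a template commensurates heterogeneous activities into comparable ratings, the Python sketch below tallies OPE-style counts into a single score and maps the ranked scores onto a relative 1–5 SRI band within a department. It is purely illustrative: the category names, weights, and aggregation rule are assumptions made for the sake of the example, since the article reports only that the SRI is awarded relative to colleagues’ performance, not how (or whether) the counts are formally combined.

```python
from dataclasses import dataclass
from typing import Dict, List

# Illustrative sketch only. The OPE template counts heterogeneous activities
# (articles, working papers, downloads, citations, cross-support weeks), and
# the SRI rating (1-5) is awarded relative to departmental colleagues. The
# categories, weights, and aggregation rule below are hypothetical; the
# article does not report how, or whether, DECRG combines counts into a
# single score.

@dataclass
class OPERecord:
    researcher: str
    counts: Dict[str, float]  # e.g., {"journal_articles": 2, "blog_entries": 5}

# Hypothetical weights standing in for the implicit valuation of each category.
WEIGHTS = {
    "journal_articles": 5.0,
    "working_papers": 2.0,
    "flagship_reports": 3.0,
    "blog_entries": 0.5,
    "citations_5yr": 0.1,
    "cross_support_weeks": 1.0,
}

def composite_score(record: OPERecord) -> float:
    """Commensurate heterogeneous counts into a single number (toy weighting)."""
    return sum(WEIGHTS.get(k, 0.0) * v for k, v in record.counts.items())

def relative_sri(records: List[OPERecord]) -> Dict[str, int]:
    """Map ranked composite scores onto a 1-5 scale relative to colleagues."""
    ranked = sorted(records, key=composite_score)  # lowest score first
    denominator = max(len(ranked) - 1, 1)
    return {
        rec.researcher: 1 + round(4 * i / denominator)  # highest score -> 5
        for i, rec in enumerate(ranked)
    }
```

The point of the sketch is the commensuration step itself: once qualitatively different activities are expressed on one scale, they can be ranked against colleagues, compared across epistemic agendas, and attached to rewards.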

Moreover, this evaluation system contains certain assumptions arising from particular logics and particular uses. Thus, it provides an infrastructural form that makes possible certain modes of knowing, while making others less likely. Using the theoretical framework outlined above, it is possible to reconceive these evaluative criteria in terms of overarching fields and the “epistemic agendas,” or ways of knowing that are prioritized, underpinning different material forms. Re-coding the World Bank’s official promotion processes, incentive schemes, and internal evaluation criteria shows how the space between fields plays out in practice in the form of multidimensional structures of evaluation. Table 2 links the field, the material forms that are counted as indicators of worthwhile research, and the epistemic agendas that underpin them. These multifaceted categories contrast with the structures of institutions that operate primarily in a single field, such as a science-based technology company that might determine worth by the number of patents rather than scholarly citations, or a campaigning organization that is primarily oriented to policy change. Through the process of quantification, multiple forms of knowledge production with different epistemic agendas are rendered comparable and comprehensible, which becomes a central organizing principle for the Bank’s intellectual authority.

Table 2.

Structural indicators of worth at the World Bank Research Department.

Field | Epistemic agenda | Key numerical indicators of worth
Scholarly | Enhancing conceptual knowledge | Journal articles; research presentations; citations
Politics and policy | Facilitating knowledgeable policies and politics | Working papers; briefings to country offices; consulting
Application and service | Facilitating knowledgeable practices and processes | Flagship reports; datasets and software; training courses; consulting
Media and public | Enhancing accessible knowledge | Blog entries; op-eds; downloads; citations in media articles
Economic and enterprise | Enabling the commodification of knowledge | Consulting; reports; training courses; briefings to country offices

Analysis of interviews given by DECRG’s senior leadership provides further insight into the patterns of knowledge production that are valued by the organization. One senior leader described the importance of a balance between academic elements and contributing meaningfully to operational aspects of DECRG’s work, noting management’s interest in:

To what extent [researchers] are expected to give keynote speeches, to what extent they are becoming prominent people in their fields, but their impact also, because part of the benefit of being located in an institution like this is the ability to directly work with operational colleagues […] I do keep a very close eye on that to make sure that we deliver what we promise we deliver.

There is also evidence of tension between quantifiable indicators of impact and actual impact in practice:

So, media citations, yes, this is something that we also collect as part of our OPE process to see whether there are citations from other places, but we don’t necessarily take that extremely seriously because at different points in your career, obviously, you’re going to have less visibility in the media, also, in these institutions there are communications they prefer to push, so I don’t evaluate individual researcher based on that. […] We are certainly taking account of whether people are active in the blogs, and Twitter, and that’s how that makes changes, we do take an account.

Here, media citations are measured quantitatively in the OPE template, but given less importance in practice because of a range of mitigating factors. In contrast, some practices that are not included in the OPE, such as the use of Twitter to disseminate research findings, can nevertheless be considered during the evaluation process.

Similarly, another leader described the inherent difficulty in understanding genuine impact beyond the available metrics and the necessary flexibility that allows for a deeper evaluation:

There are things you cannot measure. For example, maybe that year somebody convinced the minister to do something wonderful that’s not going to be on [the OPE evaluation] so that’s why there is a narrative, there is a written thing, there is a meeting. Then at the meeting you say “I know this person has a predicted [SRI] regression of 3.3, but this year she did this”.

Indeed, although social media is not formally included in the OPE process (except blogs), one senior leader indicated in a staff meeting the importance of using online tools in order to disseminate DECRG’s research directly to beneficiaries:

We are very interested in reaching out to people we see as our clients, not the governments, but the people. Being able to influence the people in poor countries so that they are able to demand change from their government, it’s something that we want to do. So, Twitter or Facebook, or whatever is important, blogs, everything that reaches out to the people.

This excerpt indicates organizational support for particular types of knowledge production and dissemination that may influence researchers’ perceptions of what is valued by the organization. This demonstrates the significance of social and cultural practices in addition to those that are formalized through quantification or written record. The above analysis indicates the nebulous nature of evaluation, comprising both structural templates and accompanying cultural and social elements that draw from a range of different fields.

The formal evaluation instruments at DECRG therefore standardize and categorize phenomena within an overlapping space that contains multiple epistemic objectives. Fulfillment of these criteria requires skills and activities deemed appropriate and worthy by a range of fields. This organizational infrastructure makes up the hybrid conditions through which some types of knowledge are valued and others are ignored. Thus, processes of quantification come to establish the conditions of knowledge production, going beyond the ascribed meanings of the numbers themselves. Production of valued policy research requires managing the tension between the diverse activities, skills, and practices needed to obtain capital from and export knowledge to multiple fields.

Social and cultural processes of evaluation

The measurement of value in research is prescriptive and performative; it contains a theory of what knowledge should be and creates the appropriate forms that intellectual interventions should take. Thus, it is also important to consider the everyday subjective patterns and processes of evaluation in knowledge production. The subsequent analysis of interviews shows similarities and variations between the structural and cultural features of evaluation at DECRG. These give further insight into the practices that researchers understand as expected by the organization as well as those that are valued by individuals themselves. Researchers frequently indicated the importance of scholarly legitimacy:

If it’s not really respected by researchers elsewhere, it’s not really going to be very influential, is it? Yes, we could be an in-house research department that’s serving the needs of the Bank […] our colleagues can say, “Okay, this is really valuable to us,” but like who’s gauging exactly what the value of that is? You want something independent – it’s not just because it helped operations do something, right? Has it influenced some other ideas? Has it been informed by existing literature? And is it really cutting edge?

However, it is clear that academic credibility is not sufficient on its own; there is an ongoing tension in what is expected of Bank research. Researchers frequently highlighted the simultaneous requirement to influence policymakers:

Well, we get very mixed signals frankly, within the research department. If I talk to other researchers, there’s a pretty clear sense of how academic papers are valued. There’s sort of the prestige of journals, and there are citations, and there are downloads, and that’s all pretty clear. And that’s useful to a point, but it is clear that having an impact is also of value.

Similarly, the focus on providing recommendations and practical solutions to problems was also made salient by researchers. The interplay of academic and pragmatic practices is shown here:

A sign of a system that’s working well is one that changes the course of basic research. And that also leads to better solutions to practical problems. The valuation I will be looking for is along both of those dimensions. What problems, who have we touched, how many have we touched and how valuable was it. But also, have we changed the course of intellectual inquiry in academic disciplines? It’s a high bar but that’s the aspiration.

These excerpts show how creating worthwhile knowledge requires negotiation of practices within the communal space between fields; to be at once academic while also influential in the operational arena. This reflects an interest in working across fields toward a common goal. To sidestep disputes between the logics of different fields, researchers create new sets of hybrid criteria that encompass multiple concerns.

An orientation to harnessing traditional and new forms of media to ensure research is accessible to non-specialists and the public was also a dominant feature of researchers’ constructions of value. As one stated:

If your blogs receive exposure to journalists, if they think it’s an interesting idea you’re putting out there, they might value it - that increases your chances of getting exposure in newspapers and when you get the newspapers there’s a bigger chance that policymakers might take notice of your work. […] Without blogging, most of the papers would’ve only been circulating within academic communities.

This focus on the accessibility of research is also related to an orientation to the marketability of research and research skills. As one researcher explained:

We have this sort of dual role where we spend two thirds of our time on our own research and one third on workloads requested by operations that we sell our time for.

This reflects an ever-present requirement that research skills be harnessed for use by actors outside of academia. This often takes the form of research as a product that is “sold” to the consumer base of donors and country offices. Thus, the above analysis shows that researchers are necessarily hybrid actors whose practices take multiple forms, requiring a broad set of skills and priorities. Individuals’ subjective positions thus reflect the overlapping space between fields, which accounts for the simultaneous divergence and convergence between what is valued by the organization and individuals.

Bank researchers therefore engage with various field logics in their construction of valued, legitimate knowledge. Table 3 summarizes the results of the analysis of interview data. It outlines those elements that are deemed valuable by research staff and the methods of assessment researchers use to understand their own and others’ intellectual labor within the overlapping space. The key social and cultural indicators of worth tend to resist quantification; when researchers reflect on the value of their work, they consider factors such as the prestige of the journal, actual shifts in policy or observable changes in practice, as well as the accessibility of their work and the value they have provided to clients. One example of a disconnect between structural and cultural elements is that Twitter is valued by researchers, potentially gaining media capital by facilitating the broader reach of outputs to the public, but it is not valued by formal evaluation processes, which favor blogs and traditional news media. Thus, the Bank’s researchers value intellectual production that is scholarly, critical, and rigorous. Yet, they must also gain the skills and capitals of policy professionals who provide recommendations, pragmatic technical specialists who make tools and ideas available in practice, media specialists who disseminate key messages to a wide audience, and economic actors who provide products to intended consumers.

Table 3.

Social and cultural indicators of worth at the World Bank Research Department.

Field | Epistemic agenda | Key non-numerical indicators of worth
Scholarly | Enhancing conceptual knowledge | Prestige of journal, peer review, cognitive independence, rigor
Politics and policy | Facilitating knowledgeable policies and politics | Shifts in policy, feedback from policymakers
Application and service | Facilitating knowledgeable practices and processes | Shifts in practices or processes, feedback from operational staff
Media and public | Enhancing accessible knowledge | Availability and openness of work, contribution to public discourse
Economic and enterprise | Enabling the commodification of knowledge | Value to operational colleagues, contributions to clients

What constitutes valuable policy research is thus determined at the intersection of a range of fields. This can be seen in the structures of evaluation in the organization, but also in the experiences and tendencies of individual researchers. Indeed, internal and external evaluations at the unit level also use a similar range of indicators (Banerjee et al., 2006; DECRG, 2015) and tend to mirror the hybridity seen in the evaluation of individual actors. Thus, rather than operating primarily within the logic of either research or politics, with one tangentially linked to or nested in the other, the Bank’s evaluative structures and cultures encompass a range of epistemic goals, which require simultaneous orientation to the activities, practices, and skills of multiple fields. For the organization, the movement across fields is formalized and standardized through practical infrastructures of evaluation. For researchers, holding these practices and forms of worth in balance is a complex, ongoing task. As one researcher remarked, “we try to make the policy community happy, the operational community in the bank happy and the academic community happy, and that’s not an easy Venn diagram.” The space between fields thus requires policy research actors to enact the appropriate practices of various criteria of evaluation simultaneously, and in doing so makes particular forms of knowledge more likely.

Discussion

In its pursuit of legitimacy, the World Bank’s research unit seeks to instill confidence that its power to inform decisions on collective problems is justified and appropriate. It utilizes a range of measures to ensure the knowledge created by the Bank creates stable links with various pockets of the political and social environment. Establishing a reputation as a knowledge bank that defines global policy issues relies heavily on hybridity that draws on multiple bases. This article demonstrates that practices from a range of fields are stabilized and standardized through material tools that demarcate valuable policy knowledge. The conditions that exist within the communal space between fields shape what is understood as valuable policy research. The article shows how forms of knowledge underpinned by different epistemic agendas are brought together through the infrastructures of legitimate knowledge creation.

Evaluation systems are rooted in assumptions that adhere to a certain logic and are intended for particular uses (Dunne & Raby, 2013). Decisions around evaluation produce an infrastructural form that privileges certain modes of thought over others. Deriving meaning from numbers, as DECRG does, relies on social and material resources, including money, expertise, administration, and technology (Tichenor et al., 2022). In particular, numerical representations, drawn from diverse fields, are essential elements in these infrastructures that shape the ways in which actors define their intellectual work. At the World Bank, quantification around knowledge production links actors and instruments, assigns judgments, and delegates responsibility (Miller & Power, 2013). Through standardizing, categorizing, and simplifying, comparisons become possible between different forms of knowledge underpinned by distinct epistemic agendas (Espeland & Sauder, 2007). Quantification ascribes stability to measured phenomena (Espeland & Stevens, 2008) and allows them to be transferred across fields. Thus, studying the infrastructural systems at the World Bank foregrounds the templates that are used by managers to collect data, the digital resources required to access and analyze it, and the administrative structures for managing, incentivizing, and rewarding on that basis. These infrastructures make up the conditions through which some types of knowledge production are valued and others are ignored. Thus, the internal limitations of knowledge production at a dominant site have important consequences for the materiality of the broader epistemic infrastructure that underpins global public policy.

The material qualities of the indicators, targets, and research outputs at the World Bank allow them to move across and between fields. The structures and practices are open, unfinished, and subject to change. They operate in the background as taken-for-granted assumptions. The respective social and cultural forms of established fields thus also come into play. The indicators come to provide legitimacy to the actors that produce and use them, thus giving character to DECRG as a whole. In turn, these set the stage for what constitutes valuable research for other players in the field. In addition to being ensconced in a mode of scholarly intellectual production, the Bank’s research team must also utilize and gain the capitals of policy professionals, technical specialists, media specialists, and economic actors. Thus, through various structural features and cultural cues, valuable research requires hybrid intellectual practice shaped by the overlapping space where fields intersect. Actors therefore come to perform mixed capital profiles, displaying diversity in strategies, goals, and interests over time. These strategies can then translate into forms of power, legitimacy, and prestige. An examination of research value shows how individuals and organizations create hybrid intellectual practices. It debunks the notion that this type of intellectual work is an extension of a single field or a dichotomy of two defined areas. The practices of knowledge creation are constantly negotiated within the communal space between fields, dependent on actors’ capital profiles and objectives.

By examining the knowledge practices of a multilateral organization that seeks to comply with the expectations of a range of audiences, this article shows the boundary-work through which multiple field logics overlap. The epistemic diversity of this communal area is significant because different practices come to be associated with different levels of value (Bandola-Gill, 2019). This article provides an important first step in empirically examining how research value is constructed in a hybrid context, in the communal space between fields. This is an essential task in an increasingly complex world that relies on knowledge produced and synthesized by a range of research actors operating across the boundaries of fields, disciplines, and nations. In hybrid settings, broad and specific goals must be brought together (Star & Ruhleder, 1996) by combining multiple epistemic agendas and setting new standards as organizing principles for what constitutes valuable research. The World Bank is a key player in the intricate relationships that define poverty and development and, in turn, in how these problems are measured and addressed around the world. This article has addressed an important gap in the literature by exploring how the value of research is constructed in an institution that shapes the landscape of global policy problems. From this base, a crucial next step would be to examine how DECRG shapes the framing and legitimization of policy decisions in the locations it is trying to reach. Future research could also consider the ramifications of hybrid pressures and incentives for the policies enacted by states and for the actors who draw legitimacy from them.

Acknowledgements

I am grateful to Professor Michele Lamont and the Harvard Weatherhead Center for International Affairs Research Cluster on Inequality and Inclusion for their valuable feedback on an early draft, and particularly to Dr Kobe De Keere, Dr Stefan Beljean, and Dr Jonathan Mijs, who reviewed a previous version of this article.

Funding

This work was supported by the Economic and Social Research Council grants ES/V004123/1 and ES/N016319/1. This article is part of the themed issue “Global Public Policy in a Quantified World: Sustainable Development Goals as Epistemic Infrastructures,” edited by the ERC-funded METRO project (“International Organisations and the Rise of a Global Metrological Field”). As such, the article has received publication support from the H2020 European Research Council (715125 METRO (ERC-2016-StG)).

Conflict of interest

None declared.

Footnotes

1. Jasanoff (2005) sets out the concept of “civic epistemologies,” which are “publicly accepted and procedurally sanctioned ways of testing and absorbing the epistemic basis for decision making”; the concept is related to, but distinct from, the notion of “epistemic infrastructures.”

2. The acronym DECRG comes from Development Economics Vice Presidency (DEC) Research Group.

3. The multidimensional poverty measure was created by the Global Poverty Working Group, comprising members from DECRG, the Development Economics Vice Presidency Data Group (DECDG), and the Poverty and Equity Global Practice (www.worldbank.org/en/topic/poverty/brief/multidimensional-poverty-measure).

4. The World Bank was the most prolific institution in producing journal articles on topics underpinning SDG 1 published from 2011 to 2020. This is based on Elsevier’s (Scopus) 2020 search queries for each of the 16 SDGs, which were used as part of the 2020 Times Higher Education (THE) Impact Rankings. Bank researchers produced 304 publications related to SDG 1 from 2011 to 2020, which were cited 6,842 times. Harvard and Oxford followed with 261 and 259 publications respectively.

References

Adler, N., Elmquist, M., & Norrgren, F. (2009). The challenge of managing boundary-spanning research activities: Experiences from the Swedish context. Research Policy, 38(7), 1136–1149. https://doi.org/10.1016/j.respol.2009.05.001
Bandola-Gill, J. (2019). Between relevance and excellence? Research impact agenda and the production of policy knowledge. Science and Public Policy, 46(6), 895–905. https://doi.org/10.1093/scipol/scz037
Bandola-Gill, J. (2022). Statistical entrepreneurs: The political work of infrastructuring the SDG indicators. Policy & Society, 41(4), 498–512.
Banerjee, A., Deaton, A., Lustig, N., Rogoff, K., & Hsu, E. (2006). An evaluation of World Bank research, 1998–2005. Washington, DC.
Benson, R. D., & Neveu, E. (2005). Bourdieu and the journalistic field. Polity.
Boardman, C., & Bozeman, B. (2007). Role strain in university research centers. Journal of Higher Education, 78(4), 430–463. https://doi.org/10.1080/00221546.2007.11772323
Boräng, F., Cornell, C., Grimes, M., & Schuster, C. (2018). Cooking the books: Bureaucratic politicization and policy knowledge. Governance, 31(1), 7–26. https://doi.org/10.1111/gove.12283
Bourdieu, P. (1986). The forms of capital. In G. Richardson (Ed.), Handbook of theory and research for the sociology of education (pp. 241–258). Greenwood Press.
Bourdieu, P. (1993). Language and symbolic power. Harvard University Press.
Broad, R. (2006). Research, knowledge, and the art of ‘paradigm maintenance’: The World Bank’s Development Economics Vice-Presidency. Review of International Political Economy, 13(3), 387–419. https://doi.org/10.1080/09692290600769260
Bueger, C. (2015). Making things known: Epistemic practices, the United Nations, and the translation of piracy. International Political Sociology, 9(1), 1–18. https://doi.org/10.1111/ips.12073
Butler, L. (2007). Assessing university research: A plea for a balanced approach. Science and Public Policy, 34(8), 565–574. https://doi.org/10.3152/030234207X254404
de Rijcke, S., Wouters, P. F., Rushforth, A. D., Franssen, T. P., & Hammarfelt, B. (2016). Evaluation practices and effects of indicator use: A literature review. Research Evaluation, 25(2), 161–169. https://doi.org/10.1093/reseval/rvv038
DECRG. (2015b). Research at work 2015: Turning insights into impact. http://www.worldbank.org/content/dam/Worldbank/Publications/DEC/Research_TrningInsightintoImpact_med.pdf
Derrick, G., & Pavone, V. (2013). Democratising research evaluation: Achieving greater public engagement with bibliometrics-informed peer review. Science and Public Policy, 40(5), 563–575. https://doi.org/10.1093/scipol/sct007
Donovan, K. (2018). The rise of the randomistas: On the experimental turn in international aid. Economy and Society, 47(1), 27–58. https://doi.org/10.1080/03085147.2018.1432153
Dunne, A., & Raby, F. (2013). Speculative everything: Design, fiction, and social dreaming. MIT Press.
Espeland, W., & Sauder, M. (2007). Rankings and reactivity: How public measures recreate social worlds. American Journal of Sociology, 113(1), 1–40. https://doi.org/10.1086/517897
Espeland, W., & Stevens, M. (2008). A sociology of quantification. European Journal of Sociology, 49(3), 401–436. https://doi.org/10.1017/S0003975609000150
Evans, R., & Kay, T. (2008). How environmentalists “greened” trade policy: Strategic action and the architecture of field overlap. American Sociological Review, 73(6), 970–991. https://doi.org/10.1177/000312240807300605
Everett, G. (2004). Governing the knowledge economy: The World Bank as a gatekeeper for development. University of Bristol.
Eyal, G. (2012). Spaces between fields. In P. Gorski (Ed.), Bourdieu and historical analysis (pp. 158–182). Duke University Press.
Eyal, G. (2013). Plugging into the body of the Leviathan: Proposal for a new sociology of public interventions. Middle East - Topics & Arguments, 1, 13–24. https://doi.org/10.17192/meta.2013.1.1033
Gerlak, A. K., & Heikkila, T. (2011). Building a theory of learning in collaboratives: Evidence from the Everglades restoration program. Journal of Public Administration Research and Theory, 21(4), 619–644. https://doi.org/10.1093/jopart/muaq089
Gonzalez Hernando, M. (2019). British think tanks after the 2008 global financial crisis. Palgrave Macmillan.
Guba, E., & Lincoln, Y. (1994). Competing paradigms in qualitative research. In K. Denzin & Y. Lincoln (Eds.), Handbook of qualitative research (pp. 105–117). Sage.
Haas, E. (1975). On systems and international regimes. World Politics, 27(2), 147–174. https://doi.org/10.2307/2009879
Hoppe, R. (2009). Scientific advice and public policy: Expert advisers’ and policymakers’ discourses on boundary-work. Poiesis & Praxis, 6(3–4), 235–263. https://doi.org/10.1007/s10202-008-0053-3
Jasanoff, S. (2005). Designs on nature: Science and democracy in Europe and the United States. Princeton University Press.
Krause, M. (2017). How fields vary. British Journal of Sociology, 69(1), 3–22. https://doi.org/10.1111/1468-4446.12258
Lamont, M., & Lareau, A. (1988). Cultural capital: Allusions, gaps and glissandos in recent theoretical developments. Sociological Theory, 6(2), 153–168. https://doi.org/10.2307/202113
Law, J. (2003). Ordering and obduracy. Centre for Science Studies, Lancaster University.
Liu, S. (2020). Between social spaces. European Journal of Social Theory, 24(1), 123–139. https://doi.org/10.1177/1368431020905258
Markauskaite, L., & Goodyear, P. (2017). Epistemic fluency and professional education: Innovation, knowledgeable action and actionable knowledge. Springer.
Mayring, P. (2004). Qualitative content analysis. In U. Flick, E. von Kardoff, & I. Steinke (Eds.), A companion to qualitative research (pp. 159–176). Sage.
Medvetz, T. (2008). Think tanks as emergent fields. SSRC, 20(October), 1–10. https://www.ssrc.org/publications/think-tanks-as-an-emergent-field/
Medvetz, T. (2012). Think tanks in America. University of Chicago Press.
Miller, P., & Power, M. (2013). Accounting, organizing, and economizing: Connecting accounting research and organization theory. Academy of Management Annals, 7(1), 557–605. https://doi.org/10.5465/19416520.2013.783668
Mirowski, P. (2018). The future(s) of open science. Social Studies of Science, 48(2), 171–203. https://doi.org/10.1177/0306312718772086
Pachucki, M., Pendergrass, S., & Lamont, M. (2007). Boundary processes: Recent theoretical developments and new contributions. Poetics, 35(6), 331–351. https://doi.org/10.1016/j.poetic.2007.10.001
Ravallion, M. (2013). Knowledgeable bankers? The demand for research in World Bank operations. Journal of Development Effectiveness, 5(1), 1–29. https://doi.org/10.1080/19439342.2013.763283
Ruser, A. (2018). Climate politics and the impact of think tanks: Scientific expertise in Germany and the US. Palgrave Macmillan.
Stake, R. (1995). The art of case study research: Perspectives on practice. Sage.
Star, S. (1999). The ethnography of infrastructure. American Behavioral Scientist, 43(3), 377–391. https://doi.org/10.1177/00027649921955326
Star, S. (2010). This is not a boundary object: Reflections on the origin of a concept. Science, Technology, & Human Values, 35(5), 601–617. https://doi.org/10.1177/0162243910377624
Star, S., & Ruhleder, K. (1996). Steps toward an ecology of infrastructure: Design and access for large information spaces. Information Systems Research, 7(1), 111–134. https://doi.org/10.1287/isre.7.1.111
Tichenor, M., Merry, S., Grek, S., & Bandola-Gill, J. (2022). Global public policy in a quantified world: Sustainable Development Goals as epistemic infrastructures. Policy & Society, 41(4), 431–444.
Tribbia, J., & Moser, S. C. (2008). More than information: What coastal managers need to plan for climate change. Environmental Science & Policy, 11(4), 315–328. https://doi.org/10.1016/j.envsci.2008.01.003
van den Hove, S. (2007). A rationale for science–policy interfaces. Futures, 39(7), 807–826. https://doi.org/10.1016/j.futures.2006.12.004
van Kerkhoff, L., & Lebel, L. (2006). Linking knowledge and action for sustainable development. Annual Review of Environment and Resources, 31(1), 445–477. https://doi.org/10.1146/annurev.energy.31.102405.170850
Williams, K. (2020). Strategic positioning: How policy research actors situate their intellectual labour to gain symbolic resources from multiple fields. The Sociological Review, 68(5), 1070–1091. https://doi.org/10.1177/0038026119900116
Williams, K., & Lewis, J. (2021). Understanding, measuring, and encouraging public policy research impact. Australian Journal of Public Administration, 80(3), 554–564. https://doi.org/10.1111/1467-8500.12506
Yin, R. (2017). Case study research: Design and methods. Sage.

This is an Open Access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted reuse, distribution, and reproduction in any medium, provided the original work is properly cited.