Shaping scientific work in universities in Chile: exploring the role of research management instruments

ABSTRACT Research management instruments (RMIs) are organizational mechanisms that shape scientific work and influence the trajectory of scientific fields within universities. This qualitative study examines 80 RMIs implemented by eight research-oriented universities in Chile between 1998 and 2021. The findings reveal that these institutions employ policies prioritizing competition as the primary means of accessing funding and opportunities, contributing to the concentration of resources among established researchers participating in international circuits. Consequently, RMIs establish hierarchies within the research community based on individual merit, disregarding the material conditions that may hinder productivity for certain actors. Furthermore, these instruments discourage participation in national and regional scientific communication networks. By highlighting the impact of RMIs, this research enhances our understanding of the organizational mechanisms that shape scientific work in Chilean universities, offering insights into the challenges and opportunities researchers face in the country’s higher education system. Future studies should explore alternative participation circuits within Chilean universities and compare experiences across Latin American regions to understand how local institutions align with global evaluation criteria.


Introduction
Research management instruments (RMIs) are organizational mechanisms that aim to align individual behaviors with institutional guidelines by stimulating and evaluating scientific research. These instruments include policies, regulations, competition rules, and manuals. Often overlooked, they constitute an essential infrastructure that significantly influences and governs scientific work within universities and research institutions. RMIs define the scope of research by determining problem areas, funding sources, and eligible researchers (Rovelli 2017; Mathies, Kivistö, and Birnbaum 2020; Liu et al. 2019). In this manner, they play a crucial role in shaping the field of scientific inquiry (Mathies, Kivistö, and Birnbaum 2020; Jiménez 2019). As part of academia's practical politics (Bowker and Star 1999), researchers contend with the influence of classifications and standards imposed by RMIs, which can profoundly impact their lives. The study of RMIs provides insights into how recent transformations in higher education systems translate into organizational mechanisms that shape scientific work, addressing the link between macro-level changes and governance instruments (Cruz-Castro and Sanz-Menéndez 2018). By doing so, it furthers our understanding of the dynamics of science in Latin America.
Since the 1990s, Latin America has experienced profound transformations in the organization of scientific work (Bruner, Ganga-Contreras, and Rodríguez-Ponce 2018; Góngora 2021; Rovelli 2017; Viales-Hurtado 2021). During this period, the role of the State has evolved, becoming a dynamic agent that drives research efforts and emphasizes the pivotal role of technology and innovation in fostering productive development (Viales-Hurtado 2021). In parallel, funding agencies, universities, and research institutions have increasingly adopted governance strategies influenced by the New Public Management (NPM) approach. These strategies aim to enhance productivity and foster a service-oriented environment by implementing market-like incentives and heightened accountability (Hicks 2012; Bruner, Ganga-Contreras, and Rodríguez-Ponce 2018).
Furthermore, the diffusion of new communication technologies has facilitated the expansion of international knowledge circulation and collaborative research (Beigel 2014), contributing to the consolidation of global evaluative cultures (Reymert, Jungblut, and Borlaug 2021). Despite these developments, Latin America remains situated on the periphery of the global scientific system. Researchers working in the region often face limited opportunities to contribute to mainstream international circuits (Beigel 2014; Beigel, Gallardo, and Bekerman 2018; Kreimer 2011; Feld and Kreimer 2019; Koch, Vanderstraeten, and Ayala 2021). With variations across disciplines and localities, their focus may be more strongly directed toward less prestigious national and regional circuits (Beigel 2014; Beigel, Gallardo, and Bekerman 2018).
In this context of profound transformations, studying RMIs provides insights into the organizational mechanisms that shape scientific work and stimulate specific forms of research in Latin America. RMIs are the mundane instruments that materialize these trends within organizations, articulating macro-level changes with institutional governance processes, often not without friction and conflict. The higher education system in Chile serves as an intriguing case for examination. The implementation of governance strategies influenced by the NPM approach has led to significant changes in the management of scientific work, transitioning from institutional block funding to targeted, competitive funding schemes that reward institutions and research groups meeting specific governmental criteria (Araneda-Guirriman, Gairín-Sallán, and Pedraja-Rejas 2018). Universities in Chile have established organizational mechanisms within this public policy framework to align individual behaviors with institutional strategies, including internal research funding opportunities, career advancement criteria and procedures, and publication incentives (Araneda-Guirriman, Gairín-Sallán, and Pedraja-Rejas 2018; Beigel 2014; García de Fanelli 2019). Despite their significance, RMIs have only recently begun to receive attention, with a particular focus on the critical analysis of strategies based on bibliometric techniques (Sisto 2017; Koch, Vanderstraeten, and Ayala 2021). However, the comprehensive characterization and broader impact of RMIs on the organization and dynamics of scientific research in Chile remain compelling areas for further exploration.
This paper aims to highlight the role of university RMIs in shaping scientific work and influencing the trajectory of scientific fields in Chile. The study is based on a qualitative document analysis of 80 RMIs implemented across eight research-oriented universities in Chile from 1998 to 2021. The findings suggest that these institutions, through the design of their policies, promote competition as the dominant mechanism for accessing funding and opportunities, ultimately leading to the concentration of resources in the hands of already established researchers who have been able to participate in international circuits. Consequently, RMIs position researchers as members of an imagined community of peers that is hierarchically ordered according to their merit in meeting these criteria. However, these instruments inadvertently silence the material conditions of scientific work that hinder certain actors from meeting productivity criteria. Additionally, they obscure and disincentivize participation in national and regional communication circuits of science. By shedding light on the influence of RMIs on scientific practices, this research contributes to a broader understanding of the organizational mechanisms that shape scientific work in universities in Chile, providing insights into the challenges and opportunities researchers face in the country's higher education system.
This paper consists of five sections. After this introduction, the second section provides a theoretical framework for analyzing RMIs, drawing from studies on evaluative cultures and the metricization of higher education systems. The third section discusses the methodological decisions that informed our study. The fourth section presents a qualitative analysis of RMIs used in eight Chilean universities. Finally, in the fifth section, we discuss the implications of our findings and draw conclusions.

RMIs and the transformation of higher education systems
Examining university-level RMIs offers valuable insights into the relationship between macro-level changes and governance instruments (Cruz-Castro and Sanz-Menéndez 2018), contributing to the growing literature on evaluation technologies and their effects on research ecosystems and individual trajectories. This section explores the increasing adoption of research governance strategies influenced by the New Public Management approach, which aims to enhance productivity and accountability. We delve into the limitations and challenges associated with performance-based research funding and ex-ante evaluations of projects and individuals, highlighting the importance of research on evaluative cultures and metricization processes. Furthermore, we underscore the role of RMIs as governance mechanisms, examining the potential tensions and unintended consequences they may generate based on current research.
Funding agencies, universities, and research institutions have increasingly embraced governance strategies influenced by the New Public Management approach. These strategies aim to enhance productivity and cultivate a service-oriented environment by implementing market-like incentives and increased accountability (Hicks 2012; Bruner, Ganga-Contreras, and Rodríguez-Ponce 2018). The literature highlights two primary funding mechanisms that have replaced institutional block funding. Firstly, performance-based research funding (PBRF) allocates resources at the organizational or institutional level based on ex-post assessments of research performance (Hicks 2012; Zacharewicz et al. 2019; Abramo and D'Angelo 2015; Good et al. 2015). In Chile, since 1998, higher education funding in the public sector (CRUCH) has been weighted according to the performance of each university. Government authorities set institutional goals, and indicators measuring educational and research activity are reported to them (Araneda-Guirriman, Gairín-Sallán, and Pedraja-Rejas 2018; Sisto 2017).
Secondly, funding mechanisms that rely on ex-ante evaluations of projects and individuals may involve identifying priority areas for research and implementing technologies based on peer review (Kreimer 2011). Research on the first type of mechanism has shed light on the challenges of steering research communities and evaluating research impact (Spinello, Reale, and Zinilli 2021). Furthermore, studies on peer review have underscored its limitations. According to Guthrie, Ghiga, and Wooding (2017), peer review is costly as an evaluation technology, exhibits bias against innovative research, and is a weak predictor of future performance.
Research on evaluative cultures and metricization processes offers a critical perspective for assessing the functions and impacts of evaluation technologies on research ecosystems, institutions, and individual trajectories. Lamont (2009) emphasizes that evaluation technologies, particularly those based on peer review, enable negotiations regarding the definition of excellence in research fields and the allocation of prestige. Furthermore, these technologies enhance transparency, accountability, and legitimacy in decision-making processes within research and higher education systems. Consequently, evaluation technologies can articulate the operations of the system of science with political decision-making (Reinhart and Schendzielorz 2021). In the context of university-level RMIs, this perspective suggests that these mechanisms mediate between the authority derived from the research community and organizational or hierarchical leadership (Cruz-Castro and Sanz-Menéndez 2018).
The term "metricization," as defined by Burrows (2012), refers to the increasing reliance on quantitative measures in academic performance evaluation systems. The metricization of universities involves the creation of surveillance and subjectivation technologies that observe and manage academic careers (Barron 2021; Clarke and Knights 2015). By using "hard" categories that purport to be objective and universal (Spence 2019), quantitative evaluation technologies contribute to the production of bibliographic (Lim 2021) or quantified identities (Fardella, Corvalán-Navia, and Zavala 2019). Individual academics participate in these systems by producing works that align with institutional, national, and global information infrastructures and by reporting their performance in these terms (Barron 2021; Lim 2021).
Among university governance mechanisms, RMIs play a distinctive role in this context of profound transformation of higher education systems. These seemingly mundane and often overlooked instruments serve as a means for institutions to align their administrative objectives with knowledge production processes. They establish objectives for academics to achieve while regulating essential factors such as research areas (Rovelli 2017), preferred journals for publication (Mathies, Kivistö, and Birnbaum 2020), and researchers' working locations (Liu et al. 2019). RMIs can thus be seen as guides that individuals use to align their actions with the expected outcomes outlined in the instruments, considering symbolic disputes, bibliometric goals, and immediate economic results (Mathies, Kivistö, and Birnbaum 2020).

Materials and methods
To explore the impact of RMIs on scientific activities in Chilean universities, we conducted a qualitative documentary analysis (Prior 2003) of 80 instruments used in eight research-oriented universities in the country. This study involved a systematic review and analysis, with the RMIs as the primary focus of observation. We adopted this perspective due to its comprehensive nature, emphasis on local contexts, and sensitivity to the construction of textual worlds (Prior 2003). Treating the documents as social artifacts, we recognized their creation, consumption, sharing, and organized use within a social framework (Atkinson and Coffey 1997; Prior 2003). Consequently, we consider RMIs to be elements of an information infrastructure (Bowker and Star 1999; Kreimer 2011) capable of reflecting norms in the management of research endeavors.
The study design employed a theoretically grounded sampling approach (Patton 2014). For the university selection, we identified the top eight institutions in the 2021 Scimago ranking for Chilean universities. This number was chosen to encompass a significant portion of research activity in Chile, as the top ten institutions account for a substantial proportion of the country's research output (Mondaca et al. 2019). The Scimago ranking was selected as an internationally recognized standard for bibliometric measurement (Mondaca et al. 2019).
The timeframe for the implementation of RMIs spanned from 1998 to 2021. This period is justified by the introduction of performance-based funding in 1998 and the subsequent significant policy changes in fund allocation, which are considered crucial in shaping the design of RMIs within the NPM framework. We considered various types of RMIs for each university to encompass the range of devices employed, distinguishing between institutional policies, regulations, call guidelines, and manuals based on the labels assigned by each university. When specific label information was not available, we applied predefined definitions for each category. Thus, institutional policy documents were interpreted as articulating the broad norms that define the university's research interests. Regulations referred to the regulatory codes governing the implementation of research policies. Call guidelines encompassed invitations to the academic community to participate in research projects or seek resources to support their work. Lastly, manuals pertained to technical documents outlining the use of institutional infrastructure.
The data sources were, in all cases, the websites of the respective Vice-Rectors for Research and Development. In some instances, other sources hosting academic and research career policies and regulations were also consulted. The websites were reviewed from August to September 2021, and only documents issued directly by the institutions themselves were considered, excluding any instruments from external sources. A total of 80 documents were selected, averaging approximately ten documents per university. It is important to note that for University 7 (U7), only four documents from its Vice-Rector for Research were available. However, one of these documents contained all the funding programs for research projects at the university, from which six programs were selected. Table 1 summarizes the number of available RMIs per university (N) and the number selected (S). For practical purposes, an RMI that was repeated year after year was counted as a single instrument. The percentage in the final row of Table 1 indicates how many documents were considered relative to the available documents of the same type.
We conducted a pragmatic discourse analysis in three phases. The first phase involved the preparation and organization of the material. Texts were anonymized and labeled according to the university they belonged to. The second phase entailed an initial exploration of the data using procedures outlined by Grounded Theory (Glaser and Strauss 1967). Texts were coded using CAQDAS. The systematic reading of codes allowed us to identify patterns and regularities, which were subsequently grouped into emergent categories. In the third phase, the categories were discussed, filtered, and reorganized based on their ability to answer the research questions (Wetherell 2007). Thus, during the analysis and discussion of the documents, four key categories emerged for understanding the normativity of the RMIs:
1. Objectives: This category includes the declared objectives or functions of the instrument.
2. Participation conditions: This category encompasses the declared criteria for including and excluding participants.
3. Allocation criteria: This category focuses on the declared evaluative criteria for allocating the instrument.
4. Clauses: This category comprises the declared requirements that must be fulfilled to terminate the execution of the instrument.
To ensure coding quality and interpretive validity, we employed triangulation and member checking (Maxwell 2005).

Results
In this section, we explore how RMIs shape scientific work. We begin by examining the role of university policies and their relationship with regulations and call guidelines. We then explore the coordination challenges among these instruments, illustrating how the lack of explicit coordination may reinforce the emphasis on productivity over other criteria for evaluating scientific work. Furthermore, we investigate the selection criteria for conducting research and advancing in academic hierarchies, highlighting the significance placed on research outputs and productivity indicators. Lastly, we analyze the type of research incentivized by RMIs through evaluative criteria, including novelty, impact, collaboration, knowledge dissemination, international linkage, and excellence.

How do RMIs perform scientific work?
University policy plays a central role in shaping RMIs by establishing strategic organizational decisions declaratively, without requiring further justification. In all observed cases (9 documents in total, including 2 from U3), policies set the rules for academic work by stimulating and evaluating research through competitive calls. A clear example of this trend can be seen in the research policy of U2, which states: "The University will provide economic and academic incentives to researchers for their research outputs and the generation of new knowledge (…) Internal economic resources available for research will be allocated through competitive processes based on public procedures" (U2-PI).
Research policies often incorporate regulations restricting research activities and give rise to additional instruments designed to incentivize specific behaviors. Competition rules were the most prevalent among the procedural RMIs identified in our analysis (53 documents, 66%). This instrument stands out as it is widely employed across universities and disciplines in our sample. Competitions for the transfer of funds are the preferred approach to research management, as all the calls for proposals presented this modality. They create a funding landscape characterized by selectivity, where only the most promising proposals are chosen through a competitive procedure.
Although RMIs present themselves within each organization as systems or sets of interrelated instruments, there is often a lack of coordination among policies, regulations, and competition rules. An instance of coordination failure can be observed in a research policy that initially lists 13 guiding principles, including productivity and inclusion. Later in the document, the principles are translated into objectives. The principle of productivity is defined as "strengthening and expanding capacities in creation, research, infrastructure, innovation, and technology transfer" (U8-PI). Surprisingly, the principle of inclusion is omitted at this stage. Consequently, when mechanisms are designed to assess compliance with the objectives, procedural RMIs fail to reference the principle of inclusion, and evaluation criteria incorporate measures of productivity but not inclusion. For example, in one of the call guidelines used in this university, the indicators listed for evaluation are productivity measures: the initial mention of productivity and inclusion in the policy is thus ultimately assessed mainly through productivity indicators.
In a second example (U5), the institutional emphasis on improving societal well-being is omitted in evaluation technologies focused on scientific novelty and project feasibility. In its research policy, U5 establishes the mission of research as follows: "The mission of research (…) is to enhance the improvement of our society's well-being, serving as a significant lever for the development of excellent human capital and for the understanding and resolution of complex problems" (U5-RGI). However, the same orientation is not reflected in the RMIs derived from this policy. For instance, one of its competitions outlines the allocation criteria as follows: "[The Research and Doctoral Directorate] must evaluate the novelty and feasibility of the proposed research and, based on this assessment, recommend the allocation or non-allocation of funds" (U5-FIIE). In this manner, competition guidelines may deviate from the intentions expressed in research policies, in this case emphasizing scientific novelty, feasibility, and impact in academic publication circuits.

Who can do research?
A group of institutional documents explicitly states selection criteria for career advancement in academic hierarchies and for access to competitions. Firstly, the regulations for academic careers provide instructions on how the hierarchy is structured: "The academic career governs the trajectory of professors at [University] based on the academic merits outlined in this Regulation" (U6-RCA). Secondly, access to competitions is limited by contractual requirements that identify the individuals eligible to participate in the community's calls. It is common for competitions to require the possession of a specific academic rank or a minimum number of hours worked for the university: "The competition is open to researchers who have at least a half-time contract in the Faculty of Engineering Sciences" (U2-BCAI).
RMIs establish criteria for advancing in academic careers, which are constructed in terms of individual merit. We were able to access documents regarding systems for evaluating academic careers in six universities. In all the observed cases, the academic career is structured as a hierarchy, and merit is estimated through productivity measures that become more stringent at higher ranks. For instance, at U3, to move from instructor to assistant professor, the academic must demonstrate academic and personal qualities aligned with the university's mission, hold a doctoral degree, work as an instructor for 400 h, exhibit a good level of teaching in undergraduate courses, have at least one publication, supervise the work of teaching assistants, collaborate in administrative tasks, and ensure a continuous contribution to the university (U3-RCA). To achieve promotion from assistant professor to associate professor, the academic is required to have demonstrated a commitment to the university in their trajectory, hold a doctoral degree, work as an assistant professor for 400 h, achieve a distinguished level among peers as a researcher through the number of publications, research projects, and dissemination of experiences, demonstrate dedication to guiding the work of teaching assistants and students, collaborate in administrative tasks, and ensure a continuous contribution to the university (U3-RCA). Then, to go from associate professor to full professor, they must "accredit a remarkable research career at an international level" (U3-RCA). In this case, the criteria are accentuated, evidencing the persistence of a model that incentivizes both productivity and participation in international scientific communication circuits (Beigel 2014) above other forms of academic work.
Academic regulations at U1 are also illustrative of the emphasis on academic productivity, measured in terms of research projects and publications: "An objective validation of the quality and relevance of the research activities of an academic is the interest they arouse in their disciplinary community at national and international level. Therefore, both the publications generated and the funding that supports them are performance indicators" (U1-PGCA). In all cases, evaluation criteria in the academic career focus on the merit of the evaluated individuals. Therefore, RMIs regulate scientific work by considering individual academics as units that can be ordered according to their merit, which is inferred through productivity indicators. Together, they make up a community of peers ordered under a principle of equivalence: everyone will be rewarded according to their efforts. However, to the extent that these criteria single out particular virtue-related areas of people's trajectories, anything beyond the criterion is immediately ignored. Difficulties, exclusions, departures, or any other indication of an external world are not taken into account. Thus, the research community is presented as a fictional group in which the material conditions of the exercise of scientific work are silenced under a principle of partial equivalence.
However, we have observed an emerging focus on care in some RMIs, which aim to avoid evaluation based solely on the previously described criteria. They are being implemented in five universities in our sample (six documents at U1, two at U2, five at U4, one at U6, and one at U7). These RMIs prioritize addressing the challenges individuals face in their work rather than focusing solely on individual merits. They consider relevant elements that hinder the adequate representation of civil society in academia, providing alternative avenues for promotion beyond personal virtues. Examples include the opportunity to access a decentralization bonus equivalent to an additional 10% of the approved fund (U6-CCAC), the provision of a "Childcare Support Fund" for female academics with children under 12 years old (U1-PBE), and efforts to overcome cultural and institutional barriers that impede the equal development of women and men in the university and the country (U4-PCAF).

What kind of research do RMIs incentivize?
RMIs influence the type of research encouraged in universities primarily by establishing criteria for evaluating research projects. According to our findings, the most frequently used criteria in research project competitions, in descending order, are novelty (40 documents, 75% of competitions), impact (37, 70%), collaboration (37, 70%), knowledge dissemination (36, 68%), and international linkage (33, 62%). Although less commonly used, the criterion of academic excellence (24, 45%) carries the most weight in the evaluation matrices.
Novelty: The criterion of novelty encompasses two perspectives. Firstly, it relates to the current academic standing of the principal researcher. Secondly, it pertains to the timeliness of the scientific knowledge underlying the research project. The number of publications in mainstream journals and involvement in externally funded projects are considered to assess the first aspect. For instance: "The productivity of each member of the research team must be demonstrated through indexed publications (WoS or Scopus) and their participation as Principal Investigators in externally funded projects in the last five years (e.g. Fondecyt)" (U4-BPN). Similarly, a common practice among RMIs is to consider only the past five years of research output. This criterion favors active researchers who have used their time effectively, with each outcome counting toward short evaluation cycles.
The second aspect of novelty, concerning the up-to-dateness of knowledge, is typically evaluated by expert panels. These panels consist of academic peers who possess knowledge of advancements in a specific field of study. Peer evaluation implies that the allocation criteria in competitions are determined by community members who have been validated by meeting academic career evaluation criteria.
Impact: The criterion of impact evaluates the significance of the research project's expected results. The evaluation primarily relies on two characteristics indicating the project's potential. Firstly, the project's ability to produce research papers published in reputable journals is considered. Secondly, the production of knowledge beneficial to the public good, particularly regarding technological uses, is measured through patent applications. The latter aims to contribute to developing a national productive matrix based on science, technology, knowledge, and innovation. For example, an RMI states: "The objective is the development and/or validation of technology through concept testing in the context of applied research, which, through achieving a continuity milestone, enables the development of solutions that lead to new products, services, or processes to meet a market need, generating significant economic and/or social impact" (U7-PCPI).

Collaboration: The criterion of collaboration focuses on the formation of research teams. It is common for RMIs to encourage research conducted by groups of academics to promote institutional collaboration. For instance: "Received applications will be reviewed to determine their eligibility, compliance with the present guidelines, including their interdisciplinary nature, and international and/or national collaborations" (U8-BII).
According to our findings, many RMIs promoting collaborative research provide no specific instructions for group composition beyond forming research teams that include students, individuals from different disciplines, and foreign researchers. Thus, apart from promoting intergenerational, interdisciplinary, and international research, there are no further indications to guarantee the heterogeneity of successful academic projects.
Knowledge dissemination: The criterion of knowledge dissemination refers to the requirements set by call guidelines for organizing events that inform the community about research outcomes. These events serve as secondary outputs of the project, but, as with the formation of collaborative groups, there are limited and unclear guidelines on achieving successful dissemination. For instance, in the guidelines of a funding competition that considers knowledge dissemination a significant criterion for awarding funds, the only instruction is that the responsible researcher should disseminate the proposals and their results within the departments (U6-PIIE). Similarly, none of the guidelines specify deadlines, evaluation indicators, or dedicated resources for the dissemination aspect.
International linkage: The criterion of international linkage refers to evaluating and promoting research activity in conjunction with the work of the international community. For example, "This contest is designed to support doctoral and Master's programs in bringing foreign researchers to strengthen their research lines" (U8-BVPE). Notably, this criterion was highly valued: numerous instruments are designed specifically to establish connections with foreign institutions. Unlike collaboration and knowledge dissemination, these instruments define what counts as a successful case and how it is evaluated. For instance, one of U3's instruments aims to "promote and strengthen national and international academic partnerships" (U3-AAEVI). The selection process is then described as follows: The evaluation committee will consider the following criteria to assess the contest (…): Curriculum vitae of the last 5 (five) years; Relevance of the event to be participated in and the work to be carried out; Pre and/or post-congress or exhibition arrangements. (U3-AAEVI) According to our findings, international linkage held a prominent place in research activity, as dedicated instruments are designed exclusively for this purpose, with specific resource allocation and evaluation criteria to assess their success.
Excellence: According to allocation weights, excellence is the most emphasized characteristic in call guidelines. However, although it generally involves the submission of high-quality research proposals, no explicit definitions are provided for assessing excellence. Nevertheless, unlike collaboration and knowledge dissemination, its evaluation is strict and entrusted to an expert panel tasked with ensuring excellence. It is presented as follows: "The aforementioned evaluation committee will determine the outcome of the contest based on compliance with these guidelines and the following criteria: Quality of the proposal presented (…)" (U4-AII). According to our findings, excellence is often treated as a criterion that defies description. It represents the enclosure of a specific knowledge area and establishes a threshold that only field experts can surpass. In other cases, excellence is inferred from evaluations of individual academics: call guidelines indicate that excellence can be inferred from individual productivity. For example, an RMI states: "Objective: Facilitate the integration of young academics of excellence into the [University]" (U2-BUI). Moreover, later: "Interested individuals must demonstrate high potential to achieve, in the near future, a relevant level of scientific productivity, meaning they are capable of developing an independent research line" (U2-BUI).

Discussion and conclusion
In this study, we have explored how Chilean universities' RMIs shape scientific work. Previous research has noted that RMIs define research parameters by specifying problem areas, funding sources, and eligible researchers (Rovelli 2017; Mathies, Kivistö, and Birnbaum 2020; Liu et al. 2019). Our findings have revealed three dimensions in which these instruments shape scientific practices.
In the first dimension, which focuses on formal aspects, we have observed that these instruments shape scientific work through their discursive style and integration within the same institution. Research policies adopt a declarative style that establishes specific mechanisms as natural practices, despite being determined by organizational decisions. This style reinforces the perception that these practices are necessary. We have also noted that the way instruments are interconnected may contribute to replicating critical elements in the organization of scientific work. In this respect, we have identified instances in which institutions express interest in emphasizing the social impact of research. However, this intention is not translated into specific objectives and evaluation criteria in funding calls. As a result, egalitarian principles stated in policies can be disregarded in practice (Bhopal 2016), perpetuating evaluation practices that prioritize impact within scientific communication networks.
In the second dimension, RMIs exert influence by establishing standards for scientific activity that shape individual behaviors and identities (Lounsbury 2001). These standards dictate how researchers should conduct their work and what alternatives are available (Busch 2013). We have observed that RMIs establish standards for classifying academic trajectories based on individual merit, emphasizing productivity indicators and participation in international scientific communication networks. However, these standards overlook various factors that affect academic performance, such as gender (Vohlídalová 2021), ethnicity (Bhopal 2016), socioeconomic class (Chiappa 2021), precarious working conditions (Pérez and Montoya 2018), challenges in balancing research, management, and teaching responsibilities (Fardella-Cisternas, Espinosa-Cristia, and Garrido-Wainer 2023), and mental health issues (McAlpine and Amundsen 2015). Consequently, established researchers are more likely to receive favorable evaluations, perpetuating existing inequalities in the academic system. To echo the words of Bowker and Star (1999, 34), "One person's infrastructure may be another's barrier."

In the third dimension, RMIs shape scientific work by establishing criteria for its evaluation, reinforcing the dominance of international scientific communication networks. These instruments align with government mechanisms that promote engagement in international circuits, which are increasingly prioritized in Latin America (García de Fanelli 2019; Beigel, Gallardo, and Bekerman 2018). This is particularly the case in the Chilean higher education system, where the Ministry of Education defines yearly increases in governmental funding for "public" (CRUCH) universities using criteria that prioritize publications in journals indexed by Web of Science (Ministerio de Educación 2023). Translating this emphasis within universities, RMIs downplay participation in national and regional science communication circuits.
Lastly, the criterion of excellence holds a significant position within RMI systems. It is the most relevant criterion for evaluating research proposals, yet no explicit definitions are provided. The application of excellence criteria consistently relies on peer review mechanisms. Previous research has highlighted that this approach legitimizes evaluative processes within scientific communities and political decision-making regarding the allocation of limited resources (Lamont 2009; Reinhart and Schendzielorz 2021). The absence of explicit definitions or indicators allows scientific communities to negotiate what constitutes excellence for each of them.
In conclusion, RMIs in the selected Chilean universities profoundly impact scientific work, shaping it through formal aspects, career standards, and project evaluation criteria. These findings highlight the need for a critical examination of these instruments and their potential implications for the scientific community and the allocation of resources.
Our study has limitations that may point to future lines of research. Our analysis was based on a subset of 80 documents from the top eight universities in the Scimago ranking for Chile, covering the period from 1998 to 2021. This sample excludes a significant portion of the instruments and universities observed in the country, meaning our findings may not fully represent the national landscape. Furthermore, our research design focused on universities with a stronger emphasis on publishing in mainstream journals. Future research could expand on this by investigating the participation circuits of Chilean universities that prioritize aspects other than an international focus. Exploring how these universities navigate their research activities within the context of government funding requirements would provide valuable insights.

Disclosure statement
No potential conflict of interest was reported by the author(s).
in Sociology from the Pontificia Universidad Católica de Chile. In his most recent projects, he has conducted research on the contemporary conditions of scientific work and explored the sociocultural dimensions of telemedicine.
Juan Felipe Espinosa-Cristia specializes in the field of management education, with a particular emphasis on knowledge production processes and the impact of technology on organizations and society. His research portfolio encompasses projects exploring technical mediation within banking environments and innovation management practices in both startup and established companies. Dr. Espinosa-Cristia currently holds the position of Associate Professor at the Universidad Técnica Federico Santa María, and he earned his Doctorate in Management from the University of Leicester.
Paulina E. Varas is a specialist in the field of art theory and has authored numerous articles on contemporary Chilean and Latin American art. She has actively engaged in artistic endeavors, participating as an exhibitor, speaker, and lecturer. Dr. Varas holds the position of Full Professor and Researcher at the University Andrés Bello, and she earned her Ph.D. in Art History and Theory from the University of Barcelona.

Claudio Broitman's research interests encompass a wide range of areas, including environmental communication, risk assessment, knowledge production, discourse analysis, and socio-technical controversies. Dr. Broitman holds the position of Associate Professor and serves as the Director of the School of Journalism at the University Andrés Bello. He earned his Ph.D. in Sciences de l'information et de la Communication from the Université Paris-Sorbonne.

Table 1. Number of available RMIs per university (N) and number of selected RMIs (S).
Number of applied research, innovation, or entrepreneurship projects derived from acquired capabilities; number of press releases or extension articles related to training activities and dissemination activities; number of technical, continuous, undergraduate, and postgraduate training courses where acquired knowledge and capabilities are applied; level of satisfaction of attendees in dissemination activities. (U7-IFI)