A guiding framework for theory adaptation in operations management studies

Abstract The importance of theory cannot be overstated, as it is a primary research focus across most academic fields. This conceptual study aimed to develop a guiding framework for theory adaptation in operations management studies. We logically developed a four-stage framework with a process flow and a methodological approach to measurement. The framework conceptualizes the domain theory, problematizes it, identifies the method theory that provides an alternative point of reference or required transition, and identifies theory-strengthening influences. This is followed by theory unification, through either integration or reconciliation. The framework then identifies the core tenets of the emergent theory and its boundaries based on value, time and space. The last stage is theory evaluation, which ensures falsifiability, showing that the theory is refutable. This stage rests on operationally defined quantitative analysis of the variables (construct validity, including convergent, discriminant and predictive validity, as well as reliability) and of their relationships, whether linear or non-linear, and examines logical and empirical adequacy. In addition to falsifiability, the theory must be evaluated for utility, which assesses its scope, explanatory potential and predictive adequacy. This study provides novel detail on a framework for theory adaptation that balances the divergent objectives of academia and practitioners and can be employed in operations management and other fields.


PUBLIC INTEREST STATEMENT
South Africa and most African countries are currently not globally competitive. This situation is exacerbated by operating environmental turbulence and uncertainty that test their resolve for survival, growth, and sustainability. The manufacturing industry requires collective efforts to enhance its position, as it makes significant contributions to the nation's economic progress and serves as a major source of employment, necessitating sustainability measures within South Africa and other African nations striving to address high levels of unemployment and poverty. We present a useful guiding framework for theory adaptation in operations management studies that ensures a delicate balance to meet academia's and practitioners' divergent objectives, grounded on precision and simplicity.

Introduction
Operations management (OM) encompasses intricate interactions between individuals, technological systems, and organizational and physical procedures, many of which undergo transformations over time (Dhiaf et al., 2021; Peinado et al., 2018; Wolniak, 2020). Thus, OM has a plethora of meaningful, relevant, and consequential topics to address (Browning, 2020; Walker et al., 2015). These fall within the areas of supply chain management (Abbasi & Varga, 2022; Alexander et al., 2022), product and service design (Jiang et al., 2021; Joly et al., 2019), lean management (dos Santos Bento & Tontini, 2018; Ferreira & Saurin, 2019), quality management (Fundin et al., 2020; Gremyr et al., 2021; Mtotywa, 2022) and decision support (Baldwin et al., 2010), amongst others. Operations management has encountered new challenges as a result of global competitiveness, co-development, co-creation of products, innovation, technology integration, global supply networks, sustainability, and corporate social responsibility (Moghadam et al., 2021). This necessitates a more holistic integration of other management disciplines and expanding enterprise-wide responsibilities within management. This is because OM is characterized by its dynamic nature, constantly witnessing the emergence of new practices within a complex and uncertain environment. The presence of uncertainty significantly influences the investment decisions made by firms, subsequently affecting the manufacturing process and ultimately leading to disruptions in exports and the trade balance (Dagar & Malik, 2023). In addition, firms found themselves confronted with the challenging circumstance of a decline in overall demand, initially resulting from a supply shock and subsequently exacerbated by job losses (Bendoly et al., 2022; Guru et al., 2023).
The field of research in operations management presents distinct challenges, namely a dearth of theoretical frameworks, intricacy in its subject matter, and a deficiency in robustly established definitions and metrics (Moghadam et al., 2021). Despite this, numerous researchers have made significant and valuable contributions to advancing theory in operations management (Narasimhan, 2014; Schmenner & Swink, 1998; Walker et al., 2015) and, more recently, there has been an increase in theory-driven empirical research (Roth & Rosenzweig, 2020). Several academics have investigated operational phenomena by applying management and organizational science concepts in response to this development and the root dependency that generally characterizes research. Walker et al. (2015) argued that most studies in operations management focus on theory testing rather than theory formation. Despite this, it is important to reflect on theories, since they shape how research analyzes and addresses operations management problems. This is essential in enhancing operations management expertise while considering the discipline's theoretical developments, and it permits the researcher to reflect on the collective literature to identify trends and gaps.
Many theories are relevant and applicable to operations management, and researchers in empirical operations management are adapting to meet the growing need for theoretically grounded insights (Chatha et al., 2018; Peinado et al., 2018; Wacker, 1998; Walker et al., 2015). Kenworthy and Balakrishnan (2016) argued that, over time, empirical operations management research has been responding to demands for greater theory-based knowledge. Operations management research employs and develops a vast diversity of domestic theories to comprehend and interpret empirical data, while also emphasizing theories derived from other scientific domains. Over the years, efforts have been made to build theories, and there are several types of theory building, such as theory synthesis, theory adaptation, theory elaboration, and models, amongst others (Fisher & Aguinis, 2017; Jaakkola, 2020). Within these theory-building types, relatively few studies provide guidelines or frameworks for theory building, especially theory adaptation. The few that are relevant include Bacharach (1989), Meredith (1993), Lynham (2002), Jaccard and Jacoby (2010), French (2010), Shepherd and Suddaby (2016), Jaakkola (2020) and Svejvig (2021). Shepherd and Suddaby (2016) posited that despite the growing body of literature that provides a wide variety of theoretical frameworks, there is still a lack of consensus as to when and how to cohesively use these theorizing tools, and so much knowledge remains fragmented. Consolidating ideas and making do with what we have is one approach to addressing the fragmentation problem, while exploring inadequately tapped theoretical territory is another way to address the lack of novelty (Fisher & Aguinis, 2017). This study aims to contribute towards closing this gap and proposes a guiding framework for theory adaptation in operations management studies, a critical facet of theory building. This is investigated with the following research objectives (RO):

RO1: To determine the components of research activities to be optimally integrated for an effective guiding framework for theory adaptation in operations management studies.

RO2: To highlight the operational environmental influences of theory development.

RO3: To determine the measurement approach to falsification and utility of the theory.
The remainder of the article is divided into four sections. Section 2 provides the necessary grounding for the study by contextualizing theory building: presenting the theory and an overview of theory building, the components of theory, and the theory-building influences, namely the dynamism of sustainability and technological advances. This is followed in section 3 by the guiding framework of theory adaptation, which provides a blueprint for "good theory" construction. Section 4 provides conclusions that highlight theoretical and managerial implications, and section 5 offers limitations and future research directions.

Contextualizing the theory
Theories are critical for interpreting or comprehending research and for contributing to the development of new knowledge (Eriksson-Zetterquist et al., 2021). Theory and research act as a double helix, as they have a connected relationship, and theory provides "currency" for scholarly research (Fawcett, 1978; Corley & Gioia, 2011; Post et al., 2020). Theory serves as a fundamental pillar in research endeavors, regardless of whether the research is conducted using quantitative or qualitative methods. It assists in providing a comprehensive analysis of a phenomenon, elucidating the underlying mechanisms and causal factors that contribute to its occurrence. The utilization of theory in research endeavors varies, yet in both types of research, the incorporation of theory is considered an essential component (Fried, 2020). Without theory, it is hard to interpret empirical evidence meaningfully and to discern between successful and unsuccessful outcomes (Díaz Andrade et al., 2023; Fried, 2020). Thus, without theory, empirical inquiry becomes nothing more than "data dredging". The development of theories helps to set science apart from common sense. Despite variations in the level of detail and specificity, extant definitions of "theory" share a common understanding: that it is a body of conceptual knowledge that seeks to elucidate phenomena (van Assche et al., 2021).

Overview of theory building
The bodies of knowledge that compose theories originate from contributions from various disciplines, and new theories are constantly being developed and utilized in many contexts. Research relies heavily on theory-building since it lays the groundwork for analysis and promotes productive growth in the relevant discipline (Gay & Weaver, 2011). It is also essential for practically solving real-world problems, so the research must settle on a suitable theory that is applicable and underpins the study (Grant & Osanloo, 2014). Theory building focuses on the process through which a new thought or metaphor leads to constructing a conceptual model that helps to better describe the topic. However, while developing theories is essential for increasing management understanding, it is a challenging task (Mollah, 2019; Shepherd & Suddaby, 2016). One can engage in theory testing to better understand a topic, which involves applying previously established theories in novel ways or contexts (Colquitt & Zapata-Phelan, 2007). Walker et al. (2015) argued that most studies in operations management focused on theory testing rather than theory formation. Finding commonalities across disparate fields is crucial to theory-building since doing so raises the research's level of abstraction and potential impact. A theory is "good" if it satisfies the criteria for "good" theory, which include originality, parsimony, internal consistency and generalizability (Gay & Weaver, 2011). Theories are developed to explain, predict, and comprehend phenomena, and they frequently push the boundaries of current understanding while expanding it (Paul et al., 2023; Shepherd & Suddaby, 2016). Multiple theories offer the opportunity to view the same issue through different lenses. Therefore, it is up to researchers to determine which perspective they employ or adhere to for the purpose of constructing an argument, defining the background of the problem, and explaining their findings.

Components of theory: variables, constructs and their relationships
The critical components of a theory are variables, constructs, and their relationships. A variable is an observed unit that is empirically operationalized through measurement. In contextualizing this, Andrade (2021) posited that variables need to be operationalized, meaning they need to be defined in a manner that makes it possible to measure them precisely. Variables are utilized to depict the characteristics of the sample being analyzed and are referred to as such due to their tendency to fluctuate in value across the subjects within the sample (Calder et al., 2021; Kaliyadan & Kulkarni, 2019; Schott, 2008). The five most common variables in research are independent, dependent, moderating, mediating and confounding variables (Baron & Kenny, 1986; Igartua & Hayes, 2021; Mtotywa, 2019).
In the realm of empirical research, independent variables exert an effect on the value of other variables. In contrast, dependent variables are subject to the influence of other variables in determining their value. Baron and Kenny (1986) posited that the moderating variable can be of a qualitative or quantitative nature, and it exerts an influence on the magnitude and/or direction of the association between the dependent or criterion variable (Y) and the independent or predictor variable (X). Namazi and Namazi (2016) contended that incorporating critical moderating variables into the analyzed model reveals the true relationship between X and Y, thus emphasizing the significance of moderation as a contextualizing factor. Unlike the moderating variable, the mediating variable, also referred to as an "intervening or process variable", mediates the relationship between the dependent and independent variables (Baron & Kenny, 1986; Igartua & Hayes, 2021). The mediational model postulates that the independent variable influences the dependent variable through the mediator; in the case of full mediation, the direct relationship between the independent and dependent variables is no longer statistically significant once the mediator is accounted for. In the analysis, moderating and mediating variables can be combined, depending on the interest of the research. Namazi and Namazi (2016) explained that statistical analysis can be employed to test moderated-mediation or mediated-moderation relationships. Thus, analysis of moderation (Igartua & Hayes, 2021; Ramanathan & Akanni, 2015), mediation (Mehmetoglu, 2018; Mtotywa & Mdlalose, 2023; Mtotywa & Odebiyi, 2023) or their combination (Igartua & Hayes, 2021; Yoon, 2020) can then yield different effects, such as the total effect, the direct effect and the indirect effect. The power of these models is derived from their generalizability, as they can be employed in scenarios featuring non-linear interactions of an arbitrary nature, dependencies among arbitrary disturbances, and continuous and categorical variables (Namazi & Namazi, 2016; Yoon, 2020). The last common variable is the confounding variable, which can artificially inflate or deflate the observed impact, leading to spurious relationships (Andrade, 2021).
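The effect decomposition described above can be sketched numerically. The following is a minimal, hypothetical illustration in Python; all data and variable names are invented for the sketch, not drawn from the cited studies. It estimates the total effect of an independent variable X on a dependent variable Y and splits it into a direct effect and an indirect effect through a mediator M, using ordinary least squares implemented from scratch.

```python
# Hedged sketch of mediation-effect decomposition (in the spirit of
# Baron & Kenny, 1986). Pure-Python OLS via the normal equations;
# the data below are invented purely for illustration.

def ols(y, cols):
    """Least-squares coefficients for y on an intercept plus the given columns."""
    n = len(y)
    X = [[1.0] + [c[i] for c in cols] for i in range(n)]
    k = len(X[0])
    # Normal equations A b = v, with A = X'X and v = X'y.
    A = [[sum(X[r][i] * X[r][j] for r in range(n)) for j in range(k)] for i in range(k)]
    v = [sum(X[r][i] * y[r] for r in range(n)) for i in range(k)]
    # Gaussian elimination with partial pivoting.
    for i in range(k):
        p = max(range(i, k), key=lambda r: abs(A[r][i]))
        A[i], A[p] = A[p], A[i]
        v[i], v[p] = v[p], v[i]
        for r in range(i + 1, k):
            f = A[r][i] / A[i][i]
            for j in range(i, k):
                A[r][j] -= f * A[i][j]
            v[r] -= f * v[i]
    b = [0.0] * k
    for i in range(k - 1, -1, -1):
        b[i] = (v[i] - sum(A[i][j] * b[j] for j in range(i + 1, k))) / A[i][i]
    return b

# Invented data: X influences M, and both influence Y.
X = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
M = [2.1, 3.9, 6.2, 7.8, 10.1, 11.9]   # roughly 2 * X
Y = [3.2, 5.8, 9.1, 11.9, 15.2, 17.8]  # roughly X + M

a = ols(M, [X])[1]                 # path X -> M
c_total = ols(Y, [X])[1]           # total effect of X on Y
_, c_direct, b = ols(Y, [X, M])    # direct effect of X, and path M -> Y
indirect = a * b

print(f"total={c_total:.4f} direct={c_direct:.4f} indirect={indirect:.4f}")
```

With ordinary least squares fitted on the same sample, the identity total = direct + indirect holds exactly, which offers a convenient consistency check when operationalizing these variables.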
The theory is viewed as a system comprising constructs and variables, related through propositions and hypotheses. Bacharach (1989) explained constructs as approximated units, as they cannot be directly observed, while variables are observed units that are empirically operationalized by measurement. Guided by the boundaries, theory evaluation can be done with propositions or hypotheses. At an abstract level, propositions are used to evaluate the relationships among the constructs, while at a detailed level, hypotheses are preferred for analyzing the relationships among the variables (Bacharach, 1989). Study propositions are essential for ensuring internal validity in research. They provide valuable insights into the precision of definitions, measurements, associations, and confounding factors that are taken into account during the course of the study. Propositions also serve as the foundation for deducing inferences in the context of external validity (Avan & White, 2001; Cornelissen, 2017; Ulaga et al., 2021). They can also be developed in some studies for future research, assisting in reducing potential relevance gaps within the study of a particular phenomenon (Fundin et al., 2018).
The hypothesis posits a proposed association between two or more variables. A noteworthy association between an independent variable and a dependent variable does not establish causality; the association may be partially or entirely accounted for by one or more confounding variables (Mamdani et al., 2005). The other component of the theory is the construct, which is an approximated unit that cannot be observed directly in research (Cheung et al., 2023). A construct is an abstract idea that is explicitly chosen (or "created") to explain a particular phenomenon (Kivunja, 2018). A construct can be either unidimensional or multidimensional, and it can be either a basic notion or a composite of a collection of numerous connected underlying concepts. Scientific research conceptions need specific definitions that others may use to grasp precisely what they mean and do not mean. Bacharach (1989) argued that the variables and constructs are related by hypotheses and propositions, respectively. If an obtained relationship finds a correlation between two empirical variables, it could point to either (a) the same theoretical construct being reflected by both variables or (b) two related but distinct theoretical constructs. Rejection of the hypothesis is indicated when relationships are shown to be dynamic, non-monotonic, and/or asymmetric (Tesser & Krauss, 1976). Roth and Rosenzweig (2020) argued that there is a continuous dynamism of social, environmental and technological changes in operations. These authors posit that there would be significant advantages for the field of operations management in broadening its research frameworks beyond those traditionally employed in the "hard" sciences. This expansion would incorporate social science perspectives utilizing behavioral and latent variable analyses.
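The confounding caveat above can be made concrete with a small hedged sketch (all data invented for illustration): a shared cause Z manufactures a strong observed correlation between X and Y that essentially disappears once Z is partialled out.

```python
# Hedged sketch: a confounder Z can inflate the observed association
# between X and Y; partialling Z out exposes the near-zero remaining
# relationship. All data are invented purely for illustration.
from math import sqrt

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    return cov / sqrt(sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y))

def residualize(y, z):
    """Residuals of y after a simple OLS regression on z (removes z's influence)."""
    n = len(y)
    mz, my = sum(z) / n, sum(y) / n
    beta = sum((a - mz) * (b - my) for a, b in zip(z, y)) / sum((a - mz) ** 2 for a in z)
    return [b - (my + beta * (a - mz)) for a, b in zip(z, y)]

Z = [1, 2, 3, 4, 5, 6, 7, 8]              # confounder (shared cause)
eX = [1, -1, -1, 1, 1, -1, -1, 1]          # idiosyncratic parts of X and Y,
eY = [1, 1, -1, -1, 1, 1, -1, -1]          # deliberately unrelated to each other
X = [z + e for z, e in zip(Z, eX)]
Y = [z + e for z, e in zip(Z, eY)]

raw = pearson(X, Y)                        # inflated by the shared cause Z
partial = pearson(residualize(X, Z), residualize(Y, Z))
print(f"raw r = {raw:.3f}, partial r with Z removed = {partial:.3f}")
```

The raw correlation is strong even though X and Y share nothing beyond Z, illustrating why a noteworthy bivariate association cannot, on its own, support a causal hypothesis.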

Sustainability
Sustainability refers to the capacity to attain environmental, social, and economic objectives in the present while ensuring the ability to uphold these objectives in the future, without making any concessions. Sustainability is a reality from an ontological perspective and a regulatory point of view (Ebrahimi & Koh, 2021). The pressure to centralize sustainability is evident at the global, country and firm levels (Koh et al., 2016). Growing significance has recently been attributed to adopting environmental, social, and governance (ESG) practices (Alsayegh et al., 2020; Rao et al., 2023). Social and environmental responsibility is crucial; although it might show short-term negative impacts in some cases, it has long-term benefits (Rao et al., 2023). Firms need to incorporate the assessment of the social and environmental consequences of their business activities, alongside their economic performance. This is critical for the holistic view needed to guide firm decision-making and performance assessment. Tang (2018) highlighted that little research has been done on sustainable production frameworks. As such, social and environmental challenges have prompted a call for researchers and the manufacturing community to satisfy the sustainable development goals (SDGs). Alvarado et al. (2022) posited that the integration of environmental sustainability into the formulation of the SDGs is driven by the recognition that economic progress is contingent upon maintaining harmony with the environment. This is because firms' operations management decisions contribute significantly to the anthropogenic impact on the environment. Hence, it can be argued that implementing sustainable operations management practices holds significant potential in addressing humanity's various challenges (Swalehe et al., 2020). This requires developing tools that consider sustainability in decision-making (Tiwari et al., 2020).

Fourth and fifth industrial revolution technology
The fourth industrial revolution (4IR) and fifth industrial revolution (5IR) are occurring, and their associated technological advancements have repercussions, including altering our economic and personal practices, so that nearly every enterprise is experiencing difficulties (Mayer et al., 2021; Santhi & Muthuswamy, 2023). To aid in the pursuit of sustainable development, theories should consider the influences of 4IR and 5IR technologies. This is despite the fact that most African nations still require assistance in adopting and implementing such technologies, as evidenced by the low rate of technology adoption across a range of economic domains (Kibe et al., 2023).
Globally, manufacturers must figure out how to increase productivity while keeping humans involved in the process. Liao et al. (2017) found that the revolution led to significant developments in scientific knowledge, which in turn led to individualized goods for consumers, streamlined manufacturing processes, and higher productivity. Many industrialized economies are renowned for their innovation (Kibe et al., 2023), in contrast to developing economies that may use 4IR technology to innovate, grow and develop (Mtotywa et al., 2022). Emerging 4IR technologies can make it difficult for emerging economies to regulate their industrial resources and establish their sovereignty. These issues include, but are not limited to, whether the technologies are sufficiently mature to keep up with technological disruption and whether the costs are affordable for interoperability reasons (Micheler et al., 2019). Studies have shown the importance of digital platforms for improvements within operations management (David et al., 2022; Grabowska & Saniuk, 2022; Nyagadza et al., 2022). The critical nature of these influences highlights the need for both sustainability and fourth/fifth industrial revolution technologies to be among the fundamental constructs in theory building.

Research design and approach
We logically developed four stages that build a framework (Bacharach, 1989; Grant & Osanloo, 2014; Rivard, 2020; Smith & Hitt, 2006) for theory adaptation, including its measurements. Theory adaptation attempts to improve upon a current theory by drawing on the insights provided by other theories (Jaakkola, 2020). While empirical research may gradually extend some components of theory within the confines of a specific environment, theory-based adaptation makes an effort to effect a more immediate shift in perspective. MacInnis (2011) argued that theory adaptation research develops a contribution by revising the extant knowledge, more specifically by presenting alternative frames of reference in order to propose a unique viewpoint on an existing conceptualization.
Of these four stages, the first three are the conceptual phase, theory unification and the conceptual framework, while the last stage is the empirical analysis phase. Adaptation based on theory brings about a more immediate shift in perspective (Jaakkola, 2020). The theory adaptation approach contributes by modifying already known knowledge or, more specifically, by presenting alternative frames of reference to propose a new viewpoint on previously established conceptualizations (Zhang & Gable, 2017). This is the most effective strategy for addressing inconsistencies in the conceptualization and measurement processes. Thus, the initial stage of conceptualization focuses on how a study identifies, problematizes and uses method theory for an alternative point of reference (Jaakkola, 2020). The second stage is theory unification, which focuses on the identification and selection of a unifying approach and the implementation of the unification process (Hindriks, 2022; Pettit, 2007). The third stage focuses on the identification of the core tenets of the emergent theory (Hindriks, 2022), its boundaries (Bacharach, 1989) and the conceptual framework (Grant & Osanloo, 2014; Luse et al., 2012). The last stage (IV) involves theory evaluation. Bacharach (1989) argues that falsifiability and utility should be used as the two criteria for evaluating the theory. Falsifiability evaluates whether the theory is constructed so that it is empirically refutable, while utility analyzes whether the theory is useful and can explain and predict the constructs, variables, and their linkages. Utility evaluates both the logical aspect, which concerns scope, and the empirical aspect, which provides for explanation and prediction.
In this developed framework, the underlying assumptions about the nature of reality, as well as how knowledge is acquired at the different stages, are provided, including the methodological approach required during theory adaptation (Kjaergaard & Vendelø, 2015; Truex et al., 2006; Zhang & Gable, 2017). In developing this framework, we took into account the divergent objectives of operations management practitioners and academic researchers, which necessitate the establishment of a harmonious equilibrium between the precision and simplicity of models. Practitioners anticipate that the model will closely align with the practical problem at hand, enabling them to readily implement the solutions without the need for supplementary modifications. In academia, a succinct and broad theory with only an approximate accuracy level typically holds greater utility than an entirely accurate theory.

Framework for theory adaptation
We developed a guiding framework for theory adaptation, as presented in Figure 1. Wacker (2008) noted that researchers create studies with long-lasting effects when they adhere to the standards of excellent theory, which can only be achieved by following a well-thought-out framework. This framework lays out an approach to developing rational criteria for theory adaptation, emphasizing learning what constitutes "good" theory construction.

Identifying and problematizing the domain theory
The process of theory adaptation begins with selecting a suitable theory or notion (domain theory). MacInnis (2011) defined a domain theory as the body of knowledge about a substantive subject matter area that exists inside a specific topic or domain. A unique canon of ideas and theories underpins this field (Jaakkola, 2020).
The problematization of certain areas in the domain theory serves as the foundation for theory adaptation. To enhance the congruence between a given concept or theory and its intended application, or to reconcile any internal discrepancies, it may be argued that particular empirical advancements or perspectives from alternative bodies of literature challenge the adequacy or coherence of an established conceptual framework. In such a case, the authors may argue that a reconfiguration, shift in perspective, or change in scope is necessary. Usually, the research uses a different theory to direct this transition, and the contribution of this type of research is frequently focused on the area in which the critical idea is found (Jaakkola, 2020; MacInnis, 2011).

Identify the method theory for an alternative point of reference or required transition
The next step is to identify the method theory for strengthening the domain theory, providing alternative points of reference and the required transition. A method theory is "a meta-level conceptual system for studying the substantive issue(s) of the domain theory at hand" (Lukka & Vinnari, 2014, p. 1309). It is critical to ensure that both the domain and method theories can be applied to the field and area of investigation. The impact of the method theory depends on the characteristics it possesses. There are a variety of possible connections between domain theories and method theories, with the theoretical objective of the studies under consideration often focusing on domain theories (Lukka & Vinnari, 2014). As such, the method theory provides new and necessary insights into the domain theory (Jaakkola, 2020).

Identify theory-strengthening influences
There is a focused shift to sustainability in operations management, resulting in growing sustainable operations management (Atasu et al., 2020; Opresnik & Taisch, 2015; Walker et al., 2014). Opresnik and Taisch (2015) posited that the role of sustainability at the operations management level must be clear and must entail analyzing which internal and external elements are involved, in what mode, and how these elements are managed at the firm level. Other authors encourage industrial symbiosis, thus "increasing eco-efficiency and positive social return of production systems" (Naderi et al., 2019, p. 457). This is complemented by the advances of the fourth and fifth industrial revolution technologies.

Theory unification
The conceptualization phase involves the identified and problematized domain theory and the method theory, with its core tenets that help strengthen the emergent theory, and is followed by theory unification. Theory unification involves integrating the fundamental perspectives of the theories into a unified and logically consistent structure. Enhanced explanatory power is frequently attained through integrating theories and establishing novel connections among their explanatory factors (Hindriks, 2022). Pettit (2007) and Hindriks (2022) introduced methods of theory unification, namely "unification by reconciliation" and "unification by integration", respectively.
For reconciliation, there must first be two (or more) conflicting hypotheses that attempt to explain the same phenomenon in mutually exclusive ways. The objective is to turn contradictory ideas into complementary ones. To achieve this, the researcher must adjust the theories' scopes of applicability to eliminate any remaining overlap. Thus, the theories no longer predict the same events. Instead, they are each given their unique subset of the original domain and "now complement each other", so their insights can be preserved in their entirety. Therefore, reconciling hypotheses involves juxtaposing them (Pettit, 2000). According to Mäki (2000), the precision of the reconciliation process can be enhanced by incorporating the concepts of a domain assumption and an applicability assumption. On the other hand, the integration, or unification, approach constructively integrates perspectives from the foundational theories. Integration specifically proceeds by leaving out factors and adding new relationships between factors of the various ideas. It frequently alters current understanding in original ways (Guala & Hindriks, 2014). Consequently, this approach not only maintains and enhances fundamental understandings, but also results in heightened explanatory efficacy. According to Hindriks (2022), integration should only be chosen over reconciliation when the application domain contains two separate types of phenomena that require different explanatory factors to describe them. Caution is warranted, however, because increasing a theory's explanatory power by combining other theories is not guaranteed. The new theory is sometimes more complicated than the parent theories because it incorporates features from both, and some causal factors or interactions may lose their significance. For integration to lead to improved explanatory efficacy, the scope of the new theory must increase considerably.

Identify and align core tenets of the emergent theory
Over and above unifying the theories, extracting the core tenets of the emergent theory or approach is critical, mainly when theories are not in their original form (Guala & Hindriks, 2014; Hindriks, 2022). Jaccard and Jacoby (2010), later supported by Glanz (2017), argued that the fundamental components of a theory include its assumptions, tenets, assertions, propositions and predictions. These elements collectively form a shared framework that researchers in a given field can utilize to explore the significance and validity of real-life experiences, and they provide a solid foundation for conducting research within that field.

Identify boundaries of the theory
In addition to the core tenets, it is also critical to highlight the boundaries of the theory, as they reveal the assumptions of the theory and help set the limitations in applying it. These assumptions entail implicit values and explicit restrictions on space and time. Bacharach (1989) posited that the generalizability of the theory increases from being bounded by space and/or time to being unbounded by both space and time (Figure 2).

Develop a conceptual framework
A conceptual framework functions as a road map for the study, enabling the researcher to visualize and implement the research project, as it helps to delineate the scope of a study and establish its theoretical foundations. It provides guidance and momentum to the research inquiry and helps stimulate research and ensure the extension of knowledge. Most conceptual frameworks take the form of diagrams in which the links between the constructs and variables involved in the study are graphically represented using arrows. The core tenets of the theory can be used to develop the conceptual model and identify the variables pertinent to the study and their potential relationships.
Conceptual frameworks serve distinct purposes in various categories of research. The conceptual framework may be used to generate hypotheses for explanations and predictions or to determine survey questions or data points. A conceptual framework is a set of interconnected ideas that helps provide a picture of the relationships between concepts in a study and their theoretical underpinnings.
It is more than a list of ideas; it is a chance to highlight an epistemological and ontological perspective and the research approach (Grant & Osanloo, 2014). The conceptual framework can also describe and refine the problem's underlying notions (Luse et al., 2012).

Figure 2. The spectrum of generalizability in theory development: from theories applicable to specific firms but over different times, through theories more widely applicable to firms but bound by a specific temporal context, to theories widely applicable to firms and over different times. Source: developed from the literature of Bacharach (1989).

Falsifiability to evaluate the theory
Good theory-building research methods are those that, like a theory itself, specify their variables, narrow their focus to a specific topic, construct coherent causal chains, and offer concrete hypotheses (Wacker, 1998). Such research should follow comparable procedures, regardless of study methodology, to become integrative. For the falsification of variables, the operationalized variables must be coherent and form part of a good measurement model. This entails face and content validity. Face validity is usually an informal review of the survey instrument by non-experts for clarity and appropriateness with the target sample, while content validity is assessed by experts familiar with both the topic being studied and the measuring tools (Mosier, 1947; Tanner, 2018). In addition, variables must have adequate variance for logical analysis and adequate reliability, which is important for stability. The falsifiability of the constructs should be determined with exploratory factor analysis, convergent validity, discriminant validity, and predictive validity, amongst others.
Exploratory factor analysis.
Exploratory factor analysis (EFA) is a statistical technique used to identify the minimum number of hypothetical constructs that can effectively account for the observed covariation among a set of observed variables (Watkins, 2018). That is, it determines the characteristics shared by all of the measured variables and helps explain their order and structure. Factor analysis assumes that all variables correlate and determines which variables constitute independent logical groupings. The variables should be ordinal. A 10-to-1 sample ratio is suitable for factor analysis (Ho, 2006). Kaiser (1974) recommended the Kaiser-Meyer-Olkin (KMO) test for sample adequacy, which can be formulated as

KMO = Σ_{i≠j} R_ij² / (Σ_{i≠j} R_ij² + Σ_{i≠j} U_ij²),

where R_ij represents the correlation matrix and U_ij the partial covariance matrix. A KMO result ≥ 0.5 is a good test result (Kaiser, 1974). In addition, Bartlett's test of sphericity,

χ² = −[n − 1 − (2q + 5)/6] ln|R|,

where n is the total sample size, q the number of variables and R the correlation matrix, should be statistically significant (p < .001), indicating that the pattern of correlations is relatively compact (Bartlett, 1954). The results of the KMO and Bartlett's tests show whether factor analysis is appropriate for producing valid findings (Field, 2013). The eigenvalues and scree plot help to determine the optimum number of constructs, with the variance explained expected to be more than 60% (Shrestha, 2021).
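As a minimal pure-Python sketch of the two adequacy tests above (with a hypothetical 3 × 3 correlation matrix; in practice a statistics package such as factor_analyzer would be used), the KMO and Bartlett statistics can be computed as follows:

```python
import math

def invert(matrix):
    """Invert a square matrix via Gauss-Jordan elimination (no pivoting;
    adequate for the small, well-conditioned correlation matrices used here)."""
    n = len(matrix)
    aug = [row[:] + [1.0 if i == j else 0.0 for j in range(n)]
           for i, row in enumerate(matrix)]
    for col in range(n):
        piv = aug[col][col]
        aug[col] = [v / piv for v in aug[col]]
        for r in range(n):
            if r != col:
                f = aug[r][col]
                aug[r] = [v - f * p for v, p in zip(aug[r], aug[col])]
    return [row[n:] for row in aug]

def det(m):
    """Determinant by cofactor expansion (fine for small matrices)."""
    if len(m) == 1:
        return m[0][0]
    return sum(((-1) ** j) * v * det([row[:j] + row[j + 1:] for row in m[1:]])
               for j, v in enumerate(m[0]))

def kmo(R):
    """Overall KMO: sum of squared correlations over that sum plus the sum of
    squared partial (anti-image) correlations, taken over all i != j."""
    S = invert(R)
    n = len(R)
    r2 = sum(R[i][j] ** 2 for i in range(n) for j in range(n) if i != j)
    u2 = sum((S[i][j] / math.sqrt(S[i][i] * S[j][j])) ** 2
             for i in range(n) for j in range(n) if i != j)
    return r2 / (r2 + u2)

def bartlett_chi2(R, sample_size):
    """Bartlett's sphericity statistic: -(n - 1 - (2q + 5)/6) * ln|R|."""
    q = len(R)
    return -(sample_size - 1 - (2 * q + 5) / 6) * math.log(det(R))

R = [[1.00, 0.50, 0.40],
     [0.50, 1.00, 0.30],
     [0.40, 0.30, 1.00]]
kmo_value = kmo(R)            # between 0 and 1; >= 0.5 suggests adequacy
chi2 = bartlett_chi2(R, 100)  # positive and large when correlations exist
```

This is an illustrative computation only; degrees of freedom and the p-value for Bartlett's test would be obtained from the χ² distribution with q(q − 1)/2 degrees of freedom.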

Convergent validity.
Estimating a measurement model is a means by which convergent validity may be assessed. According to Fornell and Larcker's (1981) recommendations, the model must establish the appropriate association between variables and the constructs they are intended to measure, while also ensuring that they are not directly related to the constructs they are not intended to measure. The attainment of convergent validity requires the appropriate alignment of the proposed measurement model with the collected data. According to Hair et al. (2009), convergent validity can be established if three specific criteria are satisfied: (a) the composite reliability (CR) values are ≥ 0.7; (b) all standardized factor loadings are ≥ 0.5; and (c) the average variance extracted (AVE) values are ≥ 0.5. Cheung et al. (2023) supported this method of testing convergent validity. The composite reliability is formulated as

CR = (Σ λ_i)² / [(Σ λ_i)² + Σ (1 − λ_i²)],

where λ_i is the standardized factor loading. For the construct of interest, the AVE is

AVE = (Σ λ_i²) / p,

where λ_i is the standardized factor loading of the i-th variable and p is the number of variables of the construct of interest.
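The CR and AVE calculations can be sketched directly from the standardized loadings. The loadings below are hypothetical, and the error variance of each standardized item is assumed to be 1 − λ²:

```python
def composite_reliability(loadings):
    """CR = (sum of loadings)^2 / ((sum of loadings)^2 + sum of error variances),
    with the error variance of each standardized item taken as 1 - loading^2."""
    s = sum(loadings)
    e = sum(1 - l ** 2 for l in loadings)
    return s ** 2 / (s ** 2 + e)

def average_variance_extracted(loadings):
    """AVE = mean of the squared standardized loadings."""
    return sum(l ** 2 for l in loadings) / len(loadings)

loadings = [0.70, 0.80, 0.75]  # hypothetical standardized factor loadings
cr = composite_reliability(loadings)
ave = average_variance_extracted(loadings)
# Convergent validity per the three criteria: CR >= 0.7, loadings >= 0.5, AVE >= 0.5
convergent = cr >= 0.7 and ave >= 0.5 and all(l >= 0.5 for l in loadings)
```

For these loadings CR ≈ 0.79 and AVE ≈ 0.56, so all three thresholds are met.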

Discriminant validity.
Discriminant validity is achieved when measures of constructs that, based on theory, are not expected to be strongly correlated with each other indeed exhibit no significant correlation (Cronbach & Meehl, 1955). Discriminant validity can be assessed with the Fornell-Larcker criterion, by cross-loadings (Chin, 1998), and with the heterotrait-monotrait ratio of correlations (HTMT) and HTMT2. Rönkkö and Cho (2022) introduced an updated analysis for discriminant validity with the CICFA(sys) and χ²(sys) techniques. The Fornell-Larcker criterion is an approach that is widely applied for determining discriminant validity. According to this criterion, the correlation between a construct and any other construct must be lower than the square root of the average variance extracted from the first construct. The Fornell-Larcker criterion is met if there is a distinction between the constructs, with the AVE of each individual construct higher than its shared variance with the other constructs (Hilkenmeier et al., 2020):

√(AVE_j) > φ_ij for all i ≠ j,

where AVE_j is the average variance extracted of construct j and φ_ij is its shared variance (correlation) with the other constructs. It is essential that the AVE level for each construct be higher than the squared correlation involving the constructs. In cross-loading, the "discriminant validity is shown when each measurement item correlates weakly with all other constructs except for the one to which it is theoretically associated" (Gefen & Straub, 2005, p. 92). The heterotrait-monotrait ratio of correlations (Henseler et al., 2015) is

HTMT_ij = r̄_ij / √(r̄_i · r̄_j),

where r̄_ij is the average heterotrait-heteromethod correlation between constructs i and j, and the denominator is the geometric mean of the average monotrait-heteromethod correlation of construct i and the average monotrait-heteromethod correlation of construct j (Henseler et al., 2015). Due consideration should be given to selecting the optimum technique for discriminant validity. For example, Henseler et al. (2015) argued that the Fornell-Larcker criterion and cross-loadings perform poorly compared to HTMT, a viewpoint backed by Hamid et al. (2017), who suggested that the Fornell-Larcker criterion and cross-loadings are not as sensitive as the heterotrait-monotrait (HTMT) criterion for discovering discriminant validity problems in research. Roemer et al. (2021) proposed HTMT2, analogous to the HTMT, which the authors believe to be an improvement over the original HTMT as it depends on the geometric mean instead of the arithmetic mean, simplifies computation, and loosens the assumption of tau-equivalence.
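The two discriminant-validity checks discussed above can be sketched as follows. The item correlation matrix, construct assignments, and AVE values are hypothetical; real analyses would use the full measurement model:

```python
import math

def htmt(R, items_i, items_j):
    """HTMT: mean between-construct (heterotrait-heteromethod) item correlation
    divided by the geometric mean of the average within-construct (monotrait)
    item correlations."""
    mean = lambda xs: sum(xs) / len(xs)
    hetero = [R[a][b] for a in items_i for b in items_j]
    mono_i = [R[a][b] for a in items_i for b in items_i if a < b]
    mono_j = [R[a][b] for a in items_j for b in items_j if a < b]
    return mean(hetero) / math.sqrt(mean(mono_i) * mean(mono_j))

def fornell_larcker_ok(ave_i, ave_j, phi_ij):
    """Fornell-Larcker: sqrt(AVE) of each construct must exceed the
    inter-construct correlation."""
    return math.sqrt(ave_i) > abs(phi_ij) and math.sqrt(ave_j) > abs(phi_ij)

# Hypothetical item correlations: items 0-1 load on construct A, items 2-3 on B
R = [[1.00, 0.80, 0.30, 0.25],
     [0.80, 1.00, 0.28, 0.32],
     [0.30, 0.28, 1.00, 0.70],
     [0.25, 0.32, 0.70, 1.00]]
value = htmt(R, [0, 1], [2, 3])               # well below the common 0.85 cutoff
distinct = fornell_larcker_ok(0.56, 0.49, 0.30)  # hypothetical AVEs and phi
```

Here HTMT ≈ 0.38, comfortably below the conservative 0.85 threshold, and the Fornell-Larcker check passes for the assumed AVE values.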

Predictive validity.
Predictive validity focuses on predicting how the operationalization of a construct will perform based on the theory of the construct. Generally, the aim is to correlate the new construct with an external criterion to establish the predictive validity of the new construct or trait (Niessen et al., 2018). When a prototypical epistemological process is used, predictive validity employs correlation analysis. A statistical correlation is a relationship between two variables that measures how they vary in relation to one another (Jaggers & Loomis, 2020). The correlation matrix can be analyzed with the Pearson correlation for linear relationship theories. The Pearson product-moment correlation equation is

r = Σ (X_i − X̄)(Y_i − Ȳ) / √[Σ (X_i − X̄)² · Σ (Y_i − Ȳ)²],

where r is the correlation coefficient, X_i and Y_i are the values of the x and y variables in the sample, and X̄ and Ȳ are the means of the x and y variables, respectively.
As not all theories are linear, it is prudent also to use Spearman's correlation (ρ) to assess monotonic relationships, whether the underlying relationship is linear or non-linear. The Spearman rank correlation is

ρ = 1 − 6 Σ d_i² / [n(n² − 1)],

where d_i represents the difference between the two ranks of each observation and n denotes the total number of observations. In studies that follow an ontological process, regression analysis is used for predictive validity.
For non-linear theories, a non-linear regression can be modelled after determining which transformation will work best. For example, SPSS offers curve estimation procedures such as quadratic, logarithmic, inverse, power, logistic, exponential and growth models. For a quadratic equation,

Y = β_0 + β_1 X + β_2 X²,

where β_2 is the coefficient of the transformed (squared) variable X².
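A quadratic curve estimation of this kind reduces to ordinary least squares on the transformed design [1, X, X²]. A minimal sketch (noise-free hypothetical data, so the known coefficients are recovered exactly) solves the 3 × 3 normal equations directly:

```python
def fit_quadratic(x, y):
    """Least-squares fit of y = b0 + b1*x + b2*x^2 by forming the normal
    equations (X^T X) b = X^T y and solving them with Gauss-Jordan elimination."""
    n = len(x)
    cols = [[1.0] * n, list(x), [v ** 2 for v in x]]  # design-matrix columns
    A = [[sum(a * b for a, b in zip(ci, cj)) for cj in cols] for ci in cols]
    b = [sum(c * v for c, v in zip(ci, y)) for ci in cols]
    for col in range(3):
        piv = A[col][col]
        A[col] = [v / piv for v in A[col]]
        b[col] /= piv
        for r in range(3):
            if r != col:
                f = A[r][col]
                A[r] = [v - f * p for v, p in zip(A[r], A[col])]
                b[r] -= f * b[col]
    return b  # [b0, b1, b2]

xs = [0, 1, 2, 3, 4]
ys = [1 + 2 * v + 3 * v ** 2 for v in xs]  # exact quadratic, no noise
coef = fit_quadratic(xs, ys)               # recovers [1, 2, 3]
```

In practice the same fit would come from a statistics package, but the sketch shows why the quadratic case is still a linear estimation problem in the transformed variable.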
In addition to the falsifiability of variables and constructs, there is also a need for falsifiability of the relationship. This can be established with logical adequacy, where the logic is embedded in the hypotheses and propositions, and with empirical adequacy, which focuses on the operationalization of the hypotheses and propositions so that the theory can be subjected to disconfirmation.

Utility to evaluate the theory
The utility focuses on the scope as well as the explanatory potential and predictive adequacy during theory building. The scope must ensure that the variables included in the theoretical system are sufficient yet parsimonious (Bacharach, 1989). A confirmatory tetrad analysis can assist, as it determines statistically whether a construct is best specified as reflective or formative (Gudergan et al., 2008; Mtotywa & Kekana, 2023). It is essential to confirm this to ensure that the measurement model is accurate and acceptable. For reflective constructs in a model, not all variables need to be present for the model to be acceptable; for formative constructs, however, all variables should be retained. In the analysis, if the tetrads are not significantly different from zero (τ = 0), the construct is reflective.
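A single model-implied tetrad is τ = σ_ij·σ_kl − σ_ik·σ_jl for four indicators. As a sketch, the covariances below are generated from a hypothetical one-factor (reflective) model, for which every tetrad vanishes by construction:

```python
def tetrad(cov, i, j, k, l):
    """One tetrad of a covariance matrix: sigma_ij*sigma_kl - sigma_ik*sigma_jl.
    Under a reflective (single-factor) model all such tetrads equal zero."""
    return cov[i][j] * cov[k][l] - cov[i][k] * cov[j][l]

# Covariances implied by one factor with hypothetical loadings [0.9, 0.8, 0.7, 0.6]
lam = [0.9, 0.8, 0.7, 0.6]
cov = [[lam[a] * lam[b] if a != b else 1.0 for b in range(4)] for a in range(4)]
t = tetrad(cov, 0, 1, 2, 3)  # vanishes for the reflective model
```

A full confirmatory tetrad analysis additionally tests whether sample tetrads differ significantly from zero; this fragment only illustrates the quantity being tested.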
Predictive relevance helps with the evaluation of the model. A Q² > 0 indicates good predictive relevance (Hair et al., 2018). The predictive relevance of the endogenous variables is calculated using the following equation:

Q² = 1 − SSE/SSO,

where SSE is the sum of the squared prediction errors and SSO is the sum of the squared observations.
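The Q² quantity can be sketched as follows. This is a simplified, non-blindfolded illustration with hypothetical observed and predicted values, taking SSO as the sum of squared deviations of the observations from their mean (one common operationalization):

```python
def q_squared(observed, predicted):
    """Stone-Geisser style Q^2 = 1 - SSE/SSO. SSE is the sum of squared
    prediction errors; SSO here is the sum of squared deviations of the
    observed values from their mean (simplified illustration, not the full
    PLS blindfolding procedure)."""
    mean = sum(observed) / len(observed)
    sse = sum((o - p) ** 2 for o, p in zip(observed, predicted))
    sso = sum((o - mean) ** 2 for o in observed)
    return 1 - sse / sso

obs = [3.0, 4.0, 5.0, 6.0, 7.0]
pred = [3.2, 3.9, 5.1, 5.8, 7.1]  # hypothetical model predictions
q2 = q_squared(obs, pred)         # > 0 indicates predictive relevance
```
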
The guiding framework that we developed and synthesized provides a comprehensive and integrated view of theory adaptation. This is critical for research, as it can provide a systematic approach and a meaningful contribution to the existing knowledge within the relevant field or area of study. This guiding framework for theory adaptation in operations management studies can be applied in different studies in operations management fields and can, where applicable, also be employed for other management studies. Some of the topical areas in management theory include sustainable change, strategic management, and the requirements of employees and society (Skačkauskienė, 2022).

Linking theory concepts
We suggest a framework with logic that is cohesive around a set of connected concepts and visionary in its ability to inspire future studies (Díaz Andrade et al., 2023). The four stages bring critical outcomes. Firstly, the article provides the process flow for theory adaptation, which clarifies the path from conceptualization to emergent theory, with each phase having substages that are critical for the build-up of the framework. Secondly, the framework provides the necessary pathway for evaluating whether the theory is constructed so that it is empirically refutable (falsifiability). This, together with utility, evaluates whether the theory includes both the logical, which involves scope, and the empirical, which provides for explanation and prediction. This creates a critical pathway to self-check and develop a robust adapted theory. Thirdly, it ensures that the influences of the operating environment are considered, particularly two critical ones: sustainability and technological advances (4IR/5IR). This is essential from a pragmatic standpoint, since the recognition and dissemination of subjects within the field of operations management holds greater significance for firms, contingent upon their sector of operation. Thus, it offers valuable insights for business researchers, educators, and management within a firm. This understanding can aid in aligning their activities with the aim of effecting meaningful change in day-to-day business operations, allowing for more effective progress in the subject area and promoting practical, real-world issue solving. Finally, the methodological approach to measurements is also presented, allowing for an objective assessment of the theory that is being developed.

Conclusion
The theory adaptation approach contributes by revising previously known information or, more specifically, by incorporating alternative frames of reference, allowing one to present a fresh viewpoint on previously established conceptualizations. A theory has conceptual definitions, domain restrictions, relationship construction, and predictions, the four essential features of any theory, and it normally has a set of boundaries as well as assumptions and constraints. The first objective of the research was to determine the components of research activities to be optimally integrated for an effective guiding framework for theory adaptation in operations management studies. The study developed four stages: conceptualization, theory unification, the core tenets of the adapted theory and, finally, theory evaluation and empirical analysis, leading to the emergent theory or approach. Theoretical frameworks facilitate the comprehension of reality so that, beyond merely describing phenomena, they can help foresee how those phenomena are related to one another. The second objective was to highlight the operating environmental influences on theory development. Sustainable development and technological advances were highlighted as the two critical influences of the operating environment that should be considered during the theory adaptation process. The third objective was to determine the measurement approach to the falsification and utility of the theory. The study created an operationalization with quantitative analysis for the variables and construct validity (convergent, discriminant, predictive and reliability), while for a relationship it examines logical and empirical adequacy. In addition to falsifiability, the theory must also be evaluated for utility, which must ensure the theory's scope, explanatory potential and predictive adequacy outcome.
We developed a guiding framework for theory adaptation, which has become a crucial route for strengthening theories, as this theoretical framework can be used for theory adaptation. The majority of the leading theories used in operations management research were adopted from other fields rather than originating from within the research field itself. These fields, which include economics, sociology, and psychology, have proven to be rich sources of theories applicable to operations management problems. However, for operations management to advance as a discipline, its inherent theory-building capabilities must be enhanced. This research contributes to theory by providing the necessary vehicle that can help researchers take theory-building to the next level. The current study is supported by a theoretical framework, as it aims to contribute to the existing knowledge regarding the significance of theory adaptation in operations management. This has the potential to inform future research to enhance the theoretical understanding that underpins the teaching, research and application of operations management.

Implications for further research
The study is not without limitations; it must be noted that this research is conceptual and theoretical in nature, and the framework will gain credibility once it is validated with empirical analysis. As such, the direction for future research is to conduct an empirical analysis of the framework for the approach to theory adaptation. Lastly, we provide a guiding framework for theory adaptation that has sufficient built-in flexibility, or malleability, to recognise possibly disparate aspects of theories and then determine which parts of a theory are applicable and may add value to solving the particular problem. This means that there may be a general qualitative analysis of a theory to determine whether it is applicable, followed by a more in-depth analysis of parts of the theory and their utilisation as pertinent components of theory development.

Figure 1. A framework for theory adaptation.