A guideline to implement a CPS architecture in an SME

ABSTRACT In the Industry 4.0 context, data valorisation allows industries to develop new capabilities, create competitive advantages and achieve manufacturing sustainability, but technological infrastructures are needed to support system interoperability and to manage data. These infrastructures are not sufficiently mature in many industrial environments, especially in small and medium enterprises (SMEs). Technology integration is challenging due to system and information heterogeneity, and even more so in SMEs, which operate in constrained environments and for which specific research is lacking. Although several approaches have been proposed, the literature lacks empirical evidence of the adoption of new technologies in SMEs. This paper presents a guideline for implementing a cyber-physical system (CPS) architecture in an SME and its application in an organic flour mill in Montreal. The case study provides evidence that a CPS architecture can be implemented in SMEs and can serve as an inspiration for SMEs developing an Industry 4.0 strategy.


Introduction
The current manufacturing context favours specialized products, quality standards, support services, and immediate demand satisfaction (Ganzarain & Errasti, 2016). Companies need to cope with this new market environment while maintaining a sustainable activity, integrating systems and processes to produce high-quality products with minimal resource utilization and sustainable inputs, while being safer for customers, employees and communities (Jamwal, Agrawal, Sharma, & Giallanza, 2021; Jamwal, Agrawal, Sharma, Kumar, et al., 2021). In this context, Industry 4.0 has been introduced to define strategies for developing new capabilities in industry and reorganizing the value chain (Hermann et al., 2015). An Industry 4.0 strategy enables the integration of new technologies, such as cyber-physical systems (CPS), big data, the Internet of Things (IoT), digital twins (DT) and artificial intelligence (AI), to manage and exploit knowledge. Industry 4.0 will ultimately facilitate the realization of the smart industry, which refers to the vision of a fully connected industry operating autonomously through the generation, transfer, and analysis of product life cycle data (Kajati et al., 2019; Lasi et al., 2014).
Data valorisation is one of the key objectives of Industry 4.0, as it develops new capabilities to create competitive advantages, achieve manufacturing sustainability and improve strategic and operational decision-making. Data valorisation is based on the management of the data lifecycle, which comprises the stages of data collection, transmission, storage, processing, visualization, and application (Tao et al., 2018), but it requires significant technological infrastructure and maturity (Shukla & Shankar, 2022). The technological infrastructure, with its information and production systems, needs not only adequate performance but also technological integration, i.e. the systems need to be able to work together. This integration is particularly challenging due to system heterogeneity, which raises interoperability issues.
Interoperability is one of the key principles of Industry 4.0, along with virtualization, decentralization, real-time capability, service orientation and modularity (Hermann et al., 2015). In this paper, interoperability is defined as the ability of two or more systems or organizations to exchange information and to use the information exchanged (IEEE, 1990). Standards and protocols are crucial elements to ensure interoperability between systems and organizations (EIF, 2017). However, an enterprise brings together a set of heterogeneous systems that use different standards and protocols. The implementation of new technologies is therefore costly and requires multidisciplinary skills.
One approach to interoperability issues is service-oriented architecture (SOA) (Valle et al., 2019). SOA defines interfaces, called services, that facilitate the interaction and sharing of capabilities between systems (Paniagua et al., 2019). SOA is divided into two main communication models (Riedl et al., 2014): client-server architectures, which define server nodes (providing access to specific services) and client nodes (using those services), and publish-subscribe architectures, in which, when an event is triggered, a message is published to all nodes that have subscribed to receive it.
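The publish-subscribe model described above can be sketched in a few lines. This is a minimal, illustrative sketch; the class name, topic strings and message fields are invented for the example and are not taken from any particular middleware.

```python
from collections import defaultdict
from typing import Callable, DefaultDict, Dict, List

class Broker:
    """Routes published messages to every node subscribed to a topic."""

    def __init__(self) -> None:
        self._subscribers: DefaultDict[str, List[Callable[[Dict], None]]] = defaultdict(list)

    def subscribe(self, topic: str, callback: Callable[[Dict], None]) -> None:
        # A client node registers interest in a topic.
        self._subscribers[topic].append(callback)

    def publish(self, topic: str, message: Dict) -> None:
        # When an event is triggered, every subscriber of the topic is notified.
        for callback in self._subscribers[topic]:
            callback(message)

broker = Broker()
received: List[Dict] = []
broker.subscribe("mill/line1/temperature", received.append)
broker.publish("mill/line1/temperature", {"value": 21.5, "unit": "C"})
```

The publisher never addresses subscribers directly, which is what decouples heterogeneous systems: each side only needs to agree on the topic and message format.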
SOA also supports vertical, horizontal, and end-to-end integration (Dalenogare et al., 2018; Pivoto et al., 2021). Vertical integration deals with the integration of systems at all hierarchical levels of a company to give management timely and up-to-date production information. Horizontal integration is the integration of systems between companies involved in the product life cycle. End-to-end integration is the integration of systems across the entire value chain of the enterprise.
These integrations allow companies to understand the scope of their systems and are important elements of many norms and standards, such as ISA-95 (Pivoto et al., 2021). The ISA-95 standard governs interoperability between corporate management and the production system and has introduced a hierarchical enterprise architecture to represent the functionalities of an enterprise (Figure 1).
The ISA-95 architecture (Jiang, 2017) is decomposed into four levels, from the production process to business planning and logistics. For each level, information systems have been developed to provide the services corresponding to the level's functionalities and to ensure horizontal integration. These systems are defined as organized sets of resources (people, data, procedures, hardware, software) for collecting, processing, storing, and visualizing information (in the form of data, text, images, sounds) within and across organizations (Reix et al., 2016), and include systems such as enterprise resource planning (ERP), manufacturing execution systems (MES), supervisory control and data acquisition (SCADA), etc.
The Open Platform Communications Unified Architecture (OPC-UA) is a standard (IEC-62541, 2020) that uses an SOA and provides vertical integration for levels 0, 1, and 2 of ISA-95 (Riedl et al., 2014). OPC-UA addresses the issues of proprietary interfaces, asynchronous processing and complex information flows between systems. Communications between all interfaces are standardized to limit the need for translators for each system. However, this solution cannot fully deliver seamless integration, since legacy machines are not prepared to perform web service functions or respond to smart industry requirements (Nakayama et al., 2020).
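As a small, concrete illustration of this standardization, OPC-UA identifies every node in a server's address space by a NodeId, whose string form follows the convention `ns=<namespace index>;<type>=<identifier>` (e.g. `ns=2;s=Temperature`). The sketch below parses only that string convention; talking to a real server would require an OPC-UA stack, which is outside the scope of this example.

```python
def parse_nodeid(text: str):
    """Split an OPC-UA NodeId string into (namespace index, identifier type, identifier)."""
    namespace = 0                        # when "ns=" is absent, namespace 0 is implied
    if text.startswith("ns="):
        ns_part, text = text.split(";", 1)
        namespace = int(ns_part[3:])
    # Identifier types: i = numeric, s = string, g = GUID, b = opaque (byte string)
    id_type, value = text.split("=", 1)
    if id_type == "i":
        value = int(value)
    return namespace, id_type, value
```

Because every system names nodes the same way, a client can browse or subscribe to any compliant server without a bespoke translator.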
Despite industry standards, the adoption, integration and interoperability of technology solutions represent major challenges for many companies, especially for small and medium enterprises (SMEs) (Najjari et al., 2021). SMEs have particular environments that influence the adoption and integration of new technologies in industry (Dalenogare et al., 2018).
Following the EU definition, SMEs are enterprises that 'employ fewer than 250 people and which have an annual turnover not exceeding EUR 50 million, and/or an annual balance sheet total not exceeding EUR 43 million' (EC, 2003). This category nevertheless covers a wide variety of enterprises with large differences in organizational structure, economic characteristics and behaviour (Najjari et al., 2021). In fact, the differences between SMEs can be linked to the constraints faced by each SME (see Table 1).
Despite many constraints, SMEs also have strengths, such as a less complex business model (Müller et al., 2018), a short hierarchy, good communication among employees, and local management (Moeuf et al., 2020). These factors influence the implementation of new technologies and should be considered in Industry 4.0 strategies. Even though an Industry 4.0 strategy requires substantial investment and expertise, the solutions developed are more flexible than traditional information systems (Moeuf et al., 2018). Therefore, Industry 4.0 strategies can be considered the most accessible way for SMEs to create a competitive advantage and develop new capabilities.
SME environments seem to change the adoption conditions of Industry 4.0 strategies. To support SMEs, approaches have been developed with readiness assessments, maturity models, roadmaps, and frameworks (Mittal et al., 2018, 2020). These approaches are theoretical, and the literature lacks studies providing empirical evidence on how new technologies are adopted and integrated in SME environments, in particular on addressing interoperability issues in SMEs (Frank et al., 2019; Mittal et al., 2020; Moeuf et al., 2020; Müller et al., 2018; Najjari et al., 2021; Stentoft et al., 2020). Amaral and Pecas (2021) present two digitalization projects in SMEs that increase the maturity of production processes, but both proposals use an Excel-based technology that does not allow further development of automatic data collection services, connections with other systems, data valorisation by algorithms or decentralized accessibility.
Table 1. Constraints faced by SMEs.

Financial resources: Lack of financial resources to make investments.

Expertise: SMEs lack expertise in support functions and are unprepared to manage digital projects requiring multidisciplinary teams. In particular, there is a lack of leaders with the appropriate skills and experience to manage a coherent schedule, define precise objectives, and set up the steps and resources needed for the project to succeed. (Moeuf et al., 2018; Najjari et al., 2021)

Technological culture and awareness: Technological culture and awareness concern knowledge of industrial standards and Industry 4.0 tools and the understanding of Industry 4.0 strategies and their importance. The involvement of shareholders is therefore necessary to increase their interactions and collaboration. However, some companies are afraid of involving their employees and changing their organizational structure. (Kolla et al., 2019; Mittal et al., 2020; Müller et al., 2018; Prause, 2019; Stentoft et al., 2020)

Maturity at the beginning of the project: The maturity of large companies at the beginning of projects includes knowledge and awareness of the new technologies, while SMEs usually start at a level where the new technologies are seldom used or even unknown: the company's computer networks are restricted by their bandwidth and security, and the information systems do not communicate with each other. At this stage, SMEs are not confident in launching an Industry 4.0 strategy. (Mittal et al., 2018)

Technology complexity: Technology complexity is a constraint for SMEs and corresponds to the degree to which technologies are perceived as difficult to understand and use; it is often negatively correlated with adoption. There is therefore a real need to simplify technology and the methods to overcome the constraints facing SMEs. (Najjari et al., 2021; Prause, 2019)

Short-term strategy: Industry 4.0 projects have a strategic impact and should be included in the corporate strategy, but they are long-term projects, and SMEs usually have a short-term strategy. (Moeuf et al., 2020)

The current industrial culture uses standards, complex architectures and information systems to integrate industrial systems in an interoperability framework (Goerzig & Bauernhansl, 2018). The resulting interoperability and data processing are dependent on information systems and are therefore rigid. These solutions are not adapted to SME contexts because of their cost, rigidity, complexity, need for specific skills and lack of suitability for rapid development (Moeuf et al., 2020).
In this paper, a guideline is presented and experimented in an SME to meet the objectives of integrating systems and data, in order to manage the data lifecycle and benefit from its valorisation. The guideline seeks to answer the research line identified in the review of Cañas et al. (2021) about 'a conceptual framework that specifically defines what a company must do, more specifically an SME, to move towards Industry 4.0'. The empirical approach validates issues and approaches from the literature and highlights some success and failure factors.
In the next section, we define the term CPS and present architectures, implementation methods and challenges. In Section 3, we describe the guideline by detailing the steps to develop and integrate a CPS in the process. In Section 4, we present a case study that served as an empirical study. In Section 5, we discuss the results of the case study. In Section 6, we present the limits and perspectives for the company and the guideline.

Cyber-Physical System (CPS)
The term cyber-physical system (CPS) was introduced at the National Science Foundation in the U.S. to refer to the integration of computation within physical processes (Colombo et al., 2017). Through new technologies, CPS initiated a new paradigm of communication and cooperation among value chain participants, including equipment, systems, organizations, and humans (Lam & Haugen, 2019; You & Feng, 2020). Jamwal et al. (2020) present a bibliometric analysis of the emerging research interest in CPS. This study highlights the applications of CPS and their benefits over traditional manufacturing systems in terms of production line monitoring, smart supply chains, asset monitoring, predictive analysis and personalized products. Beyond their applications, the literature has proposed different architectures to define the elements necessary for the realization of a CPS.

Functional architecture
A functional architecture, the 5C architecture, was proposed by Lee et al. (2015) with five functionalities: connection, conversion, cyber, cognition and configuration (Figure 2).
The connection functionality (C1) connects data sources to collect data. These sources can be production equipment, products, employees, customers, information systems or computer networks. Different collection systems, automatic or not, can then be used and coupled with protocols to transmit the data automatically. The conversion functionality (C2) focuses on the discovery of information and knowledge from locally collected data. The objective is to provide access to real-time information about the individual state of a system or a small set of systems. The cyber functionality (C3) centralizes data from the different connected units to provide information and knowledge about the entire value chain. CPS create an intelligent control system to develop advanced capabilities for forecasting and identifying events that can affect production, quality, and maintenance (Frank et al., 2019). Data mining can provide enormous benefits to support decision making and to determine potential targets for improvement (Moeuf et al., 2020). These include reducing the error rate, reducing the rejection percentage, creating optimized production schedules, increasing product quality, and increasing cost efficiency (Horváth & Szabó, 2019). While the conversion layer focuses on equipment status, the cyber functionality develops inter-equipment comparison capabilities to measure the performance of each unit and to analyse the set of individual past behaviours to predict the behaviour of a particular machine. However, it is important to record and process only the data that is useful, to decrease the load on computing technologies (Horváth & Szabó, 2019). The cognition functionality (C4) corresponds to the visualization of data, information, and knowledge on precise and explicit dashboards. This makes the added value of data accessible to employees through statements, graphs, charts, tables, or with the help of augmented or virtual reality (Tao et al., 2018). The final functionality of CPS, configuration (C5), is the feedback from cyber space to physical space. The CPS acquires capabilities to control production processes and makes the equipment autonomous with respect to external parameters, demand, and changes in the industrial environment.
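The five functionalities can be read as successive stages applied to process data. The following sketch chains them on invented sensor readings; the field names, machine names and the overheating threshold are assumptions made purely for the example, not values from the case study.

```python
def connect(sensor):
    # C1: collect raw data from a source
    return {"machine": sensor["machine"], "temp": sensor["temp"]}

def convert(record):
    # C2: derive local state information from the collected data
    record["overheating"] = record["temp"] > 80
    return record

def cyber(records):
    # C3: centralize the states of all connected units
    return {r["machine"]: r for r in records}

def cognition(fleet):
    # C4: expose an explicit, human-readable view for decision makers
    return sorted(m for m, r in fleet.items() if r["overheating"])

def configure(alerts):
    # C5: feed decisions back to the physical process
    return [{"machine": m, "command": "slow_down"} for m in alerts]

readings = [{"machine": "mill-1", "temp": 72}, {"machine": "mill-2", "temp": 91}]
commands = configure(cognition(cyber(convert(connect(r)) for r in readings)))
```

Each stage consumes the output of the previous one, which is why the architecture can be realized incrementally: a company can stop at C2 or C3 and still obtain value.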

Organic architecture
Whereas a functional architecture defines the functions of a system independently of its realization, an organic architecture defines the elements that are concretely realized as well as the interactions between them. More specifically, a CPS can be described as having four layers: physical, network, cloud computing, and control terminal (B. Chen et al., 2018; Wan et al., 2016; Wang, Wan, Li, et al., 2016; Wang, Wan, Zhang, et al., 2016).
Physical resources form the first layer (L1) and represent the physical elements of the value chain: production systems, capture systems, and products. The network layer (L2) corresponds to the IT equipment and communication protocols that allow data and commands to be transmitted between the physical resources and the cloud. Communication networks play an important role in CPS by providing reliable communication with low delays, high access density, low power consumption, and highly accurate synchronization. The goal is to define a common and well-defined exchange format between the stakeholders, so interoperability standards, such as SOA and OPC-UA, can be used (Pivoto et al., 2021). Internet of Things (IoT) technologies can be applied to connect the physical resources to networks. IoT technologies involve information and communication technologies with communication protocols, middleware architectures (Najjari et al., 2021) or even software-defined networks (SDN) (Pivoto et al., 2021). Different tools have been developed to work towards a standard architecture, such as In.IoT and C2NET. C2NET (Cloud Collaborative Manufacturing Networks) is a cloud-based environment that features a negotiation model to capture all the negotiation steps and decisions throughout the industrial supply chain (Cretan et al., 2017; Qureshi et al., 2017). In.IoT is a middleware solution for IoT that uses a microservices architecture and allows the inclusion of services using hypertext transfer protocol (HTTP), constrained application protocol (CoAP), and MQTT at the application layer, which are the most popular protocols for IoT applications (Da Cruz et al., 2021). IoT addresses advanced performance issues at the physical and network layers, but most of the current tools remain at a low hierarchical level and need to be linked with a virtual space to integrate them vertically in the enterprise (You & Feng, 2020).
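To make the protocol side of the network layer concrete: MQTT routes messages by matching hierarchical topic names against subscription filters, where, per the published wildcard rules, `+` matches exactly one topic level and `#` matches all remaining levels. A minimal matcher for these rules is sketched below (the topic strings are invented examples):

```python
def topic_matches(topic_filter: str, topic: str) -> bool:
    """Check an MQTT-style topic name against a subscription filter."""
    filter_levels = topic_filter.split("/")
    topic_levels = topic.split("/")
    for i, level in enumerate(filter_levels):
        if level == "#":              # multi-level wildcard: matches the rest
            return True
        if i >= len(topic_levels):    # filter is longer than the topic
            return False
        if level != "+" and level != topic_levels[i]:
            return False              # '+' matches any single level
    return len(filter_levels) == len(topic_levels)
```

A subscription to `mill/+/temperature` thus receives temperatures from every production line without the subscriber knowing the lines in advance, which is precisely the decoupling the network layer is meant to provide.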
The cloud layer (L3) centralizes data, algorithms, and services in cloud computing. The National Institute of Standards and Technology defined the cloud (NIST, 2011) as 'a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources (e.g. networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction'. The use of cloud computing, with the execution of algorithms, optimization of decision making, and storage of large amounts of data, is essential in a CPS. The control terminal (L4) is the link between the CPS and the company's employees. Endpoints can be computers, tablets, phones, displays and all virtual and augmented reality technologies. Although Industry 4.0 strategies focus on adopting new technologies such as CPS, IoT, cloud computing or data mining, human resources remain key, not only to make decisions but also to understand, use and optimize facilities, processes and services (Sinha & Roy, 2020).
These two architectures are complementary and each functionality of the 5C architecture can be achieved with elements of the organic architecture; see Table 2.
These architectures demonstrate that CPS support the entire data lifecycle: data collection, transmission, storage, processing, visualization, and application. CPS are flexible and adapt to their environment by connecting available system services and production systems and by distributing computing power between the cloud, network, and physical levels. Compared to information systems, CPS have connection and cyber functionalities that support more data sources and more data. The local performance of a CPS allows for a greater perimeter of the physical process and is less constrained by individual system performance. Processes can thus be reorganized.

Process reorganization
The implementation of new technologies and CPS brings variations in work design, which refers to the processes and outcomes of how work is structured, organized, experienced and enacted (Waschull et al., 2020). This implementation might flatten the pyramidal architecture defined by ISA-95 (Riedl et al., 2014), which can be described as a hierarchical distributed architecture (Trentesaux, 2007).
Trentesaux defines the distribution mechanism as the decomposition of a process into activities and the assignment of these activities to different entities. In a company, activities are decomposed into maintenance, production, quality, accounting, etc., and these activities are then assigned to different departments and employees. We note that a hierarchical distributed architecture does not meet the challenges of technological integration and interoperability, because (1) the hierarchy requires transmissions from level to level and generates latencies and instabilities in the communications between the different levels, and (2) the distribution of responsibilities can decrease communication between departments and restrict horizontal integration, end-to-end integration and data valorization. CPS offer the opportunity to create an interacting network that correlates product lifecycle information with value chain information and provides knowledge to all employees in real time (Riedl et al., 2014). Such architectures can be characterized as decentralized heterarchical architectures (Trentesaux, 2007, 2009).
Heterarchy was defined by Trentesaux (2007) as an organization in which entities do not have hierarchical relationships with each other (Figure 3). A semi-heterarchy is a hierarchical organization with at least one heterarchical sub-organization. The principle is to allow decision-making entities to work together and react quickly instead of requiring decisions to be made at higher levels. The Kanban system is an example of semi-heterarchy that increases production responsiveness.
The mechanism of decentralization is the duplication of the process, with particularization (parameterization) of the duplicates according to their specific contexts, and the assignment of these duplicates to different entities. The decentralization of information therefore corresponds to the duplication of information and its accessibility by all systems and all employees. In the same way, the decentralization of a decision-making process corresponds to the duplication of the decision-making logic in the mechatronic and human entities. In mechatronic systems, all entities embed the same algorithms to make decisions, whereas in a human organization, people must have the capability and authority to make decisions. In both cases, the decentralization of a decision-making process requires the decentralization of information.

Table 2. Mapping between the 5C functionalities and the organic layers.

Connection (C1): The physical resources (L1), network (L2) and cloud (L3) layers share the responsibilities of connecting data sources.

Conversion (C2): The cloud computing layer (L3) hosts the algorithms and computational logic, whether for individual states or policy decisions. Middleware technology allows computing capabilities to be embedded in data sources so that the conversion is performed at the network layer (L2), avoiding overloading the bandwidth of the company's networks.

Cyber (C3): The centralization of information takes place in the cloud computing layer (L3), which also includes the techniques of data valorization.

Cognition (C4): Control terminal layer (L4).

Configuration (C5): The connection achieved between the physical (L1), network (L2) and cloud computing (L3) levels includes monitoring and collection services as well as actuator control.
Despite the theoretical benefits of a decentralized production control system, current information systems are mostly centralized (Meissner et al., 2017), with information and decision logic centralized at each hierarchical level. The strength of centralized systems is access to all information, which is a prerequisite for global optimization and data mining. However, this production optimization works in a stable environment; during unexpected situations, such as machine problems, large deviations from schedules can occur (Meissner et al., 2017). These deviations are due to a lack of responsiveness of the systems and the centralization of the decision logic. The principle of decentralizing logic is to duplicate it in decision-making entities to increase the responsiveness of systems. The cloud technology of the CPS makes it possible to centralize the information for data valorization, while keeping on-demand and delocalized access. The allocation of responsibility, whether tasks are assigned to humans, machines or both (collaborative work), is therefore a management choice and one of the main issues in CPS design and implementation (Waschull et al., 2020).
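The duplication of decision logic and information can be sketched as follows: every entity embeds the same rule and works on a replicated copy of the shared state, so each can decide locally without waiting for a higher level. The load-balancing rule, machine names and queue lengths are invented for this example.

```python
def decision_logic(my_queue: int, shared_state: dict) -> str:
    # Identical rule embedded in every decision-making entity:
    # accept new work only if the local queue is at most the fleet average.
    average = sum(shared_state.values()) / len(shared_state)
    return "accept" if my_queue <= average else "redirect"

# Decentralized information: each machine holds its own replica of the fleet state.
fleet_state = {"mill-1": 2, "mill-2": 6, "mill-3": 4}
decisions = {
    machine: decision_logic(queue, dict(fleet_state))  # dict() = local replica
    for machine, queue in fleet_state.items()
}
```

Because the logic and the state are duplicated, an overloaded machine can redirect work immediately, which is the responsiveness gain the heterarchical organization aims for.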

CPS implementation
Implementation methods can be separated into two groups: holistic methods, which aim to analyse Industry 4.0 tools from all angles to derive success factors and general rules, and specific methods, which are restricted to analysing only certain tools, but in more detail (Schumacher et al., 2019).
Several papers explain their methods, but the authors generally rely on a holistic approach with enterprise architectures and standards to implement CPS. Jiang (2017) uses ISA-95 and a 5C architecture to build a CPS but does not go into implementation details. Several approaches propose maturity models and roadmaps but remain at the strategic level, without explaining the operational implementation of a technological solution (Issa et al., 2018). Ghobakhloo (2018) describes a strategic roadmap to guide the transition of companies. This roadmap uses a holistic approach by describing common steps, but provides a generic solution that is difficult to implement in a real industrial environment. Schumacher et al. (2019) define 8 dimensions and use 65 maturity categories (holistic approach) and derive actions targeted to the company's maturity levels and specific to its business context (specific approach). This approach results in a systematic procedure to define the specific needs of the company and the roadmap to follow. However, the article remains general and does not provide any example of technology or of implementation steps to follow to adopt Industry 4.0 technologies. Reference architectures, like RAMI 4.0, IIRA, SITAM, IVRA, IBM Industry 4.0 or LASFA, have also been adopted in different domains to guide engineers on how their systems should interoperate and be structured. These reference architectures are not yet completely suitable to support Industry 4.0 processes, mainly due to their high level of abstraction and/or missing detailed documentation, together with a lack of customization guidelines that could describe how to refine the various modules of the architectures (Najjari et al., 2021; Nakagawa et al., 2021; Pivoto et al., 2021).
These approaches are theoretical and do not illustrate how to practically apply CPS architectures in industrial environments. The application process is delegated to project team members and to standards, which are documents that provide requirements, specifications, guidelines or characteristics that can be used consistently to ensure that materials, products, processes and services are fit for their purpose. Ahmadi et al. (2017) provide a literature review mapping the CPS ISO/IEC standards landscape onto the 5C CPS architecture (Figure 4).
Figure 4 illustrates that all the different layers have, at this time, been tackled by standardization, but independently, since no standard exists at the interface between different layers, and no standard tackles two levels at the same time. Moreover, SMEs lack the expertise to understand CPS architectures and create their roadmaps, as well as the financial and human resources to identify and use the various standards or existing IoT tools. Therefore, detailed strategies are still needed to support SMEs in the adoption of Industry 4.0 technologies (Stentoft et al., 2020).

CPS challenges
Some challenges related to CPS have already been discussed in the previous sections, concerning the interoperability between heterogeneous systems and technology complexity (Sinha & Roy, 2020). Two other CPS-related challenges are presented in Table 3: cybersecurity and organizational barriers.

Guideline to implement a CPS architecture in an SME
The literature shows that Industry 4.0 strategies rely on standards, information systems and architectures to implement new technologies, including interoperability and data valorization, and thus bring new capabilities and services into industries. CPS is one of the Industry 4.0 technologies that connects physical processes with digital solutions to manage the data lifecycle. CPS have flexible architectures, develop real-time decentralized capabilities, and enable the reorganization of processes and their systems to implement more responsive control systems. However, the CPS implementation guidelines presented in the literature are difficult to apply in industry and use standards and architectures that are often inaccessible to SMEs. The uniqueness of the SME environment requires a flexible architecture with affordable technology that is adapted to a company's current systems and data. The starting point analysis, identified as a success factor (Horváth & Szabó, 2019), defines the scope and objectives of the CPS and is followed by the definition of the solution requirements. The development of the technological solution is then detailed, with the definition of a data structure, the development of a service-oriented application, the connection of data sources and the development of user interfaces and control. The services are finally validated and integrated into the processes, and new requirements may be considered.
The strategy is to use a service-oriented architecture to create services that meet the business requirements. As shown in Figure 5, validated services are integrated into the process. A loop can then be performed to define new requirements, as in the Plan-Do-Check-Act (PDCA) procedure of continual improvement. This allows companies to decompose requirements and organize development periods of a few weeks, adapting requirements dynamically as the CPS is implemented. The guideline therefore uses an agile approach with short development cycles and revision iterations. The definition of a new requirement depends on the company's strategy and its willingness to continue implementing the CPS. If a service is not validated and does not meet the requirements, an iteration is carried out on the development of the CPS.
The agile practices adopted during these iterations are commonly used to develop SOA. The literature review has shown that these practices are a success factor in the context of SMEs, favouring a gradual and dynamic implementation. They are used to mitigate the impact of significant organizational changes by allowing stakeholders to adapt smoothly and by communicating about the new tools regularly.
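The iterative develop/validate/integrate loop described above can be expressed as a simple control flow. This is only a sketch of the process logic: the function names, the sample requirement and the iteration cap are illustrative, not part of the guideline itself.

```python
def implement_requirement(requirement, develop, validate, integrate, max_iterations=3):
    """Develop a service for one requirement, iterating until it is validated."""
    for _ in range(max_iterations):
        service = develop(requirement)
        if validate(service):        # validated services enter the process
            integrate(service)
            return service
    return None                      # requirement goes back for redefinition

integrated = []
service = implement_requirement(
    "store temperature data",
    develop=lambda req: {"requirement": req, "version": 1},
    validate=lambda svc: True,
    integrate=integrated.append,
)
```

The outer PDCA loop of the guideline corresponds to calling this routine for each new requirement the company chooses to pursue.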

Starting point analysis
Starting point analysis is used to understand the environment, the systems, the existing data and the maturity of the SME at the beginning of the project. It can involve identifying available manufacturing data and assessing the readiness of the SME data lifecycle (Mittal et al., 2020). Maturity models can also be used to evaluate the company's situation. These models define maturity levels and suggest steps to evolve and reach a more sophisticated level. Several maturity models exist, but they fail to take into account the specifics of SMEs, and the subsequent steps are not adapted to SME constraints (Mittal et al., 2018). Among these models is the approach of Schumacher et al. (2019), which defines 8 dimensions and uses 65 maturity categories. Some models are more generic, such as that of Müller et al. (2018), which includes only one dimension with four categories: craft manufacturers, preliminary stage planners, Industry 4.0 users and full-scale adopters. The aim of the latter model is to position SMEs based on their structural and technological characteristics and their motivation to implement an Industry 4.0 strategy. Colli et al. (2019) presented a maturity assessment tool to tailor the model to the characteristics of the organization. However, such a tool requires a team of experts to perform the assessment and transform the collected, classified and structured data into a recommendation. Anderl et al. (2016) proposed a guideline with a 12-dimension maturity model, 6 dimensions about product maturity and 6 about production maturity. Selecting the tools according to the knowledge and skills of the project team is important in this analysis. The expertise of employees is one of the constraints of SMEs, so less sophisticated tools can also be used, such as mapping. One type of mapping of interest is systems mapping, which consists of listing all the systems in the company and representing them either in a table or on the plans of the company. Business process modelling can also be used to understand the value-added processes and the changes brought about by an Industry 4.0 strategy. Data mapping is another relevant type of mapping. Both system mapping and process modelling can be enriched to display the data manipulated in the processes and by the systems.
These models go through interviews and a report analysis to understand how employees work and what tools and data they manipulate.Then, the information can be structured in the models on a spreadsheet or through formalities such as ANSI, UML or BPMN.

Definition of the requirements
This step builds on the starting point analysis and the identified problems to define targeted and coherent requirements and objectives. Enterprise architectures, such as the one presented in the ISA-95 standard or the CPS architectures, can be used to define requirements by identifying gaps between the initial situation of the enterprise and the architecture. The requirements can be defined at several scales. The company can define broad goals in terms of desired maturity on maturity models, or using the decomposition of Moeuf et al. (2018) with goals in terms of flexibility, cost reduction, productivity increases, quality improvement or delivery time reduction. More specific goals can also be defined with the data lifecycle. For example, if some data from a process or system is collected but not stored, then a goal of storing it in a centralized system may be the subject of a development phase. Similarly, if data is stored but not processed or not visualized, then precise requirements can be defined for each data item. Finally, reference architectures present extensive axes to position the objectives in a structured manner, such as the Hierarchy Levels, Product Life-cycle and Architecture Layers axes in RAMI 4.0 (Pivoto et al., 2021).
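The lifecycle-based requirement definition described above can be sketched as a simple check: for each data item, compare the lifecycle stage it currently reaches with the target stage and derive a requirement for each missing stage. The stage names and data items below are illustrative assumptions, not taken from the case study.

```python
# Toy sketch: derive requirements from data-lifecycle gaps.
# Stage names and data items are illustrative assumptions.
LIFECYCLE = ["collected", "stored", "processed", "visualized", "applied"]

def lifecycle_gaps(current_stage, target_stage):
    """Return the lifecycle stages still missing between current and target."""
    i, j = LIFECYCLE.index(current_stage), LIFECYCLE.index(target_stage)
    return LIFECYCLE[i + 1 : j + 1]

def define_requirements(data_items):
    """Map each data item to the requirements closing its lifecycle gap."""
    return {
        name: [f"{name}: data must be {stage}" for stage in lifecycle_gaps(cur, tgt)]
        for name, (cur, tgt) in data_items.items()
    }

# Hypothetical data items: (current stage, target stage)
items = {
    "grain_moisture": ("collected", "visualized"),
    "truck_weight": ("stored", "stored"),  # no gap, hence no requirement
}
reqs = define_requirements(items)
```

Each resulting requirement can then be attached to an indicator (time, cost, quality or maturity) to track progress in later steps.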
Indicators can then be used to qualify or quantify the company's requirements. These indicators can be expressed in terms of time, cost, quality, maturity, etc. The stages of the data lifecycle can also serve as indicators to track the evolution of the company's data management.
A good understanding of CPS characteristics can help in defining gaps by allowing stakeholders to imagine possible solutions. The technological and organizational culture of employees is therefore a success factor (Horváth & Szabó, 2019); it allows them to be involved in the strategy and ensures the consistency of requirements with business processes. Tools to develop the culture of a company might be, for example, training, seminars, or the involvement of employees in projects.

Development of a CPS
Figure 6 represents the CPS architecture implemented in the guideline, with an organic view of a CPS and its links to the ISA-95 enterprise architecture system services. The enterprise architecture (left) is generic and corresponds to an enterprise with the essential ISA-95 services available, as well as the information systems for each hierarchical level. The model on the right is a representation of the organic view of a CPS with the physical resources, network, cloud, and control terminal levels. This architecture uses an SOA to manage the interoperability with each system and with the human organization.

Data structure definition
Data structure definition is an essential part of the development of a CPS because it allows data and its links to be modeled. Data can therefore be reused when it is accessed, processed and applied within business processes. This is a step in which the company must be involved to ensure consistency between the actual processes, the data structure and the requirements.
Several tools exist to structure data, so the project team must choose one adapted to the members' skills and the requirements. The first tool is the class diagram, used to create a database structure. The second tool is ontologies. Ontologies are comprehensive structures that enable concept modeling in a systematic way and can be used to address interoperability issues (Valle et al., 2019). However, this tool is not common in industry and requires domain expertise to create, develop, and integrate into systems. Other tools have also been developed to meet specific requirements, such as a multi-agent structure for reconfigurable control systems (Wang, Wan, Zhang, et al., 2016).
Class diagrams were deemed more suitable for an SME environment due to the availability of class diagram expertise among professionals. This diagram allows data to be structured using object orientation and classes. A class represents a group of objects with the same attributes, a common behavior, and common relationships with other objects. The class diagram is then used to create a database to store the objects' data. The guideline recommends using a class diagram because it allows employees to be involved in its development. The ontology is a more complex and abstract representation that requires more effort to involve employees in the development loops.
Duffy (1999) proposed a procedure for defining a class diagram. The first step of the procedure is to identify the data in the business processes; this step was done during the starting point analysis. The second step is to define the classes that will form the tables of the database. Then, attributes, an object identifier (primary key), object behaviors (methods) and class relationships can be defined. Several formalisms can be used to draw a class diagram, including UML. System data and process data are reused, so their current format can be analyzed to create classes and attributes. For example, data stored in Excel files can be grouped to create classes (Figure 7).
The resulting diagram is unique to the company's environment and requirements. Once each class has its own attributes, identifier and relationships, the diagram can be validated with the stakeholders by simulating a process and creating an activity diagram. An activity diagram represents the process scenarios in the form of a flowchart. The process activities are defined, along with their sequence and the objects and data transmitted from one activity to another. The data and objects of the class diagram must therefore appear in this diagram.
The data structure created in this step corresponds to the structure of the CPS database. If a database already exists in the CPS, then the data structure created in this part must be related to the existing structure. Our proposal uses a service-oriented application to manage the database.
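As a minimal sketch of how a validated class diagram can be carried into code, the classes and their relationships can first be expressed as plain Python dataclasses before being mapped to database tables (with SQLAlchemy, for instance). The classes, attributes and relationship below are hypothetical, not the company's actual diagram.

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical classes sketched from a class diagram: a grain Delivery
# (one truck arrival) holds several Analysis results. The attributes,
# identifiers and one-to-many relationship follow the diagram's conventions.

@dataclass
class Analysis:
    analysis_id: int          # object identifier (primary key in the database)
    parameter: str            # e.g. moisture or protein content
    value: float

@dataclass
class Delivery:
    delivery_id: int
    supplier: str
    weight_kg: float
    analyses: List[Analysis] = field(default_factory=list)  # 1-to-many relationship

    def add_analysis(self, analysis: Analysis) -> None:
        """Object behavior (method) defined in the class diagram."""
        self.analyses.append(analysis)

d = Delivery(delivery_id=1, supplier="Farm A", weight_kg=24500.0)
d.add_analysis(Analysis(analysis_id=1, parameter="moisture", value=12.4))
```

Playing a process scenario with such objects (creating a delivery, then attaching its analyses) is one way to validate the diagram with stakeholders before generating the database schema.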

Development of a service-oriented application
In this step, an application is developed to create an SOA and deployed on a cloud. The objective of this step is to create the interfaces for the other systems. These interfaces are APIs (application programming interfaces) and are represented in Figure 8.
APIs are programs that take requests as input and respond with structured data, usually after processing the requested data. Client-server or publish-subscribe distributions can be used depending on the enterprise systems and the skills of the development team. We recommend using a client-server architecture because all systems and their data are known and structured. A frequently used technology to develop a service-oriented application is REpresentational State Transfer (REST) (Valle et al., 2019). Exchanges between a client and a RESTful server are performed using requests that must follow these rules: each request includes an application identifier, a resource identifier, a request body, an HTTP verb, and an authentication parameter.
The application identifier allows the client to target the correct server for its request.
The resource identifiers are called endpoints and correspond to the access points of the application objects. Each endpoint is an API; thus, each database object has as many APIs as there are HTTP verbs defined for that resource.
The request body is the data transmitted in the requests. This data must respect the format defined in the application, whether it is HTML, XML or JSON.
The HTTP verbs are the methods defining the behavior of the resources. The main verbs in a RESTful application are:
• GET: the method requests a representation of the specified resource.
• POST: the method is used to send entities to the specified resource.
• PUT: the method replaces all current data in the target resource with the contents of the request body.
• DELETE: The method deletes the target resource.
However, other methods can be developed in the application, including OPTIONS, PATCH or HEAD.
The authentication of a request can be basic, with a parameter in the form of an identifier and a password, or more complex with the use of a token delivered by the authorization system. The token-based authorization process includes authentication, which refers to the process of proving one's identity; authorization, which refers to the function of specifying access rights to resources; and accounting, which refers to the process of measuring the consumption of resources in a service exchange (Kolluru et al., 2018). When a request is made, a token is issued to the client if it is registered in the authorization system; this token stores client information and a validity period. The token is then encrypted and can be used to access authorized resources as long as the consumption of resources does not exceed the validity threshold. As soon as the token is no longer valid, either the system provides the client with a new token or the server rejects the request and the client must resend it.
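To illustrate the token mechanism without relying on an external library, the sketch below implements a simplified HMAC-signed token using only the Python standard library. It mimics the idea of a signed payload carrying client information and an expiry, but it is not the JWT standard itself; a production application should use a maintained token library. The secret and client identifier are assumptions for the example.

```python
import base64, hashlib, hmac, json, time

SECRET = b"server-side-secret"  # assumption: known only to the authorization system

def issue_token(client_id: str, ttl_seconds: int = 3600, now: float = None) -> str:
    """Sign a payload holding client information and a validity period."""
    now = time.time() if now is None else now
    payload = json.dumps({"client": client_id, "exp": now + ttl_seconds}).encode()
    sig = hmac.new(SECRET, payload, hashlib.sha256).digest()
    return (base64.urlsafe_b64encode(payload).decode()
            + "." + base64.urlsafe_b64encode(sig).decode())

def validate_token(token: str, now: float = None):
    """Return the claims if the signature is valid and the token not expired, else None."""
    now = time.time() if now is None else now
    try:
        payload_b64, sig_b64 = token.split(".")
        payload = base64.urlsafe_b64decode(payload_b64)
        sig = base64.urlsafe_b64decode(sig_b64)
    except ValueError:
        return None  # malformed token: request rejected
    if not hmac.compare_digest(sig, hmac.new(SECRET, payload, hashlib.sha256).digest()):
        return None  # signature mismatch: request rejected
    claims = json.loads(payload)
    if claims["exp"] < now:
        return None  # token expired: the client must obtain a new one
    return claims
```

When validation returns `None`, the server rejects the request and the client must authenticate again, matching the flow described above.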
The development of a RESTful application can be done in Python using specific libraries, including Flask for the body of the application and SQLAlchemy for the SQL queries to the database. These libraries are compatible with the JSON Web Token (JWT) authorization standard. Other technologies exist, the choice depending on the skills of the development team.
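A minimal sketch of such a RESTful application, assuming Flask is available, could expose one resource through the standard HTTP verbs. The `delivery` resource and its fields are hypothetical, and an in-memory dictionary stands in for the database; a real application would use SQLAlchemy models and JWT-protected endpoints instead.

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

# In-memory stand-in for the CPS database; a real application would use
# SQLAlchemy models. The "delivery" resource is a hypothetical example.
deliveries = {}
next_id = 1

@app.route("/deliveries", methods=["GET"])
def list_deliveries():
    # GET: return a representation of the resource collection
    return jsonify(list(deliveries.values()))

@app.route("/deliveries", methods=["POST"])
def create_delivery():
    # POST: register a new object sent in the JSON request body
    global next_id
    body = request.get_json()
    body["id"] = next_id
    deliveries[next_id] = body
    next_id += 1
    return jsonify(body), 201

@app.route("/deliveries/<int:delivery_id>", methods=["PUT"])
def replace_delivery(delivery_id):
    # PUT: replace the target resource with the request body
    deliveries[delivery_id] = {**request.get_json(), "id": delivery_id}
    return jsonify(deliveries[delivery_id])

@app.route("/deliveries/<int:delivery_id>", methods=["DELETE"])
def delete_delivery(delivery_id):
    # DELETE: remove the target resource
    deliveries.pop(delivery_id, None)
    return "", 204
```

Each route is one API of the resource, so a resource with four verbs exposes four APIs; the same pattern is repeated for every class of the data structure.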
Three types of interfaces can be developed. The first allows objects to be registered in a database. The second type corresponds to resource management interfaces that are accessible by other systems. The third type of interface allows the data to be processed using data mining techniques. In this last type of interface, the data in the database is used as a parameter in the data valorization algorithms. Each interface can be checked with a query tester such as Postman. Other systems can then use the interfaces to transmit their data automatically and to receive data in the case of control systems, but it is necessary to connect these systems to the cloud.

Data sources connection
The connection of the data sources allows for automatic data transmission and storage. The principle of this connection is either to use the application as a client and connect it to the interfaces of the data sources, or to use the application as a server, in which case the sources must be able to send their requests to the application while respecting its protocols. The sources have different standards, services, and protocols, so each type of source must be studied to define a suitable connection solution; see Figure 9.
The first type of data source is the sources that do not collect their own data. These sources can be mechanical systems or the environment. The objective is to identify the best ways to capture process data. The collection system depends entirely on the industrial context and the data to be collected, so the solutions can be industrial sensors or more complex systems integrating storage or processing services.
The second type of source is systems that collect data but do not store or transmit them. These sources are basic sensors, so they must either be connected to a compatible information system such as a SCADA system or be connected to embedded middleware. Middleware creates services to communicate synchronously or asynchronously with a cloud application (Riedl et al., 2014). Synchronous communication means that the middleware is developed as a client to send POST requests at regular intervals to the cloud application and to transmit its data in the request body. Synchronous communications are primarily used by embedded interfaces to provide real-time binding and report events such as errors. Asynchronous communication means that the middleware is developed as a server. The middleware embeds an application with its own APIs, so the cloud application can send requests to the middleware based on a triggering event. The events enable a reduction in the data exchanged by accepting communication only under certain conditions.
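The synchronous client pattern above can be sketched as a small middleware loop that packages a sensor reading and POSTs it to the cloud application at regular intervals. The endpoint URL, payload fields and `read_sensor` callback are illustrative assumptions; on a real device the reading would come from the hardware interface.

```python
import json
import time
import urllib.request

# Hypothetical endpoint of the cloud application's API.
CLOUD_ENDPOINT = "http://cloud.example.com/api/measurements"

def build_payload(source: str, value: float, timestamp: float) -> bytes:
    """Package one reading as the JSON request body expected by the API."""
    return json.dumps({"source": source, "value": value, "ts": timestamp}).encode()

def post_reading(payload: bytes) -> None:
    """Send one synchronous POST request to the cloud application."""
    req = urllib.request.Request(
        CLOUD_ENDPOINT, data=payload,
        headers={"Content-Type": "application/json"}, method="POST")
    urllib.request.urlopen(req, timeout=5)

def middleware_loop(read_sensor, interval_s: float = 60.0):
    """Collect and transmit at regular intervals (runs forever on the device)."""
    while True:
        payload = build_payload("truck_scale", read_sensor(), time.time())
        try:
            post_reading(payload)  # transient network errors are tolerated:
        except OSError:
            pass                   # the next interval resends a fresh reading
        time.sleep(interval_s)
```

In the asynchronous variant, the middleware would instead expose its own small API and let the cloud application query it on a triggering event.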
The third type of source is systems that collect their own data and store it, but cannot transmit it because their protocols or interfaces do not allow it. These sources usually encrypt their data in proprietary files. The first solution is to decrypt the data using a service. This service can either be in the cloud or embedded in middleware. The second solution is to develop an adapter between the source protocol and the cloud application.
However, SMEs rarely have the skills to decrypt the data or to create compatible protocols. The third solution is therefore to contact the source provider to find a solution. The fourth solution is to implement a new capture system.
The fourth type of source is systems that collect their data and store it but do not transmit it because they are not connected. These sources can be SCADA compatible. Otherwise, the source can be directly connected to a service of the application, either as a client or as a server depending on the source. Middleware can also be used to embed the services of a client or server.
The fifth type of source is a source that collects and transmits data, but the data is not stored. This is the case, for example, of a system connected to an information system where the data is not stored in the information system. The first solution is to add a data storage service in the information system. The second solution is to create a parallel communication from the information system to the application to store the data directly on the cloud.
The sixth type of source is the systems that collect, store and transmit their data. The connection of these sources is therefore based on the connection path of the source or of the application.
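The six source types above can be summarized as a small classification helper: given whether a source collects, stores and transmits its data, and whether it is connected to the network, it returns the type that drives the connection solution. This encoding is one reading of the typology above (in particular, network connectivity is used here to separate types 3 and 4), sketched for illustration only.

```python
def source_type(collects: bool, stores: bool, transmits: bool,
                connected: bool = True) -> int:
    """Classify a data source into the six types of the connection study."""
    if not collects:
        return 1  # mechanical system or environment: add a capture system
    if not stores and not transmits:
        return 2  # basic sensor: connect to SCADA or embedded middleware
    if stores and not transmits:
        # assumption: a missing network link means type 4, an incompatible
        # protocol on an otherwise reachable system means type 3
        return 4 if not connected else 3
    if transmits and not stores:
        return 5  # add storage in the information system or a parallel path
    return 6      # collects, stores and transmits: use the existing path
```

For instance, the case study's truck scale (collects, displays, but neither stores nor transmits) would classify as type 2.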
The previously developed RESTful application has already structured the data of the sources and already integrates the interfaces to manage these data, so as soon as a source is connected, the registration of its data is automatic. The services and data of the application can then be visualized and applied thanks to the user and control interfaces.

Development of user and control interfaces
The last step in the development of a CPS is to integrate the user interfaces and the control loops with other systems.
The control terminal allows the access to the APIs to be structured to make it intuitive and to integrate services with processes. The services of the interfaces are primarily the visualization of data, information and knowledge, but also the collection, transmission and storage of data that is necessary but not collected automatically. One type of infrastructure to host this control terminal is web interfaces. These user interfaces take the form of web pages that use internet protocols to access the APIs, which are then coupled with HTML templates to create a multitude of representations and dashboards. Another type of control terminal infrastructure is Excel Power Query, which allows queries to be sent to the APIs. The data is retrieved in the form of an Excel table and can then be used by employees. This solution is of particular interest because it allows each employee to customize their layouts, lets the data layouts used before centralization in the cloud be reused, and still provides access to information in real time.

Service validation and integration
The service validation step ensures that the services are accessible and that they meet the requirements. The accessibility of the services is validated by computer tests, and the requirements are validated through interviews and meetings with stakeholders. If the services are validated, then they can be integrated in the processes; if they are not, then the development of the CPS starts once again from the definition of the data structure. The indicators chosen in the requirement definition step are re-evaluated to reposition the company in relation to its objectives. A new enterprise maturity can be defined to validate the development of the new capabilities. The new data management can also be analyzed with the data lifecycle to monitor the progress of the Industry 4.0 strategy.
The integration of services into processes is important because it corresponds to changing the way employees work by adding the use of new services and capabilities. A leader is a success factor for this step. Indeed, tasks without added value are eliminated from some positions, while other positions gain the responsibility for the new services. A leader is therefore needed to balance the workload for the new services. The use of an activity diagram when defining the data structure also allows the changes in stakeholder responsibilities to be planned, and therefore allows employees to be trained before the actual integration.
The integration of services into processes can also generate resistance to change. It is important that employees understand the benefits of the services and their uses. Local management in SMEs improves the communication needed to integrate services into the work of employees. Involving stakeholders in the starting point analysis, the requirement definition and the definition of the data structure is essential to develop the culture of employees and lower resistance to change. Presentations and seminars can also be given to employees on concrete Industry 4.0 solutions and their implementation in an industrial context.
An iteration loop can be performed following the integration of a service. This loop is used to redefine requirements and to develop new services in the company. One of the causes of iteration is the failure to meet the company's maturity objective. Process changes can also lead to the emergence of new requirements.
To adopt this guideline, IT skills are obviously required, including software development, cybersecurity, data processing and middleware development. The fluidity of the iteration loops relies on the culture of the company but also on the robustness and modularity of the SOA application. Once these elements are adopted in the company, precise schedules can be set up to mark the different steps of the guideline. Regular meetings can also be held to review the progress of the various requirements and the benefits developed by the enterprise with the CPS.

Case study
The guideline presented in the previous section was implemented in La Milanaise, an organic flour mill SME in the Montreal (Québec, Canada) area.
The company belongs to the agri-food sector, which imposes numerous regulations on production to guarantee the quality and safety of its products (X. Chen & Voigt, 2020). The organic nature of the production imposes restrictions on the grains, which must be grown without the use of chemicals, pesticides, and synthetic fertilizers (MAPAQ, 2019). The living and organic characteristics of grains mean that the quality of the raw materials fluctuates depending on the time of year, climate and weather conditions. To better adapt its production to fluctuations in the quality of raw materials, the company considered improving the use of the data present in its processes.

Starting point analysis
The mapping of the systems present in the company is illustrated in Figure 10.
These systems are heterogeneous data sources. The ERP (orange) is a custom-developed information system that does not have the capacity to adapt. The SCADA (yellow) is connected to the automatic production equipment. Other sources (purple) are also present: the truck scale, the silo tonnage measurement, the bagging system and all of the laboratory measurement equipment. These systems are isolated and heterogeneous data sources; they do not communicate, and every system has a different vendor, data format, service and communication protocol. Employees link these systems by manually storing and processing data in Excel files (green).
A process model has been created using the ANSI formalism. This model has been enriched to map the data by adding a corridor below the process. It identifies the data manipulated during the transformation process, with the color of the headers representing the storage system and the color of the columns representing the data source. This map demonstrates that the processes are linear and not complex, but that a lot of data coming from the ERP (orange), the SCADA (yellow) and the other data sources (purple) is stored in Excel files (green).
Finally, the team chose to use the model of Reiner Anderl et al. (2016) to evaluate Industry 4.0 maturity. Specifically developed for industrial SMEs, this model is adapted to the SME context. Two production dimensions have been selected to illustrate this case study because they are particularly addressed by the guideline (Figure 11).

Definition of the requirements
The starting point analysis shows that a lot of data is stored in static, isolated and dispersed Excel files, and that La Milanaise lacks system connectivity and data transparency. Data cannot be reused by other systems that would process and apply it. Figure 12 represents the enterprise architecture at the beginning of the project using the guideline architecture.
This architecture illustrates that the services and data of the workflow control are managed by Excel files. Automatic data collection, transmission and storage requirements are therefore necessary to leverage the data and realize the Industry 4.0 opportunities. However, the company's ERP and SCADA can neither manage data from all sources nor support data valorization algorithms. Therefore, production management decisions are made by a few managers who must aggregate and process data from different sources. The expertise necessary to make decisions is rare, so production depends on certain human resources.
The objective of this project for La Milanaise is therefore to support the decision making of its critical personnel by managing the life cycle of the data present in its processes and by developing data valorization solutions. These new capabilities will allow La Milanaise to evolve in maturity and, in particular, to develop web services for Machine-to-Machine (M2M) communication, to set up interdivisional fully networked IT solutions and to ensure decentralized production monitoring.
In line with the Acatech Maturity Index (Günther et al., 2020), the first requirement was to improve system computerization and connectivity through automatic data collection, transmission and storage. The project team decided to start at the beginning of the production chain with the reception and analysis of grains.

Development of a CPS
A CPS was developed to meet the initial requirements for collecting, transmitting and storing grain receipt and analysis data.

Data structure
A class diagram was created to structure the CPS database. Metadata on each attribute has been defined, with the type, the visibility, the nullable value, the default value, the data sources and comments, but these metadata are not displayed for readability reasons. The classes have been validated by the company by running process scenarios and recording data to verify the attributes and their logical links.

Service-oriented application
The service-oriented application was realized in Python with the Flask and SQLAlchemy libraries and the JWT standard. These tools allowed interfaces to be developed to manage the objects defined in the class diagram and to register them in a database.

Data source connection
The data sources for this iteration were the ERP, the truck scale and the analysis equipment.
The ERP was a type 6 source, collecting data, storing it in a database and transmitting it. However, the ERP does not have a software interface to transmit its data through queries because it was not developed to provide this service. The application was therefore directly connected to the ERP database to export the data listed in the class diagram and make it accessible without constraining the use of the ERP.
The truck scale was a type 2 data source that collected data and displayed it to employees. Algorithms were embedded on a middleware (a Raspberry Pi board) to retrieve the sensor signal and convert it into a digital signal. The middleware sends a query to the application with the weight of the truck each time a truck is detected on the scale. The system makes the link between the weight and the number of trucks.
Each piece of analytical equipment in the lab was analyzed to determine the best way to automatically collect and transmit its data. A near-infrared (NIR) device was a type 4 source because it collected data and stored it in an internal database, but the device was not connected to the network. The NIR was therefore connected, and the application was able to access its database. A device called 'Glutopeak' was a type 3 data source because it encrypted its data into text files. The SME did not have the skills to decrypt this data. A service was created on the application interface to copy the data displayed on the Glutopeak and paste it into a space capable of understanding the format and saving it in the database.

User and control interfaces
First, web pages were developed as an interface for users to connect to the application, manually collect data and visualize the data. Figure 13 illustrates the tables, forms and graphs that were created.
Tables using Power Query in Excel were also created to allow employees to independently customize their use of the data.

Services validation and integration
The application allowed the development of services for automatic and manual data collection, storage in a centralized system and data visualization in real time throughout the company. Computer tests were conducted throughout the project to validate the computer interfaces and the communication between the CPS systems. Meetings and individual interviews were conducted to present the services to the stakeholders and to validate their compliance with the employees' requirements. Several iterations were necessary to validate the project because the requirements were not specified clearly at the beginning of the project.

Discussion
The guideline details a strategy to manage and valorize data in SMEs, with a vision, through the CPS architecture, and the steps and elements to realize this vision. The definition of such a strategy is a success factor to overcome many SME constraints, like financial constraints, lack of expertise and lack of culture, through the possibility to dynamically plan each step (Estensoro et al., 2021). SOA is another success factor to meet the interoperability issues with a flexible and modular architecture, acquired by construction (Pivoto et al., 2021). The communication backbone of the CPS is middleware with a unified protocol, which allows direct communication between the stakeholders and enables the scaling of the system with minimal disruption (Najjari et al., 2021).
The case study implements steps for data collection, transmission, storage and visualization. Data processing and application were not included in the requirements but are being prototyped. Data processing has already been carried out to understand the challenges and possibilities of data mining techniques for the company. Data applications were also being prototyped to send data to the SCADA and control actuators. However, the company decided to give the CPS access only to the SCADA data, and not to its control, for security reasons. The choice to restrain the connection to the SCADA is linked to the company's choice to keep a distributed hierarchical human decision-making power. The information has been decentralized by allowing real-time monitoring of processes on all platforms that support web interfaces and are connected to the company network. This allows managers and operators to have a better overview of situations when making decisions.
The implementation of this CPS also improved the maturity of the company according to the model of Reiner Anderl et al. (2016). Web interfaces enable access to production equipment and SCADA data all over the enterprise. The data structure and company-wide application provide the interdivisional fully networked IT solution and enable production monitoring through decentralized processes. This technological integration is a prerequisite to valorize the data and advance towards manufacturing sustainability, whether it be for new business models, closed-loop supply chain networks, sustainable product design, predictive quality and maintenance or human-robot collaboration (Jamwal, Agrawal, Sharma, & Giallanza, 2021).
The first examples of benefits from Industry 4.0 strategies in SMEs have started to be documented in the literature (Frank et al., 2019; Mittal et al., 2020; Müller et al., 2018), but not all benefits have been studied yet. In the case study presented, the development team failed to integrate the services within the processes. Indeed, service integration changed employees' work and daily responsibilities and needed to be managed, but the team lacked an internal leader for the transition during the integration. The significance of top management beliefs and active participation has already been demonstrated in various studies (Prause, 2019). The company still needs to integrate the services within its processes to benefit from the CPS and to develop other capabilities and competitive advantages by integrating the technologies in several parts of the value chain and implementing more CPS functions (C1, C2, C3, C4 and C5).
SME and CPS challenges were considered in the guideline and in the case study. The case study in particular benefited from government support, which can intervene financially or by providing technological solutions and maintaining a circular business structure in the manufacturing sector (Wankhede & Vinodh, 2021). The guideline integrates standard solutions, such as SOA, but does not use entire standards due to SME constraints. These solutions have been judged as success factors to solve the problems of heterogeneity and tools in the system integration process (X. Chen & Voigt, 2020). Stakeholders were involved in several steps of the guideline to develop the technical and organizational culture. Understanding the strategy, its compatibility with the environment and its advantages had a positive impact on the acceptance of solutions and on development timelines (Prause, 2019). The complexity of the technologies was addressed in the guideline by requiring expertise that was deemed accessible to SMEs, such as software development, database management and web interface skills. These skills may be present internally, in an academic partnership or with IT consultants. However, competencies in Industry 4.0 maturity models, data mining, and cybersecurity are needed, even if they are not present in SMEs. Cybersecurity is needed in all companies, large or small, and an example of a solution has been presented in the guideline. An agile approach was also used to implement the technologies gradually and to provide regular results in the development of the CPS. The agile approach allowed the SME to learn continuously, to better plan the next iterations and to develop a coherent strategy through hierarchy levels and the product lifecycle.

Conclusion
To achieve manufacturing sustainability, industries require to develop new business models and new capabilities.Industry 4.0 strategies help in this transition through the management and the valorization of more and more data, but many industries are still having difficulties with the technological integration of their heterogeneous systems and with the appropriation of the new technologies.SMEs, in particular, have constrained environments which change the adoption conditions of Industry 4.0 strategies, and still lack empirical R&D results for their specific fields and needs.A guideline is therefore proposed to support SMEs in their implementation of CPS.This guideline considers the SME-related challenges and interoperability issues to develop a flexible architecture enabling data valorisation.
The guideline has been implemented in an industrial SME, where it helped to improve the technological maturity in machine-to-machine communication and company-wide networking, as well as the employees' technological culture and awareness. Besides considering SME-related challenges, the guideline starts from industrial problems and objectives to define an appropriate CPS architecture and manage relevant data, as opposed to strategies that only mine available data to find out what can be done. This approach helps SMEs tailor their strategy and develop genuinely new business plans. Involving operators and managers alike in the design and implementation is a success factor, both for defining their place in the system and for reducing resistance to change.

Limitations of the study
The guideline has been implemented in a single case, which limits the generalization of the benefits, partly because the skills present in the company and in the project are among the main factors driving progress and shaping the benefits. The critical skills concern complex project management, computer science, cybersecurity and data science. Another limitation is the robustness of the system, due to the agile-based approach and the quick implementation process. Iterations are planned and can serve to revise some services, but the guideline favours flexibility over robustness, contrary to expert systems.

Future implications
The perspectives for the company of the case study are to continue defining requirements to implement the CPS across its entire value chain and to develop the potential of data mining for data-based decision making. For example, grain analysis data can be valorized to optimize the parameterization of the mill according to the grains and the production objectives. Future work on this guideline could study its compatibility with IoT tools, with different industrial standards, or with multi-agent systems. Radio-frequency identification (RFID) technologies could also be combined with the current system to support logistic traceability.

Figure 5
Figure 5 presents the proposed guideline, with several successive steps to implement technological solutions while considering the constraints and strengths of SMEs and CPS. The starting point analysis, identified as a success factor (Horváth & Szabó, 2019), defines the scope and objectives of the CPS and is followed by the definition of the solution requirements. The development of the technological solutions is detailed through the definition of a data structure, the development of a service-oriented application, the connection of data sources and the development of user interfaces and control. The services are then validated and integrated into processes, and new requirements may be considered. The strategy is to use a service-oriented architecture to create services that meet the business requirements. As shown in Figure 5, validated services are integrated into the process. A loop can then be performed to define new requirements, as in the Plan-Do-Check-Act (PDCA) procedure of continual improvement. This allows companies to decompose requirements and organize development periods of a few weeks, adapting requirements dynamically as the CPS is implemented. The guideline therefore uses an agile approach with short development cycles and revision iterations. The definition of a new requirement depends on the company's strategy and its willingness to continue implementing the CPS. If a service is not validated and does not meet the requirements, an iteration is carried out on the development of the CPS. The agile practices adopted during these iterations are commonly used to develop SOA. The literature review has shown that these practices are a success factor in the context of SMEs, favouring a gradual and dynamic implementation. They mitigate the impact of significant organizational changes by allowing stakeholders to adapt smoothly and by communicating new tools regularly.
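The develop-validate-integrate loop of the guideline can be sketched in code. The following Python fragment is a minimal illustration only: the `Service` class, the validation criterion and the requirement names are hypothetical placeholders and are not part of the guideline itself, which leaves validation criteria to each company's business requirements.

```python
from dataclasses import dataclass

@dataclass
class Service:
    """A service-oriented component built to meet one business requirement."""
    name: str
    revisions: int = 0
    validated: bool = False

def validate(service: Service) -> bool:
    # Placeholder criterion: here a service passes after one revision cycle.
    # In practice this is a check against the business requirement.
    return service.revisions >= 1

def revise(service: Service) -> None:
    # One agile iteration on the development of the CPS.
    service.revisions += 1

def implement_cps(requirements, max_iterations=3):
    """PDCA-style loop: develop a service per requirement, validate it,
    integrate it into the process if it passes, otherwise iterate."""
    integrated = []
    for req in requirements:
        service = Service(name=req)
        for _ in range(max_iterations):
            if validate(service):
                service.validated = True
                integrated.append(service)  # integrate into the process
                break
            revise(service)  # iterate on the CPS development
    return integrated

# Hypothetical requirements; each defined short development cycle ends either
# in integration or in a further revision iteration.
done = implement_cps(["traceability", "machine-monitoring"])
```

Defining a new requirement after this loop, as the guideline allows, simply amounts to calling `implement_cps` again with the next batch of requirements decided by the company's strategy.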

Figure 5 .
Figure 5. Guideline to implement a CPS architecture in an SME.

Figure 7 .
Figure 7. Example of class creation from an Excel file.

Figure 9 .
Figure 9. Illustration of source types and connection solutions.

Figure 12 .
Figure 12. SME architecture at the beginning of the project.

Table 1 .
Description of SME constraints.

Table 2 .
Complementarity between CPS 5C architecture and CPS organic architecture.

Table 3 .
Challenges to implement a CPS.