Scientific literature on the topic "KPI Optimisation"

Create an accurate reference in APA, MLA, Chicago, Harvard and several other styles

Choose a source:

Consult the thematic lists of journal articles, books, theses, conference proceedings and other academic sources on the topic "KPI Optimisation".

Next to every source in the list of references there is an "Add to bibliography" button. Click it, and we will automatically generate a bibliographic reference for the selected source in your preferred citation style: APA, MLA, Harvard, Vancouver, Chicago, etc.

You can also download the full text of the scholarly publication as a PDF and read its abstract online whenever this information is included in the metadata.

Journal articles on the topic "KPI Optimisation"

1

Dalpadulo, Enrico, Francesco Gherardini, Fabio Pini, and Francesco Leali. "Integration of Topology Optimisation and Design Variants Selection for Additive Manufacturing-Based Systematic Product Redesign." Applied Sciences 10, no. 21 (November 5, 2020): 7841. http://dx.doi.org/10.3390/app10217841.

Abstract:
The development of additive manufacturing allows the transformation of technological processes and the redesign of products. Among the most common methods to support additive manufacturing, design can be optimised through the integration of topology optimisation techniques, allowing the creation of complex shapes. However, there are critical issues (i.e., definition of product and process parameters, selection of redesign variants, interpretation of optimised designs, file exchange and data management, etc.) in identifying the most appropriate process and set-ups, as well as in selecting the best variant on a functional and morphological level. Therefore, to fully exploit the technological potential and overcome the drawbacks, this paper proposes a systematic redesign approach based on additive manufacturing technologies that integrates topology optimisation and a tool for selecting design variants based on the optimisation of both product and process features. The method leads to the objective selection of the best redesigned configuration in accordance with the key performance indicators (KPIs) (i.e., functional and production requirements). As a case study, the redesign of a medical assistive device is proposed, previously developed in fused filament fabrication and now optimised for 3D printing with selective laser melting.
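
As a hedged illustration of the KPI-driven variant selection described above, the Python sketch below scores a few hypothetical redesign variants with a min-max-normalised weighted sum; the KPI names, weights, and values are invented, and the paper's actual selection tool is not reproduced.

```python
# Illustrative sketch of KPI-based design-variant selection (not the paper's code).
# Variants, KPI values, and weights are hypothetical.

VARIANTS = {
    "variant_A": {"mass_kg": 0.42, "max_stress_mpa": 180.0, "print_time_h": 6.5},
    "variant_B": {"mass_kg": 0.38, "max_stress_mpa": 210.0, "print_time_h": 8.0},
    "variant_C": {"mass_kg": 0.45, "max_stress_mpa": 150.0, "print_time_h": 5.0},
}
# All three KPIs are treated as "lower is better"; weights encode the relative
# priority of functional vs production requirements.
WEIGHTS = {"mass_kg": 0.5, "max_stress_mpa": 0.3, "print_time_h": 0.2}

def normalise(values):
    """Min-max normalise so every KPI lies in [0, 1] (0 = best, 1 = worst)."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) if hi > lo else 0.0 for v in values]

def rank_variants(variants, weights):
    kpis = list(weights)
    norm = {k: normalise([variants[v][k] for v in variants]) for k in kpis}
    scores = {}
    for i, name in enumerate(variants):
        scores[name] = sum(weights[k] * norm[k][i] for k in kpis)
    return sorted(scores.items(), key=lambda item: item[1])  # lowest score wins

for name, score in rank_variants(VARIANTS, WEIGHTS):
    print(f"{name}: weighted KPI score = {score:.3f}")
```
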
2

Kohlgrüber, Michael, Antonius Schröder, Félix Bayón Yusta, and Asier Arteaga Ayarza. "A new innovation paradigm: combining technological and social innovation." Matériaux & Techniques 107, no. 1 (2019): 107. http://dx.doi.org/10.1051/mattech/2018065.

Abstract:
A new innovation paradigm is needed to answer the societal, economic and environmental challenges the world and companies are facing. The EU-funded Horizon 2020 SPIRE project "Coordinating Optimisation of Complex Industrial Processes" (COCOP) combines technological and social innovation within a steel company pilot case (Sidenor). The project aims at reducing raw material consumption (as well as energy use and emissions) through plant-wide optimisation of production processes based on a software solution, while at the same time changing social practices. Key for COCOP is a methodology integrating technological innovation within a social innovation process of co-creation and co-development, involving (potential) users of the future software system and relevant stakeholders right from the beginning, thereby improving the effectiveness and impact of the innovations and of the implementation process. This involvement is instructed and measured by social key performance indicators (social KPIs) and operationalised in surveys (questionnaires and interviews) with future users, engineers and external experts (from different industry sectors not involved in the project). The article presents the results of the starting point of COCOP, illustrating the future-user perspective of the pilot steel company (Sidenor) contrasted with the view of external experts, taking the interfaces between technology, humans and organisation seriously into account.
3

Redlein, A., C. Baretschneider, and L. Thrainer. "ESG monitoring and optimisation solutions and their return on investment: results of several case studies." IOP Conference Series: Earth and Environmental Science 1176, no. 1 (May 1, 2023): 012029. http://dx.doi.org/10.1088/1755-1315/1176/1/012029.

Abstract:
In 2021 the European Union defined the Environmental, Social and Governance (ESG) directive to foster sustainability. As the real estate sector is responsible for around 40% of CO2 emissions, this industry has to carry out additional sustainability reporting and optimisation activities to prove its assets fulfil sustainability goals. Most investors concentrate on energy and CO2 reduction, but ESG is much more. The Sustainable Development Goals of the United Nations give a good overview of the related topics but do not define KPIs. A second challenge is that the market has so far not rewarded these additional tasks. Automation is therefore necessary to reduce the effort involved. Based on several case studies, the paper answers the following research questions: What are the relevant parameters to prove ESG, focusing especially on the "Environment" and "Social" parts of the ESG directive? What can IT support look like to automate data gathering efficiently? What is the return on investment of the suggested solution? The research focuses on historical buildings, as they usually have a low degree of building automation. The solution enables efficient, automated optimisation of energy consumption and safeguards the well-being of tenants with low investment.
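
The return-on-investment question above comes down to simple payback arithmetic; a minimal sketch follows, with every figure invented for illustration (the paper derives its own values from its case studies).

```python
# Hedged sketch of the ROI arithmetic behind an ESG monitoring solution.
# All figures are invented; they are not taken from the paper.

def simple_roi(investment_eur, annual_savings_eur, annual_running_cost_eur, years):
    """Return (ROI over the period, simple payback time in years)."""
    net_annual = annual_savings_eur - annual_running_cost_eur
    roi = (net_annual * years - investment_eur) / investment_eur
    payback_years = investment_eur / net_annual if net_annual > 0 else float("inf")
    return roi, payback_years

roi, payback = simple_roi(investment_eur=50_000,
                          annual_savings_eur=18_000,   # energy + reporting effort saved
                          annual_running_cost_eur=3_000,
                          years=5)
print(f"5-year ROI: {roi:.0%}, payback: {payback:.1f} years")
```
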
4

Shariat, Mehrdad, Ömer Bulakci, Antonio De Domenico, Christian Mannweiler, Marco Gramaglia, Qing Wei, Aravinthan Gopalasingham, et al. "A Flexible Network Architecture for 5G Systems." Wireless Communications and Mobile Computing 2019 (February 11, 2019): 1–19. http://dx.doi.org/10.1155/2019/5264012.

Abstract:
In this paper, we define a flexible, adaptable, and programmable architecture for 5G mobile networks, taking into consideration the requirements, KPIs, and the current gaps in the literature, based on three design fundamentals: (i) split of user and control plane, (ii) service-based architecture within the core network (in line with recent industry and standard consensus), and (iii) fully flexible support of E2E slicing via per-domain and cross-domain optimisation, devising inter-slice control and management functions, and refining the behavioural models via experiment-driven optimisation. The proposed architecture model further facilitates the realisation of slices providing specific functionality, such as network resilience, security functions, and network elasticity. The proposed architecture consists of four different layers identified as network layer, controller layer, management and orchestration layer, and service layer. A key contribution of this paper is the definition of the role of each layer, the relationship between layers, and the identification of the required internal modules within each of the layers. In particular, the proposed architecture extends the reference architectures proposed in the Standards Developing Organisations like 3GPP and ETSI, by building on these while addressing several gaps identified within the corresponding baseline models. We additionally present findings, the design guidelines, and evaluation studies on a selected set of key concepts identified to enable flexible cloudification of the protocol stack, adaptive network slicing, and inter-slice control and management.
5

Vavallo, Michele, Marco Arnesano, Gian Marco Revel, Asier Mediavilla, Ane Ferreiro Sistiaga, Alessandro Pracucci, Sara Magnani, and Oscar Casadei. "Accelerating Energy Renovation Solution for Zero Energy Buildings and Neighbourhoods—The Experience of the RenoZEB Project." Proceedings 20, no. 1 (July 18, 2019): 1. http://dx.doi.org/10.3390/proceedings2019020001.

Abstract:
Buildings are the key factor in transforming cities and contributing to recent European energy efficiency objectives for 2030 and the long term to 2050. New buildings account for only 1–2% of the stock annually, yet ninety percent of the existing building stock in Europe was built before 1990; it is therefore necessary to promote its energy renovation to achieve the set objectives. Renovation solutions are available on the market, but wrong implementation and integration, due to a lack of knowledge, maximises neither the post-retrofit energy performance nor the financial optimisation and viability of the projects. This paper presents research on a plug & play, modular, easily installable façade and ICT decision-making technologies providing affordable solutions to overcome those deep-renovation barriers. The paper sets out by defining a value framework that real estate investors can apply to make better retrofitting decisions for residential buildings, by mapping targeted building typologies and investigating new building revalorisation strategies, new renovation concepts and KPIs for evaluation. Thereafter the paper presents the modular and easy-to-install façade system that is replicable and scalable at the European level.
6

Branca, Teresa Annunziata, Ismael Matino, Valentina Colla, Alice Petrucciani, Amarjit Kuor Maria Singh, Antonella Zaccara, Teresa Beone, et al. "Paving the way for the optimization of water consumption in the steelmaking processes: barriers, analysis and KPIs definition." Matériaux & Techniques 108, no. 5-6 (2020): 510. http://dx.doi.org/10.1051/mattech/2021006.

Abstract:
The efficient use of water resources is one of the main challenges of the steel sector, according to European Union water policy. On this subject, monitoring and optimisation systems, linked to innovative water treatments, represent important tools to improve water management and the related energy use. The present paper describes part of the work developed in the early stage of the project "Water and related energy Hub Advanced Management system in steelworks – WHAM", which is co-funded by the Research Fund for Coal and Steel. The project aims at optimising water consumption in steelworks through a holistic combination of on-line monitoring and optimisation and innovative water treatment technologies. As different aspects affect water use in steelmaking processes, the first part of the paper analyses the main technical barriers and factors that can impact the reuse and recirculation of wastewater and energy efficiency. The main constraints on water management in the steel sector, such as fresh water availability, its quality and local legal requirements, were considered in order to maximise water reuse and recycling. Furthermore, the main barriers, such as environmental issues and various costs, were investigated. The second part of the paper lists a set of Key Performance Indicators. They aim at assessing and monitoring the sustainability of water management in a holistic way, in terms of environmental and economic performance as well as the efficiency and economic viability of new water treatments. These Key Performance Indicators will be used to monitor the efficiency of water management, aiming to achieve a significant increase in performance. Some of these indicators will also be used as objective functions for optimisation problems. The computation of the selected Key Performance Indicators will take into account both industrial data and results from simulations carried out after the development of suitable tools, in order to assess the feasibility of relevant process modifications or the application of new technologies.
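
The kind of water-management KPI the abstract mentions can be sketched in a few lines; the three indicators and flow values below are generic assumptions for illustration, not the WHAM project's actual definitions.

```python
# Minimal sketch of water-management KPIs of the kind the WHAM project defines.
# KPI definitions and flow values are illustrative assumptions.

def water_kpis(fresh_intake_m3, recirculated_m3, discharged_m3, production_t):
    total_used = fresh_intake_m3 + recirculated_m3
    return {
        "recirculation_rate": recirculated_m3 / total_used,         # share of reused water
        "specific_fresh_water_m3_per_t": fresh_intake_m3 / production_t,
        "discharge_ratio": discharged_m3 / fresh_intake_m3,
    }

kpis = water_kpis(fresh_intake_m3=1_200, recirculated_m3=10_800,
                  discharged_m3=950, production_t=5_000)
for name, value in kpis.items():
    print(f"{name}: {value:.3f}")
```

A KPI such as specific fresh water use could then serve directly as an objective function, e.g. minimised subject to water-quality constraints.
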
7

Kassen, Stefan, Holger Tammen, Maximilian Zarte, and Agnes Pechmann. "Concept and Case Study for a Generic Simulation as a Digital Shadow to Be Used for Production Optimisation." Processes 9, no. 8 (August 3, 2021): 1362. http://dx.doi.org/10.3390/pr9081362.

Abstract:
Optimising an existing production plant is a challenging task for companies. Necessary physical test runs disturb running production processes. Simulation models are one opportunity to limit these physical test runs. This is particularly important since today's fast and intelligent networking opportunities in production systems are in line with the call of Industry 4.0 for substantial and frequent changes. Creating simulation models for those systems requires high effort and in-depth knowledge of production processes. In the current literature, digital twins promise several advantages for production optimisation and can be used to simulate production systems, which reduces the necessary physical test runs and related costs. While most companies are not yet able to create digital twins, companies using enterprise resource planning (ERP) systems have the general capability to create digital shadows. This paper presents a concept and a case study for a generic simulation of production systems in AnyLogic™ to create digital shadows as the first step towards a full digital twin. The generic simulation visualises production systems automatically and displays key performance indicators (KPIs) for the planned production program, using representational state transfer (REST) interfaces to extract product and production data from an ERP system. The case study has been carried out in a learning factory of the University of Applied Life Sciences Emden/Leer. The results prove the presented concept of the generic simulation and show the limits and challenges of working with generic simulation models.
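
The ERP-to-simulation data path described above can be sketched as a REST pull followed by a KPI computation; the endpoint, field names, and KPI below are assumptions for illustration, not the authors' actual interface.

```python
# Sketch of the data path the authors describe: pull production orders from an
# ERP system over REST and derive a KPI for the digital shadow. The endpoint,
# field names, and KPI are hypothetical.
import requests

ERP_BASE = "http://erp.example.local/api"  # hypothetical ERP REST endpoint

def fetch_orders(session: requests.Session):
    resp = session.get(f"{ERP_BASE}/production-orders", timeout=10)
    resp.raise_for_status()
    return resp.json()  # assumed: list of {"planned_h": float, "actual_h": float}

def schedule_adherence_kpi(orders):
    """Share of orders finished within their planned duration."""
    on_time = sum(1 for o in orders if o["actual_h"] <= o["planned_h"])
    return on_time / len(orders) if orders else 0.0

if __name__ == "__main__":
    with requests.Session() as s:
        orders = fetch_orders(s)
        print(f"Schedule adherence: {schedule_adherence_kpi(orders):.1%}")
```
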
8

Stora, T., C. Duchemin, W. Andreazza, E. Aubert, C. Bernerd, T. Cocolios, M. Deschamps, et al. "CERN-MEDICIS: Operational indicators to support the production of new medical radionuclides by mass-separation." Journal of Physics: Conference Series 2687, no. 8 (January 1, 2024): 082039. http://dx.doi.org/10.1088/1742-6596/2687/8/082039.

Abstract:
CERN-MEDICIS is an isotope mass-separation facility dedicated to biomedical research, located in a type A work sector and receiving on average 50% of the 1.4 GeV protons delivered by the Proton Synchrotron Booster (PSB). It was commissioned with Radioactive Ion Beams (RIBs) in 2017. MEDICIS has operated for the past 5 years in batch mode, with targets irradiated in a station located at the HRS beam dump, and with external sources provided by MEDICIS's cyclotron and nuclear reactor partners, notably during the Long Shutdown (LS2). Additional features of the facility include the MELISSA laser ion source, radiochemistry on implanted radionuclides and online gamma-ray spectroscopy implantation monitoring. In 2022, we introduced Key Performance Indicators (KPIs) to monitor the operation of the facility in terms of collection efficiencies, to optimise radiological risks and to evaluate the impact of possible modifications of the station, paralleling for instance the LHC's integrated luminosity. The defined KPIs cover aspects of the operation cycle, e.g. planning in the CERN schedule, target irradiations, duration of the process, radiological risk mitigation, facility up-time, developments and maintenance. MEDICIS KPIs can help distinguish which parts of the operation and infrastructure life cycle require immediate intervention, development or consolidation. These are related to the irradiation stations and irradiation possibilities, the beamlines (parallel collections), target and ion sources (reliability), robot handling and infrastructure, or the separation process itself.
9

Jurczak, Marcin, Grzegorz Miebs, and Rafał A. Bachorz. "Multi-criteria human resources planning optimisation using genetic algorithms enhanced with MCDA." Operations Research and Decisions 32, no. 4 (2022). http://dx.doi.org/10.37190/ord220404.

Abstract:
The main objective of this paper is to present an example of an IT system implementation with advanced mathematical optimisation for job scheduling. The proposed genetic procedure leads to a Pareto front, and the application of a multiple criteria decision aiding (MCDA) approach allows the extraction of the final solution. The definition of key performance indicators (KPIs) reflecting relevant features of the solutions, together with the efficiency of the genetic procedure, provides a Pareto front comprising a representative set of feasible solutions. The application of the chosen MCDA method, namely ELECTRE (élimination et choix traduisant la réalité), allows for the elicitation of the decision maker's (DM) preferences and subsequently leads to the final solution. This solution fulfils the DM's expectations and constitutes the best trade-off between the considered KPIs. The proposed method is an efficient combination of genetic optimisation and an MCDA method.
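
A toy version of the two-stage idea (build a Pareto front of candidate schedules, then apply decision-maker preferences to pick one) is sketched below. A plain weighted sum stands in for ELECTRE, whose outranking procedure is considerably more involved, and the scheduling KPIs are invented.

```python
# Sketch of the two-stage selection: non-dominated filtering of candidate
# schedules, then a preference-based pick. Weighted-sum scoring is a simplified
# stand-in for the ELECTRE method used in the paper.
import random

random.seed(7)
# Hypothetical schedule KPIs: (total_overtime_h, unmet_preferences); both minimised.
candidates = [(random.uniform(0, 40), random.uniform(0, 20)) for _ in range(200)]

def pareto_front(points):
    """Keep points not dominated by any other point (both KPIs minimised)."""
    front = []
    for p in points:
        if not any(q[0] <= p[0] and q[1] <= p[1] and q != p for q in points):
            front.append(p)
    return front

def pick(front, w_overtime=0.7, w_prefs=0.3):
    # Decision-maker weights elicited up front; lower aggregate is better.
    return min(front, key=lambda p: w_overtime * p[0] + w_prefs * p[1])

front = pareto_front(candidates)
best = pick(front)
print(f"Pareto front size: {len(front)}, chosen trade-off: {best}")
```
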
10

Rison, S. C. G., I. Dostal, Z. Ahmed, Z. Raisi-Estabragh, C. Carvalho, M. Lobo, R. Patel, et al. "Protocol design and preliminary evaluation of the REAL-Health Triple Aim, an open-cohort CVD-care optimisation initiative." European Heart Journal 42, Supplement_1 (October 1, 2021). http://dx.doi.org/10.1093/eurheartj/ehab724.3170.

Abstract:
Introduction. Effective treatment of cardiovascular disease (CVD) in primary care could be improved. We aim to assess the efficacy of a scalable treatment optimisation programme in unselected community populations in South East England, with the triple aim of improved blood pressure control in people with hypertension, increased high-intensity statin use in people with CVD, and reduced gastrointestinal bleeding in patients on antithrombotic medication. Method. This observational study comprises an open cohort of approximately 200,000 adults at high cardiovascular risk registered with general practitioners in five South East England Clinical Commissioning Groups (CCGs). An intervention programme is planned in four of these CCGs, with a further non-intervention CCG acting as a control group. The intervention will consist of: clinical guidelines and educational outreach; virtual patient-review software; peer-performance "dashboards"; and, where available, financial incentives. The study will examine 3 primary outcomes: 1. Diagnosed hypertension with a blood pressure <140/90 mmHg; 2. Diagnosed CVD on a high-intensity statin; 3. A cardiovascular indication for antithrombotic therapy with one or more factors for increased risk of gastrointestinal bleeding (e.g. age ≥65) on gastroprotection. A further 17 secondary outcomes related to these three aims will be assessed. Analysis. We will use an interrupted time series analysis over 18 months, representing the pre-implementation, implementation and post-implementation phases, with comparison to the control CCG and applicable national Quality and Outcomes Framework and national prescribing statistics (e.g. OpenPrescribing). Secondary outcomes include an equity impact analysis with results stratified by age, gender, ethnic group and index of deprivation. Preliminary data. We present preliminary data on Key Performance Indicators (KPIs) collected from 191 GP practices, including [percentage achievement on 01/09/2019, on 01/09/2020]: 1. Patients with hypertension and most recent blood pressure ≤140/90 mmHg [68.7%, 60.6%]. 2. Patients eligible for treatment with a high-intensity statin on such treatment [53.8%, 55.8%]. 3. Patients on antithrombotics with ≥1 risk factors for gastrointestinal bleeding on gastroprotection [59.0%, 60.1%]. We also present our virtual patient-review software tool and outcome visualisation dashboard. Conclusion. The REAL-Health Triple Aim initiative is a large-scale primary care cardiovascular risk reduction initiative which was launched almost contemporaneously with the United Kingdom's first SARS-CoV-2-related lockdown. Preliminary data justify the need for the Triple Aim initiative and give us insight into the impact of the pandemic on its implementation. Funding: Barts Charity; British Heart Foundation.

Theses on the topic "KPI Optimisation"

1

Oudrhiri, Ali. "Performance of a Neural Network Accelerator Architecture and its Optimization Using a Pipeline-Based Approach." Electronic thesis or dissertation, Sorbonne université, 2023. https://accesdistant.sorbonne-universite.fr/login?url=https://theses-intra.sorbonne-universite.fr/2023SORUS658.pdf.

Abstract:
In recent years, neural networks have gained widespread popularity for their versatility and effectiveness in solving a wide range of complex tasks. Their ability to learn and make predictions from large data-sets has revolutionized various fields. However, as neural networks continue to find applications in an ever-expanding array of domains, their significant computational requirements become a pressing challenge. This computational demand is particularly problematic when deploying neural networks in resource-constrained embedded devices, especially within the context of edge computing for inference tasks. Nowadays, neural network accelerator chips emerge as the optimal choice for supporting neural networks at the edge. These chips offer remarkable efficiency with their compact size, low power consumption, and reduced latency. Moreover, the fact that they are integrated in the same chip environment also enhances security by minimizing external data communication. In the frame of edge computing, diverse requirements have emerged, necessitating trade-offs in various performance aspects. This has led to the development of highly configurable accelerator architectures, allowing them to adapt to distinct performance demands. In this context, the focus lies on Gemini, a configurable inference neural network accelerator designed with an imposed architecture and implemented using High-Level Synthesis techniques. The considerations for its design and implementation were driven by the need for parallelization configurability and performance optimization. Once this accelerator was designed, demonstrating the power of its configurability became essential, helping users select the most suitable architecture for their neural networks. To achieve this objective, this thesis contributed to the development of a performance prediction strategy operating at a high level of abstraction, which considers the chosen architecture and neural network configuration. This tool assists clients in making decisions regarding the appropriate architecture for their specific neural network applications. During the research, we noticed that using one accelerator presents several limits and that increasing parallelism runs into performance limitations. Consequently, we adopted a new strategy for optimizing neural network acceleration: a high-level approach that does not require fine-grained accelerator optimizations. We organized multiple Gemini instances into a pipeline and allocated layers to different accelerators to maximize performance. We proposed solutions for two scenarios. In the user scenario, the pipeline structure is predefined, with a fixed number of accelerators, accelerator configurations, and RAM sizes; here we proposed solutions to map the layers onto the different accelerators to optimize execution performance. In the designer scenario, the pipeline structure is not fixed, and the number and configuration of the accelerators can also be chosen, to optimize both execution and hardware performance. This pipeline strategy has proven to be effective for the Gemini accelerator. Although this thesis originated from a specific industrial need, certain solutions developed during the research can be applied or adapted to other neural network accelerators. Notably, the performance prediction strategy and the high-level optimization of NN processing through pipelining multiple instances offer valuable insights for broader application.
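
The user-scenario mapping problem described above can be illustrated with a small dynamic program: split a fixed layer sequence into contiguous pipeline stages, one per accelerator, minimising the slowest stage, since the slowest stage bounds pipeline throughput. The layer latencies below are invented, and the thesis additionally accounts for accelerator configurations and RAM sizes.

```python
# Toy version of the layer-to-accelerator mapping: partition a layer sequence
# into K contiguous pipeline stages minimising the bottleneck stage latency.
from functools import lru_cache

layer_ms = [4.0, 7.5, 3.2, 9.1, 2.4, 6.0, 5.5, 1.8]  # hypothetical per-layer latencies
K = 3  # number of accelerator instances in the pipeline

@lru_cache(maxsize=None)
def best_split(start: int, stages: int) -> float:
    """Minimal achievable bottleneck latency for layers[start:] over `stages` stages."""
    if stages == 1:
        return sum(layer_ms[start:])
    best = float("inf")
    # Try every cut point that leaves at least one layer per remaining stage.
    for cut in range(start + 1, len(layer_ms) - stages + 2):
        stage_time = sum(layer_ms[start:cut])
        best = min(best, max(stage_time, best_split(cut, stages - 1)))
    return best

print(f"Optimal pipeline bottleneck: {best_split(0, K):.1f} ms per stage")
```
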

Book chapters on the topic "KPI Optimisation"

1

Panagiotopoulou, Vasiliki C., Alexios Papacharalampopoulos, and Panagiotis Stavropoulos. "Developing a Manufacturing Process Level Framework for Green Strategies KPIs Handling." In Lecture Notes in Mechanical Engineering, 1008–15. Cham: Springer International Publishing, 2023. http://dx.doi.org/10.1007/978-3-031-28839-5_112.

Abstract:
Green strategies in manufacturing have multifold perspectives, implying that they are highly diversified in terms of resources management. Popular green strategies are Zero Defect, Circularity and Sustainability. The challenges regarding resource efficiency result from the different concepts addressed by each strategy: Zero Defect focuses on defect prevention via quality planning, control, and improvement, while Circularity addresses resources optimisation via resources management, material production, usage and disposal. Sustainability is a different approach, including economic growth and social impact besides resources management, waste management and environmental impact. Until now, key performance indicators (KPIs) have been used for each individual strategy, while the literature shows a lack of frameworks for transforming KPIs when adopting more than one strategy. The current work is a step towards defining an approach describing the relationship between the KPIs of different green strategies and elaborating the repercussions of this transformation on workflows, and specifically on manufacturing processes. Two different approaches could be used (monetary and qualitative), with thermoforming used as a case, and the results are indicative of the method's efficiency, where KPIs for Zero Defect, Circularity and Sustainability are compared. The framework is developed to be later generalised and applied to other manufacturing processes.

Conference papers on the topic "KPI Optimisation"

1

Dubey, Pranav, Rachit Garg, Prateek Kumar, Akash Tyagi, Aditi Jain, Pramit Chakraborty, and Sameer Chabbra. "Calculating Invisible Loss Time (ILT) Index Values & Predictive Analysis Using Bayesian Approach to Improve Drilling Operational Efficiency: Adopting Best Practices." In ADIPEC. SPE, 2023. http://dx.doi.org/10.2118/216289-ms.

Abstract:
Reducing Invisible Loss Time (ILT) is a significant approach towards operational efficiency and the adoption of best practices in drilling operations, but accurately predicting ILT with mathematical models is an additional challenge because multiple factors contribute to ILT. This paper sketches an algorithm for calculating the ILT index value and then uses a Bayesian approach, in preference to other existing statistical methods, for predicting the ILT index value. Digital oilfield architecture processes, together with data management systems, have worked in synchrony to stream data for analytics in real-time well engineering solutions. Defined KPIs of Invisible Loss Time (ILT) were evaluated on small sets of data with a Bayesian optimisation approach, using a probability model for the likelihood of event occurrence. The performance of this model is evaluated by comparing actual and predicted ILT index values. Benchmark (BM) values were calculated based on the best performance for the quarter, month, and week to understand the randomness of values. Real-time data were packeted into small sets according to the KPIs defined for probability analysis. The sets of data made available for calculation were fed into the probability model for forecasting the values. Results from the predictive models showed that batch drilling activities had a significant reduction in variance among the ILT values. ILT reduced during operational activity due to adaptive learning can be calculated to quantify that cost component. Weighted percentages of KPIs were calculated in decreasing order of their significance.
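
A hedged sketch of the two ingredients the paper combines is given below: a generic ILT index relative to a benchmark, and a textbook Beta-Bernoulli update standing in for the paper's probability model. The paper's actual KPI weighting and index formula are not reproduced.

```python
# Generic forms only; thresholds, observations, and the index definition are
# illustrative assumptions, not the paper's model.

def ilt_index(actual_h: float, benchmark_h: float) -> float:
    """Invisible loss time relative to the benchmark (best observed) performance."""
    return max(actual_h - benchmark_h, 0.0) / benchmark_h

# Beta-Bernoulli posterior for "operation exceeds its ILT threshold" events:
# start from a uniform prior and update with each observed operation.
alpha, beta = 1.0, 1.0
observations = [True, False, False, True, False, False, False]  # exceeded threshold?
for exceeded in observations:
    alpha, beta = (alpha + 1, beta) if exceeded else (alpha, beta + 1)

p_exceed = alpha / (alpha + beta)  # posterior mean probability
print(f"ILT index example: {ilt_index(actual_h=6.2, benchmark_h=5.0):.2f}")
print(f"Posterior P(next operation exceeds ILT threshold): {p_exceed:.2f}")
```
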
2

Romdhanne, Bilel Ben, Mourad Boudia, and Nicolas Bondoux. "Amadeus Migration Process: a Simulation-Driven Process to Enhance the Migration to a Multi-Cloud Environment." In 12th International Conference on Digital Image Processing and Vision. Academy & Industry Research Collaboration, 2023. http://dx.doi.org/10.5121/csit.2023.131308.

Abstract:
With the development of cloud offers, we observe a prominent trend of applications being migrated from private infrastructure to the cloud. Depending on the application's complexity, the migration can be complex and needs to consider several dimensions, such as dependency issues, service continuity, and the service level agreement (SLA). Amadeus, the travel industry leader, has partnered with Microsoft to migrate its IT ecosystem to the Azure cloud. This work addresses the specificity of cloud-to-cloud migration and multi-cloud constraints. In this paper, we summarise the Amadeus migration process. The process aims to drive the migration from an initial private cloud environment to a target environment that can be a public or hybrid cloud. Further, the process focuses on a prediction phase that guides the migration. This paper aims to provide an efficient decision-making process that guides managers and architects in optimising and securing their migration, considering micro-services-oriented applications and targeting an efficient deployment over multi-cloud or hybrid cloud. The prediction relies on network simulation to predict applications' behaviour in the cloud and to evaluate different scenarios and deployment topologies beforehand. The objective is to predict migrated applications' behaviour and identify any issue related to performance, the application's dependency on other components, or the deployment in the cloud. The migration process proposed in this paper relies on SimGrid, a toolkit developed by INRIA for distributed application modelling. This framework offers a generic process to model IT infrastructure and can assist cloud-to-cloud migration. Specific attention is given to predictive and reactive optimisations. The first results show the impact of predictive optimisation on securing KPIs and of reactive optimisation on reducing the solution cost. We reach an average cost reduction of 40% in comparison with the same deployment strategy while keeping the same SLA.
3

Tierney, Christopher M., Peter L. Higgins, Colm J. Higgins, Rory J. Collins, Adrian Murphy, and Damian Quinn. "Steps towards a Connected Digital Factory Cost Model." In 2023 AeroTech. Warrendale, PA: SAE International, 2023. http://dx.doi.org/10.4271/2023-01-0999.

Abstract:
Digital transformation is at the forefront of manufacturing considerations, but often excludes discrete event simulation and cost modelling capabilities, meaning digital twin capabilities are in their infancy. As cost and time are critical metrics for manufacturing companies, it is vital the associated tools become a connected digital capability. The aim is to digitize cost modelling functionality and its associated data requirements in order to couple cost analysis with digital factory simulation. The vast amount of data existing in today's industry, alongside the standardization of manufacturing processes, has paved the way for a 'data first' cost and discrete event simulation environment that is required to facilitate the automated model-building capabilities needed to seamlessly integrate the digital twin within existing manufacturing environments.

An ISA-95-based architecture is introduced where phases within a cost modelling and simulation workflow are treated as a series of interconnected modules: process mapping (including production layout definition); data collection and retrieval (resource costs, equipment costs, labour costs, learning rates, process/activity times etc.); network and critical path analysis; cost evaluation; cost optimisation (bottleneck identification, production configuration); simulation model build; cost reporting (dashboard visualisation, KPIs, trade-offs). Different phases are linked to one another to enable automated cost and capacity analysis. Leveraging data in this manner enables the updating of standard operating procedures and learning rates in order to better understand manufacturing cost implications, such as actual cost versus forecast, and to incorporate cost implications into scheduling and planning decisions.

Two different case studies are presented to highlight different applications of the proposed architecture. The first shows it can be used within a feasibility study to benchmark novel robotic joining techniques against traditional riveting of stiffened aero structures.

In the second case study, discrete event digital factory simulations supply important production metrics (process times, wait times, resource utilisation) to the cost model to provide 'real-time' cost modelling. This enables both time and cost to be used for more informed decision making within an ever-demanding manufacturing landscape. In addition, this approach adds value to simulation processes by enabling simulation engineers to focus on value-adding activities instead of time-consuming model builds, data gathering and model iterations.
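
The network and critical path analysis module lends itself to a compact illustration: a forward pass over an activity graph yields the critical-path makespan, and a rollup sums activity costs. The activities, durations, and costs below are hypothetical, not taken from the paper's case studies.

```python
# Sketch of a critical-path and cost rollup over a small activity network.
# activity: (duration_h, cost_eur, predecessors) -- all values invented.
from functools import lru_cache

activities = {
    "layup":   (4.0, 800.0, []),
    "drill":   (2.0, 300.0, ["layup"]),
    "rivet":   (3.0, 450.0, ["drill"]),
    "inspect": (1.0, 150.0, ["rivet"]),
    "paint":   (2.5, 400.0, ["drill"]),
}

@lru_cache(maxsize=None)
def earliest_finish(name: str) -> float:
    """Forward pass: an activity finishes after its slowest predecessor plus itself."""
    duration, _, preds = activities[name]
    return duration + max((earliest_finish(p) for p in preds), default=0.0)

makespan = max(earliest_finish(a) for a in activities)
total_cost = sum(cost for _, cost, _ in activities.values())
print(f"Critical-path makespan: {makespan:.1f} h, rolled-up cost: EUR {total_cost:.0f}")
```
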