Dissertations on the topic "Estimation des coûts d'exploitation"
Format your source in APA, MLA, Chicago, Harvard, and other citation styles
Consult the top 22 dissertations for your research on the topic "Estimation des coûts d'exploitation".
Next to every entry in the list you will find an "Add to bibliography" button. Click it, and we will automatically generate the bibliographic reference to the selected work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the publication as a .pdf file and read its abstract online, whenever these are available in the metadata.
Browse dissertations from a wide variety of disciplines and compile your bibliography correctly.
Aubé, Antoine. "Comprendre la migration infonuagique : exigences et estimation des coûts d'exploitation." Electronic Thesis or Diss., Toulouse, ISAE, 2024. http://www.theses.fr/2024ESAE0021.
Selecting a host for an information system is a major decision for any organization. Indeed, such a decision has consequences on many aspects, such as the operating costs, the manpower allocated to operations, and the quality of service provided by the information system. While hosting was traditionally carried out by the organizations themselves, on their own premises, the emergence of third-party hosting providers initiated a change in practices: a migration of information systems to the infrastructure of another organization. Cloud Computing is such a model for delegating infrastructure to a third party. The latter provides a cloud, which is a set of configurable services that deliver computing resources. In this context of cloud migration, new issues emerge when selecting information system hosts, named cloud environments. In particular, problems related to the recurrence and variety of operational costs are well known in the industry. In this research work, we aim to identify the criteria for selecting a cloud environment during a migration, and how we can evaluate the compliance of a cloud environment with the requirements linked to these criteria. To this end, we first carried out a qualitative study with industry experts to identify the most recurrent requirements on cloud environments in the industry. We then focused on estimating the operational costs of these environments, which are frequently mentioned as a criterion to be minimized, and which are often misunderstood given the variety of pricing schemes in Cloud Computing. We therefore developed a conceptual model for estimating these costs, and then a tool that implements this conceptual model to automate the estimation.
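To make the pricing-scheme issue mentioned in this abstract concrete, here is a minimal, purely illustrative sketch of how a monthly operating cost could be tallied under on-demand versus reserved pricing. The resource names, rates, and schemes are hypothetical and are not taken from the thesis's conceptual model or tool.

```python
# Illustrative sketch only: tallying the monthly operating cost of a small cloud
# deployment under two hypothetical pricing schemes (on-demand vs. reserved).
# Resource names and rates are invented; this is not the thesis's cost model.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Resource:
    name: str
    hours_per_month: float                  # expected usage
    on_demand_rate: float                   # price per hour, pay-as-you-go
    reserved_rate: Optional[float] = None   # price per hour if reserved capacity is bought

def monthly_cost(resources, use_reservations: bool = False) -> float:
    """Sum per-resource costs under the chosen pricing scheme."""
    total = 0.0
    for r in resources:
        if use_reservations and r.reserved_rate is not None:
            rate = r.reserved_rate
        else:
            rate = r.on_demand_rate
        total += rate * r.hours_per_month
    return total

deployment = [
    Resource("app-server", 730, on_demand_rate=0.10, reserved_rate=0.06),
    Resource("database", 730, on_demand_rate=0.20, reserved_rate=0.13),
    Resource("batch-workers", 120, on_demand_rate=0.40),  # spiky workload, kept on demand
]
print(monthly_cost(deployment))                         # on-demand only
print(monthly_cost(deployment, use_reservations=True))  # with reservations
```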
Ould, Mohamed Abdellahi Mohamed El-Moustapha. "Contribution à une méthode de chiffrage paramétrique dédiée à la conception amont de systèmes mécaniques." Besançon, 2006. http://www.theses.fr/2006BESA2012.
This PhD dissertation proposes a prescriptive method dedicated to parametric costing within the conceptual and functional design stages of mechanical systems. The method is intended for the cost engineer, the project leader, and the mechanical design engineer. It identifies and lays out several costing tasks spread over three steps: before the design project (preparation), during the project (estimation), and after the project (improvement). Its data-processing applicability was validated by a tool called Kostimator. Its relevance was assessed on the case of manual gearbox costing in the early design stages (conceptual, functional, and layout design).
Fatemi, Seyyedeh Zohreh. "Planification des essais accélérés : optimisation, robustesse et analyse." Phd thesis, Université d'Angers, 2012. http://tel.archives-ouvertes.fr/tel-01004379.
Camargo-Pardo, Mauricio. "Estimation paramétrique des coûts des produits finis dans la filière textile-habillement." Valenciennes, 2004. http://ged.univ-valenciennes.fr/nuxeo/site/esupversions/fe756208-c237-4caf-a060-91109809c4ba.
In supply chains with a high degree of product diversity and renewal, it is very difficult to establish economic laws at the design stage in order to accurately forecast product cost. Yet at this early stage, 70 to 80% of the product cost is already determined, while only scarce product information is available, related mainly to the product's aesthetic features or functionalities. In order to minimise the risk of product rejection, it is important for designers to have a cost estimation tool that is flexible and easy to adapt. We use the parametric approach to develop Cost Estimation Relationships (CERs). First, we define a general methodology and the main concepts for developing CERs with regression and soft-computing techniques. In particular, we developed a Simplified Hybrid Neuro-Fuzzy model, allowing better interpretation of the variables, mainly for complex systems. We also propose a tool for developing a CER using the described modelling techniques. Candidate CERs can be compared in terms of accuracy, robustness, and relevance, giving the user as much information as possible to choose the best CER. This approach has been tested on a case concerning the development of a specific CER for a textile printing company.
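For readers unfamiliar with parametric costing, the sketch below shows what a CER looks like in its simplest form: an ordinary least-squares fit of unit cost against a few design descriptors. The descriptors and figures are invented placeholders; the thesis's Simplified Hybrid Neuro-Fuzzy model is considerably richer.

```python
# Minimal sketch of a parametric Cost Estimation Relationship (CER): a linear
# regression of unit cost on a few design descriptors. Features and data are
# invented for illustration only.
import numpy as np

# Hypothetical descriptors: [number of colours, print coverage (%), fabric weight (g/m2)]
X = np.array([
    [2, 30, 120],
    [4, 55, 150],
    [6, 80, 200],
    [3, 40, 130],
    [5, 70, 180],
], dtype=float)
y = np.array([1.8, 3.1, 4.9, 2.3, 4.0])  # unit cost, arbitrary currency

# Ordinary least squares with an intercept term
A = np.column_stack([np.ones(len(X)), X])
coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)

def estimate_cost(descriptors):
    """Apply the fitted CER to a new product description."""
    return coeffs[0] + coeffs[1:] @ np.asarray(descriptors, dtype=float)

print(estimate_cost([4, 60, 160]))
```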
Pham, Thuy Van. "Ancrage nominal du taux de change et coûts de la désinflation : une estimation économétrique." Phd thesis, Université Panthéon-Sorbonne - Paris I, 2007. http://tel.archives-ouvertes.fr/tel-00198619.
Pham, Thuy Vân. "Ancrage nominal du taux de change et coûts de la désinflation : une estimation économétrique." Paris 1, 2007. https://tel.archives-ouvertes.fr/tel-00198619.
Повний текст джерелаMao, Gwladys. "Estimation des coûts économiques des inondations par des approches de type physique sur exposition." Thesis, Lyon, 2019. http://www.theses.fr/2019LYSE1192/document.
This research was conducted within the framework of the Caisse Centrale de Réassurance's (CCR) R&D objective of widening the scope of impacts estimated by its flood impact model. In chapter one, we study the various impacts of a flood and their interdependencies in order to build a classification of impacts. This classification allows us to design the architecture of a global impact model built by linking specific impact models, where the outputs of one model are the inputs of the next. This approach generates an estimate with a breakdown by type of impact. It also allows us to understand the domino effects from direct damage to macroeconomic impacts. In chapter two, we study models for car damage according to CCR's specifications. The requirements are that the model should be independent of other natural-catastrophe and impact estimations, and that it should be able to model both a specific event and the total annual loss. Through this work we describe, implement, and identify issues and possible improvements for three modelling approaches: a simple linear regression, the method currently used by CCR; a frequency x severity model combined with extreme value theory, widely used in the insurance sector; and a model that pairs a physical hazard model with exposure through damage curves. CCR already uses this last approach to estimate damage to buildings; we therefore use CCR's flood hazard model and develop an exposure model and a damage model specific to cars. CCR is in charge of the accounting management of the National Agricultural Risk Management Fund on behalf of the State. Chapter three therefore contains a state of the art of modelling solutions for this risk and a description of the designed model and its implementation. A vulnerability model and a damage model specific to agricultural risk are developed and paired with CCR's flood hazard model. The vulnerability model uses the Graphic Parcel Register database. The damage model is based on the damage curves developed by IRSTEA for the national think tank on flood cost-benefit analysis. Chapter four is a technical document that will allow CCR to continue the development of the global model. It presents a situational analysis of what has been done (car and agricultural risks) and of ongoing work (business interruption due to direct damage). For the remaining impacts, it presents the modelling issues, a short literature review, and the conclusions reached in terms of modelling.
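As context for the second approach listed above, the sketch below simulates an annual loss as a Poisson number of events with lognormal severities. The distributions and parameters are illustrative assumptions, not CCR's calibration, and the extreme-value refinements are omitted.

```python
# Toy frequency x severity simulation of an annual loss, in the spirit of the
# second modelling approach described above. Parameters are illustrative only.
import numpy as np

rng = np.random.default_rng(0)

def simulate_annual_loss(n_years=10_000, freq_mean=3.0, sev_mu=10.0, sev_sigma=1.2):
    """Annual loss = sum of a Poisson number of lognormal event severities."""
    losses = np.empty(n_years)
    for i in range(n_years):
        n_events = rng.poisson(freq_mean)
        losses[i] = rng.lognormal(sev_mu, sev_sigma, size=n_events).sum()
    return losses

annual = simulate_annual_loss()
print("mean annual loss:", annual.mean())
print("99.5% quantile  :", np.quantile(annual, 0.995))
```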
Deschênes, Sébastien. "Estimation des pertes liées aux prêts commerciaux d'une institution financière coopérative, gestion de l'information et coûts d'agence." Thèse, Université du Québec à Trois-Rivières, 2010. http://depot-e.uqtr.ca/2054/1/030187826.pdf.
Colin, Antoine. "Estimation de temps d'exécution au pire cas par analyse statique et application aux systèmes d'exploitation temps réel." Rennes 1, 2001. http://www.theses.fr/2001REN10118.
Повний текст джерелаBertier, Clément. "Quantification in Device-to-Device Networks : from Link Estimation to Graph Utility." Electronic Thesis or Diss., Sorbonne université, 2020. http://www.theses.fr/2020SORUS250.
Device-to-device (D2D) communications are valuable in several domains, such as data offloading and diffusion, as their cost is only a fraction of that of regular cellular communication. In this thesis, we argue that understanding the potential utility behind direct communications is key to quantifying the realization of contact networks. We tackle the related questions through distinct yet complementary contributions. Firstly, we consider the problem of estimating the importance of a node in large dynamic topologies. We propose a novel approach to estimate centralities based on a pre-established database, where the estimation relies on the geographical coordinates of the node instead of its identifier. Doing so enables us to estimate the centrality of a node at a fraction of the computational cost. Secondly, we quantify the value of direct links through an experimental measurement campaign. Using an Android tool of our making, we derived a model giving an estimate of the upper bound on D2D throughput as a function of the distance between the devices. Thirdly, we investigate the differences between the traditional quantification of a contact and the model extracted from our measurement campaigns. Among other results, we reveal that when considering a throughput that adapts to the distance between two devices, long-distance data exchange makes up more than 50% of the total data exchanged in the entire network. We propose a tool to extract from mobility datasets the volume of data exchanged, based on specific contact-quantification strategies.
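To illustrate the kind of distance-dependent throughput bound this abstract refers to, here is a toy model. The functional form and constants are hypothetical placeholders, not the curve fitted from the thesis's measurement campaign.

```python
# Hedged sketch: a generic distance-to-throughput upper-bound curve. The shape
# and constants below are hypothetical and not the model fitted in the thesis.

def d2d_throughput_upper_bound(distance_m, max_rate_mbps=250.0, half_rate_distance_m=40.0):
    """Monotonically decreasing bound: maximal at contact, roughly halved at a reference distance."""
    if distance_m <= 0:
        return max_rate_mbps
    return max_rate_mbps / (1.0 + (distance_m / half_rate_distance_m) ** 2)

for d in (1, 10, 40, 100):
    print(d, "m ->", round(d2d_throughput_upper_bound(d), 1), "Mbps")
```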
Moumen, Chiraz. "Une méthode d'optimisation hybride pour une évaluation robuste de requêtes." Thesis, Toulouse 3, 2017. http://www.theses.fr/2017TOU30070/document.
The quality of an execution plan generated by a query optimizer is highly dependent on the quality of the estimates produced by the cost model. Unfortunately, these estimates are often imprecise. A body of work has been done to improve estimate accuracy. However, obtaining accurate estimates remains very challenging, since it requires prior and detailed knowledge of the data properties and run-time characteristics. Motivated by this issue, two main optimization approaches have been proposed. A first approach relies on single-point estimates to choose an optimal execution plan. At run-time, statistics are collected and compared with the estimates. If an estimation error is detected, a re-optimization is triggered for the rest of the plan. At each invocation, the optimizer uses specific values for the parameters required for cost calculations. Thus, this approach can induce several plan re-optimizations, resulting in poor performance. In order to avoid this, a second approach considers the possibility of estimation errors at optimization time. This is modelled by the use of multi-point estimates for each error-prone parameter. The aim is to anticipate the reaction to a possible plan sub-optimality. Methods in this approach seek to generate robust plans, which are able to provide good performance under several run-time conditions. These methods often assume that it is possible to find a robust plan for all expected run-time conditions, an assumption that remains unjustified. Moreover, the majority of these methods keep an execution plan unchanged until termination, which can lead to poor performance if robustness is violated at run-time. Based on these findings, we propose in this thesis a hybrid optimization method that pursues two objectives: the production of robust execution plans, particularly when the uncertainty in the estimates used is high, and the correction of a robustness violation during execution. This method makes use of intervals of estimates around error-prone parameters. It produces execution plans that are likely to perform reasonably well over different run-time conditions, so-called robust plans. Robust plans are then augmented with what we call check-decide operators. These operators collect statistics at run-time and check the robustness of the current plan. If robustness is violated, check-decide operators are able to decide on plan modifications that correct the violation without needing to call the optimizer again. The results of performance studies of our method indicate that it provides significant improvements in the robustness of query processing.
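A minimal sketch of the interval-based idea, not the thesis's actual algorithm: each candidate plan's cost is evaluated over an interval of an error-prone parameter (here a selectivity), and the plan with the best worst case is retained. The plans and cost functions are invented for illustration, and the check-decide operators and run-time corrections are not modelled.

```python
# Illustrative minimax choice of a "robust" plan over an interval of selectivity
# estimates. Cost functions and constants are invented for illustration only.

def plan_cost(plan, selectivity, table_rows=1_000_000):
    if plan == "index-nested-loop":
        return selectivity * table_rows * 4.0                 # cheap when few rows qualify
    if plan == "hash-join":
        return 0.5 * table_rows + selectivity * table_rows    # fixed build cost, cheap probe
    raise ValueError(plan)

def robust_choice(plans, sel_interval, samples=50):
    lo, hi = sel_interval
    grid = [lo + (hi - lo) * i / (samples - 1) for i in range(samples)]
    # Minimax: minimise the worst-case cost over the selectivity interval.
    return min(plans, key=lambda p: max(plan_cost(p, s) for s in grid))

print(robust_choice(["index-nested-loop", "hash-join"], sel_interval=(0.001, 0.3)))
```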
Ouni, Bassem. "Caractérisation, modélisation et estimation de la consommation d'énergie à haut-niveau des OS embarqués." Phd thesis, Université Nice Sophia Antipolis, 2013. http://tel.archives-ouvertes.fr/tel-01059814.
Bricola, Jean-Charles. "Estimation de cartes de profondeur à partir d’images stéréo et morphologie mathématique." Thesis, Paris Sciences et Lettres (ComUE), 2016. http://www.theses.fr/2016PSLEM046/document.
In this thesis, we introduce new approaches dedicated to the computation of depth maps associated with a pair of stereo images. The main difficulty of this problem resides in the establishment of correspondences between the two stereoscopic images. Indeed, it is difficult to ascertain the relevance of matches occurring in homogeneous areas, whilst matches are infeasible for pixels occluded in one of the stereo views. In order to handle these two problems, our methods are composed of two steps. First, we search for reliable depth measures by comparing the two images of the stereo pair with the help of their associated segmentations. The analysis of image superimposition costs, on a regional basis and across multiple scales, allows us to perform relevant cost aggregations, from which we deduce accurate disparity measures. Furthermore, this analysis facilitates the detection of the areas of the reference image that are potentially occluded in the other image of the stereo pair. Second, an interpolation mechanism is devoted to the estimation of depth values where no correspondence could be established. The manuscript is divided into two parts: the first allows the reader to become familiar with the problems and issues frequently encountered when analysing stereo images, and provides a brief introduction to morphological image processing. In the second part, our algorithms for the computation of depth maps are introduced, detailed, and evaluated.
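For context, once disparities have been estimated on a rectified stereo pair, depth follows from the textbook relation depth = focal_length x baseline / disparity. The snippet below applies this relation with invented camera parameters; it is not part of the thesis's morphological method itself.

```python
# Standard stereo geometry: convert a disparity map (in pixels) into a depth map
# (in metres) for a rectified pair. Camera parameters below are invented.
import numpy as np

def disparity_to_depth(disparity_px, focal_px, baseline_m):
    """Return depth in metres; invalid (zero) disparities map to infinity."""
    disparity_px = np.asarray(disparity_px, dtype=float)
    with np.errstate(divide="ignore"):
        return np.where(disparity_px > 0, focal_px * baseline_m / disparity_px, np.inf)

print(disparity_to_depth([64, 32, 0], focal_px=1200.0, baseline_m=0.12))
```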
Belharet, Mahdi. "L'estimation de la valeur statistique de la vie humaine dans le domaine de la santé : quel fondement normatif pour une estimation monétaire au sein de l'économie du bien-être ?" Thesis, Aix-Marseille, 2018. http://www.theses.fr/2018AIXM0098.
The value of a statistical life (VSL) is an economic analysis tool, defined from the amount that a person is willing to pay (WTP) in order to reduce a risk of mortality or morbidity. The advantage of such a tool is to provide a monetary estimate of the social benefit of an investment project intended to reduce risk, and to support arbitration between several alternatives. Responding to such choices in a context of scarce resources is perfectly in keeping with the VSL. Since the willingness to pay is estimated by individuals themselves, depending on how they perceive risks and on their income level, people are positioned as the sole judges of the value of their own lives. Because people freely determine their WTP according to their personal preferences, and these preferences are aggregated in order to determine a social choice, the value of a statistical life does not contradict the normative framework within which a decision is established. Nonetheless, welfarism, which underlies the VSL estimation methods, is directly related to utilitarianism, and the value estimated by the VSL is ultimately subjective in nature. In the health sector, the VSL needs to go beyond this subjective framework of estimation in order to answer the normative ethics that characterizes medical practice, in particular by taking into account personal autonomy, the personal notion of a good life, and the universal notion of the person. The aim of our work is to investigate the arguments founding a reference value for the VSL that fits within a normative and objective framework. This theoretically requires an in-depth analysis within the economic theory of well-being.
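For context, the standard textbook relation behind VSL estimates, not specific to this thesis, treats the VSL as the willingness to pay per unit of mortality-risk reduction:

```latex
% Textbook relation, not specific to this thesis: for a small mortality-risk
% reduction \Delta p, the VSL is the willingness to pay per unit of risk reduction.
\mathrm{VSL} \;\approx\; \frac{\mathrm{WTP}}{\Delta p}
```

For instance, a willingness to pay of 60 euros for a 1-in-100,000 reduction in the risk of death would imply a VSL of 60 / 0.00001 = 6 million euros.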
Pirson, Magali. "Apports de la comptabilité analytique par cas et par pathologie à la gestion hospitalière." Doctoral thesis, Universite Libre de Bruxelles, 2006. http://hdl.handle.net/2013/ULB-DIPOT:oai:dipot.ulb.ac.be:2013/210867.
DRGs represent the most recent attempt to control the growth of hospital expenditure by introducing a partial medicalisation of the financing mechanisms.
Knowledge of the costs of pathologies can allow hospitals to take part in the development of per-pathology tariffs by belonging to a reference sample of hospital costs. Under pathology-based financing, hospitals must be able to compare the cost of hospital stays with the revenue granted and adapt to it. This interest increases under lump-sum financing, a development that seems to be taking shape in Belgium as in other countries. By describing a methodology for computing costs per pathology and by indicating how these costs could contribute to the creation of a scale of cost weights, our thesis encourages hospitals to adopt a proactive policy in the field of hospital financing.
Comparisons of hospital costs to evaluate management have been carried out for many years. However, this benchmarking is imperfect because it does not take into account the severity of the patients treated. Standardising costs on the basis of the hospital's case mix presupposes an important prerequisite: the existence of a scale of cost weights derived from a representative sample of hospitals. Although this situation has not yet been fully achieved in Belgium, it is nevertheless possible to suggest a way forward. The simulation inspired by the Swiss methodology, based on a sample of four Belgian hospitals and presented in this thesis, is a first step in this direction.
One of the major problems of hospital management is getting prescribers and care providers interested in an essentially financial management control. In recent years, many efforts have aimed at integrating new performance indicators into management dashboards. Analysing the costs of pathologies and the variability of cases makes it possible to open a dialogue between managers and the medical profession. Through various studies (the contribution of nomenclatures to the calculation of costs per pathology, the measurement of the costs associated with nosocomial bacteraemia, the analysis of the medico-social factors explaining the extra costs of outlier patients, the analysis of the relationship between cost and case severity, and the comparison of production costs and medical practices), we wanted to show the importance of combining a medicalised approach with economic reasoning. If it develops, this approach is likely to become an ideal means of communication between medical and nursing staff and the world of management.
As we recalled at the beginning of this thesis, the designers of the DRGs (Fetter and Thompson) regretted the lack of interest shown by hospital managers in using their concept for hospital management. At the end of this thesis, we believe that, although the analysis of costs per pathology is still not easy to approach, it can provide important services by involving physicians and managers in the development of a management control system that is at last adapted to the specific nature of their institutions.
Doctorat en Sciences de la santé publique
Condomines, Jean-Philippe. "Développement d’un estimateur d’état non linéaire embarqué pour le pilotage-guidage robuste d’un micro-drone en milieu complexe." Thesis, Toulouse, ISAE, 2015. http://www.theses.fr/2015ESAE0002.
This thesis presents the study of an algorithmic solution to the state estimation problem for unmanned aerial vehicles, or UAVs. The necessary resort to multiple miniaturized low-cost, low-performance sensors integrated into mini-RPAS, which are obviously subject to strict space and electrical power consumption constraints, has led to strong interest in designing nonlinear observers for data fusion, estimation of unmeasured system states, and/or flight path reconstruction. Exploiting the capabilities of nonlinear observers makes it possible, by generating consolidated signals, to extend the way mini-RPAS can be controlled while enhancing their intrinsic flight handling qualities. That is why numerous recent research works related to RPAS certification and integration into civil airspace deal with the interest of highly robust estimation algorithms. The development of reliable and performant aided INS for many nonlinear dynamic systems is therefore an important research topic and a major concern in the aerospace engineering community. First, we propose a novel approach for nonlinear state estimation, named pi-IUKF (Invariant Unscented Kalman Filter), which is based on both invariant filter estimation and UKF theoretical principles. Several research works on nonlinear invariant observers have been carried out; they provide a geometrical, constructive method for designing filters dedicated to nonlinear state estimation problems while preserving the physical properties and symmetries of the system. The general invariant observer guarantees a straightforward form of the nonlinear estimation error dynamics, whose properties are remarkable. The developed pi-IUKF estimator suggests a systematic approach to determine all the symmetry-preserving correction terms associated with a nonlinear state-space representation used for prediction, without requiring any linearization of the differential equations. The exploitation of the UKF principles within the invariant framework has required the definition of a compatibility condition on the observation equations. As a first result, the estimated covariance matrices of the pi-IUKF converge to constant values due to the symmetry-preserving property provided by nonlinear invariant estimation theory. The designed pi-IUKF method has been successfully applied to relevant practical problems such as the estimation of attitude and heading for aerial vehicles using low-cost attitude and heading reference systems (i.e., inertial/magnetic sensors characterized by low performance). In a second part, the developed methodology is used for a mini-RPAS equipped with an aided Inertial Navigation System (INS), which leads to augmenting the nonlinear state-space representation with both velocity and position differential equations. All the measurements are provided on board by a set of low-cost, low-performance sensors (accelerometers, gyrometers, magnetometers, a barometer, and even a Global Positioning System (GPS)). Our pi-IUKF estimation algorithm is described and its performance is evaluated by successfully exploiting real flight test data. Indeed, the whole approach has been implemented onboard using a data logger based on the well-known Paparazzi system. The results show promising perspectives and demonstrate that nonlinear state estimation converges on a much bigger set of trajectories than for more traditional approaches.
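As a pointer to the UKF building block mentioned above, the sketch below generates the sigma points and weights of the standard scaled unscented transform. It follows the generic formulation only; the invariant, symmetry-preserving correction terms that define the pi-IUKF are not included.

```python
# Generic scaled unscented transform: sigma-point and weight generation, the UKF
# building block referred to in the abstract. This is not the pi-IUKF itself.
import numpy as np

def sigma_points(x_mean, P, alpha=1e-3, beta=2.0, kappa=0.0):
    """Return 2n+1 sigma points plus mean and covariance weights."""
    n = len(x_mean)
    lam = alpha**2 * (n + kappa) - n
    S = np.linalg.cholesky((n + lam) * P)   # matrix square root of the scaled covariance
    pts = [x_mean] \
        + [x_mean + S[:, i] for i in range(n)] \
        + [x_mean - S[:, i] for i in range(n)]
    w_mean = np.full(2 * n + 1, 1.0 / (2.0 * (n + lam)))
    w_mean[0] = lam / (n + lam)
    w_cov = w_mean.copy()
    w_cov[0] += 1.0 - alpha**2 + beta
    return np.array(pts), w_mean, w_cov

pts, wm, wc = sigma_points(np.zeros(3), np.eye(3))
print(pts.shape)  # (7, 3): 2n+1 sigma points for n = 3
```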
Piechucka, Joanna. "Essays in competition policy and public procurement." Thesis, Paris 1, 2018. http://www.theses.fr/2018PA01E013.
This PhD dissertation studies three research questions in public procurement and competition policy, presented in the respective chapters and preceded by a general introduction. The first chapter focuses on a microeconometric analysis of the strategic relationships between a firm awarded a public contract and the public authority responsible for regulating a public service. It exploits data on the French urban public transport industry to study the determinants of regulatory contract choices, which in turn impact the cost efficiency of transport operators. The second chapter provides an ex-post assessment of a merger between two major transport groups in France (Veolia Transport and Transdev), focusing on the possible existence of merger efficiency gains. Finally, the third chapter provides insight into the impact of mergers when firms compete in quality and reposition their services, by analysing the French hospital industry.
Etienne, Alain. "Intégration Produit / Process par les concepts d'activités et de caractéristiques clés - Application à l'optimisation de l'allocation des tolérances géométriques." Phd thesis, Université de Metz, 2007. http://tel.archives-ouvertes.fr/tel-00224938.
Martinod Restrepo, Ronald Mauricio. "Politiques d’exploitation et de maintenance intégrées pour l’optimisation économique, sociétale et environnementale des systèmes de transports urbains interconnectés." Electronic Thesis or Diss., Université de Lorraine, 2021. http://www.theses.fr/2021LORR0069.
Urban public transport systems influence the infrastructure of urban areas and the lives of their inhabitants while directly stimulating the economy. Intelligent urban public transport systems help to improve the quality of life and the environment in cities. However, the rapid development of urban transport solutions has led to a large number of operators entering the market, and these separate optimisations, carried out without any coordination between transport operators, prevent the identification of a global optimum. As a result, urban public transport systems operate inefficiently, and the environmental cost is not necessarily reduced. To address these challenges, this thesis proposes a methodology, together with mathematical models, that develops optimisation approaches for multimodal public transport networks, achieving the best service policy while minimising operating costs in order to satisfy the principle of sustainability frequently expressed in urban development goals.
Fréchette, Richard. "Création d'un outil d'évaluation des coûts des infrastructures municipales souterraines selon différents facteurs d'influences." Thèse, 2018. http://depot-e.uqtr.ca/id/eprint/9491/1/eprint9491.pdf.
Jobin, Guy. "Exploration de la capacité d'un réseau de neurones à imiter le jugement et l'expérience d'un estimateur chevronné pour l'attribution du taux de productivité d'une équipe d'excavation en infrastructures municipales." Mémoire, 2008. http://www.archipel.uqam.ca/1210/1/M10484.pdf.
Doan, Thi Lien Huong. "L'estimation du coût du capital dans les marchés émergents : une application au secteur alimentaire du Vietnam." Mémoire, 2006. http://www.archipel.uqam.ca/2994/1/M9422.pdf.