Dissertations on the topic "Modélisation du process"
Format your source in APA, MLA, Chicago, Harvard and other citation styles
Consult the top 50 dissertations for your research on the topic "Modélisation du process".
Next to every entry in the bibliography you will find an "Add to bibliography" button. Click it, and we will automatically generate the bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the scholarly publication in .pdf format and read its online abstract, whenever these are available in the metadata.
Browse dissertations from a wide range of disciplines and compile your bibliography correctly.
Ulmer, Jean-Stéphane. "Approche générique pour la modélisation et l'implémentation des processus." Thesis, Toulouse, INPT, 2011. http://www.theses.fr/2011INPT0002/document.
A company must be able to describe its processes and to remain responsive to endogenous or exogenous events. Such flexibility can be obtained with Business Process Management (BPM). In a BPM approach, different transformations operate on process models developed by the business analyst and the IT expert. During these manipulations, a misalignment arises between the heterogeneous models: this is the "business-IT gap" described in the literature. The objective of our work is to propose a methodological framework for better management of business processes, in order to reach a systematic alignment from their modelling to their implementation within the target system. Using concepts from model-driven enterprise and information-system engineering, we define a generic approach ensuring inter-model consistency. Its role is to maintain and provide all information related to model structure and semantics. By allowing a full restitution of a transformed model, in the sense of reverse engineering, our platform enables synchronization between the analysis model and the implementation model. The manuscript also presents a possible match between process engineering and BPM through a multi-perspective scale.
Dopler, Thomas. "Low pressure infiltration process modeling." Châtenay-Malabry, Ecole centrale de Paris, 1999. http://www.theses.fr/1999ECAP0673.
Lu, Xiaofei. "Modélisation du carnet d’ordres, Applications Market Making." Thesis, Université Paris-Saclay (ComUE), 2018. http://www.theses.fr/2018SACLC069/document.
This thesis addresses different aspects of market microstructure modelling and market making, with a special accent on the practitioner's viewpoint. The limit order book (LOB), at the heart of financial markets, is a complex, continuous, high-dimensional queueing system. We wish to improve the research community's knowledge of the LOB, propose new modelling ideas and develop concrete applications of interest to market makers. We would like to specifically thank the Automated Market Making team for providing a large high-frequency database of very high quality as well as a powerful computational grid, without which this research would not have been possible. The first chapter introduces the motivation of this research and summarizes the main results of the different works. Chapter 2 focuses entirely on the LOB and aims to propose a new model that better reproduces some stylized facts. Through this research, not only do we confirm the influence of historical order flows on the arrival of new ones, but we also provide a new model that captures the LOB dynamics much better, notably the realized volatility at high and low frequency. In chapter 3, the objective is to study market making strategies in a more realistic context. This research contributes in two ways: on the one hand, the newly proposed model is more realistic yet still simple enough to be applied to strategy design; on the other hand, the practical market making strategy improves substantially on the naive one and is promising for practical use. High-frequency prediction with deep learning methods is studied in chapter 4. The results of 1-step and multi-step prediction reveal the non-linearity, stationarity and universality of the relationship between microstructural indicators and price changes, as well as the limitations of this approach in practice.
Pujol, Arnaud. "Modélisation du procédé de compostage - Impact du phénomène de séchage." Thesis, Toulouse, INPT, 2012. http://www.theses.fr/2012INPT0015/document.
Composting may look like a simple process. However, it requires substantial expertise, as the biological response is governed by the control parameters (temperature, oxygen, moisture content) and involves many coupled phenomena. Given the complexity of the mechanisms studied, a composting model is relevant for understanding the mechanisms involved, identifying the effects of coupling between them, highlighting key factors and comparing scenarios, in order to optimize the industrial process. The state of the art shows that, despite the large number of composting models in the literature, none can predict, with a formulation in time and space, temperature, gas concentrations (oxygen, carbon dioxide, nitrogen, ...), moisture content, transfers between phases and degradation of the substrate, while taking into account changes in aeration. The development of a new model was therefore necessary to predict the evolution of these variables and study their coupling in the process. The technique of volume averaging applied to the pore-scale equations leads to a composting model at the Darcy scale. This model takes into account a gas phase, a liquid phase and a solid phase. The gas phase includes four species: oxygen, carbon dioxide, nitrogen and water vapor. In the liquid phase, only water is considered. Drying is integrated into the model as an exchange term between the gas and liquid phases. Finally, the biological model, included in the composting model, accounts for the degradation of the substrate, which is divided into three fractions: readily hydrolysable, slowly hydrolysable and inert. The first two fractions are hydrolysed into a readily assimilable soluble fraction, which is directly consumed by the bacteria.
In a composting process, the degradation of organic matter is associated with oxygen consumption and the production of carbon dioxide, water and heat. Local thermal and chemical equilibrium was assumed in this work. For water, however, the two approaches (Local Equilibrium (LE) and Local Non-Equilibrium (LNE)) were tested numerically. The results showed that when the water mass exchange coefficient between the gas and liquid phases ranges from 1 to 4 s-1, the LE and LNE approaches are equivalent, with less computing time in the LNE case. For all subsequent simulations, it was therefore decided to adopt an LNE approach with a coefficient value of 2.5 s-1. Tests were then carried out to show the consistency of the model. Given the large number of parameters, a sensitivity analysis was performed to determine the parameters with the greatest impact on the process. This analysis showed that one must be cautious about the values used for the heat capacity, a coefficient of the sorption isotherm, several parameters of the biological model (ksH, krH, μmax, Xa,0, Tmax, Topt, Xi,0, Xrb,0) and the porosity, because these are the parameters that most affect the process. Finally, the results provided by the model were compared with experimental results obtained at 1/1000 pilot scale under identical operating conditions. The composting experiments were carried out by Veolia Environment Research and Innovation with a mixture of household biowaste and green waste. The results on the 1/1000-scale pilot showed that the model captures well the average evolution of temperature and concentrations during the process. The temperature at the central point in particular is very well reproduced by the model, as is the assessment of organic matter degradation. Simulations at industrial scale (1/1) have also been carried out and have given promising results.
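As a rough, self-contained illustration of the biological submodel described above (two hydrolysable substrate fractions feeding a soluble pool consumed by biomass), the sketch below integrates such a scheme with a forward Euler step. All parameter values, variable names and the Monod uptake form are assumptions for illustration only; they are not the thesis model, which additionally couples heat, moisture and gas transport.

```python
import numpy as np

# Assumed parameters (illustrative, not from the thesis):
k_rh, k_sh = 0.5, 0.05          # first-order hydrolysis rates (1/day)
mu_max, K_s, Y = 2.0, 0.5, 0.4  # Monod growth rate, half-saturation, yield

def simulate(days=30.0, dt=0.005):
    # Xrb: readily hydrolysable, Xsb: slowly hydrolysable,
    # S: soluble assimilable fraction, Xa: active biomass (arbitrary units)
    Xrb, Xsb, S, Xa = 10.0, 20.0, 0.0, 0.1
    for _ in range(int(days / dt)):
        hyd = k_rh * Xrb + k_sh * Xsb          # hydrolysis feeds soluble pool
        growth = mu_max * S / (K_s + S) * Xa   # Monod uptake by biomass
        Xrb -= k_rh * Xrb * dt
        Xsb -= k_sh * Xsb * dt
        S += (hyd - growth / Y) * dt           # 1/Y of growth is consumed
        Xa += growth * dt
        S = max(S, 0.0)                        # guard against Euler overshoot
    return Xrb, Xsb, S, Xa

Xrb, Xsb, S, Xa = simulate()
```

The readily hydrolysable fraction is exhausted first, the slowly hydrolysable one persists, and the biomass grows at the expense of the soluble pool, mirroring the qualitative behaviour the abstract describes.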
Kasdali, Sihem. "Modélisation complexe de l’impact des dispositifs de formation à distance." Thesis, Cergy-Pontoise, 2014. http://www.theses.fr/2014CERG0722/document.
This research aims at analysing the impact of the changes brought about by techno-pedagogical systems of distance education on the behaviour of learners, trainers or future trainers. To do so, we adopt a systemic approach based on the modelling of complex systems. The implementation of our model aims at highlighting the interrelations that can exist between the individual and his system, the structures that may arise, their evolution and their entanglement at different stages. The dynamics of change is thus appreciated in its environment, bringing out the active variables in order to build its intelligibility. Our intention is therefore not to treat the training programme as a single variable explaining the dynamics of change, but to try to understand how, in each programme, a particular set of variables leads to results while other variables do not. Our proposal consists of identifying possibly relevant variables, the "process variables", which describe, over time, the spaces and processes implemented in this dynamics.
Constant, Camille. "Modélisation stochastique et analyse statistique de la pulsatilité en neuroendocrinologie." Thesis, Poitiers, 2019. http://www.theses.fr/2019POIT2330.
The aim of this thesis is to propose several models representing neuronal calcium activity and to understand its role in the secretion of the hormone GnRH. This work relies on experiments carried out at the INRA Centre Val de Loire. Chapter 1 proposes a continuous model, in which we examine a Markov process of shot-noise type. Chapter 2 studies a discrete AR(1)-type model, based on a discretization of the model of Chapter 1, and proposes a first estimation of the parameters. Chapter 3 proposes another discrete AR(1)-type model, in which the innovations are the sum of a Bernoulli variable and a Gaussian variable representing noise, and which takes into account a linear drift. Estimators of the parameters are given in order to detect spikes in neuronal paths. Chapter 4 studies a biological experiment involving 33 neurons. With the model of Chapter 3, we detect synchronization instants (simultaneous spiking of a high proportion of the neurons in the experiment) and then, using simulations, we test the quality of the method used and compare it to an experimental approach.
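To make the shape of the Chapter 3 model concrete, here is a minimal sketch of an AR(1) process whose innovations are a Bernoulli jump plus Gaussian noise and a drift, together with a naive threshold detector on the AR(1) increments. All parameter values, the threshold rule and the assumption that the AR coefficient is known are illustrative; this is not the estimation procedure developed in the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed parameters (illustrative only):
rho, jump, sigma, drift = 0.8, 5.0, 0.3, 0.01
p_spike = 0.05  # spike probability per time step

def simulate(n=2000):
    x = np.zeros(n)
    spikes = rng.random(n) < p_spike
    noise = rng.normal(0.0, sigma, n)
    for t in range(1, n):
        # innovation = Bernoulli jump + Gaussian noise + linear drift term
        x[t] = rho * x[t - 1] + jump * spikes[t] + noise[t] + drift
    return x, spikes

def detect(x, threshold=2.0):
    # flag a spike when the AR(1) increment x_t - rho * x_{t-1}
    # (i.e. the innovation, rho assumed known) exceeds the threshold
    incr = x[1:] - rho * x[:-1]
    return np.concatenate(([False], incr > threshold))

x, true_spikes = simulate()
found = detect(x)
```

With a jump much larger than the noise scale, thresholding the innovations recovers essentially all spikes; real data require estimating rho, sigma and the drift first, which is precisely the chapter's contribution.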
Montellano, Garcia Ramón Francisco. "Un système d'aide à la modélisation et à la conduite de bioprocédés." Grenoble INPG, 1989. http://www.theses.fr/1989INPG0081.
Lambolais, Thomas. "Modélisation du développement de spécifications LOTOS." Vandoeuvre-les-Nancy, INPL, 1997. http://www.theses.fr/1997INPL106N.
Suri, Kunal. "Modeling the internet of things in configurable process models." Electronic Thesis or Diss., Université Paris-Saclay (ComUE), 2019. http://www.theses.fr/2019SACLL005.
On the one hand, a growing number of multinational organizations have embraced Process-Aware Information Systems (PAIS) to reap the benefits of streamlined processes based on predefined models, also called Business Process (BP) models. However, today's dynamic business environment demands flexibility and systematic reuse of BPs, which is provided by Configurable Process Models (CPMs). They avoid developing processes from scratch, which is both time-consuming and error-prone, and facilitate the sharing of a family of BP variants that can be customized to concrete business requirements. On the other hand, the adoption of Internet of Things (IoT) resources in various cross-organizational BPs is also on the rise. However, to attain the desired business value, these IoT resources must be used efficiently. IoT devices are heterogeneous due to their diverse properties and manufacturers (proprietary standards), which leads to interoperability issues. Further, being resource-constrained, they need to be allocated (and consumed) keeping in mind relevant constraints, such as energy and computation costs, to avoid failures during their consumption in the processes. It is thus essential to model the IoT resource perspective explicitly in BP models during the process design phase. In the literature, research in the Business Process Management (BPM) domain usually focuses on the control-flow perspective. While some approaches do focus on the resource perspective, they are typically dedicated to the human resource perspective. There is thus limited work on integrating the IoT resource perspective into BPs, and none of it addresses the heterogeneity issues of the IoT domain. Likewise, in the context of CPMs, there is no configuration support for modelling IoT resource variability at the CPM level.
This variability results from specific IoT resource features, such as shareability and replication, that are relevant in the context of BPs. In this thesis, we address the aforementioned limitations by proposing an approach that integrates the IoT perspective into the BPM domain and supports the development of IoT-aware CPMs. This work contributes in the following manner: (1) it provides a formal description of the IoT resource perspective and its relationships with the BPM domain using semantic technology, and (2) it provides novel concepts to enable configurable IoT resource allocation in CPMs. To validate our approach and show its feasibility, we (1) implement proof-of-concept tools that assist in the development of IoT-aware BPs and IoT-aware CPMs and (2) perform experiments on process model datasets. The experimental results show the effectiveness of our approach and affirm its feasibility.
Le, Pivert Patrick. "Contribution à la modélisation et à la simulation réaliste des processus d'usinage." Châtenay-Malabry, Ecole centrale de Paris, 1998. http://www.theses.fr/1998ECAP0553.
Dauzat, Marc. "Évolution thermique des alumines de transition. Modélisation." Phd thesis, Ecole Nationale Supérieure des Mines de Saint-Etienne, 1989. http://tel.archives-ouvertes.fr/tel-00845199.
Al, Haddad Mazen. "Contribution théorique et modélisation des phénomènes instantanés dans les opérations d'autovaporisation et de déshydratation." Phd thesis, La Rochelle, 2007. http://www.theses.fr/2007LAROS216.
Instantaneous autovaporisation, as a fundamental process, is triggered by an abrupt drop of the thermodynamic conditions of the surrounding environment below the saturation conditions of the liquid. During autovaporisation, no heat transfer between the material and the surrounding environment takes place because of the shortness of the process; the quantity of heat necessary for the evaporation of water is therefore drawn only from the material itself, whose temperature drops significantly. Numerous operations and experiments, such as BLEVE (Boiling Liquid Expanding Vapor Explosion), LOCA (Loss Of Coolant Accident), WFEC (Water Flash Evaporation Cooling), CSC (Cryogen Spray Cooling) and DIC (Instantaneous Controlled Pressure Drop), involve autovaporisation, yet no general study has been established. Moreover, the departure from quasi-static equilibrium states, in terms of the predicted temperature and quantity (or flux) of evaporated water, calls the second law of thermodynamics into question at the intermediate stage; the second law applies again as soon as equilibrium is reached. Despite the numerous applications, no classical thermodynamic study could explain the observed phenomena. Only the theoretical analysis specific to instantaneous phenomena proposed by Allaf in 2002 serves as the basis for explaining and studying this type of autovaporisation. Through this study, we explain and demonstrate the relevance of this theoretical analysis in the case of several instantaneous phenomena.
Hodara, Pierre. "Systèmes de neurones en interactions : modélisation probabiliste et estimation." Thesis, Cergy-Pontoise, 2016. http://www.theses.fr/2016CERG0854/document.
We work on interacting particle systems. Two types of processes are studied. The first model uses Hawkes processes, for which we state existence and uniqueness of a stationary version. We also propose a graphical construction of the stationary measure by means of a Kalikow-type decomposition, together with a perfect simulation algorithm. The second model deals with piecewise deterministic Markov processes (PDMPs). We establish ergodicity and propose a kernel estimator for the jump rate function with an optimal rate of convergence in L².
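As a toy illustration of the Hawkes processes mentioned above, the sketch below simulates a univariate Hawkes process with an exponential kernel using Ogata's thinning algorithm. This is the standard textbook construction, not the graphical Kalikow-type construction of the thesis, and the parameter values are assumed for illustration.

```python
import math
import random

random.seed(1)

# Assumed parameters: baseline rate, excitation jump, decay (alpha/beta < 1
# ensures a stationary regime exists)
mu, alpha, beta = 0.5, 0.8, 2.0

def intensity(t, events):
    """Conditional intensity: baseline plus exponentially decaying excitation."""
    return mu + sum(alpha * math.exp(-beta * (t - s)) for s in events if s < t)

def simulate(horizon=100.0):
    events, t = [], 0.0
    while t < horizon:
        # The intensity decays between events, so its value just after t
        # (+ alpha covers an event accepted exactly at t) is an upper bound.
        lam_bar = intensity(t, events) + alpha
        t += random.expovariate(lam_bar)          # candidate point
        if t < horizon and random.random() * lam_bar <= intensity(t, events):
            events.append(t)                      # accepted by thinning
    return events

events = simulate()
```

The expected number of points over the horizon is mu * T / (1 - alpha/beta), about 83 here; each accepted point temporarily raises the intensity, producing the clustering typical of Hawkes processes.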
Le, Dang Huy. "Modélisation simplifiée des processus de laminage." Phd thesis, Université Paris-Est, 2013. http://pastel.archives-ouvertes.fr/pastel-00966940.
Azab, Marc. "Caractérisation par corrélation d'images et modélisation par zones cohésives du comportement mécanique des interfaces." Thesis, Université Grenoble Alpes (ComUE), 2016. http://www.theses.fr/2016GREAI102.
This work concerns the study of the integrity of materials and assembled structures, using cohesive zone models (CZM) to analyze fracture. These models have the advantage of incorporating a characteristic length in the description of fracture initiation and propagation, which allows size-effect studies. Three parameters characterize a CZM: the maximal cohesive traction Tmax that the interface or material can sustain before the onset of debonding, the critical crack opening at which a crack is created locally, and the traction-separation law, which describes the cohesive traction distribution during the fracture process. The main purpose of this thesis is to identify the cohesive zone parameters describing fracture at an interface or in a material. The first step was to elaborate an analytical model that properly describes the DCB or wedge test kinematics, in order to characterize mode I fracture. Even in mode I, the displacement field near the crack tip is not always K-dominant for these tests. Various traction-separation laws were considered in order to study their influence on the local and global response of the test. An inverse identification methodology based on the analytical model was proposed, which extracts the cohesive parameters through a least-squares minimization of the error between numerical and analytical deflections. Once validated, it was applied to a real wedge test, with the experimental displacement field obtained by digital image correlation. A detailed analysis was also devoted to evaluating the length of the fracture process zone, Lcz, in the case of wedge and DCB tests. This analysis shows that Lcz is not an intrinsic property of the interface or material: it varies with the sample geometry, the cohesive zone properties and the traction-separation law.
A new expression for Lcz was established for rectangular and triangular cohesive zones. A second, local identification approach, based on the work of Réhoré and Estevez (2013), was also proposed. It was implemented to analyze the wedge test before being applied to a notched four-point bending test. A round trip between numerical simulations and experimental results allows identification of the cohesive properties in the material or at the interface.
Le, Dain Guillaume. "Modélisation de la gravure profonde du silicium en plasmas fluorés : étude du procédé BOSCH : simulations et calibration expérimentale." Thesis, Nantes, 2018. http://www.theses.fr/2018NANT4048/document.
Within a collaboration between the IMN of Nantes and STMicroelectronics Tours, the aim of this study is the development of a silicon-etching simulator for the Bosch process. Used nowadays for microelectronics devices such as 3D capacitors or vias, the Bosch process is a cyclic plasma etching process. Two plasmas are needed: an SF₆ plasma, which etches silicon mainly through chemical processes, and a C₄F₈ plasma, which deposits fluorocarbon species as a "Teflon-like" polymer to passivate the sidewalls of the trenches and protect them from chemical etching. This polymer is removed by ion bombardment. By repeating a large number of SF₆/C₄F₈ pulses, the process creates features with a high aspect ratio (a high depth for a low aperture). To develop an intimate knowledge of the physical and chemical interactions involved in the Bosch process, we develop a simulation tool based on a multiscale approach. This software tracks the etch profile evolution as a function of the operating conditions (pressure, power, flow rate, reactor diameter and height). A kinetic model provides space-averaged values of the plasma parameters at steady state. A sheath model determines the ion energy and angular distribution functions. A surface model uses these data to follow the temporal evolution of a representative feature on the substrate surface exposed to the Bosch process. To validate the model, we carried out experiments at the IMN, dedicated to plasma-phase measurements, and at STMicroelectronics Tours, dedicated to the influence of the operating conditions on the etch profile evolution.
Sinfort, Carole. "Couplage entre recherche expérimentale et modélisation pour l'optimisation des procédés de pulvérisation agricole." Habilitation à diriger des recherches, Université Montpellier II - Sciences et Techniques du Languedoc, 2006. http://tel.archives-ouvertes.fr/tel-00105765.
The penetration of the spray into the vine canopy was then studied. A model was developed with commercial CFD tools to represent the air flow in the vegetation and to compute representative droplet trajectories. The model was parametrized from air-velocity measurements on either side of the canopy. The behaviour of the droplet clouds and the proportion of product retained by the foliage were treated in more detail: the model relies on an efficiency coefficient obtained from other simulations, which were designed so as to allow experimental validation. Deposit measurements carried out under real conditions then made it possible to discuss the results of the overall model as well as the limits of the approach.
Finally, atmospheric contamination during spray applications was the subject of an experimental programme aimed at bringing out the relationships between meteorological variables, machine parameters and pesticide emissions. Analysis with fuzzy inference systems led to the proposal of an expert tool. A dedicated model was also developed to simulate the emitted quantities and their atmospheric dispersion.
Casagranda, Stefano. "Modélisation, analyse et réduction des systèmes biologiques." Thesis, Université Côte d'Azur (ComUE), 2017. http://www.theses.fr/2017AZUR4049/document.
This thesis deals with the modelling, analysis and reduction of various biological models, with a focus on gene regulatory networks in the bacterium E. coli. Different mathematical approaches are used. In the first part of the thesis, we model, analyze and reduce, using classical tools, a high-dimensional transcription-translation model of RNA polymerase in E. coli. In the second part, we introduce a novel method called Principal Process Analysis (PPA) that allows the analysis of high-dimensional models by decomposing them into biologically meaningful processes, whose activity or inactivity is evaluated during the time evolution of the system. Excluding processes that are always inactive, or inactive in one or several time windows, reduces the complex dynamics of the model to its core mechanisms. The method is applied to models of the circadian clock, endocrine toxicology and a signaling pathway; its robustness with respect to variations of the initial conditions and parameter values is also tested. In the third part, we present an ODE model of the gene expression machinery of E. coli cells whose growth is controlled by an external inducer acting on the synthesis of RNA polymerase. We describe our contribution to the design of the model and analyze with PPA the core mechanisms of the regulatory network. In the last part, we specifically model the response of RNA polymerase to the addition of the external inducer and estimate model parameters from single-cell data. We discuss the importance of considering cell-to-cell variability when modelling this process: we show that the mean of single-cell fits represents the observed average data better than an average-cell fit.
Charpentier, Frédéric. "Maîtrise du processus de modélisation géométrique et physique en conception mécanique." Phd thesis, Université Sciences et Technologies - Bordeaux I, 2014. http://tel.archives-ouvertes.fr/tel-01016665.
Chamroukhi, Faicel. "Hidden process regression for curve modeling, classification and tracking." Compiègne, 2010. http://www.theses.fr/2010COMP1911.
This research addresses the problem of diagnosis and monitoring for predictive maintenance of railway infrastructure. In particular, the switch mechanism is a vital organ: its operating state directly impacts the overall safety of the railway system, its proper functioning is required for full availability of the transportation system, and monitoring it is a key task for maintenance teams. To monitor and diagnose the switch mechanism, the main available data are curves of electric power acquired during switch operations. This study therefore focuses on modeling curve-valued, or functional, data presenting regime changes. In this thesis we propose new probabilistic generative machine learning methodologies for curve modeling, classification, clustering and tracking. First, the models we propose for a single curve or for independent sets of curves are based on specific regression models incorporating a flexible hidden process. They are able to capture non-stationary (dynamic) behavior within the curves and address both the problem of missing information regarding the underlying regimes and the problem of complex-shaped classes. We then propose dynamic models for learning from curve sequences to make decisions and predictions over time. The developed approaches rely on autoregressive dynamic models governed by hidden processes. The models are learned both in batch mode (in which the curves are stored in advance) and in online mode (in which the curves are analyzed one at a time as learning proceeds). The results obtained on both simulated curves and real-world switch operation curves demonstrate the practical use of the ideas introduced in this thesis.
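To give a flavour of curve modeling with regime changes, the sketch below segments a noisy curve into K piecewise-constant regimes by exact least-squares dynamic programming. This illustrates only the segmentation idea in its simplest form; the thesis instead develops probabilistic generative models with a hidden process, so everything here, including the synthetic data and parameter values, is an illustrative assumption.

```python
import numpy as np

def segment(y, k):
    """Exact least-squares split of y into k contiguous segments of means."""
    n = len(y)
    p1 = np.concatenate(([0.0], np.cumsum(y)))       # prefix sums
    p2 = np.concatenate(([0.0], np.cumsum(y * y)))   # prefix sums of squares

    def sse(i, j):  # cost of fitting y[i:j] by its mean
        s, m = p1[j] - p1[i], j - i
        return p2[j] - p2[i] - s * s / m

    best = np.full((k + 1, n + 1), np.inf)  # best[r, j]: r segments on y[:j]
    arg = np.zeros((k + 1, n + 1), dtype=int)
    best[0, 0] = 0.0
    for r in range(1, k + 1):
        for j in range(r, n + 1):
            for i in range(r - 1, j):
                c = best[r - 1, i] + sse(i, j)
                if c < best[r, j]:
                    best[r, j], arg[r, j] = c, i
    cuts, j = [], n  # backtrack the optimal segment boundaries
    for r in range(k, 0, -1):
        cuts.append(j)
        j = arg[r, j]
    return sorted(cuts)[:-1]  # the k-1 interior change points

# Synthetic curve with two regime changes at positions 30 and 60
rng = np.random.default_rng(0)
y = np.concatenate([np.full(30, 0.0), np.full(30, 5.0), np.full(40, 10.0)])
y += rng.normal(0.0, 0.3, y.size)
cuts = segment(y, 3)
```

With jumps well above the noise level, the recovered change points land at (or very near) the true regime boundaries.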
Ayad, Sarah. "Business Process Models Quality : evaluation and improvement." Thesis, Paris, CNAM, 2013. http://www.theses.fr/2013CNAM0922/document.
In recent years the problems related to modeling and improving business processes have attracted growing interest. Indeed, companies are realizing the undeniable impact that a better understanding and management of business processes (BP) has on the effectiveness, consistency and transparency of their operations. BP modeling aims at a better understanding of processes, allowing decision-makers to achieve the strategic goals of the company. However, inexperienced systems analysts often lack domain knowledge, and this affects the quality of the models they produce. Our work targets the problem of business process modeling quality by proposing an approach encompassing methods and tools for the measurement and improvement of the quality of BP models. We propose to support this modeling effort with an approach that uses domain knowledge to improve the semantic quality of BP models. The main contribution of this thesis is fourfold: 1. Exploiting IS domain knowledge: a business process metamodel is identified, and semantics are added to it by means of OCL constraints. 2. Exploiting application domain knowledge, relying on domain ontologies; the alignment between the concepts of both metamodels is defined and illustrated. 3. Designing a guided quality process encompassing methods and techniques to evaluate and improve business process models; our process proposes a number of quality constraints and metrics to evaluate the quality of the models, and finally proposes relevant recommendations for improvement. 4. Developing a software prototype, "BPM-Quality".
Our prototype implements all the above-mentioned artifacts and proposes a workflow enabling its users to evaluate and improve models efficiently and effectively. We conducted a survey to validate the selection of the quality constraints in a first experiment, and a second experiment to evaluate the efficacy and efficiency of our overall approach and the proposed improvements.
Rey, Clément. "Étude et modélisation des équations différentielles stochastiques." Thesis, Paris Est, 2015. http://www.theses.fr/2015PESC1177/document.
The development of technology and computer science in recent decades has led to the emergence of numerical methods for the approximation of Stochastic Differential Equations (SDEs) and for the estimation of their parameters. This thesis treats both of these aspects and, in particular, studies the effectiveness of those methods. The first part is devoted to the approximation of SDEs by numerical schemes, while the second part deals with the estimation of the parameters of the Wishart process. First, we focus on approximation schemes for SDEs defined on a time grid with $n$ steps. We say that the scheme $X^n$ converges weakly to the diffusion $X$ with order $h \in \mathbb{N}$ if, for every $T > 0$, $\vert \mathbb{E}[f(X_T) - f(X_T^n)] \vert \leqslant C_f / n^h$. Until now, except in some particular cases (the Euler and Ninomiya Victoir schemes), research on this topic has required that $C_f$ depend on the supremum norm of $f$ as well as of its derivatives, that is, $C_f = C \sum_{\vert \alpha \vert \leqslant q} \Vert \partial_{\alpha} f \Vert_{\infty}$. Our goal is to show that if the scheme converges weakly with order $h$ for such a $C_f$, then, under non-degeneracy and regularity assumptions, the same result holds with $C_f = C \Vert f \Vert_{\infty}$. We are thus able to estimate $\mathbb{E}[f(X_T)]$ for any bounded measurable function $f$, and we say that the scheme converges in total variation distance with rate $h$. We also prove that the density of $X^n_T$ and its derivatives converge towards those of $X_T$. The proof of these results relies on a variant of the Malliavin calculus based on the noise of the random variables involved in the scheme. The great benefit of our approach is that it is not restricted to one particular scheme and can be applied to many of them; for instance, the result applies to both the Euler ($h = 1$) and Ninomiya Victoir ($h = 2$) schemes.
Furthermore, the random variables used in this family of schemes are not required to follow one particular distribution but only to belong to a set of laws, which allows our result to be seen as an invariance principle as well. Finally, we illustrate this result on a third-order weak scheme for one-dimensional SDEs. The second part of this thesis deals with SDE parameter estimation. More particularly, we study the Maximum Likelihood Estimator (MLE) of the parameters appearing in the Wishart matrix process. This process is the multi-dimensional version of the Cox Ingersoll Ross (CIR) process, and its specificity lies in the square-root term appearing in the diffusion coefficient. Using those processes, it is possible to generalize the Heston model to the case of a local covariance. This thesis provides the computation of the MLE of the parameters of the Wishart process, together with the speed of convergence and the limit laws in the ergodic case and in some non-ergodic cases. To obtain those results, we use various methods, namely ergodic theorems, time-change methods, and the study of the joint Laplace transform of the Wishart process together with its average process. Moreover, in this latter study, we extend the domain of definition of this joint Laplace transform.
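The weak-order notion above can be illustrated numerically. The sketch below is our own illustration, not taken from the thesis: for the Ornstein-Uhlenbeck SDE dX = -X dt + dW, the Euler iterates with Gaussian increments remain Gaussian, so E[f(X_T^n)] is available in closed form for f(x) = cos(x), and the error against the exact value halves when n doubles, consistent with weak order h = 1.

```python
import math

# Weak error of the Euler scheme for the OU SDE dX = -X dt + dW, X_0 = x0.
# With Gaussian increments the Euler iterates stay Gaussian, so
# E[f(X_T^n)] is computable in closed form for f(x) = cos(x):
# E[cos(N(m, s2))] = cos(m) * exp(-s2 / 2).

def gaussian_cos_mean(m, s2):
    return math.cos(m) * math.exp(-s2 / 2.0)

def euler_weak_value(x0, T, n):
    h = T / n
    a = 1.0 - h                                  # one-step contraction factor
    m = x0 * a**n                                # mean of X_T^n
    s2 = h * (1.0 - a**(2 * n)) / (1.0 - a**2)   # variance of X_T^n
    return gaussian_cos_mean(m, s2)

def exact_value(x0, T):
    m = x0 * math.exp(-T)
    s2 = 0.5 * (1.0 - math.exp(-2.0 * T))
    return gaussian_cos_mean(m, s2)

x0, T = 1.0, 1.0
ref = exact_value(x0, T)
errors = [abs(euler_weak_value(x0, T, n) - ref) for n in (50, 100, 200)]
rates = [errors[i] / errors[i + 1] for i in range(2)]
print(errors)   # errors shrink roughly like C / n
print(rates)    # ratios close to 2, i.e. weak order h = 1
```

Doubling n divides the error by about two, which is exactly the total-variation-style bound $C_f / n^h$ with $h = 1$ discussed in the abstract.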
Etchegaray, Christèle. "Modélisation mathématique et numérique de la migration cellulaire." Thesis, Université Paris-Saclay (ComUE), 2016. http://www.theses.fr/2016SACLS428/document.
Collective or individual cell displacements are essential in fundamental physiological processes (immune response, embryogenesis) as well as in pathological developments (tumor metastasis). The intracellular processes responsible for cell motion exhibit a complex self-organized activity spanning different time and space scales, so highlighting general principles of migration is a challenging task. In a first part, we build stochastic particle models of migration. To do so, we describe key intracellular processes as discrete in space by using stochastic population models. Then, by a renormalization in large population, infinitesimal size and accelerated dynamics, we obtain continuous stochastic equations for the dynamics of interest, relating the intracellular dynamics to the macroscopic displacement. First, we study the case of a leukocyte carried by the blood flow that develops adhesive bonds with the artery wall until an eventual stop. The binding dynamics is described by a stochastic birth-and-death with immigration process, and the bonds correspond to forces resisting the motion. We obtain an explicit expression for the mean stopping time of the cell. Then, we study cell crawling, which occurs through the formation of protrusions at the cell edge that grip the substrate and exert traction forces. We describe this dynamics by a structured population process, where the structure comes from the protrusions' orientations. The limiting continuous model can be studied analytically in the 1D migration case and gives rise to a Fokker-Planck equation on the probability distribution of the protrusion density. For a stationary profile, we show a dichotomy between a non-motile state and a directional displacement state. In a second part, we build a deterministic minimal migration model on a discoidal cell domain.
We base our work on the idea that the structures responsible for migration also reinforce cell polarisation, which in turn favors directional displacement. This positive feedback loop involves the convection of a molecular marker whose inhomogeneous spatial distribution is characteristic of a polarised state. The model takes the form of a convection-diffusion problem for the marker's concentration, where the advection field is the velocity field of a Darcy fluid describing the cytoskeleton. Its active character is carried by boundary terms, which constitutes the originality of the model. From the analytical point of view, the 1D model exhibits a dichotomy depending on a critical mass for the marker. In the subcritical and critical cases, it is possible to show global existence of weak solutions, as well as convergence of the solution, at an explicit rate, towards the unique stationary profile, corresponding to a non-motile state. Above the critical mass, for intermediate values, we show the existence of two additional stationary solutions corresponding to polarised motile profiles; moreover, for sufficiently asymmetric initial profiles, we show finite-time blowup. Studying a more complex model involving activation of the marker at the cell membrane makes it possible to remove this singularity. From the numerical point of view, experiments are carried out in 2D with finite-volume (Matlab) and finite-element (FreeFem++) discretizations. They exhibit both motile and non-motile profiles. The effect of stochastic fluctuations in time and space is studied, leading to numerical simulations of responses to an external signal, either chemical (chemotaxis) or mechanical (obstacles).
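The adhesion-bond dynamics of the first part can be illustrated with a simplified simulation. The sketch below is a generic immigration-death (Gillespie) simulation with made-up rates, not the thesis's renormalized model: new bonds form at a constant rate and each existing bond breaks independently, so the stationary bond count is Poisson with mean lam / mu.

```python
import random

# Gillespie simulation of an immigration-death process for the number of
# adhesive bonds: new bonds form at constant rate lam (immigration) and
# each existing bond breaks at rate mu. Simplified sketch only; at
# stationarity the bond count is Poisson with mean lam / mu.

def simulate_bonds(lam, mu, t_end, seed=0):
    rng = random.Random(seed)
    t, n = 0.0, 0
    while True:
        total = lam + mu * n          # total jump rate
        t += rng.expovariate(total)   # time to the next event
        if t >= t_end:
            return n
        if rng.random() < lam / total:
            n += 1                    # a new bond forms
        else:
            n -= 1                    # an existing bond breaks

lam, mu = 5.0, 1.0
samples = [simulate_bonds(lam, mu, t_end=20.0, seed=s) for s in range(2000)]
mean_bonds = sum(samples) / len(samples)
print(mean_bonds)   # close to lam / mu = 5
```

The renormalization step mentioned in the abstract corresponds to letting such discrete jump dynamics run with many, fast, small events, yielding a continuous stochastic equation in the limit.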
Gahlam, Nadia. "L'entrepreneuriat durable : essai de modélisation d'un processus innovant." Thesis, Reims, 2019. http://www.theses.fr/2019REIME001/document.
The notion of sustainable development is today a central concern of the population and public authorities. Sustainable entrepreneurship is a form of response to this concern through the integration of sustainable development standards into the core business of the company: this type of enterprise pursues economic, social and environmental objectives. Research on sustainable entrepreneurship has been particularly interested in the profile of the sustainable entrepreneur, who is considered an agent of disruption through the introduction of eco-innovations; however, it has paid too little attention to how the process itself operates. Innovation thus appears as a solution to social and environmental issues, but this alone is not enough to establish this entrepreneurial phenomenon as an innovative form. This thesis attempts to fill these gaps by modeling the sustainable entrepreneurial process. In addition, borrowing a theory of innovation, C-K theory, makes it possible to bring the sustainable entrepreneurial process closer to the C-K innovative design process in order to determine the innovative nature of sustainable entrepreneurship.
Yvinec, Romain. "Modélisation probabiliste en biologie moléculaire et cellulaire." Phd thesis, Université Claude Bernard - Lyon I, 2012. http://tel.archives-ouvertes.fr/tel-00749633.
Boust, Bastien. "Étude expérimentale et modélisation des pertes thermiques pariétales lors de l'interaction flamme–paroi instationnaire." Phd thesis, Université de Poitiers, 2006. http://tel.archives-ouvertes.fr/tel-00116773.
Viprey, Fabien. "Modélisation et caractérisation des défauts de structure de machine-outil 5 axes pour la mesure in-process." Thesis, Université Paris-Saclay (ComUE), 2016. http://www.theses.fr/2016SACLN071/document.
In-process metrology consists in obtaining measurement data directly within the manufacturing process. It responds to a growing need of manufacturers to carry out on-line measurements during, or between, manufacturing tasks by using the means of production to measure the machined part. Monitoring the sources of error, such as geometric errors, is one of the prerequisites for traceable dimensional metrology directly on the machine tool. This thesis deals with the geometric modeling of 5-axis machine tools based on a standardized parameterization of geometric errors. This model is simulated and simplified by means of a virtual machine developed to help understand and visualize the effects of geometric errors on the volumetric error. A new thermo-invariant material standard, the Multi-Feature Bar, has been developed. After its calibration and a European intercomparison, it provides direct metrological traceability to the SI metre for dimensional measurement on machine tools in a hostile environment. The identification of three intrinsic parameters of this standard, coupled with a measurement procedure, ensures complete and traceable identification of the motion errors of linear axes. The identification of the position and orientation errors of the axes is based on an analysis of the combinations of parameters needed to best characterize the volumetric error. A model parameter identification procedure is proposed that minimizes the time drift of the structural loop and the effects of the previously identified motion errors. A sensitivity analysis of the measurement procedure settings and of noise effects ensures the quality of the proposed identification.
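The propagation of axis motion errors to the volumetric error mentioned above is commonly expressed with homogeneous transformation matrices. The sketch below is a generic first-order illustration with made-up error values, not the standardized parameterization used in the thesis: a single linear axis with positioning, straightness and pitch errors displaces the tool tip away from its nominal position.

```python
import numpy as np

def small_rotation(eps_x, eps_y, eps_z):
    """First-order rotation matrix for small angular errors (rad)."""
    return np.array([[1.0,   -eps_z,  eps_y],
                     [eps_z,  1.0,   -eps_x],
                     [-eps_y, eps_x,  1.0]])

def axis_transform(x, errors):
    """Homogeneous transform of an X-axis at position x with its
    position-dependent motion errors (3 translations, 3 rotations)."""
    exx, eyx, ezx, eax, ebx, ecx = errors(x)
    T = np.eye(4)
    T[:3, :3] = small_rotation(eax, ebx, ecx)
    T[:3, 3] = [x + exx, eyx, ezx]
    return T

# Illustrative (made-up) error functions: a linear positioning error and
# a constant pitch; a real model uses errors identified on the machine.
def errors(x):
    return (2e-5 * x, 1e-6, 0.0, 0.0, 5e-5, 0.0)  # metres and radians

tool_offset = np.array([0.0, 0.0, -0.3, 1.0])  # tool tip in the axis frame (m)
x = 0.5
actual = axis_transform(x, errors) @ tool_offset
nominal = np.array([x, 0.0, -0.3, 1.0])
volumetric_error = actual[:3] - nominal[:3]
print(volumetric_error)  # contribution of this single axis to the error
```

Chaining one such transform per axis of the kinematic structure gives the machine's volumetric error at the tool tip, which is what the virtual machine in the abstract simulates.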
Guitarra, Silvana Raquel. "Modélisation multi-échelles des mémoires de type résistives (ReRAM)." Thesis, Aix-Marseille, 2018. http://www.theses.fr/2018AIXM0537/document.
A model for the switching of resistive random-access memories (ReRAM) is presented. It is based on two hypotheses: (1) resistive switching is caused by changes occurring in the narrow zone (active region) of the conductive filament under the influence of the electric field, and (2) resistive switching is a stochastic process governed by a switching probability. The active region is represented by a net of vertical connections, each composed of three electrical elements: two are always low resistive (LR), while the third acts as a breaker and can be low or high resistive (HR). In the model, the change of the breaker's state is governed by a switching probability P$_{s}$ that is compared with a random number $p$. P$_{s}$ depends on the voltage drop across the breaker and on the threshold voltage, V$_{set}$ or V$_{reset}$, for the set (HR to LR) or reset (LR to HR) process. Two conduction mechanisms have been proposed: ohmic for the low resistive state and trap-assisted tunneling (TAT) for the high resistive state. The model has been implemented in Python and works with an external C library that optimizes calculations and processing time. The simulation results have been successfully validated by comparing measured and modeled IV curves of HfO$_{2}$-based ReRAM devices of nine different areas. The flexibility and easy implementation of this resistive switching model make it a powerful tool for the design and study of ReRAM memories.
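The stochastic set transition described above can be sketched in a few lines. The logistic form chosen for P_s and all parameter values below are illustrative assumptions, not the calibrated model of the thesis: during a voltage sweep, each HR breaker flips to LR when a random draw falls below P_s(V).

```python
import random

# Monte Carlo sketch of a stochastic set transition: the active region is
# a set of parallel connections whose "breaker" element flips from
# high-resistive (HR) to low-resistive (LR) when a random draw p falls
# below a switching probability P_s(V). Illustrative parameters only.

def p_switch(v, v_set=0.8, slope=20.0):
    """Switching probability: ~0 well below v_set, ~1 well above it."""
    return 1.0 / (1.0 + pow(2.718281828, -slope * (v - v_set)))

def sweep(n_breakers=200, v_max=1.5, steps=60, seed=1):
    rng = random.Random(seed)
    states = [False] * n_breakers        # False = HR, True = LR
    fractions = []
    for k in range(steps + 1):
        v = v_max * k / steps
        for i in range(n_breakers):
            if not states[i] and rng.random() < p_switch(v):
                states[i] = True         # breaker sets: HR -> LR
        fractions.append(sum(states) / n_breakers)
    return fractions

f = sweep()
print(f[0], f[-1])   # almost no breakers set at 0 V, nearly all at 1.5 V
```

Attaching an ohmic conductance to LR breakers and a TAT current to HR ones, as the abstract describes, would turn this switching fraction into a modeled IV curve.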
Cao, Trong Son. "Modélisation de l’endommagement ductile sous trajets de chargement complexes." Thesis, Paris, ENMP, 2013. http://www.theses.fr/2013ENMP0038/document.
The present PhD thesis aims at a better understanding and modeling of ductile damage mechanisms during cold forming processes, with wire drawing, rolling and cold pilgering as examples. Special attention is paid to the parameter identification methodology for the implemented damage models. All three approaches to ductile damage were investigated: uncoupled phenomenological fracture criteria, coupled phenomenological models, and a micromechanical model. These models have been implemented in Forge®, which required adapting the algorithms to its mixed velocity-pressure formulation and to its P1+/P1 finite element. In parallel with the numerical work, various mechanical tests on three different materials (high-carbon steel, stainless steel and a zirconium alloy) were carried out for work-hardening and damage model parameter identification. In situ X-ray micro-tomography tensile tests were also exploited to identify the ductile damage mechanisms (nucleation, growth and coalescence) and to calibrate the micromechanical model. Finally, we carried out comparative studies of these models on the three above-mentioned forming processes and materials. For wire drawing and rolling of stainless steel, good agreement between numerical simulations and experimental results was found. For ultimate wire drawing of high-carbon pearlitic steel, the GTN micromechanical model gave the best results, both qualitatively and quantitatively. Moreover, the comparison of the different models on different processes (wire rolling of high-carbon steel, cold pilgering of a zirconium alloy) highlights, on the one hand, the important role of the third deviatoric stress invariant in damage localization for shear-dominated forming processes; it shows, on the other hand, that the identification process itself should be based on microstructure measurements to provide accurate results in forming applications.
De, oliveira Hugo. "Modélisation prédictive des parcours de soins à l'aide de techniques de process mining et de deep learning." Thesis, Lyon, 2020. http://www.theses.fr/2020LYSEM021.
Initially created for reimbursement purposes, non-clinical claims databases are exhaustive Electronic Health Records (EHRs) that are particularly valuable for evidence-based studies. The objective of this work is to develop predictive methods for patient pathway data that leverage the complexity of non-clinical claims data and produce explainable results. Our first contribution focuses on the modeling of event logs extracted from such databases: new process models and an adapted process discovery algorithm are introduced, with the objective of accurately modeling the characteristic transitions and times hidden in non-clinical claims data. The second contribution is a preprocessing solution to one source of complexity in such data, namely the representation of medical events by multiple codes belonging to different standard coding systems organized in hierarchical structures; the proposed method uses auto-encoders and clustering in an adequate latent space to automatically produce relevant and explainable labels. Building on these contributions, an optimization-based predictive method is introduced, which uses a process model to perform binary classification from event logs and highlights distinctive patterns as a global explanation. A second predictive method is also proposed, which represents patient pathways as images and predicts with a modified Variational Auto-Encoder (VAE); it globally explains predictions by showing an image of the identified predictive factors, which can be both frequent and infrequent.
Mazzuca, Muriel. "Quantification par mesures directes d'émissions polluantes gazeuses de divers grands process industriels, et modélisation d'un panache réactif." Lille 1, 1998. https://pepite-depot.univ-lille.fr/LIBRE/Th_Num/1998/50376-1998-49.pdf.
Perez, Medina Jorge Luis. "Approche Orientée Services pour la Réutilisation de Processus et d'Outils de Modélisation." Phd thesis, Grenoble, 2010. http://www.theses.fr/2010GRENM025.
An information system development methodology allows designers to model products (any software artifact) by executing process models, and designers are assisted in their tasks by modeling tools. Method engineering makes it possible to define development methods by reusing and assembling fragments of existing methods; the point is then to provide designers with a modeling environment adapted to their activities. The aim of our research is to facilitate the work of project managers and model designers by helping them choose method fragments, models and modeling environments adapted to their specific needs. The contributions of this thesis concern a service-oriented architecture based on three abstraction levels. The intentional level represents the needs of model designers and project managers in terms of model management; the emphasis is placed on modeling goals, with the use of ontologies of the information systems domain. The organizational level proposes a model of services facilitating the capitalization and selection of method fragments that realize the designers' goals. The operational level relies on a model of services for characterizing and selecting the modeling tools appropriate to a given method fragment. All proposals are illustrated by case studies and a first prototype version that supports and validates this work.
Perez, Medina Jorge Luis. "Approche Orientée Services pour la Réutilisation de Processus et d'Outils de Modélisation." Phd thesis, Grenoble, 2010. http://tel.archives-ouvertes.fr/tel-00493314.
Krebs, Stéphane. "Modélisation des propriétés thermodynamiques de solutions d'électrolytes à intérêt industriel." Phd thesis, Université Pierre et Marie Curie - Paris VI, 2006. http://tel.archives-ouvertes.fr/tel-00357703.
Al, Haddad Mazen. "Contribution théorique et modélisation des phénomènes instantanés dans les opérations d'autovaporisation et de déshydratation." Phd thesis, Université de La Rochelle, 2007. http://tel.archives-ouvertes.fr/tel-00399138.
Huynh, Ngoc Tho. "A development process for building adaptative software architectures." Thesis, Ecole nationale supérieure Mines-Télécom Atlantique Bretagne Pays de la Loire, 2017. http://www.theses.fr/2017IMTA0026/document.
Adaptive software is a class of software able to modify its own internal structure, and hence its behavior, at runtime in response to changes in its operating environment. Adaptive software development has been an emerging research area of software engineering in the last decade. Many existing approaches use techniques derived from software product lines (SPLs) to develop adaptive software architectures; they propose tools, frameworks or languages to build such architectures but do not guide developers in using them. Moreover, they assume that all elements specified in the SPL are available in the architecture for adaptation, so the adaptive software architecture may embed unnecessary elements (components that will never be used), thus limiting the possible deployment targets. On the other hand, replacing components at runtime remains a complex task, since it must ensure the validity of the new version in addition to preserving the correct completion of ongoing activities. To cope with these issues, this thesis proposes an adaptive software development process in which tasks, roles and associated artifacts are explicit. The process aims at specifying the information necessary for building adaptive software architectures; its result is an architecture that contains only the elements necessary for adaptation. In addition, an adaptation mechanism based on transaction management is proposed to ensure consistent dynamic adaptation: the adaptation must preserve a consistent system state and ensure the correct completion of ongoing transactions. In particular, transactional dependencies are specified at design time in the variability model; based on these dependencies, the components in the architecture include the mechanisms necessary to manage transactions consistently at runtime.
Trouvilliez, Alexandre. "Observations et modélisation de la neige soufflée en Antarctique." Phd thesis, Université de Grenoble, 2013. http://tel.archives-ouvertes.fr/tel-01072241.
Guitarra, Silvana Raquel. "Modélisation multi-échelles des mémoires de type résistives (ReRAM)." Electronic Thesis or Diss., Aix-Marseille, 2018. http://theses.univ-amu.fr.lama.univ-amu.fr/181210_GUITARRA_584ohbsor62rwx899lsxvt26qyx_TH.pdf.
A model for the switching of resistive random-access memories (ReRAM) is presented. It is based on two hypotheses: (1) resistive switching is caused by changes occurring in the narrow zone (active region) of the conductive filament under the influence of the electric field, and (2) resistive switching is a stochastic process governed by a switching probability. The active region is represented by a net of vertical connections, each composed of three electrical elements: two are always low resistive (LR), while the third acts as a breaker and can be low or high resistive (HR). In the model, the change of the breaker's state is governed by a switching probability (Ps) that is compared with a random number p. Ps depends on the voltage drop across the breaker and on the threshold voltage, V_{set} or V_{reset}, for the set (HR to LR) or reset (LR to HR) process. Two conduction mechanisms have been proposed: ohmic for the low resistive state and trap-assisted tunneling (TAT) for the high resistive state. The model has been implemented in Python and works with an external C library that optimizes calculations and processing time. The simulation results have been successfully validated by comparing measured and modeled IV curves of HfO₂-based ReRAM devices of nine different areas. The flexibility and easy implementation of this resistive switching model make it a powerful tool for the design and study of ReRAM memories.
Marin, Gallego Mylene. "Valorisation chimique des condensats issus de la torréfaction de biomasses : modélisation thermodynamique, conception et analyse des procédés." Thesis, Toulouse, INPT, 2015. http://www.theses.fr/2015INPT0131.
Lignocellulosic biomass is considered a renewable carbon resource with great potential for energy and chemical recovery. Torrefaction is a thermal process carried out at temperatures below 300°C, under inert atmosphere, at atmospheric pressure, with residence times for the solid biomass ranging from a few minutes to several hours. Torrefied wood is a solid product retaining more than 70% of the initial mass, with properties close to those of coal. The remaining 30% is a gaseous effluent, composed of about one third non-condensable gases (carbon monoxide and carbon dioxide) and two thirds condensable species. Currently, torrefied wood is the main product of interest and is usually transformed into energetic gases by gasification or used directly as coal for combustion. Conversely, the gaseous by-products are at present considered a waste and, in the best case, are burned to provide energy to the process. Yet the recovery and valorization of the condensable fraction as bio-sourced chemicals is worth considering. The aim of this thesis is to propose a separation-purification process for the condensable chemicals in this waste gas. The condensable fraction is a predominantly aqueous phase containing more than 150 identified organic species; minority species are present in varying proportions depending on the torrefied wood. Finally, it is a reactive and thermally unstable mixture in which several chemical equilibria are present. An analysis of the physicochemical characteristics of the condensable fraction made it possible to select a limited number of compounds to model the mixture. A representative model of the thermodynamic behavior of the reactive mixture was selected and the binary interaction parameters identified; experimental vapor-liquid equilibrium data were acquired in part to validate this model.
The target compounds and the objectives of the recovery process were then selected, and several process strategies were developed and simulated in ProSim+ on the basis of the thermodynamic modeling. This study assessed these strategies in terms of energy efficiency and product purity, with a view to a potential industrial-scale implementation in this sector.
Duchez, Laurent. "Modélisation et contribution à l'industrialisation du procédé de rétification du bois." Phd thesis, Ecole Nationale Supérieure des Mines de Saint-Etienne, 2001. http://tel.archives-ouvertes.fr/tel-00872327.
Beaubier, Benoit. "Étude physique et modélisation numérique de procédés d'assemblage par soudo-brasage de sous-ensembles en carrosserie automobile." Thesis, Paris 6, 2014. http://www.theses.fr/2014PA066060/document.
This study concerns the impact of thermo-mechanical assembly processes on metal sheets in the automotive industry. The aim is to predict thermally induced deformations using a numerical tool. We are particularly interested in the Plasmatron and laser brazing processes used to assemble the roof and body side of a vehicle; the parts are made of 0.67 mm thick XES sheet metal about one metre long. To validate such complex non-linear numerical simulations against experimental observations, it is necessary to develop well-controlled, highly instrumented tests. In a first step, experimental welding-brazing tests were carried out to identify validity domains, a heat-source model and thermal exchange coefficients. In a second step, high-temperature tensile tests were performed to identify the behaviour of each material; these tests were instrumented with a new digital image correlation (DIC) protocol to measure displacement fields from 20°C to 1000°C. Finally, to validate the thermo-mechanical simulation, in-situ 3D DIC (stereo-correlation) measurements were performed during the welding-brazing assembly. For this purpose, and owing to the geometry of the parts involved, a new calibration method based on the CAD geometry of the part was developed.
Mougin, Jonathan. "Vers un outil de modélisation des processus de transfert de connaissance." Thesis, Université Grenoble Alpes (ComUE), 2018. http://www.theses.fr/2018GREAI022/document.
Knowledge, and more precisely knowledge management, is a strategic issue that can provide a significant competitive advantage in a competitive and dynamic economy. The field has an undeniably interdisciplinary character: it is treated by multiple research areas such as information science, computer science, psychology, economics and engineering sciences. The research presented in this thesis was carried out under a CIFRE agreement (a French industrial PhD convention), born from the collaboration between the "Collaborative Design" team of the G-SCOP laboratory (the Grenoble laboratory dedicated to the sciences of design, optimization and production) and the company BASSETTI, editor and integrator of the TEEXMA® software solution. Our goal has been to study how to model knowledge transfer processes in order to better understand how these processes are implemented within organizations. The modelling tool proposed in this thesis relies partly on the reuse of existing models from the literature and on a participant observation methodology. The research was conducted following an intervention research methodology. We tested our modelling tool through several case studies, which validated the first part of our hypotheses. After this validation phase, we evaluated how users appropriated the modelling tool following knowledge management training sessions. In parallel, we included the modelling tool in a more global approach for diagnosing knowledge transfer situations, which was then applied to two case studies. Finally, we reused this approach, this time employing the modelling tool to analyse usage of software tools.
Gahlam, Nadia. "L'entrepreneuriat durable : essai de modélisation d'un processus innovant." Electronic Thesis or Diss., Reims, 2019. http://www.theses.fr/2019REIME001.
The notion of sustainable development is today a central concern of the population and public authorities. Sustainable entrepreneurship is a form of response to this concern through the integration of sustainable development standards into the core business of the company: this type of enterprise pursues economic, social and environmental objectives. Research on sustainable entrepreneurship has been particularly interested in the profile of the sustainable entrepreneur, who is considered an agent of disruption through the introduction of eco-innovations; however, it has paid too little attention to how the process itself operates. Innovation thus appears as a solution to social and environmental issues, but this alone is not enough to establish this entrepreneurial phenomenon as an innovative form. This thesis attempts to fill these gaps by modeling the sustainable entrepreneurial process. In addition, borrowing a theory of innovation, C-K theory, makes it possible to bring the sustainable entrepreneurial process closer to the C-K innovative design process in order to determine the innovative nature of sustainable entrepreneurship.
Boillot, Mathieu. "Validation expérimentale d'outils de modélisation d'une pile à combustible de type PEM." Phd thesis, Institut National Polytechnique de Lorraine - INPL, 2005. http://tel.archives-ouvertes.fr/tel-00109628.
Chapurlat, Vincent. "Vérification et validation de modèles de systèmes complexes: application à la Modélisation d'Entreprise." Habilitation à diriger des recherches, Université Montpellier II - Sciences et Techniques du Languedoc, 2007. http://tel.archives-ouvertes.fr/tel-00204981.
Le travail de recherche entrepris depuis le début du Doctorat en 1991 relève de la thématique de la modélisation de systèmes complexes puis de la vérification et de la validation de ces modèles. Ceci a pour objectif d'assurer, ou à défaut de rassurer, le modeleur sur la qualité des modèles, sur leur pertinence vis-à-vis du système considéré et sur le respect d'exigences qui ont présidé à leur construction. La recherche a donc consisté au développement d'approches de modélisation, de spécification formelle de propriétés, de vérification par preuve de propriétés au moyen de Graphes Conceptuels et de simulation comportementale. Les domaines d'application privilégiés ont été les systèmes de contrôle commande répartis, puis plus largement la modélisation d'entreprise et tentent aujourd'hui d'intégrer une dimension risque dans la modélisation d'entreprise et de s'ouvrir plus largement à l'ingénierie des systèmes complexes. Les résultats sont des langages et un cadre de modélisation intégré, un langage de spécification baptisé LUSP, une suite de mécanismes de preuve formelle et de simulation qui ont donné lieu à divers encadrements de thèses, de travaux et à des transferts vers l'industrie.
Finally, the teaching activity has sought to remain consistent with the competence profile, both in production engineering and in systems engineering, acquired through or inspired by the research theme. It has taken place within various universities, engineering schools and specialized curricula. The results are proposals for and support of new teaching topics, pedagogical engineering activity, and involvement in various administrative responsibilities.
El, Hentati Fatima Zahra. "Etude expérimentale et modélisation mathématique de la réponse lymphocytaire T." Phd thesis, Ecole Nationale Supérieure des Mines de Saint-Etienne, 2009. http://tel.archives-ouvertes.fr/tel-00466660.
Kacem, Manel. "Processus de risque : modélisation de la dépendance et évaluation du risque sous des contraintes de convexité." Thesis, Lyon 1, 2013. http://www.theses.fr/2013LYO10051/document.
In this thesis we focus on two different problems whose common point is their contribution to modeling and risk management in insurance. In the first research theme, we are interested in the modeling of dependence in insurance; in particular, we propose an extension of the common-factor model. In the second research theme we consider the class of nonincreasing discrete distributions and study the effect of an additional convexity constraint on the convex extrema. Some applications in ruin theory motivate our interest in this subject. The first part of this thesis is concerned with factor models for the modeling of dependence in insurance. An interesting property of these models is that the random variables are conditionally independent with respect to a factor. We propose a new model in which the conditioning is with respect to the entire memory of the factor. In this case we give some mixing properties of the risk process under conditions related to the mixing properties of the factor process and to the conditional mixing of the risk process. The law of the sum of random variables is of great interest in actuarial science, so we also give conditions under which the law of the aggregated process converges to a normal distribution. In the second part of the thesis we consider the class of discrete distributions whose probability mass functions (p.m.f.) are nonincreasing on a finite support. Convex extrema in that class of distributions are well known. Our purpose is to point out how additional shape constraints of convexity type modify these extrema. Two cases are considered: the p.m.f. is globally convex on N, or it is convex only from a given positive point. The corresponding convex extrema are derived by using a simple crossing property between two distributions. Several applications to some ruin problems are presented for illustration.
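The shape constraints discussed in this abstract can be written out explicitly. As an illustrative sketch (the notation is ours, not the thesis's), a p.m.f. p on the finite support {0, ..., n} that is nonincreasing and globally convex satisfies:

```latex
% Nonincreasing p.m.f. on a finite support \{0,\dots,n\}:
p_k \;\geq\; p_{k+1}, \qquad k = 0,\dots,n-1.
% Additional global convexity constraint:
p_{k+1} - p_k \;\leq\; p_{k+2} - p_{k+1}, \qquad k = 0,\dots,n-2,
% equivalently, nonnegative second-order differences:
\Delta^2 p_k := p_{k+2} - 2\,p_{k+1} + p_k \;\geq\; 0.
```

In the second case treated in the thesis, the convexity inequality is only required from a given positive index onwards.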
Zhang, Qiang. "Process modeling of innovative design using systems engineering." Thesis, Strasbourg, 2014. http://www.theses.fr/2014STRAD007/document.
We develop a series of process models to comprehensively describe and effectively manage innovative design, in order to achieve an adequate balance between innovation and control, following the design research methodology (DRM). Firstly, we introduce a descriptive model of innovative design. This model reflects the actual process and pattern of innovative design, locates innovation opportunities in the process, and supports a systematic perspective focused on the external and internal factors affecting the success of innovative design. Secondly, we perform an empirical study to investigate how control and flexibility can be balanced to manage uncertainty in innovative design. After identifying project practices that cope with these uncertainties in terms of control and flexibility, a case-study sample based on five innovative design projects from an automotive company is analyzed and shows that control and flexibility can coexist. Based on the managerial insights of the empirical study, we develop the procedural process model and the activity-based adaptive model of innovative design. The former provides a conceptual framework to balance innovation and control through process structuration at the project level and the integration of flexible practices at the operation level. The latter considers innovative design as a complex adaptive system, and thereby proposes a process design method that dynamically constructs the process architecture of innovative design. Finally, the two models are verified through a number of process analyses and simulations within a series of innovative design projects.
Oliva, Florian. "Modélisation, caractérisation et optimisation des procédés de traitements thermiques pour la formation d’absorbeurs CIGS." Thesis, Saint-Etienne, EMSE, 2014. http://www.theses.fr/2014EMSE0738/document.
Solar energy is poised to become a major actor in future energy production. Even though silicon-based solar cells remain the main product, their fabrication is energy-intensive and requires heavy cover glass for protection, which limits their development. For several years, commercial interest has shifted towards thin-film cells, whose main advantages are shorter manufacturing time, large-scale production, lower fabrication costs and weight savings. A wide variety of materials can be used for thin-film technology, but chalcopyrites such as Cu(In,Ga)Se2 are among the most promising. The most common method for chalcopyrite formation is co-evaporation, but this process is very expensive and not well suited to large-scale production due to its high-vacuum requirements. One alternative solution, described in this work, consists of a two-step technology based on the sequential electrodeposition of a metallic precursor followed by a rapid reactive annealing. However, to reach its full potential this technology needs a better understanding of the Ga incorporation mechanism and of the selenization/sulfurization step. This work focuses first on formation mechanisms through the study of several kinds of precursor. This knowledge is then used to explain and optimize innovative annealing processes. The study is carried out by observing the impact of selected process parameters using designs of experiments (DOE). A link between process parameters and the properties of these thin films is obtained through electrical, structural and diffusion characterization of the devices. Finally, we propose hypotheses to explain the observed phenomena, as well as some improvements to meet the challenges of this process.
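The designs of experiments mentioned in this abstract enumerate combinations of process-parameter levels. As a minimal illustrative sketch (the factor names and level values below are hypothetical, not taken from the thesis), a two-level full factorial design for annealing parameters can be generated as follows:

```python
from itertools import product

# Hypothetical annealing factors with low/high levels
# (illustrative values only -- not the thesis's actual parameters).
factors = {
    "temperature_C": (450, 550),
    "duration_min": (5, 15),
    "Se_supply": ("low", "high"),
}

def full_factorial(factors):
    """Return every combination of factor levels as a list of run dicts."""
    names = list(factors)
    return [
        dict(zip(names, levels))
        for levels in product(*(factors[n] for n in names))
    ]

design = full_factorial(factors)
print(len(design))  # 2^3 = 8 experimental runs
```

Each dict in `design` describes one experimental run; with k two-level factors, a full factorial design yields 2^k runs, which is why fractional designs are often preferred when many parameters are screened.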