Dissertations / Theses on the topic 'Mise à l’échelle automatique'
Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles
Consult the top 50 dissertations / theses for your research on the topic 'Mise à l’échelle automatique.'
Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.
Browse dissertations / theses on a wide variety of disciplines and organise your bibliography correctly.
Perera, Jayasuriya Kuranage Menuka. "AI-driven Zero-Touch solutions for resource management in cloud-native 5G networks." Electronic Thesis or Diss., Ecole nationale supérieure Mines-Télécom Atlantique Bretagne Pays de la Loire, 2024. http://www.theses.fr/2024IMTA0427.
The deployment of 5G networks has introduced cloud-native architectures and automated management systems, offering communication service providers scalable, flexible, and agile infrastructure. These advancements enable dynamic resource allocation, scaling resources up during high demand and down during low usage, optimizing CapEx and OpEx. However, limited observability and poor workload characterization hinder resource management. Overprovisioning during off-peak periods raises costs, while underprovisioning during peak demand degrades QoS. Despite industry solutions, the trade-off between cost efficiency and QoS remains unresolved. This thesis addresses these challenges by proposing proactive autoscaling solutions for network functions in cloud-native 5G. It focuses on accurately forecasting resource usage, intelligently differentiating scaling events (scaling up, down, or none), and optimizing timing to achieve a balance between cost and QoS. Additionally, CPU throttling, a significant barrier to this balance, is mitigated through a novel approach. The developed framework ensures efficient resource allocation, reducing operational costs while maintaining high QoS. These contributions establish a foundation for sustainable and efficient 5G network operations, setting a benchmark for future cloud-native architectures.
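The decision logic this abstract describes (forecast resource usage, then classify the scaling event as up, down, or none) can be illustrated with a minimal sketch. The naive linear forecast, the thresholds and all names below are assumptions for illustration, not the thesis's actual method:

```python
def forecast_next(usage_history):
    """Naive forecast: linear extrapolation from the last two samples."""
    if len(usage_history) < 2:
        return usage_history[-1]
    return usage_history[-1] + (usage_history[-1] - usage_history[-2])

def scaling_decision(usage_history, high=0.8, low=0.3):
    """Differentiate scaling events from a forecast: 'up', 'down', or 'none'."""
    predicted = forecast_next(usage_history)
    if predicted > high:
        return "up"    # scale out before demand peaks (protects QoS)
    if predicted < low:
        return "down"  # release resources during low usage (cuts cost)
    return "none"
```

A real system would replace the two-point extrapolation with a learned forecaster and tune the thresholds against QoS targets; the point here is only the forecast-then-classify structure.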
Stevens, Nolwenn. "Mise à l’échelle des interventions complexes en santé publique. Aspects conceptuels et méthodologiques." Electronic Thesis or Diss., Bordeaux, 2024. http://www.theses.fr/2024BORD0234.
Politicians, researchers and stakeholders all share a desire to mobilise evidence to build public health policy. This strategy is motivated by a desire for equity, a duty to ensure safety and effectiveness, and a need to make rational use of resources. Scaling up interventions would be the way to achieve this strategy. However, this is not easy. It is necessary to mark out and secure this path so that scaling up is no longer a risky gamble. This is the ambition we have pursued in this thesis. Our research was guided by four main questions: what do we mean by 'scaling up'? How do we scale up? How can it be made safe? What challenges does population health intervention research face in terms of scaling up? All these questions are limited to the case of complex population-based interventions. To meet our objectives, we conducted two successive studies. We carried out a review of the international literature on conceptual frameworks and models, guides and tools developed for scaling up public health interventions. We then conducted semi-structured interviews with a variety of stakeholders who had experience of scaling up a public health intervention in the French context, with the aim of gathering their experiences and perceptions of the process. Our research has enabled us to shed light on the very concept of scaling up, and to propose a definition. In addition, we propose to consider it as a composite concept, incorporating those of implementation, dissemination and sustainability. It has also made it possible to describe the range of strategies that can be adopted to deploy it, whether these concern: i) the territorial expansion of the intervention, ii) the sustainability of the intervention, and finally iii) its relevance to reality. In addition, this process is supported by organisational approaches that enable the dynamic to be set in motion and a set of eight essential activities.
Finally, a set of conditions will have to be met, and certain obstacles will have to be avoided. The literature has identified 30 levers and 28 obstacles to scaling up. Among these, the testimonies of experimenters have highlighted 6 catalyst factors and 6 inhibitors, enabling us to develop and detail the most fundamental conditions influencing the process. Finally, our research proposes to demystify evidence-based interventions while encouraging the adoption of evidence-based approaches. It also suggests considering evidence that is useful, plural, relative and grounded. Questions relating to the generalisability of results, intrinsically linked to the ambition of scaling up, require further exploration. Recognition of the identity of the intervention, linked to what underpins its effectiveness, is an imperative that needs to be consolidated methodologically. As suggested by the expression "science of solutions", we propose taking our academic vision beyond interventions and transforming the knowledge gained from intervention research into substrates that enable the emergence of solutions adapted to the issues and situations at stake. Finally, embarking on a policy of scaling up interventions would also require: greater transparency in the process of selecting interventions, a redirection of funding from leadership to support services and activities, and the introduction of monitoring systems. The identification or creation of expert resource structures to support the various stakeholders in this complex process could be a great help.
Rivière, Pascal. "Mise au point automatique et interactive de modèles." Caen, 1994. http://www.theses.fr/1994CAEN2058.
Severini, Alfiero. "Mise au point d'un système de traduction automatique italien-français." Paris 13, 2001. http://www.theses.fr/2001PA131007.
Ruesch, Sandra. "Mise en oeuvre d'un progiciel general de traitement automatique d'images stereoscopiques." Paris 7, 1994. http://www.theses.fr/1994PA077365.
Kermad, Chafik. "Segmentation d'image: recherche d'une mise en oeuvre automatique par coopération de méthodes." PhD thesis, Université Rennes 1, 1997. http://tel.archives-ouvertes.fr/tel-00008781.
Salmeron, Eva. "Mise en coïncidence automatique des contours extraits d’images aériennes et d’éléments cartographiques." Compiègne, 1986. http://www.theses.fr/1986COMPD018.
Kermad, Chafik Djalal. "Segmentation d'images : recherche d'une mise en œuvre automatique par coopération de méthodes." Rennes 1, 1997. http://www.theses.fr/1997REN10109.
Cao, Van Toan. "La mise en registre automatique des surfaces acquises à partir d'objets déformables." Doctoral thesis, Université Laval, 2016. http://hdl.handle.net/20.500.11794/26764.
Three-dimensional registration (sometimes referred to as alignment or matching) is the process of transforming many 3D data sets into the same coordinate system so as to align overlapping components of these data sets. Two data sets aligned together can be two partial scans from two different views of the same object. They can also be two complete models of an object generated at different times or even from two distinct objects. Depending on the generated data sets, the registration methods are classified into rigid registration or non-rigid registration. In the case of rigid registration, the data is usually acquired from rigid objects. The registration process can be accomplished by finding a single global rigid transformation (rotation, translation) to align the source data set with the target data set. However, in the non-rigid case, in which data is acquired from deformable objects, the registration process is more challenging since it is important to solve for both the global transformation and local deformations. In this thesis, three methods are proposed to solve the non-rigid registration problem between two data sets (presented in triangle meshes) acquired from deformable objects. The first method registers two partially overlapping surfaces. This method overcomes some limitations of previous methods to solve large global deformations between two surfaces. However, the method is restricted to small local deformations on the surface in order to validate the descriptor used. The second method is developed from the framework of the first method and is applied to data for which the deformation between the two surfaces consists of both large global deformation and small local deformations. The third method, which exploits both the first and second method, is proposed to solve more challenging data sets.
Although the quality of alignment that is achieved is not as good as the second method, its computation time is accelerated approximately four times since the number of optimized parameters is reduced by half. The efficiency of the three methods is the result of the strategies in which correspondences are correctly determined and the deformation model is adequately exploited. These proposed methods are implemented and compared with other methods on various types of data to evaluate their robustness in handling the non-rigid registration problem. The proposed methods are also promising solutions that can be applied in applications such as non-rigid registration of multiple views, 3D dynamic reconstruction, 3D animation or 3D model retrieval.
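For contrast with the non-rigid case this abstract tackles, the rigid case it mentions (a single global rotation and translation) admits a closed-form solution when point correspondences are known. Below is a minimal 2D sketch; the function name and pure-Python formulation are illustrative, not taken from the thesis:

```python
import math

def rigid_register_2d(source, target):
    """Closed-form 2D rigid registration (rotation + translation).

    Finds the angle theta and translation t minimizing
    sum ||R(theta) p_i + t - q_i||^2 over paired points, using
    centroids and the optimal angle from cross/dot sums.
    """
    n = len(source)
    cs = (sum(p[0] for p in source) / n, sum(p[1] for p in source) / n)
    ct = (sum(q[0] for q in target) / n, sum(q[1] for q in target) / n)
    # Accumulate dot and cross terms of the centered point pairs.
    dot = cross = 0.0
    for (px, py), (qx, qy) in zip(source, target):
        ax, ay = px - cs[0], py - cs[1]
        bx, by = qx - ct[0], qy - ct[1]
        dot += ax * bx + ay * by
        cross += ax * by - ay * bx
    theta = math.atan2(cross, dot)  # optimal rotation angle
    c, s = math.cos(theta), math.sin(theta)
    # Translation maps the rotated source centroid onto the target centroid.
    tx = ct[0] - (c * cs[0] - s * cs[1])
    ty = ct[1] - (s * cs[0] + c * cs[1])
    return theta, (tx, ty)
```

The non-rigid problem studied in the thesis has no such closed form: local deformations add many more unknowns, which is why correspondence search and a deformation model become central.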
Zehani, Mongia. "Optimisation du procédé polyol pour la synthèse de nanoparticules d'oxyde de zinc : mise à l'échelle du procédé et applications photovoltaïques." Thesis, Paris 13, 2014. http://www.theses.fr/2014PA132044/document.
Thanks to developments in synthesis methods and characterization techniques, the nanomaterials research field is increasingly active and attractive. This thesis investigates the polyol process for zinc oxide nanoparticle synthesis. Indeed, this method has the advantage of providing a wide variety of particle morphologies with good crystalline quality. In this thesis, we show that by varying the synthesis conditions we can adjust the size, the size distribution and the morphology of the nanoparticles to obtain either nanospheres as small as 6 nm or nanowires as long as 600 nm. Our systematic study focused on a set of parameters that control the forced hydrolysis reaction, including stoichiometry, temperature and nature of the polyol, but also mixing, injection of reagents and ultrasound activation. We show that the shape of the nanoparticles is determined by the competition between the growth rates of different zinc oxide crystal facets. Our study also compared different mixing devices such as a laboratory reactor, a T-mixer and impinging jets. Moreover, to mass-produce zinc oxide nanoparticles, we developed an original strategy to understand the effect of mixing on nanoparticle size. In our approach, we correlate the turbulent energy dissipated, as obtained from Computational Fluid Dynamics, with the measured nanoparticle size. The application to the specific case of zinc oxide has allowed us to produce sample aliquots of ~50 g per batch. These nanoparticles were subsequently incorporated into dye-sensitized solar cells as the semiconducting material at the École Nationale Supérieure de Chimie de Paris. Indeed, the morphological richness of the zinc oxide produced via the polyol process suggests good adsorption of the dye on its surface. Our results show that the photoconversion efficiencies depend both on the morphology and the size. Our best photoconversion efficiency approaches 5.3%.
Ngo, Tri Dat. "Mise à l’échelle d’un écoulement diphasique avec gravité dans un milieu géologique hétérogène : application au cas de la séquestration du CO₂." Thesis, Université Paris-Saclay (ComUE), 2016. http://www.theses.fr/2016SACLS005/document.
This work deals with the mathematical modeling and numerical simulation of the migration, under gravity and capillarity effects, of supercritical CO₂ injected into a heterogeneous geological sequestration site. The simulations are performed with the code DuMux. In particular, we consider the upscaling, from the cell scale to the reservoir scale, of a two-phase (CO₂-brine) flow model within a periodic stratified medium made up of horizontal low-permeability barriers, continuous or discontinuous. The upscaling is done by the two-scale asymptotic method. First, we consider perfectly layered media. A homogenized model is developed and validated by numerical simulation for different values of the capillary number and the incident flux of CO₂. The homogenization method is then applied to the case of a two-dimensional medium made up of discontinuous layers. Due to the gravity effect, the CO₂ accumulates under the low-permeability layers, which leads to a non-standard local mathematical problem. This stratification is modeled using the gravity current approach. This approach is then extended to the case of semi-permeable strata, taking capillarity into account. The upscaled model is compared with numerical simulations for different types of layers, with or without capillary pressure, and its limit of validity is discussed in each of these cases. The final part of this thesis is devoted to the study of the parallel computing performance of the code DuMux in simulating the injection and migration of CO₂ in three-dimensional heterogeneous media (layered periodic media, fluvial media and the reservoir model SPE 10).
Beaurepaire, Lionel. "Contribution à la mise en œuvre d'une chaîne automatique de filtrage d'images numériques." Rennes 1, 1996. http://www.theses.fr/1996REN10195.
Khouri, Antoun. "Optimisation et mise en oeuvre d'algorithmes parallèles pour la reconnaissance de la parole." Avignon, 1995. http://www.theses.fr/1995AVIG0110.
Full textVernes, Léa. "Mise au point d’un procédé innovant d’éco-extraction assisté par ultrasons d’ingrédients alimentaires à partir de spiruline et transposition à l’échelle industrielle." Electronic Thesis or Diss., Avignon, 2019. http://www.theses.fr/2019AVIG0273.
Microalgae are one of the most promising renewable resources for future sustainable food. Thanks to the diversity of their metabolism, these microorganisms can synthesize a wide range of compounds of interest with high nutritional value. However, their consumption remains limited because of their unattractive intrinsic organoleptic characteristics. To tackle this problem and overcome these barriers, this thesis focused on the development of a production process for a food ingredient from spirulina. A green and innovative method using ultrasonic technology for the extraction of proteins from Arthrospira platensis, namely manothermosonication (MTS), was proposed in a first part. The use of an experimental design made it possible to optimize the extraction parameters; mathematical modeling and microscopic investigations led to an understanding of the mass transfer phenomena on the one hand, and of the structural effects of ultrasound on spirulina filaments on the other. According to the experimental results, MTS yielded 229% more protein (28.42 ± 1.15 g / 100 g DW) compared to the conventional method without ultrasound (8.63 ± 1.15 g / 100 g DW). With 28.42 g of protein per 100 g of spirulina in the extract, a protein recovery rate of 50% was achieved in 6 minutes with a continuous MTS process. Based on these promising results, scale-up routes were studied in order to propose decision-support tools for process industrialization. Thus, a risk analysis procedure (HACCP & HAZOP), a cost study and an assessment of the environmental impact of the process were developed in a second part of this work. Lastly, ways of exploiting by-products are presented in a biorefinery approach.
Horn, Odile. "Étude et mise en oeuvre d'un algorithme de poursuite de cible par analyse d'image." Vandoeuvre-les-Nancy, INPL, 1989. http://www.theses.fr/1989NAN10180.
Sabri, Mohamed. "Filtrage et restauration en traitement des images numériques : recherche d'une mise en œuvre automatique." Rennes 1, 1991. http://www.theses.fr/1991REN10027.
Full textMoron, Véronique. "Mise en correspondance de données 3D avec un modèle CAO : application à l'inspection automatique." Lyon, INSA, 1996. http://theses.insa-lyon.fr/publication/1996ISAL0131/these.pdf.
This work deals with the automatic inspection of solids with free-form surfaces, using 3D data produced either by a 3D range sensor or by a Coordinate Measuring Machine. We first present a complete state of the art of the 3D modeling domain. We select two kinds of surface-based models: the first is an interpolated triangulated model, the other an exact NURBS-surface-based model. For each of them, we describe the computation of the point/surface distance. We present an automatic and robust (up to 50% of outlier points) general registration method, capable of registering 3D data with a geometric model from any initial state. We apply this method to different applications, such as pattern recognition, but mainly to the automatic inspection of complex parts. We describe an inspection method that produces an inspection report including numerical results concerning global or local tolerance verification. The other outputs are several types of coloured versions of the model indicating the level of discrepancy between the measured points and the model. Using this colouring scheme, an operator or a robotic system can rapidly identify defective parts or monitor process drift on a production line.
Moreau, Aurélien. "Mise en œuvre automatique de processus métier dans le domaine des architectures orientées services." Paris 6, 2009. http://www.theses.fr/2009PA066660.
Helmy, Amr. "Mise en œuvre de techniques de démonstration automatique pour la vérification formelle des NoCs." Grenoble INPG, 2010. http://www.theses.fr/2010INPG0035.
The current technology allows the integration on a single die of complex systems-on-chip (SoCs) composed of manufactured blocks (IPs) that can be interconnected through specialized networks-on-chip (NoCs). IPs have usually been validated by diverse techniques (simulation, test, formal verification) and the key problem remains the validation of the communication infrastructure. This thesis addresses the formal verification of NoCs by means of a mechanized proof tool, the ACL2 theorem prover. A meta-model for NoCs has been developed and implemented in ACL2. It satisfies generic correctness statements, which are logical consequences of a set of proof obligations for each of the NoC constituents (topology, routing, switching technique, etc.). Thus the verification of a particular NoC instance is reduced to discharging this set of proof obligations. The purpose of this thesis is to extend this meta-model in several directions: more accurate timing modeling, flow control, priority mechanisms, etc. The methodology is demonstrated on realistic and state-of-the-art NoC designs: Spidergon (STMicroelectronics), Hermes (the Federal University of Rio Grande do Sul, Brazil, and LIRMM), and Nostrum (Royal Institute of Technology, Sweden).
Moron, Véronique; Redarce, Tanneguy; Boulanger, Pierre. "Mise en correspondance de données 3D avec un modèle CAO : application à l'inspection automatique." Villeurbanne : Doc'INSA, 2001. http://docinsa.insa-lyon.fr/these/pont.php?id=moron.
Perraud, Véronique. "Mise au point d'un préleveur automatique pour la mesure en continu des composés carbonylés atmosphériques." PhD thesis, Université de Provence - Aix-Marseille I, 2007. http://tel.archives-ouvertes.fr/tel-00346977.
Full textPerraud, Véronique. "Mise au point d’un préleveur automatique pour la mesure en continu des composés carbonylés atmosphériques." Aix-Marseille 1, 2007. http://theses.univ-amu.fr.lama.univ-amu.fr/2007AIX11058.pdf.
Because of their implication in the photochemical processes leading to the formation of tropospheric ozone and their negative effect on human health, carbonyl compounds are among the volatile organic compounds whose atmospheric concentration, which fluctuates rapidly, demands continuous measurement. The present research meets this requirement, and two sampling strategies were studied to obtain an automatic instrument for the continuous measurement of atmospheric carbonyl compounds. First, sampling by transfer of the gaseous phase into a liquid phase, associated with simultaneous chemical derivatization of the trapped compounds, was studied because of its high specificity towards carbonyl compounds. However, no "sampling device-reagent" couple allows a quantitative sampling of carbonyl compounds, nor a continuous measurement in the field. Another strategy was therefore studied: cryogenic adsorption onto a solid adsorbent followed by thermodesorption and analysis by GC/MS. Collection efficiency using different solid adsorbents was greater than 95% for carbonyl compounds consisting of 1 carbon (formaldehyde, Pvap(-30°C) = 34400 Pa) to 7 carbons (benzaldehyde, Pvap(-30°C) = 0.75 Pa). This sampling strategy is a successful first step towards the realization of an automatic sampling device for the continuous measurement of atmospheric carbonyl compounds.
Kirman, Jerome. "Mise au point d'un formalisme syntaxique de haut niveau pour le traitement automatique des langues." Thesis, Bordeaux, 2015. http://www.theses.fr/2015BORD0330/document.
The goal of computational linguistics is to provide a formal account of linguistic knowledge, and to produce algorithmic tools for natural language processing. Often, this is done in a so-called generative framework, where grammars describe sets of valid sentences by iteratively applying some set of rewrite rules. Another approach, based on model theory, instead describes grammaticality as a set of well-formedness logical constraints, relying on deep links between logic and automata in order to produce efficient parsers. This thesis favors the latter approach. Making use of several existing results in theoretical computer science, we propose a tool for linguistic description that is both expressive and designed to facilitate grammar engineering. It first tackles the abstract structure of sentences, providing a logical language based on lexical properties of words in order to concisely describe the set of grammatically valid sentences. It then draws the link between these abstract structures and their representations (both in syntax and semantics), through the use of linearization rules that rely on logic and lambda-calculus. In order to validate this proposal, we use it to model various linguistic phenomena, ending with a specific focus on languages that include free word order phenomena (that is, sentences which allow the free reordering of some of their words or syntagms while keeping their meaning), and on their algorithmic complexity.
Robert, Vincent; Laprie, Yves; Bonneau, Anne. "Modélisation de la coarticulation labiale : mise en oeuvre sur une tête parlante." S. l. : Nancy 1, 2008. http://www.scd.uhp-nancy.fr/docnum/SCD_T_2008_0077_ROBERT.pdf.
Azagoh, Christiane. "Contribution à l’émergence d’une filière insecte : mise au point d’un procédé de production de farine à l’échelle pilote et caractérisation de la fraction protéique." Thesis, Université Paris-Saclay (ComUE), 2017. http://www.theses.fr/2017SACLA018.
In the context of resource scarcity, population growth, environmental degradation, and food supply dependency, the production of protein-rich feed and food should increase in order to meet demand. New resources are currently being explored, such as plants, algae and insects. The last of these is environmentally friendly and represents a more sustainable protein source compared to conventional livestock farming. Although insects are consumed by many people in Asia, Africa and South America, this is not the case in Europe. In order to meet European consumers' preferences, they need to be processed or transformed into ingredients to become part of formulated products (e.g. powders). However, very little knowledge exists and is available regarding processing methods at a pilot or industrial scale, or regarding the composition of insect-based ingredients and the impact of the process on their properties. The aim of this work was to design a process for the production of a protein-rich meal from insects at a pilot scale, to characterize it for feed and food applications, to characterize the properties of its soluble proteins, and to study the impact of the process on these properties. Tenebrio molitor, a candidate for rearing at an industrial scale, was chosen within the framework of this study. A thermo-mechanical process was designed at a pilot scale. It allowed the production of a protein-rich insect meal with 72% (db) protein, 14% (db) lipids and 4% (wb) water. The amino acid profile of the meal proteins meets the needs of animal nutrition and human nutrition, with good protein efficiency (estimated at 2.5). The production yield of 20% (wb) (64% db) is similar to that of fishmeal production (20% wb). In parallel, insect oil (a co-product) was also produced. It is rich in palmitic acid and the essential fatty acids ω9 and ω6. It can be used in feed, food, cosmetics or bioenergy.
Although the process has an impact on the physicochemical properties of the soluble proteins after the transformation of larvae into meal, the soluble protein fractions of the meal and of the larvae have foaming and emulsifying properties similar to those of milk and BSA at 4 and 2% respectively. The meal proteins could be used in feed and food for their functional properties. This work contributes to the understanding of insect proteins and to industrial scale-up from a biorefinery design perspective.
Fréalle, Noémie. "Formation à la gestion de crise à l’échelle communale : méthode d’élaboration et de mise en œuvre de scénarios de crise crédibles, pédagogiques et interactifs." Thesis, Lyon, 2018. https://tel.archives-ouvertes.fr/tel-01860781.
Crisis management cannot be improvised, especially when the stakes are high. To cope with the consequences of a crisis, it is recommended that members of crisis units be trained. In order to train them, crisis simulation seems to be an ideal tool, because it makes it possible to train a whole group and to provide a crisis management experience to the trainees. However, the credibility, the educational scope and the interactivity of the scenarios deployed in these simulations can be called into question. This research work endeavors to improve the development and implementation of crisis scenarios for simulations at the municipal level. To achieve this objective, four elements are studied: crisis management processes, educational concepts, scenarios used in current crisis management training, and serious games. This study shows that it is necessary to improve the facilitation devices in order to implement more credible, educational and interactive scenarios. The proposed method, based on an empirical approach through the implementation of 21 pilot exercises involving 280 trainees, is structured into four steps: (i) the definition of the facilitation parameters, (ii) the characterization of the data needed by the facilitators, (iii) the modelling of the circulation of information between the facilitators, and (iv) the structuring of a shared medium for facilitators. The experimental framework and protocol allow the first results to be analyzed, confirming the value of the method in a crisis management training context.
Jaccarini, André. "Grammaires modulaires de l'arabe : modélisations, mise en oeuvre informatique et stratégies." Paris 4, 1997. http://www.theses.fr/1997PA040025.
In this work we expound, in a unified theoretical framework, the main linguistic models and the associated parsers we have developed in the D.A.T.A.T (département d'analyse et de traitement automatique des textes, IREMAN-CNRS). The most salient feature of these parsers is that they can work without a lexicon but can be enhanced by the introduction of selective lexicons. Our aim is to design a syntactic monitor for the morphological program in order to reduce the different ambiguities that are inherent in the Arabic writing system. In order to achieve accurate descriptions we have designed modular programs that we can modify according to the "complexification" of linguistic data, together with an evaluation method for grammars. The existing morphological parser without a lexicon can be applied to non-vocalized as well as vocalized Arabic texts in order to extract roots, partially vocalize them automatically and hierarchize ambiguities. In this sense the parser constitutes a powerful tool for research in linguistic engineering itself: the method of grammar variations will allow the design of compact modular grammars applicable to various needs and research areas. Our aim is to create a generator for linguistic applications rather than the mere applications themselves. For example, optical character recognition (OCR) and speech processing require compact linguistic verification modules. The use of enormous lexicons may be a handicap in some computational configurations. Our method allows the calculation of the optimum grammar.
Schenone, Carlo. "Représentation des connaissances pour la mise à jour automatique des systèmes d'information géographique par photos aériennes." Lyon, INSA, 1994. http://www.theses.fr/1994ISAL0108.
This text was prepared within the Photopolis project, whose goal is the updating of Geographical Information Systems (GIS) using aerial photos. The process aims to discover places where verification reveals modifications with respect to their previous state. The system follows different steps. The first one normalises the aerial photos by eliminating optical distortions. The photos are divided into homogeneous parts depending on their texture. Another process will
Barbe, Thierry. "Méthodologie et outils pour la mise en oeuvre automatique d'une synthèse de parole de haute qualité." Grenoble INPG, 1990. http://www.theses.fr/1990INPG0147.
Amoretti, René. "Modélisation et commande optimale d'un réseau de distribution d'eau potable : mise en oeuvre, test et étude des performances sur le réseau de Fium'Orbo." Aix-Marseille 3, 1990. http://www.theses.fr/1990AIX30001.
Full textCano, Emmanuelle. "Cartographie des formations végétales naturelles à l’échelle régionale par classification de séries temporelles d’images satellitaires." Thesis, Rennes 2, 2016. http://www.theses.fr/2016REN20024/document.
Full textForest cover mapping is an essential tool for forest management. Detailed maps, characterizing forest types at a régional scale, are needed. This need can be fulfilled by médium spatial resolution optical satellite images time sériés. This thesis aims at improving the supervised classification procédure applied to a time sériés, to produce maps detailing forest types at a régional scale. To meet this goal, the improvement of the results obtained by the classification of a MODIS time sériés, performed with a stratification of the study area, was assessed. An improvement of classification accuracy due to stratification built by object-based image analysis was observed, with an increase of the Kappa index value and an increase of the reject fraction rate. These two phenomena are correlated to the classified végétation area. A minimal and a maximal value were identified, respectively related to a too high reject fraction rate and a neutral stratification impact.We carried out a second study, aiming at assessing the influence of the médium spatial resolution time sériés organization and of the algorithm on classification quality. Three distinct classification algorithms (maximum likelihood, Support Vector Machine, Random Forest) and several time sériés were studied. A significant improvement due to temporal and radiométrie effects and the superiority of Random Forest were highlighted by the results. Thematic confusions and low user's and producer's accuracies were still observed for several classes. We finally studied the improvement brought by a spatial resolution change for the images composing the time sériés to discriminate classes of mixed forest species. The conclusions of the former study (MODIS images) were confirmed with DEIMOS images. We can conclude that these effects are independent from input data and their spatial resolution. 
A significant improvement was also observed, with an increase of the Kappa index value from 0.60 with MODIS data to 0.72 with DEIMOS data, due to a decrease of the mixed-pixel rate.
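The Kappa index quoted in the two abstracts above is Cohen's kappa, computed from a classification confusion matrix. As a side note, a minimal pure-Python sketch (illustrative only, not code from the thesis; the example matrix is invented):

```python
def cohen_kappa(confusion):
    """Cohen's kappa from a square confusion matrix
    (rows: reference classes, columns: predicted classes)."""
    n = sum(sum(row) for row in confusion)
    # Observed agreement: proportion of samples on the diagonal
    observed = sum(confusion[i][i] for i in range(len(confusion))) / n
    # Expected chance agreement: product of row and column marginal proportions
    expected = sum(
        (sum(confusion[i]) / n) * (sum(row[i] for row in confusion) / n)
        for i in range(len(confusion))
    )
    return (observed - expected) / (1 - expected)

# Hypothetical 2-class confusion matrix
print(round(cohen_kappa([[40, 10], [5, 45]]), 2))  # → 0.7
```

A kappa of 0.60 versus 0.72 therefore compares agreement beyond chance, not raw accuracy, which is why it is the index reported here.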
Ghorayeb, Hicham. "Conception et mise en œuvre d'algorithmes de vision temps-réel pour la vidéo surveillance intelligente." PhD thesis, École Nationale Supérieure des Mines de Paris, 2007. http://pastel.archives-ouvertes.fr/pastel-00003064.
Andry, François. "Mise en œuvre de prédictions linguistiques dans un système de dialogue oral Homme-machine coopératif." Paris 13, 1992. http://www.theses.fr/1992PA132019.
Tugui, Catalin Adrian. "Mise en place d'une démarche de conception pour circuits hautes performances basée sur des méthodes d'optimisation automatique." PhD thesis, Supélec, 2013. http://tel.archives-ouvertes.fr/tel-00789352.
Leblond, Isabelle. "Recalage à long terme d'images sonar par mise en correspondance de cartes de classification automatique des fonds." Brest, 2006. http://www.theses.fr/2006BRES2004.
Full textThis study deals with sea-floor localization problem by using sidescan sonar images. Here, we assume that the considered seafloor areas have previously been mapped. The size of each considered seafloor area is over a hundred square metres and the images can be recorded with different courses. A direct matching of theses images is obviously unacceptable. Therefore, we consider a two step method : extraction of symbolic data named “landmarks” and a matching process. Before these two steps, we begin by presenting the different data. We particularly focus on the physics of the seafloor, in order to grasp the different types of usable landmarks. We present also the problematic of sonar registration, both in terms of difficulties and proposed solutions to solve them. A second part is devoted to the preprocessing of the data : correction of the TVG (Time Varying Gain) by a normalization of the images according to the grazing angle and the despecklelisation by thresholding on wavelet decomposition. Concerning the landmark extraction, we apply a segmentation/classification of the data. Classifying parameters extracted from a Gabor wavelet decomposition are used to realize a classification by the nearest neighbour in several stages. The obtained landmarks contain different kinds of seafloor and their boundaries. The registration is achieved using mainly two steps. The first step is a symbolic registration, where the classification results are matched more an more accurately. The second one is a quantitative registration, where data, extracted directly from images, are matched
Eynard, Jean-Paul Bolliet Louis. "Réalisation d'outils logiciels pour la mise en œuvre de microprocesseurs dans la conduite automatique de procédés complexes." S.l. : Université Grenoble 1, 2008. http://dumas.ccsd.cnrs.fr/dumas-00334254.
Friot, Laurent. "Méthodologie de mise en oeuvre d'une régulation de climatisation par modèle interne appliquée au transport ferroviaire." Poitiers, 1995. http://www.theses.fr/1995POIT2355.
Gadhoumi, Kais. "Mise en oeuvre d'un algorithme de traitement de la parole basé sur la quantification vectorielle pour une prothèse cochléaire." Sherbrooke : Université de Sherbrooke, 2001.
Voisin, Yvon. "Détermination d'un critère pour la mise au point automatique des caméras pour des scènes à faible profondeur de champ : contribution à la mise au point des microscopes." Besançon, 1993. http://www.theses.fr/1993BESA2016.
Eggert, Elmar. "La dérivation toponymes-gentilés en français : mise en évidence des régularités utilisables dans le cadre d'un traitement automatique." Tours, 2002. http://www.theses.fr/2002TOUR2032.
Sabatier, Laura. "Étude des conséquences de traitements physiques sur le cheveu, de l’échelle moléculaire à celle de la fibre." Thesis, Université Paris-Saclay, 2020. http://www.theses.fr/2020UPASS076.
Thermomechanical hair styling is favoured by users for easy and temporary reshaping of hair. However, the result is not always up to expectations, notably because of poor shape stability over time and possible hair damage. In this work, we aim to improve hairstyling devices. To this end, we need to understand the effects of such treatments on hair, in order to determine conditions that allow the best shape holding while minimizing hair damage. We use tensile testing, X-ray diffraction and infrared spectroscopy. First, we studied the structural organization of natural hair and highlighted a "core-skin" distribution of structures, with a regular core that is all the more off-centred as the curvature is high. We then identified the main parameters of thermomechanical reshaping, namely temperature, stress and application time, and evaluated the effects of these parameters on the mechanical behaviour and nanostructure of hair. Our study shows that the applied stress is a key factor: we defined the stress range that preserves the hair structure and its mechanical properties, and the one that leads to degradation or even to a beta-sheet transition. The efficiency of the different treatment conditions in producing a long-lasting shape was then evaluated. In addition, we analysed the structural mechanisms occurring during stretching for native and pretreated hair, using X-ray microdiffraction coupled with continuous stretching of the fibre, which allowed us to monitor a beta-sheet structure in hair during stretching. The original results obtained in this work, bridging internal molecular mechanisms and the macroscopic behaviour of hair, will allow the development of new thermomechanical treatments at industrial scale.
Maurine, Patrick. "Développement et mise en oeuvre de méthodologies d'étalonnage de robots manipulateurs industriels." Montpellier 2, 1996. http://www.theses.fr/1996MON20268.
Lahoussine Aquede. "Évaluation des méthodes semi-numérique et numérique pour la mise à jour de la carte topographique à l’échelle du 1/50 000 à partir des données de télédétection." Mémoire, Université de Sherbrooke, 1993. http://hdl.handle.net/11143/7869.
Labiche, Alexandre. "Contribution à l’étude du stroma des carcinomes ovariens par une analyse morphologique, morphométrique et expérimentale, à l’échelle histologique et ultrastructurale : mise en évidence de l’importance de la vascularisation." Caen, 2008. http://www.theses.fr/2008CAEN4002.
Ovarian carcinoma is a cancer of very poor prognosis, as diagnosis occurs at an advanced stage. A better understanding of the natural history of this cancer is of critical interest in order to discover new targeted therapies and to determine new markers for prognosis prediction and therapeutic follow-up. We chose to study the tumour stroma, which probably plays a crucial role in the tumorigenesis and drug resistance of ovarian carcinomas. This study highlighted that the amounts of blood vessels, mast cells and stroma can be used as prognosis indicators. We then focused on peritoneal metastases, which are responsible for the recurrence of this disease, and finalized an in vivo mouse model of peritoneal carcinosis to investigate the different steps of stroma development. Finally, we tested on this model the effect of an anti-angiogenic drug, associated or not with classical chemotherapy.
Wu, Yao Kuang Marie-Blandine. "Robustesse des systèmes auteurs multimédias : contribution théorique et mise en oeuvre." Paris 8, 2000. http://www.theses.fr/2000PA081746.
Martinez, Cristian. "Grammaires locales étendues : principes, mise en œuvre et applications pour l’extraction de l’information." Thesis, Paris Est, 2017. http://www.theses.fr/2017PESC1075/document.
Local grammars are a descriptive formalism for linguistic phenomena, commonly represented as directed graphs. They are used to recognize and extract patterns in a text, but they have inherent limits in dealing with unexpected variations, as well as in their capacity to access exogenous knowledge, that is, information to be retrieved during the analysis from external resources, which may be useful to normalize, enhance, validate or link the recognized patterns. In this thesis, we introduce the notion of extended local grammar, a formalism that extends the classic model of local grammars in two ways: on the one hand, by adding arbitrary conditional functions, called extended functions, which are not predefined and are evaluated from outside the grammar; on the other hand, by allowing the parsing engine to trigger events that can also be processed as extended functions. The work presented here is divided into three parts. In the first part, we study the principles governing the construction of extended local grammars. Then, we present a proof-of-concept corpus-processing tool that implements the proposed formalism. Finally, we study techniques to extract information from both well-formed and noisy texts, focusing on the coupling of external resources and non-symbolic methods in the construction of our grammars, and we highlight the suitability of this approach for overcoming the inherent limitations of classical local grammars.
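The core idea above, grammar arcs that may carry externally evaluated conditional functions rather than only literal tokens, can be illustrated with a toy sketch. The gazetteer lookup, token sequence and function names below are invented for illustration and are not taken from the thesis:

```python
def is_known_city(token, gazetteer=frozenset({"paris", "tours", "brest"})):
    """An "extended function": a condition evaluated outside the grammar,
    here standing in for a lookup in an external resource."""
    return token.lower() in gazetteer

# A linear local grammar: two literal arcs, then one extended-function arc
GRAMMAR = ["lives", "in", is_known_city]

def matches(tokens, grammar=GRAMMAR):
    """Return True if the token sequence traverses every arc of the path."""
    if len(tokens) != len(grammar):
        return False
    for token, arc in zip(tokens, grammar):
        ok = arc(token) if callable(arc) else token == arc
        if not ok:
            return False
    return True

print(matches(["lives", "in", "Paris"]))   # → True (gazetteer validates the city)
print(matches(["lives", "in", "Gotham"]))  # → False (unknown to the resource)
```

A full extended local grammar is of course a graph with branching and events rather than a single path; the sketch only shows how a callable arc lets external knowledge validate a recognized pattern.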
Binet, Karine. "Mise en oeuvre d'un système d'aide à la détection de cibles terrestres camouflées." Rennes 1, 2005. http://www.theses.fr/2005REN1S085.
Bouyoucef, El Khier. "Contribution à l'étude et la mise en oeuvre d'indicateurs quantitatifs et qualitatifs d'estimation de la complexité pour la régulation du processus d'auto-organisation d'une structure neuronale modulaire de traitement d'information." Paris 12, 2007. http://www.theses.fr/2007PA120049.
Classification is a key tool in numerous domains, notably medical and industrial ones. The main weakness of classifiers lies in the reliability of the theoretical or empirical models they exploit. Within the framework of this doctoral thesis, we study the notion of the complexity of data representing a classification problem, in the context of a new tree-like neural data-processing structure termed « Tree-like Divide To Simplify » (T-DTS). In this approach, a classification problem is handled by several local models adapted to the difficulty of the problem. We used and built quantitative and qualitative complexity indicators whose role is to obtain an adequacy between the complexity and the processing structure. Several databases, derived from artificial and real classification problems, were tested to validate the pertinence of the complexity values, to compare the performance of the T-DTS structure under different operating modes, and to compare it with other classification algorithms.
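One classical quantitative complexity indicator of the kind discussed above is Fisher's discriminant ratio, which measures how separable two classes are (the thesis's own indicators are not specified here; the data below are hypothetical). A minimal 1-D sketch:

```python
def fisher_ratio(class_a, class_b):
    """Fisher discriminant ratio for two 1-D classes:
    (mean_a - mean_b)^2 / (var_a + var_b).
    Higher values indicate an easier (less complex) separation."""
    def mean(xs):
        return sum(xs) / len(xs)
    def var(xs):
        m = mean(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)
    ma, mb = mean(class_a), mean(class_b)
    return (ma - mb) ** 2 / (var(class_a) + var(class_b))

easy = fisher_ratio([0.1, 0.2, 0.3], [5.0, 5.1, 5.2])    # well-separated classes
hard = fisher_ratio([0.1, 0.2, 0.3], [0.15, 0.25, 0.35])  # heavily overlapping
print(easy > hard)  # → True
```

In a T-DTS-like scheme, such an indicator could be evaluated on each data subset to decide whether a single local model suffices or the subset should be split further.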
Bouyoucef, El Khier Madani Kurosh. "Contribution à l'étude et la mise en oeuvre d'indicateurs quantitatifs et qualitatifs d'estimation de la complexité pour la régulation du processus d'auto-organisation d'une structure neuronale modulaire de traitement d'information." Créteil : Université de Paris-Val-de-Marne, 2008. http://doxa.scd.univ-paris12.fr:8080/theses-npd/th0405301.pdf.
Tencé, Marcel. "Un système informatique pour l'acquisition et le traitement des données en microscopie électronique : réalisation, mise au point et application." Paris 11, 1987. http://www.theses.fr/1987PA112449.
As a consequence of its general layout, the new generation of scanning transmission electron microscopes (STEM) is particularly well suited to being interfaced with a computer, which serves several functions: controlling the microscope parameters used for image acquisition, storing the recorded data, and processing them a posteriori. STEM instruments have several specific characteristics that determine the technical choices for the digital system hardware and for building the required interfaces: the capability of simultaneously recording the data delivered by the different detection channels for one probe position on the specimen; the handling of the sequences of energy-filtered images necessary for elemental mapping; and, finally, the replication of images under a given set of working conditions, which is the key to applying cross-correlation techniques and estimating image parameters such as point resolution or signal-to-noise ratio. This work describes the hardware that has been built and the software that has been elaborated to fulfil these goals. As a conclusion, we present three applications made with this system: i) on-line measurement of the fractal dimension of aggregates, ii) estimation of the spatial resolution in EELS chemical mapping, iii) quantitative development of these methods with a reasonable extrapolation of the detection limits towards the identification of a single atom.
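The cross-correlation of replicated images mentioned above can be illustrated in one dimension: the displacement between two acquisitions of the same feature is recovered as the lag that maximizes their cross-correlation. The signals below are invented, and a real implementation would work on 2-D images, typically via the FFT:

```python
def best_shift(reference, shifted, max_lag=5):
    """Estimate the displacement between two replicated 1-D signals by
    locating the peak of their cross-correlation (a basic registration step)."""
    def corr_at(lag):
        # Sum of products over the overlapping region at this lag
        return sum(reference[i] * shifted[i + lag]
                   for i in range(len(reference))
                   if 0 <= i + lag < len(shifted))
    return max(range(-max_lag, max_lag + 1), key=corr_at)

signal  = [0, 0, 1, 3, 1, 0, 0, 0]
replica = [0, 0, 0, 0, 1, 3, 1, 0]  # same feature, displaced by two samples
print(best_shift(signal, replica))  # → 2
```

The same peak-locating idea underlies the resolution and signal-to-noise estimates from replicated images described in the abstract.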