Dissertations on the topic "Evaluation et Optimisation"
Cite a source in APA, MLA, Chicago, Harvard, and other citation styles
Consult the top 50 dissertations for research on the topic "Evaluation et Optimisation".
Next to every entry in the bibliography, an "Add to bibliography" option is available. Use it, and the bibliographic reference for the selected work will be formatted automatically in the required citation style (APA, MLA, Harvard, Chicago, Vancouver, etc.).
You can also download the full text of the scholarly publication as a PDF and read an online abstract of the work, provided the relevant parameters are available in the metadata.
Browse dissertations from a wide range of disciplines and compile your bibliography correctly.
Malik, Salman. „Evaluation et Optimisation des Réseaux Sans Fil Denses“. PhD thesis, Université Pierre et Marie Curie - Paris VI, 2012. http://tel.archives-ouvertes.fr/tel-00719083.
Leclerc, Xavier. „Evaluation et optimisation de stratégies de correction génique“. Thesis, Evry-Val d'Essonne, 2008. http://www.theses.fr/2008EVRY0029/document.
Current strategies for gene therapy of inherited diseases consist in adding functional copies of the defective gene. Yet an attractive alternative would be to correct the mutated gene that already exists in the affected individual. While many gene correction agents have been described for their ability to correct mutations, no direct comparison of their repair efficiency has ever been made. Here, we present a side-by-side quantitative comparison of the correction efficiency of different forms of donor nucleic acids, including synthetic DNA oligodeoxynucleotides, large DNA fragments and sequences carried by a recombinant adeno-associated virus. We also show that pretreatment of cells with doxorubicin (a topoisomerase II inhibitor) or with phleomycin (a glycopeptide antibiotic), both substances known to induce DNA damage, can increase the correction efficiency of the different repair tools.
Leclerc, Xavier Kichler Antoine. „Evaluation et optimisation de stratégies de correction génique“. S. l. : S. n, 2008. http://www.biblio.univ-evry.fr/theses/2008/2008EVRY0029.pdf.
Mambo, Shadrack. „Optimisation et évaluation des performance en traitement d'image“. Thesis, Paris Est, 2018. http://www.theses.fr/2018PESC1125/document.
D.Tech thesis summary: The importance of medical imaging as a core component of several medical applications and healthcare diagnosis cannot be overemphasised. Integration of useful data acquired from different images is vital for proper analysis of the information contained in the images under observation. For the integration process to be successful, a procedure referred to as image registration is necessary. The purpose of image registration is to align two images in order to find a geometric transformation that brings one image into the best possible spatial correspondence with the other by optimising a registration criterion. The two images are known as the target image and the source image. Image registration methods consist of referencing the two images with control points. This is followed by a registration transformation that relates the two images and a similarity metric function that measures the closeness or degree of fit between the target image and the source image. Finally, an optimiser seeks an optimal transformation inside the defined solution search space. This research presents an automated image registration algorithm for multimodal registration of lung Computed Tomography (CT) scan pairs, in which a regular-step gradient descent optimisation technique is compared with evolutionary optimisation. The aim of this research is to carry out optimisation and performance evaluation of image registration techniques in order to provide medical specialists with an estimate of how accurate and robust the registration process is. Lung CT scan pairs are registered using mutual information as a similarity measure, affine transformation and linear interpolation.
In order to minimise the cost function, an optimiser, which seeks the optimal transformation inside the defined search space, is applied. A transformation model depending on transformation parameters was determined, and a similarity metric based on voxel intensity was identified. By fitting transformations to control points, three transformation models were compared: affine transformation produced the best recovered image compared with non-reflective similarity and projective transformations. The results of this research compare well with documented results from the EMPIRE 10 Challenge and conform to both theoretical principles and practical applications. The contribution of this research is its potential to increase the scientific understanding of image registration of anatomical body organs. It lays a basis for further research in performance evaluation of registration techniques and validation of procedures for other types of algorithms and image registration application areas, such as remote sensing, satellite communication, biomedical engineering, robotics, geographical information systems and mapping, among others.
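The registration criterion this abstract relies on, mutual information between the intensities of the target and source images, can be sketched in a few lines. The snippet below is our own minimal illustration, not the thesis's implementation; the function name, bin count and toy images are invented for the example:

```python
import numpy as np

def mutual_information(a, b, bins=16):
    # Joint intensity histogram of the two images, normalized to a joint pmf
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of image a
    py = pxy.sum(axis=0, keepdims=True)   # marginal of image b
    nz = pxy > 0                          # skip empty bins to avoid log(0)
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(0)
target = rng.random((64, 64))
# A perfectly aligned (identical) image shares maximal information with the
# target; an unrelated image shares almost none — this is what an optimiser
# exploits when it searches for the best-aligning transformation.
assert mutual_information(target, target) > mutual_information(target, rng.random((64, 64)))
```

In an actual registration loop, the candidate affine transformation would be applied to the source image (with linear interpolation) before each evaluation of this criterion.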
Léger, Crémoux Séverine. „Evaluation et optimisation de chemins combinatoires de circuits VLSI“. Montpellier 2, 1998. http://www.theses.fr/1998MON20129.
Roche, Gilles. „L'angiographie dynamique des membres inferieurs : evaluation, optimisation et protocoles“. Rennes 1, 1992. http://www.theses.fr/1992REN1M036.
Lioris, Eugénie. „Evaluation et optimisation de systèmes de taxis collectifs en simulation“. Marne-la-Vallée, ENPC, 2010. http://www.theses.fr/2010ENPC1009.
The economic development of an urban region depends greatly on ease of access. The taxi is recognised throughout the world as a convenient means of transport providing a high-quality personal service for both normal and emergency purposes. Despite these advantages, however, the taxi is also too expensive for most people to use routinely for their daily transport. If a taxi, and hence the expense, could be shared between passengers, there would be the potential to preserve almost the original standard of service whilst increasing vehicle productivity by becoming “collective”. This is not a new concept: it was studied in 1971 by P. H. Fargier and G. Cohen, who suggested and studied such a system. Despite this, even today it may be regarded as a somewhat premature and revolutionary idea for a strictly regulated market. Reviewing the form of operation applied to this mode of transport could allow a collective taxi system to form an acceptable part of the public transport system, offering this service to the majority of the population without being limited to the privileged on the basis of cost. Chapter I of this document presents a review of existing “Transport on Demand” systems at various locations in the world, describing the various models developed on this subject as well as the corresponding algorithms treating the problems encountered. This is followed by an introduction to the project of “Collective Taxis” studied in this work; finally, we give an overview of the rest of this document. Chapter II brings up the multiple problems and questions that have to be dealt with during the study of a collective taxi system covering an entire urban area. At this point, the need to construct a simulator is justified and its architecture is explained: its design comprises separate mechanical and decision-making components connected to each other.
The technical choices concerning the chosen programming language and software are discussed, together with the different exploitation modes of the simulator, the input data and the information produced by the simulation runs. Chapter III provides an insight into the decentralised approach, consisting of a system based entirely on casual business obtained from clients at the roadside, with no facility for advance booking. This chapter extensively describes the modelling of a discrete event simulator specifically constructed for this management approach, followed by the decision-making issues concerning this approach and the algorithms developed to provide solutions and allow optimised control of the system. Chapter IV is a thorough debrief of a long campaign of numerical simulations covering a fictitious but realistic example. The principal idea of this study is to provide a firm methodology with which one should be able to evaluate the applied strategy and measure the system performance. Initially, methods of analysing a single simulation are illustrated, showing the various outcomes and conclusions that can be produced from the simulation results. Thus the quality of service provided by a given policy can be judged and quantified, as well as the associated operating costs. Further on, a series of simulations, in which the values of multiple parameters vary, is explored with the aim of realising all possible compromises between contradictory criteria for optimising real-time management and system dimensioning. This process also points out the influence of exogenous data, such as the demand (geometry and intensity), on the performance of the system. Chapter V handles system management from the central dispatching point of view, where prospective clients call a control centre to make a pre-booking (the centralised mode of operation). A mixed mode can also be considered by superimposing the centralised mode onto the decentralised one.
The mechanical part of the present simulator, which is responsible for processing the system events, is designed to enable the management of the three possible approaches (decentralised, centralised and mixed). Nevertheless, the decisional part of the mixed and centralised modes has not been completely developed because of lack of time (the real-time management of the centralised approach is much more complex than that of the decentralised one). Consequently, numerical results are not presented for the centralised and mixed modes. For the moment, this chapter is limited to the description of the model and events of the centralised and mixed scenarios, touching on the algorithmic part up to a certain point. Chapter VI develops some conclusions regarding this work and possible future developments.
Li, Jie Xie Xiaolan. „Evaluation et optimisation des performances des systèmes de production distribution“. [S.l.] : [s.n.], 2006. ftp://ftp.scd.univ-metz.fr/pub/Theses/2006/Li.Jie.SMZ0610.pdf.
Le, Menn Frédéric. „Evaluation et optimisation de la locomotion de systèmes mobiles reconfigurables“. Paris 6, 2007. http://www.theses.fr/2007PA066347.
Aline, Michel. „Evaluation et optimisation de performances en délai en technologie CMOS submicronique“. Montpellier 2, 2001. http://www.theses.fr/2001MON20075.
Vieille, Laurent. „Bases de donnees deductives : evaluation et optimisation de programmes logiques recursifs“. Paris 6, 1988. http://www.theses.fr/1988PA066587.
Yin, Fei. „Evaluation et optimisation de performance dans les réseaux IEEE 802. 16“. Paris 6, 2009. http://www.theses.fr/2009PA066123.
Der volle Inhalt der QuelleMartinez, Medina Lourdes. „Optimisation des requêtes distribuées par apprentissage“. Thesis, Grenoble, 2014. http://www.theses.fr/2014GRENM015.
Distributed data systems are becoming increasingly complex. They interconnect devices (e.g. smartphones, tablets, etc.) that are heterogeneous, autonomous, either static or mobile, and with physical limitations. Such devices run applications (e.g. virtual games, social networks, etc.) for the online interaction of users producing/consuming data on demand or continuously. The characteristics of these systems add new dimensions to the query optimization problem, such as multiple optimization criteria, scarce information on data and the lack of a global system view, among others. Traditional query optimization techniques focus on semi-autonomous (or non-autonomous) systems. They rely on information about data and make strong assumptions about the system behavior. Moreover, most of these techniques are centered on the optimization of execution time only. The difficulty of evaluating queries efficiently in today's applications motivates this work to revisit traditional query optimization techniques. This thesis faces these challenges by adapting the Case-Based Reasoning (CBR) paradigm to query processing, providing a way to optimize queries when there is no prior knowledge of data. It focuses on optimizing queries using cases generated from the evaluation of similar past queries. A query case comprises: (i) the query, (ii) the query plan and (iii) the measures (computational resources consumed) of the query plan. The thesis also concerns the way the CBR process interacts with the query plan generation process. This process uses classical heuristics and makes decisions randomly (e.g. when there are no statistics for join ordering and for the selection of algorithms or routing protocols). It also (re)uses cases (existing query plans) for similar query parts, improving query optimization and therefore evaluation efficiency. The propositions of this thesis have been validated within the CoBRa optimizer developed in the context of the UBIQUEST project.
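The case structure enumerated in this abstract — (i) the query, (ii) the plan, (iii) the measured resource consumption — lends itself to a small sketch of case reuse. The code below is a hypothetical illustration only: the class, the crude keyword-overlap similarity and the sample queries are all our own invention, not the CoBRa optimizer's actual design:

```python
from dataclasses import dataclass

@dataclass
class QueryCase:
    query: str      # (i) the past query, as text
    plan: list      # (ii) the query plan that was used (ordered operators)
    measures: dict  # (iii) resources consumed when that plan ran

def most_similar(case_base, query_terms):
    # Crude similarity: Jaccard overlap between keyword sets.
    def sim(case):
        terms = set(case.query.lower().split())
        return len(terms & query_terms) / max(len(terms | query_terms), 1)
    return max(case_base, key=sim)

base = [
    QueryCase("select name from users join orders",
              ["scan users", "hash join orders"], {"time_ms": 120}),
    QueryCase("select avg(price) from products",
              ["scan products", "aggregate"], {"time_ms": 45}),
]
# A new, similar query can reuse the plan of the closest past case
# instead of making a random heuristic decision.
reused = most_similar(base, {"select", "users", "join", "payments"})
print(reused.plan)
```

A real CBR optimizer would also adapt the retrieved plan to the new query and store the outcome as a new case, closing the retrieve-reuse-revise-retain loop.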
Hmad, Ouadie. „Evaluation et optimisation des performances de fonctions pour la surveillance de turboréacteurs“. Thesis, Troyes, 2013. http://www.theses.fr/2013TROY0029.
This thesis deals with monitoring systems for turbojet engines. The development of such systems requires a performance evaluation and optimization phase prior to their introduction in operation. The work focused on this phase, and more specifically on the performance of the detection and prognostic functions of two systems. Performance metrics related to each of these functions, as well as their estimation, have been defined. The monitored systems are, on the one hand, the start sequence for the detection function and, on the other hand, oil consumption for the prognostic function. The data used come from flights in operation without degradation; simulations of degradation were therefore necessary for the performance assessment. Optimization of detection performance was obtained by tuning a threshold on the decision statistic, taking into account airline requirements in terms of good detection rate and false alarm rate. Two approaches were considered and their performances compared in their best configurations. Prognostic performance for oil over-consumption, simulated using Gamma processes, was assessed on the basis of the relevance of the maintenance decisions induced by the prognostic. This thesis has allowed the performance of the two considered functions to be quantified and improved to meet airline requirements. Other possible improvements are proposed as prospects to conclude this thesis.
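For intuition, a monotonically increasing degradation such as oil over-consumption can be simulated as a Gamma process (independent Gamma-distributed increments), with the prognostic reduced to estimating when the cumulative degradation first crosses a failure threshold. This is a generic sketch with parameter values of our own choosing, not the thesis's model:

```python
import numpy as np

def gamma_process_paths(n_paths, n_steps, shape, scale, rng):
    # Independent Gamma increments -> non-decreasing degradation paths
    inc = rng.gamma(shape, scale, size=(n_paths, n_steps))
    return inc.cumsum(axis=1)

def first_crossing(paths, threshold):
    # First step index where each path exceeds the threshold (-1: never)
    crossed = paths >= threshold
    return np.where(crossed.any(axis=1), crossed.argmax(axis=1), -1)

rng = np.random.default_rng(42)
paths = gamma_process_paths(1000, 200, shape=0.5, scale=1.0, rng=rng)
t = first_crossing(paths, threshold=50.0)
# Mean increment is 0.5 per step, so crossing occurs near step 100 on average;
# the spread of t is what a prognostic's uncertainty interval must cover.
print(t[t >= 0].mean())
```

In a prognostic setting, the shape and scale parameters would be fitted to observed consumption data, and the distribution of crossing times would drive the maintenance decision.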
Gauvrit, Jean-Yves. „Evaluation et optimisation des techniques d'ARM avec injection en pathologie neuro-vasculaire“. Lille 2, 2005. http://www.theses.fr/2005LIL2S010.
MRI and its post-processing software hold a key place in the exploration of cerebral arteriovenous malformations and intracranial aneurysms. For cerebral arteriovenous malformations (cAVM), the non-invasive angiography techniques (MRA and CT angiography) provide a precise anatomical study but, unlike conventional angiography, do not yield any hemodynamic information, however essential. A new MRA technique, dynamic MRA, based on repeated acquisition of images and image subtraction, makes it possible to obtain this kinetic information. Our work showed the interest of this technique in the exploration of cAVM, but also in other intracranial vascular diseases (dural fistulas, aneurysms, venous thrombosis), both in the detection of pathology and in the definition of prognostic factors, in particular the hemorrhagic risk. Initially acquired with a 2D technique, dynamic MRA benefited from parallel imaging, which allowed the development of a three-dimensional (3D) dynamic MRA combining high spatial resolution and short temporal resolution. We showed the interest of 3D dynamic MRA in the follow-up of radiosurgically treated cAVM by analyzing the reduction in size of the nidus or the disappearance of the venous drainage. Endovascular treatment of intracranial aneurysms has, for some years, occupied an important place in therapeutic management. Clinical and radiological follow-up is essential because the risk of recurrence remains poorly documented. Whereas contrast-enhanced MRA techniques are the reference sequences in cervical vascular pathology, they are little developed for intracranial exploration because of venous overlap. However, their capacity to obtain high contrast and to reduce flow artefacts makes them well suited to the monitoring of aneurysms treated with coils. Our work showed the contribution of gadolinium-enhanced MRA in the detection of aneurysmal recurrences, and also its interest in the management of early and late recanalisations.
Baskiotis, Nicolas. „Evaluation d'algorithmes pour et par l'apprentissage“. Paris 11, 2008. http://www.theses.fr/2008PA112275.
One of the main concerns in machine learning applications is the a priori choice of an algorithm likely to yield the best classification performance. Our work is inspired by research in combinatorial optimisation on the phase transition problem. We suggest a dual approach to the standard view of this problem through metalearning. Our goal is to build a competence map, according to descriptors of the problem space, which makes it possible to identify the regimes where the system's performance is steady. We assess this approach on C4.5. The second part of our work deals with machine learning problems in software testing. More precisely, we study a statistical structural method of software testing that samples the so-called feasible paths in the control graph of the program to be tested. In certain cases, the proportion of feasible paths is very low, which hinders this method. Our goal is to design a learning and optimisation system that yields a random generator biased towards the feasible paths and that guarantees a suitable sampling of the target concept. We designed the MLST system, which implements active learning in a graph. We tested our work on both real and artificial problems and showed that our system achieves significant improvement in the coverage of target concepts with respect to the initial data.
Henriet, Julien. „Evaluation, optimisation et validation de protocoles distribués de gestion de la concurrence pour les interactions coopératives“. Besançon, 2005. http://www.theses.fr/2005BESA2023.
Cooperative work over the Internet introduces constraints in terms of access to and modification of shared objects. Users need to access the same objects concurrently in real time, and at each moment the same image of the production area must be shared by each connected site. The first chapter of this thesis is a state of the art of communication protocols, consistency management protocols and telemedicine platforms that allow collaborative work. The second chapter presents a new protocol for consistency management over this kind of platform. A probabilistic study proves and evaluates the optimization and the cost of this new protocol, called the Optimistic Pilgrim. An analysis, optimization and validation of the Chameleon protocol, dedicated to communication management over a collaborative platform, is presented in the third chapter. Finally, the fourth chapter evaluates the performance of these protocols through the implementation of a prototype, and presents the suitability of each protocol for each tool of the collaborative teleneurology platform of the TéNeCi project.
Grand, Christophe. „Optimisation et commande des modes de déplacement des systèmes locomoteurs hybrides roue-patte : application au robot Hylos“. Paris 6, 2004. http://www.theses.fr/2004PA066460.
Vallet, Gilles. „Evaluation et optimisation d'un programme de réentraînement à l'effort individualisé chez des bronchopneumopathes chroniques obstructifs“. Montpellier 1, 1996. http://www.theses.fr/1996MON14002.
Der volle Inhalt der QuelleDine, Abdelhamid. „Localisation et cartographie simultanées par optimisation de graphe sur architectures hétérogènes pour l’embarqué“. Thesis, Université Paris-Saclay (ComUE), 2016. http://www.theses.fr/2016SACLS303/document.
Simultaneous Localization And Mapping (SLAM) is the process that allows a robot to build a map of an unknown environment while at the same time determining its position on this map. In this work, we are interested in the graph-based SLAM method, which uses a graph to represent and solve the SLAM problem. Graph optimization consists in finding a graph configuration (trajectory and map) that best matches the constraints introduced by the sensor measurements. Graph optimization is characterized by a high computational complexity that requires substantial computational and memory resources, particularly to explore large areas. This limits the use of graph-based SLAM in real-time embedded systems. This thesis contributes to the reduction of this computational complexity. Our approach is based on two complementary axes: data representation in memory, and implementation on embedded heterogeneous architectures. For the first axis, we propose an incremental data structure to efficiently represent and then optimize the graph. For the second axis, we explore the use of recent heterogeneous architectures to speed up graph-based SLAM and propose an efficient implementation model for embedded applications. We highlight the advantages and disadvantages of the evaluated architectures, namely GPU-based and FPGA-based systems-on-chip.
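For intuition about what graph optimization computes, here is a one-dimensional toy pose graph solved by linear least squares: each edge constrains the difference between two poses, and the optimizer finds the configuration that best reconciles slightly inconsistent odometry and loop-closure constraints. This is entirely our own illustration; real SLAM graphs are nonlinear, sparse and far larger:

```python
import numpy as np

# 1-D pose graph: three poses, two odometry edges and one loop-closure edge.
# Each edge is (i, j, z) constraining x_j - x_i ≈ z; the measurements are
# deliberately slightly inconsistent, as real sensor data would be.
edges = [(0, 1, 1.0), (1, 2, 1.1), (0, 2, 2.0)]
n = 3
A = np.zeros((len(edges) + 1, n))
b = np.zeros(len(edges) + 1)
for row, (i, j, z) in enumerate(edges):
    A[row, i], A[row, j], b[row] = -1.0, 1.0, z
A[-1, 0], b[-1] = 1.0, 0.0  # gauge constraint: anchor x_0 at the origin
x, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.round(x, 3))  # configuration that best satisfies all constraints
```

In 2D/3D SLAM the constraints are nonlinear (rotations), so this linear solve becomes the inner step of an iterative Gauss-Newton or Levenberg-Marquardt scheme over a sparse system.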
Corsi, Marco. „Evaluation et optimisation de portefeuille dans un modèle de diffusion avec sauts en observations partielles : aspects théoriques et numériques“. Paris 7, 2007. http://www.theses.fr/2007PA077038.
This thesis investigates some aspects of portfolio optimization under incomplete observation. The work is organized in three parts that analyze the following topics. Part 1: portfolio optimization under partial observation in a jump-diffusion model. Part 2: indifference pricing under partial observation in a jump-diffusion model. Part 3: numerical approximation by quantization of discrete-time control problems under partial observation, with applications in finance. In the first two parts we consider the case of continuous-time observation, while in the third we analyze the case of discrete-time observation.
HEYER, MICHAEL. „Optimisation et evaluation d'un systeme photovoltaique relie au reseau en tenant particulierement compte de l'interface onduleur“. Paris, ENMP, 1995. http://www.theses.fr/1995ENMP0598.
Raynal, Mathieu. „Systèmes de saisie de textes pour les personnes handicapées moteur : optimisation, interaction et mesure de l'utilisabilité“. Toulouse 3, 2005. http://www.theses.fr/2005TOU30274.
This thesis tackles problems of optimization, design and usability of new Augmentative and Alternative Communication (AAC) systems for motor-disabled people. The problem is to decrease the distance covered by the pointer of the pointing device during text entry. This work suggests two optimizations: a) the application of genetic algorithms to optimize the character layout of a soft keyboard; b) the KeyGlass system, which proposes additional characters (thanks to character prediction systems) near the last character entered. To evaluate the contribution of these optimizations and their usability, we designed the evaluation platform E-Assiste, which aims to fill a lack of test corpora and evaluation environments and to allow the study of the uses of our systems in situ.
PLATONI, KALLIOPI. „Optimisation par l'analyse de decomposition en valeurs singulieres (svd) et evaluation quantitative des irradiations en conditions stereotaxiques“. Toulouse 3, 1998. http://www.theses.fr/1998TOU30172.
Der volle Inhalt der QuelleBen, Cheikh Henda. „Evaluation et optimisation de la performance des flots dans les réseaux stochastiques à partage de bande passante“. Thesis, Toulouse, INSA, 2015. http://www.theses.fr/2015ISAT0013/document.
We study queueing-theoretic models for the performance evaluation and optimization of bandwidth-sharing networks. We first propose simple and explicit approximations for the main performance metrics of elastic flows in bandwidth-sharing networks operating under balanced fairness. Assuming that an admission control mechanism is used to limit the number of simultaneous streaming flows, we then study the competition for bandwidth between elastic and streaming flows and propose performance approximations based on a quasi-stationary assumption. Simulation results show the good accuracy of the proposed approximations. We then investigate the energy-delay tradeoff in bandwidth-sharing networks in which nodes can regulate their speed according to the load of the system. Assuming that the network is initially congested, we investigate the rate allocation to the classes that drains the network with minimum total energy and delay cost. We formulate this optimal resource allocation problem as a Markov decision process, which proves to be both analytically and computationally challenging. We thus propose to solve this stochastic problem using a deterministic fluid approximation. For a single link shared by an arbitrary number of classes, we show that the optimal fluid solution follows the well-known cμ rule, and we give an explicit expression for the optimal speed. Finally, we consider cloud computing platforms under the SaaS model. Assuming a fair share of the capacity of physical resources between virtual machines executed concurrently, we propose simple queueing models for predicting the response times of applications. The proposed models explicitly take into account the different behaviors of the different classes of applications (interactive, CPU-intensive or permanent applications). Experiments on a real virtualized platform show that the mathematical models predict response times accurately.
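The cμ rule mentioned in this abstract is an index policy: with holding cost c_k and service rate μ_k for class k, always serve the non-empty class with the largest product c_k·μ_k. A toy illustration of computing that priority order (class names and values invented for the example):

```python
# Toy cμ-rule priority: rank traffic classes by holding cost × service rate.
classes = {
    "voice": {"c": 3.0, "mu": 2.0},  # c: cost per unit time in system; mu: service rate
    "video": {"c": 5.0, "mu": 0.5},
    "data":  {"c": 1.0, "mu": 4.0},
}
priority = sorted(classes, key=lambda k: classes[k]["c"] * classes[k]["mu"],
                  reverse=True)
print(priority)  # indices: voice = 6.0, data = 4.0, video = 2.5
```

Note that the class with the highest holding cost ("video") is not served first: the rule trades cost against how quickly a class can be cleared, which is exactly what the fluid analysis in the abstract optimizes.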
Schemeleva, Kseniya. „Optimisation de la politique de lotissement et de séquencement pour une ligne de production soumise aux aléas“. PhD thesis, Ecole Nationale Supérieure des Mines de Saint-Etienne, 2010. http://tel.archives-ouvertes.fr/tel-00667950.
Der volle Inhalt der QuelleFerré, Jean-Christophe. „Evaluation et optimisation de lʹacquisition et du post-traitement de lʹétude de la perfusion cérébrale par ʺ Arterial Spin Labeling ʺ“. Rennes 1, 2011. http://www.theses.fr/2011REN1B149.
Arterial spin labeling (ASL) is a magnetic resonance perfusion imaging technique for measuring cerebral blood flow (CBF) which uses magnetically labeled arterial blood water as an endogenous tracer. ASL is completely noninvasive and can be repeated because it is performed without injection of contrast media or radiation exposure. Moreover, CBF quantification is convenient and reproducible. However, ASL is a low signal-to-noise-ratio measurement technique. As this technique becomes available on clinical MRI scanners, the aim of this work was to assess and optimize image acquisition and the post-processing of data acquired with two clinical techniques. We demonstrated a correlation between acquisition parameters and hemodynamic parameters, and showed an improvement in map quality using a 32-channel coil combined with parallel imaging. New denoising methods were implemented, and an optimized complete workflow was used to compare functional ASL and BOLD fMRI. A “template” approach was also assessed to detect areas of increased perfusion in individuals. Clinically, we used ASL to detect hypoperfusion defects in acute ischemic stroke and focal perfusion abnormalities in patients with chronic, treatment-resistant depression. Our results showed that a combination of optimized acquisition conditions and dedicated processing could help ASL become more accurate in clinical research and practice.
Tayeb, Jamel. „Optimisation des performances et de la consommation de puissance électrique pour architecture Intel Itanium/EPIC“. Valenciennes, 2008. http://ged.univ-valenciennes.fr/nuxeo/site/esupversions/9eed6aef-dfaf-4a17-883f-d217a1d9a000.
The first part of this thesis proposes to extend the EPIC architecture of the Itanium processor family by providing a hardware stack. The principal idea defended here is that it is possible to close the existing performance gap between generic architectures and application-specific designs for running virtual machines (FORTH, .NET, Java, etc.). To this end, we propose to dynamically reallocate a subset of the EPIC architecture's resources to implement a hardware evaluation stack. Two implementations are proposed, both non-intrusive and compatible with existing binary code. The fundamental difference between these stacks lies in their manager: software or hardware. The hardware-controlled evaluation stack offers support for advanced functions, such as the strongly typed evaluation stacks required by .NET's CIL. Thus, we propose a single-pass binary translator from CIL to EPIC code that uses the hardware evaluation stack. In the second part of this thesis, we studied the energy efficiency of software applications. First, we defined a methodology and developed tools to measure the energy consumption and the useful work provided by software. Second, we studied source-code transformation rules in order to reduce and control the amount of energy consumed by software.
Darmont, Jérôme. „Optimisation et évaluation de performance pour l'aide à la conception et à l'administration des entrepôts de données complexes“. Habilitation à diriger des recherches, Université Lumière - Lyon II, 2006. http://tel.archives-ouvertes.fr/tel-00143361.
The work presented in this memoir aims to propose innovative solutions for the optimization and performance evaluation of data warehouses. We designed a generic approach whose objective is to automatically propose to the warehouse administrator solutions for optimizing data access times. The principle of this approach is to apply data mining techniques to a workload (a set of queries) representative of how the data warehouse is used, in order to derive a near-optimal configuration of indexes and/or materialized views. Cost models then make it possible to select, among these data structures, the most effective ones in terms of the performance gain/overhead ratio.
Furthermore, performance evaluation can support the design of data warehouses. Thus, to validate our approach experimentally, we also designed several generic benchmarks. The guiding principle behind their development is adaptability. Indeed, to compare the effectiveness of different performance optimization techniques, it is necessary to test them in different environments, on different database and workload configurations, and so on. The ability to assess the impact of different architectural choices is also a valuable aid in designing data warehouses. Our benchmarks therefore make it possible to generate various data warehouse configurations, together with the decision-support workloads that apply to them.
Finally, our performance optimization and evaluation solutions have been implemented in the contexts of relational and XML data warehouses.
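The gain/overhead trade-off described in this abstract can be illustrated by a toy greedy selection under a cost model. All candidate structures, gains and overheads below are hypothetical numbers for illustration, not taken from the memoir.

```python
# Illustrative sketch of cost-model-based selection of auxiliary
# structures (indexes / materialized views), in the spirit of the
# approach described above. Candidates and numbers are hypothetical.

def select_structures(candidates, budget):
    """Greedy selection maximizing performance gain per unit overhead.

    candidates: list of (name, estimated_gain, overhead) tuples;
    budget: total overhead (e.g. storage/maintenance) allowed.
    """
    ranked = sorted(candidates, key=lambda c: c[1] / c[2], reverse=True)
    chosen, used = [], 0.0
    for name, gain, overhead in ranked:
        if used + overhead <= budget:
            chosen.append(name)
            used += overhead
    return chosen

# Hypothetical candidates mined from a representative workload,
# with (gain, overhead) values assumed to come from a cost model
candidates = [
    ("idx_sales_date",   120.0, 10.0),
    ("mv_monthly_sales", 300.0, 60.0),
    ("idx_customer_id",   40.0,  5.0),
    ("mv_region_totals", 150.0, 80.0),
]
print(select_structures(candidates, budget=75.0))
```

A real selector would use query-optimizer cost estimates rather than fixed numbers, but the ranking-by-benefit-per-overhead structure is the same.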
Mirgain, Cyrille. „Evaluation et optimisation du contact initial gaz-solides dans une chambre de melange - application au craquage catalytique temps court“. Paris 6, 1997. http://www.theses.fr/1997PA066137.
Bichon, Marie. „Evaluation et optimisation de la durée de vie d'électrodes élaborées en voie aqueuse pour des batteries de nouvelle génération“. Thesis, Nantes, 2019. http://www.theses.fr/2019NANT4051.
Increasing the energy density of Li-ion batteries while reducing their manufacturing costs is a milestone towards the widespread adoption of electric vehicles. In this regard, high-capacity materials have been developed, among them lithium nickel manganese cobalt layered oxide (NMC), as their energy density can be increased with higher nickel content. Aqueous processing of these materials can contribute to cost reduction by using water as a solvent instead of the standard, toxic N-methylpyrrolidone (NMP), whose use requires costly installations to recover the vapors during drying of the electrode. This study aims to develop aqueous formulations of NMC532 cathodes (50% nickel, 30% manganese and 20% cobalt) and to assess their electrochemical performance. As layered oxides are sensitive to water, surface modification of the material upon immersion in aqueous slurries was investigated through various characterizations. Several routes were proposed to develop water-based cathodes with electrochemical performance comparable to NMP-based electrodes. In particular, corrosion of the aluminum current collector, which occurs in aqueous processing owing to the alkalinity of the NMC slurry, is avoided through the addition of phosphoric acid, which buffers the slurry pH, or by using a carbon-coated collector. The ageing behavior of such aqueous-processed electrodes was then investigated through post-mortem analyses after long-term cycling in Li-ion cells
Guinot, Benjamin. „Evaluation multicritère des technologies de stockage couplées aux énergies renouvelables : conception et réalisation de la plateforme de simulation ODYSSEY pour l'optimisation du dimensionnement et de la gestion énergétique“. Phd thesis, Université de Grenoble, 2013. http://tel.archives-ouvertes.fr/tel-00934515.
Taschner, Christian. „Evaluation et optimisation des techniques d'ARM dynamique pour la prise en charge des malformations artério-veineuses intracrâniennes traitées par gamma knife“. Lille 2, 2007. http://www.theses.fr/2007LIL2S042.
Moussa, Amer. „Evaluation et optimisation de la synthèse de substrats de l'élastase de Pseudomonas aeruginosa à l'origine des infections associées aux maladies nosocomiales“. Montpellier 2, 2007. http://www.theses.fr/2007MON20016.
Bonnemoy, Claude. „Sur quelques aspects de l'utilisation de methodes deterministes en milieu stochastique et inversement“. Clermont-Ferrand 2, 1987. http://www.theses.fr/1987CLF2E396.
Habachi, Oussama. „Optimisation des Systèmes Partiellement Observables dans les Réseaux Sans-fil : Théorie des jeux, Auto-adaptation et Apprentissage“. Phd thesis, Université d'Avignon, 2012. http://tel.archives-ouvertes.fr/tel-00799903.
Bazzoli, Caroline. „Evaluation et optimisation de protocoles dans les modèles non linéaires à effets mixtes : application à la modélisation de la pharmacologie des antiretroviraux“. Paris 7, 2009. http://www.theses.fr/2009PA077238.
This thesis considers design evaluation and optimisation in the context of nonlinear mixed-effect models for multiple responses, for instance for the joint modelling of the pharmacokinetics and pharmacodynamics of a drug, or for the pharmacokinetics of a parent drug and its metabolite. We extended the computation of the Fisher information matrix for nonlinear mixed-effect models with multiple responses using the usual first-order linearisation. We evaluated this extension and showed its relevance by simulation. We also extended the Fedorov-Wynn algorithm to include cost functions on the sampling times. We implemented these developments in new versions of the R function PFIM, dedicated to design evaluation and optimisation, and applied them to the plasma and intracellular pharmacokinetics of antiretroviral drugs. We performed a review of clinical studies measuring intracellular concentrations of antiretroviral drugs in HIV patients and found that most of these studies had small sample sizes and were not always adequately designed. We developed the first joint pharmacokinetic models of zidovudine and lamivudine and of their respective intracellular metabolites. We estimated the long half-lives of both metabolites and found large between-patient variabilities. Last, we showed that population design optimisation allows the derivation of efficient designs, according to clinical constraints, for further analysis of these molecules using these joint models
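The kind of computation underlying design evaluation via first-order linearisation can be sketched on a deliberately simple case: a one-parameter exponential model with a single random effect. This is a minimal illustration of the principle, not the PFIM implementation; the model, parameter values and sampling times are hypothetical, and only the scalar information on the fixed effect is computed.

```python
import numpy as np

# Model: y = exp(-phi * t) + error, with phi = theta + eta,
# eta ~ N(0, omega2), residual variance sigma2. Linearizing around
# eta = 0 (first order): E[y] ~ f(theta, t) and
# Var[y] ~ omega2 * g g' + sigma2 * I, with g = df/dphi at theta.

def fim_fixed_effect(theta, omega2, sigma2, times):
    """Scalar Fisher information on theta under FO linearization."""
    t = np.asarray(times, dtype=float)
    g = -t * np.exp(-theta * t)              # df/dphi evaluated at phi = theta
    V = omega2 * np.outer(g, g) + sigma2 * np.eye(len(t))
    return float(g @ np.linalg.solve(V, g))

# Compare two candidate sampling designs (hypothetical times)
dense = fim_fixed_effect(0.5, 0.05, 0.01, [0.5, 1, 2, 4, 8])
late  = fim_fixed_effect(0.5, 0.05, 0.01, [6, 7, 8, 9, 10])
print(dense, late)
```

Design optimisation, as in the Fedorov-Wynn algorithm, amounts to searching over sampling times (possibly under cost constraints) to maximise a criterion built from this matrix.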
Hernandez, Sébastien. „Evaluation et optimisation du mécanisme de Handover dans un Réseau Local Sans Fil dédié aux applications à trafic contraint par le temps“. Phd thesis, Clermont-Ferrand 2, 2006. https://theses.hal.science/docs/00/70/32/74/PDF/2006CLF21682.pdf.
Hernandez, Sébastien. „Evaluation et optimisation du mécanisme de Handover dans un Réseau Local Sans Fil dédié aux applications à trafic contraint par le temps“. Phd thesis, Université Blaise Pascal - Clermont-Ferrand II, 2006. http://tel.archives-ouvertes.fr/tel-00703274.
Guyony, Valerie. „Caractérisation et optimisation du procédé de cuisson extrusion humide de matières protéiques végétales pour l'obtention d'une texture fibreuse“. Thesis, Nantes, Ecole nationale vétérinaire, 2020. http://www.theses.fr/2020ONIR149F.
As vegetarian trends grow, meat analogues, which try to mimic the appearance, texture and taste of meat, have emerged since the 2000s. The texturization process used is generally extrusion cooking. This thesis focuses on the texturization of plant proteins by wet extrusion cooking, and more specifically on the optimization of fibration in order to imitate as closely as possible the fibrous texture characteristic of meat. The objectives of the thesis are the understanding of the physical and chemical phenomena occurring during the wet extrusion cooking process and the optimization of process and raw-material parameters to maximize the intensity of fibration. Among the process parameters, particular attention was paid to two main ones: the dimensions of the die and the cooling temperature of the extrudate in the die. The study of raw-material parameters was carried out by analyzing, on the one hand, the impact of the physico-chemical and functional properties of two different soy concentrates and, on the other hand, the impact of the addition of wheat gluten and/or pea isolate or fiber on the textural properties and fibration of the extrudate obtained. Finally, extrudates were transformed into meat-analogue steaks. The manufacturing process and reference recipe, based on the use of dehydrated textured proteins, were adapted to allow their substitution by wet extrudates
Jong, Audrey de. „Evaluation et optimisation de la prise en charge des voies aériennes et de la ventilation du patient obèse en médecine périopératoire : du bloc opératoire à la réanimation“. Thesis, Montpellier, 2016. http://www.theses.fr/2016MONTT013.
The objectives of this thesis were to evaluate and improve the perioperative airway and ventilation management of the obese patient. First, we specifically analyzed the airway management of obese patients. Two large databases, established respectively in the operating room and intensive care unit (ICU), were used. The ICU database was used to develop and validate a score for predicting difficult endotracheal intubation in the ICU in the general population, the MACOCHA score. In the same database, obesity was not found to be a risk factor of severe cardiovascular collapse after intubation. The specific analysis of obese patients in these two databases identified risk factors of difficult intubation in the operating room and intensive care. In obese patients, difficult intubation and severe complications such as hypoxemia were respectively twice and twenty times more common in intensive care than in the operating room. Secondly, obese patients, especially if they are intubated in an emergency situation, are at greater risk of developing acute respiratory distress syndrome (ARDS) after the airway is secured for general anesthesia. We specifically studied one of the most effective treatments of ARDS, prone positioning, in obese patients, showing that this technique is safe and effective. Survival was better in obese patients than in non-obese patients. To explain this better prognosis of obese patients with ARDS, we hypothesized that their diaphragmatic function might be better. This hypothesis was confirmed experimentally in rat specimens: diaphragmatic strength was significantly greater in obese rats than in non-obese rats, before and after mechanical ventilation. This increased diaphragmatic strength may partially explain the "obesity paradox", i.e. that obesity is associated with a better prognosis in the critical care setting. 
In order to further explain this better prognosis, we analyzed more than 700 obese patients after matching for admission type (medical or surgical). This analysis showed that ICU admission on medical grounds was an independent risk factor for mortality compared to a surgical cause of admission. The Simplified Acute Physiology Score II, the severity index most often used to predict mortality in critically ill patients, overestimated mortality in the case of surgical admission of obese patients. A new model predicting mortality in obese patients was developed. Finally, we specifically studied obese patients with multiple trauma. The proportion of massive transfusion during the first 24 hours of care was increased in these patients. A score to guide the early management of these obese patients in hemorrhagic shock was subsequently validated. Once optimal airway and ventilation management is achieved, weaning from mechanical ventilation can be particularly difficult in obese patients, especially in the context of a "difficult airway". In a third part, we therefore compared several weaning trials in obese patients, in order to determine the test that best predicts post-extubation respiratory work. It was the T-piece, or the total absence of mechanical support (0 pressure support and 0 positive airway pressure), that best predicted post-extubation respiratory work. These results will support the implementation of randomized controlled trials comparing specific therapeutic strategies in obese patients to reduce complications of intubation, ventilation and extubation failure in perioperative medicine
Sarwary, M. Chaker. „Synthese d'automates d'etats fini pour les circuits integres vlsi : optimisation du codage par evaluation du cout apres synthese et synthese d'automates a pile“. Paris 6, 1995. http://www.theses.fr/1995PA066463.
Thiaux, Yaël. „Optimisation des profils de consommation pour minimiser les coûts économique et énergétique sur cycle de vie des systèmes photovoltaïques autonomes et hybrides - Evaluation de la technologie Li-ion“. Phd thesis, École normale supérieure de Cachan - ENS Cachan, 2010. http://tel.archives-ouvertes.fr/tel-00502428.
Rizk, Carl. „Contribution to the evaluation and optimization of passengers' screening at airports“. Thesis, Toulouse, INPT, 2019. http://www.theses.fr/2019INPT0121.
Security threats have emerged over the past decades as a more and more critical issue for air transportation, which has been one of the main drivers of the globalization of the economy. Reinforced control measures, based on multidisciplinary research and new technologies, have been implemented at airports in reaction to different terrorist attacks. From the scientific perspective, the efficient screening of passengers at airports remains a challenge, and the main objective of this thesis is to open new lines of research in this field by developing advanced approaches using the resources of computer science. First, this thesis introduces the main concepts and definitions of airport security and gives an overview of passenger terminal control systems; more specifically, the screening inspection positions are identified and described. A logical model of the departure control system for passengers at an airport is proposed. This model is transcribed into a graphical view (Controlled Satisfiability Graph, CSG) which allows the screening system to be tested against different attack scenarios. Then a probabilistic approach for evaluating the control of departing passenger flows is developed, leading to the introduction of Bayesian Colored Petri Nets (BCPN). Finally, an optimization approach is adopted to organize the flow of departing passengers as well as possible, given the probabilistic performance of the elements composing the control system. After establishing a global evaluation model based on undifferentiated serial processing of passengers, a two-stage control structure is analyzed, which highlights the interest of pre-filtering and organizing the passengers into separate groups. The study concludes with directions for continuing this line of research
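The contrast between undifferentiated serial screening and a two-stage, pre-filtered structure can be illustrated with a toy probability computation. The detection rates and pre-filter accuracy below are hypothetical numbers chosen for illustration, not values from the thesis.

```python
# Toy comparison: every passenger passes the same two serial checks,
# versus a two-stage scheme where a pre-filter routes passengers
# flagged as high-risk to an extra check. Numbers are hypothetical.

def serial_detection(probs):
    """Probability that at least one of several independent checks detects a threat."""
    p_miss = 1.0
    for p in probs:
        p_miss *= (1.0 - p)
    return 1.0 - p_miss

# Undifferentiated: everyone passes two checks with detection 0.7 and 0.6
uniform = serial_detection([0.7, 0.6])

# Two-stage: the pre-filter flags a threat carrier as high-risk with
# probability 0.8; flagged passengers undergo a third check (0.9)
flagged     = serial_detection([0.7, 0.6, 0.9])
not_flagged = serial_detection([0.7, 0.6])
two_stage = 0.8 * flagged + 0.2 * not_flagged

print(round(uniform, 4), round(two_stage, 4))
```

The point of the two-stage structure is that the extra screening effort is spent only on the pre-filtered group, improving overall detection without subjecting every passenger to the additional check.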
Oudghiri, yousfi Imane. „Conception, analyse exergétique et optimisation des cycles à mélange de réfrigérants : Application à la liquéfaction du bio-méthane“. Thesis, Paris Sciences et Lettres (ComUE), 2018. http://www.theses.fr/2018PSLEM037.
The growing consumption of fossil fuels is leading to the depletion of natural gas reserves. Hence, considerable interest is given to the use of biogas as a substitute for natural gas. However, for certain applications, biogas must be liquefied to facilitate its transport and storage. Many studies have focused on large- and mid-scale natural gas liquefaction processes, while the literature on biogas micro-liquefaction remains scarce. Due to the similarities between biomethane and natural gas, mixed refrigerant (MR) cycles for natural gas liquefaction are considered to liquefy the biomethane. The prime objective of this doctoral work is to develop a thermodynamic refrigeration cycle with low energy consumption, and possibly a nonflammable refrigerant, for micro-scale biomethane liquefaction. In order to provide the cooling needed by biomethane liquefaction processes, different configurations are proposed. The first two architectures are mixed refrigerant cycles carrying out both sensible (pre-cooling) and latent (condensation) heat removal. In the latest architecture, the mixed refrigerants are nonflammable. All configurations are simulated in Aspen HYSYS and assessed using the exergy analysis method. In order to find the optimum operating parameters and the best design, a tool integrating the PIKAIA genetic algorithm and Microsoft Visual Basic for Applications (VBA) was developed. Once the steady-state simulation and optimization were achieved, a study of the dynamic behavior of the process in partial-load operating mode was conducted. An architecture suitable for liquefaction with a capacity of 10 Nm3/h was designed, and a techno-economic study of this bench was carried out to evaluate the total discounted cost of biomethane liquefaction
Bernard, Jean-Philippe. „Observations sub-millimetriques a grande sensibilite : optimisation photometrique et evaluation des performances de l'instrument spm-pronaos; observations et modelisation numerique de l'emission infra-rouge et sub-millimetrique du milieu inter-stellaire dense“. Paris 7, 1991. http://www.theses.fr/1991PA077275.
Dray, Xavier. „Evaluation et optimisation des techniques d'abord transgastrique de la cavité péritonéale en chirurgie endoscopique transluminale par les orifices naturels (natural orifice translumenal endoscopic surgery, notes)“. Paris, CNAM, 2009. http://www.theses.fr/2009CNAM0672.
NOTES (Natural Orifice Translumenal Endoscopic Surgery) is a surgical technique in which the peritoneal cavity is accessed with a flexible endoscope through the natural orifices and then through the wall of the digestive tract or the urogenital tract. Mastering a per-oral, transgastric approach (available to both sexes, unlike the transvaginal approach) is a highly desirable objective for the development of this technique. The work presented in this thesis demonstrates the feasibility of flexible transgastric endoscopic procedures in a live porcine model. It shows: (1) the value of anti-infective protocols before transgastric NOTES; (2) the need to create a pneumoperitoneum before establishing transgastric access, to prevent injury to neighboring organs; (3) the better tolerance of balloon dilation over sphincterotome incision when creating the transgastric access; (4) the possibility of introducing prosthetic material sterilely by the transgastric route for intraperitoneal use (as illustrated by a transgastric NOTES umbilical hernia repair technique); (5) the still imperfect, highly technology-dependent state of the endoscopic methods for creating (for example with a 2 µm wavelength laser) and closing (for example with clips, T-tags or staples) the transgastric access; and (6) the feasibility of minimally invasive, highly specific leak tests (using, for example, diluted hydrogen)
Rahmouni, Maher. „Ordonnancement et optimisations pour la synthèse de haut niveau des circuits de controle“. Grenoble INPG, 1997. http://www.theses.fr/1997INPG0028.
Monède-Hocquard, Lucie. „Evaluation clinique, caractérisation mécanique et modélisation pour l'évolution de la conception d'un implant rachidien dynamique“. Thesis, Bordeaux 1, 2012. http://www.theses.fr/2012BOR14721/document.
The main focus of any implantable medical device is to improve the health of the patient while presenting minimum risk. For this purpose, the study of the B Dyn spinal implant comprises several components: the carrying out of a clinical follow-up; the analysis and choice of technical solutions (corrective actions); and the creation of a digital tool for further development (preventive actions). The initial bibliographical study makes it possible to comprehend the functional anatomy of the lumbar spine, to understand the pathological states and their consequences, and finally to list the associated surgical techniques (bone resection, implantation of devices, etc.). The clinical follow-up of a population of thirty patients then underlines the somatic and functional contributions of the B Dyn in its first design. In a few cases, the analysis of radiographs in flexion shows incipient deterioration of the ring, probably related to accidental overloading of the implant. This observation calls for an evolution in the design of the implant. An analysis of the initial design and mechanical characterization in tension make it possible to target the corrective actions to be applied in the context of this evolution. The approach developed is based on experimental evaluation in order to select technical solutions that satisfy the functional criteria; this leads to a change in the choice of the ring material. To conduct subsequent developments, a finite element model is created; the numerical approach thus replaces the restrictive and expensive experimental approach. A preliminary characterization of the elastomers is necessary to obtain the material data needed to build this model. The results of the first simulations of a tensile test are compared to experimental data with a view to validating the model. At this stage, the B Dyn study provides a first solution for the evolution of the implant and a numerical tool for the future analysis of technical solutions
Turki, Sadok. „Impact du délai de livraison sur le niveau de stock : une approche basée sur la méthode IPA“. Electronic Thesis or Diss., Metz, 2010. http://www.theses.fr/2010METZ029S.
In the first part of our work, we study a manufacturing system composed of a machine, a buffer with infinite capacity and a customer. We applied to this system a continuous-flow model taking into account a constant delivery time. To evaluate the performance of our system, we relied on the method of infinitesimal perturbation analysis (IPA): we performed simulations using an algorithm based on this method to determine the optimal buffer level which minimizes the cost function. This total cost is the sum of the inventory cost, the backlog cost and the transportation cost. In the second part of our work, we applied a discrete-flow model to the same system studied in the first part. The infinitesimal perturbation analysis (IPA) method is also applied to this model to determine the optimal inventory level. Applying this method to a discrete-flow model is an innovation; to the best of our knowledge, no previous work applies the IPA method to discrete-flow models. In the last part, we considered a manufacturing system with a random delivery time between the customer and the producer, and we studied the impact of transportation costs on the optimal buffer level. The infinitesimal perturbation analysis method is applied for both types of models (discrete-flow and continuous-flow)
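The IPA principle invoked throughout this abstract can be illustrated on a deliberately simple single-stage inventory model. This is a sketch under simplifying assumptions (i.i.d. continuous demand, one period cost), not the thesis's continuous- or discrete-flow models with delivery times; all numbers are hypothetical.

```python
import random

# Toy IPA sketch: inventory with base-stock level s and random demand D.
# Period cost c(s) = h*max(s - D, 0) + b*max(D - s, 0). The sample-path
# (IPA) derivative is dc/ds = h if D < s else -b; averaging it over
# simulated demands estimates the gradient of the expected cost,
# which a simulation-based optimizer can then follow.

def ipa_gradient(s, h, b, demands):
    """Average IPA derivative of the period cost with respect to s."""
    grads = [(h if d < s else -b) for d in demands]
    return sum(grads) / len(grads)

random.seed(0)
demands = [random.uniform(0.0, 100.0) for _ in range(100000)]

h, b = 1.0, 4.0                      # holding and backlog unit costs
grad = ipa_gradient(50.0, h, b, demands)
# For D ~ Uniform(0, 100) the exact gradient at s = 50 is
# h*F(50) - b*(1 - F(50)) = 0.5 - 2.0 = -1.5
print(grad)
```

A negative gradient at s = 50 indicates the expected cost still decreases as the base-stock level grows, so the optimizer would raise s until the estimate crosses zero.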