Dissertations on the topic "Mapping optimisation"
Browse the top 43 dissertations for your research on the topic "Mapping optimisation".
Kim, Dae Gyu. "Mapping based constraint handling methods for evolutionary algorithms." Thesis, University of Sussex, 2000. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.311406.
Kianifar, Mohammed R. "Application of multidisciplinary design optimisation frameworks for engine mapping and calibration." Thesis, University of Bradford, 2014. http://hdl.handle.net/10454/14843.
Skoglund, Martin. "Inertial Navigation and Mapping for Autonomous Vehicles." Doctoral thesis, Linköpings universitet, Reglerteknik, 2014. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-110373.
Rountree, Lindsay. "Optimisation of perimetric stimuli for mapping changes in spatial summation in glaucoma." Thesis, Cardiff University, 2018. http://orca.cf.ac.uk/111180/.
Little, Collin. "Sono degradation of phenanthrene : mapping & optimisation of reactor conditions using sono-chemi-luminescence." Thesis, Glasgow Caledonian University, 2007. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.688256.
Повний текст джерелаAkinyemi, Segun Ajayi. "Optimisation of selective extraction techniques as a tool for geochemical mapping in the Southern Africa region." Thesis, University of the Western Cape, 2008. http://etd.uwc.ac.za/index.php?module=etd&action=viewtitle&id=gen8Srv25Nme4_3444_1260521237.
The complex nature and composition of regolith cover in Southern Africa is a major challenge to geochemical mapping for concealed mineralization. Some of the setbacks to successful geochemical exploration may be ascribed to the use of various partial extraction techniques without a profound understanding of the regolith components and their composition. This investigation therefore focuses on the use of hydroxylamine partial extraction geochemistry for geochemical mapping in regolith over two contrasting environments, namely
aeolian sand-calcrete regolith over Au mineralization at Amalia Blue Dot Mine in South Africa and lateritic regolith covering the Ni-Cu deposit at Kabanga Main and Luhuma in Tanzania. Regolith samples from the above areas were sieved, extracted with hydroxylamine hydrochloride solution and analyzed for multiple elements by AAS and ICP-MS techniques. A stepwise optimization of the hydroxylamine extraction technique for samples from both areas was carried out and incorporated into the analytical programme (in a pilot study). Results of hydroxylamine partial extraction generally gave better anomaly contrast and reflection of bedrock mineralization than the conventional aqua regia techniques previously used in the region. The results show, however, that lateritic regolith may be best extracted using 0.25M hydroxylamine, while a 0.1M concentration appears most suitable for extraction of aeolian-calcrete regolith. These results are corroborated by principal component analysis of the analytical data, which shows various element associations, e.g. with Fe-Mn oxides, while other elements possibly belong to the loosely adsorbed or exchangeable group. The geochemical maps of the pilot study areas at Amalia, Kabanga and Luhuma show elevated element contents or clusters of anomalies of diverse elements associated with Fe-Mn oxides. Geochemical mapping at Kabanga, with its deeply concealed mineralization, however shows variable, subdued element patterns over mineralized areas. Geochemical signatures associated with the hydroxylamine hydrochloride partial leach are therefore characterized by a lower geochemical background than those of the conventional aqua regia leach. This study leads to the recommendation of further investigations into partial extraction of the exchangeable group of elements, possibly using ammonium acetate.
Führ, Gereon [Verfasser], Rainer [Akademischer Betreuer] Leupers, and Tobias [Akademischer Betreuer] Gemmeke. "MPSoC power-performance trade-off : strategies for SW mapping optimisation / Gereon Führ ; Rainer Leupers, Tobias Gemmeke." Aachen : Universitätsbibliothek der RWTH Aachen, 2021. http://d-nb.info/123314426X/34.
Повний текст джерелаZaharia, Mihai Valentin. "Contributions à l’étude des machines à reluctance variable pour application alterno-démarreur automobile." Thesis, Ecole centrale de Lille, 2016. http://www.theses.fr/2016ECLI0022/document.
The switched reluctance machine has a simple construction that makes it cheap to build, but one of its drawbacks is torque ripple. The first goal of this thesis was to use an optimization tool to find the control parameters that best correct this drawback in both motor and generator operating modes. An analytical model that takes the machine geometry into account and can be simulated in both operating modes by adjusting the commutation angles was therefore developed and implemented in a computing environment. The second goal of this work was to investigate a method to reduce the optimization time without degrading the accuracy of the results. The strategy used in the optimization process is known in the literature as the space mapping technique; in this thesis, output space mapping proportional and manifold mapping were studied. After testing them on a mathematical model, the investigation continued with the definition of the optimal control parameters of a three-phase 6/8 switched reluctance machine prototype, so that the strategy could later be applied to a more complicated problem: defining the right geometry and control of a switched reluctance machine for automotive integrated starter-alternator systems. The final goal of the thesis was to conduct experiments and tests on the existing prototype in order to partially validate the results of the optimization process.
Nhan, Nhat-Quang. "Optimisation de précodeurs linéaires pour les systèmes MIMO à récepteurs itératifs." Thesis, Brest, 2016. http://www.theses.fr/2016BRES0062/document.
The long-term evolution (LTE) and LTE-Advanced (LTE-A) standards are predicted to play essential roles in future fifth-generation (5G) mobile networks. These standards require high data rates and a high quality of service, which assures low error rates and low latency. Besides, as discussed in recent surveys, low-complexity communication systems are also essential in the next 5G mobile networks. To adapt to this trend, in this PhD thesis we investigate multiple-input multiple-output (MIMO) wireless communication schemes. In the first part of the thesis, low-complexity forward error correction (FEC) codes are used to keep complexity and latency low. Considering iterative receivers at the receiver side, we exploit MIMO linear precoding and mapping methods to optimize the error-rate performance of these systems. In the second part of the thesis, non-binary low-density parity-check (NB-LDPC) codes are investigated. We propose to use MIMO precoders to reduce the complexity of NB-LDPC-encoded MIMO systems. A novel low-complexity decoding algorithm for NB-LDPC codes is also proposed at the end of the thesis.
Hassani, Bijarbooneh Farshid. "Constraint Programming for Wireless Sensor Networks." Doctoral thesis, Uppsala universitet, Avdelningen för datalogi, 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-241378.
Tanner, Michael. "BOR2G : Building Optimal Regularised Reconstructions with GPUs (in cubes)." Thesis, University of Oxford, 2017. https://ora.ox.ac.uk/objects/uuid:1928c996-d913-4d7e-8ca5-cf247f90aa0f.
Almqvist, Saga, and Lana Nore. "Where to Stack the Chocolate? : Mapping and Optimisation of the Storage Locations with Associated Transportation Cost at Marabou." Thesis, Umeå universitet, Institutionen för matematik och matematisk statistik, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-135912.
Today, warehouse management at the Marabou factory is arranged so that each article is stored according to the production line it belongs to, in a storage area close to that line. However, some storage locations are not optimised: articles are placed there out of habit and according to what is considered easiest, so the locations follow no standard. In this thesis we therefore propose the most suitable storage locations with respect to total transportation costs. The problem can be modelled as an assignment problem, solvable by the so-called Hungarian algorithm, which yields the optimal matching between the production lines' needs and the factory's storage locations with their associated costs. To apply the Hungarian algorithm, we collected data on the total number of articles in the factory during 2016, extracted from the SAP system that Marabou uses. We then adjusted the data by dividing the articles into numbers of pallets and by the line they belong to. This information was complemented with empirical investigations through our own observations and qualitative interviews with factory employees. In the method we use three different implementations of the Hungarian algorithm. The results of the different approaches are presented together with several pallet-optimisation proposals. The report closes with a number of improvement suggestions and ideas for further development.
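The matching described in this abstract can be sketched in a few lines. The cost matrix below is invented for illustration (not Marabou data), and brute-force search over permutations stands in for the Hungarian algorithm, which finds the same optimum in O(n³) rather than O(n!):

```python
from itertools import permutations

# Hypothetical transport-cost matrix: cost[i][j] = cost of assigning
# production line i's pallets to storage location j (illustrative numbers).
cost = [
    [4, 1, 3],
    [2, 0, 5],
    [3, 2, 2],
]

def best_assignment(cost):
    """Exhaustive search over all assignments; the Hungarian algorithm
    reaches the same optimum in O(n^3) instead of O(n!)."""
    n = len(cost)
    best = min(permutations(range(n)),
               key=lambda p: sum(cost[i][p[i]] for i in range(n)))
    return list(best), sum(cost[i][best[i]] for i in range(n))

assignment, total = best_assignment(cost)
print(assignment, total)  # [1, 0, 2] with total cost 5
```

For realistic problem sizes (hundreds of pallets), a proper O(n³) implementation is needed; the brute force here only serves to make the assignment model concrete.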
O'Donovan, Anthony Gareth. "Facies mapping of the Vaal Reef placer as an aid to remnant pillar extraction and stope width optimisation." Thesis, Rhodes University, 1992. http://hdl.handle.net/10962/d1005559.
Повний текст джерелаHage, Hassan Maya. "Méthodologies de conception optimale de systèmes de conversion électromécanique." Phd thesis, Université Paris Sud - Paris XI, 2014. http://tel.archives-ouvertes.fr/tel-01002008.
Повний текст джерелаBourbia, Salma. "Algorithmes de prise de décision pour la "cognitive radio" et optimisation du "mapping" de reconfigurabilité de l'architecture de l'implémentation numérique." Phd thesis, Supélec, 2013. http://tel.archives-ouvertes.fr/tel-00931350.
Повний текст джерелаFrost, Duncan. "Long range monocular SLAM." Thesis, University of Oxford, 2017. https://ora.ox.ac.uk/objects/uuid:af38cfa6-fc0a-48ab-b919-63c440ae8774.
Повний текст джерелаVan, der Walt Madele. "A design environment for the automated optimisation of low cross-polarisation horn antennas." Thesis, Stellenbosch : University of Stellenbosch, 2010. http://hdl.handle.net/10019.1/5144.
ENGLISH ABSTRACT: The aggressive space mapping algorithm is used in this project for the optimisation of electromagnetic structures. This technique combines the use of fast, less accurate models with more time-consuming, high precision models in the optimisation of a design. MATLAB’s technical computing environment possesses powerful tools for optimisation as well as the graphical representation and mathematical post-processing of data. A software interface, which uses Visual Basic for Applications, is created between MATLAB and the electromagnetic solvers, CST Microwave Studio and μWave Wizard, that are used for the fine and coarse model calculations. The interface enables the direct interchange of data, which allows MATLAB to control the optimisation for the automation of the design process. The optimisation of a microwave coaxial resonator with input coupling is used to demonstrate the design environment. An accurate equivalent circuit model is available to describe the problem. The space mapping optimisation of this structure works well, with a significant improvement in the efficiency of the optimisation when compared to standard optimisation techniques. Multimode horn antennas are of interest for use as feeds in radio-astronomy telescope systems. The design of a stepped circular horn antenna in the space mapping design environment is presented. The horn’s radiation pattern is optimised for low cross-polarisation. This structure is much more complex to model than the resonator example. The generalised scattering matrix representation is used in the coarse model description. The far-fields are calculated from the aperture fields by means of the Fast Fourier Transform. Various tests confirm that the optimisation is steered in the right direction as long as the coarse model response follows the trend of the fine model response over the optimisation space.
The presented design environment is a powerful tool for the automation of the design of electromagnetic structures.
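A minimal sketch of the space-mapping idea behind this abstract, with toy one-dimensional functions standing in for the electromagnetic solvers (all models and numbers here are assumptions for illustration, not the thesis's fine/coarse models):

```python
# Toy 1-D stand-ins for the two models: the fine model is expensive and
# accurate, the coarse model is cheap but shifted with respect to it.
def fine(x):
    return (x - 2.0) ** 2      # fine-model optimum at x = 2.0

def coarse(z):
    return (z - 2.5) ** 2      # coarse-model optimum at z = 2.5

z_star = 2.5                   # found once, cheaply, on the coarse model

def extract(x):
    """Parameter extraction: the coarse input whose response matches the
    fine response at x. Closed-form here; a small optimisation in practice."""
    return x + 0.5             # solves coarse(z) == fine(x), nearest branch

x = 0.0                        # initial design
for _ in range(20):
    e = extract(x) - z_star    # misalignment between the two models
    if abs(e) < 1e-9:
        break
    x -= e                     # unit mapping Jacobian; full aggressive space
                               # mapping updates it with Broyden's formula
print(x)                       # converges to the fine-model optimum, 2.0
```

Each iteration costs one fine-model evaluation plus cheap coarse-model work, which is the source of the efficiency gain reported in the abstract.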
Rehn-Sonigo, Veronika. "Multi-criteria Mapping and Scheduling of Workflow Applications onto Heterogeneous Platforms." Phd thesis, Ecole normale supérieure de lyon - ENS LYON, 2009. http://tel.archives-ouvertes.fr/tel-00424118.
Replica placement in hierarchical networks - In this type of application, several clients issue requests to a few servers, and the question is where replicas should be placed in the network so that all requests can be served. We discuss and compare several replica-placement policies in hierarchical networks under server-capacity, quality-of-service and bandwidth constraints. The client requests are known a priori, while the number and position of the servers are to be determined. The traditional approach in the literature forces all requests of a client to be served by the closest server in the hierarchical network. We introduce and study two new policies. One main contribution of this work is the evaluation of the impact of these new policies on the total replication cost. Another important goal is to assess the impact of server heterogeneity, from both a theoretical and a practical perspective. We establish several new complexity results and present several efficient polynomial-time heuristics.
Data-flow applications - We consider data-flow applications that can be expressed as linear graphs. One example of this type of application is digital image processing, where images are processed in steady state. Several antagonistic criteria must be optimized, such as throughput and latency (or a combination of the two), as well as latency and reliability (i.e. the probability that the computation succeeds). Although simple polynomial algorithms exist for fully homogeneous platforms, the problem becomes NP-hard on heterogeneous platforms. We present a linear-programming formulation for this latter problem. In addition, we introduce several efficient polynomial-time bi-criteria heuristics, whose relative performance is evaluated through extensive simulations. In a case study, we present simulations and experimental results (implemented with MPI) for the application graph of the JPEG encoder on a compute cluster.
Complex streaming applications - We consider the execution of applications structured as trees of operators, i.e. the steady-state application of one or several operator trees to multiple data objects that must be continuously updated at different places in the network. A first goal is to provide the user with a set of processors to be bought or rented that guarantees that the minimum steady-state throughput of the application is achieved. We then extend our model to multiple applications: several concurrent applications run at the same time in a network, and we must ensure that all of them can reach their required throughput. Another contribution of this work is a set of complexity results for various instances of the problem. The third contribution is the design of several polynomial-time heuristics for both application models. A primary objective of the heuristics for concurrent applications is the reuse of intermediate results shared by different applications.
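For fully homogeneous platforms, period minimisation on a linear chain reduces to partitioning the chain into contiguous intervals, one per processor. A short dynamic program (illustrative stage weights, not from the thesis) computes the minimal period, i.e. the heaviest interval assigned to a single processor:

```python
# Chain of stage weights, to be split into k contiguous intervals (one per
# identical processor); the period (inverse throughput) is the heaviest
# interval. Weights and k are illustrative.
w = [2, 1, 3, 4, 1, 2]
k = 3

def min_period(w, k):
    """DP over prefixes: best[i][j] = minimal max-interval-sum when the
    first i stages are mapped onto j processors."""
    n = len(w)
    pref = [0]
    for x in w:
        pref.append(pref[-1] + x)   # prefix sums for O(1) interval weights
    INF = float("inf")
    best = [[INF] * (k + 1) for _ in range(n + 1)]
    best[0][0] = 0
    for i in range(1, n + 1):
        for j in range(1, k + 1):
            for s in range(i):      # last interval covers stages s..i-1
                cand = max(best[s][j - 1], pref[i] - pref[s])
                best[i][j] = min(best[i][j], cand)
    return best[n][k]

print(min_period(w, k))  # 6, e.g. intervals [2,1,3] | [4,1] | [2]
```

On heterogeneous platforms the same problem is NP-hard, as the abstract notes, which is why the thesis turns to linear programming and heuristics there.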
Codol, Jean-Marie. "Hybridation GPS/Vision monoculaire pour la navigation autonome d'un robot en milieu extérieur." Thesis, Toulouse, INSA, 2012. http://www.theses.fr/2012ISAT0060/document.
We are witnessing nowadays the introduction of ICT (Information and Communications Technology) into robotics. These technologies will give birth, in the coming years, to general-public service robotics. This future, if realised, will be the result of research conducted in several domains: mechatronics, telecommunications, automatic control, signal and image processing, artificial intelligence... One particularly interesting problem in mobile robotics is simultaneous localisation and mapping: to access certain information, a mobile robot has, in many cases, to map its environment and localise itself inside it. The following question is then posed: what precision can we aim for in terms of localisation, and at what cost? In this context, one of the objectives of many robotics laboratories, whose results directly impact industry, is positioning and mapping of the environment that is precise, usable everywhere, integrated, low-cost and real-time. The sensors considered are inexpensive: a standard GPS receiver (of metric precision) and a set of embeddable payload sensors such as video cameras. These types of sensors constitute the main support of our work. In this thesis, we address the localisation problem of a mobile robot, which we choose to handle with a probabilistic approach. The procedure is as follows: we first define our variables of interest, a set of random variables, and then describe their distribution laws and evolution models.
Afterwards, we determine a cost function so as to build an observer (a class of algorithms whose objective is to minimize the cost function). Our contribution consists of using raw GPS measurements (raw data are measurements issued from the code and phase correlation loops, called code and phase pseudo-range measurements, respectively) for low-cost navigation that is precise in an outdoor suburban environment. By exploiting the integer property of GPS phase ambiguities, we extend the navigation to a GPS-RTK (Real-Time Kinematic) system in a precise and low-cost local differential mode. Our propositions have been validated through experiments conducted on our robotic demonstrator.
Rincent, Renaud. "Optimisation des stratégies de génétique d'association et de sélection génomique pour des populations de diversité variable : Application au maïs." Thesis, Paris, AgroParisTech, 2014. http://www.theses.fr/2014AGPT0018/document.
Major progress has been achieved in genotyping technologies, making it easier to decipher the relationship between genotype and phenotype. This has contributed to the understanding of the genetic architecture of traits (genome-wide association studies, GWAS) and to better predictions of genetic value to improve breeding efficiency (genomic selection, GS). The objective of this thesis was to define efficient ways of conducting these approaches. We first derived analytically the power of the classical GWAS mixed model and showed that it is lower for markers with a small minor allele frequency, a strong differentiation among population subgroups, and a strong correlation with the markers used for estimating the kinship matrix K. We therefore considered two alternative estimators of K. Simulations showed that these were as efficient as the classical estimators at controlling false positives while providing more power. We confirmed these results on real datasets collected on two maize panels and could increase the number of detected associations by up to 40%. These panels, genotyped with a 50k SNP array and phenotyped for flowering and biomass traits, were used to characterize the diversity of the Dent and Flint groups and to detect QTLs. In GS, studies have highlighted the importance of the relationship between the calibration set (CS) and the predicted set for the accuracy of predictions. Considering the present low cost of genotyping, we proposed a sampling algorithm for the CS based on the G-BLUP model, which resulted in higher accuracies than other sampling strategies for all the traits considered. It could reach the same accuracy as a randomly sampled CS with half the phenotyping effort.
Castro, Márcio. "Optimisation de la performance des applications de mémoire transactionnelle sur des plates-formes multicoeurs : une approche basée sur l'apprentissage automatique." Thesis, Grenoble, 2012. http://www.theses.fr/2012GRENM074/document.
Multicore processors are now a mainstream approach to delivering higher performance to parallel applications. In order to develop efficient parallel applications for those platforms, developers must take care of several aspects, ranging from the architectural to the application level. In this context, Transactional Memory (TM) appears as a programmer-friendly alternative to traditional lock-based concurrency for those platforms. It allows programmers to write parallel code as transactions, which are guaranteed to execute atomically and in isolation regardless of eventual data races. At runtime, transactions are executed speculatively and conflicts are solved by re-executing conflicting transactions. Although TM intends to simplify concurrent programming, the best performance can only be obtained if the underlying runtime system matches the application and platform characteristics. The contributions of this thesis concern the analysis and improvement of the performance of TM applications based on Software Transactional Memory (STM) on multicore platforms. Firstly, we show that the TM model makes the performance analysis of TM applications a daunting task. To tackle this problem, we propose a generic and portable tracing mechanism that gathers specific TM events, allowing us to better understand the performance obtained. The traced data can be used, for instance, to discover whether the TM application presents points of contention or whether the contention is spread out over the whole execution. Our tracing mechanism can be used with different TM applications and STM systems without any changes to their original source code. Secondly, we address the performance improvement of TM applications on multicores. We point out that thread mapping is very important for TM applications and can considerably improve the overall performance achieved.
To deal with the large diversity of TM applications, STM systems and multicore platforms, we propose an approach based on machine learning to automatically predict suitable thread-mapping strategies for TM applications. During a prior learning phase, we profile several TM applications running on different STM systems to construct a predictor. We then use the predictor to perform static or dynamic thread mapping in a state-of-the-art STM system, making it transparent to users. Finally, we perform an experimental evaluation and show that the static approach is fairly accurate and can improve the performance of a set of TM applications by up to 18%. Concerning the dynamic approach, we show that it can detect phase changes during the execution of TM applications composed of diverse workloads, predicting thread mappings adapted to each phase. On those applications, we achieve performance improvements of up to 31% in comparison to the best static strategy.
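A minimal sketch of the prediction step, with invented profile features and strategy labels (the thesis profiles real TM events and trains a proper predictor; a 1-nearest-neighbour rule stands in for it here):

```python
import math

# Invented (abort_ratio, avg_tx_length_us) profiles, each labelled with the
# thread-mapping strategy that hypothetically performed best for it.
training = [
    ((0.05, 10.0), "scatter"),      # low contention, short transactions
    ((0.60, 12.0), "compact"),      # high contention, short transactions
    ((0.10, 80.0), "round-robin"),  # low contention, long transactions
    ((0.70, 90.0), "compact"),      # high contention, long transactions
]

def predict(profile):
    """Return the mapping strategy of the nearest profiled application."""
    nearest = min(training, key=lambda entry: math.dist(entry[0], profile))
    return nearest[1]

print(predict((0.55, 20.0)))  # "compact": nearest to the high-contention profile
```

The essential point carried over from the abstract is that the predictor turns a runtime profile into a mapping decision without user intervention; the features, labels and learning method here are placeholders.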
Sabri, Yasmine. "Mail-Returns Process Optimization Using Lean Thinking Principles at The Swedish Tax Agency." Thesis, KTH, Industriell produktion, 2011. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-98792.
Повний текст джерелаZátopek, Jakub. "Optimalizace hodnotového toku." Master's thesis, Vysoké učení technické v Brně. Fakulta elektrotechniky a komunikačních technologií, 2012. http://www.nusl.cz/ntk/nusl-219893.
Повний текст джерелаGou, Changjiang. "Task Mapping and Load-balancing for Performance, Memory, Reliability and Energy." Thesis, Lyon, 2020. http://www.theses.fr/2020LYSEN047.
This thesis focuses on multi-objective optimization problems arising when running scientific applications on high-performance computing platforms and streaming applications on embedded systems. These optimization problems are all proven to be NP-complete, so our efforts mainly go into designing efficient heuristics for the general case and proposing optimal solutions for special cases. Some scientific applications are commonly modeled as rooted trees. Due to the size of temporary data, processing such a tree may exceed the local memory capacity. A practical solution on a multiprocessor system is to partition the tree into many subtrees and run each on a processor equipped with a local memory. We studied how to partition the tree into subtrees such that each subtree fits in local memory and the makespan is minimized, when communication costs between processors are accounted for. Then, a practical tree-scheduling problem arising in parallel sparse matrix solvers is examined. The objective is to minimize the factorization time by exhibiting good data locality and load balancing. The proportional mapping technique is a widely used approach to solve this resource-allocation problem. It achieves good data locality by assigning the same processors to large parts of the task tree, but it may limit load balancing in some cases. Based on proportional mapping, a dynamic scheduling algorithm is proposed that relaxes the data-locality criterion to improve load balancing. The performance of our approach has been validated by extensive experiments with the parallel sparse matrix direct solver PaStiX. Streaming applications often appear in the video and audio domains. They are characterized by a series of operations on streaming data and a high throughput. A Multi-Processor System on Chip (MPSoC) is a multi/many-core embedded system that integrates many specific cores through a high-speed interconnect on a single die.
Such systems are widely used for multimedia applications. Many MPSoCs are battery-operated, and such a tight energy budget intrinsically calls for an efficient schedule to meet intensive computation demands. Dynamic Voltage and Frequency Scaling (DVFS) can save energy by decreasing the frequency and voltage, at the price of increased failure rates. Another technique to reduce the energy cost while meeting a reliability target consists in running multiple copies of tasks. We first model applications as linear chains and study how to minimize the energy consumption under throughput and reliability constraints, using DVFS and duplication on MPSoC platforms. Then, in a follow-up study with the same optimization goal, we model streaming applications as series-parallel graphs, which are more complex than simple chains and more realistic. The target platform has a two-level hierarchical communication system, which is common in embedded systems and high-performance computing platforms. Reliability is guaranteed either by running tasks at the maximum speed or by triplicating them. Several efficient heuristics are proposed to tackle this NP-complete optimization problem.
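The energy/reliability trade-off described above can be sketched with standard textbook models (assumed here, not taken from the thesis): execution time scales as work/f, dynamic energy as work·f², and the fault rate grows exponentially as the frequency drops. All constants are illustrative.

```python
import math

# Assumed DVFS models: time(f) = work / f, dynamic energy = work * f^2
# (power ~ f^3), and an exponentially increasing fault rate at low frequency.
FMAX, FMIN, LAM0, D = 1.0, 0.4, 1e-6, 3.0

def lam(f):
    """Fault rate at normalized frequency f (higher at lower frequency)."""
    return LAM0 * 10 ** (D * (FMAX - f) / (FMAX - FMIN))

def energy(work, f, copies=1):
    return copies * work * f * f

def success(work, f, copies=1):
    p_fail = 1.0 - math.exp(-lam(f) * work / f)   # one copy fails
    return 1.0 - p_fail ** copies                 # not all copies fail

work = 100.0
# One copy at full speed vs. two slowed-down copies of the same task:
single = (energy(work, 1.0, 1), success(work, 1.0, 1))
duplex = (energy(work, 0.6, 2), success(work, 0.6, 2))
print(single)
print(duplex)
```

With these numbers, the duplicated slowed-down configuration consumes less energy (72 vs. 100 units) while duplication compensates for the higher per-copy fault rate, which is the kind of trade-off the thesis's heuristics explore.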
Vitter, Maxime. "Cartographier l'occupation du sol à grande échelle : optimisation de la photo-interprétation par segmentation d'image." Thesis, Lyon, 2018. http://www.theses.fr/2018LYSES011/document.
Over the last fifteen years, the emergence of remote-sensing data at Very High Spatial Resolution (VHRS) and the democratization of Geographic Information Systems (GIS) have helped to meet new and growing needs for spatial information. The development of new mapping methods offers an opportunity to understand and anticipate land-cover change at large scales, which is still poorly known. In France, spatial databases on land cover and land use at large scale have become an essential part of the current planning and monitoring of territories. However, this type of database remains difficult to acquire because the demand is for tailor-made cartographic products, adapted to the local problems of the territories. Faced with this growing demand, regular providers of this type of data seek to optimize their production processes with recent image-processing techniques. Nevertheless, photo interpretation remains the providers' favoured method: thanks to its great flexibility, it still meets the need for mapping at large scale, despite its high cost. Substituting fully automated production methods for photo interpretation is rarely considered. However, recent developments in image segmentation can contribute to optimizing photo-interpretation practice. This thesis presents a series of tools that contribute to the development of digitalization assistance for the photo-interpretation exercise. The assistance consists in pre-cutting the landscape using a segmentation carried out on a VHRS image. Tool development was carried out through three large-scale cartographic services, each with different production requirements and commissioned by public entities. The contribution of these automation tools is analysed through a comparison between two mapping procedures: manual photo interpretation versus digitally assisted segmentation.
The productivity gains brought by segmentation are evaluated using quantitative and qualitative indices on different landscape configurations. To varying degrees, it appears that whatever type of landscape is mapped, the gains associated with assisted mapping are substantial. These gains are discussed both technically and thematically from a commercial perspective
Nivoliers, Vincent. "Échantillonnage pour l'approximation de fonctions sur des maillages." Thesis, Université de Lorraine, 2012. http://www.theses.fr/2012LORR0161/document.
Digitalisation consists in storing an object in a computer for further manipulation with data-processing tools. In this document, we are interested in the digitalisation of three-dimensional objects. The first concern is recording the shape of the object. Many methods have been developed to address this problem, and we focus on objects described as meshes. On such objects, the storage of attributes such as colour, temperature or electrical charge is often useful, depending on the application. We describe two complementary approaches to this issue. The first relies on texture mapping. This technique consists in unfolding, or parametrising, the mesh onto a flat image in which the attribute is stored. A value recovered from the image can therefore be associated with each point of the object. We describe a method which hides the seam artifacts commonly encountered with this technique. Unfolding the mesh requires that its quality be good, which is not always the case. We therefore also describe a surface sampling method based on a restricted Voronoï diagram. In particular, we detail how to compute such an object efficiently and how to optimise it with respect to a quality measure. These results are then applied to the surface fitting problem.
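The sampling optimisation described in this abstract can be hinted at with a classical Lloyd relaxation. The following is a hedged toy sketch, in the plane rather than on a mesh surface and with illustrative names, not the thesis's restricted-Voronoï implementation:

```python
# Toy sketch of centroidal Voronoi-style sampling via Lloyd relaxation.
# The thesis optimises a *restricted* Voronoi diagram on a mesh surface;
# this planar version only illustrates the underlying fixed-point idea.

def quantization_energy(sites, samples):
    """Sum of squared distances from each sample to its nearest site."""
    return sum(min((sx - x) ** 2 + (sy - y) ** 2 for sx, sy in sites)
               for x, y in samples)

def lloyd(sites, samples, iters=20):
    sites = list(sites)
    for _ in range(iters):
        buckets = [[] for _ in sites]
        for x, y in samples:
            # assign each sample to its nearest site (discrete Voronoi cell)
            i = min(range(len(sites)),
                    key=lambda k: (sites[k][0] - x) ** 2 + (sites[k][1] - y) ** 2)
            buckets[i].append((x, y))
        for i, cell in enumerate(buckets):
            if cell:  # move the site to the centroid of its cell
                sites[i] = (sum(p[0] for p in cell) / len(cell),
                            sum(p[1] for p in cell) / len(cell))
    return sites
```

Each Lloyd step never increases the quantization energy, which is the kind of quality measure typically optimised in this setting.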
Farah, Jad. "Amélioration des mesures anthroporadiamétriques personnalisées assistées par calcul Monte Carlo : optimisation des temps de calculs et méthodologie de mesure pour l’établissement de la répartition d’activité." Thesis, Paris 11, 2011. http://www.theses.fr/2011PA112183/document.
To optimize the monitoring of female workers using in vivo spectrometry measurements, it is necessary to correct the typical calibration coefficients obtained with the Livermore male physical phantom. To do so, numerical calibrations based on Monte Carlo simulations combined with anthropomorphic 3D phantoms were used. Such computational calibrations require, on the one hand, the development of representative female phantoms of different sizes and morphologies and, on the other hand, fast and reliable Monte Carlo calculations. A library of female torso models was hence developed by fitting the weight of the internal organs and breasts according to body height and to relevant plastic surgery recommendations. This library was then used to perform a numerical calibration of the AREVA NC La Hague in vivo counting installation. Moreover, the morphology-induced variations of counting efficiency with energy were formulated as equations, and recommendations were given to correct the typical calibration coefficients for any monitored female worker as a function of body height and breast size. Meanwhile, variance reduction techniques and geometry simplification operations were considered to accelerate the simulations. Furthermore, to determine the activity mapping in the case of complex contaminations, a method combining Monte Carlo simulations with in vivo measurements was developed. This method consists in performing several spectrometry measurements with different detector positions; the contribution of each contaminated organ to the count is then assessed from Monte Carlo calculations. The in vivo measurements performed at LEDI, CIEMAT and KIT demonstrated the effectiveness of the method and highlighted the valuable contribution of Monte Carlo simulations to a more detailed analysis of spectrometry measurements. Thus, a more precise estimate of the activity distribution is obtained in the case of an internal contamination.
Berger, Karl-Eduard. "Placement de graphes de tâches de grande taille sur architectures massivement multicoeurs." Thesis, Université Paris-Saclay (ComUE), 2015. http://www.theses.fr/2015SACLV026/document.
This Ph.D. thesis is devoted to the study of the mapping problem for massively parallel embedded architectures. The problem arises from industrial needs such as energy savings and performance demands for synchronous dataflow applications. It has to be solved with respect to three criteria: heuristics should be able to deal with applications of various sizes, they must meet the capacity constraints of the processors, and they have to take into account the target architecture topologies. In this thesis, tasks are organized in communication networks modeled as graphs. In order to evaluate the efficiency of the developed heuristics, the mappings they produce are compared to a random mapping. This comparison is used as an evaluation metric throughout the thesis; it is motivated by the fact that no comparable heuristics could be found in the literature at the time of writing. Two heuristics are proposed. They solve a dataflow process network mapping problem in which a network of communicating tasks is placed onto a set of processors with limited resource capacities, while minimizing the overall communication bandwidth between processors. They are first applied to task graphs where the weights of tasks and edges are set to unitary values. The first heuristic, Task-wise Placement, places tasks one after another using a notion of task affinity. The second, Subgraph-wise Placement, gathers tasks into small groups and then places these groups onto processors using a notion of affinity between groups and processors. These algorithms are tested on task graphs with grid or logic-gate network topologies, and the results are compared to an algorithm from the literature that maps task graphs of moderate size onto massively parallel architectures. In addition, the random-mapping metric is used to evaluate the results of both heuristics. Then, to address problems found in industrial cases, the application cases are widened to task graphs whose task and edge weights are similar to those found in industry. A progressive construction heuristic named Regret-Based Approach, inspired by game theory, is proposed. This heuristic maps tasks one after another: the cost of mapping each task, given the already mapped tasks, is computed, and the task with the highest regret for not being placed is selected and placed in priority. To check the strength of the algorithm, many types of task graphs (grids, logic-gate networks, series-parallel graphs, random graphs, sparse matrices) of various sizes are generated, with task and edge weights randomly drawn from a bimodal law parameterized to mimic industrial applications. The results are compared to Task-wise Placement, specially adapted for non-unitary values, and are also evaluated using the metric defined above.
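The regret-driven selection described in this abstract can be sketched as follows. The cost model, names and data layout are illustrative assumptions, not the thesis's actual implementation:

```python
# Hedged sketch of a regret-based task placement heuristic: at each step,
# regret = (second-best placement cost) - (best placement cost); the task
# with the highest regret is placed first, on its cheapest processor.
# Cost here is cross-processor traffic to already-placed neighbours.

def placement_cost(task, proc, placed, comm, load, capacity):
    if load[proc] >= capacity:
        return float("inf")  # processor full
    cost = 0
    for (a, b), w in comm.items():
        if a == task and b in placed and placed[b] != proc:
            cost += w
        if b == task and a in placed and placed[a] != proc:
            cost += w
    return cost

def regret_placement(tasks, procs, comm, capacity):
    placed, remaining = {}, set(tasks)
    load = {p: 0 for p in procs}
    while remaining:
        best_task, best_proc, best_regret = None, None, -1
        for t in remaining:
            costs = sorted(placement_cost(t, p, placed, comm, load, capacity)
                           for p in procs)
            regret = (costs[1] - costs[0]) if len(costs) > 1 else 0
            if regret > best_regret:
                proc = min(procs, key=lambda p: placement_cost(
                    t, p, placed, comm, load, capacity))
                best_task, best_proc, best_regret = t, proc, regret
        placed[best_task] = best_proc
        load[best_proc] += 1
        remaining.remove(best_task)
    return placed
```

With a chain a–b (weight 10), b–c (weight 1), c–d (weight 10) and two processors of capacity 2, the heuristic keeps the heavy pairs together and cuts only the light edge.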
Cortés Ríos, Julio César. "Targeted feedback collection for data source selection with uncertainty." Thesis, University of Manchester, 2018. https://www.research.manchester.ac.uk/portal/en/theses/targeted-feedback-collection-for-data-source-selection-with-uncertainty(3ce078e5-4e5a-4bf5-afb3-1ee51b863925).html.
Ghasemi Dehkordi, Sepehr. "Towards an optimal model for green and safe driving." Thesis, Queensland University of Technology, 2019. https://eprints.qut.edu.au/131162/1/Sepehr_Ghasemi%20Dehkordi_Thesis.pdf.
Cheng, Sibo. "Error covariance specification and localization in data assimilation with industrial application Background error covariance iterative updating with invariant observation measures for data assimilation A graph clustering approach to localization for adaptive covariance tuning in data assimilation based on state-observation mapping Error covariance tuning in variational data assimilation: application to an operating hydrological model." Thesis, université Paris-Saclay, 2020. http://www.theses.fr/2020UPAST067.
Data assimilation techniques are widely applied in industrial problems of field reconstruction or parameter identification. The error covariance matrices, especially the background matrix, are often difficult to specify. In this thesis, we are interested in the specification and localization of covariance matrices in multivariate and multidimensional systems in an industrial context. We propose to improve the covariance specification by iterative processes, and develop two new iterative methods for estimating the background matrix. The power of these methods is demonstrated numerically in twin experiments, both with independent errors and with errors relative to the true states. We then propose a new concept of localization and apply it to error covariance tuning: instead of relying on spatial distance, this localization is established purely on the links between state variables and observations. Finally, we apply these new approaches, together with classical methods for comparison, to a multivariate hydrological model, where variational assimilation is implemented to correct the observed precipitation in order to obtain a better river flow forecast.
Qu, Zheng. "Théorie de Perron-Frobenius non linéaire et méthodes numériques max-plus pour la résolution d'équations d'Hamilton-Jacobi." Phd thesis, Ecole Polytechnique X, 2013. http://pastel.archives-ouvertes.fr/pastel-00927122.
Saad, Sawsan. "Conception et Optimisation Distribuée d’un Système d’Information des Services d’Aide à la Mobilité Urbaine Basé sur une Ontologie Flexible dans le Domaine de Transport." Thesis, Ecole centrale de Lille, 2010. http://www.theses.fr/2010ECLI0017/document.
Nowadays, information related to displacement and mobility in a transport network certainly represents a significant potential. This work aims to model, optimize and implement an Information System of Services to Aid Urban Mobility (ISSAUM). The ISSAUM first decomposes each set of simultaneous requests into a set of sub-requests called tasks. Each task corresponds to a service which may be proposed in different ways by several information providers. An information provider that wishes to propose services through the ISSAUM has to register its ontology. The ISSAUM relies on an extended and distributed Transport Multimodal Network (ETMN) which contains several heterogeneous data sources. The dynamic and distributed aspects of the problem led us to adopt a multi-agent approach to ensure the continual evolution and pragmatic flexibility of the system, and we proposed to automate the modeling of services using ontologies. Our ISSAUM takes into account possible disturbances through the ETMN. In order to satisfy user requests, we developed a negotiation protocol between the system agents. The proposed ontology-mapping negotiation model is based on a knowledge management system that supports semantic heterogeneity, and is organized into three layers: the Negotiation Layer (NL), the Semantic Layer (SEL) and the Knowledge Management Systems Layer (KMSL). We also detail the reassignment process, which uses a Dynamic Reassigned Tasks (DRT) algorithm supported by the ontology-mapping approach. Finally, the experimental results presented in this thesis justify the use of the ontology-based solution in our system and its role in the negotiation process.
Björklund, Ted, and Wictor Fors. "Waste Management With a Green Supply Chain : A case study regarding how for-profit organisations should utilise waste management." Thesis, Mälardalens högskola, Industriell ekonomi och organisation, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:mdh:diva-39338.
The purpose of this study is to find a sustainable approach to waste management and to examine how for-profit organisations should act on issues linked to environmental and cost perspectives, applying a green supply chain in order to study this approach from several perspectives of sustainability. A qualitative study with an abductive approach has been carried out; an abductive approach helps keep the study impartial while remaining open to an iterative process for finding new theories. A field study was conducted at Saab AB in Arboga, with a reference study at Saab AB in Malmslätt. The field studies were built around semi-structured interviews together with observations and document collection. All data from the field studies were compared and analysed using the three analytical tools described, in order to reach a discussion and several conclusions. The study shows that waste management is important in several areas of a for-profit organisation; the main areas identified are operations, employees, stakeholders and the environment. Problems linked to the awareness and knowledge of employees can be a key to finding sustainable solutions together with other actors, in order to develop the waste management situation along with a green supply chain. By addressing problems that connect different environments with on-site operations, it can be concluded that every change entails a higher initial cost, which can later be converted into a gain for the organisation, either in the form of money or in the form of information, reputation or attractiveness.
Khlissa, Radhouane. "Contribution à la définition des méthodes d'optimisation rapides et économiques pour le dimensionnement d'actionneurs électriques." Thesis, Compiègne, 2015. http://www.theses.fr/2015COMP2190/document.
This thesis focuses on the application of the Space Mapping optimization technique to the sizing of electrical actuators, taking into account a multi-physical modeling. The main interest of this type of optimization method is that it considerably reduces the cost of optimal sizing. The need for such an approach is due to several considerations. First, the modeling of electrical actuators increasingly requires the consideration of several physical phenomena (such as magnetic, electrical, thermal and mechanical phenomena) in order to better describe what is observed and measured. Besides, it becomes necessary to take the couplings between these physical phenomena into account in order to calculate their interdependencies precisely. In this context, the thermal aspect of electrical machines is particularly highlighted: a lumped-parameter model of a permanent magnet synchronous machine is built, and an experimental procedure is followed to validate the calculation results and define some elements of the proposed model. When implemented numerically, all the points mentioned above increase the cost of calculating the performance of the electrical actuator, and therefore the cost of the optimal sizing. Thus, using an optimization technique based on surrogate models reduces this cost. The Space Mapping technique is used in this work as a compromise between the quality of the results and the calculation time. It is applied in particular to the optimal sizing of a permanent magnet synchronous machine used as a starter in a hybrid vehicle application. The Space Mapping approach is compared to a classical approach that uses a single model of the sized electrical actuator, with no surrogate model. It is demonstrated that Space Mapping techniques find optimization results similar to those of the classical approach, yet much more efficiently: they require only a few evaluations of the multi-physical model of the actuator.
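The space-mapping idea summarised in this abstract can be illustrated with a small numeric sketch: a cheap "coarse" model stands in for an expensive "fine" model, and a mapping between their parameter spaces is refined iteratively. Both models below are toy functions chosen for the demonstration, not the thesis's machine models:

```python
import math

# Hedged illustration of aggressive space mapping (ASM) with an identity
# mapping Jacobian. The design goal is a response equal to the target 0.

def fine(x):       # "expensive" model (e.g. a multi-physics simulation)
    return math.exp(x) - 5.0

def coarse(z):     # cheap surrogate with a systematic parameter distortion
    return math.exp(1.1 * z) - 5.0

def extract_parameter(x):
    """Parameter extraction: the z with coarse(z) == fine(x).
    Closed form for these toy models; in practice a small optimisation."""
    return x / 1.1

def aggressive_space_mapping(x0, z_star, iters=20, tol=1e-10):
    """Drive the extracted parameter p(x) towards the coarse optimum z_star."""
    x = x0
    for _ in range(iters):
        step = extract_parameter(x) - z_star
        if abs(step) < tol:
            break
        x = x - step
    return x

z_star = math.log(5.0) / 1.1          # coarse optimum: coarse(z_star) = 0
x_opt = aggressive_space_mapping(1.0, z_star)
# x_opt converges to ln(5), the fine-model optimum, after a handful of
# cheap parameter extractions rather than many fine-model evaluations.
```

The efficiency gain mirrors the abstract's point: after the setup, only the cheap coarse model is optimised, and the fine model is queried a few times.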
Ravon, Gwladys. "Problèmes inverses pour la cartographie optique cardiaque." Thesis, Bordeaux, 2015. http://www.theses.fr/2015BORD0118/document.
Since the 1980s, optical mapping has become an important tool for the study and understanding of cardiac arrhythmias. This experiment allows the visualization of fluorescence fluxes through the tissue surface, and the fluorescence is directly related to the transmembrane potential. Information about its three-dimensional distribution is hidden in the surface data. Our aim is to exploit these surface measurements to reconstruct the depolarization front within the tissue thickness. For that purpose, we developed a method based on the resolution of an inverse problem. The forward problem consists of two diffusion equations and a parametrization of the wavefront; solving the inverse problem identifies the characteristics of the front. The method was tested on in silico data with different ways of parameterizing the front (expanding sphere, eikonal equation). The results are very satisfactory and are compared to a method derived by Khait et al. [1]. Moving to experimental data brought to light an inconsistency in the model, and we detail the possible causes we explored to improve it: constant illumination, optical parameters, and the accuracy of the diffusion approximation. Several inverse problems are considered in this manuscript, involving several cost functions and their associated gradients; in each case the gradient is computed explicitly, and minimization is often performed with a gradient method. The presented method was also applied to data other than cardiac optical mapping.
Petit, Tristan. "Caractérisation des fonds marins et estimation bathymétrique par inversion de modèle de transfert radiatif : application à l'imagerie hyperspectrale en milieu coralien." Thesis, Brest, 2017. http://www.theses.fr/2017BRES0023/document.
Airborne hyperspectral imaging is a potential candidate for mapping and monitoring coral reefs at large scale and with high spatial resolution. In this thesis, we first present the processing steps to be applied to hyperspectral signals in order to extract information about seabed type, bathymetry and water optical properties, and we discuss their efficiency with respect to two main confounding factors: (i) the low signal-to-noise ratio of the measured signals, and (ii) the large number and variability of physical interactions occurring between the entrance of sunlight into the atmosphere and its measurement by the hyperspectral sensor. Considering these limitations, we examine the performance of an existing water column processing method: semi-analytical model inversion by optimization. We first evaluate the robustness of seabed type and bathymetry estimation for six different inversion setups. The results on hyperspectral images acquired over Réunion Island reefs in 2009 show that the choice of the inversion setup plays an important role in the quality of the estimations and that the most widely used setup does not always produce the best results. We then evaluate the importance of the accuracy of the parameterization of the direct semi-analytical model, through a sensitivity analysis performed on both simulated and real hyperspectral data acquired in Réunion Island in 2015, for each inversion setup previously studied. This study shows that, in a coral reef context, the accuracy of the parameterization of the direct model is less important than the choice of the inversion setup. We also show that it is not possible to identify the most influential parameters of the direct model, because this depends on the relative concentration of each optically active constituent.
Pecen, Pavel. "Navržení nástrojů pro řízení internetového obchodu, optimalizace a standardizace procesů pro vybraný podnik." Master's thesis, Vysoká škola ekonomická v Praze, 2011. http://www.nusl.cz/ntk/nusl-124597.
Benda, Ondřej. "Optimalizace činnosti měrového střediska." Master's thesis, Vysoké učení technické v Brně. Fakulta strojního inženýrství, 2008. http://www.nusl.cz/ntk/nusl-228196.
Marcelin, Yvesner. "Développements récents en analyse multivoque : prédérivées et optimisation multivoque." Thesis, Antilles, 2016. http://www.theses.fr/2016ANTI0040/document.
This work is devoted to the study of prederivatives of set-valued maps and to set optimization theory. First, we establish results on the existence of several kinds of prederivatives for some classes of set-valued maps, especially maps enjoying convexity properties. Subsequently, we apply these results in the framework of set optimization by establishing both necessary and sufficient optimality conditions, involving such prederivatives, for set optimization problems. Under convexity assumptions, we prove natural results fitting the paradigm of minimizers in convex optimization. We then apply some of our theoretical results to a model of welfare economics, establishing in particular an equivalence between the weak Pareto optimal allocations of the model and the weak minimizers of an associated set optimization problem. Taking advantage of several generalized interiority notions existing in the literature, we discuss corresponding notions of relaxed minimizers in a unified way. In order to establish stability results, we introduce a topology on ordered vector spaces from which we derive a concept of convergence; this is used to define two concepts of variational convergence that allow us to study both the upper and the lower stability of the sets of relaxed minimizers we consider.
Bonnal, Thomas. "Développements de modèles optiques et de méthodes non supervisées de résolution des problèmes bilinéaires : application à l’imagerie vibrationnelle." Thesis, Lyon, 2018. http://www.theses.fr/2018LYSE1063.
Information complementary to that provided by elemental analysis and diffraction techniques is needed to characterize inorganic materials. Fourier transform infrared spectroscopy makes it possible to characterize covalent bonds and the environment of functional groups in materials. It is thus a technique of interest for studying hydrated materials, amorphous materials or any material subject to ageing phenomena. By combining this technique with a micrometric motorized stage, maps of chemical compounds can be obtained over several square millimeters: this is infrared microscopy. This Ph.D. thesis focuses on the use of reflected light, in particular through the study of specular reflection and Attenuated Total Reflectance (ATR). After a first part focused on the different acquisition set-ups, a second part covers the unsupervised resolution methodologies employed to obtain chemical maps, which yield one map for each component present in the analyzed area. Dimension reduction and multivariate statistical techniques are implemented to estimate the number of components and their infrared spectra, and constrained minimization problems are solved to retrieve the chemical information. When specular reflection is used to acquire spectra, no contact is made with the sample, so the analyzed area is not damaged during acquisition; a priori, this makes it a great technique for studying the evolution of a material. However, it suffers from the complexity of interpreting the resulting spectra. With the objective of democratizing the use of specular reflection to obtain chemical maps, models based on geometrical optics, including diffraction, correction of interferograms and classical homogenization techniques, have been developed. This work resulted in an optical model linking the angle of incidence, the polarization state and the dielectric optical constants of the material to the measured reflected light. A model material, constituted of three distinct phases detectable in the infrared range, was specially fabricated to validate this optical model. The model sets the stage for the use of elliptically polarized light in determining the complex refractive indices of materials in the infrared range. Thanks to this development, infrared spectrometers equipped with a classical set-up to control the angle of incidence can now be used in addition to ellipsometry techniques.
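The constrained bilinear problem mentioned in this abstract (data ≈ spectra × concentrations, everything non-negative) is commonly solved with non-negative matrix factorisation. The following is a hedged sketch using the textbook multiplicative updates, not the thesis's exact resolution method:

```python
import random

# Generic NMF with multiplicative updates: factor D (n x m) into
# S (n x rank, e.g. spectra) times C (rank x m, e.g. concentration maps),
# with all entries kept non-negative.

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def transpose(A):
    return [list(r) for r in zip(*A)]

def frob_err(D, S, C):
    R = matmul(S, C)
    return sum((D[i][j] - R[i][j]) ** 2
               for i in range(len(D)) for j in range(len(D[0]))) ** 0.5

def nmf(D, rank, iters=1000, eps=1e-9, seed=0):
    rnd = random.Random(seed)
    n, m = len(D), len(D[0])
    S = [[rnd.random() + 0.1 for _ in range(rank)] for _ in range(n)]
    C = [[rnd.random() + 0.1 for _ in range(m)] for _ in range(rank)]
    for _ in range(iters):
        St = transpose(S)                       # C <- C * (S^T D) / (S^T S C)
        num, den = matmul(St, D), matmul(matmul(St, S), C)
        C = [[C[i][j] * num[i][j] / (den[i][j] + eps) for j in range(m)]
             for i in range(rank)]
        Ct = transpose(C)                       # S <- S * (D C^T) / (S C C^T)
        num, den = matmul(D, Ct), matmul(S, matmul(C, Ct))
        S = [[S[i][j] * num[i][j] / (den[i][j] + eps) for j in range(rank)]
             for i in range(n)]
    return S, C
```

The multiplicative form keeps both factors non-negative by construction, which is what makes this family of updates attractive for spectral maps.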
Numan, Mostafa Wasiuddin. "Mapping of processing elements of hardware-based production systems on networks on chip." Thesis, 2017. http://hdl.handle.net/2440/112587.
Thesis (Ph.D.) -- University of Adelaide, School of Electrical and Electronic Engineering, 2017.
Van Zyl, Pieter. "Performance investigation into selected object persistence stores." Diss., 2010. http://hdl.handle.net/2263/26497.
Computer Science