Dissertations / Theses on the topic 'Information relaxation'

Consult the top 25 dissertations / theses for your research on the topic 'Information relaxation.'

You can also download the full text of each publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse dissertations and theses from a wide variety of disciplines and organise your bibliography correctly.

1

Li, Jianxin. "Adaptive query relaxation and processing over heterogeneous xml data sources." Swinburne Research Bank, 2009. http://hdl.handle.net/1959.3/66874.

Full text
Abstract:
Thesis (Ph.D) - Swinburne University of Technology, Faculty of Information & Communication Technologies, 2009.
A dissertation submitted to the Faculty of Information and Communication Technologies, Swinburne University of Technology in partial fulfillment of the requirements for the degree of Doctor of Philosophy, 2009. Typescript. "August 2009". Bibliography p. 161-171.
APA, Harvard, Vancouver, ISO, and other styles
2

Yang, S. A. "KICS : representation of regulatory information and the use of case-based reasoning to support the relaxation process." Thesis, University of Edinburgh, 1995. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.664100.

Full text
Abstract:
This thesis presents a case-based reasoning system (KICS) which can assist domain experts in interpreting building regulations in relaxation cases. In traditional legal decision support systems, it has been regarded as natural to represent legal rules in statutes as If-Then decision rules and to link these rules to a separate case-based reasoning system for handling cases. However, we take the view that legal rules in statutory regulations are the result of the accumulation and generalisation of rulings made in case histories, and this has led us to a unified case-based approach that handles both statutory regulations and cases. First, we propose a unified case-based model of regulatory information. In this model, regulatory information, i.e. legal rules from statutes and precedent cases, is represented as models, and interpretation hierarchies of legal rules are represented as abstraction hierarchies of models in the Model Knowledge Base. Actual cases are stored in the Case Library together with the arguments debated. Background domain knowledge used in classifying input cases is represented as semantic networks and heuristic rules in the Domain Knowledge Base. Second, we propose the use of case-based reasoning to access and maintain regulatory information. Models relevant to the input case are retrieved by identifying the level of abstraction at which the input case is described and by selecting models similar to the input case. If the input case does not comply with the retrieved models, the principles behind the retrieved legal rules and previous similar cases are explained to the user, who is asked whether relaxation can be granted. The decision on relaxation is made by the user, and rulings made in cases in which relaxation is granted are acquired by generalising them and, where possible, combining them with existing models in the abstraction hierarchies.
APA, Harvard, Vancouver, ISO, and other styles
3

Laurent, Sabine. "Orientation optique et relaxation du spin du trion dans les boîtes quantiques d'InAs/GaAs." Phd thesis, Université Pierre et Marie Curie - Paris VI, 2004. http://tel.archives-ouvertes.fr/tel-00007028.

Full text
Abstract:
The electron spin in semiconductors is currently the subject of numerous investigations. As a two-level quantum system, it could form the basic element of a future quantum computer: a quantum bit. One of the physical limits of such a computer arises first and foremost from the decoherence of the superposition of the up and down states of this quantum bit. The challenge is therefore to obtain a system in which this coherence time is long enough to allow future quantum manipulations. In this context, semiconductor quantum dots charged with a single electron are promising candidates. Owing to their nanometric dimensions, these systems have a discrete electronic structure, which suppresses the main mechanisms responsible for spin relaxation. This thesis therefore aims to determine the spin relaxation time in such a quantum dot.
APA, Harvard, Vancouver, ISO, and other styles
4

Rival, Olivier. "Organic materials for quantum computation." Thesis, University of Oxford, 2009. http://ora.ox.ac.uk/objects/uuid:3674b9ce-c284-47b5-ab0d-76d094c849f0.

Full text
Abstract:
Quantum mechanics has a long history of helping computer science. For a long time it provided help only at the hardware level, by giving a better understanding of the properties of matter and thus allowing the design of ever smaller and ever more efficient components. For the last few decades, much research has been dedicated to finding out whether computer science can be changed even more radically by using the principles of quantum mechanics at both the hardware and algorithm levels. This field of research, called Quantum Information Processing (QIP), has rapidly seen interesting theoretical developments: in particular, it was shown that using superpositions of states leads to computers that could outperform classical ones. The experimental side of QIP, however, lags far behind, as it requires an unprecedented amount of control over, and understanding of, quantum systems. Much effort is spent on finding which particular systems would provide the best physical implementation of QIP concepts. Because of their nearly endless versatility and the high degree of control over their synthesis, organic materials deserve to be assessed as a possible route to quantum computers. This thesis studies the QIP potential of spin degrees of freedom in several such organic compounds. Firstly, a study of low-spin antiferromagnetic rings is presented. It is shown that in this class of molecular nanomagnets the relaxation times are much longer than previously expected and, in particular, long enough for up to a few hundred quantum operations to be performed. A detailed study of the relaxation mechanisms is presented and, with it, routes to increasing the phase coherence time further by choosing a suitable temperature, isotopic or chemical substitution, or solvent. A study of higher-spin systems is also presented, and it is shown that the relaxation mechanisms are essentially the same as in low-spin compounds. The route to multi-qubit systems is also investigated: the magnetic properties of several supermolecular assemblies, in particular dimers, are studied. Coupling between neighbouring nanomagnets is demonstrated, and experimental issues are raised concerning the study of the coherent dynamics of dimers. Finally, a study of the purely organic compound phenanthrene is reported. In this molecule the magnetic moment does not result from the interactions between several transition metal ions, as in molecular nanomagnets, but from the photoexcitation of an otherwise diamagnetic molecule. The relevance of such a system to QIP is presented, and relaxation times and coupling to relevant nuclei are identified.
APA, Harvard, Vancouver, ISO, and other styles
5

Sapena, Masip Emili. "A constraint-based hypergraph partitioning approach to coreference resolution." Doctoral thesis, Universitat Politècnica de Catalunya, 2012. http://hdl.handle.net/10803/83904.

Full text
Abstract:
The objectives of this thesis are focused on research in machine learning for coreference resolution. Coreference resolution is a natural language processing task that consists of determining the expressions in a discourse that mention or refer to the same real-world entity. The task has a direct impact on text mining as well as on many natural language tasks that require discourse interpretation, such as summarisation, question answering, and machine translation; resolving coreferences is essential to "understanding" a text or a discourse. The research concentrates on the following areas. Classification models: the most common classification models in the state of the art are based on the independent classification of pairs of mentions, and models that classify groups of mentions have appeared more recently; one objective of the thesis is to incorporate the entity-mention model into the developed approach. Problem representation: there is still no definitive representation of the problem, and this thesis proposes a hypergraph representation. Resolution algorithms: depending on the problem representation and the classification model, resolution algorithms can be very diverse; one objective of this thesis is to find a resolution algorithm capable of using the classification models on the hypergraph representation. Knowledge representation: in order to manage knowledge from diverse sources, a symbolic and expressive representation of that knowledge is needed, and this thesis proposes the use of constraints. Incorporation of world knowledge: some coreferences cannot be resolved with linguistic information alone, as common sense and world knowledge are often required; this thesis proposes a method for extracting world knowledge from Wikipedia and incorporating it into the resolution system. The main contributions of this thesis are (i) a new approach to coreference resolution based on constraint satisfaction, using a hypergraph to represent the problem and solving it by relaxation labeling, and (ii) research towards improving coreference resolution performance using world knowledge extracted from Wikipedia. The developed approach can combine the mention-pair and entity-mention classification models, with more expressiveness than pair-based approaches alone, and overcomes weaknesses of previous state-of-the-art approaches such as contradictory independent classifications, classification without context, and lack of information when evaluating pairs. Furthermore, the approach allows new information to be incorporated by adding constraints, and research has been carried out on using world knowledge to improve performance. RelaxCor, the system implemented during the thesis to experiment with the proposed approach, achieved state-of-the-art results and participated in the international competitions SemEval-2010 and CoNLL-2011, obtaining second position in CoNLL-2011.
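Since the abstract hinges on relaxation labeling, here is a minimal generic sketch of the classical iterative update (in the Rosenfeld-Hummel-Zucker style), offered only as orientation: it is not RelaxCor's implementation, and the array shapes, the compatibility tensor r, and the toy data are illustrative assumptions.

```python
import numpy as np

def relaxation_labeling(p, r, iterations=50):
    """Generic relaxation labeling update (illustrative sketch only).

    p : (n, m) array   -- p[i, l] is the current confidence that object i takes label l
    r : (n, m, n, m)   -- r[i, l, j, k] is the compatibility of (i = l) with (j = k), in [-1, 1]
    """
    n, _ = p.shape
    for _ in range(iterations):
        # support each labeling (i, l) receives from the current labeling of the other objects
        s = np.einsum('iljk,jk->il', r, p) / n
        p = p * (1.0 + s)                      # reinforce compatible labelings
        p = p / p.sum(axis=1, keepdims=True)   # renormalise each object's label distribution
    return p

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    p0 = np.full((3, 2), 0.5)                  # 3 objects, 2 candidate labels, uninformed start
    r = rng.uniform(-1, 1, size=(3, 2, 3, 2))  # random compatibilities for the toy example
    print(relaxation_labeling(p0, r))
```

In a coreference setting, the objects would correspond to edges or hyperedges over mentions and the compatibilities would encode the constraints, but those details are specific to the thesis and not reproduced here.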
APA, Harvard, Vancouver, ISO, and other styles
6

Dai, Jianxing. "Analysis and Design of a High-Frequency RC Oscillator Suitable for Mass Production." Thesis, Linköpings universitet, Elektroniska Kretsar och System, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-138423.

Full text
Abstract:
Oscillators are components providing clock signals. They are widely required by low-cost on-chip applications, such as biometric sensors and SoCs. As part of a sensor, a relaxation oscillator is implemented to provide a clock reference, since the sensor application does not allow a clock reference outside the sensor. An RC implementation of the oscillator offers a balanced accuracy performance at low cost, hence an RC relaxation oscillator is chosen to provide the clock inside the sensor. This thesis proposes a current-mode relaxation oscillator to achieve a low frequency standard deviation across different supplies, temperatures, and process corners. A comparison between a given relaxation oscillator and the proposed design is made as well. All oscillators in this thesis use a 0.18 μm technology and a 1.8 V nominal supply. The proposed oscillator achieves a frequency standard deviation of less than ±6.5% across all PVT variations at a 78.4 MHz output frequency, with a power dissipation of 461.2 μW. The layout of the oscillator's core area takes up 0.003 mm².
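As rough orientation for how a relaxation oscillator sets its frequency, the textbook first-order estimate is sketched below; it is not taken from the thesis, whose current-mode topology, thresholds, and trimming will differ, so treat C, ΔV, and I as placeholder symbols.

```latex
% First-order timing of a relaxation oscillator: a constant current I charges a
% capacitor C between two comparator thresholds separated by \Delta V, and the
% ramp direction alternates every half-period (illustrative approximation only).
T \;\approx\; \frac{2\,C\,\Delta V}{I},
\qquad
f \;=\; \frac{1}{T} \;\approx\; \frac{I}{2\,C\,\Delta V}
```

Under this approximation, the frequency spread across PVT corners follows directly from how I, C, and ΔV vary with supply, temperature, and process, which is the spread the proposed design seeks to minimise.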
APA, Harvard, Vancouver, ISO, and other styles
7

Marino, Robert. "Propriétés magnétiques et optiques de cristaux dopés terres rares pour l’information quantique." Thesis, Lille 1, 2011. http://www.theses.fr/2011LIL10101/document.

Full text
Abstract:
The control of information represents a competitive advantage today. Despite an intensification of the means developed to protect data streams, it is currently not possible to exchange information remotely between two parties in a completely secure way. However, the work of Bennett and Brassard has shown that it is possible to achieve a maximum level of security by using a quantum protocol for transmitting information. This protocol is based on telecom networks that use quantum repeaters in place of conventional repeaters. The route studied in this thesis, carried out partly in the framework of the European project QuRep, aims to improve knowledge of single crystals doped with rare-earth ions, which are good candidates for the development of quantum repeaters. Two main lines of work emerged. First, we tried to understand the success factors and limiting factors in the use of the Nd:YSO crystal as a host for quantum memories, with the objective of transferring the electronic coherence to hyperfine levels. Second, we studied a crystal with a hyperfine structure directly accessible optically, Er:YLF, in order to assess its potential use for quantum memories. Among other things, this work achieved the transfer of coherence from an electronic Zeeman level to a hyperfine level with a storage time of over 300 µs, which makes it possible to envisage an on-demand readout quantum memory in Nd:YSO.
APA, Harvard, Vancouver, ISO, and other styles
8

Joseph, Binoy. "Clustering For Designing Error Correcting Codes." Thesis, Indian Institute of Science, 1994. http://hdl.handle.net/2005/66.

Full text
Abstract:
In this thesis we address the problem of designing codes for specific applications. To do so we make use of the relationship between clusters and codes. Designing a block code over any finite-dimensional space may be thought of as forming the corresponding number of clusters over that space. The literature offers a number of clustering algorithms, and we have examined the performance of several of them, such as Linde-Buzo-Gray, Simulated Annealing, Simulated Annealing with Linde-Buzo-Gray, and Deterministic Annealing, for the design of codes. All of these algorithms, however, use the squared Euclidean error as the distance measure for clustering. This does not match the distance measure of interest in the error-correcting scenario, namely the Hamming distance. Consequently, we have developed an algorithm that can be used for clustering with the Hamming distance as the distance measure. It has also been observed that stochastic algorithms such as Simulated Annealing fail to produce optimum codes because of very slow convergence near the end. As a remedy, we have proposed a modification of such algorithms for code design, based on the code structure, which makes it possible to converge to the optimum codes.
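To make the clustering-with-Hamming-distance idea concrete, below is a hypothetical Lloyd-style sketch in which codewords are updated by coordinate-wise majority vote, the minimiser of total Hamming distance within a cluster; it illustrates the general idea only and is not the algorithm developed in the thesis.

```python
import numpy as np

def hamming_clustering(data, k, iterations=20, seed=0):
    """Lloyd-style clustering of binary vectors under the Hamming distance (sketch only)."""
    rng = np.random.default_rng(seed)
    n, _ = data.shape
    codewords = data[rng.choice(n, size=k, replace=False)].copy()
    for _ in range(iterations):
        # Hamming distance from every data vector to every current codeword
        dist = (data[:, None, :] != codewords[None, :, :]).sum(axis=2)
        assign = dist.argmin(axis=1)
        for c in range(k):
            members = data[assign == c]
            if len(members):
                # coordinate-wise majority vote minimises total Hamming distance to the members
                codewords[c] = (members.mean(axis=0) >= 0.5).astype(data.dtype)
    return codewords, assign

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    bits = rng.integers(0, 2, size=(200, 8))   # toy binary data, 8-bit words
    codewords, labels = hamming_clustering(bits, k=4)
    print(codewords)
```

Replacing the majority-vote update with a Euclidean centroid would recover ordinary k-means, which is exactly the mismatch with the error-correcting setting that the abstract points out.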
APA, Harvard, Vancouver, ISO, and other styles
9

Zaïdi, Abdelhamid. "Séparation aveugle d'un mélange instantané de sources autorégressives gaussiennes par la méthode du maximum de vraissemblance exact." Université Joseph Fourier (Grenoble), 2000. http://www.theses.fr/2000GRE10233.

Full text
Abstract:
This thesis is devoted to the problem of blind separation of an instantaneous mixture of autoregressive Gaussian sources, without additive noise, by the exact maximum likelihood method. The maximisation of the likelihood is decomposed, by relaxation, into two optimisation sub-problems, which are themselves treated by relaxation techniques. The first consists of estimating the separation matrix with the autoregressive structure of the sources held fixed. The second is to estimate this structure when the separation matrix is fixed. The first problem is equivalent to maximising the determinant of the separation matrix under nonlinear constraints. We give an algorithm for computing the solution of this problem and specify its convergence conditions. We show the existence of the maximum likelihood estimator and prove its consistency. We also determine the Fisher information matrix for the global parameter and propose an index to measure the performance of separation methods. We then analyse, by simulation, the performance of the estimator thus defined and show the improvement it brings to the quasi-maximum likelihood procedure as well as to other second-order methods.
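The likelihood being maximised can be written in the standard form for a noiseless instantaneous mixture; the notation below is generic (the thesis specialises the source densities to Gaussian autoregressive models and works with the exact likelihood), so it is an orientation rather than the thesis's exact formulation.

```latex
% Log-likelihood of T observations of a noiseless instantaneous mixture x_t = A s_t,
% expressed through the separation matrix B = A^{-1}, whose i-th row is b_i^T; the
% sources are independent, with density p_i and parameters theta_i for source i.
\log L(B,\theta) \;=\; T \log \lvert \det B \rvert
\;+\; \sum_{i=1}^{n} \log p_i\!\bigl(b_i^{\top}x_1,\dots,b_i^{\top}x_T;\;\theta_i\bigr)
```

The relaxation described in the abstract then alternates between maximising this criterion over B with the autoregressive parameters θ held fixed, and over θ with B held fixed.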
APA, Harvard, Vancouver, ISO, and other styles
10

Swoboda, Paul [Verfasser], and Christoph [Akademischer Betreuer] Schnörr. "New Convex Relaxations and Global Optimality in Variational Imaging / Paul Swoboda ; Betreuer: Christoph Schnörr." Heidelberg : Universitätsbibliothek Heidelberg, 2016. http://d-nb.info/1178009688/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
11

Strekalovskiy, Evgeny [Verfasser], Daniel [Akademischer Betreuer] Cremers, and Antonin [Akademischer Betreuer] Chambolle. "Convex Relaxation of Variational Models with Applications in Image Analysis / Evgeny Strekalovskiy. Betreuer: Daniel Cremers. Gutachter: Daniel Cremers ; Antonin Chambolle." München : Universitätsbibliothek der TU München, 2015. http://d-nb.info/1080299394/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
12

Klodt, Maria [Verfasser], Daniel [Akademischer Betreuer] Cremers, Wolfgang [Akademischer Betreuer] Förstner, and Bjoern Holger [Akademischer Betreuer] Menze. "Convex Relaxation Methods for Image Segmentation and Stereo Reconstruction / Maria Klodt. Gutachter: Wolfgang Förstner ; Daniel Cremers ; Bjoern Menze. Betreuer: Daniel Cremers." München : Universitätsbibliothek der TU München, 2014. http://d-nb.info/107062425X/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
13

Heinz, Stefan [Verfasser], Thorsten [Akademischer Betreuer] Koch, J. Christopher [Gutachter] Beck, and Thorsten [Gutachter] Koch. "Presolving techniques and linear relaxations for cumulative scheduling / Stefan Heinz ; Gutachter: J. Christopher Beck, Thorsten Koch ; Betreuer: Thorsten Koch." Berlin : Technische Universität Berlin, 2018. http://d-nb.info/1160593450/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
14

Brown, Richard Matthew. "Coherent transfer between electron and nuclear spin qubits and their decoherence properties." Thesis, University of Oxford, 2012. http://ora.ox.ac.uk/objects/uuid:21e043b7-3b72-44d7-8095-74308a6827dd.

Full text
Abstract:
Conventional computing faces a huge technical challenge, as traditional transistors will soon reach their size limitations. This will halt progress towards faster processing speeds, and overcoming this problem will require an entirely new approach. Quantum computing (QC) is a natural solution, offering a route to miniaturisation by, for example, storing information in electron or nuclear spin states, whilst harnessing the power of quantum physics to perform certain calculations exponentially faster than its classical counterpart. However, QCs face many difficulties, such as protecting the quantum bit (qubit) from the environment and from its irreversible loss through the process of decoherence. Hybrid systems provide a route to harnessing the benefits of multiple degrees of freedom through the coherent transfer of quantum information between them. In this thesis I show coherent qubit transfer between electron and nuclear spin states in a 15N@C60 molecular system (comprising a nitrogen atom encapsulated in a carbon cage) and in a solid-state system using phosphorus donors in silicon (Si:P). The propagation uses a series of resonant microwave and radiofrequency pulses and is shown with a two-way fidelity of around 90% for an arbitrary qubit state. The transfer allows quantum information to be held in the nuclear spin for up to three orders of magnitude longer than in the electron spin, producing a 15N@C60 and Si:P ‘quantum memory’ of up to 130 ms and 1.75 s, respectively. I show that electron and nuclear spin relaxation (T1) in both systems is dominated by a two-phonon process resonant with an excited state, with a constant electron/nuclear T1 ratio. The thesis further investigates the decoherence and relaxation properties of metal atoms encapsulated in a carbon cage, termed metallofullerenes, discovering that exceptionally long electron spin decoherence times are possible, such that these can be considered viable QC candidates.
APA, Harvard, Vancouver, ISO, and other styles
15

Bouakaz, Saïda. "Approche stochastique de la segmentation des images : un modèle de coopération entre les primitives de régions et de frontières." Grenoble 1, 1987. http://tel.archives-ouvertes.fr/tel-00324438.

Full text
Abstract:
In the field of labeling, stochastic methods fit into a framework in which the assignment of a label to an object is perceived as dynamic knowledge. This assignment can be modified as contextual knowledge evolves, and in return it can influence the instantaneous state of that knowledge. Such methods have the advantage of introducing local control over the evolution of the labeling. Building on this notion of labeling, in its local aspect, we approached the problem of image segmentation, having observed that the decision at each point influences, and is influenced by, those of its neighbours. Moreover, segmentation methods have so far been based essentially on a choice between two types of model: the region model and the boundary model. The iterative stochastic methods addressed in this thesis make it possible to bring both entities into play simultaneously and to treat them within a single process. The procedure consists of introducing, for each point, region-type information and boundary-type information into its labeling vector. Contextual relations intervene in the form of inter-class interactions and inter-entity interactions.
APA, Harvard, Vancouver, ISO, and other styles
16

Ait, Ouahmed Mohammed Amine. "Optimisation dans l'auto-partage à un seul sens avec voitures électriques et relocalisations." Thesis, Avignon, 2018. http://www.theses.fr/2018AVIG0228/document.

Full text
Abstract:
This thesis aims at modelling and solving optimisation problems related to the management of one-way electric car-sharing systems, where users can take a car from a station, use it, and then return it to another station. This generally leads to an imbalanced distribution of cars, with some stations full and others empty. A solution to this problem, implemented by car-sharing operators, is to employ staff agents to move cars as needed. Identifying and responding to this need is, however, a non-trivial optimisation problem, especially since the vehicles are electric, which introduces battery-recharging and autonomy constraints. The global optimisation problem is divided into two sub-problems. The first is assigning cars to customers, together with their routing; it is denoted ROCSP (Recharging One-way Car Sharing Problem). The second involves agent planning and routing; it is denoted ESRP (Employee Scheduling Routing Problem). 1. For the ROCSP, we propose two mixed-integer linear programming (MILP) formulations of the problem, one based on flows and the other based on paths, so the two models include the battery-recharging constraints in two different ways. As the exact resolution through the MILP models is quite expensive in terms of computation time and is not suited to real-size car-sharing instances, we introduce heuristics that optimise car redistribution and the management of the service within a reasonable amount of time. These heuristics compute the number of cars and the various relocation operations to be performed on a given day. 2. The ESRP is also addressed with MILP models for the exact resolution, and heuristics are suggested for an approximate and relatively fast resolution. The aim is to find the minimum number of agents needed to perform the relocation operations that stem from the first problem, the ROCSP. In a prospective part, once the ROCSP and ESRP are solved in their static versions, we focus on another variant of the problem: the ROCSP with dynamic reservation. We also explore a new concept, Auto-CoPartage, a hybrid of car-sharing and carpooling. The proposed algorithms are validated mainly on the Auto Bleue electric-vehicle fleet of the city of Nice, relying on flow-generation models to estimate demand, but also on other instances that we generated to simulate other cities, within a Geographical Information System.
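As a toy illustration of the flow-based view of relocation, and nothing more, the sketch below balances car counts between surplus and deficit stations at minimum cost with a plain transportation LP; the station counts and costs are invented, and the thesis's ROCSP and ESRP models additionally handle time, battery recharging, and agent routing.

```python
import numpy as np
from scipy.optimize import linprog

# Invented toy data: two stations with surplus cars, two with a deficit.
surplus = np.array([3, 2])            # cars to remove from stations A and B
deficit = np.array([1, 4])            # cars needed at stations C and D
cost = np.array([[2.0, 5.0],          # cost[i, j]: moving one car from surplus station i
                 [4.0, 1.0]])         # to deficit station j

n_s, n_d = cost.shape
c = cost.ravel()                      # decision variables x[i, j], flattened row-major

# supply constraints: sum_j x[i, j] == surplus[i]
A_supply = np.zeros((n_s, n_s * n_d))
for i in range(n_s):
    A_supply[i, i * n_d:(i + 1) * n_d] = 1.0
# demand constraints: sum_i x[i, j] == deficit[j]
A_demand = np.zeros((n_d, n_s * n_d))
for j in range(n_d):
    A_demand[j, j::n_d] = 1.0

res = linprog(c,
              A_eq=np.vstack([A_supply, A_demand]),
              b_eq=np.concatenate([surplus, deficit]),
              bounds=[(0, None)] * (n_s * n_d),
              method="highs")
print(res.x.reshape(n_s, n_d))        # relocation plan; integral because this is a transportation LP
```

The thesis's formulations are mixed-integer and far richer, which is why their exact resolution is expensive and heuristics are proposed.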
APA, Harvard, Vancouver, ISO, and other styles
17

Oplatková, Hana. "Žiju tarot." Master's thesis, Vysoké učení technické v Brně. Fakulta výtvarných umění, 2012. http://www.nusl.cz/ntk/nusl-232344.

Full text
Abstract:
A private deck of cards created during a six-month survey and documentation of daily experiences. The deck contains 49 cards and is inspired by the set of 78 tarot cards. The text content on the reverse side of each card was created from diary notes. The face side of each card was chosen to represent the processes usually taking place on the days when the card was read.
APA, Harvard, Vancouver, ISO, and other styles
18

Ghosh, Arindam. "Quantum Information Processing By NMR : Relaxation Of Pseudo Pure States, Geometric Phases And Algorithms." Thesis, 2006. http://hdl.handle.net/2005/454.

Full text
Abstract:
This thesis focuses on two aspects of Quantum Information Processing (QIP) and contains experimental implementations by Nuclear Magnetic Resonance (NMR) spectroscopy. The two aspects are: (i) the development of novel methodologies for improved or fault-tolerant QIP using longer-lived states and geometric phases, and (ii) the implementation of certain quantum algorithms and theorems by NMR. The first chapter gives a general introduction to Quantum Information Processing and its implementation using NMR, as well as a description of NMR Hamiltonians and NMR relaxation using Redfield theory and magnetization modes. The second chapter contains a study of the relaxation of Pseudo Pure States (PPS). PPS are specially prepared initial states from which computation begins. These states, being non-equilibrium states, relax with time and hence introduce errors into the computation. In this chapter we study the role of cross-correlations in the relaxation of PPS. The third and fourth chapters, respectively, report the observation of cyclic and non-cyclic geometric phases. When the state of a qubit is made to evolve, either adiabatically or non-adiabatically, along the surface of the Bloch sphere, the qubit sometimes gains a phase factor apart from the dynamic phase. This is known as the geometric phase, as it depends only on the geometry of the path of evolution, and it is used in fault-tolerant QIP. In these two chapters we demonstrate how geometric phases of a qubit can be measured using NMR. The fifth and sixth chapters contain implementations of the "No Deletion" and "No Cloning" (quantum triplicator for partially known states) theorems. The No Cloning and No Deletion theorems are closely related: the former states that an unknown quantum state cannot be copied perfectly, while the latter states that an unknown state cannot be deleted perfectly either. In these two chapters we discuss the experimental implementation of the two theorems. The last chapter contains an implementation of the Deutsch-Jozsa algorithm in strongly dipolar-coupled spin systems. Dipolar couplings, being larger than scalar couplings, provide a better opportunity for scaling up to larger numbers of qubits. However, strongly coupled systems pose a few experimental challenges as well. This chapter demonstrates how a strongly coupled system can be used in NMR QIP.
APA, Harvard, Vancouver, ISO, and other styles
19

LIN, MING-YAO, and 林銘瑤. "New approaches to the input and recognition of Chinese characters using contextual information and the relaxation technique." Thesis, 1988. http://ndltd.ncl.edu.tw/handle/80829564548566174984.

Full text
APA, Harvard, Vancouver, ISO, and other styles
20

Trudeau, Jon David. "The use of ab initio calculations and NMR relaxation time experiments to obtain molecular structure and dynamic information of systems in solution." 1993. http://catalog.hathitrust.org/api/volumes/oclc/31042555.html.

Full text
APA, Harvard, Vancouver, ISO, and other styles
21

Wang, Chun. "High-Dimensional Portfolio Management: Taxes, Execution and Information Relaxations." Thesis, 2014. https://doi.org/10.7916/D8M043JJ.

Full text
Abstract:
Portfolio management has always been a key topic in finance research. While many researchers have studied portfolio management problems, most of the work to date assumes trading is frictionless. This dissertation presents our investigation of optimal trading policies, and of the application of duality methods based on information relaxations, for portfolio problems in which the investor manages multiple securities and confronts trading frictions, in particular capital gains taxes and execution costs. In Chapter 2, we consider dynamic asset allocation problems where the investor is required to pay capital gains taxes on her investment gains. This is a very challenging problem because the tax to be paid whenever a security is sold depends on the tax basis, i.e. the price(s) at which the security was originally purchased. This feature results in high-dimensional and path-dependent problems which cannot be solved exactly except in the case of very stylized problems with just one or two securities and relatively few time periods. The asset allocation problem with taxes has several variations depending on (i) whether we use the exact or average tax basis and (ii) whether we allow the full use of losses (FUL) or the limited use of losses (LUL). We consider all of these variations in this chapter but focus mainly on the exact and average tax-basis LUL cases, since these problems are the most realistic and generally the most challenging. We develop several sub-optimal trading policies for these problems and use duality techniques based on information relaxations to assess their performance. Our numerical experiments consider problems with as many as 20 securities and 20 time periods. The principal contribution of this chapter is in demonstrating that much larger problems can now be tackled through the use of sophisticated optimization techniques and duality methods based on information relaxations. We show, in fact, that the dual formulations of exact tax-basis problems are much easier to solve than the corresponding primal problems; indeed, we can easily solve dual problem instances where the number of securities and time periods is much larger than 20. We also note, however, that while the average tax-basis problem is relatively easier to solve in general, its corresponding dual problem instances are non-convex and more difficult to solve. We therefore propose an approach for the average tax-basis dual problem that enables valid dual bounds to still be obtained. In Chapter 3, we consider a portfolio execution problem where a possibly risk-averse agent needs to trade a fixed number of shares in multiple stocks over a short time horizon. Our price dynamics can capture linear but stochastic temporary and permanent price impacts as well as stochastic volatility. In general it is not possible, however, to solve even numerically for the optimal policy in this model, and so we must instead search for good sub-optimal policies. Our principal policy is a variant of an open-loop feedback control (OLFC) policy, and we show how the corresponding OLFC value function may be used to construct good primal and dual bounds on the optimal value function. The dual bound is constructed using the recently developed duality methods based on information relaxations. One of the contributions of this chapter is the identification of sufficient conditions that guarantee convexity, and hence tractability, of the associated dual problem instances.
That said, we do not claim that the only plausible models are those where all dual problem instances are convex. We also show that it is straightforward to include a non-linear temporary price impact as well as return predictability in our model. We demonstrate numerically that good dual bounds can be computed quickly even when nested Monte-Carlo simulations are required to estimate the so-called dual penalties. These results suggest that the dual methodology can be applied in many models where closed-form expressions for the dual penalties cannot be computed. In Chapter 4, we apply duality methods based on information relaxations to dynamic zero-sum games. We show these methods can easily be used to construct dual lower and upper bounds for the optimal value of these games. In particular, these bounds can be used to evaluate sub-optimal policies for zero-sum games when calculating the optimal policies and game value is intractable.
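For orientation, the information-relaxation duality used throughout the dissertation has the following generic form for a maximisation problem; the notation is illustrative and not specific to the tax, execution, or game models discussed above.

```latex
% Perfect-information relaxation bound: the inner maximisation is allowed to see the
% whole noise path w = (w_1, ..., w_T), and z is a dual-feasible penalty, i.e. one with
% E[z(pi, w)] <= 0 for every non-anticipative policy pi in Pi (illustrative notation).
V_0 \;=\; \sup_{\pi \in \Pi} \mathbb{E}\Bigl[\sum_{t=0}^{T} r_t(x_t, a_t)\Bigr]
\;\le\;
\mathbb{E}\Bigl[\max_{a_0,\dots,a_T}\Bigl(\sum_{t=0}^{T} r_t(x_t, a_t) \;-\; z(a, w)\Bigr)\Bigr]
```

Penalties built from approximate value functions typically tighten this upper bound, and pairing such bounds with the lower bounds obtained by simulating a sub-optimal policy is how the policies in the dissertation are assessed.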
APA, Harvard, Vancouver, ISO, and other styles
22

Ruiz, Lacedelli Octavio. "Essays in information relaxations and scenario analysis for partially observable settings." Thesis, 2019. https://doi.org/10.7916/d8-mwkk-mr35.

Full text
Abstract:
This dissertation consists of three main essays in which we study important problems in engineering and finance. In the first part of this dissertation, we study the use of information relaxations to obtain dual bounds in the context of Partially Observable Markov Decision Processes (POMDPs). POMDPs are in general intractable problems, and the best we can do is obtain suboptimal policies. To evaluate these policies, we investigate and extend the information relaxation approach developed originally for Markov Decision Processes. The use of information relaxation duality for POMDPs presents important challenges, and we show how change-of-measure arguments can be used to overcome them. As a second contribution, we show that many value function approximations for POMDPs are supersolutions. By constructing penalties from supersolutions we are able to achieve significant variance reduction when estimating the duality gap directly, and the resulting dual bounds are guaranteed to be tighter than those provided by the supersolutions themselves. Applications in robotic navigation and telecommunications are given in Chapter 2. A further application of this approach is provided in Chapter 5 in the context of personalized medicine. In the second part of this dissertation, we discuss a number of weaknesses inherent in traditional scenario analysis. For instance, the standard approach to scenario analysis aims to compute the P&L of a portfolio resulting from joint stresses to underlying risk factors, leaving all unstressed risk factors set to zero. This approach thereby ignores the conditional distribution of the unstressed risk factors given the stressed risk factors. We address these weaknesses by embedding the scenario analysis within a dynamic factor model for the underlying risk factors. We resort to multivariate state-space models that allow the modeling of real-world behavior of financial markets, such as volatility clustering. Additionally, these models are sufficiently tractable to permit the computation of, or simulation from, the conditional distribution of unstressed risk factors. Our approach permits the use of observable and unobservable risk factors. We provide applications to fixed income and options portfolios, where we are able to show the degree to which the two scenario analysis approaches can lead to dramatic differences. In the third part, we propose a framework to study a human-machine interaction system within the context of financial robo-advising. In this setting, based on risk-sensitive dynamic games, the robo-advisor adaptively learns the preferences of the investor as the investor makes decisions that optimize her risk-sensitive criterion. The investor's and the machine's objectives are aligned, but the presence of asymmetric information makes this joint optimization process a game with strategic interactions. By considering an investor with mean-variance risk preferences we are able to reduce the game to a POMDP. The human-machine interaction protocol features a trade-off between allowing the robo-advisor to learn the investor's preferences through costly communications and optimizing the investor's objective relying on outdated information.
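The conditioning step that the traditional approach to scenario analysis ignores is easiest to see in the static jointly Gaussian case written below; this is a textbook identity given only as orientation, since the dissertation works with dynamic state-space models rather than a single Gaussian snapshot.

```latex
% Partition the risk factors into stressed (s) and unstressed (u) blocks; under a
% joint Gaussian model, stressing X_s = x_s shifts the unstressed factors as well:
\begin{pmatrix} X_s \\ X_u \end{pmatrix}
\sim \mathcal{N}\!\left(
\begin{pmatrix} \mu_s \\ \mu_u \end{pmatrix},
\begin{pmatrix} \Sigma_{ss} & \Sigma_{su} \\ \Sigma_{us} & \Sigma_{uu} \end{pmatrix}
\right)
\;\Longrightarrow\;
X_u \mid X_s = x_s \;\sim\;
\mathcal{N}\!\bigl(\mu_u + \Sigma_{us}\Sigma_{ss}^{-1}(x_s - \mu_s),\;
\Sigma_{uu} - \Sigma_{us}\Sigma_{ss}^{-1}\Sigma_{su}\bigr)
```

Leaving the unstressed factors at zero, as in the standard approach, amounts to ignoring the shift term Σ_us Σ_ss⁻¹(x_s − μ_s); retaining it is the point of embedding the scenario within a factor model.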
APA, Harvard, Vancouver, ISO, and other styles
23

Marino, Robert. "Propriétés magnétiques et optiques de monocristaux dopés terres rares pour l'information quantique." Phd thesis, 2011. http://tel.archives-ouvertes.fr/tel-00712947.

Full text
Abstract:
The control of information represents a competitive advantage today. Despite an intensification of the means developed to protect data streams, it is currently not possible to exchange information remotely between two parties in a completely secure way. However, the work of Bennett and Brassard has shown that it is possible to achieve a maximum level of security by using a quantum protocol for transmitting information. This protocol is based on telecom networks that use quantum repeaters in place of conventional repeaters. The route studied in this thesis, carried out partly in the framework of the European project QuRep, aims to improve knowledge of single crystals doped with rare-earth ions, which are prime candidates for the development of quantum repeaters. Two main lines of work emerged. First, we tried to understand the success factors and limiting factors in the use of the Nd:YSO crystal as a host for quantum memories, with the objective of transferring the electronic coherence to hyperfine levels. Second, we studied a crystal with a hyperfine structure directly accessible optically, Er:YLF, in order to verify its potential use for quantum memories. Among other things, this work achieved the transfer of coherence from an electronic Zeeman level to a hyperfine level with a storage time of more than 300 μs, which makes it possible to envisage a quantum memory in Nd:YSO capable of re-emitting a photon on demand.
APA, Harvard, Vancouver, ISO, and other styles
24

Christophel, Philipp M. [Verfasser]. "Separation algorithms for cutting planes based on mixed integer row relaxations / von Philipp M. Christophel." 2009. http://d-nb.info/995702489/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
25

Buakaz, Saïda. "Approche stochastique de la segmentation des images‎ : un modèle de coopération entre les primitives de régions et de frontières." Phd thesis, 1987. http://tel.archives-ouvertes.fr/tel-00324438.

Full text
Abstract:
In the field of labeling, stochastic methods fit into a framework in which the assignment of a label to an object is perceived as dynamic knowledge. This assignment can be modified as contextual knowledge evolves, and in return it can influence the instantaneous state of that knowledge. Such methods have the advantage of introducing local control over the evolution of the labeling. Building on this notion of labeling, in its local aspect, we approached the problem of image segmentation, having observed that the decision at each point influences, and is influenced by, those of its neighbours. Moreover, segmentation methods have so far been based essentially on a choice between two types of model: the region model and the boundary model. The iterative stochastic methods addressed in this thesis make it possible to bring both entities into play simultaneously and to treat them within a single process. The procedure consists of introducing, for each point, region-type information and boundary-type information into its labeling vector. Contextual relations intervene in the form of inter-class interactions and inter-entity interactions.
APA, Harvard, Vancouver, ISO, and other styles