Dissertations / Theses on the topic 'Converse de la théorie de l'information'
Consult the top 50 dissertations / theses for your research on the topic 'Converse de la théorie de l'information.'
Marsiglietti, Arnaud. "Géométrie des mesures convexes et liens avec la théorie de l’information." Thesis, Paris Est, 2014. http://www.theses.fr/2014PEST1032/document.
This thesis is devoted to the study of convex measures as well as the relationships between the Brunn-Minkowski theory and information theory. I pursue the work of Costa and Cover, who highlighted similarities between two fundamental inequalities of the Brunn-Minkowski theory and of information theory. Starting from these similarities, they conjectured, as an analogue of the concavity of entropy power, that the n-th root of the parallel volume of every compact subset of $R^n$ is concave, and I give a complete answer to this conjecture. On the other hand, I study the convex measures defined by Borell, and I establish for these measures a refined inequality of the Brunn-Minkowski type when restricted to convex symmetric sets. This thesis is split into four parts. First, I recall some basic facts. In the second part, I prove the validity of the Costa-Cover conjecture under special conditions and show that the conjecture is false in full generality by giving explicit counterexamples. In the third part, I extend the positive results of this conjecture by extending the notion of classical volume and by establishing functional versions. Finally, I generalize recent work of Gardner and Zvavitch by improving the concavity of convex measures under different kinds of hypotheses, such as symmetry.
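For context, the two classical inequalities whose analogy Costa and Cover built on are the following standard statements (general background, not taken from the abstract itself):

```latex
% Brunn-Minkowski inequality, for compact sets A, B in R^n:
\operatorname{vol}(A+B)^{1/n} \;\ge\; \operatorname{vol}(A)^{1/n} + \operatorname{vol}(B)^{1/n}
% Entropy power inequality, for independent random vectors X, Y in R^n,
% with entropy power N(X) = \frac{1}{2\pi e}\, e^{2h(X)/n}:
N(X+Y) \;\ge\; N(X) + N(Y)
```

Costa's theorem states that t -> N(X + sqrt(t) Z) is concave for Gaussian noise Z; the Costa-Cover conjecture discussed in the abstract transposes this to the concavity of t -> vol(A + tB)^(1/n), the n-th root of the parallel volume.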
Brunero, Federico. "Unearthing the Impact of Structure in Data and in Topology for Caching and Computing Networks." Electronic Thesis or Diss., Sorbonne université, 2022. https://accesdistant.sorbonne-universite.fr/login?url=https://theses-intra.sorbonne-universite.fr/2022SORUS368.pdf.
Caching has proven to be an excellent expedient for reducing the traffic load in data networks. An information-theoretic study of caching, known as coded caching, represented a key breakthrough in understanding how memory can be effectively transformed into data rates. Coded caching also revealed the deep connection between caching and computing networks, which similarly need novel algorithmic solutions to reduce the traffic load. Despite the vast literature, some fundamental limitations remain, whose resolution is critical. For instance, it is well known that the coding gain ensured by coded caching is merely linear in the overall caching resources, and that this gain turns out to be the Achilles heel of the technique in most practical settings. This thesis aims at improving and deepening the understanding of the key role that structure plays, either in data or in topology, for caching and computing networks. First, we explore the fundamental limits of caching under information-theoretic models that impose structure on the data, by which we mean that we assume to know in advance which data are of interest to whom. Secondly, we investigate the impressive ramifications of having structure in the network topology. Throughout the manuscript, we also show how the results on caching can be employed in the context of distributed computing.
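As background for the remark on the coding gain, the standard centralized coded-caching result of Maddah-Ali and Niesen (not a result of this thesis) for K users, N files and a per-user cache of M files is usually written as

```latex
R(M) \;=\; \frac{K\left(1-\tfrac{M}{N}\right)}{1+\tfrac{KM}{N}},
\qquad
\text{coding gain} \;=\; 1+\frac{KM}{N},
```

so the gain grows only linearly in the aggregate cache size KM, which is the limitation the abstract refers to.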
Hu, Xu. "Towards efficient learning of graphical models and neural networks with variational techniques." Thesis, Paris Est, 2019. http://www.theses.fr/2019PESC1037.
In this thesis, I will mainly focus on variational inference and probabilistic models. In particular, I will cover several projects I have been working on during my PhD about improving the efficiency of AI/ML systems with variational techniques. The thesis consists of two parts. In the first part, the computational efficiency of probabilistic graphical models is studied. In the second part, several problems of learning deep neural networks are investigated, which are related to either energy efficiency or sample efficiency.
Calzolari, Giacomo. "Essais en théorie de l'information." Toulouse 1, 2000. http://www.theses.fr/2000TOU10070.
Grinbaum, Alexei. "Le rôle de l'information dans la théorie quantique." Phd thesis, Ecole Polytechnique X, 2004. http://tel.archives-ouvertes.fr/tel-00007634.
Kaced, Tarik. "Partage de secret et théorie algorithmique de l'information." Thesis, Montpellier 2, 2012. http://www.theses.fr/2012MON20170/document.
Our work deals with secret sharing from the theoretical points of view of Shannon's information theory and Kolmogorov's algorithmic information theory. We explain how these three subjects are naturally and deeply intertwined. Information inequalities play a central role in this text. They are the inequalities for Shannon entropy, but they are also in exact correspondence with the inequalities for Kolmogorov complexity. Kolmogorov complexity formalizes the idea of randomness for strings. These two reasons alone justify considering the notion of secret sharing in the algorithmic framework (if one can share a random secret, one can share anything). Originally, secret sharing was studied under the combinatorial lens; only later was it more generally formalized using information-theoretic measures. This step allowed the use of information inequalities, which proved to be very important for understanding the existence of secret-sharing schemes with respect to efficiency. The investigation of information inequalities is still in its infancy. We contribute to the subject by introducing the notion of essentially conditional inequalities, which shows once again that information inequalities are not yet fully understood.
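For reference, the basic Shannon-type information inequalities mentioned here are, in their standard form (general background, not specific to this thesis), the nonnegative linear combinations of instances of

```latex
% Conditional mutual information is nonnegative:
I(A;B \mid C) \;=\; H(A\mid C) + H(B\mid C) - H(A,B\mid C) \;\ge\; 0,
% equivalently, entropy is submodular:
H(A,C) + H(B,C) \;\ge\; H(C) + H(A,B,C).
```

Roughly speaking, the essentially conditional inequalities introduced in the thesis are conditional inequalities that cannot be deduced as direct corollaries of such unconditional ones.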
Nechita, Ion. "États aléatoires, théorie quantique de l'information et probabilités libres." Phd thesis, Université Claude Bernard - Lyon I, 2009. http://tel.archives-ouvertes.fr/tel-00371592.
Full textGarivier, Aurélien. "Modèles contextuels et alphabets infinis en théorie de l'information." Paris 11, 2006. http://www.theses.fr/2006PA112192.
This thesis explores some contemporary aspects of information theory, from source coding to issues of model selection. We first consider the problem of coding memoryless sources on a countably infinite alphabet. As it is impossible to provide a solution that is both efficient and general, two approaches are considered: we first establish conditions under which the entropic rate can be reached, and we consider restricted classes for which tail probabilities are controlled. The second approach does not set any condition on the sources but provides a partial solution by coding only a part of the information - the pattern - which captures the repetitions in the message. In order to study more complex processes, we come back to the case of finite-memory sources on a finite alphabet: this case has given rise to many works and efficient algorithms such as the Context Tree Weighting (CTW) method. We show here that this method is also efficient on a non-parametric class of infinite-memory sources: the renewal processes. We then show that the ideas on which CTW is based lead to a consistent estimator of the memory structure of a process, when this structure is finite. In fact, we complete the study of the BIC context tree estimator for Variable Length Markov Chains. In the last part, it is shown how similar ideas can be generalized to more complex sources on a (countable or not) infinite alphabet. We obtain consistent estimators for the order of hidden Markov models with Poisson and Gaussian emissions.
Dion, Emmanuel. "La théorie de l'information appliquée à l'enquête par questionnaire." Clermont-Ferrand 1, 1995. http://www.theses.fr/1995CLF10157.
Information measurement as defined by information theory (particularly entropy, equivocation, ambiguity, capacity, redundancy and transmitted information) provides a new set of tools for the evaluation of survey questionnaires. These tools make it possible to measure the amount of information one can get from each surveyed person, and to establish general questionnaire design rules that lead to the optimization of this amount of information.
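As a purely illustrative sketch of the kind of quantities the abstract mentions (entropy of answers, transmitted information between two questions, redundancy), assuming answer counts stored in a simple contingency table; the data and variable names are hypothetical:

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability vector (zero entries ignored)."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Hypothetical joint answer counts for two questions (rows = Q1, cols = Q2).
counts = np.array([[30.0, 10.0],
                   [ 5.0, 55.0]])
joint = counts / counts.sum()                 # joint distribution of (Q1, Q2)
p_q1, p_q2 = joint.sum(axis=1), joint.sum(axis=0)

h_q1 = entropy(p_q1)                          # information carried by Q1 alone
h_q2 = entropy(p_q2)
transmitted = h_q1 + h_q2 - entropy(joint.ravel())   # mutual information I(Q1;Q2)
redundancy = 1 - h_q1 / np.log2(len(p_q1))            # relative redundancy of Q1

print(h_q1, h_q2, transmitted, redundancy)
```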
Bensadon, Jérémy. "Applications de la théorie de l'information à l'apprentissage statistique." Thesis, Université Paris-Saclay (ComUE), 2016. http://www.theses.fr/2016SACLS025/document.
We study two different topics, using insight from information theory in both cases. 1) Context Tree Weighting is a text compression algorithm that efficiently computes the Bayesian combination of all visible Markov models: we build a "context tree", with deeper nodes corresponding to more complex models, and the mixture is computed recursively, starting from the leaves. We extend this idea to a more general setting, also encompassing density estimation and regression, and we investigate the benefits of replacing regular Bayesian inference with switch distributions, which put a prior on sequences of models instead of on models. 2) Information Geometric Optimization (IGO) is a general framework for black-box optimization that recovers several state-of-the-art algorithms, such as CMA-ES and xNES. The initial problem is transferred to a Riemannian manifold, yielding a parametrization-invariant first-order differential equation. However, since time is discretized in practice, this invariance only holds up to first order. We introduce the Geodesic IGO (GIGO) update, which uses this Riemannian manifold structure to define a fully parametrization-invariant algorithm. Thanks to Noether's theorem, we obtain a first-order differential equation satisfied by the geodesics of the statistical manifold of Gaussians, thus allowing us to compute the corresponding GIGO update. Finally, we show that while GIGO and xNES are different in general, it is possible to define a new "almost parametrization-invariant" algorithm, Blockwise GIGO, that recovers xNES from abstract principles.
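A minimal sketch of the recursive mixture that Context Tree Weighting performs at each node, in its standard textbook form for binary sequences (not code from the thesis): each node mixes a Krichevsky-Trofimov estimate with the product of its children's weighted probabilities. The tiny tree at the end is hypothetical.

```python
from math import lgamma, log, exp

def kt_log_prob(zeros, ones):
    """Log Krichevsky-Trofimov (Dirichlet(1/2,1/2)) probability of the counts."""
    return (lgamma(zeros + 0.5) + lgamma(ones + 0.5)
            - 2.0 * lgamma(0.5) - lgamma(zeros + ones + 1.0))

def weighted_log_prob(node):
    """CTW weighted probability: a leaf uses the KT estimate; an internal node
    averages 'stop here' (KT) with 'split further' (product over children)."""
    log_pe = kt_log_prob(node["zeros"], node["ones"])
    children = node.get("children")
    if not children:
        return log_pe
    log_split = sum(weighted_log_prob(c) for c in children)
    m = max(log_pe, log_split)  # stable log(0.5*exp(a) + 0.5*exp(b))
    return m + log(0.5 * (exp(log_pe - m) + exp(log_split - m)))

# Hypothetical context tree: the children's counts refine the root's counts.
tree = {"zeros": 3, "ones": 5, "children": [
    {"zeros": 1, "ones": 4}, {"zeros": 2, "ones": 1}]}
print(weighted_log_prob(tree))
```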
Martimort, David. "Multiprincipaux en économie de l'information." Toulouse 1, 1992. http://www.theses.fr/1992TOU10011.
This thesis develops a set of theoretical contributions which abandon the unicity of contract assumed in the usual incentive theory. We offer a conceptual framework rich enough to be applied to complex organizations. We develop a theoretical analysis of common agency structures and of the limits of the revelation principle in this context. We analyse competing hierarchies with exclusive agents. We conclude by using these results in applications to industrial organization: the choice of market structures (exclusive agencies or common agency).
Mora, T. "Géométrie et inférence dans l'optimisation et en théorie de l'information." Phd thesis, Université Paris Sud - Paris XI, 2007. http://tel.archives-ouvertes.fr/tel-00175221.
After a physics-oriented introduction to the problems and concepts of the fields mentioned above, message-passing methods, based on the Bethe approximation, are introduced. These methods are useful from a physical point of view, since they make it possible to study the thermodynamic properties of ensembles of random instances. They are also useful for inference. The analysis of distance spectra is then carried out using combinatorial and message-passing methods, and is exploited both to prove the existence of fragmentation in constraint satisfaction problems and to study its important aspects.
Mora, Thierry. "Géométrie et inférence dans l'optimisation et en théorie de l'information." Paris 11, 2007. http://www.theses.fr/2007PA112162.
Full textBeauchêne, Daniel. "L'information industrielle : définition et spécification." Chambéry, 1993. http://www.theses.fr/1993CHAMS013.
Full textBohnet-Waldraff, Fabian. "Entanglement and Quantumness - New numerical approaches -." Thesis, Université Paris-Saclay (ComUE), 2017. http://www.theses.fr/2017SACLS318/document.
The main topic of this compilation thesis is the investigation of multipartite entanglement of finite-dimensional systems. We developed a numerical algorithm that detects whether a multipartite state is entangled or separable in a finite number of steps of a semi-definite optimization task. This method is an extension of previously known semi-definite methods, which are inconclusive when the state is separable. In our case, if the state is separable, an explicit decomposition into a mixture of separable states can be extracted. This was achieved by mapping the entanglement problem onto the mathematically well-studied truncated moment problem. Additionally, a new way of writing the partially transposed state for symmetric multi-qubit states was developed, which simplifies many results previously known in entanglement theory. This new way of writing the partial transpose criterion unifies different interpretations and alternative formulations of the criterion, and it is also part of the aforementioned semi-definite algorithm. The geometric properties of entangled symmetric states of two qubits were studied in detail: we could answer the question of how far a given pure state is from the convex hull of symmetric separable states, as measured by the Hilbert-Schmidt distance, by giving an explicit formula for the closest separable symmetric state. For mixed states we could provide a numerical upper and an analytical lower bound for this distance. For a larger number of qubits we investigated the ball of absolutely classical states, i.e. symmetric multi-qubit states that stay separable under any unitary transformation. We found an analytical lower bound for the radius of this ball around the maximally mixed symmetric state and gave a numerical upper bound on this radius, by searching for an entangled state as close as possible to the maximally mixed symmetric state. The tensor representation of a symmetric multi-qubit state, or spin-$j$ state, allowed us to study entanglement properties based on the spectrum of the tensor via tensor eigenvalues. The definiteness of this tensor relates to the entanglement of the state it represents and, hence, the smallest tensor eigenvalue can be used to detect entanglement. However, the tensor eigenvalues are more difficult to determine than the familiar matrix eigenvalues, which made the investigation computationally more challenging. The relationship between the value of the smallest tensor eigenvalue and the amount of entanglement in the state was also investigated. It turned out that they are strongly correlated for small system sizes, i.e. for up to six qubits. However, to investigate this correlation we needed an independent way to gauge the amount of entanglement of a state, and in order to do so we improved existing numerical methods to determine the distance of a state to the set of separable symmetric states, using a combination of linear and quadratic programming. The tensor representation of symmetric multi-qubit states was also used to formally define a new tensor class of regularly decomposable tensors that corresponds to the set of separable symmetric multi-qubit states.
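For reference, the partial transpose criterion discussed here is the standard Peres-Horodecki criterion (general background, not a result of the thesis): for a bipartite state rho on H_A tensor H_B,

```latex
\rho \ \text{separable} \;\Longrightarrow\; \rho^{T_B} \succeq 0,
\qquad
\langle i, k \,|\, \rho^{T_B} \,|\, j, l \rangle \;=\; \langle i, l \,|\, \rho \,|\, j, k \rangle ,
% with i, j indexing subsystem A and k, l indexing subsystem B.
```

Positivity of the partial transpose is also sufficient for separability in dimensions 2x2 and 2x3.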
Faÿ, Armelle. "Sur la propagation de l'information dans les réseaux probabilistes." Paris 6, 1997. http://www.theses.fr/1997PA066770.
Full textOllivier, Harold. "Eléments de théorie de l'information quantique, décohérence et codes correcteurs quantiques." Phd thesis, Ecole Polytechnique X, 2004. http://pastel.archives-ouvertes.fr/pastel-00001131.
Full textOllivier, Harold. "Eléments de théorie de l'information quantique, décohérence et codes correcteurs d'erreurs." Palaiseau, Ecole polytechnique, 2004. http://www.theses.fr/2004EPXX0027.
Full textBourdon, Jérémie. "Analyse dynamique d'algorithmes : exemples en arithmétique et en théorie de l'information." Caen, 2002. http://www.theses.fr/2002CAEN2062.
Full textSaulnier, Boris. "Aspects multi-échelles de l'information en biologie : de la physique à la biologie." Paris 7, 2006. http://www.theses.fr/2006PA077225.
One goal of biocomputing is to rebuild the states and dynamics of a biological system from genetic data. To that end, we use in this thesis different entropies, which give a precise mathematical form to the quantity of information associated with a measurement process. We show, within the symbolic dynamics framework, the equivalence of different information quantities associated with a data source. But their effective calculation requires infinite data sequences. Therefore, the reconstruction of a dynamics requires an a priori theoretical characterization, which is often lacking in biology. We show that the probabilistic representation underlying information computation depends on a theory and its symmetries. As a consequence, we show that a frequency measured in a population is not necessarily meaningful for a given organism. We then use the linear thermodynamics of irreversible processes to develop an innovative, numerically confirmed result linked to the scaling laws of biological thermodynamics. For that, we decompose the thermodynamic entropy into two parts, associated respectively with organization and disorganization processes. Finally, we use the hyperbolic approach, and the notion of relevant information in quantum statistical physics, to show the necessity of dealing with hierarchies of entropy in biology. This opens the way to the definition of an intrinsically scale-relative notion of information.
Oury, Marion. "Essais sur l'information presque complète en théorie des jeux et en économie." Jouy-en Josas, HEC, 2008. http://www.theses.fr/2008EHEC0011.
Economic models always contain a description of the informational environment, i.e., an answer to the following questions: What does each agent know about her final payoffs? What does she know about the knowledge of the other agents? And so on. Of course, the modeler does not know exactly the mental representation each agent has of her environment. Therefore, to derive economic results, she has to abstract away from details and make strong simplifying assumptions that are meant to be satisfied only approximately in the actual situation. One of the most commonly used idealizations is that of complete information. In complete-information settings, the final payoffs are common knowledge among the agents. The aim of this PhD thesis is to clarify the strategic implications of this assumption by relaxing it slightly and studying models where information is nearly complete. It is shown that these implications are important. I also give some economic applications of these results in mechanism design and in problems of information transmission.
Haddouz, Riad. "Etude d'algorithmes de résolution du problème de satisfaisabilité et d'une classe de données pour les comparer." Aix-Marseille 2, 1991. http://www.theses.fr/1991AIX22024.
Full textBomble, Laëtitia. "Contrôle de la dynamique de systèmes complexes : Application à l'information quantique et classique." Paris 11, 2010. http://www.theses.fr/2010PA112087.
This thesis is dedicated to the study and search for new ways of implementing and manipulating information on quantum systems. On such systems, the usual (classical) logic can be used, improving size and duration, or a new (quantum) logic exploiting the quantum properties of these systems can be developed, allowing new logic operations. The systems used here are eigenstates of molecules. In order to map information onto such systems, we need to be able to manipulate their dynamics; here we use a laser field designed to perform a determined transformation on the molecule (pi-pulses, STIRAP pulses designed by genetic algorithms, or pulses generated by optimal control). The goal of this thesis was to search among molecules for quantum systems that could serve as candidates for classical and quantum computation, and to model logic gates on them with various implementations and ways of control. On the classical computation side, a full adder was simulated by a STIRAP process on sulfur dioxide, thereby proposing a realization of a classical gate on a quantum system. On the quantum computation side, circuits were implemented by vibrational computing on bromoacetyl chloride, on nitrous acid and on thiophosgene, controlled by optimal control. The use of a network of ultra-cold trapped diatomic molecules coupled by dipolar interaction has also been studied.
Piantanida, Pablo. "THÉORIE DE L'INFORMATION MULTI-UTILISATEUR : INFORMATION ADJACENTE ET CONNAISSANCE IMPARFAITE DU CANAL." Phd thesis, Université Paris Sud - Paris XI, 2007. http://tel.archives-ouvertes.fr/tel-00168330.
In this thesis, we first introduce the notion of "estimation-induced outage capacity" for single-user channels, where the transmitter and the receiver strive to construct codes ensuring reliable communication with a certain quality of service (QoS), whatever the degree of accuracy of the estimation arising during transmission. In our case, the quality-of-service constraint allows the desired rates to be reached with a low error probability (the target communication service), even when the channel estimates are poor. Our results provide an intuitive notion of the impact of the estimates and of the channel characteristics (e.g. SNR, length of the training sequences, feedback) on the outage rate.
Next, the optimal decoder achieving this capacity is studied. We focus on families of decoders that can be implemented on most practical coded-modulation systems. Starting from the theoretical capacity-achieving decoder, we derive a practical decoding metric for arbitrary memoryless channels that minimizes the transmission error probability averaged over all channel estimation errors. This metric is then applied to the case of fading MIMO channels. Based on our notion of outage rate, we determine the maximal achievable information rate associated with the proposed decoder. Numerical results show that, without introducing any additional decoding complexity, the proposed metric provides significant gains, in terms of achievable information rates and bit error rate (BER), in the case of bit-interleaved coded modulation (BICM).
We then consider the effects of imperfect channel estimation known at the receivers, with (or without) imperfect knowledge at the transmitter, on the capacity of state-dependent channels with non-causal side information at the transmitter. This is addressed through the notion of reliable communication based on the error probability averaged over all channel estimation errors. This notion allows us to consider the capacity of a (noisier) composite channel of Gelfand-Pinsker type. We obtain the optimal "Dirty-Paper Coding (DPC)" scheme that achieves the capacity (under the assumption of Gaussian inputs) of the Costa fading channel. The results illustrate the practical trade-off between the amount of channel training and its impact on the interference-cancellation performance of the DPC scheme. This approach allows us to study the capacity region of multi-user fading MIMO broadcast channels (MIMO-BC), where the mobiles (the receivers) only have a noisy estimate of the channel parameters, and these estimates are (or are not) available at the base station (the transmitter). In particular, we observe the surprising result that, for this broadcast channel with a single antenna at the transmitter and at the receivers and imperfect channel estimates known at the receivers, knowledge of these estimates at the transmitter is not necessary to achieve high rates.
Finally, we present several feasible DPC-type schemes for multi-user information embedding, emphasizing their close relationship with multi-user information theory. We first show that, depending on the target application and on the requirements placed on the different messages in terms of robustness and transparency, there is a parallel between multi-user information embedding and multi-user channels with side information at the transmitter. We focus on Gaussian broadcast channels (GBC) and Gaussian multiple-access channels (MAC). This consists in a joint design of practical DPC-type coding schemes based on the theoretical solutions corresponding to these channels. These results extend the practical QIM, DC-QIM and SCS implementations, initially designed for a single user, to the multi-user case. We show that the gap to the optimal performance (the capacity regions) can be minimized by using codewords based on a finite-dimensional lattice.
Piantanida, Juan Pablo. "Théorie de l'information multi-utilisateur : information adjacente et connaissance imparfaite du canal." Paris 11, 2007. http://www.theses.fr/2007PA112053.
The capacity of single-user and multi-user channels under imperfect channel knowledge is investigated. We address these channel mismatch scenarios by introducing two novel notions of reliable communication under channel estimation errors, for which we provide an associated coding theorem and its corresponding converse. Basically, we exploit for our purpose an interesting feature of channel estimation through the use of pilot symbols: the availability of the statistics characterizing the quality of the channel estimates. We first introduce the notion of estimation-induced outage capacity, where the transmitter and the receiver strive to construct codes ensuring reliable communication with a quality of service, no matter which degree of estimation accuracy arises during a transmission. Then the optimal decoder achieving this capacity is investigated. We derive a practical decoding metric and its achievable rates, for arbitrary memoryless channels, that minimizes the transmission error probability averaged over all channel estimation errors. We next consider the effects of imperfect channel estimation at the receivers, with imperfect (or without) channel knowledge at the transmitter, on the capacity of state-dependent channels with non-causal CSI at the transmitter (e.g. the multi-user fading MIMO broadcast channel). We address this through the notion of reliable communication based on the transmission error probability averaged over all channel estimation errors. Finally, we consider several implementable DPC schemes for multi-user information embedding, emphasizing their tight relationship with conventional multi-user information theory.
Ferenczi, Sébastien. "Systèmes de rang fini." Aix-Marseille 2, 1990. http://www.theses.fr/1990AIX22016.
Full textRoubert, Benoît. "Approche semi-classique de l'information quantique." Toulouse 3, 2010. http://thesesups.ups-tlse.fr/1123/.
Today, a large community of scientists is working to make the quantum computer a reality: a machine that can offer, at least in theory (and especially for problems whose complexity grows exponentially with the size of the system), a degree of performance inaccessible to its classical counterpart. This thesis looks at the possibility of developing a semi-classical approach to quantum information in two areas of interest: the cloning of a qubit, and spin amplification in spin chains. The first part of this thesis studies the role of interference in quantum cloners. We study in particular the case of cloners without interference (as defined in the thesis), which turn out to be an intermediate case (which can be described as semi-classical) between purely quantum cloners (which propagate both the coherences and the probabilities of density matrices) and classical cloners (which carry only the probabilities). In the second part, the phenomenon of amplification is studied in spin chains, which makes it possible to amplify the state of a single spin into a polarization state of the entire chain, a problem for which the semi-classical approach (valid because of the large number of spins) is used to show the unexpectedly important role played by edge effects in this kind of system.
La, Fortelle Arnaud de. "Contribution à la théorie des grandes déviations et applications." Marne-la-Vallée, ENPC, 2000. http://www.theses.fr/2000ENPC0018.
This thesis focuses on the change of measure and its links with entropy in the context of large deviations. We first present basic large deviations techniques settling the framework; the change of measure is set out together with properties of the entropy and related results (weak LDP, etc.) for i.i.d. variables and Markov processes. Then some notions appear quite naturally: Ruelle-Lanford functions, empirical generators, etc. The analysis can be pursued further in several directions: weak LDP theory, Martin boundary, calculus of "quasi-harmonic" functions in networks (related to the tails of stationary distributions), asymptotic calculations, and extension of these techniques to semi-Markov processes. A unity of structure in these problems is exhibited by means of the entropy function. The second part of the thesis contains articles to appear: LDP for Markov chains in discrete and continuous time; LDP for some polling networks.
Blanc, Jean-luc. "Transmission de l'information et complexité des activités de populations neuronales." Thesis, Aix-Marseille, 2012. http://www.theses.fr/2012AIXM4720/document.
In this thesis, we address the problem of information transmission and processing by neuronal assemblies, following the interdisciplinary approach of complex systems and referring mainly to the formalisms of information theory and dynamical systems. In this context, we focus on the mechanisms underlying the representation of sensory information by neuronal activity through neural coding. We explore the structure of this code at several scales through the study of different neuronal population electrophysiological signals (single unit, LFP and EEG). We implemented various indices in order to objectively extract information from neural activity, but also to characterize the underlying dynamics from finite-size time series (the entropy rate). We also defined a new indicator (the mutual information rate), which quantifies self-organization and coupling relations between two systems. Using theoretical and numerical approaches, we analyze some characteristic properties of these indices and propose their use in the context of the study of neural systems. This work allows us to characterize the complexity of different neuronal activities associated with information transmission dynamics.
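A toy sketch of how an entropy rate can be estimated from a finite symbolic time series using block entropies (a generic plug-in estimator, not the specific estimator developed in the thesis); the data here are hypothetical:

```python
import numpy as np
from collections import Counter

def block_entropy(symbols, k):
    """Shannon entropy (bits) of the empirical distribution of length-k blocks."""
    blocks = [tuple(symbols[i:i + k]) for i in range(len(symbols) - k + 1)]
    counts = np.array(list(Counter(blocks).values()), dtype=float)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

# Hypothetical binarized spike train (1 = spike in a time bin).
rng = np.random.default_rng(0)
spikes = rng.integers(0, 2, size=5000)

# Entropy-rate estimate: h ~ H(k+1) - H(k), with k small enough that
# the length-(k+1) blocks are still well sampled.
k = 4
h_rate = block_entropy(spikes, k + 1) - block_entropy(spikes, k)
print(f"estimated entropy rate: {h_rate:.3f} bits/bin")
```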
Chachoua, Mohamed. "Une théorie symbolique de l'entropie pour le traitement des informations incertaines." Angers, 1998. http://www.theses.fr/1998ANGE0026.
Full textRobin, Stéphane. "Le monopole bilatéral : une approche expérimentale de l'attention à l'autre et de l'information incomplète." Grenoble 2, 1997. http://www.theses.fr/1997GRE21048.
Full textGennero, Marie-Claude. "Contribution à l'étude théorique et appliquée des codes correcteurs d'erreurs." Toulouse 3, 1990. http://www.theses.fr/1990TOU30243.
Full textLe, Treust Maël. "Théorie de l'information, jeux répétés avec observation imparfaite et réseaux de communication décentralisés." Phd thesis, Université Paris Sud - Paris XI, 2011. http://tel.archives-ouvertes.fr/tel-00656783.
Full textZitouni, Djamel. "De la modelisation au traitement de l'information médicale." Antilles-Guyane, 2009. http://www.theses.fr/2010AGUY0382.
The intensive care unit is a complex environment; the practice of medicine there is specific. The handling of a patient during his/her stay should be done by care staff with specific knowledge. To help care staff in their tasks, a plethora of equipment is dedicated to them, and this equipment evolves constantly. In the search for continuous improvement in this activity, automation increasingly appears as a major support and a future challenge for medical practice. Over the last thirty years, several attempts have been made to develop automated guidelines. However, most of these tools suffer from numerous unsolved issues, both in the translation of textual protocols into formal forms and in the treatment of information coming from biomedical monitors. To overcome the biases of diagnosis support systems, we chose a different approach. We have defined a formalism that allows caregivers to formalize medical knowledge. We spent the last three years in the intensive care unit of the university hospital of Fort-de-France with the aim of developing a complete data processing chain. The final goal was the automation of guidelines in the room, at the patient's bedside. We propose a set of methods and tools to establish the complete chain of treatment follow-up for a patient, from admission to discharge. This methodology is based on a bedside experimental station: Aidiag (AIDe aux DIAGnostic). This station is a patient-centered tool that also adequately fits medical routines. A dedicated methodology for analyzing biomedical signals allows a first stage of signal processing prior to their physiological interpretation. An artificial intelligence engine (Think!) and a new formalism (Oneah)
Lequesne, Justine. "Tests statistiques basés sur la théorie de l'information, applications en biologie et en démographie." Caen, 2015. http://www.theses.fr/2015CAEN2007.
Entropy and divergence, basic concepts of information theory, are used to construct statistical tests in two different contexts, which divides this thesis into two parts. The first part deals with goodness-of-fit tests for distributions with densities, based on the maximum entropy principle under generalized moment constraints, for Shannon entropy and also for generalized entropies. These tests are then applied in biology to a study of DNA replication. The second part deals with a comparative analysis of contingency tables through tests based on the Kullback-Leibler divergence between a distribution with fixed marginals and another acting as a reference. Application of this new method to real data sets, and especially to results of PISA surveys in demography, shows its advantages compared to classical methods.
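As a purely illustrative sketch of a divergence-based test of the general kind described above (this is the classical G-test against a fixed reference distribution, not the thesis' specific procedure; the data are made up):

```python
import numpy as np
from scipy.stats import chi2

# Observed counts in k categories and a fixed reference distribution.
observed = np.array([48, 30, 22], dtype=float)   # hypothetical counts
reference = np.array([0.5, 0.3, 0.2])            # hypothetical reference

n = observed.sum()
p_hat = observed / n
kl = np.sum(p_hat * np.log(p_hat / reference))   # KL(empirical || reference)

# Under the null, 2*n*KL is asymptotically chi-square with k-1 degrees of freedom.
statistic = 2.0 * n * kl
p_value = chi2.sf(statistic, df=len(observed) - 1)
print(statistic, p_value)
```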
Kaced, Tarik. "Partage de secret et théorie algorithmique de l'information." Phd thesis, Université Montpellier II - Sciences et Techniques du Languedoc, 2012. http://tel.archives-ouvertes.fr/tel-00763117.
Full textThitimajshima, Punya. "Les codes convolutifs récursifs systématiques et leurs applications à la concatenation parallèle." Brest, 1993. http://www.theses.fr/1993BRES2025.
Full textLagorce, Nicolas. "Une approche convolutive des codes p-adiques." Limoges, 2000. http://www.theses.fr/2000LIMO0001.
Full textCusset, François. "L' invention de la théorie française : le cas de sémiotexte et le transfert culturel transatlantique." Paris 10, 2000. http://www.theses.fr/2000PA100135.
As a strictly American category, French theory was made into a specific (and groundbreaking) practical and cognitive object within academia - primarily in the field of literary studies - and secondarily by certain groups of users, from the art world to some internet communities. It revolves around a canon of translated works from so-called 'post-structuralist' French authors (from Foucault to Derrida, Lyotard, Deleuze and Baudrillard). Focusing on the case study of New York-based Semiotext(e) (a pioneering journal and publishing house founded by Sylvère Lotringer), four successive steps are followed to unveil the features of French theory as a communicational process and cultural transfer: a look at how it was constructed as a set of discourses and media contents, an analysis of its various readings and interpretations, an attempt at relating its actual mediations to the concept of 'lifeworlds', and a final grasp of its systemic function within America's intellectual attention space. The role of additional mediators, from university presses to art galleries, is compared with Semiotext(e)'s. The overall approach is the pragmatics and hermeneutics of communication, while 'cognitive ecology', media analysis and 'translinguistics' shed some additional light on the process of how and why French theory was invented in the US.
Robinet, Vivien. "Modélisation cognitive computationnelle de l'apprentissage inductif de chunks basée sur la théorie algorithmique de l'information." Grenoble INPG, 2009. http://www.theses.fr/2009INPG0108.
This thesis presents a computational cognitive model of inductive learning based on both the MDL principle and the chunking mechanism. The chunking process is used as a basis for many computational cognitive models. The MDL principle is a formalisation of the simplicity principle: it implements the notion of simplicity through the well-defined concept of codelength. The theoretical results justifying the simplicity principle are established through algorithmic information theory, and the MDL principle can be considered a computable approximation of the measures defined in that theory. Using these two mechanisms, the model automatically generates the shortest representation of discrete stimuli. Such a representation can be compared to those produced by human participants facing the same set of stimuli. Through the proposed model and experiments, the purpose of this thesis is to assess both the theoretical and the practical effectiveness of the simplicity principle for cognitive modeling.
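For reference, the codelength idea mentioned here is usually made concrete by the two-part (crude) MDL criterion, in its standard textbook form (not a quotation from the thesis): among candidate hypotheses H for data D, choose

```latex
H^{*} \;=\; \operatorname*{arg\,min}_{H}\ \bigl[\, L(H) \;+\; L(D \mid H) \,\bigr],
```

where L(H) is the codelength of the hypothesis (for instance a chunk lexicon) and L(D|H) the codelength of the data encoded with it; Kolmogorov complexity plays the role of the ideal, uncomputable codelength that MDL approximates.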
Pins, Delphine. "La loi de Piéron : un outil méthodologique pour l'étude du traitement de l'information sensorielle." Paris 5, 1996. http://www.theses.fr/1996PA05H015.
Approaching the psychology of perception in terms of information processing leads one to consider reaction time (RT) as an objective correlate of information processing time. RT, in fact, provides an irreplaceable indicator of the complexity of the levels of information processing that are involved in perceptual tasks. The thesis defended here is that a psychophysical law (Piéron's law) provides an optimal tool for the study of variations in sensory information processing. It is shown that the nature and the complexity of the underlying psychophysiological processes can be inferred from variations of the parameters of Piéron's law. This work sheds light on the factors which determine these parameters and thus reveals the ultimate significance of this psychophysical law. We wanted to answer two questions in particular: 1) how does luminance processing combine with factors such as the general complexity of the psychophysical task, the conditions of stimulus presentation, and the level of training? 2) how can we explain the systematic decrease of RT with increasing stimulus luminance? The RT studies discussed in this thesis clearly show that luminance and other task-specific dimensions are processed independently. The effect of luminance intensity is found to be the same, regardless of the type of psychophysical task. We furthermore demonstrate the extent to which some dimensions that are closely related to the main factor, absolute luminance, contribute to explaining why RT decreases with stimulus intensity.
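For reference, the law itself is not restated in the abstract; in its usual form (parameter names vary across authors) Piéron's law relates mean reaction time to stimulus intensity I as

```latex
RT(I) \;=\; RT_{0} \;+\; k\, I^{-\beta},
```

where RT_0 is an asymptotic irreducible reaction time, k a scale factor, and beta the exponent whose variations the thesis uses as a diagnostic of the underlying processing.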
Sandri, Sandra. "La combinaison de l'information incertaine et ses aspects algorithmiques." Toulouse 3, 1991. http://www.theses.fr/1991TOU30199.
Full textRaineau, Laurence. "L'information en économie centralement planifiée : enjeu de la transition vers l'économie de marché." Paris 1, 1994. http://www.theses.fr/1994PA010018.
The thesis simultaneously pursues two main analytical topics. Firstly, it develops a theoretical analysis of information, and secondly it analyses the informational problem of centrally planned economies. The pursued aim is an insight into the challenge posed by the transition from a centralized to a decentralized system of information. The first part is a critical analysis of the informational and communicational systems of centrally planned economies. It points out the deficiency of the existing analyses concerning biased information or the quality of the exchanged messages. The second part focuses on the theoretical developments on information in centrally planned economies. The neoclassical hypotheses on information are discussed through the analysis of market socialism models. Through the study of Hayekian critiques of socialism, the Austrian approach to information is then discussed. In the third part the author proposes a more personal analysis of information and of the logics and deficiencies of centrally planned economies. Breaking with the objectivist informational approach, the analysis differentiates the exchanged data (indices), the signal, and the meaning of information. Linguistics and phenomenology then help to define information and to understand its origin in a system.
Walsh, Isabelle. "La théorie de la toupie : Une approche culturelle des usages des technologies de l’information." Paris 9, 2009. https://portail.bu.dauphine.fr/fileviewer/index.php?doc=2009PA090049.
Full textIn this dissertation, we propose the Spinning Top Theory which, through a conceptualization of culture at the individual level, allows us to capture the cultural dimension of IT-usage(s) studied through the lens of IT-culture at group level. We define and study the concept of IT-culture that we propose to apply and assess with the help of a construct we name IT-acculturation. We model this construct and propose a measure for it. This allows us to propose a new model of IT-usage(s) which integrates its(their) cultural dimension. We understand IT-usage(s) as a cultural phenomenon, socially constructed at multiple levels, and structured through a progressive IT-acculturation
Immordino, Giovanni. "Rôle de l'incertitude scientifique dans la prise de décision." Toulouse 1, 1999. http://www.theses.fr/1999TOU10055.
Full textBougaret, Sophie. "Prise en compte de l'incertitude dans la valorisation des projets de recherche et développement : la valeur de l'information nouvelle." Toulouse, INPT, 2002. http://www.theses.fr/2002INPT016G.
Full textBeaudoux, Florian. "Synthèse cristalline et spectroscopie cohérente d’un tungstate de lanthane pour application aux mémoires quantiques." Paris 6, 2010. http://www.theses.fr/2010PA066608.
Full textBelkasmi, Mostafa. "Contribution à l'étude des codes correcteurs multicirculants." Toulouse 3, 1992. http://www.theses.fr/1992TOU30120.
Full textZhou, Rong. "Etude des Turbo codes en blocs Reed-Solomon et leurs applications." Rennes 1, 2005. http://www.theses.fr/2005REN1S027.
Full textBarbier, Morgan. "Décodage en liste et application à la sécurité de l'information." Phd thesis, Ecole Polytechnique X, 2011. http://pastel.archives-ouvertes.fr/pastel-00677421.