Dissertations / Theses on the topic 'Accès au haut débit'
Consult the top 50 dissertations / theses for your research on the topic 'Accès au haut débit.'
Adjih, Cédric. "Multimédia et accès à l'Internet haut débit : l'analyse de la filière du câble." Versailles-St Quentin en Yvelines, 2001. http://www.theses.fr/2001VERS0017.
Hadam, Pawel. "Transports nouvelle génération dans les réseaux à très haut débit." Phd thesis, Grenoble INPG, 2005. http://tel.archives-ouvertes.fr/tel-00010643.
Khawam, Kinda. "L'ordonnancement opportuniste dans les réseaux mobiles de nouvelle génération." Paris, ENST, 2006. http://www.theses.fr/2006ENST0030.
The scarce resources in wireless systems, compounded by their highly variable and error-prone propagation characteristics, stress the need for efficient resource management. Scheduling is a key tool for allocating the radio frequency spectrum efficiently. While fading effects have long been combated in wireless networks primarily devoted to voice calls, they are now seen as an opportunity to increase the capacity of novel wireless networks that incorporate data traffic. Data applications afford service flexibility through the delay tolerance of elastic traffic and through their ability to adapt their rate to the variable channel quality. Channel-aware scheduling exploits these characteristics by making use of channel state information to ensure that transmission occurs when radio conditions are most favourable. When users have heterogeneous characteristics and quality-of-service requirements, channel-aware scheduling becomes a challenging task. In this thesis, channel-aware transmission schemes for supporting downlink non-real-time services are proposed and analyzed for novel cellular systems. The proposed schemes are designed to provide various QoS requirements for users while increasing the global system throughput.
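The opportunistic (channel-aware) scheduling idea in this abstract can be illustrated by proportional-fair scheduling, a standard member of that family. The sketch below is a generic toy model, not one of the thesis's proposed schemes; the channel model and all parameter values are invented for illustration.

```python
import random

def proportional_fair(n_users, n_slots, channel_rate, alpha=0.05):
    """Schedule one user per slot, choosing the user with the highest ratio of
    instantaneous feasible rate to smoothed average throughput."""
    avg = [1e-6] * n_users      # smoothed throughputs (tiny init avoids /0)
    served = [0.0] * n_users    # total data delivered to each user
    for t in range(n_slots):
        inst = [channel_rate(u, t) for u in range(n_users)]
        chosen = max(range(n_users), key=lambda u: inst[u] / avg[u])
        served[chosen] += inst[chosen]
        for u in range(n_users):
            r = inst[u] if u == chosen else 0.0
            avg[u] = (1 - alpha) * avg[u] + alpha * r  # exponential smoothing
    return served

# Toy channel: user 0 sees systematically better radio conditions than user 1,
# but both fluctuate; PF still serves user 1 when its relative peak is high.
random.seed(0)
rates = lambda u, t: random.uniform(0.5, 1.0) * (2.0 if u == 0 else 1.0)
out = proportional_fair(2, 5000, rates)
```

Both users end up served (fairness), while the user with the better channel still obtains the higher total throughput (opportunism).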
Youssef, Mazen. "Modélisation, simulation et optimisation des architectures de récepteur pour les techniques d’accès W-CDMA." Thesis, Metz, 2009. http://www.theses.fr/2009METZ007S/document.
This thesis focuses on the design of the air interface of W-CDMA (Wideband Code Division Multiple Access) systems, particularly on the aspects related to channel access problems at the reception side. The main concern is the design of the baseband digital parts, that is, the RAKE receiver. This receiver is in charge of signal demodulation and responsible for exploiting signal diversity. This latter functionality is particularly important, as it counters signal fading by detecting and combining multipath components (leading to signal reinforcement). Given the central role of the RAKE receiver, its design and implementation are of paramount importance. In this thesis, we propose a new architecture for the RAKE receiver: CodeRAKE. The main architectural characteristics aimed at are high flexibility and scalability, while preserving a good trade-off between resource use (and hence area consumption) and performance (operation speed). To satisfy the flexibility and scalability constraints, the CodeRAKE architecture is modular and partitioned according to the number of users and the number of codes per user, with the resource-limitation and performance-preservation constraints in mind. The high modularity of the CodeRAKE architecture allows easy use of parallelisation techniques, which in turn allows an easy increase in performance, particularly on the base-station side. The architectural approach proposed herein is versatile and can easily be adapted to other existing or future protocols. It responds to the challenge of the coming years, when receivers will have to support multiple protocols and access interfaces, including control software layers.
Barros, Francisco Ricardo Magalhães. "The role of public policies in the development of broadband infrastructures : The Case of Brazil." Electronic Thesis or Diss., Institut polytechnique de Paris, 2025. http://www.theses.fr/2025IPPAT013.
The liberalization of state monopolies in the electronic communications sector aims to foster a competitive environment that attracts new players and investments through sector-specific regulation. In Brazil, this approach was adopted to expand communications infrastructure and ensure universal local access, with partial success, especially in large cities. However, growth in fiber-optic networks stalled between 2010 and 2012, prompting regulatory interventions and government incentives. The Brazilian case highlights the expanding role of regulators and the government in network expansion, challenged by slow regulatory evolution, high costs, and low investment appeal in small and medium municipalities. Recently, a new competitive model, supported by deregulation and incentives, enabled small local operators to partially address infrastructure gaps, though these efforts have not fully met public policy goals to reduce regional inequalities. This thesis investigates the relationship between the expansion of communications infrastructure, particularly broadband, and key economic variables, with a focus on social inequality and digital exclusion
Moise, Diana Maria. "Optimisation de la gestion des données pour les applications MapReduce sur des infrastructures distribuées à grande échelle." Phd thesis, École normale supérieure de Cachan - ENS Cachan, 2011. http://tel.archives-ouvertes.fr/tel-00696062.
Moise, Diana. "Optimisation de la gestion des données pour les applications MapReduce sur des infrastructures distribuées à grande échelle." Phd thesis, École normale supérieure de Cachan - ENS Cachan, 2011. http://tel.archives-ouvertes.fr/tel-00653622.
François, Nicolas. "Analyse d'interactions moléculaires à haut débit." Paris 5, 2010. http://www.theses.fr/2010PA05S003.
This thesis tackles the obstacles faced by emerging technologies for analyzing molecular interactions at high throughput, which acquire continuous measurements of many interactions in parallel. Two technologies are promising: target labelling by fluorescence, a proven technique in biology, and surface plasmon resonance (SPR), which requires no labelling of the molecules. This thesis proposes an original automatic approach for the image analysis of high-throughput interactions, applicable to both the fluorescence and SPR methods. Starting from a mathematical model based on the common characteristics of the experimental studies, it combines 2D and 3D geodesic operators, classification, and physical-behaviour constraints. Coupled with the experimental protocol specific to each high-throughput method, involving calibration and spatio-temporal noise filtering, it ultimately provides a complete technology for the analysis of high-throughput data. Assessed qualitatively and quantitatively on both synthetic and experimental data sets, it confirms our expectations regarding the sensitivity of high-throughput detection systems, even at low target concentrations. The estimated characteristics of the studied interactions were compared with those obtained by reference methods or with theoretical values. The results are in agreement, validating both the use of high-throughput techniques for analyzing molecular interactions and the methodology developed for exploiting such data. This study thus opens new perspectives on the use of these technologies in biological research as well as in clinical settings, to aid diagnosis or gene therapy
Faudeil, Stéphane. "Les turbo-codes à haut débit." Brest, 1994. http://www.theses.fr/1994BRES2019.
Perrine, Clency. "Modem vectoriel HF à haut débit." Rennes 1, 2005. http://www.theses.fr/2005REN1S159.
Belmiloud, Naser. "Microrhéomètre sur silicium pour chimie haut débit." Phd thesis, Université Sciences et Technologies - Bordeaux I, 2008. http://tel.archives-ouvertes.fr/tel-00399501.
Full textAinsi l'analyse de la réponse fréquentielle des microstructures mobiles permet de remonter aux
propriétés des fluides en fonction de la fréquence.
Gonzales, Chavez Julio. "Etude des jonctions optiques à haut débit." Limoges, 1989. http://www.theses.fr/1989LIMO4003.
Muthlethaler, Paul. "Protocole d'accès pour réseaux à haut-débit." Paris 9, 1989. https://portail.bu.dauphine.fr/fileviewer/index.php?doc=1989PA090028.
Cuevas, Ordaz Francisco Javier. "Turbo décodage de code produit haut débit." Lorient, 2004. http://www.theses.fr/2004LORIS033.
This thesis continues research on new error-correcting code techniques, following the work on block turbo codes (BTCs) introduced in 1994 by R. Pyndiah. It proposes an innovative architecture for the turbo decoding of product codes, using extended BCH codes as elementary codes. This architecture stores several data words at the same address and performs parallel decoding to increase the data rate. First, we propose a new high-rate turbo-decoding architecture using a BCH (32,26,4) code with a soft-input soft-output (SISO) decoder correcting 1 error (Hamming code). The second group of results is devoted to decoding the BCH (128,106,8) code for high data rates with strong error-correction power, correcting 3 errors (minimum distance of the product code d=64) at high code rates (R close to 0.7). The first advantage of these designs is that they use only one memory (n² data words grouped into blocks of m²) at the input. The elementary decoder designs presented can treat m data words simultaneously, with m = 1, 2, 4 and 8. The second result is that by using m parallel decoders of the same type in the turbo-decoder architecture, we obtain a decoding speed m² times higher for an elementary-decoder area scaled by m²/2. To compare the performance and complexity of the decoders, we use C for behavioural simulations, VHDL for functional simulations and Synopsys Design Compiler for synthesis. The results obtained open up the possibility of future integration on silicon of turbo decoders with strong error-correction power (minimum distance 64, code rate 0.8) and very high data rate (6.4 Gbit/s with a CMOS 0.18 µm target library)
Yan, Yiqing. "Scalable and accurate algorithms for computational genomics and dna-based digital storage." Electronic Thesis or Diss., Sorbonne université, 2023. http://www.theses.fr/2023SORUS078.
Cost reduction and throughput improvement in sequencing technology have led to new advances in applications such as precision medicine and DNA-based storage. However, the sequenced result contains errors. To measure the similarity between the sequenced result and a reference, edit distance is preferred in practice over Hamming distance because of indels. The naive edit distance calculation has quadratic complexity, so sequence similarity analysis is computationally intensive. In this thesis, we introduce two accurate and scalable sequence similarity analysis algorithms: i) Accel-Align, a fast sequence mapper and aligner based on the seed–embed–extend methodology, and ii) Motif-Search, an efficient structure-aware algorithm to recover the information encoded by composite motifs from a DNA archive. We then use Accel-Align as an efficient tool to study random-access design in DNA-based storage
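The preference for edit distance over Hamming distance in the presence of indels, and its quadratic cost, are easy to see in a textbook dynamic-programming sketch. This is the classic Levenshtein algorithm, not Accel-Align's method:

```python
def edit_distance(a: str, b: str) -> int:
    """Levenshtein distance: substitutions, insertions and deletions all cost 1.
    Quadratic time, linear space (two DP rows)."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                 # deletion
                           cur[j - 1] + 1,              # insertion
                           prev[j - 1] + (ca != cb)))   # substitution / match
        prev = cur
    return prev[-1]

def hamming(a: str, b: str) -> int:
    """Position-wise mismatches; only meaningful for equal-length strings."""
    return sum(x != y for x, y in zip(a, b))

# A single inserted base shifts every later position: the Hamming distance
# blows up while the edit distance stays small.
ref, read = "ACGTACGT", "AACGTACG"   # read = ref with an 'A' inserted at front
# edit_distance(ref, read) == 2, hamming(ref, read) == 7
```

This is why alignment tools for sequencing data, where indels are common, work with edit distance (or affine-gap variants) rather than Hamming distance.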
Jolly, Julien. "Préparation et caractérisation haut débit de catalyseurs nanostructurés." Poitiers, 2010. http://theses.edel.univ-poitiers.fr/theses/2010/Jolly-Julien/2010-Jolly-Julien-These.pdf.
The term indoor air pollution has emerged recently, i.e. pollution in confined spaces such as buildings (home, office, etc.) or transport (planes, trains, cars, etc.). The main objective is to eliminate these pollutants, especially volatile organic compounds (VOCs), at the lowest possible energy cost. Catalytic oxidation may be a good way to achieve this. But another question remains: what strategy should be used to reach the desired result? Different strategies can be chosen: a classical approach based primarily on prior knowledge and know-how, or a more pragmatic approach called high-throughput experimentation (HTE). The latter is chosen in this project. First, the key steps and tools related to this approach are presented before applying it to a concrete case study, the oxidation of isopropyl alcohol. Infrared thermography is chosen as the main analytical technique to select the best candidates (primary screening). Moreover, we present the potential of this technique for characterizing fundamental properties of catalytic materials such as specific surface area or acid-base properties. Finally, an oscillating catalytic phenomenon under fixed conditions of temperature and pressure is highlighted. The combination of different in situ analytical techniques has led to a better understanding of this oscillating behaviour
Reminiac, François. "Aérosolthérapie et dispositifs de haut débit nasal humidifié." Thesis, Tours, 2017. http://www.theses.fr/2017TOUR3302.
Aerosol therapy is a complex method of drug delivery frequently used in intensive care units, step-down units and emergency departments, especially in obstructive patients, which makes bronchodilators the most prescribed drug class in critical care. Critically ill patients often require ventilatory support, including nasal high-flow therapy, which has shown promising clinical benefits. Given the physiological effects of nasal high-flow therapy, its implementation may benefit obstructive patients. Consequently, the question of aerosol administration, especially of bronchodilators, in patients undergoing nasal high flow arises. Administering aerosols directly into the high-flow circuit during nasal high-flow therapy could be a simple, efficient and possibly beneficial method. However, technical and physiological issues may jeopardize the efficacy of this combined administration
Lalanne-Aulet, David. "Développement d'outils miniaturisés pour la microbiologie haut-débit." Thesis, Bordeaux, 2014. http://www.theses.fr/2014BORD0160/document.
Microbiology is the branch of science concerned with the study of microorganisms and their properties. Since its beginnings in the XVIIth century, the methods developed by microbiologists have revealed a huge potential for knowledge and applications. In recent decades, industry has realized the interest of this field. The need to prevent contamination by inhibiting microbial development, or conversely the will to enhance it to exploit the chemical transformation capacities of microorganisms, has given rise to an increasing demand for microbiological tests. The poor yield of traditional methods cannot satisfy this need, so the search for new test methods is attracting growing interest. Miniaturized fluidic tools have already proven their potential for this kind of application, yet their validation against traditional methods is often lacking. In this work, we aim at developing miniaturized cultivation techniques and optimized growth-analysis methods, to study the impact of reducing the incubator's size on growth, in order to end up with a high-throughput tool for biocide characterization
Aubert, Julie. "Analyse statistique de données biologiques à haut débit." Thesis, Université Paris-Saclay (ComUE), 2017. http://www.theses.fr/2017SACLS048/document.
The technological progress of the last twenty years allowed the emergence of a high-throughput biology based on large-scale data obtained automatically. Statisticians have an important role to play in the modelling and analysis of these numerous, noisy, sometimes heterogeneous data collected at various scales. This role can take several forms. The statistician can propose new concepts or new methods inspired by the questions asked by this biology; he can propose a fine modelling of the phenomena observed by means of these technologies; and when methods exist and require only an adaptation, his role can be that of an expert who knows the methods, their limits and their advantages. In a first part, I introduce different methods developed with my co-authors for the analysis of high-throughput biological data, based on latent variable models. These models make it possible to explain an observed phenomenon using hidden or latent variables. The simplest latent variable model is the mixture model. The first two presented methods constitute two examples: the first in a context of multiple testing and the second in the framework of defining a hybridization threshold for data derived from microarrays. I also present a model of coupled hidden Markov chains for the detection of copy-number variations in genomics, taking into account the dependence between individuals due, for example, to genetic proximity. For this model we propose an approximate inference based on a variational approximation, since exact inference becomes intractable as the number of individuals increases. We also define a latent block model, modelling an underlying structure per block of rows and columns, adapted to count data from microbial ecology.
Metabarcoding and metagenomic data correspond to the abundance of each microorganism in a microbial community within its environment (plant rhizosphere, human digestive tract, ocean, for example). These data have the particularity of presenting a dispersion stronger than expected under the most conventional models (we speak of overdispersion). Biclustering is a way to study the interactions between the structure of microbial communities and the biological samples from which they are derived. We proposed to model this phenomenon using a Poisson-Gamma distribution and developed another variational approximation for this particular latent block model, as well as a model-selection criterion. The model's flexibility and performance are illustrated on three real datasets. A second part is devoted to the analysis of transcriptomic data derived from DNA microarrays and RNA sequencing. The first section is devoted to the normalization of data (detection and correction of technical biases) and presents two new methods that I proposed with my co-authors and a comparison of methods to which I contributed. The second section, devoted to experimental design, presents a method for analyzing so-called dye-switch designs. In the last part, I present two examples of collaboration, derived respectively from an analysis of differentially expressed genes from microarray data and an analysis of the translatome in sea urchins from RNA-sequencing data, showing how statistical skills are mobilized and the added value that statistics bring to genomics projects
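As a minimal illustration of the "simplest latent variable model" mentioned in this abstract, here is a toy EM fit of a two-component univariate Gaussian mixture, where the component label of each point is the latent variable. This is a generic textbook sketch with invented data, unrelated to the thesis's specific models:

```python
import math
import random

def em_gaussian_mixture(xs, iters=200):
    """Fit a two-component 1-D Gaussian mixture by EM."""
    mu = [min(xs), max(xs)]          # crude initialisation of the means
    var = [1.0, 1.0]
    pi = [0.5, 0.5]
    for _ in range(iters):
        # E-step: posterior responsibility of each component for each point
        resp = []
        for x in xs:
            w = [pi[k] * math.exp(-(x - mu[k]) ** 2 / (2 * var[k]))
                 / math.sqrt(2 * math.pi * var[k]) for k in (0, 1)]
            s = w[0] + w[1]
            resp.append([w[0] / s, w[1] / s])
        # M-step: re-estimate weights, means and variances
        for k in (0, 1):
            nk = sum(r[k] for r in resp)
            pi[k] = nk / len(xs)
            mu[k] = sum(r[k] * x for r, x in zip(resp, xs)) / nk
            var[k] = sum(r[k] * (x - mu[k]) ** 2 for r, x in zip(resp, xs)) / nk
            var[k] = max(var[k], 1e-6)  # guard against variance collapse
    return pi, mu, var

# Synthetic data: two equally sized clusters centred at 0 and 5.
random.seed(1)
data = ([random.gauss(0, 1) for _ in range(300)]
        + [random.gauss(5, 1) for _ in range(300)])
pi, mu, var = em_gaussian_mixture(data)
```

EM recovers the two hidden components (means near 0 and 5, weights near 0.5 each) without ever observing the latent labels, which is the core idea behind the richer mixture and latent block models the abstract describes.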
Truffot, Jérôme. "Conception de réseaux haut débit sous contrainte de sécurisation." Phd thesis, Université Blaise Pascal - Clermont-Ferrand II, 2007. http://tel.archives-ouvertes.fr/tel-00718379.
Muller, Olivier. "Architectures multiprocesseurs monopuces génériques pour turbo-communications haut-débit." Phd thesis, Université de Bretagne Sud, 2007. http://tel.archives-ouvertes.fr/tel-00545236.
Gouret, Wilfried. "Implantation de la communication CPL haut débit en embarqué." Phd thesis, INSA de Rennes, 2007. http://tel.archives-ouvertes.fr/tel-00286944.
Agha, Khaldoun Al. "Optimisation des ressources pour réseaux mobiles à haut débit." Versailles-St Quentin en Yvelines, 1998. http://www.theses.fr/1998VERS0016.
Because of the narrow bandwidth allocated to the radio transmission link, optimizing resource allocation in mobile networks is crucial for supporting high data rates. Indeed, reducing cell size allows greater resource reuse but multiplies the number of inter-cell handovers, which implies complex resource management across the base stations. This thesis therefore studies existing allocation schemes and develops solutions to adapt them to traffic variations, which are by nature instantaneous and hard to predict in wireless networks. Attention is first given to the fixed allocation scheme. Starting from this scheme, a new fixed allocation method is proposed that balances the often non-uniform traffic load against the available resources. This adjustment relies on an alarm-threshold concept managed by a multi-agent system. Efforts then focus on the hybrid allocation scheme, in particular on the fixed and dynamic distribution of resources to the base stations. Since this distribution is invariant over time, the approach consists in making it flexible and letting it evolve over time by learning. To this end, a WATM architecture integrating a multi-agent system was designed, enabling a new scheme that offers an adaptive distribution taking the network's traffic conditions into account. The thesis concludes with three dynamic schemes derived from channel segregation. Based on communication between neighbouring base stations, these schemes can offer several qualities of service to different traffic classes. Indeed, the intelligent agents installed in the base stations cooperate to share the resources with a minimum of conflicts
Muller, Olivier David. "Architectures multiprocesseurs monopuces génériques pour turbo-communications haut-débit." Lorient, 2007. http://www.theses.fr/2007LORIS106.
Full textPujol, Hubert. "Réseau d'interconnexion haut débit pour les architectures parallèles connexionnistes." Paris 11, 1994. http://www.theses.fr/1994PA112192.
Full textTruffot, Jérome. "Conception de réseaux haut débit sous contrainte de sécurisation." Clermont-Ferrand 2, 2007. http://www.theses.fr/2007CLF21793.
Full textClaude, Jean-Baptiste. "Développement d'outils de résolution de structure à haut débit." Aix-Marseille 2, 2007. http://www.theses.fr/2007AIX22063.
Full textWagner, Frédéric. "Redistribution de données à travers un réseau à haut débit." Phd thesis, Université Henri Poincaré - Nancy I, 2005. http://tel.archives-ouvertes.fr/tel-00011705.
Full textrégulièrement des données. Un tel échange s'effectue par une redistribution de données. Nous étudions comment effectuer une telle redistribution le plus efficacement possible en minimisant temps de communication et congestion du réseau.
Nous utilisons pour ce faire, une modélisation du problème à l'aide de graphes bipartis. Le modèle choisi permet une prise en compte du délai d'initialisation des communications, des différentes bandes passantes et impose une limite d'une communication simultanée par interface réseau (modèle 1-port) et de k communications simultanées sur la dorsale.
Nous effectuons une validation expérimentale du modèle puis l'utilisons pour développer deux algorithmes d'ordonnancement
des communications. Nous montrons que chacun d'entre eux
est un algorithme d'approximation garantissant un temps d'exécution dans le pire des cas 8/3 fois plus élevé que le temps optimal.
Nous concluons l'étude de ces algorithmes par une série d'expériences démontrant de bonnes performances en pratique.
Enfin, nous étendons le problème initial au cas de grappes hétérogènes :
ce cas imposant de sortir du modèle 1-port, nous montrons comment modifier nos algorithmes pour en tirer parti.
Nous étudions également le cas de redistributions exécutées en régime permanent sur un réseau d'une topologie plus complexe autorisant les communications locales.
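The 1-port constraint in this model can be illustrated with a greedy round-based scheduler in which each round is a matching of the bipartite transfer graph, so every network interface carries at most one communication per round. This toy sketch ignores bandwidths and start-up delays and is not one of the thesis's 8/3-approximation algorithms:

```python
def one_port_rounds(transfers):
    """Greedily pack point-to-point transfers into rounds in which every
    sender and every receiver appears at most once (1-port model).
    transfers: list of (sender, receiver) pairs. Returns a list of rounds."""
    remaining = list(transfers)
    rounds = []
    while remaining:
        busy, current, leftover = set(), [], []
        for s, r in remaining:
            if s not in busy and r not in busy:   # both ports free this round
                busy.update((s, r))
                current.append((s, r))
            else:
                leftover.append((s, r))           # retry in a later round
        rounds.append(current)
        remaining = leftover
    return rounds

# Three nodes of cluster A each redistribute data to two nodes of cluster B.
demands = [("a0", "b0"), ("a0", "b1"), ("a1", "b0"),
           ("a1", "b2"), ("a2", "b1"), ("a2", "b2")]
schedule = one_port_rounds(demands)
```

Since every node here has degree 2, the six transfers fit into two rounds of three parallel communications each; the greedy heuristic has no optimality guarantee in general, which is exactly why the thesis studies approximation algorithms with a proven 8/3 bound.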
Jacquin, Ludovic. "Compromis performance/sécurité des passerelles très haut débit pour Internet." Phd thesis, Université de Grenoble, 2013. http://tel.archives-ouvertes.fr/tel-00911075.
Mirat, Olivier. "Analyse haut-débit du comportement spontané d'un organisme modèle " simple "." Phd thesis, Université René Descartes - Paris V, 2013. http://tel.archives-ouvertes.fr/tel-00881755.
Doan, Trung-Tung. "Epidémiologie moléculaire et métagénomique à haut débit sur la grille." Phd thesis, Université Blaise Pascal - Clermont-Ferrand II, 2012. http://tel.archives-ouvertes.fr/tel-00778073.
Guéguen, Emeric. "Etude et optimisation des techniques UWB haut débit multibandes OFDM." Phd thesis, INSA de Rennes, 2009. http://tel.archives-ouvertes.fr/tel-00361467.
Full textAprès la présentation du système MB-OFDM et de ses performances, nous proposons une nouvelle forme d'onde pour l'UWB nommée LP-OFDM. Elle est obtenue en ajoutant au système MB-OFDM une fonction de précodage linéaire combinant avantageusement l'OFDM et la technique d'étalement de spectre. L'utilisation du précodage linéaire permet une meilleure exploitation de la diversité fréquentielle du canal. Il offre également une plus grande granularité dans le choix des débits, augmentant ainsi la flexibilité du système. Il est important de souligner que l'ajout de la fonction de précodage linéaire s'accompagne d'une augmentation minime de la complexité du système.
Des techniques d'optimisation du système sont proposées. Afin de minimiser l'interférence entre les codes d'étalement, une technique reposant sur l'allocation des séquences d'étalement est décrite et un critère de sélection proposé. Nous avons également mis en évidence la nécessité de trouver un compromis entre le rendement de codage et la longueur des séquences d'étalement afin de tirer au mieux partie de la diversité du canal tout en limitant l'interférence entre les codes. Une amélioration notable des performances du système LP-OFDM est ainsi obtenue par rapport au système MB-OFDM. Le système LP-OFDM étendu au cas du MIMO utilisant un codage temps-espace d'Alamouti est également étudié. Les résultats obtenus montrent là encore une amélioration significative des performances permettant d'envisager une augmentation de la portée ou des débits de transmission.
Enfin, l'impact d'un interférent à bande étroite de type WiMAX sur les performances du système LP-OFDM est étudié comparativement au système MB-OFDM. Il en ressort que le système LP-OFDM est plus robuste que le système MB-OFDM face à un brouilleur. L'utilisation du précodage linéaire permet en effet d'étaler la puissance du signal interférent en réception lors du déprécodage linéaire.
Elsankary, Kamal. "Dévelopement [sic] de gestionnaires de périphériques génériques à haut débit." Thèse, Université du Québec à Trois-Rivières, 2001. http://depot-e.uqtr.ca/2725/1/000098655.pdf.
Saade, Gaëlle. "Toxicité de la hadronthérapie à ultra-haut débit de dose." Electronic Thesis or Diss., Nantes Université, 2024. http://www.theses.fr/2024NANU4043.
Ultra-high dose rate radiotherapy (UHDR-RT), on the order of Gray per millisecond, is a promising technique for reducing toxicity to healthy tissues. Furthermore, the ballistic properties of hadrons (protons/helium ions) allow better tumor targeting and a reduced dose delivered to surrounding healthy tissues. We hypothesized that UHDR hadrontherapy is the optimal method to limit side effects. Our research shows that UHDR hadrontherapy preserves the development of zebrafish embryos compared to conventional radiotherapy (conv-RT). Through transcriptional expression, we have also shown that cell-cycle arrest is less affected after UHDR proton therapy (UHDR-PT) than after conv-PT. Additionally, UHDR-PT caused less DNA damage and apoptosis. We then investigated the early effects of UHDR-PT on the vascular system, using transgenic zebrafish embryos. Our results indicate that existing vessels are not structurally affected by irradiation; however, new vessel development was inhibited. Similar results were obtained with pseudo-tubes formed by endothelial cells in vitro. Nevertheless, an increase in pericytes around intersegmental vessels was observed after UHDR-PT. Moreover, by RT-qPCR, we observed an increase in vascular-response genes following UHDR-PT, notably VE-cadherin. In conclusion, this work highlights the protective effect of UHDR hadrontherapy on the development of zebrafish embryos. This phenotype appears to be partially independent of improved vascular protection
Drouard, Joeffrey. "Analyse économique du marché du haut débit : contributions théoriques et empiriques." Phd thesis, Télécom ParisTech, 2010. http://pastel.archives-ouvertes.fr/pastel-00543896.
Full textCrenne, Jérémie. "Sécurité Haut-débit pour les Systèmes Embarqués à base de FPGAs." Phd thesis, Université de Bretagne Sud, 2011. http://tel.archives-ouvertes.fr/tel-00655959.
Full textVanstraceele, Christophe. "Turbo codes et estimation paramétrique pour les communications à haut débit." Phd thesis, École normale supérieure de Cachan - ENS Cachan, 2005. http://tel.archives-ouvertes.fr/tel-00133951.
Full textLimasset, Antoine. "Nouvelles approches pour l'exploitation des données de séquences génomique haut débit." Thesis, Rennes 1, 2017. http://www.theses.fr/2017REN1S049/document.
In this thesis we discuss computational methods for dealing with DNA sequences produced by high-throughput sequencers. We mostly focus on the reconstruction of genomes from DNA fragments (genome assembly) and closely related problems. These tasks combine huge amounts of data with combinatorial problems. Various graph structures are used to handle this problem, presenting trade-offs between scalability and assembly quality. This thesis introduces several contributions to these tasks. First, novel representations of assembly graphs are proposed to allow better scaling. We also present novel uses of those graphs apart from assembly, and we propose tools to use such graphs as references when a fully assembled genome is not available. Finally, we show how to use those methods to produce less fragmented assemblies while remaining tractable
Gazal, Steven. "La consanguinité à l'ère du génome haut-débit : estimations et applications." Thesis, Paris 11, 2014. http://www.theses.fr/2014PA11T026/document.
An individual is said to be inbred if his parents are related and if his genealogy contains at least one inbreeding loop leading to a common ancestor. The inbreeding coefficient of an individual is defined as the probability that, at a random marker on the genome, the individual has received two alleles identical by descent, coming from a single allele present in a common ancestor. The inbreeding coefficient is a central parameter in genetics: it is used in population genetics to characterize population structure, and in genetic epidemiology to search for genetic factors involved in recessive diseases. The inbreeding coefficient was traditionally estimated from genealogies, but methods have been developed to bypass genealogies and estimate this coefficient from the information provided by genetic markers distributed along the genome. With the advances in high-throughput genotyping techniques, it is now possible to genotype hundreds of thousands of markers for one individual, to use these methods to reconstruct the regions of identity by descent on his genome, and to estimate a genomic inbreeding coefficient. There is currently no consensus on the best strategy to adopt with these dense marker maps, in particular to take into account dependencies between alleles at different markers (linkage disequilibrium). In this thesis, we evaluated the different available methods through simulations using real data with realistic patterns of linkage disequilibrium. We highlighted an interesting approach that consists in generating several submaps to minimize linkage disequilibrium, estimating an inbreeding coefficient on each submap with a hidden Markov method implemented in the FEstim software, and taking the median of these estimates as the estimator. The advantage of this approach is that it can be used on any sample size, even a single individual, since it requires no linkage disequilibrium estimate.
FEstim is a maximum likelihood estimator, which allows testing whether the inbreeding coefficient is significantly different from zero and determining the most probable mating type of the parents. Finally, through the identification of homozygous regions shared by several consanguineous patients, our strategy permits the identification of recessive mutations involved in monogenic and multifactorial diseases. To facilitate the use of our method, we developed the FSuite pipeline to interpret results of population genetics and genetic epidemiology studies, as shown on the HapMap III reference panel and on a case-control Alzheimer's disease dataset.
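The submap-median strategy described above can be sketched in a few lines. The per-submap estimator below is a deliberately naive stand-in (a homozygosity fraction), not FEstim's hidden Markov estimator; only the sampling-plus-median skeleton reflects the described approach:

```python
import random
import statistics

def random_submaps(markers, n_submaps, submap_size, seed=0):
    """Draw several random subsets of markers; sparse random submaps
    are one way to reduce linkage disequilibrium between markers."""
    rng = random.Random(seed)
    return [rng.sample(markers, submap_size) for _ in range(n_submaps)]

def median_inbreeding(estimate_f, markers, n_submaps=100, submap_size=50):
    """Estimate F on each submap and take the median of the estimates,
    as in the FEstim-based strategy (estimate_f is a stand-in for the
    actual HMM estimator)."""
    submaps = random_submaps(markers, n_submaps, submap_size)
    return statistics.median(estimate_f(s) for s in submaps)

# Toy markers: (name, homozygous?) pairs, 25% homozygous.
markers = [("m%d" % i, i % 4 == 0) for i in range(1000)]
f_hat = median_inbreeding(lambda s: sum(h for _, h in s) / len(s), markers)
```

The median makes the combined estimate robust to occasional submaps where residual linkage disequilibrium biases a single estimate.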
Giovannini-Chami, Lisa. "Déterminants épithéliaux de la maladie asthmatique : apports des approches haut débit." Nice, 2011. http://www.theses.fr/2011NICE4091.
Asthma has been proposed as a "barrier" disease linked to an intrinsic susceptibility of the respiratory epithelium. Epithelial susceptibility to stress, in particular viral-induced stress, leads to alteration of epithelial repair mechanisms and provides a substratum for the development of remodelling. The molecular determinants of this susceptibility remain unknown. Recent high-throughput techniques now make possible a comprehensive analysis of the repertoire of genetic variants associated with the disease. In complement to genetic studies based on large cohorts of patients, we aimed at identifying altered gene expression affecting signalling pathways associated with epithelial cells in different well-defined phenotypes of paediatric allergic respiratory disease. We established gene expression profiles in nasal brushings from children with allergic rhinitis sensitized to house dust mites; the children were either asthmatic or not, and their asthma was either controlled or not. Comparison of these profiles with those obtained in a model of primary culture of differentiated nasal epithelium stimulated by the interleukins IL-4 or IL-13, by the interferons IFN-α, IFN-β or IFN-γ, or at different stages of in vitro differentiation allowed us to demonstrate an epithelial activation signature common to all allergic children, driven primarily by Th2 cytokines. In addition, children with uncontrolled asthma showed a decrease in the level of transcripts induced by interferon. This work identifies the interaction of the epithelium with viral pathogens in early childhood as a key element in the genesis of asthma and its control.
Lecat-Guillet, Nathalie. "Identification d’inhibiteurs du symporteur sodium-iode par criblage à haut débit." Paris 11, 2006. http://www.theses.fr/2006PA112345.
Mieyeville, Fabien. "Modélisation de liaisons optiques inter- et intra-puces à haut débit." Ecully, Ecole centrale de Lyon, 2001. http://www.theses.fr/2001ECDL0018.
Gouret, Wilfried. "Contribution à l'étude des communications courant porteur haut débit pour l'embarqué." Rennes, INSA, 2007. https://tel.archives-ouvertes.fr/tel-00286944.
The growing number of electronic systems in vehicles generates an increase in the exchanges between those systems. The resulting cables reach a total length of more than 2 km on certain cars. To reduce the number of cables, multiplexing is one solution: communications belonging to several pairs of equipment are carried over a single shared line. Solutions based on dedicated protocols are already present in current vehicles. The advantages of such solutions are numerous, but they create disparities among the associated protocols. Communication between protocols requires interfaces, which penalize data flows and break real-time behaviour. In order to reduce the number of cables, we investigated a solution based on powerline communication techniques. Proving the feasibility of such a system on a vehicle is the object of our contribution to the PREDIT CCPE project.
Frigui, Nejm Eddine. "Maintenance automatique du réseau programmable d'accès optique de très haut débit." Thesis, Ecole nationale supérieure Mines-Télécom Atlantique Bretagne Pays de la Loire, 2019. http://www.theses.fr/2019IMTA0127/document.
Passive Optical Networks (PONs), one of the most attractive FTTH access network solutions, have been widely deployed for several years thanks to their ability to offer high-speed services. However, due to the dynamic nature of user traffic patterns, PONs need to rely on an efficient upstream bandwidth allocation mechanism. This mechanism is currently limited by the static nature of Service Level Agreement (SLA) parameters, which can lead to unoptimized bandwidth allocation in the network. The objective of this thesis is to propose a new management architecture for optimizing upstream bandwidth allocation in PONs while acting only on manageable parameters, so as to allow the involvement of self-deciding elements in the network. To achieve this, classification techniques based on machine learning are used to analyze the behavior of PON users and to characterize their upstream data transmission tendency. Some SLA parameters are then dynamically adjusted to maximize overall customer satisfaction with the network.
Ric, Audrey Marie Amélie. "Caractérisation d'aptamères par électrophorèse capillaire couplée au séquençage haut-débit Illumina." Thesis, Toulouse 3, 2017. http://www.theses.fr/2017TOU30388/document.
Aptamers are short single-stranded DNA or RNA oligomers which can interact strongly and specifically with certain targets when they fold into three-dimensional structures. The objective of this thesis was to extend existing studies on the use of capillary electrophoresis (CE) in order to develop a method for selecting aptamers by CE coupled to laser-induced fluorescence and Illumina high-throughput sequencing. In a first step, we developed a method of detection and separation by capillary electrophoresis, with dual UV-LEDIF detection, of a DNA library interacting with a target: thrombin. Thrombin is a well-studied model for which two aptamers have been published; we used the aptamer T29 in our study because it has the best affinity. Capillary electrophoresis is a powerful analytical tool that improves the selection efficiency of aptamers and refines the determination of interaction parameters. We were thus able to determine the affinity constant KD by CE-UV-LEDIF on the thrombin model. Moreover, we also show how the use of Tris buffer can degrade single-stranded DNA during capillary electrophoresis, and we propose as an alternative the use of a dibasic sodium phosphate buffer, which avoids this degradation. Finally, we discuss the difficulty of amplifying, by qPCR and PCR, an aptamer such as T29 that carries a G-quadruplex structure. We showed that Illumina high-throughput sequencing allowed us to find a correlation between the number of molecules sequenced and the number of sequences obtained. Analysis of the sequences obtained shows a significant proportion (20%) of sequences that do not correspond to the T29 aptamer sequence, which shows that the PCR and high-throughput sequencing steps can bias the identification of G-quadruplex-forming molecules.
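Determining an affinity constant KD, as mentioned above, amounts to fitting a binding model to measured bound fractions. A generic sketch (a simple 1:1 binding isotherm fitted by grid search on synthetic data; the thesis's CE-specific analysis is not reproduced here):

```python
def fraction_bound(t, kd):
    """Simple 1:1 binding isotherm: fraction of aptamer bound
    at target concentration t (same units as kd)."""
    return t / (kd + t)

def fit_kd(concs, fractions, kd_grid):
    """Least-squares grid search for KD over candidate values."""
    def sse(kd):
        return sum((fraction_bound(t, kd) - f) ** 2
                   for t, f in zip(concs, fractions))
    return min(kd_grid, key=sse)

# Synthetic, noiseless data generated with KD = 100 (e.g. nM).
concs = [10, 50, 100, 200, 500, 1000]
obs = [fraction_bound(t, 100.0) for t in concs]
kd_hat = fit_kd(concs, obs, [10.0 * i for i in range(1, 51)])
```

With noiseless synthetic data the grid search recovers the generating KD exactly; real CE measurements would require noise handling and a finer optimization.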
Benoit, Landry. "Imagerie multimodalité appliquée au phénotypage haut-débit des semences et plantules." Thesis, Angers, 2015. http://www.theses.fr/2015ANGE0084.
Throughout this work, we have exploited the potential of different imaging modalities applied to the plant domain, so as to contribute to the high-throughput phenotyping of seeds and seedlings. We focused mainly on two specific and important problems in this domain. We begin by showing the applicability of visible imaging under inactinic light and of passive thermographic imaging for observing the development of seeds and seedlings, a biological phenomenon usually occurring in soil and in darkness. We present our contributions to the design and realization of a vision system using visible inactinic imaging, whose purpose is the automated, individualized measurement of seeds, seedlings and seedling organs. This system handles crossing seedlings through an original use of anisotropic diffusion, which allowed us to multiply the throughput tenfold without information loss. Furthermore, the system separates the organs by means of a generic criterion based on gravitropism. The validation of the vision system's image processing algorithms uses original approaches (numerical simulation, and testing the influence of uncertainty through agronomic simulation). Thermographic imaging, which captures the passive heat radiation of objects, allows us to visualize and measure seeds and seedlings in darkness; it also enables the segmentation and tracking of seedling organs. This imaging technology also allowed us to demonstrate the feasibility of non-destructive determination of the sugar content in organs of beet seedlings. We then propose a generic methodology for designing spectrally optimized low-cost sensors tailored to given application tasks.
This methodology uses information theory to extract, from relatively expensive hyperspectral imaging, the information needed for the design of dedicated low-cost sensors. The interest of this methodology for plant phenotyping has been demonstrated, justifying its transfer to research in plant biology.
Hannoun-Lévi, Jean-Michel. "Optimisation de la distribution de dose en curiethérapie de haut débit." Nice, 2008. http://www.theses.fr/2008NICE4060.
Mumtaz, Sami. "Nouvelles techniques de codage pour les communications optiques à haut-débit." Paris, Télécom ParisTech, 2011. https://pastel.hal.science/pastel-00679068.
Since the development of high-speed electronics, digital signal processing has become essential to optical transmission systems. In particular, equalization now allows compensating for signal distortions resulting from propagation in the fiber. In this thesis, we focus on another type of signal processing: coding techniques. We present three new ideas for improving system performance by reducing either the bit-error rate or the receiver complexity. Forward error correction has been used for many years in optical transmission systems in order to guarantee very low bit-error rates. With the increase of transmission bit rates, more powerful codes are required, such as low-density parity-check (LDPC) codes. We study this family of codes and show its impressive performance. We also propose a new construction of LDPC code that has optimal performance in comparison to codes proposed in the literature. We then present a new interleaving scheme for optical transmission systems using differential encoding; our technique improves performance and allows a large reduction of receiver complexity. Finally, we present a new coding technique for optical systems based on space-time coding, and show that polarization-dependent loss can be almost entirely suppressed in transmission systems using polarization division multiplexing.
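The general idea behind interleaving can be illustrated with a standard block interleaver (a generic construction, not the specific scheme proposed in the thesis): symbols are written row by row and read column by column, so a burst of consecutive channel errors is spread across the codeword.

```python
def interleave(bits, rows, cols):
    """Block interleaver: write row by row, read column by column."""
    assert len(bits) == rows * cols
    return [bits[r * cols + c] for c in range(cols) for r in range(rows)]

def deinterleave(bits, rows, cols):
    """Inverse operation: write column by column, read row by row."""
    assert len(bits) == rows * cols
    out = [None] * (rows * cols)
    i = 0
    for c in range(cols):
        for r in range(rows):
            out[r * cols + c] = bits[i]
            i += 1
    return out

data = list(range(12))
scrambled = interleave(data, 3, 4)
assert deinterleave(scrambled, 3, 4) == data  # lossless round trip
```

After deinterleaving at the receiver, a burst of errors on adjacent transmitted symbols lands on symbols that are far apart in the original codeword, which is what makes the decoder's job easier.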
Alameh, Malak. "Phénotypage à haut débit des variants du canal potassique cardiaque hERG." Electronic Thesis or Diss., Nantes Université, 2024. http://www.theses.fr/2024NANU1023.
Hereditary long QT syndrome type 2 (LQTS2) is a cardiac rhythm disorder caused by loss-of-function mutations in the KCNH2 gene, encoding the hERG cardiac ion channel. Today, over 3,000 variants of the KCNH2 gene are listed in the international ClinVar database, but the majority are classified as Variants of Unknown Significance (VUS), as their potential pathogenicity has yet to be determined. Reclassification of these VUS is essential to improve the follow-up of affected individuals. This project focused on the variants identified in the French Bamacœur database, with the main aim of rapidly reclassifying VUS. We studied 303 variants in total reported in this database, including 201 VUS, 81 pathogenic controls and 21 benign controls. All biophysical parameters were measured, including current amplitude, activation and inactivation curves, and kinetics. Z-score values were used to delineate zones of normal or abnormal function. The reclassification of VUS is made possible by the high level of evidence in this study. We identified 151/201 VUS in the normal function zone and 50 in the abnormal function zone. In addition, we identified region-specific abnormalities along the hERG sequence. These data will be used to create a database integrating functional, clinical and genetic information, providing a valuable tool for clinicians in the diagnosis and prevention of LQTS2.
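The Z-score criterion mentioned above can be sketched as follows: a variant's measured parameter is compared to the distribution of benign-control values, and flagged when it deviates by more than a chosen number of standard deviations. The threshold and the toy numbers below are illustrative, not the study's actual cutoffs or data:

```python
import statistics

def z_score(value, controls):
    """Z-score of a variant's measurement against benign-control values."""
    mu = statistics.mean(controls)
    sigma = statistics.stdev(controls)
    return (value - mu) / sigma

def classify(current_amplitude, benign_currents, threshold=2.0):
    """Flag a variant as 'abnormal' when it deviates by more than
    `threshold` standard deviations from benign controls
    (threshold chosen for illustration only)."""
    z = z_score(current_amplitude, benign_currents)
    return "abnormal" if abs(z) > threshold else "normal"

# Toy benign-control current amplitudes (arbitrary units).
benign = [95.0, 102.0, 98.0, 105.0, 100.0]
```

Applying the same criterion independently to each biophysical parameter (amplitude, activation, inactivation, kinetics) yields the normal/abnormal function zones described in the abstract.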