Academic literature on the topic "Generic decoding"

Create an accurate citation in APA, MLA, Chicago, Harvard, and other styles


Browse the topical lists of articles, books, theses, conference proceedings, and other academic sources on the topic "Generic decoding".

Next to every source in the list of references there is an "Add to bibliography" button. Press this button, and we will automatically generate the bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Vancouver, Chicago, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Journal articles on the topic "Generic decoding"

1

Guobin Shen, Guang-Ping Gao, Shipeng Li, Heung-Yeung Shum, and Ya-Qin Zhang. "Accelerate video decoding with generic GPU". IEEE Transactions on Circuits and Systems for Video Technology 15, no. 5 (May 2005): 685–93. http://dx.doi.org/10.1109/tcsvt.2005.846440.

2

Lax, R. F. "Generic interpolation polynomial for list decoding". Finite Fields and Their Applications 18, no. 1 (January 2012): 167–78. http://dx.doi.org/10.1016/j.ffa.2011.07.007.

3

Kushnerov, Alexander V., and Valery A. Lipnitski. "Generic BCH codes. Polynomial-norm error decoding". Journal of the Belarusian State University. Mathematics and Informatics, no. 2 (July 30, 2020): 36–48. http://dx.doi.org/10.33581/2520-6508-2020-2-36-48.

Abstract:
The classic Bose–Chaudhuri–Hocquenghem (BCH) codes are a famous and well-studied part of the theory of error-correcting codes. Generalizing BCH codes expands the range of options for practical error correction: some generic BCH codes can correct more errors per message block than the classic BCH code, so it is important to provide an appropriate error-correction method. Our investigation found the polynomial-norm method to be the most convenient and effective for that task. The result of the study is a model of a polynomial-norm decoder for a generic BCH code of length 65.
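For readers new to the area, the core idea behind decoders like this one is syndrome-based error correction. The following is a minimal Python sketch of syndrome decoding for the tiny binary Hamming(7,4) code, intended only as a generic illustration of the principle; it is not the polynomial-norm method or the generic BCH construction studied in the paper.

```python
import numpy as np

# Parity-check matrix of the binary Hamming(7,4) code (columns are 1..7 in binary).
H = np.array([[1, 0, 1, 0, 1, 0, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])

# Precompute a syndrome -> error-pattern table for all single-bit errors.
syndrome_table = {}
for i in range(7):
    e = np.zeros(7, dtype=int)
    e[i] = 1
    syndrome_table[tuple(H @ e % 2)] = e

def decode(received):
    """Correct at most one flipped bit by looking up the syndrome."""
    s = tuple(H @ received % 2)
    if not any(s):
        return received                        # zero syndrome: accept as-is
    return (received + syndrome_table[s]) % 2  # flip the bit the syndrome points to

codeword = np.array([1, 0, 1, 1, 0, 1, 0])     # a valid codeword (H @ c = 0 mod 2)
corrupted = codeword.copy()
corrupted[4] ^= 1                              # single-bit channel error
assert np.array_equal(decode(corrupted), codeword)
```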
4

Dupuis, Frédéric, Jan Florjanczyk, Patrick Hayden, and Debbie Leung. "The locking-decoding frontier for generic dynamics". Proceedings of the Royal Society A: Mathematical, Physical and Engineering Sciences 469, no. 2159 (November 8, 2013): 20130289. http://dx.doi.org/10.1098/rspa.2013.0289.

Abstract:
It is known that the maximum classical mutual information, which can be achieved between measurements on pairs of quantum systems, can drastically underestimate the quantum mutual information between them. In this article, we quantify this distinction between classical and quantum information by demonstrating that after removing a logarithmic-sized quantum system from one half of a pair of perfectly correlated bitstrings, even the most sensitive pair of measurements might yield only outcomes essentially independent of each other. This effect is a form of information locking but the definition we use is strictly stronger than those used previously. Moreover, we find that this property is generic, in the sense that it occurs when removing a random subsystem. As such, the effect might be relevant to statistical mechanics or black hole physics. While previous works had always assumed a uniform message, we assume only a min-entropy bound and also explore the effect of entanglement. We find that classical information is strongly locked almost until it can be completely decoded. Finally, we exhibit a quantum key distribution protocol that is ‘secure’ in the sense of accessible information but in which leakage of even a logarithmic number of bits compromises the secrecy of all others.
5

Jouguet, Paul, and Sebastien Kunz-Jacques. "High performance error correction for quantum key distribution using polar codes". Quantum Information and Computation 14, no. 3&4 (March 2014): 329–38. http://dx.doi.org/10.26421/qic14.3-4-8.

Abstract:
We study the use of polar codes for both discrete and continuous variables Quantum Key Distribution (QKD). Although very large blocks must be used to obtain the efficiency required by quantum key distribution, and especially continuous variables quantum key distribution, their implementation on generic x86 Central Processing Units (CPUs) is practical. Thanks to recursive decoding, they exhibit excellent decoding speed, much higher than large, irregular Low Density Parity Check (LDPC) codes implemented on similar hardware, and competitive with implementations of the same codes on high-end Graphic Processing Units (GPUs).
6

Xu, Liyan, Fabing Duan, Xiao Gao, Derek Abbott, and Mark D. McDonnell. "Adaptive recursive algorithm for optimal weighted suprathreshold stochastic resonance". Royal Society Open Science 4, no. 9 (September 2017): 160889. http://dx.doi.org/10.1098/rsos.160889.

Abstract:
Suprathreshold stochastic resonance (SSR) is a distinct form of stochastic resonance, which occurs in multilevel parallel threshold arrays with no requirements on signal strength. In the generic SSR model, an optimal weighted decoding scheme shows its superiority in minimizing the mean square error (MSE). In this study, we extend the proposed optimal weighted decoding scheme to more general input characteristics by combining a Kalman filter and a least mean square (LMS) recursive algorithm, wherein the weighted coefficients can be adaptively adjusted so as to minimize the MSE without complete knowledge of input statistics. We demonstrate that the optimal weighted decoding scheme based on the Kalman–LMS recursive algorithm is able to robustly decode the outputs from the system in which SSR is observed, even for complex situations where the signal and noise vary over time.
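The weighted decoding idea summarized above can be illustrated with a toy simulation. The sketch below adapts the decoding weights of a parallel threshold array with a plain LMS rule to reduce the mean square error; it is a simplified stand-in for illustration only, not the Kalman–LMS algorithm of the paper, and all parameter values are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 15           # number of threshold units in the array
mu = 0.01        # LMS step size (arbitrary choice)

w = np.zeros(N + 1)                               # decoding weights plus a bias term
for _ in range(20000):
    x = rng.uniform(-1.0, 1.0)                    # input sample
    noise = rng.normal(0.0, 0.5, size=N)          # independent noise on each unit
    y = (x + noise > 0.0).astype(float)           # binary outputs of the threshold array
    y = np.append(y, 1.0)                         # constant input for the bias weight
    x_hat = w @ y                                 # weighted decoding of the array output
    w += mu * (x - x_hat) * y                     # LMS step toward minimum MSE

print("learned decoding weights:", np.round(w, 3))
```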
7

Florescu, Dorian, and Daniel Coca. "A Novel Reconstruction Framework for Time-Encoded Signals with Integrate-and-Fire Neurons". Neural Computation 27, no. 9 (September 2015): 1872–98. http://dx.doi.org/10.1162/neco_a_00764.

Abstract:
Integrate-and-fire neurons are time encoding machines that convert the amplitude of an analog signal into a nonuniform, strictly increasing sequence of spike times. Under certain conditions, the encoded signals can be reconstructed from the nonuniform spike time sequences using a time decoding machine. Time encoding and time decoding methods have been studied using the nonuniform sampling theory for band-limited spaces, as well as for generic shift-invariant spaces. This letter proposes a new framework for studying IF time encoding and decoding by reformulating the IF time encoding problem as a uniform sampling problem. This framework forms the basis for two new algorithms for reconstructing signals from spike time sequences. We demonstrate that the proposed reconstruction algorithms are faster, and thus better suited for real-time processing, while providing a similar level of accuracy, compared to the standard reconstruction algorithm.
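As a companion to the abstract above, the snippet below sketches the time-encoding side only: an idealized integrate-and-fire encoder that integrates a biased input and emits a spike time whenever the integral crosses a threshold. It is a toy illustration under assumed parameter values, not the reconstruction framework proposed in the letter.

```python
import numpy as np

def if_encode(signal, dt, threshold, bias=1.0):
    """Idealized integrate-and-fire time encoder.

    Integrates (bias + signal) over time and records a spike time each time the
    running integral reaches the threshold, then subtracts the threshold (reset).
    The bias keeps the integrand positive so spike times are strictly increasing.
    """
    spike_times = []
    integral = 0.0
    for k, u in enumerate(signal):
        integral += (bias + u) * dt
        if integral >= threshold:
            spike_times.append(k * dt)
            integral -= threshold
    return np.array(spike_times)

# Encode one second of a slow sinusoid sampled at 1 kHz.
t = np.arange(0.0, 1.0, 1e-3)
x = 0.5 * np.sin(2 * np.pi * 3 * t)
spikes = if_encode(x, dt=1e-3, threshold=5e-3)
print(f"{len(spikes)} spikes, first few at", np.round(spikes[:5], 3))
```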
8

Li, Yinan, Jianan Lu, and Badrish Chandramouli. "Selection Pushdown in Column Stores using Bit Manipulation Instructions". Proceedings of the ACM on Management of Data 1, no. 2 (June 13, 2023): 1–26. http://dx.doi.org/10.1145/3589323.

Abstract:
Modern analytical database systems predominantly rely on column-oriented storage, which offers superior compression efficiency due to the nature of the columnar layout. This compression, however, creates challenges in decoding speed during query processing. Previous research has explored predicate pushdown on encoded values to avoid decoding, but these techniques are restricted to specific encoding schemes and predicates, limiting their practical use. In this paper, we propose a generic predicate pushdown approach that supports arbitrary predicates by leveraging selection pushdown to reduce decoding costs. At the core of our approach is a fast select operator capable of directly extracting selected encoded values without decoding, by using Bit Manipulation Instructions, an instruction set extension to the X86 architecture. We empirically evaluate the proposed techniques in the context of Apache Parquet using both micro-benchmarks and the TPC-H benchmark, and show that our techniques improve the query performance of Parquet by up to one order of magnitude with representative scan queries. Further experimentation using Apache Spark demonstrates speed improvements of up to 5.5X even for end-to-end queries involving complex joins.
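To make the idea concrete, the sketch below emulates in plain Python what a BMI-based select does conceptually: given a word of bit-packed codes and a selection bitmap, it pulls out only the selected codes without decoding the rest. The function name, the 5-bit width, and the layout are illustrative assumptions; the paper's operator works on Parquet-encoded data and uses the x86 PEXT instruction rather than a Python loop.

```python
def select_packed(packed: int, width: int, count: int, selection: int) -> list[int]:
    """Extract only the selected fixed-width codes from a bit-packed word.

    `packed` holds `count` codes of `width` bits each (code i occupies bits
    [i*width, (i+1)*width)); `selection` has bit i set if code i is selected.
    """
    mask = (1 << width) - 1
    return [(packed >> (i * width)) & mask
            for i in range(count)
            if (selection >> i) & 1]

# Four 5-bit dictionary codes packed into one integer; select codes 0 and 2.
codes = [3, 17, 9, 30]
packed = 0
for i, c in enumerate(codes):
    packed |= c << (i * 5)
print(select_packed(packed, width=5, count=4, selection=0b0101))  # [3, 9]
```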
9

Rybalov, A. N. "On the generic complexity of the decoding problem for linear codes". Prikladnaya diskretnaya matematika. Prilozhenie, no. 12 (September 1, 2019): 198–202. http://dx.doi.org/10.17223/2226308x/12/56.

10

Jia, Xiaojun, and Zihao Liu. "One-Shot M-Array Pattern Based on Coded Structured Light for Three-Dimensional Object Reconstruction". Journal of Control Science and Engineering 2021 (June 2, 2021): 1–16. http://dx.doi.org/10.1155/2021/6676704.

Abstract:
Pattern encoding and decoding are two challenging problems in a three-dimensional (3D) reconstruction system using coded structured light (CSL). In this paper, a one-shot pattern is designed as an M-array with eight embedded geometric shapes, in which each 2 × 2 subwindow appears only once. A robust pattern decoding method for reconstructing objects from a one-shot pattern is then proposed. The decoding approach relies on the robust pattern element tracking algorithm (PETA) and generic features of pattern elements to segment and cluster the projected structured light pattern from a single captured image. A deep convolution neural network (DCNN) and chain sequence features are used to accurately classify pattern elements and key points (KPs), respectively. Meanwhile, a training dataset is established, which contains many pattern elements with various blur levels and distortions. Experimental results show that the proposed approach can be used to reconstruct 3D objects.
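The defining property mentioned in the abstract (every 2 x 2 subwindow of the projected pattern occurs only once) can be checked with a few lines of code. The sketch below is just that uniqueness check on a toy symbol grid, with labels 0-7 standing in for the eight geometric shapes; it is not the authors' pattern generator, PETA tracker, or DCNN classifier.

```python
import numpy as np

def all_2x2_windows_unique(pattern: np.ndarray) -> bool:
    """True if every 2x2 subwindow of the symbol grid is distinct (M-array property)."""
    seen = set()
    rows, cols = pattern.shape
    for r in range(rows - 1):
        for c in range(cols - 1):
            window = tuple(pattern[r:r + 2, c:c + 2].ravel())
            if window in seen:
                return False
            seen.add(window)
    return True

# A random grid will usually fail the check; a proper M-array is constructed so it passes.
rng = np.random.default_rng(1)
grid = rng.integers(0, 8, size=(20, 20))
print(all_2x2_windows_unique(grid))
```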

Theses on the topic "Generic decoding"

1

Florjanczyk, Jan. "The locking-decoding frontier for generic dynamics". Thesis, McGill University, 2011. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=106400.

Abstract:
The intuition that the amount of classical correlations between two systems be bounded by their size does not hold true in general for quantum states. In the setting of information locking, measurements on a pair of quantum systems that appear to be completely uncorrelated can become maximally correlated with a small increment in the size of one of the systems. A new information locking scheme based on generic unitary channels is presented and a strengthened definition of locking based on a measure of indistinguishability is used. The new definition demonstrates that classical information can be kept arbitrarily low until it can be completely decoded. Unlike previous locking results, non-uniform input messages are allowed and shared entanglement between the pair of quantum systems is considered. Whereas past locking results relied on schemes with an explicit "key" register, this requirement is eliminated in favor of an arbitrary quantum subsystem. Furthermore, past results considered only projective measurements at the receiver. Here locking effects can be shown even in the case where the receiver is armed with the most general type of measurement. The locking effect is found to be generic and finds applications in entropic security and models for black hole evaporation.
2

Mahmudi, Ali. "The investigation into generic VHDL implementation of generalised minimum distance decoding for Reed Solomon codes". Thesis, University of Huddersfield, 2005. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.417302.

Abstract:
This thesis is concerned with the hardware implementation in VHDL (VHSIC Hardware Description Language) of a Generalised Minimum Distance (GMD) decoder for Reed Solomon (RS) codes. The generic GMD decoder has been implemented for Reed Solomon codes over GF(2^8). It works for a number of RS codes: RS(255, 239), RS(255, 241), RS(255, 243), RS(255, 245), RS(255, 247), RS(255, 249), and RS(255, 251). As a comparison, a Hard Decision Decoder (HDD) using the Welch-Berlekamp algorithm for the same RS codes is also implemented. The designs were first implemented in MATLAB. Then, the designs were written in VHDL and the target device was the Altera Field Programmable Gate Array (FPGA) Stratix EP1S25-B672C6. The GMD decoder achieved an internal clock speed of 66.29 MHz with RS(255, 251) down to 57.24 MHz with RS(255, 239). In the case of the HDD, internal clock speeds were 112.01 MHz with RS(255, 251) down to 86.23 MHz with RS(255, 239). It is concluded that the GMD decoder needs considerable extra hardware compared to the HDD: as little as 35% extra hardware in the case of the RS(255, 251) decoder, but 100% extra hardware for the RS(255, 241) decoder. If there is an option to choose the type of RS code to use, it is preferable to use the HDD rather than the GMD decoder. In the real world, however, the type of RS code is usually fixed by the relevant standard, so one alternative way to enhance decoding performance is to use the GMD decoder.
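Both decoders in this thesis operate on symbols from GF(2^8). As background, here is a minimal Python sketch of multiplication in that field using the primitive polynomial x^8 + x^4 + x^3 + x^2 + 1 (0x11D), a common choice for RS(255, k) codes; the thesis does not state which polynomial its hardware uses, so treat that as an assumption, and the VHDL designs themselves are of course not reproduced here.

```python
def gf256_mul(a: int, b: int, poly: int = 0x11D) -> int:
    """Multiply two elements of GF(2^8), reducing by the given field polynomial."""
    result = 0
    while b:
        if b & 1:
            result ^= a        # add (XOR) the current shifted copy of a
        a <<= 1
        if a & 0x100:          # degree reached 8: reduce modulo the field polynomial
            a ^= poly
        b >>= 1
    return result

# Sanity check: x * x^7 = x^8 = x^4 + x^3 + x^2 + 1, i.e. 0x02 * 0x80 = 0x1D.
assert gf256_mul(0x02, 0x80) == 0x1D
```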
3

Leuschner, Jeff. "A new generic maximum-likelihood metric expression for space-time block codes with applications to decoding". Thesis, Kingston, Ont. : [s.n.], 2007. http://hdl.handle.net/1974/633.

4

Shi, Aishan. "Decoding the Genetic Code: Unraveling the Language of Scientific Paradigms". Thesis, The University of Arizona, 2013. http://hdl.handle.net/10150/297762.

Abstract:
Scientific revolutions have not only significantly broadened our knowledge of underlying physical laws and natural patterns, but also shifted the cultural paradigm through which science is understood and practiced. These paradigm shifts, as Thomas Kuhn denoted them, are facilitated through changes in language, because language is the only method of articulating - and thereby establishing - truth, according to Friedrich Nietzsche and Michel Foucault. Steven Shapin analyzed the progression of these linguistic changes in global scientific revolutions and Bruno Latour categorized them in the local laboratory setting. One of the most recent revolutions in science, the discovery of the double helix by James Watson and Francis Crick, altered the understanding and application of genetics. James Watson's personal account of this discovery, The Double Helix, aims to change the general perception of science from intellectual labor to innovative play. However, he does not portray his discovery with the scientific elegance that it deserves as the culmination of nearly a century of research and the marriage of quantum physics and biology. This thesis explores the paradigm shifts that developed with each scientific revolution, how they led to the double helix, and finally, the paradigm shift of the "Structure-Function Relationship" that accompanied this discovery.
5

Halsteinli, Erlend. "Real-Time JPEG2000 Video Decoding on General-Purpose Computer Hardware". Thesis, Norwegian University of Science and Technology, Department of Electronics and Telecommunications, 2009. http://urn.kb.se/resolve?urn=urn:nbn:no:ntnu:diva-8996.

Abstract:

There is widespread use of compression in multimedia content delivery, e.g. within video on demand services and transport links between live events and production sites. The content must undergo compression prior to transmission in order to deliver high quality video and audio over most networks; this is especially true for high definition video content. JPEG2000 is a recent image compression standard and a suitable compression algorithm for high definition, high rate video. With its highly flexible embedded lossless and lossy compression scheme, JPEG2000 has a number of advantages over existing video codecs. The only evident drawbacks with respect to real-time applications are that the computational complexity is quite high and that JPEG2000, being an image compression codec as opposed to a video codec, typically has higher bandwidth requirements. Special-purpose hardware can deliver high performance, but is expensive and not easily updated. A JPEG2000 decoder application running on general-purpose computer hardware can complement solutions depending on special-purpose hardware and will experience performance scaling together with the available processing power. In addition, production costs will be non-existent once developed. The application implemented in this project is a streaming media player. It receives a compressed video stream through an IP interface, decodes it frame by frame and presents the decoded frames in a window. The decoder is designed to better take advantage of the processing power available in today's desktop computers. Specifically, decoding is performed on both CPU and GPU in order to decode a minimum of 50 frames per second of a 720p JPEG2000 video stream. The CPU-executed part of the decoder application is written in C++, based on the Kakadu SDK, and involves all decoding steps up to and including the reverse wavelet transform. The GPU-executed part of the decoder is enabled by the CUDA programming language and includes luma upsampling and the irreversible color transform. Results indicate that general-purpose computer hardware today can easily decode JPEG2000 video at bit rates up to 45 Mbit/s. However, when the video stream is received at 50 fps through the IP interface, packet loss at the socket level limits the attained frame rate to about 45 fps at rates of 40 Mbit/s or lower. If this packet loss could be eliminated, real-time decoding would be obtained up to 40 Mbit/s. At rates above 40 Mbit/s, the attained frame rate is limited by the decoder performance and not the packet loss. Higher codestream rates should be sustainable if the reverse wavelet transform could be moved from the CPU to the GPU, since the current pipeline is highly unbalanced.
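One of the GPU-mapped stages mentioned above, the inverse irreversible color transform, is a fixed linear map from YCbCr back to RGB. A minimal NumPy sketch of that step is given below for orientation; the coefficients are the standard ICT values, but this is an illustrative stand-in, not the thesis's CUDA implementation.

```python
import numpy as np

def inverse_ict(y, cb, cr):
    """Inverse irreversible color transform (YCbCr -> RGB), standard coefficients."""
    r = y + 1.402 * cr
    g = y - 0.344136 * cb - 0.714136 * cr
    b = y + 1.772 * cb
    return r, g, b

# One pixel, with chroma components already centred around zero.
r, g, b = inverse_ict(np.float32(0.5), np.float32(0.1), np.float32(-0.05))
print(round(float(r), 4), round(float(g), 4), round(float(b), 4))
```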

6

Kessy, Regina. "Decoding the donor gaze : documentary, aid and AIDS in Africa". Thesis, University of Huddersfield, 2014. http://eprints.hud.ac.uk/id/eprint/23747/.

Abstract:
The discourse of ‘the white man’s burden’ that originated in the nineteenth century with missionaries and colonialism still underpins much of the development ideology towards Africa today. The overwhelming assumption that rich Western countries can and should address ‘underdevelopment’ through aid only stigmatizes African reality, framing it to mirror the worldview of the international donors who fund most non-profit interventionist documentaries. In the ‘parachute filmmaking’ style that results, facilitated by financial resources and reflecting the self-serving intentions of the donors, the non-profit filmmaker functions simply as an agent of meaning rather than authentic author of the text. Challenged by limited production schedules and lacking in cultural understanding most donor-sponsored films fall back on an ethnocentric one-size-fits-all template of an ‘inferior other’ who needs to be ‘helped’. This study sets out to challenge the ‘donor gaze’ in documentary films which ‘speak about’ Africa, arguing instead for a more inclusive style of filmmaking that gives voice to its subjects by ‘speaking with’ them. The special focus is on black African women whose images are used to signify helplessness, vulnerability and ignorance, particularly in donor-funded documentaries addressing HIV/AIDS. Through case studies of four films this study asks: 1. How do documentary films reinforce the donor gaze? (how is the film speaking and why?) 2. Can the donor gaze be challenged? (should intentionality always override subjectivity of the filmed subjects?) Film studies approach the gaze psychoanalytically (e.g. Mulvey 1975) but this study focuses on the conscious gaze of filmmakers because they reinforce or challenge ‘the pictures in our heads.’ Sight is an architect of meaning. Gaze orders reality but the documentary gaze can re-order it. The study argues that in Africa, the ‘donor gaze’ constructs meaning by ‘speaking about’ reality and calls instead for a new approach for documentary to ‘speak with’ reality.
7

Al-Wasity, Salim Mohammed Hussein. "Application of fMRI for action representation : decoding, aligning and modulating". Thesis, University of Glasgow, 2018. http://theses.gla.ac.uk/30761/.

Abstract:
Functional magnetic resonance imaging (fMRI) is an important tool for understanding the neural mechanisms underlying human brain function. Understanding how the human brain responds to stimuli, how different cortical regions represent information, and whether these representational spaces are shared across brains is critical for our understanding of how the brain works. Recently, multivariate pattern analysis (MVPA) has become increasingly important for predicting mental states from fMRI data and for detecting coarse- and fine-scale neural responses. However, a major limitation of MVPA is the difficulty of aligning features across brains due to high variability in subjects’ responses, and hence MVPA has generally been used as a subject-specific analysis. Hyperalignment solved this problem of feature alignment across brains by mapping neural responses into a common model to facilitate between-subject classification. Another technique of growing importance in understanding brain function is real-time fMRI neurofeedback, which can be used to enable individuals to alter their brain activity. It facilitates people’s ability to learn control of their cognitive processes, such as motor control and pain, by learning to modulate their brain activation in targeted regions. The aim of this PhD research is to decode and to align the motor representations of multi-joint arm actions based on different modalities of motor simulation, for instance Motor Imagery (MI) and Action Observation (AO), using functional Magnetic Resonance Imaging (fMRI), and to explore the feasibility of using real-time fMRI neurofeedback to alter these action representations. The first experimental study of this thesis was performed on able-bodied participants to align the neural representation of multi-joint arm actions (lift, knock and throw) during MI tasks in the motor cortex using hyperalignment. Results showed that hyperalignment affords statistically higher between-subject classification (BSC) performance compared to anatomical alignment. Also, hyperalignment is sensitive to the order in which subjects enter the hyperalignment algorithm to create the common model space. These results demonstrate the effectiveness of hyperalignment in aligning neural responses in motor cortex across subjects to enable BSC of motor imagery. The second study extended the use of hyperalignment to align fronto-parietal motor regions by addressing the problems of localization and cortical parcellation using cortex-based alignment. Also, representational similarity analysis (RSA) was applied to investigate the shared neural code between AO+MI and MI of different actions. Results of MVPA revealed that these actions as well as their modalities can be decoded using the subject’s native or the hyperaligned neural responses. Furthermore, the RSA showed that AO+MI and MI representations formed separate clusters but that the representational organization of action types within these clusters was identical. These findings suggest that the neural representations of AO+MI and MI are neither the same nor totally distinct but exhibit a similar structural geometry with respect to different types of action. Results also showed that MI dominates in the AO+MI condition. The third study was performed on phantom limb pain (PLP) patients to explore the feasibility of using real-time fMRI neurofeedback to down-regulate the activity of the premotor (PM) and anterior cingulate (ACC) cortices and whether successful modulation would reduce pain intensity.
Results demonstrated that PLP patients were able to gain control and decrease the ACC and PM activation. Those patients reported decrease in the ongoing level of pain after training, but it was not statistically significant. The fourth study was conducted on healthy participants to study the effectiveness of fMRI neurofeedback on improving motor function by targeting Supplementary Motor Cortex (SMA). Results showed that participants learnt to up-regulate their SMA activation using MI of complex body actions as a mental strategy. In addition, behavioural changes, i.e. shortening of motor reaction time was found in those participants. These results suggest that fMRI neurofeedback can assist participants to develop greater control over motor regions involved in motor-skill learning and it can be translated into an improvement in motor function. In summary, this PhD thesis extends and validates the usefulness of hyperalignment to align the fronto-parietal motor regions and explores its ability to generalise across different levels of motor representation. Furthermore, it sheds light on the dominant role of MI in the AO+MI condition by examining the neural representational similarity of AO+MI and MI tasks. In addition, the fMRI neurofeedback studies in this thesis provide proof-of-principle of using this technology to reduce pain in clinical applications and to enhance motor functions in a healthy population, with the potential for translation into the clinical environment.
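Hyperalignment, which this thesis builds on, maps different subjects' voxel response patterns into a shared space. The toy sketch below uses a single orthogonal Procrustes fit between two simulated subjects as a rough stand-in for that idea; real hyperalignment iterates over many subjects and builds a common model space, and the data here are synthetic assumptions.

```python
import numpy as np
from scipy.linalg import orthogonal_procrustes

rng = np.random.default_rng(0)

# Synthetic responses: 40 stimuli x 100 voxels.  Subject B is a rotated, noisy
# version of subject A, mimicking idiosyncratic functional topographies.
stimuli, voxels = 40, 100
A = rng.normal(size=(stimuli, voxels))
R_true, _ = np.linalg.qr(rng.normal(size=(voxels, voxels)))   # random orthogonal map
B = A @ R_true + 0.05 * rng.normal(size=(stimuli, voxels))

# Find the orthogonal map R minimizing ||A @ R - B|| in the Frobenius norm.
R, _ = orthogonal_procrustes(A, B)

print("misalignment before:", round(float(np.linalg.norm(A - B)), 2))
print("misalignment after: ", round(float(np.linalg.norm(A @ R - B)), 2))
```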
8

Carrier, Kevin. "Recherche de presque-collisions pour le décodage et la reconnaissance de codes correcteurs". Electronic Thesis or Diss., Sorbonne université, 2020. http://www.theses.fr/2020SORUS281.

Abstract:
Error correcting codes are tools whose initial function is to correct errors caused by imperfect communication channels. In a non-cooperative context, there is the problem of identifying unknown codes based solely on knowledge of noisy codewords. This problem can be difficult for certain code families, in particular LDPC codes, which are very common in modern telecommunication systems. In this thesis, we propose new techniques to more easily recognize these codes. At the end of the 1970s, McEliece had the idea of redirecting the original function of codes to use them in ciphers, thus initiating a family of cryptographic solutions that is an alternative to those based on number theory problems. One of the advantages of code-based cryptography is that it seems to withstand the quantum computing paradigm, notably thanks to the robustness of the generic decoding problem. The latter has been thoroughly studied for more than 60 years. The latest improvements all rely on algorithms for finding pairs of points that are close to each other in a list, the so-called near-collision search problem. In this thesis, we improve generic decoding, in particular by proposing a new way to find close pairs. To do this, we use list decoding of Arikan's polar codes to build new fuzzy hashing functions. In this manuscript, we also deal with the search for pairs of far points. Our solution can be used to improve decoding at large distances, which has recently found applications in certain signature schemes.
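Since this entry (like several above) concerns the generic decoding problem itself, a small illustration may help orient readers: the sketch below implements the simplest information-set decoder, Prange's algorithm, for the binary syndrome decoding problem. It is a toy baseline on an arbitrarily chosen random instance, not the near-collision or fuzzy-hashing techniques developed in the thesis.

```python
import numpy as np

rng = np.random.default_rng(42)

def gf2_solve(A, b):
    """Solve A x = b over GF(2) for square A; return x, or None if A is singular."""
    A, b = A.copy() % 2, b.copy() % 2
    n = A.shape[0]
    for col in range(n):
        pivots = np.nonzero(A[col:, col])[0]
        if pivots.size == 0:
            return None
        p = col + pivots[0]
        A[[col, p]], b[[col, p]] = A[[p, col]], b[[p, col]]   # move pivot row up
        for row in range(n):
            if row != col and A[row, col]:
                A[row] ^= A[col]
                b[row] ^= b[col]
    return b

def prange(H, s, w, iterations=5000):
    """Find e with H e = s (mod 2) and weight(e) <= w, by repeatedly guessing a
    set of r columns and hoping the error is confined to them (Prange's algorithm)."""
    r, n = H.shape
    for _ in range(iterations):
        cols = rng.choice(n, size=r, replace=False)
        e_cols = gf2_solve(H[:, cols], s)
        if e_cols is not None and e_cols.sum() <= w:
            e = np.zeros(n, dtype=int)
            e[cols] = e_cols
            return e
    return None

# Toy instance: length-20 code, 10 parity checks, a weight-2 error to recover.
n, r, w = 20, 10, 2
H = rng.integers(0, 2, size=(r, n))
e_true = np.zeros(n, dtype=int)
e_true[[3, 11]] = 1
s = H @ e_true % 2
e_found = prange(H, s, w)
assert e_found is not None and np.array_equal(H @ e_found % 2, s)
print("recovered an error of weight", int(e_found.sum()))
```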
9

Ramis, Zaldívar Juan Enrique. "Decoding the genetic landscape of pediatric and young adult germinal center-derived B-cell non-Hodgkin lymphoma". Doctoral thesis, Universitat de Barcelona, 2021. http://hdl.handle.net/10803/672372.

Abstract:
B-cell non-Hodgkin lymphoma (B-NHL) of the pediatric and young adult population is a diverse group of neoplasms predominantly composed of aggressive B-cell lymphomas from the germinal center (GC). Molecular characterization of pediatric series has allowed the identification of several subtypes that predominantly occur in this age group. Despite that, the genomic features of these pediatric entities and their relationship to other B-NHL in this group of patients have not been extensively investigated. This thesis has aimed to address this gap in knowledge by performing a genetic and molecular characterization of large series of pediatric and young adult variants of GC-derived B-NHL, including Burkitt-like lymphoma with 11q aberration (BLL-11q), pediatric-type follicular lymphoma (PTFL), and large B-cell lymphomas (LBCL) such as diffuse large B-cell lymphoma (DLBCL), high-grade B-cell lymphoma, not otherwise specified (HGBCL, NOS), and large B-cell lymphoma with IRF4 rearrangement (LBCL-IRF4). In Study 1 we molecularly characterized a series of 11 BLL-11q, observing that BLL-11q differed clinically, morphologically and immunophenotypically from conventional BL and instead showed features more consistent with HGBCL or DLBCL. The genomic profile was also different from that of BL and DLBCL, with a mutational landscape characterized by the lack of typical BL mutations in ID3, TCF3, or CCND3 genes and recurrent specific BTG2 and ETS1 mutations, not present in BL but in the germinal center B-cell (GCB) DLBCL subtype. All these observations suggest that BLL-11q is a neoplasm closer to other GC-derived lymphomas than to BL. In Study 2, we expanded our knowledge of the genetic alterations associated with PTFL by verifying the presence of MAP2K1 and IRF8 mutations in a previously well characterized series of 43 PTFL. We demonstrate the activating effect of MAP2K1 mutations by immunohistochemical analysis, observing phosphorylation of the MAP2K1 downstream target extracellular signal-regulated kinase in the mutated cases. In addition, we demonstrate the specificity of MAP2K1 and IRF8-K66R mutations since they are absent in conventional FL or in t(14;18)-negative FL. Finally, in Study 3 we characterized a large series of LBCL including DLBCL, HGBCL, NOS and LBCL-IRF4 through an integrative analysis including targeted next generation sequencing, copy number, and transcriptome data. Results showed that each subgroup displayed different molecular profiles. LBCL-IRF4 had frequent mutations in IRF4 and NF-κB pathway genes (CARD11, CD79B), whereas DLBCL, NOS was predominantly of GCB-DLBCL subtype and carried gene mutations similar to the adult counterpart (e.g., SOCS1 and KMT2D). A subset of HGBCL, NOS displayed recurrent alterations of BL-related genes such as MYC, ID3, CCND3 and SMARCA4, whereas other cases were genetically closer to GCB-DLBCL. Interestingly, we could identify age-related differences in pediatric DLBCL, since pediatric and young adult cases were mainly of GCB subtype, displayed low genetic complexity and virtually lacked primary aberrations (BCL2, MYC and BCL6 rearrangements). Finally, we identify clinical and molecular features related to unfavorable outcome such as age >18 years, high LDH levels, activated B-cell (ABC) DLBCL profile, high genetic complexity, homozygous deletions of 19p13.3/TNFSF7/TNFSF9, gains of 1q21-q44/MDM4/MCL1 and TP53 mutations.
Altogether, we conclude that GC-derived B-NHL of pediatric and young adult population is a heterogeneous group of tumors including different entities with specific molecular profiles and clinical behavior. This thesis has contributed to increase the knowledge of these lymphoma entities identifying biomarkers that might be helpful to improve their diagnosis and to design management strategies more adapted to their particular biological behavior.
10

Kamel, Ehab. "Decoding cultural landscapes : guiding principles for the management of interpretation in cultural world heritage sites". Thesis, University of Nottingham, 2011. http://eprints.nottingham.ac.uk/11845/.

Abstract:
Conserving the cultural significance of heritage sites - as the guardians of social unity, place identity, and national pride - plays an essential role in maintaining sustainable social development, as well as preserving the variations identifying cultural groups and enriching the interaction between them. Consequently, and considering the importance of the built environment in communicating, as well as documenting, cultural messages, this research project, started in 2007, develops a set of guiding principles for interpretation management, as a process for conserving cultural World Heritage Sites; by maintaining and communicating their cultural significance through managing newly added architectural, urban, and landscape designs to such heritage sites. This research was mainly conducted to investigate and explain a concern regarding a gap that is increasing between people and the cultural heritage contexts they reside- particularly in Egypt- and to suggest a strategy for professionals to understand such sites from a perspective that reflects the public cognition. Adopting Grounded Theory methodology, the research develops a series of principles, which are intended to guide the process of cultural heritage conservation; through a critical analysis of current heritage conservation practices in World Heritage Sites. The research shows how they [the guiding principles] correspond to the contemporary perception of cultural heritage in literature, for which, a thorough discussion of literature, as well as critical analysis of UNESCO’s heritage conventions and ICOMOS charters are carried out. The research raises, discusses, and answers several key questions concerning heritage conservation, such as: whether UNESCO’s conventions target the right heritage or not; the conflicts appearing between heritage conservation documents (conventions and charters); whether intangible heritage can be communicated through design; and the effect of Western heritage ideology on heritage conservation practices. This is carried out through the use of interpretive discourse analysis of literature and heritage documents, and personal site observations and questionnaire surveys carried out in two main World Heritage Sites: Historic Cairo in Egypt and Liverpool city in the UK. The two case studies contributed to the understanding of the general public’s perception of cultural Heritage Sites, and how such perception is reflected in current heritage conservation practices. The thesis decodes cultural World Heritage Sites into three intersecting levels: the ‘cultural significances’ (or ‘open codes’), which represent different categories under which people perceive historic urban landscapes; the ‘cultural concepts’ (or ‘axial codes’), which are considered as the objectives of heritage conservation practice, and represent the general concepts under which cultural significances influence the heritage interpretation process; and finally, the ‘interpretation strategy tactics’, the UNCAP Strategy (or the ‘selective coding’), which are the five overarching principles guiding the interpretation management process in cultural heritage sites. 
This strategy, the UNCAP (Understanding people; Narrating the story; Conserving the spirit of place; Architectural engagement; and Preserving the built heritage), developed throughout this research, is intended to help heritage site managers, curators, architects, urban designers, landscape architects, developers, and decision makers to build up a thorough understanding of heritage sites, which should facilitate the establishment of more interpretive management plans for such sites, and enhance the communication of meanings and values of their physical remains, as well as emphasizing the ‘spirit of place’; for achieving socio-cultural sustainability in the development of World Heritage Sites.

Books on the topic "Generic decoding"

1

Decoding your dreams. New York: H. Holt, 1988.

2

Decoding your dreams. London: Unwin Hyman, 1989.

3

Langs, Robert. Decoding your dreams. London: Unwin Paperbacks, 1990.

4

Gesteland, Raymond F., and SpringerLink (Online service), eds. Recoding: Expansion of Decoding Rules Enriches Gene Expression. New York, NY: Springer Science+Business Media, LLC, 2010.

5

Berghoff, Hartmut. Decoding Modern Consumer Societies. New York: Palgrave Macmillan, 2012.

6

Decoding the past: The psychohistorical approach. New Brunswick, N.J., U.S.A: Transaction Publishers, 1996.

7

Decoding the past: The psychohistorical approach. Berkeley: University of California Press, 1985.

8

Tanzi, Rudolph E. Decoding darkness: The search for the genetic causes of Alzheimer's disease. Cambridge, Mass: Perseus Publishing, 2000.

9

Tanzi, Rudolph E. Decoding darkness: The search for the genetic causes of Alzheimer's disease. Cambridge, Mass: Perseus Pub., 2000.

10

Parson, Ann B., ed. Decoding darkness: The search for the genetic causes of Alzheimer's disease. Cambridge, Mass: Perseus Publ., 2000.


Book chapters on the topic "Generic decoding"

1

Dupuis, Frédéric, Jan Florjanczyk, Patrick Hayden, and Debbie Leung. "The Locking-Decoding Frontier for Generic Dynamics". In Theory of Quantum Computation, Communication, and Cryptography, 23–38. Berlin, Heidelberg: Springer Berlin Heidelberg, 2014. http://dx.doi.org/10.1007/978-3-642-54429-3_3.

2

Sheng, Mingyang, Yongqiang Ma, Kai Chen, and Nanning Zheng. "VAE-Based Generic Decoding via Subspace Partition and Priori Utilization". In IFIP Advances in Information and Communication Technology, 220–32. Cham: Springer Nature Switzerland, 2023. http://dx.doi.org/10.1007/978-3-031-34107-6_18.

3

Papadimitriou, Angeliki, Nikolaos Passalis, and Anastasios Tefas. "Decoding Generic Visual Representations from Human Brain Activity Using Machine Learning". In Lecture Notes in Computer Science, 597–606. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-11015-4_45.

4

Cassagne, Adrien, Bertrand Le Gal, Camille Leroux, Olivier Aumage, and Denis Barthou. "An Efficient, Portable and Generic Library for Successive Cancellation Decoding of Polar Codes". In Languages and Compilers for Parallel Computing, 303–17. Cham: Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-29778-1_19.

5

Chailloux, André, Thomas Debris-Alazard, and Simona Etinski. "Classical and Quantum Algorithms for Generic Syndrome Decoding Problems and Applications to the Lee Metric". In Post-Quantum Cryptography, 44–62. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-81293-5_3.

6

Vinuesa, Carola G., and Matthew C. Cook. "Genetic Analysis of Systemic Autoimmunity". In Decoding the Genomic Control of Immune Reactions, 103–28. Chichester, UK: John Wiley & Sons, Ltd, 2007. http://dx.doi.org/10.1002/9780470062128.ch10.

7

Keeling, Kim M., and David M. Bedwell. "Recoding Therapies for Genetic Diseases". In Recoding: Expansion of Decoding Rules Enriches Gene Expression, 123–46. New York, NY: Springer New York, 2009. http://dx.doi.org/10.1007/978-0-387-89382-2_6.

8

Farabaugh, Philip J. "Programmed Alternative Decoding as Programmed Translational Errors". In Programmed Alternative Reading of the Genetic Code, 183–201. Boston, MA: Springer US, 1997. http://dx.doi.org/10.1007/978-1-4615-5999-3_9.

9

Karupiah, Gunasegaran, Vijay Panchanathan, Isaac G. Sakala, and Geeta Chaudhri. "Genetic Resistance to Smallpox: Lessons from Mousepox". In Decoding the Genomic Control of Immune Reactions, 129–40. Chichester, UK: John Wiley & Sons, Ltd, 2007. http://dx.doi.org/10.1002/9780470062128.ch11.

10

Min-Oo, Gundula, Mary M. Stevenson, Anny Fortin, and Philippe Gros. "Genetic Control of Host-Pathogen Interactions in Mice". In Decoding the Genomic Control of Immune Reactions, 156–68. Chichester, UK: John Wiley & Sons, Ltd, 2007. http://dx.doi.org/10.1002/9780470062128.ch13.


Conference papers on the topic "Generic decoding"

1

Bitzer, Sebastian, Alessio Pavoni, Violetta Weger, Paolo Santini, Marco Baldi, and Antonia Wachter-Zeh. "Generic Decoding of Restricted Errors". In 2023 IEEE International Symposium on Information Theory (ISIT). IEEE, 2023. http://dx.doi.org/10.1109/isit54713.2023.10206983.

2

Bitzer, Sebastian, Julian Renner, Antonia Wachter-Zeh, and Violetta Weger. "Generic Decoding in the Cover Metric". In 2023 IEEE Information Theory Workshop (ITW). IEEE, 2023. http://dx.doi.org/10.1109/itw55543.2023.10160246.

3

Puchinger, Sven, Julian Renner, and Johan Rosenkilde. "Generic Decoding in the Sum-Rank Metric". In 2020 IEEE International Symposium on Information Theory (ISIT). IEEE, 2020. http://dx.doi.org/10.1109/isit44484.2020.9174497.

4

Brakensiek, Joshua, Sivakanth Gopi, and Visu Makam. "Generic Reed-Solomon Codes Achieve List-Decoding Capacity". In STOC '23: 55th Annual ACM Symposium on Theory of Computing. New York, NY, USA: ACM, 2023. http://dx.doi.org/10.1145/3564246.3585128.

5

"HDL LIBRARY OF PROCESSING UNITS FOR GENERIC AND DVB-S2 LDPC DECODING". En International Conference on Security and Cryptography. SciTePress - Science and and Technology Publications, 2006. http://dx.doi.org/10.5220/0001570000170024.

6

Hou, TianQi, YuHao Liu, Teng Fu, and Jean Barbier. "Sparse superposition codes under VAMP decoding with generic rotational invariant coding matrices". In 2022 IEEE International Symposium on Information Theory (ISIT). IEEE, 2022. http://dx.doi.org/10.1109/isit50566.2022.9834843.

7

Janakiram, Balaji, M. Girish Chandra, B. S. Adiga, S. G. Harihara, and P. Balamuralidhar. "A generic conflict-free architecture for decoding LDPC codes using Perfect Difference Networks". In 2010 Australian Communications Theory Workshop (AusCTW). IEEE, 2010. http://dx.doi.org/10.1109/ausctw.2010.5426777.

8

Leuschner, Jeff, and Shahram Yousefi. "A New Generic Maximum-Likelihood Metric Expression for Space-Time Block Codes With Applications To Decoding". In 2007 41st Annual Conference on Information Sciences and Systems. IEEE, 2007. http://dx.doi.org/10.1109/ciss.2007.4298429.

9

Arava, V. K. Prasad, Manhwee Jo, HyoukJoong Lee, and Kiyoung Choi. "A Generic Design for Encoding and Decoding Variable Length Codes in Multi-codec Video Processing Engines". In 2008 IEEE Computer Society Annual Symposium on VLSI. IEEE, 2008. http://dx.doi.org/10.1109/isvlsi.2008.49.

10

Qian, Qiao, Minlie Huang, Haizhou Zhao, Jingfang Xu, and Xiaoyan Zhu. "Assigning Personality/Profile to a Chatting Machine for Coherent Conversation Generation". In Twenty-Seventh International Joint Conference on Artificial Intelligence {IJCAI-18}. California: International Joint Conferences on Artificial Intelligence Organization, 2018. http://dx.doi.org/10.24963/ijcai.2018/595.

Abstract:
Endowing a chatbot with personality is challenging but significant to deliver more realistic and natural conversations. In this paper, we address the issue of generating responses that are coherent to a pre-specified personality or profile. We present a method that uses generic conversation data from social media (without speaker identities) to generate profile-coherent responses. The central idea is to detect whether a profile should be used when responding to a user post (by a profile detector), and if necessary, select a key-value pair from the profile to generate a response forward and backward (by a bidirectional decoder) so that a personality-coherent response can be generated. Furthermore, in order to train the bidirectional decoder with generic dialogue data, a position detector is designed to predict a word position from which decoding should start given a profile value. Manual and automatic evaluation shows that our model can deliver more coherent, natural, and diversified responses.

Reports on the topic "Generic decoding"

1

Loughry, Thomas A. General Purpose Graphics Processing Unit Based High-Rate Rice Decompression and Reed-Solomon Decoding. Office of Scientific and Technical Information (OSTI), February 2015. http://dx.doi.org/10.2172/1170513.

2

Zhang, Hongbin, Shahal Abbo, Weidong Chen, Amir Sherman, Dani Shtienberg, and Frederick Muehlbauer. Integrative Physical and Genetic Mapping of the Chickpea Genome for Fine Mapping and Analysis of Agronomic Traits. United States Department of Agriculture, March 2010. http://dx.doi.org/10.32747/2010.7592122.bard.

Abstract:
Chickpea is the third most important pulse crop in the world and ranks first in the Middle East; however, it has been subjected to only limited research in modern genomics. In the first period of this project (US-3034-98R) we constructed two large-insert BAC and BIBAC libraries, developed 325 SSR markers and mapped QTLs controlling ascochyta blight resistance (ABR) and days to first flower (DTF). Nevertheless, the utilities of these tools and results in gene discovery and marker-assisted breeding are limited due to the absence of an essential platform. The goals of this period of the project were to use the resources and tools developed in the first period of the project to develop a BAC/BIBAC physical map for chickpea, and to use it to identify BAC/BIBAC contigs containing agronomic genes of interest, with an emphasis on ABR and DTF, and to develop DNA markers suitable for marker-assisted breeding. Toward these goals, we proposed: 1) Fingerprint ~50,000 (10x) BACs from the BAC and BIBAC libraries, assemble the clones into a genome-wide BAC/BIBAC physical map, and integrate the BAC/BIBAC map with the existing chickpea genetic maps (Zhang, USA); 2) fine-map ABR and DTF QTLs and enhance molecular tools for chickpea genetics and breeding (Shahal, Sherman and Dani Shtienberg, Israel; Chen and Muehlbauer, USA); and 3) integrate the BAC/BIBAC map with the existing chickpea genetic maps (Sherman, Israel; Zhang and Chen, USA). For these objectives, a total of $460,000 was requested originally, but a total of $300,000 was awarded to the project. We first developed two new BAC and BIBAC libraries, Chickpea-CME and Chickpea-CHV. The Chickpea-CME BAC library contains 22,272 clones, with an average insert size of 130 kb, equivalent to 4.0-fold coverage of the chickpea genome. The Chickpea-CHV BIBAC library contains 38,400 clones, with an average insert size of 140 kb, equivalent to 7.5-fold coverage of the chickpea genome. The two new libraries (11.5x), along with the two BAC (Chickpea-CHI) and BIBAC (Chickpea-CBV) libraries (7.1x) constructed in the first period of the project, provide libraries essential for chickpea genome physical mapping and many other genomics studies. Using these four libraries we then developed the proposed BAC/BIBAC physical map of chickpea. A total of 67,584 clones were fingerprinted, and 64,211 (~11.6x) of the fingerprints were validated and used in the physical map assembly. The physical map consists of 1,945 BAC/BIBAC contigs, with each containing an average of 39.2 clones and having an average physical length of 559 kb. The contigs collectively span ~1,088 Mb, 1.49-fold of the 740-Mb chickpea genome. Third, we integrated the physical map with the two existing chickpea genetic maps using a total of 172 (124 + 48) SSR markers. Fourth, we identified tightly linked markers for ABR-QTL1, increased marker density at ABR-QTL2 and studied the genetic basis of resistance to pod abortion, a major problem in the east Mediterranean, caused by heat stress. Finally, using the integrated map, we isolated the BAC/BIBAC contigs containing or closely linked to QTL4.1, QTL4.2 and QTL8 for ABR and QTL8 for DTF. The integrated BAC/BIBAC map resulting from the project will provide a powerful platform and tools essential for many aspects of advanced genomics and genetics research of this crop and related species.
These include, but are not limited to, targeted development of SNP, InDel and SSR markers, high-resolution mapping of the chickpea genome and its agronomic genes and QTLs, sequencing and decoding of all genes of the genome using next-generation sequencing technology, and comparative genome analysis of chickpea versus other legumes. The DNA markers and BAC/BIBAC contigs containing or closely linked to ABR and DTF provide essential tools to develop SSR and SNP markers well-suited for marker-assisted breeding of the traits and to clone their corresponding genes. The development of these tools and knowledge will thus promote enhanced and substantial genetic improvement of the crop and related legumes.