Dissertations / Theses on the topic 'Quantum Random Number Generators'




Consult the top 50 dissertations / theses for your research on the topic 'Quantum Random Number Generators.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse dissertations / theses on a wide variety of disciplines and organise your bibliography correctly.

1

Abellán Sánchez, Carlos. "Quantum random number generators for industrial applications." Doctoral thesis, Universitat Politècnica de Catalunya, 2018. http://hdl.handle.net/10803/587190.

Full text
Abstract:
Randomness is one of the most intriguing, inspiring and debated topics in the history of the world. It appears every time we wonder about our existence, about the way we are, e.g. Do we have free will? Is evolution a result of chance? It is also present in any attempt to understand our anchoring to the universe, and the rules behind the universe itself, e.g. Why are we here, and when and why did all this start? Is the universe deterministic or does unpredictability exist? Remarkably, randomness also plays a central role in the information era and technology. Random digits are used in communication protocols like Ethernet, in search engines and in processing algorithms such as PageRank. Randomness is also widely used in so-called Monte Carlo methods in physics, biology, chemistry, finance and mathematics, as well as in many other disciplines. However, the most iconic use of random digits is found in cryptography. Random numbers are used to generate cryptographic keys, which are the most basic element to provide security and privacy to any form of secure communication. This thesis has been carried out with the following questions in mind: Does randomness exist in photonics? If so, how do we mine it, and how do we mine it in a massively scalable manner so that everyone can easily use it? Addressing these two questions led us to combine tools from fundamental physics and engineering. The thesis starts with an in-depth study of the phase diffusion process in semiconductor lasers and its application to random number generation. In contrast to other physical processes based on deterministic laws of nature, the phase diffusion process has a purely quantum mechanical origin and, as such, is an ideal source for generating truly unpredictable digits. First, we experimentally demonstrated the fastest quantum random number generation scheme ever reported (at the time), using components from the telecommunications industry only. Up to 40 Gb/s were demonstrated to be possible using a pulsed scheme. We then moved towards building prototypes and testing them with partners in supercomputation and fundamental research. In particular, the devices developed during this thesis were used in the landmark loophole-free Bell test experiments of 2015. In the process of building the technology, we started a new research focus as an attempt to answer the following question: How do we know that the digits that we generate are really coming from the phase diffusion process that we trust? As a result, we introduced the randomness metrology methodology, which can be used to derive quantitative bounds on the quality of any physical random number generation device. Finally, we moved towards miniaturisation of the technology by leveraging techniques from the photonic integrated circuits industry. The first fully integrated quantum random number generator was demonstrated using a novel two-laser scheme on an Indium Phosphide platform. In addition, we also demonstrated the integration of part of the technology on a Silicon Photonics platform, opening the door towards manufacturing in the most advanced semiconductor industry.
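The phase-diffusion scheme described in this abstract can be illustrated with a toy numerical model: between pulses the laser phase performs a random walk, an unbalanced interferometer converts the phase difference into an intensity, and the digitised intensity yields raw bits. The sketch below is only a simplified illustration of that idea (the function names, parameters and the use of NumPy's generator as a stand-in for the physical noise are assumptions, not the thesis's implementation); a real device would also apply a calibrated randomness extractor to the raw samples.

```python
import numpy as np

rng = np.random.default_rng(0)  # stands in for the physical phase noise in this toy model

def phase_diffusion_samples(n_pulses: int, sigma: float = 10.0) -> np.ndarray:
    """Toy model of a pulsed phase-diffusion QRNG.

    Between consecutive pulses the laser phase performs a random walk with
    standard deviation `sigma` (radians).  For strong diffusion (sigma >> 2*pi)
    the interferometric phase difference is effectively uniform on [0, 2*pi).
    An unbalanced interferometer maps the phase difference to an intensity
    I = (1 + cos(dphi)) / 2 at the photodetector.
    """
    dphi = rng.normal(0.0, sigma, n_pulses)    # accumulated phase diffusion
    return 0.5 * (1.0 + np.cos(dphi))          # normalised detected intensity

def intensities_to_bits(intensity: np.ndarray, n_levels: int = 256) -> np.ndarray:
    """Digitise each intensity sample (toy stand-in for an ADC) and keep the LSB.
    A real generator would instead apply a seeded randomness extractor sized by
    the certified min-entropy of the source."""
    codes = np.clip((intensity * n_levels).astype(int), 0, n_levels - 1)
    return codes & 1

bits = intensities_to_bits(phase_diffusion_samples(100_000))
print(f"bias of raw bits: {abs(bits.mean() - 0.5):.4f}")
```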
2

Raffaelli, Francesco. "Quantum random number generators in integrated photonics." Thesis, University of Bristol, 2019. http://hdl.handle.net/1983/b20b0798-755d-4a57-843f-3951805e9f53.

Full text
Abstract:
Random numbers find applications in a range of different fields, from quantum key distribution and classical cryptography to fundamental science. They also find extensive use in gambling and lotteries. By exploiting the probabilistic nature of Quantum Mechanics, quantum random number generators (QRNGs) provide a secure and efficient means to produce random numbers. Most of the quantum random number generators demonstrated so far have been built in bulk optics, using either free-space or fibre-optic components. While showing good performance, most of these demonstrations are strongly limited in real-life applications, due to issues such as size, cost and the manufacturing process. In this thesis I report the demonstration of three different QRNGs based on integrated photonics. First, I demonstrated a QRNG based on homodyne measurement of optical vacuum states on a silicon-on-insulator (SOI) chip. Second, I developed a SOI QRNG based on phase fluctuations from a laser diode. In these two schemes all the optical and opto-electronic components, excluding the laser, were integrated onto a silicon-on-insulator device. These schemes, being built on a silicon-on-insulator chip, are potentially CMOS compatible and pave the way for integration into other, more complex systems. These QRNGs showed Gbps generation rates and passed the statistical tests provided by NIST. Third, I report a preliminary study of a QRNG based on homodyne measurement of optical vacuum states on an Indium Phosphide (InP) chip. In this third experiment, all the components, including a laser diode, were monolithically integrated in the same chip, which provides a great advantage in terms of the overall size of the optics of the device.
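As a rough illustration of the vacuum-homodyne approach mentioned above, one can model the detector output as Gaussian shot noise plus untrusted electronic noise, digitise it with an ADC, and look at the worst-case (min-)entropy of the raw codes. This is a hedged toy model, not the chips described in the thesis; all names and parameter values below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)  # stands in for shot noise in this toy model

def homodyne_vacuum_samples(n: int, shot_sigma: float = 1.0,
                            electronic_sigma: float = 0.3) -> np.ndarray:
    """Toy model of vacuum homodyne detection: the quadrature outcomes are
    Gaussian shot noise, on top of which classical electronic noise (not
    trustworthy for randomness) is added."""
    return rng.normal(0, shot_sigma, n) + rng.normal(0, electronic_sigma, n)

def adc(samples: np.ndarray, bits: int = 8, full_scale: float = 5.0) -> np.ndarray:
    """Uniform quantiser emulating an ADC with the given resolution."""
    levels = 2 ** bits
    codes = np.floor((samples + full_scale) / (2 * full_scale) * levels)
    return np.clip(codes, 0, levels - 1).astype(int)

codes = adc(homodyne_vacuum_samples(1_000_000))
freqs = np.bincount(codes, minlength=256) / codes.size
h_min_raw = -np.log2(freqs.max())   # worst-case entropy per raw 8-bit sample
print(f"raw min-entropy: {h_min_raw:.2f} bits per 8-bit sample")
# A real device would feed the raw codes into a randomness extractor whose output
# length is set by the min-entropy attributed to the quantum (shot) noise only.
```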
3

Bisadi, Zahra. "All-Silicon-Based Photonic Quantum Random Number Generators." Doctoral thesis, University of Trento, 2017. http://eprints-phd.biblio.unitn.it/2603/1/ZAHRA_BISADI_Thesis.pdf.

Full text
Abstract:
Random numbers are fundamental elements in different fields of science and technology, such as computer simulation (e.g. Monte Carlo methods), statistical sampling, cryptography, games and gambling, and other areas where unpredictable results are necessary. Random number generators (RNGs) are generally classified as "pseudo"-random number generators (PRNGs) and "truly" random number generators (TRNGs). Pseudo-random numbers are generated by computer algorithms with a (random) seed and a specific formula. The random numbers produced in this way (with a small degree of unpredictability) are good enough for some applications such as computer simulation. However, for some other applications, like cryptography, they are not completely reliable. When the seed is revealed, the entire sequence of numbers can be produced. Periodicity is also an undesirable property of PRNGs, though it can be disregarded for most practical purposes if the sequence recurs only after a very long period. However, predictability remains a tremendous disadvantage of this type of generator. Truly random numbers, on the other hand, can be generated through physical sources of randomness like flipping a coin. However, approaches exploiting classical motion and classical physics to generate random numbers possess a deterministic nature that is transferred to the generated random numbers. The best solution is to benefit from the indeterminacy and randomness inherent in quantum physics. According to quantum theory, the properties of a particle cannot be determined with arbitrary precision until a measurement is carried out. The result of a measurement, therefore, remains unpredictable and random. Optical phenomena, including photons as the quanta of light, have various random, non-deterministic properties. These properties include the polarization of the photons, the exact number of photons impinging on a detector and the photon arrival times. Such intrinsically random properties can be exploited to generate truly random numbers. Silicon (Si) is considered an interesting material in integrated optics. Microelectronic chips made from Si are cheap and easy to mass-fabricate, and can be densely integrated. Si integrated optical chips, which can generate, modulate, process and detect light signals, exploit the benefits of Si while also being fully compatible with electronics. Since many electronic components can be integrated into a single chip, Si is an ideal candidate for the production of small, powerful devices. With complementary metal-oxide-semiconductor (CMOS) technology, the fabrication of compact, mass-manufacturable devices with integrated components on the Si platform is achievable. In this thesis we aim to model, study and fabricate a compact photonic quantum random number generator (QRNG) on the Si platform that is able to generate high-quality, "truly" random numbers. The proposed QRNG is based on a Si light source (LED) coupled with a Si single-photon avalanche diode (SPAD) or an array of SPADs, which is called a Si photomultiplier (SiPM). Various implementations of the QRNG have been developed, reaching an ultimate geometry where both the source and the SPAD are integrated on the same chip and fabricated by the same process. This activity was performed within the project SiQuro—on Si chip quantum optics for quantum computing and secure communications—which aims to bring the quantum world into integrated photonics.
By using the same successful paradigm of microelectronics—the study and design of very small electronic devices typically made from semiconductor materials—the vision is to have low-cost and mass-manufacturable integrated quantum photonic circuits for a variety of different applications in quantum computing, measurement, sensing, secure communications and services. The Si platform permits, in a natural way, the integration of quantum photonics with electronics. Two methodologies are presented to generate random numbers: one is based on photon counting measurements and the other on photon arrival time measurements. The latter is robust, masks all the drawbacks of afterpulsing, dead time and jitter of the Si SPAD, and is effectively insensitive to ageing of the LED and to its emission drifts related to temperature variations. The raw data pass all the statistical tests in the National Institute of Standards and Technology (NIST) test suite and the TestU01 Alphabit battery without a post-processing algorithm. The maximum demonstrated bit rate is 1.68 Mbps, with an efficiency of 4 bits per detected photon. In order to realize a small, portable QRNG, we have produced a compact configuration consisting of a Si nanocrystals (Si-NCs) LED and a SiPM. All the statistical tests in the NIST test suite pass for the raw data, with a maximum bit rate of 0.5 Mbps. We also prepared and studied a compact chip consisting of a Si-NCs LED and an array of detectors. An integrated chip, composed of a Si p+/n junction working in the avalanche region and a Si SPAD, was produced as well. High-quality random numbers are produced through our robust methodology at a maximum speed of 100 kcps. Integration of the source of entropy and the detector on a single chip is an efficient way to produce a compact RNG. A small RNG is an essential element to guarantee the security of our everyday life. It can be readily implemented into electronic devices for data encryption. The idea of "utmost security" would no longer be limited to particular organizations holding sensitive information. It would be accessible to everyone in everyday life.
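A minimal sketch of the arrival-time idea used in this work, under simplifying assumptions (Poissonian photon statistics, an ideal timestamping clock, no dead time or afterpulsing, 4 bits per detection as quoted in the abstract): the fine phase of each photon's timestamp with respect to a fast clock is nearly uniform when the mean inter-arrival time spans many clock periods. The function and parameter names are illustrative, not taken from the thesis.

```python
import numpy as np

rng = np.random.default_rng(2)  # stands in for the Poissonian photon stream in this toy model

def photon_arrival_bits(n_photons: int, rate_hz: float = 1e5,
                        clock_hz: float = 1e8, bits_per_photon: int = 4) -> np.ndarray:
    """Toy model of an arrival-time QRNG: photons arrive as a Poisson process,
    a fast clock timestamps them, and the timestamp modulo 2**bits_per_photon
    clock ticks yields `bits_per_photon` bits per detection.  Because the mean
    interval (1/rate) spans many clock ticks, the fine phase of the arrival
    time is nearly uniform."""
    intervals = rng.exponential(1.0 / rate_hz, n_photons)
    timestamps = np.cumsum(intervals)
    ticks = np.floor(timestamps * clock_hz).astype(np.int64)
    symbols = ticks % (1 << bits_per_photon)       # e.g. a 4-bit symbol per photon
    return ((symbols[:, None] >> np.arange(bits_per_photon)) & 1).ravel()

bits = photon_arrival_bits(250_000)
print(f"{bits.size} bits, bias = {abs(bits.mean() - 0.5):.4f}")
```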
4

Ritchie, Robert Peter. "Efficient Constructions for Deterministic Parallel Random Number Generators and Quantum Key Distribution." Miami University / OhioLINK, 2021. http://rave.ohiolink.edu/etdc/view?acc_num=miami1619099112895031.

Full text
5

Marangon, Davide Giacomo. "Improving Quantum Key Distribution and Quantum Random Number Generation in presence of Noise." Doctoral thesis, Università degli studi di Padova, 2015. http://hdl.handle.net/11577/3424117.

Full text
Abstract:
The argument of this thesis might be summed up as the exploitation of noise to generate better noise. More specifically, this work is about the possibility of exploiting classical noise to transmit quantum information effectively, and of measuring quantum noise to generate better quantum randomness. What do I mean by exploiting classical noise to transmit quantum information effectively? In this case I refer to the task of sending quantum bits through the atmosphere in order to set up quantum key distribution (QKD) transmissions, and this is the subject of Chapter 1 and Chapter 2. In the Quantum Communications framework, QKD represents a topic with challenging problems, both theoretical and experimental. In principle QKD offers unconditional security; however, practical realizations of it must face all the limitations of the real world. One of the main limitations is the loss introduced by real transmission channels. Losses cause errors, and errors make the protocol less secure because an eavesdropper could try to hide his activity behind the losses. When this problem is addressed from a purely theoretical point of view, one tries to model the effect of losses by means of unitary transforms which affect the qubits on average according to a fixed level of link attenuation. However, this approach is somewhat limiting, because if one has a high level of background noise and the losses are assumed constant on average, the protocol might abort or not even start, the predicted QBER being too high. To address this problem and generate a key when it would normally not be possible, we have proposed an adaptive real-time selection (ARTS) scheme in which transmissivity peaks are instantaneously detected. In fact, an additional resource may be introduced to estimate the link transmissivity on its intrinsic time scale with the use of an auxiliary classical laser beam co-propagating with the qubits but conveniently interleaved in time. In this way the link scintillation is monitored in real time, and the selection of the time intervals of high channel transmissivity, corresponding to a viable QBER for positive key generation, is made available. In Chapter 2 we present a demonstration of this protocol in conditions of losses equivalent to long-distance and satellite links, and with a range of scintillation corresponding to moderate to severe weather. A useful criterion for the preselection of the low-QBER intervals is presented that employs a train of intense pulses propagating in the same path as the qubits, with parameters chosen such that its fluctuation in time reproduces that of the quantum communication. As for the content of Chapter 3, we describe a novel principle for a true random number generator (TRNG), based on the observation that a coherent beam of light crossing a very long path with atmospheric turbulence may generate random and rapidly varying images. To implement our method in a proof-of-concept demonstrator, we have chosen a very long free-space channel used in recent years for experiments in Quantum Communications at the Canary Islands. Here, after a propagation of 143 km, with the terminals at an altitude of about 2400 m, the turbulence in the path is converted into a dynamical speckle at the receiver. The source of entropy is then the atmospheric turbulence. Indeed, for such a long path, a solution of the Navier-Stokes equations for the atmospheric flow in which the beam propagates is out of reach.
Several models are based on the Kolmogorov statistical theory, which parametrizes the repartition of kinetic energy as the interaction of eddies of decreasing size. However, such models only provide a statistical description of the spot of the beam and its wandering, and never an instantaneous prediction of the irradiance distribution. These are mainly governed by temperature variations and by the wind, which cause fluctuations in the refractive index of the air. For this reason, when a laser beam is sent across the atmosphere, the latter may be considered a dynamic volumetric scatterer which distorts the beam wavefront. We evaluate the experimental data to ensure that the images are uniform and independent. Moreover, we show that our method for randomness extraction, based on combinatorial analysis, is optimal in the context of Information Theory. In Chapter 5 we present a new approach to the generation of random bits from quantum physical processes. Quantum Mechanics has always been regarded as a possible and valuable source of randomness, because of its intrinsically probabilistic nature. However, the typical paradigm employed to extract random numbers from a quantum system commonly assumes that the state of said system is pure. Only under such an assumption would the outcomes be fully unpredictable. The main issue, however, is that in real implementations, such as in a laboratory or in a commercial device, it is hardly possible to prepare a pure quantum state. One then has to deal with quantum states featuring some degree of mixedness. A mixed state, however, might be correlated with some other system which is held by an adversary, a quantum eavesdropper. In the extreme case of a fully mixed state, it is practically as if one were extracting random numbers from a classical state. We show why it is important to shift from a classical randomness estimator, such as the classical min-entropy H_min(Z) of a random variable Z, to quantum ones, such as the min-entropy conditioned on quantum side information E. We have devised an effective protocol based on the entropic uncertainty principle for the estimation of the conditional min-entropy. The entropic uncertainty principle lets one take into account the information which is shared between multiple parties holding a multipartite quantum system and, more importantly, lets one bound the information a party has on the system state after it has been measured. We adapted this principle to the bipartite case where a user Alice, A, is supplied with a quantum system prepared by the provider Eve, E, who could be maliciously correlated with it. In principle, then, Eve might be able to predict all the outcomes of the measurements Alice performs in the basis Z in order to extract random numbers from the system. However, we show that if Alice randomly switches from the measurement basis to a basis X mutually unbiased with respect to Z, she can lower-bound the min-entropy conditioned on Eve's side information. In this way it is possible for Alice to expand a small initial random seed into a much larger amount of trusted numbers. We present the results of an experimental demonstration of the protocol, in which random numbers passing the most rigorous classical tests of randomness were produced. In Chapter 6, we provide a secure generation scheme for a continuous-variable (CV) QRNG.
Since true random numbers are an invaluable resource for both classical Information Technology and the rising Quantum one, it is clear that, to sustain the present and future ever-growing fluxes of data to encrypt, it is necessary to devise quantum random number generators able to generate numbers at Gigabit or Terabit per second rates. The literature offers several examples of QRNG protocols which in theory could reach such rates. Typically, these are based on the exploitation of the quadratures of the electromagnetic field, regarded as an infinite-dimensional bosonic quantum system. The quadratures of the field can be measured with a well-known measurement scheme, so-called homodyne detection, which in principle can yield noise of unlimited bandwidth. Consequently, the bandwidth of the random signal is limited only by the passband of the devices used to measure it. Photodiode detectors commonly work in the GHz band, so if one samples the signal with a sufficiently fast ADC, Gigabit or Terabit rates can easily be reached. However, as in the case of discrete-variable QRNGs, the protocols found in the literature do not properly consider the purity of the quantum state being measured. The idea has been to extend the discrete-variable protocol of the previous chapter to the continuous case. We show how, in the CV framework, not only does the problem of state purity arise, but also the problem related to the precision of the measurements used to extract the randomness.
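The quantity at the heart of Chapter 5 can be stated compactly. A standard form of the entropic uncertainty relation with quantum side information, written here in the usual notation of the literature rather than of the thesis itself, bounds the randomness of the Z-basis outcomes once the complementary X-basis statistics have been estimated (d is the dimension of the measured system):

```latex
% Entropic uncertainty relation with quantum side information: for two measurements
% Z and X with maximum overlap c = max_{z,x} |<z|x>|^2,
%   H_min(Z|E) + H_max(X|B) >= log2(1/c).
% For mutually unbiased bases on a d-dimensional system, c = 1/d.  When the user holds
% the whole system and estimates H_max(X) from the check-basis data, this gives the
% bound used to size the randomness extractor:
\begin{equation}
  H_{\min}(Z \mid E) \;\ge\; \log_2 d \;-\; H_{\max}(X),
  \qquad
  \ell \;\approx\; n\, H_{\min}(Z \mid E),
\end{equation}
% where l is the number of extractable bits from n measurements, up to smoothing and
% finite-size corrections.
```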
6

Amino, Robert, and Joni Baitar. "Probabilistic Pseudo-random Number Generators." Thesis, KTH, Skolan för datavetenskap och kommunikation (CSC), 2014. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-157351.

Full text
Abstract:
Random numbers are essential in many computer applications and games. The goal of this report is to examine two of the most commonly used random number generators and try to determine some of their strengths and weaknesses. These generators are the Linear Congruential Generator (LCG) and the Mersenne Twister (MT). The main objective is to determine which one of these is the most suitable for low-intensity usage and everyday work. Although some of the test results were inconclusive, there were some indications that MT is the better pseudorandom number generator (PRNG) and therefore the preferred PRNG. However, be wary that this is not a general guideline, and some implementations may differ from this. The final verdict was thus that MT is the more favourable option (mainly due to its speed) for everyday work, both on a practical and a theoretical level, if a choice should arise between the two options.
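For readers unfamiliar with the two generators compared in this report, the sketch below contrasts a minimal LCG recurrence with Python's built-in Mersenne Twister; the LCG constants shown are the classic glibc-style values and are used purely for illustration.

```python
import random  # CPython's random module is itself a Mersenne Twister (MT19937)

def lcg(seed: int, a: int = 1103515245, c: int = 12345, m: int = 2 ** 31):
    """Minimal linear congruential generator x_{n+1} = (a*x_n + c) mod m.
    The parameters are the classic glibc-style constants, for illustration only."""
    x = seed
    while True:
        x = (a * x + c) % m
        yield x / m            # map to [0, 1)

mt = random.Random(42)         # Mersenne Twister, period 2**19937 - 1
my_lcg = lcg(42)

print("LCG :", [round(next(my_lcg), 4) for _ in range(5)])
print("MT  :", [round(mt.random(), 4) for _ in range(5)])
# The LCG state is a single 31-bit word, so its period is at most 2**31 and its
# low-order bits show strong regularities; MT19937 keeps 624 words of state.
```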
7

Kasikara, Gulin. "Progresses In Parallel Random Number Generators." Master's thesis, METU, 2005. http://etd.lib.metu.edu.tr/upload/12606651/index.pdf.

Full text
Abstract:
Monte Carlo simulations are embarrassingly parallel in nature, so having a parallel and efficient random number generator becomes crucial. To obtain a parallel generator with uncorrelated processors, parallelization methods are implemented together with a binary tree mapping. Although this method has considerable advantages, the constraints arising from the binary tree structure lead to a situation described as the problem of falling off the tree. In this thesis, a new spawning method, based on binary tree traversal and a new spawn-processor appointment, is proposed for use when the falling-off-the-tree problem is encountered. With this method, the spawning operation becomes more costly, but the independence of the parallel processors is guaranteed. In Monte Carlo simulations, random number generation time should be negligible compared with the execution time of the whole simulation; that is why linear congruential generators with Mersenne prime moduli are used. In highly branching Monte Carlo simulations, the cost of parameterization also gains importance, and it becomes reasonable to consider other types of primes or other parallelization methods that provide a different balance between parameterization cost and random number generation cost. With this idea in mind, two approaches are proposed in this thesis for improving the performance of linear congruential generators: the first uses Sophie Germain primes as moduli, and the second uses a hybrid method combining both parameterization and splitting techniques. The performance consequences of Sophie Germain primes relative to Mersenne primes are shown through graphs. It is observed that in some cases the proposed approaches perform better.
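For illustration, a multiplicative LCG with a Mersenne prime modulus looks as follows; the per-stream multipliers are standard textbook choices, and the parallel-stream assignment shown is only a sketch, not the spawning or Sophie Germain parameterization scheme proposed in the thesis.

```python
# Illustrative multiplicative LCG (MLCG), x_{n+1} = a * x_n mod (2^31 - 1),
# the Park-Miller "minimal standard" with a Mersenne prime modulus.
# Parallel parameterisation schemes give each processor its own multiplier
# (or, as the thesis considers, different prime moduli such as Sophie Germain
# primes); the two streams below are only an illustration of that idea.

MERSENNE_M = 2 ** 31 - 1          # Mersenne prime modulus

class MLCG:
    def __init__(self, seed: int, a: int = 48271, m: int = MERSENNE_M):
        assert 0 < seed < m
        self.x, self.a, self.m = seed, a, m

    def next(self) -> float:
        self.x = (self.a * self.x) % self.m
        return self.x / self.m

# Hypothetical per-processor streams with distinct full-period multipliers.
stream_a = MLCG(seed=1, a=48271)
stream_b = MLCG(seed=1, a=69621)
print([round(stream_a.next(), 4) for _ in range(3)])
print([round(stream_b.next(), 4) for _ in range(3)])
```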
8

Karanam, Shashi Prashanth. "Tiny true random number generator." Fairfax, VA : George Mason University, 2009. http://hdl.handle.net/1920/4587.

Full text
Abstract:
Thesis (M.S.)--George Mason University, 2009.
Vita: p. 91. Thesis director: Jens-Peter Kaps. Submitted in partial fulfillment of the requirements for the degree of Master of Science in Computer Engineering. Title from PDF t.p. (viewed Oct. 12, 2009). Includes bibliographical references (p. 88-90). Also issued in print.
9

Tso, Chi-wai, and 曹志煒. "Stringency of tests for random number generators." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2004. http://hub.hku.hk/bib/B29748367.

Full text
10

Ruhault, Sylvain. "Security analysis for pseudo-random number generators." Thesis, Paris, Ecole normale supérieure, 2015. http://www.theses.fr/2015ENSU0014/document.

Full text
Abstract:
In cryptography, randomness plays an important role in multiple applications. It is required in fundamental tasks such as key generation, the generation of initialization vectors, and key exchange. The security of these cryptographic algorithms and protocols relies on a source of unbiased and uniformly distributed random bits. Cryptography practitioners usually assume that parties have access to perfect randomness. However, quite often this assumption is not realizable in practice, and random bits are generated by a pseudo-random number generator. When this is done, the security of the scheme depends in a crucial way on the quality of the (pseudo-)randomness generated. However, only a few generators used in practice have been analyzed, and therefore practitioners and end users cannot easily assess their real security level. In this thesis we provide security models for the assessment of pseudo-random number generators and we propose secure and efficient constructions. In particular, we propose a new definition of robustness and we extend it to capture memory attacks and side-channel attacks. On the practical side, we provide a security assessment of generators used in practice, embedded in system kernels (Linux /dev/random) and cryptographic libraries (OpenSSL and Java SecureRandom), and we show that these generators contain potential vulnerabilities.
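The object analysed in this thesis, a PRNG with entropy input, can be sketched as a state machine with refresh and next operations. The toy construction below only conveys the interface and the idea of forward security; it is not one of the provably robust constructions proposed in the thesis, and the hash-based design and names are assumptions.

```python
import hashlib
import os

class ToyPRNGWithInput:
    """Toy pseudo-random number generator with entropy input.

    Two operations mirror the robustness model for such generators: refresh
    (absorb new entropy into the state) and next (produce output and move the
    state forward), with the implicit goal that the generator recovers from a
    state compromise once enough fresh entropy has been refreshed.  This is an
    illustrative sketch, not a provably secure construction.
    """

    def __init__(self) -> None:
        self.state = b"\x00" * 32

    def refresh(self, entropy: bytes) -> None:
        self.state = hashlib.sha256(b"refresh" + self.state + entropy).digest()

    def next(self, n_bytes: int = 32) -> bytes:
        out = hashlib.sha256(b"output" + self.state).digest()
        # forward security: derive a new state so old outputs cannot be recomputed
        self.state = hashlib.sha256(b"update" + self.state).digest()
        return out[:n_bytes]

gen = ToyPRNGWithInput()
gen.refresh(os.urandom(32))   # the OS entropy source stands in for noise events
print(gen.next(16).hex())
```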
11

Xu, Xiaoke. "Benchmarking the power of empirical tests for random number generators." Click to view the E-thesis via HKUTO, 2008. http://sunzi.lib.hku.hk/hkuto/record/B41508464.

Full text
12

Gärtner, Joel. "Analysis of Entropy Usage in Random Number Generators." Thesis, KTH, Skolan för datavetenskap och kommunikation (CSC), 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-214567.

Full text
Abstract:
Cryptographically secure random number generators usually require an external seed for initialization. Other solutions instead use a continuous entropy stream to ensure that the internal state of the generator always remains unpredictable. This thesis analyses four such generators with entropy inputs. Furthermore, different ways to estimate entropy are presented, and a new method useful for the generator analysis is developed. The developed entropy estimator performs well in tests and is used to analyse the entropy gathered from the different generators. All the analysed generators exhibit some seemingly unintentional behaviour, but most should still be safe to use.
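One simple, widely used way to turn observed samples into a conservative entropy figure is a most-common-value estimate in the spirit of NIST SP 800-90B; the sketch below is such a generic estimator, not the new method developed in the thesis.

```python
import math
import random
from collections import Counter

def mcv_min_entropy(samples: list[int]) -> float:
    """Most-common-value estimate of min-entropy per sample, in the spirit of
    NIST SP 800-90B: take the empirical frequency of the most likely symbol,
    widen it by a 99% confidence term, and report -log2 of that upper bound.
    This is a conservative per-symbol estimate, not the thesis's estimator."""
    n = len(samples)
    p_hat = Counter(samples).most_common(1)[0][1] / n
    p_upper = min(1.0, p_hat + 2.576 * math.sqrt(p_hat * (1 - p_hat) / (n - 1)))
    return -math.log2(p_upper)

# Example: a biased coin with P(1) = 0.6 has min-entropy about -log2(0.6) ~ 0.74 bits.
random.seed(0)
data = [1 if random.random() < 0.6 else 0 for _ in range(100_000)]
print(f"estimated min-entropy: {mcv_min_entropy(data):.3f} bits/sample")
```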
13

Korsbakke, Andreas, and Robin Ringsell. "Promestra Security compared with other random number generators." Thesis, Blekinge Tekniska Högskola, Institutionen för datavetenskap, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-18238.

Full text
Abstract:
Background. Being able to trust cryptographic algorithms is a crucial part of society today, because of all the information that is gathered by companies all over the world. With this thesis, we want to help both Promestra AB and potential future customers to evaluate whether its random number generator can be trusted. Objectives. The main objective of the study is to compare the random number generator in Promestra Security using the test suite made by the National Institute of Standards and Technology (NIST). The comparison is made with other random number generators such as Mersenne Twister, Blum-Blum-Shub and more. Methods. The selected method in this study was to gather a total of 100 million bits from each random number generator and use these in the NIST test suite for 100 tests to get a fair evaluation of the algorithms. The test suite provides a statistical summary, which was then analyzed. Results. The results show how many iterations out of 100 passed, and also the distribution of the results. The obtained results show that some of the tested random number generators clearly struggle in many of the tests. They also show that half of the tested generators passed all of the tests. Conclusions. Promestra Security and Blum-Blum-Shub are close to passing all the tests, but in the end they cannot be considered the preferable random number generators. The five that passed and seem to have no clear limitations are: Random.org, Micali-Schnorr, Linear Congruential, CryptGenRandom, and Mersenne Twister.
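As an example of what the NIST suite actually computes, the first and simplest test (the frequency or monobit test) can be written in a few lines; this is a generic reimplementation of the SP 800-22 definition, not the tooling used in the thesis.

```python
import math
import random

def nist_monobit_pvalue(bits: list[int]) -> float:
    """Frequency (monobit) test from the NIST SP 800-22 suite: map bits to +/-1,
    sum them, normalise by sqrt(n) and compute the two-sided p-value with the
    complementary error function.  p >= 0.01 is the usual pass criterion."""
    n = len(bits)
    s = abs(sum(2 * b - 1 for b in bits)) / math.sqrt(n)
    return math.erfc(s / math.sqrt(2))

random.seed(1)
sequence = [random.getrandbits(1) for _ in range(1_000_000)]
print(f"monobit p-value: {nist_monobit_pvalue(sequence):.4f}")
```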
14

Bang, Jung Woong. "An Empirical Comparison of Random Number Generators: Period, Structure, Correlation, Density, and Efficiency." Thesis, University of North Texas, 1995. https://digital.library.unt.edu/ark:/67531/metadc277807/.

Full text
Abstract:
Random number generators (RNGs) are widely used in conducting Monte Carlo simulation studies, which are important in the field of statistics for comparing power, mean differences, or distribution shapes between statistical approaches. Statistical results, however, may differ when different random number generators are used. Often, older methods have been used blindly with no understanding of their limitations. Many random functions supplied with computers today have been found to be comparatively unsatisfactory. In this study, five multiplicative linear congruential generators (MLCGs) were chosen which are provided in the following statistical packages: RANDU (IBM), RNUN (IMSL), RANUNI (SAS), UNIFORM (SPSS), and RANDOM (BMDP). Using a personal computer (PC), an empirical investigation was performed using five criteria: period length before the random numbers repeat, distribution shape, correlation between adjacent numbers, density of the distributions, and the behaviour of the random number generator (RNG) in approximating a normal distribution. All RNG FORTRAN programs were rewritten in Pascal, which is a more efficient language for the PC. Sets of random numbers were generated using different starting values. A good RNG should have the following properties: a long enough period; a well-structured pattern in distribution; independence between random number sequences; a random and uniform distribution; and a good approximation of the normal distribution. Findings in this study suggested that the above five criteria need to be examined when conducting a simulation study with large enough sample sizes and various starting values, because the RNG selected can affect the statistical results. Furthermore, a study intended to demonstrate reproducibility and validity should indicate the source of the RNG, the type of RNG used, evaluation results of the RNG, and any pertinent information related to the computer used in the study. Recommendations for future research are suggested in the area of other RNGs and methods not used in this study, such as additive, combined, mixed and shifted RNGs.
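RANDU, one of the generators examined here, is the classic cautionary example: its triples of consecutive outputs satisfy an exact linear relation and therefore fall on only 15 planes in the unit cube. The short sketch below verifies that relation numerically; it is an illustration of the known defect, not code from the study.

```python
import numpy as np

def randu(seed: int, n: int) -> np.ndarray:
    """IBM's RANDU: x_{k+1} = 65539 * x_k mod 2^31 (multiplicative LCG).
    Notorious because consecutive triples satisfy
    x_{k+2} - 6*x_{k+1} + 9*x_k = 0 (mod 2^31),
    so all points (x_k, x_{k+1}, x_{k+2}) lie on only 15 planes in the unit cube."""
    out = np.empty(n, dtype=np.int64)
    x = seed
    for i in range(n):
        x = (65539 * x) % 2 ** 31
        out[i] = x
    return out

x = randu(1, 30_000)
residue = (x[2:] - 6 * x[1:-1] + 9 * x[:-2]) % 2 ** 31
print("triples on the RANDU planes:", np.all(residue == 0))
```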
15

Epstein, Peter. "Generating geometric objects at random." Dissertation, Carleton University, Computer Science, Ottawa, 1992.

Find full text
16

Leifgen, Matthias. "Protocols and components for quantum key distribution." Doctoral thesis, Humboldt-Universität zu Berlin, Mathematisch-Naturwissenschaftliche Fakultät, 2016. http://dx.doi.org/10.18452/17473.

Full text
Abstract:
In this thesis, photonic quantum states are used for experimental realisations of two different concepts of quantum information processing. Quantum key distribution (QKD) is revolutionary because it is the only cryptographic scheme offering unconditional security. Two major problems prevail: firstly, matching the conditions for unconditional security is challenging; secondly, long-distance communication beyond 200 km is very demanding, because an increasingly attenuated quantum state starts to lose the competition with constant noise. One experiment accomplished in this thesis is concerned with the first problem. The realisation of the actual quantum state is critical. Single-photon states from nitrogen-vacancy and, for the first time, also silicon-vacancy defect centres are used for a QKD transmission under the BB84 protocol (Bennett and Brassard 1984). The deviation of the used single-photon states from the ideal state is thoroughly investigated, and the information an eavesdropper obtains due to this deviation is analysed. Transmitting quantum states via satellites is a potential solution to the limited achievable distances in QKD. A novel protocol particularly suited for this, the frequency-time (FT) protocol, is implemented for the first time in this thesis. The protocol is thoroughly investigated by varying the experimental parameters over a wide range and by evaluating the impact on the performance and the security. Finally, big steps towards a fully automated fibre-based BB84 QKD experiment in the time-bin implementation with autonomous sender and receiver units are accomplished. Another important concept using quantum mechanical properties as a resource is the quantum random number generator (QRNG). Random numbers are used for various applications in computing and cryptography. A QRNG supplying bits with high and quantifiable randomness at a record-breaking rate is reported, and the statistical properties of the random output are thoroughly tested.
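For context, the BB84 protocol named in this abstract reduces, at the level of key sifting, to a small amount of bookkeeping; the toy model below illustrates sifting and QBER estimation only, under idealised assumptions, and ignores the single-photon-source imperfections that the thesis actually analyses.

```python
import random
random.seed(3)

def bb84_sift(n: int, qber_channel: float = 0.03):
    """Toy BB84 sifting model: Alice encodes random bits in random bases
    (rectilinear/diagonal), Bob measures in random bases; only events where
    the bases match are kept, and a fraction `qber_channel` of those bits is
    flipped to model channel and detector errors."""
    sifted_a, sifted_b = [], []
    for _ in range(n):
        bit = random.getrandbits(1)
        basis_a, basis_b = random.getrandbits(1), random.getrandbits(1)
        if basis_a != basis_b:
            continue                      # discarded during sifting
        error = random.random() < qber_channel
        sifted_a.append(bit)
        sifted_b.append(bit ^ error)
    return sifted_a, sifted_b

a, b = bb84_sift(20_000)
qber = sum(x != y for x, y in zip(a, b)) / len(a)
print(f"sifted key length: {len(a)}, estimated QBER: {qber:.3f}")
```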
APA, Harvard, Vancouver, ISO, and other styles
17

Xu, Xiaoke, and 許小珂. "Benchmarking the power of empirical tests for random number generators." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2008. http://hub.hku.hk/bib/B41508464.

Full text
APA, Harvard, Vancouver, ISO, and other styles
18

Petura, Oto. "True random number generators for cryptography : Design, securing and evaluation." Thesis, Lyon, 2019. http://www.theses.fr/2019LYSES053.

Full text
Abstract:
Les nombres aléatoires sont essentiels pour les systèmes cryptographiques modernes. Ils servent de clés cryptographiques, de nonces, de vecteurs d’initialisation et de masques aléatoires pour la protection contre les attaques par canaux cachés. Dans cette thèse, nous traitons des générateurs de nombres aléatoires dans les circuits logiques (FPGA et ASIC). Nous présentons les méthodes fondamentales de génération de nombres aléatoires dans des circuits logiques. Ensuite, nous discutons de différents types de TRNG en utilisant le jitter d’horloge comme source d’aléa. Nous faisons une évaluation rigoureuse de divers noyaux TRNG conformes à la norme AIS-20/31 et mis en œuvre dans trois familles de FPGA différentes: Intel Cyclone V, Xilinx Spartan-6 et Microsemi SmartFusion2. Puis, nous présentons l’implémentation des noyaux TRNG sélectionnés dans des ASIC et leur évaluation. Ensuite, nous étudions en profondeur PLL-TRNG afin de fournir une conception sécurisée de ce TRNG ainsi que des tests intégrés. Enfin, nous étudions les TRNG basés sur les oscillateurs. Nous comparons de différentes méthodes d'extraction d’aléa ainsi que de différents types d'oscillateurs et le comportement du jitter d'horloge à l'intérieur de chacun d'eux. Nous proposons également des méthodes de mesure du jitter intégrée pour le test en ligne des TRNG basés sur les oscillateurs
Random numbers are essential for modern cryptographic systems. They are used as cryptographic keys, nonces, initialization vectors and random masks for protection against side-channel attacks. In this thesis, we deal with random number generators in logic devices (Field Programmable Gate Arrays – FPGAs and Application Specific Integrated Circuits – ASICs). We present fundamental methods of random number generation in logic devices. Then, we discuss different types of TRNGs using clock jitter as a source of randomness. We provide a rigorous evaluation of various AIS-20/31-compliant TRNG cores implemented in three different FPGA families: Intel Cyclone V, Xilinx Spartan-6 and Microsemi SmartFusion2. We then present the implementation of selected TRNG cores in a custom ASIC and evaluate them. Next, we study the PLL-TRNG in depth in order to provide a secure design of this TRNG together with embedded tests. Finally, we study oscillator-based TRNGs. We compare different randomness extraction methods as well as different oscillator types and the behavior of the clock jitter inside each of them. We also propose methods of embedded jitter measurement for online testing of oscillator-based TRNGs
APA, Harvard, Vancouver, ISO, and other styles
19

Pareschi, Fabio <1976&gt. "Chaos-based random number generators: monolithic implementation, testing and applications." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2007. http://amsdottorato.unibo.it/467/1/PhDThesis-Pareschi_Chaos-based_random_number_generatos.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
20

Pareschi, Fabio <1976&gt. "Chaos-based random number generators: monolithic implementation, testing and applications." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2007. http://amsdottorato.unibo.it/467/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
21

Crunk, Anthony Wayne. "A portable C random number generator." Thesis, Virginia Tech, 1985. http://hdl.handle.net/10919/45720.

Full text
Abstract:
The proliferation of computers with varying word sizes has increased the amount of software that requires random number generation, and several generation techniques have been developed. The criteria of randomness, portability, period, reproducibility, variety, speed, and storage are used to evaluate these methods. The Tausworthe method is the only one that meets the portability requirement and is therefore chosen for implementation. A C-language implementation is proposed, and test results are presented to confirm the acceptability of the proposed code.
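To illustrate the kind of shift-register generator discussed in this abstract, here is a minimal combined Tausworthe generator in portable C; the shift and mask constants follow the well-known taus88 parameterisation and are not claimed to be the generator developed in the thesis.

/* Sketch of a combined Tausworthe generator in portable C.
 * The shift/mask constants follow L'Ecuyer's taus88; they are given here
 * only to illustrate the construction and are not taken from the thesis. */
#include <stdio.h>
#include <stdint.h>

static uint32_t s1 = 12345, s2 = 67890, s3 = 13579;  /* seeds; each component needs
                                                        a seed above a small minimum
                                                        (roughly > 1, > 7, > 15) */

static uint32_t taus88(void) {
    s1 = ((s1 & 0xFFFFFFFEu) << 12) ^ (((s1 << 13) ^ s1) >> 19);
    s2 = ((s2 & 0xFFFFFFF8u) <<  4) ^ (((s2 <<  2) ^ s2) >> 25);
    s3 = ((s3 & 0xFFFFFFF0u) << 17) ^ (((s3 <<  3) ^ s3) >> 11);
    return s1 ^ s2 ^ s3;                  /* combined output, period about 2^88 */
}

int main(void) {
    for (int i = 0; i < 5; i++)
        printf("%08x\n", taus88());
    return 0;
}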
Master of Science
APA, Harvard, Vancouver, ISO, and other styles
22

Lian, Guinan. "Testing Primitive Polynomials for Generalized Feedback Shift Register Random Number Generators." Diss., CLICK HERE for online access, 2005. http://contentdm.lib.byu.edu/ETD/image/etd1131.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
23

Alice, Reinaudo. "Empirical testing of pseudo random number generators based on elliptic curves." Thesis, Linnéuniversitetet, Institutionen för matematik (MA), 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:lnu:diva-44875.

Full text
Abstract:
An introduction to random numbers, their history and applications is given, along with explanations of different methods currently used to generate them. Such generators can be of different kinds; in particular, they can be based on physical systems or on algorithmic procedures. The latter type of procedure gives rise to pseudo-random number generators. Specifically, several such generators based on elliptic curves are examined. To ease understanding, a basic primer on elliptic curves over fields and the operations arising from their group structure is also provided. Empirical tests to verify the randomness of generated sequences are then considered. Afterwards, some statistical considerations and observations about theoretical properties of the generators at hand are made, useful for employing them optimally. Finally, several randomly generated curves are created and used to produce pseudo-random sequences, which are then tested by means of the previously described empirical tests. In the end, an analysis of the results is attempted and some final considerations are made.
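As a toy illustration of the idea of driving a pseudo-random generator with elliptic-curve arithmetic, the following C sketch walks a textbook-sized curve and extracts one bit per point; the curve, base point and output function are illustrative only and far too small for cryptographic use.

/* Toy illustration of a pseudo-random generator driven by elliptic-curve
 * arithmetic: points are stepped as Q_{n+1} = Q_n + G on a tiny curve and a
 * bit is extracted from each x-coordinate.  The curve y^2 = x^3 + 2x + 2 over
 * F_17 with base point G = (5,1) is a textbook toy; real EC-based generators
 * use cryptographic-size fields and different output functions. */
#include <stdio.h>

#define P 17          /* field prime (toy size) */
#define A 2           /* curve parameter a in y^2 = x^3 + ax + b */

typedef struct { int x, y, inf; } Pt;     /* inf == 1 encodes the point at infinity */

static int mmod(int v) { return ((v % P) + P) % P; }

static int minv(int v) {                  /* modular inverse via Fermat: v^(P-2) mod P */
    int r = 1, e = P - 2;
    v = mmod(v);
    while (e) { if (e & 1) r = mmod(r * v); v = mmod(v * v); e >>= 1; }
    return r;
}

static Pt padd(Pt p, Pt q) {              /* affine point addition / doubling */
    Pt r = {0, 0, 1};
    if (p.inf) return q;
    if (q.inf) return p;
    if (p.x == q.x && mmod(p.y + q.y) == 0) return r;     /* p + (-p) = infinity */
    int s;
    if (p.x == q.x && p.y == q.y)
        s = mmod((3 * p.x * p.x + A) * minv(2 * p.y));    /* tangent slope */
    else
        s = mmod((q.y - p.y) * minv(q.x - p.x));          /* chord slope */
    r.x = mmod(s * s - p.x - q.x);
    r.y = mmod(s * (p.x - r.x) - p.y);
    r.inf = 0;
    return r;
}

int main(void) {
    Pt g = {5, 1, 0};                     /* base point of the toy curve */
    Pt q = g;
    printf("bits: ");
    for (int i = 0; i < 16; i++) {        /* walk the curve, emit parity of x */
        q = padd(q, g);
        if (q.inf) q = g;                 /* restart the walk after the group wraps */
        printf("%d", q.x & 1);
    }
    printf("\n");
    return 0;
}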
APA, Harvard, Vancouver, ISO, and other styles
24

Dusitsin, Krid, and Kurt Kosbar. "Accuracy of Computer Simulations that use Common Pseudo-random Number Generators." International Foundation for Telemetering, 1998. http://hdl.handle.net/10150/609238.

Full text
Abstract:
International Telemetering Conference Proceedings / October 26-29, 1998 / Town & Country Resort Hotel and Convention Center, San Diego, California
In computer simulations of communication systems, linear congruential generators and shift registers are typically used to model noise and data sources. These generators are often assumed to be close to ideal (i.e. delta correlated) and an insignificant source of error in the simulation results. The samples generated by these algorithms have non-ideal autocorrelation functions, which may cause a non-uniform distribution in the data or noise signals. This error may cause the simulation bit-error-rate (BER) to be artificially high or low. In this paper, the problem is described through the use of confidence intervals. Tests are performed on several pseudo-random generators to assess which ones are acceptable for computer simulation.
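A minimal C sketch of the kind of check this abstract alludes to, estimating the lag-1 serial correlation of a linear congruential generator; the Park-Miller constants are used only as an example generator, and the acceptance band is the usual rough 2/sqrt(N) rule of thumb.

/* Sketch: estimate the lag-1 serial correlation of a linear congruential
 * generator, one of the non-idealities discussed above.  The multiplier and
 * modulus are the classic Park-Miller "minimal standard" values, used purely
 * as an example generator. */
#include <stdio.h>
#include <stdint.h>
#include <math.h>

#define N 100000

static int64_t state = 1;
static double lcg(void) {                       /* returns a uniform value in (0,1) */
    state = (16807 * state) % 2147483647;       /* x_{n+1} = a * x_n mod m */
    return (double)state / 2147483647.0;
}

int main(void) {
    static double u[N];
    double mean = 0.0;
    for (int i = 0; i < N; i++) { u[i] = lcg(); mean += u[i]; }
    mean /= N;

    double num = 0.0, den = 0.0;                /* lag-1 autocorrelation estimate */
    for (int i = 0; i < N; i++) den += (u[i] - mean) * (u[i] - mean);
    for (int i = 0; i + 1 < N; i++) num += (u[i] - mean) * (u[i + 1] - mean);

    printf("lag-1 serial correlation: %+.5f (ideal: ~0, rough band: |r| < %.5f)\n",
           num / den, 2.0 / sqrt((double)N));
    return 0;
}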
APA, Harvard, Vancouver, ISO, and other styles
25

Pilcher, Martha Geraldine. "Development and validation of random cut test problem generator." Diss., Georgia Institute of Technology, 1985. http://hdl.handle.net/1853/24560.

Full text
APA, Harvard, Vancouver, ISO, and other styles
26

Apprey-Hermann, Joseph Kwame. "Evaluating The Predictability of Pseudo-Random Number Generators Using Supervised Machine Learning Algorithms." Youngstown State University / OhioLINK, 2020. http://rave.ohiolink.edu/etdc/view?acc_num=ysu1588805461290138.

Full text
APA, Harvard, Vancouver, ISO, and other styles
27

Marino, Ricardo. "Number statistics in random matrices and applications to quantum systems." Thesis, Université Paris-Saclay (ComUE), 2015. http://www.theses.fr/2015SACLS045/document.

Full text
Abstract:
L'objectif principal de cette thèse est de répondre à la question: étant donné une matrice aléatoire avec spectre réel, combien de valeurs propres tomber entre A et B? Ceci est une question fondamentale dans la théorie des matrices aléatoires et toutes ses applications, autant de problèmes peuvent être traduits en comptant les valeurs propres à l'intérieur des régions du spectre. Nous appliquons la méthode de gaz Coulomb à ce problème général dans le cadre de différents ensembles de matrice aléatoire et l'on obtient de résultats pour intervalles générales [a, b]. Ces résultats sont particulièrement intéressants dans l'étude des variations des systèmes fermioniques unidimensionnelles de particules confinées non-interaction à la température zéro
The main goal of this thesis is to answer the question: given a random matrix with real spectrum, how many eigenvalues fall between a and b? This is a fundamental question in random matrix theory and all of its applications, as many problems can be translated into counting eigenvalues inside regions of the spectrum. We apply the Coulomb gas method to this general problem in the context of different random matrix ensembles and we obtain many results for general intervals [a,b]. These results are particularly interesting in the study of fermionic fluctuations for one-dimensional systems of confined non-interacting particles at zero temperature
APA, Harvard, Vancouver, ISO, and other styles
28

Narayanan, Ramaswamy Karthik. "ROLLBACK-ABLE RANDOM NUMBER GENERATORS FOR THE SYNCHRONOUS PARALLEL ENVIRONMENT FOR EMULATION AND DISCRETE-EVENT SIMULATION (SPEEDES)." Master's thesis, University of Central Florida, 2005. http://digital.library.ucf.edu/cdm/ref/collection/ETD/id/4352.

Full text
Abstract:
Random numbers form the heart and soul of a discrete-event simulation system. There are few situations where the actions of the entities in the simulated process can be completely predicted in advance; real-world processes are more probabilistic than deterministic. Hence, such chance is represented in the system using various statistical models, such as random number generators. These generators can be used to represent a variety of factors, such as the length of a queue. However, simulations have grown in size and are sometimes required to run on multiple machines that share the simulation's methods or events among themselves. These machines can be distributed across a LAN or even the internet. In such cases, to preserve the validity of the simulation model, rollback-able random number generators are needed. This thesis is an effort to develop such rollback-able random number generators for the Synchronous Parallel Environment for Emulation and Discrete-Event Simulation (SPEEDES) developed by NASA. These rollback-able random number generators also add several statistical distribution models to the already rich SPEEDES library.
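A minimal sketch of the rollback idea described above: the generator's complete state is saved before speculative event processing and restored on rollback. The LCG and the struct below are illustrative stand-ins, not the SPEEDES API.

/* Sketch of a "rollback-able" generator: the full generator state is captured
 * before a speculative event and restored if the event is rolled back.
 * The 64-bit LCG here is a stand-in; the SPEEDES generators and API are not
 * reproduced. */
#include <stdio.h>
#include <stdint.h>

typedef struct { uint64_t s; } Rng;              /* complete generator state */

static uint32_t rng_next(Rng *r) {               /* 64-bit LCG, upper 32 bits returned */
    r->s = r->s * 6364136223846793005ULL + 1442695040888963407ULL;
    return (uint32_t)(r->s >> 32);
}

int main(void) {
    Rng rng = { 42 };

    Rng checkpoint = rng;                        /* save state before a speculative event */
    uint32_t a1 = rng_next(&rng), a2 = rng_next(&rng);
    printf("speculative draws: %u %u\n", a1, a2);

    rng = checkpoint;                            /* rollback: restore the saved state */
    uint32_t b1 = rng_next(&rng), b2 = rng_next(&rng);
    printf("replayed draws:    %u %u  (identical after rollback)\n", b1, b2);
    return 0;
}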
M.S.
Other
Engineering and Computer Science
Modeling and Simulation
APA, Harvard, Vancouver, ISO, and other styles
29

Hörmann, Wolfgang, and Gerhard Derflinger. "Universal Generators for Correlation Induction." Department of Statistics and Mathematics, Abt. f. Angewandte Statistik u. Datenverarbeitung, WU Vienna University of Economics and Business, 1994. http://epub.wu.ac.at/524/1/document.pdf.

Full text
Abstract:
Compared with algorithms specialized for a single distribution, universal (also called automatic or black-box) algorithms for continuous distributions have been discussed relatively seldom. But they have important advantages for the user: one algorithm, coded and tested only once, can do the same as, or even more than, a whole library of standard routines. It is only necessary to have a program available that can evaluate the density of the distribution up to a multiplicative factor. In this paper we show that transformed density rejection is well suited to constructing universal algorithms suitable for correlation induction, which is important for variance reduction in simulation. (author's abstract)
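As a simple illustration of correlation induction (not of transformed density rejection itself), the following C sketch feeds a common uniform stream and its antithetic counterpart through exponential inversion and measures the induced negative correlation; the distribution and sample size are arbitrary choices for the demonstration.

/* Sketch of correlation induction with inversion sampling: feeding the same
 * uniforms into two transforms induces positive correlation (common random
 * numbers); feeding u and 1-u induces negative correlation (antithetic
 * variates).  Exponential inversion is used as a stand-in for the universal
 * transformed-density-rejection machinery discussed in the paper. */
#include <stdio.h>
#include <stdlib.h>
#include <math.h>

#define N 100000

static double expo(double u, double rate) { return -log(1.0 - u) / rate; }

int main(void) {
    srand(1);
    double sx = 0, sy = 0, sxy = 0, sxx = 0, syy = 0;
    for (int i = 0; i < N; i++) {
        double u = (rand() + 0.5) / ((double)RAND_MAX + 1.0);
        double x = expo(u, 1.0);         /* stream 1 */
        double y = expo(1.0 - u, 1.0);   /* antithetic stream: strongly negatively correlated */
        sx += x; sy += y; sxy += x * y; sxx += x * x; syy += y * y;
    }
    double cov  = sxy / N - (sx / N) * (sy / N);
    double corr = cov / sqrt((sxx / N - (sx / N) * (sx / N)) *
                             (syy / N - (sy / N) * (sy / N)));
    printf("correlation of antithetic exponential streams: %.3f\n", corr);
    return 0;
}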
Series: Preprint Series / Department of Applied Statistics and Data Processing
APA, Harvard, Vancouver, ISO, and other styles
30

Borgen, Gary S. "Application of Dither to Low Resolution Quantization Systems." International Foundation for Telemetering, 1994. http://hdl.handle.net/10150/611650.

Full text
Abstract:
International Telemetering Conference Proceedings / October 17-20, 1994 / Town & Country Hotel and Conference Center, San Diego, California
A significant problem in the processing chain of a low resolution quantization system is the Analog to Digital converter quantization error. The classical model of quantization treats the error generated as a random additive process that is independent of the input and uniformly distributed. This model is valid for complex or random input signals that are large relative to a least significant bit. But the model fails catastrophically for small, simple signals applied to high resolution quantization systems, and in addition, the model fails for simple signals applied to low resolution quantization systems, i.e. one to 6 bits resolution. This paper will discuss a means of correcting this problem by the application of dither. Two methods of dither will be discussed as well as a real-life implementation of the techniques.
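A minimal C sketch of the effect described above, assuming a mid-tread quantizer and non-subtractive uniform dither of one LSB width: a sub-LSB input quantizes to a constant without dither, while with dither the average of repeated conversions recovers it.

/* Sketch: uniform dither of +/- 0.5 LSB added before a coarse quantizer.
 * Without dither a signal smaller than one LSB quantizes to a constant;
 * with dither, averaging repeated conversions recovers the sub-LSB value. */
#include <stdio.h>
#include <stdlib.h>

#define TRIALS 100000

static double quantize(double v) {               /* round to the nearest LSB */
    return (double)(long)(v >= 0 ? v + 0.5 : v - 0.5);
}

int main(void) {
    srand(2);
    double input = 0.3;                          /* true value: 0.3 LSB, below the quantizer step */
    double sum_plain = 0.0, sum_dither = 0.0;

    for (int i = 0; i < TRIALS; i++) {
        double d = ((double)rand() / RAND_MAX) - 0.5;    /* uniform dither in [-0.5, 0.5) LSB */
        sum_plain  += quantize(input);                   /* always rounds to 0 */
        sum_dither += quantize(input + d);               /* rounds to 0 or 1 at the right rate */
    }
    printf("true value          : %.3f LSB\n", input);
    printf("mean without dither : %.3f LSB\n", sum_plain / TRIALS);
    printf("mean with dither    : %.3f LSB\n", sum_dither / TRIALS);
    return 0;
}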
APA, Harvard, Vancouver, ISO, and other styles
31

Bamps, Cédric. "Self-Testing and Device-Independent Quantum Random Number Generation with Nonmaximally Entangled States." Doctoral thesis, Universite Libre de Bruxelles, 2018. http://hdl.handle.net/2013/ULB-DIPOT:oai:dipot.ulb.ac.be:2013/266954.

Full text
Abstract:
The generation of random number sequences, that is, of unpredictable sequences free from any structure, has found numerous applications in the field of information technologies. One of the most sensitive applications is cryptography, whose modern practice makes use of secret keys that must indeed be unpredictable for any potential adversary. This type of application demands highly secure randomness generators. This thesis contributes to the device-independent approach to quantum random number generation (DIRNG, for Device-Independent Random Number Generation). Those methods of randomness generation exploit the fundamental unpredictability of the measurement of quantum systems. In particular, the security of device-independent methods does not appeal to a specific model of the device itself, which is treated as a black box. This approach therefore stands in contrast to more traditional methods whose security rests on a precise theoretical model of the device, which may lead to vulnerabilities caused by hardware malfunctions or tampering by an adversary. Our contributions are the following. We first introduce a family of robust self-testing criteria for a class of quantum systems that involve partially entangled qubit pairs. This powerful form of inference allows us to certify that the contents of a quantum black box conform to one of those systems, on the sole basis of macroscopically observable statistical properties of the black box. That result leads us to introduce and prove the security of a protocol for randomness generation based on such partially entangled black boxes. The advantage of this method resides in its low shared entanglement cost, which makes it possible to reduce the use of quantum resources (both entanglement and quantum communication) compared to existing DIRNG protocols. We also present a protocol for randomness generation based on an original estimation of the black-box correlations. Contrary to existing DIRNG methods, which summarize the accumulated measurement data into a single quantity (the violation of a single Bell inequality), our method exploits a complete, multidimensional description of the black-box correlations that allows it to certify more randomness from the same number of measurements. We illustrate our results on a numerical simulation of the protocol using partially entangled states.
La génération de suites de nombres aléatoires, c'est-à-dire de suites imprévisibles et dépourvues de toute structure, trouve de nombreuses applications dans le domaine des technologies de l'information. L'une des plus sensibles est la cryptographie, dont les pratiques modernes font en effet appel à des clés secrètes qui doivent précisément être imprévisibles du point de vue d'adversaires potentiels. Ce type d'application exige des générateurs d'aléa de haute sécurité.Cette thèse s'inscrit dans le cadre de l'approche indépendante des appareils des méthodes quantiques de génération de nombres aléatoires (en anglais, Device-Independent Random Number Generation ou DIRNG). Ces méthodes exploitent la nature fondamentalement imprévisible de la mesure des systèmes quantiques. En particulier, l'appellation "indépendante des appareils" implique que la sécurité de ces méthodes ne fait pas appel à un modèle théorique particulier de l'appareil lui-même, qui est traité comme une boîte noire. Cette approche se distingue donc de méthodes plus traditionnelles dont la sécurité repose sur un modèle théorique précis de l'appareil et peut donc être compromise par un dysfonctionnement matériel ou l'intervention d'un adversaire.Les contributions apportées sont les suivantes. Nous démontrons tout d'abord une famille de critères de "self-testing" robuste pour une classe de systèmes quantiques impliquant des paires de systèmes à deux niveaux (qubits) partiellement intriquées. Cette forme d'inférence particulièrement puissante permet de certifier que le contenu d'une boîte noire quantique est conforme à l'un de ces systèmes, sur base uniquement de propriétés statistiques de la boîte observables macroscopiquement.Ce résultat nous amène à introduire et à prouver la sécurité d'une méthode de génération d'aléa basée sur ces boîtes noires partiellement intriquées. L'intérêt de cette méthode réside dans son faible coût en intrication, qui permet de réduire l'usage de ressources quantiques (intrication ou communication quantique) par rapport aux méthodes de DIRNG existantes.Nous présentons par ailleurs une méthode de génération d'aléa basée sur une estimation statistique originale des corrélations des boîtes noires. Contrairement aux méthodes de DIRNG existantes, qui résument l'ensemble des mesures observées à une seule grandeur (la violation d'une inégalité de Bell unique), notre méthode exploite une description complète (et donc multidimensionnelle) des corrélations des boîtes noires qui lui permet de certifier une plus grande quantité d'aléa pour un même nombre de mesures. Nous illustrons ensuite cette méthode numériquement sur un système de qubits partiellement intriqués.
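As a small numerical illustration of certifying randomness from a Bell violation, the C sketch below computes the CHSH value from four correlators and applies the standard single-inequality min-entropy bound H_min >= 1 - log2(1 + sqrt(2 - S^2/4)) from the DIRNG literature; the correlator values are invented, and this one-Bell-expression certificate is exactly what the multidimensional approach of the thesis goes beyond.

/* Sketch: compute the CHSH value S from the four correlators of a Bell test
 * and turn it into a lower bound on min-entropy per run.  The correlator
 * values below are illustrative, not experimental data. */
#include <stdio.h>
#include <math.h>

int main(void) {
    /* E(x,y) = <a*b> for settings x,y with outcomes a,b in {-1,+1} */
    double E00 = 0.70, E01 = 0.68, E10 = 0.69, E11 = -0.71;   /* made-up values */

    double S = E00 + E01 + E10 - E11;                          /* CHSH expression */
    double hmin = 0.0;
    if (S > 2.0 && S <= 2.0 * sqrt(2.0))                       /* only a violation certifies randomness */
        hmin = 1.0 - log2(1.0 + sqrt(2.0 - S * S / 4.0));

    printf("CHSH value S          : %.3f (classical limit 2, quantum limit %.3f)\n",
           S, 2.0 * sqrt(2.0));
    printf("certified min-entropy : %.3f bits per run\n", hmin);
    return 0;
}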
Doctorat en Sciences
info:eu-repo/semantics/nonPublished
APA, Harvard, Vancouver, ISO, and other styles
32

Stewart, Robert Grisham. "A Statistical Evaluation of Algorithms for Independently Seeding Pseudo-Random Number Generators of Type Multiplicative Congruential (Lehmer-Class)." Digital Commons @ East Tennessee State University, 2007. https://dc.etsu.edu/etd/2049.

Full text
Abstract:
To be effective, a linear congruential random number generator (LCG) should produce values that are (a) uniformly distributed on the unit interval (0,1) excluding endpoints and (b) substantially free of serial correlation. It has been found that many statistical methods produce inflated Type I error rates for correlated observations. Theoretically, independently seeding an LCG under the following conditions attenuates serial correlation: (a) simple random sampling of seeds, (b) non-replicate streams, (c) non-overlapping streams, and (d) non-adjoining streams. Accordingly, 4 algorithms (each satisfying at least 1 condition) were developed: (a) zero-leap, (b) fixed-leap, (c) scaled random-leap, and (d) unscaled random-leap. Note that the latter satisfied all 4 independent seeding conditions. To assess serial correlation, univariate and multivariate simulations were conducted at 3 equally spaced intervals for each algorithm (N=24) and measured using 3 randomness tests: (a) the serial correlation test, (b) the runs up test, and (c) the white noise test. A one-way balanced multivariate analysis of variance (MANOVA) was used to test 4 hypotheses: (a) omnibus, (b) contrast of unscaled vs. others, (c) contrast of scaled vs. others, and (d) contrast of fixed vs. others. The MANOVA assumptions of independence, normality, and homogeneity were satisfied. In sum, the seeding algorithms did not differ significantly from each other (omnibus hypothesis). For the contrast hypotheses, only the fixed-leap algorithm differed significantly from all other algorithms. Surprisingly, the scaled random-leap offered the least difference among the algorithms (theoretically this algorithm should have produced the second largest difference). Although not fully supported by the research design used in this study, it is thought that the unscaled random-leap algorithm is the best choice for independently seeding the multiplicative congruential random number generator. Accordingly, suggestions for further research are proposed.
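To make the leap-based seeding concrete, the following C sketch seeds a second stream of a Lehmer-class multiplicative congruential generator k steps ahead by modular exponentiation of the multiplier, then verifies the result by brute-force stepping; the Park-Miller constants and the leap length are illustrative, not the parameters studied in the thesis.

/* Sketch of "leaping" a Lehmer-class multiplicative congruential generator:
 * a stream seeded k steps ahead is obtained by multiplying the seed by
 * a^k mod m, so non-overlapping streams can be assigned without generating
 * the intermediate values. */
#include <stdio.h>
#include <stdint.h>

#define A 16807ULL
#define M 2147483647ULL          /* 2^31 - 1 */

static uint64_t powmod(uint64_t b, uint64_t e, uint64_t m) {
    uint64_t r = 1; b %= m;
    while (e) { if (e & 1) r = r * b % m; b = b * b % m; e >>= 1; }
    return r;
}

static uint64_t lehmer_next(uint64_t x) { return A * x % M; }

int main(void) {
    uint64_t seed = 123456789;
    uint64_t leap = 1000000;                       /* fixed leap between streams */

    /* Seed for the next stream obtained directly via modular exponentiation. */
    uint64_t seed_leap = seed * powmod(A, leap, M) % M;

    /* Check against stepping the generator one value at a time. */
    uint64_t x = seed;
    for (uint64_t i = 0; i < leap; i++) x = lehmer_next(x);

    printf("leap-ahead seed: %llu\n", (unsigned long long)seed_leap);
    printf("stepped seed   : %llu  (%s)\n", (unsigned long long)x,
           x == seed_leap ? "match" : "mismatch");
    return 0;
}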
APA, Harvard, Vancouver, ISO, and other styles
33

Leone, Nicolò. "A quantum entropy source based on Single Photon Entanglement." Doctoral thesis, Università degli studi di Trento, 2022. https://hdl.handle.net/11572/339572.

Full text
Abstract:
In this thesis, I report on how to use Single Photon Entanglement for generating certified quantum random numbers. Single Photon Entanglement is a particular type of entanglement which involves non-contextual correlations between two degrees of freedom of a single photon; here I consider momentum and polarization. The presence of the entanglement was validated for different attenuated coherent and incoherent light sources by evaluating the Bell inequality, a well-known entanglement witness. Different non-idealities in the evaluation of the inequality are discussed and addressed both theoretically and experimentally. I then discuss how to use Single Photon Entanglement for generating certified quantum random numbers with a semi-device-independent protocol. The protocol is based on a partial characterization of the experimental setup and on the violation of the Bell inequality. An analysis of the non-idealities of the devices employed in the experimental setup is also presented. In the last part of the thesis, the integrated photonic version of the previously introduced experiments is discussed: first, I present how to generate single-photon entangled states exploiting different degrees of freedom with respect to the bulk experiment; second, I discuss how to perform an integrated test of the Bell inequality.
APA, Harvard, Vancouver, ISO, and other styles
34

Bakiri, Mohammed. "Hardware implementation of a pseudo random number generator based on chaotic iteration." Thesis, Bourgogne Franche-Comté, 2018. http://www.theses.fr/2018UBFCD014/document.

Full text
Abstract:
La sécurité et la cryptographie sont des éléments clés pour les dispositifs soumis à des contraintes comme l’IOT, Carte à Puce, Systèm Embarqué, etc. Leur implémentation matérielle constitue un défi en termes de limitation en ressources physiques, vitesse de fonctionnement, capacité de mémoire, etc. Dans ce contexte, comme la plupart des protocoles s’appuient sur la sécurité d’un bon générateur de nombres aléatoires, considéré comme un élément indispensable dans le noyau de sécurité. Par conséquent, le présent travail propose des nouveaux générateurs pseudo-aléatoires basés sur des itérations chaotiques, et conçus pour être déployés sur des supports matériels, à savoir sur du FPGA ou du ASIC. Ces implémentations matérielles peuvent être décrites comme des post-traitements sur des générateurs existants. Elles transforment donc une suite de nombres non-uniformes en une autre suite de nombres uniformes. La dépendance entre l’entrée et la sortie a été prouvée chaotique selon les définitions mathématiques du chaos fournies notamment par Devaney et Li-Yorke. Suite à cela, nous effectuant tout d’abord un état de l’art complet sur les mises en œuvre matérielles et physiques des générateurs de nombres pseudo-aléatoires (PRNG, pour pseudorandom number generators). Nous proposons ensuite de nouveaux générateurs à base d’itérations chaotiques (IC) qui seront testés sur notre plate-forme matérielle. L’idée de départ était de partir du n-cube (ou, de manière équivalente, de la négation vectorielle dans les IC), puis d’enlever un cycle Hamiltonien suffisamment équilibré pour produire de nouvelles fonctions à itérer, à laquelle s’ajoute une permutation en sortie. Les méthodes préconisées pour trouver de bonnes fonctions serons détaillées, et le tout sera implanté sur notre plate-forme FPGA. Les générateurs obtenus disposent généralement d’un meilleur profil statistique que leur entrée, tout en fonctionnant à une grande vitesse. Finalement, nous les implémenterons sur de nombreux supports matériels (65-nm ASIC circuit and Zynq FPGA platform)
Security and cryptography are key elements in constrained devices such as IoT nodes, smart cards, embedded systems, etc. Their hardware implementations represent a challenge in terms of limited physical resources, operating speed, memory capacity, and so on. In this context, most protocols rely on the security of a good random number generator, considered an indispensable element of a lightweight security core. This work therefore proposes new pseudo-random generators based on chaotic iterations, designed to be deployed on hardware, namely FPGAs or ASICs. These hardware implementations can be described as post-processing on existing generators: they transform a sequence of non-uniform numbers into another sequence of uniform numbers. The dependency between input and output has been proven chaotic, according notably to the mathematical definitions of chaos provided by Devaney and Li-Yorke. Following that, we first elaborate a complete state of the art of hardware and physical implementations of pseudo-random number generators (PRNGs). We then propose new generators based on chaotic iterations (CIs), which are tested on our hardware platform. The initial idea was to start from the n-cube (or, equivalently, the vectorial negation in CIs), then remove a Hamiltonian cycle balanced enough to produce new functions to be iterated, to which a permutation on the output is added. The methods recommended for finding good functions are detailed, and the whole is implemented on our FPGA platform. The resulting generators generally have a better statistical profile than their input, while operating at high speed. Finally, we implement them on several hardware targets (a 65-nm ASIC circuit and a Zynq FPGA platform)
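A much-simplified C sketch of chaotic-iterations post-processing, assuming a small state and xorshift stand-ins for the input generators; the thesis's Hamiltonian-cycle-based iteration functions and output permutation are not reproduced here.

/* Simplified sketch of chaotic-iterations post-processing: an N-bit internal
 * state is iterated by negating, at each step, one coordinate chosen by an
 * input generator (the "strategy"), and the state is emitted after a variable
 * number of iterations.  xorshift stands in for the (possibly non-uniform)
 * input generators. */
#include <stdio.h>
#include <stdint.h>

static uint32_t xs_state = 2463534242u;
static uint32_t xorshift32(void) {               /* stand-in input generator */
    xs_state ^= xs_state << 13;
    xs_state ^= xs_state >> 17;
    xs_state ^= xs_state << 5;
    return xs_state;
}

static uint16_t ci_state = 0xACE1;               /* 16-bit chaotic-iterations state */

static uint16_t ci_next(void) {
    int iters = 4 + (xorshift32() & 7);          /* variable number of iterations */
    for (int i = 0; i < iters; i++) {
        int coord = xorshift32() & 15;           /* strategy: coordinate to update */
        ci_state ^= (uint16_t)(1u << coord);     /* vectorial negation on that coordinate */
    }
    return ci_state;
}

int main(void) {
    for (int i = 0; i < 8; i++)
        printf("%04x ", ci_next());
    printf("\n");
    return 0;
}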
APA, Harvard, Vancouver, ISO, and other styles
35

Greiner, Johannes N. [Verfasser], and Jörg [Akademischer Betreuer] Wrachtrup. "Algorithms and resources for quantum technology, sensing and random number generation / Johannes N. Greiner ; Betreuer: Jörg Wrachtrup." Stuttgart : Universitätsbibliothek der Universität Stuttgart, 2020. http://d-nb.info/1233681265/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
36

Julis, Guenaëlle de. "Analyse d'accumulateurs d'entropie pour les générateurs aléatoires cryptographiques." Thesis, Grenoble, 2014. http://www.theses.fr/2014GRENM075.

Full text
Abstract:
En cryptographie, l'utilisation de nombres aléatoires est fréquente (graine, token, ...) et une mauvaise génération d'aléa peut compromettre toute la sécurité d'un protocole, comme en témoigne régulièrement l'actualité. Les générateurs de nombres aléatoires à usage cryptographique sont des composants formés de trois modules : la source brute qui produit de l'aléa (un algorithme ou un phénomène physique), un retraitement pour corriger les défauts de la source, et un retraitement cryptographique pour obtenir l'aléa final. Cette thèse se focalise sur l'analyse des générateurs issus d'une source physique, en vue de dégager des retraitements adaptés à leurs propriétés et résistants à des perturbations de leur environnement d'utilisation. La complexité des dispositifs entravant souvent la formulation explicite d'un modèle stochastique prouvé, leur évaluation repose principalement sur une analyse statistique. Or, les tests statistiques, principale méthode recommandée par les institutions gouvernementales (ANSSI, BSI, NIST) pour certifier ces composants, peuvent détecter des anomalies mais ne permettent pas de les identifier, et de les caractériser. Les travaux de cette thèse structurent la modélisation d'une source d'aléa, vue comme une suite de variables aléatoires, affinent les tests statistiques, et ajoutent une analyse temporelle pour détecter et expliciter ses anomalies au niveau global ou local. Les résultats ont été implantés dans une librairie composée d'un simulateur de perturbations, des outils statistiques et temporels obtenus, des batteries de tests recommandées (FIPS, AIS31, Test U01, SP800), et de retraitements appropriés à certaines anomalies. La structure mise en place a permis d'extraire des familles d'anomalies de motifs dont les propriétés rendent certains tests incapables de distinguer la source anormale d'une source idéalement aléatoire. L'analyse des faiblesses inhérentes aux méthodes statistiques a montré que l'interprétation d'un test par intervalle de rejet ou taux de réussite n'est pas adapté à la détection de certaines fautes de transition. Elle a aussi permis d'étudier les méthodes d'estimations d'entropie, notamment les estimateurs proposés dans la norme SP800-90. Par ailleurs, les paramètres de spécifications de certains générateurs, dont un déduit du standard de chiffrement AES, se sont avérés distinguables grâce aux statistiques de test. Les outils temporels développés évaluent la structure des anomalies, leur évolution au cours du temps et analysent les motifs déviants au voisinage d'un motif donné. Cela a permis d'une part d'appliquer les tests statistiques avec des paramètres pertinents, et d'autre part de présenter des retaitements dont la validité repose sur la structure des anomalies et non sur leur amplitude
While random numbers are frequently used in cryptography (seeds, tokens, ...), the news regularly shows how bad randomness generation can compromise the whole security of a protocol. Random number generators for cryptography are components with three stages: a source (an algorithm or a physical phenomenon) produces raw numbers, which are then post-processed twice to fix anomalies. This thesis focuses on the analysis of physical random bit generators in order to derive post-processing adapted to the anomalies of the source. As the design of a physical random bit generator is complex, its evaluation is mainly a statistical analysis with hypothesis testing. However, the current standards (AIS31, FIPS 140-2, TestU01, SP 800) cannot provide information to characterize anomalies. Thus, this thesis adjusts several tests and adds a time-domain analysis to identify global and local anomalies and make them explicit. A C library was developed, providing an anomaly simulator and tools to apply the statistical and time-domain analysis results to random bit generators
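As an example of the statistical testing this abstract refers to, here is a C sketch of a monobit (frequency) test in the spirit of the FIPS/SP 800-22 batteries; the stream comes from rand() purely for demonstration, and the 1% significance level is a common convention, not a claim about the thesis's test suite.

/* Sketch of the monobit (frequency) test applied to a bit stream: the
 * normalized excess of ones is converted to a p-value with the complementary
 * error function. */
#include <stdio.h>
#include <stdlib.h>
#include <math.h>

#define NBITS 20000

int main(void) {
    srand(7);
    long sum = 0;                                 /* +1 for a one, -1 for a zero */
    for (int i = 0; i < NBITS; i++)
        sum += (rand() & 1) ? 1 : -1;

    double s_obs   = fabs((double)sum) / sqrt((double)NBITS);
    double p_value = erfc(s_obs / sqrt(2.0));

    printf("ones excess: %+ld over %d bits\n", sum, NBITS);
    printf("p-value    : %.4f (%s at the 1%% level)\n",
           p_value, p_value >= 0.01 ? "pass" : "fail");
    return 0;
}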
APA, Harvard, Vancouver, ISO, and other styles
37

Bessac, Julie. "Sur la construction de générateurs aléatoires de conditions de vent au large de la Bretagne." Thesis, Rennes 1, 2014. http://www.theses.fr/2014REN1S067/document.

Full text
Abstract:
Mon travail porte sur la construction de générateurs aléatoires de conditions de vent en Bretagne. Ces modèles permettent de simuler artificiellement des conditions météorologiques réalistes et sont couramment utilisés pour la gestion des risques liés aux aléas climatiques. Ils sont construits sur la base de données historiques dans le but de produire des simulations cohérentes avec le climat actuel mais peuvent intégrer des scénarios de changement climatique. Les séquences simulées permettent de pallier le manque de données réelles et sont utilisées en entrée de modèles économiques ou écologiques
This work is aimed at constructing stochastic weather generators. These models make it possible to simulate artificial weather data whose statistical properties are consistent with observed meteorology and climate. Outputs of these models are generally used in impact studies in agriculture or ecology
APA, Harvard, Vancouver, ISO, and other styles
38

Beckwith, Luke Parkhurst. "An Investigation of Methods to Improve Area and Performance of Hardware Implementations of a Lattice Based Cryptosystem." Thesis, Virginia Tech, 2020. http://hdl.handle.net/10919/100798.

Full text
Abstract:
With continuing research into quantum computing, current public key cryptographic algorithms such as RSA and ECC will become insecure. These algorithms are based on the difficulty of integer factorization or discrete logarithm problems, which are difficult to solve on classical computers but become easy with quantum computers. Because of this threat, government and industry are investigating new public key standards, based on mathematical assumptions that remain secure under quantum computing. This paper investigates methods of improving the area and performance of one of the proposed algorithms for key exchanges, "NewHope." We describe a pipelined FPGA implementation of NewHope512cpa which dramatically increases the throughput for a similar design area. Our pipelined encryption implementation achieves 652.2 Mbps and a 0.088 Mbps/LUT throughput-to-area (TPA) ratio, which are the best known results to date, and achieves an energy efficiency of 0.94 nJ/bit. This represents TPA and energy efficiency improvements of 10.05× and 8.58×, respectively, over a non-pipelined approach. Additionally, we investigate replacing the large SHAKE XOF (hash) function with a lightweight Trivium based PRNG, which reduces the area by 32% and improves energy efficiency by 30% for the pipelined encryption implementation, and which could be considered for future cipher specifications.
Master of Science
Cryptography is prevalent in almost every aspect of our lives. It is used to protect communication, banking information, and online transactions. Current cryptographic protections are built specifically upon public key encryption, which allows two people who have never communicated before to setup a secure communication channel. However, due to the nature of current cryptographic algorithms, the development of quantum computers will make it possible to break the algorithms that secure our communications. Because of this threat, new algorithms based on principles that stand up to quantum computing are being investigated to find a suitable alternative to secure our systems. These algorithms will need to be efficient in order to keep up with the demands of the ever growing internet. This paper investigates four hardware implementations of a proposed quantum-secure algorithm to explore ways to make designs more efficient. The improvements are valuable for high throughput applications, such as a server which must handle a large number of connections at once.
APA, Harvard, Vancouver, ISO, and other styles
39

MORETTI, RICCARDO. "Digital Nonlinear Oscillators: A Novel Class of Circuits for the Design of Entropy Sources in Programmable Logic Devices." Doctoral thesis, Università di Siena, 2021. http://hdl.handle.net/11365/1144376.

Full text
Abstract:
In recent years, cybersecurity is gaining more and more importance. Cryptography is used in numerous applications, such as authentication and encryption of data in communications, access control to restricted or protected areas, electronic payments. It is safe to assume that the presence of cryptographic systems in future technologies will become increasingly pervasive, leading to a greater demand for energy efficiency, hardware reliability, integration, portability, and security. However, this pervasiveness introduces new challenges: the implementation of conventional cryptographic standards approved by NIST requires the achievement of performance in terms of timing, chip area, power and resource consumption that are not compatible with reduced complexity hardware devices, such as IoT systems. In response to this limitation, lightweight cryptography comes into play - a branch of cryptography that provides tailor-made solutions for resource-limited devices. One of the fundamental classes of cryptographic hardware primitives is represented by Random Number Generators (RNGs), that is, systems that provide sequences of integers that are supposed to be unpredictable. The circuits and systems that implement RNGs can be divided into two categories, namely Pseudo Random Number Generators (PRNGs) and True Random Number Generators (TRNGs). PRNGs are deterministic and possibly periodic finite state machines, capable of generating sequences that appear to be random. In other words, a PRNG is a device that generates and repeats a finite random sequence, saved in memory, or generated by calculation. A TRNG, on the other hand, is a device that generates random numbers based on real stochastic physical processes. Typically, a hardware TRNG consists of a mixed-signal circuit that is classified according to the stochastic process on which it is based. Specifically, the most used sources of randomness are chaotic circuits, high jitter oscillators, circuits that measure other stochastic processes. A chaotic circuit is an analog or mixed-signal circuit in which currents and voltages vary over time based on certain mathematical properties. The evolution over time of these currents and voltages can be interpreted as the evolution of the state of a chaotic nonlinear dynamical system. Jitter noise can instead be defined as the deviation of the output signal of an oscillator from its true periodicity, which causes uncertainty in its low-high and high-low transition times. Other possible stochastic processes that a TRNG can use may involve radioactive decay, photon detection, or electronic noise in semiconductor devices. TRNG proposals presented in the literature are typically designed in the form of Application Specific Integrated Circuits (ASICs). On the other hand, in recent years more and more researchers are exploring the possibility of designing TRNGs in Programmable Logic Devices (PLDs). A PLD offers, compared to an ASIC, clear advantages in terms of cost and versatility. At the same time, however, there is currently a widespread lack of trust in these PLD-based architectures, particularly due to strong cryptographic weaknesses found in Ring Oscillator-based solutions. The goal of this thesis is to show how this mistrust does not depend on poor performance in cryptographic terms of solutions for the generation of random numbers based on programmable digital technologies, but rather on a still immature approach in the study of TRNG architectures designed on PLDs. 
In this thesis a new class of nonlinear circuits based on digital hardware, identified as Digital Nonlinear Oscillators (DNOs), is introduced; these circuits can be used as entropy sources for TRNGs implemented in PLDs. In Chapter 2 this novel class of circuits that can be used to design entropy sources for True Random Number Generation is introduced. DNOs constitute nonlinear dynamical systems capable of supporting complex dynamics in the time-continuous domain, although they are based on purely digital hardware. By virtue of this characteristic, these circuits are suitable for implementation on Programmable Logic Devices. By focusing the analysis on Digital Nonlinear Oscillators implemented in FPGAs, a preliminary comparison is proposed between three different circuit topologies belonging to the introduced class, to demonstrate how circuits of this type can have different characteristics depending on their dynamical behavior and hardware implementation. In Chapter 3 a methodology for the analysis and design of Digital Nonlinear Oscillators is formalized, based on the evaluation of their electronic aspects, their dynamical behavior, and the information they can generate. The presented methodology makes use of different tools, such as figures of merit, simplified dynamical models, advanced numerical simulations and experimental tests carried out through implementation on FPGA. Each of these tools is analyzed both in its theoretical premises and through explanatory examples. In Chapter 4 the analysis and design methodologies of Digital Nonlinear Oscillators formalized in Chapter 3 are used to describe the complete workflow followed for the design of a novel DNO topology. This DNO exhibits chaotic dynamical behavior and can achieve high performance in terms of generated entropy, while requiring only limited hardware complexity and supporting high sampling frequencies. By exploiting the simplified dynamical model, advanced numerical simulations in Cadence Virtuoso and the FPGA implementation, the presented topology is extensively analyzed both from a theoretical point of view (notable circuit sub-elements that make up the topology, bifurcation diagrams, internal periodicities) and from an experimental point of view (generated entropy, source autocorrelation, sensitivity to routing, application of standard statistical tests). In Chapter 5 an algorithm called Maximum Worst-Case Entropy Selector (MWCES) is presented, which aims to identify, within a set of entropy sources, the one that offers the best performance in terms of worst-case entropy, also known in the literature as "min-entropy". This algorithm is designed to be implemented in low-complexity digital architectures suitable for lightweight cryptographic applications, thus allowing online maximization of the performance of a random number generation system based on Digital Nonlinear Oscillators. This chapter presents the theoretical premises underlying the algorithm formulation, some notable examples of its generic application and, finally, considerations related to its hardware implementation in FPGA.
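A software sketch, under stated assumptions, of the selection criterion behind MWCES: each source's worst-case (min-)entropy is estimated as -log2 of its most probable symbol frequency, and the source with the largest estimate is selected. The real MWCES is a low-complexity hardware architecture; the two synthetic sources below are stand-ins with different biases.

/* Estimate worst-case entropy per 2-bit symbol for each source from observed
 * frequencies, H_min = -log2(max_i p_i), and pick the best source. */
#include <stdio.h>
#include <stdlib.h>
#include <math.h>

#define NSRC 2
#define NSAMPLES 100000

static double min_entropy(const long count[4], long total) {
    long max = 0;
    for (int i = 0; i < 4; i++) if (count[i] > max) max = count[i];
    return -log2((double)max / (double)total);
}

int main(void) {
    srand(11);
    long hist[NSRC][4] = {{0}};
    for (int n = 0; n < NSAMPLES; n++) {
        hist[0][rand() & 3]++;                              /* source 0: near-uniform */
        hist[1][(rand() % 100 < 70) ? 0 : (rand() & 3)]++;  /* source 1: biased toward symbol 0 */
    }

    int best = 0; double best_h = -1.0;
    for (int s = 0; s < NSRC; s++) {
        double h = min_entropy(hist[s], NSAMPLES);
        printf("source %d: H_min = %.3f bits/symbol (max 2)\n", s, h);
        if (h > best_h) { best_h = h; best = s; }
    }
    printf("selected source: %d\n", best);
    return 0;
}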
APA, Harvard, Vancouver, ISO, and other styles
40

Stanco, Andrea. "High Performances Systems for Applications of Quantum Information." Doctoral thesis, Università degli studi di Padova, 2018. http://hdl.handle.net/11577/3426352.

Full text
Abstract:
This thesis work concerns the realization of hardware and software systems for Quantum Random Number Generation (QRNG) and Quantum Key Distribution (QKD). Such systems were developed to guarantee full functionality for a broader investigation of these two cutting-edge applications of the Quantum Information field. The thesis describes in detail both the hardware and the software developed for FPGA-CPU boards, Time-to-Digital converter (TDC) devices and computers, along with specific QRNG and QKD applications and their results. Randy was the first FPGA-based QRNG device to be developed; it uses a light source attenuated to the single-photon level and one single-photon avalanche diode (SPAD). From the sampling of the SPAD electrical signal, the device produces random numbers through dedicated generation protocols and through the Peres unbiasing algorithm in order to maximize the output bit rate. Furthermore, the device can generate random numbers in real time. This feature is used for the timed setting of electro-optical components for extending Wheeler's delayed-choice experiment to space. The same techniques were applied to a second device, LinoSPAD, which combines an FPGA chip and a CMOS-SPAD array. Moreover, in this device a TDC improves the photon detection time accuracy. Together with dedicated post-processing based on the Zhou-Bruk algorithm, the TDC allowed a final bit rate equivalent to 300 Mbit/s to be reached. As far as QKD systems are concerned, within the collaboration among the University of Padova, the Italian Space Agency (ASI) with the Matera Laser Ranging Observatory (MLRO), and the Chinese Academy of Sciences (CAS), a TDC device management software was developed. The project's aim is to realize a quantum cryptographic key exchange between the Chinese satellite Micius and MLRO. The software was designed to manage the entire data acquisition synchronized with UTC time. Furthermore, additional software was designed to operate electro-optomechanical and electro-optical components; it is aimed at the time-variant compensation of the beam's angular changes along the optical path. Again within a collaboration between ASI and the University of Padova, a full free-space QKD system over tens of kilometers was developed. It required the design of various components. This work describes the QKD source along with the dedicated FPGA board design. This board generates the electrical pulses to control the qubit laser along with the electro-optic phase and intensity modulators.
Il presente lavoro di tesi tratta la realizzazione di sistemi hardware e software per Quantum Random Number Generation (QRNG) e Quantum Key Distribution (QKD). Tali sistemi sono stati sviluppati al fine di garantire una completa funzionalità per l’investigazione a tutto campo di queste due applicazioni che ad oggi risultano essere le più promettenti nell’ambito della Quantum Information. Vengono presentati in dettaglio sia l’hardware sia i software utilizzati che sono stati sviluppati per schede FPGA-CPU, dispositivi di Time-to-Digital converter (TDC) e computer. Vengono inoltre descritte le applicazioni specifiche di QRNG e QKD assieme ai risultati ottenuti. Randy è stato il primo dispositivo QRNG sviluppato su scheda FPGA e utilizza una sorgente luminosa attenuata a singolo fotone e un single-photon avalanche diode (SPAD). A partire dal campionamento del segnale elettrico dello SPAD, il dispositivo produce numeri randomici tramite protocolli di generazione appositi e tramite l’applicazione dell’algoritmo di unbiasing di Peres per massimizzare il bit rate. Il dispositivo permette inoltre di generare numeri randomici in tempo reale. Questa caratteristica viene utilizzata per la gestione temporizzata di componenti elettro-ottici per l’estensione allo spazio dell’esperimento a scelta ritardata di Wheeler’s. Le stesse tecniche sono state in seguito applicate ad un secondo dispositivo, LinoSPAD, che integra un chip FPGA e un array di CMOS-SPAD. Tale dispositivo prevede inoltre un TDC per aumentare la precisione temporale di dectection dei fotoni. Questa caratteristica, unita all’uso di una procedure di post-processing appositamente sviluppata e basata sull’algoritmo di Zhou-Bruk, ha permesso di raggiungere un bit rate finale pari a 300 Mbit/s. Per quanto riguarda i sistemi QKD, all’interno di un progetto di collaborazione tra l’Università di Padova, l’Agenzia Spaziale Italiana (ASI) insieme al Matera Laser Ranging Observatory (MLRO) e la Chinese Academy of Sciences (CAS) è stato sviluppato un software di gestione di un dispositivo TDC. Il progetto prevede la realizzazione di uno scambio di chiave crittografica quantistica tra il satellite cinese Micius e l’osservatorio di Matera. Il software è stato progettato per la gestione dell’intera acquisizione dati sincronizzata al tempo UTC. Inoltre è stato sviluppato anche un software per la gestione di componenti elettro-optomeccanici e elettro-ottici atti alla compensazione tempo variante delle variazioni angolari del fascio nel percorso ottico. Sempre all’interno di una collaborazione tra ASI e Università di Padova, è stato sviluppato un sistema completo di QKD free space per distanze nell’ordine di decine di chilometri. Lo sviluppo del sistema ha richiesto la progettazione di molteplici componenti. In questo lavoro viene descritta la parte della sorgente QKD e quindi della progettazione della scheda FPGA dedicata. Tale scheda ha il compito di generare gli impulsi elettrici per il controllo del laser per la produzione dei qubit e per il controllo dei modulatori di fase e di intensità elettro-ottici.
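As a rough illustration of the unbiasing step mentioned in the abstract, the C sketch below applies von Neumann's pairwise procedure, which is the base case that Peres' algorithm extends recursively; the biased software source is a stand-in for the SPAD click stream.

/* Von Neumann unbiasing sketch: input bits are taken in pairs, "01" emits 0,
 * "10" emits 1, and equal pairs are dropped, removing bias at the cost of
 * rate. */
#include <stdio.h>
#include <stdlib.h>

#define NPAIRS 20

static int biased_bit(void) {              /* demo source: P(1) = 0.75 */
    return (rand() % 4) != 0;
}

int main(void) {
    srand(3);
    printf("unbiased output: ");
    int emitted = 0;
    for (int i = 0; i < NPAIRS; i++) {
        int a = biased_bit(), b = biased_bit();
        if (a != b) {                      /* 01 -> 0, 10 -> 1; 00 and 11 are discarded */
            printf("%d", a);
            emitted++;
        }
    }
    printf("\n%d bits kept from %d input pairs\n", emitted, NPAIRS);
    return 0;
}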
APA, Harvard, Vancouver, ISO, and other styles
41

Haddad, Patrick. "Caractérisation et modélisation de générateurs de nombres aléatoires dans les circuits intégrés logiques." Thesis, Saint-Etienne, 2015. http://www.theses.fr/2015STET4008/document.

Full text
Abstract:
Les générateurs de nombres aléatoires sont des blocs destinés à produire des quantités numériques qui doivent être indépendantes et uniformément distribuées. Ces RNG sont utilisés dans des contextes sécuritaires où l'utilisation de nombres aléatoires est requise (génération de clefs cryptographiques, nonces des protocoles cryptographiques, marqueurs anti-rejeu, contre-mesures face aux attaques par canaux cachés) et où leur qualité est primordiale. Tous les composants électroniques ayant une fonction sécuritaire, comme par exemple les cartes à puces, incluent un ou plusieurs générateurs aléatoires (basés sur des principes physiques). En conséquence, le RNG est une brique centrale des applications sécuritaires et sa défaillance, totale ou partielle met donc en péril la fonctionnalité dans son ensemble. Ce travail de thèse porte sur l'étude des RNG physiques (PTRNG) et la modélisation de l'aléa à partir des caractérisations électroniques et mathématiques du circuit. Cette étude se place essentiellement dans le contexte de la norme AIS 31 du BSI* qui fait référence dans de nombreux pays européens. Cette norme est l‘une des rares qui impose des caractérisations sur les PTRNG, incluant notamment un modèle stochastique de ce dernier. Dans ce contexte, il est crucial de pouvoir valider la méthodologie d'évaluation proposée par ces normes et c'est sur ce point que j'ai focalisé mon travail de thèse.*Bundesamt für Sicherheit in der Informationstechnik, agence fédérale allemande chargée de la sécurité des technologies de l'information
Random number generators (RNGs) are primitives that produce independent and uniformly distributed digital values. RNGs are used in secure environments where the use of random numbers is required (generation of cryptographic keys, nonces in cryptographic protocols, padding values, countermeasures against side-channel attacks) and where the quality of the randomness is essential. All electronic components with a security function, such as smart cards, include one or more random generators (based on physical principles). Consequently, the RNG is an essential primitive for security applications. A flaw in the random number generation process directly impacts the security of the cryptographic system. This thesis focuses on the study of physical RNGs (PTRNGs), the modeling of their randomness and the electronic characterization of the circuit. This study is set in the context of the AIS-31 standard, which is published by the BSI* and followed by many European countries. This standard is one of the few that require a characterization of the PTRNG and a stochastic model. In this context, it is crucial to validate the evaluation methodology proposed by these standards, and this is what I focused on during my thesis.*Bundesamt für Sicherheit in der Informationstechnik, the German federal agency responsible for the security of information technology
APA, Harvard, Vancouver, ISO, and other styles
42

Nemanja, Gazivoda. "Нова метода за повећање ефективне резолуције стохастичких мерних инструмената високих перформанси." Phd thesis, Univerzitet u Novom Sadu, Fakultet tehničkih nauka u Novom Sadu, 2019. https://www.cris.uns.ac.rs/record.jsf?recordId=110757&source=NDLTD&language=en.

Full text
Abstract:
Дисертација истражује утицај примене дитерског сигнала (дискретног аналогног униформног шума) генерисаног новом методом на повећање ефективне резолуције мерних инструмената базираних на стохастичкој дигиталној мерној методи. У дисертацији је дат преглед досадашњих решења базираних на стохастичкој дигиталној мерној методи у циљу сагледавања потребе и оправданости истраживања. Предложено решење представља комбинацију псеудослучајног и истински случајног генератора и као такво задржава најбоље особине из обе области. Дaт je прeдлoг нoве методе гeнeрисaњa шумa унифoрмнe рaспoдeлe aмплитудa. Умeстo уoбичajeнoг нaчинa гeнeрисaњa кoришћeњeм гeнeрaтoрa псeудoслучajних брojeвa и Д/А кoнвeртoрa, oвдe сe прeдлaжe гeнeрисaњe зaснoвaнo нa нeунифoрмнoм одабирању тeстeрaстoг или троугаоног нaпoнa. Oсим уштeдe збoг нeкoришћeњa Д/А кoнвeртoрa, дoбит je и мoгућнoст гeнeрисaњa нaпoнa из кoнтинуалнoг, умeстo дискрeтнoг скупa aмплитудa. Вeћинa дaнaшњих хaрдвeрских гeнeрaтoрa псeудoслучajнoг нaпoнa, сe бaзирa нa употреби микрoкoнтрoлeра и Д/A кoнвeртoра, пa je нa тaj нaчин рeзoлуциja гeнeрисaњa псeудoслучajнoг нaпoнa oгрaничeнa резолуцијом Д/A кoнвeрторa. Одабирањем тестерастог или троугаоног напона предложеном методом се остварује готово неограничена резолуција. Употреба овако генерисаног дитерског сигнала доводи до повећања ефективне резолуције код стохастичких мерних инстурмената. Симулацијом је одређена оптимална структура генератора на основу предложене методе. Експериментална мерења су изведена помоћу развијеног прототипа хардверског генератора.
Disertacija istražuje uticaj primene diterskog signala (diskretnog analognog uniformnog šuma) generisanog novom metodom na povećanje efektivne rezolucije mernih instrumenata baziranih na stohastičkoj digitalnoj mernoj metodi. U disertaciji je dat pregled dosadašnjih rešenja baziranih na stohastičkoj digitalnoj mernoj metodi u cilju sagledavanja potrebe i opravdanosti istraživanja. Predloženo rešenje predstavlja kombinaciju pseudoslučajnog i istinski slučajnog generatora i kao takvo zadržava najbolje osobine iz obe oblasti. Dat je predlog nove metode generisanja šuma uniformne raspodele amplituda. Umesto uobičajenog načina generisanja korišćenjem generatora pseudoslučajnih brojeva i D/A konvertora, ovde se predlaže generisanje zasnovano na neuniformnom odabiranju testerastog ili trougaonog napona. Osim uštede zbog nekorišćenja D/A konvertora, dobit je i mogućnost generisanja napona iz kontinualnog, umesto diskretnog skupa amplituda. Većina današnjih hardverskih generatora pseudoslučajnog napona, se bazira na upotrebi mikrokontrolera i D/A konvertora, pa je na taj način rezolucija generisanja pseudoslučajnog napona ograničena rezolucijom D/A konvertora. Odabiranjem testerastog ili trougaonog napona predloženom metodom se ostvaruje gotovo neograničena rezolucija. Upotreba ovako generisanog diterskog signala dovodi do povećanja efektivne rezolucije kod stohastičkih mernih insturmenata. Simulacijom je određena optimalna struktura generatora na osnovu predložene metode. Eksperimentalna merenja su izvedena pomoću razvijenog prototipa hardverskog generatora.
The dissertation investigates the impact of applying a dither signal (discrete analogue uniform noise) generated by the new method to increase the effective resolution of measurement instruments based on the stochastic digital measurement method. The dissertation provides an overview of existing solutions based on the stochastic digital measurement method in order to establish the need for and justification of the research. The proposed solution represents a combination of a pseudorandom and a truly random generator and as such retains the best features of both areas. A new way of generating noise with a uniform amplitude distribution is proposed. Instead of the usual approach of using a pseudorandom number generator and a D/A converter, generation based on nonuniform sampling of a sawtooth or triangle voltage is proposed. In addition to the savings from not using a D/A converter, the possibility of generating voltages from a continuous instead of a discrete amplitude set is also obtained. Most of today's hardware pseudorandom voltage generators are based on a microcontroller and a D/A converter, so the resolution of the pseudorandom voltage generation is limited by the resolution of the D/A converter. By sampling the sawtooth or triangle voltage with the proposed method, almost unlimited resolution is achieved. The use of the dither signal generated in this way leads to an increase in effective resolution in stochastic measuring instruments. Simulation determined the optimal structure of the generator based on the proposed method. Experimental measurements were made using a developed hardware prototype.
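To illustrate the generation principle described above, the following C sketch samples a normalized sawtooth at random (exponentially distributed) instants and histograms the amplitudes, which come out approximately uniform without any D/A converter; the frequencies and rates are invented for the demonstration.

/* Sketch: a sawtooth waveform sampled at irregular instants yields amplitudes
 * that are approximately uniform over a continuous range. */
#include <stdio.h>
#include <stdlib.h>
#include <math.h>

#define NSAMP 200000
#define NBINS 10

int main(void) {
    srand(5);
    double f_saw = 1.0e6;          /* sawtooth frequency: 1 MHz (illustrative) */
    double rate  = 37e3;           /* mean sampling rate: 37 kHz, incommensurate with f_saw */
    double t = 0.0;
    long bins[NBINS] = {0};

    for (int i = 0; i < NSAMP; i++) {
        double u = (rand() + 1.0) / ((double)RAND_MAX + 2.0);
        t += -log(u) / rate;                       /* random inter-sample interval */
        double amp = fmod(f_saw * t, 1.0);         /* normalized sawtooth amplitude in [0,1) */
        bins[(int)(amp * NBINS)]++;
    }
    printf("histogram of sampled amplitudes (expected ~%d per bin):\n", NSAMP / NBINS);
    for (int b = 0; b < NBINS; b++)
        printf("  [%.1f,%.1f): %ld\n", b / (double)NBINS, (b + 1) / (double)NBINS, bins[b]);
    return 0;
}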
APA, Harvard, Vancouver, ISO, and other styles
43

Soucarros, Mathilde. "Analyse des générateurs de nombres aléatoires dans des conditions anormales d'utilisation." Phd thesis, Université de Grenoble, 2012. http://tel.archives-ouvertes.fr/tel-00759976.

Full text
Abstract:
Random numbers have been used throughout history for games of chance, more recently to create secret codes, and today they are necessary for the execution of computer programs. Random number generators are now far removed from simple dice to be thrown and consist of electronic circuits or algorithms. This raises problems regarding the recognition of the random character of the generated numbers. Moreover, just as dice were once loaded to increase the chances of winning, it is possible today to influence the output of random number generators. The subject therefore remains topical, with recent and highly publicized examples: the PS3 game console, which generated a constant random number, and the distribution of redundant secret keys on the internet. This dissertation presents the study of several generators as well as various ways of disturbing them. It thus shows weaknesses inherent in their design and the possible consequences of their failure for security components. This work has furthermore highlighted the importance of the issues concerning the testing of random numbers as well as of post-processing that corrects biases in these numbers.
APA, Harvard, Vancouver, ISO, and other styles
44

Nieto-Silleras, Olmo. "Device-independent randomness generation from several Bell estimators." Doctoral thesis, Universite Libre de Bruxelles, 2018. http://hdl.handle.net/2013/ULB-DIPOT:oai:dipot.ulb.ac.be:2013/271365.

Full text
Abstract:
The device-independent (DI) framework is a novel approach to quantum information science which exploits the nonlocality of quantum physics to certify the correct functioning of a quantum information processing task without relying on any assumptions about the inner workings of the devices performing the task. This thesis focuses on the device-independent certification and generation of true randomness for cryptographic applications. The existence of such true randomness relies on a fundamental relation between the random character of quantum theory and its nonlocality, which arises in the context of Bell tests. Device-independent randomness generation (DIRG) and quantum key distribution (DIQKD) protocols usually evaluate the produced randomness (as measured by the conditional min-entropy) as a function of the violation of a given Bell inequality. However, the probabilities characterising the measurement outcomes of a Bell test are richer than the degree of violation of a single Bell inequality. In this work we show that a more accurate assessment of the randomness present in nonlocal correlations can be obtained if the value of several Bell expressions is simultaneously taken into account, or if the full set of probabilities characterising the behaviour of the device is considered. As a side result, we show that to every behaviour there corresponds an optimal Bell expression that allows one to certify the maximal amount of DI randomness present in the correlations. Based on these results, we introduce a family of protocols for DIRG secure against classical side information, relying on the estimation of an arbitrary number of Bell expressions, or even directly on the experimental frequencies of the measurement outcomes. The family of protocols we propose also allows for the evaluation of randomness from a subset of measurement settings, which can be advantageous when considering correlations for which some measurement settings result in more randomness than others. We provide numerical examples illustrating the advantage of this method for finite data, and show that asymptotically it results in an optimal generation of randomness from experimental data without having to assume beforehand that the devices violate a specific Bell inequality.
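To make concrete the single-estimator baseline that this work generalises, a commonly quoted device-independent bound (due to Pironio et al., Nature 2010, not specific to this thesis) certifies the min-entropy of an outcome pair from the CHSH value S alone:

    H_{\infty}(AB \mid E) \;\ge\; 1 - \log_2\!\left(1 + \sqrt{\,2 - S^{2}/4\,}\right), \qquad 2 < S \le 2\sqrt{2}.

The point of the thesis is that collapsing the full behaviour onto a single number such as S generally discards certifiable randomness; taking several Bell expressions, or the full set of outcome probabilities, into account can only tighten this kind of bound.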
Doctorate in Sciences
APA, Harvard, Vancouver, ISO, and other styles
45

Triestino, Michele. "La dynamique des difféomorphismes du cercle selon le point de vue de la mesure." Phd thesis, Ecole normale supérieure de lyon - ENS LYON, 2014. http://tel.archives-ouvertes.fr/tel-01065468.

Full text
Abstract:
My thesis work is organised in three distinct parts. In the first part I study the Malliavin-Shavguldize measures on the diffeomorphisms of the circle and of the interval. These are "Haar-like" measures for these infinite-dimensional groups: they were introduced some twenty years ago to enable a study of their representation theory. A first chapter is devoted to collecting the results available in the literature and presenting them in a more extended form, with particular attention to the quasi-invariance properties of these measures. I then study problems of a more dynamical nature: what dynamics should one expect of a diffeomorphism chosen uniformly with respect to a Malliavin-Shavguldize measure? I prove in particular that there is a strong presence of Morse-Smale-type diffeomorphisms. The next part comes from my first published work, obtained in collaboration with Andrés Navas. Inspired by a recent theorem of Avila and Kocsard on the uniqueness of distributions invariant under a smooth minimal diffeomorphism of the circle, we analyse the same problem in low regularity, with more geometric arguments. The last part consists of results recently obtained with Mikhail Khristoforov and Victor Kleptsyn. We address problems related to Liouville quantum gravity by studying self-similar spaces that arise as limits of finite graphs. We prove that it is possible to construct non-trivial random distances on these spaces that are compatible with the self-similar structure.
APA, Harvard, Vancouver, ISO, and other styles
46

Fujdiak, Radek. "Analýza a optimalizace datové komunikace pro telemetrické systémy v energetice." Doctoral thesis, Vysoké učení technické v Brně. Fakulta elektrotechniky a komunikačních technologií, 2017. http://www.nusl.cz/ntk/nusl-358408.

Full text
Abstract:
Telemetry systems, Optimisation, Sensor networks, Smart Grid, Internet of Things, Sensors, Information security, Cryptography, Cryptographic algorithms, Cryptosystem, Confidentiality, Integrity, Authentication, Data freshness, Non-repudiation.
APA, Harvard, Vancouver, ISO, and other styles
47

Xu, Feihu. "Practical Issues in Quantum Cryptography." PhD thesis, 2012. http://hdl.handle.net/1807/32641.

Full text
Abstract:
Quantum key distribution (QKD) can provide unconditional security based on the fundamental laws of quantum physics. Unfortunately, real-life implementations of a QKD system may contain overlooked imperfections and thus compromise the practical security of QKD. It is vital to explore these imperfections. In this thesis, I study two practical imperfections in QKD: i) Discovering a security loophole in a commercial QKD system: I perform a proof-of-principle experiment to demonstrate a technically feasible quantum attack on a commercial QKD system. The attack I utilize is called the phase-remapping attack. ii) Generating high-speed truly random numbers: I propose and experimentally demonstrate an ultrafast QRNG at a rate over 6 Gb/s, which is based on the quantum phase fluctuations of a laser. Moreover, I consider a potential adversary who has partial knowledge of the raw data and discuss how one can rigorously remove such partial knowledge with post-processing.
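As an illustration of the post-processing step mentioned at the end of the abstract, the sketch below implements seeded randomness extraction by Toeplitz hashing over GF(2), a standard choice for laser-phase-fluctuation QRNGs. The abstract does not state the exact extractor or parameters used; the 1024-bit block and 512-bit output below are arbitrary assumptions.

    import numpy as np

    rng = np.random.default_rng(1)

    def toeplitz_extract(raw_bits, seed_bits, m):
        # Multiply an n-bit raw string by an m x n binary Toeplitz matrix.
        # The matrix is defined by its first column (seed[:m]) and first row
        # (seed[m-1:]); by the leftover hash lemma, the output length m should
        # stay below the certified min-entropy of the raw block.
        n = len(raw_bits)
        assert len(seed_bits) == n + m - 1
        col, row = seed_bits[:m], seed_bits[m - 1:]
        out = np.zeros(m, dtype=np.uint8)
        for i in range(m):
            # Row i of the Toeplitz matrix: col[i], col[i-1], ..., col[0],
            # followed by row[1], ..., row[n-1-i].
            t_row = np.concatenate((col[i::-1], row[1:n - i]))
            out[i] = np.bitwise_xor.reduce(t_row & raw_bits)
        return out

    raw = rng.integers(0, 2, size=1024, dtype=np.uint8)             # raw digitised samples
    seed = rng.integers(0, 2, size=1024 + 512 - 1, dtype=np.uint8)  # public random seed
    key = toeplitz_extract(raw, seed, m=512)                        # keep 512 of 1024 bits
    print(key[:16])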
APA, Harvard, Vancouver, ISO, and other styles
48

Haw, Jing Yan. "Continuous Variable Optimisation of Quantum Randomness and Probabilistic Linear Amplification." PhD thesis, 2018. http://hdl.handle.net/1885/143588.

Full text
Abstract:
In the past decade, quantum communication protocols based on continuous variables (CV) have seen considerable development in both theoretical and experimental aspects. Nonetheless, challenges remain in both the practical security and the operating range of CV systems before such systems may be used extensively. In this thesis, we present the optimisation of experimental parameters for secure randomness generation and propose a non-deterministic approach to enhance amplification of CV quantum states. The first part of this thesis examines the security of quantum devices: in particular, we investigate quantum random number generators (QRNG) and quantum key distribution (QKD) schemes. In a realistic scenario, the output of a quantum random number generator is inevitably tainted by classical technical noise, which potentially compromises the security of such a device. To safeguard against this, we propose and experimentally demonstrate an approach that produces side-information independent randomness. We present a method for maximising such randomness contained in a number sequence generated from a given quantum-to-classical-noise ratio. The detected photocurrent in our experiment is shown to have a real-time random-number generation rate of 14 (Mbit/s)/MHz. Next, we study the one-sided device-independent (1sDI) quantum key distribution scheme in the context of continuous variables. By exploiting recently proven entropic uncertainty relations, one may bound the information leaked to an eavesdropper. We use such a bound to further derive the secret key rate, which depends only upon the conditional Shannon entropies accessible to Alice and Bob, the two honest communicating parties. We identify and experimentally demonstrate such a protocol, using only coherent states as the resource. We measure the correlations necessary for 1sDI key distribution up to an applied loss equivalent to 3.5 km of fibre transmission. The second part of this thesis concerns the improvement in the transmission of a quantum state. We study two approximate implementations of a probabilistic noiseless linear amplifier (NLA): a physical implementation that truncates the working space of the NLA, and a measurement-based implementation that realises the truncation by a bounded postselection filter. We do this by conducting a full analysis of the measurement-based NLA (MB-NLA), making explicit the relationship between its various operating parameters, such as amplification gain and the cut-off of the operating domain. We compare it with its physical counterpart in terms of the Husimi Q-distribution and their probability of success. We take our investigations further by combining a probabilistic NLA with an ideal deterministic linear amplifier (DLA). In particular, we show that when the NLA gain is strictly less than the DLA gain, this combination can be realised by integrating an MB-NLA in an optical DLA setup. This results in a hybrid device which we refer to as the heralded hybrid quantum amplifier. A quantum cloning machine based on this hybrid amplifier is constructed through an amplify-then-split method. We perform probabilistic cloning of arbitrary coherent states, and demonstrate the production of up to five clones, with the fidelity of each clone clearly exceeding the corresponding no-cloning limit.
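The following sketch illustrates, in simplified form, the kind of calculation behind "side-information independent randomness": the classical technical noise is treated as known to an adversary, so the extractable randomness per ADC sample is the worst-case min-entropy of the quantum (shot-noise) contribution alone, minimised over the admissible classical offsets. The function name and all numerical values are illustrative assumptions, not the thesis's parameters or its exact security analysis.

    import numpy as np
    from scipy.stats import norm

    def worst_case_min_entropy(sigma_q, e_max, adc_range, bits):
        # Min-entropy per sample of a Gaussian quantum signal (std sigma_q)
        # digitised by a uniform ADC spanning [-adc_range, +adc_range] with
        # 2**bits codes, minimised over classical offsets |e| <= e_max that an
        # adversary is assumed to know exactly.
        edges = np.linspace(-adc_range, adc_range, 2 ** bits + 1)

        def guessing_probability(offset):
            p_bins = (norm.cdf(edges[1:], loc=offset, scale=sigma_q)
                      - norm.cdf(edges[:-1], loc=offset, scale=sigma_q))
            # Out-of-range samples saturate into the two end codes.
            p_bins[0] += norm.cdf(edges[0], loc=offset, scale=sigma_q)
            p_bins[-1] += norm.sf(edges[-1], loc=offset, scale=sigma_q)
            return p_bins.max()

        # Scan the admissible classical offsets for the adversary's best case.
        p_guess = max(guessing_probability(e) for e in np.linspace(-e_max, e_max, 201))
        return -np.log2(p_guess)

    # Example: shot noise of unit variance, classical noise bounded by 3 units,
    # 8-bit ADC spanning +/- 5 units (all numbers made up).
    print(worst_case_min_entropy(sigma_q=1.0, e_max=3.0, adc_range=5.0, bits=8))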
APA, Harvard, Vancouver, ISO, and other styles
49

Jung, Deng-Wei, and 鍾鐙褘. "Appraising Various Random Number Generators." Thesis, 2009. http://ndltd.ncl.edu.tw/handle/01062541405655648666.

Full text
Abstract:
Master's thesis
Ming Chuan University
Master's Program, Department of Applied Statistics and Information Science
98
This thesis appraises various random number generators – Fast Multiple Recursive Generators, DX-k-2 generators, DX-k-3 generators, DX-k-4 generators, MDX-k-2 generators, AX-k-2 generators, AX-k-1 generators, AX-k-1A generators, ATNE generators, Fast Matrix Congruential Generators, and Efficient Matrix Congruential Generators – against Linear Congruential Generators. Using Maple 9.5 and the C language, it compares various important properties of the generators listed above, such as efficiency (including analysis of the generating process), period, two-dimensional lattice plots, and memory usage. It also applies TestU01's most stringent battery, Big Crush, with more than 150 statistical tests, to the generators listed above. The test results are reported and summarized.
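As a minimal illustration of the objects being compared (not the specific DX/MDX/AX constructions or parameters appraised in the thesis), the sketch below implements a classic linear congruential generator and a small order-3 multiple recursive generator, and builds the consecutive-pair points used for a two-dimensional lattice plot. The MRG coefficients are arbitrary illustrative values, not a vetted parameter set.

    import numpy as np

    def lcg(n, a=1103515245, c=12345, m=2**31, seed=1):
        # Classic linear congruential generator: x_{i+1} = (a*x_i + c) mod m.
        x, out = seed, np.empty(n)
        for i in range(n):
            x = (a * x + c) % m
            out[i] = x / m
        return out

    def mrg3(n, a=(1403580, 0, 810728), m=2**31 - 1, seed=(12345, 67890, 13579)):
        # Order-3 multiple recursive generator:
        # x_i = (a1*x_{i-1} + a2*x_{i-2} + a3*x_{i-3}) mod m
        # (illustrative coefficients only, not a recommended generator).
        s, out = list(seed), np.empty(n)
        for i in range(n):
            x = (a[0] * s[-1] + a[1] * s[-2] + a[2] * s[-3]) % m
            s = s[1:] + [x]
            out[i] = x / m
        return out

    u = lcg(5000)
    pairs = np.column_stack((u[:-1], u[1:]))   # points (u_i, u_{i+1}) for a 2-D lattice plot
    print(pairs[:3])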
APA, Harvard, Vancouver, ISO, and other styles
50

"Progresses In Parallel Random Number Generators." Master's thesis, METU, 2005. http://etd.lib.metu.edu.tr/upload/12606651/index.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
