Dissertations / Theses on the topic 'Lattice naturale'
Consult the top 37 dissertations / theses for your research on the topic 'Lattice naturale.'
MASCIA, CLAUDIA. "Studio di componenti del lattice di Euphorbia characias." Doctoral thesis, Università degli Studi di Cagliari, 2015. http://hdl.handle.net/11584/266809.
TAGLIARO, IRENE. "NOVEL COLLOIDAL APPROACH TO PREPARE HIGHLY-LOADED SILICA-BASED ELASTOMERIC NANOCOMPOSITES." Doctoral thesis, Università degli Studi di Milano-Bicocca, 2019. http://hdl.handle.net/10281/241175.
Sustainability has become a field of great interest to industry worldwide. For the scientific community, the challenge lies in identifying green synthetic approaches and new alternatives to petroleum-based materials. In the case of the tyre industry, the challenge is to identify design strategies and alternatives that reduce the environmental impact throughout the life cycle of tyres, by means of both environmentally friendly materials and innovative products with reduced energy consumption and CO2 emissions. In this context, this PhD thesis focuses on the preparation of eco-friendly silica-based nanocomposites using a colloidal approach to increase the dispersion of hydrophilic fillers, in line with the new sustainability requirements of EU policies. The colloidal approach aims at compounding nanocomposites with hydrophilic fillers, whose efficient dispersion through traditional mixing remains a challenging issue due to their poor compatibility with the organic matrix. This technique increases filler dispersion without any expensive surface modification and eliminates the volatile component released during mixing, producing significant benefits for the environment and workers. Two different colloidal approaches were applied: i) the latex compounding technique (LCT) and ii) in situ emulsion polymerization, to prepare highly-loaded nanocomposite rubber materials containing silica-based fillers, silica and sepiolite (Sep) clay, the latter considered a promising filler candidate for polymer strengthening due to its fibrous structure and high particle aspect ratio (AR). The concentration, charge and shape of the silica-based nanofillers were studied as relevant parameters for the stabilization and destabilization of natural and synthetic polyisoprene latexes.
An effective LCT procedure was established to produce eco-friendly composites, namely masterbatches (MBs), by incorporating silica or Sep into natural rubber latex (i.e. an emulsion in water of cis-1,4-polyisoprene) through the flocculation (i.e. aggregation resulting from the bridging of polymer particles) of the silica-based nanofillers/rubber mixed aqueous system. LCT was shown to favour a homogeneous dispersion of hydrophilic Sep fibers in the rubber matrix. The main physicochemical parameters controlling aggregation processes in the aqueous medium, i.e. pH, ζ-potential and concentration, as well as the morphological features of the final Sep-natural rubber MBs, were comprehensively investigated, helping to elucidate the Sep-NR interactions and to propose a flocculation mechanism based on electrostatic and depletion attraction forces, remarkably connected both to the high content (50 wt.%) and to the peculiar anisotropy of the Sep fibers. Furthermore, the MBs with high filler loadings were used to produce environmentally friendly composites by combining LCT and melt mixing. This combined approach takes advantage of the good filler distribution and prevents dust from floating in the air during processing. In situ Pickering polymerization was considered as an alternative colloidal approach to produce eco-friendly nanocomposites. Polyisoprene/silica-based structured particles were synthesized on the basis of the stabilizing effect of inorganic fillers, which act like surfactants, lowering the interfacial tension and stabilizing the emulsion. On the basis of our results, we suggested a possible mechanism for emulsion polymerizations stabilized by solid particles. In conclusion, the colloidal approach, based on both LCT and in situ Pickering emulsion polymerization, can be considered a green, simple and effective method suitable for high-performance technological applications.
The outcomes indicate the suitability of the adopted strategies as sustainable procedures for the production of highly-loaded silica-based rubber nanocomposites.
Wegener, Claudia B. "Natural dualities for varieties generated by lattice-structured algebras." Thesis, University of Oxford, 1999. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.302397.
Hashemi, Mohammad. "Lattice Boltzmann Simulation of Natural Convection During Dendritic Growth." University of Akron / OhioLINK, 2016. http://rave.ohiolink.edu/etdc/view?acc_num=akron1459444594.
Hardter, Eric. "Modifying the Natural State of Nucleic Acids: Three-Dimensional DNA Lattices and Extended Deoxyribonucleosides." Thesis, Boston College, 2014. http://hdl.handle.net/2345/bc-ir:103610.
By virtue of encoding and transferring hereditary information, nucleic acids effectively represent the blueprint for life as we know it. Given the biological relevance of this class of polymers, it comes as no surprise that scientists are constantly striving to reach a greater understanding of the innumerable genetic corridors contained within the human genome. This has led to the rational design and synthesis of numerous nucleoside analogues in an attempt to alter, and subsequently control, native nucleic acid structure and function. The first attempts at harnessing the latent abilities of DNA are described in Chapter 2. Multiple tetrahedral branching "hubs" were designed, synthesized and characterized, at which point single-stranded DNA could be elongated from each of the four points of origin. Ensuing hybridization studies were performed with the goal of monitoring the binding traits of these elongated tetrahedral lattices, with fully formed lattices potentially functioning as a means of drug encapsulation or molecular tethering. Chapter 3 describes direct alteration of the standard DNA backbone. Successive synthetic efforts towards creating a 6'-extended deoxyadenosine molecule are detailed, and its effects on the stability of duplexed DNA (along with the sister molecules 6'-deoxythymidine and an elongated 3'-deoxythymidine) are also defined. Upon insertion into DNA, this class of extended nucleosides could ultimately lead to a new duplex structure, as well as novel binding properties.
Thesis (PhD) — Boston College, 2014
Submitted to: Boston College. Graduate School of Arts and Sciences
Discipline: Chemistry
Craig, Andrew Philip Knott. "Canonical extensions of bounded lattices and natural duality for default bilattices." Thesis, University of Oxford, 2012. http://ora.ox.ac.uk/objects/uuid:c7137abe-b076-42ad-9691-bddfca36967e.
BOCANEGRA, CIFUENTES JOHAN AUGUSTO. "Lattice Boltzmann Method: applications to thermal fluid dynamics and energy systems." Doctoral thesis, Università degli studi di Genova, 2021. http://hdl.handle.net/11567/1060259.
In many energy systems, fluids play a fundamental role, and computational simulations are a valuable tool for studying their complex dynamics. The Lattice Boltzmann Method (LBM) is a relatively new numerical method for computational fluid dynamics, but its applications extend to physical phenomena beyond fluid flows. This thesis presents applications of the LBM to thermal fluid dynamics and energy systems. The specific applications considered are: nuclear reactor engineering problems; the thermal fluid dynamic behavior of a natural circulation loop; gravitational sedimentation of nanoparticles; and acoustical problems. The main original contributions of this work are: first, a systematic description of the current status of LBM applications to nuclear reactor problems, including test cases and benchmark simulations; second, the development and validation of an LBM model for a single-phase natural circulation loop; third, the development and validation of an LBM model for gravitational sedimentation of nanoparticles; and fourth, a systematic description of the current status of LBM applications to acoustics, including simulations of test cases. The development of this thesis was not limited to simulations; experimental studies in parallel-connected natural circulation loops of small inner diameter were conducted, showing the wide applicability of the one-dimensional theoretical models used to validate the LBM results. Additional contributions of this work: 1. the applicability of the method to the study of neutron transport and nuclear waste disposal using porous materials was shown; 2. changes in the thermophysical performance of the natural circulation loop when it reached a non-laminar (transition) regime were found at a Reynolds number lower than the typical range; 3. variable diffusion and sedimentation parameters were effective in modelling the experimental sedimentation curves.
In conclusion, this work shows that the LBM is a versatile and powerful computational tool that can be used beyond the common Computational Fluid Dynamics applications.
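The collide-and-stream update underlying the LBM applications surveyed above can be illustrated with a minimal sketch (an editorial illustration, not code from the thesis; the D2Q9 lattice is standard, but the BGK relaxation time τ = 0.8, the 32×32 periodic box and the density bump are assumptions chosen for brevity):

```python
import numpy as np

# D2Q9 lattice: rest velocity, 4 axis and 4 diagonal directions, with weights.
c = np.array([(0, 0), (1, 0), (0, 1), (-1, 0), (0, -1),
              (1, 1), (-1, 1), (-1, -1), (1, -1)])
w = np.array([4/9] + [1/9]*4 + [1/36]*4)

def equilibrium(rho, ux, uy):
    """Second-order truncated Maxwell-Boltzmann equilibrium distribution."""
    feq = np.empty((9,) + rho.shape)
    usq = ux**2 + uy**2
    for i, (cx, cy) in enumerate(c):
        cu = cx*ux + cy*uy
        feq[i] = w[i] * rho * (1 + 3*cu + 4.5*cu**2 - 1.5*usq)
    return feq

def lbm_step(f, tau=0.8):
    """One BGK collision followed by streaming on a periodic grid."""
    rho = f.sum(axis=0)
    ux = (f * c[:, 0, None, None]).sum(axis=0) / rho
    uy = (f * c[:, 1, None, None]).sum(axis=0) / rho
    f = f - (f - equilibrium(rho, ux, uy)) / tau          # collide
    for i, (cx, cy) in enumerate(c):                      # stream
        f[i] = np.roll(np.roll(f[i], cx, axis=0), cy, axis=1)
    return f

# A 32x32 periodic box at rest with a small density bump in the middle.
rho0 = np.ones((32, 32)); rho0[16, 16] = 1.1
f = equilibrium(rho0, np.zeros((32, 32)), np.zeros((32, 32)))
for _ in range(100):
    f = lbm_step(f)
print(abs(f.sum() - rho0.sum()) < 1e-7)  # True: mass is conserved
```

Because both the BGK collision and periodic streaming conserve mass, the total density is unchanged after any number of steps, which is a quick sanity check on any implementation.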
Bray, Christina L., Robert G. Bryant, M. J. Cox, Gianni Ferrante, Y. Goddard, Sandip Sur, and Joseph P. Hornack. "The proton Nuclear Magnetic Resonance spin-lattice relaxation rate of some hydrated synthetic and natural sands." Universitätsbibliothek Leipzig, 2015. http://nbn-resolving.de/urn:nbn:de:bsz:15-qucosa-192008.
Bray, Christina L., Robert G. Bryant, M. J. Cox, Gianni Ferrante, Y. Goddard, Sandip Sur, and Joseph P. Hornack. "The proton Nuclear Magnetic Resonance spin-lattice relaxation rate of some hydrated synthetic and natural sands." Diffusion fundamentals 10 (2009) 8, pp. 1-3, 2009. https://ul.qucosa.de/id/qucosa%3A14098.
Rydén, Gabriel. "Ab initio lattice dynamics and Anharmonic effects in refractory Rock-salt structure TaN ceramic." Thesis, Linköpings universitet, Teoretisk Fysik, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-174208.
Xue, Boyu. "3D Printed Lattice Structure for Driveline Applications." Thesis, KTH, Materialvetenskap, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-299270.
Lattice structures have received much attention as cellular materials in recent years due to their outstanding properties, e.g. high strength-to-weight ratio, heat transfer, energy absorption, and the ability to improve noise, vibration and harshness (NVH) behaviour. This type of structure has been boosted by additive manufacturing (AM) technology, which can produce geometries of virtually any shape. Owing to economic and environmental demands, lightweight design is increasingly used in the automotive and construction equipment industries. NVH characteristics are an important issue for construction equipment. However, the NVH behaviour of conventional designs is determined mainly by mass, meaning that quietness often requires heavy systems, which leads to increased energy consumption and higher emissions. Environmental trends and the resulting economic competition have therefore limited the traditional (heavy) solutions for improving NVH characteristics and made lightweight design more difficult. New solutions are needed to resolve the difficulty and challenge of combining NVH and lightweight requirements. In this research, topology optimization was carried out on a component for a new articulated hauler transmission (NAHT) to balance lightweight design and NVH behaviour. The topology-optimized 3D model was filled with a non-homogeneous lattice structure of optimal lattice density via size optimization. Lattice structure optimization is a type of topology optimization, and it is the term used to describe these procedures. Additive manufacturing (or 3D printing) is required to manufacture the complicated lattice structure (after topology and lattice structure optimization). The new models were analysed using the finite element method (FEM), and the analysis results were compared with those of the original models.
After the comparison, positive results were obtained, showing that topology and lattice structure optimization can be applied in the design of components for construction equipment. According to the results, lattice structure optimization can produce a reliable lightweight design with good NVH behaviour. In addition, the organization and layout of the lattice structure have a significant impact on the overall performance.
Shafei, Babak. "Reactive transport in natural porous media: contaminant sorption and pore-scale heterogeneity." Diss., Georgia Institute of Technology, 2012. http://hdl.handle.net/1853/45785.
Ivanov Angelov, Mitko. "Sound Scattering by Lattices of Heated Wires." Doctoral thesis, Universitat Politècnica de València, 2016. http://hdl.handle.net/10251/63275.
[ES] The aim of this work is to demonstrate theoretically and experimentally how the propagation of acoustic waves can be controlled by temperature gradients. Starting from the simplest case of two heated wires in air, the study extends to periodic structures known as sonic crystals (SC). The Finite Element Method (FEM) was used to perform numerical simulations with the aim of demonstrating the collimation and focusing of acoustic waves in two-dimensional (2D) SCs whose filling fraction is tunable by means of temperature gradients. As part of the investigation, the Bragg reflection and the Fabry-Perot-type effect associated with the studied SCs were analysed. Among the examples treated is an SC whose transmittance is tunable at will, within certain limits. Two-dimensional gradient-index acoustic lenses based on temperature gradients were also studied. Using parallel curtains of heated wires whose temperature varies according to a given law, a GRIN lens with prescribed properties can be designed. Moreover, by changing the temperature of the wires, a change in the filling fraction within the GRIN SC can be achieved. Thus the local refractive index, which is directly related to the filling fraction, changes as well, and a gradient-index variation is obtained within the GRIN SC. This GRIN SC is a direct analogue of gradient media observed in nature. Another aspect of this work deals with tuning certain properties of an SC, such as the effective refractive index or the effective density, with the aim of obtaining desired crystal properties. Since active tuning of phononic bandgaps is certainly desirable for future applications with enhanced functionalities, several attempts have so far been made to develop SCs with tunable characteristics.
By controlling the angle of incidence or the operating frequency, a GRIN SC can dynamically adjust the curvature of the propagation path within the SC structure. In recent studies of SCs, the filling fractions were tuned through direct physical deformation of the structure or through external stimuli (for example electric or magnetic fields). The former is impractical for a large share of applications, and the latter often requires very strong stimuli for modest adjustments. In this work, another way of tuning the properties of an SC is proposed. The acoustic properties of the propagation medium (density, refractive index) depend on temperature; therefore, by introducing temperature gradients into that medium, the properties of the SC can be tuned at will within certain limits. The way of obtaining temperature gradients within the SC proposed in this study is by means of nichrome wires heated by electric currents. This method has some important advantages. First, by changing the intensity of the electric current flowing through the wires, the properties of the SC can be changed dynamically. Second, it is relatively easier to change the filling fraction simply by adjusting the electric current than to physically modify the structure or to apply strong electric or magnetic fields. In conclusion, the method proposed in this thesis makes it possible, in principle, to obtain materials and structures with dynamically tunable acoustic properties by controlling the temperature through the electric current in the wires, within certain limits. In this way, wave-propagation phenomena analogous to those occurring in microscopic structures for the propagation of high-frequency electromagnetic waves (microwaves and
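The physical mechanism the abstract relies on, namely that the acoustic refractive index of air falls where the temperature rises, follows from the ideal-gas sound speed c = sqrt(γRT). A small numerical illustration (editorial sketch; the reference temperature of 293.15 K and the wire temperature of 600 K are arbitrary example values, not taken from the thesis):

```python
import math

def sound_speed(T_kelvin, gamma=1.4, R=287.05):
    """Speed of sound in an ideal gas: c = sqrt(gamma * R * T)."""
    return math.sqrt(gamma * R * T_kelvin)

def acoustic_index(T_kelvin, T_ref=293.15):
    """Effective acoustic refractive index relative to air at T_ref.
    Hotter air -> faster sound -> index below 1, so rays bend away."""
    return sound_speed(T_ref) / sound_speed(T_kelvin)

# Air near a wire heated to 600 K (example value):
print(round(acoustic_index(600.0), 3))  # → 0.699
```

Since γ and R cancel in the ratio, the index reduces to sqrt(T_ref/T): heating the wires is, acoustically, equivalent to locally lowering the refractive index, which is what makes the filling fraction of the crystal electrically tunable.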
Ivanov Angelov, M. (2016). Sound Scattering by Lattices of Heated Wires [Tesis doctoral no publicada]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/63275
Beneitez, Miguel, and Johan Sundin. "Turbulent flow control via nature inspired surface modifications." Thesis, KTH, Mekanik, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-206189.
Many naturally occurring flows are turbulent. Nature has also given rise to several types of surfaces that can affect them. Shark skin has riblets, fish have a mucus containing polymers, and the surface of the lotus flower has superhydrophobic properties, although these natural surfaces often have other functions as well. Given millions of years of adaptation, there are nevertheless many reasons to study them. This work is a study of nature-inspired surfaces, where the goal has been passive flow manipulation. The aim has not been to achieve a skin-friction reduction, but to gain a better understanding of how these surfaces affect turbulent flows. Simulations have been performed in a channel-like geometry, where the boundary condition of one channel wall has been modified. A macroscopic description has been used to simulate superhydrophobic and porous surfaces, and a microscopic description has been used to simulate fibres, both rigid and flexible, attached to a channel wall. For the flow with the macroscopically described boundary condition a pseudo-spectral method was used, and for the flow with the microscopically described boundary condition a lattice Boltzmann method was used. The superhydrophobic surface was implemented through a general tensor formulation. A boundary condition with non-zero velocity in the streamwise direction gave rise to a skin-friction reduction, and a boundary condition with non-zero velocity perpendicular to the streamwise direction gave rise to increased skin friction, in agreement with previous results. Non-zero off-diagonal tensor elements gave rise to a minor increase in skin friction, without appreciably changing the flow. The porous surfaces gave rise to an increase in skin friction and had a large impact on the turbulent structures. These surfaces formed two-dimensional structures perpendicular to the streamwise direction. Neither the rigid nor the flexible fibres gave rise to large changes in the velocity field.
However, recirculation zones arose, and these were stronger beneath high-speed streaks. The flexible fibres showed similarities with porous materials through an interaction between the vertical velocity field and the turbulent pressure fluctuations. This interaction did not arise for the rigid fibres. The bending of the fibres also correlated to a large extent with the pressure fluctuations.
Ruzzi, Anna. "Stabilizzazione di succo di carota tramite trattamenti non termici e utilizzo di antimicrobici naturali." Master's thesis, Alma Mater Studiorum - Università di Bologna, 2019. http://amslaurea.unibo.it/19384/.
Rehhali, Khaoula. "Simulations de la convection naturelle couplée au rayonnement surfacique par la méthode de Boltzmann sur réseau : cas des chauffages variable et discret." Electronic Thesis or Diss., Amiens, 2019. http://www.theses.fr/2019AMIE0001.
In this thesis, a numerical study is carried out on the coupling between natural convection and surface radiation in square cavities whose walls are subjected to discrete or non-uniform temperatures. The first study concerns a convection-radiation coupling problem in an inclined, air-filled square cavity having on one side a wall heated at a constant temperature and on the opposite side a wall heated linearly; the remaining walls are considered adiabatic. In the second study, the cavity has partially heated vertical walls (symmetrically and asymmetrically), a cooled upper wall and an adiabatic bottom wall. The objective of these numerical studies is to analyse the effect of surface radiation and of the different governing parameters (heating mode, Rayleigh number, angle of inclination, temperature difference) on the flow structure and the heat transfer. The second objective of this thesis is to test the performance of the multiple-relaxation-time (MRT) scheme of the lattice Boltzmann method (LBM) in the presence of convection-radiation coupling. The results of this study revealed that the considered governing parameters have a significant effect on the flow structure and the heat transfer through the cavity.
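Among the governing parameters listed above, the Rayleigh number combines buoyant driving with viscous and thermal diffusion as Ra = gβΔTL³/(να). A small illustrative computation (editorial sketch; the air properties and cavity size are generic example values, not the cases studied in the thesis):

```python
def rayleigh(g, beta, dT, L, nu, alpha):
    """Rayleigh number Ra = g * beta * dT * L**3 / (nu * alpha):
    buoyant driving over viscous and thermal diffusion."""
    return g * beta * dT * L**3 / (nu * alpha)

# Generic example: air (beta ~ 1/T) in a 0.1 m cavity with a 10 K difference.
Ra = rayleigh(g=9.81, beta=1 / 300, dT=10.0, L=0.1, nu=1.6e-5, alpha=2.2e-5)
print(f"Ra = {Ra:.2e}")  # Ra = 9.29e+05
```

At this magnitude the flow is still laminar; raising ΔT or the cavity size L (which enters cubed) pushes the cavity toward the transition regime, which is why Ra is the natural sweep parameter in such studies.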
Bartl, Eduard. "Mathematical foundations of graded knowledge spaces." Diss., Online access via UMI, 2009.
Includes bibliographical references.
Maquignon, Nicolas. "Vers un modèle multiphases et multicomposants (MPMC) de type Lattice Boltzmann Method (LBM) pour la simulation dynamique d'un fluide cyogénique dans l'eau." Thesis, Littoral, 2015. http://www.theses.fr/2015DUNK0426/document.
In this thesis, an LBM MPMC model with heat exchange is developed. Data assimilation tests and optical flow measurements are carried out in order to validate the model. The application context of this thesis is the mixture of a cryogenic fluid with water. In the first part, a bibliographical review of the Boltzmann equation and its various assumptions and simplifications, as well as the algorithmic aspects of the LBM, is presented. A comparison between the SRT and MRT collision operators is performed, and the simulation of turbulent phenomena at different Reynolds numbers is studied, in particular with the benchmark of the Von Karman instability. In the second part, the MPMC model of Shan & Chen is recalled and extended to the case of inter-component heat exchange. Quantitative validations are made, especially with the benchmark of a two-phase or two-component Couette flow. Consistency is tested against Laplace's law and against a benchmark involving heat conduction. Qualitative tests of condensation in a multi-component medium are proposed to validate the heat exchange between components in the presence of a phase transition. In the third part of this thesis, a validation method based on data assimilation is introduced, using the ensemble Kalman filter. A state-estimation test of a two-phase fluid is carried out, and the compatibility of ensemble Kalman filtering with the LBM MPMC model is assessed. For validation of the model's behavior in a two-component case, a non-cryogenic substitution fluid for LNG, butane, was selected to permit observations under accessible experimental conditions. An experimental platform for the injection of liquid butane into a pressurised water column is then presented. Shadowgraph images from the experiments with liquid butane in water are shown, and an optical flow algorithm is applied to these images. A qualitative assessment of the velocity field obtained by applying this algorithm is performed.
Angleby, Linda. "Structural and electronic properties of bare and organosilane-functionalized ZnO nanopaticles." Thesis, Linköping University, Linköping University, Linköping University, 2010. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-58691.
A systematic study of trends in band gap and lattice energies for bare zinc oxide nanoparticles was performed by means of quantum chemical density functional theory (DFT) calculations and density of states (DOS) calculations. The geometry of the optimized structures and the appearance of their frontier orbitals were also studied. The particles studied varied in size from (ZnO)6 up to (ZnO)192. The functionalization of bare and hydroxylated ZnO surfaces with MPTMS was studied with emphasis on the adsorption energies for adsorption to different surfaces and the effects of such adsorption on the band gap.
Tari, Kévin. "Automorphismes des variétés de Kummer généralisées." Thesis, Poitiers, 2015. http://www.theses.fr/2015POIT2301/document.
In this work, we classify non-symplectic automorphisms of varieties deformation-equivalent to 4-dimensional generalized Kummer varieties having a prime-order action on the Beauville-Bogomolov lattice. First, we give the fixed loci of natural automorphisms of this kind. Thereafter, we develop tools on lattices in order to apply them to our varieties. A lattice-theoretic study of 2-dimensional complex tori allows a better understanding of natural automorphisms of Kummer-type varieties. Finally, we classify all the automorphisms described above on those varieties. As an application of our results on lattices, we also complete the classification of prime-order automorphisms on varieties deformation-equivalent to Hilbert schemes of 2 points on K3 surfaces, solving the case of order 5, which was still open.
Pitou, Cynthia. "Extraction d'informations textuelles au sein de documents numérisés : cas des factures." Thesis, La Réunion, 2017. http://www.theses.fr/2017LARE0015.
Document processing is the transformation of human-understandable data into a computer-system-understandable format. Document analysis and understanding are the two phases of document processing. Considering a document containing lines, words and graphical objects such as logos, the analysis of such a document consists in extracting and isolating the words, lines and objects and then grouping them into blocks. The document-understanding subsystem builds relationships (to the right, left, above, below) between the blocks. A document processing system must be able to: locate textual information; identify whether that information is relevant compared with other information contained in the document; and extract that information in a computer-system-understandable format. In realizing such a system, major difficulties arise from the variability of document characteristics, such as: the type (invoice, form, quotation, report, etc.), the layout (font, style, disposition), the language, the typography and the quality of scanning. This work is concerned with scanned documents, also known as document images. We are particularly interested in locating textual information in invoice images. Invoices are widely used and well-regulated documents, but they are not unified. They contain mandatory information (invoice number, unique identifier of the issuing company, VAT amount, net amount, etc.) which, depending on the issuer, can take various locations in the document. The present work is in the framework of region-based textual information localization and extraction. First, we present a region-based method guided by quadtree decomposition. The principle of the method is to decompose the document images into four equal regions, each region into four new regions, and so on. Then, with a free optical character recognition (OCR) engine, we try to extract precise textual information in each region.
A region containing the expected number of textual items is not decomposed further. Our method accurately determines, in document images, the regions containing the textual information one wants to locate, quickly and efficiently. In another approach, we propose a textual information extraction model consisting of a set of prototype regions along with pathways for browsing through these prototype regions. The life cycle of the model comprises five steps:
- Produce synthetic invoice data from real-world invoice images containing the textual information of interest, along with its spatial positions.
- Partition the produced data.
- Derive the prototype regions from the obtained partition clusters.
- Derive pathways for browsing through the prototype regions, from the concept lattice of a suitably defined formal context.
- Incrementally update the set of prototype regions and the set of pathways when additional data must be added.
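The quadtree-guided decomposition described above can be sketched in a few lines. Here a region is simply an `(x, y, w, h)` rectangle, and `found_count` is a hypothetical stand-in for the OCR step that counts expected textual items inside a region; it is not part of the thesis itself.

```python
# Sketch of the quadtree-guided localization described above. A region is
# an (x, y, w, h) rectangle; `found_count` stands in for the OCR step that
# counts expected textual items inside a region (a hypothetical callback,
# not part of the thesis itself).

def decompose(region, found_count, target, max_depth=4, depth=0):
    """Recursively split a region into four equal parts, keeping whole any
    region that already yields the expected number of textual items."""
    x, y, w, h = region
    if found_count(region) >= target or depth == max_depth or w < 2 or h < 2:
        return [region]
    hw, hh = w // 2, h // 2
    quads = [(x, y, hw, hh), (x + hw, y, w - hw, hh),
             (x, y + hh, hw, h - hh), (x + hw, y + hh, w - hw, h - hh)]
    leaves = []
    for q in quads:
        leaves.extend(decompose(q, found_count, target, max_depth, depth + 1))
    return leaves
```

Any region that already yields the expected count is kept whole, while the others are split into four until a depth or size limit is reached.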
Guerra, João Carlos Carneiro. "Magneto-Optical Response of 2-Dimensional Lattices." Master's thesis, 2018. https://hdl.handle.net/10216/118781.
Full textSilva, Daniel José da. "Lattice location of transition metals in silicon by means of emission channeling." Doctoral thesis, 2014. https://repositorio-aberto.up.pt/handle/10216/100891.
Full textGomes, Mariana Melo Nogueira Rosa. "Polar properties, phase sequence and lattice dynamics of K0.5Na0.5NbO3 ceramics prepared through different sintering methods." Master's thesis, 2019. https://hdl.handle.net/10216/123594.
Full textBarbosa, Marcelo Baptista. "Electronic structure, lattice location and stability of dopants in wide band gap semiconductors." Doctoral thesis, 2019. https://hdl.handle.net/10216/119210.
Full text
Full textTsai, Yi-yuan, and 蔡益元. "Code Lattices and Natural Code Evolution Paths." Thesis, 2004. http://ndltd.ncl.edu.tw/handle/3aqf2u.
Full textChung Yuan Christian University
Graduate Institute of Information Engineering
92
For students of computer science, programming is an indispensable body of knowledge and a technical ability. However, while learning to program, learners often face hesitation, setbacks and frustration. Having written a piece of code after spending much time and effort, a learner may find that it contains many mistakes (bugs). Though the learner may strongly wish to correct the code, he or she may simply “wander around”, not knowing where to start or what to do. The result is often that the learner acts like a “headless fly”, trying everything everywhere. Perhaps, with good luck, the learner can finally find a way of correctly revising the code, but only after trying many “wrong ways”. An instructor, on the other hand, often has to face many different pieces of “wrong” code, each with some strange bug in it. Though the instructor knows what is wrong with any particular piece of code, it is often very difficult to guide the student (the author of the code) in correctly revising the code and learning the “correct” way of programming. The problem is further complicated by the fact that the instructor often cannot figure out why the student wrote that particular piece of code that way, let alone how the instructor might “correct” the student’s way of thinking. This further hinders providing appropriate assistance so that the learner can “think correctly” and write “correct code”. Because of this, we constructed a computer-assisted learning system (a CAL system, for short), which we call CSD (for Code Schema Development). CSD may be said to be constructivism-based.
The main idea is to control the problem-solving environment and provide scaffolding for code construction so that the learner can (1) develop the intended code schemas and (2) correctly apply the developed code schemas in solving programming problems. A secondary goal of CSD is to raise the learner’s confidence and interest in programming. CSD tracks the learner’s actions and answers (including intermediate answers) and records everything in a database. From the learners’ records, we analyze the various “tracks of thinking”, with the goal of identifying the various “paths” of code evolution, showing how learners progress from the initial erroneous code to the final correct code. Previous related research focused mainly on identifying error patterns in learners’ code, and was not concerned with how learners made revisions in order to obtain the “right” code. In this research, we propose a way of describing how a learner “evolves” from one error pattern to another in producing the final answer (the correct code). In doing so, we also try to analyze what the learner may have been thinking when writing the (erroneous) code.
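The notion of a path between error patterns can be illustrated with a minimal sketch. The revision list, the `classify` function and the pattern labels below are invented for illustration; they are not CSD’s own data or classifier.

```python
# Hypothetical sketch of a "code evolution path": each saved revision is
# mapped by some classifier to a set of error-pattern labels, and the path
# is the sequence of distinct states leading to the final (correct) code.
# The labels and the classifier used here are invented, not CSD's own.

def evolution_path(revisions, classify):
    """Collapse a chronological list of code revisions into the sequence of
    distinct error-pattern states; an empty frozenset means correct code."""
    path = []
    for code in revisions:
        state = frozenset(classify(code))
        if not path or path[-1] != state:
            path.append(state)
    return path
```

A learner who saves several revisions with the same bug, then fixes it, thus contributes a two-state path from that error pattern to the correct-code state.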
Chang, Chun-Kai, and 張駿愷. "Lattice Boltzmann simulation of nanofluid natural/mixed convection heat transfer." Thesis, 2012. http://ndltd.ncl.edu.tw/handle/28913572604744787599.
Full textNational Cheng Kung University
Department of Mechanical Engineering (Master's and Doctoral Program)
100
In the present study, mathematical modeling is performed to simulate two-dimensional incompressible natural and mixed convection of nanofluids in a vertical square enclosure and channel using the lattice Boltzmann method (LBM). The effects of different nanofluid concentrations (0%, 2%, 4%), the Rayleigh number and the Richardson number on the averaged Nusselt number are considered, as are the effects of placing a rectangular hole with different height-to-length ratios in the vertical channel on the Nusselt number, the averaged Nusselt number and the temperature field. The inlet velocity is chosen appropriately so as to ensure reasonable adaptation of the flow field and to avoid an unphysical compressibility effect. Numerical results show that the averaged Nusselt number of nanofluids can be higher than that of pure water. Increasing the nanoparticle volume fraction enhances the averaged Nusselt number. The averaged Nusselt number also increases with the temperature change caused by an increase in the Rayleigh number and with the characteristic-velocity change caused by a reduction in the Richardson number. The height-to-length ratio of the rectangular hole significantly affects the velocity field and vortex shape, and the averaged Nusselt number decreases as the rectangular hole is enlarged.
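As a rough illustration of the kind of kernel LBM studies like this one rely on, here is a minimal D2Q9 BGK collision step (isothermal, single lattice node, no nanofluid or buoyancy coupling; all names are generic, not taken from the thesis):

```python
import numpy as np

# Minimal D2Q9 lattice-Boltzmann BGK collision step, sketching the kind of
# kernel used in LBM convection studies (isothermal, single node, no
# nanofluid or buoyancy coupling; names are generic, not thesis-specific).
W = np.array([4/9] + [1/9] * 4 + [1/36] * 4)          # lattice weights
C = np.array([(0, 0), (1, 0), (0, 1), (-1, 0), (0, -1),
              (1, 1), (-1, 1), (-1, -1), (1, -1)])    # discrete velocities

def equilibrium(rho, u):
    """Second-order equilibrium distribution for density rho, velocity u."""
    cu = C @ u                       # c_i . u for each of the 9 directions
    return rho * W * (1 + 3 * cu + 4.5 * cu**2 - 1.5 * (u @ u))

def collide(f, tau):
    """Relax the 9 distributions f toward local equilibrium (BGK operator)."""
    rho = f.sum()                    # local density
    u = (C.T @ f) / rho              # local velocity
    return f + (equilibrium(rho, u) - f) / tau
```

A full simulation would add streaming between lattice nodes and, for thermal problems such as the one above, a second distribution for temperature with a Boussinesq forcing term; the collision step itself conserves mass and momentum exactly.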
Rodrigues, Pedro Miguel da Rocha. "Functional lattice instabilities in naturally layered perovskites: from local probe studies to macroscopic cross-coupling effects." Doctoral thesis, 2021. https://hdl.handle.net/10216/139585.
Full textHuang, Nai-Zhu, and 黃迺筑. "The Natural Measure of Symbolic Dynamical Systems in the Two-Dimensional Lattice Model." Thesis, 2019. http://ndltd.ncl.edu.tw/handle/hmmzbw.
Full textNational Chiao Tung University
Department of Applied Mathematics
107
This thesis investigates the natural measure of symbolic dynamical systems in two-dimensional lattice models. For a two-dimensional shift of finite type with an irreducible ordering matrix H_2, we provide a method for computing the natural measure of the two-dimensional lattice model, and we extend the one-dimensional models to two dimensions. Finally, we apply this method to obtain the exact value of the natural measure of a totally symmetric system in two dimensions.
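As one-dimensional background for the construction the abstract generalizes, the natural (Parry) measure of a shift of finite type can be written via the Perron data of its transition matrix; this is the standard textbook form, not notation taken from the thesis, and presumably the ordering matrix H_2 plays the role of A in the two-dimensional method.

```latex
% Parry (natural) measure of a 1D SFT with irreducible transition matrix A,
% Perron eigenvalue \lambda, and left/right Perron eigenvectors u, v
% normalized so that \sum_i u_i v_i = 1:
\mu\bigl([\,i_0 i_1 \cdots i_n\,]\bigr)
  = \frac{u_{i_0}\, v_{i_n}}{\lambda^{\,n}} \prod_{k=0}^{n-1} A_{i_k i_{k+1}} .
```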
Chen, Yi-Hsin, and 陳逸昕. "Lattice Boltzmann method for simulating the macroscopic and mesoscopic natural convection flows inside a rectangular cavity." Thesis, 2007. http://ndltd.ncl.edu.tw/handle/53906405335715367280.
Full textNational Cheng Kung University
Department of Engineering Science (Master's and Doctoral Program)
95
Natural convection inside a closed cavity is of interest in many scientific and industrial applications. Many numerical methods have been applied to analyze this problem, including the lattice Boltzmann method (LBM), which has emerged as one of the most powerful computational fluid dynamics (CFD) methods in recent years. Using a simple LB model with the Boussinesq approximation, this study investigates the 2D natural convection problem inside a rectangular cavity at different reference Rayleigh numbers, Knudsen numbers and cavity aspect ratios, with the Prandtl number fixed, at the macroscopic and the mesoscopic scale respectively. The flow structures with instability phenomena at the macroscopic and mesoscopic scales are compared and analyzed. In simulating natural convection problems with the LBM, a model for choosing an appropriate value of the velocity scale is critically important. The current work proposes a model to determine the value of the characteristic velocity (V) based on kinetic theory. A spectrum analysis is performed to identify unsteady periodic or quasi-periodic oscillatory flow structures. The relationship between the Nusselt number and the reference Rayleigh number is also exhibited. The simulation results show that flow instability depends on the Rayleigh number, the Knudsen number and the cavity aspect ratio. Meanwhile, the Knudsen number and the aspect ratio play significant roles in shaping the oscillatory flow structures.
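The characteristic velocity V mentioned above is commonly tied to the buoyancy scale in Boussinesq LBM setups; a standard set of definitions (assumed here, not quoted from the thesis) is:

```latex
% Common buoyancy velocity scale and dimensionless groups used to set up
% Boussinesq natural-convection LBM simulations (g: gravity, \beta: thermal
% expansion coefficient, \Delta T: temperature difference, H: cavity
% height, \nu: kinematic viscosity, \alpha: thermal diffusivity):
V = \sqrt{g \beta \Delta T\, H}, \qquad
\mathrm{Ra} = \frac{g \beta \Delta T\, H^{3}}{\nu \alpha}, \qquad
\mathrm{Pr} = \frac{\nu}{\alpha}.
```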
Hsu, Hao, and 徐豪. "A study on the topological nature of Landau levels in a 2D muffin-tin potential lattice." Thesis, 2014. http://ndltd.ncl.edu.tw/handle/84108410755641157467.
Full textNational Chiao Tung University
Department of Electrophysics
103
In this work we consider the topological features of a square lattice on a 2DEG under a normal magnetic field. Specifically, the magnetic flux per unit cell is fixed at one half of a flux quantum. Exact numerical diagonalization of the full Hamiltonian is performed to obtain the energy bands |N,k> of the N-th Landau level (LL). The calculation is facilitated by TKNN-type (PRL 49, 405 (1982)) basis wavefunctions |n,k>, in which the magnetic translation symmetry is built in and the n-th eigenstate of a simple harmonic oscillator is used. As the lattice potential U is turned on, our focus is on the gap closing between the |N,k>, the k at which the gap closing occurs, the dispersion relations of the two gap-closing LLs, the Berry curvatures, and the change in the Chern number. The change in the Chern number at each gap-closing point in k space is one (two) when the closing bands have linear (quadratic) energy dispersion. Further analysis shows that the closing bands typically have two dominating components |n1,k> and |n2,k> such that |n1-n2| equals 1 or 3 (2) in the linear (quadratic) case. In the quadratic case, the Berry curvature takes on a volcano-type protrusion encircling the high-symmetry k point (also the gap-closing point). This results from our finding that at this high-symmetry point the lattice potential couples basis wavefunctions only under the condition |n1-n2|=4. We use the k·p method to obtain the effective Hamiltonian. Surprisingly, in the quadratic case, using the two eigenstates of the closing bands as a basis is not enough, no matter how small the gap between them, since these two basis states are not coupled by the effective Hamiltonian; the smallest dimension of the effective Hamiltonian is three.
We further use Löwdin perturbation theory to include the coupling with the third basis state and obtain an appropriate two-by-two effective Hamiltonian that matches the energy dispersion and the Berry curvature of the full Hamiltonian well.
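The Chern-number bookkeeping described above follows the standard relation between the Berry curvature and the Chern number of a band (textbook form, not quoted from the thesis):

```latex
% Chern number of the N-th Landau band as the integral of its Berry
% curvature \Omega_N over the magnetic Brillouin zone; at a gap closing
% with linear (Dirac-like) dispersion the Chern number changes by \pm 1,
% and at a quadratic band touching by \pm 2:
C_N = \frac{1}{2\pi} \int_{\mathrm{BZ}} \Omega_N(\mathbf{k})\, d^2k,
\qquad
\Omega_N(\mathbf{k}) = i\, \nabla_{\mathbf{k}} \times
  \langle N,\mathbf{k} \,|\, \nabla_{\mathbf{k}} \,|\, N,\mathbf{k} \rangle .
```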
Monteiro, Rafael Torrado. "Post-quantum cryptography: lattice-based cryptography and analysis of NTRU public-key cryptosystem." Master's thesis, 2016. http://hdl.handle.net/10451/28303.
Full textIn 1994, Peter Shor developed a quantum algorithm for integer factorization and for the discrete logarithm problem, making both RSA and ECC (Elliptic Curve Cryptography) easily breakable by quantum computers. Since quantum computers are believed to become physical reality within the next few years, almost all encrypted information will cease to be secure, and the impact on everyday life will be devastating, to say the least. Communication between people, companies and governments, banking operations, electronic commerce and passwords, among much else, will be compromised. Public-key cryptography is today indispensable for individuals or entities to communicate securely, so the existence of quantum computers constitutes an enormous threat to modern cryptography. Should quantum computers indeed arrive, the cryptosystems currently in use will have to be replaced, since the existing quantum-resistant cryptosystems are far from efficient enough to be put into practice. It is therefore imperative to study the candidates to replace the current cryptosystems in terms of their efficiency, practicality and security against both classical and quantum computers. In 2016, NIST, the National Institute of Standards and Technology, an institute of the U.S. Department of Commerce, announced its search for such quantum-resistant cryptographic constructions, which must be efficient and good enough to be adopted for a large number of years. The institute anticipates the arrival of quantum computers by 2030, which leaves little time to study the eventual proposals to replace the current cryptosystems. Moreover, such a replacement will certainly take time, so the search for quantum-resistant cryptographic constructions must start now.
Developed by Jeffrey Hoffstein, Jill Pipher and Joseph Silverman (the same Silverman famous for his work on elliptic curves) in the 1990s, presented at the international cryptology conference CRYPTO '96 and published in 1998, the NTRU cryptosystem is an example of a quantum-resistant cryptosystem with the potential to replace the classical cryptosystems currently in use. Although it is considered a lattice-based cryptosystem, its original description does not use lattices, even though it can be described entirely in terms of them; even so, the paper introducing it already presents lattice-based attacks that exploit certain aspects intrinsic to it. The theory behind the NTRU cryptosystem is quite simple, as it rests on polynomial algebra and on reduction modulo two coprime integers. NTRU is therefore very easy to understand, implement and put into practice: both the public and the private keys are quickly computable and have length O(N), in clear contrast with the O(N²) key lengths of other so-called fast cryptosystems, such as the McEliece and GGH cryptosystems, and encryption and decryption with NTRU take O(N²) operations, which makes it considerably faster than the RSA cryptosystem and elliptic-curve cryptography. In addition, NTRU is considered a probabilistic cryptosystem, since it uses a random element to encrypt a message; a message can thus have multiple possible encryptions. Since the publication of NTRU, several technical reports have appeared, some consisting of algorithms that made NTRU more efficient, others describing improvements of existing attacks as well as new attacks and possible ways to counter them.
The security analysis carried out to date, including by renowned cryptographers such as Don Coppersmith, Johan Håstad, Andrew Odlyzko and Adi Shamir, has given the NTRU cryptosystem a status of legitimacy and has encouraged further research, as NTRU has proved interesting and promising. This analysis, however, has not forced deep changes to NTRU's structure, although larger parameters have been recommended over time to reach the desired security levels. Despite not being accompanied by a security proof, NTRU has proved secure, since no classical or quantum attack with a significant impact on the cryptosystem has been found, which shows that it is a good alternative to the classical cryptosystems in use. Our study of NTRU began by exploring the website of the company Security Innovation, which contains a brief introduction to post-quantum cryptography and a video explaining why it should be studied. The site offers a vast number of resources, including tutorials, surveys, technical reports, abstracts and articles, as well as an extensive list of papers scrutinizing NTRU. To deepen our study, we explored and investigated a large part of this documentation. There is a tremendous amount of material to analyze, and owing to time constraints we chose to study the NTRUEncrypt cryptosystem, starting from its roots. In 1997, Don Coppersmith and Adi Shamir carried out the first security analysis of NTRU, in which they found that the most effective method of attack against the cryptosystem consisted of lattice reduction algorithms aimed at finding very short vectors in a particular class of lattices. To date, this remains the most effective method of attack on NTRU.
The security of NTRU rests on empirical evidence showing that finding very short vectors in a lattice is a hard problem, especially when the dimension of the lattice is very large. The study of lattices, named the Geometry of Numbers by Minkowski, predates its cryptographic applications and has contributed to the development of many other areas, such as physics, analysis, algebra and geometry. It is a much-studied topic, mainly because of the applications of lattice reduction algorithms. Since the security of NTRU depends chiefly on lattice-based attacks, we devote an entire chapter to lattices and lattice reduction algorithms. An important advance in this area was the LLL algorithm (sometimes called the L³ algorithm) of Lenstra, Lenstra and Lovász in 1982, developed before lattices were considered relevant to cryptography; it finds a moderately short lattice vector in polynomial time. Over time this algorithm was improved, giving rise to the Blockwise-Korkine-Zolotarev (BKZ) algorithm, due to Schnorr (1987), which to date is in practice the best lattice reduction algorithm for finding very short lattice vectors, although it does not run in polynomial time. Finding very short vectors in a high-dimensional lattice thus remains an open problem. In 2011, Yuanmi Chen and Phong Q. Nguyen created the BKZ simulation algorithm, which can approximately predict both the output and the running time of the BKZ algorithm. This work is divided into three parts. The first is a short introduction to quantum computing, chiefly the basics of quantum mechanics and the Fourier Transform, since some readers may be unfamiliar with this material.
Moreover, some notions of quantum computing are mentioned throughout this work, so a brief reference to them is warranted. The second part treats lattices in general, presents the LLL and BKZ lattice reduction algorithms (together with the BKZ simulation algorithm) and gives some examples of lattice-based public-key cryptosystems. The third part presents an analysis of the NTRU cryptosystem, focusing mainly on lattice-based attacks, which are by far the ones best suited to the lattices associated with NTRU. As future work, we present an idea for a quantum algorithm for lattice reduction.
In 1994, Peter Shor developed a quantum algorithm for integer factorization and the discrete logarithm problem, known as Shor's algorithm. This was a major breakthrough in the quantum field, given that it makes essentially all currently used cryptographic constructions (such as RSA and ECC) easily breakable by a quantum computer. Since quantum computers are believed to become physically available in the next few years, these encryption schemes will no longer be reliable, leaving the encrypted data compromised. It is thus of crucial importance to build and analyze new quantum-resistant cryptographic schemes. These new cryptographic constructions should be efficient and secure enough to be used in practice and standardized for a large number of years, replacing the current ones where needed. In 2016, NIST (the National Institute of Standards and Technology, of the U.S. Department of Commerce) announced its search for such quantum-resistant cryptographic constructions. The NTRU encryption scheme, developed by Jeffrey Hoffstein, Jill Pipher and Joseph Silverman (the same Silverman famous for his work on elliptic curves) in the 1990s, presented in 1996 and published in 1998, is an example of a quantum-resistant proposal. Although it is not supported by a theoretical security proof, the scrutiny it has received since its presentation shows that NTRU is secure and a good candidate for replacing the constructions currently in use. Some classical cryptosystems that resist quantum attacks do already exist; the McEliece cryptosystem (1978) is one such quantum-resistant construction, but it is not practical to use.
The theory behind the NTRU cryptosystem is very simple and therefore easy to understand, and NTRU can be very easily implemented and put into practice: both the private and the public keys are easily and quickly computable, having O(N) length, in clear contrast with the O(N²) key length of other cryptosystems considered fast, and encrypting and decrypting with NTRU takes O(N²) operations, which makes it considerably faster than RSA and ECC. To date, NTRU remains secure against both classical and quantum computers. The most effective attacks on NTRU are lattice-based, namely lattice reduction algorithms, whose goal is to find very short vectors in a lattice. One can apply the LLL algorithm (1982) of Arjen Lenstra, Hendrik Lenstra and László Lovász, recognized as a very important achievement in many areas of mathematics; it runs in polynomial time but has an exponential approximation factor. This algorithm was improved over time. The most important improvement is due to Schnorr (1987), who introduced the Blockwise-Korkine-Zolotarev algorithm, also known as the BKZ algorithm, which is to date the best lattice reduction algorithm to be put into practice. In 2011, Yuanmi Chen and Phong Q. Nguyen improved the BKZ algorithm, revising lattice security estimates, and created the BKZ simulation algorithm, which makes it possible to predict both the output and the running time of the BKZ algorithm without running BKZ itself, since in high dimension with blocksize ≥ 45 the running time of BKZ is considerably long. There is a great deal of research in this area, mainly because of its cryptographic applications.
This work can be divided into three parts: first, we present a short introduction to quantum computing, namely the basics of quantum mechanics and the Fourier Transform; second, we provide sufficient groundwork on lattices, present the LLL and BKZ lattice reduction algorithms, along with the BKZ simulation algorithm, and give some examples of lattice-based public-key encryption schemes; third, we give an analysis of the NTRU cryptosystem, focusing mainly on lattice-based attacks. Finally, as future work, we present an idea for a quantum algorithm for lattice reduction.
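The O(N²) encryption cost quoted above comes from the ring multiplication at NTRU's core; a toy sketch of that cyclic-convolution arithmetic in Z[x]/(x^N − 1) with coefficients mod q (illustrative parameters only, nowhere near a secure NTRU parameter set) is:

```python
# Toy sketch of the polynomial arithmetic at the core of NTRU: products are
# taken in the ring Z[x]/(x^N - 1) (cyclic convolution) with coefficients
# reduced mod q. Parameters here are illustrative, not a secure NTRU set.

def convolve(a, b, N, q):
    """Cyclic convolution of coefficient lists a and b (degree < N), mod q."""
    c = [0] * N
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            c[(i + j) % N] = (c[(i + j) % N] + ai * bj) % q
    return c
```

Encryption and decryption each reduce to a handful of such convolutions, which is where the O(N²) operation count comes from; transform-based multiplication can lower it further.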
Abreu, Maria Zita Fiqueli de. "Esquemas de assinatura digital Lattice-based e experimentação de certificados híbridos com criptografia pós-quântica." Master's thesis, 2020. http://hdl.handle.net/1822/73679.
Full text
Post-quantum cryptographic algorithms take as their security premise the difficulty of solving mathematical problems conjectured to be hard in quantum computing. Interest in deploying these algorithms has been growing, so that information remains protected against future quantum attacks. The National Institute of Standards and Technology (NIST) currently has an open call for the selection of post-quantum cryptographic algorithms [1]. This dissertation studies two of these algorithms, namely the digital signature schemes qTESLA and Crystals-Dilithium, using the SageMath software as the main tool for their non-optimized implementation. Although the NIST competition is an important step, a transition from the current protocols to a new model, integrating hybrid solutions, is also needed. In this spirit, and with a view to a smoother transition from classical algorithms, the adaptation of certificates to post-quantum cryptography is analyzed and hybrid certificates are tested with the signature schemes already mentioned. This work was developed in partnership with the University of Minho and Multicert.