Scientific literature on the topic "Average measure of entanglement"

Create a correct reference in APA, MLA, Chicago, Harvard, and several other citation styles

Choose a source:

Browse the thematic lists of journal articles, books, theses, conference proceedings, and other academic sources on the topic "Average measure of entanglement".

Next to each source in the reference list there is an "Add to bibliography" button. Click it, and we will automatically generate the bibliographic reference for the chosen source in your preferred citation style: APA, MLA, Harvard, Vancouver, Chicago, etc.

You can also download the full text of the scholarly publication as a PDF and read its abstract online whenever this information is included in the metadata.

Journal articles on the topic "Average measure of entanglement"

1

Dahlsten, O. C. O., et M. B. Plenio. « Entanglement probability distribution of bi-partite randomised stabilizer states ». Quantum Information and Computation 6, no 6 (septembre 2006) : 527–38. http://dx.doi.org/10.26421/qic6.6-5.

Full text
Abstract:
We study the entanglement properties of random pure stabilizer states in spin-1/2 particles. We obtain a compact and exact expression for the probability distribution of the entanglement values across any bipartite cut. This allows for exact derivations of the average entanglement and the degree of concentration of measure around this average. We also give simple bounds on these quantities. We find that for large systems the average entanglement is near maximal and the measure is concentrated around it.
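As a rough numerical companion to the abstract above: the exact results there concern random stabilizer states, but the qualitative picture (average bipartite entanglement close to maximal, with the distribution concentrated around that average) can be illustrated by sampling Haar-random pure states and computing the entanglement entropy across a balanced cut. The qubit number, the sample size, and the substitution of Haar-random states for stabilizer states are illustrative assumptions, not part of the paper.

    import numpy as np

    def random_pure_state(dim, rng):
        """Haar-random pure state as a complex unit vector."""
        v = rng.normal(size=dim) + 1j * rng.normal(size=dim)
        return v / np.linalg.norm(v)

    def entanglement_entropy(state, dim_a, dim_b):
        """Von Neumann entropy (in bits) of the reduced state of subsystem A."""
        m = state.reshape(dim_a, dim_b)              # coefficient matrix of the cut
        p = np.linalg.svd(m, compute_uv=False) ** 2  # squared Schmidt coefficients
        p = p[p > 1e-15]
        return float(-np.sum(p * np.log2(p)))

    rng = np.random.default_rng(0)
    n = 10                                           # total number of qubits (arbitrary)
    dim_a, dim_b = 2 ** (n // 2), 2 ** (n - n // 2)
    samples = [entanglement_entropy(random_pure_state(dim_a * dim_b, rng), dim_a, dim_b)
               for _ in range(200)]
    print(f"average entropy = {np.mean(samples):.3f} bits (maximum possible: {n // 2})")
    print(f"standard deviation = {np.std(samples):.4f}")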
2

Brennen, G. « An observable measure of entanglement for pure states of multi-qubit systems ». Quantum Information and Computation 3, no 6 (novembre 2003) : 619–26. http://dx.doi.org/10.26421/qic3.6-5.

Full text
Abstract:
Recently, Meyer and Wallach [Meyer and Wallach (2002), J. of Math. Phys., 43, pp. 4273] proposed a measure of multi-qubit entanglement that is a function on pure states. We find that this function can be interpreted as a physical quantity related to the average purity of the constituent qubits and show how it can be observed in an efficient manner without the need for full quantum state tomography. A possible realization is described for measuring the entanglement of a chain of atomic qubits trapped in a 3D optical lattice.
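The Meyer–Wallach quantity referred to here can be written as Q = 2(1 − (1/n) Σ_k Tr ρ_k²), i.e. a function of the average purity of the single-qubit reduced states, which is the physical interpretation the paper highlights. A minimal sketch of that formula follows; the GHZ test state and the explicit partial-trace construction are illustrative choices, not taken from the paper.

    import numpy as np

    def single_qubit_purity(state, k, n):
        """Purity Tr(rho_k^2) of qubit k in an n-qubit pure state."""
        psi = state.reshape([2] * n)
        m = np.moveaxis(psi, k, 0).reshape(2, -1)   # qubit k versus the rest
        rho_k = m @ m.conj().T                      # reduced density matrix of qubit k
        return float(np.real(np.trace(rho_k @ rho_k)))

    def meyer_wallach_q(state, n):
        """Q = 2 * (1 - average single-qubit purity)."""
        avg_purity = np.mean([single_qubit_purity(state, k, n) for k in range(n)])
        return 2.0 * (1.0 - avg_purity)

    n = 3
    ghz = np.zeros(2 ** n, dtype=complex)
    ghz[0] = ghz[-1] = 1 / np.sqrt(2)               # (|000> + |111>)/sqrt(2)
    print(meyer_wallach_q(ghz, n))                  # 1.0 for the GHZ state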
3

ZHANG, YE-QI, et JING-BO XU. « ENTANGLEMENT SWAPPING OF PAIR CAT STATES ». International Journal of Quantum Information 09, no 03 (avril 2011) : 993–1003. http://dx.doi.org/10.1142/s0219749911007496.

Full text
Abstract:
We investigate the entanglement swapping of the continuous variable states by taking the pair Schrödinger cat states as the input entangled states, in which the two-mode squeezed vacuum and the pair cat states serve as the quantum channel, respectively. The entanglement of the initial states as well as the final states is analyzed by adopting the logarithmic negativity as the measure of entanglement. The quantum teleportation task by exploiting the swapped states as the quantum channel is also considered, where a coherent state serves as the target state and the average fidelity is examined.
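The entanglement measure adopted in this abstract is the logarithmic negativity, E_N(ρ) = log₂ of the trace norm of the partial transpose of ρ. The sketch below evaluates that definition for a two-qubit density matrix; the paper itself deals with continuous-variable states, for which E_N is normally computed from symplectic eigenvalues, so this finite-dimensional example only illustrates the definition.

    import numpy as np

    def partial_transpose(rho, dims, sys=1):
        """Partial transpose of a bipartite density matrix over subsystem `sys` (0 or 1)."""
        d0, d1 = dims
        r = rho.reshape(d0, d1, d0, d1)
        r = r.transpose(2, 1, 0, 3) if sys == 0 else r.transpose(0, 3, 2, 1)
        return r.reshape(d0 * d1, d0 * d1)

    def log_negativity(rho, dims):
        """E_N = log2 of the trace norm of the partial transpose."""
        evals = np.linalg.eigvalsh(partial_transpose(rho, dims))  # Hermitian matrix
        return float(np.log2(np.sum(np.abs(evals))))

    bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)     # (|00> + |11>)/sqrt(2)
    rho = np.outer(bell, bell.conj())
    print(log_negativity(rho, (2, 2)))                            # 1.0 for a Bell pair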
4

Lapeyre Jr., G. John, Sébastien Perseguers, Maciej Lewenstein et Antonio Acín. « Distribution of entanglement in networks of bi-partite full-rank mixed states ». Quantum Information and Computation 12, no 5&6 (mai 2012) : 502–34. http://dx.doi.org/10.26421/qic12.5-6-10.

Full text
Abstract:
We study quantum entanglement distribution on networks with full-rank bi-partite mixed states linking qubits on nodes. In particular, we use entanglement swapping and purification to partially entangle widely separated nodes. The simplest method consists of performing entanglement swappings along the shortest chain of links connecting the two nodes. However, we show that this method may be improved upon by choosing a protocol with a specific ordering of swappings and purifications. A priori, the design that produces optimal improvement is not clear. However, we parameterize the choices and find that the optimal values depend strongly on the desired measure of improvement. As an initial application, we apply the new improved protocols to the Erdős–Rényi network and obtain results including low-density limits and an exact calculation of the average entanglement gained at the critical point.
5

Soteros, C. E., D. W. Sumners et S. G. Whittington. « Entanglement complexity of graphs in Z3 ». Mathematical Proceedings of the Cambridge Philosophical Society 111, no 1 (janvier 1992) : 75–91. http://dx.doi.org/10.1017/s0305004100075174.

Full text
Abstract:
In this paper we are concerned with questions about the knottedness of a closed curve of given length embedded in Z3. What is the probability that such a randomly chosen embedding is knotted? What is the probability that the embedding contains a particular knot? What is the expected complexity of the knot? To what extent can these questions also be answered for a graph of a given homeomorphism type? We use a pattern theorem due to Kesten [12] to prove that almost all embeddings in Z3 of a sufficiently long closed curve contain any given knot. We introduce the idea of a good measure of knot complexity. This is a function F which maps the set of equivalence classes of embeddings into [0, ∞). The F measure of the unknot is zero, and, generally speaking, the more complex the prime knot decomposition of a given knot type, the greater its F measure. We prove that the average value of F diverges to infinity as the length (n) of the embedding goes to infinity, at least linearly in n. One example of a good measure of knot complexity is crossing number. Finally, we consider similar questions for embeddings of graphs. We show that for a fixed homeomorphism type, as the number of edges n goes to infinity, almost all embeddings are knotted if the homeomorphism type does not contain a cut edge. We prove a weaker result in the case that the homeomorphism type contains at least one cut edge and at least one cycle.
6

MAJTEY, A. P., et A. R. PLASTINO. « TYPICAL BEHAVIOR OF THE GLOBAL ENTANGLEMENT OF AN OPEN MULTIQUBIT SYSTEM IN A NON-MARKOVIAN REGIMEN ». International Journal of Quantum Information 10, no 06 (septembre 2012) : 1250063. http://dx.doi.org/10.1142/s0219749912500633.

Full text
Abstract:
We investigate the decay of the global entanglement, due to decoherence, of multiqubit systems interacting with a reservoir in a non-Markovian regime. We assume that during the decoherence process each qubit of the system interacts with its own, independent environment. Most previous works on this problem focused on particular initial states or families of initial states amenable to analytical treatment. Here we determine numerically the typical, average behavior of the system corresponding to random initial pure states uniformly distributed (in the whole Hilbert space of n-qubit pure states) according to the Haar measure. We study systems consisting of 3, 4, 5, and 6 qubits. In each case we also consider the entanglement dynamics corresponding to important particular initial states, such as the GHZ states or multiqubit states maximizing the global entanglement, and determine in which cases any of these states is representative of the average behavior associated with general initial states.
7

Rajeev, Manasa, et Christine C. Helms. « A Study of the Relationship between Polymer Solution Entanglement and Electrospun PCL Fiber Mechanics ». Polymers 15, no 23 (28 novembre 2023) : 4555. http://dx.doi.org/10.3390/polym15234555.

Full text
Abstract:
Electrospun fibers range in size from nanometers to micrometers and have a multitude of potential applications that depend upon their morphology and mechanics. In this paper, we investigate the effect of polymer solution entanglement on the mechanical properties of individual electrospun polycaprolactone (PCL) fibers. Multiple concentrations of PCL, a biocompatible polymer, were dissolved in a minimum toxicity solvent composed of acetic acid and formic acid. The number of entanglements per polymer (ne) in solution was calculated using the polymer volume fraction, and the resultant electrospun fiber morphology and mechanics were measured. Consistent electrospinning of smooth fibers was achieved for solutions with ne ranging from 3.8 to 4.9, and the corresponding concentration of 13 g/dL to 17 g/dL PCL. The initial modulus of the resultant fibers did not depend upon polymer entanglement. However, the examination of fiber mechanics at higher strains, performed via lateral force atomic force microscopy (AFM), revealed differences among the fibers formed at various concentrations. Average fiber extensibility increased by 35% as the polymer entanglement number increased from a 3.8 ne solution to a 4.9 ne solution. All PCL fibers displayed strain-hardening behavior. On average, the stress increased with strain to the second power. Therefore, the larger extensibilities at higher ne also led to a more than double increase in fiber strength. Our results support the role of polymer entanglement in the mechanical properties of electrospun fiber at large strains.
8

Horodecki, Michał. « Simplifying Monotonicity Conditions for Entanglement Measures ». Open Systems & Information Dynamics 12, no 03 (septembre 2005) : 231–37. http://dx.doi.org/10.1007/s11080-005-0920-5.

Full text
Abstract:
We show that for a convex function the following rather modest conditions are equivalent to monotonicity under local operations and classical communication. The conditions are: (i) invariance under local unitaries, (ii) invariance under adding a local ancilla in an arbitrary state, and (iii) on mixtures of states possessing local orthogonal flags the function is equal to its average. The result holds for multipartite systems. It is intriguing that the obtained conditions are equalities. The only inequality is hidden in the condition of convexity.
9

Smith, Philip, et Eleni Panagiotou. « The second Vassiliev measure of uniform random walks and polygons in confined space ». Journal of Physics A : Mathematical and Theoretical 55, no 9 (3 février 2022) : 095601. http://dx.doi.org/10.1088/1751-8121/ac4abf.

Full text
Abstract:
Biopolymers, like chromatin, are often confined in small volumes. Confinement has a great effect on polymer conformations, including polymer entanglement. Polymer chains and other filamentous structures can be represented by polygonal curves in three-space. In this manuscript, we examine the topological complexity of polygonal chains in three-space and in confinement as a function of their length. We model polygonal chains by equilateral random walks in three-space and by uniform random walks (URWs) in confinement. For the topological characterization, we use the second Vassiliev measure. This is an integer topological invariant for polygons and a continuous function over the real numbers, as a function of the chain coordinates, for open polygonal chains. For URWs in confined space, we prove that the average value of the Vassiliev measure in the space of configurations increases as O(n²) with the length of the walks or polygons. We verify this result numerically, and our numerical results also show that the mean value of the second Vassiliev measure of equilateral random walks in three-space increases as O(n). These results reveal the rate at which knotting of open curves, and not simply entanglement, is affected by confinement.
10

Biswas, George, Anindya Biswas et Ujjwal Sen. « Inhibition of spread of typical bipartite and genuine multiparty entanglement in response to disorder ». New Journal of Physics 23, no 11 (1 novembre 2021) : 113042. http://dx.doi.org/10.1088/1367-2630/ac37c8.

Full text
Abstract:
The distribution of entanglement of typical multiparty quantum states is not uniform over the range of the measure utilized for quantifying the entanglement. We intend to find the response to disorder in the state parameters on this non-uniformity for typical states. We find that the typical entanglement, averaged over the disorder, is taken farther away from uniformity, as quantified by decreased standard deviation, in comparison to the clean case. The feature is seemingly generic, as we see it for Gaussian and non-Gaussian disorder distributions, for varying strengths of the disorder, and for disorder insertions in one and several state parameters. The non-Gaussian distributions considered are uniform and Cauchy–Lorentz. Two- and three-qubit pure state Haar-uniform generations are considered for the typical state productions. We also consider noisy versions of the initial states produced in the Haar-uniform generations. A genuine multiparty entanglement monotone is considered for the three-qubit case, while concurrence is used to measure two-qubit entanglement.
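For the two-qubit part of the study above, entanglement is quantified by the concurrence. A minimal sketch of Wootters' formula C(ρ) = max(0, λ1 − λ2 − λ3 − λ4), where the λi are the decreasingly ordered square roots of the eigenvalues of ρ·ρ̃ with ρ̃ = (σy⊗σy)ρ*(σy⊗σy), is given below; the Bell state and the mixing parameter are illustrative choices, not values from the paper.

    import numpy as np

    SIGMA_Y = np.array([[0, -1j], [1j, 0]])
    YY = np.kron(SIGMA_Y, SIGMA_Y)

    def concurrence(rho):
        """Wootters concurrence of a two-qubit density matrix."""
        rho_tilde = YY @ rho.conj() @ YY
        evals = np.linalg.eigvals(rho @ rho_tilde)          # real and non-negative in theory
        lam = np.sort(np.sqrt(np.abs(evals.real)))[::-1]    # decreasing order
        return float(max(0.0, lam[0] - lam[1] - lam[2] - lam[3]))

    bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)
    rho_bell = np.outer(bell, bell.conj())
    print(concurrence(rho_bell))                            # 1.0 for a Bell state

    p = 0.5                                                 # arbitrary mixing with white noise
    rho_mixed = p * rho_bell + (1 - p) * np.eye(4) / 4
    print(concurrence(rho_mixed))                           # max(0, (3p - 1)/2) = 0.25 here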

Theses on the topic "Average measure of entanglement"

1

Amouzou, Grâce Dorcas Akpéné. « Etude de l’intrication par les polynômes de Mermin : application aux algorithmes quantiques ». Electronic Thesis or Diss., Bourgogne Franche-Comté, 2024. http://www.theses.fr/2024UBFCK063.

Full text
Abstract:
This thesis explores the measurement of entanglement in certain hypergraph states, in certain quantum algorithms such as the quantum phase estimation and counting algorithms, as well as in reactive-agent circuits, using the geometric measure of entanglement and tools from Mermin polynomials and coefficient matrices. Entanglement is a concept present in quantum physics that has no known equivalent to date in classical physics. The core of our research is based on the implementation of entanglement detection and measurement devices in order to study quantum states from the point of view of entanglement. With this in mind, calculations are first carried out numerically and then on a quantum simulator and a quantum computer. Indeed, three of the tools used can be implemented on a quantum machine, which allows us to compare theoretical and "real" results.
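The thesis abstract above relies on the geometric measure of entanglement, E_G(ψ) = 1 − max over product states |⟨φ|ψ⟩|². For a bipartite pure state this maximum is simply the square of the largest Schmidt coefficient, which gives the short sketch below; the hypergraph and algorithmic states studied in the thesis are multipartite and require a numerical optimization over product states instead, so this covers only the simplest special case.

    import numpy as np

    def geometric_measure_bipartite(state, dim_a, dim_b):
        """E_G = 1 - (largest Schmidt coefficient)^2 for a bipartite pure state."""
        m = state.reshape(dim_a, dim_b)             # coefficient matrix of the cut
        s = np.linalg.svd(m, compute_uv=False)      # Schmidt coefficients
        return float(1.0 - s[0] ** 2)

    bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)
    print(geometric_measure_bipartite(bell, 2, 2))      # 0.5 for a Bell pair

    product = np.kron([1.0, 0.0], [1.0, 0.0]).astype(complex)
    print(geometric_measure_bipartite(product, 2, 2))   # 0.0 for a product state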
2

Tunison, Robert. « Average Taxonomic Distinctness as a Measure of Global Biodiversity ». University of Cincinnati / OhioLINK, 2018. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1535635897703678.

Full text
3

Woldekristos, Habtom G. « Tripartite Entanglement in Quantum Open Systems ». Miami University / OhioLINK, 2009. http://rave.ohiolink.edu/etdc/view?acc_num=miami1250185666.

Full text
4

Lončar, Sanja. « Negative Selection - An Absolute Measure of Arbitrary Algorithmic Order Execution ». PhD thesis, Univerzitet u Novom Sadu, Prirodno-matematički fakultet u Novom Sadu, 2017. https://www.cris.uns.ac.rs/record.jsf?recordId=104861&source=NDLTD&language=en.

Full text
Abstract:
Algorithmic trading is an automated process of order execution on electronic stock markets. It can be applied to a broad range of financial instruments, and it is characterized by significant investor control over the execution of his/her orders, with the principal goal of finding the right balance between costs and the risk of not (fully) executing an order. As the measurement of execution performance gives information on whether best execution is achieved, a significant number of different benchmarks is used in practice. The most frequently used are price benchmarks, where some of them are determined before trading (pre-trade benchmarks), some during the trading day (intraday benchmarks), and some are determined after the trade (post-trade benchmarks). The two most dominant are VWAP and Arrival Price, which is, along with other pre-trade price benchmarks, known as the Implementation Shortfall (IS). We introduce Negative Selection as an a posteriori measure of execution algorithm performance. It is based on the concept of Optimal Placement, which represents the ideal order that could be executed in a given time window, where the notion of ideal means that it is an order with the best execution price considering market conditions during the time window. Negative Selection is defined as the difference between the vectors of optimal and executed orders, with the vectors defined as quantities of shares at specified price positions in the order book. It is equal to zero when the order is optimally executed; negative if the order is not (completely) filled, and positive if the order is executed but at an unfavorable price. Negative Selection is based on the idea of offering a new, alternative performance measure that will enable us to find the optimal trajectories and construct an optimal execution of an order. The first chapter of the thesis includes a list of notation and an overview of definitions and theorems that will be used further in the thesis. Chapters 2 and 3 follow with a theoretical overview of concepts related to market microstructure, basic information regarding benchmarks, and the theoretical background of algorithmic trading. Original results are presented in chapters 4 and 5. Chapter 4 includes the construction of the Optimal Placement and the definition and properties of Negative Selection. The results regarding the properties of Negative Selection are given in [35]. Chapter 5 contains the theoretical background for stochastic optimization, a model of optimal execution formulated as a stochastic optimization problem with regard to Negative Selection, as well as original work on a nonmonotone line search method [31], while numerical results are in the last, 6th chapter.
5

Klinkert, Rickard. « Uncertainty Analysis of Long Term Correction Methods for Annual Average Winds ». Thesis, Umeå universitet, Institutionen för fysik, 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-59690.

Full text
Abstract:
For the construction of a wind farm, one needs to assess the wind resources of the considered site location. Using reference time series from numerical weather prediction models, global assimilation databases or observations close to the area considered, the on-site measured wind speeds and wind directions are corrected in order to represent the actual long-term wind conditions. This long-term correction (LTC) is in the typical case performed by making use of the linear regression within the Measure-Correlate-Predict (MCP) method. This method and two other methods, Sector-Bin (SB) and Synthetic Time Series (ST), respectively, are used for the determination of the uncertainties that are associated with LTC.The test area that has been chosen in this work, is located in the region of the North Sea, using 22 quality controlled meteorological (met) station observations from offshore or nearby shore locations in Denmark, Norway and Sweden. The time series that has been used cover the eight year period from 2002 to 2009 and the year with the largest variability in the wind speeds, 2007, is used as the short-term measurement period. The long-term reference datasets that have been used are the Weather Research and Forecast model, based on both ECMWF Interim Re-Analysis (ERA-Interim) and National Centers for Environmental Prediction Final Analysis (NCEP/FNL), respectively and additional reference datasets of Modern Era Re-Analysis (MERRA) and QuikSCAT satellite observations. The long-term period for all of the reference datasets despite QuikSCAT, correspond to the one of stations observations. The QuikSCAT period of observations used cover the period from November 1st, 1999 until October 31st, 2009.The analysis is divided into three parts. Initially, the uncertainty connected to the corresponding reference dataset, when used in LTC method, is investigated. Thereafter the uncertainty due to the concurrent length of the on-site measurements and reference dataset is analyzed. Finally, the uncertainty is approached using a re-sampling method of the Non-Parametric Bootstrap. The uncertainty of the LTC method SB, for a fixed concurrent length of the datasets is assessed by this methodology, in an effort to create a generic model for the estimation of uncertainty in the predicted values for SB.The results show that LTC with WRF model datasets based on NCEP/FNL and ERA-Interim, respectively, is slightly different, but does not deviate considerably in comparison when comparing with met station observations. The results also suggest the use of MERRA reference dataset in connection with long-term correction methods. However, the datasets of QuikSCAT does not provide much information regarding the overall quality of long-term correction, and a different approach than using station coordinates for the withdrawal of QuikSCAT time series is preferred. Additionally, the LTC model of Sector-Bin is found to be robust against variation in the correlation coefficient between the concurrent datasets. For the uncertainty dependence of concurrent time, the results show that an on-site measurement period of one consistent year or more, gives the lowest uncertainties compared to measurements of shorter time. An additional observation is that the standard deviation of long-term corrected means decreases with concurrent time. Despite the efforts of using the re-sampling method of Non-Parametric Bootstrap the estimation of the uncertainties is not fully determined. 
However, it does give promising results that are suggested for investigation in further work.
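The long-term correction discussed in this thesis rests, in its basic MCP form, on regressing the short concurrent on-site record on the reference record and applying the fitted relation to the full reference period. The sketch below shows only that core step on synthetic data; the synthetic series, the one-year concurrent window, and the plain least-squares fit are assumptions for illustration, and they ignore the sector-wise and bin-wise refinements (such as the Sector-Bin method) analysed in the thesis.

    import numpy as np

    rng = np.random.default_rng(1)

    # Hypothetical data: a long daily reference wind-speed record and a short
    # concurrent on-site record that is linearly related to it plus noise.
    reference = rng.weibull(2.0, size=8 * 365) * 8.0        # eight "years" of daily means
    concurrent = slice(0, 365)                              # one year of overlap
    onsite_measured = 0.9 * reference[concurrent] + 1.2 + rng.normal(0.0, 0.8, 365)

    # Measure-Correlate-Predict with ordinary least-squares linear regression.
    slope, intercept = np.polyfit(reference[concurrent], onsite_measured, deg=1)
    onsite_longterm = slope * reference + intercept

    print(f"short-term on-site mean : {onsite_measured.mean():.2f} m/s")
    print(f"long-term corrected mean: {onsite_longterm.mean():.2f} m/s")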
6

Teng, Peiyuan. « Tensor network and neural network methods in physical systems ». The Ohio State University, 2018. http://rave.ohiolink.edu/etdc/view?acc_num=osu1524836522115804.

Full text
7

Han, Yong. « Some problems about SLE ». Thesis, Orléans, 2017. http://www.theses.fr/2017ORLE2007/document.

Full text
Abstract:
This thesis focuses on three topics related to the SLE(k) processes. The first part is about the dipolar SLE(k) process and the conformal restriction measure on the strip; the second part is about the connectivity property of the Brownian loop measure; and the third part is about the generalized integral means spectrum of the inner whole-plane Loewner processes driven by a Lévy process.
8

Albouy, Olivier. « Discrete algebra and geometry applied to the Pauli group and mutually unbiased bases in quantum information theory ». Phd thesis, Université Claude Bernard - Lyon I, 2009. http://tel.archives-ouvertes.fr/tel-00612229.

Full text
Abstract:
For d not a power of a prime number, the maximal number of pairwise unbiased bases of a d-dimensional Hilbert space is still unknown. In this thesis, we begin by giving a construction of unbiased bases linked to a family of irreducible representations of the Lie algebra su(2) and making use of Gauss sums. We then study systematically the possibility of constructing such bases by means of Pauli operators. (1) The study of the projective line over Zdm shows that, in order to obtain maximal sets of unbiased bases from Pauli operators, it is necessary to consider tensor products of these operators. (2) The Lagrangian submodules of Zd2n, of which we give a complete classification, account for the maximally commuting sets of Pauli operators. This classification makes it possible to know which of these sets are likely to yield unbiased bases: they correspond to Lagrangian half-modules, which can also be interpreted as the isotropic points of the projective line (P(Mat(n, Zd)²), ω). We then make explicit an isomorphism between the unbiased bases obtained in this way and distant Lagrangian half-modules, which also clarifies the correspondence between Gauss sums and unbiased bases. (3) Corollaries on the Clifford group and the discrete phase space are then developed. Finally, we present some tools inspired by the preceding study. We discuss the cross-ratio on the Bloch sphere, projective geometry in higher dimension, and continuous Pauli operators, and we compare the von Neumann entropy with a measure of entanglement obtained by computing a determinant.
9

Pagliarani, Stefano. « Portfolio optimization and option pricing under defaultable Lévy driven models ». Doctoral thesis, Università degli studi di Padova, 2014. http://hdl.handle.net/11577/3423519.

Full text
Abstract:
In this thesis we study some portfolio optimization and option pricing problems in market models where the dynamics of one or more risky assets are driven by Lévy processes, and it is divided in four independent parts. In the first part we study the portfolio optimization problem, for the logarithmic terminal utility and the logarithmic consumption utility, in a multi-defaultable Lévy driven model. In the second part we introduce a novel technique to price European defaultable claims when the pre-defaultable dynamics of the underlying asset follows an exponential Lévy process. In the third part we develop a novel methodology to obtain analytical expansions for the prices of European derivatives, under stochastic and/or local volatility models driven by Lévy processes, by analytically expanding the integro-differential operator associated to the pricing problem. In the fourth part we present an extension of the latter technique which allows for obtaining analytical expansion in option pricing when dealing with path-dependent Asian-style derivatives.
10

Stevenson, Robin. « Generating quantum resources through measurement and control ». Phd thesis, 2013. http://hdl.handle.net/1885/150657.

Full text
Abstract:
The quantum resources of entanglement and single photons are key to a range of quantum applications. These resources are challenging to produce cleanly and efficiently. This thesis investigates and improves methods of producing entanglement and single photons using measurement and control. To robustly produce entanglement, this thesis extends work by Carvalho et al., who developed a method for creating an entangled state of two atoms. This is done by coupling the atoms to a damped cavity, and using measurement of the output of the cavity to trigger a feedback pulse on the atoms. The robustness of this scheme against imperfect localisation of the two atoms is tested in this thesis, and limits are placed on the size of a trap that will still allow an entangled state to be produced. Additionally, using three-level (Lambda configuration) atoms slows the rate at which the system evolves, allowing more time for measurement and feedback to be applied, though it does not counteract the influence of spontaneous emission. In extending this work to multiple atoms we found that the system rapidly becomes more complex, and we developed a strategy for choosing a feedback pulse that stabilises a specific, highly entangled steady state of multiple particles. In addition to this general strategy, we developed a local and separable feedback for generating entanglement in a four-partite system. This thesis also investigates the production of single photons through the rephased amplified spontaneous emission (RASE) protocol. This protocol offers the promise of a stream of precisely shaped single photons, though it is plagued by efficiency issues. We develop a model of RASE and show that it overcomes the trade-off between efficiency and photon spacing by using different optical depths for the amplified spontaneous emission (ASE) preparation of the ensemble, and the RASE emission of single photons. By tailoring the density profile of the ensemble to mode-match the RASE emission of a single photon, we are also able to eliminate reabsorption of the emitted photon, which would otherwise have limited the efficiency to 70%.

Books on the topic "Average measure of entanglement"

1

Everett, J. R. A report on experiments to measure average fibre diameters by optical fourier analysis. Melbourne : Commonwealth Scientific and Industrial Research Organization, 1986.

Find full text
2

Lei, Pui-Wa. Alternatives to the grade point average as a measure of academic achievement in college. Iowa City, Iowa : ACT, Inc., 2001.

Find full text
3

Lei, Pui-Wa. Alternatives to the grade point average as a measure of academic achievement in college. Iowa City, Iowa : ACT, Inc., 2001.

Find full text
4

Prestbo, John A., dir. The Market's Measure : An Illustrated History of America Told Through the Dow Jones Industrial Average. Dow Jones & Company, Inc., 1999.

Find full text
5

Van Olmen, Daniël, et Johan Van Der Auwera. Modality and Mood in Standard Average European. Sous la direction de Jan Nuyts et Johan Van Der Auwera. Oxford University Press, 2015. http://dx.doi.org/10.1093/oxfordhb/9780199591435.013.11.

Full text
Abstract:
The chapter discusses the research on the features of the mood and modality systems of European languages that stand a chance of being due to some measure of the areal convergence captured with the term “Standard Average European.” These features are: (i) the compositional nature of the prohibitive, (ii) the number of non-indicative non-imperative moods, (iii) the relation between canonical and non-canonical imperatives, (iv) the use of word order for the interrogative, the (v) multifunctionality, (vi) verbiness, and (vii) grammaticalization of modal markers. While all of these characterize European languages, only features (i), (v), (vi), and (vii) are potential Standard Average European features.
6

Beenakker, Carlo W. J. Extreme eigenvalues of Wishart matrices : application to entangled bipartite system. Sous la direction de Gernot Akemann, Jinho Baik et Philippe Di Francesco. Oxford University Press, 2018. http://dx.doi.org/10.1093/oxfordhb/9780198744191.013.37.

Full text
Abstract:
This article describes the application of random matrix theory (RMT) to the estimation of the bipartite entanglement of a quantum system, with particular emphasis on the extreme eigenvalues of Wishart matrices. It first provides an overview of some spectral properties of unconstrained Wishart matrices before introducing the problem of the random pure state of an entangled quantum bipartite system consisting of two subsystems whose Hilbert spaces have dimensions M and N respectively with N ≤ M. The focus is on the smallest eigenvalue which serves as an important measure of entanglement between the two subsystems. The minimum eigenvalue distribution for quadratic matrices is also considered. The article shows that the N eigenvalues of the reduced density matrix of the smaller subsystem are distributed exactly as the eigenvalues of a Wishart matrix, except that the eigenvalues satisfy a global constraint: the trace is fixed to be unity.
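The chapter's starting point is that the reduced density matrix of a random pure state of an N × M bipartite system has eigenvalues distributed like those of a fixed-trace Wishart matrix, with the smallest eigenvalue serving as an entanglement indicator. The sketch below samples such reduced density matrices numerically; the dimensions, the sample size, and the use of complex Gaussian entries to generate the random pure states are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(42)
    N, M = 4, 16                                   # subsystem dimensions, N <= M

    def reduced_spectrum():
        """Eigenvalues (ascending) of the reduced density matrix of a random pure state."""
        g = rng.normal(size=(N, M)) + 1j * rng.normal(size=(N, M))
        g /= np.linalg.norm(g)                     # normalize the pure state
        rho = g @ g.conj().T                       # reduced density matrix, trace 1
        return np.linalg.eigvalsh(rho)

    smallest = [reduced_spectrum()[0] for _ in range(2000)]
    print(f"mean smallest eigenvalue over samples: {np.mean(smallest):.4f}")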
7

Kaboudan, Mak. Computational Spatiotemporal Modeling of Southern California Home Prices. Sous la direction de Shu-Heng Chen, Mak Kaboudan et Ye-Rong Du. Oxford University Press, 2018. http://dx.doi.org/10.1093/oxfordhb/9780199844371.013.13.

Full text
Abstract:
Average quarterly price changes in six contiguous southern California cities are obtained and used, first, to determine if price changes in contiguous cities are spatiotemporally contagious, then to forecast each city’s average prices for four quarters (or one year, 2014). In order to capture the contagious effects, a spatiotemporal contagion response measure is proposed and computed. The measure quantifies the responsiveness of residential home-price changes in one location (or city) to lagged price changes in another location. Average home characteristics (such as square footage and number of bedrooms), as well as lagged average quarterly mortgage rates, lagged average quarterly unemployment rates, and lagged average quarterly price changes of all locations, are input variables used to estimate the response measures and produce price forecasts. Models and forecasts are obtained first using genetic programming then compared to outcomes obtained using linear regressions.
8

Schreyer, Paul. GDP. Sous la direction de Matthew D. Adler et Marc Fleurbaey. Oxford University Press, 2016. http://dx.doi.org/10.1093/oxfordhb/9780199325818.013.3.

Full text
Abstract:
GDP is first and foremost a measure of economic activity and production, and distinct from a measure of welfare, even when narrowly defined as material well-being. Despite this difference, GDP and welfare are not unrelated concepts. Links include the scope of final products that enter GDP and the welfare basis of price indices that are used to compute real GDP. Further, the national accounts systematically link GDP with household consumption and income, the key determinants of average material well-being. Last, in measures of intertemporal social welfare, GDP appears through the need to account for future changes in productivity. This chapter also describes efforts to adjust GDP to gauge welfare more directly but concludes that they have not gained traction because there is no well-articulated theory that would indicate the scope and nature of the required adjustments and because GDP is tremendously useful as a measure of production that needs complementing but not substituting.
9

Gupta, Rajesh. Randomized controlled trial evidence for gabapentin in post-herpetic neuralgia. Sous la direction de Paul Farquhar-Smith, Pierre Beaulieu et Sian Jagger. Oxford University Press, 2018. http://dx.doi.org/10.1093/med/9780198834359.003.0069.

Full text
Abstract:
The landmark paper discussed in this chapter is ‘Gabapentin for the treatment of postherpetic neuralgia: A randomized controlled trial’, published by Rowbotham et al. in 1998. In the study, a 4-week initial period of titration of gabapentin (up to a maximum of 3,600 mg) or matching placebo was given, followed by a further 4-week period at the maximum tolerated dose. The primary efficacy measure was change in average daily pain score from start to finish of the treatment, and secondary measures observed were the average daily sleep score, a short-form McGill Pain Questionnaire, the subject’s global impression of change, the investigator-rated clinical global impression of change, the Short Form 36, a quality-of-life questionnaire, and a Profile of Mood States questionnaire. Subjects receiving gabapentin had significant reduction in daily pain scores as well as improvement in secondary measures of pain, although with an increased incidence of side effects.
10

Sorrentino, Alfonso. Action-Minimizing Invariant Measures for Tonelli Lagrangians. Princeton University Press, 2017. http://dx.doi.org/10.23943/princeton/9780691164502.003.0003.

Full text
Abstract:
This chapter discusses the notion of action-minimizing measures, recalling the needed measure-theoretical material. In particular, this allows the definition of a first family of invariant sets, the so-called Mather sets. It discusses their main dynamical and symplectic properties, and introduces the minimal average actions, sometimes called Mather's α- and β-functions. A thorough discussion of their properties (differentiability, strict convexity or lack thereof) is provided and related to the dynamical and structural properties of the Mather sets. The chapter also describes these concepts in a concrete physical example: the simple pendulum.

Book chapters on the topic "Average measure of entanglement"

1

Weik, Martin H. « average measure of information content ». Dans Computer Science and Communications Dictionary, 92. Boston, MA : Springer US, 2000. http://dx.doi.org/10.1007/1-4020-0613-6_1200.

Full text
2

Franco, Federica, Alessia Amelio et Sergio Greco. « 3D Average Common Submatrix Measure ». Dans Digital Libraries : The Era of Big Data and Data Science, 26–32. Cham : Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-39905-4_4.

Full text
3

Della Mea, Vincenzo, Gianluca Demartini, Luca Di Gaspero et Stefano Mizzaro. « Experiments on Average Distance Measure ». Dans Lecture Notes in Computer Science, 492–95. Berlin, Heidelberg : Springer Berlin Heidelberg, 2006. http://dx.doi.org/10.1007/11735106_49.

Full text
4

Amelio, Alessia, et Clara Pizzuti. « Average Common Submatrix : A New Image Distance Measure ». Dans Image Analysis and Processing – ICIAP 2013, 170–80. Berlin, Heidelberg : Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-41181-6_18.

Full text
5

Kwok, Rex, Abhaya Nayak et Norman Foo. « Coherence measure based on average use of formulas ». Dans PRICAI’98 : Topics in Artificial Intelligence, 553–64. Berlin, Heidelberg : Springer Berlin Heidelberg, 1998. http://dx.doi.org/10.1007/bfb0095300.

Full text
6

Nowostawski, Mariusz, et Andrzej Gecow. « Identity Criterion for Living Objects Based on the Entanglement Measure ». Dans Semantic Methods for Knowledge Management and Communication, 159–69. Berlin, Heidelberg : Springer Berlin Heidelberg, 2011. http://dx.doi.org/10.1007/978-3-642-23418-7_15.

Full text
7

Peleg, David, et Uri Pincas. « The Average Hop Count Measure For Virtual Path Layouts ». Dans Lecture Notes in Computer Science, 255–69. Berlin, Heidelberg : Springer Berlin Heidelberg, 2001. http://dx.doi.org/10.1007/3-540-45414-4_18.

Full text
8

Nguyen, Loan Thi Thuy, Trinh D. D. Nguyen, Anh Nguyen, Phuoc-Nghia Tran, Cuong Trinh, Bao Huynh et Bay Vo. « Efficient Method for Mining High-Utility Itemsets Using High-Average Utility Measure ». Dans Computational Collective Intelligence, 305–15. Cham : Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-63007-2_24.

Full text
9

Matsuo, Yutaka, Yukio Ohsawa et Mitsuru Ishizuka. « Average-Clicks : A New Measure of Distance on the World Wide Web ». Dans Web Intelligence : Research and Development, 106–14. Berlin, Heidelberg : Springer Berlin Heidelberg, 2001. http://dx.doi.org/10.1007/3-540-45490-x_11.

Full text
10

Flandoli, Franco, Andrea Papini et Marco Rehmeier. « Average Dissipation for Stochastic Transport Equations with Lévy Noise ». Dans Mathematics of Planet Earth, 45–59. Cham : Springer Nature Switzerland, 2024. http://dx.doi.org/10.1007/978-3-031-70660-8_3.

Full text
Abstract:
We show that, in one spatial and arbitrary jump dimension, the averaged solution of a Marcus-type SPDE with pure jump Lévy transport noise satisfies a dissipative deterministic equation involving a fractional Laplace-type operator. To this end, we identify the correct associated Lévy measure for the driving noise. We consider this a first step in the direction of a non-local version of enhanced dissipation, a phenomenon recently proven to occur for Brownian transport noise and the associated local parabolic PDE by the first author. Moreover, we present numerical simulations, supporting the fact that dissipation occurs for the averaged solution, with a behavior akin to the diffusion due to a fractional Laplacian, but not in a pathwise sense.

Conference papers on the topic "Average measure of entanglement"

1

Seshadri, Suparna, Karthik V. Myilswamy, Zhao-Hui Ma, Yu-Ping Huang et Andrew M. Weiner. « Measuring frequency-bin entanglement from a quasi-phase-matched lithium niobate microring ». Dans CLEO : Fundamental Science, FTu4F.3. Washington, D.C. : Optica Publishing Group, 2024. http://dx.doi.org/10.1364/cleo_fs.2024.ftu4f.3.

Full text
Abstract:
We employ phase modulation to measure the phase coherence between 31.75 GHz-spaced frequency bins in a biphoton frequency comb generated from an integrated quasi-phase-matched thin-film lithium niobate microresonator.
2

Wang, Po-Han, et Ray-Kuang Lee. « Characterization of thermal degraded optical cat states by non-Gaussianity ». Dans Quantum 2.0, QW3A.6. Washington, D.C. : Optica Publishing Group, 2024. http://dx.doi.org/10.1364/quantum.2024.qw3a.6.

Full text
Abstract:
We apply the non-Gaussianity measure to degraded optical states, based on the Hilbert-Schmidt distance to thermal squeezed states, and reveal a counter-intuitive increase of non-Gaussianity when the coupled thermal states have a low average photon number.
3

Wang, Po-Han, et Ray-Kuang Lee. « Characterization of thermal degraded optical cat states by non-Gaussianity ». Dans CLEO : Applications and Technology, JW2A.153. Washington, D.C. : Optica Publishing Group, 2024. http://dx.doi.org/10.1364/cleo_at.2024.jw2a.153.

Full text
Abstract:
We apply the non-Gaussianity measure to degraded optical states, based on the Hilbert-Schmidt distance to thermal squeezed states, and reveal a counter-intuitive increase of non-Gaussianity when the coupled thermal states have a low average photon number.
4

Beigi, Salman. « Maximal entanglement — A new measure of entanglement ». Dans 2014 Iran Workshop on Communication and Information Theory (IWCIT). IEEE, 2014. http://dx.doi.org/10.1109/iwcit.2014.6842486.

Full text
5

Jack, B., J. Leach, J. Romero, S. Franke-Arnold, S. M. Barnett et M. J. Padgett. « Spatial Light Modulators to Measure Entanglement Between Spatial States ». Dans Frontiers in Optics. Washington, D.C. : OSA, 2009. http://dx.doi.org/10.1364/fio.2009.jtub4.

Full text
6

Lo Franco, Rosario, Farzam Nosrati, Alessia Castellini et Giuseppe Compagno. « Robust entanglement preparation through spatial indistinguishability quantified by entropic measure ». Dans Entropy 2021 : The Scientific Tool of the 21st Century. Basel, Switzerland : MDPI, 2021. http://dx.doi.org/10.3390/entropy2021-09872.

Full text
7

Tajima, Hiroyasu. « A New Second Law of Information Thermodynamics Using Entanglement Measure ». Dans Proceedings of the 12th Asia Pacific Physics Conference (APPC12). Journal of the Physical Society of Japan, 2014. http://dx.doi.org/10.7566/jpscp.1.012129.

Full text
8

Życzkowski, Karol. « Geometry of Quantum Entanglement ». Dans Workshop on Entanglement and Quantum Decoherence. Washington, D.C. : Optica Publishing Group, 2008. http://dx.doi.org/10.1364/weqd.2008.embs3.

Full text
Abstract:
A geometric approach to the investigation of quantum entanglement is advocated. Analyzing the space of pure states of a bipartite system, we show how an entanglement measure of a given state can be related to its distance to the closest separable state. We study the geometry of the (N² − 1)-dimensional convex body of mixed quantum states acting on an N-dimensional Hilbert space and demonstrate that it belongs to the class of sets of a constant height. The same property characterizes the set of all separable states of a two-qubit system. These results contribute to our understanding of quantum entanglement and its dynamics.
9

Jian, Zhang, et Zhou Jin. « Surface Roughness Measure Based on Average Texture Cycle ». Dans 2010 2nd International Conference on Intelligent Human-Machine Systems and Cybernetics (IHMSC). IEEE, 2010. http://dx.doi.org/10.1109/ihmsc.2010.174.

Full text
10

Sutcliffe, Evan, Matty J. Hoban et Alejandra Beghelli. « Multi-User Entanglement Routing for Quantum Mesh Networks ». Dans Optical Fiber Communication Conference. Washington, D.C. : Optica Publishing Group, 2023. http://dx.doi.org/10.1364/ofc.2023.w4k.2.

Full text
Abstract:
Multipath routing for sharing entanglement between multiple devices was simulated on real network topologies and compared against single-path routing. Results show improved entanglement rates, especially for topologies with a higher average nodal degree.

Reports by organizations on the topic "Average measure of entanglement"

1

Muñoz, Ercio, et Mariel Siravegna. When Measure Matters : Coresidence Bias and Intergenerational Mobility Revisited. Inter-American Development Bank, mai 2023. http://dx.doi.org/10.18235/0004881.

Full text
Abstract:
We provide novel evidence of the impact of coresidence bias on a large set of indicators of intergenerational mobility in education. We begin by re-examining a recent claim that the correlation coefficient is less biased than the regression coefficient. Then, we expand our analysis to show that there are indicators with varying average levels of coresidence bias, going from less than 1% to more than 10%. However, some indicators with minimal bias produce high levels of re-ranking that make them uninformative for ranking populations by the level of mobility. In contrast, other indicators with large bias generate more reliable rankings.
2

Allik, Mirjam, Dandara Ramos, Marilyn Agranonik, Elzo Pereira Pinto Junior, Maria Yury Ichihara, Mauricio Barreto, Alastair Leyland et Ruth Dundas. Developing a Small-Area Deprivation Measure for Brazil. University of Glasgow, mai 2020. http://dx.doi.org/10.36399/gla.pubs.215898.

Full text
Abstract:
This report describes the development of the BrazDep small-area deprivation measure for the whole of Brazil. The measure uses the 2010 Brazilian Population Census data and is calculated for the smallest possible geographical area level, the census sectors. It combines three variables – (1) percent of households with per capita income ≤ 1/2 minimum wage; (2) percent of people not literate, aged 7+; and (3) average of percent of people with inadequate access to sewage, water, garbage collection and no toilet and bath/shower – into a single measure. Similar measures have previously been developed at the census sector level for some states or municipalities, but the deprivation measure described in this report is the first one to be provided for census sectors for the whole of Brazil. BrazDep is a measure of relative deprivation, placing the census sectors on a scale of material well-being from the least to the most deprived. It is useful in comparing areas within Brazil in 2010, but cannot be used to make comparisons across countries or time. Categorical versions of the measure are also provided, placing census sectors into groups of similar levels of deprivation. Deprivation measures, such as the one developed here, have been developed for many countries and are popular tools in public health research for describing the social patterning of health outcomes and supporting the targeting and delivery of services to areas of higher need. The deprivation measure is exponentially distributed, with a large proportion of areas having a low deprivation score and a smaller number of areas experiencing very high deprivation. There is significant regional variation in deprivation; areas in the North and Northeast of Brazil have on average much higher deprivation compared to the South and Southeast. Deprivation levels in the Central-West region fall between those for the North and South. Differences are also great between urban and rural areas, with the former having lower levels of deprivation compared to the latter. The measure was validated by comparing it to other similar indices measuring health and social vulnerability at the census sector level in states and municipalities where it was possible, and at the municipal level for across the whole of Brazil. At the municipal level the deprivation measure was also compared to health outcomes. The different validation exercises showed that the developed measure produced expected results and could be considered validated. As the measure is an estimate of the “true” deprivation in Brazil, uncertainty exists about the exact level of deprivation for all of the areas. For the majority of census sectors the uncertainty is small enough that we can reliably place the area into a deprivation category. However, for some areas uncertainty is very high and the provided estimate is unreliable. These considerations should always be kept in mind when using the BrazDep measure in research or policy. The measure should be used as part of a toolkit, rather than a single basis for decision-making. The data together with documentation is available from the University of Glasgow http: //dx.doi.org/10.5525/gla.researchdata.980. The data and this report are distributed under Creative Commons Share-Alike license (CC BY-SA 4.0) and can be freely used by researchers, policy makers or members of public.
APA, Harvard, Vancouver, ISO, etc. styles
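The report combines three census-sector indicators into a single relative deprivation score. The following is a minimal sketch of one common way to build such a composite (standardize each indicator and average the z-scores, then cut into quintiles); the transformation and weighting actually used for BrazDep may differ, and the column names are hypothetical.

    import pandas as pd

    def deprivation_score(df):
        """Combine three deprivation indicators into one relative score.
        Assumes hypothetical columns 'pct_low_income', 'pct_illiterate' and
        'pct_inadequate_services', one row per census sector."""
        indicators = ["pct_low_income", "pct_illiterate", "pct_inadequate_services"]
        # Standardize so each indicator contributes on a comparable scale.
        z = (df[indicators] - df[indicators].mean()) / df[indicators].std()
        # Equal-weight average of the z-scores; higher means more deprived.
        score = z.mean(axis=1)
        # Quintile categories: 1 = least deprived, 5 = most deprived.
        category = pd.qcut(score, 5, labels=[1, 2, 3, 4, 5])
        return score, category

Such a score only ranks sectors relative to one another, which mirrors the report's caveat that the measure supports comparisons within Brazil in 2010 but not across countries or over time.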
3

Bürgi, Constantin, and Nisan Gorgulu. Spatial Closeness of Population and Economic Growth. Copenhagen School of Energy Infrastructure, 2022. http://dx.doi.org/10.22439/csei.pb.014.

Full text
Abstract:
We examine the spatial dimension of economic growth: specifically, we assess whether areas where people live closer together experience faster growth. Traditional measures such as population density or urbanization are not optimal, as the former is affected by large uninhabited areas and the latter is capped. We therefore introduce a new measure, Spatial Population Concentration (SPC), which captures how many people live on average within a given radius of every person in a geographic area. This allows for a more accurate measurement of population concentration than traditional measures, as it does not share their shortcomings. We then show for U.S. counties that areas with a high spatial population concentration experience faster growth: counties with a low value of the SPC measure in 1990 experienced substantially lower GDP growth over the following 25 years. (An illustrative sketch of the SPC computation follows this entry.)
APA, Harvard, Vancouver, ISO, etc. styles
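The SPC measure described above is, in effect, the population-weighted average number of people living within a fixed radius of each person. A brute-force sketch under the assumption that population is supplied as points (for example, grid-cell centroids) with coordinates in kilometres is shown below; it is an illustration, not the authors' implementation.

    import numpy as np

    def spc(coords_km, pop, radius_km=5.0):
        """Average number of people living within `radius_km` of each person.
        coords_km: (n, 2) array of point coordinates in km;
        pop: (n,) array of the population at each point."""
        coords_km = np.asarray(coords_km, dtype=float)
        pop = np.asarray(pop, dtype=float)
        # Pairwise distances between population points (O(n^2), fine for a sketch).
        d = np.linalg.norm(coords_km[:, None, :] - coords_km[None, :, :], axis=2)
        within = d <= radius_km
        # People reachable from each point (the point's own population included).
        reachable = within @ pop
        # Average over persons, i.e. weight each point by its population.
        return float(np.average(reachable, weights=pop))

Unlike population density, the result is not distorted by large uninhabited areas inside an administrative boundary, which is the shortcoming the authors highlight.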
4

Meyer, Erik. Night skies data report: Photometric assessment of night sky quality at Haleakalā National Park. National Park Service, 2024. http://dx.doi.org/10.36967/2305714.

Full text
Abstract:
This report characterizes night sky conditions in Haleakalā National Park (HALE) using measurements made in the park unit and models of regional conditions based on satellite data. Calibrated night sky imagery was obtained to characterize the night sky at two sites; these ground-based observations were collected on three nights from 2012-06-12 to 2012-06-14. Satellite data from 2022 were used to create a map of predicted night sky conditions in and around the park. The sky overhead remains pristine, with an average zenith brightness of 21.78 mag/arcsec². We estimate that more than 98% of stars were still visible most of the time, providing an outstanding opportunity to observe the natural night sky from the park. The whole sky over HALE is 11%–18% (mean = 14%) brighter than average natural levels, indicating excellent dark sky conditions on average. Based on the visibility of astronomical objects, we classified the sky over HALE as Bortle Class 3 (rural sky). The average naked eye limiting magnitude (NELM) is 6.7, which approaches pristine under average conditions. Our Sky Quality Meter (SQM) measurements average 21.74 mag/arcsec², indicating that the zenith is darker than what can be accurately measured with an SQM. Some indication of light pollution is evident along the horizon: clouds may appear faintly illuminated in the brightest parts of the sky near the horizon but are dark overhead, and the Milky Way still appears complex. The main impacts on HALE's night sky quality were the light domes from Kula, Kihei, Pukalani, Wailea and Kahului, which were observed along the horizon, although none exceeded the natural brightness of the Milky Way at the time of the study. (A short worked conversion between mag/arcsec² and percent above natural brightness follows this entry.)
APA, Harvard, Vancouver, ISO, etc. styles
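The abstract reports brightness both in mag/arcsec² and as a percentage above natural levels; the two are linked by the standard astronomical magnitude relation, a factor of 10^(0.4·Δm) for a difference of Δm magnitudes. The short sketch below illustrates the conversion for round numbers close to those in the abstract; it is not taken from the report.

    import math

    def percent_brighter(delta_mag):
        """Percent above natural brightness for a sky `delta_mag` magnitudes
        brighter (i.e. numerically smaller in mag/arcsec^2)."""
        return (10 ** (0.4 * delta_mag) - 1.0) * 100.0

    def delta_mag_for(percent):
        """Magnitude difference corresponding to a given percent excess."""
        return 2.5 * math.log10(1.0 + percent / 100.0)

    # A sky about 14% brighter than natural corresponds to roughly
    # 0.14 mag/arcsec^2 of extra brightness, and vice versa.
    print(round(delta_mag_for(14.0), 2))     # ~0.14
    print(round(percent_brighter(0.14), 1))  # ~13.8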
5

Bacharach, Miguel, and William J. Vaughan. Household Water Demand Estimation. Inter-American Development Bank, March 1994. http://dx.doi.org/10.18235/0011616.

Full text
Abstract:
This working paper addresses two issues: (1) whether there is a simultaneous-equations problem when estimating water demand under multipart rate schedules and, if so, what techniques should be used to correct for it; and (2) whether average or marginal price is the relevant measure for estimating the demand function. It also raises the issue of sample selection bias in rate schedules that combine a fixed charge for consumption below a threshold with a block-rate tariff per unit consumed above it. The study uses data from a 1987 sample of 685 families in 34 localities in rural Argentina and compares the results obtained with different estimation techniques. (A generic two-stage least squares sketch follows this entry.)
APA, Harvard, Vancouver, ISO, etc. styles
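The simultaneity problem flagged in the abstract arises because, under a multipart tariff, the marginal or average price a household faces depends on how much it consumes. One standard correction is instrumental variables estimation; the sketch below is a generic two-stage least squares routine in NumPy, with the rate-schedule variables used as instruments left hypothetical. It is an illustration, not the estimator chosen in the working paper.

    import numpy as np

    def two_stage_least_squares(y, X_endog, X_exog, Z):
        """Generic 2SLS for y = [X_endog, X_exog] @ beta + e, where X_endog
        (e.g. marginal price) is endogenous and Z holds excluded instruments
        (e.g. tariff parameters). X_endog, X_exog and Z are 2-D arrays with
        one row per household; y is the consumption vector."""
        n = y.shape[0]
        const = np.ones((n, 1))
        instruments = np.hstack([const, X_exog, Z])
        # Stage 1: project the endogenous price variable(s) on the instruments.
        gamma, *_ = np.linalg.lstsq(instruments, X_endog, rcond=None)
        X_endog_hat = instruments @ gamma
        # Stage 2: regress consumption on fitted price(s) and exogenous regressors.
        X2 = np.hstack([const, X_endog_hat, X_exog])
        beta, *_ = np.linalg.lstsq(X2, y, rcond=None)
        return beta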
6

Monsalve, Emma, Alma Romero and António Afonso. Public Sector Efficiency: Evidence for Latin America. Inter-American Development Bank, March 2013. http://dx.doi.org/10.18235/0006949.

Full text
Abstract:
We compute Public Sector Performance (PSP) and Public Sector Efficiency (PSE) indicators and Data Envelopment Analysis (DEA) efficiency scores for a sample of twenty-three Latin American and Caribbean countries (LAC) to measure the efficiency of public spending over the period 2001-2010. Our results show that PSE is inversely correlated with the size of government, while the efficiency frontier is essentially defined by Chile, Guatemala and Peru. Moreover, on average, output quantities could theoretically be increased proportionally by 19 percent with the same level of inputs. In addition, a Tobit analysis suggests that greater transparency and regulatory quality improve the efficiency scores, while greater transparency and control of corruption increase output-oriented efficiency. (An illustrative DEA sketch follows this entry.)
APA, Harvard, Vancouver, ISO, etc. styles
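DEA scores of the kind computed in the paper are obtained by solving one small linear program per country. The sketch below implements a basic input-oriented, constant-returns-to-scale (CCR) envelopment model with SciPy; it is a generic illustration with arbitrary input/output matrices, not the paper's exact PSP/PSE specification.

    import numpy as np
    from scipy.optimize import linprog

    def dea_ccr_input(X, Y):
        """Input-oriented CCR efficiency scores.
        X: (n_dmus, n_inputs), Y: (n_dmus, n_outputs)."""
        X, Y = np.asarray(X, float), np.asarray(Y, float)
        n, m = X.shape
        s = Y.shape[1]
        scores = []
        for o in range(n):
            # Decision variables: [theta, lambda_1, ..., lambda_n].
            c = np.r_[1.0, np.zeros(n)]
            # Inputs:  sum_j lambda_j * x_ij <= theta * x_io
            A_in = np.hstack([-X[o].reshape(m, 1), X.T])
            # Outputs: sum_j lambda_j * y_rj >= y_ro
            A_out = np.hstack([np.zeros((s, 1)), -Y.T])
            res = linprog(c,
                          A_ub=np.vstack([A_in, A_out]),
                          b_ub=np.r_[np.zeros(m), -Y[o]],
                          bounds=[(0, None)] * (1 + n),
                          method="highs")
            scores.append(res.x[0])
        return np.array(scores)

In this formulation a score of 1 places a unit on the efficiency frontier, while a score below 1 is the factor by which its inputs could in principle be scaled down for the same outputs; the paper reports the analogous output-oriented figure of roughly 19 percent potential output expansion.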
7

Adsit, Sarah E., Theodora Konstantinou, Konstantina Gkritza and Jon D. Fricker. Public Acceptance of INDOT’s Traffic Engineering Treatments and Services. Purdue University, 2021. http://dx.doi.org/10.5703/1288284317280.

Full text
Abstract:
As a public agency, the Indiana Department of Transportation (INDOT) considers interacting with and understanding the public’s perspective on its activities to be an important endeavor. Although INDOT conducts a biennial customer satisfaction survey, it is occasionally necessary to capture public perception of more specific aspects of INDOT’s activities. In particular, INDOT needs an effective way to measure and track public opinion, awareness and understanding of a select set of its traffic engineering practices. To evaluate public acceptance of specific INDOT traffic engineering activities, a survey of 1,000 adults residing in the State of Indiana was conducted. The survey population was representative of the state in terms of age and gender as of the 2010 U.S. Census, and the survey was administered during July and August 2020. Public awareness of emerging treatments not currently implemented in Indiana is low, and opposition to these new technologies is prominent. Older or female drivers are less likely to be aware of emerging treatments, and older drivers are more likely to oppose their potential implementation. Although roundabouts are commonplace in Indiana, multi-lane roundabouts remain controversial among the public. Regarding maintenance and protection of traffic during work zones, when considering full versus partial roadway closure the public prefers partial closure, and this preference is stronger in rural areas. The public is evenly split on whether INDOT minimizes construction-related traffic delays. Approximately 76% of Indiana drivers believe themselves to be above-average drivers, while an additional 23% believe themselves to be average. Driver perceptions of average highway speeds are not aligned with posted speed limits: the perceived average speed on Indiana’s urban freeways and on rural and urban state highways is considerably higher than the actual speed limit.
APA, Harvard, Vancouver, ISO, etc. styles
8

Aguiar, Brandon, Paul Bianco and Arvind Agarwal. Using High-Speed Imaging and Machine Learning to Capture Ultrasonic Treatment Cavitation Area at Different Amplitudes. Florida International University, October 2021. http://dx.doi.org/10.25148/mmeurs.009773.

Full text
Abstract:
The ultrasonic treatment (UST) process strengthens metals by increasing nucleation and decreasing grain size in an energy-efficient way, without adding anything to the material. The goal of this research was to use machine learning to automatically measure the cavitation area in the UST process and to understand how amplitude influences it. For this experiment, a probe was placed into a container filled with turpentine, which has a viscosity similar to that of liquid aluminum. The probe oscillates up and down by tens of micrometers at a frequency of 20 kHz, which causes cavitations to form in the turpentine. Each experimental trial ran for 5 seconds, and we recorded high-speed camera footage while running the UST probe at amplitudes from 20% to 35% in increments of 1%. Our research examined how the amplitude of the probe changed the cavitation area per unit time. It was vital to obtain strong contrast between the cavitations and the turpentine so that a machine learning model could be trained in the Dragonfly software to measure the cavitation area. We observed that as amplitude increased, the average cavitation area also increased, and plotting cavitation area versus time shows that, for a given amplitude, the cavitation area rises and falls in a wave-like pattern over time. (An illustrative segmentation sketch follows this entry.)
APA, Harvard, Vancouver, ISO, etc. styles
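The measurement described above (segmenting the cavitation region in each high-speed frame and tracking its area over time) can be approximated with simple global thresholding when the contrast between the cavitations and the turpentine is strong. The sketch below uses OpenCV with Otsu thresholding as a stand-in for the machine-learning model the authors trained in Dragonfly; the file layout and the pixel calibration are hypothetical.

    import glob
    import cv2
    import numpy as np

    def cavitation_areas(frame_dir, mm_per_pixel=0.05):
        """Segmented cavitation area (mm^2) for each frame in `frame_dir`.
        Assumes bright cavitation bubbles on a darker turpentine background."""
        areas = []
        for path in sorted(glob.glob(f"{frame_dir}/*.png")):
            gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
            # Otsu's method picks a global threshold separating the bright
            # cavitation region from the darker liquid.
            _, mask = cv2.threshold(gray, 0, 255,
                                    cv2.THRESH_BINARY + cv2.THRESH_OTSU)
            areas.append(cv2.countNonZero(mask) * mm_per_pixel ** 2)
        return np.array(areas)

    # Mean cavitation area for one amplitude setting (hypothetical path):
    # print(cavitation_areas("frames/amplitude_25pct").mean())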
9

Harris, Bernard. Anthropometric history and the measurement of wellbeing. Verlag der Österreichischen Akademie der Wissenschaften, June 2021. http://dx.doi.org/10.1553/populationyearbook2021.rev02.

Full text
Abstract:
It has often been recognised that the average height of a population is influenced by the economic, social and environmental conditions in which it finds itself, and this insight has inspired a generation of historians to use anthropometric data to investigate the health and wellbeing of past populations. This paper reviews some of the main developments in the field, and assesses the extent to which height remains a viable measure of historical wellbeing. It explores a number of different issues, including the nature of human growth; the impact of variations in diet and exposure to disease; the role of ethnicity; the relationships between height, mortality and labour productivity; and the “social value” of human stature. It concludes that, despite certain caveats, height has retained its capacity to act as a “mirror” of the conditions of past societies, and of the wellbeing of their members.
APA, Harvard, Vancouver, ISO, etc. styles
10

Carpio, Carlos, Manuel Garcia, Ana R. Rios, Tullaya Boonsaeng, Juan M. Murguia and Alcido Wander. Static and Dynamic Economic Resilience Indicators for Agrifood Supply Chains: The COVID-19 Pandemic in Latin America and the Caribbean. Inter-American Development Bank, June 2023. http://dx.doi.org/10.18235/0004976.

Full text
Abstract:
Given its enormous adverse effects on production systems and the economy, the recent COVID-19 pandemic has heightened interest in studying resilience in agrifood systems; however, only a few studies have used formal methods for resilience measurement. This study's overall objective was to identify, develop, and use indicators to measure the resilience of the agrifood supply chain. The specific research objectives were (1) to identify and develop survey-based indicators of the economic resilience of agribusinesses; (2) to use the indicators to measure and analyze the economic resilience of the agrifood supply chain in Latin America and the Caribbean (LAC) to the COVID-19 pandemic; and (3) to evaluate differences in the economic resilience of agribusinesses at different supply chain stages. Data for the study were collected through two online surveys conducted in 2020 and 2022. Two resilience indicators were identified and developed: a static indicator (SRES) and a dynamic indicator (DRES). SRES measures the ability of businesses to avoid losses within each study period, while DRES measures firms' capacity to recover business activity after an initial negative revenue shock. The results show that, on average, the LAC agrifood supply chain firms in the sample were able to adapt and recover from the disruptions of a global health pandemic, although the effects of the pandemic were not homogeneous across firms, nor was their adaptive resilience to the disruption.
APA, Harvard, Vancouver, ISO, etc. styles
