Academic literature on the topic "Sample Degeneracy"

Create a precise citation in APA, MLA, Chicago, Harvard, and other styles


Consult the thematic lists of articles, books, theses, conference proceedings, and other scholarly sources on the topic "Sample Degeneracy".

Next to each source in the list of references there is an "Add to bibliography" button. Click this button, and we will automatically generate the bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Vancouver, Chicago, etc.

You can also download the full text of the scholarly publication in PDF format and read its abstract online whenever it is available in the metadata.

Journal articles on the topic "Sample Degeneracy"

1

Lewenstein, Maciej, J. Ignacio Cirac, and Luis Santos. "Cooling of a small sample of Bose atoms with accidental degeneracy". Journal of Physics B: Atomic, Molecular and Optical Physics 33, no. 19 (September 15, 2000): 4107–29. http://dx.doi.org/10.1088/0953-4075/33/19/321.

2

Tonn, Babette, Jacques Lacaze, and Stephanie Duwe. "Degenerated Graphite Growth in Ductile Iron". Materials Science Forum 925 (June 2018): 62–69. http://dx.doi.org/10.4028/www.scientific.net/msf.925.62.

Abstract
As part of a study devoted to the effect of trace elements on graphite degeneracy, near-eutectic ductile iron melts were prepared to which minute amounts of lead and of both lead and cerium were added. The melts were cast into an insulated Y4 mould, giving a solidification time of about 1 hour and a cooling time to room temperature of about 15 hours. In the thermal centre of the Pb containing sample graphite spheroids as well as intergranular lamellar graphite have been found. At the same location of the casting containing both Pb and Ce, exploded as well as chunky graphite could be observed, while the formation of intergranular lamellar graphite has been suppressed. Deep etching of the samples allowed reaching the following conclusions: i) intergranular graphite in the SG-Pb sample often, if not always, originates on graphite nodules and extends towards the last to freeze areas; ii) in one location of the SG-PbCe sample, chunky graphite strings were observed to originate on an exploded nodule, thus confirming the close relationship between these two forms of graphite. Because of the over-treatment in cerium of the SG-PbCe sample, other unusual degenerate graphite was observed which appears as coarse aggregates of "porous" graphite after deep etching.
3

Perenon, L., S. Ilić, R. Maartens, and A. de la Cruz-Dombriz. "Improvements in cosmological constraints from breaking growth degeneracy". Astronomy & Astrophysics 642 (October 2020): A116. http://dx.doi.org/10.1051/0004-6361/202038409.

Abstract
Context. The key probes of the growth of a large-scale structure are its rate f and amplitude σ8. Redshift space distortions in the galaxy power spectrum allow us to measure only the combination fσ8, which can be used to constrain the standard cosmological model or alternatives. By using measurements of the galaxy-galaxy lensing cross-correlation spectrum or of the galaxy bispectrum, it is possible to break the fσ8 degeneracy and obtain separate estimates of f and σ8 from the same galaxy sample. Currently there are very few such separate measurements, but even this allows for improved constraints on cosmological models. Aims. We explore how having a larger and more precise sample of such measurements in the future could constrain further cosmological models. Methods. We considered what can be achieved by a future nominal sample that delivers an ∼1% constraint on f and σ8 separately, compared to the case with a similar precision on the combination fσ8. Results. For the six cosmological parameters of ΛCDM, we find improvements of ∼5–50% on their constraints. For modified gravity models in the Horndeski class, the improvements on these standard parameters are ∼0–15%. However, the precision on the sum of neutrino masses improves by 65% and there is a significant increase in the precision on the background and perturbation Horndeski parameters.
4

Sonnenfeld, Alessandro. "Statistical strong lensing". Astronomy & Astrophysics 656 (December 2021): A153. http://dx.doi.org/10.1051/0004-6361/202142062.

Abstract
Context. Time-delay lensing is a powerful tool for measuring the Hubble constant H0. However, in order to obtain an accurate estimate of H0 from a sample of time-delay lenses, very good knowledge of the mass structure of the lens galaxies is needed. Strong lensing data on their own are not sufficient to break the degeneracy between H0 and the lens model parameters on a single object basis. Aims. The goal of this study is to determine whether it is possible to break the H0-lens structure degeneracy with the statistical combination of a large sample of time-delay lenses, relying purely on strong lensing data with no stellar kinematics information. Methods. I simulated a set of 100 lenses with doubly imaged quasars and related time-delay measurements. I fitted these data with a Bayesian hierarchical method and a flexible model for the lens population, emulating the lens modelling step. Results. The sample of 100 lenses on its own provides a measurement of H0 with 3% precision, but with a −4% bias. However, the addition of prior information on the lens structural parameters from a large sample of lenses with no time delays, such as that considered in Paper I, allows for a 1% level inference. Moreover, the 100 lenses allow for a 0.03 dex calibration of galaxy stellar masses, regardless of the level of prior knowledge of the Hubble constant. Conclusions. Breaking the H0-lens model degeneracy with lensing data alone is possible, but 1% measurements of H0 require either many more than 100 time-delay lenses or knowledge of the structural parameter distribution of the lens population from a separate sample of lenses.
5

Molenberghs, Geert, Michael G. Kenward, Marc Aerts, Geert Verbeke, Anastasios A. Tsiatis, Marie Davidian, and Dimitris Rizopoulos. "On random sample size, ignorability, ancillarity, completeness, separability, and degeneracy: Sequential trials, random sample sizes, and missing data". Statistical Methods in Medical Research 23, no. 1 (April 18, 2012): 11–41. http://dx.doi.org/10.1177/0962280212445801.

6

Li, Tiancheng, Shudong Sun, Tariq Pervez Sattar, and Juan Manuel Corchado. "Fight sample degeneracy and impoverishment in particle filters: A review of intelligent approaches". Expert Systems with Applications 41, no. 8 (June 2014): 3944–54. http://dx.doi.org/10.1016/j.eswa.2013.12.031.

7

Ravel, Bruce. "Path degeneracy and EXAFS analysis of disordered materials". Journal of Synchrotron Radiation 21, no. 6 (October 1, 2014): 1269–74. http://dx.doi.org/10.1107/s1600577514014982.

Abstract
Analysis of EXAFS data measured on a material with a disordered local configuration environment around the absorbing atom can be challenging owing to the proliferation of photoelectron scattering paths that must be considered in the analysis. In the case where the absorbing atom exists in multiple inequivalent sites, the problem is compounded by having to consider each site separately. A method is proposed for automating the calculation of theory for inequivalent sites, then averaging the contributions from sufficiently similar scattering paths. With this approach, the complexity of implementing a successful fitting model on a highly disordered sample is reduced. As an example, an analysis of Ti K-edge data on zirconolite, CaZrTi2O7, which has three inequivalent Ti sites, is presented.
8

Andreon, S., G. Trinchieri, and A. Moretti. "Low X-ray surface brightness clusters: implications on the scatter of the M–T and LT relations". Monthly Notices of the Royal Astronomical Society 511, no. 4 (February 3, 2022): 4991–98. http://dx.doi.org/10.1093/mnras/stac307.

Abstract
We aim at studying scaling relations of a small but well-defined sample of galaxy clusters that includes the recently discovered class of objects that are X-ray faint for their mass. These clusters have an average low X-ray surface brightness, a low gas fraction, and are underrepresented (by a factor of 10) in X-ray surveys or entirely absent in Sunyaev-Zeldovich (SZ) surveys. With the inclusion of these objects, we find that the temperature–mass relation has an unprecedentedly large scatter, 0.20 ± 0.03 dex at fixed mass, as wide as allowed by the temperature range, and the location of a cluster in this plane depends on its surface brightness. Clusters obey a relatively tight luminosity–temperature relation independently of their brightness. We interpret the wide difference in scatter around the two relations as due to the fact that X-ray luminosity and temperature are dominated by photons coming from small radii (in particular for T we used a 300 kpc aperture radius) and strongly affected by gas thermodynamics (e.g. shocks and cool cores), whereas mass is dominated by dark matter at large radii. We measure a slope of 2.0 ± 0.2 for the L500–T relation. Given the characteristics of our sample, this value is free from collinearity (degeneracy) between evolution and slope and from hypothesis on the undetected population, which both affect the analysis of X-ray-selected samples, and can therefore be profitably used both as reference and to break the aforementioned degeneracy of X-ray-selected samples.
9

Repp, Andrew, and István Szapudi. "Galaxy bias and σ8 from counts in cells from the SDSS main sample". Monthly Notices of the Royal Astronomical Society: Letters 498, no. 1 (August 22, 2020): L125–L129. http://dx.doi.org/10.1093/mnrasl/slaa139.

Abstract
The counts-in-cells (CIC) galaxy probability distribution depends on both the dark matter clustering amplitude σ8 and the galaxy bias b. We present a theory for the CIC distribution based on a previous prescription of the underlying dark matter distribution and a linear volume transformation to redshift space. We show that, unlike the power spectrum, the CIC distribution breaks the degeneracy between σ8 and b on scales large enough that both bias and redshift distortions are still linear; thus, we obtain a simultaneous fit for both parameters. We first validate the technique on the Millennium Simulation and then apply it to the Sloan Digital Sky Survey main galaxy sample. We find σ8 = 0.92 ± 0.08 and b = 1.39 (+0.11/−0.09), consistent with previous complementary results from redshift distortions and from Planck.
10

Kallend, John S., R. B. Schwarz, and A. D. Rollett. "Resolution of Superimposed Diffraction Peaks in Texture Analysis of a YBa2Cu3O7 Polycrystal". Textures and Microstructures 13, no. 2-3 (January 1, 1991): 189–97. http://dx.doi.org/10.1155/tsm.13.189.

Abstract
Texture measurements in polycrystalline 123 oxide superconductors are complicated by the superposition of Bragg reflections in the pole figures due to the near degeneracy of the crystal structure. A method is described, based on an extension of the WIMV algorithm, for resolving these superpositions and determining the crystal orientation distribution (OD). The method is exemplified by OD analysis of a magnetically aligned, strongly textured powder sample of YBa2Cu3O7.

Theses on the topic "Sample Degeneracy"

1

Atta-Asiamah, Ernest. "Distributed Inference for Degenerate U-Statistics with Application to One and Two Sample Test". Diss., North Dakota State University, 2020. https://hdl.handle.net/10365/31777.

Abstract
In many hypothesis testing problems, such as one-sample and two-sample tests, the test statistics are degenerate U-statistics. One of the challenges in practice is the computation of U-statistics for a large sample size. Moreover, for degenerate U-statistics the limiting distribution is a mixture of weighted chi-squares involving the eigenvalues of the kernel of the U-statistic, so it is not straightforward to construct the rejection region based on this asymptotic distribution. In this research, we aim to reduce the computational complexity of degenerate U-statistics and propose an easy-to-calibrate test statistic by using the divide-and-conquer method. Specifically, we randomly partition the full n data points into k_n disjoint groups of equal size, compute the U-statistic on each group, and combine them by averaging to get a statistic T_n. We prove that the statistic T_n has the standard normal distribution as its limiting distribution. In this way, the running time is reduced from O(n^m) to O(n^m/k_n^m), where m is the order of the one-sample U-statistic. In addition, for a given significance level, it is easy to construct the rejection region. We apply our method to the goodness-of-fit test and the two-sample test. Simulations and real-data analysis show that the proposed test achieves high power and fast running times for both one- and two-sample tests.
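As a rough illustration of the divide-and-conquer averaging described in this abstract (a sketch only, not the author's implementation), the following Python snippet partitions the data into groups, computes an order-2 U-statistic on each group with a hypothetical kernel h(x, y) = xy, which is degenerate for mean-zero data, and averages the group values to form the statistic T_n; the kernel, group count and sample size are illustrative assumptions.

import numpy as np

def group_u_stat(x, kernel):
    # Order-2 U-statistic of one group: average of the kernel over all pairs i < j.
    n = len(x)
    pairs = [kernel(x[i], x[j]) for i in range(n) for j in range(i + 1, n)]
    return np.mean(pairs)

def divide_and_conquer_u_stat(x, kernel, k, seed=0):
    # Randomly split the n points into k disjoint groups of (nearly) equal size,
    # compute the U-statistic on each group, and average the k values (the T_n above).
    rng = np.random.default_rng(seed)
    groups = np.array_split(rng.permutation(x), k)
    return np.mean([group_u_stat(g, kernel) for g in groups])

# Toy run with the degenerate kernel h(x, y) = x * y (degenerate when E[X] = 0).
x = np.random.default_rng(1).normal(size=2000)
print(divide_and_conquer_u_stat(x, lambda a, b: a * b, k=20))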
2

Keeley, Ryan F. "Design and Implementation of Degenerate qPCR/qRT-PCR Primers to Detect Microbial Nitrogen Metabolism in Wastewater and Wastewater-Related Samples". Scholar Commons, 2019. https://scholarcommons.usf.edu/etd/7826.

Abstract
Nitrogen cycling processes can be tracked using quantitative Polymerase Chain Reaction (qPCR) to determine the presence and quantitative Reverse Transcriptase-PCR (qRT-PCR) to determine expression of key genes, or ‘biological markers’, for nitrogen metabolism. Nitrification is catalyzed, in part, by two enzymes: ammonia monooxygenase (AMO; NH3 → NH2OH) and nitrite oxidoreductase (NXR; NO2- → NO3-). For denitrification, four enzymes act sequentially: nitrate reductase (NAR/NAP; NO3- → NO2-), nitrite reductase (NIR; NO2- → NO), nitric oxide reductase (NOR; NO → N2O), and nitrous oxide reductase (NOS; N2O → N2). A principle of wastewater treatment (WWT) is to remove excess nitrogen by taking advantage of natural nitrogen cycling or biological nitrogen removal (BNR). This process involves using microorganisms to bring influent ammonia through nitrification and denitrification to release nitrogen gas, which does not contribute to eutrophication. A novel shortcut nitrogen removal configuration could increase nitrogen removal efficiency by promoting nitritation/denitritation, reducing the classic nitrogen cycle by removing the redundant oxidation/reduction step to nitrate (NO3-). Here, three nitrogen transformations were used to track the three main phases in the nitrogen cycle: ammonia monooxygenase for nitrification, nitrite oxidoreductase for shortcut, and nitrous oxide reductase for denitrification. Primers for qPCR and qRT-PCR were designed to capture as much sequence diversity as possible for each step. Genes from bacteria known to perform the nitrogen transformations of interest (amoA, nxrB, nosZ) were used to BLAST-query the Integrated Microbial Genomes & Microbiomes database (img.jgi.doe.gov) to find homologs from organisms commonly found in WWT. These sequences were then aligned to find regions sufficiently conserved for primer design. These PCR primers were tested against standards for each gene and used to track nitrogen transformation potential and expression in a novel lab-scale algal photo-sequencing batch reactor which promotes shortcut nitrogen removal from wastewater across three solids retention times (SRT, or mean cell residence time): 5, 10 and 15 days. SRT 15 had the greatest total nitrogen removal with nitritation and denitritation observed. Nitrate was not detected in the first cycle and shortcut nitrogen removal was supported by low levels of nxrB genes and transcripts. Simultaneous nitrification/denitrification was supported by elevated concentrations of nosZ during the light period and less nitrite produced than ammonium consumed. Nitritation was predominantly performed by Betaproteobacteria amoA and nitrous oxide reduction was predominantly from nosZ group I (Proteobacteria-type).
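As a side note to the primer-design workflow summarized above, the idea of a "degenerate" primer can be made concrete with a small helper that expands the standard IUPAC degenerate-base codes into every concrete sequence they encode; this generic sketch relies only on the public IUPAC convention, and the example primer is hypothetical, not taken from the thesis.

from itertools import product

# Standard IUPAC nucleotide codes (public convention, not specific to this work).
IUPAC = {
    "A": "A", "C": "C", "G": "G", "T": "T",
    "R": "AG", "Y": "CT", "S": "GC", "W": "AT", "K": "GT", "M": "AC",
    "B": "CGT", "D": "AGT", "H": "ACT", "V": "ACG", "N": "ACGT",
}

def expand_degenerate_primer(primer):
    # Return every concrete sequence encoded by a degenerate primer.
    return ["".join(bases) for bases in product(*(IUPAC[b] for b in primer.upper()))]

# A hypothetical degenerate primer: R and N expand to 2 x 4 = 8 concrete sequences.
print(expand_degenerate_primer("ATGRCN"))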
3

Viscardi, Cecilia. "Approximate Bayesian Computation and Statistical Applications to Anonymized Data: an Information Theoretic Perspective". Doctoral thesis, 2021. http://hdl.handle.net/2158/1236316.

Abstract
Realistic statistical modelling of complex phenomena often leads to considering several latent variables and nuisance parameters. In such cases, the Bayesian approach to inference requires the computation of challenging integrals or summations over high dimensional spaces. Monte Carlo methods are a class of widely used algorithms for performing simulated inference. In this thesis, we consider the problem of sample degeneracy in Monte Carlo methods, focusing on Approximate Bayesian Computation (ABC), a class of likelihood-free algorithms allowing inference when the likelihood function is analytically intractable or computationally demanding to evaluate. In the ABC framework, sample degeneracy arises when proposed values of the parameters, once given as input to the generative model, rarely lead to simulations resembling the observed data and are hence discarded. Such "poor" parameter proposals, i.e., parameter values having an (exponentially) small probability of producing simulation outcomes close to the observed data, do not contribute at all to the representation of the parameter's posterior distribution. This leads to a very large number of required simulations and/or a waste of computational resources, as well as to distortions in the computed posterior distribution. To mitigate this problem, we propose two algorithms, referred to as the Large Deviations Approximate Bayesian Computation algorithms (LD-ABC), where the typical ABC rejection step is avoided altogether. We adopt an information-theoretic perspective, resorting to the Method of Types formulation of Large Deviations, thus first restricting our attention to models for i.i.d. discrete random variables and then extending the method to parametric finite state Markov chains. We experimentally evaluate our method through proof-of-concept implementations. Furthermore, we consider statistical applications to anonymized data. We adopt the point of view of an evaluator interested in publishing data about individuals in an anonymized form that allows balancing the learner's utility against the risk posed by an attacker potentially targeting individuals in the dataset. Accordingly, we present a unified Bayesian model applying to data anonymized using group-based schemes and a related MCMC method to learn the population parameters. This allows relative threat analysis, i.e., an analysis of the risk for any individual in the dataset to be linked to a specific sensitive value beyond what is implied for the general population. Finally, we show the performance of the ABC methods in this setting and test LD-ABC at work on a real-world obfuscated dataset.
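To make the sample-degeneracy problem described here concrete, the sketch below implements plain rejection ABC for a toy binomial model; with a tight tolerance most prior draws are rejected, which is exactly the wasted-simulation effect the abstract discusses. The model, prior, tolerance and constants are illustrative assumptions, and this is the baseline rejection scheme rather than the LD-ABC algorithms proposed in the thesis.

import numpy as np

rng = np.random.default_rng(42)
y_obs = rng.binomial(n=50, p=0.1)   # "observed" data from a toy binomial model

def abc_rejection(n_proposals, tol):
    # Plain rejection ABC: draw parameters from the prior, simulate data once per
    # proposal, and keep only proposals whose simulation lands within tol of y_obs.
    theta = rng.uniform(0.0, 1.0, size=n_proposals)   # Uniform(0, 1) prior on p
    y_sim = rng.binomial(n=50, p=theta)               # one simulated data set per proposal
    return theta[np.abs(y_sim - y_obs) <= tol]

accepted = abc_rejection(n_proposals=100_000, tol=1)
print("acceptance rate:", len(accepted) / 100_000)    # most proposals are discarded
print("crude ABC posterior mean for p:", accepted.mean())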

Books on the topic "Sample Degeneracy"

1

Fischer, Nick. Antidemocracy and Authoritarianism. University of Illinois Press, 2017. http://dx.doi.org/10.5406/illinois/9780252040023.003.0012.

Abstract
This chapter examines antidemocracy and paranoid authoritarianism as part of the Anticommunist Spider Web. It shows how anticommunist conspiracy theory, anticommunist propaganda, and the actions of many anticommunists encouraged the destruction of democracy and its replacement by a system of government by kinship group or tribe. It argues that the propaganda issued by the Spider Web, stressing the inherent disloyalty and degeneracy of huge sections of the community, inevitably pointed toward the restriction of American citizenship to those who truly deserved it. Anticommunism sought to restrict the franchise to people of the same ethnic background and religious and political beliefs. So even though anticommunist rhetoric emphasized the virtues of republican government and the universal basis of citizenship, it ultimately sought to legitimize an antidemocratic and even authoritarian society.
2

Gamberini, Andrea. The Ideology of the Regional State. Oxford University Press, 2018. http://dx.doi.org/10.1093/oso/9780198824312.003.0015.

Abstract
This chapter analyses the way in which the Visconti justified their seigneury and their expansionist policies on an ethical and political level. They attempted to set themselves up as paladins in the war against tyranny—now seen not in its traditional sense as one of the degenerate forms of government, but as a division of the political body, a prime cause of war and an obstacle to peace. Through this intrepid conceptual twist, pro-Visconti circles were thus notably successful in deflecting any delegitimizing accusation away from their masters, while at the same time elaborating an ethico-political justification for their expansionism: a first glimpse, here, of the ideological foundations of the regional state.

Book chapters on the topic "Sample Degeneracy"

1

Stenner, A. Jackson, and Mark Stone. "Generally Objective Measurement of Human Temperature and Reading Ability: Some Corollaries". In Explanatory Models, Unit Standards, and Personalized Learning in Educational Measurement, 167–77. Singapore: Springer Nature Singapore, 2022. http://dx.doi.org/10.1007/978-981-19-3747-7_13.

Abstract
We argue that a goal of measurement is general objectivity: point estimates of a person's measure (height, temperature, and reader ability) should be independent of the instrument and independent of the sample in which the person happens to find herself. In contrast, Rasch's concept of specific objectivity requires only differences (i.e., comparisons) between person measures to be independent of the instrument. We present a canonical case in which there is no overlap between instruments and persons: each person is measured by a unique instrument. We then show what is required to estimate measures in this degenerate case. The canonical case encourages a simplification and reconceptualization of validity and reliability. Not surprisingly, this reconceptualization looks a lot like the way physicists and chemometricians think about validity and measurement error. We animate this presentation with a technology that blurs the distinction between instruction, assessment, and generally objective measurement of reader ability. We encourage adaptation of this model to health outcomes measurement.
2

Garrett, Steven L. "Membranes, Plates, and Microphones". In Understanding Acoustics, 283–330. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-44787-8_6.

Abstract
The restoring forces on membranes are due to the applied tension, while the restoring forces for plates are due to the flexural rigidity of the plate's material. The transition to two dimensions introduces some features that did not show up in our analysis of one-dimensional vibrating systems. Instead of applying boundary conditions at one or two points, those constraints will have to be applied along a line or a curve. In this way, incorporation of the boundary condition is linked inexorably to the choice of coordinate systems used to describe the resultant normal mode shape functions. For two-dimensional vibrators, two indices are required to specify the frequency of a normal mode, fm,n, with the number of modes in a given frequency interval increasing in proportion to the center frequency of the interval, even though that interval remains a fixed frequency span. It is also possible that modes with different mode numbers might correspond to the same frequency of vibration, a situation that is designated as "modal degeneracy." A membrane's response to sound pressures provides the basis for broadband condenser microphone technology that produces signals related to the electrical properties of that capacitor and the charge stored on its plates.
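As a minimal numerical companion to the modal-degeneracy remark, the snippet below evaluates the textbook natural-frequency formula for an ideal rectangular membrane fixed at its edges (an assumed special case; the chapter treats membranes and plates more generally) and shows that the (1, 2) and (2, 1) modes of a square membrane share one frequency.

import math

def rect_membrane_freq(m, n, Lx, Ly, c=100.0):
    # f_{m,n} = (c / 2) * sqrt((m / Lx)^2 + (n / Ly)^2) for an ideal membrane
    # fixed at its edges; c is the transverse wave speed (arbitrary value here).
    return 0.5 * c * math.sqrt((m / Lx) ** 2 + (n / Ly) ** 2)

# On a square membrane the (1, 2) and (2, 1) modes coincide: a modal degeneracy.
print(rect_membrane_freq(1, 2, 1.0, 1.0), rect_membrane_freq(2, 1, 1.0, 1.0))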
3

Chemin, Jean-Yves, Benoit Desjardins, Isabelle Gallagher, and Emmanuel Grenier. "Other Layers". In Mathematical Geophysics. Oxford University Press, 2006. http://dx.doi.org/10.1093/oso/9780198571339.003.0019.

Abstract
Note that Ω = Ω × [0, 1] is a particular case where the boundary layers are purely horizontal or purely vertical. In the general case of an open domain Ω, the boundaries have various orientations. As long as the tangent plane to ∂Ω is not vertical, the boundary layers are of Ekman type, with a size that depends on ν · r, where ν is the normal to the tangent plane. When ν · r → 0, namely when the tangent plane becomes vertical, the Ekman layers become larger and larger, and for ν · r = 0 they degenerate into another type of boundary layer, called the equatorial degeneracy of the Ekman layer. We will now detail this phenomenon in the particular case of a rotating sphere. Mathematically, almost everything is widely open! Let Ω = B(0, R) be a ball and let θ be the latitude in spherical coordinates. The equatorial degeneracy of the Ekman layer is difficult to study; we only give the conclusions of the analytical studies. The Ekman layer is a good approximation of the boundary layer as long as |θ| ≫ E^(1/5); for |θ| ≪ E^(1/5) the Ekman layer degenerates into a layer of size E^(2/5). In other words: for |θ| ≫ (εν)^(1/5) there is an Ekman layer, while for |θ| of order (εν)^(1/5) the Ekman layer degenerates into a layer of size (εν)^(1/5) in depth and (εν)^(2/5) in latitude. Let us now concentrate on the motion between two concentric rotating spheres, the speed of rotation of the spheres being the same. In this case, Ω = B(0, R) − B(0, r) with 0 < r < R. Keeping in mind meteorology, the interesting case arises when R − r ≪ R: the two spheres have almost equal radii. Let us study the fluid at some latitude θ. If θ ≠ 0, locally the space between the two spheres can be considered as flat and treated as a domain between two nearby plates, so the conclusions of the previous paragraphs can be applied: two Ekman layers are created, one near the inner sphere and the other one near the outer sphere.
4

Ting, T. T. C. "Degenerate and Near Degenerate Materials". In Anisotropic Elasticity. Oxford University Press, 1996. http://dx.doi.org/10.1093/oso/9780195074475.003.0016.

Abstract
The Stroh formalism presented in Sections 5.3 and 5.5 assumes that the 6×6 fundamental elasticity matrix N is simple, i.e., the three pairs of eigenvalues pα are distinct. The eigenvectors ξα (α = 1, 2, 3) are independent of each other, and the general solution (5.3-10) consists of three independent solutions. The formalism remains valid when N is semisimple. In this case there is a repeated eigenvalue, say p2 = p1, but there exist two independent eigenvectors ξ2 and ξ1 associated with the repeated eigenvalue. The general solution (5.3-10) continues to consist of three independent solutions. Moreover, one can always choose ξ2 and ξ1 such that the orthogonality relations (5.5-11) and the subsequent relations (5.5-13)-(5.5-17) hold. When N is nonsemisimple with p2 = p1, there exists only one independent eigenvector associated with the repeated eigenvalue. The general solution (5.3-10) now contains only two independent solutions. The orthogonality relations (5.5-11) do not hold for α, β = 1, 2 and 4, 5, and the relations (5.5-13)-(5.5-17) are not valid. Anisotropic elastic materials with a nonsemisimple N are called degenerate materials. They are degenerate in the mathematical sense, not necessarily in the physical sense. Isotropic materials are a special group of degenerate materials that happen to be degenerate also in the physical sense. There are degenerate anisotropic materials that have no material symmetry planes (Ting, 1994). It should be mentioned that the breakdown of the formalism for degenerate materials is not limited to the Stroh formalism. Other formalisms have the same problem. We have seen in Chapters 8 through 12 that in many applications the arbitrary constant q that appears in the general solution (5.3-10) can be determined analytically using the relations (5.5-13)-(5.5-17). These solutions are consequently not valid for degenerate materials. As an alternative to the algebraic representation of S, H, L in (5.5-17), it is shown in Section 7.6 that one can use an integral representation to determine S, H, L without computing the eigenvalues pα and the eigenvectors ξα. If the final solution is expressed in terms of S, H, and L the solution is valid for degenerate materials.
5

Kennedy, Seán, and Joseph Valente. "‘A form that accommodates the mess’: Degeneration and / as Disability in Beckett’s Happy Days". In The Edinburgh Companion to Irish Modernism, 387–404. Edinburgh University Press, 2020. http://dx.doi.org/10.3366/edinburgh/9781474456692.003.0022.

Abstract
The essay begins with Max Nordau’s theory of degeneration, in which intact form signals cultural health and broken form cultural deterioration, to show how Samuel Beckett de-pathologized the indices of degeneracy by finding a form to “normalize” the messy feelings of non-normative experience. Beckett uses disability as an aesthetic principle by formalising affect, thereby universalizing divergence from ideals of embodiment.
6

Alzamel, Mai, Lorraine A. K. Ayad, Giulia Bernardini, Roberto Grossi, Costas S. Iliopoulos, Nadia Pisanti, Solon P. Pissis, and Giovanna Rosone. "Comparing Degenerate Strings". In A Mosaic of Computational Topics: from Classical to Novel. IOS Press, 2020. http://dx.doi.org/10.3233/stal200005.

Abstract
Uncertain sequences are compact representations of sets of similar strings. They highlight common segments by collapsing them, and explicitly represent varying segments by listing all possible options. A generalized degenerate string (GD string) is a type of uncertain sequence. Formally, a GD string Ŝ is a sequence of n sets of strings of total size N, where the ith set contains strings of the same length ki but this length can vary between different sets. We denote by W the sum of these lengths k0, k1, …, kn-1. Our main result is an O(N + M)-time algorithm for deciding whether two GD strings of total sizes N and M, respectively, over an integer alphabet, have a non-empty intersection. This result is based on a combinatorial result of independent interest: although the intersection of two GD strings can be exponential in the total size of the two strings, it can be represented in linear space. We then apply our string comparison tool to devise a simple algorithm for computing all palindromes in Ŝ in O(min{W, n²} · N) time. We complement this upper bound by showing a similar conditional lower bound for computing maximal palindromes in Ŝ. We also show that a result, which is essentially the same as our string comparison linear-time algorithm, can be obtained by employing an automata-based approach.
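For intuition only, here is a brute-force check of whether two GD strings share a plain string; it enumerates every expansion and is exponential in the worst case, unlike the linear-time O(N + M) comparison the chapter describes, so it serves purely as a toy reference and the example inputs are invented.

from itertools import product

def expand(gd):
    # All plain strings represented by a GD string, given as a list of sets of
    # equal-length strings. Exponential in general; only for tiny examples.
    return {"".join(choice) for choice in product(*gd)}

def gd_intersection_nonempty(gd1, gd2):
    # Naive check that two GD strings represent at least one common plain string.
    return bool(expand(gd1) & expand(gd2))

# Toy GD strings over {A, C, G, T}; each position is a set of equal-length strings.
S = [{"A", "C"}, {"GT", "GA"}]
T = [{"AG"}, {"T", "A"}]
print(gd_intersection_nonempty(S, T))   # True: both can spell "AGT" (and "AGA")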
7

"of the Pareto distribution as ~ is a degenerate form of the two-parameter exponential distribution (8.1.1) in which". En Truncated and Censored Samples, 189–92. CRC Press, 2016. http://dx.doi.org/10.1201/b16946-73.

8

K R, Kaviya, and Deepa S. "An Inclusive Survey on Various Adaptive Beam Forming Algorithm for 5G Communications Systems". In Intelligent Systems and Computer Technology. IOS Press, 2020. http://dx.doi.org/10.3233/apc200182.

Abstract
Several wireless systems coexist in 5G technology, generating interference in the same frequency band and degrading the quality of the received signal. Smart antenna systems employ beamforming methods in which the beam is steered toward the desired signal while nulls are placed in the direction of unwanted (interfering) signals. This survey discusses blind and non-blind beamforming algorithms for smart antennas and phased arrays, including Least Mean Square (LMS), Normalized Least Mean Square (NLMS), Recursive Least Squares (RLS), Sample Matrix Inversion (SMI), Linearly Constrained Minimum Variance (LCMV), Constant Modulus Algorithm (CMA), and decision-feedback-equalization-based LMS (DFE-LMS). These algorithms are outlined for application in 5G networks to provide good quality and capacity while dealing with the coexistence of signals and interference.
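Of the algorithms listed, LMS is the simplest to sketch; the snippet below is a generic complex-valued LMS weight update for an adaptive array, written from the textbook formulation rather than taken from this survey, and the array size, step size and reference signal are illustrative assumptions.

import numpy as np

def lms_beamformer(X, d, mu=0.01):
    # X: (snapshots, elements) complex array snapshots; d: desired/reference signal.
    # Classic LMS: e(k) = d(k) - w^H x(k), then w <- w + mu * x(k) * conj(e(k)).
    n_snap, n_elem = X.shape
    w = np.zeros(n_elem, dtype=complex)
    for k in range(n_snap):
        y = np.vdot(w, X[k])            # array output w^H x(k)
        e = d[k] - y                    # error against the reference signal
        w = w + mu * X[k] * np.conj(e)  # steepest-descent weight update
    return w

# Illustrative call: 8-element array, 500 snapshots, reference taken as element 0.
rng = np.random.default_rng(0)
X = (rng.standard_normal((500, 8)) + 1j * rng.standard_normal((500, 8))) / np.sqrt(2)
w = lms_beamformer(X, d=X[:, 0])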
9

Wilkinson, A. B. "Conclusion". In Blurring the Lines of Race and Freedom, 235–46. University of North Carolina Press, 2020. http://dx.doi.org/10.5149/northcarolina/9781469658995.003.0008.

Abstract
The concluding chapter finishes the book at the time of the U.S. Revolution and shows that Mulattoes and others of mixed ancestry had struggled throughout the colonial period for freedom. During the Revolutionary era, many of the Founding Fathers and other EuroAmericans deprived “Mulattoes,” “Negroes,” and other people of color of the same freedoms they sought from the British empire. This concluding section wraps up larger themes of the book around racial fluidity and hypodescent. It also explores further implications for how British colonists defined interracial mixture and negatively labelled people they perceived to be of mixed race as a deplorable group that were affiliated with early ideas of hybrid degeneracy. Still, people of blended heritage fashioned themselves as a group deserving of respect and the same liberties as Europeans and EuroAmericans in the early United States. Ultimately the fight for independence and equal recognition by mixed-heritage people was part of the larger freedom struggle by other poor and free people of color.
10

Moses Samuel Elizabeth, Allen. "Third-Order Nonlinearity Measurement Techniques". In Crystal Growth - Technologies and Applications [Working Title]. IntechOpen, 2022. http://dx.doi.org/10.5772/intechopen.106506.

Abstract
To measure degenerate (single-frequency) optical nonlinearities, third-order nonlinearity measurement techniques (the Z-scan and related methods) are employed. When a laser beam is incident on a nonlinear (NL) medium, the induced phase change can be identified with the Z-scan technique. As the sample is scanned along the Z-axis, the sign and magnitude of the phase change are obtained, and these relate directly to the change in the index of refraction. The nonlinear absorption, obtained from the absorption coefficient, is independent of the index of refraction, which is a required parameter for calculating the nonlinear refraction. Further, the change in transmission caused by nonlinear absorption of the material under study is related to the change in the absorption coefficient, which is easily determined by the Z-scan technique. From the Z-scan responses, the real and imaginary parts of the third-order nonlinear susceptibility (χ3) can be determined. The Z-scan technique is of particular interest for studies of optical power limiting and nonlinear optical propagation.
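For reference, two widely used closed-aperture Z-scan relations from the open literature (standard results quoted here as background, not taken from this chapter) link the measured peak-valley transmittance change to the nonlinear refractive index n2:

% S is the aperture linear transmittance, k = 2*pi/lambda, I_0 the on-axis peak
% intensity, alpha the linear absorption coefficient and L the sample thickness.
\Delta T_{p\text{-}v} \approx 0.406\,(1 - S)^{0.25}\,\lvert \Delta\Phi_0 \rvert,
\qquad
\Delta\Phi_0 = k\, n_2\, I_0\, L_{\mathrm{eff}},
\qquad
L_{\mathrm{eff}} = \frac{1 - e^{-\alpha L}}{\alpha}.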

Conference papers on the topic "Sample Degeneracy"

1

Jenkins, Ryan R., and Nejat Olgac. "Double Imaginary Root Degeneracies in Time-Delayed Systems and CTCR Treatment". In ASME 2017 Dynamic Systems and Control Conference. American Society of Mechanical Engineers, 2017. http://dx.doi.org/10.1115/dscc2017-5013.

Abstract
The dynamics we treat here is a very special and degenerate class of linear time-invariant time-delayed systems (LTI-TDS) with commensurate delays, which exhibit a double imaginary root for a particular value of the delay. The stability behavior of the system within the immediate proximity of this parametric setting which creates the degenerate dynamics is investigated. Several recent investigations also handled this class of systems from the perspective of calculus of variations. We approach the same problem from a different angle, using a recent paradigm called Cluster Treatment of Characteristic Roots (CTCR). We convert one of the parameters in the system into a variable and perturb it around the degenerate point of interest, while simultaneously varying the delay. Clearly, only a particular selection of this arbitrary parameter and the delay enforce the degeneracy. All other adjacent points would be free of the mentioned degeneracy, and therefore can be handled with the CTCR paradigm. Analysis then reveals that the parametrically limiting stability behavior of the dynamics can be extracted by simply using CTCR. The results are shown to be very much aligned with the other investigations on the problem. Simplicity and numerical speed of CTCR may be considered as practical advantages in analyzing such systems. This approach also exhibits the capabilities of CTCR in handling these degenerate cases contrary to the convictions in earlier reports. An example case study is provided to demonstrate these features.
2

Akimoto, R., K. Ando, F. Sasaki, S. Kobayashi, and T. Tani. "Femtosecond Carrier Spin Dynamics in CdTe/Cd0.6Mn0.4Te Quantum Wells". In International Conference on Ultrafast Phenomena. Washington, D.C.: Optica Publishing Group, 1996. http://dx.doi.org/10.1364/up.1996.tue.38.

Abstract
In recent years, much interest has been focused on spin relaxation in semiconductor heterostructures such as quantum wells, since the spin relaxation time is much shorter than the carrier lifetime. In quantum wells, the degeneracy between the heavy-hole and light-hole excitons is lifted, so that one spin state in the conduction and valence bands can be excited selectively by circularly polarized light. In previous studies of spin relaxation in quantum wells, either circular-polarization pump-probe measurements, in which the pump and probe wavelengths are the same and resonant with the heavy-hole exciton [1-4], or time-resolved luminescence measurements, in which the heavy-hole exciton is excited resonantly by a circularly polarized pulse and the decay of the circular polarization of the luminescence is measured [5-9], have been employed. In both types of measurement on undoped quantum wells, the spin relaxation of the heavy-hole exciton contains contributions from both the electron spin relaxation and the heavy-hole spin relaxation simultaneously. A possible way to isolate the electron spin relaxation from the heavy-hole relaxation in GaAs/AlGaAs quantum wells is to use p-doped quantum wells for the electron spin relaxation and n-doped ones for the heavy-hole spin relaxation [9]. However, doped quantum wells may differ considerably from undoped quantum wells in their spin relaxation mechanisms, such as carrier-impurity scattering, Coulomb screening of the carriers, and so on. Therefore, we present here an approach for measuring the electron spin relaxation separately from the heavy-hole one in undoped quantum wells via the femtosecond time-resolved circular dichroic spectrum. The present pump-probe method has an unconventional configuration in which the absorption saturation is measured from the unoccupied light-hole (lh) spin state and the occupied heavy-hole (hh) spin state simultaneously, using a circularly polarized probe pulse with a continuum spectrum. The sample used in our experiments consists of CdTe/Cd0.6Mn0.4Te quantum wells, in which the sp-d exchange interaction between the carrier spin in the well and the Mn ion spin in the barrier can be controlled by the degree of confinement of the carrier wave function, allowing us to examine the effect of the sp-d exchange interaction on the carrier spin relaxation.
3

Kabamba, Pierre T. "The Von Neumann Threshold of Self-Reproducing Systems". In ASME 2008 Dynamic Systems and Control Conference. ASMEDC, 2008. http://dx.doi.org/10.1115/dscc2008-2297.

Abstract
This paper is devoted to the study of systems of entities that are capable of generating other entities of the same kind and, possibly, self-reproducing. The main technical issue addressed is to quantify the requirements that such entities must meet to be able to produce a progeny that is not degenerate, i.e., that has the same reproductive capability as the progenitor. A novel theory that allows an explicit quantification of these requirements is presented. The notion of generation rank of an entity is introduced, and it is proved that the generation process, in most cases, is degenerative in that it strictly and irreversibly decreases the generation rank from parent to descendant. It is also proved that there exists a threshold of rank such that this degeneracy can be avoided if and only if the entity has a generation rank that meets that threshold — this is the von Neumann rank threshold. Based on this threshold, an information threshold is derived, which quantifies the minimum amount of information that must be provided to specify an entity such that its descendants are not degenerate. Furthermore, a complexity threshold is obtained, which quantifies the minimum length of the description of that entity in a given language. Examples that illustrate the theory are provided.
4

Lowenthal, Dennis D., Charles Miyake, Dave Cunningham, Dean Guyers, Charles Hamilton, Frank Braun, and J. J. Ewing. "High-efficiency, Q-switched, mode-locked KTP OPO". In OSA Annual Meeting. Washington, D.C.: Optica Publishing Group, 1992. http://dx.doi.org/10.1364/oam.1992.tuww2.

Abstract
We have demonstrated an efficient, synchronously pumped, doubly resonant, KTP optical parametric oscillator (OPO). The efficiencies achieved are in excess of 60% for conversion to 2.128 μm by using a conventional mode-locked and Q-switched Nd:YAG 1.064-μm driver source. In some applications this device may be more attractive than directly using a 2.1-μm laser source. The quantum efficiency of the OPO is unity because it operates at the degenerate point (both the signal and idler waves have essentially the same wavelength at 2.128 μm), and both the signal and idler waves can be extracted to provide equally useful output power. Consequently, the overall conversion efficiency is equal to the photon conversion efficiency. Since the OPO is degenerate, it is also doubly resonant, but it does not suffer from the instability problems commonly associated with such devices. This is in part due to the large nonlinear drive, high output coupling (>95%), and simultaneous existence of many longitudinal modes. Normally, a degenerate OPO has a large frequency bandwidth. However, in KTP the phase matching must be type II, and this provides a narrow bandwidth at degeneracy of approximately 5 cm−1. This fact is extremely important for driving subsequent nonlinear stages. By using the 2-μm output power as a drive source, we have achieved frequency downconversion into the mid-IR in synchronously pumped AgGaSe2 OPO’s and we achieved 40% conversion of 2.128 μm to 3.8 plus 4.8 μm.
5

Ramsey, J. Michael, and William B. Whitten. "Spectrochemical Analysis Using Degenerate Four Wave Mixing". In Laser Applications to Chemical Analysis. Washington, D.C.: Optica Publishing Group, 1987. http://dx.doi.org/10.1364/laca.1987.tha7.

Abstract
Degenerate four-wave mixing, or DFWM, has been shown to have considerable potential as a technique for optical spectrometry (1)(2). Spectral resolution comparable to the natural linewidth can be obtained for atomic vapor samples due to the Doppler free nature of the measurements. To our knowledge, however, the applicability of the DFWM technique to trace elemental analysis has not been explored. Pender and Hesselink (3) have shown that DFWM can take place in an air-acetylene flame with sodium from an aspirated aqueous solution as the sample. However, they were attempting to show that DFWM could be used as a combustion diagnostic tool and involved measurements at relatively high concentrations. In this talk we will present the results of a study of degenerate four-wave mixing in atomic sodium produced in an analytical flame.
6

"GEOMETRICAL DEGENERACY REMOVAL BY VIRTUAL DISTURBANCES - An Application to Surface Reconstruction from Point Slice Samples". En International Conference on Computer Graphics Theory and Applications. SciTePress - Science and and Technology Publications, 2008. http://dx.doi.org/10.5220/0001098101130118.

7

Ting, Yung, Le Ba Tan, Gunawan Hariyanto, Bing-Kuan Hou, Lang Van Thang, Cheng-Yu Chen, and Tran Thai Son. "Improvement of Degeneracy for Hybrid Transducer". In ASME 2009 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. ASMEDC, 2009. http://dx.doi.org/10.1115/detc2009-87753.

Abstract
In this paper, a new design of hybrid transducer consisting of individual longitudinal and torsional vibrators is investigated. The degeneracy approach, i.e., searching for a common resonant frequency of the longitudinal and torsional ceramic vibrators, is the primary task of hybrid transducer design. In this study, a new structure is proposed so that a range of degeneracy can be found, which allows more possible configuration designs of the longitudinal and torsional vibrators. A case study is presented to demonstrate the structure design and the degeneracy approach. In addition, an optimal design for large torque is achieved based on the identified range of degeneracy. With the driving frequency determined for both the longitudinal and torsional vibrators, a simulation is carried out to verify the function and performance of the transducer.
8

BOUYER, P. "A new optimized trapping method to create ultra-cold and degenerate atomic samples". In Optical Trapping Applications. Washington, D.C.: OSA, 2009. http://dx.doi.org/10.1364/ota.2009.otuc5.

9

Schumaker, Bonny L. "What is a broadband squeezed state?" In OSA Annual Meeting. Washington, D.C.: Optica Publishing Group, 1985. http://dx.doi.org/10.1364/oam.1985.fm2.

Abstract
A broadband or two-mode (nondegenerate) squeezed state is the natural two-mode analog of a single-mode (degenerate) squeezed state. The squeezing is a result of correlations between photons in the two modes. Two-mode squeezed states are produced by the same kind of physical process and the same kinds of physical device that produce single-mode squeezed states (e.g., a parametric amplifier), simply by moving away from degeneracy. They are to be contrasted with another kind of two-mode state, one produced by separately squeezing two single modes. The latter are produced by a different kind of physical process and different physical devices from those that produce two-mode squeezed states. States that are products of two single-mode squeezed states do not exhibit squeezing, for the photons in the two modes are not correlated. There is a formal sense, however, in which these different kinds of two-mode states are (unitarily) equivalent. This formal equivalence tells one that, to achieve the desired squeezing by separately squeezing two modes, one would have to use a frequency-converting device before and after squeezing the two modes separately. This process produces the required correlations, hence the squeezing, but it is clearly not the natural way to obtain a broadband squeezed state.
10

Patterson, Frank, J. Brock, and M. Caponi. "Continuous-wave degenerate four-wave mixing in bulk GaAs and GaAs/AlGaAs MQW samples". In OSA Annual Meeting. Washington, D.C.: Optica Publishing Group, 1987. http://dx.doi.org/10.1364/oam.1987.mr12.

Abstract
Degenerate four-wave mixing (DFWM) in GaAs/AlGaAs multiple quantum wells (MQWs) has been demonstrated in forward and backward configurations using pulsed lasers and with cw lasers in the forward configuration.1, 2 In bulk GaAs, DFWM has only been demonstrated at a low temperature with pulsed lasers.3 We report the first demonstration of cw backward DFWM in room temperature bulk GaAs and GaAs/AlGaAs MQW samples. The DFWM reflectivity is measured in the band gap region as a function of laser wavelength with intensity as a parameter. When the MQW samples are pumped at intensities sufficient to saturate the light and heavy hole excitonic resonances, a dispersive DFWM spectrum is observed, consistent with conduction band only contributions.4
