Ready-made bibliography on the topic "Analytical variability"

Create a correct reference in APA, MLA, Chicago, Harvard, and many other citation styles

Choose a source type:

Browse lists of current articles, books, dissertations, abstracts, and other scholarly sources on the topic "Analytical variability".

An "Add to bibliography" button is available next to every work in the bibliography. Use it, and we will automatically create a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of a scholarly publication in .pdf format and read its annotation online, if the relevant parameters are available in the metadata.

Journal articles on the topic "Analytical variability"

1

Banfi, G., L. Drago, and G. Lippi. "Analytical Variability in Athletes Haematological Testing". International Journal of Sports Medicine 31, no. 03 (March 2010): 218. http://dx.doi.org/10.1055/s-0030-1248327.

2

Rosenthal-Allieri, Maria Alessandra, Marie-Line Peritore, Albert Tran, Philippe Halfon, Sylvia Benzaken, and Alain Bernard. "Analytical variability of the Fibrotest proteins". Clinical Biochemistry 38, no. 5 (May 2005): 473–78. http://dx.doi.org/10.1016/j.clinbiochem.2004.12.012.

3

Nicas, Mark, Barton P. Simmons, and Robert C. Spear. "ENVIRONMENTAL VERSUS ANALYTICAL VARIABILITY IN EXPOSURE MEASUREMENTS". American Industrial Hygiene Association Journal 52, no. 12 (December 1991): 553–57. http://dx.doi.org/10.1080/15298669191365199.

4

Budyak, Ivan L., Kristi L. Griffiths, and William F. Weiss. "Estimating analytical variability in two-dimensional data". Analytical Biochemistry 513 (November 2016): 36–38. http://dx.doi.org/10.1016/j.ab.2016.08.021.

5

Fourier, Anthony, Erik Portelius, Henrik Zetterberg, Kaj Blennow, Isabelle Quadrio, and Armand Perret-Liaudet. "Pre-analytical and analytical factors influencing Alzheimer's disease cerebrospinal fluid biomarker variability". Clinica Chimica Acta 449 (September 2015): 9–15. http://dx.doi.org/10.1016/j.cca.2015.05.024.

6

Tayyarah, Rana, Michael J. Morton, and Jason W. Flora. "HPHC Testing of Tobacco and Smoke to Examine Cigarette Temporal Variability". Contributions to Tobacco & Nicotine Research 31, no. 2 (July 1, 2022): 112–26. http://dx.doi.org/10.2478/cttr-2022-0012.

Abstract:
Commercial cigarettes were analyzed for harmful and potentially harmful constituents (HPHCs) in tobacco and smoke to investigate temporal product variability independent of analytical variability over one week, one year, and three years. Cigarettes from the worldwide market with various design features were collected over a 3-year period, stored, and tested concurrently for HPHCs to minimize analytical variability; repeat testing of reference cigarette 3R4F was included as an analytical control for the study design. Physical parameters were found to be relatively consistent. No trends in variability were noted based on blend type, smoke analyte matrix, or magnitude of an HPHC's yield. Combustion-related HPHCs generally showed low variation. Long-term batch-to-batch variability was found to be higher than short-term variability for tobacco-related compounds that have the potential to vary over time due to weather and agronomic practices. “Tar”, nicotine, and carbon monoxide were tested in multiple labs and showed greater lab-to-lab variability than batch-to-batch variability across all phases. Based on the results of this study, commercial cigarette products appear to have relatively low product variability. The low analyte variability noted in this study with products tested under unconventionally controlled analytical conditions serves to indicate that analytical variability may be a significant contributor to overall variability for general product testing over time and in interlaboratory studies. Laboratory controls and using a matched reference product across studies and between laboratories are important to assess testing differences and variability.
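
The study's central comparison, separating batch-to-batch from analytical variability, is classically done with a one-way random-effects ANOVA. The sketch below is a minimal illustration of that decomposition with made-up yield numbers; it is not the paper's code, and the layout (4 batches, 3 analytical replicates) is an assumption for the example.

```python
# Method-of-moments variance components from a one-way random-effects ANOVA:
# var_between = (MS_between - MS_within) / n_replicates.
import numpy as np

# Hypothetical HPHC yields: 4 batches x 3 analytical replicates.
x = np.array([[10.1, 10.4, 10.2],
              [11.0, 10.8, 11.1],
              [ 9.7,  9.9,  9.8],
              [10.6, 10.5, 10.7]])

k, n = x.shape
grand = x.mean()
ms_between = n * ((x.mean(axis=1) - grand) ** 2).sum() / (k - 1)
ms_within = ((x - x.mean(axis=1, keepdims=True)) ** 2).sum() / (k * (n - 1))

var_analytical = ms_within                       # within-batch (analytical) variance
var_batch = max((ms_between - ms_within) / n, 0.0)  # between-batch variance

# Report both as coefficients of variation, as the paper does.
print(np.sqrt(var_analytical) / grand, np.sqrt(var_batch) / grand)
```
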
7

Eller, Peter M., H. Amy Feng, Ruiguang S. Song, Rosa J. Key-Schwartz, Curtis A. Esche, and Jensen H. Groff. "Proficiency Analytical Testing (PAT) Silica Variability, 1990–1998". AIHAJ 60, no. 4 (July 1999): 533–39. http://dx.doi.org/10.1202/0002-8894(1999)060<0533:patsv>2.0.co;2.

8

Bentley, C., J. J. Crawford, and C. A. Broderius. "Analytical and Physiological Variability of Salivary Microbial Counts". Journal of Dental Research 67, no. 11 (November 1988): 1409–13. http://dx.doi.org/10.1177/00220345880670111001.

9

Eller, Peter M., H. Amy Feng, Ruiguang S. Song, Rosa J. Key-Schwartz, Curtis A. Esche, and Jensen H. Groff. "Proficiency Analytical Testing (PAT) Silica Variability, 1990–1998". American Industrial Hygiene Association Journal 60, no. 4 (July 1, 1999): 533–39. http://dx.doi.org/10.1080/00028899908984475.

10

Lim, Chun Yee, Shin Ow Yang, Corey Markus, and Tze Ping Loh. "Calibration frequency and analytical variability of laboratory measurements". Clinica Chimica Acta 539 (January 2023): 87–89. http://dx.doi.org/10.1016/j.cca.2022.12.006.


Doctoral dissertations on the topic "Analytical variability"

1

Gräf, Michael. "Two-Dimensional Analytical Modeling of Tunnel-FETs". Doctoral thesis, Universitat Rovira i Virgili, 2017. http://hdl.handle.net/10803/450516.

Abstract:
Based on a band-to-band current transport mechanism, the Tunnel-FET is able to overcome the physical subthreshold slope limitation of the MOSFET of 60 mV/dec. Therefore, it has become one of the most promising devices to be the successor of the classical MOSFET in the last few years. This thesis describes all necessary steps to analytically model a double-gate Tunnel-FET. The model includes a two-dimensional electrostatic solution in all device regions, which enables even hetero-junction device simulations. Device-performance-limiting Gaussian-shaped doping profiles at the channel junctions are taken into account for a realistic device behavior. Expressions for the band-to-band and trap-assisted-tunneling probabilities are implemented by a quasi two-dimensional WKB approach. The device current is calculated based on Landauer's transmission theory. The model is valid for short-channel devices and stays in good agreement with the TCAD Sentaurus simulation data and with the provided measurements. A general model for random-dopant-fluctuations is introduced, which predicts characteristic device influences on the output current and threshold voltage. The model is applied to MOSFET as well as TFET devices.
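
For orientation, the two standard ingredients the abstract names, a WKB tunneling probability and Landauer's transmission formula for the current, can be written compactly as follows. These are generic textbook forms, not the thesis's exact expressions.

```latex
T_{\mathrm{WKB}} \approx \exp\!\left(-2\int_{x_1}^{x_2}\lvert\kappa(x)\rvert\,\mathrm{d}x\right),
\qquad
I = \frac{2q}{h}\int T(E)\,\bigl[f_{\mathrm{S}}(E)-f_{\mathrm{D}}(E)\bigr]\,\mathrm{d}E
```
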
2

Germani, Élodie. "Exploring and mitigating analytical variability in fMRI results using representation learning". Electronic Thesis or Diss., Université de Rennes (2023-....), 2024. http://www.theses.fr/2024URENS031.

Abstract:
In this thesis, we focus on the variations induced by different analysis methods, also known as analytical variability, in brain imaging studies. This phenomenon is now well known in the community, and our aim is to better understand the factors leading to this variability and to find solutions to better account for it. To do so, I analyse data and explore the relationships between the results of different methods. At the same time, I study the constraints related to data reuse and propose solutions based on artificial intelligence to build more robust studies.
3

Chatfield, Marion J. "Uncertainty of variance estimators in analytical and process variability studies". Thesis, University of Southampton, 2018. https://eprints.soton.ac.uk/422240/.

Abstract:
This thesis demonstrates that the half-t distribution is the prior of choice for estimating uncertainty of variance estimators in routine analysis of analytical and process variance components studies. Industrial studies are often performed to estimate sources of variation, e.g. to improve and quantify measurement or process capability. Understanding the uncertainty of those estimators is important, especially for small studies. A Bayesian analysis is proposed, providing a flexible methodology which easily copes with the complex and varied nature of the studies and the varied quantities of interest. The prior is a fundamental component of a Bayesian analysis. The choice of prior is appraised and the coverage of the credible intervals obtained using six families of priors is assessed. A half-t prior (with several degrees of freedom) on the standard deviation is recommended in preference to a uniform or half-Cauchy prior, when some information exists on the magnitude of variability ‘core’ to the process or analytical method. Whilst a half-t prior has been previously proposed, through extensive simulation it is demonstrated that it is the prior of choice for estimating uncertainty of variance estimators in routine analysis of analytical and process variation studies. The coverage of 95% credible intervals for variance components and total variance is 93% (approximately) or above across a range of realistic scenarios. Other priors investigated, including Jeffreys’, a FLAT prior and inverse gamma distributions on stratum variances available in PROC MIXED in the SAS/STAT® software, are less satisfactory. This evaluation is novel: for one-way variance component designs there is very limited evaluation of the half-t prior when estimating the uncertainty of the variance component estimators; for two-way or more complex designs none has been found. Since the coverage issues were primarily for the mid-level variance component, evaluation of designs more complex than one-way is important. Highest posterior density intervals are recommended, with the metric of the parameter being important. Additionally, a scale based on the intra-class correlation coefficient is proposed for plotting the credible intervals.
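
As a quick illustration of the recommended prior, the sketch below draws from a half-t distribution placed on a standard deviation; the degrees of freedom and scale are illustrative assumptions, not values from the thesis.

```python
# A half-t prior on an SD is |scale * t_nu|: heavy-tailed, but centred on
# prior knowledge of the 'core' variability of the analytical method.
import numpy as np

rng = np.random.default_rng(1)

def sample_half_t(nu, scale, size, rng):
    """Draw |scale * t_nu| samples, i.e. a half-t prior on a standard deviation."""
    return np.abs(scale * rng.standard_t(nu, size))

# Prior draws for a between-run standard deviation with an assumed 'core' scale of 1.0.
sigma = sample_half_t(nu=4, scale=1.0, size=10_000, rng=rng)
print(np.percentile(sigma, [2.5, 50, 97.5]))  # implied prior credible range
```
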
4

Anderson, Neil R. "An investigation of the pre-analytical variability in laboratory testing and its influence on result interpretation and patient management". Thesis, University of Warwick, 2018. http://wrap.warwick.ac.uk/108557/.

Abstract:
Interpretation of laboratory tests in clinical practice is based on an understanding of the disease process within or between individuals. This is demonstrated by the variability of pathology results as compared to the previous result or against the reference range, which is made up of the intrinsic pathophysiological changes together with the variation associated with in vitro changes to the sample. My work is on the identification and minimisation of result variation in the pre-analytical phase, which accounts for 60-70% of the errors associated with laboratory testing. The first project of my thesis is based on four studies that consider the in vitro stability of parathyroid hormone (PTH) and C-reactive protein (CRP), in which significant sample degradation is observed due to sample tube type, anticoagulant used and time to separation. The second project considers ethnic variation as a source of intra-individual variation. Specifically considering intra-individual ethnic variation in total cholesterol (TC) and high density lipoprotein cholesterol (HDLC), significant differences in HDLC were reported between Caucasians and Indo-Asians; in addition, I investigated the relationship between low maternal vitamin B12 concentrations in Caucasian women and cord blood cholesterol. The third project considered the variation in laboratory results due to pre-existing conditions causing interference in common laboratory tests. I published on the effect of lipaemia on common laboratory tests, showing lipaemia does have a significant effect on laboratory tests. The following study found that the raised prolactin seen in rheumatoid arthritis is not artefactual but due to changes in cross-reactivity of prolactin subtypes. The final paper of this project shows, through a collection of case studies, falsely elevated serum calcium levels in patients with paraproteinaemia.
5

XHELAJ, ANDI. "Downburst Wind Field Reconstruction by means of a 2D Analytical Model and Investigation of the Parameter’s Variability through an Ensemble Approach". Doctoral thesis, Università degli studi di Genova, 2022. https://hdl.handle.net/11567/1097493.

Abstract:
A “downburst” is defined as a diverging wind system that occurs when a strong downdraft induces an outflow of damaging winds on or near the ground. Severe wind damage in many parts of the world is often due to thunderstorm outflows, and their knowledge is therefore relevant for structural safety and design wind speed evaluation. Nevertheless, there is not yet a shared model for thunderstorm outflows and their actions on structures. In this work, an analytical model that simulates the horizontal mean wind velocity originating from a travelling downburst is proposed. The horizontal wind velocity is expressed as the vector summation of three independent components: the stationary radial velocity generated by an impinging jet over a flat surface, the downdraft translating velocity, which corresponds to the parent cloud motion, and the boundary layer background wind field at the surface in which the downburst is immersed. A parametric analysis is also developed and coupled with the analytical model, aiming to investigate two observed downburst events and extract their main parameters – e.g. downdraft diameter, touch-down position, translating downdraft speed and direction, intensity and decay period – in order to reconstruct the space-time evolution of these events. Due to the large computational cost of reconstructing a single downburst wind field, a novel strategy is implemented to speed up the process. Two metaheuristic optimization algorithms are used, namely Teaching-Learning-Based Optimization and Differential Evolution. The metric to evaluate the algorithms' efficiency is based on the convergence behaviour of the objective function towards the best solution as the number of iterations increases. The decision variable parameters (e.g. downdraft diameter, touch-down position, translating downdraft speed and direction, intensity, decay period and so on) that minimize the objective function are very important in Wind Engineering, since their knowledge allows statistical analysis of the intense wind fields that are generated during downburst events and therefore allows better definition of the actions that these extreme events exert on structures. Lastly, the proposed model was validated against a strong downburst event that took place in Sânnicolau Mare (Romania) during the summer of 2021. This event was accompanied by hail of 2-3 cm in size, and the hail near the surface was driven by the downburst wind. This means that the horizontal velocity of the ice projectiles near the surface was less than or equal to the horizontal downburst wind velocity. After this strong event, a damage survey was carried out in collaboration between the University of Genoa (Italy) and the University of Bucharest (Romania). The damage survey identified locations of buildings in Sânnicolau Mare that suffered hail damage during the event. The analytical model was used to reproduce the recorded wind speed and direction due to the severe downburst. Using the simulated wind field, the simulated damage “footprint” (i.e., the maximum wind speed that occurred at a given place at any time during the passage of the downburst) was calculated. The simulated footprint matched, to a very good extent, the areas that suffered hail damage, and consequently permitted validation of the proposed analytical model.
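
The model's core construction, horizontal wind as the vector sum of a radial impinging-jet outflow, the downdraft translation, and a background wind, can be sketched in a few lines. The radial profile below is a made-up placeholder, not the thesis's calibrated shape function, and all parameter values are illustrative.

```python
# Toy vector summation of the three downburst wind components.
import numpy as np

def radial_outflow(x, y, xc, yc, v_max, r_max):
    """Radial velocity vector of a stationary impinging jet centred at (xc, yc)."""
    dx, dy = x - xc, y - yc
    r = np.hypot(dx, dy) + 1e-12
    speed = v_max * (r / r_max) * np.exp(1.0 - r / r_max)  # placeholder profile
    return speed * dx / r, speed * dy / r

def total_wind(x, y, xc, yc, v_max, r_max, v_trans, v_background):
    u_r, v_r = radial_outflow(x, y, xc, yc, v_max, r_max)
    return (u_r + v_trans[0] + v_background[0],
            v_r + v_trans[1] + v_background[1])

u, v = total_wind(x=500.0, y=200.0, xc=0.0, yc=0.0,
                  v_max=30.0, r_max=800.0,
                  v_trans=(5.0, 0.0), v_background=(2.0, 1.0))
print(np.hypot(u, v))  # horizontal mean wind speed at (500, 200)
```
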
6

Kim, Dong Yeub. "An Analytical Study of the Short-run Variability of Korea's Balance of payments, 1961-85: Application of Keynesian and Monetary Approaches to the Problem". DigitalCommons@USU, 1989. https://digitalcommons.usu.edu/etd/4104.

Abstract:
The relationships among the balance of payments and other macroeconomic variables in the Korean economy for the period 1961-85 are analyzed in this study. Theoretical studies on the effects of government policies on the economy and the balance of payments were conducted under both the Keynesian and monetary approaches. The Keynesian approach concentrates on the commodity and capital market adjustment factors and does not focus on the money market factors, whereas the monetary approach considers the balance of payments adjustments as a symptom of money market disequilibrium alone. The basic assumptions of those two approaches, taken separately, are not fully relevant to the Korean economy, which has unemployed resources, a high proportion of non-traded goods to traded goods, and monetary effects of balance of payments changes. Therefore, a model combining monetary and real factors to explain the short-run behavior of Korea's balance of payments in a single framework is developed. The empirical results of the combined model show that its explanatory power is much higher than either of the two models taken separately. For balance of payments adjustment policy in Korea during the period 1961-85, fiscal and foreign exchange rate policy instruments were found to be very effective in the short run, but monetary policy instruments were not.
7

Fritsch, Clément. "Étude de la variabilité inter et intra spécifique des extractibles présents dans les écorces de résineux et de feuillus exploités industriellement dans le nord-est de la France". Electronic Thesis or Diss., Université de Lorraine, 2021. https://docnum.univ-lorraine.fr/ulprive/DDOC_T_2021_0300_FRITSCH.pdf.

Abstract:
This work focuses on the study of the inter- and intra-specific variability of the extractive substances of the barks of Abies alba, Picea abies, Pseudotsuga menziesii, Quercus robur and Fagus sylvatica. New knowledge about the variability in chemical composition of different bark types can be used as a decision-support tool for a future industrial valorization, which would bring added value to the wood industry. Thus, the constituents of the bark are separated into two categories: the polyphenols, which were analyzed mainly from the water/ethanol extracts, and the biopolymers, which were extracted from the bark residues from the successive extractions. For the water/ethanol extractions (1:1), 5 species with 8 trees per species and more than 10 heights per tree were extracted. An increase of extractive rates with height was observed for softwoods, and from these results it appears that softwoods contain more extractives than hardwoods and that the amount of extractives increases with height. For the successive extractions with the toluene/ethanol mixture (2:1), then with ethanol (100%), only 4 trees per species were extracted, since the aim was essentially to recover bark free of extractives in order to analyze the biopolymers. Subsequently, the bark extracts were studied using complementary analytical methods such as LC-UV-MS, GC-MS, IR, 1H NMR, MALDI-TOF and SEC, as well as specific tests to determine the levels of total phenols, holocellulose, α-cellulose, hemicellulose, lignin, suberin and ash. The results demonstrate the existence of variability in the chemical composition of softwood bark as a function of height: the levels of polyphenols, suberin, lignin and holocellulose decrease with height while the levels of terpenes and ash increase with height. Using the same analytical protocol, significant inter-tree variability was demonstrated for hardwood bark. Some observed differences have been explained by biological parameters such as height, age of tissues, bark storage conditions or even different allometry between trees.
8

Skinner, Michael A. "Hapsite® gas chromatograph-mass spectrometer (GC/MS) variability assessment". Download the thesis in PDF, 2005. http://www.lrc.usuhs.mil/dissertations/pdf/Skinner2005.pdf.

9

Cejas, Agustin Javier Diaz. "Aperfeiçoamentos em uma framework para análise de folgas em sistemas sócio-técnicos complexos: aplicação em um laboratório químico". Biblioteca Digital de Teses e Dissertações da UFRGS, 2018. http://hdl.handle.net/10183/180636.

Abstract:
Measures for health and safety management are of paramount importance in chemical laboratories. People who perform any activity in a laboratory environment are exposed to a variety of hazards, and consequently there is a risk of adverse health and safety events. This work was developed in a chemical laboratory of a federal university, and its main objective is the improvement of a framework that allows a systematic qualitative and quantitative analysis of the slack present in a complex socio-technical system. Tools of Resilience Engineering were used to study the laboratory, which was considered a complex socio-technical system. One of the characteristics of a resilient system is the ability to deal with variability, which can be obtained through slack resources in the system. The use of the framework allowed important data to be obtained for the analysis of the system, along with suggestions for improvements. The improvements proposed in the framework proved to be effective, especially in the quantification of slack and variability, owing to the use of the AHP (Analytical Hierarchy Process) method for data analysis. The AHP method made it possible to replace the use of questionnaires for the entire team with an assessment directed at experts. By using AHP, data can be acquired more quickly. Another gain obtained with the use of the AHP method was the possibility of reducing one stage of the framework, making it more concise.
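
For readers unfamiliar with AHP, the sketch below shows the standard eigenvector calculation of priority weights and the consistency ratio from a pairwise-comparison matrix. The judgement values are hypothetical and unrelated to the dissertation's data.

```python
# Priority weights via the principal eigenvector of a pairwise-comparison
# matrix, plus Saaty's consistency ratio.
import numpy as np

A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])  # hypothetical expert judgements

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                           # priority weights

n = A.shape[0]
ci = (eigvals[k].real - n) / (n - 1)   # consistency index
ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]    # random index (Saaty)
print(w, ci / ri)                      # consistency ratio < 0.1 is acceptable
```
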
10

McConnell, Meghan. "Advancements in the Evaluation and Implementation of Heart Rate Variability Analytics". Thesis, Griffith University, 2021. http://hdl.handle.net/10072/404855.

Abstract:
Clinical applications for heart rate variability (HRV) have become increasingly popular, gaining momentum and value as society's increased understanding of physiology reveals their true potential to reflect health. An additional reason for the rising popularity of HRV analysis, along with many other algorithm-based medical processes, is the relatively recent exponential increase of computing power and capabilities. Despite this, many medical standards lag behind this booming increase in scientific knowledge, as the risks and precautions involved with healthcare necessarily take priority. As a result, the standards which pertain to the acceptable tolerance for accurate R-peak detection have remained unchanged for decades. For similar reasons, medical software is also prone to lag behind state-of-the-art developments. Yet, society is currently on the precipice of an age of high computational abilities, mass data storage, and capabilities to apply deep learning algorithms to reveal patterns that were previously inconceivable. So, when considering the needs of the future in relation to the place of HRV in healthcare, there is a distinct need for its accurate and precise collection, storage, and processing. In the work presented in this dissertation, the overarching aim was to increase the reliability of electrocardiogram (ECG) based HRV for use in predictive health analytics. To ensure both clarity and attainability, this project-level aim was broken down and addressed in a series of several works. The aim of the first work was to address the problems associated with the precision specified for accurate peak detection, and thereby increase the reliability of predictive health analytics generated using HRV metrics. The study conducted around this initial aim investigates the specifics of match window requirements, clarifies the difference between fiducial marker and QRS complex detection, and makes recommendations on the precision required for accurate HRV metric extraction. In the second work, the aim was to ensure that there is a reliable foundation for the conduct of HRV-related research. Here, a thorough investigation of relevant literature revealed the lack of suitable software, particularly for research requiring the analysis of large databases. Consequently, an improved HRV analysis platform was developed. Through use of both user feedback and quantitative comparison to highly regarded software, the proposed platform is shown to offer a similar standard in estimated HRV metrics but to require significantly less manual effort (a batch-processing approach) than the traditional single-patient-focused approach. The third work also addressed this aim, providing the base peak detection algorithm implemented within the HRV analysis platform. Experimentation undertaken here ensured that the developed algorithm performed precise fiducial marker detection, thereby increasing the reliability of the generated HRV metrics (measured against the framework presented in the first work). In the fourth work, the aim was to address the lack of published literature on the relationship between ECG sampling frequency (fs) and extracted HRV, in order to further ensure the reliability of predictive health analytics generated using HRV metrics. Here, a quantitative experimental approach was taken to evaluate the impact of ECG fs on subsequent estimations of HRV. This experimentation resulted in a recommendation for the minimum required ECG fs for reliable HRV extraction.
The aim of the final work was to further improve the foundation for future predictive health analytics by developing a robust pre-processing algorithm capable of autonomous detection of regions of valid ECG signal. This type of algorithm should be considered of critical importance to furthering machine learning (ML) based applications in the medical field. ML algorithms are heavily reliant on access to vast amounts of data, and without an automated pre-processing stage would require an unrealistic amount of hand-processing for implementation.
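
As a concrete reference point for the HRV metrics discussed throughout, the sketch below computes two standard time-domain measures (SDNN and RMSSD) from R-peak times. It is a generic illustration with simulated peaks, not the platform described in the thesis.

```python
# SDNN and RMSSD from a sequence of R-peak times.
import numpy as np

def hrv_time_domain(r_peaks_s):
    """r_peaks_s: R-peak times in seconds; returns (SDNN, RMSSD) in ms."""
    rr = np.diff(r_peaks_s) * 1000.0            # RR intervals in ms
    sdnn = np.std(rr, ddof=1)                   # overall variability
    rmssd = np.sqrt(np.mean(np.diff(rr) ** 2))  # beat-to-beat variability
    return sdnn, rmssd

# Simulated R-peaks: ~0.8 s mean RR with small random fluctuations.
peaks = np.cumsum(np.r_[0.0, 0.8 + 0.05 * np.random.default_rng(0).standard_normal(60)])
print(hrv_time_domain(peaks))
```
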

Books on the topic "Analytical variability"

1

Cooper, Marcia Janet. Biological and analytical variability of repeated transferrin receptor and ferritin measurements. Ottawa: National Library of Canada, 1995.

2

Reuer, Matthew K. Centennial-scale elemental and isotopic variability in the tropical and subtropical North Atlantic Ocean. Cambridge, Mass: Massachusetts Institute of Technology, 2002.

3

National Institute for Standards and Technology (U.S.) and Geological Survey (U.S.), eds. Assessment of chemical variability in three independently prepared batches of National Institute for Standards and Technology SRM 2704, Buffalo River Sediment. [Denver, CO]: U.S. Dept. of the Interior, Geological Survey, 1994.

4

Spyridaki, Athina. Response Variability of Statically Determinate Beam Structures Following Non-Linear Constitutive Laws and Analytical identification of progressive collapse modes of steel frames. [New York, N.Y.?]: [publisher not identified], 2017.

5

Kirsanov, Mihail, and Vladimir Kozlov. Flat farms. Schemes and calculation formulas: a reference book. Volume 2. ru: INFRA-M Academic Publishing LLC., 2023. http://dx.doi.org/10.12737/1918490.

Abstract:
The handbook contains 99 schemes of flat regular statically determinate trusses and exact formulas for calculating the forces in the rods and the deflection depending on the number of panels. Algorithms for obtaining analytical solutions in the Maple computer mathematics system are described. Cases of kinematic variability of some truss schemes with a certain number of panels have been noted. An algorithm for the analytical calculation of a spatial cantilever truss and an axisymmetric dome is described. An example truss calculation is considered. Designed for engineers, researchers, students and postgraduates of technical universities.
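
The book's approach, exact symbolic member forces as functions of the geometry and load, can be illustrated with a toy statically determinate joint. The book itself uses Maple; the sketch below does an analogous derivation in Python with SymPy, for a hypothetical two-bar joint rather than any scheme from the handbook.

```python
# Symbolic equilibrium of a loaded two-bar truss joint: member 1 is a
# diagonal to a support at horizontal distance a and height h, member 2 is
# horizontal; a downward load P acts at the joint. Tension is positive.
import sympy as sp

P, a, h = sp.symbols("P a h", positive=True)
N1, N2 = sp.symbols("N1 N2")

L1 = sp.sqrt(a**2 + h**2)          # diagonal member length
eq_x = sp.Eq(N1 * a / L1 + N2, 0)  # horizontal equilibrium
eq_y = sp.Eq(N1 * h / L1 - P, 0)   # vertical equilibrium

sol = sp.solve([eq_x, eq_y], [N1, N2], dict=True)[0]
print(sp.simplify(sol[N1]), sp.simplify(sol[N2]))
# N1 = P*sqrt(a**2 + h**2)/h (tension), N2 = -P*a/h (compression)
```
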
6

Kirsanov, Mihail, and Vladimir Kozlov. Flat farms. Schemes and calculation formulas: a reference book. Volume 3. ru: INFRA-M Academic Publishing LLC., 2023. http://dx.doi.org/10.12737/1939108.

Abstract:
The reference book contains diagrams of flat regular statically determinate arched, frame and cantilever trusses. Formulas are given for calculating the forces in the rods and the deflection of structures depending on the number of panels. An algorithm for obtaining analytical solutions in the Maple computer mathematics system is described. Cases of kinematic variability of some truss schemes are considered. Designed for engineers, researchers, students and postgraduates of technical universities.
7

Assessment of chemical variability in three independently prepared batches of National Institute for Standards and Technology SRM 2704, Buffalo River Sediment. [Denver, CO]: U.S. Dept. of the Interior, Geological Survey, 1994.

8

Assessment of chemical variability in three independently prepared batches of National Institute of Standards and Technology SRM 2704, Buffalo River Sediment. [Denver, CO]: U.S. Dept. of the Interior, Geological Survey, 1994.

9

Velkushanova, Konstantina, Linda Strande, Mariska Ronteltap, Thammarat Koottatep, Damir Brdjanovic, and Chris Buckley, eds. Methods for Faecal Sludge Analysis. IWA Publishing, 2021. http://dx.doi.org/10.2166/9781780409122.

Abstract:
Faecal sludge management is recognized globally as an essential component of city-wide inclusive sanitation. However, a major gap in developing appropriate and adequate management and monitoring for faecal sludge is the ability to understand and predict the characteristics and volumes of accumulated faecal sludge, and correlations to source populations. Since standard methods for sampling and analysing faecal sludge do not currently exist, results are not comparable, the actual variability is not yet fully understood, and the transfer of knowledge and data between different regions and institutions can be challenging and often arbitrary. Due to this lack of standard analytical methods for faecal sludge, methods from other fields, such as wastewater management, and soil and food science are frequently applied. However, these methods are not necessarily the most suitable for faecal sludge analysis, and have not been specifically adapted for this purpose. Characteristics of faecal sludge can differ from those of these other matrices by orders of magnitude. There is also a lack of standard methods for sampling, which is complicated by the difficult nature of in situ sampling, the wide range of onsite sanitation technologies and potential sampling locations, and the diverse heterogeneity of faecal sludge within onsite containments and within cities. This illustrates the urgent need to establish common methods and procedures for faecal sludge characterisation, quantification, sampling, and modelling. The aim of this book is to provide a basis for standardised methods for the analysis of faecal sludge from onsite sanitation technologies, for improved communication between sanitation practitioners, and for greater confidence in the generated data. The book presents background information on types of faecal sludge, methods for sample collection, health and safety procedures for handling, case studies of experimental design, an approach for estimating faecal sludge at community to city-wide scales, modelling containment and treatment processes, recipes for simulants, and laboratory methods for faecal sludge analysis currently in use by faecal sludge laboratories. This book will be beneficial for researchers, laboratory technicians, academics, students and sanitation practitioners.

Book chapters on the topic "Analytical variability"

1

Wätzig, Hermann. "Acceptance Criteria and Analytical Variability". In Method Validation in Pharmaceutical Analysis, 265–80. Weinheim, FRG: Wiley-VCH Verlag GmbH & Co. KGaA, 2005. http://dx.doi.org/10.1002/3527604685.ch6.

2

Herrero, Victor, Alberto Abelló, and Oscar Romero. "NOSQL Design for Analytical Workloads: Variability Matters". In Conceptual Modeling, 50–64. Cham: Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-46397-1_4.

3

Tsimashenka, Iryna, William Knottenbelt, and Peter Harrison. "Controlling Variability in Split-Merge Systems". In Analytical and Stochastic Modeling Techniques and Applications, 165–77. Berlin, Heidelberg: Springer Berlin Heidelberg, 2012. http://dx.doi.org/10.1007/978-3-642-30782-9_12.

4

Stanta, Giorgio. "Tissue Heterogeneity as a Pre-analytical Source of Variability". In Pre-Analytics of Pathological Specimens in Oncology, 35–43. Cham: Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-13957-9_4.

5

Herber, R. F. M., and K. H. Schaller. "Analytical Variability of Biological Parameters of Exposure and Early Effects". In Health Surveillance of Individual Workers Exposed to Chemical Agents, 102–9. Berlin, Heidelberg: Springer Berlin Heidelberg, 1988. http://dx.doi.org/10.1007/978-3-642-73476-2_13.

6

Lycett, Stephen J. "Cultural Transmission, Genetic Models and Palaeolithic Variability: Integrative Analytical Approaches". In New Perspectives on Old Stones, 207–34. New York, NY: Springer New York, 2010. http://dx.doi.org/10.1007/978-1-4419-6861-6_9.

7

Wille, Farina. "Empirical Analysis of Behavioral Variability". In A Behavior Analytical Perspective on the Relationship of Context Structure and Energy Using Flexibility in Problems of Supply and Demand Mismatch, 47–118. Wiesbaden: Springer Fachmedien Wiesbaden, 2021. http://dx.doi.org/10.1007/978-3-658-35613-2_4.

8

Anala M., and B. P. Harish. "Process Variability-Aware Analytical Modeling of Delay in Subthreshold Regime with Device Stacking Effects". In Communications in Computer and Information Science, 453–67. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-91244-4_36.

9

Wille, Farina. "From Variability to Shifting Appliance Using Behavior for Demand Side Management Purposes". In A Behavior Analytical Perspective on the Relationship of Context Structure and Energy Using Flexibility in Problems of Supply and Demand Mismatch, 119–58. Wiesbaden: Springer Fachmedien Wiesbaden, 2021. http://dx.doi.org/10.1007/978-3-658-35613-2_5.

10

Mathews, Ky L., and José Crossa. "Experimental Design for Plant Improvement". In Wheat Improvement, 215–35. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-030-90673-3_13.

Abstract:
Sound experimental design underpins successful plant improvement research. Robust experimental designs respect fundamental principles including replication, randomization and blocking, and avoid bias and pseudo-replication. Classical experimental designs seek to mitigate the effects of spatial variability with resolvable block plot structures. Recent developments in experimental design theory and software enable optimal model-based designs tailored to the experimental purpose. Optimal model-based designs anticipate the analytical model and incorporate information previously used only in the analysis. New technologies, such as genomics, rapid cycle breeding and high-throughput phenotyping, require flexible design solutions which optimize resources whilst upholding fundamental design principles. This chapter describes experimental design principles in the context of classical designs and introduces the burgeoning field of model-based design in the context of plant improvement science.
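
As a minimal illustration of the randomization-within-blocks principle the chapter describes, the sketch below generates a randomized complete block design; the treatment names and block count are arbitrary examples, not from the chapter.

```python
# Randomized complete block design: every treatment appears once per block,
# with an independent random plot order within each block.
import numpy as np

rng = np.random.default_rng(42)
treatments = ["A", "B", "C", "D"]
n_blocks = 3

layout = {f"block_{b+1}": list(rng.permutation(treatments))
          for b in range(n_blocks)}
for block, plots in layout.items():
    print(block, plots)
```
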

Conference abstracts on the topic "Analytical variability"

1

Viall, Wesley, Farbod Fahimi, and Babak Shotorban. "Analytical and Experimental Structural Load Variability in the UH-60A Airloads Program". In Vertical Flight Society 80th Annual Forum & Technology Display, 1–7. The Vertical Flight Society, 2024. http://dx.doi.org/10.4050/f-0080-2024-1253.

Abstract:
A framework for statistical comparison between analytical and experimental structural loads has been developed and applied to approximately 100 counters within the UH-60A Airloads test program. This framework relies on established structural load variability methods, with novel applications to analytical structural load development and maneuver time-transient analysis. The analytical results are from Rotorcraft Comprehensive Analysis System (RCAS) spanwise structural loads developed with hub load and spanwise aerodynamic loads prescribed. RCAS consistently under-predicted the Coefficient of Variation (COV) associated with spanwise Normal bending when compared to flight data. This resulted in significant scale factors being required to achieve a μ+2σ reliability for structural load development. RCAS results for Edgewise bending scale factors proved slightly better than for Normal bending, in addition to more even over-/under-prediction of COV when compared to flight data.
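
The two quantities discussed, the coefficient of variation of a measured load population and the scale factor needed to bring an analytical prediction up to a mu + 2*sigma reliability level, can be illustrated as follows. All numbers are hypothetical, not UH-60A data.

```python
# COV of a set of measured peak loads and the scale factor that lifts an
# analytical prediction to the mu + 2*sigma design level.
import numpy as np

loads = np.array([1020., 980., 1060., 940., 1010., 990.])  # hypothetical flight loads

mu, sigma = loads.mean(), loads.std(ddof=1)
cov = sigma / mu                        # coefficient of variation
design_load = mu + 2.0 * sigma          # mu + 2*sigma reliability target

predicted = 955.0                       # hypothetical analytical prediction
scale_factor = design_load / predicted  # factor applied to the prediction
print(cov, design_load, scale_factor)
```
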
2

Ciprini, Stefano. "Analytical and numerical time-dependent modelling of SSC blazars variability". In Workshop on Blazar Variability across the Electromagnetic Spectrum. Trieste, Italy: Sissa Medialab, 2009. http://dx.doi.org/10.22323/1.063.0073.

3

Challa, Bhavani Sankar, Jai Narayan Tripathi, and Ramachandra Achar. "A Semi-Analytical Approach for Variability-Aware Jitter Estimation". In 2023 IEEE Electrical Design of Advanced Packaging and Systems (EDAPS). IEEE, 2023. http://dx.doi.org/10.1109/edaps58880.2023.10468526.

4

Vardhan, P. Harsha, Sushant Mittal, A. S. Shekhawat, Swaroop Ganguly, and Udayan Ganguly. "Analytical modeling of metal gate granularity induced Vt variability in NWFETs". In 2016 74th Annual Device Research Conference (DRC). IEEE, 2016. http://dx.doi.org/10.1109/drc.2016.7548446.

5

Amita, Ajinkya Gorad, and Udayan Ganguly. "Analytical Estimation of LER-Like Variability in GAA Nano-Sheet Transistors". In 2019 International Symposium on VLSI Technology, Systems and Application (VLSI-TSA). IEEE, 2019. http://dx.doi.org/10.1109/vlsi-tsa.2019.8804637.

6

Kumar, Aditi, Jonathan Segal, Helen Steed, Detlef Janke, Kristin Gravdal, Matthew Brookes, and Hafid O. Al-Hassi. "P239 Pre-analytical DNA yield and variability influences microbiome laboratory analysis". In Abstracts of the BSG Annual Meeting, 20–23 June 2022. BMJ Publishing Group Ltd and British Society of Gastroenterology, 2022. http://dx.doi.org/10.1136/gutjnl-2022-bsg.293.

7

Yu Cao, and L. T. Clark. "Mapping statistical process variations toward circuit performance variability: an analytical modeling approach". In 2005 42nd Design Automation Conference. IEEE, 2005. http://dx.doi.org/10.1109/dac.2005.193893.

8

Gummalla, Samatha, Anupama R. Subramaniam, Yu Cao, and Chaitali Chakrabarti. "An analytical approach to efficient circuit variability analysis in scaled CMOS design". In 2012 13th International Symposium on Quality Electronic Design (ISQED). IEEE, 2012. http://dx.doi.org/10.1109/isqed.2012.6187560.

9

M., Anala, and B. P. Harish. "Analytical Modeling of Process Variability in Subthreshold Regime for Ultra Low Power Applications". In 2018 IEEE Asia Pacific Conference on Circuits and Systems (APCCAS). IEEE, 2018. http://dx.doi.org/10.1109/apccas.2018.8605632.

10

Ledoux, Yann, Alain Sergent, and Robert Arrieux. "Impact Of The Material Variability On The Stamping Process: Numerical And Analytical Analysis". In MATERIALS PROCESSING AND DESIGN; Modeling, Simulation and Applications; NUMIFORM '07; Proceedings of the 9th International Conference on Numerical Methods in Industrial Forming Processes. AIP, 2007. http://dx.doi.org/10.1063/1.2740975.


Organizational reports on the topic "Analytical variability"

1

Edwards, T. B., and D. K. Peeler. Analytical Plans Supporting The Sludge Batch 8 Glass Variability Study Being Conducted By Energysolutions And Cua's Vitreous State Laboratory. Office of Scientific and Technical Information (OSTI), November 2012. http://dx.doi.org/10.2172/1055761.

2

Vecherin, Sergey, Stephen Ketcham, Aaron Meyer, Kyle Dunn, Jacob Desmond, and Michael Parker. Short-range near-surface seismic ensemble predictions and uncertainty quantification for layered medium. Engineer Research and Development Center (U.S.), September 2022. http://dx.doi.org/10.21079/11681/45300.

Abstract:
To make a prediction for seismic signal propagation, one needs to specify the physical properties and subsurface ground structure of the site. This information is frequently unknown or estimated with significant uncertainty. This paper describes a methodology for probabilistic seismic ensemble prediction for vertically stratified soils and short ranges with no in situ site characterization. Instead of specifying viscoelastic site properties, the methodology operates with probability distribution functions of these properties, taking into account analytical and empirical relationships among viscoelastic variables. This yields ensemble realizations of signal arrivals at specified locations where statistical properties of the signals can be estimated. Such ensemble predictions can be useful for preliminary site characterization, for military applications, and for risk analysis for remote or inaccessible locations for which no data can be acquired. Comparison with experiments revealed that measured signals are not always within the predicted ranges of variability. Variance-based global sensitivity analysis has shown that the most significant parameters for signal amplitude predictions in the developed stochastic model are the uncertainty in the shear quality factor and the Poisson ratio above the water table depth.
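
The ensemble methodology can be caricatured as Monte Carlo propagation of property distributions through a forward model. The sketch below uses a toy amplitude model (geometric spreading plus anelastic attenuation) and illustrative priors on shear speed and quality factor; it is not the report's actual model or distributions.

```python
# Draw uncertain soil properties, propagate each draw through a forward
# model, and summarize the spread of predicted signal amplitudes.
import numpy as np

rng = np.random.default_rng(7)
n_ens = 2000

vs = rng.uniform(150.0, 400.0, n_ens)                         # shear wave speed, m/s
q = rng.lognormal(mean=np.log(20.0), sigma=0.5, size=n_ens)   # shear quality factor

def toy_amplitude(vs, q, r=50.0, f=30.0):
    """Toy forward model: 1/r spreading times exp(-pi*f*r/(Q*v)) attenuation."""
    return (1.0 / r) * np.exp(-np.pi * f * r / (q * vs))

amp = toy_amplitude(vs, q)
print(np.percentile(amp, [5, 50, 95]))  # ensemble prediction interval
```
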
3

Burns, Malcom, and Gavin Nixon. Literature review on analytical methods for the detection of precision bred products. Food Standards Agency, September 2023. http://dx.doi.org/10.46756/sci.fsa.ney927.

Abstract:
The Genetic Technology (Precision Breeding) Act (England) aims to develop a science-based process for the regulation and authorisation of precision bred organisms (PBOs). PBOs are created by genetic technologies but exhibit changes which could have occurred through traditional processes. This current review, commissioned by the Food Standards Agency (FSA), aims to clarify existing terminologies, explore viable methods for the detection, identification, and quantification of products of precision breeding techniques, address and identify potential solutions to the analytical challenges presented, and provide recommendations for working towards an infrastructure to support detection of precision bred products in the future. The review includes a summary of the terminology in relation to analytical approaches for detection of precision bred products. A harmonised set of terminology contributes towards promoting further understanding of the common terms used in genome editing. A review of the current state of the art of potential methods for the detection, identification and quantification of precision bred products in the UK has been provided. Parallels are drawn with the evolution of synergistic analytical approaches for the detection of Genetically Modified Organisms (GMOs), where molecular biology techniques are used to detect DNA sequence changes in an organism’s genome. The scope and limitations of targeted and untargeted methods are summarised. Current scientific opinion supports that modern molecular biology techniques (i.e., quantitative real-time Polymerase Chain Reaction (qPCR), digital PCR (dPCR) and Next Generation Sequencing (NGS)) have the technical capability to detect small alterations in an organism’s genome, given specific prerequisites of a priori information on the DNA sequence of interest and of the associated flanking regions. These techniques also provide the best infrastructure for developing potential approaches for detection of PBOs. Should sufficient information be known regarding a sequence alteration and confidence can be attributed to this being specific to a PBO line, then detection, identification and quantification can potentially be achieved. Genome editing and new mutagenesis techniques are umbrella terms, incorporating a plethora of approaches with diverse modes of action and resultant mutational changes. Generalisations regarding techniques and methods for detection for all PBO products are not appropriate, and each genome edited product may have to be assessed on a case-by-case basis. The application of modern molecular biology techniques, in isolation and by targeting just a single alteration, is unlikely to provide unequivocal evidence of the source of that variation, be that as a result of precision breeding or as a result of traditional processes. In specific instances, detection and identification may be technically possible, if enough additional information is available in order to prove that a DNA sequence or sequences are unique to a specific genome edited line (e.g., following certain types of Site-Directed Nuclease-3 (SDN-3) based approaches). The scope, gaps, and limitations associated with traceability of PBO products were examined, to identify current and future challenges. Alongside these, recommendations were made to provide the infrastructure for working towards a toolkit for the design, development and implementation of analytical methods for detection of PBO products.
Recognition is given that fully effective methods for PBO detection have yet to be realised, so these recommendations have been made as a tool for progressing the current state-of-the-art for research into such methods. Recommendations for the following five main challenges were identified. Firstly, PBOs submitted for authorisation should be assessed on a case-by-case basis in terms of the extent, type and number of genetic changes, to make an informed decision on the likelihood of a molecular biology method being developed for unequivocal identification of that specific PBO. The second recommendation is that a specialist review be conducted, potentially informed by UK and EU governmental departments, to monitor those PBOs destined for the authorisation process, and actively assess the extent of the genetic variability and mutations, to make an informed decision on the type and complexity of detection methods that need to be developed. This could be further informed as part of the authorisation process and augmented via a publicly available register or database. Thirdly, further specialist research and development, allied with laboratory-based evidence, is required to evaluate the potential of using a weight of evidence approach for the design and development of detection methods for PBOs. This concept centres on using other indicators, aside from the single mutation of interest, to increase the likelihood of providing a unique signature or footprint. This includes consideration of the genetic background, flanking regions, off-target mutations, potential CRISPR/Cas activity, feasibility of heritable epigenetic and epitranscriptomic changes, as well as supplementary material from supplier, origin, pedigree and other documentation. Fourthly, additional work is recommended, evaluating the extent/type/nature of the genetic changes, and assessing the feasibility of applying threshold limits associated with these genetic changes to make any distinction on how they may have occurred. Such a probabilistic approach, supported with bioinformatics, to determine the likelihood of particular changes occurring through genome editing or traditional processes, could facilitate rapid classification and pragmatic labelling of products and organisms containing specific mutations more readily. Finally, several scientific publications on detection of genome edited products have been based on theoretical principles. It is recommended to further qualify these using evidence-based practical experimental work in the laboratory environment. Additional challenges and recommendations regarding the design, development and implementation of potential detection methods were also identified. Modern molecular biology-based techniques, inclusive of qPCR, dPCR, and NGS, in combination with appropriate bioinformatics pipelines, continue to offer the best analytical potential for developing methods for detecting PBOs. dPCR and NGS may offer the best technical potential, but qPCR remains the most practicable option as it is embedded in most analytical laboratories. Traditional screening approaches, similar to those for conventional transgenic GMOs, cannot easily be used for PBOs due to the deficit in common control elements incorporated into the host genome. However, some limited screening may be appropriate for PBOs as part of a triage system, should a priori information be known regarding the sequences of interest. The current deficit of suitable methods to detect and identify PBOs precludes accurate PBO quantification.
Development of suitable reference materials to aid in the traceability of PBOs remains an issue, particularly for those PBOs which harbour on- and off-target mutations that can segregate. Off-target mutations may provide an additional tool to augment methods for detection, but unless they exhibit complete genetic linkage to the sequence of interest, they too can segregate out in subsequent generations. Further research should be conducted on the likelihood of multiple mutations segregating out in a PBO, to help inform the development of appropriate PBO reference materials, as well as the potential of using off-target mutations as an additional tool for PBO traceability. Whilst recognising the technical challenges of developing and maintaining pan-genomic databases, this report recommends that the UK continue to consider the development of such a resource, either as a UK-centric version or, ideally, through engagement in parallel EU and international activities to better achieve harmonisation and shared responsibilities. Such databases would be an invaluable resource in the design of reliable detection methods, as well as for confirming that a mutation is the result of genome editing. PBOs and their products show great potential within the agri-food sector, necessitating a science-based analytical framework to support UK legislation, business and consumers. Differentiating between PBOs generated through genome editing and organisms that exhibit the same mutational change through traditional processes remains analytically challenging, but a broad set of diagnostic technologies (e.g., qPCR, NGS, dPCR) coupled with pan-genomic databases and bioinformatics approaches may help to fill this analytical gap and support the safety, transparency, proportionality, traceability and consumer confidence associated with the UK food chain.
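The weight-of-evidence and probabilistic-classification recommendations above can be made concrete with a minimal sketch. Everything below is hypothetical and for illustration only: the evidence items, the per-item likelihoods, and the decision threshold are invented, not taken from the review; the sketch simply shows how independent indicators (the on-target edit, a flanking-region signature, an off-target pattern) could be combined as log-likelihood ratios to score whether a sample is more consistent with a specific PBO line than with a conventional origin.

```python
import math

# Hypothetical evidence items: P(observed | specific PBO line) vs.
# P(observed | conventional origin). All values are illustrative placeholders.
EVIDENCE = {
    "on_target_edit_present":   (0.99, 0.10),  # the same SNP can arise naturally
    "flanking_signature_match": (0.95, 0.02),
    "off_target_pattern_match": (0.80, 0.01),  # off-targets may segregate out
}

def log_likelihood_ratio(observations: dict) -> float:
    """Combine independent evidence items into a summed log10 likelihood ratio."""
    llr = 0.0
    for name, observed in observations.items():
        p_pbo, p_conv = EVIDENCE[name]
        if observed:
            llr += math.log10(p_pbo / p_conv)
        else:
            llr += math.log10((1 - p_pbo) / (1 - p_conv))
    return llr

sample = {
    "on_target_edit_present": True,
    "flanking_signature_match": True,
    "off_target_pattern_match": False,  # e.g., segregated out in later generations
}
llr = log_likelihood_ratio(sample)
verdict = "consistent with the PBO line" if llr > 2.0 else "inconclusive"
print(f"log10 likelihood ratio = {llr:.2f} -> {verdict}")
```

Note how the absent off-target mutation pulls the score down: a single targeted assay rarely settles origin on its own, which is exactly why the review recommends combining indicators.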
Style: APA, Harvard, Vancouver, ISO, etc.
4

Bishop, Megan, Jay Clausen, Samuel Beal, and Patrick Sims. Comparison of the quantitation of heavy metals in soil using handheld LIBS, XRFS, and ICP-OES. Engineer Research and Development Center (U.S.), June 2023. http://dx.doi.org/10.21079/11681/47182.

Full text of the source
Abstract:
Handheld laser-induced breakdown spectroscopy (LIBS) is an emerging analytical technique that shows the potential to replace X-ray fluorescence spectroscopy (XRFS) in the field characterization of soils containing heavy metals. This study explored the accuracy and precision of handheld LIBS for analyzing soils containing copper and zinc to support LIBS as a replacement for XRFS technology in situ. Success was defined by handheld LIBS results that could be replicated across field analyzers and verified by inductively coupled plasma–optical emission spectrometry (ICP-OES). A total of 108 soil samples from eight military installations were pressed into 13 mm pellets and then analyzed by XRFS and LIBS. Handheld LIBS has a spot-size area 100-fold smaller than that of XRFS, and though it provided accurate measurements for NIST-certified reference materials, it was not able to measure unknown soils of varying texture and high particle-size variability, regardless of sample size. Thus, soil-sample particle-size heterogeneity hindered the ability to provide accurate results and to replicate quantitation results across LIBS and XRFS. Increasing the number of particles encountered by each shot through particle-size reduction improved both the field-analyzer correlation and the correlation between handheld LIBS and ICP-OES from weak (<15%) to strong (>80%).
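The weak-to-strong correlation improvement reported above can be illustrated with a short sketch; the paired readings below are synthetic stand-ins (not the study's data), generated only to show how the coefficient of determination between handheld LIBS and ICP-OES might be computed before and after particle-size reduction.

```python
import numpy as np

def r_squared(x: np.ndarray, y: np.ndarray) -> float:
    """Coefficient of determination from the Pearson correlation of paired readings."""
    r = np.corrcoef(x, y)[0, 1]
    return float(r ** 2)

rng = np.random.default_rng(0)
icp = rng.uniform(50, 500, size=30)  # ICP-OES reference values, mg/kg (synthetic)

# Coarse, heterogeneous pellets: each small LIBS spot samples few particles.
libs_coarse = icp * rng.lognormal(0.0, 0.9, size=30)
# After grinding: each shot averages many more particles, so agreement tightens.
libs_ground = icp * rng.lognormal(0.0, 0.1, size=30)

print(f"coarse pellets: R^2 = {r_squared(icp, libs_coarse):.2f}")  # weak
print(f"ground samples: R^2 = {r_squared(icp, libs_ground):.2f}")  # strong
```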
Style: APA, Harvard, Vancouver, ISO, etc.
5

McKinnon, Mark, Daniel Madrzykowski, and Craig Weinschenk. Development of a Database of Contemporary Material Properties for Fire Investigation Analysis - Materials and Methods. UL Firefighter Safety Research Institute, July 2020. http://dx.doi.org/10.54206/102376/zmpa6638.

Full text of the source
Abstract:
Meetings with the majority of the Technical Panel for the Development of an Interactive Database of Contemporary Material Properties for Fire Modeling project were held on June 29 and June 30, 2020. The major subjects of discussion included the list of proposed materials to be tested and characterized, the properties for the database, and the experimental and analytical methods to determine the properties for the database. A list of 101 materials divided into 11 categories was identified for inclusion in the database. The topics of variability in materials and aging of products and furniture items were discussed, and it was concluded that investigating these variations is outside the scope of this phase of the project. The list of properties to be stored in the database for each material, as well as proposed experimental methods to determine each property, was discussed in the Technical Panel meetings. The discussion emphasized that the priorities for the properties represented in the database depend on the expected users of the database. Three potential user groups and the sets of properties that each group would likely require were identified. To ensure that the data contained in the database is useful for modeling, it was determined that priority would be given to complete sets of properties to be measured and stored in the database. Over the course of the two meetings, several tools were proposed to make the database easier for model practitioners to use. One such tool included functionality to output lines of code for the models, or entire model input files, to simplify the process of inserting the properties into computational fire models. Another tool that was discussed would automatically extract derived properties from data sets or translate between complex and simple representations of burning. The next phase of the project includes conducting research to finalize the structure of the database and finalizing experimental procedures and protocols to populate the database.
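One of the proposed tools, emitting model input lines directly from stored properties, might look like the following sketch. The record layout, field names, and property values here are invented for illustration; the output format follows the FDS &MATL namelist convention, though a production exporter would need to cover each fire model's full input specification.

```python
# Illustrative exporter from a material-properties record to an FDS &MATL line.
# The dictionary keys and the example values are placeholders, not database contents.
def to_fds_matl(material: dict) -> str:
    return (
        f"&MATL ID='{material['id']}', "
        f"DENSITY={material['density_kg_m3']}, "
        f"CONDUCTIVITY={material['conductivity_W_mK']}, "
        f"SPECIFIC_HEAT={material['specific_heat_kJ_kgK']}, "
        f"EMISSIVITY={material['emissivity']} /"
    )

pmma = {
    "id": "GENERIC_PMMA",
    "density_kg_m3": 1190.0,
    "conductivity_W_mK": 0.20,
    "specific_heat_kJ_kgK": 1.42,
    "emissivity": 0.95,
}
print(to_fds_matl(pmma))
```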
Style: APA, Harvard, Vancouver, ISO, etc.
6

Delwiche, Michael, Boaz Zion, Robert BonDurant, Judith Rishpon, Ephraim Maltz, and Miriam Rosenberg. Biosensors for On-Line Measurement of Reproductive Hormones and Milk Proteins to Improve Dairy Herd Management. United States Department of Agriculture, February 2001. http://dx.doi.org/10.32747/2001.7573998.bard.

Full text of the source
Abstract:
The original objectives of this research project were to: (1) develop immunoassays, photometric sensors, and electrochemical sensors for real-time measurement of progesterone and estradiol in milk; (2) develop biosensors for measurement of caseins in milk; and (3) integrate and adapt these sensor technologies to create an automated electronic sensing system for operation in dairy parlors during milking. The overall direction of research was not changed, although the work was expanded to include other milk components such as urea and lactose. A second-generation biosensor for on-line measurement of bovine progesterone was designed and tested. Anti-progesterone antibody was coated on small disks of nitrocellulose membrane, which were inserted in the reaction chamber prior to testing, and a real-time assay was developed. The biosensor was designed using micropumps and valves under computer control, and assayed fluid volumes on the order of 1 ml. An automated sampler was designed to draw a test volume of milk from the long milk tube using a 4-way pinch valve. The system could execute a measurement cycle in about 10 min. Progesterone could be measured at concentrations low enough to distinguish luteal-phase from follicular-phase cows. The potential of the sensor to detect actual ovulatory events was compared with standard methods of estrus detection, including human observation and an activity monitor. The biosensor correctly identified all ovulatory events during its test period, but the variability at low progesterone concentrations triggered some false positives. Direct on-line measurement and intelligent interpretation of reproductive hormone profiles offer the potential for substantial improvement in reproductive management. A simple potentiometric method for measurement of milk protein was developed and tested. The method was based on the fact that proteins bind iodine. When proteins are added to a solution of the redox couple iodine/iodide (I-I2), the concentration of free iodine is changed and, as a consequence, the potential between two electrodes immersed in the solution is changed. The method worked well with analytical casein solutions and accurately measured concentrations of analytical caseins added to fresh milk. When tested with actual milk samples, the correlation between the sensor readings and the reference lab results (for both total protein and casein content) was inferior to that for analytical casein. A number of different technologies were explored for the analysis of milk urea, and a manometric technique was selected for the final design. In the new sensor, urea in the sample was hydrolyzed to ammonium and carbonate by the enzyme urease, and subsequent shaking of the sample with citric acid in a sealed cell allowed urea to be estimated as a change in partial pressure of carbon dioxide. The pressure change in the cell was measured with a miniature piezoresistive pressure sensor, and the effects of background dissolved gases and vapor pressures were corrected for by repeating the measurement of pressure developed in the sample without the addition of urease. Results were accurate in the physiological range of milk, the assay was faster than the typical milking period, and no toxic reagents were required. A sampling device was designed and built to passively draw milk from the long milk tube in the parlor.
An electrochemical sensor for lactose was developed, starting with a three-cascaded-enzyme sensor, evolving into two enzymes with CO2[Fe(CN)6] as a mediator, and then into a microflow injection system using poly-osmium-modified screen-printed electrodes. The sensor was designed to serve multiple milking positions, using a manifold valve, a sampling valve, and two pumps. Disposable screen-printed electrodes with enzymatic membranes were used. The sensor was optimized for electrode coating components, flow rate, pH, and sample size, and the results correlated well (r2 = 0.967) with known lactose concentrations.
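The manometric urea assay described above lends itself to a short worked computation. The sketch below assumes one mole of CO2 released per mole of urea, ideal-gas behaviour, and invented cell dimensions and pressure readings (none of these figures appear in the abstract); the blank run without urease supplies the background correction the authors describe.

```python
R = 8.314              # J/(mol*K), ideal-gas constant
T = 298.15             # K, assumed cell temperature
V_HEADSPACE = 2.0e-6   # m^3 (2 mL), assumed sealed-cell headspace
V_SAMPLE = 1.0e-6      # m^3 (1 mL), assumed milk sample volume

def urea_mmol_per_l(dp_with_urease_pa: float, dp_blank_pa: float) -> float:
    """Blank-corrected CO2 pressure rise (Pa) -> urea concentration (mmol/L)."""
    dp = dp_with_urease_pa - dp_blank_pa  # removes dissolved-gas/vapour background
    n_co2 = dp * V_HEADSPACE / (R * T)    # moles of CO2, ideal gas law
    return n_co2 / V_SAMPLE               # mol/m^3, numerically equal to mmol/L

# Illustrative pressure readings (Pa), not measured values:
print(f"{urea_mmol_per_l(6200.0, 300.0):.2f} mmol/L")  # ~4.8, within milk's range
```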
Style: APA, Harvard, Vancouver, ISO, etc.
7

Afonso, Gara, Gabriele La Spada, Thomas M. Mertens, and John C. Williams. The Optimal Supply of Central Bank Reserves under Uncertainty. Federal Reserve Bank of New York, November 2023. http://dx.doi.org/10.59576/sr.1077.

Full text of the source
Abstract:
This paper provides an analytically tractable theoretical framework to study the optimal supply of central bank reserves when the demand for reserves is uncertain and nonlinear. We fully characterize the optimal supply of central bank reserves and associated market equilibrium. We find that the optimal supply of reserves under uncertainty is greater than that absent uncertainty. With a sufficient degree of uncertainty, it is optimal to supply a level of reserves that is abundant (on the flat portion of the demand curve) absent shocks. The optimal mean spread between the market interest rate and administered rates under uncertainty may be higher or lower than that absent uncertainty. Our model is consistent with the observation that the variability of interest rate spreads is a function of the level of reserves.
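A stylized numerical companion to this result (an invented toy specification, not the authors' model): with a convex reserve-demand curve and a normally distributed demand shock, a grid search over the supply of reserves shows the loss-minimizing supply rising with shock volatility.

```python
import numpy as np

rng = np.random.default_rng(1)

def spread(reserves: np.ndarray) -> np.ndarray:
    """Toy nonlinear demand curve: the spread falls steeply, then flattens out."""
    return 0.05 + np.exp(-2.0 * reserves)

def expected_loss(supply: float, sigma: float, target: float = 0.10,
                  n_draws: int = 20_000) -> float:
    """Mean squared deviation of the realized spread from a target spread."""
    shocks = rng.normal(0.0, sigma, n_draws)
    effective = np.maximum(supply + shocks, 0.0)  # reserves cannot go negative
    return float(np.mean((spread(effective) - target) ** 2))

grid = np.linspace(0.0, 3.0, 301)
for sigma in (0.0, 0.5):
    best = grid[np.argmin([expected_loss(s, sigma) for s in grid])]
    print(f"shock sigma = {sigma}: optimal supply ~ {best:.2f}")
# Larger sigma -> larger optimal supply, echoing the paper's qualitative finding
# that optimal reserves under uncertainty exceed those absent uncertainty.
```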
Style: APA, Harvard, Vancouver, ISO, etc.
8

Hammad, Ali, and Mohamed Moustafa. Seismic Behavior of Special Concentric Braced Frames under Short- and Long-Duration Ground Motions. Pacific Earthquake Engineering Research Center, University of California, Berkeley, CA, December 2019. http://dx.doi.org/10.55461/zont9308.

Full text of the source
Abstract:
Over the past decade, several long-duration subduction earthquakes took place in different locations around the world, e.g., Chile in 2010, Japan in 2011, China in 2008, and Indonesia in 2004. Recent research has revealed that long-duration, large-magnitude earthquakes may occur along the Cascadia subduction zone of the Pacific Northwest Coast of the U.S. The duration of an earthquake often affects the response of structures. Current seismic design specifications mostly use response spectra to identify the hazard and do not consider duration effects. Thus, a comprehensive understanding of the effect of the duration of the ground motion on structural performance and its design implications is an important issue. The goal of this study was to investigate how the duration of an earthquake affects the structural response of special concentric braced frames (SCBFs). A comprehensive experimental program and detailed analytical investigations were conducted to understand and quantify the effect of duration on the collapse capacity of SCBFs, with the goal of improving seismic design provisions by incorporating these effects. The experimental program included large-scale shake-table tests, and the analytical program consisted of pre-test and post-test phases. The pre-test phase performed a sensitivity analysis that used OpenSees models preliminarily calibrated against previous experimental results for different configurations of SCBFs. A tornado-diagram framework was used to rank the influence of the different modeling parameters, e.g., low-cycle fatigue, on the seismic response of SCBFs under short- and long-duration ground motions. Based on the results obtained from the experimental program, these models were revisited for further calibration and validation in the post-test analysis. The experimental program included three large-scale shake-table tests of identical single-story, single-bay SCBFs with a chevron-brace configuration tested under different ground motions. Two specimens were tested under a set of spectrally matched short- and long-duration ground motions. The third specimen was tested under another long-duration ground motion. All tests started with a 100% scale of the selected ground motions; testing continued with an ever-increasing ground-motion scale until failure occurred, e.g., until both braces ruptured. The shake-table tests showed that the duration of the earthquake may lead to premature seismic failure or lower capacities, supporting the initiative to consider duration effects as part of the seismic design provisions. Identical frames failed at different displacement demands because of the damage accumulation associated with the earthquake duration, with about a 40% reduction in the displacement capacity of the two specimens tested under long-duration earthquakes versus the short-duration one. Post-test analysis focused first on calibrating an OpenSees model to capture the experimental behavior of the test specimens. The calibration started by matching the initial stiffness and overall global response. Next, the low-cycle fatigue parameters were fine-tuned to properly capture the experimental local behavior, i.e., brace buckling and rupture. The post-test analysis showed that the input for the low-cycle fatigue models currently available in the literature does not reflect the observed experimental results. New values for the fatigue parameters are suggested herein based on the results of the three shake-table tests.
The calibrated model was then used to conduct incremental dynamic analysis (IDA) using 44 pairs of spectrally matched short- and long-duration ground motions. To compare the effect of the duration of ground motion, this analysis aimed at incorporating ground-motion variability for more generalized observations and developing collapse fragility curves using different intensity measures (IMs). The difference in the median fragility was found to be 45% in the drift capacity at failure and about 10% in the spectral acceleration (Sa). Regression analysis showed the drift capacity obtained from the analysis to be reduced by about 8% on average for every additional 10 sec in the duration of the ground motion. The last stage of this study extended the calibrated model to SCBF archetype buildings to study the effect of the duration of ground motion on full-sized structures. Two buildings were studied: a three-story and a nine-story building that resembled the original SAC buildings but were modified with SCBFs as the lateral support system instead of moment-resisting frames. Two planar frames were adopted from the two buildings and used for the analysis. The same 44 spectrally matched pairs previously used in the post-test analysis were used to conduct nonlinear time-history analysis and study the effect of duration. All the ground motions were scaled to two hazard levels for the deterministic time-history analysis: 10% exceedance in 50 years and 2% exceedance in 50 years. All analysis results were interpreted in a comparative way to isolate the effect of duration, which was the main variable in the ground-motion pairs. In general, the results showed that the analyzed SCBFs experienced higher drift values under the long-duration suite of ground motions and, in turn, a larger percentage of fractured braces under the long-duration cases. The archetype SCBF analyses led to conclusions on duration effects similar to the experimental and numerical results for the single-story, single-bay frame.
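The fragility-fitting step in the analysis above can be condensed into a short sketch. The collapse-intensity samples below are synthetic placeholders (seeded so the long-duration median sits about 10% below the short-duration one, mirroring the reported difference), and the lognormal fit uses the standard moments-of-logs estimator rather than the study's exact procedure.

```python
import numpy as np
from scipy.stats import norm

def fit_lognormal_fragility(collapse_ims: np.ndarray) -> tuple:
    """Fit a lognormal collapse fragility: returns (median theta, dispersion beta)."""
    logs = np.log(collapse_ims)
    return float(np.exp(logs.mean())), float(logs.std(ddof=1))

def p_collapse(im: np.ndarray, theta: float, beta: float) -> np.ndarray:
    """P(collapse | IM) for a lognormal fragility curve."""
    return norm.cdf(np.log(im / theta) / beta)

rng = np.random.default_rng(2)
# Synthetic collapse Sa values (g) for 44 ground-motion pairs (illustrative only).
sa_short = rng.lognormal(np.log(1.00), 0.35, 44)  # short-duration suite
sa_long = rng.lognormal(np.log(0.90), 0.35, 44)   # ~10% lower median

for label, data in (("short", sa_short), ("long", sa_long)):
    theta, beta = fit_lognormal_fragility(data)
    print(f"{label}-duration suite: median Sa = {theta:.2f} g, beta = {beta:.2f}")
```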
Style: APA, Harvard, Vancouver, ISO, etc.
9

Mazzoni, Silvia, Nicholas Gregor, Linda Al Atik, Yousef Bozorgnia, David Welch, and Gregory Deierlein. Probabilistic Seismic Hazard Analysis and Selecting and Scaling of Ground-Motion Records (PEER-CEA Project). Pacific Earthquake Engineering Research Center, University of California, Berkeley, CA, November 2020. http://dx.doi.org/10.55461/zjdn7385.

Full text of the source
Abstract:
This report is one of a series of reports documenting the methods and findings of a multi-year, multi-disciplinary project coordinated by the Pacific Earthquake Engineering Research Center (PEER) and funded by the California Earthquake Authority (CEA). The overall project is titled “Quantifying the Performance of Retrofit of Cripple Walls and Sill Anchorage in Single-Family Wood-Frame Buildings,” henceforth referred to as the “PEER–CEA Project.” The overall objective of the PEER–CEA Project is to provide scientifically based information (e.g., testing, analysis, and resulting loss models) that measure and assess the effectiveness of seismic retrofit to reduce the risk of damage and associated losses (repair costs) of wood-frame houses with cripple wall and sill anchorage deficiencies, as well as retrofitted conditions that address those deficiencies. Tasks that support and inform the loss-modeling effort are: (1) collecting and summarizing existing information and results of previous research on the performance of wood-frame houses; (2) identifying construction features to characterize alternative variants of wood-frame houses; (3) characterizing earthquake hazard and ground motions at representative sites in California; (4) developing cyclic loading protocols and conducting laboratory tests of cripple wall panels, wood-frame wall subassemblies, and sill anchorages to measure and document their response (strength and stiffness) under cyclic loading; and (5) computer modeling, simulations, and the development of loss models as informed by a workshop with claims adjustors. This report is a product of Working Group 3 (WG3), Task 3.1: Selecting and Scaling Ground-Motion Records. The objective of Task 3.1 is to provide suites of ground motions to be used by other working groups (WGs), especially Working Group 5: Analytical Modeling (WG5) for Simulation Studies. The ground motions used in the numerical simulations are intended to represent the seismic hazard at the building site. The seismic hazard is dependent on the location of the site relative to seismic sources, the characteristics of the seismic sources in the region, and the local soil conditions at the site. To achieve a proper representation of hazard across the State of California, ten sites were selected, and a site-specific probabilistic seismic hazard analysis (PSHA) was performed at each of these sites for both a soft soil (Vs30 = 270 m/sec) and a stiff soil (Vs30 = 760 m/sec). The PSHA used the UCERF3 seismic source model, which represents the latest seismic source model adopted by the USGS [2013], and NGA-West2 ground-motion models. The PSHA was carried out for structural periods ranging from 0.01 to 10 sec. At each site and soil class, the results from the PSHA (hazard curves, hazard deaggregation, and uniform-hazard spectra [UHS]) were extracted for a series of ten return periods, prescribed by WG5 and WG6, ranging from 15.5 to 2500 years. For each case (site, soil class, and return period), the UHS was used as the target spectrum for selection and modification of a suite of ground motions. Additionally, another set of target spectra based on “Conditional Spectra” (CS), which are more realistic than UHS, was developed [Baker and Lee 2018]. The Conditional Spectra are defined by the median (Conditional Mean Spectrum) and a period-dependent variance. A suite of at least 40 record pairs (horizontal) was selected and modified for each return period and target-spectrum type.
Thus, for each ground-motion suite, 40 or more record pairs were selected using the deaggregation of the hazard, resulting in more than 200 record pairs per target-spectrum type at each site. The suites contained more than 40 records in case some were rejected by the modelers due to secondary characteristics; however, none were rejected, and the complete set was used. For the case of the UHS as the target spectrum, the selected motions were modified (scaled) such that the average of the median spectrum (RotD50) [Boore 2010] of the ground-motion pairs follows the target spectrum closely within the period range of interest to the analysts. In communications with WG5 researchers, a period range between 0.01 and 2.0 sec was selected for ground-motion (time-history, or time-series) selection and modification for this specific application of the project. The duration metrics and pulse characteristics of the records were also used in the final selection of ground motions. The damping ratio for the PSHA and the ground-motion target spectra was set to 5%, which is standard practice in engineering applications. For the cases where the CS was used as the target spectrum, the ground-motion suites were selected and scaled using a modified version of the conditional spectrum ground-motion selection tool (CS-GMS tool) developed by Baker and Lee [2018]. This tool selects and scales a suite of ground motions to meet both the median and the user-defined variability. This variability is defined by the relationship developed by Baker and Jayaram [2008]. The computation of CS requires a structural period for the conditional model. In collaboration with WG5 researchers, a conditioning period of 0.25 sec was selected as representative of the fundamental mode of vibration of the buildings of interest in this study. Working Group 5 carried out a sensitivity analysis using other conditioning periods, and the results and discussion of the selection of the conditioning period are reported in Section 4 of the WG5 PEER report entitled Technical Background Report for Structural Analysis and Performance Assessment. The WG3.1 report presents a summary of the selected sites, the seismic-source characterization model, and the ground-motion characterization model used in the PSHA, followed by the selection and modification of the suites of ground motions. The Record Sequence Numbers (RSNs) and the associated scale factors are tabulated in the Appendices of this report, and the actual time-series files can be downloaded from the PEER ground-motion database portal (https://ngawest2.berkeley.edu/).
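The amplitude-scaling step described above can be sketched compactly. The spectra below are synthetic arrays standing in for RotD50 record spectra and a target UHS, and the per-record scale factor uses a common least-squares-in-log-space rule over the 0.01–2.0 sec matching range; the project's actual selection and modification procedure may differ in its details.

```python
import numpy as np

def scale_factor(record_sa: np.ndarray, target_sa: np.ndarray) -> float:
    """Least-squares amplitude scale in log space over the matching period range."""
    return float(np.exp(np.mean(np.log(target_sa) - np.log(record_sa))))

periods = np.geomspace(0.01, 2.0, 50)  # period range of interest to the analysts
# Toy target spectrum with a broad peak near 0.25 sec (illustrative UHS shape).
target = 0.8 * np.exp(-0.5 * (np.log(periods / 0.25) / 0.9) ** 2)

rng = np.random.default_rng(3)
# Synthetic suite of 40 unscaled RotD50 record spectra scattered about the target.
suite = [target * rng.lognormal(-0.2, 0.3, periods.size) for _ in range(40)]

scaled = [sa * scale_factor(sa, target) for sa in suite]
suite_median = np.exp(np.mean(np.log(scaled), axis=0))  # geometric mean ~ median
misfit = float(np.max(np.abs(np.log(suite_median / target))))
print(f"max |log misfit| of the suite median vs. the target: {misfit:.3f}")
```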
Style: APA, Harvard, Vancouver, ISO, etc.