Journal articles on the topic "GCxGC, sample preparation, data processing"

To see the other types of publications on this topic, follow the link: GCxGC, sample preparation, data processing.

Create a correct reference in APA, MLA, Chicago, Harvard, and several other styles

Choose a source:

Consult the top 50 journal articles for your research on the topic "GCxGC, sample preparation, data processing."

Next to every source in the list of references there is an "Add to bibliography" button. Click on this button, and we will automatically generate the bibliographic reference for the chosen source in your preferred citation style: APA, MLA, Harvard, Vancouver, Chicago, etc.

You can also download the full text of the scholarly publication as a PDF and read its abstract online whenever this information is included in the metadata.

Browse journal articles on a wide variety of disciplines and organize your bibliography correctly.

1

Laycock, Paul. "Data Preparation for NA62." EPJ Web of Conferences 214 (2019): 02017. http://dx.doi.org/10.1051/epjconf/201921402017.

Full text
Abstract:
In 2017, NA62 recorded over a petabyte of raw data, collecting around a billion events per day of running. Data are collected in bursts of 3-5 seconds, producing output files of a few gigabytes. A typical run, a sequence of bursts with the same detector configuration and similar experimental conditions, contains 1500 bursts and constitutes the basic unit for offline data processing. A sample of 100 random bursts is used to make timing calibrations of all detectors, after which every burst in the run is reconstructed. Finally, the reconstructed events are filtered by physics channel with an average reduction factor of 20, and data quality metrics are calculated. Initially a bespoke data processing solution was implemented using a simple finite state machine with limited production system functionality. In 2017, the ATLAS Tier-0 team offered the use of their production system, together with the necessary support. Data processing workflows were rewritten with better error-handling and I/O operations were minimised, the reconstruction software was improved and conditions data handling was changed to follow best practices suggested by the HEP Software Foundation conditions database working group. This contribution describes the experience gained in using these tools and methods for data processing on a petabyte-scale experiment.
APA, Harvard, Vancouver, ISO, and other styles
2

Züllig, Thomas, Martin Trötzmüller, and Harald C. Köfeler. "Lipidomics from sample preparation to data analysis: a primer." Analytical and Bioanalytical Chemistry 412, no. 10 (December 10, 2019): 2191–209. http://dx.doi.org/10.1007/s00216-019-02241-y.

Full text
Abstract:
Lipids are amongst the most important organic compounds in living organisms, where they serve as building blocks for cellular membranes as well as energy storage and signaling molecules. Lipidomics is the science of the large-scale determination of individual lipid species, and the underlying analytical technology that is used to identify and quantify the lipidome is generally mass spectrometry (MS). This review article provides an overview of the crucial steps in MS-based lipidomics workflows, including sample preparation, either liquid–liquid or solid-phase extraction, derivatization, chromatography, ion-mobility spectrometry, MS, and data processing by various software packages. The associated concepts are discussed from a technical perspective as well as in terms of their application. Furthermore, this article sheds light on recent advances in the technology used in this field and its current limitations. Particular emphasis is placed on data quality assurance and adequate data reporting; some of the most common pitfalls in lipidomics are discussed, along with how to circumvent them.
APA, Harvard, Vancouver, ISO, and other styles
3

Hattne, Johan, Francis E. Reyes, Brent L. Nannenga, Dan Shi, M. Jason de la Cruz, Andrew G. W. Leslie, and Tamir Gonen. "MicroED data collection and processing." Acta Crystallographica Section A Foundations and Advances 71, no. 4 (July 1, 2015): 353–60. http://dx.doi.org/10.1107/s2053273315010669.

Full text
Abstract:
MicroED, a method at the intersection of X-ray crystallography and electron cryo-microscopy, has rapidly progressed by exploiting advances in both fields and has already been successfully employed to determine the atomic structures of several proteins from sub-micron-sized, three-dimensional crystals. A major limiting factor in X-ray crystallography is the requirement for large and well ordered crystals. By permitting electron diffraction patterns to be collected from much smaller crystals, or even single well ordered domains of large crystals composed of several small mosaic blocks, MicroED has the potential to overcome the limiting size requirement and enable structural studies on difficult-to-crystallize samples. This communication details the steps for sample preparation, data collection and reduction necessary to obtain refined, high-resolution, three-dimensional models by MicroED, and presents some of its unique challenges.
APA, Harvard, Vancouver, ISO, and other styles
4

Young, R. J. "Automation of Focused Ion Beam (FIB) Sample Preparation." Microscopy and Microanalysis 6, S2 (August 2000): 512–13. http://dx.doi.org/10.1017/s1431927600035054.

Full text
Abstract:
The use of focused ion beam (FIB) systems is well established as a sample preparation and imaging tool in a wide range of applications, most notably in the semiconductor and data storage industries, but also within the material and biological sciences (Figs. 1-3). The real benefit of the FIB is that the same ion beam that is used for material removal and deposition is also used for imaging the sample, which results in highly precise and localized sample preparation. In addition, the FIB can be used to prepare samples through multiple layers with different material properties, and allows the rest of the specimen to be kept intact for further analysis or processing. FIB is most commonly used to prepare samples for the transmission electron microscope (TEM), the scanning electron microscope (SEM), and for the FIB itself. The FIB, which is an imaging tool in its own right, can produce secondary-electron and -ion images and collect secondary ion mass spectrometry (SIMS) data.
APA, Harvard, Vancouver, ISO, and other styles
5

Casadonte, Rita, Jörg Kriegsmann, Mark Kriegsmann, Katharina Kriegsmann, Roberta Torcasio, Maria Eugenia Gallo Cantafio, Giuseppe Viglietto, and Nicola Amodio. "A Comparison of Different Sample Processing Protocols for MALDI Imaging Mass Spectrometry Analysis of Formalin-Fixed Multiple Myeloma Cells." Cancers 15, no. 3 (February 3, 2023): 974. http://dx.doi.org/10.3390/cancers15030974.

Full text
Abstract:
Sample processing of formalin-fixed specimens constitutes a major challenge in molecular profiling efforts. Pre-analytical factors such as fixative temperature, dehydration, and embedding media affect downstream analysis, generating data dependent on technical processing rather than disease state. In this study, we investigated two different sample processing methods, namely cytospin sample preparation and automated sample processing apparatuses, for proteomic analysis of multiple myeloma (MM) cell lines using imaging mass spectrometry (IMS). In addition, two sample-embedding instruments using different reagents and processing times were considered. Three MM cell lines fixed in 4% paraformaldehyde were either directly centrifuged onto glass slides using cytospin preparation techniques or processed to create paraffin-embedded specimens with an automatic tissue processor, and further cut onto glass slides for IMS analysis. The number of peaks obtained from paraffin-embedded samples was comparable between the two different sample processing instruments. Interestingly, spectral profiles showed enhanced ion yield in cytospin samples compared to paraffin-embedded samples, along with high reproducibility across sample replicates.
APA, Harvard, Vancouver, ISO, and other styles
6

Behl, Isha, Genecy Calado, Ola Ibrahim, Alison Malkin, Stephen Flint, Hugh J. Byrne, and Fiona M. Lyng. "Development of methodology for Raman microspectroscopic analysis of oral exfoliated cells." Analytical Methods 9, no. 6 (2017): 937–48. http://dx.doi.org/10.1039/c6ay03360a.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Golosov, Andrei, Olga Lubimova, Mikhail Zhevora, Vladislava Markevich, and Vladimir Siskov. "Data processing method for experimental studies of deformation in a rock sample under uniaxial compression." E3S Web of Conferences 129 (2019): 01018. http://dx.doi.org/10.1051/e3sconf/201912901018.

Full text
Abstract:
Experimental and theoretical studies have shown that the behavior of rocks in a near-failure state is governed by the focal nature of macrofracture preparation, in which the mesocrack structure of the material is the main element. A new approach to mathematical modeling that accounts for this structure would make it possible to adequately describe dissipative mesocrack structures at various hierarchical levels, and to predict dynamic changes in the structure and mechanical properties of both rock samples and the rock mass, reducing the need for resource-intensive experimental studies. In this paper, using methods of cluster, factor, and statistical analysis, we address the task of processing data from experimental studies of deformation patterns and macrofracture preparation in rock samples obtained by various methods, including acoustic and deformation observations.
APA, Harvard, Vancouver, ISO, and other styles
8

Souza, T. G. F., V. S. T. Ciminelli, and N. D. S. Mohallem. "An assessment of errors in sample preparation and data processing for nanoparticle size analyses by AFM." Materials Characterization 109 (November 2015): 198–205. http://dx.doi.org/10.1016/j.matchar.2015.09.020.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Jolivet, L., V. Motto-Ros, L. Sorbier, T. Sozinho, and C. P. Lienemann. "Quantitative imaging of carbon in heterogeneous refining catalysts." Journal of Analytical Atomic Spectrometry 35, no. 5 (2020): 896–903. http://dx.doi.org/10.1039/c9ja00434c.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Rodrigues, Ana M., Ana I. Ribeiro-Barros, and Carla António. "Experimental Design and Sample Preparation in Forest Tree Metabolomics." Metabolites 9, no. 12 (November 22, 2019): 285. http://dx.doi.org/10.3390/metabo9120285.

Full text
Abstract:
Appropriate experimental design and sample preparation are key steps in metabolomics experiments, highly influencing the biological interpretation of the results. The sample preparation workflow for plant metabolomics studies includes several steps before metabolite extraction and analysis, including laboratory procedures that must be optimized for each plant species and tissue. This is particularly the case for trees, whose tissues are complex matrices to work with due to the presence of several interferents, such as oleoresins and cellulose. A good experimental design, tree tissue harvest conditions, and sample preparation are crucial to ensure consistency and reproducibility of the metadata among datasets. In this review, we discuss the main challenges when setting up a forest tree metabolomics experiment for mass spectrometry (MS)-based analysis, covering all technical aspects from the formulation of the biological question and experimental design to sample processing, metabolite extraction, and data acquisition. We also highlight the importance of forest tree metadata standardization in metabolomics studies.
APA, Harvard, Vancouver, ISO, and other styles
11

Gondane, Aishwarya, and Harri M. Itkonen. "Revealing the History and Mystery of RNA-Seq." Current Issues in Molecular Biology 45, no. 3 (February 24, 2023): 1860–74. http://dx.doi.org/10.3390/cimb45030120.

Full text
Abstract:
Advances in RNA-sequencing technologies have led to the development of intriguing experimental setups, a massive accumulation of data, and high demand for tools to analyze it. To answer this demand, computational scientists have developed a myriad of data analysis pipelines, but it is less often considered which one is the most appropriate. The RNA-sequencing data analysis pipeline can be divided into three major parts: data pre-processing, followed by the main and downstream analyses. Here, we present an overview of the tools used for both bulk RNA-seq and at the single-cell level, with a particular focus on alternative splicing and active RNA synthesis analysis. A crucial part of data pre-processing is quality control, which defines the necessity of the next steps: adapter removal, trimming, and filtering. After pre-processing, the data are finally analyzed using a variety of tools: differential gene expression, alternative splicing, and assessment of active synthesis, the latter requiring dedicated sample preparation. In brief, we describe the commonly used tools in the sample preparation and analysis of RNA-seq data.
APA, Harvard, Vancouver, ISO, and other styles
12

Oskina, Yulia A., Ekaterina Pakrieva, Elvira M. Ustinova, and Andrey Kryazhov. "Decomposition and Preconcentration Methods for the Determination of Pt, Pd, Re in Mineral Raw Materials." Advanced Materials Research 1040 (September 2014): 278–81. http://dx.doi.org/10.4028/www.scientific.net/amr.1040.278.

Full text
Abstract:
A pressing problem in geochemistry today is the deep, integrated processing of mineral raw materials. Quantitative data on the content of precious and rare metals in various types of ores and rocks are needed, which stimulates the development and improvement of analytical chemical methods for determining these elements. Such methods are not applicable without a sample preparation stage, namely the decomposition and preconcentration of rare and precious metals from the matrix of the mineral raw material. Sample preparation schemes for platinum, palladium, and rhenium are described in this paper.
APA, Harvard, Vancouver, ISO, and other styles
13

Danek, Paweł, Krzysztof Ćwirta, and Piotr Kopniak. "Extraction of parameters from biometric data samples." Journal of Computer Sciences Institute 13 (December 30, 2019): 323–31. http://dx.doi.org/10.35784/jcsi.1327.

Full text
Abstract:
This article describes possible ways to extract parameters from biometric data samples, such as a fingerprint or a voice recording. The influence of particular approaches to biometric sample preparation on the accuracy of comparison algorithms was verified. An experiment involving the processing of a large number of samples with particular algorithms was performed. In the fingerprint detection case, image normalization, Gabor filtering, and a descriptor-based comparison method were used. For voice authorization, the LPC and MFCC algorithms were used. In both cases, the surveys yielded satisfactory accuracy (60–80%).
APA, Harvard, Vancouver, ISO, and other styles
14

Yaminsky, Igor, and Assel Akhmetova. "Atomic-force microscopy of viruses and bacteria." Medicina i vysokie tehnologii 2 (February 2021): 18–21. http://dx.doi.org/10.34219/2306-3645-2021-11-2-18-21.

Full text
Abstract:
The article is devoted to the study of viruses and bacteria using a scanning probe microscope in atomic force mode, in particular, to the features of sample preparation, interpretation of the data obtained, and image processing.
APA, Harvard, Vancouver, ISO, and other styles
15

Pontes, João Guilherme M., Antonio Jadson M. Brasil, Guilherme C. F. Cruz, Rafael N. de Souza, and Ljubica Tasic. "NMR-based metabolomics strategies: plants, animals and humans." Analytical Methods 9, no. 7 (2017): 1078–96. http://dx.doi.org/10.1039/c6ay03102a.

Full text
Abstract:
This Tutorial Review addresses the principal steps and methodologies of NMR-based metabolomics applied to pointing out key metabolites of diseases: sample preparation, acquisition and processing of spectra, data analysis, and biomarker discovery.
APA, Harvard, Vancouver, ISO, and other styles
16

Purcaro, Giorgia, Pierre-Hugues Stefanuto, Flavio A. Franchina, Marco Beccaria, Wendy F. Wieland-Alter, Peter F. Wright, and Jane E. Hill. "SPME-GC×GC-TOF MS fingerprint of virally-infected cell culture: Sample preparation optimization and data processing evaluation." Analytica Chimica Acta 1027 (October 2018): 158–67. http://dx.doi.org/10.1016/j.aca.2018.03.037.

Full text
APA, Harvard, Vancouver, ISO, and other styles
17

Dash, N. S. "The Process of Designing a Multidisciplinary Monolingual Sample Corpus." International Journal of Corpus Linguistics 5, no. 2 (December 31, 2000): 179–97. http://dx.doi.org/10.1075/ijcl.5.2.05das.

Full text
Abstract:
This paper discusses the approach of developing a sample of printed corpus in Bangla, one of the national languages of India and the only national language of Bangladesh. It is designed from the data collected from various published documents. The paper highlights different issues related to corpus generation, data-file preparation, language analysis, and processing as well as application potentials to different areas of pure and applied linguistics. It also includes statistical studies on the corpus along with some interpretation of the results. The difficulties that one may face during corpus generation are also pointed out.
APA, Harvard, Vancouver, ISO, and other styles
18

Ruszkiczay-Rüdiger, Zsófia, Stephanie Neuhuber, Régis Braucher, Johannes Lachner, Peter Steier, Alexander Wieser, Mihály Braun, Didier Bourlès, Georges Aumaître, and Karim Keddadouche. "Comparison and performance of two cosmogenic nuclide sample preparation procedures of in situ produced 10Be and 26Al." Journal of Radioanalytical and Nuclear Chemistry 329, no. 3 (August 18, 2021): 1523–36. http://dx.doi.org/10.1007/s10967-021-07916-4.

Full text
Abstract:
Cosmogenic radionuclide 10Be and 26Al targets (BeO and Al2O3) for AMS analysis are produced by a growing number of geochemical laboratories, employing different sample processing methods for the extraction of Be and Al from environmental materials. The reliability of this geochronological tool depends on data reproducibility independent from the preparation steps and the AMS measurements. Our results demonstrate that 10Be and 26Al concentrations of targets processed following different, commonly used protocols and measured at two AMS facilities lead to consistent results. However, insoluble fluoride precipitates, if formed during processing, can cause decreased 26Al results, while 10Be concentrations are unaffected.
APA, Harvard, Vancouver, ISO, and other styles
19

Drulyte, Ieva, Rachel M. Johnson, Emma L. Hesketh, Daniel L. Hurdiss, Charlotte A. Scarff, Sebastian A. Porav, Neil A. Ranson, Stephen P. Muench, and Rebecca F. Thompson. "Approaches to altering particle distributions in cryo-electron microscopy sample preparation." Acta Crystallographica Section D Structural Biology 74, no. 6 (May 18, 2018): 560–71. http://dx.doi.org/10.1107/s2059798318006496.

Full text
Abstract:
Cryo-electron microscopy (cryo-EM) can now be used to determine high-resolution structural information on a diverse range of biological specimens. Recent advances have been driven primarily by developments in microscopes and detectors, and through advances in image-processing software. However, for many single-particle cryo-EM projects, major bottlenecks currently remain at the sample-preparation stage; obtaining cryo-EM grids of sufficient quality for high-resolution single-particle analysis can require the careful optimization of many variables. Common hurdles to overcome include problems associated with the sample itself (buffer components, labile complexes), sample distribution (obtaining the correct concentration, affinity for the support film), preferred orientation, and poor reproducibility of the grid-making process within and between batches. This review outlines a number of methodologies used within the electron-microscopy community to address these challenges, providing a range of approaches which may aid in obtaining optimal grids for high-resolution data collection.
APA, Harvard, Vancouver, ISO, and other styles
20

Neset, Lasse, Gracious Takayidza, Frode S. Berven, and Maria Hernandez-Valladares. "Comparing Efficiency of Lysis Buffer Solutions and Sample Preparation Methods for Liquid Chromatography–Mass Spectrometry Analysis of Human Cells and Plasma." Molecules 27, no. 11 (May 25, 2022): 3390. http://dx.doi.org/10.3390/molecules27113390.

Full text
Abstract:
The use of a proper sample processing methodology for maximum proteome coverage and high-quality quantitative data is an important choice to make before initiating a liquid chromatography–mass spectrometry (LC–MS)-based proteomics study. Popular sample processing workflows for proteomics involve in-solution proteome digestion and single-pot, solid-phase-enhanced sample preparation (SP3). We tested them on both HeLa cells and human plasma samples, using lysis buffers containing SDS, or guanidinium hydrochloride. We also studied the effect of using commercially available depletion mini spin columns before SP3, to increase proteome coverage in human plasma samples. Our results show that the SP3 protocol, using either buffer, achieves the highest number of quantified proteins in both the HeLa cells and plasma samples. Moreover, the use of depletion mini spin columns before SP3 results in a two-fold increase of quantified plasma proteins. With additional fractionation, we quantified nearly 1400 proteins, and examined lower-abundance proteins involved in neurodegenerative pathways and mitochondrial metabolism. Therefore, we recommend the use of the SP3 methodology for biological sample processing, including those after depletion of high-abundance plasma proteins.
APA, Harvard, Vancouver, ISO, and other styles
21

Armstrong, Cheryl M., Andrew G. Gehring, George C. Paoli, Chin-Yi Chen, Yiping He, and Joseph A. Capobianco. "Impacts of Clarification Techniques on Sample Constituents and Pathogen Retention." Foods 8, no. 12 (December 3, 2019): 636. http://dx.doi.org/10.3390/foods8120636.

Full text
Abstract:
Determination of the microbial content in foods is important, not only for safe consumption, but also for food quality, value, and yield. A variety of molecular techniques are currently available for both identification and quantification of microbial content within samples; however, their success is often contingent upon proper sample preparation when the subject of investigation is a complex mixture of components such as foods. Because of the importance of sample preparation, the present study employs a systematic approach to compare the effects of four different separation techniques (glass wool, 50 μm polypropylene filters, graphite felt, and continuous flow centrifugation (CFC)) on sample preparation. To define the physical effects associated with the use of these separation methods, a multifactorial analysis was performed where particle size and composition, both pre- and post- processing, were analyzed for four different food matrices including lean ground beef, ground pork, ground turkey and spinach. Retention of three important foodborne bacterial pathogens (Escherichia coli O157:H7, Salmonella enterica, and Listeria monocytogenes) was also examined to evaluate the feasibility of the aforementioned methods to be utilized within the context of foodborne pathogen detection. Data from the multifactorial analysis not only delineated the particle size ranges but also defined the unique compositional profiles and quantified the bacterial retention. The three filtration membranes allowed for the passage of bacteria with minimal loss while CFC concentrated the inoculated bacteria. In addition, the deposition and therefore concentration of food matrix observed with CFC was considerably higher for meat samples relative to spinach. However, filtration with glass wool prior to CFC helped clarify meat samples, which led to considerably lower amounts of solids in the CFC vessel post processing and an increase in the recovery of the bacteria. Overall, by laying a framework for the deductive selection of sample preparation techniques, the results of the study can be applied to a range of applications where it would be beneficial to scientifically guide the pairing of the criteria associated with a downstream detection method with the most advantageous sample preparation techniques for complex matrices such as foods.
APA, Harvard, Vancouver, ISO, and other styles
22

Tong, Rui, Lijuan Zhang, Chuandeng Hu, Xuee Chen, Qi Song, Kai Lou, Xin Tang, et al. "An Automated and Miniaturized Rotating-Disk Device for Rapid Nucleic Acid Extraction." Micromachines 10, no. 3 (March 22, 2019): 204. http://dx.doi.org/10.3390/mi10030204.

Full text
Abstract:
The results of molecular diagnostics and detection depend greatly on the quality and integrity of the isolated nucleic acid. In this work, we developed an automated, miniaturized nucleic acid extraction device based on the magnetic beads method, consisting of four components: a sample processing disc and its associated rotary power output mechanism, a pipetting module, a magnet module, and an external central controller, enabling customizable, automated, and robust nucleic acid sample preparation. Nucleic acid extracted from 293T cells was verified using real-time polymerase chain reaction (PCR), and the data imply an efficiency comparable to that of a manual process, with the advantages of flexible, time-saving (~10 min), and simple nucleic acid sample preparation.
APA, Harvard, Vancouver, ISO, and other styles
23

Feld, Geoffrey K., Melvin Lye, Christoph Eberle, Ann Wang, and Chyan Ying Ke. "Semi and fully automated immunostaining sample preparation platforms improve live leukocyte recovery, reproducibility, and flow cytometry data quality." Journal of Immunology 208, no. 1_Supplement (May 1, 2022): 173.05. http://dx.doi.org/10.4049/jimmunol.208.supp.173.05.

Full text
Abstract:
Limited innovation in automated cell and organelle sample preparation methodology limits the effectiveness of modern analytical methods, such as single-cell ‘omics, flow and mass cytometry. These techniques traditionally rely on manual centrifugation-based protocols for cell washing and suspension preparation, hampering researchers’ access to the reproducibility and scalability benefits of automation. We have developed a suite of cell suspension preparation systems that enable semi and full automation of cell washing protocols. These Laminar Wash™ technologies robustly, gently, and efficiently remove debris, dead cells, and unbound reagent using laminar flow and liquid handling robotics, rather than turbulent and harsh pelleting-plus-pipetting methods. Adaptation of standard protocols to Laminar Wash automation typically improves repetitive immunostaining processes and workflows, in terms of reduced hands-on time and inter- and intra-operator variability. We demonstrate the superior live cell retention and reproducibility of Laminar Wash over centrifugation in processing murine and humanized mouse peripheral blood mononuclear cells (PBMCs) and tumor infiltrating lymphocytes (TILs) for flow cytometry. Furthermore, we show how Laminar Wash improves flow cytometry data quality, in terms of debris removal and separation of immune cell subsets for both PBMCs and TILs. Overall, these results show how Laminar Wash methodology assists in standardizing sample preparation for cytometric analysis, an important and unmet need in cancer immunotherapy discovery and manufacturing workflows.
APA, Harvard, Vancouver, ISO, and other styles
24

Mussbacher, Marion, Teresa L. Krammer, Stefan Heber, Waltraud C. Schrottmaier, Stephan Zeibig, Hans-Peter Holthoff, David Pereyra, Patrick Starlinger, Matthias Hackl, and Alice Assinger. "Impact of Anticoagulation and Sample Processing on the Quantification of Human Blood-Derived microRNA Signatures." Cells 9, no. 8 (August 18, 2020): 1915. http://dx.doi.org/10.3390/cells9081915.

Full text
Abstract:
Blood-derived microRNA signatures have emerged as powerful biomarkers for predicting and diagnosing cardiovascular disease, cancer, and metabolic disorders. Platelets and platelet-derived microvesicles are a major source of microRNAs. We have previously shown that the inappropriate anticoagulation and storage of blood samples causes substantial platelet activation that is associated with the release of platelet-stored molecules into the plasma. However, it is currently unclear if circulating microRNA levels are affected by artificial platelet activation due to suboptimal plasma preparation. To address this issue, we used a standardized RT-qPCR test for 12 microRNAs (thrombomiR®, TAmiRNA GmbH, Vienna, Austria) that have been associated with cardiovascular and thrombotic diseases and were detected in platelets and/other hematopoietic cells. Blood was prevented from coagulating with citrate–theophylline–adenosine–dipyridamole (CTAD), sodium citrate, or ethylenediaminetetraacetic acid (EDTA) and stored for different time periods either at room temperature or at 4 °C prior to plasma preparation and the subsequent quantification of microRNAs. We found that five microRNAs (miR-191-5p, miR-320a, miR-21-5p, miR-23a-3p, and miR-451a) were significantly increased in the EDTA plasma. Moreover, we observed a time-dependent increase in plasma microRNAs that was most pronounced in the EDTA blood stored at room temperature for 24 h. Furthermore, significant correlations between microRNA levels and plasma concentrations of platelet-stored molecules pointed towards in vitro platelet activation. Therefore, we strongly recommend to (i) use CTAD as an anticoagulant, (ii) process blood samples as quickly as possible, and (iii) store blood samples at 4 °C whenever immediate plasma preparation is not feasible to generate reliable data on blood-derived microRNA signatures.
APA, Harvard, Vancouver, ISO, and other styles
25

Chiaramonti, Ann N., and Laurence D. Marks. "Atomic Resolution Transmission Electron Microscopy of Surfaces." Journal of Materials Research 20, no. 7 (July 1, 2005): 1619–27. http://dx.doi.org/10.1557/jmr.2005.0211.

Full text
Abstract:
A brief overview of transmission electron microscopy as it applies specifically to obtaining surface crystallographic information is presented. This review will encompass many of the practical aspects of obtaining surface crystal information from a transmission electron microscope, including equipment requirements, experimental techniques, sample preparation methods, data extraction and image processing, and complementary techniques.
APA, Harvard, Vancouver, ISO, and other styles
26

Wang, Jiangning, Jing Ren, Tianyu Xi, Siqin Ge, and Liqiang Ji. "Specifications and Standards for Insect 3D Data." Biodiversity Information Science and Standards 2 (May 21, 2018): e26561. http://dx.doi.org/10.3897/biss.2.26561.

Full text
Abstract:
With the continuous development of imaging technology, the amount of insect 3D data is increasing, but research on data management is still virtually non-existent. This paper will discuss the specifications and standards relevant to the process of insect 3D data acquisition, processing and analysis. The collection of 3D data of insects includes specimen collection, sample preparation, image scanning specifications and 3D model specification. The specimen collection information uses existing biodiversity information standards such as Darwin Core. However, the 3D scanning process contains unique specifications for specimen preparation, depending on the scanning equipment, to achieve the best imaging results. Data processing of 3D images includes 3D reconstruction, tagging morphological structures (such as muscle and skeleton), and 3D model building. There are different algorithms in the 3D reconstruction process, but the processing results generally follow DICOM (Digital Imaging and Communications in Medicine) standards. There is no available standard for marking morphological structures, because this process is currently executed by individual researchers who create operational specifications according to their own needs. 3D models have specific file specifications, such as object files (https://en.wikipedia.org/wiki/Wavefront_.obj_file) and 3D max format (https://en.wikipedia.org/wiki/.3ds), which are widely used at present. There are only some simple tools for analysis of three-dimensional data and there are no specific standards or specifications in Audubon Core (https://terms.tdwg.org/wiki/Audubon_Core), the TDWG standard for biodiversity-related multi-media. There are very few 3D databases of animals at this time. Most insect 3D data are created by individual entomologists and are not even stored in databases. Specifications for the management of insect 3D data need to be established step-by-step. Based on our attempt to construct a database of 3D insect data, we preliminarily discuss the necessary specifications.
Styles APA, Harvard, Vancouver, ISO, etc.
27

Lye, Melvin, Christoph Eberle, Ann Wang, Geoffrey K. Feld et Namyong Kim. « Abstract 1885 : Semi and fully automated immunostaining sample preparation platforms improve live leukocyte recovery, reproducibility, and cytometry data quality ». Cancer Research 82, no 12_Supplement (15 juin 2022) : 1885. http://dx.doi.org/10.1158/1538-7445.am2022-1885.

Texte intégral
Résumé :
Limited innovation in automated cell and organelle sample preparation methodology limits the effectiveness of modern analytical methods, such as single-cell ‘omics, flow and mass cytometry. These techniques traditionally rely on manual centrifugation-based protocols for cell washing and suspension preparation, hampering researchers’ access to the reproducibility and scalability benefits of automation. We have developed a suite of cell suspension preparation systems that enable semi and full automation of cell washing protocols. These Laminar Wash™ technologies robustly, gently, and efficiently remove debris, dead cells, and unbound reagent using laminar flow and liquid handling robotics, rather than turbulent and harsh pelleting-plus-pipetting methods. Adaptation of standard protocols to Laminar Wash automation typically improves repetitive immunostaining processes and workflows, in terms of reduced hands-on time and inter- and intra-operator variability. We demonstrate the superior live cell retention and reproducibility of Laminar Wash over centrifugation in processing murine and humanized mouse peripheral blood mononuclear cells (PBMCs) and tumor infiltrating lymphocytes (TILs) for flow cytometry. Furthermore, we show how Laminar Wash improves flow cytometry data quality, in terms of debris removal and separation of immune cell subsets for both PBMCs and TILs. Overall, these results show how Laminar Wash methodology assists in standardizing sample preparation for cytometric analysis, an important and unmet need in cancer immunotherapy discovery and manufacturing workflows. Citation Format: Melvin Lye, Christoph Eberle, Ann Wang, Geoffrey K. Feld, Namyong Kim. Semi and fully automated immunostaining sample preparation platforms improve live leukocyte recovery, reproducibility, and cytometry data quality [abstract]. In: Proceedings of the American Association for Cancer Research Annual Meeting 2022; 2022 Apr 8-13.
Philadelphia (PA): AACR; Cancer Res 2022;82(12_Suppl):Abstract nr 1885.
Styles APA, Harvard, Vancouver, ISO, etc.
28

Xue, Huan, Nan Xiang Kuang et Hong Chuan Zhu. « The Friction Coefficient Testing Method of Metallic Sheet and Strip ». Advanced Materials Research 629 (décembre 2012) : 75–78. http://dx.doi.org/10.4028/www.scientific.net/amr.629.75.

Texte intégral
Résumé :
The research background and the origin of friction are introduced. The importance of coefficient-of-friction testing in the sheet metal forming field is indicated. Standards for testing the coefficient of friction of metallic sheet and other related materials are described. The experimental principle, the size and preparation of the test sample, the testing equipment and procedure, and the method for processing the resulting data are presented.
Styles APA, Harvard, Vancouver, ISO, etc.
29

Schreier, Christina, Werner Kremer, Fritz Huber, Sindy Neumann, Philipp Pagel, Kai Lienemann et Sabine Pestel. « Reproducibility of NMR Analysis of Urine Samples : Impact of Sample Preparation, Storage Conditions, and Animal Health Status ». BioMed Research International 2013 (2013) : 1–19. http://dx.doi.org/10.1155/2013/878374.

Texte intégral
Résumé :
Introduction. Spectroscopic analysis of urine samples from laboratory animals can be used to predict the efficacy and side effects of drugs. This employs methods combining 1H NMR spectroscopy with quantification of biomarkers or with multivariate data analysis. The most critical steps in data evaluation are analytical reproducibility of NMR data (collection, storage, and processing) and the health status of the animals, which may influence urine pH and osmolarity. Methods. We treated rats with a solvent, a diuretic, or a nephrotoxicant and collected urine samples. Samples were titrated to pH 3 to 9, or salt concentrations increased up to 20-fold. The effects of storage conditions and freeze-thaw cycles were monitored. Selected metabolites and multivariate data analysis were evaluated after 1H NMR spectroscopy. Results. We showed that variation of pH from 3 to 9 and increases in osmolarity up to 6-fold had no effect on the quantification of the metabolites or on multivariate data analysis. Storage led to changes after 14 days at 4°C or after 12 months at −20°C, independent of sample composition. Multiple freeze-thaw cycles did not affect data analysis. Conclusion. Reproducibility of NMR measurements is not dependent on sample composition under physiological or pathological conditions.
Styles APA, Harvard, Vancouver, ISO, etc.
30

Filla, Laura A., et James L. Edwards. « Metabolomics in diabetic complications ». Molecular BioSystems 12, no 4 (2016) : 1090–105. http://dx.doi.org/10.1039/c6mb00014b.

Texte intégral
Résumé :
In the past 15 years, the field of metabolomics has expanded the current understanding of the pathophysiology of diabetic complications far beyond oxidative stress and inflammation. Branched-chain amino acids, phospholipid metabolism, and the glutamine/glutamate cycle are just a few of the previously unknown pathways and biomarkers of diabetes which have come to light due to advancements in sensitivity, sample preparation, and data processing.
Styles APA, Harvard, Vancouver, ISO, etc.
31

Slavutskaya, Elena, et Leonid Slavutskii. « PRETEEN AGE : THE ANALYSIS OF THE MULTILEVEL PSYCHO-DIAGNOSTIC DATA BASED ON NEURAL NETWORK MODELS ». SOCIETY. INTEGRATION. EDUCATION. Proceedings of the International Scientific Conference 5 (25 mai 2018) : 455. http://dx.doi.org/10.17770/sie2018vol1.3348.

Texte intégral
Résumé :
The use of artificial neural network (ANN) models for vertical system analysis of psycho-diagnostic data is suggested. It is shown that ANN training, as a problem of nonlinear multi-parameter optimization, allows the creation of effective algorithms for psycho-diagnostic data processing when the results of psychological testing for different levels' characteristics have different numerical scales. On the example of processing the authors' psycho-diagnostic data (preadolescent schoolchildren), it is shown that neural network models can be used to estimate latent (hidden) connections between psychological characteristics. The proposed algorithms are based on a statistical assessment of the quality of such models and do not require a large sample of respondents. Quantitative statistical criteria for evaluating the quality of the models are estimated. The approach is sufficiently clear for practical use by psychologists who do not have specialized mathematical training.
Styles APA, Harvard, Vancouver, ISO, etc.
32

Wiscovitch-Russo, Rosana, Harinder Singh, Lauren M. Oldfield, Alexey V. Fedulov et Norberto Gonzalez-Juarbe. « An optimized approach for processing of frozen lung and lavage samples for microbiome studies ». PLOS ONE 17, no 4 (5 avril 2022) : e0265891. http://dx.doi.org/10.1371/journal.pone.0265891.

Texte intégral
Résumé :
The respiratory tract has a resident microbiome with low biomass and limited diversity. This results in difficulties with sample preparation for sequencing due to uneven bacteria-to-host DNA ratio, especially for small tissue samples such as mouse lungs. We compared effectiveness of current procedures used for DNA extraction in microbiome studies. Bronchoalveolar lavage fluid (BALF) and lung tissue samples were collected to test different forms of sample pre-treatment and extraction methods to increase bacterial DNA yield and optimize library preparation. DNA extraction using a pre-treatment method of mechanical lysis (lung tissue) and one-step centrifugation (BALF) increased DNA yield and bacterial content of samples. In contrast, a significant increase of environmental contamination was detected after phenol chloroform isoamyl alcohol (PCI) extraction and nested PCR. While PCI has been a standard procedure used in microbiome studies, our data suggests that it is not efficient for DNA extraction of frozen low biomass samples. Finally, a DNA Enrichment kit was tested and found to improve the 16S copy number of lung tissue with a minor shift in microbial composition. Overall, we present a standardized method to provide high yielding DNA and improve sequencing coverage of low microbial biomass frozen samples with minimal contamination.
Styles APA, Harvard, Vancouver, ISO, etc.
33

Tisnérat-Laborde, N., J. J. Poupeau, J. F. Tannau et M. Paterne. « Development of a Semi-Automated System for Routine Preparation of Carbonate Samples ». Radiocarbon 43, no 2A (2001) : 299–304. http://dx.doi.org/10.1017/s0033822200038145.

Texte intégral
Résumé :
We constructed a semi-automated system to transform carbonate samples to CO2, as a means to increase sample-processing capacity. The physico-chemical process includes hydrolysis of carbonate, quantification of the mass of carbon and CO2 collection in a glass ampoule. The system is computer-controlled and monitored, and all the data are stored. A single run of five consecutive samples requires about 3.5 hours. Measurements of 14C concentrations were made on samples of IAEA C-1 Carrara marble to test the reliability of this semi-automated system. These measurements have allowed the determination of the total system background and the memory effect of our system.
Styles APA, Harvard, Vancouver, ISO, etc.
34

GIZA, ALEKSANDRA, EWELINA IWAN, ARKADIUSZ BOMBA et DARIUSZ WASYL. « Basics of high throughput sequencing ». Medycyna Weterynaryjna 77, no 11 (2025) : 6589–2025. http://dx.doi.org/10.21521/mw.6594.

Texte intégral
Résumé :
Sequencing can provide genomic characterisation of a specific organism, as well as of a whole environmental or clinical sample. High Throughput Sequencing (HTS) makes it possible to generate an enormous amount of genomic data at gradually decreasing costs and almost in real time. HTS is used, among others, in medicine, veterinary medicine, microbiology, virology and epidemiology. The paper presents practical aspects of the HTS technology. It describes generations of sequencing, which vary in throughput, read length, accuracy and costs, and thus are used for different applications. The stages of HTS, as well as their purposes and pitfalls, are presented: extraction of the genetic material, library preparation, sequencing and data processing. For the success of the whole process, all stages need to follow strict quality control measures. Choosing the right sequencing platform, proper sample and library preparation procedures, as well as adequate bioinformatic tools are crucial for high-quality results.
Styles APA, Harvard, Vancouver, ISO, etc.
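The strict quality control this entry calls for usually begins with per-read quality filtering of the raw FASTQ output. A minimal sketch of that generic step, assuming the standard Phred+33 encoding; the Q20 cutoff is a common illustrative convention, not taken from the cited paper:

```python
# Minimal sketch: per-read mean Phred quality from a FASTQ quality string,
# one of the basic QC checks applied before downstream HTS analysis.
# Assumes Phred+33 encoding (standard for modern Illumina data).

def mean_phred(quality_line: str, offset: int = 33) -> float:
    """Decode an ASCII quality string and return the mean Phred score."""
    scores = [ord(ch) - offset for ch in quality_line]
    return sum(scores) / len(scores)

def passes_qc(quality_line: str, threshold: float = 20.0) -> bool:
    """Flag reads whose average quality falls below the threshold."""
    return mean_phred(quality_line) >= threshold
```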
35

Dillard, Rebecca S., Cheri M. Hampton, Joshua D. Strauss, Zunlong Ke, Deanna Altomara, Ricardo C. Guerrero-Ferreira, Gabriella Kiss et Elizabeth R. Wright. « Biological Applications at the Cutting Edge of Cryo-Electron Microscopy ». Microscopy and Microanalysis 24, no 4 (août 2018) : 406–19. http://dx.doi.org/10.1017/s1431927618012382.

Texte intégral
Résumé :
Cryo-electron microscopy (cryo-EM) is a powerful tool for macromolecular to near-atomic resolution structure determination in the biological sciences. The specimen is maintained in a near-native environment within a thin film of vitreous ice and imaged in a transmission electron microscope. The images can then be processed by a number of computational methods to produce three-dimensional information. Recent advances in sample preparation, imaging, and data processing have led to tremendous growth in the field of cryo-EM by providing higher resolution structures and the ability to investigate macromolecules within the context of the cell. Here, we review developments in sample preparation methods and substrates, detectors, phase plates, and cryo-correlative light and electron microscopy that have contributed to this expansion. We have also included specific biological applications.
Styles APA, Harvard, Vancouver, ISO, etc.
36

Zhao, Nie, Chunming Yang, Fenggang Bian, Daoyou Guo et Xiaoping Ouyang. « SGTools : a suite of tools for processing and analyzing large data sets from in situ X-ray scattering experiments ». Journal of Applied Crystallography 55, no 1 (1 février 2022) : 195–203. http://dx.doi.org/10.1107/s1600576721012267.

Texte intégral
Résumé :
In situ synchrotron small-angle X-ray scattering (SAXS) is a powerful tool for studying dynamic processes during material preparation and application. The processing and analysis of large data sets generated from in situ X-ray scattering experiments are often tedious and time consuming. However, data processing software for in situ experiments is relatively rare, especially for grazing-incidence small-angle X-ray scattering (GISAXS). This article presents an open-source software suite (SGTools) to perform data processing and analysis for SAXS and GISAXS experiments. The processing modules in this software include (i) raw data calibration and background correction; (ii) data reduction by multiple methods; (iii) animation generation and intensity mapping for in situ X-ray scattering experiments; and (iv) further data analysis for the sample with an order degree and interface correlation. This article provides the main features and framework of SGTools. The workflow of the software is also elucidated to allow users to develop new features. Three examples are demonstrated to illustrate the use of SGTools for dealing with SAXS and GISAXS data. Finally, the limitations and future features of the software are also discussed.
Styles APA, Harvard, Vancouver, ISO, etc.
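The "data reduction by multiple methods" module described in this abstract typically includes azimuthal averaging of the 2D detector image into a 1D intensity profile I(q). The sketch below illustrates that generic reduction step only; it is not SGTools code, and the function name and parameters are hypothetical:

```python
import numpy as np

def azimuthal_average(image, center, n_bins=100):
    """Reduce a 2D scattering pattern to a 1D radial profile by averaging
    pixel intensities in annular bins around the beam center."""
    ny, nx = image.shape
    y, x = np.indices((ny, nx))
    # Radial distance of each pixel from the beam center (in pixels).
    r = np.hypot(x - center[0], y - center[1])
    edges = np.linspace(0.0, r.max(), n_bins + 1)
    which = np.clip(np.digitize(r.ravel(), edges) - 1, 0, n_bins - 1)
    # Sum intensities and pixel counts per annulus, then take the mean.
    sums = np.bincount(which, weights=image.ravel(), minlength=n_bins)
    counts = np.bincount(which, minlength=n_bins)
    return edges[:-1], sums / np.maximum(counts, 1)
```

In a real pipeline the radial coordinate would be converted to the scattering vector q using the calibrated sample-to-detector distance and wavelength, and a background pattern would be subtracted first.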
37

Ultee, Eveline, Fred Schenkel, Wen Yang, Susanne Brenzinger, Jamie S. Depelteau et Ariane Briegel. « An Open-Source Storage Solution for Cryo-Electron Microscopy Samples ». Microscopy and Microanalysis 24, no 1 (18 janvier 2018) : 60–63. http://dx.doi.org/10.1017/s143192761701279x.

Texte intégral
Résumé :
Cryo-electron microscopy (cryo-EM) enables the study of biological structures in situ in great detail and the solution of protein structures at Ångstrom-level resolution. Due to recent advances in instrumentation and data processing, the field of cryo-EM is growing rapidly. Access to facilities and national centers that house the state-of-the-art microscopes is limited due to ever-rising demand, resulting in long wait times between sample preparation and data acquisition. To improve sample storage, we have developed a cryo-storage system with an efficient, high storage capacity that enables sample storage in a highly organized manner. This system is simple to use, cost-effective and easily adaptable for any type of grid storage box and dewar and any size of cryo-EM laboratory.
Styles APA, Harvard, Vancouver, ISO, etc.
38

Sambari, Villa Evadelvia Ginal. « Studi Perbandingan Kadar Ni dan Fe Berdasarkan Sampel Cek Pit dan Sampel Cek Stock Pile Mining Nikel pada PT. Bintangdelapan Mineral Sulawesi Tengah ». Ge-STRAM : Jurnal Perencanaan dan Rekayasa Sipil 4, no 1 (30 mars 2021) : 41. http://dx.doi.org/10.25139/jprs.v4i1.3163.

Texte intégral
Résumé :
Nickel mining by PT. Bintangdelapan Mineral is located in Fatufia village, Bahodopi District, Morowali, Central Sulawesi. The purpose of this research was to study sampling techniques for pit and stock-pile check samples and to compare the resulting Ni and Fe grades. The authors limit the scope to comparing Ni and Fe grades based on pit check samples from mining production and port stock-pile samples from production, using a Minipal analyzer. The field research method consisted of a preparation stage, a data-collection stage, a data-processing stage, and a thesis-writing phase. The check results for the pit, mining, and port samples show an increase in grades, because PT. Bintangdelapan Mineral has applied selective mining methods. The sampling of pit and stock-pile check samples from the nickel laterite operation followed PT. Bintangdelapan Mineral's standards; the data obtained were processed using Microsoft Excel and then presented as SPSS (Statistical Product and Service Solution) output.
Styles APA, Harvard, Vancouver, ISO, etc.
39

Bartholomäus, Alexander, Cristian Del Campo et Zoya Ignatova. « Mapping the non-standardized biases of ribosome profiling ». Biological Chemistry 397, no 1 (1 janvier 2016) : 23–35. http://dx.doi.org/10.1515/hsz-2015-0197.

Texte intégral
Résumé :
Ribosome profiling is an emerging technology that uses massively parallel amplification of ribosome-protected fragments and next-generation sequencing to monitor translation in vivo with codon resolution. Studies using this approach provide insightful views on the regulation of translation at a global, cell-wide level. In this review, we compare different experimental set-ups and current protocols for sequencing data analysis. Specifically, we review the pitfalls at certain experimental steps and highlight the importance of standardized protocols for sample preparation and the data-processing pipeline, at least for mapping and normalization.
Styles APA, Harvard, Vancouver, ISO, etc.
40

Pythoud, Nicolas, Joanna Bons, Geoffroy Mijola, Alain Beck, Sarah Cianférani et Christine Carapito. « Optimized Sample Preparation and Data Processing of Data-Independent Acquisition Methods for the Robust Quantification of Trace-Level Host Cell Protein Impurities in Antibody Drug Products ». Journal of Proteome Research 20, no 1 (5 octobre 2020) : 923–31. http://dx.doi.org/10.1021/acs.jproteome.0c00664.

Texte intégral
Styles APA, Harvard, Vancouver, ISO, etc.
41

Sineglazov, Victor, et Yaroslav Kharchuk. « Intellectual System of Preparation of Images from Computer Tomographs ». Electronics and Control Systems 4, no 70 (4 janvier 2022) : 30–36. http://dx.doi.org/10.18372/1990-5548.70.16741.

Texte intégral
Résumé :
Artificial neural networks can be trained on the useful signal in the source data but not on noisy data, so noise reduction or error compensation is usually performed first. This paper implements a noise reduction model based on artificial neural networks to suppress high-noise components, which is important for optimizing pre-filtering methods. The denoising of computed tomography scans from medical examinations of tuberculosis patients is considered as the target problem, in which the suppression of noise present in the image is required. Because the radiation is quite harmful to humans, its power is reduced. As a result, the signal-to-noise ratio decreases, which contaminates the image and complicates its processing. Additional shadows appear in the image where no objects exist, which can lead to false diagnoses. An algorithm for the structural-parametric synthesis of convolutional neural networks used in image noise suppression has been developed. Computed tomograms of tuberculosis patients provided by the Research Institute of Pulmonology and Tuberculosis of the National Academy of Medical Sciences of Ukraine were used as the training sample.
Styles APA, Harvard, Vancouver, ISO, etc.
42

Vowinckel, Jakob, Floriana Capuano, Kate Campbell, Michael J. Deery, Kathryn S. Lilley et Markus Ralser. « The beauty of being (label)-free : sample preparation methods for SWATH-MS and next-generation targeted proteomics ». F1000Research 2 (13 décembre 2013) : 272. http://dx.doi.org/10.12688/f1000research.2-272.v1.

Texte intégral
Résumé :
The combination of qualitative analysis with label-free quantification has greatly facilitated the throughput and flexibility of novel proteomic techniques. However, such methods rely heavily on robust and reproducible sample preparation procedures. Here, we benchmark a selection of in gel, on filter, and in solution digestion workflows for their application in label-free proteomics. Each procedure was associated with differing advantages and disadvantages. The in gel methods interrogated were cost effective, but were limited in throughput and digest efficiency. Filter-aided sample preparations facilitated reasonable processing times and yielded a balanced representation of membrane proteins, but led to a high signal variation in quantification experiments. Two in solution digest protocols, however, gave optimal performance for label-free proteomics. A protocol based on the detergent RapiGest led to the highest number of detected proteins at second-best signal stability, while a protocol based on acetonitrile-digestion, RapidACN, scored best in throughput and signal stability but came second in protein identification. In addition, we compared label-free data dependent (DDA) and data independent (SWATH) acquisition. While largely similar in protein detection, SWATH outperformed DDA in quantification, reducing signal variation and markedly increasing the number of precisely quantified peptides.
Styles APA, Harvard, Vancouver, ISO, etc.
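The "signal variation" and "precisely quantified peptides" metrics in this abstract are conventionally expressed as the coefficient of variation (CV) of a peptide's intensity across replicate injections. A minimal sketch of that convention; the 20% cutoff is a common illustrative choice, not a value from the paper:

```python
import statistics

def cv_percent(intensities):
    """Coefficient of variation (%) of a peptide's intensity across replicates."""
    mean = statistics.mean(intensities)
    return 100.0 * statistics.stdev(intensities) / mean

def precisely_quantified(intensities, cutoff=20.0):
    """A peptide is often called 'precisely quantified' below a CV cutoff."""
    return cv_percent(intensities) <= cutoff
```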
43

Cunsolo, Serena, John Williams, Michelle Hale, Daniel S. Read et Fay Couceiro. « Optimising sample preparation for FTIR-based microplastic analysis in wastewater and sludge samples : multiple digestions ». Analytical and Bioanalytical Chemistry 413, no 14 (23 avril 2021) : 3789–99. http://dx.doi.org/10.1007/s00216-021-03331-6.

Texte intégral
Résumé :
The lack of standardised methodologies in microplastic research has been addressed in recent years as it hampers the comparison of results across studies. The quantification of microplastics in the environment is key to the assessment of the potential eco-toxicological impacts that this new category of emerging pollutants could have on terrestrial and aquatic species. Therefore, the need for protocols that are robust, simple and reliable together with their standardisation are of crucial importance. This study has focused on removal of organic matter with Fenton reagent from wastewater and sludge samples. This step of analysis was optimised by implementing a multi-digestion treatment on these samples that have high concentration of complex mixtures of organic matter, which interfere with microplastic enumeration. Moreover, this study targeted the detection of microplastics in the sub-hundred-micron size range due to the potential higher risks associated with smaller-sized particles and the limited data available from previous wastewater research. To show the validity of the method, triplicate samples of raw sewage, final effluent and sludge were independently spiked with two different sizes and types of microplastic polymers. Due to the various analytical stages required for the isolation of microplastics, time is a limiting factor in sample processing. The sequential digestion with Fenton reagent represents an inexpensive and time-efficient procedure for wastewater research providing effective degradation of organic material. These advantages over other currently available methods mean the method is suitable for analysis of large numbers of samples allowing robust monitoring data sets to be generated.
Styles APA, Harvard, Vancouver, ISO, etc.
44

Shi, Peng, Bin Li, Jindong Huo et Lei Wen. « A Smart High-Throughput Experiment Platform for Materials Corrosion Study ». Scientific Programming 2016 (2016) : 1–9. http://dx.doi.org/10.1155/2016/6876241.

Texte intégral
Résumé :
Materials corrosion study is based on numerous contrast experiments. Traditional corrosion experiments are time-consuming and require manual evaluation of the corrosion grade during the experiment. To improve experimental efficiency, a high-throughput experiment platform was designed and built. The platform mainly consists of a high-throughput corrosion reaction facility, a data acquisition system, and a data processing system. The corrosion reaction facility supports high-throughput materials corrosion reactions under various conditions. The data acquisition system is mainly responsible for capturing images of the samples' surfaces, collecting electrochemical signals, and storing them in the computer in real time. The data processing system treats the acquired data and automatically evaluates the degree of materials corrosion in real time. The platform not only reduces equipment occupancy but also improves the efficiency of sample preparation and experiment execution. The experimental data show that the platform can easily accomplish high-throughput corrosion contrast experiments and noticeably reduce the time cost.
Styles APA, Harvard, Vancouver, ISO, etc.
45

Zander, Ulrich, Guillaume Hoffmann, Irina Cornaciu, Jean-Pierre Marquette, Gergely Papp, Christophe Landret, Gaël Seroul et al. « Automated harvesting and processing of protein crystals through laser photoablation ». Acta Crystallographica Section D Structural Biology 72, no 4 (24 mars 2016) : 454–66. http://dx.doi.org/10.1107/s2059798316000954.

Texte intégral
Résumé :
Currently, macromolecular crystallography projects often require the use of highly automated facilities for crystallization and X-ray data collection. However, crystal harvesting and processing largely depend on manual operations. Here, a series of new methods are presented based on the use of a low X-ray-background film as a crystallization support and a photoablation laser that enable the automation of major operations required for the preparation of crystals for X-ray diffraction experiments. In this approach, the controlled removal of the mother liquor before crystal mounting simplifies the cryocooling process, in many cases eliminating the use of cryoprotectant agents, while crystal-soaking experiments are performed through diffusion, precluding the need for repeated sample-recovery and transfer operations. Moreover, the high-precision laser enables new mounting strategies that are not accessible through other methods. This approach bridges an important gap in automation and can contribute to expanding the capabilities of modern macromolecular crystallography facilities.
Styles APA, Harvard, Vancouver, ISO, etc.
46

Zakrzewski, Jacek, Karol Strzałkowski, Mohammed Boumhamdi, Agnieszka Marasek, Ali Abouais et Daniel M. Kamiński. « Photothermal Determination of the Surface Treatment of Cd1-xBexTe Mixed Crystals ». Applied Sciences 13, no 4 (7 février 2023) : 2113. http://dx.doi.org/10.3390/app13042113.

Texte intégral
Résumé :
Cd1−xBexTe, a new material with potential for X-ray and γ-ray detectors, was analyzed by photothermal piezoelectric spectroscopy. The samples were tested depending on beryllium content and surface preparation. The main aim of the measurements was to extract the energy gap values, which were found for x = 0.01, 0.03, 0.05, and 0.1. It was shown that mechanical (polishing) and chemical (etching) treatment strongly influenced the amplitude and phase spectra of CdBeTe crystals. Piezoelectric spectroscopy allowed for comparing the quality of preparation of both surfaces for a single sample. The sub-surface damaged layer that was created as a result of surface processing had different thermal parameters than the bulk part of the sample. It was responsible for the additional peaks in the amplitude spectrum and changes in the phase spectrum of the photothermal signal. Two different methods of sample etching were analyzed. One completely quenched the signal, and the other did not eliminate the defects present on the surface after the cutting process. The article presents the preliminary interpretation of experimental data using the modified Blonskij model.
Styles APA, Harvard, Vancouver, ISO, etc.
47

Vowinckel, Jakob, Floriana Capuano, Kate Campbell, Michael J. Deery, Kathryn S. Lilley et Markus Ralser. « The beauty of being (label)-free : sample preparation methods for SWATH-MS and next-generation targeted proteomics ». F1000Research 2 (7 avril 2014) : 272. http://dx.doi.org/10.12688/f1000research.2-272.v2.

Texte intégral
Résumé :
The combination of qualitative analysis with label-free quantification has greatly facilitated the throughput and flexibility of novel proteomic techniques. However, such methods rely heavily on robust and reproducible sample preparation procedures. Here, we benchmark a selection of in gel, on filter, and in solution digestion workflows for their application in label-free proteomics. Each procedure was associated with differing advantages and disadvantages. The in gel methods interrogated were cost effective, but were limited in throughput and digest efficiency. Filter-aided sample preparations facilitated reasonable processing times and yielded a balanced representation of membrane proteins, but led to a high signal variation in quantification experiments. Two in solution digest protocols, however, gave optimal performance for label-free proteomics. A protocol based on the detergent RapiGest led to the highest number of detected proteins at second-best signal stability, while a protocol based on acetonitrile-digestion, RapidACN, scored best in throughput and signal stability but came second in protein identification. In addition, we compared label-free data dependent (DDA) and data independent (SWATH) acquisition on a TripleTOF 5600 instrument. While largely similar in protein detection, SWATH outperformed DDA in quantification, reducing signal variation and markedly increasing the number of precisely quantified peptides.
Styles APA, Harvard, Vancouver, ISO, etc.
48

McNichol, A. P., A. R. Gagnon, E. A. Osborne, D. L. Hutton, K. F. Von Reden et R. J. Schneider. « Improvements in Procedural Blanks at NOSAMS : Reflections of Improvements in Sample Preparation and Accelerator Operation ». Radiocarbon 37, no 2 (1995) : 683–91. http://dx.doi.org/10.1017/s0033822200031209.

Texte intégral
Résumé :
During the four years the Sample Preparation Laboratory (SPL) at the National Ocean Sciences Accelerator Mass Spectrometer (NOSAMS) Facility has been in operation, we have accumulated much data from which we can assess our progress. We evaluate our procedural blanks here and describe modifications in our procedures that have improved our analyses of older samples. In the SPL, we convert three distinct types of samples—seawater, CaCO3 and organic carbon—to CO2 prior to preparing graphite for the accelerator and have distinct procedural blanks for each procedure. Dissolved inorganic carbon (∑CO2) is extracted from acidified seawater samples by sparging with a nitrogen carrier gas. We routinely analyze “line blanks” by processing CO2 from a 14C-dead source through the entire stripping procedure. Our hydrolysis blank, IAEA C-1, is prepared by acidifying in vacuo with 100% H3PO4 at 60° overnight, identical to our sample preparation. We use a dead graphite, NBS-21, or a commercially available carbon powder for our organic combustion blank; organic samples are combusted at 850° for 5 h using CuO to provide the oxidant. Analysis of our water stripping data suggests that one step in the procedure contributes the major portion of the line blank. At present, the contribution from the line blank has no effect on our seawater analyses (fraction modern (fm) between 0.7 and 1.2). Our hydrolysis blanks can have an fm value as low as 0.0006, but are more routinely between 0.0020 and 0.0025. The fm of our best organic combustion blanks is higher than those routinely achieved in other laboratories and we are currently altering our methods to reduce it.
Styles APA, Harvard, Vancouver, ISO, etc.
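Procedural blanks like those reported in this entry are commonly removed from measurements with a simple two-component mixing model. The sketch below shows that generic mass-balance correction, not the NOSAMS in-house procedure; the function name and the assumption that the blank's mass fraction is known are illustrative:

```python
# Illustrative mass-balance blank correction: the measured fraction
# modern (fm) is treated as a mixture of sample carbon and blank carbon,
#   fm_measured = (1 - f) * fm_sample + f * fm_blank,
# where f is the blank's mass fraction of the total carbon.

def blank_corrected_fm(fm_measured, fm_blank, blank_fraction):
    """Recover the sample's fraction modern given the blank's fm and
    the mass fraction of blank carbon in the measured aliquot."""
    if not 0 <= blank_fraction < 1:
        raise ValueError("blank_fraction must be in [0, 1)")
    return (fm_measured - blank_fraction * fm_blank) / (1 - blank_fraction)
```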
49

Kutnjak, Denis, Lucie Tamisier, Ian Adams, Neil Boonham, Thierry Candresse, Michela Chiumenti, Kris De Jonghe et al. « A Primer on the Analysis of High-Throughput Sequencing Data for Detection of Plant Viruses ». Microorganisms 9, no 4 (14 avril 2021) : 841. http://dx.doi.org/10.3390/microorganisms9040841.

Full text
Abstract:
High-throughput sequencing (HTS) technologies have become indispensable tools for plant virus diagnostics and research thanks to their ability to detect any plant virus in a sample without prior knowledge. Because HTS technologies rely heavily on bioinformatic analysis of the huge volumes of sequence data they generate, it is of utmost importance that researchers have efficient and reliable bioinformatic tools and understand the principles, advantages, and disadvantages of the tools used. Here, we present a critical overview of the steps involved in HTS as employed for plant virus detection and virome characterization. We start from sample preparation and nucleic acid extraction as appropriate to the chosen HTS strategy, followed by basic data analysis requirements, an extensive overview of the in-depth data processing options, and taxonomic classification of the viral sequences detected. By presenting the bioinformatic tools and a detailed overview of the consecutive steps that can be used to implement a well-structured HTS data analysis in an easy and accessible way, this paper is targeted at both beginners and expert scientists engaging in HTS plant virome projects.
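One of the basic data analysis steps the overview covers is read quality filtering before downstream classification. The sketch below shows that step in a minimal form; the thresholds, helper names, and in-memory reads are illustrative assumptions, not the paper's pipeline or any specific tool's defaults.

```python
# Illustrative sketch of a basic HTS pre-processing step: dropping short
# or low-quality reads before virus detection. Reads are (sequence,
# quality-string) pairs; thresholds below are made up for the example.

def mean_quality(qual_string, offset=33):
    """Mean Phred score of a read, assuming Sanger (offset-33) encoding."""
    scores = [ord(c) - offset for c in qual_string]
    return sum(scores) / len(scores)

def filter_reads(reads, min_len=50, min_q=20):
    """Keep reads that are long enough and of sufficient mean quality."""
    kept = []
    for seq, qual in reads:
        if len(seq) >= min_len and mean_quality(qual) >= min_q:
            kept.append((seq, qual))
    return kept

# 'I' encodes Phred 40 (good); '!' encodes Phred 0 (unusable)
reads = [("ACGT" * 20, "I" * 80), ("ACGT" * 5, "!" * 20)]
print(len(filter_reads(reads)))  # → 1
```

Dedicated tools handle this at scale, but the same length and mean-quality criteria underlie most read-filtering configurations.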
APA, Harvard, Vancouver, ISO, etc. styles
50

Madhuravani, B., Srujan Atluri and Hema Valpadasu. “An IoT based Machine Learning Technique for Efficient Online Load Forecasting.” Revista Gestão Inovação e Tecnologias 11, no. 2 (5 June 2021): 547–54. http://dx.doi.org/10.47059/revistageintec.v11i2.1686.

Full text
Abstract:
Internet of Things (IoT) networks are computer networks with acute IT-security challenges, in particular in the monitoring of computer threats. The paper proposes a combination of machine learning methods and parallel data analysis to address this challenge. An architecture and a new approach to combining key classifiers for detecting IoT network attacks are developed. A classification problem statement is formulated in which the ratio of accuracy to training time serves as the integral measure of effectiveness. To speed up training and evaluation, the parallel data processing and multi-threaded mode offered by Spark are used. In addition, a dataset preprocessing approach is proposed that significantly reduces sample length. An experimental evaluation of the proposed approach shows that the precision of IoT network attack detection reaches 100%, and that data processing speed increases with the number of parallel threads.
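The classifier-combination and multi-threading ideas in this abstract can be sketched as a majority vote over several detectors scored in parallel threads. Everything below is a stand-in under our own assumptions: the toy lambda "classifiers", feature names, and thread pool are illustrative, not the paper's models or its Spark implementation.

```python
# Minimal sketch of combining several attack detectors by majority vote
# and scoring samples in parallel threads (a stand-in for Spark's
# multi-threaded mode mentioned in the abstract).
from collections import Counter
from concurrent.futures import ThreadPoolExecutor

def majority_vote(labels):
    """Return the most common label among the individual predictions."""
    return Counter(labels).most_common(1)[0][0]

def classify(sample, classifiers):
    """Run every classifier on the sample and combine their votes."""
    return majority_vote([clf(sample) for clf in classifiers])

def classify_batch(samples, classifiers, workers=4):
    """Score many samples concurrently in a thread pool."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(lambda s: classify(s, classifiers), samples))

# Toy detectors flagging "attack" on simple features of a packet dict
clfs = [
    lambda p: "attack" if p["size"] > 1000 else "normal",
    lambda p: "attack" if p["rate"] > 100 else "normal",
    lambda p: "attack" if p["port"] in {23, 2323} else "normal",
]
print(classify({"size": 1500, "rate": 150, "port": 80}, clfs))  # → attack
```

Two of the three toy detectors fire on the example packet, so the vote labels it an attack; the real system combines trained classifiers rather than hand-written rules.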
APA, Harvard, Vancouver, ISO, etc. styles
