Academic literature on the topic 'GCxGC, sample preparation, data processing'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'GCxGC, sample preparation, data processing.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "GCxGC, sample preparation, data processing"

1. Laycock, Paul. "Data Preparation for NA62." EPJ Web of Conferences 214 (2019): 02017. http://dx.doi.org/10.1051/epjconf/201921402017.

Abstract:
In 2017, NA62 recorded over a petabyte of raw data, collecting around a billion events per day of running. Data are collected in bursts of 3-5 seconds, producing output files of a few gigabytes. A typical run, a sequence of bursts with the same detector configuration and similar experimental conditions, contains 1500 bursts and constitutes the basic unit for offline data processing. A sample of 100 random bursts is used to make timing calibrations of all detectors, after which every burst in the run is reconstructed. Finally, the reconstructed events are filtered by physics channel with an average reduction factor of 20, and data quality metrics are calculated. Initially, a bespoke data processing solution was implemented using a simple finite state machine with limited production system functionality. In 2017, the ATLAS Tier-0 team offered the use of their production system, together with the necessary support. Data processing workflows were rewritten with better error handling, I/O operations were minimised, the reconstruction software was improved, and conditions data handling was changed to follow best practices suggested by the HEP Software Foundation conditions database working group. This contribution describes the experience gained in using these tools and methods for data processing in a petabyte-scale experiment.
2. Züllig, Thomas, Martin Trötzmüller, and Harald C. Köfeler. "Lipidomics from sample preparation to data analysis: a primer." Analytical and Bioanalytical Chemistry 412, no. 10 (December 10, 2019): 2191–209. http://dx.doi.org/10.1007/s00216-019-02241-y.

Abstract:
Lipids are amongst the most important organic compounds in living organisms, where they serve as building blocks for cellular membranes as well as energy storage and signaling molecules. Lipidomics is the science of the large-scale determination of individual lipid species, and the underlying analytical technology that is used to identify and quantify the lipidome is generally mass spectrometry (MS). This review article provides an overview of the crucial steps in MS-based lipidomics workflows, including sample preparation, either liquid–liquid or solid-phase extraction, derivatization, chromatography, ion-mobility spectrometry, MS, and data processing by various software packages. The associated concepts are discussed from a technical perspective as well as in terms of their application. Furthermore, this article sheds light on recent advances in the technology used in this field and its current limitations. Particular emphasis is placed on data quality assurance and adequate data reporting; some of the most common pitfalls in lipidomics are discussed, along with how to circumvent them.
3. Hattne, Johan, Francis E. Reyes, Brent L. Nannenga, Dan Shi, M. Jason de la Cruz, Andrew G. W. Leslie, and Tamir Gonen. "MicroED data collection and processing." Acta Crystallographica Section A Foundations and Advances 71, no. 4 (July 1, 2015): 353–60. http://dx.doi.org/10.1107/s2053273315010669.

Abstract:
MicroED, a method at the intersection of X-ray crystallography and electron cryo-microscopy, has rapidly progressed by exploiting advances in both fields and has already been successfully employed to determine the atomic structures of several proteins from sub-micron-sized, three-dimensional crystals. A major limiting factor in X-ray crystallography is the requirement for large and well ordered crystals. By permitting electron diffraction patterns to be collected from much smaller crystals, or even single well ordered domains of large crystals composed of several small mosaic blocks, MicroED has the potential to overcome the limiting size requirement and enable structural studies on difficult-to-crystallize samples. This communication details the steps for sample preparation, data collection and reduction necessary to obtain refined, high-resolution, three-dimensional models by MicroED, and presents some of its unique challenges.
4. Young, R. J. "Automation of Focused Ion Beam (FIB) Sample Preparation." Microscopy and Microanalysis 6, S2 (August 2000): 512–13. http://dx.doi.org/10.1017/s1431927600035054.

Abstract:
The use of focused ion beam (FIB) systems is well established as a sample preparation and imaging tool in a wide range of applications, most notably in the semiconductor and data storage industries, but also within the material and biological sciences (Figs. 1-3). The real benefit of the FIB is that the same ion beam that is used for material removal and deposition is also used for imaging the sample, which results in highly precise and localized sample preparation. In addition, the FIB can be used to prepare samples through multiple layers with different material properties, and allows the rest of the specimen to be kept intact for further analysis or processing. FIB is most commonly used to prepare samples for the transmission electron microscope (TEM), the scanning electron microscope (SEM), and for the FIB itself. The FIB, which is an imaging tool in its own right, can produce secondary-electron and -ion images and collect secondary ion mass spectrometry (SIMS) data.
5. Casadonte, Rita, Jörg Kriegsmann, Mark Kriegsmann, Katharina Kriegsmann, Roberta Torcasio, Maria Eugenia Gallo Cantafio, Giuseppe Viglietto, and Nicola Amodio. "A Comparison of Different Sample Processing Protocols for MALDI Imaging Mass Spectrometry Analysis of Formalin-Fixed Multiple Myeloma Cells." Cancers 15, no. 3 (February 3, 2023): 974. http://dx.doi.org/10.3390/cancers15030974.

Abstract:
Sample processing of formalin-fixed specimens constitutes a major challenge in molecular profiling efforts. Pre-analytical factors such as fixative temperature, dehydration, and embedding media affect downstream analysis, generating data dependent on technical processing rather than disease state. In this study, we investigated two different sample processing methods, including the use of cytospin sample preparation and automated sample processing apparatuses, for proteomic analysis of multiple myeloma (MM) cell lines using imaging mass spectrometry (IMS). In addition, two sample-embedding instruments using different reagents and processing times were considered. Three MM cell lines fixed in 4% paraformaldehyde were either directly centrifuged onto glass slides using cytospin preparation techniques or processed to create paraffin-embedded specimens with an automatic tissue processor, and further cut onto glass slides for IMS analysis. The number of peaks obtained from paraffin-embedded samples was comparable between the two different sample processing instruments. Interestingly, spectra profiles showed enhanced ion yield in cytospin compared to paraffin-embedded samples, along with high reproducibility across sample replicates.
6. Behl, Isha, Genecy Calado, Ola Ibrahim, Alison Malkin, Stephen Flint, Hugh J. Byrne, and Fiona M. Lyng. "Development of methodology for Raman microspectroscopic analysis of oral exfoliated cells." Analytical Methods 9, no. 6 (2017): 937–48. http://dx.doi.org/10.1039/c6ay03360a.

7. Golosov, Andrei, Olga Lubimova, Mikhail Zhevora, Vladislava Markevich, and Vladimir Siskov. "Data processing method for experimental studies of deformation in a rock sample under uniaxial compression." E3S Web of Conferences 129 (2019): 01018. http://dx.doi.org/10.1051/e3sconf/201912901018.

Abstract:
Experimental and theoretical studies have shown that the behavior of rocks in a near-failure state is governed by the focal nature of macrocrack preparation, in which the mesocrack structure of the material is the main element. This new approach to mathematical modeling makes it possible to adequately describe dissipative mesocrack structures at various hierarchical levels of the geological medium and to predict dynamic changes in the structure and mechanical properties of both rock samples and the rock mass, reducing the need for resource-intensive experimental studies. In this paper, using methods of cluster, factor, and statistical analysis, we address the task of processing data from experimental studies of deformation patterns and macro-fracture preparation in rock samples obtained by various methods, including acoustic and deformation observations.
8. Souza, T. G. F., V. S. T. Ciminelli, and N. D. S. Mohallem. "An assessment of errors in sample preparation and data processing for nanoparticle size analyses by AFM." Materials Characterization 109 (November 2015): 198–205. http://dx.doi.org/10.1016/j.matchar.2015.09.020.

9. Jolivet, L., V. Motto-Ros, L. Sorbier, T. Sozinho, and C. P. Lienemann. "Quantitative imaging of carbon in heterogeneous refining catalysts." Journal of Analytical Atomic Spectrometry 35, no. 5 (2020): 896–903. http://dx.doi.org/10.1039/c9ja00434c.

10. Rodrigues, Ana M., Ana I. Ribeiro-Barros, and Carla António. "Experimental Design and Sample Preparation in Forest Tree Metabolomics." Metabolites 9, no. 12 (November 22, 2019): 285. http://dx.doi.org/10.3390/metabo9120285.

Abstract:
Appropriate experimental design and sample preparation are key steps in metabolomics experiments, highly influencing the biological interpretation of the results. The sample preparation workflow for plant metabolomics studies includes several steps before metabolite extraction and analysis, including laboratory procedures that should be optimized for different plant species and tissues. This is particularly the case for trees, whose tissues are complex matrices to work with due to the presence of several interferents, such as oleoresins and cellulose. A good experimental design, tree tissue harvest conditions, and sample preparation are crucial to ensure consistency and reproducibility of the metadata among datasets. In this review, we discuss the main challenges when setting up a forest tree metabolomics experiment for mass spectrometry (MS)-based analysis, covering all technical aspects from the biological question formulation and experimental design to sample processing, metabolite extraction, and data acquisition. We also highlight the importance of forest tree metadata standardization in metabolomics studies.

Dissertations / Theses on the topic "GCxGC, sample preparation, data processing"

1. Zhang, Penghan. "Application of GC×GC-MS in VOC analysis of fermented beverages." Doctoral thesis, Università degli studi di Trento, 2021. http://hdl.handle.net/11572/323992.

Abstract:
GC×GC is an efficient tool for the analysis of volatile compounds. However, improvements are still required in VOC extraction, GC×GC setup, and data processing. Different sample preparation techniques and GC×GC setups were compared based on a literature study and experimental results. Each VOC extraction technology has its own drawbacks and needs new developments. There was no ideal sample preparation technique to recover all the VOCs from a beverage sample; furthermore, the VOCs recovered by different techniques were very different. The discussion of the pros and cons of the different techniques in our study can serve as a guide for the further development and improvement of these techniques. Combining the results from different sample preparation techniques is necessary to achieve a higher coverage of global VOC profiling. For the known fermentative aromatic compounds, the best coverage can be reached by using SPME together with SPE for beer, and VALLME for wine and cider. Fine GC×GC method development involves modulator selection, column combination, and parameter optimization. A thermal modulator provides high detection sensitivity and allows exceptional trace analysis. Since analyte coverage is the most important factor in beverage VOC profiling, thermal modulation is a better choice. In fermented beverages, there are more polar compounds than non-polar compounds, so the most suitable column combination is polar-semipolar. The same column diameter should be used on both dimensions to minimize column overloading. GC×GC parameters must be optimized; because these parameters interact with each other, a statistical prediction model is required. A response surface model can do this job while using a small number of experimental tests. The nearest neighbor distance was a suitable measurement of peak dispersion.
Column and detector saturation are unavoidable if a metabolic sample is measured at a single dilution level, and incorrect peak deconvolution and mass spectrum construction may result. Data processing results can be improved by a two-stage strategy that incorporates a targeted data processing and cleaning approach upstream of the "standard" untargeted analysis. Our experiments show a significant improvement in annotation and quantification results for targeted compounds causing instrumental saturation. After subtracting the saturated signal of targeted compounds, the MS construction was improved for co-eluted compounds. Incomplete signal subtraction may occur, leading to the detection of false positive peaks or to interference with the construction of mass spectra of co-eluted peaks. High-resolution MS libraries and more accurate peak area detection methods should be tested for further improvement.
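The nearest-neighbor distance mentioned in this abstract as a measure of peak dispersion is simple to compute over the 2D separation plane. The sketch below is a generic illustration, not code from the thesis; the function name and the example peak coordinates (first- and second-dimension retention times) are hypothetical.

```python
import math

def mean_nearest_neighbor_distance(peaks):
    """Average distance from each peak to its closest neighbor.

    `peaks` is a list of (rt1, rt2) retention-time pairs on the first
    and second GCxGC dimensions; a larger value indicates peaks spread
    more evenly over the separation plane.
    """
    if len(peaks) < 2:
        raise ValueError("need at least two peaks")
    total = 0.0
    for i, (x1, y1) in enumerate(peaks):
        nearest = min(
            math.hypot(x1 - x2, y1 - y2)
            for j, (x2, y2) in enumerate(peaks) if j != i
        )
        total += nearest
    return total / len(peaks)

# Evenly spread peaks score higher than a clustered arrangement.
spread = [(0, 0), (10, 0), (0, 10), (10, 10)]
clustered = [(0, 0), (1, 0), (0, 1), (10, 10)]
```

During parameter optimization, this single number can serve as the response variable that a response surface model maximizes.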

Book chapters on the topic "GCxGC, sample preparation, data processing"

1. Muskett, Frederick W. "Sample Preparation, Data Collection and Processing." In Protein NMR Spectroscopy: Practical Techniques and Applications, 5–21. Chichester, UK: John Wiley & Sons, Ltd, 2011. http://dx.doi.org/10.1002/9781119972006.ch1.

2. Abidi, Noureddine. "Sample Preparation, Data Acquisition, Spectral Data Processing and Analysis." In FTIR Microspectroscopy, 125–28. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-84426-4_8.

3. Souza, Amanda L., and Gary J. Patti. "A Protocol for Untargeted Metabolomic Analysis: From Sample Preparation to Data Processing." In Methods in Molecular Biology, 357–82. New York, NY: Springer US, 2021. http://dx.doi.org/10.1007/978-1-0716-1266-8_27.

4. Assaiya, Anshul, Suparna Bhar, and Janesh Kumar. "Advances in sample preparation and data processing for single-particle cryo-electron microscopy." In Advances in Protein Molecular and Structural Biology Methods, 291–309. Elsevier, 2022. http://dx.doi.org/10.1016/b978-0-323-90264-9.00019-2.

5. Carazzone, Chiara, Julie P.G. Rodríguez, Mabel Gonzalez, and Gerson-Dirceu López. "Volatilomics of Natural Products: Whispers from Nature." In Metabolomics - Methodology and Applications in Medical Sciences and Life Sciences. IntechOpen, 2021. http://dx.doi.org/10.5772/intechopen.97228.

Abstract:
Volatilomics studies the emission of volatile compounds from living organisms like plants, flowers, animals, fruits, and microorganisms, using metabolomics tools to characterize the analytes. This is a complex process that involves several steps like sample preparation, extraction, instrumental analysis, and data processing. In this chapter, we provide balanced coverage of the different theoretical and practical aspects of the study of the volatilome. Static and dynamic headspace techniques for volatile capture will be discussed. Then, the main techniques for volatilome profiling, separation, and detection will be addressed, emphasizing gas chromatographic separation, mass spectrometry detection, and non-separative techniques using mass spectrometry. Finally, the whole volatilome data pre-processing and multivariate statistics for data interpretation will be introduced. We hope that this chapter can provide the reader with an overview of the research process in the study of volatile organic compounds (VOCs) and serve as a guide in the development of future volatilomics studies.
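The data pre-processing step mentioned at the end of this abstract commonly includes autoscaling the peak table (mean-centering each compound and scaling it to unit variance) before multivariate statistics, so that abundant and trace compounds contribute comparably. The sketch below is a generic illustration with a hypothetical peak-area matrix, not code from the chapter.

```python
import math

def autoscale(matrix):
    """Mean-center and scale each column (compound) to unit variance.

    `matrix` is a list of rows (samples) of peak areas. Returns a new
    matrix suitable as input for multivariate models such as PCA or
    PLS-DA, where each compound carries equal weight.
    """
    n = len(matrix)
    cols = len(matrix[0])
    means = [sum(row[j] for row in matrix) / n for j in range(cols)]
    stds = [
        math.sqrt(sum((row[j] - means[j]) ** 2 for row in matrix) / (n - 1))
        for j in range(cols)
    ]
    return [
        [(row[j] - means[j]) / stds[j] for j in range(cols)]
        for row in matrix
    ]

# Three samples, two compounds with very different raw intensities.
areas = [[100.0, 2.0], [200.0, 4.0], [300.0, 6.0]]
scaled = autoscale(areas)
```

After autoscaling, both compounds span the same standardized range, regardless of their raw intensity scale.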

Conference papers on the topic "GCxGC, sample preparation, data processing"

1. Fraczkiewicz, A., S. Moreau, T. Mourier, P. Bleuet, P. O. Autran, E. Capria, P. Cloetens, J. Da Silva, S. Lhostis, and F. Lorut. "Making Synchrotron Tomography a Routine Tool for 3D Integration Failure Analysis through a Limited Number of Projections, an Adapted Sample Preparation Scheme, and a Fully-Automated Post-Processing." In ISTFA 2017. ASM International, 2017. http://dx.doi.org/10.31399/asm.cp.istfa2017p0014.

Abstract:
3D integration is becoming more and more important in the microelectronics industry. This paper focuses on two types of objects: copper pillars (25 micrometers in diameter) and hybrid bonding samples. It aims at a statistical morphology observation of hybrid bonding structures which underwent an electromigration test at 350 deg C and 20 mA. The goal of the study is two-fold: to limit the overall time needed to perform a whole process flow, from sample preparation to reconstructed volume, and to limit the time of human intervention. To achieve this goal, three strategies are presented: improving the sample preparation scheme, reducing the number of projections with iterative algorithms and the Structural SIMilarity function, and automating the post-processing. The post-processing of the data is fully automated and directly renders the reconstructed volume. The high signal-to-noise ratio allows for further segmentation and analysis.
2. Shokri, Shiva, Pooria Sedigh, Mehdi Hojjati, and Tsz-Ho Kwok. "A Deterministic Inspection of Surface Preparation for Metalization." In ASME 2022 17th International Manufacturing Science and Engineering Conference. American Society of Mechanical Engineers, 2022. http://dx.doi.org/10.1115/msec2022-85334.

Abstract:
To improve the surface properties of fiber-reinforced polymer composites, one method is to employ thermal spray to apply a coating on the composite. For this purpose, it uses a metal mesh serving as an anchor between the composite and the coating to increase adhesion. However, the composite manufacturing covers the metal mesh with resin, and getting an acceptable coating is only possible through an optimum exposure of the metal mesh by sand blasting prior to coating. Therefore, this study aims to develop a computer vision and image processing method to inspect the parts and provide the operator with feedback. Initially, this approach takes the images from a single-view microscope as the inputs, and then it classifies the images into two regions of resin and metal mesh using Otsu's adaptive thresholding. Next, it segments the resin areas into distinct connected clusters, and it makes a histogram based on the clusters' size. Finally, the distribution of the histogram can determine the status of the surface preparation. The state-of-the-art has only examined sand-blasted composites manually, requiring expertise and experience. This research presents a deterministic method to automate the inspection process efficiently with an inexpensive portable digital microscope. This method is practical, especially when there is a lack of standardized data for machine learning. The experimental results show that the method can produce different histograms for various samples, and it can successfully distinguish whether a sample is under-blasted, proper-blasted, or over-blasted. This study also has applications to various fields of manufacturing for defect detection and closed-loop control.
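The two core steps of the pipeline described above, Otsu's adaptive thresholding followed by connected-cluster segmentation, can be illustrated with a minimal pure-Python sketch. The function names and toy data are hypothetical; a real implementation would operate on full microscope images via an imaging library.

```python
from collections import deque

def otsu_threshold(pixels):
    """Otsu's threshold over 8-bit grayscale values: the gray level
    that maximizes between-class variance, separating (here) resin
    from metal-mesh pixels."""
    hist = [0] * 256
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    total_sum = sum(i * h for i, h in enumerate(hist))
    best_t, best_var = 0, -1.0
    w0, sum0 = 0, 0.0
    for t in range(256):
        w0 += hist[t]
        if w0 == 0:
            continue
        w1 = total - w0
        if w1 == 0:
            break
        sum0 += t * hist[t]
        mu0 = sum0 / w0
        mu1 = (total_sum - sum0) / w1
        var = w0 * w1 * (mu0 - mu1) ** 2
        if var > best_var:
            best_var, best_t = var, t
    return best_t

def cluster_sizes(mask):
    """Sizes of 4-connected foreground clusters in a binary 2D mask,
    from which a cluster-size histogram can be built."""
    rows, cols = len(mask), len(mask[0])
    seen = [[False] * cols for _ in range(rows)]
    sizes = []
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and not seen[r][c]:
                size, queue = 0, deque([(r, c)])
                seen[r][c] = True
                while queue:
                    y, x = queue.popleft()
                    size += 1
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and mask[ny][nx] and not seen[ny][nx]):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                sizes.append(size)
    return sizes

mask = [[1, 1, 0], [0, 0, 0], [0, 0, 1]]
sizes = cluster_sizes(mask)  # two clusters, sizes [2, 1]
```

The distribution of `sizes` plays the role of the histogram whose shape the paper uses to classify a surface as under-, proper-, or over-blasted.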
3. Chen, Antao, Vadim Chuyanov, Sean Garner, William H. Steier, and Larry R. Dalton. "Modified attenuated total reflection for the fast and routine electrooptic measurements of nonlinear optical polymer thin films." In Organic Thin Films for Photonic Applications. Washington, D.C.: Optica Publishing Group, 1997. http://dx.doi.org/10.1364/otfa.1997.the.18.

Abstract:
In the course of developing EO polymers, a convenient and fast method to obtain the electrooptic (EO) coefficients, r33 and r13, is highly desirable. Some of the existing EO measurement techniques, such as Fabry-Perot interferometry(1) and ellipsometry(2), require metal deposition and electrode processing for each test sample. Therefore, they are not suited for daily sample testing. Second harmonic generation (SHG) is an indirect method to measure the EO coefficients(3). It is usually performed with a 1.064 μm Nd:YAG laser and encounters difficulties with polymers that contain high μβ chromophores, because these chromophores usually have red-shifted absorption peaks that cause the Kleinman symmetry, a fundamental assumption of this technique, to break down. Attenuated total reflection (ATR) can directly measure the EO coefficients with no restriction on the wavelength of the absorption peak. One measurement scan provides the refractive index, the thickness, and an EO coefficient. Typically, the thin film electrode in contact with the test sample is made of Au or Ag in conventional ATR techniques(4). These metal thin films are soft and not durable for repeated measurements. The curve fitting algorithm for data processing(4, 5) is also inconvenient for fast sample evaluation. In this paper, a modified ATR technique for routine EO measurement is presented which does not require electrode preparation for each test sample and uses a simple algorithm for data processing.
4. Eberle, A. L., T. Garbowski, S. Nickell, and D. Zeidler. "Speeding up Chip Layer Imaging with a Multi-Beam SEM." In ISTFA 2019. ASM International, 2019. http://dx.doi.org/10.31399/asm.cp.istfa2019p0283.

Abstract:
Reverse engineering of today's integrated circuits requires proper sample preparation, high speed imaging and data processing capabilities. The electron-optical design and the data handling architecture of our multi-beam scanning electron microscopes are scalable over a large range of beam numbers, providing sufficient imaging speed - also for the foreseeable future. A first step in data processing for reverse engineering on images acquired with a multi-beam scanning electron microscope has been successfully shown in preliminary tests.
5. Klingfus, Joseph, Kevin Burcham, Martin Rasche, Thomas Borchert, and Niklas Damnik. "CHIPSCANNER: Reverse Engineering Solution for Microchips at the Nanometer-Scale." In ISTFA 2011. ASM International, 2011. http://dx.doi.org/10.31399/asm.cp.istfa2011p0373.

Abstract:
Chipscanning is the high-resolution, large-area, SEM image capture of complete (or partial) IC devices. Images are acquired sequentially in matrix-array fashion over an area of interest, and large image mosaics are created from the collection of smaller images. Chipscanning is of keen interest to those involved with component obsolescence, design verification, anti-counterfeiting, etc. Chipscanning, and subsequent processing of the images, can also be used to reverse engineer an IC device. The reverse engineering process can be broken down into three main tasks: sample preparation, data collection, and data processing. We present practical insight into the data collection and data processing tasks and discuss an instrument platform uniquely suited for imaging such devices.
6. Luo, Jian-Shing, Chia-Chi Huang, and Jeremy D. Russell. "Embedded Gold Markers for Improved TEM/STEM Tomography Reconstruction." In ISTFA 2008. ASM International, 2008. http://dx.doi.org/10.31399/asm.cp.istfa2008p0172.

Abstract:
Electron tomography includes four main steps: tomography data acquisition, image processing, 3D reconstruction, and visualization. After acquisition, tilt-series alignments are performed. Two methods are used to align the tilt-series: cross-correlation and feature tracking. Normally, fiducial markers about 10-20 nm in size, such as gold beads, are deposited onto one side of 100-mesh carbon-coated grids for the feature-tracking process. This paper presents a novel method for preparing electron tomography samples with gold beads inside to improve the feature-tracking process and the quality of 3D reconstruction. Results show that the novel electron tomography sample preparation method improves image alignment, which is essential for successful tomography in many contemporary semiconductor device structures.
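The cross-correlation alignment mentioned above amounts to finding the shift that maximizes the correlation between successive tilt images. The brute-force integer-shift sketch below is a hypothetical illustration with toy images; practical tomography packages use FFT-based correlation, sub-pixel refinement, and fiducial (gold bead) tracking instead.

```python
def best_shift(ref, img, max_shift=3):
    """Integer (dy, dx) that best aligns `img` to `ref` by maximizing
    the cross-correlation over the overlapping region."""
    rows, cols = len(ref), len(ref[0])
    best, best_score = (0, 0), float("-inf")
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            score = 0.0
            for y in range(rows):
                for x in range(cols):
                    sy, sx = y + dy, x + dx
                    if 0 <= sy < rows and 0 <= sx < cols:
                        score += ref[y][x] * img[sy][sx]
            if score > best_score:
                best_score, best = score, (dy, dx)
    return best

# Toy tilt pair: a single bright feature (e.g. a gold bead) moves
# from (2, 2) in the reference to (3, 4) in the next image.
ref = [[0] * 6 for _ in range(6)]
ref[2][2] = 1
img = [[0] * 6 for _ in range(6)]
img[3][4] = 1
shift = best_shift(ref, img)
```

The recovered shift is then applied to each image so that features stay registered through the whole tilt series.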
7. Courbon, Franck, Sergei Skorobogatov, and Christopher Woods. "Direct Charge Measurement in Floating Gate Transistors of Flash EEPROM Using Scanning Electron Microscopy." In ISTFA 2016. ASM International, 2016. http://dx.doi.org/10.31399/asm.cp.istfa2016p0327.

Abstract:
We present a characterization methodology for fast direct measurement of the charge accumulated on Floating Gate (FG) transistors of Flash EEPROM cells. Using a Scanning Electron Microscope (SEM) in Passive Voltage Contrast (PVC) mode we were able to distinguish between '0' and '1' bit values stored in each memory cell. Moreover, it was possible to characterize the remaining charge on the FG; thus making this technique valuable for Failure Analysis applications for data retention measurements in Flash EEPROM. The technique is at least two orders of magnitude faster than state-of-the-art Scanning Probe Microscopy (SPM) methods. Only a relatively simple backside sample preparation is necessary for accessing the FG of memory transistors. The technique presented was successfully implemented on a 0.35 μm technology node microcontroller and a 0.21 μm smart card integrated circuit. We also show the ease of such technique to cover all cells of a memory (using intrinsic features of SEM) and to automate memory cells characterization using standard image processing technique.
8. Schwaiger, Johannes, Tim C. Lueth, and Franz Irlinger. "G-Code Generation for a New Printing Process Based on 3D Plastic Polymer Droplet Generation." In ASME 2013 International Mechanical Engineering Congress and Exposition. American Society of Mechanical Engineers, 2013. http://dx.doi.org/10.1115/imece2013-63152.

Abstract:
Fabrication of plastic parts is an important scope of application for various branches of industry. This concerns not only manufacturing of end products but also production of sample parts in small lot sizes. Currently, most plastic parts are injection-molded. Consequently, it is first necessary to produce appropriate moldings, for example by milling of metal. This is very time-consuming on the one hand and uneconomical concerning costs for production in small lot sizes on the other. Furthermore, the variety of forms is restricted considerably, which is a clear disadvantage concerning the production of prototypes or spare parts. The use of a free-forming droplet generator for producing plastic parts can provide a remedy. The patented principle of the printing process used in this approach is to produce droplets of liquified plastic in a preparation unit. Sequential discharge of these droplets builds a part in the installation space by solidifying of the droplets into balls. Since each 3D printing process needs its own data preprocessing, this article presents its fundamentals. STL data is used as input data and allows almost any kind of geometry. In general, a typical workflow for processing STL data is as follows: slicing volume data in order to gain contours that form 2D borders, offsetting contours for a true-to-scale building process, filling of slices dependent on (offset) contours, and generation of machine code (g-code) that can be executed by the 3D printer in order to build an accurate and high-quality part. The model used in this approach is based on the droplets produced by the machine. A more detailed description of all the process-specific individual steps from slicing up to g-code generation is presented within the scope of this paper.
The continual development of custom-made algorithms based on process-specific models and parameters has resulted in the generation of g-code that could be executed on a 3D plastic polymer printer based on droplet generation for the first time. The resulting sample parts are very appealing. In conclusion, the results have shown that the whole production process can be a significant benefit especially for rapid prototyping of sample parts or spare parts.
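The last step of the workflow described above, turning slice contours into g-code, can be illustrated with a minimal sketch. The function, feed rate, and coordinates below are hypothetical, and droplet triggering and other machine-specific codes of the authors' process are deliberately omitted.

```python
def contour_to_gcode(contour, z, feed=1200.0):
    """Emit G-code moves tracing one closed slice contour at height z.

    `contour` is a list of (x, y) points in mm. Output is a list of
    G-code lines: a rapid move (G0) to the start point, then feed
    moves (G1) along the contour and back to close the loop.
    """
    lines = [f"G0 X{contour[0][0]:.3f} Y{contour[0][1]:.3f} Z{z:.3f}"]
    for x, y in contour[1:]:
        lines.append(f"G1 X{x:.3f} Y{y:.3f} F{feed:.0f}")
    x0, y0 = contour[0]
    lines.append(f"G1 X{x0:.3f} Y{y0:.3f} F{feed:.0f}")
    return lines

# One square slice contour at the first 0.2 mm layer.
square = [(0.0, 0.0), (10.0, 0.0), (10.0, 10.0), (0.0, 10.0)]
gcode = contour_to_gcode(square, z=0.2)
```

A full generator would repeat this per slice, after the offsetting and filling steps the abstract lists, and prepend machine setup codes.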
9. Tangyunyong, P., A. Y. Liang, A. W. Righter, D. L. Barton, and J. M. Soden. "Localizing Heat-Generating Defects Using Fluorescent Microthermal Imaging." In ISTFA 1996. ASM International, 1996. http://dx.doi.org/10.31399/asm.cp.istfa1996p0055.

Abstract:
Fluorescent microthermal imaging (FMI) involves coating a sample surface with a thin fluorescent film that, upon exposure to a UV light source, emits temperature-dependent fluorescence [1-7]. The principle behind FMI was thoroughly reviewed at the ISTFA in 1994 [8, 9]. In two recent publications [10, 11], we identified several factors in film preparation and data processing that dramatically improved the thermal resolution and sensitivity of FMI. These factors include signal averaging, the use of base mixture films, film stabilization, and film curing. These findings significantly enhance the capability of FMI as a failure analysis tool. In this paper, we show several examples that use FMI to quickly localize heat-generating defects ("hot spots"). When used with other failure analysis techniques such as focused ion beam (FIB) cross sectioning and scanning electron microscope (SEM) imaging, we demonstrate that FMI is a powerful tool to efficiently identify the root cause of failures in complex ICs. In addition to defect localization, we use a failing IC to determine the sensitivity of FMI (i.e., the lowest power that can be detected) in an ideal situation where the defects are very localized and near the surface.
10. Islam, M. Saiful, Jeffrey C. Suhling, and Pradeep Lall. "Measurement of the Constitutive Behavior of Underfill Encapsulants." In ASME 2003 International Electronic Packaging Technical Conference and Exhibition. ASMEDC, 2003. http://dx.doi.org/10.1115/ipack2003-35321.

Abstract:
Reliable, consistent, and comprehensive material property data are needed for microelectronic encapsulants for the purposes of mechanical design, reliability assessment, and process optimization of electronic packages. In our research efforts, the mechanical responses of several different capillary flow snap cure underfill encapsulants are being characterized. A microscale tension-torsion testing machine has been used to evaluate the uniaxial tensile stress-strain behavior of underfill materials as a function of temperature, strain rate, specimen dimensions, humidity, thermal cycling exposure, etc. A critical step in achieving accurate experimental results has been the development of a sample preparation procedure that produces mechanical test specimens reflecting the properties of true underfill encapsulant layers. In the developed method, 75–125 μm (3–5 mil) thick underfill uniaxial tension specimens are dispensed and cured using production equipment and the same processing conditions as those used with actual flip chip assemblies. Although several underfills have been examined, this work features results for the mechanical response of a single typical capillary flow snap cure underfill. A three-parameter hyperbolic tangent empirical model has been shown to provide accurate fits to the observed underfill nonlinear stress-strain behavior over a range of temperatures and strain rates. In addition, typical creep data are presented.

Reports on the topic "GCxGC, sample preparation, data processing"

1

Lehotay, Steven J., and Aviv Amirav. Fast, practical, and effective approach for the analysis of hazardous chemicals in the food supply. United States Department of Agriculture, April 2007. http://dx.doi.org/10.32747/2007.7695587.bard.

Abstract:
Background to the topic: For food safety and security reasons, hundreds of pesticides, veterinary drugs, and environmental pollutants should be monitored in the food supply, but current methods are too time-consuming, laborious, and expensive. As a result, only a tiny fraction of the food is tested for a limited number of contaminants. Original proposal objectives: Our main original goal was to develop fast, practical, and effective new approaches for the analysis of hazardous chemicals in the food supply. We proposed to extend the QuEChERS approach to more pesticides, veterinary drugs and pollutants, further develop GC-MS and LC-MS with SMB and combine QuEChERS with GC-SMB-MS and LC-SMB-EI-MS to provide the “ultimate” approach for the analysis of hazardous chemicals in food. Major conclusions, solutions and achievements: The original QuEChERS method was validated for more than 200 pesticide residues in a variety of food crops. For the few basic pesticides for which the method gave lower recoveries, an extensive solvent suitability study was conducted, and a buffering modification was made to improve results for difficult analytes. Furthermore, evaluation of the QuEChERS approach for fatty matrices, including olives and its oil, was performed. The QuEChERS concept was also extended to acrylamide analysis in foods. Other advanced techniques to improve speed, ease, and effectiveness of chemical residue analysis were also successfully developed and/or evaluated, which include: a simple and inexpensive solvent-in-silicone-tube extraction approach for highly sensitive detection of nonpolar pesticides in GC; ruggedness testing of low-pressure GC-MS for 3-fold faster separations; optimization and extensive evaluation of analyte protectants in GC-MS; and use of prototypical commercial automated direct sample introduction devices for GC-MS. 
GC-MS with SMB was further developed and combined with the Varian 1200 GC-MS/MS system, resulting in a new type of GC-MS with advanced capabilities. Careful attention was given to the subject of GC-MS sensitivity and its LOD for difficult-to-analyze samples, such as thermally labile pesticides or those with weak or no molecular ions, and record-low LODs were demonstrated and discussed. The new approach of electron ionization LC-MS with SMB was developed; its key components of sample vaporization nozzle and fly-through ion source were improved, and it was evaluated with a range of samples, including carbamate pesticides. A new method and software based on IAA were developed and tested on a range of pesticides in agricultural matrices. This IAA method and software, in combination with GC-MS and SMB, provide extremely high confidence in sample identification. A new type of comprehensive GCxGC (based on flow modulation) was uniquely combined with GC-MS with SMB, and we demonstrated improved pesticide separation and identification in complex agricultural matrices using this novel approach. An improved device for aroma sample collection and introduction (SnifProbe) was further developed and compared favorably with SPME for coffee aroma sampling. Implications, both scientific and agricultural: We succeeded in achieving significant improvements in the analysis of hazardous chemicals in the food supply, from easy sample preparation approaches, through sample analysis by advanced new types of GC-MS and LC-MS techniques, all the way to improved data analysis by lowering LOD and providing greater confidence in chemical identification. As a result, the combination of the QuEChERS approach, new and superior instrumentation, and the novel monitoring methods that were developed will enable vastly reduced time and cost of analysis, increased analytical scope, and a higher monitoring rate.
This provides better enforcement, an added impetus for farmers to use good agricultural practices, improved food safety and security, increased trade, and greater consumer confidence in the food supply.
2

Semaan, Dima, and Linda Scobie. Feasibility study for in vitro analysis of infectious foodborne HEV. Food Standards Agency, September 2022. http://dx.doi.org/10.46756/sci.fsa.wfa626.

Abstract:
Hepatitis E virus (HEV) is a member of the Hepeviridae family capable of infecting humans, producing a range of symptoms from mild disease to kidney failure. Epidemiological evidence suggests that hepatitis E genotype III and IV cases may be associated with the consumption of undercooked pork meat, offal, and processed products such as sausages [1]. A study carried out by the Animal Health and Veterinary Laboratories Agency (AHVLA) found hepatitis E virus contamination in the UK pork production chain and that 10% of a small sample of retail pork sausages were contaminated with the virus [2]. Furthermore, studies have confirmed the presence of HEV in the food chain and the foodborne transmission of hepatitis E virus to humans [reviewed in 5]. Likewise, Scottish shellfish at retail [6] have also been found positive for HEV viral nucleic acid, and some preliminary studies indicate that the virus is also detectable in soft fruits (L Scobie; unpublished data). There are currently misunderstandings about what these data represent, and these studies have raised further questions concerning the infectivity of the virus, the processing of these foods by industry, and the cooking and/or preparation by caterers and consumers. There are significant gaps in the knowledge around viral infectivity, in particular the nature of the preparation of food matrices to isolate the virus, and also with respect to a consistent and suitable assay for confirming infectivity [1,3]. Currently, there is no suitable test for infectivity, and, in addition, we have no knowledge of whether specific food items would be detrimental to cells when assessing the presence of infectious virus in vitro. The FSA finalised a comprehensive critical review on the approaches to assess the infectivity of the HEV virus, which is published [3], recommending that a cell culture based method should be developed for use with food.
In order to proceed with the development of an infectivity culture method, there is a requirement to assess whether food matrices are detrimental to cell culture cell survival. Other issues that may have affected the ability to develop a consistent method are the length of time the virally contaminated sample is exposed to the cells and the concentration of the virus present. In most cases, the sample is only exposed to the cells for around 1 hour, and it has been shown that if the concentration is less than 1×10³ copies then infection is not established [3,5,10,11].