Journal articles on the topic 'Data analysis and interpretation techniques'

Consult the top 50 journal articles for your research on the topic 'Data analysis and interpretation techniques.'


1

Wu, Yanping, and Md Habibur Rahman. "Analysis of Structured Data in Biomedicine Using Soft Computing Techniques and Computational Analysis." Computational Intelligence and Neuroscience 2022 (October 10, 2022): 1–11. http://dx.doi.org/10.1155/2022/4711244.

Abstract:
In the field of biomedicine, enormous amounts of structured and unstructured data are generated every day. Soft computing techniques play a major role in interpreting and classifying these data to support appropriate policy decisions. Medical science and biomedicine need efficient soft computing-based methods that can process all kinds of data, such as structured, categorical, and unstructured data, to generate meaningful outcomes for decision-making. Soft computing methods allow clustering of similar data, classification of data, prediction from big-data analysis, and decision-making on the basis of data analysis. The paper proposes a novel method in which clustering and classification mechanisms are used to process biomedicine data for productive outcomes. Fuzzy logic and C-means clustering are devised as a collaborative approach to analyze the biomedicine data while reducing the time and space complexity of the clustering solutions. This research work considers categorical, numeric, and structured data for the interpretation of data to support further decisions. Timely decisions are especially important in biomedicine because human health and lives are involved, and delays in decision-making may threaten human lives. The COVID-19 situation was a recent example where timely diagnosis and interpretation played significant roles in saving lives. Therefore, this research work uses soft computing techniques for the successful clustering of similar medical data and for quicker interpretation of data to support decision-making processes in medical fields.
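As a rough illustration of the clustering machinery this abstract invokes, here is a minimal NumPy sketch of fuzzy C-means on synthetic two-feature "patient" data; the cluster count, fuzzifier m, and the data itself are illustrative assumptions, not the authors' configuration.

```python
import numpy as np

def fuzzy_cmeans(X, c=2, m=2.0, max_iter=100, tol=1e-5, seed=0):
    """Minimal fuzzy C-means; X has shape (n_samples, n_features)."""
    rng = np.random.default_rng(seed)
    U = rng.random((X.shape[0], c))
    U /= U.sum(axis=1, keepdims=True)          # memberships sum to 1 per sample
    p = 2.0 / (m - 1.0)
    for _ in range(max_iter):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        # Standard membership update: u_ij = 1 / (d_ij^p * sum_k d_ik^(-p))
        U_new = 1.0 / (d ** p * (d ** -p).sum(axis=1, keepdims=True))
        if np.abs(U_new - U).max() < tol:      # converged
            U = U_new
            break
        U = U_new
    return centers, U

# Synthetic "biomedical" records: two numeric features, two latent groups.
rng = np.random.default_rng(42)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(4, 1, (50, 2))])
centers, U = fuzzy_cmeans(X, c=2)
labels = U.argmax(axis=1)                      # hard labels from fuzzy memberships
```

The fuzzy membership matrix U is what distinguishes this from hard k-means: each record belongs to every cluster with a degree in [0, 1], which is useful when medical records do not fall cleanly into one category.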
2

Fisher, M., and E. Hunter. "Digital imaging techniques in otolith data capture, analysis and interpretation." Marine Ecology Progress Series 598 (June 28, 2018): 213–31. http://dx.doi.org/10.3354/meps12531.

3

Dobson, Scott, Jennifer Dabelstein, Anita Bagley, and Jon Davids. "Interpretation of kinematic data: Visual vs. computer-based analysis techniques." Gait & Posture 7, no. 2 (March 1998): 182–83. http://dx.doi.org/10.1016/s0966-6362(98)90277-6.

4

Bornik, Alexander, and Wolfgang Neubauer. "3D Visualization Techniques for Analysis and Archaeological Interpretation of GPR Data." Remote Sensing 14, no. 7 (April 1, 2022): 1709. http://dx.doi.org/10.3390/rs14071709.

Abstract:
The non-invasive detection and digital documentation of buried archaeological heritage by means of geophysical prospection is increasingly gaining importance in modern field archaeology and archaeological heritage management. It frequently provides the detailed information required for heritage protection or targeted further archaeological research. High-resolution magnetometry and ground-penetrating radar (GPR) became invaluable tools for the efficient and comprehensive non-invasive exploration of complete archaeological sites and archaeological landscapes. The analysis and detailed archaeological interpretation of the resulting large 2D and 3D datasets, and related data from aerial archaeology or airborne remote sensing, etc., is a time-consuming and complex process, which requires the integration of all data at hand, respective three-dimensional imagination, and a broad understanding of the archaeological problem; therefore, informative 3D visualizations supporting the exploration of complex 3D datasets and supporting the interpretative process are in great demand. This paper presents a novel integrated 3D GPR interpretation approach, centered around the flexible 3D visualization of heterogeneous data, which supports conjoint visualization of scenes composed of GPR volumes, 2D prospection imagery, and 3D interpretative models. We found that the flexible visual combination of the original 3D GPR datasets and images derived from the data applying post-processing techniques inspired by medical image analysis and seismic data processing contribute to the perceptibility of archaeologically relevant features and their respective context within a stratified volume. Moreover, such visualizations support the interpreting archaeologists in their development of a deeper understanding of the complex datasets as a starting point for and throughout the implemented interactive interpretative process.
5

Thomas, Sabu K., and K. T. Thomachen. "Biodiversity Studies and Multicollinearity in Multivariate Data Analysis." Mapana - Journal of Sciences 6, no. 1 (May 31, 2007): 27–35. http://dx.doi.org/10.12723/mjs.10.2.

Abstract:
Multicollinearity of explanatory variables often threatens the statistical interpretation of ecological data analysis in biodiversity studies. Using litter ants as an example, the impact of multicollinearity on ecological multiple regression and the complications arising from collinearity are explained. We list the various statistical techniques available for enhancing the reliability and interpretation of ecological multiple regressions in the presence of multicollinearity.
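A common concrete diagnostic for the multicollinearity problem described here is the variance inflation factor (VIF); below is a short sketch using statsmodels on made-up predictors (the variable names and the collinear relationship are invented for illustration).

```python
import numpy as np
import pandas as pd
from statsmodels.stats.outliers_influence import variance_inflation_factor

# Hypothetical environmental predictors for ant species richness.
rng = np.random.default_rng(1)
litter_depth = rng.normal(5, 1, 200)
moisture = 0.8 * litter_depth + rng.normal(0, 0.5, 200)  # collinear by design
canopy = rng.normal(70, 10, 200)
X = pd.DataFrame({"litter_depth": litter_depth,
                  "moisture": moisture,
                  "canopy": canopy})
X = X.assign(const=1.0)  # intercept column, as VIF expects a design matrix

vif = {col: variance_inflation_factor(X.values, i)
       for i, col in enumerate(X.columns) if col != "const"}
print(vif)  # VIF > 10 is a common rule of thumb for problematic collinearity
```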
6

Razminia, K., A. Hashemi, A. Razminia, and D. Baleanu. "Explicit Deconvolution of Well Test Data Dominated by Wellbore Storage." Abstract and Applied Analysis 2014 (2014): 1–12. http://dx.doi.org/10.1155/2014/912395.

Abstract:
This paper addresses some methods for the interpretation of oil and gas well test data distorted by wellbore storage effects. Using these techniques, we can deconvolve pressure and rate data from drawdown and buildup tests dominated by wellbore storage. Some of these methods have the advantage of deconvolving the pressure data without rate measurement. The two important methods applied in this study are an explicit deconvolution method and a modification of the material balance deconvolution method. In cases with no rate measurements, we use a blind deconvolution method to restore the pressure response free of wellbore storage effects. Our techniques detect the afterflow/unloading rate function with explicit deconvolution of the observed pressure data. The presented techniques can unveil the early-time behavior of a reservoir system masked by wellbore storage effects and thus provide powerful tools to improve pressure transient test interpretation. Each method has been validated using both synthetic data and field cases, and each should be considered valid for practical applications.
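The deconvolution idea, recovering a rate or response function from convolved pressure data, can be sketched generically as regularized least squares under a linear convolution model; this is an assumption-laden toy, not the paper's explicit or material-balance algorithms, and the response and rate functions below are invented.

```python
import numpy as np
from scipy.linalg import toeplitz

# Convolution model: observed pressure p = G @ q, where G is built from the
# toy constant-rate response g and q is the (unknown) sandface rate history.
t = np.linspace(0.01, 10, 200)
g = np.log(t + 1.0)                        # toy unit-rate pressure response
G = toeplitz(g, np.zeros_like(g))          # lower-triangular convolution matrix
q_true = 1.0 - np.exp(-t)                  # toy afterflow rate building to 1
p = G @ q_true + 0.01 * np.random.randn(t.size)  # noisy "measured" pressure

# Tikhonov-regularized deconvolution: minimize ||G q - p||^2 + lam ||q||^2
lam = 1e-2
q_est = np.linalg.solve(G.T @ G + lam * np.eye(t.size), G.T @ p)
```

The regularization term is what keeps the inversion stable against noise, which is the central practical difficulty in all deconvolution of well test data.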
7

Yamada, Ryo, Daigo Okada, Juan Wang, Tapati Basak, and Satoshi Koyama. "Interpretation of omics data analyses." Journal of Human Genetics 66, no. 1 (May 8, 2020): 93–102. http://dx.doi.org/10.1038/s10038-020-0763-5.

Abstract:
Omics studies attempt to extract meaningful messages from large-scale and high-dimensional data sets by treating the data sets as a whole. The concept of treating data sets as a whole is important in every step of the data-handling procedures: the pre-processing step of data records, the step of statistical analyses and machine learning, translation of the outputs into human natural perceptions, and acceptance of the messages with uncertainty. In the pre-processing step, methods to control data quality and batch effects are discussed. For the main analyses, the approaches are divided into two types and their basic concepts are discussed. The first type is the evaluation of many items individually, followed by interpretation of individual items in the context of multiple testing and combination. The second type is the extraction of fewer important aspects from the whole data records. The outputs of the main analyses are translated into natural languages with techniques such as annotation and ontology. The other technique for making the outputs perceptible is visualization. At the end of this review, one of the most important issues in the interpretation of omics data analyses is discussed. Omics studies have a large amount of information in their data sets, and every approach reveals only a very restricted aspect of the whole data sets. The understandable messages from these studies carry unavoidable uncertainty.
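For the "multiple testing" side of the first analysis type, a standard concrete step is false-discovery-rate correction; the sketch below uses Benjamini-Hochberg via statsmodels on a simulated screen (the feature counts and effect sizes are invented).

```python
import numpy as np
from scipy.stats import norm
from statsmodels.stats.multitest import multipletests

# Toy omics screen: 10,000 features tested individually; 200 carry real signal.
rng = np.random.default_rng(0)
z = rng.normal(0.0, 1.0, 10_000)
z[:200] += 3.0                                # shifted features carry signal
pvals = 2 * norm.sf(np.abs(z))                # two-sided p-value per feature

reject, p_adj, _, _ = multipletests(pvals, alpha=0.05, method="fdr_bh")
print(int(reject.sum()), "features pass Benjamini-Hochberg FDR at the 5% level")
```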
8

Pavlopoulos, Sotiris, Trias Thireou, George Kontaxakis, and Andres Santos. "Analysis and interpretation of dynamic FDG PET oncological studies using data reduction techniques." BioMedical Engineering OnLine 6, no. 1 (2007): 36. http://dx.doi.org/10.1186/1475-925x-6-36.

9

Kendrick, Sarah K., Qi Zheng, Nichola C. Garbett, and Guy N. Brock. "Application and interpretation of functional data analysis techniques to differential scanning calorimetry data from lupus patients." PLOS ONE 12, no. 11 (November 9, 2017): e0186232. http://dx.doi.org/10.1371/journal.pone.0186232.

10

Renqi, Jiang, John P. Castagna, and Wu Jian. "Applications of high-resolution seismic frequency and phase attribute analysis techniques." Earth sciences and subsoil use 45, no. 4 (January 8, 2023): 324–44. http://dx.doi.org/10.21285/2686-9993-2022-45-4-324-344.

Abstract:
Seismic prospecting for oil and gas exploration and development is limited by seismic data resolution. Improving the accuracy of quantitative interpretation of seismic data in thin layers, thereby identifying effective reservoirs and delineating favorable areas, can be a key factor for successful exploration and development. Historically, the limit of seismic resolution is usually assumed to be about 1/4 wavelength of the dominant frequency of the data in the formation of interest. Constrained seismic reflectivity inversion can resolve thinner layers than this assumed limit, and a series of high-resolution quantitative interpretation methods and techniques have been developed as a result. Case studies in carbonate, clastic, and unconventional reservoirs indicate that quantitative interpretation techniques such as high-resolution seismic frequency and phase attribute analysis can resolve, and/or allow quantitative estimation of, rock and fluid properties in such seismically thin layers. Band recovery using high-resolution seismic processing technology can greatly improve the ability to recognize geological details such as thin layers, faults, and karst caves. Multiscale fault detection technology can effectively detect small-scale faults in addition to more readily recognized large-scale faults. Building on traditional seismic amplitude information, high-resolution spectral decomposition and phase decomposition technology expands seismic attribute analysis to the frequency and phase dimensions, boosting the interpretable geological information content of the seismic data, including subsurface geological characteristics and hydrocarbon potential, and thereby improving the reliability of seismic interpretation. These high-resolution quantitative interpretation techniques make the identification of effective reservoirs more efficient and accurate.
11

Alfarraj, Motaz, Yazeed Alaudah, Zhiling Long, and Ghassan AlRegib. "Multiresolution analysis and learning for computational seismic interpretation." Leading Edge 37, no. 6 (June 2018): 443–50. http://dx.doi.org/10.1190/tle37060443.1.

Abstract:
We explore the use of multiresolution analysis techniques as texture attributes for seismic image characterization, especially in representing subsurface structures in large migrated seismic data. Namely, we explore the Gaussian pyramid, the discrete wavelet transform, Gabor filters, and the curvelet transform. These techniques are examined in a seismic structure labeling case study on the Netherlands offshore F3 block. In seismic structure labeling, a seismic volume is automatically segmented and classified according to the underlying subsurface structure using texture attributes. Our results show that multiresolution attributes improve the labeling performance compared to using seismic amplitude alone. Moreover, directional multiresolution attributes, such as the curvelet transform, are more effective than the nondirectional attributes in distinguishing different subsurface structures in large seismic data sets and can greatly help the interpretation process.
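As a flavor of how multiresolution texture attributes can be computed, here is a sketch that turns 2D wavelet subband energies of an image patch into a feature vector using PyWavelets; the wavelet, decomposition level, and random patch are illustrative choices, not those of the study.

```python
import numpy as np
import pywt

# Toy "seismic patch": a 64x64 window of amplitude values.
rng = np.random.default_rng(0)
patch = rng.normal(size=(64, 64))

# Two-level 2D discrete wavelet transform; subband energies serve as
# multiresolution texture attributes for the patch.
coeffs = pywt.wavedec2(patch, wavelet="db4", level=2)
attributes = [np.mean(coeffs[0] ** 2)]            # approximation energy
for (cH, cV, cD) in coeffs[1:]:                   # detail subbands per level
    attributes += [np.mean(cH ** 2), np.mean(cV ** 2), np.mean(cD ** 2)]
print(attributes)  # feature vector for a classifier labeling subsurface structure
```

The horizontal/vertical/diagonal detail energies give a crude directional signature; the curvelet attributes the authors favor refine exactly this directionality.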
12

Paton, Gaynor S., and Jonathan Henderson. "Visualization, interpretation, and cognitive cybernetics." Interpretation 3, no. 3 (August 1, 2015): SX41—SX48. http://dx.doi.org/10.1190/int-2014-0283.1.

Abstract:
Interpretation of 3D seismic data involves the analysis and integration of many forms and derivatives of the original reflectivity data. This can lead to the generation of an overwhelming amount of data that can be difficult to use effectively when relying on conventional interpretation techniques. Our natural cognitive processes have evolved so that we can absorb and understand large amounts of complex data extremely quickly and effectively. However, these cognitive processes are heavily influenced by context and color perception. Seismic interpretation can benefit greatly through better exploiting the positive aspects of visual cognition and through techniques designed to minimize the pitfalls inherent in the cognitive process. The interpretation of data also requires the ability to combine data analysis with knowledge and expertise that is held by the interpreter. It is this combination of visual perception techniques to see the information, combined with interpreter guidance to understand what is seen, that makes interpretation of seismic data effective. Geological Expression workflows that are data driven and interpreter guided enable us to see and effectively interpret the geology that is present in the seismic data. In effect this gives us a Cognitive Interpretation of the data.
13

Egozcue, Juan José, Vera Pawlowsky-Glahn, and Gregory B. Gloor. "Linear Association in Compositional Data Analysis." Austrian Journal of Statistics 47, no. 1 (January 30, 2018): 3–31. http://dx.doi.org/10.17713/ajs.v47i1.689.

Abstract:
With compositional data, ordinary covariation indexes, designed for real random variables, fail to describe dependence. There is a need for compositional alternatives to covariance and correlation. Based on the Euclidean structure of the simplex, called Aitchison geometry, compositional association is identified with a linear restriction of the sample space when a log-contrast is constant. In order to simplify interpretation, a sparse and simple version of compositional association is defined in terms of balances which are constant across the sample. It is called b-association. This kind of association of compositional variables is extended to association between groups of compositional variables. In practice, exact b-association seldom occurs, and measures of the degree of b-association are reviewed based on those previously proposed. Also, some techniques for testing b-association are studied. These techniques are applied to available oral microbiome data to illustrate both their advantages and difficulties. Both testing and measurement of b-association appear to be quite sensitive to heterogeneities in the studied populations and to outliers.
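The log-contrast and balance machinery has a compact numerical form; below is a minimal NumPy sketch of the centered log-ratio transform and a balance between two groups of parts (the four-part Dirichlet data and the chosen groups are invented for illustration).

```python
import numpy as np

def clr(x):
    """Centered log-ratio transform of compositions (strictly positive parts)."""
    logx = np.log(x)
    return logx - logx.mean(axis=-1, keepdims=True)

def balance(x, num_idx, den_idx):
    """Isometric log-ratio balance between two groups of parts."""
    r, s = len(num_idx), len(den_idx)
    gm_num = np.exp(np.log(x[..., num_idx]).mean(axis=-1))  # geometric means
    gm_den = np.exp(np.log(x[..., den_idx]).mean(axis=-1))
    return np.sqrt(r * s / (r + s)) * np.log(gm_num / gm_den)

# Toy 4-part compositions (e.g., relative abundances of four taxa).
x = np.random.dirichlet([2, 3, 4, 5], size=100)
b = balance(x, num_idx=[0, 1], den_idx=[2, 3])
print(b.std())  # a balance that is nearly constant across samples suggests b-association
```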
14

Manataki, Merope, Antonis Vafidis, and Apostolos Sarris. "GPR Data Interpretation Approaches in Archaeological Prospection." Applied Sciences 11, no. 16 (August 17, 2021): 7531. http://dx.doi.org/10.3390/app11167531.

Abstract:
This article focuses on the possible drawbacks and pitfalls in the GPR data interpretation process commonly followed by most GPR practitioners in archaeological prospection. Standard processing techniques aim to remove noise and enhance reflections from the subsurface. Next, one calculates the instantaneous envelope and produces C-scans, which are 2D amplitude maps showing high-reflectivity surfaces. These amplitude maps are mainly used for data interpretation and provide a good insight into the subsurface but cannot fully describe it. The main limitations are discussed, and studies aiming to overcome them are reviewed. These studies involve integrated interpretation approaches using both B-scans and C-scans, attribute analysis, fusion approaches, and recent attempts to automatically interpret C-scans using Deep Learning (DL) algorithms. To contribute to the automatic interpretation of GPR data using DL, an application of Convolutional Neural Networks (CNNs) to classify GPR data is also presented and discussed.
15

Jacobson, Larry, Venkataraman Jambunathan, Zhipeng Liu, and Weijun Guo. "Technical advances in pulsed-neutron interpretation for cased-hole logging: Physics, interpretation, and log examples." Interpretation 3, no. 1 (February 1, 2015): SA159—SA166. http://dx.doi.org/10.1190/int-2014-0174.1.

Abstract:
Recently developed multidetector pulsed-neutron tools (MDPNTs, a term describing a pulsed-neutron tool with at least three detectors) can provide three-phase formation fluid analysis in cased wells. These tools are 43 mm (1 11/16 in.) or 54 mm (2 1/8 in.) in diameter and can be logged in or below most tubing sizes. We reviewed traditional oil- and water-saturation techniques as well as indirect gas-saturation techniques, and we compared them with recently developed direct gas-saturation techniques now available from MDPNTs. A log example illustrates the data verification and interpretation process, which was divided into two parts: first, we verified the log data quality, and second, we applied a newly developed gas model to the log data, providing gas saturation without any reliance on the previously determined oil and water saturation.
16

Heni, Heni Subagiharti, Diah Syahfitri Handayani, and Tuti Herawati. "Analysis of Language Styles in Fiersa Besari's Songs Based on Hermeneutic Study." Journal of Scientific Research, Education, and Technology (JSRET) 1, no. 2 (December 6, 2022): 221–27. http://dx.doi.org/10.58526/jsret.v1i2.31.

Abstract:
The purpose of this study was to determine the style of language and describe the meaning of songs by Fiersa Besari through a hermeneutical approach based on the theory of Friedrich Ernst Daniel Schleiermacher, with the theories of grammatical interpretation and psychological interpretation. The research method is descriptive qualitative. The data collection methods and techniques used in this study were library and field methods, using recording, observation, and note-taking techniques. The data analysis technique focuses on the workings of hermeneutics in discussing the interpretation of meaning. From the results of this study, the authors conclude that the lyrics of Fiersa Besari's songs use much figurative language, such as metonymy, hyperbole, pleonasm, personification, metaphor, sarcasm, eroticism, assonance, polysyndeton, epithet, satire, cynicism, and irony. The meanings contained in the lyrics of the songs are: (1) a long-distance love story, (2) a devotional song to improve attitudes towards the Indonesian homeland, (3) a story of unrequited love, and (4) satire directed at the people and the government of Indonesia.
17

Fischer, Klaus C., Ulrich Möller, and Roland Marschall. "Advanced Seismic Data Interpretation for Carbonate Targets Based on Optimized Processing Techniques." GeoArabia 1, no. 2 (April 1, 1996): 285–96. http://dx.doi.org/10.2113/geoarabia0102285.

Abstract:
Seismic data from the shelf area of the Cretaceous Shu’aiba Formation in Abu Dhabi is used to investigate stratigraphic and structural seismic anomalies. The data consists of a 2-D grid of seismic lines acquired in the late 1980s and 1993. The data was reprocessed in several phases. The first phase consists of standard time-domain processing up to the final Dip Move Out stack and migration. In the second phase, a macro-velocity model for post-stack depth migration is generated and tested by the interpreters. The third phase is the interpretation of the pre-stack depth migration stack. Due to the structural irregularity of the Shu’aiba Formation, the pre-stack depth migrated data is considered the most reliable for Amplitude Versus Offset analysis. Further steps are L-1 deconvolution followed by Born Inversion. These last steps are required before the lithology can be modeled with high resolution. The final lithological model is verified by applying forward modeling. The lithological model forms the basis for reservoir and geostatistical evaluations which account for heterogeneities.
18

Marshall, Anne, and Garry Marshall. "A Decision-Tree Approach to the Interpretation of Archaeological Data." Antiquaries Journal 74 (March 1994): 1–11. http://dx.doi.org/10.1017/s0003581500024379.

Abstract:
A number of techniques capable of generating descriptions of data have been developed in the area of Artificial Intelligence. Their suitability for use with a data set compiled to record the details of Anglo-Saxon structures in Britain is considered. An appropriate method which describes the data by means of decision trees is chosen and, after some adaptation, is used to generate descriptions of this data in the form of decision trees. The resulting descriptions compare favourably with interpretations obtained by archaeological analysis.
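The decision-tree idea translates directly into modern tooling; here is a sketch with scikit-learn on a made-up table of structure attributes (the column names and values are hypothetical, not the paper's data set), printing the induced rules in human-readable form.

```python
import pandas as pd
from sklearn.tree import DecisionTreeClassifier, export_text

# Hypothetical records of Anglo-Saxon structures with simple attributes.
df = pd.DataFrame({
    "length_m": [4, 12, 5, 10, 3, 11, 6, 9],
    "hearth":   [1, 0, 1, 0, 1, 0, 1, 0],
    "sunken":   [1, 0, 1, 0, 1, 0, 0, 0],
    "function": ["dwelling", "hall", "dwelling", "hall",
                 "dwelling", "hall", "dwelling", "hall"],
})
X, y = df[["length_m", "hearth", "sunken"]], df["function"]

tree = DecisionTreeClassifier(max_depth=2).fit(X, y)
print(export_text(tree, feature_names=list(X.columns)))  # readable decision rules
```

The appeal for archaeology is exactly what the abstract notes: the output is a legible description of the data that an expert can compare against their own interpretation.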
19

Abu-Siada, Ahmed. "Improved Consistent Interpretation Approach of Fault Type within Power Transformers Using Dissolved Gas Analysis and Gene Expression Programming." Energies 12, no. 4 (February 22, 2019): 730. http://dx.doi.org/10.3390/en12040730.

Abstract:
Dissolved gas analysis (DGA) of transformer oil is considered the most reliable condition monitoring technique currently used to detect incipient faults within power transformers. While measurement accuracy has become relatively high since the development of various off-line and on-line measuring sensors, interpretation techniques for DGA results still depend on the level of personnel expertise more than on analytical formulation. Therefore, various interpretation techniques may lead to different conclusions for the same oil sample. Moreover, ratio-based interpretation techniques may fail to interpret DGA data in the case of multiple fault conditions and when the oil sample comprises an insignificant amount of the gases used in the specified ratios. This paper introduces an improved approach to overcome the limitations of conventional DGA interpretation techniques and to automate and standardize the DGA interpretation process. The approach incorporates all conventional DGA interpretation techniques in one expert system to identify the fault type in a more consistent and reliable way. Gene Expression Programming is employed to establish this expert system. Results show that the proposed approach provides more reliable results than the individual conventional methods currently adopted by industry practice worldwide.
20

Fomina, Anna V., Artem M. Borbat, Evgeny A. Karpulevich, and Anton Yu Naumov. "Neural network interpretation techniques for analysis of histological images of breast abnormalities." Gynecology 24, no. 6 (January 20, 2023): 529–37. http://dx.doi.org/10.26442/20795696.2022.6.201990.

Abstract:
Background. Neural networks are actively used in digital pathology to analyze histological images and support medical decision-making. A common approach is to solve the classification problem, where class labels are the only model responses. However, one should understand which areas of the image have the most significant impact on the model's response. Machine learning interpretation techniques help solve this problem. Aim. To study the consistency of different methods of neural network interpretation when classifying histological images of the breast and to obtain an expert assessment of the results of the evaluated methods. Materials and methods. We performed a preliminary analysis and pre-processing of the existing data set used to train pre-selected neural network models. Existing methods of visualizing the areas of attention of trained models were applied to easy-to-understand data, followed by verification of their correct use. The same neural network models were trained on histological data, and the selected interpretation methods were applied to the histological images, followed by an evaluation of the consistency of the results and an expert assessment. Results. Several methods of interpreting machine learning are studied using two different neural network architectures and a set of histological images of breast abnormalities. Training the ResNet18 and ViT-B-16 models on the set of histological images gave, on the test sample, an accuracy metric of 0.89 for both models and ROC AUC metrics of 0.99 and 0.96, respectively. The results were also evaluated by an expert using the Label Studio tool. For each pair of images, the expert was asked to select the most appropriate answer ("Yes" or "No") to the question: "The highlighted areas generally correspond to the Malignant class." The "Yes" response rate for the ResNet_Malignant category was 0.56; for ViT_Malignant, it was 1.0. Conclusion. Interpretability experiments were conducted with two different architectures: the ResNet18 convolutional network and the ViT-B-16 attention-based network. The results of the trained models were visualized using the GradCAM and Attention Rollout methods, respectively. Experiments were first conducted on a simple-to-interpret dataset to ensure the methods were used correctly; the methods were then applied to the set of histological images. On easy-to-understand images (cat images), the convolutional network is more consistent with human perception; by contrast, on histological images of breast cancer, ViT-B-16 provided results much more similar to the expert's perception.
21

Pelánek, Radek. "Analyzing and Visualizing Learning Data: A System Designer's Perspective." Journal of Learning Analytics 8, no. 2 (September 3, 2021): 93–104. http://dx.doi.org/10.18608/jla.2021.7345.

Abstract:
In this work, we consider learning analytics for primary and secondary schools from the perspective of the designer of a learning system. We provide an overview of practically useful analytics techniques with descriptions of their applications and specific illustrations. We highlight data biases and caveats that complicate the analysis and its interpretation. Although we intentionally focus on techniques for internal use by designers, many of these techniques may inspire the development of dashboards for teachers or students. We also identify the consequences and challenges for research.
22

Silic, J. "Interpretation of TDEM data using first and second spatial derivatives and time decay analysis." Exploration Geophysics 20, no. 2 (1989): 57. http://dx.doi.org/10.1071/eg989057.

Abstract:
Current gathering in fixed loop electromagnetic data often dominates responses from large high-grade ore bodies as well as responses from less desirable features such as fault zones, weathering troughs and regional conductors. Through decay curve analysis, current gathering can now be unambiguously recognised. Many widely used EM interpretation techniques are not applicable to current gathering (channelling) responses. An effective method of deriving the location and shape of the causative source is to study the second spatial derivative, as is shown in several examples.
23

Allan, James D., Jose L. Jimenez, Paul I. Williams, M. Rami Alfarra, Keith N. Bower, John T. Jayne, Hugh Coe, and Douglas R. Worsnop. "Quantitative sampling using an Aerodyne aerosol mass spectrometer 1. Techniques of data interpretation and error analysis." Journal of Geophysical Research: Atmospheres 108, no. D3 (February 4, 2003): n/a. http://dx.doi.org/10.1029/2002jd002358.

24

Wang, Li, Colin Lewis-Beck, Elyse Fritschel, Erdem Baser, and Onur Baser. "Applied Comparison of Meta-analysis Techniques." Journal of Health Economics and Outcomes Research 1, no. 1 (February 28, 2013): 14–22. http://dx.doi.org/10.36469/9848.

Abstract:
Background: Meta-analysis is an approach that combines findings from similar studies. The aggregation of study-level data can provide precise estimates for outcomes of interest, allow for unique treatment comparisons, and explain the differences arising from conflicting study results. Proper meta-analysis includes five basic steps: identify relevant studies; extract summary data from each paper; compute study effect sizes; perform statistical analysis; and interpret and report the results. Objectives: This study aims to review meta-analysis methods and their assumptions, apply various meta-analysis techniques to empirical data, and compare the results from each method. Methods: Three different meta-analysis techniques were applied to a dataset examining the effects of the bacille Calmette-Guerin (BCG) vaccine on tuberculosis (TB): first, a fixed-effects model; then a random-effects model; and third, a meta-regression with study-level covariates added to the model. Overall and stratified results, by geographic latitude, were reported. Results: All three techniques showed a statistically significant effect of the vaccination. However, once covariates were added, efficacy diminished. Independent variables, such as the latitude of the location in which the study was performed, appeared to be partially driving the results. Conclusions: Meta-analysis is useful for drawing general conclusions from a variety of studies. However, proper study and model selection are important to ensure the correct interpretation of results. Basic meta-analysis models are fixed-effects, random-effects, and meta-regression.
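The fixed- and random-effects steps have simple closed forms; this NumPy sketch pools illustrative study effects with inverse-variance weights and a DerSimonian-Laird between-study variance (the numbers are invented for illustration, not the BCG data).

```python
import numpy as np

# Per-study effect sizes (e.g., log risk ratios) and their variances.
y = np.array([-0.89, -1.59, -1.35, -0.22, -0.47])
v = np.array([0.33, 0.44, 0.10, 0.02, 0.05])

# Fixed-effect model: inverse-variance weighted mean.
w = 1 / v
theta_fe = np.sum(w * y) / np.sum(w)

# DerSimonian-Laird between-study variance, then random-effects pooling.
k = len(y)
Q = np.sum(w * (y - theta_fe) ** 2)                      # heterogeneity statistic
tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
w_re = 1 / (v + tau2)
theta_re = np.sum(w_re * y) / np.sum(w_re)
print(theta_fe, theta_re, tau2)
```

When tau2 is near zero the two pooled estimates coincide; a large tau2 signals heterogeneity that meta-regression covariates (such as latitude here) may explain.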
25

Wesolowski, Marek, and Edyta Leyk. "Coupled and Simultaneous Thermal Analysis Techniques in the Study of Pharmaceuticals." Pharmaceutics 15, no. 6 (May 25, 2023): 1596. http://dx.doi.org/10.3390/pharmaceutics15061596.

Abstract:
Reliable interpretation of the changes occurring in the samples during their heating is ensured by using more than one measurement technique. This is related to the necessity of eliminating the uncertainty resulting from the interpretation of data obtained by two or more single techniques based on the study of several samples analyzed at different times. Accordingly, the purpose of this paper is to briefly characterize thermal analysis techniques coupled to non-thermal techniques, most often spectroscopic or chromatographic. The design of coupled thermogravimetry (TG) with Fourier transform infrared spectroscopy (FTIR), TG with mass spectrometry (MS) and TG with gas chromatography/mass spectrometry (GC/MS) systems and the principles of measurement are discussed. Using medicinal substances as examples, the key importance of coupled techniques in pharmaceutical technology is pointed out. They make it possible not only to know precisely the behavior of medicinal substances during heating and to identify volatile degradation products, but also to determine the mechanism of thermal decomposition. The data obtained make it possible to predict the behavior of medicinal substances during the manufacture of pharmaceutical preparations and determine their shelf life and storage conditions. Additionally, characterized are design solutions that support the interpretation of differential scanning calorimetry (DSC) curves based on observation of the samples during heating or based on simultaneous registration of FTIR spectra and X-ray diffractograms (XRD). This is important because DSC is an inherently non-specific technique. For this reason, individual phase transitions cannot be distinguished from each other based on DSC curves, and supporting techniques are required to interpret them correctly.
26

Strobbia, Claudio, and Giorgio Cassiani. "Refraction microtremors: Data analysis and diagnostics of key hypotheses." GEOPHYSICS 76, no. 3 (May 2011): MA11—MA20. http://dx.doi.org/10.1190/1.3560246.

Abstract:
Surface-wave methods are quite popular for site characterization in geotechnical earthquake engineering. Among these techniques, a particular role is taken by passive methods for their ability to yield information on the low-frequency range and consequently on large depths. One such passive method, the refraction microtremors (ReMi) technique, has been proposed as a simple alternative to 2D-array techniques to estimate surface-wave dispersion by using linear arrays of geophones. The technique owes its name to the use of widely available instruments also adopted for seismic refraction. The basic hypotheses underlying ReMi are that noise is distributed isotropically in azimuth or is aligned exactly with the array. These conditions often are not met, and in most cases they are not verified because such analysis requires an accurate approach to data processing that is rarely applied. We have developed an algorithm that verifies ReMi’s basic hypotheses by analyzing experimental data. In addition, we have proposed an algorithm to identify the lowest apparent velocity on the ReMi spectra, thus avoiding interpretation problems.
27

Parulekar, Prashant. "Nodal screw compressor failure analysis using data analytics—CSG application." APPEA Journal 55, no. 1 (2015): 59. http://dx.doi.org/10.1071/aj14005.

Abstract:
An engine-driven oil-injected screw compressor in CSG service failed catastrophically. Instrumentation provided on the package was ineffective in predicting or detecting the failure. As part of the Root Cause Analysis (RCA) process, a statistical analysis of the logged instrument data, as measured across a period of six months prior to the failure, was carried out. This paper uses data analytic methods to process instrument data, data visualisation techniques, advanced statistical analysis of the instrument data, and techniques to filter signal noise. The analysis recognised the multivariate behaviour and interrelationships between various operating parameters. The paper further provides insight into the interpretation of statistical measures and how to draw conclusions that explain the failure mechanism. The outcomes of the analysis presented in this paper then provided insights into establishing operating envelopes, proposed instrumentation upgrades to be provided in future and helped establish an operation and maintenance regime that should assist in preventing such failures in future.
28

Brandner, Paul A., James A. Venning, and Bryce W. Pearce. "Wavelet analysis techniques in cavitating flows." Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences 376, no. 2126 (July 9, 2018): 20170242. http://dx.doi.org/10.1098/rsta.2017.0242.

Abstract:
Cavitating and bubbly flows involve a host of physical phenomena and processes ranging from nucleation, surface and interfacial effects, mass transfer via diffusion and phase change to macroscopic flow physics involving bubble dynamics, turbulent flow interactions and two-phase compressible effects. The complex physics that result from these phenomena and their interactions make for flows that are difficult to investigate and analyse. From an experimental perspective, evolving sensing technology and data processing provide opportunities for gaining new insight and understanding of these complex flows, and the continuous wavelet transform (CWT) is a powerful tool to aid in their elucidation. Five case studies are presented involving many of these phenomena in which the CWT was key to data analysis and interpretation. A diverse set of experiments are presented involving a range of physical and temporal scales and experimental techniques. Bubble turbulent break-up is investigated using hydroacoustics, bubble dynamics and high-speed imaging; microbubbles are sized using light scattering and ultrasonic sensing, and large-scale coherent shedding driven by various mechanisms are analysed using simultaneous high-speed imaging and physical measurement techniques. The experimental set-up, aspect of cavitation being addressed, how the wavelets were applied, their advantages over other techniques and key findings are presented for each case study. This paper is part of the theme issue ‘Redundancy rules: the continuous wavelet transform comes of age’.
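As a minimal example of the CWT workflow behind these case studies, the sketch below computes a Morlet-wavelet scalogram of a toy two-component signal with PyWavelets; the sampling rate, scales, and the signal itself are illustrative assumptions.

```python
import numpy as np
import pywt

# Toy pressure signal: a transient 50 Hz burst on top of a 5 Hz oscillation.
fs = 1000
t = np.arange(0, 2, 1 / fs)
sig = np.sin(2 * np.pi * 5 * t)
sig[800:1000] += 0.5 * np.sin(2 * np.pi * 50 * t[800:1000])

# Continuous wavelet transform with a Morlet wavelet; rows index scales.
scales = np.arange(1, 128)
coefs, freqs = pywt.cwt(sig, scales, "morl", sampling_period=1 / fs)
power = np.abs(coefs) ** 2   # time-frequency energy map; the burst shows up
                             # localized in both time and frequency
```

This localization in both time and frequency is what makes the CWT suited to transient events such as bubble collapse, where Fourier analysis smears the timing.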
29

How, Jason Richard, and Simon de Lestang. "Acoustic tracking: issues affecting design, analysis and interpretation of data from movement studies." Marine and Freshwater Research 63, no. 4 (2012): 312. http://dx.doi.org/10.1071/mf11194.

Abstract:
Acoustic telemetry systems are an increasingly common way to examine the movement and behaviour of marine organisms. However, there has been little published on the methodological and analytical work associated with this technology. We tested transmitters of differing power outputs simultaneously in several trials, some lasting ~50 days, to examine the effects of power output and environmental factors (water movement, temperature, lunar cycle and time of day). There were considerable and volatile changes in detections throughout all trials. Increased water movement and temperature significantly reduced detection rates, whereas daytime and full-moon periods had significantly higher detection rates. All nine transmitters (from seven transmitter types tested) showed a sigmoidal trend between detection frequency and distance. Higher-powered transmitters had a prolonged detection distance with near-maximal detections, whereas lower-powered transmitters showed an almost immediate decline. Variation of detection frequency, transmitter type and the modelled relationship between distance and detection frequency were incorporated into a positioning trial which resulted in markedly improved position estimates over previous techniques.
30

Newell, Bruce D. "Image processing and analysis fundamentals for microscopy." Proceedings, annual meeting, Electron Microscopy Society of America 53 (August 13, 1995): 678–79. http://dx.doi.org/10.1017/s0424820100139767.

Abstract:
Advances in computers and related digital hardware, coupled with sophisticated software techniques, have resulted in microscopy migrating from its historical roots as a subjective, qualitative science towards a more robust position as a truly quantitative technique. Granted, we will probably never totally remove the microscopist from the process of image interpretation (at least those at this conference hope not), but we will certainly continue to progress from describing our image data in qualitative terms (e.g., many/few, large/small, equiaxed/elongated, ordered/random, rough/smooth) to quantitative measurements of number, size, shape, location, texture, and so on. Moving along the path toward quantitative image interpretation requires an understanding of image processing and analysis (IP/A) fundamentals to ensure that the data obtained are of the required accuracy and precision. A generalized model of the critical steps in the image processing and analysis chain is given in Figure 1. This tutorial will examine the fundamental issues in each step that impact the quality of the final result and provide a broad overview of techniques that may be applicable.
31

Conroy, J., T. Dooley, and J. F. McNamara. "Stochastic Modeling and Data Analysis of a Prototype Wave Energy Convertor." Journal of Energy Resources Technology 107, no. 1 (March 1, 1985): 87–92. http://dx.doi.org/10.1115/1.3231168.

Abstract:
This study describes the application of the techniques of stochastic modeling to random data obtained from the sea trials of the wave energy device Kaimei, and the interpretation of the results in terms of device efficiency. Two models of the power absorption system are developed and the relationships derived between them yield information on the influence of the vessel motions on the absorbed power. The theory of multiple frequency response functions is applied to the data, and it is shown that the motions of the Kaimei have a detrimental effect on its energy absorption capacity and the reasons for this are investigated.
32

Dusold, Laurence R., and John A. G. Roach. "Computer Assistance in Food Analysis." Journal of AOAC INTERNATIONAL 69, no. 5 (September 1, 1986): 754–56. http://dx.doi.org/10.1093/jaoac/69.5.754.

Abstract:
Abstract Laboratory computer links are a key part of acquisition, movement, and interpretation of certain types of data. Remote information retrieval from databases such as the Chemical Information System provides the analyst with structural and toxicologicai information via a laboratory terminal. Remote processing of laboratory data by large computers permits the application of pattern recognition techniques to the solution of complex multivariate problems such as the detection of food adulteration.
33

Eichelberger, Nathan W., Amanda N. Hughes, and Alan G. Nunns. "Combining multiple quantitative structural analysis techniques to create robust structural interpretations." Interpretation 3, no. 4 (November 1, 2015): SAA89—SAA104. http://dx.doi.org/10.1190/int-2015-0016.1.

Abstract:
Carefully selected 2D transects contain an abundance of structural information that can constrain 3D analyses of petroleum systems. Realizing the full value of the information in a 2D transect requires combining multiple, independent structural analysis techniques using fully interactive tools. Our approach uses quantitative structural geologic software that instantaneously displays structural computations and analyses, eliminating time-intensive manual measurements and calculations. By quickly testing multiple hypotheses, we converged on an optimal solution that is consistent with available data. We have combined area-depth-strain (ADS) analysis, structural restoration, and forward modeling of a structural interpretation of a fault-propagation fold in the Niger Delta. These methods confirmed the original interpretation and furthermore quantified displacement, strain, detachment depth, and kinematic history. ADS analysis validated the interpreted detachment depth and revealed significant layer-parallel strain (LPS) that varied systematically with stratigraphic depth. The stratigraphic distribution of the LPS was diagnostic of structural style and, in this example, discriminated against fixed-axis and constant-thickness fault-propagation folding. A quantitative forward model incorporating backlimb shear and trishear fault-propagation folding accurately reproduced folding and faulting in the pregrowth section and folding in the growth section. The model-predicted strain distributions were consistent with those from ADS analysis. The highest local strains on the back limb of the structure were spatially coincident with two backthrusts, which accommodated these strains. Animations of a more complete model including the backthrusts revealed that the backthrusts formed sequentially as rock passed through the main fault bend.
34

Zareba, Mateusz, Tomasz Danek, and Michal Stefaniuk. "Unsupervised Machine Learning Techniques for Improving Reservoir Interpretation Using Walkaway VSP and Sonic Log Data." Energies 16, no. 1 (January 2, 2023): 493. http://dx.doi.org/10.3390/en16010493.

Abstract:
In this paper, we present a detailed analysis of the possibility of using unsupervised machine learning techniques for reservoir interpretation based on the parameters obtained from geophysical measurements that are related to the elastic properties of rocks. Four different clustering algorithms were compared, including balanced iterative reducing and clustering using hierarchies, the Gaussian mixture model, k-means, and spectral clustering. Measurements with different vertical resolutions were used. The first set of input parameters was obtained from the walkaway VSP survey. The second one was acquired in the well using a full-wave sonic tool. Apart from the study of algorithms used for clustering, two data pre-processing paths were analyzed in the context of matching the vertical resolution of both methods. The validation of the final results was carried out using a lithological identification of the medium based on an analysis of the drill core. The measurements were performed in Silurian rocks (claystone, mudstone, marly claystone) lying under an overburdened Zechstein formation (salt and anhydrite). This formation is known for high attenuating seismic signal properties. The presented study shows results from the first and only multilevel walkaway VSP acquisition in Poland.
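The four-algorithm comparison maps directly onto scikit-learn; this sketch runs all four on toy stand-in "elastic attribute" data and scores each against assumed core-derived labels (the data, features, and cluster count are invented for illustration).

```python
import numpy as np
from sklearn.cluster import Birch, KMeans, SpectralClustering
from sklearn.mixture import GaussianMixture
from sklearn.preprocessing import StandardScaler
from sklearn.metrics import adjusted_rand_score

# Toy stand-in for elastic attributes per depth sample (e.g., Vp, Vs, density).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(m, 0.3, (60, 3)) for m in (0.0, 1.5, 3.0)])
true = np.repeat([0, 1, 2], 60)                      # lithology labels from core
Xs = StandardScaler().fit_transform(X)               # scale before clustering

models = {
    "birch":    Birch(n_clusters=3),
    "kmeans":   KMeans(n_clusters=3, n_init=10, random_state=0),
    "gmm":      GaussianMixture(n_components=3, random_state=0),
    "spectral": SpectralClustering(n_clusters=3, random_state=0),
}
for name, model in models.items():
    labels = model.fit_predict(Xs)
    print(name, adjusted_rand_score(true, labels))   # agreement with core lithology
```

Validating unsupervised labels against an external reference, as the paper does with drill core, is what turns cluster IDs into an interpretable lithological result.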
35

Efendy, Viky Candra. "INTERPRETATION APPROACHES THROUGH CELLO GAME TECHNIQUES IN THE "ARIOSO" SONG J.S. BACH." Repertoar Journal 1, no. 1 (July 29, 2020): 60–69. http://dx.doi.org/10.26740/rj.v1n1.p60-69.

Abstract:
The Arioso by J.S. Bach is a repertoire piece usually played at the advanced stages of cello learning. It also contains techniques that performers may often overlook when playing this work. This study aims to describe the interpretation approach through cello playing techniques in J.S. Bach's Arioso. The researchers used qualitative research methods. The research was conducted at the researcher's house and lodgings and in the department library of the Faculty of Languages and Arts. The data collection techniques used were observation and interviews. The data analysis techniques used include data reduction, data presentation, and verification. Based on the results of this study, it can be concluded that this piece involves various elements that must be attended to in order to produce the desired interpretation, including time signature, tempo, playing techniques, and ornaments.
36

G, Harshitha. "Performance Analysis of a Cricketer by Data Visualization." International Journal for Research in Applied Science and Engineering Technology 10, no. 1 (January 31, 2022): 1800–1807. http://dx.doi.org/10.22214/ijraset.2022.40176.

Abstract:
The Indian Premier League is a very competitive tournament where team selection is a tricky and tedious procedure. Analysis of sports data and prediction of each player's performance help in filtering the best players. A novel method employing the techniques of data analytics and data visualization is used in this research paper to extract individual player performance from huge statistics and datasets. An application is created to bridge the gap between team selection, coaches, and team management and to give a better interpretation of player steadiness, scoring, and further capabilities. In this paper, the pandas library is used as the data analysis and manipulation tool, Microsoft Azure is used for performance prediction, and HTML, CSS, and Flask are used for the front-end application. Additionally, various machine learning algorithms are applied to the same data to find the best fit. The proposed application can be beneficial for team management and decision-making. Keywords: Indian Premier League, Data analytics, Data Visualization, Prediction of player's Performance, Microsoft Azure
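Here is a sketch of the pandas aggregation step described above, computing per-batsman totals, strike rate, and average from hypothetical ball-by-ball data; the column names are illustrative, not those of the actual IPL dataset.

```python
import pandas as pd

# Hypothetical ball-by-ball records; column names are illustrative.
deliveries = pd.DataFrame({
    "batsman":      ["A", "A", "B", "A", "B", "B"],
    "batsman_runs": [4, 1, 6, 0, 2, 1],
    "is_dismissal": [0, 0, 0, 1, 0, 0],
})

stats = (deliveries.groupby("batsman")
         .agg(runs=("batsman_runs", "sum"),
              balls=("batsman_runs", "size"),
              outs=("is_dismissal", "sum")))
stats["strike_rate"] = 100 * stats["runs"] / stats["balls"]
stats["average"] = stats["runs"] / stats["outs"].replace(0, float("nan"))
print(stats.sort_values("strike_rate", ascending=False))
```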
37

Tóth, Tamás, and István Majzik. "Formal Verification of Real-Time Systems with Data Processing." Periodica Polytechnica Electrical Engineering and Computer Science 61, no. 2 (May 23, 2017): 166. http://dx.doi.org/10.3311/ppee.9766.

Abstract:
The behavior of practical safety critical systems often combines real-time behavior with structured data flow. To ensure correctness of such systems, both aspects have to be modeled and formally verified. Time related behavior can be efficiently modeled and analyzed in terms of timed automata. At the same time, program verification techniques like abstract interpretation and software model checking can efficiently handle data flow. In this paper, we describe a simple formalism that represents both aspects of such systems in a uniform and explicit way, thus enables the combination of formal analysis methods for real-time systems and software using standard techniques.
38

Kaliyev, D. T. "Use of neural networks for dynamic interpretation of seismic data." Kazakhstan journal for oil & gas industry 4, no. 2 (July 20, 2022): 27–34. http://dx.doi.org/10.54859/kjogi108576.

Abstract:
Neural networks and machine learning have long been used by almost everyone in their daily lives, perhaps not always consciously. When an algorithm of social networks identifies the faces of people in a photo or a voice assistant helps us search for some information, machine learning techniques underpin all of these activities. In recent years neural networks are finding more and more applications in the fields of oil and gas exploration and production. This article aims to illustrate an example of the application of neural networks in the analysis of seismic data for an active oilfield by predicting 3D cube of petrophysical properties to further detail the geological model and search for additional hydrocarbon accumulations. One of the key conditions for successful prediction of petrophysical properties using neural networks is a wide sample of well data for effective training of a non-linear operator. In our case, since it is a producing field, there were more than 100 wells available, which fully meets the requirements of the algorithm. Another important condition for application of this technique is having high-quality well ties for the used wells, this step of the workflow will also be described within the article. A distinct feature of neural network analysis, in contrast to classical inversion, is that it does not use a seismic wavelet. The neural network automatically determines such an operator that best describes the correlation between several seismic traces in the wellbore area and the log curve. This feature reduces the analysis time and produces express results if the above mentioned conditions are met, which makes the neural network technique an effective tool for dynamic analysis of seismic data.
39

DeAngelo, Michael V., Paul E. Murray, Bob A. Hardage, and Randy L. Remington. "Integrated 2D 4-C OBC velocity analysis of near-seafloor sediments, Green Canyon, Gulf of Mexico." GEOPHYSICS 73, no. 6 (November 2008): B109—B115. http://dx.doi.org/10.1190/1.2969943.

Abstract:
Using 2D four-component ocean-bottom-cable (2D 4-C OBC) seismic data processed in common-receiver gathers, we developed robust VP and VS interval velocities for the near-seafloor strata. A vital element of the study was to implement iterative interpretation techniques to correlate near-seafloor P-P and P-SV images. Initially, depth-equivalent P-P and P-SV layers were interpreted by visually matching similar events in both seismic modes. Complementary 1D ray-tracing analyses then determined interval values of subsea-floor VP and VS velocities across a series of earth layers extending from the seafloor to below the base of the hydrate stability zone (BHSZ) to further constrain these interpretations. Iterating interpretation of depth-equivalent horizons with velocity analyses allowed us to converge on physically reasonable velocity models. Simultaneous VP and VS velocity analysis provided additional model constraints in areas where the data quality of one reflection mode, as often happens in near-seafloor environments, would not provide adequate information to derive reliable velocity information.
40

Assaqaf, Tareq. "Techniques for Interpreting English Proverbs into Arabic." International Journal of Language and Literary Studies 1, no. 1 (June 30, 2019): 72–80. http://dx.doi.org/10.36892/ijlls.v1i1.27.

Abstract:
Interpretation plays a role of paramount significance in sending and receiving messages between people all over the world. It is of vital significance in international conferences, symposiums, and workshops, where meaning must be transferred and exchanged among the participants. In fact, proverbs are considered one of the most important elements used in speech, and they need to be exchanged among nations around the globe. However, interpreters usually encounter challenges in interpreting proverbs from English to Arabic or vice versa due to the cultural differences between Arabic and English as well as the lack of equivalents for some proverbs. This study investigates the techniques of interpreting English proverbs into Arabic. The data of this study are collected from two well-known dictionaries, namely, the Lamps of Experience: A Collection of English Proverbs by Ba'alabaki (1980) and A Dictionary of Proverbs: English – Arabic by Kilani and Ashour (1991). The analysis of the data reveals that many useful techniques can be used for the interpretation of proverbs. Such techniques are highlighted and graded based on their priorities. The study provides recommendations for interpreters, translators, and researchers which might improve the quality of interpretation and translation of proverbs from English to Arabic or vice versa.
41

Hoeke, J. O. O., B. Bonke, R. van Strik, E. S. Gelsema, and R. Verheij. "Evaluation of Techniques for the Presentation of Laboratory Data. II: Accuracy of Interpretation." Methods of Information in Medicine 36, no. 01 (January 1997): 17–19. http://dx.doi.org/10.1055/s-0038-1634686.

Abstract:
Four tabular and two graphical techniques for the presentation of laboratory test results were evaluated in a reaction time experiment with 25 volunteers. Artificial variables and values were used to represent sets of 12 laboratory tests to eliminate the possible effects of clinical experience. Analyses focused on four types of errors in interpretation. Color-coded tables and one of the color-coded graphs greatly (2.8 times or better) reduced the number of incorrectly classified test results, as compared to the reference presentation technique. This was mainly due to a reduction of the number of abnormal test results that were not noticed by the subjects when using these presentation techniques.
APA, Harvard, Vancouver, ISO, and other styles
42

Rahman, S. M. "Constraint of Complex Trace Analysis for Seismic Data Processing." Journal of Scientific Research 3, no. 1 (December 19, 2010): 65. http://dx.doi.org/10.3329/jsr.v3i1.2106.

Full text
Abstract:
Time-frequency representation is a powerful tool for studying seismic reflection patterns and can thus provide useful information on the stratification of the subsurface. Complex trace analysis, a widely used geophysical technique, treats the seismic trace as an analytic signal and is employed for time-frequency analysis in the interpretation of seismic data. The applicability of complex trace analysis in seismic data processing is studied in this paper with a few synthetic signals. The signals are analyzed with complex trace analysis to obtain time-frequency representations, which are compared with spectral energy distributions. It is shown that complex trace analysis is not suitable for accurate estimation of the time-frequency representation of signals containing simultaneous frequencies.

Keywords: Time frequency; Complex trace; Analytic signal; Spectral analysis.
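As a minimal, self-contained illustration of the limitation reported above (synthetic data, not code from the paper), the complex trace can be formed with SciPy's Hilbert transform; with two simultaneous tones, the single instantaneous frequency it yields fluctuates around the tones' average instead of resolving either:

```python
import numpy as np
from scipy.signal import hilbert

fs = 1000.0                       # sampling rate (Hz)
t = np.arange(0, 1.0, 1.0 / fs)   # 1 s synthetic "trace"

# Two simultaneous frequencies -- the problematic case for complex trace analysis
trace = np.sin(2 * np.pi * 50 * t) + np.sin(2 * np.pi * 120 * t)

analytic = hilbert(trace)                      # complex trace x + i*H{x}
phase = np.unwrap(np.angle(analytic))          # instantaneous phase
inst_freq = np.diff(phase) * fs / (2 * np.pi)  # instantaneous frequency (Hz)

# One number per sample cannot represent both tones at once: the estimate
# oscillates around ~85 Hz (the average) rather than showing 50 and 120 Hz.
print(inst_freq.mean(), inst_freq.std())
```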
APA, Harvard, Vancouver, ISO, and other styles
43

Kim, Donghyun, Gian Antariksa, Melia Putri Handayani, Sangbong Lee, and Jihwan Lee. "Explainable Anomaly Detection Framework for Maritime Main Engine Sensor Data." Sensors 21, no. 15 (July 31, 2021): 5200. http://dx.doi.org/10.3390/s21155200.

Full text
Abstract:
In this study, we propose a data-driven approach to condition monitoring of the marine engine. Although several unsupervised methods exist in the maritime industry, their common limitation is the interpretation of the anomaly: they do not explain why the model classifies specific data instances as anomalous. This study combines explainable AI techniques with an anomaly detection algorithm to overcome that limitation. As the explainable AI method, the study adopts SHapley Additive exPlanations (SHAP), which is theoretically solid and compatible with any kind of machine learning algorithm. SHAP makes it possible to measure the marginal contribution of each sensor variable to an anomaly, so one can easily specify which sensor is responsible for a specific anomaly. To illustrate the framework, an actual sensor stream collected from a cargo vessel over 10 months was analyzed. In this analysis, hierarchical clustering was performed on transformed SHAP values to interpret and group common anomaly patterns. We show that anomaly interpretation and segmentation using SHAP values provide a more useful interpretation than analysis without them.
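A minimal sketch of the framework's core idea, pairing an off-the-shelf anomaly detector with SHAP attributions; the IsolationForest detector, the random data and the sensor names here are illustrative assumptions, not the paper's actual model or dataset:

```python
import numpy as np
import pandas as pd
import shap
from sklearn.ensemble import IsolationForest

# Hypothetical engine-sensor frame; column names are illustrative only.
rng = np.random.default_rng(0)
X = pd.DataFrame(rng.normal(size=(1000, 3)),
                 columns=["exhaust_temp", "rpm", "fuel_pressure"])

model = IsolationForest(random_state=0).fit(X)
anomalous = X[model.predict(X) == -1]          # -1 flags anomalies

# SHAP values measure each sensor's marginal contribution to the score,
# so one can ask *which* sensor made an instance anomalous.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(anomalous)
top_sensor = anomalous.columns[np.abs(shap_values).argmax(axis=1)]
print(top_sensor[:5])  # most responsible sensor per flagged instance
```

The paper goes further, hierarchically clustering the transformed SHAP values to group recurring anomaly patterns.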
APA, Harvard, Vancouver, ISO, and other styles
44

Chang, Xuyang, Simon Hallais, Kostas Danas, and Stéphane Roux. "PeakForce AFM Analysis Enhanced with Model Reduction Techniques." Sensors 23, no. 10 (May 13, 2023): 4730. http://dx.doi.org/10.3390/s23104730.

Full text
Abstract:
PeakForce quantitative nanomechanical AFM mode (PF-QNM) is a popular AFM technique designed to measure multiple mechanical features (e.g., adhesion, apparent modulus) simultaneously at the exact same spatial coordinates with a robust scanning frequency. This paper proposes compressing the high-dimensional dataset obtained from the PeakForce AFM mode into a representation of much lower dimensionality by proper orthogonal decomposition (POD), followed by machine learning on the reduced data. A substantial reduction in user dependency and in the subjectivity of the extracted results is obtained. The underlying parameters, or “state variables”, governing the mechanical response can then be easily extracted from the reduced data using various machine learning techniques. Two samples are investigated to illustrate the proposed procedure: (i) a polystyrene film with low-density polyethylene nano-pods and (ii) a PDMS film with carbon–iron particles. The heterogeneity of the materials, as well as sharp variations in topography, makes segmentation challenging. Nonetheless, the underlying parameters describing the mechanical response naturally offer a compact representation, allowing for a more straightforward interpretation of the high-dimensional force–indentation data in terms of the nature (and proportion) of phases, interfaces, or topography. Finally, these techniques come at a low processing-time cost and do not require a prior mechanical model.
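A rough sketch of the proposed compression step, with a randomly generated stand-in for the force-curve stack and k-means as one possible learning stage; POD here amounts to an SVD of the mean-centred snapshot matrix:

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical stack of force-indentation curves: one row per pixel,
# one column per sample along the curve.
rng = np.random.default_rng(1)
curves = rng.normal(size=(10000, 256))

# POD == SVD of the mean-centred snapshot matrix.
mean_curve = curves.mean(axis=0)
U, s, Vt = np.linalg.svd(curves - mean_curve, full_matrices=False)

k = 5                        # keep a few dominant modes ("state variables")
coords = U[:, :k] * s[:k]    # low-dimensional coordinates of every curve

# Machine learning on the reduced data, e.g. segmenting material phases:
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(coords)
print(np.bincount(labels))   # pixels assigned to each segmented phase
```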
APA, Harvard, Vancouver, ISO, and other styles
45

Coffa, Jordy, Mark A. van de Wiel, Begoña Diosdado, Beatriz Carvalho, Jan Schouten, and Gerrit A. Meijer. "MLPAnalyzer: Data Analysis Tool for Reliable Automated Normalization of MLPA Fragment Data." Analytical Cellular Pathology 30, no. 4 (January 1, 2008): 323–35. http://dx.doi.org/10.1155/2008/605109.

Full text
Abstract:
Background: Multiplex Ligation-dependent Probe Amplification (MLPA) is a rapid, simple, reliable and customizable method for detecting copy-number changes of individual genes at high resolution, and it allows for high-throughput analysis. The technique is typically applied to study specific genes in large sample series. The large amount of data, dissimilarities in PCR efficiency among the different probe amplification products, and sample-to-sample variation pose a challenge to data analysis and interpretation. We therefore set out to develop an MLPA data analysis strategy and tool that is simple to use, while still taking into account the above-mentioned sources of variation.

Materials and Methods: MLPAnalyzer was developed in Visual Basic for Applications and can accept a large number of file formats directly from capillary sequence systems. The sizes of all MLPA probe signals are determined and filtered, quality-control steps are performed, and variation in peak intensity related to size is corrected for. DNA copy-number ratios of test samples are computed and displayed in a table view, and a set of comprehensive figures is generated. To validate this approach, MLPA reactions were performed using a dedicated MLPA mix on six different colorectal cancer cell lines. The generated data were normalized using our program, and the results were compared to previously obtained array-CGH results using both statistical methods and visual examination.

Results and Discussion: Visual examination of bar graphs and direct ratios for both techniques showed very similar results, while the average Pearson moment correlation over all MLPA probes was found to be 0.42. Our results thus show that automated MLPA data processing following our suggested strategy may be of significant use, especially when handling large MLPA data sets, when samples are of different quality, or when interpretation of MLPA electropherograms is too complex. It remains important to recognize, however, that automated MLPA data processing may only be successful when a dedicated experimental setup is also considered.
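A much-simplified sketch of the central normalization step (hypothetical peak intensities; the actual tool additionally filters peaks, runs quality control and corrects the size-dependent drop in amplification efficiency):

```python
import numpy as np

def mlpa_ratios(test_peaks, ref_peaks):
    """Per-probe DNA copy-number ratios. Each sample is first scaled by
    its own total signal to absorb sample-to-sample variation, then the
    test sample is divided by the reference. A simplified sketch of the
    normalization idea, not MLPAnalyzer's exact algorithm."""
    test = np.asarray(test_peaks, dtype=float)
    ref = np.asarray(ref_peaks, dtype=float)
    return (test / test.sum()) / (ref / ref.sum())

# Hypothetical intensities for four probes: a ratio near 0.5 suggests
# loss of one copy, near 1.5 a gain, near 1.0 a normal copy number.
print(mlpa_ratios([980, 450, 1020, 990], [1000, 1000, 1000, 1000]))
```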
APA, Harvard, Vancouver, ISO, and other styles
46

Wimalasuriya, R., A. Kapukotuwa, and G. Ranasinghe. "Conceptual Framework for On-site Digital Interpretation Developments in Cultural Heritage Sites." Vidyodaya Journal of Humanities and Social Sciences 07, no. 01 (January 1, 2022): 64–84. http://dx.doi.org/10.31357/fhss/vjhss.v07i01.04.

Full text
Abstract:
On-site heritage interpretation plays a vital role in cultural heritage sites in conveying their significance and multiple heritage values to visitors. In an era where the world is being transformed by innovative digital applications, heritage sites are also being integrated with digital interpretation techniques to deliver better interpretation and a new dimension of experience to visitors. Though multiple digital solutions are available, not all techniques are appropriate, applicable and feasible for every site. Moreover, no proper worldwide principles or framework have been established for these digital heritage interpretation developments. This study therefore focuses on building a generic conceptual framework for selecting the most appropriate digital interpretation technique(s) for the context of a heritage site, with special reference to the six Cultural World Heritage Sites of Sri Lanka. The relevant qualitative and quantitative data were gathered via in-depth interviews, field observation, a literature survey and a visitor survey questionnaire. The main themes and sub-themes derived through thematic analysis were adopted as the theoretical framework of the research to analyze the collected data on the six Cultural World Heritage Sites and the selected digital techniques. Based on the results, the study recommends appropriate digital techniques for each Cultural World Heritage Site of the country. Further, as aimed, the study presents a conceptual framework for on-site digital interpretation developments at cultural heritage sites by categorizing the 24 criteria derived for data analysis under five phases, namely ‘Prepare’, ‘Assess’, ‘Design’, ‘Implement’ and ‘Sustain’.
APA, Harvard, Vancouver, ISO, and other styles
47

Allan, James D., Jose L. Jimenez, Paul I. Williams, M. Rami Alfarra, Keith N. Bower, John T. Jayne, Hugh Coe, and Douglas R. Worsnop. "Correction to “Quantitative sampling using an Aerodyne aerosol mass spectrometer: 1. Techniques of data interpretation and error analysis”." Journal of Geophysical Research: Atmospheres 108, no. D9 (May 10, 2003): n/a. http://dx.doi.org/10.1029/2003jd001607.

Full text
APA, Harvard, Vancouver, ISO, and other styles
48

Welsman, Joanne R., and Neil Armstrong. "Statistical Techniques for Interpreting Body Size–Related Exercise Performance during Growth." Pediatric Exercise Science 12, no. 2 (May 2000): 112–27. http://dx.doi.org/10.1123/pes.12.2.112.

Full text
Abstract:
This paper reviews some of the statistical methods available for controlling for body-size differences in the interpretation of developmental changes in exercise performance. For cross-sectional data analysis, simple per-body-mass ratio scaling continues to be widely used but is frequently ineffective, as the computed ratio remains correlated with body mass. Linear regression techniques may distinguish group differences more appropriately but, as illustrated, only allometric (log-linear regression) scaling appropriately removes body-size differences while accommodating the heteroscedasticity common in exercise performance data. The analysis and interpretation of longitudinal data within an allometric framework is complex. More established methods such as ontogenetic allometry allow insights into individual size-function relationships but are unable to adequately describe population effects or changes in the magnitude of the response. The recently developed multilevel regression modeling technique represents a flexible and sensitive solution to such problems, allowing both individual and group responses to be modeled concurrently.
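A minimal sketch of the allometric (log-linear) approach on hypothetical cross-sectional data, contrasting it with the simple per-body-mass ratio the review criticises:

```python
import numpy as np

# Hypothetical peak VO2 (L/min) and body mass (kg) for a youth sample,
# generated with a multiplicative (hence heteroscedastic) error term.
rng = np.random.default_rng(2)
mass = rng.uniform(30, 70, size=100)
vo2 = 0.11 * mass ** 0.75 * np.exp(rng.normal(0, 0.1, size=100))

# Allometric scaling: fit log(VO2) = log(a) + b*log(mass) by least squares.
b, log_a = np.polyfit(np.log(mass), np.log(vo2), 1)
print(f"allometric exponent b = {b:.2f}")  # ~0.75 here, not the 1.0
                                           # implied by per-mass ratios

# The power-function ratio VO2 / mass**b is then uncorrelated with mass,
# unlike the simple per-body-mass ratio the review shows to be flawed.
print(np.corrcoef(vo2 / mass ** b, mass)[0, 1])  # close to zero
```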
APA, Harvard, Vancouver, ISO, and other styles
49

Russell, Trevor G., Melinda Martin-Khan, Asaduzzaman Khan, and Victoria Wade. "Method-comparison studies in telehealth: Study design and analysis considerations." Journal of Telemedicine and Telecare 23, no. 9 (September 11, 2017): 797–802. http://dx.doi.org/10.1177/1357633x17727772.

Full text
Abstract:
When establishing telehealth services, clinicians need to be confident that the examinations, assessments and clinical decisions they make while using technology are equivalent to conventional best practice. Method-comparison studies are ideally suited to answering these questions; however, there is a lack of consistency in the study methodologies and data analysis techniques used in the telehealth literature. Methodologies should closely match clinical practice to maximise external validity, and data analysis techniques should match the data types generated in order to be clinically meaningful. In this article we discuss the design, analysis and interpretation of method-comparison studies in the context of telehealth research.
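For continuous measurements, one classic method-comparison analysis is Bland-Altman limits of agreement; the sketch below uses hypothetical paired telehealth and in-person scores, and is only one of the data-type-dependent techniques the article discusses:

```python
import numpy as np

# Hypothetical paired scores: each patient assessed once in person
# and once via telehealth.
rng = np.random.default_rng(3)
in_person = rng.normal(50, 10, size=40)
telehealth = in_person + rng.normal(0.5, 2.0, size=40)

# Bland-Altman analysis: mean difference (bias) and 95% limits of agreement.
diff = telehealth - in_person
bias = diff.mean()
loa = 1.96 * diff.std(ddof=1)
print(f"bias = {bias:.2f}, limits of agreement = "
      f"[{bias - loa:.2f}, {bias + loa:.2f}]")
```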
APA, Harvard, Vancouver, ISO, and other styles
50

Roden, Rocky, Thomas Smith, and Deborah Sacrey. "Geologic pattern recognition from seismic attributes: Principal component analysis and self-organizing maps." Interpretation 3, no. 4 (November 1, 2015): SAE59—SAE83. http://dx.doi.org/10.1190/int-2015-0037.1.

Full text
Abstract:
Interpretation of seismic reflection data routinely involves powerful multiple-central-processing-unit computers, advanced visualization techniques, and the generation of numerous seismic data types and attributes. Even with these technologies at interpreters' disposal, there are additional techniques to derive even more useful information from the data. Over the last few years, there have been efforts to distill numerous seismic attributes into volumes that are easily evaluated for their geologic significance and for improved seismic interpretation. Seismic attributes are any measurable property of seismic data. Commonly used categories of seismic attributes include instantaneous, geometric, amplitude-accentuating, amplitude variation with offset, spectral decomposition, and inversion attributes. Principal component analysis (PCA), a linear quantitative technique, has proven to be an excellent approach for understanding which seismic attributes, or combinations of seismic attributes, have interpretive significance. PCA reduces a large set of seismic attributes to indicate variations in the data, which often relate to geologic features of interest. As a tool in an interpretation workflow, PCA can help determine meaningful seismic attributes. In turn, these attributes are input to self-organizing-map (SOM) training. The SOM, a form of unsupervised neural network, has proven to take many of these seismic attributes and produce meaningful and easily interpretable results. SOM analysis reveals the natural clustering and patterns in the data and has been beneficial in defining stratigraphy, seismic facies, direct-hydrocarbon-indicator features, and aspects of shale plays such as fault/fracture trends and sweet spots. With modern visualization capabilities and the application of 2D color maps, SOM routinely identifies meaningful geologic patterns. Recent work using SOM and PCA has revealed geologic features that were not previously identified or easily interpreted from the seismic data. The ultimate goal of this multiattribute analysis is to enable the geoscientist to produce a more accurate interpretation and reduce exploration and development risk.
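A skeletal version of the two-step workflow on a hypothetical attribute matrix, using scikit-learn for PCA and the third-party minisom package as one possible SOM implementation (the authors' actual software is not specified here):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler
from minisom import MiniSom  # one possible SOM implementation

# Hypothetical matrix: one row per seismic sample, one column per attribute
# (e.g. envelope, instantaneous frequency, coherence, ...).
rng = np.random.default_rng(4)
X = StandardScaler().fit_transform(rng.normal(size=(50000, 8)))

# Step 1: PCA loadings indicate which attributes carry the variance and
# therefore merit inclusion in the neural-network training.
pca = PCA(n_components=3).fit(X)
print(pca.explained_variance_ratio_)  # variance explained per component
print(pca.components_)                # attribute loadings

# Step 2: an unsupervised SOM clusters the attribute vectors; each sample's
# winning node becomes a class that can be colour-mapped for interpretation.
som = MiniSom(8, 8, X.shape[1], sigma=1.0, learning_rate=0.5, random_seed=0)
som.train_random(X, num_iteration=10000)
classes = [som.winner(x) for x in X[:1000]]  # (row, col) node per sample
```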
APA, Harvard, Vancouver, ISO, and other styles
