To see other types of publications on this topic, follow the link: Standardization of signals.

Journal articles on the topic 'Standardization of signals'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 50 journal articles for your research on the topic 'Standardization of signals.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Browse journal articles from a wide variety of disciplines and organise your bibliography correctly.

1

Stenger, L. "Digital coding of television signals—CCIR activities for standardization." Signal Processing: Image Communication 1, no. 1 (June 1989): 29–43. http://dx.doi.org/10.1016/0923-5965(89)90018-0.

2

Kubota, H. "Standardization for Warning Signals: Report of the ISO Meeting in London." Japanese Journal of Medical Instrumentation 55, no. 8 (August 1, 1985): 404–8. http://dx.doi.org/10.4286/ikakikaigaku.55.8_404.

3

Krigsholm and Riekkinen. "Applying Text Mining for Identifying Future Signals of Land Administration." Land 8, no. 12 (November 27, 2019): 181. http://dx.doi.org/10.3390/land8120181.

Abstract:
Companies and governmental agencies are increasingly seeking ways to explore emerging trends and issues that have the potential to shape up their future operational environments. This paper exploits text mining techniques for investigating future signals of the land administration sector. After a careful review of previous literature on the detection of future signals through text mining, we propose the use of topic models to enhance the interpretation of future signals. Findings of the study highlight the large spectrum of issues related to land interests and their recording, as nineteen future signal topics ranging from climate change mitigation and the use of satellite imagery for data collection to flexible standardization and participatory land consolidations are identified. Our analysis also shows that distinguishing weak signals from latent, well-known, and strong signals is challenging when using a predominantly automated process. Overall, this study summarizes the current discourses of the land administration domain and gives an indication of which topics are gaining momentum at present.
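Where the abstract above describes topic modeling over a mined text corpus to aid interpretation of candidate future signals, a minimal sketch of that kind of step may help; scikit-learn's LDA is our assumed stand-in for the authors' tooling, and the four-document corpus is a placeholder.

```python
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.feature_extraction.text import CountVectorizer

# Placeholder corpus; a real run would use the mined document collection.
docs = ["land administration blockchain pilot scheme",
        "satellite imagery for cadastral data collection",
        "participatory land consolidation workshops",
        "flexible standardization of land tenure records"]

vec = CountVectorizer(stop_words="english")
X = vec.fit_transform(docs)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)

# Top words per topic support interpretation of candidate future signals.
vocab = vec.get_feature_names_out()
for k, weights in enumerate(lda.components_):
    print(k, [vocab[i] for i in weights.argsort()[-3:]])
```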
4

Zhang, Xianliang, Junxia Li, Xiaobo Liu, and Zhenju Chen. "Improved EEMD-based standardization method for developing long tree-ring chronologies." Journal of Forestry Research 31, no. 6 (June 27, 2019): 2217–24. http://dx.doi.org/10.1007/s11676-019-01002-y.

Abstract:
Long tree-ring chronologies can be developed by overlapping data from living trees with data from fossil trees through cross-dating. However, low-frequency climate signals are lost when standardizing tree-ring series due to the “segment length curse”. To alleviate the segment length curse and thus improve the standardization method for developing long tree-ring chronologies, here we first calculated a mean value for all the tree ring series by overlapping all of the tree ring series. The growth trend of the mean tree ring width (i.e., cumulated average growth trend of all the series) was determined using ensemble empirical mode decomposition. Then the chronology was developed by dividing the mean value by the growth trend of the mean value. Our improved method alleviated the problem of trend distortion. Long-term signals were better preserved using the improved method than in previous detrending methods. The chronologies developed using the improved method were better correlated with climate than those developed using conservative methods. The improved standardization method alleviates trend distortion and retains more of the low-frequency climate signals.
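The detrending arithmetic described above reduces to dividing the mean ring-width series by an EEMD-derived growth trend. A minimal sketch follows, assuming the third-party PyEMD package for the decomposition; taking the last intrinsic mode function as the trend is our simplification, not necessarily the authors' choice.

```python
import numpy as np
from PyEMD import EEMD   # third-party package: pip install EMD-signal

def eemd_chronology(mean_ring_width: np.ndarray) -> np.ndarray:
    """Dimensionless chronology: mean series divided by its EEMD trend."""
    imfs = EEMD().eemd(mean_ring_width)
    trend = imfs[-1]                    # last IMF taken as the growth trend
    return mean_ring_width / trend      # the ratio removes the age trend

# Illustrative use on a synthetic series (negative-exponential age trend).
years = np.arange(500)
rng = np.random.default_rng(0)
series = 2.0 * np.exp(-years / 200.0) + 0.5 + 0.1 * rng.standard_normal(500)
chronology = eemd_chronology(series)
```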
5

Miliszkiewicz, Natalia, Stanisław Walas, and Anna Tobiasz. "Current approaches to calibration of LA-ICP-MS analysis." Journal of Analytical Atomic Spectrometry 30, no. 2 (2015): 327–38. http://dx.doi.org/10.1039/c4ja00325j.

Abstract:
For solid sample quantitative analysis by laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS) the main analytical problems are adequate standards preparation and signals standardization.
6

Němcová, Andrea, Radovan Smíšek, Lucie Maršánová, Lukáš Smital, and Martin Vítek. "A Comparative Analysis of Methods for Evaluation of ECG Signal Quality after Compression." BioMed Research International 2018 (July 18, 2018): 1–26. http://dx.doi.org/10.1155/2018/1868519.

Abstract:
The assessment of ECG signal quality after compression is an essential part of the compression process. Compression facilitates the signal archiving, speeds up signal transmission, and reduces the energy consumption. Conversely, lossy compression distorts the signals. Therefore, it is necessary to express the compression performance through both compression efficiency and signal quality. This paper provides an overview of objective algorithms for the assessment of both ECG signal quality after compression and compression efficiency. In this area, there is a lack of standardization, and there is no extensive review as such. 40 methods were tested in terms of their suitability for quality assessment. For this purpose, the whole CSE database was used. The tested signals were compressed using an algorithm based on SPIHT with varying efficiency. As a reference, compressed signals were manually assessed by two experts and classified into three quality groups. Owing to the experts’ classification, we determined corresponding ranges of selected quality evaluation methods’ values. The suitability of the methods for quality assessment was evaluated based on five criteria. For the assessment of ECG signal quality after compression, we recommend using a combination of these methods: PSim SDNN, QS, SNR1, MSE, PRDN1, MAX, STDERR, and WEDD SWT.
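Several of the recommended measures (MSE, SNR, PRDN) are closed-form comparisons between the original signal and its reconstruction. A short NumPy sketch under their usual textbook definitions, not necessarily the paper's exact variants:

```python
import numpy as np

def mse(x: np.ndarray, y: np.ndarray) -> float:
    """Mean squared error between original x and reconstruction y."""
    return float(np.mean((x - y) ** 2))

def snr_db(x: np.ndarray, y: np.ndarray) -> float:
    """SNR in dB: mean-removed signal power over reconstruction-error power."""
    return float(10 * np.log10(np.sum((x - x.mean()) ** 2)
                               / np.sum((x - y) ** 2)))

def prdn(x: np.ndarray, y: np.ndarray) -> float:
    """Normalized percentage RMS difference (mean-removed denominator)."""
    return float(100 * np.sqrt(np.sum((x - y) ** 2)
                               / np.sum((x - x.mean()) ** 2)))
```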
7

Hoffman, Roy E. "Standardization of chemical shifts of TMS and solvent signals in NMR solvents." Magnetic Resonance in Chemistry 44, no. 6 (2006): 606–16. http://dx.doi.org/10.1002/mrc.1801.

8

Damani, Devanshi N., Divaakar Siva Baala Sundaram, Shivam Damani, Anoushka Kapoor, Adelaide M. Arruda Olson, and Shivaram P. Arunachalam. "Investigation of Synchronized Acquisition of Electrocardiogram and Phonocardiogram Signals towards Electromechanical Profiling of the Heart." Biomedical Sciences Instrumentation 57, no. 2 (April 1, 2021): 305–12. http://dx.doi.org/10.34107/yhpn9422.04305.

Abstract:
Cardiac diseases are the leading cause of death in the world. Electrocardiogram (ECG) and phonocardiogram (PCG) signals play a significant role in the diagnosis of various cardiac diseases. Simultaneous acquisition of ECG and PCG signals can open new avenues of signal processing approaches for electromechanical profiling of the heart. However, there are no standard approaches to ensure high-fidelity synchronous data acquisition to enable the development of such novel technologies. In this work, the authors report results on various data capture positions that could lead to standardization of simultaneous ECG and PCG data collection. The presence of lung sounds and variations in posture, depth, and frequency of breathing can lead to differences in the recorded ECG-PCG signals. This necessitates a standard approach to record and interpret the collected data. The authors recorded ECG-PCG simultaneously in six healthy subjects using a digital stethoscope to understand the differences in signal quality in various recording positions (prone, supine, bending, semi-recumbent, standing, left lateral, and sitting) with normal and deep breathing conditions. The collected digitized signals were processed offline for signal quality (SNR) using custom MATLAB software. The results indicate minimal differences in signal quality across the different recording positions. Validation of this technique with a larger dataset is required. Future work will investigate changes in characteristic ECG and PCG features due to position and breathing patterns.
9

Dong, Li, Lingling Zhao, Yufan Zhang, Xue Yu, Fali Li, Jianfu Li, Yongxiu Lai, Tiejun Liu, and Dezhong Yao. "Reference Electrode Standardization Interpolation Technique (RESIT): A Novel Interpolation Method for Scalp EEG." Brain Topography 34, no. 4 (May 5, 2021): 403–14. http://dx.doi.org/10.1007/s10548-021-00844-2.

Abstract:
“Bad channels” are common phenomena during scalp electroencephalography (EEG) recording that arise due to various technique-related reasons, and reconstructing signals from bad channels is an inevitable choice in EEG processing. However, current interpolation methods are all based on purely mathematical interpolation theory, ignoring the neurophysiological basis of the EEG signals, and their performance needs to be further improved, especially when there are many scattered or adjacent bad channels. Therefore, a new interpolation method, named the reference electrode standardization interpolation technique (RESIT), was developed for interpolating scalp EEG channels. Resting-state and event-related EEG datasets were used to investigate the performance of the RESIT. The main results showed that (1) assuming 10% bad channels, RESIT can reconstruct the bad channels well; (2) as the percentage of bad channels increased (from 2% to 85%), the absolute and relative errors between the true and RESIT-reconstructed signals generally increased, and the correlations between the true and RESIT signals decreased; (3) for a range of bad channel percentages (2% ~ 85%), the RESIT had lower absolute error (approximately 2.39% ~ 33.5% reduction), lower relative errors (approximately 1.3% ~ 35.7% reduction) and higher correlations (approximately 2% ~ 690% increase) than traditional interpolation methods, including neighbor interpolation (NI) and spherical spline interpolation (SSI). In addition, the RESIT was integrated into the EEG preprocessing pipeline on the WeBrain cloud platform (https://webrain.uestc.edu.cn/). These results suggest that the RESIT is a promising interpolation method for both separate and simultaneous EEG preprocessing that benefits further EEG analysis, including event-related potential (ERP) analysis, EEG network analysis, and strict group-level statistics.
10

Garrido Frenich, A., D. Picón Zamora, J. L. Martínez Vidal, and M. Martínez Galera. "Standardization of SPE signals in multicomponent analysis of three benzimidazolic pesticides by spectrofluorimetry." Analytica Chimica Acta 477, no. 2 (February 2003): 211–22. http://dx.doi.org/10.1016/s0003-2670(02)01423-x.

11

Banks, George C., Haley M. Woznyj, Ryan S. Wesslen, Katherine A. Frear, Gregory Berka, Eric D. Heggestad, and Heather L. Gordon. "Strategic Recruitment Across Borders: An Investigation of Multinational Enterprises." Journal of Management 45, no. 2 (August 23, 2018): 476–509. http://dx.doi.org/10.1177/0149206318764295.

Abstract:
As a result of globalization, large-scale modern-day businesses extend across borders as they engage in multinational enterprises. Such enterprises must conduct operations in disparate, culturally diverse contexts, which present challenges for implementing human resource management activities, such as whether to standardize or localize activities across borders. The current study focuses on recruitment activities, as they represent firms’ initial efforts to attract highly qualified talent. However, the extant recruitment literature has primarily been conducted in a single context or in Westernized societies; thus, it is unclear how organizations recruit across borders. Drawing on signaling theory, we explore how Fortune 1000 firms use recruiting signals in their domestic and international operations. In general, we find that firms standardize the recruiting signals across their domestic and international operations. Yet, the amount that each signal is emphasized differs in domestic and international operations and is contingent upon language. Furthermore, cultural distance between the home and host country largely does not explain the standardization of the recruiting signals. We summarize the findings and provide direction intended to guide future research.
12

Calò, Pietro Giorgio, Fabio Medas, Luca Gordini, Francesco Podda, Enrico Erdas, Giuseppe Pisano, and Angelo Nicolosi. "Interpretation of intraoperative recurrent laryngeal nerve monitoring signals: The importance of a correct standardization." International Journal of Surgery 28 (April 2016): S54–S58. http://dx.doi.org/10.1016/j.ijsu.2015.12.039.

13

Cuadros-Rodríguez, Luis, Fidel Ortega-Gavilán, Sandra Martín-Torres, Santiago Medina-Rodríguez, Ana M. Jimenez-Carvelo, Antonio González-Casado, and M. Gracia Bagur-González. "Standardization of chromatographic signals – Part I: Towards obtaining instrument-agnostic fingerprints in gas chromatography." Journal of Chromatography A 1641 (March 2021): 461983. http://dx.doi.org/10.1016/j.chroma.2021.461983.

14

Matskovsky, V. V., and S. Helama. "Testing long-term summer temperature reconstruction based on maximum density chronologies obtained by reanalysis of tree-ring data sets from northernmost Sweden and Finland." Climate of the Past 10, no. 4 (August 5, 2014): 1473–87. http://dx.doi.org/10.5194/cp-10-1473-2014.

Abstract:
Here we analyse the maximum latewood density (MXD) chronologies of two published tree-ring data sets: one from Torneträsk region in northernmost Sweden (TORN; Melvin et al., 2013) and one from northern Fennoscandia (FENN; Esper et al., 2012). We paid particular attention to the MXD low-frequency variations to reconstruct summer (June–August, JJA) long-term temperature history. We used published methods of tree-ring standardization: regional curve standardization (RCS) combined with signal-free implementation. Comparisons with RCS chronologies produced using single and multiple (non-climatic) ageing curves (to be removed from the initial MXD series) were also carried out. We develop a novel method of standardization, the correction implementation of signal-free standardization, tailored for detection of pure low-frequency signal in tree-ring chronologies. In this method, the error in RCS chronology with signal-free implementation is analytically assessed and extracted to produce an advanced chronology. The importance of correction becomes obvious at lower frequencies as smoothed chronologies become progressively more correlative with correction implementation. Subsampling the FENN data to mimic the lower chronology sample size of TORN data shows that the chronologies bifurcate during the 7th, 9th, 17th and 20th centuries. We used the two MXD data sets to reconstruct summer temperature variations over the period 8 BC through AD 2010. Our new reconstruction shows multi-decadal to multi-centennial variability with changes in the amplitude of the summer temperature of 2.2 °C on average during the Common Era. Although the MXD data provide palaeoclimate research with a highly reliable summer temperature proxy, the bifurcating dendroclimatic signals identified in the two data sets imply that future research should aim at a more advanced understanding of MXD data on distinct issues: (1) influence of past population density variations on MXD production, (2) potential biases when calibrating differently produced MXD data to produce one proxy record, (3) influence of the biological age of MXD data when introducing young trees into the chronology over the most recent past and (4) possible role of waterlogging in MXD production when analysing tree-ring data of riparian trees.
15

Singh, Yogendra Narain, Sanjay Kumar Singh, and Amit Kumar Ray. "Bioelectrical Signals as Emerging Biometrics: Issues and Challenges." ISRN Signal Processing 2012 (July 26, 2012): 1–13. http://dx.doi.org/10.5402/2012/712032.

Abstract:
This paper presents the effectiveness of bioelectrical signals such as the electrocardiogram (ECG) and the electroencephalogram (EEG) for biometric applications. Studies show that the impulses of cardiac rhythm and the electrical activity of the brain, recorded in the ECG and EEG respectively, have unique features among individuals; therefore, they can be suggested for use as biometrics for identity verification. The favourable characteristics of the ECG and EEG signals as biometrics include universality, measurability, uniqueness, and robustness. In addition, they have the inherent feature of vitality, signifying life signs that offer strong protection against spoof attacks. Unlike conventional biometrics, the ECG or EEG is highly confidential and secure to an individual and is difficult to forge. We present a review of methods that use the ECG and EEG as biometrics for individual authentication and compare their performance on the datasets and test conditions they have used. We illustrate the challenges involved in using the ECG or EEG as a biometric, primarily due to the presence of drastic acquisition variations and the lack of standardization of signal features. Determining the individuality of the ECG or EEG at large scale is another challenge that remains to be addressed.
16

Cuadros-Rodríguez, Luis, Sandra Martín-Torres, Fidel Ortega-Gavilán, Ana M. Jiménez-Carvelo, Rosalía López-Ruiz, Antonia Garrido-Frenich, M. Gracia Bagur-González, and Antonio González-Casado. "Standardization of chromatographic signals – Part II: Expanding instrument-agnostic fingerprints to reverse phase liquid chromatography." Journal of Chromatography A 1641 (March 2021): 461973. http://dx.doi.org/10.1016/j.chroma.2021.461973.

17

Brendel, Friederike, Julien Poëtte, Béatrice Cabon, and Frédéric van Dijk. "Low-cost analog fiber optic links for in-house distribution of millimeter-wave signals." International Journal of Microwave and Wireless Technologies 3, no. 2 (February 18, 2011): 231–36. http://dx.doi.org/10.1017/s1759078711000031.

Abstract:
In this article, analog fiber optic links (radio-over-fiber, RoF, links) are presented as a flexible, low-cost solution for in-house distribution of millimeter-wave (mmw) signals. Mode-locked laser diodes (MLLD) serve as inexpensive mmw sources for the downlink distribution of mmw signals across an optical fiber link. We compare the robustness of direct and external RF modulation for such MLLD-based RoF systems, where the error vector magnitude (EVM) of the received symbols serves as a figure of merit. On the eve of 60 GHz WLAN standardization, we experimentally investigate the transmission of narrowband WLAN (IEEE 802.11a) signals in the millimetric range at moderate data rates. We also demonstrate broadband transmission of multi-band orthogonal frequency-division multiplexing (MB-OFDM) ultra-wideband (UWB) European Computer Manufacturers Association (ECMA-368) signals in the 60 GHz band for data rates of up to 480 Mbps.
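The figure of merit used above, error vector magnitude, is conventionally the RMS error between received and ideal constellation symbols, normalized by the constellation's RMS power; a small sketch under that common definition (standards differ on the normalization reference):

```python
import numpy as np

def evm_percent(received: np.ndarray, ideal: np.ndarray) -> float:
    """RMS EVM, normalized by the ideal constellation's RMS power."""
    err = np.mean(np.abs(received - ideal) ** 2)
    ref = np.mean(np.abs(ideal) ** 2)
    return float(100 * np.sqrt(err / ref))

# Example: noisy QPSK symbols.
rng = np.random.default_rng(0)
ideal = np.array([1 + 1j, -1 + 1j, -1 - 1j, 1 - 1j]) / np.sqrt(2)
rx = ideal + 0.05 * (rng.standard_normal(4) + 1j * rng.standard_normal(4))
print(f"EVM = {evm_percent(rx, ideal):.2f}%")
```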
18

Matskovsky, V. V., and S. Helama. "Testing long-term summer temperature reconstruction based on maximum density chronologies obtained by reanalysis of tree-ring datasets from northernmost Sweden and Finland." Climate of the Past Discussions 9, no. 5 (October 16, 2013): 5659–700. http://dx.doi.org/10.5194/cpd-9-5659-2013.

Abstract:
Here we analysed the maximum latewood density (MXD) chronologies of two published tree-ring datasets: from Torneträsk region in northernmost Sweden (TORN, Melvin et al., 2013) and from northern Fennoscandia (FENN, Esper et al., 2012). We paid particular attention to the MXD low-frequency variations to reconstruct long-term summer (June–August, JJA) temperature history. We used published methods of tree-ring standardization: regional curve (RC) standardization, combined with signal-free (SF) implementation. Comparisons with a single-RC (RC1) and multiple-RC (RC2) were also carried out. We develop a novel method of standardization, the correction (C) implementation to SF (hence, RC1SFC or RC2SFC), tailored for detection of pure low-frequency signal in tree-ring chronologies. In this method, the error in RC1SF (or RC2SF) chronology is analytically assessed and extracted to produce a RC1SFC or RC2SFC chronology. In TORN, the RC1SF chronology shows higher correlation with summer temperature (JJA) than RC1SFC, whereas in FENN the temperature signal of the RC1SF chronology is improved by correction implementation (RC1SFC). The highest correlation between differently standardized chronologies for two datasets is obtained using FENN-RC2SFC and TORN-RC1 chronologies. Focusing on lowest frequencies, the importance of correction becomes obvious as the chronologies become progressively more correlative with RC1SFC and RC2SFC implementations. Subsampling the FENN data (which presents a higher number of samples than TORN dataset) to the chronology sample size of TORN data shows that the chronologies consistently bifurcate during the 7th, 9th, 17th and 20th centuries. We used the two MXD datasets to reconstruct summer temperature variations over the period −48–2010 calendar years. Our new reconstruction shows multi-decadal to multi-centennial variability with changes in the amplitude of the summer temperature of 2.6 °C on average during the Common Era.
19

Hiscock, Peter. "Small Signals: Comprehending the Australian Microlithic as Public Signalling." Cambridge Archaeological Journal 31, no. 2 (February 5, 2021): 313–24. http://dx.doi.org/10.1017/s0959774320000335.

Abstract:
Signalling is a critical capacity in modern human cultures but it has often been difficult to identify and understand on lithic artefacts from pre-literate contexts. Often archaeologists have minimized the signalling role of lithic tools by arguing for strong form-function relationships that constrained signalling or else imposed ethnographic information on the archaeological patterns with the assumption they assist in defining the signalling carried out in prehistory. In this paper I present a case study for which it can be shown that function does not correlate with form and that the technology fell out of use 1000–1500 years ago. This means that neither presumptions of continuity in social practice nor reference to tool use provide strong explanations for the size, shape standardization and regional differentiation of Australian microliths. Sender-receiver signalling theory is harnessed to motivate a new synthesis of these microliths, and I demonstrate that not only were these artefacts probably key objects used in public signalling but also that sender-receiver frameworks enable us to infer details about the operation of the signalling system.
20

Hosseini, Seyyed Abed. "Decoding Visual Covert Selective Spatial Attention Based on Magnetoencephalography Signals." Biomedical Engineering: Applications, Basis and Communications 31, no. 01 (February 2019): 1950003. http://dx.doi.org/10.4015/s1016237219500030.

Abstract:
This paper proposes a hybrid approach for inferring the target of visual covert selective spatial attention (VCSSA) from magnetoencephalography (MEG) signals. The MEG signal offers a higher spatial resolution and a lower distortion as compared with competing brain signaling techniques, such as the electroencephalography signal. The proposed approach consists of removing global redundant patterns of MEG channels by surface Laplacian, feature extraction by Hurst exponent (H), 6th order Morlet coefficients (MCs), and Petrosian fractal dimension (PFD), standardization, feature ranking by statistical analysis, and classification by support vector machines (SVM). The results indicate that the combined use of the above elements can effectively decipher the cognitive process of VCSSA. In particular, using four-fold cross-validation, the proposed approach robustly predicts the location of the attended stimulus with an accuracy of up to 92.41% for distinguishing left from right. The results show that the fusion of wavelet coefficients and non-linear features is more robust than in previous studies. The results also indicate that the VCSSA involves widespread functional brain activities, affecting more regions than temporal and parietal circuits. Finally, the comparison of the results with six other competing strategies indicates that a slightly higher average accuracy is obtained by the proposed approach on the same data.
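One named feature, the Petrosian fractal dimension, has a simple closed form based on sign changes in the signal's first difference. The sketch below pairs it with standardization and a four-fold cross-validated SVM; scikit-learn and the random placeholder data are our assumptions, not the authors' tooling:

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def petrosian_fd(sig: np.ndarray) -> float:
    """Petrosian fractal dimension from sign changes of the first difference."""
    n = len(sig)
    diff = np.diff(sig)
    n_delta = np.sum(diff[:-1] * diff[1:] < 0)
    return np.log10(n) / (np.log10(n) + np.log10(n / (n + 0.4 * n_delta)))

# Placeholder MEG data: (trials, channels, samples); labels 0=left, 1=right.
rng = np.random.default_rng(0)
X_raw = rng.standard_normal((40, 10, 300))
y = rng.integers(0, 2, 40)
X = np.apply_along_axis(petrosian_fd, 2, X_raw)   # one PFD per trial/channel

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
print(cross_val_score(clf, X, y, cv=4).mean())    # four-fold CV accuracy
```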
21

Okorn, Kristen, and Michael Hannigan. "Improving Air Pollutant Metal Oxide Sensor Quantification Practices through: An Exploration of Sensor Signal Normalization, Multi-Sensor and Universal Calibration Model Generation, and Physical Factors Such as Co-Location Duration and Sensor Age." Atmosphere 12, no. 5 (May 19, 2021): 645. http://dx.doi.org/10.3390/atmos12050645.

Abstract:
As low-cost sensors have become ubiquitous in air quality measurements, there is a need for more efficient calibration and quantification practices. Here, we deploy stationary low-cost monitors in Colorado and Southern California near oil and gas facilities, focusing our analysis on methane and ozone concentration measurement using metal oxide sensors. In comparing different sensor signal normalization techniques, we propose a z-scoring standardization approach to normalize all sensor signals, making our calibration results more easily transferable among sensor packages. We also attempt several different physical co-location schemes, and explore several calibration models in which only one sensor system needs to be co-located with a reference instrument, and can be used to calibrate the rest of the fleet of sensor systems. This approach greatly reduces the time and effort involved in field normalization without compromising goodness of fit of the calibration model to a significant extent. We also explore other factors affecting the performance of the sensor system quantification method, including the use of different reference instruments, duration of co-location, time averaging, transferability between different physical environments, and the age of metal oxide sensors. Our focus on methane and stationary monitors, in addition to the z-scoring standardization approach, has broad applications in low-cost sensor calibration and utility.
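The z-scoring step proposed above is ordinary standardization of each sensor's raw signal, removing sensor-specific offset and gain so that a calibration fitted on one unit can be reused on another. A minimal synthetic sketch of the idea (names and numbers are illustrative):

```python
import numpy as np

def zscore(x: np.ndarray) -> np.ndarray:
    return (x - x.mean()) / x.std()

# Synthetic example: two sensors see the same pollutant series but with
# different offset and gain; z-scoring removes both.
rng = np.random.default_rng(0)
truth = 2.0 + np.abs(rng.standard_normal(500))        # reference concentration
sensor_a = 400 + 35 * truth + rng.normal(0, 1, 500)   # raw units of unit A
sensor_b = 150 + 80 * truth + rng.normal(0, 2, 500)   # raw units of unit B

slope, intercept = np.polyfit(zscore(sensor_a), truth, 1)   # calibrate on A
estimate_b = slope * zscore(sensor_b) + intercept           # reuse on B
print(np.corrcoef(estimate_b, truth)[0, 1])
```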
22

Dawar, Niraj, and Philip Parker. "Marketing Universals: Consumers’ Use of Brand Name, Price, Physical Appearance, and Retailer Reputation as Signals of Product Quality." Journal of Marketing 58, no. 2 (April 1994): 81–95. http://dx.doi.org/10.1177/002224299405800207.

Abstract:
Marketing universals are defined as consumer behaviors within a segment and toward a particular product category that are invariant across cultures. Using several definitions of culture and three different criteria for universality, the authors evaluate whether the use of brand, price, retailer reputation, and physical product appearance as signals of quality are marketing universals for consumer electronics products. Using a sample representing 38 nationalities, they find that there are few differences in the use of quality signals across cultures for a high priority segment of consumers. They draw conclusions for the adaptation versus standardization debate and argue that certain behaviors are likely to be universal, whereas others are not. Understanding such differences is essential to designing international marketing strategies.
23

Niu, Gang, Guo Shun Chen, and Pei Yuan Wang. "An Integration of Military Electronic Equipment and Flexible Test Technology." Applied Mechanics and Materials 127 (October 2011): 316–19. http://dx.doi.org/10.4028/www.scientific.net/amm.127.316.

Abstract:
Military electronic equipment applies numerous technologies and involves many kinds of tested signals, which places demands on a test system's accuracy and expansibility. Aimed at these characteristics, a test system based on flexible test technology is researched; its design ideas are structural modularization, electrical interface standardization, and software modularization. Modularization design, redundant or parallel channels, calibration, and the system function test loop are analyzed in particular; flexible testing can be applied well in the field of military electronic equipment support.
24

Obara, Masaharu. "International standardization of digital television. 1 Bit-parallel and bit-serial interface for component video signals." Journal of the Institute of Television Engineers of Japan 40, no. 6 (1986): 442–48. http://dx.doi.org/10.3169/itej1978.40.442.

25

Provost, Pierre R., and Yves Tremblay. "Standardization of Northern Blot Signals by Determination of Relative Poly(A)+ RNA Amounts Loaded per Lane." Analytical Biochemistry 277, no. 1 (January 2000): 154–56. http://dx.doi.org/10.1006/abio.1999.4381.

26

Kim, Whi-Young, Jun-Hyoung Kim, and Jun-Il Kim. "AT90s8535D Chip Application for Heart Rate Variability Diagnostic Systems." Biomedical Engineering: Applications, Basis and Communications 26, no. 06 (December 2014): 1450079. http://dx.doi.org/10.4015/s1016237214500793.

Abstract:
By effectively combining wireless mobile communications, mobile information terminals, the Internet, and the linking of computers and information technology to the human body, mobile computing can play an important role in modern technology that can be used by anybody, anytime, anywhere, and can creatively reconsider and reconstruct new technology around physiological measurements. In particular, in an aging society, mobile computing can intervene before early biometric changes develop into diseases. Nevertheless, there are difficulties, such as the handling of many parameters, the ambiguity of data standardization, and the difficulty of collecting data simultaneously. Therefore, in this study, a system was implemented by excluding time-limiting factors using mobile computing and by selecting a mobile neural dynamic coding method based on bioelectric signals. The experimental results show that the system can serve as a model for mobile biomedical signal analysis devices and mobile biometric measuring devices that make self-measurement possible. Furthermore, it provides a research foundation for the atypical characteristics of bioelectrical signal formation in mobile applications and can be modeled in the structural circuit form of a biosignal.
27

Chen, Joy, and Lu-Tsou Yeh. "Optimization of Citizen Broadband Radio Service Frequency Allocation for Dynamic Spectrum Access System." December 2020 2, no. 4 (January 5, 2021): 143–48. http://dx.doi.org/10.36548/jsws.2020.4.001.

Abstract:
With the increase in mobile broadband utilization, more spectrum release is recommended by the Federal Communications Commission for spectrum sharing under a three-tier system called the Citizens Broadband Radio Service. The standardization and the functional and operational requirements of this framework are defined by the Wireless Innovation Forum. If an unavoidable shipborne radar appears on the channel, the channel must be vacated by the lower-tier users. The timing constraints on CBRS are also stringent. Wireless stations transmit short beacon frames termed heartbeat signals. These signals carry the wireless channel encryption data, the Service Set Identifier (SSID), and other credential data, and they also transmit commands to vacate a channel. The heartbeat interval, timing constraints, and domain proxy features are analyzed in this paper. CBSD relinquishment and spectrum acquisition are performed with the help of domain-proxy-based communication. The CBRS-SAS channel allocation algorithm is further investigated. Communication interoperability and network robustness can be improved with the introduction of a secondary SAS and a secondary domain proxy, respectively.
28

Hyung Kim, Chang, Seong Jeong, Kyoung Hak Lee, and Chae Bong Sohn. "DTV broadcasting failover switching system using PCR information." International Journal of Engineering & Technology 7, no. 2.12 (April 3, 2018): 101. http://dx.doi.org/10.14419/ijet.v7i2.12.11101.

Abstract:
Background/Objectives: Major terrestrial broadcasters, MSOs (Multi-System Operators), and IPTV (Internet Protocol Television) operators have been putting effort into preventing transmission stops such as transmission interruption. Methods/Statistical analysis: The main and spare signals of the digital broadcast are monitored. If a problem occurs in the main signal, it is automatically switched to the spare signal, preventing a broadcast accident before a failover switch is performed. In this paper, we propose a method that can automatically switch the signal to minimize the impact of a broadcasting accident caused by a disconnection of the input or an accident on the network. Digital broadcast transmission abides by the MPEG-2 (Moving Picture Expert Group) TS (Transport Stream) standard. Findings: This paper proposes a method to maintain broadcast quality through failures such as transmission interruption within the ETSI TR 101 290 (European Telecommunications Standards Institute Technical Report) standard. Furthermore, with this proposed method, it is possible to minimize the error from an automatically transferred stream sent from the emergency link within the ETSI TR 101 290 standard. Improvements/Applications: A method for preventing broadcast transmission accidents.
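The title's "PCR information" refers to the Program Clock Reference carried in the MPEG-2 TS adaptation field; monitoring its continuity on the main and spare streams is one way to detect a failed input. As an illustration only (not the paper's implementation), a sketch that extracts the 27 MHz PCR from a 188-byte TS packet per the ISO/IEC 13818-1 layout:

```python
def pcr_from_ts_packet(pkt: bytes):
    """Return the PCR in 27 MHz ticks, or None if the packet carries no PCR."""
    if len(pkt) != 188 or pkt[0] != 0x47:          # sync byte check
        return None
    afc = (pkt[3] >> 4) & 0x3                      # adaptation_field_control
    if afc not in (2, 3) or pkt[4] == 0:           # no/empty adaptation field
        return None
    if not (pkt[5] & 0x10):                        # PCR_flag not set
        return None
    b = pkt[6:12]                                  # 33-bit base + 9-bit extension
    base = (b[0] << 25) | (b[1] << 17) | (b[2] << 9) | (b[3] << 1) | (b[4] >> 7)
    ext = ((b[4] & 0x01) << 8) | b[5]
    return base * 300 + ext                        # 90 kHz base * 300 + extension
```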
29

Wójcik, Krzysztof, and Marcin Piekarczyk. "Machine Learning Methodology in a System Applying the Adaptive Strategy for Teaching Human Motions." Sensors 20, no. 1 (January 6, 2020): 314. http://dx.doi.org/10.3390/s20010314.

Abstract:
The teaching of motion activities in rehabilitation, sports, and professional work has great social significance. However, the automatic teaching of these activities, particularly those involving fast motions, requires the use of an adaptive system that can adequately react to the changing stages and conditions of the teaching process. This paper describes a prototype of an automatic system that utilizes the online classification of motion signals to select the proper teaching algorithm. The knowledge necessary to perform the classification process is acquired from experts by the use of the machine learning methodology. The system utilizes multidimensional motion signals that are captured using MEMS (Micro-Electro-Mechanical Systems) sensors. Moreover, an array of vibrotactile actuators is used to provide feedback to the learner. The main goal of the presented article is to prove that the effectiveness of the described teaching system is higher than the system that controls the learning process without the use of signal classification. Statistical tests carried out by the use of a prototype system confirmed that thesis. This is the main outcome of the presented study. An important contribution is also a proposal to standardize the system structure. The standardization facilitates the system configuration and implementation of individual, specialized teaching algorithms.
30

Belchior, Fernando Nunes, Paulo Fernando Ribeiro, and Frederico Marques Carvalho. "Comparative Analysis of Instruments Measuring Time Varying Harmonics." International Journal of Emerging Electric Power Systems 17, no. 4 (August 1, 2016): 463–69. http://dx.doi.org/10.1515/ijeeps-2015-0175.

Abstract:
This paper aims to evaluate the performance of commercial class A and class S power quality (PQ) instruments when measuring time-varying harmonics. By using a high precision programmable voltage and current source, two meters from different manufacturers are analyzed and compared. Three-phase voltage signals are applied to PQ instruments, considering 3 situations of time-varying harmonic distortions, whose harmonic distortion values are in accordance with typical values found in power systems. This work is relevant considering that international standardization documents do not pay much attention to this aspect of harmonic distortion.
31

Toledo-Peral, Cinthya Lourdes, Josefina Gutiérrez-Martínez, Jorge Airy Mercado-Gutiérrez, Ana Isabel Martín-Vignon-Whaley, Arturo Vera-Hernández, and Lorenzo Leija-Salas. "sEMG Signal Acquisition Strategy towards Hand FES Control." Journal of Healthcare Engineering 2018 (2018): 1–11. http://dx.doi.org/10.1155/2018/2350834.

Abstract:
Due to damage of the nervous system, patients experience impediments in their daily life: severe fatigue, tremor or impaired hand dexterity, hemiparesis, or hemiplegia. Surface electromyography (sEMG) signal analysis is used to identify motion; however, standardization of electrode placement and classification of sEMG patterns are major challenges. This paper describes a technique used to acquire sEMG signals for five hand motion patterns from six able-bodied subjects using an array of recording and stimulation electrodes placed on the forearm and its effects over functional electrical stimulation (FES) and volitional sEMG combinations, in order to eventually control a sEMG-driven FES neuroprosthesis for upper limb rehabilitation. A two-part protocol was performed. First, personalized templates to place eight sEMG bipolar channels were designed; with these data, a universal template, called forearm electrode set (FELT), was built. Second, volitional and evoked movements were recorded during FES application. 95% classification accuracy was achieved using two sessions per movement. With the FELT, it was possible to perform FES and sEMG recordings simultaneously. Also, it was possible to extract the volitional and evoked sEMG from the raw signal, which is highly important for closed-loop FES control.
32

Kehoe, E. James, Amanda J. Horne, Jennifer Kingham, Thomas Martin, and Wayne Roach. "Acquisition of a conditioned reflex in New Zealand White rabbits from three sources." Laboratory Animals 29, no. 4 (October 1, 1995): 394–99. http://dx.doi.org/10.1258/002367795780739962.

Abstract:
In studies of learning using rabbits, there has been standardization of behavioural procedures across laboratories. Less attention has been paid to variation that may arise from genetic differences and/or differences in rearing conditions. The present experiment revealed that acquisition of a conditioned reflex can be affected dramatically by such differences. Specifically, the acquisition of a conditioned reflex in New Zealand White (NZW) rabbits from 3 different suppliers was compared. All rabbits received behavioural training in which a tone or a light signalled an electrotactile stimulation of the trigeminal nerve near the rabbits' right eye. This tactile stimulus reliably elicited an eyeblink. Repeated presentations of the auditory and visual signals followed by the tactile stimulus yielded the acquisition of a conditioned response (CR), namely closure of the eyelids during the warning period provided by the signal stimuli. Two of the groups showed steady CR acquisition at a rate that matched previous results in other laboratories as well as in the senior author's laboratory. However, the third group of rabbits showed very slow acquisition, and some rabbits failed to show any CR acquisition.
33

Althoff, Amanda G., Charles B. Williams, Tina McSweeney, Daniel A. Gonçalves, and George L. Donati. "Microwave-Induced Plasma Optical Emission Spectrometry (MIP OES) and Standard Dilution Analysis to Determine Trace Elements in Pharmaceutical Samples." Applied Spectroscopy 71, no. 12 (July 25, 2017): 2692–98. http://dx.doi.org/10.1177/0003702817721750.

Abstract:
In this work, we evaluate the application of microwave-induced plasma optical emission spectrometry (MIP OES) to determine Al, Cr, Co, Cu, Fe, Mn, Ni and Zn in children’s cough syrup, eye drops, and oral antiseptic using standard dilution analysis (SDA). The SDA method is simple, with only two calibration solutions prepared per sample. The first solution (S1), composed of 50% sample + 50% of a standard solution, is introduced into the plasma and the analytical signals are monitored in a time-resolved fashion. Then, the second solution (S2), composed of 50% sample + 50% blank, is poured into the vial containing S1. As the solutions mix, the analytical signals gradually drop to a stable baseline. The calibration curve is computed by plotting the ratio of the analyte signal (SA) over the internal standard signal (which is also part of S1) (SIS) on the y-axis, versus the inverse of the IS concentration on the x-axis (i.e., SA/SIS versus 1/CIS). In this study, SDA results were compared with values obtained with the traditional methods of external calibration (EC), internal standardization (IS), and standard additions (SA) in MIP OES determinations. The precision (represented as percent RSD) for SDA showed values in the range of 2.50–8.00% for all samples, while conventional calibration methods showed RSDs in the range of 6.40–32.50% for EC, 8.30–21.80% for IS, and 5.20–17.40% for SA. The LODs calculated for SDA are below the maximum limits allowed by the major pharmaceutical regulatory agencies, and SDA presents superior precision and accuracy compared to the traditional calibration methods. Considering its simplicity and efficiency, SDA is an important new tool for accurate analyses of pharmaceuticals.
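Under the usual assumption of linear response (S_A = k_A·C_A, S_IS = k_IS·C_IS) and the 50:50 mixing scheme described above, the SDA calibration line yields the sample concentration in closed form; a hedged derivation in our notation, not necessarily the paper's:

```latex
% Sketch of the SDA working equation. Mixtures always contain 50% sample;
% let f be the remaining volume fraction of the standard, which carries
% analyte C_A^{std} and internal standard C_{IS}^{std}. Then
%   C_A = 0.5 C_A^{sam} + f C_A^{std},   C_{IS} = f C_{IS}^{std},
% and with linear responses S_A = k_A C_A and S_{IS} = k_{IS} C_{IS}:
\[
\frac{S_A}{S_{IS}}
 = \underbrace{\tfrac{k_A}{k_{IS}}\, 0.5\, C_A^{sam}}_{\text{slope } m}
   \cdot \frac{1}{C_{IS}}
 + \underbrace{\tfrac{k_A}{k_{IS}}\, \frac{C_A^{std}}{C_{IS}^{std}}}_{\text{intercept } b},
\qquad
C_A^{sam} = 2\,\frac{m}{b}\,\frac{C_A^{std}}{C_{IS}^{std}}.
\]
```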
34

Eggemeier, F. Thomas, and John R. Amell. "Visual Probability Monitoring: Effects of Display Load and Signal Discriminability." Proceedings of the Human Factors Society Annual Meeting 30, no. 1 (September 1986): 63. http://dx.doi.org/10.1177/154193128603000116.

Abstract:
The Criterion Task Set (CTS) is a standardized battery of loading tasks developed for application to a variety of performance assessment problems, including evaluations of workload metrics and the effects of stressors on performance. One feature of the CTS that facilitates its application is a set of guidelines that specify standard loading levels for manipulation of difficulty on each task. This standardization is necessary for comparative evaluations of workload metrics and stressors, and therefore represents an essential feature of the CTS. The baseline version of the CTS includes nine primary loading tasks that represent a range of information processing functions. Perceptual input and stimulus identification functions are represented in the CTS by the Probability Monitoring (PM) task, which requires that subjects detect the occurrence of signals on a visual display. The PM display includes a variable number of dials, each consisting of a pointer that moves randomly with respect to a center reference mark. When a signal occurs, pointer movement becomes nonrandom or biased, such that a disproportionate number of moves occurs on one side of the reference mark. Difficulty of the task can be manipulated by varying the display load (number of dials) or the discriminability of the signals (Shingledecker, 1984). Signal discriminability is varied by changes in the percentage of movements that occur on the biased side of the center reference. Previous work has indicated that a 95% bias can be more readily detected than an 85% bias, and that the latter is more rapidly detected than a 75% bias. Current work with the CTS involves continued evaluation and refinement of tasks from the baseline version of the battery. The present work evaluated the effect of display load and signal discriminability in an updated version of the PM task. The new version incorporated a faster pointer movement rate (5 moves per second) than had been used in the previous task. This change was implemented to permit inclusion of more signals per experimental session than had been possible with the baseline version of the task. Since the new version differed substantially from its predecessor, the present study was conducted to evaluate the need to specify new standard loading levels for the task.
35

Belchior, Fernando, Thiago Moura Galvão, Paulo Fernando Ribeiro, and Paulo Márcio Silveira. "Comparative Analysis of Power Quality Instruments in Measuring Power under Distorted Conditions." International Journal of Emerging Electric Power Systems 16, no. 5 (October 1, 2015): 421–29. http://dx.doi.org/10.1515/ijeeps-2015-0042.

Abstract:
This paper aims to evaluate the performance of commercial power quality (PQ) instruments. By using a high precision programmable voltage and current source, five meters from different manufacturers are analyzed and compared. At first, three-phase voltage signals are applied to those PQ instruments, considering harmonic distortion, voltage dip, voltage swell, as well as unbalanced voltages. These events are measured, compared and evaluated considering the different instruments. In addition, voltage and current signals are applied under different conditions, with and without harmonic and unbalances, in order to obtain the measurements of electrical power. After analyzing the data generated in all these tests, efforts are focused on establishing a relationship between the accuracy of the measurements. Considering the percentages of errors, a score is assigned to each instrument. The tests have shown “excellent” and “good” results regarding measuring active and apparent power, as well as power factor. However, for non-active power, almost all instruments had lower performance. This work is relevant considering that Brazil is deploying standardization in terms of power quality measurements aiming towards regulatory procedures and index limits.
36

Mertikas, Donlon, Vuilleumier, Cullen, Féménias, and Tripolitsiotis. "An Action Plan Towards Fiducial Reference Measurements for Satellite Altimetry." Remote Sensing 11, no. 17 (August 23, 2019): 1993. http://dx.doi.org/10.3390/rs11171993.

Abstract:
Satellite altimeters have been producing, since 1992, an amazing and historic record of sea level changes. As Europe moves into full operational altimetry, it has become imperative that the quality of these monitoring signals with their uncertainties should be controlled, fully and properly described, but also traced and connected to undisputable standards and units. Excellent quality is the foundation of these operational services of Europe in altimetry. In line with the above, the strategy of the Fiducial Reference Measurements for Altimetry (FRM4ALT) has been introduced to address and to achieve reliable, long-term, consistent, and undisputable satellite altimetry products for Earth observation and for sea-level change monitoring. FRM4ALT has been introduced and implemented by the European Space Agency in an effort to reach a uniform and absolute standardization for calibrating satellite altimeters. This paper examines the problem and the need behind the FRM4ALT principle to achieve an objective Earth observation. Secondly, it describes the expected FRM products and services which are to come into being out of this new observational strategy. Thirdly, it outlines the technology and the services required for reaching this goal. And finally, it elaborates upon the necessary resources, skills, partnerships, and facilities for establishing FRM standardization for altimetry.
37

Kanicky, Viktor, Vitezslav Otruba, and Jean-Michel Mermet. "Use of Internal Standardization to Compensate for a Wide Range of Absorbance in the Analysis of Glasses by UV Laser Ablation Inductively Coupled Plasma Atomic Emission Spectrometry." Applied Spectroscopy 52, no. 5 (May 1998): 638–42. http://dx.doi.org/10.1366/0003702981944201.

Abstract:
A frequency-tripled Q-switched Nd:YAG laser (355 nm, 10 Hz, 5 mJ per shot, Surelite, Continuum) was used for the ablation of glasses as direct solid sampling for inductively coupled plasma atomic emission spectrometric multichannel detection. The colored and transparent glasses were glass standards used for calibration in X-ray fluorescence spectrometry and exhibited a 0.15–3.5 absorbance range at 355 nm. Translation of the target (1 mm s−1) with respect to the laser beam was used over a length of 16 mm. The depths and widths of the corresponding patterns were not related to the absorbance of the samples. However, the analytical line intensities were efficiently compensated for by using the Si(I) 251.611 nm line as an internal standard. The repeatability was therefore improved for Na, K, Ca, Mg, Sr, Ba, Zn, Pb, Al, Fe, and Sb, as well as the correlation coefficients of the regression and the centroid uncertainties of the calibration graph. Preliminary investigations were carried out to evaluate the acoustic signal emitted by the microplasma as external standardization. However, a negative correlation was found with the line intensity signals under our operating conditions.
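The internal standardization used above amounts to ratioing each analyte line intensity to the simultaneously acquired Si(I) 251.611 nm intensity, so that shot-to-shot variations in ablated mass cancel. A synthetic sketch of why the ratio helps (numbers are illustrative, not the authors' data):

```python
import numpy as np

rng = np.random.default_rng(1)
ablated_mass = rng.uniform(0.5, 1.5, 100)   # varies with absorbance/coupling
analyte = 200.0 * ablated_mass * (1 + 0.02 * rng.standard_normal(100))
si_line = 1000.0 * ablated_mass * (1 + 0.02 * rng.standard_normal(100))

raw_rsd = analyte.std() / analyte.mean() * 100     # large: tracks ablated mass
ratio = analyte / si_line                          # internal standardization
ratio_rsd = ratio.std() / ratio.mean() * 100       # small: mass term cancels
print(f"RSD raw {raw_rsd:.1f}% -> ratioed {ratio_rsd:.1f}%")
```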
38

Scollie, Susan, Charla Levy, Nazanin Pourmand, Parvaneh Abbasalipour, Marlene Bagatto, Frances Richert, Shane Moodie, Jeff Crukley, and Vijay Parsa. "Fitting Noise Management Signal Processing Applying the American Academy of Audiology Pediatric Amplification Guideline: Verification Protocols." Journal of the American Academy of Audiology 27, no. 03 (March 2016): 237–51. http://dx.doi.org/10.3766/jaaa.15060.

Abstract:
Background: Although guidelines for fitting hearing aids for children are well developed and have a strong basis in evidence, specific protocols for fitting and verifying some technologies are not always available. One such technology is noise management in children’s hearing aids. Children are frequently in high-level and/or noisy environments, and many options for noise management exist in modern hearing aids. Verification protocols are needed to define specific test signals and levels for use in clinical practice. Purpose: This work aims to (1) describe the variation in different brands of noise reduction processors in hearing aids and the verification of these processors and (2) determine whether these differences are perceived by 13 children who have hearing loss. Finally, we aimed to develop a verification protocol for use in pediatric clinical practice. Study Sample: A set of hearing aids was tested using both clinically available test systems and a reference system, so that the impacts of noise reduction signal processing in hearing aids could be characterized for speech in a variety of background noises. A second set of hearing aids was tested across a range of audiograms and across two clinical verification systems to characterize the variance in clinical verification measurements. Finally, a set of hearing aid recordings that varied by type of noise reduction was rated for sound quality by children with hearing loss. Results: Significant variation across makes and models of hearing aids was observed in both the speed of noise reduction activation and the magnitude of noise reduction. Reference measures indicate that noise-only testing may overestimate noise reduction magnitude compared to speech-in-noise testing. Variation across clinical test signals was also observed, indicating that some test signals may be more successful than others for characterization of hearing aid noise reduction. Children provided different sound quality ratings across hearing aids, and for one hearing aid rated the sound quality as higher with the noise reduction system activated. Conclusions: Implications for clinical verification systems may be that greater standardization and the use of speech-in-noise test signals may improve the quality and consistency of noise reduction verification across clinics. A clinical protocol for verification of noise management in children’s hearing aids is suggested.
39

Fok, Hok Sum, Linghao Zhou, Yongxin Liu, Robert Tenzer, Zhongtian Ma, and Fang Zou. "Water Balance Standardization Approach for Reconstructing Runoff Using GPS at the Basin Upstream." Remote Sensing 12, no. 11 (May 30, 2020): 1767. http://dx.doi.org/10.3390/rs12111767.

Abstract:
While in-situ estuarine discharge has been correlated and reconstructed well with localized remotely-sensed data and hydraulic variables since the 1990s, its correlation and reconstruction using averaged GPS-inferred water storage from satellite gravimetry (i.e., GRACE) at the basin upstream based on the water balance standardization (WBS) approach remains unexplored. This study aims to illustrate the WBS approach for reconstructing monthly estuarine discharge (in the form of runoff (R)) at Mekong River Delta, by correlating the averaged GPS-inferred water storage from GRACE of the upstream Mekong Basin with the in-situ R at the Mekong River Delta estuary. The resulting R based on GPS-inferred water storage is comparable to that inferred from GRACE, regardless of in-situ stations within Mekong River Delta being used for the R reconstruction. The resulting R from the WBS approach with GPS water storage converted by GRACE mascon solution attains the lowest normalized root-mean-square error of 0.066, and the highest Pearson correlation coefficient of 0.974 and Nash-Sutcliffe efficiency of 0.950. Regardless of using either GPS-inferred or GRACE-inferred water storage, the WBS approach shows an increase of 1–4% in accuracy when compared to those reconstructed from remotely-sensed water balance variables. An external assessment also exhibits similar accuracies when examining the R estimated at another station location. By comparing the reconstructed and estimated Rs between the entrance and the estuary mouth, a relative error of 1–4% is found, which accounts for the remaining effect of tidal backwater on the estimated R. Additional errors might be caused by the accumulated errors from the proposed approach, the unknown signals in the remotely-sensed water balance variables, and the variable time shift across different years between the Mekong Basin at the upstream and the estuary at the downstream.
40

Klemz, Claudio, Ligia Maria Salvo, Jayme da Cunha Bastos Neto, Afonso Celso Dias Bainy, and Helena Cristina da Silva de Assis. "Cytochrome P450 detection in liver of the catfish Ancistrus multispinis (Osteichthyes, Loricariidae)." Brazilian Archives of Biology and Technology 53, no. 2 (April 2010): 361–68. http://dx.doi.org/10.1590/s1516-89132010000200015.

Abstract:
Sensitive biological responses to environmental contaminants are useful as early warning signals to predict the damage caused by long-term exposure. Standardization of protocols to quantify biochemical parameters in different fish species is required to validate their use as biomarkers. Comparative studies across different fish species and their interpretation are a challenge for validating such parameters as general biomarkers representative of environmental impact. In this study, a protocol for liver cytochrome P450 (CYP) analysis in the native Brazilian fish Ancistrus multispinis was established. Microsome contamination by hemoglobin during the analysis of CYP in the liver was detected, which can lead to misinterpretation of the results. The spectrophotometric method for CYP analysis was adapted to diminish the hemoglobin interference. Additionally, the western blotting method for CYP1A analysis was tested successfully for this fish species.
APA, Harvard, Vancouver, ISO, and other styles
41

Fiedler, Georg Martin, Sven Baumann, Alexander Leichtle, Anke Oltmann, Julia Kase, Joachim Thiery, and Uta Ceglarek. "Standardized Peptidome Profiling of Human Urine by Magnetic Bead Separation and Matrix-Assisted Laser Desorption/Ionization Time-of-Flight Mass Spectrometry." Clinical Chemistry 53, no. 3 (March 1, 2007): 421–28. http://dx.doi.org/10.1373/clinchem.2006.077834.

Full text
Abstract:
Background: Peptidome profiling of human urine is a promising tool to identify novel disease-associated biomarkers; however, a wide range of preanalytical variables influence the results of peptidome analysis. Our aim was to develop a standardized protocol for reproducible urine peptidome profiling by means of magnetic bead (MB) separation followed by matrix-assisted laser desorption/ionization time-of-flight (MALDI-TOF) mass spectrometry (MS). Methods: MBs with defined surface functionalities (hydrophobic interaction, cation exchange, and metal ion affinity) were used for peptide fractionation of urine. Mass accuracy and imprecision were calculated for 9 characteristic mass signals (Mr, 1000–10 000). Exogenous variables (instrument performance, urine sampling/storage conditions, freezing conditions, and freeze-thaw cycles) and endogenous variables (pH, urine salt and protein concentrations, and blood and bacteria interferences) were investigated with urine samples from 10 male and 10 female volunteers. Results: We detected 427 different mass signals in the urine of healthy donors. Within- and between-day imprecision in relative signal intensities ranged from 1% to 14% and from 4% to 16%, respectively. Weak cation-exchange and metal ion affinity MB preparations required adjustment of the urinary pH to 7. Storage time, storage temperature, the number of freeze-thaw cycles, and bacterial and blood contamination significantly influenced urine peptide patterns. Individual urine peptide patterns differed significantly within and between days. This imprecision was diminished by normalization to a urinary protein content of 3.5 μg. Conclusion: This reliable pretreatment protocol allows standardization of preanalytical modalities and facilitates reproducible peptidome profiling of human urine by means of MB separation in combination with MALDI-TOF MS.
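The imprecision figures quoted here (within- and between-day variation of relative signal intensities) correspond to a simple coefficient-of-variation calculation over replicate measurements, sketched below with made-up intensities; this is illustrative, not the study's code.

```python
import numpy as np

def cv_percent(replicates):
    """Coefficient of variation (%) of replicate signal intensities for one
    mass signal -- the imprecision measure used in peptidome profiling QC."""
    x = np.asarray(replicates, float)
    return 100.0 * x.std(ddof=1) / x.mean()

# Five hypothetical within-day replicates of one mass signal:
print(round(cv_percent([102.0, 98.5, 105.1, 99.7, 101.2]), 1))  # ~2.5
```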
APA, Harvard, Vancouver, ISO, and other styles
42

Fok, Hok Sum, Linghao Zhou, Yongxin Liu, Zhongtian Ma, and Yutong Chen. "Upstream GPS Vertical Displacement and its Standardization for Mekong River Basin Surface Runoff Reconstruction and Estimation." Remote Sensing 12, no. 1 (December 18, 2019): 18. http://dx.doi.org/10.3390/rs12010018.

Full text
Abstract:
Surface runoff (R), another expression of the river water discharge of a basin, is a critical measurement for regional water cycles. Over the past two decades, river water discharge has been widely investigated based on remotely sensed hydraulic and hydrological variables as well as indices. This study demonstrates the potential of upstream global positioning system (GPS) vertical displacement (VD), and its standardization, to derive R time series statistically, which has not been reported in the recent literature. The monthly correlation between the in situ R at the estuary and the averaged GPS-VD (with and without standardization) over the upstream Mekong River Basin (MRB) is examined. The R time series reconstructed from the latter agrees with, and performs similarly to, that from terrestrial water storage based on the gravimetric satellite mission (i.e., the Gravity Recovery and Climate Experiment (GRACE)) and traditional remote sensing data. The R time series reconstructed from the standardized GPS-VD shows a 2–7% accuracy increase over that without standardization, and is comparable to results obtained with the Palmer drought severity index (PDSI). The estimated R exhibits similar accuracies when externally validated against the in situ time series of another station location. Comparing the estimated R at the entrance of the river delta against that at the estuary indicates a 1–3% relative error induced by the residual ocean tidal effect at the estuary. The R reconstructed from the standardized GPS-VD yields the lowest total relative error, less than 9%, when accounting for the main upstream area of the MRB. The remaining errors may result from the combined effect of the proposed methodology, residual environmental signals in the data time series, and the potential time lag (less than a month) between the upstream MRB and the estuary.
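The "standardization" of GPS vertical displacement described here is, in one plausible reading, a z-score transform of the basin-averaged series before it is regressed against in-situ runoff. The sketch below illustrates that pipeline on synthetic data; the paper's exact standardization convention may differ.

```python
import numpy as np

def standardize(x):
    """Z-score standardization of a time series (assumed convention)."""
    x = np.asarray(x, float)
    return (x - x.mean()) / x.std(ddof=1)

rng = np.random.default_rng(0)
vd = rng.normal(size=(20, 120))           # 20 upstream GPS stations x 120 months
basin_vd = standardize(vd.mean(axis=0))   # basin average, then standardize
r_insitu = 3.0 * basin_vd + rng.normal(0, 0.3, 120)   # toy in-situ runoff
slope, intercept = np.polyfit(basin_vd, r_insitu, 1)  # calibration step
r_reconstructed = slope * basin_vd + intercept
print(np.corrcoef(r_insitu, r_reconstructed)[0, 1])   # reconstruction skill
```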
APA, Harvard, Vancouver, ISO, and other styles
43

Frey, K. A., S. Minoshima, R. A. Koeppe, M. R. Kilbourn, K. L. Berger, and D. E. Kuhl. "Stereotaxic Summation Analysis of Human Cerebral Benzodiazepine Binding Maps." Journal of Cerebral Blood Flow & Metabolism 16, no. 3 (May 1996): 409–17. http://dx.doi.org/10.1097/00004647-199605000-00007.

Full text
Abstract:
Summation analysis strategies are recognized throughout diverse scientific fields as powerful means of differentially enhancing experimental signals over random fluctuations (noise). Such techniques, applied to emission tomographic cerebral blood flow scans, reveal subtle alterations in neuronal activity during specific behavioral states. In the present work, we extend the principles of intersubject image summation analysis to the evaluation of emission tomographic ligand-binding studies. A general methodology is presented that may be applied to a wide variety of binding site determinations. The procedure consists of anatomic standardization of individual brains to a common stereotaxic orientation, followed by statistical analyses of group versus group or individual versus group differences. We develop and evaluate performance of our technique with the use of positron emission tomographic [11C]flumazenil scans from normal volunteers, depicting the regional cerebral distribution of benzodiazepine binding sites.
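As a minimal sketch of the summation strategy described above: once individual binding maps are warped to a common stereotaxic space, group summation images can be averaged and contrasted voxelwise. The array shapes, group sizes, and the two-sample t-test below are illustrative assumptions, not the authors' exact statistics.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Stereotaxically standardized binding maps: subjects x (x, y, z) voxels
group_a = rng.normal(1.0, 0.1, size=(10, 32, 32, 16))
group_b = rng.normal(1.0, 0.1, size=(12, 32, 32, 16))

mean_a = group_a.mean(axis=0)   # group summation (average) images
mean_b = group_b.mean(axis=0)
t, p = stats.ttest_ind(group_a, group_b, axis=0)  # voxelwise group contrast
print((p < 0.001).sum(), "voxels below an illustrative p < 0.001 threshold")
```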
APA, Harvard, Vancouver, ISO, and other styles
44

Tsao, Ming Sound, Kenneth Craddock, Guilherme Brandao, Zhaolin Xu, Wenda Greer, Yasushi Yatabe, Diana Ionescu, et al. "Canadian ALK (CALK): A pan-Canadian multicenter study to optimize and standardize ALK immunohistochemistry (IHC) and fluorescence in situ hybridization (FISH) for ALK gene rearrangements." Journal of Clinical Oncology 31, no. 15_suppl (May 20, 2013): 8096. http://dx.doi.org/10.1200/jco.2013.31.15_suppl.8096.

Full text
Abstract:
Background: ALK gene rearrangement (ALK+) has been found in 3-5% of advanced non-small cell lung cancer patients. FISH is considered the "gold standard" for identifying ALK+ tumors, but its cost-effectiveness and adoption as a screening assay have been debated. Recent reports suggest that ALK IHC may serve as an alternative screening or possibly a diagnostic method. In this context, CALK was initiated to assess the feasibility of implementing ALK IHC and/or FISH assays across Canadian hospitals. Methods: Twenty-two FISH-confirmed ALK+ and 6 ALK- tumors were used as study samples. Unstained sections and scanned images of HE-stained slides from each tumor block were distributed to participating centres. IHC protocols with the best signal-to-noise ratio using the 5A4 (Novocastra) or ALK-1 (Dako) antibodies were developed for various auto-stainers and implemented to suit the existing conditions of the participating centres. A common FISH protocol using the ALK break-apart probe (Abbott Molecular, Chicago, IL) was developed based on published reports. The H-score was used to assess IHC. FISH signals were scored in 100 tumor cells per case by 2-3 pre-trained observers. A second-round IHC study using newly distributed slides was completed by 8 centres. Results: Independent IHC scores from 12 centres and FISH scores from 11 centres were collected and analysed. The intraclass correlation coefficients (ICC) between centres for IHC and FISH were 0.84 and 0.68, respectively. Following analysis of the initial IHC results, the second-round study improved the ICC to 0.94. One of 23 tumors revealed an IHC-/FISH+ discrepancy, with FISH showing unusual signal configurations that suggested an atypical rearrangement. However, the sensitivity and specificity of FISH results across centres, using the 15-aberrant-signals cut-off, ranged from 86.7-100% and equalled 100%, respectively. Conclusions: Standardization of ALK testing by IHC and FISH across multiple centres can be achieved. IHC detected all FISH+ ALK tumors, except for one discrepant case with an atypical FISH finding of unknown clinical implication. The study was supported by a Pfizer Canada grant.
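The two readouts named in this abstract are easy to make concrete: the IHC H-score weights the percentage of tumor cells at each staining intensity (0-3, giving a 0-300 range), and the FISH call uses the 15-aberrant-signals-per-100-cells cut-off. A small sketch, not the study's scoring software:

```python
def h_score(pct_by_intensity):
    """IHC H-score: sum over intensities (0-3) of intensity x percent of
    tumor cells at that intensity; ranges from 0 to 300."""
    return sum(i * pct for i, pct in pct_by_intensity.items())

def fish_positive(aberrant_signals, cutoff=15):
    """ALK FISH call per 100 scored tumor cells, using the cut-off above."""
    return aberrant_signals >= cutoff

print(h_score({0: 10, 1: 20, 2: 40, 3: 30}))  # -> 190
print(fish_positive(22))                       # -> True
```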
APA, Harvard, Vancouver, ISO, and other styles
45

HELAMA, S., J. K. NIELSEN, M. MACIAS FAURIA, and I. VALOVIRTA. "A fistful of shells: amplifying sclerochronological and palaeoclimate signals from molluscan death assemblages." Geological Magazine 146, no. 6 (July 15, 2009): 917–30. http://dx.doi.org/10.1017/s0016756809990033.

Full text
Abstract:
A growing body of literature uses sclerochronological information to infer past climates. Sclerochronologies are based on series of skeletal growth records of molluscs that have been correctly aligned in time. Incremental series are obtained from a number of shells to assess the temporal control and improve the climate signal in the final chronology. Much of sclerochronological theory has been adopted from tree-ring science, owing to the longer tradition and more firmly established concepts of chronology construction in dendrochronology. Compared with tree-ring studies, however, sclerochronological datasets are often characterized by relatively small sample sizes. Here we evaluate how effectively a palaeoclimatic signal can be extracted from such a suite of samples. In so doing, the basic methods applied in nearly every sclerochronological study to remove non-climatic growth variability prior to palaeoclimatic interpretation are ranked by their capability to amplify the desired signal. The study is performed on six shells that constitute a bicentennial growth record from the annual shell increments of the freshwater pearl mussel. When the individual series were detrended using models set by the mean or median summary curves for ageing (that is, applying Regional Curve Standardization, RCS), instead of fitting the ageing model statistically to each series, the resulting sclerochronology displayed more low-frequency variability. Consistently, the added low-frequency variability produced higher proxy-climate correlations. These results show the particular benefit of using the RCS method to develop sclerochronologies and preserve their low-frequency variations. Moreover, calculating the ageing curve and the final chronology by median, instead of mean, amplified the low-frequency climate signal. The results help answer a growing need to better understand the behaviour of sclerochronological data. In addition, we discuss the pitfalls that may disrupt palaeoclimate signal detection in similar sclerochronological studies; these may arise from shell taphonomy, water chemistry, the time-variant character of biological growth trends, and small sample size.
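Since RCS is central to this result, a schematic version may help: all increment series are aligned by ontogenetic age, a single regional ageing curve (mean or median) is computed, and each series is divided by it. The sketch below assumes the series are already age-aligned and omits re-dating the indices to calendar years before they are averaged into a chronology.

```python
import numpy as np

def rcs_indices(series_by_age, use_median=False):
    """Regional Curve Standardization sketch. series_by_age is a 2-D array
    (shells x ages) with NaN where a shell has no increment for that age."""
    stat = np.nanmedian if use_median else np.nanmean
    regional_curve = stat(series_by_age, axis=0)  # expected growth at each age
    return series_by_age / regional_curve          # dimensionless growth indices
```

Dividing by one shared ageing curve, rather than a curve fitted per series, is what preserves the low-frequency variability the authors highlight.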
APA, Harvard, Vancouver, ISO, and other styles
46

Bagić Babac, Marina, and Marijan Kunštić. "Mapping SDL Specification Fundamentals to Core SDL Ontology." Journal of Communications Software and Systems 6, no. 1 (March 21, 2010): 18. http://dx.doi.org/10.24138/jcomss.v6i1.195.

Full text
Abstract:
This paper contributes to the effort of Semantic Web ontology development. We have developed a core ontology for the Specification and Description Language (SDL), an object-oriented formal language defined by the International Telecommunication Union Telecommunication Standardization Sector (ITU-T) as Recommendation Z.100. The language is intended for the specification of complex, event-driven, real-time, and interactive applications involving many concurrent activities that communicate using discrete signals. Using the SDL formal model for system specification, we bridge the gap between the ideas in our minds and the actual implementation of the system. Being visually appealing, SDL provides a simple tool for communication between software developers as well as with non-experts without advanced engineering skills. In this paper we propose an ontology for the basic SDL system and process elements. We also propose a formal framework, the SDL Markup Language, as a medium for translating an SDL model into the SDL ontology.
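To make the mapping concrete, a few core SDL concepts can be declared as OWL classes. The snippet below uses rdflib; the class names follow SDL terminology, but the namespace URI and the 'contains' property are invented for illustration and are not the paper's ontology.

```python
from rdflib import Graph, Namespace, RDF, OWL

SDL = Namespace("http://example.org/sdl#")  # invented URI for illustration
g = Graph()
g.bind("sdl", SDL)
for cls in ("System", "Block", "Process", "Signal"):
    g.add((SDL[cls], RDF.type, OWL.Class))        # core SDL concepts as classes
g.add((SDL.contains, RDF.type, OWL.ObjectProperty))  # e.g. a Block contains Processes
print(g.serialize(format="turtle"))
```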
APA, Harvard, Vancouver, ISO, and other styles
47

Lee, Eun-Seok, and Byeong-Seok Shin. "A Flexible Input Mapping System for Next-Generation Virtual Reality Controllers." Electronics 10, no. 17 (September 3, 2021): 2149. http://dx.doi.org/10.3390/electronics10172149.

Full text
Abstract:
This paper proposes an input mapping system that can transform the input signals of next-generation virtual reality devices to suit existing virtual reality content. Interactions in existing virtual reality content are developed against the input values of standardized commercial haptic controllers, which discourages experimenting with new input ideas in content. Controllers that are not compatible with existing virtual reality content, however, carry significant risk on the path to commercialization. The proposed system allows content developers to map the streams of new input devices to standard input events for use in existing content. This allows code from existing content to be reused even with new devices, effectively reducing development effort. Further, it is possible to define a new input method from the perspective of the content instead of the sensing results of the input device, allowing for content-specific standardization in content-oriented industries such as games and virtual reality.
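The core idea lends itself to a compact sketch: a table that binds device-specific signal names to the standard events existing content consumes. All names, ranges, and the transform step below are invented for illustration.

```python
from typing import Callable, Dict, Optional, Tuple

class InputMapper:
    """Maps raw signals from a novel VR device onto standard content events."""

    def __init__(self) -> None:
        self._bindings: Dict[str, Callable[[float], Tuple[str, object]]] = {}

    def bind(self, raw_signal: str, to_event: str,
             transform: Callable[[float], object] = lambda v: v) -> None:
        """Bind a device-specific signal name to a standard input event."""
        self._bindings[raw_signal] = lambda v: (to_event, transform(v))

    def dispatch(self, raw_signal: str, value: float) -> Optional[Tuple[str, object]]:
        handler = self._bindings.get(raw_signal)
        return handler(value) if handler else None  # unmapped signals ignored

mapper = InputMapper()
mapper.bind("glove.flex.index", "TRIGGER_PRESS", lambda v: v > 0.8)
print(mapper.dispatch("glove.flex.index", 0.93))  # ('TRIGGER_PRESS', True)
```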
APA, Harvard, Vancouver, ISO, and other styles
48

Phelps, Jolene, Amir Sanati-Nezhad, Mark Ungrin, Neil A. Duncan, and Arindom Sen. "Bioprocessing of Mesenchymal Stem Cells and Their Derivatives: Toward Cell-Free Therapeutics." Stem Cells International 2018 (September 12, 2018): 1–23. http://dx.doi.org/10.1155/2018/9415367.

Full text
Abstract:
Mesenchymal stem cells (MSCs) have attracted tremendous research interest due to their ability to repair tissues and reduce inflammation when implanted into a damaged or diseased site. These therapeutic effects have been largely attributed to the collection of biomolecules they secrete (i.e., their secretome). Recent studies have provided evidence that similar effects may be produced by utilizing only the secretome fraction containing extracellular vesicles (EVs). EVs are cell-derived, membrane-bound vesicles that contain various biomolecules. Due to their small size and relative mobility, they provide a stable mechanism to deliver biomolecules (i.e., biological signals) throughout an organism. The use of the MSC secretome, or its components, has advantages over the implantation of the MSCs themselves: (i) signals can be bioengineered and scaled to specific dosages, and (ii) the nonliving nature of the secretome enables it to be efficiently stored and transported. However, since the composition and therapeutic benefit of the secretome can be influenced by cell source, culture conditions, isolation methods, and storage conditions, there is a need for standardization of bioprocessing parameters. This review focuses on key parameters within the MSC culture environment that affect the nature and functionality of the secretome. This information is pertinent to the development of bioprocesses aimed at scaling up the production of secretome-derived products for their use as therapeutics.
APA, Harvard, Vancouver, ISO, and other styles
49

Sattler, Lisa-Marie, Hanna A. Schniewind, Waldemar B. Minich, Christoph W. Haudum, Petra Niklowitz, Julia Münzker, Gábor L. Kovács, Thomas Reinehr, Barbara Obermayer-Pietsch, and Lutz Schomburg. "Natural autoantibodies to the gonadotropin-releasing hormone receptor in polycystic ovarian syndrome." PLOS ONE 16, no. 4 (April 2, 2021): e0249639. http://dx.doi.org/10.1371/journal.pone.0249639.

Full text
Abstract:
Context: Polycystic ovarian syndrome (PCOS) is a complex disease with different subtypes and unclear etiology. Among the frequent comorbidities are autoimmune diseases, suggesting that autoantibodies (aAb) may be involved in PCOS pathogenesis. Objective: As the gonadal axis is often dysregulated, we tested the hypothesis that aAb to the gonadotropin-releasing hormone receptor (GnRH-R) are of diagnostic value in PCOS. Design: An in vitro assay for quantifying aAb to the GnRH-R (GnRH-R-aAb) was established by using a recombinant fusion protein of full-length human GnRH-R and firefly luciferase. A commercial rabbit antiserum to human GnRH-R was used for standardization. Serum samples of control subjects and different cohorts of European PCOS patients (n = 1051) were analyzed. Results: The novel GnRH-R-aAb assay was sensitive, and signals were linear on dilution when tested with the commercial GnRH-R antiserum. Natural GnRH-R-aAb were detected in one control (0.25%) and two PCOS samples (0.31%), and 12 samples were slightly above the threshold of positivity. The identification of samples with positive GnRH-R-aAb was reproducible, and the signals showed no matrix interferences. Conclusion: Natural GnRH-R-aAb are present in a very small fraction of adult control and PCOS subjects of European descent. Our results do not support the hypothesis that the GnRH-R constitutes a relevant autoantigen in PCOS.
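The "threshold of positivity" mentioned in the results is not defined in the abstract; one common convention in autoantibody assays is the mean plus a multiple of the standard deviation of the control cohort. The sketch below uses that assumption with synthetic luminescence ratios; it is not the study's definition.

```python
import numpy as np

def positivity_threshold(control_signals, k=3.0):
    """Mean + k*SD of control-cohort signals -- a common, assumed convention
    for calling autoantibody-positive samples."""
    x = np.asarray(control_signals, float)
    return x.mean() + k * x.std(ddof=1)

rng = np.random.default_rng(0)
controls = rng.normal(1.0, 0.15, 500)  # synthetic relative luminescence ratios
print(round(positivity_threshold(controls), 2))
```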
APA, Harvard, Vancouver, ISO, and other styles
50

Luo, Zhizeng, Xianju Lu, and Xugang Xi. "EEG Feature Extraction Based on a Bilevel Network: Minimum Spanning Tree and Regional Network." Electronics 9, no. 2 (January 21, 2020): 203. http://dx.doi.org/10.3390/electronics9020203.

Full text
Abstract:
Feature extraction is essential for classifying different motor imagery (MI) tasks in a brain–computer interface (BCI). Although brain network analysis has been widely studied in the BCI field, such methods are limited by differences in network size, density, and standardization. To address this issue and improve classification accuracy, we propose a novel method that extracts hybrid brain-function features based on a bilevel network. A minimum spanning tree (MST) built on electroencephalogram (EEG) signal nodes under different MIs is constructed as the first network layer to address the global network connectivity problem. In addition, a regional network for different movement patterns is constructed as the second network layer to determine the network characteristics, consistent with the neurophysiological correspondence between limb movement patterns and the cerebral cortex. Applying the MST to the classification of MI EEG signals gives the bilevel network better interpretability. A feature vector is then formed by combining the fundamental MST features with the directional features of the regional network. Our method is validated using BCI Competition IV Dataset I. Experimental results verify the feasibility of the bilevel network framework. Furthermore, the average classification performance of the proposed method reaches 89.50%, higher than that of competing methods, indicating that the bilevel network is effective for MI classification.
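The first network layer described above can be sketched directly: derive a channel-by-channel association matrix from the EEG, convert it to a distance, and extract the minimum spanning tree. Using 1 - |correlation| as the edge distance is an assumption for illustration, not necessarily the paper's choice.

```python
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree

rng = np.random.default_rng(1)
eeg = rng.normal(size=(59, 1000))        # 59 channels x 1000 samples (toy data)
corr = np.corrcoef(eeg)                  # channel association matrix
dist = 1.0 - np.abs(corr)                # similarity -> distance (assumption)
np.fill_diagonal(dist, 0.0)
mst = minimum_spanning_tree(dist)        # sparse (n x n) matrix of tree edges
degree = np.asarray((mst + mst.T).astype(bool).sum(axis=0)).ravel()
print(degree[:10])                       # node degree, a typical MST feature
```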
APA, Harvard, Vancouver, ISO, and other styles
