A selection of scholarly literature on the topic "Standardization of signals"

Cite a source in APA, MLA, Chicago, Harvard, and other citation styles

Select a source type:

Consult the lists of current articles, books, theses, reports, and other scholarly sources on the topic "Standardization of signals".

Next to every source in the list of references there is an "Add to bibliography" button. Click it, and the bibliographic reference for the chosen work will be formatted automatically in the required citation style (APA, MLA, Harvard, Chicago, Vancouver, etc.).

You can also download the full text of the scholarly publication as a PDF and read an online annotation of the work, if the relevant parameters are available in the metadata.

Journal articles on the topic "Standardization of signals"

1

Stenger, L. “Digital coding of television signals—CCIR activities for standardization”. Signal Processing: Image Communication 1, no. 1 (June 1989): 29–43. http://dx.doi.org/10.1016/0923-5965(89)90018-0.

2

KUBOTA, H. “Standardization for Worning Signals: Report of the ISO Meeting in London”. JAPANES JOURNAL OF MEDICAL INSTRUMENTATION 55, no. 8 (August 1, 1985): 404–8. http://dx.doi.org/10.4286/ikakikaigaku.55.8_404.

3

Krigsholm and Riekkinen. “Applying Text Mining for Identifying Future Signals of Land Administration”. Land 8, no. 12 (November 27, 2019): 181. http://dx.doi.org/10.3390/land8120181.

Annotation:
Companies and governmental agencies are increasingly seeking ways to explore emerging trends and issues that have the potential to shape their future operational environments. This paper exploits text mining techniques for investigating future signals of the land administration sector. After a careful review of previous literature on the detection of future signals through text mining, we propose the use of topic models to enhance the interpretation of future signals. Findings of the study highlight the large spectrum of issues related to land interests and their recording, as nineteen future signal topics ranging from climate change mitigation and the use of satellite imagery for data collection to flexible standardization and participatory land consolidations are identified. Our analysis also shows that distinguishing weak signals from latent, well-known, and strong signals is challenging when using a predominantly automated process. Overall, this study summarizes the current discourses of the land administration domain and gives an indication of which topics are gaining momentum at present.
4

Zhang, Xianliang, Junxia Li, Xiaobo Liu, and Zhenju Chen. “Improved EEMD-based standardization method for developing long tree-ring chronologies”. Journal of Forestry Research 31, no. 6 (June 27, 2019): 2217–24. http://dx.doi.org/10.1007/s11676-019-01002-y.

Annotation:
Abstract: Long tree-ring chronologies can be developed by overlapping data from living trees with data from fossil trees through cross-dating. However, low-frequency climate signals are lost when standardizing tree-ring series due to the “segment length curse”. To alleviate the segment length curse and thus improve the standardization method for developing long tree-ring chronologies, we first calculated a mean value for all the tree-ring series by overlapping them. The growth trend of the mean tree-ring width (i.e., the cumulative average growth trend of all the series) was determined using ensemble empirical mode decomposition. Then the chronology was developed by dividing the mean value by its growth trend. Our improved method alleviated the problem of trend distortion. Long-term signals were better preserved using the improved method than with previous detrending methods. The chronologies developed using the improved method were better correlated with climate than those developed using conservative methods. The improved standardization method alleviates trend distortion and retains more of the low-frequency climate signals.
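The standardization step described in this abstract (dividing the mean ring-width series by its estimated growth trend) can be sketched as follows. This is a minimal illustration in which a centered moving average stands in for the paper's ensemble empirical mode decomposition, and the ring-width values are invented:

```python
# Sketch of tree-ring standardization: a chronology index is the mean
# ring-width series divided by its estimated growth trend. A centered
# moving average stands in here for the EEMD trend used in the paper.

def moving_average_trend(series, window=5):
    """Centered moving average as a simple growth-trend estimate."""
    n = len(series)
    half = window // 2
    trend = []
    for i in range(n):
        lo, hi = max(0, i - half), min(n, i + half + 1)
        trend.append(sum(series[lo:hi]) / (hi - lo))
    return trend

def standardize(mean_ring_width, window=5):
    """Divide the mean series by its trend to obtain a dimensionless index."""
    trend = moving_average_trend(mean_ring_width, window)
    return [w / t for w, t in zip(mean_ring_width, trend)]

widths = [1.8, 1.6, 1.5, 1.3, 1.2, 1.1, 1.0, 0.9]  # declining growth trend
index = standardize(widths)
print([round(v, 3) for v in index])
```

Because the trend is divided out, the resulting index fluctuates around 1 and is comparable across trees of different absolute growth rates.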
5

Miliszkiewicz, Natalia, Stanisław Walas, and Anna Tobiasz. “Current approaches to calibration of LA-ICP-MS analysis”. Journal of Analytical Atomic Spectrometry 30, no. 2 (2015): 327–38. http://dx.doi.org/10.1039/c4ja00325j.

Annotation:
For solid sample quantitative analysis by laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS), the main analytical problems are adequate standard preparation and signal standardization.
6

Němcová, Andrea, Radovan Smíšek, Lucie Maršánová, Lukáš Smital, and Martin Vítek. “A Comparative Analysis of Methods for Evaluation of ECG Signal Quality after Compression”. BioMed Research International 2018 (July 18, 2018): 1–26. http://dx.doi.org/10.1155/2018/1868519.

Annotation:
The assessment of ECG signal quality after compression is an essential part of the compression process. Compression facilitates the signal archiving, speeds up signal transmission, and reduces the energy consumption. Conversely, lossy compression distorts the signals. Therefore, it is necessary to express the compression performance through both compression efficiency and signal quality. This paper provides an overview of objective algorithms for the assessment of both ECG signal quality after compression and compression efficiency. In this area, there is a lack of standardization, and there is no extensive review as such. 40 methods were tested in terms of their suitability for quality assessment. For this purpose, the whole CSE database was used. The tested signals were compressed using an algorithm based on SPIHT with varying efficiency. As a reference, compressed signals were manually assessed by two experts and classified into three quality groups. Owing to the experts’ classification, we determined corresponding ranges of selected quality evaluation methods’ values. The suitability of the methods for quality assessment was evaluated based on five criteria. For the assessment of ECG signal quality after compression, we recommend using a combination of these methods: PSim SDNN, QS, SNR1, MSE, PRDN1, MAX, STDERR, and WEDD SWT.
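Several of the quality measures named in this abstract have standard textbook definitions that can be sketched directly. The paper's specific variants (SNR1, PRDN1, WEDD SWT, etc.) may differ in normalization details, so the following is a general illustration of MSE, normalized PRD, and an SNR in decibels, not the authors' code; the signal values are invented:

```python
import math

# Illustrative distortion measures between an original and a
# reconstructed (decompressed) ECG signal.

def mse(orig, recon):
    """Mean squared error."""
    return sum((o - r) ** 2 for o, r in zip(orig, recon)) / len(orig)

def prdn(orig, recon):
    """Percentage RMS difference, normalized by the mean-removed signal."""
    mean = sum(orig) / len(orig)
    num = sum((o - r) ** 2 for o, r in zip(orig, recon))
    den = sum((o - mean) ** 2 for o in orig)
    return 100.0 * math.sqrt(num / den)

def snr_db(orig, recon):
    """Signal-to-noise ratio of the reconstruction, in decibels."""
    mean = sum(orig) / len(orig)
    sig = sum((o - mean) ** 2 for o in orig)
    err = sum((o - r) ** 2 for o, r in zip(orig, recon))
    return 10.0 * math.log10(sig / err)

orig = [0.0, 0.2, 1.0, 0.3, 0.0, -0.1, 0.0, 0.1]
recon = [0.0, 0.21, 0.97, 0.31, 0.01, -0.1, 0.0, 0.1]
print(round(mse(orig, recon), 6), round(prdn(orig, recon), 2),
      round(snr_db(orig, recon), 1))
```

Lower MSE/PRDN and higher SNR indicate a less distorted reconstruction, which is why such measures can be combined into a compression quality score.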
7

Hoffman, Roy E. “Standardization of chemical shifts of TMS and solvent signals in NMR solvents”. Magnetic Resonance in Chemistry 44, no. 6 (2006): 606–16. http://dx.doi.org/10.1002/mrc.1801.

8

Damani, Devanshi N., Divaakar Siva Baala Sundaram, Shivam Damani, Anoushka Kapoor, Adelaide M. Arruda Olson, and Shivaram P. Arunachalam. “INVESTIGATION OF SYNCHRONIZED ACQUISITION OF ELECTROCARDIOGRAM AND PHONOCARDIOGRAM SIGNALS TOWARDS ELECTROMECHANICAL PROFILING OF THE HEART”. Biomedical Sciences Instrumentation 57, no. 2 (April 1, 2021): 305–12. http://dx.doi.org/10.34107/yhpn9422.04305.

Annotation:
Cardiac diseases are the leading cause of death in the world. Electrocardiogram (ECG) and phonocardiogram (PCG) signals play a significant role in the diagnosis of various cardiac diseases. Simultaneous acquisition of ECG and PCG signals can open new avenues of signal processing approaches for electromechanical profiling of the heart. However, there are no standard approaches to ensure high-fidelity synchronous data acquisition to enable the development of such novel technologies. In this work, the authors report results on various data capture positions that could lead to standardization of simultaneous ECG and PCG data collection. The presence of lung sounds and variations in posture, depth, and frequency of breathing can lead to differences in the recorded ECG-PCG signals. This necessitates a standard approach to recording and interpreting the collected data. The authors recorded ECG-PCG simultaneously in six healthy subjects using a digital stethoscope to understand the differences in signal quality in various recording positions (prone, supine, bending, semi-recumbent, standing, left lateral, and sitting) under normal and deep breathing conditions. The collected digitized signals were processed offline for signal quality (SNR) using custom MATLAB software. The results indicate minimal differences in signal quality across the different recording positions. Validation of this technique with a larger dataset is required. Future work will investigate changes in characteristic ECG and PCG features due to position and breathing patterns.
9

Dong, Li, Lingling Zhao, Yufan Zhang, Xue Yu, Fali Li, Jianfu Li, Yongxiu Lai, Tiejun Liu, and Dezhong Yao. “Reference Electrode Standardization Interpolation Technique (RESIT): A Novel Interpolation Method for Scalp EEG”. Brain Topography 34, no. 4 (May 5, 2021): 403–14. http://dx.doi.org/10.1007/s10548-021-00844-2.

Annotation:
Abstract: “Bad channels” are common phenomena during scalp electroencephalography (EEG) recording that arise due to various technique-related reasons, and reconstructing signals from bad channels is an inevitable choice in EEG processing. However, current interpolation methods are all based on purely mathematical interpolation theory, ignoring the neurophysiological basis of the EEG signals, and their performance needs to be further improved, especially when there are many scattered or adjacent bad channels. Therefore, a new interpolation method, named the reference electrode standardization interpolation technique (RESIT), was developed for interpolating scalp EEG channels. Resting-state and event-related EEG datasets were used to investigate the performance of the RESIT. The main results showed that (1) assuming 10% bad channels, RESIT can reconstruct the bad channels well; (2) as the percentage of bad channels increased (from 2% to 85%), the absolute and relative errors between the true and RESIT-reconstructed signals generally increased, and the correlations between the true and RESIT signals decreased; (3) for a range of bad channel percentages (2% ~ 85%), the RESIT had lower absolute error (approximately 2.39% ~ 33.5% reduction), lower relative errors (approximately 1.3% ~ 35.7% reduction) and higher correlations (approximately 2% ~ 690% increase) than traditional interpolation methods, including neighbor interpolation (NI) and spherical spline interpolation (SSI). In addition, the RESIT was integrated into the EEG preprocessing pipeline on the WeBrain cloud platform (https://webrain.uestc.edu.cn/). These results suggest that the RESIT is a promising interpolation method for both separate and simultaneous EEG preprocessing that benefits further EEG analysis, including event-related potential (ERP) analysis, EEG network analysis, and strict group-level statistics.
10

Garrido Frenich, A., D. Picón Zamora, J. L. Martínez Vidal, and M. Martínez Galera. “Standardization of SPE signals in multicomponent analysis of three benzimidazolic pesticides by spectrofluorimetry”. Analytica Chimica Acta 477, no. 2 (February 2003): 211–22. http://dx.doi.org/10.1016/s0003-2670(02)01423-x.


Dissertations on the topic "Standardization of signals"

1

Bunn, Andrew G., Timothy J. Sharac, and Lisa J. Graumlich. “Using a Simulation Model to Compare Methods of Tree-Ring Detrending and to Investigate the Detectability of Low-Frequency Signals”. Tree-Ring Society, 2004. http://hdl.handle.net/10150/262635.

Annotation:
We use a simulation model to generate tree-ring like data with systematic growth forcings and subject it to two methods of standardization: Regional Curve Standardization (RCS) and Negative Exponential Curve Standardization (NECS). The coherency between very low frequency forcings (hundreds of years) and the chronologies was higher when RCS was used to detrend the component series. There was no difference between standardization methods at decadal or annual time scales. We found that the detectability of systematic forcings was heavily dependent on amplitude and wavelength of the input signal as well as the number of trees simulated. These results imply that for very long tree-ring chronologies where the analyst is interested in low-frequency variability, RCS is a better method for detrending series if the requirements for that method can be met. However, in the majority of situations NECS is an acceptable detrending method. Most critically, we found that multi-centennial signals can be recovered using both methods.
2

Terschová, Vanda. “Korelace charakteristických signálů laserem buzeného plazmatu”. Master's thesis, Vysoké učení technické v Brně, Fakulta strojního inženýrství, 2021. http://www.nusl.cz/ntk/nusl-444962.

Annotation:
Laser-induced breakdown spectroscopy (LIBS) is a fast analytical method, but it can also be complicated. This spectroscopic method provides qualitative and quantitative analysis of a sample. The analysis is carried out by capturing the emission radiation of the generated plasma. The accuracy and stability of the measurement are affected by several parameters, such as the stability of the laser and the physical and chemical properties of the sample, including its homogeneity, which cannot always be eliminated. For this reason, other methods are being added to the LIBS experiment that could improve the quality of the analysis. This diploma thesis focuses on a review of the literature on the standardization of the laser-induced plasma signal and the possibility of using an acoustic signal for this purpose. To this end, it is necessary to perform basic experiments and to verify whether the acoustic signal correlates with the emission signal. If these signals correlate, it would be possible to use the acoustic signal for standardization of the LIBS data, which would improve the accuracy of the analysis. The theoretical part first summarizes other spectroscopic methods and then describes the LIBS method, possible approaches to the analysis, the standardization of emission signals, and its treatment in the literature. The experimental part is aimed at the study of the acoustic signal performed in the framework of this work. First, results of the basic measurements on steel and brass samples are introduced; these results were important for optimizing the experiment. The following section shows the results obtained from measurements of the acoustic signal on samples with the same chemical composition but different hardness. Finally, the correlation between the acoustic and emission signals is discussed.
3

BRANCACCIO, FRANCO. “Metodologia de aquisição de dados e análise por software, para sistemas de coincidências 4πβ-γ e sua aplicação na padronização de radionuclídeos, com ênfase em transições metaestáveis”. Doctoral thesis, Instituto de Pesquisas Energeticas e Nucleares - IPEN-CNEN/SP, 2013. http://repositorio.ipen.br:8080/xmlui/handle/123456789/10559.

4

Schiffer, Štěpán. “Způsoby korekce a standardizace signálu v laserové spektroskopii”. Master's thesis, Vysoké učení technické v Brně, Fakulta strojního inženýrství, 2018. http://www.nusl.cz/ntk/nusl-392852.

Annotation:
The subject of this diploma thesis is the study of the influence of sample position on the results of a laser spectroscopy experiment. The aim is to design an appropriate way to standardize signal obtained under different conditions, with respect to its applicability to stand-off analysis. The theoretical part of the thesis describes the basics of the LIBS method together with the issues of stand-off experiments, as well as both basic and advanced approaches to the processing and correction of the obtained spectra. An experiment is also designed here to analyze the influence of sample inclination and distance on the detected signal. The choice of appropriate ways to correct the signal follows, and their applicability and efficiency are then tested experimentally.
5

Owais, Mohammad Hamza. „Development of Intelligent Systems to Optimize Training and Real-world Performance Amongst Health Care Professionals“. University of Toledo / OhioLINK, 2019. http://rave.ohiolink.edu/etdc/view?acc_num=toledo1556914525013002.

6

Chen, Wei-Sung, and 陳偉菘. “The standardization for the time of measuring the J peak value of the Ballistocardiogram signals”. Thesis, 2016. http://ndltd.ncl.edu.tw/handle/75496363979027936257.

Annotation:
Master's thesis, National Chung Hsing University (國立中興大學), Department of Applied Mathematics (應用數學系所), ROC year 104.
Cardiovascular disease has long threatened human life and shapes how people will live in the future. Cardiovascular events such as stroke and myocardial infarction usually occur suddenly. Therefore, it is necessary to monitor the current heart rate conveniently and effectively at any time. To achieve heart rate monitoring, the electrocardiogram (ECG) and ballistocardiogram (BCG) are available. Measurement of ECG and BCG can continuously detect changes in heart rate. Moreover, the peak times of the R peak from the ECG and the J peak from the BCG can be used to calculate the RJ interval. This interval is related to blood pressure, so it can be used for noninvasive blood pressure estimation. This report focuses on the selection of filter frequency bands for the BCG. There were 40 participants, 19 males and 20 females. The average age was 35.5±8.8 years, the average height was 165.3±8.0 centimeters, and the average weight was 66.7±12.9 kilograms. ECG and BCG were recorded simultaneously, and the signals were then analyzed. The results show that the bandwidth selection across all participants was 1.25–22.5 Hz. Tested by gender, the bandwidth selection for males was 2.5–25 Hz, while for females it was 1–22.5 Hz.

Books on the topic "Standardization of signals"

1

Yuan, Yifei. LTE-Advanced Relay Technology and Standardization. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013.

2

Yuan, Yifei. LTE-Advanced Relay Technology and Standardization. Springer, 2016.

3

Yuan, Yifei. LTE-Advanced Relay Technology and Standardization. Springer, 2012.

4

Nardini, Luisa. In the Quest of Gallican Remnants in Gregorian Manuscripts. Edited by Patricia Hall. Oxford University Press, 2016. http://dx.doi.org/10.1093/oxfordhb/9780199733163.013.11.

Annotation:
This chapter examines Gallican components in liturgical chants copied in Gregorian manuscripts by focusing on a group of chants for the masses for the Holy Cross. These chants, copied in manuscripts from Aquitaine, bear signs of more remote pre-Gregorian roots that can be recognized in some textual and musical features, in aspects related to their liturgical collocation, and in the theological arguments they contain. Before undertaking an analysis of specific examples, the chapter first considers the extent to which royal decrees influenced chant practices in Carolingian and post-Carolingian Europe by contextualizing the process of standardization of chant within the projects of cultural reform of the Carolingians and the Merovingians. It then discusses issues of persistence and change in the transmission of liturgical repertories as well as the role of music censorship in the practice of liturgical chant.

Book chapters on the topic "Standardization of signals"

1

Murakami, Tokumichi. “The Development and Standardization of Ultra High Definition Video Technology”. In Signals and Communication Technology, 81–135. Berlin, Heidelberg: Springer Berlin Heidelberg, 2010. http://dx.doi.org/10.1007/978-3-642-12802-8_4.

2

Viti, Francesco, Serge P. Hoogendoorn, Henk J. van Zuylen, Isabel R. Wilmink, and Bart van Arem. “Microscopic Data for Analyzing Driving Behavior at Traffic Signals”. In Traffic Data Collection and its Standardization, 171–91. New York, NY: Springer New York, 2010. http://dx.doi.org/10.1007/978-1-4419-6070-2_12.

3

Waiczies, Sonia, Christian Prinz, Ludger Starke, Jason M. Millward, Paula Ramos Delgado, Jens Rosenberg, Marc Nazaré, Helmar Waiczies, Andreas Pohlmann, and Thoralf Niendorf. “Functional Imaging Using Fluorine (19F) MR Methods: Basic Concepts”. In Methods in Molecular Biology, 279–99. New York, NY: Springer US, 2021. http://dx.doi.org/10.1007/978-1-0716-0978-1_17.

Annotation:
Abstract: Kidney-associated pathologies would greatly benefit from noninvasive and robust methods that can objectively quantify changes in renal function. In the past years there has been a growing incentive to develop new applications for fluorine (19F) MRI in biomedical research to study functional changes during disease states. 19F MRI represents an instrumental tool for the quantification of exogenous 19F substances in vivo. One of the major benefits of 19F MRI is that fluorine in its organic form is absent in eukaryotic cells. Therefore, the introduction of exogenous 19F signals in vivo will yield background-free images, thus providing highly selective detection with absolute specificity in vivo. Here we introduce the concept of 19F MRI, describe existing challenges, especially those pertaining to signal sensitivity, and give an overview of preclinical applications to illustrate the utility and applicability of this technique for measuring renal function in animal models. This chapter is based upon work from the COST Action PARENCHIMA, a community-driven network funded by the European Cooperation in Science and Technology (COST) program of the European Union, which aims to improve the reproducibility and standardization of renal MRI biomarkers. This introduction chapter is complemented by two separate chapters describing the experimental procedure and data analysis.
4

Iglesias, Juan Eugenio, Ivo Dinov, Jaskaran Singh, Gregory Tong, and Zhuowen Tu. “Synthetic MRI Signal Standardization: Application to Multi-atlas Analysis”. In Medical Image Computing and Computer-Assisted Intervention – MICCAI 2010, 81–88. Berlin, Heidelberg: Springer Berlin Heidelberg, 2010. http://dx.doi.org/10.1007/978-3-642-15711-0_11.

5

Grist, James T., Esben Søvsø Szocska Hansen, Frank G. Zöllner, and Christoffer Laustsen. “Analysis Protocol for Renal Sodium (23Na) MR Imaging”. In Methods in Molecular Biology, 689–96. New York, NY: Springer US, 2021. http://dx.doi.org/10.1007/978-1-0716-0978-1_41.

Annotation:
Abstract: The signal acquired in sodium (23Na) MR imaging is proportional to the concentration of sodium in a voxel, and it is possible to convert between the two using external calibration phantoms. Postprocessing, and subsequent analysis, of sodium renal images is a simple task that can be performed with readily available software. Here we describe the process of conversion between sodium signal and concentration, estimation of the corticomedullary sodium gradient and the procedure used for quadrupolar relaxation analysis. This chapter is based upon work from the COST Action PARENCHIMA, a community-driven network funded by the European Cooperation in Science and Technology (COST) program of the European Union, which aims to improve the reproducibility and standardization of renal MRI biomarkers. This analysis protocol chapter is complemented by two separate chapters describing the basic concept and experimental procedure.
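The signal-to-concentration conversion described in this abstract relies on the 23Na signal being proportional to concentration, so phantoms of known concentration yield a calibration slope that is then applied to tissue voxels. A minimal sketch follows; the phantom values are invented for illustration and are not from the chapter:

```python
# Proportional calibration: signal = k * concentration, so fit a
# least-squares slope through the origin from phantom measurements,
# then apply it to convert tissue signal into concentration.

def calibration_slope(phantom_signals, phantom_concs):
    """Least-squares slope through the origin mapping signal -> concentration."""
    num = sum(s * c for s, c in zip(phantom_signals, phantom_concs))
    den = sum(s * s for s in phantom_signals)
    return num / den

def signal_to_concentration(signal, slope):
    return signal * slope

phantom_signals = [120.0, 240.0, 480.0]  # arbitrary units (illustrative)
phantom_concs = [25.0, 50.0, 100.0]      # mmol/L (illustrative)
slope = calibration_slope(phantom_signals, phantom_concs)
print(signal_to_concentration(300.0, slope))  # 62.5 mmol/L for this data
```

With several phantoms, the through-origin fit averages out measurement noise rather than relying on a single reference vial.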
6

Starke, Ludger, Thoralf Niendorf, and Sonia Waiczies. “Data Preparation Protocol for Low Signal-to-Noise Ratio Fluorine-19 MRI”. In Methods in Molecular Biology, 711–22. New York, NY: Springer US, 2021. http://dx.doi.org/10.1007/978-1-0716-0978-1_43.

Annotation:
Abstract: Fluorine-19 MRI shows great promise for a wide range of applications including renal imaging, yet the typically low signal-to-noise ratios and sparse signal distribution necessitate thorough data preparation. This chapter describes a general data preparation workflow for fluorine MRI experiments. The main processing steps are: (1) estimation of noise level, (2) correction of noise-induced bias and (3) background subtraction. The protocol is supplemented by an example script and toolbox available online. This chapter is based upon work from the COST Action PARENCHIMA, a community-driven network funded by the European Cooperation in Science and Technology (COST) program of the European Union, which aims to improve the reproducibility and standardization of renal MRI biomarkers. This analysis protocol chapter is complemented by two separate chapters describing the basic concept and experimental procedure.
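The three preparation steps listed in this abstract can be sketched as follows. This is not the chapter's toolbox: the bias correction uses the common first-order magnitude-image approximation S_corr = sqrt(max(S² − 2σ², 0)), the noise estimate assumes a Rayleigh-distributed background, and the threshold factor and all voxel values are illustrative assumptions:

```python
import math
import statistics

# Sketch of a low-SNR magnitude-image preparation pipeline:
# (1) noise estimation, (2) noise-bias correction, (3) background subtraction.

def estimate_noise_sigma(background_voxels):
    """Estimate sigma from a signal-free region of a magnitude image,
    where the Rayleigh-distributed mean equals sigma * sqrt(pi/2)."""
    return statistics.mean(background_voxels) / math.sqrt(math.pi / 2)

def correct_bias(voxel, sigma):
    """First-order correction of the Rician noise floor."""
    return math.sqrt(max(voxel ** 2 - 2 * sigma ** 2, 0.0))

def prepare(image, background_voxels, threshold_factor=3.0):
    sigma = estimate_noise_sigma(background_voxels)
    corrected = [correct_bias(v, sigma) for v in image]
    # Background subtraction: zero out voxels below a sigma-based threshold.
    return [v if v > threshold_factor * sigma else 0.0 for v in corrected]

background = [1.0, 1.3, 1.1, 1.2, 1.25, 1.15]  # signal-free voxels
image = [1.2, 10.0, 12.5, 1.1, 9.0]            # mixed background and signal
print(prepare(image, background))
```

Voxels consistent with pure noise are suppressed, while genuine signal is slightly reduced toward its bias-free magnitude.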
7

Starke, Ludger, Karsten Tabelow, Thoralf Niendorf, and Andreas Pohlmann. “Denoising for Improved Parametric MRI of the Kidney: Protocol for Nonlocal Means Filtering”. In Methods in Molecular Biology, 565–76. New York, NY: Springer US, 2021. http://dx.doi.org/10.1007/978-1-0716-0978-1_34.

Annotation:
Abstract: In order to tackle the challenges caused by the variability in estimated MRI parameters (e.g., T2* and T2) due to low SNR, a number of strategies can be followed. One approach is postprocessing of the acquired data with a filter. The basic idea is that MR images possess a local spatial structure that is characterized by equal, or at least similar, noise-free signal values in the vicinity of a location. Local averaging of the signal then reduces its noise component. In contrast, nonlocal means filtering defines the weights for averaging not only within the local vicinity, but compares the image intensities between all voxels to define “nonlocal” weights. Furthermore, it generally compares not only single-voxel intensities but small spatial patches of the data, to better account for extended similar patterns. Here we describe how to use an open-source NLM filter tool to denoise 2D MR image series of the kidney used for parametric mapping of the relaxation times T2* and T2. This chapter is based upon work from the COST Action PARENCHIMA, a community-driven network funded by the European Cooperation in Science and Technology (COST) program of the European Union, which aims to improve the reproducibility and standardization of renal MRI biomarkers.
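The nonlocal weighting idea described in this abstract can be sketched in one dimension: every sample is averaged over all samples, weighted by patch similarity rather than by spatial distance. This is an illustrative toy implementation, not the chapter's open-source tool; the patch radius, smoothing parameter h, and signal values are all arbitrary assumptions:

```python
import math

# Toy 1-D nonlocal means: weights come from the similarity of small
# patches around each pair of samples, not from spatial proximity.

def patch(signal, i, radius):
    """Patch around index i, with edge indices clamped into range."""
    return [signal[max(0, min(len(signal) - 1, i + k))]
            for k in range(-radius, radius + 1)]

def nlm_1d(signal, patch_radius=1, h=0.5):
    out = []
    for i in range(len(signal)):
        pi = patch(signal, i, patch_radius)
        weights, total = [], 0.0
        for j in range(len(signal)):
            pj = patch(signal, j, patch_radius)
            dist2 = sum((a - b) ** 2 for a, b in zip(pi, pj)) / len(pi)
            w = math.exp(-dist2 / (h * h))  # similar patches -> large weight
            weights.append(w)
            total += w
        out.append(sum(w * s for w, s in zip(weights, signal)) / total)
    return out

# Noisy two-level signal: NLM smooths within each plateau while
# largely preserving the step between them.
noisy = [1.0, 1.1, 0.9, 1.05, 5.0, 5.1, 4.9, 5.05]
print([round(v, 2) for v in nlm_1d(noisy)])
```

Because dissimilar patches receive exponentially small weights, edges survive the averaging that would blur them under a purely local filter.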
8

Zöllner, Frank G., Walter Dastrù, Pietro Irrera, Dario Livio Longo, Kevin M. Bennett, Scott C. Beeman, G. Larry Bretthorst, and Joel R. Garbow. “Analysis Protocol for Dynamic Contrast Enhanced (DCE) MRI of Renal Perfusion and Filtration”. In Methods in Molecular Biology, 637–53. New York, NY: Springer US, 2021. http://dx.doi.org/10.1007/978-1-0716-0978-1_38.

Annotation:
Abstract: Here we present an analysis protocol for dynamic contrast enhanced magnetic resonance imaging (DCE-MRI) data of the kidneys. It covers comprehensive steps to facilitate mapping of signal to contrast agent concentration via T1 mapping, and the calculation of renal perfusion and filtration parametric maps using model-free approaches, model-free analysis using deconvolution, the Tofts model, and a Bayesian approach. This chapter is based upon work from the COST Action PARENCHIMA, a community-driven network funded by the European Cooperation in Science and Technology (COST) program of the European Union, which aims to improve the reproducibility and standardization of renal MRI biomarkers. This analysis protocol chapter is complemented by two separate chapters describing the basic concept and experimental procedure.
9

Reed, Galen D., Natalie J. Korn, Christoffer Laustsen, and Cornelius von Morze. “Analysis Methods for Hyperpolarized Carbon (13C) MRI of the Kidney”. In Methods in Molecular Biology, 697–710. New York, NY: Springer US, 2021. http://dx.doi.org/10.1007/978-1-0716-0978-1_42.

Annotation:
Abstract: Hyperpolarized 13C MR is a novel medical imaging modality with substantially different signal dynamics as compared to conventional 1H MR, thus requiring new methods for processing the data in order to access and quantify the embedded metabolic and functional information. Here we describe step-by-step analysis protocols for functional renal hyperpolarized 13C imaging. These methods are useful for investigating renal blood flow and function as well as metabolic status of rodents in vivo under various experimental physiological conditions. This chapter is based upon work from the COST Action PARENCHIMA, a community-driven network funded by the European Cooperation in Science and Technology (COST) program of the European Union, which aims to improve the reproducibility and standardization of renal MRI biomarkers. This analysis protocol chapter is complemented by two separate chapters describing the basic concept and experimental procedure.
10

Hu, Lingzhi, Hua Pan, and Samuel A. Wickline. “Fluorine (19F) MRI to Measure Renal Oxygen Tension and Blood Volume: Experimental Protocol”. In Methods in Molecular Biology, 509–18. New York, NY: Springer US, 2021. http://dx.doi.org/10.1007/978-1-0716-0978-1_31.

Annotation:
Abstract: Fluorinated compounds feature a favorable toxicity profile and can be used as contrast agents for magnetic resonance imaging and spectroscopy. The fluorine nucleus of fluorinated compounds exhibits the well-known advantages of being a high-signal nucleus with a naturally abundant stable isotope, a convenient gyromagnetic ratio close to that of protons, and a unique spectral signature with no detectable background at clinical field strengths. Perfluorocarbon core nanoparticles (PFC NP) are a class of clinically approved emulsion agents recently applied in vivo for ligand-targeted molecular imaging. The objective of this chapter is to outline a multinuclear 1H/19F MRI protocol for functional kidney imaging in rodents for mapping of renal blood volume and oxygenation (pO2) in renal disease models. This chapter is based upon work from the COST Action PARENCHIMA, a community-driven network funded by the European Cooperation in Science and Technology (COST) program of the European Union, which aims to improve the reproducibility and standardization of renal MRI biomarkers. This experimental protocol chapter is complemented by a separate chapter describing the basic concept of functional imaging using fluorine (19F) MR methods.

Conference papers on the topic "Standardization of signals"

1

Chen, Jie, and Jun Tan. “NR V2X: Technologies, Performance, and Standardization”. In 2020 54th Asilomar Conference on Signals, Systems, and Computers. IEEE, 2020. http://dx.doi.org/10.1109/ieeeconf51394.2020.9443357.

2

Fisher, Reed. “60 GHz WPAN Standardization within IEEE 802.15.3c”. In 2007 International Symposium on Signals, Systems and Electronics. IEEE, 2007. http://dx.doi.org/10.1109/issse.2007.4294424.

3

Kurakata, Kenji, Tazu Mizunami, and Kazuma Matsushita. “Auditory signals for consumer electronics: Accessible-design approach and international standardization”. In 2009 IEEE 13th International Symposium on Consumer Electronics (ISCE). IEEE, 2009. http://dx.doi.org/10.1109/isce.2009.5157059.

4

Komaki, Shozo, Hiroyo Ogawa, and Junichiro Ichikawa. “Feasibility Study and Standardization Activities on Radio on Fiber Devices”. In 2007 International Symposium on Signals, Systems and Electronics. IEEE, 2007. http://dx.doi.org/10.1109/issse.2007.4294421.

5

Xiaoshan Song, Wei Qian, and Sankar. “Standardization for image characteristics in telemammography”. In IEEE International Conference on Acoustics Speech and Signal Processing ICASSP-02. IEEE, 2002. http://dx.doi.org/10.1109/icassp.2002.1004946.

6

Cebeci, Salih, Merve Ozyilmaz, and Gokhan Ince. “Automatic Standardization System for Free Text Addresses”. In 2019 27th Signal Processing and Communications Applications Conference (SIU). IEEE, 2019. http://dx.doi.org/10.1109/siu.2019.8806349.

7

Atilla Hasekioglu, Arif Sirri, and Orkun Hasekioglu. “Quantum Key Distribution and Quantum Networks Standardization Efforts”. In 2020 28th Signal Processing and Communications Applications Conference (SIU). IEEE, 2020. http://dx.doi.org/10.1109/siu49456.2020.9302264.

8

Bruhn, S., H. Pobloth, M. Schnell, B. Grill, J. Gibbs, L. Miao, K. Jarvinen, et al. “Standardization of the new 3GPP EVS codec”. In ICASSP 2015 - 2015 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). IEEE, 2015. http://dx.doi.org/10.1109/icassp.2015.7179064.

9

Liu, Xin, Shuangdong Zhu, and Ken Chen. “Method of Traffic Signs Segmentation Based on Color-Standardization”. In 2009 International Conference on Intelligent Human-Machine Systems and Cybernetics. IEEE, 2009. http://dx.doi.org/10.1109/ihmsc.2009.172.

10

Łuka, Piotr, and Andrzej Urban. “Standardization of the Central Console in Police Vehicles”. In ICGSP '19: 2019 The 3rd International Conference on Graphics and Signal Processing. New York, NY, USA: ACM, 2019. http://dx.doi.org/10.1145/3338472.3338494.
