Academic literature on the topic "Wavelet artefacts"

Create an accurate citation in APA, MLA, Chicago, Harvard, and other styles

Consult the topical lists of articles, books, theses, conference proceedings, and other academic sources on the topic "Wavelet artefacts".

Next to every source in the list of references there is an "Add to bibliography" button. Press it, and we will automatically generate the bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Vancouver, Chicago, etc.

You can also download the full text of the academic publication as a pdf and read its abstract online whenever it is available in the metadata.

Journal articles on the topic "Wavelet artefacts"

1

Piekarczyk, Marcin, Olaf Bar, Łukasz Bibrzycki, Michał Niedźwiecki, Krzysztof Rzecki, Sławomir Stuglik, Thomas Andersen et al. "CNN-Based Classifier as an Offline Trigger for the CREDO Experiment". Sensors 21, no. 14 (July 14, 2021): 4804. http://dx.doi.org/10.3390/s21144804.

Abstract
Gamification is known to enhance users’ participation in education and research projects that follow the citizen science paradigm. The Cosmic Ray Extremely Distributed Observatory (CREDO) experiment is designed for the large-scale study of various radiation forms that continuously reach the Earth from space, collectively known as cosmic rays. The CREDO Detector app relies on a network of involved users and is now working worldwide across phones and other CMOS sensor-equipped devices. To broaden the user base and activate current users, CREDO extensively uses the gamification solutions like the periodical Particle Hunters Competition. However, the adverse effect of gamification is that the number of artefacts, i.e., signals unrelated to cosmic ray detection or openly related to cheating, substantially increases. To tag the artefacts appearing in the CREDO database we propose the method based on machine learning. The approach involves training the Convolutional Neural Network (CNN) to recognise the morphological difference between signals and artefacts. As a result we obtain the CNN-based trigger which is able to mimic the signal vs. artefact assignments of human annotators as closely as possible. To enhance the method, the input image signal is adaptively thresholded and then transformed using Daubechies wavelets. In this exploratory study, we use wavelet transforms to amplify distinctive image features. As a result, we obtain a very good recognition ratio of almost 99% for both signal and artefacts. The proposed solution allows eliminating the manual supervision of the competition process.
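The preprocessing step described in the abstract above (adaptive thresholding of the camera frame followed by a Daubechies wavelet transform) can be sketched as follows. This is an illustrative reconstruction, not the authors' code: it assumes the NumPy and PyWavelets packages, a synthetic hit image, and placeholder choices for the threshold rule (mean plus k standard deviations) and the wavelet order ('db2').

```python
import numpy as np
import pywt

def wavelet_features(image: np.ndarray, k: float = 2.0, wavelet: str = "db2") -> np.ndarray:
    """Adaptively threshold a grayscale hit image, then apply one level of a
    2-D Daubechies wavelet transform to emphasise edge-like morphology."""
    # Placeholder adaptive rule: keep pixels brighter than mean + k * std.
    thr = image.mean() + k * image.std()
    masked = np.where(image > thr, image, 0.0)

    # Single-level 2-D DWT: approximation plus horizontal/vertical/diagonal details.
    cA, (cH, cV, cD) = pywt.dwt2(masked, wavelet)

    # Stack the four subbands as channels of a CNN-style input tensor.
    return np.stack([cA, cH, cV, cD], axis=0)

# Synthetic 60x60 frame with Poisson background and a bright track.
img = np.random.poisson(3.0, size=(60, 60)).astype(float)
img[28:32, 10:50] += 40.0
print(wavelet_features(img).shape)   # four subbands at roughly half the resolution
```

The CNN trained to mimic the human signal/artefact labels would consume tensors of this kind; that stage is omitted here.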
2

Turnip, Arjon, and Jasman Pardede. "Artefacts Removal of EEG Signals with Wavelet Denoising". MATEC Web of Conferences 135 (2017): 00058. http://dx.doi.org/10.1051/matecconf/201713500058.

3

Voskoboinikov, Yu E. "Artefacts of Wavelet Filtration of Images and Their Elimination". Optoelectronics, Instrumentation and Data Processing 56, no. 6 (November 2020): 559–65. http://dx.doi.org/10.3103/s8756699020060138.

4

Lilly, Jonathan M. "Element analysis: a wavelet-based method for analysing time-localized events in noisy time series". Proceedings of the Royal Society A: Mathematical, Physical and Engineering Sciences 473, no. 2200 (April 2017): 20160776. http://dx.doi.org/10.1098/rspa.2016.0776.

Abstract
A method is derived for the quantitative analysis of signals that are composed of superpositions of isolated, time-localized ‘events’. Here, these events are taken to be well represented as rescaled and phase-rotated versions of generalized Morse wavelets, a broad family of continuous analytic functions. Analysing a signal composed of replicates of such a function using another Morse wavelet allows one to directly estimate the properties of events from the values of the wavelet transform at its own maxima. The distribution of events in general power-law noise is determined in order to establish significance based on an expected false detection rate. Finally, an expression for an event’s ‘region of influence’ within the wavelet transform permits the formation of a criterion for rejecting spurious maxima due to numerical artefacts or other unsuitable events. Signals can then be reconstructed based on a small number of isolated points on the time/scale plane. This method, termed element analysis, is applied to the identification of long-lived eddy structures in ocean currents as observed by along-track measurements of sea surface elevation from satellite altimetry.
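As a loose illustration of detecting time-localized events at maxima of a wavelet transform, the sketch below uses PyWavelets' complex Morlet wavelet as a stand-in for the generalized Morse wavelets of the paper, a synthetic noisy record containing two transients, and an ad hoc amplitude threshold in place of the paper's false-detection-rate analysis.

```python
import numpy as np
import pywt

# Synthetic record: unit-variance noise plus two short transient "events".
rng = np.random.default_rng(0)
t = np.arange(2048)
x = rng.normal(0.0, 1.0, t.size)
for centre in (500, 1400):
    x += 4.0 * np.exp(-0.5 * ((t - centre) / 15.0) ** 2) * np.cos(0.4 * (t - centre))

# Continuous wavelet transform with a complex Morlet wavelet
# (a stand-in for the generalized Morse wavelets used in the paper).
scales = np.geomspace(4, 64, 40)
coefs, _ = pywt.cwt(x, scales, "cmor1.5-1.0")
power = np.abs(coefs)

# Crude event detection: local maxima of the modulus above an ad hoc threshold.
thr = power.mean() + 4.0 * power.std()
peaks = np.argwhere(
    (power > thr)
    & (power >= np.roll(power, 1, axis=1))
    & (power >= np.roll(power, -1, axis=1))
)
print("candidate (scale index, time index) pairs:", peaks[:5])
```

The paper's noise-distribution significance test and 'region of influence' rejection criterion are not reproduced here.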
5

Subramanian, Balambigai, Asokan Ramasamy, and Kamalakannan Rangasamy. "Performance Comparison of Wavelet and Multiwavelet Denoising Methods for an Electrocardiogram Signal". Journal of Applied Mathematics 2014 (2014): 1–8. http://dx.doi.org/10.1155/2014/241540.

Abstract
The increase in the occurrence of cardiovascular diseases in the world has made electrocardiogram an important tool to diagnose the various arrhythmias of the heart. But the recorded electrocardiogram often contains artefacts like power line noise, baseline noise, and muscle artefacts. Hence denoising of electrocardiogram signals is very important for accurate diagnosis of heart diseases. The properties of wavelets and multiwavelets have better denoising capability compared to conventional filtering techniques. The electrocardiogram signals have been taken from the MIT-BIH arrhythmia database. The simulation results prove that there is a 29.7% increase in the performance of multiwavelets over the performance of wavelets in terms of signal to noise ratio (SNR).
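A minimal sketch of scalar wavelet denoising with an SNR figure of merit, in the spirit of the comparison above. It assumes PyWavelets, a synthetic pulse train in place of the MIT-BIH records, and the common 'db4' wavelet with the universal soft threshold rather than the paper's exact settings.

```python
import numpy as np
import pywt

def wavelet_denoise(x: np.ndarray, wavelet: str = "db4", level: int = 4) -> np.ndarray:
    """Soft-threshold the detail coefficients with the universal threshold
    (a common default; the paper's exact rule is not reproduced here)."""
    coeffs = pywt.wavedec(x, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745        # noise scale from finest details
    thr = sigma * np.sqrt(2.0 * np.log(x.size))
    coeffs[1:] = [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: x.size]

def snr_db(clean: np.ndarray, estimate: np.ndarray) -> float:
    return 10.0 * np.log10(np.sum(clean**2) / np.sum((clean - estimate) ** 2))

# Synthetic "ECG-like" pulse train corrupted with white noise.
rng = np.random.default_rng(1)
t = np.linspace(0, 10, 4000)
clean = np.exp(-0.5 * ((t % 1.0 - 0.5) / 0.03) ** 2)      # crude R-peak train
noisy = clean + rng.normal(0.0, 0.2, t.size)

denoised = wavelet_denoise(noisy)
print(f"SNR before: {snr_db(clean, noisy):.2f} dB, after: {snr_db(clean, denoised):.2f} dB")
```

The multiwavelet variant, which the paper reports as roughly 30% better in SNR, is not shown.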
6

Downie, T. R. "Accurate Signal Estimation near Discontinuities". International Journal of Wavelets, Multiresolution and Information Processing 2, no. 4 (December 2004): 433–53. http://dx.doi.org/10.1142/s0219691304000627.

Abstract
Wavelet thresholding is an effective method for noise reduction of a wide class of naturally occurring signals. However, bias near to a discontinuity and Gibbs phenomenon are a drawback in wavelet thresholding. The extent to which this is a problem is investigated. The Haar wavelet basis is good at approximating discontinuities, but is bad at approximating other signal artefacts. A method of detecting jumps in a signal is developed that uses non-decimated Haar wavelet coefficients. This is designed to be used in conjunction with most existing thresholding methods. A detailed simulation study is carried out and results show that when discontinuities are present a substantial reduction in bias can be obtained, leading to a corresponding reduction in mean square error.
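The jump-detection idea, flagging discontinuities from non-decimated Haar detail coefficients, can be sketched as below; this assumes PyWavelets' stationary wavelet transform, a synthetic piecewise-constant signal and an ad hoc threshold in place of the paper's procedure.

```python
import numpy as np
import pywt

# Piecewise-constant signal with jumps at samples 400 and 700, plus noise.
rng = np.random.default_rng(2)
x = np.concatenate([np.zeros(400), np.full(300, 5.0), np.zeros(324)])
x += rng.normal(0.0, 0.5, 1024)

# One level of the non-decimated (stationary) Haar wavelet transform.
cA, cD = pywt.swt(x, "haar", level=1)[0]

# Haar detail coefficients are local differences, so a genuine jump produces
# a detail value far larger than the noise level estimated from the median.
sigma = np.median(np.abs(cD)) / 0.6745
print("suspected discontinuities near samples:", np.flatnonzero(np.abs(cD) > 4.0 * sigma))
```

The paper pairs such detections with existing thresholding methods to reduce bias near the jump; that combination is not shown.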
7

Lei, Zhou, Yan Jiangbao, Zhu Feng, Tan Xiangyu, and Zhang Lifeng. "Reconstruction Method of Electrical Capacitance Tomography Based on Wavelet Fusion". MATEC Web of Conferences 176 (2018): 01031. http://dx.doi.org/10.1051/matecconf/201817601031.

Abstract
The accuracy of reconstructed images of Electrical Capacitance Tomography (ECT) is a bottleneck concerning the successful application of ECT. An image data fusion algorithm based on wavelet transform was proposed in this paper to improve the accuracy of reconstructed images. First, reconstructed images were obtained using conjugate gradient least square algorithm and Landweber iterative algorithm, respectively. Secondly, reconstructed images were decomposed by wavelet. After that, the approximate component was processed according to the weighted average fusion rule. The detail component was processed according to the maximum fusion rule of absolute value. Finally, the new reconstructed images were obtained by wavelet reconstruction. Simulation and static experimental results showed that the reconstructed images with higher accuracy can be obtained after fusion and the artefacts were decreased obviously.
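The fusion rule summarized above (a weighted average for the approximation subband, the maximum absolute value for the detail subbands) is straightforward to sketch. The code assumes PyWavelets, a single Haar decomposition level and a toy phantom, none of which come from the paper.

```python
import numpy as np
import pywt

def fuse(img_a: np.ndarray, img_b: np.ndarray, wavelet: str = "haar", w: float = 0.5) -> np.ndarray:
    """Fuse two reconstructions: weighted average of the approximation subband,
    maximum-absolute-value rule for the horizontal/vertical/diagonal details."""
    cA_a, det_a = pywt.dwt2(img_a, wavelet)
    cA_b, det_b = pywt.dwt2(img_b, wavelet)
    cA = w * cA_a + (1.0 - w) * cA_b
    details = tuple(np.where(np.abs(da) >= np.abs(db), da, db) for da, db in zip(det_a, det_b))
    return pywt.idwt2((cA, details), wavelet)

# Two noisy "reconstructions" of the same square phantom.
rng = np.random.default_rng(3)
phantom = np.zeros((64, 64))
phantom[20:44, 20:44] = 1.0
rec1 = phantom + rng.normal(0.0, 0.3, phantom.shape)
rec2 = phantom + rng.normal(0.0, 0.3, phantom.shape)
print(fuse(rec1, rec2).shape)   # (64, 64)
```

In the paper the two inputs are the conjugate gradient least squares and Landweber reconstructions rather than noisy copies of a phantom.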
8

Burger, Christiaan, and David Jacobus van den Heever. "Removal of EOG artefacts by combining wavelet neural network and independent component analysis". Biomedical Signal Processing and Control 15 (January 2015): 67–79. http://dx.doi.org/10.1016/j.bspc.2014.09.009.

9

Romo Vázquez, R., H. Vélez-Pérez, R. Ranta, V. Louis Dorr, D. Maquin, and L. Maillard. "Blind source separation, wavelet denoising and discriminant analysis for EEG artefacts and noise cancelling". Biomedical Signal Processing and Control 7, no. 4 (July 2012): 389–400. http://dx.doi.org/10.1016/j.bspc.2011.06.005.

10

Conforto, Silvia, Tommaso D'Alessio, and Stefano Pignatelli. "Optimal rejection of movement artefacts from myoelectric signals by means of a wavelet filtering procedure". Journal of Electromyography and Kinesiology 9, no. 1 (January 1999): 47–57. http://dx.doi.org/10.1016/s1050-6411(98)00023-6.


Theses on the topic "Wavelet artefacts"

1

Romo Vázquez, Rebeca del Carmen. "Contribution à la détection et à l'analyse des signaux EEG épileptiques : débruitage et séparation de sources". Thesis, Vandoeuvre-les-Nancy, INPL, 2010. http://www.theses.fr/2010INPL005N/document.

Abstract
The goal of this research is the preprocessing of electroencephalographic (EEG) signals. More precisely, we aim to develop a methodology to obtain a "clean" EEG through the identification and elimination of extra-cerebral artefacts (ocular movements, eye blinks, high-frequency and cardiac activity) and noise. After identification, the artefacts and noise must be eliminated with a minimal loss of cerebral-activity information, as this information is potentially useful to the analysis (visual or automatic) and therefore to the medical diagnosis. To accomplish this objective, several preprocessing steps are needed: separation and identification of the artefact sources, noise elimination, and reconstruction of the "clean" EEG. Through a blind source separation (BSS) approach, the first step separates the EEG signals into informative and artefact sources. Once the sources are separated, the second step is to classify and eliminate the identified artefact sources; this step relies on supervised classification. The EEG is reconstructed only from the informative sources, and the measurement noise is finally eliminated using a wavelet denoising approach. A methodology ensuring an optimal interaction of these three techniques (BSS, supervised classification and wavelet denoising) is the main contribution of this thesis. The methodology, together with the results obtained on a large database of real EEG recordings (ictal and inter-ictal), was subjected to a detailed medical expert review, which validates the proposed approach.
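The three-stage pipeline described above (blind source separation, classification of artefact sources, wavelet denoising before reconstruction) might look roughly like the outline below. It is only a sketch under stated assumptions: scikit-learn's FastICA stands in for the BSS stage, a simple kurtosis rule stands in for the thesis's supervised classifier, the universal soft threshold stands in for its denoising settings, and the data are random numbers rather than EEG.

```python
import numpy as np
import pywt
from scipy.stats import kurtosis
from sklearn.decomposition import FastICA

def wavelet_denoise(src: np.ndarray, wavelet: str = "db4", level: int = 4) -> np.ndarray:
    coeffs = pywt.wavedec(src, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745            # noise scale from finest details
    thr = sigma * np.sqrt(2.0 * np.log(src.size))              # universal threshold
    coeffs[1:] = [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: src.size]

def clean_eeg(eeg: np.ndarray) -> np.ndarray:
    """eeg: array of shape (n_samples, n_channels); returns a cleaned EEG of the same shape."""
    ica = FastICA(n_components=eeg.shape[1], max_iter=500, random_state=0)
    sources = ica.fit_transform(eeg)                           # 1) blind source separation

    artefact = kurtosis(sources, axis=0) > 10.0                # 2) placeholder "classifier"
    sources[:, artefact] = 0.0                                 #    discard artefact sources

    for k in np.flatnonzero(~artefact):                        # 3) denoise informative sources
        sources[:, k] = wavelet_denoise(sources[:, k])
    return ica.inverse_transform(sources)                      # reconstruct the cleaned EEG

# Random data standing in for an 8-channel recording.
print(clean_eeg(np.random.default_rng(4).normal(size=(5000, 8))).shape)   # (5000, 8)
```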
2

Leung, Raymond (Electrical Engineering & Telecommunications, Faculty of Engineering, UNSW). "Scalable video compression with optimized visual performance and random accessibility". Awarded by: University of New South Wales, Electrical Engineering and Telecommunications, 2006. http://handle.unsw.edu.au/1959.4/24192.

Abstract
This thesis is concerned with maximizing the coding efficiency, random accessibility and visual performance of scalable compressed video. The unifying theme behind this work is the use of finely embedded localized coding structures, which govern the extent to which these goals may be jointly achieved. The first part focuses on scalable volumetric image compression. We investigate 3D transform and coding techniques which exploit inter-slice statistical redundancies without compromising slice accessibility. Our study shows that the motion-compensated temporal discrete wavelet transform (MC-TDWT) practically achieves an upper bound to the compression efficiency of slice transforms. From a video coding perspective, we find that most of the coding gain is attributed to offsetting the learning penalty in adaptive arithmetic coding through 3D code-block extension, rather than inter-frame context modelling. The second aspect of this thesis examines random accessibility. Accessibility refers to the ease with which a region of interest is accessed (subband samples needed for reconstruction are retrieved) from a compressed video bitstream, subject to spatiotemporal code-block constraints. We investigate the fundamental implications of motion compensation for random access efficiency and the compression performance of scalable interactive video. We demonstrate that inclusion of motion compensation operators within the lifting steps of a temporal subband transform incurs a random access penalty which depends on the characteristics of the motion field. The final aspect of this thesis aims to minimize the perceptual impact of visible distortion in scalable reconstructed video. We present a visual optimization strategy based on distortion scaling which raises the distortion-length slope of perceptually significant samples. This alters the codestream embedding order during post-compression rate-distortion optimization, thus allowing visually sensitive sites to be encoded with higher fidelity at a given bit-rate. For visual sensitivity analysis, we propose a contrast perception model that incorporates an adaptive masking slope. This versatile feature provides a context which models perceptual significance. It enables scene structures that otherwise suffer significant degradation to be preserved at lower bit-rates. The novelty in our approach derives from a set of "perceptual mappings" which account for quantization noise shaping effects induced by motion-compensated temporal synthesis. The proposed technique reduces wavelet compression artefacts and improves the perceptual quality of video.
3

Hanák, Pavel. "Optická detekce elektrogramů". Master's thesis, Vysoké učení technické v Brně. Fakulta elektrotechniky a komunikačních technologií, 2013. http://www.nusl.cz/ntk/nusl-220286.

Abstract
The study addresses the recording of electrograms using voltage-sensitive dyes. It also covers optical detectors suitable for the measurement and an analysis of the noise. Noise and interference were removed in Matlab using low-pass filtering and the wavelet transform. An application for electrogram analysis was developed in the latter part of the work.
4

Hanák, Pavel. "Měření a zpracování elektrogramů". Master's thesis, Vysoké učení technické v Brně. Fakulta elektrotechniky a komunikačních technologií, 2013. http://www.nusl.cz/ntk/nusl-374605.

Abstract
The study addresses the recording of electrograms from living tissue. It is extended with an analysis of the noise and its elimination. Noise and interference were removed in Matlab using low-pass filtering and the wavelet transform. An application for electrogram analysis was developed in the latter part of the work.
5

Zaylaa, Amer. "Multichannel EHG segmentation for automatically identifying contractions and motion artifacts". Thesis, Compiègne, 2019. http://www.theses.fr/2019COMP2521.

Abstract
In this study, we focus on the automatic segmentation of events in the uterine EMG signal and then on the identification of contractions among these events by referring to the expert's annotations. Our database includes uterine EMG signals from different weeks of gestation acquired through a 4x4 electrode matrix. Our work therefore first applies the dynamic cumulative sum (DCS) method in a monodimensional approach on monopolar signals in order to obtain a high spatial resolution of the data. Based on the obtained results, the study then focuses on bipolar signals in order to increase the signal-to-noise ratio (SNR) of the uterine EMG. The DCS method is extended, on the one hand, with a series of techniques for eliminating falsely detected ruptures, based either on Fisher's criterion or on the SNR, and, on the other hand, with two methods for fusing these ruptures: the first is automatic, while the second is based on a weighted majority voting scheme in which each channel is weighted by a factor when merging the detected rupture instants. In addition, the DCS method is applied in a multidimensional approach, first on the bipolar signals and then on their detail components after wavelet decomposition. We are particularly interested in the dynamic selection of these details in both approaches using a technique based on the Kullback-Leibler distance. Finally, in order to identify the contractions and reduce the number of other detected events, a parameter-extraction step for the obtained events is presented and validated.
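The rupture-detection core of the approach can be loosely illustrated with a classical two-sided CUSUM detector. This is a generic stand-in for the dynamic cumulative sum (DCS) statistic of the thesis, applied to synthetic data with arbitrary drift and threshold values, and it omits the multichannel fusion and Kullback-Leibler selection steps entirely.

```python
import numpy as np

def cusum_changes(x: np.ndarray, drift: float = 0.5, h: float = 8.0) -> list:
    """Two-sided CUSUM for mean shifts: accumulate deviations from a reference
    level and report a rupture whenever either sum exceeds the threshold h."""
    mu = x[:200].mean()                     # reference level from an initial window
    s_pos = s_neg = 0.0
    changes = []
    for n, v in enumerate(x):
        s_pos = max(0.0, s_pos + (v - mu) - drift)
        s_neg = max(0.0, s_neg - (v - mu) - drift)
        if s_pos > h or s_neg > h:
            changes.append(n)
            s_pos = s_neg = 0.0
            mu = x[n:n + 50].mean()         # offline sketch: re-estimate the level just after detection
    return changes

# Synthetic envelope with a contraction-like burst between samples 800 and 1200.
rng = np.random.default_rng(6)
x = rng.normal(0.0, 1.0, 2000)
x[800:1200] += 3.0
print(cusum_changes(x))   # expect ruptures reported shortly after samples 800 and 1200
```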
6

Lopata, Jan. "Odstraňování artefaktů JPEG komprese obrazových dat". Master's thesis, 2014. http://www.nusl.cz/ntk/nusl-341236.

Abstract
This thesis is concerned with the removal of artefacts typical for JPEG image compression. First, we describe the mathematical formulation of the JPEG format and the problem of artefact removal. We then formulate the problem as an optimization problem, where the minimized functional is obtained via Bayes' theorem and complex wavelets. We describe proximal operators and algorithms and apply them to the minimization of the given functional. The final algorithm is implemented in MATLAB and tested on several test problems.
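The abstract above casts artefact removal as the minimisation of a functional by proximal algorithms. As a generic, hedged illustration (not the thesis's Bayesian, complex-wavelet functional), the sketch below implements plain ISTA, i.e. proximal gradient steps built on the soft-thresholding operator, for a toy sparse-recovery problem.

```python
import numpy as np

def prox_l1(z: np.ndarray, lam: float) -> np.ndarray:
    """Soft thresholding: the proximal operator of lam * ||.||_1."""
    return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)

def ista(A: np.ndarray, y: np.ndarray, lam: float, n_iter: int = 300) -> np.ndarray:
    """Minimise 0.5 * ||A x - y||^2 + lam * ||x||_1 by proximal gradient steps."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the smooth part
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = prox_l1(x - A.T @ (A @ x - y) / L, lam / L)
    return x

# Toy problem: recover a sparse coefficient vector from noisy linear measurements.
rng = np.random.default_rng(5)
A = rng.normal(size=(80, 120))
x_true = np.zeros(120)
x_true[[7, 30, 90]] = [2.0, -1.5, 3.0]
y = A @ x_true + rng.normal(0.0, 0.05, 80)
x_hat = ista(A, y, lam=5.0)
print(np.flatnonzero(np.abs(x_hat) > 0.5))   # the large entries should sit near indices 7, 30, 90
```

In the thesis the data term encodes the JPEG quantisation model and the sparsifying transform is a complex wavelet frame, so the operators differ, but the proximal iteration has this overall shape.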

Book chapters on the topic "Wavelet artefacts"

1

Abtahi, F., F. Seoane, K. Lindecrantz, and N. Löfgren. "Elimination of ECG Artefacts in Foetal EEG Using Ensemble Average Subtraction and Wavelet Denoising Methods: A Simulation". In IFMBE Proceedings, 551–54. Cham: Springer International Publishing, 2014. http://dx.doi.org/10.1007/978-3-319-00846-2_136.

2

Bajaj, Nikesh. "Wavelets for EEG Analysis". In Wavelet Theory [Working Title]. IntechOpen, 2020. http://dx.doi.org/10.5772/intechopen.94398.

Abstract
This chapter introduces the applications of wavelets to Electroencephalogram (EEG) signal analysis. First, an overview of the EEG signal is given, from the recording of raw EEG to the frequency bands widely used in EEG studies. The chapter then discusses the common artefacts that contaminate the EEG signal during recording. With a short overview of wavelet analysis techniques, namely the Continuous Wavelet Transform (CWT), the Discrete Wavelet Transform (DWT), and Wavelet Packet Decomposition (WPD), the chapter demonstrates the richness of the CWT over a conventional time-frequency analysis technique, the Short-Time Fourier Transform. Lastly, artefact-removal algorithms based on Independent Component Analysis (ICA) and wavelets are discussed and compared. The techniques covered in this chapter show that wavelet analysis is well suited to describing time-localised events in EEG signals. Owing to their similar nature, wavelet analysis is also suitable for other biomedical signals such as the electrocardiogram and the electromyogram.
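A small sketch of the CWT-versus-STFT comparison the chapter makes. It assumes PyWavelets and SciPy, a synthetic two-rhythm test signal in place of real EEG, a complex Morlet wavelet, and a frequency-to-scale conversion based on the wavelet's centre frequency.

```python
import numpy as np
import pywt
from scipy.signal import stft

# Test signal whose dominant rhythm switches from ~10 Hz to ~22 Hz at t = 2 s,
# standing in for an alpha-to-beta transition in a real EEG channel.
fs = 256
t = np.arange(0, 4, 1 / fs)
x = np.where(t < 2, np.sin(2 * np.pi * 10 * t), np.sin(2 * np.pi * 22 * t))

# Short-time Fourier transform: one fixed window, hence fixed resolution everywhere.
f_stft, t_stft, Z = stft(x, fs=fs, nperseg=128)

# Continuous wavelet transform: scales chosen to cover 5-40 Hz; resolution adapts.
wavelet = "cmor1.5-1.0"
freqs = np.linspace(5, 40, 60)
scales = pywt.central_frequency(wavelet) * fs / freqs
coefs, freqs_cwt = pywt.cwt(x, scales, wavelet, sampling_period=1 / fs)

print(np.abs(Z).shape, np.abs(coefs).shape)   # two time-frequency maps of the same signal
```

Plotting np.abs(coefs) against time and freqs_cwt shows the rhythm change with frequency-dependent resolution, which is the advantage over the fixed-window STFT that the chapter highlights.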

Conference papers on the topic "Wavelet artefacts"

1

Bigirimana, A. D., N. Siddique, and D. Coyle. "A hybrid ICA-wavelet transform for automated artefact removal in EEG-based emotion recognition". In 2016 IEEE International Conference on Systems, Man, and Cybernetics (SMC). IEEE, 2016. http://dx.doi.org/10.1109/smc.2016.7844928.

2

Quiles Zamora, Vicente, Eduardo Iáñez, Mario Ortiz, and José María Azorín. "Estudio preliminar de la detección de cambios de velocidad de la marcha a partir de señales EEG". In 11 Simposio CEA de Bioingeniería. València: Editorial Universitat Politècnica de València, 2019. http://dx.doi.org/10.4995/ceabioing.2019.10034.

Abstract
The analysis of brain signals for assisting patients with reduced mobility is an active research area in which new technologies offer a wide spectrum of possibilities for active assistance, such as exoskeletons and brain-machine interfaces (BMI). In this work we focus on building a BMI for lower-limb speed control. The objective of this study is to analyse electroencephalographic (EEG) signals to obtain an indicator relating brain information to gait-speed markers. For the processing of the EEG signals we analyse the sensorimotor rhythms corresponding to the Alpha and Beta bands over the motor area. The BMI must discriminate between two states: walking and the intention to change speed. Building this model requires characterising the moment of the change for the training phase. We used the Tech MCS V3 motion-capture system, based on inertial sensors, and analysed the frequency components of the acceleration in the time domain by means of the continuous wavelet transform (CWT). We carried out a smaller study to analyse the gait and a main one to validate the proposed BMI. Three healthy users participated in the study. The protocol has three phases: the user stands still for a few seconds, then decides to start walking at a visibly slow pace, and at some point voluntarily changes speed and keeps it for a few seconds. The experiment consists of 40 repetitions. Owing to the large number of artefacts generated during walking, the components of the EEG signal were decomposed using ICA and rejected according to their spectral content and an appropriate criterion. The results were processed with several electrode configurations over the motor area, in different frequency bands, and with and without artefact removal. The maximum classification accuracy for the three users when distinguishing between the two classes is 58%, which is not high enough to reliably validate the proposed BMI.