Journal articles on the topic "Wavelet artefacts"

Consult the 44 most relevant journal articles for your research on the topic "Wavelet artefacts".

1

Piekarczyk, Marcin, Olaf Bar, Łukasz Bibrzycki, Michał Niedźwiecki, Krzysztof Rzecki, Sławomir Stuglik, Thomas Andersen et al. "CNN-Based Classifier as an Offline Trigger for the CREDO Experiment". Sensors 21, no. 14 (July 14, 2021): 4804. http://dx.doi.org/10.3390/s21144804.

Full text
Abstract
Gamification is known to enhance users’ participation in education and research projects that follow the citizen science paradigm. The Cosmic Ray Extremely Distributed Observatory (CREDO) experiment is designed for the large-scale study of various radiation forms that continuously reach the Earth from space, collectively known as cosmic rays. The CREDO Detector app relies on a network of involved users and is now working worldwide across phones and other CMOS sensor-equipped devices. To broaden the user base and activate current users, CREDO makes extensive use of gamification solutions such as the periodic Particle Hunters Competition. However, the adverse effect of gamification is that the number of artefacts, i.e., signals unrelated to cosmic ray detection or openly related to cheating, substantially increases. To tag the artefacts appearing in the CREDO database we propose a method based on machine learning. The approach involves training a Convolutional Neural Network (CNN) to recognise the morphological difference between signals and artefacts. As a result we obtain a CNN-based trigger which mimics the signal vs. artefact assignments of human annotators as closely as possible. To enhance the method, the input image signal is adaptively thresholded and then transformed using Daubechies wavelets. In this exploratory study, we use wavelet transforms to amplify distinctive image features. As a result, we obtain a very good recognition ratio of almost 99% for both signals and artefacts. The proposed solution eliminates the need for manual supervision of the competition process.
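The preprocessing pipeline this abstract describes (adaptive thresholding of the camera frame, then a wavelet transform to amplify image features before the CNN) can be sketched as follows. This is a minimal sketch, not the CREDO code: a single-level 2D Haar decomposition stands in for the Daubechies wavelets named in the paper, and the mean-plus-k·std rule is an assumed form of adaptive threshold.

```python
import numpy as np

def adaptive_threshold(img, k=1.0):
    """Zero out pixels below mean + k*std -- an assumed, simple form of
    the adaptive thresholding step mentioned in the abstract."""
    t = img.mean() + k * img.std()
    return np.where(img > t, img, 0.0)

def haar2d(img):
    """Single-level 2D Haar decomposition into approximation (LL) and
    detail (LH, HL, HH) sub-bands; image sides must be even."""
    a = (img[0::2, :] + img[1::2, :]) / 2.0   # row averages
    d = (img[0::2, :] - img[1::2, :]) / 2.0   # row differences
    ll = (a[:, 0::2] + a[:, 1::2]) / 2.0
    lh = (a[:, 0::2] - a[:, 1::2]) / 2.0
    hl = (d[:, 0::2] + d[:, 1::2]) / 2.0
    hh = (d[:, 0::2] - d[:, 1::2]) / 2.0
    return ll, lh, hl, hh

# A toy 8x8 "camera frame"; real CREDO hits are small CMOS image crops.
rng = np.random.default_rng(0)
frame = rng.random((8, 8))
ll, lh, hl, hh = haar2d(adaptive_threshold(frame))
features = np.stack([ll, lh, hl, hh])   # sub-band channels for a CNN input
```

The stacked sub-bands play the role of the wavelet-domain input image fed to the classifier.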
2

Turnip, Arjon, and Jasman Pardede. "Artefacts Removal of EEG Signals with Wavelet Denoising". MATEC Web of Conferences 135 (2017): 00058. http://dx.doi.org/10.1051/matecconf/201713500058.

Full text
3

Voskoboinikov, Yu E. "Artefacts of Wavelet Filtration of Images and Their Elimination". Optoelectronics, Instrumentation and Data Processing 56, no. 6 (November 2020): 559–65. http://dx.doi.org/10.3103/s8756699020060138.

Full text
4

Lilly, Jonathan M. "Element analysis: a wavelet-based method for analysing time-localized events in noisy time series". Proceedings of the Royal Society A: Mathematical, Physical and Engineering Sciences 473, no. 2200 (April 2017): 20160776. http://dx.doi.org/10.1098/rspa.2016.0776.

Full text
Abstract
A method is derived for the quantitative analysis of signals that are composed of superpositions of isolated, time-localized ‘events’. Here, these events are taken to be well represented as rescaled and phase-rotated versions of generalized Morse wavelets, a broad family of continuous analytic functions. Analysing a signal composed of replicates of such a function using another Morse wavelet allows one to directly estimate the properties of events from the values of the wavelet transform at its own maxima. The distribution of events in general power-law noise is determined in order to establish significance based on an expected false detection rate. Finally, an expression for an event’s ‘region of influence’ within the wavelet transform permits the formation of a criterion for rejecting spurious maxima due to numerical artefacts or other unsuitable events. Signals can then be reconstructed based on a small number of isolated points on the time/scale plane. This method, termed element analysis, is applied to the identification of long-lived eddy structures in ocean currents as observed by along-track measurements of sea surface elevation from satellite altimetry.
5

Subramanian, Balambigai, Asokan Ramasamy, and Kamalakannan Rangasamy. "Performance Comparison of Wavelet and Multiwavelet Denoising Methods for an Electrocardiogram Signal". Journal of Applied Mathematics 2014 (2014): 1–8. http://dx.doi.org/10.1155/2014/241540.

Full text
Abstract
The increase in the occurrence of cardiovascular diseases in the world has made the electrocardiogram an important tool to diagnose the various arrhythmias of the heart. However, the recorded electrocardiogram often contains artefacts like power line noise, baseline noise, and muscle artefacts. Hence, denoising of electrocardiogram signals is very important for the accurate diagnosis of heart diseases. The properties of wavelets and multiwavelets give them better denoising capability than conventional filtering techniques. The electrocardiogram signals have been taken from the MIT-BIH arrhythmia database. The simulation results prove that there is a 29.7% increase in the performance of multiwavelets over the performance of wavelets in terms of signal-to-noise ratio (SNR).
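The core operations behind this comparison (wavelet thresholding of a noisy ECG, scored by SNR) can be sketched with a single-level Haar transform. This is an assumed, simplified stand-in: the paper uses multiwavelets and real MIT-BIH records, while the snippet below soft-thresholds a synthetic smooth waveform.

```python
import numpy as np

def snr_db(clean, estimate):
    """Signal-to-noise ratio of an estimate against the clean signal, in dB."""
    return 10 * np.log10(np.sum(clean**2) / np.sum((clean - estimate)**2))

def haar_denoise(x, thresh):
    """One-level orthonormal Haar DWT, soft-threshold the detail
    coefficients, then invert the transform; len(x) must be even."""
    a = (x[0::2] + x[1::2]) / np.sqrt(2)                   # approximation
    d = (x[0::2] - x[1::2]) / np.sqrt(2)                   # detail
    d = np.sign(d) * np.maximum(np.abs(d) - thresh, 0.0)   # soft threshold
    y = np.empty_like(x)
    y[0::2] = (a + d) / np.sqrt(2)
    y[1::2] = (a - d) / np.sqrt(2)
    return y

rng = np.random.default_rng(1)
t = np.linspace(0, 1, 512)
clean = np.sin(2 * np.pi * 3 * t)            # smooth surrogate "ECG"
noisy = clean + 0.2 * rng.standard_normal(512)
denoised = haar_denoise(noisy, thresh=0.3)   # SNR improves versus `noisy`
```

With a zero threshold the transform pair reconstructs the input exactly, which is a handy sanity check on any denoiser of this form.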
6

DOWNIE, T. R. "ACCURATE SIGNAL ESTIMATION NEAR DISCONTINUITIES". International Journal of Wavelets, Multiresolution and Information Processing 02, no. 04 (December 2004): 433–53. http://dx.doi.org/10.1142/s0219691304000627.

Full text
Abstract
Wavelet thresholding is an effective method for noise reduction of a wide class of naturally occurring signals. However, bias near a discontinuity and the Gibbs phenomenon are drawbacks of wavelet thresholding. The extent to which this is a problem is investigated. The Haar wavelet basis is good at approximating discontinuities, but is bad at approximating other signal artefacts. A method of detecting jumps in a signal is developed that uses non-decimated Haar wavelet coefficients. This is designed to be used in conjunction with most existing thresholding methods. A detailed simulation study is carried out, and results show that when discontinuities are present a substantial reduction in bias can be obtained, leading to a corresponding reduction in mean square error.
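The jump-detection idea (non-decimated Haar detail coefficients spike at a discontinuity) can be sketched as below. The thresholding rule here (k times the median absolute detail) is an assumption for illustration, not the criterion derived in the paper.

```python
import numpy as np

def nondecimated_haar_detail(x):
    """Level-1 Haar detail coefficients without decimation: one
    coefficient per adjacent pair of samples."""
    return (x[:-1] - x[1:]) / np.sqrt(2)

def detect_jumps(x, k=8.0):
    """Flag positions whose |detail| exceeds k times the median absolute
    detail -- an assumed rule, not the paper's derived criterion."""
    d = np.abs(nondecimated_haar_detail(x))
    return np.flatnonzero(d > k * (np.median(d) + 1e-12))

rng = np.random.default_rng(2)
step = np.concatenate([np.zeros(100), 5.0 * np.ones(100)])
noisy = step + 0.1 * rng.standard_normal(200)
jumps = detect_jumps(noisy)   # flags the discontinuity between samples 99 and 100
```

The median-based scale estimate keeps the rule robust to the jump itself, since a single large coefficient barely moves the median.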
7

Lei, Zhou, Yan Jiangbao, Zhu Feng, Tan Xiangyu, and Zhang Lifeng. "Reconstruction Method of Electrical Capacitance Tomography Based on Wavelet Fusion". MATEC Web of Conferences 176 (2018): 01031. http://dx.doi.org/10.1051/matecconf/201817601031.

Full text
Abstract
The accuracy of reconstructed images in Electrical Capacitance Tomography (ECT) is a bottleneck for the successful application of ECT. An image data fusion algorithm based on the wavelet transform was proposed in this paper to improve the accuracy of reconstructed images. First, reconstructed images were obtained using the conjugate gradient least squares algorithm and the Landweber iterative algorithm, respectively. Second, the reconstructed images were decomposed by wavelet transform. After that, the approximation component was processed according to the weighted average fusion rule, and the detail components were processed according to the maximum-absolute-value fusion rule. Finally, the new reconstructed images were obtained by wavelet reconstruction. Simulation and static experimental results showed that reconstructed images with higher accuracy can be obtained after fusion and that the artefacts were visibly reduced.
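The fusion rules described above (average the approximation band, keep the larger-magnitude coefficient in each detail band) can be sketched with a single-level 2D Haar transform. The Haar basis is an assumption for brevity; the paper does not commit to a particular wavelet here.

```python
import numpy as np

def haar2d(img):
    """Single-level 2D Haar analysis: approximation plus 3 detail bands."""
    a = (img[0::2, :] + img[1::2, :]) / 2.0
    d = (img[0::2, :] - img[1::2, :]) / 2.0
    return ((a[:, 0::2] + a[:, 1::2]) / 2.0,   # LL (approximation)
            (a[:, 0::2] - a[:, 1::2]) / 2.0,   # LH
            (d[:, 0::2] + d[:, 1::2]) / 2.0,   # HL
            (d[:, 0::2] - d[:, 1::2]) / 2.0)   # HH

def ihaar2d(ll, lh, hl, hh):
    """Exact inverse of haar2d."""
    a = np.empty((ll.shape[0], 2 * ll.shape[1]))
    d = np.empty_like(a)
    a[:, 0::2], a[:, 1::2] = ll + lh, ll - lh
    d[:, 0::2], d[:, 1::2] = hl + hh, hl - hh
    img = np.empty((2 * a.shape[0], a.shape[1]))
    img[0::2, :], img[1::2, :] = a + d, a - d
    return img

def fuse(img1, img2):
    """Average the approximation bands; keep the larger-magnitude
    coefficient in each detail band; invert the transform."""
    c1, c2 = haar2d(img1), haar2d(img2)
    bands = [(c1[0] + c2[0]) / 2.0]
    for b1, b2 in zip(c1[1:], c2[1:]):
        bands.append(np.where(np.abs(b1) >= np.abs(b2), b1, b2))
    return ihaar2d(*bands)
```

In the paper's setting, `img1` and `img2` would be the CGLS and Landweber reconstructions of the same cross-section.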
8

Burger, Christiaan, and David Jacobus van den Heever. "Removal of EOG artefacts by combining wavelet neural network and independent component analysis". Biomedical Signal Processing and Control 15 (January 2015): 67–79. http://dx.doi.org/10.1016/j.bspc.2014.09.009.

Full text
9

Romo Vázquez, R., H. Vélez-Pérez, R. Ranta, V. Louis Dorr, D. Maquin, and L. Maillard. "Blind source separation, wavelet denoising and discriminant analysis for EEG artefacts and noise cancelling". Biomedical Signal Processing and Control 7, no. 4 (July 2012): 389–400. http://dx.doi.org/10.1016/j.bspc.2011.06.005.

Full text
10

Conforto, Silvia, Tommaso D'Alessio, and Stefano Pignatelli. "Optimal rejection of movement artefacts from myoelectric signals by means of a wavelet filtering procedure". Journal of Electromyography and Kinesiology 9, no. 1 (January 1999): 47–57. http://dx.doi.org/10.1016/s1050-6411(98)00023-6.

Full text
11

Morel, Guy-Louis, Philippe Mahul, Marcelle Reche, Jean-Paul Viale, Christian Auboyer, Andre Geyssant, Frederic Roche, Jean-Claude Barthelemy, and Vincent Pichot. "Feasibility and Interest of Continuous Diaphragmatic Fatigue Monitoring Using Wavelet Denoising in ICU and Anesthesia". Open Anesthesiology Journal 7, no. 1 (November 8, 2013): 37–48. http://dx.doi.org/10.2174/1874321801307010037.

Full text
Abstract
Measures of diaphragmatic electromyography (Edi) and respiratory mechanics have demonstrated early changes before clinical complications. However, automatic Edi data collection is not adequate today, due mainly to severe artefacts as well as to loss of signal. We thus intended to develop a new device with embedded artificial intelligence to optimize automatic Edi recordings independently of artefacts and of probe displacement. We first chose the best mathematical tool to denoise Edi using an established database; multiresolution wavelets performed best, resulting in the permanent availability of the H/L spectral index, a recognized representative of diaphragmatic fatigue. Fatigue was simultaneously measured using the classical mechanical f/Vt index (Rapid Shallow Breathing Index, RSBI), as well as the transdiaphragmatic pressure. We then performed a comparison of real-time H/L and RSBI in a group of seven healthy volunteers, before and during midazolam sedation infusion 0.1 mg.kg-1, with parallel CPAP administration (2.5, 5.0, and 10 cm H2O) intended to compensate for airway resistance due to midazolam. The procedure ended with delivery of the antagonist flumazenil 0.2 to 0.5 mg.kg-1. Progressive fatigue due to midazolam, the relief due to CPAP, as well as the response to the antagonist flumazenil, were shown earlier by the H/L index than by the RSBI change. Our new H/L monitoring device may greatly improve the clinical follow-up of anesthetized patients as well as help to determine the optimal period for ventilatory weaning in the ICU (Clinical Trials NCT00133939).
12

Kang, Seung-Kwan, Si-Young Yie, and Jae-Sung Lee. "Noise2Noise Improved by Trainable Wavelet Coefficients for PET Denoising". Electronics 10, no. 13 (June 24, 2021): 1529. http://dx.doi.org/10.3390/electronics10131529.

Full text
Abstract
The significant statistical noise and limited spatial resolution of positron emission tomography (PET) data in sinogram space result in the degradation of the quality and accuracy of reconstructed images. Although high-dose radiotracers and long acquisition times improve the PET image quality, the patients’ radiation exposure increases and the patient is more likely to move during the PET scan. Recently, various data-driven techniques based on supervised deep neural network learning have made remarkable progress in reducing noise in images. However, these conventional techniques require clean target images that are of limited availability for PET denoising. Therefore, in this study, we utilized the Noise2Noise framework, which requires only noisy image pairs for network training, to reduce the noise in the PET images. A trainable wavelet transform was proposed to improve the performance of the network. The proposed network was fed wavelet-decomposed images consisting of low- and high-pass components. The inverse wavelet transforms of the network output produced denoised images. The proposed Noise2Noise filter with wavelet transforms outperforms the original Noise2Noise method in the suppression of artefacts and preservation of abnormal uptakes. The quantitative analysis of the simulated PET uptake confirms the improved performance of the proposed method compared with the original Noise2Noise technique. In the clinical data, 10 s images filtered with Noise2Noise are virtually equivalent to 300 s images filtered with a 6 mm Gaussian filter. The incorporation of wavelet transforms in Noise2Noise network training results in the improvement of the image contrast. In conclusion, the performance of Noise2Noise filtering for PET images was improved by incorporating the trainable wavelet transform in the self-supervised deep learning framework.
13

Abbaspour, Hamidreza, Nasser Mehrshad, Seyyed Mohammad Razavi, and Luca Mesin. "Artefacts Removal to Detect Visual Evoked Potentials in Brain Computer Interface Systems". Journal of Biomimetics, Biomaterials and Biomedical Engineering 41 (April 2019): 91–103. http://dx.doi.org/10.4028/www.scientific.net/jbbbe.41.91.

Full text
Abstract
The interference of artefacts with evoked scalp electroencephalogram (EEG) responses is a problem in event-related brain computer interface (BCI) systems that reduces signal quality and the interpretability of users' intentions. Many strategies have been proposed to reduce the effects of non-neural artefacts, while the activity of neural sources that do not reflect the considered stimulation has been neglected. However, discerning such activities from those to be retained is important, but subtle and difficult, as most of their features are the same. We propose an automated method based on a combination of a genetic algorithm (GA) and a support vector machine (SVM) to select only the sources of interest. Temporal, spectral, wavelet, autoregressive and spatial properties of independent components (ICs) of EEG are inspected. The method selects the most distinguishing subset of features among this comprehensive fused set of information and identifies the components to be preserved. EEG data were recorded from 12 healthy subjects in a visual evoked potential (VEP) based BCI paradigm, and the corresponding ICs were classified by experts to train and test the algorithm. They were contaminated with different sources of artefacts, including electromyogram (EMG), electrode connection problems, blinks and electrocardiogram (ECG), together with neural contributions not related to VEPs. The accuracy of IC classification was about 88.5%, and the energetic residual error in recovering the clean signals was 3%. These performances indicate that this automated method can effectively identify and remove the main artefacts derived from either neural or non-neural sources while preserving VEPs. This could have important potential applications, helping to speed up the cleaning procedure and to remove the subjectivity of manual expert review. Moreover, it could be included in a real-time BCI as a pre-processing step before the identification of the user’s intention.
14

Paulchamy, B., and Ila Vennila. "A Certain Exploration on EEG Signal for the Removal of Artefacts using Power Spectral Density Analysis through Haar wavelet Transform". International Journal of Computer Applications 42, no. 3 (March 31, 2012): 9–14. http://dx.doi.org/10.5120/5670-7409.

Full text
15

Marasco, D. D., G. Di Lorenzo, A. Petito, M. Altamura, G. Francavilla, L. Inverso, and A. Bellomo. "Gamma band dysfunction in patients with schizophrenia during a Sternberg Task: A wavelet analysis". European Psychiatry 33, S1 (March 2016): S198. http://dx.doi.org/10.1016/j.eurpsy.2016.01.466.

Full text
Abstract
Background: An increasing body of evidence suggests that patients with schizophrenia (SCZ) present dysfunction of gamma band oscillations (GBO) during cognitive tasks. The current study aimed to explore GBO activity in SCZ during a Sternberg task. Materials and methods: Twenty-eight chronic stabilized SCZ and 18 healthy controls (HC) were recruited. Ongoing EEG was recorded during the execution of the Sternberg task. Continuous EEG data were band-pass filtered (1–100 Hz) and corrected for eye blink and muscle artefacts by ICA. For each subject, the event-related spectral perturbation (ERSP) and the inter-trial coherence (ITC) were computed at the Pz channel only for those stimulus-locked segments containing correct responses. GBO wavelet analysis was performed with two different increasing cycle ranges (3 to 5.8 and 12 to 22.6; frequency range: 30–90 Hz), to obtain the best information about temporal and frequency dynamics. Student's t test (with multiple-comparisons FDR correction) was used to compare the groups. Results: During the maintenance phase (4000 to 4600 ms after the stimulus onset), SCZ presented a significant increase in low GBO activity (range: 30–50 Hz) with respect to HC. In the other phases of the Sternberg task (encoding, probe presentation and response periods), no significant difference in GBO was observed between SCZ and HC. Conclusions: These findings are in line with the evidence that GBO dysfunction in SCZ is present during selective phases of the working memory task. Future studies will have to clarify the role of GBO dysfunction in cognitive performance and the clinical utility of selective GBO modulation during cognitive rehabilitation. Disclosure of interest: The authors have not supplied their declaration of competing interest.
16

Dai, Shuxian, Yujin Zhang, Wanqing Song, Fei Wu, and Lijun Zhang. "Rotation Angle Estimation of JPEG Compressed Image by Cyclic Spectrum Analysis". Electronics 8, no. 12 (November 30, 2019): 1431. http://dx.doi.org/10.3390/electronics8121431.

Full text
Abstract
Image rotation is a common auxiliary method of image tampering, which can make the forged image more realistic from the geometric perspective. Most algorithms of image rotation angle estimation employ the peak value on the Fourier spectrum; however, JPEG post-processing brings additional peak interferences to the spectrum, which has a great impact on algorithm performance. In this paper, angle estimation is carried out for images compressed by JPEG. Firstly, the Fourier cyclic spectrum of image covariance is calculated, followed by semi-soft threshold wavelet transform to eliminate the block artefacts brought by JPEG compression. According to the shortest distance principle in the range of the limited amplitude, the processed cyclic spectral data are sorted to select the peak points. Finally, according to the selected peak point, the corresponding position coordinates of the theoretical peak point are found, and the rotation angle of the image is estimated by the theoretical peak point. Experimental results demonstrate that the average absolute error of the proposed algorithm is significantly lower than that of the state-of-the-art methods investigated, which highlights the promising potential of the proposed method as an image resampling detector in practical forensics applications.
17

Oliveira, Rui Jorge, Bento Caldeira, Teresa Teixidó, and José Fernando Borges. "GPR Clutter Reflection Noise-Filtering through Singular Value Decomposition in the Bidimensional Spectral Domain". Remote Sensing 13, no. 10 (May 20, 2021): 2005. http://dx.doi.org/10.3390/rs13102005.

Full text
Abstract
Usually, in ground-penetrating radar (GPR) datasets, the user defines the limits between the useful signal and the noise through standard filtering to isolate the effective signal as much as possible. However, there are true reflections that mask the coherent reflectors and can themselves be considered noise. In archaeological sites these clutter reflections are caused by scattering originating in subsurface elements (e.g., isolated masonry, ceramic objects, and archaeological collapses). Their elimination is difficult because their wavelet parameters are similar to those of coherent reflections, and there is a risk of creating artefacts. In this study, a procedure to filter the clutter reflection noise (CRN) from GPR datasets is presented. The CRN filter is a singular value decomposition (SVD) based method, applied in the 2D spectral domain. This CRN filtering was tested on a dataset obtained from a controlled laboratory environment, to establish a mathematical control of this algorithm. Additionally, it has been applied to a 3D-GPR dataset acquired at the Roman villa of Horta da Torre (Fronteira, Portugal), which is an uncontrolled environment. The results show an increase in the quality of the archaeological GPR planimetry, which was verified via archaeological excavation.
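The skeleton of such a filter (map the section to the 2D spectral domain, strip the dominant singular components, map back) can be sketched as follows. This is a simplified stand-in for the CRN filter: removing the single top component of a radargram's 2D spectrum is an assumption for illustration, not the paper's component-selection rule.

```python
import numpy as np

def svd_spectral_filter(section, n_remove=1):
    """Map a radargram section to the 2D spectral domain, zero the
    strongest singular components there (coherent, clutter-like energy),
    and map back. A simplified sketch of an SVD-based spectral filter."""
    spectrum = np.fft.fft2(section)
    u, s, vh = np.linalg.svd(spectrum, full_matrices=False)
    s[:n_remove] = 0.0                    # drop the dominant components
    return np.real(np.fft.ifft2((u * s) @ vh))

# Horizontal banding is rank-1, so one singular component captures it all.
x = np.arange(64)
clutter = np.outer(np.ones(64), np.cos(2 * np.pi * 4 * x / 64))
cleaned = svd_spectral_filter(clutter)    # banding is almost entirely removed
```

On real data `section` would also contain target reflections, so choosing how many components to remove is the delicate part.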
18

Vukotić, Vedran, Vivien Chappelier, and Teddy Furon. "Are Classification Deep Neural Networks Good for Blind Image Watermarking?" Entropy 22, no. 2 (February 8, 2020): 198. http://dx.doi.org/10.3390/e22020198.

Full text
Abstract
Image watermarking is usually decomposed into three steps: (i) a feature vector is extracted from an image; (ii) it is modified to embed the watermark; (iii) and it is projected back into the image space while avoiding the creation of visual artefacts. This feature extraction is usually based on a classical image representation given by the Discrete Wavelet Transform or the Discrete Cosine Transform, for instance. These transformations require very accurate synchronisation between the embedding and the detection, and usually rely on various registration mechanisms for that purpose. This paper investigates a new family of transformations based on Deep Neural Networks trained with supervision for a classification task. Motivations come from the Computer Vision literature, which has demonstrated the robustness of these features against light geometric distortions. Also, the adversarial-sample literature provides means to implement the inverse transform needed in the third step mentioned above. As far as zero-bit watermarking is concerned, this paper shows that this approach is feasible as it yields a good quality of the watermarked images and an intrinsic robustness. We also test more advanced tools from Computer Vision such as aggregation schemes with weak geometry and retraining with a dataset augmented with classical image processing attacks.
19

Zavoyskih, М., A. Korobeynikov, A. Menlitdinov, V. Lyuminarskiy, and Yu Kuzelin. "The electrocardiogram signal morphology analysis based on convolutional neural network". Information Technology and Nanotechnology, no. 2416 (2019): 34–42. http://dx.doi.org/10.18287/1613-0073-2019-2416-34-42.

Full text
Abstract
The analysis of electrocardiogram signal morphology based on a convolutional neural network is considered. Input data are obtained by splitting the signal into cardiac cycles. The average cycle is calculated to exclude artefacts. The Haar wavelet transform of the average cycle is performed. Images of size 200×6 are the input data for the recognition system: 200 – the number of counts constituting the cycle; 6 – the number of Haar transform time scales. This work is a reconsideration of the previous work of the authors. The training samples base of marked cardiac cycle segments is the same (1500 cycles): the average cycle and the segments' start and end indexes. In the previous work, an original composite system consisting of several modules was used as the recognition system. In the current work it is proposed to use a convolutional neural network with a special structure: 4 convolutional layers, 2 dense layers, and 200 outputs for each of the 3 segments. The recognition system based on the neural network showed results slightly superior to the previous system. The percentage of acceptable localization of the segments is the following: P – 82.2%, QRS – 88.7%, and T – 85.4%. The proposed system effectively solves the problem using the standard modules of modern artificial neural networks.
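The input-preparation steps above (average the cardiac cycles, then stack Haar responses at several time scales into a 200×6 image) can be sketched as follows. The unnormalized, undecimated Haar filter and the synthetic sine "cycle" are assumptions for illustration; the paper's exact transform layout may differ.

```python
import numpy as np

def average_cycle(signal, cycle_len):
    """Split the signal into fixed-length cardiac cycles and average
    them, suppressing cycle-to-cycle artefacts."""
    n = len(signal) // cycle_len
    return signal[:n * cycle_len].reshape(n, cycle_len).mean(axis=0)

def haar_scales(cycle, n_scales=6):
    """Unnormalized, undecimated Haar responses at n_scales dyadic
    widths, stacked into a (len(cycle), n_scales) feature image."""
    out = np.zeros((len(cycle), n_scales))
    padded = np.pad(cycle, (0, 2 ** n_scales), mode="edge")
    for j in range(n_scales):
        w = 2 ** j
        for i in range(len(cycle)):
            out[i, j] = padded[i:i + w].sum() - padded[i + w:i + 2 * w].sum()
    return out

rng = np.random.default_rng(3)
base = np.sin(2 * np.pi * np.arange(200) / 200)      # toy 200-sample cycle
ecg = np.tile(base, 10) + 0.05 * rng.standard_normal(2000)
image = haar_scales(average_cycle(ecg, 200))          # shape (200, 6)
```

The resulting 200×6 array corresponds to the "image" fed into the CNN described in the abstract.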
20

Yadav, Nirmal. "Retinal blood vessels detection for diabetic retinopathy with Ridgelet transform and convolution neural network". International Journal of Wavelets, Multiresolution and Information Processing 18, no. 06 (September 11, 2020): 2050048. http://dx.doi.org/10.1142/s0219691320500484.

Full text
Abstract
Applying machine learning in life sciences, especially diagnostics, has become a key area of focus for researchers. Combining machine learning with traditional algorithms provides a unique opportunity to deliver better solutions for patients. In this paper, we present the results of applying the Ridgelet Transform method on retina images to enhance the blood vessels, then using machine learning algorithms to identify cases of Diabetic Retinopathy (DR). The Ridgelet transform provides better results for line singularities of the image function and, thus, helps to reduce artefacts along the edges of the image. The Ridgelet Transform method, when compared with earlier known methods of image enhancement, such as the Wavelet Transform and the Contourlet Transform, provided satisfactory results. The transformed image using the Ridgelet Transform method with pre-processing quantifies the amount of information in the dataset. It efficiently enhances the generation of feature vectors in the convolutional neural network (CNN). In this study, a sample of fundus photographs obtained from a publicly available dataset was processed. In pre-processing, CLAHE was applied first, followed by filtering and the application of the Ridgelet transform on the patches to improve the quality of the image. Then, this processed image was used for statistical feature detection and classified by a deep learning method to detect DR images in the dataset. The successful classification ratio was 98.61%. These results show that the Ridgelet-transformed fundus image enables better detection by leveraging a transform-based algorithm together with deep learning.
21

José, Marco V., and Ruth F. Bishop. "Scaling properties and symmetrical patterns in the epidemiology of rotavirus infection". Philosophical Transactions of the Royal Society of London. Series B: Biological Sciences 358, no. 1438 (October 29, 2003): 1625–41. http://dx.doi.org/10.1098/rstb.2003.1291.

Full text
Abstract
The rich epidemiological database of the incidence of rotavirus, as a cause of severe diarrhoea in young children, coupled with knowledge of the natural history of the infection, can make this virus a paradigm for studies of epidemic dynamics. The cyclic recurrence of childhood rotavirus epidemics in unvaccinated populations provides one of the best documented phenomena in population dynamics. This paper makes use of epidemiological data on rotavirus infection in young children admitted to hospital in Melbourne, Australia from 1977 to 2000. Several mathematical methods were used to characterize the overall dynamics of rotavirus infections as a whole and individually as serotypes G1, G2, G3, G4 and G9. These mathematical methods are as follows: seasonal autoregressive integrated moving-average (SARIMA) models, power spectral density (PSD), higher-order spectral analysis (HOSA) (bispectrum estimation and quadratic phase coupling (QPC)), detrended fluctuation analysis (DFA), wavelet analysis (WA) and a surrogate data analysis technique. Each of these techniques revealed different dynamic aspects of rotavirus epidemiology. In particular, we confirm the existence of an annual, a biannual and a quinquennial period, but additionally we found other embedded cycles (e.g. ca. 3 years). There seems to be an overall unique geometric and dynamic structure to the data despite the apparent changes in the dynamics of the last years. The inherent dynamics seem to be conserved regardless of the emergence of new serotypes, the re-emergence of old serotypes or the transient disappearance of a particular serotype. More importantly, the dynamics of all serotypes are multiply synchronized so that they behave as a single entity at the epidemic level. Overall, the whole dynamics follow a scale-free power-law fractal scaling behaviour.
We found that there are three different scaling regions in the time-series, suggesting that processes influencing the epidemic dynamics of rotavirus over less than 12 months differ from those that operate between 1 and ca. 3 years, as well as from those between 3 and ca. 5 years. To discard the possibility that the observed patterns could be due to artefacts, we applied a surrogate data analysis technique which enabled us to discern whether only random components or linear features of the incidence of rotavirus contribute to its dynamics. The global dynamics of the epidemic are portrayed by wavelet-based incidence analysis. The resulting wavelet transform of the incidence of rotavirus crisply reveals a repeating pattern over time that looks similar on many scales (a property called self-similarity). Both the self-similar behaviour and the absence of a single characteristic scale in the power-law fractal-like scaling of the incidence of rotavirus infection imply that there is not a universal, inherently more virulent serotype to which severe gastroenteritis can uniquely be ascribed.
22

Walinjkar, Amit. "A Composite and Wearable Sensor Kit for Location-Aware Healthcare Monitoring and Real-Time Trauma Scoring for Survival Prediction". Applied System Innovation 1, no. 3 (September 12, 2018): 35. http://dx.doi.org/10.3390/asi1030035.

Full text
Abstract
With the advances in the microfabrication of analogue front-end devices, and in embedded and signal processing technology, it has now become possible to devise miniaturized health monitoring kits for non-invasive real-time monitoring at any location. The commonly available kits only measure singleton physiological parameters, and a composite analysis that covers all vital signs and trauma scores seems to be missing from these kits. The research aims at using vital signs and other physiological parameters to calculate the trauma scores National Early Warning Score (NEWS), Revised Trauma Score (RTS), Trauma Score - Injury Severity Score (TRISS) and Prediction of survival (Ps), and to log the trauma event to electronic health records using standard coding schemes. The signal processing algorithms were implemented in MATLAB and could be ported to TI AM335x using MATLAB/Embedded Coder. Motion artefacts were removed using a level ‘5’ stationary wavelet transform and a ‘sym4’ wavelet, which yielded a signal-to-noise ratio of 27.83 dB. To demonstrate the operation of the device, an existing Physionet MIMIC II Numerics dataset was used to calculate NEWS and RTS scores, and to generate the correlation and regression models for a clinical class of patients with respiratory failure admitted to an Intensive Care Unit (ICU). Parameters such as age, heart rate, Systolic Blood Pressure (SysBP), respiratory rate, and Oxygen Saturation (SpO2) as predictors of Ps showed significant positive regressions of 93% at p < 0.001. The NEWS and RTS scores showed no significant correlation (r = 0.25, p < 0.001) amongst themselves; however, the NEWS and RTS together showed significant correlations with Ps (blunt) (r = 0.70, p < 0.001). RTS and Ps (blunt) scores showed some correlation (r = 0.63, p < 0.001), and the NEWS score showed a significant correlation (r = 0.79, p < 0.001) with Ps (blunt) scores.
Global Positioning System (GPS) system was built into the kit to locate the individual and to calculate the shortest path to the nearest healthcare center using the Quantum Geographical Information System (QGIS) Network Analysis tool. The physiological parameters from the sensors, along with the calculated trauma scores, were encoded according to a standard Systematized Nomenclature of Medicine-Clinical Terms (SNOMED-CT) coding system, and the trauma information was logged to electronic health records using Fast Health Interoperability Resources (FHIR) servers. The FHIR servers provided interoperable web services to log the trauma event information in real time and to prepare for medical emergencies.
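The stationary-wavelet denoising step described in this abstract can be sketched as follows. This is an illustrative single-level Haar version in plain NumPy, not the paper's level-5 'sym4' MATLAB implementation; the threshold value and the test signal are arbitrary assumptions.

```python
import numpy as np

def swt_haar_denoise(x, threshold):
    """One-level undecimated (stationary) Haar wavelet denoising.

    Simplified stand-in for a level-5 'sym4' SWT: the detail band is
    soft-thresholded and the signal rebuilt by averaging the two
    shift-dependent reconstructions (circular boundary).
    """
    s = np.sqrt(2.0)
    xm1 = np.roll(x, 1)                      # x[n-1], circular boundary
    a = (x + xm1) / s                        # approximation band
    d = (x - xm1) / s                        # detail band
    d = np.sign(d) * np.maximum(np.abs(d) - threshold, 0.0)  # soft threshold
    # Each sample belongs to two overlapping Haar pairs; average both
    # reconstructions for shift invariance.
    rec_from_n = (a + d) / s                 # x[n] from the pair (x[n-1], x[n])
    rec_from_np1 = np.roll(a - d, -1) / s    # x[n] from the pair (x[n], x[n+1])
    return 0.5 * (rec_from_n + rec_from_np1)

def snr_db(clean, estimate):
    """Signal-to-noise ratio (dB) of an estimate against a clean reference."""
    noise = estimate - clean
    return 10.0 * np.log10(np.sum(clean**2) / np.sum(noise**2))

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 1024, endpoint=False)
clean = np.sin(2 * np.pi * 5 * t)                 # stand-in physiological signal
noisy = clean + 0.2 * rng.standard_normal(t.size)
denoised = swt_haar_denoise(noisy, threshold=0.3)
```

With a zero threshold the transform reconstructs the input exactly, which is a convenient sanity check before tuning the threshold.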
23

Allian, Farhad and Rekha Jain. "The need for new techniques to identify the high-frequency MHD waves of an oscillating coronal loop". Astronomy & Astrophysics 650 (June 2021): A91. http://dx.doi.org/10.1051/0004-6361/202039763.

Abstract
Context. Magnetic arcades in the solar atmosphere, or coronal loops, are common structures known to host magnetohydrodynamic (MHD) waves and oscillations. Of particular interest are the observed properties of transverse loop oscillations, such as their frequency and mode of oscillation, which have received significant attention in recent years because of their seismological capability. Previous studies have relied on standard data analysis techniques, such as a fast Fourier transform (FFT) and wavelet transform (WT), to correctly extract periodicities and identify the MHD modes. However, the ways in which these methods can lead to artefacts requires careful investigation. Aims. We aim to assess whether these two common spectral analysis techniques in coronal seismology can successfully identify high-frequency waves from an oscillating coronal loop. Methods. We examine extreme ultraviolet images of a coronal loop observed by the Atmospheric Imaging Assembly in the 171 Å waveband on board the Solar Dynamics Observatory. We perform a spectral analysis of the loop waveform and compare our observation with a basic simulation. Results. The spectral FFT and WT power of the observed loop waveform is found to reveal a significant signal with frequency ∼2.67 mHz superposed onto the dominant mode of oscillation of the loop (∼1.33 mHz), that is, the second harmonic of the loop. The simulated data show that the second harmonic is completely artificial even though both of these methods identify this mode as a real signal. This artificial harmonic, and several higher modes, are shown to arise owing to the periodic but non-uniform brightness of the loop. We further illustrate that the reconstruction of the ∼2.67 mHz component, particularly in the presence of noise, yields a false perception of oscillatory behaviour that does not otherwise exist. We suggest that additional techniques, such as a forward model of a 3D coronal arcade, are necessary to verify such high-frequency waves. 
Conclusions. Our findings have significant implications for coronal seismology, as we highlight the dangers of attempting to identify high-frequency MHD wave modes using these standard data analysis techniques.
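The artefact mechanism described above, a strictly periodic but non-uniform waveform producing spurious spectral power at twice the fundamental frequency, is easy to reproduce. The cadence, fundamental frequency, and distortion term below are illustrative assumptions, not the AIA observations analysed in the paper.

```python
import numpy as np

N, dt = 4096, 12.0            # samples and cadence (s); 12 s is AIA-like
f0 = 66 / (N * dt)            # fundamental ~1.34 mHz, placed on an exact FFT bin

t = np.arange(N) * dt
phase = 2 * np.pi * f0 * t
# Strictly periodic but non-sinusoidal waveform: only one physical period
# is present, yet the |sin| distortion forces power at 2*f0 (and 4*f0, ...),
# mimicking a spurious "second harmonic".
waveform = np.sin(phase) + 0.3 * np.abs(np.sin(phase))

spec = np.abs(np.fft.rfft(waveform - waveform.mean())) ** 2
freqs = np.fft.rfftfreq(N, d=dt)

k0 = np.argmax(spec)                      # fundamental bin
k2 = np.argmin(np.abs(freqs - 2 * f0))    # bin nearest 2*f0
# spec[k2] is large even though no independent 2*f0 oscillation exists
```

The fundamental is chosen to sit on an exact FFT bin so the harmonic peak is not an artefact of spectral leakage, which isolates the effect the paper describes.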
24

Maraun, D. and J. Kurths. "Cross wavelet analysis: significance testing and pitfalls". Nonlinear Processes in Geophysics 11, no. 4 (November 11, 2004): 505–14. http://dx.doi.org/10.5194/npg-11-505-2004.

Abstract
In this paper, we present a detailed evaluation of cross wavelet analysis of bivariate time series. We develop a statistical test for zero wavelet coherency based on Monte Carlo simulations. If at least one of the two processes considered is Gaussian white noise, an approximate formula for the critical value can be utilized. In a second part, typical pitfalls of wavelet cross spectra and wavelet coherency are discussed. The wavelet cross spectrum appears not to be suitable for testing the significance of the interrelation between two processes; instead, one should rather apply wavelet coherency. Furthermore, we investigate problems due to multiple testing. Based on these results, we show that coherency between ENSO and NAO is an artefact for most of the time from 1900 to 1995. However, during a distinct period from around 1920 to 1940, significant coherency between the two phenomena occurs.
25

Nagai, Shuto, Daisuke Anzai and Jianqing Wang. "Motion artefact removals for wearable ECG using stationary wavelet transform". Healthcare Technology Letters 4, no. 4 (June 14, 2017): 138–41. http://dx.doi.org/10.1049/htl.2016.0100.

26

An, Xiang and George K. Stylios. "Comparison of Motion Artefact Reduction Methods and the Implementation of Adaptive Motion Artefact Reduction in Wearable Electrocardiogram Monitoring". Sensors 20, no. 5 (March 7, 2020): 1468. http://dx.doi.org/10.3390/s20051468.

Abstract
A motion artefact is a kind of noise that exists widely in wearable electrocardiogram (ECG) monitoring. Reducing motion artefacts is challenging in ECG signal preprocessing because their spectrum usually overlaps with very important spectral components of the ECG signal. In this paper, the performance of the finite impulse response (FIR) filter, infinite impulse response (IIR) filter, moving average filter, moving median filter, wavelet transform, empirical mode decomposition, and adaptive filter in motion artefact reduction is studied and compared. The results of this study demonstrate that the adaptive filter performs better than the other denoising methods, especially in dealing with abnormal ECG signals measured from patients with heart disease. In the implementation of adaptive motion artefact reduction, the results show that using the impedance pneumography signal as the reference input for the adaptive filter can effectively reduce the motion artefact in the ECG signal.
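The reference-input adaptive filtering described above is commonly implemented with an LMS update. The sketch below is a generic illustration on synthetic signals, not the authors' implementation; the tap count, step size, and the FIR artefact model are assumptions.

```python
import numpy as np

def lms_cancel(primary, reference, n_taps=8, mu=0.01):
    """Adaptive noise cancellation with the LMS algorithm.

    `primary` carries signal + artefact; `reference` correlates with the
    artefact only (in the paper's setup, the impedance pneumography
    signal). The filter learns to predict the artefact from the
    reference; the prediction error is the cleaned output.
    """
    w = np.zeros(n_taps)
    cleaned = np.zeros(primary.size)
    for i in range(n_taps - 1, primary.size):
        x = reference[i - n_taps + 1:i + 1][::-1]  # newest sample first
        e = primary[i] - w @ x                     # error = cleaned sample
        w += 2 * mu * e * x                        # LMS weight update
        cleaned[i] = e
    return cleaned

rng = np.random.default_rng(1)
N = 5000
ecg_like = np.sin(2 * np.pi * np.arange(N) / 100)       # stand-in for the ECG
reference = rng.standard_normal(N)                      # artefact source
artefact = np.convolve(reference, [0.5, -0.3, 0.2])[:N] # causal colouring
primary = ecg_like + artefact
cleaned = lms_cancel(primary, reference)
```

Because the desired signal is uncorrelated with the reference, the filter converges toward the artefact path and the residual approaches the clean waveform.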
27

P, Vijaya and Binu D. "Introduction to the Special Issue on Intelligence on Scalable computing for Recent Applications". Scalable Computing: Practice and Experience 21, no. 2 (June 27, 2020): 157–58. http://dx.doi.org/10.12694/scpe.v21i2.1581.

Abstract
The special issue focuses on overcoming the challenges of scalability, which include size scalability, geographical scalability, administrative scalability, and network and synchronous communication limitations. These challenges also emerge with the development of recent applications; hence this issue was planned to handle scalability problems in such applications. The issue invited researchers, engineers, educators, managers, programmers, and users of computers with particular interests in parallel processing and/or distributed computing and artificial intelligence to submit original research papers and timely review articles on the theory, design, evaluation, and use of artificial intelligence and parallel and/or distributed computing systems for emerging applications. The ten papers in this special issue cover a range of aspects of theoretical and practical research on scalable computing. The issue provides an effective forum for communication among researchers and practitioners from various scientific areas working in a wide variety of problem areas, who share a fundamental interest in improving the capabilities of parallel and distributed computer systems, intelligent techniques, deep learning mechanisms, and advanced soft computing techniques. It covers a wide range of applications whose scalability problems are to be solved by the hybridization of distributed computing and artificial intelligence.
The first paper, "CPU-Memory Aware VM Consolidation for Cloud Data Centers", introduces a CPU-Memory aware VM placement algorithm for selecting a suitable destination host for migration. The Virtual Machines are selected using a Fuzzy Soft Set (FSS) VM selection algorithm. The proposed placement algorithm considers the CPU, memory, and combined CPU-Memory utilization of VMs on the source host.
"Bird Swarm Optimization-based stacked autoencoder deep learning for umpire detection and classification" presents umpire detection and classification using a proposed optimization algorithm. The overall procedure involves three steps: segmentation, feature extraction, and classification. The classification is done using the proposed Bird Swarm Optimization-based stacked autoencoder deep learning classifier (BSO-Stacked Autoencoders), which categorises images as umpire or others.
"Enhanced DBSCAN with Hierarchical tree for Web Rule Mining" proposes an enhanced web mining model based on two contributions. First, a hierarchical tree is framed, which produces different categories of search queries (different web pages). Next, an enhanced Density-Based Spatial Clustering of Applications with Noise (DBSCAN) technique is developed by modifying the traditional DBSCAN. This technique results in proper session identification from raw data and offers the optimal number of clusters required for hierarchical clustering. After hierarchical clustering, rule mining is adopted. Traditional rule mining is generally based on frequency; as the second contribution, this paper instead enhances it with a utility factor. Hence the proposed model for web rule mining is termed Enhanced DBSCAN-based Hierarchical Tree (EDBHT).
"A comprehensive survey of the Routing Schemes for IoT applications" reviews 52 research papers presenting routing protocols based on content, clustering, fuzzy logic, the Routing Protocol for Low power and Lossy Networks (RPL), trees, and so on. A detailed analysis and discussion are made concerning the parameters, simulation tool, year of publication, network size, evaluation metrics, and utilized protocols.
"Chicken-Moth Search Optimization-Based Deep Convolutional Neural Network For Image Steganography" proposes an effective pixel-prediction-based image steganography scheme, which employs an error-dependent Deep Convolutional Neural Network (DCNN) classifier for pixel identification. The best pixels are identified from the medical image by the DCNN classifier using pixel features such as texture, wavelet energy, Gabor, and scattering features. The DCNN is optimally trained using Chicken-Moth Search Optimization (CMSO), designed by integrating the Chicken Swarm Optimization (CSO) and Moth Search Optimization (MSO) algorithms based on limited error.
"An Efficient Dynamic Slot Scheduling Algorithm for WSN MAC: A Distributed Approach" proposes a TDMA-based slot scheduling algorithm named DYSS that meets both timeliness and energy efficiency in handling collisions. The algorithm prepares the initial schedule by using the average two-hop neighbour count; the remaining unallotted nodes are then dynamically assigned to slots using a novel approach.
"Artefacts removal from ECG Signal: Dragonfly optimization-based learning algorithm for neural network-enhanced adaptive filtering" proposes a method that utilizes an adaptive filter, the (Dragonfly optimization + Levenberg-Marquardt learning algorithm) DLM-based Nonlinear Autoregressive with eXogenous input (NARX) neural network, for the removal of artefacts from ECG signals. Once the artefact signal is identified using the adaptive filter, it is subtracted from the primary signal, which is composed of the ECG signal and the artefacts, through an adaptive subtraction procedure.
"A Comprehensive Review on State-of-the-Art Image Inpainting Techniques" makes a critical analysis of diverse image inpainting schemes. The survey (i) analyses the various image inpainting techniques contributed in different papers; (ii) comprehensively studies the performance measures and the corresponding maximum achievements of each contribution; and (iii) provides an analytical, chronological review of the various tools exploited in each of the reviewed works.
"An Efficient Way of Finding Polarity of Roman Urdu Reviews by Using Boolean Rules" proposes a novel approach that uses Boolean rules to identify related and non-related comments. Related reviews are those which show the behaviour of a customer towards a particular product; lexicons are built for the identification of noise and of positive and negative reviews.
The final paper, "Forecasting the Impact of Social Media Advertising among College Students using Higher Order Statistical Functions", develops a statistical review of social media advertising among college students from diverse universities. The review is organised into six sections: (i) personal profile; (ii) usage; (iii) assessment; (iv) higher-order statistics such as community, connectedness, openness, dependence, and participation; (v) trustworthiness, covering trust, perceived value, and perceived risk; and (vi) attitude towards advertisement, response towards advertisement, and purchase intention.
28

Chrapka, Philip, Hubert de Bruin, Gary Hasey and Jim Reilly. "Wavelet-Based Muscle Artefact Noise Reduction for Short Latency rTMS Evoked Potentials". IEEE Transactions on Neural Systems and Rehabilitation Engineering 27, no. 7 (July 2019): 1449–57. http://dx.doi.org/10.1109/tnsre.2019.2908951.

29

Perpetuini, David, Daniela Cardone, Chiara Filippini, Antonio Maria Chiarelli and Arcangelo Merla. "A Motion Artifact Correction Procedure for fNIRS Signals Based on Wavelet Transform and Infrared Thermography Video Tracking". Sensors 21, no. 15 (July 28, 2021): 5117. http://dx.doi.org/10.3390/s21155117.

Abstract
Functional near infrared spectroscopy (fNIRS) is a neuroimaging technique that allows the functional hemoglobin oscillations related to cortical activity to be monitored. One of the main issues in fNIRS applications is motion artefact removal, since a corrupted physiological signal is not correctly indicative of the underlying biological process. A novel procedure for motion artifact correction of fNIRS signals, based on the wavelet transform and a video tracking procedure developed for infrared thermography (IRT), is presented. In detail, fNIRS and IRT were recorded concurrently, and the optodes' movement was estimated employing a video tracking procedure developed for IRT recordings. The wavelet transforms of the fNIRS signal and of the optodes' movement, together with their wavelet coherence, were computed. The inverse wavelet transform of the fNIRS signal was then evaluated, excluding the frequency content corresponding to the optodes' movement and to the coherence in the epochs where these exceeded an established threshold. The method was tested using simulated functional hemodynamic responses added to real resting-state fNIRS recordings corrupted by movement artifacts. The results demonstrated the effectiveness of the procedure in eliminating noise, producing results with a higher signal-to-noise ratio than another validated method.
30

Gomez, Christopher, Kyoko Kataoka, Aditya Saputra, Patrick Wassmer, Atsushi Urabe, Justin Morgenroth and Akira Kato. "Photogrammetry-based Texture Analysis of a Volcaniclastic Outcrop-peel: Low-cost Alternative to TLS and Automation Potentialities using Haar Wavelet and Spatial-Analysis Algorithms". Forum Geografi 31, no. 1 (July 1, 2017): 16–27. http://dx.doi.org/10.23917/forgeo.v31i1.3977.

Abstract
Considerable progress has been made in the field of applied photogrammetry in the last decade, including the use of close-range photogrammetry as a means of conserving and recording outcrops. In the present contribution, we use the SfM-MVS method combined with a wavelet decomposition analysis of the surface, in order to relate it to morphological and surface roughness data. The results demonstrated that wavelet decomposition and RMS could provide rapid insight into the location of coarser materials and individual outliers, while arithmetic surface roughness was more useful for detecting units or layers that are similar across the outcrop. The method also shows that automating the process does not allow a clear distinction between artefact cracks and genuine surface changes, and that human supervision is still essential despite the original goal of automating the outcrop surface analysis.
31

Foo, Jong Yong A. "Comparison of wavelet transformation and adaptive filtering in restoring artefact-induced time-related measurement". Biomedical Signal Processing and Control 1, no. 1 (January 2006): 93–98. http://dx.doi.org/10.1016/j.bspc.2006.01.001.

32

Astuti, Baiq Siska Febriani, Santi Wulan Purnami, R. Mohamad Atok, Wardah Rahmatul Islamiyah, Diah Puspito Wulandari and Anda Iviana Juniani. "Classify Epileptic EEG Signals Using Extreme Support Vector Machine for Ictal and Muscle Artifact Detection". International Journal of Machine Learning and Computing 11, no. 2 (March 2021): 170–75. http://dx.doi.org/10.18178/ijmlc.2021.11.2.1031.

Abstract
EEG signals aid diagnosis through the various wave patterns recorded from brain activity; the recording process, however, also produces unavoidable artifacts. The purpose of this study is therefore to detect ictal and artefact signals, with the aim of reducing interpretation errors, especially those related to muscle artefacts, which are quite difficult to distinguish. The data used are EEG signal recordings obtained from Rumah Sakit Universitas Airlangga, consisting of two classes, namely ictal and muscle artefact. The signal decomposition method used is the discrete wavelet transform (DWT). The extracted features consist of the quartiles, maximum, minimum, mean, and standard deviation. This study utilized the SVM with linear, polynomial, and RBF kernels, as well as the ELM variant (ESVM). The results show that ESVM classification is faster than the SVM with the other kernels; however, its accuracy, sensitivity, specificity, and AUC values are not better.
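The DWT-plus-statistics feature pipeline described above can be sketched with a hand-rolled Haar transform (standing in for the unspecified wavelet and decomposition depth used in the study); the epoch length and level count below are illustrative assumptions, and the resulting vector would feed a classifier such as the (E)SVM.

```python
import numpy as np

def haar_dwt(x):
    """One-level discrete Haar wavelet transform (x must have even length)."""
    even, odd = x[0::2], x[1::2]
    approx = (even + odd) / np.sqrt(2.0)
    detail = (even - odd) / np.sqrt(2.0)
    return approx, detail

def band_features(band):
    """The statistics named in the abstract, computed for one sub-band."""
    q1, q2, q3 = np.percentile(band, [25, 50, 75])  # quartiles
    return [q1, q2, q3, band.max(), band.min(), band.mean(), band.std()]

def feature_vector(signal, levels=3):
    """Decompose and collect features from every detail band plus the
    final approximation band."""
    feats = []
    approx = np.asarray(signal, dtype=float)
    for _ in range(levels):
        approx, detail = haar_dwt(approx)
        feats.extend(band_features(detail))
    feats.extend(band_features(approx))
    return np.array(feats)

rng = np.random.default_rng(2)
epoch = rng.standard_normal(512)   # stand-in for one EEG epoch
fv = feature_vector(epoch)         # 7 features x (3 detail + 1 approx) bands
```

Three levels of a 512-sample epoch give sub-bands of 256, 128, and 64 samples, so the vector has 28 entries.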
33

Prema, P., T. Kesavamurthy and K. Ramadoss. "Performance analysis of wavelet basis function in de-trending and ocular artefact removal from electroencephalogram". International Journal of Biomedical Engineering and Technology 30, no. 3 (2019): 263. http://dx.doi.org/10.1504/ijbet.2019.100696.

34

Ramadoss, K., T. Kesavamurthy and P. Prema. "Performance analysis of wavelet basis function in de-trending and ocular artefact removal from electroencephalogram". International Journal of Biomedical Engineering and Technology 30, no. 3 (2019): 263. http://dx.doi.org/10.1504/ijbet.2019.10022268.

35

Peters, C. H. L., R. Vullings, M. J. Rooijakkers, J. W. M. Bergmans, S. G. Oei and P. F. F. Wijn. "A continuous wavelet transform-based method for time-frequency analysis of artefact-corrected heart rate variability data". Physiological Measurement 32, no. 10 (August 18, 2011): 1517–27. http://dx.doi.org/10.1088/0967-3334/32/10/001.

36

Feng, Lichen, Zunchao Li and Jian Zhang. "Fast automated on‐chip artefact removal of EEG for seizure detection based on ICA‐R algorithm and wavelet denoising". IET Circuits, Devices & Systems 14, no. 4 (May 22, 2020): 547–54. http://dx.doi.org/10.1049/iet-cds.2019.0491.

37

Sai, Chong Yeh, Norrima Mokhtar, Masahiro Iwahashi, Paul Cumming and Hamzah Arof. "Fully automated unsupervised artefact removal in multichannel electroencephalogram using wavelet‐independent component analysis with density‐based spatial clustering of application with noise". IET Signal Processing 15, no. 8 (June 12, 2021): 535–42. http://dx.doi.org/10.1049/sil2.12058.

38

Gómez, Kevin Alejandro Hernández, Julian D. Echeverry-Correa and Álvaro Ángel Orozco Gutiérrez. "Automatic Pectoral Muscle Removal and Microcalcification Localization in Digital Mammograms". Healthcare Informatics Research 27, no. 3 (July 31, 2021): 222–30. http://dx.doi.org/10.4258/hir.2021.27.3.222.

Abstract
Objectives: Breast cancer is the most common cancer diagnosed in women, and microcalcification (MCC) clusters act as an early indicator. Thus, the detection of MCCs plays an important role in diagnosing breast cancer.
Methods: This paper presents a methodology for mammogram preprocessing and MCC detection. The preprocessing method employs automatic artefact deletion and pectoral muscle removal based on region-growing segmentation and polynomial contour fitting. The MCC detection method uses a convolutional neural network for region-of-interest (ROI) classification, along with morphological operations and wavelet reconstruction to reduce false positives (FPs).
Results: The methodology was evaluated using the mini-MIAS and UTP datasets in terms of segmentation accuracy in the preprocessing phase, as well as sensitivity and the mean FP rate per image in the MCC detection phase. With the mini-MIAS dataset, the proposed methods achieved accuracy scores of 99% for breast segmentation and 95% for pectoral segmentation, a sensitivity score of 82% for MCC detection, and an FP rate per image of 3.27. With the UTP dataset, the methods achieved accuracy scores of 97% for breast segmentation and 91% for pectoral segmentation, a sensitivity score of 78% for MCC detection, and an FP rate per image of 0.74.
Conclusions: The proposed preprocessing method outperformed the state-of-the-art methods for breast segmentation and achieved relatively good results for pectoral muscle removal. Furthermore, the MCC detection module achieved the highest test accuracy in identifying potential ROIs with MCCs compared to other methods.
39

Roberts, M. B., S. A. Parfitt, M. I. Pope, F. F. Wenban-Smith, R. I. Macphail, A. Locker and J. R. Stewart. "Boxgrove, West Sussex: Rescue Excavations of a Lower Palaeolithic Landsurface (Boxgrove Project B, 1989–91)". Proceedings of the Prehistoric Society 63 (1997): 303–58. http://dx.doi.org/10.1017/s0079497x00002474.

Abstract
In 1988 an area of 12,000 m² in Quarry 2 at Boxgrove, West Sussex, was identified as being under threat from gravel and sand extraction. It was decided to sample the threatened area in 1989 with a series of 6 m² test pits. The results of this survey identified two areas that merited further investigation, and area excavations were carried out at Quarry 2/C and Quarry 2/D in 1990 and 1991 respectively. These concentrated on the main Pleistocene landsurface (Unit 4c) and revealed spreads of knapping debris associated with the production of flint handaxes. Two test pits and area Q2/C produced handaxes, over 90% of which had tranchet sharpening at the distal end. A small amount of core reduction and only a few flake tools were found: these were all from Quarry 2/C. Faunal remains were located in the northern part of the excavations where Unit 4c had a calcareous cover. In Quarry 2/C the bones of C. elaphus and Bison sp. exhibited traces of human modification. The project employed two methods of artefact retrieval: direct excavation in metre squares and bulk sieving of units within them. Comparison of the results from these methods suggests that, when on-site time is limited, the integration of these methods is a valid technique in both qualitative and quantitative terms for data recovery. The excavated areas are interpreted as a tool-sharpening and butchery site that may have been a fixed and known locale in the landscape (Q2/C), and a location on the periphery of an area of intensive knapping reduction (Q2/D). Sedimentological and microfaunal analyses demonstrate that Unit 4c was formed as a soil in the top of a marine-lagoonal silt, the pedogenic processes being similar to those observed after draining Dutch polder lakes. The palaeoenvironment is interpreted as an area of open grassland with some shrub and bush vegetation. In places the surface of the soil supported small ephemeral pools and flashes.
This area of grassland is seen as a corridor for herds of ungulates moving east and west between the sea to the south and the relict cliff and wooded downland block to the north. Within this corridor these herds were preyed upon by various carnivores, and by hominids. The temperate sediments at Boxgrove were deposited in the later part of the Cromerian Complex and immediately pre-date the Anglian Cold Stage; they are therefore around 500,000 years old. The archaeological material from these and overlying cold stage deposits is broadly contemporary with that at High Lodge, Suffolk and Waverley Wood, Warwickshire.
40

"Artefact Removal from EEG Signals using Total Variation De-noising". International Journal of Innovative Technology and Exploring Engineering 9, no. 5 (March 10, 2020): 2357–61. http://dx.doi.org/10.35940/ijitee.e2703.039520.

Abstract
Artefact removal (de-noising) from EEG signals has long been important to medical practitioners for the diagnosis of brain-related health issues, and several methods have been used over the last few decades. Wavelet-based and total variation based de-noising have attracted the attention of engineers and scientists due to their de-noising efficiency. In this article, EEG signals have been de-noised using a total variation based method, and the results obtained have been compared with those from the celebrated wavelet-based methods. The performance of the methods is measured using two parameters: signal-to-noise ratio and root mean square error. It has been observed that total variation based de-noising produces better results than the wavelet-based methods.
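The total variation de-noising compared above can be sketched as follows. The dual projected-gradient ("iterative clipping") solver, regularisation weight, and piecewise-constant test signal are illustrative choices, since the article does not specify its solver.

```python
import numpy as np

def _Dt(z):
    """Adjoint of the first-difference operator D, where D x = np.diff(x)."""
    out = np.empty(z.size + 1)
    out[0] = -z[0]
    out[1:-1] = z[:-1] - z[1:]
    out[-1] = z[-1]
    return out

def tv_denoise_1d(y, lam, n_iter=1000):
    """1D total variation de-noising:
        argmin_x 0.5*||y - x||^2 + lam * sum_i |x[i+1] - x[i]|
    solved by projected gradient ascent on the dual problem.
    """
    z = np.zeros(y.size - 1)        # dual variable, one per difference
    for _ in range(n_iter):
        x = y - _Dt(z)              # primal estimate from the dual
        # ascent step 1/4 (max eigenvalue of D D^T is below 4),
        # then project onto the box [-lam, lam]
        z = np.clip(z + 0.25 * np.diff(x), -lam, lam)
    return y - _Dt(z)

rng = np.random.default_rng(3)
clean = np.repeat([0.0, 1.0, 0.3, 1.2], 50)   # piecewise-constant target
noisy = clean + 0.1 * rng.standard_normal(clean.size)
denoised = tv_denoise_1d(noisy, lam=0.3)
```

TV regularisation suppresses small oscillations while preserving sharp steps, which is why it is an attractive alternative to wavelet shrinkage for signals with abrupt transitions.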
41

Marsh, Richard J., Ishan Costello, Mark-Alexander Gorey, Donghan Ma, Fang Huang, Mathias Gautel, Maddy Parsons and Susan Cox. "Sub-diffraction error mapping for localisation microscopy images". Nature Communications 12, no. 1 (September 23, 2021). http://dx.doi.org/10.1038/s41467-021-25812-z.

Abstract
Assessing the quality of localisation microscopy images is highly challenging due to the difficulty in reliably detecting errors in experimental data. The most common failure modes are the biases and errors produced by the localisation algorithm when there is emitter overlap. Also known as the high density or crowded field condition, significant emitter overlap is normally unavoidable in live cell imaging. Here we use Haar wavelet kernel analysis (HAWK), a localisation microscopy data analysis method which is known to produce results without bias, to generate a reference image. This enables mapping and quantification of reconstruction bias and artefacts common in all but low emitter density data. By avoiding comparisons involving intensity information, we can map structural artefacts in a way that is not adversely influenced by nonlinearity in the localisation algorithm. The HAWK Method for the Assessment of Nanoscopy (HAWKMAN) is a general approach which allows for the reliability of localisation information to be assessed.
42

Zhou, Bo, Adam J. Ruggles, Erxiong Huang and Jonathan H. Frank. "Wavelet-based algorithm for correction of beam-steering artefacts in turbulent flow imaging at elevated pressures". Experiments in Fluids 60, no. 8 (July 29, 2019). http://dx.doi.org/10.1007/s00348-019-2782-6.

43

Du, Xiuli, Jinting Liu, Wei Zhang and Ya'na Lv. "Blocking artefacts reduction based on a ripple matrix permutation image of high‐frequency images in the wavelet domain". IET Image Processing, April 20, 2021. http://dx.doi.org/10.1049/ipr2.12217.

44

Rosario Quirino Iannone, Stefano Casadio and Bojan Bojkov. "A new method for the validation of the GOMOS high resolution temperature profiles products". Annals of Geophysics 57, no. 5 (October 14, 2014). http://dx.doi.org/10.4401/ag-6487.

Abstract
This article proposes a new validation method for GOMOS HRTP atmospheric temperature and density profiles, with the aim of detecting and removing 0.2 to 5 km scale vertical structures in order to minimise the impact of atmospheric artefacts in the comparison exercises. The proposed approach is based on the use of the "Morlet" Continuous Wavelet Transformation (CWT) for the characterisation and removal of non-stationary and localised vertical structures, in order to produce wave-free profiles of atmospheric temperature and density. Comparison of wave-free temperature/density profiles and wavy-structure profiles with those estimated from a limited number of collocated SHADOZ soundings for the years 2003, 2004 and 2008 is discussed in detail. First results suggest that the proposed approach could lead to a significantly improved HRTP validation scheme, in terms of reduced uncertainties in the estimated biases. Furthermore, this method may be adopted for the study of the vertical component of gravity waves from high spatial/temporal resolution data.