Follow this link to see other types of publications on the topic: Biomedical analysis techniques.

Theses on the topic "Biomedical analysis techniques"

Create an accurate citation in APA, MLA, Chicago, Harvard, and other styles


Consult the top 50 theses for your research on the topic "Biomedical analysis techniques".

Next to each source in the list of references there is an "Add to bibliography" button. Press this button, and we will automatically generate the bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Vancouver, Chicago, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Explore theses on a wide variety of disciplines and organize your bibliography correctly.

1

Esposito, Andrea. "Techniques of proteomic analysis as tools for studies in biomedical field". Doctoral thesis, Università degli studi di Salerno, 2017. http://hdl.handle.net/10556/2487.

Full text
Abstract
It is known that prenatal exposure to pollutants, and particularly to heavy metals, can have long-term damaging consequences for infants, due to their accumulation in the body. Since the 1990s, ten million tonnes of waste have been illegally dumped in the area around Caserta and Naples. Thus, direct exposure to waste and heavy metals during the last two decades was very frequent in the so-called "Land of Fires". The number of children suffering from cancer and of malformed fetuses in Italy's "Land of Fires", an area where toxic waste has been dumped by the mafia, is reported to be significantly higher than elsewhere in the country. In this thesis we examined the proteome of umbilical cords from malformed fetuses obtained by therapeutic abortion, the mothers having been exposed to the pollution of the "Land of Fires" during early pregnancy, and analyzed the differences between umbilical cords from malformed fetuses and those from healthy ones. The main goals were to understand the impact of contamination by heavy metals on fetal development, and to identify new putative biomarkers of exposure to metal contaminants. All umbilical cords were obtained in the Campania region (Naples and Caserta, mainly in the "Land of Fires"). The collection of the biological samples was carried out in collaboration with the Caserta hospital "Sant'Anna e San Sebastiano" and with the Avellino hospital "San Giuseppe Moscati". A proteomic approach based on the Filter-Aided Sample Preparation (FASP) method was set up and performed. This bio-analytical strategy combines the advantages of in-gel and in-solution digestion for mass spectrometry-based proteomics, greatly reduces the time required for sample preparation and allows more flexibility in sample processing. Protein identification and quantification were performed by matching mass spectrometry data against an online protein database, using the MaxQuant 1.5.2.8 software. Statistical analyses were employed to identify proteins whose levels differed significantly in the umbilical cords from malformed fetuses. Gene Ontology (GO) classification was used to obtain functional information on the differentially expressed proteins and to relate them to embryonic development. Finally, since Matrix Metalloproteinases (MMPs) have been shown to play significant roles in a number of physiological processes, including embryogenesis and angiogenesis, but also contribute to the development of pathological processes, the gelatin zymography technique was performed to detect MMP enzymatic activity in the umbilical cords. Our results support a significant role of MMPs in fetal development. [edited by author]
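The statistical step described above can be illustrated with a short, self-contained sketch. The following Python example is not the author's pipeline: the intensity table is synthetic, and Welch's t-test with a Benjamini-Hochberg adjustment stands in for whatever tests the thesis actually applied to the MaxQuant output.

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical log2 label-free intensities: 200 proteins x (6 malformed + 6 control) cords.
malformed = rng.normal(20.0, 1.0, size=(200, 6))
control = rng.normal(20.0, 1.0, size=(200, 6))
malformed[:10] += 1.5  # plant 10 truly up-regulated proteins

# Welch's t-test per protein (row), then a Benjamini-Hochberg adjustment.
t, p = stats.ttest_ind(malformed, control, axis=1, equal_var=False)
order = np.argsort(p)
ranks = np.empty_like(order)
ranks[order] = np.arange(1, p.size + 1)
q = np.minimum(1.0, p * p.size / ranks)  # BH-adjusted p-values (no monotonic step)

log2_fc = malformed.mean(axis=1) - control.mean(axis=1)
hits = np.where((q < 0.05) & (np.abs(log2_fc) > 1.0))[0]
print(f"{hits.size} putative differentially expressed proteins: {hits}")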
2

Harris, Justin Clay. "NEW BIOINFORMATIC TECHNIQUES FOR THE ANALYSIS OF LARGE DATASETS". UKnowledge, 2007. http://uknowledge.uky.edu/gradschool_diss/544.

Full text
Abstract
A new era of chemical analysis is upon us. In the past, a small number of samples were selected from a population for use as a statistical representation of the entire population. More recently, advances in data collection rate, computer memory, and processing speed have allowed entire populations to be sampled and analyzed. The result is massive amounts of data that convey relatively little information, even though they may contain a lot of it. These large quantities of data have already begun to cause bottlenecks in areas such as genetics, drug development, and chemical imaging. The problem is straightforward: condense a large quantity of data into only the useful portions without ignoring or discarding anything important. Performing the condensation in the hardware of the instrument, before the data ever reach a computer, is even better. The research proposed here tests the hypothesis that clusters of data may be rapidly identified by linear fitting of quantile-quantile plots produced from each principal component of a principal component analysis. Integrated Sensing and Processing (ISP) is tested as a means of generating clusters of principal component scores from samples in a hyperspectral near-field scanning optical microscope. Distances from the centers of these multidimensional clusters to all other points in hyperspace can then be calculated. The result is a novel digital staining technique for identifying anomalies in hyperspectral microscopic and nanoscopic imaging of human atherosclerotic tissue. This general method can be applied to other analytical problems as well.
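As a rough illustration of the central hypothesis, the following sketch (a toy on synthetic data, not the thesis code) runs a principal component analysis and linearly fits each score vector's quantile-quantile plot against normal quantiles; a markedly lower R² on a component suggests that its scores are multimodal, i.e. clustered.

import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Two synthetic clusters in 5-D, so the leading PC scores are bimodal.
X = np.vstack([rng.normal(0, 1, (300, 5)), rng.normal(4, 1, (300, 5))])
Xc = X - X.mean(axis=0)

# PCA via SVD; rows of Vt are principal axes, scores are projections.
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt.T

for k in range(scores.shape[1]):
    s = np.sort(scores[:, k])
    # Theoretical normal quantiles for a QQ plot of this component.
    q = stats.norm.ppf((np.arange(1, s.size + 1) - 0.5) / s.size)
    r = stats.linregress(q, s).rvalue
    print(f"PC{k + 1}: QQ-plot R^2 = {r ** 2:.3f}")  # lower R^2 suggests clusters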
3

Rohen, V. E. "Applications of statistical pattern recognition techniques to the analysis of ballistocardiograms". Thesis, University of Cambridge, 1987. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.235284.

Full text
Abstract
This dissertation describes a new approach to the analysis of ballistocardiograms using statistical pattern recognition techniques, as well as the design and development of a new ballistocardiograph and its associated software. Ballistocardiograms are the result of forces exerted on the body, caused by the ejection of blood from the heart and the passage of the blood through the arterial system. The apparatus used in this study for the collection and display of the ballistocardiograms consisted of a specially designed stool with highly sensitive piezoelectric elements, which converted the forces acting on the stool into electric signals, connected to a specially built interface which converted the analogue signal into digital data. These were in turn analysed using a BBC Model B microcomputer running special software written as part of this work. The methods of analysis developed here are based on statistical pattern recognition and consist of units dealing with the preprocessing of the data, the extraction of optimal features, and their classification. By their nature, the lengths of ballistocardiograms vary not only from person to person; there are also differences between the lengths of individual beats in the same subject. This presents a major problem for successful analysis. A novel method for non-linear standardisation of the ballistocardiogram length was developed and used in this study. This method allows the comparison of ballistocardiograms of different lengths by projecting them onto a waveform of uniform length, whilst maintaining all the information contained in the shape of the original signal. The projection is based on local cross-correlation of a template ballistocardiogram with a subset of the ballistocardiograms to be analysed. This results in a set of standard-length records which in turn are used to determine the transformation. A feature extraction method based on double eigenanalysis was used to reduce the dimensionality of the data and to extract the features which discriminate best between the different classes analysed. Four classes of subjects were used in this study: a normal group, consisting of generally healthy and physically fit people, whose ballistocardiograms were also used to develop the new method for length adjustment; a group of subjects with mild hypertension; a group of patients with coronary artery stenosis, who were undergoing treatment at Papworth Hospital; and a group of subjects with a clinical history of recent myocardial infarction. It was found that, after standardisation of the length of the ballistocardiograms and extraction of the features containing most of the discriminant information, the Nearest-Neighbour rule discriminated well between the group of normal subjects and the three remaining groups. The groups of subjects with mild hypertension and with coronary artery stenosis proved more difficult to separate. This can possibly be explained by the similarities in the characteristics of these two groups as far as ballistocardiograms are concerned. It was also found that the parts of the wave carrying most of the discriminatory information are those corresponding to the ejection phase, for all the groups in general, and those corresponding to the last peaks of the ballistocardiogram (post-ejection phase) for the group with recent myocardial infarction.
Ballistocardiography is shown in this work to be a good non-invasive method for the study of the general performance of the heart. The methods described here for discrimination between groups and classification of different ballistocardiograms, by means of the analysis of their shape alone, have also proved very powerful. In particular, the new length standardisation method allows more accurate monitoring of heart function than could be achieved before. The techniques developed in this research may be used for the prediction of various heart diseases in their early stages. This, together with the portability of the apparatus developed in this research, could turn the new ballistocardiograph into a standard clinical device.
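A minimal sketch of the classification chain is given below. The length standardisation here is plain linear resampling with np.interp, only a stand-in for the author's cross-correlation-based non-linear method, followed by the Nearest-Neighbour rule on the standardised waveforms; the two "classes" are invented synthetic beats.

import numpy as np

rng = np.random.default_rng(2)

def to_uniform_length(beat, n=256):
    """Resample one beat to n samples: a linear stand-in, not the
    thesis's cross-correlation-based non-linear standardisation."""
    x_old = np.linspace(0.0, 1.0, beat.size)
    x_new = np.linspace(0.0, 1.0, n)
    return np.interp(x_new, x_old, beat)

def make_beat(kind, length):
    # Two toy "classes" with different wave shapes plus noise.
    t = np.linspace(0, 1, length)
    f = 3 if kind == 0 else 5
    return np.sin(2 * np.pi * f * t) + 0.1 * rng.normal(size=length)

# Training beats of varying length, each standardised to 256 samples.
train = [(to_uniform_length(make_beat(c, rng.integers(200, 400))), c)
         for c in (0, 1) for _ in range(20)]

def nearest_neighbour(beat):
    """1-NN rule on Euclidean distance between standardised waveforms."""
    x = to_uniform_length(beat)
    return min(train, key=lambda tc: np.linalg.norm(x - tc[0]))[1]

print("predicted class:", nearest_neighbour(make_beat(1, 333)))  # expect 1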
4

Jeon, Seonghye. "Bayesian data mining techniques in public health and biomedical applications". Diss., Georgia Institute of Technology, 2012. http://hdl.handle.net/1853/43712.

Full text
Abstract
The emerging research issues in evidence-based healthcare decision-making and the explosion of comparative effectiveness research (CER) are clear evidence of the effort to thoroughly incorporate the rich data currently available within the system. The flexibility of Bayesian data mining techniques lends itself to handling the challenging issues in the biomedical and health care domains. My research focuses primarily on Bayesian data mining techniques for non-traditional data in this domain, which include: 1. Missing data: matched-pair studies with fixed marginal totals, with application to a meta-analysis of dental sealant effectiveness. 2. Data with unusual distributions: modeling spatial repeated measures with excess zeros and no covariates to estimate U.S. county-level natural fluoride concentration. 3. Highly irregular data: assessing overall image regularity in the complex wavelet domain to classify mammography images. The goal of my research is to strengthen the link from data to decisions. By using Bayesian data mining techniques, including signal and image processing (wavelet analysis), hierarchical Bayesian modeling, clinical-trial meta-analyses and spatial statistics, this thesis resolves challenging issues of how to incorporate data to improve health care and biomedical systems and ultimately benefit public health.
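As a generic illustration of the Bayesian updating that underlies such analyses (not the hierarchical or spatial models of the thesis), the sketch below computes a Beta-Binomial posterior for an effectiveness proportion from invented counts.

from scipy import stats

# Generic Beta-Binomial updating; the prior and counts are invented.
prior_a, prior_b = 1.0, 1.0    # uniform Beta(1, 1) prior on effectiveness
successes, trials = 42, 60     # hypothetical sealed teeth remaining caries-free

posterior = stats.beta(prior_a + successes, prior_b + trials - successes)
lo, hi = posterior.ppf([0.025, 0.975])
print(f"posterior mean = {posterior.mean():.3f}")
print(f"95% credible interval = ({lo:.3f}, {hi:.3f})")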
5

Jakeway, Stephen Christopher. "Development of optical techniques for biomolecule detection in miniaturized total analysis systems". Thesis, Imperial College London, 2002. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.271699.

Full text
6

Seydnejad, Saeid Reza. "Analysis of heart rate variability and blood pressure variation by nonlinear modelling techniques". Thesis, Imperial College London, 1998. http://hdl.handle.net/10044/1/7814.

Full text
7

D'Angelo, Maurissa S. "Analysis of Amputee Gait using Virtual Reality Rehabilitation Techniques". Wright State University / OhioLINK, 2010. http://rave.ohiolink.edu/etdc/view?acc_num=wright1279121086.

Full text
8

BERNACCHIA, NATASCIA. "Measurement techniques based on image processing for the assessment of biomedical parameters". Doctoral thesis, Università Politecnica delle Marche, 2014. http://hdl.handle.net/11566/242751.

Full text
Abstract
Biomedical imaging is an important topic in the field of diagnosis and clinical research. Image analysis and processing software also helps to automatically identify what might not be apparent to the human eye. Technological development and the use of different imaging modalities create new challenges, such as the need to analyse a significant volume of images so that high-quality information can be produced for disease diagnosis, treatment and monitoring, in clinical settings as well as at home. The measurement systems routinely used in the clinical environment require direct contact with the subject, which in some cases is uncomfortable or unsuitable for long monitoring periods. On the other hand, contact can alter the shape or composition of the samples under study, and state-of-the-art techniques can be time-consuming and provide very low resolution. This doctoral thesis presents a series of new experimental applications of image analysis and processing in the biomedical field. The aim was to develop and validate new methodologies, based on image analysis, for the non-contact measurement of quantities of different natures. The study focuses on the extraction of morphological characteristics of cell aggregates to assess regeneration processes in infarcted hearts, the design of a non-contact methodology to measure the mechanical properties of rabbit patellar tendons subjected to tensile tests, and the development of new methods for monitoring physiological parameters (heart and respiration rate, chest volume variations) through image acquisition systems such as the Kinect™ device and a digital camera. The experimental setups designed in this work were validated, showing high correlation with the reference methods. The imaging systems, although different in many respects, proved suitable for their respective tasks, confirming the feasibility of the imaging approach in the biomedical field.
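The camera-based heart-rate monitoring mentioned above can be sketched as follows; the frame-intensity trace is simulated rather than taken from a real video, and the 30 fps frame rate and 0.7-3 Hz search band are assumptions.

import numpy as np
from scipy import signal

fs = 30.0  # assumed camera frame rate (frames per second)
t = np.arange(0, 30, 1 / fs)
rng = np.random.default_rng(3)
# Hypothetical stand-in for the mean skin-pixel intensity per frame:
# a 1.2 Hz (72 bpm) photoplethysmographic ripple buried in noise.
intensity = 0.05 * np.sin(2 * np.pi * 1.2 * t) + rng.normal(0, 0.05, t.size)

# Periodogram of the detrended trace; the dominant peak in the
# physiological band estimates the heart rate.
f, p = signal.periodogram(signal.detrend(intensity), fs=fs)
band = (f > 0.7) & (f < 3.0)
hr_hz = f[band][np.argmax(p[band])]
print(f"estimated heart rate: {60 * hr_hz:.0f} bpm")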
9

Graça, Cristo dos Santos Lopes Ruano Maria da. "Investigation of real-time spectral analysis techniques for use with pulsed ultrasonic Doppler blood flow detectors". Thesis, Bangor University, 1992. https://research.bangor.ac.uk/portal/en/theses/investigation-of-realtime-spectral-analysis-techniques-for-use-with-pulsed-ultrasonic-doppler-blood-flow-detectors(f184d2a8-bde7-492a-b487-438704d3ea04).html.

Full text
Abstract
The goals of the work described here were the development of a method for selecting spectral estimators for use with pulsed Doppler ultrasonic blood flow instruments, the use of this method to select an estimator, and its implementation in a form suitable for real-time applications. A study of the estimation accuracy of the mean frequency and bandwidth using a number of spectral estimators was carried out. Fourier-based, parametric and minimum-variance estimators were considered. A Doppler signal simulator was developed to allow the required accuracy tests. A method for selecting spectral estimators based on the accuracy of estimation of decisive signal parameters, under the constraint of low computational complexity, is proposed. This novel cost/benefit criterion allows weighting appropriate to the estimate (mean frequency and bandwidth) and to signal frequency importance (across the range of signal characteristics). For parametric spectral estimators, this criterion may also be used to select the model order, leading to lower orders than the FPE, AIC and CAT criteria. Its use led to the selection of a 4th-order modified covariance parametric method. A new version of the modified covariance method for spectral estimation of real signals was developed. This was created with a view to the parallel partitioning of the algorithm for implementation on a transputer-based system, using OCCAM. A number of parallel topologies were implemented. Their performance was evaluated considering the estimation of a single Doppler signal segment and of a sequence of segments, revealing that these parallel implementations are feasible in real time.
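For orientation, the sketch below computes the two signal parameters at stake, mean frequency and bandwidth, as the first two spectral moments of a periodogram of a simulated Doppler-like signal. This is a Fourier-based stand-in; the thesis itself selected a 4th-order modified covariance estimator.

import numpy as np
from scipy import signal

fs = 10_000.0  # assumed pulse repetition frequency (Hz)
t = np.arange(0, 0.05, 1 / fs)
rng = np.random.default_rng(4)
# Toy Doppler-like signal: a 2 kHz component plus noise.
x = np.sin(2 * np.pi * 2000 * t) + 0.5 * rng.normal(size=t.size)

# Mean frequency and RMS bandwidth as spectral moments of the periodogram.
f, p = signal.periodogram(x, fs=fs)
mean_f = np.sum(f * p) / np.sum(p)
bandwidth = np.sqrt(np.sum((f - mean_f) ** 2 * p) / np.sum(p))
print(f"mean frequency = {mean_f:.0f} Hz, RMS bandwidth = {bandwidth:.0f} Hz")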
10

Kirk, E. M. "Biomedical applications of narrow-bore liquid chromatography with computer-aided detection : Application of multivariate digital techniques to biomedical samples in narrow-bore column high-performance liquid chromatography with photodiode array detection". Thesis, University of Bradford, 1988. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.384276.

Full text
11

Lim, Lily. "Reliable Invasive Blood Pressure Measurements Using Fourier Optimization Techniques". University of Akron / OhioLINK, 2006. http://rave.ohiolink.edu/etdc/view?acc_num=akron1145285836.

Full text
12

Valero, Vidal Carlos. "Study of the degradation mechanisms of the CoCrMo biomedical alloy in physiological media by electrochemical techniques and surface analysis". Doctoral thesis, Universitat Politècnica de València, 2012. http://hdl.handle.net/10251/16881.

Full text
Abstract
The CoCrMo biomedical alloy is used in prostheses for total or partial hip and knee joint replacements owing to its biocompatibility and its good mechanical properties, notably its high resistance to corrosion and wear. The surface of the CoCrMo biomaterial reacts spontaneously with the surrounding medium, forming a passive layer of metal oxides that protects the alloy from the environment and governs its corrosion behaviour. It should be borne in mind that the medium in which these prostheses operate is one of the most aggressive known, which aggravates the corrosion process. This process contributes to the release of metal ions into the human body, accelerating the deterioration of the prostheses and causing clinical problems in patients. In this context, this doctoral thesis studies the biocorrosion mechanisms that determine the degradation of the CoCrMo alloy under physiological conditions. First, the electrochemical characterization of the biomaterial was carried out under different physico-chemical conditions of biological relevance (chemical composition of the simulated fluid, pH, oxygen content and applied potential), which markedly influence the electrochemical reactions taking place at the biomaterial/medium interface. Subsequently, the influence of the adsorption of albumin (the model and most abundant protein in the human body) on the electrochemical behaviour of the alloy was studied as a function of protein concentration and medium temperature. This study was carried out from a thermodynamic standpoint, and it was shown that protein adsorption on the surface of the CoCrMo alloy occurs spontaneously by chemisorption and is well described by the Langmuir isotherm. Finally, the kinetics of passivation and of protein adsorption were studied by means of the p
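The Langmuir-isotherm analysis mentioned in the abstract can be illustrated with a short fitting sketch; the concentration-coverage pairs below are invented for illustration and do not come from the thesis.

import numpy as np
from scipy.optimize import curve_fit

def langmuir(c, K):
    """Langmuir isotherm: fractional surface coverage at concentration c."""
    return K * c / (1.0 + K * c)

# Hypothetical albumin concentrations and measured fractional coverages.
c = np.array([0.1, 0.5, 1.0, 2.0, 5.0, 10.0])
theta = np.array([0.17, 0.49, 0.65, 0.79, 0.90, 0.95])

(K,), _ = curve_fit(langmuir, c, theta, p0=[1.0])
print(f"fitted Langmuir constant K = {K:.2f} (per concentration unit)")
# In the thermodynamic analysis, K feeds dG_ads = -RT*ln(55.5*K) (K in L/mol);
# a large negative value indicates spontaneous chemisorption.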
Valero Vidal, C. (2012). Study of the degradation mechanisms of the CoCrMo biomedical alloy in physiological media by electrochemical techniques and surface analysis [Tesis doctoral no publicada]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/16881
13

STRAZZA, ANNACHIARA. "Advanced Techniques for EMG-based Assessment of Muscular Co-Contraction During Walking". Doctoral thesis, Università Politecnica delle Marche, 2019. http://hdl.handle.net/11566/263516.

Full text
Abstract
Gait analysis is the systematic study of human locomotion. A central part of gait analysis is represented by surface electromyography (sEMG). Walking control is exerted by the lower limb muscles, and in particular through lower limb muscular co-contraction. Muscular co-contraction is the concomitant recruitment of antagonist muscles crossing a joint. In healthy subjects, co-contraction occurs to achieve a homogeneous pressure on the joint surface, preserving articular stability. In pathological individuals, the assessment of co-contraction appears to play a key role in discriminating dysfunctional conditions of the central nervous system. Different methodologies for muscular co-contraction assessment have been developed. A co-contraction index (CI) based on computing the area under the curve of the rectified EMG signals of antagonist muscles was developed; it provides an overall numerical index that may not be suitable for characterizing dynamic tasks. To overcome this limitation, muscular co-contraction has been assessed via overlapping linear envelopes or the temporal intervals where muscle activations superimpose. Thus, a gold standard for identifying muscle co-contraction is not yet available. The aim of this study is to perform an EMG-based analysis of muscular co-contraction by proposing a new, reliable technique for leg-muscle co-contraction assessment in the time-frequency domain and by providing normative co-contraction data during healthy adult and child walking. The proposed method, based on the Wavelet transform (WT), is named the CO-contraction DEtection algorithm (CODE). A further application of WT analysis is the extraction and assessment of fetal heart sounds from the fetal phonocardiography signal. In the present study, reference data on lower-limb-muscle co-contraction were also provided by means of Statistical Gait Analysis, a technique able to provide a statistical characterization of gait by averaging spatial-temporal and sEMG-based parameters over hundreds of strides during walking.
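The classic area-based co-contraction index that the abstract describes can be sketched in a few lines; the two rectified sEMG traces below are synthetic, and the 6 Hz envelope cut-off and the particular CI normalisation are assumptions (several variants exist in the literature).

import numpy as np
from scipy import signal

fs = 1000.0  # assumed sEMG sampling rate (Hz)
t = np.arange(0, 2, 1 / fs)
rng = np.random.default_rng(5)
# Toy rectified sEMG of two antagonist muscles (invented activations).
agonist = np.abs(np.sin(2 * np.pi * 1.0 * t)) + 0.05 * rng.random(t.size)
antagonist = np.abs(np.sin(2 * np.pi * 1.0 * t + 1.0)) + 0.05 * rng.random(t.size)

# Linear envelopes: low-pass filter the rectified signals (6 Hz Butterworth).
b, a = signal.butter(4, 6.0, fs=fs)
env1 = signal.filtfilt(b, a, agonist)
env2 = signal.filtfilt(b, a, antagonist)

# Area-based CI: area common to both envelopes over the total area
# (uniform sampling, so plain sums are proportional to areas).
common = np.minimum(env1, env2)
ci = 2 * common.sum() / (env1.sum() + env2.sum())
print(f"co-contraction index = {ci:.2f}")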
14

Streicher, Matthew C. "KINEMATIC COMPARISON OF MARKER SET TECHNIQUES USED IN BIOMECHANICAL ANALYSIS OF THE PITCHING MOTION". University of Akron / OhioLINK, 2007. http://rave.ohiolink.edu/etdc/view?acc_num=akron1176684209.

Full text
15

Kim, YuJaung. "ASSESSMENT OF CIRCUMFERENTIAL MYOCARDIAL FUNCTION USING RADIAL TAGGED MRI". Cleveland State University / OhioLINK, 2012. http://rave.ohiolink.edu/etdc/view?acc_num=csu1331005696.

Full text
16

Shah, Nilesh D. "Quantification and Improvement of Stiffness Measurement Techniques of Trabecular Bone Using Porcine Mandibular Condyles". Ohio University / OhioLINK, 2014. http://rave.ohiolink.edu/etdc/view?acc_num=ohiou1415234084.

Full text
17

Koglin, Ryan W. "Efficient Image Processing Techniques for Enhanced Visualization of Brain Tumor Margins". University of Akron / OhioLINK, 2014. http://rave.ohiolink.edu/etdc/view?acc_num=akron1415835138.

Full text
18

Vaizurs, Raja Sarath Chandra Prasad. "Atrial Fibrillation Signal Analysis". Scholar Commons, 2011. http://scholarcommons.usf.edu/etd/3386.

Full text
Abstract
Atrial fibrillation (AF) is the most common type of cardiac arrhythmia encountered in clinical practice and is associated with increased mortality and morbidity. Identification of the sources of AF has been a goal of researchers for over 20 years. Current treatment procedures such as cardioversion, radiofrequency ablation, and multiple drugs have reduced the incidence of AF. Nevertheless, these treatments succeed in only 35-40% of AF patients, as they have a limited effect in maintaining the patient in normal sinus rhythm. The problem stems from the fact that no methods have been developed to analyze the electrical activity generated by the cardiac cells during AF and to detect the aberrant atrial tissue that triggers it. In clinical practice, the sources triggering AF are generally expected to be at one of the four pulmonary veins in the left atrium. Classifying the signals originating from the four pulmonary veins in the left atrium has been the mainstay of the signal analysis in this thesis, which ultimately leads to correctly locating the source triggering AF. Unlike much current research that uses ECG signals for AF signal analysis, we collect intracardiac signals along with ECG signals. An AF signal collected from catheters placed inside the heart gives a better understanding of AF characteristics than the ECG. In recent years, mechanisms leading to AF induction have begun to be explored, but the current state of research and diagnosis of AF mainly concerns the inspection of the 12-lead ECG, QRS subtraction methods, and spectral analysis to find the fibrillation rate, and is limited to establishing its presence or absence. The main goal of this thesis research is to develop a methodology and algorithm for finding the source of AF. Pattern recognition techniques were used to classify the AF signals originating from the four pulmonary veins. The classification of AF signals recorded by a stationary intracardiac catheter was based on dominant frequency, frequency distribution and normalized power. Principal Component Analysis was used to reduce the dimensionality and, further, Linear Discriminant Analysis was used as the classification technique. An algorithm has been developed and tested on recorded periods of AF with promising results.
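The processing chain named in the abstract (spectral features, then PCA for dimensionality reduction, then LDA as classifier) can be sketched as follows; the feature table is synthetic and merely stands in for dominant frequency, frequency distribution and normalized power measured on intra-cardiac electrograms.

import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(6)
# Hypothetical feature table for electrograms from the 4 pulmonary veins.
n_per_class, n_features = 40, 12
X = np.vstack([rng.normal(loc=k, scale=1.0, size=(n_per_class, n_features))
               for k in range(4)])
y = np.repeat(np.arange(4), n_per_class)  # source-vein label 0..3

# PCA for dimensionality reduction, then LDA as the classifier,
# mirroring the chain described in the abstract.
clf = make_pipeline(PCA(n_components=5), LinearDiscriminantAnalysis())
print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())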
19

Darrington, John Mark. "Real time extraction of ECG fiducial points using shape based detection". University of Western Australia. School of Computer Science and Software Engineering, 2009. http://theses.library.uwa.edu.au/adt-WU2009.0152.

Full text
Abstract
The electrocardiograph (ECG) is a common clinical and biomedical research tool used for both diagnostic and prognostic purposes. In recent years, computer-aided analysis of the ECG has enabled cardiographic patterns to be found which were hitherto not apparent. Many of these analyses rely upon the segmentation of the ECG into separate time-delimited waveforms. The instants delimiting these segments are called the fiducial points.
20

Jacqueline, Sophie. "Éthique de l'étude des objets d'arts extra- européens : intérêts et limites de l'utilisation de l'imagerie médicale". Electronic Thesis or Diss., université Paris-Saclay, 2024. http://www.theses.fr/2024UPASR025.

Full text
Abstract
Our research examines the ethical considerations involved in the study of extra-European art objects, specifically those of African, Oceanic, and South American origin, through the lens of macroscopic examination and medical imaging. We focus particularly on the merits and limitations of these techniques. There is little large-scale scientific research employing medical imaging to thoroughly explore the ethical dimensions associated with these objects throughout their lifecycle. This study includes a literature review on the introduction of extra-European objects into French collections and the evolving roles these objects have assumed, transitioning from curiosities to ethnographic artifacts and then to art objects. By integrating insights from anthropology, medical imaging, and legal studies, our research involves a detailed macroscopic and medical imaging analysis of over 150 objects from Africa, Asia, Oceania, and South America, housed at the musée du quai Branly-Jacques Chirac as well as in private collections. This analysis not only elucidates the benefits and constraints of such studies but also underscores the critical importance of addressing ethical considerations at every stage of the object's lifecycle, including collection, exhibition, sale, and restoration. Our findings provide a comprehensive overview of current practices at various stages in the life of these objects and propose new ethical guidelines for museum practices. Drawing parallels with the treatment of Catholic cult objects, we suggest that, in an era where demands for the repatriation of extra-European objects are growing, it is essential to involve the originating communities more actively in decision-making processes. We advocate the establishment of multidisciplinary and inclusive ethics committees to ensure that diverse perspectives, practices, and beliefs are respected.
21

Dias, Philipe Ambrozio. "In situ microscopy for analysis of filamentous bacteria: optics and image evaluation". Universidade Tecnológica Federal do Paraná, 2016. http://repositorio.utfpr.edu.br/jspui/handle/1/1999.

Full text
Abstract
CAPES; CNPq
In the activated sludge process, problems of foaming and filamentous bulking can occur due to the overgrowth of certain filamentous bacteria. Nowadays, these microorganisms are typically monitored by means of light microscopy combined with staining techniques. As drawbacks, these methods are susceptible to human error and subjectivity, and are limited by the use of discontinuous microscopy. The present project aims at the application of an in situ microscope (ISM) for the continuous monitoring of filamentous bacteria, providing real-time examination, automated analysis, and the elimination of sampling, preparation and transport of samples. The ISM previously developed at the Hochschule Mannheim required adaptations for use in a wastewater environment, especially in terms of impermeability and the development of a cleaning mechanism. With a new objective lens design, the system was simplified to a single tube, and an externally activated cleaning system based on magnetism was created. An image processing algorithm was designed for the automated recognition and measurement of filamentous objects, allowing real-time evaluation of images without any staining, phase-contrast or dilution techniques. Three main operations are performed: preprocessing and binarization; recognition of filaments using distance maps and shape descriptors; and measurement and display of the total extended filament length. A 3D-printed prototype was used for experiments on the new ISM design, providing images with resolution very close to those acquired with the previous microscope. The designed cleaning system proved effective, removing dirt settled on the lens during tests. For the evaluation of the image processing algorithm, samples from an industrial activated sludge plant were collected weekly for a period of twelve months and imaged without any prior conditioning, replicating real operating conditions. Experiments showed that the developed algorithm correctly identifies trends in the filament growth rate, which is the most important parameter for decision making. For reference images whose filaments were marked by specialists, the algorithm correctly recognized 72% of the filament pixels, with a false positive rate of at most 14%. An average execution time of 0.7 seconds per image was achieved, demonstrating the algorithm's suitability for real-time monitoring.
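The three-step pipeline (binarization, filament recognition via shape descriptors, length measurement) can be approximated with scikit-image; the sketch below uses a drawn toy image instead of a micrograph, skeleton pixel counts as the length estimate, and an eccentricity threshold as a crude shape descriptor, none of which reproduces the thesis's distance-map algorithm exactly.

import numpy as np
from skimage.draw import line
from skimage.measure import label, regionprops
from skimage.morphology import dilation, disk, skeletonize

# Toy binary image standing in for a thresholded micrograph: two thick
# elongated objects play the role of filamentous bacteria.
img = np.zeros((200, 200), dtype=bool)
for r0, c0, r1, c1 in [(10, 10, 180, 60), (20, 150, 190, 120)]:
    rr, cc = line(r0, c0, r1, c1)
    img[rr, cc] = True
img = dilation(img, disk(3))

# Skeletonize, label connected components, keep elongated ones; the
# skeleton pixel count approximates each filament's extended length.
skel = skeletonize(img)
total = 0
for region in regionprops(label(skel)):
    if region.eccentricity > 0.9:  # crude shape descriptor for "filament"
        total += region.area       # pixels along the 1-px-wide skeleton
print(f"total extended filament length ~ {total} px")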
22

Mercier, Michel. "Recherches sur l'image scientifique : génèse du sens et signification en microscopie électronique". Bordeaux 1, 1987. http://www.theses.fr/1987BOR10567.

Full text
23

Lara, Alexandre Guilherme de. "Metodologia para análise na perícia criminal de microvestígios forenses: fios de cabelo". Universidade Tecnológica Federal do Paraná, 2016. http://repositorio.utfpr.edu.br/jspui/handle/1/2006.

Full text
Abstract
This monographic dissertation aimed to develop standards for microscopic forensic hair analysis. A very common trace found at certain crime scenes (DOREA, 2010), hair can be helpful in providing information regarding the perpetrator or the way the crime was committed (ROBERTSON, 1999). Since hair is a slowly degrading fiber (ROBBINS, 2012), it can be analyzed long after the fact, becoming even more relevant in crime scenes processed after elapsed time. The equipment used for this study is commonly found in forensic labs, allowing broad use. The study was restricted to a literature review and the subsequent elaboration of methods for the preparation and analysis of hairs by optical microscopy, identifying the characteristics of forensic interest that can be studied. Samples of hair were used for cross-sectional and longitudinal analysis. Destructive and non-destructive methods, on permanent and non-permanent slides, were tested. As a result, a low-cost methodology was obtained for forensic application to traces collected from crime scenes, weapons used in crimes, or suspects, in order to combat crime by using the probative power of physical evidence.
24

Cao, Xi Hang. "On Leveraging Representation Learning Techniques for Data Analytics in Biomedical Informatics". Diss., Temple University Libraries, 2019. http://cdm16002.contentdm.oclc.org/cdm/ref/collection/p245801coll10/id/586006.

Full text
Abstract
Computer and Information Science
Ph.D.
Representation Learning is ubiquitous in the state-of-the-art machine learning workflow, including data exploration/visualization, data preprocessing, data model learning, and model interpretation. However, the majority of newly proposed Representation Learning methods are more suitable for problems with a large amount of data; applying these methods to problems with a limited amount of data may lead to unsatisfactory performance. Therefore, there is a need to develop Representation Learning methods tailored to problems with "small data", such as clinical and biomedical data analytics. In this dissertation, we describe our studies tackling challenging clinical and biomedical data analytics problems from four perspectives: data preprocessing, temporal data representation learning, output representation learning, and joint input-output representation learning. Data scaling is an important component of data preprocessing. The objective of data scaling is to scale/transform the raw features into reasonable ranges such that each feature of an instance is equally exploited by the machine learning model. For example, in a credit fraud detection task, a machine learning model may use a person's credit score and annual income as features, but because the ranges of these two features differ, the model may weight one more heavily than the other. In this dissertation, I introduce the data scaling problem in detail and describe an approach to data scaling which can intrinsically handle the outlier problem and lead to better model prediction performance. Learning new representations for data in non-standardized form is a common task in data analytics and data science applications. Usually, data come in tabular form: the data are represented by a table in which each row is a feature vector of an instance. However, it is also common for data not to be in this form, for example texts, images, and video/audio records. In this dissertation, I describe the challenge of analyzing imperfect multivariate time series data in healthcare and biomedical research, and show that the proposed method can learn a powerful representation that copes with various imperfections and leads to improved prediction performance. Learning output representations is a newer aspect of Representation Learning, and its applications have shown promising results in complex tasks, including computer vision and recommendation systems. The main objective of an output representation algorithm is to explore the relationships among the target variables, so that a prediction model can efficiently exploit the similarities and potentially improve prediction performance. In this dissertation, I describe a learning framework which incorporates output representation learning into time-to-event estimation. In particular, the approach learns the model parameters and time vectors simultaneously. Experimental results not only show the effectiveness of this approach but also its interpretability, via visualizations of the time vectors in 2-D space. Learning the input (feature) representation, the output representation, and the predictive model are closely related to each other; it is therefore a natural extension of the state of the art to consider them together in a joint framework.
In this dissertation, I describe a large-margin ranking-based learning framework for time-to-event estimation with joint input embedding learning, output embedding learning, and model parameter learning. In the framework, I cast the functional learning problem as a kernel learning problem and, by adopting theories from Multiple Kernel Learning, I propose an efficient optimization algorithm. Empirical results also show its effectiveness on several benchmark datasets.
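The outlier-robust data scaling discussed above can be illustrated with a standard median/IQR scaler; this is a common stand-in, not the dissertation's own approach, and the income-like numbers are invented.

import numpy as np

rng = np.random.default_rng(7)
# One feature with a gross outlier, e.g. an annual income column.
x = np.append(rng.normal(50_000, 10_000, 99), 5_000_000.0)

# Min-max scaling: the outlier compresses all other values toward 0.
minmax = (x - x.min()) / (x.max() - x.min())

# Median/IQR robust scaling tolerates the outlier.
q1, q3 = np.percentile(x, [25, 75])
robust = (x - np.median(x)) / (q3 - q1)

print(f"min-max spread of the 99 inliers: {np.ptp(minmax[:-1]):.4f}")
print(f"robust spread of the 99 inliers:  {np.ptp(robust[:-1]):.4f}")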
Temple University--Theses
25

Jalal, Ahmed Hasnain. "Multivariate Analysis for the Quantification of Transdermal Volatile Organic Compounds in Humans by Proton Exchange Membrane Fuel Cell System". FIU Digital Commons, 2018. https://digitalcommons.fiu.edu/etd/3886.

Full text
Abstract
In this research, a proton exchange membrane fuel cell (PEMFC) sensor was investigated for the specific detection of volatile organic compounds (VOCs) for point-of-care (POC) diagnosis of human physiological conditions. A PEMFC is an electrochemical transducer that converts chemical energy into electrical energy. A redox reaction takes place at its electrodes, whereby volatile biomolecules (e.g. ethanol) are oxidized at the anode and ambient oxygen is reduced at the cathode. The compounds that were the focus of this investigation were ethanol (C2H5OH) and isoflurane (C3H2ClF5O), but in principle the sensor is not limited to these VOCs, given proper calibration. Detection in biosensing, which needs to be carried out in a controlled system, becomes complex in a multivariate environment. Major limitations of all types of biosensors include poor selectivity, drifting, overlapping, and degradation of signals. Specific detection of VOCs in multi-dimensional environments is also a challenge in fuel cell sensing: humidity, temperature, and the presence of other analytes interfere with the functioning of the fuel cell and produce false readings. Hence, accurate and precise quantification of VOCs and calibration are the major challenges when using a PEMFC biosensor. To resolve this problem, a statistical model was derived for the calibration of the PEMFC employing multivariate analysis, namely the Principal Component Regression (PCR) method, for the sensing of VOCs. PCR can correlate larger data sets and provides an accurate fit between known and unknown data sets. PCR improves calibration under multivariate conditions compared with the overlapping signals obtained when using linear (univariate) regression models. Results show that the biosensor investigated has a 75% accuracy improvement over the commercial alcohol breathalyzer used in this study when detecting ethanol. When detecting isoflurane, the sensor has an average deviation in the steady-state response of ~14.29% from the gold-standard infrared spectroscopy system used in hospital operating theaters. The significance of this research lies in its versatility in addressing the existing challenges in the accuracy and precision of PEMFC sensor calibration. This research may also improve the diagnosis of several diseases through the detection of the relevant biomarkers.
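Principal Component Regression itself is easy to sketch: project the multivariate sensor responses onto a few principal components and regress the known concentration on the scores. The calibration data below are simulated, not the thesis's measurements.

import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(8)
# Hypothetical calibration set: 60 fuel-cell response curves (20 samples
# each) recorded at known ethanol concentrations; numbers are invented.
conc = rng.uniform(0.0, 1.0, 60)                       # ground-truth concentration
responses = np.outer(conc, np.linspace(1.0, 0.2, 20))  # idealised sensor decay
responses += rng.normal(0.0, 0.02, responses.shape)    # humidity/temperature noise

# PCR: PCA projection onto a few components, then linear regression.
pcr = make_pipeline(PCA(n_components=3), LinearRegression())
pcr.fit(responses, conc)
print("R^2 on calibration data:", pcr.score(responses, conc))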
26

Courtellemont, Pierre. "Architecture multi-processeurs pour le traitement du signal EEG". Rouen, 1989. http://www.theses.fr/1989ROUES003.

Full text
Abstract
The proposed algorithm is based on the recursive least squares principle. It differs from the usual methods in that the parameter adaptation is performed globally over an observation window rather than at each new sample. This technique enabled the development of a detection algorithm with two successive thresholds.
27

Lee, Seungyup. "A RAPID CYCLE LENGTH VARIABILITY DETECTION TECHNIQUE OF ATRIAL ELECTROGRAMS IN ATRIAL FIBRILLATION". Case Western Reserve University School of Graduate Studies / OhioLINK, 2008. http://rave.ohiolink.edu/etdc/view?acc_num=case1207255208.

Full text
28

Forni, Riccardo. "Virtual Histology: a novel technique to analyze myocardial tissue composition". Master's thesis, Alma Mater Studiorum - Università di Bologna, 2022.

Search for full text
Abstract
This study outlines the feasibility of a new technique called Virtual Histology, in which the condition of the tissue is assessed directly by extracting and analyzing Hounsfield values from cardiac CT scans. Tissue is sampled at several locations in the interventricular septum and the outer wall of the left ventricle to assess its state. A reproducible and repeatable workflow is proposed to obtain samples with a volume of 1 cc starting from 2D images. The extracted samples are analyzed to build a model representative of healthy subjects, which is then compared with that of subjects affected by post-infarction interventricular septal rupture and hypertrophic cardiomyopathy, in order to diagnose the pathology directly from the state of the tissue rather than from symptoms. The best parameters for characterizing the specific densitometric profile of a 3D portion of tissue are presented and discussed. Although the results for mean, standard deviation and entropy are encouraging, they are not sufficient to distinguish the pathologies accurately, so, through further feature extraction, the task was recast as a machine learning problem. The new parameters relate to single-pixel intensity, to properties of the densitometric profiles, and to texture. The intensity measures are 3D features, while the texture results are averaged over 2D slices. Class imbalance hinders the correct classification of some data belonging to pathological subjects, but data augmentation techniques provide a proof of what will be possible in the future with a balanced cohort of subjects. A feature importance study was performed to understand which characteristics are most informative in splitting the dataset; the mode of the profile turned out to be one of the best features for classifying the samples, followed by the correlation between two pixels and by contrast.
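Extracting the three descriptors named above (mean, standard deviation, entropy) from a 1 cc Hounsfield-unit sample is straightforward; in the sketch below the voxel cube is synthetic, and the 1 mm spacing, 64-bin histogram and HU range are assumptions.

import numpy as np
from scipy.stats import entropy

rng = np.random.default_rng(9)
# Hypothetical 1 cc cube of Hounsfield units sampled from a cardiac CT
# (10x10x10 voxels at 1 mm spacing); values invented for illustration.
sample = rng.normal(40.0, 12.0, size=(10, 10, 10))

# Densitometric profile descriptors: mean, standard deviation, and
# the entropy of the HU histogram.
hist, _ = np.histogram(sample, bins=64, range=(-100, 200))
p = hist / hist.sum()
print(f"mean HU = {sample.mean():.1f}")
print(f"std HU  = {sample.std():.1f}")
print(f"entropy = {entropy(p[p > 0], base=2):.2f} bits")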
29

Borgiani, Edoardo. "Geometrical optimization of a short-stem hip implant for the reduction of proximal stress shielding (ottimizzazione geometrica di uno stelo corto di protesi d'anca per la riduzione della schermatura del carico prossimale)". Master's thesis, Alma Mater Studiorum - Università di Bologna, 2015. http://amslaurea.unibo.it/8581/.

Full text
Abstract
Nowadays the number of hip arthroplasty operations continues to increase because the elderly population is growing. Moreover, global life expectancy is increasing and people are adopting a more active way of life. For these reasons, the demand for implant revision operations is becoming more frequent. The procedure includes the surgical removal of the old implant and its substitution with a new one. Every time a new implant is inserted, it generates an alteration in the internal femur strain distribution, jeopardizing the remodeling process and potentially leading to bone tissue loss. This is of major concern, particularly in the proximal Gruen zones, which are considered critical for implant stability and longevity. Today, different implant designs exist on the market; however, there is no clear understanding of which implant design parameters best achieve mechanically optimal conditions. The aim of the study is to investigate the stress shielding effect generated by different implant design parameters on the proximal femur, evaluating which ranges of those parameters lead to the most physiological conditions.
30

Pacola, Edras Reily. "Uso da análise de discriminante linear em conjunto com a transformada wavelet discreta no reconhecimento de espículas". Universidade Tecnológica Federal do Paraná, 2015. http://repositorio.utfpr.edu.br/jspui/handle/1/1828.

Full text
Abstract
CAPES
Researchers have concentrated efforts in the past 20 years, by applying the wavelet transform in processing, filtering, pattern recognition and classification of biomedical signals, in particular signals of electroencephalogram (EEG) containing events characteristic of epilepsy, the spike. Several families of mother-wavelets were used, but there are no consensus about which mother-wavelet is the most adequate for this purpose. The signals used have a wide range of events. The literature reports EEG signals sampled from 100 to 600 Hz with spikes ranging from 20 to 200 ms. In this study we used 98 wavelets. The EEG signals were sampled from 200 Hz up to 1 kHz. A neurologist has scored a set of 494 spikes and a set 1500 non-spike events. This study starts evaluating the amount of wavelet decompositions required for the detection of spikes, followed by detailed analysis of the combined use of mother-wavelets of the same family and among families. Following is analyzed the influence of descriptors and the combined use of them in spike detection. The results of these studies indicate that it is more appropriate to use a set of mother-wavelets, with many levels of decomposition and with various descriptors, instead of using a single mother-wavelet or a specific descriptor for the detection of spikes. The selection of this set of wavelets, decomposition level and descriptors allows to obtain high levels of detection according to the computational load desired or computing platform available for implementation. This study reached performance levels between 0.9936 to 0.9999, depending on the computational load. Other contributions of this study refer to the analysis of the border extension methods for spike detection; and the influences of the EEG signal sampling rate in the classifier performance, each one with significant results. Also shown are: a new spike detection architecture by making use of linear discriminant analysis; and the presentation of a new descriptor, the centred energy, based on the response of the coefficients of decomposition levels of the wavelet transform, able to improve the discrimination of spike and non-spike events.
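As an illustration of the kind of pipeline described, here is a minimal sketch of DWT sub-band descriptors feeding a linear discriminant classifier. The descriptor (sub-band energy), the wavelet, the decomposition depth, and the synthetic epochs are all assumptions standing in for the 98 wavelets, multiple descriptors, and annotated events studied in the thesis.

```python
import numpy as np
import pywt
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def dwt_descriptors(epoch, wavelet="db4", levels=5):
    """Energy per wavelet decomposition sub-band as a feature vector."""
    coeffs = pywt.wavedec(epoch, wavelet, level=levels)
    return np.array([np.sum(c ** 2) for c in coeffs])

# Hypothetical data: epochs of 256 EEG samples, labels 1=spike, 0=non-spike.
rng = np.random.default_rng(0)
epochs = rng.standard_normal((100, 256))
labels = rng.integers(0, 2, 100)

X = np.vstack([dwt_descriptors(e) for e in epochs])
clf = LinearDiscriminantAnalysis().fit(X, labels)
print("training accuracy:", clf.score(X, labels))
```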
Los estilos APA, Harvard, Vancouver, ISO, etc.
31

CHOUDHRY, MAHIPAL SINGH. "ANALYSIS OF BIOMEDICAL SIGNALS PROCESSING TECHNIQUES". Thesis, 2016. http://dspace.dtu.ac.in:8080/jspui/handle/repository/16204.

Texto completo
Resumen
Biomedical signal is a summarizing term for all kinds of signals that can be continually measured and monitored from biological beings. The electrocardiogram (ECG) and electroencephalogram (EEG) are the most important 1-D biomedical signals, as they are linked with the activities of the heart and the brain respectively, the most important organs of the human body. Magnetic Resonance Imaging (MRI) is the most popular medical imaging technique. It has a wide range of applications in medical diagnosis and is preferred over other imaging methods because it does not involve any ionizing radiation; its importance can be appreciated from the fact that over 50,000 MRI scanners are estimated to be in use worldwide for biomedical imaging. Acquiring a biomedical signal is not sufficient: the acquired signal must be processed to extract the relevant information "buried" in it, because the signal is affected by noise during acquisition and must be "cleaned" using some signal processing technique to minimize the effects of noise and to enhance the useful information. There are different types of noises and artifacts in biomedical signals; baseline wander and ocular artifacts are the most important ones in ECG and EEG respectively. This research is mainly focused on proposing novel methods for the removal of baseline wander and ocular artifacts from ECG and EEG. A new method is proposed for denoising the baseline wander artifact in ECG using a cascaded combination of Complete Ensemble Empirical Mode Decomposition (CEEMD) and morphological functions with adaptive Structuring Elements (SEs). The proposed method maintains the morphology of the ECG during denoising, and the denoising performance is independent of heart rate in the case of stress ECG. A new method is also proposed for ocular artifact removal from EEG using stationary-wavelet-enhanced Independent Component Analysis (ICA) with a novel thresholding technique; it preserves the morphological information present in the EEG, and the novel thresholding makes denoising more efficient. The MR image is the most important 2-D biomedical signal (biomedical image), and segmentation is one of the most important steps of MRI denoising and classification. A novel fuzzy-energy-based level set method is proposed in this research work for the segmentation of MR images. The proposed method deals effectively and simultaneously with the intensity inhomogeneity and noise problems of medical images by integrating an active contour with Fuzzy C-Means (FCM) clustering. Denoising of MR images is further enhanced by using a mean-filter-based spatial term with the proposed FCM-based energy function. The performance of the proposed methods is tested on various publicly available datasets and compared with earlier state-of-the-art methods.
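The baseline-wander stage can be sketched as follows, assuming the PyEMD package (pip install EMD-signal) for the ensemble empirical mode decomposition and a crude zero-crossing estimate of each mode's mean frequency; the adaptive morphological structuring elements of the proposed method are omitted here.

```python
import numpy as np
from PyEMD import CEEMDAN

def remove_baseline(ecg, fs, cutoff_hz=0.7):
    """Subtract the slow IMFs that make up the wandering baseline."""
    imfs = CEEMDAN()(ecg)                                # intrinsic mode functions
    baseline = np.zeros_like(ecg)
    for imf in imfs:
        zc = np.sum(np.abs(np.diff(np.sign(imf))) > 0)   # zero crossings
        mean_freq = zc * fs / (2.0 * len(imf))
        if mean_freq < cutoff_hz:                        # slow mode -> wander
            baseline += imf
    return ecg - baseline, baseline

# Toy signal: 1 Hz "beats" plus 0.2 Hz wander (not a real ECG).
fs = 360.0
t = np.arange(0, 10, 1 / fs)
ecg = np.sin(2 * np.pi * 1.0 * t) + 0.5 * np.sin(2 * np.pi * 0.2 * t)
clean, wander = remove_baseline(ecg, fs)
print("max wander amplitude:", np.max(np.abs(wander)))
```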
Los estilos APA, Harvard, Vancouver, ISO, etc.
32

BIROK, RAJESH. "STUDY OF BIOMEDICAL SIGNAL MEASUREMENT & ANALYSIS TECHNIQUES". Thesis, 2022. http://dspace.dtu.ac.in:8080/jspui/handle/repository/19470.

Texto completo
Resumen
Biomedical signal is a summarizing term for all kinds of signals that can be continually measured and monitored from living creatures, including human beings. The electrocardiogram (ECG) is perhaps the most common and popular 1-D biomedical signal, as it is directly associated with the activity of one of the most important organs of the human body, the heart. ECG has a wide range of applications in cardiac diagnostics and is preferred over other methods because it is largely non-invasive, safe for the patient, easy to obtain, and provides instantaneous results with a high level of accuracy. Appropriate analysis of ECG signals by suitable means is of utmost importance before any diagnostics. To fulfill this requirement, higher-order cumulants are an effective mathematical tool for the analysis of nonlinear and non-stationary ECG signals. The method proposed in this thesis classifies a dataset of ECG signals based upon higher-order statistics, i.e., cumulants, and provides a better detection technique in comparison to other methods used earlier in this research domain. However, the ECG is very easily contaminated by various types of noises and artefacts during acquisition. This contaminated ECG must therefore be "cleaned" to an appropriate level, that is, the counter-productive effects of embedded noise and artefacts must be minimized so as to enhance the required information before any further processing for diagnosis or interpretation. Among the different types of noises and artefacts present in an ECG signal, baseline wander is considered the severest one. This research work is mainly focused on proposing novel methods for the removal of baseline wander and other types of noise from the ECG signal. Accordingly, a new method is proposed for denoising the baseline wander artefact in ECG using a cascaded combination of Complete Ensemble Empirical Mode Decomposition (CEEMD) and Artificial Neural Networks (ANN). The proposed method maintains the morphology of the ECG signal during denoising, so there is no loss of vital information. Denoising of the ECG signal is further enhanced by another novel approach, a Genetic Particle filter improved fuzzy-AEEMD, which is also proposed in this thesis. The performance of the proposed methods is tested on different readily available ECG datasets and compared with other state-of-the-art methods, and the proposed methods proved to be more effective and efficient. Biomedical signal measurement is one of the most important aspects of biomedical signal analysis and interpretation in support of scientific hypotheses and medical diagnoses; it aims at appropriately acquiring and measuring biomedical signals for accurate and improved diagnosis and proper medication management. Extensive research is ongoing in the field of biomedical measurement and instrumentation to find new non-invasive methods for the diagnosis and measurement of health parameters for the welfare of mankind. Non-invasive techniques are preferable to invasive ones when sufficient accuracy can be achieved. Among the available non-invasive medical devices and techniques, bioimpedance-based diagnostics is still largely unexplored and underrated owing to insufficient research effort.
With this scenario in mind, an efficient low-cost bioelectrical impedance measuring instrument was developed, implemented, and tested in this study. It is primarily based on a low-cost component-level approach so that it can be easily used by researchers and investigators in this domain. The measurement setup was tested on adult human subjects to obtain the impedance signal of the forearm, which is under investigation in this case. However, depending on the illness or activity under examination, the instrument can be used on any other part of the body. The technique is easy and user-friendly and does not necessitate any special training; it can therefore be used effectively to collect bioimpedance data and interpret the findings for medical diagnostics.
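The cumulant-based classification idea can be illustrated with scipy's k-statistics, which are unbiased estimators of the first four cumulants; the beats below are random placeholders rather than ECG data, and the feature set is only an assumed instance of "higher-order statistics".

```python
import numpy as np
from scipy.stats import kstat

def cumulant_features(beat):
    """2nd-4th order cumulant estimates of a normalized beat."""
    beat = (beat - beat.mean()) / (beat.std() + 1e-12)
    return np.array([kstat(beat, n) for n in (2, 3, 4)])

rng = np.random.default_rng(1)
beats = rng.standard_normal((50, 200))     # hypothetical ECG beats
features = np.vstack([cumulant_features(b) for b in beats])
print(features.shape)                      # (50, 3) feature matrix
```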
Los estilos APA, Harvard, Vancouver, ISO, etc.
33

Rangaprakash, D. "Analysis Of Multichannel And Multimodal Biomedical Signals Using Recurrence Plot Based Techniques". Thesis, 2012. https://etd.iisc.ac.in/handle/2005/2491.

Texto completo
Resumen
For most naturally occurring signals, especially biomedical signals, the underlying physical process generating the signal is often not fully known, making it difficult to obtain a parametric model. Therefore, signal processing techniques are used to analyze the signal and non-parametrically characterize the underlying system from which it is produced. Most real-life systems are nonlinear and time-varying, which poses a challenge while characterizing them. Additionally, multiple sensors are used to extract signals from such systems, resulting in multichannel signals which are inherently coupled. In this thesis, we counter this challenge by using Recurrence Plot based techniques for characterizing biomedical systems such as the heart or brain, using signals such as heart rate variability (HRV), electroencephalogram (EEG), or functional magnetic resonance imaging (fMRI), respectively, extracted from them. In time series analysis, it is well known that a system can be represented by a trajectory in an N-dimensional state space, which completely represents an instance of the system behavior. Such system characterization has been done using dynamical invariants such as the correlation dimension, Lyapunov exponent, etc. Takens has shown that when the state variables of the underlying system are not known, one can obtain a trajectory in 'phase space' using only the signals obtained from such a system. The phase space trajectory is topologically equivalent to the state space trajectory. This enables us to characterize the system behavior from only the signals sensed from it. However, estimates of the correlation dimension, Lyapunov exponent, etc., are vulnerable to non-stationarities in the signal and require a large number of sample points for accurate computation, both of which are important concerns in the case of biomedical signals. Alternatively, a technique called Recurrence Plots (RP) has been proposed, which addresses these concerns, apart from providing additional insights. Measures to characterize RPs of single and two channel data are called Recurrence Quantification Analysis (RQA) and cross RQA (CRQA), respectively. These methods have been applied with a good measure of success in diverse areas. However, they have not been studied extensively in the context of experimental biomedical signals, especially multichannel data. In this thesis, the RP technique and its associated measures are briefly reviewed. Using the computational tools developed for this thesis, the RP technique has been applied on select single channel, multichannel and multimodal (i.e. multiple channels derived from different modalities) biomedical signals. Connectivity analysis is demonstrated as post-processing of RP analysis on multichannel signals such as EEG and fMRI. Finally, a novel metric, based on the modification of a CRQA measure, is proposed, which shows improved results. For the case of a single channel signal, we have considered a large database of HRV signals of 112 subjects, recorded for both normal and abnormal (anxiety disorder and depression disorder) subjects, in both supine and standing positions. Existing RQA measures, Recurrence Rate and Determinism, were used to distinguish between normal and abnormal subjects with an accuracy of 58.93%. A new measure, MLV, has been introduced, with which a classification accuracy of 98.2% is obtained. Correlation between probabilities of recurrence (CPR) is a CRQA measure used to characterize phase synchronization between two signals.
In this work, we demonstrate its utility with application to multimodal and multichannel biomedical signals. First, for the multimodal case, we have computed running CPR (rCPR), a modification proposed by us, which allows dynamic estimation of CPR as a function of time, on multimodal cardiac signals (electrocardiogram and arterial blood pressure) and demonstrated that the method can clearly detect abnormalities (premature ventricular contractions); this has potential applications in cardiac care such as assisted automated diagnosis. Second, for the multichannel case, we have used 16 channel EEG signals recorded under various physiological states such as (i) global epileptic seizure and pre-seizure and (ii) focal epilepsy. CPR was computed pair-wise between the channels and a CPR matrix of all pairs was formed. Contour plot of the CPR matrix was obtained to illustrate synchronization. Statistical analysis of CPR matrix for 16 subjects of global epilepsy showed clear differences between pre-seizure and seizure conditions, and a linear discriminant classifier was used in distinguishing between the two conditions with 100% accuracy. Connectivity analysis of multichannel EEG signals was performed by post-processing of the CPR matrix to understand global network-level characterization of the brain. Brain connectivity using thresholded CPR matrix of multichannel EEG signals showed clear differences in the number and pattern of connections in brain connectivity graph between epileptic seizure and pre-seizure. Corresponding brain headmaps provide meaningful insights about synchronization in the brain in those states. K-means clustering of connectivity parameters of CPR and linear correlation obtained from global epileptic seizure and pre-seizure showed significantly larger cluster centroid distances for CPR as opposed to linear correlation, thereby demonstrating the efficacy of CPR. The headmap in the case of focal epilepsy clearly enables us to identify the focus of the epilepsy which provides certain diagnostic value. Connectivity analysis on multichannel fMRI signals was performed using CPR matrix and graph theoretic analysis. Adjacency matrix was obtained from CPR matrices after thresholding it using statistical significance tests. Graph theoretic analysis based on communicability was performed to obtain community structures for awake resting and anesthetic sedation states. Concurrent behavioral data showed memory impairment due to anesthesia. Given the fact that previous studies have implicated the hippocampus in memory function, the CPR results showing the hippocampus within the community in awake state and out of it in anesthesia state, demonstrated the biological plausibility of the CPR results. On the other hand, results from linear correlation were less biologically plausible. In biological systems, highly synchronized and desynchronized systems are of interest rather than moderately synchronized ones. However, CPR is approximately a monotonic function of synchronization and hence can assume values which indicate moderate synchronization. In order to emphasize high synchronization/ desynchronization and de-emphasize moderate synchronization, a new method of Correlation Synchronization Convergence Time (CSCT) is proposed. It is obtained using an iterative procedure involving the evaluation of CPR for successive autocorrelations until CPR converges to a chosen threshold. 
CSCT was evaluated for 16-channel EEG data and the corresponding contour plots and histograms were obtained, which show better discrimination between synchronized and desynchronized states compared to the conventional CPR. This thesis has demonstrated the efficacy of the RP technique and associated measures in characterizing various classes of biomedical signals. The results obtained are corroborated by well known physiological facts, and they provide physiologically meaningful insights into the functioning of the underlying biological systems, with potential diagnostic value in healthcare.
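A compact sketch of the two core quantities, the recurrence probability and the CPR synchronization index (correlation between the two signals' probabilities of recurrence), is given below; the one-dimensional "embedding" and the fixed thresholds are simplifications of a full RQA analysis.

```python
import numpy as np

def recurrence_probability(x, eps, max_lag):
    """P(tau): fraction of points that recur within eps after lag tau."""
    p = np.empty(max_lag)
    for tau in range(1, max_lag + 1):
        d = np.abs(x[tau:] - x[:-tau])   # 1-D phase space for brevity
        p[tau - 1] = np.mean(d < eps)
    return p

def cpr(x, y, eps_x, eps_y, max_lag=200):
    px = recurrence_probability(x, eps_x, max_lag)
    py = recurrence_probability(y, eps_y, max_lag)
    return np.corrcoef(px, py)[0, 1]

t = np.linspace(0, 20 * np.pi, 2000)
x, y = np.sin(t), np.sin(t + 0.3)        # two phase-synchronized signals
print("CPR:", cpr(x, y, 0.1, 0.1))       # close to 1 for synchronized pairs
```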
Los estilos APA, Harvard, Vancouver, ISO, etc.
34

Chakraborty, Saikat. "Effective Clinical Gait Analysis using Supervised Learning Techniques". Thesis, 2021. http://ethesis.nitrkl.ac.in/10321/1/2021_PhD_SChakraborty_516CS1005_Effective.pdf.

Texto completo
Resumen
Gait analysis has become a popular approach to solving critical problems in different application domains, and it has demonstrated crucial significance in the clinical domain as well. Quantification of the gait pattern through subtle analysis of salient features has surpassed the pitfalls of prevailing qualitative assessment techniques. Computational intelligence techniques, specifically machine learning algorithms, have demonstrated competitive performance in modeling the non-linear relationships among gait variables. Another issue that needs to be emphasized is the expensiveness of prevailing gait assessment systems. High-end sensors make the overall system costly, which is not affordable for most clinics, especially in a developing country; this high expenditure has limited the expansion of gait laboratories across the world, compelling pathologists to follow error-prone qualitative techniques to assess gait. Hence, a low-cost arrangement to analyze gait is greatly needed. This thesis proposes four contributions that address challenging problems of human gait in the clinical domain using low-cost system setups. A few vital issues, like gait event detection, abnormality detection, and feature assessment, are explored and investigated. Essentially, this thesis aims to construct affordable automatic gait abnormality detection systems after addressing some prerequisite issues. Validation of a sensor before using it for clinical purposes is an important issue. Studies have reported that the skeletal data stream of Kinect is not suitable for estimating joint kinematics: although the joint angle time series follows the pattern of the corresponding ground truth, it differs substantially in magnitude. Hence, as an alternative, the color image data stream of Kinect was investigated for joint kinematics in the first contribution. The point cloud feature of Kinect was used to extract lower limb joint positions, which were then converted to joint angles using an extended Kalman filter and a kinematic model. The process was validated against gold standard cameras, and the obtained joint angles were suitable for medical use. Event annotation is considered an initial step in constructing a gait abnormality detection system. Most clinics follow manual annotation of gait events on time series data; however, this method is error-prone and laborious. In contrast, automatic gait event detection systems are gradually becoming popular. This thesis proposes an event detection system using a state-space model (a generic peak-finding illustration is sketched below). A multi-Kinect architecture for overground walking was established, and data were collected from both pathological and normal populations. A state-space model was constructed in which the temporal evolution of the gait signal was modeled by quantifying feature uncertainty, and the inter-state transition frames were marked as the gait events. In addition, an attempt was made for treadmill gait: an unsupervised approach was proposed to detect gait events using a multi-Kinect system. Cerebral Palsy is a widespread disease across the world, and the activities of daily life of patients suffer from distorted gait. Numerous features have been extracted to characterize the gait pattern of this population; however, there is high variability in the recommended features. Prior information on the most important gait features would help to construct a population-specific gait abnormality detection system.
Hence, a well-known statistical approach called meta-analysis, generally used in medical science to estimate the effect of an intervention, has been used to select the most important gait features in the Cerebral Palsy population, and the features were ranked according to their importance level. Automatic abnormality detection systems, specifically for Cerebral Palsy patients, are expensive; on the other hand, systems based on a low-cost sensor like Kinect suffer from several problems. This thesis addresses some of those issues. A clinically relevant walking track was constructed using a multi-Kinect architecture, and an algorithm to remove outliers from the multi-Kinect data has been proposed. Features generally used to detect Cerebral Palsy gait are influenced by gait velocity, so a speed-invariant feature might be beneficial for such systems. Hence, this thesis used a handcrafted speed-invariant feature and compared its performance against the best feature set. Different supervised models were established to construct the abnormality detection systems.
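As a generic illustration only (the thesis itself uses a state-space model with feature-uncertainty quantification), gait-event candidates can be located as peaks in a marker trajectory; the frame rate, thresholds, and sinusoidal "heel" signal below are placeholders.

```python
import numpy as np
from scipy.signal import find_peaks

fs = 30.0                                    # assumed Kinect frame rate (Hz)
t = np.arange(0, 10, 1 / fs)
heel = np.sin(2 * np.pi * 1.0 * t)           # hypothetical heel trajectory

# Candidate heel strikes: prominent peaks at least 0.6 s apart.
peaks, _ = find_peaks(heel, height=0.5, distance=int(0.6 * fs))
print("event times (s):", np.round(peaks / fs, 2))
```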
Los estilos APA, Harvard, Vancouver, ISO, etc.
35

Sahu, Ram Shankar. "Analysis of Mammographic Images for Early Detection of Breast Cancer Using Machine Learning Techniques". Thesis, 2015. http://ethesis.nitrkl.ac.in/7648/1/100.pdf.

Texto completo
Resumen
Breast cancer is a leading cause of death among women. Radiographic images obtained from mammography equipment are among the most frequently used aids for the early detection of breast cancer. The motivation behind this study is to classify the tumour types in breast cancer images: predicting the disease from the visual appearance of tumour types with precision is difficult, particularly when numerous features are involved, and breast cancer is one such case where the phenomenon is very complex and numerous tumour-type features are included. In the present investigation, various pattern recognition techniques were used for the classification of breast cancer using mammogram image processing. Tumour image enhancement, segmentation, texture-based image feature extraction, and the subsequent classification of breast cancer mammogram images were successfully performed. When two machine learning techniques, the Artificial Neural Network (ANN) and the Support Vector Machine (SVM), were used to classify 120 images, the results showed that the ANN classifier achieved a classification rate of 91.31%, while the SVM classifiers achieved 92.11% with a linear kernel and 92.85% with a Radial Basis Function (RBF) kernel.
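A hedged sketch of such a pipeline pairs gray-level co-occurrence matrix (GLCM) texture features with an RBF-kernel SVM; the regions of interest and labels below are synthetic stand-ins for the 120 mammograms, and the feature set is an assumption.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from sklearn.svm import SVC

def glcm_features(img):
    """Contrast/homogeneity/energy/correlation at two angles."""
    glcm = graycomatrix(img, distances=[1], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    props = ("contrast", "homogeneity", "energy", "correlation")
    return np.hstack([graycoprops(glcm, p).ravel() for p in props])

rng = np.random.default_rng(2)
rois = rng.integers(0, 256, size=(40, 64, 64), dtype=np.uint8)  # fake ROIs
labels = rng.integers(0, 2, 40)            # benign vs malignant (fake)

X = np.vstack([glcm_features(r) for r in rois])
clf = SVC(kernel="rbf").fit(X, labels)
print("training accuracy:", clf.score(X, labels))
```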
Los estilos APA, Harvard, Vancouver, ISO, etc.
36

Peng, Jyun-Fan y 彭俊凡. "An analysis of biomedical literatures based on document clustering and text mining techniques". Thesis, 2017. http://ndltd.ncl.edu.tw/handle/a962m7.

Texto completo
Resumen
Master's thesis
University of Taipei (臺北市立大學)
Master's Program in Mathematics Education, Department of Mathematics
2016 (ROC year 105)
This research uses text mining techniques to analyze biomedical literature, namely the National Health Insurance publications indexed in the PubMed database over the recent decade. Digital archiving technology has become increasingly popular and advanced, so that people can access digital archives whenever and wherever they want. Even so, we still rely on human effort to organize the literature, which is inefficient. Therefore, an efficient method for extracting useful information is an important issue that scholars want to resolve. This research analyzes the unstructured data in the literature and explores patterns of interest using text analysis and machine learning clustering techniques such as the Self-Organizing Map and the k-means algorithm. We also cluster the biomedical literature into research fields according to the release years of the publications. Furthermore, we show the change and trend of research topics with data visualization techniques. The research demonstrates that the following topics are the most important issues in recent years: cerebrovascular events and cardiovascular disease, chronic kidney disease and coronary artery disease, and the pharmacotherapy of cardiovascular disease and lung diseases. The number of publications on these topics is growing significantly.
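The clustering step can be sketched with TF-IDF vectors and k-means; the three toy documents below stand in for the PubMed abstracts analyzed in the thesis, and the SOM stage is omitted.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

docs = [
    "chronic kidney disease and coronary artery disease outcomes",
    "stroke risk in cardiovascular disease cohorts",
    "pharmacotherapy of lung diseases under national health insurance",
]
X = TfidfVectorizer(stop_words="english").fit_transform(docs)
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
print(km.labels_)      # cluster assignment per document
```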
Los estilos APA, Harvard, Vancouver, ISO, etc.
37

Liu, Lu. "Applied Fourier Transform Near-infrared Techniques for Biomass Compositional Analysis". 2007. http://trace.tennessee.edu/utk_gradthes/165.

Texto completo
Resumen
A new method for rapid chemical analysis of lignocellulosic biomass was developed using Fourier transform near-infrared (FT-NIR) spectroscopic techniques. The new method is less time-consuming and less expensive than traditional wet chemistry. A mathematical model correlated FT-NIR spectra with concentrations determined by wet chemistry. The chemical compositions of corn stover and switchgrass were evaluated in terms of glucose, xylose, galactose, arabinose, mannose, lignin, and ash. Model development evaluated multivariate regressions, spectral transform algorithms, and spectral pretreatments, selecting partial least squares regression, log(1/R), and extended multiplicative signal correction, respectively. Chemical composition results indicated greater variability in corn stover than in switchgrass, especially among botanic parts. Glucose percentage was higher in internodes (>40%) than in nodes or leaves (~30-40%). Leaves had the highest percentage of lignin (~23-25%) and ash (~4-9%), while the husk had the highest total sugar percentage (~77%). Individual FT-NIR predictive models were developed with good accuracy for corn stover and switchgrass. Root mean square errors of prediction (RMSEPs) from cross-validation for glucose, xylose, galactose, arabinose, mannose, lignin, and ash were 0.633, 0.620, 0.235, 0.374, 0.203, 0.458, and 0.266 (%w/w), respectively, for switchgrass, and 1.407, 1.346, 0.201, 0.341, 0.321, 1.087, and 0.700 (%w/w), respectively, for corn stover. A single general model for corn stover and switchgrass was developed and validated for general biomass using a combination of independent samples of corn stover, switchgrass, and wheat straw. RMSEPs of this general model under cross-validation were 1.153, 1.208, 0.425, 0.578, 0.282, 1.347, and 0.530 %w/w for glucose, xylose, galactose, arabinose, mannose, lignin, and ash, respectively. RMSEPs for independent validation were smaller than those obtained by cross-validation. Prediction of major constituents satisfied standardized quality control criteria established by the American Association of Cereal Chemists. FT-NIR analysis also predicted the higher heating value (HHV) with an RMSEP of 53.231 J/g and a correlation of 0.971. An application of the developed method is the rapid analysis of the chemical composition of biomass feedstocks to enable improved targeting of plant botanic components to conversion processes including, but not limited to, fermentation and gasification.
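A minimal sketch of the calibration idea, partial least squares regression with a cross-validated RMSEP, on simulated spectra; the number of latent components and the synthetic data are assumptions, not the study's settings.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(3)
spectra = rng.standard_normal((60, 500))          # 60 samples x 500 NIR bands
glucose = spectra[:, 100] * 2.0 + rng.normal(0, 0.1, 60)  # synthetic reference

pls = PLSRegression(n_components=8)
pred = cross_val_predict(pls, spectra, glucose, cv=10).ravel()
rmsep = np.sqrt(np.mean((pred - glucose) ** 2))
print(f"RMSEP: {rmsep:.3f} %w/w (synthetic data)")
```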
Los estilos APA, Harvard, Vancouver, ISO, etc.
38

Munn, Lance Leon. "The application of digital image analysis techniques to studies of lymphocyte function". Thesis, 1993. http://hdl.handle.net/1911/16655.

Texto completo
Resumen
Video microscopy and digital image analysis methods were developed for studying immune cell function. Firstly, fluorescence energy transfer spectroscopy was used to monitor the evolution of the intercellular contact area between conjugating lymphocytes. Two-color conjugates (containing a donor-labeled and an acceptor-labeled cell) were identified and followed as the adhesion progressed. As the labeled membranes came into close apposition during the conjugation, the quenching of donor probe fluorescence due to close proximity to the acceptor molecules was used as an indicator of the intercellular contact area. An analysis of the experimental uncertainties arising from spatial inhomogeneities in the fluorescent labeling and photobleaching rate constants is presented along with experimental results. Secondly, an assay of lymphocyte adhesion based on time-resolved morphological measurements of intercellular aggregation is presented. Homotypic lymphocyte aggregation was induced and followed using video microscopy and time-lapse recording. The rate of aggregation was accurately represented by the temporal evolution of the aggregate size distribution, and an analysis of aggregate morphologies allowed comparisons of cytoskeletal activity. A mathematical model developed for the kinetics of aggregation aided in interpretation of experimental data. Results from a series of aggregation experiments using Jurkat and K562 cells treated with various activating antibodies are presented to demonstrate the capabilities of the assay and the accuracy of the model. The results show that the assay is sensitive enough to compare aggregation events which were induced through distinct molecular epitopes. Aggregate size distribution plots of actual experimental data show good agreement with predictions of the model. Thirdly, video microscopy and digital imaging were used as a non-invasive, quantitative assay of lymphocyte activation and proliferation. The mean cell sizes of T lymphocytes in an activation kinetics assay were measured by digital image analysis and compared to [3H]-thymidine incorporation of cells under the same treatment. The digital imaging assay was more sensitive than the [3H]-thymidine incorporation assay in determining the earliest time-point of activation.
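The abstract does not spell out the aggregation model, so, as an assumption, the sketch below integrates the classic Smoluchowski coagulation equations with a constant kernel, a standard starting point for homotypic aggregation kinetics and the evolution of the aggregate size distribution.

```python
import numpy as np
from scipy.integrate import solve_ivp

K, N = 1e-3, 10                       # constant kernel; track sizes 1..N

def smoluchowski(t, n):
    """dn_s/dt: gain from pairwise mergers minus loss to larger sizes."""
    dn = np.zeros_like(n)
    for k in range(N):                # k indexes aggregate size k+1
        gain = 0.5 * K * sum(n[i] * n[k - 1 - i] for i in range(k))
        loss = K * n[k] * n.sum()     # truncation above size N is ignored
        dn[k] = gain - loss
    return dn

n0 = np.zeros(N)
n0[0] = 1e3                           # start from singlets only
sol = solve_ivp(smoluchowski, (0, 60), n0, t_eval=[0, 30, 60])
print(sol.y[:3, -1])                  # singlets, doublets, triplets at t=60
```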
Los estilos APA, Harvard, Vancouver, ISO, etc.
39

Sana, Furrukh. "Efficient Techniques of Sparse Signal Analysis for Enhanced Recovery of Information in Biomedical Engineering and Geosciences". Diss., 2016. http://hdl.handle.net/10754/621865.

Texto completo
Resumen
Sparse signals are abundant among both natural and man-made signals. Sparsity implies that the signal essentially resides in a small-dimensional subspace, and it can be exploited to improve the recovery of the signal from limited and noisy observations. Traditional estimation algorithms generally lack the ability to take advantage of signal sparsity. This dissertation considers several problems in the areas of biomedical engineering and geosciences with the aim of enhancing the recovery of information by exploiting the underlying sparsity in the problem. The objective is to overcome the fundamental bottlenecks, both in terms of estimation accuracy and required computational resources. In the first part of the dissertation, we present a high-precision technique for the monitoring of human respiratory movements by exploiting the sparsity of wireless ultra-wideband signals. The proposed technique provides a novel methodology for overcoming the Nyquist sampling constraint and enables robust performance in the presence of noise and interference. We also present a comprehensive framework for the important problem of extracting fetal electrocardiogram (ECG) signals from abdominal ECG recordings of pregnant women. The multiple measurement vectors approach utilized for this purpose provides an efficient mechanism for exploiting the common structure of ECG signals, when represented in sparse transform domains, and allows leveraging information from multiple ECG electrodes under a joint estimation formulation. In the second part of the dissertation, we adopt sparse signal processing principles for improved information recovery in large-scale subsurface reservoir characterization problems. We propose multiple new algorithms for the sparse representation of subsurface geological structures, the incorporation of useful prior information in the estimation process, and the reduction of the computational complexity of the problem. The techniques presented here enable significantly enhanced imaging of the subsurface earth and result in substantial savings in convergence time, leading to optimized placement of oil wells. This dissertation demonstrates through detailed experimental analysis that the sparse estimation approach not only enables enhanced information recovery in a variety of application areas, but also greatly helps in reducing the computational complexities associated with the problems.
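The underlying sparse-recovery principle can be sketched with orthogonal matching pursuit on a random sensing matrix; this stands in for the dissertation's more elaborate multiple-measurement-vector and reservoir-characterization formulations.

```python
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

rng = np.random.default_rng(4)
n, m, k = 256, 64, 5                  # signal length, measurements, sparsity
x = np.zeros(n)
x[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
A = rng.standard_normal((m, n)) / np.sqrt(m)   # random sensing matrix
y = A @ x                                      # compressed observations

omp = OrthogonalMatchingPursuit(n_nonzero_coefs=k).fit(A, y)
recovered = set(np.flatnonzero(omp.coef_))
print("support recovered:", recovered == set(np.flatnonzero(x)))
```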
Los estilos APA, Harvard, Vancouver, ISO, etc.
40

dos, Santos Ketson Roberto Maximiano. "Stochastic dynamics and wavelets techniques for system response analysis and diagnostics: Diverse applications in structural and biomedical engineering". Thesis, 2019. https://doi.org/10.7916/d8-d26y-tz96.

Texto completo
Resumen
In the first part of the dissertation, a novel stochastic averaging technique based on a Hilbert transform definition of the oscillator response displacement amplitude is developed. In comparison to standard stochastic averaging, the requirement of “a priori” determination of an equivalent natural frequency is bypassed, yielding flexibility in the ensuing analysis and potentially higher accuracy. Further, the herein proposed Hilbert transform based stochastic averaging is adapted for determining the time-dependent survival probability and first-passage time probability density function of stochastically excited nonlinear oscillators, even endowed with fractional derivative terms. To this aim, a Galerkin scheme is utilized to solve approximately the backward Kolmogorov partial differential equation governing the survival probability of the oscillator response. Next, the potential of the stochastic averaging technique to be used in conjunction with performance-based engineering design applications is demonstrated by proposing a stochastic version of the widely used incremental dynamic analysis (IDA). Specifically, modeling the excitation as a non-stationary stochastic process possessing an evolutionary power spectrum (EPS), an approximate closed-form expression is derived for the parameterized oscillator response amplitude probability density function (PDF). In this regard, IDA surfaces are determined providing the conditional PDF of the engineering demand parameter (EDP) for a given intensity measure (IM) value. In contrast to the computationally expensive Monte Carlo simulation, the methodology developed herein determines the IDA surfaces at minimal computational cost. In the second part of the dissertation, a novel multiple-input/single-output (MISO) system identification technique is developed for parameter identification of nonlinear and time-variant oscillators with fractional derivative terms subject to incomplete non-stationary data. The technique utilizes a representation of the nonlinear restoring forces as a set of parallel linear sub-systems. Next, a recently developed L1-norm minimization procedure based on compressive sensing theory is applied for determining the wavelet coefficients of the available incomplete non-stationary input-output (excitation-response) data. Several numerical examples are considered for assessing the reliability of the technique, even in the presence of incomplete and corrupted data. These include a 2-DOF time-variant Duffing oscillator endowed with fractional derivative terms, as well as a 2-DOF system subject to flow-induced forces where the non-stationary sea state possesses a recently proposed evolutionary version of the JONSWAP spectrum. In the third part of this dissertation, a joint time-frequency analysis technique based on generalized harmonic wavelets (GHWs) is developed for dynamic cerebral autoregulation (DCA) performance quantification. DCA is the continuous counter-regulation of the cerebral blood flow by the active response of cerebral blood vessels to the spontaneous or induced blood pressure fluctuations. Specifically, various metrics of the phase shift and magnitude of appropriately defined GHW-based transfer functions are determined based on data points over the joint time-frequency domain. The potential of these metrics to be used as a diagnostics tool for indicating healthy versus impaired DCA function is assessed by considering both healthy individuals and patients with unilateral carotid artery stenosis. 
Next, another application in biomedical engineering is pursued related to the Pulse Wave Imaging (PWI) technique. This relies on ultrasonic signals for capturing the propagation of pressure pulses along the carotid artery, and eventually for prognosis of focal vascular diseases (e.g., atherosclerosis and abdominal aortic aneurysm). However, to obtain a high spatio-temporal resolution the data are acquired at a high rate, in the order of kilohertz, yielding large datasets. To address this challenge, an efficient data compression technique is developed based on the multiresolution wavelet decomposition scheme, which exploits the high correlation of adjacent RF-frames generated by the PWI technique. Further, a sparse matrix decomposition is proposed as an efficient way to identify the boundaries of the arterial wall in the PWI technique.
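The compression idea for PWI frames can be sketched as a multiresolution wavelet decomposition in which only the largest coefficients are retained; the real scheme additionally exploits the high correlation across adjacent RF frames, which is ignored in this single-frame sketch.

```python
import numpy as np
import pywt

def compress_frame(frame, wavelet="db4", level=4, keep=0.05):
    """Keep the top `keep` fraction of wavelet coefficients by magnitude."""
    coeffs = pywt.wavedec(frame, wavelet, level=level)
    arr, slices = pywt.coeffs_to_array(coeffs)
    thr = np.quantile(np.abs(arr), 1 - keep)
    arr[np.abs(arr) < thr] = 0.0                 # hard thresholding
    kept = pywt.array_to_coeffs(arr, slices, output_format="wavedec")
    return pywt.waverec(kept, wavelet)

frame = np.random.default_rng(5).standard_normal(1024)   # fake RF line
rec = compress_frame(frame)
print("relative error:",
      np.linalg.norm(rec - frame) / np.linalg.norm(frame))
```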
Los estilos APA, Harvard, Vancouver, ISO, etc.
41

Shields, IV Charles Wyatt. "Acoustic and Magnetic Techniques for the Isolation and Analysis of Cells in Microfluidic Platforms". Diss., 2016. http://hdl.handle.net/10161/12188.

Texto completo
Resumen

Cancer comprises a collection of diseases, all of which begin with abnormal tissue growth from various stimuli, including (but not limited to): heredity, genetic mutation, exposure to harmful substances, radiation as well as poor dieting and lack of exercise. The early detection of cancer is vital to providing life-saving, therapeutic intervention. However, current methods for detection (e.g., tissue biopsy, endoscopy and medical imaging) often suffer from low patient compliance and an elevated risk of complications in elderly patients. As such, many are looking to “liquid biopsies” for clues into presence and status of cancer due to its minimal invasiveness and ability to provide rich information about the native tumor. In such liquid biopsies, peripheral blood is drawn from patients and is screened for key biomarkers, chiefly circulating tumor cells (CTCs). Capturing, enumerating and analyzing the genetic and metabolomic characteristics of these CTCs may hold the key for guiding doctors to better understand the source of cancer at an earlier stage for more efficacious disease management.

The isolation of CTCs from whole blood, however, remains a significant challenge due to their (i) low abundance, (ii) lack of a universal surface marker and (iii) epithelial-mesenchymal transition that down-regulates common surface markers (e.g., EpCAM), reducing their likelihood of detection via positive selection assays. These factors potentiate the need for an improved cell isolation strategy that can collect CTCs via both positive and negative selection modalities as to avoid the reliance on a single marker, or set of markers, for more accurate enumeration and diagnosis.

The technologies proposed herein offer a unique set of strategies to focus, sort and template cells in three independent microfluidic modules. The first module exploits ultrasonic standing waves and a class of elastomeric particles for the rapid and discriminate sequestration of cells. This type of cell handling holds promise not only in sorting, but also in the isolation of soluble markers from biofluids. The second module contains components to focus (i.e., arrange) cells via forces from acoustic standing waves and separate cells in a high throughput fashion via free-flow magnetophoresis. The third module uses a printed array of micromagnets to capture magnetically labeled cells into well-defined compartments, enabling on-chip staining and single cell analysis. These technologies can operate in standalone formats, or can be adapted to operate with established analytical technologies, such as flow cytometry. A key advantage of these innovations is their ability to process erythrocyte-lysed blood in a rapid (and thus high throughput) fashion. They can process fluids at a variety of concentrations and flow rates, target cells with various immunophenotypes and sort cells via positive (and potentially negative) selection. These technologies are chip-based, fabricated using standard clean room equipment, towards a disposable clinical tool. With further optimization in design and performance, these technologies might aid in the early detection, and potentially treatment, of cancer and various other physical ailments.
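Not from the dissertation itself: a standard Gor'kov-type estimate of the primary acoustic radiation force on a small sphere in a 1-D ultrasonic standing wave, which is the physics behind sorting cells against negative-contrast elastomeric particles; all material and field values below are illustrative.

```python
import numpy as np

def radiation_force(a, rho_p, kappa_p, rho_f, kappa_f, p_a, f, c_f, x):
    """Primary radiation force on a small sphere at position x."""
    k = 2 * np.pi * f / c_f                       # wavenumber
    e_ac = p_a**2 / (4 * rho_f * c_f**2)          # acoustic energy density
    rho_t, kap_t = rho_p / rho_f, kappa_p / kappa_f
    phi = (5 * rho_t - 2) / (2 * rho_t + 1) / 3 - kap_t / 3  # contrast factor
    return 4 * np.pi * phi * k * a**3 * e_ac * np.sin(2 * k * x)

# A cell-like particle (positive contrast) in water at 2 MHz:
F = radiation_force(a=5e-6, rho_p=1050, kappa_p=4.0e-10,
                    rho_f=1000, kappa_f=4.5e-10,
                    p_a=1e5, f=2e6, c_f=1480, x=1e-4)
print(f"{F:.2e} N")   # sign flips for negative-contrast particles
```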


Dissertation
Los estilos APA, Harvard, Vancouver, ISO, etc.
42

Lo, Pei-yu y 羅珮瑜. "The Analysis of Skin Reaction after UVB Irradiation by Biomedical Engineering Techniques and Mathematical Model for Minimal Erythema Doses (MED) Predicting". Thesis, 2008. http://ndltd.ncl.edu.tw/handle/35953854107972038580.

Texto completo
Resumen
Master's thesis
National Cheng Kung University (國立成功大學)
Master's and Doctoral Program, Institute of Biomedical Engineering
2007 (ROC year 96)
This study investigates and analyzes the parameters associated with the skin reaction after ultraviolet B (UVB) irradiation using non-invasive measuring devices. Transepidermal water loss and capacitance measurements are used to quantitatively characterize the skin response. Confocal laser scanning microscopy is employed to analyze, in vivo and in real time, tissue changes in optical sections of the skin. A laser Doppler perfusion imager is used to monitor the blood perfusion of the skin. Colorimetric measurements (Chromameter a*, b*, and L* values; Mexameter Hb readings for the skin surface melanin index and erythema index) are also employed to record changes in skin color. In this study, 20 healthy Chinese volunteers were recruited. Two sites of 20 mm diameter on the volar forearms were irradiated with UVB dosages of 100 mJ/cm² and 200 mJ/cm², respectively. The skin was characterized with the above-mentioned devices before irradiation, immediately after irradiation, and 24 hours after irradiation. The experimental results show that colorimetric measurement and visual scoring (VS) provide the best discrimination across UVB irradiation dosages. Both a* and the erythema index (EI) show a very good positive linear relation to VS, with a* better than EI. The a* value provides the more reliable information and is used to build a mathematical model for predicting the minimal erythema dose (MED). Compared with the experimental data, the predicted value is lower than the visual scoring by 10 mJ/cm²; however, the correlation between MED values obtained by visual assessment and by mathematical prediction is fairly high (Pearson correlation coefficient = 0.758). In conclusion, a mathematical model for estimating the MED of UVB-irradiated skin based on a* is constructed in this study. In the future, more precise UVB treatment procedures, with greater efficacy and lower risk, may be expected.
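The modelling idea reduces to fitting a dose-response line for a* and inverting it at an erythema criterion; all doses, a* readings, and the threshold below are illustrative numbers, not the study's data.

```python
import numpy as np

doses = np.array([0, 50, 100, 150, 200])          # mJ/cm^2 (hypothetical)
a_star = np.array([8.1, 9.0, 10.2, 11.1, 12.3])   # Chromameter a* readings

slope, intercept = np.polyfit(doses, a_star, 1)   # linear dose-response fit
a_threshold = 10.0                                # assumed erythema criterion
med = (a_threshold - intercept) / slope           # invert the fit at threshold
print(f"predicted MED ~ {med:.0f} mJ/cm^2")
```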
Los estilos APA, Harvard, Vancouver, ISO, etc.
43

"Remote Sensing For Vital Signs Monitoring Using Advanced Radar Signal Processing Techniques". Doctoral diss., 2018. http://hdl.handle.net/2286/R.I.51751.

Texto completo
Resumen
In the past half century, low-power wireless signals from portable radar sensors, initially continuous-wave (CW) radars and more recently ultra-wideband (UWB) radar systems, have been successfully used to detect physiological movements of stationary human beings. The thesis starts with a careful review of existing signal processing techniques and state-of-the-art methods for vital signs monitoring using UWB impulse systems, followed by an in-depth analysis of the various approaches. Robust heart-rate monitoring methods are proposed based on a novel result: spectrally, the fundamental heartbeat frequency is respiration-interference-limited while its higher-order harmonics are noise-limited. The higher-order statistics related to the heartbeat can be a robust indication when the fundamental heartbeat is masked by the strong lower-order harmonics of respiration, or when phase calibration is not accurate if a phase-based method is used. Analytical spectral analysis is performed to validate that the higher-order harmonics of the heartbeat are almost free of respiration interference. Extensive experiments have been conducted to justify an adaptive heart-rate monitoring algorithm. The scenarios of interest are: 1) a single subject, 2) multiple subjects at different ranges, 3) multiple subjects at the same range, and 4) through-wall monitoring. A remote sensing radar system implemented using the proposed adaptive heart-rate estimation algorithm is compared to a competing remote sensing technology, a remote imaging photoplethysmography system, showing promising results. State-of-the-art methods for vital signs monitoring are fundamentally based on processing the phase variation due to vital-sign motion, and their performance is determined by a phase calibration procedure. Existing methods fail to consider the time-varying nature of phase noise, and there is no prior knowledge about which of the corrupted complex signals, the in-phase component (I) or the quadrature component (Q), needs to be corrected. A precise phase calibration routine is proposed based on the respiration pattern. The I/Q samples from every breath are more likely to experience similar motion noise and therefore should be corrected independently. A high slow-time sampling rate is used to ensure phase calibration accuracy. Occasionally, a 180-degree phase shift error occurs after the initial calibration step and should be corrected as well; all phase trajectories in the I/Q plot are only allowed in certain angular spaces. This precise phase calibration routine is validated through computer simulations incorporating a time-varying phase noise model, a controlled mechanical system, and human subject experiments.
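A simplified sketch of the harmonic idea: when respiration harmonics mask the fundamental heartbeat line, search the spectrum around the second heartbeat harmonic instead. The radar phase signal below is simulated with a crude nonlinear mixing term, not measured data.

```python
import numpy as np

fs, T = 100.0, 60.0
t = np.arange(0, T, 1 / fs)
resp = 1.0 * np.sin(2 * np.pi * 0.3 * t)     # respiration, 0.3 Hz, strong
heart = 0.05 * np.sin(2 * np.pi * 1.2 * t)   # heartbeat, 1.2 Hz (72 bpm), weak
sig = resp + resp**2 + heart + heart**2      # crude nonlinear mixing

spec = np.abs(np.fft.rfft(sig * np.hanning(len(sig))))
freqs = np.fft.rfftfreq(len(sig), 1 / fs)

band = (freqs > 2.0) & (freqs < 3.2)         # 2nd-harmonic search band (assumed)
hr = freqs[band][np.argmax(spec[band])] / 2 * 60
print(f"estimated heart rate: {hr:.1f} bpm") # ~72 bpm despite strong respiration
```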
Dissertation/Thesis
Doctoral Dissertation Electrical Engineering 2018
Los estilos APA, Harvard, Vancouver, ISO, etc.
44

Supriya, Supriya. "Brain Signal Analysis and Classification by Developing New Complex Network Techniques". Thesis, 2020. https://vuir.vu.edu.au/40551/.

Texto completo
Resumen
Brain signal analysis has a crucial role in the investigation of neuronal activity for the diagnosis of brain diseases and disorders. The electroencephalogram (EEG) is the most efficient biomarker for brain signal analysis: it assists in the diagnosis and medication of brain disorders and plays an essential role in all neurosurgery related to the brain. EEG findings document the precise condition and clinical picture of brain dysfunctions and have an undisputed role in the detection of epilepsy, sleep disorders, and alcohol-related dysfunctions. Clinicians visually study the EEG recording to determine the manifestation of abnormalities in the brain, but visual EEG assessment is tiresome, fallible, and expensive. In this dissertation, a number of frameworks have been developed for the analysis and classification of EEG signals in three different domains: epilepsy, sleep staging, and alcohol use disorder. Epilepsy is a non-contagious chronic disease of the brain that affects around 65 million people worldwide. The sudden onset of epileptic attacks makes sufferers vulnerable to injuries, and it is challenging for clinical staff to detect epileptic-seizure activity early enough to determine the semiology associated with seizure onset. For that reason, automated techniques that can accurately detect epilepsy from EEG are of great importance to epileptic patients, especially those who are resistant to therapies and medications. In this dissertation, four different techniques (named Weighted Visibility Network, Weighted Horizontal Visibility Network, Weighted Complex Network, and New Weighted Complex Network) have been developed for the automated identification of epileptic activity from EEG signals. Most of the developed schemes attained 100% classification outcomes in their experimental evaluation of the identification of seizure versus non-seizure activity. A sleep disorder can increase the risk of seizure incidence or severity, impair cognitive tasks, alter mood, diminish the functionality of the immune system, and aggravate other brain anomalies such as insomnia and sleep apnoea. Hence, sleep staging, discriminating among distinct sleep stages, is essential for the diagnosis of sleep and its disorders. The EEG provides vital and inimitable information regarding the sleeping brain, and EEG studies have documented deformities in sleep patterns. This research has developed an innovative graph-theory-based framework, the weighted visibility network, for sleep staging from EEG signals; it attains 97.93% overall classification accuracy in categorizing distinct sleep states. Alcoholism causes memory issues as well as motor skill defects by affecting different portions of the brain; excessive use of alcohol can cause sudden cardiac death and cardiomyopathy, and alcohol use disorder also leads to respiratory infections, vision impairment, liver damage, and cancer. Research demonstrates the use of EEG for diagnosing patients at high risk of alcohol-related developmental impediments. In this Ph.D. project, I developed a weighted-graph-based technique that analyses EEG to distinguish between alcoholic and non-alcoholic subjects. The promising classification outcomes demonstrate the effectiveness of the proposed technique.
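A sketch of a weighted visibility graph is given below, assuming the natural-visibility criterion and an arctangent-of-slope edge weight (one published weighting choice, not necessarily this thesis's exact definition); complex-network features of the resulting graph would then feed a classifier.

```python
import numpy as np
import networkx as nx

def weighted_visibility_graph(y):
    """Natural visibility graph of a 1-D series with slope-angle weights."""
    n = len(y)
    g = nx.Graph()
    g.add_nodes_from(range(n))
    for a in range(n - 1):
        for b in range(a + 1, n):
            # (a, b) are connected if every sample between them lies
            # strictly below the straight line joining the two samples.
            visible = all(
                y[c] < y[a] + (y[b] - y[a]) * (c - a) / (b - a)
                for c in range(a + 1, b)
            )
            if visible:
                g.add_edge(a, b, weight=np.arctan((y[b] - y[a]) / (b - a)))
    return g

eeg = np.random.default_rng(6).standard_normal(200)  # stand-in EEG epoch
g = weighted_visibility_graph(eeg)
print("edges:", g.number_of_edges())
```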
Los estilos APA, Harvard, Vancouver, ISO, etc.
46

AMARU', Fabio. "Multimodal techniques for biomedical image processing". Doctoral thesis, 2014. http://hdl.handle.net/11562/693559.

Texto completo
Resumen
The PhD work involved three main biomedical research areas. In the first, we assessed whether T1 relaxometry measurements can help identify structural predictors of mild cognitive impairment in patients with relapsing-remitting multiple sclerosis (RRMS). Twenty-nine healthy controls (HC) and forty-nine RRMS patients underwent high-resolution 3T magnetic resonance imaging to obtain optimal cortical and white matter lesion counts and volumes, as well as T1 relaxation times (rt). In white matter lesions (WML) and cortical lesions of type I (CL I, mixed white-gray matter), T1 rt z-scores were significantly longer than in HC tissue (p<0.001 and p<0.01, respectively), indicating loss of tissue structure. Multivariate regression analysis revealed that T1 rt z-scores in CL I were independent predictors of long-term memory retrieval (p=0.01), while T1 rt z-scores in WML were independent predictors of deficits in sustained attention and information processing (p=0.02). In the second area, we describe a room-temperature biomagnetic susceptometer able to quantify liver iron overload. Using an electronically modulated magnetic field, the system measures magnetic signals 10^8 times weaker than the applied field. The mechanical noise of the room-temperature susceptometer is minimized, and thermal drift is monitored by an automatic balance-control system. We tested and calibrated the instrument using a cylindrical phantom filled with a hexahydrated iron(II) chloride solution, obtaining a correlation of R=0.98 between the maximum susceptometer response and the iron concentration. These measurements indicate that the acquisition time must be 8 seconds or less to keep the output-signal variability at about 4-5%, equivalent to roughly 500 ug/g wet weight of iron. In the third area, a 3D anatomically detailed finite element model of the human foot was obtained by density-based 3D segmentation and reconstruction techniques applied to standard DICOM Computed Tomography (CT) images, in conjunction with computer-aided 3D modelling and finite element analysis (FEA). The model accounts for the real morphology of the plantar fat pad, which was shown to play a very important role during contact with the ground. To obtain experimental data for comparison with the predictions of the 3D foot model, a static posturographic examination on a baropodometric platform was carried out. The experimental plantar contact pressure was qualitatively comparable with the FEA-predicted results, notably the peak pressure values at the central heel region and beneath the metatarsal heads.
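As an illustration of the statistical convention in the first study, a lesion's T1 relaxation time can be expressed as a z-score against the healthy-control distribution for the same tissue class; positive z-scores then correspond to prolonged T1. The numbers in this sketch are hypothetical and only show the computation.

```python
import numpy as np

# Hypothetical T1 relaxation times (ms): the healthy-control reference
# distribution for one tissue class, and one patient's lesion values.
hc_t1 = np.array([820.0, 845.0, 810.0, 833.0, 828.0])   # healthy controls
patient_lesion_t1 = np.array([905.0, 930.0, 880.0])      # patient lesions

# z-score of each lesion T1 against the HC mean/std; values well above
# zero indicate prolonged T1, read in the study as loss of tissue structure.
z = (patient_lesion_t1 - hc_t1.mean()) / hc_t1.std(ddof=1)
print(np.round(z, 2))
```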
Los estilos APA, Harvard, Vancouver, ISO, etc.
47

Agrawal, Alankar. "Computational and mathematical analysis of dynamics of fused deposition modelling based rapid prototyping technique for scaffold fabrication". Thesis, 2014. http://ethesis.nitrkl.ac.in/5763/1/212BM1348-6.pdf.

Texto completo
Resumen
Fused Deposition Modelling (FDM), a rapid prototyping technique, is used to fabricate three-dimensional (3D) objects. Recently, the technique has been considered the most promising for fabricating 3D scaffolds from polymeric materials for biomedical applications, including tissue engineering. It has been reported, however, that a scaffold built by the FDM layer-by-layer deposition process does not reproduce the geometry designed in computer-aided design (CAD) software. In this context, adjusting instrument parameters such as the extruder nozzle diameter, nozzle angle, and liquefier length is of paramount importance for achieving improved extruded-melt flow behaviour and scaffold design. The main focus of this thesis is therefore to analyse the flow behaviour of the PCL scaffold material using computational and mathematical tools, by varying the nozzle diameter, nozzle angle, and nozzle length of the existing FDM machine. The analysis shows that reducing the nozzle diameter and nozzle angle results in a higher pressure drop, which leads to finer scaffold geometry. The proposed design suggests that the nozzle diameter can be decreased from 0.5 mm to 0.2 mm with a nozzle angle of 120 degrees, which will increase the pressure at the nozzle tip and decrease the extruded-melt diameter, contributing to better resolution during scaffold fabrication.
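The qualitative link between tip diameter and pressure drop can be sketched with a lubrication-approximation estimate (locally Hagen-Poiseuille flow) for a Newtonian melt in a linearly tapered nozzle. The PCL viscosity, flow rate, and nozzle dimensions below are assumed values for illustration; the thesis's computational model is more detailed.

```python
import numpy as np

def nozzle_pressure_drop(r_in, r_out, length, flow_rate, viscosity, n=2000):
    """Lubrication-approximation pressure drop across a linearly
    tapering nozzle: dP/dz = 8*mu*Q / (pi * R(z)^4), integrated over z.
    """
    z = np.linspace(0.0, length, n)
    r = r_in + (r_out - r_in) * z / length        # linear taper
    dpdz = 8.0 * viscosity * flow_rate / (np.pi * r**4)
    # trapezoidal integration of dP/dz over the nozzle length
    return float(np.sum(0.5 * (dpdz[1:] + dpdz[:-1]) * np.diff(z)))

# Assumed values for illustration only (not from the thesis):
mu = 300.0   # Pa*s, PCL melt viscosity (assumed)
q = 1e-9     # m^3/s, volumetric flow rate (assumed)
for d_tip in (0.5e-3, 0.2e-3):
    dp = nozzle_pressure_drop(r_in=1.0e-3, r_out=d_tip / 2.0,
                              length=5e-3, flow_rate=q, viscosity=mu)
    print(f"tip diameter {d_tip * 1e3:.1f} mm -> pressure drop {dp / 1e3:.1f} kPa")
```

Running this with the assumed numbers reproduces the abstract's trend: shrinking the tip from 0.5 mm to 0.2 mm raises the pressure drop by roughly an order of magnitude.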
Los estilos APA, Harvard, Vancouver, ISO, etc.
48

He, Siheng. "Advanced Analysis Algorithms for Microscopy Images". Thesis, 2015. https://doi.org/10.7916/D8TB16MV.

Texto completo
Resumen
Microscope imaging is a fundamental experimental technique in many diverse research fields, especially biomedical research. It begins with basic arithmetic operations intended to reproduce the information contained in the experimental sample. With the rapid advancement of CCD cameras and microscopes (e.g. STORM, GSD), image processing algorithms that extract information more accurately and faster are highly desirable. The overarching goal of this dissertation is to further improve image analysis algorithms. As most microscope imaging applications start with fluorescence quantification, we first develop a quantification method for the fluorescence of adsorbed proteins on microtubules. Based on the quantified results, the adsorption of streptavidin and neutravidin to biotinylated microtubules is found to exhibit negative cooperativity due to electrostatic interactions and steric hindrance. This behavior is modeled by a newly developed kinetic analogue of the Fowler-Guggenheim adsorption model. The complex adsorption kinetics of streptavidin to biotinylated structures suggests that the nanoscale architecture of binding sites can result in complex binding kinetics, which must therefore be considered when these intermolecular bonds are employed in self-assembly and nanobiotechnology. In the second part, a powerful lock-in algorithm is introduced for image analysis. A classic signal processing algorithm, the lock-in amplifier, is extended to two dimensions (2D) to extract the signal in patterned images. The algorithm was evaluated using simulated image data and experimental microscopy images to extract the fluorescence signal of fluorescently labeled proteins adsorbed on surfaces patterned by chemical vapor deposition (CVD), and it was capable of retrieving the signal at a signal-to-noise ratio (SNR) as low as -20 dB. The methodology holds promise not only for measuring adsorption events on patterned surfaces but for all situations where a signal must be extracted from a noisy background in two or more dimensions. The third part develops an automated software pipeline for image analysis, Fluorescent Single Molecule Image Analysis (FSMIA), customized especially for single-molecule imaging. While processing microscopy image stacks, it extracts physical parameters (e.g. location, fluorescence intensity) for each molecular object and connects molecules in different frames into trajectories, facilitating common analysis tasks such as diffusion analysis and residence-time analysis. Finally, in the last part, a new algorithm is developed for localizing imaged objects based on a search for the best-correlated center. This approach yields tracking accuracies comparable to those of Gaussian fitting at typical signal-to-noise ratios, but with an order-of-magnitude faster execution. The algorithm is well suited to super-resolution localization microscopy methods, since these rely on accurate and fast localization, and it can be adapted to localize objects that do not exhibit radial symmetry or that must be localized in higher-dimensional spaces. Throughout this dissertation, the accuracy, precision, and implementation of new image processing algorithms are highlighted. The findings not only advance the theory behind digital image processing but also enrich the toolbox for microscopy image analysis.
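A minimal two-dimensional lock-in demodulation, in the spirit of the second part, can be sketched as follows: multiply the image by in-phase and quadrature references at the known pattern spatial frequency, low-pass filter the products, and take the local amplitude. The reference frequency, filter width, and synthetic data below are assumptions for illustration, not the dissertation's exact implementation.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def lockin_2d(image, fx, fy, sigma=8.0):
    """2-D lock-in demodulation sketch: mix the image with in-phase and
    quadrature references at spatial frequency (fx, fy) in cycles/pixel,
    low-pass filter both products, and return the local amplitude map."""
    ny, nx = image.shape
    yy, xx = np.mgrid[0:ny, 0:nx]
    phase = 2 * np.pi * (fx * xx + fy * yy)
    i_comp = gaussian_filter(image * np.cos(phase), sigma)
    q_comp = gaussian_filter(image * np.sin(phase), sigma)
    return 2.0 * np.hypot(i_comp, q_comp)

# Synthetic test: a weak stripe pattern buried in strong noise.
rng = np.random.default_rng(1)
ny, nx, fx = 256, 256, 1 / 16                    # stripe period = 16 px
yy, xx = np.mgrid[0:ny, 0:nx]
signal = 0.1 * (1 + np.cos(2 * np.pi * fx * xx))
noisy = signal + rng.standard_normal((ny, nx))   # SNR well below 0 dB
amp = lockin_2d(noisy, fx=fx, fy=0.0)
print(amp.mean())   # ~0.1: the stripe amplitude is recovered
```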
Los estilos APA, Harvard, Vancouver, ISO, etc.
49

Tanjung, Nancy Giovanni. "In Vitro and In Silico Analysis of Osteoclastogenesis in Response to Inhibition of De-phosphorylation of EIF2alpha by Salubrinal and Guanabenz". Thesis, 2013. http://hdl.handle.net/1805/5189.

Texto completo
Resumen
Indiana University-Purdue University Indianapolis (IUPUI)
An excess of bone resorption over bone formation leads to osteoporosis, resulting in a reduction of bone mass and an increased risk of bone fracture. Anabolic and anti-resorptive drugs are currently available for treatment; however, none of these drugs is able to both promote osteoblastogenesis and reduce osteoclastogenesis. This thesis focused on the role of eukaryotic translation initiation factor 2 alpha (eIF2alpha), which regulates the efficiency of translational initiation. Elevation of phosphorylated eIF2alpha has been reported to stimulate osteoblastogenesis, but its effects on osteoclastogenesis are not well understood. Using synthetic chemical agents such as salubrinal and guanabenz, which are known to inhibit the de-phosphorylation of eIF2alpha, the role of eIF2alpha phosphorylation in osteoclastogenesis was investigated. The questions addressed herein were: does the elevation of phosphorylated eIF2alpha (p-eIF2alpha) by salubrinal and guanabenz alter osteoclastogenesis, and if so, what regulatory mechanism mediates the process? It was hypothesized that p-eIF2alpha could attenuate osteoclast development by regulating the transcription factor(s) and microRNA(s) involved in osteoclastogenesis. To test this hypothesis, we conducted in vitro and in silico analyses of the responses of RAW 264.7 pre-osteoclast cells to salubrinal and guanabenz. First, the in vitro results revealed that the elevated level of phosphorylated eIF2alpha inhibited the proliferation, differentiation, and maturation of RAW264.7 cells and downregulated the expression of NFATc1, a master transcription factor of osteoclastogenesis. Silencing eIF2alpha by RNA interference suppressed the downregulation of NFATc1, suggesting the involvement of eIF2alpha in the regulation of NFATc1. Second, the in silico results, obtained using genome-wide expression data and custom-made Matlab programs, predicted a set of stimulatory and inhibitory regulator genes as well as microRNAs potentially involved in the regulation of NFATc1. RNA interference experiments indicated that genes such as Zfyve21 and Ddit4 were primary candidates as inhibitors of NFATc1. In summary, the results showed that the elevation of p-eIF2alpha by salubrinal and guanabenz leads to attenuation of osteoclastogenesis through the downregulation of NFATc1. The regulatory mechanism is mediated by eIF2alpha signaling, but other signaling pathways are likely to be involved. Together with previous data showing the stimulatory role of p-eIF2alpha in osteoblastogenesis, these results suggest that eIF2alpha-mediated signaling could provide a novel therapeutic target for the treatment of osteoporosis by promoting bone formation and reducing bone resorption.
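One toy version of such an in silico screening step might correlate each candidate gene's expression profile with that of NFATc1 across treatment conditions, flagging strongly anti-correlated genes (e.g. Zfyve21 and Ddit4, which the thesis identifies as inhibitor candidates) as putative inhibitors. The expression values below are invented for illustration, and the thesis's Matlab pipeline is more elaborate.

```python
import numpy as np

# Hypothetical expression matrix: rows = genes, columns = samples
# (e.g. RAW264.7 cells at several salubrinal/guanabenz doses).
genes = ["Nfatc1", "Zfyve21", "Ddit4", "GeneX"]
expr = np.array([
    [5.0, 4.1, 3.2, 2.5, 1.9],   # Nfatc1 falls with treatment
    [1.0, 1.8, 2.6, 3.3, 4.0],   # Zfyve21 rises -> candidate inhibitor
    [1.2, 1.9, 2.4, 3.1, 3.9],   # Ddit4 rises  -> candidate inhibitor
    [2.0, 2.1, 1.9, 2.0, 2.1],   # unrelated gene, flat profile
])

target = expr[genes.index("Nfatc1")]
for name, profile in zip(genes[1:], expr[1:]):
    r = np.corrcoef(profile, target)[0, 1]
    # strongly negative r flags a putative inhibitor of NFATc1
    print(f"{name}: r = {r:+.2f}")
```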
Los estilos APA, Harvard, Vancouver, ISO, etc.
50

LOVECCHIO, NICOLA. "Design and development of a lab-on-chip for biomedical analysis based on electrowetting on dielectric technique". Doctoral thesis, 2018. http://hdl.handle.net/11573/1077423.

Texto completo
Resumen
The purpose of this thesis project was the development of a compact and versatile optoelectronic platform able to implement all the functionalities needed for lab-on-chip operation, together with the electronics needed to control the system. In particular, the proposed platform includes three different modules, designed for fluid handling through the ElectroWetting On Dielectric (EWOD) technique, thermal sample treatment, and optical detection. These modules incorporate thin-film microelectronic devices (such as photosensors and interferential filters for optical detection, or heaters and temperature sensors for sample treatment) on glass substrates connected to electronic microcontrollers. Moreover, the use of handling techniques that avoid pumps and syringes leads to a portable, highly sensitive, low-power lab-on-chip device. All of the modules were designed, fabricated, and tested separately. Finally, a device integrating all of the aforementioned functionalities was designed, yielding a multifunctional platform able to perform as a "true" lab-on-chip biomolecular system.
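EWOD actuation rests on the Young-Lippmann relation: an applied voltage across the dielectric lowers the droplet's apparent contact angle, and switching adjacent electrodes moves the droplet. The following sketch evaluates that relation under assumed dielectric parameters (the thesis's stack and voltages may differ).

```python
import numpy as np

EPS0 = 8.854e-12   # vacuum permittivity, F/m

def ewod_contact_angle(v, theta0_deg, eps_r, d, gamma):
    """Young-Lippmann relation for EWOD:
    cos(theta_V) = cos(theta_0) + eps_r*eps_0*V^2 / (2*d*gamma).
    The cosine is clipped at 1 as a crude stand-in for the
    contact-angle saturation seen in real devices."""
    cos_t = np.cos(np.radians(theta0_deg)) + eps_r * EPS0 * v**2 / (2 * d * gamma)
    return np.degrees(np.arccos(np.clip(cos_t, -1.0, 1.0)))

# Illustrative stack (assumed): ~1 um hydrophobic dielectric, water droplet.
for v in (0, 20, 40, 60):
    theta = ewod_contact_angle(v, theta0_deg=115, eps_r=2.0,
                               d=1e-6, gamma=0.072)
    print(f"{v:3d} V -> contact angle {theta:5.1f} deg")
```

With these assumed parameters the contact angle drops from 115 degrees at rest to roughly 90 degrees at 60 V, the kind of modulation that EWOD platforms exploit to transport droplets without pumps or syringes.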
Los estilos APA, Harvard, Vancouver, ISO, etc.