Dissertations / Theses on the topic 'Microscopy – Data processing'

Consult the top 37 dissertations / theses for your research on the topic 'Microscopy – Data processing.' For each entry, the full text can be downloaded as a PDF and the abstract is reproduced where available in the metadata.
1

Walton, John Moorcroft. "The acquisition, analysis and processing of Scanning Auger Microscopy (SAM) and Scanning Tunneling Microscopy (STM) data." Thesis, University of York, 1994. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.387184.

2

Farnham, Rodrigo Bouchardet. "Processing and inpainting of sparse data as applied to atomic force microscopy imaging." California State University, Long Beach, 2013.

3

Jarasch, Markus. "Interfacing a Computer to a Scanning Tunneling Microscope." PDXScholar, 1994. https://pdxscholar.library.pdx.edu/open_access_etds/5047.

Abstract:
A program was written in 'C' to control the functions of an existing Scanning Tunneling Microscope (STM). A DAS-1601 data acquisition card (from Keithley Data Acquisition) was installed, together with its 'C' drivers, on a computer with a 486-DX motherboard, and the computer was interfaced to the electronics of the STM. Images taken of HOPG (highly oriented pyrolytic graphite) were of reasonable quality and showed atomic resolution.
4

Chang, Xinyu. "Neuron Segmentation and Inner Structure Analysis of 3D Electron Microscopy Data." Kent State University / OhioLINK, 2013. http://rave.ohiolink.edu/etdc/view?acc_num=kent1369834525.

5

Rogers, Wendy Laurel. "A Mahalanobis-distance-based image segmentation error measure with applications in automated microscopy /." Thesis, McGill University, 1985. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=66025.

6

Angadi, Veerendra C. "Quantitative electron energy-loss spectrum data processing for hyperspectral imaging in analytical transmission electron microscopy." Thesis, University of Sheffield, 2018. http://etheses.whiterose.ac.uk/20007/.

7

Stein, Simon Christoph [Verfasser], Jörg [Akademischer Betreuer] [Gutachter] Enderlein, and Holger [Gutachter] Stark. "Advanced Data Processing in Super-resolution Microscopy / Simon Christoph Stein ; Gutachter: Jörg Enderlein, Holger Stark ; Betreuer: Jörg Enderlein." Göttingen : Niedersächsische Staats- und Universitätsbibliothek Göttingen, 2017. http://d-nb.info/1138835935/34.

8

Andronov, Leonid. "Development of advanced methods for super-resolution microscopy data analysis and segmentation." Thesis, Strasbourg, 2018. http://www.theses.fr/2018STRAJ001.

Abstract:
Among super-resolution methods, single-molecule localization microscopy (SMLM) is remarkable not only for the best practically achievable resolution but also for its direct access to the properties of individual molecules. The primary data of SMLM are the coordinates of individual fluorophores, a relatively rare data type in fluorescence microscopy, so specially adapted methods for processing these data have to be developed. I developed the software SharpViSu and ClusterViSu, which cover the most important processing steps: correction of drift and chromatic aberrations, selection of localization events, reconstruction of the data into 2D images or 3D volumes using different visualization techniques, estimation of resolution with Fourier ring correlation, and segmentation using Ripley's K- and L-functions. Additionally, I developed a method for segmentation of 2D and 3D localization data based on Voronoi diagrams, which allows automatic and unambiguous cluster analysis thanks to noise modeling with Monte-Carlo simulations. Using these advanced data processing methods, I demonstrated clustering of CENP-A in the centromeric regions of the cell nucleus, and structural transitions of these clusters upon CENP-A deposition in the early G1 phase of the cell cycle.
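As an illustration of the Voronoi-based segmentation idea described in the abstract, a minimal sketch might look as follows. This is not the actual SharpViSu/ClusterViSu code; the function names, the complete-spatial-randomness noise model and the 5% area threshold are assumptions.

```python
import numpy as np
from scipy.spatial import Voronoi

def cell_areas(points):
    """Area of each point's Voronoi cell (inf for unbounded border cells)."""
    vor = Voronoi(points)
    areas = np.full(len(points), np.inf)
    for i, region_idx in enumerate(vor.point_region):
        region = vor.regions[region_idx]
        if len(region) == 0 or -1 in region:
            continue  # unbounded cell at the edge of the field of view
        poly = vor.vertices[region]
        x, y = poly[:, 0], poly[:, 1]
        # shoelace formula for the polygon area
        areas[i] = 0.5 * abs(np.dot(x, np.roll(y, 1)) - np.dot(y, np.roll(x, 1)))
    return areas

def cluster_mask(points, extent, n_sim=20, alpha=0.05, seed=None):
    """Flag localizations whose Voronoi cell is smaller than expected under
    complete spatial randomness, estimated by Monte-Carlo simulation."""
    rng = np.random.default_rng(seed)
    n = len(points)
    sim = np.concatenate([cell_areas(rng.uniform(0.0, extent, size=(n, 2)))
                          for _ in range(n_sim)])
    threshold = np.quantile(sim[np.isfinite(sim)], alpha)
    return cell_areas(points) < threshold
```

Points whose cells fall in the small-area tail of the simulated noise distribution are classified as clustered; everything else is treated as background.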
9

Fortier, Hélène. "AFM Indentation Measurements and Viability Tests on Drug Treated Leukemia Cells." Thesis, Université d'Ottawa / University of Ottawa, 2016. http://hdl.handle.net/10393/34345.

Abstract:
A significant body of literature has reported strategies and techniques to assess the mechanical properties of biological samples such as proteins, cells and tissues. Atomic force microscopy (AFM) has been used to detect elasticity changes of cancer cells, but only a few studies have provided a detailed and complete protocol of the experimental procedures and data analysis methods for non-adherent blood cancer cells. In this work, the elasticity of NB4 cells derived from acute promyelocytic leukemia (APL) was probed by AFM indentation measurements to investigate the effects of the disease on cellular biomechanics. Understanding how leukemia influences the nanomechanical properties of cells is expected to provide a better understanding of the cellular mechanisms associated with cancer, and promises to become a valuable new tool for cancer detection and staging. In this context, the quantification of the mechanical properties of APL cells requires a systematic and optimized approach to data collection and analysis in order to generate reproducible and comparable data. This thesis elucidates an automated data analysis process that integrates programming, force curve collection and analysis optimization to assess variations of cell elasticity in response to processing criteria. A processing algorithm was developed in the IGOR Pro software to automatically analyze large numbers of AFM data sets efficiently and accurately. Since the analysis involves multiple steps that must be repeated for many individual cells, an automated and unbiased processing approach is essential to precisely determine cell elasticity. Different fitting models for extracting the Young's modulus were systematically applied to validate the process, and the best fitting criteria, such as the contact point location and indentation length, were determined in order to obtain consistent results.
The automated processing code described in this thesis was used to correlate alterations in the biomechanics of cancer cells as they undergo drug treatment. To fully assess drug effects on NB4 cells, viability assays were first performed using Trypan Blue staining for primary insights, followed by microplate fluorescence intensity readings using a LIVE/DEAD viability kit with ethidium and calcein AM labelling components. After treatment with 30 µM arsenic trioxide, the relative live cell population increased until 36 h, while the relative dead cell population increased until 24 h post-treatment, with a drastic drop in dead cell count observed between 12 and 24 h. With respect to cell mechanics, trapping the non-adherent NB4 cells within fabricated SU8-10 microwell arrays allowed consistent AFM indentation measurements up to 48 h after treatment. Results revealed an increase in cell elasticity up to 12 h post-treatment and a drastic decrease between 12 and 24 h; these arsenic trioxide-induced alterations in the elasticity of NB4 cells can be correlated with the cell viability tests. In addition to the indentation and viability testing approaches, morphological appearance was monitored in order to track the apoptosis of the affected cells. Relationships found between the viability and elasticity assays, in conjunction with morphology alterations, revealed distinct stages of apoptosis throughout treatment: 24 h after initial treatment, most cells were observed to have burst or displayed obvious blebbing. These relations between different measurement methods may point to a potential drug-screening approach for understanding the specific physical and biological effects of drugs on cancer cells.
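The Young's-modulus extraction described above is typically a least-squares fit of a contact model to the measured force curve. A minimal sketch, assuming a Hertzian spherical-tip model and illustrative values for tip radius, Poisson ratio and the synthetic data (the thesis itself used IGOR Pro and compared several fitting models):

```python
import numpy as np
from scipy.optimize import curve_fit

R = 5e-6   # spherical tip radius [m] (assumed)
NU = 0.5   # Poisson ratio, a typical value for soft cells

def hertz(delta, E):
    """Hertz model: force [N] vs indentation delta [m] for Young's modulus E [Pa]."""
    return (4.0 / 3.0) * (E / (1.0 - NU ** 2)) * np.sqrt(R) * delta ** 1.5

# synthetic force curve for a 500 Pa "cell", with measurement noise
rng = np.random.default_rng(0)
delta = np.linspace(0.0, 1e-6, 200)            # indentation up to 1 um
force = hertz(delta, 500.0) + rng.normal(0.0, 2e-11, delta.size)

E_fit, _ = curve_fit(hertz, delta, force, p0=[1000.0])
print(f"fitted Young's modulus: {E_fit[0]:.0f} Pa")
```

In practice the contact point must be located first; as the abstract notes, the choice of contact point and indentation range strongly affects the fitted modulus.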
10

Vaillancourt, Benoit. "Novel biophysical appliations [sic] of STICS." Thesis, McGill University, 2008. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=111550.

Abstract:
The objective of this thesis is to present two novel applications of Spatiotemporal Image Correlation Spectroscopy (STICS) to biological systems. STICS is a technique that uses the correlations in pixel-intensity fluctuations of an image time series, captured under fluorescence microscopy, to measure the speed and direction of a flowing population of fluorescently labeled molecules. The method was first applied to measure the dynamics of transport vesicles inside growing pollen tubes of lily flowers. The measured vector maps confirmed the presence of actin filaments along the periphery of the tubes, as well as the presence of a reverse-fountain pattern in the apical region. In a second set of experiments, STICS was used to measure the retrograde flow of filamentous actin in migrating chick DRG neuronal growth cones. These results serve as proof of principle that STICS can be used to probe the response of the growth-cone cytoskeleton to external chemical cues.
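The correlation principle behind STICS can be illustrated with a toy two-frame example: the spatial cross-correlation between time-lagged frames peaks at the mean displacement of the labeled population, which gives speed and direction. This is a simplified sketch on synthetic data, not the full STICS fluctuation analysis:

```python
import numpy as np

def flow_from_frames(f0, f1):
    """Mean displacement (dy, dx) between two frames via FFT cross-correlation."""
    F0 = np.fft.fft2(f0 - f0.mean())
    F1 = np.fft.fft2(f1 - f1.mean())
    xcorr = np.fft.ifft2(F1 * np.conj(F0)).real
    peak = np.unravel_index(np.argmax(xcorr), xcorr.shape)
    # map circular peak indices to signed shifts
    return tuple(int(p) if p <= s // 2 else int(p - s)
                 for p, s in zip(peak, xcorr.shape))

# two synthetic "vesicles" flowing by (3, 5) pixels between frames
y, x = np.mgrid[0:64, 0:64]
def blob(cy, cx):
    return np.exp(-((y - cy) ** 2 + (x - cx) ** 2) / 18.0)

frame0 = blob(20, 20) + blob(40, 30)
frame1 = blob(23, 25) + blob(43, 35)
print(flow_from_frames(frame0, frame1))  # → (3, 5)
```

Real STICS averages fluctuation correlations over many time lags and fits the peak displacement as a function of lag to obtain a velocity map per image region.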
11

Bilyeu, Taylor Thomas. "Crystallographic Image Processing with Unambiguous 2D Bravais Lattice Identification on the Basis of a Geometric Akaike Information Criterion." Thesis, Portland State University, 2013. http://pqdtopen.proquest.com/#viewpdf?dispub=1541427.

Abstract:

Crystallographic image processing (CIP) is a technique first used to aid in the structure determination of periodic organic complexes imaged with a high-resolution transmission electron microscope (TEM). The technique has subsequently been utilized for TEM images of inorganic crystals, scanning TEM images, and even scanning probe microscope (SPM) images of two-dimensional periodic arrays. We have written software specialized for use on such SPM images. A key step in the CIP process requires that an experimental image be classified as one of only 17 possible mathematical plane symmetry groups. The current methods used for making this symmetry determination are not entirely objective, and there is no generally accepted method for measuring or quantifying deviations from ideal symmetry. Here, we discuss the crystallographic symmetries present in real images and the general techniques of CIP, with emphasis on the current methods for symmetry determination in an experimental 2D periodic image. The geometric Akaike information criterion (AIC) is introduced as a viable statistical criterion for both quantifying deviations from ideal symmetry and determining which 2D Bravais lattice best fits the experimental data from an image being processed with CIP. By objectively determining the statistically favored 2D Bravais lattice, the determination of plane symmetry in the CIP procedure can be greatly improved. As examples, we examine scanning tunneling microscope images of 2D molecular arrays of the following compounds: cobalt phthalocyanine on Au (111) substrate; nominal cobalt phthalocyanine on Ag (111); tetraphenoxyphthalocyanine on highly oriented pyrolytic graphite; hexaazatriphenylene-hexacarbonitrile on Ag (111). We show that the geometric AIC procedure can unambiguously determine which 2D Bravais lattice fits the experimental data for a variety of different lattice types.
In some cases, the geometric AIC procedure can be used to determine which plane symmetry group best fits the experimental data, when traditional CIP methods fail to do so.
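The model-selection idea can be illustrated with a toy example: fit noisy lattice points with an unconstrained (oblique) basis and with a square-constrained basis, then pick the lattice whose residual-plus-complexity score is lowest. The penalty form below is a simplified AIC-style stand-in for the geometric AIC, and all names and values are illustrative assumptions.

```python
import numpy as np

def _residual(design, target):
    """Sum of squared residuals of a linear least-squares fit."""
    sol, *_ = np.linalg.lstsq(design, target, rcond=None)
    r = target - design @ sol
    return float(r @ r)

def lattice_scores(ij, pts, sigma2):
    """AIC-style scores for oblique vs. square 2D Bravais lattice models,
    given integer lattice indices `ij` and measured 2D positions `pts`."""
    i, j = ij[:, 0], ij[:, 1]
    x, y = pts[:, 0], pts[:, 1]
    D = np.column_stack([i, j])
    J_oblique = _residual(D, x) + _residual(D, y)       # a, b free: 4 params
    # square constraint: b is a rotated by 90 degrees, unknowns (ax, ay) only
    D_sq = np.block([[i[:, None], -j[:, None]],
                     [j[:, None],  i[:, None]]])
    J_square = _residual(D_sq, np.concatenate([x, y]))  # 2 params
    # score = residual + 2 * (number of parameters) * noise variance
    return {"oblique": J_oblique + 2 * 4 * sigma2,
            "square": J_square + 2 * 2 * sigma2}

ij = np.array([(i, j) for i in range(5) for j in range(5)], dtype=float)
square_pts = ij @ np.array([[1.0, 0.0], [0.0, 1.0]])    # exact square lattice
print(lattice_scores(ij, square_pts, sigma2=1e-4))
```

When both models fit equally well, the constrained (higher-symmetry) lattice wins by its smaller parameter penalty, which is the essence of the geometric-AIC argument.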

12

Ter Haak, Martin. "Machine learning for blob detection in high-resolution 3D microscopy images." Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-232114.

Abstract:
The aim of blob detection is to find regions in a digital image that differ from their surroundings with respect to properties like intensity or shape. Bio-image analysis is a common application, where blobs can denote regions of interest that have been stained with a fluorescent dye. In image-based in situ sequencing of ribonucleic acid (RNA), for example, the blobs are local intensity maxima (i.e. bright spots) corresponding to the locations of specific RNA nucleobases in cells. Traditional methods of blob detection rely on simple image processing steps that must be guided by the user. The problem is that the user must seek the optimal parameters for each step, which are often specific to that image and cannot be generalised to other images. Moreover, some of the existing tools are not suitable for the scale of the microscopy images, which are often in very high resolution and 3D. Machine learning (ML) is a collection of techniques that give computers the ability to "learn" from data. To eliminate the dependence on user parameters, the idea is to apply ML to learn the definition of a blob from labelled images. The research question is therefore how ML can be used effectively to perform blob detection. A blob detector is proposed that first extracts a set of relevant and non-redundant image features, then classifies pixels as blobs, and finally uses a clustering algorithm to split up connected blobs. The detector works out-of-core, meaning it can process images that do not fit in memory by dividing them into chunks. Results prove the feasibility of this blob detector and show that it can compete with other popular software for blob detection. Unlike other tools, however, the proposed blob detector does not require parameter tuning, making it easier to use and more reliable.
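The proposed pipeline (feature extraction, per-pixel classification, splitting of connected blobs) can be sketched with a classical stand-in, in which a Laplacian-of-Gaussian response plus a fixed threshold plays the role of the learned classifier. The parameter values and synthetic data are assumptions:

```python
import numpy as np
from scipy import ndimage

def detect_blobs(image, sigma=2.0, threshold=0.05):
    """Centroids of bright blobs of scale ~sigma via a LoG response."""
    # bright blobs give a strong negative LoG response, so negate it
    response = -ndimage.gaussian_laplace(image.astype(float), sigma=sigma)
    labels, n = ndimage.label(response > threshold)   # split connected regions
    return ndimage.center_of_mass(response, labels, range(1, n + 1))

# synthetic frame with two bright spots
y, x = np.mgrid[0:64, 0:64]
img = (np.exp(-((y - 16) ** 2 + (x - 16) ** 2) / 8.0)
       + np.exp(-((y - 48) ** 2 + (x - 40) ** 2) / 8.0))
print(detect_blobs(img))  # two centroids, near (16, 16) and (48, 40)
```

The thesis's contribution is to replace the hand-tuned response-plus-threshold step with a trained classifier, and to process the volume in chunks so images larger than memory can be handled.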
13

Le, Floch Hervé. "Acquisition des images en microscopie electronique a balayage in situ." Toulouse 3, 1986. http://www.theses.fr/1986TOU30026.

Abstract:
Acquisition chain for the image signal of the in-situ scanning electron microscope (MEBIS). Study of the noise sources associated with semiconductor detectors, and development of a fabrication process for surface-barrier detection diodes. Design of an electronics board compatible with a microcomputer; this board digitizes the images delivered by the microscope, stores them on floppy disk and displays them.
14

Schulte, Lukas [Verfasser], Holger [Akademischer Betreuer] Stark, Holger [Gutachter] Stark, and Ralf [Gutachter] Ficner. "New Computational Tools for Sample Purification and Early-Stage Data Processing in High-Resolution Cryo-Electron Microscopy / Lukas Schulte ; Gutachter: Holger Stark, Ralf Ficner ; Betreuer: Holger Stark." Göttingen : Niedersächsische Staats- und Universitätsbibliothek Göttingen, 2019. http://d-nb.info/1175204889/34.

15

Hall, Richard James. "Development of methods for improving throughput in the processing of single particle cryo-electron microscopy data, applied to the reconstruction of E. coli RNA polymerase holoenzyme - DNA complex." Thesis, Imperial College London, 2004. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.411621.

16

Trasobares, Susana. "Synthesis and Characterization of New Carbon Nitrogen Structures, Thin Films and Nanotubes." PhD thesis, Université Paris Sud - Paris XI, 2001. http://tel.archives-ouvertes.fr/tel-00002751.

17

Louys, Mireille. "Traitement d'images de microscopie électronique appliqué à l'étude structurale de macromolécules biologiques." Université Louis Pasteur (Strasbourg) (1971-2008), 1988. http://www.theses.fr/1988STR13153.

18

Sorel, Julien. "Tomographie électronique analytique : Automatisation du traitement de données et application aux nano-dispositifs 3D en micro-électronique." Thesis, Lyon, 2020. http://www.theses.fr/2020LYSEI078.

Abstract:
The aim of this thesis is to automate hyperspectral data processing for analytical electron tomography applied to nanodevices. The work focuses on datasets obtained by energy-dispersive X-ray spectroscopy in a scanning transmission electron microscope (STEM-EDX). STEM-EDX tomography has benefited greatly from recent developments in electron sources, such as the 'X'-FEG (Field Emission Gun), and multiple X-ray detector systems, such as the Super-X, which incorporates four silicon drift detectors (SDDs). The technique nevertheless remains very time-consuming, and low X-ray count rates are necessary to minimize the total acquisition time and avoid beam damage during the experiment. In addition, tomographic stacks of STEM-EDX datacubes, acquired at different tilt angles, are too large to be analyzed optimally by commercial software packages. To automate this process, we developed a code based on Hyperspy, a Python library for multidimensional data analysis. Multivariate statistical analysis techniques were employed to optimize and automate the denoising, the energy calibration and the separation of overlapping X-ray lines, with the aim of obtaining quantitative, chemically sensitive volumes. Moreover, a compressed-sensing-based algorithm was employed to achieve high-fidelity reconstructions from undersampled tomographic datasets. The code developed during this thesis was used for the 3D chemical analysis of four microelectronic nanostructures: FinFET, HEMT and GAA transistors, and a GeTe thin film for memory device applications. The samples were prepared in a needle shape using a focused ion beam (FIB), and the data acquisitions were performed on a Titan Themis microscope equipped with a Super-X EDX detector system. It is shown that the code yields 3D morphological and chemical information with high accuracy and fidelity.
Ways to improve the current methodology are discussed, with future efforts aiming at a software package dedicated to analytical electron tomography data processing.
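The multivariate denoising step mentioned in the abstract can be sketched without Hyperspy as a truncated-PCA reconstruction of the spectrum image; the shapes, component count and test data here are illustrative assumptions, not the thesis's pipeline.

```python
import numpy as np

def pca_denoise(datacube, n_components):
    """Truncated-SVD reconstruction of a (ny, nx, n_energy) spectrum image."""
    ny, nx, ne = datacube.shape
    X = datacube.reshape(ny * nx, ne).astype(float)
    mean = X.mean(axis=0)
    U, S, Vt = np.linalg.svd(X - mean, full_matrices=False)
    # keep only the leading components, which carry the chemical signal
    X_hat = (U[:, :n_components] * S[:n_components]) @ Vt[:n_components] + mean
    return X_hat.reshape(ny, nx, ne)
```

Because each pixel's spectrum is (to first order) a mixture of a few elemental components, discarding the trailing singular components removes mostly counting noise while preserving the chemically meaningful signal.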
19

Ceremuga, Joseph Thomas II. "Optimizing inspection of high aspect ratio microstructure using a programmable optical microscope." Thesis, Georgia Institute of Technology, 2003. http://hdl.handle.net/1853/5394.

20

Ceremuga, Joseph Thomas. "Optimizing inspection of high aspect ratio microstructure using a programmable optical microscope." Thesis, Georgia Institute of Technology, 2003. Available online: http://etd.gatech.edu/theses/available/etd-04082004-180101/unrestricted/ceremuga%5Fjoseph%5Ft%5F200312%5Fms.pdf.

21

Smrčková, Zuzana. "Motilita leukemických buněk analyzovaná nekoherentním holografickým kvantitativním zobrazováním fáze." Master's thesis, Vysoké učení technické v Brně. Fakulta strojního inženýrství, 2021. http://www.nusl.cz/ntk/nusl-444984.

Abstract:
This diploma thesis deals with motility analysis of leukemia cells. An accurate description of cell movement, and the detection of differences in motility under experimental conditions, can be obtained by quantitative analysis of cell motility using time-lapse recording. The first part of this work describes the various types of tumor cell migration. The second part focuses on methods for analysing cell motility in tissue culture using time-lapse recording, including image acquisition and processing. Part of this chapter describes the coherence-controlled holographic microscope used in the practical part, for which an insert was designed to ensure the exact and stable position of the individual chambers. The last part presents the study of leukemic cell motility and concludes with a discussion of the results obtained. The appendix contains a published study that includes an acknowledgement to the author of this diploma thesis for participation in the project.
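Quantitative motility analysis from time-lapse positions is often summarised by the mean squared displacement (MSD), whose growth with time lag distinguishes diffusive from directed movement (linear vs. quadratic growth). A minimal sketch; the function name and track data are illustrative, not from the thesis:

```python
import numpy as np

def msd(track):
    """Mean squared displacement over time lags for an (n_frames, 2) track."""
    n = len(track)
    return np.array([np.mean(np.sum((track[lag:] - track[:-lag]) ** 2, axis=1))
                     for lag in range(1, n)])

# straight-line (directed) motion: MSD grows quadratically with the lag
t = np.arange(10, dtype=float)
track = np.column_stack([t, 2.0 * t])   # constant velocity (1, 2) per frame
print(msd(track)[:3])  # → [ 5. 20. 45.]
```

Applied to segmented cell positions from the holographic time-lapse recordings, such curves allow motility under different experimental conditions to be compared quantitatively.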
22

Vlachynská, Alžběta. "Metody pro obrazovou analýzu populace fotosyntetických buněčných kultur." Master's thesis, Vysoké učení technické v Brně. Fakulta elektrotechniky a komunikačních technologií, 2015. http://www.nusl.cz/ntk/nusl-221387.

Abstract:
This work was carried out in collaboration with the Department of Adaptive Biotechnologies, Global Change Research Centre AS CR. It deals with the quantitative analysis of photosynthetic cell cultures, using images captured by a confocal fluorescence microscope to automatically determine the number of cells in a sample. The theoretical part briefly describes fluorescence and confocal microscopy and introduces the Leica TCS SP8 X microscope used to acquire the data; one chapter is devoted to the theory of digital image processing. The second part describes the development of an algorithm for processing 3D data, a simplified algorithm for processing 2D data, and their implementation in the MATLAB R2013b programming environment. The graphical user interface is explained in detail. The measurements performed are presented in the conclusion, together with the sample preparation protocol that was compiled. The results of the program are compared with manual counting, and the number of cells per 1 ml is determined by the program for samples of Chenopodium rubrum (Cr) and Solanum lycopersicum (To) cell cultures.
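The automatic cell count can be sketched as thresholding plus connected-component labelling of a fluorescence image; the threshold, synthetic image and chamber volume below are assumed illustrative values, not those of the thesis (which works in MATLAB on confocal stacks):

```python
import numpy as np
from scipy import ndimage

def count_cells(image, threshold):
    """Number of connected bright regions above `threshold`."""
    mask = ndimage.binary_opening(image > threshold)  # drop single-pixel noise
    _, n = ndimage.label(mask)
    return n

# synthetic field with three fluorescent "cells"
y, x = np.mgrid[0:50, 0:50]
img = np.zeros((50, 50))
for cy, cx in [(10, 10), (25, 40), (40, 15)]:
    img += np.exp(-((y - cy) ** 2 + (x - cx) ** 2) / 6.0)

n_cells = count_cells(img, threshold=0.5)
chamber_volume_ml = 1e-4                 # imaged volume [ml] (assumed)
print(n_cells, "cells ->", n_cells / chamber_volume_ml, "cells per ml")
```

Dividing the count by the known imaged volume converts it to a concentration, which is the quantity compared against manual counting in the thesis.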
23

Craesmeyer, Gabriel R. "Tratamento de efluente contendo urânio com zeólita magnética." Repositório Institucional do IPEN, 2013. http://repositorio.ipen.br:8080/xmlui/handle/123456789/10578.

Master's dissertation (Dissertação de Mestrado), Instituto de Pesquisas Energeticas e Nucleares (IPEN-CNEN/SP).
24

Andén, Olivia. "Structural basis of modulation by pH and calcium in a ligand-gated ion channel." Thesis, KTH, Genteknologi, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-299889.

Abstract:
Pentameric ligand-gated ion channels (pLGICs) are crucial for the conversion of chemical to electrical signaling in the nervous system of mammals. Dysfunction in these channels has been found to be connected to several diseases, including epilepsy, schizophrenia, Alzheimer's and autism, making them the target of a wide variety of therapeutic agents. However, studying eukaryotic channels is challenging, so the discovery of prokaryotic homologs that are much easier to study has greatly helped in understanding the structure and function of this family of proteins. In this project, a prokaryotic pLGIC called DeCLIC was produced and purified from Escherichia coli. Structural determination of the channel was pursued using cryo-electron microscopy at low pH and in the presence of calcium. An electron density at 3.4 Å resolution was achieved and compared to previously determined structures under different conditions in an attempt to determine the structural modulation by calcium and pH. Results show multiple differences in channel conformation in the presence and absence of calcium, as well as under different pH conditions. Furthermore, analysis of the determined electron density suggests a possible intermediate state at low pH in the presence of calcium.
25

Hammes, Daniel Markus [Verfasser]. "Data processing, 3D grain boundary modelling and analysis of in-situ deformation experiments using an automated fabric analyser microscope / Daniel Markus Hammes." Mainz : Universitätsbibliothek Mainz, 2017. http://d-nb.info/114906577X/34.

26

Naik, Pranab Sabitru. "Design and implementation of a fully automated real-time s-parameter imaging system." Thesis, University of Hong Kong (via HKUTO), 2004. http://sunzi.lib.hku.hk/hkuto/record/B30708758.

Full text
APA, Harvard, Vancouver, ISO, and other styles
27

Saxena, Shubham. "Nanolithography on thin films using heated atomic force microscope cantilevers." Thesis, Available online, Georgia Institute of Technology, 2006, 2006. http://etd.gatech.edu/theses/available/etd-08302006-223629/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
28

Baghi, Quentin. "Optimisation de l’analyse de données de la mission spatiale MICROSCOPE pour le test du principe d’équivalence et d’autres applications." Thesis, Paris Sciences et Lettres (ComUE), 2016. http://www.theses.fr/2016PSLEO003/document.

Full text
Abstract:
The Equivalence Principle (EP) is a cornerstone of General Relativity, and is called into question by attempts to build more comprehensive theories in fundamental physics, such as string theories. The MICROSCOPE space mission aims at testing this principle through the universality of free fall, with a target precision of 10^-15, two orders of magnitude better than current on-ground experiments. The satellite carries two electrostatic accelerometers on board, each one including two test-masses. The masses of the test accelerometer are made of different materials, whereas the masses of the reference accelerometer have the same composition. The objective is to monitor the free fall of the test-masses in the gravitational field of the Earth by measuring their differential acceleration with an expected precision of 10^-12 m s^-2 Hz^-1/2 in the bandwidth of interest. An EP violation would result in a characteristic periodic difference between the two accelerations. However, various perturbations are also measured because of the high sensitivity of the instrument. Some of them are well defined, e.g. gravitational and inertial gradient disturbances, but others are unmodeled, such as random noise and acceleration peaks due to the satellite environment, which can lead to saturations of the measurement or data gaps. This experimental context requires us to develop suitable tools for the data analysis, applicable in the general framework of linear regression analysis of time series. We first study the statistical detection and estimation of unknown harmonic disturbances in a least-squares framework, in the presence of colored noise of unknown PSD. We show that with this technique the projection of the harmonic disturbances onto the WEP violation signal can be rejected. Secondly, we analyze the impact of data unavailability on the performance of the EP test.
We show that under the worst-case before-flight hypothesis (almost 300 gaps of 0.5 second per orbit), the uncertainty of ordinary least squares is increased by a factor of 35 to 60. To counterbalance this effect, a linear regression method based on an autoregressive estimation of the noise is developed, which allows a proper decorrelation of the available observations without direct computation and inversion of the covariance matrix. The variance of the constructed estimator is close to the optimal value, allowing us to perform the EP test at the expected level even in the case of very frequent data interruptions. In addition, we implement a method to characterize the noise PSD more accurately when data are missing, with no prior model on the noise. The approach is based on a modified expectation-maximization (EM) algorithm with a smoothness assumption on the PSD, and uses statistical imputation of the missing data. We obtain a PSD estimate with an error below 10^-12 m s^-2 Hz^-1/2. Finally, we widen the applications of the data analysis by studying the feasibility of measuring the Earth's gravitational gradient with MICROSCOPE data. We assess the ability of this set-up to decipher the large-scale geometry of the geopotential. By simulating the signals obtained from different models of the Earth's deep mantle, and comparing them to the expected noise level, we show that their features can be distinguished
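The gap-tolerant regression idea described in this abstract can be sketched in a few lines. This is an illustrative reconstruction of ours, not the mission pipeline: it assumes the noise is AR(1) with a known coefficient `phi`, whitens each pair of consecutive available samples, and leaves segment-start samples unwhitened, so gaps simply break the whitening recursion rather than requiring a full covariance inversion.

```python
import numpy as np

def ar1_whitened_fit(t, y, mask, freq, phi):
    """Estimate sine/cosine amplitudes at `freq` by least squares after
    AR(1) whitening of the available (non-gapped) samples.
    The whitening x_t -> x_t - phi * x_{t-1} is applied only across
    consecutive available samples; a gap restarts the recursion."""
    X = np.column_stack([np.sin(2 * np.pi * freq * t),
                         np.cos(2 * np.pi * freq * t)])
    idx = np.flatnonzero(mask)
    rows, rhs = [], []
    for k, i in enumerate(idx):
        if k > 0 and idx[k - 1] == i - 1:    # previous sample available
            rows.append(X[i] - phi * X[i - 1])
            rhs.append(y[i] - phi * y[i - 1])
        else:                                 # segment start: no whitening
            rows.append(X[i])
            rhs.append(y[i])
    A, b = np.array(rows), np.array(rhs)
    beta, *_ = np.linalg.lstsq(A, b, rcond=None)
    return beta
```

On simulated data with AR(1) noise and random gaps, the whitened estimator recovers the harmonic amplitudes without forming or inverting the noise covariance matrix, which is the computational point made in the abstract.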
APA, Harvard, Vancouver, ISO, and other styles
29

Lopez, Prieto Juan Jose, and Perez Miguel Angel Purizaca. "Modelo tecnológico para optimizar el proceso de detección de leucemia utilizando el algoritmo canny, a través de la microscopía digital." Bachelor's thesis, Universidad Peruana de Ciencias Aplicadas (UPC), 2021. http://hdl.handle.net/10757/655225.

Full text
Abstract:
In Peru, it is estimated that out of 1200 new cases of childhood cancer detected annually, 350 end in death. These estimates are constantly increasing due to the lack of economical and reliable solutions for detecting cancer; for instance, 60% of these neoplasms are detected at advanced stages. As a result, the process of cancer diagnosis in Peru takes almost three times longer than in developed countries [1], reducing the chances of a timely cure. In this situation, we propose a technological model for optimizing the detection of leukemia using digital microscopy. This model applies the Canny algorithm, together with a bank of images of white and red blood cells, to identify microscopic cells, which are then analyzed by a health specialist to provide the final diagnosis. The proposed model includes the capture, digitization, and analysis of microscopic samples, and comprises five phases: 1. Data collection; 2. Data capture; 3. Image processing; 4. Cell classification; 5. Display of results. The model was validated with five blood samples from three men and two women in different age categories. All these samples were validated by the Head of Clinical Pathology at a public hospital in Callao. The results showed a 90.5% effectiveness rate in white blood cell identification, thus reducing the current diagnosis time from 3 months to an estimated 32 days.
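The edge-detection step at the heart of this model can be illustrated with the first stages of a Canny-style detector. The sketch below is ours, not the thesis code: it computes Sobel gradient magnitudes and applies a single threshold, whereas the full Canny pipeline adds non-maximum suppression and double-threshold hysteresis.

```python
import numpy as np

def sobel_edges(img, thresh):
    """First stages of a Canny-style detector: Sobel gradient
    magnitude followed by a single threshold. Edge-replicate
    padding keeps the output the same size as the input."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], float)
    ky = kx.T
    pad = np.pad(img.astype(float), 1, mode='edge')
    h, w = img.shape
    gx = np.zeros((h, w))
    gy = np.zeros((h, w))
    for i in range(h):
        for j in range(w):
            win = pad[i:i + 3, j:j + 3]
            gx[i, j] = np.sum(win * kx)   # horizontal gradient
            gy[i, j] = np.sum(win * ky)   # vertical gradient
    mag = np.hypot(gx, gy)                # gradient magnitude
    return mag > thresh
```

Applied to a blood-smear image, the boolean edge map would delimit cell boundaries for the subsequent classification phase; production code would normally use an optimized library implementation rather than explicit loops.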
Tesis
APA, Harvard, Vancouver, ISO, and other styles
30

Brunel, Guilhem. "Caractérisation automatique d’organisations cellulaires dans des mosaïques d’images microscopiques de bois." Thesis, Montpellier 2, 2014. http://www.theses.fr/2014MON20225/document.

Full text
Abstract:
This study focuses on the analysis of digital biological images. It aims to define and implement automated measurement processes for biological data in a mass-processing setting, and addresses in particular: the impact of methodological choices on the stability of results, the validation of the produced measurements, and the limits of the genericity of methods and models applied to plant biology. The work is carried out in the context of studying certain cell organizations, specifically the automatic identification and analysis of cell lines in mosaics of microscopic wood-slice images. The study of biological trends along these structures is necessary to understand how the different cell organizations and maturations arise, and it can only be conducted over a large observation zone of the wood plane. To this end, this work proposes: a new protocol for preparing (slices of sanded wood) and digitizing samples, in order to acquire the entire observation zone without bias; a processing chain that permits automated extraction of cell lines from digital image mosaics; and reliability indexes for each measurement, for more targeted statistical analysis. The methods developed in this thesis enable rapid acquisition and processing of a large volume of data. These data should serve as the basis for numerous investigations: architectural analyses of trees through cell-line tracking and/or detection of biological perturbations, and analyses of intra- and inter-tree variability to better understand the endogenous growth of trees
APA, Harvard, Vancouver, ISO, and other styles
31

Stein, Simon Christoph. "Advanced Data Processing in Super-resolution Microscopy." Doctoral thesis, 2017. http://hdl.handle.net/11858/00-1735-0000-0023-3EE1-4.

Full text
APA, Harvard, Vancouver, ISO, and other styles
32

(11161374), Emma J. Reid. "Multi-Resolution Data Fusion for Super Resolution of Microscopy Images." Thesis, 2021.

Find full text
Abstract:

Applications in materials and biological imaging are currently limited by the ability to collect high-resolution data over large areas in practical amounts of time. One possible solution to this problem is to collect low-resolution data and apply a super-resolution interpolation algorithm to produce a high-resolution image. However, state-of-the-art super-resolution algorithms are typically designed for natural images, require aligned pairing of high- and low-resolution training data for optimal performance, and do not directly incorporate a data-fidelity mechanism.


We present a Multi-Resolution Data Fusion (MDF) algorithm for accurate interpolation of low-resolution SEM and TEM data by factors of 4x and 8x. This MDF interpolation algorithm achieves these high rates of interpolation by first learning an accurate prior model denoiser for the TEM sample from small quantities of unpaired high-resolution data and then balancing this learned denoiser with a novel mismatched proximal map that maintains fidelity to measured data. The method is based on Multi-Agent Consensus Equilibrium (MACE), a generalization of the Plug-and-Play method, and allows for interpolation at arbitrary resolutions without retraining. We present electron microscopy results at 4x and 8x super resolution that exhibit reduced artifacts relative to existing methods while maintaining fidelity to acquired data and accurately resolving sub-pixel-scale features.
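The Plug-and-Play idea that MACE generalizes can be illustrated with a deliberately tiny 1-D loop. This sketch is ours, not the thesis algorithm: a moving-average smoother stands in for the learned prior-model denoiser, and a hard projection onto the measurements stands in for the proximal data-fidelity map.

```python
import numpy as np

def pnp_superres(y, factor, n_iter=30):
    """Toy Plug-and-Play-style loop for 1-D super-resolution.
    Alternates a stand-in 'denoiser' (moving average) with a
    data-consistency projection that resets the observed samples
    (every `factor`-th position) to the measurements y."""
    x = np.repeat(np.asarray(y, float), factor)   # naive upsampling init
    k = np.ones(3) / 3.0                          # stand-in denoiser kernel
    for _ in range(n_iter):
        x = np.convolve(x, k, mode='same')        # prior / denoiser step
        x[::factor] = y                           # data-fidelity projection
    return x
```

The interleaved samples settle at values consistent with both the smoothness prior and the measurements, which is the balance the MDF algorithm strikes between its learned denoiser and its mismatched proximal map.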

APA, Harvard, Vancouver, ISO, and other styles
33

Schulte, Lukas. "New Computational Tools for Sample Purification and Early-Stage Data Processing in High-Resolution Cryo-Electron Microscopy." Doctoral thesis, 2018. http://hdl.handle.net/11858/00-1735-0000-002E-E54F-B.

Full text
APA, Harvard, Vancouver, ISO, and other styles
34

Wang, Xuguang. "A novel high-K SONOS type non-volatile memory and NMOS HfO₂ Vth instability studies for gate electrode and interface threatment effects." Thesis, 2005. http://hdl.handle.net/2152/2089.

Full text
APA, Harvard, Vancouver, ISO, and other styles
35

Mohideen, Farlin. "2D-3D image registration of scanning electron microscope images and micro-CT volumes." Master's thesis, 2012. http://hdl.handle.net/1885/155845.

Full text
Abstract:
This thesis investigates the employment of feature-based methods for 2D-3D image registration. Image registration of 2D images and 3D volumes has many industrial and medical applications, including grain structure analysis, data fusion for fault detection, and medical image analysis. In this thesis we are interested in the registration of micro-CT volumes and SEM (Scanning Electron Microscope) images. State-of-the-art 2D-3D registration techniques are computationally expensive: for example, registering a megapixel 2D image to a gigavoxel 3D volume takes many hundreds of CPU hours on current CPU technology. We address this problem by introducing a point-feature-based registration framework for 2D-3D registration. Specifically, we present novel techniques for repeatable key-point detection in 2D images and 3D volumes, a novel way of forming descriptors for matching 2D and 3D key-points, and model estimation for registration up to an affinity under low feature-matching accuracy. We further develop techniques for model estimation. We present an affine registration estimation based on an algorithm which requires three true-positive matches of feature points, and extend this to estimate the model from only a single true-positive match, which we call the one-point algorithm. We present an algorithm based on Branch-and-Bound for rigid model estimation, with proof that convergence is guaranteed. We experimentally validate the performance of this algorithm and show its theoretical superiority compared to RANSAC. In addition, handling image scale is addressed by resolving the ambiguity between 2D image scale and 3D image scale; we show that 2D scale and 3D scale do not represent similar image and volume neighborhoods.
We compare our technique with state-of-the-art global registration techniques, such as correlation-based registration and mutual-information-based registration, and demonstrate the superior performance of our method. Furthermore, we introduce a novel feature descriptor based on image curvature, founded on mathematically sound principles, for improving feature-matching accuracy; the 2D-3D registration accuracy improves under this novel descriptor. Other practical applications, such as homography estimation and pose estimation, are also investigated. Finally, we extend the Branch-and-Bound based algorithm with guaranteed convergence to vanishing-point estimation and essential-matrix estimation, for which empirical results are provided.
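The three-correspondence affine estimation can be set against the RANSAC baseline the thesis compares with. This minimal sketch is our own illustration (hypothetical names, not the thesis code): each iteration samples three matches, solves the resulting 3x3 linear system for an affine model, and keeps the model with the most inliers.

```python
import numpy as np

def ransac_affine(src, dst, n_iter=200, tol=1e-3, rng=None):
    """Minimal RANSAC for a 2-D affine map dst ~ src @ A.T + t.
    Three point correspondences determine an affine model, matching
    the three-true-positive-match minimum discussed in the thesis."""
    rng = np.random.default_rng(rng)
    n = len(src)
    best_inliers, best_model = np.zeros(n, bool), None
    ones = np.ones((n, 1))
    for _ in range(n_iter):
        idx = rng.choice(n, size=3, replace=False)
        S = np.column_stack([src[idx], np.ones(3)])   # [x, y, 1] rows
        try:
            M = np.linalg.solve(S, dst[idx])          # 3x2 parameter matrix
        except np.linalg.LinAlgError:
            continue                                   # degenerate sample
        pred = np.hstack([src, ones]) @ M
        inliers = np.linalg.norm(pred - dst, axis=1) < tol
        if inliers.sum() > best_inliers.sum():
            best_inliers, best_model = inliers, M
    return best_model, best_inliers
```

Because RANSAC must repeatedly sample an all-inlier triple, its success is only probabilistic in the outlier rate and iteration count; the Branch-and-Bound approach of the thesis, in contrast, carries a convergence guarantee.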
APA, Harvard, Vancouver, ISO, and other styles
36

Tomek, Jakub. "Processing data from two-photon microscope." Master's thesis, 2013. http://www.nusl.cz/ntk/nusl-330734.

Full text
Abstract:
Two-photon laser scanning microscopy is a modern method of in vivo neurophysiological research, capable of imaging up to hundreds of neurons at once. However, this method produces a large amount of data that is difficult to process and analyze manually. This thesis presents Two-Photon Processor, a new toolkit for comprehensive processing of data from two-photon microscopes. During the work on this thesis, we designed the SeNeCA segmentation algorithm for detecting neurons in full-frame recordings from a two-photon microscope. SeNeCA combines high speed with high segmentation quality and, according to our evaluation, is currently the best algorithm for segmenting neurons in in vivo data. Two-Photon Processor is already routinely used in the Department of Auditory Neuroscience at the Institute of Experimental Medicine of the ASCR, and it was published in the Journal of Neurophysiology.
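The segmentation task that SeNeCA solves can be illustrated at its simplest: threshold an activity image and label connected bright regions as candidate cells. The sketch below is a generic stand-in of ours, not the SeNeCA algorithm, which adds much more to achieve its speed and quality.

```python
import numpy as np
from collections import deque

def label_regions(img, thresh):
    """Threshold an intensity image and label 4-connected bright
    regions via breadth-first search. Returns the label image and
    the number of regions (candidate cells) found."""
    mask = img > thresh
    labels = np.zeros(img.shape, dtype=int)
    h, w = img.shape
    count = 0
    for i in range(h):
        for j in range(w):
            if mask[i, j] and labels[i, j] == 0:
                count += 1
                labels[i, j] = count
                q = deque([(i, j)])
                while q:                      # flood-fill this region
                    a, b = q.popleft()
                    for da, db in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        na, nb = a + da, b + db
                        if (0 <= na < h and 0 <= nb < w
                                and mask[na, nb] and labels[na, nb] == 0):
                            labels[na, nb] = count
                            q.append((na, nb))
    return labels, count
```

In a real two-photon pipeline, each labeled region would then yield a fluorescence time trace by averaging the recording over that region's pixels frame by frame.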
APA, Harvard, Vancouver, ISO, and other styles
37

Heisen, Burkhard Clemens. "New Algorithms for Macromolecular Structure Determination." Doctoral thesis, 2009. http://hdl.handle.net/11858/00-1735-0000-0006-B503-3.

Full text
APA, Harvard, Vancouver, ISO, and other styles