Dissertations / Theses on the topic 'Elemental imaging'

To see the other types of publications on this topic, follow the link: Elemental imaging.

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Select a source type:

Consult the top 50 dissertations / theses for your research on the topic 'Elemental imaging.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse dissertations / theses on a wide variety of disciplines and organise your bibliography correctly.

1

Lum, Tsz Shan. "Elemental imaging and speciation for bioanalysis." HKBU Institutional Repository, 2016. https://repository.hkbu.edu.hk/etd_oa/328.

Full text
Abstract:
Elemental detection is an emerging area in bioanalysis. Thanks to rapid advances in instrumentation such as inductively coupled plasma mass spectrometry (ICP-MS), low detection limits and fast analysis can be achieved. ICP-MS also suffers less from matrix effects than molecular mass spectrometry, so toxic or essential elements can be detected precisely and accurately. Sample introduction and separation systems such as laser ablation (LA) and liquid chromatography (LC) are excellent hyphenation options that extend elemental detection beyond the total analysis of standalone ICP-MS. The spatial analysis and speciation capabilities of these two techniques bring additional merits to elemental detection in bioanalysis.

LA-ICP-MS uses a laser to ablate a solid sample; the generated aerosol is then transferred to the ICP-MS for detection. It can be used for bioimaging: LA mapping of biological tissues has been used to reveal the spatial distribution of metals, to study neurodegenerative diseases in the brain, and to examine the accumulation of metallodrugs in tumor masses. To incorporate this imaging tool into drug development, the first part of this thesis applies LA-ICP-MS bioimaging of liver and kidney to compare the differential spatial distribution of two structurally different platinum-based anti-cancer drug candidates. This approach is expected to assist chemical modification in drug development.

Taking this idea a step further, the spatial analysis tool was tested for its potential in therapeutic drug monitoring. Hair profiling was conducted on whiskers of mice treated with a vanadium anti-diabetic complex or gadolinium-based contrast agents at different dosage levels. The results showed that different deposition behaviors and accumulation/elimination profiles can be observed, demonstrating great potential for routine clinical application.

LC-ICP-MS, on the other hand, enables speciation studies. Several accessories for organic-solvent introduction into the ICP-MS make it more convenient to couple reversed-phase chromatography using a high percentage of organic solvent in the mobile phase. To demonstrate the advantage of this configuration, the last part of this project includes speciation of a bromine-containing drug in mouse urine and plasma for metabolite profiling.

In short, this work presents several useful hyphenated ICP-MS techniques for bioanalysis, demonstrating the great potential of elemental detection in drug development (assisting molecular modification in drug design and metabolite profiling) and in therapeutic drug monitoring (hair profiling).
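As an aside for readers new to LA-ICP-MS imaging (not part of the thesis): an elemental map is typically assembled by rastering parallel ablation lines across the tissue and folding each line's intensity trace into one image row. A minimal sketch of that geometry, with illustrative scan-speed and dwell-time values:

```python
import numpy as np

def build_elemental_map(line_signals, scan_speed_um_s, acq_time_s):
    """Assemble a 2D elemental map from LA-ICP-MS line-scan intensities.

    line_signals    : list of 1D arrays, one intensity trace per ablation line
    scan_speed_um_s : lateral stage speed during ablation (assumed constant)
    acq_time_s      : ICP-MS acquisition time per data point
    """
    # each data point covers scan_speed * acq_time micrometres along the line
    pixel_um = scan_speed_um_s * acq_time_s
    # crop all lines to a common length, then stack them as image rows
    width = min(len(s) for s in line_signals)
    image = np.vstack([s[:width] for s in line_signals])
    return image, pixel_um

# synthetic example: 3 ablation lines, 5 points each, 50 um/s, 0.5 s per point
traces = [np.array([1.0, 2.0, 3.0, 2.0, 1.0]) * (k + 1) for k in range(3)]
img, px = build_elemental_map(traces, scan_speed_um_s=50.0, acq_time_s=0.5)
# img.shape == (3, 5); each pixel spans 25 um along the scan direction
```

In practice each trace would also be background-subtracted and corrected for aerosol washout blur; this sketch handles only the geometry.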
APA, Harvard, Vancouver, ISO, and other styles
2

Stedman, Jacqueline D. "Regional distribution of elemental concentrations in brain tissue of 'normal' ageing and sporadic Alzheimer's disease subjects as determined by PIXE, RBS and INA analyses." Thesis, University of Surrey, 1996. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.318698.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Asogan, Dhinesh. "A non-contact laser ablation cell for mass spectrometry." Thesis, Loughborough University, 2011. https://dspace.lboro.ac.uk/2134/11014.

Full text
Abstract:
A common analytical problem in applying LA sampling concerns dealing with large planar samples, e.g. gel plates, Si wafers, tissue sections or geological samples. As the current state of the art stands, there are two solutions to this problem: either sub-sample the substrate or build a custom cell. Both have inherent drawbacks. With sub-sampling, the main issue is to ensure that a representative sample is taken so that the analytes of interest are determined correctly. Constructing custom cells can be time consuming, even for research groups that are experienced and skilled, as the cells have to be validated before data can be published. There are various published designs and ideas that attempt to deal with the issue of large samples, all of which ultimately enclose the sample in a box. The work presented in this thesis shows a viable alternative to enclosed sampling chambers. The non-contact cell is an open cell that uses novel gas dynamics to remove the need for an enclosed box and therefore enables samples of arbitrary size to be sampled. The upper size limit of a sample is set by the travel of the XY stages on the laser ablation system, not by the dimensions of the ablation cell.
APA, Harvard, Vancouver, ISO, and other styles
4

Dickson, Hazel Rebecca. "Strategies for the identification of pharmaceutical compounds in thin tissue sections using laser-based elemental and molecular mass spectrometric imaging techniques." Thesis, University of Sheffield, 2010. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.522450.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Buchriegler, Josef [Verfasser], Jürgen [Gutachter] Faßbender, and Matjaz [Gutachter] Kavcic. "Full-field PIXE imaging using a Colour X-ray Camera : Advantages and drawbacks in elemental mapping of large areas with a poly-capillary optics / Josef Buchriegler ; Gutachter: Jürgen Faßbender, Matjaz Kavcic." Dresden : Technische Universität Dresden, 2021. http://d-nb.info/1235346390/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Choi, Hyungryul. "Fabrication of anti-reflective and imaging nanostructured optical elements." Thesis, Massachusetts Institute of Technology, 2011. http://hdl.handle.net/1721.1/106723.

Full text
Abstract:
Thesis (S.M.)--Massachusetts Institute of Technology, Dept. of Mechanical Engineering, 2011.
Cataloged from PDF version of thesis.
Includes bibliographical references (p. 69-73).
Moth eyes minimize reflection over a broad band of angles and colors, and lotus leaves minimize wetting over a broad range of breakthrough pressures, by virtue of subwavelength structures patterned on their respective surfaces; similar examples of organisms exploiting geometry to attain properties unavailable in bulk materials are abundant in nature. These instances have inspired applications to man-made structures, collectively known as functional materials: for example, self-cleaning/anti-fogging surfaces, and solar cells with increased efficiency. I fabricated a functional surface where both wetting and reflectivity are controlled by geometry. Using a periodic array of subwavelength-sized, high-aspect-ratio cones, patterned on glass and coated with optimized surfactants, I have experimentally shown that we can significantly enhance transmission through the surfaces of a glass slab and, at the same time, make the surfaces either superhydrophobic or superhydrophilic, depending on the application, such as anti-fogging or self-cleaning glass. Novel lithographic techniques result in high patterning accuracy over large surface areas and are easily adaptable to nanoimprinting for future mass replication. In addition, an all-dielectric subwavelength-patterned Luneburg lens was fabricated for operation at a free-space wavelength of λ = 1.55 µm.
by Hyungryul Choi.
S.M.
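As background on the physics (not drawn from the thesis): the reflection that subwavelength "moth-eye" cones suppress is the normal-incidence Fresnel reflectance of an abrupt index step,

```latex
R \;=\; \left( \frac{n_1 - n_2}{n_1 + n_2} \right)^{\!2}
```

For glass (n ≈ 1.5) in air this gives R ≈ 0.04 per surface; grading the effective index with subwavelength structures removes the abrupt step and drives R toward zero over a broad band of wavelengths and angles.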
APA, Harvard, Vancouver, ISO, and other styles
7

Neugebohren, Jannis. "Implementing Ion Imaging to Probe Chemical Kinetics and Dynamics at Surfaces." Doctoral thesis, Niedersächsische Staats- und Universitätsbibliothek Göttingen, 2018. http://hdl.handle.net/11858/00-1735-0000-002E-E43B-1.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Vianna, Graziela Valadares Gomes de Mello. "Imagens sonoras no ar: a sugestão de sentido na publicidade radiofônica." Universidade de São Paulo, 2009. http://www.teses.usp.br/teses/disponiveis/27/27153/tde-09092009-120319/.

Full text
Abstract:
O objetivo com este trabalho foi compreender o sentido sugerido pelos elementos sonoros que constituem a linguagem radiofônica. Dessa forma, buscamos entender como se dá a sugestão de sentido ao ouvinte por meio da performance da voz, dos efeitos sonoros, da trilha musical, do silêncio e do tratamento técnico. Elementos que devem ser associados ao repertório do ouvinte a fim de serem criadas imagens multisensoriais com base em uma mensagem que utiliza unicamente o som como significante. Para compreender, portanto, essa sugestão de sentido com base no som, definimos a publicidade radiofônica como nosso objeto de análise. Para fazer o recorte do objeto de análise, seguimos uma agenda que a publicidade comumente dá atenção, composta pelas principais datas comemorativas ligadas ao consumo: Natal, férias de verão, Páscoa, Dia das Mães, Dia dos Namorados, Dia dos Pais, Dia das Crianças, períodos em que o investimento publicitário é incrementado. Consideramos, então, os jingles e os spots dramatizados veiculados nessas datas nas duas emissoras de rádio de maior audiência nas capitais das regiões Sudeste e Sul, que receberam 95,1% dos investimentos publicitários no rádio no Brasil no período pesquisado. Baseando-se na análise das peças selecionadas, identificamos-lhe o "ouvinte-modelo", os objetivos principais e específicos de cada peça e as estratégias de valorização do anunciante e de seu produto. Estabelecemos uma tipologia dos elementos sonoros e de suas funções e observamos as possibilidades de sugestão de imagens com base em tais elementos. Entendemos que este trabalho pode contribuir para pesquisas futuras sobre a linguagem radiofônica, bem como para profissionais da publicidade, dos estúdios e do rádio, para que possam aproveitar melhor o potencial expressivo do meio e tornar o rádio mais atraente para seus investidores e ainda para seus ouvintes-modelo, ao explorarem de forma adequada tal potencial.
The objective of this thesis is to understand the suggestions of meaning constructed by the sound elements of radio language. We analysed the text, the performance of the voices, the music, the sound effects, the silence and the sound techniques (recording, mixing and post-production effects such as reverberation and equalisation) to understand how those suggestions are constructed: elements that are associated with the cultural background of the listener to create multisensory images. For this purpose, we defined radio advertising as the object of our analysis. Since advertising intensifies around dates related to consumption (Christmas, summer vacation, Easter, Mother's Day, Valentine's Day, Father's Day and Children's Day), we recorded the two most popular radio stations in each capital of the South and Southeast Brazilian regions one week before each date; these regions received 95.1% of total radio advertising investment (2004/2005). From these recordings we selected two commercials for each date: one jingle and one spot that uses drama as a technique to persuade the listener. The analysis of the radio advertisements allows us to identify the "model listener" and the objectives of each commercial, to verify the strategies used to persuade the audience, and to point out some possibilities for suggesting sound images to this audience. The research establishes a typology of the sound elements of radio language related to the expressive potential of this medium. We believe this work can contribute to further research on radio language, and can help advertising professionals and those who work at radio stations to better understand that potential, so that radio becomes more attractive both to advertisers and to its listeners.
APA, Harvard, Vancouver, ISO, and other styles
9

Lima, Cícero Ribeiro de. "Estudo da obtenção de imagens de tomografia de impedância elétrica do pulmão pelo método de otimização topológica." Universidade de São Paulo, 2006. http://www.teses.usp.br/teses/disponiveis/3/3132/tde-15092006-162839/.

Full text
Abstract:
A Tomografia por Impedância Elétrica (TIE) é uma técnica recente de obtenção de imagens médicas para monitoração de tecidos biológicos. A TIE nos permite obter imagens que representam um plano transverso de qualquer seção do corpo humano (cabeça, tórax, coxa, etc.), onde cada “pixel” na imagem representa a sua impedância ou resistividade elétrica. As imagens são geradas através de valores de voltagens medidos em eletrodos posicionados em torno da seção do corpo humano. Estas voltagens são obtidas aplicando-se uma seqüência de corrente elétrica de baixa amplitude nos eletrodos, de acordo com um padrão da excitação elétrica (adjacente ou diametral). A TIE é baseada na solução de um problema inverso, onde dadas as voltagens medidas no exterior do corpo, essa técnica tenta encontrar a distribuição de condutividades no interior do corpo. O objetivo principal deste trabalho é aplicar o Método de Otimização Topológica (MOT) para obtenção de imagens da seção de um corpo na TIE. A Otimização Topológica (OT) busca a distribuição de material no interior de um domínio de projeto, retirando e adicionado material em cada ponto desse domínio de maneira a minimizar (ou maximizar) uma função objetivo especificada, satisfazendo dadas restrições impostas ao problema de otimização. Neste trabalho, o MOT é um método iterativo, cujo algoritmo computacional (implementado em linguagem C) combina o Método dos Elementos Finitos (MEF) e um algoritmo de otimização conhecido por Programação Linear Seqüencial (PLS). O problema de obtenção da imagem usando MOT consiste em se obter a distribuição de material (ou de condutividade) na seção do corpo que minimize a diferença entre os potenciais elétricos medidos nos eletrodos e os potenciais calculados num modelo computacional. A principal vantagem do MOT, aplicado à obtenção de imagens na TIE, é permitir a inclusão de várias restrições no problema de otimização, reduzindo o espaço de solução e evitando imagens sem significado clínico. 
Neste trabalho, o MOT utiliza o modelo de material SIMP para relaxar o problema de OT e vários esquemas são implementados com o intuito de regularizar o problema inverso da TIE (resolvido pelo MOT), tais como parâmetro de sintonia da imagem (“tuning”), restrição baseada na condutividade média do domínio da imagem, filtro espacial de controle de gradientes, aumento gradual do fator de penalidade do modelo de material (método de continuação) e aproximação contínua da distribuição de material (“CAMD”). Este trabalho está inserido num projeto temático cujo objetivo é estudar técnicas de reconstrução de imagem aplicadas a um tomógrafo por impedância elétrica para monitorar de forma precisa a ventilação forçada do pulmão e diagnosticar quando alguma parte do pulmão estiver danificada (obstruída ou colapsada) durante o processo de ventilação forçada. Para ilustração, são apresentadas imagens obtidas utilizando-se dados numéricos e experimentais de voltagem de domínios 2D bem conhecidos.
Electrical Impedance Tomography (EIT) is a recent technique for obtaining medical images of biological tissues. EIT allows us to obtain images representing a transversal plane of any section of the human body (head, thorax, thigh, etc.), where each image pixel represents its electrical impedance (or resistivity). The images are generated from voltage values measured on electrodes positioned around the body section. These voltages are obtained by applying to the electrodes a sequence of low-intensity electrical currents according to an excitation pattern (adjacent or diametral). EIT is based on an inverse problem: given the voltages measured outside the body, the technique tries to find the conductivity distribution inside the body. The main objective of this work is to apply the Topology Optimization Method (TOM) to obtain images of a body section in EIT. Topology optimization seeks a material distribution inside a design domain, determining which points of space should be solid and which should be void, so as to minimize (or maximize) an objective function while satisfying specified constraints. Here, TOM is an iterative method whose computational algorithm (implemented in C) combines the Finite Element Method (FEM) with an optimization algorithm known as Sequential Linear Programming (SLP). The topology optimization problem applied to image reconstruction consists of finding the material (or conductivity) distribution in the body section that minimizes the difference between the electric potentials measured on the electrodes and those calculated with a computational model. The main advantage of TOM applied to image reconstruction in EIT is that it allows several constraints to be included in the optimization problem, which reduces the solution space and avoids images without clinical meaning.

In this work, TOM uses a SIMP-based material model to relax the topology optimization problem, and several regularization schemes are implemented to solve the inverse problem of EIT, such as image tuning control, weighted distance interpolation based on the average conductivity of the domain, a spatial filtering technique for gradient control, gradual increase of the penalty factor of the material model during optimization (continuation method), and continuous approximation of material distribution (CAMD). This work belongs to a thematic project whose aim is to study image reconstruction algorithms that could be used in an EIT device to accurately monitor mechanical ventilation of the lung and to diagnose when any portion of the lung is damaged (obstructed or collapsed) during mechanical ventilation. To illustrate the implementation of the method, image reconstruction results obtained using numerical and experimental voltage data from well-known 2D domains are shown.
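For orientation (a standard formulation of the problem, not quoted from the thesis): the EIT image reconstruction described above can be written as a constrained minimization over the conductivity distribution σ,

```latex
\min_{\sigma}\; F(\sigma) \;=\; \sum_{i=1}^{m} \left\| \mathbf{V}_i^{\text{meas}} - \mathbf{V}_i^{\text{calc}}(\sigma) \right\|^2
\qquad \text{subject to } \sigma_{\min} \le \sigma(\mathbf{x}) \le \sigma_{\max},
```

where V_i^meas are the electrode voltages for the i-th current pattern and V_i^calc(σ) comes from the FEM forward model; the SIMP relaxation and the constraints mentioned in the abstract act on σ.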
APA, Harvard, Vancouver, ISO, and other styles
10

Pleasants, Ian Blair. "Detector elements for hard X-ray and gamma-ray astronomy applications." Thesis, University of Southampton, 1994. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.262152.

Full text
APA, Harvard, Vancouver, ISO, and other styles
11

Langridge, Mark T. "Manufacture of micro-optical elements for imaging and light-guidance." Thesis, University of Surrey, 2015. http://epubs.surrey.ac.uk/807979/.

Full text
Abstract:
In this thesis we discuss the manufacture and characterisation of micro-optical elements, for guiding light into sub-wavelength beams and spots, and for use in super-resolution imaging. A physical limit exists in microscopy whereby it is impossible to view objects smaller than half the illuminating wavelength by conventional means. In white-light microscopy this creates a resolution limit of 321 nm (at a wavelength of 500 nm, in air), which limits the smallest objects a researcher can study using optical microscopy. We present a method for fabricating plano-convex lenses which, when placed in near proximity to the sample, boost the magnification of conventional microscopes by up to 2.5x and resolve features below 200 nm with white-light illumination. We also demonstrate a curved axicon Bessel-beam former that produces long (17 µm) non-diffracting beams of light that can be sub-wavelength in width, down to two-thirds of the wavelength. In this thesis we contribute the following to current knowledge: we describe a focused ion-beam milling technique to form bespoke geometries of parabolic and spherical curvature, including reflective dishes, of diameter 1-10 µm, with a surface roughness of 4.0-4.1 nm. As part of this work, we calculate the efficiency of a new technique for removing ion-beam-induced damage using wet-chemical etching. We show that increasing the ion dose above 3000 µC/cm^2 allows a higher percentage of the implantation and amorphisation damage to be removed, and leaves less than 0.5% of the gallium remaining in the surface. We use the ion-milled dishes to form lens moulds; we double-replicate the brittle silicon mould to create a hard-wearing rubber mould. As multiple rubber moulds can be created per silicon mould, the process becomes industrially scalable. A thin film of polymer lenses is then formed from the mould. We characterise these lenses, demonstrating 1.2-2.5x magnification and resolution of 200 nm.
We demonstrate their use by imaging two biological samples, one fixed and stained, and one unlabelled in water. Additionally, using computer simulations alongside the focused ion-beam manufacturing technique, we demonstrate a curved axicon lens structure that forms long, non-diffracting beams of intense light. We model and experimentally analyse how the lens profile and the high-to-low refractive index change form the beam, and show that increasing the refractive index change decreases the beam width, at the cost of light transmission.
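For reference (our inference, not stated in the abstract): the quoted 321 nm figure is consistent with the Abbe diffraction limit

```latex
d \;=\; \frac{\lambda}{2\,\mathrm{NA}} \;\approx\; \frac{500\ \text{nm}}{2 \times 0.78} \;\approx\; 321\ \text{nm},
```

i.e. a numerical aperture of about 0.78 at 500 nm; the plano-convex microlenses described above resolve below this limit by operating in the near field of the sample.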
APA, Harvard, Vancouver, ISO, and other styles
12

Li, Xiaobei. "Instrumentation and inverse problem solving for impedance imaging /." Thesis, Connect to this title online; UW restricted, 2006. http://hdl.handle.net/1773/5973.

Full text
APA, Harvard, Vancouver, ISO, and other styles
13

Dominick, Colleen Elizabeth. "An investigation of array elements for enhanced single echo acquisition imaging." Thesis, Texas A&M University, 2005. http://hdl.handle.net/1969.1/4309.

Full text
Abstract:
Rapid MR imaging has facilitated the development of a variety of medical tools such as MR guided surgeries, drug delivery, stent placement, biopsies, and blood flow imaging. This rapid imaging is largely attributable to the development of parallel imaging techniques. In one such technique, single echo acquisition (SEA) imaging, scan time is reduced by substituting the lengthy phase encoding process with spatial information from an extensive receiver coil array. In order to easily construct and obtain images from this coil array, an ideal set of coil elements would be easily decoupled and tuned, possess high SNR and penetration depth, and would allow for operation in both transmit and receive mode. Several types of coils have been considered for use in massive coil arrays, including the planar pair coil, the loop coil and the stripline coil. This thesis investigates each of these coils for use in massive arrays for SEA imaging with enhanced penetration. This investigation includes: improving the currents on the planar pair coil, determining the feasibility of the loop coil and the stripline coil at the scale required for SEA, and comparing the salient properties of each coil type. This investigation revealed that the stripline coil appears to be the best coil element for SEA imaging with enhanced penetration.
APA, Harvard, Vancouver, ISO, and other styles
14

Cravo, Anderson Gabriel Santiago. "Elastografia em imagens de ultrassom utilizando elementos de contorno." Universidade de São Paulo, 2015. http://www.teses.usp.br/teses/disponiveis/3/3142/tde-14062016-141801/.

Full text
Abstract:
Este trabalho apresenta uma nova metodologia para elastografia virtual em imagens simuladas de ultrassom utilizando métodos numéricos e métodos de visão computacional. O objetivo é estimar o módulo de elasticidade de diferentes tecidos tendo como entrada duas imagens da mesma seção transversal obtidas em instantes de tempo e pressões aplicadas diferentes. Esta metodologia consiste em calcular um campo de deslocamento das imagens com um método de fluxo óptico e aplicar um método iterativo para estimar os módulos de elasticidade (análise inversa) utilizando métodos numéricos. Para o cálculo dos deslocamentos, duas formulações são utilizadas para fluxo óptico: Lucas-Kanade e Brox. A análise inversa é realizada utilizando duas técnicas numéricas distintas: o Método dos Elementos Finitos (MEF) e o Método dos Elementos de Contorno (MEC), sendo ambos implementados em Unidades de Processamento Gráfico de uso geral, GPGPUs ("General-Purpose Graphics Processing Units"). Considerando uma quantidade qualquer de materiais a serem determinados, para a implementação do Método dos Elementos de Contorno é empregada a técnica de sub-regiões para acoplar as matrizes de diferentes estruturas identificadas na imagem. O processo de otimização utilizado para determinar as constantes elásticas é realizado de forma semi-analítica utilizando cálculo por variáveis complexas. A metodologia é testada em três etapas distintas, com simulações sem ruído, simulações com adição de ruído branco gaussiano e phantoms matemáticos utilizando rastreamento de ruído speckle. Os resultados das simulações apontam o uso do MEF como mais preciso, porém computacionalmente mais caro, enquanto o MEC apresenta erros toleráveis e maior velocidade no tempo de processamento.
This thesis presents a new methodology for computational elastography applied to simulated ultrasound images, using numerical methods and computer vision methods. The aim is to estimate the elastic moduli of different tissues from two images of the same cross-section acquired at different times and under different pressure conditions. The proposed methodology consists of evaluating the displacement field using optical flow techniques and then applying an inverse analysis with a numerical method. To evaluate the displacement field, two distinct optical flow formulations are used: Lucas-Kanade and Brox. For the inverse analysis, the Finite Element Method and the Boundary Element Method are used, both implemented on general-purpose graphics processing units (GPGPUs). Since several materials may be present in the images, the multi-region boundary element method is used to couple the matrices of the different materials. The optimization process is evaluated using the complex-variable method. The methodology is validated in three different steps: noiseless simulations, simulations with additive white Gaussian noise, and an ultrasound mathematical phantom with speckle tracking. The results show that the Finite Element Method gives more accurate estimates but at a higher computational cost, while the Boundary Element Method presents tolerable errors and a better processing time.
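The displacement-field step can be illustrated with a minimal single-window Lucas-Kanade estimate. This is a sketch under simplifying assumptions (one global displacement, smooth images, pure NumPy); the thesis itself uses full Lucas-Kanade and Brox formulations on GPU:

```python
import numpy as np

def lucas_kanade_global(i1, i2):
    """Estimate one global (dx, dy) displacement between two frames.

    Solves the brightness-constancy equations Ix*dx + Iy*dy = -It
    in the least-squares sense over the whole image.
    """
    iy, ix = np.gradient((i1 + i2) / 2.0)  # spatial gradients (rows, cols)
    it = i2 - i1                           # temporal difference
    a = np.column_stack([ix.ravel(), iy.ravel()])
    b = -it.ravel()
    (dx, dy), *_ = np.linalg.lstsq(a, b, rcond=None)
    return dx, dy

# synthetic example: a Gaussian blob shifted by (0.6, 0.3) pixels
y, x = np.mgrid[0:64, 0:64]
blob = lambda cx, cy: np.exp(-((x - cx) ** 2 + (y - cy) ** 2) / (2 * 8.0 ** 2))
dx, dy = lucas_kanade_global(blob(32.0, 32.0), blob(32.6, 32.3))
# dx ≈ 0.6, dy ≈ 0.3
```

In the elastography pipeline this estimate would be computed per window to build a dense displacement field, which then feeds the FEM/BEM inverse analysis.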
APA, Harvard, Vancouver, ISO, and other styles
15

Bass, Mark James. "Velocity mapping of elementary bimolecular reactions." Thesis, University of Oxford, 2004. http://ora.ox.ac.uk/objects/uuid:ea381f05-6a68-435f-91d6-30d14a8c8dc4.

Full text
Abstract:
A new and flexible velocity-map ion imaging apparatus, designed for the study of photodissociation processes and photon-initiated bimolecular reactions in a single molecular beam, has been constructed, developed and characterised. An image Legendre moment fitting analysis was developed to allow recovery of centre-of-mass (CM) angular scattering and kinetic energy release distributions from velocity-map ion images of the products of photon-initiated bimolecular reactions. The Legendre moment analysis methodology has been applied to images of the HCl(v' = 0,j' = 0-6) products of the reactions of Cl(²P3/2) atoms with ethane and n-butane at collision energies of 0.24 eV and 0.32 eV respectively. The Cl(²P3/2) reactants were generated by polarised laser photodissociation of Cl₂ at 355 nm. For reaction with ethane, the CM angular scattering distributions show a steady trend from forward scattering at low j' to more isotropic, but backward peaking, scattering at high j'. An impact parameter-based mechanism is proposed to account for the observed dynamics. Abstraction of a hydrogen atom from a primary carbon site in n-butane is seen to produce rotationally very cold HCl products that are forward scattered, whereas H atom abstraction from a secondary carbon site in n-butane yields more isotropically scattered HCl products formed with higher rotational excitation. A peripheral mechanism is proposed to operate for the primary abstraction channel, whilst a more rebound type mechanism is seen to account for the dynamics of the secondary abstraction channel. Around 22% and 30% of the available energy is found in internal modes of the alkyl radical co-products of the Cl + C₂H₆ and Cl + n-C₄H₁₀ reactions respectively. Possible sources of alkyl co-product excitation are discussed in each case. The hydrogen or deuterium atom abstraction reactions of Cl(²P3/2) with CH₄, CD₄ and CH₃D, have been studied at mean collision energies around 0.3 eV. 
Chlorine atom reactants were generated by polarised laser photodissociation of Cl₂ at 308 nm. The methyl radical products were detected using (2+1) resonance-enhanced multi-photon ionisation, coupled with velocity-map ion imaging. The laboratory frame speed distributions obtained from the images are in excellent agreement with previous work. The interpretation of the experiments is shown to be very sensitive to assumptions made about the reactant velocity distributions. If these are assumed to be narrow, the data are seen to suggest that a significant fraction of the product signal must arise from the reaction of Cl with vibrationally excited methane reactants. This conclusion is in agreement with previous photon-initiated reaction studies. However, by allowing for the spread in collision energies in the molecular beam, it is shown that it is possible to fit the data sensibly assuming reaction with vibrational ground state methane alone. CM angular scattering distributions thereby derived are presented for all three reactions.
APA, Harvard, Vancouver, ISO, and other styles
16

Rodriguez, Rivera Noemi Ines. "Imagem por dupla difração com luz branca sem elementos intermediários." [s.n.], 2007. http://repositorio.unicamp.br/jspui/handle/REPOSIP/278130.

Full text
Abstract:
Orientador: Jose Joaquin Lunazzi
Tese (doutorado) - Universidade Estadual de Campinas, Instituto de Fisica Gleb Wataghin
Resumo: Apresentamos neste trabalho a análise da formação de imagens por elementos difrativos com luz branca fazendo o traçado de raios pelas direções principais. O primeiro sistema analisado é composto por duas redes de difração e uma fenda, o segundo por dois elementos bidimensionais de estrutura espiral e um orifício, que formam imagens ortoscópicas (relevo natural). A partir das análises mencionadas desenvolvemos um sistema de dois elementos difrativos sem elementos intermediários que forma uma imagem de luz branca que é pancromática, porque oferece as cores originais. Além disso, apresentamos um sistema formador de uma imagem por transmissão que consiste na projeção de objetos usando uma fonte linear (filamento extenso) e um elemento difrativo. Aproveitando as propriedades de uma fonte linear, desenvolvemos um sistema que permite que espelhos ou lentes imperfeitos gerem imagens nítidas. Mediante estes sistemas visamos conseguir um dia a formação de imagens convergentes, entretanto já oferecemos novas maneiras de se exibir imagens tridimensionais atrativas e amplas
Abstract: We present an analysis of the formation of images by diffractive elements using white light, performed by ray tracing along the principal directions. The first system we describe is composed of two diffraction gratings and a slit, the second of two two-dimensional spiral elements and a pinhole aperture, generating orthoscopic (natural-relief) images. From this analysis we developed a system of two diffractive elements, with no intermediate element, that forms a white-light image which is panchromatic because it reproduces the original colors. Furthermore, we present a transmission imaging system that projects objects by means of a linear source (an extended filament) and a diffractive element. Exploiting the imaging properties of that linear source, we also developed a system that allows imperfect mirrors or lenses to generate sharp images. By studying these systems we seek a way to achieve the formation of convergent images, and we already offer new ways to exhibit attractive, large three-dimensional images
Doutorado
Física
Doutor em Ciências
APA, Harvard, Vancouver, ISO, and other styles
17

Lerch, Terence. "Thermal evaluation of an integrated circuit chip using infrared imaging and finite element techniques /." Online version of thesis, 1991. http://hdl.handle.net/1850/11113.

Full text
APA, Harvard, Vancouver, ISO, and other styles
18

Akalin, Acar Zeynep. "Electro-magnetic Source Imaging Using Realistic Head Models." Phd thesis, METU, 2005. http://etd.lib.metu.edu.tr/upload/2/12606173/index.pdf.

Full text
Abstract:
Electro-Magnetic Source Imaging (EMSI) is the estimation of the position, orientation and strength of active electrical sources within the brain from electrical and magnetic measurements. For accurate source localization, the head model must correctly represent the electrical and geometrical properties of the head. To solve the forward problem with realistic head models, numerical techniques must be used. This work uses the Boundary Element Method (BEM) for solving the forward problem. The accuracy of the existing BEM formulation is improved by using second-order elements, recursive integration and the isolated problem approach (IPA). Two new formulations are developed to improve the solution speed by computing transfer matrices for EEG and MEG solutions. The IPA formulation is generalized and integrated into the accelerated BEM algorithm. Once the transfer matrices are computed, the forward solutions take about 300 ms for a 256-sensor EEG and MEG system. The head model used in the BEM solutions is constructed by segmenting three-dimensional multimodal magnetic resonance images. For segmentation, a semi-automatic hybrid algorithm is developed that makes use of snakes, morphological operations, thresholding and region growing. The mesh generation algorithm allows intersecting tissue compartments. For the inverse problem solution, a genetic algorithm (GA) is used to search for a given number of dipoles. Source localization with simulated data shows that the localization error is within 1.1 mm for EEG and 1.2 mm for MEG when the SNR is 10, on a realistic model with 7 compartments. When a single-dipole source in a realistic model is explored using a best-fit spherical model, the localization errors increase up to 8.5 mm for EEG and 7 mm for MEG. Similar tests are also performed with multiple dipoles. It was observed that realistic models provide clearly more accurate results than spherical models.
The EMSI approach is also tested using experimental EEG data to localize the sources of auditory evoked potentials. The reconstructed source locations are correctly found in Heschl's gyrus. In conclusion, this thesis presents a complete source localization framework for future brain research using EMSI.
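As background for the transfer-matrix idea mentioned in the abstract above, here is a minimal sketch of how a precomputed lead-field (gain) matrix turns each forward solution into a single matrix-vector product, and how a simple least-squares scan can then localize a dipolar source. All names, sizes and the scanning scheme are illustrative assumptions, not taken from the thesis:

```python
import numpy as np

# Once a lead-field matrix L mapping candidate sources to sensor readings
# is precomputed, every forward solution is one mat-vec, which is what
# makes repeated inverse-problem evaluations (e.g. inside a GA) cheap.
rng = np.random.default_rng(0)
n_sensors, n_sources = 256, 500                  # e.g. a 256-channel montage
L = rng.standard_normal((n_sensors, n_sources))  # stand-in lead field

true_idx = 123
q = 2.0                                          # source strength
b = L[:, true_idx] * q                           # noiseless "measurement"

# Scan: pick the source column that best explains b in a least-squares sense.
scores = (L.T @ b) ** 2 / np.sum(L ** 2, axis=0)
est_idx = int(np.argmax(scores))
q_est = float(L[:, est_idx] @ b / (L[:, est_idx] @ L[:, est_idx]))
```

With noiseless data the scan recovers both the source index and its strength exactly; real EEG/MEG localization additionally contends with noise and model error, as the abstract's SNR-dependent error figures indicate.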
APA, Harvard, Vancouver, ISO, and other styles
19

Magalhães, Daniel Souza Ferreira. "Estudo de imagens por dupla difração com seleção de luz branca e elementos definidos bidimensionalmente." [s.n.], 2005. http://repositorio.unicamp.br/jspui/handle/REPOSIP/278144.

Full text
Abstract:
Orientador: Jose Joaquin Lunazzi
Dissertação (mestrado) - Universidade Estadual de Campinas, Instituto de Fisica Gleb Wataghin
Resumo: Neste trabalho analisamos a formação de imagens por elementos difrativos com luz branca fazendo o traçado de raios pelas direções principais e explorando as possibilidades que a simetria de um conjunto de dois elementos definidos bidimensionalmente e intermediados por um pequeno orifício oferece. A mais interessante delas é a de termos uma imagem convergente, que pode ser vista projetando sobre uma tela, por exemplo
Abstract: We perform a ray-tracing analysis of the imaging of white-light objects by diffractive elements, following the principal directions of the rays. We explore the possibilities offered by the symmetry of a set of two two-dimensionally acting diffractive elements separated by a small pinhole. The most interesting of these is a convergent image, which can be viewed, for example, by projecting it onto a screen
Mestrado
Ótica
Mestre em Física
APA, Harvard, Vancouver, ISO, and other styles
20

Duarte, Peter. "Using computerized imaging to evaluate the visual preference effects of downtown streetscape elements." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 2000. http://www.collectionscanada.ca/obj/s4/f2/dsk2/ftp03/MQ51060.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
21

Dobson, Andrew. "Seismic modelling for the sub-basalt imaging problem including an analysis and development of the boundary element method." Thesis, University of Edinburgh, 2005. http://hdl.handle.net/1842/765.

Full text
Abstract:
The north-east Atlantic margin (NEAM) is important for hydrocarbon exploration because of the growing evidence of hydrocarbon reserves in the region. However, seismic exploration of the sub-surface is hampered by large deposits of flood basalts, which cover possible hydrocarbon-bearing reservoirs underneath. There are several hypotheses as to why imaging beneath basalt is a problem. These include: the high impedance contrast between the basalt and the layers above; the thin-layering of the basalt due to the many flows which make up a basalt succession; and the rough interfaces on the top-basalt interface caused by weathering and emplacement mechanisms. I perform forward modelling to assess the relative importance of these factors for imaging of sub-basalt reflections. The boundary element method (BEM) is used for the rough-interface modelling. The method was selected because only the interfaces between layers need to be discretized, in contrast to grid methods such as finite difference for which the whole model needs to be discretized, and so should lead to fast generation of shot gathers for models which have only a few homogeneous layers. I have had to develop criteria for accurate modelling with the boundary element method and have considered the following: source near an interface, two interfaces close together, removal of model edge effects and precise modelling of a transparent interface. I have improved efficiency of my code by: resampling the model so that fewer discretization elements are required at low frequencies, and suppressing wrap-around so that the time window length can be reduced. I introduce a new scheme which combines domain decomposition and a far-field approximation to improve the efficiency of the boundary element code further. I compare performance with a standard finite difference code. 
I show that the BEM is well suited to seismic modelling in an exploration environment when there are only a few layers in the model and when a seismic profile containing many shot gathers for one model is required. For many other cases the finite difference code is still the best option. The input models for the forward modelling are based on real seismic data which were acquired in the Faeroe-Shetland Channel in 2001. The modelling shows that roughness on the surface of the basalt has little effect on the imaging in this particular area of the NEAM. The thin layers in the basalt act as a low-pass filter on the seismic wave. For the real-data acquisition, even the top-basalt reflection is a low-frequency event. This is most likely due to high attenuation in the layers above the basalt. I show that sea-surface multiple energy is considerable and that it could mask possible sub-basalt events on a seismic shot gather, but any shallow sub-basalt events should still be visible even in the presence of multiple energy. This leaves the possibility that there is only one major stratigraphic unit between the base of the basalt and the crystalline basement. The implication of the forward modelling and real-data analysis for acquisition is that the acquisition parameters must emphasize the low frequencies, since the high frequencies are attenuated before they even reach the top-basalt interface. The implication for processing is that multiple removal is of prime importance.
APA, Harvard, Vancouver, ISO, and other styles
22

Qiu, Yan. "Three dimensional finite element model for lesion correspondence in breast imaging." [Tampa, Fla.] : University of South Florida, 2003. http://purl.fcla.edu/fcla/etd/SFE0000192.

Full text
APA, Harvard, Vancouver, ISO, and other styles
23

Miao, Cheng Hsi. "The design of phased synthetic aperture imaging systems using a minimum number of elements." Diss., The University of Arizona, 1991. http://hdl.handle.net/10150/185625.

Full text
Abstract:
The research described in this report resulted from my participation in the design study for the Phased-Array Imaging Telescope. To maintain high transmission, a practical system should contain a minimum number of components. This consideration leads to the concept of shared symmetries between the subtelescope and the final collector. This report presents an approach to the design of such arrays and examines the implications of including aspheric correction for the telescope array. As expected, this design concept with a minimum number of elements seems to work well. Four array systems based on the concept are presented; each uses only one spherical mirror as the beam collector. The effects of changing the primary mirror's relative aperture, and of changing the system length, on the symmetry and order of the aberrations arising from the use of an eccentric aspheric are explained in this report. The subtle limitations of techniques for adding special surfaces for decentered aspheric correction to optical design programs are discussed as well. Two additional design concepts are examined and compared. A preliminary tolerancing analysis is performed, and error budgets are developed. An adaptive element is considered for relaxing the alignment and fabrication tolerances.
APA, Harvard, Vancouver, ISO, and other styles
24

Barreto, Janaina Lucene Mendonza. "Coco de roda novo quilombo: da roda ao centro, imagens e símbolos de uma tradição." Universidade Federal da Paraíba, 2017. http://tede.biblioteca.ufpb.br:8080/handle/tede/9773.

Full text
Abstract:
This research aims to identify and analyse, through a phenomenological approach and interviews, some visual elements of the Coco de Roda of the Novo Quilombo do Ipiranga, an artistic and cultural manifestation that takes place in the municipality of Conde, Paraíba. The elements identified in the course of the research and addressed here are: the bombo (bass drum); the coco circle; the umbigada; and the garments. It is important to consider that the Coco de Roda studied and discussed here, in its visual elements, is a form of resistance and of artistic and cultural expression of great relevance to the society of Paraíba and to the country, given its tradition and its richness in music, dance and visual elements; in short, in all its living and latent culture. Thus, by highlighting and discussing the visual elements, we intend to contribute, on the one hand, to the scientific knowledge present in the University, particularly in the fields of visual arts, culture and anthropology, and, on the other, to strengthening the idea and conviction that this is a significant cultural manifestation for society and, from the outset, for cultural public policies at the local, regional and national levels.
Esta pesquisa objetiva identificar e analisar, a partir de uma abordagem fenomenológica bem como de entrevistas realizadas, alguns elementos visuais do Coco de Roda do Novo Quilombo do Ipiranga, manifestação artístico e cultural que ocorre no município do Conde, Paraíba. Os elementos que identificamos no percurso de pesquisa e que serão abordados são: o bombo; a roda de coco; a umbigada e as vestimentas. Importa considerar que o Coco de Roda aqui estudado e debatido, em seus elementos visuais, é uma forma de resistência e expressão artístico e cultural de grande relevância para a sociedade paraibana e para o país, tendo em vista sua tradição e riqueza musical, de dança, dos elementos visuais, enfim, de toda a cultura viva e em latência. Assim, ao ressaltar e discutir os elementos visuais pretende-se contribuir, por um lado, para os conhecimentos científicos e saberes presentes na Universidade, em particular para os campos das artes visuais, cultura e antropologia e, por outro, para o fortalecimento da ideia e convicção de que trata-se de uma manifestação cultural significativa para a sociedade e, desde logo, para as politicas publicas culturais em âmbito local, regional e nacional.
APA, Harvard, Vancouver, ISO, and other styles
25

Qin, Shanlin. "Fractional order models: Numerical simulation and application to medical imaging." Thesis, Queensland University of Technology, 2017. https://eprints.qut.edu.au/115108/1/115108_9066888_shanlin_qin_thesis.pdf.

Full text
Abstract:
This thesis is primarily concerned with developing new models and numerical methods based on the fractional generalisation of the Bloch and Bloch-Torrey equations to account for anomalous MRI signal attenuation. The two main contributions of the research are to investigate the anomalous evolution of MRI signals via the fractionalised Bloch equations, and to develop new effective numerical methods with supporting analysis to solve the time-space fractional Bloch-Torrey equations.
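For orientation, one common way to fractionalise the Bloch relaxation term (a generic illustration, not necessarily the exact formulation adopted in the thesis) replaces the first-order time derivative with a Caputo derivative of order \(\alpha\):

```latex
% Classical Bloch relaxation:  dM_z/dt = (M_0 - M_z)/T_1
% A time-fractional generalisation, with 0 < \alpha \le 1:
{}^{C}\!D_t^{\alpha} M_z(t) = \frac{M_0 - M_z(t)}{\tilde{T}_1},
\qquad
{}^{C}\!D_t^{\alpha} f(t) \equiv \frac{1}{\Gamma(1-\alpha)}
  \int_0^{t} \frac{f'(\tau)}{(t-\tau)^{\alpha}}\,\mathrm{d}\tau .
% The solution involves the Mittag-Leffler function E_\alpha:
%   M_z(t) = M_0 + \bigl(M_z(0) - M_0\bigr)\,E_\alpha\!\bigl(-t^{\alpha}/\tilde{T}_1\bigr),
% which reduces to the usual exponential recovery when \alpha = 1.
```

Here \(\tilde{T}_1\) is a fractionalised time constant carrying units of time\(^{\alpha}\); anomalous (non-exponential) signal evolution of exactly this Mittag-Leffler type is what motivates fitting MRI data with fractional models.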
APA, Harvard, Vancouver, ISO, and other styles
26

Zhang, Wenlong. "Forward and Inverse Problems Under Uncertainty." Thesis, Paris Sciences et Lettres (ComUE), 2017. http://www.theses.fr/2017PSLEE024/document.

Full text
Abstract:
Cette thèse contient deux matières différentes. Dans la première partie, deux cas sont considérés. L'un est le modèle plus lisse de la plaque mince et l'autre est les équations des limites elliptiques avec des données limites incertaines. Dans cette partie, les convergences stochastiques des méthodes des éléments finis sont prouvées pour chaque problème.Dans la deuxième partie, nous fournissons une analyse mathématique du problème inverse linéarisé dans la tomographie d'impédance électrique multifréquence. Nous présentons un cadre mathématique et numérique pour une procédure d'imagerie du tenseur de conductivité électrique anisotrope en utilisant une nouvelle technique appelée Tentomètre de diffusion Magnéto-acoustographie et proposons une approche de contrôle optimale pour reconstruire le facteur de propriété intrinsèque reliant le tenseur de diffusion au tenseur de conductivité électrique anisotrope. Nous démontrons la convergence et la stabilité du type Lipschitz de l'algorithme et présente des exemples numériques pour illustrer sa précision. Le modèle cellulaire pour Electropermécanisme est démontré. Nous étudions les paramètres efficaces dans un modèle d'homogénéisation. Nous démontrons numériquement la sensibilité de ces paramètres efficaces aux paramètres microscopiques critiques régissant l'électropermécanisme
This thesis contains two different subjects. In the first part, two cases are considered: the thin-plate-spline smoother model, and elliptic boundary equations with uncertain boundary data. In this part, stochastic convergence of the finite element methods is proved for each problem. In the second part, we provide a mathematical analysis of the linearized inverse problem in multifrequency electrical impedance tomography. We present a mathematical and numerical framework for a procedure of imaging the anisotropic electrical conductivity tensor using a novel technique called Diffusion Tensor Magneto-acoustography, and propose an optimal control approach for reconstructing the cross-property factor relating the diffusion tensor to the anisotropic electrical conductivity tensor. We prove convergence and Lipschitz-type stability of the algorithm and present numerical examples to illustrate its accuracy. The cell model for electropermeabilization is demonstrated. We study effective parameters in a homogenization model and demonstrate numerically the sensitivity of these effective parameters to the critical microscopic parameters governing electropermeabilization
APA, Harvard, Vancouver, ISO, and other styles
27

Cavalcanti, Luiz Carlos Amaral Mendonça. "Detecção de elementos antrópicos em imagens aéreas da floresta amazônica." Universidade Federal do Amazonas, 2016. http://tede.ufam.edu.br/handle/tede/5301.

Full text
Abstract:
During the patrolling of environmental crimes, response time is a very important component of mission success. Infractions generally occur in remote, hard-to-access places, characteristics that hinder both patrolling and the action of environmental protection agents. To increase the success rate of approaches and reduce the risk to human lives, unmanned aerial vehicles (UAVs) can be used to cover large areas of forest in a short time without being noticed by offenders, allowing the organs responsible for patrolling these areas to plan and act more efficiently in the repression of such crimes. The new problem created by this approach is the huge amount of data generated during these missions, which often includes hours of video. Manually inspecting all this material in search of anthropic elements is very tiring and error-prone. This work presents an evaluation of image segmentation techniques and an inspection of the features to be extracted, followed by the supervised classification of those segments for the detection of anthropic elements in aerial images of the Amazon rainforest. Besides making publicly available a dataset with more than 3,000 images and 10,000 duly labeled segments, this work investigates different strategies for classifying anthropic elements. The experiments obtained a consistency error rate below 8% in image segmentation using the SRM algorithm, and a precision above 94% in the classification of the objects of interest through an ensemble of one-class classifiers built with the One-Class SVM and REPTree algorithms.
Durante o patrulhamento de crimes ambientais, o tempo de resposta é um componente muito importante no sucesso das missões. Geralmente as infrações ocorrem em lugares ermos e de difícil acesso, características que dificultam tanto o patrulhamento quanto a ação de agentes de preservação ambiental. Para aumentar a taxa de sucesso das abordagens e reduzir o risco de vidas humanas, veículos aéreos não-tripulados (VANTs) podem ser usados para cobrir grandes áreas de floresta em pouco tempo, sem que sejam percebidos por infratores, permitindo que os órgãos de patrulhamento dessas áreas possam planejar e agir com mais eficiência na repressão a esses crimes. O novo problema gerado por essa abordagem é a enorme quantidade de dados gerada durante essas missões, que muitas vezes compreendem horas de vídeo. A inspeção manual de todo esse material em busca de elementos antrópicos é muito cansativa e propensa a erros. Este trabalho apresenta uma avaliação de técnicas de segmentação de imagens, inspeção de características a serem extraídas, seguido da classificação supervisionada destes segmentos para detecção de elementos antrópicos em imagens aéreas da floresta amazônica. Além da publicação de uma base de dados com cerca de 3.000 imagens e 10.000 segmentos devidamente rotulados e investiga diferentes estratégias para classificação de elementos antrópicos. Os experimentos realizados obtiveram taxas de erro de consistência inferiores a 8% na segmentação das imagens utilizando o algoritmo SRM e precisão acima de 94% na classificação dos objetos de interesse através de conjuntos de classificadores unários, utilizando os algoritmos One-Class SVM e REPTree.
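The one-class classification strategy described in this abstract can be illustrated with a minimal stand-in sketch. The thesis uses One-Class SVM and REPTree; this example substitutes a simple Mahalanobis-distance detector trained only on "normal" (forest) segments, and all data, features and thresholds here are assumptions:

```python
import numpy as np

# One-class idea: learn only the "normal" class (undisturbed forest
# segments), then flag anything far from it as a candidate anthropic
# element. Synthetic 4-dimensional colour/texture features stand in
# for real segment descriptors.
rng = np.random.default_rng(1)
forest = rng.normal(loc=0.0, scale=1.0, size=(500, 4))  # training class only

mu = forest.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(forest, rowvar=False))

def score(x):
    """Squared Mahalanobis distance to the forest class."""
    d = x - mu
    return float(d @ cov_inv @ d)

# Threshold chosen so ~99% of training segments are accepted as normal.
thresh = np.quantile([score(x) for x in forest], 0.99)

anthropic = np.full(4, 6.0)            # a far-off segment (e.g. a rooftop)
is_anomaly = score(anthropic) > thresh
```

The appeal of the one-class setup, as in the thesis, is that it needs no labeled examples of every possible man-made structure: anything sufficiently unlike forest is flagged for inspection.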
APA, Harvard, Vancouver, ISO, and other styles
28

Polidorio, Airton Marco [UNESP]. "Detenção de elementos da paisagem urbana em imagens aéreas multiespectrais." Universidade Estadual Paulista (UNESP), 2007. http://hdl.handle.net/11449/100265.

Full text
Abstract:
Atividades inerentes à Fotogrametria e ao Sensoriamento Remoto que utilizam dados extraídos de imagens aéreas estão em constante desenvolvimento, seja pela inserção de novas tecnologias relacionadas com a aquisição desses dados, seja pelo estabelecimento de novos conceitos e métodos que permitem computá-los, transformando-os em informação. Essa transformação, dados-informação, é feita por vários processos entre os quais, alguns foram automatizados e outros necessitam da supervisão e da interação com um operador humano para realizá-la. Um dos fatores que impede a completa automação desses processos é a falta de conhecimento contextual prévio sobre a natureza dos dados. A detecção e a discriminação de elementos específicos em dados de imageamento aéreo constituem uma forma de adquirir esse conhecimento contextual. Este trabalho aborda esse problema de detectar e discriminar elementos específicos em dados de imageamento aéreo de regiões urbanas, especificamente: sombras, vegetação verde, corpos d'água, rodovias pavimentadas e telhados de edificações, bem como discriminar a natureza da elevação desses elementos, de forma que seja possível inferir se determinado elemento tem elevação própria, ou se está ao nível da superfície do terreno.
Activities inherent to Photogrammetry and Remote Sensing that use data acquired from aerial images are in constant development, whether through the use of new hardware to acquire such data or through the establishment of new concepts and methods that allow those data to be computed and transformed into information. That data-to-information transformation is carried out by several processes; some of these have been automated, while others still require supervision and interaction with a human operator. One factor that obstructs the complete automation of those processes is the lack of prior contextual knowledge about the nature of the data. The detection and discrimination of specific elements in aerial imaging data constitute a way to acquire that contextual knowledge. This work addresses the problem of automatically detecting and discriminating specific elements in aerial imaging data of urban areas, namely shadows, green vegetation, water bodies, paved roads and building roofs, as well as determining the nature of the elevation of those elements, in order to infer whether a given element has its own elevation or lies at the level of the terrain surface.
APA, Harvard, Vancouver, ISO, and other styles
29

Polidorio, Airton Marco. "Detenção de elementos da paisagem urbana em imagens aéreas multiespectrais /." Presidente Prudente : [s.n.], 2007. http://hdl.handle.net/11449/100265.

Full text
Abstract:
Resumo: Atividades inerentes à Fotogrametria e ao Sensoriamento Remoto que utilizam dados extraídos de imagens aéreas estão em constante desenvolvimento, seja pela inserção de novas tecnologias relacionadas com a aquisição desses dados, seja pelo estabelecimento de novos conceitos e métodos que permitem computá-los, transformando-os em informação. Essa transformação, dados-informação, é feita por vários processos entre os quais, alguns foram automatizados e outros necessitam da supervisão e da interação com um operador humano para realizá-la. Um dos fatores que impede a completa automação desses processos é a falta de conhecimento contextual prévio sobre a natureza dos dados. A detecção e a discriminação de elementos específicos em dados de imageamento aéreo constituem uma forma de adquirir esse conhecimento contextual. Este trabalho aborda esse problema de detectar e discriminar elementos específicos em dados de imageamento aéreo de regiões urbanas, especificamente: sombras, vegetação verde, corpos d'água, rodovias pavimentadas e telhados de edificações, bem como discriminar a natureza da elevação desses elementos, de forma que seja possível inferir se determinado elemento tem elevação própria, ou se está ao nível da superfície do terreno.
Abstract: Activities inherent to Photogrammetry and Remote Sensing that use data acquired from aerial images are in constant development, whether through the use of new hardware to acquire such data or through the establishment of new concepts and methods that allow those data to be computed and transformed into information. That data-to-information transformation is carried out by several processes; some of these have been automated, while others still require supervision and interaction with a human operator. One factor that obstructs the complete automation of those processes is the lack of prior contextual knowledge about the nature of the data. The detection and discrimination of specific elements in aerial imaging data constitute a way to acquire that contextual knowledge. This work addresses the problem of automatically detecting and discriminating specific elements in aerial imaging data of urban areas, namely shadows, green vegetation, water bodies, paved roads and building roofs, as well as determining the nature of the elevation of those elements, in order to infer whether a given element has its own elevation or lies at the level of the terrain surface. In order to detect those elements, this work proposes new metrics, designed as enhancement indexes, to treat the elements of interest.
Orientador: Nilton Nobuhiro Imai
Coorientador: Antonio Garcia Tommaselli
Banca: José Alberto Quintanilha
Banca: Flávio Bortolozzi
Banca: Maria de Lourdes Bueno Trindade Galo
Banca: Júlio Hasegawa
Doutor
APA, Harvard, Vancouver, ISO, and other styles
30

Zanjani-pour, Sahand. "Intervertebral disc stress and pressure in different daily postures : a finite element study." Thesis, University of Exeter, 2016. http://hdl.handle.net/10871/26682.

Full text
Abstract:
Low back pain is the most common cause of disability in the United Kingdom, with health care costs of more than 1 billion pounds per year. One factor associated with low back pain is the degeneration of intervertebral discs due to loads on the spine. Daily postures such as standing and sitting produce different loads on the discs. Many previous studies have investigated the stress and pressure within the disc in these postures, but their results disagree and the experiments have many limitations. The aim of this project was to assess the feasibility of combining magnetic resonance (MR) imaging and finite element (FE) analysis to predict the pressure and stresses developed by different daily postures in an individual. Transient and non-transient subject-specific 2D models of nine individuals in standing and sitting were created based on previously acquired MR images. The geometry of these FE models was based on supine MR images. The sitting and standing boundary conditions were calculated by comparing the corresponding MR images with the supine posture. The results showed that for six subjects sitting created more intradiscal pressure than standing, for one subject standing created more than sitting, and for two subjects the pressure was nearly the same in the two postures. Because of the 2D models' limitations, 3D models of an individual were developed, in both transient and non-transient form. The intradiscal pressure results were three times lower than in the 2D models. This was due to the consideration of out-of-plane deformation in the 3D models. These results were in the range of in-vivo and in-vitro measurements available in the literature. In conclusion, it was possible to create kinematic, transient, subject-specific FE models based on MR images in different postures. 2D models provide a method for comparing the postures, but 3D models may be more realistic.
APA, Harvard, Vancouver, ISO, and other styles
31

Chen, Cheng. "Finite element modeling of trabecular bone from multi-row detector CT imaging." Thesis, University of Iowa, 2014. https://ir.uiowa.edu/etd/1440.

Full text
Abstract:
The finite element method (FEM) has been widely applied to various medical imaging applications over the past two decades. The remarkable progress in high-resolution imaging techniques has allowed FEM to draw great research interest in computing trabecular bone (TB) stiffness from three-dimensional volumetric imaging. However, only a few results are available in the literature on applying FEM to multi-row detector CT (MDCT) imaging, due to the challenges posed by limited spatial resolution. The research presented here develops new methods to preserve TB structure connectivity and to generate high-quality mesh representations for FEM from the relatively low-resolution images available with MDCT imaging. Specifically, it introduces a space-variant hysteresis algorithm to threshold local trabecular structure while preserving structure connectivity. Also, mesh generation algorithms were applied to represent the TB micro-architecture, and mesh quality was compared with that of meshes generated by traditional methods. TB stiffness was computed using FEM simulation on micro-CT (µ-CT) and MDCT images of twenty-two cadaveric specimens of the distal tibia. The actual stiffness of those specimens was experimentally determined by mechanical testing, and its correlation with computed stiffness was analyzed. The observed values of linear correlation (r²) between actual bone stiffness and stiffness computed from µ-CT and MDCT imaging were 0.95 and 0.88, respectively. Also, the reproducibility of the FEM-based computed bone stiffness was determined from repeat MDCT scans of cadaveric specimens, and the observed intra-class correlation coefficient was a high 0.98. Experimental results demonstrate the feasibility of applying FEM with high sensitivity and reproducibility to MDCT imaging of TB at the distal tibia under in vivo conditions.
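The validation step described above (correlating FEM-computed stiffness with mechanically measured stiffness across specimens) can be sketched as follows; the specimen values here are synthetic stand-ins, not the thesis data:

```python
import numpy as np

# Hypothetical validation: 22 specimens (matching the cohort size in the
# abstract), with FEM predictions modelled as the measured stiffness plus
# a small calibration slope and noise. Real studies use measured pairs.
rng = np.random.default_rng(2)
measured = rng.uniform(2.0, 10.0, size=22)            # mechanical test values
computed = 1.05 * measured + rng.normal(0, 0.3, 22)   # FEM predictions

# Linear correlation (r^2), the figure of merit quoted in the abstract.
r = np.corrcoef(measured, computed)[0, 1]
r2 = r ** 2
```

An r² near 1 indicates that the FEM pipeline ranks and scales specimen stiffness consistently with the mechanical ground truth, which is the sense in which the abstract's values of 0.95 (µ-CT) and 0.88 (MDCT) are read.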
APA, Harvard, Vancouver, ISO, and other styles
32

Chean, Shen Lee. "Numerical study for acoustic micro-imaging of three dimensional microelectronic packages." Thesis, Liverpool John Moores University, 2014. http://researchonline.ljmu.ac.uk/4502/.

Full text
Abstract:
Complex structures and multiple interfaces of modern microelectronic packages complicate the interpretation of acoustic data. This study makes four novel contributions: 1) contributions to the finite element method; 2) novel approaches to reduce computational cost; 3) new post-processing technologies to interpret the simulation data; and 4) the formation of theoretical guidance for acoustic image interpretation. The impact of simulation resolution on the numerical dispersion error and the exploration of quadrilateral infinite boundaries make up the first part of this thesis's contributions. The former focuses on establishing the convergence score of varying resolution densities in the time and spatial domains against a very high fidelity numerical solution. The latter evaluates the configuration of quadrilateral infinite boundaries in comparison with traditional circular infinite boundaries and quadrilateral Perfectly Matched Layers. The second part of this study features the modelling of a flip chip with a 140 µm solder bump assembly, imaged with a 230 MHz virtual raster-scanning transducer with a spot size of 17 µm. The Virtual Transducer was designed to reduce the total number of numerical elements from hundreds of millions to hundreds of thousands. Thirdly, two techniques are introduced to analyze and evaluate simulated acoustic data: 1) the C-Line plot, a 2D max plot of specific gate interfaces that allows quantitative characterization of acoustic phenomena; and 2) the Acoustic Propagation Map, which contour-maps an overall summary of intra-sample wave propagation across the time domain in one image. Lastly, all these developments are combined: the physical mechanics of edge effects was studied and verified against experimental data. A direct relationship between transducer spot size and edge-effect severity was established. At regions with edge effects, the acoustic pulse interacting with the solder bump edge is scattered mainly along the horizontal axis.
The edge effect did not manifest in solder bump models without Under Bump Metallization (UBM). Measurements found acoustic penetration improvements of up to 44% with the removal of the UBM. Other acoustic mechanisms were also discovered and explored. The defect detection mechanism was investigated by modelling crack propagation in the solder bump assembly. Gradual progression of the crack was found to have a predictable influence on the edge-effect profile. By exploiting this feature, the progress of crack propagation in experimental data can be interpreted by evaluating the C-Scan image.
APA, Harvard, Vancouver, ISO, and other styles
33

Qiu, Yan 1973. "Three dimensional finite element model for lesion correspondence in breast imaging [electronic resource] / by Yan Qiu." University of South Florida, 2003. http://purl.fcla.edu/fcla/etd/SFE0000192.

Full text
Abstract:
Title from PDF of title page.
Document formatted into pages; contains 64 pages.
Thesis (M.S.C.S.)--University of South Florida, 2003.
Includes bibliographical references.
Text (Electronic thesis) in PDF format.
ABSTRACT: Predicting breast tissue deformation is of great significance in several medical applications such as surgery, biopsy and imaging. In breast surgery, surgeons are often concerned with a specific portion of the breast, e.g., a tumor, which must be located accurately beforehand. Clinically, it is also important to combine the information provided by images from several modalities, or taken at different times, for the planning and guidance of interventions. Multi-modality imaging of the breast obtained by mammography, MRI and PET is thought to be best achieved through some form of data fusion technique. However, images taken by these various techniques are often obtained under entirely different tissue configurations, compression, orientation or body position. In these cases some form of spatial transformation of image data from one geometry to another is required such that the tissues are represented in an equivalent configuration.
ABSTRACT: We constructed 3D biomechanical models for this purpose using the Finite Element Method (FEM). The models were based on phantom and patient MRIs and could be used to model the interrelation between different types of tissue by applying displacements or forces, and to register multimodality medical images.
System requirements: World Wide Web browser and PDF reader.
Mode of access: World Wide Web.
APA, Harvard, Vancouver, ISO, and other styles
34

César, de Miranda Loureiro Eduardo. "Construção de simuladores baseados em elementos de volume a partir de imagens tomográficas coloridas." Universidade Federal de Pernambuco, 2002. https://repositorio.ufpe.br/handle/123456789/9491.

Full text
Abstract:
This work presents a new methodology for constructing phantoms based on volume elements (voxels). The segmentation of the models reduces to the task of colouring the tomographic images, assigning a different colour to each segmented organ. A head-and-neck model was built using this new technique. Besides simplifying the procedure and allowing models to be built in less time, the information is stored in an optimised form, improving the performance of the program that computes the radiation transport. Because the same program that computes the radiation transport also executes graphics commands, images can be reconstructed from the model data showing isodose regions from several points of view, increasing the level of information conveyed to the user. Virtual radiographs of the constructed model can also be obtained. This capability allows studies aimed at optimising radiographic techniques while simultaneously assessing the doses to organs and tissues. The program presented here, named MCvoxEL, which implements this new methodology, was validated by comparing its results with those of programs already established in the scientific community. Conversion coefficients for doses from exposures to parallel photon beams were also obtained.
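The colour-based segmentation idea is simple to sketch: each organ is painted a distinct colour in the tomographic slice, and the voxel phantom stores one organ ID per voxel. The palette and organ labels below are invented for illustration and are not those of MCvoxEL:

```python
# Hypothetical sketch of colour-to-organ segmentation: each organ in a
# coloured tomographic slice gets a distinct RGB colour, and the voxel
# model simply stores one organ ID per voxel.
ORGAN_BY_COLOUR = {          # invented palette, for illustration only
    (255, 255, 255): 0,      # background
    (255, 0, 0): 1,          # e.g. soft tissue
    (0, 0, 255): 2,          # e.g. bone
}

def slice_to_voxels(pixels):
    """pixels: 2D list of RGB tuples for one coloured slice."""
    return [[ORGAN_BY_COLOUR[p] for p in row] for row in pixels]

slice_px = [[(255, 255, 255), (255, 0, 0)],
            [(0, 0, 255), (255, 0, 0)]]
print(slice_to_voxels(slice_px))  # → [[0, 1], [2, 1]]
```

Stacking such slices yields the 3D array of organ IDs that the radiation-transport code walks through.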
APA, Harvard, Vancouver, ISO, and other styles
35

Diarra, Bakary. "Study and optimization of 2D matrix arrays for 3D ultrasound imaging." Thesis, Lyon 1, 2013. http://www.theses.fr/2013LYO10165/document.

Full text
Abstract:
3D ultrasound imaging is a fast-growing medical imaging modality. In addition to its numerous advantages (low cost, non-ionizing beam, portability), it allows representing the anatomical structures in their natural form, which is always three-dimensional. The relatively slow mechanical scanning probes tend to be replaced by two-dimensional matrix arrays that are an extension, in both the lateral and elevation directions, of the conventional 1D probe. This 2D positioning of the elements allows ultrasonic beam steering in the whole space. Usually, the piezoelectric elements of a 2D array probe are aligned on a regular grid and spaced out by a distance (the pitch) subject to the spatial sampling law (the inter-element distance must be shorter than half a wavelength) to limit the impact of grating lobes. This physical constraint leads to a multitude of small elements. The equivalent in 2D of a 1D probe of 128 elements contains 128x128 = 16,384 elements. Connecting such a high number of elements is a real technical challenge, as the number of channels in current ultrasound scanners rarely exceeds 256. The solutions proposed to control this type of probe implement multiplexing or element-number reduction techniques, generally using random selection approaches (« sparse array »). These methods suffer from a low signal-to-noise ratio due to the energy loss linked to the small number of active elements. In order to limit the loss of performance, optimization remains the best solution. The first contribution of this thesis is an extension of the « sparse array » technique combined with an optimization method based on the simulated annealing algorithm. 
The proposed optimization reduces the required number of active elements according to the expected characteristics of the ultrasound beam and permits limiting the energy loss compared to the initial dense array probe. The second contribution is a completely new approach adopting a non-grid positioning of the elements to remove the grating lobes and to overstep the spatial sampling constraint. This new strategy allows the use of larger elements, leading to a much smaller number of necessary elements for the same probe surface. The active surface of the array is maximized, which results in a greater output energy and thus a higher sensitivity. It also allows a greater scan sector, as the grating lobes are very small relative to the main lobe. The random choice of the position of the elements and their apodization (or weighting coefficients) is optimized by the simulated annealing. The proposed methods are systematically compared to the dense array by performing simulations under realistic conditions. These simulations show a real potential of the developed techniques for 3D imaging. A 2D probe of 8x24 = 192 elements was manufactured by Vermon (Vermon SA, Tours, France) to test the proposed methods in an experimental setting. The comparison between simulation and experimental results validates the proposed methods and proves their feasibility.
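The simulated-annealing element selection described in this abstract can be sketched in a few lines. Everything below is an illustrative stand-in rather than the thesis's implementation: a 1D aperture replaces the 2D matrix array, the cost is the peak side-lobe level of the array factor, and the element counts, pitch and cooling schedule are made-up values.

```python
import math
import random

def peak_sidelobe(active, pitch_wl=0.7):
    """Peak side-lobe level (relative to the main lobe) of the array
    factor for a 1D aperture given an on/off element mask; pitch in
    wavelengths (0.7 deliberately violates the half-wavelength rule)."""
    n_on = sum(active)
    worst = 0.0
    for deg in range(-90, 91, 2):
        if abs(deg) < 6:                      # skip the main-lobe region
            continue
        th = math.radians(deg)
        phases = [2 * math.pi * pitch_wl * i * math.sin(th)
                  for i, on in enumerate(active) if on]
        re = sum(math.cos(p) for p in phases)
        im = sum(math.sin(p) for p in phases)
        worst = max(worst, math.hypot(re, im) / n_on)
    return worst

def anneal_selection(n_elems=48, n_active=16, steps=400, t0=0.05, seed=1):
    """Choose n_active of n_elems elements by simulated annealing."""
    rng = random.Random(seed)
    mask = [1] * n_active + [0] * (n_elems - n_active)
    rng.shuffle(mask)
    cost = peak_sidelobe(mask)
    for k in range(steps):
        i = rng.choice([j for j, on in enumerate(mask) if on])
        j = rng.choice([j for j, on in enumerate(mask) if not on])
        mask[i], mask[j] = 0, 1               # move one active element
        new = peak_sidelobe(mask)
        temp = max(t0 * (1 - k / steps), 1e-9)
        if new < cost or rng.random() < math.exp(-(new - cost) / temp):
            cost = new                        # accept the move
        else:
            mask[i], mask[j] = 1, 0           # reject and revert
    return mask, cost

mask, cost = anneal_selection()
print(sum(mask), round(cost, 2))
```

The acceptance rule (always accept improvements, occasionally accept degradations while the temperature is high) is what lets the search escape the local minima that a greedy element selection would get stuck in.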
APA, Harvard, Vancouver, ISO, and other styles
36

Watson, Francis Maurice. "Better imaging for landmine detection : an exploration of 3D full-wave inversion for ground-penetrating radar." Thesis, University of Manchester, 2016. https://www.research.manchester.ac.uk/portal/en/theses/better-imaging-for-landmine-detection-an-exploration-of-3d-fullwave-inversion-for-groundpenetrating-radar(720bab5f-03a7-4531-9a56-7121609b3ef0).html.

Full text
Abstract:
Humanitarian clearance of minefields is most often carried out by hand, conventionally using a metal detector and a probe. Detection is a very slow process, as every piece of detected metal must be treated as if it were a landmine and carefully probed and excavated, even though most are not. The process can be safely sped up by using Ground-Penetrating Radar (GPR) to image the subsurface, to verify metal detection results and safely ignore any objects which could not possibly be a landmine. In this thesis, we explore the possibility of using Full-Wave Inversion (FWI) to improve GPR imaging for landmine detection. Posing the imaging task as FWI means solving the large-scale, non-linear and ill-posed optimisation problem of determining the physical parameters of the subsurface (such as electrical permittivity) which would best reproduce the data. This thesis begins by giving an overview of all the mathematical and implementational aspects of FWI, so as to provide an informative text both for mathematicians (perhaps already familiar with other inverse problems) wanting to contribute to the mine detection problem, and for a wider engineering audience (perhaps already working on GPR or mine detection) interested in the mathematical study of inverse problems and FWI. We present the first numerical 3D FWI results for GPR, and consider only surface measurements from small-scale arrays, as these are suitable for our application. The FWI problem requires an accurate forward model to simulate GPR data, for which we use a hybrid finite-element boundary-integral solver utilising first-order curl-conforming Nédélec (edge) elements. We present a novel `line search' type algorithm which prioritises inversion of some target parameters in a region of interest (ROI), with the update outside of the area defined implicitly as a function of the target parameters. 
This is particularly applicable to the mine detection problem, in which we wish to know more about some detected metallic objects but are not interested in the surrounding medium. We may nonetheless need to resolve the surrounding area, in order to account for the target being obscured and for multiple scattering in a highly cluttered subsurface. We focus particularly on the spatial sensitivity of the inverse problem, using both a singular value decomposition to analyse the Jacobian matrix and an asymptotic expansion involving polarization tensors describing the perturbation of the electric field due to small objects. The latter allows us to extend the current theory of sensitivity for acoustic FWI, based on the Born approximation, to better understand how polarization plays a role in the 3D electromagnetic inverse problem. Based on this asymptotic approximation, we derive a novel approximation to the diagonals of the Hessian matrix which can be used to pre-condition the GPR FWI problem.
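At its core, FWI as posed in this abstract is regularised data-misfit minimisation. As a toy illustration only, with a hypothetical linearised forward operator in place of a Maxwell solver and made-up dimensions, the gradient loop looks like:

```python
import numpy as np

# Toy inversion: a stand-in linear forward operator F maps subsurface
# parameters m (e.g. permittivity perturbations) to data d, and we
# minimise 0.5*||F m - d||^2 + 0.5*alpha*||m||^2 by gradient descent.
rng = np.random.default_rng(0)
F = rng.normal(size=(60, 20))            # hypothetical forward operator
m_true = np.zeros(20)
m_true[8:12] = 1.0                       # a buried "target" in the ROI
d = F @ m_true                           # noise-free synthetic data

alpha, step = 1e-2, 1e-3                 # Tikhonov weight, step size
m = np.zeros(20)
for _ in range(5000):
    residual = F @ m - d
    grad = F.T @ residual + alpha * m    # gradient of the regularised misfit
    m -= step * grad

print(np.round(m[8:12], 2))              # target amplitudes recovered ≈ 1
```

In the real problem F is a non-linear wave solver, so the gradient is obtained by adjoint-state computations rather than an explicit matrix transpose, and the regularisation is what tames the ill-posedness mentioned above.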
APA, Harvard, Vancouver, ISO, and other styles
37

Ataseven, Yoldas. "Parallel Implementation Of The Boundary Element Method For Electromagnetic Source Imaging Of The Human Brain." Master's thesis, METU, 2005. http://etd.lib.metu.edu.tr/upload/12606548/index.pdf.

Full text
Abstract:
Human brain functions are based on the electrochemical activity and interaction of the neurons constituting the brain. Some brain diseases are characterized by abnormalities of this activity. Detection of the location and orientation of this electrical activity is called electro-magnetic source imaging (EMSI) and is of significant importance since it promises to serve as a powerful tool for neuroscience. The Boundary Element Method (BEM) is a method applicable to EMSI on realistic head geometries that generates large systems of linear equations with dense matrices. Generation and solution of these matrix equations are time- and memory-consuming due to the size of the matrices and the high computational complexity of direct methods. This study presents a relatively cheap and effective solution to this problem and reduces the processing times to clinically acceptable values using a parallel cluster of personal computers on a local area network. For this purpose, a cluster of 8 workstations is used. A parallel BEM solver is implemented that distributes the model efficiently to the processors. The parallel solver for BEM is developed using the PETSc library. The performance of the solver is evaluated in terms of CPU and memory usage for different numbers of processors. For a 15011-node mesh, a speed-up efficiency of 97.5% is observed when computing transfer matrices. Individual solutions can be obtained in 520 ms on 8 processors with 94.2% parallelization efficiency. It was observed that workstation clusters are a cost-effective tool for solving complex BEM models in clinically acceptable time. The effect of parallelization on the inverse problem is also demonstrated with a genetic algorithm, and a very similar speed-up is obtained.
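The parallel-efficiency figures quoted above follow from the usual definition: efficiency = speed-up / processor count = T1 / (p · Tp). A quick check with illustrative timings (the serial time below is back-calculated from the reported 520 ms and 94.2%, not taken from the thesis):

```python
def parallel_efficiency(t_serial, t_parallel, n_procs):
    """Parallel efficiency S/p = T1 / (p * Tp)."""
    return t_serial / (n_procs * t_parallel)

# A solve taking 3.92 s serially and 520 ms on 8 processors gives the
# ~94.2% efficiency reported in the abstract (illustrative numbers).
eff = parallel_efficiency(3.92, 0.520, 8)
print(round(eff * 100, 1))  # → 94.2
```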
APA, Harvard, Vancouver, ISO, and other styles
38

Yang, Jing. "The role of phonological working memory in Chinese reading development behavioral and fMRI evidence /." Click to view the E-thesis via HKUTO, 2009. http://sunzi.lib.hku.hk/hkuto/record/B42664640.

Full text
APA, Harvard, Vancouver, ISO, and other styles
39

Brook, Nicholas H. "The flavour dependence of charged hadron production at large transverse momenta using high energy photon and hadron beams at the OMEGA spectrometer : equipped with a ring imaging Cherenkov detector." Thesis, University of Manchester, 1988. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.328282.

Full text
APA, Harvard, Vancouver, ISO, and other styles
40

Ferreira, Manoel Dênis Costa. "Análise inversa utilizando o método dos elementos de contorno e correlação de imagens digitais." Universidade de São Paulo, 2012. http://www.teses.usp.br/teses/disponiveis/18/18134/tde-25092012-084832/.

Full text
Abstract:
The identification of physical and geometrical parameters from experimental measurements is a common procedure in treating many problems of science and engineering. In this context, inverse analysis is an important tool for treating these problems. This work presents formulations that couple the boundary element method (BEM) with the technique of digital image correlation (DIC) (for obtaining the displacement fields) to solve some inverse problems of interest to structural engineering. A computer code based on the BEM, on regularization techniques and on a genetic algorithm has been implemented for the treatment of problems such as identification of material properties, recovery of boundary conditions and identification of cohesive fracture model parameters. Examples with data from a previous direct analysis (simulating experimental data) are presented to demonstrate the effectiveness of the proposed formulations. Three-point bending tests on notched beams were performed, and images were acquired to obtain the displacement fields on one lateral surface of the samples via DIC. These displacement fields were used to feed the proposed inverse model. The DIC technique yielded data in sufficient quantity and accuracy for the purposes of this study. The use of the BEM proved to be simple and efficient in solving the inverse problems treated.
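The DIC step that produces those displacement fields can be illustrated with a minimal integer-pixel subset search. This is a toy sketch on synthetic images using normalised cross-correlation; real DIC codes add sub-pixel interpolation and subset shape functions:

```python
import numpy as np

def ncc(a, b):
    """Normalised cross-correlation of two equal-size patches."""
    a = a - a.mean()
    b = b - b.mean()
    return float((a * b).sum() / np.sqrt((a * a).sum() * (b * b).sum()))

def dic_displacement(ref, cur, y, x, half=4, search=3):
    """Integer-pixel DIC: slide the reference subset centred at (y, x)
    over the deformed image and keep the best-correlated offset."""
    sub = ref[y - half:y + half + 1, x - half:x + half + 1]
    best, best_uv = -2.0, (0, 0)
    for dv in range(-search, search + 1):
        for du in range(-search, search + 1):
            cand = cur[y + dv - half:y + dv + half + 1,
                       x + du - half:x + du + half + 1]
            score = ncc(sub, cand)
            if score > best:
                best, best_uv = score, (dv, du)
    return best_uv

rng = np.random.default_rng(3)
ref = rng.random((40, 40))                           # speckle pattern
cur = np.roll(np.roll(ref, 2, axis=0), -1, axis=1)   # known shift (2, -1)
print(dic_displacement(ref, cur, 20, 20))  # → (2, -1)
```

Repeating the search over a grid of subset centres gives the full displacement field that feeds the BEM-based inverse model.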
APA, Harvard, Vancouver, ISO, and other styles
41

Santos, Marina Vieira. "Imagens e Ciências no Ensino Fundamental II: um estudo à luz da semiótica peirceana." Universidade de São Paulo, 2017. http://www.teses.usp.br/teses/disponiveis/81/81132/tde-30032017-153653/.

Full text
Abstract:
Considering that phenomena of a chemical nature are first treated in elementary school, and that images can assist the learning process and enable the construction of meanings for knowledge, this study analyses the choices, actions and behaviours of a science teacher during her classes for the 9th grade of elementary school II, with the aim of exploring her use of the pictorial resources she proposed. The assessment addressed questions such as: what types of images were used in the development of the discipline? What did the teacher aim to achieve by using them? In what context are they inserted? How did the communication process about each image unfold? The research, characterized as a case study, was developed in a municipal school in the city of Salto - SP, where classes were monitored and recorded over a period of six months, during which the chemistry content was taught. Each image used by the teacher in the classroom was considered a sign and explored through the semiotic theory of Charles Sanders Peirce. To consider an image as a sign is to attribute to it a charge of materialization; in other words, the object is materialized in the representation, enabling the transition between the immaterial and visual domains and an understanding of how images may or may not contribute to the construction of the chemistry content, also taking into account the teacher's discourse about the sign. The images were classified according to the modes of relation by which the sign is constituted (as to its own nature) and as to its object. These relations belong to the trichotomies established by Peirce, resulting in several classes, such as iconic qualisign, iconic sinsign, indexical sinsign, iconic legisign, indexical legisign and symbolic legisign. These analyses and classifications, added to the arguments and constructions presented by the teacher during the research, revealed her strongly realist stance in reading the images.
APA, Harvard, Vancouver, ISO, and other styles
42

LAURENT, DANIEL. "Monde imaginal et auto-education elements pour une hermeneutique de l'imaginal educatif." Lille 3, 1995. http://www.theses.fr/1995LIL30010.

Full text
Abstract:
The origin of the thesis stands on a two-fold questioning: an interrogation of the place and role of elected images (the 'imaginal') in the construction of the private and professional identity of the teacher, and an attempt to define the aesthetic experience which characterizes this process. The core of the problem inscribes itself between four notions: that of a narrative identity which, through aesthetic experience, enables the individual to draw on his stock of fictional 'imaginal' in order to build a meaningful vision of his life, along with a hermeneutic of the self, conceived as a self-education of the subject working on his narrative identity. Hence the necessity of considering the relationships between a hermeneutic of the self and a hermeneutic of the 'imaginal', a necessity which leads to a three-fold questioning: how is it possible to educate oneself through contact with artistic and fictional works? How can one validate a personal praxis by referring to a fictional 'imaginal'? And, more broadly, what links must the individual establish between his cultural praxis (the personal task of constructing oneself) and the civilizational requirements of the encompassing contemporary world? Through theoretical references to Montaigne, Nietzsche and Freud, and through 'imaginal' references to François Truffaut's films L'enfant sauvage and Une belle fille comme moi, the aim of the work is to re-establish self-reflection as an Aufklärung of the self by returning to the lived forms of subjectivity. It is also related to the philosophical conception that the individual must, through self-reflection and on the basis of his own biographical experience, accustom himself to thinking the forces that inhabit him in order to put them at the service of the life purpose he sets for himself. 
Centred upon the relation to otherness (for in the imaginal, what Narcissus is perhaps looking for is not his own image but the approving gaze of the other), the individual is indeed confronted with a task of self-education that each person carries out and meditates on alone, with his own means, including those of a fictive 'imaginal'.
APA, Harvard, Vancouver, ISO, and other styles
43

McGuinness, Brigitt Angelica. "Teacher Perceptions of the Implementation Processes of the Imagine Learning Program in Title I Elementary Schools." W&M ScholarWorks, 2018. https://scholarworks.wm.edu/etd/1530192587.

Full text
Abstract:
Closing achievement gaps for students from low-socioeconomic backgrounds is a decades-long issue in public education, particularly for reading instruction (International Reading Association [IRA], 2010; National Center for Education Statistics [NCES], 2013). Across the United States, initiatives to further integrate technology-based instruction to achieve differentiation are constantly emerging. Selecting which programs to use and how to best implement the technology to produce the highest academic gains remain significant issues. Research has shown that technology-based programs can produce the same positive or negative effects as teacher-led instruction (Ross, Morrison, & Lowther, 2010). Finding and implementing high-quality literacy technology is particularly important for students attending Title I schools. Students from low-income backgrounds may start their schooling at a disadvantage in terms of vocabulary and oral communication skills (Reardon, 2013; Timmons, 2008) which research has linked to higher unemployment rates (Timmons, 2008). The purpose of this qualitative program evaluation was to analyze teacher perceptions regarding the impact of implementation activities for a technology-based literacy program in four Title I schools in a Virginia school district. Nine teachers representing kindergarten, first and second grades were interviewed regarding their level of preparedness, classroom integration, obstacles and facilitators in relation to program implementation. Teachers reported high levels of preparedness in placing students on the program, but low levels of support in ongoing implementation and training. Recommendations included providing all teachers with initial and continual professional development, allowing stakeholders to visit model classrooms, providing necessary equipment, devoting time for program-specific data talks and individual teacher planning, and garnering more planning input from the program consultants.
APA, Harvard, Vancouver, ISO, and other styles
44

Sato, Marcel. "Modelagem de problemas da mecanica da fratura e propagação de trincas em fadiga." [s.n.], 2009. http://repositorio.unicamp.br/jspui/handle/REPOSIP/263144.

Full text
Abstract:
Advisor: Paulo Sollero
Master's dissertation - Universidade Estadual de Campinas, Faculdade de Engenharia Mecanica
Resumo: This work presents a tool to model mixed-mode fatigue propagation of multiple cracks in isotropic materials, using the dual boundary element method. This method is used to perform a stress and strain analysis of the solid, providing reliable results for the elastic field in the region around the crack. From these results, the stress intensity factors are obtained using the J-integral technique, with the modes decoupled through the decomposition of the elastic field into its symmetric and anti-symmetric components. The stress intensity factors are used to compute the propagation angle and the increment size by the minimum strain energy density method. The fatigue life is obtained by direct integration of the Paris law expression, modified by the crack closure model. The algorithm is validated against experimental results for two problems involving mixed-mode fracture and fatigue. In both tests, the digital image correlation technique was used to monitor fatigue crack propagation, and image processing techniques were employed to analyze the results.
Abstract: This work presents a tool to model problems of multiple-site fatigue crack propagation in isotropic materials under mixed-mode conditions, using the dual boundary element method. This method is used to perform a stress and strain analysis of the solid, providing reliable results for the elastic field in the region near the crack tip. The stress intensity factors are obtained using the J-integral technique, and they are decoupled with a procedure based on the decomposition of the elastic field into its symmetric and anti-symmetric components. The crack propagation angle and the increment size are calculated through the minimum strain energy density criterion. The fatigue life is obtained through the integration of the Paris law expression modified by the crack closure model. The algorithm is validated against experimental results for two mixed-mode fracture problems with fatigue. In both cases, the digital image correlation technique was used to monitor the fatigue crack propagation during the tests, and digital image processing techniques were used to analyze the results.
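The fatigue-life step described in this abstract, direct integration of the Paris law da/dN = C·ΔK^m, can be sketched numerically as follows. The material constants C and m, the geometry factor Y, and the crack sizes are illustrative assumptions, not values from the thesis, and the crack-closure correction the thesis applies is omitted here.

```python
import numpy as np

def fatigue_life(a0, af, C, m, dsigma, Y=1.0, n=10000):
    """Cycles to grow a crack from a0 to af by integrating the Paris law
    da/dN = C * dK**m, with dK = Y * dsigma * sqrt(pi * a).
    Units: a in m, dsigma in MPa, dK in MPa*sqrt(m),
    C in m/cycle per (MPa*sqrt(m))**m."""
    a = np.linspace(a0, af, n)
    dK = Y * dsigma * np.sqrt(np.pi * a)   # stress intensity factor range
    dNda = 1.0 / (C * dK**m)               # cycles per unit crack growth
    # trapezoidal rule for N = integral of dN/da over [a0, af]
    return float(np.sum(0.5 * (dNda[1:] + dNda[:-1]) * np.diff(a)))

# Illustrative steel-like values: grow a crack from 10 mm to 25 mm at 100 MPa
N = fatigue_life(a0=0.010, af=0.025, C=7e-12, m=3.0, dsigma=100.0)
```

For m ≠ 2 this integral also has a closed form, which makes a handy sanity check on the numerical result.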
Master's
Solid Mechanics and Mechanical Design
Master in Mechanical Engineering
APA, Harvard, Vancouver, ISO, and other styles
45

Oberg, Matheus Barbosa Andrade Moser. "Termoelasticidade : um estudo via método dos elementos de contorno, termografia e correlação digital de imagens." reponame:Repositório Institucional da UnB, 2016. http://repositorio.unb.br/handle/10482/20114.

Full text
Abstract:
Master's dissertation—Universidade de Brasília, Faculdade UnB Gama, Faculdade de Tecnologia, Programa de Pós-graduação em Integridade de Materiais da Engenharia, 2016.
This work presents a numerical and experimental study of steady-state thermoelasticity using the boundary element method (BEM) and optical metrology techniques. In the thermoelastic BEM formulation, the contributions arising from the thermoelastic phenomenon naturally appear in the form of a domain integral. To preserve the main characteristic of the BEM, this domain integral was converted into an equivalent boundary integral using the radial integration method (RIM). This conversion technique, a purely mathematical approach, requires the temperature field to be defined by a mathematical function. In most engineering problems, however, this information is acquired as a distribution of pointwise temperature values. Thus, to apply the RIM, a regression technique is needed to approximate the temperature field by a mathematical function that describes it. To evaluate the influence of the type of regression used, an experimental setup and procedure were devised for simultaneous acquisition of the temperature field and the resulting displacement field. The temperature field is evaluated by thermal imaging, while the resulting displacement field is acquired by digital image correlation (DIC). To ensure the quality of the DIC analysis, a CNC marking device was developed, capable of printing computer-generated speckle patterns onto the surface of the specimens. From this, a numerical model reproducing the experimentally observed conditions was built for analysis via BEM with RIM. The temperature field was approximated by bi-quadratic, bi-cubic and bi-quartic polynomial functions in order to assess the sensitivity of the problem to the type of approximation.
Finally, comparing the numerically obtained displacement fields with the experimental results showed good agreement, regardless of the degree of the polynomial used in the regression.
ABSTRACT
This work presents a numerical and experimental study on steady-state thermoelasticity using the boundary element method (BEM), digital image correlation (DIC) and thermal imaging. In the BEM formulation for thermoelasticity, the effect of the thermoelastic loads naturally arises as a domain integral. In order to preserve the BEM's boundary-only character, this domain integral is converted into an equivalent boundary integral by the radial integration method (RIM). This technique, which consists in a purely mathematical approach, requires the temperature field to be described as a function. However, in many engineering situations, this information is provided as a distribution of individual temperature values. In such situations, to successfully apply the RIM, it is necessary to use a regression technique to approximate the temperature field by a mathematical function. In order to evaluate the influence of the kind of regression applied, an experimental assembly was developed to acquire, simultaneously, the temperature field and the consequent displacement field. The temperature field is acquired from thermal images, while the resulting displacement field is obtained through DIC. To assure the quality of the DIC analysis, a CNC marking device was designed specifically to mark computer-generated speckle patterns on the surfaces to be measured. After that, a numerical model was developed to reproduce the experimentally observed conditions for the BEM-with-RIM analysis. The experimental temperature field was approximated by three kinds of polynomial expressions: bi-quadratic, bi-cubic and bi-quartic. In the end, the comparison between the numerical and the experimental displacement results showed good agreement regardless of the type of polynomial regression used.
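The regression step this abstract describes, approximating a pointwise temperature field by a polynomial so that the radial integration method can be applied, can be sketched as an ordinary least-squares fit. The sampled field and the bi-quadratic degree below are illustrative assumptions, not data from the thesis.

```python
import numpy as np

def fit_poly2d(x, y, t, deg=2):
    """Least-squares fit of a 2-D polynomial T(x, y) with terms x**i * y**j,
    i, j <= deg (deg=2 gives a bi-quadratic), to scattered temperature samples."""
    terms = [(i, j) for i in range(deg + 1) for j in range(deg + 1)]
    A = np.column_stack([x**i * y**j for i, j in terms])  # design matrix
    coef, *_ = np.linalg.lstsq(A, t, rcond=None)
    return terms, coef

def eval_poly2d(terms, coef, x, y):
    """Evaluate the fitted polynomial at point(s) (x, y)."""
    return sum(c * x**i * y**j for (i, j), c in zip(terms, coef))

# Hypothetical smooth field T = 20 + 3x + 2y^2, sampled at scattered points
rng = np.random.default_rng(0)
xs, ys = rng.uniform(0, 1, 200), rng.uniform(0, 1, 200)
ts = 20 + 3 * xs + 2 * ys**2
terms, coef = fit_poly2d(xs, ys, ts, deg=2)
```

Since the hypothetical field lies in the bi-quadratic space, the fit recovers it exactly; with measured data the residual would instead indicate whether a higher-degree (bi-cubic, bi-quartic) regression is warranted.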
APA, Harvard, Vancouver, ISO, and other styles
46

Nguyen, Van Dang. "A finite elements method to solve the Bloch-Torrey equation applied to diffusion magnetic resonance imaging of biological tissues." Palaiseau, Ecole polytechnique, 2014. http://pastel.archives-ouvertes.fr/docs/00/95/77/50/PDF/thesis_Dang.pdf.

Full text
Abstract:
Diffusion magnetic resonance imaging (dMRI) is a non-invasive imaging technique that gives access to the diffusion characteristics of water in biological tissues, notably in the brain. The restrictions that the microscopic cellular structure imposes on the diffusion of water molecules are statistically aggregated into a measurable macroscopic dMRI signal. Inferring the microscopic structure of the tissue from the dMRI signal makes it possible to detect pathological regions and to observe functional properties of the brain. To this end, it is important to better understand the relation between the tissue microstructure and the dMRI signal, which requires new numerical tools capable of performing computations in complex model geometries of tissues. We propose such a numerical method based on linear finite elements, which allows complex geometries to be described precisely. The finite element discretization is coupled with the adaptive Runge-Kutta Chebyshev time-stepping method. This method, which ensures second-order convergence in both time and space, is implemented on the FeniCS C++ platform. We also use the mesh generator Salome to work efficiently with periodic multi-compartment geometries. We consider four applications of the method to study diffusion in multi-compartment models. In the first application, we study the long-time behavior and demonstrate the convergence of an apparent diffusion coefficient toward an effective diffusion tensor obtained by homogenization. The second application aims to verify numerically that a two-compartment model approximates the three-compartment model in which the cellular and extracellular compartments are complemented by a membrane compartment.
The third application consists in validating the Karger model of the macroscopic dMRI signal, which takes exchange between compartments into account. The last application focuses on the dMRI signal arising from isolated neurons. We propose an efficient one-dimensional model to compute the dMRI signal accurately in a network of neurites of varying radii. We test the validity of a semi-analytical expression for the dMRI signal arising from neurite networks.
Diffusion magnetic resonance imaging (dMRI) is a non-invasive imaging technique that gives a measure of the diffusion characteristics of water in biological tissues, notably, in the brain. The hindrances that the microscopic cellular structure poses to water diffusion are statistically aggregated into the measurable macroscopic dMRI signal. Inferring the microscopic structure of the tissue from the dMRI signal allows one to detect pathological regions and to monitor functional properties of the brain. For this purpose, one needs a clearer understanding of the relation between the tissue microstructure and the dMRI signal. This requires novel numerical tools capable of simulating the dMRI signal arising from complex microscopic geometrical models of tissues. We propose such a numerical method based on linear finite elements that allows for a more accurate description of complex geometries. The finite element discretization is coupled with the adaptive Runge-Kutta Chebyshev time-stepping method. This method, which leads to second-order convergence in both time and space, is implemented on the FeniCS C++ platform. We also use the mesh generator Salome to work efficiently with multiple-compartment and periodic geometries. Four applications of the method for studying the dMRI signal inside multi-compartment models are considered. In the first application, we investigate the long-time asymptotic behavior of the dMRI signal and show the convergence of the apparent diffusion coefficient to the effective diffusion tensor computed by homogenization. The second application aims to numerically verify that a two-compartment model of cells accurately approximates the three-compartment model, in which the interior cellular compartment and the extracellular space are separated by a finite-thickness membrane compartment. The third application consists in validating the macroscopic Karger model of dMRI signals that takes into account compartmental exchange.
The last application focuses on the dMRI signal arising from isolated neurons. We propose an efficient one-dimensional model for accurately computing the dMRI signal inside neurite networks in which the neurites may have different radii. We also test the validity of a semi-analytical expression for the dMRI signal arising from neurite networks.
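The apparent diffusion coefficient discussed in this abstract is commonly estimated from the mono-exponential signal model S(b) = S0·exp(−b·ADC) via a log-linear fit; a minimal sketch with synthetic, noiseless data follows. The b-values and the diffusivity are illustrative assumptions, not results from the thesis.

```python
import numpy as np

def fit_adc(b_values, signals):
    """Estimate the apparent diffusion coefficient (ADC) and S0 from the
    mono-exponential model S(b) = S0 * exp(-b * ADC) by fitting a straight
    line to log(S) versus b."""
    slope, intercept = np.polyfit(b_values, np.log(signals), 1)
    return -slope, np.exp(intercept)  # (ADC, S0)

# Synthetic signal with a free-water-like diffusivity of 3e-3 mm^2/s
b = np.array([0.0, 250.0, 500.0, 750.0, 1000.0])  # s/mm^2
S = 1.0 * np.exp(-b * 3e-3)
adc, s0 = fit_adc(b, S)
```

With noisy data a weighted or nonlinear fit is usually preferred, since the log transform amplifies noise at high b-values.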
APA, Harvard, Vancouver, ISO, and other styles
47

Gao, Guochao. "Contribution to seismic modeling and imaging in the presence of reflector roughness." Thesis, Ecole centrale de Marseille, 2020. http://www.theses.fr/2020ECDM0010.

Full text
Abstract:
Owing to various geological processes and crustal movements, rough interfaces are widespread within the Earth. A rough interface can strongly affect seismic wave propagation through changes in amplitude, phase, scattering angle, frequency content, and even wave-type conversion. Inevitably, the quality of seismic imaging or inversion is strongly influenced as well. Despite the numerous works devoted to the interaction of waves with rough interfaces, this interaction is far from understood, since it is still difficult to model seismic wave propagation in such a context and, consequently, to reconstruct the subsurface correctly. This thesis studies the effect of rough interfaces on seismic wave modeling and imaging, and explores the potential of an electromagnetic method to remove this effect and thus better image the subsurface. We use a numerical method based on spectral finite elements, specifically the SPECFEM2D code, to model acoustic wave propagation in the time domain. First, we consider a sinusoidal grating and numerically illustrate the consequences of the grating equation on the temporal signals. Then, using f-k analysis, we show the positioning of the different diffraction orders in the frequency-wavenumber domain. After a sensitivity analysis, we select an appropriate configuration that allows the diffraction orders to be separated from a shot gather. Finally, it is shown that the roughness height and the correlation length clearly influence the appearance of the diffracted wavefield.
However, the correlation length has less effect on the energy of the diffracted waves than the interface roughness. We use a full-waveform inversion (FWI) scheme based on the DENISE software to study the influence of the roughness height and the correlation length on seismic imaging results. When the roughness height increases to the dominant wavelength or beyond, random noise dominates the seismic data, and the FWI results deteriorate considerably, especially for the reconstruction of a horizontal reflector located below the rough interface. By contrast, the correlation length has a much weaker effect on the random noise and the quality of the inverted results. As demonstrated in this work, interface roughness has a major impact on seismic wave propagation and imaging. When a rough interface is present in the subsurface, its effect must be critically examined within the FWI framework, in order to correctly reconstruct reflectors possibly located below it and then correctly interpret the subsurface images. In this context, we carry out preliminary tests on the use of a selective extinction method aimed at removing the impact of the roughness on the wavefields. The results are promising and show the potential of the method for better imaging. Moreover, the standard deviation of the amplitude of the processed data appears usable for evaluating the characteristics of the rough interface, which would also be of significant interest to geophysicists and geologists.
Due to various geological processes and crustal movements, rough interfaces widely exist within the Earth. The rough interface can strongly affect seismic wave propagation, manifested as changes in the amplitude, phase, scattering angle, frequency content, and even the wave-type conversion. Inevitably, the quality of seismic imaging or inversion is also greatly influenced. Despite the numerous works devoted to the interaction of waves with rough interfaces, this interaction remains to be better understood, as it is still quite challenging to model the seismic wave propagation and to properly reconstruct the subsurface. The thesis investigates the effect of rough interfaces on seismic wave modeling and imaging, and explores the potential of an electromagnetic method to remove this effect and to better image the subsurface.We use a spectral-element method, and more specifically the code SPECFEM2D, for modeling acoustic wave propagation in the time domain. First, we consider a sinusoidal grating and illustrate numerically the consequences of the grating equation on the temporal signals. Then, using f-k analysis, we show the location of the different diffraction orders in the frequency-wavenumber domain. After a sensitivity analysis, we select an appropriate configuration that allows for the separation of diffraction orders from a shot gather. Last, both roughness height and correlation length are shown to obviously influence the appearance of the diffracted wavefield. However, the correlation length has less effect on the energy of the diffracted waves than the interface roughness.We adopt a full-waveform inversion (FWI) scheme based on the software package DENISE to study the influence of different roughness heights and correlation lengths on seismic imaging results. 
When the roughness height increases up to the dominant wavelength or beyond, the random noise dominates in the seismic data, and the FWI results significantly deteriorate, especially for the reconstruction of a horizontal reflector located below the rough interface. In contrast, the correlation length has a much smaller effect on both the random noise and the quality of the inverted results than the roughness height. As shown here, the interface roughness has a major impact on both seismic wave propagation and imaging. When a rough interface is expected to be present in the subsurface, its effect should be critically considered in FWI, in order to properly reconstruct reflectors possibly located below, and then to properly interpret images of the subsurface. In this context, we perform some preliminary tests on the use of a selective extinction method to remove the impact of the roughness on the wavefields. The results are promising and show the potential of the method for better imaging. In addition, the standard deviation of the amplitude of the processed data may be used to evaluate the characteristics of the rough interface, which is also of interest to geophysicists and geologists.
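The grating equation this abstract refers to, sin θ_m = sin θ_i + m·λ/Λ, determines which diffraction orders of a sinusoidal interface propagate; a minimal sketch follows, where the wavelength and interface period are illustrative values, not parameters from the thesis.

```python
import numpy as np

def diffraction_orders(wavelength, period, incidence_deg=0.0):
    """Propagating diffraction orders of a grating with period `period`:
    sin(theta_m) = sin(theta_i) + m * wavelength / period.
    Orders with |sin(theta_m)| <= 1 propagate; the rest are evanescent.
    Returns {order m: diffraction angle in degrees}."""
    s_i = np.sin(np.radians(incidence_deg))
    orders = {}
    for m in range(-20, 21):  # a generous scan range of candidate orders
        s_m = s_i + m * wavelength / period
        if abs(s_m) <= 1.0:
            orders[m] = float(np.degrees(np.arcsin(s_m)))
    return orders

# Illustrative: 100 m wavelength over a 250 m interface period, normal incidence
angles = diffraction_orders(wavelength=100.0, period=250.0)
```

Here λ/Λ = 0.4, so only orders m = −2…2 propagate; shortening the period or lengthening the wavelength prunes the higher orders, which is what makes the f-k separation of orders mentioned above configuration-dependent.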
APA, Harvard, Vancouver, ISO, and other styles
48

Coggin, Linda L. "Creating discourses of possibility| Storying between the real and the imagined to negotiate rural lives in two elementary classrooms." Thesis, Indiana University, 2014. http://pqdtopen.proquest.com/#viewpdf?dispub=3665263.

Full text
Abstract:

In an age of standardization of learning and the learner, literacy is narrowly defined to view young people from a deficit rather than a strength perspective. Empowering learners to draw on knowledges and experiences that they have access to in their everyday lives broadens the view of literacy learning. Research on literacy as a sociocultural practice, rural literacies, and performance theory frame this inquiry that seeks to understand how students are positioned as capable users of multiple literacy practices. This work examines: How do students living in rural contexts bring personal stories and interests into classrooms to make sense of literacy learning? What pedagogical practices make visible students' personal stories and interests as resources for literacy? How do students negotiate lived and imagined experience in classroom literacy engagements?

Using ethnographic methodologies and a practice centered performance approach, this research foregrounds the complexities of literacy learning that are responsive to this midwestern rural school community. Over the course of one academic year, I observed and participated in the everyday literacy events in a sixth grade and a second grade classroom. This work focuses on how rurality is imagined and experienced in these focal classrooms and the pedagogical practices that establish an ethos of sharing personal stories and experiences. An analysis of student created multimedia videos demonstrates how these literacy events were a location to 1) enact cultural ways of knowing, 2) negotiate multiple discourses that were significant in students' worlds, and 3) create new possibilities for using literacies and participating in classrooms. In the midst of tensions that place students as objects of instructional and political policies, discourses of possibility are created when young people are positioned as capable subjects who contribute and create knowing.

APA, Harvard, Vancouver, ISO, and other styles
49

Diedrich, Tiago Josué. "Um método baseado em ontologias para interpretação e catalogação de elementos pouco evidentes em imagens." Instituto Tecnológico de Aeronáutica, 2014. http://www.bd.bibl.ita.br/tde_busca/arquivo.php?codArquivo=3136.

Full text
Abstract:
Images are used in the most varied fields of interest, whether military, medical, social networks, news, remote sensing, among others. This research aims to support the image representation process, using a method to interpret and catalog scenes of interest. The media annotation process is explored so as to recognize barely evident elements in images. The observed problem exposes the user's difficulty in interpreting and cataloging a larger number of elements contained in a scene, since interpreters are not always able to identify the context of the media, which wastes the resources involved. The objective is to develop a method to support the image analyst's interpretation and cataloging task by means of standardized annotations derived from a domain ontology, combined with rule-based systems. The proposed method was consolidated through the implementation of an image-intelligence plugin for the AEROGRAF Platform, a georeferenced information system. To validate the plugin, a case study based on aerodrome images was carried out. The scenes were classified into three difficulty levels. The team of analysts was divided into two groups: control and experimental. The former performed the cataloging process visually only; the latter used the proposed method. Statistical tools were used to evaluate and validate the case-study results. The method was found to raise the interpreters' level of comprehension, since the experimental group identified a larger number of elements in the scenes. Moreover, statistical evidence showed that the higher the difficulty level of the images, the greater the gain.
APA, Harvard, Vancouver, ISO, and other styles
50

Cueto, Rojas María José. "Incidencia de los elementos visuales sobre el posicionamiento de marca dentro de un proceso de rebranding." Bachelor's thesis, Universidad Peruana de Ciencias Aplicadas (UPC), 2018. http://hdl.handle.net/10757/625251.

Full text
Abstract:
As Berry (1993) (cited in Arthur, 1995) and Pereira (2015) indicate, brands are not static; they are active constructions in constant evolution, influenced by internal and external agents. Since a brand may feel stagnant, outdated or inferior to its competition, renewal and updating is a process to take into account, that is, carrying out a rebranding (Romero, 2015). This is understood as a renewal, a refreshing, a reinvention, a renaming and/or a repositioning (Merrilees and Miller, 2008), which may give rise to the creation of a new logo, term, symbol, design, or a combination of these for an established brand, with the intention of developing a different or a new positioning in the minds of all stakeholders and competitors (Muzellec and Lambkin, 2005). This research focuses on the role of logo redesign in the positioning of a brand, analyzing the brand attributes that are exposed in the logo and posited in its expected positioning within a rebranding strategy. The result is that this visual element is an ideal tool for reinforcing brand attributes in positioning. This is analyzed in the millennial generation, an attractive segment for companies and the one the brand currently aims to target. A descriptive study was carried out with a quantitative methodology, using a survey of the public the brand addresses, which brought to light the power of the logo redesign to compensate for certain weaknesses in the brand's positioning.
As indicated by Berry (1993) (quoted in Arthur, 1995) and Pereira (2015), brands are not static; they are active constructions in recurrent evolution, influenced by internal and external agents. Kapferer (2004) points out that a brand that does not change over time may lose its relevance. Since the brand may feel stagnant, outdated or inferior to its competitors, renewal and updating is a process to be considered, that is, carrying out a rebranding (Romero, 2015). This is taken as a renewal, a refreshing, a reinvention, a renaming and/or a repositioning (Merrilees and Miller, 2008). This process may lead to the creation of a new logo, term, symbol, design or combination of these for an established brand, with the intention of developing a different or new position in the minds of all stakeholders and competitors (Muzellec and Lambkin, 2005). The following research focuses on the role of logo redesign in the positioning of a brand, analyzing the brand attributes that are exposed in the logo and that arise in its expected positioning within a rebranding strategy. The results show that this visual element is an ideal tool for reinforcing brand attributes in positioning. This is analyzed in the millennial generation, an attractive segment for companies and the one the brand currently targets. A descriptive study was conducted with a quantitative methodology, using the survey method on the public the brand addresses, which revealed the power of the logo redesign to compensate for certain weaknesses that the brand's positioning may have.
Thesis
APA, Harvard, Vancouver, ISO, and other styles
