Dissertations / Theses on the topic 'Tomographic technique'

Consult the top 50 dissertations and theses for your research on the topic 'Tomographic technique'.

1

Jacobsson Svärd, Staffan. "A Tomographic Measurement Technique for Irradiated Nuclear Fuel Assemblies." Doctoral thesis, Uppsala University, Department of Nuclear and Particle Physics, 2004. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-4227.

Full text
Abstract:

The fuel assemblies used at the Swedish nuclear power plants typically contain between 100 and 300 fuel rods. An experimental technique has been called for to determine the relative activities of specific isotopes in individual fuel rods without dismantling the assemblies. The purpose is to validate production codes, which requires an experimental relative accuracy of <2 % (1 σ).

Therefore, a new, non-destructive tomographic measurement technique for irradiated nuclear fuel assemblies has been developed. The technique includes two main steps: (1) the gamma-ray flux distribution around the assembly is recorded, and (2) the interior gamma-ray source distribution in the assembly is reconstructed. The use of detailed gamma-ray transport calculations in the reconstruction procedure enables accurate determination of the relative rod-by-rod source distribution.

To investigate the accuracy achievable, laboratory equipment has been constructed, including a fuel model with a well-known distribution of 137Cs. Furthermore, an instrument has been constructed and built for in-pool measurements on irradiated fuel assemblies at nuclear power plants.

Using the laboratory equipment, a relative accuracy of 1.2 % was obtained (1 σ). The measurements on irradiated fuel resulted in a repeatability of 0.8 %, showing the accuracy that can be achieved using this instrument. The agreement between rod-by-rod data obtained in calculations using the POLCA–7 production code and measured data was 3.1 % (1 σ).

Additionally, there is a safeguards interest in the tomographic technique for verifying that no fissile material has been diverted from fuel assemblies, i.e. that no fuel rods have been removed or replaced. The applicability has been demonstrated in a measurement on a spent fuel assembly. Furthermore, detection of both the removal of a rod as well as the replacement with a non-active rod has been investigated in detail and quantitatively established using the laboratory equipment.
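The reconstruction step described in this abstract lends itself to a compact illustration. The sketch below is not the thesis's code: it assumes a precomputed response matrix (which the thesis obtains from detailed gamma-ray transport calculations) and recovers relative rod-by-rod source strengths from simulated flux measurements by ordinary least squares.

```python
# Minimal sketch (not the thesis's actual code): step 2 of the technique,
# recovering relative rod-by-rod source strengths from flux measurements
# recorded around the assembly.  The response matrix W would in practice come
# from detailed gamma-ray transport calculations; here it is synthetic.
import numpy as np

rng = np.random.default_rng(0)
n_rods, n_meas = 64, 400            # e.g. an 8x8 assembly, 400 detector positions

W = rng.uniform(0.01, 1.0, size=(n_meas, n_rods))    # hypothetical response matrix
s_true = rng.uniform(0.8, 1.2, size=n_rods)          # true relative rod activities
m = W @ s_true + rng.normal(0.0, 0.01, size=n_meas)  # measured flux with noise

# Least-squares estimate of the interior source distribution
s_hat, *_ = np.linalg.lstsq(W, m, rcond=None)
s_hat /= s_hat.mean()                                # report relative activities
s_ref = s_true / s_true.mean()
print("max relative deviation: %.3f %%" % (100 * np.max(np.abs(s_hat / s_ref - 1))))
```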

2

Jacobsson Svärd, Staffan. "A tomographic measurement technique for irradiated nuclear fuel assemblies /." Uppsala : Acta Universitatis Upsaliensis : Univ.-bibl. [distributör], 2004. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-4227.

Full text
3

Malalla, Nuhad Abdulwahed Younis. "C-arm Tomographic Imaging Technique for Detection of Kidney Stones." OpenSIUC, 2016. https://opensiuc.lib.siu.edu/dissertations/1278.

Full text
Abstract:
Nephrolithiasis, the formation of kidney stones, is among the most common painful disorders of the urinary system. Various imaging modalities are used to examine patients with symptoms of renal or urinary tract disease, such as plain kidney-ureter-bladder (KUB) x-ray, intravenous pyelography (IVP) and computed tomography (CT). As the traditional three-dimensional (3D) technique for detecting kidney stones, CT provides detailed cross-sectional images as well as the 3D structure of the kidney by moving the x-ray beam in a circle around the body. However, a CT scan of the kidney exposes the patient to considerably more radiation than regular x-rays. The C-arm technique is an x-ray imaging modality that uses a 2D array detector and a cone-shaped x-ray beam to create 3D information about the scanned object; both the x-ray source and the 2D array detector are mounted on a C-shaped wheeled structure (the C-arm). A series of projection images is acquired by rotating the C-arm around the patient along a circular path in a single rotation, and the characteristic structure of the C-arm allows a wide variety of movements around the patient, so the patient can remain stationary during the scan. In this work, we investigated a C-arm tomographic technique (C-arm tomosynthesis) that generates a series of tomographic images for nephrolithiasis and kidney stone detection: a 3D kidney imaging method that provides a series of two-dimensional (2D) images acquired along a partial circular orbit over a limited view angle. Our experiments were performed on a kidney phantom formed from a pig kidney with two embedded kidney stones, using a low radiation dose. The radiation dose and scanning time needed for kidney imaging are both dramatically reduced, owing to the cone-beam geometry and the limited angular rotation.

To demonstrate the capability of C-arm tomosynthesis to generate 3D kidney information for kidney stone detection, two groups of tomographic image reconstruction algorithms were developed: direct algorithms such as filtered back projection (FBP), and iterative algorithms such as the simultaneous algebraic reconstruction technique (SART), maximum likelihood expectation maximization (MLEM), ordered-subset MLEM (OS-MLEM) and pre-computed penalized likelihood (PPL) reconstruction. Three reconstruction methods were investigated: the pixel-driven method (PDM), the ray-driven method (RDM) and the distance-driven method (DDM), which differ in their trade-off between calculation accuracy and computing time. Preliminary results demonstrated the capability of the proposed technique to generate volumetric data of the kidney for nephrolithiasis and kidney stone detection with all of the investigated reconstruction algorithms; although the algorithms differ in strategy, the embedded kidney stones can be clearly visualized in all reconstruction results. Computer simulation studies were also carried out to evaluate the results of each reconstruction algorithm: to mimic the kidney phantom, a simulated phantom containing two kidney stones of different sizes was used, a dataset of projection images was collected with a virtual C-arm tomosynthesis system whose geometric configuration matched the real technique, and all investigated algorithms were used to reconstruct the 3D information. Several image quality metrics were applied to evaluate the imaging system and the reconstruction algorithms. The results show the capability of C-arm tomosynthesis to generate 3D information about kidney structures and to identify the size and location of kidney stones with a limited radiation dose.
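As an illustration of one of the iterative algorithms named in the abstract, the sketch below runs a plain SART iteration on a generic linear system. The system matrix here is synthetic; in the dissertation it would encode the C-arm cone-beam projection geometry, and the actual implementation details (PDM/RDM/DDM interpolation, stopping rules) are not reproduced.

```python
# Illustrative SART iteration on a generic system A x = b.  In the dissertation
# A would encode the C-arm cone-beam geometry; here it is a synthetic matrix.
import numpy as np

rng = np.random.default_rng(1)
n_pix, n_rays = 256, 512
A = rng.uniform(0.0, 1.0, size=(n_rays, n_pix))
x_true = rng.uniform(0.0, 1.0, size=n_pix)
b = A @ x_true

x = np.zeros(n_pix)
row_sum = A.sum(axis=1)            # per-ray normalisation
col_sum = A.sum(axis=0)            # per-pixel normalisation
lam = 0.5                          # relaxation factor

for _ in range(200):
    residual = (b - A @ x) / row_sum
    x = x + lam * (A.T @ residual) / col_sum

print("RMS error after 200 SART iterations:", np.sqrt(np.mean((x - x_true) ** 2)))
```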
4

Kim, Chuyoung. "Algorithms for Tomographic Reconstruction of Rectangular Temperature Distributions using Orthogonal Acoustic Rays." Thesis, Virginia Tech, 2016. http://hdl.handle.net/10919/73754.

Full text
Abstract:
Non-intrusive acoustic thermometry using an acoustic impulse generator and two microphones is developed and integrated with tomographic techniques to reconstruct temperature contours. A low-velocity plume at around 450 °F exiting through a rectangular duct (3.25 by 10 inches) was used for validation and reconstruction. A relative static temperature error of 0.3 % compared with thermocouple-measured data was achieved using a cross-correlation algorithm to calculate the speed of sound. Two tomographic reconstruction algorithms, the simplified multiplicative algebraic reconstruction technique (SMART) and the least squares method (LSQR), are investigated for visualizing temperature contours of the heated plume. A rectangular arrangement of transmitter and microphones with a traversing mechanism collected two orthogonal sets of acoustic projection data. Both reconstruction techniques successfully recreated the overall characteristics of the contour; however, in future work, integration of the refraction effect and additional angled projections are required to improve local temperature estimation accuracy. The root-mean-square percentage errors of reconstructing non-uniform, asymmetric temperature contours using the SMART and LSQR methods are 20% and 19%, respectively.
Master of Science
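A minimal sketch of the time-of-flight step described above, under assumed numbers (sampling rate, microphone spacing, plume temperature): the lag of the cross-correlation peak gives the transit time, the speed of sound follows from the known path length, and temperature follows from the ideal-gas relation c = sqrt(gamma*R*T/M).

```python
# Sketch of acoustic thermometry by cross-correlation.  Signal, geometry and
# noise level are invented for illustration.
import numpy as np

fs = 200_000                      # sampling rate [Hz]
L = 0.254                         # microphone separation [m] (10 inches)
T_true = 505.0                    # plume temperature [K] (~450 degF)
c_true = np.sqrt(1.4 * 8.314 * T_true / 0.02897)

t = np.arange(0, 0.01, 1 / fs)
pulse = np.exp(-((t - 0.001) / 1e-4) ** 2)          # acoustic impulse seen at mic 1
delay = L / c_true
mic1 = pulse + 0.01 * np.random.default_rng(2).normal(size=t.size)
mic2 = np.interp(t - delay, t, pulse) + 0.01 * np.random.default_rng(3).normal(size=t.size)

# Cross-correlation gives the lag (in samples) that best aligns the two signals
xcorr = np.correlate(mic2, mic1, mode="full")
lag = (np.argmax(xcorr) - (t.size - 1)) / fs

c_est = L / lag
T_est = 0.02897 * c_est ** 2 / (1.4 * 8.314)
print(f"estimated speed of sound {c_est:.1f} m/s -> temperature {T_est:.1f} K")
```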
5

Sik, Ayhan Serkan. "X-ray Physics And Computerized Tomography Simulation Using Java And Flash." Master's thesis, METU, 2003. http://etd.lib.metu.edu.tr/upload/3/756239/index.pdf.

Full text
Abstract:
For education in X-ray imaging, a detailed knowledge of the interaction of radiation with matter is very important, and the concepts of X-ray generation and detection also have to be grasped well. It is often not easy for medical doctors and engineers who have not studied modern physics at an appropriate level to visualize these interactions and assess them at the quantum-physics level. This thesis aims to visualize these interactions, X-ray generation and detection, and computerized tomographic imaging. With these simulations, the user can 1) observe and analyze which type of interaction occurs under which conditions, 2) understand the interaction cross sections and interaction outcomes, 3) visualise X-ray generation and detection features, and 4) grasp the method of image reconstruction and the factors affecting image quality in a computerized tomography system. This is accomplished by changing the controllable variables of the radiation and of the systems through the provided interfaces. In this thesis, Java/Flash-based simulation interfaces are designed so that the subject can be studied easily. The benefit of this software is that the programs run on the World Wide Web, so the interfaces are accessible from anywhere, at any time.
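In the spirit of the interaction simulation described above, the sketch below samples which interaction each photon undergoes from assumed relative cross sections. The numbers are illustrative only; a real simulation would take energy- and material-dependent cross sections from tabulated data.

```python
# Minimal Monte Carlo sampling of the interaction type per photon.
# Cross sections below are made-up values, not physical data.
import numpy as np

rng = np.random.default_rng(4)
cross_sections = {"photoelectric": 0.2, "compton": 0.7, "coherent": 0.1}  # illustrative only

names = list(cross_sections)
p = np.array([cross_sections[n] for n in names])
p = p / p.sum()                       # interaction probabilities

n_photons = 100_000
events = rng.choice(names, size=n_photons, p=p)
for n in names:
    print(f"{n:>14s}: {np.mean(events == n):.3f}")
```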
6

Paduelli, Marcela Candian. "Estudo da técnica de ondas de tensão como instrumento de avaliação interna de árvores urbanas." Universidade Federal de São Carlos, 2011. https://repositorio.ufscar.br/handle/ufscar/4180.

Full text
Abstract:
Financiadora de Estudos e Projetos
Arborization is fundamental in urban spaces, due to the significant benefits it provides. An important issue in guaranteeing that it adequately fulfils this role is its maintenance, which calls for preventive diagnostic methods that allow the evaluation of both the external and internal conditions of trees. Currently, the analysis of urban trees is carried out basically through external visual inspection, which is highly subjective. For a complete and reliable analysis, it is important to combine this external visual analysis with an internal analysis. This can be done by applying non-destructive techniques, which allow the inside of the tree to be visualized through a tomographic image without causing it any damage. Among the non-destructive techniques applicable to the internal evaluation of urban trees, the stress wave technique stands out. Considering the great benefits this technique can provide in the evaluation of urban trees, this research sought to verify its scientific and technical viability, analysing its reliability and establishing some parameters for its application. Studies were carried out with 12 trees of the species Caesalpinea peltophoroides (popularly known as Sibipiruna). The stress wave equipment was applied to cross-sections at 50, 90 and 130 cm from the base of the trees, generating tomographic images of these sections. Afterwards, the trees were cut at the level of these sections and photographed, so that the photographs could be compared with the tomographic images. The results show that the stress wave technique is highly applicable to the internal evaluation of urban trees, achieving significant results. Some parameters for the application of the technique were also established: for instance, it is necessary to determine the reference wave propagation speed for the interpretation of the tomographic images, as well as the best height for the test section, which corresponded to the diameter at breast height (DAP), where the average density of the tree is representative.
A arborização é fundamental nos espaços urbanos, devido aos importantes benefícios que proporciona. Para garantir que esta exerça adequadamente seu papel, é importante a sua manutenção, na qual é necessário o emprego de métodos de diagnóstico preventivos, que permitam avaliar as condições externas e internas das árvores. Atualmente, a análise de árvores urbanas é realizada basicamente pela análise visual externa, a qual apresenta grande subjetividade. Para uma análise completa e segura é importante que a análise visual externa seja acompanhada de uma análise interna. Esta pode ser realizada por meio de técnicas nãodestrutivas, que possibilitam a visualização interna da árvore através da imagem tomográfica, sem causar nenhum dano a esta. Dentre as técnicas não-destrutivas existentes, com aplicabilidade na avaliação interna de árvores urbanas, pode-se destacar a técnica de ondas de tensão. Diante dos grandes benefícios que esta técnica pode proporcionar na avaliação de árvores urbanas, esta pesquisa buscou verificar sua viabilidade técnica e científica, analisando a sua confiabilidade e estabelecendo parâmetros para sua aplicação. Foram realizados estudos com 12 árvores da espécie Caesalpinea peltophoroides (conhecida popularmente como Sibipiruna), nas quais foram realizadas medições em seções a 50, 90 e 130 cm da base, empregando o equipamento de ondas de tensão, o qual gerou as imagens tomográficas das seções. Em seguida, as árvores foram cortadas nas seções de ensaio e fotografadas, para posterior comparação com a imagem tomográfica gerada. Os resultados permitiram verificar que a técnica de ondas de tensão apresenta grande viabilidade de aplicação na avaliação interna de árvores urbanas, com resultados bastante significativos. Foi possível também estabelecer alguns parâmetros de aplicação da técnica, como a necessidade da determinação da velocidade de propagação da onda de referência, para a interpretação da imagem tomográfica e a determinação da melhor altura da seção para realização dos ensaios, a qual consistiu na altura do diâmetro à altura do peito (DAP), na qual se encontra a representatividade da densidade média da árvore.
7

Boutet, Jérôme. "Localisation d'inclusions fluorescentes dans les milieux diffusants à l'aide de techniques laser. Application au diagnostic médical in vivo." Thesis, Grenoble, 2012. http://www.theses.fr/2012GRENY009/document.

Full text
Abstract:
La tomographie de fluorescence est une méthode d'imagerie préclinique et clinique permettant de localiser des traceurs fluorescents préalablement injectés ou naturellement présents dans un organisme vivant. Ce travail de thèse à consisté, dans un premier temps, à définir l'architecture et les conditions d'utilisation optimales d'un tomographe de fluorescence continu appliqué à l'observation de tissus de faible épaisseur. On s'est en particulier attaché à traiter le problème de l'observation de tissus hétérogènes et d'organes fortement absorbants. Dans un deuxième temps, pour observer des tissus de plus grande épaisseur, nous avons montré l'apport de la mesure du temps de vol moyen des photons pour améliorer la localisation d'inclusions fluorescentes. Les performances de deux types de systèmes capables de réaliser ce type de mesure ont été comparées et nous avons proposé un protocole permettant d'en optimiser les principaux paramètres. Notre procédé a été appliqué à la problématique du guidage de biopsies prostatiques. Il pourra aussi être utilisé pour visualiser d'autres pathologies moyennant une simple adaptation
Fluorescence tomography is a preclinical and clinical imaging method which aims to localize fluorescent probes injected into, or naturally present in, a living organism. In this thesis work, we first defined the optimal design and operating conditions of a continuous-wave fluorescence tomograph applied to the observation of thin tissues, addressing in particular the problem of observing heterogeneous tissues and highly absorbing organs. Secondly, to observe thicker tissues, we showed the benefit of measuring the mean photon time of flight for improving the localization of fluorescent inclusions. The detection performance of two types of system capable of such measurements was compared, and we proposed a protocol for optimizing their main parameters. The method was applied to the problem of guiding prostate biopsies; it could also be used to detect and localize other pathologies with a simple adaptation.
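The time-domain quantity exploited in this work, the mean photon time of flight, can be illustrated with a toy example. The temporal point spread functions below are synthetic (not diffusion-model solutions); the point is only that a deeper fluorescent inclusion lengthens the photon paths and therefore increases the first moment of the measured pulse.

```python
# Mean time of flight as the first moment of a (synthetic) temporal response.
import numpy as np

t = np.linspace(0, 10e-9, 2000)                 # time axis [s]

def tpsf(t, t0, tau):
    """Toy TPSF: a delayed, exponentially broadened pulse (not a diffusion model)."""
    return np.where(t > t0, np.exp(-(t - t0) / tau), 0.0)

shallow = tpsf(t, 0.5e-9, 0.4e-9)               # inclusion close to the surface
deep = tpsf(t, 1.2e-9, 0.8e-9)                  # deeper inclusion

for name, h in [("shallow", shallow), ("deep", deep)]:
    mean_tof = np.sum(t * h) / np.sum(h)        # first moment = mean time of flight
    print(f"{name:>7s} inclusion: mean time of flight = {mean_tof * 1e9:.2f} ns")
```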
8

Breckon, W. R. "Image reconstruction in Electrical Impedance Tomography." Thesis, Oxford Brookes University, 1990. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.292254.

Full text
9

Pratte, Jean-François. "Conception d'un amplificateur filtre rapide en CMOS 0.35 um destiné à un tomographe à émission de positrons animal." Sherbrooke : Université de Sherbrooke, 2002.

Find full text
10

Şık, Ayhan Serkan. "X-ray physics and computerized tomography simulation using java and flash." Ankara : METU, 2003. http://etd.lib.metu.edu.tr/upload/756239/index.pdf.

Full text
Abstract:
Thesis (M.S.)--Middle East Technical University, 2003.
Keywords: Radiation interaction with matter, cross section of interaction, radiation generation and detection, computerized tomographic imaging, Java/Flash simulations.
11

Lundqvist, Saleh Tobias. "Tomographic Techniques for Safeguards Measurements of Nuclear Fuel Assemblies." Licentiate thesis, Uppsala universitet, Institutionen för neutronforskning, 2007. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-85831.

Full text
Abstract:
Nuclear power is currently experiencing increased interest around the world. New nuclear reactors are being built and techniques for taking care of the nuclear waste are being developed. This development places new demands and standards on safeguards, i.e. the international efforts for ensuring the non-proliferation of nuclear weapons. New measuring techniques and devices are continuously being developed to enhance the ability to detect diversion of fissile material. In this thesis, tomographic techniques for application in safeguards are presented. Tomographic techniques can non-destructively provide information about the inner parts of an object and may thus be used to verify that no material is missing from a nuclear fuel assembly. When using the tomographic technique described in this thesis, the radiation field around a fuel assembly is first recorded. In a second step, the internal source distribution is mathematically reconstructed based on the recorded data. In this work, a procedure for tomographic safeguards measurements is suggested and the design of a tomographic measuring device is presented. Two reconstruction algorithms have been specially developed and evaluated for application to nuclear fuel: one for image reconstruction and one for reconstructing conclusive data at the individual fuel rod level. The combined use of the two algorithms is suggested. The applicability for detecting individual removed or replaced rods has been demonstrated, based on experimental data.
12

Deng, Junjun. "Parallel computing techniques for computed tomography." Diss., University of Iowa, 2011. https://ir.uiowa.edu/etd/945.

Full text
Abstract:
X-ray computed tomography is a widely adopted medical imaging method that uses projections to recover the internal image of a subject. Since the invention of X-ray computed tomography in the 1970s, several generations of CT scanners have been developed. As 3D image reconstruction increases in popularity, the long processing time associated with these machines has to be significantly reduced before they can be practically employed in everyday applications. Parallel computing is a computing technique that utilizes multiple computer resources to process a computational task simultaneously; each resource computes only a part of the whole task, thereby greatly reducing computation time. In this thesis, we use parallel computing technology to speed up the reconstruction while preserving image quality. Three representative reconstruction algorithms--namely, the Katsevich, EM, and Feldkamp algorithms--are investigated in this work. With the Katsevich algorithm, a distributed-memory PC cluster is used to conduct the experiment. This parallel algorithm partitions and distributes the projection data to different computer nodes to perform the computation, and upon completion of each sub-task, the results are collected by the master computer to produce the final image. This parallel algorithm uses the same reconstruction formula as the sequential counterpart, which gives an identical image result. The parallel implementation of the iterative (EM) CT algorithm uses the same PC cluster as the first one. However, because it is based on a local CT reconstruction algorithm, which differs from the sequential EM algorithm, the image results differ from those of the sequential counterpart. Moreover, a special strategy using inhomogeneous resolution was used to further speed up the computation. The results showed that the image quality was largely preserved while the computational time was greatly reduced. Unlike the two previous approaches, the third type of parallel implementation uses a shared-memory computer. Three major accelerating methods--SIMD (single instruction, multiple data), multi-threading, and OS (ordered subsets)--were employed to speed up the computation. Initial investigations showed that the image quality was comparable to that of the conventional approach, though the computation speed was significantly increased.
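The data-partitioning idea used with the distributed-memory cluster can be sketched as follows: each worker backprojects a subset of the projection angles and the master sums the partial images. This is a toy unfiltered parallel-beam backprojection on synthetic data, not the Katsevich, EM or Feldkamp implementations of the thesis.

```python
# Toy parallel backprojection: partition projection angles over worker
# processes, backproject each chunk, then sum the partial images.
import numpy as np
from multiprocessing import Pool

N = 128                                            # image is N x N
angles = np.linspace(0.0, np.pi, 180, endpoint=False)
rng = np.random.default_rng(5)
sinogram = rng.random((angles.size, N))            # stand-in projection data

xs = np.arange(N) - N / 2
X, Y = np.meshgrid(xs, xs)

def backproject(chunk):
    """Backproject one chunk of (angle, projection) pairs into a partial image."""
    img = np.zeros((N, N))
    for theta, proj in chunk:
        s = X * np.cos(theta) + Y * np.sin(theta)  # detector coordinate of each pixel
        idx = np.clip(np.round(s + N / 2).astype(int), 0, N - 1)
        img += proj[idx]
    return img

if __name__ == "__main__":
    pairs = list(zip(angles, sinogram))
    chunks = [pairs[i::4] for i in range(4)]       # partition angles over 4 workers
    with Pool(4) as pool:
        partial_images = pool.map(backproject, chunks)
    image = sum(partial_images)                    # master sums the partial results
    print("reconstructed image shape:", image.shape)
```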
13

Desaubry, Christophe. "Conception de deux dispositifs expérimentaux pour la vélocimétrie par imagerie de particules : Application aux écoulements basse vitesse." Lyon, INSA, 1995. http://www.theses.fr/1995ISAL0111.

Full text
Abstract:
L'étude des mécanismes de transfert de chaleur par convection (naturelle ou mixte) interfaces fluide-paroi, nécessite une bonne connaissance de la dynamique des écoulements basse vitesse. Actuellement, la tomographie laser est une technique de visualisation fréquemment utilisée pour la description qualitative de la dynamique des fluides. A la tomographie peut être couplée la Vélocimétrie par Imagerie de Particules (V. I. P. ). Cette méthode de mesure consiste à extraire des informations quantitatives d'images doublement exposées et obtenues par la tomographie laser. Dans ce travail, deux dispositifs expérimentaux sont présentés afin de réaliser la détermination de champs de vitesses instantanées dans des écoulements basse vitesse. Les deux appareillages reposent sur la capture d'images vidéo issues de caméras standards et sur leur exploitation à partir de techniques de traitement numérique utilisées généralement en V. I. P. Une approche systématique, mais égaleme comparative, des dispositifs est réalisée tout en s'appuyant sur différentes applications qui mettent en œuvre des mécanismes d'écoulements diverses
The study of the heat transfer that occurs in convective flows requires a very good knowledge of low-velocity flow dynamics. Laser tomography is currently a visualization technique commonly used for the qualitative description of fluid dynamics. Particle Image Velocimetry (PIV), recently developed, is derived from this technique: it is a quantitative method that extracts two-dimensional velocity-field information from a double-exposed image of the fluid in motion. Two experimental setups are described in this work for determining instantaneous velocity fields in low-velocity flows. Both are based on the capture of video images from standard cameras and on digital processing techniques commonly used in PIV measurements. A systematic and comparative analysis of the two setups is carried out by considering different applications involving a variety of flow mechanisms.
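A minimal PIV-style sketch of the cross-correlation step described above: the displacement of a synthetic particle pattern between two exposures is taken from the peak of their 2D cross-correlation. Real PIV processing adds interrogation windowing, sub-pixel peak fitting and vector validation, none of which is shown here.

```python
# Displacement of a particle pattern between two frames via FFT-based
# cross-correlation.  The images are synthetic.
import numpy as np

rng = np.random.default_rng(6)
N = 64
frame1 = np.zeros((N, N))
frame1[rng.integers(0, N, 300), rng.integers(0, N, 300)] = 1.0   # random particles
true_shift = (3, 5)                                              # (rows, cols) between exposures
frame2 = np.roll(frame1, true_shift, axis=(0, 1))

# Circular cross-correlation via the Fourier transform (adequate for this toy case)
corr = np.fft.ifft2(np.fft.fft2(frame2) * np.conj(np.fft.fft2(frame1))).real
peak = np.unravel_index(np.argmax(corr), corr.shape)
shift = [(p + N // 2) % N - N // 2 for p in peak]                # wrap to signed shift

print("estimated displacement (rows, cols):", tuple(shift), "true:", true_shift)
```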
14

Choi, Young Jin. "Application of tomographic techniques for rheological measurements and process control /." For electronic version search Digital dissertations database. Restricted to UC campuses. Access is free to UC campus dissertations, 2003. http://uclibs.org/PID/11984.

Full text
15

Collaer, Marcia Lee. "Image Data Compression: Differential Pulse Code Modulation of Tomographic Projections." Thesis, The University of Arizona, 1985. http://hdl.handle.net/10150/291412.

Full text
16

Teichmann, Ulrich, Astrid Ziemann, Klaus Arnold, and Armin Raabe. "Akustische Tomographie und optische Scintillometertechnik zur Sondierung der atmosphärischen Grenzschicht." Universitätsbibliothek Leipzig, 2016. http://nbn-resolving.de/urn:nbn:de:bsz:15-qucosa-214141.

Full text
Abstract:
Während eines Experimentes an der Forschungsstation Melpitz des IfT (Institut für Tropossphärenforschung) im September 1997 wurden erstmalig zwei verschiedene Meßmethoden gleichzeitig eingesetzt, die flächengemittelte Lufttemperaturen (Akustische Tomographie - Leipziger Institut für Meteorologie (LIM)) sowie liniengemittelte fühlbare Wärmeflüsse (Scintillometertechnik - IfT) lieferten. Es konnte gezeigt werden, daß teilweise erhebliche Temperaturdifferenzen an einem Strahlungstag auf dieser oberflächlich betrachteten horizontal homogenen Wiese existieren. Die geringe Datenbasis, größtenteils bedingt durch die ungünstige Anströmrichtung während dieses Zeitraums, läßt noch keinen sicheren Schluß zu, ob diese horizontalen Temperaturdifferenzen für die ebenfalls beobachteten horizontalen Unterschiede der vertikalen fühlbaren Wärmeflüsse und damit für die manchmal in Melpitz beobachtete Nicht-Schließung der Energiebilanz verantwortlich sind
During an experiment at the IfT field research station Melpitz in September 1997, two different techniques were used for the first time to determine simultaneously area-averaged air temperatures (acoustic tomography - LIM) and line-averaged sensible heat fluxes (scintillation technique - IfT). It could be shown that on a 'golden' day appreciably large temperature differences occurred on this superficially horizontally homogeneous meadow. Because of the weak data base, mostly due to difficult fetch conditions, it could not be proven that these temperature differences led to the observed horizontal differences of the vertical sensible heat fluxes and therefore to the non-closure of the energy balance sometimes observed at Melpitz.
17

Murcia, Jérôme de. "Reconstruction d'images cardiaques en tomographie d'émission monophotonique à l'aide de modèles spatio-temporels." Grenoble INPG, 1996. http://www.theses.fr/1996INPG0078.

Full text
Abstract:
Single photon emission tomography provides a sequence of 3D images representing the distribution of the tracer administered to the patient at different instants of the cardiac cycle, highlighting poorly perfused regions of the myocardium. To avoid immobilizing the patient for too long, acquisition times are limited, which leads to very noisy measurements. Since reconstruction is an ill-posed inverse problem, the tomographic images are therefore strongly degraded. In this thesis we propose two methods for improving the statistical quality of these images by temporally regularizing the reconstruction process. Since a simple temporal average introduces motion blur, the motion of the myocardium is first estimated by tracking three characteristic surfaces of the myocardium, and is then integrated into the reconstruction algorithm. The first method reconstructs a particular phase given the complete set of acquired measurements and the law of motion; the reconstruction is formulated within the theoretical framework of Kalman filtering. As the formal solution requires the inversion of a very large matrix, we propose a sub-optimal but fast solution, a recursive algorithm based on filtering-backprojection-type operations. We then present a second approach in which all phases are reconstructed simultaneously: the sequence is reconstructed by minimizing a quadratic function established within a spatio-temporal regularization framework, with the images computed iteratively using the conjugate gradient algorithm. Experimental results demonstrate the validity of our approach and highlight the contribution of temporal regularization.
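The second approach, joint reconstruction of all phases by minimising a quadratic criterion, can be illustrated with a generic conjugate-gradient solver. The operators below are synthetic placeholders (the temporal regulariser is a simple first-difference between frames), not the thesis's system model.

```python
# Conjugate gradient for min ||A x - b||^2 + lam ||D x||^2, where D couples
# consecutive frames.  Sizes and operators are synthetic.
import numpy as np

rng = np.random.default_rng(7)
n_frames, n_pix = 8, 50
n = n_frames * n_pix

A = rng.normal(size=(2 * n, n)) / np.sqrt(n)        # stand-in projection operator
x_true = np.tile(rng.random(n_pix), n_frames) + 0.05 * rng.normal(size=n)
b = A @ x_true + 0.01 * rng.normal(size=2 * n)

# Temporal first-difference operator between consecutive frames
D = np.zeros((n - n_pix, n))
for k in range(n - n_pix):
    D[k, k] = -1.0
    D[k, k + n_pix] = 1.0

lam = 1.0
M = A.T @ A + lam * D.T @ D                          # normal-equation matrix
c = A.T @ b

# Plain conjugate gradient on M x = c
x = np.zeros(n)
r = c - M @ x
p = r.copy()
for _ in range(200):
    Mp = M @ p
    alpha = (r @ r) / (p @ Mp)
    x += alpha * p
    r_new = r - alpha * Mp
    beta = (r_new @ r_new) / (r @ r)
    p = r_new + beta * p
    r = r_new

print("relative reconstruction error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```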
18

Leung, Hing Tong Lucullus. "Development of an electrical impedance tomograph for complex impedance imaging." Thesis, University of South Wales, 1991. https://pure.southwales.ac.uk/en/studentthesis/development-of-an-electrical-impedance-tomograph-for-complex-impedance-imaging(b3f26e76-490d-4364-a270-28cff1dccd70).html.

Full text
Abstract:
This project concerns the development of electrical impedance tomography towards the production of complex impedance images. The prime intention was to investigate the feasibility of developing suitable instrumentation, but not clinical applications. It was aimed to develop techniques for the performance evaluation of data collection systems. To achieve this it was necessary to design and develop a multi-current-source type impedance tomography system, to act as a platform for the current study and for future work. The system developed is capable of producing conductivity and permittivity images. It employs microprocessor based data collection electronics, providing portability between a range of possible host computers. The development of the system included a study of constant amplitude current source circuits leading to the design and employment of a novel circuit. In order to aid system testing, a surface mount technology resistor-mesh test object was produced. This has been adopted by the EEC Concerted Action on Impedance Tomography (CAIT) programme as the first standard test object. A computer model of the phantom was produced using the industry standard ASTEC3 circuit simulation package. This development allows the theoretical performance of any system topology, at any level of detail, to be established. The imaging system has been used to produce images from test objects, as well as forearm and lung images on humans. Whilst the conductivity images produced were good, the permittivity in-vivo images were noisy, despite good permittivity images from test objects. A study of the relative merits of multiple and single stimulus type systems was carried out as a result of the discrepancies in the in-vivo and test object images. This study involved a comparison of the author's system with that of Griffiths at the University Hospital of Wales. The results showed that the multi-current-source type system, whilst able to reduce stray capacitance, creates other more significant errors due to circuit matching; future development in semiconductor device technology may help to overcome this difficulty. It was identified that contact impedances together with the effective capacitance between the measurement electrode pairs in four-electrode systems reduce the measurability of changes in phase. A number of benchmarking indices were developed and implemented, both for system characterisation and for practical/theoretical design comparisons.
19

Laidlaw, James Stuart. "Tomographic techniques and their application to geotechnical and groundwater flow problems." Thesis, University of British Columbia, 1987. http://hdl.handle.net/2429/28493.

Full text
Abstract:
Most downhole tools in use today measure properties immediately adjacent to the borehole, and as such, only a small portion of the subsurface volume is known with any degree of certainty. When dealing with geologic situations which are characteristically heterogeneous, the engineer often requires more information than what present tests can provide. Tomography is an in-situ testing method that allows the generation of a two dimensional subsurface image by reconstructing material property variations between boreholes. It is essentially a solution to the inverse problem where signals are measured and, through computer manipulation, are used to infer material contrasts in the subsurface. For the purposes of this thesis, a two dimensional configuration is used to demonstrate and evaluate the tomographic technique with source and receiver locations positioned at intervals down adjacent and nearly vertical boreholes. Both iterative and direct matrix solution methods are used to evaluate the use of seismic and groundwater flow data for subsurface tomography. The iterative methods include a variation of the classical algebraic reconstruction technique (CART), a modified version of the ART algorithm (MART), and a modified version of the ART algorithm using the Chebyshev norm criterion (LART). The purpose of the iterative tests is to determine the best algorithm for signal reconstruction when data noise and different damping parameters are applied. The matrix methodologies include a constrained L¹ linear approximation algorithm and singular value decomposition routines (SVD). These methods solve the set of linear equations (Ax = b) which the tomographic techniques produce. The purpose of this stage of testing is to optimize a direct method of solution to the sets of linear equations such that different forms of anomaly can be discerned. Numerous synthetic seismic and groundwater data sets are used by both iterative and matrix algorithms. Seismic test data sets are generated by calculation of transit times through materials of known seismic velocity. Groundwater test data sets are generated by drawdown analyses and finite element procedures. All algorithms demonstrate a reasonable ability at reconstructing sections which closely resembled the known profiles. Vertical anomalies, however, are not as well defined as horizontal anomalies. This is primarily a result of incomplete cross-hole scanning geometry which also affects the rank and condition of the matrices used by the direct forms of solution. The addition of Gaussian noise to the data produces poor reconstructions regardless of the type of algorithm used. This emphasizes the fact that tomographic techniques require clear and relatively error-free signals.
Science, Faculty of
Earth, Ocean and Atmospheric Sciences, Department of
Graduate
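The damped ART iteration evaluated in this thesis can be sketched on a synthetic travel-time system, as below. The matrix of ray-path lengths is random rather than derived from a cross-hole geometry, and the damping value is arbitrary.

```python
# Damped ART (Kaczmarz) sweeps over a synthetic travel-time system A x = b,
# where x is a slowness image and A holds ray-path lengths per cell.
import numpy as np

rng = np.random.default_rng(8)
n_cells, n_rays = 100, 300
A = rng.uniform(0.0, 1.0, size=(n_rays, n_cells))    # ray-path lengths per cell (synthetic)
x_true = rng.uniform(1.0, 2.0, size=n_cells)         # true slowness values
b = A @ x_true                                       # travel times

x = np.full(n_cells, x_true.mean())                  # start from a homogeneous model
lam = 0.25                                           # damping parameter

for sweep in range(50):
    for i in range(n_rays):
        a = A[i]
        x += lam * (b[i] - a @ x) / (a @ a) * a      # damped Kaczmarz projection

print("RMS slowness error:", np.sqrt(np.mean((x - x_true) ** 2)))
```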
20

Wong, Ching-yee Oliver (王晴兒). "Measurement of cerebrovascular perfusion reserve using single photon emission tomographic techniques." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 1998. http://hub.hku.hk/bib/B31981677.

Full text
21

Bhatia, Mickey. "Wavelet transform-based multi-resolution techniques for tomographic reconstruction and detection." Thesis, Massachusetts Institute of Technology, 1994. http://hdl.handle.net/1721.1/11650.

Full text
22

Nikitidis, Michail S. "Application of gamma-ray tomographic techniques in granular flows in hoppers." Thesis, University of Surrey, 1997. http://epubs.surrey.ac.uk/844103/.

Full text
Abstract:
The aim of this dissertation is to demonstrate the potential of novel measurement techniques based on the scanning of gamma-ray transmission in the investigation of axially-symmetric flow properties of granular materials in 3D hoppers. Furthermore, the results of the experimental investigations are compared on a strictly quantitative basis with Newtonian Dynamics (i.e. Discrete Element simulations) and Molecular Dynamics (i.e. kinetic gas theory calculations). Measurements were performed using two specially constructed scanner systems of different geometric configuration of gamma-ray sources and detectors (namely parallel and fan beam arrangements respectively). The fan beam scanner has been developed entirely in the Department of Chemical & Process Engineering by the author of this thesis and therefore a significant part of the thesis deals with major points concerning both hardware and software development as well as associated calibration procedures. Gas-phase continuous mono-disperse systems have been studied using (i) the full tomographic imaging technique which is able to produce 3D planar maps of voidage at selected heights of a storage vessel and (ii) the single profile absorptiometric technique capable of producing voidage profiles in both Cartesian and polar coordinates at much faster acquisition rates. Results were compared with earlier Distinct Element numerical simulations showing encouraging agreement in terms of both the absolute values of voidage and their spatial fluctuations as well as the geometric structure of the static and dynamic particle assemblies. Size segregation in airborne binary mixtures has been quantified using the novel dual energy photon technique which is capable of producing solids fraction profiles for each of the individual components of a binary mixture in addition to the voidage profiles. Spatial and temporal data on solids fractions in a binary mixture were analysed using methodology based on statistical mechanics principles which led to the definition of "micro-turbulence" during flow in terms of the self-diffusion velocities of individual solid components. This then allows the calculation of both the self- and mutual-diffusion coefficients used to quantify size segregation. These calculations were also compared with theoretical predictions based on the kinetic gas theory which was found to grossly over-predict the calculated diffusion coefficients in slow-shearing granular flows.
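The dual-energy photon principle used to separate the two solid components can be illustrated with a two-equation Beer-Lambert model along a single chord (gas attenuation neglected). All coefficients below are invented for illustration and are not data from the thesis.

```python
# Two attenuation measurements at two photon energies give two linear
# equations in the two solids fractions along a chord; voidage follows
# by difference.  Numbers are illustrative only.
import numpy as np

L = 0.10                                   # chord length through the hopper [m]
# Linear attenuation coefficients [1/m] of the two solid components at two energies
mu = np.array([[30.0, 80.0],               # energy 1: component A, component B
               [12.0, 20.0]])              # energy 2: component A, component B

f_true = np.array([0.35, 0.20])            # true solids fractions (A, B)
log_att = L * mu @ f_true                  # measured ln(I0/I) at the two energies

f_est = np.linalg.solve(L * mu, log_att)   # invert the 2x2 system
voidage = 1.0 - f_est.sum()
print("solids fractions:", f_est.round(3), "voidage: %.3f" % voidage)
```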
23

Sharaf, Jamal Mahmood. "Elemental analysis of biological matrices using emission and transmission tomographic techniques." Thesis, University of Surrey, 1994. http://epubs.surrey.ac.uk/844448/.

Full text
Abstract:
The main objective of this study has been to investigate the feasibility of using tomographic techniques for non-destructive analysis. A potentially useful technique with neutrons as probes for material characterisation is presented. The technique combines the principles of reconstructive tomography with instrumental neutron activation analysis (INAA) so that elemental distributions in a section through a specimen can be mapped. Neutron induced gamma-ray emission tomography (NIGET) technique, where prompt or delayed gamma-rays can be detected in a tomographic mode, has been developed for samples irradiated in the core of a nuclear reactor and used in studies of different biological matrices. The capabilities of the technique will be illustrated using a spatial resolution of 1 mm. The quantitative usefulness of NIGET depends on the accuracy of compensation for the effect of scattering and attenuation as well as determination of the tomographic system characteristics which contribute to the intrinsic measurement process. It will be shown how quantitative information about the induced radionuclide concentration distribution in a specimen can be obtained when compensation for scattered gamma-rays is taken into account employing a high resolution semiconductor detector and a method of scattering correction based upon the use of three energy windows to collect emission data. For attenuation correction an iterative method which combined emission and transmission measurements has been implemented and its performance was compared to the performance of a number of other attenuation correction algorithms. The work involved investigation into the role of a number of factors which influence the accuracy of data acquisition. An efficiency-resolution figure of merit as a function of collimator efficiency, system resolution and object diameter has been defined. Further, a number of reconstruction techniques were investigated and compared for accuracy, minimum number of projections required and their ability to handle noise. Reconstruction by filtered back projection was fastest to compute, but performed poorly when compared to iterative techniques.
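The three-energy-window scatter compensation mentioned above can be illustrated with the standard triple-energy-window estimator, in which the scatter under the photopeak is approximated by a trapezoid spanned by two narrow flanking windows; the thesis's exact formulation may differ, and the counts below are made up.

```python
# Standard triple-energy-window (TEW) scatter estimate under the photopeak.
def tew_primary(c_peak, c_low, c_high, w_peak, w_low, w_high):
    """Return (primary counts, estimated scatter) for the photopeak window."""
    scatter = (c_low / w_low + c_high / w_high) * w_peak / 2.0
    return max(c_peak - scatter, 0.0), scatter

# Example: a wide photopeak window and two 3 keV flanking windows (invented counts)
primary, scatter = tew_primary(c_peak=12000, c_low=900, c_high=300,
                               w_peak=28.0, w_low=3.0, w_high=3.0)
print(f"estimated scatter: {scatter:.0f} counts, primary: {primary:.0f} counts")
```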
24

Wong, Ching-yee Oliver. "Measurement of cerebrovascular perfusion reserve using single photon emission tomographic techniques." Hong Kong : University of Hong Kong, 1998. http://sunzi.lib.hku.hk/hkuto/record.jsp?B19605328.

Full text
25

Bourret, Rodolphe. "Etude d'algorithmes de reconstruction en tomographie d'impédance : technique analytique et technique numérique par la méthode des éléments finis." Toulouse 3, 1990. http://www.theses.fr/1990TOU30047.

Full text
Abstract:
Electrical impedance tomography imaging is a new, non-invasive technique that appears potentially complementary to other techniques. It consists of reconstructing cross-sectional images of a physiological medium from induced electrical measurements, which are functions of the conductivity properties of the medium. In the first chapter, a bibliographic review presents impedance tomography imaging. This study guided the choice of a mathematical model characterizing the wave-matter interaction (Maxwell's equations) and of reconstruction algorithms that solve this problem either by analytical or by numerical methods. The second part concerns their implementation in unit U305 and the results obtained. This research work focuses more particularly on numerical solution and on the finite element method. An important part of this work consisted of adapting this method to impedance tomography and of producing general, modular programs that are relatively simple to use. A simple reconstruction algorithm proposed by Wexler was tested. Another reconstruction algorithm was developed, using the singular perturbation method following an idea of Alessandrini (assuming that the potential is known inside the medium). Possible improvements to the various techniques discussed above are analysed in the third part.
26

Xie, Yao. "Adaptive and Robust Techniques (ART) for thermoacoustic tomography." [Gainesville, Fla.] : University of Florida, 2006. http://purl.fcla.edu/fcla/etd/UFE0015243.

Full text
27

Goulet, Mathieu. "Application of tomography techniques to plastic scintillation dosimetry." Thesis, Université Laval, 2014. http://www.theses.ulaval.ca/2014/30599/30599.pdf.

Full text
Abstract:
Cette thèse porte sur le développement d’outils de contrôle de qualité pour les traitements de radiothérapie externe. Le but principal vise à incorporer les principes de tomographie à la dosimétrie par scintillateurs plastiques pour concevoir des appareils de haute résolution spatiale et faciles d’utilisation, tout en étant justes et précis. Dans un premier temps, la réponse de longues fibres scintillantes placées dans un champ de radiation est étudiée, et un détecteur de fluence est développé pour la validation en temps réel des traitements de radiothérapie. En utilisant l’information des deux extrémités de chaque fibre simultanément, la position de l’interaction du champ ainsi que l’intégrale de la fluence traversant la fibre peuvent être mesurées, permettant la détection d’erreurs de lames d’au moins 2 mm à l’isocentre. Le modèle théorique de réponse précédemment développé est ensuite appliqué à la reconstruction tomographique d’une distribution de dose mesurée à l’aide d’une matrice rotative de longues fibres scintillantes parallèles. Le dosimètre 2D obtenu parvient à reconstruire la dose calculée par le système de planification de traitement avec un écart maximal de 2% dans les régions de bas gradient de dose. Le concept de dosimétrie tomographique, ou tomodosimétrie, est ensuite appliqué à la mesure de dose en trois dimensions en utilisant des plans de fibres cylindriques et concentriques. En simulant la rotation de ces plans autour de leur axe central et en interpolant en trois dimensions les doses 2D obtenues, le dosimètre 3D parvient à reconstruire la dose de départ à un écart d’au plus 1% en dehors des zones de haut gradient de dose. Finalement, les principes de reconstruction itérative démontrés pour les longues fibres scintillantes sont appliqués à un volume de scintillateurs imagé à l’aide d’une caméra plénoptique. En re-projetant les projections acquises par les pixels de la caméra dans le volume de scintillateurs, le dosimètre 3D parvient à reconstruire en temps réel la dose à un écart d’au plus 3% dans les régions de faible gradient de dose. Cette étude conclut que le mariage de la tomographie et de la dosimétrie permet l’apparition d’une nouvelle génération d’appareils de contrôle de qualité alliant à la fois résolution spatiale et facilité d’utilisation.
This thesis deals with the development of tools for the quality assurance of external beam radiotherapy. The main goal is to incorporate tomography processes to plastic scintillator dosimetry in order to conceive high resolution, precise, accurate and easy-to-use quality assurance devices. First, a long scintillating fiber response to an incoming radiation field is studied, and a fluence monitoring device is developed for the real-time validation of radiotherapy treatments. Using the light signal emitted from both sides of each fiber, both the interaction position of the incoming field and the fluence integral across the fiber can be measured, allowing for the detection of leaf errors of at least 2 mm at isocentre. The theoretical response model previously developed is then applied to the tomographic reconstruction of dose distributions measured using a rotating matrix of long scintillating fibers. The dose reconstructed using this 2D dosimeter is in agreement with the calculations from the treatment planning software up to a maximum difference of 2% in the low dose gradient regions. The concept of tomographic dosimetry, or tomodosimetry, is then applied to 3D dose measurements using concentric, cylindrical planes of fibers. By simulating the rotation of these planes around the dosimeter central axis and by interpolating in three dimensions the obtained 2D doses, the 3D dosimeter is able to reconstruct the initial input dose with a deviation of maximum 1% outside of high dose gradient regions. Finally, the iterative reconstruction principles demonstrated for long scintillating fibers are applied to a scintillator volume imaged using a plenoptic camera. By re-projecting the projections acquired by the camera sensor pixels inside the scintillator volume, the 3D dosimeter is able to reconstruct the dose in real time with a maximal deviation of 3% in the low dose gradient regions. This study concludes that the union of tomography and dosimetry enables the development of a new generation of quality assurance devices, combining both spatial resolution and user-friendliness.
Tableau d'honneur de la FÉSP
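The two-ended fiber readout described in the abstract can be illustrated with a simple exponential light-attenuation model (an assumption for this sketch; the thesis derives and calibrates its own response model). For a narrow beam crossing the fiber, the signal ratio gives the position and the geometric mean gives the deposited fluence.

```python
# Toy two-ended readout of a long scintillating fiber under an assumed
# exponential light-attenuation model; all numbers are illustrative.
import numpy as np

L = 1.0          # fiber length [m]
alpha = 2.0      # light attenuation coefficient [1/m]  (assumed)
k = 1.0e4        # optical gain [signal per unit fluence] (assumed)

def readout(x, fluence):
    """Signals collected at the two fiber ends for a narrow beam at position x."""
    s1 = k * fluence * np.exp(-alpha * x)
    s2 = k * fluence * np.exp(-alpha * (L - x))
    return s1, s2

s1, s2 = readout(x=0.37, fluence=2.5)

x_hat = L / 2 + np.log(s2 / s1) / (2 * alpha)                # position from the signal ratio
fluence_hat = np.sqrt(s1 * s2) * np.exp(alpha * L / 2) / k   # fluence from the geometric mean
print(f"recovered position {x_hat:.3f} m, fluence {fluence_hat:.3f}")
```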
28

Ettehadi, Seyedrohollah. "Model-based and machine learning techniques for nonlinear image reconstruction in diffuse optical tomography." Thèse, Université de Sherbrooke, 2017. http://hdl.handle.net/11143/11895.

Full text
Abstract:
La tomographie optique diffuse (TOD) est une modalité d’imagerie biomédicale 3D peu dispendieuse et non-invasive qui permet de reconstruire les propriétés optiques d’un tissu biologique. Le processus de reconstruction d’images en TOD est difficile à réaliser puisqu’il nécessite de résoudre un problème non-linéaire et mal posé. Les propriétés optiques sont calculées à partir des mesures de surface du milieu à l’étude. Dans ce projet, deux méthodes de reconstruction non-linéaire pour la TOD ont été développées. La première méthode utilise un modèle itératif, une approche encore en développement qu’on retrouve dans la littérature. L’approximation de la diffusion est le modèle utilisé pour résoudre le problème direct. Par ailleurs, la reconstruction d’image à été réalisée dans différents régimes, continu et temporel, avec des mesures intrinsèques et de fluorescence. Dans un premier temps, un algorithme de reconstruction en régime continu et utilisant des mesures multispectrales est développé pour reconstruire la concentration des chromophores qui se trouve dans différents types de tissus. Dans un second temps, un algorithme de reconstruction est développé pour calculer le temps de vie de différents marqueurs fluorescents à partir de mesures optiques dans le domaine temporel. Une approche innovatrice a été d’utiliser la totalité de l’information du signal temporel dans le but d’améliorer la reconstruction d’image. Par ailleurs, cet algorithme permettrait de distinguer plus de trois temps de vie, ce qui n’a pas encore été démontré en imagerie de fluorescence. La deuxième méthode qui a été développée utilise l’apprentissage machine et plus spécifiquement l’apprentissage profond. Un modèle d’apprentissage profond génératif est mis en place pour reconstruire la distribution de sources d’émissions de fluorescence à partir de mesures en régime continu. Il s’agit de la première utilisation d’un algorithme d’apprentissage profond appliqué à la reconstruction d’images en TOD de fluorescence. La validation de la méthode est réalisée avec une mire aux propriétés optiques connues dans laquelle sont inséres des marqueurs fluorescents. La robustesse de cette méthode est démontrée même dans les situations où le nombre de mesures est limité et en présence de bruit.
Abstract : Diffuse optical tomography (DOT) is a low cost and noninvasive 3D biomedical imaging technique to reconstruct the optical properties of biological tissues. Image reconstruction in DOT is inherently a difficult problem, because the inversion process is nonlinear and ill-posed. During DOT image reconstruction, the optical properties of the medium are recovered from the boundary measurements at the surface of the medium. In this work, two approaches are proposed for non-linear DOT image reconstruction. The first approach relies on the use of iterative model-based image reconstruction, which is still under development for DOT and that can be found in the literature. A 3D forward model is developed based on the diffusion equation, which is an approximation of the radiative transfer equation. The forward model developed can simulate light propagation in complex geometries. Additionally, the forward model is developed to deal with different types of optical data such as continuous-wave (CW) and time-domain (TD) data for both intrinsic and fluorescence signals. First, a multispectral image reconstruction algorithm is developed to reconstruct the concentration of different tissue chromophores simultaneously from a set of CW measurements at different wavelengths. A second image reconstruction algorithm is developed to reconstruct the fluorescence lifetime (FLT) of different fluorescent markers from time-domain fluorescence measurements. In this algorithm, all the information contained in full temporal curves is used along with an acceleration technique to render the algorithm of practical use. Moreover, the proposed algorithm has the potential of being able to distinguish more than 3 FLTs, which is a first in fluorescence imaging. The second approach is based on machine learning techniques, in particular deep learning models. A deep generative model is proposed to reconstruct the fluorescence distribution map from CW fluorescence measurements. It is the first time that such a model is applied for fluorescence DOT image reconstruction. The performance of the proposed algorithm is validated with an optical phantom and a fluorescent marker. The proposed algorithm recovers the fluorescence distribution even from very noisy and sparse measurements, which is a big limitation in fluorescence DOT imaging.
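A heavily simplified stand-in for the model-based reconstruction described above is a single linearised, Tikhonov-regularised update. In the thesis the Jacobian comes from a diffusion-equation forward model; here it is a synthetic matrix, so the sketch only conveys the algebraic form of the step.

```python
# One linearised Tikhonov-regularised update:
# delta_mu = (J^T J + lam I)^{-1} J^T delta_y, with a synthetic Jacobian J.
import numpy as np

rng = np.random.default_rng(9)
n_meas, n_vox = 120, 400
J = rng.normal(size=(n_meas, n_vox)) / np.sqrt(n_vox)    # stand-in sensitivity matrix

delta_mu_true = np.zeros(n_vox)
delta_mu_true[180:200] = 0.02                            # small absorption perturbation
delta_y = J @ delta_mu_true + 1e-4 * rng.normal(size=n_meas)

lam = 1e-2
H = J.T @ J + lam * np.eye(n_vox)
delta_mu = np.linalg.solve(H, J.T @ delta_y)             # regularised Gauss-Newton step

print("peak of recovered perturbation at voxel", int(np.argmax(delta_mu)))
```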
29

Teague, Gavin. "Mass flow measurement of multi-phase mixtures by means of tomographic techniques." Doctoral thesis, University of Cape Town, 2002. http://hdl.handle.net/11427/5097.

Full text
Abstract:
Includes bibliographical references.
This thesis investigates the use of a dual-plane impedance tomography system to calculate the individual mass flow rates of the components in an air-gravel-seawater mixture. The long-term goal of this research is to develop a multi-phase flowmeter for the on-line monitoring of an airlift used in an offshore mining application. This requires the measurement of both the individual component volume fractions and their velocities. Tomography provides a convenient non-intrusive technique to obtain this information. Capacitance tomography is used to reconstruct the dielectric distribution of the material within a pipeline. It is based on the concept that the capacitance of a pair of electrodes depends on the dielectric distribution of the material between the electrodes. By mounting a number of electrodes around the periphery of the pipeline, and measuring the capacitances of the different electrode combinations, it is possible to reconstruct the distribution of the phases within the pipeline, provided the phases have different dielectric constants. Resistance tomography is used to reconstruct the resistivity distribution within the cross-section of the pipeline and operates in a similar way to capacitance tomography. Impedance tomography can be described as a dual-modal approach since both the capacitance and conductance of the different electrode combinations are measured to reconstruct the complex impedance of the material distribution. Previous research has shown that impedance tomography can be used to reconstruct a three-phase air-gravel-water mixture [3,4]. In addition, it has been shown that neural networks can be used to perform this reconstruction task [3,4]. In particular, a single-layer feed-forward neural network with a 1-of-C output encoding can be trained to perform a three-phase image reconstruction. Further, a double-layer feed-forward neural network can be trained to predict the volume fractions of the three phases within the flow directly, based on the capacitance and conductance readings obtained from the data acquisition system. However, these tests were only for static configurations. This thesis will readdress this problem from the dynamic viewpoint. In addition, the individual component velocities will be calculated using the cross-correlation of the volume fraction predictions from two impedance tomography systems spaced a certain distance apart.
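The velocity-measurement step proposed above can be sketched by cross-correlating the volume-fraction time series of the two planes: the lag of the correlation peak together with the plane spacing gives the transit velocity. Signals, spacing and frame rate below are invented.

```python
# Transit velocity from the cross-correlation of two plane signals.
import numpy as np

fs = 200.0                     # frames per second of each tomography plane
spacing = 0.15                 # axial distance between the two sensing planes [m]
v_true = 1.8                   # true mixture velocity [m/s]

rng = np.random.default_rng(10)
n = 4000
upstream = np.convolve(rng.normal(size=n), np.ones(25) / 25, mode="same")
delay_samples = int(round(spacing / v_true * fs))
downstream = np.roll(upstream, delay_samples) + 0.1 * rng.normal(size=n)

xcorr = np.correlate(downstream, upstream, mode="full")
lag = np.argmax(xcorr) - (n - 1)
v_est = spacing / (lag / fs)
print(f"estimated velocity: {v_est:.2f} m/s (true {v_true} m/s)")
```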
30

Chin, Kimberley Germaine. "An investigative study of the applicability of the convolution method of geophysical tomography." Ohio : Ohio University, 1985. http://www.ohiolink.edu/etd/view.cgi?ohiou1183751187.

Full text
31

Ingels, Alexandre. "Développement de techniques d’imageries pour le diagnostic et le pronostic des tumeurs du rein." Thesis, Université Paris-Saclay (ComUE), 2018. http://www.theses.fr/2018SACLS451/document.

Full text
Abstract:
The aim of this project is the development of new imaging techniques for the diagnosis and treatment of renal cancer. We assessed several techniques, including optical coherence tomography and molecular imaging. We evaluated a series of potential markers for molecular imaging by measuring the expression of pre-defined markers by immunohistochemistry in renal cell carcinoma and their association with disease prognosis. Finally, we assessed two molecular imaging techniques in preclinical models: molecular magnetic resonance imaging and molecular ultrasound imaging.
APA, Harvard, Vancouver, ISO, and other styles
32

Michelet, Claire. "Développement d'une technique de micro-tomographie par faisceau d'ions à l'échelle cellulaire." Bordeaux 1, 1998. http://www.theses.fr/1998BOR10638.

Full text
Abstract:
An ion beam micro-tomography technique has been developed on the nuclear microprobe of CENBG. Combined with STIM (scanning transmission ion microscopy), this technique allows three-dimensional density analysis. Its main advantage, in particular for applications in cell biology, is that no prior sectioning of the sample is required. In this work, a dedicated data reconstruction algorithm was developed, based on filtered backprojection numerical methods derived from medical CT scanners. The tests carried out validated the technique by comparison with electron microscopy images. For the first time, experimental ion beam tomography results were obtained at the cellular scale, on a human tumour cell line. The internal structure of isolated cells was revealed by density contrast, with a spatial resolution of the order of one micron. Finally, a preliminary study of induced X-ray fluorescence tomography (PIXE: particle induced X-ray emission) made it possible to map the distribution of minerals in an individual cell.
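The reconstruction step mentioned in this abstract is filtered backprojection. As a hedged illustration only (parallel-beam geometry, Ram-Lak filter, all names invented for this sketch rather than taken from the thesis), a numpy-only version of the principle looks like this:

```python
import numpy as np

def filtered_backprojection(sinogram, angles_deg):
    """Reconstruct a 2-D slice from parallel-beam projections.

    sinogram   : (n_angles, n_detectors) array; one row per projection angle.
    angles_deg : projection angles in degrees, in the same order as the rows.
    """
    n_angles, n_det = sinogram.shape
    # Ram-Lak (ramp) filter applied along the detector axis in the Fourier domain.
    ramp = np.abs(np.fft.fftfreq(n_det))
    filtered = np.real(np.fft.ifft(np.fft.fft(sinogram, axis=1) * ramp, axis=1))
    # Smear each filtered projection back across the image grid.
    centre = (n_det - 1) / 2.0
    coords = np.arange(n_det) - centre
    X, Y = np.meshgrid(coords, coords)
    recon = np.zeros((n_det, n_det))
    for proj, theta in zip(filtered, np.deg2rad(angles_deg)):
        t = X * np.cos(theta) + Y * np.sin(theta) + centre  # detector coordinate of each pixel
        recon += np.interp(t.ravel(), np.arange(n_det), proj,
                           left=0.0, right=0.0).reshape(n_det, n_det)
    return recon * np.pi / (2.0 * n_angles)
```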
APA, Harvard, Vancouver, ISO, and other styles
33

Dupre, Antoine. "Electrical impedance tomography for void fraction measurements of harsh two-phase flows : prototype development and reconstruction techniques." Thesis, Ecole centrale de Marseille, 2017. http://www.theses.fr/2017ECDM0005/document.

Full text
Abstract:
Recent developments in data acquisition equipment have reduced the time required for image acquisition in electrical tomography, bringing new opportunities for the study of fast-evolving two-phase flows. Among the numerous advantages of this imaging technique for multiphase-flow research are its non-intrusiveness, high acquisition rate, low cost and improved safety. A set of electrodes placed on the periphery of the pipe to be imaged is used to impose an electrical excitation and measure the system response; the distribution of phases inside the study volume distorts the electrical field in a characteristic manner. The objective of this thesis is to assess the potential of electrical impedance tomography at high acquisition rates. The first stage consists in developing a prototype sensor and assessing its performance with simple experiments. The system architecture employs voltage control of the excitation and therefore does not require the conventional voltage-to-current converter module. A novel data collection method, the full-scan strategy, is considered and provides correcting factors for the parasitic impedances in the system. The second stage is image reconstruction from the measurement data. The approach considered in the thesis is to assume that flow-regime identification techniques can provide valuable information on the phase distribution that can be injected into the inverse problem for imaging, thereby tackling the challenge of its non-linearity. A method for horizontal air-water flow-regime identification has been elaborated with a previously validated electrical capacitance tomography sensor and multiphase flow rig; it is being adapted to the fast electrical impedance tomography prototype and extended to include vertical flow regimes. In parallel, an image reconstruction method has been developed based on the NOSER algorithm and a pseudo-2D postulate. The analysis of the reconstructed images for a set of benchmark experiments provides insights into the merits and deficiencies of the algorithm and of the prototype.
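The NOSER algorithm cited above regularises a one-step linearised reconstruction with the diagonal of the approximate Hessian J^T J. A minimal sketch of that idea follows; the Jacobian, measurement vector and regularisation weight are placeholders for illustration, not the prototype's actual data or settings.

```python
import numpy as np

def noser_one_step(jacobian, dv, alpha=1e-2):
    """One-step linearised EIT reconstruction in the spirit of NOSER.

    jacobian : (n_measurements, n_pixels) sensitivity matrix J computed at a
               homogeneous reference conductivity.
    dv       : (n_measurements,) difference between measured and reference voltages.
    alpha    : regularisation weight.
    Returns the conductivity update for each pixel.
    """
    JtJ = jacobian.T @ jacobian
    regulariser = alpha * np.diag(np.diag(JtJ))  # NOSER: diagonal of the approximate Hessian
    return np.linalg.solve(JtJ + regulariser, jacobian.T @ dv)
```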
APA, Harvard, Vancouver, ISO, and other styles
34

Selivanov, Vitali. "Topics in image reconstruction for high resolution positron emission tomography." Thèse, Sherbrooke : Université de Sherbrooke, 2002. http://savoirs.usherbrooke.ca/handle/11143/4169.

Full text
APA, Harvard, Vancouver, ISO, and other styles
35

Méteau, Jérémy. "Instrumentation optique pour la caractérisation des tissus : analyse de la complémentarité et des limites techniques de fluorescence hyperspectrale et de Tomographie Optique Cohérente en vue de leur intégration multimodale." Thesis, Besançon, 2014. http://www.theses.fr/2014BESA2041/document.

Full text
Abstract:
L'objectif de ce travail de recherche est le développement d'un système fibré d'imagerie point par point d'auto fluorescence multi-excitation, de tissus biologiques en utilisant la technique de fluorescence hyperspectrale et l'étude d'un système de tomographie optique cohérente comme possible modalité supplémentaire. La première partie de ce rapport présente les propriétés optique des tissus biologiques et les fluorophores pertinents pour la détection de tumeurs cancéreuses. La deuxième partie présente l'instrumentation du système d'imagerie de fluorescence et l'analyse hyperspectrale des résultats obtenus in vitro.Il est démontré la pertinence de ce type d'analyse qui permet de déterminer la concentration de certains fluorophores. La troisième partie présente le système de tomographie optique cohérente appelé "scan free" OCT car il permet de réaliser des images sans déplacement d'éléments optiques. Ce système est caractérisé et présente des fonctionnalités intéressantes comme la compensation de la dispersion dépendante de la profondeur. Les divers résultats obtenus montrent que ces deux techniques sont complémentaires car elles apportent des informations de nature différentes. La première technique donne de se informations sur la composition biochimique des tissus, la seconde donne des information sur la structure
The aim of this activity is the development of a mono point imaging fiber system which uses hyperspectral multi-excitation auto fluorescence technique for biological tissues and the study of an Optical Coherence Tomography system like another modality. At first, this report presents the optical properties of biological tissues and the relevant fluorophores for cancerous tumors detection. Secondly, the fluorescence imaging system instrumentation and hyperspectral analysis are presented with in vitro results. The third part presents the "scan free" optical coherence tomography system which is able to image without optical displacement. It's characterized and have interesting functionality like depth dependant dispersion compensation. These both techniques are complementary because they get different kind of information. The information of the first one is about biochemical composition of the tissues and the information of the second one is about the stucture
APA, Harvard, Vancouver, ISO, and other styles
36

Pal, Sandip. "Sensitive Detection Techniques and Systems for Minor Species Tomography." Thesis, University of Manchester, 2009. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.503005.

Full text
APA, Harvard, Vancouver, ISO, and other styles
37

Kyte, David John. "Magnetic induction tomography and techniques for eddy-current imaging." Thesis, University of Surrey, 1985. http://epubs.surrey.ac.uk/707/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
38

Bayford, R. H. F. W. "Application of constrained optimisation techniques in electrical impedance tomography." Thesis, Middlesex University, 1994. http://eprints.mdx.ac.uk/13280/.

Full text
Abstract:
A constrained optimisation technique is described for the reconstruction of temporal resistivity images. The approach solves the inverse problem by optimising a cost function under constraints, in the form of normalised boundary potentials. Mathematical models have been developed for two different data collection methods for the chosen criterion. Both of these models express the reconstructed image in terms of one-dimensional (1-D) Lagrange multiplier functions. The reconstruction problem becomes one of estimating these 1-D functions from the normalised boundary potentials. These models are based on a cost criterion of minimising the variance between the reconstructed resistivity distribution and the true resistivity distribution. The methods presented in this research extend the algorithms previously developed for X-ray systems. Computational efficiency is enhanced by exploiting the structure of the associated system matrices. The structure of the system matrices was preserved in the Electrical Impedance Tomography (EIT) implementations by applying a weighting due to the non-linear current distribution during the backprojection of the Lagrange multiplier functions. In order to obtain the best possible reconstruction it is important to consider the effects of noise in the boundary data. This is achieved by using a fast algorithm which matches the statistics of the error in the approximate inverse of the associated system matrix with the statistics of the noise error in the boundary data. This yields the optimum solution with the available boundary data. Novel approaches have been developed to produce the Lagrange multiplier functions. Two alternative methods are given for the design of VLSI implementations of hardware accelerators to improve computational efficiency. These accelerators are designed to implement parallel geometries and are modelled using a verification description language to assess their performance capabilities.
APA, Harvard, Vancouver, ISO, and other styles
39

Ktistis, Christos. "Electromagnetic induction tomography techniques for low conductivity biomedical application." Thesis, University of Manchester, 2007. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.682783.

Full text
Abstract:
This thesis considers the feasibility of using magnetic induction tomography (MIT) to reconstruct images which represent the internal conductivity distribution of low-contrast objects. The research focuses on a biomedical application, namely the measurement of human body composition. The thesis describes the development of a system which combines a photonic scanner for measuring the shape of the subject with an experimental MIT system. The shape information can be used as a priori knowledge for the image reconstruction algorithm. The MIT system contained a full-scale 16-coil circular sensor array, with 8 coils used for excitation and 8 for detection. The diameter of the object space was 75 cm and a commercial data acquisition system was used to interrogate the array. The measured data was reconstructed using linear algorithms and two kinds of sensitivity maps, one computed without shape information and the other with it. The thesis contains results from a variety of tests illustrating the limits of each and the importance of knowing the external shape of the object. The shape scanning system operates on the structured-light principle. It consisted of four cameras and eight laser line generators, which were integrated in the same scanning mechanism used by the MIT system. Experiments were performed to verify whether the current design was capable of capturing and reconstructing the human body shape. Overall, the results of the research undertaken for this thesis support the feasibility of reconstructing internal features; however, more work is needed to obtain images of sufficient quality.
APA, Harvard, Vancouver, ISO, and other styles
40

Kusminarto. "Study and development of techniques in computerised neutron tomography." Thesis, University of Surrey, 1986. http://epubs.surrey.ac.uk/844567/.

Full text
Abstract:
Since the construction of the first commercial scanner for routine medical diagnosis implementing the principles of computerised tomography by Hounsfield in 1973 and its worldwide adoption, the use of various types of ionising and non-ionising radiations for tomographic imaging and other applications has been under continuous study. In this work a neutron beam has been used as the probe in order to obtain tomographic images, in transmission and emission modes, of the internal structure and elemental composition of test objects respectively. Various methods of neutron transmission tomography were studied and developed. A collimated He-3 proportional counter, a conventional combination of film/Gd-converter in a single cassette and a 35 mm camera were employed as the detecting systems. A computerised video camera-based microdensitometer was used to digitise the radiographs obtained, and a method to reduce image noise was developed and tested. The technique of computerised tomography has also been applied to image elemental distributions, in the section of interest, employing delayed gamma-rays emitted by the object following neutron irradiation. This technique is not suitable when very long-lived, very short-lived or stable isotopes are produced; therefore, a novel technique employing prompt gamma-rays emitted by the object during irradiation was developed and tested in this work. This technique has been termed Neutron Capture Prompt Gamma-ray Emission Tomography.
APA, Harvard, Vancouver, ISO, and other styles
41

Moussallem, Mazen. "Optimisation de la délimitation automatique des tumeurs pulmonaires à partir de l'imagerie TEP/TDM pour les planifications dosimétriques des traitements par radiothérapie." Phd thesis, Université Claude Bernard - Lyon I, 2011. http://tel.archives-ouvertes.fr/tel-00864905.

Full text
Abstract:
One of the most critical aspects of radiotherapy treatment planning is the delineation of tumour boundaries. This delineation is usually performed on anatomical computed tomography (CT) images. Recently, however, it has been recommended that for non-small-cell lung cancer (NSCLC) the delineation be performed on functional Positron Emission Tomography (PET) images, in order to take the biological characteristics of the target into account. To date, no segmentation technique has proved satisfactory for PET images in clinical use. A solution to this problem is proposed in this study. Methods: the optimisation of our method consisted mainly in adjusting the thresholds directly from patient data rather than from a phantom. Results: for lesions with a long axis greater than 20 mm, our segmentation technique showed good agreement with histological measurements (mean diameter difference between measured data and data determined with our technique = +1.5 ± 8.4%) and acceptable agreement with CT measurements. For lesions with a long axis of 20 mm or less, the method showed a discrepancy with respect to histological or CT data. Conclusion: this new adjustment method is accurate for delineating lesions with a long axis between 2 and 4.5 cm. However, it does not correctly assess smaller lesions, which may be due to the partial volume effect.
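For illustration only, the simplest form of threshold-based PET delineation discussed above fixes the threshold as a fraction of the peak uptake; the patient-adapted thresholds developed in the thesis would replace the constant used in this hypothetical sketch.

```python
import numpy as np

def threshold_delineation(pet_volume, fraction=0.4):
    """Delineate a lesion as all voxels above a fraction of its peak uptake.

    pet_volume : numpy array of PET intensities in a region around the lesion.
    fraction   : hypothetical fixed threshold fraction; an adaptive method
                 would derive this value from patient data instead.
    """
    return pet_volume >= fraction * pet_volume.max()  # boolean mask of the lesion
```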
APA, Harvard, Vancouver, ISO, and other styles
42

Fincke, Jonathan Randall. "Non-contact quantitative imaging of limbs, bone and tissue, using ultrasound tomographic techniques." Thesis, Massachusetts Institute of Technology, 2018. http://hdl.handle.net/1721.1/119342.

Full text
Abstract:
Thesis: Ph. D., Massachusetts Institute of Technology, Department of Mechanical Engineering, 2018.
Cataloged from PDF version of thesis.
Includes bibliographical references (pages 123-131).
Non-contact and quantitative ultrasound images of bone and soft tissue are produced from original algorithms applied to observational data sets collected on two custom built ultrasound imaging systems for limb imaging. The images are quantitative in that the distribution of sound speeds and dimensionally accurate geometry of tissue structures are reconstructed. The first imaging system is based on laser generation and detection of ultrasound (LUS) and the second is a water immersion ultrasound tomography (UST) system. The LUS and UST systems and algorithms, in aggregate, can be used to generate large volume, quantitative 2D and traditional 2D ultrasound images. Existing medical ultrasound systems are unable to acquire or generate large volume, quantitative and clinically useful bone images. A medical ultrasound system with these capabilities would have significant clinical value. LUS and UST systems could improve the quality, cost and safety of osteoporosis diagnosis and tracking, prosthetic fitting, bone fracture detection and tracking, intraoperative imaging and volumetric imaging in intensive care units. The algorithms and systems established for this thesis contribute broadly to non-contact ultrasound imaging: LUS for medical imaging, and UST for bone and soft tissue imaging and quantification. Non-contact techniques are clinically valuable because they can deliver operator independent image quality and large volume imagery without making contact with the body and distorting the tissue. Quantitative ultrasound imaging techniques are clinically useful because they provide intrinsic information about tissue mechanical properties, such as stiffness and density. Successful bone and soft tissue quantification using UST techniques could yield an entirely new and radiation free means of assessing bone and soft tissue strength and health. The experiments completed with the LUS system demonstrate its capability to generate images without contacting or treating the skin surface. Further, soft tissue (weak reflector), as well as bone (strong reflectors) are resolved at skin safe optical exposures. To enhance the LUS system performance, the optical wavelength of the generation laser is studied and optimized to deliver the largest acoustic source possible while also meeting optical exposure thresholds for skin. Additionally, commercially available laser vibrometer technology optimized for detecting vibrations on rough surfaces, such as skin, is identified and tested. Three original algorithms yield images of bone and soft tissue geometry and sound speed when applied to experimental data from the UST device. The first algorithm is a backscatter/reflection adaptive imaging technique that enables high resolution, SNR and volumetric imagery of bone and soft tissue to be formed from a single, mechanically scanned ultrasound transducer. The second algorithm uses a travel-time sound speed inversion technique that estimates water, soft tissue and bone sound speed to within 10% of ground truth estimates. The performance of this algorithm is validated on multiple samples and a simulated data set. The third algorithm is a full waveform inversion (FWI) algorithm regularized with the level set technique to enable quantification of bone properties. This algorithm is validated on an animal tissue sample and simulated data sets. 
The FWI technique resolves the soft tissue spatial sound speed distribution with half to one-quarter wavelength (1 - 0.5 mm) resolution and average sound speed values are within 10% of ground truth measurements.
by Jonathan Randall Fincke.
Ph. D.
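As a hedged illustration of the travel-time step described in this abstract, a straight-ray least-squares inversion for slowness (the reciprocal of sound speed) can be sketched as follows; the ray-path matrix and variable names are assumptions for the example, not the thesis implementation.

```python
import numpy as np

def travel_time_inversion(path_lengths, travel_times):
    """Least-squares straight-ray travel-time inversion for sound speed.

    path_lengths : (n_rays, n_cells) matrix; entry (i, j) is the length of
                   ray i inside cell j of the imaging grid.
    travel_times : (n_rays,) measured first-arrival times, so that
                   travel_times ~ path_lengths @ slowness.
    Returns the estimated sound speed in each cell.
    """
    slowness, *_ = np.linalg.lstsq(path_lengths, travel_times, rcond=None)
    return 1.0 / slowness
```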
APA, Harvard, Vancouver, ISO, and other styles
43

Bisel, Tucker. "Investigation into the Capabilities and Characteristics of Tomographic Particle Image Velocimetry Measurement Techniques." Thesis, The University of North Carolina at Charlotte, 2018. http://pqdtopen.proquest.com/#viewpdf?dispub=10815987.

Full text
Abstract:

Literature describing measurement results obtained using tomographic particle image velocimetry (TomoPIV) systems has become increasingly common, but details on the processes used to obtain those results and reasons for selecting specified analysis settings are not often explained. In this thesis, an overview is given of techniques and methodologies found to be useful when conducting TomoPIV measurements using an asymmetric four-camera system, including image pre-processing techniques and methods of improving analysis settings. Effective image pre-processing techniques include background subtraction for removing static background noise and improving signal-to-noise ratios and a customized filter that easily and reliably increases voxel reconstruction speed and reduces memory requirements. The custom filter acts to concentrate light intensity around particle locations while muting background pixel intensities. The effects on final measurement results are observed for voxel reconstruction and 3D least squares matching (LSM) settings, including relaxation number, reconstruction iterations, and interrogation volume size. The effects of surface reflections of laser light on TomoPIV results are also investigated by comparing measurement results of a cubic bluff body painted first with a flat white aerosol paint, and second with an airbrushed Rhodamine 6G fluorescent paint. Fluoresced light is blocked by bandpass filters, resulting in minimal reflections from the Rhodamine 6G paint and no observed impact on measurement results. White paint results in intense surface reflections and increased image noise, preventing reliable recognition of distant particles. Comparison of averaged 3D vector map results for both coatings reveals that allowable measurement depth decreases as surface reflection intensity increases.
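The background-subtraction pre-processing described above can be sketched as follows, assuming the particle images are stacked in a numpy array; using the per-pixel minimum over the sequence as the static background is one common choice, not necessarily the exact filter developed in the thesis.

```python
import numpy as np

def subtract_static_background(frames):
    """Remove static background from a stack of raw particle images.

    frames : (n_frames, height, width) array of camera images.
    The per-pixel minimum over the sequence approximates the static background
    (reflections, sensor offset); subtracting it leaves the moving particles
    and improves the signal-to-noise ratio.
    """
    background = frames.min(axis=0)
    cleaned = frames.astype(np.float64) - background
    return np.clip(cleaned, 0.0, None)
```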

APA, Harvard, Vancouver, ISO, and other styles
44

Peteya, Jennifer Anita. "Resolving Details of the Nonbiomineralized Anatomy of Trilobites Using Computed Tomographic Imaging Techniques." The Ohio State University, 2013. http://rave.ohiolink.edu/etdc/view?acc_num=osu1366025146.

Full text
APA, Harvard, Vancouver, ISO, and other styles
45

Lundqvist, Tobias. "Investigation of Algebraic Reconstruction Techniques for Tomographic Measurements on Spent Nuclear Fuel Assemblies." Thesis, Uppsala universitet, Institutionen för strålningsvetenskap, 2004. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-307832.

Full text
Abstract:
A non-destructive tomographic measurement technique for application on nuclear fuel assemblies has been developed at Uppsala University. Using this technique, the rod-by-rod distribution of selected radioactive isotopes is determined experimentally. In the present work, the numerical technique to reconstruct the activity distribution inside the fuel assemblies has been analyzed. Three iterative reconstruction algorithms have been investigated: ART (Additive Reconstruction Technique), ML (Maximum Likelihood) and ASIRT (Additive Simultaneous Iterative Reconstruction Technique). It was found that the ART algorithm is too sensitive to data points where the gamma-ray intensity is low, while ASIRT handles them in the best manner. Furthermore, ASIRT appears to be the most stable algorithm and produces the best agreement with theoretical data.
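For reference, minimal numpy sketches of the two additive update rules compared in the abstract are given below: ART sweeps one projection ray at a time, while the simultaneous variant applies a normalised correction from all rays at once. The system matrix A, which maps rod or pixel activities to detector readings, is assumed given, and the normalisation shown is one common choice rather than the exact formulation analysed in the thesis.

```python
import numpy as np

def art_sweep(A, b, x, relaxation=1.0):
    """One full ART (Kaczmarz) sweep: correct x against each ray equation in turn."""
    A = np.asarray(A, dtype=float)
    b = np.asarray(b, dtype=float)
    x = np.asarray(x, dtype=float).copy()
    for i in range(A.shape[0]):
        row = A[i]
        norm2 = row @ row
        if norm2 > 0.0:
            x += relaxation * (b[i] - row @ x) / norm2 * row
    return x

def additive_sirt_step(A, b, x, relaxation=1.0):
    """One simultaneous additive step: row/column-normalised correction from all rays."""
    A = np.asarray(A, dtype=float)
    b = np.asarray(b, dtype=float)
    x = np.asarray(x, dtype=float)
    row_sums = A.sum(axis=1)
    col_sums = A.sum(axis=0)
    residual = np.divide(b - A @ x, row_sums, out=np.zeros_like(b), where=row_sums > 0)
    update = np.divide(A.T @ residual, col_sums, out=np.zeros_like(x), where=col_sums > 0)
    return np.clip(x + relaxation * update, 0.0, None)  # activities are non-negative
```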
APA, Harvard, Vancouver, ISO, and other styles
46

Yao, Rutao. "Development of multispectral scatter correction techniques for high-resolution positron emission tomography." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1997. http://www.collectionscanada.ca/obj/s4/f2/dsk3/ftp04/nq26405.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
47

Hassoun, Alain. "Quantification des erreurs de reconstruction dues aux variations aléatoires des mesures en tomographie d'émission : comparaison expérimentale des techniques intervallistes et statistiques." Thesis, Montpellier 2, 2012. http://www.theses.fr/2012MON20272.

Full text
Abstract:
In nuclear medicine, Single-Photon Emission Computed Tomography (SPECT) images are used to diagnose a number of degenerative diseases such as Parkinson's disease. The principle of this kind of diagnosis is to compare the activity reconstructed in two specific regions of interest. The random fluctuations of the reconstructed activities, due to the random fluctuations of the measurements, have unknown statistical properties; this lack of knowledge makes the comparison, and thus the diagnosis, unreliable. To make the diagnosis more reliable, it is important to quantify the impact of the random fluctuations of the projection measurements on the reconstructed activities in each region of interest. In this thesis, we focus on this quantification using reconstruction methods based on a new modelling of the tomographic acquisition process. A special feature of the resulting reconstructions is that the reconstructed activities are not precise values but interval-valued activities, and the width of the reconstructed intervals quantifies the reconstruction error. As an important contribution, we have proposed a protocol derived from the quantitative comparison method applied in clinical routine. This protocol can be used to evaluate the performance of an error quantification algorithm in a task of comparing reconstructed activities, or to compare the performances of two quantification algorithms. We show and discuss the performance of two interval-based quantification methods and of a chosen reference method.
APA, Harvard, Vancouver, ISO, and other styles
48

Nguyen, Linh V., and Leonid A. Kunyansky. "A Dissipative Time Reversal Technique for Photoacoustic Tomography in a Cavity." SIAM PUBLICATIONS, 2016. http://hdl.handle.net/10150/622000.

Full text
Abstract:
We consider the inverse source problem arising in thermo- and photoacoustic tomography. It consists in reconstructing the initial pressure from the boundary measurements of the acoustic wave. Our goal is to extend versatile time reversal techniques to the case when the boundary of the domain is perfectly reflecting, effectively turning the domain into a reverberant cavity. Standard time reversal works only if the solution of the direct problem decays in time, which does not happen in the setup we consider. We thus propose a novel time reversal technique with a nonstandard boundary condition. The error induced by this time reversal technique satisfies the wave equation with a dissipative boundary condition and, therefore, decays in time. For larger measurement times, this method yields a close approximation; for smaller times, the first approximation can be iteratively refined, resulting in a convergent Neumann series for the approximation.
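The iterative refinement mentioned in the abstract has the generic Neumann-series form used with approximate time reversal. In the hedged notation below (chosen for this note, not taken from the paper), \(\Lambda\) denotes the forward map from the initial pressure \(f\) to the measured boundary data \(g\), and \(T\) the dissipative time-reversal operator; convergence requires \(I - T\Lambda\) to be a contraction.

```latex
% First approximation f_0 = Tg, refined iteratively; the iterates sum to a Neumann series:
\[
  f_{k+1} = f_k + T\,(g - \Lambda f_k), \qquad
  f \;=\; \sum_{k=0}^{\infty} \bigl(I - T\Lambda\bigr)^{k}\, T g .
\]
```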
APA, Harvard, Vancouver, ISO, and other styles
49

Donner, Quentin. "Correction de l'atténuation et du rayonnement diffusé en tomographie d'émission à simples photons." Université Joseph Fourier (Grenoble), 1994. http://www.theses.fr/1994GRE10155.

Full text
Abstract:
The aim of single photon emission tomography is to produce a functional image of an organ. To do so, a radioelement that binds to the organ is administered to the patient, and the distribution of the radioelement is then computed from measurements of the emitted radiation. A non-negligible proportion of this radiation, however, interacts with matter. The objective of this work is to take these interactions into account in order to reconstruct more accurately. After describing the main characteristics of the acquisition system, we address the reconstruction of the emission distribution from attenuated projections, in particular for a cone-beam acquisition geometry. Several prediction-correction methods are frequently used, but their convergence has not been established; in fact, we show theoretically that the simplest of these methods diverges in a particular case. Attenuation correction can also be handled by a preconditioned gradient method. We propose several preconditioners, which lead to different reconstruction algorithms. These algorithms are closely analogous to the prediction-correction algorithms, but they have the advantage of being convergent. We end this chapter by validating one prediction-correction method, that of Morozumi, on simulated and experimental data. We then address the problem of determining the attenuating object, following two approaches. The first relies on measurements of the direct radiation: the emission is reconstructed a first time without attenuation correction, and an ellipsoid is then fitted to the outer contours of this image. The second approach is based on measurements of the scattered radiation: we study the possibility of using these measurements to establish an attenuation map and highlight the difficulties associated with this problem. We conclude by examining how the attenuation map can be used to correct for the effects of scattered radiation. We propose a prediction-correction method, discuss its convergence, and compare it with a classical method on experimental data.
APA, Harvard, Vancouver, ISO, and other styles
50

Arhjoul, Lahcen. "Modélisation pharmacocinétique en tomographie d'émission par positrons en utilisant la technique des ondelettes." Thèse, Université de Sherbrooke, 2006. http://savoirs.usherbrooke.ca/handle/11143/4232.

Full text
Abstract:
In this research work, the objectives were to implement and validate the wavelet technique for pharmacokinetic modelling in the rat with positron emission tomography (PET). In PET, glucose metabolism in the organ of interest is measured by injecting a glucose analogue, fluorodeoxyglucose (18FDG). The amount of injected radioactivity measured in the blood plasma as a function of time constitutes the input curve, while the radioactivity measured in the tissues with PET constitutes the tissue response. From the input curve and the tissue radioactivity measured by the scanner, glucose metabolism is computed with a compartmental mathematical model. This computation is usually performed on filtered or iteratively reconstructed images. However, filtered images have lost spatial resolution or still contain noise due to the low injected dose of radioactivity or the limited measurement time. In this work, we propose a wavelet technique based on compression and filtering algorithms that proves efficient and easy to use. Moreover, from the images filtered and compressed with wavelets, glucose metabolism is computed pixel by pixel in order to generate a so-called parametric image, which allows the glucose metabolism in the different structures of an organ to be visualised. We applied the wavelet technique both to the images and to the projections, that is, directly to the projection matrices before image reconstruction, so as to avoid filtering the measurements and the reconstruction operations. Wavelets have the advantage of reducing the size of the matrices and of grouping pixel intensities, providing better statistics, hence more precision and consequently better-quality parametric images. The wavelet technique was also introduced for partial volume correction in PET imaging. The partial volume effect occurs when the radioactivity of structures smaller than the spatial resolution of the scanner is underestimated. The continuous wavelet method represents an alternative to the usual methods based on anatomical information from magnetic resonance imaging (MRI) or computed tomography (CT). The continuous wavelet approach consists in characterising the different structures by their scale and position. Using this information provided by the wavelets, the underestimated intensities of small structures are restored, which improves the detection of lesions and tumours in PET imaging. In conclusion, this thesis demonstrates the advantage of using wavelets to compute physiological parameters from PET images and sinograms measured with 18FDG in the rat. The results obtained on images with wavelets showed less variation and less noise while preserving spatial resolution. The application of the continuous wavelet transform to partial volume correction of PET images, using an appropriate wavelet, showed the potential of wavelets to localise the different structures, allowing a good correction and better image quality.
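As an illustration of the wavelet filtering step described above, the sketch below soft-thresholds the detail coefficients of a 2-D decomposition, assuming the PyWavelets package (pywt) is available; the wavelet choice, decomposition level and noise-scaled threshold are placeholders, not the settings used in the thesis.

```python
import numpy as np
import pywt  # PyWavelets, assumed to be installed

def wavelet_denoise(image, wavelet="db4", level=2, threshold_scale=3.0):
    """Soft-threshold the detail coefficients of a 2-D wavelet decomposition.

    image : 2-D numpy array, e.g. one frame of a dynamic PET study.
    The approximation coefficients are kept; detail coefficients are shrunk
    towards zero with a noise-scaled threshold, which suppresses noise while
    largely preserving spatial resolution.
    """
    coeffs = pywt.wavedec2(image, wavelet, level=level)
    # Rough noise estimate from the finest diagonal detail sub-band (median absolute deviation).
    sigma = np.median(np.abs(coeffs[-1][-1])) / 0.6745
    thr = threshold_scale * sigma
    denoised = [coeffs[0]]
    for details in coeffs[1:]:
        denoised.append(tuple(pywt.threshold(d, thr, mode="soft") for d in details))
    return pywt.waverec2(denoised, wavelet)
```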
APA, Harvard, Vancouver, ISO, and other styles