Dissertations / Theses on the topic 'NUCLEUS SOFTWARE'

Consult the top 50 dissertations / theses for your research on the topic 'NUCLEUS SOFTWARE.'

1

Li, Xiang. "The Use of Software Faults in Software Reliability Assessment and Software Mutation Testing." The Ohio State University, 2015. http://rave.ohiolink.edu/etdc/view?acc_num=osu1434394783.

2

Zhang, Yi 1973. "Reliability quantification of nuclear safety-related software." Thesis, Massachusetts Institute of Technology, 2004. http://hdl.handle.net/1721.1/28367.

Abstract:
Thesis (Ph. D.)--Massachusetts Institute of Technology, Dept. of Nuclear Engineering, 2004.
Page 242 blank.
Includes bibliographical references (p. 238-241).
The objective of this study is to improve the quality and reliability of safety-critical software in the nuclear industry. This is accomplished by focusing on two areas: formulation of a standard, extensive, integrated software testing strategy for safety-critical software, and development of systematic test-based statistical software reliability quantification methodologies. The first step to improving the overall performance of software is to develop a comprehensive testing strategy, the gray-box testing method, which incorporates favorable aspects of white-box and black-box testing techniques. The safety-critical features of the software and the feasibility of the methodology are the key drivers in determining the architecture of the testing strategy. A Monte Carlo technique is applied to randomly sample inputs based on the probability density function derived from the specification of the given software. Software flowpaths accessed during testing are identified and recorded. Complete nodal coverage testing is achieved by automatic coverage checking, guaranteeing that the most frequently executed flowpaths of the software are tested.
The second part of the methodology is the quantification of software performance. Two Bayesian white-box reliability estimation methodologies, nodal coverage-based and flowpath coverage-based, are developed. The number of detected errors and failure-free operations, the objective and subjective knowledge of the given software, and the testing and software structure information are systematically incorporated into both reliability estimation approaches. The concept of two error groups in terms of testability is introduced to better capture the reliability features of the given software. The reliability of the tested flowpaths of the software and that of the untested flowpaths can be updated at any point during testing. Overall software reliability is calculated as a weighted average over the tested and untested parts of the software, with the probability of being visited upon the next execution as the weight of each part. All of the designed testing and reliability estimation strategies are successfully implemented and automated via various software tools and demonstrated on a typical safety-critical software application.
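As a compact restatement of the weighted-average estimate described in this abstract (the notation is mine, not the thesis'): if $R_T$ and $R_U$ are the reliability estimates for the tested and untested flowpaths, and $p_T$ is the probability that the next execution follows an already-tested flowpath, then

$$R = p_T \, R_T + (1 - p_T) \, R_U.$$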
by Yi Zhang.
Ph.D.
3

Horng, Tze-Chieh 1964. "MIDAS : minor incident decision analysis software." Thesis, Massachusetts Institute of Technology, 2004. http://hdl.handle.net/1721.1/16643.

Abstract:
Thesis (S.M.)--Massachusetts Institute of Technology, Dept. of Nuclear Engineering, 2004.
Includes bibliographical references (p. 59-60).
This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections.
MIDAS is minor incident decision analysis software that acts as an advisory tool for plant decision makers and operators to analyze the available decision alternatives for resolving minor incidents. The minor incidents dealt with in this thesis include non-safety-related component failure, equipment maintenance, inspection and testing. MIDAS implements the risk-informed decision analysis methodology, developed for nuclear power plants, that uses multi-attribute utility theory (MAUT) and formal decision-making models. MIDAS integrates the theory, practical models and graphical user interfaces so that analysts can quickly obtain insight into the performance of decision options and their driving factors. To deal with the inherent diversity of scenarios and decision options, well-defined option models and a modular calculation structure were constructed in MIDAS. In addition, MIDAS provides functions for performing sensitivity and uncertainty analyses to take into account the inherent model and parameter uncertainties in decision option evaluation. Two case studies are performed to demonstrate the application of MIDAS in nuclear power plant risk-informed incident management. The insight obtained from the case studies reveals that, for nuclear power plant incident management, risk usually is not the most important concern; cost and external attention are usually the dominant deciding factors. In fact, however, the safety performance of each option is reflected in the cost and external attention.
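A minimal sketch of additive MAUT ranking in the spirit of MIDAS as described above. The attribute names (risk, cost, external attention) follow the abstract; the additive form, the weights and the single-attribute utilities are assumptions for illustration, not the thesis' actual models.

```python
# Illustrative additive multi-attribute utility (MAUT) ranking.
# Weights and utility functions are invented, not taken from MIDAS.

def additive_utility(option, weights, utilities):
    """Additive MAUT: U(option) = sum_i w_i * u_i(x_i)."""
    return sum(weights[a] * utilities[a](option[a]) for a in weights)

# Hypothetical single-attribute utilities over levels in [0, 1]
# (0 = worst outcome, 1 = best outcome), passed through unchanged.
utilities = {"risk": lambda x: x, "cost": lambda x: x, "attention": lambda x: x}

# Weights reflecting the abstract's finding that cost and external
# attention usually dominate risk (values invented for illustration).
weights = {"risk": 0.2, "cost": 0.5, "attention": 0.3}

options = {
    "continue_operation": {"risk": 0.6, "cost": 0.9, "attention": 0.8},
    "immediate_shutdown": {"risk": 0.9, "cost": 0.2, "attention": 0.4},
}

best = max(options, key=lambda o: additive_utility(options[o], weights, utilities))
print(best)  # ranks the decision alternatives by utility
```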
by Tze-Chieh Horng.
S.M.
4

Arno, Matthew G. (Matthew Gordon). "Verification and validation of safety related software." Thesis, Massachusetts Institute of Technology, 1994. http://hdl.handle.net/1721.1/33517.

5

Lunglhofer, Jon R. (Jon Richard). "Complete safety software testing : a formal method." Thesis, Massachusetts Institute of Technology, 1996. http://hdl.handle.net/1721.1/88311.

6

Chen, Xinhui 1996. "Development of a graphical approach to software requirements analysis." Thesis, Massachusetts Institute of Technology, 1998. http://hdl.handle.net/1721.1/50421.

7

Poorman, Kenneth E. (Kenneth Earl) 1967. "On the complete testing of simple safety-related software." Thesis, Massachusetts Institute of Technology, 1994. http://hdl.handle.net/1721.1/36439.

8

Bydell, Linn. "Evaluation of the thermal-hydraulic software GOTHIC for nuclear safety analyses." Thesis, Uppsala universitet, Tillämpad kärnfysik, 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-202808.

Abstract:
The aim of this master's thesis was to evaluate the thermal-hydraulic calculation software GOTHIC for the purpose of nuclear containment safety analyses. The evaluation was performed against some of the Marviken full-scale containment experiments, and a comparison was also made against the two codes RELAP5 and COPTA. Models of different complexity were developed in GOTHIC, and the parameters pressure, temperature and energy in different areas of the enclosure were investigated. The GOTHIC simulations in general showed good agreement with the Marviken experimental results and agreed better overall than RELAP5. From the results it was possible to conclude that the developed GOTHIC model provided a good representation of the Marviken facility.
9

Loza, Peñaran Miguel Angel. "Control dinámico de un reactor nuclear PWR utilizando software libre (SCICOS)." Bachelor's thesis, Universidad Nacional Mayor de San Marcos, 2009. https://hdl.handle.net/20.500.12672/15122.

Abstract:
This work presents, in a clear and simple way, the use of the free software SCICOS to model the kinetics and dynamics of a PWR nuclear reactor. We believe this work can support the teaching and easy understanding of how a nuclear reactor works, as studied in nuclear reactor physics courses. The differential equations used in this model are first order and easy to understand. The model used is the point-kinetics model, which describes the populations of neutrons and of precursor nuclei and ultimately determines the power of the PWR reactor. The thermal model represents the heat transferred by the fuel and the coolant from the core to the steam generator. Finally, we model the total reactivity of the reactor, composed of the control-rod reactivity and the reactivities associated with fuel and coolant temperature; this stage keeps the reactor under control. Demonstration examples of the programming, data and results are presented in Chapter V.
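For reference, the point-kinetics model mentioned above is commonly written with the delayed-neutron precursors collapsed into one effective group (standard notation; the thesis itself may use six precursor groups):

$$\frac{dn}{dt} = \frac{\rho(t) - \beta}{\Lambda}\, n + \lambda C, \qquad \frac{dC}{dt} = \frac{\beta}{\Lambda}\, n - \lambda C,$$

where $n$ is the neutron population, $C$ the precursor concentration, $\rho$ the total reactivity, $\beta$ the delayed-neutron fraction, $\Lambda$ the neutron generation time and $\lambda$ the precursor decay constant.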
Professional proficiency report
10

Veerasamy, Saravanan. "Validation of BaBar tracking software using lambda hyperon." Thesis, University of Iowa, 2007. http://ir.uiowa.edu/etd/141.

11

Barroso, Adelanir Antonio. "DOC BASE NUCLEAR 1.01: software de gestão de documentos em Serviços de Medicina Nuclear Juiz de Fora." Universidade Federal de Juiz de Fora (UFJF), 2017. https://repositorio.ufjf.br/jspui/handle/ufjf/4958.

Abstract:
DOC BASE NUCLEAR 1.01 is document management software for Nuclear Medicine Services (SMN), which works via local and remote (web) access with the main objective of electronically filing and controlling the documentation required by the Brazilian regulatory agencies, with special emphasis on the National Agency of Sanitary Surveillance (ANVISA), the National Nuclear Energy Commission (CNEN) and the Ministry of Labor and Social Security (MTPS). The product enables the creation of a checklist of the required documentary base, with definitions of the origin and validity of the documents. The initial process originated from the analysis of physical folders that were created and indexed, with printed documents that served as a basis to guide the design of the software, developed in the Visual Basic language on the .NET platform, with an SQL Server database and the ASP.NET language. The software was created to allow the inclusion of digitised copies of specific legislation, as well as additional documents and their specifications, with guidelines and alerts for the permanent maintenance of the documentary base in accordance with current legislation. At a later stage of the project, the applications of the software were expanded to allow the inclusion of documents from different sources, such as internal documents of the SMN, facilitating their centralisation, electronic archiving and control. The software generates a digital list of requirements and supports the inclusion of new documents, maintaining the algorithms of the requirements and the prerequisites for obtaining them. A search box by denomination makes it easier to locate and access documents, which, once selected, are grouped under the generic name of "documents cart" and, at the end of the selection, are finalized (closed) with the creation of a custom folder for the intended use. The software was tested operationally in Nuclear Medicine Services, with a good response regarding its objectives, usability and user interaction.
12

Gnau, Andrew Patrick. "Evaluation of the regulatory review process for the software development life cycle." Thesis, Massachusetts Institute of Technology, 1997. http://hdl.handle.net/1721.1/43344.

13

Ouyang, Meng. "An integrated formal approach for developing reliable software of safety-critical system." Thesis, Massachusetts Institute of Technology, 1995. http://hdl.handle.net/1721.1/11285.

14

Oliver, William A. "Monitoring Software and Charged Particle Identification for the CLAS12 Detector." VCU Scholars Compass, 2019. https://scholarscompass.vcu.edu/etd/6031.

Abstract:
The CEBAF Large Acceptance Spectrometer for the 12 GeV era, known as CLAS12, uses the time-of-flight (TOF) system to identify charged particles from scattering events between the beam and target. The TOF system is divided into two parts, the forward time-of-flight system and the central time-of-flight system, which subtend different polar angles of the detector geometry for wide acceptance of scattered particles. Reconstruction is the service used to identify particles from beam-target interactions; the point where an interaction occurs is called a vertex. The vertex position is traced back using the tracking system and the TOF system, and the resolution of the detector affects the accuracy of the reconstructed vertex location. The goal of this work is to develop software for a validation suite for CLAS12, including central and forward tracking plots. Plots are developed to check the precision of the reconstructed vertices in both the central and forward detectors, assuming a target with zero dimension at v_z = 0 and an extended target of 5 cm at v_z = 0. This work also examines the TOF resolution, identifies particles using the TOF detectors, and studies the effect of the vertex correction on the velocity vs. momentum plots.
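The TOF-based identification sketched above reduces to comparing a measured velocity with the velocity expected for each candidate mass. A minimal sketch of the kinematics (natural units with c = 1 for the mass formula; the path, time and momentum values are invented, not taken from the thesis):

```python
import math

C = 29.9792458  # speed of light in cm/ns

def beta_from_tof(path_cm, tof_ns):
    """Measured velocity fraction beta = L / (c * t)."""
    return path_cm / (tof_ns * C)

def mass_from_tof(p_gev, beta):
    """Invariant mass (GeV) from momentum and measured beta, since
    beta = p / sqrt(p^2 + m^2)  =>  m = p * sqrt(1/beta^2 - 1)."""
    return p_gev * math.sqrt(1.0 / beta**2 - 1.0)

# Hypothetical track: 1.2 GeV momentum over a 350 cm flight path in 12.1 ns.
beta = beta_from_tof(350.0, 12.1)
print(mass_from_tof(1.2, beta))  # compare against pion/kaon/proton masses
```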
15

Sui, Yu 1973. "Reliability improvement and assessment of safety critical software." Thesis, Massachusetts Institute of Technology, 1998. http://hdl.handle.net/1721.1/47688.

Abstract:
Thesis (S.M.)--Massachusetts Institute of Technology, Dept. of Nuclear Engineering; and, (S.M.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 1998.
Includes bibliographical references (leaves 95-101).
In order to allow the introduction of safety-related Digital Instrumentation and Control (DI&C) systems in nuclear power plants, the software used by these systems must be demonstrated to be highly reliable. The most widely used and most powerful method for ensuring high software quality and reliability is testing. An integrated methodology is developed in this thesis for reliability assessment and improvement of safety-critical software through testing. The methodology is based upon input domain-based reliability modeling and a structural testing method. Its purpose is twofold. First, it can be used to control the testing process: the methodology provides path selection criteria and stopping criteria for the testing process, with the aim of achieving maximum reliability improvement using the available testing resources. Second, it can be used to assess and quantify the reliability of the software after testing: the methodology provides a systematic mechanism to quantify the reliability and estimate the uncertainty of the software after the testing process.
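As a hedged illustration of input domain-based reliability modeling, the Nelson-style estimate is the textbook form (standard notation, not taken from the thesis):

$$\hat{R} = 1 - \sum_{i \in F} p_i,$$

where $p_i$ is the probability, under the operational profile, that input $i$ is selected on an execution, and $F$ is the set of failure-causing inputs; testing serves to narrow the uncertainty about which inputs belong to $F$.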
by Yu Sui.
S.M.
16

D'Angelo, Elizabeth Marcela Alonso. "Software requirements for a nuclear plant operator advisor, the SIPO case study." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1997. http://www.collectionscanada.ca/obj/s4/f2/dsk2/ftp04/mq29237.pdf.

17

Gomes, António João Matos. "Ressonância magnética nuclear com recurso a um transreceptor rádio controlado por software." Master's thesis, Faculdade de Ciências e Tecnologia, 2014. http://hdl.handle.net/10362/13008.

Abstract:
Dissertation submitted for the degree of Master in Electrical and Computer Engineering
Research on the development of Software-Defined Radio (SDR) consistently tends toward combining the Universal Software Radio Peripheral (USRP) hardware with the GNU Radio software, developed especially for radio communications. Several studies have shown that these two tools can be used together to implement a low-cost and very versatile SDR system. As the technology has developed, the USRP has grown in potential and can be applied to several technologies. With some knowledge of how Nuclear Magnetic Resonance (NMR) works, it is possible to adapt the USRP into equipment capable of performing an NMR experiment. The work carried out in this dissertation consists of implementing a Nuclear Magnetic Resonance (NMR) system using a Software-Defined Radio (SDR) system. A block diagram was built to send a signal similar to the one transmitted in an NMR experiment; a response signal was created to simulate the NMR and was then analyzed as if it were an acquired signal. The signal analysis consists of obtaining the maximum and the relative maxima and fitting them to an inverse exponential expression.
18

Johnson, Stephen Philip. "Mapping numerical software onto distributed memory parallel systems." Thesis, University of Greenwich, 1992. http://gala.gre.ac.uk/8676/.

Abstract:
The aim of this thesis is to further the use of parallel computers, in particular distributed memory systems, by providing strategies for parallelisation and developing the core components of tools to aid scalar software porting. The ported code must not only efficiently exploit the available parallel processing speed and distributed memory, but also enable existing users of the scalar code to use the parallel version with identical inputs, and allow maintenance to be performed by the scalar code author in conjunction with the parallel code. The data partition strategy has been used to parallelise an in-house solidification modelling code, where all requirements for the parallel software were successfully met. To confirm the success of this parallelisation strategy, a much sterner test was used: parallelising the HARWELL-FLOW3D fluid flow package. The performance results of the parallel version clearly vindicate the conclusions of the first example. Speedup efficiencies of around 80 percent have been achieved on fifty processors for sizable models. In both these tests, the alterations to the code were fairly minor, maintaining the structure and style of the original scalar code so that it can easily be recognised by its original author. The alterations made to these codes indicated the potential for parallelising tools, since the alterations were fairly minor and usually mechanical in nature. The current generation of parallelising compilers relies heavily on heuristic guidance in parallel code generation and other decisions that may be better made by a human; as a result, the code they produce will almost certainly be inferior to manually produced code. In order not to sacrifice parallel code quality when using tools, the scalar code analysis used to identify inherent parallelism in an application code, as in parallelising compilers, has been extended to conservatively eliminate assumed dependencies, since these dependencies can greatly inhibit parallelisation. Extra information has been extracted both from control flow and from processing symbolic information. The tests devised to utilise this information enable the non-existence of a significant number of previously assumed dependencies to be proved; in some cases, the number of true dependencies has been more than halved. The dependence graph produced is of sufficient quality to greatly aid parallelisation, with user interaction and interpretation, parallelism detection and code transformation validity being less inhibited by assumed dependencies. The use of tools rather than the black-box approach removes the handicaps associated with using heuristic methods, where any relevant heuristic methods exist.
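For context, the quoted efficiency corresponds to the usual definitions (standard formulas, not specific to this thesis):

$$S_P = \frac{T_1}{T_P}, \qquad E_P = \frac{S_P}{P},$$

so an efficiency of $E_{50} \approx 0.8$ on $P = 50$ processors means a speedup of $S_{50} \approx 40$, i.e. the parallel run completes in roughly $T_1/40$ of the scalar run time.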
19

Shelbaya, Olivier. "Laser spectroscopy of rare Rubidium isotopes and development of data analysis software at TRIUMF." Thesis, McGill University, 2012. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=110585.

Abstract:
TRIUMF, Canada's national laboratory for nuclear and particle physics, has been peering into the shape and structure of highly unstable, rare isotopes. Employing the method of collinear fast-beam laser spectroscopy, the nuclear ground-state spin, hyperfine moments and charge radii can be determined. The laser spectroscopy group at TRIUMF has used this method to investigate the properties of rubidium isotopes, where 76-98 Rb had been previously studied at ISOLDE by Thibault et al. Laser spectroscopy was performed at TRIUMF on the neutron-deficient 74-76 Rb, providing insight into the behaviour leading up to the proton drip line. On the neutron-rich end, a programme is in place to extend measurements into the highly deformed 98-100 Rb. Preliminary measurements of the spectra of the spin-0 isotope 92 Rb have been carried out on beams with intensities of 10^7 per second, representing the first time heavy rubidium isotopes have been produced and studied spectroscopically at TRIUMF. These measurements have been carried out in conjunction with the implementation of a new MCS-based data acquisition system, greatly improving the data collection and analysis capabilities of the laser spectroscopy group.
20

Valastyán, Iván. "Software Solutions for Nuclear Imaging Systems in Cardiology, Small Animal Research and Education." Doctoral thesis, KTH, Medicinsk teknik, 2010. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-12069.

Abstract:
The sensitivity for observing physiological processes makes nuclear imaging an important tool in medical diagnostics. Different types of nuclear imaging modalities, with emphasis on the software components and image reconstructions, are presented in this thesis: the Cardiotom for myocardial heart studies at the Karolinska University Hospital, the small-animal Positron Emission Tomography (PET) scanners for research, and the SPECT, PET, spiral CT and Cardiotom demonstrators for the Royal Institute of Technology medical imaging laboratory. A modular and unified software platform has been developed for data representation, acquisition, visualization, reconstruction and presentation in the programs of the imaging devices mentioned above. The high-performance 3D ML-EM and OS-EM iterative image reconstruction methods are implemented on both the Cardiotom and miniPET scanners. As a result, the in-slice resolution of the first two prototypes of the Cardiotom today is the same as with the formerly used filtered back-projection, while the in-depth resolution is considerably increased. Another improvement due to the new software is the shorter time required for data acquisition and image reconstruction: the new electronics with the newly developed software ensure images for medical diagnosis within 10 minutes from the start of the examination. The first system from the standardized production of the Cardiotom cameras is in the test phase. The performance parameters (sensitivity, spatial and energy resolution, coincidence time resolution) of the full-ring miniPET camera are comparable to other small-animal PET systems.
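For reference, the ML-EM iteration mentioned above has the standard multiplicative update (standard notation, not the thesis' own): with measured counts $y_i$ in projection bin $i$, system matrix $a_{ij}$ (the probability that activity in voxel $j$ is detected in bin $i$) and current image estimate $x^{(k)}$,

$$x_j^{(k+1)} = \frac{x_j^{(k)}}{\sum_i a_{ij}} \sum_i a_{ij}\, \frac{y_i}{\sum_{j'} a_{ij'}\, x_{j'}^{(k)}}.$$

OS-EM accelerates this by applying the same update over ordered subsets of the projection bins in turn.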
21

Gottardo, Marco. "New hardware and software technologies for real-time control in nuclear fusion experiments." Doctoral thesis, Università degli studi di Padova, 2018. http://hdl.handle.net/11577/3422811.

Abstract:
Current machines for the study of nuclear fusion do not produce energy; their output is essentially a large amount of data. The accuracy of the collected data, and their density within narrow temporal samples, can determine the effectiveness of the real-time control systems to be installed in future reactors. We set ourselves the objective of designing and testing a high-speed, high-density data acquisition system based on the latest generation of FPGA technologies. This thesis uses the latest products released by Xilinx to design a system that acquires streams of signals from generic probes (specifically, magnetic probes). The Zynq 7000 family is today the state of the art in SoC systems, integrating a powerful and extensive FPGA section with a multicore ARM processor. Of fundamental importance will be the drastic reduction of signal cables between the sensors and the acquisition systems, with the dual objective of eliminating induced noise and drastically lowering installation costs. The magnetic field configuration in RFX is characterised by fast variations of all three field components during the pulse, with relevant non-axisymmetry in the toroidal direction. Typical spectra exhibit modes up to n=15 in the toroidal direction and mainly m=0 and m=1 in the poloidal direction. As a consequence, probe signals have a large dynamic range (more than 60 dB) and an extended frequency spectrum (several tens of kHz). Therefore, a large number of probes are required to correctly identify the complex spatial structure of the plasma column. To reduce shielding effects, probes must be installed inside the stabilising shell. The three components of the field outside the vacuum vessel can be very different in amplitude: at the same time, one can reach 0.8 T while another can be lower than a few mT, and they vary very quickly. The installed probes have to guarantee an uncertainty of less than 1 mT to correctly reconstruct the plasma behaviour. These two specifications are particularly stringent and require accurate calibration and careful probe alignment to minimise the spurious effect of unwanted components. A further design specification for the sensors is due to the maximum operating temperature of the vacuum vessel (200 °C). The analog acquisition systems must exhibit high isolation, high speed and resolution, but above all a low noise level; the noise must stay below minimum margins throughout the frequency spectrum contained in the signals provided by the magnetic probes. The main topic of the thesis is to verify the suitability of the ATCA MIMO ISOL modules in the upper and lower parts of the signal spectrum of bi-axial magnetic probes, so that they can be integrated into the new FPGA acquisition and real-time control systems in RFX.
22

Stenberg, Johan. "Software Tools for Design of Reagents for Multiplex Genetic Analyses." Doctoral thesis, Uppsala : Acta Universitatis Upsaliensis, 2006. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-6832.

23

Yayla, Ihsan. "Filter Design Software By Synthesis Method." Master's thesis, METU, 2009. http://etd.lib.metu.edu.tr/upload/3/12610279/index.pdf.

Abstract:
In this study, a Windows-based computer program named Synthesis Software is developed for designing filters with equal-ripple or maximally flat passbands and general stopbands, using the cascade synthesis technique in the transformed frequency domain. The synthesis technique is applicable to lumped-element and commensurate-line distributed-element filters with lowpass, highpass or bandpass characteristics; singly or doubly terminated filters can be synthesized. This software is based on previous software developed in the EEE Department of Middle East Technical University; all the previous programs were gathered in the well-known software Filpro, written in Pascal for the DOS environment. The new software is thus a conversion of the Synthesis part of Filpro from the DOS environment into the Windows environment in the language C#, with some improvements in root-finding algorithms for numerical conditioning. Synthesis Software has three parts. The first and main part is the implementation of the synthesis technique using object-oriented programming; in this way, the synthesis implementation is isolated from the other parts of Synthesis Software and can be used by other filter design programs as a module. The second part is the response-plotting section, in which insertion loss, return loss, time delay, phase and Smith chart responses are calculated and displayed. The last part is the user interface, which provides a user-friendly environment for typing in the parameters of the filter to be designed; this part uses the Synthesis and Plot parts as modules.
24

Argante, Erco. "CoCa a model for parallelization of high energy physics software /." Eindhoven : Eindhoven University of Technology, 1998. http://catalog.hathitrust.org/api/volumes/oclc/41892351.html.

25

Medina, Calderón Richard. "Desarrollo de software para cálculo neutrónico en el reactor RP-10." Bachelor's thesis, Universidad Nacional Mayor de San Marcos, 2004. https://hdl.handle.net/20.500.12672/3204.

Abstract:
This work presents the development of the WIMCIT software, built on the basis of the CPC [16] (Reactor Calculation Code). WIMCIT runs the simulation, automates the numerical inputs and generates the mathematical model, in standard format, for the neutronic calculation code CITATION [11], applied to the Peruvian nuclear reactor RP-10 in 2 and 3 dimensions and for several energy groups. CITATION computes the numerical solution of the transport equation in its diffusion approximation using the finite-difference method, and produces outputs for later analysis with WIMCIT. Finally, graphical capabilities and communication with the operating system were added to the program, enabling robust handling of the process, easy user-software interaction, and substantial improvements in the visualization of results.
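For reference, the diffusion approximation that CITATION solves has, in one energy group, the standard k-eigenvalue form (standard notation, not taken from the thesis):

$$-\nabla \cdot D \nabla \phi + \Sigma_a \phi = \frac{1}{k_{\mathrm{eff}}}\, \nu \Sigma_f \phi,$$

which the finite-difference method discretizes on a mesh, e.g. in 1D with uniform spacing $\Delta x$:

$$-D\,\frac{\phi_{i+1} - 2\phi_i + \phi_{i-1}}{\Delta x^2} + \Sigma_a \phi_i = \frac{\nu \Sigma_f}{k_{\mathrm{eff}}}\, \phi_i.$$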
Thesis
26

Llerena, Herrera Isbel. "An automated software for analysis of experimental data on decay heat from spent nuclear fuel." Thesis, Uppsala universitet, Tillämpad kärnfysik, 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-188831.

Abstract:
The Swedish Nuclear Fuel and Waste Management Company (SKB) has developed a method for final disposal of spent nuclear fuel. This technique requires accurate measurement of the residual decay heat of every assembly. For this purpose, depletion codes as well as calorimetric and gamma-ray spectroscopy experimental methods have been developed and evaluated. In this work a prototype analysis tool has been developed to automate the analysis of both calorimetric and gamma-ray spectroscopy measurements. The performance of the analysis tool has been investigated by comparing its output with earlier results and calculations. Parallel to the software development, new measurements on 73 BWR assemblies were performed. The results obtained for the determination of the residual decay heat are presented. Finally, suggestions for further development are outlined and discussed.
27

曾熙旻 and Hei-man Tsang. "Simulations and software developments for cosmic-ray and particle physics experiments in underground laboratories." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2007. http://hub.hku.hk/bib/B39557030.

28

Han, Yunong. "Load-aware traffic control in software-defined enterprise Wireless Local Area Networks." Thesis, University of Essex, 2018. http://repository.essex.ac.uk/22379/.

Abstract:
With the growing popularity of Bring Your Own Device (BYOD), modern enterprise Wireless Local Area Network (WLAN) deployments always consist of multiple Access Points (APs) to meet the fast-increasing demand for wireless access. In order to avoid network congestion, which leads to issues such as suboptimal Quality of Service (QoS) and degraded user Quality of Experience (QoE), intelligent network traffic control is needed. Software-Defined Networking (SDN) is an emerging architecture, intensively discussed as one of the most promising technologies to simplify network management and service development. In the SDN architecture, network management is directly programmable because it is decoupled from the forwarding layer. By applying SDN to the existing enterprise WLAN framework, network services can be flexibly implemented to support intelligent network traffic control. This thesis studies the architecture of software-defined enterprise WLANs and how to improve network traffic control from both a client-side and an AP-side perspective. By extending an existing software-defined enterprise WLAN framework, two adaptive algorithms are proposed to provide client-based mobility management and load balancing; custom protocol messages and an AP load metric are introduced to enable the proposed algorithms. Moreover, a software-defined enterprise WLAN system is designed and implemented on a testbed, and a load-aware automatic channel switching algorithm and a QoS-aware bandwidth control algorithm are proposed to achieve AP-based network traffic control. Experimental results from the testbed show that the designed system and algorithms significantly improve the performance of traffic control in enterprise WLANs in terms of network throughput, packet loss rate, transmission delay and jitter.
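A minimal sketch of a load-aware association decision in the spirit of the algorithms described above. The load metric (a weighted mix of client count and channel utilisation) and its parameters are invented for illustration; they are not the thesis' actual metric.

```python
# Hypothetical load-aware AP selection; metric and weights are assumptions.

def ap_load(clients, channel_util, w_clients=0.5, w_util=0.5, max_clients=50):
    """Normalised AP load in [0, 1] from client count and channel utilisation."""
    return w_clients * min(clients / max_clients, 1.0) + w_util * channel_util

def pick_ap(candidate_aps):
    """Associate the client with the least-loaded candidate AP."""
    return min(candidate_aps, key=lambda ap: ap_load(ap["clients"], ap["util"]))

aps = [
    {"name": "ap1", "clients": 40, "util": 0.8},
    {"name": "ap2", "clients": 12, "util": 0.3},
]
print(pick_ap(aps)["name"])  # -> ap2, the lighter-loaded AP
```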
29

Gruber, Liliane Dailei Almeida. "Mediação do professor no uso do software educativo cidade do átomo : abordagem dos temas energia nuclear e radioatividade no ensino médio." Biblioteca Digital de Teses e Dissertações da UFRGS, 2014. http://hdl.handle.net/10183/116495.

Abstract:
Considering that the use of information and communication technologies in pedagogical practices, allied with an approach to nuclear subjects, is a relevant, challenging and still under-explored theme, the present project aimed to create a space for reflection on learning relationships and on the role of the teacher in the secondary-school classroom. A qualitative methodology was used for the development of this research. Audio and video recordings, class diary annotations, textual productions of the subjects under study, and navigation log files from the educational software environment were used as data collection instruments. The use of digital resources, mainly the educational software Cidade do Átomo, enabled a role-playing exercise whose objective is to approach controversial scientific and technological subjects, such as electricity production from nuclear energy. In this exercise, the students played various roles within the same context, defending and debating their characters' points of view. Through the students' questions and the relationships established among their peers and with the technological resources, it was possible to observe the formation of a fruitful interactive environment. This finding highlights the potential of the educational space that was formed, which resulted in a reinforcement of the teaching-learning process.
30

MILANESIO, DANIELE. "Development and Validation of a software for the Analysis of Antennas for Controlled Magnetically Confined Nuclear Fusion." Doctoral thesis, Politecnico di Torino, 2008. http://hdl.handle.net/11583/2498895.

Abstract:
Ion-Cyclotron Resonance Heating (ICRH) and Lower-Hybrid (LH) Resonance Heating are key parts of all present-day experiments toward the realization of controlled nuclear fusion with magnetic confinement. Both auxiliary heating systems are essentially antennas made up of complex plasma-facing components, charged with the difficult mission of delivering extremely high RF power to the plasma with typically poor loading (compared to antennas in radar or broadcasting) and high near-field reactive energies that generate serious mismatches to the feeding power lines. Because these antennas cannot be tested in plasmas outside the actual experiments for which they are designed, the availability of accurate simulation tools is a key factor in assisting their design. This work is concerned with the numerical analysis of these plasma-facing antennas, through the upgrade of an existing code called TOPICA and its recent extension named TOPLHA. The antenna simulation code must be able to handle the actual geometry of both antennas (including their housing in the experiment, the protective screens, etc.), which have witnessed a constant increase in complexity. On the other hand, the plasma loading on the antenna is extremely sensitive to the plasma profile, especially near the antenna itself: the antenna code must therefore correctly account for the plasma conditions, which makes it necessary to include non-cold plasma terms (which affect resonances). More specifically, since the frequency ranges of the two heating systems are quite different (below 100 MHz for ICRH and around a few GHz for LH), the two antennas differ not only in their geometrical features (ICRH antennas are essentially metal loops while LH antennas are open-ended waveguide arrays), but also in the way the wave propagates in the plasma and the heating process occurs. The problem can be solved with considerable numerical efficiency by formally separating it into two parts: the vacuum region around the antenna conductors and the plasma region. This approach leads to a formulation via a set of two coupled integral equations, further discretized via the Method of Moments (MoM). The MoM is used in a hybrid form: a spatial-domain approach is employed for the antenna and other conductors (with high geometrical complexity), while a spectral-domain approach is used for the plasma region (as the plasma description is naturally available in this domain). Numerical tests have already been performed for simple ICRH launchers, and results compared with available experimental data both in vacuum and with real plasmas. Ultimately, this work has extended the existing capabilities of TOPICA in two directions: the efficient handling of IC antennas housed in near but distinct recesses, and the LH range. Both problems are multi-cavity, in the sense that the antennas are recessed in a modular way. Starting from the validated version of TOPICA, a new approach has been developed to allow the code to handle a much greater number of unknowns (from 10,000 to more than 150,000). In the IC range, this is dictated primarily by geometrical complexity, while the overall electrical length of one recess does not exceed one half free-space wavelength; in the LH range, conversely, the geometry is smoother, but the electrical size is larger.
The multi-cavity approach addressed in this work exploits the fact that the inner parts of the individual cavities are coupled to each other only through the equivalent currents on their apertures, and accordingly solves the global MoM linear system block-wise, with significant memory and time savings. Furthermore, this modular approach led to heavy parallelization of the code, with a substantial increase in overall performance.
31

Possani, Rafael Guedes. "Re-engenharia do software SCMS para uma linguagem orientada a objetos (Java) para uso em construções de phantoms segmentados." Universidade de São Paulo, 2012. http://www.teses.usp.br/teses/disponiveis/85/85133/tde-04062012-152752/.

Abstract:
Recent treatment planning systems depend strongly on CT images, and the tendency is for internal dosimetry procedures in nuclear medicine therapy to also be image-based, using magnetic resonance imaging (MRI) and computed tomography (CT) to extract anatomical and histological information, as well as functional imaging or activity maps such as PET and SPECT. This information, associated with radiation transport simulation software, is used to estimate internal dose in patients undergoing treatment in nuclear medicine. This work aims to re-engineer the software SCMS, an interface between the Monte Carlo code MCNP and the medical images that carry information from the patient under treatment. In other words, the necessary information contained in the images is interpreted and presented in a specific format to the Monte Carlo MCNP code, which performs the simulation of radiation transport. Therefore, the user does not need to understand the complex process of inputting data into MCNP, as SCMS is responsible for automatically constructing the anatomical data of the patient, as well as the radioactive source data. SCMS was originally developed in Fortran-77; in this work it was rewritten in an object-oriented language (Java). New features and data options have also been incorporated into the software. The new software thus has a number of improvements, such as an intuitive GUI and a menu for selecting the energy spectrum corresponding to a specific radioisotope stored in an XML data bank. The new version also supports new materials, and the user can specify an image region of interest for the calculation of absorbed dose.
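A hypothetical sketch of the image-to-MCNP translation role that SCMS plays, per the abstract: segmented voxel IDs are mapped to material numbers and emitted as a lattice fill list. The tag-to-material table and the output format are simplified illustrations, not the actual SCMS output or the full MCNP card syntax.

```python
# Toy translation of a segmented image slice into an MCNP-style fill list.
# Segment IDs and material numbers below are assumptions for illustration.

SEGMENT_TO_MATERIAL = {0: 0, 1: 1, 2: 2, 3: 3}  # e.g. air, soft tissue, lung, bone

def lattice_fill_entries(segmented_slice):
    """Flatten one segmented slice into a FILL-style list of material universes."""
    entries = [str(SEGMENT_TO_MATERIAL[v]) for row in segmented_slice for v in row]
    return "fill= " + " ".join(entries)

slice_2x3 = [[0, 1, 1], [3, 2, 1]]  # toy 2x3 segmented slice
print(lattice_fill_entries(slice_2x3))
```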
32

Tondin, José Egidio Marin. "Prospecção de implementação de ensino à distância para a disciplina de fundamentos de física nuclear na Pós-Graduação do Ipen utilizando infra-estrutura de software livre." Universidade de São Paulo, 2009. http://www.teses.usp.br/teses/disponiveis/85/85131/tde-20082009-184625/.

Abstract:
In modern society, technological resources and the internet are of fundamental importance in all areas, and educational activities have to follow this evolution. In this context, distance learning is a fundamental tool for educational development, being innovative and stimulating, and offering a variety of resources that complement the students' activities. It is also important to point out that distance learning can reach remote regions, in many cases of difficult access, allowing wide dissemination of knowledge at lower cost. The objective of this work is to assess the interest in, and feasibility of, implementing distance learning for the course on Fundamentals of Nuclear Physics in the graduate program of IPEN, using free software infrastructure, by means of a pilot project on distance learning. The work was developed in three phases. In the first phase, a questionnaire was applied to students with the aim of finding out their profiles and interest in distance learning before they had contact with the pilot project. In the second phase, other students were interviewed, also with the objective of learning their opinion about distance learning, but after they had access to the pilot project. Finally, the professors of the graduate course of IPEN were consulted, also by means of a questionnaire, about their interest in the use of distance learning. The results obtained in the present work show that distance learning is a welcome pedagogical resource for students as well as teachers; these results will support the future implementation of distance learning in the graduate course of IPEN.
33

Nascimento, Pedro Augusto do. "Desenvolvimento de softwares para aplicação em medicina nuclear : cálculo da blindagem PET/CT e otimização de dose para radiofármaco em PET/CT." Repositório Institucional da UnB, 2016. http://repositorio.unb.br/handle/10482/24632.

Abstract:
Master's dissertation—Universidade de Brasília, Faculdade de Ceilândia, Programa de Pós-Graduação em Ciências e Tecnologias em Saúde, 2016.
The PET / CT (Positron Emission Tomography / Computed Tomography) is an image capture technique, high resolution of the human anatomy and physiology, extremely efficient in the diagnosis of metabolically active tumors. Radiological protection of the PET / CT has a special challenge because the high energy of 511 keV photons from the annihilation of pairs differs from other tests diagnostics using ionizing radiation. The calculation of shielding requirements for radiation positron emission of PET / CT facilities proposed in 2006 by the Task Group 108 (TG 108), prepared by the American Association of Physicists in Medicine (AAPM), can be a complex task. In this paper we present two software in the form of Application (App) designed to help in the nuclear medicine. The first, called BlindPet calculates the thicknesses of the shielding barriers used in the installations intended for PET / CT and second App, DosePet calculates the volumes and doses to be administered to patients and residues radiation in the PET/CT preparation room. The software has been designed using the Web Inventor2 MIT App tool for Android platform. The application allows evaluating the amount of radiation still existing in the premises after the applications, increasing security and reducing exposures, and enable greater efficiency in the use of the radiopharmaceutical.
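The abstract does not reproduce DosePet's internal formulas, but any dose-dispensing calculation rests on the radioactive decay law A(t) = A0 * exp(-ln2 * t / T_half). The sketch below is a minimal illustration, not the app's actual code, of how one might compute the volume of an F-18 radiopharmaceutical to draw so that the prescribed activity remains at injection time; the function names and example numbers are hypothetical.

    import math

    F18_HALF_LIFE_MIN = 109.77  # physical half-life of fluorine-18, in minutes

    def decayed_activity(a0_mbq, minutes):
        # A(t) = A0 * exp(-ln2 * t / T_half)
        return a0_mbq * math.exp(-math.log(2.0) * minutes / F18_HALF_LIFE_MIN)

    def volume_to_draw(prescribed_mbq, vial_conc_mbq_per_ml, minutes_until_injection):
        # Concentration the vial will have at injection time, after decay
        conc_then = decayed_activity(vial_conc_mbq_per_ml, minutes_until_injection)
        return prescribed_mbq / conc_then

    # Example: 370 MBq prescribed, vial measured at 500 MBq/mL, injection in 30 min
    print(round(volume_to_draw(370.0, 500.0, 30.0), 2), "mL")  # about 0.89 mL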
APA, Harvard, Vancouver, ISO, and other styles
34

Tsang, Hei-man. "Simulations and software developments for cosmic-ray and particle physics experiments in underground laboratories." Click to view the E-thesis via HKUTO, 2007. http://sunzi.lib.hku.hk/HKUTO/record/B39557030.

Full text
APA, Harvard, Vancouver, ISO, and other styles
35

Kütt, Moritz [Verfasser], Franz [Akademischer Betreuer] Fujara, and Alexander [Akademischer Betreuer] Glaser. "Simulation of Neutron Multiplicity Measurements using Geant4 - Open Source Software for Nuclear Arms Control / Moritz Kütt. Betreuer: Franz Fujara ; Alexander Glaser." Darmstadt : Universitäts- und Landesbibliothek Darmstadt, 2016. http://d-nb.info/1112333177/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
36

Barbalace, Antonio. "Emerging Hardware Architectures and Advanced Open-Source Software Technologies for Real-Time Control and Data Acquisition in Quasi-Continuous Nuclear Fusion Experiments." Doctoral thesis, Università degli studi di Padova, 2011. http://hdl.handle.net/11577/3421603.

Full text
Abstract:
Research in nuclear fusion devices is motivated by the increasing global demand for low-cost and eco-friendly energy. To meet the requirements of an economically attractive fusion power plant, new real-time control and data acquisition systems that support quasi-continuous plasma operation and unstable plasma scenarios have to be investigated. This thesis covers hardware and software design and co-design issues in real-time control and data acquisition for quasi-continuous nuclear fusion experiments, with emphasis on the choice of data acquisition and generation hardware, central processing units, real-time operating systems and application frameworks. The work starts with an introduction to fusion science and magnetic nuclear fusion experimental devices. Two different experiments (RFX-mod and JET) are then presented in order to show several plasma instabilities that need to be controlled in real time, namely Magneto-Hydro-Dynamic (MHD) mode stabilization and Vertical Stabilization (VS). The thesis then focuses on the hardware design of data acquisition boards, including a survey of the acquisition architectures used in commercial devices; it also covers the interpolation problems in analog-to-digital signal conversion that arise when non-simultaneous sampling (several inputs multiplexed onto a single converter) is adopted instead of the established simultaneous sampling, and how the acquired signals can be digitally realigned through interpolation or all-pass filters. A section dedicated to open-source software follows, addressing operating systems and application frameworks for low-latency real-time control and data acquisition. Linux with its real-time patches is shown to behave as a hard real-time operating system on multicore CPUs when combined with resource partitioning and shielding. Different control computational models and the software frameworks implementing them are presented; among EPICS and many others, MARTe has shown the lowest latency and the most bounded jitter. The thesis ends with the design and implementation details of the proposed upgrade of the RFX-mod MHD control system.
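The digital realignment of non-simultaneously sampled channels mentioned above can be illustrated with a short sketch. The following Python fragment is a simplified stand-in for the interpolation and all-pass-filter methods discussed in the thesis: it shifts the channels of a single multiplexed ADC back onto a common time base by linear interpolation. The rates and channel counts are illustrative.

    import numpy as np

    def realign_multiplexed(samples, fs, n_channels):
        # samples[k, i] is channel k's i-th sample; with one multiplexed ADC
        # it was taken at time (i * n_channels + k) / (fs * n_channels).
        # Linear interpolation shifts every channel onto channel 0's time base.
        n = samples.shape[1]
        base = np.arange(n) / fs                         # channel 0's instants
        out = np.empty_like(samples, dtype=float)
        for k in range(n_channels):
            t_k = base + k / (fs * n_channels)           # channel k's instants
            out[k] = np.interp(base, t_k, samples[k])    # resample onto the base
        return out

    # Example: 4 channels scanned sequentially at an aggregate rate of 4 * fs
    fs = 10000.0
    t = np.arange(1000) / fs
    raw = np.stack([np.sin(2 * np.pi * 50 * (t + k / (fs * 4))) for k in range(4)])
    aligned = realign_multiplexed(raw, fs, 4)            # all rows now coincide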
APA, Harvard, Vancouver, ISO, and other styles
37

Hegarty, Declan. "FPGA-based architectures for next generation communications networks." Thesis, University of Glasgow, 2008. http://theses.gla.ac.uk/455/.

Full text
Abstract:
This engineering doctorate concerns the application of Field Programmable Gate Array (FPGA) technology to some of the challenges faced in the design of next generation communications networks. The growth and convergence of such networks has fuelled demand for higher bandwidth systems, and a requirement to support a diverse range of payloads across the network span. The research which follows focuses on the development of FPGA-based architectures for two important paradigms in contemporary networking - Forward Error Correction and Packet Classification. The work seeks to combine analysis of the underlying algorithms and mathematical techniques which drive these applications, with an informed approach to the design of efficient FPGA-based circuits.
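As a language-neutral illustration of the kind of forward error correction such circuits implement (not one of the codes studied in the thesis), the sketch below encodes 4 data bits into a Hamming(7,4) codeword, the simplest single-error-correcting block code. In an FPGA each parity bit of the matrix product reduces to a small XOR tree.

    import numpy as np

    # Systematic generator matrix for Hamming(7,4): 4 data bits -> 7-bit codeword
    G = np.array([[1, 0, 0, 0, 1, 1, 0],
                  [0, 1, 0, 0, 1, 0, 1],
                  [0, 0, 1, 0, 0, 1, 1],
                  [0, 0, 0, 1, 1, 1, 1]], dtype=np.uint8)

    def hamming74_encode(data_bits):
        # Multiply over GF(2): each output parity bit is an XOR of data bits
        return (np.asarray(data_bits, dtype=np.uint8) @ G) % 2

    print(hamming74_encode([1, 0, 1, 1]))  # -> [1 0 1 1 0 1 0]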
APA, Harvard, Vancouver, ISO, and other styles
38

MacIsaac, Liam J. "Modelling smart domestic energy systems." Thesis, University of Glasgow, 2013. http://theses.gla.ac.uk/4214/.

Full text
Abstract:
The increasing price of fossil fuels, coupled with the increased worldwide focus on their contribution to climate change has driven the need to develop cleaner forms of energy generation. The transition to cleaner energy sources has seen a much higher penetration of renewable sources of electricity on the grid than ever before. Among these renewable generation sources are wind and solar power which provide intermittent and often unpredictable energy generation throughout the day depending on weather conditions. The connection of such renewable sources poses problems for electricity network operators whose legacy systems have been designed to use traditional generation sources where supply can be increased as required to meet demand. Among the solutions proposed to address this issue with intermittency in generation are storage systems and automation systems which aim to reduce demand in order to match the available renewable generation. Such a transition would introduce a requirement for more advanced technology within homes to provide network operators with greater control over domestic loads. Another aspect to the transition towards a low-carbon society is the change that will be required to domestic heating systems. Current domestic heating systems largely rely on Natural Gas as their fuel source. In order to meet carbon reduction targets, changes will need to be made to domestic buildings including insulation and other energy efficiency measures. It is also possible that present systems will begin to be replaced by new heating technologies such as ground and air source heat pumps. Due to the effect that such technological transitions will have on domestic end-users, it is important that these new technologies are designed with end-users in mind. It is therefore necessary that software tools are available to model and simulate these changes at the domestic level to guide the design of new systems. This thesis provides a summary of some of the existing building energy analysis tools that are available and shows that there is currently a shortcoming in the capabilities of existing tools when modelling future domestic smart grid technologies. Tools for developing these technologies must include a combination of building thermal characteristics, electrical energy generation and consumption, software control and communications. A new software package was developed which allows for the modelling of small smart grid systems, with a particular focus on domestic systems including electricity, heat transfer, software automation and control and communications. In addition to the modelling of electrical power flow and heat transfer that is available in existing building energy simulation packages, the package provides the novel features of allowing the simulation of data communication and human interaction with appliances. The package also provides a flexible framework that allows system components to be developed in full object-orientated programming languages at run time, rather than having to use additional third-party development environments. As well as describing the background to the work and the design of the new software, this thesis describes validation studies that were carried out to verify the accuracy of the results produced by the package. A simulation-based case study was also carried out to demonstrate the features offered by the new platform in which a smart domestic energy control system including photovoltaic generation, hot water storage and battery storage was developed. 
During the development of this system, new algorithms for obtaining the operating point of solar panels and photovoltaic maximum power point tracking were developed.
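The thesis's own tracking algorithms are not spelled out in the abstract; as context, the sketch below shows the textbook perturb-and-observe baseline that maximum power point trackers commonly build on. The measure_pv and set_voltage hooks are hypothetical stand-ins for the simulated panel and converter.

    def perturb_and_observe(measure_pv, set_voltage, v0=30.0, step=0.5, iters=50):
        # Nudge the operating voltage and keep moving in whichever direction
        # the measured power last increased; reverse when power drops.
        v, direction, last_power = v0, +1, 0.0
        for _ in range(iters):
            volts, amps = measure_pv(v)     # hypothetical hardware/model hook
            power = volts * amps
            if power < last_power:          # overshot the maximum: turn around
                direction = -direction
            last_power = power
            v += direction * step
            set_voltage(v)                  # hypothetical converter command
        return v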
APA, Harvard, Vancouver, ISO, and other styles
39

Hainline, Douglas Ray. "Validation of queries to a relational database." Thesis, University of Greenwich, 1986. http://gala.gre.ac.uk/8670/.

Full text
Abstract:
This thesis addresses the problem of preventing users of a data base system from interrogating it with query language expressions which are syntactically and semantically valid but which do not match the user's intentions. A method is developed of assisting users of a relational data base to formulate query language expressions which are valid representations of the abstract query which the user wishes to put. The central focus of the thesis is a method of communicating the critical aspects of the semantics of the relation which would be generated in response to a user's proposed operations on the data base. Certain classes of user error which can arise when using a relational algebra query system are identified, and a method of demonstrating their invalidity is presented. This is achieved by representing via a graph the consequences of operations on relations. Also developed are techniques allowing the generation of pseudo-natural language text describing the relations which would be created as the result of the user's proposed query language operations. A method of allowing the creators of data base relations to incorporate informative semantic data about their relations is developed, and a method of permitting this data to be modified by query language operations is specified. Pragmatic linguistic considerations which arise when this data is used to generate pseudo-natural language statements are addressed, and examples of the system's use are given.
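As a toy illustration of the pseudo-natural-language idea (not the thesis's actual grammar or semantic model), the sketch below renders a small relational-algebra expression tree as an English sentence describing the relation it would produce.

    def describe(expr):
        # Render a tiny relational-algebra tree as an English sentence.
        op = expr[0]
        if op == "relation":                 # ("relation", name)
            return expr[1]
        if op == "select":                   # ("select", condition, child)
            return "those rows of %s where %s" % (describe(expr[2]), expr[1])
        if op == "project":                  # ("project", columns, child)
            return "the %s column(s) of %s" % (", ".join(expr[1]), describe(expr[2]))
        if op == "join":                     # ("join", left, right, on)
            return "%s combined with %s by matching %s" % (
                describe(expr[1]), describe(expr[2]), expr[3])
        raise ValueError("unknown operator: " + op)

    q = ("project", ["name"], ("select", "salary > 30000", ("relation", "EMPLOYEE")))
    print(describe(q))
    # -> the name column(s) of those rows of EMPLOYEE where salary > 30000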
APA, Harvard, Vancouver, ISO, and other styles
40

Al-Gburi, Abeer. "Robustness analysis of nonlinear systems with feedback linearizing control." Thesis, University of Southampton, 2015. https://eprints.soton.ac.uk/375170/.

Full text
Abstract:
The feedback linearization approach is a control method which employs feedback to stabilize systems containing nonlinearities. In order to accomplish this, it assumes perfect knowledge of the system model to linearize the input-output relationship. In the absence of perfect system knowledge, modelling errors inevitably affect the performance of the feedback controller. This thesis introduces a design and analysis approach for robust feedback linearizing controllers for nonlinear systems. This approach takes these model errors into account and provides robustness margins to guarantee the stability of feedback linearized systems. Based on robust stability theory, two important tools, namely the small gain theorem and the gap metric, are used to derive and validate robustness and performance margins for the feedback linearized systems. It is shown that the small gain theorem can provide unsatisfactory results, since the stability conditions found using this approach require the nonlinear plant to be stable. However, the gap metric approach is shown to yield general stability conditions which can be applied to both stable and unstable plants. These conditions show that the stability of the linearized systems depends on how exact the inversion of the plant nonlinearity is within the nonlinear part of the controller. Furthermore, this thesis introduces an improved robust feedback linearizing controller which can classify the system nonlinearity into stable and unstable components and preserve the stabilizing action of the inherently stabilizing nonlinearities in the plant, cancelling only the unstable nonlinear part. Using this controller, it is shown that system stability depends on the bound on the input nonlinear component of the plant and on how exact the inversion of the unstable nonlinearity of the plant is within the nonlinear part of the controller.
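To make the dependence on exact model knowledge concrete, the sketch below simulates a scalar plant dx/dt = f(x) + g(x)u under the basic feedback-linearizing law u = (v - f(x))/g(x). It is a minimal illustration rather than the thesis's robustness analysis; the margins discussed above quantify what happens when the controller's copy of f and g differs from the true plant.

    def simulate_feedback_linearized(f, g, x0, k, dt=1e-3, steps=5000):
        # Scalar plant dx/dt = f(x) + g(x) u under u = (v - f(x)) / g(x),
        # outer loop v = -k x: with an exact model the closed loop reduces
        # to dx/dt = -k x; model error in f or g breaks this cancellation.
        x = x0
        for _ in range(steps):
            v = -k * x                       # linear outer-loop control
            u = (v - f(x)) / g(x)            # cancel the plant nonlinearity
            x = x + dt * (f(x) + g(x) * u)   # forward-Euler plant step
        return x

    # Example: stabilize dx/dt = x**2 + u starting from x(0) = 1
    print(simulate_feedback_linearized(lambda x: x ** 2, lambda x: 1.0, 1.0, k=5.0))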
APA, Harvard, Vancouver, ISO, and other styles
41

Grammatikopoulos, Panagiotis. "Computer simulation of dislocation interaction with radiation-induced obstacles in iron." Thesis, University of Liverpool, 2009. http://livrepository.liverpool.ac.uk/1218/.

Full text
Abstract:
Assessment of candidate materials for fusion power plants provides one of the major structural-materials challenges of the coming decades. Computer simulation provides a useful alternative to experiments on real-life irradiated materials. Within the framework of a multi-scale modelling approach, atomic-scale studies by molecular dynamics (MD) and statics (MS) are important, since they enable understanding of atomic interaction mechanisms invisible at coarser scales. Nano-scale defect clusters, such as voids, solute-atom precipitates and dislocation loops, can form in metals irradiated by high-energy atomic particles. Since they are obstacles to dislocation glide, they can affect plasticity, substantially changing the yield and flow stresses and the ductility. In this study, a model for α-Fe developed by Osetsky and Bacon [26] has been used that enables dislocation motion under applied shear strain at various temperatures and strain rates. Three main results were obtained. First, the two interatomic potentials used (A97 [79] and A04 [31]) were assessed with respect to reproducing dislocation properties. Both were in good agreement except on one point: an unexpected and not previously reported displacement of core atoms along the direction of the dislocation line of a 1/2[111](1-10) edge dislocation was observed for the A97 potential. A connection between this phenomenon and differences in the Peierls stress values for the two potentials was proposed. Second, the interaction of a 1/2[111](1-10) edge dislocation with a number of different configurations of spherical voids and Cu-precipitates 2 and 4 nm in diameter was investigated. The defects were centred on, above and below the dislocation glide plane. The mechanisms governing the interactions were analysed. For the first time it was observed that, by interacting with a void, the dislocation can undergo both positive and negative climb, depending on the void position. A bcc-to-fcc phase transition was observed for the larger precipitates, in agreement with literature findings. Third, the obstacle strength of 1/2‹111› and ‹100› loops was obtained under various conditions and geometries for both potentials. Interactions are sometimes complex, but could be described in terms of conventional dislocation reactions in which the Burgers vector is conserved. The critical resolved shear stress for dislocation breakaway and the fraction of interstitials left behind are wide-ranging. Finally, a map of all obstacle strengths was created for the purpose of comparison. ‹100› loops with Burgers vector parallel to the dislocation glide plane and 1/2‹111› loops proved to be strong obstacles. Small voids are stronger obstacles than Cu-precipitates of the same size. The complexity of some reactions and the variety of obstacle strengths pose a challenge for the development of continuum models of dislocation behaviour in irradiated iron.
APA, Harvard, Vancouver, ISO, and other styles
42

Zhan, Yiyi. "PC-based visual simulation of high pressure arc plasma." Thesis, University of Liverpool, 2011. http://livrepository.liverpool.ac.uk/3433/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
43

Leslie, Robert. "An evaluation of load sharing algorithms for heterogeneous distributed systems." Thesis, University of Greenwich, 1997. http://gala.gre.ac.uk/6224/.

Full text
Abstract:
Distributed systems offer the ability to execute a job at nodes other than the originating one. Load sharing algorithms use this ability to distribute work around the system in order to achieve greater efficiency, which is reflected in substantially reduced response times. In the majority of studies, the systems on which load sharing has been evaluated have been homogeneous in nature. This thesis considers load sharing in heterogeneous systems, in which the heterogeneity is exhibited in the processing power of the constituent nodes. Existing algorithms are evaluated and improved ones proposed. Most of the performance analysis is done through simulation. A model of diskless workstations communicating and transferring jobs by Remote Procedure Call is used. All assumptions about the overheads of inter-node communication are based upon measurements made on the university networks. The comparison of algorithms identifies those characteristics that offer improved performance in heterogeneous systems. The level of system information required for transfer is investigated and an optimum found. Judicious use of the collected information via algorithm design is shown to account for much of the improvement. However, detailed examination of algorithm behaviour compared with that of an 'optimum' load sharing scenario reveals that there are occasions when full use of all the available information is not beneficial. Investigations are carried out on the most promising algorithms to assess their adaptability, scalability and stability under a variety of differing conditions. The standard definitions of load balancing and load sharing are shown not to apply when considering heterogeneous systems. To validate the assumptions in the simulation model, a load sharing scenario was implemented on a network of Sun workstations at the University. While the scope of the implementation was somewhat limited by lack of resources, it does demonstrate the relative ease with which the algorithms can be implemented without alteration of the operating system code or modification at the kernel level.
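The thesis's algorithms themselves are not reproduced in the abstract; the sketch below illustrates a generic sender-initiated, probe-based transfer policy in which heterogeneity enters by weighting each node's queue length by its relative speed. The probe count and threshold are illustrative.

    import random

    def pick_node(queue_lengths, speeds, probes=3, threshold=2.0):
        # Probe a few random nodes; normalise queue length by node speed so
        # that a fast node with a longer queue can still look less loaded.
        candidates = random.sample(range(len(queue_lengths)), probes)
        best = min(candidates, key=lambda n: queue_lengths[n] / speeds[n])
        if queue_lengths[best] / speeds[best] < threshold:
            return best          # transfer the job to this node
        return None              # no probe beat the threshold: run locally

    queues = [4, 0, 7, 2, 1]                 # jobs waiting at each node
    speeds = [1.0, 1.0, 2.0, 1.0, 0.5]       # relative processing power
    print(pick_node(queues, speeds))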
APA, Harvard, Vancouver, ISO, and other styles
44

Wang, Chunliang. "Computer Assisted Coronary CT Angiography Analysis : Disease-centered Software Development." Licentiate thesis, Linköping University, Radiology, 2009. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-17783.

Full text
Abstract:

The substantial advances of coronary CTA have resulted in a surge in the use of this new technique over the last several years, which poses a considerable challenge to radiologists through the increasing number of exams and the large amount of data for each patient. The main goal of this study was to develop a computer tool to facilitate coronary CTA analysis by combining knowledge of medicine and image processing. Firstly, a competing fuzzy connectedness tree algorithm was developed to segment the coronary arteries and extract centerlines for each branch. The new algorithm, which is an extension of the "virtual contrast injection" method, preserves the low-density soft tissue around the coronary arteries, which reduces the possibility of introducing false positive stenoses during segmentation. Secondly, this algorithm was implemented in open-source software in which multiple visualization techniques were integrated into an intuitive user interface, to facilitate user interaction and provide good overviews of the processing results. Considerable effort was put into optimizing the computational speed of the algorithm to meet clinical requirements. Thirdly, an automatic seeding method, which can automatically remove the rib cage and recognize the aortic root, was introduced into the interactive segmentation workflow to further minimize the need for user interaction during post-processing. The automatic procedure is carried out right after the images are received, which saves users time after they open the data. Vessel enhancement and quantitative 2D vessel contour analysis are also included in this new version of the software. In our preliminary experience, visually accurate segmentation results for the major branches were achieved in 74 cases (42 cases reported in paper II and 32 cases in paper III) using our software with limited user interaction. On 128 branches of 32 patients, the average overlap between the centerline created in our software and the manually created reference standard was 96.0%. The average distance between them was 0.38 mm, lower than the mean voxel size. The automatic procedure ran for 3-5 min as a single-thread application in the background. Interactive processing took 3 min on average with the latest version of the software. In conclusion, the presented software provides fast and automatic coronary artery segmentation and visualization. The accuracy of the centerline tracking was found to be acceptable when compared to manually created centerlines.
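The overlap and distance figures quoted above suggest a nearest-point comparison between the computed and reference centerlines; the sketch below shows one plausible formulation of such a metric (the paper's exact definition may differ).

    import numpy as np

    def centerline_agreement(test_pts, ref_pts, tol=1.0):
        # test_pts, ref_pts: (N, 3) and (M, 3) point arrays in mm.
        # Distance from every test point to its nearest reference point:
        d = np.linalg.norm(test_pts[:, None, :] - ref_pts[None, :, :], axis=2)
        nearest = d.min(axis=1)
        overlap = (nearest < tol).mean()     # fraction of points within `tol`
        return overlap, nearest.mean()       # overlap and mean distance

    overlap, mean_dist = centerline_agreement(np.random.rand(100, 3) * 50,
                                              np.random.rand(120, 3) * 50)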

APA, Harvard, Vancouver, ISO, and other styles
45

Maccormick, Marion. "The ALICE Project at the IPN, Orsay: R&D and software developments 1996-2003." Habilitation à diriger des recherches, Université Paris Sud - Paris XI, 2007. http://tel.archives-ouvertes.fr/tel-00159807.

Full text
Abstract:
This document describes in detail the key R&D stages in the design and development of the highly segmented cathode-readout proportional chamber of Station 1 of the Dimuon Arm Spectrometer of the ALICE experiment, recently installed at the LHC. Several experimental aspects are summarized, including the electronics, the mechanical construction, detector modelling, physics simulations and test beams, object-oriented mapping methods, and the results obtained with the prototypes in test beams. The document is written for young experimentalists.
APA, Harvard, Vancouver, ISO, and other styles
46

TONDIN, JOSE E. M. "Prospeccao de implementacao de ensino a distancia para a disciplina de fundamentos de fisica nuclear na pos-graduacao do IPEN utilizando infra-estrutura de software livre." reponame:Repositório Institucional do IPEN, 2009. http://repositorio.ipen.br:8080/xmlui/handle/123456789/9392.

Full text
Abstract:
Dissertação (Mestrado)
IPEN/D
Instituto de Pesquisas Energéticas e Nucleares - IPEN-CNEN/SP
APA, Harvard, Vancouver, ISO, and other styles
47

Percival, Graham Keith. "Physical modelling meets machine learning : performing music with a virtual string ensemble." Thesis, University of Glasgow, 2013. http://theses.gla.ac.uk/4253/.

Full text
Abstract:
This dissertation describes a new method of computer performance of bowed string instruments (violin, viola, cello) using physical simulations and intelligent feedback control. Computer synthesis of music performed by bowed string instruments is a challenging problem. Unlike instruments whose notes originate with a single discrete excitation (e.g., piano, guitar, drum), bowed string instruments are controlled with a continuous stream of excitations (i.e. the bow scraping against the string). Most existing synthesis methods utilize recorded audio samples, which perform quite well for single-excitation instruments but not continuous-excitation instruments. This work improves the realism of synthesis of violin, viola, and cello sound by generating audio through modelling the physical behaviour of the instruments. A string's wave equation is decomposed into 40 modes of vibration, which can be acted upon by three forms of external force: A bow scraping against the string, a left-hand finger pressing down, and/or a right-hand finger plucking. The vibration of each string exerts force against the instrument bridge; these forces are summed and convolved with the instrument body impulse response to create the final audio output. In addition, right-hand haptic output is created from the force of the bow against the string. Physical constants from ten real instruments (five violins, two violas, and three cellos) were measured and used in these simulations. The physical modelling was implemented in a high-performance library capable of simulating audio on a desktop computer one hundred times faster than real-time. The program also generates animated video of the instruments being performed. To perform music with the physical models, a virtual musician interprets the musical score and generates actions which are then fed into the physical model. The resulting audio and haptic signals are examined with a support vector machine, which adjusts the bow force in order to establish and maintain a good timbre. This intelligent feedback control is trained with human input, but after the initial training is completed the virtual musician performs autonomously. A PID controller is used to adjust the position of the left-hand finger to correct any flaws in the pitch. Some performance parameters (initial bow force, force correction, and lifting factors) require an initial value for each string and musical dynamic; these are calibrated automatically using the previously-trained support vector machines. The timbre judgements are retained after each performance and are used to pre-emptively adjust bowing parameters to avoid or mitigate problematic timbre for future performances of the same music. The system is capable of playing sheet music with approximately the same ability level as a human music student after two years of training. Due to the number of instruments measured and the generality of the machine learning, music can be performed with ensembles of up to ten stringed instruments, each with a distinct timbre. This provides a baseline for future work in computer control and expressive music performance of virtual bowed string instruments.
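The modal decomposition can be illustrated in miniature: each of the 40 modes behaves as a damped harmonic oscillator driven by the external force. The sketch below is a toy model with made-up damping and coupling constants, not the thesis's simulation library.

    import numpy as np

    def modal_string(f0, force, fs=44100, damping=2.0, n_modes=40):
        # Integrate each mode as a driven, damped harmonic oscillator:
        #   x'' = f(t) - 2*d_n*x' - w_n^2 * x,  w_n = 2*pi*n*f0,  d_n ~ n
        # and sum the modal displacements with a 1/n weighting.
        dt = 1.0 / fs
        out = np.zeros(len(force))
        for n in range(1, n_modes + 1):
            w, d = 2 * np.pi * n * f0, damping * n
            x = v = 0.0
            for i, f in enumerate(force):
                v += dt * (f - 2 * d * v - w * w * x)  # semi-implicit Euler
                x += dt * v
                out[i] += x / n
        return out

    # Example: a 0.1 s constant "bow force" on a G3 (196 Hz) string
    audio = modal_string(196.0, np.ones(4410))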
APA, Harvard, Vancouver, ISO, and other styles
48

Zimmermann, Jürgen P. "Micromagnetic simulations of magnetic exchange spring systems." Thesis, University of Southampton, 2007. https://eprints.soton.ac.uk/65551/.

Full text
Abstract:
Magnetic exchange spring systems are multi-layers or composites of magnetically hard and soft materials that are exchange-coupled across their interfaces. In recent years, research into exchange spring systems has flourished, with potential for application in high-performance permanent magnets, GMR spin devices, magnetic MEMS technology, and magnetic data storage. We investigate the magnetic properties of MBE-grown superlattices with alternating layers of magnetically hard rare-earth-iron (DyFe2, ErFe2) and soft yttrium-iron (YFe2) compounds. They are ideal model systems in which to study exchange spring phenomena. We develop numerical models of the investigated systems and apply micromagnetic simulations. The simulation code OOMMF is extended and used to solve the Landau-Lifshitz-Gilbert and Brown's equations. This allows us to determine the microscopic configuration of the magnetisation, which is not directly accessible by experiment. Magnetic field-sweep measurements of a multilayered DyFe2/YFe2 system show an unexpected triple switching of the magnetically hard DyFe2 layers: the magnetisation of the hard magnetic layers reverses before that of the soft magnetic layers. We reproduce the experimental hysteresis loops of the net and compound-specific magnetisation by means of simulations and explain the switching behaviour. Using similar numerical methods, we interpret experimental data on ErFe2/YFe2 multilayers. At sufficiently high fields, applied perpendicular to the multilayer film plane, the energy is minimised by a multilayer spin flop, a particular spin configuration in which the magnetisation aligns with a direction perpendicular to the applied field. Taking the preceding findings further, we investigate multilayers of ErFe2/YFe2/DyFe2/YFe2. We gain insight into the complex spin configurations in systems of different magnetically hard materials, with a pre-strung domain wall in the soft YFe2 layers. Varying the thickness of the YFe2 layers, we study the changing mutual interference of the switching patterns in the ErFe2 and DyFe2 layers.
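At the core of such simulations is the Landau-Lifshitz-Gilbert equation that OOMMF integrates. The sketch below advances a single unit magnetisation vector by one explicit step of its Landau-Lifshitz form; the time step, gyromagnetic ratio and damping values are merely illustrative.

    import numpy as np

    def llg_step(m, h, dt=1e-13, gamma=2.211e5, alpha=0.1):
        # Explicit Landau-Lifshitz form of the LLG equation for a unit vector m:
        #   dm/dt = -gamma/(1+alpha^2) * (m x H + alpha * m x (m x H))
        mxh = np.cross(m, h)
        dmdt = -gamma / (1.0 + alpha ** 2) * (mxh + alpha * np.cross(m, mxh))
        m_new = m + dt * dmdt
        return m_new / np.linalg.norm(m_new)   # renormalise so |m| = 1

    m = np.array([1.0, 0.0, 0.0])
    h = np.array([0.0, 0.0, 1.0e5])            # effective field along z (A/m)
    for _ in range(200000):                    # m precesses and relaxes toward h
        m = llg_step(m, h)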
APA, Harvard, Vancouver, ISO, and other styles
49

Goutal, Pascale. "Jeux non-coopératifs finis appliqués à la sécurité nucléaire." Paris 6, 1997. http://www.theses.fr/1997PA066362.

Full text
Abstract:
In order to deter possible diversions of nuclear material, an inspector is assigned to carry out a certain number of inspections in a nuclear facility. Two inspections take place at fixed dates known to both the inspector and a potential diverter. Between these two dates, a limited number of inspections may be scheduled by the inspector. We assume that a diversion is always detected at the following inspection. The inspector seeks to minimize the detection time, while the potential diverter seeks to maximize it. Inspection games are studied in two stages. We first consider different families of inspection games, paying particular attention to games with a recursive structure. We then build a game with incomplete information matching the data of the stated problem and develop a software tool, named JADIS, which produces optimal inspection strategies over a large number of periods. The game can thus be studied over a number of periods large enough to confirm that incomplete information reduces the time needed to detect a diversion. A second type of game is also studied, the infiltration game, a pursuit-evasion game on a graph of several arcs connecting two nodes. An infiltrator must travel from one of these two nodes to the other without being intercepted by a guard. Different surveillance settings are studied.
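Zero-sum inspection games of this kind are classically solved for optimal mixed strategies by linear programming. The sketch below, an illustration rather than the JADIS implementation, computes the inspector's optimal mixed strategy and the game value for a small payoff matrix.

    import numpy as np
    from scipy.optimize import linprog

    def solve_zero_sum(payoff):
        # Row player (the inspector) maximises the game value v subject to the
        # expected payoff against every column (pure evader strategy) being >= v.
        payoff = np.asarray(payoff, dtype=float)
        n_rows, n_cols = payoff.shape
        c = np.zeros(n_rows + 1)
        c[-1] = -1.0                            # variables (p_1..p_n, v); min -v
        a_ub = np.hstack([-payoff.T, np.ones((n_cols, 1))])  # v - p @ payoff <= 0
        b_ub = np.zeros(n_cols)
        a_eq = np.hstack([np.ones((1, n_rows)), np.zeros((1, 1))])  # sum(p) = 1
        res = linprog(c, A_ub=a_ub, b_ub=b_ub, A_eq=a_eq, b_eq=[1.0],
                      bounds=[(0, None)] * n_rows + [(None, None)])
        return res.x[:n_rows], res.x[-1]

    strategy, value = solve_zero_sum([[2, -1], [-1, 1]])  # a toy 2x2 game
    print(strategy, value)                                # -> [0.4 0.6] 0.2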
APA, Harvard, Vancouver, ISO, and other styles
50

Jung, Sung Uk. "On using gait to enhance face extraction for visual surveillance." Thesis, University of Southampton, 2012. https://eprints.soton.ac.uk/340358/.

Full text
Abstract:
Visual surveillance finds increasing deployment for monitoring urban environments. Operators need to be able to determine identity from surveillance images and often use face recognition for this purpose. Unfortunately, the quality of the recorded imagery can be insufficient for this task. This study describes a programme of research aimed at ameliorating this limitation. Many face biometric systems use controlled environments where subjects are viewed directly facing the camera. This is less likely to occur in surveillance environments, so it is necessary to handle pose variations of the human head, low frame rates, and low-resolution input images. We describe the first use of gait to enable face acquisition and recognition, by analysis of 3D head motion and gait trajectory, combined with super-resolution techniques. The face extraction procedure consists of three stages: i) head pose estimation by a 3D ellipsoidal model; ii) face region extraction by using a 2D or a 3D gait trajectory; and iii) frontal face extraction and reconstruction by estimating head pose and using super-resolution techniques. The head pose is estimated by using a 3D ellipsoidal model and non-linear optimisation. Region- and distance-based feature refinement methods are used, and a direct mapping from the 2D image coordinates to the object coordinates is developed. In face region extraction, the potential face region is extracted based on the 2D gait trajectory model when a person walks towards a camera. We model a looming field and show how this field affects the image sequences of the human walking. By fitting a 2D gait trajectory model, the face region can then be tracked. For the general case of human walking, a 3D gait trajectory model and heel strike positions are used to extract the face region in 3D space. Wavelet decomposition is used to detect the gait cycle, and a new heel strike detection method is developed. In face extraction, a high-resolution frontal face image is reconstructed from low-resolution face images by super-resolution analysis. Based on the head pose and the 3D ellipsoidal model, invalid low-resolution face images are filtered out and the frontal-view face is reconstructed. By adapting existing super-resolution techniques, a high-resolution frontal face image can be synthesised, which is demonstrated to be suitable for face recognition. The contributions of this research include the construction of a 3D model for pose estimation from planar imagery and the first use of gait information to enhance the face extraction and recognition process, allowing for deployment in surveillance scenarios.
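The abstract does not detail the super-resolution step; the sketch below shows the classic shift-and-add baseline on which such pipelines commonly build, scattering sub-pixel-registered low-resolution crops onto a finer grid and averaging. The frames and their estimated shifts are assumed given.

    import numpy as np

    def shift_and_add(frames, shifts, scale=2):
        # frames: list of HxW low-resolution crops; shifts: per-frame (dy, dx)
        # sub-pixel offsets in low-resolution pixels. Each sample is scattered
        # onto a `scale`-times finer grid at its registered position, then the
        # accumulated grid is averaged.
        h, w = frames[0].shape
        acc = np.zeros((h * scale, w * scale))
        count = np.zeros_like(acc)
        for frame, (dy, dx) in zip(frames, shifts):
            y0, x0 = int(round(dy * scale)), int(round(dx * scale))
            for y in range(h):
                for x in range(w):
                    yy, xx = y * scale + y0, x * scale + x0
                    if 0 <= yy < h * scale and 0 <= xx < w * scale:
                        acc[yy, xx] += frame[y, x]
                        count[yy, xx] += 1
        return acc / np.maximum(count, 1)

    # Example: four 8x8 crops offset by half-pixel steps
    crops = [np.random.rand(8, 8) for _ in range(4)]
    offsets = [(0.0, 0.0), (0.0, 0.5), (0.5, 0.0), (0.5, 0.5)]
    hi_res = shift_and_add(crops, offsets)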
APA, Harvard, Vancouver, ISO, and other styles
