Dissertations / Theses on the topic 'Financial engineering Data processing'
Consult the top 50 dissertations / theses for your research on the topic 'Financial engineering Data processing.'
Fernandez, Noemi. "Statistical information processing for data classification." FIU Digital Commons, 1996. http://digitalcommons.fiu.edu/etd/3297.
Chiu, Cheng-Jung. "Data processing in nanoscale profilometry." Thesis, Massachusetts Institute of Technology, 1995. http://hdl.handle.net/1721.1/36677.
Includes bibliographical references (p. 176-177).
New developments on the nanoscale are taking place rapidly in many fields. Instrumentation used to measure and understand the geometry and properties of small-scale structures is therefore essential. One of the most promising devices for extending measurement science into the nanoscale is the scanning probe microscope (SPM). A prototype of a nanoscale profilometer based on the scanning probe microscope has been built in the Laboratory for Manufacturing and Productivity at MIT. A sample is placed on a precision flip stage, and different sides of the sample are scanned under the SPM to acquire the surface topography of each side. To reconstruct the original three-dimensional profile, techniques such as digital filtering, edge identification, and image matching are investigated and implemented in computer programs to post-process the data, with particular emphasis on nanoscale applications. Important programming issues are also addressed. Finally, the system's error sources are discussed and analyzed.
by Cheng-Jung Chiu.
M.S.
Koriziz, Hariton. "Signal processing methods for the modelling and prediction of financial data." Thesis, Imperial College London, 2008. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.504921.
Laurila, M. (Mikko). "Big data in Finnish financial services." Bachelor's thesis, University of Oulu, 2017. http://urn.fi/URN:NBN:fi:oulu-201711243156.
The aim of this thesis is to examine the concept of big data and to develop an understanding of big data maturity in the Finnish financial sector. The research questions are "What kinds of big data solutions have been deployed in the financial sector in Finland?" and "Which factors slow the implementation of big data solutions in the Finnish financial sector?". As a concept, big data is usually associated with enormous data volumes and economies of scale, which makes it an interesting topic to study in the Finnish context, where the size of datasets is to some extent limited by the size of the market. The thesis presents a literature-based definition of big data and summarizes prior research on the application of big data in Finland. A qualitative analysis of publicly available information on the use of big data in the Finnish financial sector was carried out. The results show that big data is being exploited to some extent in the Finnish financial sector, at least in large organizations. Solutions specific to the financial sector include, for example, the automation of application-handling processes. The clearest factors slowing the implementation of big data solutions are the shortage of skilled labour and the pressure that new regulations place on development resources. The thesis forms an overall picture of the use of big data in the financial sector in Finland. The study is based on an analysis of public material, which lays a foundation for further research on the topic; in future work, interviews could deepen this knowledge further.
Siu, Ka Wai. "Numerical algorithms for data analysis with imaging and financial applications." HKBU Institutional Repository, 2018. https://repository.hkbu.edu.hk/etd_oa/550.
Pan, Howard W. (Howard Weihao) 1973. "Integrating financial data over the Internet." Thesis, Massachusetts Institute of Technology, 1999. http://hdl.handle.net/1721.1/37812.
Includes bibliographical references (leaves 65-66).
This thesis examines the issues and the value added, from both technical and economic perspectives, of solving the information integration problem in the retail banking industry. In addition, we report on an implementation of a prototype of the Universal Banking Application using currently available technologies. We report on some of the issues we discovered and suggest improvements for future work.
by Howard W. Pan.
M.Eng.
Derksen, Timothy J. (Timothy John). "Processing of outliers and missing data in multivariate manufacturing data." Thesis, Massachusetts Institute of Technology, 1996. http://hdl.handle.net/1721.1/38800.
Includes bibliographical references (leaf 64).
by Timothy J. Derksen.
M.Eng.
Nyström, Simon, and Joakim Lönnegren. "Processing data sources with big data frameworks." Thesis, KTH, Data- och elektroteknik, 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-188204.
Big data is a rapidly growing concept. As more and more data is generated and collected, there is an increasing need for efficient solutions that can process all this data in attempts to extract value from it. The purpose of this thesis is to find an efficient way to quickly process a large number of relatively small files, and more specifically to test two frameworks that can be used for big data processing. The two frameworks tested against each other are Apache NiFi and Apache Storm. A method is described for, first, constructing a data flow and, second, testing the performance and scalability of the frameworks running that data flow. The results reveal that Apache Storm is faster than NiFi on the type of test that was performed. When the number of nodes in the tests was increased, performance did not always improve. This shows that increasing the number of nodes in a big data processing chain does not always lead to better performance, and that other measures are sometimes required to increase performance.
徐順通 and Sung-thong Andrew Chee. "Computerisation in Hong Kong professional engineering firms." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 1985. http://hub.hku.hk/bib/B31263124.
Wang, Yi. "Data Management and Data Processing Support on Array-Based Scientific Data." The Ohio State University, 2015. http://rave.ohiolink.edu/etdc/view?acc_num=osu1436157356.
Bostanudin, Nurul Jihan Farhah. "Computational methods for processing ground penetrating radar data." Thesis, University of Portsmouth, 2013. https://researchportal.port.ac.uk/portal/en/theses/computational-methods-for-processing-ground-penetrating-radar-data(d519f94f-04eb-42af-a504-a4c4275d51ae).html.
Grinman, Alex J. "Natural language processing on encrypted patient data." Thesis, Massachusetts Institute of Technology, 2016. http://hdl.handle.net/1721.1/113438.
This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections.
Cataloged from student-submitted PDF version of thesis.
Includes bibliographical references (pages 85-86).
While many industries can benefit from machine learning techniques for data analysis, they often do not have the technical expertise nor the computational power to do so. Therefore, many organizations would benefit from outsourcing their data analysis. Yet, stringent data privacy policies prevent outsourcing sensitive data and may stop the delegation of data analysis in its tracks. In this thesis, we put forth a two-party system where one party capable of powerful computation can run certain machine learning algorithms from the natural language processing domain on the second party's data, where the first party is limited to learning only specific functions of the second party's data and nothing else. Our system provides simple cryptographic schemes for locating keywords, matching approximate regular expressions, and computing frequency analysis on encrypted data. We present a full implementation of this system in the form of an extensible software library and a command line interface. Finally, we discuss a medical case study where we used our system to run a suite of unmodified machine learning algorithms on encrypted free-text patient notes.
by Alex J. Grinman.
M. Eng.
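The Grinman abstract above mentions simple cryptographic schemes for locating keywords in encrypted data. As a rough illustration of the general idea only (not the thesis's actual protocol; all names here are hypothetical), a deterministic-token scheme can be sketched with HMACs: the server stores one token per word and can answer keyword-membership queries without ever seeing plaintext.

```python
import hmac
import hashlib

def index_tokens(words, key):
    # One deterministic HMAC token per word: the server stores tokens,
    # never plaintext, and can only test membership for queried words.
    return {hmac.new(key, w.encode(), hashlib.sha256).hexdigest() for w in words}

def contains_keyword(token_index, word, key):
    # The querier recomputes the token for the keyword; the server learns
    # only whether this one keyword occurs, nothing else about the text.
    return hmac.new(key, word.encode(), hashlib.sha256).hexdigest() in token_index
```

Note that deterministic tokens leak repetition patterns across documents; schemes like the one in the thesis are designed to bound exactly this kind of leakage.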
Westlund, Kenneth P. (Kenneth Peter). "Recording and processing data from transient events." Thesis, Massachusetts Institute of Technology, 1988. https://hdl.handle.net/1721.1/129961.
Includes bibliographical references.
by Kenneth P. Westlund Jr.
Thesis (B.S.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 1988.
Setiowijoso, Liono. "Data Allocation for Distributed Programs." PDXScholar, 1995. https://pdxscholar.library.pdx.edu/open_access_etds/5102.
Jakovljevic, Sasa. "Data collecting and processing for substation integration enhancement." Texas A&M University, 2003. http://hdl.handle.net/1969/93.
Aygar, Alper. "Doppler Radar Data Processing And Classification." Master's thesis, METU, 2008. http://etd.lib.metu.edu.tr/upload/12609890/index.pdf.
Lu, Feng. "Big data scalability for high throughput processing and analysis of vehicle engineering data." Thesis, KTH, Skolan för informations- och kommunikationsteknik (ICT), 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-207084.
Chung, Kit-lun, and 鐘傑麟. "Intelligent agent for Internet Chinese financial news retrieval." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2000. http://hub.hku.hk/bib/B30106503.
陳詠儀 and Wing-yi Chan. "The smart card technology in the financial services." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 1998. http://hub.hku.hk/bib/B31268596.
Trigueiros, Duarte. "Neural network based methods in the extraction of knowledge from accounting and financial data." Thesis, University of East Anglia, 1991. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.292217.
Chen, Jiawen (Jiawen Kevin). "Efficient data structures for piecewise-smooth video processing." Thesis, Massachusetts Institute of Technology, 2011. http://hdl.handle.net/1721.1/66003.
Cataloged from PDF version of thesis.
Includes bibliographical references (p. 95-102).
A number of useful image and video processing techniques, ranging from low level operations such as denoising and detail enhancement to higher level methods such as object manipulation and special effects, rely on piecewise-smooth functions computed from the input data. In this thesis, we present two computationally efficient data structures for representing piecewise-smooth visual information and demonstrate how they can dramatically simplify and accelerate a variety of video processing algorithms. We start by introducing the bilateral grid, an image representation that explicitly accounts for intensity edges. By interpreting brightness values as Euclidean coordinates, the bilateral grid enables simple expressions for edge-aware filters. Smooth functions defined on the bilateral grid are piecewise-smooth in image space. Within this framework, we derive efficient reinterpretations of a number of edge-aware filters commonly used in computational photography as operations on the bilateral grid, including the bilateral filter, edge-aware scattered data interpolation, and local histogram equalization. We also show how these techniques can be easily parallelized onto modern graphics hardware for real-time processing of high definition video. The second data structure we introduce is the video mesh, designed as a flexible central data structure for general-purpose video editing. It represents objects in a video sequence as 2.5D "paper cutouts" and allows interactive editing of moving objects and modeling of depth, which enables 3D effects and post-exposure camera control. In our representation, we assume that motion and depth are piecewise-smooth, and encode them sparsely as a set of points tracked over time. The video mesh is a triangulation over this point set and per-pixel information is obtained by interpolation. To handle occlusions and detailed object boundaries, we rely on the user to rotoscope the scene at a sparse set of frames using spline curves.
We introduce an algorithm to robustly and automatically cut the mesh into local layers with proper occlusion topology, and propagate the splines to the remaining frames. Object boundaries are refined with per-pixel alpha mattes. At its core, the video mesh is a collection of texture-mapped triangles, which we can edit and render interactively using graphics hardware. We demonstrate the effectiveness of our representation with special effects such as 3D viewpoint changes, object insertion, depth-of-field manipulation, and 2D to 3D video conversion.
by Jiawen Chen.
Ph.D.
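The bilateral grid pipeline sketched in the Chen abstract above (splat pixels into a coarse x-y-intensity grid, blur the grid, slice the result back to pixels) can be illustrated compactly. The following is a simplified NumPy sketch, not the thesis implementation: the grid resolutions, the wrap-around box blur standing in for a Gaussian, and all names are choices of this illustration.

```python
import numpy as np

def bilateral_grid_filter(img, sigma_s=8, sigma_r=0.1):
    """Edge-aware smoothing via a bilateral grid: splat pixels into a
    coarse (x, y, intensity) grid, blur the grid, then slice back."""
    h, w = img.shape
    gh, gw = h // sigma_s + 2, w // sigma_s + 2
    gd = int(1.0 / sigma_r) + 2
    grid = np.zeros((gh, gw, gd, 2))           # [..., 0] = sums, [..., 1] = counts
    ys, xs = np.mgrid[0:h, 0:w]
    gi = (ys // sigma_s).ravel()
    gj = (xs // sigma_s).ravel()
    gk = np.clip((img / sigma_r).astype(int).ravel(), 0, gd - 1)
    np.add.at(grid, (gi, gj, gk, 0), img.ravel())   # splat values
    np.add.at(grid, (gi, gj, gk, 1), 1.0)           # splat weights
    for axis in range(3):                            # crude box blur per axis
        grid = (np.roll(grid, 1, axis) + grid + np.roll(grid, -1, axis)) / 3.0
    num = grid[gi, gj, gk, 0]                        # slice at each pixel's cell
    den = grid[gi, gj, gk, 1]
    out = np.where(den > 1e-9, num / np.maximum(den, 1e-9), img.ravel())
    return out.reshape(h, w)
```

Because pixels on opposite sides of an intensity edge land in different grid cells along the intensity axis, blurring the grid smooths within regions but not across edges, which is the key property the thesis exploits.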
Jakubiuk, Wiktor. "High performance data processing pipeline for connectome segmentation." Thesis, Massachusetts Institute of Technology, 2015. http://hdl.handle.net/1721.1/106122.
"December 2015." Cataloged from PDF version of thesis.
Includes bibliographical references (pages 83-88).
By investigating neural connections, neuroscientists try to understand the brain and reconstruct its connectome. Automated connectome reconstruction from high-resolution electron microscopy is a challenging problem, as all neurons and synapses in a volume have to be detected. A cubic millimeter of high-resolution brain tissue takes roughly a petabyte of space, which state-of-the-art pipelines are unable to process to date. A high-performance, fully automated image processing pipeline is proposed. Using a combination of image processing and machine learning algorithms (convolutional neural networks and random forests), the pipeline constructs a 3-dimensional connectome from 2-dimensional cross-sections of a mammal's brain. The proposed system achieves a low error rate (comparable with the state of the art) and is capable of processing volumes hundreds of gigabytes in size. The main contributions of this thesis are multiple algorithmic techniques for 2-dimensional pixel classification at varying accuracy and speed trade-offs, as well as a fast object segmentation algorithm. The majority of the system is parallelized for multi-core machines, and with minor additional modification is expected to work in a distributed setting.
by Wiktor Jakubiuk.
M. Eng. in Computer Science and Engineering
Nguyen, Qui T. "Robust data partitioning for ad-hoc query processing." Thesis, Massachusetts Institute of Technology, 2015. http://hdl.handle.net/1721.1/106004.
This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections.
Cataloged from student-submitted PDF version of thesis.
Includes bibliographical references (pages 59-62).
Data partitioning can significantly improve query performance in distributed database systems. Most proposed data partitioning techniques choose the partitioning based on a particular expected query workload or use a simple upfront scheme, such as uniform range partitioning or hash partitioning on a key. However, these techniques do not adequately address the case where the query workload is ad-hoc and unpredictable, as in many analytic applications. The HYPER-PARTITIONING system aims to fill that gap, by using a novel space-partitioning tree on the space of possible attribute values to define partitions incorporating all attributes of a dataset. The system creates a robust upfront partitioning tree, designed to benefit all possible queries, and then adapts it over time in response to the actual workload. This thesis evaluates the robustness of the upfront hyper-partitioning algorithm, describes the implementation of the overall HYPER-PARTITIONING system, and shows how hyper-partitioning improves the performance of both selection and join queries.
by Qui T. Nguyen.
M. Eng.
Bao, Shunxing. "Algorithmic Enhancements to Data Colocation Grid Frameworks for Big Data Medical Image Processing." Thesis, Vanderbilt University, 2019. http://pqdtopen.proquest.com/#viewpdf?dispub=13877282.
Large-scale medical imaging studies to date have predominantly leveraged in-house, laboratory-based or traditional grid computing resources for their computing needs, where the applications often use hierarchical data structures (e.g., Network file system file stores) or databases (e.g., COINS, XNAT) for storage and retrieval. The results for laboratory-based approaches reveal that performance is impeded by standard network switches, since typical processing can saturate network bandwidth during transfer from storage to processing nodes for even moderate-sized studies. On the other hand, the grid may be costly to use due to the dedicated resources used to execute the tasks and lack of elasticity. With increasing availability of cloud-based big data frameworks, such as Apache Hadoop, cloud-based services for executing medical imaging studies have shown promise.
Despite this promise, our studies have revealed that existing big data frameworks exhibit different performance limitations for medical imaging applications, which calls for new algorithms that optimize their performance and suitability for medical imaging. For instance, Apache HBase's data distribution strategy of region split and merge is detrimental to the hierarchical organization of imaging data (e.g., project, subject, session, scan, slice). Big data medical image processing applications involving multi-stage analysis often exhibit significant variability in processing times, ranging from a few seconds to several days. Due to the sequential nature of executing the analysis stages with traditional software technologies and platforms, any errors in the pipeline are only detected at the later stages, despite the sources of error predominantly lying in the highly compute-intensive first stage. This wastes precious computing resources and incurs prohibitively higher costs for re-executing the application. To address these challenges, this research proposes a framework, Hadoop & HBase for Medical Image Processing (HadoopBase-MIP), which develops a range of performance optimization algorithms and employs system behavior modeling for data storage, data access and data processing. We also describe how to build prototypes to support empirical verification of system behavior. Furthermore, we report a discovery made during the development of HadoopBase-MIP: a new type of contrast for deep brain structure enhancement in medical imaging. Finally, we show how to carry the Hadoop-based framework design forward to a commercial big data / high-performance computing cluster with a cheap, scalable and geographically distributed file system.
Hatchell, Brian. "Data base design for integrated computer-aided engineering." Thesis, Georgia Institute of Technology, 1987. http://hdl.handle.net/1853/16744.
Waite, Martin. "Data structures for the reconstruction of engineering drawings." Thesis, Nottingham Trent University, 1989. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.328794.
Einstein, Noah. "SmartHub: Manual Wheelchair Data Extraction and Processing Device." The Ohio State University, 2019. http://rave.ohiolink.edu/etdc/view?acc_num=osu1555352793977171.
Chaudhuri, Shomesh Ernesto. "Financial signal processing : applications to asset-market dynamics and healthcare finance." Thesis, Massachusetts Institute of Technology, 2018. http://hdl.handle.net/1721.1/117839.
This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections.
Cataloged from student-submitted PDF version of thesis.
Includes bibliographical references (pages 139-144).
The seemingly random fluctuations of price and value produced by information flow and complex interactions across a diverse population of stakeholders have motivated the extensive use of stochastic processes to analyze both capital markets and the regulatory approval process in healthcare. This thesis approaches the statistical analysis of such processes through the lens of signal processing, with a particular emphasis on studying how dynamics evolve over time. We begin with a brief introduction to financial signal processing in Part I, before turning to specific applications in the main body of the thesis. In Part II, we apply spectral analysis to understand and quantify the relationship between asset-market dynamics across multiple time horizons, and show how this framework can be used to improve portfolio and risk management. Using the Fourier transform, we decompose asset-return alphas, betas and covariances into distinct frequency components, allowing us to identify the relative importance of specific time horizons in determining each of these quantities. Our approach can be applied to any portfolio, and is particularly useful for comparing the forecast power of multiple investment strategies. Part III addresses the growing interest from the healthcare industry, regulators and patients in including Bayesian adaptive methods in the regulatory approval process of new therapies. By applying sequential likelihood ratio tests to a Bayesian decision analysis framework that assigns asymmetric weights to false approvals and false rejections, we are able to design adaptive clinical trials that maximize the value to current and future patients and, consequently, to public health. We also consider the possibility that as the process unfolds, drug sponsors might stop a trial early if new information suggests market prospects are not as favorable as originally forecasted.
We show that clinical trials that can be modified as data are observed are more valuable than trials without this flexibility.
by Shomesh Ernesto Chaudhuri.
Ph. D.
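The frequency decomposition of betas described in the Chaudhuri abstract above can be illustrated with a small sketch: by Parseval's theorem, the covariance mass of demeaned asset and market returns distributes across DFT frequencies, so the ordinary OLS beta splits into band contributions that sum to the total. The band edges and normalization here are illustrative assumptions of this sketch, not the thesis's exact estimator.

```python
import numpy as np

def beta_decomposition(asset, market, bands):
    """Split OLS beta into frequency-band contributions that sum to the
    total beta. `bands` is a list of (lo, hi) cycle-count intervals."""
    a = asset - asset.mean()
    m = market - market.mean()
    n = len(a)
    fa = np.fft.fft(a)
    fm = np.fft.fft(m)
    cross = (fa * np.conj(fm)).real          # covariance mass per frequency
    total_power = (np.abs(fm) ** 2).sum()    # proportional to market variance
    freqs = np.fft.fftfreq(n) * n            # signed integer cycle counts
    out = {}
    for lo, hi in bands:
        mask = (np.abs(freqs) >= lo) & (np.abs(freqs) < hi)
        out[(lo, hi)] = cross[mask].sum() / total_power
    return out
```

A band dominated by low frequencies captures the long-horizon component of beta, while high-frequency bands capture short-horizon comovement, which is the kind of horizon attribution the thesis uses for comparing investment strategies.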
Guttman, Michael. "Sampled-data IIR filtering via time-mode signal processing." Thesis, McGill University, 2010. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=86770.
This thesis presents the design of sampled-data infinite-impulse-response filters based on time-mode signal processing. Time-mode signal processing (TMSP), defined as the processing of sampled analog information using time differences as variables, has become one of the most popular emerging circuit-design techniques. Since TMSP is still relatively new, much development remains before the technology can serve as a general signal-processing tool. In this research, a set of building blocks capable of realizing most mathematical operations in the time domain is introduced. By arranging these elementary structures, higher-order time-mode systems, more specifically time-mode filters, are realized. Three second-order time-mode filters (low-pass, band-pass and high-pass) are modelled in MATLAB and simulated in Spectre to verify the design methodology. Finally, a damped integrator and a second-order IIR low-pass time-mode filter are implemented with discrete components.
Breest, Martin, Paul Bouché, Martin Grund, Sören Haubrock, Stefan Hüttenrauch, Uwe Kylau, Anna Ploskonos, Tobias Queck, and Torben Schreiter. "Fundamentals of Service-Oriented Engineering." Universität Potsdam, 2006. http://opus.kobv.de/ubp/volltexte/2009/3380/.
Faber, Marc. "On-Board Data Processing and Filtering." International Foundation for Telemetering, 2015. http://hdl.handle.net/10150/596433.
One of the requirements resulting from mounting pressure on flight test schedules is the reduction of time needed for data analysis, in pursuit of shorter test cycles. This requirement has ramifications such as the demand for recording and processing not just raw measurement data but also data converted to engineering units in real time, as well as for an optimized use of the bandwidth available for telemetry downlink, and ultimately for shortening the duration of procedures intended to disseminate pre-selected recorded data among different analysis groups on the ground. A promising way to address these needs is to implement more CPU intelligence and processing power directly on the on-board flight test equipment. This provides the ability to process complex data in real time. For instance, data acquired at different hardware interfaces (which may be compliant with different standards) can be directly converted to easier-to-handle engineering units. This leads to a faster extraction and analysis of the actual data contents of the on-board signals and busses. Another central goal is the efficient use of the available telemetry bandwidth. Real-time data reduction via intelligent filtering is one approach to achieving this challenging objective. The data filtering process should be performed simultaneously on an all-data-capture recording, and the user should be able to easily select the interesting data without building PCM formats on board or carrying out decommutation on the ground. This data selection should be as easy as possible for the user, and the on-board FTI devices should generate a seamless and transparent data transmission, making quick data analysis viable. On-board data processing and filtering has the potential to become the main future path to handling the challenge of FTI data acquisition and analysis in a more comfortable and effective way.
Hinrichs, Angela S. (Angela Soleil). "An architecture for distributing processing on realtime data streams." Thesis, Massachusetts Institute of Technology, 1995. http://hdl.handle.net/1721.1/11418.
Marcus, Adam Ph D. Massachusetts Institute of Technology. "Optimization techniques for human computation-enabled data processing systems." Thesis, Massachusetts Institute of Technology, 2012. http://hdl.handle.net/1721.1/78454.
Cataloged from PDF version of thesis.
Includes bibliographical references (p. 119-124).
Crowdsourced labor markets make it possible to recruit large numbers of people to complete small tasks that are difficult to automate on computers. These marketplaces are increasingly widely used, with projections of over $1 billion being transferred between crowd employers and crowd workers by the end of 2012. While crowdsourcing enables forms of computation that artificial intelligence has not yet achieved, it also presents crowd workflow designers with a series of challenges including describing tasks, pricing tasks, identifying and rewarding worker quality, dealing with incorrect responses, and integrating human computation into traditional programming frameworks. In this dissertation, we explore the systems-building, operator design, and optimization challenges involved in building a crowd-powered workflow management system. We describe a system called Qurk that utilizes techniques from databases such as declarative workflow definition, high-latency workflow execution, and query optimization to aid crowd-powered workflow developers. We study how crowdsourcing can enhance the capabilities of traditional databases by evaluating how to implement basic database operators such as sorts and joins on datasets that could not have been processed using traditional computation frameworks. Finally, we explore the symbiotic relationship between the crowd and query optimization, enlisting crowd workers to perform selectivity estimation, a key component in optimizing complex crowd-powered workflows.
by Adam Marcus.
Ph.D.
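Crowd-powered operators like those in the Marcus abstract above must reconcile noisy, redundant worker responses before feeding results into database operators such as sorts and joins. As a minimal illustration of one common building block (majority-vote aggregation with an agreement threshold; the threshold value and names are assumptions of this sketch, not Qurk's actual policy):

```python
from collections import Counter

def resolve(responses, min_agreement=0.6):
    """Majority-vote aggregation over redundant worker answers for one task;
    returns None when agreement is too low (i.e., recruit more workers)."""
    counts = Counter(responses)
    answer, votes = counts.most_common(1)[0]
    if votes / len(responses) >= min_agreement:
        return answer
    return None
```

A workflow system can wrap such an aggregator around each human-computed tuple, trading off cost (more redundant assignments) against answer quality, one of the optimization dimensions the dissertation studies.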
Stein, Oliver. "Intelligent Resource Management for Large-scale Data Stream Processing." Thesis, Uppsala universitet, Institutionen för informationsteknologi, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-391927.
DeMaio, William (William Aloysius). "Data processing and inference methods for zero knowledge nuclear disarmament." Thesis, Massachusetts Institute of Technology, 2016. http://hdl.handle.net/1721.1/106698.
This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections.
Cataloged from student-submitted PDF version of thesis.
Includes bibliographical references (pages 63-64).
It is hoped that future nuclear arms control treaties will call for the dismantlement of stored nuclear warheads. To make the authenticated decommissioning of nuclear weapons agreeable, methods must be developed to validate the structure and composition of nuclear warheads without making it possible to learn those attributes. Nuclear resonance fluorescence (NRF) imaging potentially enables the physically encrypted verification of nuclear weapons in a manner that would meet treaty requirements. This thesis examines the physics behind NRF, develops tools for processing resonance data, establishes methodologies for simulating information gain during warhead verification, and tests potential inference processes. The influence of several inference parameters is characterized, and success is shown in predicting the properties of an encrypting foil and the thickness of a warhead in a one-dimensional verification scenario.
by William DeMaio.
S.B.
Gardener, Michael Edwin. "A multichannel, general-purpose data logger." Thesis, Cape Technikon, 1986. http://hdl.handle.net/20.500.11838/2179.
This thesis describes the implementation of a general-purpose, microprocessor-based data logger. The hardware allows analog data acquisition from one to thirty-two channels with 12-bit resolution and a data throughput of up to 2 kHz. The data is logged directly to a buffer memory and from there, at the end of each log, it is dumped to an integral cassette data recorder. The recorded data can be transferred from the logger to a desktop computer, via the IEEE 488 port, for further processing and display. All log parameters are user-selectable by means of menu-prompted keyboard entry, and a real-time clock (RTC) provides date and time information automatically.
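The buffer-then-dump pattern in the Gardener abstract above (accumulate samples in a buffer memory during a log, then dump the whole log to the recorder) can be sketched as a tiny class. The capacity handling and all names are assumptions of this sketch, not the original logger firmware.

```python
class LogBuffer:
    """Accumulate (channel, value) samples in memory during one log,
    then hand the whole log over in a single dump."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.samples = []

    def record(self, channel, value):
        # Returns False when the buffer is full, signalling end of log.
        if len(self.samples) < self.capacity:
            self.samples.append((channel, value))
            return True
        return False

    def dump(self):
        # Empties the buffer, e.g. to a recorder or host computer.
        out, self.samples = self.samples, []
        return out
```

Decoupling acquisition from the slow recorder in this way is what lets such a logger sustain its full sampling rate during the log itself.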
Baker, Alison M. "Restructuring Option Chain Data Sets Using Matlab." Digital WPI, 2010. https://digitalcommons.wpi.edu/etd-theses/473.
Adonis, Ridoh. "An empirical investigation into the information management systems at a South African financial institution." Thesis, Cape Peninsula University of Technology, 2016. http://hdl.handle.net/20.500.11838/2474.
The study was triggered by the increase in information breaches in organisations. Organisations may have policies, procedures, strategies and systems in place to mitigate the risk of information breaches; however, data breaches are still on the rise. Governments across the world have put, or are putting, in place laws around data protection with which organisations have to align their processes, strategies and systems. The continuous and rapid emergence of new technology is making it even easier for information breaches to occur. In particular, the focus of this study is the information management systems at a selected financial institution in South Africa. Based on its objectives, this study: explored the shortfalls of information security at a South African financial institution; investigated whether data remains separate while privacy is ensured; investigated the responsiveness of business processes with respect to information management; investigated the capability of systems for information management; investigated the strategies formulated for information management; and finally, investigated projects and programmes aimed at addressing information management. Quantitative as well as qualitative analysis was employed: questionnaires were sent to employees in junior management positions, and semi-structured in-depth interviews were conducted in which the researcher interviewed senior management at the organisation. These senior managers, from different value chains, are responsible for implementing information management policies and strategy.
Nedstrand, Paul, and Razmus Lindgren. "Test Data Post-Processing and Analysis of Link Adaptation." Thesis, Linköpings universitet, Institutionen för datavetenskap, 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-121589.
Full textNarayanan, Shruthi (Shruthi P. ). "Real-time processing and visualization of intensive care unit data." Thesis, Massachusetts Institute of Technology, 2017. http://hdl.handle.net/1721.1/119537.
Full textThis electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections.
Cataloged from student-submitted PDF version of thesis.
Includes bibliographical references (page 83).
Intensive care unit (ICU) patients undergo detailed monitoring so that copious information regarding their condition is available to support clinical decision-making. Full utilization of the data depends heavily on its quantity, quality and manner of presentation to the physician at the bedside of a patient. In this thesis, we implemented a visualization system to aid ICU clinicians in collecting, processing, and displaying available ICU data. Our goals for the system are: to be able to receive large quantities of patient data from various sources, to compute complex functions over the data that are able to quantify an ICU patient's condition, to plot the data using a clean and interactive interface, and to be capable of live plot updates upon receiving new data. We made significant headway toward our goals, and we succeeded in creating a highly adaptable visualization system that future developers and users will be able to customize.
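The "compute functions over streaming data, then update live plots" pipeline described in this abstract can be sketched as below; the class and method names are illustrative assumptions, not the thesis implementation:

```python
from collections import deque
from statistics import mean

class LiveMetric:
    """Sketch of a streaming derived metric: each new vital-sign
    sample updates a windowed computation and notifies registered
    plot callbacks, giving live plot updates on new data."""

    def __init__(self, window=5):
        self.samples = deque(maxlen=window)   # sliding window of samples
        self.listeners = []                   # e.g. plot-refresh functions

    def subscribe(self, fn):
        self.listeners.append(fn)

    def push(self, value):
        # Ingest one sample, recompute the metric, notify listeners.
        self.samples.append(value)
        smoothed = mean(self.samples)
        for fn in self.listeners:
            fn(smoothed)
        return smoothed
```

The callback pattern keeps the data-ingestion side independent of the display side, which is one common way to meet the "various sources, live updates" goals the abstract lists.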
by Shruthi Narayanan.
M. Eng.
Shih, Daphne Yong-Hsu. "A data path for a pixel-parallel image processing system." Thesis, Massachusetts Institute of Technology, 1995. http://hdl.handle.net/1721.1/40570.
Full textIncludes bibliographical references (p. 65).
by Daphne Yong-Hsu Shih.
M.Eng.
Saltin, Joakim. "Interactive visualization of financial data : Development of a visual data mining tool." Thesis, Uppsala universitet, Institutionen för informationsteknologi, 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-181225.
Full textAkleman, Ergun. "Pseudo-affine functions : a non-polynomial implicit function family to describe curves and sufaces." Diss., Georgia Institute of Technology, 1992. http://hdl.handle.net/1853/15409.
Full textKardos, Péter. "Performance optimization ofthe online data processing softwareof a high-energy physics experiment : Performance optimization ofthe online data processing softwareof a high-energy physics experiment." Thesis, Uppsala universitet, Institutionen för informationsteknologi, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-404475.
Full textJungner, Andreas. "Ground-Based Synthetic Aperture Radar Data processing for Deformation Measurement." Thesis, KTH, Geodesi och satellitpositionering, 2009. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-199677.
Full textThis thesis builds on experience of working with a ground-based synthetic aperture radar (GB-SAR) at the Institute of Geomatics in Castelldefels (Barcelona, Spain). The SAR technique permits radar interferometry, a widely used technique on both satellite and airborne platforms. This work describes the instrument's technical characteristics and the processing of its data to measure deformation. A large part of the work was devoted to developing GB-SAR data applications such as coherence and interferogram computation, automation of image matching with scripts, geocoding of GB-SAR data, and adaptation of existing SAR software to GB-SAR data. Finally, field measurements were carried out to collect the data needed for the GB-SAR application development and to gain experience of the instrument's characteristics and limitations. The main result of the field measurements is that the high coherence needed for interferometric measurements can be achieved with relatively long time intervals between measurement epochs. Several factors affecting the result are discussed, such as the reflectivity of the observed area, the radar image matching, and the illumination geometry.
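The coherence and interferogram computations mentioned in this abstract follow standard SAR interferometry definitions; a minimal sketch, assuming two already co-registered complex (single-look) images:

```python
import numpy as np

def interferogram(s1, s2):
    # Pixel-wise complex interferogram; its phase encodes the
    # path-length change (deformation) between acquisitions.
    return s1 * np.conj(s2)

def coherence(s1, s2):
    # Global sample coherence of two co-registered complex images:
    # 1.0 means perfectly correlated scatterers, 0 means decorrelated.
    num = np.abs(np.sum(s1 * np.conj(s2)))
    den = np.sqrt(np.sum(np.abs(s1) ** 2) * np.sum(np.abs(s2) ** 2))
    return num / den

rng = np.random.default_rng(0)
master = rng.normal(size=(64, 64)) + 1j * rng.normal(size=(64, 64))
# A stable scene with a uniform phase shift keeps coherence at 1.0.
slave = master * np.exp(1j * 0.3)
gamma = coherence(master, slave)            # -> 1.0 for this scene
phase = np.angle(interferogram(slave, master))  # -> 0.3 rad everywhere
```

In practice coherence is estimated over a small sliding window rather than the whole image, so that low-coherence (vegetated, moving) areas can be masked out before phase unwrapping.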
van, Schaik Sebastiaan Johannes. "A framework for processing correlated probabilistic data." Thesis, University of Oxford, 2014. http://ora.ox.ac.uk/objects/uuid:91aa418d-536e-472d-9089-39bef5f62e62.
Full textKorziuk, Kamil, and Tomasz Podbielski. "Engineering Requirements for platform, integrating health data." Thesis, Blekinge Tekniska Högskola, Institutionen för tillämpad signalbehandling, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-16089.
Full textR, S. Umesh. "Algorithms for processing polarization-rich optical imaging data." Thesis, Indian Institute of Science, 2004. http://hdl.handle.net/2005/96.
Full textMcCaney, Patrick Michael 1980. "Emotional response modeling in financial markets : Boston Stock Exchange data analysis." Thesis, Massachusetts Institute of Technology, 2004. http://hdl.handle.net/1721.1/28481.
Full textIncludes bibliographical references (leaves 57-58).
In this thesis, physiological data is analyzed in the context of financial risk processing, specifically investigating the effects of financial trading decisions and situations on the physiological responses of professional market makers. The data for this analysis comes from an experiment performed on market makers at the Boston Stock Exchange. The analysis involved significant preprocessing of large financial and physiological data sets. Short-term and long-term analyses of financial and performance-based event markers are performed and the results interpreted. There are two main conclusions. First, negative performance events are found to be the main driver of physiological responses; positive performance events produce minimal deviations from baseline physiological signals. Second, a long-term analysis of events yields more substantial physiological changes than a short-term analysis.
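Event-marker analysis of this kind typically averages the physiological signal in windows around each event, relative to a pre-event baseline; a minimal sketch (the function and window sizes are illustrative assumptions, not the thesis code):

```python
def event_locked_average(signal, events, pre=2, post=3):
    """Average a signal in [t - pre, t + post) windows around event
    markers, baseline-corrected by the mean of the pre-event samples.
    Events too close to either edge of the recording are dropped."""
    windows = []
    for t in events:
        if t - pre < 0 or t + post > len(signal):
            continue
        baseline = sum(signal[t - pre:t]) / pre
        windows.append([x - baseline for x in signal[t - pre:t + post]])
    n = len(windows)
    # Column-wise mean across all event windows.
    return [sum(col) / n for col in zip(*windows)]
```

Averaging across many events suppresses background physiological variation, so that only the response consistently tied to the event markers survives.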
by Patrick Michael McCaney.
M.Eng.
Xie, Tian, and 謝天. "Development of a XML-based distributed service architecture for product development in enterprise clusters." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2005. http://hub.hku.hk/bib/B30477165.
Full text