Dissertations / Theses on the topic 'Computations management'
Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles
Consult the top 50 dissertations / theses for your research on the topic 'Computations management.'
Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online, whenever these are available in the metadata.
Browse dissertations / theses on a wide variety of disciplines and organise your bibliography correctly.
Haraburda, David. "Arithmetic Computations and Memory Management Using a Binary Tree Encoding of Natural Numbers." Thesis, University of North Texas, 2011. https://digital.library.unt.edu/ark:/67531/metadc103323/.
Full text
Bourgey, Florian. "Stochastic approximations for financial risk computations." Thesis, Institut polytechnique de Paris, 2020. http://www.theses.fr/2020IPPAX052.
Full text
In this thesis, we investigate several stochastic approximation methods for both the computation of financial risk measures and the pricing of derivatives. As closed-form expressions are scarcely available for such quantities, and because they have to be evaluated daily, the need for fast, efficient, and reliable analytic approximation formulas is of primal importance to financial institutions. We aim at giving a broad overview of such approximation methods and focus on three distinct approaches. In the first part, we study some multilevel Monte Carlo approximation methods and apply them to two practical problems: the estimation of quantities involving nested expectations (such as the initial margin) and the discretization of integrals arising in rough forward variance models for the pricing of VIX derivatives. For both cases, we analyze the properties of the corresponding asymptotically optimal multilevel estimators and numerically demonstrate the superiority of multilevel methods compared to standard Monte Carlo. In the second part, motivated by the numerous examples arising in credit risk modeling, we propose a general framework for meta-modeling large sums of weighted Bernoulli random variables which are conditionally independent given a common factor X. Our generic approach is based on a polynomial chaos expansion of the common factor together with a Gaussian approximation. L2 error estimates are given when the factor X is associated with classical orthogonal polynomials. Finally, in the last part of this dissertation, we deal with small-time asymptotics and provide asymptotic expansions for both American implied volatility and American option prices in local volatility models. We also investigate weak approximations for the VIX index in rough forward variance models, expressed in terms of lognormal proxies, and derive expansion results for VIX derivatives with explicit coefficients.
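The multilevel Monte Carlo technique named in this abstract combines many cheap coarse-discretization samples with a few expensive fine ones via the telescoping sum E[P_L] = E[P_0] + Σ_l E[P_l − P_{l−1}]. A minimal sketch, assuming a European call under geometric Brownian motion with Euler stepping; all parameters are illustrative and not taken from the thesis:

```python
import math
import random

def level_difference(n_paths, level, m=2, T=1.0, s0=100.0, r=0.05,
                     sigma=0.2, K=100.0, seed=1):
    """Monte Carlo estimate of E[P_l - P_(l-1)] for a discounted European
    call under GBM, with the coarse (m^(l-1) steps) and fine (m^l steps)
    Euler schemes driven by the same Brownian increments."""
    rng = random.Random(seed + level)
    nf = m ** level                      # fine steps on this level
    dt_f = T / nf
    disc = math.exp(-r * T)
    total = 0.0
    for _ in range(n_paths):
        dw = [rng.gauss(0.0, math.sqrt(dt_f)) for _ in range(nf)]
        sf = s0
        for w in dw:                     # fine Euler path
            sf *= 1.0 + r * dt_f + sigma * w
        pf = disc * max(sf - K, 0.0)
        if level == 0:
            total += pf
            continue
        nc = nf // m                     # coarse path reuses summed increments
        dt_c = T / nc
        sc = s0
        for i in range(nc):
            sc *= 1.0 + r * dt_c + sigma * sum(dw[i * m:(i + 1) * m])
        total += pf - disc * max(sc - K, 0.0)
    return total / n_paths

def mlmc_price(max_level=3, n_paths=8000):
    """Telescoping estimator: E[P_0] plus the level-difference corrections."""
    return sum(level_difference(n_paths, l) for l in range(max_level + 1))
```

With these illustrative parameters the estimate lands near the Black-Scholes value of about 10.45; the point of the decomposition is that the high-variance level-0 term uses cheap one-step paths, while the expensive fine levels only estimate small corrections.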
Lee, Yau-tat Thomas, and 李猷達. "Formalisms on semi-structured and unstructured data schema computations." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2009. http://hub.hku.hk/bib/B43703914.
Full text
Lee, Yau-tat Thomas. "Formalisms on semi-structured and unstructured data schema computations." Click to view the E-thesis via HKUTO, 2009. http://sunzi.lib.hku.hk/hkuto/record/B43703914.
Full text
Ladjel, Riad. "Secure distributed computations for the personal cloud." Electronic Thesis or Diss., université Paris-Saclay, 2020. http://www.theses.fr/2020UPASG043.
Full text
Thanks to smart disclosure initiatives and new regulations like GDPR, individuals can regain control of their data and store it locally, in a decentralized way. In parallel, personal data management system (PDMS) solutions, also called personal clouds, are flourishing. Their goal is to empower users to leverage their personal data for their own good. This decentralized way of managing personal data provides de facto protection against massive attacks on central servers and opens new opportunities by allowing users to cross their data gathered from different sources. On the other hand, it prevents the crossing of data from multiple users to perform distributed computations. The goal of this thesis is to design a generic and scalable secure decentralized computing framework that allows the crossing of personal data from multiple users while answering the two questions raised by this approach: how can individuals' trust in their PDMS be preserved when performing global computations crossing data from multiple individuals, and how can the integrity of the final result be guaranteed when it has been computed by a myriad of collaborative but independent PDMSs?
Botadra, Harnish. "iC2mpi a platform for parallel execution of graph-structured iterative computations /." unrestricted, 2006. http://etd.gsu.edu/theses/available/etd-07252006-165725/.
Full text
Title from title screen. Sushil Prasad, committee chair. Electronic text (106 p. : charts) : digital, PDF file. Description based on contents viewed June 11, 2007. Includes bibliographical references (p. 61-53).
Azzopardi, Marc Anthony. "Computational air traffic management." Thesis, Cranfield University, 2015. http://dspace.lib.cranfield.ac.uk/handle/1826/9200.
Full text
Brogliato, Marcelo Salhab. "Essays in computational management science." reponame:Repositório Institucional do FGV, 2018. http://hdl.handle.net/10438/24615.
Full text
This thesis presents three specific, self-contained, scientific papers in the Computational Management Science area. Modern management and high technology interact in multiple, profound, ways. Professor Andrew Ng tells students at Stanford’s Graduate School of Business that “AI is the new electricity”, as his hyperbolic way to emphasize the potential transformational power of the technology. The first paper is inspired by the possibility that there will be some form of purely digital money and studies distributed ledgers, proposing and analyzing Hathor, an alternative architecture towards a scalable cryptocurrency. The second paper may be a crucial item in understanding human decision making, perhaps, bringing us a formal model of recognition-primed decision. Lying at the intersection of cognitive psychology, computer science, neuroscience, and artificial intelligence, it presents an open-source, cross-platform, and highly parallel framework of the Sparse Distributed Memory and analyzes the dynamics of the memory with some applications. Last but not least, the third paper lies at the intersection of marketing, diffusion of technological innovation, and modeling, extending the famous Bass model to account for users who, after adopting the innovation for a while, decide to reject it later on.
Iserte Agut, Sergio. "High-throughput Computation through Efficient Resource Management." Doctoral thesis, Universitat Jaume I, 2018. http://hdl.handle.net/10803/664128.
Full text
This proposal addresses, from two distinct approaches, the improvement of the throughput of data processing centers through efficient resource management. On the one hand, it combines remote GPU virtualization technologies with resource managers in HPC clusters and cloud computing environments. On the other hand, it reconfigures jobs by modifying their number of processes during execution. The performance evaluation reveals improvements not only in throughput but also in energy consumption.
Apel, Joachim, and Uwe Klaus. "Aspects of Large Scale Symbolic Computation Management." Universität Leipzig, 1998. https://ul.qucosa.de/id/qucosa%3A34525.
Full text
Johansson, Björn. "Model management for computational system design /." Linköping : Univ, 2003. http://www.bibl.liu.se/liupubl/disp/disp2003/tek857s.pdf.
Full text
Ahrens, James P. "Scientific experiment management with high-performance distributed computation /." Thesis, Connect to this title online; UW restricted, 1996. http://hdl.handle.net/1773/6974.
Full text
Uichanco, Joline Ann Villaranda. "Data-driven revenue management." Thesis, Massachusetts Institute of Technology, 2007. http://hdl.handle.net/1721.1/41728.
Full text
Includes bibliographical references (p. 125-127).
In this thesis, we consider the classical newsvendor model and various important extensions. We do not assume that the demand distribution is known, rather the only information available is a set of independent samples drawn from the demand distribution. In particular, the variants of the model we consider are: the classical profit-maximization newsvendor model, the risk-averse newsvendor model and the price-setting newsvendor model. If the explicit demand distribution is known, then the exact solutions to these models can be found either analytically or numerically via simulation methods. However, in most real-life settings, the demand distribution is not available, and usually there is only historical demand data from past periods. Thus, data-driven approaches are appealing in solving these problems. In this thesis, we evaluate the theoretical and empirical performance of nonparametric and parametric approaches for solving the variants of the newsvendor model assuming partial information on the distribution. For the classical profit-maximization newsvendor model and the risk-averse newsvendor model we describe general non-parametric approaches that do not make any prior assumption on the true demand distribution. We extend and significantly improve previous theoretical bounds on the number of samples required to guarantee with high probability that the data-driven approach provides a near-optimal solution. By near-optimal we mean that the approximate solution performs arbitrarily close to the optimal solution that is computed with respect to the true demand distributions.
For the price-setting newsvendor problem, we analyze a previously proposed simulation-based approach for a linear-additive demand model, and again derive bounds on the number of samples required to ensure that the simulation-based approach provides a near-optimal solution. We also perform computational experiments to analyze the empirical performance of these data-driven approaches.
by Joline Ann Villaranda Uichanco.
S.M.
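The data-driven newsvendor approach this abstract describes is commonly introduced through the sample-average approximation: the classical critical-ratio quantile is taken against the empirical distribution of demand samples rather than a known distribution. A minimal sketch; the function and its parameters are illustrative, not the thesis's algorithm:

```python
import math

def newsvendor_saa(samples, price, cost, salvage=0.0):
    """Sample-based newsvendor: order the empirical demand quantile at the
    critical ratio (price - cost) / (price - salvage), i.e. underage cost
    over underage plus overage cost."""
    if not (salvage <= cost <= price):
        raise ValueError("need salvage <= cost <= price")
    ratio = (price - cost) / (price - salvage)
    ordered = sorted(samples)
    # smallest order index whose empirical CDF reaches the critical ratio
    k = math.ceil(ratio * len(ordered)) - 1
    return ordered[max(k, 0)]
```

With 100 equally likely demand samples 1..100, a price of 3 and a cost of 1 give a critical ratio of 2/3 and an order of 67 units; raising the price pushes the order quantity up the empirical distribution.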
West, Richard. "Adaptive real-time management of communication and computation resources." Diss., Georgia Institute of Technology, 2000. http://hdl.handle.net/1853/9237.
Full text
Kim, Jinwoo. "Memory hierarchy management through off-line computational learning." Diss., Georgia Institute of Technology, 2003. http://hdl.handle.net/1853/8194.
Full text
Mohammadi, Javad. "Distributed Computational Methods for Energy Management in Smart Grids." Research Showcase @ CMU, 2016. http://repository.cmu.edu/dissertations/710.
Full text
Chada, Daniel de Magalhães. "From cognitive science to management science: two computational contributions." reponame:Repositório Institucional do FGV, 2011. http://hdl.handle.net/10438/17053.
Full text
This work is composed of two contributions. One borrows from the work of Charles Kemp and Joshua Tenenbaum, concerning the discovery of structural form: their model is used to study the Business Week Rankings of U.S. Business Schools, and to investigate how other structural forms (structured visualizations) of the same information used to generate the rankings can bring insights into the space of business schools in the U.S., and into rankings in general. The other essay is purely theoretical in nature. It is a study to develop a model of human memory that does not exceed our (human) psychological short-term memory limitations. This study is based on Pentti Kanerva’s Sparse Distributed Memory, in which human memories are registered into a vast (but virtual) memory space, and this registration occurs in massively parallel and distributed fashion, in ideal neurons.
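The Sparse Distributed Memory underlying this work can be conveyed in a toy implementation: random hard locations hold counters, a write touches every location within a Hamming radius of the address, and a read sums and thresholds those counters. The dimensions below are deliberately tiny and illustrative; Kanerva's model uses address spaces of roughly a thousand bits:

```python
import random

def hamming(a, b):
    """Hamming distance between two equal-width bit patterns."""
    return bin(a ^ b).count("1")

class SparseDistributedMemory:
    """Minimal Kanerva-style SDM over integer bit patterns (toy sizes)."""

    def __init__(self, n_bits=16, n_locations=500, radius=5, seed=3):
        rng = random.Random(seed)
        self.n_bits = n_bits
        self.radius = radius
        self.addresses = [rng.getrandbits(n_bits) for _ in range(n_locations)]
        self.counters = [[0] * n_bits for _ in range(n_locations)]

    def _near(self, address):
        """Indices of hard locations within the access radius of an address."""
        return [i for i, a in enumerate(self.addresses)
                if hamming(a, address) <= self.radius]

    def write(self, address, word):
        """Add +1/-1 per bit of the word to every activated location."""
        bits = [1 if (word >> k) & 1 else -1 for k in range(self.n_bits)]
        for i in self._near(address):
            for k in range(self.n_bits):
                self.counters[i][k] += bits[k]

    def read(self, address):
        """Sum counters over activated locations and threshold at zero."""
        sums = [0] * self.n_bits
        for i in self._near(address):
            for k in range(self.n_bits):
                sums[k] += self.counters[i][k]
        return sum(1 << k for k in range(self.n_bits) if sums[k] > 0)
```

Because many locations participate in each access, storage is massively distributed: a word written once is recovered exactly at its own address, and the overlap of activation sets is what gives the model its noise tolerance.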
Kang, Sheng. "Optimization for recipe-based, diet-planning inventory management." Thesis, Massachusetts Institute of Technology, 2010. http://hdl.handle.net/1721.1/61895.
Full text
Cataloged from PDF version of thesis.
Includes bibliographical references (p. 40-41).
This thesis presents a new modeling framework and research methodology for the study of recipe-based, diet-planning inventory management. The thesis begins with an exploration of the classic optimization problem, the diet problem, based upon mixed-integer linear programming. Real diet planning, however, is more sophisticated, since it plans recipes rather than just the possible raw materials for the meals. Hence, the thesis develops the modeling framework under the assumption that, given the recipes and the different purchasing options for the raw materials listed in them, one can examine the nutrition facts and compute the purchasing decisions and the optimal minimum yearly cost of food consumption. The thesis further discusses scenarios for different groups of raw materials in terms of shelf-time differences. The implementation of this inventory model includes a preprocessing part and an optimization part: the former converts the customized selection into quantitative relations with the stored recipes and measurements of nutrition factors; the latter solves the cost optimization problem.
by Sheng Kang.
S.M.
Cole, Murray Irwin. "Algorithmic skeletons : a structured approach to the management of parallel computation." Thesis, University of Edinburgh, 1988. http://hdl.handle.net/1842/11997.
Full text
Tagner, Nikita. "Optimal Energy Management for Parallel Hybrid Electric Vehicles using Dynamic Programming." Thesis, KTH, Optimeringslära och systemteori, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-209776.
Full text
In this thesis, two optimal control problems for the control of hybrid electric vehicles are formulated. In the first, more general, formulation both the speed and the battery state of charge may vary. In the second formulation the speed is specified in advance, so only the battery state of charge may vary freely. The first formulation takes considerably longer to solve with dynamic programming than the second. Of the evaluated driving cycles, the most hilly one yielded fuel savings of 4.45 % when solved with the general formulation instead of the one with pre-specified speed. When the lowest permitted speed was lowered from 75 to 70 km/h, 0.52 % fuel was saved. However, when the lowest permitted speed was lowered from 80 to 70 km/h, the saving increased to 1.92 %. In summary, if dynamic programming is to be implemented in real time on a hybrid electric vehicle, the potential fuel savings are considerably higher with the general formulation when the road is very hilly and a low minimum speed is permitted. Hence, for roads that are less hilly, and where the speed is not allowed to vary much, higher fuel savings can potentially be achieved by choosing the formulation with pre-specified speed and including the ability to shift gears and to switch the engine on or off.
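With a pre-specified speed profile, as in the second formulation above, dynamic programming reduces to backward induction over a discretized battery state of charge. A deliberately tiny sketch with integer SOC levels and only two choices per step, engine (burns fuel) or battery (drains charge for free); all numbers are hypothetical:

```python
def optimal_split(power_demand, soc0=3, soc_max=5, fuel_per_kw=0.08):
    """Backward-induction DP for a toy energy-management problem.
    At each time step the demanded power comes either from the engine
    (costing fuel_per_kw * demand) or from the battery (one SOC unit,
    no fuel), subject to battery limits. Returns minimal total fuel."""
    INF = float("inf")
    horizon = len(power_demand)
    # cost_to_go[s]: minimal fuel from the current step onward with SOC s
    cost_to_go = [0.0] * (soc_max + 1)
    for t in reversed(range(horizon)):
        nxt = [INF] * (soc_max + 1)
        for s in range(soc_max + 1):
            engine = fuel_per_kw * power_demand[t] + cost_to_go[s]
            battery = cost_to_go[s - 1] if s > 0 else INF
            nxt[s] = min(engine, battery)
        cost_to_go = nxt
    return cost_to_go[soc0]
```

With demands [10, 20, 30, 40] and two units of charge, the optimal policy spends the battery on the two largest demands and pays fuel only for the rest; real formulations add recharging, gear shifts, and fine SOC grids but keep this backward-recursion structure.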
Aleksic, Mario. "Incremental computation methods in valid and transaction time databases." Thesis, Georgia Institute of Technology, 1996. http://hdl.handle.net/1853/8126.
Full text
Abdallah, Mohamed E. S. M. "A Novel Computational Approach for the Management of Bioreactor Landfills." Thèse, Université d'Ottawa / University of Ottawa, 2011. http://hdl.handle.net/10393/20314.
Full text
Solak, Serdar. "Computational complexity management of H.264/AVC video coding standard." Thesis, McGill University, 2010. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=95058.
Full text
The H.264/AVC video coding standard achieves far greater compression efficiency than previous standards thanks to advanced, highly flexible coding techniques. The price of this improved performance, however, is an increase in the required computational complexity, which is a major obstacle for devices with limited power and computing capacity. This thesis presents new techniques for reducing and controlling the computational complexity required by an H.264/AVC encoder. A new prediction method is developed to estimate the Lagrangian rate-distortion cost of a macroblock. This method is used with two new complexity-reduction algorithms for an H.264/AVC encoder. The first algorithm uses the predicted rate-distortion costs to identify SKIP-coded macroblocks before trying the INTRA or INTER modes. Simulations show that this algorithm yields a significant reduction in computational complexity with a negligible decrease in rate-distortion performance. The second algorithm uses the rate-distortion cost prediction method to reduce encoder complexity by identifying INTRA- and INTER-coded macroblocks earlier in the encoding process. The results indicate that even greater complexity reductions can be obtained at the price of increased degradation of rate-distortion performance. A scalable controller is proposed to control complexity at the macroblock level with a single parameter. The controller uses a grouping technique that manages the allocation of computational resources to macroblocks and integrates the Lagrangian rate-distortion cost prediction method. The results show a significant improvement in rate-distortion performance while limiting the complexity.
Cavalcanti, João Marcos Bastos. "A computational logic approach for Web site synthesis and management." Thesis, University of Edinburgh, 2003. http://hdl.handle.net/1842/23294.
Full text
Chen, Ziwei. "Workflow Management Service based on an Event-driven Computational Cloud." Thesis, KTH, Skolan för informations- och kommunikationsteknik (ICT), 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-141696.
Full text
Abramovich, Michael. "Impacts of revenue management on estimates of spilled passenger demand." Thesis, Massachusetts Institute of Technology, 2013. http://hdl.handle.net/1721.1/82413.
Full text
Cataloged from PDF version of thesis.
Includes bibliographical references (p. 138-140).
In the airline industry, spill refers to passenger demand turned away from a flight because demand has exceeded capacity. The accurate estimation of spill and the lost revenue it implies is an important parameter in airline fleet assignment models, where improved estimates lead to more profitable assignments. Previous models for spill estimation did not take into account the effects of passenger choice and airline revenue management. Since revenue management systems protect seats for later-arriving higher fare passengers, revenue management controls will influence the number of spilled passengers and their value because they will restrict availability to lower fare passengers even if seats on the aircraft are available. This thesis examines the effect of various revenue management systems and fare structures on spill, and, in turn, the marginal value of incremental capacity. The Passenger Origin Destination Simulator is used to simulate realistic passenger booking scenarios and to measure the value of spilled demand. A major finding of the research is that in less restricted fare structures and with traditional revenue management systems, increasing capacity on a flight leads to buy-down which can result in negative marginal revenues and therefore revenue losses. This behavior is contrary to conventional wisdom and is not considered in existing spill models. On the other hand, marginal revenues at low capacities are greater than would be predicted by first-choice-only spill models because some passengers will sell-up to higher fares to avoid spilling out. Additionally, because of passenger recapture between flights, adding capacity to one flight can lead to revenue losses on another. Therefore, the marginal value of incremental capacity is not always positive. 
Negative marginal revenues and associated revenue losses with increasing capacity can at least be partially mitigated by using more advanced revenue management forecasting and optimization algorithms which take into account passenger willingness to pay. The thesis also develops a heuristic analytical method for estimating spill costs which takes into account the effects of passenger sell-up, where previous models tend to underestimate the spill cost by only modeling passengers' first choices. The heuristic demonstrates improved estimates of passenger spill: in particular, in restricted fare structures and for moderate amounts of spill, the model exhibits approximate relative errors on the order of 5%, a factor of two improvement over previous models.
by Michael Abramovich.
S.M.
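The first-choice spill model that this thesis improves upon treats spill as expected demand in excess of capacity, E[max(D − C, 0)], ignoring buy-down, sell-up, and recapture. A Monte Carlo sketch under normally distributed demand; the parameters are hypothetical:

```python
import random

def expected_spill(mean, sigma, capacity, n=100000, seed=7):
    """First-choice spill estimate: average turned-away demand
    E[max(D - C, 0)] under normal demand, by Monte Carlo."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        d = rng.gauss(mean, sigma)
        total += max(d - capacity, 0.0)
    return total / n
```

For mean 100, standard deviation 20, and capacity 100 the expectation is about 7.98 seats (σφ(0) in closed form). The thesis's point is precisely that revenue management controls make the true marginal value of capacity deviate from, and sometimes invert, what this simple model predicts.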
Cushing, Judith Bayard. "Computational proxies : an object-based infrastructure for computational science /." Full text open access at:, 1995. http://content.ohsu.edu/u?/etd,195.
Full text
Zhao, Shouqi. "Dependent risk modelling and ruin probability : numerical computation and applications." Thesis, City University London, 2014. http://openaccess.city.ac.uk/13702/.
Full text
Bulakh, Bohdan Viktorovych. "Task flow infrastructure based on the composition of grid services for automated circuit design" [Інфраструктура потоків задач на основі композиції грід-сервісів для автоматизованого схемотехнічного проектування]. Doctoral thesis, Kyiv, 2013. https://ela.kpi.ua/handle/123456789/6242.
Full text
Schneck, Phyllis Adele. "Dynamic management of computation and communication resources to enable secure high-performance applications." Diss., Georgia Institute of Technology, 1999. http://hdl.handle.net/1853/8264.
Full text
Phu, Thi Vu. "A comparison of discrete and flow-based models for air traffic flow management." Thesis, Massachusetts Institute of Technology, 2008. http://hdl.handle.net/1721.1/45287.
Full text
Includes bibliographical references (leaves 73-74).
The steady increase of congestion in air traffic networks has resulted in significant economic losses and potential safety issues in the air transportation. A potential way to reduce congestion is to adopt efficient air traffic management policies, such as, optimally scheduling and routing air traffic throughout the network. In recent years, several models have been proposed to predict and manage air traffic. This thesis focuses on the comparison of two such approaches to air traffic flow management: (i) a discrete Mixed Integer Program model, and (ii) a continuous flow-based model. The continuous model is applied in a multi-commodity setting to take into account the origins and destinations of the aircraft. Sequential quadratic programming is used to optimize the continuous model. A comparison of the performance of the two models based on a set of large scale test cases is provided. Preliminary results suggest that the linear programming relaxation of the discrete model provides results similar to the continuous flow-based model for high volumes of air traffic.
by Thi Vu Phu.
S.M.
Gog, Ionel Corneliu. "Flexible and efficient computation in large data centres." Thesis, University of Cambridge, 2018. https://www.repository.cam.ac.uk/handle/1810/271804.
Full text
Fonseca, Fabiana Lanzillotta da. "Stormwater Management with Watercourse Valorization: Computational Simulation of the Tintas River Basin." Pontifícia Universidade Católica do Rio de Janeiro, 2018. http://www.maxwell.vrac.puc-rio.br/Busca_etds.php?strSecao=resultado&nrSeq=34965@1.
Full text
Urbanization in Brazil proceeded in a fast and disorderly way, causing numerous social and environmental impacts. Urban sprawl has modified the watersheds, with negative impacts on the cities, such as catastrophes associated with storm events. In order to mitigate the social, environmental, and financial damage caused by floods, while promoting watercourses in the urban landscape and increasing the resilience of cities, compensatory measures in stormwater management become imperative. This work presents alternative control techniques for drainage systems, contemplating sustainable actions that value the watercourses, integrate them into the landscape, and increase the resilience of urban centers, supported by effective management and satisfactory monitoring.
Alsouri, Sami [Verfasser], Stefan [Akademischer Betreuer] Katzenbeisser, and Eric [Akademischer Betreuer] Bodden. "Behavior Compliance Control for More Trustworthy Computation Outsourcing / Sami Alsouri. Betreuer: Stefan Katzenbeisser ; Eric Bodden." Darmstadt : Universitäts- und Landesbibliothek Darmstadt, 2013. http://d-nb.info/1110064357/34.
Full text
Alsouri, Sami [Verfasser], Stefan [Akademischer Betreuer] Katzenbeisser, and Eric [Akademischer Betreuer] Bodden. "Behavior Compliance Control for More Trustworthy Computation Outsourcing / Sami Alsouri. Betreuer: Stefan Katzenbeisser ; Eric Bodden." Darmstadt : Universitäts- und Landesbibliothek Darmstadt, 2013. http://nbn-resolving.de/urn:nbn:de:tuda-tuprints-35789.
Full text
Brown, Mikel J. "Using natural language for database queries /." Online version of thesis, 1985. http://hdl.handle.net/1850/9044.
Full text
Guergachi, Abdelaziz. "Uncertainty management in the activated sludge process, innovative applications of computational learning theory." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 2000. http://www.collectionscanada.ca/obj/s4/f2/dsk1/tape4/PQDD_0016/NQ58278.pdf.
Full text
Krause, Thilo. "Evaluating congestion management schemes in liberalized electricity markets applying agent-based computational economics /." Zürich : ETH, 2007. http://e-collection.ethbib.ethz.ch/show?type=diss&nr=16928&part=abstracts.
Full text
Walsh, Jonathan A. (Jonathan Alan). "Computational methods for efficient nuclear data management in Monte Carlo neutron transport simulations." Thesis, Massachusetts Institute of Technology, 2014. http://hdl.handle.net/1721.1/95570.
Full text
Cataloged from PDF version of thesis.
Includes bibliographical references (pages 127-133).
This thesis presents the development and analysis of computational methods for efficiently accessing and utilizing nuclear data in Monte Carlo neutron transport code simulations. Using the OpenMC code, profiling studies are conducted in order to determine the types of nuclear data that are used in realistic reactor physics simulations, as well as the frequencies with which those data are accessed. The results of the profiling studies are then used to motivate the conceptualization of a nuclear data server algorithm aimed at reducing on-node memory requirements through the use of dedicated server nodes for the storage of infrequently accessed data. A communication model for this algorithm is derived and used to make performance predictions given data access frequencies and assumed system hardware parameters. Additionally, a new, accelerated approach for rejection sampling the free gas resonance elastic scattering kernel that reduces the frequency of zero-temperature elastic scattering cross section data accesses is derived and implemented. Using this new approach, the runtime overhead incurred by an exact treatment of the free gas resonance elastic scattering kernel is reduced by more than 30% relative to a standard sampling procedure used by Monte Carlo codes. Finally, various optimizations of the commonly-used binary energy grid search algorithm are developed and demonstrated. Investigated techniques include placing kinematic constraints on the range of the searchable energy grid, index lookups on unionized material energy grids, and employing energy grid hash tables. The accelerations presented routinely result in overall code speedup by factors of 1.2-1.3 for simulations of practical systems.
by Jonathan A. Walsh.
S.M.
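The energy grid hash table evaluated in this thesis can be sketched as follows: precompute, for equally spaced energy bins, the index window of grid points each bin can contain, so that a lookup binary-searches a short slice instead of the whole grid. The data layout below is an illustrative guess, not OpenMC's actual implementation:

```python
import bisect

def build_hash_table(energy_grid, n_bins=64):
    """For each equally spaced energy bin, precompute the (left, right)
    index window of grid points the bin can overlap."""
    lo, hi = energy_grid[0], energy_grid[-1]
    width = (hi - lo) / n_bins
    table = []
    for b in range(n_bins):
        left = bisect.bisect_right(energy_grid, lo + b * width) - 1
        right = bisect.bisect_right(energy_grid, lo + (b + 1) * width)
        table.append((max(left, 0), min(right, len(energy_grid) - 1)))
    return table, lo, width

def grid_index(energy_grid, table, lo, width, e):
    """Index i with grid[i] <= e < grid[i+1]: hash to a bin, then
    binary-search only within that bin's precomputed window."""
    b = min(int((e - lo) / width), len(table) - 1)
    left, right = table[b]
    return bisect.bisect_right(energy_grid, e, left, right + 1) - 1
```

Union energy grids in continuous-energy Monte Carlo transport can hold on the order of 10^5 points, so bounding each search to a short window trades a small precomputed table for many fewer comparisons per cross-section lookup.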
Leidig, Jonathan Paul. "Epidemiology Experimentation and Simulation Management through Scientific Digital Libraries." Diss., Virginia Tech, 2012. http://hdl.handle.net/10919/28759.
Full text
Ph. D.
Verstak, Alexandre. "Data and Computation Modeling for Scientific Problem Solving Environments." Thesis, Virginia Tech, 2002. http://hdl.handle.net/10919/35299.
Full text
Master of Science
Gip Orreborn, Jakob. "Asset-Liability Management within Life Insurance." Thesis, KTH, Matematisk statistik, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-215339.
Full text
The introduction of new regulatory frameworks and increased competition have made stochastic ALM models increasingly important for life insurance companies. The often complex structure of insurance products complicates the modelling, however, which leads insurers to regard many models as too complicated and inefficient. There is therefore an interest in investigating whether the properties of important financial ratios can be studied with a more efficient and less complicated model. This thesis proposes a framework for stochastic modelling of a simplified version of the balance sheet of a typical life insurance company. The model is based on a stochastic capital-market model, with which both stock prices and interest-rate levels are simulated. Furthermore, the model supports simulation of the most essential product features and models policyholder bonuses as a function of the collective funding ratio. The model's ability to capture the most important properties of the balance-sheet components is examined using scenario and sensitivity analyses. It is also investigated whether the model is sensitive to changes in various inputs, with the focus mainly on the parameters that require more advanced estimation methods.
Svensson, Frida. "Scalable Distributed Reinforcement Learning for Radio Resource Management." Thesis, Linköpings universitet, Tillämpad matematik, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-177822.
Full text
There is large potential for automation and optimization in radio access networks (RAN) by using data-driven solutions to efficiently handle the increased complexity caused by growing traffic and the new technologies introduced with 5G. Reinforcement learning (RL) has natural connections to control problems on different time scales, such as link adaptation, interference management and power control, which are common in radio networks. Raising the status of data-driven solutions in radio networks will be necessary to handle the challenges posed by future 5G networks. In this work, we propose a systematic methodology for applying RL to a control problem. The proposed methodology is first applied to a well-known control problem, and later adapted to a realistic RAN scenario. The work includes extensive simulation results demonstrating the effectiveness and potential of the proposed method. A successful methodology was developed, but the results on the RAN simulator lacked maturity.
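The methodology described above first applies RL to a well-known control problem before moving to the RAN setting. As a minimal stand-in for such a control problem (a hypothetical five-state chain task, not the thesis's simulator or algorithm), tabular Q-learning can be sketched as:

```python
import random

def q_learning(n_states=5, n_episodes=1000, alpha=0.2, gamma=0.9,
               eps=0.2, seed=0):
    """Tabular Q-learning on a toy chain MDP: action 0 moves left (bounded
    at state 0), action 1 moves right; reward 1.0 on reaching the rightmost
    state, which is terminal. Purely illustrative parameters."""
    rng = random.Random(seed)
    goal = n_states - 1
    Q = [[0.0, 0.0] for _ in range(n_states)]
    for _ in range(n_episodes):
        s = 0
        while s != goal:
            if rng.random() < eps:
                a = rng.randrange(2)                   # explore
            else:
                a = 0 if Q[s][0] >= Q[s][1] else 1     # exploit
            s2 = max(0, s - 1) if a == 0 else s + 1
            r = 1.0 if s2 == goal else 0.0
            # terminal states contribute no bootstrapped future value
            target = r if s2 == goal else r + gamma * max(Q[s2])
            Q[s][a] += alpha * (target - Q[s][a])
            s = s2
    return Q
```

In the RAN context, the same loop structure applies, but states, actions and rewards come from the network simulator (e.g., channel quality, power levels, throughput), which is what makes distribution and scalability the central concerns.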
Grundke, Peter. "Integrated market and credit portfolio models: risk measurement and computational aspects." Wiesbaden: Gabler, 2006. http://d-nb.info/987215159/04.
Full textMabotuwana, Thusitha Dananjaya De Silva. "ChronoMedIt : a computational quality audit framework for better management of patients with chronic disease." Thesis, University of Auckland, 2010. http://hdl.handle.net/2292/6034.
Full textÖstberg, Per-Olov. "Virtual infrastructures for computational science: software and architectures for distributed job and resource management." Doctoral thesis, Umeå universitet, Institutionen för datavetenskap, 2011. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-42428.
Full text
In computational science, the amount of available computational power often limits both the size of the problems that can be addressed and the quality of the solutions that can be obtained. Current methodology for scaling computational capacity to large scale (i.e., beyond the capacity of individual resource centers) is based on aggregation and federation of distributed computing resources. Regardless of how this resource aggregation is realized, scaling scientific computations to large scale tends to involve reformulation of problems and computational structures to better exploit problem and resource parallelism. Efficient parallelization and scaling of scientific computations is difficult, and is further complicated by factors that accompany resource aggregation, e.g., heterogeneity in resource environments and dependencies in programming models and computational methods. This trade-off illustrates the complexity of computation management and the need for mechanisms that offer higher levels of abstraction for managing computations in distributed computing environments.
This thesis discusses the design and construction of virtual computational infrastructures that abstract the complexity of computation management, decouple the design of computations from their execution, and facilitate large-scale use of computational resources for scientific computing. In particular, it addresses job and resource management in distributed virtual scientific infrastructures intended for Grid and Cloud computing environments. The main focus of the thesis is Grid computing, which is addressed using service-oriented computing and architecture methodology.
The work discusses methodology and mechanisms for the construction of virtual computational infrastructures, and makes contributions in individual areas such as job management, application integration, job prioritization and service-based software development. Beyond scientific publications, this work also contributes software systems that illustrate the methods discussed. The Grid Job Management Framework (GJMF) abstracts the complexity of job management and offers a set of middleware-agnostic interfaces for execution, control and monitoring of computational jobs in distributed computing environments. FSGrid offers a generic model for specification and delegation of resource allotments in virtual organizations, based on distributed fairshare job prioritization. Mechanisms such as these decouple job and resource management from physical infrastructure systems, and facilitate the construction of scalable virtual infrastructures for computational science.
Balachandran, Libish Kalathil. "Computational workflow management for conceptual design of complex systems : an air-vehicle design perspective." Thesis, Cranfield University, 2007. http://dspace.lib.cranfield.ac.uk/handle/1826/5070.
Full textGhorasi, Rahim. "An intelligent data management system for computational modelling of pollutants transport in river networks." Thesis, Loughborough University, 2007. https://dspace.lboro.ac.uk/2134/13342.
Full textAladesanmi, Ereola Johnson. "Non intrusive load monitoring & identification for energy management system using computational intelligence approach." Master's thesis, University of Cape Town, 2015. http://hdl.handle.net/11427/13561.
Full text
Electrical energy is the lifeline of every nation's or continent's development and economic progress. Owing to the recent growth in the demand for electricity and the shortage in production, it is essential to develop strategies for effective energy management and system delivery. Load monitoring, whether intrusive or non-intrusive, together with identification of domestic electrical appliances, is proposed especially at the residential level, since it is the major energy consumer. Intrusive load monitoring provides accurate results and would allow each individual appliance's energy consumption to be transmitted to a central hub. Nevertheless, this method has many practical disadvantages that have motivated the introduction of non-intrusive load monitoring systems. The financial cost of manufacturing and installing enough monitoring devices to match the number of domestic appliances is one such disadvantage. In addition, installing one meter per household appliance would lead to congestion in the house and thus cause inconvenience to its occupants; non-intrusive load monitoring was therefore developed to alleviate the aforementioned challenges of intrusive load monitoring. Non-intrusive load monitoring (NILM) is the process of disaggregating a household's total energy consumption into its contributing appliances. The total household load is monitored via a single monitoring device such as a smart meter (SM). NILM provides a cost-effective and convenient means of load monitoring and identification. Several non-intrusive load monitoring and identification techniques are reviewed. However, the literature lacks a comprehensive system that can identify appliances with small energy consumption, appliances with overlapping energy consumption, and a group of appliance ranges, all at once. This has been the major setback to most of the adopted techniques.
In this dissertation, we propose techniques that overcome these setbacks by combining artificial neural networks (ANN) with a developed algorithm to identify the appliance ranges that contribute to the energy consumption within a given period of time, usually an hour interval.
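The disaggregation pipeline this abstract describes (detect events in the aggregate signal, then identify the responsible appliance) can be illustrated with a small sketch. The thesis uses an ANN for the identification stage; as a lightweight stand-in here, a nearest-signature matcher over power-step magnitudes conveys the same pipeline structure. Appliance names and wattages are hypothetical.

```python
def detect_events(power, threshold=30.0):
    """Flag step changes in the aggregate power signal (watts) that exceed
    the threshold; each event is (sample index, signed power step)."""
    return [(i, power[i] - power[i - 1])
            for i in range(1, len(power))
            if abs(power[i] - power[i - 1]) >= threshold]

def identify(step, signatures):
    """Attribute a power step to the appliance whose steady-state draw is
    closest in magnitude (the stage the thesis replaces with a trained ANN)."""
    return min(signatures, key=lambda name: abs(signatures[name] - abs(step)))
```

A matcher this naive cannot separate appliances with overlapping or very small consumption, which is precisely the setback the ANN-based approach in the dissertation targets.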
Wang, Zhiyong S. M. Massachusetts Institute of Technology. "A computational method and software development for make-to-order pricing optimization." Thesis, Massachusetts Institute of Technology, 2006. http://hdl.handle.net/1721.1/35097.
Full text
Includes bibliographical references (p. 34).
High variability of demand and inflexible capacity are inevitable in make-to-order production despite its cost savings. This thesis proposes a computational method to exploit pricing opportunities arising from the price elasticity of demand and from up-to-date order transactions. The possibility of software development based on this pricing optimization method was also considered. Based on experiments conducted using a software prototype, we conclude that, using the proposed computational method and software developed from it with acceptable performance and scalability, pricing optimization was able to increase the revenue of a make-to-order production.
by Zhiyong Wang.
S.M.
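The pricing idea in the abstract above (exploiting the price elasticity of demand under inflexible make-to-order capacity) can be sketched with a toy model. This is not Wang's method; it is a minimal constant-elasticity illustration with hypothetical demand parameters and a simple grid search over candidate prices.

```python
def demand(price, scale=5000.0, elasticity=1.5):
    """Constant-elasticity demand curve: order volume falls as price rises.
    Parameters are illustrative, not estimated from order transactions."""
    return scale * price ** -elasticity

def best_price(unit_cost, capacity, candidates):
    """Grid-search the candidate price maximizing profit, with fulfilled
    volume capped at the inflexible make-to-order capacity."""
    def profit(p):
        q = min(demand(p), capacity)
        return (p - unit_cost) * q
    return max(candidates, key=profit)
```

With elasticity e > 1 and slack capacity, the profit-maximizing price is the textbook markup e / (e - 1) times unit cost; when the capacity cap binds, every unit sells regardless, so the optimal price shifts upward toward the market-clearing level.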