Dissertations / Theses on the topic 'Benchmark analysis'

To see the other types of publications on this topic, follow the link: Benchmark analysis.



Consult the top 50 dissertations / theses for your research on the topic 'Benchmark analysis.'


1

CHIESA, DAVIDE. "Development and experimental validation of a Monte Carlo simulation model for the Triga Mark II reactor." Doctoral thesis, Università degli Studi di Milano-Bicocca, 2014. http://hdl.handle.net/10281/50064.

Full text
Abstract:
In recent years, many computer codes, based on Monte Carlo methods or deterministic calculations, have been developed to analyze separately different aspects of nuclear reactors. Nuclear reactors are very complex systems that require an integrated analysis of many intrinsically correlated variables: neutron fluxes, reaction rates, neutron moderation and absorption, thermal and power distributions, heat generation and transfer, criticality coefficients, fuel burnup, etc. For this reason, one of the main challenges in the analysis of nuclear reactors is the coupling of neutronics and thermal-hydraulics simulation codes, with the purpose of achieving a good modeling and comprehension of the mechanisms ruling the transient phases and the dynamic behavior of the reactor. This is essential to guarantee control of the chain reaction and safe operation of the reactor. In developing simulation tools, benchmark analyses are needed to prove the reliability of the simulations. Experimental measurements designed for comparison with simulation results are extremely valuable and can provide useful information to improve the description of the physical phenomena in the simulation models. My PhD research activity was carried out in this framework, as part of the research project Analysis of Reactor COre (ARCO, promoted by INFN), whose task was the development of modern, flexible and integrated tools for the analysis of nuclear reactors, relying on the experimental data collected at the TRIGA Mark II research reactor installed at the Applied Nuclear Energy Laboratory (LENA) of the University of Pavia. Once the effectiveness and reliability of these tools for modeling an experimental reactor have been demonstrated, they can be applied to the development of new generation systems.
In this thesis, I present the complete neutronic characterization of the TRIGA Mark II reactor, which was analyzed in different operating conditions through experimental measurements and the development of a Monte Carlo simulation tool (based on the MCNP code) able to take into account the ever increasing complexity of the conditions to be simulated. First, after an overview of theoretical concepts fundamental to nuclear reactor analysis, a model is described that reconstructs the first working period of the TRIGA Mark II reactor, in which the “fresh” fuel was not yet heavily contaminated with fission products. In particular, all geometries and materials are described in the MCNP simulation model in good detail, in order to reconstruct the reactor criticality and all the effects on the neutron distributions. The very good results obtained from simulations of the reactor in the low-power condition, in which the fuel elements can be considered in thermal equilibrium with the surrounding water, are then used to implement a model of the full-power condition (250 kW), in which the effects arising from the temperature increase in the fuel-moderator must be taken into account. The MCNP simulation model was exploited to evaluate the reactor power distribution, and a dedicated experimental campaign was performed to measure the water temperature within the reactor core. In this way, through a thermal-hydraulic calculation tool, it was possible to determine the temperature distribution within the fuel elements and to include the description of thermal effects in the MCNP simulation model. Thereafter, since the neutron flux is a crucial parameter affecting the reaction rates and thus the fuel burnup, its energy and space distributions are analyzed, presenting the results of several neutron activation measurements.
The neutron flux was first measured in the reactor's irradiation facilities through the neutron activation of many different isotopes. To analyze the energy flux spectra, I implemented an analysis tool, based on Bayesian statistics, which combines the experimental data from the different activated isotopes to reconstruct a multi-group flux spectrum. Subsequently, the spatial neutron flux distribution within the core was measured by activating several aluminum-cobalt samples in different core positions, allowing the determination of the integral and fast flux distributions from the analysis of cobalt and aluminum, respectively. Finally, I present the results of the fuel burnup calculations, performed to simulate the current core configuration after 48 years of operation. The good accuracy reached in the simulation of the neutron fluxes, as confirmed by the experimental measurements, made it possible to evaluate the burnup of each fuel element from the operating hours and the different positions it occupied in the core over the years. In this way, the MCNP simulation model could be exploited to determine a new optimized core configuration ensuring, at the same time, higher reactivity and the use of fewer fuel elements. This configuration was realized in September 2013, and the experimental results confirm the high quality of the work done. The results of this Ph.D. thesis highlight that it is possible to implement analysis tools, ranging from Monte Carlo simulations to fuel burnup time-evolution software, from neutron activation measurements to the Bayesian statistical analysis of flux spectra, and from temperature measurements to thermal-hydraulic models, which can be appropriately exploited to describe and comprehend the complex mechanisms ruling the operation of a nuclear reactor.
In particular, the effectiveness and reliability of these tools were demonstrated in the case of an experimental reactor, where a wealth of valuable data could be collected to perform benchmark analyses. Now that these tools have been developed and implemented, they can be used to analyze other reactors and, possibly, to design and develop new generation systems that will reduce the production of high-level nuclear waste and exploit nuclear fuel with improved efficiency.
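The Bayesian flux-spectrum unfolding described in this abstract can be illustrated with a toy sketch: a random-walk Metropolis sampler that reconstructs a two-group flux from three simulated activation rates. All numbers (group cross sections, noise level) are invented for illustration; the thesis' actual tool handles many isotopes and many flux groups.

```python
import math
import random

random.seed(0)

# Toy 2-group unfolding: three activation reactions with known
# group-wise cross sections (hypothetical numbers).
sigma = [[5.0, 0.1],   # thermal-sensitive isotope
         [0.2, 1.5],   # fast-sensitive isotope
         [2.0, 0.8]]   # mixed response
true_phi = [3.0, 1.0]  # "true" group fluxes (arbitrary units)

def rates(phi):
    return [sum(s * p for s, p in zip(row, phi)) for row in sigma]

# Measured activation rates = true rates + 2% Gaussian noise
meas = [r * (1 + random.gauss(0, 0.02)) for r in rates(true_phi)]
rel_err = 0.02

def log_like(phi):
    if min(phi) <= 0:
        return -math.inf
    return sum(-0.5 * ((r - m) / (rel_err * m)) ** 2
               for r, m in zip(rates(phi), meas))

# Random-walk Metropolis over the group fluxes (flat prior in phi)
phi = [1.0, 1.0]
ll = log_like(phi)
samples = []
for it in range(20000):
    prop = [p * math.exp(random.gauss(0, 0.05)) for p in phi]
    ll_prop = log_like(prop)
    # multiplicative log-normal proposal: include its Hastings correction
    log_ratio = ll_prop - ll + sum(math.log(q / p) for q, p in zip(prop, phi))
    if math.log(random.random()) < log_ratio:
        phi, ll = prop, ll_prop
    if it > 5000:
        samples.append(list(phi))

post_mean = [sum(s[g] for s in samples) / len(samples) for g in range(2)]
print("posterior mean flux:", post_mean)
```

The posterior mean recovers the simulated group fluxes to within the measurement noise, which is the essence of combining several activated isotopes into one multi-group spectrum estimate.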
APA, Harvard, Vancouver, ISO, and other styles
2

Fang, Qijun. "Hierarchical Bayesian Benchmark Dose Analysis." Diss., The University of Arizona, 2014. http://hdl.handle.net/10150/316773.

Full text
Abstract:
An important objective in statistical risk assessment is estimation of minimum exposure levels, called Benchmark Doses (BMDs), that induce a pre-specified Benchmark Response (BMR) in a target population. Established inferential approaches for BMD analysis typically involve one-sided, frequentist confidence limits, leading in practice to what are called Benchmark Dose Lower Limits (BMDLs). Appeal to hierarchical Bayesian modeling and credible limits for building BMDLs is far less developed, however; indeed, for the few existing forms of Bayesian BMDs, informative prior information is seldom incorporated. Here, a new method is developed using reparameterized quantal-response models that explicitly describe the BMD as a target parameter. This potentially improves BMD/BMDL estimation by combining elicited prior belief with the observed data in the Bayesian hierarchy. The large variety of candidate quantal-response models available for applying these methods, however, leads to questions of model adequacy and uncertainty. To address this issue, the Bayesian estimation technique is further enhanced by applying Bayesian model averaging to produce point estimates and (lower) credible bounds. Implementation is facilitated via a Monte Carlo-based adaptive Metropolis (AM) algorithm to approximate the posterior distribution. Performance of the method is evaluated via a simulation study. An example from carcinogenicity testing illustrates the calculations.
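The reparameterized-BMD idea can be sketched in miniature. Below, a one-hit quantal-response model P(d) = 1 - exp(-b d) is rewritten so that the BMD itself is the sampled parameter, and a plain random-walk Metropolis chain (a simplification of the adaptive Metropolis used in the dissertation) yields a posterior and a lower credible bound (BMDL). The dose-response data and the prior are invented for illustration.

```python
import math
import random

random.seed(1)

# Quantal-response data: (dose, n tested, n responding) -- made-up numbers
data = [(0.0, 50, 0), (0.5, 50, 4), (1.0, 50, 9), (2.0, 50, 17)]
BMR = 0.10  # benchmark response (extra risk)

# One-hit model P(d) = 1 - exp(-b d), reparameterized so the BMD is
# the explicit parameter: BMR = 1 - exp(-b * BMD) => b = -ln(1-BMR)/BMD
def log_post(bmd):
    if bmd <= 0:
        return -math.inf
    b = -math.log(1 - BMR) / bmd
    lp = -math.log(bmd)          # vague log-uniform prior on the BMD
    for d, n, y in data:
        p = 1 - math.exp(-b * d)
        p = min(max(p, 1e-12), 1 - 1e-12)
        lp += y * math.log(p) + (n - y) * math.log(1 - p)  # binomial likelihood
    return lp

bmd, lp = 1.0, log_post(1.0)
draws = []
for it in range(30000):
    prop = bmd * math.exp(random.gauss(0, 0.1))
    lp_prop = log_post(prop)
    # multiplicative proposal: Hastings correction log(prop/bmd)
    if math.log(random.random()) < lp_prop - lp + math.log(prop / bmd):
        bmd, lp = prop, lp_prop
    if it > 5000:
        draws.append(bmd)

draws.sort()
bmdl = draws[int(0.05 * len(draws))]   # lower 5% credible bound
print("posterior median BMD:", draws[len(draws) // 2], "BMDL:", bmdl)
```

Because the BMD is the model parameter, prior belief about safe exposure levels can be placed on it directly, which is the point of the reparameterization.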
3

Werner, Sarah. "Internationalisierung von Universitäten Eine Benchmark-Analyse /." St. Gallen, 2006. http://www.biblio.unisg.ch/org/biblio/edoc.nsf/wwwDisplayIdentifier/03605649001/$FILE/03605649001.pdf.

Full text
4

Liu, Jingyu, and Jingyu Liu. "Autologistic Modeling in Benchmark Risk Analysis." Diss., The University of Arizona, 2017. http://hdl.handle.net/10150/626166.

Full text
Abstract:
An important objective in statistical risk assessment is estimation of minimum exposure levels, called Benchmark Doses (BMDs), that induce a pre-specified Benchmark Response (BMR) in a target population. Established inferential approaches for BMD analysis typically involve one-sided, frequentist confidence limits, leading in practice to what are called Benchmark Dose Lower Limits (BMDLs). Within this context, a quantitative methodology is developed to characterize vulnerability among 132 U.S. urban centers ('cities') to terrorist events, applying a place-based vulnerability index to a database of terrorist incidents and related human casualties. A centered autologistic regression model is employed to relate urban vulnerability to terrorist outcomes and to adjust for autocorrelation in the geospatial data. Risk-analytic BMDs are then estimated from this modeling framework, wherein levels of high and low urban vulnerability to terrorism are identified. This new, translational adaptation of the risk-benchmark approach, including its ability to account for geospatial autocorrelation, is seen to operate quite flexibly in this socio-geographic setting. Further, alternative definitions of neighborhoods are considered to extend the autologistic benchmark paradigm to non-spatial settings. All 3108 counties in the contiguous 48 U.S. states are studied, with the number of hazards serving as the benchmark dose variable, which is employed to benchmark billion-dollar losses across each county. County-level resilience is used as a potential characteristic for defining the neighborhood structure within the autologistic model.
5

Peterson, Ross Jordan. "LANDS' END: OWL TOWELS BENCHMARK ANALYSIS." Thesis, The University of Arizona, 2009. http://hdl.handle.net/10150/192563.

Full text
6

Hothorn, Torsten, Friedrich Leisch, Achim Zeileis, and Kurt Hornik. "The design and analysis of benchmark experiments." SFB Adaptive Information Systems and Modelling in Economics and Management Science, WU Vienna University of Economics and Business, 2003. http://epub.wu.ac.at/758/1/document.pdf.

Full text
Abstract:
The assessment of the performance of learners by means of benchmark experiments is an established exercise. In practice, benchmark studies are a tool to compare the performance of several competing algorithms for a certain learning problem. Cross-validation or resampling techniques are commonly used to derive point estimates of the performances, which are compared to identify algorithms with good properties. For several benchmarking problems, test procedures taking the variability of those point estimates into account have been suggested. Most of the recently proposed inference procedures are based on special variance estimators for the cross-validated performance. We introduce a theoretical framework for inference problems in benchmark experiments and show that standard statistical test procedures can be used to test for differences in the performances. The theory is based on well-defined distributions of performance measures which can be compared with established tests. To demonstrate the usefulness in practice, the theoretical results are applied to benchmark studies in a supervised learning situation based on artificial and real-world data.
Series: Report Series SFB "Adaptive Information Systems and Modelling in Economics and Management Science"
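The framework's central observation is that a learner's performance on a resampled learning sample is a random variable, so algorithm comparisons reduce to standard tests on those performance distributions. A minimal sketch, with two deliberately simple "learners" and synthetic data invented for illustration:

```python
import math
import random
import statistics

random.seed(2)

# Synthetic regression data: y = 2x + noise
data = [(x / 10, 2 * (x / 10) + random.gauss(0, 0.5)) for x in range(100)]

def mse(predict, sample):
    return statistics.fmean((y - predict(x)) ** 2 for x, y in sample)

# Two competing "learners" (deliberately simple):
def fit_mean(train):
    m = statistics.fmean(y for _, y in train)
    return lambda x: m                      # ignores x entirely

def fit_ls(train):
    # one-variable least squares
    xb = statistics.fmean(x for x, _ in train)
    yb = statistics.fmean(y for _, y in train)
    b = sum((x - xb) * (y - yb) for x, y in train) / sum((x - xb) ** 2 for x, _ in train)
    return lambda x: yb + b * (x - xb)

# Benchmark experiment: B resampled learning samples; evaluate both
# learners on the held-out points and keep the paired performance difference.
B, diffs = 50, []
for _ in range(B):
    train = [random.choice(data) for _ in data]      # bootstrap sample
    oob = [p for p in data if p not in train] or data  # out-of-bag test points
    diffs.append(mse(fit_mean(train), oob) - mse(fit_ls(train), oob))

# Standard paired t statistic on the difference distribution
t = statistics.fmean(diffs) / (statistics.stdev(diffs) / math.sqrt(B))
print(f"mean MSE difference {statistics.fmean(diffs):.3f}, t = {t:.1f}")
```

Because the B performance differences are draws from a well-defined distribution, an off-the-shelf paired test applies directly; no special variance estimator is needed.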
7

Alsaket, Yahya. "Benchmark solutions for advanced analysis of steel frames." Thesis, Queensland University of Technology, 1999. https://eprints.qut.edu.au/36105/1/36105_Alsaket_1999.pdf.

Full text
Abstract:
During the past ten years, considerable research has been conducted with the aim of developing, implementing and verifying "advanced analysis" techniques suitable for the non-linear analysis and design of steel framed structures. With one of these methods, the simplified concentrated plasticity methods, comprehensive assessment of the actual failure modes and ultimate strengths of framing systems is now possible in practical design situations, without resort to conventional elastic methods of analysis and semi-empirical specification equations. This research has the potential to extend the creativity of the structural engineer and simplify the design process, while ensuring greater economy and more uniform safety in certain design situations. However, the application of concentrated plasticity methods is currently restricted to two-dimensional steel frame structures that are fully laterally restrained and constructed from compact sections (i.e. sections not subject to local and/or lateral buckling effects). Unfortunately this precludes the use of advanced analysis in the design of frames consisting of cold-formed sections and a significant proportion of hot-rolled universal beam sections. Therefore research is currently under way to develop concentrated plasticity methods of analysis for steel frame structures subject to local and/or lateral buckling effects. This thesis aimed to develop appropriate benchmark solutions needed to validate these simplified methods of analysis. Finite element analyses and five large scale experiments were conducted to study the ultimate strength behaviour of two-dimensional single bay, single storey steel frames subjected to local and/or lateral buckling effects. The frames comprised cold-formed rectangular hollow sections and hot-rolled I-sections. Good agreement between the results of the finite element analyses and the experiments validated the accuracy of the finite element model used.
The finite element model was then used to develop benchmark solutions for two-dimensional single storey, single bay steel frames comprising cold-formed rectangular hollow sections and hot-rolled I-sections subjected to local and/or lateral buckling effects. This thesis presents the details of finite element analyses and large scale experiments and their results, and a series of analytical benchmark solutions that can be used for the verification of simplified concentrated plasticity methods of analysis.
8

Trautvetter, Jan. "Analyse der "Pipes and Filter" Architektur gegenüber instanzbasierten Ansätzen bei Workflows." [S.l. : s.n.], 2006. http://nbn-resolving.de/urn:nbn:de:bsz:93-opus-28652.

Full text
9

Rosell, Sagrelius Simon. "Internal Performance Benchmark : -Cost Gap Analysis between painting processes." Thesis, Örebro universitet, Institutionen för naturvetenskap och teknik, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:oru:diva-69778.

Full text
Abstract:
Scania OmniExpress Busproduction Finland Oy in Lahti manufactures buses of the product families Scania Interlink LD, Scania Interlink MD, Scania Interlink HD and Scania Citywide LE Suburban. As Scania receives larger customer orders, these orders are divided between Scania Production Słupsk S.A. and Scania OmniExpress Busproduction Finland Oy, which creates a need to harmonize the two factories. Achieving this harmonization requires an investigation of Scania OmniExpress Busproduction Finland Oy's processes and a comparison with Scania Production Słupsk S.A. To sustain continuous improvement, the Strategic Plan Scania Production Lahti calls for a survey of the painting process in Lahti, Finland. On this basis, an internal cost performance benchmark has been carried out between the factories. Through a current-state analysis of both facilities, based on the Strategic Plan Scania Production Lahti methods and strategies and complemented with external methods and theory, the gap between the factories has been identified from a cost perspective. Building on the more in-depth analysis made at Scania OmniExpress Busproduction Finland Oy, an improvement project has been carried out.
Scania OmniExpress Busproduction Finland Oy in Lahti today manufactures the bus families Scania Interlink LD, Scania Interlink MD, Scania Interlink HD and Scania Citywide LE Suburban. To deliver on larger customer orders, these are split between two Scania-owned factories, Scania Production Słupsk S.A. and Scania OmniExpress Busproduction Finland Oy, which requires harmonization between the factories so that the end product is identical. To achieve this harmonization, this study was carried out between these factories. Working with continuous improvement is deeply embedded in Scania's visions and goals, and in this study the painting process was the target of that improvement work. Through a current-state analysis in both factories, based on the Strategic Plan Scania Production Lahti methods and strategies as well as complementary external methods and theories, the performance gap has been identified. Based on a deeper current-state analysis at Scania OmniExpress Busproduction Finland Oy, an improvement project has been carried out.
10

Maier, Hansjörg. "Analyse- und Optimierungsmethoden für Aggregatlagerungssysteme zur Verbesserung des Fahrkomforts = Analysis and optimization of engine mounting systems to improve comfort behaviour /." Karlsruhe, 2007. http://bvbr.bib-bvb.de:8991/F?func=service&doc_library=BVB01&doc_number=015965241&line_number=0001&func_code=DB_RECORDS&service_type=MEDIA.

Full text
11

Brunner, Marc. "Technische Analyse und neutrale Benchmark-Prognose Strukturbrüche auf den Finanzmärkten frühzeitig und korrekt erkennen /." St. Gallen, 2009. http://www.biblio.unisg.ch/org/biblio/edoc.nsf/wwwDisplayIdentifier/03604873101/$FILE/03604873101.pdf.

Full text
12

Theunissen, Marli. "An application of Data Envelopment Analysis to benchmark CEO remuneration / Marli Theunissen." Thesis, North-West University, 2012. http://hdl.handle.net/10394/9845.

Full text
Abstract:
The purpose of this study is to determine whether the Data Envelopment Analysis (DEA) can be applied to Chief Executive Officer (CEO) remuneration of companies listed on the Johannesburg Stock Exchange (JSE) by defining inputs in terms of remuneration factors and outputs in terms of business factors in order to establish a benchmark for CEO remuneration. An exploratory study is conducted, using cross-sectional data from a secondary source. The sample consists of 221 companies listed on the JSE that disclosed their financial and non-financial information during 2010. The DEA was performed to estimate the relative technical efficiency of CEOs to convert their remuneration into company performance indicators. Base Pay, Perquisites and Pension, Annual Bonus Plans and Long-term Incentives were used as the inputs to the DEA model and company performance and size, measured by Return on Equity (ROE) and Total Assets respectively, were used as the outputs to the model. The empirical results prove that the DEA can be successfully applied as a benchmarking model for CEO remuneration that incorporates multiple inputs and outputs and establishes benchmarks and potential improvements for overpaid, inefficient CEOs. The CEOs from 80 of the 221 companies included in the sample emerged as the benchmark CEOs and formed the efficiency frontier against which inefficient CEOs were compared in order to determine the potential improvements for these CEOs. From a research perspective, this study contributes to the advancement of CEO remuneration research by introducing an alternative model by which CEO remuneration can be analysed. Future studies can analyse CEO remuneration by using other variables or time series data in the DEA model or combine the DEA with other methods like the regression analysis to perform more comprehensive investigations. From a practical perspective, the DEA can be used to establish a benchmark for CEO remuneration. 
Remuneration committees can use the results of the DEA as a guide to determine acceptable remuneration levels and decrease the pay gap between CEOs and the average worker. The originality of this study lies in the fact that it is the first South African study that used the DEA instead of the regression analysis to analyse CEO remuneration of companies listed on the JSE. This study also disaggregated Total CEO Remuneration into Base Pay, Perquisites and Pension, Annual Bonus Plans and Long-term Incentives to provide more accurate benchmark information. In addition, this is the first study that established benchmark CEO remuneration levels and suggested improvements to the remuneration package structure of overpaid, under-performing CEOs of companies listed on the JSE.
Thesis (MCom (Management Accountancy))--North-West University, Potchefstroom Campus, 2013.
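The DEA model behind this study can be sketched as an input-oriented CCR linear program: a CEO's efficiency is the largest proportional reduction of remuneration inputs that a composite peer could achieve while still matching that CEO's outputs. The sketch below uses invented figures for four hypothetical CEOs and `scipy.optimize.linprog`; the dissertation's actual model uses four remuneration inputs and two outputs (ROE and Total Assets) across 221 JSE-listed companies.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical CEO data (invented for illustration only):
# inputs = remuneration components, outputs = performance measures.
X = np.array([[1.0, 0.5],    # base pay, bonus
              [2.0, 0.8],
              [1.5, 2.0],
              [3.0, 1.0]])
Y = np.array([[15.0, 40.0],  # ROE (%), total assets
              [20.0, 90.0],
              [25.0, 60.0],
              [18.0, 70.0]])

def ccr_efficiency(o):
    """Input-oriented CCR score of unit o: minimize theta such that a
    composite peer uses at most theta * o's inputs yet produces at
    least o's outputs. Decision variables are [theta, lambda_1..n]."""
    n = len(X)
    c = np.r_[1.0, np.zeros(n)]                      # objective: theta
    # inputs:  sum_j lam_j x_ij - theta * x_io <= 0
    A_in = np.c_[-X[o].reshape(-1, 1), X.T]
    # outputs: -sum_j lam_j y_rj <= -y_ro
    A_out = np.c_[np.zeros((Y.shape[1], 1)), -Y.T]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(X.shape[1]), -Y[o]],
                  bounds=[(0, None)] * (n + 1), method="highs")
    return res.fun

scores = [round(ccr_efficiency(o), 3) for o in range(len(X))]
print("efficiency scores:", scores)   # a score of 1.0 marks a benchmark CEO
```

Units scoring 1.0 form the efficiency frontier; for an inefficient unit, the optimal lambda weights identify its benchmark peers and the score its proportional input (remuneration) reduction.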
13

Souza, De Oliveira Roberto. "Méthode d'élaboration d'un benchmark pour les SGBD relationnels." Toulouse, ENSAE, 1987. http://www.theses.fr/1987ESAE0012.

Full text
Abstract:
The study of a system's workload, its performance and its basic algorithms relies heavily on mathematical tools. For Relational Database Management Systems (RDBMS), data analysis provides tools that prove very important in carrying out such studies. By applying classification methods to the attributes belonging to the relations of a database schema, the workload of the RDBMS (due to that database) is defined in terms of the representations of the operands (attributes, groups of attributes, relations) composing its queries. The method was applied to a real database evolving over time, and a synthesis of the results obtained is presented.
14

Rumplík, Michal. "Zkoumání souvislostí mezi pokrytím poruch a testovatelností elektronických systémů." Master's thesis, Vysoké učení technické v Brně. Fakulta informačních technologií, 2010. http://www.nusl.cz/ntk/nusl-235536.

Full text
Abstract:
This work deals with testability analysis of digital circuits and fault coverage. It contains a description of digital systems and their diagnosis, a description of tools for generating and applying tests, and sets of benchmark circuits. It describes the testing of circuits and experiments with the TASTE tool for testability analysis and with a commercial tool for generating and applying tests. The experiments focus on increasing the testability of circuits.
15

Feldmann, James. "Open-Space Protection in Cochise County: A Peer-Based Benchmark Analysis." Thesis, The University of Arizona, 2005. http://hdl.handle.net/10150/190649.

Full text
Abstract:
Numerous studies compare open space policies in amenity-rich, high-growth rural counties in the American West. Less common is research on similar counties before these large population shifts occur. This study selects Cochise County and 18 peer counties to benchmark another important segment of the American West: counties of moderate growth. The intent is not to explain causation between policy and open space characteristics but instead to expose open space trends among peers that may be valuable for Cochise County planners. The study begins by reviewing the role of open space in the American West before discussing the federal, state, and local policy context. Interviews with planners and a review of comprehensive plan policies at each county then provide material to benchmark Cochise County and offer recommendations. The results demonstrate that Cochise County planners take a relatively modest approach to open space planning and may benefit from:
1. Elaborating on the Comprehensive Plan purpose
2. Employing stronger language for open space goals
3. Including all applicable goals of open space protection
4. Increasing the number of moderately worded open space tools
5. Recognizing cooperation as a key to open space protection
6. Maintaining strong leadership
Expected population growth and a high demand for Cochise’s many natural and cultural amenities only reinforce the need for these recommendations.
16

Burkett, Jason Lee. "BENCHMARK STUDIES FOR STRUCTURAL HEALTH MONITORING USING ANALYTICAL AND EXPERIMENTAL MODELS." Master's thesis, University of Central Florida, 2005. http://digital.library.ucf.edu/cdm/ref/collection/ETD/id/2660.

Full text
Abstract:
The latest bridge inventory report for the United States indicates that 25% of the highway bridges are structurally deficient or functionally obsolete. With such a large number of bridges in this condition, safety and serviceability concerns become increasingly relevant along with the associated increase in user costs and delays. Biennial inspections have proven subjective and need to be coupled with standardized non-destructive testing methods to accurately assess a bridge's condition for decision making purposes. Structural health monitoring is typically used to track and evaluate performance, symptoms of operational incidents, anomalies due to deterioration and damage during regular operation as well as after an extreme event. Dynamic testing and analysis are concepts widely used for health monitoring of existing structures. Successful health monitoring applications on real structures can be achieved by integrating experimental, analytical and information technologies on real life, operating structures. Real-life investigations must be backed up by laboratory benchmark studies. In addition, laboratory benchmark studies are critical for validating theory, concepts, and new technologies as well as creating a collaborative environment between different researchers. To implement structural health monitoring methods and technologies, a physical bridge model was developed in the UCF structures laboratory as part of this thesis research. In this study, the development and testing of the bridge model are discussed after a literature review of physical models. Different aspects of model development, with respect to the physical bridge model are outlined in terms of design considerations, instrumentation, finite element modeling, and simulating damage scenarios. Examples of promising damage detection methods were evaluated for common damage scenarios simulated on the numerical and physical models. 
These promising damage indices were applied and directly compared for the same experimental and numerical tests. To assess the simulated damage, indices such as modal flexibility and curvature were applied using mechanics and structural dynamics theory. Damage indices based on modal flexibility were observed to be promising as one of the primary indicators of damage that can be monitored over the service life of a structure. Finally, this thesis study will serve an international effort that has been initiated to explore bridge health monitoring methodologies under the auspices of International Association for Bridge Maintenance and Safety (IABMAS). The data generated in this thesis research will be made available to researchers as well as practitioners in the broad field of structural health monitoring through several national and international societies, associations and committees such as American Society of Civil Engineers (ASCE) Dynamics Committee, and the newly formed ASCE Structural Health Monitoring and Control Committee.
M.S.
Department of Civil and Environmental Engineering
Engineering and Computer Science
Civil Engineering
17

Bertarelli, Lorenza. "Analisi delle prestazioni della piattaforma semantica smart-m3 mediante il benchmark lubm." Bachelor's thesis, Alma Mater Studiorum - Università di Bologna, 2015. http://amslaurea.unibo.it/9263/.

Full text
Abstract:
The Smart-M3 software, inherited from the European project SOFIA, which ended in 2011, makes it possible to create an interoperability platform independent of device type and application domain, aiming to provide a Semantic Web of information shareable among software entities and devices, creating smart environments and links between the real and virtual worlds. This field is in continuous growth thanks to the steady development both of technology, in terms of device miniaturization, and of the capabilities of embedded systems. Through the ever-greater use of sensors and actuators, these systems allow the processing of information coming from the outside world. Clearly, software of this scope has many possible applications; in the biomedical field these include telemedicine and e-Health systems. e-Health means the use of tools based on information and communication technologies to support and promote the prevention, diagnosis, treatment and monitoring of diseases and the management of health and lifestyle. The goal of this thesis is to provide a data set aimed at optimizing and refining the criteria for choosing among such platforms. We measure performance and the ability to carry out, more or less quickly, precisely and accurately, the particular task for which the software was designed. This rests on running a benchmark on different implementations of Smart-M3, and in particular on the central component called the SIB (Semantic Information Broker).
18

Kumar, Shikhar S. M. Massachusetts Institute of Technology. "Quantifying time-dependent uncertainty in the BEAVRS benchmark using time series analysis." Thesis, Massachusetts Institute of Technology, 2018. http://hdl.handle.net/1721.1/119052.

Full text
Abstract:
Thesis: S.M., Massachusetts Institute of Technology, Department of Nuclear Science and Engineering, 2018.
This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections.
Cataloged from student-submitted PDF version of thesis.
Includes bibliographical references (pages 85-87).
Advances in computational capabilities have enabled the development of high-fidelity methods for large-scale modeling of nuclear reactors. However, such techniques require proper benchmarking and validation to ensure correct and consistent modeling of real problems. Thus, the BEAVRS benchmark was developed to legitimize the advancements of new 3-D full-core algorithms in the field of reactor physics. To address the issue of uncertainty quantification (UQ) of the BEAVRS Uranium-235 fission reaction rate data, this thesis proposes a new method for measuring uncertainty that goes beyond merely conducting statistical analysis of multiple measurements at one given point in time. Instead, this work builds on principles of time series analysis and develops a rigorous method for quantifying the uncertainty of "tilt-corrected" data, in an attempt to evaluate time-dependent uncertainty. These efforts show consistent results across the four different methods and will ultimately assist in demonstrating that BEAVRS is a non-proprietary international benchmark for the validation of high-fidelity tools.
by Shikhar Kumar.
S.M.
19

Brosig, Isabella [Verfasser]. "Benchmark-Manipulation : Eine ökonomische und regulatorische Analyse des LIBOR-Manipulationsskandals / Isabella Brosig." Baden-Baden : Nomos Verlagsgesellschaft mbH & Co. KG, 2018. http://d-nb.info/1169989187/34.

20

Maharshi, Shivam. "Performance Measurement and Analysis of Transactional Web Archiving." Thesis, Virginia Tech, 2017. http://hdl.handle.net/10919/78371.

Abstract:
Web archiving is necessary to retain the history of the World Wide Web and to study its evolution. It is important for the cultural heritage community. Some organizations are legally obligated to capture and archive Web content. The advent of transactional Web archiving makes the archiving process more efficient, thereby aiding organizations to archive their Web content. This study measures and analyzes the performance of transactional Web archiving systems. To conduct a detailed analysis, we construct a meaningful design space defined by the system specifications that determine the performance of these systems. SiteStory, a state-of-the-art transactional Web archiving system, and local archiving, an alternative archiving technique, are used in this research. We experimentally evaluate the performance of these systems using the Greek version of Wikipedia deployed on dedicated hardware on a private network. Our benchmarking results show that the local archiving technique uses a Web server’s resources more efficiently than SiteStory for one data point in our design space. Better performance than SiteStory in such scenarios makes our archiving solution favorable to use for transactional archiving. We also show that SiteStory does not impose any significant performance overhead on the Web server for the rest of the data points in our design space.
Master of Science
21

Pradelle, Sylvain. "Évaluation de la durée de vie du béton armé : approche numérique globale vis-à-vis de la pénétration d’agents agressifs." Thesis, Paris Est, 2017. http://www.theses.fr/2017PESC1176/document.

Abstract:
The purpose of this research is to further develop a modelling platform that describes multiphase and multi-species transport within cementitious materials. Reinforced concrete (RC) structures can deteriorate as a result of chloride-induced and/or carbonation-induced corrosion of the steel rebars. The modelling platform deals with the initiation period of this corrosion by predicting the transport of the deleterious agents through the concrete cover. This phenomenon depends on the moisture properties of the material and requires the study of liquid-water and gas-phase transport in the material. The first step of this thesis focuses on chloride ingress within fully water-saturated concretes. The chloride ingress is limited to a coupled diffusion-binding process. Within this framework, several models have been developed and numerous experimental data are available. A benchmark of these models is performed in order to identify the most reliable engineering models. This also helps choose the most relevant chloride binding isotherms. A probabilistic analysis of selected models among the benchmark is carried out. A general framework is proposed to calculate a reliability-based service life for reinforced concrete structures in the case of immersion in seawater. A sensitivity analysis is also performed in order to identify the most influential input data. Results point out the crucial role of the concrete cover, the critical chloride content and, to a lesser extent, the effective chloride diffusion coefficient. The importance of the non-linearity of the isotherms is also highlighted, whereas this property is still not well known. Several moisture transport models have been developed. The understanding of the numerous physical phenomena involved is still insufficient for cementitious materials.
A reliability sensitivity analysis of the multiphase model and of the model based on the Richards equation is performed, considering the drying of concretes. Results point out the importance of defining a relevant water vapour desorption isotherm and, to a lesser extent, the liquid-water permeability, expressed as a function of saturation. Thereafter, this research focuses on the determination of this permeability, which is performed by inverse analysis of two different experimental tests: mass-loss monitoring during drying and the monitoring of saturation profiles during imbibition. The determined values are compared to measurements of gas permeability and to direct and indirect measurements (in particular, Katz-Thompson methods) of liquid-water permeability reported in the literature. Among the prospects for refining predictive carbonation models, a more comprehensive description of the transport of species in the gaseous phase has to be proposed. The last chapter of the manuscript deals with this issue by considering a ternary diffusion process of the gaseous mix along with a simplified description of the chemical carbonation reactions. A theoretical study is carried out in order to highlight the changes induced by the new description of transfers: the profiles of gas pressure (depression) and the profiles of CO2 pressure are modified, which can impact the progress of the carbonation front. Thereafter, a calibration of the model is performed in order to bring the numerical predictions into line with the experimental results of accelerated carbonation tests. A reliability sensitivity analysis is performed considering a carbonation test for external fractions of CO2 ranging from 0.04 % to 50 %, with constant external relative humidity.
Results point out the significance of the bulk porosity, of the initial content of C-S-H (high external fractions of CO2) and of the external moisture conditions (atmospheric external fraction of CO2). Finally, atmospheric carbonation involving wetting-drying cycles is simulated for two concretes.
22

Ulmer, Richard Marion. "Benchmark description of an advanced burner test reactor and verification of COMET for whole core criticality analysis in fast reactors." Thesis, Georgia Institute of Technology, 2014. http://hdl.handle.net/1853/52222.

Abstract:
This work developed a stylized three-dimensional benchmark problem based on Argonne National Laboratory's conceptual Advanced Burner Test Reactor design. This reactor is a sodium-cooled fast reactor designed to burn recycled fuel to generate power while transmuting long-term waste. The specification includes heterogeneity at both the assembly and core levels, while the geometry and material compositions are both fully described. After the benchmark was developed, 15-group cross sections were generated so that it could be used for transport code method verification. Using the aforementioned benchmark and 15-group cross sections, the Coarse-Mesh Transport Method (COMET) code was compared to the Monte Carlo code MCNP5 (MCNP). Results were generated for three separate core cases: control rods out, near critical, and control rods in. The cross section groups developed do not compare favorably to the continuous energy model; however, the primary goal of these cross sections is to provide a common set of approachable cross sections that are widely usable for numerical methods development benchmarking. Eigenvalue comparison results for MCNP vs. COMET are strong, with two of the models within one standard deviation and the third model within one and a third standard deviations. The fission density results are highly accurate, with a pin fission density average of less than 0.5% for each model.
23

Cain, David A. McCarthy John R. Chizmar John F. "Analysis of facilities benchmarks as a predictor of institutional quality." Normal, Ill. Illinois State University, 1997. http://wwwlib.umi.com/cr/ilstu/fullcit?p9804930.

Abstract:
Thesis (Ph. D.)--Illinois State University, 1997.
Title from title page screen, viewed June 9, 2006. Dissertation Committee: John R. McCarthy, John F. Chizmar (co-chairs), Anita H. Lupo, Stephen Bragg, Robert Arnold. Includes bibliographical references (leaves 84-88) and abstract. Also available in print.
24

Menin, Ilenia <1984>. "Benchmarkig: studio di efficienza delle destinazioni turistiche svizzere." Master's Degree Thesis, Università Ca' Foscari Venezia, 2013. http://hdl.handle.net/10579/2737.

Abstract:
This thesis aims to examine the role of destination benchmarking as a management tool that helps tourist destinations improve their performance. Starting from the definition of the benchmarking process and of the main models in the economics literature that study the competitiveness of a tourist destination, it analyzes the competitive advantages that made Switzerland the most competitive country in tourism according to the 2011 Travel & Tourism Competitiveness Index published by the World Economic Forum. The thesis concludes with an efficiency analysis of the main Swiss alpine and city destinations, carried out by implementing a Data Envelopment Analysis model.
25

Gemmeke, Tobias. "Physikalisch orientierte Optimierung und Analyse von Hardmakros für die digitale Signalverarbeitung /." Aachen : Shaker, 2006. http://bvbr.bib-bvb.de:8991/F?func=service&doc_library=BVB01&doc_number=014901490&line_number=0001&func_code=DB_RECORDS&service_type=MEDIA.

26

Ciani, Angelo. "Il posizionamento dell’Acquedotto lucano spa: Una cluster analysis." Doctoral thesis, Universita degli studi di Salerno, 2014. http://hdl.handle.net/10556/1439.

Abstract:
2009 - 2010
This research analyzes the operating performance of companies in the water services sector across the Italian regions, in order to locate the position, in terms of overall profitability, of Acquedotto Lucano Spa, a public company owned by the municipalities of the Region of Basilicata, which manages the integrated water service. The research was based on a sample whose database was constructed from data collected with the help of CNEL and the Osservatorio dei Servizi Pubblici Locali. A set of financial statement indicators was then identified to account for the economic structure and financial position of the various companies that manage the SII on the national territory. Through the use of appropriate clustering techniques, we determined the distances between the statistical units in our sample and finally identified the positioning of the Region of Basilicata relative to the other companies in the sample, thereby obtaining homogeneous groupings of companies that exhibit similar levels of economic, financial and equity indicators. Since we had no data on the effectiveness of any particular clustering methodology compared to the others, we followed different approaches to cluster analysis, thereby being able to measure how appropriate each category of algorithms was for studying such data samples. Hierarchical methods provided only general indications, a sign that the sample data are extremely cohesive and compact, a feature that also made the application of density-based methods problematic, since it results in little variation in density across the data. Using the k-means algorithm, we obtained important information about the number of clusters into which to split the sample; however, the values of the performance indicators did not convince us of the complete reliability of the cluster boundaries identified. 
In order to obtain a more precise profile of the individual regions of homogeneity, we used a suitably calibrated genetic algorithm, which produced clusters with complex, non-linear boundary surfaces, allowing a better interpretation of the results... [edited by author]
IX n.s.
27

Kotorova, Irina, and Mattias Sandström. "Comparative analysis of emerging markets hedge funds and emerging markets benchmark indices performance." Thesis, Högskolan i Halmstad, Sektionen för ekonomi och teknik (SET), 2011. http://urn.kb.se/resolve?urn=urn:nbn:se:hh:diva-15570.

Abstract:
Many hedge funds are believed to yield considerable returns to investors; there is an assumption that hedge funds seem uncorrelated with market fluctuations and have relatively low volatility. In recent years, emerging market hedge funds have experienced a higher capital inflow in periods when the diversification benefits of investing in emerging markets are higher. However, the strategy's share of the hedge fund industry's total capital flows has decreased significantly during the same periods: this might imply that investors have reallocated capital to other hedge fund strategies. This paper investigates whether emerging markets hedge funds have been as consistent in performance as the benchmark indices by presenting the results of a comparative analysis of the performance of two sample emerging markets hedge fund indices and two standard emerging markets benchmarks. The empirical study ranges from January 2006 to December 2010.
28

Beardsley, Colton Tack. "Computational Fluid Dynamics Analysis in Support of the NASA/Virginia Tech Benchmark Experiments." Thesis, Virginia Tech, 2020. http://hdl.handle.net/10919/99091.

Abstract:
Computational fluid dynamics methods have seen an increasing role in aerodynamic analysis since their first implementation. However, there are several major limitations in these methods of analysis, especially in the area of modeling separated flow. There exists a large demand for high-fidelity experimental data for turbulence modeling validation. Virginia Tech has joined NASA in a cooperative project to design and perform an experiment in the Virginia Tech Stability Wind Tunnel with the purpose of providing a benchmark set of data for the turbulence modeling community for the flow over a three-dimensional bump. This process requires thorough risk mitigation and analysis of potential flow sensitivities. The current study investigates several aspects of the experimental design through the use of several computational fluid dynamics codes. An emphasis is given to boundary condition matching and uncertainty quantification, as well as sensitivities of the flow features to Reynolds number and inflow conditions. Solutions are computed for two different RANS turbulence models, using two different finite-volume CFD codes. Boundary layer inflow parameters are studied as well as pressure and skin friction distribution on the bump surface. The shape and extent of separation are compared for the various solutions. Pressure distributions are compared to available experimental data for two different Reynolds numbers.
Master of Science
Computational fluid dynamics (CFD) methods have seen an increasing role in engineering analysis since their first implementation. However, there are several major limitations in these methods of analysis, especially in the modeling of several common aerodynamic phenomena such as flow separation. This motivates the need for high-fidelity experimental data to be used for validating computational models. This study is meant to support the design of an experiment being cooperatively developed by NASA and Virginia Tech to provide validation data for turbulence modeling. Computational tools can be used in the experimental design process to mitigate potential experimental risks, investigate flow sensitivities, and inform decisions about instrumentation. Here, we will use CFD solutions to identify risks associated with the current experimental design and investigate their sensitivity to incoming flow conditions and Reynolds number. Numerical error estimation and uncertainty quantification are performed. A method for matching experimental inflow conditions is proposed, validated, and implemented. CFD data is also compared to experimental data. Comparisons are also made between different models and solvers.
29

Ciferri, Ricardo Rodrigues. "Um benchmark voltado a analise de desempenho de sistemas de informações geograficas." [s.n.], 1995. http://repositorio.unicamp.br/jspui/handle/REPOSIP/276028.

Abstract:
Advisor: Geovane Cayres Magalhães
Master's dissertation - Universidade Estadual de Campinas, Instituto de Matemática, Estatística e Ciência da Computação
Abstract: Geographical Information Systems (GIS) deal with data that are special in nature and size. Thus, the technologies developed for conventional database systems, such as access methods, query optimizers and languages, have to be modified in order to satisfy the needs of a GIS. These modifications, embedded in several GIS or being proposed by research projects, need to be evaluated. This thesis proposes mechanisms for evaluating GIS based on benchmarks. The benchmark is composed of a workload to be submitted to the GIS being analysed and data characterizing the information. The workload is made of a set of primitive transactions that can be combined in order to derive transactions of any degree of complexity. These primitive transactions are oriented to spatial data but not dependent on the way they are represented (vector or raster). The benchmark database characterization was defined in terms of the types of data required by applications that use georeferencing, and by the need to generate complex and controlled artificial data. The proposed technique and methods were used to show how to create the transactions and the data for a given application.
Master's degree in Computer Science
30

Mark, Cindy. "A system-level synthetic circuit generator for FPGA architectural analysis." Thesis, University of British Columbia, 2008. http://hdl.handle.net/2429/2812.

Abstract:
Architectural research for Field-Programmable Gate Arrays (FPGAs) tends to use an experimental approach. The benchmark circuits are used not only to compare different architectures, but also to ensure that the FPGA is sufficiently flexible to implement the desired variety of circuits. The most common benchmark circuits used for architectural research are circuits from the Microelectronics Center of North Carolina (MCNC). These circuits are small; they occupy less than 3% [5] of the largest available commercial FPGA. Moreover, these circuits are more representative of the glue logic circuits that were targets of early devices. This contrasts with the trend towards implementing Systems on Chip (SoCs) on FPGAs where several functional modules are integrated into a single circuit which is mapped onto one device. In this thesis, we develop a synthetic system-level circuit generator that connects pre-existing circuits in a realistic manner to build large netlists that share the characteristics of real SoC circuits. This generator is based on a survey of contemporary circuit designs from industrial and academic sources. We demonstrate that these system-level circuits scale well and that their post-routing characteristics match the results of large pre-existing benchmarks better than the results of circuits from previous synthetic generators.
31

Riolo, Joseph S. "A Comparative Analysis of Genome Complexity and Manufacturability with Engineering Benchmarks." University of Cincinnati / OhioLINK, 2021. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1627663196868138.

32

Daluddung, Susan Joan. "Community Benchmarks: An Analysis of Performance Measurements in Urban Planning Management." PDXScholar, 2005. https://pdxscholar.library.pdx.edu/open_access_etds/1664.

Abstract:
New public management practices in the U.S. call for governmental accountability, performance measures and benchmarks. Community benchmarks research provides a basis for current information and further research for planners and educators in the urban planning profession. A benchmark is simply a standard for performance or targeted level of service delivery aspired to by the city. Community benchmarks, as defined by the researcher, are tied to an adopted community plan. Community plans take many shapes including the General or Comprehensive Plan, the city's budget document, or a variety of strategic planning documents. The intent of the study was to complete research and survey mid-size cities to determine common performance practices for urban planning management. The sample population was 381 cities selected from the National League of Cities and a database was created. The intent was to create a composite of key quantitative variables strongly related to the benchmark cities program. Additional terminal research was conducted from 2000 to 2004 to supplement survey results. Case studies of several select cities were conducted in order to determine the application of community benchmarks.
33

Ciardiello, Corrado. "Analisi dei dati per il benchmark energetico delle strutture ricettive della provincia di Rimini." Master's thesis, Alma Mater Studiorum - Università di Bologna, 2017.

Abstract:
This thesis is part of an energy diagnostics project between AIA (Associazione Italiana Albergatori) of the province of Rimini and the company Bioaus s.r.l. of Forlì. The project is structured in three main parts: 1. establishing a benchmark value; 2. defining an algorithm and a spreadsheet for monitoring consumption; 3. providing energy-efficiency proposals that AIA can present to its members.
34

Ertl, Felix. "Exergoeconomic Analysis and Benchmark of a Solar Power Tower with Open Air Receiver Technology." Thesis, KTH, Kraft- och värmeteknologi, 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-101320.

35

Pereira, Rafles Frausino. "Analise de desempenho de banco de dados utilizando Benchmarks especializados." [s.n.], 1990. http://repositorio.unicamp.br/jspui/handle/REPOSIP/275934.

Abstract:
Advisor: Geovane Cayres Magalhães
Master's dissertation - Universidade Estadual de Campinas, Instituto de Matemática, Estatística e Ciência da Computação
Abstract: This work describes a database performance methodology called Specialized Benchmark Methodology (MBE). In this methodology, the user sets the parameters of a layered system model that is based on one application or a group of applications. Each layer represents an abstraction level of an application representation, from the logical structure down to the most physical aspects. The environment, the database and the workload can be represented in the model. The MBE can be used for selecting design alternatives as well as in the selection of database management systems. Validation of MBE was accomplished through a prototype tool used for generating a well-known synthetic benchmark and a benchmark specialized for a given application.
Master's degree in Computer Science
36

Versick, Daniel. "Verfahren und Werkzeuge zur Leistungsmessung, -analyse und -bewertung der Ein-, Ausgabeeinheiten von Rechensystemen." Berlin Logos, 2009. http://d-nb.info/1000185036/04.

37

Mamani, Edwin Luis Choquehuanca. "Metodologia de benchmark para avaliação de desempenho não-estacionária: um estudo de caso baseado em aplicações de computação em nuvem." Universidade de São Paulo, 2016. http://www.teses.usp.br/teses/disponiveis/55/55134/tde-28072016-140437/.

Abstract:
This work examines the effects of dynamic properties of large-scale distributed systems and their impact on the delivered performance, and introduces an approach to the design of benchmark experiments capable of exposing this influence. Specially in complex, multi-tier applications, the net effect of small delays introduced by buffers, communication latency and resource instantiation, may result in significant inertia along the input-output path. In order to bring out these dynamic property, the benchmark experiment should excite the system with non-stationary workload under controlled conditions. The present research report elaborates on the essentials for this purpose and illustrates the design approach through a case study. The work also outlines how the instrumentation methodology can be exploited in dynamic modeling approaches to model transient overloads due to workload disturbances.
APA, Harvard, Vancouver, ISO, and other styles
38

Yu, Yin. "Essays on the Use of Earnings Dynamics as an Earnings Benchmark by Financial Market Participants." University of Cincinnati / OhioLINK, 2010. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1282062083.

Full text
APA, Harvard, Vancouver, ISO, and other styles
39

Hon, Ryan Paul. "Creation of a whole-core PWR benchmark for the analysis and validation of neutronics codes." Thesis, Georgia Institute of Technology, 2013. http://hdl.handle.net/1853/47611.

Full text
Abstract:
This work presents a whole-core benchmark problem based on a 2-loop pressurized water reactor with both UO₂ and MOX fuel assemblies. The specification includes heterogeneity at both the assembly and core level. The geometry and material compositions are fully described and multi-group cross section libraries are provided in 2, 4, and 8 group formats. Simplifications made to the benchmark specification include a Cartesian boundary, to facilitate the use of transport codes that may have trouble with cylindrical boundaries, and control rod homogenization, to reduce the geometric complexity of the problem. These modifications were carefully chosen to preserve the physics of the problem and a justification of these modifications is given. Detailed Monte Carlo reference solutions including core eigenvalue, assembly averaged fission densities and selected fuel pin fission densities are presented for benchmarking diffusion and transport methods. Three different core configurations are presented, namely all-rods-out, all-rods-in, and some-rods-in.
APA, Harvard, Vancouver, ISO, and other styles
40

Bailey, Jason W. "Using data envelopment analysis to identify internal benchmarks in a large parcel delivery service." Access to citation, abstract and download form provided by ProQuest Information and Learning Company; downloadable PDF file 0.39 Mb., 97 p, 2006. http://gateway.proquest.com/openurl?url_ver=Z39.88-2004&res_dat=xri:pqdiss&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft_dat=xri:pqdiss:1430769.

Full text
APA, Harvard, Vancouver, ISO, and other styles
41

Becker, Simon, Sascha Kornek, Claudia Kreutzfeldt, Sebastian Opitz, Lars Richter, Maik Ulmschneider, and Anja Werner. "Entwicklung von Benchmarks für die Umweltleistung innerhalb der Maschinenbaubranche - Eine Benchmarkingstudie im Auftrag der SIEMENS AG." Saechsische Landesbibliothek- Staats- und Universitaetsbibliothek Dresden, 2004. http://nbn-resolving.de/urn:nbn:de:swb:14-1081420313609-35523.

Full text
Abstract:
The aim of this work is to develop ecological benchmarks for determining environmental performance, using SIEMENS AG as a case study. Based on the indicators generated, environmental performance can not only be represented within an eco-benchmarking framework but also improved. The benchmarking instrument is therefore first examined theoretically and distinguished from other instruments such as ranking and rating, and principles for the successful execution of eco-benchmarking are discussed. The state of scientific knowledge on environmental indicators for measuring environmental performance, as the foundation of eco-benchmarking, forms a second theoretical pillar, and the requirements of practice are taken into account. To reduce the theoretical variety of environmental indicators to a practicable level, proposals from non-governmental organisations as well as competitors' indicators are incorporated. The result is a compact yet substantive package of recommended environmental indicators. In a subsequent step, the procedure for analysing the data provided by SIEMENS AG is presented: benchmarks are generated, and the analysis of ecological footprints ("Ecoprints") within the benchmarking process is described. The goal of the data evaluation is to give the sites qualified and practicable feedback. To improve ecological and economic performance, the optimisation of communication is examined as an essential component. Yet even the highest ecological goal cannot stand if economics is neglected: a qualitative cost-benefit analysis scrutinises the costs incurred in handling the indicators and sheds light on the main areas of benefit. Ecological concerns, in turn, also influence companies, so future-oriented perspectives are taken into account.
Scenarios are presented as a method for analysing the ecological future.
APA, Harvard, Vancouver, ISO, and other styles
42

Emslie, Frank Norman. "Flownex analysis of high temperature test reactor thermo-hydraulic benchmarks / Frank Norman Emslie." Thesis, North-West University, 2005. http://hdl.handle.net/10394/469.

Full text
Abstract:
The High Temperature Engineering Test Reactor (HTTR) is an experimental High Temperature Gas-cooled Reactor (HTGR) built by the Japanese Atomic Energy Research Institute (JAERI) to facilitate tests of HTGR technology. One of these test activities involves the validation and verification of thermo-hydraulic codes used in the design of similar HTGR plants. This report details the benchmarking of the Flownex simulation package as used by PBMR (Pty) Ltd., a South African company developing another type of HTGR known as the Pebble Bed Modular Reactor. The benchmark is of a loss-of-off-site-power event that was tested at the HTTR facility. The event involves a cut of the electric power supply to the circulators, a reactor SCRAM and the activation of the auxiliary cooling system to remove decay heat. The need for verification of thermodynamic software is very important in modern nuclear power plant designs, as so much depends on the results produced. Any errors in these results can have serious economic and safety consequences. This report firstly discusses the background of the study, elaborating on the need for the work and the benefit that can be derived from it. Thereafter the process of software verification and validation (V&V) is discussed so that the need for V&V may be clearly understood. Various modelling and simulation methods are then compared, to provide an idea of the work already done in this field. Following this, more detail is given on the HTTR test plant and how it is modelled in Flownex. This model is then used for both steady-state and transient simulations, the results of which are compared with test data. With some exceptions, the study shows that the simulation results are very close to the measured data. Differences are of such a magnitude that they may be attributed to instrumentation inaccuracies. The study contributes to the field in that the methodology of analysing thermo-hydraulic systems is further broadened.
The conclusions drawn from this study are aimed at the simulation design engineer, to help him or her understand similar problems and to find solutions faster.
Thesis (M.Ing. (Mechanical Engineering))--North-West University, Potchefstroom Campus, 2005.
APA, Harvard, Vancouver, ISO, and other styles
43

Mani, Sindhu. "Empirical Performance Analysis of High Performance Computing Benchmarks Across Variations in Cloud Computing." UNF Digital Commons, 2012. http://digitalcommons.unf.edu/etd/418.

Full text
Abstract:
High Performance Computing (HPC) applications are data-intensive scientific software requiring significant CPU and data storage capabilities. Researchers have examined the performance of the Amazon Elastic Compute Cloud (EC2) environment across several HPC benchmarks; however, an extensive HPC benchmark study and a comparison between Amazon EC2 and Windows Azure (Microsoft's cloud computing platform), with metrics such as memory bandwidth, Input/Output (I/O) performance, and communication and computational performance, are largely absent. The purpose of this study is to perform an exhaustive HPC benchmark comparison on the EC2 and Windows Azure platforms. We implement existing benchmarks to evaluate and analyze the performance of two public clouds spanning both IaaS and PaaS types. We use Amazon EC2 and Windows Azure as platforms for hosting HPC benchmarks with variations such as instance types, number of nodes, hardware and software. This is accomplished by running the STREAM, IOR and NPB benchmarks on these platforms on varied numbers of nodes for small and medium instance types. These benchmarks measure memory bandwidth, I/O performance, and communication and computational performance. Benchmarking cloud platforms provides useful objective measures of their worthiness for HPC applications, in addition to assessing their consistency and predictability in supporting them.
APA, Harvard, Vancouver, ISO, and other styles
44

Guimond, Jean-Francois. "Do Mutual Fund Managers Have Superior Skills? An Analysis of the Portfolio Deviations from a Benchmark." Digital Archive @ GSU, 2006. http://digitalarchive.gsu.edu/finance_diss/12.

Full text
Abstract:
By construction, actively managed portfolios must differ from passively managed ones. Consequently, the manager's problem can be viewed as selecting how to deviate from a passive portfolio composition. The purpose of this study is to see if we can infer the presence of superior skills through the analysis of portfolio deviations from a benchmark. Based on the Black-Litterman approach, we hypothesize that positive signals should lead to an increase in weight, from which it follows that the largest deviations from a benchmark weight reveal the presence of superior skills. More precisely, this study looks at the subsequent performance of the securities corresponding to the largest deviations from different external benchmarks. We use a sample of 8385 US funds from the CRSP Survivorship bias free database from June 2003 to June 2004 to test our predictions. We use two external benchmarks to calculate the deviations: the CRSP value weighted index (consistent with the Black-Litterman model) and the investment objective of each fund. Our main result shows that a portfolio of the securities with the largest positive deviations with respect to a passive benchmark (either CRSP-VW or investment objective) would have earned a subsequent positive abnormal return (on a risk-adjusted basis) for one month after the portfolio date. The magnitude of this return is around 0.6% for all the funds, and can be as high as 2.77% for small caps value funds. This result is robust to all the performance measures used in this study.
APA, Harvard, Vancouver, ISO, and other styles
45

Schmid, Sandra Mirjam <1996&gt. "Business accelerators - The case of VeniSIA: A benchmark analysis to conceptualize a possible accelerator within Venice." Master's Degree Thesis, Università Ca' Foscari Venezia, 2021. http://hdl.handle.net/10579/19754.

Full text
Abstract:
Venice has long been struggling with its three main problems – overdependence on tourism, progressive depopulation, and environmental challenges. The coronavirus pandemic starting in February 2020 and the resulting absence of tourists exacerbated the situation and contributed to the initiation of the project called 'Venice Sustainability Innovation Accelerator' (VeniSIA). The aim of VeniSIA is to counteract all three main problems by creating a new entrepreneurial atmosphere in the city and the region, where entrepreneurs focus on the implementation of the 'Sustainable Development Goals' set up and adopted by all UN member states. By means of a benchmark analysis, insights are gained into which design elements have been used so far to successfully pursue different missions. For this purpose, the most popular accelerators, mainly from the U.S., as well as accelerators specializing in global environmental issues, are analyzed. The result provides information on the key elements that the new accelerator should have in order to be best adapted to the conditions in Venice. Based on this, VeniSIA should be primarily backed by corporates supporting the acceleration of sustainable start-ups which predominantly serve the environment. The vision is that Venice itself becomes a living laboratory where sustainable solutions are tested and can then be rolled out globally. Venice thus has the chance to become a smart city and transform itself into the oldest city of the future.
APA, Harvard, Vancouver, ISO, and other styles
46

Assaf, Ali. "Réduction de modèle, observation et commande prédictive d'une station d'épuration d'eaux usées." Thesis, Université de Lorraine, 2012. http://www.theses.fr/2012LORR0282/document.

Full text
Abstract:
Wastewater treatment plants are large-scale, nonlinear systems subject to significant disturbances in influent flow rate and load. Model predictive control (MPC), a widely used industrial technique for advanced multivariable control, has been applied to the Benchmark Simulation Model 1 (BSM1), a simulation benchmark that defines a wastewater treatment plant. An open-loop identification method was developed to determine a linear model from a set of input-output measurements of the BSM1 process. The step responses were obtained in open loop from step variations of the manipulated inputs and measured disturbances around their steady-state values. The nonlinearities of the model are taken into account through variable parameters. The step-response coefficients obtained make it possible to determine, by optimization, the corresponding continuous transfer functions. These functions fall into five mathematical model classes: first order, first order with integrator, inverse response, second order, and second order with zero. The numerical values of the coefficients of each selected model were calculated using a least-squares criterion. The model predictive controller uses the resulting model as an internal model to control the process. Two predictive control strategies, dynamic matrix control (DMC) and quadratic dynamic matrix control (QDMC), were tested both with and without feedforward compensation. Feedforward control is used to reduce the effect on the system of two measured disturbances: the influent flow rate and the influent ammonium concentration.
APA, Harvard, Vancouver, ISO, and other styles
47

Abuella, Mohamed A. "STUDY OF PARTICLE SWARM FOR OPTIMAL POWER FLOW IN IEEE BENCHMARK SYSTEMS INCLUDING WIND POWER GENERATORS." OpenSIUC, 2012. https://opensiuc.lib.siu.edu/theses/991.

Full text
Abstract:
AN ABSTRACT OF THE THESIS OF Mohamed A. Abuella, for the Master of Science degree in Electrical and Computer Engineering, presented on May 10, 2012, at Southern Illinois University Carbondale. TITLE: STUDY OF PARTICLE SWARM FOR OPTIMAL POWER FLOW IN IEEE BENCHMARK SYSTEMS INCLUDING WIND POWER GENERATORS. MAJOR PROFESSOR: Dr. C. Hatziadoniu. The aim of this thesis is the optimal economic dispatch of real power in systems that include wind power. The economic dispatch of wind power units is quite different from that of conventional thermal units. In addition, consideration should be given to the intermittent nature of wind speed and to operating constraints, as well as to whether or not the utility owns the wind turbines. This thesis therefore uses a model that accounts for all of these considerations. The optimal power flow (OPF) is solved using one of the modern optimization algorithms: the particle swarm optimization (PSO) algorithm. The IEEE 30-bus test system has been adapted to study the implementation of the PSO algorithm for OPF with conventional thermal generators. A small and simple 6-bus system has been used to study OPF in a system that includes wind-powered generators in addition to thermal generators. The analysis of the investigated power systems is presented in tables and figures, leading to clear conclusions.
APA, Harvard, Vancouver, ISO, and other styles
48

Conte, Francesco. "A general purpose algorithm for a class of vehicle routing problems: A benchmark analysis on literature instances." Master's thesis, Alma Mater Studiorum - Università di Bologna, 2019.

Find full text
Abstract:
The aim of this work is to analyse the computational performance of a general-purpose heuristic capable of solving a large class of Vehicle Routing Problems. The general Ruin and Recreate (R&R) framework of the algorithm is discussed, with particular attention to some of the removal and insertion operators used in the implementation. The benchmark analysis is then performed using standard benchmark instances of three different routing problems. Before analyzing the algorithm, a definition of the problem is provided together with some mathematical formulations. Exact methods are briefly discussed, whereas an exhaustive presentation of the most famous (meta)heuristic approaches is given. The obtained results show that the algorithm returns good solutions for almost all the different problem variants with up to 500 customers.
APA, Harvard, Vancouver, ISO, and other styles
49

Abendroth, Martin, and Eberhard Altstadt. "COVERS WP4 Benchmark 1 Fracture mechanical analysis of a thermal shock scenario for a VVER-440 RPV." Forschungszentrum Dresden, 2010. http://nbn-resolving.de/urn:nbn:de:bsz:d120-qucosa-28238.

Full text
Abstract:
This paper describes the analytical work done by modelling and evaluating a thermal shock in a WWER-440 reactor pressure vessel due to an emergency case. An axially oriented semi-elliptical underclad/surface crack is assumed to be located in the core weld line. Three-dimensional finite element models are used to compute the global transient temperature and stress-strain fields. By using a three-dimensional submodel, which includes the crack, the local crack stress-strain field is obtained. With subsequent postprocessing using the J-integral technique, the stress intensity factors KI along the crack front are obtained. The results for the underclad and surface crack are provided and compared, together with a critical discussion of the VERLIFE code.
APA, Harvard, Vancouver, ISO, and other styles
50

Abendroth, Martin, and Eberhard Altstadt. "COVERS WP4 Benchmark 1 Fracture mechanical analysis of a thermal shock scenario for a VVER-440 RPV." Forschungszentrum Dresden-Rossendorf, 2007. https://hzdr.qucosa.de/id/qucosa%3A21650.

Full text
Abstract:
This paper describes the analytical work done by modelling and evaluating a thermal shock in a WWER-440 reactor pressure vessel due to an emergency case. An axially oriented semi-elliptical underclad/surface crack is assumed to be located in the core weld line. Three-dimensional finite element models are used to compute the global transient temperature and stress-strain fields. By using a three-dimensional submodel, which includes the crack, the local crack stress-strain field is obtained. With subsequent postprocessing using the J-integral technique, the stress intensity factors KI along the crack front are obtained. The results for the underclad and surface crack are provided and compared, together with a critical discussion of the VERLIFE code.
APA, Harvard, Vancouver, ISO, and other styles
