Dissertations / Theses on the topic 'Testing and Rating Methodology'

To see the other types of publications on this topic, follow the link: Testing and Rating Methodology.

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 50 dissertations / theses for your research on the topic 'Testing and Rating Methodology.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse dissertations / theses on a wide variety of disciplines and organise your bibliography correctly.

1

Yuen, Hon-ming Jacky, and 袁漢明. "Implementing peer assessment and self-assessment in a Hong Kong classroom." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 1998. http://hub.hku.hk/bib/B31944966.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Zollinger, Lance M. "Probability of default rating methodology review." Thesis, Kansas State University, 2014. http://hdl.handle.net/2097/18811.

Full text
Abstract:
Master of Agribusiness
Department of Agricultural Economics
Allen M. Featherstone
Institutions of the Farm Credit System (FCS) focus on risk-based lending in accordance with regulatory direction. The rating of risk also assists retail staff in loan approval, risk-based pricing, and allowance decisions. FCS institutions have developed models to analyze financial and related customer information in determining qualitative and quantitative risk measures. The objective of this thesis is to examine empirical account data from 2006-2012 to review the probability of default (PD) rating methodology within the overall risk rating system implemented by a Farm Credit System association. This analysis provides insight into the effectiveness of this methodology in predicting the migration of accounts across the association’s currently established PD ratings, where negative migration may be an apparent precursor to actual loan default. The analysis indicates that average PD ratings hold relatively consistent over the years, though the distribution of the majority of PD ratings shifted to higher quality by two rating categories over the time period. Various regressions run in the analysis indicate that the debt-to-asset ratio is most consistently statistically significant in estimating future PD ratings. The current ratio appears to be superior to working capital to gross profit as a liquidity measure in predicting PD rating migration. Funded debt to EBITDA is more effective than gross profit to total liabilities as a measure of earnings to debt in predicting PD rating movement, although the change in these ratios over time appears to be a weaker indicator of the change in PD rating, potentially because the annual earnings of production agriculture operations vary with commodity price volatility. The debt coverage ratio is important as it relates to future PD migration, though the same commodity price volatility suggests the need to implement multi-year averaging for the calculation of earnings-based ratios. 
These ratios were important in predicting the PD rating of observations one year into the future for production agriculture operations. To further test the predictive ability of the PD ratings, similar regression analyses were completed comparing current year rating and ratios to future PD ratings beyond one year, specifically for three and five years. Results from these regression models indicate that current year PD rating and ratios are less effective in predicting future PD ratings beyond one year. Furthermore, because of the variation in regression results between the analyses completed for one, three and five years into the future, it is important to regularly capture ratio and rating information, at least annually.
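The regressions described above can be illustrated with a minimal sketch: an ordinary-least-squares fit of next-year PD rating on the current rating and two financial ratios. All data and coefficients below are synthetic and purely illustrative; the thesis itself uses actual FCS account records from 2006-2012.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 500

# Synthetic account data (stand-ins for the thesis's empirical records)
pd_rating = rng.integers(1, 15, size=n).astype(float)  # current-year PD rating
debt_to_asset = rng.uniform(0.1, 0.9, size=n)
current_ratio = rng.uniform(0.5, 3.0, size=n)

# Assume, for illustration, that next-year rating responds mainly to leverage,
# echoing the thesis's finding that the debt-to-asset ratio is most significant
next_rating = (0.7 * pd_rating + 6.0 * debt_to_asset
               - 0.5 * current_ratio + rng.normal(0.0, 1.0, size=n))

# OLS fit: [intercept, current rating, debt/asset, current ratio]
X = np.column_stack([np.ones(n), pd_rating, debt_to_asset, current_ratio])
beta, *_ = np.linalg.lstsq(X, next_rating, rcond=None)
print([round(b, 2) for b in beta])
```

With enough accounts, the recovered slope on the debt-to-asset ratio dominates the liquidity term, mirroring the relative importance reported in the abstract.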
APA, Harvard, Vancouver, ISO, and other styles
3

Taylor, Benjamin. "Sequential methodology and applications in sports rating." Thesis, Lancaster University, 2011. http://eprints.lancs.ac.uk/50655/.

Full text
Abstract:
Sequential methods aim to update beliefs about a set of parameters given new blocks of data that arise in sequence. Early research in this area was motivated by the case where the blocks of data arise in time, as a result of observing an underlying dynamical system, but an important modern application is the analysis of large datasets. This thesis considers both the design and application of sequential methods. A new adaptive sequential Monte Carlo (SMC) methodology is presented. By incorporating adaptive Markov chain Monte Carlo (MCMC) moves into the SMC update, it is possible to utilise the heuristic, computational and theoretical advantages of SMC to make gains in sampling efficiency. The new method is tested on the problem of Bayesian mixture analysis and found to outperform an adaptive MCMC algorithm in 5 of the 6 situations considered. Theoretical justification of the method, guidelines for implementation and a condition for convergence are provided. When the dimensionality of the parameter space is high, methods such as the adaptive SMC sampler do not work well. In such cases, sequential data analysis can proceed with statistical models that are amenable to exact or approximate filtering recursions. The two situations considered here are the rating of sports teams and players. A new method for rating and selecting teams for the NCAA basketball tournament is considered. The selection of teams is important to university institutions in the United States, as admittance brings academic as well as sports-related financial benefits. Currently the selection process is undertaken by a panel of expert voters. The new method largely agrees with these voters, but in the seasons considered a small number of cases are highlighted where an injustice to the team was evident. Also considered is the rating of professional basketball players. 
A new method is developed that measures a player's offensive and defensive ability and provides a means of combining this information into an overall rating. The method uses data from multiple seasons to more accurately estimate player abilities in a single season. Injustice in the assigning of NBA awards in the 2009 season is uncovered, but the research also highlights one possible reason for this: the commonly cited box-score statistics contain little information on defensive ability.
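The SMC-with-MCMC-moves idea described above can be sketched on a toy problem: a resample-move sampler with tempering and random-walk Metropolis moves, targeting the posterior of a Gaussian mean. The model, tempering schedule and step sizes here are illustrative choices, not the thesis's adaptive algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: y_i ~ N(theta, 1) with true theta = 2
y = rng.normal(2.0, 1.0, size=50)

def log_lik(theta):
    # Gaussian log-likelihood up to a constant, vectorised over particles
    return -0.5 * np.sum((y[None, :] - theta[:, None]) ** 2, axis=1)

def log_prior(theta):
    return -0.5 * theta ** 2 / 100.0  # N(0, 10^2) prior, up to a constant

# Tempered targets: pi_t(theta) ∝ prior(theta) * lik(theta)^gamma_t
gammas = np.linspace(0.0, 1.0, 21)
N = 2000
theta = rng.normal(0.0, 10.0, size=N)  # particles drawn from the prior
logw = np.zeros(N)

for g0, g1 in zip(gammas[:-1], gammas[1:]):
    logw += (g1 - g0) * log_lik(theta)           # incremental weight update
    w = np.exp(logw - logw.max()); w /= w.sum()
    if 1.0 / np.sum(w ** 2) < N / 2:             # resample when ESS is low
        theta = theta[rng.choice(N, size=N, p=w)]
        logw = np.zeros(N)
    # MCMC move: random-walk Metropolis invariant for the current target
    prop = theta + rng.normal(0.0, 0.3, size=N)
    log_acc = (log_prior(prop) + g1 * log_lik(prop)
               - log_prior(theta) - g1 * log_lik(theta))
    accept = np.log(rng.uniform(size=N)) < log_acc
    theta = np.where(accept, prop, theta)

w = np.exp(logw - logw.max()); w /= w.sum()
post_mean = np.sum(w * theta)
print(round(post_mean, 2))
```

The MCMC move rejuvenates particle diversity after resampling, which is the mechanism the thesis exploits when adapting the move kernels.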
APA, Harvard, Vancouver, ISO, and other styles
4

Barbouche, Tarek. "Extreme Value Theory Applied to Securitizations Rating Methodology." Thesis, KTH, Matematisk statistik, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-204640.

Full text
Abstract:
One of today’s financial trends is securitization. Evaluating securitization risk requires strong quantitative skills and a deep understanding of both credit and market risk. For international securitization programs it is mandatory to take exchange-rate-related risks into account. We will see the different methods for evaluating extreme variations of exchange rates using Extreme Value Theory and Monte Carlo simulations.
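The EVT approach mentioned in the abstract is commonly realised as peaks-over-threshold with a generalized Pareto tail. The sketch below assumes heavy-tailed Student-t draws as stand-ins for exchange-rate log-returns; threshold, sample size and tail probability are arbitrary illustrative choices.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(1)

# Hypothetical daily log-returns of an exchange rate: heavy-tailed Student-t
returns = 0.006 * rng.standard_t(df=4, size=5000)

# Peaks-over-threshold: model excesses over a high threshold with a GPD
u = np.quantile(returns, 0.95)
excess = returns[returns > u] - u
c, loc, scale = genpareto.fit(excess, floc=0.0)  # fix location at the threshold

# Extrapolate an extreme quantile (a 1-in-1000-day move) from the fitted tail
p_exceed = excess.size / returns.size
q999 = u + genpareto.ppf(1 - 0.001 / p_exceed, c, scale=scale)
print(round(q999, 4))
```

Monte Carlo simulation then amounts to drawing body returns empirically and tail returns from the fitted GPD.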
APA, Harvard, Vancouver, ISO, and other styles
5

Hubner, Andreas. "Methodology for testing RFID applications." Thesis, De Montfort University, 2018. http://hdl.handle.net/2086/17497.

Full text
Abstract:
Radio Frequency Identification (RFID) is a promising technology for process automation and, beyond that, capable of identifying objects without the need for a line of sight. However, the trend towards automatic identification of objects also increases the demand for high-quality RFID applications. Therefore, research on testing RFID systems and methodical approaches for testing are needed. This thesis presents a novel methodology for the system-level test of RFID applications. The approach, called ITERA, allows for the automatic generation of tests, defines a semantic model of the RFID system and provides a test environment for RFID applications. The method introduced can be used to gradually transform use cases into a semi-formal test specification. Test cases are then systematically generated in order to execute them in the test environment. It applies the principle of model based testing from a black-box perspective in combination with a virtual environment for automatic test execution. The presence of RFID tags in an area, monitored by an RFID reader, can be modelled by time-based sets using set theory and discrete events. Furthermore, the proposed description and semantics can be used to specify RFID systems and their applications, which might also be used for purposes other than testing. The approach uses the Unified Modelling Language to model the characteristics of the system under test. Based on the ITERA meta model, test execution paths are extracted directly from activity diagrams and RFID-specific test cases are generated. The approach introduced in this thesis reduces the effort of RFID application testing by systematically generating test cases and automating test execution. In combination with the meta model, and by considering additional parameters such as unreliability factors, it not only satisfies functional testing aspects, but also increases confidence in the robustness of the tested application. 
Combined with the instantly available virtual readers, it has the potential to speed up the development process and decrease costs, even during the early development phases. ITERA can be used for highly automated, reproducible tests and, because of the instantly available readers, even before the real environment is deployed. Furthermore, total control of the RFID environment makes it possible to test applications that would be difficult to test manually. This thesis explains the motivation and objectives of this new RFID application test methodology. Based on an RFID system analysis, it proposes a practical solution to the identified issues. Further, it gives a literature review of testing fundamentals, model based test case generation, the typical components of an RFID system and the RFID standards used in industry.
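The "time-based sets" idea — the set of tags inside a reader's field as a function of time, driven by discrete enter/leave events — can be sketched directly. The event list and tag names below are hypothetical.

```python
# Discrete events observed by a hypothetical reader: (time, tag_id, action)
events = [
    (0, "T1", "enter"), (2, "T2", "enter"),
    (5, "T1", "leave"), (7, "T3", "enter"), (9, "T2", "leave"),
]

def tags_present(t, events):
    """Time-based set semantics: tags inside the reader's field at time t."""
    present = set()
    for time, tag, action in sorted(events):
        if time > t:
            break
        if action == "enter":
            present.add(tag)
        else:
            present.discard(tag)
    return present

print(sorted(tags_present(6, events)))  # ['T2']
```

A virtual reader in a test environment can replay such event streams deterministically, which is what makes the tests reproducible before real hardware is deployed.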
APA, Harvard, Vancouver, ISO, and other styles
6

Timilsina, Parashmani. "Truck Load Testing and Adjusted Load Rating of Ironton Russell Bridge." University of Cincinnati / OhioLINK, 2019. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1574417628832757.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Cherep, Dragoevich Manuel. "A Methodology for Applying Concolic Testing." Thesis, Uppsala universitet, Institutionen för informationsteknologi, 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-303772.

Full text
Abstract:
Concolic testing is a technique that combines concrete and symbolic execution in order to generate inputs that explore different execution paths, leading to better test coverage. Concolic testing tools can find runtime errors fully automatically using available type specifications. The type specifications of a function define the type of each input. However, most specification languages are not expressive enough, which can lead to runtime errors caused by malformed inputs (i.e. irrelevant errors). Moreover, logic errors that cause a program to operate incorrectly without crashing cannot be reported automatically. A universal methodology applicable to any programming language is proposed. Preconditions force the concolic execution to generate well-formed inputs before testing a function. Postconditions, in turn, lead to a runtime error when a program operates incorrectly, helping to find logic errors. The results obtained using the concolic testing tool CutEr, in the functional programming language Erlang, show how a program is tested using only well-formed inputs specially generated to try to violate the defined postconditions.
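The precondition/postcondition pattern described above can be sketched language-agnostically as a contract wrapper: the precondition rejects malformed inputs before the function runs, and the postcondition turns a silent logic error into a reportable runtime error. The function under test (`isqrt`) and both contracts are hypothetical illustrations, not from the thesis.

```python
def with_contract(pre, post):
    """Decorator: enforce a precondition on inputs and a postcondition on the result."""
    def wrap(f):
        def checked(*args):
            if not pre(*args):
                raise ValueError("malformed input rejected by precondition")
            result = f(*args)
            # A violated postcondition surfaces a logic error as a runtime error
            assert post(*args, result), "postcondition violated: logic error"
            return result
        return checked
    return wrap

# Hypothetical function under test: integer square root
@with_contract(pre=lambda n: isinstance(n, int) and n >= 0,
               post=lambda n, r: r * r <= n < (r + 1) * (r + 1))
def isqrt(n):
    r = 0
    while (r + 1) * (r + 1) <= n:
        r += 1
    return r

print(isqrt(10))  # 3
```

A concolic engine would then search specifically for inputs that satisfy the precondition yet violate the postcondition.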
APA, Harvard, Vancouver, ISO, and other styles
8

Abdul, Raof Abdul Halim. "The production of a performance rating scale : an alternative methodology." Thesis, University of Reading, 2002. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.394195.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Huang, Ziyi. "Rating methodology of high voltage mass impregnated DC cable circuits." Thesis, University of Southampton, 2014. https://eprints.soton.ac.uk/372744/.

Full text
Abstract:
With the continuing growth in energy consumption worldwide, the move towards a European-wide super grid will result in significant changes in how modern transmission and distribution networks are operated. Fundamental to this is the need to accurately know or determine the available ampacity of high voltage cable circuits, because bulk power must be transmitted between maritime nations through dc power cables. Therefore, an accurate cable rating becomes paramount for the efficient and safe operation of transmission networks while funding for large-scale network construction schemes is limited. Although the standardised thermal-limited rating has been successfully implemented for traditional ac cable networks for over 50 years, the move towards dc cable transmission imposes extra physical constraints on the cable rating which are not considered by standard rating approaches. The two main concerns are potential dielectric electrical breakdown prior to a normal thermal runaway and the development of dielectric cavities during cable cooling. In addition, the thermal-limited rating of submarine dc cable crossings, within a complex marine environment, requires an advanced numerical modelling method, since the traditional IEC thermal-limited rating method does not apply. Beyond its technical value, this work attracts significant interest within the electrical power industry and organizations such as Cigré and IEC, because it will inform future international standards for rating high voltage dc cables. Considering the dielectric electrical stress constraint as the limiting factor for cable ratings, an analytical electrical stress-limited rating method has been developed and successfully benchmarked by numerical simulations for a practical cable design. This method allows ratings to be calculated against a criterion of maximum dielectric electrical strength. 
Considering the dielectric cavity creation threshold as the limiting factor for cable ratings, a comprehensive study has been conducted, covering thermal dynamics, the theory of elasticity and electrical circuit theory. Subsequently, an analytical calculation of the cable internal pressure has been developed, together with the concept of a mechanical pressure-limited rating. The method has been successfully demonstrated for a practical cable design, yielding a rating which prevents the creation of cavities due to potential plastic deformation of the cable sheath. When crossings are inevitably installed, cables are pushed towards their thermal limit as a result of mutual heating. In order to accurately rate these circuits under various ambient conditions, Finite Element Analysis (FEA) methods have been developed. Compared to the traditional IEC calculation, FEA modelling provides a more reasonable and accurate solution by relaxing idealistic assumptions in the IEC method. In addition, a systematic cable rating strategy, which considers the highly coupled thermal, electrical and mechanical physics, has been suggested and successfully demonstrated through rating submarine high voltage dc cable crossings. In summary, this thesis contributes towards the development of a modern rating methodology for hvdc mass impregnated cable circuits, with the aim of efficient and reliable long-term operation.
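For context, the standard thermal-limited rating the thesis extends reduces, in its simplest steady-state form (in the spirit of IEC 60287, ignoring dielectric and sheath losses), to balancing conductor I²R heating against the temperature rise over the total thermal resistance to ambient. Every number below is an illustrative placeholder, not a value from the thesis.

```python
import math

# Illustrative figures only: simplified steady-state thermal rating of a dc cable
theta_cond_max = 55.0   # max conductor temperature for an MI cable, degC (typical)
theta_ambient = 15.0    # seabed ambient temperature, degC
R_ac = 1.2e-5           # conductor resistance at operating temperature, ohm/m
T_total = 1.6           # total thermal resistance, conductor to ambient, K*m/W

# Heat balance: I^2 * R_ac * T_total = delta_theta  =>  I = sqrt(delta_theta / (R_ac * T_total))
delta_theta = theta_cond_max - theta_ambient
rating = math.sqrt(delta_theta / (R_ac * T_total))  # permissible current, amps
print(round(rating))
```

The thesis's contribution is precisely that this thermal limit is not always binding for dc cables: the electrical-stress and internal-pressure limits it develops can govern instead.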
APA, Harvard, Vancouver, ISO, and other styles
10

Biswas, Dhruv. "A system-on-a chip testing methodology." Thesis, University of Ottawa (Canada), 2005. http://hdl.handle.net/10393/26854.

Full text
Abstract:
In this thesis, we present a system-on-a-chip (SOC) testing methodology. The system consists of a wrapper, a test access mechanism (TAM) and the cores under test. The cores include the ISCAS sequential and combinational benchmark circuits. At the gate level, the stuck-at fault model is used to detect faults. The wrapper separates the circuit under test from the other cores. The test access mechanism transports the test patterns, or test vectors, to the desired circuit under test and then transports the responses back to the output pin of the SOC. The faults are then injected using the fault simulator, which generates tests for the circuit under test. Of the many TAM design methods, we implemented the TAM as a plain signal transport medium shared by all the cores in the system-on-chip. Once the dedicated TAM lines are set to the circuit under test, fault simulation is done. Each circuit in an SOC is independently tested for its fault coverage. The isolation of the circuit under test from the others is handled by the program running in the background. We were able to simulate the whole SOC testing process and obtain satisfactory fault coverage for the circuits under test.
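The stuck-at fault model mentioned above can be sketched on a toy two-gate circuit: inject each single stuck-at-0/1 fault on every line, apply a test set, and count the faults whose output differs from the fault-free response. The circuit here is a trivial illustration, not one of the ISCAS benchmarks.

```python
from itertools import product

# Toy gate-level core: internal net c = AND(a, b); output = XOR(c, b)
def circuit(a, b, fault=None):
    def val(name, v):
        # A stuck line takes its stuck value regardless of the driven value
        return fault[1] if fault and fault[0] == name else v
    a, b = val("a", a), val("b", b)
    c = val("c", a & b)
    return c ^ b

# Single stuck-at fault list: every line stuck at 0 and stuck at 1
faults = [(line, sa) for line in ("a", "b", "c") for sa in (0, 1)]

# Fault coverage of an exhaustive two-input test set
tests = list(product((0, 1), repeat=2))
detected = {f for f in faults
            for t in tests if circuit(*t) != circuit(*t, fault=f)}
print(len(detected), "/", len(faults))  # 6 / 6
```

A real fault simulator does the same comparison over the TAM-delivered vectors, reporting coverage as detected faults over the total fault list.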
APA, Harvard, Vancouver, ISO, and other styles
11

Ramilli, Marco <1983&gt. "A Design Methodology for Computer Security Testing." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2013. http://amsdottorato.unibo.it/4438/.

Full text
Abstract:
The field of "computer security" is often considered something between Art and Science. This is partly due to the lack of widely agreed and standardized methodologies for evaluating the degree of security of a system. This dissertation contributes to this area by investigating the most common security testing strategies applied nowadays and by proposing an enhanced methodology that may be effectively applied to different threat scenarios with the same degree of effectiveness. Security testing methodologies are the first step towards standardized security evaluation processes and an understanding of how security threats evolve over time. This dissertation analyzes some of the most used methodologies, identifying differences and commonalities useful for comparing them and assessing their quality. The dissertation then proposes a new enhanced methodology, built by keeping the best of every analyzed methodology. The designed methodology is tested on different systems with very effective results, which is the main evidence that it could really be applied in practical cases. Much of the dissertation discusses and proves how the presented testing methodology can be applied to such different systems, and even used to evade security measures by inverting goals and scopes. Real cases are often hard to find in methodology documents; by contrast, this dissertation shows real and practical cases, offering technical details about how to apply the methodology. Electronic voting systems are the first field test considered, with Pvote and Scantegrity as the two tested electronic voting systems. The usability and effectiveness of the designed methodology for electronic voting systems is proved through this analysis of field cases. Furthermore, reputation and antivirus engines have also been analyzed, with similar results. 
The dissertation concludes by presenting some general guidelines to build a coordination-based approach of electronic voting systems to improve the security without decreasing the system modularity.
APA, Harvard, Vancouver, ISO, and other styles
12

Fragachan, Jose M. "Accelerated testing methodology for evaluating pavement patching materials." Link to electronic thesis, 2007. http://www.wpi.edu/Pubs/ETD/Available/etd-050407-140250/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
13

MUNIZ, EUDES SIQUEIRA. "NEW METHODOLOGY FOR TESTING SHALES UNDER TRIAXIAL STRESSES." PONTIFÍCIA UNIVERSIDADE CATÓLICA DO RIO DE JANEIRO, 1998. http://www.maxwell.vrac.puc-rio.br/Busca_etds.php?strSecao=resultado&nrSeq=1535@1.

Full text
Abstract:
COORDENAÇÃO DE APERFEIÇOAMENTO DO PESSOAL DE ENSINO SUPERIOR
Shales constitute more than 75% of the rocks drilled in the search for hydrocarbons, and an estimated 90% of wellbore instability problems are credited to their presence. According to Steiger & Leung (1991), more than 600 million US dollars are spent annually by the oil industry just to correct the problems due to instabilities. This work presents a new methodology for running triaxial compression tests on shales under undrained conditions. This methodology allows shorter test durations and is based upon the adaptation of concepts traditionally employed in soil mechanics. Twelve tests were carried out on shales obtained from offshore Brazil during this work. The tests were divided into two groups based upon the nature of the fluid used in the pore pressure lines. The test results show that this methodology is very efficient. The shear strength behavior of the shale is described using the Mohr-Coulomb criterion; the shear strength parameters are a cohesion of 3.17 MPa and an internal friction angle of 25.3°.
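The Mohr-Coulomb parameters reported in the abstract translate directly into a shear strength envelope, tau = c + sigma_n * tan(phi). The normal stress value used below is an arbitrary illustration; the cohesion and friction angle are the values reported for the tested shale.

```python
import math

# Mohr-Coulomb envelope with the parameters reported in the abstract
c = 3.17                  # cohesion, MPa
phi = math.radians(25.3)  # internal friction angle

def shear_strength(sigma_n):
    """Shear strength (MPa) at effective normal stress sigma_n (MPa)."""
    return c + sigma_n * math.tan(phi)

# Illustrative evaluation at 10 MPa normal stress
print(round(shear_strength(10.0), 2))  # 7.9
```

At zero normal stress the envelope reduces to the cohesion alone, which is how the intercept is read off the fitted failure line.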
APA, Harvard, Vancouver, ISO, and other styles
14

Ennulat, Harold W. "Emulation framework for testing higher level control methodology." Thesis, This resource online, 1992. http://scholar.lib.vt.edu/theses/available/etd-10102009-020258/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
15

Krishnan, Hari Shanker. "A diagnostic model for testing the memorability of advertisements." Diss., The University of Arizona, 1991. http://hdl.handle.net/10150/185728.

Full text
Abstract:
The purpose of this research is to develop and empirically test a conceptual framework for examining the effects of advertising exposure on consumer memory so as to better understand the information processing of advertisements. Patterns and levels of performance on various tests of memory for different advertisement components are interpreted within the framework of a memory model adapted from the well-known SAM model in psychology. Predictions are made regarding the effects on recall, recognition, and indirect test performance of an ad's execution strength and relevance to the main message elements, and elaboration (semantic versus nondirected). The general diagnostic procedures are illustrated in a study of humor in advertising. Subjects viewed print ads with variations in the humorous execution's strength and relevance to the brand claims either without explicit instructions to elaborate or with a task requiring semantic elaboration of the links between the humor and the brand claims. Subsequently they completed a (direct) recognition or recall task, or an indirect test of memory for various ad components. The results, though not entirely systematic, show that memory for the brand name and brand claim components vary as a function of stimulus characteristics and the processing operations at encoding. Second, the findings show that the ad components may facilitate or interfere with each other. High levels of attention to one ad component may lead to lower memory performance on other components. Third, this research shows how a theory-based set of comparisons of memory test performance may be used to identify the locus of effects, viz., at encoding or at retrieval. Memory failures due to lack of encoding attention to the ad are distinguished from the inability to retrieve the encoded information later. Finally, the study demonstrates the use of indirect tasks in testing advertising effects that implicate implicit retrieval processes from memory. 
The patterns of parallel versus dissociated performance on traditional direct versus indirect tests offer insights into various types of advertising effects on memory. The academic and managerial implications of the findings are discussed.
APA, Harvard, Vancouver, ISO, and other styles
16

Mack, Scott J. "An alternative testing methodology for TOW missile training systems." Thesis, Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 1996. http://handle.dtic.mil/100.2/ADA322575.

Full text
Abstract:
Thesis (M.S. in Operations Research) Naval Postgraduate School, September 1996.
Thesis advisor(s): Dan C. Boger. "September 1996." Includes bibliographical references (p. 93). Also available online.
APA, Harvard, Vancouver, ISO, and other styles
17

Lindberg, Stefan, and Fredrik Strandberg. "The development and evaluation of a unit testing methodology." Thesis, Karlstad University, Division for Information Technology, 2006. http://urn.kb.se/resolve?urn=urn:nbn:se:kau:diva-148.

Full text
Abstract:

Westinghouse Fuel Manufacturing in Västerås, Sweden, manufactures fuel rods for nuclear plants. Manufacturing-IT is a software development section at Westinghouse Fuel Manufacturing. This thesis involves the development of a unit testing methodology (UTM) for the Manufacturing-IT section, which currently does not follow a well-defined software test process.

By evaluating different unit testing best practices and UTM design issues collected from literature, articles, papers and the Internet, a UTM document was developed. The UTM document was developed according to requirements from Manufacturing-IT and as an extension to existing documents within the Westinghouse organization.

The UTM was evaluated by applying the methodology in a case study. A single unit within a production control system in the rod manufacturing workshop at the Westinghouse fuel factory in Västerås was tested. Aside from evaluating the UTM, the case study was intended to find software tools that could simplify the unit testing process, and to test the production control system unit thoroughly.

The 182 test cases designed and implemented revealed 28 faults in the tested unit. NUnit was chosen to be the tool for automated unit testing in the UTM. The results from the case study indicate that the methods and other unit testing process related activities included in the UTM document developed are applicable to unit testing. However, adjustments and further evaluation will be needed in order to enhance the UTM.

The UTM developed in this thesis is a first step towards a structured testing process for the Manufacturing-IT section and the UTM document will be used at the Manufacturing-IT section.

By using the methods and other unit testing process related activities in the UTM developed in this thesis, any company or individual with requirements for a UTM similar to those of Manufacturing-IT, and that performs unit testing in an unstructured way, may benefit by achieving a more structured unit testing process.
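The style of automated unit test the UTM adopts (via NUnit, per the abstract) can be illustrated with a Python analogue using the standard `unittest` framework; the function under test and its expected behaviour are entirely hypothetical, chosen only to show the nominal-case/invalid-input structure of a typical test case pair.

```python
import io
import unittest

def rod_weight(length_cm, grams_per_cm):
    """Hypothetical production-control helper: weight of a fuel-rod segment."""
    if length_cm <= 0:
        raise ValueError("length must be positive")
    return length_cm * grams_per_cm

class RodWeightTest(unittest.TestCase):
    def test_nominal_weight(self):
        self.assertAlmostEqual(rod_weight(10.0, 6.5), 65.0)

    def test_rejects_invalid_length(self):
        with self.assertRaises(ValueError):
            rod_weight(-1.0, 6.5)

# Run the suite programmatically and report (tests run, failures + errors)
suite = unittest.TestLoader().loadTestsFromTestCase(RodWeightTest)
result = unittest.TextTestRunner(stream=io.StringIO()).run(suite)
print(result.testsRun, len(result.failures) + len(result.errors))  # 2 0
```

Each designed test case in the case study follows this shape: an assertion on expected behaviour, executable repeatably by the test tool.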

APA, Harvard, Vancouver, ISO, and other styles
18

Corliss, Walter F. "An engineering methodology for implementing and testing VLSI circuits." Thesis, Monterey, California. Naval Postgraduate School, 1989. http://hdl.handle.net/10945/27024.

Full text
Abstract:
The engineering methodology for producing a fully tested VLSI chip from a design layout is presented. A 16-bit correlator, NPS CORN88, that was previously designed was used as a vehicle to demonstrate this methodology. The study of the design and simulation tools, MAGIC and MOSSIM II, was the focus of the design and validation process. The design was then implemented and the chip was fabricated by MOSIS. The fabricated chip was then used to develop a testing methodology for the digital test facilities at NPS. NPS CORN88 was the first full-custom VLSI chip designed at NPS to be tested with the NPS digital analysis system, a Tektronix DAS 9100 series tester. The capabilities and limitations of these test facilities are examined within this thesis. NPS CORN88 test results are included to demonstrate the capabilities of the digital test system. A translator, MOS2DAS, was developed to convert the MOSSIM II simulation program to the input files required by the DAS 9100 device verification software, 91DVS. Finally, a tutorial for using the digital test facilities, including the DAS 9100 and associated support equipment, is included as an appendix.
APA, Harvard, Vancouver, ISO, and other styles
19

Morales, Gerardo. "A testing methodology for the validation of web applications." Thesis, Evry, Institut national des télécommunications, 2010. http://www.theses.fr/2010TELE0014/document.

Full text
Abstract:
The objective of this thesis is to ensure the proper behaviour of the functional aspects of web-based systems. To achieve this goal, we propose two different test approaches: the active approach and the passive approach. Our goal is to automatically generate a suite of active test scenarios that will be applied to a system under test to examine its compliance with respect to its functional specification, and, when interrupting the normal flow of operation is problematic, to observe the system under test with passive testing. The goal of this work is to develop a method and a set of tools to test web-based systems using the active and passive testing approaches. Concerning the active testing approach, we present a methodology covering the end-to-end testing process (from building the model to test execution). This work tackles the gap between, on the one hand, generating abstract test cases from abstract models and, on the other hand, developing methods that concretize these tests and automatically apply them to real applications. Then, concerning the passive test approach, we present a methodology and a new tool for observing the behaviour of the communications of web applications with external web services (for SOA-based web applications) in order to check whether the observed behaviour is correct. All the methodologies and tools presented in this work are applied to two industrial case studies, Mission Handler and Travel Reservation Service, in order to validate our contributions in active and passive testing respectively.
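The active-testing idea this abstract describes — deriving executable test sequences from a behavioural model — can be illustrated with a minimal sketch. The untimed FSM below is a much-simplified stand-in for a TEFSM, and the login/browse model and greedy transition-coverage walk are invented for illustration; none of this is the thesis's actual tooling.

```python
# Hypothetical behavioural model of a tiny web application:
# (state, input event) -> next state. Invented for illustration.
FSM = {
    ("start", "open_page"): "home",
    ("home", "login"): "logged_in",
    ("logged_in", "browse"): "logged_in",
    ("logged_in", "logout"): "home",
}

def transition_cover(fsm, initial="start", budget=100):
    """Greedy walk that tries to exercise every specified transition at
    least once; the step budget guards against unreachable transitions."""
    remaining = set(fsm)
    sequence, state = [], initial
    while remaining and budget:
        budget -= 1
        candidates = [k for k in remaining if k[0] == state]
        if not candidates:
            candidates = [k for k in fsm if k[0] == state]
            if not candidates:          # dead end: restart the system under test
                sequence.append("RESET")
                state = initial
                continue
        key = min(candidates)           # deterministic choice for repeatability
        sequence.append(key[1])
        remaining.discard(key)
        state = fsm[key]
    return sequence

# Abstract test sequence a test engine could then concretize against
# the real application (clicks, HTTP requests, assertions on responses).
tests = transition_cover(FSM)
```

A concretization layer, as the abstract notes, would map each abstract event (e.g. `login`) to real interactions with the application under test.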
APA, Harvard, Vancouver, ISO, and other styles
20

MADDI, KISHORE R. "A METHODOLOGY FOR INTERCONNECT TESTING OF NETWORK-ON-CHIP." University of Cincinnati / OhioLINK, 2005. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1115906768.

Full text
APA, Harvard, Vancouver, ISO, and other styles
21

Mella, Luca. "Ict security: Testing methodology for targeted attack defence tools." Master's thesis, Alma Mater Studiorum - Università di Bologna, 2014. http://amslaurea.unibo.it/6963/.

Full text
Abstract:
This thesis is situated in the broad context of information security; in particular, it addresses the problem of testing security systems designed to counter today's threats: targeted attacks and, more generally, advanced persistent threats (APTs). The main objective of the work is the development and discussion of a testing methodology for security systems focused on this class of problems. The proposed guidelines are intended to help close the gap between what is tested and what actually has to be confronted in practice. The activities carried out in preparing the thesis were both theoretical, concerning the development of a methodology for best approaching the testing of security systems against targeted attacks, and experimental, in that these concepts were used to run tests on several defence tools in a realistic scenario of interest.
APA, Harvard, Vancouver, ISO, and other styles
22

Menon, Hari. "Improving the general measurement methodology." Thesis, Virginia Tech, 1990. http://hdl.handle.net/10919/41584.

Full text
Abstract:
This thesis proposed to improve an existing performance measurement methodology called the "General Measurement Methodology (GMM)." The GMM as well as its variations have been used in organizations to design a measurement effort to support performance improvement. It has evolved over a number of years and is currently being researched at the International Productivity Center (IPC). In order to attain its objective, this research adopted a case study approach supported by data from the literature as well as an expert panel. Three cases were considered to collect data on performance measurement system design and implementation. Two of these organizations (IPC and Acme Manufacturing Company) have used the GMM to set up performance measurement systems. The third case study (Golden State Power and Light {GSP&L}) was selected to lend another perspective to measurement system design since it used another approach or methodology. Site visits were made to each case study and data was collected primarily using unstructured interviews. The literature contributed more perspectives on how organizations measure performance. Responses from an expert panel of fourteen people enhanced the database even further. Data from each of the above sources have been collected and processed. As explained in Chapter 6, the research design adopted had to be altered toward the end since the development and validation of the improved GMM was difficult. The strengths and advantages of the improved version could not be completely verified. However, the conclusions of this thesis include a comprehensive description of the knowledge, wisdom and insight gained about measurement. A roadmap (based on the information acquired) toward effective measurement system design, development and implementation has also been presented.
Master of Science
APA, Harvard, Vancouver, ISO, and other styles
23

Al-Risaini, Mansour Ibrahim. "Characterization of soybean moisture using acoustic methodology." Master's thesis, Mississippi State : Mississippi State University, 2003. http://library.msstate.edu/etd/show.asp?etd=etd-11202003-190149.

Full text
APA, Harvard, Vancouver, ISO, and other styles
24

梁錦玲 and Kam-ling Joyce Leung. "A study on using performance appraisal as a strategic management tool." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 1995. http://hub.hku.hk/bib/B3126668X.

Full text
APA, Harvard, Vancouver, ISO, and other styles
25

Müller, Lukáš. "Problém černého pasažera při ratingu eurozóny - případ Řecka." Master's thesis, Vysoká škola ekonomická v Praze, 2015. http://www.nusl.cz/ntk/nusl-205128.

Full text
Abstract:
This thesis examines the impact of Greek membership in the eurozone on the creditworthiness of the Greek economy. To do so, it applies macroeconomic data to the methodology of the rating agency Moody's. The goal is to identify the key factor that caused the discrepancy between the awarded rating and real economic stability. The text is divided into two thematic parts. The first introduces the institution of the rating agency and explains the Sovereign Rating Methodology 2013. The second analyses the Greek economy using this methodology and applies the findings to the causes of the Greek debt crisis. Even though Greek membership in the eurozone did lower the interest rates on its bonds, a direct impact of this membership on the country's rating was not proved. One of the causes of the Greek crisis was therefore also moral hazard: the financial markets relied on the assumption that the eurozone would not let a member state go bankrupt.
APA, Harvard, Vancouver, ISO, and other styles
26

Bruneau, Julien. "Developing and Testing Pervasive Computing Applications: A Tool-Based Methodology." Phd thesis, Université Sciences et Technologies - Bordeaux I, 2012. http://tel.archives-ouvertes.fr/tel-00767395.

Full text
Abstract:
Despite recent progress, developing a pervasive computing application remains a challenge because of a lack of conceptual frameworks and supporting tools. This challenge involves handling heterogeneous communicating devices, overcoming the complexity of distributed systems technologies, defining the architecture of an application, and encoding it in a program. Moreover, testing pervasive computing applications is problematic because it involves acquiring, testing and interfacing a variety of software and hardware entities. This process can quickly become costly in money and time when the target environment involves numerous entities. This thesis proposes a tool-based methodology for developing and testing pervasive computing applications. Our methodology first provides the DiaSpec design language. This language allows a taxonomy of entities specific to an application domain to be defined, abstracting over their heterogeneity. It also includes a layer for defining the architecture of an application. Our tool suite provides a compiler that, from DiaSpec descriptions, generates a programming framework guiding the implementation and testing phases. To support the testing phase, we propose a simulation approach and a tool integrated into our tool-based methodology: the DiaSim tool. Our approach uses the testing support generated by DiaSpec to test applications transparently in a simulated physical environment. The simulation of an application is rendered graphically in a 2D visualization tool. We combined DiaSim with a domain-specific language for describing physical phenomena as differential equations, enabling realistic simulations. DiaSim has been used to simulate applications in various application domains. Our simulation approach has also been applied to an avionics system, demonstrating the generality of our simulation approach.
APA, Harvard, Vancouver, ISO, and other styles
27

Chassin, Ludovic Jean. "In silico testing of glucose controllers : methodology and sample application." Thesis, City University London, 2005. http://openaccess.city.ac.uk/8435/.

Full text
Abstract:
Diabetes mellitus designates a range of metabolic disorders characterised by hyperglycaemia due to deficient or absent insulin secretion, insulin action, or both. In particular, Type 1 diabetes is characterised by a total lack of endogenous insulin secretion which has to be replaced by exogenous insulin to control the plasma glucose concentration. An extracorporeal wearable artificial pancreas (AP) has been a research aim for over three decades. The research is motivated by the need to improve glucose control. Results of a major study, the Diabetes Control and Complications Trial (DCCT), have demonstrated that improvements in glucose control prevent or delay long term complications, which are the main causes of morbidity and mortality in subjects with Type 1 diabetes. Prior to a clinical evaluation, performance of new medical devices can be tested in silico. Such an approach has been adopted extensively by the pharmaceutical industry in the development of new drugs. In silico testing benefits from relatively low financial, human, and time costs by comparison with the resources required for a full clinical evaluation. The aims of the present thesis are to identify components of the AP, integrate them into a simulation environment, and design an in silico evaluation strategy for the development of closed-loop algorithms with the ultimate goal to assess safety and efficacy prior to clinical evaluation. In the present work, submodels of metabolic processes were linked to represent the characteristics of the glucoregulation in Type 1 diabetes. The submodels were associated with sets of parameters to account for variability in population and individual responses to meals and insulin therapy. The model of glucoregulation in Type 1 diabetes was extended by models of subcutaneous (sc) glucose sensing and sc insulin delivery to represent all aspects of the AP.
A systematic approach was developed and employed to evaluate, in silico, the potential and limitations of an AP glucose controller. This was exemplified by evaluating a nonlinear model predictive controller. The robustness of the AP was explored by hypothesising various perturbations induced by different system components. A further objective included the establishment of a qualitative grading scheme of glucose control from the clinical viewpoint. This was followed by a comparison between results from simulations and a clinical trial of 24 hours, which gave the proof of concept of in silico testing. It was found that despite discrepancies due to initial conditions and meal differences, the simulations predicted the outcome of the clinical trial well. In conclusion, the thesis demonstrates the significant potential of in silico testing to make predictions about system behaviour aiding the assessment of safety and efficacy of control algorithms during the development of an AP.
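The in silico evaluation loop described in this abstract — running a controller against a virtual patient model and grading the resulting glucose trajectory — can be sketched with a deliberately toy setup. The two-state linear dynamics, the proportional dosing rule, and all parameter values below are illustrative assumptions, not the thesis's submodels or its model predictive controller.

```python
# Toy in silico trial: a simplified virtual patient (two-state linear
# glucose-insulin dynamics, hypothetical parameters) is simulated under a
# candidate controller, and the glucose trace is graded against a simple
# safety band standing in for a clinical criterion.

def simulate(controller, minutes=600, meal_start=120, meal_end=180):
    g, i = 6.0, 0.0              # glucose (mmol/L) and insulin action (arbitrary units)
    trace = []
    for t in range(minutes):
        meal = 0.05 if meal_start <= t < meal_end else 0.0   # meal glucose appearance
        dose = controller(g)                                 # controller under test
        g += -0.02 * (g - 6.0) - 0.05 * i + meal             # toy glucose dynamics
        i += -0.05 * i + 0.05 * dose                         # toy insulin kinetics
        trace.append(g)
    return trace

def proportional_controller(glucose, target=6.0, gain=0.5):
    """Dose insulin in proportion to glucose excursions above target."""
    return max(gain * (glucose - target), 0.0)

trace = simulate(proportional_controller)
safe = all(3.9 <= g <= 10.0 for g in trace)   # simple pass/fail grading
```

Varying the virtual patient's parameters across such runs is one way to mimic the population and meal variability the abstract mentions, before any clinical evaluation.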
APA, Harvard, Vancouver, ISO, and other styles
28

Dear, Ian D. "A test strategy planning methodology driven by economic parameters." Thesis, Brunel University, 1990. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.292562.

Full text
APA, Harvard, Vancouver, ISO, and other styles
29

Thornton, Nathan Paul. "Live Load Testing of Appalachia, Va Concrete Arch Bridges for Load Rating Recommendation." Thesis, Virginia Tech, 2012. http://hdl.handle.net/10919/35195.

Full text
Abstract:
As America's infrastructure ages, many of the nation's bridges approach the end of their service life. In order to develop a method for handling the rising number of deficient and functionally obsolete bridges, nondestructive tests and evaluations must be undertaken. Valuable information from these tests regarding the strength and condition of bridges will help in making decisions about their rehabilitation and replacement.

Two adjoining open spandrel reinforced concrete arch bridges in downtown Appalachia, Virginia were selected for live load testing by Virginia Department of Transportation (VDOT). Both bridges have supported an increasing amount of extreme coal truck traffic throughout their service life and are essential to the efficient transport of coal in the region. Because of their age, having been built in 1929, and the amount of visible damage and repairs, VDOT was concerned about their remaining capacity and safe operation.

The live load tests focused on global behavior characteristics such as service strain and deflection as well as local behavior of the arches surrounding significant repairs. It was found that the strain and deflection data collected during load testing displayed linear elastic behavior, indicating excess capacity beyond the test loads. Also, given the loading applied, the measured strains and deflections were small in magnitude, showing that the bridges are still acting as stiff structures and are in good condition. Data collected during these tests was compared to results from a finite element model of the bridges to determine the coal truck size which is represented by the live load test loading configurations. The model comparisons determined the test loads produced comparable deflections to those produced by the target coal truck load. Through this approach, a recommendation was given to VDOT regarding the satisfactory condition of the aging bridges to aid in the process of load rating and maintenance scheduling for the two bridges.
Master of Science

APA, Harvard, Vancouver, ISO, and other styles
30

Feng, Xianan. "Nondestructive Load Testing and Experimental Load Rating of the Veteran's Glass City Skyway." University of Toledo / OhioLINK, 2010. http://rave.ohiolink.edu/etdc/view?acc_num=toledo1279214079.

Full text
APA, Harvard, Vancouver, ISO, and other styles
31

Wvong, Russil. "A new methodology for OSI conformance testing based on trace analysis." Thesis, University of British Columbia, 1990. http://hdl.handle.net/2429/29343.

Full text
Abstract:
This thesis discusses the problems of the conventional ISO 9646 methodology for OSI conformance testing, and proposes a new methodology based on trace analysis. In the proposed methodology, a trace analyzer is used to determine whether the observed behavior of the implementation under test is valid or invalid. This simplifies test cases dramatically, since they now need only specify the expected behavior of the IUT; unexpected behavior is checked by the trace analyzer. Test suites become correspondingly smaller. Because of this reduction in size and complexity, errors in test suites can be found and corrected far more easily. As a result, the reliability and the usefulness of the conformance testing process are greatly enhanced. In order to apply the proposed methodology, trace analyzers are needed. Existing trace analyzers are examined, and found to be unsuitable for OSI conformance testing. A family of new trace analysis algorithms is presented and proved. To verify the feasibility of the proposed methodology, and to demonstrate its benefits, it is applied to a particular protocol, the LAPB protocol specified by ISO 7776. The design and implementation of a trace analyzer for LAPB are described. The conventional ISO 8882-2 test suite for LAPB, when rewritten to specify only the expected behavior of the IUT, is found to be more than an order of magnitude smaller.
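The core idea of this abstract — test cases specify only the expected behaviour while a trace analyzer checks all observed behaviour against the specification — can be sketched as follows. The toy connect/send/disconnect protocol is invented for illustration and is not the ISO 7776 LAPB state machine.

```python
# Hypothetical protocol specification: the set of allowed transitions,
# (state, observed event) -> next state. Invented for illustration.
PROTOCOL_FSM = {
    ("idle", "connect"): "connected",
    ("connected", "send"): "connected",
    ("connected", "disconnect"): "idle",
}

def analyze_trace(trace, start="idle"):
    """Walk an observed event trace through the specification's transitions;
    return (valid, offending_event)."""
    state = start
    for event in trace:
        nxt = PROTOCOL_FSM.get((state, event))
        if nxt is None:                 # behaviour not allowed by the spec
            return False, event
        state = nxt
    return True, None

# A test case now only drives the expected scenario; the analyzer catches
# any unexpected behaviour, so it need not be enumerated in the test suite.
ok, _ = analyze_trace(["connect", "send", "send", "disconnect"])
bad, offending = analyze_trace(["connect", "disconnect", "send"])
```

This separation is what lets the test suite shrink: invalid behaviours are rejected by the analyzer rather than spelled out case by case.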
Science, Faculty of
Computer Science, Department of
Graduate
APA, Harvard, Vancouver, ISO, and other styles
32

Morris, Brian. "A model test methodology for the fire testing of compartment walls." Thesis, University of Ulster, 1997. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.245800.

Full text
APA, Harvard, Vancouver, ISO, and other styles
33

Hohmann, Brian P. (Brian Patrick). "Life extension of structural components via an improved nondestructive testing methodology." Thesis, Massachusetts Institute of Technology, 2010. http://hdl.handle.net/1721.1/62688.

Full text
Abstract:
Thesis (Sc. D.)--Massachusetts Institute of Technology, Dept. of Materials Science and Engineering, 2010.
Cataloged from PDF version of thesis.
Includes bibliographical references (p. 337-355).
An experimental study was performed to determine the flaw detection sensitivity of advanced nondestructive testing (NDT) techniques with respect to structural applications. The techniques analyzed exemplify the incorporation of digital technology into NDT and include the following: meandering winding magnetometer array (MWM-array®) eddy current, phased-array ultrasonic (PA-UT), three dimensional computed tomography (3DCT), and digital radiography (DR). The three classes of samples inspected with these techniques consisted of alloy block specimens containing flat bottom hole (FBH) arrays, probability of detection (POD) wedding cake samples, and actual airplane engine components. Results from the sensitivity analyses were compared to current NDT techniques used industrially. An image analysis program called Cellprofiler was used to optimize the threshold correction factor for selected results. The Cellprofiler output was analyzed in conjunction with POD software, and the integration of digitally advanced NDT techniques with image analysis software resulted in approximately a threefold improvement in the minimum detectable flaw size at the 90/95 POD/CL level. An improved inspection methodology was presented which incorporated redundancy in the in-service inspection plan with the use of Bayesian updating techniques to forecast remnant life. Reliability block diagrams for structural disk and blade aircraft engine components were presented as examples of the methodology. Implementation of the proposed NDT methodology significantly increases the feasibility of a retirement-for-cause (RFC) approach to be applied to aging structural components in a cost-effective manner.
by Brian P. Hohmann.
Sc.D.
APA, Harvard, Vancouver, ISO, and other styles
34

Westerlaken, Michelle. "A New User Testing Methodology for Digitally Mediated Human-Animal Interaction." Thesis, Malmö högskola, Fakulteten för kultur och samhälle (KS), 2014. http://urn.kb.se/resolve?urn=urn:nbn:se:mau:diva-21688.

Full text
Abstract:
This thesis evaluates a novel methodology for the user testing of digitally mediated human-animal interactions. The proposed method includes the structural analysis of video observations following a Grounded Theory approach. Complemented with more subjective human observations, this methodology aims to initiate a more informed iterative design and research process in which the animal’s experience with a playful artefact is analysed and reflected upon. The research involves the user testing of a prototype for an independently developed tablet game designed for cats and humans. With a focus on the user experience of the cat, the data analysis of this study results in new insights in the behaviour of the cat while interacting with the game. These outcomes are subsequently concluded in the form of design iterations that can help to improve the prototype. This study demonstrates how a new methodology can provide an initial focus on the perceptions and experience of the animal and lead to valuable insights that can advance the design of a digital artefact intended for animal use. Further research in this new area of interaction design can benefit from this study by expanding the theoretical framework and methodologies to different contexts and settings with the integration of playful technological artefacts and other animals that are known to engage in natural play.
APA, Harvard, Vancouver, ISO, and other styles
35

Shoukas, Gregory M. "Development and validation of a method for testing and rating thermosyphon solar water heaters." Connect to online resource, 2007. http://gateway.proquest.com/openurl?url_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&res_dat=xri:pqdiss&rft_dat=xri:pqdiss:1442903.

Full text
APA, Harvard, Vancouver, ISO, and other styles
36

Reilly, James Joseph. "Load Testing Deteriorated Spans of the Hampton Roads Bridge-Tunnel for Load Rating Recommendations." Thesis, Virginia Tech, 2017. http://hdl.handle.net/10919/74302.

Full text
Abstract:
The Hampton Roads Bridge-Tunnel is one of the oldest prestressed concrete structures in the United States. The 3.5 mile long twin structure includes the world's first underwater tunnel between two man-made islands. Throughout its 60 years in service, the harsh environment along the Virginia coast has taken its toll on the main load carrying girders. Concrete spalling has exposed prestressing strands within the girders allowing corrosion to spread. Some of the more damaged girders have prestressing strands that have completely severed due to the extensive corrosion. The deterioration has caused select girders to fail the necessary load ratings. The structure acts as an evacuation route for the coast and is a main link for the local Norfolk Naval Base and surrounding industry. Because of these constraints, load posting is not a viable option. Live load testing of five spans was performed to investigate the behavior of the damaged spans. Innovative techniques were used during the load test including a wireless system to measure strains. Two different deflection systems were implemented on the spans, which were located about one mile offshore. The deflection data was later compared head to head. From the load test results, live load distribution factors were developed for both damaged and undamaged girders. The data was also used by the local Department of Transportation to validate computer models in an effort to help pass the load rating. Overall, this research was at the forefront of the residual strength of prestressed concrete girders and the testing of in-service bridges.
Master of Science
APA, Harvard, Vancouver, ISO, and other styles
37

Ristanovic, Dragan. "New methodology for transmission line relay testing and evaluation using advanced tools." Texas A&M University, 2003. http://hdl.handle.net/1969.1/355.

Full text
Abstract:
Protective relays are important parts of the power system. The protection guards valuable equipment, and protective relays play a vital role in performing the task. The relay detects fault conditions within an assigned area, and opens and closes output contacts to cause the operation of other devices under its control. The relay acts to operate the appropriate circuit breakers to prevent damage to personnel and property. To ensure consistent reliability and proper operation, protective relay equipment must be evaluated and tested. The importance of the relay evaluation issue is linked to the capability to test relays and relaying systems using very accurate waveform representations of a fault event. The purpose of testing protective relays is to ensure correct operation of the relay for all possible power system conditions and disturbances. To fulfill this purpose, relay testing in varying network configurations and with different fault types is required. There are a variety of options that have different performance potentials and implementation constraints. Use of digital simulators to test protective relays has proven to be an invaluable means to evaluate relay performance under realistic conditions. This thesis describes a new methodology that attempts to improve the existing practices in testing relays by using advanced digital simulator hardware, different software packages for network modeling, and new software tools for generating and replaying test waveforms. Various types of microprocessor relays are tested and evaluated through a set of scenarios. A new methodology that combines different software packages to facilitate particular testing objectives is applied.
APA, Harvard, Vancouver, ISO, and other styles
38

Zager, Mary Ann. "Explicating and testing a general theory of crime." Diss., The University of Arizona, 1993. http://hdl.handle.net/10150/186570.

Full text
Abstract:
Gottfredson and Hirschi's A General Theory of Crime (1990) motivated much research on the concept of self-control and its relationship to crime, delinquency, and deviant behavior. Although researchers are aware of this theory's contribution to criminological research, some confusion about the exact nature of the relationship between self-control and criminal behavior (as specified by Gottfredson and Hirschi) remains. To clarify this relationship, the assumptions most vital to the theory are explained. One theorem derived from these assumptions regards the role of opportunity in deviant behavior. Gottfredson and Hirschi clearly posit opportunity as a necessary but not sufficient condition for criminal (and analogous non-criminal) behavior to occur. The precise role of opportunity in self-control theory, however, is somewhat unclear in Gottfredson and Hirschi's work. The present work clarifies this element of opportunity, searches for a measure of self-control that is opportunity free, and addresses the relationship between this type of measure and delinquent behavior using data from the National Youth Survey. The role of opportunity in this theory is clarified using gender differences in delinquent behavior as a tool for separating the components of opportunity. Using gender differences in several delinquent behaviors, the existence of the two components of opportunity (one inherent in the act and one inherent in the actor) is confirmed. After establishing these elements of opportunity, gender differences are used to facilitate the search for a measure of self-control that is distinct from both. This attitudinal measure raises the issue of the role of attitudes in Gottfredson and Hirschi's theory. The final analysis focuses on the relationships between attitudes (both children's and parent's) and children's delinquent behavior. 
Log-linear models are used to specify the structures of these relationships, which are complementary to Gottfredson and Hirschi's assumptions regarding social norms, parental influence on children's value systems, and an individual's ability to engage in behaviors that they realize are inappropriate.
APA, Harvard, Vancouver, ISO, and other styles
39

Wotherspoon, David Kenneth. "Television content analysis : agreement between expert and naive coders." Thesis, University of British Columbia, 1988. http://hdl.handle.net/2429/28314.

Full text
Abstract:
Agreement between trained and untrained coders in assessing television content was investigated. A model integrating the different approaches to content analysis was proposed. The model contains three dimensions: audience coders versus expert coders, microanalysis versus macroanalysis, and quantitative versus qualitative analysis. The audience versus expert coders facet of that model was evaluated by having university students watch and assess the content of 24 television programs chosen from prime-time on the basis of their popularity. They were not trained in content analysis and did not know the questions about which they were asked until after viewing their program. Their evaluations were compared with similar evaluations given previously by trained (expert) coders. Each of the 24 programs was watched by 5 male and 5 female naive coders (total N=240). The groups were balanced for ethnicity and socioeconomic status. A statistic developed especially for this research was used to compare the naive and expert ratings on 22 selected variables. The results indicated that untrained and trained coders in general evaluated the programs similarly. Moreover, the questions on which the experts tended not to agree (that is, were unreliable) were generally the same ones on which the untrained coders did not agree, both amongst themselves and with the experts.
Arts, Faculty of
Psychology, Department of
Graduate
APA, Harvard, Vancouver, ISO, and other styles
40

Shoard, William. "A methodology for the design of sustainable buildings utilising computer simulation and a sustainability rating scheme." Thesis, University of Manchester, 2010. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.525990.

Full text
APA, Harvard, Vancouver, ISO, and other styles
41

Van, Huyssteen Johan Williams. "A proposed credit rating methodology for co-operative banks in South Africa / Johan Williams van Huyssteen." Thesis, North-West University, 2008. http://hdl.handle.net/10394/3815.

Full text
Abstract:
When large banks have a shortage of liquidity, they solve the problem by either placing papers in the market, going to other banks, borrowing from other financial institutions or making use of their reserves. When entering the market, credit ratings facilitate the loan process by providing an indication of the lending banks' risk. However, when South African co-operative banks enter the market for finances, no rating can be applied as a method for rating these banks does not exist. This, in turn, leads to a slow-down in the loans process and co-operative banks being charged higher interest rates. The primary objective of this dissertation was the formulation of a credit rating methodology, amended from Fitch Ratings and Moody's Investors Service, for South African co-operative banks. A literature study was undertaken in order to determine the theoretical aspects of rating banks as well as providing insight into the management structures of cooperatives and their business practices. A proposed credit rating methodology was then developed and tested by means of a questionnaire provided to South African credit unions of different sizes in Gauteng and the North-West. The history of credit unions and co-operative banks was provided as the point of departure and followed by the Co-operative Banks Act. This was done in order to facilitate understanding of the need for the rating methodology along with the rating aspects provided for by legislation, especially regarding the operating and regulatory environment. The developed methodology was found to be adequately suited for co-operative banks in South Africa (CBSAs) and could ultimately assist CBSAs in negotiating interest rates charged when entering the market for liquidity purposes. This in turn could have positive implications for the government's aim to reach the large unbanked population of South Africa.
Thesis (M.Com. (Risk Management))--North-West University, Potchefstroom Campus, 2009.
APA, Harvard, Vancouver, ISO, and other styles
42

Nasseri, Sahand. "Application of an improved transition probability matrix based crack rating prediction methodology in Florida's highway network." [Tampa, Fla] : University of South Florida, 2008. http://purl.fcla.edu/usf/dc/et/SFE0002379.

Full text
APA, Harvard, Vancouver, ISO, and other styles
43

Nasseri, Sahand. "Application of an Improved Transition Probability Matrix Based Crack Rating Prediction Methodology in Florida’s Highway Network." Scholar Commons, 2008. https://scholarcommons.usf.edu/etd/424.

Full text
Abstract:
With the growing need to maintain roadway systems for the provision of safety and comfort to travelers, network-level decision-making becomes more vital than ever. To keep pace with this fast-evolving trend, highway authorities must maintain highly effective databases to track their highway maintenance needs. The Florida Department of Transportation (FDOT), a leader in transportation innovation in the U.S., maintains a Pavement Condition Survey (PCS) database of cracking, rutting, and ride information that is updated annually. Crack rating is an important parameter used by FDOT for making maintenance decisions and budget appropriations. By establishing a crack rating threshold below which traveler comfort is not assured, authorities can screen the pavement sections in need of Maintenance and Rehabilitation (M&R). Hence, accurate and reliable prediction of crack thresholds is essential to optimize the rehabilitation budget and manpower. Transition Probability Matrices (TPMs) can be utilized to accurately predict the deterioration of crack ratings leading to the threshold. Such TPMs are usually developed from historical data or from the opinions of expert maintenance engineers. When historical data are used to develop TPMs, deterioration trends have been applied indiscriminately, i.e. with no discrimination made between pavements that degrade at different rates. A more discriminatory method is used in this thesis to develop TPMs, based on first classifying pavements into two groups: pavements with relatively high traffic, and pavements with a history of excessive degradation due to delayed rehabilitation. The new approach uses a multiple non-linear regression process to separately optimize TPMs for the two groups, selected by prior screening of the database. The developed TPMs are shown to have minimal prediction errors with respect to crack ratings in the database that were not used in the TPM formation.
It is concluded that the above two groups are statistically different from each other with respect to the rate of cracking. The observed significant differences in the deterioration trends would provide a valuable tool for the authorities in making critical network-level decisions. The same methodology can be applied in other transportation agencies based on the corresponding databases.
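The core of a TPM-based prediction of this kind is a Markov chain: next year's distribution of crack ratings is the current distribution multiplied by the transition matrix. A minimal sketch follows; the rating states and all transition probabilities are illustrative assumptions, not FDOT's fitted values.

```python
# Crack rating states from best (10) down to the action threshold (6).
STATES = [10, 9, 8, 7, 6]

# Row i gives P(next-year rating | this-year rating). Upper-triangular
# because ratings only hold or deteriorate between rehabilitations.
# All probabilities here are hypothetical, not fitted FDOT values.
TPM = [
    [0.80, 0.15, 0.05, 0.00, 0.00],
    [0.00, 0.75, 0.18, 0.07, 0.00],
    [0.00, 0.00, 0.70, 0.22, 0.08],
    [0.00, 0.00, 0.00, 0.65, 0.35],
    [0.00, 0.00, 0.00, 0.00, 1.00],  # threshold is absorbing until M&R
]

def step(dist, tpm):
    """One year of deterioration: multiply the rating distribution by the TPM."""
    n = len(dist)
    return [sum(dist[i] * tpm[i][j] for i in range(n)) for j in range(n)]

def predict(dist, tpm, years):
    """Propagate a rating distribution `years` steps ahead."""
    for _ in range(years):
        dist = step(dist, tpm)
    return dist

# A pavement group that starts entirely at rating 10:
d5 = predict([1.0, 0.0, 0.0, 0.0, 0.0], TPM, 5)
share_needing_mr = d5[STATES.index(6)]  # mass at or below the threshold
```

Fitting a separate matrix per pavement group, as the thesis does, amounts to estimating a different `TPM` for each screened subset of the database.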
APA, Harvard, Vancouver, ISO, and other styles
44

Sudol, Alicia. "A methodology for modeling the verification, validation, and testing process for launch vehicles." Diss., Georgia Institute of Technology, 2015. http://hdl.handle.net/1853/54429.

Full text
Abstract:
Completing the development process and getting to first flight has become a difficult hurdle for launch vehicles. Program cancellations in the last 30 years were largely due to cost overruns and schedule slips during the design, development, testing and evaluation (DDT&E) process. Unplanned rework cycles that occur during verification, validation, and testing (VVT) phases of development contribute significantly to these overruns, accounting for up to 75% of development cost. Current industry standard VVT planning is largely subjective with no method for evaluating the impact of rework. The goal of this research is to formulate and implement a method that will quantitatively capture the impact of unplanned rework by assessing the reliability, cost, schedule, and risk of VVT activities. First, the fidelity level of each test is defined and the probability of rework between activities is modeled using a dependency structure matrix. Then, a discrete event simulation projects the occurrence of rework cycles and evaluates the impact on reliability, cost, and schedule for a set of VVT activities. Finally, a quadratic risk impact function is used to calculate the risk level of the VVT strategy based on the resulting output distributions. This method is applied to alternative VVT strategies for the Space Shuttle Main Engine to demonstrate how the impact of rework can be mitigated, using the actual test history as a baseline. Results indicate rework cost to be the primary driver in overall project risk, and yield interesting observations regarding the trade-off between the upfront cost of testing and the associated cost of rework. Ultimately, this final application problem demonstrates the merits of this methodology in evaluating VVT strategies and providing a risk-informed decision making framework for the verification, validation, and testing process of launch vehicle systems.
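The chain of steps the abstract describes (rework probabilities, simulated rework cycles, a quadratic risk function on the cost outcome) can be sketched as a small Monte Carlo model. The activity names, costs, rework probabilities, and budget below are invented for illustration; in the thesis, rework probabilities come from a dependency structure matrix rather than per-activity constants.

```python
import random

# (name, base_cost, rework_probability) — hypothetical VVT activities.
ACTIVITIES = [
    ("component_test", 10.0, 0.10),
    ("subsystem_test", 25.0, 0.20),
    ("integrated_hot_fire", 60.0, 0.15),
]

def simulate_campaign(rng):
    """Run one VVT campaign; a failed activity repeats until it passes."""
    total = 0.0
    for _name, cost, p_rework in ACTIVITIES:
        total += cost
        while rng.random() < p_rework:  # unplanned rework cycle
            total += cost
    return total

def risk(cost, budget=110.0):
    """Quadratic risk impact: penalize overruns by the square of the overrun."""
    return max(0.0, cost - budget) ** 2

rng = random.Random(42)
costs = [simulate_campaign(rng) for _ in range(10_000)]
expected_cost = sum(costs) / len(costs)
expected_risk = sum(risk(c) for c in costs) / len(costs)
```

Comparing `expected_risk` across alternative `ACTIVITIES` lists is the flavor of trade-off the thesis evaluates: more upfront testing raises the guaranteed base cost but shrinks the rework tail.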
APA, Harvard, Vancouver, ISO, and other styles
45

Peters, Richard G. "Judgments of academic achievement by teachers and standardized, norm-referenced tests revisited : an issue of educational and political policy." Virtual Press, 1991. http://liblink.bsu.edu/uhtbin/catkey/832995.

Full text
Abstract:
The purpose of this study was to investigate the degree of concurrence between teachers' judgments of the academic achievement of students and the results of standardized, norm-referenced achievement tests. Although this issue had been addressed before, results reported in the literature lacked sensitivity to the informational needs of educational policy makers and were obscured by significant differences in research design and analytical techniques. This study attempted to address the potential moderating effects of teachers' pre-established notions of students' knowledge, academic subject area, grade level, and student gender on the agreement between teachers' judgments of student achievement and test results, while focusing on the ever-increasing use of test scores to make decisions regarding student readiness for promotion/graduation and overall school accountability. Approximately 670 teachers were asked to rate their students as "not ready to succeed at the next grade level without remedial assistance" (non-masters) or "ready to succeed without additional instruction or intervention" (masters). Ratings were obtained in both English/language arts and mathematics for 15,935 students in grades 1, 2, 3, 6, and 8. The sample was representative of the demographics of the state of Indiana. While statistical tests of significance were performed where appropriate, this study focused on effect size as the final determinant of "educational significance." Analyses revealed no practical reason to believe that teachers' judgments were influenced by their initial ratings of students as masters or non-masters, student gender, grade level, or subject matter. On average, teachers' mastery/non-mastery ratings were found to agree with "cutscores" established through discriminant analysis in about 78% of the cases.
These results were seen as encouraging, in that test results could be used to support teacher judgment, which seemed unaffected by moderating variables, while not offering information completely redundant with pre-existing teacher knowledge of student achievement.
Department of Educational Psychology
APA, Harvard, Vancouver, ISO, and other styles
46

Berkman, Ali Emre. "A Sampling Methodology For Usability Testing Of Consumer Products Considering Individual Differences." Phd thesis, METU, 2010. http://etd.lib.metu.edu.tr/upload/12612188/index.pdf.

Full text
Abstract:
The aim of the study was to discuss and identify the individual differences that influence user performance during usability tests of consumer products, differences known to prevent researchers from conducting systematic studies. The rationale behind the study was to develop a sampling tool that treats experiential factors as a variable rather than a source of error. The study made it possible to define and elaborate on the constructs of general interaction expertise (GIE) and general interaction self-efficacy (GISE), and to devise a measurement scheme based on performance observation and attitude measurement. Both perspectives were evaluated with preliminary validity studies, and it was possible to provide evidence of the predictive validity of the tool developed. Furthermore, opportunities for utilizing the results in design and qualitative research settings were also explored.
APA, Harvard, Vancouver, ISO, and other styles
47

MUNIZ, EUDES SIQUEIRA. "DEVELOPMENT OF EQUIPMENT AND TESTING METHODOLOGY TO EVALUATE ROCK-DRILLING FLUID INTERACTION." PONTIFÍCIA UNIVERSIDADE CATÓLICA DO RIO DE JANEIRO, 2003. http://www.maxwell.vrac.puc-rio.br/Busca_etds.php?strSecao=resultado&nrSeq=4013@1.

Full text
Abstract:
AGÊNCIA NACIONAL DE PETRÓLEO
The drilling of oil wells through shales, which constitute the majority of rocks in the stratigraphic column, may present instability problems due to physico-chemicals interactions between the drilling fluids and these rocks. The costs associated to the solution of these problems are very high and, depending upon the intensity of these problems, wells can be lost. In this thesis, a new equipment was developed which is capable of reapplying part of the stresses that were acting upon the rock sample. A testing methodology to evaluate rock-fluid interaction mechanisms and to determine the mass transport parameters, needed for wellbore stability analyses, is proposed. Specifically, parameters that describe the transport of water and ions due to hydraulic and chemicals gradients are determined. The knowledge about these parameters is instrumental to understand the efficiency of the drilling fluid in controlling instabilities during drilling. Tests carried out in two shales from offshore platforms using different fluids demonstrated the efficiency of equipment and of the testing methodology. The transport parameters obtained are consistent with values obtained elsewhere.
APA, Harvard, Vancouver, ISO, and other styles
48

Hofmann, Annika Luisa. "Case study: DBRS sovereign rating of Portugal - analysis of rating methodology and rating decisions." Master's thesis, 2017. http://hdl.handle.net/10362/26843.

Full text
Abstract:
This paper analyzes and assesses the DBRS sovereign credit rating methodology and its rating decisions on Portugal. A replicated rating model for Portugal makes it possible to assess the DBRS rating methodology and to identify country-specific risk factors. An OLS regression compares the rating effects of ten fundamental variables across S&P, Moody's, Fitch Ratings, and DBRS. Further, a rating scale model fractionally disentangles DBRS rating grades into their subjective and objective rating components. Both qualitative and empirical findings attest to a comparably lenient DBRS rating behavior on Portugal, in comparison both to other rating agencies and within DBRS's own cross-country rating decisions.
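The OLS comparison the abstract describes can be illustrated with a toy regression of numeric rating grades on a single fundamental; the data points and variable choice below are invented, and a real replication would regress the agencies' published ratings on all ten fundamentals.

```python
# Toy data: (debt-to-GDP ratio, numeric rating grade on a 1-20 scale).
# Values are invented for illustration only.
data = [(60, 17), (80, 14), (100, 11), (120, 9), (130, 8)]

n = len(data)
mean_x = sum(x for x, _ in data) / n
mean_y = sum(y for _, y in data) / n

# Closed-form simple OLS: slope = cov(x, y) / var(x).
slope = sum((x - mean_x) * (y - mean_y) for x, y in data) / \
        sum((x - mean_x) ** 2 for x, _ in data)
intercept = mean_y - slope * mean_x

# A negative slope means higher debt loads map to lower rating grades.
predicted = intercept + slope * 110
```

Fitting the same specification on each agency's ratings and comparing the coefficients is, in miniature, how rating leniency across agencies can be made visible.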
APA, Harvard, Vancouver, ISO, and other styles
49

Nimchuk, Arlen M. "Testing for preference of rating scale format." 1993. http://hdl.handle.net/1993/17752.

Full text
APA, Harvard, Vancouver, ISO, and other styles
50

Hsu, Tsan-Sen, and 許燦森. "A Concurrent NoC Testing Methodology." Thesis, 2012. http://ndltd.ncl.edu.tw/handle/60725130685940331994.

Full text
Abstract:
Master's
National Taiwan University
Graduate Institute of Electrical Engineering
100
As the complexity of VLSI designs scales, traditional SoC design methods become impractical. Instead, the network-on-chip (NoC), a new SoC paradigm formally proposed in 2002, offers greater flexibility, scalability, reliability, and fault tolerance. A NoC mimics the communication mechanisms of computer networks: it replaces the bus architecture with many simple routers, which are responsible for communication between modules. However, NoC testing is a big challenge. First, because every router in a NoC is connected only to its neighboring routers, injecting test patterns is difficult due to the lack of test access points. Second, a traditional scan-based method suffers from long testing times as the number of flip-flops increases. Third, interconnect faults between neighboring routers must also be considered, because open/short faults are likely to exist on these nets. In this thesis, we insert specific boundary scan cells into the NoC to improve testability. In addition, FIFO faults can be tested completely by functional testing. Experimental results show that the fault coverage of the remaining logic elements is 99.62% for the stuck-at fault model and 95.78% for the transition fault model.
APA, Harvard, Vancouver, ISO, and other styles