
Dissertations / Theses on the topic 'ON-TO-METHODOLOGY'

Consult the top 50 dissertations / theses for your research on the topic 'ON-TO-METHODOLOGY.'

1

Madau, Maxime. "A methodology to localise EMFI areas on Microcontrollers." Thesis, Montpellier, 2019. http://www.theses.fr/2019MONTS045.

Abstract:
Today, the security of embedded devices is in the limelight, owing to the increasing market share of both IoT and automotive systems. To guarantee a given level of security, such embedded components must undergo penetration testing, either to obtain certifications needed to address certain markets or simply to avoid tarnishing the firm's name in case of a vulnerability. Among the various attack paths, one of the most threatening is the deliberate violation of a circuit's operating conditions in order to induce a fault. These faults are then used for privilege escalation or combined with statistical tools to recover cryptographic keys. This thesis focuses on the use of electromagnetic fields to generate such faults, this medium offering the best trade-off between cost and accuracy. The efficiency of this family of attacks has been demonstrated many times in the literature. Yet fault injection techniques share a common problem, whose root cause is the number of parameters an evaluator has to tune to obtain a fault. Since evaluations are bounded in time, exhaustive search is not an option, and it is therefore hard to state whether a target is protected against fault injection. Metrics or strategies must be defined to get the most out of state-of-the-art fault injection methods. This thesis is a first step towards defining such metrics and proposes to tackle the spatial complexity of EM fault injection: depending on the attack scenario, we develop metrics and strategies, relying on both experimentation and the state of the art, whose aim is to reduce the area of the device under test exposed to electromagnetic emanation to the few positions most likely to be faulted, given their physical properties. In a first part, such a criterion is constructed, based on a simplified model of the coupling between the injection probe and the circuit together with the most recent fault model; its limits are then analysed and a refinement is proposed. Fault injection can also be used indirectly, to defeat the countermeasures that disable certain attack vectors, and most of these countermeasures have in common the use of a true random number generator. The robustness of an up-to-date true random number generator against electromagnetic perturbation is therefore evaluated, and from this analysis an attack path using electromagnetic waves is derived, identifying which parts of the generator are the most relevant targets.
2

Gurhan, Ozkan. "A methodology to measure the metal erosion on recovered armatures." Thesis, Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 2001. http://handle.dtic.mil/100.2/ADA401404.

Abstract:
Thesis (M.S. in Physics)--Naval Postgraduate School, December 2001. Thesis advisors: William B. Maier II and Francis Stefani. Includes bibliographical references (p. 93). Also available in print.
3

Carlshamre, Pär. "A usability perspective on requirements engineering : from methodology to product development." Doctoral thesis, Linköpings universitet, MDALAB - Human Computer Interfaces, 2001. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-4976.

Abstract:
Usability is one of the most important aspects of software. A multitude of methods and techniques intended to support the development of usable systems has been provided, but the impact on industrial software development has been limited. One of the reasons for this limited success is the gap between traditional academic theory generation and commercial practice. Another reason is the gap between usability engineering and established requirements engineering practice. This thesis is based on empirical research and puts a usability focus on three important aspects of requirements engineering: elicitation, specification and release planning. There are two main themes of investigation. The first is concerned with the development and introduction of a usability-oriented method for elicitation and specification of requirements, with an explicit focus on utilizing the skills of technical communicators. This longitudinal, qualitative study, performed in an industrial setting in the first half of the nineties, provides ample evidence in favor of a closer collaboration between technical communicators and system developers. It also provides support for the benefits of a task-oriented approach to requirements elicitation. The results are also reflected upon in a retrospective paper, and the experiences point in the direction of an increased focus on the specification part, in order to bridge the gap between usability engineering and established requirements management practice. The second represents a usability-oriented approach to understanding and supporting release planning in software product development. Release planning is an increasingly important part of requirements engineering, and it is complicated by intricate dependencies between requirements. A survey performed at five different companies gave an understanding of the nature and frequency of these interdependencies. This knowledge was then turned into the design and implementation of a support tool, with the purpose of provoking a deeper understanding of the release planning task. This was done through a series of cooperative evaluation sessions with release planning experts. The results indicate that, although the tool was considered useful by the experts, the initial understanding of the task was overly simplistic. As a result, a number of design implications are proposed.
On the day of the public defence the status of article VI was: Submitted.
4

Carlshamre, Pär. "A usability perspective on requirements engineering : from methodology to product development /." Linköping : Univ, 2001. http://www.ep.liu.se/diss/science_technology/07/26/index.html.

5

Cornel, Caralea May. "A Methodology to Measure the Impact of Diversity on Cybersecurity Team Effectiveness." BYU ScholarsArchive, 2019. https://scholarsarchive.byu.edu/etd/8594.

Abstract:
In recent years, the definition of 'cybersecurity professional' has been broadened, allowing more individuals, particularly women, to be included. Depending on the definition used, women currently comprise between 11% and 25% of the cybersecurity workforce. While multiple studies have indicated the benefits of diverse teams, research in the cybersecurity area is lacking. This research proposes a framework that uses a modified escape-the-room gamified scenario to measure the effectiveness of cybersecurity teams in technical problem-solving. The framework presents two routes, incident response and penetration testing, from which participants can choose. In a preliminary study, this framework is used to show that the combination of gender diversity and prior cybersecurity experience and/or knowledge, particularly in women, is significant in reducing the time taken to solve cybersecurity tasks in the incident response and penetration testing domains. In conclusion, opportunities for extending this research into a large-scale study are discussed, along with other applications of cybersecurity escape-rooms.
6

Oliveira, Rodrigues Antonio Wendell de. "A methodology to develop high performance applications on GPGPU architectures : application to simulation of electrical machines." Thesis, Lille 1, 2012. http://www.theses.fr/2012LIL10029/document.

Abstract:
Complex physical phenomena can be simulated numerically by mathematical techniques, usually based on discretizing the partial differential equations that govern them. Such simulations can lead to the solution of very large systems, and parallelizing the numerical simulation codes, i.e., adapting them to parallel processing architectures, is necessary to keep execution times reasonable. Parallelism is now standard in processor architectures, and graphics cards are used for general-purpose computation ("General-Purpose GPU"), with an excellent performance/price ratio as the clear benefit. This thesis addresses the design of high-performance applications for the simulation of electrical machines. We provide a methodology based on Model-Driven Engineering (MDE) to model an application and its execution architecture in order to generate OpenCL code. Our goal is to help specialists in numerical simulation algorithms create code that runs efficiently on GPGPU architectures. To this end, we provide a model compilation chain that takes into account several aspects of the OpenCL programming model, along with model transformations that address several levels of optimization based on the characteristics of the architecture. As an experimental validation, the methodology is applied to the creation of an application that solves a linear system arising from the Finite Element Method (FEM). In this case we show, among other things, the ability of the methodology to scale through a simple modification of the number of available GPU devices.
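The abstract does not say which solver the generated OpenCL code implements. As a hedged illustration only, the conjugate gradient iteration below (plain sequential Python/NumPy, not the thesis's generated code) is the kind of Krylov method typically used for the symmetric positive-definite systems the Finite Element Method yields; the matrix-vector product marked in the comments is the step a GPGPU version would offload to a kernel. All names are illustrative assumptions.

import numpy as np

def conjugate_gradient(A, b, tol=1e-8, max_iter=1000):
    # Solve A x = b for symmetric positive-definite A.
    # Sketch only: a GPGPU version offloads the matrix-vector
    # product and the dot products to OpenCL kernels.
    x = np.zeros_like(b)
    r = b - A @ x              # initial residual
    p = r.copy()               # initial search direction
    rs_old = r @ r
    for _ in range(max_iter):
        Ap = A @ p             # dominant cost; kernel candidate on GPU
        alpha = rs_old / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p   # update search direction
        rs_old = rs_new
    return x

# Tiny usage example on a 2x2 SPD system
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
print(conjugate_gradient(A, b))   # approx. [0.0909, 0.6364]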
7

Aguilar, Juan Pablo. "Experimental methodology to assess the effect of coatings on fiber properties using nanoindentation." Thesis, Georgia Institute of Technology, 2012. http://hdl.handle.net/1853/45781.

Abstract:
Current body armor technologies need further improvements in their design to help reduce combat injuries of military and law enforcement personnel. Kevlar-based body armor systems have good ballistic resistance up to a certain ballistic threat level due to limitations such as decreased mobility and increased weight [1,2]. Kevlar fibers have been modified in this work using a nano-scale boron carbide coating, and a marked increase in puncture resistance has been experimentally observed. It is hypothesized that this improvement is due to the enhancement of the mechanical properties of the individual Kevlar fibers by the nano-scale coatings. This study presents a comprehensive experimental investigation of individual Kevlar fibers based on nanoindentation to quantify the cause of the enhanced puncture resistance. The experimental setup was validated using copper wires with diameters of the same order of magnitude as Kevlar fibers. Results from nanoindentation did not show significant changes in the modulus or hardness of the Kevlar fibers. Scanning Electron Microscopy revealed that the coated fibers had a marked change in their surface morphology. The main finding of this work is that the boron carbide coating did not affect the properties of the individual fibers due to poor adhesion and non-uniformity. This implies that the observed enhancement in puncture resistance originates from the interaction between fibers due to the increase in roughness. The results are important in identifying further ways to enhance Kevlar puncture resistance by modifying the surface properties of fibers.
8

Vazquez, Job Andres. "A management methodology to control the impact of engineering changes on marine projects." Thesis, University of Newcastle Upon Tyne, 2009. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.556137.

Abstract:
Engineering changes are common in manufacturing industries and can arise at any point from the conception of a project until its culmination, ranging from design changes to production changes. In general, engineering changes are difficult to quantify; their management normally ends up in variation orders and is time consuming. In the marine industry they are widely recognised, but hardly researched or quantified. A new tool is therefore needed to understand not only the management of engineering changes, but also their impact on the project overall. The research consisted of case studies in which the proposed methodology was implemented in live projects. Integrating techniques from other industries that already manage engineering changes systematically provides an opportunity to quantify such changes through this methodology. The new methodology includes coverage, impact representation, a decision-making process, lean improvement, and analysis by experts. The impact representation and the integration of tracking mechanisms are key tools for researchers and practitioners making decisions on marine projects.
9

Anandarao, Sudhir. "Application of the risk assessment methodology to level cross accidents on JR East." Thesis, Massachusetts Institute of Technology, 1996. http://hdl.handle.net/1721.1/39055.

10

York, Richard H. "A new methodology to measure body/self-concept based on personal construct theory." Thesis, Boston University, 1987. https://hdl.handle.net/2144/38120.

Abstract:
Thesis (Ph.D.)--Boston University
The objective of this dissertation is to describe and test the reliability and validity of the Body/Self-Concept (BSC) Methodology, which measures body attitudes as consequences of body/self-concept. The BSC Methodology was constructed from insights from the debate about the self in American psychology, a debate that included philosophical, neurological, psychological, spiritual, and theological issues. These were integrated into Kelly's methodology, producing a psychotheological research perspective. This methodology consists of a research philosophy, theory and measures for body/self-concept, and statistical methods. The BSC Method comprises six techniques for collecting qualitative and quantitative data; data collection depended on a computer. The quantitative data are ratings of bipolar adjective pairs and a Q-sort of body items. The BSC Method was tested in a study with 40 subjects. The qualitative results included affective self-report data, and it was concluded that some of these results implied that the method pierced denial defense mechanisms. The quantitative results were highly valid and reliable for the attitude ratings, but less so for the Q-sort. It was concluded that there was sufficient reliability and validity to justify further development of the BSC Methodology. The next step is to write a computer program for data collection and analysis.
11

Cheah, Charmaine Yi Ting. "Development of a methodology to quantify installation damage on geotextile for coastal application." Thesis, Queensland University of Technology, 2017. https://eprints.qut.edu.au/112464/1/Charmaine%20Yi%20Ting_Cheah_Thesis.pdf.

Abstract:
This study was a step forward in enhancing knowledge on quantifying installation damage to geotextiles' properties for coastal protection structures. It examines geotextile properties using the Drop Rock Test developed here, which replicates construction stress on geotextiles during the installation process. The thesis investigated the influence of construction stress on geotextiles' robustness, mechanical strength, physical deformation, and filtration properties, as well as the influence of subgrade characteristics (i.e., moisture condition) on geotextiles' robustness during installation. Design charts to predict the robustness of geotextiles during installation were developed, allowing engineers and designers to select the appropriate geotextile to minimize the risk of damage during the installation process.
12

Sgaravatti, Daniele. "Down to earth philosophy : an anti-exceptionalist essay on thought experiments and philosophical methodology." Thesis, University of St Andrews, 2012. http://hdl.handle.net/10023/3228.

Abstract:
In the first part of the dissertation, chapters 1 to 3, I criticize several views which tend to set philosophy apart from other cognitive achievements. I argue against the popular views that 1) intuitions, as a sui generis mental state, are crucially involved in philosophical methodology; 2) philosophy requires engagement in conceptual analysis, understood as the activity of considering thought experiments with the aim of throwing light on the nature of our concepts; and 3) much philosophical knowledge is a priori. I do not claim to have a proof that nothing in the vicinity of these views is correct; such a proof might well be impossible to give. However, I consider several versions, usually prominent ones, of each of the views, and I show those versions to be defective. Quite often, moreover, different versions of the same worry apply to different versions of the same theory. In the fourth chapter I discuss the epistemology of the judgements involved in philosophical thought experiments, arguing that their justification depends on their being the product of a competence in applying the concepts involved, a competence which goes beyond the possession of the concepts. I then offer, drawing from empirical psychology, a sketch of the form this cognitive competence could take. The overall picture squares well with the conclusions of the first part. In the last chapter I consider a challenge to the use of thought experiments in contemporary analytic philosophy coming from the 'experimental philosophy' movement. I argue that there is no way of individuating the class of hypothetical judgements under discussion which makes the challenge both interesting and sound. Moreover, I argue that there are reasons to think that philosophers possess some sort of expertise which sets them apart from non-philosophers in relevant ways.
13

Abu, Rub Faisal Asad Farid. "A business process improvement methodology based on process modelling, applied to the healthcare sector." Thesis, University of the West of England, Bristol, 2006. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.429689.

Abstract:
Process modelling can be used to provide a comprehensive understanding of business activities and functions and thence a base for detailed process analysis. Business process improvement refers to a family of approaches which aim to help an organisation adjust its processes to fit a dynamic or complex business environment, particularly so as to take advantage of rapid advances in information technologies. However, most business process improvement methodologies do not make significant use of process modelling to guide the evaluation and improvement of business processes. The current research uses process modelling techniques in a systematic and generalisable manner to gain deeper understanding of processes in a particular complex case. By analysis and further probing of the process models, it then seeks to develop a practical methodology for business improvement which will be applicable not only in the case in question but also more broadly. The case explored in detail in this work is the process of Cancer Care and Registration (CCR) in Jordan. This is introduced after a discussion of business processes in general, business processes in healthcare, and methods of business process modelling. There is some comparative treatment of CCR processes in the UK. The main method used for modelling existing processes in the Jordanian CCR case is Role Activity Diagramming (RAD). Models for six major sub-processes were prepared. The models thus produced were validated in discussion with participants. They were then subjected to an extensive analysis, with the objective of discovering whether the processes might be improved. One form of analysis examined the structural properties of the models, to discover for instance how closely coupled different roles were. A second, model-led, form of analysis methodically queried, through interview or questionnaire, each activity or interaction in the models, to see how well it was working, in its particular context, in terms of general criteria such as efficiency or reliability. Thirdly, the notion of non-functional requirements (NFRs), borrowed from software engineering, was used to derive detailed NFRs from high-level business objectives, as a basis for a systematic examination of broad quality levels achieved in existing processes. These complementary analyses, supported by further validation with participants, then provided the base for a remodelling of the processes with the goal of business improvement. The redesign suggestions included indications of where information technology might be introduced or strengthened with beneficial effect. The methods of detailed modelling, systematic analysis, and redesign for business improvement are, while thoroughly applied to the case under investigation, sufficiently abstract to be proposed as a general methodology for the design of business process improvements. The key features of the methodology are that it is grounded in process modelling and brings together functional, non-functional and structural process analyses.
14

Brumbaugh, Scott J. "Development of a Methodology to Measure Aerodynamic Forces on Pin Fins in Channel Flow." Thesis, Virginia Tech, 2006. http://hdl.handle.net/10919/30871.

Abstract:
The desire for smaller, faster, and more efficient products places a strain on thermal management in components ranging from gas turbine blades to computers. Heat exchangers that utilize internal cooling flows have shown promise in both of these industries. Although pin fins are often placed in the cooling channels to augment heat transfer, their addition comes at the expense of increased pressure drop. Consequently, the pin fin geometry must be judiciously chosen to achieve the desired heat transfer rate while minimizing the pressure drop and accompanying pumping requirements. This project culminates in the construction of a new test facility and the development of a unique force measurement methodology. Direct force measurement is achieved with a cantilever beam force sensor that uses sensitive piezoresistive strain gauges to simultaneously measure aerodynamic lift and drag forces on a pin fin. After eliminating the detrimental environmental influences, forces as small as one-tenth the weight of a paper clip are successfully measured. Although the drag of an infinitely long cylinder in uniform cross flow is well documented, the literature does not discuss the aerodynamic forces on a cylinder with an aspect ratio of unity in channel flow. Measured results indicate that the drag coefficient of a cylindrical pin in a single row array is greater than the drag coefficient of an infinite cylinder in cross flow. This phenomenon is believed to be caused by an augmentation of viscous drag on the pin fin induced by the increased viscous effects inherent in channel flow.
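For reference, the drag coefficient that such measurements report is conventionally defined as

\[ C_D = \frac{F_D}{\tfrac{1}{2}\,\rho\,U^2\,A}, \qquad A = D\,H, \]

where \(F_D\) is the measured drag force, \(\rho\) the fluid density, \(U\) a reference velocity (for channel flow, typically the mean channel velocity), and \(A\) the frontal area of a cylindrical pin of diameter \(D\) and height \(H\). Which reference velocity the thesis adopts is not stated in the abstract.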
Master of Science
15

Wint, F. E. "'Am I bothered?' : using Q methodology to explore what bothers young people on Facebook." Thesis, University of Sheffield, 2013. http://etheses.whiterose.ac.uk/4493/.

Abstract:
Existing research into cyberbullying has tended to utilise surveys in order to understand the extent to which cyberbullying is experienced by young people in society. However, there has been little homogeneity between researchers when attempting to define cyberbullying, and consequently there is disparity in how it has been operationalised. As well as this, recycling of the term 'bullying' brings with it certain presumptions and qualifications which may not be apt for social interactions in the new and ever-evolving virtual world. Furthermore, it implicitly assumes that cyberbullying will bother young people, whilst simultaneously failing to acknowledge the situations which may bother young people but which do not constitute cyberbullying. In the present study the word 'cyberbullying' was thus omitted from use with participants in an attempt to circumvent the 'trouble' inherent in the term. The aim of this study was to gain an understanding of what bothers young people when on Facebook. A research methodology was sought which minimised the potential for researcher bias and maximised the opportunity for young people to give their personal account. Accordingly, Q methodology was employed to explore how 41 young people ranked 54 statements depicting hypothetical problem scenarios on Facebook. Participants sorted the statements according to personal significance, from most agree (would bother) to most disagree (would not bother). The overall configuration of statements was subjected to factor analysis, from which a four-factor solution was identified: 'I want to protect others'; 'I am worried about the dangers on Facebook'; 'I know who I am and what I'm doing'; and 'I don't want any trouble'. The emergent social viewpoints were discussed further with four young people, and an understanding was gained of how they perceived Facebook, what action they would take if they experienced something negative on Facebook, and what role they felt school should play in such situations. The findings were discussed in relation to existing literature, and the potential roles of schools and Educational Psychologists were considered. Limitations were acknowledged and recommendations for further research suggested.
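The statistical core of Q methodology is by-person factor analysis: participants, not statements, are correlated and then factored. As a minimal sketch of that step, assuming a NumPy array sorts of shape (participants, statements) holding the Q-sort values (all names here are illustrative, not the study's own code):

import numpy as np

def q_factor_loadings(sorts, n_factors=4):
    # By-person factor analysis for Q methodology (PCA variant).
    # sorts: array of shape (n_participants, n_statements),
    # each row one participant's Q-sort. Returns unrotated
    # loadings of each participant on each factor.
    corr = np.corrcoef(sorts)                 # person-by-person correlation
    eigvals, eigvecs = np.linalg.eigh(corr)   # symmetric eigendecomposition
    order = np.argsort(eigvals)[::-1][:n_factors]
    # Scale eigenvectors by sqrt of eigenvalues to obtain loadings
    return eigvecs[:, order] * np.sqrt(eigvals[order])

# Toy example: 6 participants ranking 10 statements on a -3..+3 scale
rng = np.random.default_rng(0)
sorts = rng.integers(-3, 4, size=(6, 10))
print(q_factor_loadings(sorts, n_factors=2).shape)   # (6, 2)

In practice a rotation of the loadings (e.g., varimax) usually follows before interpretation; that step is omitted here.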
16

Wallskog, Pappas Alexis. "Migration of Legacy Applications to the Cloud- A Review on Methodology and Tools for Migration to the Cloud." Thesis, Umeå universitet, Institutionen för datavetenskap, 2014. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-96673.

Abstract:
Many organizations have legacy applications and strive to modernise them in order to react to changes and adapt to the new environment, the cloud. The enticements are considerable, but risks are lurking as well. How to migrate a legacy application to the cloud remains an unanswered question for many organizations. We look at how research has answered this question and at the methods and tools it provides. The research only partially answers the question: the methods and tools are still quite granular, not very automated, and highly dependent on the type of legacy application and on the aim of the end result.
17

Mercado, Angel, Hervin Vargas, Edgardo Carvallo, and Carlos Raymundo. "Proposal to optimize the flow of preparation and delivery of vehicles to dealers based on the Lean methodology." Latin American and Caribbean Consortium of Engineering Institutions, 2019. http://hdl.handle.net/10757/656262.

Abstract:
The company studied currently fails, month after month, to comply with its vehicle delivery policy, which requires that at least 92% of units reach the points of sale. In the last quarter of 2017 and the first half of 2018, however, only 82% to 90% was achieved. The missing 5%-10% each month directly affects the company's profitability: every car that does not arrive increases storage costs and generates customer dissatisfaction. The problem has three main causes: the time taken to install radios, the time taken to install alarms, and vehicles left unattended in the damage assessment area. The last is the root cause, since it creates a bottleneck and the area accounts for about 20% of the case study's monthly sales. To reduce this bottleneck and increase the area's capacity, damage is assessed in detail and two evaluation lines are set up. This form of triage ensures standardized work, whereas previously a group of vehicles with different types of damage waited in the same operating line. As a result, on-time vehicle delivery improves to 95%, and the times of the radio installation, alarm installation, and damage assessment areas are reduced by 42.85%, 51.42%, and 50%, respectively.
18

Christein, John Paul. "A design methodology for welded structures to be used on U.S. Navy surface combatant ships." Master's thesis, This resource online, 1990. http://scholar.lib.vt.edu/theses/available/etd-02022010-020042/.

19

Siatkowski, Marcin [Verfasser]. "Approaches to the Analysis of Proteomics and Transcriptomics Data based on Statistical Methodology / Marcin Siatkowski." Greifswald : Universitätsbibliothek Greifswald, 2014. http://d-nb.info/1050274954/34.

20

Gasper, D. "Policy analysis and evaluation : An essay on methodology and education with reference to development studies." Thesis, University of East Anglia, 1986. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.372557.

21

Ma, Shuhui. "A methodology to predict the effects of quench rates on mechanical properties of cast aluminum alloys." Link to electronic dissertation, 2006. http://www.wpi.edu/Pubs/ETD/Available/etd-050106-174639/.

Abstract:
Dissertation (Ph.D.)--Worcester Polytechnic Institute.
Keywords: Time-Temperature-Property curve; Jominy End Quench; ANOVA analysis; Quench Factor Analysis; Taguchi design; Polymer quench; Cast Al-Si-Mg alloys; Quenching; Heat treatment. Includes bibliographical references (p. 115-117).
22

Spirkin, Anton M. "A three-dimensional particle-in-cell methodology on unstructured Voronoi grids with applications to plasma microdevices." Link to electronic dissertation, 2006. http://www.wpi.edu/Pubs/ETD/Available/etd-050506-145257/.

23

Tank, Rajul. "Methodology to determine performance of a group technology design cell on the basis of performance measures." Thesis, This resource online, 1991. http://scholar.lib.vt.edu/theses/available/etd-10242009-020244/.

24

Mejia, Katherine, Henry Quintanilla, Carlos Cespedes, Jose Rojas, and Carlos Raymundo. "Application of a management model based on DMAIC methodology to an MSE in the personal beauty sector to increase profitability." Springer Verlag, 2020. http://hdl.handle.net/10757/656357.

Abstract:
The full text of this work is not available in the UPC Academic Repository owing to restrictions imposed by the publisher.
Micro and Small Enterprises (MSEs) are the primary employment driving force in Peru; however, their low level of management does not allow for their long-term sustainable development and has led such companies to incur unnecessary costs and expenses. In this study, we deal with the case of a services microenterprise in the personal beauty sector, which applied a management model based on the define, measure, analyze, improve, and control (DMAIC) method. This methodology helps to improve processes through the philosophy of continuous improvement, with the aim of proposing improvements to increase profitability and growth. Moreover, the integration of various techniques and tools at each stage of the DMAIC methodology was proposed, with a focus on human management, inventory, and operational service processes, which have the greatest impact on most companies of this type.
25

Wang, Meng. "A spatial statistical methodology to assess the contribution of land use to soil contamination and its influence on human health." Thesis, Imperial College London, 2011. http://hdl.handle.net/10044/1/9279.

Abstract:
Soil is a crucial component of rural and urban environments, and soil quality in both can be influenced significantly by land management. Heavy metals occur naturally in soils in small amounts. However, occasionally high heavy metal loads may originate from parent rocks through weathering, volcanic eruptions, and forest fires, or are attributable to human activities. Potentially contaminated soils may occur at old landfill sites, particularly those that accepted industrial wastes; fields that had past applications of waste water or municipal sludge; areas in or around mining waste piles and tailings; industrial areas where chemicals may have been dumped on the ground; or in areas downwind from industrial sites. Excess heavy metal accumulation in soils is toxic to humans and animals. Exposure to heavy metals is normally chronic; acute poisoning through ingestion or dermal contact is rare, but possible. Cadmium (Cd) and lead (Pb) have been drawing a lot of attention from geochemists and environmental scientists owing to the greater understanding of their toxicological effects on agriculture, ecosystems and human health. Chronic exposure to Cd is known to adversely affect kidney, liver, and gastrointestinal function, while Pb is well known to affect the nervous system. One interesting question for governments, regulators and the community as a whole is to be able to attribute the sources of heavy metals to either natural sources or land management practices, which may span decades or even centuries considering that heavy metals tend to accumulate. Redevelopment and reuse of these soils may pose a threat to human health through uptake of contaminants via ingestion, inhalation, and dermal contact. From the human health protection point of view it is important to assess whether the contamination present in soil reaches human receptors through intake/uptake, since plausible source-pathway-receptor linkages exist; in that case, health effects may be related to the source and pathway of heavy metals. In addressing these topics, the objectives of this research were to develop spatial statistical methodologies that can be used: i) to correlate geochemical data with historic land uses and geological data in order to evaluate the influence of historic land use on soil contaminant levels; ii) to correlate modelled contaminant levels with cancer incidence data in order to ascertain whether contaminated land may influence human health, and to assess the strength of any putative relationships. The correlation between heavy metals in soils and their origins (geological and anthropogenic sources) was investigated and quantitatively analysed through regression modelling, while geostatistical methods were used to analyse the spatial aspects of soil contamination autocorrelation. A probabilistic modelling method was developed to assess whether hot-spot areas within a study region can be better defined using geological and historical land use data. The methodology developed includes indicator kriging, logistic regression and the Bayes theorem as its main building blocks. In order to assess the health risk attributable to soil contamination, spatial autocorrelation and data clustering analyses were employed on cancer incidence data in order to identify whether living in a contaminated area can be one of the factors that contribute to developing cancer. Assessment of the correlation between contamination levels and cancer incidence on different geospatial levels was carried out.
It was concluded that it is possible to identify and model the relative contribution of different land-use-based heavy metal sources to soil contamination. It was also shown that integration of spatial autocorrelation in the modelling has some advantages in terms of model fitting and prediction ability. Correlation between estimated soil contamination levels and cancer incidence data was not shown to be significant, and no spatial clustering was found on census geospatial levels.
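The abstract names indicator kriging, logistic regression and the Bayes theorem as the building blocks of the hot-spot model. As a hedged reading of how such pieces typically combine (the exact formulation is the thesis's own), the posterior probability that a location \(s\) exceeds a contamination threshold, given its land-use class \(L\), is

\[ P\big(C(s) \mid L\big) \;=\; \frac{P\big(L \mid C(s)\big)\,P\big(C(s)\big)}{P(L)}, \]

where the prior \(P(C(s))\) can be estimated by indicator kriging of the geochemical data, and the land-use terms by logistic regression on the historic land-use classes.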
26

Menousek, Dorothy. "A communication methodology focused on ecological unitizing designed to enable upper elementary school students to generate their own appropriate learning goals /." Thesis, Connect to this title online; UW restricted, 1997. http://hdl.handle.net/1773/6191.

27

Rabah, Mhamad Hassanein. "Design methodology of antennas based on metamaterials and the theory of characteristic modes : application to cognitive radio." Thesis, Lille 1, 2015. http://www.theses.fr/2015LIL10141.

Abstract:
The rise of wireless communication systems and the strong demand for high-bit-rate links have driven research into new communication systems. With this diversity of wireless systems, flexibility to operate across different standards is strongly needed; no single technology or standard can replace all other radio technologies while satisfying every service and usage need. Cognitive radio (CR) is the future system that can offer this flexibility, but the new features of CRs pose many challenges for their antennas. Miniaturization, isolation and bandwidth improvement are all real needs and genuine challenges, especially when the geometry of the antenna becomes more complex in order to fit the terminal chassis; miniaturization and bandwidth are, moreover, limited by physical constraints and by the characteristics of the constituent materials. The use of metamaterials (MTM), artificial electromagnetic composites, has been introduced to overcome these physical limitations, yet the analysis of MTMs in the presence of radiating elements such as antennas proves to be a challenge, especially when the antennas take an arbitrary shape. In this thesis, a new approach to address these challenges is proposed, based on a modal concept using the theory of characteristic modes (TCM). It proves useful for the analysis and design of electrically small antennas (ESAs), metamaterial-inspired antennas and metamaterial-based antennas. Furthermore, the same approach is used to evaluate antenna performance as a function of geometric parameters, without assuming a predefined excitation, and when the antenna is surrounded by complex artificial materials, by proposing closed formulas for the quality factor. This is a great advantage for shape-optimisation techniques in the antenna industry. As a proof of concept, an extremely wide-band sensing antenna for underlay CR (spectrum sensing) is developed using the proposed approach, so as to obtain a stable radiation pattern over the whole frequency band and high efficiency in the electrically small regime. Experimental validation of the performance of all the presented designs is also provided.
28

Possik, Jalal. "Contribution to a Methodology and a Co-Simulation Framework assessing the impact of Lean on Manufacturing Performance." Thesis, Bordeaux, 2019. http://www.theses.fr/2019BORD0390.

Abstract:
Aside from the human and managerial skills necessary to propel any business, the right Lean deployment can play a big role in reducing waste and maximizing efficiency. Capturing these benefits is highly dependent on adequate integration of Lean techniques, and one of the major hurdles companies face is the difficulty of choosing the Lean tools that best fit their context and are best tailored to their objectives. In this study, we propose an HLA-based co-simulation framework with a Java-based digital platform that allows different federates (discrete-event simulations), each representing an operational Lean tool, to run simultaneously in parallel. The time management mechanisms of HLA are required to regulate the advancement of the federates during the simulation run. An example of an aeronautics company is used to demonstrate the usefulness of this co-simulation framework: six Lean configuration models are investigated under market fluctuation, demand diversification and resource uncertainty, and compared with the actual model simulated as a Lean-free scenario.
29

Chamberlin, Ryan Earl. "A three-dimensional direct simulation Monte Carlo methodology on unstructured Delaunay grids with applications to micro and nanoflows." Worcester, Mass. : Worcester Polytechnic Institute, 2007. http://www.wpi.edu/Pubs/ETD/Available/etd-032907-092912/.

30

Zhang, Xiaowei. "A METHODOLOGY OF SPICE SIMULATION TO EXTRACT SRAM SETUP AND HOLD TIMING PARAMETERS BASED ON DFF DELAY DEGRADATION." UKnowledge, 2015. http://uknowledge.uky.edu/ece_etds/75.

Abstract:
SRAM is a significant component in high-speed computer design, serving mainly as high-speed storage elements such as register files in microprocessors, or as the interface, such as multi-level caches, between high-speed processing elements and low-speed peripherals. One way to design an SRAM is to use a commercial memory compiler, which can generate SRAM designs of different densities and speeds with single, dual or multiple ports to fulfill the design purpose. There is a discrepancy in SRAM timing parameters between SPICE simulation of the extracted layout netlist and the equation-based Liberty file (.lib) produced by a commercial memory compiler. The compiler takes spec values as input and uses them as starting points to generate the timing tables/matrices in the .lib. Large spec values are originally given to guarantee design correctness, but such values are usually too pessimistic compared with results from extracted-layout SPICE simulation, which serves as the "golden" reference; moreover, a .lib generated by this compiler carries no built-in margin information. A new methodology is proposed to obtain accurate spec values as input to the compiler, so that it generates more realistic matrices in the .lib, which benefits integration of the SRAM IP and timing analysis.
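The title's delay-degradation criterion is a standard flip-flop characterization idea: as the data edge moves toward the clock edge, clock-to-q delay grows, and the setup time is taken at the skew where that delay has degraded by a fixed fraction over its nominal value. A hedged sketch of the search that would drive such SPICE runs, with run_spice_clk2q a hypothetical harness (not a real tool's API) that launches one transient simulation and returns the measured clock-to-q delay:

def find_setup_time(run_spice_clk2q, t_min=-0.5e-9, t_max=0.5e-9,
                    degradation=0.10, tol=1e-12):
    # Bisection search for DFF setup time (sketch).
    # run_spice_clk2q(skew): runs one SPICE transient with the data
    # edge `skew` seconds before the clock edge and returns the
    # clock-to-q delay (float('inf') on capture failure).
    # Setup time = smallest skew where the delay stays within
    # (1 + degradation) of the nominal, far-from-clock delay.
    nominal = run_spice_clk2q(t_max)        # data edge far from clock
    limit = (1.0 + degradation) * nominal   # e.g., 10% degradation
    lo, hi = t_min, t_max                   # failing / passing skews
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if run_spice_clk2q(mid) <= limit:
            hi = mid                        # still passes: shrink skew
        else:
            lo = mid                        # degraded or failed: grow skew
    return hi

The same loop, run with the clock and data roles swapped, yields the hold time; the 10% threshold is a common convention, not necessarily the thesis's.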
31

Namaki, Araghi Sina. "A methodology for business process discovery and diagnosis based on indoor location data : Application to patient pathways improvement." Thesis, Ecole nationale des Mines d'Albi-Carmaux, 2019. http://www.theses.fr/2019EMAC0014.

Abstract:
Business processes are everywhere, and among them hospital processes are of vital importance: healthcare organizations invest huge efforts in keeping these processes under control, as the allowed margin of error is so slight. This research work develops a methodology to support the improvement of patient pathways inside healthcare organizations, using the indoor location data of patients. The methodology is called DIAG (Data state, Information state, Awareness, Governance) and is constructed from several functions, the most important being: (i) interpreting location data, (ii) automatically discovering business process models, (iii) analyzing business processes to evaluate their performance and quality, and (iv) automatically diagnosing business processes. Along with these functions, the contributions of this thesis are: the DIAG methodology, which extracts knowledge from location data through four different states; the DIAG meta-model, which supports both the interpretation of location data (from raw data to usable information) and alignment with domain knowledge (used by the diagnosing methods); two process discovery algorithms that exploit statistical stability in event logs; an application of Statistical Process Control (SPC) to the enhancement perspective of process mining; the ProDIST algorithm for measuring the distance between process models; and two automatic process diagnosing methods that detect the causes of structural deviations in individual cases and in common processes. The state of the art in this dissertation confirms the need for such solutions, and a case study illustrates the applicability of the DIAG methodology and the functions and methods mentioned.
32

GUNASEKARAN, VISHNURAJ V. "A MIXED-SIGNAL MODEL DEVELOPMENT AND VERIFICATION METHODOLOGY WITH EMPHASIS ON A SIGMA-DELTA ANALOG-TO-DIGITAL CONVERTER." University of Cincinnati / OhioLINK, 2005. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1134419771.

33

Proença, Sara Salgueiro. "Organization of the maintenance : method to implement a maintenance management system and methodology for efficient maintenance on heavy machinery." Master's thesis, Instituto Politécnico de Setúbal. Escola Superior de Tecnologia de Setúbal, 2019. http://hdl.handle.net/10400.26/31375.

Abstract:
Relatório de Dissertação do Mestrado em Engenharia de Produção
Este trabalho dá início à construção de um método para implementação de um sistema de gestão da manutenção e é também um estudo de organização e disponibilidade de máquinas pesadas. A pesquisa tem como fundamento base a manutenção e todas as questões inerentes a esta. Irá descrever e salientar a importância da manutenção e resolução de problemas nos dias de hoje nas empresas. Para o sector específico da manutenção de máquinas pesadas, apresento uma proposta de organização com o objetivo de fazer melhor uso dos recursos humanos e materiais. O trabalho será apresentado como um estudo de caso geral. Perante os modelos existentes de manutenção, a realidade, o propósito / finalidade, princípios e ferramentas, apresento novas perspetivas de como atuar e desenvolver o trabalho que permita encontrar a melhor forma de fazer a gestão da manutenção eficiente e eficaz. No desenvolvimento do trabalho é importante conhecer todas as variáveis da manutenção porque apesar desta ser planeada podem existir e / ou verificar-se desvios do planeamento, o que acontece com alguma frequência. A criação de uma metodologia de implementação de um sistema tem como objetivo não só a sua implementação, mas também a eliminação de falhas e a procura da melhoria continua. Após conhecer bem a manutenção é tempo de começar uma nova pesquisa para o desenvolvimento do sistema em si. Na continuação deste trabalho, serão criados procedimentos e apoio à decisão, a fim de sustentar a organização da manutenção e o sistema de gestão da mesma. Este estudo foi realizado para três tipos de equipamentos: escavadora, máquina florestal e pá carregadora de rodas).
This paper begins the construction of a method for implementing a maintenance management system, together with a study of the organization and availability of heavy machines. It is research into the fundamentals of maintenance and all the issues that arise from it. It describes the importance of maintenance to companies today and the problems they face with it. A specific sector, the maintenance of heavy machinery, is presented as a general case study, with a proposal for its organization so as to make better use of human and material resources. Given the existing maintenance models, the reality, the purpose, principles and tools, a new perspective will emerge on how to act and develop the work in order to find the best ways to perform maintenance management in an effective and efficient manner. It is important to know all the variables of maintenance, because even when maintenance is planned, deviations from the plan are possible, and that happens very often. The creation of a methodology for implementing the system aims at its implementation and thereby the elimination of waste, always looking for improvement. Once maintenance is well understood, it is time to begin new research for the development of the system itself. As this work continues, procedures and decision support will be created in order to sustain the maintenance organization and the maintenance management system, focusing on specialised equipment (excavator, forest machine and wheel loader).
APA, Harvard, Vancouver, ISO, and other styles
34

Khobo, Rendani Yaw-Boateng Sean. "A modelling methodology to quantify the impact of plant anomalies on ID fan capacity in coal fired power plants." Master's thesis, Faculty of Engineering and the Built Environment, 2020. http://hdl.handle.net/11427/32244.

Full text
Abstract:
In South Africa, nearly 80 % of electricity is generated from coal fired power plants. Due to the complexity of the interconnected systems that make up a typical power plant, analysis of the root causes of load losses is not a straightforward process. This often leads to losses incorrectly being ascribed to the Induced Draught (ID) fan, where detection occurs, while the problem actually originates elsewhere in the plant. The focus of this study was to develop and demonstrate a modelling methodology to quantify the effects of major plant anomalies on the capacity of ID fans in coal fired power plants. The ensuing model calculates the operating point of the ID fan that results from anomalies experienced elsewhere in the plant. This model can be applied in conjunction with performance test data as part of a root cause analysis procedure. The model has three main sections that are integrated to determine the ID fan operating point. The first section is a water/steam cycle model that was pre-configured in VirtualPlant™. The steam plant model was verified via energy balance calculations and validated against original heat balance diagrams. The second is a draught group model developed using Flownex SE™. This one-dimensional network is a simplification of the flue gas side of the five main draught group components, from the furnace inlet to the chimney exit, characterising only the aggregate heat transfer and pressure loss in the system. The designated ID fan model is based on the original fan performance curves. The third section is a Boiler Mass and Energy Balance (BMEB) specifically created for this purpose to: (1) translate the VirtualPlant results for the steam cycle into applicable boundary conditions for the Flownex draught group model; and (2) calculate the fluid properties applicable to the draught group based on the coal characteristics and combustion process. The integrated modelling methodology was applied to a 600 MW class coal fired power plant to investigate the impact of six major anomalies that are typically encountered: changes in coal quality; increased boiler flue gas exit temperatures; air ingress into the boiler; air heater in-leakage to the flue gas stream; feed water heaters out of service; and condenser backpressure degradation. It was found, inter alia, that a low calorific value (CV) coal of 14 MJ/kg, compared to a typical 17 MJ/kg, reduced the fan's capacity by 2.1 %. Also, having both high-pressure feed water heaters out of service decreased the fan's capacity by 16.2 %.
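As background to the fan modelling described above, a fan's operating point is conventionally found where the fan pressure curve intersects the system resistance curve. The sketch below solves that intersection for quadratic curves; the coefficients, and the choice to represent an anomaly as a rise in system resistance, are invented for illustration and are not taken from the thesis model.

```python
# Minimal sketch (illustrative coefficients, not the thesis model): the ID
# fan operating point as the intersection of a quadratic fan curve with a
# quadratic system resistance curve.
import math

a, b = 6000.0, 0.004   # fan curve: dp_fan(Q) = a - b*Q^2  [Pa, Q in m^3/s]
k = 0.012              # system curve: dp_sys(Q) = k*Q^2   [Pa/(m^3/s)^2]

# a - b*Q^2 = k*Q^2  =>  Q = sqrt(a / (b + k))
q_op = math.sqrt(a / (b + k))
dp_op = k * q_op ** 2
print(f"operating point: Q = {q_op:.0f} m^3/s, dp = {dp_op:.0f} Pa")

# One simplified way a plant anomaly can shift the operating point: a rise
# in system resistance moves the fan down its curve, eroding capacity.
k_anomaly = 1.25 * k
q_new = math.sqrt(a / (b + k_anomaly))
print(f"with anomaly: Q = {q_new:.0f} m^3/s "
      f"({100 * (1 - q_new / q_op):.1f} % capacity loss)")
```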
APA, Harvard, Vancouver, ISO, and other styles
35

Watson, Leslie. "Methodology and practice of taxonomy with special reference to organization and applications of descriptive data on grasses and legumes." Thesis, Canberra, ACT : The Australian National University, 1989. http://hdl.handle.net/1885/142584.

Full text
APA, Harvard, Vancouver, ISO, and other styles
36

Kiper, Troy O., Anthony E. Hughley, and Mark R. McClellan. "Batteries on the battlefield: developing a methodology to estimate the fully burdened cost of batteries in the Department of Defense." Monterey, California: Naval Postgraduate School, 2010. http://edocs.nps.edu/npspubs/scholarly/theses/2010/Jun/10Jun%5FKiper.pdf.

Full text
Abstract:
"Submitted in partial fulfillment of the requirements for the degree of Master of Business Administration from the Naval Postgraduate School, June 2010."
Advisor(s): Nussbaum, Daniel A.; Second Reader(s): Hudgens, Bryan J.; Yoho, Keenan D. "June 2010." "MBA Professional report"--Cover. Description based on title screen as viewed on July 15, 2010. Author(s) subject terms: Life cycle cost estimating, battery acquisition, delivered energy, fully burdened costs, fully burdened cost of fuel, fully burdened cost of water, fully burdened cost of batteries, analysis of alternatives, tradespace, capability development document, battery. Includes bibliographical references (p. 79-84). Also available in print.
APA, Harvard, Vancouver, ISO, and other styles
37

McClellan, Mark R., Troy O. Kiper, and Anthony E. Hughley. "Batteries on the battlefield : developing a methodology to estimate the fully burdened cost of batteries in the Department of Defense." Thesis, Monterey, California. Naval Postgraduate School, 2010. http://hdl.handle.net/10945/5278.

Full text
Abstract:
Approved for public release; distribution is unlimited
L), have developed methodologies to calculate the fully burdened cost of fuel as delivered energy in defense systems. Whereas these previous studies did not consider other energy sources such as batteries, this thesis contributes to the DoD area of knowledge in estimating life cycle costs of systems by developing a methodology to estimate the fully burdened cost of batteries.
APA, Harvard, Vancouver, ISO, and other styles
38

Lee, Kangsoo, and 李岡洙. "Using EEG methodology to examine the effect of exercise induced fatigue on the direction of attention during motor skill performance." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2014. http://hdl.handle.net/10722/206744.

Full text
Abstract:
Exercise induced fatigue can have a negative impact on motor skill performance. While part of the decline is attributable to physiological factors that directly influence the coordination of movement, psychological factors may also contribute. Typically, motor learning environments encourage the accumulation of task-relevant declarative knowledge, which can be depended on to consciously support performance. The literature suggests that skills learnt in this way are vulnerable to demanding performance environments, including those in which the performer is fatigued. Recent empirical work has demonstrated that 'implicit' motor learning environments, devised to limit declarative knowledge buildup and/or dependence on working memory, promote resilient skill performance even after exhaustive fatigue protocols. Such findings imply that dependence on declarative knowledge to support motor skill execution may be a limiting factor under physiological fatigue. However, the effect that fatigue has on attentional resources, such as working memory, remains unclear. Using established experimental paradigms and EEG methodology, a research project was designed to investigate this. Two explanations were considered: (1) fatigue distracts attention away from the control of movement, or (2) fatigue directs attention to the skill, which interferes with automated control of the movement. In this study novice participants were allowed to freely accumulate declarative knowledge before completing a targeted muscle-fatigue protocol. A probe response paradigm assessed participants' ability to recall the position of movement at the time a tone sounded, under the assumption that better recall reflects skill-focused attention. Neural activity was monitored by wireless EEG technology. Neural co-activation (or coherence) between brain regions associated with motor planning (Fz or F3) and with verbal-analytical processing (T3) has been suggested to reflect conscious control of motor skills. Therefore, a fatigue-induced increase in T3-F3 coherence can be interpreted as increased conscious involvement in movement control, whereas a decrease suggests a shift of attention away from movement control. The data collected suggest that, to some extent, fatigue raises the visual-spatial and verbal-analytical contributions to motor control, but they also highlight methodological issues and limitations of the work.
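The T3-F3 coherence measure described in the abstract can be computed with standard spectral tools. The following sketch estimates magnitude-squared coherence between two synthetic channels; the sampling rate, the 8-12 Hz band and the signal construction are illustrative assumptions, not the study's recording setup.

```python
# Minimal sketch: magnitude-squared coherence between two EEG channels
# (e.g. T3 and F3) via Welch cross-spectra, on synthetic data. Sampling
# rate, band of interest and signal construction are assumptions.
import numpy as np
from scipy.signal import coherence

fs = 250                       # sampling rate [Hz], assumed
t = np.arange(0, 30, 1 / fs)   # 30 s of data
rng = np.random.default_rng(0)

shared = np.sin(2 * np.pi * 10 * t)            # common 10 Hz component
t3 = shared + 0.8 * rng.standard_normal(t.size)
f3 = 0.7 * shared + 0.8 * rng.standard_normal(t.size)

f, cxy = coherence(t3, f3, fs=fs, nperseg=fs * 2)
band = (f >= 8) & (f <= 12)                    # assumed band of interest
print(f"mean 8-12 Hz T3-F3 coherence: {cxy[band].mean():.2f}")
```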
published_or_final_version
Human Performance
Master
Master of Philosophy
APA, Harvard, Vancouver, ISO, and other styles
39

Licata, Rebecca Lynn. "Using response surface methodology to model the impact of extrusion on starch digestibility and expansion of sorghum maize composite flour." Thesis, Curtin University, 2012. http://hdl.handle.net/20.500.11937/2500.

Full text
Abstract:
Sorghum is a major drought- and high-temperature-tolerant grain crop currently used mainly as animal feed in Australia. This study investigated high-temperature, high-pressure extrusion cooking for the manufacture of snack-food-like products using sorghum. Response surface methodology statistical modelling was successfully used to specify the optimal formulation and processing conditions to manufacture an expanded product with high levels of slowly digestible starch, with potential for assisting in healthy blood glucose control.
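Response surface methodology, as used in the study, fits a second-order polynomial over coded factor settings and then searches the fitted surface for an optimum. A minimal sketch follows; the two factors, the design points and the responses are invented for illustration and are not the sorghum extrusion data.

```python
# Minimal sketch of response surface methodology: fit a second-order model
# y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2 by least squares.
# Factors (e.g. coded extrusion temperature and moisture) and responses are
# invented for illustration.
import numpy as np

x1 = np.array([-1, -1, 1, 1, 0, 0, 0, -1, 1], dtype=float)
x2 = np.array([-1, 1, -1, 1, 0, -1, 1, 0, 0], dtype=float)
y = np.array([3.1, 3.9, 4.2, 4.4, 5.0, 4.1, 4.6, 3.8, 4.7])

X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print("coefficients b0,b1,b2,b11,b22,b12:", np.round(beta, 3))

# Evaluate the fitted surface on a grid to locate the predicted optimum.
g = np.linspace(-1, 1, 41)
G1, G2 = np.meshgrid(g, g)
Y = (beta[0] + beta[1] * G1 + beta[2] * G2
     + beta[3] * G1**2 + beta[4] * G2**2 + beta[5] * G1 * G2)
i = np.unravel_index(np.argmax(Y), Y.shape)
print(f"predicted max {Y[i]:.2f} at x1={G1[i]:.2f}, x2={G2[i]:.2f}")
```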
APA, Harvard, Vancouver, ISO, and other styles
40

Providência, António Bernardo. "Metodologia de personalização de productos baseada em design centrado no utilizador. Methodology to design customized product based on user centered design." Doctoral thesis, Universitat de Girona, 2012. http://hdl.handle.net/10803/84045.

Full text
Abstract:
This PhD examines design as a tool of interaction based on information systems, making it possible not only to understand the relationship between design and the user but also, based on User Centered Design, to create a methodology that solves real problems derived from users' personal needs. In the case study, the work was based on people with special needs, who because of their limitations spend much of their day sitting and consequently end up suffering from pressure ulcers. The research methodology resulted in an approach that, in a first stage, relates psychophysical data acquisition and processing derived from information systems, in an approximation to semiotics. In a second stage an application was developed in LabVIEW for the integration and processing of data relating to the acquisition of user data and to the characteristics of materials and prototyping processes. The result is a file that can be interpreted by CAD systems. The third and final phase, based on the interpretation of the data in the CAD system, allows the information to be exported to a CAM system and consequently the production of a customized product through rapid manufacturing technology. The work developed resulted in an integrated system called "Core System", divided into modules, that allows management of all information in real time regardless of the inputs of each of its sub-modules, allowing changes in each of these to be reflected in the final result. The application is the result of research work that relates different multidisciplinary areas, building information processing models based on correlations and on data validation by specialized technicians.
This work addresses the role of design as an interaction tool based on information systems, allowing not only a better understanding of the relationship between design and the user but also, based on User Centered Design, the creation of a methodology that solves real problems arising from a personal need. In the case study, the work focused on people with special needs, who because of their limitations spend much of the day seated and therefore, over time, suffer from pressure ulcers. The research methodology led to an approach that, in a first stage, relates the acquisition and processing of psychophysical data, in an approximation to semiotics, starting from information systems.
APA, Harvard, Vancouver, ISO, and other styles
41

Chun, Julie M. "Using a Design for Project Implementation (DFPI) methodology to accelerate Return on Investment (ROI) of an Enterprise Resource Planning (ERP) System." Thesis, Massachusetts Institute of Technology, 2010. http://hdl.handle.net/1721.1/59164.

Full text
Abstract:
Thesis (M.B.A.)--Massachusetts Institute of Technology, Sloan School of Management; and, (S.M.)--Massachusetts Institute of Technology, Dept. of Mechanical Engineering; in conjunction with the Leaders for Global Operations Program at MIT, 2010.
Cataloged from PDF version of thesis.
Includes bibliographical references (p. 85-87).
Corporations continue to grapple with the dilemma of identifying, developing and managing the implementation of meaningful process improvement projects while simultaneously meeting business goals and customer needs. In this thesis we propose a methodology, dubbed Design for Project Implementation (DFPI), that integrates a change management model with engineering design and assessment tools to provide facts and data upon which to base decisions. We suggest that the methodology can be applied via a two-dimensional evaluation process that provides a means of balancing the needs of the business (via an impact-to-business perspective) and a means to accelerate return on investment (via an ease-of-project-implementation perspective). We propose that the DFPI methodology can be applied in a bottom-up approach to investigate the value proposition of a project, highlighting critical project elements and making specific recommendations to project leaders. We also suggest that a DFPI integrated business solution (design tools in conjunction with an interactive database) can be applied in a top-down approach, identifying high-risk or high-leverage areas to leadership sponsors, who can deploy project leaders to investigate the potential opportunities. We tested our hypotheses related to the DFPI methodology and design tools at Raytheon Company. The methodology was deployed on process improvement projects targeted at leveraging the increased capability gained from a recent transition to an integrated SAP enterprise resource planning (ERP) solution. In this thesis we define the DFPI methodology, describe how the associated design tools can be customized to target any type of business process within a corporation (by applying them to ERP-related business processes at Raytheon), review the results of our pilot application at Raytheon and conclude with a short discussion of future areas of study.
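The two-dimensional evaluation described above, balancing impact to business against ease of implementation, is in spirit a prioritization matrix. The toy sketch below ranks candidate projects by a weighted combination of the two scores; the project names, scores and weights are invented and do not represent the DFPI tooling.

```python
# Toy sketch of a two-dimensional project screen in the spirit of DFPI:
# rank candidate projects by business impact and ease of implementation.
# Names, scores and weights are invented for illustration only.
projects = {
    "invoice-matching": {"impact": 8, "ease": 6},
    "order-status-portal": {"impact": 5, "ease": 9},
    "mrp-parameter-cleanup": {"impact": 9, "ease": 3},
}

def priority(p, w_impact=0.6, w_ease=0.4):
    # Weighted sum; ease acts as a proxy for speed of return (faster ROI).
    return w_impact * p["impact"] + w_ease * p["ease"]

for name, p in sorted(projects.items(), key=lambda kv: -priority(kv[1])):
    print(f"{name}: impact={p['impact']}, ease={p['ease']}, "
          f"priority={priority(p):.1f}")
```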
by Julie M. Chun.
S.M.
M.B.A.
APA, Harvard, Vancouver, ISO, and other styles
42

Kazakov, Mikhaïl. "A Methodology of semi-automated software integration : an approach based on logical inference. Application to numerical simulation solutions of Open CASCADE." INSA de Rouen, 2004. http://www.theses.fr/2004ISAM0001.

Full text
Abstract:
Application integration is the process of bringing data or functionality from one program together with that of other application programs that were not initially created to work together. Recently, the integration of numerical simulation solvers has gained importance. Integration within this domain is highly complex due to the presence of non-standard application interfaces that exchange complex, diverse and often ambiguous data. Nowadays, integration is done mostly manually, and the difficulties of the manual process push for a higher level of automation. The author of this dissertation created a methodology, and its software implementation, for semi-automated (i.e., partially automated) application integration. Application interfaces are usually represented by their syntactical definitions, but these miss the high-level semantics of applicative domains: the human understanding of what the software does. The author proposes to use formal specifications (ontologies) expressed in Description Logics to specify software interfaces and define their high-level semantics. The author proposes a three-tier informational model for structuring ontologies and the integration process. This model distinguishes among computation-independent domain knowledge (domain ontology), platform-independent interface specifications (interface ontology) and platform-specific technological integration information (technological ontology). A mediation ontology is defined to fuse the specifications. A reasoning procedure over these ontologies searches for semantic links among the syntactic definitions of application interfaces. Connectors among applications are generated using the information about semantic links; the integrated applications later communicate via these connectors. The author designed a meta-model based data manipulation approach that facilitates and supports the software implementation of the integration process.
APA, Harvard, Vancouver, ISO, and other styles
43

McDonald, Richard Keirs. "Towards regenerative development : a methodology for university campuses to become more sustainable, with a focus on the University of South Florida." [Tampa, Fla] : University of South Florida, 2008. http://purl.fcla.edu/usf/dc/et/SFE0002430.

Full text
APA, Harvard, Vancouver, ISO, and other styles
44

McDonald, Richard Keirs III. "Towards Regenerative Development: A Methodology for University Campuses to Become More Sustainable, With a Focus on the University of South Florida." Scholar Commons, 2008. https://scholarcommons.usf.edu/etd/391.

Full text
Abstract:
The administrations of several universities have developed strategies to reduce the negative environmental effects created by their institutions. Because no single, comprehensive methodology to guide institutions to sustainability exists, these strategies range widely in scope. As well, the definition of "sustainability" differs across these institutions, resulting in strategies ranging from small-scale recycling programs to major initiatives to incorporate green building and revamp curricula. This study attempts to create the first comprehensive methodology to guide university campuses and processes to become regenerative. Regenerative systems "produce more resources than needed, provide resources for other projects, and enhance [the] environment" (Bernheim 2003), and are synonymous with the "triple top line" of sustainability presented by Braungart and McDonough (2002). Sustainability plans of other universities were reviewed to determine what strategies have been successful for those institutions, and these data were synthesized to create the comprehensive methodology. The methodology is incremental, to allow time for institutions to adjust their financial plans and facilities management practices. Subsequently, the University of South Florida's Tampa campus (USF) served as a case study. Buildings and other infrastructure were reviewed, as were the curricula, buying practices, food service, and other university processes. Finally, a survey was presented to the primary decision-makers for USF to identify obstacles to implementation of the sustainability methodology. Recommendations for overcoming these obstacles were then devised, incorporating solutions developed at other institutions as well as novel ideas.
APA, Harvard, Vancouver, ISO, and other styles
45

Wolynski, Misha. "RND estimation stability with respect to methodology : A study on the EURO STOXX 50 index around the September 2008 stock market crash." Thesis, KTH, Matematisk statistik, 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-129173.

Full text
Abstract:
The aim of this study is to investigate whether implied RND functions are stable with respect to the choice of estimation methodology and whether that stability is affected by the stock market crash of September 15, 2008. To do so, I estimate RND functions for the EURO STOXX 50 equity index using two different methods, namely the fully parametric two-lognormal method and a curve-fitting method based on the approach proposed by Shimko (1993). For the estimated RND functions, the mean, standard deviation, skewness and kurtosis are calculated. I find that though the qualitative shape and the direction of the evolution over time of the RND functions obtained with the two methods are relatively similar for most of the estimated trading days, the calculated descriptive statistics show noticeable and systematic differences. These conclusions are largely unaffected by the stock market crash, though the discrepancy in the estimated skewness increases in magnitude and becomes more volatile after it. Based on this, I find that whether the RND estimation is stable with respect to methodology depends on the intended application. If the aim is to qualitatively assess changes in market sentiment over time, both methods lead to largely the same conclusions, and thus the RND can be considered stable. If, on the other hand, the RND is to be used to price a contingent claim and high numerical accuracy is necessary, the RND estimation cannot be said to be stable with respect to methodology.
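For readers unfamiliar with the two-lognormal method mentioned above: it represents the RND as a weighted mixture of two lognormal densities, from which the reported statistics follow. The sketch below builds such a mixture and computes its mean, standard deviation, skewness and kurtosis by numerical integration; the mixture parameters are invented, not fitted to EURO STOXX 50 option prices.

```python
# Minimal sketch: a two-lognormal mixture as a risk-neutral density and the
# four descriptive statistics the study compares across methods. Parameters
# are illustrative assumptions, not fitted values.
import numpy as np
from scipy.stats import lognorm

w = 0.7                                  # mixture weight, assumed
comp1 = lognorm(s=0.15, scale=3200.0)    # scale = exp(mu)
comp2 = lognorm(s=0.35, scale=2700.0)

x = np.linspace(300.0, 9000.0, 20001)    # index-level grid
pdf = w * comp1.pdf(x) + (1 - w) * comp2.pdf(x)

dx = x[1] - x[0]
mean = np.sum(x * pdf) * dx
std = np.sqrt(np.sum((x - mean) ** 2 * pdf) * dx)
skew = np.sum(((x - mean) / std) ** 3 * pdf) * dx
kurt = np.sum(((x - mean) / std) ** 4 * pdf) * dx
print(f"mean={mean:.0f}, std={std:.0f}, skew={skew:.2f}, kurt={kurt:.2f}")
```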
APA, Harvard, Vancouver, ISO, and other styles
46

Ben, Salem Mohamed Oussama [author], and Georg [academic supervisor] Frey. "BROMETH: methodology to develop safe reconfigurable medical robotic systems : application on pediatric supracondylar humeral fracture / Mohamed Oussama Ben Salem ; Supervisor: Georg Frey." Saarbrücken : Saarländische Universitäts- und Landesbibliothek, 2017. http://d-nb.info/1127040677/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
47

Jeannesson, Clément. "Development of a methodology to exploit nuclear data in the unresolved resonance range and the impact on criticality safety and reactor applications." Thesis, université Paris-Saclay, 2020. http://www.theses.fr/2020UPASP074.

Full text
Abstract:
Neutronics calculations, carried out in particular to ensure the operation and safety of nuclear facilities, depend heavily on nuclear data, which describe neutron-matter interactions. In particular, knowledge of cross sections, which define the probability of occurrence of nuclear reactions as a function of the neutron energy, is essential. In the incident-neutron energy range known as the unresolved resonance range, the cross sections have a resonant structure, but the resonances cannot be distinguished experimentally. Only the average values of the cross sections can be calculated from experimentally measured average parameters, along with their probability distributions, using a Monte Carlo method called the "ladder method". In the latter case, a discrete representation based on probability tables is preferred. This thesis develops a precise methodology for processing cross sections in the unresolved resonance range. The work focuses in particular on methods for the statistical sampling of resonances within the ladder method. Several points are addressed, including the influence of the number of sampled resonances on the calculated cross sections, and the minimum number of Monte Carlo iterations to perform. These questions are reformulated in terms of the resonance parameters provided, and a relation is established with the ratio between the average resonance spacing and the average total reaction width. The calculations are performed on entire nuclear data libraries, which is a strength of this thesis. Random matrix theory is then introduced to sample sets of resonances in better agreement with the underlying physics of the problem; its implementation here makes it possible to correlate the spacings between sampled resonances. All the calculations are compared with the Hauser-Feshbach theory for the calculation of average values, with convincing results when the latter uses the Moldauer approximation. Several methods for constructing probability tables are also studied, and two new methods based on k-clustering algorithms are introduced. Benchmark calculations with neutronics codes complete the results and lead to a series of recommendations for the processing of cross sections in the unresolved resonance range.
Neutronics computations are widely used in reactor physics and criticality calculations to ensure the safety and the operation of nuclear facilities. They rely on nuclear data, which describe neutron-matter interactions. Among them, cross sections are fundamental data that express the probability for a particular reaction to occur as a function of the incident neutron energy. In the intermediate-to-high energy range, cross section shapes can no longer be distinguished, which defines the so-called unresolved resonance range. There, cross sections can only be computed as average values from average experimentally determined parameters, as well as probability tables. The latter are a discretized form of the cross section probability distributions, determined from a Monte Carlo technique called the ladder method. This thesis aims at proposing a robust methodology to process cross sections in the unresolved resonance range. In particular, the work carried out deals with statistical sampling of resonances in the framework of the ladder method. Several issues are tackled, among which the impact of the number of sampled resonances on the cross section calculations, as well as the number of Monte Carlo iterations performed. A relation is established between these quantities and the input resonance parameters, namely the ratio of the average resonance spacing to the average total reaction width. Calculations are done for constituents of entire nuclear data libraries, which is an advantage of this work. Then, random matrix theory is introduced to produce more physical sets of resonances that take into account correlations between the resonance spacings. All calculations are compared to the outcomes of the Hauser-Feshbach formalism for the calculation of average cross sections. When the latter are computed using the Moldauer approximation, the results are in close agreement. Several probability table construction methods are then studied, and two innovative methods based on a k-clustering algorithm are introduced. Benchmark calculations using neutronics codes complete the results and make it possible to formulate a detailed methodology for nuclear data processing in the unresolved resonance range.
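The random-matrix ingredient described in both abstracts amounts, at its simplest, to drawing successive resonance spacings from the Wigner surmise rather than independently. The sketch below samples one such ladder by inverse-transform sampling; the average spacing and energy window are illustrative values, not parameters from an evaluated nuclear data file.

```python
# Minimal sketch of one "ladder": sample resonance energies whose spacings
# follow the Wigner surmise (mean spacing D), by inverse-transform sampling
# from F(s) = 1 - exp(-pi*s^2/4). D and the energy window are illustrative.
import numpy as np

rng = np.random.default_rng(42)
D = 20.0                        # average level spacing [eV], assumed
e0, e1 = 10_000.0, 12_000.0     # unresolved-range window [eV], assumed

energies, e = [], e0
while True:
    u = rng.random()
    s = np.sqrt(-4.0 * np.log(1.0 - u) / np.pi)  # normalized Wigner spacing
    e += D * s
    if e >= e1:
        break
    energies.append(e)

spacings = np.diff([e0] + energies)
print(f"{len(energies)} resonances, mean spacing {spacings.mean():.1f} eV "
      f"(target {D} eV)")
```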
APA, Harvard, Vancouver, ISO, and other styles
48

Razik, Mohamed Haniffa Mohamed. "A perspective on Islamic legal methodology in terms of objectives of law : a comparative analysis with special reference to English equity and Istihsān." Thesis, University of Wales Trinity Saint David, 2010. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.683365.

Full text
APA, Harvard, Vancouver, ISO, and other styles
49

Martins, Guillaume Jacobus. "A methodology to identify, quantify and verify the cost benefits of energy and process improvements on a ferro-metal production plant / G.J. Martins." Thesis, North-West University, 2004. http://hdl.handle.net/10394/591.

Full text
Abstract:
South Africa has an energy intensive economy with a high dependency on local mining and base metal industries. Furnace plants, which form part of the metal industry, are energy intensive as a result of the actual melting processes, which require a great amount of energy. The high electricity and energy usage translates into high operating costs for these plants, which in turn reduces their profitability. South Africa's ferrochrome industry supplies about 60% of the world's ferrochrome demand and holds around 80% of the world's chrome reserves. This makes South Africa one of the key ferrochrome producers in the world. There is, however, a need to reduce the cost of production of these plants to ensure competitiveness and profitability within the world market. This dissertation starts by providing an introduction to the problem and then defining the objective and scope of the study. The need for a methodology to identify, quantify and verify energy and process improvement opportunities in ferro-metal production plants is highlighted. This need exists because there is a lack of adequate methods for an integrated approach. Three main barriers to energy projects were identified in this study, namely institutional, technological and financial barriers. The opportunities for energy management and process improvements are investigated, including opportunities to overcome the barriers identified in the study. A methodology, developed to incorporate both increased production and energy efficiency scenarios, is then provided. The methodology is aimed firstly at identifying possible opportunities and then at quantifying them in terms of financial benefits for the plant. This is necessary to establish whether it will be worthwhile to explore the opportunities further. Benchmarking is also included in the methodology, as this helps to track the performance of the plant over time. A process was developed to enable accurate measurement and verification of energy related projects in order to evaluate the effectiveness or success of implemented projects. This process is necessary to enhance the credibility of energy related projects by providing an accurate and transparent evaluation of each project's performance. This in turn provides the stakeholders with invaluable information regarding their investments in energy projects. The developed methodology was applied to a case study of a ferro-metal production plant in order to evaluate the methodology. The case study revealed that the methodology can successfully identify and quantify potential opportunities. The no-cost and low-cost opportunities identified showed a maximum possible annual saving of up to R925,500, depending on the specific options implemented. Load control opportunities in peak periods revealed an estimated cost saving of up to R3,767,400 per year. A possible estimated annual energy consumption saving worth R22,629,900 was identified by a Cusum analysis. This analysis was also used to examine the benefit of a production gain instead of energy efficiency, which showed a possible increase in production of 60,300 tonnes per year. The measurement and verification process was then used to determine the impact that an upgrade of a furnace, aimed at increasing production, had on the actual performance of the furnace. The verification process showed an increase in production worth over R3 million and an energy saving of over R1 million as a direct result of the upgrade. The process showed that the upgrade did indeed achieve a production gain, and therefore the upgrade is considered a success.
Thesis (M.Ing. (Mechanical Engineering))--North-West University, Potchefstroom Campus, 2005.
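The Cusum analysis credited above with identifying the R22,629,900 saving is a standard monitoring-and-targeting calculation: energy use predicted by a production-based baseline model is subtracted from actual use, and the differences are accumulated. A sketch with invented numbers follows; the baseline coefficients and data are not the plant's.

```python
# Minimal sketch of a Cusum analysis: expected energy use is predicted from
# a production-based baseline model, and the cumulative sum of
# (actual - expected) exposes sustained drifts. All figures are invented.
production = [520, 540, 500, 560, 555, 530, 545]     # tonnes per week
actual_kwh = [4300, 4420, 4150, 4580, 4900, 4820, 4880]

# Assumed baseline from a regression on a reference period: kWh = a + b*t.
a, b = 450.0, 7.3
cusum, running = [], 0.0
for t, e in zip(production, actual_kwh):
    running += e - (a + b * t)
    cusum.append(round(running, 1))
print("CUSUM of (actual - expected) kWh:", cusum)
# A persistent upward slope in the CUSUM signals excess consumption worth
# investigating (or, read in reverse, a quantifiable saving opportunity).
```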
APA, Harvard, Vancouver, ISO, and other styles
50

Elizondo, David C. "A Methodology to Assess and Rank the Effects of Hidden Failures in Protection Schemes based on Regions of Vulnerability and Index of Severity." Diss., Virginia Tech, 2003. http://hdl.handle.net/10919/26902.

Full text
Abstract:
Wide-area disturbances are power outages occurring over large geographical regions that dramatically affect power system reliability, causing interruptions of the electric supply to residential, commercial, and industrial users. Historically, wide-area disturbances have greatly affected societies. Virginia Tech directed a research project on the causes of the major disturbances in electric power systems. Research results showed that the role of the power system's protection schemes in wide-area disturbances is critical: incorrect operations of protection schemes have contributed to the spread of disturbances. This research defined hidden failures of protection schemes and showed that these kinds of failures contributed to the degradation of 70-80 percent of the wide-area disturbances. During wide-area disturbance analysis, it was found that hidden failures in protection schemes caused the disconnection of power system elements in an incorrect and undesirable manner, contributing to the disturbance degradation. This dissertation presents a methodology to assess and rank the effects of unwanted disconnections caused by hidden failures, based on Regions of Vulnerability and an index of severity in the protection schemes. The developed methodology for the evaluation of the Region of Vulnerability found that the indicator that most accurately relates the Region of Vulnerability to the single line diagram is kilometers. In representing the Region of Vulnerability in the power system, we found segments of the transmission line in which the occurrence of faults does make the relay operate, producing the unwanted disconnection caused by a hidden failure. The results in the test system show that infeed currents restrain the Region of Vulnerability from spreading along power system elements. Finally, the methodology to compute the index of severity is developed. The index of severity has the objective of ranking the protection schemes; it considers the dynamics of the protection schemes and evaluates the overall disturbance consequence from both static and dynamic perspectives.
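The kilometre-based Region of Vulnerability can be illustrated with a toy distance-relay calculation: the stretch of an adjacent line over which a fault would fall inside an exposed relay reach, with remote infeed inflating the apparent impedance and shrinking the region, as the dissertation observes. All impedances and settings below are invented, not values from the test system.

```python
# Toy sketch of a Region of Vulnerability in kilometres: the segment of an
# adjacent line where a fault falls inside an (incorrectly exposed) relay
# reach. Infeed at the remote bus makes the fault look electrically farther
# away, shrinking the region. All values are illustrative assumptions.
z_per_km = 0.4          # line impedance magnitude [ohm/km], assumed
line_length = 120.0     # adjacent line length [km], assumed
z_reach = 36.0          # exposed relay reach setting [ohm], assumed
z_to_bus = 8.0          # impedance from relay to the adjacent line's bus

def vulnerable_km(infeed_factor):
    # Apparent impedance for a fault d km down the adjacent line:
    #   Z_app = z_to_bus + infeed_factor * z_per_km * d   (infeed_factor >= 1)
    d = (z_reach - z_to_bus) / (infeed_factor * z_per_km)
    return max(0.0, min(line_length, d))

for k in (1.0, 1.5, 2.5):
    print(f"infeed factor {k}: region of vulnerability = "
          f"{vulnerable_km(k):.1f} km")
```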
Ph. D.
APA, Harvard, Vancouver, ISO, and other styles