Dissertations / Theses on the topic 'Analytic tool'

Consult the top 50 dissertations / theses for your research on the topic 'Analytic tool.'


1

Lim, Alvin. "Development of a Semi-Analytic Method to Estimate Forces Between Tool and Hand, Tool and Workpiece in Operation of a Hand-held Power Tool." University of Cincinnati / OhioLINK, 2014. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1406808912.

2

Abdulhadi, Samer Nazmi. "Strategic Decisions Creation-Implementation (SDCI) process : an empirical study." Thesis, Cranfield University, 2015. http://dspace.lib.cranfield.ac.uk/handle/1826/9725.

Abstract:
The aim of this research was to explore empirically how firms create and implement strategic decisions (SDs). The research was inspired by the need to further understand the organizational processes underpinning the SD phenomenon and potentially contribute to the overall performance of firms. Previous research on SDs has focused on formal strategic planning approaches, which have been criticized for their highly prescriptive views of SDs, separating creation from implementation, and focusing on content and discrete elements rather than the holistic process. Despite all these studies, our understanding of the actual nature of the SD phenomenon from creation to implementation remains incomplete. Motivated by the need to look empirically and holistically at this very complex social phenomenon, this research problematizes the above aspects of the SD literature and positions itself within a wider social and descriptive process-based approach. The research employed qualitative and Analytic Induction (AI) methodologies and addressed the above need in three projects. The objective of each project evolved and led to the emergence of the final findings, which suggest a possible answer to the overall research aim. The Scoping Study proposed a theoretical framework of successful SD implementation factors. Project 1 went further and investigated these factors empirically. Project 2 developed empirically the process of how people actually create and implement SDs. In Project 3, this process was analysed through the theoretical lens of the sensemaking perspective and was applied by practitioners through an empirically tested diagnostic tool. This research has made a step towards a better understanding of SDs in practice and contributed to academic knowledge by proposing a different, yet viable, descriptive process, which can improve the overall quality of SDs and potentially lead to better performance.
3

Wong, Angelina H. "An interactive decision analytic tool for the cost-effectiveness analysis of antimicrobial agents for hospital-acquired pneumonia." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1999. http://www.collectionscanada.ca/obj/s4/f2/dsk1/tape9/PQDD_0005/MQ45459.pdf.

4

Sood, Radhika. "Comparative Data Analytic Approach for Detection of Diabetes." University of Cincinnati / OhioLINK, 2018. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1544100930937728.

5

Struthers, John. "Analytic autoethnography : a tool to inform the lecturer's use of self when teaching mental health nursing?" Thesis, Lancaster University, 2012. http://eprints.lancs.ac.uk/62512/.

Abstract:
This research explores the value of analytic autoethnography in developing the lecturer’s use of self when teaching mental health nursing. Sharing the lecturer’s self-understanding, developed through analytic reflexivity focused on their autoethnographic narrative, offers a pedagogical approach that contributes to the nursing profession’s policy drive to increase the use of reflective practices. The research design required me to develop my own analytic autoethnography. Four themes emerged from the data: ‘Being in between’, ‘Perceived vulnerability of self’, ‘Knowing and doing’, and ‘Uniting selves’. A methodological analysis of the processes involved in undertaking my analytic autoethnography raised issues pertaining to the timing of, and health warnings around, exploring memory as data. Actor-Network Theory was used as an evaluative framework to reposition the research findings back into the relationships which support educational practices. The conclusion supports the use of analytic autoethnography to enable lecturers to share the hidden practices which underpin the use of self within professional identities. The recommendations call for methodological literature which makes explicit the possible emotional reactions to the reconstruction of self through analysis of memories. Being able to share narratives offers a pedagogical approach based on the dilemmas and tensions of being human, bridging the humanity between service user, student and lecturer.
6

Aumeerally, Manisah. "Analytic Model Derivation Of Microfluidic Flow For MEMS Virtual-Reality CAD." Griffith University, School of Information and Communication Technology, 2006. http://www4.gu.edu.au:8080/adt-root/public/adt-QGU20061106.095352.

Abstract:
This thesis derives a first-approximation model that describes the flow of fluid in microfluidic devices such as microchannels, microdiffusers and micronozzles using electrical network modelling. The parameter of concern is the flow rate of these devices. The purpose of this work is to contribute to the physical component of our interactive Virtual Reality (VR) prototyping tool for MEMS, with emphasis on fast calculations for interactive CAD design. Current calculations are too time consuming and not suitable for interactive CAD with dynamic animations. This work contributes to and fills the need for the development of MEMS dynamic visualisation, showing the movement of fluid within microdevices over time. Microfluidic MEMS devices are used in a wide range of applications, such as chemical analysis, gene expression analysis, electronic cooling systems and inkjet printers. Their success lies in their microdimensions, enabling the creation of systems that are considerably minute yet can contain many complex subsystems. With this reduction in size, the advantages of requiring less material for analysis, consuming less power, producing less waste and being more portable become their selling points. The market size was in excess of US$50 billion in 2004, according to a study by Nexus. New applications are constantly being developed, leading to the creation of new devices such as the DNA chip and the protein chip. Applications are found in pharmaceuticals, diagnostics, biotechnology and the food industry. An example is the mapping and sequencing of human genome DNA in the late 1990s, which led to a greater understanding of our genetic makeup. Armed with this knowledge, doctors will be able to treat diseases that were previously deemed untreatable, such as diabetes or cancer. Among the tools with which that can be achieved are the DNA chip, used to analyse an individual's genetic makeup, and the Gene chip, used in the study of cancer. With this burgeoning influx of new devices and an increase in demand for them, there is a need for better and more efficient designs. The MEMS design process is time consuming and costly. Many calculations rely on Finite Element Analysis (FEA), whose slow, time-consuming algorithms make interactive CAD unworkable: the iterative algorithms for calculating animated images showing the ongoing processes as they occur are too slow. Faster computers do not fill the void left by the lack of efficient algorithms, because with faster computers also comes the demand for faster responses; hardware alone will not turn a 40-90 minute FEA calculation into an almost instant response within the next decades. Efficient design tools are required to shorten this process. These interactive CAD tools need to give quick yet accurate results. Current CAD tools involve time-consuming numerical analysis techniques which require hours and numerous iterations for the device structure design, followed by more calculations to achieve the required output specification. Although there is a need for detailed analysis, especially in solving for a particular aspect of the design, having a tool to quickly obtain a first approximation will greatly reduce the guesswork involved in determining the overall requirement.
The underlying theory for the fluid flow model is based on traditional continuum theory, and the Navier-Stokes equation is used in the derivation of a layered flow model in which the flow region is segmented into layered sections, each having a different flow rate. The flow characteristics of each section are modelled as electrical components in an electrical circuit. Matlab 6.5 is used for the modelling and Simulink for the simulation.
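
For readers unfamiliar with this electrical-network analogy, the correspondence is that pressure drop plays the role of voltage, volumetric flow rate the role of current, and each channel segment contributes a hydraulic resistance (Hagen-Poiseuille for a circular channel). The sketch below is a minimal Python illustration of that idea, not the author's Matlab/Simulink model; the viscosity, channel dimensions and pressure values are hypothetical.

```python
# Minimal sketch of the electrical-network analogy for microchannel flow:
# hydraulic resistance R_h plays the role of electrical resistance, pressure
# drop dP the role of voltage, and volumetric flow Q the role of current
# (Q = dP / R_h, the fluidic Ohm's law). Values are illustrative only.
import math

def hydraulic_resistance_circular(mu, length, radius):
    """Hagen-Poiseuille resistance of a circular channel [Pa*s/m^3]."""
    return 8.0 * mu * length / (math.pi * radius**4)

mu = 1.0e-3          # dynamic viscosity of water [Pa*s]
segments = [         # hypothetical layered segments: (length [m], radius [m])
    (1.0e-3, 50e-6),
    (2.0e-3, 25e-6),
]
# Segments in series add like series resistors.
R_total = sum(hydraulic_resistance_circular(mu, L, r) for L, r in segments)
dP = 1.0e3           # applied pressure drop [Pa]
Q = dP / R_total     # resulting flow rate [m^3/s]
print(f"total resistance {R_total:.3e} Pa*s/m^3, flow {Q:.3e} m^3/s")
```
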
7

Kennedy, Judith Ronelle. "The treatment engagement model as a tool for identifying problematic doctor behaviour: three case studies." University of New South Wales, Graduate Program in Professional Ethics, School of Philosophy, 2006. http://handle.unsw.edu.au/1959.4/28220.

Abstract:
This thesis is an exploration of professional behaviour in health care settings, using a Model of Treatment Engagement that is developed as a tool for ethics critique. The Model is tested and refined using data on: a psychiatric ‘treatment’ carried out on over 1,127 occasions in a 15-40 bed non-acute hospital during the period 1961-1979; the problematic withdrawal of all life-support from a 37 year old man who had suffered acute brain trauma some five days previously, in a tertiary hospital in March 2000; and a clinical experiment recently proposed for the emergency setting and intended to encompass five hospitals and the NSW Ambulance Service. In each case, the Model proves useful in identifying the shift from the treatment paradigm and the ethical imperative of ensuring the patient (or his/her agent) appreciates the difference between what is proposed and what would normally be done. It reveals how doctors who dealt with the patient but did not decide on treatment contributed to ethically troublesome practice. It clarifies how having multiple doctor players in the treatment situation gave rise to the need to suppress dissenting views. Doctors who were close enough to the action to comprehend its nature, by not dissenting, reinforced the problematic choice for the actor and validated it in the eyes of observers. The lack of dissent at the level of doctors working under supervision appeared to be a function of institutional arrangements. At the consultant level, there was evidence of pressure to concur from other consultants and indirect evidence of a fear of ostracism. The public responses in the two modern cases point to there being a strong idea in Sydney’s medical community that dissent should not be publicly displayed once a decision on how to treat has been made. I conclude there are two steps to reviewing ethically problematic treatment situations. The first consists of identifying the shift from the treatment paradigm. The second consists of establishing why the problematic choice is translated into action. The Treatment Engagement Model is put forward as a useful tool for both these analyses.
8

Franks, Lianne. "Exploring multi-disciplinary team (MDT) experiences of cognitive analytic therapy (CAT) as a systemic consultation tool in an adult forensic service." Thesis, University of Liverpool, 2015. http://livrepository.liverpool.ac.uk/2034319/.

Abstract:
Background: Following the growing emphasis on the use of psychological consultation and on Cognitive Analytic Therapy (CAT) as a consultation tool, this qualitative study explored staff members’ experiences of using CAT as a systemic consultation tool. Method: Interviews were conducted with nine members of the Multi-Disciplinary Team in a High Secure Hospital, and the data were analysed using thematic analysis from a social constructionist perspective. Results: The emerging themes of CAT as a consultation tool included availability and accessibility, genuine value, and mirroring enlightenment. Conclusion: The study demonstrates how genuine value within the system sits at the heart of the accessibility and availability of CAT as a systemic consultation tool and of the mirroring enlightenment of staff and patients. Implications for clinical practice are also discussed.
9

Yang, Sen, and Songyang Gao. "A new approach for analyzing the RL competence in 3PLs : A case study of FLB." Thesis, Högskolan i Gävle, Institutionen för teknik och byggd miljö, 2009. http://urn.kb.se/resolve?urn=urn:nbn:se:hig:diva-6582.

Abstract:
Purpose – The purpose of our thesis is to introduce a new approach to help small and medium third-party logistics service providers (3PLs) develop and improve their Reverse Logistics (RL). Design/methodology/approach – A case study was adopted, and a qualitative questionnaire and face-to-face interviews were used to collect the fundamental data. In addition, the multi-criteria decision-making tool AHP (Analytic Hierarchy Process) and a Reverse Logistics audit model were used to analyse the case and to address the problem of RL improvement in 3PLs. Findings – Under fierce competition and in today’s volatile market, Reverse Logistics is considered a new competitive advantage by many 3PLs. However, effective solutions instructing 3PLs in how to develop and improve their Reverse Logistics competence are still very scarce. A new approach is therefore presented in this thesis to help 3PLs address this gap. Limitations – There are two main limitations in our paper, which arise from the two models we adopted. Regarding the Reverse Logistics audit model, we need more data from the case company to support our research. For the AHP method, the limitation is that general assumptions were used to provide correlative data in the computation, and complex computations were simplified in order to show the calculation process clearly. Practical implications – FLB, the case company, is studied to verify the practical implications of our new approach. We believe that, through our approach, many small and medium-sized 3PLs will find it easier to get a holistic view of their RL competence and to know how to develop or improve it. Originality/value – How to evaluate and assess RL competence is presented separately from the inside view of 3PLs and the outside view of their customers; AHP and a self-made RL audit model are used for these two views, respectively. Keywords: Reverse logistics, Analytic hierarchy process, Assessment tool, 3PLs. Paper type: Case study / research paper.
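
As background on the AHP step named in the abstract: the method's core computation derives criterion weights from the principal eigenvector of a pairwise-comparison matrix and checks the consistency of the judgments. The sketch below is a minimal illustration with made-up judgments, not the thesis's actual comparison data.

```python
# Minimal AHP sketch: derive criterion weights from a pairwise-comparison
# matrix via its principal eigenvector. The 3x3 judgments are hypothetical.
import numpy as np

A = np.array([
    [1.0, 3.0, 5.0],   # criterion 1 judged against criteria 1, 2, 3
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                     # principal eigenvalue index
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                                    # normalized priority weights
CI = (eigvals.real[k] - len(A)) / (len(A) - 1)  # consistency index
print("weights:", w.round(3), "CI:", round(CI, 3))
```
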
10

Bonilla Hernández, Ana Esther. "Analysis and direct optimization of cutting tool utilization in CAM." Licentiate thesis, Högskolan Väst, Forskningsmiljön produktionsteknik (PTW), 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:hv:diva-8672.

Abstract:
The search for increased productivity and cost reduction in machining can be interpreted as the desire to increase the material removal rate, MRR, and maximize cutting tool utilization. The CNC process is complex and involves numerous limitations and parameters, ranging from tolerances to machinability. A well-managed preparation process creates the foundation for reducing manufacturing errors and machining time. Two studies conducted along the preparation process of the NC program are presented in this thesis. One study examined the CAM programming preparation process from the Lean perspective. The other evaluates how the cutting tools are used in terms of MRR and tool utilization. The material removal rate is defined as the product of three variables, namely the cutting speed, the feed and the depth of cut, which together constitute the cutting data. Tool life is the amount of time that a cutting tool can be used and is mainly dependent on the same variables. Two different combinations of cutting data might provide the same MRR, but the tool life will be different. The difficulty is thus to select the cutting data so as to maximize both MRR and cutting tool utilization. A model for the analysis and efficient selection of cutting data for maximal MRR and maximal tool utilization has been developed and is presented. The presented model shortens the time dedicated to optimized cutting data selection and reduces the iterations needed during program development.
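
To make the trade-off concrete: for turning, the product the abstract refers to is commonly written MRR = vc · f · ap (cutting speed × feed × depth of cut). The snippet below is a small numeric illustration with hypothetical values, not taken from the thesis, showing two cutting-data combinations with equal MRR but different tool-life implications.

```python
# Illustration of the trade-off the abstract describes: two cutting-data
# combinations can give the same material removal rate (MRR = vc * f * ap
# for turning) while stressing the tool differently. Numbers are hypothetical.
vc1, f1, ap1 = 200.0, 0.30, 2.0   # cutting speed [m/min], feed [mm/rev], depth [mm]
vc2, f2, ap2 = 300.0, 0.20, 2.0
mrr1 = vc1 * f1 * ap1             # [cm^3/min] with these customary units
mrr2 = vc2 * f2 * ap2
print(mrr1, mrr2)                 # both 120.0, yet tool life differs: Taylor's
                                  # equation vc * T^n = C predicts shorter life
                                  # at the higher cutting speed vc2
```
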
11

Åström, Angie. "Svensk offentlig diplomati i förändring : En fallstudie om Svenska institutet" [Swedish public diplomacy in transition: a case study of the Swedish Institute]. Thesis, Södertörns högskola, Institutionen för samhällsvetenskaper, 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:sh:diva-17315.

Abstract:
The Swedish Institute is a public agency promoting Swedish interests, the national image and confidence in Sweden around the world. This work investigates how communication and a process over time influence and affect public diplomacy. The analytic discussion is based on a single case study of this Institute, representing ideas in the international research field of public diplomacy. The theoretical ideas of public diplomacy are placed in a perspective of social constructivism. The method is qualitative, with excerpts taken from interviews, literature, newspapers, articles, state public reports and social media. The work adopts a discourse-analytic approach, aiming to uncover the structure of public diplomacy by using three analytic tools: soft power, nation branding and cultural diplomacy. The presented analysis and examples suggest that close collaboration between researchers and practitioners can lead to a coherent theory of public diplomacy. The results identify promising directions as well as weaknesses and gaps in existing knowledge. The work proposes an analytic tool, “korstryck” (cross-pressure), for theorizing and conceptualizing the discussion of public diplomacy. A strategy of today requires three fundamental components: power, diplomacy and communication. The challenge in public diplomacy is the balance between public opinion, public foreign policy and global networks of communication. The paper aims to open doors for the further scientific work needed in the search for a theory of public diplomacy.
12

Biro, Michael A. "An analysis of the reasons students enroll in the Machine Tool Operation and Tool & Die Making diploma program at Waukesha County Technical College." Menomonie, WI : University of Wisconsin--Stout, 2006. http://www.uwstout.edu/lib/thesis/2006/2006birom.pdf.

13

Tang, Xiaoting. "New analytical tools for systems biology." Washington State University, 2006. http://www.dissertations.wsu.edu/Dissertations/Fall2006/x_tang_081706.pdf.

14

Leite, Roger Almeida. "PhenoVis : a visual analysis tool to phenological phenomena." Biblioteca Digital de Teses e Dissertações da UFRGS, 2015. http://hdl.handle.net/10183/115181.

Abstract:
Phenology studies recurrent periodic phenomena of plants and their relationship to environmental conditions. Monitoring forest ecosystems using digital cameras allows the study of several phenological events, such as leaf expansion or leaf fall. Since phenological phenomena are cyclic, the comparative analysis of successive years is capable of identifying interesting variation on annual patterns. However, the number of images collected rapidly gets significant since the goal is to compare data from several years. Instead of performing the analysis over images, experts prefer to use derived statistics (such as average values). We propose PhenoVis, a visual analytics tool that provides insightful ways to analyze phenological data. The main idea behind PhenoVis is the Chronological Percentage Maps (CPMs), a visual mapping that offers a summary view of one year of phenological data. CPMs are highly customizable, encoding more information about the images using a pre-defined histogram, a mapping function that translates histogram values into colors, and a normalized stacked bar chart to display the results. PhenoVis supports different color encodings, visual pattern analysis over CPMs, and similarity searches that rank vegetation patterns found at various time periods. Results for datasets comprising data of up to nine consecutive years show that PhenoVis is capable of finding relevant phenological patterns along time.
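
As an illustration of the CPM idea described above (a pre-defined histogram per day, a mapping from histogram values to colors, and a normalized stacked bar chart), here is a minimal sketch with synthetic data; the bin count, color map and layout are assumptions for illustration, not PhenoVis's actual design.

```python
# Sketch of the CPM idea: for each day of a year, reduce that day's image(s)
# to a fixed histogram, convert bin counts to percentages, and draw the year
# as normalized stacked bars (one bar per day). Data here are synthetic.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
days, bins = 365, 5
counts = rng.random((days, bins))                 # stand-in per-day histograms
pct = counts / counts.sum(axis=1, keepdims=True) * 100

bottom = np.zeros(days)
x = np.arange(days)
colors = plt.cm.Greens(np.linspace(0.3, 0.9, bins))   # one color per bin
for b in range(bins):
    plt.bar(x, pct[:, b], bottom=bottom, width=1.0, color=colors[b])
    bottom += pct[:, b]
plt.xlabel("day of year"); plt.ylabel("% of pixels per bin")
plt.title("Chronological Percentage Map (synthetic)")
plt.show()
```
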
15

Naumanen, Hampus, Torsten Malmgård, and Eystein Waade. "Analytics tool for radar data." Thesis, Uppsala universitet, Institutionen för teknikvetenskaper, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-353857.

Abstract:
Analytics tool for radar data was a project that started when radar specialists at Saab needed to modernize the tools they use to analyze binary-encoded radar data. Until now, the analysis has been accomplished using inadequate and ineffective applications not designed for that purpose, which makes the analysis tedious and more difficult than it would be with an appropriate interface. The applications also had limitations regarding different radar systems, which restricted their usage significantly. The solution was to design new software that imports, translates and visualizes the data independently of the radar system. The software was developed as several parts that communicate with each other to translate a binary file. A binary file consists of a series of bytes containing the information on the targets, plus markers separating the revolutions of the radar. The byte stream is split according to the ASTERIX protocol, which defines the length of each Data Item, and the extracted positional values are stored in arrays. The code then converts the positional values to Cartesian coordinates and plots them on the screen. The software implements features such as play, pause and reverse and a plotting history that allow the user to analyze the data in a simple and user-friendly manner. There are also numerous ways the software could be extended: the code is constructed so that new features providing additional analytical abilities can be implemented without affecting the components already designed.
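
The two core steps named in the abstract, splitting a byte stream into records and converting positional values to Cartesian coordinates for plotting, can be sketched as follows. The fixed 8-byte record layout here is invented for illustration and is not the ASTERIX encoding, whose Data Items are variable-length.

```python
# Toy sketch of the pipeline the abstract describes: split a byte stream into
# records, decode range/azimuth, and convert to Cartesian for plotting. The
# 8-byte record layout is invented and is NOT the ASTERIX format.
import math
import struct

def decode_targets(blob: bytes):
    """Yield (x, y) for consecutive 8-byte records: float range_m, float azimuth_rad."""
    for off in range(0, len(blob) - 7, 8):
        rng_m, az = struct.unpack_from(">ff", blob, off)
        # Azimuth measured clockwise from north: x is east, y is north.
        yield rng_m * math.sin(az), rng_m * math.cos(az)

blob = struct.pack(">ff", 1000.0, math.radians(45)) + struct.pack(">ff", 500.0, 0.0)
print(list(decode_targets(blob)))
```
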
16

Holm, Jason L. "Textlinguistics as an exegetical tool." 2001. http://www.tren.com.

17

Chong, Natalie. "Beyond Evidence-Based Decision Support : Exploring the Multi-Dimensional Functionality of Environmental Modelling Tools. Comparative Analysis of Tool." Thesis, Paris Est, 2019. http://www.theses.fr/2019PESC1005.

Abstract:
As the sun sets on the age of unlimited growth and consumption, the call for progressively robust, adaptive and integrated solutions to address ‘wicked’ environmental problems has ushered in a new paradigm that has fundamentally changed the practices of both science and management. Emphasis on collaborative, integrative and participative approaches has given rise to burgeoning science-practice-policy arrangements while necessitating new tools to support the implementation of increasingly demanding regulation. In the context of water resources, models have emerged as fundamental tools favoured by scientists and practitioners alike, owing to their ability to advance scientific understanding of water systems functioning, while at the same time supporting key decisions in the management, policy and planning of river basins. A wide range of modelling tools have been developed to study the numerous physical, chemical, and biological processes at work, on different spatial and temporal scales, with varying levels of complexity. At the same time, models provide practitioners with a practical tool for supporting ‘evidence-based’ policy by transposing complex problems into technical, ‘manageable’ solutions. Yet, their application in practice has proven far from proportional to the amount of time and resources that have been invested in their development. This thesis aims to elucidate the enduring divide between science, practice and policy in the context of a new paradigm of science and management through the lens of modelling tools and their role at the science-practice-policy interface. Using a qualitative approach, we draw from two empirical examples: the PIREN-Seine in France and the CRC for Water Sensitive Cities in Australia. While both share similar challenges, methods and objectives, the fundamental difference in their strategies and approaches offers a rich foundation for comparison. In doing so, we explore the driving forces, implications and potential consequences of the parallel paradigm shifts in science and management, focusing on three main aspects: 1/ the use and utility of modelling tools to support water management, policy and planning; 2/ the different modalities of addressing uncertainty in model-based decision support; and 3/ the role of new science-practice-policy arrangements. By first retracing the history of production and use of modelling tools in both examples, we seek to understand the nuanced relationship between ‘use’ and ‘utility’, offering insight into influencing factors. Next, we turn to the question of uncertainty by analysing how researchers and practitioners reconcile the fundamental challenge of uncertainty in model-based decision support. Delving deeper into the complex, negotiated social process that comprises the decision-making context, we focus on the social construction of ignorance and its role in decision-making. Finally, we examine the macro-level changes brought about by the paradigm shift in science and management. Amidst these changes, we seek to understand the emergence and functions of ‘boundary organisations’ in this new epoch, and their role in the quest for robust, adaptive and sustainable solutions.
18

Gomolka, Beth. "Service Offering Uncertainty Analysis Tool." Thesis, Linköping University, Department of Management and Engineering, 2009. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-19945.

Abstract:

Companies that seek to venture into providing services in addition to products have many business issues to consider, as there are many differences between providing service and product offerings. One factor that needs to be considered in service offerings is the aspect of time, as services are offered over an extended period, creating a unique type of relationship between the customer and the service provider. With product offerings, the point of sale is usually the end of the product provider and customer relationship. The added time aspect in the service offering brings with it the issue of uncertainty, as service contracts are made for a period of time in the future, where things are unknown.

This thesis looked at the types of uncertainty important to service offerings, especially in the manufacturing industry. These uncertainties have an impact on how service offering contracts are constructed, as they can affect the profit and costs of the service provider. The three types of uncertainty examined were product malfunction uncertainty, service delivery uncertainty, and customer requirement uncertainty. Using these three types of uncertainty, mathematical models were constructed to represent the cost and revenue of different contract types. The different contract types were identified through a case study with a product manufacturer in Sweden. Different probability distributions were selected to model the three types of uncertainty based on a literature review. The mathematical models were then used to construct a software program, the uncertainty simulator tool, which service contract designers can use to model how uncertainties affect cost and revenue in their contracts.
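
To give a flavour of how such distribution-based contract models work in general, the sketch below Monte Carlo-simulates profit for a fixed-fee service contract where malfunction counts follow a Poisson distribution. The distribution choice, prices and rates are hypothetical assumptions, not taken from the thesis.

```python
# Minimal Monte Carlo sketch of contract profit under uncertainty: product
# malfunctions per year are drawn from a Poisson distribution and priced
# against a fixed annual service fee. All figures are hypothetical.
import numpy as np

rng = np.random.default_rng(1)
years, n_sim = 5, 100_000
failures = rng.poisson(lam=2.0, size=(n_sim, years))  # malfunctions per year
cost = 800.0 * failures.sum(axis=1)                   # repair cost per failure
revenue = years * 3000.0                              # fixed yearly service fee
profit = revenue - cost
print(f"mean profit {profit.mean():.0f}, P(loss) {np.mean(profit < 0):.3f}")
```
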

19

Tan, Johnson Cheah-Shin. "A machine utilization analysis tool." Thesis, Massachusetts Institute of Technology, 1995. http://hdl.handle.net/1721.1/37773.

Abstract:
Thesis (M.S.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 1995.
Includes bibliographical references (leaves 76-77).
20

Selvaraja, Sudarshan. "Microarray Data Analysis Tool (MAT)." University of Akron / OhioLINK, 2008. http://rave.ohiolink.edu/etdc/view?acc_num=akron1227467806.

21

Roberts, David Anthony. "Discontinuous Systems Analysis: an Interdisciplinary Analysis Tool." Oxford, Ohio : Miami University, 2007. http://rave.ohiolink.edu/etdc/view?acc%5Fnum=miami1196390609.

22

Krishna, Nyaupane Bal. "Testing a Timing Analysis tool : SWEET." Thesis, Mälardalens högskola, Akademin för innovation, design och teknik, 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:mdh:diva-24444.

Abstract:
The main purpose of analysing the timing behavior of real-time systems is to verify that the system meets its timing requirements. One important part of this analysis is to find the worst case execution time (WCET) of the software in the system. SWEET (Swedish Execution Time Analysis Tool) is a tool, developed at IDT in Västerås, that calculates the WCET by static analysis. The calculated WCET must be safe, i.e., it must never underestimate the real WCET. The tool should be able to handle large programs efficiently. It must also calculate correct results for different types of program constructs. This thesis mainly focuses on verifying these two requirements for SWEET. Firstly, we created large programs (larger than 14 KLOC) to find the limits of the program size that can be handled by SWEET, and observed the results and the analysis time for these programs. Secondly, we created a number of useful examples to test special features of SWEET and to show how SWEET is capable of analyzing different types of C programs, handling arrays/matrices, strings, etc. During the analysis we encountered some problems, which pointed out a number of bugs in SWEET.
23

Garg, Deepali. "A tool for hemodynamic data analysis." Thesis, Massachusetts Institute of Technology, 2004. http://hdl.handle.net/1721.1/28385.

Abstract:
Thesis (M. Eng.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 2004.
Includes bibliographical references (leaves 83-85).
Nearly 2% of all live births are very premature (gestational age less than 32 weeks), and 50% of all new cases of cerebral palsy occur in survivors of premature birth (gestational age less than 37 weeks). Because of their underdeveloped vascular structure, premature infants are especially vulnerable to brain injury caused by unregulated and erratic changes in blood pressure. A challenge in the prevention of serious brain injury in premature infants is the inability to identify impending or recent hemodynamic events that might lead to injury of the newborn's brain. If events that indicate a propensity for brain injury can be identified, then such events can be monitored clinically, and steps can be taken to prevent them from occurring. We designed and implemented a software tool, HemDAT, that can be used to test hemodynamics-related hypotheses and to facilitate the discovery of interesting relationships among hemodynamic signals. HemDAT uses signal processing and statistical techniques to provide clinical researchers with a tool that can help develop a better understanding of how brain injury occurs in premature newborns. HemDAT is capable of processing and navigating large data sets of blood pressure and cerebral blood flow. Large data sets are important because the events that cause brain injury are believed to be short-lived, possibly infrequent, and unpredictable. Additionally, since this is a relatively unexplored area in human infants, HemDAT provides flexibility in performing repeated analyses with different user-modifiable parameters. HemDAT also provides convenient visualizations of results and does not demand signal processing or statistical expertise from the user.
24

Wang, Chun-Yi. "Dynamic simulation tool for seismic analysis." Thesis, Massachusetts Institute of Technology, 1998. http://hdl.handle.net/1721.1/49995.

25

Lisboa, Liana Barachisio. "ToolDAy: a tool for domain analysis." Universidade Federal de Pernambuco, 2008. https://repositorio.ufpe.br/handle/123456789/1692.

Abstract:
Software reuse - the process of building systems from existing artifacts rather than developing them from scratch - is a key aspect of improving quality and productivity in software development. However, the gains from reuse are more effective when reuse is planned and managed systematically in the context of a specific domain, where families of applications share some functionality. In this context, one way to achieve a more systematic reuse process is through domain analysis - the process of identifying the common and variable features of systems in a specific domain. This process comprises several activities, such as scope definition, domain modeling and documentation, and identification of constraints, among others, and its success depends heavily on how well it is executed. It is therefore essential to have tool support for its execution. Several tools currently provide support for domain analysis; however, they have limitations, such as not supporting the complete process. Thus, this work presents the requirements, architecture and implementation of a domain analysis support tool focused on resolving the limitations identified in existing tools. In addition, this dissertation describes the process and results of the various evaluations of the proposed tool that were carried out in different environments.
26

Díaz, Vázquez Guillermo. "Case Data Analysis Tool for PowerFLOW." Thesis, KTH, Flygdynamik, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-262008.

Abstract:
The field of computational fluid dynamics (CFD) is growing exponentially in terms of performance, robustness, and applications. The expansion of CFD also means more users and more simulations, which translates into more human errors and mistakes in the simulation set-up. Because the set-up must be correct for the simulation to accurately reproduce the desired phenomenon, such errors must be mitigated in order to increase the reliability and robustness of the simulations. In this project a tool has been developed to tackle this issue within the CFD software SIMULIA PowerFLOW. The tool extracts and analyzes the case data before simulation, reporting the results to the user for error detection. The present work presents the implementation, the application and the benefits of the designed tool.
27

Pal, Anindita. "AN ONTOLOGY ANALYSIS TOOL FOR CONSUMER." Miami University / OhioLINK, 2006. http://rave.ohiolink.edu/etdc/view?acc_num=miami1146068536.

28

Agnesson, Daniel, and Necip Yener Önder. "Customer Maturity Analysis Tool : A case study in designing a Customer Maturity Analysis Tool." Thesis, Internationella Handelshögskolan, Högskolan i Jönköping, IHH, Informatik, 2010. http://urn.kb.se/resolve?urn=urn:nbn:se:hj:diva-14799.

Abstract:
The IT consultancy industry is characterized by knowledge-intensive implementation projects related to varying levels of standardized information system software. When faced with a large implementation project at a previously unknown customer, various assessments are usually conducted to identify the level of fit between the software and the organization in order to plan and structure the implementation process. However, there are several aspects of the customer organization that can potentially affect the end result as well as the implementation process which are unrelated to the fit between the software and the implementing organization or to the potency of the implementation method. By measuring these maturity factors within the customer organization, the implementation procedure can be modified based on the customer maturity level in order to become more aligned with the capabilities present in the customer organization. Research question: Which aspects need to be covered by a Customer Maturity Analysis Tool (CMAT) in order to evaluate the pre-implementation maturity of potential customers of our case company, and how should these aspects be organized and measured? The first task of the research process was to create an underlying model of maturity perspectives and aspects to structure the literature review as well as the empirical data collection. It was decided to adhere to a deductive approach where the theoretical model would be validated and, if necessary, modified in accordance with feedback from potential users of the CMAT within the case company. This procedure was repeated in the creation of measurements and maturity levels for the aspects to be used in the tool. The research process would therefore transition from a general model based on the literature review, through an iterative feedback loop, to a final model tailored to the specific requirements of the case company. The final CMAT contains four main perspectives of customer maturity: IT infrastructure, Culture, Process and Business Governance. These four perspectives were in turn divided into subgroups in order to be able to aggregate and compare different aspects of the perspectives with each other.
29

Parker, Brandon S. "CLUE: A Cluster Evaluation Tool." Thesis, University of North Texas, 2006. https://digital.library.unt.edu/ark:/67531/metadc5444/.

Abstract:
Modern high performance computing is dependent on parallel processing systems. Most current benchmarks reveal only the high level computational throughput metrics, which may be sufficient for single processor systems, but can lead to a misrepresentation of true system capability for parallel systems. A new benchmark is therefore proposed. CLUE (Cluster Evaluator) uses a cellular automata algorithm to evaluate the scalability of parallel processing machines. The benchmark also uses algorithmic variations to evaluate individual system components' impact on the overall serial fraction and efficiency. CLUE is not a replacement for other performance-centric benchmarks, but rather shows the scalability of a system and provides metrics to reveal where one can improve overall performance. CLUE is a new benchmark which demonstrates a better comparison among different parallel systems than existing benchmarks and can diagnose where a particular parallel system can be optimized.
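
For context on the serial-fraction metric the abstract mentions: a standard way to estimate the serial fraction of a workload from a measured speedup is the Karp-Flatt metric, sketched below. This is the textbook formula, shown for intuition only; CLUE's own internal computation may differ.

```python
# Karp-Flatt estimate of the experimentally determined serial fraction from a
# measured speedup on p processors; parallel efficiency is speedup / p.
def serial_fraction(speedup: float, p: int) -> float:
    return (1.0 / speedup - 1.0 / p) / (1.0 - 1.0 / p)

speedup, p = 6.0, 8                               # hypothetical measurement
print(serial_fraction(speedup, p), speedup / p)   # ~0.048 serial, 0.75 efficiency
```
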
30

Fruscella, Jeffrey Allen. "Thermal Analysis as an Important Research Tool for Colleges and Universities." Cleveland State University / OhioLINK, 2011. http://rave.ohiolink.edu/etdc/view?acc_num=csu1323301810.

31

Lam, Chi-chung, and 林子聰. "Nuclear magnetic resonance spectroscopy as an analytical tool in the structural analysis of triacylglycerols." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 1994. http://hub.hku.hk/bib/B31233636.

32

Lam, Chi-chung. "Nuclear magnetic resonance spectroscopy as an analytical tool in the structural analysis of triacylglycerols /." [Hong Kong : University of Hong Kong], 1994. http://sunzi.lib.hku.hk/hkuto/record.jsp?B13787056.

33

Butchart, Kate. "Hierarchical clustering using dynamic self organising neural networks." Thesis, University of Hertfordshire, 1996. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.338383.

34

Azarm, Mana. "Tool Support and Data Management for Business Analytics." Thèse, Université d'Ottawa / University of Ottawa, 2011. http://hdl.handle.net/10393/20059.

Abstract:
The data delivery architectures in most enterprises are complex and under-documented. Conceptual business models and business analytics applications are created to provide analysts with a simplified, easy-to-navigate view of enterprise data. But the construction of such interfaces is tedious and manually intensive, requires specialized technical expertise, and it is especially difficult to map exactly where data came from in the organization. In this paper we investigate how two aspects (lineage and requests for data, i.e., semantics and new reports) can be addressed by tying metadata documentation to a systematic data delivery architecture in order to support business analytics applications. We propose a tool framework that includes a metadata repository for each step in the data delivery architecture, a web-based interface to access and manage that repository, and mapping tools that capture data lineage to support step-by-step automation of data delivery.
35

Widing, Härje. "Business analytics tools for data collection and analysis of COVID-19." Thesis, Linköpings universitet, Statistik och maskininlärning, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-176514.

Abstract:
The pandemic that struck the entire world in 2020, caused by the SARS-CoV-2 (COVID-19) virus, will attract enormous interest in statistical and economic analysis for a long time. While the pandemic of 2020 is not the first to strike the entire world, it is the first pandemic in history where data were gathered to this extent. Most countries have collected and shared their numbers of cases, tests and deaths related to the COVID-19 virus, using different storage methods and different data types. Obtaining quality data on the COVID-19 pandemic was a problem most countries had during the pandemic, since the data are constantly changing: not only does the current situation evolve, but past values are also altered when additional information surfaces. The importance of having the latest data available for government officials to make informed decisions makes Business Intelligence tools and techniques for data gathering and aggregation one way of solving the problem. One of the most widely used Business Intelligence packages is Microsoft's Power BI, designed to be a powerful visualization and analysis tool, which could gather all data related to the COVID-19 pandemic into one application. The pandemic caused not only millions of deaths; it also caused one of the largest drops in the stock market since the Great Recession of 2007. To determine whether the deaths or other factors directly caused the drop, the study modelled the volatility of index funds using Generalized Autoregressive Conditional Heteroscedasticity (GARCH). One question often asked when talking about the COVID-19 virus is how deadly it is. Analysing the effect the pandemic had on the mortality rate is one way of determining not only how the pandemic affected the mortality rate but also how deadly the virus is. The analysis of the mortality rate was performed using a Seasonal Artificial Neural Network, which was also used to forecast deaths from the pandemic based on the COVID-19 daily deaths data.
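
For readers unfamiliar with the GARCH approach named above: a GARCH(1,1) model propagates conditional variance as sigma2[t] = omega + alpha * r[t-1]^2 + beta * sigma2[t-1]. The sketch below applies that recursion to synthetic returns; the parameter values are hypothetical, and a real analysis (for example with Python's arch package) would fit them by maximum likelihood.

```python
# Minimal GARCH(1,1) variance recursion:
#   sigma2[t] = omega + alpha * r[t-1]**2 + beta * sigma2[t-1]
# Parameters here are hypothetical; in practice they are estimated by
# maximum likelihood rather than chosen by hand.
import numpy as np

def garch11_variance(returns, omega=1e-6, alpha=0.08, beta=0.90):
    sigma2 = np.empty_like(returns)
    sigma2[0] = returns.var()                  # initialize at sample variance
    for t in range(1, len(returns)):
        sigma2[t] = omega + alpha * returns[t - 1] ** 2 + beta * sigma2[t - 1]
    return sigma2

rng = np.random.default_rng(2)
r = rng.normal(0.0, 0.01, size=500)            # stand-in daily index returns
print(np.sqrt(garch11_variance(r))[-5:])       # conditional volatility path
```
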
36

Sun, Jiake, and Wenjie Jiang. "Analysis Tool for Warehouse Material Handling Data." Thesis, Högskolan i Halmstad, Sektionen för Informationsvetenskap, Data– och Elektroteknik (IDE), 2011. http://urn.kb.se/resolve?urn=urn:nbn:se:hh:diva-15205.

Abstract:
Effective material handling plays a key role in cutting costs. Well-organized material handling can cut production cost by optimizing product transfer paths, decreasing the damage rate and increasing the utilization of storage space. This report presents the development of an analysis system for StoraEnso Hylte's paper reel database. The system extracts and classifies key figures from the database which are related to material handling, such as attributes of the product (paper reel), forklift truck information and storage cell utilization. The analysis of the paper reels covers their damage rate and transfer paths. A mathematical model is also presented, which shows that for paper reel handling the probability of damage per transport matters more than the number of transports: the effect of decreasing non-optimal transportation (optimizing the path) is very small.
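
The model's point can be illustrated with the standard survival calculation: if each transport independently damages a reel with probability p, then after n transports P(damage) = 1 - (1 - p)^n. Path optimization typically trims only a transport or two from n, while better handling can halve p, as in the hypothetical figures below (the independence assumption and the numbers are ours, not the report's).

```python
# Worked illustration: probability that a reel is damaged after n transports,
# each with independent damage probability p. Figures are hypothetical.
def p_damaged(p: float, n: int) -> float:
    return 1.0 - (1.0 - p) ** n

print(p_damaged(0.02, 10))   # baseline:                    ~0.183
print(p_damaged(0.02, 9))    # optimized path (n 10 -> 9):  ~0.166
print(p_damaged(0.01, 10))   # safer handling (p halved):   ~0.096
```
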
37

Österlind, Magnus. "Validering av verktyget 'Enterprise Architecture Analysis Tool'" [Validation of the 'Enterprise Architecture Analysis Tool']. Thesis, KTH, Industriella informations- och styrsystem, 2011. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-81393.

Abstract:
The Enterprise Architecture Analysis Tool, EAAT, is a software tool developed by the department of Industrial Information and Control Systems, ICS, at the Royal Institute of Technology, Stockholm, Sweden. EAAT is a modeling tool that combines Enterprise Architecture (EA) modeling with probabilistic relational modeling, making it possible to design, describe and analyze the organizational structure, business processes, information systems and infrastructure within an enterprise. In this study EAAT has been validated in order to assess the usability of the tool and to provide suggestions for improvement. To reach conclusions, EA models of IT systems in industry have been created in EAAT, and a usability study has been performed to find weaknesses and strengths in the tool's user interface. The results of the study consist of a number of scenarios of how the tool might be used by industry. An important feature to improve is the possibility to easily find the weak parts of a modeled system; the user interface should provide more feedback to the user, and the modeling process could be improved by giving the user suggestions on what to do next in order to reach a full and complete model.
38

Lutero, Gianluca. "A Tool For Data Analysis Using Autoencoders." Master's thesis, Alma Mater Studiorum - Università di Bologna, 2020. http://amslaurea.unibo.it/20510/.

Abstract:
This thesis presents the design and development of SpeechTab, a web application that collects structured speech data from different subjects, together with a technique that tries to tell which subjects are affected by cognitive decline and which are not.
39

Näsman, Per. "Risk analysis : a tool in decision-making." Licentiate thesis, KTH, Safety Research, 2005. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-612.

Abstract:

In our daily life we are surrounded by different kind of risks and we constantly strive for better methods to quantify and in the prolongation manage these risks. Every activity involves some risks and there are some kinds of risks and some level of risks that we are unwilling to accept. We all like to live a life that is free from risks, but that is impossible.

The word risk has a lot of different interpretations. In this thesis we shall let risk stand for the combination of random or uncertain events with negative consequences for human health, life and welfare and for the environment together with some measures of the likelihood of such events. We believe this is the prevailing concept or understanding of risk; as the probability of an event followed by some negative consequences or activities of that event.

In risk analysis one tries to recognize the nature of various risks and to assess the magnitude of the risks. In the risk analysis it is very important to know what system to consider and this is not self evident in many cases. The situation is clearly different for planning and/or building a system compared with running the same system in a real time state. The system that is going to be the subject to the risk analysis must be clearly defined and the limitations and the boundaries of the system must be set. It is very important to ensure that all persons involved in a risk analysis have a common understanding of the system being considered, including relevant operations.

During the past decades many studies have been carried out on risk-related topics, and society has shown significant interest in the field of risk analysis. Risk analysis is an interdisciplinary field of science that combines results and knowledge from probability theory, mathematical statistics, engineering, medicine, philosophy, psychology, economics and other applied disciplines.

This thesis gives examples of risk analyses carried out within two areas. The first part of the thesis (Papers I–V) describes risk analyses carried out in the area of transportation, an area with large differences between the modes of transportation with respect to, for example, the number of users, the number and magnitude of accidents, and the data available. The latter part of the thesis (Papers VI and VII) describes two risk analyses carried out in the field of medicine, a science that has long used methods from the area of risk analysis. The papers are used to discuss risk analysis as a tool in decision-making.
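The working definition above, uncertain events with negative consequences weighted by their likelihood, is commonly formalized as an expected consequence; in textbook notation (not transcribed from the thesis):

```latex
% Risk as expected consequence over a set of adverse events i,
% where p_i is the likelihood of event i and c_i a measure of
% its negative consequences.
R = \sum_{i} p_i \, c_i
```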


APA, Harvard, Vancouver, ISO, and other styles
40

Parnandi, Silpa. "Power market analysis tool for congestion management." Morgantown, W. Va. : [West Virginia University Libraries], 2007. https://eidr.wvu.edu/etd/documentdata.eTD?documentid=5187.

Full text
Abstract:
Thesis (M.S.)--West Virginia University, 2007.
Title from document title page. Document formatted into pages; contains viii, 71 p. : ill. (some col.). Includes abstract. Includes bibliographical references (p. 68-71).
APA, Harvard, Vancouver, ISO, and other styles
41

Coleman, Michael Glenn. "Channel CAT: a tactical link analysis tool." Thesis, Monterey, California. Naval Postgraduate School, 1997. http://hdl.handle.net/10945/8075.

Full text
Abstract:
Approved for public release; distribution is unlimited
The Tri-Service Tactical (TRI-TAC) standards for tactical data links mandate a terminal data rate of 32,000 bits per second. As greater demands for data throughput are placed upon tactical networks, it will become imperative that the design of future client/server architectures does not exceed the capacity of the TRI-TAC networks. This thesis produced an analysis tool, the Channel Capacity Analysis Tool (Channel CAT), designed to provide an automated means of analyzing design decisions in developing client-server software. The tool, built using the Computer Aided Prototyping System (CAPS), gives designers the ability to input TRI-TAC channel parameters and view the results of the simulated channel traffic in graphical format. The size of data, period of transmission, and channel transmission rate can be set by the user, with the results displayed as a percent utilization of the maximum capacity of the channel. Designed from fielded equipment specifications, the details of the network mechanisms closely simulate the behavior of the actual tactical links. Testing has shown Channel CAT to be stable and accurate. As a result of this effort, Channel CAT gives software engineers the ability to test design decisions for client-server software in a rapid, low-cost manner.
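The utilization figure described above is simple arithmetic: a message of a given size sent at a fixed period offers a bit rate that is compared with the 32,000 bit/s TRI-TAC channel rate. A minimal sketch, with a hypothetical function name and example values (not taken from Channel CAT itself):

```python
def channel_utilization(message_bits: float, period_s: float,
                        channel_bps: float = 32_000) -> float:
    """Percent of channel capacity consumed by one periodic message."""
    offered_bps = message_bits / period_s
    return 100.0 * offered_bps / channel_bps

# A 4,000-bit message sent every 0.5 s on a 32 kbit/s TRI-TAC link:
print(channel_utilization(4_000, 0.5))  # -> 25.0 (percent)
```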
APA, Harvard, Vancouver, ISO, and other styles
42

Morgenthaler, John David. "Static analysis for a software transformation tool /." Diss., Connect to a 24 p. preview or request complete full text in PDF format. Access restricted to UC campuses, 1997. http://wwwlib.umi.com/cr/ucsd/fullcit?p9804509.

Full text
APA, Harvard, Vancouver, ISO, and other styles
43

Huang, Yunshui Charles. "A prototype of data analysis visualization tool." Thesis, Massachusetts Institute of Technology, 1994. http://hdl.handle.net/1721.1/12125.

Full text
APA, Harvard, Vancouver, ISO, and other styles
44

Cavalcanti, Yguaratã Cerqueira. "A bug report analysis and search tool." Universidade Federal de Pernambuco, 2009. https://repositorio.ufpe.br/handle/123456789/2027.

Full text
Abstract:
Software maintenance and evolution are activities characterized by their enormous cost and slow pace of execution. Nevertheless, they are unavoidable activities for guaranteeing software quality: almost every successful piece of software prompts users to request changes and improvements. Sommerville is even more emphatic and says that changes in software projects are a fact of life. Moreover, different studies have stated over the years that software maintenance and evolution activities are the most expensive of the development cycle, accounting for up to about 90% of the costs. All these peculiarities of the maintenance and evolution phase lead academia and industry to constantly investigate new solutions to reduce the costs of these activities. In this context, Software Configuration Management (SCM) is a set of activities and standards for managing software evolution and maintenance; SCM defines how all modifications are recorded and processed, their impact on the whole system, among other procedures. For all these SCM tasks there are different supporting tools, such as version control systems and bug trackers. However, some problems can arise from their use, for example the problem of automatically assigning a developer to a bug report and the problem of duplicate bug reports. In this regard, this dissertation investigates the problem of duplicate bug reports that results from the use of bug trackers in software development projects. This problem is characterized by the submission of two or more bug reports that describe the same software problem, whose main consequences are the extra workload in searching and analyzing bug reports and the poor use of the time devoted to this activity.
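The abstract does not commit to a particular detection technique, but a common baseline in work on duplicate bug reports is to rank existing reports by textual similarity to an incoming one, for instance TF-IDF vectors compared by cosine similarity. A minimal sketch with scikit-learn (the function name and parameters are illustrative assumptions, not the dissertation's method):

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def rank_duplicates(new_report: str, existing: list[str], top_k: int = 5):
    """Return the top_k existing reports most similar to new_report."""
    matrix = TfidfVectorizer(stop_words="english").fit_transform(
        existing + [new_report])
    # Last row is the new report; compare it against all earlier rows.
    sims = cosine_similarity(matrix[-1], matrix[:-1]).ravel()
    order = sims.argsort()[::-1][:top_k]
    return [(existing[i], float(sims[i])) for i in order]
```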
APA, Harvard, Vancouver, ISO, and other styles
45

Mirmojarabian, S. (Seyed). "Signal analysis tool to investigate walking abnormalities." Master's thesis, University of Oulu, 2018. http://jultika.oulu.fi/Record/nbnfioulu-201809062748.

Full text
Abstract:
This thesis presents a signal analysis tool designed to investigate walking abnormalities related to foot rolling movements during walking, that is, the interaction of the foot with the ground during the stance phase. In the long term, such abnormalities can cause a wide range of severe anatomical damage, such as ankle, leg, heel and back pain. Compared to the conventional data acquisition setups of biomechanical research, inertial measurement units (IMUs), which have recently come into wide use as an alternative, make it possible to monitor human movement over long periods outside the laboratory. This justifies the growing effort to improve IMU-based algorithms for event detection, position calculation and rotation estimation. Therefore, a set of four IMUs, placed on the shank and foot of both legs, was used for data collection. In the data processing stage, two novel algorithms were developed and implemented as the backbone of the designed software, aiming to detect and integrate stance phases. The first algorithm detects stance phases in gait cycle data; even though event detection in gait cycles has been the topic of many biomechanical studies, the stance phase as the interval between two consecutive events has not been studied sufficiently. The second algorithm, sensor alignment, generates a rotation matrix that is used to align the IMU sensors placed on the same foot and shank. This alignment enables the data to be added or subtracted point-wise, giving a more meaningful interpretation of the data with respect to the walking abnormalities under study during stance phases. The visualized results of the thesis can be considered an early stage of a more comprehensive study that might lead to quantitative results corresponding to different walking abnormalities.
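A hedged sketch of the two operations the abstract describes: applying a rotation matrix to bring one sensor's samples into the frame of another, and a common IMU heuristic that marks stance wherever the angular-rate magnitude stays low for long enough. The threshold and duration values are illustrative assumptions, not taken from the thesis:

```python
import numpy as np

def align(samples: np.ndarray, rotation: np.ndarray) -> np.ndarray:
    """Rotate Nx3 sensor samples into the other sensor's frame (row-wise R @ v)."""
    return samples @ rotation.T

def detect_stance(gyro: np.ndarray, fs: float,
                  thresh: float = 0.5, min_dur: float = 0.1) -> np.ndarray:
    """Boolean mask marking samples where the angular-rate magnitude
    stays below thresh (rad/s) for at least min_dur seconds."""
    quiet = np.linalg.norm(gyro, axis=1) < thresh
    stance = np.zeros_like(quiet)
    min_len, i, n = int(min_dur * fs), 0, len(quiet)
    while i < n:
        if quiet[i]:
            j = i
            while j < n and quiet[j]:
                j += 1
            if j - i >= min_len:  # keep only sufficiently long quiet runs
                stance[i:j] = True
            i = j
        else:
            i += 1
    return stance
```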
APA, Harvard, Vancouver, ISO, and other styles
46

Dahlin, Andreas. "Microscale Tools for Sample Preparation, Separation and Detection of Neuropeptides." Doctoral thesis, Uppsala University, Department of Chemistry, 2005. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-5838.

Full text
Abstract:

The analysis of low-abundance biological molecules is often challenging due to their chemical properties, low concentrations and limited sample volumes. Neuropeptides are one group of molecules that fit these criteria; they also play an important role in biological functions, which makes them particularly interesting to analyze. A classic chemical analysis involves sampling, sample preparation, separation and detection. In this thesis, an enhanced solid-supported microdialysis method was developed and used as a combined sampling and preparation technique. In general, significantly increased extraction efficiency was obtained for all studied peptides. To control the small sample volumes and to minimize the loss of neuropeptides through unwanted adsorption onto surfaces, the subsequent analysis steps were miniaturized into a micro total analysis system (µ-TAS), which allowed sample pre-treatment, injection, separation, manipulation and detection.

In order to incorporate these analysis functions into a microchip, a novel microfabrication protocol was developed. This method enabled three-dimensional structures to be fabricated without the need for clean room facilities.

The sample pre-treatment step was carried out by solid phase extraction from beads packed in the microchip. Femtomole levels of neuropeptides were detected in samples possessing the same properties as microdialysates. The developed injection system made it possible to inject from a liquid chromatographic separation into a capillary electrophoresis channel, which facilitated advanced multidimensional separations. An electrochemical sample manipulation system was also developed. In the last part, different electrospray emitter tip designs made directly from the edge of the microchip substrate were developed and evaluated. The emitters proved comparable with conventional, capillary-based emitters in stability, durability and dynamic flow range. Although additional development remains, the analysis steps described in this thesis open the door to an integrated, on-line µ-TAS for neuropeptide analysis in complex biological samples.

APA, Harvard, Vancouver, ISO, and other styles
47

Kang, Youn Ah. "Informing design of visual analytics systems for intelligence analysis: understanding users, user tasks, and tool usage." Diss., Georgia Institute of Technology, 2012. http://hdl.handle.net/1853/44847.

Full text
Abstract:
Visual analytics, defined as "the science of analytical reasoning facilitated by interactive visual interfaces," emerged several years ago as a new research field. While it has seen rapid growth for its first five years of existence, the main focus of visual analytics research has been on developing new techniques and systems rather than identifying how people conduct analysis and how visual analytics tools can help the process and the product of sensemaking. The intelligence analysis community in particular has not been fully examined in visual analytics research even though intelligence analysts are one of the major target users for which visual analytics systems are built. The lack of understanding about how analysts work and how they can benefit from visual analytics systems has created a gap between tools being developed and real world practices. This dissertation is motivated by the observation that existing models of sensemaking/intelligence analysis do not adequately characterize the analysis process and that many visual analytics tools do not truly meet user needs and are not being used effectively by intelligence analysts. I argue that visual analytics research needs to adopt successful HCI practices to better support user tasks and add utility to current work practices. As the first step, my research aims (1) to understand work processes and practices of intelligence analysts and (2) to evaluate a visual analytics system in order to identify where and how visual analytics tools can assist. By characterizing the analysis process and identifying leverage points for future visual analytics tools through empirical studies, I suggest a set of design guidelines and implications that can be used for both designing and evaluating future visual analytics systems.
APA, Harvard, Vancouver, ISO, and other styles
48

Smirnova, Tatiana. "Analysis, Modeling and Simulation of Machine Tool Parts Dynamics for Active Control of Tool Vibration." Doctoral thesis, Karlskrona : Blekinge Institute of Technology, 2010. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-00472.

Full text
Abstract:
Boring bar vibration in machine tools during internal turning operations is a pronounced problem in the manufacturing industry. Due to the often slender geometry of the boring bar, vibration may easily be induced by the material deformation process. One approach to overcome such vibration problems is to use active control of boring bar vibration. The design time of an active boring bar depends to a great extent on the knowledge of its dynamic properties when clamped in a lathe for different actuator positions and sizes, crucial for its performance. This thesis focuses on the development of accurate dynamic models of active boring bars with the purpose of providing qualitative information on suitable actuator position for a certain boring bar. The first part of the thesis considers the problem of building an accurate "3-D" finite element (FE) model of a standard boring bar used in industry. Results from experimental modal analysis of the actual boring bar are the reference. The second and the third parts discuss analytical and experimental methods for modeling the dynamic properties of a boring bar clamped in a machine tool. For this purpose, the Euler-Bernoulli and Timoshenko beam theories are used to produce both distributed-parameter system models and corresponding "1-D" FE models. A more complete "3-D" FE model of the system boring bar - clamping house is also developed. Spatial dynamic properties of these models are discussed and compared with adequate experimental modal analysis results from the actual boring bar clamped in a machine tool. The third part also investigates the sensitivity of the spatial dynamic properties of the derived boring bar models to variation in the structural parameters' values. The fourth part focuses on the development of a "3-D" FE model of the system boring bar - actuator - clamping house. Two models are discussed: a linear model and a model enabling variable contact between the clamping house and the boring bar with and without Coulomb friction in the contact surfaces. Based on these FE models fundamental bending modes and control path frequency response functions are discussed in conjunction with the corresponding quantities estimated for the actual active boring bar. In the fifth part, a method based on FE modeling and artificial neural networks for selecting a suitable actuator position inside an active boring bar is presented. Objective functions for selecting an actuator position are suggested. An active boring bar with an actuator position suggested by the method was manufactured and it displays fairly good correlation with the corresponding FE model. The final part focuses on modeling of an active boring bar vibration control system. A simple "1-D" FE model of a boring bar is utilized to simulate the dynamic response and an adaptive digital feedback controller realized by the feedback filtered-x LMS algorithm is used.
Link to fulltext: http://www.bth.se/ing/tsm.nsf
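For reference, the two building blocks named above have compact textbook forms (reproduced here as standard results, not transcribed from the thesis). The Euler-Bernoulli model relates the transverse deflection w(x,t) of a beam with bending stiffness EI and mass per unit length ρA to a distributed load, and the filtered-x LMS controller updates its weights from the error and a reference signal filtered through an estimate of the control path:

```latex
% Euler-Bernoulli beam under a distributed load q(x,t)
EI \frac{\partial^{4} w(x,t)}{\partial x^{4}}
  + \rho A \frac{\partial^{2} w(x,t)}{\partial t^{2}} = q(x,t)

% Filtered-x LMS update: step size \mu, error e(n), reference
% signal filtered through the control-path estimate, \hat{x}'(n)
\mathbf{w}(n+1) = \mathbf{w}(n) + \mu \, e(n) \, \hat{\mathbf{x}}'(n)
```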
APA, Harvard, Vancouver, ISO, and other styles
49

Lavén, Martin. "Liquid Chromatography – Mass Spectrometry Analysis of Short-lived Tracers in Biological Matrices : Exploration of Radiotracer Chemistry as an Analytical Tool." Doctoral thesis, Uppsala University, Department of Chemistry, 2005. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-4727.

Full text
Abstract:

Liquid chromatography – mass spectrometry (LC-MS) methods were developed for the analysis of positron emission tomography (PET) radiotracers in biological matrices. Additionally, radiotracer chemistry was explored as an analytical tool for supporting LC-MS method development and imaging molecular interactions in miniaturised chemical analysis systems.

Conventional radiodetection methods can offer high sensitivity in the analysis of radiotracers in biological matrices, although with the short half-life of PET tracers, this mass sensitivity decreases rapidly with time. This limits the time frame for analysis, and may compromise the precision and accuracy of the later measurements. Performing LC-MS analysis of the dominant stable isotope form of the tracer removes such time restrictions.

An LC-MS/MS method was developed for determination of the tracer flumazenil in human plasma, with high inter-assay precision (RSD < 7%) and accuracy (95 – 104%). The method was applied in a multiple scan PET study where the plasma concentration spanned from 0.07 to 0.21 nM. The method removed the time restrictions associated with radiodetection methods and thus provided the opportunity of analysing a greater number of samples than would have been possible with radioanalysis.

Furthermore, an LC-MS/MS method was developed that provided an efficient metabolic screening tool for potential PET tracers, whereby the substrates could be collected directly from 11C-labelling batches. This permitted repeated incubation experiments without the need for repeated labelling syntheses. A para-methoxy-benzamide analogue of the radiotracer WAY-100635 was thus identified as a potential tracer with improved metabolic stability. Additionally, a capillary LC-MS method was developed with rapid (0.75 min) and efficient (> 99%) on-line high flow-rate extraction for determination of the metabolic stability of PET radiotracers.

Finally, the concept of radionuclide imaging of miniaturised chemical analysis systems was demonstrated with the direct study of interactions within capillary extraction columns and microchannels moulded in a plastic CD and poly(dimethylsiloxane).
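The precision and accuracy figures quoted above follow the standard analytical definitions, with s and x̄ the standard deviation and mean of repeated measurements (textbook forms, not transcribed from the thesis):

```latex
\mathrm{RSD} = \frac{s}{\bar{x}} \times 100\,\%,
\qquad
\mathrm{accuracy} = \frac{\bar{x}_{\mathrm{measured}}}{x_{\mathrm{nominal}}} \times 100\,\%
```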

APA, Harvard, Vancouver, ISO, and other styles
50

El, Siblani Ali. "Tool condition analysis and monitoring in cold rolling process." Thesis, KTH, Industriell produktion, 2011. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-41318.

Full text
Abstract:
This research addresses a costly problem in the automotive industry: tool fracture during the cold rolling of splines on steel shafts. The objective is to study the cause of this failure and propose solutions that can be implemented in the workshop. The thesis starts with a brief introduction of the companies involved in shaft production and problem solving. It introduces the cold rolling process and its advantages for spline manufacturing, and goes through relevant material and process characteristics that help to determine the cause of tool fracture. In order to understand the process failure and the production flow, it was necessary to build an Ishikawa diagram of possible tool fracture causes. After collecting and analysing data about the machine tool, the cold rolling process, and the work-piece and rolling tool materials, tests and experiments were carried out. It was concluded that rolling tool fatigue causes the tool fracture. Besides tool fracture, two further problems were detected: production flow instability and fracture of the right-side rolling tool. Testing the material hardness of the work-piece showed continuous hardness fluctuations from the supplier. Rolling tool misalignment was measured using a vernier caliper. Analysis of the rolling tool material hardness shows that the tool is very hard and that it is possible to use a tougher material which responds better to cyclic loads. Leax has tried to solve the problem by testing a different lubricant and other tool coatings. A modal analysis test was performed to find the natural frequency of the work-piece, which may possibly lead to vibration and overloading of one of the rolling tools. The conclusion reached is that the main cause of fracture is rolling tool fatigue due to cyclic loads, and that it is important to use a different rolling tool material. The other two detected problems, production flow instability and right-side rolling tool fracture, should be considered part of the problem in order to significantly increase tool life and stabilize the production flow rate.
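As background to the fatigue conclusion, high-cycle fatigue under cyclic loading is commonly summarized by Basquin's relation, under which life shortens as stress amplitude grows; this is the textbook form, not a result from the thesis:

```latex
% Basquin's relation: stress amplitude \sigma_a versus reversals
% to failure 2N_f; \sigma'_f and b (with b < 0) are material constants.
\sigma_a = \sigma'_f \, (2 N_f)^{b}
```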
APA, Harvard, Vancouver, ISO, and other styles
