Dissertations / Theses on the topic 'Exploration tools'

To see the other types of publications on this topic, follow the link: Exploration tools.

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Select a source type:

Consult the top 50 dissertations / theses for your research on the topic 'Exploration tools.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse dissertations / theses from a wide variety of disciplines and organise your bibliography correctly.

1

康錦琦 and Kam-kee Kay Hong. "Visualization tools for information exploration." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2001. http://hub.hku.hk/bib/B31224416.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Hong, Kam-kee Kay. "Visualization tools for information exploration /." Hong Kong : University of Hong Kong, 2001. http://sunzi.lib.hku.hk:8888/cgi-bin/hkuto%5Ftoc%5Fpdf?B23273070.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Allender, Elyse J. "Automated Tools and Techniques for Mars Forward Exploration." University of Cincinnati / OhioLINK, 2016. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1480328341223151.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Frantz, Ferreira Felipe. "Architectural exploration methods and tools for heterogeneous 3D-IC." Thesis, Ecully, Ecole centrale de Lyon, 2012. http://www.theses.fr/2012ECDL0033/document.

Full text
Abstract:
3D integration technology is driving a strong paradigm shift in the design of electronic systems. The ability to tightly integrate functions from different technology nodes (analog, digital, memory) and physical domains (MEMS, optics, etc) offers great opportunities for innovation (More than Moore). However, leveraging this potential requires efficient CAD tools to compare architectural choices at early design stages and to co-optimize multiphysics systems.This thesis work is divided into two parts. The first part is dedicated to the problem of partitioning a system into multiple dies. A 3D floorplanning tool was developed to optimize area, temperature and the interconnect structure of a 3DIC. Moreover, a meta-optimization approach based on genetic algorithms is proposed to automatically configure the key parameters of the floorplanner. Tests were carried out on architectural benchmarks and a NoC based multiprocessor to demonstrate the efficiency of the proposed techniques.In the second part of the thesis, a hierarchical design methodology adapted to heterogeneous systems is presented. The method combines the bottom-up and top-down approaches with Pareto-front techniques and response surface modeling. The Pareto front of lower level blocks are extracted and converted into predictive performance models that can be stored and reused in a top-down optimization process. The design flow is demonstrated on an operational amplifier as well as on the synthesis of an optoelectronic data link with three abstraction levels
APA, Harvard, Vancouver, ISO, and other styles
5

Wlodarczyk, Radoslaw Stanislaw. "Surface structure predictions and development of global exploration tools." Doctoral thesis, Humboldt-Universität zu Berlin, Mathematisch-Naturwissenschaftliche Fakultät, 2015. http://dx.doi.org/10.18452/17207.

Full text
Abstract:
This work is a contribution in the field of theoretical chemistry and surface science. The joint computational and experimental studies investigated the atomic structure of ultrathin silica and iron-doped silica films formed on the Ru(0001) surface and water films formed on the MgO(001) surface. The atomic structure models were obtained using either the educated guess approach or the genetic algorithm that was designed and implemented within the DoDo package. The properties simulated for the resulting models are in a very good agreement with the experimental data (scanning tunnelling microscopy, infrared spectroscopy). The successful structure determination using the DoDo program shows that the genetic algorithm technique is capable of systematic and extensive exploration of the energy landscapes for 2D-periodic systems.
APA, Harvard, Vancouver, ISO, and other styles
6

Storey, Margaret-Anne D. "A cognitive framework for describing and evaluating software exploration tools." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1998. http://www.collectionscanada.ca/obj/s4/f2/dsk2/ftp03/NQ37756.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Jones, Adam. "Design Space Exploration and Optimization Using Modern Ship Design Tools." Thesis, Monterey, California. Naval Postgraduate School, 2014. http://hdl.handle.net/10945/43072.

Full text
Abstract:
CIVINS
Modern Naval Architects use a variety of computer design tools to explore feasible options for clean sheet ship designs. Under the Naval Sea Systems Command (NAVSEA), the Naval Surface Warfare Center, Carderock Division (NSWCCD) has created computer tools for ship design and analysis purposes. This paper presents an overview of some of these tools, specifically the Advanced Ship and Submarine Evaluation Tool (ASSET) version 6.3 and the Integrated Hull Design Environment (IHDE). This paper provides a detailed explanation of a ship design using these advanced tools and presents methods for optimizing the performance of the hullform, the selection of engines for fuel efficiency, and the loading of engines for fuel efficiency. The detailed ship design explores the design space given a set of specific requirements for a cruiser-type naval vessel. The hullform optimization technique reduces a ship's residual resistance by using both ASSET and IHDE in a Design of Experiments (DoE) approach to reaching an optimum solution. The paper will provide a detailed example resulting in a 12% reduction in total ship drag by implementing this technique on a previously designed hullform. The reduction of drag results in a proportional reduction in the amount of fuel used to push the ship through the water. The engine selection optimization technique uses MATLAB to calculate the ideal engines to use for fuel minimization. For a given speed-time or power-time profile, the code will evaluate hundreds of combinations of engines and provide the optimum engine combination and engine loading for minimizing the total fuel consumption. This optimization has the potential to reduce fuel consumption of current naval warships by upwards of 30%.
APA, Harvard, Vancouver, ISO, and other styles
8

Jones, Adam T. (Adam Thomas). "Design space exploration and optimization using modern ship design tools." Thesis, Massachusetts Institute of Technology, 2014. http://hdl.handle.net/1721.1/92124.

Full text
Abstract:
Thesis: S.M., Massachusetts Institute of Technology, Engineering Systems Division, 2014.
Thesis: Nav. E., Massachusetts Institute of Technology, Department of Mechanical Engineering, 2014.
Cataloged from PDF version of thesis.
Includes bibliographical references (pages 163-164).
Modern Naval Architects use a variety of computer design tools to explore feasible options for clean sheet ship designs. Under the Naval Sea Systems Command (NAVSEA), the Naval Surface Warfare Center, Carderock Division (NSWCCD) has created computer tools for ship design and analysis purposes. This paper presents an overview of some of these tools, specifically the Advanced Ship and Submarine Evaluation Tool (ASSET) version 6.3 and the Integrated Hull Design Environment (IHDE). This paper provides a detailed explanation of a ship design using these advanced tools and presents methods for optimizing the performance of the hullform, the selection of engines for fuel efficiency, and the loading of engines for fuel efficiency. The detailed ship design explores the design space given a set of specific requirements for a cruiser-type naval vessel. The hullform optimization technique reduces a ship's residual resistance by using both ASSET and IHDE in a Design of Experiments (DoE) approach to reaching an optimum solution. The paper will provide a detailed example resulting in a 12% reduction in total ship drag by implementing this technique on a previously designed hullform. The reduction of drag results in a proportional reduction in the amount of fuel used to push the ship through the water. The engine selection optimization technique uses MATLAB to calculate the ideal engines to use for fuel minimization. For a given speed-time or power-time profile, the code will evaluate hundreds of combinations of engines and provide the optimum engine combination and engine loading for minimizing the total fuel consumption. This optimization has the potential to reduce fuel consumption of current naval warships by upwards of 30%.
by Adam T. Jones.
S.M.
Nav. E.
APA, Harvard, Vancouver, ISO, and other styles
9

El-Shehaly, Mai Hassan. "A Visualization Framework for SiLK Data exploration and Scan Detection." Thesis, Virginia Tech, 2009. http://hdl.handle.net/10919/34606.

Full text
Abstract:
Network packet traces, despite having a lot of noise, contain priceless information, especially for investigating security incidents or troubleshooting performance problems. However, given the gigabytes of flow crossing a typical medium sized enterprise network every day, spotting malicious activity and analyzing trends in network behavior becomes a tedious task. Further, computational mechanisms for analyzing such data usually take substantial time to reach interesting patterns and often mislead the analyst into reaching false positives, benign traffic being identified as malicious, or false negatives, where malicious activity goes undetected. Therefore, the appropriate representation of network traffic data to the human user has been an issue of concern recently. Much of the focus, however, has been on visualizing TCP traffic alone while adapting visualization techniques for the data fields that are relevant to this protocol's traffic, rather than on the multivariate nature of network security data in general, and the fact that forensic analysis, in order to be fast and effective, has to take into consideration different parameters for each protocol. In this thesis, we bring together two powerful tools from different areas of application: SiLK (System for Internet-Level Knowledge), for command-based network trace analysis; and ComVis, a generic information visualization tool. We integrate the power of both tools by aiding simplified interaction between them, using a simple GUI, for the purpose of visualizing network traces, characterizing interesting patterns, and fingerprinting related activity. To obtain realistic results, we applied the visualizations on anonymized packet traces from Lawrence Berkeley National Laboratory, captured on selected hours across three months. We used a sliding window approach in visually examining traces for two transport-layer protocols: ICMP and UDP.
The main contribution of this research is a protocol-specific framework of visualization for ICMP and UDP data. We explored relevant header fields and the visualizations that worked best for each of the two protocols separately. The resulting views led us to a number of guidelines that can be vital in the creation of "smart books" describing best practices in using visualization and interaction techniques to maintain network security; while creating visual fingerprints which were found unique for individual types of scanning activity. Our visualizations use a multiple-views approach that incorporates the power of two-dimensional scatter plots, histograms, parallel coordinates, and dynamic queries.
Master of Science
APA, Harvard, Vancouver, ISO, and other styles
10

Ćalić, Tihomir. "Exploration of model driven architecture capabilities via comparative utilization of MDA tools." abstract and full text PDF (free order & download UNR users only), 2006. http://0-gateway.proquest.com.innopac.library.unr.edu/openurl?url_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&res_dat=xri:pqdiss&rft_dat=xri:pqdiss:1438934.

Full text
APA, Harvard, Vancouver, ISO, and other styles
11

Thomas, Stephanie Faye. "The exploration and development of tools for active reading and electronic texts." Thesis, Sheffield Hallam University, 2008. http://shura.shu.ac.uk/20437/.

Full text
Abstract:
This thesis presents the results of research into the process of editing and the decisions faced by editors when approaching early modern texts. Looking at problems faced by editors of Renaissance texts, such as the difficulty of editing and presenting texts that exist in more than one version, for example Shakespeare's King Lear, has enabled me to gain a better understanding of how these issues can be approached and how technology can assist in this. The thesis outlines the areas of the domain into which research has been undertaken, those where it is currently being investigated, and those which may be explored in the future. A literature review of relevant texts has been included, as well as a review of some of the existing methods of viewing texts electronically. I have focused my practical research on how scholarly readers at Undergraduate level respond to being confronted with an unstable text. The term "Active Reading" is used in this case to refer to a level of dynamic involvement with the text, where editorial decision-making can affect the meaning of the text. In observing the methods by which they currently examine and edit multiple-texts, I have been able to study readers and find out how they would like to be able to undertake this task using technology. I have utilized the knowledge gathered from this research to begin editing my own section of a Renaissance play using TEI XML, and to design some prototype editions of a Renaissance poem incorporating several interactive methods of engaging with multiple-text editions. I hope that by documenting the process of producing this work, as well as drawing conclusions from my findings from user trials, this will contribute to new work in the development of electronic texts for literary readers.
APA, Harvard, Vancouver, ISO, and other styles
12

Zanella, Matthew Robert. "Improved Sailboat Design Process and Tools Using Systems Engineering Approach." Thesis, Virginia Tech, 2020. http://hdl.handle.net/10919/98503.

Full text
Abstract:
This research provides a detailed and systematic update of the traditional sailboat design process, with specific attention being paid to the tools used for evaluation purposes, and in doing so creates an improved and optimized design process for sailboats. More specifically, this report seeks to modify a systems-engineering approach to the ship design process, in order to properly incorporate modern sailboat evaluation techniques as well as elements of traditional sailboat design while providing analysis of a case study from Virginia Polytechnic Institute and State University's ocean vehicle design class. In considering all intricacies of sailboat design and with applications and gradual improvement in quality of design through the use of multi-objective optimization methods, a new sailboat design process evolves, which initially considers a wide variety of design options and alternatives. Specific attention is paid in this process to the evolution of the ordering and analysis of each segment of the subprocesses, reducing design risk through the use of industry standard assessment procedures and ensuring consistent interaction with the customer. In doing so, an improved and effective design process is established, to be used by future sailboat design teams at Virginia Polytechnic Institute and State University.
Master of Science
Boats and marine vehicles of different types have long been a mainstay in the growth and development of this country's military, economic and transportation infrastructure. Whether being used for fishing purposes in the Pacific Northwest or moving oil and gas to different cities along the eastern seaboard, marine transportation plays a critical role in day to day life. Long before the invention of gasoline powered engines, most boats were powered by wind which was harnessed by the use of sails. In the 1800's sailboats were used extensively for fishing, delivering mail and a number of other important activities. Nowadays, the use of sailboats is more geared towards recreational endeavors including racing or simply cruising local waterways. It is the responsibility of the sailboat designer to deliver options and products commensurate with the prospective owner's preferences. As such, it is important for the designer to develop a process or system which incorporates useful tools which can successfully evaluate design alternatives. In doing so, useful information will be produced by which the owner and designer can collaboratively make decisions. Unlike a military or commercial ship, the owner of a sailboat is most likely the main operator and shares a personal connection with the boat. This study modifies a systems-engineering approach to the ship design process, in order to properly incorporate modern sailboat evaluation techniques as well as elements of traditional sailboat design. In doing so, the operation provides a process and tool benchmark for future sailboat design teams at Virginia Polytechnic Institute and State University.
APA, Harvard, Vancouver, ISO, and other styles
13

Nichols, David L. "An Exploration of Blackboard Utilization by Faculty at a Midwestern University." Ohio University / OhioLINK, 2011. http://rave.ohiolink.edu/etdc/view?acc_num=ohiou1319213449.

Full text
APA, Harvard, Vancouver, ISO, and other styles
14

Colquechambi, Adriana, Gül Ulu, Mari Nakamura, and Xiaohui Yu. "An Exploration of Strategic Sustainable Development (SSD) Complemented Transformative Social Innovation (TSI) Tools." Thesis, Blekinge Tekniska Högskola, Institutionen för strategisk hållbar utveckling, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-16386.

Full text
Abstract:
The human social system is facing complex social issues and (new) initiatives coming from different social actors are born to try to tackle these complex social issues. Social innovation is the field where these initiatives function, so it is also a complex field to identify and frame. Thus a new theory, the Transformative Social Innovation Theory (TSI), was developed in order to frame and bring more clarification on the social innovation field to contribute to societal transition and transformation. The five TSI tools were developed from the TSI theory and they are training tools. All the TSI tools aim to (dis)empower the social innovation initiatives, actors and networks in the process of transformative social innovation. Transformative Social Innovation is the process of changes in social relations involving challenging, altering and/or replacing dominant institutions and structures which are considered to be the roots of systemic errors. This study sought to explore the Transformative Social Innovation tools from the perspective of the Strategic Sustainable Development (SSD). In this regard, the Framework for Strategic Sustainable Development (FSSD) was adopted as it provides a principle-based and scientifically-proved definition of sustainability as well as a systems thinking approach regarding the complexity of global sustainability challenges. This research project tried to identify the potential contributions of the TSI tools to sustainability and the entry points of the tools where relevant SSD features could complement them so that they can contribute to strategically move the society towards sustainability. A qualitative research approach was selected. The methodology included four research methods, namely document content analysis, interviews, the FSSD analysis and prototyping. The results of this research indicated three main contributions of the TSI tools that could help to strategically move the society towards sustainability. 
Five entry points where the tools could be complemented with SSD features and a set of add-ins from SSD that could complement the current TSI tools were identified. The add-ins were sent to the TSI theory authors for the expert consultation.
APA, Harvard, Vancouver, ISO, and other styles
15

Benazzouz, Omar. "New tools for subsurface imaging of 3D seismic node data in hydrocarbon exploration." Doctoral thesis, Universidade de Aveiro, 2016. http://hdl.handle.net/10773/16799.

Full text
Abstract:
Doctorate in Geosciences
Ocean bottom recording of 3D/4D multichannel seismic reflection data using 4 component Nodes is a recent and growing major segment in the marine seismic acquisition market in the oil and gas industry. These data provide high quality subsurface imaging with low ambient noise levels, broad bandwidth, wide azimuth illumination, long offsets, high resolution, and recordings of both P and S waves. In addition, data acquisition is highly repeatable and therefore ideal for 4D surveys. However, there are significant differences in acquisition geometry and wavefield sampling, compared to conventional towed streamer data, which require new tools to be developed for data processing. This thesis investigates three key issues in OBS/NODE data processing that have not yet been satisfactorily solved: random clock drifts, accurate OBS positioning and efficient 3D pre-stack depth migration algorithms for accurate subsurface imaging. New procedures were developed to tackle these issues and were tested on synthetic and real datasets. A new method for detecting and correcting random clock drift was created, using high-order derivatives to identify these residual drifts. A new accurate OBS/NODE positioning algorithm, using multilateration, was developed. Tools were created for interpolation/extrapolation of 3D velocity functions across the full extent of the acquisition survey, and robust smoothing algorithms were used to prepare the velocity field for ray tracing and pre-stack 3D Kirchhoff depth migration, so as to minimize migration artifacts. The results obtained show a clear improvement in all situations analyzed. Dedicated software tools were created and computationally efficient solutions were implemented.
These were incorporated into an industry standard seismic processing software package (SPW), so as to provide, together with the already existing tools, a fully integrated processing workflow for OBS/NODE data, from data acquisition and quality control, to the production of the final pre-stack depth migrated seismic volumes.
APA, Harvard, Vancouver, ISO, and other styles
16

Bartman, Brian M. "SUPPORTING SOFTWARE EXPLORATION WITH A SYNTACTIC AWARESOURCE CODE QUERY LANGUAGE." Kent State University / OhioLINK, 2017. http://rave.ohiolink.edu/etdc/view?acc_num=kent1500967681232291.

Full text
APA, Harvard, Vancouver, ISO, and other styles
17

Scott, Patrick. "Talking tools : faces of Aboriginal oral tradition in contemporary society (a practice-led exploration)." Thesis, University of Dundee, 2009. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.510619.

Full text
APA, Harvard, Vancouver, ISO, and other styles
18

Grover, Russell J. "An Exploration of Formal Methods and Tools Applied to a Small Satellite Software System." DigitalCommons@USU, 2010. https://digitalcommons.usu.edu/etd/743.

Full text
Abstract:
Formal system modeling has been a topic of interest in the research community for many years. Modeling a system helps engineers understand it better and enables them to check different aspects of it to ensure that there is no undesired or unexpected behavior and that it does what it was designed to do. This thesis takes two existing tools that were created to aid in the designing of spacecraft systems and creates a layer to connect them together and allow them to be used jointly. The first tool is a library of formal descriptions used to specify spacecraft behavior in an unambiguous manner. The second tool is a graphical modeling language that allows a designer to create a model using traditional block diagram descriptions. These block diagrams can be translated to the formal descriptions using the layer created as part of this thesis work. The software of a small satellite, and the additions made to it as part of this thesis work, is also described. Approaches to modeling this software formally are discussed, as are the problems that were encountered that led to expansions of the formal description library to allow better system description.
APA, Harvard, Vancouver, ISO, and other styles
19

Ousby, Louise. "Whatever it takes : an exploration of writing tools and strategies for completing a novel." Thesis, Queensland University of Technology, 2009. https://eprints.qut.edu.au/32182/1/Louise_Ousby_Exegesis.pdf.

Full text
Abstract:
This thesis consists of a novel written with the express purpose of exploring what practices and strategies are most useful in writing novel-length fiction as well as an exegesis which discusses the process. By its very nature, an undergraduate degree in Creative Writing is broad and general in approach. The Creative Writing undergraduate is being trained to manage many and varying writing tasks but none of them larger than can be readily marked and assessed in class quantities. This does not prepare the writing graduate for the gargantuan task of managing a project as large as a single title novel which can be up to 100,000 words and often is more. This study explores the question of what writing tools and practices best equip an emerging writer to begin, write and manage a long narrative within a deadline.
APA, Harvard, Vancouver, ISO, and other styles
20

Beaulieu, Julie. "Exploration of high-density oligoarrays as tools to assess substantial equivalence of genetically modified crops." Thesis, McGill University, 2005. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=97904.

Full text
Abstract:
Since the early 1990s, the concept of substantial equivalence has been a guiding principle of the Canadian Food Inspection Agency and Health Canada's regulatory approach toward products of plant biotechnology destined for the food and livestock feed markets. To assess substantial equivalence in terms of chemical composition, genetically modified (GM) plants are compared to conventional counterparts at the level of macro- and micro-nutrients, allergens and toxicants. Such targeted comparative analyses are limited in their scope and their capacity to detect unintended changes in chemical composition. There is a need to develop more effective testing protocols to improve the substantial equivalence assessment of GM crops. The objective of this thesis was to explore high-density oligoarrays as tools to assess substantial equivalence of Roundup Ready(TM) soybean. Three conventional and two GM soybean varieties were selected according to the similarity of their performance in field trials. Total RNA was extracted from first trifoliate leaves harvested from soybean plants grown in a controlled environment until the V2 stage. To annotate the 37 776 soybean probesets present on the multi-organism Soybean Affymetrix GeneChip(TM), consensus sequences were aligned with TIGR Soybean Gene Index tentative consensus sequences using BLASTN. After redefining the chip description file to exclude non-soybean probesets, the effects of three different normalization methods (Robust Multichip Average (RMA), Microarray Analysis Suite (MAS 5.0) and Model-Based Expression Index) were compared and Significance Analysis of Microarrays (SAM for R-Bioconductor) was applied to detect differential gene expression between conventional and GM soybean varieties. Eleven candidate genes were selected for further studies.
APA, Harvard, Vancouver, ISO, and other styles
21

Marshall, John James. "An exploration of hybrid art and design practice using computer-based design and fabrication tools." Thesis, Robert Gordon University, 2008. http://hdl.handle.net/10059/387.

Full text
Abstract:
The researcher’s previous experience suggested the use of computer-based design and fabrication tools might enable new models of practice that yield a greater integration between the 3D art and design disciplines. A critical, contextual review was conducted to assess what kinds of objects are being produced by art and design practitioners; what the significant characteristics of these objects might be; and what technological, theoretical and contextual frameworks support their making. A survey of international practitioners was undertaken to establish how practitioners use these tools and engage with other art and design disciplines. From these a formalised system of analysis was developed to derive evaluative criteria for these objects. The researcher developed a curatorial framework for a public exhibition and symposium that explored the direction that art and design practitioners are taking in relation to computer-based tools. These events allowed the researcher to survey existing works, explore future trends, gather audience and peer response and engage the broader community of interest around the field of enquiry. Interviews were conducted with practitioners whose work was included in this exhibition and project stakeholders to reveal patterns and themes relevant to the theoretical framework of this study. A model of the phases that practitioners go through when they integrate computer-based tools into their practice was derived from an existing technology adoption model. Also, a contemporary version of R. Krauss’s ‘Klein Group’ was developed that considers developments in the field from the use of digital technologies. This was used to model the context within which the researcher’s practice is located. The research identifies a form of ‘technology-led practice’ and an increased capacity for a ‘transdisciplinary discourse’ at the intersection of disciplinary domains.
This study will be of interest to practitioners from across the 3D art and design disciplines that use computer-based tools.
APA, Harvard, Vancouver, ISO, and other styles
22

Strock, Justin William. "Methods for Naval Ship Concept Exploration Interfacing Model Center and ASSET with Machinery System Tools." Thesis, Virginia Tech, 2008. http://hdl.handle.net/10919/33036.

Full text
Abstract:
In response to the Fiscal Year 2006 National Defense Authorization Act, the US Navy conducted an evaluation of alternative propulsion methods for surface combatants and amphibious warfare ships. The study looked at current and future propulsion technology and propulsion alternatives for these three sizes of warships. In their analysis they developed 23 ship concepts, only 7 of which were variants of medium-size surface combatants (MSC, 21,000-26,000 MT). The report to Congress was based on a cost analysis and operational effectiveness analysis of these variants. The conclusions drawn were only based on the ship variants they developed and not on a representative sample of the feasible, non-dominated designs in the design space.

This thesis revisits the Alternative Propulsion Study results for a MSC, which were constrained by the inability of the Navy's design tools to adequately search the full design space. This thesis will also assess automated methods to improve the APS approach, and examine a range of power generation alternatives using realistic operational profiles and requirements to develop a notional medium surface combatant (CGXBMD). It is essential to base conclusions on the non-dominated design space, and this new approach will use a multi-objective optimization to find non-dominated designs in the specified design space and use new visualization tools to assess the characteristics of these designs. This automated approach and new tools are evaluated in the context of the revisited study.
Master of Science

APA, Harvard, Vancouver, ISO, and other styles
23

Purton, I. M. "Concept exploration for a novel submarine concept using innovative computer-based research approaches and tools." Thesis, University College London (University of London), 2016. http://discovery.ucl.ac.uk/1505782/.

Full text
Abstract:
The concept of an Unmanned Underwater Vehicle (UUV) “Mothership” submarine (designated Submersible Ship Host (Nuclear), SSH(N)) has already been explored at UCL using the Design Building Block approach by Pawling and Andrews (2011). This thesis builds upon that study, further investigating the design of a large mother-ship submarine. The incorporation of a novel technology such as UUVs into submarines suggests that the traditional evolutionary approach to concept exploration for new submarine designs is questionable. A novel approach to exploring novel SSH(N) concepts within the design solution space has been investigated in this thesis. The significance of incorporating UUVs into submarine design has been explored by conducting an Operational Analysis (OA) of the mix of UUVs required to support a range of scenarios. This OA gave a coherent justification for a mixed and significant total displacement of UUVs as the main payload for SSH(N)s. A MATLAB computer program, Submarine Preliminary Exploration of Requirements by Blocks (SUPERB), has been produced to generate and assess submarine concept designs. SUPERB also uses a novel generic arrangement approach called “Compartment X-Listing”, which systematically allocates compartments within the pressure hull and then compares individual concept-level submarine designs to typical existing arrangements. Validation of SUPERB and Compartment X-Listing is presented and discussed using two existing submarine designs and two radical concept design proposals. A novel approach of modifying a nominal Pareto Front representation for complex novel designs, called the Notional Pareto Front (NPF), has been used with SUPERB to generate designs and is considered to be an innovation in marine design practice. The NPF approach seeks to bound the solution space and focus concept exploration on a smaller region. 
This is seen to have the potential to inform an extensive early stage exploration of the design solution space, as a research approach for future concept level investigations, such as for SSH(N)s. Recommendations are made as to how this design approach may be taken forward.
APA, Harvard, Vancouver, ISO, and other styles
24

Papapavlou, Konstantinos. "Petrochronology and mineral chemistry of mid-crustal shear zones : new tools for tectonics and mineral exploration." Thesis, University of Portsmouth, 2017. https://researchportal.port.ac.uk/portal/en/theses/petrochronology-and-mineral-chemistry-of-midcrustal-shear-zones(e59893a5-5079-43b8-8dfd-b83e207b5097).html.

Full text
Abstract:
Dating ductile shear zones is daunting because we have to demonstrate either that the chronometer of choice grew during shear zone operation or that crystal-plastic deformation induced age resetting. By adopting a petrochronological approach in this project, combining petrographic, geochemical, U-Pb isotopic, and quantitative microstructural data, U-Pb isotopic dates are linked with certain shear zone processes. The study area is the South Range of the world-class Sudbury Impact Structure. Specifically, mylonitic shear zones at the Creighton Mine (South Range, Sudbury) operated during three distinct tectonothermal events at ca. 1.75 Ga, 1.65 Ga, and 1.45 Ga. The age dating of texturally and geochemically characterised titanite grains from a shear zone exposed at the 5400 level of the Creighton Mine indicates operation of the shear zone during the Mazatzalian – Labradorian orogeny (1.7 – 1.6 Ga). Meso-scale sulphide structures of mechanical remobilization, within the main body of the examined shear zone, show that this event facilitated the local-scale transfer of sulphides to satellite positions. Three age populations of ca. 1.75 Ga, 1.65 Ga, and 1.45 Ga are also prevalent in shear zones from deeper levels of the Creighton Mine. These age populations yield new insights into the orogenic history of the South Range and the Southern Province, and provide further constraints on the comparison of accretionary provinces of the North American Mid-continent and the Southwest United States. Taking into consideration the fluid-mediated and crystal-plasticity textural features in the examined titanite populations, it is suggested that these dates record events of syndeformational fluid percolation. Inclusions of inherited titanite grains with shock-metamorphic features survived within the 1.75 Ga textural population of titanite grains. 
Microstructural and micro- to nano-scale crosscutting relationships suggest that the shock wave during the 1.85 Ga impact event induced in these grains the growth of 75°/<010> and 108°/<010> shock microtwins. The nucleation of twins induced a work hardening effect that allowed their survival during the later polyorogenic reworking of the basin (1.75 to 1.45 Ga). U-Pb age dating of these grains accurately yields the age of impact (i.e. 1851 ± 12 Ma). In comparison, titanite grains located within Archaean target rocks of the Vredefort structure show identical crystallographic features and partial age resetting. The differential response is attributed to the different distance of the samples from the base of the impact melt sheet that was the dominant heat source. The ore-controlling character of the examined shear zones in the Sudbury mining camp can provide critical information about the exploration potential of these structures in metallogenetic settings. Preliminary mineral-chemical analysis, from major to trace element level, of fabric-forming silicates shows distinct trends in the abundance of pathfinder elements (e.g. transition metals). Further work, collating the different datasets using multivariable statistical methods, will be pursued in order to untangle the vectoring potential of different elements.
APA, Harvard, Vancouver, ISO, and other styles
25

Brannon, Brittany Ann. "Faulty Measurements and Shaky Tools: An Exploration into Hazus and the Seismic Vulnerabilities of Portland, OR." PDXScholar, 2013. https://pdxscholar.library.pdx.edu/open_access_etds/1410.

Full text
Abstract:
Events or forces of nature with catastrophic consequences, or "natural disasters," have increased in both frequency and force due to climate change and increased urbanization in climate-sensitive areas. To create capacity to face these dangers, an entity must first quantify the threat and translate scientific knowledge on nature into comprehensible estimates of cost and loss. These estimates equip those at risk with knowledge to enact policy, formulate mitigation plans, raise awareness, and promote preparedness in light of potential destruction. Hazards-United States, or Hazus, is one such tool created by the federal government to estimate loss from a variety of threats, including earthquakes, hurricanes, and floods. Private and governmental agencies use Hazus to provide information and support to enact mitigation measures, craft plans, and create insurance assessments; hence the results of Hazus can have lasting and irreversible effects once the hazard in question occurs. This thesis addresses this problem and sheds light on the obvious and deterministic failings of Hazus in the context of the probable earthquake in Portland, OR; stripping away the tool's black box and exposing the grim vulnerabilities it fails to account for. The purpose of this thesis is twofold. First, this thesis aims to examine the critical flaws within Hazus and the omitted vulnerabilities particular to the Portland region and likely relevant in other areas of study. Second and more nationally applicable, this thesis intends to examine the influence Hazus outputs can have in the framing of seismic risk by the non-expert public. Combining the problem of inadequate understanding of risk in Portland with the questionable faith in Hazus alludes to a larger, socio-technical situation in need of attention by the academic and hazard mitigation community. 
This thesis addresses those issues in scope and adds to the growing body of literature on defining risk, hazard mitigation, and the consequences of natural disasters to urban environments.
APA, Harvard, Vancouver, ISO, and other styles
26

Ahmad, Abbas. "Model-Based Testing for IoT Systems : Methods and tools." Thesis, Bourgogne Franche-Comté, 2018. http://www.theses.fr/2018UBFCD008/document.

Full text
Abstract:
The Internet of Things (IoT) is nowadays globally a means of innovation and transformation for many companies. Applications extend to a large number of domains, such as smart cities, smart homes, healthcare, etc. The Gartner Group estimates an increase up to 21 billion connected things by 2020. The large span of "things" introduces problematic aspects, such as conformance and interoperability, due to the heterogeneity of communication protocols and the lack of a globally accepted standard. The large span of usages introduces problems regarding secure deployments and scalability of the network over large-scale infrastructures. This thesis deals with the problem of the validation of the Internet of Things to meet the challenges of IoT systems. For that, we propose an approach using the generation of tests from models (Model-Based Testing, MBT). We have confronted this approach through multiple experiments using real systems, thanks to our participation in international projects. The important effort that needs to be placed on the testing aspects reminds every IoT system developer that doing nothing is more expensive later on than doing it as you go.
APA, Harvard, Vancouver, ISO, and other styles
27

Snook, Benjamin Richard. "Towards exploration tools for high purity quartz : an example from the South Norwegian Evje-Iveland Pegmatite Belt." Thesis, University of Exeter, 2014. http://hdl.handle.net/10871/14884.

Full text
Abstract:
High Purity Quartz (HPQ; quartz containing less than 50 ppm trace elements) is of increasing economic significance due to its use in certain high-tech components (computer chip/semiconductor manufacture) and in green technologies (silicon wafer production). Current HPQ deposits (hydrothermal veins/leuco-granites/alaskites) are rare and volumetrically small. Unless significant new deposits are found, increasing demand will raise its prices, elevating the strategic nature of this limited commodity. The large volumes and simple mineralogy of pegmatites and the high chemical purity of their constituents make them an attractive target for HPQ. PhD studies are being carried out on quartz from the Evje-Iveland pegmatite field of the Bamble-Evje pegmatite cluster, southern Norway. The area was targeted due to its well constrained geological setting and previously identified potential for HPQ. The aim of the investigation is to develop exploration tools for HPQ by determining the genetic history of the pegmatites and mode of HPQ formation. The study is focussing on 7 pegmatites and their country rocks. Each shows typical pegmatite zonation, with quartz/feldspar intergrowths at the margins, a massive quartz core and a variety of accessory (including REE-bearing) phases. The proximal Høvringsvatnet granite was previously suggested to have supplied late-stage, highly fractionated melts to form the pegmatites. However, from their trace element systematics (no relationship was observed between trace element content and degree of fractionation in each pegmatite body), and a difference in U/Pb age of approximately 70 Ma, the pegmatites cannot be related to the granites. From field evidence (corroborated by geochemical modelling) the pegmatites formed by ‘in situ’ anatexis of country rocks; some locally, some from distal sources. Some pegmatites contain brecciated feldspar and replacement quartz. 
From LA-ICP-MS analyses, hydrothermal quartz, compared with magmatic quartz, typically contains lower quantities of trace elements. Hydrothermal material shows relatively elevated levels of Al and Li, low Ge and a complete absence of Ti, indicating relatively low temperature hydrothermal formation. Different quartz domains (from SEM-CL imaging) show distinct δ18O values; late stage low trace element zones show values consistent with meteorically derived fluids. In situ LA-ICP-MS studies will provide further information about the characteristics of the fluids which have replaced/refined magmatic quartz to form HPQ. This beneficiation process is a potential mechanism for the generation of economically significant HPQ deposits.
APA, Harvard, Vancouver, ISO, and other styles
28

Nicholas, Paul. "Approaches to Interdependency: early design exploration across architectural and engineering domains." RMIT University. Architecture and Design, 2008. http://adt.lib.rmit.edu.au/adt/public/adt-VIT20081204.151243.

Full text
Abstract:
While 3D digital design tools have extended the reach of architectural and engineering designers within their own domains, restrictions on the use of the tools and an approach to practice whereby the architect designs (synthesises) and the engineer solves (analyses) - in that order - have limited the opportunities for interdependent modes of interaction between the two disciplines during the early design phase. While it is suggested that 3D digital design tools can facilitate a more integrated approach to design exploration, this idea remains largely untested in practice. The central proposition of my research is that 3D digital tools can enable interdependencies between crucial aspects of architectural and engineering design exploration during the early design phase which, before the entry of the computer, were otherwise impossible to effect. I define interdependency as a productive form of practice enabled by mutual and lateral dependence. Interdependent parties use problem solving processes that meet not only their own respective goals, but also those of others, by constructively engaging difference across their boundaries to actively search for solutions that go beyond the limits of singular domains. Developed through practice-based project work undertaken during my 3-year postgraduate internship within the Melbourne, Australia office of the engineering firm Arup, my research explores new and improved linkages between early design exploration, analysis and making. The principal contribution of my research is to explore this problem from within the context, conditions and pressures of live practice. To test the research proposition this dissertation engages firstly with available literature from the fields of organisation theory and design, secondly with information gathered from experts in the field principally via interview, and lastly with processes of testing through practice-based (as opposed to university-based) project work. 
The dissertation is organized as follows: The Introductory Chapter outlines the central hypothesis, the current state of the discourse, and my motivations for conducting this research. I summarise the structure of my research, and the opportunities and limitations that have framed its ambitions. Chapter Two, Approach to Research and Method, details the constraints and possibilities of the Embedded Research within Architectural Practice context, within which this work has been undertaken, and describes the Melbourne office of Arup, the practice with whom I have been embedded. These contexts have led to the selection of a particular set of ethnographic research instruments, being the use of semi-structured interviews and the undertaking of practice-based studies as a participant-observer. These modes of testing are explained, and the constraints, limitations and requirements associated with them described. Within Chapter Three, Factors for Separation and Integration in Architectural and Engineering Design, I examine selected design literature to detail several factors impacting upon the historic and contemporary relationship between architects and engineers, and to introduce the problem towards which this thesis is addressed. I describe a process of specialisation that has led architects and engineers to see different aspects of a common problem, detail the historical factors for separation, the current relationship between domains and the emerging idea of increased integration during the early design phase. The aim of this section is primarily contextual - to introduce the characters and to understand why their interaction can be difficult - and investigation occurs through the concepts of specialisation and disciplinary roles. Chapter Four, Unravelling Interdependency, establishes an understanding of interdependency through the concept of collaboration. 
While I differentiate interdependency from collaboration because of the inconsistent manner in which the latter term is employed, the concept of collaboration is useful to initialise my understanding of interdependency because it, as opposed to the closely linked processes of cooperation and coordination, is recognised as being characterised by interdependency, and in fact is viewed as a response specific to wider conditions of interdependency. From the literature, I identify four sites of intersection crucial to an understanding of interdependency; these are differing perceptions, shared and creative problem solving, communication and trust. These themes, which correlate with my practice experience at Arup Melbourne, are developed to introduce the concepts and vocabulary underlying my research. Chapter Five, Intersections & Interdependency between Architects and Engineers, grounds these four sites of intersection within contemporary issues of digital architectural and engineering practice. Each site is developed firstly through reference to design literature and secondly through the experiences and understandings of senior Arup practitioners as captured through my interviews. The views and experiences of these practitioners are used to locate digital limits to, and potential solutions for, interdependent design exploration between architects and engineers as they are experienced within and by practice. Through this combination of design literature and grounded experience, I extend:
* the understanding of differing perceptions through reference to problems associated with digital information transfer.
* the understanding of joint and creative problem solving by connecting it to the notion of performance-based design.
* the understanding of communication by focussing it upon the idea of back propagating design information.
* the understanding of trust by connecting it to the management and reduction of perceived complexity and risk.
Chapter Six, Testing through Projects, details the project studies undertaken within this research. These studies are grouped into three discourses, characterized as Design(Arch)Design(Eng), Design|Analysis and Design|Making. As suggested by the concurrency operator that separates the two terms that constitute each of the three labels, each discourse tests how architectural and engineering explorations might execute in parallel. The section Design(Arch)|Design(Eng) reports projects that use a common language of geometry to link architectural and engineering design ideas through geometric interpretation. The section Design|Analysis reports projects in which analytical tools have been used generatively to actively guide and synthesise design exploration. The final section, Design|Making, reports projects in which the architectural and engineering design processes are synthesised around the procurement of fabrication information. Conclusions are then drawn and discussed in Chapter Seven. In evaluating the research I discuss how 3D digital design tools have enabled alternative approaches that resolve issues associated with differing perceptions, establishing common meanings, communication and trust. I summarise how these approaches have enabled increased interdependency in architect engineer interaction. Lastly, I draw together the impacts of intersecting 3D digital aspects of architectural and engineering design exploration during the early design phase, and indicate those aspects that require further analysis and research.
APA, Harvard, Vancouver, ISO, and other styles
29

Wood, Lisa Jane. "Social capital, neighbourhood environments and health : development of measurement tools and exploration of links through qualitative and quantitative research." University of Western Australia. School of Population Health, 2006. http://theses.library.uwa.edu.au/adt-WU2006.0111.

Full text
Abstract:
[Truncated abstract] BACKGROUND This thesis explored the relationship between social capital, sense of community and mental health and wellbeing; and factors that may influence these within the environments in which people live. Area variations in health are well documented and are mirrored in emerging evidence of geographic and neighbourhood variations in social capital. Little is known, however, about the specific facets of the impact of local physical environment on social capital; or about the mechanisms by which these are linked with each other, and with health determinants and outcomes. Despite the recent proliferation of social capital literature and growing research interest within the public health realm, its relationship to mental health and protective factors for mental health have also been relatively unexplored. AIMS The overall aim of this thesis was to explore the potential associations between social capital, health and mental health, and neighbourhood environments. In particular, the thesis considered whether the physical attributes and street network design of neighbourhoods are associated with social capital or particular dimensions of the social capital construct. It also examined the relationship between social capital and demographic and residency factors and pet ownership ... CONCLUSION The combined use of qualitative and quantitative research is a distinguishing feature of this study, and the triangulation of these data has a unique contribution to make to the social capital literature. Studies concerned with the measurement of social capital to date have tended to focus on dimensions pertaining to people’s involvement, perceptions and relationship with others and their community. While these constructs provide insight into what comprises social capital, it is clear that each is in turn influenced by a range of other factors. 
Elucidating what fosters trust and neighbourly interactions in one community and not in another, and by what mechanisms, is one of many research questions unanswered in the published literature to date. The consideration of measures of social capital that relate to the physical environment is therefore of relevance to the growing research and public policy interest in identifying what might build or restore social capital in communities.
APA, Harvard, Vancouver, ISO, and other styles
30

Wlodarczyk, Radoslaw Stanislaw [Verfasser], Joachim [Akademischer Betreuer] Sauer, Marek [Akademischer Betreuer] Sierka, and Bernd [Akademischer Betreuer] Hartke. "Surface structure predictions and development of global exploration tools / Radoslaw Stanislaw Wlodarczyk. Gutachter: Joachim Sauer ; Marek Sierka ; Bernd Hartke." Berlin : Mathematisch-Naturwissenschaftliche Fakultät, 2015. http://d-nb.info/1071596489/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
31

Spaulding, Timothy J. (Timothy James) 1979. "Tools for evolutionary acquisition : a study of Multi-Attribute Tradespace Exploration (MATE) applied to the Space Based Radar (SBR)." Thesis, Massachusetts Institute of Technology, 2003. http://hdl.handle.net/1721.1/82703.

Full text
Abstract:
Thesis (S.M.)--Massachusetts Institute of Technology, Dept. of Aeronautics and Astronautics, 2003.
This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections.
Statement of responsibility on t.p. reads: 2nd Lieutenant Timothy J. Spaulding, USAF.
Includes bibliographical references (p. 139-142).
by Timothy J. Spaulding.
S.M.
APA, Harvard, Vancouver, ISO, and other styles
32

López, Baeza Jesús. "Unveiling urban dynamics: An exploration of tools and methods using crowd-sourced data for the study of urban space." Doctoral thesis, Universidad de Alicante, 2020. http://hdl.handle.net/10045/108227.

Full text
Abstract:
The following work presents several trans-disciplinary resources for understanding cities beyond just their physical form and spatial processes. The conceptualization of cities from a top-down, modern and post-modern approach to the form-function duality lacks multiple dimensions, which need to be studied in order to gain a proper understanding of how contemporary urban societies perform nowadays. Instead, this work considers settlements as a set of an infinite number of individual perceptions and experiences, which construct overlapping layers of hidden and intangible information that shape cities as complex systems. Social relations that are moving progressively to the virtual realm are becoming major factors in decision-making and location choices by citizens. This definition of a city’s hidden image is developed through the study of data retrieved from online servers. To do so, this work focuses on spatial and temporal activity patterns, values of certain places and their quantitative weight within the urban fabric, the distribution and nature of places, the observation of people’s perception of certain places through the representation of activities captured by pictures posted online, or several other theoretical and methodological approaches under the umbrella of crowd-sourced data in the city.
APA, Harvard, Vancouver, ISO, and other styles
33

Lagraa, Sofiane. "New MP-SoC profiling tools based on data mining techniques." Thesis, Grenoble, 2014. http://www.theses.fr/2014GRENM026/document.

Full text
Abstract:
Miniaturization of electronic components has led to the introduction of complex electronic systems integrated onto a single chip with multiple processors, so-called Multi-Processor Systems-on-Chip (MPSoC). The majority of recent embedded systems are based on massively parallel MPSoC architectures, hence the necessity of developing embedded parallel applications. Embedded parallel application design is increasingly challenging: it amounts to parallel programming for non-trivial heterogeneous multiprocessors with diverse communication architectures and design constraints such as hardware cost, power, and timeliness. A challenge faced by many developers is profiling embedded parallel applications so that they can scale over more and more cores. This is especially critical for embedded systems powered by MPSoCs, where ever more demanding applications have to run smoothly on numerous cores, each with a modest power budget. Moreover, application performance does not necessarily improve as more cores are added; it can be limited by multiple bottlenecks, including contention for shared resources such as caches and memory. It becomes time-consuming for a developer to pinpoint in the source code the bottlenecks that decrease performance. To overcome these issues, in this thesis we propose three fully automatic methods that detect the source-code instructions responsible for performance loss due to contention and limited scalability of the processors on a chip. The methods are based on data mining techniques exploiting gigabytes of low-level execution traces produced by MPSoC platforms. Our profiling approaches automatically quantify and pinpoint the bottlenecks in the source code in order to help developers optimize their embedded parallel applications. We performed several experiments on parallel application benchmarks.
Our experiments show the accuracy of the proposed techniques, quantifying and pinpointing the hotspots in the source code precisely.
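The core idea described above - attributing aggregated cost from low-level trace events back to source locations - can be illustrated with a toy sketch. The event format, field names, and frequency-based ranking here are illustrative assumptions, not the thesis's actual trace schema or mining algorithm:

```python
from collections import Counter

def pinpoint_hotspots(trace, top_k=3):
    """Rank source locations by total stall cycles attributed to them.

    `trace` is a list of (source_location, stall_cycles) events, a tiny
    stand-in for the gigabyte-scale low-level traces the thesis mines.
    """
    cost = Counter()
    for location, stall_cycles in trace:
        cost[location] += stall_cycles
    return cost.most_common(top_k)

# hypothetical trace events from a multi-core run
trace = [("main.c:42", 120), ("fft.c:17", 900),
         ("fft.c:17", 870), ("main.c:42", 95), ("util.c:8", 10)]
print(pinpoint_hotspots(trace, top_k=2))
# [('fft.c:17', 1770), ('main.c:42', 215)]
```

A real tool would compare such rankings across runs with different core counts to separate contention from scalability bottlenecks.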
APA, Harvard, Vancouver, ISO, and other styles
34

Cook, Arica B. "The Effect that Child Neglect has on the Trafficking of Minors: An Exploration into the Gaps Between Victim Identification and Precursory Events." Youngstown State University / OhioLINK, 2021. http://rave.ohiolink.edu/etdc/view?acc_num=ysu1620329807410116.

Full text
APA, Harvard, Vancouver, ISO, and other styles
35

Serim, Baris. "Designing for Engagement: Using indirect manipulation to support form exploration in 3D modeling." Thesis, Malmö högskola, Fakulteten för kultur och samhälle (KS), 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:mau:diva-22790.

Full text
Abstract:
This thesis studies design possibilities for supporting explorative form-finding in 3D modeling applications. In many of today’s design professions, 3D forms are achieved partly through engagement with digital environments. The use of software has far exceeded final idea execution, extending to the early phases of design work in which the outcome is not predetermined. This insight has led designers of interactive systems to support sketching and ideation activities by reducing the risk of experimentation and the cognitive effort demanded of the user. Yet there has been less emphasis on traditional design and craft practice, which acknowledges engagement with materials and effort spent on work as an integral part of the creative process. The notion of exploration in the scope of this thesis attempts to incorporate such aspects. Relevant literature on workshop practice in design and craft has been reviewed, as well as examples of CAD technologies that aid designers. In this light, HCI perspectives on the design of creativity-support tools and games are discussed. The thesis work aimed to concretize this background by building a design strategy and an interactive artifact: “kfields”, a 3D form-finding application concept that uses objects in the modeling space to indirectly manipulate geometry, was developed and evaluated with users at various stages. The thesis concludes by reflecting on the findings of the different design stages and proposing further directions for design.
APA, Harvard, Vancouver, ISO, and other styles
36

Dhawi, Fahad A. "Redesigning Arabic learning books : an exploration of the role of graphic communication and typography as visual pedagogic tools in Arabic-Latin bilingual design." Thesis, University of the Arts London, 2017. http://ualresearchonline.arts.ac.uk/13472/.

Full text
Abstract:
What are ‘educational typefaces’ and why are they needed today? Do Arabic beginners need special typefaces that can simplify learning further? If so, what features should they have? Research findings on the complexity of learning Arabic confirm that the majority of language textbooks and pedagogic materials create challenging learning environments owing to poor book design, text-heavy content and the restricted amount of visuals used. The complexity of the data and the insufficient design quality of the learning materials reviewed in this practice-based research demand serious thought toward simplification, involving experts in the fields of graphic communication, learning and typeface design. The study offers solutions to some of the problems that arise in designing language-learning books by reviewing selected English-language learning and information-design books and methods of guidance for developing uniform learning material for basic Arabic. Key findings from this study confirm the significant role of Arabic designers and educators in the production of efficient and effective learning materials. Their role involves working closely with Arabic instructors, mastering good language skills and being aware of the knowledge available, as well as selecting legible typefaces with distinct design characteristics to help fulfil the various objectives of each learning unit. This study raises awareness of the need for typefaces that make Arabic easier to learn within a globalized world. The absence of such typefaces led to the exploration of simplified twentieth-century Arabic typefaces that share a similar aim of facilitating reading and writing and resolving script and language complexity issues. This study traces their historical context and examines their functional, technical and aesthetic features in order to incorporate their thinking and reassign them as learning tools within the right context.
The final outcome is an experimental bilingual Arabic-English language book series for Arab and non-Arab adult beginners. The learning tools used to create the book series were tested through workshops in Kuwait and London to measure their level of simplification and accessibility. The workshops confirmed both the accessibility of the tools and incompatibilities within different areas of the books’ learning material, and they helped improve the final outcome of the practice. The tools established the significant role of educational typefaces, bilingual design and graphic communication within visual Arabic learning.
APA, Harvard, Vancouver, ISO, and other styles
37

Toledo, Zambrano Tania Andrea [Verfasser], Charlotte [Akademischer Betreuer] Krawczyk, Philippe [Akademischer Betreuer] Jousset, Charlotte [Gutachter] Krawczyk, Hansruedi [Gutachter] Maurer, Maren [Gutachter] Brehme, and Philippe [Gutachter] Jousset. "Seismological tools for geothermal exploration and monitoring / Tania Andrea Toledo Zambrano ; Gutachter: Charlotte Krawczyk, Hansruedi Maurer, Maren Brehme, Philippe Jousset ; Charlotte Krawczyk, Philippe Jousset." Berlin : Technische Universität Berlin, 2021. http://d-nb.info/1238140815/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
38

Zalila, Faiez. "Methods and tools for the integration of formal verification in domain-specific languages." PhD thesis, Toulouse, INPT, 2014. http://oatao.univ-toulouse.fr/14159/1/zalila.pdf.

Full text
Abstract:
Domain-Specific Modeling Languages (DSMLs) are increasingly used in the early phases of the development of complex systems, in particular safety-critical systems. The goal is to be able to reason on these models early in the development and, in particular, to carry out verification and validation (V&V) activities. A widely used technique is exhaustive behavioral model verification using model checking: a translational semantics is defined to build a formal model from DSML-conforming models, in order to reuse the powerful tools available for the formal domain. Defining a translational semantics, expressing the formal properties to be assessed and analysing the verification results require such expertise in formal methods that they restrict adoption and may discourage designers. It is thus necessary to build, for each DSML, a toolchain that hides the formal aspects from DSML end-users. The goal of this thesis is to ease the development of such verification toolchains. Our contribution includes: 1) expressing behavioral properties at the DSML level by relying on TOCL (Temporal Object Constraint Language), a temporal extension of OCL; 2) an automated transformation of these properties into formal properties, reusing the key elements of the translational semantics; 3) feeding verification results back thanks to a higher-order transformation and a language that defines mappings between the DSML and formal levels; 4) the implementation of the associated process. Our approach was validated by experimentation on a subset of the development process modeling language SPEM and on the Ladder Diagram language used to specify programmable logic controllers (PLCs), and by the integration of a formal intermediate language (FIACRE) into the verification toolchain. This last point reduces the semantic gap between DSMLs and formal domains.
APA, Harvard, Vancouver, ISO, and other styles
39

Brands, Christian. "Scenario-based strategic planning and strategic management in family firms." Doctoral thesis, Saechsische Landesbibliothek- Staats- und Universitaetsbibliothek Dresden, 2013. http://nbn-resolving.de/urn:nbn:de:bsz:14-qucosa-125931.

Full text
Abstract:
This cumulative dissertation covers the concepts of scenario-based strategic planning and strategic management in family firms across five articles. The first article gives an overview of the dissertation, explaining its research gap, approach and contribution. It highlights the two research areas covered: two articles focus on scenario-based strategic planning and two on strategic management in family firms. The second article is the first of the two focusing on scenario-based strategic planning. It introduces and describes a set of six tools facilitating the implementation of scenario-based strategic planning in corporate practice. The third paper adapts these tools to the financial management and controlling context in private companies, highlighting the tools’ flexibility in managing uncertain and volatile environments. The fourth article is the first of the two focusing on strategic management in family firms. It analyzes organizational ambidexterity as a factor explaining family firm performance and shows that a high level of organizational ambidexterity in family firms leads to higher family firm performance. The final paper concludes the dissertation by examining the tendency of family firms to focus on capability exploration or resource exploitation across the generations managing the family firm.
APA, Harvard, Vancouver, ISO, and other styles
40

Huynh, Hiep Xuan. "Interestingness measures for association rules in a KDD process : postprocessing of rules with ARQAT tool." Nantes, 2006. http://www.theses.fr/2006NANT2110.

Full text
Abstract:
This work takes place in the framework of Knowledge Discovery in Databases (KDD), often called "data mining". This domain is both a main research topic and an application field in companies. KDD aims at discovering previously unknown and useful knowledge in large databases. In the last decade many papers have been published about association rules, which are frequently used in data mining. Association rules, which express implicative tendencies in data, have the advantage of being an unsupervised model but, in return, often deliver a large number of rules. As a consequence, a postprocessing task is required to help the user understand the results. One way to reduce the number of rules - to validate or to select the most interesting ones - is to use interestingness measures adapted to both the user's goals and the dataset studied. Selecting the right interestingness measures is an open problem in KDD: many measures have been proposed, and many authors have introduced interestingness properties for selecting a suitable measure for a given application, yet measures adequate for one application may not suit another. In this thesis, we study the set of interestingness measures available in the literature in order to evaluate their behavior according to the nature of the data and the preferences of the user. The final objective is to guide the user's choice towards the measures best adapted to his or her needs and, ultimately, to select the most interesting rules. For this purpose, we propose a new approach implemented in a new tool, ARQAT (Association Rule Quality Analysis Tool), to facilitate the analysis of the behavior of about 40 interestingness measures. In addition to elementary statistics, the tool allows a thorough analysis of the correlations between measures using correlation graphs based on the coefficients proposed by Pearson, Spearman and Kendall.
These graphs are also used to identify clusters of similar measures. Moreover, we carried out a series of comparative studies of the correlations between interestingness measures on several datasets, and discovered a set of correlations that are not very sensitive to the nature of the data used, which we call stable correlations. Finally, 14 graphical and complementary views, structured into 5 levels of analysis - ruleset analysis, correlation and clustering analysis, most-interesting-rules analysis, sensitivity analysis, and comparative analysis - are illustrated to show the interest of both the exploratory approach and the use of complementary views.
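The correlation-graph idea described above - linking measures whose agreement over a ruleset exceeds a threshold - can be sketched briefly. The measure names, values, and the 0.9 threshold below are illustrative assumptions; ARQAT itself also uses Spearman and Kendall coefficients and clusters the resulting graph:

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length value lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return sxy / (sx * sy)

def correlation_graph(measures, tau=0.9):
    """measures: dict mapping a measure name to its values over one ruleset.
    Returns the edge set linking measures with |Pearson r| >= tau."""
    names = sorted(measures)
    return {(a, b) for i, a in enumerate(names) for b in names[i + 1:]
            if abs(pearson(measures[a], measures[b])) >= tau}

# hypothetical values of three measures over the same four rules
measures = {"confidence": [1, 2, 3, 4],
            "lift":       [2, 4, 6, 8.5],
            "support":    [4, 1, 3, 2]}
print(correlation_graph(measures))
# {('confidence', 'lift')}
```

Connected components of this graph are then natural candidates for clusters of redundant measures.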
APA, Harvard, Vancouver, ISO, and other styles
41

Willems, Derk Ludolf. "Tools of care explorations into the semiotics of medical technology /." [Maastricht : Maastricht : Rijksuniversiteit Limburg] ; University Library, Maastricht University [Host], 1995. http://arno.unimaas.nl/show.cgi?fid=5762.

Full text
APA, Harvard, Vancouver, ISO, and other styles
42

Knight, Joseph M. (Joseph McCarty). "Explorations of computer-based design tools for urban design projects." Thesis, Massachusetts Institute of Technology, 1992. http://hdl.handle.net/1721.1/39039.

Full text
Abstract:
Thesis (M.S.)--Massachusetts Institute of Technology, Dept. of Architecture, 1992.
Includes bibliographical references (leaves 117-118).
This thesis is an investigation into issues involving computers, information, and automation in the design of large-scale environments. It is an attempt to understand the issues at the root of developing an "intelligent" design environment that provides tools for handling tasks often too mundane and distracting for sustained design activity. In the process of devising these tools, fundamental issues regarding the elements or objects of design, their characteristics, and the transformations they undergo are revealed in light of the particular capabilities of the computer. This study was undertaken as an attempt to discover these issues for myself by working to create a system of tools on top of an existing computer-aided design program - a personalized design environment. The path of discovery taken is reconstructed in this paper, in part to illustrate some of the pitfalls of dealing with real-world programming tasks, and also to demonstrate the inherent issues involved when attempting such a project. Although the programmed end-product is incomplete and greatly simplifies the true nature of such a design problem, the lessons learned are distilled and clarified to provide a basis for further work and investigation in this field. The first part is a synopsis of issues related to computer-aided design: a historical overview, current applications, ongoing research and forecasts. This section illustrates the foundation of understanding I had when undertaking to develop tools of my own. The second part includes the initial tool concepts and their intended purposes, a discussion of hardware and software platforms, and numerous considerations I was compelled to address while developing the modeling and information-handling components of the design environment.
Part three deals in depth with a sophisticated tool proposal for instantiating urban-type elements to illustrate a possible "realization" of a schematic design; this tool could not be developed on the chosen platform. I have also included some possible scenarios for using the modeling and information-handling tools.
by Joseph M. Knight.
M.S.
APA, Harvard, Vancouver, ISO, and other styles
43

Herrera, Francisco. "A usability study of the "Tksee" software exploration tool." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1999. http://www.collectionscanada.ca/obj/s4/f2/dsk1/tape7/PQDD_0029/MQ47519.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
44

Wilson, Brenda G. "Exploration of mind mapping as an organizational change tool." Thesis, Pepperdine University, 2016. http://pqdtopen.proquest.com/#viewpdf?dispub=10100912.

Full text
Abstract:

Mind mapping is a communication tool that has been around for decades, though it is rarely discussed as a tool for facilitating organizational change. It is possible for this underutilized communication tool and the ever-present challenge of organizational change to work in harmony on a more consistent basis. This exploratory research asked Change Leaders and Change Participants about their current mind mapping usage or experience, and requested their input on the use of mind mapping for organizational change efforts. A total of 76 Change Leaders and 11 self-described Change Participants responded to a virtual data collection process. Overall change readiness was predominantly at the moderate level (37% of Change Leaders and 45% of Change Participants), an encouraging statistic for organizations considering change. Respondents reported that mind mapping is mostly used as a personal tool for organization, planning events, setting goals, and writing papers. Change Leaders who reported using mind mapping professionally (n=20) commented that they used it primarily for communication and collaboration, and for project and systems planning and design. Specific practices included coaching, clarifying objectives, evaluating and monitoring projects, assessing lessons learned, redesigning curricula, realigning resources, setting expectations, objectives and goals, and establishing timelines. One conclusion of the study was that these change practitioners understood that change is inevitable, and they indicated their willingness to actively participate. This makes it important for organizations to capitalize on change participants’ knowledge and enthusiasm to enable successful change and enhance employee well-being.
The study also concluded that attitudes, behaviors, and feelings toward change vary based on the role one plays; Change Leaders can benefit from the efforts of Change Participants by simply respecting their role and knowledge and involving them in the entire process, from planning to implementation. Communication is essential for any change process regardless of the specific tool used, but selecting an appropriate method based on the situation, the message, and the recipients is critical. Using mind mapping as a change management tool specifically designed for certain aspects of organizational change is highly recommended, as it allows for both linear and non-linear communication.

Keywords: mind mapping, organizational change, Organizational Change Readiness Assessment, OCRA, visual communication

APA, Harvard, Vancouver, ISO, and other styles
45

Hahsler, Michael, and Kurt Hornik. "Dissimilarity Plots. A Visual Exploration Tool for Partitional Clustering." Department of Statistics and Mathematics, WU Vienna University of Economics and Business, 2009. http://epub.wu.ac.at/1244/1/document.pdf.

Full text
Abstract:
For hierarchical clustering, dendrograms provide a convenient and powerful visualization. Although many visualization methods have been suggested for partitional clustering, their usefulness deteriorates quickly with increasing dimensionality of the data, and/or they fail to represent structure between and within clusters simultaneously. In this paper we extend (dissimilarity) matrix shading with several reordering steps based on seriation. Both methods, matrix shading and seriation, have been well known for a long time, but only recent algorithmic improvements make seriation usable for larger problems. Furthermore, seriation is applied in a novel stepwise process (within each cluster and between clusters), which leads to a visualization technique that is independent of the dimensionality of the data. A big advantage is that it presents the structure between clusters and the micro-structure within clusters in one concise plot. This not only allows cluster quality to be judged but also makes mis-specification of the number of clusters apparent. We give a detailed discussion of the construction of dissimilarity plots and demonstrate their usefulness with several examples.
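The reordering step at the heart of a dissimilarity plot can be sketched with a greedy nearest-neighbor chain, a cheap stand-in for the seriation algorithms the paper relies on; the 4-object dissimilarity matrix below is invented for illustration:

```python
def greedy_seriation(D):
    """Order objects so adjacent ones are similar; shading D reordered
    by this permutation shows clusters as dark blocks on the diagonal.
    D is a symmetric dissimilarity matrix given as a list of lists."""
    n = len(D)
    # seed the order with the globally closest pair
    i, j = min(((a, b) for a in range(n) for b in range(a + 1, n)),
               key=lambda p: D[p[0]][p[1]])
    order = [i, j]
    while len(order) < n:
        tail = order[-1]
        # append the unplaced object most similar to the current tail
        nxt = min((k for k in range(n) if k not in order),
                  key=lambda k: D[tail][k])
        order.append(nxt)
    return order

# objects 0 and 2 form one tight cluster, 1 and 3 another
D = [[0, 9, 1, 9],
     [9, 0, 9, 1],
     [1, 9, 0, 9],
     [9, 1, 9, 0]]
order = greedy_seriation(D)
shaded = [[D[a][b] for b in order] for a in order]  # reordered matrix
```

After reordering, `shaded` is block-diagonal: both clusters are contiguous, which is exactly the structure matrix shading makes visible.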
Series: Research Report Series / Department of Statistics and Mathematics
APA, Harvard, Vancouver, ISO, and other styles
46

Ishibashi, Ryo. "Cognitive and neural bases of human tool use : Exploration into body and tool representations." Kyoto University, 2012. http://hdl.handle.net/2433/159409.

Full text
APA, Harvard, Vancouver, ISO, and other styles
47

Selin, Magnus. "Efficient Autonomous Exploration Planning of Large-Scale 3D-Environments : A tool for autonomous 3D exploration indoor." Thesis, Linköpings universitet, Artificiell intelligens och integrerade datorsystem, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-163329.

Full text
Abstract:
Exploration is of interest for autonomous mapping and rescue applications using unmanned vehicles. The objective is to explore, without any prior information, all initially unmapped space. We present a system that can perform fast and efficient exploration of large-scale arbitrary 3D environments. We combine frontier exploration planning (FEP) as a global planning strategy with receding-horizon next-best-view planning (RH-NBVP) for local planning. This leads to plans that incorporate information gain along the way but do not get stuck in already explored regions. Furthermore, we make the estimation of potential information gain more efficient through sparse ray-tracing and caching of already estimated gains. The work carried out in this thesis has been published as a paper in Robotics and Automation Letters and presented at the International Conference on Robotics and Automation in Montreal in 2019.
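Frontier exploration planning hinges on detecting frontier cells: free space bordering unknown space. A minimal 2D occupancy-grid sketch follows; the cell encoding and the grid itself are illustrative assumptions, while the thesis operates on 3D octree maps with sparse ray-tracing:

```python
FREE, UNKNOWN, OCCUPIED = 0, 1, 2

def frontier_cells(grid):
    """Return free cells adjacent to unknown space - the frontiers a
    global FEP planner would cluster into exploration goals."""
    rows, cols = len(grid), len(grid[0])
    out = set()
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] != FREE:
                continue
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nr, nc = r + dr, c + dc
                if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == UNKNOWN:
                    out.add((r, c))
                    break
    return out

grid = [[FREE, FREE,     UNKNOWN],
        [FREE, OCCUPIED, UNKNOWN],
        [FREE, FREE,     UNKNOWN]]
print(sorted(frontier_cells(grid)))
# [(0, 1), (2, 1)]
```

A global planner sends the vehicle toward such cells when the local receding-horizon planner finds no remaining gain nearby.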
APA, Harvard, Vancouver, ISO, and other styles
48

Forsström, David. "The use and experience of responsible gambling tools : An explorative analysis of user behavior regarding a responsible gambling tool and the consequences of use." Doctoral thesis, Stockholms universitet, Psykologiska institutionen, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:su:diva-147476.

Full text
Abstract:
Responsible gambling tools are an intervention designed to decrease gambling among individuals with at-risk gambling behavior. Studies have indicated that responsible gambling tools can decrease gambling behavior, but little is known about how this intervention is used by gamblers. The aim of the present thesis was to explore different facets of the use, experience and functions of these tools. Study I used descriptive statistics and latent class analysis (LCA) combined with multinomial regression to explore the use of the responsible gambling tool Playscan among 9,528 gamblers (regular and at-risk gamblers) who had volunteered to use the tool. The functions of the tool had a high rate of initial use but a low rate of repeated use. The LCA identified five user classes. Two of the classes (self-testers and multifunctional users) were defined as high users of the tool and, according to the multinomial regression, had a higher risk of developing gambling problems. The multifunctional users were characterized by extensive use of all the functions, while the other high-usage class made extensive use of the self-test. The three other classes were as follows: those who did not use the tool, those who visited the tool but did not engage with any of its functions, and those who only used the tool’s advice on how to decrease their gambling. Participants’ reasons for use and non-use of the tool were attributed to their degree of need for the tool and its functions. The tool’s most widely used function was the self-test, which investigated the level of negative consequences a user faced due to his or her gambling. Study II was a qualitative study investigating participants’ views and experiences of the tool and their reasons for using it. The study was conducted by interviewing 20 volunteer users of the tool; these semi-structured interviews were analyzed by thematic analysis.
The results showed that the users had a positive attitude towards the tool and understood its purpose. The self-test was the most widely used function in this sample as well. However, the participants’ positive attitude toward the tool did not effectively encourage them to use it; they displayed low use of the tool’s functions. This paradox was explained by a lack of feedback and by the fact that some participants did not understand that they had registered to use the tool. Providing more feedback, and tailoring that feedback to individual users, were seen as ways of bridging the paradox. Study II also found that participants used the gambling website (to which Playscan was linked) in an analogue way, preparing their bets before placing them online. This limited the time they spent on the site and inhibited their use of Playscan. Study III was motivated by the extensive use of the self-test among users in Studies I and II. Its aim was to investigate the psychometric properties of the self-test (known as GamTest) to better understand how it could be used with Playscan in the most efficient way. A total of 2,234 respondents answered the questionnaire, along with instruments measuring depression and anxiety and another instrument measuring problems due to gambling. Factor analysis, parallel analysis, Cronbach’s alpha, and correlations were used to establish the tool’s psychometric properties. The results yielded a three-factor model, excellent reliability, and a high correlation with the Problem Gambling Severity Index (PGSI), supporting the validity of the self-test. The results also indicated that the questionnaire could be effectively shortened. Overall, the studies show that the tool has high initial use and low repeated use, and that the self-test is the most used function. In addition, the self-test had good psychometric properties.
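One of the psychometric quantities from Study III, Cronbach's alpha, has a short closed form that is easy to sketch. The toy item scores below are invented; the actual GamTest analysis also involved factor and parallel analysis:

```python
def cronbach_alpha(items):
    """Cronbach's alpha for k items scored by the same respondents:
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))."""
    k, n = len(items), len(items[0])

    def var(xs):  # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    totals = [sum(item[r] for item in items) for r in range(n)]
    return k / (k - 1) * (1 - sum(var(it) for it in items) / var(totals))

# three respondents answering two perfectly consistent items
print(cronbach_alpha([[1, 2, 3], [1, 2, 3]]))  # → 1.0
```

Values near 1 indicate that the items measure the same underlying construct, which is what "excellent reliability" refers to above.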

At the time of the doctoral defense, the following paper was unpublished and had a status as follows: Paper 3: Submitted.

APA, Harvard, Vancouver, ISO, and other styles
49

Lee, Larix. "Tool-based capture and exploration of software architectural design decisions." Thesis, University of British Columbia, 2009. http://hdl.handle.net/2429/5305.

Full text
Abstract:
Developing software-intensive systems involves making many design decisions, some of which govern the architecture of the system. Since changes to these architectural decisions affect many parts of the system being developed, design decisions pertaining to the system architecture should be documented and the knowledge they contain should be explored. Many researchers and industry practitioners in the software architecture and maintenance communities have identified this need for design decision documentation and exploration. They have proposed design knowledge, rationale and decision representation models, suggested requirements, and determined uses of and challenges to overcome when utilizing software architectural design decisions. Summarizing and integrating the various works of these researchers and industry practitioners would better represent the current state of research in exploring architectural knowledge and documenting design decisions, thereby creating a common foundation for new discoveries to build on. I present a new system-based tool that I developed, called ADDEX, which attempts to unify the current discoveries, models, requirements, and guidelines for design decisions. In addition to integrating these various works, ADDEX is designed to take a holistic approach to decision capture and exploration by explicitly supporting customized decision-capture processes for software development organizations. The tool also provides visualization support to promote a better understanding of the software architecture through several decision visualization aspects. I used ADDEX to acquire and display industry decision sets to demonstrate the ability of the tool-based solution to capture and explore software architectural design decisions.
Combined with industry feedback, the decision sets help evaluate the tool and verify that ADDEX meets the requirements and guidelines described by the researchers and industry practitioners on whose work the integrated solution is based. Feedback from industry provides insight into decision capture and the practical use of decision visualization.
APA, Harvard, Vancouver, ISO, and other styles
50

Flores, Aaron G. (Aaron Guerrero). "Active cooling for electronics in a wireline oil-exploration tool." Thesis, Massachusetts Institute of Technology, 1996. http://hdl.handle.net/1721.1/10937.

Full text
Abstract:
Thesis (Ph. D.)--Massachusetts Institute of Technology, Dept. of Mechanical Engineering, 1996.
Folded drawing in pocket following text.
Includes bibliographical references (leaves 210-212).
by Aaron G. Flores.
Ph.D.
APA, Harvard, Vancouver, ISO, and other styles
