Dissertations / Theses on the topic '3D digital methods and tools'

To see the other types of publications on this topic, follow the link: 3D digital methods and tools.



Consult the top 50 dissertations / theses for your research on the topic '3D digital methods and tools.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online, whenever these are available in the metadata.

Browse dissertations / theses on a wide variety of disciplines and organise your bibliography correctly.

1

Nicholas, Paul. "Approaches to Interdependency: early design exploration across architectural and engineering domains." RMIT University. Architecture and Design, 2008. http://adt.lib.rmit.edu.au/adt/public/adt-VIT20081204.151243.

Full text
Abstract:
While 3D digital design tools have extended the reach of architectural and engineering designers within their own domains, restrictions on the use of the tools and an approach to practice whereby the architect designs (synthesises) and the engineer solves (analyses) - in that order - have limited the opportunities for interdependent modes of interaction between the two disciplines during the early design phase. While it is suggested that 3D digital design tools can facilitate a more integrated approach to design exploration, this idea remains largely untested in practice. The central proposition of my research is that 3D digital tools can enable interdependencies between crucial aspects of architectural and engineering design exploration during the early design phase which, before the entry of the computer, were otherwise impossible to effect. I define interdependency as a productive form of practice enabled by mutual and lateral dependence. Interdependent parties use problem solving processes that meet not only their own respective goals, but also those of others, by constructively engaging difference across their boundaries to actively search for solutions that go beyond the limits of singular domains. Developed through practice-based project work undertaken during my three-year postgraduate internship within the Melbourne, Australia, office of the engineering firm Arup, my research explores new and improved linkages between early design exploration, analysis and making. The principal contribution of my research is to explore this problem from within the context, conditions and pressures of live practice. To test the research proposition this dissertation engages firstly with available literature from the fields of organisation theory and design, secondly with information gathered from experts in the field principally via interview, and lastly with processes of testing through practice-based (as opposed to university-based) project work.
The dissertation is organized as follows: The Introductory Chapter outlines the central hypothesis, the current state of the discourse, and my motivations for conducting this research. I summarise the structure of my research, and the opportunities and limitations that have framed its ambitions. Chapter Two, Approach to Research and Method, details the constraints and possibilities of the Embedded Research within Architectural Practice context, within which this work has been undertaken, and describes the Melbourne office of Arup, the practice with whom I have been embedded. These contexts have led to the selection of a particular set of ethnographic research instruments, being the use of semi-structured interviews and the undertaking of practice-based studies as a participant-observer. These modes of testing are explained, and the constraints, limitations and requirements associated with them described. Within Chapter Three, Factors for Separation and Integration in Architectural and Engineering Design, I examine selected design literature to detail several factors impacting upon the historic and contemporary relationship between architects and engineers, and to introduce the problem towards which this thesis is addressed. I describe a process of specialisation that has led architects and engineers to see different aspects of a common problem, detail the historical factors for separation, the current relationship between domains and the emerging idea of increased integration during the early design phase. The aim of this section is primarily contextual - to introduce the characters and to understand why their interaction can be difficult - and investigation occurs through the concepts of specialisation and disciplinary roles. Chapter Four, Unravelling Interdependency, establishes an understanding of interdependency through the concept of collaboration. 
While I differentiate interdependency from collaboration because of the inconsistent manner in which the latter term is employed, the concept of collaboration is useful to initialise my understanding of interdependency because it, as opposed to the closely linked processes of cooperation and coordination, is recognised as being characterised by interdependency, and in fact is viewed as a response specific to wider conditions of interdependency. From the literature, I identify four sites of intersection crucial to an understanding of interdependency: differing perceptions, shared and creative problem solving, communication, and trust. These themes, which correlate with my practice experience at Arup Melbourne, are developed to introduce the concepts and vocabulary underlying my research. Chapter Five, Intersections & Interdependency between Architects and Engineers, grounds these four sites of intersection within contemporary issues of digital architectural and engineering practice. Each site is developed firstly through reference to design literature and secondly through the experiences and understandings of senior Arup practitioners as captured through my interviews. The views and experiences of these practitioners are used to locate digital limits to, and potential solutions for, interdependent design exploration between architects and engineers as they are experienced within and by practice. Through this combination of design literature and grounded experience, I extend:

* the understanding of differing perceptions through reference to problems associated with digital information transfer;
* the understanding of joint and creative problem solving by connecting it to the notion of performance-based design;
* the understanding of communication by focussing it upon the idea of back-propagating design information;
* the understanding of trust by connecting it to the management and reduction of perceived complexity and risk.
Chapter Six, Testing through Projects, details the project studies undertaken within this research. These studies are grouped into three discourses, characterised as Design(Arch)|Design(Eng), Design|Analysis and Design|Making. As suggested by the concurrency operator that separates the two terms constituting each of the three labels, each discourse tests how architectural and engineering explorations might execute in parallel. The section Design(Arch)|Design(Eng) reports projects that use a common language of geometry to link architectural and engineering design ideas through geometric interpretation. The section Design|Analysis reports projects in which analytical tools have been used generatively to actively guide and synthesise design exploration. The final section, Design|Making, reports projects in which the architectural and engineering design processes are synthesised around the procurement of fabrication information. Conclusions are then drawn and discussed in Chapter Seven. In evaluating the research I discuss how 3D digital design tools have enabled alternative approaches that resolve issues associated with differing perceptions, establishing common meanings, communication and trust. I summarise how these approaches have enabled increased interdependency in architect-engineer interaction. Lastly, I draw together the impacts of intersecting 3D digital aspects of architectural and engineering design exploration during the early design phase, and indicate those aspects that require further analysis and research.
APA, Harvard, Vancouver, ISO, and other styles
2

Hellstadius, Monica. "Implementation of methods and tools." Thesis, KTH, Tillämpad maskinteknik (KTH Södertälje), 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-124255.

Full text
3

McParland, Patrick J. "Software tools to support formal methods." Thesis, Queen's University Belfast, 1989. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.292757.

Full text
4

Ivanova, Ivelina. "Supply chain management tools and methods." Thesis, University of Huddersfield, 2004. http://eprints.hud.ac.uk/id/eprint/4591/.

Full text
Abstract:
In today's business environment, manufacturers need to manage their enterprises as an inseparable part of a supply chain. Key to achieving this is the creation of an extended and integrated information system. In an attempt to find out what needs to be done to improve current supply chain methods and tools, the research project: 1) reviewed the literature to establish current approaches to Supply Chain Management (SCM); 2) identified what tools and methods are available; 3) categorised the current approaches to supply chain management and established a current-practice SCM model; 4) identified the requirements for improved SCM; 5) produced an outline requirements specification for improved SCM. The research has made a number of contributions to knowledge. A literature survey on what SCM involves and what an SCM system is was carried out, leading to the conclusion that existing software systems have not been classified and tested against the criteria of a true SCM system. A survey of existing SCM software solutions provided data for an analysis of what typical SCM applications include and concluded that a comprehensive SCM solution does not currently exist. That conclusion was verified by a survey based on interviews with SCM experts. Three case studies were carried out that looked into different parts of the supply chain and demonstrated the significance of advanced SCM functionality for each of them. The case studies also involved the design and implementation of a supply chain mapping tool and a supplier relationship management tool. Finally, a conceptual specification of an improved SCM system was developed. The research will be of interest to practitioners in the area of SCM who are looking for ideas to improve SCM procedures, namely those looking to implement or further develop an existing software system for SCM. It also suggests ideas for further research, which may be of interest to research students working in the area of SCM.
5

Marchesan Almeida, Gabriel. "Adaptive MPSoC architectures: principles, methods and tools." Thesis, Montpellier 2, 2011. http://www.theses.fr/2011MON20154/document.

Full text
Abstract:
Multiprocessor Systems-on-Chip (MPSoC) offer superior performance while maintaining flexibility and reusability thanks to software-oriented personalisation. While most MPSoCs today are heterogeneous, the better to meet targeted application requirements, homogeneous MPSoCs may become a viable alternative in the near future, bringing other benefits such as run-time load balancing, task migration and dynamic frequency scaling. This thesis relies on a homogeneous NoC-based MPSoC platform developed for exploring scalable and adaptive on-line continuous mapping techniques. Each processor of this system is compact and runs a tiny preemptive operating system that monitors various metrics and is entitled to take remapping decisions through code migration techniques and dynamic frequency scaling. This approach, which endows the architecture with decisional capabilities, permits refining the application implementation at run-time according to various criteria.
6

Davidson, Gordon Westly. "Aircraft NLG shimmy : methods, tools and analysis." Thesis, Imperial College London, 2012. http://hdl.handle.net/10044/1/10554.

Full text
Abstract:
ELGEAR was a UK government/industry partnership created in 2006 to research electrical technology in aircraft landing gear systems. The project designed, built and ground tested electrical technology for a Landing Gear Extension/Retraction System (LGERS) and Nose Wheel Steering (NWS) systems for a 120-passenger aircraft. One thread of this research, and the topic of the present thesis, was to develop analysis methods and tools to investigate the effects of shimmy. Shimmy is the self-excited vibration of a castored wheel assembly. It may be experienced on assemblies such as wheelchairs, shopping trolleys and motorcycles, and it is particularly prevalent on aircraft landing gears. Developed within this report is a generic set of system shimmy requirements to guide future engineers. Several models are developed from first principles, allowing a potential design to be quickly evaluated and damper sizes determined. A detailed multi-body systems model has been created using the software modelling package ADAMS. The model includes a detailed representation of mass and inertia properties, stiffnesses, friction and tyre properties. The report includes a detailed method to determine the tyre parameters compatible with the ADAMS PAC2002 formulation (magic tyre formula). The dynamic model is simulated to investigate the landing gear's behaviour against each requirement and is a significant enhancement over the models developed from first principles. To explore the influence of higher-frequency tyre dynamics, a model incorporating a simplified SWIFT tyre model is generated from first principles. The report concludes with a validation of the methods and tools. Test results are used to validate several aspects of the tyre model, and a comparison between hand-derived models and the detailed model provides a thorough check of modelling practices.
7

Carrasco, Carrillo Tomás. "Methods and tools for the design of RFICs." Doctoral thesis, Universitat de Barcelona, 2013. http://hdl.handle.net/10803/131009.

Full text
Abstract:
Ambient intelligence will be the focus of the next advances in wireless technologies. Hence, the increasing demand for radio frequency (RF) devices and applications represents a challenge not only for technological industries, which must improve their roadmaps, but also for RF engineers, who must design more robust, low-power, small-size and low-cost devices. Regarding communication robustness, differential topologies have acquired considerable relevance in recent years because of their natural immunity to noise and interference. Within this framework, a differential n-port device can still be treated with classical circuit analysis theory by means of Z-, Y-, h-parameters or the S-parameters, which are the most suitable in the radio frequency field. Nevertheless, Bockelman introduced the mixed-mode scattering parameters, which more properly express the differential- and common-mode behaviour of symmetrical devices. Since then, such parameters have been used with a varying degree of success, as will be shown, mainly because of a misinterpretation. This thesis is therefore devoted to extending the theory of mixed-mode scattering parameters and proposes a methodology for analysing such devices. To this end, the simplest case of a two-port device is developed. By solving this simple case, most of the gaps in the current theory are filled. For instance, it allows the characterisation and comparison of symmetric and spiral inductors, which has remained a point of controversy until now. After solving this case, the theory is extended to an n-port device. Another key point in the fast and inexpensive development of radio frequency devices is the advance of fast CAD tools for the analysis and synthesis of passive devices. In the case of silicon technologies, planar inductors have become the most popular shapes because of their integrability.
However, the design of inductors requires deep experience and knowledge, not only of the behaviour of such devices but also of the use of electromagnetic (EM) simulators. Unfortunately, the use of EM simulators consumes a significant amount of time and resources. This thesis is therefore devoted to improving some of the aspects that slow down the synthesis process of inductors. An 'ab initio' technique for the meshing of planar radio frequency and microwave circuits is described. The technique presented can evaluate the losses in the component with high accuracy in just a few seconds, where an electromagnetic simulator would normally take hours. Likewise, a simple bisection algorithm for the synthesis of compact planar inductors is presented. It is based on a set of heuristic rules obtained from the study of the electromagnetic behaviour of these planar devices. Additionally, the design of a single-ended to differential low noise amplifier (LNA) in a CMOS technology is carried out using the methods and tools described.
Radio frequency engineering and microwave technology have reached an unimaginable level of development and are nowadays part of most of our daily activities. Mobile technology has probably developed faster than any other technological advance of the digital era. Today, we can say that the mobility paradigm has been achieved and we have fast internet access wherever we may be with a pocket device. Nevertheless, there are still milestones ahead. It is more than likely that the ambient intelligence paradigm will be the focus of the next advances in wireless technologies. Unlike earlier evolutions of information technology, which never had the explicit objective of changing society but did so as a side effect, visions of ambient intelligence expressly propose to transform society through complete connectivity and computerisation. Hence, the increasing demand for radio frequency (RF) devices and their possible applications represents a challenge not only for technological industries, which must improve their roadmaps, but also for RF engineers, who will have to design more robust, low-power, small-size and low-cost devices. Regarding device robustness, in recent years differential topologies have acquired considerable relevance because of their natural immunity to noise and resistance to interference. Within this framework, a differential n-port device can still be treated as a 2n-port device, and classical circuit analysis theory (i.e., two-port network theory) can be applied through Z-, Y-, h-parameters or the S-parameters, which are more suitable in the radio frequency field.
Even so, Bockelman and Eisenstadt introduced the mixed-mode S-parameters, which more adequately express the differential- and common-mode behaviour of symmetric or asymmetric devices. Since then, these parameters have been used with a varying degree of success, as will be shown, mainly because of a misinterpretation. Accordingly, the first part of this thesis is devoted to extending the theory of mixed-mode S-parameters and proposes a methodology for analysing such devices and circuits. In Chapter 2, the simplest case of a two-port device is developed. By solving this simple case, most of the gaps in the current theory are brought to light. For example, it allows the characterisation and comparison of symmetric and non-symmetric spiral inductors, which have been a point of controversy until now. After solving this case, Chapter 3 extends the theory to an n-port device in which some ports may be single-ended and the rest differential. It is at this point that the duality between standard and mixed-mode S-parameters can be clearly seen and highlighted as a whole. This theory also makes it possible to extend classical amplifier theory when amplifiers are analysed by means of S-parameters. Another key point in the fast and low-cost development of radio frequency devices is the advance of fast CAD tools for the analysis and synthesis of passive devices, especially inductors. These devices appear frequently in radio frequency design because of their great versatility. Although there have been multiple attempts to replace them with external components or circuits, even active ones, in the case of silicon technologies planar inductors have become the most popular shapes because of their integrability.
However, the design of inductors requires deep knowledge and experience, not only of the behaviour of these devices but also in the use of electromagnetic (EM) simulators. Unfortunately, the use of EM simulators consumes a significant amount of time and resources, so the synthesis of inductors currently represents a major drawback. The second part of this thesis is therefore devoted to improving some of the aspects that slow down the inductor synthesis process. In Chapter 4, an 'ab initio' mesh-generation technique for planar radio frequency and microwave coils is described. The technique is based on the analytical study of the current-crowding phenomena that take place inside the component. In this evaluation, no explicit solution of the currents and charges throughout the circuit is required. The number of mesh cells assigned to a given metal strip then depends on the value initially obtained from the analytical study. The technique presented can evaluate the losses in the component with high accuracy in just a few seconds, where an electromagnetic simulator would normally take hours. Likewise, Chapter 5 presents a simple bisection algorithm for the synthesis of compact planar inductors. It is based on a set of heuristic rules obtained from the study of the electromagnetic behaviour of these planar devices, so the number of iterations is kept moderately low. Moreover, to speed up the analysis at each step, a fast planar electromagnetic simulator is used, which exploits the knowledge available about the component being synthesised.
Finally, in Chapter 6, the proposed mixed-mode S-parameter methodology and the CAD tools introduced are used extensively in the design of a single-ended to differential low noise amplifier (LNA) in a standard CMOS technology. The low noise amplifier is one of the key components in a radio frequency receiver, since it tends to dominate the sensitivity and noise figure (NF) of the whole system. Moreover, the characteristics of this circuit are directly related to the active and passive components available in a given technology. Hence, the chosen technology, the quality factor of the passives, and the way they are characterised will have a strong impact on the main figures of merit of the actual circuit.
8

Bylund, Nicklas. "Models, methods and tools for car body development." Luleå, 2002. http://epubl.luth.se/1402-1757/2002/15.

Full text
9

Woksepp, Stefan. "Virtual reality in construction : tools, methods and processes." Doctoral thesis, Luleå : Division of Structural Engineering, Department of Civil, Mining and Environmental Engineering, Luleå University of Technology, 2007. http://epubl.ltu.se/1402-1544/2007/49/.

Full text
10

Pillac, Victor. "Dynamic vehicle routing : solution methods and computational tools." Phd thesis, Ecole des Mines de Nantes, 2012. http://tel.archives-ouvertes.fr/tel-00742706.

Full text
Abstract:
Within the wide scope of logistics management, transportation plays a central role and is a crucial activity in both the production and service industries. Among others, it allows for the timely distribution of goods and services between suppliers, production units, warehouses, retailers, and final customers. More specifically, Vehicle Routing Problems (VRPs) deal with the design of a set of minimal-cost routes that serve the demand for goods or services of a set of geographically spread customers, satisfying a group of operational constraints. While it was traditionally a static problem, recent technological advances provide organizations with the right tools to manage their vehicle fleet in real time. Nonetheless, these new technologies also introduce more complexity in fleet management tasks, unveiling the need for decision support systems dedicated to dynamic vehicle routing. In this context, the contributions of this Ph.D. thesis are threefold: (i) it presents a comprehensive review of the literature on dynamic vehicle routing; (ii) it introduces flexible optimization frameworks that can cope with a wide variety of dynamic vehicle routing problems; (iii) it defines a new vehicle routing problem with numerous applications.
11

Nobrand, Hambeck Erika. "Feasibility study of flow analysis methods and tools." Thesis, KTH, Industriell produktion, 2014. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-154994.

Full text
Abstract:
Scania CV is one of the leading manufacturers of trucks in the world. Besides the truck business, Scania also manufactures buses and industrial engines. A new logistics concept at Scania means a major change regarding material flow to the production lines, and today Scania lacks a tool to analyse these material flows. The underlying hypothesis is that with a tool that could analyse logistic material flows, the design of new logistic flows would be more efficient and of better quality. A case study was carried out with the tool Flow Planner, in parallel with interviews with logistics engineers to find out how they work today. The conclusion is that an analysis tool similar to Flow Planner might have a positive effect on the lead time for changing the design of material flows and could therefore be one link in the chain when striving for higher flexibility. Flow Planner can model all the different Scania standard delivery methods, but the output gives only a part of the information that is needed, for example for the configuration of the train carts. Flow Planner can give an overview of the plant and the traffic flow within it. It can also contribute to more accurate risk analyses concerning traffic congestion in the plant. A weakness of Flow Planner is that forklifts and tugger trains cannot be modelled together. A risk that was exposed during the case study is that errors in the input data will give a result that is not valid. The part of the case study that was performed with data from Oskarshamn only includes the tugger train delivery of units. To get a complete picture of Flow Planner's capabilities and limitations it would be preferable to carry out similar case studies with real data for kit, sequence and batch deliveries as well. Recommendations for continued work are to write a requirement specification that clearly defines the Scania requirements and to evaluate several tools.
12

Hammar, Karl. "Content Ontology Design Patterns : Qualities, Methods, and Tools." Doctoral thesis, Linköpings universitet, Interaktiva och kognitiva system, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-139584.

Full text
Abstract:
Ontologies are formal knowledge models that describe concepts and relationships and enable data integration, information search, and reasoning. Ontology Design Patterns (ODPs) are reusable solutions intended to simplify ontology development and support the use of semantic technologies by ontology engineers. ODPs document and package good modelling practices for reuse, ideally enabling inexperienced ontologists to construct high-quality ontologies. Although ODPs are already used for development, there are still remaining challenges that have not been addressed in the literature. These research gaps include a lack of knowledge about (1) which ODP features are important for ontology engineering, (2) less experienced developers' preferences and barriers for employing ODP tooling, and (3) the suitability of the eXtreme Design (XD) ODP usage methodology in non-academic contexts. This dissertation aims to close these gaps by combining quantitative and qualitative methods, primarily based on five ontology engineering projects involving inexperienced ontologists. A series of ontology engineering workshops and surveys provided data about developer preferences regarding ODP features, ODP usage methodology, and ODP tooling needs. Other data sources are ontologies and ODPs published on the web, which have been studied in detail. To evaluate tooling improvements, experimental approaches provide data from comparison of new tools and techniques against established alternatives. The analysis of the gathered data resulted in a set of measurable quality indicators that cover aspects of ODP documentation, formal representation or axiomatisation, and usage by ontologists. These indicators highlight quality trade-offs: for instance, between ODP Learnability and Reusability, or between Functional Suitability and Performance Efficiency. 
Furthermore, the results demonstrate a need for ODP tools that support three novel property specialisation strategies, and highlight the preference of inexperienced developers for template-based ODP instantiation, neither of which is supported in prior tooling. The studies also resulted in improvements to ODP search engines based on ODP-specific attributes. Finally, the analysis shows that XD should include guidance on developer roles and responsibilities in ontology engineering projects, suggestions on how to reuse existing ontology resources, and approaches for adapting XD to project-specific contexts.
13

Rensfelt, Olof. "Tools and methods for evaluation of overlay networks." Licentiate thesis, Uppsala universitet, Avdelningen för datorteknik, 2007. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-85836.

Full text
Abstract:
Overlay networks are a popular method of deploying new functionality that does not currently exist in the Internet. Such networks often use the peer-to-peer principle, where users act as both servers and clients at the same time. We evaluate how overlay networks perform in a mix of strong and weak peers. The overlay system studied in this thesis is Bamboo, which is based on a distributed hash table (DHT). For the performance evaluation we use both simulations in NS-2 and emulations in the PlanetLab testbed. One of our contributions is an NS-2 implementation of the Bamboo DHT. To simulate nodes joining and leaving, NS-2 is modified to be aware of the identity of overlay nodes. To control experiments on PlanetLab we designed Vendetta, a tool both to visualize network events and to control the individual peer-to-peer nodes on the physical machines. PlanetLab does not support the bandwidth limitations needed to emulate weak nodes, so we designed a lightweight connectivity tool called Dtour. Both the NS-2 and PlanetLab experiments indicate that a system like Bamboo can handle as much as 50% weak nodes and still serve requests, although lookup latency and the number of successful lookups suffer with the increased network dynamics.
APA, Harvard, Vancouver, ISO, and other styles
14

Mallory, Richard Smith. "Tools for explaining complex qualitative simulations /." Digital version accessible at:, 1998. http://wwwlib.umi.com/cr/utexas/main.

Full text
APA, Harvard, Vancouver, ISO, and other styles
15

Törlind, Peter. "Distributed engineering : tools and methods for collaborative product development /." Luleå, 2002. http://epubl.luth.se/1402-1544/2002/32.

Full text
APA, Harvard, Vancouver, ISO, and other styles
16

Ginbayashi, Jun. "Formal methods and tools for systems analysis and design." Thesis, University of Oxford, 1995. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.294381.

Full text
APA, Harvard, Vancouver, ISO, and other styles
17

Novak, Daniel Marcell. "Methods and tools for preliminary low thrust mission analysis." Thesis, University of Glasgow, 2012. http://theses.gla.ac.uk/3338/.

Full text
Abstract:
The aim of the present thesis is to develop new methods useful to a space mission analyst for designing low-thrust trajectories in the preliminary phases of a mission study, where the focus is more on exploring various concepts than on obtaining one optimal transfer. The tools cover three main axes: generating low-thrust trajectories from scratch, improving existing low-thrust trajectories, and exploring large search spaces related to multiple gravity assist transfers. Stress is put on the computational efficiency of the tools. Transfer arcs are generated with shape-based approaches, which can reproduce close-to-optimal transfers satisfying time-of-flight constraints and varied boundary constraints without the need for propagation. This thesis presents a general framework for the development of shape-based approaches to low-thrust trajectory design. A novel shaping method, based on a three-dimensional description of the trajectory in spherical coordinates, is developed within this general framework. Both the exponential sinusoid and the inverse polynomial shaping are demonstrated to be particular two-dimensional cases of the spherical one. The pseudo-equinoctial shaping is revisited within the new framework, and the non-osculating nature of the pseudo-equinoctial elements is analysed. A two-step approach is introduced to solve the time-of-flight constraint, related to the design of low-thrust arcs with boundary constraints, for both spherical and pseudo-equinoctial shaping. The solutions derived from the shaping approach are improved with a feedback linear-quadratic controller and compared against a direct collocation method based on finite elements in time. Theoretical results are given on the validity of the method, and a theorem is derived on the criteria of optimality of the results.
The shaping approaches and the combination of shaping and linear-quadratic controller are tested on four case studies: a mission to Mars, a mission to asteroid 1989ML, to comet Tempel-1 and to Neptune. The design of low-thrust multiple gravity assist trajectories is tackled by an incremental pruning approach. The incremental pruning of reduced search spaces is performed for decoupled pairs of transfer legs, after which regions of the total search space are identified where all acceptable pairs can be linked together. The gravity assists are not powered; therefore the trajectory is purely low thrust, and the transfer arcs are modelled by shaping functions and improved with the linear-quadratic controller. Such an approach can reduce the computational burden of finding a global optimum. Numerical examples are presented for LTMGA transfers from Earth to asteroid Apollo and to Jupiter.
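The exponential sinusoid mentioned above describes the transfer radius as a function of transfer angle, r(θ) = k0·exp(k1·sin(k2·θ + φ)). A minimal sketch of such a shape, pinned to a departure-radius boundary constraint, is given below; the coefficient values are illustrative, not taken from the thesis.

```python
import math

def exp_sinusoid(theta: float, k0: float, k1: float, k2: float, phi: float) -> float:
    """Exponential sinusoid shape: r(theta) = k0 * exp(k1 * sin(k2*theta + phi))."""
    return k0 * math.exp(k1 * math.sin(k2 * theta + phi))

def k0_for_departure(r0: float, k1: float, k2: float, phi: float) -> float:
    """Choose k0 so the shape satisfies the boundary constraint r(0) = r0."""
    return r0 / math.exp(k1 * math.sin(phi))

# Illustrative (assumed) values: depart at 1 AU on a gently winding spiral.
r0, k1, k2, phi = 1.0, 0.15, 0.3, 0.4
k0 = k0_for_departure(r0, k1, k2, phi)
print(exp_sinusoid(0.0, k0, k1, k2, phi))   # equals r0 by construction
```

This is why shaping avoids propagation: boundary constraints become algebraic conditions on the coefficients, and the remaining free parameters can be tuned to meet the time-of-flight constraint.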
APA, Harvard, Vancouver, ISO, and other styles
18

Ahmad, Abbas. "Model-Based Testing for IoT Systems : Methods and tools." Thesis, Bourgogne Franche-Comté, 2018. http://www.theses.fr/2018UBFCD008/document.

Full text
Abstract:
L'internet des objets (IoT) est aujourd'hui un moyen d'innovation et de transformation pour de nombreuses entreprises. Les applications s'étendent à un grand nombre de domaines, tels que les villes intelligentes, les maisons intelligentes, la santé, etc. Le Groupe Gartner estime à 21 milliards le nombre d'objets connectés d'ici 2020. Le grand nombre d'objets connectés introduit des problèmes, tels que la conformité et l'interopérabilité en raison de l'hétérogénéité des protocoles de communication et de l'absence d'une norme mondialement acceptée. Le grand nombre d'utilisations introduit des problèmes de déploiement sécurisé et d'évolution du réseau des IoT pour former des infrastructures de grande taille. Cette thèse aborde la problématique de la validation de l'internet des objets pour répondre aux défis des systèmes IoT. Pour cela, nous proposons une approche utilisant la génération de tests à partir de modèles (MBT). Nous avons confronté cette approche à travers de multiples expérimentations utilisant des systèmes réels grâce à notre participation à des projets internationaux. L'effort important qui doit être fait sur les aspects du test rappelle à tout développeur de système IoT que: ne rien faire est plus cher que de faire au fur et à mesure
The Internet of Things (IoT) is nowadays a global means of innovation and transformation for many companies. Applications extend to a large number of domains, such as smart cities, smart homes, healthcare, etc. The Gartner Group estimates an increase to as many as 21 billion connected things by 2020. The large span of "things" introduces problematic aspects, such as conformance and interoperability, due to the heterogeneity of communication protocols and the lack of a globally accepted standard. The large span of usages introduces problems regarding secure deployments and scalability of the network over large-scale infrastructures. This thesis deals with the problem of validating the Internet of Things to meet the challenges of IoT systems. For that, we propose an approach using the generation of tests from models (MBT). We have confronted this approach through multiple experiments using real systems, thanks to our participation in international projects. The significant effort that must be placed on testing reminds every IoT system developer that doing nothing is more expensive later on than testing as you go
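Model-based testing derives test sequences from a behavioural model instead of hand-writing cases. As a hedged sketch of the idea (the device-registration state machine below is a toy invented for illustration, not one of the thesis's models), abstract test paths can be enumerated by a breadth-first traversal that covers every transition once:

```python
from collections import deque

# Toy model: (state, stimulus) -> next state, for a hypothetical IoT device.
TRANSITIONS = {
    ("offline", "connect"): "registering",
    ("registering", "ack"): "online",
    ("registering", "timeout"): "offline",
    ("online", "publish"): "online",
    ("online", "disconnect"): "offline",
}

def generate_tests(start: str = "offline"):
    """BFS over the model; emit one stimulus path per transition covered."""
    tests, seen = [], set()
    queue = deque([(start, [])])
    while queue:
        state, path = queue.popleft()
        for (src, stim), dst in TRANSITIONS.items():
            if src == state and (src, stim) not in seen:
                seen.add((src, stim))
                tests.append(path + [stim])     # abstract test case
                queue.append((dst, path + [stim]))
    return tests

suite = generate_tests()
print(suite)
```

Industrial MBT tools add much more (data abstraction, concretisation to protocol messages, oracles), but the core step is the same: coverage criteria on the model drive test generation.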
APA, Harvard, Vancouver, ISO, and other styles
19

Schmitt, Peter (Peter Alfons). "originalMachines : developing tools and methods for object-oriented mechatronics." Thesis, Massachusetts Institute of Technology, 2011. http://hdl.handle.net/1721.1/67761.

Full text
Abstract:
Thesis (Ph. D.)--Massachusetts Institute of Technology, School of Architecture and Planning, Program in Media Arts and Sciences, 2011.
Cataloged from PDF version of thesis.
Includes bibliographical references (p. [161]-166).
The digital revolution has fundamentally changed our lives by giving us new ways to express ourselves through digital media. For example, accessible multimedia content creation tools allow people to instantiate their ideas and share them easily. However, most of these outcomes only exist on-screen and online. Despite the growing accessibility of digital design and fabrication tools, the physical world and everyday objects surrounding us have been largely excluded from a parallel explosion of possibilities to express ourselves. Increasingly, web-based services allow professional and non-professional audiences to access computer-aided manufacturing (CAM) tools like 3D-printing and laser-cutting. Nonetheless, there are few (if any) design tools and methods for creating complex mechanical assemblies that take full advantage of CAM systems. Creating unique mechatronic artifacts or "originalMachines" requires more specific and sophisticated design tools than exist today. "Object-Oriented Mechatronics" is a parametric design approach that connects knowledge about mechanical assemblies and electronics with the requirements of digital manufacturing processes. Parametric instances like gears, bearings and servos are made available as objects within a CAD environment, which can then be implemented in specific projects. The approach addresses the missing link between accessible rapid-manufacturing services and currently available design tools, thereby creating new opportunities for self-expression through mechatronic objects and machines. The dissertation matches mechanical components and assemblies with rapid manufacturing methods by exploring the transferability of conventional manufacturing techniques to appropriate rapid manufacturing tools. I rebuild various gearing and bearing principles like four-point contact bearings, cross roller bearings, spur and helical gears, planetary gears, cycloidal and harmonic gear reducers using the laser cutter, the CNC-mill and the 3D-printer.
These explorations lead to more complex assemblies such as the PlywoodServo, 3DprintedClock and 3-DoF (Degree of Freedom) Head. The lessons from these explorations are summarized in a detailed "cook book" of novel mechatronic assemblies enabled by new fabrication tools. Furthermore, I use the results to develop a CAD tool that brings together several existing software packages and plug-ins, including Rhino, Grasshopper and the Firefly experiments for Arduino, which will allow animation, fabrication and control of original machines. The tool is an example of an object-oriented design approach to mechatronic assemblies. A user calls a DoF (Degree of Freedom) object (parametric servo) with specific parameters like gearing and bearing types, motor options, and control and communication capabilities. The DoF object then creates the corresponding geometry, which can be connected and integrated with other actuators and forms. A group of roboticists and designers participated in a workshop to test the tool and make proposals for original machines using it. The dissertation makes contributions on multiple levels. First, the actuator assembly examples and parametric design tool present a body of novel work that illustrates the benefits of going beyond off-the-shelf actuator assemblies and kit-of-parts for robotic objects. Second, this tool and the accompanying examples enable the design of more original machines with custom actuator assemblies using the latest digital fabrication tools. Finally, these explorations illustrate how new CAD/CAM tools can facilitate an exchange between more design-oriented users and more engineering-oriented users.
by Peter Schmitt.
Ph.D.
APA, Harvard, Vancouver, ISO, and other styles
20

Sehloho, Nobaene Elizabeth. "An indoor positioning system using multiple methods and tools." Thesis, Cape Peninsula University of Technology, 2015. http://hdl.handle.net/20.500.11838/2288.

Full text
Abstract:
Thesis (MTech (Information Technology))--Cape Peninsula University of Technology, 2015.
Recently, the deployment and availability of wireless technology have led to the development of location and positioning services. These Location Based Services (LBSs) are attracting the attention of researchers and mobile service providers. With the importance of ubiquitous computing, the main challenge in LBSs lies in mobile positioning, or localization, within reasonable and certain accuracy. The Global Positioning System (GPS), a widely known and used navigation system, is only appropriate for use in outdoor environments: satellite signals lack line-of-sight (LOS) indoors, so they cannot be used accurately inside buildings and premises. Apart from GPS, Wi-Fi is, among others, a widely used technology, as it is an already existing infrastructure in most places. This work proposes and presents an indoor positioning system. As opposed to an Ad-hoc Positioning System (APS), it uses a Wireless Mesh Network (WMN) and makes use of an already existing Wi-Fi infrastructure. Moreover, the approach tests the positioning of a node with its neighbours in a mesh network using multi-hopping functionality. The positioning measurements used were ICMP echo requests, RSSI, and RTS/CTS requests and responses. The positioning method used was the trilateration technique, in combination with the idea of the fingerprinting method. Through research and experimentation, this study developed a system which shows potential as a positioning system with an error of about 2 m – 3 m. The hybridization of the methods yields an enhancement in the system, though improvements are still required
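The combination of RSSI measurements and trilateration can be sketched as follows: an RSSI reading is converted to a distance with a log-distance path-loss model, and a position is then recovered from three anchor distances by linearising the circle equations. The path-loss constants and anchor layout below are illustrative assumptions, not the thesis's calibration.

```python
import math

def rssi_to_distance(rssi: float, p0: float = -40.0, n: float = 2.0) -> float:
    """Log-distance path loss: distance (m) at which the given RSSI (dBm)
    is expected. p0 is the assumed RSSI at 1 m, n the path-loss exponent."""
    return 10 ** ((p0 - rssi) / (10 * n))

def trilaterate(anchors, dists):
    """Solve for (x, y) by subtracting the first circle equation from the
    other two, which yields a 2x2 linear system."""
    (x1, y1), (x2, y2), (x3, y3) = anchors
    d1, d2, d3 = dists
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    b2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a11 * a22 - a12 * a21
    return ((b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det)

anchors = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
true_pos = (3.0, 4.0)
dists = [math.dist(a, true_pos) for a in anchors]
print(trilaterate(anchors, dists))   # recovers (3.0, 4.0) with exact distances
```

With noisy RSSI-derived distances the linear system is solved in a least-squares sense instead, and fingerprinting compensates for environment-specific propagation effects, which is the hybridization the study exploits.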
APA, Harvard, Vancouver, ISO, and other styles
21

Kilianson, Nicole, Annie Larsson, and Ramona Lindholm. "Empowerment- Paternalism. A study about Försäkringskassans tools and methods." Thesis, Malmö högskola, Fakulteten för hälsa och samhälle (HS), 2011. http://urn.kb.se/resolve?urn=urn:nbn:se:mau:diva-24971.

Full text
Abstract:
Försäkringskassan har på sista tiden varit ett föremål för diskussioner i media. En stark negativ bild av myndighetens handläggare tenderar att framkomma ur mediala sammanhang. Samtidigt finns det en medvetenhet hos handläggarna kring vikten av att arbeta med hänsyn till individen. Studiens syfte är att studera de redskap och arbetssätt som används av Försäkringskassans handläggare i arbetslivsinriktad rehabilitering utifrån ett empowermentperspektiv. Vi gjorde en kvalitativ studie som bygger på intervjuer med nio handläggare på Försäkringskassan som arbetar med sjukpenningsärende på enheten fördjupad utredning. Vår uppsats visar att handläggarna lägger vikt på individen i rehabiliteringen då faktorer som information, bemötande, motivation och delaktighet är centrala i arbetet med individen. Vidare visar uppsatsen att det finns en variation i handläggarnas sätt att använda sig av regelverket i syfte att hjälpa individen. Empowerment som arbetssätt var upp till var och en handläggare att använda sig av. Trots detta kunde vi se drag av empowerment hos de flesta handläggare.
Authorities like Försäkringskassan have lately been discussed in the Swedish media, where a negative picture has been painted of the personal administrative officers. Meanwhile, there is awareness among the personal administrative officers of the importance of focusing on the individual. The study aimed to examine the tools and methods used by the personal administrative officers at Försäkringskassan working with vocational rehabilitation, from an empowerment perspective. Our study is a qualitative one, built on interviews with nine personal administrative officers who work with sickness compensation cases at the in-depth investigation unit. Our paper shows that the personal administrative officers focus on the individual in the rehabilitation; factors like information, how the individual is treated by the officers, motivation and participation are central in the work with the individual. The paper also shows that there is variation in the officers' way of using the law in order to help the individual. Empowerment as a way of working was up to each personal administrative officer to use. Despite this, we could see features of empowerment among most personal administrative officers.
APA, Harvard, Vancouver, ISO, and other styles
22

Frantz, Ferreira Felipe. "Architectural exploration methods and tools for heterogeneous 3D-IC." Thesis, Ecully, Ecole centrale de Lyon, 2012. http://www.theses.fr/2012ECDL0033/document.

Full text
Abstract:
L'intégration tridimensionnelle (3D), où plusieurs puces sont empilées et interconnectées, est en train de révolutionner l'industrie des semi-conducteurs.Cette technologie permet d'associer, dans un même boîtier, des puces électroniques (analogique, numérique, mémoire) avec des puces d'autres domaines(MEMS, bio-capteurs, optique, etc). Cela ouvre de nombreuses voies d'innovation. Néanmoins, l'absence d'outils de conception assistée ordinateur(CAO) adaptés aux systèmes 3D freine l'adoption de la technologie.Cette thèse contribue à deux problématiques liées à la conception 3D : le partitionnement d'un système sur de multiples puces et l'optimisation hiérarchique de systèmes multiphysiques (hétérogènes).La première partie de la thèse est dédiée au problème de partitionner la fonctionnalité d'un système sur de multiples puces. Un outil de « floorplan » 3D a été développé pour optimiser ce partitionnement en fonction de la surface des puces, de la température d'opération du circuit et de la structure des interconnexions. Ce type d'outil étant complexe, nous proposons de régler ses paramètres de façon automatique par l'utilisation d'algorithmes évolutionnaires.Des résultats expérimentaux sur une suite de benchmarks et sur une architecture multi processeur connecté en réseau démontrent l'efficacité et l'applicabilité des techniques d'optimisation proposées.Dans la deuxième partie, nous présentons une méthodologie de conception hiérarchique qui est adaptée aux systèmes hétérogènes. La méthode combine une approche ascendante et descendante et utilise des courbes de compromis(Fronts de Pareto) comme une abstraction de la performance d'un circuit.La contribution principale de la thèse consiste à utiliser des techniques d'interpolation pour représenter les Fronts de Pareto par des fonctions continues et à leur intégration dans des processus d'optimisation classiques. 
Cela permet un gain en flexibilité lors de l'étape ascendante du flot (caractérisation) et un gain en temps lors de l'étape descendante (synthèse). Le flot de conception est démontré sur un amplificateur opérationnel ainsi comme sur la synthèse d'un lien optoélectronique avec trois niveaux hiérarchiques
3D integration technology is driving a strong paradigm shift in the design of electronic systems. The ability to tightly integrate functions from different technology nodes (analog, digital, memory) and physical domains (MEMS, optics, etc.) offers great opportunities for innovation (More than Moore). However, leveraging this potential requires efficient CAD tools to compare architectural choices at early design stages and to co-optimize multiphysics systems. This thesis work is divided into two parts. The first part is dedicated to the problem of partitioning a system into multiple dies. A 3D floorplanning tool was developed to optimize the area, temperature and interconnect structure of a 3D-IC. Moreover, a meta-optimization approach based on genetic algorithms is proposed to automatically configure the key parameters of the floorplanner. Tests were carried out on architectural benchmarks and a NoC-based multiprocessor to demonstrate the efficiency of the proposed techniques. In the second part of the thesis, a hierarchical design methodology adapted to heterogeneous systems is presented. The method combines the bottom-up and top-down approaches with Pareto-front techniques and response surface modeling. The Pareto fronts of lower-level blocks are extracted and converted into predictive performance models that can be stored and reused in a top-down optimization process. The design flow is demonstrated on an operational amplifier as well as on the synthesis of an optoelectronic data link with three abstraction levels
APA, Harvard, Vancouver, ISO, and other styles
23

Törlind, Peter. "Distributed engineering : tools and methods for collaborative product development." Doctoral thesis, Luleå tekniska universitet, Innovation och Design, 2002. http://urn.kb.se/resolve?urn=urn:nbn:se:ltu:diva-26705.

Full text
Abstract:
Engineering design is fundamentally social, requiring much interaction and communication between the people involved. Additionally, good design often relies upon the ability of a cross-functional team to create a shared understanding of the task, the process, and the respective roles of its members. Coordination and exchange of information between participants in a distributed product development team is technically difficult and time consuming, and different locations and time zones further complicate communication. It is therefore important to provide tools and methods so that a geographically distributed design team can collaborate as co-located teams do. Successful teamwork in geographically distributed teams is not only dependent on formal meetings; it is also highly dependent on tools that support informal communication, such as opportunistic and spontaneous interaction. Such informal communication is responsible for much of the information flow in an organisation. A distributed engineering environment must support many forms of collaboration: formal meetings with high-quality videoconferencing, brainstorming sessions where people use their body language and whiteboards to clarify their ideas, and informal and mobile communication. This thesis presents a distributed engineering environment that uses broadband conferencing, shared multimedia, shared whiteboards, application sharing, and a distributed virtual reality environment for sharing engineering information. The system also supports lightweight informal communication, such as the web-based Contact Portal, which combines several information channels in one place, e.g. e-mail archives, awareness cameras, diaries, instant messaging, and SMS. The Contact Portal is the natural starting point for initiating and maintaining contact with remote team members. The thesis also presents how mobility support for distributed collaborative teamwork can be designed.
The physical environment where the collaboration is done is also very important; the design of several types of collaboration environments is presented and evaluated, from high end studios to low end personal workspaces. The development of the environment is based on several case studies of distributed work where the tools have been used and evaluated in a realistic environment in close collaboration with several industrial companies such as Volvo Car Corporation, Conex, Hägglunds Drives and Alkit Communications.
Godkänd; 2002; 20061110 (haneit)
APA, Harvard, Vancouver, ISO, and other styles
24

Al-Solbi, Ali Nasser. "Evaluating and improving e-readiness assessment methods and tools." Thesis, University of East Anglia, 2006. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.426344.

Full text
Abstract:
Developing countries have realised that if they fail to provide an adequate infrastructure and knowledge base, then they risk falling behind both economically and socially in the emerging networked world. In this context, e-readiness assessment tools can help in the formulation of national action plans for both developed and developing countries. The purpose of this study is to develop a comprehensive tool to assess e-readiness in developing countries, in particular in the Arab ones, to improve e-readiness processes and to benefit from the use of information and communications technology. Specifically, this research: compares the current e-readiness assessment tools, identifies the crucial factors and steps needed for generating a comprehensive tool to measure e-readiness in developing countries, and suggests how to improve e-readiness assessment in these countries, with special reference to Saudi Arabia where the research was carried out. An important part of this study has been the design and development of a new e-readiness tool to assess e-readiness in the developing countries. A total of nine e-readiness factors were studied in order to develop the new tool: ICT Infrastructure, Access to Skilled Workforce, Knowledgeable People, Culture, E-government and Policy, E-economy and E-commerce, Competitiveness, Cost of Living and E-health. These factors consist of 132 variables. Although the selection of the factors and their variables was based on literature and consultations with professionals in the ICT field, the tool was modified by using statistical tests on 87 of the variables. To test the new tool, a total of 200 questionnaires were distributed to Saudi organisations across the Kingdom of Saudi Arabia, but only 87 organisations (48 public and 39 private sector organisations) responded.
Then, 30 interviews with ICT managers were carried out to explore in more detail ICT issues such as the level of e-readiness in the whole country, e-commerce, ICT strategy, the role of the government, ICT managers as decision-makers, e-health, and the barriers that prevent the development of ICT infrastructure over the whole country. E-readiness indices based on the mean of respondents' answers were calculated for the nine e-readiness factors using a five-point scale in which one equalled poor and five equalled excellent. In general, the tool found that five of the e-readiness factors, i.e. e-governance and policy, knowledgeable people, ICT infrastructure, e-economy and e-commerce, and access to skilled workforce, had index values which varied approximately between the average, i.e. 3, and slightly less than 3.45 for the public sector, private sector and the whole country. Factors such as e-health, cost of living, and culture have index values below the average, varying approximately between 1.810 and 2.800. In order to evaluate and test the new e-readiness tool, the researcher used it to assess an e-readiness score for the Saudi private and public sectors and for the whole country; their indices were 2.815, 3.038 and 2.917 respectively. The test result was also compared with existing e-readiness tools which had been used on other developing countries. It was seen that the new e-readiness tool was more accurate and more reliable, as it was based on newly collected data from the country surveyed rather than on statistical information from international organisations.
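The index construction described above (per-factor means on a 1-to-5 scale, then an overall score) can be sketched as a simple aggregation. The factor names match the study, but the scores and variable counts below are invented for illustration; the real tool aggregates 132 variables across nine factors.

```python
# Invented example scores (1 = poor ... 5 = excellent) for a few
# variables per factor, collected from questionnaire respondents.
responses = {
    "ICT Infrastructure":   [3, 4, 3, 3],
    "Knowledgeable People": [3, 3, 4],
    "E-health":             [2, 2, 3],
    "Cost of Living":       [2, 3, 2],
}

def factor_index(scores):
    """Mean of the variable scores belonging to one e-readiness factor."""
    return sum(scores) / len(scores)

indices = {f: round(factor_index(s), 3) for f, s in responses.items()}
overall = round(sum(indices.values()) / len(indices), 3)
print(indices)
print("overall e-readiness index:", overall)
```

An unweighted mean like this treats every factor equally; a refinement would weight factors by their importance to the country's action plan.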
APA, Harvard, Vancouver, ISO, and other styles
25

Rullmann, Markus. "Models, Design Methods and Tools for Improved Partial Dynamic Reconfiguration." Doctoral thesis, Saechsische Landesbibliothek- Staats- und Universitaetsbibliothek Dresden, 2010. http://nbn-resolving.de/urn:nbn:de:bsz:14-qucosa-61526.

Full text
Abstract:
Partial dynamic reconfiguration of FPGAs has attracted high attention from both academia and industry in recent years. With this technique, the functionality of the programmable devices can be adapted at runtime to changing requirements. The approach allows designers to use FPGAs more efficiently: for example, FPGA resources can be time-shared between different functions, and the functions themselves can be adapted to changing workloads at runtime. Partial dynamic reconfiguration thus enables a unique combination of software-like flexibility and hardware-like performance. Still, there exists no common understanding of how to assess the overhead introduced by partial dynamic reconfiguration. This dissertation presents a new cost model for both the runtime and the memory overhead that result from partial dynamic reconfiguration. It is shown how the model can be incorporated into all stages of design optimization for reconfigurable hardware. In particular, digital circuits can be mapped onto FPGAs such that only small fractions of the hardware must be reconfigured at runtime, which saves time, memory, and energy. The design optimization is most efficient if it is applied during high-level synthesis. This book describes how the cost model has been integrated into a new high-level synthesis tool. The tool allows the designer to trade off FPGA resource use against reconfiguration overhead. It is shown that partial reconfiguration causes only small overhead if the design is optimized with regard to reconfiguration cost. A wide range of experimental results is provided that demonstrates the benefits of the applied method.
Partielle dynamische Rekonfiguration von FPGAs hat in den letzten Jahren große Aufmerksamkeit von Wissenschaft und Industrie auf sich gezogen. Die Technik erlaubt es, die Funktionalität von progammierbaren Bausteinen zur Laufzeit an veränderte Anforderungen anzupassen. Dynamische Rekonfiguration erlaubt es Entwicklern, FPGAs effizienter einzusetzen: z.B. können Ressourcen für verschiedene Funktionen wiederverwendet werden und die Funktionen selbst können zur Laufzeit an veränderte Verarbeitungsschritte angepasst werden. Insgesamt erlaubt partielle dynamische Rekonfiguration eine einzigartige Kombination von software-artiger Flexibilität und hardware-artiger Leistungsfähigkeit. Bis heute gibt es keine Übereinkunft darüber, wie der zusätzliche Aufwand, der durch partielle dynamische Rekonfiguration verursacht wird, zu bewerten ist. Diese Dissertation führt ein neues Kostenmodell für Laufzeit und Speicherbedarf ein, welche durch partielle dynamische Rekonfiguration verursacht wird. Es wird aufgezeigt, wie das Modell in alle Ebenen der Entwurfsoptimierung für rekonfigurierbare Hardware einbezogen werden kann. Insbesondere wird gezeigt, wie digitale Schaltungen derart auf FPGAs abgebildet werden können, sodass nur wenig Ressourcen der Hardware zur Laufzeit rekonfiguriert werden müssen. Dadurch kann Zeit, Speicher und Energie eingespart werden. Die Entwurfsoptimierung ist am effektivsten, wenn sie auf der Ebene der High-Level-Synthese angewendet wird. Diese Arbeit beschreibt, wie das Kostenmodell in ein neuartiges Werkzeug für die High-Level-Synthese integriert wurde. Das Werkzeug erlaubt es, beim Entwurf die Nutzung von FPGA-Ressourcen gegen den Rekonfigurationsaufwand abzuwägen. Es wird gezeigt, dass partielle Rekonfiguration nur wenig Kosten verursacht, wenn der Entwurf bezüglich Rekonfigurationskosten optimiert wird. Eine Anzahl von Beispielen und experimentellen Ergebnissen belegt die Vorteile der angewendeten Methodik
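The runtime side of such a cost model can be sketched at configuration-frame granularity: if two module configurations share frames, only the frames whose content differs must be rewritten. The frame time and frame contents below are invented for illustration; they are not the dissertation's model or any vendor's actual figures.

```python
# Configurations modelled as {frame_address: frame_content}; only frames
# whose content differs between current and target must be reconfigured.
T_FRAME_US = 12.0   # assumed time to write one configuration frame (us)

def reconfig_cost_us(current: dict, target: dict) -> float:
    changed = [a for a in target if current.get(a) != target[a]]
    return len(changed) * T_FRAME_US

# Two hypothetical modules that share frames 1 and 2 after optimization.
module_a = {0: "A0", 1: "A1", 2: "SHARED", 3: "A3"}
module_b = {0: "B0", 1: "A1", 2: "SHARED", 3: "B3"}

print(reconfig_cost_us(module_a, module_b))   # 2 frames differ -> 24.0 us
```

This is the intuition behind optimizing the mapping for reconfiguration cost: the more frames two alternative functions share, the cheaper it is to switch between them at runtime.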
APA, Harvard, Vancouver, ISO, and other styles
26

Abugessaisa, Imad. "Analytical tools and information-sharing methods supporting road safety organizations." Doctoral thesis, Linköpings universitet, GIS - Geografiska informationssystem, 2008. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-11596.

Full text
Abstract:
A prerequisite for improving road safety is reliable and consistent sources of information about traffic and accidents, which help assess the prevailing situation and give a good indication of accident severity. In many countries there is under-reporting of road accidents, deaths and injuries, no collection of data at all, or low quality of information. Potential knowledge is hidden, due to the large accumulation of traffic and accident data. This limits the investigative tasks of road safety experts and thus decreases the utilization of databases. All these factors can have serious effects on the analysis of the road safety situation, as well as on the results of the analyses. This dissertation presents a three-tiered conceptual model to support the sharing of road safety-related information, together with a set of applications and analysis tools. The overall aim of the research is to build and maintain an information-sharing platform, and to construct mechanisms that can support road safety professionals and researchers in their efforts to prevent road accidents. GLOBESAFE is a platform for information sharing among road safety organizations in different countries, developed during this research. Several approaches were used. First, requirement elicitation methods were used to identify the exact requirements of the platform. This helped in developing a conceptual model, a common vocabulary, a set of applications, and various access modes to the system. The implementation of the requirements was based on iterative prototyping. Usability methods were introduced to evaluate the users' interaction satisfaction with the system and the various tools. Second, a system-thinking approach and a technology acceptance model were used in the study of the Swedish traffic data acquisition system. Finally, visual data mining methods were introduced as a novel approach to discovering hidden knowledge and relationships in road traffic and accident databases.
The results from these studies have been reported in several scientific articles.
APA, Harvard, Vancouver, ISO, and other styles
27

Lindahl, Mattias. "Engineering Designers' Requirements on Design for Environment Methods and Tools." Doctoral thesis, Stockholm, 2005. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-236.

Full text
APA, Harvard, Vancouver, ISO, and other styles
28

Abugessaisa, Imad-Eldin Ali. "Analytical tools and information-sharing methods supporting road safety organizations /." Linköping : Department of Computer and Information Science, Linköpings University, 2008. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-11596.

Full text
APA, Harvard, Vancouver, ISO, and other styles
29

Moody, Shirley A. "Methods and tools for modelling linear and integer programming problems." Thesis, Brunel University, 1994. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.239161.

Full text
APA, Harvard, Vancouver, ISO, and other styles
30

Wilcock, Reuben. "Switched-current filters and phase-locked loops : methods and tools." Thesis, University of Southampton, 2004. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.416917.

Full text
APA, Harvard, Vancouver, ISO, and other styles
31

Mintchev, Elisabeth. "Investigation of Methods and Tools to Profile a Virtual Platform." Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-285546.

Full text
Abstract:
In the embedded system development industry, virtual platforms are used heavily to emulate the target hardware, shortening the testing phase and achieving a faster time-to-market for the product. To make virtual platforms perform better in terms of time, increased observability of the system is necessary. Tools and techniques for increased visibility exist, but their use has not yet been documented for virtual platforms. This thesis examines the existing tools and uses them in a case study of the startup time of Ericsson’s SVP. A simplified version of the SVP has been used to perform the profiling and tracing. The parameters - modules, registers and sockets - are analyzed because they determine the size of the virtual platform. These parameters are used while profiling the platform to find a correlation between the startup time and their respective quantities. Using the profiling and tracing tools on the studied virtual platform, it is possible to determine the correlation between the startup time and the parameters. From the results obtained, the cost of creating one unit of module, register and socket can be determined. For a large virtual platform with a significant amount of communication between the modules, the results of the case study show that it is beneficial to use the IP-XACT register with a static top. Additionally, the profiling and tracing help to explain delays such as processes waiting to be killed. These results can be utilized to make virtual platforms more time-efficient and to identify and troubleshoot bottlenecks.
APA, Harvard, Vancouver, ISO, and other styles
32

Shu, Guoqiang. "Formal Methods and Tools for Testing Communication Protocol System Security." The Ohio State University, 2008. http://rave.ohiolink.edu/etdc/view?acc_num=osu1211333211.

Full text
APA, Harvard, Vancouver, ISO, and other styles
33

Purwanto, Alex. "User research and opportunities for innovation : Exploring methods and tools." Thesis, Uppsala universitet, Avdelningen för visuell information och interaktion, 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-296643.

Full text
Abstract:
First-class software engineering is no longer enough for an information system product to succeed on a market. Developing successful information system products has become a challenging practice that requires an understanding of those who are going to use them. As product innovation has become the lifeblood of companies competing in the fast-paced IT industry, the end users have ultimately become those who determine the success of these types of products. User research is conducted to gather insights into users’ contexts, behaviors and feelings when using products. It can be practiced to explore how to create products and features that end users will find useful. This thesis examines how methods and tools used in user research can expose opportunities for innovation. The work comprised a literature study and a case study, in which user research methods were put into practice to discover opportunities for creating a concept for a new product. Emphasis was also put on studying how to provide utility when developing a new product. The case study was performed over a four-month period at an e-commerce company called Swiss Clinic in Stockholm, Sweden. The study shows that opportunities for innovation in user research occur in the interplay between business, user research discoveries and iterative design, and that effective communication and artifacts are essential to innovating successfully.
APA, Harvard, Vancouver, ISO, and other styles
34

Velpanur, Shashank. "Development of methods and tools for monitoring and analyzing customer data." Thesis, Linköpings universitet, Kvalitetsteknik, 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-89611.

Full text
Abstract:
The market in today’s world is dominated by customers and their requirements. Customer feedback is essential for manufacturing companies trying to establish a foothold in the service environment. New product development is often recognized as the trump card held by most manufacturing companies, allowing them to gain a competitive edge over their rivals in the market. Utilizing customers’ feedback in new product development is a step that many companies are taking to satisfy customer needs and requirements, which enables the companies firstly to retain customers and secondly to bring in new business. Interviewing customers for feedback is a very common method employed by service companies to capture and store data, which is then analyzed to identify any emerging patterns, i.e. any particular features preferred, any features disliked, etc. This thesis uses interviews as a method to capture feedback from customers which can then be utilized in new product development. The basis of this thesis is the development of a method for analyzing and monitoring customer data. Customers perceive a product or a service based on their experience in using it and hence form an opinion of it. This thesis mainly focuses on interacting with internal operators (Volvo CE employees who participated in a previous measurement) to understand and collect their feedback on the product, the L220F wheel loader, as an example for developing the method, since “real” customers are not as easy to reach and interview. The data collected is then compared to logged data of the product usage by these operators. Finding a correlation between the interview answers and the measured data helps identify the gaps between how the wheel loader is used by operators of different skill levels and how it can be used. 
If successful, this method can then be applied to external customers of Volvo CE and to other products manufactured by Volvo CE. The conclusions drawn from this thesis are that while all customers, whether professionals or rookies, must be considered equally when collecting feedback, Volvo CE should weight their answers differently when designing a new product to meet customer requirements. The clustering also showed that heterogeneous groupings of operators are formed, in which no single cluster is made up of purely one class of operators.
APA, Harvard, Vancouver, ISO, and other styles
35

Moden, Treichl Julia. "E-KOLL - Methods and tools to engage residents in energy saving." Thesis, Linköpings universitet, Institutionen för teknik och naturvetenskap, 2011. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-93160.

Full text
Abstract:
The Earth’s population is increasing at the same time as our resources are decreasing. More and more people are therefore realising the importance of a sustainable society, for two main reasons. Firstly, energy as a scarce commodity is becoming more expensive every day. Secondly, we need to think long-term in order not to endanger life for future generations. The public landlord Hyresbostäder in Norrköping has set a target of decreasing its energy consumption by 30 percent by 2030. This investment, in collaboration with Hyresgästföreningen, is called E-KOLL (E stands for energy and KOLL for awareness) and includes both internal and external work within the areas of Organisation, Economy and Behaviour. The last category deals with the importance of including and engaging the tenants to reach the energy goals of E-KOLL. To make this happen, Hyresbostäder has chosen to engage volunteer tenants, so-called energy ambassadors. Together with the district supervisor and representatives of Hyresgästföreningen, they are going to work at a local level to include and engage the tenants. The purpose of this thesis is to investigate how this collaboration could function, as well as what is needed to create a concept that inspires tenants to save energy. This was done with the help of field studies such as observations and focus group interviews. The result, which is based on a thematic analysis of the collected data, is a set of key factors that can be used in the creation of a concept to include the tenants in E-KOLL. An essential key factor for the work of changing people’s behaviour in E-KOLL is to pinpoint who is in charge and to define clear guidelines for the task. Resident co-operation is also a key factor in making communication and the flow of information work at a higher level, and should therefore be prioritised with the help of participatory workshops and more feedback in general. 
It should also be taken into consideration that the target group of E-KOLL, the tenants, is not just one target group but several. One important target group that is not in the same position to absorb information is immigrants who do not fully understand the Swedish language. Another important target group is children, who have the power to influence other people in their surroundings.
APA, Harvard, Vancouver, ISO, and other styles
36

Last, Daniel [Verfasser]. "Novel methods and tools for lactonases, acylases and proteases / Daniel Last." Greifswald : Universitätsbibliothek Greifswald, 2016. http://d-nb.info/1116639831/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
37

Llamas, Rodríguez Manuel José. "Design Automation methods and tools for building Digital Printed Microelectronics Circuits." Doctoral thesis, Universitat Autònoma de Barcelona, 2017. http://hdl.handle.net/10803/457967.

Full text
Abstract:
Organic/Printed Electronics are, day by day, attracting more interest as new applications are proposed and developed. These technologies do not intend to compete directly with the well-established Silicon-based industry, but rather to complement it with new devices that are advantageous in certain situations, whether in terms of cost or otherwise. However, in the digital processing domain there is still much work to be done to, slowly but steadily, follow the steps of the conventional fabless model that rules today’s semiconductor market. I am referring not only to progress at the fabrication level, but also in the field of Electronic Design Automation. Our research group conceived a novel strategy to efficiently produce Printed Electronics digital circuit designs based on what we called Inkjet-configurable Gate Arrays, which take advantage of digital printing techniques. The Inkjet Gate Arrays consist of matrices of transistors over flexible substrates that, once connected by digital printing techniques, form logic gates, and thus circuits. The work presented in this dissertation targets a specific stage of any common Integrated Circuit design flow, referred to as physical synthesis. Specifically, my contribution provides a new approach to the Placement and Routing problem, where circuits are mapped onto the Inkjet Gate Arrays in a technology-independent, yield-aware manner. I tackle the issue of dealing with different Printed Electronics technologies that might present distinct yield properties, usually due to the intrinsic high variability of current fabrication processes. In such cases, being able to effectively process the IGA’s fault distribution information is key to ensuring that the mapped circuits will work correctly from a functional perspective. 
In addition to the yield-awareness concept, the circuit personalization capabilities of the novel P&R heuristic proposed herein allow more mapping flexibility, depending on different possible reasons/purposes (e.g. congestion). This approach is not only convenient for today’s first steps of digital circuit prototyping over Organic Electronics, but is also scalable to future technological improvements in yield, sizes and integration density.
APA, Harvard, Vancouver, ISO, and other styles
38

Ruiz, Arenas Carlos 1990. "Methods and bioinformatic tools to study polymorphic inversions in complex diseases." Doctoral thesis, Universitat Pompeu Fabra, 2019. http://hdl.handle.net/10803/666582.

Full text
Abstract:
Chromosomal inversions are structural variants in which a DNA segment changes its orientation. Chromosomal inversions reduce homologous recombination, producing different haplotypes in standard and inverted chromosomes. As a result, they influence adaptation and selection and play a role in susceptibility to human diseases. Inversions can be studied using experimental and bioinformatic methods. SNP array data can be used to call inversion genotypes by exploiting haplotype differences between inverted and standard chromosomes. However, these methods are not optimized for large cohorts (thousands of individuals from existing databases such as dbGaP or UK Biobank). Also, current methods can only genotype inversions with two haplotypes, and inversion calls are difficult to harmonize across cohorts. Finally, it is recognized that chromosomal inversions affect gene expression and DNA methylation, yet there are no accurate methods to globally assess the effect of inversions on local gene expression or DNA methylation. The main aim of this thesis is to develop new robust and scalable methods and bioinformatic tools to study the phenotypic and functional effects of chromosomal inversions, overcoming the existing limitations. To this end, I have developed a new method to genotype chromosomal inversions that can be used in large cohorts, handles inversions with multiple haplotypes, and uses reference haplotypes, allowing the integrative analysis of multiple cohorts. Second, I have implemented a multivariate method based on redundancy analysis to study the effects of chromosomal inversions on local DNA methylation and gene expression. Then, I applied both methods to study the role of chromosomal inversions in two groups of complex diseases: neurodevelopmental disorders and cancer. Finally, I developed a new method to study how chromosomal inversions affect recombination patterns. 
This method is extendable to any genomic region containing subpopulations with different recombination patterns, allowing these subpopulations to be associated with phenotypic traits.
APA, Harvard, Vancouver, ISO, and other styles
39

Carlsson, Mats. "Methods and computer based tools for handling medical terminologies and classifications /." Linköping : Univ, 2000. http://www.bibl.liu.se/liupubl/disp/disp2000/tek645s.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
40

Messelink, W. A. C. M. "Numerical methods for the manufacture of optics using sub-aperture tools." Thesis, University College London (University of London), 2015. http://discovery.ucl.ac.uk/1471480/.

Full text
Abstract:
Moore's law, predicting a doubling of transistor count per microprocessor every two years, remains valid, demonstrating exponential growth of computing power. This thesis examines the application of numerical methods to aid optical manufacturing in a number of case studies related to the use of sub-aperture tools. One class of sub-aperture tools consists of rigid tools, which are well suited to smooth surfaces. Their rigidity leads to mismatch between the surfaces of the tool and aspheric workpieces. A novel numerical method is introduced to analyse this mismatch qualitatively and quantitatively, with the advantage that it can readily be applied to aspheric or free-form surfaces for which an analytical approach is difficult or impossible. Furthermore, rigid tools exhibit an edge effect due to the change in pressure between tool and workpiece when the tool hangs over the edge. An FEA model is introduced that simulates the tool and workpiece as separate entities and models the contact between them, in contrast to the non-contact, single-entity model reported in the literature. This model is compared to experimental results. Another class of sub-aperture processes does not use physical tools to press abrasives onto the surface. A numerical analysis of one such process, Fluid Jet Polishing, is presented - work in collaboration with Chubu University. Numerical design of surfaces, required for generating tool-paths, is investigated, along with validation techniques for two test cases: E-ELT mirror segments and IXO mirror segment slumping moulds. Conformal tools are not well suited to correcting surface errors with dimensions smaller than the contact area between tool and workpiece. A method with considerable potential is developed to analyse spatial-frequency error content, and used to change the size of the contact area during a process run, as opposed to the constant-sized contact area that is state of the art. 
These numerical methods reduce dependence on empirical data and operator experience, constituting important steps towards the ultimate and ambitious goal of fully-integrated process-automation.
APA, Harvard, Vancouver, ISO, and other styles
41

Gichuru, Phillip Karanja. "Developing robust statistical scoring methods for use in child assessment tools." Thesis, Lancaster University, 2018. http://eprints.lancs.ac.uk/127847/.

Full text
Abstract:
Timely and accurate diagnosis of developmental disability reduces its detrimental effect on children. Most current scoring methods do not appropriately remove the effect of age on development scores. This frustrates both disability status classification and the comparison of scores across different child populations, because their age-dependent development profiles are usually quite different. Hence, the key objective of this research is to develop robust statistical scoring methods that appropriately correct for age using a) item-by-item age estimation methods that provide the expected age of achieving specific developmental milestones and b) overall score norms, independent of the age effect, that use all the responses of a child to give one score across the entire domain for each child. Using data from 1,446 healthy and normally developing children (standard group) from the 2007 Malawi Development Assessment Tool (MDAT) study, classical methods, including generalised linear models, simple sum, Z-score, Log Age Ratio and Item Response Theory scoring methods, were reviewed in this child development context using binary responses only. While evaluating the pros and cons of each method, extensions to the current scoring methods using more flexible and robust approaches, including smoothing to reduce score variability, are suggested. The results show that: a) the suggested generalised additive model extensions used for age estimation were better suited to dealing with skewed item pass-rate response distributions; b) smoothing of Z-scores was especially beneficial when variability in certain age groups was high due to low sample sizes; c) the more complex methods accounting for item response correlation or increases in item difficulty resulted in reliable and generalisable normative scores; and d) the extended overall scoring approaches were able to effectively correct for age, achieving correlation coefficients of less than +0.25 between age and scores. 
The suggested overall scoring extensions improved the accuracy of detecting delayed development both in disabled and in the harder-to-classify malnourished children, achieving sensitivity values of up to 98% and 85% respectively.
APA, Harvard, Vancouver, ISO, and other styles
42

Eves, Keenan Louis. "A Comparative Analysis of Computer-Aided Collaborative Design Tools and Methods." BYU ScholarsArchive, 2018. https://scholarsarchive.byu.edu/etd/7253.

Full text
Abstract:
Collaboration has always been critical to the success of new product development teams, and the advent of geographically dispersed teams has significantly altered the way that team members interact. Multi-user computer-aided design (MUCAD) and crowdsourcing are two results of efforts to enable collaboration between geographically dispersed individuals. In this research, a study was done to investigate the differences in performance between MUCAD and single-user CAD teams, in which teams competed to create the best model of a hand drill. This was done over a three-day period to recreate the scenario found in industry. It was found that MUCAD increases awareness of teammates' activities and increases communication between team members. Different sources of frustration for single-user and multi-user teams were identified, as well as differing patterns of modeling style. These findings demonstrate that MUCAD software has significant potential to improve team collaboration and performance. A second study explored a number of potentially significant factors in MUCAD team performance, including leadership, design style, unfamiliar parts, knowledge transfer, individual experience, and team composition. In this study, teams of undergraduate mechanical engineering students worked together to complete tasks using NXConnect, a MUCAD plugin for NX developed at Brigham Young University. A primary finding was that having an appointed leader for a MUCAD team improves performance, in particular when that leader works with the team in creating the CAD model. It was also found that creating a framework to aid in organizing and coordinating the creation of the CAD model may decrease the time required for completion. In the final study, the possibility of using crowdsourcing to complete complex product design tasks was explored. In this study, a process for crowdsourcing complex product design tasks was developed, as well as a website to act as the platform for testing this process. 
A crowd consisting of engineering and technology students then worked together on the website to design a frisbee tracking device. The crowd was able to collaborate to accomplish some detailed product design tasks, but was not able to develop a complete product. Major findings include the need for more formal leadership and crowd organization, the need for better decision making mechanisms, and the need for a better model for engaging crowd members on a consistent basis. It was also found that crowd members had a greater willingness to pay for the product they developed than individuals who had not worked on the project. Results also show that although crowd members were often frustrated with the collaboration process, they enjoyed being able to work with a large group of people on a complex project.
APA, Harvard, Vancouver, ISO, and other styles
43

Paepcke, Verena Natalie. "Tools for balancing design: analysis and evaluation methods for restricted workspaces." The Ohio State University, 2003. http://rave.ohiolink.edu/etdc/view?acc_num=osu1329235564.

Full text
APA, Harvard, Vancouver, ISO, and other styles
44

Fouts, Bruce Edward, II. "Investigation into testing methods and noise control of industrial power tools." University of Cincinnati / OhioLINK, 2002. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1029443901.

Full text
APA, Harvard, Vancouver, ISO, and other styles
45

Gladden, Jonathan. "Applications, methods, tools, and virtual environments for mapping web site structures." The Ohio State University, 2002. http://rave.ohiolink.edu/etdc/view?acc_num=osu1327525993.

Full text
APA, Harvard, Vancouver, ISO, and other styles
46

Keltsch, Jan-Niklas. "Technology management tools : configuration in context." Thesis, University of Cambridge, 2012. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.610558.

Full text
APA, Harvard, Vancouver, ISO, and other styles
47

Shepherd, David. "Optimisation of iterative multi-user receivers using analytical tools /." View thesis entry in Australian Digital Theses Program, 2008. http://thesis.anu.edu.au/public/adt-ANU20081114.221408/index.html.

Full text
APA, Harvard, Vancouver, ISO, and other styles
48

Wang, Yafang [author], and Gerhard [academic supervisor] Weikum. "Methods and tools for temporal knowledge harvesting / Yafang Wang. Supervisor: Gerhard Weikum." Saarbrücken : Saarländische Universitäts- und Landesbibliothek, 2013. http://d-nb.info/1052779883/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
49

Vesteraas, Astrid Hetland. "Comparision of methods and software tools for availability assessment of production systems." Thesis, Norwegian University of Science and Technology, Department of Mathematical Sciences, 2008. http://urn.kb.se/resolve?urn=urn:nbn:no:ntnu:diva-9767.

Full text
Abstract:

This thesis presents and compares several methods for computing availability and production availability of production systems, under the assumption that the system handles the flow of a fluid. Two software programs for computing reliability measures are presented, MIRIAM Regina and Relex Reliability Studio, together with several analytical methods, among them one especially adapted to computing production availability. For the methods that cannot compute production availability directly, a method is presented for estimating production availability from an availability computation. Relex and three of the analytical methods are used to compute the availability of the system. The analytical methods considered are standard availability computation, based on the structure function of the system and the definitions of availability, and computation based on renewal and quasi-renewal processes. Relex can compute availability both by simulation and, if the system is simple enough, analytically. The usefulness of the analytical methods is limited by the assumptions imposed on the system: Relex can account for more of the features one would expect in a real-life system, whereas the analytical methods require the system to be quite simple. Two methods designed specifically for computing production availability are presented. These are the software program MIRIAM Regina, which combines a sophisticated flow algorithm with Monte Carlo simulation, and a method that uses Markov chains to compute the probability distribution of flow through subsystems of the system under consideration and then applies simple merging rules to obtain the flow through the entire system. Both methods are very flexible and can take into account many different aspects of a real-life system.
The most important source of uncertainty in the results of a computation lies in the relation between the real-life system and the model system on which the computations are made; a model system will always be significantly simplified. When choosing a computation method and interpreting results, it is important to keep in mind all assumptions made about the system, both explicitly when building the model and implicitly in the computation method. Another source of uncertainty is uncertainty in the input data. A method for propagating uncertainty through the computations is presented and applied to some of the methods. For simulation there is additional uncertainty, because simulation amounts to drawing a statistical sample: the sample size, given by the number of simulation iterations performed, determines the accuracy of the result.

APA, Harvard, Vancouver, ISO, and other styles
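The structure-function approach mentioned in the abstract above can be illustrated with a short sketch. Assuming independent repairable components with known MTBF and MTTR (the component figures below are invented for illustration, not taken from the thesis), steady-state availability combines multiplicatively for series structures and via complementary products for parallel redundancy:

```python
from math import prod

def availability(mtbf: float, mttr: float) -> float:
    """Steady-state availability of one repairable component."""
    return mtbf / (mtbf + mttr)

def series(avails):
    """Series structure: the system works only if every component works."""
    return prod(avails)

def parallel(avails):
    """Parallel redundancy: the system fails only if all components fail."""
    return 1 - prod(1 - a for a in avails)

# Hypothetical example: two redundant pumps feeding one separator.
pump = availability(mtbf=2000.0, mttr=50.0)
separator = availability(mtbf=8000.0, mttr=100.0)
system = series([parallel([pump, pump]), separator])
print(round(system, 4))
```

Production availability, by contrast, weights each system state by its throughput rather than treating the system as simply up or down, which is why the flow-based methods in the abstract (MIRIAM Regina's flow algorithm, the Markov-chain merging rules) require more machinery than this binary sketch.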
50

Eckerberg, Klas. "Information technology in landscape architecture : development of tools, methods, and professional role /." Uppsala : Sveriges lantbruksuniv, 1999. http://epsilon.slu.se/7436917.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
