Dissertations / Theses on the topic 'General design framework'

Consult the top 21 dissertations / theses for your research on the topic 'General design framework.'

1

Van Schaik, Jeroen Robbert. "A framework for design rationale capture and use during geometry design." Thesis, University of Southampton, 2014. https://eprints.soton.ac.uk/371822/.

Full text
Abstract:
Despite broad agreement on the utility of design rationale use and capture, a review of the relevant literature shows that industrial usage remains limited, especially during geometry design. An initial field study confirmed low design rationale capture during the geometry design stage. The lack of linking between design rationale and geometry models is identified as a factor holding back design rationale capture. A toolset is presented to link entities in geometry models to design rationale, allowing the creation of design rationale referring to a specific geometry design decision. Using the design rationale links it is possible to create graphs of the structure of geometry models and attached rationale. Furthermore, the presence and quantity of design rationale can be displayed as a coloured overlay on the geometry. The toolset has been tested by seven groups of student-designers, and although uptake of the design rationale linking tool by the users was low, results show that groups using the tool captured relatively more design rationale during geometry design, though reservations must be made regarding self-selection bias. The study shows that the availability of design rationale linking tools is not by itself enough to improve design rationale capture during geometry design.
APA, Harvard, Vancouver, ISO, and other styles
2

Umar, Abubakar Attah. "Design for safety framework for offshore oil and gas platforms." Thesis, University of Birmingham, 2010. http://etheses.bham.ac.uk//id/eprint/1135/.

Full text
Abstract:
The main aim of this work is to develop a “design for safety” based risk assessment technique for offshore platforms in order to facilitate decision making. This is achieved through detailed examination of the related risks and a review of the relevant literature and traditional safety assessment methods, leading to the development of a new knowledge-based risk assessment method (KBRAM). The methodology involves detailed definition of the research aim and objectives and a further literature review on risk analysis and the related topics of safety assessment and safety management systems. This process laid the foundation for a framework integrating design for safety with a fuzzy reasoning approach to model the risk assessment procedure for offshore platforms. The research required the collection of data, obtained in this instance from industry through surveys, site-visit interviews, and questionnaires, which together provided the information needed to test-run the model and conduct preliminary validation studies of offshore platform risk assessment. The results obtained by testing KBRAM with the industrial data show that the determination of risk-level classification is improved compared with that obtained by applying the same data to the traditional fuzzy two-input-parameter risk assessment method (TPRAM), owing to the addition of a third parameter in KBRAM. In conclusion, these results satisfy the research aim of facilitating the decision-making process through reduced safety costs from more efficient risk evaluation.
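The gain from the third input described in this abstract can be illustrated with a toy fuzzy inference step. This is a minimal sketch assuming triangular membership functions, normalised inputs, and a made-up rule base; none of these are the thesis's actual KBRAM definitions.

```python
# Toy three-input fuzzy risk assessment (all membership functions, input
# names, and rules here are illustrative assumptions, not KBRAM's).

def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzify(x):
    """Map a normalised input in [0, 1] to low/medium/high memberships."""
    return {"low": tri(x, -0.5, 0.0, 0.5),
            "medium": tri(x, 0.0, 0.5, 1.0),
            "high": tri(x, 0.5, 1.0, 1.5)}

def risk_level(likelihood, severity, exposure):
    """Max-min inference over an illustrative three-input rule base."""
    l, s, e = fuzzify(likelihood), fuzzify(severity), fuzzify(exposure)
    rules = [  # (likelihood, severity, exposure) -> risk class
        (("high", "high", "high"), "intolerable"),
        (("high", "high", "medium"), "high"),
        (("medium", "high", "high"), "high"),
        (("medium", "medium", "medium"), "moderate"),
        (("low", "medium", "low"), "low"),
        (("low", "low", "low"), "negligible"),
    ]
    scores = {}
    for (rl, rs, re), out in rules:
        strength = min(l[rl], s[rs], e[re])  # min acts as fuzzy AND
        scores[out] = max(scores.get(out, 0.0), strength)
    return max(scores, key=scores.get)  # class fired by the strongest rule

print(risk_level(0.9, 0.85, 0.8))  # high likelihood/severity/exposure
```

With only two of the three inputs, the first two rules would collapse into one and could not separate the "high" and "intolerable" classes; the third input supplies that extra discrimination, qualitatively mirroring the improvement reported over TPRAM.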
3

Noune, Mohamed Badreddine. "SC-FDE with flexible resource allocation : a general transceiver design framework." Thesis, University of Bristol, 2010. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.535212.

Full text
4

Yong, Kin Fuai. "Emerging human-computer interaction interfaces : a categorizing framework for general computing." Thesis, Massachusetts Institute of Technology, 2014. http://hdl.handle.net/1721.1/90692.

Full text
Abstract:
Thesis: S.M. in Engineering and Management, Massachusetts Institute of Technology, Engineering Systems Division, System Design and Management Program, 2014.
Cataloged from PDF version of thesis.
Includes bibliographical references (page 86).
Executive summary: The dominant design of the Human-Computer Interface over the last thirty years has been the combination of monitor, keyboard and mouse. However, the constant miniaturization of ICs and sensors and the growing availability of computing power have spurred incredible new dimensions of inputs (touch, gesture, voice, brain wave, etc.) and outputs (watch, glasses, phone, surface, etc.), starting an explosive recombination of inputs and outputs into new classes of devices. The design constraints have also noticeably shifted from technical to ergonomic and contextual. This thesis sets out to map these new interfaces to the use context in general computing and to project their adoption path and the driving factors behind it. The theoretical foundation of this thesis is based on multiple technology innovation theories, including the Innovation and Technology Diffusion Models of Paul Geroski, Dominant Design from James Utterback, the Curse of Innovation from John Gourville, and Lead User Innovation by Eric von Hippel. System Architecture thinking, founded most notably by Ed Crawley and Olivier de Weck at MIT, is also applied to analyze the architecture of the Human-Computer Interface. The study starts with a case study of the invention of the computer mouse, conceived in 1968 by Douglas Engelbart. A paper published by Engelbart compared different technologies, and the mouse emerged as superior, with lower fatigue and error rate yet a surprisingly short learning time. The mouse, however, was not popularized until Apple showcased the design with the first GUI on a personal computer, the Macintosh, in 1984, and its subsequent mass adoption by Microsoft Windows in the late 1980s. The case study showed that even with the superior design of a specific HCI, a number of other factors, including a holistic solution, a killer application, market position, and platform strategy, are required for successful adoption.
The next chapter maps out developing Human-Computer Interface technologies, notable existing or developing products, and their company backgrounds. The superiority of an interface depends on how well it fits the inherent nature of a specific use context. The daily general computing domains of an average computer user include collaboration, productivity, media consumption, communication and augmentation. The clear distinction of the use context in each domain strongly correlates with the effectiveness of the Human-Computer Interface in each class of device. The chapter includes analysis of proposed frameworks that place HCI interfaces on a plot of interaction complexity against screen size. Several industry experts generally agreed on a few observations: the keyboard and mouse will remain the primary input interface for the productivity domain; collaboration will grow in importance; human-centered design will receive increasing emphasis; and the wearable market is a huge opportunity, with a potential size of $50 billion. In conclusion, the projected future of adoption is: * The collaboration domain needs the combination of a low-fatigue, high-precision interface for productivity; a high-freedom, low-precision interface for creativity; and a large output screen for multiple collaborators. This will remain the frontier battleground for a variety of concepts from several giant players and niche players, each with a different competitive edge. * Productivity-domain input interfaces will likely continue to be dominated by low-fatigue, high-precision interfaces that are not necessarily intuitive, i.e. the keyboard and mouse. 3D manipulation will remain a niche interface needed only by specific industries, while a 3D general computing environment is unlikely to be realized in the short term. * The media consumption domain will be the major area of adoption for medium-accuracy, highly intuitive interfaces, e.g. gesture and sound.
Personal media consumption devices might be challenged by head-mounted displays, while group media consumption devices face an interesting challenge from bridging devices like Chromecast. * The communication domain needs an input interface that is fairly accurate and responsive, with just enough screen space. Voice recognition is rising fast to challenge typing. The dominant form factor will be the smartphone, though challenged by glasses. * The augmentation domain needs an interface that is simple and fairly accurate. New input interfaces like brainwave, gaze detection, and muscle signal will be adopted here given the right context. Flexible OLED is likely to revolutionize both input and output interfaces for wearable devices. Product developers should choose technology according to their targeted domain and identify competitors using this framework. Killer applications should be developed early, internally or with partners, to ensure success, while a platform strategy can leverage the innovation of third-party developers to widen the range of applications. During the course of the research, other opportunities arising from the proliferation of computing were also identified in the areas of the Internet of Things, smart objects and smart healthcare. This thesis is based mainly on qualitative analysis due to the lack of comprehensive data on the new Human-Computer Interfaces. Future research could collect quantitative data based on the framework of the five domains of general computing activities and their categorical requirements. It is also possible to extend the model to other computing use cases, for example gaming, virtual reality and augmented reality.
by Kin Fuai Yong.
S.M. in Engineering and Management
5

Hohnloser, Peter. "Design of a general framework for synchronizing behaviors in a complex robot." Thesis, Umeå universitet, Institutionen för datavetenskap, 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-58254.

Full text
Abstract:
This thesis describes a general framework for synchronizing behaviors in a complex robot using a Finite State Machine. The framework is developed in C++ with the robotic framework ROS and will be used in the EU-funded research project CROPS to develop a fruit-harvesting robot. The thesis also focuses on how to connect a robot behavior to a state in a way that makes pre-emptive multitasking possible. One important aspect of connecting a behavior to a state is which kind of communication to use: publish-subscribe, request-reply, or goal-feedback-result communication. These communication patterns can be used by two different state interfaces. Another important point is the definition of state transitions, which are defined in a text file in yaml format. Three different ways of implementing state transitions are also presented: passing data by ROS messages, by the ROS parameter server, and by saving and loading data in and from the Finite State Machine. The framework has been successfully implemented in CROPS and is able to control a robot arm.
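The behaviour-to-state connection described in this abstract can be sketched, outside ROS, as a finite state machine whose transition table is plain data of the kind a yaml file would deserialise to. State and outcome names below are illustrative assumptions, not the actual CROPS behaviours.

```python
# Minimal data-driven finite state machine (illustrative sketch; a real
# implementation would load TRANSITIONS from a yaml file and drive ROS
# behaviours instead of plain Python functions).

TRANSITIONS = {  # (state, outcome) -> next state
    ("approach", "succeeded"): "grasp",
    ("approach", "failed"): "approach",
    ("grasp", "succeeded"): "retract",
    ("grasp", "failed"): "approach",
    ("retract", "succeeded"): "done",
}

def run_fsm(behaviours, transitions, start, userdata=None):
    """Run behaviours until a state with no behaviour (terminal) is reached.
    `userdata` mimics saving/loading data inside the FSM between states."""
    userdata = userdata if userdata is not None else {}
    state, trace = start, []
    while True:
        trace.append(state)
        if state not in behaviours:  # terminal state
            return state, trace
        outcome = behaviours[state](userdata)
        state = transitions[(state, outcome)]

def make_behaviours():
    """Illustrative behaviours: the first grasp attempt fails, the second succeeds."""
    attempts = {"grasp": 0}
    def approach(ud):
        ud["target"] = "fruit_1"  # data saved in the FSM for later states
        return "succeeded"
    def grasp(ud):
        attempts["grasp"] += 1
        return "succeeded" if attempts["grasp"] > 1 else "failed"
    def retract(ud):
        return "succeeded"
    return {"approach": approach, "grasp": grasp, "retract": retract}

final, trace = run_fsm(make_behaviours(), TRANSITIONS, "approach")
print(final, trace)
```

In the real framework each behaviour would talk to the robot through one of the three ROS communication patterns (publish-subscribe, request-reply, or goal-feedback-result) rather than a plain function call.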
6

Phoomboplab, Tirawat. "Self-resilient production systems : framework for design synthesis of multi-station assembly systems." Thesis, University of Warwick, 2012. http://wrap.warwick.ac.uk/59325/.

Full text
Abstract:
Product design changes are inevitable in the current trend of time-based competition, where product models such as automotive bodies and aircraft fuselages are frequently upgraded, causing assembly process design changes. In recent years, several studies in engineering change management and reconfigurable systems have been conducted to address the challenges of frequent product and process design changes. However, the results of these studies are limited in their applications due to shortcomings in three aspects: (i) they rely heavily on past records, which might comprise only a few relevant cases and be insufficient for a reliable analysis; (ii) they focus mainly on managing design changes in the product architecture instead of both product and process architecture; and (iii) they consider design changes at a station level instead of a multi-station level. To address the aforementioned challenges, this thesis proposes three interrelated research areas to simulate the design adjustments of the existing process architecture. These research areas involve: (i) methodologies to model the existing process architecture design in order to use the developed models as assembly response functions for assessing Key Performance Indices (KPIs); (ii) KPIs to assess quality, cost, and design complexity of the existing process architecture design, used when making decisions to change that design; and (iii) a methodology to change the process architecture design to new optimal design solutions at a multi-station level. In the first research area, the methodology for modeling the functional dependence of process variables within the process architecture design is presented, as well as the relations between process variables and the product architecture design.
To understand the engineering change propagation chain among process variables within the process architecture design, a functional dependence model is introduced to represent the design dependency among process variables by cascading relationships from customer requirements, product architecture, and process architecture to the design tasks that optimise process variable design. This model is used to estimate the level of process-variable design change propagation in the existing process architecture design. Next, process yield, cost, and complexity indices are introduced and used as KPIs to measure product quality, the cost of changing the current process design, and the dependency of process variables (i.e., change propagation), respectively. The process yield and complexity indices are obtained using the Stream-of-Variation (SOVA) model and the functional dependence model, respectively. The costing KPI is obtained by determining the cost of optimizing the tolerances of process variables. The implication of the costing KPI for the overall cost of changing the process architecture design is also discussed. These three comprehensive indices support decision-making when redesigning the existing process architecture. Finally, a framework driven by functional optimisation is proposed to adjust the existing process architecture to meet engineering change requirements. The framework provides a platform to integrate and analyze the several individual design synthesis tasks necessary to optimise multi-stage assembly processes, such as the tolerances of process variables, fixture layouts, or part-to-part joints. The framework is based on the transversal of a hypergraph and a task connectivity matrix, which lead to the optimal sequence of these design tasks. To enhance visibility of the dependencies and hierarchy of design tasks, a Design Structure Matrix and Task Flow Chain are also adopted.
Three scenarios of engineering changes in industrial automotive design are used to illustrate the application of the proposed redesign methodology. The thesis concludes that it is not necessary to optimise all functional designs of process variables to accommodate the engineering changes. The selection of only relevant functional designs is sufficient, but the design optimisation of the process variables has to be conducted at the system level with consideration of dependency between selected functional designs.
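The task-sequencing idea (deriving an order of design-synthesis tasks from their connectivity) can be illustrated, under strong simplification, as a topological sort over a hypothetical dependency table; the thesis's hypergraph-transversal formulation is richer than this sketch, and the task names below are invented for illustration.

```python
# Hypothetical design-task ordering from a task connectivity (dependency)
# table, in the spirit of the Design Structure Matrix use described above.
from graphlib import TopologicalSorter

# depends_on[t] = set of tasks that must be optimised before t
depends_on = {
    "fixture_layout": set(),
    "tolerance_allocation": {"fixture_layout"},
    "joint_design": {"fixture_layout"},
    "final_verification": {"tolerance_allocation", "joint_design"},
}

# static_order() emits each task only after all of its predecessors
order = list(TopologicalSorter(depends_on).static_order())
print(order)
```

Cyclic dependencies, which a Design Structure Matrix typically handles by tearing or clustering, would raise `graphlib.CycleError` in this sketch.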
7

Taylor, Richard Paul. "An artificial intelligence framework for experimental design and analysis in discrete event simulation." Thesis, University of Warwick, 1988. http://wrap.warwick.ac.uk/109868/.

Full text
Abstract:
Simulation studies cycle through the phases of formulation, programming, verification and validation, experimental design and analysis, and implementation. The work presented has been concerned with developing methods to enhance the practice and support for the experimental design and analysis phase of a study. The investigation focussed on the introduction of Artificial Intelligence (AI) techniques to this phase, where previously there existed little support. The reason for this approach was the realisation that the experimentation process in a simulation study can be broken down into a reasoning component and a control of execution component. In most studies, a user would perform both of these. The involvement of a reasoning process attracted the notion of artificial intelligence or at least the prospective use of its techniques. After a study into the current state of the art, work began by considering the development of a support system for experimental design and analysis that had human intelligence and machine control of execution. This provided a semi-structured decision-making environment in the form of a controller that requested human input. The controller was made intelligent when it was linked to a non-procedural (PROLOG) program that provided remote intelligent input from either the user or default heuristics. The intelligent controller was found to enhance simulation experimentation because it ensures that all the steps in the experimental design and analysis phase take place and receive appropriate input. The next stage was to adopt the view that simulation experimental design and analysis may be enhanced through a system that had machine intelligence but expected human control of execution. This provided the framework of an advisor that adopted a consultation expert system paradigm. Users were advised on how to perform simulation experimentation. 
Default reasoning strategies were implemented to provide the system with advisory capabilities in the tasks of prediction, evaluation, comparison, sensitivity analysis, transient behaviour, functional relations, and optimisation. Later, the controller and the advisor were linked to provide an integrated system with both machine intelligence and machine control of execution. User involvement in the experimentation process was reduced considerably, as support was provided in both the reasoning and control-of-execution aspects. Additionally, this integrated system supports facilities for refinement purposes that aim at turning the system's knowledge into expertise. It became theoretically possible for other simulation experts to teach the system or experiment with their own rules and knowledge. The following stage considered making the knowledge of the system available to the user, thereby turning the system into a teacher and providing pedagogical support. Teaching was introduced through explanation and demonstration. The explanation facility used a mixed approach: it combined a first-time response explanation facility for "how" and "why" questions with a menu-driven information system facility for "explain" requests or further queries. The demonstration facility offers tutorials on the use of the system and on how to carry out an investigation of any of the tasks that the system can address. The final part of the research was to collect some empirical results about the performance of the system. Some experiments were performed retroactively on existing studies. The system was also linked to a data-driven simulation package that permitted evaluation using some large-scale industrial applications. The system's performance was measured by its ability to perform as well as students with simulation knowledge but not necessarily expertise. The system was also found to help users with little or no simulation knowledge perform as well as students with knowledge.
This study represents the first practical attempts to use the expert system framework to model the processes involved in simulation experimentation. The framework described in this thesis has been implemented as a prototype advisory system called WES (Warwick Expert Simulator). The thesis concludes that the framework proposed is robust for this purpose.
8

Duranti, Daniele. "Tangible Interaction in Museums and Cultural Heritage Sites: Towards a Conceptual and Design Framework." Thesis, IMT Alti Studi Lucca, 2017. http://e-theses.imtlucca.it/232/1/Duranti_phdthesis.pdf.

Full text
Abstract:
Drawing on a design perspective, the research explores the application of tangible interaction in museums and cultural heritage sites. Tangible interaction is today a fairly consolidated research area within HCI (Human-Computer Interaction) and Interaction Design. It refers to a new way of interacting with computer systems that is closer to the way one commonly interacts with the real world: instead of using generic devices like the mouse or the keyboard, one interacts using specific objects or the body. In this way, tangible interaction is able to bridge the gap between the world of atoms and the world of bits (Ishii et al., 1997). Since the early 2000s, tangible interaction has also been applied to the cultural heritage field for the creation of onsite interactive installations that better integrate digital technologies, the materiality of the objects, and the physicality of the experience during the visit. So far, research in the field of tangible interaction applied to cultural heritage has mainly focused on developing new systems and evaluating them, while a move towards more theoretical and conceptual work is still missing. As a consequence, there is no common language in the field, no deep understanding of what has been done and what is missing, and no formalization of the aspects that make up the design of tangible interaction systems in the cultural heritage field. This situation might generate ambiguity and misunderstanding between the different professionals involved in projects; it might slow down innovation in the field; and, last but not least, it might make the design process slower and less efficient and effective. This research represents a first attempt to overcome these problems, at least partially, by answering three main questions: 1. How has tangible interaction been applied to onsite interactive installations in the cultural sector? 2. What kind of experiences of cultural heritage does tangible interaction allow? 3. What are the aspects that make up the design of a tangible interaction system? In order to answer these questions, a theoretical framework for tangible interaction in museums and cultural heritage sites is proposed, similarly to what has been done in the past for other types of technologies (e.g. Spallazzo, 2012). The framework developed as part of this research can be intended as both a conceptual framework and the theoretical foundations of a design framework. Indeed, not only does it show what tangible interaction is by providing a categorization of past tangible interaction systems, but it also identifies a set of aspects that make up the design of such interactive systems. These aspects represent themes around which choices have to be made during the design process, and knowledge of them can facilitate or inspire the design process itself. The framework has been developed starting from the collection and analysis of more than 60 tangible interaction projects. In particular, the projects have been analysed using a thematic analysis, combining an inductive (bottom-up) and a deductive (top-down) approach to identify themes and subthemes (categories and subcategories). To discuss and develop further reflections on the proposed framework, the research goes on to present a reconstruction and analysis of a practical case study, the interactive exhibition “Voices from Forte Pozzacchio” developed as part of the EU-funded “meSch” research project. The proposed framework can be beneficial for researchers, as it provides a language and a conceptual model that can help them reflect on and discuss the topic, orient future research, and cooperate with other researchers. It can also provide different practitioners (e.g. designers, developers and cultural heritage professionals) with a shared view of what tangible interaction is, which can help reduce misunderstandings and facilitate collaboration between them. In addition, the framework lays the theoretical foundations for a design framework, addressed to designers or design teams, that aims to provide them with a greater awareness of important aspects to consider during the design process, potentially making it more effective and efficient.
9

Mitseas, Ioannis. "An efficient stochastic dynamics framework for response determination, reliability assessment, and performance-based design of nonlinear structural systems." Thesis, University of Liverpool, 2015. http://livrepository.liverpool.ac.uk/2010745/.

Full text
Abstract:
An approximate analytical technique for determining the survival probability and first-passage probability density function (PDF) of nonlinear multi-degree-of-freedom (MDOF) structural systems subject to an evolutionary stochastic excitation vector is developed. The proposed technique can be construed as a two-stage approach. First, relying on statistical linearization and utilizing a dimension reduction approach, the nonlinear n-degree-of-freedom system is decoupled and cast into n effective single-degree-of-freedom (SDOF) linear time-varying (LTV) oscillators corresponding to each and every DOF of the original MDOF system. Second, utilizing the effective SDOF LTV oscillator time-varying stiffness and damping elements in conjunction with a stochastic averaging treatment of the problem, the MDOF system survival probability and first-passage PDF are efficiently determined. Applications regarding MDOF structural systems exhibiting highly nonlinear behavior subject to stochastic excitations possessing separable as well as non-separable evolutionary power spectra (EPS) are included. Furthermore, a computationally efficient methodology for conducting fragility analysis of nonlinear/hysteretic MDOF structural systems is developed. Specifically, fragility surfaces are estimated for nonlinear/hysteretic MDOF structural systems subject to evolutionary stochastic earthquake excitations. An approximate nonlinear stochastic dynamics formulation, which constitutes the core of the developed methodology, allows for the efficient computation of structural system fragilities in a straightforward manner while keeping the computational cost of the corresponding analyses to a minimum. Nonlinear MDOF structural systems exhibiting a hysteretic Bouc-Wen restoring force-displacement behavior serve as numerical examples for demonstrating the efficiency of the proposed methodology.
Comparisons with pertinent Monte Carlo simulations are also included, demonstrating a satisfactory level of accuracy. In addition, a novel integrated approach for structural system optimal design considering life cycle cost (LCC) is developed. Specifically, a performance-based multi-objective design optimization framework for nonlinear/hysteretic MDOF structural systems subject to non-stationary stochastic excitations is formulated. The developed approach encompasses an efficient analytical nonlinear stochastic dynamics approach for the determination of the response EPS as well as the non-stationary inter-story drift ratio (IDR) amplitude PDFs, circumventing computationally intensive numerical integration of the nonlinear equations of motion. Notably, the proposed framework complies with the most contemporary performance-based earthquake engineering (PBEE) provisions proposed by the Pacific Earthquake Engineering Research (PEER) center. Although the framework developed here is tailored specifically to earthquake engineering applications, it can readily be modified to account for other kinds of hazard. Nonlinear building structures comprising the versatile Bouc-Wen (hysteretic) model serve as numerical applications for demonstrating the efficiency of the developed methodology.
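The first-stage decoupling described in this abstract can be sketched as follows; the notation is illustrative, since the abstract does not reproduce the thesis's exact formulation. Each DOF of the linearized system is replaced by an effective SDOF LTV oscillator:

```latex
% The i-th effective SDOF LTV oscillator replacing one DOF of the MDOF
% system: \beta_i(t) and \omega_i^2(t) are the time-varying damping and
% stiffness elements produced by statistical linearization.
\ddot{q}_i(t) + \beta_i(t)\,\dot{q}_i(t) + \omega_i^{2}(t)\,q_i(t) = f_i(t),
\qquad i = 1, \dots, n
```

Stochastic averaging is then applied to each oscillator to obtain the survival probability and first-passage PDF.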
10

Mansour, Ali Abdul Hadi. "A framework for the design of a medical tutoring system for the instruction of undergraduates in general practice." Thesis, University of Sheffield, 1990. http://etheses.whiterose.ac.uk/14648/.

Full text
Abstract:
One of the difficulties in teaching clinical medicine is the lack of opportunity a student has to acquire techniques for solving clinical problems. By using a computer to simulate a General Practice environment where patients with sets of symptoms are presented, a student can gain experience of diagnostic techniques and treatment management for any medical condition. Such an approach should enhance a student's development of properly structured clinical algorithms for interrogating a patient and arriving at an appropriate management plan. The intelligent tutoring system developed at the Department of Computer Science in collaboration with the Department of General Practice aims not only to simulate this environment but also to form the basis for a general interactive learning environment for all subject domains with a similar problem-solving model. In this system, a student may question, examine and provide treatment plans for a patient whilst being constantly monitored by the system. Using Artificial Intelligence techniques, the tutor is able to assess the progress of a student throughout the tutorial session and produce tutoring interventions at appropriate stages, according to the student's ability. The system's knowledge base consists of disease profiles and population parameters which are created and updated by a separate system, the Medical Editor. The manipulation of this database allows the system to be tailored to simulate any clinical situation in Primary Care. This research considers in detail the current teaching/tutoring strategies adopted by medical computer-assisted learning systems. It identifies the main areas of difficulty in using such systems in the Primary Care undergraduate course and discusses the consultation model used in this system with a full comparison with the models used in Secondary Care.
The research also discusses the main design issues which form the framework for building learning environments based on intelligent tutoring systems.
11

Pius, Tofigh. "The generation of a theoretical background for an architectural design framework : towards the definition of the systems thinking architect." Thesis, City University London, 1990. http://openaccess.city.ac.uk/7675/.

Full text
Abstract:
The objectives of this research were ambitious: the definition and development of a structured methodology that: 1. captures the total environment of complex architectural design problems; 2. enables the many parts and interests in an architectural development to communicate and understand each other; 3. provides a framework that can embrace the many sub-problems and conflicts that are bound to arise during the definition and appraisal of a design; 4. provides a sequence of procedures that enables a design to be elaborated that both solves the technical problems and induces consensus between the interests.
APA, Harvard, Vancouver, ISO, and other styles
12

Inci, Semsa Ebru. "An Investigation Into The Process Of Architectural Design Within The Framework Of Game." Master's thesis, METU, 2005. http://etd.lib.metu.edu.tr/upload/12606427/index.pdf.

Full text
Abstract:
This thesis study aims to understand and investigate the architectural design process by utilizing the characteristics and types of another field: game. The steps taken in order to accomplish this aim are, respectively, 'analyzing game, its properties, and types' and 're-reading and understanding the architectural design process by investigating its similarities and differences with game types in order to end up with informative, understandable tabular results'.
APA, Harvard, Vancouver, ISO, and other styles
13

Lesprier, Jérémy. "A general controller design framework using H∞ and dynamic inversion for robust control in the presence of uncertainties and saturations." Thesis, KTH, Reglerteknik, 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-107401.

Full text
Abstract:
This thesis deals with robust controller design using recently developed methods and tools. Starting from a nonlinear model, nonlinear dynamic inversion (NDI) is applied in order to linearize the system and deal with the varying parameters. Since the resulting closed-loop model lacks good robustness properties, an H∞ scheme is used to improve it, using new Matlab© routines that also allow fixing the structure and the order of the controller. The next step is to consider actuator saturations, which leads to a multi-objective anti-windup design. Finally, the stability and performance properties of the closed-loop system in the presence of linear time-invariant (LTI) and linear time-varying (LTV) uncertainties are formally evaluated using μ-analysis-based tools and integral quadratic constraints (IQCs). The theory is briefly exposed for each technique and is then applied to the control of the angle of attack for a simple aircraft longitudinal model. This framework shows interesting and satisfying results which prove the effectiveness of H∞-based methods and the progress that has been made in this field.
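For context, the NDI linearization step mentioned in the abstract can be sketched in its standard textbook form (a general illustration, not taken from the thesis itself): for an input-affine system,

```latex
\dot{x} = f(x) + g(x)\,u,
\qquad
u = g(x)^{-1}\bigl(v - f(x)\bigr)
\;\;\Longrightarrow\;\;
\dot{x} = v ,
```

so the inversion cancels the nonlinear dynamics and the outer robust (here H∞) loop can then be designed for the resulting linear integrator dynamics in the new input v.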
APA, Harvard, Vancouver, ISO, and other styles
14

Anil, Engin Burak. "A Web Based Multi User Framework For The Design And Detailing Of Reinforced Concrete Frames - Beams." Master's thesis, METU, 2009. http://etd.lib.metu.edu.tr/upload/3/12610365/index.pdf.

Full text
Abstract:
Structural design of reinforced concrete structures requires the involvement of many engineers who contribute to a single project. During the design process, engineers have to exchange a wide variety of information. Unmanaged data exchange may result in loss of resources. Developing a data model and setting up protocols for the management of data related to various structural design tasks can help to improve the quality of the structural design. In this study, an object-oriented data model was developed for reinforced concrete beams. The geometry of the structure, the detailed shape and placement of the reinforcement, and design-specific information for beams were defined in the data model. Design-code-based computations are facilitated by developing a code library. Another focus of this study is developing a web-based, platform-independent data management and multi-user framework for the structural design and detailing of reinforced concrete frames. The framework allows simultaneous design of a structure by multiple engineers. XML Web Services technology was utilized for the central management of design data. Design data was kept as XML files. Information was exchanged between the server and the engineer on a per-request basis. To design a beam strip, the engineer connects to the server and chooses a series of connected beams. The selected strip is locked against modification by other engineers, to prevent data loss and unnecessary duplicated effort. When the engineer finalizes the design of a beam strip, the data is updated on the server and the lock for this strip is released. Between these requests no active connection is required between the engineer and the server. As a final task, the framework can produce structural CAD drawings in DXF format.
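The per-strip check-out/check-in scheme described in this abstract can be sketched as follows. This is a minimal illustration of the locking idea only; the class and method names (`BeamStripRegistry`, `check_out`, `check_in`) are assumptions for the sketch, not names taken from the thesis, and the real system exchanges XML over Web Services rather than holding in-process state.

```python
class BeamStripRegistry:
    """Server-side registry that grants one engineer exclusive
    editing rights to a beam strip between check-out and check-in."""

    def __init__(self):
        # strip_id -> engineer_id currently holding the lock
        self._locks = {}

    def check_out(self, strip_id, engineer_id):
        """Lock a strip for one engineer; refuse if another holds it."""
        holder = self._locks.get(strip_id)
        if holder is not None and holder != engineer_id:
            return False  # strip is being edited by someone else
        self._locks[strip_id] = engineer_id
        return True

    def check_in(self, strip_id, engineer_id):
        """Release the lock once the finalized design is uploaded."""
        if self._locks.get(strip_id) == engineer_id:
            del self._locks[strip_id]
            return True
        return False  # caller never held this lock
```

Because the engineer and server only interact at check-out and check-in, no connection needs to stay open in between, matching the per-request exchange the abstract describes.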
APA, Harvard, Vancouver, ISO, and other styles
15

Azgur, Serhat Mehmet. "A Hierarchical Modeling Tool For Instructional Design." Master's thesis, METU, 2010. http://etd.lib.metu.edu.tr/upload/12611470/index.pdf.

Full text
Abstract:
A component-oriented tool for hierarchical modeling of instructional designs is developed. The motivation is to show that hierarchical representation of instructional designs is easier, better and more effective for modeling. Additionally, a modeling language is developed to provide an effective, flexible and easy-to-use integration model in which all teaching components are discovered, defined and connected. In order to fulfill the above purposes, an abstract notation is developed that is sufficiently general and adopts a top-down hierarchic approach to represent Units of Learning (UoL), Operational Knowledge Units (OKU), Learning Objects (LO), and Learning Components (LC) with respect to the common structures found in different instructional models. COSEML, a top-down hierarchic and component-oriented modeling language, has been used as the reference and core concept in developing the Educational Component Oriented Modeling Language (ECOML). The high-level architecture of ECOML provides the means for designing instructional structures. It describes how LOs, UoLs, OKUs and LCs are sequenced in a certain context or knowledge domain. The resulting model can be reused in different contexts and across different educational platforms.
APA, Harvard, Vancouver, ISO, and other styles
16

Bjäremo, Svante. "The Nordic syllabi and the Common European Framework of Reference : Similarities and differences." Thesis, Linnéuniversitetet, Institutionen för språk (SPR), 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:lnu:diva-54088.

Full text
Abstract:
This study examines the similarities and differences between the Nordic syllabi (the Finnish, Swedish and Norwegian syllabi) and the influence the CEFR has had on their structure and development. This was carried out using the method of hermeneutics, looking for similarities and differences along seven different dimensions of comparison. The study shows that there are similarities between the Nordic syllabi, which have all been influenced by the CEFR. The most notable similarity between the documents is the communicative nature of teaching and assessment. This could give a deeper understanding of the Nordic countries' similarities and differences when it comes to language teaching. Further studies using quantitative methods are needed to determine whether these findings and connections between the Nordic syllabi are due to the influence of the CEFR or whether other factors have been just as influential.
APA, Harvard, Vancouver, ISO, and other styles
17

Ramduny-Ellis, Devina. "Frameworks for enhancing temporal interface behaviour through software architectural design." Thesis, Staffordshire University, 2002. http://eprints.lancs.ac.uk/41655/.

Full text
Abstract:
The work reported in this thesis is concerned with understanding aspects of temporal behaviour. A large part of the thesis is based on analytical studies of temporal properties and interface and architectural concerns. The main areas covered include: i. analysing long-term human processes and the impact of interruptions and delays; ii. investigating how infrastructures can be designed to support synchronous fast-pace activity; iii. design of the Getting-to-Know (GtK) experimental notification server. The work is motivated by the failure of many collaborative systems to effectively manage the temporal behaviour at the interface level, as they often assume that the interaction is taking place over fast, reliable local area networks. However, the Web has challenged this assumption and users are faced with frequent network-related delays. The nature of cooperative work increases the importance of timing issues. Collaborative users require both rapid feedback of their own actions and timely feedthrough of other actions. Although it may appear that software architectures are about the internals of system design and not a necessary concern for the user interface, internal details do show up at the surface in non-functional aspects, such as timing. The focus of this work is on understanding the behavioural aspects and how they are influenced by the infrastructure.
The thesis has contributed to several areas of research: (a) the study of long-term work processes generated a trigger analysis technique for task decomposition in HCI; (b) the analysis of architectures was later applied to investigate architectural options for mobile interfaces; (c) the framework for notification servers commenced a design vocabulary in CSCW for the implementation of notification services, with the aim of improving design; (d) the impedance matching framework facilitates both goal-directed feedthrough and awareness. In particular, (c) and (d) have been exercised in the development of the GtK separable notification server.
APA, Harvard, Vancouver, ISO, and other styles
18

Garau, Federica. "Design, Synthesis and Characterisation of Polyfunctional Coordination Polymers." Doctoral thesis, Università degli studi di Padova, 2010. http://hdl.handle.net/11577/3427373.

Full text
Abstract:
The research group where I performed my PhD research is interested in the synthesis and characterisation of Coordination Polymers (CPs). CPs are infinite systems built up with metal ions (connectors) and organic ligands (linkers) as main elementary units, connected via coordination bonds and other weak interactions [Robin & Fromm, Coord. Chem. Rev., 2006, 250, 2127]. One of the interests in building CPs is the creation of new tuneable functional materials, since CPs are promising materials for applications in gas storage, anion exchange, catalysis, conductivity, luminescence, chirality, magnetism, spin transition behaviour, NLO or deposition of thin films. CPs can normally be prepared by reacting a suitable polynucleating ligand with a transition metal ion having at least two coordination sites available. In order to obtain new CPs, during the first year of my PhD studies, two different classes of new potentially polynucleating nitrogen-containing ligands were designed, synthesised and fully characterised. Unfortunately, some preliminary tests on the reactivity of these linkers with Cu(II), Ag(I), Zn(II) and Ru(II) in hydroalcoholic solvents evidenced the relatively easy hydrolysis of the ligands. For these reasons, at the moment, the study of the coordination properties of these ligands towards transition metal ions has been set aside, and the attention has been focused on understanding the decomposition mechanism upon treatment with transition metal compounds [Casarin et al. Inorg. Chim. Acta, 2009, 362, 4358; Casarin et al. J. Phys. Chem. A, 2008, 112, 6723]. Moreover, we continued some other studies, concerning the synthesis of trinuclear triangular Cu(II) clusters. The latter can be easily prepared by reacting copper(II) carboxylates with pyrazole (Hpz) in hydroalcoholic solvents, obtaining 1D, 2D or 3D CPs [Casarin et al. Inorg. Chem., 2004, 43, 5865; Casarin et al. Inorg. Chem., 2005, 44, 6265; Di Nicola et al. Inorg. Chem., 2007, 46, 221].
In particular, during my PhD studies some other trinuclear triangular Cu(II) clusters (starting both from saturated and unsaturated Cu(II) carboxylates) were synthesised and characterised [Di Nicola et al. Eur. J. Inorg. Chem., 2009, 666; Contaldi et al. Dalton Trans., 2009, 4928]. We also started to study the reactivity of the trinuclear triangular copper(II)/carboxylate/pyrazolate clusters towards substitution reactions. In particular, we first examined the stability of the trinuclear triangular moiety by treating [Cu3(μ3-OH)(μ-pz)3(CH3COO)2(Hpz)], 2, with strong acids, observing that the trinuclear moiety was to a large extent maintained, and obtaining new hexanuclear and heptanuclear coordination polymers, in some cases porous [Casarin et al. Cryst. Growth Des., 2007, 4, 676]. Since the trinuclear moieties proved to be quite stable towards decomposition, we decided to use them as starting material to synthesise new different CPs, having the trinuclear Cu(II) clusters directly bridged by polynucleating ligands. We decided to follow two different approaches: i) substitution of the monocarboxylates with bicarboxylates; and ii) replacement of the neutral ligands coordinated to the Cu(II) centres (pyrazole and/or solvent molecules) with neutral bidentate nitrogen-containing ligands. First attempts under bench-top conditions to replace the carboxylate ions by reacting the trinuclear compounds with bicarboxylic acids were not successful, leading to the isolation of the starting materials, while reactions carried out upon deprotonation of the acids led to the instantaneous precipitation of insoluble powders. The reaction with bidentate nitrogen-containing ligands instead gave interesting results, and numerous new CPs were isolated and fully characterised. Notably, upon reaction between [Cu3(μ3-OH)(μ-pz)3(HCOO)2(Hpz)2(H2O)], 1, and an excess of 4,4’-bipyridine, a porous coordination polymer was obtained.
The electrochemical properties of some selected copper(II) CPs isolated by us were examined; we also investigated the catalytic activities of some of these compounds in the peroxidative oxidation of cyclohexane. Both of these studies were performed in Prof. Armando J. L. Pombeiro's laboratories, at the Centro de Química Estrutural of the Instituto Superior Técnico (Lisbon). Finally, more recently, we started to investigate the catalytic activity of 2 in the oxidation of methyl-p-tolyl sulfide to the corresponding sulfoxide. Preliminary results indicated that 2 is able to reversibly bind oxygen coming from H2O2 and transfer it to the sulfide.
APA, Harvard, Vancouver, ISO, and other styles
19

Bellucci, Luca. "Thermometers at the Nanoscale: a Molecular Approach to Design and Develop Functional Lanthanoid-based Luminescent Materials." Doctoral thesis, Università degli studi di Padova, 2019. http://hdl.handle.net/11577/3422334.

Full text
Abstract:
The present work is devoted to the development of lanthanoid-based luminescent thermometers and to the study of the correlations between the thermometric properties and the different building blocks composing the systems. In particular, using rare earth cations, β-diketones (H(β-dike) = dibenzoylmethane, Hdbm; benzoyltrifluoroacetone, Hbta; hexafluoroacetylacetone, Hhfac; thenoyltrifluoroacetone, Htta), and divergent ligands (4,4’-bipyridine, bipy; 4,4’-bipyridine-N-oxide, bipyMO; pyrazine-N-oxide, pyrzMO; 2,5-dihydroxy-1,4-dicarboxylate, H2DHT2-) we prepared molecular systems with different dimensionality: i) dinuclear complexes (0D), ii) Coordination Polymers (CPs, 1D), and iii) Metal-Organic Frameworks (MOFs, 3D). We started with europium β-dike dinuclear compounds with molecular formula [Eu2(β-dike)6(L-MO)x] (x=3 for hfac, x=2 for dbm, bta, and tta). Substituting the hfac -CF3 groups with phenyl rings (one in bta, two in dbm) or with a thienyl ring (tta), the β-dike electronic and steric properties were modulated. Conversely, the different steric hindrance of bipyMO and pyrzMO influenced the spatial disposition of the β-dike ligands and the inter- and intra-molecular interactions. In the -50 ÷ 100 °C temperature range, the complexes showed relative thermal sensitivity (Sr) values higher than 1 (generally assumed as a quality criterion for these thermometers) that depended on the nature of both the β-dike and L-MO ligands. The β-diketonates influenced Sr values, thermal operative range, and photostability of the system, while pyrzMO-containing compounds showed improved performances (Sr maximum from 4.6 to 8.1 % °C-1 depending on the β-dike ligand) compared to those based on bipyMO (Sr maximum from 3.4 to 5.1 % °C-1). In the second part, a series of ratiometric Eu3+/Tb3+ luminescent thermometers were obtained by mixing different quantities of the two homometallic [Ln(hfac)3(bipy)]n 1D-CPs (Ln3+ = Eu3+ and Tb3+) in spectroscopically inert KBr.
Here, we studied the effect of the relative metal amounts and of the excitation wavelength on Sr. For all the samples, Sr values almost independent of the Tb/Eu molar ratio and the excitation wavelength were found between -190 and 110 °C. Each sample showed a peculiar temperature-dependent emission colour, from green to red, that was exploited to develop colour-coded thermometers able to distinguish temperature intervals on the order of 10-20 °C. We then developed a mild-condition synthetic procedure to obtain single-crystal-to-single-crystal post-synthesis modifications of the 3D-MOF [Eu2(H2DHT)3(DMF)4]∙2DMF (DMF = N,N-dimethylformamide), substituting DMF in the channels with different organic molecules (i.e. CHCl3, imidazole, pyridine, and tetrahydrofuran) to study the Sr modulation. Depending on the guest molecule, the temperature at which Sr reaches its maximum and its value varied from 2.5 % °C-1 at -90 °C to 4.2 % °C-1 at -10 °C. Molecular compounds like complexes, CPs and MOFs are ideal systems to study the correlations between structure, composition (in the sense of molecular functionalization) and functional properties because they can be easily modified through chemical processes. Nevertheless, to exploit the unique properties of these compounds in commonly used tools and instruments they need to be integrated into a device. Surface functionalization is a recurrent way to achieve this target. However, maintaining control over the arrangement of the various building blocks and characterizing the functionalized surface is not straightforward, so the development of "ad hoc" synthetic protocols and methods is required. In this context, we developed a general procedure for surface functionalization. We exploited the reactivity of lanthanoid N,N-dialkylcarbamate complexes to create an ordered Eu3+-Tb3+ heterobimetallic sequence grafted on amorphous silica, using terephthalic acid as a divergent ligand to connect the two Ln3+ ions.
Photoluminescence was here used to determine the spatial disposition of the two metal ions on silica. In particular, Tb3+-to-Eu3+ energy transfer was used as a molecular ruler to study the Ln3+ ions' spatial distribution and intermetal distances, allowing us to obtain data supporting the formation of the desired sequence.
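For context, the relative thermal sensitivity Sr quoted throughout this abstract is conventionally defined in the luminescence-thermometry literature (this is the standard definition, not a formula taken from the thesis) from the thermometric parameter Δ, here a ratio of emission intensities:

```latex
S_r \;=\; \frac{1}{\Delta}\left|\frac{\partial \Delta}{\partial T}\right| \times 100\,\%\ {}^{\circ}\mathrm{C}^{-1} ,
```

i.e. the percentage change of Δ per degree, which is why values above 1 % °C-1 are commonly taken as the quality threshold mentioned above.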
APA, Harvard, Vancouver, ISO, and other styles
20

Srinivasan, V. "Supporting Novelty In Conceptual Phase Of Engineering Design." Thesis, 2010. http://etd.iisc.ernet.in/handle/2005/2266.

Full text
Abstract:
Current design models, approaches and theories are highly fragmented, have seldom been compared with one another, and have rarely been consolidated. Novelty is a measure of creativity of engineering products and positively influences product success. Using physical laws and effects for designing can improve the chances of creativity, but they cannot be used directly owing to their inadequate current representations. It is important to address activities, outcomes, requirements and solutions in designing. Conceptual design is an early phase in engineering design and needs to be supported better. A systematic approach to designing often increases effectiveness and efficiency. Thus, the broad objective of this thesis is to develop and validate a comprehensive understanding of how designing occurs during the conceptual phase of engineering design, and to support variety and novelty of designs during this phase. The approach followed is: (a) formulate and validate an understanding of novelty and its relationships to the designing constructs in current designing, and (b) develop and validate a support, founded on the current designing, to improve novelty. The understanding and the support are addressed, respectively, through an integrated model and a systematic framework for designing; the model and the framework comprise activities, outcomes (including laws and effects), requirements and solutions. An integrated model of designing, GEMS of SAPPhIRE as req-sol, is developed by combining activities (Generate, Evaluate, Modify, Select – GEMS), outcomes (State change, Action, Parts, Phenomenon, Input, oRgans, Effect – SAPPhIRE), requirements (req) and solutions (sol), identified from a comprehensive survey of existing design models and approaches. Validation of the SAPPhIRE model with existing systems indicates that the model can be used to describe analysis and synthesis, both of which together constitute designing.
Validation of the integrated model using existing videos of design sessions, to check whether all its constructs are naturally used in designing, reveals that: (a) all the constructs are naturally used; (b) not all the outcomes are explored with equal intensity; (c) while high numbers of action and parts are observed, only low numbers of phenomenon, effects and organs are found. An empirical study using another set of design sessions to study the relationships between novelty and the outcomes reveals that the novelty of a concept space depends on the variety of the concept space, which in turn depends on the variety of the idea space explored. Novelty and variety of a concept space also depend on the number of outcomes explored at each abstraction level. Thus, phenomena and effects are also vital for variety and novelty. Based on the above, the GEMS of SAPPhIRE as req-sol framework for designing is proposed. The framework is divided into a Requirements Exploration Stage (RES) and a Solutions Exploration Stage (SES). In RES and SES, requirements and solutions, respectively, at all the abstraction levels including SAPPhIRE are generated, evaluated, modified and selected. The framework supports the task clarification, conceptual and early embodiment phases of designing, and provides process knowledge. Comparison of the framework against existing design models, theories and approaches reveals that: (a) not all existing models, theories and approaches address activities, outcomes, requirements and solutions together; (b) those that address all these constructs together do not make a distinction between requirements and solutions; and (c) no model or approach explicitly addresses novelty. The usability of the framework and Idea-inspire is assessed by applying them in an industrial project for designing novel concepts of a lunar vehicle mobility system. The use of this combined support enables identification of critical requirements and development of a large variety of ideas and concepts.
One of these concepts is physically and virtually modelled, and tested, and is found to satisfy all the requirements. A catalogue of physical laws and effects is developed using SAPPhIRE model to provide assistance to designers, especially for phenomena, effects and organs. Observations found during this development are reported. A comparative validation of the framework and the catalogue for their support to design for variety and novelty is done using comparative observational studies. Results from the observational studies reveal that the variety and the novelty of concept space improve with the use of the framework, or with the frame work and the catalogue, as compared to variety and novelty with no support.
APA, Harvard, Vancouver, ISO, and other styles
21

Padayachee, Indira. "A methodology for evaluating intelligent tutoring systems." Diss., 2000. http://hdl.handle.net/10500/15728.

Full text
Abstract:
Dissertation
This dissertation proposes a generic methodology for evaluating intelligent tutoring systems (ITSs) and applies it to the evaluation of the SQL-Tutor, an ITS for the database language SQL. An examination of the historical development, theory and architecture of intelligent tutoring systems, as well as the theory, architecture and behaviour of the SQL-Tutor, sets the context for this study. The characteristics and criteria for evaluating computer-aided instruction (CAI) systems are considered as background to an in-depth investigation of the characteristics and criteria appropriate for evaluating ITSs. These criteria are categorised along internal and external dimensions, with the internal dimension focusing on the intrinsic features and behavioural aspects of ITSs, and the external dimension focusing on their educational impact. Several issues surrounding the evaluation of ITSs, namely approaches, methods, techniques and principles, are examined and integrated within a framework for assessing the added value of ITS technology for instructional purposes.
Educational Studies
M. Sc. (Information Systems)
