
Dissertations / Theses on the topic 'Complex problems'


Consult the top 50 dissertations / theses for your research on the topic 'Complex problems.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse dissertations / theses in a wide variety of disciplines and organise your bibliography correctly.

1

Bukhkalo, S. I., and A. O. Ageicheva. "Complex projects development problems." Thesis, National Technical University "Kharkiv Polytechnic Institute", 2019. http://repository.kpi.kharkov.ua/handle/KhPI-Press/41490.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Whitmer, Brian C. "Improving Spreadsheets for Complex Problems." Diss., Brigham Young University, 2008. http://contentdm.lib.byu.edu/ETD/image/etd2361.pdf.

3

Pérez, Foguet Agustí. "Numerical modelling of complex geomechanical problems." Doctoral thesis, Universitat Politècnica de Catalunya, 2000. http://hdl.handle.net/10803/6703.

Abstract:
The thesis focuses on the development of specific numerical techniques for solving solid mechanics problems, taking as reference those involving geomaterials (soils, rocks, granular materials, ...). Specifically, the following topics are treated: 1) Arbitrary Lagrangian-Eulerian (ALE) formulations for problems with large boundary displacements; 2) solution methods for nonlinear problems in the field of solid mechanics; and 3) modelling of the mechanical behaviour of granular materials by means of elastoplastic constitutive laws.
The main contributions of the thesis are: the development of an ALE formulation for hyperelastoplastic models, and the computation of tangent operators for various constitutive laws and non-trivial time-integration schemes (the use of numerical differentiation schemes, substepping techniques, and elastoplastic models with hardening and/or softening dependent on plastic work or density). Several applications are presented that illustrate the main features of these developments (analysis of the vane test for soft clays, of the triaxial test for sands, of failure beneath a footing, of the necking process of a circular metal bar, and of a cold stamping process), with special attention devoted to the computational aspects of solving these problems. Finally, a specific chapter is devoted to the modelling and numerical simulation of cold compaction processes of metallic and ceramic powders.
Numerical modelling of problems involving geomaterials (i.e. soils, rocks, concrete and ceramics) has been an area of active research over the past few decades. This is probably due to three main causes: the increasing interest in predicting material behaviour in practical engineering situations, the great growth of computer capabilities and resources, and the growing interaction between computational mechanics, applied mathematics and different engineering fields (concrete, soil mechanics, ...). This thesis fits within this last, multidisciplinary approach. Drawing on constitutive modelling and applied mathematics, and using the languages of both, the numerical simulation of some complex geomechanical problems has been studied.

The state of the art regarding experiments, constitutive modelling, and numerical simulations involving geomaterials is very extensive. The thesis focuses on three of the most important ongoing research topics within this framework: 1) the treatment of large boundary displacements by means of Arbitrary Lagrangian-Eulerian (ALE) formulations; 2) the numerical solution of highly nonlinear systems of equations in solid mechanics; and 3) the constitutive modelling of the nonlinear mechanical behaviour of granular materials. The three topics have been analysed and different contributions to each of them have been developed. Moreover, some of the new developments have been applied to the numerical modelling of cold compaction processes of powders. The process consists of transforming a loose powder into a compacted sample through a large volume reduction. This problem has been chosen as a reference application of the thesis because it involves large boundary displacements, finite deformations and highly nonlinear material behaviour. It is therefore a challenging geomechanical problem from a numerical modelling point of view.

The most relevant contributions of the thesis are the following: 1) with respect to the treatment of large boundary displacements: quasistatic and dynamic analyses of the vane test for soft materials using a fluid-based ALE formulation and different non-Newtonian constitutive laws, and the development of a solid-based ALE formulation for finite strain hyperelastic-plastic models, with applications to isochoric and non-isochoric cases; 2) regarding the solution of nonlinear systems of equations in solid mechanics: the use of simple and robust numerical differentiation schemes for the computation of tangent operators, including examples with several non-trivial elastoplastic constitutive laws, and the development of consistent tangent operators for different substepping time-integration rules, with application to an adaptive time-integration scheme; and 3) in the field of constitutive modelling of granular materials: the efficient numerical modelling of different problems involving elastoplastic models, including work hardening-softening models for small strain problems and density-dependent hyperelastic-plastic models in a large strain context, and robust and accurate simulations of several powder compaction processes, with detailed analysis of spatial density distributions and verification of the mass conservation principle.
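The second group of contributions above uses numerical differentiation to approximate consistent tangent operators. As a minimal illustration of the general idea only (not the thesis's actual scheme), the tangent (Jacobian) of a residual function can be approximated by central differences:

```python
def numerical_tangent(residual, u, h=1e-6):
    """Central-difference approximation of the tangent (Jacobian)
    matrix J[i][j] = d residual_i / d u_j of a square nonlinear
    system, a stand-in for an analytically derived operator."""
    n = len(u)
    J = [[0.0] * n for _ in range(n)]
    for j in range(n):
        up, um = list(u), list(u)
        up[j] += h          # perturb the j-th unknown forwards
        um[j] -= h          # and backwards
        rp, rm = residual(up), residual(um)
        for i in range(n):
            J[i][j] = (rp[i] - rm[i]) / (2.0 * h)
    return J
```

For smooth residuals the central difference is second-order accurate in the step size h, which is why such schemes can stand in for analytically derived tangents inside a Newton solver.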
4

Mohammed, Alip. "Boundary value problems of complex variables." [S.l. : s.n.], 2002. http://www.diss.fu-berlin.de/2003/23/index.html.

5

Kytmanov, Aleksandr, Simona Myslivets, Bert-Wolfgang Schulze, and Nikolai Tarkhanov. "Elliptic problems for the Dolbeault complex." Universität Potsdam, 2001. http://opus.kobv.de/ubp/volltexte/2008/2597/.

Abstract:
The inhomogeneous ∂-equation is an inexhaustible source of locally unsolvable equations, subelliptic estimates and other phenomena in partial differential equations. Loosely speaking, for the analysis on complex manifolds with boundary, nonelliptic problems are typical rather than elliptic ones. Using explicit integral representations we assign a Fredholm complex to the Dolbeault complex over an arbitrary bounded domain in C^n.
6

Mitchell, Helen Margaret. "Index policies for complex scheduling problems." Thesis, University of Newcastle Upon Tyne, 2004. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.397534.

7

Maidstone, Robert. "Efficient analysis of complex changepoint problems." Thesis, Lancaster University, 2016. http://eprints.lancs.ac.uk/83055/.

Abstract:
Many time series experience abrupt changes in structure. Detecting where these changes in structure, or changepoints, occur is required for effective modelling of the data. In this thesis we explore the common approaches used for detecting changepoints. We focus in particular on techniques which can be formulated in terms of minimising a cost over segmentations and solved exactly using a class of dynamic programming algorithms. Often implementations of these dynamic programming methods have a computational cost which scales poorly with the length of the time series. Recently pruning ideas have been suggested that can speed up the dynamic programming algorithms, whilst still being guaranteed to be optimal. In this thesis we extend these methods. First we develop two new algorithms for segmenting piecewise constant data: FPOP and SNIP. We evaluate them against other methods in the literature. We then move on to develop the method OPPL for detecting changes in data subject to fitting a continuous piecewise linear model. We evaluate it against similar methods. We finally extend the OPPL method to deal with penalties that depend on the segment length.
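The penalised-cost dynamic programming that FPOP and related methods accelerate can be illustrated with a minimal sketch: the quadratic-time optimal-partitioning recursion for piecewise constant means, without the pruning that makes FPOP fast (function and variable names here are illustrative, not from the thesis):

```python
import math

def optimal_partitioning(y, beta):
    """Exact changepoint detection by minimising a penalised cost
    over all segmentations. Segment cost is the sum of squared
    deviations from the segment mean; beta penalises each segment."""
    n = len(y)
    # Prefix sums give O(1) segment cost queries.
    s = [0.0] * (n + 1)
    s2 = [0.0] * (n + 1)
    for i, v in enumerate(y):
        s[i + 1] = s[i] + v
        s2[i + 1] = s2[i] + v * v

    def seg_cost(i, j):  # cost of y[i:j], j > i
        m = (s[j] - s[i]) / (j - i)
        return (s2[j] - s2[i]) - (j - i) * m * m

    F = [0.0] + [math.inf] * n   # F[t]: best penalised cost of y[:t]
    last = [0] * (n + 1)          # best changepoint preceding t
    for t in range(1, n + 1):
        for tau in range(t):
            c = F[tau] + seg_cost(tau, t) + beta
            if c < F[t]:
                F[t], last[t] = c, tau
    # Backtrack the changepoint positions.
    cps, t = [], n
    while t > 0:
        t = last[t]
        if t > 0:
            cps.append(t)
    return sorted(cps)
```

The inner minimisation over tau is exactly what pruning methods such as PELT and FPOP restrict to a small candidate set, reducing the overall cost from quadratic towards linear.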
8

Hanna, S. "Addressing complex design problems through inductive learning." Thesis, University College London (University of London), 2012. http://discovery.ucl.ac.uk/1353781/.

Abstract:
Optimisation and related techniques are well suited to clearly defined problems involving systems that can be accurately simulated, but not to tasks in which the phenomena in question are highly complex or the problem ill-defined. These latter are typical of architecture and particularly creative design tasks, which therefore currently lack viable computational tools. It is argued that as design teams and construction projects of unprecedented scale are increasingly frequent, this is just where such optimisation and communication tools are most needed. This research develops a method by which to address complex design problems, by using inductive machine learning from example precedents either to approximate the behaviour of a complex system or to define objectives for its optimisation. Two design domains are explored. A structural problem of the optimisation of stiffness and mass of fine scale, modular space frames has relatively clearly defined goals, but a highly complex geometry of many interconnected members. A spatial problem of the layout of desks in the workplace addresses the social relationships supported by the pattern of their arrangement, and presents a design situation in which even the problem objectives are not known. These problems are chosen to represent a range of scales, types and sources of complexity against which the methods can be tested. The research tests two hypotheses in the context of these domains, relating to the simulation of a system and to communication between the designer and the machine. The first hypothesis is that the underlying structure and causes of a system’s behaviour must be understood to effectively predict or simulate its behaviour. This hypothesis is typical of modelling approaches in engineering. 
It is falsified by demonstrating that a function can be learned that models the system in question—either optimising structural stiffness or determining desirable spatial patterns—without recourse to a bottom-up simulation of that system. The second hypothesis is that communication of the behaviour of these systems to the machine requires explicit, a priori definitions and agreed-upon conventions of meaning. This is typical of classical, symbolic approaches in artificial intelligence and still implicitly underlies computer-aided design tools. It is falsified by a test equivalent to a test of linguistic competence, showing that the computer can form a concept of, and satisfy, a particular requirement that is implied only by ostensive communication by examples. Complex, ill-defined problems are handled in practice by hermeneutic, reflective processes, criticism and discussion. Both hypotheses involve discerning patterns caused by the complex structure from the higher-level behaviour only, forming a predictive approximation of this, and using it to produce new designs. It is argued that as these abilities are the input and output requirements for a human designer to engage in the reflective design process, the machine can thus be provided with the appropriate interface to do so, resulting in a novel means of interaction with the computer in a design context. It is demonstrated that the designs output by the computer display both novelty and utility, and are therefore a potentially valuable contribution to collective creativity.
9

Harper, Courtney Christine. "Complex problems in peroxisome matrix protein import." Available to US Hopkins community, 2003. http://wwwlib.umi.com/dissertations/dlnow/3080674.

10

Viaud, Quentin. "Mathematical programming methods for complex cutting problems." Thesis, Bordeaux, 2018. http://www.theses.fr/2018BORD0350.

Abstract:
This thesis addresses a two-dimensional bin-packing problem with defects on the bins, encountered in the glass industry. Cutting patterns are exact 4-stage guillotine, and items must be cut free of defects. A possible solution approach uses Dantzig-Wolfe decomposition followed by column generation and branch-and-price. This is impossible in our case because the instances are too large. We first solve the defect-free pricing problem with an incremental labelling algorithm based on a dynamic program (DP), represented as a flow problem in a hypergraph. Our method is generic for guillotine knapsack problems but cannot solve large instances in reasonable computing time. We therefore solve the defect-free bin-packing problem with a DP and a diving heuristic. The DP generates "non-proper" columns, which cannot take part in an integer solution. We adapt the diving heuristic to this case without loss of effectiveness, and then extend it to the case with defects. We first heuristically repair a solution of the defect-free problem; the column fixing in the defect-free diving heuristic is then modified to handle defects. Industrial results validate our methods.
This thesis deals with a two-dimensional bin-packing problem with defects on bins from the glass industry. Cutting patterns have to be exact 4-stage guillotine and items defect-free. A standard way to solve it is to use Dantzig-Wolfe reformulation with column generation and branch-and-price. This is impossible in our case due to large instance size. We first study and solve the defect-free pricing problem with an incremental labelling algorithm based on a dynamic program (DP), represented as a flow problem in a hypergraph. Our method is generic for guillotine knapsack problems but fails to solve large instances in a short amount of time. Instead we solve the defect-free bin-packing problem with a DP and a diving heuristic. This DP generates non-proper columns, cutting patterns that cannot be in an integer solution. We adapt the standard diving heuristic to this "non-proper" case while keeping its effectiveness. We then extend the diving heuristic to deal with defects. Our first proposal heuristically repairs a given defect-free solution. Secondly, the defect-free diving heuristic is adjusted to handle defects during column fixing. Our industrial results outline the effectiveness of our methods.
11

Turner, Stephen Richard. "The Role of Strategies in Complex Technology Problem Solving." Thesis, Griffith University, 2012. http://hdl.handle.net/10072/366076.

Abstract:
Two issues are addressed in this thesis. Firstly, the nature of technological problems and the ways in which they differ from everyday problems are explored. It is argued that technological problems are complex and ill-defined and that these characteristics determine that specific problem-solving strategies are required to resolve them successfully. The second issue addressed is the manner in which pre-service technology teachers solve technological problems, including the strategies they employ to solve them. The results of the empirical studies in this thesis reveal that problem-solvers, while employing expert-like strategies in one domain, apply a combination of both expert-like and novice-like problem-solving strategies (sometimes referred to as heuristics) when they are confronted with an unfamiliar domain or new problem type. It is argued that this phenomenon occurs when the problem-solver has exhausted the knowledge and skills acquired in previous problem-solving events and the transfer of this experience to the new domain or problem type ceases. As a result, the problem-solver reverts to novice-like heuristics such as trial-and-error in an effort to resolve the problem or its sub-problems. However, this leads the problem-solver to switch direction numerous times, diverting their efforts, in many cases, towards low-priority issues and unproductive outcomes. It is argued that systemised strategies such as Advanced Systematic Inventive Thinking (ASIT) guide the problem-solver's activities toward more productive and rewarding outcomes, leading to plausible solutions being generated from within the problem elements, thereby simplifying the problem-solving process.
Thesis (PhD Doctorate), Doctor of Philosophy (PhD), School of Education and Professional Studies, Arts, Education and Law.
12

Otri, Sameh. "Improving the bees algorithm for complex optimisation problems." Thesis, Cardiff University, 2011. http://orca.cf.ac.uk/11568/.

Abstract:
An improved swarm-based optimisation algorithm from the Bees Algorithm family for solving complex optimisation problems is proposed. Like other Bees Algorithms, the algorithm performs a form of exploitative local search combined with random exploratory global search. This thesis details the development and optimisation of this algorithm and demonstrates its robustness. The development includes a new method of tuning the Bees Algorithm called Meta Bees Algorithm and the functionality of the proposed method is compared to the standard Bees Algorithm and to a range of state-of-the-art optimisation algorithms. A new fitness evaluation method has been developed to enable the Bees Algorithm to solve a stochastic optimisation problem. The new modified Bees Algorithm was tested on the optimisation of parameter values for the Ant Colony Optimisation algorithm when solving Travelling Salesman Problems. Finally, the Bees Algorithm has been adapted and employed to solve complex combinatorial problems. The algorithm has been combined with two neighbourhood operators to solve such problems. The performance of the proposed Bees Algorithm has been tested on a number of travelling salesman problems, including two problems on printed circuit board assembly machine sequencing.
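For readers unfamiliar with the Bees Algorithm family, the basic loop combining exploitative local search around the best sites with random global scouting can be sketched roughly as follows (parameter names and values are generic illustrations, not those tuned in the thesis):

```python
import random

def bees_algorithm(f, bounds, n=30, m=8, e=2, nep=10, nsp=4,
                   ngh=0.5, iters=60, seed=0):
    """Minimise f over a box: scout bees search globally at random,
    while recruited bees search locally around the best m sites,
    with more bees (nep) sent to the e elite sites."""
    rng = random.Random(seed)
    rand_point = lambda: [rng.uniform(lo, hi) for lo, hi in bounds]

    def neighbour(x, r):
        # Random point in a clipped neighbourhood of radius r.
        return [min(hi, max(lo, xi + rng.uniform(-r, r)))
                for xi, (lo, hi) in zip(x, bounds)]

    pop = [rand_point() for _ in range(n)]
    for _ in range(iters):
        pop.sort(key=f)
        new_pop = []
        for i, site in enumerate(pop[:m]):
            bees = nep if i < e else nsp
            # Keep the site itself so a selected site never worsens.
            cands = [site] + [neighbour(site, ngh) for _ in range(bees)]
            new_pop.append(min(cands, key=f))
        # Remaining bees scout new random sites (global search).
        new_pop += [rand_point() for _ in range(n - m)]
        pop = new_pop
        ngh *= 0.95  # shrink neighbourhoods over time
    return min(pop, key=f)
```

On a simple benchmark such as the 2-D sphere function, this loop converges close to the optimum within a few dozen iterations; tuning the parameters (as the Meta Bees Algorithm does automatically) is what the thesis investigates.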
13

Liu, Wudong. "Evolutionary multiobjective optimisation for expensive and complex problems." Thesis, University of Essex, 2010. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.537937.

14

Barr, Steven William. "An integrated approach to complex problems of organisations." Thesis, University of Bristol, 2015. https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.768200.

15

Abdollahi, Jafar. "Analysing Complex Oil Well Problems through Case-Based Reasoning." Doctoral thesis, Norwegian University of Science and Technology, Faculty of Engineering Science and Technology, 2007. http://urn.kb.se/resolve?urn=urn:nbn:no:ntnu:diva-1702.

Abstract:

The history of oil well engineering has revealed that operational problems are still common in oil well practice. Well blowouts, stuck pipes and well leakages are examples of recurring problems in the oil well engineering industry. The main reason why these unwanted problems are unavoidable may be the complexity and uncertainty of oil well processes. Unforeseen problems happen again and again because they are not fully predictable, which could be due to a lack of sufficient data or improper modelling of the real conditions in the process. Traditional mathematical models have not been able to eliminate unwanted oil well problems entirely because of the many simplifications, uncertainties, and pieces of incomplete information involved. This research work proposes a new approach for overcoming these challenges. The main objective of this study is to merge two scientific fields, artificial intelligence and petroleum engineering, in order to implement a new methodology.

Case-Based Reasoning (CBR) and Model-Based Reasoning (MBR), two branches of artificial intelligence, are applied to solving complex oil well problems. Many CBR and MBR modelling tools exist for implementing and demonstrating these methodologies; in this study, the Creek system, which combines CBR and MBR, has been utilized as a framework. One specific challenging task in oil well engineering was selected to exemplify and examine the methodology. Selecting a suitable candidate application was a challenging step in itself. After testing many different issues in oil well engineering, a well integrity issue was chosen as the context. Thus, 18 leaking wells, production and injection wells from three different oil fields, have been analysed in depth. They have then been encoded and stored as cases in an ontology model given the name Wellogy.

The challenges related to well integrity issues are a growing concern. Many oil wells have been reported with annulus gas leaks (called internal leaks) on the Norwegian Continental Shelf (NCS) area. Interventions to repair the leaking wells or closing and abandoning wells have led to: high operating cost, low overall oil recovery, and in some cases unsafe operation. The reasons why leakages occur can be different, and finding the causes is a very complex task. For gas lift and gas injection wells the integrity of the well is often compromised. As the pressure of the hydrocarbon reserves decreases, particularly in mature fields, the need for boosting increases. Gas is injected into the well either to lift the oil in the production well or to maintain pressure in the reservoir from the injection well. The challenge is that this gas can lead to breakdown of the well integrity and cause leakages. However, as there are many types of leakages that can occur and due to their complexity it can be hard to find the cause or causal relationships. For this purpose, a new methodology, the Creek tool, which combines CBR and MBR is applied to investigate the reasons for the leakages. Creek is basically a CBR system, but it also includes MBR methods.

In addition to the well integrity cases, two complex cases (knowledge-rich cases) within oil well engineering have also been studied and analysed through the research work which is part of the PhD. The goal here is to show how the knowledge stored in two cases can be extracted for the CBR application.

A model comprising general knowledge (well-known rules and theories) and specific knowledge (stored in cases) has been developed. The results of the Wellogy model show that the CBR methodology can automate reasoning in addition to human reasoning through solving complex and repeated oil well problems. Moreover, the methodology showed that the valuable knowledge gained through the solved cases can be sustained and whenever it is needed, it can be retrieved and reused. The model has been verified for unsolved cases by evaluating case-matching results. The model gives elaborated explanations of the unsolved cases through the building of causal relationships. The model also facilitates knowledge acquisition and learning curves through its growing case base.

The study showed that building a CBR model is a rather time-consuming process due to four reasons:

1. Finding appropriate cases for the CBR application is not straightforward

2. Challenges related to constructing cases when transforming reported information to symbolic entities

3. Lack of defined criteria for amount of information (number of findings) for cases

4. Incomplete data and information to fully describe problems of the cases at the knowledge level

In this study only 12 solved cases (knowledge-rich cases) have been built in the Wellogy model. More cases (typically hundreds for knowledge-lean cases and around 50 for knowledge-rich cases) would be required for a robust and efficient CBR model. As the CBR methodology is a new approach for solving complex oil well problems (research and development phase), additional research work is necessary in both areas, i.e. developing CBR frameworks (user interfaces) and building CBR models (the core of CBR). Feasibility studies should be performed on implemented CBR models before they are used in real oil field operations. So far, the existing Wellogy model has shown benefits in terms of representing the knowledge of leaking well cases in the form of an ontology, retrieving solved cases, and reusing previous cases.
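The retrieval step at the heart of any CBR system can be illustrated with a minimal sketch: nearest-neighbour matching of a new problem's findings against the stored cases. This is an illustrative stand-in using a simple Jaccard similarity, not Creek's actual matching (which also exploits the general-knowledge model); the case names and findings below are invented examples:

```python
def retrieve(case_base, query_findings, k=1):
    """Nearest-neighbour case retrieval: score each stored case by
    the overlap (Jaccard similarity) between its symbolic findings
    and the findings observed in the new, unsolved problem."""
    def sim(case):
        a, b = set(case["findings"]), set(query_findings)
        return len(a & b) / len(a | b) if a | b else 0.0
    return sorted(case_base, key=sim, reverse=True)[:k]
```

The solution attached to the best-matching case would then be reused, and the revised case retained, closing the classic retrieve-reuse-revise-retain cycle of CBR.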

16

Burkov, Andriy. "Leveraging Repeated Games for Solving Complex Multiagent Decision Problems." Thesis, Université Laval, 2011. http://www.theses.ulaval.ca/2011/28028/28028.pdf.

Abstract:
Making good decisions in multiagent environments is a difficult task insofar as the presence of several decision makers implies conflicts of interest, a lack of coordination, and a multiplicity of possible decisions. If, moreover, the decision makers interact repeatedly over time, they must decide not only what to do now, but also how their current decisions may affect the behaviour of the others in the future. Game theory is a mathematical tool that aims to model this type of interaction via strategic games with several players. Multiagent decision problems are therefore often studied using game theory. In this context, restricting attention to dynamic games, complex multiagent decision problems can be approached algorithmically. The contribution of this thesis is threefold. First, it contributes an algorithmic framework for distributed planning in non-cooperative dynamic games. The multiplicity of possible plans is the source of serious complications for any planning approach. We propose a new approach based on the notion of learning in repeated games, which overcomes these complications through communication between the players. We then propose a learning algorithm for repeated games in self-play. Our algorithm allows players to converge, in initially unknown repeated games, to a joint behaviour that is optimal in a certain well-defined sense, without any communication between the players. Finally, we propose a family of algorithms for approximately solving dynamic games and extracting the players' strategies. In this context, we first propose a method to compute a nonempty subset of approximate subgame-perfect equilibria in repeated games. We then show how this method can be extended to approximate all subgame-perfect equilibria in repeated games, and also to solve more complex dynamic games.
Making good decisions in multiagent environments is a hard problem in the sense that the presence of several decision makers implies conflicts of interests, a lack of coordination, and a multiplicity of possible decisions. If, then, the same decision makers interact continuously through time, they have to decide not only what to do in the present, but also how their present decisions may affect the behavior of the others in the future. Game theory is a mathematical tool that aims to model such interactions as strategic games of multiple players. Therefore, multiagent decision problems are often studied using game theory. In this context, and being restricted to dynamic games, complex multiagent decision problems can be algorithmically approached. The contribution of this thesis is three-fold. First, this thesis contributes an algorithmic framework for distributed planning in non-cooperative dynamic games. The multiplicity of possible plans is a matter of serious complications for any planning approach. We propose a novel approach based on the concept of learning in repeated games. Our approach permits overcoming the aforementioned complications by means of communication between players. We then propose a learning algorithm for repeated game self-play. Our algorithm allows players to converge, in an initially unknown repeated game, to a joint behavior optimal in a certain, well-defined sense, without communication between players. Finally, we propose a family of algorithms for approximately solving dynamic games, and for extracting equilibrium strategy profiles. In this context, we first propose a method to compute a nonempty subset of approximate subgame-perfect equilibria in repeated games. We then demonstrate how to extend this method for approximating all subgame-perfect equilibria in repeated games, and also for solving more complex dynamic games.
17

Weiss, John C. (John Chandler). "Adaptive dialogues--a university's response to complex environmental problems." Thesis, Massachusetts Institute of Technology, 1994. http://hdl.handle.net/1721.1/36490.

18

Dukellis, John N. (John Nicholas) 1977. "Applications of auction algorithms to complex problems with constraints." Thesis, Massachusetts Institute of Technology, 2000. http://hdl.handle.net/1721.1/28455.

Abstract:
Thesis (M. Eng.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 2000.
Includes bibliographical references (leaves 87-88).
Linear and nonlinear assignment problems are addressed by the use of auction algorithms. The application of auction to the standard linear assignment problem is reviewed. The extension to nonlinear problems is introduced and illustrated with two examples. Techniques that are employed for model reduction include discretization, classification, and imposition of assignment constraints. The tradeoff between solution speed and optimality for the nonlinear problem is analyzed and demonstrated for the sample problem.
by John N. Dukellis.
M.Eng.
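The auction approach to the standard linear assignment problem reviewed in the abstract above can be sketched as a textbook Bertsekas-style forward auction (an illustrative sketch, not the thesis's implementation): with integer benefits and a bidding increment eps below 1/n, the assignment it returns is optimal.

```python
def auction_assignment(benefit, eps=None):
    """Forward auction for the linear assignment problem: person i
    is assigned to object j so that the total benefit[i][j] is
    maximised. Prices rise as persons outbid one another."""
    n = len(benefit)
    if eps is None:
        eps = 1.0 / (n + 1)       # guarantees optimality for integer data
    prices = [0.0] * n            # current price of each object
    owner = [None] * n            # owner[j]: person currently holding j
    assigned = [None] * n         # assigned[i]: object held by person i
    unassigned = list(range(n))
    while unassigned:
        i = unassigned.pop()
        # Person i bids for its best object at current net values.
        values = [benefit[i][j] - prices[j] for j in range(n)]
        best = max(range(n), key=values.__getitem__)
        v_best = values[best]
        v_second = max((values[j] for j in range(n) if j != best),
                       default=v_best)
        prices[best] += v_best - v_second + eps   # raise the price
        if owner[best] is not None:               # evict previous owner
            assigned[owner[best]] = None
            unassigned.append(owner[best])
        owner[best] = i
        assigned[i] = best
    return assigned
```

Each bid raises a price by at least eps, so the auction terminates; the nonlinear extensions discussed in the thesis modify the bidding and valuation steps rather than this basic loop.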
19

Ghaderi, Reza. "Arranging simple neural networks to solve complex classification problems." Thesis, University of Surrey, 2000. http://epubs.surrey.ac.uk/844428/.

Abstract:
In the "decomposition/reconstruction" strategy, we can solve a complex problem by 1) decomposing the problem into simpler sub-problems, 2) solving the sub-problems with simpler systems (sub-systems) and 3) combining the results of the sub-systems to solve the original problem. In a classification task we may have "label complexity", due to a high number of possible classes; "function complexity", meaning the existence of a complex input-output relationship; and "input complexity", due to the requirement of a huge feature set to represent patterns. Error Correcting Output Coding (ECOC) is a technique to reduce label complexity in which a multi-class problem is decomposed into a set of binary sub-problems, based on the sequences of "0"s and "1"s in the columns of a decomposition (code) matrix. A given pattern can then be assigned to the class having minimum distance to the results of the sub-problems. The lack of knowledge about the relationship between distance measurement and class score (such as posterior probabilities) has caused some essential shortcomings in answering questions about the "source of effectiveness", "error analysis", "code selection", and "alternative reconstruction methods" in previous works. Proposing a theoretical framework in this thesis to specify this relationship, our main contributions in this subject are to: 1) explain the theoretical reasons for code selection conditions; 2) suggest new conditions for code generation (equidistant codes) which minimise reconstruction error, and describe a search technique for code selection; 3) provide an analysis showing the effect of different kinds of error on final performance; 4) suggest a novel combining method to reduce the effect of code word selection in non-optimum codes; 5) suggest novel reconstruction frameworks for combining the component outputs.
Experiments on artificial and real benchmarks demonstrate the significant improvement achieved in multi-class problems when simple feed-forward neural networks are arranged based on the suggested framework. To address function complexity we considered AdaBoost, a technique which can be fused with ECOC to overcome its shortcomings on binary problems, and to handle the problem of huge feature sets we suggest a multi-net structure with local back-propagation. To demonstrate these improvements on a realistic problem, a face recognition application is considered. Key words: decomposition/reconstruction, reconstruction error, error correcting output codes, bias-variance decomposition.
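The minimum-distance reconstruction step that this abstract describes can be sketched in a few lines. The 4-class, 5-column code matrix below is illustrative only (not taken from the thesis): each row is a class codeword, each column defines one binary sub-problem.

```python
import numpy as np

# Illustrative code matrix: rows are class codewords,
# columns define the binary sub-problems.
code_matrix = np.array([
    [0, 0, 1, 1, 1],
    [0, 1, 0, 1, 0],
    [1, 0, 0, 0, 1],
    [1, 1, 1, 0, 0],
])

def ecoc_decode(binary_outputs, code_matrix):
    """Assign the class whose codeword has minimum Hamming distance
    to the vector of binary sub-classifier outputs."""
    distances = np.sum(code_matrix != np.asarray(binary_outputs), axis=1)
    return int(np.argmin(distances))

# Even if one sub-classifier errs ([1, 1, 0, 0, 1] instead of class 2's
# codeword [1, 0, 0, 0, 1]), minimum distance still recovers class 2.
print(ecoc_decode([1, 1, 0, 0, 1], code_matrix))  # -> 2
```

This error-correcting behaviour is exactly why the spacing between codewords (the equidistance condition studied in the thesis) matters.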
APA, Harvard, Vancouver, ISO, and other styles
20

Du, Dawei. "Biogeography-based optimization for combinatorial problems and complex systems." Cleveland State University / OhioLINK, 2014. http://rave.ohiolink.edu/etdc/view?acc_num=csu1400504249.

Full text
APA, Harvard, Vancouver, ISO, and other styles
21

Kiss, Kristina, and P. G. Pererva. "Strategic problems of economic security of Ukraine." Thesis, Національний технічний університет "Харківський політехнічний інститут", 2018. http://repository.kpi.kharkov.ua/handle/KhPI-Press/40227.

Full text
APA, Harvard, Vancouver, ISO, and other styles
22

Kandjani, Hadi Esmaeilzadeh. "Engineering Self-designing Enterprises as Complex Systems Using Enterprise Architecture Cybernetics." Thesis, Griffith University, 2013. http://hdl.handle.net/10072/367332.

Full text
Abstract:
Various disciplines have contributed to Complexity Science by experiencing the problem of how to design, build and control more and more complex systems (i.e., to ‘beat the complexity barrier’) and tried to suggest some solutions. However, apart from the description of this problem, very few concrete proposals exist to solve it. The observation of this Conceptual Analytical dissertation is that while improved design methodologies, modelling languages and analysis tools can certainly lessen the designer’s problem, they only extend the complexity barrier that a designer (or group of designers) can deal with, but they do not remove that barrier. The hypothesis of this dissertation is that perhaps the system (or system of systems) and the designer (group of designers) should not be separated and systems should design themselves, out of component systems that have the same self-designing property. Therefore the informal research questions are: 1. Is it possible to remove this problem from the design of complex systems? 2. If yes how (or to what extent)? Many disciplines attempted to attack the question of complexity management, and as will be seen, an interdisciplinary approach seems necessary to be able to give useful answers. Enterprise Architecture as a discipline, which evolved in the past 20 to 30 years (initially called 'enterprise integration'), has defined as its mission to bring together all that knowledge which is necessary to maintain enterprises through life (ISO 15704, 2000). Therefore, this thesis will attempt to look at the problem through the eyes of an interdisciplinary EA researcher.
Thesis (PhD Doctorate)
Doctor of Philosophy (PhD)
School of Information and Communication Technology
Science, Environment, Engineering and Technology
Full Text
APA, Harvard, Vancouver, ISO, and other styles
23

Sterner, Paula Franzen. "The influence of process utilization and analogous problem solving experiences in solving complex, multiple-step problems /." free to MU campus, to others for purchase, 1997. http://wwwlib.umi.com/cr/mo/fullcit?p9841337.

Full text
APA, Harvard, Vancouver, ISO, and other styles
24

Hector, Donald Charles Alexander. "Towards a new philosophy of engineering: structuring the complex problems from the sustainability discourse." Thesis, The University of Sydney, 2008. http://hdl.handle.net/2123/2690.

Full text
Abstract:
This dissertation considers three broad issues which emerge from the sustainability discourse. First is the nature of the discourse itself, particularly the underlying philosophical positions which are represented. Second, is the nature of the highly complex types of problem which the discourse exposes. And third is whether the engineering profession, as it is practised currently, is adequate to deal with such problems. The sustainability discourse exposes two distinct, fundamentally irreconcilable philosophical positions. The first, “sustainable development”, considers humanity to be privileged in relation to all other species and ecosystems. It is only incumbent upon us to look after the environment to the extent to which it is in our interests to do so. The second, “sustainability”, sees humanity as having no special moral privilege and recognises the moral status of other species, ecosystems, and even wilderness areas. Thus, sustainability imposes upon us a moral obligation to take their status into account and not to degrade or to destroy them. These two conflicting positions give rise to extremely complex problems. An innovative taxonomy of problem complexity has been developed which identifies three broad categories of problem. Of particular interest in this dissertation is the most complex of these, referred to here as the Type 3 problem. The Type 3 problem recognises the systemic complexity of the problem situation but also includes differences of the domain of interests as a fundamental, constituent part of the problem itself. Hence, established systems analysis techniques and reductionist approaches do not work. The domain of interests will typically have disparate ideas and positions, which may be entirely irreconcilable. The dissertation explores the development of philosophy of science, particularly in the last 70 years. 
It is noted that, unlike the philosophy of science, the philosophy of engineering has not been influenced by developments of critical theory, cultural theory, and postmodernism, which have had significant impact in late 20th-century Western society. This is seen as a constraint on the practice of engineering. Thus, a set of philosophical principles for sustainable engineering practice is developed. Such a change in the philosophy underlying the practice of engineering is seen as necessary if engineers are to engage with and contribute to the resolution of Type 3 problems. Two particular challenges must be overcome if Type 3 problems are to be satisfactorily resolved. First, issues of belief, values, and morals are central to this problem type and must be included in problem consideration. Second, the problem situation is usually so complex that it challenges the capacity of human cognition to deal with it. Consequently, extensive consideration is given to cognitive and behavioural psychology, in particular to choice, judgement and decision-making under uncertainty. A novel problem-structuring approach is developed on three levels. A philosophical foundation is established; a theoretical framework, based on general systems theory and established behavioural and cognitive psychological theory, is devised; and a set of tools is proposed to model Type 3 complex problems as dynamic systems. The approach differs from other systems approaches in that it enables qualitative exploration of the system's response to plausible, hypothetical disturbances. The problem-structuring approach is applied in a case study, which relates to the development of a water subsystem for a major metropolis (Sydney, Australia). The technique is also used to critique existing infrastructure planning processes and to propose an alternative approach.
APA, Harvard, Vancouver, ISO, and other styles
25

Hector, Donald Charles Alexander. "Towards a new philosophy of engineering: structuring the complex problems from the sustainability discourse." University of Sydney, 2008. http://hdl.handle.net/2123/2690.

Full text
Abstract:
Doctor of Philosophy (PhD)
Revised work with minor emendations approved by supervisor.
This dissertation considers three broad issues which emerge from the sustainability discourse. First is the nature of the discourse itself, particularly the underlying philosophical positions which are represented. Second, is the nature of the highly complex types of problem which the discourse exposes. And third is whether the engineering profession, as it is practised currently, is adequate to deal with such problems. The sustainability discourse exposes two distinct, fundamentally irreconcilable philosophical positions. The first, “sustainable development”, considers humanity to be privileged in relation to all other species and ecosystems. It is only incumbent upon us to look after the environment to the extent to which it is in our interests to do so. The second, “sustainability”, sees humanity as having no special moral privilege and recognises the moral status of other species, ecosystems, and even wilderness areas. Thus, sustainability imposes upon us a moral obligation to take their status into account and not to degrade or to destroy them. These two conflicting positions give rise to extremely complex problems. An innovative taxonomy of problem complexity has been developed which identifies three broad categories of problem. Of particular interest in this dissertation is the most complex of these, referred to here as the Type 3 problem. The Type 3 problem recognises the systemic complexity of the problem situation but also includes differences of the domain of interests as a fundamental, constituent part of the problem itself. Hence, established systems analysis techniques and reductionist approaches do not work. The domain of interests will typically have disparate ideas and positions, which may be entirely irreconcilable. The dissertation explores the development of philosophy of science, particularly in the last 70 years. 
It is noted that, unlike the philosophy of science, the philosophy of engineering has not been influenced by developments of critical theory, cultural theory, and postmodernism, which have had significant impact in late 20th-century Western society. This is seen as a constraint on the practice of engineering. Thus, a set of philosophical principles for sustainable engineering practice is developed. Such a change in the philosophy underlying the practice of engineering is seen as necessary if engineers are to engage with and contribute to the resolution of Type 3 problems. Two particular challenges must be overcome if Type 3 problems are to be satisfactorily resolved. First, issues of belief, values, and morals are central to this problem type and must be included in problem consideration. Second, the problem situation is usually so complex that it challenges the capacity of human cognition to deal with it. Consequently, extensive consideration is given to cognitive and behavioural psychology, in particular to choice, judgement and decision-making under uncertainty. A novel problem-structuring approach is developed on three levels. A philosophical foundation is established; a theoretical framework, based on general systems theory and established behavioural and cognitive psychological theory, is devised; and a set of tools is proposed to model Type 3 complex problems as dynamic systems. The approach differs from other systems approaches in that it enables qualitative exploration of the system's response to plausible, hypothetical disturbances. The problem-structuring approach is applied in a case study, which relates to the development of a water subsystem for a major metropolis (Sydney, Australia). The technique is also used to critique existing infrastructure planning processes and to propose an alternative approach.
APA, Harvard, Vancouver, ISO, and other styles
26

Gaertner, Evgeniya. "Basic complex boundary value problems in the upper half-plane." [S.l.] : [s.n.], 2006. http://www.diss.fu-berlin.de/2006/320/index.html.

Full text
APA, Harvard, Vancouver, ISO, and other styles
27

Gong, Jing. "Hybrid Methods for Unsteady Fluid Flow Problems in Complex Geometries." Doctoral thesis, Uppsala universitet, Avdelningen för teknisk databehandling, 2007. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-8341.

Full text
Abstract:
In this thesis, stable and efficient hybrid methods which combine high order finite difference methods and unstructured finite volume methods for time-dependent initial boundary value problems have been developed. The hybrid methods make it possible to combine the efficiency of the finite difference method and the flexibility of the finite volume method. We carry out a detailed analysis of the stability of the hybrid methods, and in particular of the stability of the interface treatments between structured and unstructured blocks. Both methods employ so-called summation-by-parts operators and impose boundary and interface conditions weakly, which leads to an energy estimate and stability. We have constructed and analyzed first-, second- and fourth-order Laplacian-based artificial dissipation operators for finite volume methods on unstructured grids. The first-order artificial dissipation can handle shock waves, and the fourth-order artificial dissipation eliminates non-physical numerical oscillations efficiently. A stable hybrid method for hyperbolic problems has been developed. It is shown that stability at the interface can be obtained by modifying the dual grid of the unstructured finite volume method close to the interface. The hybrid method is applied to the Euler equations through the coupling of two stand-alone CFD codes. Since the coupling is administered by a third, separate coupling code, the hybrid method allows for individual development of the stand-alone codes. It is shown that the hybrid method is an accurate, efficient and practically useful computational tool that can handle complex geometries and wave propagation phenomena. Stable and accurate interface treatments for the linear advection–diffusion equation have been studied. Accurate high-order calculations are achieved in multiple blocks with interfaces.
Three stable interface procedures — the Baumann–Oden method, the "borrowing" method and the local discontinuous Galerkin method — have been investigated. The analysis shows that only minor differences separate the different interface handling procedures. A conservative, stable and efficient hybrid method for a parabolic model problem has been developed. The hybrid method has been applied to the full Navier–Stokes equations. The numerical experiments support the theoretical conclusions and show that the interface coupling is stable and converges at the correct order for the Navier–Stokes equations.
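As a concrete illustration of the summation-by-parts (SBP) idea mentioned in this abstract, here is a minimal sketch (a standard textbook construction, not the thesis code) of the second-order accurate SBP first-derivative operator D = H⁻¹Q on a uniform grid:

```python
import numpy as np

# Second-order SBP first-derivative operator D = H^{-1} Q on a uniform
# grid with n points and spacing h (standard construction; illustrative).
n = 21
h = 1.0 / (n - 1)
H = h * np.diag([0.5] + [1.0] * (n - 2) + [0.5])   # diagonal norm (quadrature)
Q = 0.5 * (np.diag(np.ones(n - 1), 1) - np.diag(np.ones(n - 1), -1))
Q[0, 0], Q[-1, -1] = -0.5, 0.5                     # boundary closures
D = np.linalg.solve(H, Q)

# The SBP property Q + Q.T = diag(-1, 0, ..., 0, 1) mimics integration by
# parts discretely; imposing boundary/interface conditions weakly on top
# of this is what yields the energy estimate mentioned above.
x = np.linspace(0.0, 1.0, n)
print(np.allclose(D @ x, np.ones(n)))              # exact for linear data -> True
```

The hybrid schemes in the thesis combine such structured-grid operators with finite volume operators that satisfy the same SBP-like identity, which is why stability carries over across the interface.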
APA, Harvard, Vancouver, ISO, and other styles
28

Shi, Ning. "Dynamic resource allocation problems with uncertainties and complex work rules." View abstract or full-text, 2007. http://library.ust.hk/cgi/db/thesis.pl?IELM%202007%20SHI.

Full text
APA, Harvard, Vancouver, ISO, and other styles
29

Malm, Catharina, and Mika Silfver. "Investigating the complex problems of waste paper at Larsson Offsettryck." Thesis, Linköpings universitet, Institutionen för teknik och naturvetenskap, 2008. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-95276.

Full text
Abstract:
Waste paper is a big problem in the graphic arts business, both from an environmental point of view and from a financial one. Since focus is often put on obtaining low costs and high production speed, the work of reducing the amount of waste paper becomes a secondary issue for many companies. In this master thesis we have looked for causes of waste paper generation in the beginning of the printing process, the adjustment phase, by studying the printing press Heidelberg Speedmaster CD 102-5 at the printing house Larsson Offsettryck AB in Linköping. We have developed suggestions on how to reduce the amount of waste paper by observing and interviewing the staff members at Larsson Offsettryck AB, measuring print quality, and performing analyses based on the theoretical knowledge we have obtained through our studies at Linköping University. For our test printings we used one coated paper quality, Tom&Otto Silk 150 g, and one uncoated paper quality, Maxi Offset 170 g. To analyse the printing result we used both objective and subjective evaluations. The result of our study is that the adjustments made in the beginning of the printing process, to achieve the correct colour level, can be finished earlier than the printing press indicates. A total of 40-60 sheets can be saved for each adjustment. According to our research, the number of sheets per adjustment at Larsson Offsettryck AB today is approximately 140, which means that the number of adjustment sheets can be decreased by roughly 35 % through this simple alteration of the printing process. We have also examined the ICC profiles used at the printing house and concluded that the grey balance for uncoated paper is not satisfactory and should be improved.
APA, Harvard, Vancouver, ISO, and other styles
30

Maden, William. "Models and heuristic algorithms for complex routing and scheduling problems." Thesis, Lancaster University, 2006. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.441791.

Full text
APA, Harvard, Vancouver, ISO, and other styles
31

Jia, Haibo. "Semantic enhanced argumentation based group decision making for complex problems." Thesis, Glasgow Caledonian University, 2012. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.555820.

Full text
Abstract:
This thesis is concerned with issues arising from group argumentation based decision making support. An investigation was carried out into the semantic representation of an argumentation schema ontology and its influence on the decision making support problem. Previous research has shown that argumentation, as a process of communication and reasoning, is a powerful way of discovering the structure and identifying various aspects of ill-structured problems. The literature review revealed that many researchers have covered different aspects of representing and evaluating argumentation for decision making purposes; however, there is no clearly defined, comprehensive conceptual group argumentation framework for decision making support. In most cases, group argumentation and decision making are regarded as separate processes, which makes it difficult to fully integrate the argumentation process with the decision making process. In this thesis, the main elements of group argumentation and decision making are identified. A new conceptual framework is designed to glue these two sets of elements together to support decision making fully using an argumentation approach. In order to better integrate different sources of argumentative information, a semantic based approach is employed to model the argumentation schema ontology. The design of this ontology considers not only the basic discussion and group interaction concepts, but also the notion of the strength of claims and pro/con arguments, and different argument types from practical and epistemic views. In this research, the semantic support covers not only the structure of the argumentation but also the topic of the argumentation content.
The experiments have shown that semantic topic annotation of utterances can help the intelligent agent discover, retrieve and map related information, which brings new benefits for supporting decision making, such as better presenting the perspectives of decision problems, automatically identifying the criteria for evaluating solutions, and modelling and updating experts' credibility at the topic level. Rather than a fully automatic or manual semantic annotation approach, a middle-way solution for semantic annotation is proposed, which allows users to manually label the content with a simple keyword and then automatically conceptualizes the keyword using formal ontological terms queried from the cross-domain ontology knowledge base DBpedia. Based on the designed framework and the semantics of the defined argumentation ontology, a prototype agent-based distributed group argumentation system for decision making was developed. This prototype system, acting as a test bed, was used in a group argumentation experiment to test the proposed hypothesis. The experimental results were gathered from observation and from users' experience via a questionnaire. The analysis of the results indicates that this semantic enhanced group argumentation based decision making approach can not only advise the solution route for a decision task with a high degree of user satisfaction but also present more perspectives of the decision problems, enabling an iterative process of problem solving. This is consistent with the new vision of group decision making support.
A metric-based evaluation was conducted to compare our proposed approach with other related approaches on the different aspects of group argumentation based decision making support; the conclusion shows that our approach not only shares many common features with others, but also has many unique characteristics enabled by the comprehensive argumentation model and semantic support, which are essential for the new decision support paradigm. It is considered that the expectations as given in the initial aims have been achieved. Existing methods either focus on the reasoning capability of argumentation for decision making, or on its communicative capability for discovering different problem perspectives and iterating the problem solving process. In our proposed approach, a comprehensive argumentation ontology for the argumentation structure and a semantic annotation mechanism to conceptualize the argumentative content are designed so that the semantic support covers both the argumentation structure level and the content level, via which the system can better interpret and manage the information generated in the process of group argumentation and provide more semantic services such as argumentation process iteration, decision rationale reuse and decision problem discovery. The findings from this study may contribute to the development of a new paradigm of group decision making systems based on group argumentation.
APA, Harvard, Vancouver, ISO, and other styles
32

Cox, Jürgen 1970. "Solution of sign and complex action problems with cluster algorithms." Thesis, Massachusetts Institute of Technology, 2001. http://hdl.handle.net/1721.1/8646.

Full text
Abstract:
Thesis (Ph. D.)--Massachusetts Institute of Technology, Dept. of Physics, 2001.
Includes bibliographical references (p. [105]-109) and index.
Two kinds of models are considered whose Boltzmann weight is either not real, or real but not positive, so that standard Monte Carlo methods are not applicable. These sign or complex action problems are solved with the help of cluster algorithms. In each case improved estimators for the Boltzmann weight are constructed which are real and positive. The models considered belong to two classes: fermionic and non-fermionic models. An example of a non-fermionic model is the Potts model approximation to QCD at non-zero baryon density. The three-dimensional three-state Potts model captures the qualitative features of this theory. It has a complex action, and so the Boltzmann weight cannot be interpreted as a probability. The complex action problem is solved by using a cluster algorithm. The improved estimator for the complex phase of the Boltzmann factor is real and positive and is used for importance sampling. The first-order deconfinement transition line is investigated and the universal behavior at its critical endpoint is studied.
An example of a fermionic model with a sign problem is staggered fermions with 2 flavors in 3+1 dimensions. Here the sign is connected to the permutation sign of fermion world lines and is of a nonlocal nature. Cluster flips change the topology of the fermion world lines and have a well-defined effect on the permutation sign, independent of the other clusters. The sign problem is solved by suppressing those clusters whose contribution to the partition function and to the observables of interest would be zero. We confirm that the universal critical behavior of the finite-temperature chiral phase transition is that of the three-dimensional Ising model. We also study staggered fermions with one flavor in 2+1 dimensions and confirm that the chiral phase transition then belongs to the universality class of the two-dimensional Ising model.
by Jürgen Cox.
Ph.D.
APA, Harvard, Vancouver, ISO, and other styles
33

AlAbed-AlHaq, Abrar Fawwaz. "APPLYING GRAPH MINING TECHNIQUES TO SOLVE COMPLEX SOFTWARE ENGINEERING PROBLEMS." Kent State University / OhioLINK, 2015. http://rave.ohiolink.edu/etdc/view?acc_num=kent1442986844.

Full text
APA, Harvard, Vancouver, ISO, and other styles
34

Daroui, Danesh. "Efficient PEEC-based solver for complex electromagnetic problems in power electronics." Doctoral thesis, Luleå tekniska universitet, EISLAB, 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:ltu:diva-26524.

Full text
Abstract:
The research presented in this thesis discusses an electromagnetic (EM) analysis tool which is based on the partial element equivalent circuit (PEEC) method and is appropriate for combined EM and circuit simulations, especially power electronics applications. EM analysis is important to ensure that a system will not affect the correct operation of other devices nor cause interference between various electrical systems. In power electronic applications, the increased switching speed can cause voltage overshoots, unbalanced current sharing between semiconductor modules, and unwanted resonances. Therefore, EM analysis should be carried out to perform design optimizations in order to minimize the unwanted effects of high frequencies. The solver developed in this work is an appropriate solution to address the needs of EM analysis in general and power electronics in particular. The conducted research consists of performance acceleration and implementation of the solver, and verification of the simulation results by means of measurements. This work was done in two major phases. In the first phase, the solver was accelerated to optimize its performance when quasi-static (R,Lp,C)PEEC as well as full-wave (R,Lp,C,τ)PEEC simulations were carried out. The main optimizations were based on exploiting parallelism and high performance computing to solve very large problems, and on non-uniform meshing, which was helpful in simulating skin and proximity effects while keeping the problem size to a minimum. The presented results and comparisons with the measurements confirmed that the non-uniform mesh helped in accurately simulating large bus bar models and correctly predicting system resonances while the size of the problem was minimized. On-the-fly calculation was also developed to reduce memory usage, at the cost of increased solution time. The second phase consists of methods to increase the performance of the solver further while including some level of approximation.
In this phase, sparsification techniques were used to convert a dense PEEC system into a sparse system. The sparsification was done by calculating the reluctance matrix, which can be sparsified while maintaining the accuracy at the desired level because of the locality and shielding effect of the reluctance matrix. Efficient algorithms were developed to perform sparse matrix-matrix multiplication and to assemble the sparse coefficient matrix in a row-by-row manner to reduce peak memory usage. The sparse system was then solved using both sparse direct and iterative solvers with proper preconditioning. The results acquired from the sparse direct solution confirmed that the memory consumption and solution time were reduced by orders of magnitude and by a factor of 3 to 5, respectively. Moreover, the Schur complement was used together with the iterative approach, making it possible to solve large problems within a few iterations by preconditioning the system, using less memory and with lower computational complexity. Bus bars used in two types of power frequency converters manufactured by ABB were modelled and analysed with the PEEC-based solver developed in this research, and the simulations and measurements agreed very well. The results of the simulations also led to improvements in the physical design of the bars, which reduced the inductance of the layout. With the accelerated solver, it is now possible to solve very large and complex problems on conventional computer systems, which was not possible before. This provides new possibilities to study real-world problems, which are typically large in size and have complex structures.
Approved; 2012; 20121114 (dan); DEFENCE Subject: Industrial Electronics Opponent: PhD Bruce Archambeault, IBM, Research Triangle Park, North Carolina, USA (the opponent participates via distance-bridging technology) Chair: Docent Jonas Ekman, Department of Computer Science, Electrical and Space Engineering, Luleå University of Technology Time: Thursday 17 January 2013, 13:30 Venue: A109, Luleå University of Technology
APA, Harvard, Vancouver, ISO, and other styles
35

Laber, Micaela. "The Politics of Biosimilars: Understanding Stakeholder Influence Over Complex Policy Problems." Scholarship @ Claremont, 2018. http://scholarship.claremont.edu/cmc_theses/1815.

Full text
Abstract:
The health care industry’s involvement with biosimilar policies suggests that building coalitions and reducing opposition are critical factors for interest group success. As government decision-makers wrestle with how to handle a perplexing category of prescription drugs, companies and patient groups alike receive ample opportunities to contribute to the policymaking process. When stakeholders in the biosimilar arena – including manufacturers, physicians, and patients – unite, we see that the United States government takes steps toward fixing the policy problem. This occurred most recently with policies about biosimilar drug coverage under Medicare Part D and reimbursement under Medicare Part B. In both cases, stakeholders took a united stance and consequently faced no opposition. On the contrary, internal industry disputes between brand and biosimilar manufacturers about patent exclusivity laws and interchangeability rules revealed the nuances of biosimilar policy and the challenge that regulators face when they receive mixed messages. Across all of their efforts, biosimilar stakeholders pursued numerous strategies which may have contributed to their successes. They focused on niche issues and used their lobbying expertise to actively submit comments, testify in hearings, and meet with government officials; however, the differentiating tactic between the industry’s successes and failures was whether they formed coalitions. By coming together, stakeholders lowered their chances of facing opposition. A closer analysis of the politics of biosimilars illustrates that when they present a united front to lawmakers, interest groups reduce the likelihood of opposition and successfully influence policy change.
APA, Harvard, Vancouver, ISO, and other styles
36

Zhang, Zili, and mikewood@deakin edu au. "An Agent-based hybrid framework for decision making on complex problems." Deakin University. School of Computing and Mathematics, 2001. http://tux.lib.deakin.edu.au./adt-VDU/public/adt-VDU20050815.110726.

Full text
Abstract:
Electronic commerce and the Internet have created demand for automated systems that can make complex decisions utilizing information from multiple sources. Because the information is uncertain, dynamic, distributed, and heterogeneous in nature, these systems require a great diversity of intelligent techniques including expert systems, fuzzy logic, neural networks, and genetic algorithms. However, in complex decision making, many different components or sub-tasks are involved, each of which requires different types of processing. Thus multiple such techniques are required resulting in systems called hybrid intelligent systems. That is, hybrid solutions are crucial for complex problem solving and decision making. There is a growing demand for these systems in many areas including financial investment planning, engineering design, medical diagnosis, and cognitive simulation. However, the design and development of these systems is difficult because they have a large number of parts or components that have many interactions. From a multi-agent perspective, agents in multi-agent systems (MAS) are autonomous and can engage in flexible, high-level interactions. MASs are good at complex, dynamic interactions. Thus a multi-agent perspective is suitable for modeling, design, and construction of hybrid intelligent systems. The aim of this thesis is to develop an agent-based framework for constructing hybrid intelligent systems which are mainly used for complex problem solving and decision making. Existing software development techniques (typically, object-oriented) are inadequate for modeling agent-based hybrid intelligent systems. There is a fundamental mismatch between the concepts used by object-oriented developers and the agent-oriented view. Although there are some agent-oriented methodologies such as the Gaia methodology, there is still no specifically tailored methodology available for analyzing and designing agent-based hybrid intelligent systems. 
To this end, a methodology is proposed, which is specifically tailored to the analysis and design of agent-based hybrid intelligent systems. The methodology consists of six models - role model, interaction model, agent model, skill model, knowledge model, and organizational model. This methodology differs from other agent-oriented methodologies in its skill and knowledge models. As good decisions and problem solutions are mainly based on adequate information, rich knowledge, and appropriate skills to use knowledge and information, these two models are of paramount importance in modeling complex problem solving and decision making. Following the methodology, an agent-based framework for constructing hybrid intelligent systems for complex problem solving and decision making was developed. The framework has several crucial characteristics that differentiate this research from others. Four important issues relating to the framework are also investigated. These cover the building of an ontology for financial investment, matchmaking in middle agents, reasoning in problem solving and decision making, and decision aggregation in MASs. The thesis demonstrates how to build a domain-specific ontology and how to access it in a MAS by building a financial ontology. It is argued that the practical performance of service provider agents has a significant impact on the matchmaking outcomes of middle agents. It is proposed to consider service provider agents' track records in matchmaking. A way to provide initial values for the track records of service provider agents is also suggested. The concept of 'reasoning with multimedia information' is introduced, and reasoning with still image information using symbolic projection theory is proposed.
How to choose suitable aggregation operations is demonstrated through a financial investment application, and three approaches to implementing decision aggregation in MASs are proposed: the stationary agent approach, the token-passing approach, and the mobile agent approach. Based on the framework, a prototype was built and applied to financial investment planning. This prototype consists of one serving agent, one interface agent, one decision aggregation agent, one planning agent, four decision-making agents, and five service provider agents. Experiments were conducted on the prototype. The experimental results show the framework is flexible, robust, and fully workable. All agents derived from the methodology exhibit their behaviors correctly as specified.
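The decision-aggregation step described in this abstract can be sketched with one common family of aggregation operators, the ordered weighted averaging (OWA) operator. The thesis does not specify which operators it uses, so the function name, ratings, and weights below are purely illustrative:

```python
def owa(scores, weights):
    """Ordered Weighted Averaging: sort the scores in descending order,
    then take the weighted sum. Weights must sum to 1, so the operator
    spans plain averaging (equal weights) through max-like "optimistic"
    aggregation (weight concentrated on the top score)."""
    assert len(scores) == len(weights)
    assert abs(sum(weights) - 1.0) < 1e-9
    return sum(w * s for w, s in zip(weights, sorted(scores, reverse=True)))

# Four hypothetical decision-making agents rate one investment option.
ratings = [0.6, 0.9, 0.7, 0.8]

optimistic = owa(ratings, [0.4, 0.3, 0.2, 0.1])   # favours high ratings
neutral = owa(ratings, [0.25, 0.25, 0.25, 0.25])  # plain average
```

With the optimistic weights the aggregate is 0.80, versus the plain average 0.75, showing how the weight vector encodes the aggregation policy.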
APA, Harvard, Vancouver, ISO, and other styles
37

Feng, Luming. "PRACTICAL APPROACHES TO COMPLEX ROLE ASSIGNMENT PROBLEMS IN ROLE-BASED COLLABORATION." Thesis, Laurentian University of Sudbury, 2013. https://zone.biblio.laurentian.ca/dspace/handle/10219/2105.

Full text
Abstract:
Group role assignment (GRA) is an important task in Role-Based Collaboration (RBC). The complexity of group role assignment becomes very high as constraints are introduced. According to recent studies, considerable effort has been put towards research on complex group role assignment problems. Some of these problems are clearly defined and initial solutions are proposed. However, some of these solutions were unable to guarantee an optimal result, or their time complexity is very high. In fact, many real-world collaboration problems involve many types of constraints. Therefore, to make the solutions practical, the accuracy and efficiency of the algorithms should be improved. Role is the center of a role-based collaboration mechanism. Roles play an essential part in the whole process of a collaboration system; without roles, there would be no collaboration. One important function of a role is that it defines the features or requirements of a position, which can be used to filter or assess the candidates. The definition of roles greatly influences the evaluation results of candidates, which in turn influence the RBC algorithms significantly. Based on previous research, role-based evaluation is associated with multiple attribute decision making (MADM), and role-based evaluation methods can be adopted from MADM methods. Selecting an appropriate method for a specific problem is difficult and domain oriented. Therefore, a dynamic evaluation model which can be expanded by domain experts and adapted to many cases is required. At present, there is limited research related to this requirement. This thesis first focuses on two complex role-based collaboration problems: the first is group role assignment with constraints of conflicting agents, and the second is an agent training problem for a sustainable group. Practical solutions to these problems are proposed and solved with IBM ILOG CPLEX.
Simulations are conducted to demonstrate the performance of these solutions, from which I compare the proposed solutions' performance with that of the initial solutions and indicate the improvement achieved. Secondly, this thesis clarifies the difficulties of connecting evaluation methods with real-world requirements. To overcome these difficulties, I introduce an additional parameter, propose a dynamic evaluation model, and provide four synthesis methods to meet the requirements of a co-operation project funded by NSERC (Natural Sciences and Engineering Research Council of Canada). The contributions of this thesis include: clarifying the complexity of two complex role-based collaboration problems; proposing better solutions and verifying their efficiency and practicability; discussing the difficulties of connecting evaluation methods with real-world problems; introducing an additional parameter to improve evaluation accuracy for some problems; and proposing a role-based evaluation model that meets the requirements of adaptivity and expandability.
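The basic GRA formulation underlying this thesis (assign agents to role slots so that total qualification is maximised, with each role requiring a fixed number of agents) can be illustrated with a minimal, stdlib-only exhaustive sketch. The function name, qualification matrix, and requirement vector are illustrative only; the thesis solves larger constrained instances with IBM ILOG CPLEX rather than brute force:

```python
from itertools import permutations

def group_role_assignment(Q, L):
    """Exhaustively solve a tiny Group Role Assignment instance.

    Q[i][j] is agent i's qualification for role j; L[j] is the number
    of agents role j requires. Returns (best_total, assignment) where
    assignment[i] is agent i's role index, or None if unassigned.
    Assumes sum(L) <= len(Q); complexity is factorial, so this is
    only a specification-level sketch, not a practical solver.
    """
    # Expand each role into L[j] identical slots.
    slots = [j for j, need in enumerate(L) for _ in range(need)]
    best_total, best = float("-inf"), None
    for chosen in permutations(range(len(Q)), len(slots)):
        total = sum(Q[i][j] for i, j in zip(chosen, slots))
        if total > best_total:
            assign = [None] * len(Q)
            for i, j in zip(chosen, slots):
                assign[i] = j
            best_total, best = total, assign
    return best_total, best

Q = [[0.9, 0.2],   # three agents, two roles
     [0.6, 0.8],
     [0.4, 0.7]]
L = [1, 1]         # each role needs one agent
total, assign = group_role_assignment(Q, L)  # 1.7, [0, 1, None]
```

Adding constraints such as conflicting agents (pairs that may not serve together) turns this into the integer program the thesis hands to CPLEX.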
APA, Harvard, Vancouver, ISO, and other styles
38

Roper, Mark. "Honeybee visual cognition : a miniature brain's simple solutions to complex problems." Thesis, Queen Mary, University of London, 2017. http://qmro.qmul.ac.uk/xmlui/handle/123456789/25938.

Full text
Abstract:
In recent decades we have seen a string of remarkable discoveries detailing the impressive cognitive abilities of bees (social learning, concept learning and even counting). But should these discoveries be regarded as spectacular because bees manage to achieve human-like computations of visual image analysis and reasoning? Here I offer a radically different explanation. Using theoretical bee brain models and detailed flight analysis of bees undergoing behavioural experiments I counter the widespread view that complex visual recognition and classification requires animals to not only store representations of images, but also perform advanced computations on them. Using a bottom-up approach I created theoretical models inspired by the known anatomical structures and neuronal responses within the bee brain and assessed how much neural complexity is required to accomplish behaviourally relevant tasks. Model simulations of just eight large-field orientation-sensitive neurons from the optic ganglia and a single layer of simple neuronal connectivity within the mushroom bodies (learning centres) generated performances remarkably similar to the empirical result of real bees during both discrimination and generalisation orientation pattern experiments. My models also hypothesised that complex 'above and below' conceptual learning, often used to exemplify how 'clever' bees are, could instead be accomplished by very simple inspection of the target patterns. Analysis of the bees' flight paths during training on this task found bees utilised an even simpler mechanism than anticipated, demonstrating how the insects use unique and elegant solutions to deal with complex visual challenges. The true impact of my research is therefore not merely showing a model that can solve a particular set of generalisation experiments, but in providing a fundamental shift in how we should perceive visual recognition problems. 
Across animals, equally simple neuronal architectures may well underlie the cognitive affordances that we currently assume to be required for more complex conceptual and discrimination tasks.
APA, Harvard, Vancouver, ISO, and other styles
39

Hu, Kun. "Three Essays on Modeling Complex Dynamic Problems in Health and Safety." Diss., Virginia Tech, 2011. http://hdl.handle.net/10919/27621.

Full text
Abstract:
Essay #1 – Factors influencing the risk of falls in the construction industry: a review of the evidence Falls are a significant public health risk and a leading cause of nonfatal and fatal injuries among construction workers worldwide. A more comprehensive understanding of causal factors leading to fall incidents is essential to prevent falls in the construction industry. However, an extensive overview of causal factors is missing from the literature. In this paper, 536 articles on factors contributing to the risk of falls were retrieved. One hundred twenty-one (121) studies met the criteria for relevance and quality to be coded, and were synthesized to provide an overview. In lieu of the homogeneity needed across studies to conduct a structured meta-analysis, a literature synthesis method based on macro-variables was advanced. This method provides a flexible approach to aggregating previous findings and assessing agreement across those studies. Factors commonly associated with falls included working surfaces and platforms, workers’ safety behaviors and attitudes, and construction structure and facilities. Significant differences across qualitative and quantitative studies were found in terms of focus, and areas with limited agreement in previous research were identified. Findings contribute to research on the causes of falls in construction, developing engineering controls, informing policy and intervention design to reduce the risk of falls, and improving research synthesis methods. Essay #2 – Review of quantitative studies of interventions for responding to infectious disease outbreaks We reviewed the modeling and retrospective literature on responding to outbreaks of infectious diseases in humans and animals. Unlike routine immunization and control efforts, outbreak response activities require rapid reactive actions to address an urgent or emergent situation.
We focused our review on characterizing the types of diseases analyzed, the interventions used, and the models employed. Out of the 211 studies identified, we find that the majority focus on a few diseases (influenza, foot and mouth disease, smallpox, measles, and hepatitis). We identified 34 distinct interventions explored in these studies that fall under the general categories of vaccination, prophylaxis, quarantine/isolation, contact restriction, exposure reduction, killing/slaughtering, and surveillance. A large number of studies (141) use simulation/analytical models to analyze outbreak response strategies. We identify key factors contributing to the effectiveness of different interventions that target high-risk individuals, trace infected contacts, or use a ring to delineate geographical boundaries for an intervention. Essay #3 – Development of an individual-based model for polioviruses: implications of the selection of network type and outcome metrics We developed an individual-based (IB) model to explore the stochastic attributes of state transitions, the heterogeneity of the individual interactions, and the impact of different network structure choices on the poliovirus transmission process in the context of understanding the dynamics of outbreaks. We used a previously published differential equation-based model to develop the IB model and inputs. To explore the impact of different types of networks, we implemented a total of 26 variations of six different network structures in the IB model. We found that the choice of network structure plays a critical role in the model estimates of cases and the dynamics of outbreaks. This study provides insights about the potential use of an IB model to support policy analyses related to managing the risks of polioviruses and shows the importance of assumptions about network structure.
Ph. D.
APA, Harvard, Vancouver, ISO, and other styles
40

Lawson, Suzanne. "Addressing Complex Problems: Spatial Targeting, Disadvantage and Urban Governance in Australia." Thesis, Griffith University, 2011. http://hdl.handle.net/10072/366655.

Full text
Abstract:
Over the past 20 years governments in Australia have been experimenting with spatial targeting as a way to address disadvantage. Spatially targeted programs are distinct from traditional functional programs in that they are geographically based and focus on working with communities and across governments to address multiple problems at the local level. In this context NSW and Queensland stand out as two states that adopted explicit spatial responses to urban disadvantage in the form of place management programs. Place management programs were cast as innovative attempts to address concentrated disadvantage in discrete local communities. Why these two states adopted spatial targeting is the central question of this research. The research uses in-depth case study analysis of place management programs in Western Sydney and Brisbane to uncover the multiple factors that led to this form of spatial targeting. Analysis of the decision-making and implementation process for these programs provides insights about the policy process in Australia and the prospects for spatial targeting to tackle complex social policy issues. Place management programs are comparable with examples of spatial targeting in other Western democracies, for example area-based initiatives in the UK. Area-based initiatives sought to address the concentrated disadvantage that arose from restructuring and deindustrialisation, with targeted intensive interventions in local areas. Drawing on the international literature, this research extends the concept of spatial targeting by applying it to the Australian case. Whilst some of the international features are identifiable here, other aspects of spatial targeting are unique to the Australian institutional context. In this thesis it is argued that from the 1980s onwards, economic restructuring and urban redevelopment in Australian cities contributed to the emergence of complex problems. 
Existing governance arrangements were unable to respond to these problems and the capacity of the service system was undermined by increasing demand as well as public sector reforms and changing welfare policy. Spatially targeted programs were seen as a new way to respond to these issues.
Thesis (PhD Doctorate)
Doctor of Philosophy (PhD)
Griffith School of Environment
Science, Environment, Engineering and Technology
Full Text
APA, Harvard, Vancouver, ISO, and other styles
41

Gül, Asmen. "Crowdsourcing of Complex Problems in Industrial Firms : A Case Study Within the Packaging Industry." Thesis, KTH, Industriell ekonomi och organisation (Inst.), 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-284204.

Full text
Abstract:
This study takes root in the emergence of new crowdsourcing techniques that have made it possible to solve business problems of a complex nature by reaching outside traditional organizational boundaries. While crowdsourcing is not a new concept, emerging technological trends such as Industry 4.0 and a growing interest by organizations in leveraging the collective intelligence of online communities have made it an intriguing subject to study. The case study was conducted in California, USA, at an established Swedish firm within the packaging industry. As opposed to traditional forms of crowdsourcing of repetitive and simple tasks, this study emphasizes complex problems with open-ended goals that often require iteration and expert skills to solve. The case project, called Megatron, relates to a packaging product developed to transport data servers for major technology companies in Silicon Valley. The product significantly differs in material and design compared with earlier packaging versions that have served the same purpose. Not only does it fold within its footprint when empty, to save space and gain logistical benefits, but it is made of a lightweight and robust material. The development of this prototype with seemingly new features constituted a complex problem for the industrial firm that could be studied. Using innovation, problem types, and the speed and agility of the project as metrics and benchmarks to evaluate the potential of crowdsourcing, interviews and project documentation were gathered as empirical data. The empirical data were then analyzed with the help of literature, theories, and models that offered a range of crowdsourcing models for different purposes, and a relationship between problem types and innovation. The study indicated a strong willingness among company employees to integrate digital platforms and tools for experimentation and prototyping. Furthermore, the study identified paradoxes and drawbacks such as the malleability of complex problems.
However, recommendations to deal with the uncertainty are provided as well, such as online reputation systems and peer-reviewing tools to validate the quality of work.
APA, Harvard, Vancouver, ISO, and other styles
42

Cameron, Mark A., and Mark Cameron@csiro au. "A Problem Model for Decision Support Systems." The Australian National University. Faculty of Engineering and Information Technology, 2000. http://thesis.anu.edu.au./public/adt-ANU20020717.144031.

Full text
Abstract:
This body of research focuses on supporting problem-stakeholders, decision-makers and problem-solvers faced with an ill-defined and complex real world problem. An ill-defined problem has a characteristic trait of continual refinement. That is, the definition of the problem changes throughout the problem investigation and resolution process. The central theme of this research is that a support system should provide problem stakeholders with a problem definition model for constructing and manipulating a representation of the definition of the problem as they understand it. The approach adopted herein is to first develop a problem definition model for ill-defined problems— the 6-Component problem definition model. With this model, it is then possible to move on to identifying the types of changes or modifications to the problem definition that problem stakeholders, decision makers and problem solvers may wish to explore. Importantly, there must be a connection between the surface representation of the problem and the underlying implementation of the support system. This research argues that by focusing the support system around the problem definition, it is possible to reduce the mismatch between the problem objectives and the representation of the problem that the support system offers. This research uses the Unified Modelling Language to record and explore the requirements that problem stakeholders, faced with an evolving problem definition, place on a support system. The 6-Component problem definition model is then embedded within a design for an evolutionary support system. This embedding, supported by collaboration diagrams, shows how a system using the 6-Component problem definition model will support stakeholders in their exploration, evaluation and resolution of an ill-defined and complex real-world problem. A case study provides validation of the effectiveness of the 6-Component problem definition model proposed and developed in this work. 
The case study uses the 6-Component problem definition model as a basis for implementing the Integration Workbench, an evolutionary support system for land-use planning. Stakeholders explore, communicate, evaluate and resolve the Tasmanian Regional Forest Agreement problem with assistance from the Integration Workbench.
APA, Harvard, Vancouver, ISO, and other styles
43

Thies, Anna. "Understanding Complex Problems in Healthcare : By Applying a Free-Flowing Design Practice." Licentiate thesis, Stockholms universitet, Institutionen för data- och systemvetenskap, 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:su:diva-129710.

Full text
Abstract:
Healthcare in Sweden is in need of a transformation. The increase of chronic conditions poses a great challenge to the organisational structure of healthcare, which still largely remains based on acute, rather than chronic care. Development in the healthcare realm is commonly conducted in fragmented processes and from the professions', rather than the patients', perspective. A new type of design methodology has in recent years entered the field of healthcare development and innovation. It has progressively been used to enable the reorganisation of the healthcare system itself rather than solely developing artefacts, IT-systems or services within it. This thesis focuses on problems that can be described as open, complex, dynamic, networked, or wicked. The following research questions are investigated:    I.     How can designers' competence contribute to healthcare innovation?  II.     How can designers support the identification of complex problems in healthcare? Empirical data were collected during two healthcare innovation projects in which the author took an active role as both designer and researcher. The research work was based on qualitative data that were gathered using ethnographic methodology (i.e., interviews, participant observation and field notes). The data were analysed using open coding principles and activity theory. The results highlighted the valuable role of free-flowing design practice, supporting a thorough understanding of complex problems. The free-flowing design practice entails that the problem space and the solution space co-evolve. These spaces expand iteratively, continuously affecting each other while redefining problems in search of solutions that aim at radical innovation and not merely incremental ameliorations.
APA, Harvard, Vancouver, ISO, and other styles
44

Middleton, Howard Eric, and n/a. "The Role of Visual Mental Imagery in Solving Complex Problems in Design." Griffith University. School of Education, 1998. http://www4.gu.edu.au:8080/adt-root/public/adt-QGU20050919.170056.

Full text
Abstract:
APA, Harvard, Vancouver, ISO, and other styles
45

Tap, Koray. "Complex source point beam expansions for some electromagnetic radiation and scattering problems." Columbus, Ohio : Ohio State University, 2007. http://rave.ohiolink.edu/etdc/view?acc%5Fnum=osu1190015563.

Full text
APA, Harvard, Vancouver, ISO, and other styles
46

Du, Zhihua [Verfasser]. "Boundary Value Problems for Higher Order Complex Partial Differential Equations / Zhihua Du." Berlin : Freie Universität Berlin, 2008. http://d-nb.info/1022870912/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
47

Spellman, Kevin James. "Using ideation tools for face-to-face collaboration within complex design problems." Thesis, Goldsmiths College (University of London), 2010. http://research.gold.ac.uk/4744/.

Full text
Abstract:
The focus of this research is ideation tools and their ability to catalyse ideas to address complex design problems. Complex design problems change over time, and the interactions among the components of the problem, and between the problem and its environment, are such that the system as a whole cannot be fully understood simply by analyzing its components (Cilliers 1998, pp. I). Ideation, for this research, is defined as a process of generating, developing and communicating ideas that are critical to the design process (Broadbent, in Fowles 1979, pp. 15). Following Karni and Arciszewski, who stated that ideation tools should act more like an observer or suggester than a controller or an expert, I define design ideation tools as tools or methods that enhance, increase and improve the user's ability to generate ideas with the client (Karni and Arciszewski 1997; Reinig and Briggs 2007). Based on a survey of over 70 ideation tools, protocol analysis of design activities, a web survey and semi-structured interviews, I conclude that designers and clients may not have sufficient knowledge of ideation or ideation tools, in either testing or practice, as a catalyst for generating possibilities, and that measuring ideation tools by how many ideas they generate is misleading because it relates creativity to idea generation but does not adequately reflect the participants' experience. This research suggests that participants' cultural perceptions of design ideation and the design process actively inhibit idea generation, and that a shift from outcome-led ideation tool design to designing ideation tools that engage design contexts is necessary to effectively address complex design problems. As its contribution to new knowledge, this research identified a gap in ideation tools that allow designers to collaborate with their clients during the ideation phase to catalyse possibilities for complex design problems.
APA, Harvard, Vancouver, ISO, and other styles
48

Middleton, Howard Eric. "The Role of Visual Mental Imagery in Solving Complex Problems in Design." Thesis, Griffith University, 1998. http://hdl.handle.net/10072/366392.

Full text
Abstract:
The problem addressed in this thesis is the nature of design expertise and the role of visual mental imagery in design. The problem is addressed firstly, by examining the nature of problems, including design problems. It is argued that design problems are complex and ill-defined and can be distinguished from non-design problems. Secondly, design expertise is examined. It was found that design experts have a large store of design knowledge in a form that is readily accessible, and engage in extensive problem-finding prior to generating design solutions. Thirdly, the role of visual mental images as a component of design problem-solving and design expertise is examined. It is argued that visual mental images are important features of both design expertise and the transition from novice to expert. A number of case studies are designed and conducted. The findings of these studies are interpreted as supporting the theoretical ideas developed in the thesis. The introduction of design-based technology programs into Australian high schools has created the need for teachers to be able to assist students to generate creative solutions to design problems. Currently, technology teachers are experiencing difficulty in helping students to generate creative solutions to design problems. Hence a better understanding of design process may help to shape teaching and learning in design-based subjects. Furthermore, many complex everyday problems share similar properties with design problems. The research may therefore contribute to the understanding of the way people solve problems that have some characteristics in common with design problems. It is argued in this thesis that existing theories and models explaining the nature of problems and of the processes of solving problems are adequate in explaining many categories of problems and problem-solving but are inadequate in explaining the process of solving design problems. A new model of a problem space is proposed and justified. 
It is argued that design problems occur within a problem space that consists of a problem zone, a search and construction space and a satisficing zone. To establish, theoretically, the role of visual mental imagery in designing, two bodies of cognitive research literature are employed. Firstly, research into the utility of sketches in problem-solving is examined. This research indicates that external images assist problem-solving. Secondly, research into the relationship between perception and imagery is examined. This research suggests that visual mental images are functionally equivalent to perceived images. Thirdly, by combining the findings on sketches in problem-solving with the findings on imagery and perception, it is then possible to argue that visual mental images can assist problem-solving, and may play an important role in the resolution of complex design problems. The cognitive theory explaining the role of visual mental imagery in problem-solving in design is used to develop predictions for testing in two practical studies. Designers use visual imagery to represent and transform complex design problems within the problem space, and visual images are theorised as capable of providing more efficient representations for solving design problems than other forms of representation such as propositions. In the two studies undertaken in this thesis, a case study methodology was employed. The findings of the two studies support the arguments developed in this thesis that expert designers are able to form more complete and more detailed images of design problems and solutions than novices. Expert designers have a large store of previous solutions that can be retrieved from long-term memory as visual mental images. Expert designers are able to recognise when their existing solutions can be used, how they might be modified for use, and where something new is required.
The study examined designing in terms of the deployment of procedures, the relationships among these procedures, and their connection with image usage. It was found that designers traverse the design problem space using generative and exploratory procedures, and that these procedures are facilitated by, and in turn facilitate, the production of visual mental images. The study provides a model of a problem space that can be used to explain the process of solving complex ill-defined problems, the cognitive processing involved in creative thinking and the role of mental imagery in an information processing theory of problem-solving. Conceptualising the problem space as containing a problem zone, search and construction space and satisficing zone makes it possible to apply the concept of a problem space to problems that lack well-specified problem and goal states and a limited set of operators. Integrating imagery theories with information processing theories provides an account of the process of solving complex design problems and the generation of novel solutions.
Thesis (PhD Doctorate)
Doctor of Philosophy (PhD)
School of Education
Full Text
APA, Harvard, Vancouver, ISO, and other styles
49

Taylor, Ryan M. "Bioinformatic Solutions to Complex Problems in Mass Spectrometry Based Analysis of Biomolecules." BYU ScholarsArchive, 2014. https://scholarsarchive.byu.edu/etd/5585.

Full text
Abstract:
Biological research has benefitted greatly from the advent of omic methods. For many biomolecules, mass spectrometry (MS) methods are the most widely employed, owing to the sensitivity that allows low quantities of sample and the speed that allows analysis of complex samples. Improvements in instrument and sample preparation techniques create opportunities for large-scale experimentation. The complexity and volume of data produced by modern MS-omic instrumentation challenge biological interpretation, while the complexity of the instrumentation, sample noise, and complexity of data analysis present difficulties in maintaining and ensuring data quality, validity, and relevance. We present a corpus of tools that improves the quality assurance capabilities of instruments, provides comparison abilities for evaluating data analysis tool performance, distills ideas pertinent to MS analysis into a consistent nomenclature, enhances all lipid analysis by automatic structural classification, implements a rigorous and chemically derived lipid fragmentation prediction tool, introduces custom structural analysis approaches and validation techniques, simplifies protein analysis from SDS-PAGE sample excisions, and implements a robust peak detection algorithm. These contributions provide improved identification of biomolecules, improved quantitation, and improved data quality and algorithm clarity for the MS-omic field.
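The abstract mentions a robust peak detection algorithm without specifying it. As a rough illustration of what peak detection over an MS intensity trace involves (a hypothetical sketch, not the thesis's implementation), a local-maximum detector with a height threshold and neighbour suppression can be written in a few lines:

```python
def detect_peaks(signal, min_height=0.0, min_separation=1):
    """Return sorted indices of local maxima above min_height,
    keeping only the tallest peak within each min_separation window."""
    candidates = [
        i for i in range(1, len(signal) - 1)
        if signal[i] > min_height
        and signal[i] > signal[i - 1]
        and signal[i] >= signal[i + 1]
    ]
    # Greedily accept the tallest candidates first, suppressing
    # any later candidate that falls too close to an accepted peak.
    peaks = []
    for c in sorted(candidates, key=lambda i: signal[i], reverse=True):
        if all(abs(c - p) >= min_separation for p in peaks):
            peaks.append(c)
    return sorted(peaks)
```

For example, `detect_peaks([0, 1, 0, 0, 5, 1, 0, 3, 0], min_height=0.5, min_separation=2)` returns `[1, 4, 7]`. Real MS peak pickers additionally handle baseline estimation, noise modelling and centroiding, which this sketch omits.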
APA, Harvard, Vancouver, ISO, and other styles
50

Charles, Mehdi. "Modeling and solving complex multi-item lot-sizing problems with inventory constraints." Thesis, Lyon, 2021. http://www.theses.fr/2021LYSEM039.

Full text
Abstract:
In this thesis, we considered the capacitated multi-item lot-sizing problem with setup times and lost sales. We extended this problem to take into account important industrial aspects, especially with regard to inventory management. We first studied the end-of-horizon effects on optimal solutions of lot-sizing problems, which can lead to significant additional costs even on a rolling horizon. To reduce these effects, we added a global minimum ending-inventory constraint as well as a maximum ending-inventory constraint for each item. These values were deduced from the analysis of a cyclical capacitated lot-sizing problem with setup times, whose linear relaxation can be solved analytically. We then modelled the inventory evolution within each period, a point that is especially relevant when storage capacity is limited. We added new inventory constraints, which differ according to hypotheses on the shapes of production and demand, to better respect inventory bounds when scheduling production within each period. Numerical experiments showed that these new constraints make it possible to schedule production plans with better inventory management. Decomposition approaches (Lagrangian relaxation, relax-and-fix) were developed in order to propose generic methods for solving capacitated lot-sizing problems with setup times. An original use of parallelization was proposed in order to reduce the size of the subproblems to solve and to exploit the tools available at DecisionBrain. Finally, the parallelized relax-and-fix heuristic was implemented in DecisionBrain's optimization tool and tested on industrial instances.
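The capacitated multi-item problem studied in this thesis is hard to solve exactly, but its simplest relative, single-item uncapacitated lot-sizing, admits the classic Wagner-Whitin dynamic program. A minimal sketch, with made-up demands and costs (illustrative only; not the decomposition methods of the thesis):

```python
def wagner_whitin(demand, setup_cost, holding_cost):
    """Single-item uncapacitated lot-sizing via the Wagner-Whitin DP.

    cost[t] = minimum cost of satisfying demand for periods 0..t-1.
    Each candidate j places the last production run in period j,
    producing there everything demanded in periods j..t-1.
    """
    T = len(demand)
    cost = [0.0] + [float("inf")] * T
    for t in range(1, T + 1):
        for j in range(t):  # last setup occurs in period j
            run = setup_cost
            # holding cost: demand of period k is carried for k - j periods
            for k in range(j, t):
                run += holding_cost * (k - j) * demand[k]
            cost[t] = min(cost[t], cost[j] + run)
    return cost[T]
```

For demand `[50, 60, 90, 70]`, a setup cost of 100 and a unit holding cost of 1, the optimal plan costs 330 (two production runs, in the first and third periods). The capacitated multi-item variant with setup times adds coupling constraints across items, which is what motivates the Lagrangian relaxation and relax-and-fix decompositions described above.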
APA, Harvard, Vancouver, ISO, and other styles