
Dissertations / Theses on the topic 'Theory of Constraints'

Consult the top 50 dissertations / theses for your research on the topic 'Theory of Constraints.'


1

Smith, Barry Crawford. "Epistemic constraints on semantic theory." Thesis, University of Edinburgh, 1991. http://hdl.handle.net/1842/26947.

Full text
Abstract:
In this thesis I adopt an antirealist view of language understanding. According to that view we have understanding of a language only when it can be shown to others in common linguistic practice. I examine claims for the kind of knowledge involved in that understanding; claims about what we possess, how this is revealed, and how we may investigate it. By accepting that language use is a cognitive skill, I have to reject accounts of knowledge in purely linguistic, behavioural or social terms. The best way to examine cognitivist claims is by developing a cognitivist theory for the empirical investigation of the mind. I then show that these epistemic considerations bear on the epistemological claims of the antirealist. Further, I show that this way of explaining is quite compatible with the antirealist scruples about public display of our knowledge. It is claimed that empirical cognitive theories are not known to speakers who recognise and reveal their understanding, but explain how they do this. It is then shown why this claim is neither irrelevant to the philosopher's inquiry into meaning, nor posterior to it, nor leads to the replacement of it. A partnership of constructivism and cognitivism is necessary to reflect and explain the epistemic limits to the semantical content of human languages.
2

Patel, Mrudula A. "Intertemporal consumer theory with borrowing constraints." Thesis, University of Essex, 1990. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.238399.

Full text
3

Horncastle, Edward T. "Core flow modelling : Constraints from dynamo theory." Thesis, University of Liverpool, 2008. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.502302.

Full text
Abstract:
In recent history our understanding of the magnetic field and of the generating motions of the molten iron in the Earth's core has increased dramatically. The two major approaches to investigating fluid flow have been core surface flow modelling from magnetic data inversion and modelling of the dynamo itself. Core flow modelling involves downward continuation of the magnetic field to the core-mantle boundary (CMB), then adopting the frozen-flux approximation plus added assumptions (e.g. tangential geostrophy) to reduce non-uniqueness, to obtain a fluid flow at the surface of the core that produces the observed secular variation (SV). The main check on the validity of these flows has been observed changes in length of day. This study aims to test the fluid flow inversion more rigorously by using synthetic data of the main field, SV and fluid flow from two self-consistent, convection-driven, dipole-dominated dynamos. The dynamo magnetic data are inverted and comparisons made with the true dynamo flow. The use of two large-scale assumptions, the strong norm and the KE norm, has been tested. Forward models of advection, a neglected advection, and diffusion from the dynamo data have been calculated to compare contributions to the secular variation. It is shown that within the dynamos the definition of the magnetic Reynolds number is flawed, relating to a failure of the frozen-flux approximation. The effect of truncation of field and flow on the generated advection has been studied. It was found that both the failure of the frozen-flux approximation and truncation had a large effect on the flow inversions. Another possible reason for the non-recovery of some parts of the flow was found to be that much of the true and inverted flow lay along contours of B_r/cos θ, the null space caused by the geostrophic assumption. With reducing this non-uniqueness in mind, the validity of a new assumption called helical flow was checked by studying the true properties of the dynamo flow.
A new spectral helical flow constraint that can be applied separately to tangential geostrophy has been developed. With the caveat that the results have been found on dynamos with parameters very different from the Earth's, cautious conclusions have been drawn on the best combinations of assumptions to use in Earth core flow models. It has been shown that, at the truncation of Earth models, when the new helical flow constraint is used with the KE norm and weak geostrophy, more of the dynamo's true flow is recovered. The results have been applied to the Earth and validated using changes in the length of day.
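The frozen-flux balance that this style of inversion rests on can be written in its standard textbook form (not quoted from the thesis): the radial secular variation at the CMB is attributed entirely to horizontal advection of B_r, with magnetic diffusion neglected.

```latex
% Radial induction equation at the core-mantle boundary under the
% frozen-flux approximation: the secular variation dB_r/dt is produced
% solely by the horizontal flow u_H advecting the radial field B_r.
\frac{\partial B_r}{\partial t} = -\nabla_H \cdot \left( \mathbf{u}_H \, B_r \right)
```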
4

Barreau, Sofka. "Developmental constraints on a theory of memory." Thesis, University College London (University of London), 1997. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.267644.

Full text
5

Flieger, Wojciech. "Constraints on neutrino mixing from matrix theory." Doctoral thesis, Katowice : Uniwersytet Śląski, 2021. http://hdl.handle.net/20.500.12128/21721.

Full text
Abstract:
One of the key problems of modern elementary particle physics concerns the number of neutrino flavours occurring in nature. So far it has been established that there are three types of active neutrinos. An important problem is to determine whether additional neutrino states exist. Such neutrinos are called sterile because their weak interaction with known matter has so far remained below the experimental detection threshold. Nevertheless, sterile neutrinos can mix with active neutrinos, leaving traces of their existence at the level of the Standard Model in the form of non-unitarity of the neutrino mixing matrix. For this reason, the study of the non-unitarity of the mixing matrix is essential for a full understanding of neutrino physics. In this dissertation we present a new method of analysing the neutrino mixing matrix based on matrix theory. The foundation of our approach to the study of the mixing matrix is the notions of singular values and contractions. Using these notions, we defined the region of physically admissible mixing matrices as the convex hull spanned by the three-dimensional unitary mixing matrices determined from experimental data. In the dissertation we study the geometric properties of this region, determining its volume expressed through the Haar measure of the singular value decomposition, and studying its internal structure, which depends on the minimal number of additional sterile neutrinos. Applying the theory of unitary dilation, we show how singular values allow one to identify non-unitary mixing matrices and how to construct their extensions to a full unitary matrix of dimension greater than three, describing a complete theory containing sterile neutrinos. On this basis we derive new constraints in models where the active neutrinos mix with one additional sterile neutrino.
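The contraction test at the heart of this approach — the physically admissible 3×3 blocks of a larger unitary mixing matrix have all singular values in [0, 1] — can be sketched numerically. A minimal illustration assuming NumPy; the matrices are random stand-ins, not fitted to experimental data:

```python
import numpy as np

# Illustrative only: build a 4x4 unitary matrix via QR of a random complex
# matrix, standing in for a full mixing matrix with one sterile neutrino.
rng = np.random.default_rng(0)
m = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
q, _ = np.linalg.qr(m)

# The observable 3x3 block describes mixing among the active flavours.
u3 = q[:3, :3]

# A principal submatrix of a unitary matrix is a contraction: every
# singular value lies in [0, 1]. A measured 3x3 mixing matrix violating
# this bound could not be embedded in any larger unitary matrix.
sigma = np.linalg.svd(u3, compute_uv=False)
assert np.all((sigma >= 0) & (sigma <= 1 + 1e-12))
```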
6

Komen, Erwin R. "Branching constraints." Universität Potsdam, 2009. http://opus.kobv.de/ubp/volltexte/2009/3227/.

Full text
Abstract:
Rejecting approaches with a directionality parameter, mainstream minimalism has adopted the notion of strict (or unidirectional) branching. Within optimality theory however, constraints have recently been proposed that presuppose that the branching direction scheme is language specific. I show that a syntactic analysis of Chechen word order and relative clauses using strict branching and movement triggered by feature checking seems very unlikely, whereas a directionality approach works well. I argue in favor of a mixed directionality approach for Chechen, where the branching direction scheme depends on the phrase type. This observation leads to the introduction of context variants of existing markedness constraints, in order to describe the branching processes in terms of optimality theory. The paper discusses how and where the optimality theory selection of the branching directions can be implemented within a minimalist derivation.
7

Grundling, Hendrik, and Fernando Lledo. "Local Quantum Constraints." ESI preprints, 2000. ftp://ftp.esi.ac.at/pub/Preprints/esi897.ps.

Full text
8

Chow, Chi-Ming. "Predictive control with constraints." Thesis, University of Oxford, 1996. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.320170.

Full text
9

Ortega, Sandoval Josue Alberto. "Matching with real-life constraints." Thesis, University of Glasgow, 2017. http://theses.gla.ac.uk/8492/.

Full text
Abstract:
This thesis consists of four chapters. The first chapter explains the relevance of the research that has been undertaken and contains an overview of this research for a general audience. The second chapter studies multi-unit assignment with endogenous quotas in a dichotomous preference domain. The main conclusion I obtain is that pseudo-market mechanisms perform poorly in this type of environment. The third and fourth chapters use matching theory to understand segregation in matching environments, ranging from integrating kidney exchange platforms to the increase in interracial marriages after the popularization of online dating platforms. In both chapters, using different formulations, I show under which conditions social integration can be obtained.
10

Kwong, April P. "Tree pattern constraints for XML : theory and applications /." For electronic version search Digital dissertations database. Restricted to UC campuses. Access is free to UC campus dissertations, 2004. http://uclibs.org/PID/11984.

Full text
11

Malherbe, Johannes Louw. "Scheduling program based on the theory of constraints." Thesis, Stellenbosch : Stellenbosch University, 2003. http://hdl.handle.net/10019.1/53581.

Full text
Abstract:
Thesis (MEng)--Stellenbosch University, 2003.
ENGLISH ABSTRACT: The goal of this thesis is to provide a stepping-stone for the design and development of a software package that implements the Goal System Algorithm, based on the Theory of Constraints (TOC). This includes the complete description and explanation of the Goal System (GS) Algorithm, as well as the partial implementation of this algorithm using Microsoft Access as a Database Management System (DBMS) and Microsoft Visual C++ as the programming language. The main development effort was put into the development of a scheduling algorithm and the implementation of a data structure that lies at the core of this algorithm. The reason for the development of such a package is that it will aid a production manager, working in a small to medium-sized job shop, in generating a schedule for production that will increase throughput while simultaneously reducing both inventory and operating expense, thereby generating profits and cash flow. With regard to this thesis and the overall project goal, the following have been achieved. 1. The complete project has been researched, scoped and each step has been explained. 2. The complete program structure has been defined and broken into two separate modules: the Data Mining and Conversion Module and the TOC Scheduling Algorithm. 3. The database containing all the MRP data necessary for scheduling has been designed and implemented using an MS Access database with an ODBC connection. An ODBC connection to the database was used so that a smooth transition to other database management systems can be made. 4. The TOC Scheduling Algorithm has been developed and the following have been implemented: • A basic user interface has been created for the insertion of all the user input and to display the constraint schedule. • A data structure called a linked list has been developed and used to store the scheduling data in memory. • The complete GS algorithm has been researched and explained.
• The GS algorithm has been implemented and tested up to the point where it schedules the constraint. • The pseudo-code for the part of the GS algorithm that was not implemented has been documented and included in this report. More development needs to be done and a proper Graphical User Interface must also be created to complete this project, but after completion a TOC software package will exist that is completely unique in South Africa, and the market potential for this package will be considerable.
AFRIKAANSE OPSOMMING (translated): The aim of this thesis is to create the foundation for the design and development of a software package that implements Goldratt's Goal System Algorithm, based on the Theory of Constraints. This includes the detailed description of the Goal System Algorithm and a partial implementation of the algorithm, using a Microsoft Access database as database management system and Microsoft Visual C++ as programming language. The main emphasis was placed on the development of the scheduling algorithm and the implementation of the structures that form part of the core of the algorithm. The main reason for developing such a package is that it will help a production manager of a small to medium-sized manufacturing business to generate a production schedule that will increase the floor's throughput while reducing inventory and operating costs. In other words, it will help the business to make more money now and in the future. With regard to the thesis and the overall project goal, the following has been achieved: 1. The whole project has been researched, set out and explained. 2. The whole program structure has been defined and broken into two separate modules, namely the 'Data Mining and Conversion Module' and the 'TOC Scheduling Algorithm'. 3. The database containing all the MRP information needed for scheduling has been designed and implemented using an MS Access database with an ODBC connection. An ODBC connection was used so that, if necessary, a switch to other database management systems can be made without any difficulty. 4. The 'TOC Scheduling Algorithm' has been developed and the following has been implemented: • A basic user interface has been developed so that all the required input data can be entered into the program. • A linked list has been developed and used as the data structure to store all the scheduling information in memory.
• The Goal System algorithm has been explained and documented in full. The Goal System algorithm has been implemented up to the point where it schedules the primary bottleneck. • The pseudo-code for the part of the GS algorithm that was not implemented has been set out and included as part of the report. Further development is still needed and a better user interface must still be created to finally complete the project, but once that is done a TOC scheduling package will exist that is completely unique to South Africa and will have considerable market potential.
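The constraint-scheduling step both abstracts describe — loading operations onto the bottleneck resource in priority order and computing start and finish times — can be sketched as follows. This is a hypothetical simplification: a plain Python list stands in for the thesis's linked list, and all names and job data are illustrative.

```python
from dataclasses import dataclass

# Hypothetical sketch of forward-loading the constraint (bottleneck):
# operations are placed back-to-back in priority order.
@dataclass
class Operation:
    order: str
    hours: float  # processing time on the constraint

def schedule_constraint(ops, start=0.0):
    timetable, clock = [], start
    for op in ops:  # ops are assumed to be in priority order already
        timetable.append((op.order, clock, clock + op.hours))
        clock += op.hours
    return timetable

ops = [Operation("A-101", 2.0), Operation("B-202", 1.5)]
assert schedule_constraint(ops) == [("A-101", 0.0, 2.0), ("B-202", 2.0, 3.5)]
```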
12

Kizilersü, Ayşe. "Gauge theory constraints on the fermion-boson vertex." Thesis, Durham University, 1995. http://etheses.dur.ac.uk/4886/.

Full text
Abstract:
In this thesis we investigate the role played by fundamental properties of QED in determining the non-perturbative fermion-boson vertex. These key features are gauge invariance and multiplicative renormalisability. We use the Schwinger-Dyson equations as the non-perturbative tool to study the general structure of the fermion-boson vertex in QED. These equations, being an infinite set, have to be truncated if they are to be solved. Such a truncation is made possible by choosing a suitable non-perturbative ansatz for the fermion-boson vertex. This choice must satisfy the key properties of gauge invariance and multiplicative renormalisability. In this thesis we develop the constraints, in the case of massless unquenched QED, that have to be fulfilled to ensure that both the fermion and photon propagators are multiplicatively renormalisable, at least as far as leading and subleading logarithms are concerned. To this end, the Schwinger-Dyson equations are solved perturbatively for the fermion and photon wave-function renormalisations. We then deduce the conditions imposed by multiplicative renormalisability on these renormalisation functions. As a last step we compare the two results coming from the solution of the Schwinger-Dyson equations and from multiplicative renormalisability in order to derive the necessary constraints on the vertex function. These constitute the main results of this part of the thesis. In the weak coupling limit the solution of the Schwinger-Dyson equations must agree with perturbation theory. Consequently, we can find additional constraints on the 3-point vertex by perturbative calculation. Hence, the one-loop vertex in QED is then calculated in arbitrary covariant gauges as an analytic function of its momenta. The vertex is decomposed into a longitudinal part, which is fully responsible for ensuring the Ward and Ward-Takahashi identities are satisfied, and a transverse part.
The transverse part is decomposed into 8 independent components, each being separately free of kinematic singularities in any covariant gauge, in a basis that modifies that proposed by Ball and Chiu. Analytic expressions for all 11 components of the O(α) vertex are given explicitly in terms of elementary functions and one Spence function. These results greatly simplify in particular kinematic regimes. These are the new results of the second part of this thesis.
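The decomposition referred to above follows the standard convention (textbook form, not quoted from the thesis): the Ward-Takahashi identity fixes the longitudinal piece of the vertex in terms of the inverse fermion propagator, while the transverse remainder is annihilated by the photon momentum.

```latex
% Ward-Takahashi identity and the longitudinal/transverse split of the
% fermion-boson vertex, with q = p' - p the photon momentum:
q_\mu \Gamma^\mu(p',p) = S^{-1}(p') - S^{-1}(p),
\qquad
\Gamma^\mu = \Gamma^\mu_L + \Gamma^\mu_T,
\qquad
q_\mu \Gamma^\mu_T(p',p) = 0 .
```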
13

Weber, Guglielmo. "Consumption, liquidity constraints and aggregation." Thesis, London School of Economics and Political Science (University of London), 1988. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.262094.

Full text
14

Corbett, Dan R. "Unification and constraints over conceptual structures." Title page, contents and summary only, 2000. http://web4.library.adelaide.edu.au/theses/09PH/09phc7889.pdf.

Full text
Abstract:
Bibliography: leaves 150-161. This thesis addresses two areas in the field of conceptual structures. The first is the unification of conceptual graphs, and the consequent work in projection and in type hierarchies... The second area of investigation is the definition of constraints, especially real-value constraints on the concept referents, with particular attention to handling constraints during the unification of conceptual graphs.
15

Beck, James Michael. "Applying theory of constraints to an aircraft remanufacturing line." Thesis, Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 1993. http://handle.dtic.mil/100.2/ADA277987.

Full text
Abstract:
Thesis (M.S. in Management) Naval Postgraduate School, December 1993.
Thesis advisor(s): Shu S. Liao ; Martinus Sarigul-Klijn. "December 1993." Includes bibliographical references. Also available online.
16

Sancak, Cemile. "Investment under borrowing constraints, theory and evidence from Turkey." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 2000. http://www.collectionscanada.ca/obj/s4/f2/dsk1/tape3/PQDD_0026/NQ52333.pdf.

Full text
17

Wiedemann, Urs Achim. "Constraints and spontaneous symmetry breaking in quantum field theory." Thesis, University of Cambridge, 1994. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.336764.

Full text
18

Movahhed, Abdolmohammad. "Context and constraints in Stanley Fish's reader-response theory." Thesis, University of Strathclyde, 2009. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.510849.

Full text
19

Sancak, Cemile. "Investment under borrowing constraints; theory and evidence from Turkey." Dissertation (Economics), Carleton University, Ottawa, 2000.

Find full text
20

Elyasir, Ahmed H. S. "The relationship between performance measures of theory of constraints." Thesis, University of Manchester, 2005. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.629933.

Full text
Abstract:
The subject of this thesis is the performance management system of the theory of constraints (TOC). Its objectives are to examine to what extent the strength of the relationship between the operational measure X (Ox) and the global measure Y (Gy) may be affected by: 1) changing one or both parts of the relationship, and 2) changes in the operational environment (OE); then to utilize the knowledge gained in analyzing and planning the overall performance of industrial firms working under a TOC environment, as well as supporting strategic decision-making regarding the choice of the most suitable strategy to improve the overall performance of such a firm. The first dimension of the research objectives was tested under four different OEs and the second dimension was examined under the nine possible combinations of the operational and global measures. Two aspects of the operational environment were chosen, one to represent the external environment at the strategic level, which is the financial openness (or closeness) of the system, and the second to represent the internal environment at the operational level, which is the inventory replenishment policy. Each of the two factors has two levels; in the case of the first factor the levels are 'open system' against 'closed system' ('open system' means the unit sells some of its products on credit and obtains some of its supplies on debit; 'closed system' means that all these transactions take place on a ready-cash basis). For the second factor, the two levels are 'lot for lot' against 'reorder-point' policies of inventory replenishment. The combination of these four levels creates four different operational environments, under each of which the relationships were measured and compared.
The following statements represent the two research hypotheses: 1. The strength of the relationship between the different pairs of measures within the same operational environment differs from one pair to another; and 2. The strength of the relationship between the same pair of measures differs from one operational environment to another. A simulation model was built to provide the data required to measure the relationships and then to test the research hypotheses. The strength of the relationships was measured in terms of the regression coefficient. The total effect of each of the operational measures on each of the global measures of the system was measured using the causal model of the relationships, which was developed on the basis of TOC and Throughput Accounting principles and definitions. The research hypotheses were then tested using MANOVA doubly repeated measures with a fully factorial design. The statistical analysis supports the two hypotheses. Based on the research results, a simple model was developed which may be used as a decision support system to help management in choosing the most efficient and effective combination of the operational environment and strategy to achieve the stated goal. Towards the end of the thesis, a number of further research opportunities are identified, including further validation and verification of the proposed model.
21

Tagawa, John T. (John Tetsuo) 1972. "Implementing theory of constraints in a job shop environment." Thesis, Massachusetts Institute of Technology, 1999. http://hdl.handle.net/1721.1/9805.

Full text
Abstract:
Thesis (S.M.)--Massachusetts Institute of Technology, Sloan School of Management; and, (S.M.)--Massachusetts Institute of Technology, Dept. of Mechanical Engineering, 1999.
Includes bibliographical references (p. 91).
The Boeing Company is under performance pressures due to internal process performance and external cost pressures. The response has been a focus on manufacturing fundamentals to meet market demands on schedule and cost. Boeing has utilized Lean Manufacturing as the methodology for improving manufacturing. This thesis describes an implementation of Theory of Constraints in a job shop environment in a Boeing component manufacturing shop. Lean Manufacturing and Theory of Constraints are described and compared as methodologies for improving manufacturing systems. This thesis demonstrates that the two methodologies can be integrated in one manufacturing system. The TOC five-step continuous improvement methodology was utilized in the implementation as a framework for analysis. A process for identifying bottleneck operations in a job shop is detailed. The steps are to identify the process flows, determine constraints within the process flows, and release material into the flowpath at the constraint production rate. It is probable that the actual constraint in the system will not be identifiable through data analysis, and methods for determining constraint operations through constraint engineering are described. An implementation of the drum-buffer-rope material control process is described in this thesis. To enable the implementation, a data management system was developed. The system utilizes the concept of critical-ratio scheduling priority, a time buffer to protect bottlenecks from starvation, and process flow to provide the necessary information for operating in a drum-buffer-rope pull environment. The drum-buffer-rope material control policy provides a method for controlling the WIP and cycle time in a factory within an MRP framework. The issues encountered in the implementation are detailed. These relate to information systems, organizational history, metrics, organizational culture and incumbent policies.
All of these pose challenges to implementing a TOC system and need to be managed properly.
by John T. Tagawa.
S.M.
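The critical-ratio priority mentioned in the abstract divides the time remaining until a job's due date by its remaining processing time; the smallest ratio is the most urgent and is dispatched first at the drum (bottleneck). A minimal sketch with illustrative job data (not from the thesis):

```python
from dataclasses import dataclass

@dataclass
class Job:
    name: str
    due_in_days: float     # time remaining until the due date
    work_left_days: float  # remaining processing time

def critical_ratio(job: Job) -> float:
    # CR < 1 means the job is already behind schedule.
    return job.due_in_days / job.work_left_days

# Dispatch at the bottleneck in ascending critical-ratio order.
jobs = [Job("A", 4.0, 2.0), Job("B", 3.0, 3.0), Job("C", 10.0, 2.0)]
queue = sorted(jobs, key=critical_ratio)
assert [j.name for j in queue] == ["B", "A", "C"]
```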
22

Lopez-Mejia, Alejandro. "Liquidity constraints, near rationality and consumption." Thesis, Queen Mary, University of London, 1992. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.390359.

Full text
23

Delgado, Pamela I. "Bipartitions Based on Degree Constraints." Digital Commons @ East Tennessee State University, 2014. https://dc.etsu.edu/etd/2410.

Full text
Abstract:
For a graph G = (V,E), we consider a bipartition {V1,V2} of the vertex set V by placing constraints on the vertices as follows. For every vertex v in Vi, we place a constraint on the number of neighbors v has in Vi and a constraint on the number of neighbors it has in V3-i. Using three values, namely 0 (no neighbors are allowed), 1 (at least one neighbor is required), and X (any number of neighbors is allowed), for each of the four constraints results in 27 distinct types of bipartitions. The goal is to characterize the graphs having each of these 27 types. We give characterizations for 21 of the 27. Three other characterizations appear in the literature. The remaining three prove to be quite difficult. For these, we develop properties and give characterizations of special families.
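The constraint scheme described above can be made concrete with a small checker. The codes '0', '1', 'X' follow the abstract, while the function name, graph encoding, and example are illustrative assumptions:

```python
# Hypothetical checker: 'inside[i]' constrains how many neighbours a vertex
# of part i has within its own part, 'outside[i]' how many it has in the
# other part. Codes: '0' = none allowed, '1' = at least one required,
# 'X' = unconstrained.
def satisfies(adj, part, inside, outside):
    for v, side in part.items():
        same = sum(1 for u in adj[v] if part[u] == side)
        other = len(adj[v]) - same
        for count, code in ((same, inside[side]), (other, outside[side])):
            if code == "0" and count > 0:
                return False
            if code == "1" and count == 0:
                return False
    return True

# 4-cycle a-b-c-d split into {a, c} and {b, d}: both parts are independent
# sets and every vertex has a neighbour across the cut, so inside '0' and
# outside '1' hold on both sides.
adj = {"a": ["b", "d"], "b": ["a", "c"], "c": ["b", "d"], "d": ["a", "c"]}
part = {"a": 0, "c": 0, "b": 1, "d": 1}
assert satisfies(adj, part, inside=("0", "0"), outside=("1", "1"))
```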
24

Graham, Judson L. (Judson Lawrence) 1968. "Application of theory of constraints and process control theory to multi-stage manufacturing systems." Thesis, Massachusetts Institute of Technology, 1998. http://hdl.handle.net/1721.1/9992.

Full text
Abstract:
Thesis (M.S.)--Massachusetts Institute of Technology, Sloan School of Management; and, Thesis (M.S.)--Massachusetts Institute of Technology, Dept. of Mechanical Engineering, 1998.
Includes bibliographical references (p. 105).
by Judson L. Graham.
M.S.
25

Al-Alawi, Raida. "The functionality, training and topological constraints of digital neural networks." Thesis, Brunel University, 1990. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.278373.

Full text
26

Chou, Remi. "Information-theoretic security under computational, bandwidth, and randomization constraints." Diss., Georgia Institute of Technology, 2015. http://hdl.handle.net/1853/53837.

Full text
Abstract:
The objective of the proposed research is to develop and analyze coding schemes for information-theoretic security, which could bridge a gap between theory and practice. We focus on two fundamental models for information-theoretic security: secret-key generation for a source model and secure communication over the wire-tap channel. Many results for these models only provide existence of codes, and few attempts have been made to design practical schemes. The schemes we would like to propose should account for practical constraints. Specifically, we formulate the following constraints to avoid oversimplifying the problems. We should assume: (1) computationally bounded legitimate users, rather than solely relying on proofs showing existence of codes with exponential complexity in the block-length; (2) a rate-limited public communication channel for the secret-key generation model, to account for bandwidth constraints; (3) a non-uniform and rate-limited source of randomness at the encoder for the wire-tap channel model, since a perfectly uniform and rate-unlimited source of randomness might be an expensive resource. Our work focuses on developing schemes for secret-key generation and the wire-tap channel that satisfy subsets of the aforementioned constraints.
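For context, the benchmark such schemes aim at is the secrecy capacity of the wire-tap channel, given here in its standard form for the degraded case (textbook result, not quoted from the thesis):

```latex
% Secrecy capacity of a degraded wire-tap channel, with Y the legitimate
% receiver's observation and Z the eavesdropper's (Wyner, 1975):
C_s = \max_{p(x)} \left[ I(X;Y) - I(X;Z) \right]
```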
27

Kawasaki, Takako 1968. "Coda constraints : optimizing representations." Thesis, McGill University, 1998. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=35970.

Full text
Abstract:
Languages differ in their sound patterns, but these differences are, to a large extent, systematic. One goal of Universal Grammar (Chomsky 1957, 1965) is to account for the systematic patterns which are attested across languages. Toward this end, Universal Grammar is considered to contain a set of phonological primitives such as features, and some restrictions on their combination. However, in rule-based phonology, it is assumed that rules are part of the grammar of an individual language. By their very nature, rules describe operations. As such, they are not well-suited to express restrictions on the ways in which segments may combine when no overt operation is involved. To account for such restrictions, Chomsky & Halle (Sound Pattern of English (SPE): 1968) supplemented rules with Morpheme Structure Constraints (MSCs) which define the possible morpheme shapes that a particular language allows (see also Halle 1959). Thus, in SPE, both MSCs and rules played a role in accounting for the phonological patterns observed in languages.
28

Nuttall, Peter D. A. "Operations Management and the Theory of Constraints in the NHS." Thesis, University of Manchester, 2010. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.525188.

Full text
APA, Harvard, Vancouver, ISO, and other styles
29

O'Leary, Matthew C. (Matthew Clarence). "Performance measures for product development utilizing theory of constraints methodology." Thesis, Massachusetts Institute of Technology, 1995. http://hdl.handle.net/1721.1/11517.

Full text
APA, Harvard, Vancouver, ISO, and other styles
30

Schwain, Kevin D. (Kevin Douglas) 1974. "Prioritization and integration of lean initiatives with theory of constraints." Thesis, Massachusetts Institute of Technology, 2004. http://hdl.handle.net/1721.1/34774.

Full text
Abstract:
Thesis (M.B.A.)--Massachusetts Institute of Technology, Sloan School of Management; and, (S.M.)--Massachusetts Institute of Technology, Dept. of Mechanical Engineering; in conjunction with the Leaders for Manufacturing Program at MIT, 2004.
Includes bibliographical references (leaves 44-45).
The principles of lean manufacturing have taken hold in a number of manufacturing firms as a means of achieving operational excellence through continuous improvement. Womack and Jones have suggested a generalized process for lean transformation in their 1996 book, Lean Thinking. A key element of this process is the creation of value stream maps for each product line. Value stream maps are the basis for planning and tracking a firm's lean transformation. Rother and Shook go further in their 1998 work Learning to See as they describe how these maps are created and then integrated into both the transformation process and the regular business planning cycle. The authors note that difficult questions remain, including: "In what order should we implement?" and "Where do we start?" Advice offered by Rother and Shook is helpful but insufficient given the complexity of many business environments and the scarcity of resources in competitive industries. This thesis builds upon Rother and Shook's work in proposing a framework for prioritizing lean initiatives. Specifically, Theory of Constraints (TOC) tools are employed as a basis for selecting programs and projects that provide the greatest system-wide productivity improvement for the least cost. In this manner, application of the proposed prioritization framework results in a more effective and efficient lean transformation. Research at the Eastman Kodak Company illustrates how this framework can be applied in a paper finishing production facility. Results highlight the system constraint in the paper slitting operation and the high leverage of machine changeover time in productivity improvement. We conclude that the Theory of Constraints can provide an effective focusing tool for the lean enterprise.
by Kevin D. Schwain.
S.M.
M.B.A.
APA, Harvard, Vancouver, ISO, and other styles
31

Reimers, Arne Cornelis [Verfasser]. "Metabolic Networks, Thermodynamic Constraints, and Matroid Theory / Arne C. Reimers." Berlin : Freie Universität Berlin, 2014. http://d-nb.info/1058587331/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
32

Mntonintshi, Unathi. "The application of theory of constraints in mergers and acquisitions." Diss., University of Pretoria, 2003. http://hdl.handle.net/2263/33241.

Full text
APA, Harvard, Vancouver, ISO, and other styles
33

Strabic, Natasa. "Theory and algorithms for matrix problems with positive semidefinite constraints." Thesis, University of Manchester, 2016. https://www.research.manchester.ac.uk/portal/en/theses/theory-and-algorithms-for-matrix-problems-with-positive-semidefinite-constraints(5c8ac15f-9666-4682-9297-73d976bed63e).html.

Full text
Abstract:
This thesis presents new theoretical results and algorithms for two matrix problems with positive semidefinite constraints: it adds to the well-established nearest correlation matrix problem, and introduces a class of semidefinite Lagrangian subspaces. First, we propose shrinking, a method for restoring positive semidefiniteness of an indefinite matrix M_0 that computes the optimal parameter α_* in a convex combination of M_0 and a chosen positive semidefinite target matrix. We describe three algorithms for computing α_*, and then focus on the case of keeping fixed a positive semidefinite leading principal submatrix of an indefinite approximation of a correlation matrix, showing how the structure can be exploited to reduce the cost of two algorithms. We describe how weights can be used to construct a natural choice of the target matrix and that they can be incorporated without any change to computational methods, which is in contrast to the nearest correlation matrix problem. Numerical experiments show that shrinking can be at least an order of magnitude faster than computing the nearest correlation matrix and so is preferable in time-critical applications. Second, we focus on estimating the distance in the Frobenius norm of a symmetric matrix A to its nearest correlation matrix ncm(A) without first computing the latter. The goal is to enable a user to identify an invalid correlation matrix relatively cheaply and to decide whether to revisit its construction or to compute a replacement. We present a few currently available lower and upper bounds for d_corr(A) = ‖A − ncm(A)‖_F and derive several new upper bounds, discuss the computational cost of all the bounds, and test their accuracy on a collection of invalid correlation matrices. The experiments show that several of our bounds are well suited to gauging the correct order of magnitude of d_corr(A), which is perfectly satisfactory for practical applications.
Third, we show how Anderson acceleration can be used to speed up the convergence of the alternating projections method for computing the nearest correlation matrix, and that the acceleration remains effective when it is applied to the variants of the nearest correlation matrix problem in which specified elements are fixed or a lower bound is imposed on the smallest eigenvalue. This is particularly significant for the nearest correlation matrix problem with fixed elements because no Newton method with guaranteed convergence is available for it. Moreover, alternating projections is a general method for finding a point in the intersection of several sets, and this appears to be the first demonstration that these methods can benefit from Anderson acceleration. Finally, we introduce semidefinite Lagrangian subspaces, describe their connection to the unique positive semidefinite solution of an algebraic Riccati equation, and show that these subspaces can be represented by a subset I ⊆ {1, 2, …, n} and a Hermitian matrix X ∈ C^{n×n} that is a generalization of a quasidefinite matrix. We further obtain a semidefiniteness-preserving version of an optimization algorithm introduced by Mehrmann and Poloni [SIAM J. Matrix Anal. Appl., 33 (2012), pp. 780–805] to compute a pair (I_opt, X_opt) with M = max_{i,j} |(X_opt)_{ij}| as small as possible, which improves numerical stability in several contexts.
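The shrinking step described in the abstract above admits a compact sketch: because the smallest eigenvalue of the affine family S(α) = αT + (1−α)M_0 is concave in α, the feasible set is an interval [α_*, 1] and α_* can be found by bisection. This is an editorial illustration under those stated facts, not code from the thesis; the example matrix and the choice of the identity as target are hypothetical.

```python
import numpy as np

def shrink_alpha(M0, T, tol=1e-10):
    """Smallest alpha in [0, 1] such that S(alpha) = alpha*T + (1 - alpha)*M0
    is positive semidefinite. lambda_min of this affine family is concave in
    alpha, so the feasible set is an interval [alpha_*, 1] and bisection on
    the sign of the smallest eigenvalue converges to alpha_*."""
    lam_min = lambda a: np.linalg.eigvalsh(a * T + (1.0 - a) * M0)[0]
    if lam_min(0.0) >= 0.0:
        return 0.0          # M0 is already positive semidefinite
    lo, hi = 0.0, 1.0       # infeasible at lo, feasible at hi (T is PSD)
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if lam_min(mid) >= 0.0:
            hi = mid
        else:
            lo = mid
    return hi

# An invalid "correlation matrix": unit diagonal but indefinite.
M0 = np.array([[1.0, 1.1], [1.1, 1.0]])
alpha = shrink_alpha(M0, np.eye(2))   # closed form for this instance: 1/11
```

With the identity as target, the convex combination preserves the unit diagonal, which is why a fixed positive semidefinite target is a natural fit for restoring invalid correlation matrices.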
APA, Harvard, Vancouver, ISO, and other styles
34

Salguero, Fabela Carlos Edson, and David Becerril. "IS Project Implementation : An approach using the Theory of Constraints." Thesis, Internationella Handelshögskolan, Högskolan i Jönköping, IHH, Informatik, 2010. http://urn.kb.se/resolve?urn=urn:nbn:se:hj:diva-13476.

Full text
Abstract:
Nowadays companies face continuous change. Every change a firm faces has to be carefully addressed by everyone in the organization, but especially by leaders and the managerial team. However, these changes do not always produce monetary benefits: a change in the IS might increase productivity in one business area, yet not represent profit for the whole company. There are several change studies which leaders can use as a guide when implementing a change. However, none of these theories considers the possibility of improving efficiency as a result of the change, nor assures that the company's profitability increases after the IS change is implemented. The theory of constraints (TOC) is a useful tool which covers both of the issues mentioned before. This paper combines several change theories with the theory of constraints. With this mixture of ideas we want to show leaders a new procedure on which they can rely when dealing with the process of an IS change. This procedure should assure an increase in the productivity produced by the change, but also an addition to the company's profitability. We studied three change-related theories. We then revised the TOC and compared all this collected information with the way six leaders handled IS changes in their five companies: Dell, Cisco Systems, Desca, Ericsson, and Nortel. In the end we were able to identify critical success factors which any leader should consider when facing an IS change. These factors cover the beginning of the IS change, its implementation, and finally the way to make this change maximize business performance.
APA, Harvard, Vancouver, ISO, and other styles
35

Heiberg, Andrea. "Coda Neutralization: Against Purely Phonetic Constraints." Department of Linguistics, University of Arizona (Tucson, AZ), 1995. http://hdl.handle.net/10150/227245.

Full text
Abstract:
The neutralization of the laryngeal features of a consonant that is not directly followed by a vowel is a common process cross-linguistically. Laryngeal neutralization in this position has a clear phonetic cause: laryngeal features are not salient unless they are immediately followed by a vowel. Since laryngeal neutralization has a phonetic cause, it seems reasonable to characterize it directly in phonetic terms, without positing any additional layer of phonological abstraction. However, a phonetic explanation is not sufficient to account for all cases of laryngeal neutralization. For example, in Korean, laryngeal neutralization occurs in a nonneutralizing phonetic environment; in Nisgha, laryngeal neutralization occurs only in the reduplicant, although the phonetic environment for neutralization is found in both the reduplicant and the base. Although phonetics is the major factor leading to the development of these types of restrictions on laryngeal features, I argue that a phonetic account is not adequate for all such restrictions. Abstract phonological constraints and representations are necessary. Hence, two types of neutralization are possible: (i) phonetic neutralization, which results directly from the lack of saliency of cues and occurs in every instance of the neutralizing environment; and (ii) abstract phonological neutralization, which may occur where the neutralizing environment is absent (as will be demonstrated for Korean), and may fail to occur in every instance of the neutralizing environment (as will be demonstrated for Nisgha).
APA, Harvard, Vancouver, ISO, and other styles
36

Vlerick, Michael Marie Patricia Lucien Hilda. "Darwin's doubt : implications of the theory of evolution for human knowledge." Thesis, Stellenbosch : Stellenbosch University, 2012. http://hdl.handle.net/10019.1/71595.

Full text
Abstract:
Thesis (DPhil)--Stellenbosch University, 2012.
ENGLISH ABSTRACT: In this dissertation I enquire into the status, scope and limits of human knowledge, given the fact that our perceptual and cognitive faculties are the product of evolution by natural selection. I argue that the commonsense representations these faculties provide us with yield a particular, species-specific scope on the world that does not ‘correspond’ in any straightforward way to the external world. We are, however, not bound by these commonsense representations. This particular, species-specific view of the world can be transgressed. Nevertheless, our transgressing representations remain confined to the conceptual space defined by the combinatorial possibilities of the various representational tools we possess. Furthermore, the way in which we fit representations to the external world is by means of our biologically determined epistemic orientation. Based on the fact that we are endowed with a particular set of perceptual and cognitive resources and are guided by a particular epistemic orientation, I conclude that we have a particular cognitive relation to the world. Therefore, an accurate representation for us is a particular fit (our epistemic orientation) with particular means (our perceptual and cognitive resources).
AFRIKAANSE OPSOMMING: Hierdie tesis handel oor die aard, omvang en limiete van kennis, gegewe dat ons perseptuele en kognitiewe vermoëns die resultaat van evolusie deur middel van natuurlike seleksie is. Eerstens, word daar geargumenteer dat die algemene voorstellings wat hierdie vermoëns aan ons bied ‘n partikuliere, spesie-spesifieke siening van die wêreld aan ons gee, wat nie op ‘n eenvoudige manier korrespondeer aan die werklikheid nie. Ons is egter nie gebonde aan hierdie voorstellings nie. Hierdie partikuliere, spesie-spesifieke siening van die wêreld kan oorskry word. Ons is egter wel beperk tot die konseptuele ruimte wat gedefinieer word deur die kombinatoriese moontlikhede van die voorstellingsmiddele tot ons beskikking. Verder word die manier waarop ons hierdie voorstellings aan die wêreld laat pas deur ons biologies gedetermineerde epistemiese oriëntasie bepaal. Dus, gegewe dat ons ‘n spesifieke stel perseptuele en kognitiewe vermoëns het en deur ‘n spesifieke kognitiewe epistemiese oriëntasie gelei word, staan ons in ‘n spesifieke kognitiewe verhouding tot die wêreld. ‘n Akkurate voorstelling (m.a.w. kennis vir ons) is om spesifieke vermoëns (perseptuele en kognitiewe vermoëns) op ‘n spesifieke manier (epistemiese oriëntasie) aan die wêreld te laat pas.
APA, Harvard, Vancouver, ISO, and other styles
37

Vonasek, Scott M. "Synchronizing the 3M Cushion Mount Plus supply chain." Online version, 2000. http://www.uwstout.edu/lib/thesis/2000/2000vonaseks.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
38

Gossner, Jesse Ross. "Generalised predictive control in the presence of constraints." Thesis, University of Oxford, 1996. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.318601.

Full text
APA, Harvard, Vancouver, ISO, and other styles
39

Artuso, Mariangela. "Information processing constraints on the acquisition of a theory of mind." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1998. http://www.collectionscanada.ca/obj/s4/f2/dsk1/tape11/PQDD_0003/NQ41397.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
40

Smith, Rachel. "The evolution of debris disk systems : constraints from theory and observation." Thesis, University of Edinburgh, 2008. http://hdl.handle.net/1842/3131.

Full text
Abstract:
Debris disks are believed to be the remnants of planet formation; a disk of solid bodies called planetesimals that did not get incorporated into planets. They provide an ideal opportunity for studying the outcome of planet formation in their systems. The best studied disks exhibit cool emission peaking at ≥ 60 microns, lying in Edgeworth-Kuiper belt-like regions with an inner dust-free hole. However around half of the main sequence stars with excess emission seen in IRAS observations show an excess at 25 microns only. This thesis presents a study of mid-infrared debris disks through theory and observations to examine the following questions: are such disks around Sun-like stars simply debris disks of truncated planetary systems?; can this emission be explained by a collisionally-evolving disk analogous to the asteroid belt?; is the degree of variation in emission levels seen around otherwise similar A stars evidence of stochastic evolution? An analytical model of debris disk evolution assuming the disk evolves under a steady-state collisional cascade is presented and shows there is a maximum flux that can be expected from a disk of a given radius and age and that, for a given disk location, the excess emission arising from the disk will decrease linearly with time. Comparison of observations with the maximum predicted flux from the analytical model indicates some Sun-like stars are likely hosts of transient emission. Comparison with A star statistics shows that A star excesses can be explained by collisionally-evolving disks, and that the variation in emission between similar stars can be explained by varying initial conditions. However the model assumes the disk consists of a narrow ring (whereas the true dust distribution may be spatially extended) at a location predicted by blackbody fitting to the excess SED (which can lead to errors in the dust location of up to a factor of 3).
Resolved imaging is needed to determine the true disk morphology and the implications of this on the transient or steady-state interpretation. A sample of 12 Sun-like stars with mid-infrared excess, and a complementary sample of 11 A-type stars, are observed with TIMMI2, VISIR on the VLT and MICHELLE and TReCS on Gemini. Six of the Sun-like sources are shown not to be debris disks, highlighting the need for high-resolution imaging to remove bogus disk sources. None of the Sun-like stars show resolved emission, however a new method of determining extension limits from unresolved imaging is presented and used to show that a single-temperature dust model for the η Corvi mid-infrared excess with transient dust at 1.7AU is more likely than a 2-temperature fit with dust belts at 1.3AU (transient) and 12AU (steady-state). The A star observations reveal a further bogus disk source. Unresolved images of HD71155 constrain the excess emission to be from 2 dust populations: a transient population at 0.6AU and a steady-state population at 61AU. The extension limits modelling is further used to highlight A star disks which may present fruitful subjects for future 8m imaging. One such source, HD181296, is observed with TReCS and shown to possess an edge-on disk at around 22 AU which fits with the steady-state interpretation. As the unresolved 8m observations are used to constrain the outer limits of the disk emission, MIDI observations of 2 Sun-like and 2 A-type sources are used to constrain the inner limits of the disk. The first results from these observations indicate that there are changes in the visibility function with wavelength that match the predicted changes for a completely resolved disk component. The combined limits for the hot η Corvi emission suggest the source has a transient disk lying between 0.9AU and 3.0AU.
APA, Harvard, Vancouver, ISO, and other styles
41

Odendaal, Maghiel Jock. "Business process modelling using model checking and the theory of constraints." Thesis, Stellenbosch : University of Stellenbosch, 2010. http://hdl.handle.net/10019.1/4269.

Full text
Abstract:
Thesis (MScEng (Industrial Engineering))--University of Stellenbosch, 2010.
ENGLISH ABSTRACT: Concurrent and distributed business processes are becoming the norm in many organisations. Current modelling techniques do not address the problems faced by concurrent business processes sufficiently. We show how model checking is applied to business processes to prove behavioural properties and address the aforementioned shortcomings. A method of abstraction is required to construct business process models that can be model checked. In this thesis we show the suitability of the Logical Thinking Process as an abstraction tool. We call the combination of the Logical Thinking Process and model checking the Complexity Alleviation Method (CAM). We apply CAM to two well-known supply chain and manufacturing problems, and insightful results are obtained. This leads us to the conclusion that CAM allows for the quicker modelling of business processes, as well as providing problem-specific and proven solutions in a manner not possible with simulation or other techniques.
AFRIKAANSE OPSOMMING: Gelyklopende en verspreide besigheidsprosesse word ’n alledaagse verskynsel in menigte instansies. Huidige modelleringstegnieke is nie in staat om die probleme geassosieer met gelyklopende besigheidsprosesse aan te spreek nie. Ons wys hoe modelverifikasie (“model checking”) toegepas word op besigheidsprosesse om gedragseienskappe te bewys en sodoende die voorgenoemde tekortkominge aan te spreek. ’n Metode van abstraksie word benodig om besigheidsprosesmodelle, wat verifieerbaar is, te konstrueer. In hierdie verhandeling word die geskiktheid van die Logiese Denkproses (“the Logical Thinking Process”) as abstraksiegereedskap aangetoon. Ons noem die kombinasie van die Logiese Denkproses en modelverifikasie die Kompleksiteitsverligtingsmetodologie (CAM). Ons pas CAM op twee welbekende aanbodketting- en vervaardigingsprobleme toe en insiggewende resultate is verkry. Dit lei ons tot die gevolgtrekking dat CAM vinniger konstruering van modelle te weeg bring, sowel as probleemspesifieke en bewysbare oplossings verskaf wat nie moontlik is met simulasie of ander tegnieke nie.
APA, Harvard, Vancouver, ISO, and other styles
42

Nilsson, Jacob, and Pontus Stomberg. "Lokalisering av förbättringsområde : Enkla lokaliseringsmetoder för produktionsprocesser." Thesis, Högskolan i Borås, Akademin för textil, teknik och ekonomi, 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:hb:diva-10544.

Full text
Abstract:
Detta examensarbete redogör för och belyser hur en lokalisering av förbättringsområden inom en producerande process kan ske. Rapporten har sin grund i ett sökande efter var en effektivisering kan genomföras: hur uppnås målet, och finns det metoder som kan appliceras på en diversifierad mängd områden? Många företag och verksamheter har idag ett stort behov av förbättring, detta för att kunna överleva det allt tuffare klimatet på marknaden. En produkt som både pris- och kvalitetsmässigt är i framkant krävs för att nå marknadsandelar. Undersökningen grundar sig i en nulägesanalys på ett verkligt företag för att göra undersökningen så rättvisande som möjligt. Rapporten redogör även för en undersökning av olika metoder från tidigare forskning inom lean och theory of constraints. Genom att använda sig av verktyg som 5S och JIT, och koncept i sin helhet som lean och TOC, kan en lokalisering av förbättringsområden ske. Genom att studera koncept som lean och TOC lokaliseras förbättringsområden med hjälp av deras grundläggande filosofi. De delar i produktionen som bör tas i beaktning är omställningar, slöserier och begränsningar. Dessa tre faktorer påverkar en process negativt om de inte lokaliseras och effektiviseras eller elimineras. Den metod som bör användas varierar från fall till fall och det är av yttersta vikt att hitta en metod som fungerar bra för den egna processen. Det första som bör göras är att bli väl påläst och vara tydlig med vad metoden ska bidra med. De metoder som denna studie har granskat och som visat sig fungera i en producerande process är: 5S, JIT, SMED samt TOC:s lära om flaskhalsar och leans lära om slöserier. Metoderna bidrar med olika sätt för lokalisering av förbättringsområden.
This report describes and illustrates how to locate areas for improvement in a producing process. The report is based on a search for where streamlining can be performed: how can the goal be achieved, and are there methods that can be applied to a diverse range of areas? Many companies and businesses today have a great need for improvement in order to survive the increasingly tough climate on the market, where one must deliver a product that is competitive in both price and quality. The study is based on a situation analysis at a real company with a producing process, to make the survey as accurate as possible. The report also describes a study of different methods from previous research in lean and TOC. By using tools such as 5S and JIT, and concepts in their entirety such as lean and TOC, areas of improvement can be located with the help of their basic philosophy. The parts of production that should be taken into consideration are changeovers, waste and limitations. These three factors affect a process negatively if they are not localized and streamlined or eliminated. The method to use varies from case to case, and it is important to find one that works well for one's own process. The first thing to do is to be well prepared and be clear about what the method should contribute. The methods this study has reviewed and found to work in a producing process are: 5S, JIT, SMED, TOC's doctrine of bottlenecks and lean's doctrine of waste. These methods contribute different ways of locating areas for improvement.
APA, Harvard, Vancouver, ISO, and other styles
43

Bhattacharjee, Sangita, and University of Lethbridge Faculty of Arts and Science. "A primal-dual algorithm for the maximum charge problem with capacity constraints." Thesis, Lethbridge, Alta. : University of Lethbridge, Dept. of Mathematics and Computer Science, 2010, 2010. http://hdl.handle.net/10133/2557.

Full text
Abstract:
In this thesis, we study a variant of the maximum cardinality matching problem known as the maximum charge problem. Given a graph with arbitrary positive integer capacities assigned to every vertex and every edge, the goal is to assign feasible positive charges to the edges, obeying the capacity constraints, so as to maximize the total sum of the charges. We use the primal-dual approach. We propose a combinatorial algorithm for solving the dual of the restricted primal and show that the primal-dual algorithm runs in polynomial time.
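For intuition, the optimisation described in the abstract can be spelled out on a toy instance by exhaustive search. This is an editorial illustration only: the triangle instance is hypothetical, and the thesis's actual contribution is a polynomial-time primal-dual algorithm, not this brute force.

```python
from itertools import product

def max_charge(edges, vertex_cap, edge_cap):
    """Brute-force optimum of the maximum charge problem: integer charges
    on edges, each at most its edge capacity, with the total charge at each
    vertex at most that vertex's capacity. Exponential in the number of
    edges, so usable only on tiny instances."""
    best = 0
    for charge in product(*[range(edge_cap[e] + 1) for e in edges]):
        load = {v: 0 for v in vertex_cap}
        for (u, v), x in zip(edges, charge):
            load[u] += x        # each unit of charge counts against
            load[v] += x        # both endpoints of the edge
        if all(load[v] <= vertex_cap[v] for v in vertex_cap):
            best = max(best, sum(charge))
    return best

# Triangle with all vertex and edge capacities 2: the optimum puts
# charge 1 on each edge, saturating every vertex.
edges = [(0, 1), (1, 2), (0, 2)]
best = max_charge(edges, {0: 2, 1: 2, 2: 2}, {e: 2 for e in edges})
```

Note that any single edge could carry charge 2 on its own, but the vertex capacities couple the edges: the total charge can never exceed half the sum of the vertex capacities.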
ix, 96 leaves : ill. ; 29 cm
APA, Harvard, Vancouver, ISO, and other styles
44

Slade, Michael L. "A layout algorithm for hierarchical graphs with constraints /." Online version of thesis, 1994. http://hdl.handle.net/1850/11724.

Full text
APA, Harvard, Vancouver, ISO, and other styles
45

Yeung, Siu Yin. "Combinatorial property of prefix-free trees with some regular constraints /." View abstract or full-text, 2004. http://library.ust.hk/cgi/db/thesis.pl?COMP%202004%20YEUNG.

Full text
Abstract:
Thesis (M. Phil.)--Hong Kong University of Science and Technology, 2004.
Includes bibliographical references (leaves 67-68). Also available in electronic version. Access restricted to campus users.
APA, Harvard, Vancouver, ISO, and other styles
46

van, Dijk Sander Gerrit. "Informational constraints and organisation of behaviour." Thesis, University of Hertfordshire, 2014. http://hdl.handle.net/2299/15436.

Full text
Abstract:
Based on the view of an agent as an information processing system, and the premise that for such a system it is evolutionarily advantageous to be parsimonious with respect to informational burden, an information-theoretical framework is set up to study behaviour under information minimisation pressures. This framework is based on the existing method of relevant information, which is adopted and adapted to the study of a range of cognitive aspects. Firstly, the model of a simple reactive actor is extended to include layered decision making and a minimal memory, in which it is shown that these aspects can decrease some form of bandwidth requirements in an agent, but at the cost of an increase at a different stage or moment in time, or for the system as a whole. However, when combined, they do make it possible to operate with smaller bandwidths at each part of the cognitive system, without increasing the bandwidth of the whole or lowering performance. These results motivate the development of the concept of look-ahead information, which extends the relevant information method to include time, and future informational effects of immediate actions in a more principled way. It is shown that this concept can give rise to intrinsic drives to avoid uncertainty, simplify the environment, and develop a predictive memory. Next, the framework is extended to incorporate a set of goals, rather than deal with just a single task. This introduces the task description as a new source of relevant information, and with that the concept of relevant goal information.
Studying this quantity results in several observations: minimising goal information bandwidth results in ritualised behaviour; relevant goal and state information may to some extent be exchanged for one another without affecting the agent’s performance; the dynamics of goal information give rise to a natural notion of sub-goals; bottlenecks on goal memory, and a measure of efficiency on the use of these bottlenecks, provide natural abstractions of the environment, and a global reference frame that supersedes local features of the environment. Finally, it is shown how an agent or species could actually arrive at having a large repertoire of goals and accompanying optimal sensors and behaviour, while under a strong information-minimisation pressure. This is done by introducing an informational model of sensory evolution, which indicates that a fundamental information-theoretical law may underpin an important evolutionary catalyst; namely, even a fully minimal sensor can carry additional information, dubbed here concomitant information, that is required to unlock the actual relevant information, which enables a minimal agent to still explore, enter and acquire different niches, accelerating a possible evolution to higher acuity and behavioural abilities.
APA, Harvard, Vancouver, ISO, and other styles
47

Davies, Andrea Jane. "Modelling goes to museums : experiential consumption, the Theory of Planned Behaviour and old and new museology." Thesis, Open University, 1999. http://oro.open.ac.uk/57947/.

Full text
Abstract:
This study adopts a two-stage structural equation modelling approach to demonstrate the nomological validity and utility of The Theory of Planned Behaviour to both predict and to explain the visiting intentions of middle-class residents to social history museums within the next 12 months. Working within an 'experience-based management approach' the present study provides both a descriptive contribution, in terms of identifying and providing significant improvements in the measurement of museum anticipated experiences and resource facilitators and constraints, as well as a predictive contribution, in terms of assessing the ability of The Theory of Planned Behaviour, and in particular, the relative contribution of attitudes, subjective norms and perceived behavioural control modelled with complex summated-interactive antecedents, to explain museum visiting intentions. Particular attention is given to the neglected role of belief evaluation in previous museum and heritage studies in describing the structure and structural dynamics of anticipated museum experience opportunities. Furthermore, attention is given to the potential contribution of perceived behavioural control, and an understanding of an individual's resource constraints, to the experience-based management approach. A two-stage development of a summated-interactive complex model is shown to overcome methodological and conceptual deficiencies which have been noted in previous expectancy-value attitude studies. In addition, this study examines the impact of the anticipated interpretative environment (physical designed space) on the museum experiential opportunities, control and social influences perceived by individuals, and compares the interpretative orientation of The New Museology (idea-based museum) to traditional mixes of museum interpretative media (object-based museum) in this respect. A qualitative-quantitative research design was employed.
Thirty extended qualitative interviews formed the basis of the study by providing a 'real lived' understanding of common consumption experiences at heritage attractions, the resource problems associated with museum visits and the influences of social referents. Four hundred quantitative interviews with respondents from middle-class households formed the main focus of the study. Interviews were conducted using a systematic random sampling method applied in two spatially and demographically contrasting electoral wards of Edinburgh, Scotland. Across the spatial wards, respondents were randomly divided in two sub-groups (n=200). In each sub-group respondents were asked to evaluate a pictorial collage designed to capture the interpretative orientation of either the New Museology or traditional approach to museum interpretative mixes. The study highlights the superiority of interpretative media mixes common to The New Museology in raising the instrumental and experiential-process value individuals anticipate from this style of museum attraction. In doing so, the study finds support for the continued application of The Manning-Haas Hierarchy of Demand, where the importance of 'setting' in managing the consumption experiences of consumers is explicitly recognised. However, due to the 'egalitarian' objective of The New Museology, and the expected 'levelling' or increasing homogeneity observed between visitors and non-visitors to idea-based (The New Museology) in terms of anticipated experiential benefits and costs perceived in this museum environment, the present study finds the predictive ability of attitudes in The Theory of Planned Behaviour is reduced. For the idea-based museum, these findings raise some questions regarding the ability of the Manning-Haas Hierarchy, which is based on expectancy-value theory, to operate as a predictive modd of motivation as it was intended. 
However, the present study does support the use of the Manning-Haas Hierarchy as a descriptive heuristic for product development alone. Subjective norms were not found to increase our understanding of museum visiting intentions, while the explanatory ability of perceived behavioural control was limited to idea-based museum attractions. Further, based on the significant contribution for past expereince to explain visiting intentions to the idea based museum, the present study calls for further research to identify potential 'deficiencies' in explanatory variables needed to more fully understand the motivations of individuals to visit idea-based museums associated with The New Museology. Finally, the present study demonstrates the importance of both sub-group analysis in the Theory of Planned Behaviour in order to identify the moderating impact of past experience and gender on the relative impact of attitude, subjective norms and perceived behaviour control on museum visiting intentions.
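The Theory of Planned Behaviour structure the abstract refers to can be summarised, following Ajzen's standard formulation (the notation below is illustrative and not taken from the thesis itself):

```latex
% Behavioural intention (BI) as a weighted function of the three antecedents:
BI \;=\; w_{1}\,A_{B} \;+\; w_{2}\,SN \;+\; w_{3}\,PBC,
% where each antecedent has an expectancy-value (summated-interactive) structure:
A_{B} \;\propto\; \sum_{i} b_{i}\,e_{i}, \qquad
SN \;\propto\; \sum_{j} n_{j}\,m_{j}, \qquad
PBC \;\propto\; \sum_{k} c_{k}\,p_{k}
```

Here $b_i$ is behavioural belief strength and $e_i$ the corresponding outcome evaluation (the 'belief evaluation' term whose neglect the abstract highlights), $n_j, m_j$ are normative beliefs and motivation to comply, and $c_k, p_k$ are control beliefs and their perceived power.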
APA, Harvard, Vancouver, ISO, and other styles
48

Qualls, Joshua D. "UNIVERSAL CONSTRAINTS ON 2D CFTS AND 3D GRAVITY." UKnowledge, 2014. http://uknowledge.uky.edu/physastron_etds/21.

Full text
Abstract:
We study constraints imposed on a general unitary two-dimensional conformal field theory by modular invariance. We begin with a review of previous bounds on the conformal dimension Delta_1 of the lowest primary operator assuming unitarity, a discrete spectrum, modular invariance, c_L, c_R > 1, and no extended chiral algebra. We then obtain bounds on the conformal dimensions Delta_2, Delta_3 using no additional assumptions. We also show that in order to find a bound for Delta_4 or higher Delta_n, we need to assume a larger minimum value for c_tot that grows logarithmically with n. We next extend the previous results to remove the requirement that our two-dimensional conformal field theories have no extended chiral algebra. We then show that modular invariance also implies an upper bound on the total number of states of positive energy less than c_tot/24 (or equivalently, states of conformal dimension between c_tot/24 and c_tot/12), in terms of the number of negative energy states. Finally, we consider the case where the CFT has a gravitational dual and investigate the gravitational interpretation of our results. Using the AdS3/CFT2 correspondence, we obtain an upper bound on the lightest few massive excitations (both with and without the constraint of no chiral primary operators) in a theory of 3D matter and gravity with Lambda < 0. We show our results are consistent with facts and expectations about the spectrum of BTZ black holes in 2+1 gravity. We then discuss the upper and lower bounds on the number of states and primary operators in the dual gravitational theory, focusing on the case of AdS3 pure gravity.
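The modular-invariance constraint underlying these bounds can be stated compactly in standard notation (the numerical bound quoted is Hellerman's earlier result, which this line of work refines; it is given here only for orientation):

```latex
% Torus partition function and invariance under the S transformation tau -> -1/tau:
Z(\tau,\bar\tau) \;=\; \operatorname{Tr}\, q^{L_0 - c_L/24}\,\bar q^{\bar L_0 - c_R/24},
\qquad Z(-1/\tau,-1/\bar\tau) \;=\; Z(\tau,\bar\tau), \qquad q = e^{2\pi i \tau}.
% Applied at the self-dual point tau = i, this yields a bound of the form
\Delta_1 \;\le\; \frac{c_{tot}}{12} + O(1), \qquad c_{tot} = c_L + c_R,
% where Hellerman's result fixes the O(1) constant at approximately 0.474.
```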
49

Kasljevic, Ivan, and Emir Mustafic. "Theory of Constraints och Lean Production i High-mix Low-volume företag." Thesis, Mälardalens högskola, Akademin för innovation, design och teknik, 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:mdh:diva-28403.

Full text
Abstract:
Competition between manufacturing companies is constantly increasing, which places high demands on product quality and the ability to deliver products on time. To reach these goals, companies must pursue development and continuous improvement in step with a growing market. This can be achieved in different ways, but a common approach is to work with improvement methodologies. The choice of improvement methodology depends on a company's manufacturing, in-house knowledge and resources. Many organisations find it difficult to choose a methodology suited to their own organisation and ask: "Which method is best suited for our organisation?". The goal of this thesis is to examine whether and how Lean Production and Theory of Constraints can be used in companies with high-mix low-volume (HMLV) production. Through a literature review and a case study, the following questions have been answered: 1) how Lean Production and Theory of Constraints are used in HMLV production and how they can be combined; 2) how the production flow is controlled according to Lean Production and Theory of Constraints to improve the efficiency of a production process in an HMLV company; 3) which personal paradigms and policies arise, according to Lean Production and Theory of Constraints, in HMLV companies, and what the relationship between these and quality is. The results from the literature review and case study show that Lean Production and its tools can be used in combination with Theory of Constraints in HMLV companies, with some exceptions. These exceptions do not mean that Lean Production and its tools cannot be combined with Theory of Constraints in HMLV companies; they only need to be adapted for HMLV production. The results also showed that standardised work plays a crucial role in a production process when Lean Production and Theory of Constraints are combined, as it facilitates further development and implementation.
The results further show that the production flow can be adjusted using methods from Lean Production and Theory of Constraints, and that this is best done by combining the two methodologies. Moreover, in many cases new investments in the form of new equipment are not necessary, provided that bottlenecks, such as personal paradigms and policies, are identified. Finally, the results showed that quality is directly linked to these two bottlenecks and that an investment in leadership is preferable.
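As an illustration of the TOC 'identify the constraint' step alluded to in this abstract, the sketch below flags the resource with the highest load-to-capacity ratio as the bottleneck (station names and figures are hypothetical, not data from the case study):

```python
# Minimal TOC step-1 sketch: the constraint is the resource whose demanded
# load is largest relative to its available capacity (utilisation).

def find_constraint(stations):
    """stations: dict mapping name -> (demanded_hours, available_hours)."""
    utilisation = {
        name: demanded / available
        for name, (demanded, available) in stations.items()
    }
    bottleneck = max(utilisation, key=utilisation.get)
    return bottleneck, utilisation

# Hypothetical weekly load for an HMLV shop (illustrative values only).
stations = {
    "cutting":  (32, 40),
    "welding":  (44, 40),   # overloaded -> the constraint
    "painting": (30, 40),
}
name, util = find_constraint(stations)
print(name, round(util[name], 2))   # welding 1.1
```

In an HMLV setting the load figures shift with the product mix, so this calculation would be rerun per planning period; policy and paradigm bottlenecks, as the abstract notes, are not captured by utilisation numbers alone.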
50

Ustun, Pinar. "Application Of The Theory Of Constraints To An Elective Course Registration System." Thesis, METU, 2010. http://etd.lib.metu.edu.tr/upload/12611665/index.pdf.

Full text
Abstract:
The Theory of Constraints (TOC) is a holistic management philosophy put forward by Eliyahu Goldratt in 1984. The thinking process and improvement tools discussed in this theory are geared mainly toward manufacturing environments; however, their applicability to service environments has also been shown for private professional service organizations. This study demonstrates that the steps and principles of the TOC can also be applied to non-profit services, such as the elective course registration process described in this thesis. For non-profit organizations, the challenge is to define the TOC performance measures: Throughput, Inventory, and Operating Expense. This study offers a novel definition of these measures and, using the principles of the TOC, identifies the bottleneck and constraints of the elective course registration process. Based on this analysis, the study then redesigns the system in order to improve its performance measures.
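The three TOC measures this thesis redefines are conventionally related by the standard throughput-accounting identities, sketched below (the figures are illustrative only, not drawn from the thesis):

```python
# Standard TOC throughput-accounting relations:
#   Throughput (T)            = revenue - totally variable costs
#   Net Profit (NP)           = T - Operating Expense (OE)
#   Return on Investment (ROI) = NP / Inventory (I)

def toc_measures(revenue, variable_costs, operating_expense, inventory):
    throughput = revenue - variable_costs
    net_profit = throughput - operating_expense
    roi = net_profit / inventory
    return throughput, net_profit, roi

# Illustrative numbers only.
t, np_, roi = toc_measures(revenue=1000.0, variable_costs=400.0,
                           operating_expense=450.0, inventory=500.0)
print(t, np_, roi)   # 600.0 150.0 0.3
```

For a non-profit service such as course registration, the novelty the abstract claims lies precisely in choosing non-monetary analogues for these three quantities, since revenue has no direct counterpart.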
