Dissertations / Theses on the topic 'Point approach'


Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 50 dissertations / theses for your research on the topic 'Point approach.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse dissertations / theses in a wide variety of disciplines and organise your bibliography correctly.

1

Klinke, Olaf Karl. "A bitopological point-free approach to compactifications." Thesis, University of Birmingham, 2012. http://etheses.bham.ac.uk//id/eprint/3470/.

Full text
Abstract:
This thesis extends the concept of compactifications of topological spaces to a setting where spaces carry a partial order and maps are order-preserving. The main tool is a Stone-type duality between the category of d-frames, which was developed by Jung and Moshier, and bitopological spaces. We demonstrate that the same concept that underlies d-frames can be used to recover short proofs of well-known facts in domain theory. In particular we treat the upper, lower and double powerdomain constructions in this way. The classification of order-preserving compactifications follows ideas of B. Banaschewski and M. Smyth. Unlike in the categories of spaces or locales, the lattice-theoretic notion of normality plays a central role in this work. It is shown that every compactification factors as a normalisation followed by the maximal compactification, the Stone–Čech compactification. Sample applications are the Fell compactification and a stably compact extension of algebraic domains.
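As background only (not taken from the thesis itself), the point-free objects alluded to here are frames: a frame is a complete lattice in which finite meets distribute over arbitrary joins,
\[
a \wedge \bigvee S \;=\; \bigvee_{s \in S} (a \wedge s),
\]
the open-set lattice of any topological space being the motivating example. Roughly, a d-frame in the sense of Jung and Moshier pairs two such frames, one for each topology of a bitopological space, together with consistency and totality relations between them.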
APA, Harvard, Vancouver, ISO, and other styles
2

Robbins, Michael. "The likelihood approach in precipitation change-point testing." Connect to this title online, 2006. http://etd.lib.clemson.edu/documents/1175185482/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

阮邦志 and Pong-chi Yuen. "Recognition of occluded objects: a dominant point approach." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 1993. http://hub.hku.hk/bib/B31233375.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Yuen, Pong-chi. "Recognition of occluded objects : a dominant point approach /." [Hong Kong : University of Hong Kong], 1993. http://sunzi.lib.hku.hk/hkuto/record.jsp?B13437574.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Rafler, Mathias. "Gaussian loop- and Pólya processes : a point process approach." Phd thesis, Universität Potsdam, 2009. http://opus.kobv.de/ubp/volltexte/2009/3870/.

Full text
Abstract:
This thesis considers on the one hand the construction of point processes via conditional intensities, motivated by the partial integration of the Campbell measure of a point process. Under certain assumptions on the intensity, the existence of such a point process is shown. A fundamental example turns out to be the Pólya sum process, whose conditional intensity is a generalisation of the Pólya urn dynamics. A Cox process representation for that point process is shown. A further process considered is a Poisson process of Gaussian loops, which represents a noninteracting particle system derived from the discussion of indistinguishable particles. Both processes are used to define particle systems locally, for which thermodynamic limits are determined.
On the one hand, the construction of point processes via conditional intensities is considered, motivated by the partial integration of the Campbell measure of a point process, which yields precisely these conditional intensities. Under certain assumptions on the intensities it is shown that such a point process exists. A fundamental representative turns out to be the Pólya sum process, which arises from a generalisation of the dynamics of the Pólya urn; for it, among other things, a representation as a Cox process is shown. With a Poisson process of Gaussian loops, a non-interacting particle system is considered, derived from the discussion of systems of indistinguishable particles. Both processes are used to construct particle systems locally, for which the thermodynamic limits are identified.
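For orientation, the partial integration of the Campbell measure mentioned in the abstract can be illustrated by the classical Mecke equation, a standard fact not specific to this thesis: a point process \(\mu\) with intensity measure \(\Lambda\) is a Poisson process if and only if
\[
\mathbb{E}\Big[\int f(x,\mu)\,\mu(\mathrm{d}x)\Big] \;=\; \int \mathbb{E}\big[f(x,\mu+\delta_x)\big]\,\Lambda(\mathrm{d}x)
\]
for all measurable \(f \ge 0\). Replacing \(\Lambda(\mathrm{d}x)\) on the right-hand side by a configuration-dependent kernel is one standard way in which conditional (Papangelou) intensities enter such constructions.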
APA, Harvard, Vancouver, ISO, and other styles
6

Rafler, Mathias. "Gaussian loop- and polya processes : a point process approach." Universität Potsdam, 2009. http://opus.kobv.de/ubp/volltexte/2011/5163/.

Full text
Abstract:
Random point processes describe a (random) temporal sequence of events or a (random) spatial arrangement of objects. Their most important representative is the Poisson process. The Poisson process whose intensity measure is the Lebesgue measure, which assigns to every region its volume, generates locally, i.e. in a bounded region B, a number of points that is Poisson distributed with parameter the volume of B, these points being placed in B identically and independently of one another; on average this number is the volume of B. Replacing the Lebesgue measure by a multiple a of it produces this number with a-fold mean. Poisson processes that realise infinitely many points in the whole space already contain, in a single sample, enough information to do statistics: conditioning locally on the number of particles of a sample, one asks for all point processes that could have produced such an observation. These are the limiting point processes for this observation. If more than one comes into question, one speaks of a phase transition. Since the set of these limiting point processes is convex, one asks for its extremal points, the boundary. In the first part a Poisson process is constructed for a physical particle model for bosons. It generates so-called loops: closed polygonal paths characterised by starting with a point at some location, walking with normally distributed steps, and returning to the starting point after a given but random number of steps. For different observations of samples the corresponding limiting point processes are discussed. These observations include counting the loops according to their length, counting the loops altogether, or counting the steps made by the loops. Each choice entails a characteristic structure of the invariant point processes. In all cases considered here a characteristic phase transition is shown and extremal points are identified as special Poisson processes. In particular it is shown how the choice of observation influences the length of the loops. Geometric properties of these Poisson processes are the subject of the second part of the work. The technique of Palm distributions of a point process makes it possible to pick out, among the infinitely many loops of a realisation, the typical loop, whose geometry is then examined. Properties include the Euclidean length of a step or, taking several consecutive steps, the volume of the simplex they define. It is further shown that the barycentre of a typical loop is normally distributed with a fixed variance. The third and final part deals with the construction, the properties and the statistics of a new kind of point process, called the Pólya sum process. Its construction generalises the principle of the Pólya urn: in contrast to the Poisson process, which distributes all points independently and, above all, identically, here the points are placed one after another in such a way that the location at which a point is placed receives a reward on the probability according to which subsequent points are distributed. In this way the Pólya sum process builds "towers", with several points stacking up at the same location. It is shown that fundamental properties nevertheless agree with those of the Poisson process, among them infinite divisibility and independence of increments.
Moreover, its Laplace functional and its Palm distribution are determined. The latter shows that the height of the towers is geometrically distributed. Finally, statistics are again discussed, now for the sum process. Depending on the kind of observation of the sample, for instance the number of towers, their total height, or both, there are characteristic limiting point processes in each of the three cases, and it turns out that the corresponding extremal distributions are again Pólya sum processes.
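The statement above about locally Poisson-distributed counts is the standard textbook property (included here for orientation, not a result of the thesis): for the Poisson process with intensity \(\lambda\) times Lebesgue measure, the number of points \(N(B)\) falling in a bounded region \(B\) satisfies
\[
\mathbb{P}\big(N(B)=k\big) \;=\; e^{-\lambda\,|B|}\,\frac{(\lambda\,|B|)^k}{k!},\qquad k=0,1,2,\dots,
\]
so that \(\mathbb{E}\,N(B)=\lambda\,|B|\); replacing \(\lambda\) by \(a\lambda\) multiplies this mean by \(a\), as described in the abstract.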
APA, Harvard, Vancouver, ISO, and other styles
7

Baek, Yeongcheon. "An interior point approach to constrained nonparametric mixture models /." Thesis, Connect to this title online; UW restricted, 2006. http://hdl.handle.net/1773/5753.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Nehring, Benjamin. "Point processes in statistical mechanics : a cluster expansion approach." Phd thesis, Universität Potsdam, 2012. http://opus.kobv.de/ubp/volltexte/2012/6268/.

Full text
Abstract:
A point process is a mechanism which randomly realizes locally finite point measures. One of the main results of this thesis is an existence theorem for a new class of point processes with a so-called signed Lévy pseudo measure L, which is an extension of the class of infinitely divisible point processes. The construction approach is a combination of the classical point process theory, as developed by Kerstan, Matthes and Mecke, with the method of cluster expansions from statistical mechanics. Here the starting point is a family of signed Radon measures, which defines on the one hand the Lévy pseudo measure L, and on the other hand locally the point process. The relation between L and the process is the following: this point process solves the integral cluster equation determined by L. We show that the results from the classical theory of infinitely divisible point processes carry over in a natural way to the larger class of point processes with a signed Lévy pseudo measure. In this way we obtain, e.g., a criterion for simplicity and a characterization through the cluster equation, interpreted as an integration by parts formula, for such point processes. Our main result in chapter 3 is a representation theorem for the factorial moment measures of the above point processes. With its help we identify the permanental and determinantal point processes, which belong to the classes of Boson and Fermion processes respectively. As a by-product we obtain a representation of the (reduced) Palm kernels of infinitely divisible point processes. In chapter 4 we see how the existence theorem enables us to construct (infinitely extended) Gibbs, quantum-Bose and polymer processes. The so-called polymer processes seem to be constructed here for the first time. In the last part of this thesis we prove that the family of cluster equations has certain stability properties with respect to the transformation of its solutions. This is used first to show how large the class of solutions of such equations is, and secondly to establish the cluster theorem of Kerstan, Matthes and Mecke in our setting. With its help we are able to enlarge the class of Pólya processes to the so-called branching Pólya processes. The last sections of this work are about thinning and splitting of point processes. One main result is that the classes of Boson and Fermion processes remain closed under thinning. We use the results on thinning to identify a subclass of point processes with a signed Lévy pseudo measure as doubly stochastic Poisson processes. We also pose the following question: assume you observe a realization of a thinned point process; what is the distribution of the deleted points? Surprisingly, the Papangelou kernel of the thinning is, up to a constant factor, given by the intensity measure of this conditional probability, called the splitting kernel.
A point process is a mechanism which randomly realizes a locally finite point measure. One main result of this work is an existence theorem for a very large class of point processes with a signed Lévy pseudo measure L; this class is an extension of the class of infinitely divisible point processes. The construction method used combines the classical point process theory, as originally developed by Kerstan, Matthes and Mecke, with the so-called method of cluster expansions from statistical mechanics. The starting point is a family of signed Radon measures. On the one hand this family defines the Lévy pseudo measure L; on the other hand it is used to define the process locally. The relation between L and the process is that the process solves the integral equation determined by L (called the cluster equation). We show that the results from the classical theory of infinitely divisible point processes extend in a natural way to the new class of point processes with a signed Lévy pseudo measure. In this way we obtain, for example, a criterion for simplicity and a characterisation through the cluster equation for those point processes. Our first main result in chapter 3, concerning the analysis of the constructed processes, is a representation theorem for the factorial moment measures. With its help we identify the permanental and determinantal point processes, which fall into the classes of Boson and Fermion processes respectively. As a by-product we obtain a representation of the (reduced) Palm kernels of infinitely divisible point processes. In chapter 4 we use our existence theorem to construct infinitely extended Gibbs processes as well as quantum-Bose and polymer processes. To our knowledge the latter have not been constructed before. In the last part of the work we show that the family of cluster equations has certain stability properties with respect to certain transformations of their solutions. This is used, first, to illustrate how large the class of point process solutions of such an equation is, and second, to prove the cluster theorem of Kerstan, Matthes and Mecke in our more general situation. With its help we can enlarge the class of Pólya processes to that of the branching Pólya processes, as we call them. The last section of the work deals with thinning and splitting of point processes. We prove that the classes of Boson and Fermion processes are closed under thinning. We use the results on thinning to identify a subclass of the point processes with a signed Lévy pseudo measure as doubly stochastic Poisson processes. We also pose the question: suppose we observe a realization of a thinning of a point process; what does the distribution of the deleted point configuration look like? We call this conditional distribution the splitting kernel, and a surprising result is that the Papangelou kernel of the thinning is, up to a constant factor, given by the intensity measure of the splitting kernel.
APA, Harvard, Vancouver, ISO, and other styles
9

Rafler, Mathias. "Gaussian loop- and Pólya processes a point process approach." Potsdam Univ.-Verl, 2009. http://d-nb.info/999884360/04.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Schachtel, Bernard 1943. "Bulimia: a Phenomenological Approach." Thesis, North Texas State University, 1988. https://digital.library.unt.edu/ark:/67531/metadc331020/.

Full text
Abstract:
This study used a qualitative/phenomenological research methodology to examine the perspective of five bulimic subjects about their lives in order to understand the bulimic individual's point of view and develop a clearer picture of the world of the bulimic. This approach involved three interviews for each of the five subjects totalling 22 1/2 hours. The three interviews dealt with the subjects' past and present experiences and their ideas about the future. The qualitative/phenomenological methodology created an in-depth view of each subject's relationship to the beginning of her bulimia and its subsequent development. During the period when the interviews were being transcribed, patterns and concepts emerged and were examined. Nine categories were developed from this data reflecting some of the characteristics of a bulimic's personality. Six research questions were formulated and then answered by evaluating them in the light of the nine categories as well as data and descriptions from the interviews. No one single category was found to be uniquely dominant, but rather the categories tended to appear in a cluster-like fashion depending on the individual personality of the bulimic. The data of this study revealed a distinction between the personality and the behavior of the bulimic. A form with a Likert-like response was developed by the researcher and given out to 11 raters in order to evaluate the presence or non-presence of the categories in selected passages. On the basis of the findings of this study, with its limited subject pool, certain recommendations are presented for the reader that might perhaps be of some use in understanding bulimia.
APA, Harvard, Vancouver, ISO, and other styles
11

Oropallo, William Edward Jr. "A Point Cloud Approach to Object Slicing for 3D Printing." Thesis, University of South Florida, 2018. http://pqdtopen.proquest.com/#viewpdf?dispub=10751757.

Full text
Abstract:

Various industries have embraced 3D printing for manufacturing on-demand, custom printed parts. However, 3D printing requires intelligent data processing and algorithms to go from CAD model to machine instructions. One of the most crucial steps in the process is the slicing of the object. Most 3D printers build parts by accumulating material layer by layer. 3D printing software needs to calculate these layers for manufacturing by slicing a model and calculating the intersections. Finding exact solutions of intersections on the original model is mathematically complicated and computationally demanding. A preprocessing stage of tessellation has become the standard practice for slicing models. Calculating intersections with tessellations of the original model is computationally simple but can introduce inaccuracies and errors that can ruin the final print.

This dissertation shows that a point cloud approach to preprocessing and slicing models is robust and accurate. The point cloud approach to object slicing avoids the complexities of directly slicing models while evading the error-prone tessellation stage. An algorithm developed for this dissertation generates point clouds and slices models within a tolerance. The algorithm uses the original NURBS model and converts it into a point cloud, based on layer thickness and accuracy requirements. The algorithm then uses a gridding structure to calculate where intersections happen and fits B-spline curves to those intersections.

This algorithm finds accurate intersections and can ignore certain anomalies and errors from the modeling process. The primary point evaluation is stable and computationally inexpensive. This algorithm provides an alternative to the challenges of both the direct and tessellated slicing methods that have been the focus of the 3D printing industry.
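The abstract describes the slicing pipeline only at a high level. The following is a minimal, hypothetical sketch rather than the dissertation's algorithm (which samples the NURBS model directly and uses a gridding structure): it simply bins surface-sampled points into z-layers and fits a closed B-spline contour to each layer with SciPy. All names and tolerances here are illustrative assumptions.

```python
# Minimal sketch (not the dissertation's algorithm): bin surface-sampled points
# into z-layers and fit a closed B-spline contour to each layer.
import numpy as np
from scipy.interpolate import splprep, splev

def slice_point_cloud(points, layer_thickness, tol=None):
    """points: (N, 3) array sampled densely from the model surface."""
    tol = layer_thickness / 2.0 if tol is None else tol
    z_min, z_max = points[:, 2].min(), points[:, 2].max()
    layer_heights = np.arange(z_min, z_max + layer_thickness, layer_thickness)
    contours = []
    for z in layer_heights:
        band = points[np.abs(points[:, 2] - z) <= tol]    # points near this layer plane
        if len(band) < 10:                                 # too few points for a cubic fit
            contours.append(None)
            continue
        centre = band[:, :2].mean(axis=0)                  # order points by angle about the centroid
        angles = np.arctan2(band[:, 1] - centre[1], band[:, 0] - centre[0])
        order = np.argsort(angles)
        x, y = band[order, 0], band[order, 1]
        tck, _ = splprep([x, y], s=len(x) * tol ** 2, per=True)  # closed B-spline fit
        contours.append(tck)
    return layer_heights, contours

# Example: evaluate the fitted contour of one layer at 200 parameter values.
# xs, ys = splev(np.linspace(0.0, 1.0, 200), contours[10])
```

A real slicer would also need to handle layers containing several disconnected contours, which this sketch deliberately ignores.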

APA, Harvard, Vancouver, ISO, and other styles
12

Broadfoot, Alison Ann. "Comparing the Dominance Approach to the Ideal-Point Approach in the Measurement and Predictability of Personality." Bowling Green State University / OhioLINK, 2008. http://rave.ohiolink.edu/etdc/view?acc_num=bgsu1211913274.

Full text
APA, Harvard, Vancouver, ISO, and other styles
13

Hill, Bryony J. "An orientation field approach to modelling fibre-generated spatial point processes." Thesis, University of Warwick, 2011. http://wrap.warwick.ac.uk/49422/.

Full text
Abstract:
This thesis introduces a new approach to analysing spatial point data clustered along or around a system of curves or fibres with additional background noise. Such data arise in catalogues of galaxy locations, recorded locations of earthquakes, aerial images of minefields, and pore patterns on fingerprints. Finding the underlying curvilinear structure of these point-pattern data sets may not only facilitate a better understanding of how they arise but also aid reconstruction of missing data. We base the space of fibres on the set of integral lines of an orientation field. Using an empirical Bayes approach, we estimate the field of orientations from anisotropic features of the data. The orientation field estimation draws on ideas from tensor field theory (an area recently motivated by the study of magnetic resonance imaging scans), using symmetric positive-definite matrices to estimate local anisotropies in the point pattern through the tensor method. We also propose a new measure of anisotropy, the modified square Fractional Anisotropy, whose statistical properties are estimated for tensors calculated via the tensor method. A continuous-time Markov chain Monte Carlo algorithm is used to draw samples from the posterior distribution of fibres, exploring models with different numbers of clusters, and fitting fibres to the clusters as it proceeds. The Bayesian approach permits inference on various properties of the clusters and associated fibres, and the resulting algorithm performs well on a number of very different curvilinear structures.
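For context, the conventional (unmodified) fractional anisotropy of a symmetric positive-definite tensor with eigenvalues \(\lambda_1 \ge \lambda_2 \ge \lambda_3\) is the standard quantity
\[
\mathrm{FA} \;=\; \sqrt{\tfrac{1}{2}}\,
\sqrt{\frac{(\lambda_1-\lambda_2)^2+(\lambda_2-\lambda_3)^2+(\lambda_3-\lambda_1)^2}
{\lambda_1^2+\lambda_2^2+\lambda_3^2}},
\]
which ranges from 0 (isotropic) to 1 (maximally anisotropic). The thesis's modified square Fractional Anisotropy is its own measure and is not reproduced here, and the tensors estimated from a planar point pattern may be two- rather than three-dimensional.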
APA, Harvard, Vancouver, ISO, and other styles
14

Ongchin, Derrick Cokee. "Monitoring and evaluating reorder point system performance : a cost-weighted approach." Thesis, Massachusetts Institute of Technology, 2011. http://hdl.handle.net/1721.1/66054.

Full text
Abstract:
Thesis (M.B.A.)--Massachusetts Institute of Technology, Sloan School of Management; and, (S.M.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science; in conjunction with the Leaders for Global Operations Program at MIT, 2011.
Cataloged from PDF version of thesis.
Includes bibliographical references (p. 89).
Organizations are quickly realizing the need to leverage data and analytics to stay one step ahead of the competition as fast-paced global markets continue to emerge and grow and the world becomes increasingly complex. More than ever, corporate executives are executing data-driven decisions and strategies to run businesses. They require scenarios and simulations on alternative courses of action that incorporate complex business parameters in order to make decisions that continuously hone customer focus. In an environment of global economic uncertainty, Cisco Systems sees itself entering a time of unprecedented opportunity. With the customer as a leading priority, this thesis investigates the monitoring and evaluation of Cisco's reorder point system in increasing supply chain visibility and driving customer satisfaction excellence. We aim to develop a model that will aid data-driven decision making and provide an organization the capability to quickly respond to changes in a volatile environment without additional costs or impact to customer experience. The model is intended to serve as a tool to bridge strategy and execution by providing lean process and supply planners invaluable insights into optimizing the inventory management system and improving customer service levels. The model aggregates historical demand data, inventory policy settings, and cost-weighted item performance to gauge system-wide performance (a standard textbook reorder-point formula is sketched below for context). Model testing accurately corroborates previously known issues of insufficient reorder points. Preliminary user feedback suggests strong initial buy-in within the organization.
by Derrick Cokee Ongchin.
S.M.
M.B.A.
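The abstract does not give the model's equations; purely for context, a textbook continuous-review reorder point (not necessarily the form used at Cisco or in this thesis) is
\[
\mathrm{ROP} \;=\; \bar{d}\,L \;+\; z\,\sigma_d\,\sqrt{L},
\]
where \(\bar{d}\) is mean demand per period, \(L\) the replenishment lead time in periods, \(\sigma_d\) the standard deviation of per-period demand, and \(z\) the safety-factor quantile for the target service level; it assumes a deterministic lead time and independent, identically distributed demand.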
APA, Harvard, Vancouver, ISO, and other styles
15

Jiang, Tao. "Information Approach for Change Point Detection of Weibull Models with Applications." Bowling Green State University / OhioLINK, 2015. http://rave.ohiolink.edu/etdc/view?acc_num=bgsu1434382384.

Full text
APA, Harvard, Vancouver, ISO, and other styles
16

Afonso, Daniel Gil. "Industrial applicability of single point incremental forming : functional and energetic approach." Doctoral thesis, Universidade de Aveiro, 2017. http://hdl.handle.net/10773/21534.

Full text
Abstract:
Doctorate in Mechanical Engineering
Incremental sheet forming processes like single point incremental forming have been studied extensively since the beginning of the 2000s. Besides applications in the prototyping field, ISF processes can also be used for the manufacture of unique parts and small batches. This possibility has great potential to speed up new product development and to make smaller production series economically viable. This capability also opens new business possibilities, enabling the development of exclusive or custom products. However, mainly due to its novelty, industrial adoption of SPIF is still tentative, with just a few examples of application. The main purpose of the present work is to create tools that can be used for SPIF process management and to present examples of usage in different industrial fields. The SPIF process is studied using the SPIF-A machine designed and built at the Department of Mechanical Engineering at the University of Aveiro. Despite being a free-form manufacturing process, SPIF has some geometric limitations, mainly due to the forming mechanics and the formability limits of the materials. The possible part configurations and design orientations are established, allowing suitable part development. The hardware required to perform incremental forming operations is outlined and the forming process is described, presenting alternative solutions based both on experimental work and on a state-of-the-art review. A group of parts is developed and manufactured using SPIF as examples of industrial application. Parts are developed and evaluated to meet design and development requirements. New applications using SPIF as a rapid tooling process, typically exclusive to additive manufacturing technologies, are developed. The parallel between SPIF and AM processes finds industrial applications not only in prototyping or part manufacturing but also in tool development and fabrication. This novelty makes it possible to decrease time to market, decrease tooling cost and increase tooling complexity and, consequently, part design freedom in sheet metal moulds. The concept is developed and proven for a variety of thermoplastic and composite material processing technologies.
Incremental sheet forming processes, such as single point incremental forming, have been studied in depth since the beginning of the 2000s. Besides their application in prototype development, incremental forming processes have potential for the manufacture of unique products or small batches. This possibility offers advantages by speeding up the product design and development process and by making the production of small series economically viable. It also allows the creation of new business models, enabling the development and manufacture of exclusive or customised products. However, mainly because of the novelty of the process, incremental forming still has few examples of application in companies. The main objective of the work presented is to develop tools that can be used for the industrialisation of the single point incremental forming process and to present examples of applications in different industrial areas. The SPIF-A machine developed at the Department of Mechanical Engineering of the University of Aveiro is used to study the incremental forming process. Despite the potential of the process to manufacture free-form surfaces, there are some limitations, mostly due to the behaviour of the material and to the forming process and parameters. Guidelines for part design, as well as the possible configurations, are defined in order to enable the development of feasible parts. The equipment needed to perform incremental forming work and the working parameters are studied on the basis of a state-of-the-art review and experimental work. As examples of industrial application of incremental forming, parts are developed and manufactured. The products are developed and evaluated so as to guarantee that the defined requirements are met. New applications are proposed for the use of incremental forming for rapid tooling, typically exclusive to additive manufacturing processes. The analogy between incremental forming and additive manufacturing allows industrial applications to be found beyond prototyping, with great potential for tool development and manufacture. This novelty contributes to reducing time to market, lowering costs and allowing greater freedom in product design. The concept of manufacturing sheet metal moulds for various thermoplastic and composite materials is developed and analysed.
Incremental sheet forming processes, such as single point incremental forming, have been studied in depth since the beginning of the 2000s. These processes are applied in prototype development and also show real potential for the manufacture of unique products and small batches. This possibility offers advantages because it makes it possible to speed up the product design and development process and to make small production series economically viable. In addition, incremental forming enables the creation of new business models through its contribution to the manufacture of customised and exclusive products. Nevertheless, as it is a very recent process, incremental forming currently has little industrial use. The main objective of the work presented is to develop means that can be used to assist the industrialisation of the single point incremental forming process and to present examples for different industrial applications. The SPIF-A machine developed at the Department of Mechanical Engineering of the University of Aveiro is used to study the incremental forming process. Notwithstanding the potential of the incremental forming process to manufacture free-form surfaces, there are some geometric limitations, which depend on the behaviour of the material and on the working parameters. The possible geometric configurations and the design guidelines are defined so as to make the design of feasible parts possible. The equipment needed to carry out incremental forming work and the working parameters are studied using a state-of-the-art review and experimental work. As examples of industrial applications of incremental forming, parts are developed and manufactured. The products are developed and evaluated so as to ensure that they meet the defined requirements.
APA, Harvard, Vancouver, ISO, and other styles
17

Rafler, Mathias [Verfasser]. "Gaussian loop- and Pólya processes : a point process approach / Mathias Rafler." Potsdam : Univ.-Verl, 2009. http://d-nb.info/1000115828/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
18

Skalla, John Robert. "USING THE QUANTIFIED PROCESS APPROACH IN EXAMINATION OF THE FIVE POINT TEST." Cleveland State University / OhioLINK, 2012. http://rave.ohiolink.edu/etdc/view?acc_num=csu1342748977.

Full text
APA, Harvard, Vancouver, ISO, and other styles
19

Arvola, Bjelkesten Kim. "Feasibility of Point Grid Room First Structure Generation : A bottom-up approach." Thesis, Blekinge Tekniska Högskola, Institutionen för kreativa teknologier, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-15721.

Full text
Abstract:
Context. Procedural generation becomes increasingly important for video games in an age where the scope of the content required demands both a lot of time and work. One of the fronts of this field is structure generation, where algorithms create models for game developers to use. Objectives. This study aims to explore the feasibility of the bottom-up approach within the field of structure generation for video games. Methods. Developing an algorithm using the bottom-up approach, PGRFSG, and conducting a user study to validate the results. Each participant evaluates five structures, giving each a score based on whether it belongs in a video game. Results. The participants' evaluations show that among the structures generated were some that definitely belonged in a video game world. Two of the five structures received a high score, though for one structure this was deemed not to be the case. Conclusions. Based on the results presented, it can be concluded that the PGRFSG algorithm creates structures that belong in a video game world and that the bottom-up approach is a suitable one for structure generation.
APA, Harvard, Vancouver, ISO, and other styles
20

Suzuki, Makoto. "The best imperative approach to deontic discourse." Columbus, Ohio : Ohio State University, 2007. http://rave.ohiolink.edu/etdc/view?acc%5Fnum=osu1186164664.

Full text
APA, Harvard, Vancouver, ISO, and other styles
21

Ehrke, John E. Henderson Johnny. "A functional approach to positive solutions of boundary value problems." Waco, Tex. : Baylor University, 2007. http://hdl.handle.net/2104/5026.

Full text
APA, Harvard, Vancouver, ISO, and other styles
22

Dahl, Yngve. "Ubiquitous Computing at Point of Care in Hospitals: A User-Centered Approach." Doctoral thesis, Norwegian University of Science and Technology, Department of Computer and Information Science, 2007. http://urn.kb.se/resolve?urn=urn:nbn:no:ntnu:diva-1816.

Full text
Abstract:

Ubiquitous computing opens up a wide range of ways to support human-computer interaction beyond the desktop, and promises more seamless integration between computer technology and situations of use. However, the dissemination of ubiquitous computing has been slow. Research on this type of technology has in many ways been technically motivated, rather than focusing on how it can be made practically useful. Most critically, there is little design guidance that can help technology developers apply ubiquitous computing designs and concepts to real-world use settings, and provide an understanding of how this technology presents itself to users.

This thesis addresses the applicability of ubiquitous computing in the highly dynamic work environment that hospitals form. The current work aims to inform user-centered design of ubiquitous computing solutions for hospital workers and care situations that occur at the patient’s bedside.

The conducted research has resulted in five journal and conference papers (see Part II) that address various aspects relevant for the different phases (analysis, design, and evaluation) of user-centered design.

In the first paper, requirements for design methods, context models, and system properties of mobile electronic patient charts are discussed. In particular, it shows how the progression of events occurring in the information system and the real world relative to a specific user can be used as a basis for navigation in clinical information.

The second paper investigates the affordances of paper-based medication charts out of the motivation that this can help inform design of ubiquitous computing solutions for clinical use. It shows how paper as an information medium offers affordances (and constraints) central for clinical information work, many of which are not directly transferable to digital media.

The third paper proposes a visual formalism for describing human-computer interaction in digitally augmented spaces. The paper also describes and discusses results from an expert group evaluation of the formalism.

In the fourth paper, a usability comparison of different location and token-based interaction techniques for accessing medical information at the point of care is presented. The paper identifies three user-perceived usability issues relevant for implementation of sensor-based interaction techniques in hospital settings: required user attention, predictability of system behavior, and integration with work situation. It also shows that the interaction techniques differ in terms of the extent to which they fulfill the above criteria, and that the usability of the various techniques is highly relative to the immediate use situation.

Lastly, in the fifth paper the usability of a location-based communication service is evaluated. The service allows hospital workers to leave short digital messages at relevant physical locations (e.g., by a patient bed), so that colleagues can access them later when entering such a location. A usability evaluation of the service indicated that participants (nurses) valued its non-interruptive means of exchanging information, and that it potentially can reduce reliance on their personal memory, when used as a personal reminder service.

Taken together the papers form a platform for future research on UbiComp technology applied in hospital work.

APA, Harvard, Vancouver, ISO, and other styles
23

Schiela, Anton. "The control reduced interior point method : a function space oriented algorithmic approach /." München : Hut, 2006. http://bvbr.bib-bvb.de:8991/F?func=service&doc_library=BVB01&doc_number=015438070&line_number=0001&func_code=DB_RECORDS&service_type=MEDIA.

Full text
APA, Harvard, Vancouver, ISO, and other styles
24

Acosta, Zapién Carlos Eduardo. "A constraint-based approach to verification of programs with floating-point numbers." To access this resource online via ProQuest Dissertations and Theses @ UTEP, 2007. http://0-proquest.umi.com.lib.utep.edu/login?COPT=REJTPTU0YmImSU5UPTAmVkVSPTI=&clientId=2515.

Full text
APA, Harvard, Vancouver, ISO, and other styles
25

Makris, Alexia Melissa. "A Monte Carlo Approach to Change Point Detection in a Liver Transplant." Scholar Commons, 2013. http://scholarcommons.usf.edu/etd/4824.

Full text
Abstract:
Patient survival post liver transplant (LT) is important to both the patient and the center's accreditation, but over the years physicians have noticed that distant patients struggle with post-LT care. I hypothesized that a patient's distance from the transplant center had a detrimental effect on post-LT survival. I suspected Hepatitis C (HCV) and Hepatocellular Carcinoma (HCC) patients would deteriorate due to their recurrent disease, and that there is a need for close monitoring post LT. From the current literature it was not clear whether patients' distance from a transplant center affects outcomes post LT. Firozvi et al. (Firozvi AA, 2008) reported no difference in outcomes of LT recipients living 3 hours away or less. This study aimed to examine outcomes of LT recipients based on distance from a transplant center. I hypothesized that the effect of distance from a LT center was detrimental after adjusting for HCV and HCC status. Methods: This was a retrospective single center study of LT recipients transplanted between 1996 and 2012. 821 LT recipients were identified who qualified for inclusion in the study. Survival analysis was performed using standard methods as well as a newly developed Monte Carlo (MC) approach for change point detection. My new methodology allowed for detection of both a change point in distance and a change point in time by maximizing the two-parameter score function (M2p) over a two-dimensional grid of distance and time values. Extensive simulations using both standard distributions and data resembling the LT data structure were used to prove the functionality of the model. Results: Five-year survival was 0.736 with a standard error of 0.018. Using Cox PH it was demonstrated that patients living beyond 180 miles had a hazard ratio (HR) of 2.68 (p-value<0.004) compared to those within 180 miles of the transplant center. I was able to confirm these results using KM and HCV/HCC-adjusted AFT, while HCV- and HCC-adjusted LR confirmed the distance effect at 180 miles (p=0.0246), one year post LT. The new statistic, labeled M2p, allows for simultaneous dichotomization of distance in conjunction with the identification of a change point in the hazard function. It performed much better than the previously available statistics in the standard simulations. The best model for the data was found to be extension 3, which dichotomizes the distance Z, replacing it by I(Z>c), and then estimates the change point c and tau. Conclusions: Distance had a detrimental effect, and this effect was observed at 180 miles from the transplant center. Patients living beyond 180 miles from the transplant center had 2.68 times the death rate compared to those living within the 180-mile radius. Recipients with HCV fared the worst, with the distance effect being more pronounced (HR of 3.72 vs. 2.68). Extensive simulations using different parameter values, in both standard simulations and simulations resembling LT data, showed that these new approaches work for dichotomizing a continuous variable and finding a point beyond which there is an incremental effect from this variable. The recovered values were very close to the true values and p-values were small.
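The abstract describes, but does not write out, the dichotomized-distance model; one plausible reading of "extension 3" (an assumption on my part, not a quotation of the thesis) is a proportional hazards specification in which the distance effect switches on beyond a cutoff c and after a time change point τ:
\[
h(t \mid Z) \;=\; h_0(t)\,\exp\!\big(\beta\,\mathbf{1}\{Z>c\}\,\mathbf{1}\{t>\tau\}\big),
\]
with the M2p statistic obtained by maximizing the corresponding two-parameter score function over a grid of (c, τ) values, as stated above.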
APA, Harvard, Vancouver, ISO, and other styles
26

Dassios, Angelos. "Insurance, storage and point processes : an approach via piecewise deterministic Markov processes." Thesis, Imperial College London, 1987. http://hdl.handle.net/10044/1/38278.

Full text
APA, Harvard, Vancouver, ISO, and other styles
27

Rahmani, Aviva A. "Trigger point theory as aesthetic activism : a transdisciplinary approach to environmental restoration." Thesis, University of Plymouth, 2015. http://hdl.handle.net/10026.1/9326.

Full text
Abstract:
This dissertation presents a new approach to addressing environmental degradation based on transdisciplinary ecological art. Transdisciplinarity is defined here as merging art and science to discover new insights. Ecological art is defined as an aesthetic practice that promotes environmental resilience. This writing will describe why those approaches are essential to restoring resilient bioregionalism. It introduces the author’s own heuristic perspectives and methodologies and demonstrates how they may be integrated with technology and science. The problems of accelerated loss of coastal (littoral) zone biodiversity, degraded water quality, and habitat fragmentation need critical attention. The author’s research goal was to present a replicable set of guidelines for identifying small points of restoration for wetland littoral zones (the coastal region between terrestrial and marine life) based on a case study called Ghost Nets, scaled to a second case study, Fish Story. Her novel approach included establishing relevant parallels from quantum physics and acupuncture to energetic systems. Additional specific analogies were explored from visual arts, theatre, music, dance, and performance art, to discover a holistic and integrated point of view. Parallels and analogies were drawn by interrogating the two case studies. An important aim of the study was to examine how certain restoration practices could be scaled up to the bioregional level and integrated with a special theory, Trigger Point Theory, to reinforce healthy ecosystems. This included an analysis of how restored upland ecotones and a different relationship to other species could contribute to restoration in the littoral zone. The analysis critiqued how anthropocentric considerations often fail to protect vulnerable water systems. The role of environmental justice for vulnerable human populations and ethical concerns for other animal species was included in that analysis. The author also claims that when artists work with Geographic Information Systems (GIS) mapping, that may propel a new transdiscourse and eventually make heuristic information scientifically useful. Insight from the Ghost Nets case study informed data collections and GIS mapping for the Southern Gulf of Maine. Those insights and the mapping were used to analyze relationships between finfish abundance, eelgrass, and invasive, predatory green crabs. Conclusions were drawn that are relevant to coastal and fisheries management practices. The author used performative approaches to contribute expert witnessing to her conclusions. Questionnaires were used to determine how much community awareness was accomplished with the case studies, and assess effects on future behavior. By combining art and science methodologies, the author revealed insights that could help small restored sites act as trigger points towards restoration of healthy bioregional systems more efficiently than would be possible through restoration science alone. In scaling up (applying small models to larger systems) and applying these practices for landscape ecology, the author assembled a set of recommendations for other researchers to implement these ideas in the future. Those recommendations included the formal engagement of ecological artists as equal partners on environmental restoration teams.
APA, Harvard, Vancouver, ISO, and other styles
28

Yung, Chung-kwong Sunny, and 翁松光. "Kindergarten: a podium approach." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2000. http://hub.hku.hk/bib/B31986262.

Full text
APA, Harvard, Vancouver, ISO, and other styles
29

OLIVEIRA, DARIO AUGUSTO BORGES. "A LINEAR PROGRAMMING APPROACH TO VASCULAR NETWORK SEGMENTATION FROM A SINGLE SEED POINT." PONTIFÍCIA UNIVERSIDADE CATÓLICA DO RIO DE JANEIRO, 2013. http://www.maxwell.vrac.puc-rio.br/Busca_etds.php?strSecao=resultado&nrSeq=23618@1.

Full text
Abstract:
PONTIFÍCIA UNIVERSIDADE CATÓLICA DO RIO DE JANEIRO
COORDENAÇÃO DE APERFEIÇOAMENTO DO PESSOAL DE ENSINO SUPERIOR
CONSELHO NACIONAL DE DESENVOLVIMENTO CIENTÍFICO E TECNOLÓGICO
FUNDAÇÃO DE APOIO À PESQUISA DO ESTADO DO RIO DE JANEIRO
This thesis presents the development and results of this doctoral project, whose multidisciplinary objective was to develop a methodology and a tool for segmenting vascular networks from computed tomography images, using automatic image segmentation procedures and three-dimensional data visualization. The suggested methodology segments the vascular network iteratively from a single starting point. The approach uses a conical sampling model composed of several ordered concentric spherical layers. Each sampled point is evaluated using a vascularity measure proposed in this thesis, which seeks to identify points that belong to vessels. A directed graph is then built with the selected points and analysed in order to find, locally, chains of connected points that make up pieces of branches of the vascular network. Each segment of the vascular network generates a new seed from which a new sampling is performed, and in this way the iterative procedure repeats until the whole vascular structure is segmented. The methodology was tested using synthetic and real images. Among the real images, coronary, carotid, hepatic and pulmonary vascular structures were segmented, as well as a network of nerve fibres of the olfactory system. The topologies of the vascular networks were also extracted. The evaluation was quantitative whenever possible, although this type of data very rarely offers a reference segmentation, and in those cases the evaluation was qualitative and visual. The results obtained confirm the potential of the method and indicate directions for future developments.
This thesis presents the development and results of this PhD project, whose multidisciplinary objective was to develop a methodology and a tool for segmenting vascular networks from CT images, using automatic segmentation procedures and three-dimensional data visualization. The suggested methodology tracks a vascular network iteratively using a single starting point. The approach uses a conical sampling model composed of multiple concentric and ordered spherical layers. Each sampled point is evaluated using a measure of vascularity proposed in this thesis, which seeks to identify points that belong to vessels. A directed graph is then built with the selected points and analyzed to find chains of connected points that make up pieces of branches of the vascular network. Each vascular segment found generates a new seed from which a new sampling is performed, and in this way the iterative procedure is repeated until the entire vascular structure is segmented. The methodology was tested using synthetic and real images. Among the real images, several different vascular structures were segmented, such as coronary, carotid, hepatic and pulmonary vessels, and even a network of nerve fibers in the olfactory system. Vascular network topologies were also identified. The evaluation was quantitative where possible, although this type of data rarely provides a reference segmentation; in the remaining cases the assessment was qualitative and visual. The results confirm the potential of the method and suggest directions for further developments.
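As an illustration only, the iterative seed-and-sample idea described above can be sketched as follows. This is a generic skeleton, not the thesis's method: the conical sampling model and the proposed vascularity measure are not reproduced, and `vesselness` below is a hypothetical placeholder scoring function supplied by the caller.

```python
# Generic skeleton of seed-based iterative vessel tracking (not the thesis's
# exact method): grow from a single seed, score candidate points on spherical
# shells with a placeholder "vesselness" function, and queue new seeds.
import numpy as np
from collections import deque

def fibonacci_sphere(n):
    """Roughly uniform unit direction vectors on the sphere."""
    i = np.arange(n)
    golden = np.pi * (3.0 - np.sqrt(5.0))
    y = 1.0 - 2.0 * (i + 0.5) / n
    r = np.sqrt(1.0 - y * y)
    return np.stack([r * np.cos(golden * i), y, r * np.sin(golden * i)], axis=1)

def track_vessels(seed, vesselness, radii=(2.0, 4.0, 6.0),
                  n_dirs=64, threshold=0.5, max_points=10_000):
    """vesselness: hypothetical callable scoring how vessel-like a 3D point is."""
    dirs = fibonacci_sphere(n_dirs)
    nodes = [np.asarray(seed, dtype=float)]   # accepted centreline points
    edges = []                                # directed graph of parent -> child links
    queue = deque([0])                        # indices of nodes still to be expanded
    while queue and len(nodes) < max_points:
        i = queue.popleft()
        for radius in radii:                  # concentric spherical shells around node i
            candidates = nodes[i] + radius * dirs
            scores = np.array([vesselness(p) for p in candidates])
            for p in candidates[scores > threshold]:
                # keep only points not already represented by a nearby node
                if all(np.linalg.norm(p - q) > radii[0] for q in nodes):
                    nodes.append(p)
                    edges.append((i, len(nodes) - 1))
                    queue.append(len(nodes) - 1)   # the new point becomes a fresh seed
    return np.array(nodes), edges
```

The returned node list and edge list form the directed graph mentioned in the abstract; a real implementation would score points against the image volume rather than a user-supplied callable.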
APA, Harvard, Vancouver, ISO, and other styles
30

Louw, C. J. "Benefits of a blended approach in teaching undergraduate mathematics." Journal for New Generation Sciences, Vol 10, Issue 3: Central University of Technology, Free State, Bloemfontein, 2012. http://hdl.handle.net/11462/620.

Full text
Abstract:
Published Article
The purpose of this paper is to provide a discussion of the educational potential of a blended approach to teaching and learning in the context of the challenges related to mastering basic concepts in mathematics at higher education level. Based on the results of the application of blended learning and teaching for two consecutive semesters at a university of technology, their potential to support meaningful learning of undergraduate mathematics is discussed. The use of clickers, minute and muddiest point papers and board work as educational tools with incomplete sentences as evaluative tool, are discussed. The conclusion is that a blended approach to teaching and learning has many benefits when applied appropriately for a particular context. The lecturer's attitude remains vital for successful implementation of technology-enhanced strategies.
APA, Harvard, Vancouver, ISO, and other styles
31

Leung, Hiu-lan, and 梁曉蘭. "Wandering ideal point models for single or multi-attribute ranking data: a Bayesian approach." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2003. http://hub.hku.hk/bib/B29552357.

Full text
APA, Harvard, Vancouver, ISO, and other styles
32

Polaha, Jodi, and Ivy A. Click. "Implementation Science at the End-Point: A New Approach for Researchers in Primary Care." Digital Commons @ East Tennessee State University, 2018. https://dc.etsu.edu/etsu-works/6372.

Full text
Abstract:
Book Summary: Practice-Based Research shows mental-health practitioners how to establish viable and productive research programs in routine clinical settings. Chapters written by experts in practice-based research use real-world examples to help clinicians work through some of the most common barriers to research output in these settings, including lack of access to institutional review boards, lack of organizational support, and limited access to financial resources. Specialized chapters also provide information on research methods and step-by-step suggestions tailored to a variety of practice settings. This is an essential volume for clinicians interested in establishing successful, long-lasting practice-based research programs.
APA, Harvard, Vancouver, ISO, and other styles
33

Serra, Saurina Laura. "Mixed models and point processes." Doctoral thesis, Universitat de Girona, 2013. http://hdl.handle.net/10803/127348.

Full text
Abstract:
The main objective of this Thesis is to model the occurrence of wildfires and, in particular, knowing the factors with most influence, to evaluate how they are distributed in space and time. The Thesis pursues three major goals. Firstly, it is analysed whether the data follow a particular pattern or behave randomly. Secondly, because fire distribution varies over time, a model which includes the temporal component is used. Finally, fires that burn areas greater than a given number of hectares (50ha, 100ha or 150ha) are analysed; even though they represent a small percentage of all fires, they account for a high percentage of the area burned and cause important environmental damage. The results presented may contribute to the prevention and management of wildfires. In addition, the methodology used in this work can be useful for determining the factors that help any fire become a big wildfire.
The main objective of this thesis is to model the occurrence of wildfires and, in particular, to analyse the variability of their behaviour as a function of space and time, identifying the factors that influence this behaviour with greater or lesser intensity. Three major goals are set out. First, it is analysed whether the data follow a particular pattern or instead behave randomly. Second, the temporal variability of the fires is studied and a model incorporating the temporal component is applied. Finally, fires larger than a specified extension (50ha, 100ha or 150ha) are analysed; although they are not the most frequent, they are the ones that cause the most environmental damage. The results obtained may contribute to the prevention and management of forest fires. Moreover, the methodology used is useful for determining the factors that make a fire become a large forest fire.
APA, Harvard, Vancouver, ISO, and other styles
34

Colledanchise, Michele. "Stabilization and Collision Avoidance of Non-point Agents in Dynamic Environments: A Potential Field Approach." Thesis, KTH, Reglerteknik, 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-107404.

Full text
Abstract:
Mobile robot navigation has attracted interest in recent years in civilian and military research areas due to its main benefit of substituting human presence in dangerous tasks. Examples include bomb deactivation, fire suppression and aerial surveillance. In the literature, different definitions of the environment where the vehicles' motion takes place are considered, and the main challenge for mobile robots is to navigate in such an environment while guaranteeing collision avoidance and convergence towards a desired target. Firstly, in the thesis these classical environments are extended by considering a more general scenario characterized by workspace bounds and an arbitrary number of objects within them. The main contribution is to develop control strategies able to perform several tasks using artificial potential function methodologies. In particular, the first studied problem consists of a single spherical-shaped agent navigating among both fixed and moving obstacles. The case study is then extended to the multi-agent navigation problem, proposing both centralized and decentralized policies. In these problems, single-integrator kinematic models are considered. The methodologies are then extended to the more realistic case of mobile robots described by a double-integrator kinematic model. In the thesis, theoretical results are developed using tools from Lyapunov stability and LaSalle's invariance principle. The derivations are then illustrated by a set of representative numerical examples.
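For reference, the artificial potential function methodology mentioned here is conventionally built (in its textbook form, not necessarily the exact functions used in this thesis) from an attractive term toward the goal and repulsive terms around obstacles:
\[
U(q) \;=\; \tfrac{1}{2}k_a\,\|q-q_{\mathrm{goal}}\|^2
\;+\; \sum_i
\begin{cases}
\tfrac{1}{2}k_r\left(\dfrac{1}{\rho_i(q)}-\dfrac{1}{\rho_0}\right)^{2}, & \rho_i(q)\le\rho_0,\\[4pt]
0, & \rho_i(q)>\rho_0,
\end{cases}
\]
where \(\rho_i(q)\) is the distance to obstacle \(i\) and \(\rho_0\) its influence radius; a single-integrator agent then follows \(\dot{q}=-\nabla U(q)\), and convergence and collision-avoidance arguments typically rely on Lyapunov functions and LaSalle's invariance principle, as the abstract indicates.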
APA, Harvard, Vancouver, ISO, and other styles
35

Bingham, Mark. "An interest point based illumination condition matching approach to photometric registration within augmented reality worlds." Thesis, University of Huddersfield, 2011. http://eprints.hud.ac.uk/id/eprint/11048/.

Full text
Abstract:
With recent and continued increases in computing power, and advances in the field of computer graphics, realistic augmented reality environments can now offer inexpensive and powerful solutions in a whole range of training, simulation and leisure applications. One key challenge to maintaining convincing augmentation, and therefore user immersion, is ensuring consistent illumination conditions between virtual and real environments, so that objects appear to be lit by the same light sources. This research demonstrates how real world lighting conditions can be determined from the two-dimensional view of the user. Virtual objects can then be illuminated and virtual shadows cast using these conditions. This new technique uses pairs of interest points from real objects and the shadows that they cast, viewed from a binocular perspective, to determine the position of the illuminant. This research has been initially focused on single point light sources in order to show the potential of the technique and has investigated the relationships between the many parameters of the vision system. Optimal conditions have been discovered by mapping the results of experimentally varying parameters such as FoV, camera angle and pose, image resolution, aspect ratio and illuminant distance. The technique is able to provide increased robustness where greater resolution imagery is used. Under optimal conditions it is possible to derive the position of a real world light source with low average error. An investigation of available literature has revealed that other techniques can be inflexible, slow, or disrupt scene realism. This technique is able to locate and track a moving illuminant within an unconstrained, dynamic world without the use of artificial calibration objects that would disrupt scene realism. The technique operates in real-time as the new algorithms are of low computational complexity. This allows high framerates to be maintained within augmented reality applications. Illuminant updates occur several times a second on an average to high end desktop computer. Future work will investigate the automatic identification and selection of pairs of interest points and the exploration of global illuminant conditions. The latter will include an analysis of more complex scenes and the consideration of multiple and varied light sources.
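The underlying geometry for a single point light can be stated briefly (a standard observation, not the thesis's full method): if an object point p casts a shadow at s, the light L is collinear with them, lying on the ray
\[
L(t) \;=\; s + t\,(p - s), \qquad t > 1,
\]
so two or more point-shadow pairs, triangulated into 3D from the binocular views, determine the illuminant position as the (least-squares) intersection of their rays.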
APA, Harvard, Vancouver, ISO, and other styles
36

Jenkins, Thomas. "A biomechanical approach to improved fracture risk assessment with a focus on reference point microindentation." Thesis, University of Southampton, 2015. https://eprints.soton.ac.uk/376998/.

Full text
Abstract:
Osteoporosis is a bone disease with two primary definitions: 1) the World Health Organisation definition of low Bone Mineral Density (BMD) and 2) the National Institutes of Health definition of increased bone fragility and fracture risk. Though BMD, alongside clinical factors, is the current gold standard for diagnosing osteoporosis, these measures are risk factors that only define a small proportion of individuals who fracture. Changes in the biomechanical properties of the bone may relate to fracture risk and bone disease and, if they can be clinically assessed, could be useful in supplementing existing techniques to improve future diagnosis and treatment. This thesis addresses whether there are biomechanical changes in osteoporotic as well as osteoarthritic bone and whether these can be measured through Reference Point Microindentation (RPI). RPI has previously been applied in vivo with no reported complications, with the hypothesis that a higher indentation depth of a needle-like test probe implies increased fracture risk. Despite emerging implementation of RPI, there is limited research to advise on optimal testing with this technique. Therefore, recommendations are provided to minimise testing variation by studying the variability associated with RPI testing parameters in vitro. Primarily, a best practice would be to: keep maximum load, sample preparation and mode-of-use consistent, fix the device in its stand, remove soft tissue and machine the bone, ensure sufficient cortical thickness and make multiple repeat measurements. These recommendations then guided the main, clinically focussed, study of this thesis. RPI was applied to femoral neck samples extracted from fractured and osteoarthritic patients, finding the surface measured indentation depth to be increased relative to cadaveric control samples. Furthermore, the measured indentation property had minimal correlation with current existing techniques, supplementing BMD and clinical factors to improve discrimination of fractured from control tissue. Complementary fracture toughness testing allowed for improved understanding of bone disease and interpretation of the RPI results. This study demonstrated that fracture toughness properties of the inferomedial femoral neck seem largely unaffected by osteoporosis or osteoarthritis. Furthermore, correlation between RPI and fracture properties was minimal, implying that the technique was assessing other properties to discriminate the osteoporotic and osteoarthritic from control bone. Indent imaging, through micro-computed tomography and serial sectioning techniques, confirmed this to be the case, with RPI being a multifactorial measure comprised of structural as well as fracture mechanics and elastoplastic properties. Altogether, this thesis provides insight into the effects of both osteoporosis and osteoarthritis on the biomechanical properties of the femoral neck and presents how these could potentially be clinically assessed through RPI, supplementing existing techniques to improve fracture risk assessment.
APA, Harvard, Vancouver, ISO, and other styles
37

Nyberg, Borrfors André. "Energy Decomposition Analysis of Neutral and Anionic Hydrogen Bonded Dimers Using a Point-Charge Approach." Thesis, KTH, Tillämpad fysikalisk kemi, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-288970.

Full text
Abstract:
A large set of dimeric hydrogen bonds of the type A–H…B, where AH is an alkyne, alcohol, or thiol and B = [Br–, Cl–, NH3, HCN], are computed and evaluated using Kohn-Sham density functional theory at the M06-2X/6-311+G(2df,2p) level. These complexes are also evaluated using a point charge (PC) approach (with the same method and basis set), where the atoms of the hydrogen bond acceptor B are substituted by charges optimized to reproduce the charge distribution of the molecule, with the purpose of separating and isolating the electrostatic and polarization energy components of the interaction energies. Using this approach it was discovered that the complexation energies of hydrogen bonds (i.e. the interaction energies with the energy cost of nuclear deformation corrected for), independent of the nature of either monomer AH or B, are largely made up of electrostatics and polarization, while charge transfer, dispersion, and other rest terms make up only a small fraction of the total interaction. The balance between electrostatics and polarization varies depending on the type of monomers in the hydrogen bond, but their sum, the PC interaction energy, correlates linearly (ΔECompl = 0.85ΔEPC) with R2 = 0.995 over an energy span of 0 < ΔECompl < 50 kcal mol–1. This is made even more remarkable by the fact that including halogen-bonded complexation energies in the same correlation does not change the correlation coefficient significantly, indicating that the two bond types are comprised of the same components even though they are remarkably different in origin.
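As a worked illustration of the bookkeeping behind such a decomposition (with made-up energies, not data from the thesis): the point-charge calculation supplies the electrostatics-plus-polarization contribution ΔEPC, the remainder ΔECompl − ΔEPC collects charge transfer, dispersion and rest terms, and a through-origin linear fit recovers a slope of the kind reported above.

```python
import numpy as np

# Hypothetical complexation energies (kcal/mol) for a few dimers:
# the full supermolecular value and the corresponding point-charge (PC) value.
e_compl = np.array([5.2, 11.8, 23.5, 41.0])   # total complexation energies
e_pc    = np.array([6.0, 13.9, 27.4, 48.7])   # electrostatics + polarization via the PC model

# The residual collects charge transfer, dispersion and other rest terms.
e_rest = e_compl - e_pc

# Through-origin least-squares fit E_compl ~ slope * E_PC, as in the reported correlation.
slope = np.sum(e_pc * e_compl) / np.sum(e_pc**2)
pred = slope * e_pc
ss_res = np.sum((e_compl - pred)**2)
ss_tot = np.sum((e_compl - e_compl.mean())**2)
r2 = 1.0 - ss_res / ss_tot
print(f"slope = {slope:.2f}, R^2 = {r2:.3f}, rest terms = {e_rest}")
```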
APA, Harvard, Vancouver, ISO, and other styles
38

Schutte, Jeffrey Scott. "Simultaneous multi-design point approach to gas turbine on-design cycle analysis for aircraft engines." Diss., Atlanta, Ga. : Georgia Institute of Technology, 2009. http://hdl.handle.net/1853/28169.

Full text
Abstract:
Thesis (M. S.)--Aerospace Engineering, Georgia Institute of Technology, 2009.
Committee Chair: Mavris, Dimitri; Committee Member: Gaeta, Richard; Committee Member: German, Brian; Committee Member: Jones, Scott; Committee Member: Schrage, Daniel; Committee Member: Tai, Jimmy.
APA, Harvard, Vancouver, ISO, and other styles
39

Schall, Judith [Verfasser]. "Optimization of point grids in regional satellite gravity analysis using a Bayesian approach / Judith Schall." Bonn : Universitäts- und Landesbibliothek Bonn, 2020. http://d-nb.info/1208831577/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
40

Feydt, Austin Pack. "A Higher-Fidelity Approach to Bridging the Simulation-Reality Gap for 3-D Object Classification." Case Western Reserve University School of Graduate Studies / OhioLINK, 2019. http://rave.ohiolink.edu/etdc/view?acc_num=case1558355175360648.

Full text
APA, Harvard, Vancouver, ISO, and other styles
41

MCLAUGHLIN, PATRICK IAN. "LATE ORDOVICIAN SEISMITES OF KENTUCKY AND OHIO: A SEDIMENTOLOGICAL AND SEQUENCE STRATIGRAPHIC APPROACH." University of Cincinnati / OhioLINK, 2002. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1028144697.

Full text
APA, Harvard, Vancouver, ISO, and other styles
42

Nehring, Benjamin [Verfasser], and Sylvie [Akademischer Betreuer] Roelly. "Point processes in statistical mechanics : a cluster expansion approach [Elektronische Ressource] / Benjamin Nehring. Betreuer: Sylvie Roelly." Potsdam : Universitätsbibliothek der Universität Potsdam, 2012. http://d-nb.info/1028397100/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
43

Good, Keith William. "An examination of the contribution of the 'starting point approach' (SPA) to primary design and technology." Thesis, University of Greenwich, 2009. http://gala.gre.ac.uk/5641/.

Full text
Abstract:
The starting point approach (spa) to design and technology education presented in this thesis is intended to stimulate children’s ideas and to allow projects with different purposes to be designed and made in one class. The projects all originate from a common starting point. The approach is intended to promote creativity and individual choice whilst being manageable for the teacher. Data were collected during a spa session taught to a group of 10-11-year-old children in London. They were introduced to the pressure pad switch that was to be the starting point for their designing. The activity was initiated by the group brainstorming existing uses for pressure pads and ways to operate the switch prior to making their own. Each child went on to develop a project with a purpose selected by them. A transcript derived from the video of the above was subjected to analysis by means of coding and a specially devised grid. Children were also observed working, and data were gathered using questionnaires, video recording, Dictaphone, field notes, interviews and digital pictures of the final artefacts. The study was qualitative in nature and based on an interpretative paradigm. The data were considered in two phases. Phase 1 of the study examined whether the children could do what the spa required. Phase 2 concentrated on examining in more detail what occurred when the spa was used. The research showed that children following the spa were able to design and make products with different purposes within a single class. It is argued that an advantage of the spa is that it reconciles the often conflicting demands of teaching skills and knowledge with encouraging individual creativity. The starting point approach is a pedagogical tool and process that can be used to motivate children by allowing them to decide the purpose of their individual project as well as its design.
APA, Harvard, Vancouver, ISO, and other styles
44

Bose, Saptak. "An integrated approach encompassing point cloud manipulation and 3D modeling for HBIM establishment: a case of study." Master's thesis, Alma Mater Studiorum - Università di Bologna, 2021.

Find full text
Abstract:
In the case of Cultural Heritage buildings, the need for an effective, exhaustive and efficient method to replicate their state in an interactive, three-dimensional environment is today of paramount importance, from both an engineering and a historical point of view. Modern geomatics entails the use of Terrestrial Laser Scanners (TLS) and photogrammetric modelling from Structure-from-Motion (SfM) techniques to initiate this modelling operation. To realize such a model, the novel Historic Building Information Modelling (HBIM) technique is implemented. Built on a prototype library of parametric objects based on historic architectural data, HBIM allows the generation of an all-encompassing, three-dimensional model which possesses an extensive array of information pertaining to the structure at hand. This information, be it geometric, architectural, or even structural, can then be used to assess reinforcement requirements, rehabilitation needs, stage of depreciation, method of initial construction, material makeup, historic alterations, etc. In this work, the study of the San Michele in Acerboli church, located in Santarcangelo di Romagna, Italy, is considered. An HBIM model is prepared and its accuracy analyzed. The final model serves as an information repository for the aforementioned church, able to geometrically define its finest characteristics.
APA, Harvard, Vancouver, ISO, and other styles
45

Martell, Richard J. "The participatory design of an ecosystem approach to monitoring in support of sense-making : What's the Point? /." Waterloo, Ont. : University of Waterloo [Dept. of Environment and Resource Studies], 1999. http://etd.uwaterloo.ca/etd/rjmartel1999.pdf.

Full text
Abstract:
Thesis (M.E.S.) - University of Waterloo, 1999.
"A thesis presented to the University of Waterloo in fulfilment of the thesis requirements for the degree of Master of Environmental Studies in Environment and Resource Studies". Includes bibliographical references (p. 223-240). Issued also in PDF format and available via the World Wide Web. Requires Internet connectivity, World Wide Web browser, and Adobe Acrobat Reader.
APA, Harvard, Vancouver, ISO, and other styles
46

Martell, Richard. "The participatory design of an ecosystem approach to monitoring in support of sense-making: What's the Point?" Thesis, University of Waterloo, 1999. http://hdl.handle.net/10012/1005.

Full text
Abstract:
Environmental monitoring initiatives are typically conceived as strictly scientific affairs designed to provide support for managerial decision-making; as a consequence most initiatives are centered on a formal mandate or an overarching mission statement that provides direction for monitoring activity. But official frameworks tend to marginalize lay perspectives as experts pursue disciplinary rigor at the expense of public input, a situation not in keeping with the spirit of the biosphere reserve concept. This thesis argues that an alternative design approach that reaches beyond scientists and resource managers is necessary. Environmental monitoring under an ecosystem approach is subject to scientific, social, and bureaucratic demands that defy easy disentanglement. A medley of factors influence how data are collected, interpreted, and used; neglect of these 'soft' dimensions runs the risk of failing to win the enduring support of stakeholders. There is a need to coordinate activity and to partially align multiple perspectives; this is the 'soft underbelly' of integrated monitoring that gets short shrift in most designs. While there is much monitoring being done in and around the Long Point World Biosphere Reserve, there is little coordination among monitoring groups and no obvious way to combine disparate data sets in a meaningful way. This thesis describes the elements of a locally-sensible framework for monitoring practice that is mainly concerned with trying to make sense of confusing and ambiguous situations; it strives to integrate the 'why', 'what', and 'how' of monitoring in as transparent a manner as possible by crafting 'boundary objects' that help to congeal understanding and provide centers of coordination. Using principles of participatory design in the soft-systems tradition, the overall intent is to primarily support sense-making, not decision-making; to generate searching questions, not final solutions; to facilitate learning, not control.
APA, Harvard, Vancouver, ISO, and other styles
47

Qin, Xiao. "A Data-Driven Approach for System Approximation and Set Point Optimization, with a Focus in HVAC Systems." Diss., The University of Arizona, 2014. http://hdl.handle.net/10150/318828.

Full text
Abstract:
Dynamically determining input signals to a complex system, to increase performance and/or reduce cost, is a difficult task unless users are provided with feedback on the consequences of different input decisions. For example, users self-determine the set point schedule (i.e. temperature thresholds) of their HVAC system without any ability to predict cost; they select only comfort. Users are unable to optimize the set point schedule with respect to cost because cost feedback is provided only at billing-cycle intervals. To provide rapid feedback (such as expected monthly or daily cost), mechanisms for system monitoring, data-driven modeling, simulation, and optimization are needed. Techniques from the literature require in-depth domain knowledge and/or significant investment in infrastructure or equipment to measure state variables, making these solutions difficult to implement or to scale down in cost. This work introduces methods to approximate complex system behavior prediction and optimization, based on dynamic data obtained from inexpensive sensors. Unlike many existing approaches, we do not extract an exact model to capture every detail of the system; rather, we develop an approximated model with key predictive characteristics. Such a model makes estimation and prediction available to users who can then make informed decisions; alternatively, these estimates are made available as an input to an optimization tool to automatically provide Pareto-optimized set points. Moreover, the approximate nature of this model makes the determination of the prediction and optimization parameters computationally inexpensive, adaptive to system or environment change, and suitable for embedded system implementation. The effectiveness of these methods is first demonstrated on an HVAC system and then extended to a variety of complex system applications.
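A minimal sketch of this general workflow, not the dissertation's actual models: fit a cheap approximate model of daily cost as a function of the cooling set point from logged data, then search the admissible set point range for the cheapest value that still satisfies a comfort bound. The logged data, quadratic model form and comfort limit are illustrative assumptions.

```python
import numpy as np

# Hypothetical daily logs: cooling set point (deg C) and measured energy cost ($).
setpoints = np.array([22.0, 23.0, 24.0, 25.0, 26.0])
costs     = np.array([9.8,  8.1,  6.9,  5.6,  4.9])

# Approximate model: quadratic fit of cost vs. set point (cheap to refit as new data arrive).
coeffs = np.polyfit(setpoints, costs, deg=2)
cost_model = np.poly1d(coeffs)

# Assumed comfort constraint: set point must not exceed 25.5 deg C.
candidates = np.linspace(22.0, 25.5, 50)
predicted = cost_model(candidates)
best = candidates[np.argmin(predicted)]
print(f"recommended set point: {best:.1f} C, predicted daily cost: {cost_model(best):.2f} $")
```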
APA, Harvard, Vancouver, ISO, and other styles
48

Wu, Ching-Yu, and 吳敬輿. "An Address Exchange Approach for Point to Point Networking." Thesis, 2014. http://ndltd.ncl.edu.tw/handle/g9bj83.

Full text
Abstract:
Master's thesis
National Chung Cheng University
Graduate Institute of Computer Science and Information Engineering
102
Peer-to-peer network architectures have become increasingly popular in recent years, and our development team built a peer-to-peer networking platform a few years ago in which every node keeps its autonomy and no large enterprise can monopolize the system. Any peer-to-peer application, however, faces the so-called NAT problem: in our system it is impossible to identify which user behind a NAT actually performed an action. The main purpose of this work is to let the service modules on our network platform record these actions correctly, so that users can use the service modules properly. In this thesis we solve not only the problem mentioned above but also various related issues encountered along the way, and finally propose an improved method for these problems.
APA, Harvard, Vancouver, ISO, and other styles
49

Li, Yi-Hua, and 李依樺. "Flash point prediction by UNIFAC approach." Thesis, 2009. http://ndltd.ncl.edu.tw/handle/64489391655243426974.

Full text
Abstract:
Master's thesis
China Medical University
Master's Program, Department of Occupational Safety and Health
97
The implementation of GHS (Globally Harmonized System of Classification and Labeling of Chemicals) is an international trend, and Taiwan has implemented GHS since 2008. In the implementation of GHS, the flash point of mixtures is the critical property for classifying flammable liquids: the flash point of a liquid is the primary parameter in hazard classification for flammable liquids. If a flash point prediction model for mixtures is developed, the flash point of mixtures can be estimated rapidly and economically, which is helpful in promoting GHS. Taiwan faced the problem of classifying flammable liquids in 2008, and it urgently needs to be resolved. This study offers a solution for estimating the flash point of mixtures, which is necessary for the classification of flammable liquids. Traditional models for predicting the flash point of mixtures usually follow the activity coefficient approach; however, the activity coefficient parameters are regressed from phase equilibrium data in the literature, and if no such parameters are available for the desired mixture, the model cannot predict its flash point. Thus, this research aims to improve on this deficiency of the literature models by using the UNIFAC (Universal Quasi-chemical Functional Group Activity Coefficient) equation, the Dortmund-UNIFAC equation and the Lyngby-UNIFAC equation. This study focuses on prediction models for miscible mixtures. The results suggest using different types of UNIFAC equation to estimate the activity coefficients when predicting flash points for different mixture types. In addition to its academic value, the result can be applied to help the government and industries promote the implementation of GHS. Potential applications of the model concern the assessment of fire and explosion hazards and the development of inherently safer designs for chemical processes.
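To indicate the kind of calculation involved, the activity-coefficient formulation commonly used in this literature (a Liaw-type model built on Le Chatelier's rule) takes the mixture flash point as the temperature T at which the sum of xi·γi·Pisat(T)/Pisat(Tfp,i) over the components equals 1. The sketch below uses ideal activity coefficients (γi = 1) in place of UNIFAC for brevity, and the Antoine constants and pure-component flash points are placeholders, not values from the thesis.

```python
import numpy as np
from scipy.optimize import brentq

def psat(T, A, B, C):
    """Antoine equation, log10(P) = A - B/(T + C); units follow the constants used."""
    return 10.0 ** (A - B / (T + C))

# Illustrative binary mixture: mole fractions, Antoine constants, pure flash points (deg C).
x = np.array([0.4, 0.6])
antoine = [(7.0, 1600.0, 230.0), (7.2, 1700.0, 220.0)]   # placeholder constants
t_fp_pure = np.array([13.0, 30.0])                        # placeholder flash points

def le_chatelier_residual(T, gamma=np.array([1.0, 1.0])):
    """Zero when the Le Chatelier sum equals 1; gamma would come from UNIFAC in practice."""
    terms = [x[i] * gamma[i] * psat(T, *antoine[i]) / psat(t_fp_pure[i], *antoine[i])
             for i in range(len(x))]
    return sum(terms) - 1.0

t_flash_mix = brentq(le_chatelier_residual, -50.0, 200.0)
print(f"predicted mixture flash point: {t_flash_mix:.1f} deg C")
```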
APA, Harvard, Vancouver, ISO, and other styles
50

Chen, Cheng-Kaye, and 陳正凱. "Extreme Point Approach to Robust Controller Design." Thesis, 1999. http://ndltd.ncl.edu.tw/handle/48234121638474035938.

Full text
Abstract:
Master's thesis
National Taiwan Ocean University
Department of Electrical Engineering
87
The focal point of this thesis is robust controller design for systems with real parametric uncertainty. Based on the geometric concept of the value set, we provide an efficient approach that achieves the design by using extreme points, which alleviates the computational burden of robust controller design. We first investigate extreme gain and phase properties associated with affine linear and multilinear uncertainty systems, and then apply these properties to robust classical controller design. Finally, it is shown in this thesis that these design procedures can be easily converted into systematic CAD design algorithms and effectively coded as MATLAB m-files.
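To illustrate what an extreme-point result buys computationally, the sketch below treats the classical interval-polynomial special case, where robust stability of the whole family can be checked on just the four Kharitonov vertex polynomials; the thesis itself works with value sets for affine linear and multilinear uncertainty, which this example does not cover. The coefficient intervals are hypothetical.

```python
import numpy as np

def kharitonov_polynomials(lower, upper):
    """Four vertex polynomials of an interval family a_0 + a_1 s + ... + a_n s^n.

    Coefficients are listed lowest order first; the vertex patterns follow the
    classical (low, low, high, high, ...) alternation of Kharitonov's theorem.
    """
    patterns = [(0, 0, 1, 1), (0, 1, 1, 0), (1, 0, 0, 1), (1, 1, 0, 0)]
    polys = []
    for pat in patterns:
        coeffs = [upper[k] if pat[k % 4] else lower[k] for k in range(len(lower))]
        polys.append(coeffs)
    return polys

def is_hurwitz(coeffs_low_first):
    """Stability check via the polynomial roots (numpy expects highest order first)."""
    roots = np.roots(coeffs_low_first[::-1])
    return np.all(roots.real < 0)

# Hypothetical coefficient intervals for a_0..a_3 of a_0 + a_1 s + a_2 s^2 + a_3 s^3.
lower = [1.0, 2.0, 3.0, 0.5]
upper = [2.0, 3.0, 4.0, 1.0]
vertices = kharitonov_polynomials(lower, upper)
print("robustly stable:", all(is_hurwitz(p) for p in vertices))
```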
APA, Harvard, Vancouver, ISO, and other styles