
Dissertations / Theses on the topic 'Model synthesis'


Consult the top 50 dissertations / theses for your research on the topic 'Model synthesis.'


1

Andriushchenko, Roman. "Computer-Aided Synthesis of Probabilistic Models." Master's thesis, Vysoké učení technické v Brně. Fakulta informačních technologií, 2020. http://www.nusl.cz/ntk/nusl-417269.

Abstract:
This thesis addresses the problem of automated synthesis of probabilistic systems: given a family of Markov chains, how can we efficiently identify the one that satisfies a given specification? Such families frequently arise in many areas of engineering when modelling systems under uncertainty, and even the simplest synthesis questions are NP-hard. We examine existing techniques based on counterexample-guided inductive synthesis (CEGIS) and counterexample-guided abstraction refinement (CEGAR), and we propose a novel integrated method for probabilistic synthesis. Experiments on relevant models demonstrate that the proposed technique is not only comparable with state-of-the-art methods but in most cases significantly outperforms existing approaches, sometimes by several orders of magnitude.
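The CEGIS loop sketched in this abstract can be illustrated in miniature. The toy below is not the thesis's method: the "family" is just a list of candidate parameters for a hypothetical one-parameter chain, the "model check" is a plain comparison, and a failed candidate acts as a counterexample that prunes every parameter it dominates (assuming the specification is monotone in the parameter):

```python
def verify(p, threshold=0.7):
    """Hypothetical specification check: the chain parameterised by p
    reaches the target with probability >= threshold (here the 'model
    check' is just a comparison)."""
    return p >= threshold

def cegis(family, threshold=0.7):
    """Counterexample-guided inductive synthesis over a toy family:
    propose a candidate; on failure, prune every candidate the
    counterexample also rules out (spec assumed monotone in p)."""
    candidates = sorted(family)
    while candidates:
        cand = candidates[0]          # synthesizer proposes a candidate
        if verify(cand, threshold):   # verifier accepts -> done
            return cand
        # counterexample: cand fails, so every p <= cand fails as well
        candidates = [p for p in candidates if p > cand]
    return None                       # family contains no valid member
```

The real CEGIS/CEGAR machinery works over exponentially large families and uses probabilistic-model-checking counterexamples, but the propose/verify/prune structure is the same.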
2

Merrell, Paul C. Manocha Dinesh N. "Model synthesis." Chapel Hill, N.C. : University of North Carolina at Chapel Hill, 2009. http://dc.lib.unc.edu/u?/etd,2752.

Abstract:
Thesis (Ph. D.)--University of North Carolina at Chapel Hill, 2009.
Title from electronic title page (viewed Mar. 10, 2010). "... in partial fulfillment of the requirements for the degree of Doctor of Philosophy in the Department of Computer Science." Discipline: Computer Science; Department/School: Computer Science.
3

Jung, Yong-Kyu. "Model-based processor synthesis." Diss., Georgia Institute of Technology, 2001. http://hdl.handle.net/1853/14451.

4

Sather, Paula Joan. "Synthesis of cholesterol based model glycolipids." Thesis, University of British Columbia, 1990. http://hdl.handle.net/2429/29876.

Abstract:
The synthesis of glycolipids containing a variable-length polyethylene glycol spacer group between a glucuronic acid (glu) headgroup and a cholesterol (chol) tail, glu-OCH₂(CH₂OCH₂)nCH₂O-chol, is described. The homologs (n=2,3,5) were prepared by reaction of an excess of commercially available tri-, tetra- and hexaethylene glycols with cholesteryl p-toluenesulfonate. 3-O-(8-Hydroxy-3,6-dioxaoctyl)cholest-5-ene (2), 3-O-(11-hydroxy-3,6,9-trioxaundecyl)cholest-5-ene (3) and 3-O-(17-hydroxy-3,6,9,12,15-pentaoxaheptadecyl)cholest-5-ene (4) were produced, and yields depended on the amount of excess used. The headgroup was prepared by esterification and acetylation of glucuronolactone to produce methyl (1,2,3,4-tetra-O-acetyl-β-D-glucopyran)uronate, which was then brominated at the anomeric carbon to produce methyl (2,3,4-tri-O-acetyl-α-D-glucopyranosyl bromide)uronate (1). The headgroup was coupled to the cholesteroxy oligoethylene glycols by a Koenigs-Knorr-type reaction using freshly prepared silver carbonate as the catalyst. Methyl [3-O-(3,6-dioxaoctyl)cholest-5-en-3β-yl-2,3,4-tri-O-acetyl-β-D-glucopyranosid]uronate (5), methyl [3-O-(3,6,9-trioxaundecyl)cholest-5-en-3β-yl-2,3,4-tri-O-acetyl-β-D-glucopyranosid]uronate (6), and methyl [3-O-(3,6,9,12,15-pentaoxaheptadecyl)cholest-5-en-3β-yl-2,3,4-tri-O-acetyl-β-D-glucopyranosid]uronate (7) were produced in yields of up to 30%. The methyl ester and acetate protecting groups on the headgroup were removed using NaOH in a mixture of solvents, followed by acidification with HCl, to produce 3-O-(3,6-dioxaoctyl)cholest-5-en-3β-yl-β-D-glucopyranosiduronic acid (8) and 3-O-(3,6,9-trioxaundecyl)cholest-5-en-3β-yl-β-D-glucopyranosiduronic acid (9). Octaethylene glycol and dodecaethylene glycol were prepared using a solid-supported synthesis. The solid polymer used was a trityl chloride-functionalized polystyrene (1% divinylbenzene).
Mono protected tetraethylene glycol was prepared and attached to the polymer. The protecting group was removed, and the hydroxy terminal was converted to a mesylate leaving group by reaction with methane sulfonyl chloride. To elongate the chain, the anion of tetraethylene glycol was prepared using sodium hydride in DMF. The tetraethylene glycol bound resin was added, and reaction continued at 120 °C for 24 hours. Cleavage of the resultant product from the polymer support yielded octaethylene glycol. Repetition of the mesylation and elongation steps followed by cleavage yielded dodecaethylene glycol. The oligoethylene glycols were purified by passage through a Fractogel 40S gel permeation column. Two different protecting groups for the tetraethylene glycol were tried. Trialkyl silyl groups were first attempted, but were abandoned due to reduced reactivity and monitoring difficulties during the deprotection. An acetate protecting group was finally used and deprotection was monitored with infrared spectroscopy.
5

Varejao, Jorge Manuel Tavares Branco. "Towards the synthesis of model peroxidases." Thesis, University of Liverpool, 2000. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.343702.

6

Bennett, Fiona Catherine. "Design and synthesis of model peptides." Thesis, King's College London (University of London), 1998. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.299882.

7

Biris, Brilhante Virginia. "Ontology and reuse in model synthesis." Thesis, University of Edinburgh, 2003. http://hdl.handle.net/1842/25164.

Abstract:
A much pursued ontological capability is knowledge reuse and sharing. One aspect of this is reuse of ontologies across different applications, including construction of other ontologies. Another aspect is the reuse of knowledge that is founded on ontologies, which, in principle, is attainable in that an ontology renders well-defined and accessible the meaning of concepts that underlie the knowledge. In this thesis we address both these aspects of reuse, grounding the study in the problem of synthesis of structural ecological models. One basis for the design of ecological models, along with human expertise, is data properties. This motivated the construction of a formal ontology - Ecolingua - where we define concepts of the ecological data domain. The diversity of the domain led us to use the Ontolingua Server as an ontology design tool, since it makes available an extensive library of shareable ontologies. We reused definitions from several of these ontologies to design Ecolingua. However, Ecolingua's specification in the representation language of the Ontolingua Server did not easily translate into a manageable, executable specification in our choice of implementation language, which required us to develop our own complementary tools for scope reduction and translation. Still, a great deal of manual re-specification was necessary to achieve an operational, useful ontology. In sum, reuse of multiple ontologies in ontology construction employing state-of-the-art tools proved impractical. We analyse the practical problems in reuse of heterogeneous ontologies for the purpose of developing large, combined ones. In its operational form, Ecolingua is used to describe ecological data to give what we call metadata, or, said differently, instantiated ontological concepts. Synthesis is achieved by connecting metadata to model structure. Two approaches were developed to do this.
The first, realised in the Synthesis-0 system, uses metadata alone as the synthesis resource, while the second, realised in the Synthesis-R system, exploits Ecolingua to reuse existing models as an additional synthesis resource. Both Synthesis-0 and Synthesis-R are working systems implemented as logic programs. The systems were empirically evaluated and compared on run-time efficiency criteria. Results show a remarkable efficiency gain in Synthesis-R, the system with the reuse feature. We generalise the results to systems for synthesis of structural models informed by domain data. Once a data ontology and ontology-constrained synthesis mechanisms are in place, existing models can efficiently be reused to produce new models for new data.
8

Addazi, Lorenzo. "Automated Synthesis of Model Comparison Benchmarks." Thesis, Mälardalens högskola, Akademin för innovation, design och teknik, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:mdh:diva-44333.

Abstract:
Model-driven engineering promotes the migration from code-centric to model-based software development. Systems consist of model collections integrating different concerns and perspectives, while semi-automated model transformations generate executable code combining the information from these. Raising the abstraction level to models requires appropriate management technologies supporting the various software development activities. Among these, model comparison represents one of the most challenging tasks and plays an essential role in various modelling activities. Its hardness has led researchers to propose a multitude of approaches adopting different approximation strategies and exploiting specific knowledge of the involved models. However, almost no support is provided for their evaluation against specific scenarios and modelling practices. This thesis presents Benji, a framework for the automated generation of model comparison benchmarks. Given a set of differences and an initial model, users generate models resulting from the application of the former to the latter. Differences consist of preconditions, actions and postconditions expressed using a dedicated specification language. The generator converts benchmark specifications to design-space exploration problems and produces the final solutions along with a model-based description of their differences with respect to the initial model. A set of representative use cases is used to evaluate the framework against its design principles, which resemble the essential properties expected from model comparison benchmark generators.
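The precondition/action/postcondition shape of the difference specifications can be mimicked in a few lines. Everything below (the dict-based "model", the rename difference) is invented for illustration and is not Benji's actual specification language:

```python
def apply_diff(model, pre, action, post):
    """Apply one 'difference' to a model (a dict of named elements),
    guarded by a precondition and checked by a postcondition."""
    if not pre(model):
        raise ValueError("precondition not satisfied")
    result = action(dict(model))  # act on a copy of the initial model
    if not post(result):
        raise ValueError("postcondition not satisfied")
    return result

# Invented example difference: rename class 'Person' to 'Employee'
benchmark_model = apply_diff(
    {"Person": ["name", "age"]},
    pre=lambda m: "Person" in m,
    action=lambda m: {("Employee" if k == "Person" else k): v
                      for k, v in m.items()},
    post=lambda m: "Employee" in m and "Person" not in m,
)
```

A generated benchmark would then pair the initial model, the resulting model, and a description of the applied differences as the expected comparison output.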
9

Ligon, Thomas (Thomas Crumrine). "Automated inter-model parameter connection synthesis for simulation model integration." Thesis, Massachusetts Institute of Technology, 2007. http://hdl.handle.net/1721.1/39887.

Abstract:
Thesis (S.M.)--Massachusetts Institute of Technology, Dept. of Mechanical Engineering, 2007.
Includes bibliographical references (p. 73-75).
New simulation modeling environments have been developed in which multiple models can be integrated into a single model. This conglomeration of model data allows designers to better understand the physical phenomenon being modeled. Models are integrated by creating connections between their interface parameters, referred to as parameter mapping; parameters are either shared by common models or flow from the output of one model to the input of a second model. However, the process of integrating simulation models is time consuming, and this development time can outweigh the benefit of the increased understanding. This thesis presents two algorithms that are designed to automatically generate and suggest these parameter mappings. The first algorithm attempts to identify previously built integration model templates that have a similar function. Model interfaces and integration models are represented by attributed graphs: interface graph nodes represent interface parameters and arcs relate the input and output parameters, while integration model graph nodes represent interface graphs and arcs represent parametric connections between interface graph nodes.
A similarity-based pattern matching algorithm initially compares interface graphs in two integration model graphs. If the interface graphs are found to match, the algorithm attempts to apply the template integration model's parameter mappings to the new integration model. The second algorithm compares model interface parameters directly, using similarity measures developed for the pattern matching algorithm. Parameter pairs that are found to be very similar are processed using a set of model integration rules and logic, and pairs that fit these criteria are mapped together. Both algorithms were implemented in Java and integrated into the modeling environment DOME (Distributed Object-based Modeling Environment). A small set of simulation models was used to build both new and template integration models in DOME. Tests were conducted by recording the time required to build these integration models manually and using the two proposed algorithms. Integration times were generally ten times faster, but some inconsistencies and mapping errors did occur. In general the results are very promising, but a wider variety of models should be used to test these two algorithms.
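The second algorithm's core idea, mapping parameter pairs whose similarity exceeds a threshold, can be approximated with a plain string-similarity measure. The names, the threshold, and the use of `difflib` here are illustrative assumptions, not the thesis's actual similarity measures:

```python
from difflib import SequenceMatcher

def similarity(a, b):
    """Case-insensitive string similarity, a crude stand-in for the
    attributed-graph similarity measures described in the thesis."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def suggest_mappings(outputs, inputs, threshold=0.6):
    """Suggest output->input parameter mappings whose names are
    sufficiently similar; dissimilar pairs are left unmapped."""
    pairs = []
    for o in outputs:
        best = max(inputs, key=lambda i: similarity(o, i))
        if similarity(o, best) >= threshold:
            pairs.append((o, best))
    return pairs
```

For example, `suggest_mappings(["wingSpan", "mass"], ["wing_span", "color"])` maps only the near-identical name pair; a production tool would add unit and type checks before committing a mapping.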
10

Troeng, Tor. "Frequency Response Analysis using Component Mode Synthesis." Thesis, Umeå universitet, Institutionen för matematik och matematisk statistik, 2010. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-37809.

Abstract:
Solutions to physical problems described by differential equations on complex domains are, except for special cases, almost impossible to find. This turns our interest toward numerical approaches. Since the size of the numerical models tends to be very large when handling complex problems, the area of model reduction is always a hot topic. In this report we look into a model reduction method called Component Mode Synthesis. This can be described as dividing a large and complex domain into smaller and more manageable ones. On each of these subdomains, we solve an eigenvalue problem and use the eigenvectors as a reduced basis. Depending on the required accuracy we might want to use many or few modes in each subdomain; this opens for an adaptive selection of which subdomains affect the solution most. We cover two numerical examples where we solve the Helmholtz equation in a linear elastic problem. The first example is a truss and the other a gear wheel. In both examples we use an adaptive algorithm to refine the reduced basis and compare the results with a uniform refinement and with a classic model reduction method called Modal Analysis. We also introduce a new approach where the coupling modes are computed only on the adjacent subdomains.
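The reduction step at the heart of Component Mode Synthesis, projecting a subdomain operator onto a small modal basis, can be shown with plain lists. In a real CMS code the basis Phi would hold eigenvectors of the subdomain eigenvalue problem; here it is a single hand-picked assumed mode, so this is only a sketch of the Galerkin projection K_r = Phiᵀ K Phi:

```python
def matmul(A, B):
    """Plain-list matrix product (A is m x k, B is k x n)."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def transpose(A):
    return [list(row) for row in zip(*A)]

def reduce_model(K, Phi):
    """Galerkin projection K_r = Phi^T K Phi: the retained columns of
    Phi play the role of the subdomain eigenvectors kept in CMS."""
    return matmul(transpose(Phi), matmul(K, Phi))

# 3-DOF spring-chain stiffness, reduced onto one assumed mode
K = [[2, -1, 0], [-1, 2, -1], [0, -1, 2]]
Phi = [[1], [1], [1]]  # hand-picked mode, NOT a computed eigenvector
K_r = reduce_model(K, Phi)  # a 1x1 reduced stiffness
```

Adding or dropping columns of Phi is exactly the "many or few modes per subdomain" dial the adaptive algorithm turns.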
11

Taylor, Tammye L. "UV photochemistry of synthetic model peptides." Thesis, Georgia Institute of Technology, 1994. http://hdl.handle.net/1853/26966.

12

Landelius, Tomas. "Reinforcement Learning and Distributed Local Model Synthesis." Doctoral thesis, Linköpings universitet, Bildbehandling, 1997. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-54348.

Abstract:
Reinforcement learning is a general and powerful way to formulate complex learning problems and acquire good system behaviour. The goal of a reinforcement learning system is to maximize a long-term sum of instantaneous rewards provided by a teacher. In its extremum form, reinforcement learning only requires that the teacher can provide a measure of success. This formulation does not require a training set with correct responses, and allows the system to become better than its teacher. In reinforcement learning much of the burden is moved from the teacher to the training algorithm. The exact and general algorithms that exist for these problems are based on dynamic programming (DP), and have a computational complexity that grows exponentially with the dimensionality of the state space. These algorithms can only be applied to real world problems if an efficient encoding of the state space can be found. To cope with these problems, heuristic algorithms and function approximation need to be incorporated. In this thesis it is argued that local models have the potential to help solving problems in high-dimensional spaces and that global models have not. This is motivated by the bias-variance dilemma, which is resolved with the assumption that the system is constrained to live on a low-dimensional manifold in the space of inputs and outputs. This observation leads to the introduction of bias in terms of continuity and locality. A linear approximation of the system dynamics and a quadratic function describing the long-term reward are suggested to constitute a suitable local model. For problems involving one such model, i.e. linear quadratic regulation problems, novel convergence proofs for heuristic DP algorithms are presented. This is one of the few available convergence proofs for reinforcement learning in continuous state spaces. Reinforcement learning is closely related to optimal control, where local models are commonly used.
Relations to present methods are investigated, e.g. adaptive control, gain scheduling, fuzzy control, and jump linear systems. Ideas from these areas are compiled in a synergistic way to produce a new algorithm for heuristic dynamic programming where function parameters and locality, expressed as model applicability, are learned on-line. Both top-down and bottom-up versions are presented. The emerging local models and their applicability need to be memorized by the learning system. The binary tree is put forward as a suitable data structure for on-line storage and retrieval of these functions.
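For the linear-quadratic local models described above, the long-term cost is quadratic and can be found by repeated dynamic-programming backups. A minimal scalar sketch (not the thesis's algorithm) iterates the discrete-time Riccati update for dynamics x' = a·x + b·u with stage cost q·x² + r·u²:

```python
def riccati_fixed_point(a, b, q, r, iters=200):
    """Scalar discrete-time Riccati iteration: the quadratic cost
    coefficient P converges under repeated DP backups
    P <- q + a^2 P - (a b P)^2 / (r + b^2 P)."""
    P = q
    for _ in range(iters):
        P = q + a * a * P - (a * b * P) ** 2 / (r + b * b * P)
    return P

def lqr_gain(a, b, q, r):
    """Optimal feedback gain u = -K x implied by the converged cost."""
    P = riccati_fixed_point(a, b, q, r)
    return (a * b * P) / (r + b * b * P)
```

For a = b = q = r = 1 the fixed point satisfies P² = 1 + P, i.e. P is the golden ratio, a convenient sanity check for the iteration.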
13

Argyropoulos, Dimitris S. "Synthesis and degradation of model network polymers." Thesis, McGill University, 1985. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=72032.

Abstract:
Theoretical expressions essentially based on the Flory-Stockmayer statistics of gelation were experimentally examined for their applicability beyond the gel point. By studying the crosslinking process of a polyester network formed from 1,3,5-benzenetriacetic acid and 1,10-decamethylene glycol beyond the gel point, the validity of the expressions was quantitatively confirmed, and their limitations were delineated.
On stepwise degradation of a similar network, increasingly large soluble fractions were obtained at each step, and their weight-average molecular weights increased as the degelation point was approached. The molecular weights and distributions of these fractions were in close quantitative agreement with theory, i.e., they represented a near-mirror image of the molecular weights of sol fractions obtained on crosslinking beyond the gel point. Similar results were obtained by degrading a network prepared by the random crosslinking of monodisperse primary chains of polystyrene.
Experimental support was thus obtained for treating random network degradation by reversing the statistics of the Flory-Stockmayer theory of gelation.
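For context (recalled here, not stated in the abstract), the Flory-Stockmayer statistics place the gel point of a stoichiometric A_f + B_2 polycondensation, such as the trifunctional acid (f = 3) plus diol system above, at the critical conversion

```latex
p_c = \frac{1}{\sqrt{f-1}}
\qquad\Rightarrow\qquad
p_c = \frac{1}{\sqrt{2}} \approx 0.707 \quad (f = 3).
```

The sol-fraction expressions examined beyond the gel point, and their reversal for degelation, follow from the same branching-probability argument.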
14

Berber, Karatepe Gulen. "Model studies towards the synthesis of gelsemine." Thesis, University of East Anglia, 1999. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.302082.

15

McPheators, Gary. "Model studies toward the synthesis of vinblastine." Thesis, University of Strathclyde, 2003. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.288739.

16

McGuire, Thomas M. "Model studies towards the synthesis of vinblastine." Thesis, University of Strathclyde, 2005. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.423874.

17

Kypuros, Javier Angel. "Variable structure model synthesis for switched systems /." Full text (PDF) from UMI/Dissertation Abstracts International, 2001. http://wwwlib.umi.com/cr/utexas/fullcit?p3008373.

18

Yu, Qingzhao. "Bayesian synthesis." Columbus, Ohio : Ohio State University, 2006. http://rave.ohiolink.edu/etdc/view?acc%5Fnum=osu1155324080.

19

Hawker, Craig Jon. "Model studies on the spiro intermediate." Thesis, University of Cambridge, 1988. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.293833.

20

Narnur, Soumya. "Model Development, Synthesis and Validation Using the Modeler's Assistant." Thesis, Virginia Tech, 1999. http://hdl.handle.net/10919/34500.

Abstract:
This thesis discusses the Modeler's Assistant, an interactive graphics tool that aids in the rapid development of VHDL models. The tool provides modeling, test bench generation, simulation, synthesis and validation features. The Process Model Graph, which has representations for the concurrent processes, is used as the basis for the Modeler's Assistant. A test generation environment is integrated into the tool, and a range of test bench options is provided to the user. The tool interfaces to the Synopsys VHDL analyzer, graphics debugger and synthesis tools. Validation of the behavioral model against the synthesized structural model is also discussed. A detailed programming manual with many examples is provided for the benefit of the user.
21

Bozianu, Rodica. "Synthesis of Interactive Reactive Systems." Thesis, Paris Est, 2016. http://www.theses.fr/2016PESC1026/document.

Abstract:
We study the problem of automatic synthesis of programs in multi-component architectures such that they satisfy the specifications by construction. The main goal of the thesis is to develop procedures for solving the synthesis problem that may lead to efficient implementations. Each component may have partial observation of the global state of the multi-component system. The synthesis problem therefore asks to provide observation-based protocols for the components to be synthesized which ensure that the specifications hold on all interactions with their environment. The environment may be antagonistic, or may have its own objectives and behave rationally. We first study the synthesis problem when the environment is presumed to be completely antagonistic. For this setting, we propose a "Safraless" procedure for the synthesis of one partially informed component against an omniscient environment from KLTL+ specifications. It is implemented in the tool Acacia-K. Secondly, we study the synthesis problem when the components in the environment have their own objectives and are rational. For the more relaxed setting of perfect information, we provide tight complexities for particular omega-regular objectives. For the case of imperfect information, we prove that the rational synthesis problem is undecidable in general, but we regain decidability when asked to synthesize only one component against a rational, omniscient environment.
22

Bovee, Matthew J. "The synthesis and characterization of a polymer-supported cellulose model." Diss., Georgia Institute of Technology, 1987. http://hdl.handle.net/1853/5810.

23

Nguyen, Mai Thanh Thi. "Synthesis and Oxidation of Lignin-Carbohydrate Model Compounds." Fogler Library, University of Maine, 2008. http://www.library.umaine.edu/theses/pdf/NguyenMTT2008.pdf.

24

Farges, Eric P. "An analysis-synthesis hidden Markov model of speech." Diss., Georgia Institute of Technology, 1987. http://hdl.handle.net/1853/14775.

25

Williams, Douglas. "Human Systems Integration Synthesis Model for Ship Design." Thesis, Monterey, California. Naval Postgraduate School, 2012. http://hdl.handle.net/10945/17477.

Abstract:
Current fiscal constraints are driving the reduction of system life cycle cost (LCC). A key objective of Human Systems Integration (HSI) is the reduction of operational cost and the improvement of operational performance. This thesis seeks to develop an HSI Synthesis Model for Ship Design. This model is based on the premise that ship design characteristics interact with the domains of HSI. The thesis begins with a historical overview of ship architecture and technology and their interactions with the domains of HSI. The HSI Synthesis Model for Ship Design was developed using the framework of the Naval Postgraduate School's Systems Engineering Ship Synthesis Model. Quantitative data analysis was conducted using Offshore Patrol Vessel (OPV) design data from the Information Handling Services (IHS) Jane's database. The data analyzed included 35 ships from 21 nations. Multiple regression analysis consisted of nine independent ship design variables and a response variable of manpower. Data analysis revealed that ship length and ship draught were statistically significant. The proposed HSI Synthesis Model accounted for 49 per cent of the variance of crew complement. This thesis lays the foundation for future qualitative and quantitative analysis of the interaction between ship design characteristics and HSI domains. Additionally, it provides an initial HSI model that can be expanded by including additional HSI domains and, ultimately, may lead to a viable design tool for HSI practitioners and systems engineers.
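The regression step described in this abstract can be sketched with ordinary least squares. The thesis fitted nine design variables; the single-predictor closed form below, with made-up length/crew numbers (not the IHS Jane's data), only illustrates the mechanics:

```python
def fit_line(xs, ys):
    """Closed-form ordinary least squares for one predictor."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

def r_squared(xs, ys, slope, intercept):
    """Fraction of crew-complement variance explained by the fit."""
    my = sum(ys) / len(ys)
    ss_res = sum((y - (slope * x + intercept)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return 1 - ss_res / ss_tot

# Invented (length in metres, crew) pairs for illustration only
lengths = [55.0, 62.0, 75.0, 80.0, 90.0]
crews = [24, 30, 40, 42, 50]
slope, intercept = fit_line(lengths, crews)
```

A multiple regression over nine predictors solves the same normal equations in matrix form, and the thesis's 49 per cent figure is exactly this R² statistic for its fitted model.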
26

Kayembe, Jean Pierre. "Synthesis and fluorescent labelling of model mycolic acids." Diss., University of Pretoria, 2014. http://hdl.handle.net/2263/79255.

Abstract:
Mycolic acids (MAs) are long-chain α-alkyl-β-hydroxy fatty acids that form part of the cell wall of Mycobacterium species and a few other genera. They play an important role in steering the host-pathogen relationship to establish active TB disease. These compounds are recognized by antibodies and therefore show potential for use in new diagnostic techniques such as biosensor assays. Previous studies have shown that MA methyl esters were not antigenic, while fluorescein-labelled MA maintained antigenicity. It was proposed that this was due to the presence of the carboxylic acid group on the fluorescein label, which substituted for the one on MA that became esterified during conjugation. However, the existence of the free carboxylic acid on fluorescein after labelling of MA was uncertain, as fluorescein labelling of palmitic acid apparently produced two structural forms, the free acid and the lactone, as a mixture of tautomers. Furthermore, the original biological studies were not done on characterized material, and activity may have been due to the presence of some unreacted MA. Here, the synthesis of a corynomycolic acid homologue is reported via two routes: (a) Claisen condensation followed by reduction or (b) aldol condensation. Due to the cost and poor quality of commercial "5-BromoMethylFluorescein", this reagent had to be synthesised in the laboratory. The synthetic corynomycolic acid homologue made in this study and a mixture of naturally occurring bovine mycolic acids obtained from Sigma-Aldrich were labelled with freshly prepared 4(5)-BMF. NMR characterization of the fluorescein-labelled MA showed the presence of a carboxylic acid group on the fluorescein, suggesting that it is likely to maintain biological activity.
27

Jose, Bijoy Antony. "Formal Model Driven Software Synthesis for Embedded Systems." Diss., Virginia Tech, 2011. http://hdl.handle.net/10919/28622.

Abstract:
Due to the ever increasing complexity of safety-critical applications, handwritten code is being replaced by automatically generated code derived from a high-level specification. Code generation from a high-level specification requires a model of computation with an underlying formalism and correctness-preserving refinement steps to generate the lower-level application code. Such software synthesis techniques are said to be "correct-by-construction". Synchronous programming languages such as Esterel and LUSTRE, which are based on a synchronous model of computation, are used for sequential code generation. They work on a synchrony assumption (zero-time intra-process computation and zero-time inter-process communication) at the specification level. Early versions of synchronous languages followed an execution pattern where an iteration of software was mapped to an interval between ticks of an external reference clock. Since this external reference tick was unrelated to variables (or signals) within the software, redundant operations such as reading of ports and computation of guards were performed for each tick. In this dissertation, we highlight some of these performance issues and missed optimization opportunities. We also show how a multi-clock (or polychronous) formalism, where each variable has an independent rate of execution associated with it, can avoid these problems. An existing polychronous language named SIGNAL creates a hierarchy of clocks, based on the rate of execution of individual variables, to form a root clock which acts as a reference tick. We seek to replace the clock analysis with a technique to form a unique order of events without a reference timeline. For this purpose, we present a new polychronous formalism termed Multi-rate Instantaneous Channel connected Data Flow (MRICDF). Our new synthesis technique inspects the specification to identify a master trigger at the Boolean equation level to act as the reference tick.
Furthermore, we attempt to make polychronous-specification-based software synthesis more accessible to practicing engineers by constructing a software tool, EmCodeSyn, with a visual environment for specification and a more intuitive analysis technique. Our Boolean approach to sequential synthesis of embedded software has multiple implementations, each of which utilizes existing academic software tools. Optimizations are proposed to minimize synthesis time by simplifying the input to these external tools. Weaknesses in causal loop analysis techniques applied by existing synthesis tools are highlighted, and solutions for performing time-efficient loop analysis are integrated into EmCodeSyn. We have also determined that a subset of non-synthesizable polychronous specifications can be used to generate correct multi-threaded code. Additionally, we investigate composition of polychronous modules and propose properties that are necessary to guarantee agreement on shared signals.
Ph. D.
APA, Harvard, Vancouver, ISO, and other styles
28

Faggi, Simone. "An Evaluation Model For Speech-Driven Gesture Synthesis." Master's thesis, Alma Mater Studiorum - Università di Bologna, 2021. http://amslaurea.unibo.it/22844/.

Full text
Abstract:
The research and development of embodied agents with advanced relational capabilities is constantly evolving. In recent years, the development of behavioural signal generation models to be integrated into social robots and virtual characters has been moving from rule-based to data-driven approaches, requiring appropriate and reliable evaluation techniques. This work proposes a novel machine learning approach for the evaluation of speech-to-gesture models that is independent of the audio source. This approach enables the measurement of the quality of gestures produced by these models and provides a benchmark for their evaluation. Results show that the proposed approach is consistent with evaluations made through user studies and, furthermore, that its use allows for a reliable comparison of state-of-the-art speech-to-gesture models.
APA, Harvard, Vancouver, ISO, and other styles
29

Li, Kaichang. "Synthesis of lignin-carbohydrate model compounds and neolignans." Diss., This resource online, 1996. http://scholar.lib.vt.edu/theses/available/etd-06062008-152716/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
30

Ciszewski, Gregory M. "Synthesis of Bis(2,2,2-trifluoroethyl) Phosphonates: A New Model for the Synthesis of Phosphonates." Connect to online version at OhioLINK ETD Connect to online version at Digital.Maag, 1998. http://hdl.handle.net/1989/4786.

Full text
APA, Harvard, Vancouver, ISO, and other styles
31

Kerns, Corey Michael. "Naval Ship Design and Synthesis Model Architecture Using a Model-Based Systems Engineering Approach." Thesis, Virginia Tech, 2011. http://hdl.handle.net/10919/32459.

Full text
Abstract:
The Concept and Requirements Exploration process used at Virginia Tech is based on a Multi-Objective Optimization approach that explores the design space to produce a Non-Dominated set of ship design solutions ranked objectively by Cost, Risk, and Effectiveness. Prior research and effort has also been made to leverage the validation and verification of the U.S. Navy's ship synthesis design tool, ASSET, into the Virginia Tech Ship Synthesis Model. This thesis applies Design Structure Matrix theory to analyze and optimize the ASSET synthesis process by reducing or removing the feedback dependencies that require the iterative convergence process. This optimized ASSET synthesis process is used as the basis to develop a new Simplified Ship Synthesis Model (SSSM) using Commercial Off-The-Shelf (COTS) software, ASSET Response Surface Models (RSMs) and simplified parametric equations to build the individual synthesis modules. The current method of calculating an Overall Measure of Effectiveness (OMOE) used at Virginia Tech is based on expert opinion and pairwise comparison. This thesis researches methods for building a Design Reference Mission (DRM) composed of multiple operational situations (OpSits) required by the ship's mission. The DRM is defined using a Model Based Systems Engineering (MBSE) approach and an overall Ship Design System Architecture to define and understand the relationships between various aspects of the ship design. The system architecture includes the DRM and enables the development of Operational Effectiveness Models (OEMs) as an alternative to an expert opinion-based OMOE. The system architecture also provides the means for redefining and optimizing the entire ship design process by capturing the entire process and all related data into a single repository. This thesis concludes with a preliminary assessment of the utility of these various system engineering tools to the naval ship design process.
Master of Science
APA, Harvard, Vancouver, ISO, and other styles
32

OUAISS, IYAD. "HIERARCHICAL MEMORY SYNTHESIS IN RECONFIGURABLE COMPUTERS." University of Cincinnati / OhioLINK, 2002. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1033498452.

Full text
APA, Harvard, Vancouver, ISO, and other styles
33

Donnelly, Ronald F. "Model studies for the synthesis of taxoid side-chains." Thesis, University of Ottawa (Canada), 1996. http://hdl.handle.net/10393/9466.

Full text
Abstract:
The objectives of this thesis were to make progress toward the synthesis of the side-chain of paclitaxel. The model systems chosen were benzoyl-1,3-dithiane, (1-phenyl-2-(1,3-dithiacyclohexyl)ethanone) and hydrocinnamoyl-1,3-dithiane, (4-phenyl-1-(1,3-dithiacyclohexyl)butan-2-one). This work consisted of screening several acyl equivalents to determine which one could best be used for a one carbon homologation. Enantioselective reduction of the carbonyl group, using three different chiral reducing agents, was also studied. Studies were also conducted to determine the most efficient manner in which to unmask the carbonyl group and convert it to the more stable methyl ester. (Abstract shortened by UMI.)
APA, Harvard, Vancouver, ISO, and other styles
34

Patwardhan, Rohit S. "Studies in synthesis and analysis of model predictive controllers." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1999. http://www.collectionscanada.ca/obj/s4/f2/dsk1/tape8/PQDD_0016/NQ46902.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
35

Yin, Lijun. "Facial expression analysis and synthesis for model based coding." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 2000. http://www.collectionscanada.ca/obj/s4/f2/dsk1/tape3/PQDD_0011/NQ59702.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
36

Kovur, Srinivasulu Raju. "Synthesis and Enzymatic Oxidation of Model Lignin Carbohydrate Complexes." Fogler Library, University of Maine, 2008. http://www.library.umaine.edu/theses/pdf/KovurSR2008.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
37

Stein, Benno [Verfasser]. "Model construction in analysis and synthesis tasks / Benno Stein." Paderborn : Universitätsbibliothek, 2001. http://d-nb.info/1036971635/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
38

Forsythe, W. Graham. "The synthesis and transformation of novel lignin model oligomers." Thesis, Queen's University Belfast, 2014. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.675451.

Full text
Abstract:
Lignin is a complex, interlinked, non-repeating, heterogeneous bio-polymer found in wood which consists of phenylpropane subunits. The phenolic monomers of lignin are linked via various ether and carbon-carbon bonds. It is the second most abundant biopolymer after cellulose, accounting for 15-30 % of biomass and it has the potential to be a renewable source of small aromatic feedstock molecules. The synthesis of lignin model compounds plays an important role in both the elucidation of lignin's structure and in modelling the conditions required for the production of valuable feedstock molecules from lignin depolymerisation. The aim of the research presented in this thesis has been to provide an efficient route to multi-gram quantities of lignin model compounds which are more representative of lignin than the low molecular weight models typically employed in previously published work. The syntheses of several hexameric lignin models as well as an octamer, all of which contain three of the most common linkages in the native polymer have been carried out. The synthetic methodology used improves upon existing methods of preparing higher molecular weight lignin models in both overall yield and efficiency as well as practically on a multi-gram scale. Several of these compounds were then subjected to published procedures aimed at the transformation of lignin and model compounds into added value fine chemical intermediates. Extensive analysis of the major products was performed leading to some preliminary mechanistic insights into lignin oxidative and reductive depolymerisation chemistry.
APA, Harvard, Vancouver, ISO, and other styles
39

Woźniak, Ernest. "Model-based Synthesis of Distributed Real-time Automotive Architectures." Thesis, Paris 11, 2014. http://www.theses.fr/2014PA112145/document.

Full text
Abstract:
Hardware/software-based solutions play a significant role in the automotive domain. It is increasingly common that functions once implemented mechanically are realized in today's cars through software and hardware. This tendency leads to a substantial number of functions operating as sets of software components deployed onto hardware entities, i.e. Electronic Control Units (ECUs). As a consequence, the overall amount of embedded code is estimated at tens of gigabytes and the number of ECUs exceeds 100, so the industrial state-of-the-practice development approaches are becoming inefficient. The objective of this thesis is to add to the current efforts to employ Model Driven Engineering (MDE) in the context of automotive SW/HW architecture design. A first set of contributions relates to guided strategies supporting the key engineering activities of the automotive methodology established by the EAST-ADL2 language and the AUTOSAR standard, the main one being the integration of the software architecture with the hardware platform. Although the amount of work on synthesis is substantial, this thesis identifies shortcomings that prevent existing approaches from fully supporting the EAST-ADL2/AUTOSAR methodology, and delivers new techniques that overcome these deficiencies. A second set of contributions concerns modeling approaches. The use of general-purpose modeling languages such as SysML and MARTE, although beneficial, has not yet been fully exploited by automotive OEMs (Original Equipment Manufacturers). This especially relates to the modeling of analyzable input and of optimization concerns, which would enable triggering analysis and optimization directly from the model level. This work shows a way forward and defines the additional concepts necessary to construct analysis and optimization models for these systems
APA, Harvard, Vancouver, ISO, and other styles
40

Paul, Uchenna Prince. "Microkinetic Model of Fischer-Tropsch Synthesis on Iron Catalysts." Diss., CLICK HERE for online access, 2008. http://contentdm.lib.byu.edu/ETD/image/etd2535.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
41

Brown, Ross A. "Image synthesis based on a model of human vision." Thesis, Queensland University of Technology, 2003. https://eprints.qut.edu.au/31368/1/31368.pdf.

Full text
Abstract:
Modern computer graphics systems are able to construct renderings of such high quality that viewers are deceived into regarding the images as coming from a photographic source. Large amounts of computing resources are expended in this rendering process, using complex mathematical models of lighting and shading. However, psychophysical experiments have revealed that viewers only regard certain informative regions within a presented image. Furthermore, it has been shown that these visually important regions contain low-level visual feature differences that attract the attention of the viewer. This thesis will present a new approach to image synthesis that exploits these experimental findings by modulating the spatial quality of image regions by their visual importance. Efficiency gains are therefore reaped, without sacrificing much of the perceived quality of the image. Two tasks must be undertaken to achieve this goal. Firstly, the design of an appropriate region-based model of visual importance, and secondly, the modification of progressive rendering techniques to effect an importance-based rendering approach. A rule-based fuzzy logic model is presented that computes, using spatial feature differences, the relative visual importance of regions in an image. This model improves upon previous work by incorporating threshold effects induced by global feature difference distributions and by using texture concentration measures. A modified approach to progressive ray-tracing is also presented. This new approach uses the visual importance model to guide the progressive refinement of an image. In addition, this concept of visual importance has been incorporated into supersampling, texture mapping and computer animation techniques. Experimental results are presented, illustrating the efficiency gains reaped from using this method of progressive rendering. 
This visual importance-based rendering approach is expected to have applications in the entertainment industry, where image fidelity may be sacrificed for efficiency purposes, as long as the overall visual impression of the scene is maintained. Different aspects of the approach should find many other applications in image compression, image retrieval, progressive data transmission and active robotic vision.
APA, Harvard, Vancouver, ISO, and other styles
42

BUTT, SHAHZAD AHMAD. "Model-based High Level Synthesis and Design Space Exploration." Doctoral thesis, Politecnico di Torino, 2013. http://hdl.handle.net/11583/2507516.

Full text
Abstract:
Hardware blocks are used to execute very different Digital Signal Processing tasks, ranging from simple audio applications to state-of-the-art LTE mobile communication and high definition video processing. Users expect continuous and very rapid innovation, and each new product to be more energy efficient and more powerful in terms of functionality than the previous generation. These user-driven market requirements force companies to continuously struggle to design new devices with novel features and improved performance. Also, the large number of semiconductor companies (especially fab-less) results in strong competition, especially in terms of time to market. In summary, two key factors that determine the success of a new product are its differentiating features and its design time. These requirements ultimately translate into an increased pressure on the hardware and software designers to come up with new and innovative designs in short time spans, which can only be achieved with novel design tools and flows. For hardware design, the standard RTL-based flow employed in the industry consists of using pre-built and pre-verified hardware IPs plus some customized blocks that are specially designed for new devices. These customized hardware blocks are one of the main ingredients that enable product differentiation in the market. These blocks are normally designed using a procedure that consists of two major design and modeling steps. The first step models the algorithm in some C-like language that allows algorithmic analysis and verification. The second step manually implements a hardware architecture coded in a hardware description language. The manual translation involved in the second step requires a long time, and in theory it should be iterated for many possible hardware architectures, in order to find the best one. This hardware Design Space Exploration is especially lengthy because verification must be repeated for each new RTL implementation.
The best way to cope with these issues is to use hardware synthesis tools that allow modeling in a C-like language at the algorithmic level, and can directly and quickly perform design space exploration with as little manual effort as possible. This observation is the basic motivation for the methods and results described in this thesis. The model-based design paradigm has been studied extensively by the research community to raise the level of abstraction for design, verification and synthesis of hardware and software. The scope of the term is very broad, and it cannot be captured easily in a few lines. However, in the scope of hardware and software design for embedded systems, model-based design can be defined as a design paradigm that allows one to rapidly design, verify, validate and implement hardware and software systems starting from pre-verified abstract component models. Model-based design usually starts from abstract graphical drag-and-drop models which are usually modeled in a proprietary way (as in the case of Simulink) or in a loosely standardized way (as in the case of the UML). The components used for modeling range from simple atomic blocks like arithmetic operations to complex blocks like a Fast Fourier Transform, a Finite Impulse Response filter, a Viterbi decoder, a Discrete Cosine Transform etc. The entire algorithm is modeled in this way, and then automatically synthesized to an implementation-dependent model, in order to enhance re-use under different application scenarios. Model-based design is normally carried out using frameworks and design environments, such as Simulink or Labview, that are equipped with powerful model-to-model translation tools. These tools can be used to implement different low level models for software and hardware simulation and implementation from a single "golden" verified algorithmic model.
Most of the model-based design tools that are available in industry and academia today allow automatic software code generation from graphical models. The software code generators can generate optimized code that can come close to hand-optimized code in terms of size and performance, because they exploit information about the target processor architecture and the associated memory hierarchy. Model-based design tools also offer the capability to generate hardware from graphical models. However, they either rely on parameterized hand-optimized implementations of the larger macro blocks, or require the designer to use smaller blocks at an abstraction level that is very close to RTL. In other words, these tools rely either on pre-modeled intellectual property blocks (IPs) with fixed hardware architecture or tend to decrease the abstraction level in a proprietary manner. The reliance on a single architectural template can result in a sub-optimal implementation if this architecture is not suitable for a particular application, while when working at a lower abstraction level, design space exploration again becomes very expensive, essentially defeating one of the main purposes of model-based design. Simulink (from The MathWorks) is a very powerful modeling, simulation and analysis tool that is used in this thesis as the starting point for model-based design. It has a huge set of libraries and allows modeling systems and algorithms from very different application domains, ranging from embedded control to video and communication systems. It includes many parameterized built-in components that can be used for modeling, as well as components that allow easy mathematical and graphical analysis of data in the time and frequency domains, thus making debugging and verification very easy and intuitive. Simulink also has tools that perform fixed-point analysis and ease the transition from floating- to fixed-point models.
Simulink comes bundled with code generators that can produce C/C++ code from models for a variety of embedded processor architectures; these code generators are called Target Language Compilers (TLCs) in Simulink terminology. Simulink also supports hardware generation from graphical models but, as explained above, for complex algorithms it relies on parameterizable fixed architectural templates, which are not flexible enough to target different application domains with different requirements, or it requires modeling at a low abstraction level, resulting in longer design times. The research presented in this thesis uses Simulink as a graphical modeling front-end for hardware synthesis and hardware/software trade-off analysis. It proposes a novel design flow, which incorporates a more flexible and powerful approach for hardware synthesis and design space exploration than the industrial and research state of the art. Several high level synthesis (HLS) tools are available to synthesize RTL hardware from C/C++/SystemC abstract specifications. Normally these high level synthesis tools also require a set of constraints which drive them to synthesize a specific macro- and micro-architecture from the C-like specification. Hence Design Space Exploration can be performed only by changing these constraints, while preserving the same pre-verified C code. Simulink used as a modeling front-end to high level hardware synthesis results in a very powerful flow that allows flexible hardware design and synthesis. Since Simulink supports code generation for software implementation, hardware/software trade-off analysis can also be performed easily starting from Simulink models. During the course of this thesis, Simulink was evaluated as a graphical modeling front-end for high level hardware synthesis and hardware/software trade-off analysis.
The tool that was used for high level hardware synthesis is Cadence C-to-Silicon, or CtoS. In the input, C++ constructs are used to specify the algorithm, while SystemC constructs are used for more hardware-oriented artifacts, such as bit-true IOs and timing. Hence our goal is to define a methodology to convert the C output from Simulink and efficiently synthesize it with CtoS. In particular, by exploring the variety of TLCs available from Simulink, we found that the Embedded Real Time coder (ERT), which was specifically designed to produce readable and optimized code for embedded processors, can generate C code that is well suited for high level hardware synthesis. ERT has many optimizations that can be selectively enabled or disabled to generate code that is suitable for a particular application. We first analyzed them to understand which ones can be used to generate code that is best suited for high level hardware synthesis. Different graphical modeling options were also analyzed to enable hierarchical and modular code generation, which is especially useful when model partitioning is required in order to obtain a more concurrent hardware implementation. Simulink allows one to easily control the level of granularity, by grouping sub-designs into aggregate blocks that can become single function calls in the generated C code. In order to illustrate how one can perform hardware/software trade-off analysis, we used a case study from the domain of wireless video surveillance sensor networks, where cameras are triggered to capture video whenever an interesting audio event is detected by an audio detector. The platform that was used as the implementation target consisted of an ARM Cortex-M3 processor. The audio algorithm was modeled in Simulink and then the model was grouped into aggregate Simulink blocks to generate modular code through ERT.
This modular code was then used in the form of different C-code partitions to perform extensive hardware/software trade-off analysis, where final implementation decisions were made based on very accurate low level power/area/throughput estimates. An accurate picture of power, area and throughput was easily generated because most of the design space exploration was performed using CtoS with minor or no changes to the C-model. Very simple and abstract architectural constraints are used to automatically generate the low level synthesis scripts for both CtoS and the downstream tools (e.g. RTL Compiler). Once a C-model is generated from Simulink through ERT, it can be synthesized using CtoS by providing architectural constraints. But in order to efficiently perform design space exploration, one must gain a high level understanding of the computationally most expensive loops and code segments. This may again become a complex and tedious task for automatically generated code. Hence a very important part of our research was to devise and implement a mechanism that can shield the designer from understanding the automatically generated code during high level synthesis. We implemented an automated Tcl script to perform automatic design space exploration. This tool takes as input a set of simple high-level directives that limit the design space to be explored, and then produces many points in the design space that can be used for further low level throughput, power and area analysis. The directives specify, for example, how many resources such as multipliers or adders must be considered, what type of memories must be used, how many operations should be scheduled in a single cycle, etc. In the second part of this thesis, we also explored in detail how to implement algorithms, like an FFT, that are used in a multitude of different applications and may require inherently different macro/micro-architectures to satisfy very different performance requirements.
The fully sequential C-code that is generated by ERT for such complex blocks limits the design space that can be explored. In particular, more concurrent implementations are very difficult to derive from such code because it is inherently sequential and not well suited to achieve the level of concurrency which can be essential for some hardware implementations. We thus proposed and experimented with a technique that allowed us to represent complex blocks as proprietary blocks in Simulink. These blocks can be considered as IPs that are defined at a higher abstraction level to enable efficient hardware synthesis and extended hardware design space exploration, still starting from a single C model. The modeling strategy that we used for defining these high level synthesis IPs (HLS-IPs) relies on the S-function modeling mechanism. This is a mechanism provided by Simulink to extend its component library. It allows modeling of algorithms in a well-defined manner to enable integration into the simulation environment and C-code generation. We used plain C for modeling S-functions because C-like specifications are also well suited for high level hardware synthesis, which is the target of our proposed HLS-IP scheme. Complex algorithms, like an FFT, have many different signal flow graph representations, where each representation is suited for a different purpose. Some representations may be best suited for software implementation and some are more interesting for hardware implementation. The choice of signal flow graph and template architecture in our case was made based on flexibility and the opportunity to map to different macro/micro-architectures, to enable more extended hardware design space exploration.
This mostly involves a change of the level of concurrency, because state-of-the-art high level synthesis tools are hardly able to increase the level of concurrency starting from sequential C-like specifications, thus essentially limiting a very important dimension of the hardware design space that can be explored. We experimented extensively with an FFT test case for designing high level flexible IPs that can be used for modeling, simulation and verification in the Simulink environment and then for efficient hardware synthesis and design space exploration. In the end we were able to formulate and write an FFT IP-generator that generates flexible HLS-IPs following all the guidelines described above, including all the wrappers that are required for integration into Simulink and the high level synthesis scripts. The HLS-IPs are generated in a way that allows separation of pure functionality and behavioral constructs from the SystemC interfaces and threads. The functional part of the generated HLS-IP remains the same, so that verification needs to be performed only once, and only the level of parallelism and datapath bit-width is changed to reflect the application requirements.
APA, Harvard, Vancouver, ISO, and other styles
43

Deboer, Richard J. "Measurement of cross sections for the 65Cu(α,p)68Zn nuclear reaction at low energy with comparison to Hauser-Feshbach statistical model." Virtual Press, 2005. http://liblink.bsu.edu/uhtbin/catkey/1319221.

Full text
Abstract:
Where did the elements come from? Why are they found in the abundance that they are? These are two of the fundamental questions that the field of astrophysics has sought to answer. The first major studies of elemental synthesis were done in the 1950s and 1960s. Most notable among them was the Burbidge, Burbidge, Fowler, and Hoyle paper [Clayton 73]. This paper set forth the general theory of elemental synthesis in stars and supernovae by means of nuclear reactions. It remains the leading theory for elemental abundance today. As with most theories, the picture of elemental synthesis remains incomplete. While it is thought that the overall theory is correct, there are still many mysteries in the details. There are several kinds of nuclear reactions that occur in stars and supernovae that create the elements heavier than iron. They include the r-process, s-process, and p-process, along with several others. However, there are some elements whose creation is not fully understood. There are a variety of reasons for this, which will be discussed. In our experiment we studied the nuclear reaction properties of an isotope of copper (65Cu). It is theorized that it is produced by the p-process during a supernova explosion. The p-process can be described simply as the collision of an alpha particle with a large atomic nucleus, yielding a proton as a byproduct. Little actual experimental data has been taken involving the p-process, which is why we chose this reaction. The experiment was done using the Tandem Van de Graaff Accelerator at Ohio University.
Department of Physics and Astronomy
44

Zhao, Xiaoning. "Synthesis and applications of functional magnetic polymer beads; synthesis and mass spectrometry analysis of model peptides." Scholarly Commons, 2012. https://scholarlycommons.pacific.edu/uop_etds/156.

Full text
Abstract:
The first part of the thesis describes the synthesis and application of functional magnetic polymer beads. The traditional suspension-polymerization approach was used to synthesize polystyrene-iron oxide (Fe3O4) based magnetic beads. The beads were coupled to different surface functional groups. The Fe3O4 particles were encapsulated into a polystyrene shell. The surface functional groups were generated by graft-polymerization with functional monomers. The average size of the beads was in the range of 100-500 μm. Chemical tests showed that the beads were stable in strong acid, strong base and polar solvent. The beads had a fast response to an external magnetic field. A self-emulsion-polymerization approach was developed to synthesize smaller magnetic beads with -OH groups on the surface. A modified approach based on traditional suspension polymerization was developed to synthesize acid-durable beads with more Fe3O4 encapsulated inside the beads. A novel emulsion-suspension polymerization method was successfully developed to synthesize much smaller magnetic beads. A new peptide synthesis approach was developed using functional magnetic beads as the resin for solid-phase synthesis. In this application, synthesized magnetic beads were further modified by a two-step reaction. The amino group was anchored onto the surface of these beads, followed by coupling with the Rink amide linker. The resulting beads were used as the resin to synthesize several model peptides. The peptides were successfully synthesized, and the sequences were confirmed by mass spectrometry analysis. The yields of the peptides were comparable to those obtained from commercial Rink amide resin. The second part of the thesis describes the synthesis and mass spectrometry analysis of two series of model peptides. One series has the linear (non-cyclic) structure: AnK, KAn, PnK, and AcAnK. The other series contains cyclic peptides, c-Ac-DAKAK and c-Ac-DADapAK.
All peptides were synthesized using solid-phase peptide synthesis. The relative proton affinities of the model peptides were measured using collision-induced dissociation experiments on a triple-quadrupole mass spectrometer. It was found that the effective proton affinity of a cyclic peptide was significantly reduced compared to a linear analogue. The reduced proton affinity implies an increased lipophilicity of the peptide.
45

Cunière, Nicolas. "Model studies toward the synthesis of Phomoidrides A and B; studies toward the synthesis of Mycoepoxydiene." The Ohio State University, 2001. http://rave.ohiolink.edu/etdc/view?acc_num=osu1486400446373247.

Full text
46

Visagie, J. P. "Generic gasifier modelling: evaluating model by gasifier type." Pretoria : [s.n.], 2009. http://upetd.up.ac.za/thesis/available/etd-07022009-133535.

Full text
47

Dumbuya, Karifala. "Synthesis and characterisation of planar model catalysts for Olefin metathesis." [S.l.] : [s.n.], 2005. http://www.diss.fu-berlin.de/2005/324/index.html.

Full text
48

Cabral, Joao P. "HMM-based speech synthesis using an acoustic glottal source model." Thesis, University of Edinburgh, 2011. http://hdl.handle.net/1842/4877.

Full text
Abstract:
Parametric speech synthesis has received increased attention in recent years following the development of statistical HMM-based speech synthesis. However, the speech produced using this method still does not sound as natural as human speech and there is limited parametric flexibility to replicate voice quality aspects, such as breathiness. The hypothesis of this thesis is that speech naturalness and voice quality can be more accurately replicated by a HMM-based speech synthesiser using an acoustic glottal source model, the Liljencrants-Fant (LF) model, to represent the source component of speech instead of the traditional impulse train. Two different analysis-synthesis methods were developed during this thesis, in order to integrate the LF-model into a baseline HMM-based speech synthesiser, which is based on the popular HTS system and uses the STRAIGHT vocoder. The first method, which is called Glottal Post-Filtering (GPF), consists of passing a chosen LF-model signal through a glottal post-filter to obtain the source signal and then generating speech, by passing this source signal through the spectral envelope filter. The system which uses the GPF method (HTS-GPF system) is similar to the baseline system, but it uses a different source signal instead of the impulse train used by STRAIGHT. The second method, called Glottal Spectral Separation (GSS), generates speech by passing the LF-model signal through the vocal tract filter. The major advantage of the synthesiser which incorporates the GSS method, named HTS-LF, is that the acoustic properties of the LF-model parameters are automatically learnt by the HMMs. In this thesis, an initial perceptual experiment was conducted to compare the LF-model to the impulse train. The results showed that the LF-model was significantly better, both in terms of speech naturalness and replication of two basic voice qualities (breathy and tense).
In a second perceptual evaluation, the HTS-LF system was better than the baseline system, although the difference between the two had been expected to be more significant. A third experiment was conducted to evaluate the HTS-GPF system and an improved HTS-LF system, in terms of speech naturalness, voice similarity and intelligibility. The results showed that the HTS-GPF system performed similarly to the baseline. However, the HTS-LF system was significantly outperformed by the baseline. Finally, acoustic measurements were performed on the synthetic speech to investigate the speech distortion in the HTS-LF system. The results indicated that a problem in replicating the rapid variations of the vocal tract filter parameters at transitions between voiced and unvoiced sounds is the most significant cause of speech distortion. This problem encourages future work to further improve the system.
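The contrast at the heart of the two methods in this abstract, the same vocal-tract filter driven either by an impulse train or by a glottal-pulse source, is the classic source-filter decomposition. The following Python sketch is a hypothetical illustration: it uses a crude raised-sine pulse in place of the real four-parameter LF model, and a toy impulse response in place of a learned spectral envelope.

```python
import math

def impulse_train(n, period):
    # The traditional vocoder source: one impulse per pitch period.
    return [1.0 if i % period == 0 else 0.0 for i in range(n)]

def glottal_pulse_train(n, period, open_quotient=0.6):
    # Crude stand-in for the LF model: a smooth opening phase followed
    # by closure, repeated every pitch period.
    open_len = int(period * open_quotient)
    pulse = [math.sin(math.pi * i / open_len) ** 2 for i in range(open_len)]
    pulse += [0.0] * (period - open_len)
    reps = (n + period - 1) // period
    return (pulse * reps)[:n]

def filter_source(source, impulse_response):
    # Speech = source signal convolved with the vocal-tract filter.
    out = []
    for i in range(len(source)):
        acc = 0.0
        for j, h in enumerate(impulse_response):
            if i - j >= 0:
                acc += h * source[i - j]
        out.append(acc)
    return out

# Same vocal-tract filter, two different sources (the GSS idea):
h = [0.5, 0.3, 0.15, 0.05]  # toy vocal-tract impulse response
baseline = filter_source(impulse_train(80, 20), h)
lf_style = filter_source(glottal_pulse_train(80, 20), h)
```

Swapping the source while holding the filter fixed is exactly the degree of freedom the thesis exploits: the pulse shape (not the filter) is what carries voice-quality attributes such as breathiness or tenseness.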
49

Apsey, Glenn C. "The synthesis and chemistry of model compounds related to fluoropolymers." Thesis, Durham University, 1988. http://etheses.dur.ac.uk/6725/.

Full text
Abstract:
The objectives of this research project were to synthesize and investigate the chemistry of model compounds related to the hexafluoropropene/vinylidene fluoride copolymer system. A number of compounds of this type were prepared which underwent a series of reactions in order to obtain definitive information about the chemical processes occurring during the cross-linking of the copolymer system with bis-nucleophiles. Further studies with the model compounds also indicated potential sites in the cured copolymers through which chemical degradation could take place, during their use in aggressive environments. Other investigations with the model compounds, together with unsaturated compounds derived from these systems, led to the observation of some very unusual chemistry. Lewis acid induced dehydrofluorination reactions with antimony pentafluoride led to the formation of a number of observable carbocations and a unique contiguous dication. This methodology was further developed in the treatment of saturated homopolymers, in which dehydrohalogenation by antimony pentafluoride led to formation of polyacetylene derivatives displaying intense colouration. In order to circumvent the formation of potential sites of chemical instability during the curing process with nucleophiles, a methodology was investigated in which cross-linking could occur via a free radical mechanism involving homolytic bond cleavage of sterically demanding groups. A number of monomers containing a bulky pendant group were prepared and were investigated in order to determine their suitability to undergo copolymerisation with vinylidene fluoride. Copolymers obtained in this way were then examined to determine whether polymer radicals could be produced by thermally induced homolytic bond scission of sites involving the sterically crowded groups. The results obtained clearly demonstrated that this type of cross-linking process is entirely feasible.
50

Brownhill, Andrew. "The synthesis and study of model systems for enzymic catalysis." Thesis, University of Essex, 1992. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.315690.

Full text
