Dissertations / Theses on the topic 'GENETIC FRAMEWORK'

To see the other types of publications on this topic, follow the link: GENETIC FRAMEWORK.

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Select a source type:

Consult the top 50 dissertations / theses for your research on the topic 'GENETIC FRAMEWORK.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Press on it, and we will generate automatically the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as pdf and read online its abstract whenever available in the metadata.

Browse dissertations / theses on a wide variety of disciplines and organise your bibliography correctly.

1

Wååg, Håkan. "Development of a Framework for Genetic Algorithms." Thesis, Jönköping University, JTH, Computer and Electrical Engineering, 2009. http://urn.kb.se/resolve?urn=urn:nbn:se:hj:diva-11537.

Full text
Abstract:

Genetic algorithms is a method of optimization that can be used tosolve many different kinds of problems. This thesis focuses ondeveloping a framework for genetic algorithms that is capable ofsolving at least the two problems explored in the work. Otherproblems are supported by allowing user-made extensions.The purpose of this thesis is to explore the possibilities of geneticalgorithms for optimization problems and artificial intelligenceapplications.To test the framework two applications are developed that look attwo distinct problems, both of which aim at demonstrating differentparts. The first problem is the so called Travelling SalesmanProblem. The second problem is a kind of artificial life simulator,where two groups of creatures, designated predator and prey, aretrying to survive.The application for the Travelling Salesman Problem measures theperformance of the framework by solving such problems usingdifferent settings. The creature simulator on the other hand is apractical application of a different aspect of the framework, wherethe results are compared against predefined data. The purpose is tosee whether the framework can be used to create useful data forthe creatures.The work showed how important a detailed design is. When thework began on the demonstration applications, things were noticedthat needed changing inside the framework. This led to redesigningparts of the framework to support the missing details. A conclusionfrom this is that being more thorough in the planning, andconsidering the possible use cases could have helped avoid thissituation.The results from the simulations showed that the framework iscapable of solving the specified problems, but the performance isnot the best. The framework can be used to solve arbitrary problemsby user-created extensions quite easily.

APA, Harvard, Vancouver, ISO, and other styles
2

Dighe, Rahul. "Human pattern nesting strategies in a genetic algorithms framework." Thesis, Massachusetts Institute of Technology, 1995. http://hdl.handle.net/1721.1/36083.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Berntsson, Lars Johan. "An adaptive framework for Internet-based distributed genetic algorithms." Thesis, Queensland University of Technology, 2006. https://eprints.qut.edu.au/16242/1/Johan_Berntsson_Thesis.pdf.

Full text
Abstract:
Genetic Algorithms (GAs) are search algorithms inspired by genetics and natural selection, and have been used to solve difficult problems in many disciplines, including modelling, control systems and automation. GAs are generally able to find good solutions in reasonable time, however as they are applied to larger and harder problems they are very demanding in terms of computation time and memory. The Internet is the most powerful parallel and distributed computation environment in the world, and the idle cycles and memories of computers on the Internet have been increasingly recognized as a huge untapped source of computation power. By combining Internet computing and GAs, this dissertation provides a framework for Internet-based parallel and distributed GAs that gives scientists and engineers an easy and affordable way to solve hard real world problems. Developing parallel computation applications on the Internet is quite unlike developing applications in traditional parallel computation environments, such as multiprocessor systems and clusters. This is because the Internet is different in many respects, such as communication overhead, heterogeneity and volatility. To develop an Internet-based GA, we need to understand the implication of these differences. For this purpose, a convergence model for heterogenous and volatile networks is presented and used in experiments that study GA performance and robustness in Internet-like scenarios. The main outcome of this research is an Internet-based distributed GA framework called G2DGA. G2DGA is an island model distributed GA, which can provide support for big populations needed to solve many real world problems. G2DGA uses a novel hybrid peer-to-peer (P2P) design with island node activity coordinated by supervisor nodes that offer a global overview of the GA search state. Compared to client/server approaches, the P2P architecture improves scalability and fault tolerance by allowing direct communication between the islands and avoiding single-point-of-failure situations. One of the defining characteristics of Internet computing is the dynamics and volatility of the environment, and a parallel and distributed GA that does not adapt to its environment cannot use the available resources efficiently. Two novel adaptive methods are investigated. The first method is migration topology adaptation, which uses clustering on elite individuals from each island to rebuild the migration topology. Experiments with the migration topology adapter show that it gives G2DGA better performance than a GA with static migration topology of a similar or larger connectivity level. The second method is population size adaptation, which automatically finds the number of islands and island population sizes needed to solve a given problem efficiently. Experiments on the population size adapter show that it is robust, and compares favourably with the traditional trial-and-error approach in terms of computational effort and solution quality. The scalability and robustness of G2DGA has been extensively tested in network scenarios of varying volatility and heterogeneity. Experiments with up to 60 computers were conducted in computer laboratories, while more complex network scenarios have been studied in an Internet simulator. In the experiments, G2DGA consistently performs as well as, and usually significantly better than, static distributed GAs and the difference grows larger with increased network instability. 
The results show that G2DGA, by continuously adjusting the migration policy and the population size, can detect and make efficient use of idle cycles donated over volatile Internet connections. To demonstrate that G2DGA can be used to implement and solve real world problems, a challenging application in VLSI design was developed and used in the testing of the framework. The application is a multi-layer floorplanner, which uses a novel GA representation and operators based on a slicing structure approach. Its packing quality compares favourably with other multi-layer floorplanners found in the literature. Internet-based distributed GA research is exciting and important since it enables GAs to be applied to problem areas where resource limitations make traditional approaches unworkable. G2DGA provides a scalable and robust Internet-based distributed GA framework that can serve as a foundation for future work in the field.
APA, Harvard, Vancouver, ISO, and other styles
4

Berntsson, Lars Johan. "An adaptive framework for Internet-based distributed genetic algorithms." Queensland University of Technology, 2006. http://eprints.qut.edu.au/16242/.

Full text
Abstract:
Genetic Algorithms (GAs) are search algorithms inspired by genetics and natural selection, and have been used to solve difficult problems in many disciplines, including modelling, control systems and automation. GAs are generally able to find good solutions in reasonable time, however as they are applied to larger and harder problems they are very demanding in terms of computation time and memory. The Internet is the most powerful parallel and distributed computation environment in the world, and the idle cycles and memories of computers on the Internet have been increasingly recognized as a huge untapped source of computation power. By combining Internet computing and GAs, this dissertation provides a framework for Internet-based parallel and distributed GAs that gives scientists and engineers an easy and affordable way to solve hard real world problems. Developing parallel computation applications on the Internet is quite unlike developing applications in traditional parallel computation environments, such as multiprocessor systems and clusters. This is because the Internet is different in many respects, such as communication overhead, heterogeneity and volatility. To develop an Internet-based GA, we need to understand the implication of these differences. For this purpose, a convergence model for heterogenous and volatile networks is presented and used in experiments that study GA performance and robustness in Internet-like scenarios. The main outcome of this research is an Internet-based distributed GA framework called G2DGA. G2DGA is an island model distributed GA, which can provide support for big populations needed to solve many real world problems. G2DGA uses a novel hybrid peer-to-peer (P2P) design with island node activity coordinated by supervisor nodes that offer a global overview of the GA search state. Compared to client/server approaches, the P2P architecture improves scalability and fault tolerance by allowing direct communication between the islands and avoiding single-point-of-failure situations. One of the defining characteristics of Internet computing is the dynamics and volatility of the environment, and a parallel and distributed GA that does not adapt to its environment cannot use the available resources efficiently. Two novel adaptive methods are investigated. The first method is migration topology adaptation, which uses clustering on elite individuals from each island to rebuild the migration topology. Experiments with the migration topology adapter show that it gives G2DGA better performance than a GA with static migration topology of a similar or larger connectivity level. The second method is population size adaptation, which automatically finds the number of islands and island population sizes needed to solve a given problem efficiently. Experiments on the population size adapter show that it is robust, and compares favourably with the traditional trial-and-error approach in terms of computational effort and solution quality. The scalability and robustness of G2DGA has been extensively tested in network scenarios of varying volatility and heterogeneity. Experiments with up to 60 computers were conducted in computer laboratories, while more complex network scenarios have been studied in an Internet simulator. In the experiments, G2DGA consistently performs as well as, and usually significantly better than, static distributed GAs and the difference grows larger with increased network instability. 
The results show that G2DGA, by continuously adjusting the migration policy and the population size, can detect and make efficient use of idle cycles donated over volatile Internet connections. To demonstrate that G2DGA can be used to implement and solve real world problems, a challenging application in VLSI design was developed and used in the testing of the framework. The application is a multi-layer floorplanner, which uses a novel GA representation and operators based on a slicing structure approach. Its packing quality compares favourably with other multi-layer floorplanners found in the literature. Internet-based distributed GA research is exciting and important since it enables GAs to be applied to problem areas where resource limitations make traditional approaches unworkable. G2DGA provides a scalable and robust Internet-based distributed GA framework that can serve as a foundation for future work in the field.
APA, Harvard, Vancouver, ISO, and other styles
5

Song, Yeunjoo E. "New Score Tests for Genetic Linkage Analysis in a Likelihood Framework." Case Western Reserve University School of Graduate Studies / OhioLINK, 2013. http://rave.ohiolink.edu/etdc/view?acc_num=case1354561219.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Kumuthini, Judit. "Extraction of genetic network from microarray data using Bayesian framework." Thesis, Cranfield University, 2007. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.442547.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Mittra, James. "Genetic information, life assurance, and the UK policy and regulatory framework." Thesis, University of Warwick, 2004. http://wrap.warwick.ac.uk/106450/.

Full text
Abstract:
This thesis provides the first extensive sociological analysis of the genetics and life assurance debate in the UK. It uses data from original qualitative interviews, as well as various policy documents and reports, to investigate the likely implications of genetic information for life assurance provision, reveal the narrative strategies used by key stakeholders as they account for their concerns on the issue, and evaluate the efficacy of the policy and regulatory framework. It also attempts to evaluate the suitability of the citizens’ jury model as an alternative to existing decision-making procedures. The thesis begins by revealing the most likely social, commercial, legal, and ethical implications of allowing insurers to access new kinds of genetic information. A history of insurance, risk and probability is used as a starting point to challenge many of the pervasive fears and anxieties. This part of the thesis critically analyses the social and philosophical basis of such contested notions as ’discrimination’, 'social exclusion’, 'genetic information', and ‘social justice’, and begins to reveal some of the key strategies of stakeholders in the debate. The thesis then analyses stakeholder accounts of their concerns, and begins to reveal the ways in which they draw on a broad narrative repertoire to give their beliefs a degree of moral legitimacy/coherency. The impact this may have on the quality of debate is also investigated. Following from the analysis of stakeholder accounts, the thesis proceeds to investigate the nature of the policymaking and regulatory framework. Through a sociological analysis of the work of various advisory committees, which led to the implementation of a moratorium on insurers' use of genetic information, the thesis investigates how fair and equitable the overall political process has been, particularly in terms of the treatment of stakeholder evidence. It also assesses the role of the public and media in shaping the political response to this issue. The thesis concludes by assessing the citizens’ jury as suitable procedures for resolving the conflicts around genetic information and life assurance. Both the potential advantages and persistent problems with the model are critically evaluated.
APA, Harvard, Vancouver, ISO, and other styles
8

Parandekar, Amey V. "Development of a Decision Support Framework forIntegrated Watershed Water Quality Management and a Generic Genetic Algorithm Based Optimizer." NCSU, 1999. http://www.lib.ncsu.edu/theses/available/etd-19990822-032656.

Full text
Abstract:

PARANDEKAR, AMEY, VIJAY. Development of a Decision Support Framework for Integrated Watershed Water Quality Management and a Generic Genetic Algorithm Based Optimizer. (Under the direction of Dr. S. Ranji Ranjithan.)The watershed management approach is a framework for addressing water quality problems at a watershed scale in an integrated manner that considers many conflicting issues including cost, environmental impact and equity in evaluating alternative control strategies. This framework enhances the capabilities of current environmental analysis frameworks by the inclusion of additional systems analytic tools such as optimization algorithms that enable efficient search for cost effective control strategies and uncertainty analysis procedures that estimate the reliability in achieving water quality targets. Traditional optimization procedures impose severe restrictions in using complex nonlinear environmental processes within a systematic search. Hence, genetic algorithms (GAs), a class of general, probabilistic, heuristic, global, search procedures, are used. Current implementation of this framework is coupled with US EPA's BASINS software system. A component of the current research is also the development of GA object classes and optimization model classes for generic use. A graphical user interface allows users to formulate mathematical programming problems and solve them using GA methodology. This set of GA object and the user interface classes together comprise the Generic Genetic Algorithm Based Optimizer (GeGAOpt), which is demonstrated through applications in solving interactively several unconstrained as well as constrained function optimization problems.Design of these systems is based on object oriented paradigm and current software engineering practices such as object oriented analysis (OOA) and object oriented design (OOD). The development follows the waterfall model for software development. The Unified Modeling Language (UML) is used for the design. The implementation is carried out using the JavaTM programming environment

APA, Harvard, Vancouver, ISO, and other styles
9

Silva, Carlos H. "A Proposed Framework for Establishing Optimal Genetic Designsfor Estimating Narrow-sense Heritability." NCSU, 2000. http://www.lib.ncsu.edu/theses/available/etd-20000414-113213.

Full text
Abstract:

We develop a framework for establishing sample sizes in breeding plans, so that one is able to estimate narrow-sense heritability with smallest possible variance, for a given amount of effort. We call this an optimal genetic design. The framework allows one to compare the variances of estimators of narrow-sense heritability, when estimated from each one of the alternative plans under consideration, and does not require data simulation, but does require computer programming. We apply it to the study of a peanut (Arachis hypogaea) breading example, in order to determine the ideal number of plants to be selected at each generation. We also propose a methodology that allows one to estimate the additive genetic variance for the estimation of the narrow-sense heritability using MINQUE and REML, without an analysis of variance model. It requires that one can build the matrix of genetic variances and covariances among the subjects on which observations are taken. This methodology can be easily adapted to ANOVA-based methods, and we exemplify by using Henderson's Method III. We compare Henderson's Method III, MINQUE, and REML under the proposed methodology, with an emphasis on comparing these estimation methods with non-normally distributed data and unbalanced designs. A location-scale transformation of the beta density is proposed for simulation of non-normal data.

APA, Harvard, Vancouver, ISO, and other styles
10

Aevan, Nadjib Danial. "MDO Framework for Design of Human PoweredPropellers using Multi-Objective Genetic Algorithm." Thesis, Linköpings universitet, Fluida och mekatroniska system, 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-123626.

Full text
Abstract:
This thesis showcases the challenges, downsides and advantages to building a MultiDisciplinary Optimization (MDO) framework to automate the generation of an efficientpropeller design built for lightly loaded operation, more specifically for humanpowered aircrafts. Two years ago, a human powered aircraft project was initiatedat Linköping University. With the help of several courses, various students performedconceptional design, calculated and finally manufactured a propeller bymeans of various materials and manufacturing techniques. The performance ofthe current propeller is utilized for benchmarking and comparing results obtainedby the MDO process.The developed MDO framework is constructed as a modeFRONITER project wereseveral Computer Aided Engineering softwares (CAE) such as MATLAB, CATIAand XFOIL are connected to perform multiple consequent optimization subprocesses.The user is presented with several design constraints such as blade quantity,required input power, segment-wise airfoil thickness, desired lift coefficientetc. Also, 6 global search optimization algorithms are investigated to determinethe one which generate most efficient result according to several set standards.The optimization process is thereafter initialized by identifying the most efficientchord distribution with a help of an initial blade cross-section which has been previouslyused in other human powered propellers, the findings are thereafter usedto determine the flow conditions at different propeller stations. Two different aerodynamicoptimized shapes are generated with the help of consecutively performedsubprocesses. The optimized propeller requires 7.5 W less input power to generatenearly equivalent thrust as the original propeller with a total efficiency exceedingthe 90 % mark (90.25 %). Moreover, the MDO framework include an automationprocess to generate a CAD design of the optimized propeller. The generatedCAD file illustrates a individual surface blade decrease of 12.5 % compared tothe original design, the lightweight design and lower input power yield an overallpropulsion system which is less tedious to operate.
APA, Harvard, Vancouver, ISO, and other styles
11

Parandekar, Amey V. "Development of a decision support framework for integrated watershed water quality management and a Generic Genetic Algorithm Based Optimizer." Raleigh, NC : North Carolina State University, 1999. http://www.lib.ncsu.edu/etd/public/etd-492632279902331/etd.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
12

Lynch, Joshua, Karen Tang, Sambhawa Priya, Joanna Sands, Margaret Sands, Evan Tang, Sayan Mukherjee, Dan Knights, and Ran Blekhman. "HOMINID: a framework for identifying associations between host genetic variation and microbiome composition." OXFORD UNIV PRESS, 2017. http://hdl.handle.net/10150/626556.

Full text
Abstract:
Recent studies have uncovered a strong effect of host genetic variation on the composition of host-associated microbiota. Here, we present HOMINID, a computational approach based on Lasso linear regression, that given host genetic variation and microbiome taxonomic composition data, identifies host single nucleotide polymorphisms (SNPs) that are correlated with microbial taxa abundances. Using simulated data, we show that HOMINID has accuracy in identifying associated SNPs and performs better compared with existing methods. We also show that HOMINID can accurately identify the microbial taxa that are correlated with associated SNPs. Lastly, by using HOMINID on real data of human genetic variation and microbiome composition, we identified 13 human SNPs in which genetic variation is correlated with microbiome taxonomic composition across body sites. In conclusion, HOMINID is a powerful method to detect host genetic variants linked to microbiome composition and can facilitate discovery of mechanisms controlling host-microbiome interactions.
APA, Harvard, Vancouver, ISO, and other styles
13

Voog, Justin C. "Defining a genetic framework for stem cell niche maintenance in the Drosophila testis." Diss., [La Jolla] : University of California, San Diego, 2009. http://wwwlib.umi.com/cr/ucsd/fullcit?p3365768.

Full text
Abstract:
Thesis (Ph. D.)--University of California, San Diego, 2009.
Title from first page of PDF file (viewed Aug. 10, 2009). Available via ProQuest Digital Dissertations. Vita. Includes bibliographical references (p. 107-125).
APA, Harvard, Vancouver, ISO, and other styles
14

Silva, Carlos H. O. "A proposed framework for establishing optimal genetic designs for estimating narrow-sense heritability." Raleigh, NC : North Carolina State University, 2000. http://www.lib.ncsu.edu/etd/public/etd-8321114310051041/etd.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
15

Jang, Won-Hyouk. "Novel cost allocation framework for natural gas processes: methodology and application to plan economic optimization." Diss., Texas A&M University, 2004. http://hdl.handle.net/1969.1/368.

Full text
Abstract:
Natural gas plants can have multiple owners for raw natural gas streams and processing facilities as well as for multiple products. Therefore, a proper cost allocation method is necessary for taxation of the profits from natural gas and crude oil as well as for cost sharing among gas producers. However, cost allocation methods most often used in accounting, such as the sales value method and the physical units method, may produce unacceptable or even illogical results when applied to natural gas processes. Wright and Hall (1998) proposed a new approach called the design benefit method (DBM), based upon engineering principles, and Wright et al. (2001) illustrated the potential of the DBM for reliable cost allocation for natural gas processes by applying it to a natural gas process. In the present research, a rigorous modeling technique for the DBM has been developed based upon a Taylor series approximation. Also, we have investigated a cost allocation framework that determines the virtual flows, models the equipment, and evaluates cost allocation for applying the design benefit method to other scenarios, particularly those found in the petroleum and gas industries. By implementing these individual procedures on a computer, the proposed framework easily can be developed as a software package, and its application can be extended to large-scale processes. To implement the proposed cost allocation framework, we have investigated an optimization methodology specifically geared toward economic optimization problems encountered in natural gas plants. Optimization framework can provide co-producers who share raw natural gas streams and processing plants not only with optimal operating conditions but also with valuable information that can help evaluate their contracts. This information can be a reasonable source for deciding new contracts for co-producers. For the optimization framework, we have developed a genetic-quadratic search algorithm (GQSA) consisting of a general genetic algorithm and a quadratic search that is a suitable technique for solving optimization problems including process flowsheet optimization. The GQSA inherits the advantages of both genetic algorithms and quadratic search techniques, and it can find the global optimum with high probability for discontinuous as well as non-convex optimization problems much faster than general genetic algorithms.
APA, Harvard, Vancouver, ISO, and other styles
16

Naik, Apoorv. "Orchestra Framework: Protocol Design for Ad Hoc and Delay Tolerant Networks using Genetic Algorithms." Thesis, Virginia Tech, 2011. http://hdl.handle.net/10919/43409.

Full text
Abstract:
Protocol designs targeted at a specific network scenario or performance metric appear promising on paper, but the complexity and cost of implementing and tuning a routing protocol from scratch presents a major bottleneck in the protocol design process. A unique framework called 'Orchestra` is proposed in the literature to support the testing and development of novel routing designs. The idea of the Orchestra framework is to create generic and reusable routing functional components which can be combined to create unique protocol designs customized for a specific performance metric or network setting. The first contribution of this thesis is the development of a generic, modular, scalable and extensible architecture of the Orchestra framework. Once the architecture and implementation of the framework is completed, the second contribution of this thesis is the development of functional components and strategies to design and implement routing protocols for delay tolerant networks (DTNs). DTNs are a special type of ad hoc network characterized by intermittent connectivity, long propagation delays and high loss rate. Thus, traditional ad hoc routing approaches cannot be used in DTNs, and special features must be developed for the Orchestra framework to support the design of DTN routing protocols. The component-based architecture of Orchestra can capture a variety of modules that can be used to assemble a routing protocol. However, manually assembling these components may result in suboptimal designs, because it is difficult to determine what the best combination is for a particular set of performance objectives and network characteristics. The third contribution of the thesis addresses this problem. A genetic algorithm based approach to automate the process of routing protocol design is developed and its performance is evaluated in the context of the Orchestra framework.
Master of Science
APA, Harvard, Vancouver, ISO, and other styles
17

CAPORALE, NICOLO'. "A UNIFYING FRAMEWORK TO STUDY THE GENETIC AND ENVIRONMENTAL FACTORS SHAPING HUMAN BRAIN DEVELOPMENT." Doctoral thesis, Università degli Studi di Milano, 2020. http://hdl.handle.net/2434/697871.

Full text
Abstract:
The development of human brain is a fascinating and complex process that still needs to be uncovered at the molecular resolution. Even though animal studies have revealed a lot of its unfolding, the fine regulation of cellular differentiation trajectories that characterizes humans has become only recently open to experimental tractability, thanks to the development of organoids, human cellular models that are able to recapitulate the spatiotemporal architecture of the brain in a 3D fashion. Here we first benchmarked human brain organoids at the level of transcriptomic and structural architecture of cell composition along several stages of differentiation. Then we harnessed their properties to probe the longitudinal impact of GSK3 on human corticogenesis, a pivotal regulator of both proliferation and polarity, that we revealed having a direct impact on early neurogenesis with a selective role in the regulation of glutamatergic lineages and outer radial glia output. Moreover, we spearheaded the use of organoids for regulatory toxicology through the study of Endocrine disrupting chemicals (EDC), pervasive compounds that can interfere with human hormonal systems. Early life exposure to EDC is associated with human disorders, but the molecular events triggered remain unknown. We developed a novel approach, integrating epidemiological with experimental biology to study the mixtures of EDC that were associated with neurodevelopmental and metabolic adverse effects in the biggest pregnancy cohort profiled so far. Our experiments were carried out on two complementary models i) human fetal primary neural stem cells, and ii) 3-dimensional cortical brain organoids and we identified the genes specifically dysregulated by EDC mixture exposure, unravelling a significant enrichment for autism spectrum disorders causative genes, thereby proposing a convergent paradigm of neurodevelopmental disorders pathophysiology between genetic and environmental factors. Finally, while EDCs are everywhere, their impact on adverse health outcomes can vary substantially among individuals, suggesting that other genetic factors may play a pivotal role for the onset of the disorders. We took advantage of organoids multiplexing to recapitulate, at the same time, neurodevelopmental trajectories on multiple genetic backgrounds, and showed that chimeric organoids preserved the overall morphological organization and transcriptomic signatures of the ones generated from single lines. In conclusion our work shows the possibility to perform population level studies in vitro and use the deep resolution of molecular biology to dissect key aspects of human neurodevelopment.
APA, Harvard, Vancouver, ISO, and other styles
18

Bonàs, Guarch Sílvia. "Implementation of a novel analytical framework for large-scale genetic data. Extending the genetic architecture of type 2 diabetes beyond common variants." Doctoral thesis, Universitat de Barcelona, 2017. http://hdl.handle.net/10803/402781.

Full text
Abstract:
The major landmark in modern genomic and biological research has been the first survey of the entire human genome. On June 2000 the staging of Bill Clinton along with Craig Venter and Francis Collins extolled how genome science would impact our lives by revolutionizing diagnosis, prevention and treatment for a vast number of human diseases (Collins 2010). Since that, we underwent a breathtaking progress in genome science with the unique conjunction of the development of new technologies such as Next Generation Sequencing (NGS) or genotyping arrays (Collins 2010; Hofker et al. 2014) and extensive data sharing initiatives catalysing new discoveries (Kaye et al. 2009; Collins 2010; Hood and Rowen 2013). To underscore the magnitude of this summit, the first sequence from the Human Genome Project (HGP) took 13 years and several collaborative efforts from a lace of international public research institutions entailing a 3 billion budget (U.S. Department of Energy & Human Genome Project program). Less than a decade later, NGS technologies have been implemented for clinical diagnosis, we entered in the $1,000 genome era, and the last Illumina sequencer, HiSeq X Ten is capable of producing up to 16 human genomes (1.8 terabases of data) in three days (Hayden 2014). The success of NGS led to an astonishing rate of growth of sequence data (Koboldt et al. 2013), which is doubling every seven months (Stephens et al. 2015). A downstream consequence has been the rapid accumulation of the number of sequenced genomes of many vertebrates, invertebrates, fungi, plants and microorganisms enabling tackling evolution and genome function through the rationale of comparative genomics (Collins 2010). In addition, the build-up of sequence data of thousands of human subjects contributed to catalogue the genetic differences between individuals, or also called as genetic variation (Hofker et al. 2014). There are different types of genetic variation but the most abundant are Single Nucleotide Polymorphisms (SNPs) (Stranger et al. 2011), substitutions of single nucleotides. While the HGP reported around 1.4 M of SNPs (Lander et al. 2001) more than 84 M of SNPs have been described in the new phase 3 release of the 1000 Genomes Project (1000G-Phase3) (Sudmant et al. 2015; The 1000 Genomes Project Consortium et al. 2015). A final example to illustrate the large efforts invested in more accurate descriptions of genetic variation is the last work published from the Exome Aggregation Consortium (ExAC). This study involved the aggregation and analysis of exomic regions through sequencing data of 60,706 individuals (Lek et al. 2016). The disposal of this kind of data showed a widespread mutational recurrence in human genomes, it allowed detecting genes subjected to strong selection depending on the class of mutation and it is expected to facilitate the clinical interpretation of disease-causing variants (Lek et al. 2016). Thus, the accumulation of individual genetic data has empowered researchers to unravel those specific genetic variants associated with disease liability. We also moved from biologically guided candidate single gene-studies involving a few hundreds of individuals towards hypothesis-free genome-wide analysis, performing extensive and massive genomic interrogation of thousands of individuals (Relling and Evans 2015; Wang et al. 2015). Piecing these advances all together, we have expanded our understanding of disease pathophysiology. 
Therefore, integrating the genetic understanding of the health-status alongside with clinical explorations constitutes the idea beneath personalized medicine. This genomic paradigm shift for clinical medicine provides a new source of therapeutic breakthroughs and diagnosis (Hood and Rowen 2013). As an example of this, targeted therapeutics have been resourceful for the treatment of lung cancer: sequence information revealed that tumours carrying specific mutations in the epidermal growth factor receptor (EGFR) were vulnerable to kinase inhibitors, resulting in higher response rates compared to traditional platinum-based chemotherapy (Levy et al. 2012; Swanton and Govindan 2016). Moreover, genetic tests are able to predict which breast cancer patients will benefit from chemotherapy (Innocenti et al. 2011; Gyorffy et al. 2015). Finally, notable successes have been achieved in pharmacogenomics, in which warfarin dose can be adjusted on the basis of genetic polymorphisms placed in CYP2C8 and VKORC1C genes (Collins 2010; Hood and Rowen 2013; Relling and Evans 2015). In line with this, there are large efforts under way to prioritize targeted therapeutics and to optimize drug selection and dosing, such as the Genomics England 100,000 Genomes Project and the US National of Health (NIH) Pharmacogenomics Research Network (Relling and Evans 2015; Wilson and Nicholls 2015). However, clear successes in clinical decision-making through genomic knowledge are anecdotal due to a poor understanding of human genetic diseases (Hofker et al. 2014; Relling and Evans 2015). For instance, Genome Wide Association Studies (GWAS) is undoubtedly one of the most important methodological advances emerging from the availability of complete human genome sequences and affordable DNA chips (Visscher et al. 2012; Hofker et al. 2014; Paul et al. 2014). GWAS have been extremely resourceful in identifying genetic variants associated with multiple diseases, but the translation of these results to clinics is sparse (Manolio et al. 2009; Collins 2010; Hofker et al. 2014). Some of the limitations lie on (1) the still small proportion of disease causing genetic factors identified for most complex diseases and (2) a lack of functional characterization and interpretation of disease associated variants, which hampers the identification of the underlying molecular mechanism (Manolio et al. 2009; Hofker et al. 2014). The genomic revolution has brought new decisive players for the future trend in biomedical research and clinical genetics. The ‘genomical’ challenge is one of the most demanding Big Data sciences in all four big computer science domains (data acquisition, storage, distribution and computation). In order to meet this rapid progress of genomic research, the build-up of whole-genome sequences and the emergence of large population biobanks (Stephens et al. 2015) urges a parallel development of computational frameworks. Moreover, a real social concern about data privacy can discourage the participation in genetic studies, which requires a major discussion about the ethical consequences of the return of information to participants seeking for genetic diagnosis (Hood and Rowen 2013; Koboldt et al. 2013). From this brief overview, the agenda of human genomics has clearly many issues to address. 
In this thesis I translated some of them into the following general goal: setting a cost-effective genetic research environment through the implementation of novel analytical and computational methods in order to better understand the genetics of Type 2 Diabetes (T2D). This work is a small glimpse of the frenzied activity in human genomics research and it aims to modestly contribute along with countless research efforts on this broad deployment of P4 medicine (Predictive, Preventive, Personalized, Participatory). In the next sections of this dissertation, I want to spell out this primary focus by providing several concepts that I learned during these years, which prompted this research to successfully achieve the goals of this thesis.
APA, Harvard, Vancouver, ISO, and other styles
19

Bowman, Kelly Eric. "Optimization Constrained CAD Framework with ISO-Performing Design Generator." Diss., CLICK HERE for online access, 2008. http://contentdm.lib.byu.edu/ETD/image/etd2599.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
20

McBride, Cameron D. "A Mathematical and Engineering Framework to Predict the Effect of Resource Sharing on Genetic Networks." Thesis, Massachusetts Institute of Technology, 2017. http://hdl.handle.net/1721.1/111712.

Full text
Abstract:
Thesis: S.M., Massachusetts Institute of Technology, Department of Mechanical Engineering, 2017.
Cataloged from PDF version of thesis.
Includes bibliographical references (pages 73-76).
In this thesis, a framework is developed to investigate the effect of resource sharing on the performance of genetic networks. A model of a genetic system with shared resources for protein degradation is developed that captures resource sharing effects and is subsequently analyzed to discover ways in which this form of resource sharing effects genetic networks. It is shown that sharing of degradation resources may cancel undesirable effects due to resource sharing of protein production resources. Next, a theoretical framework is developed to find conditions in which a genetic network may exhibit a change in its number of equilibria due to resource sharing effects. Finally, metrics and an experimental method are proposed to estimate the quantity of resources a genetic network uses and the sensitivity of the network to disturbances in resource availability. These measures may be utilized to inform design choices in genetic networks in which resource sharing plays a significant role. These effects become increasingly important in more complex genetic networks. Quantification of such resource sharing effects are an important step in increasing the predictability of genetic networks.
by Cameron D. McBride.
S.M.
APA, Harvard, Vancouver, ISO, and other styles
21

Morley, Mark S. "A framework for evolutionary optimization applications in water distribution systems." Thesis, University of Exeter, 2008. http://hdl.handle.net/10036/42400.

Full text
Abstract:
The application of optimization to Water Distribution Systems encompasses the use of computer-based techniques to problems of many different areas of system design, maintenance and operational management. As well as laying out the configuration of new WDS networks, optimization is commonly needed to assist in the rehabilitation or reinforcement of existing network infrastructure in which alternative scenarios driven by investment constraints and hydraulic performance are used to demonstrate a cost-benefit relationship between different network intervention strategies. Moreover, the ongoing operation of a WDS is also subject to optimization, particularly with respect to the minimization of energy costs associated with pumping and storage and the calibration of hydraulic network models to match observed field data. Increasingly, Evolutionary Optimization techniques, of which Genetic Algorithms are the best-known examples, are applied to aid practitioners in these facets of design, management and operation of water distribution networks as part of Decision Support Systems (DSS). Evolutionary Optimization employs processes akin to those of natural selection and “survival of the fittest” to manipulate a population of individual solutions, which, over time, “evolve” towards optimal solutions. Such algorithms are characterized, however, by large numbers of function evaluations. This, coupled with the computational complexity associated with the hydraulic simulation of water networks incurs significant computational overheads, can limit the applicability and scalability of this technology in this domain. Accordingly, this thesis presents a methodology for applying Genetic Algorithms to Water Distribution Systems. A number of new procedures are presented for improving the performance of such algorithms when applied to complex engineering problems. These techniques approach the problem of minimising the impact of the inherent computational complexity of these problems from a number of angles. A novel genetic representation is presented which combines the algorithmic simplicity of the classical binary string of the Genetic Algorithm with the performance advantages inherent in an integer-based representation. Further algorithmic improvements are demonstrated with an intelligent mutation operator that “learns” which genes have the greatest impact on the quality of a solution and concentrates the mutation operations on those genes. A technique for implementing caching of solutions – recalling the results for solutions that have already been calculated - is demonstrated to reduce runtimes for Genetic Algorithms where applied to problems with significant computation complexity in their evaluation functions. A novel reformulation of the Genetic Algorithm for implementing robust stochastic optimizations is presented which employs the caching technology developed to produce an multiple-objective optimization methodology that demonstrates dramatically improved quality of solutions for given runtime of the algorithm. These extensions to the Genetic Algorithm techniques are coupled with a supporting software library that represents a standardized modelling architecture for the representation of connected networks. This library gives rise to a system for distributing the computational load of hydraulic simulations across a network of computers. This methodology is established to provide a viable, scalable technique for accelerating evolutionary optimization applications.
APA, Harvard, Vancouver, ISO, and other styles
22

Ketenci, Ahmet. "Development Of A Grid-aware Master Worker Framework For Artificial Evolution." Master's thesis, METU, 2010. http://etd.lib.metu.edu.tr/upload/12612881/index.pdf.

Full text
Abstract:
Genetic Algorithm (GA) has become a very popular tool for various kinds of problems, including optimization problems with wider search spaces. Grid search techniques are usually not feasible or ineffective at finding a solution, which is good enough. The most computationally intensive component of GA is the calculation of the goodness (fitness) of candidate solutions. However, since the fitness calculation of each individual does not depend each other, this process can be parallelized easily. The easiest way to reach high amounts of computational power is using grid. Grids are composed of multiple clusters, thus they can offer much more resources than a single cluster. On the other hand, grid may not be the easiest environment to develop parallel programs, because of the lack of tools or libraries that can be used for communication among the processes. In this work, we introduce a new framework, GridAE, for GA applications. GridAE uses the master worker model for parallelization and offers a GA library to users. It also abstracts the message passing process from users. Moreover, it has both command line interface and web interface for job management. These properties makes the framework more usable for developers even with limited parallel programming or grid computing experience. The performance of GridAE is tested with a shape optimization problem and results show that the framework is more convenient to problems with crowded populations.
APA, Harvard, Vancouver, ISO, and other styles
23

Ugwu, Onuegbu O. "A decision support framework for resource optimisation and management using hybrid genetic algorithms : application in earthworks." Thesis, London South Bank University, 1999. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.297926.

Full text
APA, Harvard, Vancouver, ISO, and other styles
24

Nguyen, Peter H. T. (Peter Hung Trung). "Fine-Mapping Tools : an interactive framework for dissecting disease-associated genetic loci with functional genomics data." Thesis, Massachusetts Institute of Technology, 2017. http://hdl.handle.net/1721.1/113121.

Full text
Abstract:
Thesis: M. Eng., Massachusetts Institute of Technology, Department of Electrical Engineering and Computer Science, 2017.
This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections.
Cataloged from student-submitted PDF version of thesis.
Includes bibliographical references (pages 57-58).
Fine mapping causal SNPs from GWAS summary statistics is hard. Although many frame- works exist to support fine mapping, some of which leverage epigenomic contexts to increase predictive power, they fail to provide interactivity. Here, we introduce Fine-Mapping Tools (fm-tools), a framework for doing interactive and iterative fine mapping. Fm-tools provides a harmonized data store and implements a number of algorithms for fine mapping -- one of which is the custom RiVIERA-mini, an efficient Bayesian inference framework -- and exposes them via a rich API that can be plugged into a variety of services (e.g., web applications for visualization). Most importantly, fm-tools allows scientists to interactively and iteratively explore dynamically generated hypotheses, as demonstrated by a case study for celiac disease. In summary, fm-tools standardizes the way fine mapping is done, reduces the overhead of fine mapping for scientists and of algorithm development for researchers, and paves the way towards achieving real-time personalized medicine.
by Peter HT Nguyen.
M. Eng.
APA, Harvard, Vancouver, ISO, and other styles
25

COSTANZO, STEFANO. "An Evolutionary Multi-Objective Optimization Framework for Bi-level Problems." Doctoral thesis, Università degli Studi di Trieste, 2017. http://hdl.handle.net/11368/2908206.

Full text
Abstract:
Genetic algorithms (GA) are stochastic optimization methods inspired by the evolutionist theory on the origin of species and natural selection. They are able to achieve good exploration of the solution space and accurate convergence toward the global optimal solution. GAs are highly modular and easily adaptable to specific real-world problems which makes them one of the most efficient available numerical optimization methods. This work presents an optimization framework based on the Multi-Objective Genetic Algorithm for Structured Inputs (MOGASI) which combines modules and operators with specialized routines aimed at achieving enhanced performance on specific types of problems. MOGASI has dedicated methods for handling various types of data structures present in an optimization problem as well as a pre-processing phase aimed at restricting the problem domain and reducing problem complexity. It has been extensively tested against a set of benchmarks well-known in literature and compared to a selection of state-of-the-art GAs. Furthermore, the algorithm framework was extended and adapted to be applied to Bi-level Programming Problems (BPP). These are hierarchical optimization problems where the optimal solution of the bottom-level constitutes part of the top-level constraints. One of the most promising methods for handling BPPs with metaheuristics is the so-called "nested" approach. A framework extension is performed to support this kind of approach. This strategy and its effectiveness are shown on two real-world BPPs, both falling in the category of pricing problems. The first application is the Network Pricing Problem (NPP) that concerns the setting of road network tolls by an authority that tries to maximize its profit whereas users traveling on the network try to minimize their costs. A set of instances is generated to compare the optimization results of an exact solver with the MOGASI bi-level nested approach and identify the problem sizes where the latter performs best. The second application is the Peak-load Pricing (PLP) Problem. The PLP problem is aimed at investigating the possibilities for mitigating European air traffic congestion. The PLP problem is reformulated as a multi-objective BPP and solved with the MOGASI nested approach. The target is to modulate charges imposed on airspace users so as to redistribute air traffic at the European level. A large scale instance based on real air traffic data on the entire European airspace is solved. Results show that significant improvements in traffic distribution in terms of both schedule displacement and air space sector load can be achieved through this simple, en-route charge modulation scheme.
APA, Harvard, Vancouver, ISO, and other styles
26

Kariyawasam, G. S. Kanchana. "Developing a Regulatory Framework for the Protection of Intellectual Property Rights in Plant Genetic Resources in Sri Lanka." Thesis, Griffith University, 2006. http://hdl.handle.net/10072/365888.

Full text
Abstract:
This thesis represents a critical examination of the intellectual property protection over biogenetic resources in Sri Lanka. While Sri Lanka is obliged to comply with international obligations, consideration needs to be given to the maintenance of a sustainable agricultural framework. The extension of intellectual property rights to agriculture is a significant issue that confronts Sri Lanka. The challenge for Sri Lanka is to ensure that these rights are implemented in the domestic context in an appropriate manner.
Thesis (PhD Doctorate)
Doctor of Philosophy (PhD)
Griffith Law School
Arts, Education and Law
Full Text
APA, Harvard, Vancouver, ISO, and other styles
27

CANNAS, LAURA MARIA. "A framework for feature selection in high-dimensional domains." Doctoral thesis, Università degli Studi di Cagliari, 2013. http://hdl.handle.net/11584/266105.

Full text
Abstract:
The introduction of DNA microarray technology has lead to enormous impact in cancer research, allowing researchers to analyze expression of thousands of genes in concert and relate gene expression patterns to clinical phenotypes. At the same time, machine learning methods have become one of the dominant approaches in an effort to identify cancer gene signatures, which could increase the accuracy of cancer diagnosis and prognosis. The central challenges is to identify the group of features (i.e. the biomarker) which take part in the same biological process or are regulated by the same mechanism, while minimizing the biomarker size, as it is known that few gene expression signatures are most accurate for phenotype discrimination. To account for these competing concerns, previous studies have proposed different methods for selecting a single subset of features that can be used as an accurate biomarker, capable of differentiating cancer from normal tissues, predicting outcome, detecting recurrence, and monitoring response to cancer treatment. The aim of this thesis is to propose a novel approach that pursues the concept of finding many potential predictive biomarkers. It is motivated from the biological assumption that, given the large numbers of different relationships which are possible between genes, it is highly possible to combine genes in many ways to produce signatures with similar predictive power. An intriguing advantage of our approach is that it increases the statistical power to capture more reliable and consistent biomarkers while a single predictor may not necessarily provide important clues as to biological differences of interest. Specifically, this thesis presents a framework for feature selection that is based upon a genetic algorithm, a well known approach recently proposed for feature selection. To mitigate the high computationally cost usually required by this algorithm, the framework structures the feature selection process into a multi-step approach which combines different categories of data mining methods. Starting from a ranking process performed at the first step, the following steps detail a wrapper approach where a genetic algorithm is coupled with a classifier to explore different feature subspaces looking for optimal biomarkers. The thesis presents in detail the framework and its validation on popular datasets which are usually considered as benchmark by the research community. The competitive classification power of the framework has been carefully evaluated and empirically confirms the benefits of its adoption. As well, experimental results obtained by the proposed framework are comparable to those obtained by analogous literature proposals. Finally, the thesis contributes with additional experiments which confirm the framework applicability to the categorization of the subject matter of documents.
APA, Harvard, Vancouver, ISO, and other styles
28

Li, Miaoxin, and 李淼新. "Development of a bioinformatics and statistical framework to integratebiological resources for genome-wide genetic mapping and itsapplications." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2009. http://hub.hku.hk/bib/B43572030.

Full text
APA, Harvard, Vancouver, ISO, and other styles
29

Frithiof, Fredrik. "A framework for designing a modular muffler system by global optimization." Thesis, KTH, Optimeringslära och systemteori, 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-169650.

Full text
Abstract:
When creating a muffler to be installed on a noise-generating machine, the design parameters as well as the placement of the sound-attenuating elements have to be optimized in order to minimize the sound coming out of the assembly. This is exemplified in a small project task for students of a basic course in optimization at KTH. The task is, however, flawed: the way in which the optimization problem is formulated is overly simplistic, and the algorithm used to solve it, fmincon, does not cope well with the mathematical complexity of the model, so it gets stuck in a local optimum that is not a global optimum. This thesis investigates how to solve both of these problems. The model is modified to combine several frequencies and to weight them according to the frequency sensitivity of the human ear. In doing so, the objective is changed from maximizing the Dynamic Insertion Loss (DIL) at a specific frequency to minimizing the total perceived sound level LA. The model is based on the modular design of the transfer matrix method (TMM) from four-pole theory. This divides the muffler into separate parts, with each sound-attenuating element mathematically defined only by its T matrix. The element types to choose from are the Expansion Chamber, the Quarter Wave Resonator and the Helmholtz Resonator. The global optimization methods to choose from are Global Search, MultiStart, Genetic Algorithm, Pattern Search and Simulated Annealing. By combining the different types of sound-attenuating elements in every way and solving each case with every global optimization method, the best combination to implement in the model is chosen. The chosen configuration is two Quarter Wave Resonators solved with MultiStart, which provides satisfactory results. Further analysis is carried out to ensure the robustness of the chosen implementation and does not reveal any significant flaws. The purpose of this thesis is thereby fulfilled.
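As a loose illustration of the multi-start global optimisation idea (not the thesis's full TMM model), the sketch below tunes a single expansion chamber using the classical transmission-loss formula TL = 10*log10(1 + 0.25*(m - 1/m)^2 * sin^2(k*L)), restarting a bounded local solver from many random points and keeping the best result; the frequency band, bounds and parameter values are assumptions.

import numpy as np
from scipy.optimize import minimize

c = 343.0                                  # speed of sound in air [m/s]
freqs = np.linspace(100.0, 2000.0, 50)     # frequency band of interest [Hz]

def neg_mean_tl(x):
    # Negative mean transmission loss over the band (to be minimised).
    length, area_ratio = x
    k = 2.0 * np.pi * freqs / c
    tl = 10.0 * np.log10(1.0 + 0.25 * (area_ratio - 1.0 / area_ratio) ** 2
                         * np.sin(k * length) ** 2)
    return -tl.mean()

# Multi-start local optimisation: run a bounded local solver from many random
# starting points and keep the best result, mimicking the MultiStart idea.
rng = np.random.default_rng(1)
bounds = [(0.05, 1.0), (2.0, 20.0)]        # chamber length [m], area ratio [-]
best = None
for _ in range(25):
    x0 = [rng.uniform(*b) for b in bounds]
    res = minimize(neg_mean_tl, x0, bounds=bounds, method="L-BFGS-B")
    if best is None or res.fun < best.fun:
        best = res

print("best chamber length [m] and area ratio:", best.x, "mean TL [dB]:", -best.fun)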
APA, Harvard, Vancouver, ISO, and other styles
30

Teske, Alexander. "Automated Risk Management Framework with Application to Big Maritime Data." Thesis, Université d'Ottawa / University of Ottawa, 2018. http://hdl.handle.net/10393/38567.

Full text
Abstract:
Risk management is an essential tool for ensuring the safety and timeliness of maritime operations and transportation. Some of the many risk factors that can compromise the smooth operation of maritime activities include harsh weather and pirate activity. However, identifying and quantifying the extent of these risk factors for a particular vessel is not a trivial process. One challenge is that processing the vast amounts of automatic identification system (AIS) messages generated by ships requires significant computational resources. Another is that the risk management process partially relies on human expertise, which can be time-consuming and error-prone. In this thesis, an existing Risk Management Framework (RMF) is augmented to address these issues. A parallel/distributed version of the RMF is developed to efficiently process large volumes of AIS data and assess the risk levels of the corresponding vessels in near-real-time. A genetic fuzzy system is added to the RMF's Risk Assessment module in order to automatically learn the fuzzy rule base governing the risk assessment process, thereby reducing the reliance on human domain experts. A new weather risk feature is proposed, and an existing regional hostility feature is extended to automatically learn about pirate activity by ingesting unstructured news articles and incident reports. Finally, a geovisualization tool is developed to display the position and risk levels of ships at sea. Together, these contributions pave the way towards truly automatic risk management, a crucial component of modern maritime solutions. The outcomes of this thesis will contribute to enhancing Larus Technologies' Total::Insight, a risk-aware decision support system successfully deployed in maritime scenarios.
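To make the "genetic fuzzy system" idea concrete, the sketch below evolves the consequents of a tiny zero-order Sugeno rule base that maps weather severity and regional hostility to a risk score; the membership functions, the synthetic target labels and the GA settings are assumptions for illustration, not the rule base learned in the thesis.

import numpy as np

rng = np.random.default_rng(0)

def tri(x, a, b, c):
    # Triangular membership with corners a <= b <= c (shoulders handled explicitly).
    left = 1.0 if b == a else (x - a) / (b - a)
    right = 1.0 if c == b else (c - x) / (c - b)
    return float(np.clip(min(left, right), 0.0, 1.0))

CENTERS = [(0.0, 0.0, 0.5), (0.0, 0.5, 1.0), (0.5, 1.0, 1.0)]   # low / medium / high

def infer(weather, hostility, consequents):
    # Weighted-average (zero-order Sugeno) inference over the 3 x 3 rule grid.
    w = np.array([tri(weather, *p) * tri(hostility, *q) for p in CENTERS for q in CENTERS])
    return float(np.dot(w, consequents) / (w.sum() + 1e-9))

# Synthetic "expert" risk labels that the GA should reproduce (illustrative only).
samples = rng.random((100, 2))
targets = np.clip(0.6 * samples[:, 0] + 0.4 * samples[:, 1], 0.0, 1.0)

def fitness(consequents):
    preds = np.array([infer(x[0], x[1], consequents) for x in samples])
    return -np.mean((preds - targets) ** 2)            # higher is better

# Plain real-coded GA over the nine rule consequents.
pop = rng.random((30, 9))
for _ in range(40):
    scores = np.array([fitness(ind) for ind in pop])
    elite = pop[np.argsort(scores)[::-1][:10]]
    children = elite[rng.integers(0, 10, size=(20, 9)), np.arange(9)]   # uniform crossover
    children = np.clip(children + rng.normal(0.0, 0.05, children.shape), 0.0, 1.0)
    pop = np.vstack([elite, children])

best = pop[np.argmax([fitness(ind) for ind in pop])]
print("learned rule consequents (low-low ... high-high):", np.round(best, 2))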
APA, Harvard, Vancouver, ISO, and other styles
31

Cott, Andrew. "An examination of analysis and optimization procedures within a PBSD framework." Manhattan, Kan. : Kansas State University, 2009. http://hdl.handle.net/2097/2318.

Full text
APA, Harvard, Vancouver, ISO, and other styles
32

Bamba, Aliyu Siise Abdullah. "Beyond landraces : framework for the genetic improvement of Bambara groundnut (Vigna subterranea (L.) Verdc.) for global food security." Thesis, University of Nottingham, 2017. http://eprints.nottingham.ac.uk/38865/.

Full text
Abstract:
Over the past decades the world has gone through steady bio-physical and socio-demographic changes. Specifically, climate change (leading to increased temperature, salinization, drought, etc.), population growth, urbanization and migration (especially rural-to-urban migration), among others, have resulted in a new set of global challenges. In this regard, the threat to global food security has been recognised as one of the key challenges facing humanity in the 21st century. The Green Revolution of the 1960s was a major success in safeguarding global food security and still remains relevant. Nonetheless, it left behind some negative footprints, notably the 'erosion' of species diversity due to the monoculture cultivation systems it was primarily designed for. For this reason, there are serious concerns that the yield gains possible with the small number of 'major' staple crop species [mainly the cereals wheat (Triticum spp.), rice (Oryza sativa) and maize (Zea mays)] that have supported our food supply through the Green Revolution for the past four decades may not be enough to sustain a growing global population in the face of climate change. Recently, the view that a policy shift is needed in addressing global food security, away from the classical concept of the Green Revolution, has gained some acceptance. Against this backdrop, exploiting underutilised crop plants with an abundance of genetic resources and potentially beneficial traits is seen as one of the solutions that could provide a more diversified agricultural system and additional food sources. For this reason, Bambara groundnut [Vigna subterranea (L.) Verdc.] has provided a focus for exemplar studies, particularly in the developing world (Africa), through its ability to produce yields with minimal inputs in drought-prone environments. However, as is typical of most underutilised species that have suffered neglect within the research community, landraces selected by farmers remain the main source of planting material. These landrace collections in most cases may not possess the 'optimum combinations' of phenotypic traits desired by potential stakeholders interested in the crop (particularly consumers, processors and farmers). The ability to develop improved germplasm resources of Bambara groundnut (through controlled breeding) is an important step towards harnessing the potential of the crop to address food and nutritional security concerns. In this light, the need to establish a coordinated breeding programme for Bambara groundnut has gained some attention. As part of this initiative, this project reports: 1. the identification and critical analysis of core breeding objectives for Bambara groundnut that could be of particular importance to various stakeholders across the value chain of the crop, together with a conceptual breeding framework that could serve as a de facto guide for current and future breeding programmes; 2. diversity (genetic and phenotypic) and population structure analysis of the Bambara groundnut 'global germplasm', with emphasis on its implications for breeding programmes, highlighting potential implications for 'domestication theory' and presenting a conceptual framework of the population structure of Bambara groundnut indicating its utility and linkages for crop improvement programmes; 3. heritability and response-to-selection (genetic advance) estimates of phenotypic traits in F2 genotypes of Bambara groundnut; 4. the development of improved germplasm resources for trait analysis in Bambara groundnut (with potential for drought studies); and 5. mapping and QTL analysis of phenotypic traits in F2- and F3-derived genotypes of Bambara groundnut.
APA, Harvard, Vancouver, ISO, and other styles
33

Botero, Oscar. "Heterogeneous RFID framework design, analysis and evaluation." Phd thesis, Institut National des Télécommunications, 2012. http://tel.archives-ouvertes.fr/tel-00714120.

Full text
Abstract:
The Internet of Things paradigm establishes interaction and communication among a huge number of actors. The concept is not built from scratch; rather, it combines a vast number of technologies and protocols, together with adaptations of pre-existing elements, to offer new services and applications. One of the key technologies of the Internet of Things is Radio Frequency Identification (RFID). This technology provides a set of solutions that allow persons, animals and practically any item to be tracked and traced wirelessly. Within the Internet of Things concept, multiple technologies need to be linked in order to provide the interactions that lead to the implementation of services and applications. The challenge is that these technologies are not necessarily compatible or designed to work with one another. In this context, the main objective of this thesis is to design a heterogeneous framework that permits the interaction of diverse devices such as RFID devices, sensors and actuators in order to provide new applications and services. To this end, the first contribution of this work is the design and analysis of an integration architecture for heterogeneous devices. The second contribution proposes an evaluation model for RFID topologies and an optimization tool that assists in the RFID network planning process. Finally, in the last contribution, a simplified version of the framework is implemented on embedded hardware; performance metrics are provided, as well as the detailed configuration of the test platform.
APA, Harvard, Vancouver, ISO, and other styles
34

Hanlon, Nicholas P. "Simulation Research Framework with Embedded Intelligent Algorithms for Analysis of Multi-Target, Multi-Sensor, High-Cluttered Environments." University of Cincinnati / OhioLINK, 2016. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1460730865.

Full text
APA, Harvard, Vancouver, ISO, and other styles
35

Papadopoulou, Frantzeska. "Opening Pandora's Box : Exploring Flexibilities and Alternatives for Protecting Traditional Knowledge and Genetic Resources under the Intellectual Property Framework." Doctoral thesis, Stockholms universitet, Juridiska institutionen, 2014. http://urn.kb.se/resolve?urn=urn:nbn:se:su:diva-100568.

Full text
Abstract:
What happens when resources become valuable and scarce? How does intellectual property deal with market failures related to sub-patentable innovation or to purely traditional knowledge with interesting applications? The protection of traditional knowledge and genetic resources (TKGR) has been one of the major modern challenges in international IP law. The entry into force of the Convention on Biological Diversity (CBD) and its implementation in national legislation have created more questions than they answered. The objective of this dissertation is to assist in the evaluation of current national and regional implementation initiatives, as well as in the presentation and evaluation of different forms of entitlements that could be applicable in the case of TKGR. The dissertation employs a theoretical framework for this evaluation that combines the Coase Theorem and Rawls' theory of justice. The choice of these two theoretical models is not a random one. For an entitlement covering TKGR to be successful, it has to be efficient: it has to offer a stable and efficient marketplace where access to TKGR is possible without unnecessary friction. However, efficiency cannot be the only objective. An entitlement focusing solely on efficiency would fall short of the needs and special considerations of TKGR trade. It would, above all, run counter to the objectives and major principles of the CBD, the "fair and equitable sharing of the benefits", and would certainly fail to address the very important North-South perspective. Fairness is thus a necessary complement to the efficiency of the proposed entitlement. This dissertation undertakes a thorough investigation of the special characteristics of right-holders, subject matter and marketplace, as well as of the general expectations that an entitlement is supposed to fulfil. In parallel, it looks into the meaning and scope of alternative entitlements in order to propose the best alternative.
APA, Harvard, Vancouver, ISO, and other styles
36

Li, Miaoxin. "Development of a bioinformatics and statistical framework to integrate biological resources for genome-wide genetic mapping and its applications." Click to view the E-thesis via HKUTO, 2009. http://sunzi.lib.hku.hk/hkuto/record/B43572030.

Full text
APA, Harvard, Vancouver, ISO, and other styles
37

Agulló, Brotons Jonás César. "Studies on ecology, reproductive biology and genetic diversity of Helianthemum caput-felis Boiss. (Cistaceae). A framework for its conservation." Doctoral thesis, Universidad de Alicante, 2016. http://hdl.handle.net/10045/70958.

Full text
APA, Harvard, Vancouver, ISO, and other styles
38

Crain, Stacie M. "Designer Genes: An analysis of a theoretical framework for policy proposals in relation to genetic engineering as a reproductive technology." Thesis, Virginia Tech, 2003. http://hdl.handle.net/10919/44317.

Full text
Abstract:
With the new capabilities of genetic engineering and related biotechnologies come added considerations for policy makers. If gene therapy (or even embryo selection) becomes common practice, we must look not only to creating policies that protect the interests of individuals in the legal and social realms, but also to equality of opportunity in the genetic sense. This additional level brings with it much significance: one can argue that financial disparity is at least theoretically surmountable, but it is difficult to account for intentional genetic alterations that would forever give certain individuals a physical advantage over non-enhanced persons. It is within these new boundaries that genetic policy must create legislation; it is also within these new boundaries that policy will find its greatest hurdles. Given the ever-expanding field of biotechnology and gene therapy, one can hardly expect policy written today to be up to date ten, or even two, years from now. Instead of focusing on specific recommendations, therefore, I center my discussion on a broad framework that outlines the arguments that should be considered when dealing with genetic engineering and public policy. After creating a theoretical structure grounded in historical experience and the philosophical writings of John Rawls, I delve deeper into the actual possibilities created by genetic engineering and embryo selection. I further analyze the differences between positive and negative genetic interventions and discuss the consequences of these differences as they should (or should not) affect policy. This particular distinction and its implications for policy serve as the bulk of my discussion.
Master of Arts
APA, Harvard, Vancouver, ISO, and other styles
39

Cooper, Lisa Suzanne. "Population genomics of pollinating fig wasps and their natural enemies." Thesis, University of Edinburgh, 2018. http://hdl.handle.net/1842/28954.

Full text
Abstract:
The advent of next generation sequencing technologies has had a major impact on inference methods for population genetics. For example, community ecology studies can now assess species interactions using population history parameters estimated from genomic scale data. Figs and their pollinating fig wasps are obligate mutualists thought to have coevolved for some 75 million years. This relationship, along with additional interactions with many species of non-pollinating fig wasps (NPFW), makes this system an excellent model for studying multi-trophic community interactions. A common way of investigating the population histories of a community's component species is to use genetic markers to estimate demographic parameters such as divergence times and effective population sizes. The extent to which histories are congruent gives insights into the way in which the community has assembled. Because of coalescent variance, using thousands of loci from the genomes of a small number of individuals gives more statistical power and more realistic estimates of population parameters than previous methods using just a handful of loci from many individuals. In this thesis, I use genomic data from eleven fig wasp species, which are associated with three fig species located along the east coast of Australia, to characterise community assembly in this system. The first results chapter describes the laboratory and bioinformatic protocols required to generate genomic data from individual wasps, and assesses the level of genetic variation present across populations using simple summaries. The second results chapter presents a detailed demographic analysis of the pollinating fig wasp, Pleistodontes nigriventris. The inferences were made using a likelihood modelling framework and the pairwise sequentially Markovian coalescent (PSMC) method. The final results chapter characterises community assembly by assessing congruence between the population histories inferred for eight fig wasp species. The population histories were inferred using a new composite likelihood modelling framework. I conclude by discussing the implications of the results presented along with potential future directions for the research carried out in this thesis.
APA, Harvard, Vancouver, ISO, and other styles
40

Shakya, Siddhartha. "DEUM : a framework for an estimation of distribution algorithm based on Markov random fields." Thesis, Robert Gordon University, 2006. http://hdl.handle.net/10059/39.

Full text
Abstract:
Estimation of Distribution Algorithms (EDAs) belong to the class of population-based optimisation algorithms. They are motivated by the idea of discovering and exploiting the interaction between variables in the solution. They estimate a probability distribution from a population of solutions, and sample it to generate the next population. Many EDAs use probabilistic graphical modelling techniques for this purpose; in particular, directed graphical models (Bayesian networks) have been widely used. This thesis proposes an undirected graphical model (Markov Random Field, MRF) approach to estimating and sampling the distribution in EDAs. The interaction between variables in the solution is modelled as an undirected graph, and the joint probability of a solution is factorised as a Gibbs distribution. The thesis describes a model of the fitness function that approximates the energy in the Gibbs distribution, and shows how this model can be fitted to a population of solutions to estimate the parameters of the MRF. The estimated MRF is then sampled to generate the next population. This approach is applied to distribution estimation in a general EDA framework called Distribution Estimation using Markov Random Fields (DEUM). The thesis then proposes several variants of DEUM using different sampling techniques and tests their performance on a range of optimisation problems. The results show that, for most of the tested problems, the DEUM algorithms significantly outperform other EDAs, both in terms of the number of fitness evaluations and the quality of the solutions found. There are two main explanations for the success of DEUM algorithms. Firstly, DEUM builds a model of the fitness function to approximate the MRF, in contrast with other EDAs, which build a model of selected solutions; this allows DEUM to use fitness in the variation part of evolution. Secondly, DEUM exploits the temperature coefficient in the Gibbs distribution to regulate the behaviour of the algorithm: with higher temperature the distribution is closer to uniform, and with lower temperature it concentrates near some global optima. This gives DEUM explicit control over the convergence of the algorithm, resulting in better optimisation.
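A rough univariate sketch of the DEUM idea, under simplifying assumptions (a OneMax-style problem, a linear energy model, an ad hoc cooling schedule), is given below: a linear energy U(x) = a0 + sum_i a_i*s_i with s_i in {-1, +1} is fitted by least squares to -ln(fitness) of the fitter half of the population, and the next population is sampled from the resulting Gibbs marginals p(s_i = +1) = 1 / (1 + exp(2*beta*a_i)). The details differ from the thesis's actual algorithms.

import numpy as np

rng = np.random.default_rng(0)
n_bits, pop_size, n_gen = 40, 100, 50

def fitness(bits):
    return bits.sum(axis=1) + 1.0               # OneMax; +1 keeps ln() well defined

pop = rng.integers(0, 2, size=(pop_size, n_bits))
beta = 0.5                                       # inverse temperature
for _ in range(n_gen):
    # Keep the better half of the population for model fitting.
    sel = pop[np.argsort(fitness(pop))[::-1][: pop_size // 2]]
    spins = 2 * sel - 1                          # map {0, 1} -> {-1, +1}
    A = np.hstack([np.ones((len(sel), 1)), spins])
    alpha, *_ = np.linalg.lstsq(A, -np.log(fitness(sel)), rcond=None)
    # Univariate Gibbs marginal for each bit, then sample the next population.
    p_one = 1.0 / (1.0 + np.exp(2.0 * beta * alpha[1:]))
    pop = (rng.random((pop_size, n_bits)) < p_one).astype(int)
    beta *= 1.05                                 # gradual cooling sharpens the distribution

print("best fitness found:", fitness(pop).max(), "(maximum is", n_bits + 1.0, ")")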
APA, Harvard, Vancouver, ISO, and other styles
41

Sichtig, Heike. "The SGE framework discovering spatio-temporal patterns in biological systems with spiking neural networks (S), a genetic algorithm (G) and expert knowledge (E) /." Diss., Online access via UMI:, 2009.

Find full text
Abstract:
Thesis (Ph. D.)--State University of New York at Binghamton, Thomas J. Watson School of Engineering and Applied Science, Department of Bioengineering, Biomedical Engineering, 2009.
Includes bibliographical references.
APA, Harvard, Vancouver, ISO, and other styles
42

Warshawsky, David. "A system of systems flexibility framework: A method for evaluating designs that are subjected to disruptions." Diss., Georgia Institute of Technology, 2015. http://hdl.handle.net/1853/54277.

Full text
Abstract:
As systems become more interconnected, the focus of engineering design must shift to include consideration for systems of systems (SoS) effects. As the focus shifts from singular systems to systems of systems, so too must the focus shift from performance-based analysis to an evaluation method that accounts for the tendency of such large-scale systems to far outlive their original operational environments and continually evolve in order to adapt to the changes. It is nearly impossible to predict the nature of these changes; therefore, the first focus of this thesis is the measurement of the flexibility of the SoS and its ability to evolve and adapt. Flexibility is measured using a combination of network theory and a discrete event simulation; therefore, the second focus is the development of a simulation environment that can also measure the system's performance for baseline comparisons. The results indicate that simulated flexibility is related to the performance and cost of the SoS and is worth measuring during the design process. The third focus of this thesis is to reduce the computational costs of SoS design evaluation by developing heuristics for flexibility. This was done by developing a network model to correspond with the discrete event simulation and evaluating network properties using graph theory. It was shown that the network properties can correlate with simulated flexibility. In such cases it was shown that the heuristics could be used in connection with an evolutionary algorithm to rapidly search the design space for good solutions. The entire methodology was demonstrated on a multi-platform maintenance planning problem in connection with the Navy Hardware Open System Technologies initiative.
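As an illustration of using a cheap graph-theoretic property as a heuristic fitness inside an evolutionary search (an assumption-laden toy, not the dissertation's model), the sketch below evolves an 8-node SoS communication topology, rewarding algebraic connectivity (the Fiedler value of the graph Laplacian) and penalising the number of links as a proxy for cost.

import itertools
import numpy as np

rng = np.random.default_rng(0)
n_nodes = 8
pairs = list(itertools.combinations(range(n_nodes), 2))   # candidate links

def fiedler_value(bits):
    # Second-smallest Laplacian eigenvalue; 0 when the topology is disconnected.
    adj = np.zeros((n_nodes, n_nodes))
    for (i, j), b in zip(pairs, bits):
        adj[i, j] = adj[j, i] = b
    lap = np.diag(adj.sum(axis=1)) - adj
    return np.sort(np.linalg.eigvalsh(lap))[1]

def fitness(bits):
    # Reward well-connected topologies, penalise link count (a proxy for cost).
    return fiedler_value(bits) - 0.1 * bits.sum()

pop = (rng.random((30, len(pairs))) < 0.3).astype(int)
for _ in range(40):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)[::-1][:10]]                     # truncation selection
    children = parents[rng.integers(0, 10, size=(20, len(pairs))), np.arange(len(pairs))]
    children ^= (rng.random(children.shape) < 0.02).astype(int)      # bit-flip mutation
    pop = np.vstack([parents, children])

best = max(pop, key=fitness)
print("links in best topology:", [p for p, b in zip(pairs, best) if b])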
APA, Harvard, Vancouver, ISO, and other styles
43

Kuti, Temitope Babatunde. "Towards effective multilateral protection of traditional knowledge within the global intellectual property framework." University of the Western Cape, 2017. http://hdl.handle.net/11394/6339.

Full text
Abstract:
Magister Legum - LLM (Mercantile and Labour Law)
Traditional Knowledge (TK) has previously been considered a 'subject' in the public domain, unworthy of legal protection. However, the last few decades have witnessed increased discussions on the need to protect the knowledge of indigenous peoples for their economic sustenance, the conservation of biodiversity and modern scientific innovation. Questions remain as to how TK can best be protected through existing, adapted or sui generis legal frameworks. Based on an examination of the formal knowledge-protection mechanisms (i.e. the existing intellectual property system), this mini-thesis contends that these existing systems are inadequate for protecting TK. As a matter of fact, they serve as veritable platforms for incidences of biopiracy. It further argues that the many international initiatives designed to protect TK have so far failed owing to inherent shortcomings embedded in them. Furthermore, a comparative assessment of several national initiatives (in New Zealand, South Africa and Kenya) supports an understanding that several domestic efforts to protect TK have been rendered ineffective due to the insurmountable challenge of dealing with the international violations of local TK rights. It is therefore important that on-going international negotiations for the protection of TK, including the negotiations within the World Intellectual Property Organisation's Intergovernmental Committee on Intellectual Property and Genetic Resources, Traditional Knowledge and Folklore (IGC), do not adopt similar approaches to those employed in previous initiatives if TK must be efficiently and effectively protected. This mini-thesis concludes that indigenous peoples possess peculiar protection mechanisms for their TK within the ambit of their customary legal systems and that these indigenous mechanisms are the required anchors for effective global protections.
APA, Harvard, Vancouver, ISO, and other styles
44

Kuti, Temitope Babatunde. "Towards effective Multilateral protection of traditional knowledge within the global intellectual property framework." University of the Western Cape, 2018. http://hdl.handle.net/11394/6245.

Full text
Abstract:
Magister Legum - LLM (Mercantile and Labour Law)
Traditional Knowledge (TK) has previously been considered a 'subject' in the public domain, unworthy of legal protection. However, the last few decades have witnessed increased discussions on the need to protect the knowledge of indigenous peoples for their economic sustenance, the conservation of biodiversity and modern scientific innovation. Questions remain as to how TK can best be protected through existing, adapted or sui generis legal frameworks. Based on an examination of the formal knowledge-protection mechanisms (i.e. the existing intellectual property system), this mini-thesis contends that these existing systems are inadequate for protecting TK. As a matter of fact, they serve as veritable platforms for incidences of biopiracy. It further argues that the many international initiatives designed to protect TK have so far failed owing to inherent shortcomings embedded in them. Furthermore, a comparative assessment of several national initiatives (in New Zealand, South Africa and Kenya) supports an understanding that several domestic efforts to protect TK have been rendered ineffective due to the insurmountable challenge of dealing with the international violations of local TK rights. It is therefore important that on-going international negotiations for the protection of TK, including the negotiations within the World Intellectual Property Organisation's Intergovernmental Committee on Intellectual Property and Genetic Resources, Traditional Knowledge and Folklore (IGC), do not adopt similar approaches to those employed in previous initiatives if TK must be efficiently and effectively protected. This mini-thesis concludes that indigenous peoples possess peculiar protection mechanisms for their TK within the ambit of their customary legal systems and that these indigenous mechanisms are the required anchors for effective global protections.
APA, Harvard, Vancouver, ISO, and other styles
45

Puncher, Gregory Neils <1980&gt. "Assessment of the population structure and temporal changes in spatial dynamics and genetic characteristics of the Atlantic bluefin tuna under a fishery independent framework." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2015. http://amsdottorato.unibo.it/7227/4/Puncher_Gregory_Neils_tesi.pdf.

Full text
Abstract:
As a large and long-lived species with high economic value, restricted spawning areas and short spawning periods, the Atlantic bluefin tuna (BFT; Thunnus thynnus) is particularly susceptible to over-exploitation. Although BFT have been targeted by fisheries in the Mediterranean Sea for thousands of years, it is only in the last decades that the exploitation rate has reached far beyond sustainable levels. An understanding of the population structure, spatial dynamics, exploitation rates and the environmental variables that affect BFT is crucial for the conservation of the species. The aims of this PhD project were to 1) assess the accuracy of larval identification methods, 2) determine the genetic structure of modern BFT populations, 3) assess the self-recruitment rate in the Gulf of Mexico and Mediterranean spawning areas, 4) estimate the immigration rate of BFT to feeding aggregations from the various spawning areas, and 5) develop tools capable of investigating the temporal stability of population structuring in the Mediterranean Sea. Several weaknesses of modern morphology-based taxonomy are reviewed, including the demographic decline of expert taxonomists, flawed identification keys, the reluctance of the taxonomic community to embrace advances in digital communications, and a general scarcity of modern user-friendly materials. Barcoding of scombrid larvae revealed important differences in the accuracy of the taxonomic identifications carried out by different ichthyoplanktologists following morphology-based methods. Using a genotyping-by-sequencing approach, a panel of 95 SNPs was developed and used to characterize the population structuring of BFT and the composition of adult feeding aggregations. Using novel molecular techniques, DNA was extracted from bluefin tuna vertebrae excavated from late Iron Age and ancient Roman settlements, from Byzantine-era Constantinople, and from a 20th-century collection. A second panel of 96 SNPs was developed to genotype historical and modern samples in order to elucidate changes in population structuring and in the allele frequencies of loci associated with selective traits.
APA, Harvard, Vancouver, ISO, and other styles
46

Puncher, Gregory Neils <1980&gt. "Assessment of the population structure and temporal changes in spatial dynamics and genetic characteristics of the Atlantic bluefin tuna under a fishery independent framework." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2015. http://amsdottorato.unibo.it/7227/.

Full text
Abstract:
As a large and long-lived species with high economic value, restricted spawning areas and short spawning periods, the Atlantic bluefin tuna (BFT; Thunnus thynnus) is particularly susceptible to over-exploitation. Although BFT have been targeted by fisheries in the Mediterranean Sea for thousands of years, it is only in the last decades that the exploitation rate has reached far beyond sustainable levels. An understanding of the population structure, spatial dynamics, exploitation rates and the environmental variables that affect BFT is crucial for the conservation of the species. The aims of this PhD project were to 1) assess the accuracy of larval identification methods, 2) determine the genetic structure of modern BFT populations, 3) assess the self-recruitment rate in the Gulf of Mexico and Mediterranean spawning areas, 4) estimate the immigration rate of BFT to feeding aggregations from the various spawning areas, and 5) develop tools capable of investigating the temporal stability of population structuring in the Mediterranean Sea. Several weaknesses of modern morphology-based taxonomy are reviewed, including the demographic decline of expert taxonomists, flawed identification keys, the reluctance of the taxonomic community to embrace advances in digital communications, and a general scarcity of modern user-friendly materials. Barcoding of scombrid larvae revealed important differences in the accuracy of the taxonomic identifications carried out by different ichthyoplanktologists following morphology-based methods. Using a genotyping-by-sequencing approach, a panel of 95 SNPs was developed and used to characterize the population structuring of BFT and the composition of adult feeding aggregations. Using novel molecular techniques, DNA was extracted from bluefin tuna vertebrae excavated from late Iron Age and ancient Roman settlements, from Byzantine-era Constantinople, and from a 20th-century collection. A second panel of 96 SNPs was developed to genotype historical and modern samples in order to elucidate changes in population structuring and in the allele frequencies of loci associated with selective traits.
APA, Harvard, Vancouver, ISO, and other styles
47

Ayo, Babatope S. "Data-driven flight path rerouting during adverse weather: Design and development of a passenger-centric model and framework for alternative flight path generation using nature inspired techniques." Thesis, University of Bradford, 2018. http://hdl.handle.net/10454/17387.

Full text
Abstract:
A major factor that negatively impacts flight operations globally is adverse weather. To reduce the impact of adverse weather, avoidance procedures such as finding an alternative flight path can usually be carried out. However, such procedures usually introduce extra costs such as flight delay. Hence, there is a need for alternative flight paths that efficiently avoid adverse weather regions while minimising costs. Existing weather avoidance methods use techniques, such as Dijkstra's algorithm and artificial potential fields, that do not scale adequately and have poor real-time performance, and they do not adequately consider the impact of weather and its avoidance on passengers. The contributions of this work include the development of an improved integrated model for weather avoidance that addresses the impact of weather on passengers by defining a corresponding cost metric, while simultaneously considering other costs such as flight delay and fuel burn. A genetic algorithm (GA)-based rerouting technique that generates optimised alternative flight paths is proposed; it uses a modified mutation strategy to improve global search. A discrete firefly algorithm-based rerouting method is also developed to improve rerouting efficiency. A data framework and simulation platform that integrates aeronautical, weather and flight data into the avoidance process is developed. Results show that the developed algorithms and model produce flight paths with lower total costs than existing techniques, and that the proposed algorithms have adequate rerouting performance in complex airspace scenarios. The developed system also adequately avoids the paths of multiple aircraft in the considered airspace.
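The sketch below is an assumed, simplified version of GA-based weather rerouting on a coarse grid: each chromosome is one latitude index per longitude step, the cost combines a random weather-penalty map with path deviation as a rough proxy for delay and fuel, and the "segment shift" operator only loosely mirrors the idea of a modified mutation strategy; none of it is the thesis's actual model or operator.

import numpy as np

rng = np.random.default_rng(0)
n_cols, n_rows = 20, 15
weather = rng.random((n_rows, n_cols)) ** 3 * 10.0      # a few high-cost storm cells
start_row, end_row = 7, 7

def cost(path):
    lat = np.concatenate([[start_row], path, [end_row]])
    deviation = np.abs(np.diff(lat)).sum()               # proxy for extra distance/delay
    storm = weather[path, np.arange(1, n_cols - 1)].sum()
    return storm + 0.5 * deviation

def mutate(path):
    child = path.copy()
    if rng.random() < 0.5:                                # ordinary point mutation
        child[rng.integers(len(child))] = rng.integers(n_rows)
    else:                                                 # segment shift ("modified" mutation)
        i, j = sorted(rng.integers(0, len(child), 2))
        child[i:j + 1] = np.clip(child[i:j + 1] + rng.integers(-2, 3), 0, n_rows - 1)
    return child

pop = [rng.integers(0, n_rows, n_cols - 2) for _ in range(40)]
for _ in range(150):
    pop.sort(key=cost)
    parents = pop[:15]                                    # truncation selection
    children = []
    for _ in range(25):
        a, b = rng.integers(0, 15, 2)
        cut = int(rng.integers(1, n_cols - 2))
        children.append(mutate(np.concatenate([parents[a][:cut], parents[b][cut:]])))
    pop = parents + children

best = min(pop, key=cost)
print("best reroute cost:", round(float(cost(best)), 2), "latitude indices:", best)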
APA, Harvard, Vancouver, ISO, and other styles
48

Botha, Marthinus Ignatius. "Modelling and simulation framework incorporating redundancy and failure probabilities for evaluation of a modular automated main distribution frame." Diss., University of Pretoria, 2013. http://hdl.handle.net/2263/33345.

Full text
Abstract:
Maintaining and operating manual main distribution frames is labour-intensive. As a result, Automated Main Distribution Frames (AMDFs) have been developed to alleviate the task of maintaining subscriber loops. Commercial AMDFs are currently employed in telephone exchanges in some parts of the world. However, the most significant factors limiting their widespread adoption are cost-effective scalability and reliability. Therefore, there is a compelling incentive to create a simulation framework in order to explore typical implementations and scenarios. Such a framework allows the evaluation and optimisation of a design in terms of both internal and external redundancies. One approach to improving system performance, such as system reliability, is to allocate the optimal redundancy to all or some components in a system. Redundancy at the system or component level can be implemented in one of two schemes: parallel redundancy or standby redundancy. It is also possible to mix these schemes for various components, and the redundant elements may or may not be of the same type. If all the redundant elements are of different types, the redundancy optimisation model is implemented with component mixing; conversely, if all the redundant components are identical, the model is implemented without component mixing. The developed framework can be used both to develop new AMDF architectures and to evaluate existing AMDF architectures in terms of expected lifetimes, reliability and service availability. Two simulation models are presented. The first is concerned with optimising central office equipment within a telephone exchange and entails an environment of clients utilising services; currently, such a model does not exist. The second is a mathematical model incorporating stochastic simulation and a hybrid intelligent evolutionary algorithm to solve redundancy allocation problems. For the first model, the optimal partitioning of the model is determined in order to speed up the simulation. For the second model, the hybrid intelligent algorithm is used to solve the redundancy allocation problem under various constraints. Finally, a candidate concept design of an AMDF is presented and evaluated with both simulation models.
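As a toy illustration of a redundancy allocation problem of the kind described above (parallel redundancy with identical components per subsystem, a soft cost budget, a plain GA), the sketch below chooses how many redundant units to place in each series subsystem; the component data, penalty weight and GA settings are assumptions, not the dissertation's model.

import numpy as np

rng = np.random.default_rng(0)
r = np.array([0.80, 0.90, 0.85, 0.75])      # single-component reliabilities
c = np.array([3.0, 5.0, 4.0, 2.0])          # single-component costs
budget, max_units = 40.0, 5

def system_reliability(n):
    return np.prod(1.0 - (1.0 - r) ** n)    # series system of parallel groups

def fitness(n):
    over = max(0.0, float(np.dot(c, n)) - budget)   # soft budget constraint
    return system_reliability(n) - 0.05 * over

pop = rng.integers(1, max_units + 1, size=(30, len(r)))
for _ in range(80):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)[::-1][:10]]                      # truncation selection
    children = parents[rng.integers(0, 10, size=(20, len(r))), np.arange(len(r))]
    step = rng.integers(-1, 2, size=children.shape) * (rng.random(children.shape) < 0.2)
    children = np.clip(children + step, 1, max_units)                 # integer mutation
    pop = np.vstack([parents, children])

best = max(pop, key=fitness)
print("units per subsystem:", best,
      "reliability:", round(float(system_reliability(best)), 4),
      "cost:", float(np.dot(c, best)))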
Dissertation (MEng)--University of Pretoria, 2013.
gm2014
Electrical, Electronic and Computer Engineering
unrestricted
APA, Harvard, Vancouver, ISO, and other styles
49

Bukáček, Jan. "Implementace evolučního expertního systému." Master's thesis, Vysoké učení technické v Brně. Fakulta elektrotechniky a komunikačních technologií, 2010. http://www.nusl.cz/ntk/nusl-218352.

Full text
Abstract:
This thesis deals with evolutionary and genetic algorithms, in particular the multi-objective algorithms VEGA, SPEA and NSGA-II, and with WWW NIMBUS, one of the frameworks for working with genetic algorithms. From the algorithms mentioned, VEGA was selected for implementation in Java and applied to a predefined problem: selecting column profiles (cross-sections) according to predetermined criteria. The algorithm divides the population into several groups, and each group is evaluated by its own fitness function. A sample implementation of the algorithm is presented, together with an example of working with the framework. The results produced by the implemented program are then compared with those obtained with WWW NIMBUS; VEGA and NIMBUS yield different results. For VEGA, the evolution of the individual fitness functions is also shown, as are the graphs that can be obtained from NIMBUS. The thesis concludes with a comparison of the results and proposals for possible improvements.
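A minimal sketch of the VEGA scheme on Schaffer's classic two-objective test problem (f1(x) = x^2, f2(x) = (x - 2)^2) is given below: each objective selects an equal share of the mating pool using only its own fitness, and the shuffled pool is then recombined and mutated. The test problem, operators and settings are illustrative assumptions, not the thesis's column-profile application.

import numpy as np

rng = np.random.default_rng(0)
objectives = [lambda x: x ** 2, lambda x: (x - 2.0) ** 2]
pop_size, n_gen = 60, 80
pop = rng.uniform(-5.0, 5.0, pop_size)

for _ in range(n_gen):
    # Selection: each objective fills an equal share of the mating pool by
    # binary tournaments on that single objective (VEGA's key idea).
    pool = []
    share = pop_size // len(objectives)
    for f in objectives:
        for _ in range(share):
            a, b = rng.choice(pop_size, 2, replace=False)
            pool.append(pop[a] if f(pop[a]) < f(pop[b]) else pop[b])
    pool = rng.permutation(np.array(pool))
    # Variation: arithmetic crossover between neighbours plus occasional Gaussian mutation.
    children = 0.5 * (pool + np.roll(pool, 1))
    children += rng.normal(0.0, 0.1, pop_size) * (rng.random(pop_size) < 0.3)
    pop = np.clip(children, -5.0, 5.0)

front = sorted(zip(objectives[0](pop), objectives[1](pop)))
print("sample of approximated Pareto front (f1, f2):",
      [tuple(np.round(p, 3)) for p in front[:5]])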
APA, Harvard, Vancouver, ISO, and other styles
50

Gholizadeh, Tayyar Shadan. "An optimization-based framework for concurrent planning of multiple projects and supply chain : application on building thermal renovation projects." Thesis, Ecole nationale des Mines d'Albi-Carmaux, 2017. http://www.theses.fr/2017EMAC0006/document.

Full text
Abstract:
The application context of this study is the CRIBA project, which aims to industrialize an integrated solution for the insulation and thermal renovation of building complexes in France. As a result, a significant part of the added value is transferred from the renovation sites to the manufacturing centers, which must be synchronized with the worksites. Planning is one of the important steps in project management. Depending on the viewpoint of the organization, successful project planning means performing optimally with respect to time, cost and quality while assigning resources efficiently. Planning the allocation of resources becomes more complex when a set of projects shares renewable and non-renewable resources. The global objective of the study is to develop a decision-making tool that allows decision-makers to plan multiple projects by integrating the allocation of renewable resources and the planning of the flow of non-renewable resources to the project worksites. In this context, non-renewable resources such as equipment and labor have a limited initial availability at the construction sites; nevertheless, we assume that additional limited amounts can be purchased. In addition, we take into account the interest of project coordinators in supplying non-renewable resources to the projects in a just-in-time manner, especially for low-demand, bulky and high-value resources. This requires extending the project planning framework to include the planning of the supply chain that delivers the non-renewable resources to the worksites. Finally, in order to meet the requirements for environmentally responsible decision-making, the model envisages the transportation and recycling of waste from project sites to appropriate centers. A mixed integer linear model of the problem is proposed. Since it falls within the class of NP-hard optimization models, a double resolution is targeted: first using a solver, and then a metaheuristic based on a genetic algorithm. In addition, to facilitate the use of the model by users unfamiliar with operational research, a web-based decision support system has been developed. All the contributions are evaluated on test cases from the CRIBA project.
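As a purely illustrative toy (not the thesis's MILP formulation), the sketch below uses PuLP to plan purchases and just-in-time deliveries of a single non-renewable resource to two worksites over four periods, minimising purchase and inventory-holding costs while meeting each site's demand; all data, costs and variable names are assumptions.

import pulp

periods, sites = range(4), ["site_A", "site_B"]
demand = {"site_A": [0, 3, 5, 2], "site_B": [2, 2, 0, 4]}
purchase_cost, holding_cost, initial_stock = 10.0, 1.5, 2

prob = pulp.LpProblem("supply_planning", pulp.LpMinimize)
buy = pulp.LpVariable.dicts("buy", periods, lowBound=0, cat="Integer")
ship = pulp.LpVariable.dicts("ship", [(s, t) for s in sites for t in periods],
                             lowBound=0, cat="Integer")
stock = pulp.LpVariable.dicts("stock", periods, lowBound=0)

# Objective: purchase cost plus inventory-holding cost at the central depot.
prob += pulp.lpSum(purchase_cost * buy[t] + holding_cost * stock[t] for t in periods)
for t in periods:
    inflow = (initial_stock if t == 0 else stock[t - 1]) + buy[t]
    prob += stock[t] == inflow - pulp.lpSum(ship[(s, t)] for s in sites)
    for s in sites:
        prob += ship[(s, t)] == demand[s][t]   # just-in-time: deliver exactly on demand

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print("status:", pulp.LpStatus[prob.status])
print("purchases per period:", [int(buy[t].value()) for t in periods])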
APA, Harvard, Vancouver, ISO, and other styles