To see the other types of publications on this topic, follow the link: Biological framework.

Dissertations / Theses on the topic 'Biological framework'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 50 dissertations / theses for your research on the topic 'Biological framework.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse dissertations / theses on a wide variety of disciplines and organise your bibliography correctly.

1

Tagger, B. "A framework for the management of changing biological experimentation." Thesis, University College London (University of London), 2010. http://discovery.ucl.ac.uk/147616/.

Full text
Abstract:
There is no point expending time and effort developing a model if it is based on data that is out of date. Many models require large amounts of data from a variety of heterogeneous sources, and this data is subject to frequent and unannounced changes. It may only be possible to know that data has fallen out of date by reconstructing the model with the new data, but this leads to further problems: how and when does the data change, and when does the model need to be rebuilt? At best, the model will need to be continually rebuilt in a desperate attempt to remain current. At worst, the model will be producing erroneous results. The recent advent of automated and semi-automated data-processing and analysis tools in the biological sciences has brought about a rapid expansion of publicly available data. Many problems arise in the attempt to deal with this magnitude of data; some have received more attention than others. One significant problem is that data within these publicly available databases is subject to change in an unannounced and unpredictable manner. Large amounts of complex data from multiple, heterogeneous sources are obtained and integrated using a variety of tools. These data and tools are also subject to frequent change, much like the biological data. Reconciling these changes, coupled with the interdisciplinary nature of in silico biological experimentation, presents a significant problem. We present the ExperimentBuilder, an application that records both the current and previous states of an experimental environment. Both the data and metadata about an experiment are recorded. The current and previous versions of each of these experimental components are maintained within the ExperimentBuilder. When any one of these components changes, the ExperimentBuilder estimates not only the impact within that specific experiment, but also traces the impact throughout the entire experimental environment.
This is achieved with the use of keyword profiles, a heuristic tool for estimating the content of the experimental component. We can compare one experimental component to another regardless of their type and content and build a network of inter-component relationships for the entire environment. Ultimately, we can present the impact of an update as a complete cost to the entire environment in order to make an informed decision about whether to recalculate our results.
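The thesis text does not include the ExperimentBuilder implementation; purely as an illustration of the idea, a keyword profile can be modelled as a keyword-to-weight mapping and components compared with cosine similarity. All names, weights, and the threshold below are hypothetical:

```python
import math

def cosine_similarity(profile_a, profile_b):
    """Compare two keyword profiles, each a keyword -> weight dict."""
    dot = sum(w * profile_b[k] for k, w in profile_a.items() if k in profile_b)
    norm_a = math.sqrt(sum(w * w for w in profile_a.values()))
    norm_b = math.sqrt(sum(w * w for w in profile_b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def update_cost(changed, components, threshold=0.2):
    """Estimate the environment-wide cost of updating one component as the
    sum of its similarities to every sufficiently related component."""
    cost = 0.0
    for name, profile in components.items():
        if name == changed:
            continue
        s = cosine_similarity(components[changed], profile)
        if s >= threshold:
            cost += s
    return cost
```

A network of inter-component relationships then falls out of applying `cosine_similarity` pairwise, regardless of component type.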
APA, Harvard, Vancouver, ISO, and other styles
2

Linsley, Drew. "A revised framework for human scene recognition." Thesis, Boston College, 2016. http://hdl.handle.net/2345/bc-ir:106986.

Full text
Abstract:
Thesis advisor: Sean P. MacEvoy
For humans, healthy and productive living depends on navigating through the world and behaving appropriately along the way. But in order to do this, humans must first recognize their visual surroundings. The technical difficulty of this task is hard to comprehend: the number of possible scenes that can fall on the retina approaches infinity, and yet humans often effortlessly and rapidly recognize their surroundings. Understanding how humans accomplish this task has long been a goal of psychology and neuroscience, and more recently, has proven useful in inspiring and constraining the development of new algorithms for artificial intelligence (AI). In this thesis I begin by reviewing the current state of scene recognition research, drawing upon evidence from each of these areas, and discussing an unchallenged assumption in the literature: that scene recognition emerges from independently processing information about scenes’ local visual features (i.e., the kinds of objects they contain) and global visual features (i.e., spatial parameters). Over the course of several projects, I challenge this assumption with a new framework for scene recognition that indicates a crucial role for information sharing between these resources. Development and validation of this framework will expand our understanding of scene recognition in humans and provide new avenues for research by extending these concepts to other domains spanning psychology, neuroscience, and AI.
Thesis (PhD) — Boston College, 2016
Submitted to: Boston College. Graduate School of Arts and Sciences
Discipline: Psychology
APA, Harvard, Vancouver, ISO, and other styles
3

Keane, John F. "A framework for molecular signal processing and detection in biological cells /." Thesis, Connect to this title online; UW restricted, 2004. http://hdl.handle.net/1773/6126.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Hwang, Daehee 1971. "A statistical framework for extraction of structured knowledge from biological/biotechnological systems." Thesis, Massachusetts Institute of Technology, 2003. http://hdl.handle.net/1721.1/29603.

Full text
Abstract:
Thesis (Sc. D.)--Massachusetts Institute of Technology, Dept. of Chemical Engineering, 2003.
Includes bibliographical references (leaves 203-215).
Despite enormous efforts to understand complex biological/biotechnological systems, a significant amount of knowledge still remains to be unraveled. However, recent advances in high-throughput technologies have offered new opportunities to understand these complex systems by providing us with huge amounts of data about them. Unlike traditional tools, these high-throughput detection tools: (1) permit large-scale screening of formulations to find the optimal condition, and (2) provide us with a global scale of measurement for a given system. Thus, there has been a strong need for computational tools that effectively extract useful knowledge about systems behavior from the vast amount of data. This thesis presents a comprehensive set of computational tools that enables us to extract important information (called structured knowledge) from this huge amount of data to improve our understanding of biological and biotechnological systems. Then, in several case studies, this extracted knowledge is used to optimize these systems. These tools include: (1) optimal design of experiments (DOE) for efficient investigation of systems, and (2) various statistical methods for effective analyses of the data to capture all structured knowledge in the data. These tools have been applied to various biological and biotechnological systems for identification of: (1) discriminatory characteristics for several diseases from gene expression data to construct disease classifiers; (2) rules to improve plasma absorption of drugs from high-throughput screening data; (3) binding rules of epitopes to MHC molecules from binding assay data to artificially activate immune responses involving these MHC molecules; (4) rules for pre-conditioning and plasma supplementation from metabolic profiling data to improve the bio-artificial liver (BAL) device;
(cont.) (5) rules to facilitate protein crystallization from high-throughput screening data to find the optimal condition for crystallization; (6) a new clinical index from metabolic profiling of serum data to improve the diagnostic resolution of liver failure. The results from these applications demonstrate that the developed tools successfully extracted important information about systems behavior from various high-throughput data and suggested rules to improve systems performance. In the first case study, the statistical methods helped us identify a drug target for multiple sclerosis through analyses of gene expression data and then facilitated finding a peptide drug to inhibit the drug target. In the fifth case study, the methodology enabled us to find large protein crystals for several test proteins that are difficult to crystallize. The rules identified from the other case studies are being validated for improvement of the systems behavior.
by Daehee Hwang.
Sc.D.
APA, Harvard, Vancouver, ISO, and other styles
5

Yates, Phillip. "An Inferential Framework for Network Hypothesis Tests: With Applications to Biological Networks." VCU Scholars Compass, 2010. http://scholarscompass.vcu.edu/etd/2200.

Full text
Abstract:
The analysis of weighted co-expression gene sets is gaining momentum in systems biology. In addition to substantial research directed toward inferring co-expression networks on the basis of microarray/high-throughput sequencing data, inferential methods are being developed to compare gene networks across one or more phenotypes. Common gene set hypothesis testing procedures are mostly confined to comparing average gene/node transcription levels between one or more groups and make limited use of additional network features, e.g., edges induced by significant partial correlations. Ignoring the gene set architecture disregards relevant network topological comparisons and can result in familiar n
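As one illustrative sketch of the kind of network hypothesis test described here (not the author's actual procedure), a permutation test can compare a summary statistic, here simply the mean edge weight, between two groups of networks; the statistic, group sizes, and permutation count below are all hypothetical:

```python
import random

def permutation_pvalue(weights_a, weights_b, n_perm=2000, seed=0):
    """Two-sample permutation test on a network summary statistic
    (mean edge weight) between two phenotype groups."""
    rng = random.Random(seed)
    mean = lambda xs: sum(xs) / len(xs)
    observed = abs(mean(weights_a) - mean(weights_b))
    pooled = list(weights_a) + list(weights_b)
    exceed = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)  # relabel edges at random under the null
        a, b = pooled[:len(weights_a)], pooled[len(weights_a):]
        if abs(mean(a) - mean(b)) >= observed:
            exceed += 1
    return (exceed + 1) / (n_perm + 1)
```

Richer topological statistics (degree distributions, induced edges from partial correlations) can be substituted for the mean without changing the test's skeleton.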
APA, Harvard, Vancouver, ISO, and other styles
6

Alkhairy, Samiya Ashraf. "A modeling framework and toolset for simulation and characterization of the cochlea within the auditory system." Thesis, Massachusetts Institute of Technology, 2011. http://hdl.handle.net/1721.1/67201.

Full text
Abstract:
Thesis (M. Eng.)--Massachusetts Institute of Technology, Dept. of Biological Engineering, 2011.
Cataloged from PDF version of thesis.
Includes bibliographical references (p. 50-53).
Purpose: This research develops a modeling approach and an implementation toolset to simulate reticular lamina displacement in response to excitation at the ear canal and to characterize the cochlear system in the frequency domain. Scope: The study builds on existing physical models covering the outer, middle, and inner ears. The models are passive linear, active linear, and active nonlinear. These models are formulated as differential algebraic equations and solved for impulse and tone excitations to determine responses. The solutions are mapped into tuning characteristics as a function of position within the cochlear partition. Objectives: The central objective of simulation is to determine the characteristic frequency (CF)-space map, equivalent rectangular bandwidth (ERB), and sharpness of tuning (QERB) of the cochlea. The focus of this research is on obtaining accurate characteristics with high time and space resolution. The study compares the simulation results to empirical measurements and to predictions of a model that utilizes filter theory and coherent reflection theory. Method: We develop lumped and distributed physical models based on mechanical, acoustic, and electrical phenomena. The models are structured in the form of differential-algebraic equations (DAEs), discretized in the space domain. This is in contrast to existing methods that solve a set of algebraic equations discretized in both space and time. The DAEs are solved using numerical differentiation formulas (NDFs) to compute the displacement of the reticular lamina and intermediate variables, such as the displacement of the stapes, in response to impulse and tone excitations at the ear canal. The inputs and outputs of the cochlear partition are used to determine its resonances and tuning characteristics. Transfer functions of the cochlear system with impulse excitation are calculated for passive and active linear models to determine resonance and tuning of the cochlear partition.
Output characteristics are utilized for linear systems with tone excitation and for nonlinear models with stimuli of various amplitudes. Stability of the system is determined using generalized eigenvalues, and the individual subsystems are stabilized based on their poles and zeros. Results: The passive system has a CF map ranging from 20 kHz at the base to 10 Hz at the apex of the cochlear partition, and its strongest resonant frequency corresponds to that of the middle ear. The ERB is on the order of the CF, and the QERB is on the order of 1. The group delay decreases with CF, which contradicts findings from stimulus frequency otoacoustic emission (SFOAE) experiments. The tuning characteristics of the middle ear correspond well to experimental observations. The stability of the system varies greatly with the choice of parameters and the number of space sections used, for both the passive and active implementations. Implication: Estimates of cochlear partition tuning based on the solution of differential algebraic equations have better time and space resolution than existing methods that solve a discretized set of equations. For the passive model, domination of the resonance frequency of the reticular lamina by that of the middle ear, rather than by the resonant frequency of the cochlea at that position, contradicts Békésy's measurements on human cadavers. Conclusion: The methodology used in the thesis demonstrates the benefits of formulating the problem as differential-algebraic equations and solving it using NDFs. Such an approach facilitates computing responses and transfer functions simultaneously and studying the stability of the system, and has good accuracy (controlled directly by error tolerance) and resolution.
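The pole-based stability check mentioned above can be illustrated at toy scale. The sketch below is not the thesis model: it treats a single cochlear-partition section as a mass-spring-damper with invented parameter values and tests whether its poles lie in the left half-plane:

```python
import cmath
import math

def section_poles(m, c, k):
    """Poles of one mass-spring-damper section: m*x'' + c*x' + k*x = 0."""
    disc = cmath.sqrt(c * c - 4.0 * m * k)
    return (-c + disc) / (2.0 * m), (-c - disc) / (2.0 * m)

def is_stable(poles):
    """A section is stable when every pole lies strictly in the left half-plane."""
    return all(p.real < 0.0 for p in poles)

def natural_frequency_hz(m, k):
    """Undamped natural frequency of one section, in Hz."""
    return math.sqrt(k / m) / (2.0 * math.pi)
```

The full DAE system couples many such sections, which is why the thesis resorts to generalized eigenvalues rather than per-section formulas like these.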
by Samiya Ashraf Alkhairy.
M.Eng.
APA, Harvard, Vancouver, ISO, and other styles
7

De, Blocq Andrew Dirk. "Estimating spotted hyaena (Crocuta crocuta) population density using camera trap data in a spatially-explicit capture-recapture framework." Bachelor's thesis, University of Cape Town, 2014. http://hdl.handle.net/11427/13053.

Full text
Abstract:
Includes bibliographical references.
Species-specific population data are important for the effective management and conservation of wildlife populations within protected areas. However, such data are often logistically difficult and expensive to attain for species that are rare and have large ranges. Camera trap surveys provide a non-invasive, inexpensive and effective method for obtaining population-level data on wildlife species. Provided that species can be individually identified, a photographic capture-recapture framework can be used to provide density estimates. Spatially-explicit capture-recapture (SECR) models have recently been developed, and are currently considered the most robust method for analysing capture-recapture data. Camera trap data sourced from a leopard survey performed in uMkhuze Game Reserve, KwaZulu-Natal, South Africa, were analysed using SPACECAP, a Bayesian inference-based SECR modelling program. Overall hyaena density for the reserve was estimated at 10.59 (SD = 2.10) hyaenas/100 km², which is comparable to estimates obtained using other methods for this reserve and some other protected areas in southern Africa. SECR methods are typically conservative in comparison to other methods of measuring large carnivore populations, which is somewhat supported by higher estimates in other nearby reserves. However, large gaps in time between studies and the variety of historical methods used confound comparisons between estimates. The findings from this study provide support for both camera trap surveys and SECR models in terms of deriving robust population data for spotted hyaenas and other individually recognisable species. Such data allow for studies on the drivers of population and distribution changes for such species, in addition to temporal and spatial activity patterns and habitat preference for select species.
The generation of accurate population data for ecologically important predators provides reserve managers with robust data upon which to make informed management decisions. This study shows that estimates for spotted hyaenas can be produced from an existing survey of leopards, which makes photographic capture-recapture methods a sensible and cost-effective option for the less charismatic species. The implementation of standardized and scientifically robust population estimation methods such as SECR using camera trap data would contribute appreciably to the conservation of important wildlife species and the ecological processes they support.
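As a hedged illustration of one ingredient SECR models rely on (not the SPACECAP implementation), the half-normal detection function links detection probability to the distance between an animal's activity centre and a camera; all parameter values below are hypothetical:

```python
import math

def halfnormal_detection(d, g0=0.3, sigma=1.5):
    """Half-normal detection function: probability that a camera at distance d
    (km) from an animal's activity centre records it on one sampling occasion."""
    return g0 * math.exp(-(d * d) / (2.0 * sigma * sigma))

def expected_detections(centres, traps, occasions=40, g0=0.3, sigma=1.5):
    """Expected detection count over a survey, for hypothetical activity
    centres and a camera-trap grid given as (x, y) coordinates."""
    return sum(occasions * halfnormal_detection(math.hypot(cx - tx, cy - ty),
                                                g0, sigma)
               for cx, cy in centres for tx, ty in traps)
```

SECR programs estimate g0, sigma, and density jointly from the capture histories; this sketch only shows the detection model those estimates are built on.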
APA, Harvard, Vancouver, ISO, and other styles
8

Kalantari, John I. "A general purpose artificial intelligence framework for the analysis of complex biological systems." Diss., University of Iowa, 2017. https://ir.uiowa.edu/etd/5953.

Full text
Abstract:
This thesis encompasses research on Artificial Intelligence in support of automating scientific discovery in the fields of biology and medicine. At the core of this research is the ongoing development of a general-purpose artificial intelligence framework emulating various facets of human-level intelligence necessary for building cross-domain knowledge that may lead to new insights and discoveries. To learn and build models in a data-driven manner, we develop a general-purpose learning framework called Syntactic Nonparametric Analysis of Complex Systems (SYNACX), which uses tools from Bayesian nonparametric inference to learn the statistical and syntactic properties of biological phenomena from sequence data. We show that the models learned by SYNACX offer performance comparable to that of standard neural network architectures. For complex biological systems or processes consisting of several heterogeneous components with spatio-temporal interdependencies across multiple scales, learning frameworks like SYNACX can become unwieldy due to the resultant combinatorial complexity. Thus we also investigate ways to robustly reduce data dimensionality by introducing a new data abstraction. In particular, we extend traditional string and graph grammars in a new modeling formalism which we call Simplicial Grammar. This formalism integrates the topological properties of the simplicial complex with the expressive power of stochastic grammars in a computational abstraction with which we can decompose complex system behavior into a finite set of modular grammar rules which parsimoniously describe the spatial/temporal structure and dynamics of patterns inferred from sequence data.
APA, Harvard, Vancouver, ISO, and other styles
9

Griesel, Gerhard. "Development and management framework for the Gouritz River Catchment." Pretoria : [s.n.], 2003. http://upetd.up.ac.za/thesis/available/etd-11202003-155742.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Moxley, Courtney. "Characterization of biotic and sodic lawns of the Kruger National Park using the framework of the positive feedback loop / Courtney Moxley." Bachelor's thesis, University of Cape Town, 2013. http://hdl.handle.net/11427/14020.

Full text
Abstract:
The classical grazing lawn model is an intensely grazed patch composed of short-statured, grazing-tolerant grass species. The formation and maintenance of these communities are controlled by positive feedbacks between grazers and the high-quality forage that the component grass species provide. Different nutrient cycling dynamics among the lawns identify two discrete lawn types in the savanna: biotically-driven lawns on nutrient-rich gabbroic soils and abiotically-driven lawns at sodic sites. We were interested in identifying whether the biotic and sodic lawns represented two distinct systems in terms of the feedback responses among herbivores and decomposers, and the grass and decomposer community assemblages, in a mesic savanna. We sampled these components of the abiotic and biotic template of five sodic and five biotic lawns in the Kruger National Park. We used β diversity in grass and dung beetle community assemblages among the lawns to identify whether sodic and biotic lawns were distinct in grass percent cover and dung beetle species abundance. Four and three categories of lawns were identified for these traits, respectively, placing the lawns on a gradient from biotic-like to sodic-like with a range of intermediates. Soil Na content was higher among sodic lawns, but these levels did not manifest themselves in the grass foliar Na content, as they did for biotic lawns. Herbivore utilization of the sodic lawns was higher than of the biotic lawns. Biotic lawns showed no difference in herbivore metabolic biomass between the late-wet and early-mid dry season. We concluded that the systems of nutrient cycling and lawn maintenance are distinct between the biotic and sodic lawns, but that the lawns exist along a gradient in terms of their community characteristics and abiotic features.
Efforts to classify grazing lawns will present benefits in improving our understanding of their dynamics and, resultantly, the management and conservation approaches that use them to control herbivore populations in African savanna ecosystems.
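One common way to quantify the β diversity used in studies like this is the Bray-Curtis dissimilarity between sites' species-abundance profiles; the sketch below is a generic illustration with invented species names, not the authors' analysis:

```python
def bray_curtis(site_a, site_b):
    """Bray-Curtis dissimilarity between two sites, each a
    species -> abundance dict; 0 = identical, 1 = no shared species."""
    species = set(site_a) | set(site_b)
    num = sum(abs(site_a.get(s, 0) - site_b.get(s, 0)) for s in species)
    den = sum(site_a.get(s, 0) + site_b.get(s, 0) for s in species)
    return num / den if den else 0.0
```

Computing this for every pair of lawns yields the dissimilarity matrix from which gradient-like orderings (biotic-like to sodic-like) can be read off.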
APA, Harvard, Vancouver, ISO, and other styles
11

Ödén, Jakob. "Proton plan evaluation : a framework accounting for treatment uncertainties and variable relative biological effectiveness." Licentiate thesis, Stockholms universitet, Fysikum, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:su:diva-146256.

Full text
APA, Harvard, Vancouver, ISO, and other styles
12

Felley, Mary Louise. "A biodiversity conservation policy and legal framework for Hong Kong." Thesis, Hong Kong : University of Hong Kong, 1996. http://sunzi.lib.hku.hk/hkuto/record.jsp?B17457592.

Full text
APA, Harvard, Vancouver, ISO, and other styles
13

Goel, Gautam. "Dynamic flux estimation : a novel framework for metabolic pathway analysis /." Diss., Atlanta, Ga. : Georgia Institute of Technology, 2009. http://hdl.handle.net/1853/31769.

Full text
Abstract:
Thesis (Ph.D)--Biomedical Engineering, Georgia Institute of Technology, 2010.
Committee Chair: Voit, Eberhard O.; Committee Member: Butera, Robert; Committee Member: Chen, Rachel; Committee Member: Kemp, Melissa; Committee Member: Neves, Ana Rute. Part of the SMARTech Electronic Thesis and Dissertation Collection.
APA, Harvard, Vancouver, ISO, and other styles
14

Remmele, Steffen [Verfasser], and Jürgen [Akademischer Betreuer] Hesser. "A Deconvolution Framework with Applications in Medical and Biological Imaging / Steffen Remmele ; Betreuer: Jürgen Hesser." Heidelberg : Universitätsbibliothek Heidelberg, 2011. http://d-nb.info/1179784944/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
15

He, Xin. "A semi-automated framework for the analytical use of gene-centric data with biological ontologies." Thesis, University of Edinburgh, 2017. http://hdl.handle.net/1842/25505.

Full text
Abstract:
Motivation: Translational bioinformatics (TBI) has been defined as ‘the development and application of informatics methods that connect molecular entities to clinical entities’ [1], and has emerged as a systems theory approach to bridge the huge wealth of biomedical data into clinical actions using a combination of innovations and resources across the entire spectrum of biomedical informatics approaches [2]. The challenge for TBI is the availability of both comprehensive knowledge based on genes and the corresponding tools that allow their analysis and exploitation. Traditionally, biological researchers usually study one or only a few genes at a time, but in recent years high-throughput technologies such as gene expression microarrays, protein mass spectrometry, and next-generation DNA and RNA sequencing have emerged that allow the simultaneous measurement of changes on a genome-wide scale. These technologies usually result in large lists of interesting genes, but meaningful biological interpretation remains a major challenge. Over the last decade, enrichment analysis has become standard practice in the analysis of such gene lists, enabling systematic assessment of the likelihood of differential representation of defined groups of genes compared to suitably annotated background knowledge. The success of such analyses is highly dependent on the availability and quality of the gene annotation data. For many years, genes were annotated by different experts using inconsistent, non-standard terminologies. Large amounts of variation and duplication in these unstructured annotation sets made them unsuitable for principled quantitative analysis. More recently, a lot of effort has been put into the development and use of structured, domain-specific vocabularies to annotate genes. The Gene Ontology is one of the most successful examples of this, where genes are annotated with terms from three main clades: biological process, molecular function, and cellular component.
However, there are many other established and emerging ontologies to aid biological data interpretation, but they are rarely used. For the same reason, many bioinformatic tools only support analysis using the Gene Ontology. The lack of annotation coverage, and of support for these ontologies in existing analytical tools to aid biological interpretation of data, has become a major limitation to their utility and uptake. Thus, automatic approaches are needed to facilitate the transformation of unstructured data to unlock the potential of all ontologies, with corresponding bioinformatics tools to support their interpretation. Approaches: In this thesis, firstly, similar to the approach in [3,4], I propose a series of computational approaches implemented in a new tool, OntoSuite-Miner, to address the ontology-based gene association data integration challenge. This approach uses NLP-based text mining methods for ontology-based biomedical text mining. What differentiates my approach from others is that I integrate two of the most widely used NLP modules into the framework, not only increasing the confidence of the text mining results, but also providing an annotation score for each mapping, based on the number of pieces of evidence in the literature and the number of NLP modules that agreed with the mapping. Since heterogeneous data is important in understanding human disease, the approach was designed to be generic, thus the ontology-based annotation generation can be applied to different sources and can be repeated with different ontologies. Secondly, in respect of the second challenge proposed by TBI, to increase the statistical power of annotation enrichment analysis, I propose OntoSuite-Analytics, which integrates a collection of enrichment analysis methods into a unified open-source software package named topOnto, in the statistical programming language R.
The package supports enrichment analysis across multiple ontologies with a set of implemented statistical/topological algorithms, allowing the comparison of enrichment results across multiple ontologies and between different algorithms. Results: The methodologies described above were implemented, and a Human Disease Ontology (HDO) based gene annotation database was generated by mining three publicly available databases: OMIM, GeneRIF, and Ensembl variation. With the availability of the HDO annotation and the corresponding ontology enrichment analysis tools in topOnto, I profiled 277 gene classes with human diseases and generated ‘disease environments’ for 1310 human diseases. The exploration of the disease profiles and disease environments provides an overview of known disease knowledge and new insights into disease mechanisms. The integration of multiple ontologies into a disease context demonstrates how ‘orthogonal’ ontologies can lead to biological insight that would have been missed by more traditional single-ontology analysis.
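Ontology enrichment analysis of the kind described here classically rests on an upper-tail hypergeometric test: given a background of N genes of which K carry an ontology term, how surprising is it to see k carriers in a study list of n genes? A minimal stdlib sketch (an illustration of the standard test, not topOnto's implementation) is:

```python
from math import comb

def hypergeom_enrichment_p(N, K, n, k):
    """Upper-tail hypergeometric p-value for term enrichment:
    N background genes, K annotated to the term, n genes in the
    study list, k of them annotated to the term.  Returns P(X >= k)."""
    total = comb(N, n)
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(K, n) + 1)) / total
```

In practice such p-values are computed per term and corrected for multiple testing across the ontology.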
APA, Harvard, Vancouver, ISO, and other styles
16

Hafeez, Abdul. "A Software Framework For the Detection and Classification of Biological Targets in Bio-Nano Sensing." Diss., Virginia Tech, 2014. http://hdl.handle.net/10919/50490.

Full text
Abstract:
Detection and identification of important biological targets, such as DNA, proteins, and diseased human cells, are crucial for early diagnosis and prognosis. The key to discriminating healthy cells from diseased cells is their biophysical properties, which differ radically. Micro- and nanosystems, such as solid-state micropores and nanopores, can measure and translate these properties of biological targets into electrical spikes to decode useful insights. Nonetheless, such approaches result in sizable data streams that are often plagued with inherent noise and baseline wander. Moreover, the extant detection approaches are tedious, time-consuming, and error-prone, and there is no error-resilient software that can analyze large data sets instantly. The ability to effectively process and detect biological targets in larger data sets lies in automated and accelerated data processing strategies using state-of-the-art distributed computing systems. In this dissertation, we design and develop techniques for the detection and classification of biological targets and a distributed detection framework to support data processing from multiple bio-nano devices. In a distributed setup, the collected raw data stream on a server node is split into data segments and distributed across the participating worker nodes. Each node reduces noise in the assigned data segment using moving-average filtering and detects the electric spikes by comparing them against a statistical threshold (based on the mean and standard deviation of the data), in a Single Program Multiple Data (SPMD) style. Our proposed framework enables the detection of cancer cells in a mixture of cancer cells, red blood cells (RBCs), and white blood cells (WBCs), and achieves a maximum speedup of 6X over a single-node machine by processing 10 gigabytes of raw data using an 8-node cluster in less than a minute, which would otherwise take hours of manual analysis.
Diseases such as cancer can be mitigated if detected and treated at an early stage. Micro- and nanoscale devices, such as micropores and nanopores, enable the translocation of biological targets at finer granularity. These devices are tiny orifices in silicon-based membranes, and the output is a current signal, measured in nanoamperes. A solid-state micropore is capable of electrically measuring the biophysical properties of human cells when a blood sample is passed through it. The passage of cells via such pores results in an interesting pattern (pulse) in the baseline current, which can be measured at a very high rate, such as 500,000 samples per second, and even higher resolution. The pulse is essentially a sequence of temporal data samples that abruptly fall below and then revert back to a normal baseline within an acceptable predefined time interval, i.e., the pulse width. The pulse features, such as width and amplitude, correspond to the translocation behavior and the extent to which the pore is blocked, under a constant potential. These features are crucial in discriminating diseased cells from healthy cells, such as identifying cancer cells in a mixture of cells.
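The per-node processing the abstract describes, moving-average smoothing followed by thresholding at some number of standard deviations below the mean, can be sketched as follows; the window size and threshold multiplier are illustrative assumptions, not the dissertation's values:

```python
def moving_average(signal, window):
    """Simple moving-average smoothing over a list of samples."""
    out = []
    for i in range(len(signal) - window + 1):
        out.append(sum(signal[i:i + window]) / window)
    return out

def detect_pulses(signal, n_sigma=3.0):
    """Return indices of samples that dip below mean - n_sigma * std;
    pulses are dips in the baseline current as cells translocate the pore."""
    mean = sum(signal) / len(signal)
    var = sum((x - mean) ** 2 for x in signal) / len(signal)
    threshold = mean - n_sigma * var ** 0.5
    return [i for i, x in enumerate(signal) if x < threshold]
```

In the distributed setup, each worker would run this pair on its assigned segment independently, which is what makes the SPMD decomposition straightforward.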
Ph. D.
APA, Harvard, Vancouver, ISO, and other styles
17

Bergsneider, Andres. "Biological Semantic Segmentation on CT Medical Images for Kidney Tumor Detection Using nnU-Net Framework." DigitalCommons@CalPoly, 2021. https://digitalcommons.calpoly.edu/theses/2298.

Full text
Abstract:
Healthcare systems are constantly challenged with bottlenecks due to human-reliant operations, such as analyzing medical images. High precision and repeatability are necessary when performing diagnostics on patients with tumors. Throughout the years, an increasing number of advancements have been made using various machine learning algorithms for the detection of tumors, helping to fast-track diagnosis and treatment decisions. “Black box” systems such as the complex deep learning networks discussed in this paper rely heavily on hyperparameter optimization in order to obtain the most ideal performance. This requires a significant time investment in the tuning of such networks to acquire cutting-edge results. The approach of this paper relies on implementing a state-of-the-art deep learning framework, the nnU-Net, in order to label computed tomography (CT) images from patients with kidney cancer through semantic segmentation, by feeding raw CT images through a deep architecture and obtaining pixel-wise mask classifications. Taking advantage of nnU-Net's framework versatility, various configurations of the architecture are explored and applied, with the resulting performance benchmarked and compared, including variations of 2D and 3D convolutions as well as the use of distinct cost functions such as the Sørensen-Dice coefficient, cross entropy, and a compound of them. 79% is the accuracy currently reported for the detection of benign and malignant tumors using CT imagery performed by medical practitioners. The best iteration and mixture of parameters in this work resulted in an accuracy of 83% for tumor labelling. This study further demonstrates the performance of a versatile and groundbreaking deep learning framework designed for biomedical image segmentation.
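The Sørensen-Dice coefficient mentioned above has a simple closed form. A minimal sketch for flat binary masks follows; the smoothing constant is a common convention and an assumption here, not nnU-Net's exact loss implementation:

```python
def dice_coefficient(pred, target, smooth=1e-6):
    """Sørensen-Dice score between two binary masks given as flat 0/1 lists:
    2|A ∩ B| / (|A| + |B|), smoothed to avoid division by zero."""
    intersection = sum(p * t for p, t in zip(pred, target))
    return (2.0 * intersection + smooth) / (sum(pred) + sum(target) + smooth)

def dice_loss(pred, target):
    """Dice loss, minimized during segmentation training."""
    return 1.0 - dice_coefficient(pred, target)
```

Compound objectives of the kind the thesis explores simply add a cross-entropy term to `dice_loss`.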
APA, Harvard, Vancouver, ISO, and other styles
18

Cárcamo, Behrens Martín. "A computational framework for predicting CHO cell culture performance in bioreactors." Thesis, Massachusetts Institute of Technology, 2019. https://hdl.handle.net/1721.1/123055.

Full text
Abstract:
Thesis: M.B.A., Massachusetts Institute of Technology, Sloan School of Management, in conjunction with the Leaders for Global Operations Program at MIT, 2019
Thesis: S.M., Massachusetts Institute of Technology, Department of Biological Engineering, in conjunction with the Leaders for Global Operations Program at MIT, 2019
Cataloged from PDF version of thesis.
Includes bibliographical references (pages 86-99).
Breaking the trade-off between speed and productivity is a key milestone across industries. In particular, in the biopharmaceutical industry this trade-off is exacerbated by a highly regulated environment, which hinders continuous improvement and fixes future manufacturing costs. Given the complexity of living organisms and the improvement in quality of life offered by the product - which demand agile development - the industry has traditionally taken phenomenological approaches to process development, generally sacrificing costs. Nonetheless, technological developments and lower entry barriers make the biopharmaceutical industry far more competitive than in its origins, demanding efficient and reliable processes. Developing efficient manufacturing processes for new products while being agile to market is a key differentiating capability of Amgen's process development organization.
In collaboration with the process development team at Amgen, a computational framework for in-silico upstream bioprocess development has been developed, allowing for faster, more robust, and more optimal process development. Specifically, a mechanistic model of a bioreactor has been designed, implemented, and applied to an Amgen product. The project was divided into three major components: The first was a survey of internal Amgen capabilities and the state of the art in external industrial and academic models to identify the algorithms and design the signal flow required to support the range of expected process engineering applications. The second consisted of implementing a modular, extensible software platform with the architecture and interfaces dictated by the first component. The third part consisted of applying the software to an actual product development problem capturing the primary process variables.
A constraint-based model of a metabolic network consisting of 35 reactions of the main carbon-nitrogen metabolism relevant to energy and redox balance was adapted from the literature (Nolan & Lee, 2011). The metabolic network was coupled with glucose, glutamine and asparagine uptake kinetics, with temperature, dissolved oxygen, pH and osmolarity dependence. Stress induced by temperature shifts was modeled as a first-order step response coupled to a non-growth-associated maintenance ATP term. The cellular model was coupled with a well-mixed bioreactor model consisting of mass balance equations. We solved the model using dynamic Flux Balance Analysis (dFBA). We first calibrated the model with experimental process characterization data for a product in development. We used a Non-dominated Sorting Genetic Algorithm (NSGA-II) to solve the calibration problem, minimizing the error in metabolite concentrations to yield estimates of 13 strain-specific parameters.
We then assessed the calibrated model's predictions of biomass growth and metabolite concentrations against a second experiment run with different process settings. Finally, we developed a graphical user interface for subject-matter experts to simulate experiments and test hypotheses using the model. We applied the tool to three process-relevant case studies and analyzed the in-silico results. The calibrated model can predict biomass and titer from process settings, potentially reducing experimental time from 20 days to 30 seconds, in addition to reducing experimental cost.
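The dFBA scheme described above alternates a flux balance linear program (maximize growth subject to steady-state stoichiometry and kinetic uptake bounds) with an integration step of the bioreactor mass balances. A minimal sketch under strong simplifying assumptions: a one-metabolite toy network and illustrative Monod uptake parameters, not the 35-reaction model of the thesis:

```python
import numpy as np
from scipy.optimize import linprog

# Toy network: 1 internal metabolite (glucose), 2 fluxes
# v0: glucose uptake (mmol/gDW/h); v1: growth (1/h), consuming 10 mmol glc/gDW
S = np.array([[1.0, -10.0]])      # steady-state constraint S @ v = 0
Vmax, Km = 10.0, 0.5              # Monod-type uptake kinetics (assumed values)

X, G, dt = 0.05, 20.0, 0.1        # biomass (gDW/L), glucose (mM), step (h)
for _ in range(100):              # 10 h of simulated batch culture
    v_up_max = Vmax * G / (Km + G)            # kinetic bound on uptake
    res = linprog(c=[0.0, -1.0],              # maximize growth flux
                  A_eq=S, b_eq=[0.0],
                  bounds=[(0.0, v_up_max), (0.0, None)])
    v_up, mu = res.x
    X += mu * X * dt                          # Euler step: biomass growth
    G = max(G - v_up * X * dt, 0.0)           # Euler step: substrate depletion

print(f"final biomass {X:.2f} gDW/L, residual glucose {G:.2f} mM")
```

The real framework replaces the toy stoichiometry with the full network, adds temperature/pH/osmolarity dependence to the kinetic bounds, and fits the kinetic parameters with NSGA-II against measured concentration profiles.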
by Martín Cárcamo Behrens.
M.B.A.
S.M.
M.B.A. Massachusetts Institute of Technology, Sloan School of Management
S.M. Massachusetts Institute of Technology, Department of Biological Engineering
APA, Harvard, Vancouver, ISO, and other styles
19

Sun, Guoli. "Significant distinct branches of hierarchical trees| A framework for statistical analysis and applications to biological data." Thesis, State University of New York at Stony Brook, 2015. http://pqdtopen.proquest.com/#viewpdf?dispub=3685086.

Full text
Abstract:

One of the most common goals of hierarchical clustering is finding those branches of a tree that form quantifiably distinct data subtypes. Achieving this goal in a statistically meaningful way requires (a) a measure of distinctness of a branch and (b) a test to determine the significance of the observed measure, applicable to all branches and across multiple scales of dissimilarity.

We formulate a method termed Tree Branches Evaluated Statistically for Tightness (TBEST) for identifying significantly distinct tree branches in hierarchical clusters. For each branch of the tree a measure of distinctness, or tightness, is defined as a rational function of heights, both of the branch and of its parent. A statistical procedure is then developed to determine the significance of the observed values of tightness. We test TBEST as a tool for tree-based data partitioning by applying it to five benchmark datasets, one of them synthetic and the other four each from a different area of biology. With each of the five datasets, there is a well-defined partition of the data into classes. In all test cases TBEST performs on par with or better than the existing techniques.

One dataset uses Cores Of Recurrent Events (CORE) to select features. CORE was developed with my participation in the course of this work. An R language implementation of the method is available from the Comprehensive R Archive Network: cran.r-project.org/web/packages/CORE/index.html.

Based on our benchmark analysis, TBEST is a tool of choice for detection of significantly distinct branches in hierarchical trees grown from biological data. An R language implementation of the method is available from the Comprehensive R Archive Network: cran.r-project.org/web/packages/TBEST/index.html.
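The abstract specifies tightness only as a rational function of the branch height and its parent's height. One simple candidate of that form, (h_parent - h_branch) / h_parent, can be scored over a scipy linkage tree as a sketch; the statistical significance test that completes TBEST is omitted here:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage

rng = np.random.default_rng(0)
# Two well-separated point clouds: their branches should score as tight
data = np.vstack([rng.normal(0, 0.3, (10, 2)),
                  rng.normal(5, 0.3, (10, 2))])
Z = linkage(data, method="average")   # each row: [left, right, height, count]

n = data.shape[0]
# Record, for every merged node, the height of the merge that absorbs it
parent_height = {}
for i, (left, right, h, _) in enumerate(Z):
    for child in (int(left), int(right)):
        parent_height[child] = h

def tightness(node):
    """How far below its parent an internal branch forms (root has no score)."""
    h = Z[node - n, 2]                 # height at which this branch is created
    hp = parent_height.get(node)       # height of its parent merge
    return None if hp is None else (hp - h) / hp

scores = {node: tightness(node) for node in range(n, 2 * n - 1)
          if tightness(node) is not None}
best = max(scores, key=scores.get)
print(f"most distinct branch: node {best}, tightness {scores[best]:.2f}")
```

With average linkage the merge heights are monotone, so the score lies in [0, 1); branches that merge far below their parent (distinct subtypes) score near 1.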

APA, Harvard, Vancouver, ISO, and other styles
20

Ramakrishnan, Ranjani. "A data cleaning and annotation framework for genome-wide studies." Full text open access at:, 2007. http://content.ohsu.edu/u?/etd,263.

Full text
APA, Harvard, Vancouver, ISO, and other styles
21

Li, Miaoxin. "Development of a bioinformatics and statistical framework to integrate biological resources for genome-wide genetic mapping and its applications." Click to view the E-thesis via HKUTO, 2009. http://sunzi.lib.hku.hk/hkuto/record/B43572030.

Full text
APA, Harvard, Vancouver, ISO, and other styles
22

Zheng, Chenlin. "Visual consonance: An analytical framework for 3D biomedical animation." Thesis, Queensland University of Technology, 2016. https://eprints.qut.edu.au/99825/4/Chenlin_Zheng_Thesis.pdf.

Full text
Abstract:
As an emerging medium of 3D animation, 3D biomedical animation has become an established and important vehicle for communicating complex biological knowledge. This PhD research developed visual consonance as an analytical framework providing conceptual guidance to animators producing 3D biomedical animation. Specifically, it examines and combines the twelve principles of animation and the twelve principles of multimedia learning to generate this analytical framework. Visual consonance provides theoretical guidelines for 3D biomedical animators during the decision-making process and helps them remain goal-orientated when producing 3D biomedical animations that are cognitively engaging to their target audiences. Visual consonance can also be used as an analytical tool to understand and critique the current constructional elements of 3D biomedical animation.
APA, Harvard, Vancouver, ISO, and other styles
23

Panayi, Marilyn. "Cognition in action C-i-A : rethinking gesture in neuro-atypical young people : a conceptual framework for embodied, embedded, extended and enacted intentionality." Thesis, City, University of London, 2014. http://openaccess.city.ac.uk/15292/.

Full text
Abstract:
The three aims of my interdisciplinary thesis are:
- To develop a conceptual framework for re-thinking the gestures of neuro-atypical young people that is non-traditional and non-representational
- To develop qualitative analytical tools for the annotation and interpretation of gesture that can be applied inclusively to both neuro-atypical and neuro-typical young people
- To consider the conceptual framework in terms of its theoretical implications and practical applications

Learning to communicate and work with neuro-atypical young people provides the rationale and continued impetus for my work. My approach is influenced by the limited social, physical and communicative experiences of young people with severe speech and motor impairment due to cerebral palsy (CP). CP is described as a range of non-progressive syndromes of posture and motor impairment. The aetiology is thought to result from damage to the developing central nervous system during gestation or in the neonate. Brain lesions involve the basal ganglia and the cerebellum; both these sites are known to support motor control and integration. However, a paucity of theoretical research and empirical data for this target group of young people necessitated the development of both an alternative theoretical framework and two new tools. Biological Dynamic Systems Theory is proposed as the best candidate structure for the re-consideration of gesture: it encompasses the global, synthetic and embodied nature of gesture. Gesture is re-defined and considered part of an emergent dynamic, complex, non-linear and self-organizing system. My construct of Cognition-in-Action (C-i-A) is derived from the notion of knowing-as-doing influenced by socio-biological paradigms; it places the Action-Ready-Body centre stage. It is informed by a theoretical synthesis of knowledge from the domains of Philosophy, Science and Technology, including practices in the clinical, technology design and performance arts arenas.
The C-i-A is a descriptive, non-computational, feature-based framework. Its development centred around two key questions that served as operational starting points: "What can gestures reveal about children's cognition-in-action?" and "Is there the potential to influence gestural capacity in children?" These are supported by my research objectives. In my empirical study I present three case studies that focus on the annotation and interpretative analyses of corporeal exemplars from a gesture corpus. These exemplars were contributed by neuro-atypical young people: two adolescent males aged 16.9 and 17.9 years, and one girl aged 10.7 years. The Gesture-Action-Entity (GAE) is proposed as a unit of interest for the analysis of procedural, semantic and episodic aspects of our corporeal knowledge. A body-based action annotation system (G-ABAS) and an Interpretative Phenomenological Analysis methodology are applied for the first time to gesture (G-IPA). These tools facilitate fine-grained corporeal dynamic and narrative gesture feature analyses. These phenomenal data reveal that these young people have latent resources and capacities that they can express corporeally. Iterating these interpretative findings with the Cognition-in-Action framework allows for the inference of processes that may underlie the strategies they use to achieve such social-motor-cognitive functions. In summary, their Cognition-in-Action is brought forth, carried forward and has the potential to be culturally embodied. The utility of the C-i-A framework lies in its explanatory power to contribute to a deeper understanding of child gesture. Furthermore, I discuss and illustrate its potential to influence practice in the domains of pedagogy, rehabilitation and the design of future intimate, assistive and perceptually sensitive technologies. Such technologies are increasingly mediating our social interactions.
My thesis makes an original and significant contribution to the field of cognitive science by offering an ecologically valid alternative to traditional conceptualizations of perception, cognition and action.
APA, Harvard, Vancouver, ISO, and other styles
24

Ghebre, Michael Abrha. "A statistical framework for modeling asthma and COPD biological heterogeneity, and a novel variable selection method for model-based clustering." Thesis, University of Leicester, 2016. http://hdl.handle.net/2381/38488.

Full text
Abstract:
This thesis has two main parts. The first part is an application that focuses on the identification of a statistical framework to model the biological heterogeneity of asthma and COPD using sputum cytokines. Clustering subjects using the actual cytokine measurements may not be straightforward, as these mediators have strong correlations, which are currently ignored by standard clustering techniques. Artificial data, which have similar patterns to the cytokines but with known class membership, were simulated. Several approaches, such as data reduction using factor analysis, were performed on the simulated data to identify suitable representatives of the variables for use as input to a clustering algorithm. In the simulation study, using "factor scores" (derived from factor analysis) as input variables for clustering outperformed the alternative approaches. Thus, this approach was applied to model the biological heterogeneity of asthma and COPD, and identified three stable and three exacerbation clusters, with different proportions of overlap between the diseases. The second part is a statistical methodology in which a new method for variable selection in model-based clustering is proposed. This method generalizes the approach of Raftery and Dean (2006, JASA 101, 168-178). It relaxes the global prior assumption of linear relationships between clustering-relevant and irrelevant variables by searching for latent structures among the variables, and accounts for nonlinear relationships between these variables by splitting the data into sub-samples. A Gaussian mixture model (unconstrained variance-covariance matrices fitted using the EM algorithm) is applied to identify the optimal clusters. The new method performed considerably better than the Raftery and Dean technique when applied to simulated and real datasets, and demonstrates that variable selection within clustering can substantially improve the identification of optimal clusters.
However, at present it may not perform adequately in uncovering the optimal clusters in datasets with strong correlations, such as the sputum mediators.
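The winning approach of the first part, factor scores fed into a clustering algorithm, can be sketched with scikit-learn on simulated correlated data; the parameters and simulated panel below are illustrative, not the thesis's cytokine data:

```python
import numpy as np
from sklearn.datasets import make_blobs
from sklearn.decomposition import FactorAnalysis
from sklearn.mixture import GaussianMixture
from sklearn.metrics import adjusted_rand_score

# Simulate a correlated "mediator" panel: 3 latent groups, then stack
# redundant noisy copies of each feature to mimic correlated cytokines
X_base, y = make_blobs(n_samples=300, centers=3, n_features=4, random_state=1)
rng = np.random.default_rng(1)
X = np.hstack([X_base + rng.normal(0, 0.5, X_base.shape) for _ in range(5)])

# Step 1: compress the correlated panel into a few factor scores
scores = FactorAnalysis(n_components=4, random_state=1).fit_transform(X)

# Step 2: cluster the factor scores with an unconstrained Gaussian mixture
gmm = GaussianMixture(n_components=3, covariance_type="full", random_state=1)
labels = gmm.fit_predict(scores)

# Check recovery of the simulated class structure
print(f"ARI vs simulated classes: {adjusted_rand_score(y, labels):.2f}")
```

Feeding the factor scores, rather than the raw correlated measurements, into the mixture model is exactly the substitution the simulation study found to outperform the alternatives.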
APA, Harvard, Vancouver, ISO, and other styles
25

Alshehri, Saud Ali. "A proposed framework for resilience to biological disasters : the case of MERS-CoV threat in a transient mass gathering event." Thesis, Cardiff University, 2016. http://orca.cf.ac.uk/86517/.

Full text
Abstract:
The increase in disasters in recent decades has impacted humanity through large loss of life and negative long-term economic and environmental consequences. Disasters cannot always be prevented, but their impacts can be mitigated through adapted disaster management strategies, including improving community resilience. The aim of this thesis is to develop a framework of community resilience to disaster in Saudi Arabia (CRDSA). Saudi Arabia experienced a number of disasters between 2005 and 2014; two were of a biological nature (H1N1 and MERS-CoV), while the rest were caused by flooding. The study uses a mixed-method approach within a paradigm of pragmatism, and is structured into three stages. First, based on the literature review, a survey questionnaire is used to examine the public perception of disaster risk. The second stage is divided into two steps: 1) a Delphi consultation is employed to refine a set of initial criteria, organised into dimensions and derived from a review of the literature, and to explore additional criteria to inform the development of the CRDSA. The Delphi technique combines a quantitative calculation to justify each dimension and its associated criteria with the qualitative views of experts to reach consensus around the proposed CRDSA. Data collection involved a three-round questionnaire before experts' consensus was achieved; 2) the Analytic Hierarchy Process (AHP) is used to: a) derive local priority weights from pairwise comparison judgments, and b) determine the relative importance of the framework's dimensions. The third stage focuses on the validation of the CRDSA through a real mass gathering event. The approach involves a field study investigation, including interviews and observations during the 2013 Hajj, to 1) determine the level of community resilience at the Hajj; 2) inform prevention strategies against the risk of a MERS-CoV epidemic; and 3) validate the CRDSA in a real situation.
The study finds that the proposed CRDSA framework can be used as an assessment tool to build community resilience to disasters in permanent and transient communities.
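The AHP step in stage two derives priority weights for the framework's dimensions from a pairwise comparison matrix via its principal eigenvector, with a consistency check on the judgments. A minimal sketch with an illustrative 3x3 matrix on Saaty's 1-9 scale (the values are hypothetical, not the study's actual judgments):

```python
import numpy as np

# Hypothetical pairwise comparisons of three dimensions:
# A vs B = 3, A vs C = 5, B vs C = 2 (reciprocals below the diagonal)
A = np.array([[1.0,   3.0, 5.0],
              [1/3.0, 1.0, 2.0],
              [1/5.0, 1/2.0, 1.0]])

# Priority weights: principal eigenvector, normalised to sum to 1
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                # Perron (largest) eigenvalue
w = np.abs(eigvecs[:, k].real)
w /= w.sum()

# Consistency ratio CR = CI / RI, with Saaty's random index RI = 0.58 for n = 3
n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)       # consistency index
cr = ci / 0.58
print("weights:", np.round(w, 3), "CR:", round(cr, 3))
```

A CR below 0.1 is the conventional threshold for accepting the expert judgments as consistent; otherwise the comparisons are revisited before the weights are used.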
APA, Harvard, Vancouver, ISO, and other styles
26

Ribera, Pi Judit. "Hybrid systems for wastewater treatment in the framework of circular economy : coupling biological and membrane technologies for a sustainable water cycle." Doctoral thesis, Universitat Politècnica de Catalunya, 2019. http://hdl.handle.net/10803/668391.

Full text
Abstract:
The increasing water demand, coupled with the depletion of natural water sources, has raised the need to investigate and develop wastewater treatment and reuse. Moreover, the application of circular economy principles to the water cycle has highlighted the need to see wastewater as a source of water and resources. Therefore, hybridization of already developed technologies can help achieve circular economy goals, and hybrid systems that take the best of each technology are able to overcome the limitations of current conventional treatments. Thus, in this thesis, different hybrid systems have been developed and tested (at bench and pilot scales) for the treatment of both urban and industrial wastewater. On one hand, three upflow anaerobic sludge blanket (UASB) reactors with different configurations (flocculent biomass; flocculent biomass with membrane solids separation; and granular biomass with membrane solids separation, UASB-AnMBR) were operated to compare start-up, solids hydrolysis and effluent quality. The challenges of this work were both the low temperature and the low COD content. A very quick start-up was observed for all three reactors and was attributed to the previous acclimation of the seed sludge. The UASB configurations with membranes retained the solids in the reactor, increasing solids hydrolysis efficiency. Moreover, flocculent biomass promoted slightly higher hydrolysis than granular biomass. Therefore, a configuration based on a flocculent UASB-AnMBR was appropriate for the treatment of urban wastewater with low COD content at 10°C. On the other hand, a single-stage AnMBR for the treatment of cheese whey and its co-digestion with cattle slurry was investigated with the aim of potentially recovering water and energy. High COD removal (91% ± 7%) was achieved with a biogas production of 0.2-0.9 m3 biogas/kg COD removed. Therefore, high energy recovery could be obtained when using this process, with a mean value of 2.4 kWh/kg COD removed.
Although energy recovery was directly validated, several limitations were detected regarding water reuse. These limitations comprised a high salt concentration in the permeate, which should be removed prior to its reuse. Moreover, petrochemical wastewater pre-treatment was optimised with the final objective of water recycling. It consisted of a coagulation-flocculation (CF) step followed by a moving bed biofilm reactor (MBBR) aimed at decreasing suspended solids (SS) and organic content. In this case, only the first part of the hybrid system was optimised; membrane units were not included in this work. CF tests showed a decrease in wastewater turbidity but no significant DOC removal. The wastewater was then treated by MBBR, in which high sCOD removal efficiency (80-90%) was maintained. The MBBR also proved effective when treating raw wastewater, as well as when the proportions of the feed wastewater effluents were changed. The obtained results showed that MBBR was a suitable technology for petrochemical wastewater pre-treatment. Finally, a novel treatment strategy for landfill leachate aimed at decreasing its environmental impact was studied. The system consisted of a membrane bioreactor (MBR) pre-treatment to remove COD, N and SS, followed by a combined reverse osmosis-electrodialysis reversal (RO-EDR) treatment to remove salts and decrease brine volume. The MBR decreased inorganic carbon by 92 ± 8% and achieved N removal of 85%. RO achieved a recovery of 84% and rejections above 95%. The EDR unit treating the RO brine achieved a recovery of 67%; thus, the average recovery of the whole system was above 90%. It is important to highlight that end-of-life regenerated RO membranes were used in this study. This fact, together with the low volume of brine (<10%), helped decrease the environmental impact of leachate treatment. Hence, this thesis was conducted from an applied research approach, aimed at reducing the gap between basic technology development and industrial implementation.
APA, Harvard, Vancouver, ISO, and other styles
27

Baker, Syed Murtuza [Verfasser], Björn H. [Akademischer Betreuer] Junker, Falk [Akademischer Betreuer] Schreiber, and Michael [Akademischer Betreuer] Mangold. "A parameter estimation framework for kinetic models of biological systems / Syed Murtuza Baker. Betreuer: Björn H. Junker ; Falk Schreiber ; Michael Mangold." Halle, Saale : Universitäts- und Landesbibliothek Sachsen-Anhalt, 2012. http://d-nb.info/1028798792/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
28

Namanda, Sam. "Current and potential systems for maintaining sweetpotato planting material in areas with prolonged dry seasons : a biological, social and economic framework." Thesis, University of Greenwich, 2012. http://gala.gre.ac.uk/11977/.

Full text
Abstract:
This study on sweetpotato seed systems was conducted in Mukono, Kamuli, Bukedea and Soroti districts in Uganda, and in Mwanza, Shinyanga and Meatu regions in the Lake Zone of Tanzania during 2007 – 2011. It aimed at developing simple, affordable and applicable technologies for conserving and multiplying sweetpotato planting material for early season planting after the long dry season. The study sought to understand and describe farmers' existing approaches, improve on rapid multiplication techniques and develop rational use of available planting material. Farmers reported a complete lack of, or insufficient, planting material for early planting immediately following the long dry season. Farmers recognised that obtaining planting material early was beneficial, as it resulted in increased root yield, an early source of food and sales at high prices. The Triple S (Sand, Storage and Sprouting) method of producing ample planting material for early season planting was developed in Uganda after testing various ways of storing the roots during the dry season so as to eliminate dry season mortality. Using roots obtained from crops planted at the conventional time, and planting them out in watered gardens 1-2 months before the rains to act as sources of sprouts for vine cuttings, was the most appropriate approach. The method was validated in Tanzania, which has a longer dry season. The use of 20 cm cuttings instead of mini cuttings (10 cm) in rapid multiplication of vines needed less labour and care. Pre-planting fertiliser (NPK 25:5:5) doubled the quantity of planting material generated, and planting shorter and fewer cuttings than recommended saved planting material to enable more extensive plant coverage and doubled potential production. All these findings greatly contribute to improving the conservation and multiplication of planting material, especially the availability of early planting material.
APA, Harvard, Vancouver, ISO, and other styles
29

Mao, Feng. "Ecological water quality assessment and science-driven policy : investigating the EU Water Framework Directive and river basin governance in China." Thesis, University of Cambridge, 2015. https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.708638.

Full text
APA, Harvard, Vancouver, ISO, and other styles
30

Hay, J. "The dilemma of a theoretical framework for the training of education support services staff within inclusive education." Journal for New Generation Sciences, Vol 10, Issue 3: Central University of Technology, Free State, Bloemfontein, 2012. http://hdl.handle.net/11462/606.

Full text
Abstract:
Published Article
The medical biological and ecosystemic models are two paradigms which are currently making a huge impact on education support services on an international level. The medical biological model has been dominating the way in which multidisciplinary support has been delivered within 20th-century special education. However, with the advent of inclusive education, the ecosystemic model has initially been pushed to the fore as the preferred metatheory of support services. This article specifically interrogates these two conflicting paradigms in education support services within the South African schooling and higher education bands, as well as Bronfenbrenner's integration of these models with regard to the bio-ecological model. Finally, this article proposes the bio-ecosystemic framework according to which the training of multidisciplinary education support services staff should proceed in order to ensure a sound and less conflicting theoretical framework.
APA, Harvard, Vancouver, ISO, and other styles
31

Sichtig, Heike. "The SGE framework discovering spatio-temporal patterns in biological systems with spiking neural networks (S), a genetic algorithm (G) and expert knowledge (E) /." Diss., Online access via UMI:, 2009.

Find full text
Abstract:
Thesis (Ph. D.)--State University of New York at Binghamton, Thomas J. Watson School of Engineering and Applied Science, Department of Bioengineering, Biomedical Engineering, 2009.
Includes bibliographical references.
APA, Harvard, Vancouver, ISO, and other styles
32

Kuti, Temitope Babatunde. "Towards effective multilateral protection of traditional knowledge within the global intellectual property framework." University of the Western Cape, 2017. http://hdl.handle.net/11394/6339.

Full text
Abstract:
Magister Legum - LLM (Mercantile and Labour Law)
Traditional Knowledge (TK) has previously been considered a 'subject' in the public domain, unworthy of legal protection. However, the last few decades have witnessed increased discussions on the need to protect the knowledge of indigenous peoples for their economic sustenance, the conservation of biodiversity and modern scientific innovation. Questions remain as to how TK can best be protected through existing, adapted or sui generis legal frameworks. Based on an examination of the formal knowledge-protection mechanisms (i.e. the existing intellectual property system), this mini-thesis contends that these existing systems are inadequate for protecting TK. As a matter of fact, they serve as veritable platforms for incidences of biopiracy. It further argues that the many international initiatives designed to protect TK have so far failed owing to inherent shortcomings embedded in them. Furthermore, a comparative assessment of several national initiatives (in New Zealand, South Africa and Kenya) supports an understanding that several domestic efforts to protect TK have been rendered ineffective due to the insurmountable challenge of dealing with the international violations of local TK rights. It is therefore important that on-going international negotiations for the protection of TK, including the negotiations within the World Intellectual Property Organisation's Intergovernmental Committee on Intellectual Property and Genetic Resources, Traditional Knowledge and Folklore (IGC), do not adopt similar approaches to those employed in previous initiatives if TK must be efficiently and effectively protected. This mini-thesis concludes that indigenous peoples possess peculiar protection mechanisms for their TK within the ambit of their customary legal systems and that these indigenous mechanisms are the required anchors for effective global protections.
APA, Harvard, Vancouver, ISO, and other styles
33

Kuti, Temitope Babatunde. "Towards effective Multilateral protection of traditional knowledge within the global intellectual property framework." University of the Western Cape, 2018. http://hdl.handle.net/11394/6245.

Full text
Abstract:
Magister Legum - LLM (Mercantile and Labour Law)
Traditional Knowledge (TK) has previously been considered a 'subject' in the public domain, unworthy of legal protection. However, the last few decades have witnessed increased discussion of the need to protect the knowledge of indigenous peoples for their economic sustenance, the conservation of biodiversity and modern scientific innovation. Questions remain as to how TK can best be protected through existing, adapted or sui generis legal frameworks. Based on an examination of the formal knowledge-protection mechanisms (i.e. the existing intellectual property system), this mini-thesis contends that these existing systems are inadequate for protecting TK; indeed, they serve as veritable platforms for instances of biopiracy. It further argues that the many international initiatives designed to protect TK have so far failed owing to their inherent shortcomings. Furthermore, a comparative assessment of several national initiatives (in New Zealand, South Africa and Kenya) supports the understanding that domestic efforts to protect TK have been rendered ineffective by the insurmountable challenge of dealing with international violations of local TK rights. It is therefore important that ongoing international negotiations for the protection of TK, including those within the World Intellectual Property Organisation's Intergovernmental Committee on Intellectual Property and Genetic Resources, Traditional Knowledge and Folklore (IGC), do not adopt approaches similar to those employed in previous initiatives if TK is to be efficiently and effectively protected. This mini-thesis concludes that indigenous peoples possess their own protection mechanisms for TK within the ambit of their customary legal systems, and that these indigenous mechanisms are the required anchors for effective global protection.
APA, Harvard, Vancouver, ISO, and other styles
34

Schneider, Tamara. "A Framework for Analyzing and Optimizing Regional Bio-Emergency Response Plans." Thesis, University of North Texas, 2010. https://digital.library.unt.edu/ark:/67531/metadc33200/.

Full text
Abstract:
The presence of naturally occurring and man-made public health threats necessitates the design and implementation of mitigation strategies, such that adequate response is provided in a timely manner. Since multiple variables, such as geographic properties, resource constraints, and government-mandated time-frames, must be accounted for, computational methods provide the necessary tools to develop contingency response plans while respecting underlying data and assumptions. A typical response scenario involves the placement of points of dispensing (PODs) in the affected geographic region to supply vaccines or medications to the general public. Computational tools aid in the analysis of such response plans, as well as in the strategic placement of PODs, so that feasible response scenarios can be developed. Due to the sensitivity of bio-emergency response plans, geographic information, such as POD locations, must be kept confidential. The generation of synthetic geographic regions allows for the development of emergency response plans on non-sensitive data, as well as for the study of the effects of single geographic parameters. Further, synthetic representations of geographic regions allow results to be published and evaluated by the scientific community. This dissertation presents methodology for the analysis of bio-emergency response plans, methods for plan optimization, as well as methodology for the generation of synthetic geographic regions.
APA, Harvard, Vancouver, ISO, and other styles
35

Horn, Laura Sandra. "The common concern of humankind and legal protection of the global environment." Phd thesis, Faculty of Law, 2001. http://hdl.handle.net/2123/6188.

Full text
APA, Harvard, Vancouver, ISO, and other styles
36

Papadopoulou, Frantzeska. "Opening Pandora's Box : Exploring Flexibilities and Alternatives for Protecting Traditional Knowledge and Genetic Resources under the Intellectual Property Framework." Doctoral thesis, Stockholms universitet, Juridiska institutionen, 2014. http://urn.kb.se/resolve?urn=urn:nbn:se:su:diva-100568.

Full text
Abstract:
What happens when resources become valuable and scarce? How does intellectual property deal with market failures related to sub-patentable innovation, or to purely traditional knowledge with interesting applications? The protection of traditional knowledge and genetic resources (TKGR) has been one of the major modern challenges in international IP law. The entry into force of the Convention on Biological Diversity (CBD) and its implementation in national legislation have created more questions than they answered. The objective of this dissertation is to assist in the evaluation of current national and regional implementation initiatives as well as in the presentation and evaluation of different forms of entitlements that could be applicable in the case of TKGR. The dissertation employs a theoretical framework for this evaluation, combining the Coase theorem and Rawls' theory of justice. The choice of these two theoretical models is not a random one. In order for an entitlement covering TKGR to be successful, it has to be efficient: it has to offer a stable and efficient marketplace where access to TKGR is possible without unnecessary frictions. However, efficiency cannot be the only objective. An entitlement focusing solely on efficiency would fall short of the needs and special considerations of TKGR trade. It would, above all, run counter to the objectives and major principles of the CBD, notably the "fair and equitable sharing of the benefits", and would certainly fail to address the very important North-South perspective. Fairness is thus a necessary complement to the efficiency of the proposed entitlement. This dissertation proposes a thorough investigation of the special characteristics of right-holders, subject-matter and marketplace, as well as of the general expectations that an entitlement is supposed to fulfil. In parallel, it looks into the meaning and scope of alternative entitlements in order to be able to propose the best alternative.
APA, Harvard, Vancouver, ISO, and other styles
37

Persson, Jan. "Stream monitoring using near-infrared spectroscopy of epilithic material." Licentiate thesis, Umeå : Ecology and Environmental Science, Umeå University, 2007. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-1154.

Full text
APA, Harvard, Vancouver, ISO, and other styles
38

Guo, Shishi. "Biologically-inspired control framework for insect animation." Thesis, Bournemouth University, 2015. http://eprints.bournemouth.ac.uk/22502/.

Full text
Abstract:
Insects and other arthropods, such as ants, spiders and cockroaches, are common in our world. Virtual representations of them have wide applications in Virtual Reality (VR), video games and films. Compared with the large volume of work on biped animation, the problem of insect animation has been less explored. Insects' small body parts, complex structures and high-speed movements challenge the standard techniques of motion synthesis. This thesis addresses that challenge by presenting a framework to efficiently automate the modelling and authoring of insect locomotion. The framework is inspired by two key observations of real insects: a fixed gait pattern and a distributed neural system. At the top level, a Triangle Placement Engine (TPE) is modelled on the double-tripod gait pattern of insects and determines the location and orientation of insect foot contacts, given various user inputs. At the low level, a Central Pattern Generator (CPG) controller actuates individual joints by mimicking the distributed neural system of insects. A Controller Look-Up Table (CLUT) translates the high-level commands from the TPE into the low-level control parameters of the CPG. In addition, a novel strategy is introduced to determine when legs start to swing. During high-speed movements, the swing mode is triggered when the Centre of Mass (COM) steps outside the Supporting Triangle. However, this simplified mechanism is not sufficient to produce the gait variations observed when insects move at slow speed. The proposed strategy handles the slow-speed case by considering four independent factors: the relative distance to the extreme poses, the stance period, the relative distance to the neighbouring legs, and the load information. This strategy avoids the collisions between legs and the over-stretching of leg joints produced by conventional methods.
The framework developed in this thesis allows sufficient control and fits seamlessly into the existing animation production pipeline. With this framework, animators can model the motion of a single insect in an intuitive way by specifying the walking path, terrain, speed and so on. The success of this framework shows that introducing biological components can synthesise insect animation in a natural and interactive fashion.
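The anti-phase leg coordination of the double-tripod gait described in this abstract can be illustrated with coupled phase oscillators. The following is a minimal, hypothetical Python sketch, not the thesis's actual CPG controller: the leg ordering, the Kuramoto-style coupling, and the Euler step size are all assumptions made for illustration.

```python
import math

# Desired phase offsets for a double-tripod gait: two leg groups in anti-phase.
# Legs: [L1, R1, L2, R2, L3, R3]; tripod A = {L1, R2, L3}, tripod B = {R1, L2, R3}.
OFFSETS = [0.0, math.pi, math.pi, 0.0, 0.0, math.pi]

def step_cpg(phases, omega=2 * math.pi, coupling=4.0, dt=0.01):
    """Advance a network of coupled phase oscillators by one Euler step.

    Each oscillator advances at base frequency `omega` (rad/s) and is pulled
    toward its desired offset relative to every other leg, so the network
    settles into the double-tripod phase pattern regardless of initial state.
    """
    new_phases = []
    for i, th_i in enumerate(phases):
        drive = omega
        for j, th_j in enumerate(phases):
            if i != j:
                # Pull leg i toward the target phase difference with leg j.
                target = OFFSETS[i] - OFFSETS[j]
                drive += coupling * math.sin(th_j - th_i + target)
        new_phases.append((th_i + drive * dt) % (2 * math.pi))
    return new_phases
```

Iterating `step_cpg` from random initial phases pulls tripod A into phase with itself and into anti-phase with tripod B, which is the hallmark of the double-tripod gait; each leg's phase would then drive that leg's joint trajectory.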
APA, Harvard, Vancouver, ISO, and other styles
39

Nielsen, Alec A. K. "Biomolecular and computational frameworks for genetic circuit design." Thesis, Massachusetts Institute of Technology, 2017. http://hdl.handle.net/1721.1/109665.

Full text
Abstract:
Thesis: Ph. D., Massachusetts Institute of Technology, Department of Biological Engineering, 2017.
Page 322 blank. Cataloged from PDF version of thesis.
Includes bibliographical references (pages 295-321).
Living cells naturally use gene regulatory networks termed "genetic circuits" to exhibit complex behaviors such as signal processing, decision-making, and spatial organization. The ability to rationally engineer genetic circuits has applications in several biotechnology areas including therapeutics, agriculture, and materials. However, genetic circuit construction has traditionally been time- and labor-intensive; tuning regulator expression often requires manual trial-and-error, and the results frequently function incorrectly. To improve the reliability and pace of genetic circuit engineering, we have developed biomolecular and computational frameworks for designing genetic circuits. A scalable biomolecular platform is a prerequisite for genetic circuit design. In this thesis, we explore TetR-family repressors and the CRISPRi system as candidates. First, we applied 'part mining' to build a library of TetR-family repressors gleaned from prokaryotic genomes. A subset was used to build synthetic 'NOT gates' for use in genetic circuits. Second, we tested catalytically inactive dCas9, which employs small guide RNAs (sgRNAs) to repress genetic loci via the programmability of RNA:DNA base pairing. To this end, we used dCas9 and synthetic sgRNAs to build transcriptional logic gates with high on-target repression and negligible cross-talk, and connected them to perform computation in living cells. We further demonstrated that a synthetic circuit can directly interface with a native E. coli regulatory network. To accelerate the design of circuits that employ these biomolecular platforms, we created a software design tool called Cello, in which a user writes a high-level functional specification that is automatically compiled to a DNA sequence. Algorithms first construct a circuit diagram, then assign and connect genetic "gates", and simulate performance.
Reliable circuit design requires the insulation of gates from genetic context, so that they function identically when used in different circuits. We used Cello to design the largest library of genetic circuits to date, where each DNA sequence was built as predicted by the software with no additional tuning. Across all circuits, 92% of the output states functioned as predicted. Design automation simplifies the incorporation of genetic circuits into biotechnology projects that require decision-making, control, sensing, or spatial organization.
by Alec A.K. Nielsen.
Ph. D.
APA, Harvard, Vancouver, ISO, and other styles
40

Ullah, Amjad. "Towards a novel biologically-inspired cloud elasticity framework." Thesis, University of Stirling, 2017. http://hdl.handle.net/1893/26064.

Full text
Abstract:
With the widespread use of the Internet, the popularity of web applications has significantly increased. Such applications are subject to unpredictable workload conditions that vary from time to time. For example, an e-commerce website may face higher workloads than normal during festivals or promotional schemes. Such applications are critical, and performance-related issues or service disruptions can result in financial losses. Cloud computing, with its attractive feature of dynamic resource provisioning (elasticity), is a perfect match to host such applications. The rapid growth in the usage of the cloud computing model, as well as the rise in complexity of web applications, poses new challenges regarding the effective monitoring and management of the underlying cloud computational resources. This thesis investigates the state-of-the-art elastic methods, including the models and techniques for the dynamic management and provisioning of cloud resources from a service provider perspective. An elastic controller is responsible for determining the optimal number of cloud resources required at a particular time to achieve the desired performance demands. Researchers and practitioners have proposed many elastic controllers using versatile techniques, ranging from simple if-then-else rules to sophisticated optimisation, control theory and machine learning based methods. However, despite the extensive range of existing elasticity research, implementing an efficient scaling technique that satisfies actual demand remains a challenge, and many issues have not received much attention from a holistic point of view.
Some of these issues include: 1) the lack of adaptability and the static scaling behaviour of completely fixed approaches; 2) the burden of additional computational overhead, the inability to cope with sudden changes in workload behaviour, and the preference for adaptability over reliability at runtime in fully dynamic approaches; and 3) the lack of consideration of uncertainty when designing auto-scaling solutions. This thesis seeks solutions that address these issues together, using an integrated approach. Moreover, this thesis aims at the provision of qualitative elasticity rules. It proposes a novel biologically-inspired switched feedback control methodology to address the horizontal elasticity problem. The switched methodology utilises multiple controllers simultaneously, with the selection of a suitable controller realised by an intelligent switching mechanism. Each controller embodies a different elasticity policy, designed using the principles of the fixed-gain feedback controller approach. The switching mechanism is implemented as a fuzzy system that determines a suitable controller/policy at runtime based on the current behaviour of the system. Furthermore, to improve the possibility of bumpless transitions and to avoid the oscillatory behaviour commonly associated with switching-based control methodologies, this thesis proposes an alternative soft-switching approach, which incorporates a biologically-inspired computational model of action selection based on the basal ganglia. In addition, this thesis formulates the design of the membership functions of the switching mechanism as a multi-objective optimisation problem. The purpose of this formulation is to obtain near-optimal (or fine-tuned) parameter settings for the membership functions of the fuzzy control system in the absence of domain experts' knowledge.
This problem is addressed using two different techniques: the commonly used genetic algorithm and a less well-known, economical alternative called the Taguchi method. Lastly, seven different kinds of real workload patterns are identified, each of which reflects a different set of applications. Six real and one synthetic HTTP traces, one for each pattern, are further identified and utilised to evaluate the performance of the proposed methods against state-of-the-art approaches.
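The runtime policy selection that this abstract describes can be illustrated with a toy fuzzy rule base. The following Python sketch is hypothetical, not the thesis's controller: the policy names, the triangular membership functions, and the normalised-load input are all invented for illustration.

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def select_policy(load):
    """Pick the scaling policy with the highest membership for the current load.

    `load` is assumed to be a normalised utilisation measure in [0, 1].
    """
    memberships = {
        "conservative": tri(load, -0.5, 0.0, 0.5),   # low load: scale cautiously
        "moderate":     tri(load, 0.25, 0.5, 0.75),  # mid load
        "aggressive":   tri(load, 0.5, 1.0, 1.5),    # high load: scale out fast
    }
    return max(memberships, key=memberships.get)
```

Low loads favour the conservative policy and loads near 1.0 the aggressive one; a soft-switching variant, as proposed in the thesis, would blend the outputs of all controllers by these membership degrees instead of taking the hard argmax.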
APA, Harvard, Vancouver, ISO, and other styles
41

Dong, L. "A biologically Inspired Framework for Classification of Visual Information." Thesis, Queen Mary, University of London, 2009. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.509679.

Full text
APA, Harvard, Vancouver, ISO, and other styles
42

Willimott, India Jane. "Biological molecules for the formation and assembly of metal-organic frameworks." Thesis, University of Southampton, 2018. https://eprints.soton.ac.uk/422223/.

Full text
Abstract:
Metal-organic frameworks (MOFs) are a class of highly porous hybrid materials, consisting of metal-ion-based nodes and bridging organic linkers connected in two- or three-dimensional periodic structures. By combining different inorganic and organic building units, a variety of structures and properties can be tailored to a range of applications: gas storage and separation, catalysis and biotechnology. Some of these applications require benign building blocks that are biologically compatible, or require specific morphology and crystal size. The first part of the thesis focuses on using biomolecules as the organic constituents in MOFs. Biomolecules attract particular attention because of their diverse metal-binding sites and increased biocompatibility, and because they add structural and chemical diversity to the material, for example affording chiral frameworks. Three groups of molecules were investigated: nucleotides, nucleic acids and peptides. Various experimental conditions were employed in the quest to establish conditions under which a porous network comprising one of these biomolecules would form. One-dimensional crystal morphologies of zinc and copper with nucleotides were identified, along with a spherical zinc-peptide structure; however, no porous material was discerned in this investigation. The final part of the thesis explores the biomimetic mineralisation of MOFs, utilising biomolecules as an organic matrix for the assembly of inorganic materials. Peptides were explored because their self-assembly and substrate-recognition properties could potentially provide high levels of control during synthesis. Peptide sequences that bound to the desired material were identified using the phage display technique, a combinatorial biology protocol used to find binding peptides via a rapid directed-evolution approach.
Millions of phages bearing different peptides are exposed to a target material to select the best-binding sequence(s) from the genetically engineered library. This thesis demonstrates a successful application of phage display to identify sequence-specific peptides for MOFs, and the peptides' subsequent successful use in synthesising a crystalline material; experimental conditions were established under which a crystalline framework does not form in the absence of the peptide. Additional outcomes emphasised the effect of framework functionality on the identified peptide sequence(s).
APA, Harvard, Vancouver, ISO, and other styles
43

Pitt, Joel Peter William. "Modelling the spread of invasive species across heterogeneous landscapes." Diss., Lincoln University, 2008. http://hdl.handle.net/10182/912.

Full text
Abstract:
Invasive species are well known to cause millions of dollars of economic as well as ecological damage around the world. New Zealand, as an island nation, is fortunate because it has the opportunity to regulate and monitor travel and trade to prevent the establishment of new species. Nevertheless, foreign species continue to arrive at the borders and continue to cross them, thus requiring some form of management. The control and management of a new incursion of an invasive species would clearly benefit from predictive tools that might indicate where and how quickly the species is likely to spread after it has established. During the process of spread, an invading species must interact with a complex and heterogeneous environment, and the suitability of the habitat in a region determines whether it survives. Many dispersal models ignore such interactions, and while they may be interesting theoretical models, they are less useful for the practical management of invasive species. The purpose of this study was to create and investigate the behaviour of a spatially explicit model that simulates insect dispersal over realistic landscapes. The spatially explicit model (Modular Dispersal in GIS, MDiG) was designed as an open-source modular framework for dispersal simulation integrated within a GIS. The model's modules were designed to approximate local diffusion, long-distance dispersal, growth, and chance population mortality based on the underlying suitability of a region for the establishment of a viable population. The spatially explicit model has at its core a dispersal module that simulates long-distance dispersal based on an underlying probability distribution of dispersal events. This study illustrates how to extract the frequency of long-distance dispersal events, as well as their distances, from time-stamped occurrence data, to fit the Cauchy probability distribution that comprises the dispersal module.
An investigation of the long-distance dispersal module's behaviour showed that, in general, it generated predictions of the rate of spread consistent with those of analytical partial differential and integrodifference equations. However, there were some differences. Spread rate was found to depend mainly on the measurement technique used to determine the invasion front or boundary; therefore, an alternative method to determine the boundary of a population for fat-tailed dispersal kernels is presented, based on the point of greatest change in population density. While previously it was thought that the number of foci rather than foci size was more important in stratified dispersal, and that finer-resolution simulations would spread more quickly, simulations in this study showed that there is an optimal resolution for higher spread rates and rates of area increase. Additionally, much research has suggested that the observed lag at the beginning of an invasion may be due to a lack of suitable habitats or the low probability of individuals striking the right combination of conditions in a highly heterogeneous environment. This study shows that an alternative explanation may simply be the smaller number of dispersal-event sources. A case study is described in which the spatially explicit dispersal model was applied to recreate the invasion history of the Argentine ant in New Zealand. The Argentine ant is a global invasive pest which arrived in New Zealand in 1990 and has since spread to both main islands, primarily through human-mediated dispersal. The spatially explicit simulation model and its predictive ability were compared to those of a uniform spread model based on equivalent total area covered. While the uniform spread model gave more accurate predictions of observed occurrences early in the invasion process, it was less effective as the invasion progressed.
The spatially explicit model predicted areas of high probability of establishment (hot spots) consistent with where populations have been found, but accuracy varied between 40% and 70% depending on the year of the simulation and parameter selection. While the uniform spread model sometimes slightly outperformed or matched the simulation with respect to accuracy early in the invasion process, it did not show the relative risk of establishment and was less effective later in the invasion, when the stochastic events generated by the simulation model were averaged to represent trends in the pattern of spread. Additionally, probabilistic predictions such as those generated by the spatially explicit model allow the uncertainty of a prediction to be characterised and communicated. This thesis demonstrates that heterogeneous spread models can give more insight and detail than one-dimensional or homogeneous spread models, but that both can be useful at different stages of the invasion process. The importance of compiling appropriate data on dispersal and habitat suitability to aid invasion management has been highlighted. Additionally, a number of important hypotheses that need to be addressed to increase understanding of how species interact with their complex environment have been identified and discussed.
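The fat-tailed dispersal kernel mentioned in this abstract can be illustrated in a few lines of Python. This is a hedged sketch of drawing long-distance dispersal events from a half-Cauchy distance distribution via inverse-transform sampling; the function name and the uniform choice of direction are illustrative assumptions, not details taken from the thesis.

```python
import math
import random

def sample_dispersal(x, y, gamma, rng=random):
    """Sample a long-distance dispersal destination from a source at (x, y).

    The dispersal distance is drawn from a half-Cauchy distribution with
    scale `gamma` by inverting its CDF, F(d) = (2/pi) * atan(d / gamma);
    the dispersal direction is assumed uniform on [0, 2*pi).
    """
    u = rng.random()                                # u ~ Uniform(0, 1)
    distance = gamma * math.tan(math.pi * u / 2.0)  # inverse half-Cauchy CDF
    angle = rng.uniform(0.0, 2.0 * math.pi)         # isotropic dispersal
    return x + distance * math.cos(angle), y + distance * math.sin(angle)
```

Because the Cauchy kernel is fat-tailed (it has no finite variance), occasional draws land very far from the source, which is what produces the stratified, new-foci pattern of spread discussed above.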
APA, Harvard, Vancouver, ISO, and other styles
44

Foucault-Collet, Alexandra. "Luminescent lanthanide metal-organic frameworks and dendrimer complexes for optical biological imaging." Thesis, Orléans, 2013. http://www.theses.fr/2013ORLE2027/document.

Full text
Abstract:
Unique properties of luminescent lanthanide reporters explain their emergence for bioanalytical and optical imaging applications. Lanthanide ions possess long emission lifetimes, good resistance to photodecomposition, and sharp emission bands that do not overlap. In addition, several lanthanides emit in the near-infrared (NIR) region of the electromagnetic spectrum, making them very interesting for in vivo imaging. Free lanthanide cations have low extinction coefficients due to the forbidden nature of the f → f transitions. Therefore, lanthanides must be sensitized using a photonic converter, such as an organic chromophore, through the "antenna effect". We report here new near-infrared-emitting compounds whose structures incorporate a high density of lanthanide cations and sensitizers per unit volume: i) the nano-MOF Yb-PVDC-3, based on Yb3+ sensitized by phenylenevinylene dicarboxylates; ii) polymetallic dendrimer complexes formed with derivatives of new generation-3 polyamidoamine dendrimers, in which 8 lanthanide ions (Eu3+, Yb3+, Nd3+) can be sensitized by the 32 antennae derived from 1,8-naphthalimide. These two families of compounds were fully characterised for their physical and photophysical properties as well as for their respective biocompatibilities. They are stable in various media, and their low cytotoxicity and emission of a sufficient number of photons make them suitable for near-infrared live-cell imaging.
One of the main outcomes of this work is the establishment of the proof of principle that nano-MOFs and lanthanide-derived dendrimers can be used for the sensitization of NIR-emitting lanthanides to create a new generation of NIR optical imaging agents suitable for both in cellulo and in vivo applications. The present work also validates the efficiency of the strategy of using both types of nanoscale systems described here to increase the number of emitted photons per unit volume, for improved detection sensitivity and to compensate for low quantum yields.
APA, Harvard, Vancouver, ISO, and other styles
45

AMIRESMAEILI, NASIM. "DEVELOPING FRAMEWORKS FOR IDENTIFYING THE BIOLOGICAL CONTROL AGENTS OF DROSOPHILA SUZUKII IN LOMBARDY, ITALY." Doctoral thesis, Università degli Studi di Milano, 2017. http://hdl.handle.net/2434/490803.

Full text
Abstract:
Many invasive pests are arthropods that every year reach, colonize and spread into new areas from their native ranges. Invasive species are the second largest threat to biodiversity after habitat loss, as they compete with natives for food and space. Drosophila suzukii (Matsumura) (Diptera: Drosophilidae), the spotted wing drosophila (SWD), a hazardous quarantine pest native to eastern and south-eastern Asia, simultaneously infested Italy and Spain in 2008, its first appearance on the European continent. Huge monetary losses, due to direct damage, market loss, management costs, and the rejection of exports over altered processing practices, were immediately associated with the arrival of D. suzukii in Italy. As oviposition of D. suzukii begins at ripening, very close to harvest, finding the most promising natural enemy, as an alternative to chemicals, is an important tool in the control of this fruit fly. The aim of the present research was to deepen knowledge of the pest-parasitoid relationship in Lombardy by inventorying the native natural enemies present in the area in association with the new pest. Six parasitoids were found in association with drosophilids in Lombardy: Pachycrepoideus vindemmiae (Rondani, 1875) and Spalangia erythromera Forster, 1850 (Hym.: Pteromalidae), Leptopilina boulardi (Barbotin, Carton and Kelner-Pillaut, 1979) and Leptopilina heterotoma (Thomson, 1862) (Hym.: Figitidae), Trichopria drosophilae (Perkins, 1910) (Hym.: Diapriidae) and Asobara tabida Nees von Esenbeck, 1834 (Hym.: Braconidae). Among the six parasitoids found in field monitoring, attention was focused on the pupal parasitoid T. drosophilae (Hymenoptera: Diapriidae). Laboratory tests were carried out, and information on the performance of T. drosophilae in relation to host prey (D. suzukii vs D. melanogaster) and to D. suzukii was acquired. T. drosophilae proved to be a good candidate for mass rearing and biological control methods. Information on the coexistence of D. suzukii with other exotic drosophilids was also acquired. The presence in Lombardy of two other exotic drosophilids was ascertained: Chymomyza amoena (Loew, 1862) and Zaprionus tuberculatus (Malloch, 1932). These were the first records for the Lombardy region, as C. amoena was first detected in Italy in Veneto in 1999 (Bächli et al. 1999) and Z. tuberculatus in Trentino in 2013 (Raspi et al. 2014). In the present study, the populations, seasonal activity, favourable habitats and hosts of these two exotic pests were also studied.
APA, Harvard, Vancouver, ISO, and other styles
46

Stylianou, Kyriakos C. "Biologically derived and pyrene-based metal-organic frameworks for advanced applications." Thesis, University of Liverpool, 2011. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.569888.

Full text
Abstract:
Metal-Organic Frameworks (MOFs) represent a new type of multifunctional porous material. These materials can be obtained from the connection of metal centres or clusters and organic ligands in such a way that extended 1D, 2D or 3D structures can be isolated. The unlimited possibilities for connecting the different building blocks within these materials make possible the formation of networks with a range of pore sizes and applications in gas storage and separation, catalysis, molecular sensing and drug delivery. In this thesis, the synthesis and characterisation of adenine-based and pyrene-derived MOFs is presented. Based on the physical properties of these MOFs, we demonstrate that they can potentially be used for advanced applications in CO2 capture, CH4 storage, CO2/CH4 separation, xylene separation, sensing and base-pairing recognition. The first part of this thesis deals with the synthesis of novel adenine-based MOFs. Within the isolated MOFs, adenine provides a wide range of binding modes for metal coordination and different protonated forms, acting as a bridging or terminal ligand. The diverse functionality of small biologically relevant building blocks such as adenine can tune the chemistry of MOF-type structures to permit increased chemical interactions with guest molecules (CO2 and CH4) for applications such as gas storage and separation. The second part describes the design of a tetracarboxylate ligand based on an optically active pyrene core, H4TBAPy. The choice of ligand is important for the construction of fluorescent MOFs for sensing applications. The isolated MOFs show interesting properties for CO2 capture and CH4 storage and, due to the specific size and shape of the channels and the nature of the polyaromatic ligand used for the isolation of the frameworks, aromatic molecules were introduced for separation applications.
The final part demonstrates the synthesis of a network based on Zn, adenine and H4TBAPy. The Watson-Crick face of the nucleobase within this framework is located within the channels and is responsible for hydrogen bonding with thymine, the complementary base of adenine. Thymine solution isotherms revealed that the 3D structure can adsorb thymine molecules, and this was further confirmed with Infrared Spectroscopy and Thermogravimetric and Elemental Analyses. In conclusion, this thesis represents a small contribution to the extraordinary field of Metal-Organic Framework chemistry. The formation of novel materials with unprecedented network topologies and possible applications in gas storage, gas and liquid separation and molecular sensing is important in order to create novel materials with improved and advanced properties.
47

Kumar, Ketan. "Product management framework for the development of automation solutions for biologics drug substance manufacturing." Thesis, Massachusetts Institute of Technology, 2020. https://hdl.handle.net/1721.1/126904.

Full text
Abstract:
Thesis: M.B.A., Massachusetts Institute of Technology, Sloan School of Management, May, 2020
Thesis: S.M., Massachusetts Institute of Technology, Department of Chemical Engineering, May, 2020
Cataloged from the official PDF of thesis.
Includes bibliographical references (pages 119-122).
This thesis presents a product management framework for the development of innovative manufacturing automation solutions, and the application of this framework to the development of automation for a continuous biomanufacturing platform at Amgen. A recently formed team at Amgen - Next Gen Automation (Drug Substance) (NGA(DS)) - is working to develop innovative automation solutions that support Amgen's strategic initiatives. Being an innovation team, NGA(DS) faces uncertainty regarding which aspects of the existing process are best suited to improvement through automation and which automation solutions best achieve these results. The framework presented in this thesis provides NGA(DS) a methodology to develop useful solutions in the presence of this uncertainty. Supporting automation development for the continuous biomanufacturing platform is one of the work streams of NGA(DS), and was used as a case study for the development of the product management framework. Several prominent innovation and product management frameworks were leveraged in the development of the framework for this project, including Lean Startup and Disciplined Entrepreneurship. As recommended by the sources studied, this project modelled innovation as a collaborative and iterative process of testing hypotheses regarding the value of the product being developed. Specific tools and concepts were applied from the source frameworks, as relevant to the team's needs. The framework developed in this project consisted of two phases - Opportunity Analysis and Solution Development - with multiple data collection and analysis activities in each phase. Results from the activities were validated through reviews by the NGA(DS) team leadership and other relevant Subject Matter Experts within Amgen. The framework developed in this project is intended to guide future decision making for product development activities by NGA(DS).
48

Ribeiro, Edward de Oliveira. "p2pBIOFOCO : um framework Peer-to-Peer para processamento distribuido do BLAST." reponame:Repositório Institucional da UnB, 2006. http://repositorio.unb.br/handle/10482/3093.

Full text
Abstract:
Master's dissertation, Universidade de Brasília, Instituto de Ciências Exatas, Departamento de Ciência da Computação, 2006.
Made available in DSpace on 2010-01-11 (GMT). Previous issue date: 2006-03-27.
A rewarding area for the design and development of distributed systems has been Bioinformatics, an interdisciplinary research field that uses knowledge from Computer Science, Mathematics and Statistics to solve problems in Molecular Biology. Nevertheless, in spite of the development and use of distributed technologies in business, industry and academia, distributed systems based on the Peer-to-Peer (P2P) model are still relatively unexplored in the scientific field. In this dissertation, we propose a new distributed architecture for the execution of Bioinformatics applications, particularly BLAST (Basic Local Alignment Search Tool), using a P2P computing model. BLAST is a suite of tools that measures the similarity between DNA or RNA sequences submitted by the user and the sequences stored in nucleotide and amino acid databases. In this work, we designed and developed a framework, based on the JXTA P2P platform, to distribute BLAST processing among two or more remote sites according to a round-robin task-scheduling algorithm in a virtual private network. The system also has a presence mechanism to advertise the status of the Peers (online/offline), and the flexibility to dynamically add or remove services, that is, without restarting the application. The results of the BLAST processing were stored in an FTP directory through a secure connection. The database used by BLAST was nr, the largest nucleotide database available at the National Center for Biotechnology Information (NCBI). We analyzed the real gains from executing DNA sequence files on 10 machines, distributed among three remote sites, to verify the applicability of the P2P approach in a real testbed environment, and the impact that the RAM limitations of each machine have on the total execution time of the system. The good results obtained motivate further improvements to the current model, such as the inclusion of new task-scheduling algorithms or fault-tolerance mechanisms, and the use of this architecture in real Bioinformatics projects.
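The round-robin distribution of BLAST jobs described in this abstract can be sketched in a few lines. This is a hypothetical illustration under stated assumptions, not the thesis's JXTA code; the file and domain names are invented, and real peers would each run BLAST against their local copy of the nr database.

```python
from itertools import cycle

def round_robin_assign(tasks, peers):
    """Assign each task to the next peer in circular order."""
    assignment = {peer: [] for peer in peers}
    peer_cycle = cycle(peers)
    for task in tasks:
        assignment[next(peer_cycle)].append(task)
    return assignment

# Ten sequence files spread over three remote domains,
# mirroring the testbed described in the abstract.
files = [f"seqs_{i:02d}.fasta" for i in range(10)]
peers = ["domain-a", "domain-b", "domain-c"]
plan = round_robin_assign(files, peers)
```

With 10 files and 3 peers, the first peer receives 4 files and the other two receive 3 each, which is the even spread the scheduler aims for when tasks have similar cost.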
49

Ríos, Gutiérrez Ana de los. "Evaluación integrada de vertidos urbanos, industriales y portuarios a estuarios y zonas costeras mediante análisis químicos y medidas de efectos biológicos (una propuesta para la Directiva Marco del Agua). Assessment of urban, industrial and harbour discharges into estuarine and marine environments using integrated chemical analyses and biological effect measurements (a proposal for the Water Framework Directive)." Doctoral thesis, Universidad de Cantabria, 2016. http://hdl.handle.net/10803/381259.

Full text
Abstract:
The general objective of this PhD thesis is to determine the usefulness of the biomarker approach as a complementary tool for assessing the effects of a broad range of discharges in the evaluation of the status of coastal and transitional water bodies within the scope of the Water Framework Directive. For this purpose, mussels were transplanted to areas affected by different types of diffuse and point-source discharges. Chemical analyses of contaminants were performed on water, sediment and mussel samples. A battery of effect and exposure biomarkers was also measured in the mussels. In addition, the structure of the benthic invertebrate community was analysed at certain study sites. The biomarker results indicated that mussels were exposed to metals, estrogenic endocrine disruptors, genotoxic compounds and organic pollutants at some study sites, and showed significant correlations with the concentrations of contaminants in water and with the benthic community results. Thus, the analysis of biomarkers is proposed as a link between the chemical and ecological statuses demanded by the Water Framework Directive.
50

Lombardo, Luiz Roberto. "FrameEST: um framework de componentes, no padrão MVC, para o domínio de biologia molecular." Universidade Federal de São Carlos, 2006. https://repositorio.ufscar.br/handle/ufscar/333.

Full text
Abstract:
Made available in DSpace on 2016-06-02 (GMT). Previous issue date: 2006-08-25.
Nowadays, genome projects for different organisms are generating a great volume of data, which is stored in heterogeneous and distributed data sources. Moreover, tools available in the genome domain also need to be integrated. Another problem is that the systems developed for these purposes do not fully support researchers, since most lack flexibility and are difficult to extend. The proposal of this work is the development of a software component framework, called FrameEST, built with the most recent reuse technologies, that structures and guides the development of different applications in the molecular biology domain. FrameEST is available for reuse both in the modeling phase, through a CASE tool, and in the implementation phase, as a plug-in for the Eclipse environment. A case study is used to illustrate FrameEST reuse.
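The MVC separation that a component framework like FrameEST structures around can be illustrated with a minimal sketch. The classes below are hypothetical examples of the pattern applied to sequence data, not FrameEST's actual components: the model holds domain logic, the view only formats, and the controller mediates between a data store and the view.

```python
class ESTSequence:
    """Model: holds an expressed sequence tag and its domain logic."""
    def __init__(self, name, bases):
        self.name = name
        self.bases = bases.upper()

    def gc_content(self):
        # Fraction of G and C bases in the sequence.
        return sum(self.bases.count(b) for b in "GC") / len(self.bases)


class ESTView:
    """View: renders a sequence for display, with no domain logic."""
    def render(self, seq):
        return f"{seq.name}: {len(seq.bases)} bp, GC {seq.gc_content():.0%}"


class ESTController:
    """Controller: mediates between a sequence store and the view."""
    def __init__(self, store, view):
        self.store = store
        self.view = view

    def show(self, name):
        return self.view.render(self.store[name])


store = {"est1": ESTSequence("est1", "atggcc")}
controller = ESTController(store, ESTView())
print(controller.show("est1"))  # → est1: 6 bp, GC 67%
```

Keeping the three roles in separate components is what lets an application swap, say, a web view for an Eclipse view without touching the model.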
