Dissertations on the topic "New tools for network analysis"

To see other types of publications on this topic, follow the link: New tools for network analysis.


Browse the top 50 dissertations for research on the topic "New tools for network analysis".

Next to each work in the list of references you will find an "Add to bibliography" button. Click it, and we will automatically generate a bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of a publication as a .pdf file and read its abstract online, when these are available in the record's metadata.

Browse dissertations from a wide range of disciplines and put together an accurate bibliography.

1

Larhlimi, Abdelhalim. "New concepts and tools in constraint-based analysis of metabolic networks / Abdelhalim Larhlimi." Berlin : Freie Universität Berlin, 2009. http://d-nb.info/1023579944/34.

Full text available
2

Trier, Matthias. "Towards a Social Network Intelligence Tool for visual Analysis of Virtual Communication Networks." Saechsische Landesbibliothek- Staats- und Universitaetsbibliothek Dresden, 2014. http://nbn-resolving.de/urn:nbn:de:bsz:14-qucosa-140161.

Full text available
Abstract:
Communities of Practice regularly use virtual means of communication. The supporting software provides members with many sophisticated features for generating content and for communicating with each other via the internet or an intranet. However, functionality to monitor, assess, coordinate, and communicate the quality and development of the underlying electronic networks of experts is frequently missing. To meet this need for increased manageability, this contribution introduces a Social Network Intelligence software approach which aims at supporting the comprehension of the structure and value of electronic communities by automatically extracting and mining the available electronic data of various types of virtual communication networks, such as e-mail archives, discussion groups, or instant messaging. Experimental structural visualizations employing Social Network Analysis methods are combined with Keyword Extraction to move towards a Social Network Intelligence approach which makes complex virtual communication networks transparent. Together with a comprehensive visualization method, an approach for software-supported communication network measurement and evaluation is suggested. It supports the identification of important participants, topics, or clusters in the network, evaluates the interpersonal communication structure, and visually traces the evolution of the knowledge exchange over time.
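The core pipeline the abstract describes (mining a message archive into a communication graph and surfacing important participants) can be sketched in a few lines. This is an illustrative toy, not the tool from the thesis; the message data and the choice of plain degree centrality are assumptions:

```python
from collections import defaultdict

# Hypothetical message log: (sender, recipient) pairs, e.g. mined from an
# e-mail archive or discussion-group dump.
messages = [
    ("alice", "bob"), ("alice", "carol"), ("bob", "alice"),
    ("carol", "alice"), ("dave", "alice"), ("bob", "carol"),
]

# Build a weighted, directed communication graph: edge weight = message count.
edges = defaultdict(int)
for sender, recipient in messages:
    edges[(sender, recipient)] += 1

# Weighted degree centrality (messages sent + received) as a crude proxy
# for a participant's importance in the network.
centrality = defaultdict(int)
for (sender, recipient), weight in edges.items():
    centrality[sender] += weight
    centrality[recipient] += weight

most_central = max(centrality, key=centrality.get)
```

A real tool would add clustering, keyword extraction, and time-sliced views on top of such a graph.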
3

Trier, Matthias. "Towards a Social Network Intelligence Tool for visual Analysis of Virtual Communication Networks." Technische Universität Dresden, 2006. https://tud.qucosa.de/id/qucosa%3A27871.

Full text available
Abstract:
Communities of Practice regularly use virtual means of communication. The supporting software provides members with many sophisticated features for generating content and for communicating with each other via the internet or an intranet. However, functionality to monitor, assess, coordinate, and communicate the quality and development of the underlying electronic networks of experts is frequently missing. To meet this need for increased manageability, this contribution introduces a Social Network Intelligence software approach which aims at supporting the comprehension of the structure and value of electronic communities by automatically extracting and mining the available electronic data of various types of virtual communication networks, such as e-mail archives, discussion groups, or instant messaging. Experimental structural visualizations employing Social Network Analysis methods are combined with Keyword Extraction to move towards a Social Network Intelligence approach which makes complex virtual communication networks transparent. Together with a comprehensive visualization method, an approach for software-supported communication network measurement and evaluation is suggested. It supports the identification of important participants, topics, or clusters in the network, evaluates the interpersonal communication structure, and visually traces the evolution of the knowledge exchange over time.
4

Subagadis, Yohannes Hagos. "A new integrated modeling approach to support management decisions of water resources systems under multiple uncertainties." Doctoral thesis, Saechsische Landesbibliothek- Staats- und Universitaetsbibliothek Dresden, 2015. http://nbn-resolving.de/urn:nbn:de:bsz:14-qucosa-189212.

Full text available
Abstract:
The planning and implementation of effective water resources management strategies need an assessment of multiple (physical, environmental, and socio-economic) issues, and often require new research in which knowledge from diverse disciplines is combined in a unified methodological and operational framework. Such integrative research to link different knowledge domains faces several practical challenges. The complexities are further compounded by multiple actors, frequently with conflicting interests, and by multiple uncertainties about the consequences of potential management decisions. This thesis aims to overcome some of these challenges and to demonstrate how new modeling approaches can provide successful integrative water resources research. It focuses on the development of new integrated modeling approaches which allow integration not only of physical processes but also of socio-economic and environmental issues and the uncertainties inherent in water resources systems. To achieve this goal, two new approaches are developed in this thesis. First, a Bayesian network (BN)-based decision support tool is developed to conceptualize hydrological and socio-economic interaction for supporting management decisions of coupled groundwater-agricultural systems. The method demonstrates the value of combining different commonly used integrated modeling approaches. Coupled component models are applied to simulate the nonlinearity and feedbacks of strongly interacting groundwater-agricultural hydrosystems. Afterwards, a BN is used to integrate the coupled component model results with empirical knowledge and stakeholder inputs. In the second part of this thesis, a fuzzy-stochastic multiple criteria decision analysis tool is developed to systematically quantify both probabilistic and fuzzy uncertainties associated with complex hydrosystems management. It integrates physical process-based models, fuzzy logic, expert involvement and stochastic simulation within a general framework.
Subsequently, the proposed new approaches are applied to a water management problem in a water-scarce arid coastal region of northern Oman, where saltwater intrusion into a coastal aquifer due to excessive groundwater extraction for irrigated agriculture has affected the aquifer's sustainability, endangering the associated socio-economic conditions as well as traditional social structures. The results show the effectiveness of the proposed methods. The first method can aid in assessing the impact of alternative management interventions on the sustainability of aquifer systems while accounting for economic (agriculture) and societal (employment in the agricultural sector) interests in the study area. Results from the second method have provided key decision alternatives which can serve as a platform for negotiation and further exploration. In addition, this approach is suited to systematically quantifying both the probabilistic and the fuzzy uncertainties associated with the decision problem. The new approaches can be applied to address the complexities and uncertainties inherent in water resource systems to support management decisions, while serving as a platform for stakeholder participation.
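The Bayesian-network step described above, integrating model results into probabilistic reasoning, can be illustrated with a deliberately tiny sketch. The two-node structure (extraction level influencing saltwater intrusion) and all probabilities are invented for illustration and are not taken from the thesis:

```python
# Toy discrete Bayesian network: Extraction -> SaltwaterIntrusion.
# All numbers are hypothetical.
p_high_extraction = 0.6                    # P(extraction = high)
p_intrusion = {True: 0.7, False: 0.2}      # P(intrusion | extraction high?)

# Marginalise out the extraction level to get P(intrusion).
p_intrusion_marginal = (
    p_high_extraction * p_intrusion[True]
    + (1 - p_high_extraction) * p_intrusion[False]
)

# Posterior by Bayes' rule: P(extraction = high | intrusion observed).
p_high_given_intrusion = (
    p_high_extraction * p_intrusion[True] / p_intrusion_marginal
)
```

A real decision-support BN would have many more nodes (crop choice, rainfall, employment) and would be fed by the coupled component models, but the inference step is the same marginalisation and conditioning.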
5

Mazza, Andrea. "Innovative Approaches for Optimization of the Distribution System Operation." Doctoral thesis, Politecnico di Torino, 2015. http://hdl.handle.net/11583/2596760.

Full text available
Abstract:
This thesis presents some innovative approaches for analyzing and optimizing the distribution system. In the first part (composed of Chapters 1-4), innovative methods developed for distribution network reconfiguration, in both single-objective and multi-objective frameworks, are presented and applied. In particular, Chapter 1 describes the general problem of distribution network reconfiguration, while Chapter 2 presents the different types of Optimization Methods (OM), focusing on their nature (deterministic or heuristic) and on their goal (single- or multi-objective optimization). Chapter 3 introduces the decision-making methods used for the multi-objective optimization problems (application examples are reported in Chapter 5). Chapter 4 reports some innovative approaches to single-objective optimization, analyzing the possibility of applying intraday reconfiguration and considering the cost of this operation with respect to the initial solution, in which the actual configuration is maintained without any change. Chapter 6 presents some tools developed within the European Project SiNGULAR as functions of the main tool DERMAT, aimed at handling different input data patterns, the correlation among loads and generators, and the presence of correlated harmonic sources in the network. Finally, Appendix A shows the networks used during the doctoral work, while Appendix B presents an overview of the general aspects and characteristics of probability-based methods.
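A core ingredient of any multi-objective reconfiguration method like those discussed above is extracting the set of non-dominated solutions. A minimal sketch, with invented candidate configurations scored on two objectives to be minimised (say, losses and switching cost):

```python
# Hypothetical candidate configurations, scored on (losses, switching_cost),
# both to be minimised.  Values are invented for illustration.
candidates = {
    "config_a": (10.0, 3.0),   # dominated by config_d
    "config_b": (8.0, 5.0),
    "config_c": (12.0, 4.0),   # dominated by config_d
    "config_d": (9.0, 2.0),
}

def dominates(u, v):
    """True if u is at least as good as v everywhere and strictly better once."""
    return all(a <= b for a, b in zip(u, v)) and any(a < b for a, b in zip(u, v))

# Keep only configurations no other candidate dominates: the Pareto front.
pareto_front = {
    name for name, score in candidates.items()
    if not any(dominates(other, score)
               for other_name, other in candidates.items() if other_name != name)
}
```

Decision-making methods such as those in Chapter 3 would then pick one configuration from this front according to operator preferences.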
6

Burns, Zackory T. "Quantifying the sociality of wild tool-using New Caledonian crows through an animal-borne technology." Thesis, University of Oxford, 2014. http://ora.ox.ac.uk/objects/uuid:16db8026-53e4-4fb0-aa69-80d7cc34e183.

Full text available
Abstract:
New Caledonian crows (NC crows; Corvus moneduloides) are the most prolific avian tool users and crafters, using up to three unique tool types derived from numerous plant materials. Since the discovery that wild populations of NC crows use and manufacture different tools in different locations, with no measured environmental correlates of these distributions, the process by which NC crows acquire their tool-oriented behavior has been investigated. Two major findings emerged in 2005: NC crows have a genetic predisposition to manipulate stick-like objects, and they increase their rate of manipulation when exposed to social influences. Since then, much of the research into the sociality of wild NC crows has focused on direct social influences, especially the parent-juvenile relationship, yet no social network of wild NC crows has been described. In my thesis, I characterized a new proximity-logging device, Encounternet, and outlined a four-step plan to assess error in animal-borne devices; uncovered drivers of wild NC crow sociality, such as relatedness, space use, and environmental factors; experimentally manipulated the social network, revealing immediate changes to the number of day-time and roosting partners, the breakdown of first-order relatedness driving sociality, and an increase in the amount of time NC crows associate; and revealed an indirect pathway, via tools left behind by conspecifics, allowing the transmission of tool properties between unrelated NC crows. Altogether, I furthered our understanding of wild NC crow sociality through the use of an animal-borne device, measured the response of the NC crow social network to experimental manipulation in the wild, and demonstrated the utility of animal-borne devices in mapping the network of a population of wild birds.
7

Power, Jane Elizabeth. "New NMR tools for impurity analysis." Thesis, University of Manchester, 2016. https://www.research.manchester.ac.uk/portal/en/theses/new-nmr-tools-for-impurity-analysis(f6814907-cb3b-4c67-9702-dda58fbc726c).html.

Full text available
Abstract:
New NMR Tools for Impurity Analysis was written by Jane Power and submitted for the degree of Doctor of Philosophy in the Faculty of Engineering and Physical Sciences at the University of Manchester, on 31st March 2016. NMR spectroscopy is rich in structural information and is a widely used technique for structure elucidation and characterization of organic molecules; however, for impurity analysis it is not generally the tool of choice. While 1H NMR is quite sensitive, its resolution is often poor, with much signal overlap, due to its narrow chemical shift range (0-10 ppm) and the high abundance of hydrogen atoms in most drugs. Therefore, impurity signals, especially for chemically cognate species, are frequently obscured. 19F NMR, on the other hand, offers extremely high resolution for pharmaceutical applications. It exhibits a far wider chemical shift range (± 300 ppm) than 1H NMR, and typical fluorinated drugs, of which there are many on the market, have only one or two fluorine atoms. In view of this, 19F NMR is considered here as an alternative for low-level impurity analysis and quantification, using a chosen example drug, rosuvastatin. Before 19F NMR can be used effectively for such analysis, the significant technical problem of pulse imperfections, such as sensitivity to B1 inhomogeneity and resonance-offset effects, has to be overcome. At present, due to the limited power of radiofrequency amplifiers, only a fraction of the very wide frequency ranges encountered with nuclei such as fluorine can be excited uniformly at any one time. In this thesis, some of the limitations imposed by pulse imperfections are addressed and overcome.
Two new pulse sequences are developed and presented, CHORUS and CHORUS Oneshot, which use tailored, ultra-broadband swept-frequency chirp pulses to achieve uniform constant-amplitude and constant-phase excitation and refocusing over very wide bandwidths (approximately 250 kHz), with no undue B1 sensitivity and no significant loss in sensitivity. CHORUS, for use in quantitative NMR, is demonstrated to give accuracies better than 0.1%. CHORUS Oneshot, a diffusion-ordered spectroscopic technique, exploits the exquisite sensitivity of the 19F chemical shift to its local environment, giving excellent resolution, which allows for accurate discrimination between diffusion coefficients with high dynamic range and over very wide bandwidths. Sulfur hexafluoride (SF6) is investigated and shown to be a suitable reference material for use in 19F NMR. The bandshape of the fluorine signal and its satellites is simple, without complex splitting patterns, and therefore good for reference deconvolution; in addition, it is sufficiently soluble in the solvent of choice, DMSO-d6. To demonstrate the functionality of the CHORUS sequences for low-level impurity analysis, 470 MHz 1H-decoupled 19F spectra were acquired on a 500 MHz Bruker system, using a degraded sample of rosuvastatin, to reveal two low-level impurities. Using a standard Varian probe with a single high-frequency channel, simultaneous 1H irradiation and 19F acquisition was made possible by time-sharing. Simultaneous 19F{1H} and 19F{13C} double decoupling was then performed using degraded and fresh samples of rosuvastatin, to reveal three low-level impurities (in the degraded sample) and low-level 1H and 13C modulation artefacts.
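The swept-frequency chirp pulses at the heart of CHORUS can be illustrated with a generic linear chirp. The parameters below (a 250 kHz sweep over 1 ms) are illustrative; this is not the actual CHORUS waveform, which uses tailored amplitude and phase profiles:

```python
import numpy as np

# Generic linear chirp: constant amplitude, instantaneous frequency swept
# linearly from -sweep_width/2 to +sweep_width/2 over the pulse duration.
sweep_width = 250e3        # Hz, total frequency sweep (illustrative)
duration = 1e-3            # s, pulse length (illustrative)
n = 4096
t = np.linspace(0.0, duration, n, endpoint=False)

rate = sweep_width / duration                      # Hz per second
phase = 2 * np.pi * (-0.5 * sweep_width * t + 0.5 * rate * t**2)
pulse = np.exp(1j * phase)                         # constant-amplitude waveform

# Instantaneous frequency = d(phase)/dt / (2*pi); recover it numerically.
inst_freq = np.gradient(phase, t) / (2 * np.pi)
```

The constant modulus and the linear frequency ramp are what let such pulses cover bandwidths far beyond what a hard pulse of the same peak power can excite.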
8

Wong, David H. (David Hsing-Wang) 1976. "Finite state analysis with tools for network protocols." Thesis, Massachusetts Institute of Technology, 1999. http://hdl.handle.net/1721.1/80577.

Full text available
Abstract:
Thesis (S.B. and M.Eng.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 1999.
Includes bibliographical references (p. 147).
by David H. Wong.
S.B. and M.Eng.
9

Gleeson, P. J. "New tools and specification languages for biophysically detailed neuronal network modelling." Thesis, University College London (University of London), 2012. http://discovery.ucl.ac.uk/1347263/.

Full text available
Abstract:
Increasingly detailed data are being gathered on the molecular, electrical and anatomical properties of neuronal systems both in vitro and in vivo. These range from the kinetic properties and distribution of ion channels and synaptic plasticity mechanisms, to electrical activity in neurons and detailed anatomical connectivity within neuronal microcircuits from connectomics data. Publications describing these experimental results often set them in the context of higher-level network behaviour. Biophysically detailed computational modelling provides a framework for consolidating these data, for quantifying the assumptions about underlying biological mechanisms, and for ensuring consistency in the explanation of the phenomena across scales. Such multiscale biophysically detailed models are not, however, currently in widespread use by the experimental neuroscience community. Reasons for this include the relative inaccessibility of software for creating these models, the range of specialised scripting languages used by the available simulators, and the difficulty of creating and managing large-scale network simulations. This thesis describes new solutions to facilitate the creation, simulation, analysis and reuse of biophysically detailed neuronal models. The graphical application neuroConstruct allows detailed cell and network models to be built in 3D, and run on multiple simulation platforms, without detailed programming knowledge. NeuroML is a simulator-independent language for describing models containing detailed neuronal morphologies, ion channels, synapses, and 3D network connectivity. New solutions have also been developed for creating and analysing network models at much closer to biological scale on high-performance computing platforms. A number of detailed neocortical, cerebellar and hippocampal models have been converted for use with these tools. The tools and models I have developed have already started to be used for original scientific research.
It is hoped that this work will lead to a more solid foundation for creating, validating, simulating and sharing ever more realistic models of neurons and networks.
10

Banerji, C. "Network theoretic tools in the analysis of complex diseases." Thesis, University College London (University of London), 2015. http://discovery.ucl.ac.uk/1470036/.

Full text available
Abstract:
In this thesis we consider the application of network theoretic tools in the analysis of genome wide gene-expression data describing complex diseases, displaying defects in differentiation. After considering the literature, we motivate the construction of entropy based network rewiring methodologies, postulating that such an approach may provide a systems level correlate of the differentiation potential of a cellular sample, and may prove informative in the analysis of pathology. We construct, analytically investigate and validate three such network theoretic tools: Network Transfer Entropy, Signalling Entropy and Interactome Sparsification and Rewiring (InSpiRe). By considering over 1000 genome wide gene expression samples corresponding to healthy cells at different levels of differentiation, we demonstrate that signalling entropy is a strong correlate of cell potency confirming our initial postulate. The remainder of the thesis applies our network theoretic tools to two ends of the developmental pathology spectrum. Firstly we consider cancer, in which the power of cell differentiation is hijacked, to develop a malicious new tissue. Secondly, we consider muscular dystrophy, in which cell differentiation is inhibited, resulting in the poor development of muscle tissue. In the case of cancer we demonstrate that signalling entropy is a measure of tumour anaplasia and intra-tumour heterogeneity, which displays distinct values in different cancer subtypes. Moreover, we find signalling entropy to be a powerful prognostic indicator in epithelial cancer, outperforming conventional gene expression based assays. In the case of muscular dystrophy we focus on the most prevalent: facioscapulohumeral muscular dystrophy (FSHD). We demonstrate that muscle differentiation is perturbed in FSHD and that signalling entropy is elevated in myoblasts over-expressing the primary FSHD candidate gene DUX4. 
We subsequently utilise InSpiRe, performing a meta-analysis of FSHD muscle biopsy gene-expression data, uncovering a network of DUX4 driven rewired interactions in the pathology, and a novel therapeutic target which we validate experimentally.
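The intuition behind an entropy-based measure such as signalling entropy, that a node distributing its interactions more evenly has higher local uncertainty, can be sketched with a random walk on a toy weighted network. The graph and the plain Shannon entropy of each node's transition distribution are illustrative assumptions, not the thesis's exact construction:

```python
import numpy as np

# Toy symmetric interaction network: node 0 is a hub linked to three leaves.
A = np.array([
    [0.0, 1.0, 1.0, 1.0],
    [1.0, 0.0, 0.0, 0.0],
    [1.0, 0.0, 0.0, 0.0],
    [1.0, 0.0, 0.0, 0.0],
])

# Row-normalise to a stochastic matrix of random-walk transition probabilities.
P = A / A.sum(axis=1, keepdims=True)

def local_entropy(p):
    """Shannon entropy of one node's outgoing transition distribution."""
    nz = p[p > 0]
    return float(-(nz * np.log(nz)).sum())

entropies = np.array([local_entropy(row) for row in P])
# The hub spreads its signal over three neighbours (entropy log 3);
# each leaf's walk is deterministic (entropy 0).
```

Averaging such local entropies over a network, with expression-weighted edges, is the flavour of systems-level statistic the thesis builds on.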
11

Tang, Xiaoting. "New analytical tools for systems biology." Online access for everyone, 2006. http://www.dissertations.wsu.edu/Dissertations/Fall2006/x_tang_081706.pdf.

Full text available
12

Russell, Richard Anthony. "New tools for quantitative analysis of nuclear architecture." Thesis, Imperial College London, 2010. http://hdl.handle.net/10044/1/5624.

Full text available
Abstract:
The cell nucleus houses a wide variety of macromolecular substructures including the cell's genetic material. The spatial configuration of these substructures is thought to be fundamentally associated with nuclear function, yet the architectural organisation of the cell nucleus is only poorly understood. Advances in microscopy and associated fluorescence techniques have provided a wealth of nuclear image data. Such images offer the opportunity both for visualising nuclear substructures and for quantitative investigation of the spatial configuration of these objects. In this thesis, we present new tools to study and explore the subtle principles behind nuclear architecture. We describe a novel method to segment fluorescent microscopy images of nuclear objects. The effectiveness of this segmentation algorithm is demonstrated using extensive simulation. Additionally, we show that the method performs as well as manual thresholding, which is considered the gold standard. Next, randomisation-based tests from spatial point pattern analysis are employed to inspect spatial interactions of nuclear substructures. The results suggest new and interesting spatial relationships in the nucleus. However, this approach probes only relative nuclear organisation and cannot readily yield a description of absolute spatial preference, which may be a key component of nuclear architecture. To address this problem we have developed methodology based on techniques employed in statistical shape analysis and image registration. The approach proposes that the nuclear boundary can be used to align nuclei from replicate images into a common coordinate system. Each nucleus and its contents can therefore be registered to the sample mean shape using rigid and non-rigid deformations. This aggregated data allows inference regarding global nuclear spatial organisation.
For example, the kernel smoothed intensity function is computed to return an estimate of the intensity function of the registered nuclear object. Simulation provides evidence that the registration procedure is sensible and the results accurate. Finally, we have investigated a large database of nuclear substructures using conventional methodology as well as our new tools. We have identified novel spatial relationships between nuclear objects that offer significant clues to their function. We have also examined the absolute spatial configuration of these substructures in registered data. The results reveal dramatic underlying spatial preferences and present new and clear insights into nuclear architecture.
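The kernel-smoothed intensity function mentioned above can be sketched in one dimension with a Gaussian kernel; the object positions, bandwidth and grid below are invented for illustration:

```python
import numpy as np

# Observed object positions in a normalised [0, 1] coordinate (hypothetical);
# three objects cluster near 0.25, one sits alone near 0.8.
positions = np.array([0.2, 0.25, 0.3, 0.8])
bandwidth = 0.05
grid = np.linspace(0.0, 1.0, 201)

# Kernel-smoothed intensity: sum a Gaussian kernel centred on each position.
diffs = grid[:, None] - positions[None, :]
kernel = np.exp(-0.5 * (diffs / bandwidth) ** 2) / (bandwidth * np.sqrt(2 * np.pi))
intensity = kernel.sum(axis=1)

peak = grid[np.argmax(intensity)]   # location of highest estimated intensity
```

In the registered-nuclei setting the same estimate is computed in 2D or 3D over the pooled, aligned object positions, revealing regions of the nucleus with systematically high occupancy.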
13

Xiao, Ying. "New tools for unsupervised learning." Diss., Georgia Institute of Technology, 2014. http://hdl.handle.net/1853/52995.

Full text available
Abstract:
In an unsupervised learning problem, one is given an unlabelled dataset and hopes to find some hidden structure; the prototypical example is clustering similar data. Such problems often arise in machine learning and statistics, but also in signal processing, theoretical computer science, and any number of quantitative scientific fields. The distinguishing feature of unsupervised learning is that there are no privileged variables or labels which are particularly informative, and thus the greatest challenge is often to differentiate between what is relevant or irrelevant in any particular dataset or problem. In the course of this thesis, we study a number of problems which span the breadth of unsupervised learning. We make progress in Gaussian mixtures, independent component analysis (where we solve the open problem of underdetermined ICA), and we formulate and solve a feature selection/dimension reduction model. Throughout, our goal is to give finite sample complexity bounds for our algorithms -- these are essentially the strongest type of quantitative bound that one can prove for such algorithms. Some of our algorithmic techniques turn out to be very efficient in practice as well. Our major technical tool is tensor spectral decomposition: tensors are generalisations of matrices, and often allow access to the "fine structure" of data. Thus, they are often the right tools for unravelling the hidden structure in an unsupervised learning setting. However, naive generalisations of matrix algorithms to tensors run into NP-hardness results almost immediately, and thus to solve our problems, we are obliged to develop two new tensor decompositions (with robust analyses) from scratch. Both of these decompositions are polynomial time, and can be viewed as efficient generalisations of PCA extended to tensors.
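The matrix special case of the tensor spectral decompositions described above is ordinary PCA, which can be written directly as an eigendecomposition of the covariance matrix. The synthetic data below are an assumption for illustration:

```python
import numpy as np

# Synthetic 2-D data stretched along the first axis: variance ~1 along
# dimension 0 and ~0.01 along dimension 1.
rng = np.random.default_rng(0)
latent = rng.normal(size=(500, 1))
data = np.hstack([latent, 0.1 * rng.normal(size=(500, 1))])
data = data - data.mean(axis=0)

# PCA as spectral decomposition of the sample covariance matrix.
cov = data.T @ data / len(data)
eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
top_component = eigvecs[:, -1]           # direction of largest variance
```

For tensors, the naive analogue of this eigendecomposition is NP-hard in general, which is why the thesis develops specialised decompositions with robustness guarantees instead.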
14

Weston, Kevin T. (Kevin Thomas) 1981. "Network tools for the analysis and prediction of protein-protein interactions." Thesis, Massachusetts Institute of Technology, 2004. http://hdl.handle.net/1721.1/17999.

Full text available
Abstract:
Thesis (M. Eng.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 2004.
Includes bibliographical references (p. 139-143).
In this thesis, we present two computational platforms for future biological research. The first, FNAC, is a flexible programmatic Framework for Network Analysis and Comparison that simplifies many common operations on biological networks. As a demonstration of FNAC, we investigate the properties of several prominent protein function and protein-protein interaction networks. In doing so, we uncover evidence suggesting that a recently-developed technique for annotating proteins may also have substantial value in the computational prediction of protein-protein interactions. Our second computational platform, the Coiled-Coil Database (CCDB), serves as a central and easily queryable repository for information about the coiled coil protein structural motif in a variety of organisms.
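One elementary network-comparison operation of the kind a framework such as FNAC would automate is measuring edge-set overlap between two interaction networks. A minimal sketch with invented protein pairs (the metric choice, Jaccard similarity, is an assumption, not necessarily what FNAC computes):

```python
# Two hypothetical protein-protein interaction networks, each a set of
# undirected edges (frozensets make the pairs order-independent).
net_a = {frozenset(e) for e in [("p1", "p2"), ("p2", "p3"), ("p3", "p4")]}
net_b = {frozenset(e) for e in [("p1", "p2"), ("p3", "p4"), ("p4", "p5")]}

# Jaccard similarity of the edge sets: shared edges / all edges.
jaccard = len(net_a & net_b) / len(net_a | net_b)
```

Set algebra on edges like this also yields the networks' shared subgraph (`net_a & net_b`), a common starting point for interaction-prediction benchmarks.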
by Kevin T. Weston, Jr.
M.Eng.
15

Ramos, Cordoba Eloy. "Development of new tools for local electron distribution analysis." Doctoral thesis, Universitat de Girona, 2014. http://hdl.handle.net/10803/133376.

Full text available
Abstract:
This thesis focuses on the development and application of new tools for the analysis of the electron distribution in molecules, focusing on the concepts of local spins and oxidation states. The thesis can be divided into three parts. The first deals with the formulation of a new atom-in-molecule definition reproducing to some extent the results of the QTAIM (Quantum Theory of Atoms in Molecules) analysis at a much reduced computational cost. In the second part we propose a new methodology to obtain local spins from wave function analysis, and we relate local spins to the chemical bond and the radical character of molecules. Finally, we study the electron configurations of the atom within the molecule and retrieve their oxidation states from a particular analysis of the effective atomic orbitals (eff-AOs).
16

Talkington, Gregory Joshua. "Shepherding Network Security Protocols as They Transition to New Atmospheres: A New Paradigm in Network Protocol Analysis." Thesis, University of North Texas, 2019. https://digital.library.unt.edu/ark:/67531/metadc1609134/.

Full text available
Abstract:
The solutions presented in this dissertation describe a new paradigm in which we shepherd network security protocols through atmosphere transitions, offering new ways to analyze and monitor the state of a protocol. The approach involves identifying a protocol's transitional weaknesses through adaptation of formal models, measuring the weakness as it exists in the wild by statically analyzing applications, and showing how to use network traffic analysis to monitor protocol implementations going forward. Throughout the effort, we follow the popular Open Authorization (OAuth) protocol in its attempts to apply its web-based roots to a mobile atmosphere. To pinpoint protocol deficiencies, we first adapt a well-regarded formal analysis and show it to be insufficient for characterizing mobile applications, tying the protocol's transitional weaknesses to implementation issues and delivering a reanalysis of the proof. We then measure the prevalence of this weakness by statically analyzing over 11,000 Android applications. While looking through source code, we develop new methods to find sensitive protocol information, overcome hurdles like obfuscation, and provide interfaces for later modeling, all while achieving a false positive rate below 10 percent. We then use network analysis to detect and verify application implementations. By collecting network traffic from Android applications that use OAuth, we produce a set of metrics that, when fed into machine learning classifiers, can identify whether the OAuth implementation is correct. The challenges include encrypted network communication, heterogeneous device types, and the labeling of training data.
17

Zhou, Pengcheng. "Computational Tools for Identification and Analysis of Neuronal Population Activity." Research Showcase @ CMU, 2016. http://repository.cmu.edu/dissertations/1015.

Full text available
Abstract:
Recently developed technologies for monitoring activity in populations of neurons make it possible for the first time, in principle, to ask many basic questions in neuroscience. However, computational tools for analyzing the newly available data still need to be developed. The goal of this thesis is to contribute to this effort by focusing on two specific problems. First, we used a point-process regression framework to provide a methodology for statistical assessment of the link between neural spike synchrony and network-wide oscillations. In simulations, we showed that our method can recover ground-truth relationships, and in two types of spike train data we illustrated the kinds of results the method can produce. The approach improves on methods in the literature and may be adapted to many different experimental settings. Second, we considered the problem of source extraction in calcium imaging data, i.e., the detection of neurons within a field of view and the extraction of each neuron’s activity. The data we mainly focus on are recorded with a microendoscope, which has the unique advantage of imaging deep brain regions in freely behaving animals. These data suffer from high levels of background fluorescence, as well as the potential for overlapping neuronal signals. Based on the existing constrained nonnegative matrix factorization (CNMF) framework, we developed an efficient method to process microendoscopic data. Our method utilizes a novel algorithm to initialize the spatial shapes and temporal activity of the neurons from the raw video data, independently of the strongly fluctuating background. This step ensures the efficiency and accuracy of solving a nonconvex CNMF problem. Our method also models the complicated background by including its low-spatial-frequency structure and its locally low-rank features, to avoid absorbing cellular signals into the background term. We developed a tractable solution to estimate the background activity using this new model.
After subtracting the approximated background, we followed the CNMF framework to demix neural signals and recover denoised and deconvolved temporal activity. We optimized several algorithms for solving the CNMF problems to obtain accurate results. In practice, our method outperforms all existing methods and has been adopted by many experimental labs.
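The matrix-factorization idea at the core of CNMF can be sketched in a few lines. The toy below is not the thesis's algorithm: it uses plain multiplicative-update NMF (Lee–Seung) on simulated data, whereas the actual method adds spatial/temporal constraints, deconvolution of calcium dynamics, and the structured background model described above.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "video": T frames of P pixels generated by K nonnegative sources.
P, T, K = 50, 200, 3
A_true = rng.random((P, K))                      # spatial footprints
C_true = rng.random((K, T))                      # temporal activity
Y = A_true @ C_true + 0.05 * rng.random((P, T))  # small noise, stays nonnegative

# Plain multiplicative-update NMF; CNMF layers spatial/temporal constraints
# and an explicit background term on top of this factorization core.
A = rng.random((P, K)) + 0.1
C = rng.random((K, T)) + 0.1
eps = 1e-9
err0 = np.linalg.norm(Y - A @ C)
for _ in range(200):
    C *= (A.T @ Y) / (A.T @ A @ C + eps)
    A *= (Y @ C.T) / (A @ C @ C.T + eps)
err = np.linalg.norm(Y - A @ C)
print(err0, err)  # reconstruction error drops sharply
```

The updates keep `A` and `C` nonnegative by construction, which is why multiplicative rules (rather than plain gradient steps) are the usual starting point for this family of methods.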
APA, Harvard, Vancouver, ISO, and other styles
18

COSTANTINI, GIULIO. "Network analysis: a new perspective on personality psychology." Doctoral thesis, Università degli Studi di Milano-Bicocca, 2015. http://hdl.handle.net/10281/75269.

Full text of source
Abstract:
A new conception of personality based on network analysis has recently been proposed to overcome some of the limitations of the latent-variable theory of personality. While the latent-variable theory assumes that the covariation among the thoughts, feelings, and behaviors characterizing a personality domain can be explained by the effect of an unobservable latent variable (e.g., extraversion), the network approach conceives personality as emerging from direct interactions among these thoughts, feelings, and behaviors. The network perspective motivates new ways of analyzing personality data and can be especially important for investigating the mechanisms underlying personality. In this work, we present the basic network concepts and discuss several alternative ways to define networks from the data that are typically collected in personality psychology. The most important network indices, such as indices of centrality and clustering coefficients, are described: we examine the properties of each index and explain why some of them, especially some clustering coefficients, should not be applied to personality psychology data sets. Three new clustering coefficients are proposed that are compatible with personality networks: their properties are tested both on simulated networks and on networks based on actual personality psychology data. We present two applications of network analysis. The first application considers a network of 24 personality facets: we show how these facets relate to each other, and discuss both the local and the global properties of the network. The second application focuses on the dimension conscientiousness: we show that while some mechanisms underlying conscientiousness are common to many facets, other mechanisms may specifically characterize some facets and not others. By means of network analysis, we draw a comprehensive map of conscientiousness that can serve as guidance for future studies.
The application of network analysis to the field of personality psychology is recent and its potential has not yet been fully explored: in the final part of this work, we discuss the limitations of our investigation and propose future developments of our research that can contribute to overcoming them.
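As a minimal illustration of the indices discussed above, the sketch below computes the standard (binary, unweighted) local clustering coefficient from an adjacency matrix; the indices proposed in the thesis extend this idea to the weighted, possibly signed networks that arise from personality data.

```python
import numpy as np

def clustering_coefficients(A):
    """Local clustering coefficient of each node of a binary, undirected graph.

    C_i = (# triangles through i) / (k_i * (k_i - 1) / 2).
    """
    A = (A > 0).astype(float)
    np.fill_diagonal(A, 0)
    k = A.sum(axis=1)                       # node degrees
    triangles = np.diagonal(A @ A @ A) / 2  # closed length-3 walks / 2
    denom = k * (k - 1) / 2
    return np.divide(triangles, denom, out=np.zeros_like(denom), where=denom > 0)

# Toy 4-item network: items 0-1-2 form a triangle, item 3 hangs off item 2.
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
print(clustering_coefficients(A))  # items 0, 1 fully clustered; item 2 closes 1 of 3 pairs
```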
APA, Harvard, Vancouver, ISO, and other styles
19

LOVATO, ILENIA. "Statistical tools for the analysis of network-valued data: theory, algorithms, and applications." Doctoral thesis, Università degli studi di Pavia, 2018. http://hdl.handle.net/11571/1228780.

Full text of source
APA, Harvard, Vancouver, ISO, and other styles
20

Barcelona, Cabeza Rosa. "Genomics tools in the cloud: the new frontier in omics data analysis." Doctoral thesis, Universitat Politècnica de Catalunya, 2021. http://hdl.handle.net/10803/672757.

Full text of source
Abstract:
Substantial technological advancements in next generation sequencing (NGS) have revolutionized the genomic field. Over the last years, the speed and throughput of NGS technologies have increased while their costs have decreased, allowing us to achieve base-by-base interrogation of the human genome in an efficient and affordable way. All these advances have led to a growing application of NGS technologies in clinical practice to identify genomic variations and their relationship with certain diseases. However, there is still a need to improve data accessibility, processing, and interpretation, due both to the huge amount of data generated by these sequencing technologies and to the large number of tools available to process it. Beyond the sheer number of algorithms for variant discovery, each type of variation and data requires a specific algorithm. Therefore, a solid background in bioinformatics is required to select the most suitable algorithm in each case and to execute it successfully. On that basis, the aim of this project is to facilitate the processing of sequencing data for variant identification and interpretation by non-bioinformaticians. This is achieved by creating high-performance workflows with a strong scientific basis that remain accessible and easy to use, together with a simple, highly intuitive platform for data interpretation. An exhaustive bibliographic review has been carried out, in which the best existing algorithms have been selected to create automatic pipelines for the discovery of germline short variants (SNPs and indels) and germline structural variants (SVs), including both CNVs and chromosomal rearrangements, from modern human DNA. In addition to the variant discovery pipelines, a pipeline has been implemented for in silico optimization of CNV detection from WES and TS data (isoCNV). This optimization pipeline has been shown to increase the sensitivity of CNV discovery using only NGS data.
Such increased sensitivity is especially important for diagnosis in clinical settings. Furthermore, a variant discovery workflow has been developed by integrating WES and RNA-seq data (varRED) that has been shown to increase the number of variants identified relative to using WES data alone. Variant discovery is not only important for modern populations; the study of variation in ancient genomes is also essential to understanding past human evolution. Thus, a germline short variant discovery pipeline for ancient WGS samples has been implemented. This workflow has been applied to a human mandible dated between 16980-16510 calibrated years before the present. The ancient short variants discovered were reported without further interpretation due to the low sample coverage. Finally, GINO has been implemented to facilitate the interpretation of the variants identified by the workflows developed in the context of this thesis. GINO is an easy-to-use platform, available under user license, for the visualization and interpretation of germline variants. With the development of this thesis, it has been possible to implement the tools necessary for high-performance identification of all types of germline variants, as well as a powerful platform to interpret the identified variants in a simple and fast way. Using this platform allows non-bioinformaticians to focus on interpreting results without having to worry about data processing, with the guarantee of scientifically sound results. Furthermore, it has laid the foundations for implementing, in the near future, a platform for comprehensive analysis and visualization of genomic data in the cloud.
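The read-depth signal that WES/TS CNV callers threshold can be illustrated with a toy log2-ratio caller. This is not isoCNV (which optimizes the parameters of an existing caller); the depths and thresholds below are invented for illustration only.

```python
import numpy as np

def call_cnvs(sample_depth, reference_depth, dup_thr=0.58, del_thr=-1.0):
    """Naive per-exon CNV calls from read-depth log2 ratios.

    Thresholds correspond roughly to 3 copies (log2(3/2) ~ 0.58) and
    1 copy (log2(1/2) = -1). Real callers additionally normalize GC bias,
    pool reference samples, and segment adjacent exons.
    """
    ratio = np.log2((sample_depth + 1) / (reference_depth + 1))
    calls = np.full(ratio.shape, "NEUTRAL", dtype=object)
    calls[ratio >= dup_thr] = "DUP"
    calls[ratio <= del_thr] = "DEL"
    return ratio, calls

sample = np.array([100.0, 160.0, 45.0, 98.0])  # hypothetical exon depths
reference = np.array([100.0, 100.0, 100.0, 100.0])
ratio, calls = call_cnvs(sample, reference)
print(calls)  # ['NEUTRAL' 'DUP' 'DEL' 'NEUTRAL']
```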
Los avances tecnológicos en la secuenciación de próxima generación (NGS) han revolucionado el campo de la genómica. El aumento de velocidad y rendimiento de las tecnologías NGS de los últimos años junto con la reducción de su coste ha permitido interrogar base por base el genoma humano de una manera eficiente y asequible. Todos estos avances han permitido incrementar el uso de las tecnologías NGS en la práctica clínica para la identificación de variaciones genómicas y su relación con determinadas enfermedades. Sin embargo, sigue siendo necesario mejorar la accesibilidad, el procesamiento y la interpretación de los datos debido a la enorme cantidad de datos generados y a la gran cantidad de herramientas disponibles para procesarlos. Además de la gran cantidad de algoritmos disponibles para el descubrimiento de variantes, cada tipo de variación y de datos requiere un algoritmo específico. Por ello, se requiere una sólida formación en bioinformática tanto para poder seleccionar el algoritmo más adecuado como para ser capaz de ejecutarlo correctamente. Partiendo de esa base, el objetivo de este proyecto es facilitar el procesamiento de datos de secuenciación para la identificación e interpretación de variantes para los no bioinformáticos. Todo ello mediante la creación de flujos de trabajo de alto rendimiento y con una sólida base científica, sin dejar de ser accesibles y fáciles de utilizar, así como de una plataforma sencilla y muy intuitiva para la interpretación de datos. Se ha realizado una exhaustiva revisión bibliográfica donde se han seleccionado los mejores algoritmos con los que crear flujos de trabajo automáticos para el descubrimiento de variantes cortas germinales (SNPs e indels) y variantes estructurales germinales (SV), incluyendo tanto CNV como reordenamientos cromosómicos, de ADN humano moderno. 
Además de crear flujos de trabajo para el descubrimiento de variantes, se ha implementado un flujo para la optimización in silico de la detección de CNV a partir de datos de WES y TS (isoCNV). Se ha demostrado que dicha optimización aumenta la sensibilidad de detección utilizando solo datos NGS, lo que es especialmente importante para el diagnóstico clínico. Además, se ha desarrollado un flujo de trabajo para el descubrimiento de variantes mediante la integración de datos de WES y RNA-seq (varRED) que ha demostrado aumentar el número de variantes detectadas sobre las identificadas cuando solo se utilizan datos de WES. Es importante señalar que la identificación de variantes no solo es importante para las poblaciones modernas, el estudio de las variaciones en genomas antiguos es esencial para comprender la evolución humana. Por ello, se ha implementado un flujo de trabajo para la identificación de variantes cortas a partir de muestras antiguas de WGS. Dicho flujo se ha aplicado a una mandíbula humana datada entre el 16980-16510 a.C. Las variantes ancestrales allí descubiertas se informaron sin mayor interpretación debido a la baja cobertura de la muestra. Finalmente, se ha implementado GINO para facilitar la interpretación de las variantes identificadas por los flujos de trabajo desarrollados en esta tesis. GINO es una plataforma fácil de usar para la visualización e interpretación de variantes germinales que requiere licencia de uso. Con el desarrollo de esta tesis se ha conseguido implementar las herramientas necesarias para la identificación de alto rendimiento de todos los tipos de variantes germinales, así como de una poderosa plataforma para visualizar dichas variantes de forma sencilla y rápida. El uso de esta plataforma permite a los no bioinformáticos centrarse en interpretar los resultados sin tener que preocuparse por el procesamiento de los datos con la garantía de que estos sean científicamente robustos. 
Además, ha sentado las bases para en un futuro próximo implementar una plataforma para el completo análisis y visualización de datos genómicos
Bioinformática
APA, Harvard, Vancouver, ISO, and other styles
21

El-Shehaly, Mai Hassan. "A Visualization Framework for SiLK Data exploration and Scan Detection." Thesis, Virginia Tech, 2009. http://hdl.handle.net/10919/34606.

Full text of source
Abstract:
Network packet traces, despite containing a lot of noise, hold priceless information, especially for investigating security incidents or troubleshooting performance problems. However, given the gigabytes of flows crossing a typical medium-sized enterprise network every day, spotting malicious activity and analyzing trends in network behavior becomes a tedious task. Further, computational mechanisms for analyzing such data usually take substantial time to reach interesting patterns and often mislead the analyst into false positives (benign traffic identified as malicious) or false negatives (malicious activity going undetected). Therefore, the appropriate representation of network traffic data to the human user has recently been an issue of concern. Much of the focus, however, has been on visualizing TCP traffic alone, adapting visualization techniques to the data fields relevant to that protocol's traffic, rather than on the multivariate nature of network security data in general, and on the fact that forensic analysis, in order to be fast and effective, has to take into consideration different parameters for each protocol. In this thesis, we bring together two powerful tools from different areas of application: SiLK (System for Internet-Level Knowledge), for command-based network trace analysis; and ComVis, a generic information visualization tool. We integrate the power of both tools by enabling simplified interaction between them through a simple GUI, for the purpose of visualizing network traces, characterizing interesting patterns, and fingerprinting related activity. To obtain realistic results, we applied the visualizations to anonymized packet traces from Lawrence Berkeley National Laboratory, captured during selected hours across three months. We used a sliding window approach in visually examining traces for two transport-layer protocols: ICMP and UDP.
The main contribution of this research is a protocol-specific visualization framework for ICMP and UDP data. We explored the relevant header fields and the visualizations that worked best for each of the two protocols separately. The resulting views led us to a number of guidelines that can be vital in the creation of "smart books" describing best practices in using visualization and interaction techniques to maintain network security, while creating visual fingerprints that were found to be unique to individual types of scanning activity. Our visualizations use a multiple-views approach that incorporates the power of two-dimensional scatter plots, histograms, parallel coordinates, and dynamic queries.
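A minimal example of the kind of pattern such visual fingerprints expose: a horizontal scan appears as one source contacting many distinct destinations within a time window. The sketch below aggregates hypothetical flow tuples in fixed windows; the `(timestamp, src, dst)` field layout is chosen for illustration and is not SiLK's actual record format.

```python
from collections import defaultdict

def fanout_per_window(flows, window=60, threshold=50):
    """Flag sources that contact many distinct destinations per time window.

    `flows` is an iterable of (timestamp, src, dst) tuples, e.g. parsed from
    the output of SiLK's rwcut. A horizontal scan shows up as an abnormally
    high distinct-destination count for a single source.
    """
    buckets = defaultdict(set)  # (window_index, src) -> {dst, ...}
    for ts, src, dst in flows:
        buckets[(int(ts // window), src)].add(dst)
    return {key: len(dsts) for key, dsts in buckets.items() if len(dsts) >= threshold}

# A scanner hitting 100 hosts over ~100 seconds vs. a normal client.
flows = [(i, "10.0.0.9", f"10.1.0.{i}") for i in range(100)]
flows += [(5, "10.0.0.2", "10.1.0.1"), (7, "10.0.0.2", "10.1.0.2")]
print(fanout_per_window(flows, window=60, threshold=50))  # {(0, '10.0.0.9'): 60}
```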
Master of Science
APA, Harvard, Vancouver, ISO, and other styles
22

Do, Lan. "New tools for sample preparation and instrumental analysis of dioxins in environmental samples." Doctoral thesis, Umeå universitet, Kemiska institutionen, 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-70218.

Full text of source
Abstract:
Polychlorinated dibenzo-p-dioxins (PCDDs) and dibenzofurans (PCDFs), two groups of structurally related chlorinated aromatic hydrocarbons, are of high concern due to their global distribution and extreme toxicity. Since they occur at very low levels, their analysis is complex and challenging, and there is hence a need for efficient, reliable, and rapid alternative analytical methods. Developing such methods was the aim of the project this thesis is based upon. During the first years of the project, the focus was on the first parts of the analytical chain (extraction and clean-up). A selective pressurized liquid extraction (SPLE) procedure was developed, involving in-cell clean-up to remove bulk co-extracted matrix components from sample extracts. It was further streamlined by employing a modular pressurized liquid extraction (M-PLE) system, which simultaneously extracts, cleans up, and isolates planar PCDD/Fs in a single step. Both methods were validated using a wide range of soil, sediment, and sludge reference materials. Using dichloromethane/n-heptane (DCM/Hp; 1/1, v/v) as a solvent, results statistically equivalent to or higher than the reference values were obtained, while an alternative, less harmful non-chlorinated solvent mixture, diethyl ether/n-heptane (DEE/Hp; 1/2, v/v), yielded data equivalent to those values. Later, the focus of the work shifted to the final instrumental analysis. Six gas chromatography (GC) phases were evaluated with respect to their chromatographic separation of not just the 17 most toxic congeners (2,3,7,8-substituted PCDD/Fs), but all 136 tetra- to octaCDD/Fs. Three novel ionic liquid columns performed much better than previously tested commercially available columns. The Supelco SLB-IL61 offered the best overall performance, successfully resolving 106 of the 136 compounds, and 16 of the 17 2,3,7,8-substituted PCDD/Fs. Another ionic liquid column (SLB-IL111) provided complementary separation.
Together, the two columns separated 128 congeners. The work also included characterization of 22 GC columns’ selectivity and solute-stationary phase interactions. The selectivities were mapped using Principal Component Analysis (PCA) of all 136 PCDD/F’s retention times on the columns, while the interactions were probed by analyzing both the retention times and the substances’ physicochemical properties.
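Mapping column selectivities with PCA, as done for the retention-time data above, can be sketched with a plain SVD-based projection. The matrix below is invented (4 hypothetical columns by 6 congeners), not the thesis's 22-column, 136-congener data set.

```python
import numpy as np

def pca_scores(X, n_components=2):
    """Project the rows of X onto their first principal components via SVD."""
    Xc = X - X.mean(axis=0)  # center each congener's retention time
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T, S**2 / max(len(X) - 1, 1)

# Hypothetical retention-time matrix: 4 GC columns (rows) x 6 congeners.
rng = np.random.default_rng(1)
base = rng.random(6) * 30  # shared elution pattern across columns
X = np.vstack([base + shift + 0.1 * rng.standard_normal(6)
               for shift in (0.0, 0.2, 2.0, 2.3)])
scores, variances = pca_scores(X)
print(scores[:, 0])  # PC1 separates the two selectivity groups
```

Columns with similar selectivity land close together in the score plot, which is exactly how a PCA map reveals families of stationary phases.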
APA, Harvard, Vancouver, ISO, and other styles
23

Fontana, G. "IMPROVING DISTRICT HEALTH MANAGEMENT PERFORMANCE: A NEW FRAMEWORK WITH TOOLS FOR SITUATION ANALYSIS." Doctoral thesis, Università degli Studi di Milano, 2011. http://hdl.handle.net/2434/153788.

Full text of source
Abstract:
In recent decades, the United Nations, national governments, and many international donors and non-governmental organizations have dedicated considerable resources to the achievement of the Millennium Development Goals (MDGs) in low-income countries. Special efforts have gone into achieving the health-related MDGs. Early in the process, many realized that health systems in poor countries would have to be strengthened to deliver the services necessary to achieve the MDGs. Recently it has also become clear that to make health systems stronger, it will be paramount to improve the management of the limited available resources and of service provision at the local level. The improvement of district health management performance has been recognized as a key element in the complex task of strengthening health systems. This has become even more compelling since the number of countries with decentralized health systems has increased significantly. While many studies, essays, guidelines, and manuals have been published on how to strengthen health systems as a whole, little evidence is available on how to improve management at the district level. This work contributes to bridging this gap by providing (a) a new theoretical framework that systematizes the main determinants of the performance of district health management teams in a coherent way, using the WHO approach to health systems; (b) a set of tools for a quick but effective situation analysis, with indicators organized by health system functions, that can be used to identify which determinants are falling short and need to be addressed to improve the performance of health district managers on the ground; and (c) a list of possible strategies to resolve the identified deficits that have been reported in the literature to be successful in different contexts. The necessary characteristics of a comprehensive implementation strategy to improve district health management performance are also discussed.
The application of the framework and of the analytical tools described in this document has the potential to produce a major improvement in many countries and constitutes the strategy forward in this arena for UNICEF.
APA, Harvard, Vancouver, ISO, and other styles
24

Kamli, Amira. "Analysis and optimization of a new futuristic optical network architecture." Electronic Thesis or Diss., Institut polytechnique de Paris, 2019. http://www.theses.fr/2019IPPAS001.

Full text of source
Abstract:
The demand for network throughput is increasing due to the continued growth of global traffic and the emergence of new services with ever higher requirements. In this context, the capacity of the network should be increased while taking into account the energy consumption and the costs of construction and maintenance. The combination of optical networks with packet-oriented processing could meet these requirements. However, because of the lack of a practical all-optical memory, packet switching is most often performed electrically, making the architecture costly and less efficient. Therefore, as part of the ANR/N-GREEN project, a new switch/router architecture that provides solutions to these constraints has been proposed by a research team at Nokia Bell Labs. The architecture was designed to meet the stringent requirements of 5G, such as an end-to-end delay of less than 10 μs, but also to meet the traffic increase planned especially in the Metropolitan Area Network (MAN). Such an architecture can therefore be used not only in the access part of the MAN but also in its core, as well as in the Xhaul, considering 5G technology. In this dissertation, we have been interested in analyzing and improving the performance of this new node design when it is used in the MAN. Indeed, an Optical Slot Switching (OSS) ring network, combining the flexibility and scalability of the slot switching technique with the advantages of ring topology, such as fast service restoration in case of failure and a good statistical multiplexing gain, promises a good solution for the MAN networks of the future. This new architecture offers intelligent functions at a lower cost by optimizing the type and number of components used.
Such an architecture can replace existing optoelectronic technology such as Ethernet, or other promising solutions proposed in the literature such as POADM (Packet Optical Add/Drop Multiplexing) and TWIN (Time-domain Wavelength Interleaved Networking). The fundamental element of the network is the WSADM (Wavelength Slotted Add/Drop Multiplexer), which is implemented inside the network nodes, thus guaranteeing optical transparency and faster switching. In this thesis, we analyzed the performance, in terms of mean access delay and resource efficiency, of a ring network composed of a number of N-GREEN nodes. We used optimization methods such as the Nelder-Mead simplex to dynamically compute the optimal values of the timers (the average waiting time of a packet before being inserted into the optical ring), thus adapting the architecture to different types of traffic. Three traffic models were considered in this study: besides the well-known Poisson model, two models based on real traffic traces derived from CAIDA data were considered. Inspired by the idea of loop controllers, we then sought a way to let the network self-adapt to constant changes in traffic. Since this amounts to constantly adapting the strategy under noisy conditions, we proposed a highly variable traffic model, more variable than the one drawn from the CAIDA trace. Observing the effect of the changing frequency of one of the coefficients governing the packet-transmission strategy on the momentary performance of the network allowed us to find a frequency range in which an instantaneous dependency between stimulus and response can direct a self-adaptation scheme for the proposed strategy. The performance analysis of this architecture, taking different traffic models into account, shows that, thanks to the proposed methods and the application of optimization, the network meets users' needs in terms of packet loss rate and latency.
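The timer-optimization step can be illustrated with SciPy's Nelder-Mead on a toy delay model. The closed-form cost below is an invented stand-in for the simulation-based objective used in the thesis: waiting longer fills optical slots better (cost falls like 1/timer) but adds queueing delay (cost grows with the timer).

```python
from scipy.optimize import minimize

def mean_delay(timer, insertion_cost=4.0):
    """Toy trade-off between slot-fill efficiency and queueing delay.

    Purely illustrative; the real N-GREEN objective in the thesis is
    evaluated by simulating the ring under a given traffic model.
    """
    t = timer[0]
    if t <= 0:
        return 1e9  # keep the search in the feasible region
    return t + insertion_cost / t

res = minimize(mean_delay, x0=[1.0], method="Nelder-Mead")
print(res.x[0])  # ~2.0 = sqrt(insertion_cost), the optimal timer here
```

The appeal of Nelder-Mead in this setting is that it needs no gradients, so the same loop works when the cost is a noisy simulation output rather than a formula.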
APA, Harvard, Vancouver, ISO, and other styles
25

Song, Yu. "Analysis and developement of tools for improving the CKD supply network in the automotive industry." Dortmund Verl. Praxiswissen, 2009. http://d-nb.info/997294493/04.

Full text of source
APA, Harvard, Vancouver, ISO, and other styles
26

Bauer, Verena [Verfasser], and Göran [Akademischer Betreuer] Kauermann. "New approaches in network data analysis / Verena Bauer ; Betreuer: Göran Kauermann." München : Universitätsbibliothek der Ludwig-Maximilians-Universität, 2019. http://d-nb.info/121846660X/34.

Full text of source
APA, Harvard, Vancouver, ISO, and other styles
27

Wvong, Russil. "A new methodology for OSI conformance testing based on trace analysis." Thesis, University of British Columbia, 1990. http://hdl.handle.net/2429/29343.

Full text of source
Abstract:
This thesis discusses the problems of the conventional ISO 9646 methodology for OSI conformance testing, and proposes a new methodology based on trace analysis. In the proposed methodology, a trace analyzer is used to determine whether the observed behavior of the implementation under test is valid or invalid. This simplifies test cases dramatically, since they now need only specify the expected behavior of the IUT; unexpected behavior is checked by the trace analyzer. Test suites become correspondingly smaller. Because of this reduction in size and complexity, errors in test suites can be found and corrected far more easily. As a result, the reliability and the usefulness of the conformance testing process are greatly enhanced. In order to apply the proposed methodology, trace analyzers are needed. Existing trace analyzers are examined, and found to be unsuitable for OSI conformance testing. A family of new trace analysis algorithms is presented and proved. To verify the feasibility of the proposed methodology, and to demonstrate its benefits, it is applied to a particular protocol, the LAPB protocol specified by ISO 7776. The design and implementation of a trace analyzer for LAPB are described. The conventional ISO 8882-2 test suite for LAPB, when rewritten to specify only the expected behavior of the IUT, is found to be more than an order of magnitude smaller.
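The essence of the proposed approach is to replay an observed event trace against a model of the protocol's expected behavior and report the first invalid step. The states and events below are invented for illustration, loosely echoing HDLC-style link setup; ISO 7776 specifies the real LAPB procedures.

```python
# Hypothetical valid transitions of a simplified link-setup protocol:
# (current state, observed event) -> next state.
VALID_TRANSITIONS = {
    ("DISCONNECTED", "SABM"): "SETUP",
    ("SETUP", "UA"): "CONNECTED",
    ("CONNECTED", "I"): "CONNECTED",
    ("CONNECTED", "DISC"): "DISCONNECTED",
}

def analyze_trace(events, state="DISCONNECTED"):
    """Return ('valid', n) or ('invalid', i) with the offending step index."""
    for i, event in enumerate(events):
        nxt = VALID_TRANSITIONS.get((state, event))
        if nxt is None:
            return "invalid", i   # unexpected behavior caught by the analyzer
        state = nxt
    return "valid", len(events)

print(analyze_trace(["SABM", "UA", "I", "I", "DISC"]))  # ('valid', 5)
print(analyze_trace(["SABM", "I"]))                     # ('invalid', 1)
```

This division of labor is the point of the methodology: test cases only drive the expected behavior, while the analyzer's model rejects everything else.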
Science, Faculty of
Computer Science, Department of
Graduate
APA, Harvard, Vancouver, ISO, and other styles
28

Fors, Aldrich Octavi. "New Observational Techniques and Analysis Tools for Wide Field CCD Surveys and High Resolution Astrometry." Doctoral thesis, Universitat de Barcelona, 2006. http://hdl.handle.net/10803/745.

Full text of source
Abstract:
The aim of this thesis is two-fold. First, it provides a general methodology for applying image deconvolution to wide-field CCD imagery. Second, two new CCD observational techniques and two data analysis tools are proposed for the first time in the context of high-resolution astrometry, in particular for lunar occultation and speckle interferometry observations.

In the first part of the thesis, a wavelet-based adaptive image deconvolution algorithm (AWMLE) has been applied to two sets of survey-type CCD data: the QUasar Equatorial Survey Team (QUEST) project and the Near-Earth Space Surveillance Terrestrial (NESS-T) project. Richardson-Lucy image deconvolution has also been used with Flagstaff Transit Telescope (FASTT) imagery. Both tasks followed a new methodology that includes accurate image calibration, source detection and centering, and sound procedures for assessing the performance of the deconvolution. Results show that AWMLE deconvolution can increase the limiting magnitude by up to 0.6 mag and improve the limiting resolution by 1 pixel with respect to the original image. These studies have been conducted in the context of programs dedicated to macrolensing search (QUEST) and NEO discovery (NESS-T). Finally, the astrometric accuracy of FASTT images has not been found to change significantly after deconvolution. Likewise, no positional bias towards the centre of the pixel has been observed.
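The Richardson-Lucy iteration used on the FASTT imagery is compact enough to sketch in one dimension (AWMLE adds a wavelet decomposition on top of it). The two-source example below is synthetic.

```python
import numpy as np

def richardson_lucy(blurred, psf, iterations=50):
    """1-D Richardson-Lucy (MLE) deconvolution with a known PSF."""
    psf = psf / psf.sum()
    psf_flip = psf[::-1]
    estimate = np.full_like(blurred, blurred.mean())  # flat positive start
    for _ in range(iterations):
        reblurred = np.convolve(estimate, psf, mode="same")
        ratio = blurred / np.maximum(reblurred, 1e-12)
        estimate *= np.convolve(ratio, psf_flip, mode="same")
    return estimate

# Two close synthetic "stars" blurred by a Gaussian PSF.
x = np.arange(64, dtype=float)
truth = (np.exp(-0.5 * ((x - 28) / 1.2) ** 2)
         + 0.7 * np.exp(-0.5 * ((x - 35) / 1.2) ** 2))
psf = np.exp(-0.5 * (np.arange(-8, 9) / 2.5) ** 2)
blurred = np.convolve(truth, psf / psf.sum(), mode="same")
restored = richardson_lucy(blurred, psf)
# The restored profile is closer to the truth than the blurred input.
print(np.abs(restored - truth).sum() < np.abs(blurred - truth).sum())  # True
```

Each iteration multiplies the current estimate by the back-projected ratio of observed to predicted data, which preserves positivity and total flux, the properties that make the algorithm attractive for photometry-sensitive survey work.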

In the second part of the thesis a new observational technique based on CCD fast drift scanning has been proposed, implemented and assessed for lunar occultations (LO) and speckle interferometry observations.

In the case of LO, the technique yielded positive detections of binaries down to 2 milliarcseconds of projected separation and stellar diameter measurements in the 7 milliarcsecond regime. The proposed technique requires no additional optical or mechanical adjustments and can be applied to nearly all available full-frame CCDs. Thus, it enables all kinds of professional and high-end amateur observatories to perform LO work. Complementing this work, a four-year LO program (CALOP) at Calar Alto Observatory, spanning 71.5 nights of observation and 388 recorded events, has been conducted by means of CCD and MAGIC IR array cameras at the OAN 1.5 m and CAHA 2.2 m telescopes. CALOP results include the detection of one triple system, 14 new and 1 known binaries in the near-IR, and one binary in the visible. Their projected separations range from 90 down to 2 milliarcseconds, with brightness ratios up to 1:35 in the K band. Several angular diameters have also been measured in the near-IR. The performance of CALOP has been calibrated in terms of limiting magnitude (K down to 9.0) and limiting angular resolution (1-3 milliarcseconds).
In addition, the binary detection probability of the program has been found to be about 4%. Finally, a new wavelet-based method for extracting and characterizing LO lightcurves in an automated fashion was proposed, implemented and applied to the CALOP database. This pipeline addresses the need for preliminary results on an immediate basis in future programs that will record larger numbers of events.
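The wavelet pipeline itself is not reproduced here, but the shrinkage step such methods rely on can be sketched with a one-level Haar transform (toy lightcurve and illustrative threshold, not the thesis code):

```python
import numpy as np

def haar_shrink(signal, threshold):
    """One level of Haar wavelet soft-thresholding on an even-length 1-D lightcurve."""
    s = np.asarray(signal, dtype=float)
    approx = (s[0::2] + s[1::2]) / np.sqrt(2.0)   # smooth (low-pass) coefficients
    detail = (s[0::2] - s[1::2]) / np.sqrt(2.0)   # fluctuation (high-pass) coefficients
    detail = np.sign(detail) * np.maximum(np.abs(detail) - threshold, 0.0)  # soft threshold
    out = np.empty_like(s)                        # inverse transform
    out[0::2] = (approx + detail) / np.sqrt(2.0)
    out[1::2] = (approx - detail) / np.sqrt(2.0)
    return out

lightcurve = np.array([1.0, 3.0, 2.0, 2.0])
identity = haar_shrink(lightcurve, 0.0)    # threshold 0: perfect reconstruction
smoothed = haar_shrink(lightcurve, 10.0)   # large threshold: only pairwise means survive
```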

In the case of speckle interferometry, the CCD fast drift scanning technique has been validated with the observation of four binary systems with well-determined orbits. The results for separation, position angle and magnitude difference are in accordance with published measurements by other observers and with the predicted orbits. Error estimates have been found to be 0.017 arcseconds, 1.5 degrees and 0.34 mag, respectively. These are of the same order as those of other authors and can be considered successful for a first trial of this technique.
Finally, a new approach for calibrating the speckle transfer function from the binary power spectrum itself has been introduced. It does not require point-source observations, which allows more effective use of observing time. This new calibration method appears to be limited to zenith angles above 30 degrees when observing without refraction compensation devices.
In this work a series of new observational techniques and data analysis tools have been designed and developed in two clearly differentiated areas: on the one hand, the deconvolution of wide-field (survey-type) CCD images; on the other, high-resolution astrometry, in particular the observational techniques of lunar occultations and speckle interferometry.

Regarding the first area, two deconvolution algorithms (maximum-likelihood Richardson-Lucy (MLE) and its adaptive wavelet-based variant (AWMLE)) have been applied to data from three survey projects: the Flagstaff Transit Telescope (FASTT), the QUasar Equatorial Survey Team (QUEST) and the Near-Earth Space Surveillance Terrestrial (NESS-T). All three have their limiting magnitude and resolution restricted, either by the drift-scanning acquisition method (FASTT and QUEST) or by the very short focal ratio of the instrument (NESS-T). A new methodology for applying AWMLE and MLE to these images has been proposed and implemented. It makes it possible to evaluate the improvement contributed by deconvolution in terms of the increase in limiting magnitude and limiting resolution, and it is fully general and exportable to other survey data. The results obtained show that AWMLE achieves an increase of 0.6 mag in limiting magnitude and an improvement of 1.0 pixel in limiting resolution. Moreover, these trends have been shown to become asymptotically independent of the number of iterations once it is sufficiently large. In parallel, it has been verified that MLE deconvolution of FASTT data neither significantly affects the astrometric accuracy nor introduces any positional bias towards the centre of the pixel.

Regarding lunar occultations, a new CCD observing technique for lunar occultations has been devised, developed, implemented and assessed. It is based on the drift-scanning acquisition method and allows the intensity of the occulted object to be sampled every 2 ms. The technique enables practically any observatory (professional or amateur) to undertake lunar occultation programs with scientific aims. In parallel, a lunar occultation observing program (named CALOP) has been carried out over 4 years and 71.5 nights at the Observatorio Astronómico de Calar Alto, operating both in the visible with a CCD and in the IR with the MAGIC camera. As a result, 3 stellar diameters of the order of 7 milliarcseconds have been measured, and 15 new binary systems and one triple system with angular separations down to 2 milliarcseconds have been detected. Finally, a new wavelet-based algorithm for the automated reduction of occultations has been developed and implemented. This algorithm has been successfully applied to the reduction of the whole set of occultations (400) recorded in the CALOP program, and it makes it possible to tackle the reduction of future massive observing campaigns.

Regarding speckle interferometry, a new CCD observing technique for this kind of high-spatial-resolution observation has been devised, developed, implemented and assessed. It is based on the drift-scanning acquisition method and allows the intensity of the object to be sampled within the atmospheric coherence interval. It has been validated with the measurement of 4 binary systems with known orbits. The results for angular separation, position angle and magnitude difference agree with those published by previous authors. The technique enables practically any observatory (professional or amateur) to undertake speckle programs with scientific aims. Finally, a new self-calibration technique for speckle data, which saves observing time, has been devised, implemented and validated.
29

Tesfazghi, Oluwakemi. "Accelerating access to new malaria vector control tools : a national and global health policy analysis." Thesis, University of Liverpool, 2016. http://livrepository.liverpool.ac.uk/3006144/.

Abstract:
Background: New malaria vector control tools hold the promise of sustaining the gains in malaria control achieved to date and of reaching the elimination goal set for 2030. However, insecticide resistance has the potential to derail these achievements. Access to innovative vector control tools is key to surmounting the threat of insecticide resistance and will play a major role if malaria elimination is to be achieved. The aim of this thesis is to gather new evidence and provide insight into strategies for accelerating access to new malaria vector control tools. This is done by examining access to new malaria vector control tools in two national settings (Nigeria and Burkina Faso) as well as at the global level. Methods: Three retrospective policy analyses were carried out using an analytical framework to guide the selection of key informants (KIs), data collection and analysis. Semi-structured interviews were carried out with KIs in Nigeria (2013), Burkina Faso (2014) and at the global level (2014). Interviews were conducted in English (French in Burkina Faso), audio recorded, transcribed and entered into NVivo 10 for data management and analysis. Data were coded according to the framework themes and then analysed to describe the key points and explain patterns in the data. Results: A total of 40 interviews were conducted with policymakers, researchers, donors, multilaterals, non-governmental organizations and the private sector.
The synthesized findings of the three case studies show that, in the context of insecticide resistance, the evidence required to facilitate policy change is nuanced and context-specific; national policymaking may be well defined and appear to be evidence-based, yet can be circumvented and hindered by inefficiencies in global policymaking and a lack of donor funding; price, rather than cost-effectiveness, is the key financial variable at the national level; and no readily identifiable policy champions exist to facilitate global and national adoption of new vector control tools. Conclusions: This thesis has identified five areas that need to be strengthened in order to facilitate access to new malaria vector control tools by fostering their global and national adoption. It demonstrates that, without a well-coordinated architecture to facilitate the development of robust and appropriate evidence, support a transparent and timely global policymaking process, diversify the available funding base, and facilitate price reductions without stifling innovation, accelerating access to new vector control tools and achieving malaria elimination goals is unlikely.
30

Molinari, Diego <1985&gt. "Development of new tools and devices for CMB and foreground data analysis and future experiments." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2014. http://amsdottorato.unibo.it/6223/1/Molinari_Diego_tesi.pdf.

Abstract:
The discovery of the Cosmic Microwave Background (CMB) radiation in 1965 is one of the fundamental milestones supporting the Big Bang theory. The CMB is one of the most important sources of information in cosmology. The excellent accuracy of the recent CMB data from the WMAP and Planck satellites confirmed the validity of the standard cosmological model and set a new challenge for the data analysis processes and their interpretation. In this thesis we deal with several aspects and useful tools of the data analysis, focusing on their optimization in order to fully exploit the Planck data and contribute to the final published results. The issues investigated are: the change of coordinates of CMB maps using the HEALPix package, the problem of the aliasing effect in the generation of low-resolution maps, and the comparison of the Angular Power Spectrum (APS) extraction performance of the optimal QML method, implemented in the code BolPol, with the pseudo-Cl method, implemented in Cromaster. The QML method has then been applied to the Planck data at large angular scales to extract the CMB APS. The same method has also been applied to analyse the TT parity and Low Variance anomalies in the Planck maps, showing a consistent deviation from the standard cosmological model; the possible origins of these results are discussed. The Cromaster code has instead been applied to the 408 MHz and 1.42 GHz surveys, focusing on the APS analysis of selected regions of the synchrotron emission. The new generation of CMB experiments will be dedicated to polarization measurements, which require high-accuracy devices for separating the polarizations. Here a new technology, Photonic Crystals, is exploited to develop a new polarization splitter device, and its performance is compared to the devices used nowadays.
31

Molinari, Diego <1985&gt. "Development of new tools and devices for CMB and foreground data analysis and future experiments." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2014. http://amsdottorato.unibo.it/6223/.

Abstract:
The discovery of the Cosmic Microwave Background (CMB) radiation in 1965 is one of the fundamental milestones supporting the Big Bang theory. The CMB is one of the most important sources of information in cosmology. The excellent accuracy of the recent CMB data from the WMAP and Planck satellites confirmed the validity of the standard cosmological model and set a new challenge for the data analysis processes and their interpretation. In this thesis we deal with several aspects and useful tools of the data analysis, focusing on their optimization in order to fully exploit the Planck data and contribute to the final published results. The issues investigated are: the change of coordinates of CMB maps using the HEALPix package, the problem of the aliasing effect in the generation of low-resolution maps, and the comparison of the Angular Power Spectrum (APS) extraction performance of the optimal QML method, implemented in the code BolPol, with the pseudo-Cl method, implemented in Cromaster. The QML method has then been applied to the Planck data at large angular scales to extract the CMB APS. The same method has also been applied to analyse the TT parity and Low Variance anomalies in the Planck maps, showing a consistent deviation from the standard cosmological model; the possible origins of these results are discussed. The Cromaster code has instead been applied to the 408 MHz and 1.42 GHz surveys, focusing on the APS analysis of selected regions of the synchrotron emission. The new generation of CMB experiments will be dedicated to polarization measurements, which require high-accuracy devices for separating the polarizations. Here a new technology, Photonic Crystals, is exploited to develop a new polarization splitter device, and its performance is compared to the devices used nowadays.
32

Fruth, Matthias. "Formal methods for the analysis of wireless network protocols." Thesis, University of Oxford, 2011. http://ora.ox.ac.uk/objects/uuid:df2c08f4-001c-42d3-a2f4-9922f081fb49.

Abstract:
In this thesis, we present novel software technology for the analysis of wireless networks, an emerging area of computer science. To address the widely acknowledged lack of formal foundations in this field, probabilistic model checking, a formal method for verification and performance analysis, is used. Contrary to testing and simulation, it systematically explores the full state space and therefore allows reasoning about all possible behaviours of a system. This thesis contributes to the design, modelling, and analysis of ad-hoc networks and randomised distributed coordination protocols. First, we present a new hybrid approach that effectively combines probabilistic model checking and state-of-the-art models from the simulation community in order to improve the reliability of design and analysis of wireless sensor networks and their protocols. We describe algorithms for the automated generation of models for both analysis methods and their implementation in a tool. Second, we study spatial properties of wireless sensor networks, mainly with respect to Quality of Service and energy properties. Third, we investigate the contention resolution protocol of the networking standard ZigBee. We build a generic stochastic model for this protocol and analyse its Quality of Service and energy properties. Furthermore, we assess the applicability of different interference models. Fourth, we explore slot allocation protocols, which serve as a bandwidth allocation mechanism for ad-hoc networks. We build a generic model for this class of protocols, study real-world protocols, and optimise protocol parameters with respect to Quality of Service and energy constraints. We combine this with the novel formalisms for wireless communication and interference models, and finally we optimise local (node) and global (network) routing policies. This is the first application of probabilistic model checking both to protocols of the ZigBee standard and to protocols for slot allocation.
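At its core, probabilistic model checking reduces questions like "what is the probability the protocol eventually delivers a packet?" to reachability probabilities computed over the full state space; a minimal fixed-point sketch on a toy chain (hypothetical transition probabilities, far simpler than the PRISM-style models used in such work):

```python
import numpy as np

def reach_probability(P, targets, iterations=300):
    """Probability of eventually reaching a target state in a discrete-time
    Markov chain, by fixed-point iteration over the full state space."""
    prob = np.zeros(P.shape[0])
    prob[list(targets)] = 1.0
    for _ in range(iterations):
        prob = P @ prob                # one backward step through the chain
        prob[list(targets)] = 1.0      # target states stay satisfied
    return prob

# Toy send/acknowledge chain (hypothetical numbers):
# 0 = idle, 1 = transmitting, 2 = delivered (goal), 3 = lost (failure).
P = np.array([[0.2, 0.8, 0.0, 0.0],
              [0.3, 0.0, 0.6, 0.1],
              [0.0, 0.0, 1.0, 0.0],
              [0.0, 0.0, 0.0, 1.0]])
prob = reach_probability(P, {2})
```

Here every state is visited, so the result holds for all behaviours, which is exactly what distinguishes model checking from simulation runs.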
33

Baldini, Jacopo. "New visualization tools for sciences and humanities: databases and virtual reality." Doctoral thesis, Scuola Normale Superiore, 2018. http://hdl.handle.net/11384/85816.

Abstract:
From the scientific point of view, one of the major problems with simulations developed for Virtual Reality, particularly in research, is the poorly developed interaction between the data being displayed and the database in which they are stored, combined with the difficulty of making everything directly accessible in the virtual environment. In the Digital Humanities it becomes increasingly necessary to have an instrument capable of combining scientific visualization in Virtual Reality with access to, and consultation of, an open repository. The work done for this thesis is based on the creation of a heterogeneous relational database called ArcheoDB, run by an open web platform, on which researchers can upload 3D content and share it with their collaborators. The web platform and the database are accompanied by an application called "ArcheoDB VR Toolkit", developed with the Unity engine, which dynamically loads contents from the ArcheoDB database and visualizes them in a complex scene. This application also provides a detailed reconstruction of the various objects, the background of their discovery and the chemical analyses performed, in a highly immersive way thanks to the use of immersive visualization. Likewise, for the application to be truly innovative, it must allow not only the display of 3D data but also real-time interaction with the displayed models, enabling them to be enriched and modified on the basis of data and metadata drawn from the database.
34

Lebacher, Michael [Verfasser], and Göran [Akademischer Betreuer] Kauermann. "New approaches in statistical network data analysis / Michael Lebacher ; Betreuer: Göran Kauermann." München : Universitätsbibliothek der Ludwig-Maximilians-Universität, 2019. http://d-nb.info/1203067119/34.

35

Azab, Ahmad. "Classification of network information flow analysis (CONIFA) to detect new application versions." Thesis, Federation University Australia, 2015. http://researchonline.federation.edu.au/vital/access/HandleResolver/1959.17/97576.

Abstract:
Monitoring network traffic to identify applications or services is vital for internet service providers, network engineers and law enforcement agencies. The identification of applications enables network traffic to be prioritized, sophisticated plans for network infrastructure to be developed, and the work of law enforcement agencies to be facilitated. Voice over IP (VoIP) and malware services are important to classify because legitimate users and cybercriminals, respectively, rely on them. This dissertation addresses the detection of these services, represented by Skype voice call traffic and Zeus command-and-control traffic. Three major approaches have been used to fulfil the classification goal: port-based classification, deep packet inspection, and the use of statistical features in conjunction with machine learning algorithms. The latter approach addressed many of the limitations of the first two, but it still has significant limitations of its own. In particular, the detection of new versions of an application by building the classifier on an old version has not been deeply discussed for the machine learning approach, yet the statistical feature values differ across versions for both Skype voice calls and Zeus command-and-control traffic: Skype uses different codecs in different versions, and Zeus uses different malware builders. While some approaches other than machine learning tackled the detection of different versions, none of them retains all the characteristics of the machine learning approach: online classification capability and support for various transport and application protocols, without needing access to different devices' traffic, access to packet contents, or monitoring of different phases' traffic.
This research study aims to close this gap by proposing a novel framework called Classification of Network Information Flow Analysis (CONIFA). CONIFA detects different untrained versions of a targeted application (Skype voice calls and Zeus command-and-control traffic) with a low detection time by analysing and building the classifier on a single different version in a systematic and well-defined approach, while providing online classification capability and supporting various transport and application protocols, without needing access to different devices' traffic, packet contents, or different phases' traffic. CONIFA is not limited to a specific application and could be extended to other types of applications. Unlike the standard machine learning approach, which uses cost-insensitive algorithms and a single feature combination, CONIFA utilizes cost-sensitive algorithms and different feature combinations for building the classifiers. The outputs of the first phase are two classifiers, lenient and strict, that are used in the next phase to detect untrained versions of a targeted application and to reduce the error rate. CONIFA's results for detecting untrained versions of Skype voice calls and Zeus C&C traffic support this approach, showing better detection performance than the previous approach. While the previous approach was not able to reliably detect new versions of VoIP, CONIFA consistently detected a previously unseen version. For botnet detection, the previous approach had good efficacy at the network level; however, CONIFA outperformed it in detecting a new version of a known piece of malware.
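The lenient/strict pair at the heart of CONIFA comes from training the same learner under two misclassification-cost settings; the idea can be illustrated with a toy threshold picker over flow scores (hypothetical scores and costs, not the thesis's features or classifiers):

```python
def best_threshold(scores_pos, scores_neg, fn_cost, fp_cost):
    """Pick the score threshold that minimizes total misclassification cost."""
    candidates = sorted(set(scores_pos) | set(scores_neg))
    best, best_cost = None, float("inf")
    for t in candidates:
        fn = sum(1 for s in scores_pos if s < t) * fn_cost   # missed target flows
        fp = sum(1 for s in scores_neg if s >= t) * fp_cost  # false alarms
        if fn + fp < best_cost:
            best, best_cost = t, fn + fp
    return best

# Hypothetical per-flow scores (e.g. from a statistical-feature model).
target = [0.9, 0.8, 0.75, 0.6, 0.4]
other  = [0.5, 0.3, 0.2, 0.1]
lenient = best_threshold(target, other, fn_cost=5, fp_cost=1)  # penalizes misses
strict  = best_threshold(target, other, fn_cost=1, fp_cost=5)  # penalizes false alarms
```

The lenient classifier flags more flows than the strict one; combining their verdicts is what the framework uses to keep the error rate down on unseen versions.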
Doctor of Philosophy
36

Catanese, Salvatore Amato. "New perspectives in criminal network analysis: multilayer networks, time evolution, and visualization." Doctoral thesis, Università di Catania, 2017. http://hdl.handle.net/10761/3793.

Abstract:
The work presented in this dissertation reflects a long-term human, professional and cultural path started some years ago when I first developed LogAnalysis, a tool for the analysis and visualization of criminal and social networks. Since then, I have devoted myself to the development of frameworks, algorithms and techniques for supporting intelligence and law enforcement agencies in unveiling the criminal network (CN) structure hidden in communication data, identifying target offenders for removal, and selecting effective strategies to disrupt a criminal organization. I subsequently focused on the evaluation of the resilience of criminal networks and on the multiplex formalism, which takes into account the various relationships existing within a criminal organization. In this context I introduce criminal network analysis tools: LogAnalysis, LogViewer, Semantic viewer and Failure simulator. I have been involved in the design, modeling, and writing of all of the works presented, and in particular I developed and tested all the visual tools included therein. Finally, I introduce Multiplex PBFS (Mx-PBFS), a novel multi-threaded parallel Breadth-First Search algorithm for multiplex networks with categorical and inter-layer couplings, and the framework CriMuxnet (still under development) for multilayer criminal network analysis based on high-quality 3D visualizations of network data. CriMuxnet was designed to work in conjunction with a 3D computer graphics (CG) package, Autodesk Maya or Blender, and exploits 3D engine features to significantly improve both exploratory search and visualization strategy.
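Mx-PBFS itself is multi-threaded and optimized; a minimal serial sketch of breadth-first search over a multiplex network, where a vertex is coupled to its replica in other layers (toy two-layer data, illustrative only):

```python
from collections import deque

def multiplex_bfs(layers, couplings, start):
    """BFS over a multiplex network: nodes are (layer, vertex) pairs, edges are
    intra-layer links plus inter-layer couplings of a vertex to its own replica."""
    dist = {start: 0}
    queue = deque([start])
    while queue:
        layer, v = queue.popleft()
        neighbours = [(layer, u) for u in layers[layer].get(v, [])]
        neighbours += [(other, v) for other in couplings.get(layer, [])
                       if v in layers[other]]          # categorical coupling
        for node in neighbours:
            if node not in dist:
                dist[node] = dist[(layer, v)] + 1
                queue.append(node)
    return dist

# Toy two-layer criminal network (hypothetical data): phone calls vs meetings.
layers = {
    "calls":    {"a": ["b"], "b": ["a", "c"], "c": ["b"]},
    "meetings": {"c": ["d"], "d": ["c"]},
}
couplings = {"calls": ["meetings"], "meetings": ["calls"]}
dist = multiplex_bfs(layers, couplings, ("calls", "a"))
```

Reaching "d" requires hopping between layers through the shared actor "c", which is the kind of path a single-layer analysis would miss.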
37

Widing, Härje. "Business analytics tools for data collection and analysis of COVID-19." Thesis, Linköpings universitet, Statistik och maskininlärning, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-176514.

Abstract:
The pandemic that struck the entire world in 2020, caused by the SARS-CoV-2 (COVID-19) virus, will attract enormous interest in statistical and economic analytics for a long time. While the pandemic of 2020 is not the first to strike the entire world, it is the first pandemic in history for which data were gathered to this extent. Most countries have collected and shared their numbers of cases, tests and deaths related to the COVID-19 virus, using different storage methods and different data types. Obtaining quality data on the COVID-19 pandemic is a problem most countries faced during the pandemic, since the data are constantly changing, not only for the current situation but also because past values have been altered when additional information has surfaced. The importance of having the latest data available for government officials to make informed decisions makes Business Intelligence tools and techniques for data gathering and aggregation one way of solving the problem. One of the most widely used Business Intelligence applications is Microsoft's Power BI, designed as a powerful visualization and analysis tool that can gather all data related to the COVID-19 pandemic into one application. The pandemic caused not only millions of deaths but also one of the largest drops in the stock market since the Great Recession of 2007. To determine whether the deaths or other factors directly caused the drop, the study modelled the volatility of index funds using Generalized Autoregressive Conditional Heteroscedasticity (GARCH). One question often asked about the COVID-19 virus is how deadly it is. Analysing the effect the pandemic had on the mortality rate is one way of determining not only how the pandemic affected the mortality rate but also how deadly the virus is. The analysis of the mortality rate was performed using a Seasonal Artificial Neural Network.
Deaths from the pandemic were forecast by applying the Seasonal Artificial Neural Network to the COVID-19 daily deaths data.
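The GARCH(1,1) recursion behind such volatility modelling is compact; a sketch with illustrative parameters (not values fitted in the thesis):

```python
import numpy as np

def garch11_variance(returns, omega, alpha, beta):
    """Conditional variance path of a GARCH(1,1) model:
    sigma2[t] = omega + alpha * r[t-1]**2 + beta * sigma2[t-1]."""
    r = np.asarray(returns, dtype=float)
    sigma2 = np.empty_like(r)
    sigma2[0] = omega / (1.0 - alpha - beta)  # unconditional variance as the seed
    for t in range(1, len(r)):
        sigma2[t] = omega + alpha * r[t - 1] ** 2 + beta * sigma2[t - 1]
    return sigma2

# A large return shock at t=0 (hypothetical data) pushes conditional variance up,
# after which it decays back at rate alpha + beta.
calm  = garch11_variance([0.0] * 5, omega=0.1, alpha=0.1, beta=0.8)
shock = garch11_variance([2.0] + [0.0] * 4, omega=0.1, alpha=0.1, beta=0.8)
```

This volatility-clustering behaviour is what makes GARCH suitable for studying whether a market drop coincided with pandemic events.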
38

Piovesan, Allison <1986&gt. "New computational biology tools for the systematic analysis of the structure and expression of human genes." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2014. http://amsdottorato.unibo.it/6283/1/piovesan_allison_tesi.pdf.

Abstract:
From the late 1980s, the automation of sequencing techniques and the spread of computers gave rise to a flourishing number of new molecular structures and sequences and to a proliferation of new databases in which to store them. Here, three computational approaches are presented that analyse the massive amount of publicly available data in order to answer important biological questions. The first strategy studies the incorrect assignment of the first AUG codon in a messenger RNA (mRNA), due to the incomplete determination of its 5' end sequence. An extension of the mRNA 5' coding region was identified in 477 human loci, out of all known human mRNAs analysed, using an automated expressed sequence tag (EST)-based approach. Proof-of-concept confirmation was obtained by in vitro cloning and sequencing for GNB2L1, QARS and TDP2, and the consequences for functional studies are discussed. The second approach analyses the codon bias, the phenomenon in which distinct synonymous codons are used with different frequencies, and, following integration with a gene expression profile, estimates the total number of codons present across all the expressed mRNAs (named here the "codonome value") in a given biological condition. Systematic analyses across different pathological and normal human tissues and multiple species show a surprisingly tight correlation between the codon bias and the codonome bias. The third approach studies the expression of genes implicated in human autism spectrum disorder (ASD). ASD-implicated genes sharing microRNA response elements (MREs) for the same microRNA are co-expressed in brain samples from healthy and ASD-affected individuals. The differential expression of a recently identified long non-coding RNA, which has four MREs for the same microRNA, could disrupt the equilibrium in this network, but further analyses and experiments are needed.
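The "codonome value" idea, expression-weighted codon counting across all expressed mRNAs, can be sketched as follows (hypothetical two-gene data, not the thesis pipeline):

```python
from collections import Counter

def codon_counts(cds):
    """Codon usage of one coding sequence, read in frame 0."""
    return Counter(cds[i:i + 3] for i in range(0, len(cds) - len(cds) % 3, 3))

def codonome(transcripts, expression):
    """Total codon counts across all expressed mRNAs, weighted by expression level."""
    total = Counter()
    for name, cds in transcripts.items():
        level = expression.get(name, 0)
        for codon, n in codon_counts(cds).items():
            total[codon] += n * level
    return total

# Hypothetical two-gene example; expression levels are illustrative copy numbers.
transcripts = {"geneA": "ATGGCTGCTTAA", "geneB": "ATGGCTTAA"}
expression = {"geneA": 10, "geneB": 1}
totals = codonome(transcripts, expression)
```

Summing `totals` over all codons would give the codonome value for this toy condition; comparing the relative codon frequencies with the genome-wide codon bias is the correlation studied in the thesis.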
39

Piovesan, Allison <1986&gt. "New computational biology tools for the systematic analysis of the structure and expression of human genes." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2014. http://amsdottorato.unibo.it/6283/.

Abstract:
From the late 1980s, the automation of sequencing techniques and the spread of computers gave rise to a flourishing number of new molecular structures and sequences and to a proliferation of new databases in which to store them. Here, three computational approaches are presented that analyse the massive amount of publicly available data in order to answer important biological questions. The first strategy studies the incorrect assignment of the first AUG codon in a messenger RNA (mRNA), due to the incomplete determination of its 5' end sequence. An extension of the mRNA 5' coding region was identified in 477 human loci, out of all known human mRNAs analysed, using an automated expressed sequence tag (EST)-based approach. Proof-of-concept confirmation was obtained by in vitro cloning and sequencing for GNB2L1, QARS and TDP2, and the consequences for functional studies are discussed. The second approach analyses the codon bias, the phenomenon in which distinct synonymous codons are used with different frequencies, and, following integration with a gene expression profile, estimates the total number of codons present across all the expressed mRNAs (named here the "codonome value") in a given biological condition. Systematic analyses across different pathological and normal human tissues and multiple species show a surprisingly tight correlation between the codon bias and the codonome bias. The third approach studies the expression of genes implicated in human autism spectrum disorder (ASD). ASD-implicated genes sharing microRNA response elements (MREs) for the same microRNA are co-expressed in brain samples from healthy and ASD-affected individuals. The differential expression of a recently identified long non-coding RNA, which has four MREs for the same microRNA, could disrupt the equilibrium in this network, but further analyses and experiments are needed.
40

Kangas, M. (Maria). "Stability analysis of new paradigms in wireless networks." Doctoral thesis, Oulun yliopisto, 2017. http://urn.fi/urn:isbn:9789526215464.

Abstract:
Abstract: Fading in wireless channels, the limited battery energy available in wireless handsets, changing user demands, and the increasing demand for high data rates and low delay pose serious design challenges for future generations of mobile communication systems. It is necessary to develop efficient transmission policies that adapt to changes in network conditions and achieve the target delay and rate with minimum power consumption. In this thesis, a number of new paradigms in wireless networks are presented. Dynamic programming tools are used to provide dynamic, network-stabilizing resource allocation solutions for virtualized data centers with clouds, cooperative networks and heterogeneous networks. Exact dynamic programming is used to develop optimal resource allocation and topology control policies for these networks with queues and time-varying channels. In addition, approximate dynamic programming is considered to provide new sub-optimal solutions. Unified system models and unified control problems are provided both for secondary and primary service provider cognitive networks and for conventional wireless networks. The results show that, by adapting to changes in queue lengths and channel states, the dynamic policy mitigates the effects of primary and secondary service provider cognitive networks on each other. We investigate network stability and provide new unified stability regions for primary and secondary service provider cognitive networks as well as for conventional wireless networks. The K-step Lyapunov drift is used to analyse the performance and stability of the proposed dynamic control policies, and a new unified stability analysis and queuing bound are provided both for primary and secondary service provider cognitive networks and for conventional wireless networks.
By adapting to changes in network conditions, the dynamic control policies are shown to stabilize the network and to minimize the bound on the average queue length. In addition, we prove that the previously proposed frame-based policy does not minimize the bound on the average delay when there are resources shared between the terminals with queues.
Fading in wireless channels, the limited battery capacity of wireless devices, changes in user needs, and the demands of increasing data transfer and shorter delays create great challenges for the design of future wireless networks. It is essential to develop efficient resource allocation algorithms that adapt to changes in the networks and achieve both the target delay and the target data rate with the smallest possible power consumption. This doctoral thesis presents new paradigms for wireless communication networks. Dynamic programming tools are used to create dynamic network-stabilizing resource allocation solutions for virtual cloud-service data centers, user cooperation networks and heterogeneous networks. Exact dynamic programming tools are used to develop optimal resource allocation and topology control algorithms for these networks with queues and fading channels. In addition, approximate dynamic programming tools are used to create new suboptimal solutions. Unified system models and unified control problems are created both for secondary and primary service provider cognitive networks and for conventional wireless networks. The results show that, by adapting to changes in queue lengths and channels, the dynamic technique attenuates the effect of primary and secondary service provider cognitive networks on each other. We also study network stability and create new stability regions both for primary and secondary service provider cognitive networks and for conventional wireless networks. The K-step Lyapunov drift is used to analyse the performance and stability of the dynamic control technique. In addition, a new unified stability analysis and queue upper bound are created for primary and secondary service provider cognitive networks and for conventional wireless networks.
Dynaamisen algoritmin näytetään stabiloivan verkko ja minimoivan keskimääräisen jonon pituuden yläraja sopeutumalla verkon olosuhteiden muutoksiin. Tämän lisäksi todistamme että aiemmin esitetty frame-algoritmi ei minimoi keskimääräisen viiveen ylärajaa, kun käyttäjät jakavat keskenään resursseja
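The Lyapunov drift argument summarized in this abstract follows the standard drift-based stability pattern. As a sketch in generic notation (the thesis's exact symbols, and its K-step generalization, may differ): with a quadratic Lyapunov function over the queue vector, stability and the queue-length bound follow when the conditional drift is bounded as below.

```latex
% Quadratic Lyapunov function over the queue vector Q(t)
L(\mathbf{Q}(t)) = \tfrac{1}{2}\sum_{i} Q_i(t)^2
% One-step conditional drift; B bounds the second-order arrival/service terms
\Delta(t) \triangleq \mathbb{E}\!\left[\, L(\mathbf{Q}(t+1)) - L(\mathbf{Q}(t)) \mid \mathbf{Q}(t) \,\right]
\le B - \epsilon \sum_{i} Q_i(t)
% Telescoping and averaging yields the mean queue-length bound
\limsup_{T \to \infty} \frac{1}{T} \sum_{t=0}^{T-1} \sum_{i} \mathbb{E}\!\left[ Q_i(t) \right] \le \frac{B}{\epsilon}
```

Minimizing the right-hand side of the drift bound in every slot is what produces the bound on the average queue length that the dynamic policies are shown to minimize.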
Стилі APA, Harvard, Vancouver, ISO та ін.
41

Sábio, Rafael Miguel. "Luminescent nanohybrids based on silica and d-f heterobimetallic silylated complexes : new tools for biological media analysis /." Araraquara, 2016. http://hdl.handle.net/11449/144587.

Повний текст джерела
Анотація:
Orientador: Sidney José Lima Ribeiro
Banca: Marie-Joelle Menu
Banca: Lucas Alonso Rocha
Banca: Lauro June Queiroz Maia
Banca: Michel Wong Chi Man
Banca: Marc Verelst
Banca: Isabelle Gautier-Luneau
Resumo: The design of heterobimetallic luminescent complexes has gained growing interest in recent years due to their unique photophysical properties. More specifically, the development of heterobimetallic complexes using d-block chromophores to sensitize the near-infrared (NIR) emission of lanthanide complexes (such as Nd(III) and Yb(III)) has received significant attention, taking into account their longer emission wavelengths and the interest of NIR emission, which penetrates human tissue more effectively than UV light. These properties give them potential applications in medical diagnostics or biomedical assays. Transitions to excited-state levels of transition metal complexes, occurring in the visible and characterized by large absorption coefficients, can efficiently sensitize f-f levels of Ln(III) ions. In this work new d-f heterobimetallic complexes containing silylated ligands were prepared and supported on silica materials. [Ru(bpy)2(bpmd)]Cl2 (labeled Ru), [Ru(bpy)(bpy-Si)(bpmd)]Cl2 (labeled RuL) and [Ln(TTA-Si)3] (labeled LnL3), and the d-f heterobimetallic complexes Ru-LnL3 and Ln-RuL (Ln = Nd3+, Yb3+), were prepared. Structural characterization was carried out by Raman scattering and 1H and 13C NMR spectroscopies. Results obtained from 1H-13C HMBC and HSQC correlation NMR spectra confirm the formation of the proposed complexes. Studies of the photophysical properties highlight the efficiency of Ru-Ln energy transfer processes in NIR-emitting lanthanide complexes mediated by the conjugated brid... (Complete abstract: follow the electronic access link below)
Abstract: The design of heterobimetallic luminescent complexes has gained growing interest in recent years due to their unique photophysical properties. More specifically, the development of heterobimetallic complexes using d-block chromophores to sensitize the near-infrared (NIR) emission of lanthanide complexes (such as Nd(III) and Yb(III)) has received significant attention, taking into account their longer emission wavelengths and the interest of NIR emission, which penetrates human tissue more effectively than UV light. These properties give them potential applications in medical diagnostics or biomedical assays. Transitions to excited-state levels of transition metal complexes, occurring in the visible and characterized by large absorption coefficients, can efficiently sensitize f-f levels of Ln(III) ions. In this work new d-f heterobimetallic complexes containing silylated ligands were prepared and supported on silica materials. [Ru(bpy)2(bpmd)]Cl2 (labeled Ru), [Ru(bpy)(bpy-Si)(bpmd)]Cl2 (labeled RuL) and [Ln(TTA-Si)3] (labeled LnL3), and the d-f heterobimetallic complexes Ru-LnL3 and Ln-RuL (Ln = Nd3+, Yb3+), were prepared. Structural characterization was carried out by Raman scattering and 1H and 13C NMR spectroscopies. Results obtained from 1H-13C HMBC and HSQC correlation NMR spectra confirm the formation of the proposed complexes. Studies of the photophysical properties highlight the efficiency of the Ru-Ln energy transfer processes in NIR-emitting lanthanide complexes mediated by the conjugated bridging ligand (2,2'-bipyrimidine). Lifetime measurements were carried out and values of the quantum yield for energy transfer (ET) between 30 and 84 % could be evaluated. The ET of 73.4 % obtained for the Yb-RuL complex is the largest value reported for Ru(II)-Yb(III) heterobimetallic complexes so far. Grafting on different silica matrices was also demonstrated...
Doutor
Стилі APA, Harvard, Vancouver, ISO та ін.
42

Sábio, Rafael Miguel [UNESP]. "Luminescent nanohybrids based on silica and d-f heterobimetallic silylated complexes: new tools for biological media analysis." Universidade Estadual Paulista (UNESP), 2016. http://hdl.handle.net/11449/144587.

Повний текст джерела
Анотація:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
The design of heterobimetallic luminescent complexes has attracted growing interest in recent years because of their unique photophysical properties. In these lanthanide complexes (Nd(III) and Yb(III)) associated with d-block chromophores, the strong visible emission of the transition metals is used to efficiently sensitize the f-f levels of the lanthanide(III) ions, which in turn emit in the visible or the IR depending on the rare earth. Attention has focused more specifically on the development of d-f heterobimetallic complexes emitting in the near infrared (NIR), since NIR light, compared with UV, penetrates human biological tissue, notably the skin, more easily. Although such properties give these bimetallic complexes strong potential for medical diagnostics, no d-f heterobimetallic complex covalently grafted onto a silica matrix had been described. In this work, new d-f heterobimetallic complexes bearing silylated ligands were prepared and grafted onto silica. The monomeric complexes [Ru(bpy)2(bpmd)]Cl2 (denoted Ru), [Ru(bpy)(bpy-Si)(bpmd)]Cl2 (denoted RuL) and [Ln(TTA-Si)3] (denoted LnL3), and the d-f heterobimetallic complexes Ru-LnL3 and Ln-RuL (Ln = Nd3+, Yb3+), were prepared. The complexes were characterized by Raman spectroscopy and 1H and 13C NMR. The 1D 1H and 13C NMR spectra, together with the 2D HSQC correlation spectra, confirm the proposed structures. The study of the photophysical properties reveals near-infrared emission from the lanthanide as well as the efficiency of the Ru-Ln energy transfer process, which is facilitated by the bridging ligand (2,2'-bipyrimidine). Lifetime and energy-transfer quantum-yield (ET) measurements indicate remarkable values of between 30 and 84 %.
The quantum yield (ET) of the Yb-RuL complex, 73.4 %, is to date the largest value reported for a Ru(II)-Yb(III) heterobimetallic complex. Grafting onto different silica matrices, mesoporous SiO2 or dense SiO2d, was carried out. The SiO2-RuL, SiO2-NdL3 and SiO2-YbL3 nanohybrids were obtained with grafting levels ranging from 0.08 to 0.18 mmol of complex per gram of silica. SiO2-RuNd and SiO2-RuYb were obtained by simultaneous grafting of the monomeric silylated ruthenium and lanthanide complexes, with grafting levels of 0.10 to 0.16 mmol g-1, respectively. The energy-transfer quantum yields ET of the SiO2-RuNd and SiO2-RuYb nanohybrids are 40 and 27.5 %, respectively. The remarkable value obtained for the neodymium nanohybrid, SiO2-RuNd, is explained by the good match between the energy levels of the donor and the acceptor. The SiO2-RuYbL3, SiO2-YbRuL, SiO2d-YbRuL and SiO2-NdRuL nanohybrids were obtained by grafting the d-f heterobimetallic silylated complexes developed in this work; the grafting levels, 0.03 to 0.17 mmol g-1, leave room for subsequent chemical functionalization of these nano-objects. The luminescence properties of these nanohybrids are similar to those of the ungrafted complexes, except for SiO2-YbRuL and SiO2d-YbRuL, which show different luminescence profiles compared with the free Yb-RuL complex. Grafting inside the silica pores may prevent luminescence deactivation processes, unlike grafting onto the dense silica matrix. The photophysical properties, combined with the morphology and stability of the mesoporous silica matrix, make these new NIR-luminescent nanohybrids promising candidates as nanoprobes or nanomarkers for biological systems.
The design of heterobimetallic luminescent complexes has gained growing interest in recent years due to their unique photophysical properties. More specifically, the development of heterobimetallic complexes using d-block chromophores to sensitize the near-infrared (NIR) emission of lanthanide complexes (such as Nd(III) and Yb(III)) has received significant attention, taking into account their longer emission wavelengths and the interest of NIR emission, which penetrates human tissue more effectively than UV light. These properties give them potential applications in medical diagnostics or biomedical assays. Transitions to excited-state levels of transition metal complexes, occurring in the visible and characterized by large absorption coefficients, can efficiently sensitize f-f levels of Ln(III) ions. In this work new d-f heterobimetallic complexes containing silylated ligands were prepared and supported on silica materials. [Ru(bpy)2(bpmd)]Cl2 (labeled Ru), [Ru(bpy)(bpy-Si)(bpmd)]Cl2 (labeled RuL) and [Ln(TTA-Si)3] (labeled LnL3), and the d-f heterobimetallic complexes Ru-LnL3 and Ln-RuL (Ln = Nd3+, Yb3+), were prepared. Structural characterization was carried out by Raman scattering and 1H and 13C NMR spectroscopies. Results obtained from 1H-13C HMBC and HSQC correlation NMR spectra confirm the formation of the proposed complexes. Studies of the photophysical properties highlight the efficiency of the Ru-Ln energy transfer processes in NIR-emitting lanthanide complexes mediated by the conjugated bridging ligand (2,2'-bipyrimidine). Lifetime measurements were carried out and values of the quantum yield for energy transfer (ET) between 30 and 84 % could be evaluated. The ET of 73.4 % obtained for the Yb-RuL complex is the largest value reported for Ru(II)-Yb(III) heterobimetallic complexes so far. Grafting on different silica matrices was also demonstrated. SiO2-Ru, SiO2-NdL3 and SiO2-YbL3 nanohybrids were obtained with grafting efficiencies from 0.08 to 0.18 mmol g-1 of silica.
SiO2-RuNd and SiO2-RuYb were obtained by simultaneous grafting of the ruthenium and lanthanide silylated complexes. Grafting efficiencies from 0.10 to 0.16 mmol g-1 were obtained. ET values of 40 and 27.5 % were obtained for SiO2-RuNd and SiO2-RuYb, respectively. The higher value observed for the Nd(III) nanohybrid is well explained by the matching of donor and acceptor energy levels. SiO2-RuYbL3, SiO2-YbRuL, SiO2d-YbRuL and SiO2-NdRuL were obtained by grafting the d-f heterobimetallic silylated complexes. Grafting efficiencies from 0.03 to 0.17 mmol g-1 were obtained. The luminescence properties of these nanohybrids were similar to those of the free complexes. However, SiO2-YbRuL and SiO2d-YbRuL showed distinct luminescence profiles compared with the free Yb-RuL. Grafting inside the mesoporous channels may prevent luminescence deactivation processes, in contrast to grafting on the dense silica matrix. The photophysical properties, associated with the morphology and stability of the mesoporous silica matrix, suggest these new NIR-luminescent nanohybrids as nanoprobes or nanomarkers in biomedicine.
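The energy-transfer quantum yields quoted in these abstracts are conventionally estimated from donor luminescence lifetimes. As a sketch of the standard relation (assuming, as is usual, that transfer to the lanthanide is the only additional decay channel opened in the bimetallic complex; the thesis may use a different notation):

```latex
% Energy-transfer efficiency from donor lifetimes:
%   tau_DA = Ru(II) donor lifetime in the d-f complex
%   tau_D  = lifetime of the isolated Ru(II) donor
\eta_{\mathrm{ET}} = 1 - \frac{\tau_{\mathrm{DA}}}{\tau_{\mathrm{D}}}
```

On this relation, the reported 73.4 % for Yb-RuL would correspond to a lifetime ratio τ_DA/τ_D of about 0.27.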
Стилі APA, Harvard, Vancouver, ISO та ін.
43

Saleetid, Nattakan. "Epizoological tools for acute hepatopancreatic necrosis disease (AHPND) in Thai shrimp farming." Thesis, University of Stirling, 2017. http://hdl.handle.net/1893/26828.

Повний текст джерела
Анотація:
Acute hepatopancreatic necrosis disease (AHPND) is an emerging bacterial infection in shrimp that has spread across the major shrimp-producing countries of the world since 2009. AHPND epizootics have resulted in a huge loss of global shrimp production, similar to that caused by white spot disease in the 1990s. The epizootiological understanding of the spread of AHPND is still in its early stages, however, and most of the currently published research findings are based on experimental studies that may struggle to capture the potential for disease transmission at the country scale. The main aim of this research, therefore, is to develop epizootiological tools to study AHPND transmission between shrimp farming sites. Some tools used in this research have already been applied to shrimp epizoology, but others are used here for the first time to evaluate the spread of shrimp diseases. According to an epizootiological survey of AHPND in Thailand (Chapter 3), the first case of AHPND in the country occurred on eastern shrimp farms in January 2012. The disease was then transmitted to the south in December 2012. The results obtained from interviews undertaken with 143 sample farms were stratified by three farm scales (large, medium and small) and two locations (east and south). Both the southern location and large-scale farming were associated with a delay in AHPND onset compared with the eastern location and small- and medium-scale farming. Twenty-four risk factors (mostly related to farming management practices) for AHPND were investigated in a cross-sectional study (Chapter 3). This allowed the development of an AHPND decision tree for defining cases (diseased farms) and controls (non-diseased farms), because at the time of the study AHPND was a disease of unknown etiology. Results of univariate and unconditional logistic regression models indicated that two farming management practices were related to the onset of AHPND.
First, the absence of pond harrowing before shrimp stocking increased the risk of AHPND occurrence, with an odds ratio (OR) of 3.9 (95 % CI 1.3–12.6; P‑value = 0.01), whereas earthen ponds decreased the risk of AHPND, with an OR of 0.25 (95 % CI 0.06–0.8; P‑value = 0.02). These findings imply that good farming management practices, such as pond-bottom harrowing, which are common in shrimp farming in earthen ponds, may contribute to overcoming AHPND infection at farm level. For the purposes of disease surveillance and control, the structure of the live shrimp movement network within Thailand (LSMN) was modelled, which demonstrated a high potential for site-to-site disease spread (Chapter 4). Real network data were recorded over a 13-month period from March 2013 to March 2014 by the Thailand Department of Fisheries. After data validation, c. 74 400 repeated connections between 13 801 shrimp farming sites were retained. Of the total connections, 77 % were inter-province movements; the remaining 23 % were intra-province movements. The results demonstrated that the LSMN had properties that both aided and hindered disease spread (Chapter 4). On the hindering side, the correlation between in- and out-degrees was only weakly positive, suggesting that sites with a high risk of catching disease posed a low risk of transmitting it (assuming solely network spread), and the LSMN showed disassortative mixing, i.e. a low preference for connections joining high-degree sites to other high-degree sites. However, there were low values for mean shortest path length and clustering, characteristics that tend to be associated with the potential for disease epidemics. Moreover, the LSMN displayed power laws in both the in- and out-degree distributions, with exponents 2.87 and 2.17, respectively.
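The odds ratios quoted above come from a standard 2×2 case-control cross-tabulation. A minimal sketch follows; the counts are hypothetical, chosen only to reproduce an OR near 3.9, and are not the thesis's data.

```python
def odds_ratio(a, b, c, d):
    """OR for a 2x2 table:
    a = exposed cases,   b = exposed controls,
    c = unexposed cases, d = unexposed controls."""
    return (a * d) / (b * c)

# Hypothetical counts: farms that did not harrow the pond (exposed)
# vs. farms that harrowed before stocking (unexposed).
print(round(odds_ratio(35, 30, 15, 50), 2))  # 3.89
```

An OR above 1 (here roughly 3.9) indicates that the exposure, skipping pond harrowing, is associated with higher odds of the farm being an AHPND case.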
The presence of power-law distributions indicates that most sites in the LSMN have a small number of connections, while a few sites have large numbers of connections. These findings not only contribute to a better understanding of disease spread between sites, therefore, but also reveal the importance of targeted disease surveillance and control, given the detection of scale-free properties in the LSMN. Chapter 5, therefore, examined the effectiveness of targeted disease surveillance and control in reducing the potential size of epizootics in the LSMN. The study utilised network approaches to identify high-risk connections whose removal from the network could reduce epizootics. Five disease-control algorithms were developed for comparison: four of these algorithms were based on centrality measures, representing targeted approaches, with a non-targeted approach as a control. For the targeted approaches, technically admissible centrality measures were considered: betweenness (the number of shortest paths that pass through a connection in the network), connection weight (the frequency of repeated connections between a site pair), eigenvector (considering the degree centralities of all neighbouring sites connected to a specified site), and subnet-crossing (prioritising connections that link two different subnetworks). The results showed that the estimated epizootic sizes were smaller when an optimal targeted approach was applied, compared with non-targeted (random) removal of connections. This optimal targeted approach can be used to prioritise targets when establishing disease surveillance and control programmes. To capture the complex modes of disease transmission (i.e. long-distance transmission, such as via live shrimp movements, and local transmission), a compartmental, individual-based epizootic model was constructed for AHPND (Chapter 6).
The modelling uncovered the seasonality of AHPND epizootics in Thailand, which were found likely to occur between April and August (during the hot and rainy seasons of Thailand). Of the two movement types, intra-province movements made up a small proportion of connections, and alone they could cause only a small AHPND epizootic. The main pathway for AHPND spread is therefore long-distance transmission, and regulators need to increase the efficacy of testing farmed shrimp for diseases before movements and improve the conduct of routine disease monitoring. The implementation of these biosecurity practices was modelled by changing the values of the long-distance transmission rate. The model demonstrated that high levels of biosecurity on live shrimp movements led to a decrease in the potential size of epizootics in Thai shrimp farming. Moreover, the potential size of epizootics also decreased when AHPND spread was modelled with a lower value of the local transmission rate. Hence, not only did the model predict AHPND epizootic dynamics stochastically, but it also assessed biosecurity enhancement, allowing the design of effective prevention programmes. In brief, this thesis develops tools for the systematic epizootiological study of AHPND transmission in Thai shrimp farming and demonstrates that: (1) at farm level, current Thai shrimp farming should enhance biosecurity systems even in larger businesses, and (2) at country level, targeted disease control strategies are required to establish disease surveillance and control measures. Although the epizootiological tools used here mainly evaluate the spread of AHPND between shrimp farming sites, they could be adapted to other infectious diseases or other farming sectors, such as the current spread of tilapia lake virus in Nile tilapia farms.
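The targeted-removal idea described in this abstract can be sketched on a toy movement network. The graph, the site labels, and the simple out-degree targeting rule below are illustrative stand-ins for the real LSMN and the centrality measures (betweenness, connection weight, eigenvector, subnet-crossing) evaluated in the thesis; the point is only that removing a few well-chosen connections shrinks the set of sites an epizootic can reach.

```python
from collections import defaultdict

def reachable(edges, start):
    """Sites reachable from `start` along directed movement connections."""
    adj = defaultdict(list)
    for u, v in edges:
        adj[u].append(v)
    seen, stack = {start}, [start]
    while stack:
        for w in adj[stack.pop()]:
            if w not in seen:
                seen.add(w)
                stack.append(w)
    return seen

# Toy movement network: a hub (site 0) ships live shrimp to many sites.
edges = [(0, i) for i in range(1, 8)] + [(1, 8), (8, 9)]

# Targeted control: drop the connections leaving the busiest site.
out_deg = defaultdict(int)
for u, _ in edges:
    out_deg[u] += 1
hub = max(out_deg, key=out_deg.get)
targeted = [(u, v) for u, v in edges if u != hub]

print(len(reachable(edges, 0)))     # epizootic seeded at site 0 reaches 10 sites
print(len(reachable(targeted, 0)))  # after targeted removal it reaches only 1
```

With random removal, by contrast, most sampled connections carry little spread, so the expected reduction in outbreak size per removed connection is far smaller, which is the comparison Chapter 5 formalises.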
Стилі APA, Harvard, Vancouver, ISO та ін.
44

Anantachai, Arnond. "A New Mobile Network Simulation And Analysis System And The Use Of Network Visualizations Through An End-User Graphics Package." OpenSIUC, 2010. https://opensiuc.lib.siu.edu/theses/243.

Повний текст джерела
Анотація:
Network simulations often output a log file, which must be parsed to understand the details of the simulation. Visualizations of these simulations make debugging and analysis easier, and many visualizers display a simulation in 2D. Those that work in 3D do not fully utilize 3D graphics operations to visualize a simulation. This thesis explores ways in which 3D graphics can be used to further enhance a visualization. To do this, it introduces a new network simulator and a visualizer, consisting of an analyzer, which collects statistics about a simulation, and a renderer, which leverages an existing program package for rendering.
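The parse-then-aggregate step described above can be sketched as follows. The log format, node names, and field order here are hypothetical, since the simulator in the thesis defines its own format; the shape of the task, reading event records and accumulating per-node statistics for the analyzer, is the same.

```python
from collections import Counter
from io import StringIO

# Hypothetical log format: one event per line as "time sender receiver bytes".
log = StringIO("""\
0.10 n1 n2 512
0.15 n2 n3 256
0.20 n1 n3 512
""")

# Aggregate bytes sent per node, the kind of statistic an analyzer collects.
sent = Counter()
for line in log:
    _, src, _, size = line.split()
    sent[src] += int(size)

print(sent.most_common(1))  # [('n1', 1024)]
```

A renderer could then map such per-node totals onto visual attributes, e.g. node size or color in the 3D scene.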
Стилі APA, Harvard, Vancouver, ISO та ін.
45

Betts, Loren Cole. "A new theoretical and experimental approach to nonlinear vector network and complex signal analysis." Thesis, University of Leeds, 2009. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.522925.

Повний текст джерела
Стилі APA, Harvard, Vancouver, ISO та ін.
46

Nahlik, Brady J. "On a Potential New Measurement of the Self-Concept." The Ohio State University, 2021. http://rave.ohiolink.edu/etdc/view?acc_num=osu1610092728212115.

Повний текст джерела
Стилі APA, Harvard, Vancouver, ISO та ін.
47

Porcheddu, Antonio. "The Ager valley historic landscape: new tools and quantitative analysis. Architecture and agrarian parcels in the medieval settlement dynamics." Doctoral thesis, Universitat de Lleida, 2017. http://hdl.handle.net/10803/436891.

Повний текст джерела
Анотація:
This work presents the results of doctoral research on the historic landscape of the Àger Valley, applying the methodology of Landscape Archaeology. Different scientific methods were used to obtain heterogeneous information: from remote sensing (lidar and radar), written documents, archaeological survey and excavation, to the analysis of architecture and the application of predictive-archaeology models (least cost path, site catchment analysis, viewshed analysis, etc.). The objectives consist in clarifying the dynamics of rural settlement in the Middle Ages, from the 5th to the 13th century, through the analysis of the settlements, the communication routes, the agrarian landscape, the defensive systems and the ecclesiastical structure. This study has shown that the landscape structure of the Àger Valley was generated in a first impulse between the 5th and 7th centuries and subsequently underwent a profound change from the end of the 10th century.
This work analyses the historic landscape of the Àger Valley (Spain) through multiple sources, such as written sources, remote sensing, archaeological sources, historic architecture, and analyses of the agrarian parcel system, the road network and the toponymy. The main objective is to analyse the periodisation of the medieval settlement, both in general and in detail, through the study of the fortified landscape and the sacred landscape within a chronological window running from the 5th to the 13th century. The principal elements of the available historic architecture were also analysed using the methods of the archaeology of architecture. A further methodological application was that of French Archéogéographie, through which it is possible to study the structures of the agrarian parcel system and the road network. All the data were synthesized to obtain an overall picture of the medieval settlement in the valley.
This work deals with the analysis of the Àger Valley historic landscape through the methodologies of Landscape Archaeology. It uses several multidisciplinary sources, such as written documents, remote sensing images (mainly lidar and radar), parcel analysis, and archaeological prospection and excavation. The main target has been obtaining the periodisation of the medieval settlement in the valley from the 5th to the 13th centuries. It analyses the structure of the agrarian parcel systems through the methodology developed in Archaeogeographic studies and in predictive archaeology (least cost path, viewshed analysis, site catchment analysis). It also applies the Archaeology of Architecture in order to study the material evidence of the churches and the defensive towers of the valley. After the analysis of the different sources, it develops a synthesis of the data following the chronological windows allowed by the sources. All the data have also been used to analyse the Landscape of Power and the Sacred Landscape in the valley during the Middle Ages.
Стилі APA, Harvard, Vancouver, ISO та ін.
48

Jacobs, Anna. "New tools for analysis of wood and pulp components : application of MALDI-TOF mass spectroscopy and capillary zone electrophoresis /." Stockholm, 2001. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-3228.

Повний текст джерела
Стилі APA, Harvard, Vancouver, ISO та ін.
49

Wallace, Robert. "Categories of Conceptions in Karlstad Community Classrooms : An analysis of educator interviews regarding new media technologies as teaching tools." Thesis, Karlstads universitet, Avdelningen för medie- och kommunikationsvetenskap, 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:kau:diva-15046.

Повний текст джерела
Анотація:
From the start of the autumn term in 2011, all new students in the municipal upper-secondary schools of Karlstad and Hammarö have had access to their own laptop, an initiative usually referred to as one-to-one. This situation opens up an opportunity to study how this technological change also affects teachers' experience of the shift and how teachers perceive that the technology changes everyday school life. During December 2011, 47 teachers of English were interviewed using a common interview guide with open questions about experiences, strategies and opinions on working with classes covered by the one-to-one initiative. The aim of the study was to categorise the teachers' conceptions and how they experience this new situation, using a phenomenographic method of analysis. The study takes its starting point in a sociocultural perspective in which learning is part of a social, communicative process. The results of the analysis indicate that the teachers' views of the new situation can be divided into three main categories: experience of the phenomenon through comparison; teaching strategies in relation to the phenomenon; and the phenomenon in practice.
Стилі APA, Harvard, Vancouver, ISO та ін.
50

Hewitson, Andrew. "TeleCities : a new geography of governance? : an institutional, relational and scalar analysis of a transnational network." Thesis, University of Hull, 2006. http://hydra.hull.ac.uk/resources/hull:13199.

Повний текст джерела
Анотація:
In recent years there has been a steady rise in the number of cities engaging in collaborative transnational networking. Perceived as a valid response to the threats of globalisation and the internationalisation of the economy, cities have formed partnerships that transcend the remit of their locality, granting them a new mobility. Predominantly focused at the European scale, individual cities are establishing transnational networks that aim to harness and aggregate isolated pockets of power into a powerful, cohesive institutional identity, allowing them a collective voice and a potential degree of influence within European governance structures. Using the TeleCities network as an empirical focus, this thesis aims to explore the spatial implications of this potential 'new geography of governance'. To do this, theories of institutionalism, reflexivity and scale are used to construct an analytical framework that explores the implications and processes of transnational networking. This is then applied to a three-way case study methodology that examines the process and attributes of transnational networking from a multi-scalar perspective. In doing so, the thesis provides a theoretical and empirical contextualisation of the origins, functionality and relationality of a transnational network and its ability to link actors and processes at different spatial scales.
Стилі APA, Harvard, Vancouver, ISO та ін.