
Dissertations / Theses on the topic 'Analysis pipeline'



Consult the top 50 dissertations / theses for your research on the topic 'Analysis pipeline.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse dissertations / theses in a wide variety of disciplines and organise your bibliography correctly.

1

Vervik, Stian. "Pipeline Accidental Load Analysis." Thesis, Norges teknisk-naturvitenskapelige universitet, Institutt for marin teknikk, 2011. http://urn.kb.se/resolve?urn=urn:nbn:no:ntnu:diva-15453.

Abstract:
Ship interaction in terms of anchor hooking on a subsea pipeline has been investigated in this thesis. An attempt has been made to predict the most probable anchor interaction loads on the Kvitebjørn gas pipeline in the North Sea if anchor hooking were to occur, and to evaluate the structural consequences of an anchor hooking incident. Using AIS ship data provided by the Norwegian Coastal Administration, it has been found that 7160 cargo, tanker and tug ships passed the Kvitebjørn gas pipeline from March 2010 to March 2011. These ships have been evaluated with respect to anchor equipment, ship mass and velocity by means of a script developed in MATLAB. Geometrical evaluations of anchor hooking show that anchors above 3780 kg have large enough dimensions to hook the pipeline. Anchor tow depth analyses predict that the stabilization depth of a towed anchor arrangement is about 1/3 of the chain length for velocities around 15 knots. The geometrical evaluations and the tow depth analyses have been included in the script, and ships not able to hook the pipeline have been screened out. Results predict that 237 of the 7160 evaluated cargo ships, tankers and tugboats possess the necessary hooking parameters: an anchor large enough for the pipeline to get stuck, and a ship velocity low enough that the anchor will touch the seabed if dropped. Ship traffic has been found to be heaviest over pipeline sections with a water depth of around 300 meters. Because of this large water depth, only ships with large anchors, around 10 tonnes and above, will be able to touch down on the seabed. The most frequently observed anchor equipment and velocities of the ships found able to hook the pipeline have been determined. In order to predict the structural consequences of anchor hooking, a model has been developed in the computer code SIMLA. The most frequently observed anchor equipment and ship velocities from the AIS studies have been included in the SIMLA analyses. Results from the analysis predict that very large strain levels will occur as a result of anchor hooking. Strains have been found to exceed the design strains for interaction with the most frequently observed anchor systems, and the pipeline would need extensive repairs because the plastic capacity of the cross section is utilized beyond the capacity corresponding to the Specified Minimum Tensile Strength (SMTS).
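To make the screening logic concrete, the following is a minimal Python sketch of the two criteria quoted above (the 3780 kg hooking limit and a tow depth of about one third of the chain length). The field names and example ships are hypothetical; the thesis's actual MATLAB script is not reproduced here.

```python
# Illustrative AIS screening: keep only ships whose anchors are heavy
# enough to hook the pipeline and whose towed anchor can reach the seabed.
def can_hook(ship, water_depth_m):
    """True if the ship passes both screening criteria from the abstract."""
    heavy_enough = ship["anchor_mass_kg"] > 3780.0
    # A towed anchor is assumed to stabilise at about 1/3 of the paid-out
    # chain length; it reaches the seabed only if that exceeds the depth.
    touches_seabed = ship["chain_length_m"] / 3.0 >= water_depth_m
    return heavy_enough and touches_seabed

ships = [
    {"name": "cargo A", "anchor_mass_kg": 10500, "chain_length_m": 990},
    {"name": "tug B",   "anchor_mass_kg": 2100,  "chain_length_m": 500},
]
print([s["name"] for s in ships if can_hook(s, water_depth_m=300)])  # ['cargo A']
```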
2

Chamorro, Alexander. "The analysis of pipeline leak tests using DEGADIS model." Morgantown, W. Va. : [West Virginia University Libraries], 1999. http://etd.wvu.edu/templates/showETD.cfm?recnum=913.

Abstract:
Thesis (M.S.)--West Virginia University, 1999.
Title from document title page. Document formatted into pages; contains ix, 123 p. : ill. (some col.), maps. Includes abstract. Includes bibliographical references (p. 64-65).
3

Mozere, M. "High-throughput sequencing analysis pipeline." Thesis, University College London (University of London), 2016. http://discovery.ucl.ac.uk/1528797/.

Abstract:
High-throughput sequencing methods were developed to increase the productivity of processing data from genomic DNA. Sequencing platforms are generating massive amounts of genetic variation data, which makes it difficult to pinpoint a small subset of functionally important variants. The focus has now shifted from generating sequences to searching for the critical differences that separate normal variants from disease ones. Our High-throughput Sequencing Analysis Pipeline (HSAP) is a multistep analysis software package designed to annotate and filter variants in a top-down fashion from Variant Call Format (VCF) files in order to find disease-causing variants in patients. It is designed for the Linux environment and is composed of a collection of interacting task-specific modules written in different programming languages (such as Python and C++) and shell scripts. Each module is designed to perform a specific task, such as: annotate variants with their functional characterisation, zygosity status, and allele frequencies within populations; or filter variants depending on the inherited disease model, read depth, call quality, physical location and other criteria. The output is added to the universal VCF format file, which contains the annotated and filtered genomic variants. The pipeline was verified by identifying/confirming a specific disease-causing mutation for a single-gene disorder. HSAP is designed as open-source, locally self-contained, bootable software that uses only information from publicly available databases. It has a user-friendly offline web interface that allows the user to select different modules and chain them together to create unique filtering arrangements, adapting the pipeline as needed.
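As a concrete illustration of the module-chaining idea, here is a minimal sketch of top-down VCF filtering in Python. It assumes the common DP (read depth) and AF (allele frequency) INFO fields; it illustrates the approach only and is not HSAP's actual code.

```python
# Each "module" is a predicate applied in sequence to an annotated record.
def parse_info(info_field):
    """Turn a VCF INFO string like 'DP=35;AF=0.0001' into a dict."""
    return dict(kv.split("=", 1) for kv in info_field.split(";") if "=" in kv)

def deep_enough(rec):  return int(rec["info"].get("DP", 0)) >= 20
def rare_enough(rec):  return float(rec["info"].get("AF", 1.0)) < 0.01

record = {"chrom": "1", "pos": 12345, "info": parse_info("DP=35;AF=0.0001")}

filters = [deep_enough, rare_enough]      # chainable filtering "modules"
if all(f(record) for f in filters):
    print("candidate variant:", record["chrom"], record["pos"])
```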
4

Donaldson, Val. "Asynchronous pipeline analysis and scheduling /." Diss., Connect to a 24 p. preview or request complete full text in PDF format. Access restricted to UC campuses, 1997. http://wwwlib.umi.com/cr/ucsd/fullcit?p9804026.

5

Salahifar, Raydin. "Analysis of Pipeline Systems Under Harmonic Forces." Thesis, Université d'Ottawa / University of Ottawa, 2011. http://hdl.handle.net/10393/19820.

Abstract:
Starting with tensor calculus and the variational form of the Hamiltonian functional, a generalized theory is formulated for doubly curved thin shells. The formulation avoids geometric approximations commonly adopted in other formulations. The theory is then specialized for cylindrical and toroidal shells as special cases, both of interest in the modeling of straight and elbow segments of pipeline systems. Since the treatment avoids geometric approximations, the cylindrical shell theory is believed to be more accurate than others reported in the literature. By adopting a set of consistent geometric approximations, the present theory is shown to revert to the well-known Flügge shell theory. Another set of consistent geometric approximations is shown to lead to the Donnell-Mushtari-Vlasov (DMV) theory. A general closed-form solution of the theory is developed for cylinders under general harmonic loads. The solution is then used to formulate a family of exact shape functions, which are subsequently used to formulate a super-convergent finite element. The formulation efficiently and accurately captures ovalization, warping, radial expansion, and other shell behavioural modes under general static or harmonic forces, either in-phase or out-of-phase. Comparisons with shell solutions available in Abaqus demonstrate the validity of the formulation and the accuracy of its predictions. The generalized thin shell theory is then specialized for toroidal shells. Consistent sets of approximations lead to three simplified theories for toroidal shells. The first set of approximations leads to a theory comparable to that of Sanders, while the second set of approximations leads to a theory nearly identical to the DMV theory for toroidal shells. A closed-form solution is then obtained for the governing equation. Exact shape functions are then developed and subsequently used to formulate a finite element. Comparisons with Abaqus solutions show the validity of the formulation for short elbow segments under a variety of loading conditions. Because of their efficiency, the finite elements developed are particularly suited for the analysis of long pipeline systems.
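For reference, the variational starting point mentioned above is Hamilton's principle, shown here in its generic form (the thesis's specific shell strain-energy terms are not reproduced):

```latex
% T: kinetic energy, U: shell strain energy, W: work of external loads;
% displacements are varied between fixed configurations at t_1 and t_2.
\delta \int_{t_1}^{t_2} \left( T - U + W \right) \mathrm{d}t = 0
```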
6

Hwisu, Shin. "Stochastic Analysis For Water Pipeline System Management." 京都大学 (Kyoto University), 2015. http://hdl.handle.net/2433/202696.

7

Sheblee, Jafer Sadeg. "A hierarchical pipeline processor for image analysis." Thesis, University of Newcastle Upon Tyne, 1990. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.239614.

8

Wang, Yunxiao. "One-way buckling analysis of pipeline liners." Thesis, University College London (University of London), 2003. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.404888.

9

Jittamai, Phongchai. "Analysis of oil-pipeline distribution of multiple products subject to delivery time-windows." Texas A&M University, 2004. http://hdl.handle.net/1969.1/3112.

Abstract:
This dissertation defines the operational problems of, and develops solution methodologies for, the distribution of multiple products in an oil pipeline subject to delivery time-window constraints. A multiple-product oil pipeline is a pipeline system composed of pipes, pumps, valves and storage facilities used to transport different types of liquids. Typically, products delivered by pipelines are petroleum of different grades moving either from production facilities to refineries or from refineries to distributors. Time-windows, which are widely used in logistics and scheduling, are incorporated in this study. The distribution of multiple products in an oil pipeline subject to delivery time-windows is modeled as a multicommodity network flow structure and formulated mathematically. The main focus of this dissertation is the investigation of the operating issues and problem complexity of single-source pipeline problems, and the development of a solution methodology to compute an input schedule that yields the minimum total time violation of the due delivery time-windows. The problem is proved to be NP-complete. A heuristic approach, a reversed-flow algorithm, is developed based on pipeline flow reversibility to compute the input schedule for the pipeline problem. This algorithm runs in no more than O(T*E) time. The dissertation also extends the study to examine some operating attributes and the problem complexity of multiple-source pipelines. The multiple-source pipeline problem is also NP-complete. A heuristic algorithm modified from the one used in single-source pipeline problems is introduced; it too runs in no more than O(T*E) time. Computational results are presented for both methodologies on randomly generated problem sets. The computational experience indicates that the reversed-flow algorithms provide good solutions in comparison with the optimal solutions: only 25% of the problems tested were more than 30% above the optimal values, and approximately 40% of the tested problems were solved optimally by the algorithms.
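The objective the heuristics minimize, the total violation of the delivery time-windows, can be stated in a few lines of Python. The schedule below is hypothetical, and the reversed-flow algorithm itself is not reproduced.

```python
def total_violation(deliveries):
    """Sum how far each delivery falls outside its [earliest, latest] window."""
    total = 0
    for arrival, earliest, latest in deliveries:
        if arrival < earliest:
            total += earliest - arrival   # early by this much
        elif arrival > latest:
            total += arrival - latest     # late by this much
    return total

schedule = [(5, 4, 8), (12, 6, 10), (3, 3, 5)]  # (arrival, earliest, latest)
print(total_violation(schedule))                 # -> 2 (one late delivery)
```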
10

Tinston, S. F. "Fracture toughness of mechanised pipeline girth welds." Thesis, University of Salford, 1988. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.381698.

11

Thornton, David J. "Finite element analysis of fibre-reinforced composite pipeline." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1999. http://www.collectionscanada.ca/obj/s4/f2/dsk1/tape8/PQDD_0016/MQ47106.pdf.

12

Trille, Christophe. "Reliability analysis for subsea pipeline cathodic protection systems." Thesis, Cranfield University, 1996. http://dspace.lib.cranfield.ac.uk/handle/1826/12142.

Abstract:
Subsea pipelines, as the main means of transporting oil and gas produced offshore, are a key element of the production system. Cathodic protection systems (CPS) are used in combination with surface coatings to protect the pipeline from external corrosion. Although cases of pipeline failure due to external corrosion remain rare, such failures can have catastrophic effects in terms of human lives, environmental degradation and financial losses. The offshore industry was led to the use of risk analysis techniques by major disasters such as Piper Alpha and Alexander L. Kielland. These accidents made the development and use of risk analysis techniques highly significant, and reliability analysis is becoming an important management tool in that field for determining the reliability of components such as pipelines, subsea valves and offshore structures. This research is based on an analysis of subsea pipeline cathodic protection systems and on a model of the electrochemical potentials at the pipeline surface. This potential model uses finite element modelling techniques and integrates probabilistic modules to take into account uncertainties in the input parameters. The uncertainties are used to calculate standard deviations on the potential values. Based on the potentials and potential variances obtained, several parameters characteristic of the cathodic protection system reliability, such as probability of failure and time to failure, are calculated. The model developed proved suitable for simulating any pipeline under any environmental and operational conditions. It was used as a reliability prediction tool and to assess the effects of some parameters on the cathodic protection system reliability.
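A first-order sketch of how a probability of failure can follow from a potential mean and standard deviation: if the finite element model gives the potential at a point and its uncertainty, under-protection is the chance that the potential is more positive than the protection criterion. The criterion and numbers below are illustrative, not the thesis's values.

```python
from statistics import NormalDist

def prob_underprotected(mean_potential_V, std_V, criterion_V=-0.80):
    # Failure when the actual potential is more positive (less negative)
    # than the protection criterion, assuming a normal distribution.
    return 1.0 - NormalDist(mean_potential_V, std_V).cdf(criterion_V)

print(prob_underprotected(-0.85, 0.03))   # ~0.048 for these assumptions
```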
13

Williams, R. A. "A pipeline for the analysis of stellar spectra." Thesis, Liverpool John Moores University, 2018. http://researchonline.ljmu.ac.uk/8704/.

Abstract:
Understanding the formation and evolution of our galaxy, the Milky Way, has been an ongoing process which, with the development of large-scale surveys, has picked up considerable pace. Together with these new surveys, pipelines have been constructed which allow for the rapid and automatic processing of this wealth of new data. These codes are able to turn raw data files into tables of stellar parameters and chemical abundances in far less time than if they were analysed by hand. The results from these surveys open new windows onto the history of our galaxy and other disk galaxies. In this thesis, we present the development of a new pipeline, the STellAR Parameter AND Abundances pipeline (STARPANDA), which is able to rapidly derive stellar parameters, CNO abundances and other elemental abundances by utilising measurements of spectral features in both observed and synthetic spectra. We take the observed spectra, synthetic spectra and line lists employed by the APOGEE survey and produce new values for the stellar parameters, CNO abundances and Al abundances of the APOGEE stars. We then compare our results with those achieved by the APOGEE pipeline.
14

Paulo, Gomes Neto Severino. "MARRT Pipeline: Pipeline for Markerless Augmented Reality Systems Based on Real-Time Structure from Motion." Universidade Federal de Pernambuco, 2009. https://repositorio.ufpe.br/handle/123456789/2012.

Abstract:
Today, with the increase in computational power and the advances in usability studies, real-time systems and photorealism, the requirements of any computer system are more complex and sophisticated. Augmented reality systems are no exception in their attempt to solve real-life user problems with a reduced level of risk, time spent, or learning complexity. Such systems can be classified as marker-based or markerless. The essential role of markerless augmented reality is to avoid the unnecessary and undesirable use of markers in applications. To meet the demand for robust, non-intrusive augmented reality technologies, this dissertation proposes an execution pipeline for the development of markerless augmented reality applications, especially those based on real-time structure-from-motion techniques.
15

Toscano, Giovanni. "Risk analysis of the transport of ammonia by pipeline." Master's thesis, Alma Mater Studiorum - Università di Bologna, 2018.

Abstract:
The thesis work to which this abstract refers was carried out during a stay at the Universitat Politècnica de Catalunya in Barcelona, within the Erasmus+ Study mobility programme, under the supervision of Prof. Joaquim Casal and Eng. Elsa Pastor. The thesis concerns the pipeline transport of hazardous substances, with particular reference to the transport of anhydrous ammonia. In this context, the main objectives of the work were to gather information on the pipeline transport of chemicals, to collect historical data on accidents that have occurred in the pipeline transport of ammonia and analyse them statistically, and to set up the calculation of the local risk generated by a pipeline considered as a case study. The thesis is structured as follows: Chapter 1 discusses the major-accident risk generated by the process industry; Chapter 2 is devoted to the pipeline transport of hazardous substances; Chapter 3 considers pipeline safety issues, also with reference to transport by road, rail and water; Chapter 4 describes the historical analysis of accidents in the pipeline transport of ammonia; Chapter 5 illustrates the different phases of the quantitative analysis of the risk generated by pipelines; Chapter 6 sets up the risk analysis of a pipeline considered as a case study; finally, Chapter 7 reports some concluding remarks. Some results obtained in this thesis work will be presented at the eighth edition of EMChiE - European Meeting on Chemical Industry and Environment, to be held in Nantes in June 2018.
16

McCallum, Katie Arlene. "Probabilistic Analysis of Pipeline Reliability Using a Markov Process." University of Akron / OhioLINK, 2012. http://rave.ohiolink.edu/etdc/view?acc_num=akron1334932206.

17

Evans, Daniel T. "A SNP Microarray Analysis Pipeline Using Machine Learning Techniques." Ohio University / OhioLINK, 2010. http://rave.ohiolink.edu/etdc/view?acc_num=ohiou1289950347.

18

Nugroho, Wibowo Harso 1967. "Monitoring of pipeline using smart sensors." Monash University, Dept. of Mechanical Engineering, 2001. http://arrow.monash.edu.au/hdl/1959.1/9236.

19

Patel, Pruthvi Shaileshkumar. "Methodology to Enhance the Reliability of Drinking Water Pipeline Performance Analysis." Thesis, Virginia Tech, 2018. http://hdl.handle.net/10919/84401.

Abstract:
Currently, water utilities face monetary crises in maintaining and expanding services to meet current as well as future demands. Standard practice in pipeline infrastructure asset management is to collect data and predict the condition of pipelines using models and tools. Water utilities want to be proactive in fixing or replacing pipes, as a fix-when-it-fails ideology leads to increased cost and can affect environmental quality and societal health. There are a number of modeling techniques available for assessing the condition of pipelines, but there is a massive shortage of methods to check the reliability of the results obtained using different modeling techniques. This is mainly because of the limited data any one utility collects and the absence of piloting of these models at various water utilities. In general, water utilities feel confident about their in-house condition prediction and failure models, but are willing to adopt a reliable methodology that can overcome the issues related to validation of the results. This thesis presents a methodology that can enhance the reliability of model results for water pipeline performance analysis, so that they can be used to parallel the output of the real system with confidence. The proposed methodology was tested using datasets from two large water utilities, and it was found that it can potentially help water utilities gain confidence in their analysis results by establishing their statistical significance.
20

Hasslund, Mikael. "Analysis and improvement of tools used in a game-pipeline." Thesis, Linköping University, Department of Science and Technology, 2008. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-12548.

Abstract:

Tools development is a sometimes overlooked process that is crucial for developing high-quality games. The experience of the users will vary, which makes usability an important issue, as does having good guidelines for communication and the development of user interfaces. This report presents the development process of two different tools used in the production of high-quality projects at Avalanche Studios, and describes both the added functionality and the methods used to provide it.

21

Ivanov, Oleg 1970. "MRI brain analysis testbed (BAT) : methodology and automatic validation pipeline." Thesis, McGill University, 2005. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=98973.

Abstract:
Magnetic Resonance Imaging (MRI) is extensively used in brain imaging research and clinical diagnostics. Increasingly, automated image processing algorithms are used for identification of tissue types within the image, such as gray matter, white matter and cerebrospinal fluid. There is a wide range of algorithms, which vary in speed and accuracy, and it is often difficult to compare their performance in an objective and controlled fashion. The goal of this research was to design an automatic, generic, standard, extensible pipeline for objective and quantitative validation of MRI tissue classification algorithms and their processing pipelines. The main requirements for objective validation of different algorithms are the use of common terminology and methodology, standard validation data sets, corresponding ground truth, validation metrics and a statistical foundation. Based on those requirements, an automatic Brain Analysis Testbed (BAT) was developed to determine an objective evaluation score for an MRI processing method. BAT supports Montreal Neurological Institute on-site or off-site processing of MRI data, accessible through a web interface (http://www.bic.mni.mcgill.ca/validation/). Validation results are stored permanently in the BAT database, allowing the comparison of newly developed processing methods with existing ones. Furthermore, BAT can be used to determine the optimal classification parameters, or the best classifier algorithm for a specific MRI classification purpose, simply by searching the BAT database. The main purposes and principles of BAT are demonstrated with some practical MRI processing examples.
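One standard validation metric a testbed like this can compute against ground truth is the Dice overlap between a classification result and a labelled phantom. The sketch below is illustrative and does not reproduce BAT's actual metric set.

```python
def dice(labels_a, labels_b, tissue):
    """Dice overlap for one tissue class between two label arrays."""
    a = {i for i, v in enumerate(labels_a) if v == tissue}
    b = {i for i, v in enumerate(labels_b) if v == tissue}
    return 2 * len(a & b) / (len(a) + len(b)) if (a or b) else 1.0

truth  = ["GM", "GM", "WM", "CSF", "WM"]    # ground-truth voxel labels
result = ["GM", "WM", "WM", "CSF", "WM"]    # classifier output
print(round(dice(truth, result, "GM"), 3))  # -> 0.667
```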
22

VEIGA, JORDANA LUIZA BARBOSA DA COSTA. "ANALYSIS OF ACCEPTANCE CRITERIA OF WRINKLES IN PIPELINE COLD BENDS." PONTIFÍCIA UNIVERSIDADE CATÓLICA DO RIO DE JANEIRO, 2008. http://www.maxwell.vrac.puc-rio.br/Busca_etds.php?strSecao=resultado&nrSeq=13451@1.

Abstract:
Large-diameter, thin-walled, high-strength steel pipe has a tendency to wrinkle on the compressive side (the intrados) of the bend when it is cold bent. This dissertation describes the principal national and international design codes that apply to wrinkling resulting from pipe cold bending, and it proposes an angle at which such wrinkling occurs. The international codes prove to be conservative regarding wrinkling in bent pipe, since a wrinkle is a geometric change which, in principle, causes a stress concentration and susceptibility to fatigue failure. The dissertation uses the finite element method to model the formation of wrinkling and to determine stress concentration factors in these areas for internal pressure loading. The resulting factors are compared with results found in the literature and are used to calculate fatigue life by means of the following methods: Markl, Manson's universal slopes, and ASME Section VIII Division 2. In this study, API X70 structural pipe with a diameter-to-thickness (D/t) ratio varying from 20 to 100 was modeled using Abaqus software. Bends with an angle of 4°/diameter and severe wrinkling resulted, with a ratio between the height of the wrinkle and the diameter of the pipe (d/D) of about 6.5% and a stress concentration factor reaching 1.89. The wrinkled pipe models did not reveal any lack of mechanical resistance to the applied internal pressure when it is sufficient to produce a nominal circumferential stress equivalent to 100% of the yield strength of the material. The fatigue life results varied according to the method utilized, but all displayed a reduction in pipe life in the presence of severe wrinkling. The study proposes a conservative procedure for calculating fatigue life which associates the stress concentration factor determined by Rosenfeld with the fatigue calculation method recommended by the ASME VIII code. Furthermore, the study suggests that new analyses be performed in order to consider the Bauschinger effect and the bending instability of the model, which the present study did not evaluate.
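To see why the reported stress concentration factor matters, a generic power-law S-N relation shows how it shortens fatigue life. The constants below are placeholders, not the Markl or ASME VIII curve parameters used in the dissertation.

```python
def cycles_to_failure(stress_range_MPa, scf=1.0, C=1.0e12, m=3.0):
    """Basquin-type relation: N = C / (SCF * S)**m, with placeholder C, m."""
    return C / (scf * stress_range_MPa) ** m

plain    = cycles_to_failure(100.0)             # smooth pipe
wrinkled = cycles_to_failure(100.0, scf=1.89)   # SCF reported above
print(round(plain / wrinkled, 2))               # life ratio = 1.89**3 = 6.75
```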
23

PRADO, JOHN STEVEN CASTELLANOS. "ANALYSIS PIPELINE COMPONENTS WITH METAL LOSS REPAIRED WITH COMPOSITE MATERIALS." PONTIFÍCIA UNIVERSIDADE CATÓLICA DO RIO DE JANEIRO, 2014. http://www.maxwell.vrac.puc-rio.br/Busca_etds.php?strSecao=resultado&nrSeq=24562@1.

Abstract:
This dissertation investigates four geometries of pipe components that operate under internal pressure and contain metal loss defects in areas of their external surface. These components were repaired with two composite material systems, consisting of epoxy resin reinforced by glass fiber (ERFV) and epoxy resin reinforced by carbon fiber (ERFC). The following tubular specimens were studied: components with concentric diameter reducers, hot-bent short-radius elbow components, cold-bent long-radius elbow components and T-components. The finite element method was used to simulate the behavior of the specimens when submitted to hydrostatic tests. The simulations took into consideration the behavior of the API 5L Grade B pipe material in its elastic and plastic regions, and investigated the increase in the rupture strength of the defective pipes provided by the presence of the composite repairs. The results obtained with the finite element models, used to determine strain distributions and burst pressures, were compared with the experimental results available for each pipe component, and the comparisons were satisfactory. Subsequently, simple analytical equations were developed to determine optimized thicknesses for the composite repair systems, and the results obtained from these equations were compared with the results determined with the finite element models and with other results obtained with equations recommended by relevant international standards. In conclusion, the finite element method used in this work satisfactorily simulated the behavior of the selected pipe components with metal loss defects that were hydrostatically tested and repaired with composite materials. The numerical results helped to validate simple analytical equations that can be used to determine the optimized thicknesses of repair systems with composite materials.
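As a rough illustration of what such thickness equations balance, here is a simplified hoop-load-sharing estimate: the composite must carry the part of the hoop load that the corroded steel wall can no longer take. The allowable stresses and dimensions are illustrative, and the dissertation's own optimized equations are not reproduced.

```python
def repair_thickness_mm(P, D, t_steel, sigma_steel, sigma_comp):
    hoop_load = P * D / 2.0                # hoop force per unit length, N/mm
    steel_share = sigma_steel * t_steel    # what the remaining steel may carry
    deficit = max(0.0, hoop_load - steel_share)
    return deficit / sigma_comp            # wall the composite must add

# 10 MPa pressure, 323.9 mm pipe, 5 mm remaining wall (illustrative):
print(round(repair_thickness_mm(10.0, 323.9, 5.0, 240.0, 150.0), 2))  # ~2.8
```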
24

Fahimi, Marcus. "Analysis of system models for product pipeline planning and measurement." Thesis, Massachusetts Institute of Technology, 1995. http://hdl.handle.net/1721.1/11116.

25

Coco, Joseph. "PARSES: A Pipeline for Analysis of RNA-Sequencing Exogenous Sequences." ScholarWorks@UNO, 2011. http://scholarworks.uno.edu/td/1297.

Abstract:
RNA-Sequencing (RNA-Seq) has become one of the most widely used techniques to interrogate the transcriptome of an organism since the advent of next generation sequencing technologies [1]. A plethora of tools have been developed to analyze and visualize the transcriptome data from RNA-Seq experiments, solving the problem of mapping reads back to the host organism's genome [2] [3]. This allows for analysis of most reads produced by the experiments, but these tools typically discard reads that do not match well with the reference genome. This additional information could reveal important insight into the experiment and possible contributing factors to the condition under consideration. We introduce PARSES, a pipeline constructed from existing sequence analysis tools, which allows the user to interrogate RNA-Sequencing experiments for possible biological contamination or the presence of exogenous sequences that may shed light on other factors influencing an organism's condition.
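The starting point of such a pipeline is recovering the reads that did not map to the host genome, since those may contain exogenous sequence. A minimal sketch with pysam follows (the file names are hypothetical); PARSES itself chains existing tools rather than using this exact code.

```python
import pysam

# Write every unmapped read (SAM flag 0x4) out as FASTA for downstream
# classification against non-host databases.
with pysam.AlignmentFile("host_alignment.bam", "rb") as bam, \
        open("unmapped.fa", "w") as out:
    for read in bam.fetch(until_eof=True):
        if read.is_unmapped:
            out.write(f">{read.query_name}\n{read.query_sequence}\n")
```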
26

Li, Tianyi. "Solving Mysteries with Crowds: Supporting Crowdsourced Sensemaking with a Modularized Pipeline and Context Slices." Diss., Virginia Tech, 2020. http://hdl.handle.net/10919/99937.

Abstract:
The increasing volume and complexity of text data are challenging the cognitive capabilities of expert analysts. Machine learning and crowdsourcing present new opportunities for large-scale sensemaking, but it remains a challenge to model the overall process so that many distributed agents can contribute to suitable components asynchronously and meaningfully. In this work, I explore how to crowdsource sensemaking for intelligence analysis. Specifically, I focus on the complex processes that include developing hypotheses and theories from a raw dataset and iteratively refining the analysis. I first developed Connect the Dots, a web application that implements the concept of "context slices" and supports novice crowds in building relationship networks for exploratory analysis. Then I developed CrowdIA, a software platform that implements the entire crowd sensemaking pipeline and the context slicing for each step, to enable unsupervised crowd sensemaking. Using the pipeline as a testbed, I probed the errors and bottlenecks in crowdsourced sensemaking, and suggested design recommendations for integrated crowdsourcing systems. Building on these insights and to support iterative crowd sensemaking, I developed the concept of "crowd auditing", in which an auditor examines a pipeline of crowd analyses and diagnoses the problems to steer future refinement. I explored the design space to support crowd auditing and developed CrowdTrace, a crowd auditing tool that enables novice auditors to effectively identify the important problems with the crowd analysis and create microtasks for crowd workers to fix the problems. The core contributions of this work include a pipeline that enables distributed crowd collaboration for holistic sensemaking processes, the two novel concepts of "context slices" and "crowd auditing", web applications that support crowd sensemaking and auditing, as well as design implications for crowd sensemaking systems. The hope is that the crowd sensemaking pipeline can serve to accelerate research on sensemaking, and contribute to helping people conduct in-depth investigations of large collections of information.
In today's world, we have access to large amounts of data that provide opportunities to solve problems at unprecedented depths and scales. While machine learning offers powerful capabilities to support data analysis, extracting meaning from raw data is cognitively demanding and requires significant person-power. Crowdsourcing aggregates human intelligence, yet it remains a challenge for many distributed agents to collaborate asynchronously and meaningfully. The contribution of this work is to explore how to use crowdsourcing to make sense of copious and complex data. I first implemented the concept of "context slices", which split up complex sensemaking tasks by context, to support meaningful division of work. I developed a web application, Connect the Dots, which generates relationship networks from text documents with crowdsourcing and context slices. Then I developed a crowd sensemaking pipeline based on the expert sensemaking process. I implemented the pipeline as a web platform, CrowdIA, which guides crowds to solve mysteries without expert intervention. Using the pipeline as a testbed, I probed the errors and bottlenecks in crowd sensemaking and provided design recommendations for crowd intelligence systems. Finally, I introduced the concept of "crowd auditing", in which an auditor examines a pipeline of crowd analyses and diagnoses the problems to steer a top-down path of the pipeline and refine the crowd analysis. The hope is that the crowd sensemaking pipeline can serve to accelerate research on sensemaking, and contribute to helping people conduct in-depth investigations of large collections of data.
27

Hanson, Niels William. "MetaPathways : a modular pipeline for the analysis of environmental sequence information." Thesis, University of British Columbia, 2015. http://hdl.handle.net/2429/52845.

Abstract:
The lack of cultivated reference strains for the majority of naturally occurring microorganisms has led to the development of plurality sequencing methods and the field of metagenomics, offering a glimpse into the genomes of this so-called 'microbial dark matter' (MDM). An explosion of sequencing initiatives has followed, attempting to capture and extract biological meaning from MDM across a wide range of ecosystems, from deep-sea vents and polar seas to waste-water bioreactors and human beings. Current analytic approaches focus on taxonomic structure and metabolic potential through a combination of phylogenetic anchor screening of the small subunit ribosomal RNA gene (SSU or 16S rRNA) and general sequence searches using homology-based inference. Though much has been learned about microbial diversity and metabolic potential within natural and engineered ecosystems using these approaches, they are insufficient to resolve the ecological relationships that couple nutrient and energy flow between community members - ultimately translating into ecosystem functions and services. This shortcoming arises from a combination of data-intensive challenges presented by environmental sequence information that span processing, integration, and interpretation steps, and a general lack of robust statistical and analytical methods to directly address these problems. This dissertation addresses some of these shortcomings through the development of a modular analytical pipeline, MetaPathways, allowing for the large-scale and systematic processing and integration of many forms of environmental sequence information. MetaPathways is built to scale, comparing hundreds of metagenomic samples through the efficient use of data structures, grid compute models, and interactive data query. Moreover, it attempts to bring functional analysis back to the metabolic map through the creation of environmental pathway/genome databases (ePGDBs), adopting the Pathway Tools software for metabolic pathway prediction on the MetaCyc encyclopedia of genes and genomes. ePGDBs and the pathway-centric approach are validated to provide known and novel insights into community structure and function. Finally, novel taxonomic and metabolic methods supporting the pathway-centric model are derived and demonstrated, and enhance Pathway Tools as a framework for engineering microbial communities and consortia.
28

Bergström, Simon, and Oscar Ivarsson. "Automation of a Data Analysis Pipeline for High-content Screening Data." Thesis, Linköpings universitet, Medie- och Informationsteknik, 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-122913.

Abstract:
High-content screening is a part of the drug discovery pipeline dealing with the identification of substances that affect cells in a desired manner. Biological assays with a large set of compounds are developed and screened, and the output is generated with a multidimensional structure. Data analysis is performed manually by an expert with a set of tools, which is considered too time-consuming and unmanageable when the amount of data grows large. This thesis therefore investigates and proposes a way of automating the data analysis phase through a set of machine learning algorithms. The resulting implementation is a cloud-based application that can support the user in selecting which features are relevant for further analysis. It also provides techniques for automated processing of the dataset and for training classification models which can be used to predict sample labels. An investigation of the data analysis workflow was conducted before this thesis. It resulted in a pipeline that maps the different tools and software to the goals they fulfil and the purposes they serve for the user. This pipeline was then compared with a similar pipeline that includes the implemented application. The comparison demonstrates clear advantages over previous methodologies, in that the application supports a more automated way of performing data analysis.
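A minimal sketch of the automated steps described, feature selection followed by classifier training and label prediction, using scikit-learn on synthetic data. The application's actual algorithms and parameters are not specified in the abstract, so everything below is illustrative.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

# Synthetic stand-in for multidimensional screening output.
X, y = make_classification(n_samples=300, n_features=50, n_informative=8,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

model = make_pipeline(SelectKBest(f_classif, k=10),            # pick features
                      RandomForestClassifier(random_state=0))  # classify
model.fit(X_tr, y_tr)
print("held-out accuracy:", round(model.score(X_te, y_te), 2))
```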
29

Hood, Rachael Lucille. "“Don't frack with us!” An analysis of two anti-pipeline movements." Oberlin College Honors Theses / OhioLINK, 2020. http://rave.ohiolink.edu/etdc/view?acc_num=oberlin1594488329200428.

30

García, Pérez Ana E., Carlos Allende Prieto, Jon A. Holtzman, Matthew Shetrone, Szabolcs Mészáros, Dmitry Bizyaev, Ricardo Carrera, et al. "ASPCAP: THE APOGEE STELLAR PARAMETER AND CHEMICAL ABUNDANCES PIPELINE." IOP PUBLISHING LTD, 2016. http://hdl.handle.net/10150/621372.

Abstract:
The Apache Point Observatory Galactic Evolution Experiment (APOGEE) has built the largest moderately high-resolution (R ≈ 22,500) spectroscopic map of the stars across the Milky Way, including dust-obscured areas. The APOGEE Stellar Parameter and Chemical Abundances Pipeline (ASPCAP) is the software developed for the automated analysis of these spectra. ASPCAP determines atmospheric parameters and chemical abundances from observed spectra by comparing observed spectra to libraries of theoretical spectra, using χ² minimization in a multidimensional parameter space. The package consists of a FORTRAN90 code that does the actual minimization and a wrapper IDL code for book-keeping and data handling. This paper explains in detail the ASPCAP components and functionality, and presents results from a number of tests designed to check its performance. ASPCAP provides stellar effective temperatures, surface gravities, and metallicities precise to 2%, 0.1 dex, and 0.05 dex, respectively, for most APOGEE stars, which are predominantly giants. It also provides abundances for up to 15 chemical elements with various levels of precision, typically under 0.1 dex. The final data release (DR12) of the Sloan Digital Sky Survey III contains an APOGEE database of more than 150,000 stars. ASPCAP development continues in the SDSS-IV APOGEE-2 survey.
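In miniature, the fitting step works as follows: compute χ² between the observed spectrum and each library spectrum, and keep the grid point that minimizes it. The toy line profile, one-parameter grid and noise level below are stand-ins, not APOGEE data or the ASPCAP FORTRAN90 code.

```python
import numpy as np

rng = np.random.default_rng(0)
wave = np.linspace(15100, 16900, 500)            # wavelength grid, Angstrom
grid_teff = np.array([4500.0, 5000.0, 5500.0])   # toy 1-D "library" in Teff

def synth(teff):
    """Toy synthetic spectrum: one line whose depth varies with Teff."""
    return 1.0 - (teff / 10000.0) * np.exp(-0.5 * ((wave - 16000) / 2.0) ** 2)

sigma = 0.01
observed = synth(5000.0) + rng.normal(0, sigma, wave.size)

chi2 = [np.sum(((observed - synth(t)) / sigma) ** 2) for t in grid_teff]
print("best-fit Teff:", grid_teff[int(np.argmin(chi2))])   # -> 5000.0
```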
31

Hunter, Alistair Edward. "Impact moles and directional drills : safe installation distances for existing services." Thesis, University of Nottingham, 2000. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.323250.

32

Majer, Jan. "Výpočet potrubní trasy parovodu." Master's thesis, Vysoké učení technické v Brně. Fakulta strojního inženýrství, 2016. http://www.nusl.cz/ntk/nusl-241223.

Abstract:
This thesis describes issues related to the design and structural analysis of a steam pipeline. Analyses are performed using two different software programs, AutoPIPE and Ansys. Issues related to the pipeline support system and thermal expansion are solved using AutoPIPE. The structural analysis results are evaluated in accordance with the ČSN EN 13 480 code.
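The thermal-expansion issue such an analysis addresses can be sized with the elementary relation ΔL = α·L·ΔT; the run length and temperatures below are illustrative.

```python
alpha_steel = 12e-6      # 1/K, approximate for carbon steel
L = 60.0                 # pipeline run, m
dT = 250.0 - 20.0        # heat-up from ambient to steam temperature, K

dL = alpha_steel * L * dT
print(f"free expansion: {dL * 1000:.0f} mm")   # -> ~166 mm to be absorbed
```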
33

ALVINO, ALBERTO EDWIN ILDEFONSO. "APPLICATION OF FUZZY LOGIC TO THE MUHLBAUER MODEL FOR PIPELINE RISK ANALYSIS." PONTIFÍCIA UNIVERSIDADE CATÓLICA DO RIO DE JANEIRO, 2003. http://www.maxwell.vrac.puc-rio.br/Busca_etds.php?strSecao=resultado&nrSeq=4391@1.

Abstract:
W. Kent Muhlbauer made a detailed identification of approximately 300 different conditions that influence the evaluation of risk in a pipeline, and proposed a scoring system that is known as the Muhlbauer method. The method evaluates the several variables that influence pipeline risk by assigning them quantitative values. However, being a qualitative method, these variables cannot be given exact values. The variables could be treated as coming from random distributions, but identifying those distributions can demand a great deal of effort. Thus, instead of assuming that the evaluations of the variables follow random distributions, one can consider that they have distributions given by fuzzy sets and fuzzy numbers. In the present work, the numeric values present in the Muhlbauer model are treated as non-deterministic, admitting an uncertainty. This uncertainty depends on the knowledge of the specialist engineer assessing the risk. To incorporate this uncertainty into the calculation of the risk value, fuzzy sets and fuzzy numbers are used. Analysing the uncertainties with fuzzy sets requires defining the linguistic variables (VL), the linguistic values of the input and output VL, and the membership functions, in addition to the fuzzy rules. With these, a fuzzy logic system (SLF) is implemented based on Mamdani inference. Fuzzy numbers, in turn, admit a most probable value and an uncertainty, evaluated by a normalized membership function. Addition, subtraction, multiplication and division operations are possible for fuzzy numbers. As a result, it is possible to find not only a number that defines the risk, but also the uncertainty (range of values) that the risk can have, as a function of the uncertainties of the individual evaluations of the variables. The present work also proposes a basic model of Risk Management (GR) and Structural Integrity Analysis (AIE) for pipelines with corrosion damage. For that, the results of an AIE level I applied to the pipeline are associated with the results of the Muhlbauer method through a risk matrix. Starting from an AIE level I, the residual life (T*) is calculated. T* is compared with a desired life (VD), which is obtained from the risk matrix. If T* is smaller than VD, an AIE level II is recommended. If T* is larger than VD, a time of inspection (TI) is determined based on reliability analysis.
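A small sketch of the fuzzy-number arithmetic the work relies on: triangular fuzzy numbers, written (low, most likely, high), add component-wise, so the uncertainty of each scored variable propagates into the risk total. The scores below are illustrative, not Muhlbauer's.

```python
def tfn_add(x, y):
    """Component-wise sum of two triangular fuzzy numbers."""
    return tuple(a + b for a, b in zip(x, y))

third_party = (8.0, 10.0, 12.0)   # illustrative (low, likely, high) scores
corrosion   = (5.0,  7.0, 10.0)
print(tfn_add(third_party, corrosion))   # -> (13.0, 17.0, 22.0)
```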
34

Agbenowosi, Newland Komla. "A Mechanistic Analysis Based Decision Support System for Scheduling Optimal Pipeline Replacement." Diss., Virginia Tech, 2000. http://hdl.handle.net/10919/29796.

Abstract:
Failure of pipes in water distribution systems is a common occurrence, especially in large cities. The failure of a pipe results in: loss of water; property damage; interruption of service; decreased system performance; and the financial cost of restoring the failed pipe. The cost of replacement and rehabilitation in the United States is estimated at more than 23 billion dollars. It is virtually impossible to replace all vulnerable pipes at the same time. As a result, there is a need for methods that can help in progressive system rehabilitation and replacement subject to budgetary constraints. If delaying is considered a good strategy due to the time value of money, then the timing of preventive maintenance becomes a crucial element of system maintenance and operation. The central underpinning element in the decision process for scheduling preventive maintenance is the deteriorating nature of a pipe in a given surrounding. By planning to replace pipes before they fail, proper planning can be put in place for securing the finances and labor force needed to rehabilitate the pipes. With this approach, service interruptions are minimized, as the loss of service time is limited to the time used in replacing the pipe. In this research, a mechanistic model for assessing the stage of deterioration of an underground pipe is developed. The developed model consists of three sub-models: the Pipe Load Model (PLM), the Pipe Deterioration Model (PDM), and the Pipe Break Model (PBM). The PLM simulates the loads and stresses exerted on a buried water main. These loads include the earth load, traffic load, internal pressure, expansive soil loads, and thermal and frost loads. The PDM simulates the deterioration of the pipe due to corrosion resulting from the physical characteristics of the pipe environment. The deterioration effect is modeled in two stages: first, the thinning of the pipe wall is modeled using a corrosion model; second, the localized pit growth is used to determine the residual strength of the pipe based on the fracture toughness and the initial design strength of the pipe. The PBM assesses the vulnerability of a pipe at any time in terms of a critical safety factor. The safety factor is defined as the ratio of residual strength to applied stress. For a conservative estimate, the multiplier effect due to thermal and frost loads is considered. For a chosen analysis period, say 50 years, the pipes with safety factors less than the critical safety factor are selected and ordered by rank. Aided by the prioritized list of failure-prone pipes, utilities can organize a replacement schedule that minimizes cost over time. Additionally, a physically based regression model for determining the optimal replacement time of a pipe is presented, together with a methodology for assessing the consequences of accelerated and delayed replacement. The methodologies developed in this dissertation will enable utilities to formulate future budgetary needs compatible with the intended level of service. An application of the model and results are included in the dissertation.
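A sketch of the break-model logic described above: grow a corrosion pit with a power law, convert it to residual strength, and compute the safety factor as residual strength over applied stress. All constants are illustrative, not the dissertation's calibrated values.

```python
def pit_depth_mm(age_yr, k=0.3, n=0.6):
    """Power-law pit growth (illustrative constants)."""
    return k * age_yr ** n

def safety_factor(age_yr, wall_mm, strength_MPa, stress_MPa):
    residual = strength_MPa * (1.0 - pit_depth_mm(age_yr) / wall_mm)
    return residual / stress_MPa

for age in (10, 30, 50):                      # safety factor falls with age
    print(age, round(safety_factor(age, 8.0, 300.0, 120.0), 2))
```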
35

Malik, Sarita. "The Army Civil Affairs Officer educational pipeline a supply and demand analysis." Thesis, Monterey, Calif. : Naval Postgraduate School, 2008. http://edocs.nps.edu/npspubs/scholarly/theses/2008/Dec/08Dec%5FMalik.pdf.

Abstract:
Thesis (M.S. in Defense Analysis)--Naval Postgraduate School, December 2008.
Thesis Advisor(s): McNab, Robert ; Guttieri, Karen. "December 2008." Description based on title screen as viewed on January 29, 2009. Includes bibliographical references (p. 127-134). Also available in print.
36

Ramsay, Trevor. "A Motif Discovery and Analysis Pipeline for Heterogeneous Next-Generation Sequencing Data." Thesis, University of California, Davis, 2015. http://pqdtopen.proquest.com/#viewpdf?dispub=1599520.

Abstract:

Bioinformatics has made great strides in understanding the regulation of gene expression, but many of the tools developed for this purpose depend on data from a limited number of species. Despite their unique genetic attributes, there remains a dearth of research into undomesticated trees. The poplar tree, Populus trichocarpa, has undergone multiple rounds of genome duplication during its evolution. In addition, its life cycle differs from the annual crop and model plants previously studied, leading to significant technical challenges in understanding the unique biology of these trees. For example, the process of secondary growth occurs as the tree stems thicken, creating secondary xylem (wood) and phloem (inner bark) for the transport of water and the products of photosynthesis, respectively. Because of this, the research group I work with studies the secondary growth of P. trichocarpa (Spicer, 2010) (Groover, et al., 2010) (Groover, et al., 2006) (Groover, 2005).

The genomic tools to investigate gene regulation in P. trichocarpa are readily available. Next-generation sequencing technologies such as RNA-Seq and ChIP-Seq can be used to understand gene expression and the binding of transcription factors to specific locations in the genome. Similarly, a variety of specialized bioinformatic tools such as EdgeR, Cufflinks, and MACS can be used to analyze gene binding and expression from the sequencing data provided by ChIP-Seq and RNA-Seq (Blahnik et al., 2010; Mortazavi et al., 2008; Robinson, 2010; Robinson, 2007; Robinson et al., 2008; McCarthy, 2012; Trapnell, 2013; Zhang, 2008). The binding and expression data these tools provide form a foundation for analyzing the regulation of gene expression in P. trichocarpa.

The goal of my project is to provide a motif discovery and analysis pipeline for analyses of Populus species. The pipeline utilizes heterogeneous data collected from poplar and aspen mutants to elucidate the gene regulatory mechanisms involved in secondary growth. The experiments target transcription factors related to secondary growth, and through analysis of the variety of transcription factor binding experiments, I have identified the motifs involved in the gene regulation of secondary growth within P. trichocarpa (Filkov et al., 2008).
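
As a rough illustration of one primitive behind motif discovery, the sketch below scores k-mer enrichment in assumed "bound" sequences against assumed background sequences; it is a generic toy under invented inputs, not the pipeline the thesis describes.

```python
# Hedged sketch: naive k-mer enrichment as a stand-in for motif discovery.
# Toy sequences; a real pipeline would use ChIP-Seq peak regions and a
# dedicated motif finder, not this simple count ratio.
from collections import Counter

def kmer_counts(seqs, k):
    counts = Counter()
    for s in seqs:
        for i in range(len(s) - k + 1):
            counts[s[i:i + k]] += 1
    return counts

peaks = ["ACGTGGCA", "TTACGTGG", "ACGTGTAC"]       # assumed bound regions
background = ["GGGCCCTT", "TTTTAACC", "GCGCGCAA"]  # assumed background

k = 5
fg, bg = kmer_counts(peaks, k), kmer_counts(background, k)
total_fg, total_bg = sum(fg.values()), sum(bg.values())

# Enrichment = foreground frequency / (pseudocounted background frequency)
for kmer, n in fg.most_common():
    enrich = (n / total_fg) / ((bg[kmer] + 1) / (total_bg + 1))
    print(kmer, round(enrich, 2))
```
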

APA, Harvard, Vancouver, ISO, and other styles
37

Yu, Hanqi. "Reliability Sensitivity Analysis of Dropped Objects Hitting on the Pipeline at Seabed." ScholarWorks@UNO, 2019. https://scholarworks.uno.edu/td/2710.

Full text
Abstract:
Nowadays, as the oil industry gradually moves towards deep-sea fields with water depths of more than 1000 meters, pipelines are subject to several threats that can cause failure, of which accidentally dropped objects have become the leading external risk factor for subsea developments. In this thesis, a sample field layout introduced in the Det Norske Veritas (DNV) guide rules is selected as the study case, with 100 m water depth. Six different groups of dropped objects are considered. The conditional hit probability for long/flat shaped objects is calculated with methods from both the DNV rules and an in-house tool, the Dropped Objects Simulator (DROBS), and the difference between the results is discussed. Meanwhile, the sensitivity of the results to the mass, collision area, volume, added mass coefficient, and drag coefficient of the objects is analyzed.
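
The flavor of a conditional hit probability calculation can be conveyed with a small Monte Carlo simulation. DNV's guidance and DROBS use more detailed models; here the lateral excursion is simply assumed normal with a spread angle, and all dimensions are illustrative assumptions.

```python
# Hedged Monte Carlo sketch of conditional hit probability for a dropped object.
# DNV guidance uses a closed-form normal-distribution model; this simulation
# only mimics the idea. Spread angle, offsets and dimensions are illustrative.
import math
import random

def hit_probability(depth_m, drop_offset_m, object_len_m, pipe_od_m,
                    spread_deg=5.0, trials=100_000, seed=1):
    """P(object lands on the pipe | dropped), with lateral excursion drawn
    as Normal(0, depth * tan(spread angle))."""
    rng = random.Random(seed)
    sigma = depth_m * math.tan(math.radians(spread_deg))
    half_target = 0.5 * (pipe_od_m + object_len_m)  # long object may straddle
    hits = sum(
        abs(drop_offset_m + rng.gauss(0.0, sigma)) <= half_target
        for _ in range(trials)
    )
    return hits / trials

print(hit_probability(depth_m=100, drop_offset_m=10, object_len_m=6,
                      pipe_od_m=0.5))
```
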
APA, Harvard, Vancouver, ISO, and other styles
38

Baruzzo, Giacomo. "Improving the RNA-Seq analysis pipeline: read alignment and expression level quantification." Doctoral thesis, Università degli studi di Padova, 2017. http://hdl.handle.net/11577/3424871.

Full text
Abstract:
DNA and RNA play an essential role in the life of every living organism. The two molecules have different characteristics and properties, but their functions are strictly related. DNA encodes in the so-called genome all the genetic instructions needed by the main cell activities. DNA is related to RNA through the gene expression process, which transcribes the information encoded by DNA into RNAs. In contrast to the static information provided by DNA, the set of transcribed RNAs at a specific instant represents the current state of each cell and ultimately provides a dynamic characterization of its activity. For this reason, transcriptome analysis represents a powerful tool to identify the dynamic behavior of an organism, such as the response to environmental stimuli or the pathological mechanisms involved in diseases. In recent years, transcriptomic analyses were revolutionized by the advent of RNA sequencing (RNA-Seq), a new methodology that applies current Next Generation Sequencing (NGS) techniques to RNA molecules. RNA-Seq makes it possible to investigate at high resolution all the RNA species present in a sample, characterizing their sequences and quantifying their abundances at the same time. In practice, millions of short transcript sub-sequences, called reads, are sequenced from random positions of the input RNAs using the same NGS platforms employed in DNA sequencing. Unfortunately, no information is provided about which transcripts have generated the reads or from which part of the transcripts they come. For this reason, reads represent at the same time the output of the sequencing process and the input of complex RNA-Seq data analysis pipelines. The first task in many RNA-Seq data analysis pipelines consists of identifying the relation between the sequencing output (i.e. the reads) and the sequenced transcripts. The most common approach to this problem consists of aligning the reads against a reference genome. Once the reads are positioned in the genome, it is possible to infer which transcripts have generated them by analyzing the read locations. The information coming from the positions and the number of reads can be employed in a wide range of downstream analyses. For example, counting the number of reads aligned to a gene can give a measure of its expression level, whereas studying which reads are located across exon junctions can identify different isoforms. At first glance, these tasks may seem very simple, but the implementation of both the single steps and the whole analysis workflow is in fact complex and still not well defined. Among all the analysis steps in the pipeline, this thesis focuses on the read alignment problem. Read alignment is identified as one of the most critical steps, both for its almost ubiquitous presence in the different RNA-Seq analysis workflows and for its complexity. The study of this pivotal task was carried out through several steps. First, a complete characterization of the problem was performed, analyzing the alignment challenges from both a methodological and a computational point of view. In addition, the algorithms and data structures employed in the alignment process were analyzed together with different ways of modeling the read alignment problem. Then, state of the art methods for RNA-Seq read alignment were identified by performing a thorough literature search, which revealed the presence of many available methods.
At the same time, the literature search highlighted that the identification of a suitable alignment method for a specific application is challenging, mainly due to the lack of accurate comparative analyses. Thus, a comprehensive benchmark analysis of fourteen splice-aware alignment methods and four splice-unaware tools was designed and performed. Several datasets describing real scenarios were simulated, and a comprehensive set of accuracy and efficiency metrics was defined in order to assess the different alignment methods. The assessment revealed considerable differences between the methods' performance, often highlighting a poor correlation between accuracy and popularity. Finally, the effect of alignment accuracy on the reliability of an expression level quantification study was assessed for a subset of alignment methods. Overall, this thesis considers the RNA-Seq read alignment problem and presents a thorough characterization of its characteristics and challenges. In a fast-evolving research field such as RNA-Seq, the information resulting from the assessment of state of the art methods provides valuable guidelines for the definition of robust and accurate analysis pipelines.
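
The accuracy bookkeeping in such a benchmark reduces to comparing reported read positions with the simulator's ground truth. The sketch below shows one plausible form of this check; the data structures and the 10-base tolerance are assumptions for illustration, not the thesis's actual metric definitions.

```python
# Hedged sketch of aligner-benchmark accuracy bookkeeping: compare reported
# read positions against simulated ground truth. Field names and the
# tolerance are assumptions for illustration.

def alignment_accuracy(truth: dict, aligned: dict, tol: int = 10):
    """truth / aligned map read_id -> (chrom, pos). A read counts as
    'correct' if placed within tol bases of its simulated origin."""
    correct = sum(
        1 for rid, (chrom, pos) in aligned.items()
        if rid in truth
        and truth[rid][0] == chrom
        and abs(truth[rid][1] - pos) <= tol
    )
    precision = correct / len(aligned) if aligned else 0.0  # of reported
    recall = correct / len(truth) if truth else 0.0         # of simulated
    return precision, recall

truth = {"r1": ("chr1", 1000), "r2": ("chr2", 500), "r3": ("chr1", 80)}
aligned = {"r1": ("chr1", 1003), "r2": ("chr3", 500)}
print(alignment_accuracy(truth, aligned))  # (0.5, 0.333...)
```
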
APA, Harvard, Vancouver, ISO, and other styles
39

Irwin, Warwick Allan. "Understanding and Improving Object-Oriented Software Through Static Software Analysis." Thesis, University of Canterbury. Computer Science and Software Engineering, 2007. http://hdl.handle.net/10092/1166.

Full text
Abstract:
Software engineers need to understand the structure of the programs they construct. This task is made difficult by the intangible nature of software, and its complexity, size and changeability. Static analysis tools can help by extracting information from source code and conveying it to software engineers. However, the information provided by typical tools is limited, and some potentially rich veins of information - particularly metrics and visualisations - are under-utilised because developers cannot easily acquire or make use of the data. This thesis documents new tools and techniques for static analysis of software. It addresses the problem of generating parsers directly from standard grammars, thus avoiding the common practice of customising grammars to comply with the limitations of a given parsing algorithm, typically LALR(1). This is achieved by a new parser generator that applies a range of bottom-up parsing algorithms to produce a hybrid parsing automaton. Consequently, we can generate more powerful deterministic parsers - up to and including LR(k) - without incurring the combinatorial explosion that makes canonical LR(k) parsers impractical. The range of practical parsers is further extended to include GLR, which was originally developed for natural language parsing but is shown here to also have advantages for static analysis of programming languages. This emphasis on conformance to standard grammars improves the rigour of static analysis tools and allows clearer definition and communication of derived information, such as metrics. Beneath the syntactic structure of software (exposed by parsing) lies the deeper semantic structure of declarations, scopes, classes, methods, inheritance, invocations, and so on. In this work, we present a new tool that performs semantic analysis on parse trees to produce a comprehensive semantic model suitable for processing by other static analysis tools. An XML pipeline approach is used to expose the syntactic and semantic models of the software and to derive metrics and visualisations. The approach is demonstrated producing several types of metrics and visualisations for real software, and the value of static analysis for informing software engineering decisions is shown.
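
The XML-pipeline idea can be illustrated by deriving a toy metric from an XML semantic model. The element and attribute names below are invented for illustration; the thesis's actual schema is not reproduced here.

```python
# Hedged sketch: derive a simple OO metric (methods per class) from an
# XML semantic model. The schema is invented, not the thesis's format.
import xml.etree.ElementTree as ET

model_xml = """
<model>
  <class name="Parser">
    <method name="parse"/><method name="reduce"/><method name="shift"/>
  </class>
  <class name="Grammar"><method name="rules"/></class>
</model>
"""

root = ET.fromstring(model_xml)
for cls in root.findall("class"):
    wmc = len(cls.findall("method"))  # crude weighted-methods-per-class
    print(cls.get("name"), wmc)
```
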
APA, Harvard, Vancouver, ISO, and other styles
40

Norris, Shaun W. "A Pipeline for Creation of Genome-Scale Metabolic Reconstructions." VCU Scholars Compass, 2016. http://scholarscompass.vcu.edu/etd/4667.

Full text
Abstract:
The decreasing costs of next generation sequencing technologies and the increasing speeds at which they work have led to an abundance of 'omic datasets. The need for tools and methods to analyze, annotate, and model these datasets to better understand biological systems is growing. Here we present a novel software pipeline to reconstruct the metabolic model of an organism in silico, starting from its genome sequence, together with a novel compilation of biological databases to better serve the generation of metabolic models. We validate these methods using five Gardnerella vaginalis strains and compare the gene annotation results to NCBI and the FBA results to Model SEED models. We found that our gene annotations were larger and highly similar in terms of function and gene types to the gene annotations downloaded from NCBI. Further, we found that our FBA models required a minimal addition of transport reactions, sources, and escapes, indicating that our draft pathway models were very complete. We also found that on average our solutions contained more reactions than the models obtained from Model SEED, due to a large number of baseline reactions and gene products found in ASGARD.
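
Flux balance analysis (FBA), used above for model validation, is at heart a linear program: maximize a biomass flux subject to steady-state mass balance and flux bounds. A minimal sketch on an invented two-metabolite network (the real reconstructions have thousands of reactions):

```python
# Hedged sketch of flux balance analysis (FBA): maximize a biomass flux
# subject to steady-state mass balance S @ v = 0 and flux bounds.
# The tiny network is invented for illustration.
import numpy as np
from scipy.optimize import linprog

# Columns: uptake, conversion, biomass. Rows: metabolites A, B.
S = np.array([
    [1, -1,  0],   # A: produced by uptake, consumed by conversion
    [0,  1, -1],   # B: produced by conversion, consumed by biomass
])
bounds = [(0, 10), (0, 10), (0, 10)]   # flux capacity assumptions
c = np.array([0, 0, -1.0])             # linprog minimizes, so negate biomass

res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds)
print("optimal biomass flux:", -res.fun, "fluxes:", res.x)
```
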
APA, Harvard, Vancouver, ISO, and other styles
41

Roberts, Caroline. "Risk-based decision-making for the management of structural assets." Thesis, Cranfield University, 1999. http://dspace.lib.cranfield.ac.uk/handle/1826/4587.

Full text
Abstract:
This thesis investigates the benefit of risk-based decision methods in engineering decisions. A thorough literature review identified the major issues and limitations in current methods. Consequently, a more comprehensive model was developed to account for the complexities of real-life decision-making. The enhancements introduced to the model include identifying and evaluating stakeholder influences, decision objectives, criteria and preferences between criteria, and decision outcomes. Monitoring and controlling important parameters during implementation is also included to ensure objectives are met and risks controlled. Tools and techniques were identified to support decision-making within the new model. The research focuses on how available techniques can improve engineering decision-making. The model was applied to four case studies analysing real-life, 'live' decision problems in bridge management and pipeline management. These confirmed the relevance and importance of the model enhancements. The practicality of the methods, their benefits and limitations were evaluated such that the proposed model was refined further. The enhanced model was shown to bring improved understanding to all four case studies and made the decisions more rational, thorough and auditable. The fifth case study reviewed how unsupported decisions are currently made within the sponsoring company. This involved a detailed desktop analysis of past projects and interviews with senior engineers, and provided further evidence emphasising the value of using the decision model. General guidelines, defined as applicability matrices, were developed based on the case study experiences to help the decision-maker identify the level of analysis required for different types of decision problems. The benefit of using a third-party facilitator in each of the case studies was identified in terms of the roles of leader, liaison, disseminator, spokesman and disturbance handler. The balance between these five roles through the stages of the decision process was found to be important to ensure the facilitator does not dominate the decision.
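
One building block of such a model, evaluating options against stakeholder-weighted criteria, can be sketched as a weighted-sum score. The options, criteria and weights below are invented placeholders, not those of the case studies.

```python
# Hedged sketch: weighted-sum multi-criteria scoring of decision options.
# All criteria, weights and scores are illustrative assumptions.
options = {
    "repair now":    {"cost": 4, "safety": 9, "disruption": 3},
    "monitor":       {"cost": 8, "safety": 5, "disruption": 9},
    "replace later": {"cost": 6, "safety": 7, "disruption": 6},
}
weights = {"cost": 0.3, "safety": 0.5, "disruption": 0.2}  # sum to 1

ranked = sorted(
    ((sum(weights[c] * s for c, s in scores.items()), name)
     for name, scores in options.items()),
    reverse=True,
)
for total, name in ranked:
    print(f"{name}: {total:.2f}")
```
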
APA, Harvard, Vancouver, ISO, and other styles
42

Melo, Henrique Velloso Ferreira. "Desenvolvimento de um pipeline para análise genômica e transcriptômica com base em Web services." Universidade Federal de São Carlos, 2009. https://repositorio.ufscar.br/handle/ufscar/6951.

Full text
Abstract:
Pipeline systems for genomic and transcriptomic analysis aim to create communication bridges among existing analysis tools, thereby reducing researchers' efforts. Most of the pipelines found in the literature lack important features that would be useful to the development of genome or transcriptome sequencing projects. Among them: the capacity to track project results throughout development, including the generation of partial reports; the presence of a collaborative environment where the laboratories involved can contribute new data and chromatograms; the possibility of configuring analysis parameters; support for multiple pipelines; and the possibility of including new tools and modules. In this work, a pipeline prototype was developed to overcome these shortcomings. Sequencing project progress is tracked throughout development: chromatograms are received progressively as the project advances, and partial reports on newly received data are generated. Communication with the processing server is done via a Web service, which offers a universal language interface, allowing client applications on heterogeneous platforms to submit data and execute operations and queries. Pipelines are configured in XML documents written in a predefined format, through which researchers choose the tools and parameters to be used. The prototype supports multiple pipelines executed simultaneously in the same project. Pipelines are executed in parallel by means of thread pools, which increases efficiency by distributing the workload on multiprocessor systems. Another feature of the prototype is extensibility, as each pipeline step is wrapped in a module. New modules can easily be inserted into the system through the implementation of a programming interface, without the need for recompilation; module insertion is done declaratively through XML documents. A client application was also developed on the collaborative platform Sakai, allowing the different research groups involved in a sequencing project to create pipelines, view results and exchange information on the project's current status. To evaluate the efficiency of the prototype, a case study was carried out: sequences generated from the sequencing of the Sphenophorus levis transcriptome were submitted and a pipeline was configured to analyze the data. The case study showed that the prototype is efficient and produces good results.
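
Two of the prototype's ideas, XML-declared pipeline steps and thread-pool execution, can be sketched as follows; the tag names, step registry and placeholder modules are invented, not the prototype's actual schema or interfaces.

```python
# Hedged sketch: pipeline steps declared in XML, datasets run in parallel
# through a thread pool. Tags and modules are invented placeholders.
import xml.etree.ElementTree as ET
from concurrent.futures import ThreadPoolExecutor

CONFIG = """
<pipeline>
  <step tool="trim" quality="20"/>
  <step tool="assemble" kmer="31"/>
</pipeline>
"""

def trim(params, data):      # placeholder modules
    return f"trimmed({data}, q={params['quality']})"

def assemble(params, data):
    return f"assembled({data}, k={params['kmer']})"

REGISTRY = {"trim": trim, "assemble": assemble}

def run_pipeline(xml_text, data):
    for step in ET.fromstring(xml_text).findall("step"):
        params = dict(step.attrib)              # steps run in order
        data = REGISTRY[params.pop("tool")](params, data)
    return data

chromatogram_batches = ["batch1", "batch2", "batch3"]
with ThreadPoolExecutor(max_workers=4) as pool:  # batches run in parallel
    for result in pool.map(lambda d: run_pipeline(CONFIG, d),
                           chromatogram_batches):
        print(result)
```
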
APA, Harvard, Vancouver, ISO, and other styles
43

Rowland, Adewumi. "GIS-based prediction of pipeline third-party interference using hybrid multivariate statistical analysis." Thesis, University of Newcastle Upon Tyne, 2011. http://hdl.handle.net/10443/2529.

Full text
Abstract:
In reported pipeline failures globally, third-party interference (TPI) has been recognised as a dominant failure mechanism in the oil and gas industry, although there has been limited research in this area. The problem is receiving considerable attention within the oil and gas industry because of threats to the industry (e.g. Al Qaeda's capabilities) and the natural vulnerability of pipelines arising from their long-distance network distribution. The ability to predict and secure pipelines against TPI is valuable knowledge in the pipeline industry, especially for the safety of the millions of people who live near pipelines. This thesis develops an understanding of the relationships between the many and various contributory factors leading to potential TPI, which frequently results in mass deaths, economic losses, and widespread destruction of property. The thesis used GIS-based spatial statistical methodologies: first, hotspot and cold-spot cluster analyses to explain pipeline incident patterns and distributions, together with a geographically weighted regression (GWR) model to investigate the determinants of TPI and to identify local and global effects of the independent variables; second, a generalized linear model (GLM) methodology of Poisson GLM and logistic regression (LR) procedures, using a combination of land use types, pipeline geometry and intrinsic properties, and socio-economic and socio-political factors to identify and predict potentially vulnerable pipeline segments and regions in a pipeline network. The GWR model showed a significant spatial relationship between TPI, geographical accessibility, and pipeline intrinsic properties (e.g. depth, age, size), varying with location in the study area. The thesis showed that the depth of the pipeline and the socio-economic conditions of the population living near the pipeline are the two major factors influencing the occurrence of TPI. These findings prompt the need for selective protection of vulnerable segments of a pipeline by installing security tools where most needed. The thesis examined available literature and critically evaluated selected international pipeline failure databases, their effectiveness, limitations, trends, and the evolving difficulties of addressing and minimising TPI. The review showed irregular nomenclature and the need for a universal classification of pipeline incident databases. The advantages and disadvantages of different detection and prevention tools for minimising TPI used in the pipeline industry are discussed. A questionnaire survey of employees and managers in the pipeline industry was developed and employed as part of the thesis. The results of the data analysis contribute to the body of knowledge on pipeline TPI, especially industry perceptions, prevention strategies, capabilities and the complexities of the various application methods presently being implemented. The thesis also outlines the actions that governments and industry can and should take to help manage and effectively reduce the risk of pipeline TPI. The results of this study will be used as a reference to develop strategies for managing pipeline TPI. The results also indicate that communication with all stakeholders is most effective in preventing intentional pipeline interference, and that the government's social responsibility to communities is the major factor influencing the occurrence of intentional TPI.
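
The logistic regression side of the GLM methodology can be sketched in a few lines. The features (burial depth, age, a socio-economic index) and data below are invented placeholders, chosen only to mirror the kinds of variables the thesis reports as influential.

```python
# Hedged sketch: logistic regression relating TPI to segment attributes.
# Feature set and data are invented placeholders, not the thesis's data.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Columns: burial depth (m), age (yr), socio-economic index (0-1)
X = np.array([
    [0.5, 30, 0.2],
    [1.5, 10, 0.8],
    [0.8, 25, 0.3],
    [2.0,  5, 0.9],
    [0.6, 40, 0.1],
    [1.8, 15, 0.7],
])
y = np.array([1, 0, 1, 0, 1, 0])  # 1 = TPI incident on segment

model = LogisticRegression().fit(X, y)
print("coefficients:", model.coef_)  # sign hints at risk direction
print("P(TPI):", model.predict_proba([[0.7, 20, 0.25]])[0, 1])
```
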
APA, Harvard, Vancouver, ISO, and other styles
44

SILVA, BARBARA AZEVEDO DA. "SLOPE STABILITY ANALYSIS ALONG A SUBMARINE PIPELINE'S ROUTE AT CAMPOS BASIN, RJ." PONTIFÍCIA UNIVERSIDADE CATÓLICA DO RIO DE JANEIRO, 2005. http://www.maxwell.vrac.puc-rio.br/Busca_etds.php?strSecao=resultado&nrSeq=7336@1.

Full text
Abstract:
This research analyses the stability conditions of the superficial marine sediment layers along the route of a 10 in. diameter concrete-coated rigid steel pipeline located at Campos Basin, RJ. The geological and geotechnical data were obtained during a wide geophysical and geotechnical survey along the pipeline's route. Due to the adversities to bottom stability found in the marine environment, several qualitative and quantitative stability analyses have been made. Nowadays, because of environmental and safety issues, research is increasingly focused on quantitative analysis of the geohazards associated with the stability condition of the sea bottom, since slope slides represent the main risk for these structures. The stability analyses were based on the infinite slope theory. The methodology applied was the same used by Nowacki et al. (2003) at the deep-water fields of Mad Dog and Atlantis, in the Gulf of Mexico. The results were compared with the results from the classical formulation of the infinite slope. An integration of geotechnical and geophysical data was necessary in order to obtain all the parameters used in the methodology. The results indicated two critical points along the route, both associated with the flanks of the Itapemirim Canyon, which have steep slopes. Brief studies were made to evaluate the importance of triggering mechanisms (earthquakes and storm waves) at these critical points, but the results discarded this possibility. It was also concluded that the geotechnical data used in this research were insufficient and in a certain way not appropriate for the slope stability analysis. A new geotechnical lab test campaign was proposed for future studies.
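
The classical infinite-slope factor of safety that underlies these analyses is compact enough to state directly. The sketch below implements the standard drained form with buoyant unit weight for submerged sediments; all parameter values are illustrative, not the Campos Basin data.

```python
# Hedged sketch of the classical infinite-slope factor of safety
# (drained form, buoyant unit weight). Inputs are illustrative only.
import math

def infinite_slope_fs(c_kpa, phi_deg, gamma_sub_kn_m3, depth_m, beta_deg):
    """FS = (c' + gamma' * z * cos^2(beta) * tan(phi')) /
            (gamma' * z * sin(beta) * cos(beta))"""
    beta, phi = math.radians(beta_deg), math.radians(phi_deg)
    resisting = (c_kpa
                 + gamma_sub_kn_m3 * depth_m * math.cos(beta) ** 2
                 * math.tan(phi))
    driving = gamma_sub_kn_m3 * depth_m * math.sin(beta) * math.cos(beta)
    return resisting / driving

# Soft marine sediment on a canyon flank: illustrative parameters.
print(infinite_slope_fs(c_kpa=3.0, phi_deg=25.0, gamma_sub_kn_m3=7.0,
                        depth_m=2.0, beta_deg=12.0))
```
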
APA, Harvard, Vancouver, ISO, and other styles
45

ROSAS, MARCO ANTONIO PEREZ. "ANALYSIS OF PIPELINE WITH EXTERNAL METAL LOSS REPAIRED WITH STEEL-ADESIVE-REPAIR SYSTEM." PONTIFÍCIA UNIVERSIDADE CATÓLICA DO RIO DE JANEIRO, 2006. http://www.maxwell.vrac.puc-rio.br/Busca_etds.php?strSecao=resultado&nrSeq=9369@1.

Full text
Abstract:
Due to the increasing need to restore quickly the availability of pipelines that have suffered damage from metal loss or the impact of foreign objects, the use of repairs has become increasingly important. In this work, a new repair system for pipelines with external metal loss was analyzed. The repair system is composed of pre-curved thin steel layers, which are set in place over the external metal loss defect area of a pipe and cemented in place with epoxy resin. There are many types of repair system, but this one avoids the need for field welding and presents a high elastic modulus, which allows a reduction in the total deformation present in the defect region. This new repair system aims to restore the structural integrity of the pipeline in a simple and economical way. To understand the behavior of and to evaluate this type of repair, analytical, experimental and numerical methods were used. The finite element method has the great advantage of being able to consider the nonlinearity of the material in the calculation of the rupture pressure. An analytical model for dimensioning the repair was developed, which allows analysis of the stress behavior in the defect region for each applied pressure, a useful procedure for optimal repair selection. In the experimental program, eight pipe specimens were tested: two without defects and six with machined defects simulating the loss of thickness, of which five were repaired with different numbers and types of layers. The specimens were instrumented to obtain data on pressures, volumetric variations and deformations occurring in the specimens. The pipe repaired with four low-carbon steel layers and the pipe repaired with two 304 stainless steel layers withstood the rupture pressure of a defect-free pipe, both failing in sections away from the repaired ones. The pipes repaired with one, two and three low-carbon steel layers failed in the defect region. The experimental results proved the efficiency of the new repair system and were satisfactorily predicted by the numerical and analytical models.
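
The intuition behind a bonded layered repair can be sketched with a membrane-stiffness load-sharing estimate: the hoop load p·D/2 splits between the thinned wall and the bonded layers roughly in proportion to E·t. This is a simplification of the thesis's analytical model, with illustrative numbers.

```python
# Hedged sketch: hoop load shared between corroded wall and bonded repair
# layers in proportion to membrane stiffness E*t. A simplification of the
# thesis's analytical model; all numbers are illustrative.
def hoop_stress_with_repair(p_mpa, d_mm, t_wall_mm, e_wall_gpa,
                            t_repair_mm, e_repair_gpa):
    line_load = p_mpa * d_mm / 2.0             # hoop force per unit length
    k_wall = e_wall_gpa * t_wall_mm            # membrane stiffnesses
    k_rep = e_repair_gpa * t_repair_mm
    share_wall = k_wall / (k_wall + k_rep)
    return line_load * share_wall / t_wall_mm  # stress in remaining wall

# 10 MPa pressure, 273 mm pipe, wall thinned to 4 mm, then repaired with
# 2 x 1.5 mm bonded steel layers (E ~ 200 GPa for both).
print("no repair  :", hoop_stress_with_repair(10, 273, 4, 200, 0.0, 200), "MPa")
print("with repair:", hoop_stress_with_repair(10, 273, 4, 200, 3.0, 200), "MPa")
```
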
APA, Harvard, Vancouver, ISO, and other styles
46

Gorman, Bryan Robert. "Multi-scale imaging and informatics pipeline for in situ pluripotent stem cell analysis." Thesis, Massachusetts Institute of Technology, 2015. http://hdl.handle.net/1721.1/97824.

Full text
Abstract:
Thesis: Ph. D., Harvard-MIT Program in Health Sciences and Technology, 2015.
Cataloged from PDF version of thesis.
Includes bibliographical references (pages 86-97).
Human pluripotent stem (hPS) cells have the ability to reproduce indefinitely and differentiate into any cell type of the body, making them a potential source of cells for medical therapy and an ideal system to study fate decisions in early development. However, hPS cells exhibit a high degree of heterogeneity, which may be an obstacle to their clinical translation. Heterogeneity is at least partially induced as an artifact of removing the cells from the embryo and culturing them on a plastic dish. hPS cells grow in spatially patterned colony structures, which necessitates in situ quantitative single-cell image analysis. This dissertation offers a tool for analyzing the spatial population context of hPS cells that integrates automated fluorescent microscopy with an analysis pipeline. It enables high-throughput detection of colonies at low resolution, with single-cell and sub-cellular analysis at high resolutions, generating seamless in situ maps of single-cell data organized by colony. We demonstrate the tool's utility by analyzing inter- and intra-colony heterogeneity of hPS cell cycle regulation and pluripotency marker expression. We measured the heterogeneity within individual colonies by analyzing the cell cycle as a function of distance from the colony edge. Cells loosely associated with the outside of the colony are more likely to be in G1, reflecting a less pluripotent state, while cells within the first pluripotent layer are more likely to be in G2, possibly reflecting a G2/M block. Our analysis tool can group colony regions into density classes, and cells belonging to those classes have distinct distributions of pluripotency markers and respond differently to DNA damage induction. Our platform also enabled noninvasive texture analysis of live hPS colonies, which was applied to monitoring subtle changes in differentiation state. Lastly, we demonstrate that our pipeline can robustly handle high-content, high-resolution single-molecule mRNA FISH data by using novel image processing techniques. Overall, the imaging informatics pipeline presented offers a novel approach to the analysis of hPS cells, covering not only single-cell features but also spatial configuration across multiple length scales.
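
The "marker versus distance from the colony edge" analysis can be sketched with a distance transform over the colony mask; the synthetic mask, centroids and marker values below stand in for real segmentation output.

```python
# Hedged sketch: bin a per-cell marker by distance from the colony edge
# using a distance transform. All data are synthetic placeholders.
import numpy as np
from scipy import ndimage

colony = np.zeros((60, 60), dtype=bool)
colony[10:50, 10:50] = True                      # toy colony mask

depth = ndimage.distance_transform_edt(colony)   # distance to colony edge

rng = np.random.default_rng(0)
cells = rng.integers(10, 50, size=(200, 2))      # centroids inside colony
marker = rng.random(200)                         # e.g. a cell-cycle marker

cell_depth = depth[cells[:, 0], cells[:, 1]]
bins = np.digitize(cell_depth, [0, 2, 5, 10, 20])
for b in np.unique(bins):
    sel = bins == b
    print(f"bin {b}: n={sel.sum():3d}, mean marker={marker[sel].mean():.2f}")
```
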
by Bryan Robert Gorman.
Ph. D.
APA, Harvard, Vancouver, ISO, and other styles
47

Vishwakarma, Anmol. "Development Of A Performance Analysis Framework For Water Pipeline Infrastructure Using Systems Understanding." Thesis, Virginia Tech, 2019. http://hdl.handle.net/10919/87081.

Full text
Abstract:
The fundamental purpose of drinking water distribution systems is to provide safe drinking water at sufficient volumes and optimal pressure, with the lowest lifecycle costs, from the source (treatment plants, raw water source) to the customers (residences, industries). Most of the distribution systems in the US were laid out during the development phase after World War II. As the drinking water infrastructure ages, water utilities are battling increasing break rates in their water distribution systems and struggling to bear the associated economic costs. However, with the growth of sensing technologies and data science, water utilities are seeing economic value in collecting data and analyzing it to monitor and predict the performance of their distribution systems. Many mathematical models have been developed in the past to guide repair and rehabilitation decisions but remain largely unused because of low reliability. This is because any effort to build a decision support framework based on a model should rest its foundations on a robust knowledge base of the critical factors influencing the system, which varies from utility to utility. Mathematical models built on a strong understanding of the theory, current practices and the trends in data can prove to be more reliable. This study presents a framework to support repair and rehabilitation decisions for water utilities using water pipeline field performance data.
Master of Science
APA, Harvard, Vancouver, ISO, and other styles
48

Rashyna, Marina. "Quality control of a somatic mutation analysis pipeline for next generation sequencing data." Thesis, Uppsala universitet, Institutionen för immunologi, genetik och patologi, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-354751.

Full text
Abstract:
Many studies are focused on the analysis of next generation sequencing data from normal and cancer tissues with the intention of identifying somatic mutations in cancer. In brief, the produced sequences are mapped to the reference genome; the data from the tumour and normal samples are then compared to identify mutations in the tumour. Errors can be introduced during sample handling or by the sequencing platform, leading to incorrect alignment and ultimately to false positive mutations. To be certain that a discovered mutation is not an artefact, quality control should be performed on the raw sequencing data, on the results of read alignment, and finally following the mutation calling.

There are two aims of this study: first, to identify the most important metrics for control of raw sequencing data and read alignment data; second, to develop tools that can evaluate these metrics. To discover the most essential metrics, freely available software packages for quality control of raw sequencing data and read alignment were analysed.

Two tools, RawQC and MapQC, have been developed in Python 3 to perform quality control of raw sequencing and alignment data. RawQC can handle targeted panel data from the main commercially available sequencing platforms Illumina, Ion Torrent and Pacific Biosciences. A novel feature implemented in RawQC is the analysis of read duplications for estimating the duplication level with regard to the read length. For MapQC, a new feature is the Flag overview metric, which presents a quick summary of the alignment where the read length is also considered. Both tools produce useful statistics and graphs for quality assessment of input data. The evaluation of these metrics is an important step before somatic variant calling. By evaluating the quality of the data, decisions on data processing and filtering are facilitated, reducing the number of false positive or false negative mutation calls.
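
A duplication metric conditioned on read length, in the spirit of the RawQC feature described, can be sketched by counting identical reads per length bucket; the toy reads below are not real platform output, and RawQC's actual implementation is not reproduced.

```python
# Hedged sketch: duplication rate per read-length bucket. Toy input only.
from collections import Counter, defaultdict

reads = ["ACGT", "ACGT", "ACGTAC", "ACGTAC", "ACGTAC", "TTTT", "GGGGGG"]

by_length = defaultdict(Counter)
for r in reads:
    by_length[len(r)][r] += 1

for length, counts in sorted(by_length.items()):
    total = sum(counts.values())
    duplicates = total - len(counts)  # copies beyond the first of each read
    print(f"len {length}: {duplicates}/{total} duplicate reads "
          f"({100 * duplicates / total:.0f}%)")
```
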
APA, Harvard, Vancouver, ISO, and other styles
49

El-Suleiman, Abdussalam. "Gas turbine application to CO2 pipeline : a techno-economic and environmental risk analysis." Thesis, Cranfield University, 2014. http://dspace.lib.cranfield.ac.uk/handle/1826/9240.

Full text
Abstract:
Gas turbines (GTs) are used extensively in pipelines to compress gas at suitable points. The primary objective of this study is to look at CO2 return pipelines and the close coupling of the compression system with advanced prime mover cycles. Adopting a techno-economic and environmental risk analysis (TERA) framework, this study conducts the modelling and evaluation of CO2 compression power requirements for gas turbine driven equipment (pump and compressor). The author developed and validated subroutines to implement variable stators in an in-house GT simulation code known as Variflow in order to enhance the off-design performance simulation of the code. This modification was achieved by altering the existing compressor maps and main program algorithm of the code. An economic model based on the net present value (NPV) method, a CO2 compressibility factor model based on the Peng-Robinson equation of state, and a pipeline hydraulic analysis model based on the fundamental gas flow equation were also developed to facilitate the TERA of selected GT mechanical drives in two case scenarios. These case scenarios were specifically built around Turbomatch-simulated GT design and off-design performance, which ensures that the CO2 is introduced into the pipeline at supercritical pressure and that the CO2 pressure is sustained above a minimum designated pressure during transmission along an adapted real-life pipeline profile. The required compression duty for the maximum and minimum CO2 throughput, as well as the operation site ambient conditions, guided the selection of two GTs of 33.9 MW and 9.4 MW capacity. At the site ambient condition, the off-design simulations of these GTs give outputs of 25.9 MW and 7.6 MW respectively. Given the assumed economic parameters over a plant life of 25 years, the NPV for deploying the 33.9 MW GT is about £13.9M while that of the 9.4 MW GT is about £1.2M. The corresponding payback periods (PBPs) were 3 and 7 years respectively. Thus, a good return on investment is achieved within reasonable risk. The sensitivity analysis results show an NPV of about £19.1M - £24.3M and about £3.1M - £4.9M for the 33.9 MW and 9.4 MW GTs respectively over a 25 - 50% fuel cost reduction. Their PBPs were 3 - 2 years and 5 - 4 years respectively. In addition, as the CO2 throughput drops, the risk becomes higher with less return on investment. In fact, when the CO2 throughput drops to a certain level, the investment becomes highly unattractive and unable to pay back itself within the assumed 25-year plant life. The hydraulic analysis results for three different pipe sizes of 24, 14 and 12¾ inch diameter show an increase in pressure drop with increasing CO2 throughput and a decrease in pressure drop with increasing pipe size for a given throughput. Owing to the effect of elevation differences, the 511 km long pipeline profile gives rise to an equivalent length of 511.52 km. Similarly, given the pipeline inlet pressure of 15 MPa and other assumed pipeline data, the 3.70 MTPY (0.27 mmscfd) maximum average CO2 throughput considered in the 12¾ inch diameter pipeline results in a delivery pressure of about 15.06 MPa. Under this condition, points of pressure spikes above the pipeline's maximum allowable operating pressure (15.3 MPa) were obtained along the profile. Lowering the pipeline operating pressure to 10.5 MPa gives a delivery pressure of about 10.45 MPa, within safe pressure limits. At this 10.5 MPa, over a flat pipeline profile of the same length, the delivery pressure is about 10.4 MPa.
Thus, given the operating conditions for dense phase CO2 pipeline transmission and the limits of this study, it is very unlikely that a booster station will be required. Likewise, compressing the CO2 to 15 MPa may no longer be necessary, which eliminates the need to combine a compressor and pump for the initial pressure boost in order to save power. This is because, irrespective of the saving in energy, the increase in capital cost associated with obtaining a pump and suitable driver far outweighs the extra expense incurred in acquiring a rated GT mechanical drive to meet the compression duty.
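
The NPV and payback arithmetic behind the TERA economics is simple to state. The sketch below uses invented cash flows and an invented discount rate, not the thesis's assumed parameters, so it will not reproduce the £13.9M figure.

```python
# Hedged sketch of NPV and (undiscounted) payback period. All cash flows
# and the discount rate are illustrative assumptions.
def npv(rate, cash_flows):
    """cash_flows[0] is the year-0 investment (negative)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def payback_period(cash_flows):
    cumulative = 0.0
    for t, cf in enumerate(cash_flows):
        cumulative += cf
        if cumulative >= 0:
            return t
    return None  # never pays back within the horizon

# -30 MGBP capital, then 4 MGBP net revenue/year over a 25-year life.
flows = [-30.0] + [4.0] * 25
print("NPV @10%:", round(npv(0.10, flows), 2), "MGBP")
print("Payback :", payback_period(flows), "years")
```
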
APA, Harvard, Vancouver, ISO, and other styles
50

Mainetti, Gabriele. "The Herschel Space Observatory: Pipeline development and early cosmological surveys results." Doctoral thesis, Università degli studi di Padova, 2011. http://hdl.handle.net/11577/3427432.

Full text
Abstract:
This thesis deals with all technical and scientific issues related to the use of infrared and sub-millimeter wavelengths in astronomy. In particular, the thesis focuses on the Herschel Space Observatory, with particular attention to one of the instruments on board, SPIRE, whose design and components will be described in detail. The problems related to data reduction are presented in the central chapters: all phases of the data reduction pipeline will be described, namely the transformation of raw telemetry data into "scientific" data usable by the astronomical community. The thesis will then deal with the aspects linked to the analysis of these data, particularly with regard to the tools developed to easily analyze the large datasets released to the astronomical community. The scientific side of the issue will be discussed in the final chapters, with particular attention to the so-called cosmological surveys. The thesis will then focus on the first results obtained in one of the largest projects designed for Herschel, i.e. HerMES, and on the application of a statistical technique known as the analysis of the probability of deflection - P(D) - to improve our knowledge about the number counts below the confusion limit. This thesis emphasizes, therefore, both the technical aspects (with particular attention to all the software infrastructure) of an important mission such as Herschel, and the purely scientific aspects (formation and evolution of galaxies, observational cosmology) related to astronomical observations in the infrared and sub-millimetric bands.
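
The P(D) idea, inferring counts below the confusion limit from the pixel flux histogram, can be mimicked with a forward Monte Carlo: draw sources per pixel from assumed number counts, add noise, and histogram. Real analyses use the instrument beam and an FFT-based formalism; everything below is a toy under invented counts.

```python
# Hedged sketch of a P(D) forward simulation: pixel flux from assumed
# number counts plus noise, then a histogram. Counts, beam treatment and
# noise level are illustrative, not the HerMES analysis.
import numpy as np

rng = np.random.default_rng(42)
n_pix = 100_000
beam_area = 1.0                                 # pixels per beam, simplified

# Toy counts: mean number of sources per pixel in each flux bin.
flux_bins = np.array([1.0, 2.0, 4.0, 8.0])      # mJy
mean_counts = np.array([2.0, 0.7, 0.2, 0.05])   # per pixel, steep counts

# Pixel flux = sum over bins of Poisson(source count) * flux + noise.
pixel_flux = sum(
    rng.poisson(mu * beam_area, n_pix) * s
    for mu, s in zip(mean_counts, flux_bins)
) + rng.normal(0.0, 0.5, n_pix)                 # instrumental noise

hist, edges = np.histogram(pixel_flux, bins=50)
print("mode of P(D) near", edges[np.argmax(hist)], "mJy")
```
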
APA, Harvard, Vancouver, ISO, and other styles