Dissertations on the topic "Data integration method"
Cite sources in APA, MLA, Chicago, Harvard, and other styles
Browse the top 50 dissertations for research on the topic "Data integration method".
Chavali, Krishna Kumar. "Integration of statistical and neural network method for data analysis." Morgantown, W. Va. : [West Virginia University Libraries], 2006. https://eidr.wvu.edu/etd/documentdata.eTD?documentid=4749.
Title from document title page. Document formatted into pages; contains viii, 68 p. : ill. (some col.). Includes abstract. Includes bibliographical references (p. 50-51).
Lin, Shih-Yung. "Integration and processing of high-resolution moiré-interferometry data." Diss., Virginia Tech, 1992. http://hdl.handle.net/10919/40181.
Ph. D.
Graciolli, Vinicius Medeiros. "A novel classification method applied to well log data calibrated by ontology based core descriptions." reponame:Biblioteca Digital de Teses e Dissertações da UFRGS, 2018. http://hdl.handle.net/10183/174993.
A method for the automatic detection of lithological types and layer contacts was developed through the combined statistical analysis of a suite of conventional wireline logs, calibrated by the systematic description of cores. The intent of this project is to allow the integration of rock data into reservoir models. The cores are described with the support of an ontology-based nomenclature system that extensively formalizes a large set of rock attributes, including lithology, texture, primary and diagenetic composition, and depositional, diagenetic and deformational structures. The descriptions are stored in a relational database along with the records of the conventional wireline logs (gamma ray, resistivity, density, neutron, sonic) of each analyzed well. This structure allows prototypes of combined log values to be defined for each recognized lithology, by calculating the mean and variance-covariance of the values measured by each log tool for each of the lithologies described in the cores. The statistical algorithm learns with each addition of a described and logged core interval, progressively refining the automatic lithological identification. The detection of lithological contacts is performed by smoothing each of the logs with two moving means of different window sizes. The results of each pair of smoothed logs are compared, and the places where the lines cross define the locations of abrupt shifts in the values of each log, potentially indicating a change of lithology. The results from applying this method to each log are then unified into a single assessment of lithological boundaries. The mean and variance-covariance data derived from the core samples are then used to build an n-dimensional Gaussian distribution for each of the recognized lithologies. At this point, Bayesian priors are also calculated for each lithology.
These distributions are checked against each of the previously detected lithological intervals by means of a probability density function, evaluating how close the interval is to each lithology prototype and allowing the assignment of a lithological type to each interval. The developed method was tested in a set of wells in the Sergipe-Alagoas basin and the prediction accuracy achieved during testing is superior to classic pattern recognition methods such as neural networks and KNN classifiers. The method was then combined with neural networks and KNN classifiers into a multi-agent system. The results show significant potential for effective operational application to the construction of geological models for the exploration and development of areas with large volume of conventional wireline log data and representative cored intervals.
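The two core steps described above, crossing moving means to flag contacts and Gaussian prototypes with Bayesian priors to label intervals, can be sketched in a few lines of Python. This is a hypothetical single-log, one-dimensional illustration, not the author's code; the thesis works with a full suite of logs and n-dimensional distributions:

```python
import math

def moving_mean(values, window):
    """Trailing moving mean (simplified smoothing of a wireline log)."""
    out = []
    for i in range(len(values)):
        seg = values[max(0, i - window + 1):i + 1]
        out.append(sum(seg) / len(seg))
    return out

def detect_contacts(log, short_win=3, long_win=7):
    """Candidate lithological contacts: indices where the short- and
    long-window moving means of the log cross."""
    s = moving_mean(log, short_win)
    l = moving_mean(log, long_win)
    contacts = []
    for i in range(1, len(log)):
        d_prev, d_cur = s[i - 1] - l[i - 1], s[i] - l[i]
        if (d_prev <= 0 < d_cur) or (d_prev >= 0 > d_cur):
            contacts.append(i)
    return contacts

def classify_interval(mean_value, prototypes, priors):
    """Assign the lithology whose 1-D Gaussian prototype (mean, variance)
    maximizes the posterior: Gaussian log-likelihood + log Bayesian prior."""
    def log_posterior(name):
        mu, var = prototypes[name]
        ll = -0.5 * (math.log(2 * math.pi * var) + (mean_value - mu) ** 2 / var)
        return ll + math.log(priors[name])
    return max(prototypes, key=log_posterior)
```

On a toy log that jumps from 10 to 50 at index 10, `detect_contacts` flags the single contact, and `classify_interval` assigns an interval mean to the closer of two made-up prototypes.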
Stock, Kristin Mary. "A new method for representing and translating the semantics of hetrogenous spatial databases." Thesis, Queensland University of Technology, 2000.
Sukcharoenpong, Anuchit. "Shoreline Mapping with Integrated HSI-DEM using Active Contour Method." The Ohio State University, 2014. http://rave.ohiolink.edu/etdc/view?acc_num=osu1406147249.
Darrell, Leopold Augustus. "Development of an NDT method to characterise flaws based on multiple eddy current sensor integration and data fusion." Thesis, Leeds Beckett University, 1998. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.245778.
Lindström, Maria, and Lena Ljungwald. "A study of the integration of complementary analysis methods : Analysing qualitative data for distributed tactical operations." Thesis, Linköping University, Department of Computer and Information Science, 2005. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-4750.
Complex socio-technical systems, such as command and control work in military operations and rescue operations, are becoming more and more common in society, and there is a growing need for more useful and effective systems. Qualitative data from complex socio-technical systems can be challenging to analyse. This thesis probes one way of enhancing existing analysis methods to better suit this task.
Our case study is carried out at FOI (the Swedish Defence Research Agency). One of FOI’s tasks is to analyse complex situations, for example military operations, and they have developed an approach called the Reconstruction – exploration approach (R&E) for analysing distributed tactical operations (DTOs). The R&E approach has a rich contextual approach, but lacks a systematic analytic methodology.
The assignment of this thesis is to investigate how the R&E approach could be enhanced and possibly merged with other existing cognitive analysis methods to better suit the analysis of DTOs. We identified that the R&E approach's main weaknesses were its lack of structure and its insufficient handling of subjective data, which made a deeper analysis difficult. The approach also needed a well-defined analysis method to increase the validity of the identified results.
One way of improvement was to integrate the R&E approach with several cognitive analysis methods based on their respective individual strengths. We started by analysing the R&E approach and then identified qualities in other methods that complemented the weaknesses in the R&E approach. Finally we developed an integrated method.
The Critical Decision Method (CDM) appeared to be the most suitable method for integration with the R&E approach. Nevertheless, the CDM did not have all the qualities asked for, so we also chose to use functions from other methods included in our initial analysis: ETA and Grounded Theory.
The integration resulted in a method with a well-defined method for analysis and the possibility to handle subjective data. This can contribute to a deeper analysis of DTOs.
Söderström, Eva. "Merging Modelling Techniques: A Case Study and its Implications." Thesis, University of Skövde, Department of Computer Science, 1999. http://urn.kb.se/resolve?urn=urn:nbn:se:his:diva-393.
There are countless methods in the field of Information Systems Development (ISD) today, of which only a few have received much attention from practitioners. These ISD methods are described and developed using knowledge from the field of Method Engineering (ME). Most methods concern either what a system is to contain or how the system is to be realised, but as of now, there is no best method for all situations. Bridging the gap between the fuzzier "what" methods and the more formal "how" methods is difficult, if not impossible. Methods therefore need to be integrated to cover as much of the systems life cycle as possible. An integration of two methods, one from each side of the gap, can be performed in a number of different ways, each with its own obstacles to overcome.
The case study we have performed concerns a method integration of the fuzzier Business Process Model (BPM) in the EKD method with the more formal description technique SDL (Specification and Description Language). One meta model per technique was created, which were then used to compare BPM and SDL. The integration process consisted of translating EKD business process diagrams into SDL correspondences, while carefully documenting and analysing encountered problems. The encountered problems mainly arose because of either transaction-independence differences or method focus deviations. The case study resulted in, for example, a number of implications for both EKD and SDL, as well as for ME, and include suggestions for future work.
Forst, Marie Bess. "Zoophonics keyboards: A venue for technology integration in kindergarten." CSUSB ScholarWorks, 2004. https://scholarworks.lib.csusb.edu/etd-project/2560.
Zeng, Sai. "Knowledge-based FEA Modeling Method for Highly Coupled Variable Topology Multi-body Problems." Diss., Georgia Institute of Technology, 2004. http://hdl.handle.net/1853/4772.
Ryd, Jonatan, and Jeffrey Persson. "Development of a pipeline to allow continuous development of software onto hardware : Implementation on a Raspberry Pi to simulate a physical pedal using the Hardware In the Loop method." Thesis, KTH, Hälsoinformatik och logistik, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-296952.
Saab wants to investigate the Hardware In the Loop method as a concept, and what a Hardware In the Loop infrastructure would look like. Hardware In the Loop is based on continuously testing hardware that is simulated. The software Saab wants to use for the Hardware In the Loop method is Jenkins, which is a Continuous Integration and Continuous Delivery tool. To simulate the hardware, Saab wants to investigate the use of an Application Programming Interface between a Raspberry Pi and the programming language Robot Framework. The reason Saab wants to investigate all of this is that they believe it can improve the frequency and the quality of testing, which would lead to an improvement of their products. The theory behind Hardware In the Loop, Continuous Integration and Continuous Delivery is explained in this report. The Hardware In the Loop method was implemented with the Continuous Integration and Continuous Delivery tool Jenkins. An Application Programming Interface between the General Purpose Input/Output pins of a Raspberry Pi and Robot Framework was developed. With these implementations in place, the Hardware In the Loop method was finally integrated, with Raspberry Pis used to simulate the hardware.
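The abstract above describes simulating a physical pedal for Hardware In the Loop testing. A minimal sketch of what the simulated-hardware side might look like (the `SimulatedPedal` class and its voltage model are hypothetical illustrations; the actual work drives Raspberry Pi GPIO pins and exposes the interface to Robot Framework):

```python
class SimulatedPedal:
    """Software stand-in for a physical pedal in a HIL rig.

    Hypothetical illustration only: the thesis exposes real GPIO pins
    on a Raspberry Pi; here the 'sensor' is a plain attribute so the
    idea can run anywhere."""

    def __init__(self):
        self.position = 0.0  # 0.0 = released, 1.0 = fully pressed

    def press(self, fraction):
        """Set the pedal position, as a test keyword would."""
        if not 0.0 <= fraction <= 1.0:
            raise ValueError("pedal position must be in [0, 1]")
        self.position = fraction

    def read_voltage(self, v_ref=3.3):
        """Voltage the controller under test would sample on the pin."""
        return self.position * v_ref
```

A Robot Framework keyword library could wrap `press` and `read_voltage` directly, while Jenkins runs the suite on every commit.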
Manser, Paul. "Methods for Integrative Analysis of Genomic Data." VCU Scholars Compass, 2014. http://scholarscompass.vcu.edu/etd/3638.
Ming, Jingsi. "Statistical methods for integrative analysis of genomic data." HKBU Institutional Repository, 2018. https://repository.hkbu.edu.hk/etd_oa/545.
Lysenko, Artem. "Integration strategies and data analysis methods for plant systems biology." Thesis, University of Nottingham, 2012. http://eprints.nottingham.ac.uk/27798/.
Gao, Yang. "On the integration of qualitative and quantitative methods in data fusion." Thesis, University of Oxford, 1994. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.240463.
Khalilikhah, Majid. "Traffic Sign Management: Data Integration and Analysis Methods for Mobile LiDAR and Digital Photolog Big Data." DigitalCommons@USU, 2016. https://digitalcommons.usu.edu/etd/4744.
Weirauch, Matthew T. "Data integration methods for systems-level investigation of gene functional association networks /." Diss., Digital Dissertations Database. Restricted to UC campuses, 2009. http://uclibs.org/PID/11984.
Constantinescu, Emil Mihai. "Adaptive Numerical Methods for Large Scale Simulations and Data Assimilation." Diss., Virginia Tech, 2008. http://hdl.handle.net/10919/27938.
Ph. D.
Choo, Jae gul. "Integration of computational methods and visual analytics for large-scale high-dimensional data." Diss., Georgia Institute of Technology, 2013. http://hdl.handle.net/1853/49121.
Jeanmougin, Marine. "Statistical methods for robust analysis of transcriptome data by integration of biological prior knowledge." Thesis, Evry-Val d'Essonne, 2012. http://www.theses.fr/2012EVRY0029/document.
Recent advances in Molecular Biology have led biologists toward high-throughput genomic studies. In particular, the investigation of the human transcriptome offers unprecedented opportunities for understanding cellular and disease mechanisms. In this PhD, we focus on providing robust statistical methods dedicated to the treatment and analysis of high-throughput transcriptome data. We discuss the differential analysis approaches available in the literature for identifying genes associated with a phenotype of interest and propose a comparison study. We provide practical recommendations on the appropriate method to be used, based on various simulation models and real datasets. With the eventual goal of overcoming the inherent instability of differential analysis strategies, we have developed an innovative approach called DiAMS, for DIsease Associated Modules Selection. This method was applied to select significant modules of genes rather than individual genes and involves the integration of both transcriptome and protein interaction data in a local-score strategy. We then focus on the development of a framework to infer gene regulatory networks by integration of a biologically informative prior over network structures using Gaussian graphical models. This approach offers the possibility of exploring the molecular relationships between genes, leading to the identification of altered regulations potentially involved in disease processes. Finally, we apply our statistical developments to study the metastatic relapse of breast cancer.
Aich, Sudipto. "Evaluation of Driver Performance While Making Unprotected Intersection Turns Utilizing Naturalistic Data Integration Methods." Thesis, Virginia Tech, 2011. http://hdl.handle.net/10919/76892.
Master of Science
HÅKANSSON, MICHAEL. "Matching Methods for Information Sharing with Supply Chain Context." Thesis, KTH, Industriell Management, 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-199188.
A company's productivity and competitiveness depend on its ability to manage information. With the technology available today, the possibilities for collecting and processing information are better than ever. One of the industries that has proven to benefit greatly from analyzing large amounts of information is retail. However, the information must be received before it can be analyzed, which often means that it has to flow between members of a supply chain. The purpose of this study was to investigate which methods are suitable for sharing information between suppliers and retailers. The investigation was carried out as a case study in the Swedish sporting goods industry, examining the information-sharing relationship between one supplier and seven of its customers. The information-sharing methods studied were manual document handling, web portals, and a third-party EDI service. The EDI solution benefits both parties but is not always applicable. If resources are scarce for both communicating parties and no technical solution for sharing information is in place, the manual method is a suitable short-term solution. If a party with large resources frequently shares information with parties that cannot invest in information-sharing solutions, a portal can be a suitable compromise: it provides efficiency gains to the company investing in the portal, while the other parties can continue to provide information manually.
Wu, Chao-Min. "Computational Methods for Integrating Different Anatomical Data Sets of The Human Tongue /." The Ohio State University, 1996. http://rave.ohiolink.edu/etdc/view?acc_num=osu148793324553722.
Potter, Dustin Paul. "A combinatorial approach to scientific exploration of gene expression data: An integrative method using Formal Concept Analysis for the comparative analysis of microarray data." Diss., Virginia Tech, 2005. http://hdl.handle.net/10919/28792.
Ph. D.
Golbamaki, Bakhtyari Azadi. "Integration of toxicity data from experiments and non-testing methods within a weight of evidence procedure." Thesis, Open University, 2018. http://oro.open.ac.uk/55615/.
Kromanis, Rolands. "Structural performance evaluation of bridges : characterizing and integrating thermal response." Thesis, University of Exeter, 2015. http://hdl.handle.net/10871/17440.
Bylesjö, Max. "Latent variable based computational methods for applications in life sciences : Analysis and integration of omics data sets." Doctoral thesis, Umeå universitet, Kemi, 2008. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-1616.
Functional genomics is a research field with the ultimate goal of characterizing all genes in the genome of an organism. This includes studies of how DNA is transcribed into mRNA, how mRNA is translated into proteins, and how these proteins interact and affect the organism's biochemical processes. The traditional approach has been to study the function, regulation, and translation of one gene at a time. New technology in the field, however, has enabled studies of how thousands of transcripts, proteins, and small molecules behave jointly in an organism at a given time or over time. In practice, this also means that large amounts of data are generated even from small, isolated experiments. Finding global trends and extracting useful information from such data sets is a non-trivial computational problem that requires advanced, interpretable mathematical models. This thesis describes the development and application of computational methods for classifying and integrating large amounts of empirical (measured) data. Common to all the methods is that they are based on latent variables: variables that are not measured directly but computed from other, observed variables. This concept is well suited to studies of complex systems that can be described by a few independent factors characterizing the main properties of the system, which is typical of many chemical and biological systems. The methods described in the thesis are general but mainly developed for, and applied to, data from biological experiments. The thesis demonstrates how these methods can be used to find complex relationships between measured data and other factors of interest, without losing the properties of the method that are critical for interpreting the results.
The methods are applied to find shared and unique properties of transcript regulation and how these are affected by, and affect, small molecules in the poplar tree. In addition, a larger experiment in poplar is described in which the relationships between levels of transcripts, proteins, and small molecules are investigated with the developed methods.
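The latent-variable idea running through the thesis summarized above, a few computed variables capturing the dominant factors of many observed ones, can be illustrated by extracting a first principal component with power iteration. This is a generic sketch, not the specific multi-block methods developed in the thesis:

```python
def column_means(X):
    """Per-column means of a data matrix given as a list of rows."""
    n = len(X)
    return [sum(row[j] for row in X) / n for j in range(len(X[0]))]

def first_latent_variable(X, iters=100):
    """Leading principal direction (loadings) and per-sample scores of a
    data matrix X (rows = samples), found by power iteration on X^T X."""
    means = column_means(X)
    Xc = [[x - m for x, m in zip(row, means)] for row in X]  # mean-center
    p = len(Xc[0])
    w = [1.0] * p  # initial loading vector
    for _ in range(iters):
        # t = Xc w (scores), then w = Xc^T t, normalized
        t = [sum(row[j] * w[j] for j in range(p)) for row in Xc]
        w = [sum(t[i] * Xc[i][j] for i in range(len(Xc))) for j in range(p)]
        norm = sum(v * v for v in w) ** 0.5
        w = [v / norm for v in w]
    scores = [sum(row[j] * w[j] for j in range(p)) for row in Xc]
    return w, scores
```

The scores are the latent variable: one number per sample summarizing the direction of greatest variation across all measured variables.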
Bylesjö, Max. "Latent variable based computational methods for applications in life sciences : Analysis and integration of omics data sets /." Umeå : Chemistry Kemi, 2008. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-1616.
Kaden, Marika. "Integration of Auxiliary Data Knowledge in Prototype Based Vector Quantization and Classification Models." Doctoral thesis, Universitätsbibliothek Leipzig, 2016. http://nbn-resolving.de/urn:nbn:de:bsz:15-qucosa-206413.
Varatharajah, Thujeepan. "Integrating UCD with Agile Methods : From the perspective of UX-Designers." Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-262705.
As Agile methods grow in popularity in software development projects, the question arises of how Agile work integrates user-centered requirements into its process, an area that is the focus of User-Centered Design (UCD). Available reports indicate that integrating Agile and UCD has improved both the process and the end product, and that the two processes are compatible with each other. However, there is considered to be a lack of guidelines on how to integrate the two processes, and further studies on the subject are called for. This study aims to offer exactly that by presenting some factors regarding how Agile and UCD can be integrated in practice, as well as examples of factors that can affect how well the integration succeeds. This is derived from an empirical study drawing on insights from UX designers working in different Scrum projects. Ten UX designers participated in semi-structured interviews, and based on a thematic analysis, the results are presented as suggested factors to consider when integrating Agile and UCD methods.
Blankenburg, Hagen [Verfasser], and Mario [Akademischer Betreuer] Albrecht. "Computational methods for integrating and analyzing human systems biology data / Hagen Blankenburg. Betreuer: Mario Albrecht." Saarbrücken : Saarländische Universitäts- und Landesbibliothek, 2014. http://d-nb.info/1062535944/34.
Goodbrand, Alan D. "Integrating informal and formal requirements methods, a practical approach for systems employing spatially referenced data." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 2001. http://www.collectionscanada.ca/obj/s4/f2/dsk3/ftp04/MQ64954.pdf.
Myers, Robert J. "Problem-based learning: a case study in integrating teachers, students, methods, and hypermedia data bases." Diss., Virginia Tech, 1993. http://hdl.handle.net/10919/40302.
Lucchi, Francesca <1984>. "Reverse Engineering tools: development and experimentation of innovative methods for physical and geometrical data integration and post-processing." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2013. http://amsdottorato.unibo.it/5837/.
The use of Reverse Engineering techniques has spread and consolidated widely in recent years, to the point that these systems are commonly employed in numerous applications. Accordingly, much research activity is devoted to analyzing the acquired data in terms of accuracy and precision and to defining innovative post-processing techniques. In this context, the research presented in this doctoral thesis is aimed at defining two methodologies: one intended to facilitate data-processing operations, the other to enable straightforward data fusion between physical and geometrical information about the same object. In particular, the first approach identifies the error component in the coordinates of points acquired with an optical triangulation scanning system. A suitable correction matrix for the systematic component was determined, depending on the operating conditions and the acquisition parameters of the system. This improved the system's performance by increasing the accuracy of the acquired data. The second research topic addressed in this thesis is the integration of geometric data from a 3D scan with temperature information obtained by a thermographic survey. A 3D thermogram was thus obtained by registering the corresponding temperature value onto each acquired point. The geometric information from the laser scan was also used to normalize the thermogram, making it independent of the thermographic viewpoint.
Heine, Jennifer Miers. "Staff Development Methods for Planning Lessons with Integrated Technology." Thesis, University of North Texas, 2002. https://digital.library.unt.edu/ark:/67531/metadc3343/.
Feitosa Neto, José Alencar. "Estimação do erro em redes de sensores sem fios." Universidade Federal de Alagoas, 2008. http://repositorio.ufal.br/handle/riufal/815.
We present wireless sensor networks in the context of information acquisition and propose a generic model based on the processes of signal sampling and reconstruction. Using this model, we define a performance measure for network operation in terms of the signal reconstruction error. Given the analytical complexity of computing this error in different scenarios, we propose and implement a Monte Carlo experiment that quantitatively assesses the contribution of several factors to the performance of a wireless sensor network. These factors are (i) the spatial distribution of the sensors, (ii) the granularity of the phenomenon under observation, (iii) the way the sensors sample the phenomenon (characteristic functions constant over Voronoi cells and over circles), (iv) the communication characteristics between sensors (by adjacency of Voronoi cells and by communication radius), (v) the clustering and aggregation algorithms (LEACH and SKATER), and (vi) the reconstruction techniques (Voronoi and Kriging). The results show that all of these factors significantly influence the performance of a wireless sensor network, and the methodology made it possible to measure this influence in all the scenarios considered.
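The Voronoi-based reconstruction evaluated in the dissertation above amounts to giving every point of the field the reading of its nearest sensor, with the reconstruction error as the performance measure. A minimal sketch, assuming a hypothetical data layout where each sensor is an (x, y, reading) tuple:

```python
def voronoi_reconstruct(sensors, query_points):
    """Piecewise-constant reconstruction: each query point takes the
    reading of its nearest sensor, i.e. is constant over Voronoi cells."""
    def nearest_reading(q):
        sx, sy, reading = min(
            sensors,
            key=lambda s: (s[0] - q[0]) ** 2 + (s[1] - q[1]) ** 2)
        return reading
    return [nearest_reading(q) for q in query_points]

def rms_error(truth, estimate):
    """Root-mean-square reconstruction error over the sampled field."""
    n = len(truth)
    return (sum((t - e) ** 2 for t, e in zip(truth, estimate)) / n) ** 0.5
```

A Monte Carlo experiment in this spirit would redraw sensor positions and the underlying field many times and average the resulting error.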
Singh, Nitesh Kumar [Verfasser]. "Integrating diverse biological sources and computational methods for the analysis of high-throughput expression data / Nitesh Kumar Singh." Greifswald : Universitätsbibliothek Greifswald, 2014. http://d-nb.info/1060136937/34.
Kramer, Frank [Verfasser], Tim [Akademischer Betreuer] Beißbarth, and Stephan [Akademischer Betreuer] Waack. "Integration of Pathway Data as Prior Knowledge into Methods for Network Reconstruction / Frank Kramer. Gutachter: Tim Beißbarth ; Stephan Waack. Betreuer: Tim Beißbarth." Göttingen : Niedersächsische Staats- und Universitätsbibliothek Göttingen, 2014. http://d-nb.info/105990764X/34.
Meng, Chen [Verfasser], Bernhard [Akademischer Betreuer] Küster, and Dmitrij [Akademischer Betreuer] Frischmann. "Application of multivariate methods to the integrative analysis of high-throughput omics data / Chen Meng. Betreuer: Bernhard Küster. Gutachter: Bernhard Küster ; Dmitrij Frischmann." München : Universitätsbibliothek der TU München, 2016. http://d-nb.info/1082347299/34.
Ayllón-Benítez, Aarón. "Development of new computational methods for a synthetic gene set annotation." Thesis, Bordeaux, 2019. http://www.theses.fr/2019BORD0305.
The revolution in new sequencing technologies, by strongly improving the production of omics data, is leading to new understandings of the relations between genotype and phenotype. To interpret and analyze data grouped according to a phenotype of interest, methods based on statistical enrichment have become a standard in biology. However, these methods synthesize the biological information by a priori selecting the over-represented terms and focus on the most studied genes, which may represent a limited coverage of annotated genes within a gene set. During this thesis, we explored different methods for annotating gene sets. In this context, we developed three studies allowing the annotation of gene sets and thus improving the understanding of their biological context. First, visualization approaches were applied to represent annotation results provided by enrichment analysis for a gene set or a repertoire of gene sets. In this work, a visualization prototype called MOTVIS (MOdular Term VISualization) was developed to provide an interactive representation of a repertoire of gene sets combining two visual metaphors: a treemap view that provides an overview and also displays detailed information about gene sets, and an indented tree view that can be used to focus on the annotation terms of interest. MOTVIS has the advantage of overcoming the limitations of each visual metaphor when used individually. This illustrates the value of using different visual metaphors to facilitate the comprehension of biological results by representing complex data. Secondly, to address the issues of enrichment analysis, a new method for analyzing the impact of using different semantic similarity measures on gene set annotation was proposed.
To evaluate the impact of each measure, two criteria were considered for characterizing a "good" synthetic gene set annotation: (i) the number of annotation terms has to be drastically reduced while maintaining a sufficient level of detail, and (ii) the number of genes described by the selected terms should be as large as possible. Thus, nine semantic similarity measures were analyzed to identify the best possible compromise between both criteria. Using GO to annotate the gene sets, we observed better results with node-based measures, which use the terms' characteristics, than with edge-based measures, which use the relations between terms. The annotation of the gene sets achieved with the node-based measures did not exhibit major differences regardless of the characteristics of the terms used. Then, we developed GSAn (Gene Set Annotation), a novel gene set annotation web server that uses semantic similarity measures to synthesize a priori GO annotation terms. GSAn contains the interactive visualization MOTVIS, dedicated to visualizing the representative terms of gene set annotations. Compared to enrichment analysis tools, GSAn has shown excellent results in terms of maximizing the gene coverage while minimizing the number of terms. Finally, the third work consisted of enriching the annotation results provided by GSAn. Since the knowledge described in GO may not be sufficient for interpreting gene sets, other biological information, such as pathways and diseases, may be useful for providing a wider biological context. Thus, two additional knowledge resources, Reactome and the Disease Ontology (DO), were integrated within GSAn. In practice, GO terms were mapped to terms of Reactome and DO, before and after applying the GSAn method. The integration of these resources improved the results in terms of gene coverage without significantly affecting the number of terms involved.
Two strategies were applied to find mappings (generated or extracted from the web) between each new resource and GO. We have shown that performing the mapping before computing the GSAn method made it possible to obtain a larger number of inter-relations between the two knowledge resources.
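Node-based semantic similarity measures of the kind compared in this work score two ontology terms via the information content of their shared ancestors. A toy sketch of one such measure (Lin similarity) over a hypothetical four-term hierarchy with made-up annotation probabilities; the actual work evaluates several measures over the full GO:

```python
import math

# Toy GO-like hierarchy (child -> parents) with made-up annotation
# probabilities; both are hypothetical, for illustration only.
PARENTS = {
    "root": [],
    "metabolic_process": ["root"],
    "catabolic_process": ["metabolic_process"],
    "biosynthetic_process": ["metabolic_process"],
}
PROB = {"root": 1.0, "metabolic_process": 0.5,
        "catabolic_process": 0.1, "biosynthetic_process": 0.2}

def ancestors(term):
    """A term together with all of its ancestors in the hierarchy."""
    seen, stack = set(), [term]
    while stack:
        t = stack.pop()
        if t not in seen:
            seen.add(t)
            stack.extend(PARENTS[t])
    return seen

def ic(term):
    """Information content: rarer terms are more informative."""
    return -math.log(PROB[term])

def lin_similarity(t1, t2):
    """Lin's node-based measure: IC of the most informative common
    ancestor, normalized by the ICs of the two terms."""
    mica_ic = max(ic(t) for t in ancestors(t1) & ancestors(t2))
    denom = ic(t1) + ic(t2)
    return 2 * mica_ic / denom if denom else 1.0
```

An annotation synthesizer can then cluster or replace terms whose pairwise similarity exceeds a threshold, shrinking the term list while keeping gene coverage.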
Melamid, Elan. "What works? integrating multiple data sources and policy research methods in assessing need and evaluating outcomes in community-based child and family service systems /." Santa Monica, Calif. : RAND, 2002. http://www.rand.org/publications/RGSD/RGSD161/RGSD161.pdf.
Chen, Kai. "Mitigating Congestion by Integrating Time Forecasting and Realtime Information Aggregation in Cellular Networks." FIU Digital Commons, 2011. http://digitalcommons.fiu.edu/etd/412.
Full text of the source
Arnold, Matthias [Verfasser], Hans-Werner [Akademischer Betreuer] [Gutachter] Mewes, Florian [Gutachter] Kronenberg, and Fabian J. [Gutachter] Theis. "Supporting the evidence for human trait-associated genetic variants by computational biology methods and multi-level data integration. / Matthias Arnold ; Gutachter: Florian Kronenberg, Fabian J. Theis, Hans-Werner Mewes ; Betreuer: Hans-Werner Mewes." München : Universitätsbibliothek der TU München, 2016. http://d-nb.info/1113749164/34.
Full text of the source
Ruppel, Antonia [Verfasser], Frank [Akademischer Betreuer] Lisker, Frank [Gutachter] Lisker, and Laura [Gutachter] Crispini. "A multi-method approach to study the geodynamic evolution of eastern Dronning Maud Land in East Antarctica by integrating geophysical data with surface geology / Antonia Ruppel ; Gutachter: Frank Lisker, Laura Crispini ; Betreuer: Frank Lisker." Bremen : Staats- und Universitätsbibliothek Bremen, 2019. http://d-nb.info/1194156746/34.
Full text of the source
Obeidat, Laith Mohammad. "Enhancing the Indoor-Outdoor Visual Relationship: Framework for Developing and Integrating a 3D-Geospatial-Based Inside-Out Design Approach to the Design Process." Diss., Virginia Tech, 2020. http://hdl.handle.net/10919/97726.
Full text of the source
Doctor of Philosophy
Achieving a well-designed visual connection to one's surroundings is considered by many philosophers and theorists to be an essential aspect of our spatial experience within built environments. The goal of this research is to help designers achieve better visual connections to the outside environment and thereby create more meaningful spatial experiences within the built environment. It aims to enhance designers' ability to explore the best possible views and to make design decisions that frame these views of the outdoors from inside their buildings. The physical presence of designers at a building site has been the traditional method of determining the best views; however, this is not always possible during the design process. This research therefore seeks an effective alternative to visiting the site for informing each design decision about the quality of its visual connection to the outdoors. To do so, it develops an inside-out design approach to be integrated into the design process: a workflow in which designers are digitally immersed within an accurate 3D representation of the surrounding context, allowing them to explore views from multiple angles inside the space and, in response, make the most suitable design decisions. To further develop the proposed process, it was used during this research to design an art museum on the Virginia Tech campus.
Obrocki, Lea Marit [Verfasser]. "Advances in geoarchaeological site formation research by integrating geophysical methods, direct push sensing techniques and stratigraphic borehole data - case studies from central Europe and the western Peloponnese around ancient Olympia - / Lea Marit Obrocki." Mainz : Universitätsbibliothek Mainz, 2019. http://d-nb.info/118923730X/34.
Повний текст джерелаWang, Yuepeng. "Integrative methods for gene data analysis and knowledge discovery on the case study of KEDRI's brain gene ontology a thesis submitted to Auckland University of Technology in partial fulfilment of the requirements for the degree of Master of Computer and Information sciences, 2008 /." Click here to access this resource online, 2008. http://hdl.handle.net/10292/467.
Повний текст джерелаLemaréchal, Jean-Didier. "Estimation des propriétés dynamiques des réseaux cérébraux à large échelle par modèles de masse neurale de potentiels évoqués cortico-corticaux Comparison of two integration methods for dynamic causal modeling of electrophysiological data. NeuroImage An atlas of neural parameters based on dynamic causal modeling of cortico-cortical evoked potentials." Thesis, Université Grenoble Alpes, 2020. http://www.theses.fr/2020GRALS007.
Full text of the source
This thesis models cortico-cortical evoked potentials (CCEPs) induced by intracortical direct electrical stimulation in epileptic patients, recorded with stereo-electroencephalography during epilepsy surgery. Neural mass models implemented within the dynamic causal modeling (DCM) framework are used for this purpose. We first demonstrate the importance of using an accurate integration scheme to solve the system of differential equations governing the global dynamics of the model, in particular to obtain precise estimates of its neuronal parameters (Lemaréchal et al., 2018). In a second study, this methodology is applied to a large dataset from the F-TRACT project. The axonal conduction delays and speeds between brain regions, as well as the local synaptic time constants, are estimated, and their spatial mapping is obtained based on validated cortical parcellation schemes. Notably, the large amount of data included in this study allowed us to highlight differences in brain dynamics between younger and older populations (Lemaréchal et al., submitted). Finally, in the Bayesian context of DCM, we show that an atlas of connectivity can improve the specification and estimation of a neural mass model for electroencephalographic and magnetoencephalographic studies by providing a priori distributions on the model's connectivity parameters. In sum, this work provides novel insights into the dynamical properties of cortico-cortical interactions. The publication of our results in the form of an atlas of neuronal properties already provides an effective tool for a better specification of whole-brain neuronal models.
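The first study's point, that the choice of integration scheme affects the simulated dynamics and hence the parameter estimates, can be illustrated outside DCM with a toy second-order synaptic kernel of the Jansen-Rit type. All constants and the step sizes below are hypothetical, chosen only to compare a fixed-step Euler scheme against fourth-order Runge-Kutta:

```python
import numpy as np

# A synaptic kernel of the kind used in neural mass models obeys
#   v'' = H*k*u(t) - 2*k*v' - k**2 * v
# where u(t) is the input firing rate. Hypothetical constants:
H, k = 3.25e-3, 100.0  # synaptic gain (V), rate constant (1/s)

def deriv(state, u):
    v, dv = state
    return np.array([dv, H * k * u - 2 * k * dv - k**2 * v])

def euler(state, u, dt):
    """One forward-Euler step (first-order accurate)."""
    return state + dt * deriv(state, u)

def rk4(state, u, dt):
    """One classical Runge-Kutta step (fourth-order accurate)."""
    k1 = deriv(state, u)
    k2 = deriv(state + dt / 2 * k1, u)
    k3 = deriv(state + dt / 2 * k2, u)
    k4 = deriv(state + dt * k3, u)
    return state + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

def integrate(step, dt=1e-3, T=0.1):
    """Drive the kernel with a one-step impulse and record v(t)."""
    state = np.array([0.0, 0.0])
    out = []
    for i in range(int(T / dt)):
        u = 1.0 if i == 0 else 0.0
        state = step(state, u, dt)
        out.append(state[0])
    return np.array(out)

v_euler, v_rk4 = integrate(euler), integrate(rk4)
# At a coarse step the two trajectories visibly disagree, which is why
# parameter estimates fitted against such simulations would also differ.
print(np.max(np.abs(v_euler - v_rk4)))
```

Shrinking `dt` makes the two trajectories converge, at a proportional cost in simulation time, which is the accuracy/cost trade-off the thesis examines in the DCM setting.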
Qasim, Lara. "System reconfiguration : A Model based approach; From an ontology to the methodology bridging engineering and operations Model-Based System Reconfiguration: A Descriptive Study of Current Industrial Challenges Towards a reconfiguration framework for systems engineering integrating use phase data An overall Ontology for System Reconfiguration using Model-Based System Engineering An Ontology for System Reconfiguration: Integrated Modular Avionics IMA Case Study A model-based method for System Reconfiguration." Thesis, université Paris-Saclay, 2020. http://www.theses.fr/2020UPAST031.
Full text of the source
System evolutions have to be managed to ensure system effectiveness and efficiency throughout the whole lifecycle, particularly for complex systems that take years of development and decades of usage. System reconfiguration is key in complex systems management, as it enables system flexibility and adaptability with respect to system evolutions. It ensures operational effectiveness and increases system qualities (e.g., reliability, availability, safety, and usability). This research has been conducted in the context of a large international aerospace, space, ground transportation, defense, and security company, and aims to support system reconfiguration during operations. First, we conducted a descriptive study based on a field study and a literature review to identify the industrial challenges related to system reconfiguration. The main issue lies in the development of reconfiguration support; more specifically, challenges related to data identification and integration were identified. In this thesis, we present the OSysRec ontology, which captures and formalizes reconfiguration data by synthesizing the structure, dynamics, and management aspects necessary to support the system reconfiguration process in a holistic manner. Furthermore, we present a model-based method (MBSysRec) that integrates system reconfiguration data and bridges the engineering and operational phases. MBSysRec is a multidisciplinary method that combines combinatorial configuration generation with a multi-criteria decision-making method for configuration evaluation and selection. This thesis is a step towards a model-based approach to the reconfiguration of evolving systems, ensuring their flexibility and adaptability.
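The abstract names combinatorial configuration generation followed by multi-criteria evaluation, but does not specify which decision-making method MBSysRec uses; a common simple choice is a weighted-sum model. A sketch with entirely hypothetical components, criteria, scores, and weights:

```python
from itertools import product

# Hypothetical alternatives for each slot of the system.
COMPONENTS = {
    "sensor": ["radar", "lidar"],
    "link":   ["sat", "radio"],
}
# Hypothetical per-part criterion scores, normalized to [0, 1].
SCORES = {
    "radar": {"reliability": 0.9, "availability": 0.6},
    "lidar": {"reliability": 0.7, "availability": 0.8},
    "sat":   {"reliability": 0.8, "availability": 0.5},
    "radio": {"reliability": 0.6, "availability": 0.9},
}
# Stakeholder weights over the criteria (must sum to 1).
WEIGHTS = {"reliability": 0.7, "availability": 0.3}

def score(config):
    """Weighted-sum score of a configuration, averaged over its parts."""
    return sum(
        WEIGHTS[c] * SCORES[part][c]
        for part in config for c in WEIGHTS
    ) / len(config)

# Combinatorial generation: every combination of one part per slot,
# then selection of the best-scoring configuration.
configs = list(product(*COMPONENTS.values()))
best = max(configs, key=score)
print(best, round(score(best), 3))  # -> ('radar', 'sat') 0.76
```

In practice the generation step would also filter out configurations violating compatibility constraints before scoring, and richer methods (e.g., AHP or TOPSIS) can replace the weighted sum without changing this overall generate-then-select structure.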
Chen, Jiuqiang. "Designing scientific workflows following a structure and provenance-aware strategy." Phd thesis, Université Paris Sud - Paris XI, 2013. http://tel.archives-ouvertes.fr/tel-00931122.
Full text of the source