Theses on the topic "Data loading"

Consult the top 47 theses for your research on the topic "Data loading".

1

Mack, Moritz. "Loading and Querying Data on Distributed Virtualized Web Application Servers". Thesis, Uppsala University, Department of Information Technology, 2009. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-108039.

Abstract

Virtualized web application servers within computational clouds, such as the Google App Engine, generally restrict resource usage and therefore provide only limited, relationally non-complete query facilities. This work investigates how scalable, reliable and more powerful access to the App Engine Datastore can be accomplished, and an Optimized Distributed Datastore Exchange (ODDSE) is presented. Aware of the App Engine's resource restrictions, ODDSE provides a reliable and failure-safe query interface to transparently exchange data with the distributed Datastore using full SQL or AmosQL. ODDSE wraps Datastore relations and utilizes the extensible database system Amos II to compensate for missing query facilities in Google's relationally non-complete query language GQL. Under the covers, ODDSE furthermore implements adaptive and reliable management of interactions with App Engine servers. For scalability and high performance the interaction is parallelized and continuously adapted. The performance of ODDSE is evaluated and compared to a similar system, showing considerably good results for both bulk uploading and query processing.

2

Gustafsson, Markus. "A Server for ARINC 615A Loading". Thesis, Linköpings universitet, Institutionen för datavetenskap, 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-100443.

Abstract
The development of the next generation of Saab's multirole combat aircraft, the JAS 39 Gripen, involves developing a great deal of new software for the aircraft's on-board computers. The new software has to be loaded into these computers routinely in order to test it, which is currently a slow and tedious process. The computers are loaded using a protocol defined in the ARINC 615A standard. Today Saab uses commercial software applications implementing this protocol for the loading process. These applications have significant disadvantages, such as not being able to load several computers in parallel, or only being able to load computers from a specific manufacturer. In this thesis we introduce a system using a client-server architecture that allows users to load the computers in parallel, regardless of the manufacturer. In Section 3.2.2 we show that our system significantly reduces the time required to load the aircraft's on-board computers. We also indicate some improvements that can be made to speed up the process even further; these mainly involve improving the on-board computers themselves through faster persistent storage and support for later revisions of the protocol.
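
The speed-up claimed here comes from dispatching load sessions to several on-board computers concurrently instead of one at a time. Below is a minimal sketch of that dispatch pattern, assuming a hypothetical upload_to_target function in place of a real ARINC 615A load session; the target names and payload are placeholders.

```python
# Sketch of parallel load dispatch; upload_to_target is a hypothetical
# stand-in for a real ARINC 615A load session.
from concurrent.futures import ThreadPoolExecutor

def upload_to_target(target: str, image: bytes) -> str:
    # A real implementation would run the ARINC 615A protocol here.
    return f"{target}: loaded {len(image)} bytes"

def load_all(targets, image):
    # One worker per computer: total wall time approaches the slowest
    # single load instead of the sum of all loads.
    with ThreadPoolExecutor(max_workers=len(targets)) as pool:
        return list(pool.map(lambda t: upload_to_target(t, image), targets))

print(load_all(["computer-1", "computer-2", "computer-3"], b"\x00" * 1024))
```
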
3

Carlo, Gilles. "Dynamic loading and class management in a distributed actor system". Master's thesis, 1993. http://scholar.lib.vt.edu/theses/available/etd-04272010-020040/.

4

Entesar, Abdullah Ali. "Statistical analysis of truck loading on Swedish highways". Thesis, KTH, Transportvetenskap, 2011. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-45980.

Abstract
Vehicle overloading, or single-axle overloading, is one of the major causes of pavement deterioration. Trafikverket, the Swedish Transport Administration, recognized that the current process for estimating traffic volume should be re-evaluated and, if possible, improved. This degree project uses data from the Bridge Weigh-in-Motion (BWIM) system to study the actual loads applied to Swedish highways. The axle load spectrum is plotted with conventional frequency distribution plots and with a new cumulative distribution approach. The paper introduces the maximum allowable potential vehicle weight (MAPVW) concept and uses this visual technique to identify overloads for different vehicle geometries. The paper concludes that for 5- and 6-axle trucks the triple axle is frequently overloaded, while for longer trucks one of the dual axles is often overloaded. The highest overloads tend to be on the driving axle, suggesting incorrect loading procedures.
5

Ahmed, Ejaz. "A grid enabled staging DBMS method for data Mapping, Matching & Loading". Thesis, University of Bedfordshire, 2011. http://hdl.handle.net/10547/204951.

Abstract
This thesis is concerned with the need to deal with data anomalies, inconsistencies and redundancies within the context of data integration in grids. A data Mapping, Matching and Loading (MML) process based on the Grid Staging Catalogue Service (MML-GSCATS) method is identified. In particular, the MML-GSCATS method consists of the development of two mathematical algorithms for the MML processes. Specifically, it defines an intermediate data storage staging facility in order to process, upload and integrate data from various small- to large-size data repositories. With this in mind, it expands the integration notion of a database management system (DBMS) to include the MML-GSCATS method in traditional distributed and grid environments. The data mapping employed takes the form of value correspondences between source and target databases, whilst data matching consolidates distinct catalogue schemas of federated databases to access information seamlessly. Because there is a need to deal with anomalies and inconsistencies in the grid, the MML processes are applied using a healthcare case study with developed scenarios. These scenarios were used to test the MML-GSCATS method with the help of a software prototyping toolkit. Testing set benchmarks for performance, reliability and error detection (anomalies and redundancies). Cross-scenario data sets were formulated and the results of the scenarios were compared with the benchmarks. These benchmarks help in comparing the MML-GSCATS method with traditional and current grid methods. Results from the testing and experiments demonstrate that MML-GSCATS is a valid method for identifying data anomalies, inconsistencies and redundancies produced during loading. Testing results indicate that MML-GSCATS performs better than traditional methods.
6

Trumstedt, Karl. "Evaluation of methods for loading and processing of measurement data in Oracle". Thesis, KTH, Skolan för datavetenskap och kommunikation (CSC), 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-181305.

7

Gao, Fei. "Modeling human attention and performance in automated environments with low task loading". Thesis (Ph. D.), Massachusetts Institute of Technology, 2016. http://hdl.handle.net/1721.1/106592.

Abstract
Automation has the benefit of reducing human operators' workload. By leveraging the power of computers and information technology, the work of human operators is becoming easier. However, when the workload is too low but the human is required to be present either by regulation or due to limitations of automation, human performance can be negatively affected. Negative consequences such as distraction, mind wandering, and inattention have been reported across many high risk settings including unmanned aerial vehicle operation, process control plant supervision, train engineers, and anesthesiologists. Because of the move towards more automated systems in the future, a better understanding is needed to enable intervention and mitigation of possible negative impacts. The objectives of this research are to systematically investigate the attention and performance of human operators when they interact with automated systems under low task load, build a dynamic model and use it to facilitate system design. A systems-based framework, called the Boredom Influence Diagram, was proposed to better understand the relationships between the various influences and outcomes of low task loading. A System Dynamics model, named the Performance and Attention with Low-task-loading (PAL) Model, was built based on this framework. The PAL model captures the dynamic changes of task load, attention, and performance over time in long duration low task loading automated environments. In order to evaluate the replication and prediction capability of the model, three dynamic hypotheses were proposed and tested using data from three experiments. The first hypothesis stated that attention decreases under low task load. This was supported by comparing model outputs with data from an experiment of target searching using unmanned vehicles. Building on Hypothesis 1, the second and third hypotheses examined the impact of decreased attention on performance in responding to an emergency event. Hypothesis 2 was examined by comparing model outputs with data from an experiment of accident response in nuclear power plant monitoring. Results showed that performance is worse with lower attention levels. Hypothesis 3 was tested by comparing model outputs with data from an experiment of defensive target tracking. The results showed that the impact of decreased attention on performance was larger when the task was difficult. The process of testing these three hypotheses shows that the PAL model is a generalized theory that could explain behaviors under low task load in different supervisory control settings. Finally, benefits, limitations, generalizability and applications of the PAL model were evaluated. Further research is needed to improve and extend the PAL model, investigate individual differences to facilitate personnel selection, and develop system and task designs to mitigate negative consequences.
8

Deng, Yanbo. "Using web services for customised data entry". Master's thesis, Lincoln University. Environment, Society and Design Division, 2007. http://theses.lincoln.ac.nz/public/adt-NZLIU20080313.185408/.

Abstract
Scientific databases often need to be accessed from a variety of different applications. There are usually many ways to retrieve and analyse data already in a database. However, it can be more difficult to enter data which has originally been stored in different sources and formats (e.g. spreadsheets, other databases, statistical packages). This project focuses on investigating a generic, platform independent way to simplify the loading of databases. The proposed solution uses Web services as middleware to supply essential data management functionality such as inserting, updating, deleting and retrieval of data. These functions allow application developers to easily customise their own data entry applications according to local data sources, formats and user requirements. We implemented a Web service to support loading data to the Germinate database at the New Zealand Institute of Crop & Food Research (CFR). We also provided language specific client toolkits to help developers invoke the Web service. The toolkits allow applications to be easily customised for different platforms. In addition, we developed sample applications to help end users load data from their project data sources via the Web service. The Web service approach was evaluated through user and developer trials. The feedback from the developer trial showed that using Web services as middleware is a useful approach to allow developers and competent end users to customise data entry with minimal effort. More importantly, the customised client applications enabled end users to load data directly from their project spreadsheets and databases. It significantly reduced the effort required for exporting or transforming the source data.
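
The middleware idea described here, with client applications pushing local records through a Web service that owns the validation and insert logic, can be sketched as follows. The endpoint URL, payload shape and use of JSON over HTTP are illustrative assumptions; the thesis's actual Germinate service and client toolkits are not reproduced.

```python
# Sketch of a data-entry client that loads spreadsheet rows through a
# Web-service middleware layer. Endpoint and field names are hypothetical.
import csv
import requests

SERVICE_URL = "https://example.org/germinate/records"  # hypothetical endpoint

def load_rows(csv_path: str) -> int:
    """Post each CSV row to the service; the service performs the insert."""
    loaded = 0
    with open(csv_path, newline="") as fh:
        for row in csv.DictReader(fh):
            resp = requests.post(SERVICE_URL, json=row, timeout=10)
            resp.raise_for_status()   # surface validation/insert failures
            loaded += 1
    return loaded
```
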
9

Melcher, Anthony A. "Estimating Suspended Solids and Phosphorus Loading in Urban Stormwater Systems Using High-Frequency, Continuous Data". DigitalCommons@USU, 2019. https://digitalcommons.usu.edu/etd/7455.

Abstract
The introduction of pavement, buildings and other impervious surfaces to urban landscapes greatly influences the quantity and quality of urban stormwater runoff. In this study, we designed and implemented modern stormwater monitoring technologies to establish a "smart" stormwater sensor network within the Northwest Field Canal (NWFC), an urban water conveyance located in Logan, Utah, USA. This network was designed to collect flow and water quality data at high frequencies and simultaneously at multiple locations. The observatory's innovative method of inter-site communication and of changing sampling frequencies during storm events was able to capture short-duration events at the upstream and downstream ends of the NWFC, and at multiple outfalls in the canal, simultaneously and without human intervention. We then investigated statistical regression models between turbidity and total suspended solids (TSS) in order to predict TSS at high frequencies. Finally, adding the high-frequency discharge data to the calibration procedure for a stormwater simulation model built with the Environmental Protection Agency's Stormwater Management Model did little to improve model performance at the downstream end of the canal, but did provide important insight into the overall contribution of discharge from individual stormwater outfalls to the NWFC. The results of this study inform water professionals on how to build and operate automated monitoring systems and how to create high-frequency estimates of TSS and total phosphorus (TP) loads in urban water systems.
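
The turbidity-to-TSS step is a classic surrogate-regression problem. A minimal sketch, assuming a log-log linear rating model and synthetic numbers in place of the NWFC sensor data:

```python
# Fit log(TSS) = b0 + b1*log(turbidity), then predict TSS at the
# turbidity sensor's full sampling frequency. Data are synthetic.
import numpy as np

turbidity = np.array([5.0, 12.0, 30.0, 80.0, 150.0])   # NTU (synthetic)
tss       = np.array([8.0, 18.0, 40.0, 95.0, 170.0])   # mg/L (synthetic)

b1, b0 = np.polyfit(np.log(turbidity), np.log(tss), 1)

def predict_tss(turb):
    """Predict TSS (mg/L) from a turbidity reading (NTU)."""
    return np.exp(b0) * turb ** b1

print(predict_tss(60.0))
```
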
10

Barauskas, Antanas. "Duomenų gavimas iš daugialypių šaltinių ir jų struktūrizavimas". Master's thesis, Lithuanian Academic Libraries Network (LABT), 2014. http://vddb.library.lt/obj/LT-eLABa-0001:E.02~2014~D_20140619_092257-64025.

Abstract
The aim of this work is to create an ETL (Extract-Transform-Load) system for extracting data from different types of data sources, properly transforming the extracted data, and loading the transformed data into a selected place of storage. The main techniques of data extraction and the most popular ETL tools available today have been analyzed. An architectural solution based on cloud computing, as well as a prototype of a system for extracting data from multiple sources and structuring them in a unified format, have been created. Unlike traditional data-storing systems, the proposed system extracts data only when they are needed for analysis. The graph database employed for data storage makes it possible to store not only the data but also information about the relations between entities. Structure: 48 pages, 19 figures, 10 tables and 30 references.
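
A minimal sketch of the Extract-Transform-Load pattern the abstract describes, including its distinctive twist: extraction is deferred until the data are actually requested rather than stored in advance. The source formats and the unified schema below are illustrative assumptions.

```python
# On-demand ETL: extract from heterogeneous sources, normalise to one
# schema, and pull data only when the consumer iterates.
import csv
import io
import json

def extract(source):
    if source["type"] == "csv":
        return list(csv.DictReader(io.StringIO(source["data"])))
    if source["type"] == "json":
        return json.loads(source["data"])
    raise ValueError(f"unsupported source type: {source['type']}")

def transform(records):
    # Normalise keys to a single unified format.
    return [{k.strip().lower(): v for k, v in r.items()} for r in records]

def load_on_demand(sources):
    # A generator: nothing is extracted until the data are consumed.
    for src in sources:
        yield from transform(extract(src))

sources = [{"type": "csv", "data": "Name,City\nOna,Vilnius\n"},
           {"type": "json", "data": '[{"Name": "Jonas", "City": "Kaunas"}]'}]
print(list(load_on_demand(sources)))
```
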
11

Carn, Cheril. "The inverse determination of aircraft loading using artificial neural network analysis of structural response data with statistical methods". RMIT University, Aerospace, Mechanical and Manufacturing Engineering, 2007. http://adt.lib.rmit.edu.au/adt/public/adt-VIT20080109.090600.

Abstract
An Artificial Neural Network (ANN) system has been developed that can analyse aircraft flight data to provide a reconstruction of the aerodynamic loads experienced by the aircraft during flight, including manoeuvre, buffet and distributed loading. For this research, data were taken from the International Follow-On Structural Test Project (IFOSTP) F/A-18 fatigue test conducted by the Royal Australian Air Force and Canadian Forces. This fatigue test involved the simultaneous application of both manoeuvre and buffet loads using airbag actuators and shakers. The applied loads were representative of the actual loads experienced by an F/A-18 during flight tests. Following an evaluation of different ANN types, an Elman network with three linear layers was selected. The Elman back-propagation network was tested with various parameters and structures. The network was trained using the MATLAB 'traingdx' function, which is a gradient descent with momentum and adaptive learning rate back-propagation algorithm. The ANN was able to provide a good approximation of the actual manoeuvre or buffet loads at the location where the training load data were recorded, even for input values that differ from the training input values. In further tests, the ability to estimate distributed loading at locations not included in the training data was also demonstrated. The ANN was then modified to incorporate various methods for the calculation and prediction of output error and reliability. Used in combination and in appropriate circumstances, the addition of these capabilities significantly increases the reliability, accuracy and therefore usefulness of the ANN system's estimates of aircraft loading. To demonstrate the ANN system's usefulness as a fatigue monitoring tool, it was combined with a formula for crack growth analysis. Results indicate the ANN system may be a useful fatigue monitoring tool, enabling real-time monitoring of aircraft critical components using existing strain gauge sensors.
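
The inverse-loads idea, training a network to map measured structural response back to the applied load, can be sketched briefly. Synthetic strain and load data stand in for the IFOSTP recordings, and a plain feed-forward MLP from scikit-learn stands in for the Elman recurrent network used in the thesis.

```python
# Train a neural network to estimate applied load from strain-gauge
# channels. All data are synthetic placeholders.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
strain = rng.normal(size=(500, 8))              # 8 synthetic strain channels
load = strain @ rng.normal(size=8) + 0.1 * rng.normal(size=500)

model = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=2000, random_state=0)
model.fit(strain[:400], load[:400])
print("held-out R^2:", model.score(strain[400:], load[400:]))
```
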
12

Najafian, G. "Local hydrodynamic force coefficients from field data and probabilistic analysis of offshore structures exposed to random wave loading". Thesis, University of Liverpool, 1991. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.317214.

13

De Wet, Dirk Pieter Gerhardus. "Post-calibration and quality management of weigh-in-motion traffic data". Thesis, Stellenbosch: University of Stellenbosch, 2010. http://hdl.handle.net/10019.1/4205.

Abstract
Weigh-in-motion (WIM) scales are installed on various higher-order South African roads to provide traffic loading information for pavement design, strategic planning and law enforcement using a scientific approach. The two most respected international guideline documents for WIM systems are the American ASTM E1318 Standard and the COST 323 European Specification, yet neither is fully suited to local WIM systems. The author developed a post-calibration method for WIM data, called the Truck Tractor (TT) Method, to correct the magnitude of recorded axle loads in retrospect. It incorporates a series of powerful data quality checks. The TT Method is robust, accurate and simple enough to be used on a routine basis. The TT Method uses the truck tractor loads of articulated 6- and 7-axle trucks with single steering and double driving axles; these vehicles are called Eligible Trucks. Only Eligible Trucks with average axle loads between 6.5 t and 8.5 t are used in the calibration process; these vehicles are called Selected Trucks. A calibration factor, kTT, is determined using a fully automated iterative procedure and multiplied with all axle load measurements to produce data for which the average truck tractor load of Selected Trucks, TTT, equals 21.8 t. The TT Method can be used for WIMs in various operating environments and is not sensitive to the extent of miscalibration of a WIM, to clipping of sensors owing to poor lane discipline, or to different extents of loading on different routes. The TT Method includes a series of data quality checks that can be used on a routine basis, summarised as follows:
- The standard deviation of truck tractor loads for Selected Trucks, STTT, should always be below 2.0 t, but preferably below 1.9 t.
- The standard deviation of front axle loads for Selected Trucks, SFTT, should always be below 0.9 t, but preferably below 0.8 t.
- The post-calibration factor from the TT Method, kTT, should be between 0.9 and 1.1. The factor for any month should not deviate by more than 3% from the moving average of the previous five months.
- The average of front axle loads of Selected Trucks, FTT, should be between 5.6 t and 6.6 t; the exact values are influenced by load transfer between the steering and driving axles.
- A procedure was formulated using the Front axle / Truck tractor Ratio, FTR, to identify the percentage of Eligible Trucks that in all probability clipped the sensor. The percentage of these records must be below 10%, but preferably below 6%.
The TT Method has the potential to significantly improve WIM data collection in South Africa. The calibration module of the TT Method, i.e. the procedure to calculate kTT, has already been accepted by SANRAL. Most of the data quality checking concepts associated with the TT Method were also accepted, although their threshold values are still being refined.
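
The calibration step lends itself to a compact sketch: scale all loads by a factor kTT chosen so that the mean truck-tractor load of Selected Trucks hits the 21.8 t target, reselecting after each scaling (which is what makes the procedure iterative). The synthetic data, the three-axle truck-tractor assumption and the convergence loop below are illustrative; the thesis's exact procedure may differ in detail.

```python
# Iterative kTT calibration under the selection rule quoted above:
# Selected Trucks are those whose average axle load lies in [6.5, 8.5] t.
import numpy as np

TARGET_TTT = 21.8   # target mean truck-tractor load of Selected Trucks (t)

def tt_calibration_factor(tractor_loads, n_axles=3, tol=1e-6):
    k = 1.0
    loads = np.asarray(tractor_loads, dtype=float)
    for _ in range(100):
        scaled = k * loads
        avg_axle = scaled / n_axles
        selected = scaled[(avg_axle >= 6.5) & (avg_axle <= 8.5)]
        step = TARGET_TTT / selected.mean()
        k *= step                    # rescale, then reselect next pass
        if abs(step - 1.0) < tol:
            break
    return k

raw = np.random.default_rng(1).normal(23.0, 1.5, size=1000)  # synthetic (t)
k_tt = tt_calibration_factor(raw)
print(f"kTT = {k_tt:.3f}")  # quality check: should lie between 0.9 and 1.1
```
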
14

Chen, Yi. "Experimental characterization and modelling of a servo-pneumatic system for a knee loading apparatus". Master's thesis, Alma Mater Studiorum - Università di Bologna, 2015. http://amslaurea.unibo.it/8565/.

Abstract
The new knee test rig developed at the University of Bologna uses a pneumatic cylinder as its actuator system. Specific characterization and modelling of the pneumatic cylinder and the related devices are needed to better control the test rig. In this thesis, an experimental environment for the related devices is set up with a data acquisition system using Real-Time Windows Target, Simulink and MATLAB. Based on the experimental data, a fitted model for the pneumatic cylinder friction is found.
15

Berg, Matthias and Jonathan Grangien. "Implementing an Interactive Simulation Data Pipeline for Space Weather Visualization". Thesis, Linköpings universitet, Medie- och Informationsteknik, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-162477.

Abstract
This thesis details work carried out by two students working as contractors at the Community Coordinated Modeling Center at Goddard Space Flight Center of the National Aeronautics and Space Administration. The thesis is made possible by, and aims to contribute to, the OpenSpace project. The first track of the work is the handling and assembly of new data for a visualization of coronal mass ejections in OpenSpace. The new data allow coronal mass ejections to be observed at their origin by the surface of the Sun, whereas previous data visualized them from 30 solar radii outwards. Previously implemented visualization techniques are used together to visualize different volume data and fieldlines, which, together with a synoptic magnetogram of the Sun, gives a multi-layered visualization. The second track is an experimental implementation of a generalized, less user-involved process for getting new data into OpenSpace, with priority on volume data as that was the area of prior experience. The results show a space weather model visualization and how one such model can be adapted to fit within the parameters of the OpenSpace project. Additionally, the results show how a GUI connected to a series of background events can form a data pipeline that makes complicated space weather models more easily available.
16

Saqib, Muhammad. "The use of laboratory and site tests data in the finite element modelling of offshore piles subjected to tensile loading". Master's thesis, Alma Mater Studiorum - Università di Bologna, 2021.

Abstract
The thesis investigates the response of pile foundations subjected to axial load, with application to offshore wind turbines supported by multi-pod steel jackets. Wind turbines are relatively lightweight structures compared to other offshore structures, and this results in high tensile loads on the foundation. For this reason, the thesis focuses on the response of an open-ended steel pile subjected to pull-out. The study builds on the results of a large-scale testing campaign recently conducted at the Test Centre for Support Structures of the Universität Hannover. The model piles encompassed various lengths, diameters and thicknesses. In the thesis, different design methods (ICP-05, UWA-05, NGI-05 and Fugro-05) were used to back-analyse the experimental results and were compared with each other. Results of site and laboratory tests on the soil used in the experiments were also used to identify the soil parameters required to implement finite element models of the tested piles. In particular, CPT tests and triaxial tests were considered, and the parameters derived are the soil behaviour type, the soil unit weight, the stiffness modulus, the relative density, the Poisson's ratio and the peak friction angle. Finite element analyses were then run under the experimental boundary and loading conditions, and the results were compared with the experimental data. A good agreement was observed, particularly on the capacity. The advantage of using the finite element method, compared to CPT-based bearing capacity prediction, is the possibility of gaining insight into the pile behaviour and describing the entire load-displacement curve. Good results were achieved using a simple constitutive model; improvements can surely be gained by implementing more sophisticated soil models.
17

Anderson, John Robert Beveridge. "Review of South African live load models for traffic loading on bridge and culvert structures using weigh-in-motion (WIM) data". Master's thesis, University of Cape Town, 2006. http://hdl.handle.net/11427/14590.

Abstract
This thesis uses the axle weights and axle spacings of heavy vehicles recorded by weigh-in-motion (WIM) sensors to calculate the load effects on single-lane, simply supported structures spanning up to 30 m. The main objective was to compare the load effects caused by the recorded vehicles with those calculated using TMH7 Part 2 and the alternative live load models proposed in subsequent research. Through probabilistic analysis of the truck survey data, the thesis predicts the magnitude of extreme events that may occur within a bridge structure's design life. The results reinforce the deficiencies of TMH7 Part 2's NA loading curve in catering for normal traffic conditions on spans of 10 m and less. They also highlight the conservative assumptions made in the configuration of vehicle convoys used to simulate serviceability loads on 20 m to 30 m spans. The findings of the thesis support the need for rational calibration of the partial factors used in limit state design. The WIM data were analysed to highlight the extent of overloading. The results provide evidence that overloading of individual axles and axle sets is prevalent and that overloading has a greater impact on 5 m and 10 m spans than on 30 m spans. Research was carried out into the basis of the bridge live load models in TMH7 Part 2 and those recently developed in Europe, the United States and Canada. The thesis documents the advancement of rationally based live load models derived from actual vehicle data. Alternative live load models were calibrated against the extreme events predicted by the WIM data. The results independently validate the alternative live load model proposed by the latest research commissioned by the Department of Transport. This live load model takes a similar form to the one proposed in the Eurocode, ENV 1991-3.
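
Predicting extreme events over a design life from a finite survey is typically done with extreme-value statistics. A minimal sketch, assuming a Gumbel fit to synthetic weekly maxima of a load effect; the thesis's actual probabilistic model is not reproduced here.

```python
# Fit a Gumbel distribution to periodic maxima and read off the load
# effect with a design-life return period. Data are synthetic.
import numpy as np
from scipy.stats import gumbel_r

weekly_max = np.random.default_rng(7).gumbel(1000.0, 80.0, size=200)  # kNm
loc, scale = gumbel_r.fit(weekly_max)

design_weeks = 52 * 100                  # ~100-year design life
p = 1.0 - 1.0 / design_weeks             # weekly non-exceedance probability
print("characteristic load effect:", gumbel_r.ppf(p, loc, scale))
```
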
18

Gonzalez, Nicolas Alejandro. "Principal Components Analysis, Factor Analysis and Trend Correlations of Twenty-Eight Years of Water Quality Data of Deer Creek Reservoir, Utah". BYU ScholarsArchive, 2012. https://scholarsarchive.byu.edu/etd/3309.

Abstract
I evaluated twenty-eight years (1980-2007) of spatial-temporal water quality data from Deer Creek Reservoir in Utah. The data came from three sampling points representing the lotic, transitional and lentic zones, and included measurements of climatological, hydrological and water quality conditions at four depths: Surface, Above Thermocline, Below Thermocline and Bottom. The time frame spanned dates before and after the completion of the Jordanelle Reservoir (1987-1992), approximately fourteen miles upstream of Deer Creek. I compared temporal groupings and found that a traditional month distribution following standard seasons was not effective in characterizing the measured conditions; I developed a more representative seasonal grouping by performing a Tukey-Kramer multiple comparisons adjustment and a Bonferroni correction of the Student's t comparison. Based on these analyses, I determined the best groupings to be Cold (December-April), Semi-Cold (May and November), Semi-Warm (June and October), Warm (July and September) and Transition (August). I performed principal component analysis (PCA) and factor analysis (FA) to determine the principal parameters associated with the variability of the water quality of the reservoir. These parameters confirmed our seasonal groups, showing the Cold, Transition and Warm seasons as distinct groups. The PCA and FA showed that the variables driving most of the variability in the reservoir are specific conductivity and variables related to temperature, and that the reservoir is highly variable: the first three principal components and rotated factors explained a cumulative 59% and 47%, respectively, of the variability in Deer Creek. Both parametric and nonparametric approaches provided similar correlations, but the evaluations that included censored data (nutrients) were considerably different, with the nonparametric approach being preferred.
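
The headline variance figures come from a standard eigen-decomposition of the correlation matrix of standardised variables. A minimal sketch with placeholder data in place of the Deer Creek measurements:

```python
# Share of variance captured by the first three principal components.
import numpy as np

rng = np.random.default_rng(3)
X = rng.normal(size=(300, 10))             # 300 samples, 10 variables
Z = (X - X.mean(axis=0)) / X.std(axis=0)   # standardise

eigvals = np.linalg.eigvalsh(np.corrcoef(Z, rowvar=False))[::-1]
explained = eigvals / eigvals.sum()
print("first 3 PCs explain:", explained[:3].sum())
```
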
19

Miatke, Baxter G. "A Framework For Estimating Nutrient And Sediment Loads That Leverages The Temporal Variability Embedded In Water Monitoring Data". ScholarWorks @ UVM, 2016. http://scholarworks.uvm.edu/graddis/651.

Abstract
Rivers deliver significant macronutrients and sediments to lakes that can vary substantially throughout the year. These nutrient and sediment loadings, exacerbated by winter and spring runoff, impact aquatic ecosystem productivity and drive the formation of harmful algae blooms. The source, extent and magnitude of nutrient and sediment loading can vary drastically due to extreme weather events and hydrologic processes, such as snowmelt or high flow storm events, that dominate during a particular time period, making the temporal component (i.e., time over which the loading is estimated) critical for accurate forecasts. In this work, we developed a data-driven framework that leverages the temporal variability embedded in these complex hydrologic regimes to improve loading estimates. Identifying the "correct" time scale is an important first step for providing accurate estimates of seasonal nutrient and sediment loadings. We use water quality concentration and associated 15-minute discharge data from nine watersheds in Vermont's Lake Champlain Basin to test our proposed framework. Optimal time periods were selected using a hierarchical cluster analysis that uses the slope and intercept coefficients from individual load-discharge regressions to derive improved linear models. These optimized linear models were used to improve estimates of annual and "spring" loadings for total phosphorus, dissolved phosphorus, total nitrogen, and total suspended loads for each of the nine study watersheds. The optimized annual regression model performed ~20% better on average than traditional annual regression models in terms of Nash-Sutcliffe efficiency, and resulted in ~50% higher cumulative load estimates with the largest difference occurring in the "spring". In addition, the largest nutrient and sediment loadings occurred during the "spring" unit of time and were typically more than 40% of the total annual estimated load in a given year. The framework developed here is robust and may be used to analyze other units of time associated with hydrologic regimes of interest provided adequate water quality data exist. This, in turn, may be used to create more targeted and cost-effective management strategies for improved aquatic health in rivers and lakes.
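
The framework's core move, fitting a load-discharge (rating-curve) regression for each candidate period and then clustering the regression coefficients to find periods that behave alike, can be sketched as follows, with synthetic data and an assumed Ward-linkage hierarchical clustering.

```python
# Per-period log-log load-discharge regressions, then hierarchical
# clustering of (slope, intercept) pairs. All numbers are synthetic.
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

rng = np.random.default_rng(5)
coeffs = []
for _ in range(12):                                   # one fit per month
    q = rng.lognormal(mean=1.0, sigma=0.5, size=50)   # discharge
    load = 0.3 * q ** rng.uniform(1.2, 1.8) * rng.lognormal(0.0, 0.1, 50)
    slope, intercept = np.polyfit(np.log(q), np.log(load), 1)
    coeffs.append((slope, intercept))

labels = fcluster(linkage(np.array(coeffs), method="ward"),
                  t=3, criterion="maxclust")
print("period clusters:", labels)
```
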
20

Bakker, Cleo. "Nutrients and biota in a lake system before and after restoration; a data analysis of the Swedish eutrophication case study Växjösjön". Thesis, Uppsala universitet, Institutionen för geovetenskaper, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-445066.

Abstract
Eutrophication has proven to be a fundamental ecological problem for lakes and other bodies of water all around the world. Eutrophication can be defined as a lake accumulating increasing concentrations of nutrients from external and/or internal inputs over time. The increase of nutrients in the lake has several consequences for the lake ecosystem, such as an increase in algal blooms (sometimes containing toxic and harmful cyanobacteria) and a decrease in macrophytes. One nutrient that plays a key role in the eutrophication process is phosphorus. To restore eutrophic waters, the external and internal input of phosphorus needs to be reduced. External input can be decreased by reducing run-off from industrial areas or agriculture. Internal input can be reduced by disrupting the in-lake phosphorus loading processes, which are strongly connected to the lake sediment. Internal phosphorus loading can be driven by several different processes: one is the mineralization of organic biomass on the sediment, which releases phosphorus into the water; another is the release of previously iron-bound phosphorus from the sediment. Different treatments can be implemented in a lake system to disrupt these internal phosphorus loading processes and consequently restore the water quality of the lake. Such treatments also influence the biota of the lake and its ecosystem services, because of their effect on water quality. Biomanipulation treatments and aluminum treatments were implemented in lake Växjösjön in Sweden to restore the lake to a more natural and balanced state. Both treatments were effective in reducing the eutrophic conditions of the lake, improving water quality, biota and ecosystem services. Local human populations benefit from these improvements, for example through increased revenue from lake recreation. More research is, however, needed to discern the long-term effects of the treatments in the Växjö municipality, thereby aiding local government and policy makers in their future decisions regarding restoration.
21

Khakipoor, Banafsheh. "Applied Science for Water Quality Monitoring". University of Akron / OhioLINK, 2020. http://rave.ohiolink.edu/etdc/view?acc_num=akron1595858677325397.

22

Saliko, Denis. "Investigation of the structural response of pavements in cold region using instrumented test site data". Licentiate thesis, KTH, Byggnadsmaterial, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-290939.

Abstract
The structural behaviour of pavement structures is affected by traffic-related loading and by the ambient factors to which the structure is subjected. A new mechanistic-empirical (M-E) pavement design method is under development in Sweden, with the main purpose of adequately predicting pavement structural response and performance. An M-E design method for a flexible pavement means applying the principles of engineering mechanics to evaluate the response of the pavement structure to traffic loading, together with much improved methods for predicting distress and how performance changes with time. This ensures a fundamental understanding of how the pavement structure responds to a given action or loading condition. The mechanistic-empirical approach is more flexible, as it can adapt to new situations such as new pavement materials and loading situations. It is important to take into account the real loading and climatic conditions and to predict the resulting changes in material properties and structural behaviour, both at the time of loading and in the long term. New models are therefore required for the further development of the pavement design method, and they need to be validated with reliable data obtained through realistic measurements. In this licentiate thesis, the effects of environmental factors and loading by heavy vehicles on pavements are investigated. The results of the study are based on environmental data from multiple locations in Sweden and on measurements from two instrumented road sections located near the village of Långträsk in the northern part of Sweden. Both roads consist of thin flexible pavements, the behaviour of which is highly dependent on the variation of the temperature of the asphalt layer, the moisture content in the unbound granular layers, and frost depth conditions. The licentiate report consists of three scientific publications.

Paper 1 presents a country-specific case study in which the frost penetration depth in various Swedish roads is predicted by a statistically derived empirical model that uses the air freezing index, calculated from the air temperature, as input. The model correlation is based on meteorological data from 44 meteorological stations and pavement cross-sectional temperature distribution data from 49 road weather information system (RWIS) stations covering all five climatic zones of Sweden.

Paper 2 focuses on the response of an instrumented test section subjected to loading by a falling weight deflectometer (FWD) and by heavy vehicles. The mechanical response instrumentation consisted of asphalt strain gauges (ASG), strain measuring units (εMU) and soil pressure cells (SPC) installed at different locations in the structure. The layer stiffness values were obtained via backcalculation from the FWD surface deflection bowls. The recorded values of the mechanical response were compared against values calculated with multilayer elastic theory (MLET) based software. Three heavy vehicles weighing from ~64 t to ~74 t were compared in terms of the damage caused to the pavement structure. It was found that if the number of axles is increased and dual tyres are used, longer, heavier vehicles are not more destructive to the pavement structure than shorter vehicles with fewer axles and higher axle load and tyre pressure.

Paper 3 investigates the effect of the seasonal variation of environmental factors on the behaviour of an instrumented road test section. The same loading configuration described in Paper 2 was used in four measurement campaigns in different seasons over the span of one year. The environmental variables were monitored throughout the year by asphalt thermocouples, a frost rod and time-domain reflectometer (TDR) probes. The mechanical response sensors and the environmental sensors proved to be a reliable data collection method throughout the entire year. By comparing the recorded response values with the MLET-calculated values, it was shown that it is possible to model the mechanical behaviour of pavement structures using linear-elastic MLET if the temperature variations in the asphalt layer and the moisture variations in the granular layers are taken into account.
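
Paper 1's input quantity, the air freezing index, is simply the accumulated freezing degree-days of a winter season; a Stefan-type square-root relation is a common way to map such an index to frost depth. The sketch below uses that form with a made-up coefficient, purely for illustration; the thesis derives its own statistical fit.

```python
# Air freezing index (AFI) from daily mean temperatures, then a
# hypothetical Stefan-type frost-depth estimate d = K*sqrt(AFI).
import numpy as np

daily_mean_temp = np.random.default_rng(11).normal(-4.0, 6.0, size=150)  # degC

afi = -daily_mean_temp[daily_mean_temp < 0].sum()   # freezing degree-days

K = 0.045   # hypothetical coefficient, m per sqrt(degC*day)
frost_depth = K * np.sqrt(afi)
print(f"AFI = {afi:.0f} degC*days, frost depth ~ {frost_depth:.2f} m")
```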

23

Gowthaman, Rahul and Suhail Jagwani. "Advanced bushing script program in MSC ADAMS". Thesis, KTH, Fordonsdynamik, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-243489.

Abstract
The thesis focuses on investigating and optimizing a bushing script implemented as a tool in MSC ADAMS/Car. The study provides insight into the representation of a rubber bushing and identifies parameters that can be used to define the properties of a bushing in a simulation environment such as ADAMS/Car. The tool studied here can be used to implement different kinds of bushings, such as a hydro bushing and a general rubber bushing, but optimization was implemented for the rubber bushing only. With an increasing reliance on Computer Aided Engineering (CAE) tools in the design process, it is necessary that vehicle behaviour can be predicted without relying on physical testing. CAE tools reduce the need for prototypes and provide a faster approach to designing vehicles. MSC ADAMS/Car is one such tool, used here to predict the vehicle dynamic behaviour that influences the ride, handling and comfort characteristics of the vehicle. Rubber bushings, which are studied here, contribute significantly to the overall stiffness of the vehicle, so it is imperative that the tool is accurate and makes the design process easy. The rubber bushing can be thought of as a combination of a non-linear elastic spring, a frequency-dependent Maxwell component and an amplitude-dependent frictional element. To ease the design of the bushing properties, a reduced number of input properties is used to calculate the bushing properties internally. When validating the force hysteresis loop obtained through the model against the measured data, the accuracy of the model was quite poor under dynamic loads corresponding to amplitudes of 0.2 mm and lower. Quasi-static loading and dynamic loading above 0.2 mm showed satisfactory accuracy compared with the measured data.
24

Balidakis, Kyriakos. "On the development and impact of propagation delay and geophysical loading on space geodetic technique data analysis". Doctoral thesis, supervised by Harald Schuh; reviewers: Harald Schuh, Rüdiger Haas and Daniela Thaller. Berlin: Technische Universität Berlin, 2019. http://d-nb.info/1200466640/34.

25

Drury, Travis Daniel. "Managing a Watershed Inventory Project and Exploring Water Quality Data in the Four Mile Creek Watershed". Miami University / OhioLINK, 2013. http://rave.ohiolink.edu/etdc/view?acc_num=miami1366318507.

26

Veselý, Jan. "Implementace BI v servisním oddělení telekomunikační společnosti". Master's thesis, Vysoká škola ekonomická v Praze, 2012. http://www.nusl.cz/ntk/nusl-165129.

Abstract
The subject of the thesis is a proof of concept of a Business Intelligence (BI) implementation in the Communications service department of the Kapsch BusinessCom s.r.o. company. The aim of the work is to develop a proof-of-concept BI solution and consequently produce output for the end users. The main outcome of the thesis is a working BI solution used in practice. The solution was carried out using MS SQL Server 2008 R2 Management Studio, MS SQL Server 2008 R2 Business Intelligence Development Studio and MS Excel 2007. The project analyses data obtained from the ERP (Enterprise Resource Planning) database of the MS Dynamics Navision system. Among other things, information on "Voice" projects from the Communications service department, which mainly focuses on the servicing of PBXs and similar products, is recorded in the project. During the project, a multidimensional analysis was carried out, a data warehouse was designed, data pumps were created, and an OLAP (On-line Analytical Processing) cube and output in Microsoft Excel 2007 were produced.
27

Mašek, Martin. "Datové sklady - principy, metody návrhu, nástroje, aplikace, návrh konkrétního řešení". Master's thesis, Vysoká škola ekonomická v Praze, 2007. http://www.nusl.cz/ntk/nusl-10145.

Abstract
The main goal of this thesis is to summarize and introduce general theoretical concepts of Data Warehousing using a systems approach. The thesis defines Data Warehousing and its main areas and situates it within the higher-level area called Business Intelligence. It also describes the history of Data Warehousing and Business Intelligence, focuses on key principles of Data Warehouse building and explains practical applications of this solution. The aim of the practical part is to evaluate the theoretical concepts and, based on that, to design and build a Data Warehouse in the environment of an existing company. The final solution includes the Data Warehouse design, hardware and software platform selection, loading with real data using ETL services, and building of end-user reports. The practical part also aims to demonstrate the power of this technology and to contribute to the business decision-making process in the company.
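
The loading step at the heart of such a solution, ETL into a star schema of dimension and fact tables, reduces to a lookup-or-insert on the dimension followed by the fact insert. A minimal sketch with an illustrative schema, not the thesis's actual one:

```python
# Tiny star-schema load: resolve the dimension key, insert the fact row.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE dim_customer (id INTEGER PRIMARY KEY, name TEXT UNIQUE);
    CREATE TABLE fact_sales (customer_id INTEGER, amount REAL);
""")

def load_fact(name: str, amount: float) -> None:
    con.execute("INSERT OR IGNORE INTO dim_customer(name) VALUES (?)", (name,))
    (cid,) = con.execute("SELECT id FROM dim_customer WHERE name = ?",
                         (name,)).fetchone()
    con.execute("INSERT INTO fact_sales VALUES (?, ?)", (cid, amount))

load_fact("Acme", 120.0)
load_fact("Acme", 80.0)
print(con.execute("""SELECT name, SUM(amount) FROM fact_sales
                     JOIN dim_customer ON id = customer_id
                     GROUP BY name""").fetchall())
```
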
28

Jensen, Alexandria Kosoma. "HIGH RESOLUTION SENSING OF NITRATE DYNAMICS IN A MIXED-USE APPALACHIAN WATERSHED: QUANTIFYING NITRATE FATE AND TRANSPORT AS INFLUENCED BY A BACKWATER RIPARIAN WETLAND". UKnowledge, 2018. https://uknowledge.uky.edu/bae_etds/59.

Abstract
As harmful algal blooms begin to appear in unexpected places, such as rivers in predominantly forested systems, a better understanding of the nutrient processes within the contributing watersheds is necessary; however, these systems remain understudied. Utilization of high-resolution water quality data applied to deterministic numerical modeling has shown that a backwater riparian wetland occupying 0.42% of the watershed area along the Ohio River floodplain can attenuate 18.1% of the nitrate discharged from local mixed-use watersheds, and that it improves in performance during high-loading periods because of coinciding increases in hydrological connectivity and residence times of water in these wetlands. Loading from the Fourpole Creek watershed was typical for mixed-use systems at 3.3 kgN/ha/yr. The high-resolution data were used to improve boundary condition parameterization, elucidate shortcomings in the model structure, and reduce posterior solution uncertainty. Using high-resolution data to explicitly inform the modeling process is infrequently applied in the literature; it significantly improves the modeling process and parameterization, and reduces uncertainty in a way that would not have been possible with a traditional grab sampling approach.
29

Turville, Christopher. "Techniques to handle missing values in a factor analysis". PhD thesis, University of Western Sydney, Faculty of Informatics, Science and Technology, 2000. http://handle.uws.edu.au:8081/1959.7/395.

Abstract
A factor analysis typically involves a large collection of data, and it is common for some of the data to be unrecorded. This study investigates the ability of several techniques to handle missing values in a factor analysis, including complete cases only, all available cases, imputing means, an iterative component method, singular value decomposition and the EM algorithm. A data set representative of those used for a factor analysis is simulated. Some of these data are then randomly removed to represent missing values, and the performance of the techniques is investigated over a wide range of conditions. Several criteria are used to assess the abilities of the techniques to handle missing values in a factor analysis. Overall, no one technique performs best under all of the conditions studied. The EM algorithm is generally the most effective technique, except when ill-conditioned matrices are present or when computing time is of concern. Some theoretical concerns are introduced regarding the effects that changes in the correlation matrix have on the loadings of a factor analysis. A complicated expression is derived showing that the change in factor loadings resulting from a change in the elements of a correlation matrix involves components of the eigenvectors and eigenvalues.
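
The simplest of the compared techniques (imputing column means) and the way loadings fall out of the eigenstructure of the correlation matrix, the quantity the closing theoretical remark concerns, can be sketched together. Data are synthetic, and the EM alternative is not reproduced here.

```python
# Mean imputation of missing values, then one-factor loadings from the
# correlation matrix: eigenvector scaled by the root of its eigenvalue.
import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(size=(200, 6)) @ rng.normal(size=(6, 6))  # correlated data
X[rng.random(X.shape) < 0.1] = np.nan                    # 10% missing

X_imp = np.where(np.isnan(X), np.nanmean(X, axis=0), X)  # impute means

R = np.corrcoef(X_imp, rowvar=False)
vals, vecs = np.linalg.eigh(R)
top = np.argmax(vals)
loadings = vecs[:, top] * np.sqrt(vals[top])
print(loadings)
```
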
Doctor of Philosophy (PhD)
30

Krashanitsa, Roman Yurievich. "An Inverse Computational Approach for the Identification of the Parameters of the Constitutive Model for Damaged Ceramics Subjected to Impact Loading". Diss., Tucson, Arizona : University of Arizona, 2005. http://etd.library.arizona.edu/etd/GetFileServlet?file=file:///data1/pdf/etd/azu%5Fetd%5F1390%5F1%5Fm.pdf&type=application/pdf.

31

LEVY, Raphael. "Interactions intra et inter moléculaires, conformation des polymères adsorbés, transition de phase sous étirement : que peut-on apprendre des mesures de force". PhD thesis, Université Louis Pasteur - Strasbourg I, 2002. http://tel.archives-ouvertes.fr/tel-00001565.

Abstract
The atomic force microscope (AFM) is a privileged tool for probing matter at the nanometre scale. In this thesis we use it as a force-measuring instrument. We show that measuring the thermal fluctuations of the cantilever springs makes it possible to determine their stiffness precisely, provided a model is used that takes into account the mode shapes and the detection method. Using new methods for analysing retraction curves, we study the conformation of adsorbed polymers and copolymers, as well as the specific interaction between a nickel complex (Ni-NTA) and the amino acid histidine. We reveal unexpected behaviours in the presence of multiple bonds and propose an interpretation based on an equilibrium between bond formation and rupture. We present first experimental results concerning the conformational transition of polylysine under stretching.
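The stiffness determination from thermal fluctuations mentioned here rests on the equipartition theorem. In its idealized single-spring form (the thesis refines this with cantilever mode shapes and the detection method):

    \[
      \tfrac{1}{2}\,k\,\langle z^{2}\rangle \;=\; \tfrac{1}{2}\,k_{B}T
      \quad\Longrightarrow\quad
      k \;=\; \frac{k_{B}T}{\langle z^{2}\rangle},
    \]

where \(\langle z^{2}\rangle\) is the mean-square thermal deflection, \(k_{B}\) the Boltzmann constant, and \(T\) the temperature. Corrections of order unity appear once the full mode shapes and the optical-lever read-out are modelled, which is precisely the point made in the abstract.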
32

Ahmed, Sarah N. "Evaluation of nitrogen nonpoint-source loadings using high resolution land use data in a GIS: a multiple watershed study for the State of Maryland". College Park, Md.: University of Maryland, 2008. http://hdl.handle.net/1903/8615.

Abstract
Thesis (M.S.) -- University of Maryland, College Park, 2008.
Thesis research directed by: Dept. of Civil and Environmental Engineering. Title from t.p. of PDF. Includes bibliographical references. Published by UMI Dissertation Services, Ann Arbor, Mich. Also available in paper.
33

Lopez Sepulveda, Gloria Patricia. "Aplicação de inteligência computacional na resolução de problemas de sistemas elétricos de potência". Ilha Solteira, 2017. http://hdl.handle.net/11449/152469.

Abstract
Advisor: Marcos Julio Rider Flores
In this thesis, Computational Intelligence algorithms are used to solve four problems in the area of electric power systems, in order to automate decision making in processes that are usually performed by human experts aided by classical computational methods. The machine-learning algorithms used are decision trees, artificial neural networks, and support vector machines, which carry out the learning process of the Intelligent Systems and perform data mining. These algorithms are trained from the available measurements and actions recorded in the control centers of the systems. Intelligent Systems were used to perform: a) centralized Volt-VAr control of modern electric power distribution systems in real time using electrical measurements; b) fraud detection in electricity distribution networks, through a data-mining process that establishes consumption patterns pointing to possibly fraudulent customers; c) fault location in electric power transmission systems, automating the localization process and helping to ensure that a fault-control action is performed quickly and efficiently; and d) real-time coordination of intelligent charging of electric vehicles and storage devices using V2G technology in electric power distribution systems, based on electrical measurements. The centralized Volt-VAr control was tested on a 42-node distribution system; for the electric vehicle and storage device charging problem, the tests were performed... (Complete abstract click electronic access below)
Doctorate
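As a hedged illustration of the fraud-detection use case (b), the sketch below trains a decision tree on synthetic consumption features; the feature set and labels are invented and merely stand in for the consumption patterns mined in the thesis.

    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier

    rng = np.random.default_rng(1)
    # Illustrative features: mean monthly kWh, month-to-month variance,
    # and share of near-zero meter readings.
    X = rng.normal(size=(1000, 3))
    y = (X[:, 2] > 1.0).astype(int)  # synthetic "fraud" label for the sketch

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25,
                                              random_state=1)
    clf = DecisionTreeClassifier(max_depth=4).fit(X_tr, y_tr)
    print("held-out accuracy:", clf.score(X_te, y_te))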
34

Caland, Fabrice. "Décomposition tensorielle de signaux luminescents émis par des biosenseurs bactériens pour l'identification de Systèmes Métaux-Bactéries". Phd thesis, Université de Lorraine, 2013. http://tel.archives-ouvertes.fr/tel-00934507.

Abstract
The local availability and persistence of heavy metals can be critical, particularly for the future use of agricultural or urban areas on which numerous industrial sites were established in the past. Managing these complex environmental situations requires the development of new, minimally invasive analysis methods (environmental sensors), such as those using bacterial biosensors, to identify and directly assess the biological effect and chemical availability of metals. In this thesis we therefore sought to identify, using mathematical tools from multilinear algebra, the responses of fluorescent bacterial sensors under varied environmental conditions, whether a stress caused by the presence of a metal at high doses or a nutrient deficiency caused by its absence. This identification is based on the quantitative analysis of multidimensional signals at the scale of a bacterial population. It relies in particular on (i) the acquisition of multivariate spectral (fluorescence) data from suspensions of multicoloured biosensors interacting with metals and (ii) the development of tensor decomposition algorithms. The methods proposed, developed, and used in this work aim to identify, with as few prior assumptions as possible, the functional response of biosensors under different environmental conditions, through constrained tensor decompositions of the observable spectral signals. They take advantage of the variability of the systemic responses and make it possible to determine the elementary "sources" identifying the system and their behaviour as a function of external parameters. They are inspired by the CP and PARALIND methods. The advantage of this type of approach over classical approaches is the unique identification of biosensor responses under weak constraints. The work consisted of developing efficient source-separation algorithms for the fluorescent signals emitted by bacterial sensors, guaranteeing the separability of the fluorescent sources and the uniqueness of the decomposition. The original contribution of the thesis is the incorporation of constraints tied to the physics of the analysed phenomena, such as (i) the sparsity of the mixing coefficients or the positivity of the "source" signals, in order to minimize the use of prior assumptions, and (ii) the non-empirical determination of the order of the decomposition (the number of sources). This approach also improved identification by optimizing the physical measurements through the use of synchronous spectra and by bringing sufficient diversity to the experimental designs. The use of synchronous spectra proved decisive both for improving the separation of the fluorescence sources and for increasing the signal-to-noise ratio of the weakest biosensors. This original spectral analysis method greatly widens the chromatic range of multicoloured fluorescent biosensors usable simultaneously. Finally, a new method was developed for estimating the concentration of metallic pollutants present in a sample from the spectral response of a mixture of non-specific biosensors.
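For orientation, the CP model underlying the decompositions mentioned above writes a third-order fluorescence data tensor (for example excitation wavelength x emission wavelength x sample) as a sum of R rank-one terms; this is the generic form, not the thesis's exact constrained variant:

    \[
      x_{ijk} \;\approx\; \sum_{r=1}^{R} a_{ir}\, b_{jr}\, c_{kr},
      \qquad a_{ir},\, b_{jr},\, c_{kr} \ge 0,
    \]

where each triple \((a_{r}, b_{r}, c_{r})\) is one elementary fluorescent source. PARALIND-type constraints additionally encode known linear dependencies between sources, and sparsity on the mixing coefficients limits the prior assumptions, as described in the abstract.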
35

Jephte, Ioudom Foubi. "Extract, Transform, and Load data from Legacy Systems to Azure Cloud". Master's thesis, 2021. http://hdl.handle.net/10362/118629.

Abstract
Internship report presented as partial requirement for obtaining the Master’s degree in Information Management, with a specialization in Knowledge Management and Business Intelligence
In a world of continuously evolving technologies and hardened competitive markets, organisations need to be continually on guard to grasp the cutting-edge technology and tools that will help them surpass any competition that arises. Modern data platforms that incorporate cloud technologies help organisations get ahead of their competitors by providing solutions that capture and optimally use untapped data, scalable storage that adapts to ever-growing data quantities, and data processing and visualisation tools that improve the decision-making process. With many cloud providers available in the market, from small players to major technology corporations, organisations have much flexibility to choose the cloud technology that best aligns with their use cases and overall product and service strategy. This internship took place when one of Accenture's major clients in the financial industry decided to migrate from legacy systems to a cloud-based data infrastructure, namely Microsoft Azure. During the internship, the data lake, a core part of the modern data platform (MDP), was developed in order to better understand the type of challenges that can be faced when migrating data from on-premise legacy systems to a cloud-based infrastructure. This work also provides the main recommendations and guidelines for performing a large-scale data migration.
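As a hedged illustration of the landing step in such a migration, the sketch below uploads a legacy extract into an Azure Data Lake Storage Gen2 file system with the azure-storage-file-datalake package. Account, container, and path names are placeholders, and the exact client API should be checked against the current SDK documentation.

    from azure.storage.filedatalake import DataLakeServiceClient

    service = DataLakeServiceClient(
        account_url="https://<account>.dfs.core.windows.net",
        credential="<account-key-or-token>")
    raw_zone = service.get_file_system_client("raw")   # the lake's landing zone
    file_client = raw_zone.get_file_client("legacy/customers/2021-06-01.csv")

    with open("customers.csv", "rb") as f:   # extract taken from the legacy system
        file_client.upload_data(f, overwrite=True)  # load it into the data lake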
36

"Analysis on the less flexibility first (LFF) algorithm and its application to the container loading problem". 2005. http://library.cuhk.edu.hk/record=b5892415.

Abstract
Wu Yuen-Ting.
Thesis (M.Phil.)--Chinese University of Hong Kong, 2005.
Includes bibliographical references (leaves 88-90).
Abstracts in English and Chinese.
Chapter 1. --- Introduction --- p.1
Chapter 1.1 --- Background --- p.1
Chapter 1.2 --- Research Objective --- p.4
Chapter 1.3 --- Contribution --- p.5
Chapter 1.4 --- Structure of this thesis --- p.6
Chapter 2. --- Literature Review --- p.7
Chapter 2.1 --- Genetic Algorithms --- p.7
Chapter 2.1.1 --- Pre-processing step --- p.8
Chapter 2.1.2 --- Generation of initial population --- p.10
Chapter 2.1.3 --- Crossover --- p.11
Chapter 2.1.4 --- Mutation --- p.12
Chapter 2.1.5 --- Selection --- p.12
Chapter 2.1.6 --- Results of GA on Container Loading Algorithm --- p.13
Chapter 2.2 --- Layering Approach --- p.13
Chapter 2.3 --- Mixed Integer Programming --- p.14
Chapter 2.4 --- Tabu Search Algorithm --- p.15
Chapter 2.5 --- Other approaches --- p.16
Chapter 2.5.1 --- Block arrangement --- p.17
Chapter 2.5.2 --- Multi-Directional Building Growing algorithm --- p.17
Chapter 2.6 --- Comparisons of different container loading algorithms --- p.18
Chapter 3. --- Principle of LFF Algorithm --- p.8
Chapter 3.1 --- Definition of Flexibility --- p.8
Chapter 3.2 --- The Less Flexibility First Principle (LFFP) --- p.23
Chapter 3.3 --- The 2D LFF Algorithm --- p.25
Chapter 3.3.1 --- Generation of Corner-Occupying Packing Move (COPM) --- p.26
Chapter 3.3.2 --- Pseudo-packing and the Greedy Approach --- p.27
Chapter 3.3.3 --- Real-packing --- p.30
Chapter 3.4 --- Achievement of 2D LFF --- p.31
Chapter 4. --- Error Bound Analysis on 2D LFF --- p.21
Chapter 4.1 --- Definition of Error Bound --- p.21
Chapter 4.2 --- Cause and Analysis on Unsatisfactory Results by LFF --- p.33
Chapter 4.3 --- Formal Proof on Error Bound --- p.39
Chapter 5. --- LFF for Container Loading Problem --- p.33
Chapter 5.1 --- Problem Formulation and Term Definitions --- p.48
Chapter 5.2 --- Possible Problems to be solved --- p.53
Chapter 5.3 --- Implementation in Container Loading --- p.54
Chapter 5.3.1 --- The Basic Algorithm --- p.56
Chapter 5.4 --- A Sample Packing Scenario --- p.62
Chapter 5.4.1 --- Generation of COPM list --- p.63
Chapter 5.4.2 --- Pseudo-packing and the greedy approach --- p.66
Chapter 5.4.3 --- Update of corner list --- p.69
Chapter 5.4.4 --- Real-Packing --- p.70
Chapter 5.5 --- Ratio Approach: A Modification to LFF --- p.70
Chapter 5.6 --- LFF with Tightness Measure: CPU time Cut-down --- p.75
Chapter 5.7 --- Experimental Results --- p.77
Chapter 5.7.1 --- Comparison between LFF and LFFR --- p.77
Chapter 5.7.2 --- "Comparison between LFFR, LFFT and other algorithms" --- p.78
Chapter 5.7.3 --- Computational Time for different algorithms --- p.81
Chapter 5.7.4 --- Conclusion of the experimental results --- p.83
Chapter 6. --- Conclusion --- p.85
Bibliography --- p.88
37

Le, Xuan Thanh. "Robust solutions to storage loading problems under uncertainty". Doctoral thesis, 2017. https://repositorium.ub.uni-osnabrueck.de/handle/urn:nbn:de:gbv:700-2017021715554.

Abstract
In this thesis we study storage loading problems motivated by several practical contexts, under different types of uncertainty in the items' data. To obtain stacking solutions that are robust against this data uncertainty, we apply the concepts of strict and adjustable robustness. We first give complexity results for various storage loading problems with stacking constraints, and point out some interesting settings in which the adjustable robust problems can be solved more efficiently than the strict ones. We then propose different solution algorithms for the robust storage loading problems and determine which algorithm performs best for which data setting. We also propose a robust optimization framework for storage loading problems under stochastic uncertainty; within this framework, we offer several rule-based ways of generating scenarios to derive different uncertainty sets, and analyze the trade-off between cost and robustness of the robust stacking solutions. Additionally, we introduce a novel approach to the stability of stacking configurations: the key idea is to impose a limited payload on each item depending on its weight. We then study a storage loading problem with interacting stacking and payload constraints, as well as uncertainty in the weights of items, and propose different solution approaches for the robust problems.
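The strict and adjustable robustness concepts used in the thesis can be stated generically as follows (a textbook form, not necessarily the exact model of the thesis). With here-and-now decisions \(x\), uncertain item data \(\xi\) ranging over an uncertainty set \(U\), and wait-and-see decisions \(y(\xi)\):

    \[
      \text{strict:}\quad \min_{x}\; c^{\top}x
      \quad\text{s.t.}\quad g(x,\xi)\le 0 \;\;\forall\,\xi\in U,
    \]
    \[
      \text{adjustable:}\quad \min_{x}\; c^{\top}x
      \quad\text{s.t.}\quad \forall\,\xi\in U\;\exists\, y(\xi):\;
      g\bigl(x,y(\xi),\xi\bigr)\le 0.
    \]

Allowing part of the stacking decision to wait until the data are revealed is what can make the adjustable problems solvable more efficiently, as noted above.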
38

Chen, Tai-Hung and 陳泰宏. "The Research on Pile Axial Loading Test Data Interpretation Using Inverse Analysis Model". Thesis, 2018. http://ndltd.ncl.edu.tw/handle/325bxf.

Abstract
Master's degree
National Taiwan Ocean University
Department of Harbor and River Engineering
Academic year 106 (ROC calendar)
This thesis interprets axial pile load testing data from six large piles located in the Guandu area of Beitou, Taipei City: two compressive tests and two tensile tests on drilled shafts, and two further tensile tests on barrettes. The objectives and key research points of this thesis are: 1) interpreting the raw data from strain gauges installed along the shaft during pile load testing; 2) using an inverse method to obtain the ultimate frictional resistance of the pile and the initial shear stiffness at the pile/soil interface; 3) using the APILE software to analyze the above test piles and comparing the inverse-analysis results with the APILE results; 4) comparing the ultimate frictional resistance between tensile and compressive piles, and between drilled shafts and barrettes, in the same soil layer; and 5) since the interpretation of tensile piles is often affected by concrete cracking during testing, proposing a method to correct for this problem. Nowadays, the secant modulus method is the method most often used in Taiwan to interpret the axial force along the shaft of a tested pile. Sometimes the read-out from the installed strain gauges is amplified by concrete cracking, which causes all the strain to be carried by the rebar alone rather than by both rebar and concrete; a correct elastic modulus then cannot be estimated. The effect of concrete cracking is more serious for tensile load tests. This thesis therefore proposes using the tangent modulus method and then applying the inverse analysis of Z.R. Xiao (2003) to simulate a reasonable axial force distribution along the shaft; in addition, the ultimate frictional resistance and the shear stiffness at the pile/soil interface can be estimated. The program APILE is also used to analyze the tested piles, and the results interpreted from the pile load test read-outs and from the inverse analysis are compared with the APILE results. The results of this study show that the axial force obtained by the tangent modulus method is larger than that of the commonly used secant modulus method, indicating that the secant modulus method is more conservative in estimating the concrete modulus, although the tangent modulus method is more complicated to apply. The inverse method adopted from Z.R. Xiao (2003) can reasonably simulate the axial force with depth by curve fitting, yielding the ultimate frictional resistance and the shear stiffness at the pile/soil interface. Comparing the results from the pile load test interpretation, the inverse analysis, and the APILE analysis, the three methods show similar trends.
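For background on the secant/tangent distinction discussed in this abstract (standard definitions, not taken from the thesis): with \(\sigma\) the stress and \(\varepsilon\) the strain measured at a gauge level,

    \[
      E_{\mathrm{sec}} \;=\; \frac{\sigma}{\varepsilon},
      \qquad
      E_{\mathrm{tan}} \;=\; \frac{d\sigma}{d\varepsilon},
      \qquad
      P \;=\; E\,A\,\varepsilon,
    \]

so the axial force \(P\) at a gauge level follows from the modulus \(E\), the cross-sectional area \(A\), and the measured strain \(\varepsilon\). Once the concrete cracks, the strain is carried by the rebar alone, and a modulus estimated for the intact composite section no longer applies, which is why the interpretation of tensile tests is delicate.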
39

Rezazadeh, Azar Ehsan. "Computer Vision-based Solution to Monitor Earth Material Loading Activities". Thesis, 2013. http://hdl.handle.net/1807/35938.

Abstract
Large-scale earthmoving activities make up a costly and air-polluting aspect of many construction projects and mining operations, which depend entirely on the use of heavy construction equipment. The long-term jobsites and manufacturing nature of the mining sector have encouraged the application of automated control systems, most notably GPS, to manage the earthmoving fleet. Computer vision-based methods are another potential tool for providing real-time information at low cost and reducing human error on surface earthmoving sites, since relatively clear views can be selected and the equipment offers recognizable targets. Vision-based methods have some advantages over positioning devices: they are not intrusive, they provide detailed data about the behaviour of each piece of equipment, and they offer reliable documentation for future review. This dissertation describes the development of a vision-based system, named the server-customer interaction planner (SCIT), to recognize and count earth-material loading cycles. The SCIT system consists of three main modules: object recognition, tracking, and action recognition. Different object recognition and tracking algorithms were evaluated and modified, and the best-suited methods were used to develop the object recognition and tracking modules. A novel hybrid tracking framework was developed for the SCIT system to track dump trucks in the challenging views found in loading zones. The object recognition and tracking engines provide spatiotemporal data about the equipment, which the action recognition module then analyzes to estimate loading cycles. The entire framework was evaluated using videos taken under varying conditions. The results highlight the promising performance of the SCIT system with the hybrid tracking engine, validating its potential for practical application.
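The dissertation's implementation is not reproduced here, but the three-module architecture it describes can be sketched as follows: the detector is a stub, the tracker a simple nearest-centroid associator, and the action module a dwell-time state machine. All thresholds and names are illustrative assumptions.

    import math
    from dataclasses import dataclass

    def detect_trucks(frame):
        # Stub for the object-recognition module: a real system runs a trained
        # detector on the image; here a "frame" is already a centroid list.
        return frame

    @dataclass
    class Track:
        centroid: tuple
        dwell: int = 0    # consecutive frames spent inside the loading zone
        cycles: int = 0   # completed loading cycles

    class CentroidTracker:
        # Tracking module: nearest-centroid association between frames.
        def __init__(self, max_dist=50.0):
            self.max_dist, self.tracks = max_dist, []

        def update(self, detections):
            for det in detections:
                best = min(self.tracks, default=None,
                           key=lambda t: math.dist(t.centroid, det))
                if best is not None and math.dist(best.centroid, det) < self.max_dist:
                    best.centroid = det
                else:
                    self.tracks.append(Track(det))
            return self.tracks

    def recognize_loading(track, zone, dwell_frames=30):
        # Action-recognition module: a sufficiently long dwell inside the
        # loading zone counts as one served truck, i.e. one loading cycle.
        x, y = track.centroid
        x0, y0, x1, y1 = zone
        if x0 <= x <= x1 and y0 <= y <= y1:
            track.dwell += 1
            if track.dwell == dwell_frames:
                track.cycles += 1
        else:
            track.dwell = 0

    tracker = CentroidTracker()
    zone = (100, 100, 300, 300)            # illustrative loading-zone rectangle
    for frame in [[(120, 150)]] * 40:      # toy stream: one stationary truck
        for track in tracker.update(detect_trucks(frame)):
            recognize_loading(track, zone)
    print("loading cycles:", tracker.tracks[0].cycles)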
40

Chuang, Wen Wei and 莊文維. "An Optimized Data Loading Mechanism with Parallel Processing for List-style UI in Mobile Systems". Thesis, 2015. http://ndltd.ncl.edu.tw/handle/55368670034435539961.

Abstract
Master's degree
National Taiwan University of Science and Technology
Department of Electronic Engineering
Academic year 103 (ROC calendar)
Nowadays, consumer electronics are pervasive and regarded as an indispensable part of daily life. With the rapid development of devices and of network connectivity, mobile applications routinely access abundant data from the Internet and from internal storage. A list-style user interface is often used to present such data in Android applications. However, according to recent studies on performance issues, the conventional data-processing mechanism for list-style user interfaces is sluggish. This thesis proposes the LDVI mechanism to improve data-processing performance for list-style user interfaces. The proposed mechanism processes abundant data with content separation and parallel data processing: it dynamically loads the visible items and eliminates the redundant processing of invisible items in the list. Significantly, no matter how many items are added, the LDVI mechanism keeps resource consumption within a constant range for each scrolling event rather than letting it grow continuously. It improves the operation of the list-style user interface and delivers a better user experience. The LDVI mechanism has been implemented as a framework on Android. In our experiments, its CPU activity stays in the range of 7.80-9.30%, with an average of 8.71%. Compared with the conventional mechanism, data-processing performance is improved by at least 5.18% and CPU activity is reduced by at least 6.26%.
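LDVI's internals are not public beyond this abstract, but its core idea (bind only the visible window, recycle item views, fetch rows in parallel) can be sketched in a few lines; every name and size below is an illustrative assumption.

    from concurrent.futures import ThreadPoolExecutor

    DATASET = [f"row {i}" for i in range(10_000)]   # backing data source
    VISIBLE = 8                                     # items that fit on screen
    pool = ThreadPoolExecutor(max_workers=4)        # parallel row loading
    view_pool = []                                  # recycled item "views"

    def bind(view, row):
        view["text"] = row      # cheap re-bind, no new allocation
        return view

    def on_scroll(first_visible):
        # Process exactly the visible window; invisible items are never
        # touched, so the cost per scroll event is constant regardless of
        # how large the dataset grows.
        window = range(first_visible, first_visible + VISIBLE)
        rows = list(pool.map(lambda i: DATASET[i], window))
        views = [view_pool.pop() if view_pool else {} for _ in window]
        bound = [bind(v, r) for v, r in zip(views, rows)]
        view_pool.extend(bound)  # recycle the views for the next scroll event
        return bound

    for first in (0, 100, 5000):   # three scroll events over a 10,000-item list
        print(on_scroll(first)[0]["text"])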
41

Hill, Jaclyn Marie. "Baseline isotopic data for Wolffia spp.: another option for tracing N-loading in freshwater systems?" 2018. http://hdl.handle.net/10962/68721.

Abstract
δ15N values of aquatic plants can reflect anthropogenic N loading. Recent work suggests the duckweed Spirodela spp. effectively maps N loading in freshwater ecosystems, but its use may be complicated by a cyanobacterium-duckweed symbiosis that could reduce its utility in low-nutrient environments. I aimed to evaluate the potential of a second duckweed species, Wolffia spp., which lacks a cyanobacterial symbiosis, for use in pollution monitoring in freshwater ecosystems. I used a series of laboratory experiments to investigate δ15N equilibration rates and concentration-level effects of single-source N solutions in plant tissue over 16 d, to provide baseline data for sewage-plume mapping with Wolffia spp. I also tested concentration-level effects in multisource solutions to investigate the effects of mixed-source inputs. Wolffia reflected environmental N sources with a 12- to 16-d isotopic equilibration time and showed enriched and depleted δ15N ratios for manure and KNO3 solutions, respectively, but distinguished poorly between lower concentrations of manure. Fractionations at isotopic equilibrium were opposite to expectations and decreased with increasing [N]. Wolffia showed a consistent preference for NH3 in mixed-source treatments, regardless of the proportion or concentration of NH3 or NO3– available, and a capacity for N storage, which may complicate mapping of N loading in natural environments. Wolffia is likely to be a less useful bioindicator than the previously tested Spirodela. Future research should focus on field applications of Spirodela spp. to test its capacity for sewage-plume mapping of freshwater ecosystems in a natural environment.
42

施佑達. "The study of data loading and cleaning in data warehouse: the case of an intelligent maintenance system for the quality of life of elderly". Thesis, 2003. http://ndltd.ncl.edu.tw/handle/64210180128863340034.

Abstract
Master's degree
Yuan Ze University
Graduate Institute of Information Management
Academic year 91 (ROC calendar)
Data warehousing is a technique commonly used by enterprises nowadays. Enterprises collect data of all kinds from across the organization into a data warehouse, which provides the necessary material to decision makers. Because the sources are heterogeneous, the data are likely to be disordered and may even contain mistakes, so a suitable data format transformation and data cleaning process is required before the data are put into the warehouse, to assure their quality. This research takes an intelligent maintenance system for the quality of life of the elderly as an example, and it describes the process of data format transformation and data validation, guided by the opinions of specialists and constrained by logical rules, to ensure the accuracy of the data. The system remotely monitors the activity and surrounding environment of the elderly through unobtrusive sensors placed nearby. Relatives can inquire about the quality of daily life over the Internet, and doctors and analysts can look for abnormal situations through an OLAP interface. At the same time, the system stores the data in the warehouse and analyzes them with association rules to find relationships between sensors.
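A hedged sketch of the rule-based cleaning described above: expert-style logical rules accept or reject sensor records before they are loaded into the warehouse. The field names and plausible ranges are invented for illustration.

    RULES = [
        ("temperature_c", lambda v: -10 <= v <= 50),   # plausible indoor range
        ("heart_rate",    lambda v: 30 <= v <= 220),
    ]

    def clean(records):
        for rec in records:
            if all(field in rec and check(rec[field]) for field, check in RULES):
                yield rec   # passes every logical rule: load into the warehouse
            # otherwise: route the record to a reject table for expert review

    records = [{"temperature_c": 21.5, "heart_rate": 72},
               {"temperature_c": 400.0, "heart_rate": 72}]  # second: sensor glitch
    print(list(clean(records)))   # only the first record survives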
43

Chung, Yi-Da and 鍾易達. "Study on the Data Quality and Ocean Tidal Loading influencing the Positioning Precision of GPS Permanent Stations". Thesis, 2016. http://ndltd.ncl.edu.tw/handle/cfne6y.

Abstract
Doctorate
National Chiao Tung University
Department of Civil Engineering
Academic year 105 (ROC calendar)
High-accuracy relative GPS positioning relies on precise coordinates of base stations and good GPS data quality. The aim of this study is to improve the quality of GPS observations by using the best available model of the ocean tidal loading effect and by identifying the factors that degrade GPS positioning accuracy. In general, common-mode errors, such as errors due to clock, ionospheric, and tropospheric delays, can be reduced by differential GPS positioning. However, non-common errors, such as the ocean loading effect, cannot be eliminated by relative positioning, and they lower GPS positioning accuracy. In this study, the factors affecting GPS positioning accuracy are analyzed in detail, with a focus on the ocean tidal loading effect and on GPS data quality. The following six quality indicators are investigated: receiver clock stability and offset (clock drift), cycle slips in the carrier-phase observations (o/slps), multipath on L1 (mp1), multipath on L2 (mp2), and number of observations (obs). The key finding is that receiver clock drift dominates positioning accuracy. The ocean tidal loading effect (OTL) has a strong influence on GPS heighting accuracy. To investigate OTL-induced height variations, we conduct wavelet and harmonic analyses of GPS data collected at Taiwan's coastal stations. We assess several OTL models and conclude that NAO.99b is the optimal model; OTL corrections improve GPS heighting accuracy by 30-45%. The overall accuracy improvement at GPS stations in southwestern Taiwan is higher than at stations in southeastern Taiwan, and in eastern Taiwan the phases of OTL-induced height variations lag those of ocean tidal heights by 1.5-2 hours. The results of this study are expected to benefit GPS positioning accuracy at Taiwan's coastal GPS stations, and the identified dominant factors can be used to improve GPS data processing for high-accuracy monitoring of crustal motion and relative sea-level rise.
44

Rodgers, Lisa. "Synthesis of Water Quality Data and Modeling Non-Point Loading in Four Coastal B.C. Watersheds: Implications for Lake and Watershed Health and Management". Thesis, 2015. http://hdl.handle.net/1828/6999.

Abstract
I compared and contrasted nitrogen and phosphorus concentrations and land-use differences in two oligotrophic lakes (Sooke and Shawnigan) and two meso-eutrophic lakes (St. Mary and Elk) in order to evaluate nutrient concentrations over time and the relationship between in-lake nutrients and land use in the surrounding watershed. I used MapShed© nutrient-transport modeling software to estimate the mass load of phosphorus and nitrogen to each lake, and evaluated the feasibility of land-use modifications for reducing in-lake nutrients. In comparing nitrogen and phosphorus data in Sooke and Shawnigan Lakes, I determined that natural watershed characteristics (i.e., precipitation, topography, and soils) did not account for the elevated nutrient concentrations in Shawnigan versus Sooke Lake; these characteristics indicated that external loads into Shawnigan Lake would be less than or equal to those into Sooke Lake if both watersheds were completely forested. I evaluated trends in in-lake nutrient concentrations for Sooke and Shawnigan Lakes, as well as for the two eutrophic lakes, St. Mary and Elk. Ten- to thirty-year trends indicate that nitrogen and phosphorus levels in these lakes have not changed significantly over time, and time-segmented data showed that nutrient trends are mostly declining or holding at a steady state. Most nutrient concentration data are not precipitation-dependent, and this, coupled with significant correlations to water temperature and dissolved oxygen, indicates that in-lake processes, not external loading, are the primary influence on lake nutrient concentrations. External loading was estimated using MapShed©, a GIS-based watershed loading software program. Model validation results indicate that MapShed© could be used to determine the effect of external loading on lake water quality if accurate outflow volumes are available. Based on various land-cover scenarios, some reduction in external loading may be achieved through land-based restoration (e.g., reforestation), but the feasibility of restoration activities is limited by private property. Given that most of the loading was attributed to in-lake processes, land-based restoration may not be the most effective solution for reducing in-lake nitrogen and phosphorus concentrations.
Graduate
45

Duarte, Tatiana Marina Gaspar Martins. "Implementação de um Sistema de Business Intelligence". Master's thesis, 2018. http://hdl.handle.net/10400.26/28598.

Abstract
Data are currently among a company's most important and critical assets, and their exploration and analysis is an asset in aiding and supporting decision making. In an era in which the volume of data grows exponentially, timely access to this information can make all the difference in an organizational context. Aware of this new reality, companies implement Business Intelligence systems as a way of extracting and analysing large quantities of data, enabling more conscious and informed decision making. This process, known as knowledge management, is characterized as a set of processes and tools that organize and systematize data with the goal of transforming them into knowledge. To this end, tools based on Information Technology (IT) are used to automate the processes inherent in transforming data into knowledge. In this thesis, developed within the Master's in Analytics and Organizational Intelligence and arising from an internship agreement with Compta Emerging Business (CEB), we describe the implementation of a Business Intelligence system for an organization in the waste-collection business. First, data from an internal data source were analysed to identify the data that fit the organization's needs. Next, a multidimensional data model was created for the construction of a data warehouse (DW), into which the data extracted and transformed in the ETL (Extraction, Transformation and Loading) process are loaded. Finally, reports and dashboards were built to aid and support decision making.
Instituto Politécnico de Tomar
46

Stevens, W. Richard. "Pore water pressure in rock slopes and rockfill slopes subject to dynamic loading". 1985. http://etd.library.arizona.edu/etd/GetFileServlet?file=file:///data1/pdf/etd/azu_e9791_1985_430_sip1_w.pdf&type=application/pdf.

47

Haldar, Sumanta. "Reliability Based Design Methods Of Pile Foundations Under Static And Seismic Loads". Thesis, 2008. http://hdl.handle.net/2005/856.

Abstract
The properties of natural soil are inherently variable and influence design decisions in geotechnical engineering. Apart from the inherent variability of the soil, variability may arise from the measurement of soil properties in field or laboratory tests and from model errors. These wide ranges of variability are expressed in terms of mean, variance, and autocorrelation function using probability/reliability-based models. The most common term used in reliability-based design is the reliability index, a probabilistic measure of assurance of the performance of a structure. The main objective of reliability-based design is to quantify the probability of failure, or reliability, of a geotechnical system, considering the variability in the design parameters and the associated safety. In foundation design, reliability-based design is useful compared with the deterministic factor-of-safety approach. Several design codes of practice recommend limit-state design concepts based on probabilistic models and suggest that the development of reliability-based design methodologies for practical use is of immense value. The objective of the present study is to propose reliability-based design methodologies for pile foundations under static and seismic loads.

The work presented in this dissertation is subdivided into two parts: the design of pile foundations under static vertical and lateral loading, and the design of piles under seismic loading, embedded in non-liquefiable and liquefiable soil. The significance of considering the variability of soil parameters in pile foundation design is highlighted. Chapter 2 presents a brief review of the literature on current pile design methods under vertical, lateral, and seismic loads and identifies the scope of the work. Chapter 3 discusses the methods of analysis subsequently used in the study. Chapter 4 presents the reliability-based design methodology for vertically and laterally loaded piles based on cone penetration test (CPT) data for cohesive soil; CPT data from the Konaseema area in India are used, with the ultimate and serviceability limit states considered using CPT data and load-displacement curves. Chapter 5 presents the load and resistance factor design (LRFD) of vertically and laterally loaded piles based on load test data, with reliability-based, code-calibrated partial factors determined considering bias in the failure criteria, model bias, and variability in load and resistance. Chapter 6 is a comprehensive study of the effect of soil spatial variability on the response of vertically and laterally loaded pile foundations in undrained clay: the two-dimensional finite difference program FLAC2D (Itasca 2005) is used to model the soil and pile, the response due to the variance and spatial correlation of undrained shear strength is studied using Monte Carlo simulation, and the influence of spatial variability on the formation and propagation of failure near the foundation is examined. Chapter 7 describes the reliability-based design methodology for piles in non-liquefiable soil, with the seismic load on the pile foundation determined from the code-specified elastic design response spectrum using a pseudo-static approach; variability in the seismic load and in the soil undrained shear strength is incorporated.

Chapter 8 examines the effects of soil relative density, pile diameter, earthquake predominant frequency, and peak acceleration on the two plausible failure mechanisms, bending and buckling, using two-dimensional finite difference dynamic analysis. Chapter 9 proposes a probabilistic approach to identify the governing failure modes of piles in liquefiable soil, considering variability in the SPT-N value, friction angle, shear modulus, bulk modulus, permeability, and shear strain at 50% of the modulus ratio; Monte Carlo simulation is used to determine the probability of failure, and the well-documented failure of the Showa Bridge piles in the 1964 Niigata earthquake is treated as a case example.

Based on the studies reported in this dissertation, it can be concluded that reliability-based design of pile foundations, accounting for the variability and spatial correlation of soil, enables a rational choice of design loads, and that quantifying the variability in the seismic design load and soil shear strength puts the risk involved in pile design on a rational basis. Identifying the depth of the liquefiable soil layer is found to be most important for identifying the failure mechanisms of piles in liquefiable soil; soil type, earthquake intensity, predominant frequency, pile material, and soil variability are also significant.
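The reliability index mentioned in this abstract has, in its classical (Cornell) form for independent, normally distributed resistance \(R\) and load effect \(S\), the textbook expression

    \[
      \beta \;=\; \frac{\mu_{R}-\mu_{S}}{\sqrt{\sigma_{R}^{2}+\sigma_{S}^{2}}},
      \qquad
      P_{f} \;=\; \Phi(-\beta),
    \]

where \(\mu\) and \(\sigma\) denote means and standard deviations and \(\Phi\) is the standard normal distribution function; the thesis's Monte Carlo simulations estimate \(P_{f}\) directly when such normality assumptions do not hold.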