Selection of scientific literature on the topic "Civil engineering Data processing"

Cite a source in APA, MLA, Chicago, Harvard, and other citation styles

Select a source type:

Consult the lists of current articles, books, theses, reports, and other scholarly sources on the topic "Civil engineering Data processing".

Next to every entry in the bibliography an "Add to bibliography" option is available. Use it, and the bibliographic reference for the selected work will be formatted automatically in the required citation style (APA, MLA, Harvard, Chicago, Vancouver, etc.).

You can also download the full text of the scientific publication as a PDF and read an online annotation of the work, provided the relevant parameters are available in the metadata.

Journal articles on the topic "Civil engineering Data processing":

1. Pu, Wenjing. "Standardized Data Processing". Transportation Research Record: Journal of the Transportation Research Board 2338, No. 1 (January 2013): 44–57. http://dx.doi.org/10.3141/2338-06.

2. Iraqi, A., R. Z. Morawski, A. Barwicz and W. J. Bock. "Distributed data processing in a telemetric system for monitoring civil engineering constructions". IEEE Transactions on Instrumentation and Measurement 48, No. 3 (June 1999): 773–77. http://dx.doi.org/10.1109/19.772220.

3. Čajka, Radim, and Martin Krejsa. "Measured Data Processing in Civil Structure Using the DOProC Method". Advanced Materials Research 859 (December 2013): 114–21. http://dx.doi.org/10.4028/www.scientific.net/amr.859.114.

Annotation:
This paper describes the use of measured values in probabilistic tasks by means of a new method currently under development, Direct Optimized Probabilistic Calculation (DOProC). The method has already been used to solve a number of probabilistic tasks. DOProC has been implemented in ProbCalc; part of this software is a module for entering and assessing the measured data. The software can read values saved in a text file and can create histograms with a non-parametric (empirical) distribution of the probabilities. In the case of a parametric distribution, it is possible to select from among 24 defined types and to identify the best choice using the coefficient of determination. This approach has been used, for instance, for modelling and experimental validation of the reliability of an additionally prestressed masonry construction.
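The distribution-selection step described in this annotation (fitting parametric distributions to a measured histogram and ranking them by the coefficient of determination) can be sketched as follows; the data and the reduced candidate list are illustrative, and the actual ProbCalc module offers 24 distribution types:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
measured = rng.normal(loc=250.0, scale=12.0, size=500)  # stand-in for measured values

# Non-parametric (empirical) histogram of the measurements
density, edges = np.histogram(measured, bins=30, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])

# Fit candidate parametric distributions and rank them by the
# coefficient of determination (R^2) against the empirical density
best_name, best_r2 = None, -np.inf
for name in ("norm", "lognorm", "gumbel_r", "weibull_min"):
    dist = getattr(stats, name)
    params = dist.fit(measured)
    fitted = dist.pdf(centers, *params)
    ss_res = np.sum((density - fitted) ** 2)
    ss_tot = np.sum((density - density.mean()) ** 2)
    r2 = 1.0 - ss_res / ss_tot
    if r2 > best_r2:
        best_name, best_r2 = name, r2

print(best_name, round(best_r2, 3))
```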
4. Novita, Triske, and Durinta Puspasari. "Analysis of the E-Master Application as an Effort of Employment Data Processing". Economic Education Analysis Journal 10, No. 3 (October 30, 2021): 508–21. http://dx.doi.org/10.15294/eeaj.v10i3.48046.

Annotation:
This study aimed to analyze the E-Master application as an effort to process personnel data. The research was descriptive and qualitative. The research subjects were 5 people, consisting of 2 OTKP teachers and 3 administrative employees at SMKN 10 Surabaya, with 2 administrative employees and 1 Head of Study Program serving as source triangulation. Data collection techniques were questionnaires and interviews, while data analysis used data condensation, data presentation and drawing of conclusions. The results of the study showed: (1) the mapping indicator was able to map civil servants; (2) the SKP indicator could be used as a medium for monitoring the performance of each civil servant; (3) the training analysis indicator enabled civil servants to take part in training activities; (4) the promotion indicator made the promotion process easier for civil servants; (5) the pension indicator provided sufficient time for managing pension files; (6) the periodic salary indicator helped the finance department process periodic salary calculations; (7) the Taspen indicator made it easier for the staffing department to process data on retired personnel; (8) the leave indicator made it easier for staffing officers to print civil servant attendance lists; (9) the study permit indicator helped civil servants who wanted to continue their education.
5. Hein, Günter W., Alfred Leick and Steven Lambert. "Integrated Processing of GPS and Gravity Data". Journal of Surveying Engineering 115, No. 1 (February 1989): 15–33. http://dx.doi.org/10.1061/(asce)0733-9453(1989)115:1(15).

6. Hou, Feifei, Xiyue Rui, Xinyu Fan and Hang Zhang. "Review of GPR Activities in Civil Infrastructures: Data Analysis and Applications". Remote Sensing 14, No. 23 (November 25, 2022): 5972. http://dx.doi.org/10.3390/rs14235972.

Annotation:
Ground penetrating radar (GPR) technology has received in-depth analysis and rapid development in the field of civil engineering. GPR data analysis is one of the basic and challenging problems in this field. This research aims to conduct a comprehensive survey of progress from 2015 to the present in GPR scanning tasks. More than 130 major publications are cited in this research, covering different aspects of the work, including advanced data processing methods and a wide variety of applications. First, it briefly introduces the data collection of the GPR system and discusses signal complexity in simulated/real scenes. Then, it reviews the main signal processing techniques used to interpret GPR data. Subsequently, the latest GPR surveys are considered and divided according to four application domains, namely bridges, road pavements, underground utilities, and urban subsurface risks. Finally, the survey discusses the open challenges and directions for future research.
7. You, Yinchen, Yi Zheng and Xiaohui Chen. "Civil Engineering Simulation and Safety Detection of High-Rise Buildings Based on BIM". Mobile Information Systems 2022 (July 31, 2022): 1–7. http://dx.doi.org/10.1155/2022/7600848.

Annotation:
In the course of China's smart construction site development, BIM technology has been widely applied, with its advantages reflected in areas such as remote video site monitoring, visualized disclosure of on-site technical data, and quantity statistics. BIM technology can effectively help enterprises manage the whole construction site and can significantly improve its safety management level and work efficiency. In this paper, a numerical simulation processing system for civil engineering is designed. This system can not only predict and evaluate civil engineering design results but also help guide enterprises towards earthquake-resistant and aesthetically pleasing building structure designs and towards economical and reasonable construction decisions. The paper uses simulation models to study the civil engineering of high-rise buildings and conducts an in-depth study of the efficiency of the simulation results and of visualization methods, chiefly the efficient visualization of time-varying vector fields and of OpenSEES simulation output.
8. Knoop, Victor L., Serge P. Hoogendoorn and Henk J. van Zuylen. "Processing Traffic Data Collected by Remote Sensing". Transportation Research Record: Journal of the Transportation Research Board 2129, No. 1 (January 2009): 55–61. http://dx.doi.org/10.3141/2129-07.

9. Jiang, Zonglin. "Research on the development of BIM technology based on the application in the field of civil engineering". E3S Web of Conferences 261 (2021): 03025. http://dx.doi.org/10.1051/e3sconf/202126103025.

Annotation:
The compliance of civil engineering structural design is the fundamental condition that determines the subsequent safety and service life of civil engineering buildings. At the present stage, most civil engineering structural design is based on two-dimensional planar representations, which are less expressive, carry limited information and, due to certain limitations, struggle to deliver their full value in the actual design and construction process. Against this background, BIM technology emerged and has been promoted and applied to some extent. Compared with traditional design systems, BIM technology combines data processing, data caching, data sharing and other mechanisms, and is valued by major design companies. This paper introduces BIM, describes the current state and frontier of the structural engineering discipline, and takes the application of BIM technology in civil engineering structural design as its basic research point, exploring that application together with possible problems and measures to solve them.
10. Howell, Tommie F. "Automation for Transportation—More Than Data Processing". Journal of Transportation Engineering 116, No. 6 (November 1990): 831–35. http://dx.doi.org/10.1061/(asce)0733-947x(1990)116:6(831).


Dissertations on the topic "Civil engineering Data processing":

1. Sinske, A. N. (Alexander Nicholas). "Comparative evaluation of the model-centred and the application-centred design approach in civil engineering software". Thesis, Stellenbosch: Stellenbosch University, 2002. http://hdl.handle.net/10019.1/52741.

Annotation:
Thesis (PhD)--University of Stellenbosch, 2002.
ENGLISH ABSTRACT: In this dissertation the traditional model-centred (MC) design approach for the development of software in the civil engineering field is compared to a newly developed application-centred (AC) design approach. In the MC design software models play the central role. A software model maps part of the world, for example its visualization or analysis, onto the memory space of the computer. Characteristic of the MC design is that the identifiers of objects are unique and persistent only within the name scope of a model, and that classes which define the objects are components of the model. In the AC design all objects of the engineering task are collected in an application. The identifiers of the objects are unique and persistent within the name scope of the application and classes are no longer components of a model, but components of the software platform. This means that an object can be a part of several models. It is investigated whether the demands on the information and communication in modern civil engineering processes can be satisfied using the MC design approach. The investigation is based on the evaluation of existing software for the analysis and design of a sewer reticulation system of realistic dimensions and complexity. Structural, quantitative, as well as engineering complexity criteria are used to evaluate the design. For the evaluation of the quantitative criteria, in addition to the actual Duration of Execution, a User Interaction Count, the Persistent Data Size, and a Basic Instruction Count based on a source code complexity analysis, are introduced. The analysis of the MC design shows that the solution of an engineering task requires several models.
The interaction between the models proves to be complicated and inflexible due to the limitation of object identifier scope: the engineer is restricted to the concepts of the software developer, who must provide static bridges between models in the form of data files or software transformers. The concept of the AC design approach is then presented and implemented in a new software application written in Java. This application is also extended for the distributed computing scenario. New basic classes are defined to manage the static and dynamic behaviour of objects, and to ensure the consistent and persistent state of objects in the application. The same structural and quantitative analyses are performed using the same test data sets as for the MC application. It is shown that the AC design approach is superior to the MC design approach with respect to structural, quantitative and engineering complexity criteria. With respect to the design structure the limitation of object identifier scope, and thus the requirement for bridges between models, falls away, which is in particular of value for the distributed computing scenario. Although the new object management routines introduce an overhead in the duration of execution for the AC design compared to a hypothetical MC design with only one model and no software bridges, the advantages of the design structure outweigh this potential disadvantage.
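The central difference described in the abstract, identifiers scoped to one model versus one application, can be sketched roughly as follows; the class and attribute names are illustrative, not taken from the dissertation:

```python
class Application:
    """Application-centred design: one identifier namespace for all objects."""
    def __init__(self):
        self._objects = {}   # application-wide, persistent identifiers
        self._next_id = 0

    def register(self, obj):
        self._next_id += 1
        self._objects[self._next_id] = obj
        return self._next_id

class Node:
    """An engineering object (e.g. a manhole in a sewer network)."""
    def __init__(self, x, y):
        self.x, self.y = x, y

app = Application()
node_id = app.register(Node(0.0, 3.5))

# The same object (same identifier) participates in several models; no
# data-file "bridges" between models are needed, unlike in the MC design.
analysis_model = {node_id}       # object ids used by the analysis view
visualisation_model = {node_id}  # ...and by the visualisation view
print(analysis_model & visualisation_model)
```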
2. Bostanudin, Nurul Jihan Farhah. "Computational methods for processing ground penetrating radar data". Thesis, University of Portsmouth, 2013. https://researchportal.port.ac.uk/portal/en/theses/computational-methods-for-processing-ground-penetrating-radar-data(d519f94f-04eb-42af-a504-a4c4275d51ae).html.

Annotation:
The aim of this work was to investigate signal processing and analysis techniques for Ground Penetrating Radar (GPR) and its use in the civil engineering and construction industry. GPR is the general term applied to techniques which employ radio waves, typically in the Megahertz and Gigahertz range, to map structures and features buried in the ground or in manmade structures. GPR measurements can suffer from a large amount of noise. This is primarily caused by interference from other radio-wave-emitting devices (e.g., cell phones, radios, etc.) that are present in the surrounding area of the GPR system during data collection. In addition to noise, the presence of clutter – reflections from other non-target objects buried underground in the vicinity of the target – can make GPR measurements difficult to understand and interpret, even for skilled human GPR analysts. This thesis is concerned with the improvements and processes that can be applied to GPR data in order to enhance the target detection and characterisation process, particularly with multivariate signal processing techniques. These primarily include Principal Component Analysis (PCA) and Independent Component Analysis (ICA). Both techniques have been investigated, implemented and compared regarding their abilities to separate the target-originating signals from the noise and clutter type signals present in the data. A combination of PCA and ICA (SVDPICA) and two-dimensional PCA (2DPCA) are the specific approaches adopted and further developed in this work. The ability of these methods to reduce the amount of clutter and unwanted signals present in GPR data has been investigated and reported in this thesis, suggesting that their use in automated analysis of GPR images is a possibility.
Further analysis carried out in this work concentrated on analysing the performance of the developed multivariate signal processing techniques and, at the same time, investigating the possibility of identifying and characterising features of interest in pre-processed GPR images. The driving idea behind this part of the work was to extract the resonant modes present in the individual traces of each GPR image and to use the properties of those poles to characterise the target. Three related but different methods have been implemented and applied in this work – Extended Prony, Linear Prediction Singular Value Decomposition and Matrix Pencil methods. In addition to these approaches, the PCA technique has been used to reduce the dimensionality of extracted traces and to compare signals measured in various experimental setups. Performance analysis shows that Matrix Pencil offers the best results.
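The PCA-based clutter separation idea can be illustrated on a synthetic B-scan; this sketch uses a plain SVD and removes the first principal component as a background/clutter estimate, a common variant of the approach rather than the thesis's exact pipeline:

```python
import numpy as np

rng = np.random.default_rng(1)
n_traces, n_samples = 64, 256

# Synthetic B-scan: a clutter waveform common to every trace, plus a
# weak localized target reflection and measurement noise
t = np.arange(n_samples)
clutter = np.sin(2 * np.pi * t / 40.0)
bscan = np.tile(clutter, (n_traces, 1))
bscan[30:34, 120:140] += 0.5          # target signature in a few traces
bscan += 0.05 * rng.standard_normal(bscan.shape)

# PCA via SVD: the first principal component captures the correlated
# clutter; subtracting it leaves the target-related energy
U, s, Vt = np.linalg.svd(bscan, full_matrices=False)
s_clutter = s.copy()
s_clutter[1:] = 0.0
cleaned = bscan - U @ np.diag(s_clutter) @ Vt

target_energy = np.abs(cleaned[30:34, 120:140]).mean()
background_energy = np.abs(cleaned[:20, :100]).mean()
print(target_energy > background_energy)
```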
3. Yang, Su. "PC-grade parallel processing and hardware acceleration for large-scale data analysis". Thesis, University of Huddersfield, 2009. http://eprints.hud.ac.uk/id/eprint/8754/.

Annotation:
Arguably, modern graphics processing units (GPU) are the first commodity desktop parallel processors. Although GPU programming originated from interactive rendering in graphical applications such as computer games, researchers in the field of general purpose computation on GPU (GPGPU) are showing that the power, ubiquity and low cost of GPUs make them an ideal alternative platform for high-performance computing. This has resulted in extensive exploration of using the GPU to accelerate general-purpose computations in many engineering and mathematical domains outside of graphics. However, limited by the development complexity caused by the graphics-oriented concepts and development tools for GPU programming, GPGPU has mainly been discussed in the academic domain so far and has not yet fully fulfilled its promise in the real world. This thesis aims at exploiting GPGPU in the practical engineering domain and presents a novel contribution to GPGPU-driven linear time invariant (LTI) systems that are employed by the signal processing techniques in stylus-based or optical-based surface metrology and data processing. The core contributions achieved in this project can be summarized as follows. Firstly, a thorough survey of the state of the art of GPGPU applications and their development approaches has been carried out in this thesis. In addition, the category of parallel architecture pattern that the GPGPU belongs to has been specified, which formed the foundation of the GPGPU programming framework design in the thesis. Following this specification, a GPGPU programming framework is deduced as a general guideline to the various GPGPU programming models that are applied to a large diversity of algorithms in scientific computing and engineering applications.
Considering the evolution of GPU hardware architecture, the proposed framework covers the transition from graphics-originated concepts for GPGPU programming based on legacy GPUs to the abstraction of the stream processing pattern represented by the compute unified device architecture (CUDA), in which the GPU is considered not only a graphics device but a streaming coprocessor of the CPU. Secondly, the proposed GPGPU programming framework is applied to practical engineering applications, namely surface metrological data processing and image processing, to generate programming models that carry out parallel computing for the corresponding algorithms. The acceleration performance of these models is evaluated in terms of the speed-up factor and the data accuracy, which enabled the generation of quantifiable benchmarks for evaluating consumer-grade parallel processors. It shows that the GPGPU applications outperform the CPU solutions by up to 20 times without significant loss of data accuracy or any noticeable increase in source code complexity, which further validates the effectiveness of the proposed GPGPU general programming framework. Thirdly, this thesis devised methods for carrying out result visualization directly on the GPU, by storing processed data in local GPU memory and making use of the GPU's rendering device features to achieve real-time interaction. The algorithms employed in this thesis included various filtering techniques, the discrete wavelet transform, and the fast Fourier transform, which cover the common operations implemented in most LTI systems in the spatial and frequency domains. Considering the employed GPUs' hardware designs, especially the structure of the rendering pipelines, and the characteristics of the algorithms, the series of proposed GPGPU programming models has proven its feasibility, practicality, and robustness in real engineering applications.
The developed GPGPU programming framework as well as the programming models are anticipated to be adaptable for future consumer-level computing devices and other computationally demanding applications. In addition, it is envisaged that the devised principles and methods in the framework design are likely to have significant benefits outside the sphere of surface metrology.
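The LTI filtering workloads mentioned in the abstract (the kind of kernels offloaded to the GPU in this thesis) can be indicated by a CPU reference implementation; this sketch applies a Gaussian profile filter, in the style of ISO 16610-21, as a pointwise product in the frequency domain, with purely illustrative dimensions:

```python
import numpy as np

def gaussian_filter_fft(profile, spacing, cutoff):
    """FFT implementation of a Gaussian profile filter: an LTI operation,
    hence a pointwise product with a transmission curve in frequency."""
    n = profile.size
    freqs = np.fft.rfftfreq(n, d=spacing)           # cycles per mm
    alpha = np.sqrt(np.log(2) / np.pi)
    transmission = np.exp(-np.pi * (alpha * cutoff * freqs) ** 2)
    return np.fft.irfft(np.fft.rfft(profile) * transmission, n=n)

# Synthetic surface profile: long-wavelength form plus fine roughness
x = np.linspace(0.0, 8.0, 8000)                     # mm
profile = np.sin(2 * np.pi * x / 2.5) + 0.05 * np.sin(2 * np.pi * x / 0.01)
smooth = gaussian_filter_fft(profile, spacing=x[1] - x[0], cutoff=0.8)

# The form survives while short-wavelength roughness is attenuated
print(smooth.std() < profile.std())
```

The same pointwise frequency-domain product is embarrassingly parallel, which is why such filters map so naturally onto GPU stream processing.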
4. Oosthuizen, Daniel Rudolph. "Data modelling of industrial steel structures". Thesis, Stellenbosch: Stellenbosch University, 2003. http://hdl.handle.net/10019.1/53346.

Annotation:
Thesis (MScEng)--Stellenbosch University, 2003.
ENGLISH ABSTRACT: AP230 of STEP is an application protocol for structural steel-framed buildings. Product data relating to steel structures is represented in a model that captures analysis, design and manufacturing views. The information requirements described in AP230 were analysed with the purpose of identifying a subset of entities that are essential for the description of simple industrial steel frames, with the view to being able to describe the structural concept and to perform the structural analysis and design of such structures. Having identified the essential entities, a relational database model for these entities was developed. Planning, analysis and design applications will use the database to collaboratively exchange data relating to the structure. The comprehensiveness of the database model was investigated by mapping a simple industrial frame to the database model. Access to the database is provided by a set of classes called the database representative classes. The data-representatives are instances that have the same selection identifiers and attributes as corresponding information units in the database. The data-representatives' primary tasks are to store themselves in the database and to retrieve their state from the database. A graphical user interface application, programmed in Java, used for the description of the structural concept, with the capacity to store the concept in the database and retrieve it again through the use of the database representative classes, was also created as part of this project.
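The store/retrieve role of the database-representative classes can be sketched in miniature; the dissertation works in Java against a relational model of AP230 entities, so the table, class and column names in this Python/sqlite3 analogue are purely illustrative:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE beam (id INTEGER PRIMARY KEY, section TEXT, length_m REAL)")

class BeamRepresentative:
    """Mirrors one row of the 'beam' table: like a database-representative
    class, it stores itself and retrieves its own state from the database."""
    def __init__(self, id_, section=None, length_m=None):
        self.id, self.section, self.length_m = id_, section, length_m

    def store(self, conn):
        conn.execute(
            "INSERT OR REPLACE INTO beam (id, section, length_m) VALUES (?, ?, ?)",
            (self.id, self.section, self.length_m),
        )

    def retrieve(self, conn):
        row = conn.execute(
            "SELECT section, length_m FROM beam WHERE id = ?", (self.id,)
        ).fetchone()
        self.section, self.length_m = row
        return self

BeamRepresentative(1, "IPE 300", 6.0).store(conn)
beam = BeamRepresentative(1).retrieve(conn)
print(beam.section, beam.length_m)
```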
5. Lo, Kin-keung, and 羅建強. "An investigation of computer assisted testing for civil engineering students in a Hong Kong technical institute". Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 1988. http://hub.hku.hk/bib/B38627000.

6. Gkoktsi, K. "Compressive techniques for sub-Nyquist data acquisition & processing in vibration-based structural health monitoring of engineering structures". Thesis, City, University of London, 2018. http://openaccess.city.ac.uk/19192/.

Annotation:
Vibration-based structural health monitoring (VSHM) is an automated method for assessing the integrity and performance of dynamically excited structures through processing of structural vibration response signals acquired by arrays of sensors. From a technological viewpoint, wireless sensor networks (WSNs) offer less obtrusive, more economical, and rapid VSHM deployments in civil structures compared to their tethered counterparts, especially in monitoring large-scale and geometrically complex structures. However, WSNs are constrained by certain practical issues related to local power supply at sensors and restrictions on the amount of wirelessly transmitted data due to increased power consumption and bandwidth limitations in wireless communications. The primary objective of this thesis is to resolve the above issues by considering sub-Nyquist data acquisition and processing techniques that involve simultaneous signal acquisition and compression before transmission. This drastically reduces the sampling and transmission requirements, leading to power consumption reductions of up to 85–90% compared to conventional approaches at the Nyquist rate. Within this context, current state-of-the-art VSHM approaches exploit the theory of compressive sensing (CS) to acquire structural responses using non-uniform random sub-Nyquist sampling schemes. By exploiting the sparse structure of the analysed signals in a known vector basis (i.e., non-zero signal coefficients), the original time-domain signals are reconstructed on the uniform Nyquist grid by solving an underdetermined optimisation problem subject to signal sparsity constraints. However, CS sparse recovery is a computationally intensive problem that strongly depends on and is limited by the sparsity attributes of the measured signals on a pre-defined expansion basis. This sparsity information, though, is unknown in real-time VSHM deployments, while it is adversely affected by the noisy environments encountered in practice.
To efficiently address the above limitations encountered in CS-based VSHM methods, this research study proposes three alternative approaches for energy-efficient VSHM using compressed structural response signals under ambient vibrations. The first approach aims to enhance the sparsity information of vibrating structural responses by considering their representation in the wavelet transform domain using various oscillatory functions with different frequency-domain attributes. In this respect, a novel data-driven damage detection algorithm is developed herein, emerging as a fusion of the CS framework with the Relative Wavelet Entropy (RWE) damage index. By processing sparse signal coefficients on the harmonic wavelet transform for two comparative structural states (i.e., damaged versus healthy state), CS-based RWE damage indices are retrieved from a significantly reduced number of wavelet coefficients without reconstructing structural responses in the time domain. The second approach involves a novel signal-agnostic sub-Nyquist spectral estimation method free from sparsity constraints, which is proposed herein as a viable alternative for power-efficient WSNs in VSHM applications. The developed method relies on Power Spectrum Blind Sampling (PSBS) techniques together with a deterministic multi-coset sampling pattern, capable of acquiring stationary structural responses at sub-Nyquist rates without imposing sparsity conditions. Based on a network of wireless sensors operating on the same sampling pattern, auto/cross power-spectral density estimates are computed directly from compressed data by solving an overdetermined optimisation problem, thus by-passing the computationally intensive signal reconstruction operations in the time domain.
This innovative approach can be fused with standard operational modal analysis algorithms to estimate the inherent resonant frequencies and modal deflected shapes of structures under low-amplitude ambient vibrations with the minimum power, computational and memory requirements at the sensor, while outperforming pertinent CS-based approaches. Based on the extracted modal information, numerous data-driven damage detection strategies can be further employed to evaluate the condition of the monitored structures. The third approach of this thesis proposes a noise-immune damage detection method capable of capturing small shifts in structural natural frequencies before and after a seismic event of low intensity using compressed acceleration data contaminated with broadband noise. This novel approach relies on a recently established sub-Nyquist pseudo-spectral estimation method which combines the deterministic co-prime sub-Nyquist sampling technique with the multiple signal classification (MUSIC) pseudo-spectrum estimator. This is also a signal-agnostic and signal-reconstruction-free method that treats structural response signals as wide-sense stationary stochastic processes to retrieve, with very high resolution, auto-power spectral densities and structural natural frequency estimates directly from compressed data while filtering out additive broadband noise.
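The Relative Wavelet Entropy damage index mentioned for the first approach can be sketched schematically; the thesis computes band energies from harmonic wavelet coefficients, whereas this illustration substitutes simple FFT band energies and synthetic signals with a shifted resonance:

```python
import numpy as np

def band_energies(signal, n_bands=20, max_bin=200):
    """Normalized energy per frequency band; a stand-in for the
    harmonic-wavelet band energies used by the RWE index."""
    spec = np.abs(np.fft.rfft(signal)) ** 2
    bands = np.array_split(spec[:max_bin], n_bands)
    e = np.array([b.sum() for b in bands])
    return e / e.sum()

def relative_wavelet_entropy(p, q, eps=1e-12):
    # RWE = sum p_j * ln(p_j / q_j): grows as the energy distribution of
    # the monitored state departs from the baseline distribution q
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

rng = np.random.default_rng(2)
t = np.linspace(0.0, 10.0, 4096)
healthy = np.sin(2 * np.pi * 5.3 * t) + 0.1 * rng.standard_normal(t.size)
damaged = np.sin(2 * np.pi * 4.5 * t) + 0.1 * rng.standard_normal(t.size)  # shifted resonance

q = band_energies(healthy)                                      # baseline state
p_same = band_energies(healthy + 0.1 * rng.standard_normal(t.size))
p_dmg = band_energies(damaged)

# The index is near zero for an unchanged state and large after the shift
print(relative_wavelet_entropy(p_dmg, q) > relative_wavelet_entropy(p_same, q))
```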
7

Bakhary, Norhisham. „Structural condition monitoring and damage identification with artificial neural network“. University of Western Australia. School of Civil and Resource Engineering, 2009. http://theses.library.uwa.edu.au/adt-WU2009.0102.

Annotation:
Many methods have been developed and studied to detect damage through changes in the dynamic response of a structure. Owing to their capability to recognize patterns and to correlate non-linear and non-unique problems, Artificial Neural Networks (ANNs) have received increasing attention for detecting damage in structures based on vibration modal parameters. Most successful work reported on the application of ANNs for damage detection is limited to numerical examples and small, controlled experimental examples. This is because of two main constraints on their practical application to real structures: 1) the inevitable existence of uncertainties in vibration measurement data and in finite element modeling of the structure, which may lead to erroneous predictions of structural condition; and 2) the enormous computational effort required to reliably train an ANN model when it involves structures with many degrees of freedom. Therefore, most applications of ANNs in damage detection are limited to structural systems with a small number of degrees of freedom and quite significant damage levels. In this thesis, a probabilistic ANN model is proposed to take into consideration the uncertainties in the finite element model and the measured data. Rosenblueth's point estimate method is used to reduce the calculations in training and testing the probabilistic ANN model. The accuracy of the probabilistic model is verified by Monte Carlo simulations. Using the probabilistic ANN model, the statistics of the stiffness parameters can be predicted and used to calculate the probability of damage existence (PDE) in each structural member. The reliability and efficiency of this method are demonstrated using both numerical and experimental examples. In addition, a parametric study is carried out to investigate the sensitivity of the proposed method to different damage levels and to different uncertainty levels.
As an ANN model requires enormous computational effort in training when the number of degrees of freedom is relatively large, a substructuring approach employing multi-stage ANNs is proposed to tackle the problem. Through this method, a structure is divided into several substructures and each substructure is assessed separately with an independently trained ANN model. Once the damaged substructures are identified, second-stage ANN models are trained for these substructures to identify the damage locations and severities of the structural elements within them. Both numerical and experimental examples are used to demonstrate the probabilistic multi-stage ANN method. It is found that this substructuring ANN approach greatly reduces the computational effort while increasing damage detectability, because a finer element mesh can be used. It is also found that the probabilistic model gives better damage identification than the deterministic approach. A sensitivity analysis is also conducted to investigate the effect of substructure size, support conditions and different uncertainty levels on the damage detectability of the proposed method. The results demonstrate that the detectability of the proposed method is independent of the structure type, but dependent on the boundary conditions, substructure size and uncertainty level.
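Rosenblueth's point estimate method mentioned above replaces expensive sampling with a handful of deterministic model evaluations. As a minimal illustration (our own sketch, not the thesis code; the symmetric two-point variant), for n random inputs the function is evaluated at the 2^n combinations of mean plus/minus one standard deviation, each with equal weight:

```python
import numpy as np

def rosenblueth_2pm(g, means, stds):
    """Rosenblueth's two-point estimate method (symmetric case).

    Approximates the mean and variance of Y = g(X1, ..., Xn) by
    evaluating g at the 2**n points (mu_i +/- sigma_i), each with
    weight 1 / 2**n, instead of running many Monte Carlo samples.
    """
    n = len(means)
    ys = []
    for k in range(2 ** n):
        signs = [1 if (k >> i) & 1 else -1 for i in range(n)]
        x = [m + s * sg for m, s, sg in zip(means, stds, signs)]
        ys.append(g(*x))
    ys = np.asarray(ys, dtype=float)
    mean_y = ys.mean()                      # E[Y] estimate
    var_y = (ys ** 2).mean() - mean_y ** 2  # Var[Y] estimate
    return mean_y, var_y

# Linear check: Y = 2*X with X ~ (mu=3, sigma=0.5) gives E[Y]=6, Var[Y]=1
m, v = rosenblueth_2pm(lambda x: 2 * x, [3.0], [0.5])
print(m, v)  # 6.0 1.0
```

For the linear case the two-point estimate is exact, which is why it is a convenient sanity check; for nonlinear g it is a second-moment approximation.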
8

De, Kock Jacobus M. (Jacobus Michiel). „An overview of municipal information systems of Drakenstein municipality with reference to the Actionit open decision support framework“. Thesis, Stellenbosch : Stellenbosch University, 2002. http://hdl.handle.net/10019.1/52684.

Annotation:
Thesis (MScEng)--University of Stellenbosch, 2002.
ENGLISH ABSTRACT: ActionIT is a project undertaken by a consortium consisting of the CSIR, Simeka Management Consulting, the University of Pretoria and the University of Stellenbosch for the Innovation Fund of the Department of Arts, Culture, Science and Technology in South Africa. Its objective is to create a basic specification for selected information exchange that is compatible with all levels of government. The comparison between existing information systems at municipal level and the ActionIT specifications will be investigated in order to expose shortcomings on both sides. Appropriate features of existing information systems will be identified for the purpose of enhancing the ActionIT specifications. The ActionIT project is presently in its user-requirement and conceptual-model definition phase, and this thesis aims to provide information that may be helpful in future developments. The study undertaken in this thesis requires the application of analytical theory and a working knowledge of information systems and databases in order to: 1. Research existing information systems and relevant engineering data at local municipal authorities, including the gathering of information about the systems currently in use and the format in which information is stored and utilised at municipalities. 2. Carry out an adequate analysis of the contents of the recorded information. This information establishes background knowledge of the operations of local authorities and a clearer understanding of information systems. 3. Evaluate to what degree existing information systems comply with the ActionIT specifications. This is the main focus of this thesis. Thus the focus of this thesis is to record (provide an overview of) activities in a municipal environment and the interaction with that environment at information-system level, where standards provided by ActionIT as an Open Decision Support Framework can be of value.
AFRIKAANSE OPSOMMING: [An Afrikaans summary restating the English abstract above.]
9

Camp, Nicholas Julian. „A model for the time dependent behaviour of rock joints“. Master's thesis, University of Cape Town, 1989. http://hdl.handle.net/11427/21138.

Annotation:
This thesis is a theoretical investigation into the time-dependent behaviour of rock joints. Much of the research conducted to date in the area of finite element analysis has involved the development of special elements to deal with these discontinuities. A comprehensive literature survey is undertaken, highlighting some of the significant contributions to the modelling of joints. It is then shown how internal variables can be used to model discontinuities in the rock mass. A finite element formulation is described, resulting in a system of equations which can easily be adapted to cope with various constitutive behaviours on the discontinuities. In particular, a viscoplastic relationship, which uses a homogeneous, hyperbolic yield function, is adopted. The viscoplastic relationship can be used for both time-dependent (creep) and quasi-static (elasto-plastic) problems. Time-dependent behaviour requires a time integration scheme, and therefore a generalised explicit/implicit scheme is chosen. The resulting numerical algorithms are all implemented in the finite element program NOSTRUM. Various examples are presented to illustrate certain features of both the formulation and the numerical algorithm. Jointed rock beams and a jointed infinite rock mass are modelled assuming plane strain conditions, and reasons are proposed to explain the predicted behaviour. The results of the analysis show that the internal variable formulation successfully models time-dependent joint movements in a continuous medium. The method gives good qualitative results which agree with observations in deep-level mines. It is recommended that quantitative mine observations be used to calibrate the model so that usable predictions of joint movement can be made; this would enable any new developments to be implemented in the model. Further work on implicit methods might allow greater modelling flexibility by reducing computer run times.
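The generalised explicit/implicit time integration mentioned in the abstract is the classical theta-scheme. As a scalar illustration only (our own sketch under simplifying assumptions, not the NOSTRUM implementation), a rate equation dy/dt = f(y) is advanced with a blend of the explicit (theta = 0) and implicit (theta = 1) updates, the implicit part being resolved here by fixed-point iteration:

```python
def integrate_theta(rate, y0, dt, steps, theta=0.5, iters=20):
    """Integrate dy/dt = rate(y) with the generalised theta scheme:
    y_{n+1} = y_n + dt * ((1 - theta) * f(y_n) + theta * f(y_{n+1})).
    theta = 0 is fully explicit, theta = 1 fully implicit."""
    y = y0
    for _ in range(steps):
        y_new = y  # initial guess for the implicit unknown y_{n+1}
        for _ in range(iters):  # fixed-point iteration on y_{n+1}
            y_new = y + dt * ((1 - theta) * rate(y) + theta * rate(y_new))
        y = y_new
    return y

# Linear decay dy/dt = -y over t = 1: exact solution exp(-1), about 0.3679
y = integrate_theta(lambda v: -v, 1.0, dt=0.01, steps=100, theta=0.5)
print(round(y, 4))
```

In a viscoplastic joint model, rate(y) would be the flow rule evaluated from the hyperbolic yield function, and the same explicit/implicit blending applies component-wise to the internal variables.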
10

Goy, Cristina. „Displacement Data Processing and FEM Model Calibration of a 3D-Printed Groin Vault Subjected to Shaking-Table Tests“. Master's thesis, Alma Mater Studiorum - Università di Bologna, 2020. http://amslaurea.unibo.it/20061/.

Annotation:
The present thesis is part of the wider work required by the SEBESMOVA3D (SEeismic BEhavior of Scaled MOdels of groin VAults made by 3D printers) project, whose first motivation is the preservation of cultural heritage in the case of seismic events. Therefore, the main topic of the thesis is the analysis of the seismic response of scaled models of groin vaults, made of plastic 3D-printed bricks filled with mortar, subjected to shaking-table tests performed at the EQUALS laboratory of the University of Bristol. The work has been developed along two parallel tracks: the processing of the displacement data acquired in situ and the calibration of a FEM model.

Books on the topic "Civil engineering Data processing":

1

Bell, Trevor. Microcomputers in civil engineering. London: Construction Press, 1985.

2

ASCE International Workshop on Computing in Civil Engineering (2009 Austin, Tex.). Computing in civil engineering: Proceedings of the 2009 ASCE International Workshop on Computing in Civil Engineering : June 24-27, 2009, Austin, Texas. Reston, Va: American Society of Civil Engineers, 2009.

3

International Conference on Civil and Structural Engineering Computing (2nd 1985 London). Civil-Comp 85: Proceedings of the Second International Conference on Civil and Structural Engineering Computing. Edinburgh: Civil-Comp Press, 1985.

4

Mohsen, J. P., American Society of Civil Engineers Technical Council on Computer Practices, Committee on Coordination outside ASCE, and Congress on Computing in Civil Engineering (2nd : 1995 : Atlanta, Ga.), eds. Computing in civil engineering: Proceedings of the Second Congress held in conjunction with A/E/C Systems '95. New York: American Society of Civil Engineers, 1995.

5

International Computing Congress in Civil Engineering (5th 1998 Boston, Mass.). Computing in civil engineering: Proceedings of the International Computing Congress held in conjunction with the 1998 ASCE Annual Convention and Exhibition, Boston, Massachusetts, October 18-21, 1998. Edited by Kelvin C. P. Wang, Teresa M. Adams, the American Society of Civil Engineers Technical Council on Computing and Information Technology, Committee on Coordination Outside ASCE, and the ASCE Convention and Exhibition (1998 : Boston, Mass.). Reston, VA: American Society of Civil Engineers, 1998.

6

Congress on Computing in Civil Engineering (1st 1994 Washington, D.C.). Computing in civil engineering: Proceedings of the First Congress held in conjunction with A/E/C Systems '94. New York, N.Y: American Society of Civil Engineers, 1994.

7

International Conference on Civil and Structural Engineering Computing (2nd 1985 London). Civil-Comp 85: The proceedings of the Second International Conference on Civil and Structural Engineering Computing. Edinburgh: Civil-Comp, 1985.

8

Murphy, Donal P. Bibliography of civil engineering computer applications 1984-1990. Dublin: CITIS, 1991.

9

Falk, Howard. Microcomputer software for civil engineers. New York: Van Nostrand Reinhold, 1986.

Book chapters on the topic "Civil engineering Data processing":

1

Economou, Nikos, Antonis Vafidis, Francesco Benedetto and Amir M. Alani. „GPR Data Processing Techniques“. In Civil Engineering Applications of Ground Penetrating Radar, 281–97. Cham: Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-04813-0_11.

2

Castelli, Paolo, Michele Rizzo and Ostilio Spadaccini. „Data Processing Strategies for Monitoring an Offshore SPM System“. In Lecture Notes in Civil Engineering, 233–41. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-74258-4_16.

3

Baltazart, V., S. Todkar, X. Dérobert and J. M. Simonin. „Thin-Bed Data Model for the Processing of GPR Data over Debonded Pavement Structures“. In Lecture Notes in Civil Engineering, 615–22. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-55236-7_63.

4

Manosalvas-Paredes, Mario, Nizar Lajnef, Karim Chatti, Juliette Blanc, Nick Thom, Gordon Airey and Davide Lo Presti. „Monitoring Road Pavement Performance Through a Novel Data Processing Approach, Accelerated Pavement Test Results“. In Lecture Notes in Civil Engineering, 545–54. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-55236-7_56.

5

Thuy, Chu Thanh, Akihito Watanabe and Ryuta Wakutsu. „Cloud-Based 3D Data Processing and Modeling for UAV Application in Disaster Response and Construction Fields“. In Lecture Notes in Civil Engineering, 1177–82. Singapore: Springer Singapore, 2019. http://dx.doi.org/10.1007/978-981-15-2184-3_154.

6

Wei, Aimin. „Research and Practice on Diagnosis of Civil Engineering Majors Based on EFQM Excellence Model Analysis“. In Data Processing Techniques and Applications for Cyber-Physical Systems (DPTA 2019), 953–60. Singapore: Springer Singapore, 2020. http://dx.doi.org/10.1007/978-981-15-1468-5_110.

7

Dojcinovski, D. M., D. J. Mamucevski and V. P. Mihailov. „Seismic Monitoring of Nuclear Power Plants; An Approach to Optimal and More Accurate Seismic Data Processing and Interpretation Procedure“. In Strong Motion Instrumentation for Civil Engineering Structures, 417–31. Dordrecht: Springer Netherlands, 2001. http://dx.doi.org/10.1007/978-94-010-0696-5_29.

8

Tran, Anh Phuong, and Sébastien Lambot. „Development of Intrinsic Models for Describing Near-Field Antenna Effects, Including Antenna-Medium Coupling, for Improved Radar Data Processing Using Full-Wave Inversion“. In Civil Engineering Applications of Ground Penetrating Radar, 219–38. Cham: Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-04813-0_9.

9

Di Cosmo, Lucio. „Plot Level Estimation Procedures and Models“. In Springer Tracts in Civil Engineering, 119–49. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-030-98678-0_6.

Annotation:
Abstract: Quantitative variable raw data recorded in the sample plots require pre-processing before the NFI estimators of totals and densities can be used to produce statistics. The objective of the plot-level estimates is to estimate the variables of interest for each sample point, expanded to the 1 km² area of the cell that the point represents. The intensity and complexity of the computations vary considerably depending on the variable, the way it is obtained from the measured items (e.g., DBH measurement vs. basal area), whether all the items in the sample plot or only a subsample of them are measured, and the availability of models. The definitive results of the computations are tallies, volumes, biomass and carbon stocks, but estimates of additional variables may be needed at intermediate steps (e.g., total tree height). This chapter describes the methods and the models used in INFC2015 for the estimation of the variables related to trees (e.g., tallies, basal area), small trees and shrubs (e.g., biomass, carbon stock), stumps (e.g., volume, biomass), and stock variation (e.g., the wood annually produced by growth and that removed). Some of the models described were produced in view of the INFC's needs, before and after it was established in 2001, while others were created during the NFI computation processes. Finally, the conversion factors needed to estimate the biomass of deadwood, saplings and shrubs were obtained through an additional field campaign of the second Italian NFI (INFC2005) and the subsequent laboratory analyses.
10

Vassilev, Vassil, Sylvia Ilieva, Iva Krasteva, Irena Pavlova, Dessisslava Petrova-Antonova and Wiktor Sowinski-Mydlarz. „AI-Based Hybrid Data Platforms“. In Data Spaces, 147–70. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-030-98636-0_8.

Annotation:
Abstract: The current digital transformation of many businesses and the exponential growth of digital data are two of the key factors of the digital revolution. To meet the resulting high expectations, data platforms need to employ the recent theoretical, technological, and methodological advances in contemporary computing and data science and engineering. This chapter presents an approach to address these challenges by combining logical methods for knowledge processing and machine learning methods for data analysis into a hybrid AI-based framework. It is applicable to a wide range of problems that involve both synchronous operations and asynchronous events in different domains. The framework is a foundation for building the GATE Data Platform, which aims at the application of Big Data technologies in civil and government services, industry, and healthcare. The platform implementation will utilize several recent distributed technologies, such as the Internet of Things, cloud, and edge computing, and will integrate them into a multilevel service-oriented architecture that supports services along the entire data value chain, while the service orchestration guarantees a high degree of interoperability, reusability, and automation. The platform is designed to be compliant with open-source software, but its open architecture also supports mixing in commercial components and tools.

Conference papers on the topic "Civil engineering Data processing":

1

Berkhahn, Volker, Sebastian Rath and Erik Pasche. „Processing Remote Sensing Data for Flood Hazard Assessment“. In International Conference on Computing in Civil Engineering 2005. Reston, VA: American Society of Civil Engineers, 2005. http://dx.doi.org/10.1061/40794(179)43.

2

Guo, Zhengwei, Yongwei Gao, Yafei Jiang and Guang Xue. „GRIB Parallel Design of Civil Aviation Meteorological Data Processing System“. In 2015 14th International Symposium on Distributed Computing and Applications for Business Engineering and Science (DCABES). IEEE, 2015. http://dx.doi.org/10.1109/dcabes.2015.16.

3

Geng, Yanbin, Lei Yu and Liangliang Sun. „A Signal Processing Technique-Based Approach to the ITS Data Compression“. In International Conference on Computing in Civil Engineering 2005. Reston, VA: American Society of Civil Engineers, 2005. http://dx.doi.org/10.1061/40794(179)71.

4

„Processing E-nose Data for Salmonella Enterica Detection in Poultry Manure“. In International Conference on Biological, Civil and Environmental Engineering. International Institute of Chemical, Biological & Environmental Engineering, 2014. http://dx.doi.org/10.15242/iicbe.c0314046.

5

Pang, S., H. Sun, Y. Xu and H. Wang. „A generic method for sound data acquisition and processing on civil aircraft engineering simulator“. In CSAA/IET International Conference on Aircraft Utility Systems (AUS 2022). Institution of Engineering and Technology, 2022. http://dx.doi.org/10.1049/icp.2022.1753.

6

Mizuno, Y., and Y. Fujino. „Data Archiving and Processing Method Using Wavelet Decomposition for Structural Health Monitoring“. In International Workshop on Computing in Civil Engineering 2007. Reston, VA: American Society of Civil Engineers, 2007. http://dx.doi.org/10.1061/40937(261)80.

7

Brook, A., E. Ben-Dor and R. Richter. „Fusion of hyperspectral images and LiDAR data for civil engineering structure monitoring“. In 2010 2nd Workshop on Hyperspectral Image and Signal Processing: Evolution in Remote Sensing (WHISPERS). IEEE, 2010. http://dx.doi.org/10.1109/whispers.2010.5594872.

8

Tang, Pingbo, and Zhenglai Shen. „Time-Quality Analysis of Spatial Data Processing for Bridge Management“. In 2014 International Conference on Computing in Civil and Building Engineering. Reston, VA: American Society of Civil Engineers, 2014. http://dx.doi.org/10.1061/9780784413616.112.

9

Qiao, Fengxiang, Lei Yu and Mamadou Djimde. „Processing On-Road Truck Emission Data for Comparison with the Computer Model MOBILE6.2“. In International Conference on Computing in Civil Engineering 2005. Reston, VA: American Society of Civil Engineers, 2005. http://dx.doi.org/10.1061/40794(179)65.

10

Duan, Jin, Xiaoming Chen and Yungui Li. „A Structural Simulation System, Part 2: Data Processing Center“. In 2017 2nd International Conference on Civil, Transportation and Environmental Engineering (ICCTE 2017). Paris, France: Atlantis Press, 2017. http://dx.doi.org/10.2991/iccte-17.2017.42.


Reports of organizations on the topic "Civil engineering Data processing":

1

Johnston, Lisa, and Jon Jeffryes. Teaching Civil Engineering Data Information Literacy Skills: An e-Learning Approach. Purdue University, 2015. http://dx.doi.org/10.5703/1288284315479.

2

Otto, Philippa. An Analytical System for Determining Disciplinary Vocabulary for Data-Driven Learning: An Example from Civil Engineering. Portland State University Library, January 2000. http://dx.doi.org/10.15760/etd.5356.

3

Hall, Candice, and Robert Jensen. Utilizing data from the NOAA National Data Buoy Center. Engineer Research and Development Center (U.S.), March 2021. http://dx.doi.org/10.21079/11681/40059.

Annotation:
This Coastal and Hydraulics Engineering Technical Note (CHETN) guides users through the quality control (QC) and processing steps that are necessary when using archived U.S. National Oceanic and Atmospheric Administration (NOAA) National Data Buoy Center (NDBC) wave and meteorological data. This CHETN summarizes methodologies to geographically clean and QC NDBC measurement data for use by the U.S. Army Corps of Engineers (USACE) user community.
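Typical QC steps of the kind this CHETN describes include range and spike checks on the measured series. The following is a generic illustration only, with invented thresholds and names, not NDBC's published limits or procedures:

```python
import numpy as np

def qc_wave_heights(hs, max_valid=20.0, max_jump=5.0):
    """Flag suspect significant wave heights (m): True = suspect.

    Two illustrative checks: a gross range check (negative or
    implausibly large values) and a spike check on step-to-step jumps.
    """
    hs = np.asarray(hs, dtype=float)
    out_of_range = (hs < 0) | (hs > max_valid)
    # prepend hs[0] so diff has the same length as hs
    jump = np.abs(np.diff(hs, prepend=hs[0])) > max_jump
    return out_of_range | jump

flags = qc_wave_heights([1.2, 1.4, 99.0, 1.5])
print(flags.tolist())  # [False, False, True, True]
```

The spurious 99.0 m sample fails the range check, and the drop back to 1.5 m fails the spike check; both would be excluded or inspected before further processing.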
4

Kress, Martin A., and Samuel J. Weintraub. AIS Data Case Study: Selecting Design Vessels for New Jersey Back Bays Storm Surge Barriers Study. Engineer Research and Development Center (U.S.), February 2021. http://dx.doi.org/10.21079/11681/39779.

Annotation:
The purpose of this Coastal and Hydraulics Engineering technical note (CHETN) is to describe how historic Automatic Identification System (AIS) vessel position data were used to identify a design vessel for use in a storm surge barrier design study. Specifically, this CHETN describes how the AIS data were accessed, how the universe of vessel data was refined to allow for design vessel selection, and how that selection was used in a storm surge barrier (SSB) study. This CHETN draws upon the New Jersey Back Bays Coastal Storm Risk Management Feasibility Study (USACE-NAP 2019), specifically the Appendix B.2 Engineering Appendix Civil document. The New Jersey Back Bays Study itself builds upon the work of the North Atlantic Coast Comprehensive Study (NACCS) initiated after Hurricane Sandy in 2012 (USACE 2015a).
5

DeMarle, David, and Andrew Bauer. In situ visualization with temporal caching. Engineer Research and Development Center (U.S.), January 2022. http://dx.doi.org/10.21079/11681/43042.

Annotation:
In situ visualization is a technique in which plots and other visual analyses are performed in tandem with numerical simulation processes in order to better utilize HPC machine resources. Especially with unattended exploratory engineering simulation analyses, events may occur during the run which justify supplemental processing. Sometimes, though, when the events do occur, the phenomena of interest include the physics that precipitated the events, and this may be the key insight into understanding the phenomena being simulated. In situ temporal caching is the temporary storing of produced data in memory for possible later analysis, including time-varying visualization. The later analysis and visualization still occur during the simulation run, but not until after the significant events have been detected. In this article, we demonstrate how temporal caching can be used with in-line in situ visualization to reduce simulation run time while still capturing essential simulation results.
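The temporal-caching idea can be sketched with a fixed-capacity ring buffer. This is a simplified illustration of the concept only, not the ERDC/ParaView implementation: each timestep is stored in memory, old steps expire automatically, and the cached history is handed to the visualization pipeline only when an event trigger fires.

```python
from collections import deque

class TemporalCache:
    """Keep the last N timesteps in memory for event-triggered replay."""

    def __init__(self, capacity=5):
        # deque with maxlen silently discards the oldest step when full
        self.buffer = deque(maxlen=capacity)

    def store(self, step, field):
        self.buffer.append((step, field))

    def flush_on_event(self, triggered):
        """Return and clear the cached history when an event is detected."""
        if not triggered:
            return []
        history = list(self.buffer)
        self.buffer.clear()
        return history

cache = TemporalCache(capacity=3)
for step in range(6):
    cache.store(step, {"pressure": step * 0.1})  # stand-in for field data

events = cache.flush_on_event(triggered=True)
print([s for s, _ in events])  # [3, 4, 5]
```

Because only the most recent steps are retained, memory use stays bounded, yet the physics immediately preceding a detected event is still available for time-varying visualization.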
6

Suir, Glenn, Molly Reif and Christina Saltus. Remote sensing capabilities to support EWN® projects: an R&D approach to improve project efficiencies and quantify performance. Engineer Research and Development Center (U.S.), August 2022. http://dx.doi.org/10.21079/11681/45241.

Annotation:
Engineering With Nature (EWN®) is a US Army Corps of Engineers (USACE) Initiative and Program that promotes more sustainable practices for delivering economic, environmental, and social benefits through collaborative processes. As the number and variety of EWN® projects continue to grow and evolve, there is an increasing opportunity to improve how to quantify their benefits and communicate them to the public. Recent advancements in remote sensing technologies are significant for EWN® because they can provide project-relevant detail across a large areal extent, in which traditional survey methods may be complex due to site access limitations. These technologies encompass a suite of spatial and temporal data collection and processing techniques used to characterize Earth's surface properties and conditions that would otherwise be difficult to assess. This document aims to describe the general underpinnings and utility of remote sensing technologies and applications for use: (1) in specific phases of the EWN® project life cycle; (2) with specific EWN® project types; and (3) in the quantification and assessment of project implementation, performance, and benefits.
7

Modlo, Yevhenii O., Serhiy O. Semerikov, Stanislav L. Bondarevskyi, Stanislav T. Tolmachev, Oksana M. Markova and Pavlo P. Nechypurenko. Methods of using mobile Internet devices in the formation of the general scientific component of bachelor in electromechanics competency in modeling of technical objects. [б. в.], February 2020. http://dx.doi.org/10.31812/123456789/3677.

Annotation:
An analysis of the experience of professional training of bachelors of electromechanics in Ukraine and abroad made it possible to determine that one of the leading trends in its modernization is the synergistic integration of various engineering branches (mechanical, electrical, electronic engineering and automation) into mechatronics for the purpose of the design, manufacture, operation and maintenance of electromechanical equipment. Teaching mechatronics provides for the meaningful integration of various disciplines of professional and practical training of bachelors of electromechanics based on the concept of modeling, and the technological integration of various organizational forms and teaching methods based on the concept of mobility. Within this approach, the leading learning tools for bachelors of electromechanics are mobile Internet devices (MID) – multimedia mobile devices that provide wireless access to information and communication Internet services for collecting, organizing, storing, processing, transmitting and presenting all kinds of messages and data. The authors reveal the main possibilities of using MID in learning: ensuring equal access to education, personalized learning, instant feedback and evaluation of learning outcomes, mobile learning, productive use of time spent in classrooms, the creation of mobile learning communities, support for situated learning, the development of continuous seamless learning, bridging the gap between formal and informal learning, minimizing educational disruption in conflict and disaster areas, assisting learners with disabilities, improving the quality of communication and institutional management, and maximizing cost-efficiency.
Bachelor of electromechanics competency in modeling of technical objects is a personal and vocational ability which includes a system of knowledge, skills, experience in learning and research activities on modeling mechatronic systems, and a positive value attitude towards it; the bachelor of electromechanics should be ready and able to use methods and software/hardware modeling tools for process analysis, system synthesis, and the evaluation of their reliability and effectiveness in solving practical problems in the professional field. The competency structure of the bachelor of electromechanics in the modeling of technical objects is reflected in three groups of competencies: general scientific, general professional and specialized professional. The implementation of the technique of using MID in teaching bachelors of electromechanics the modeling of technical objects is the appropriate methodology, a component of which is the set of partial methods for using MID in the formation of the general scientific component of the bachelor of electromechanics competency in modeling of technical objects, illustrated using the academic disciplines „Higher mathematics“, „Computers and programming“, „Engineering mechanics“ and „Electrical machines“. The leading tools for the formation of the general scientific component of the bachelor in electromechanics competency in modeling of technical objects are mobile augmented reality tools (to visualize the objects' structure and modeling results), mobile computer mathematical systems (universal tools used at all stages of modeling learning), cloud-based spreadsheets (as modeling tools) and text editors (to write the program description of a model), mobile computer-aided design systems (to create and view the physical properties of models of technical objects) and mobile communication tools (to organize joint activity in modeling).
