
Dissertations / Theses on the topic 'Storm prediction'


Consult the top 50 dissertations / theses for your research on the topic 'Storm prediction.'


1

Lee, Michael. "Rapid Prediction of Tsunamis and Storm Surges Using Machine Learning." Diss., Virginia Tech, 2021. http://hdl.handle.net/10919/103154.

Full text
Abstract:
Tsunami and storm surge are two of the most destructive and costly natural hazards faced by coastal communities around the world. To enhance coastal resilience and to develop effective risk management strategies, accurate and efficient tsunami and storm surge prediction models are needed. However, existing physics-based numerical models struggle to achieve both accuracy and efficiency at the same time. In this dissertation, several surrogate models are developed using statistical and machine learning techniques that can rapidly predict tsunamis and storm surges without substantial loss of accuracy relative to high-fidelity physics-based models. First, a tsunami run-up response function (TRRF) model is developed that can rapidly predict a tsunami run-up distribution from earthquake fault parameters. This new surrogate modeling approach reduces the number of simulations required to build a surrogate model by separately modeling the leading-order contribution and the residual part of the tsunami run-up distribution. Second, a TRRF-based inversion (TRRF-INV) model is developed that can infer a tsunami source and its impact from tsunami run-up records. Because this new tsunami inversion model is based on the TRRF model, it can perform the large number of tsunami forward simulations that inversion modeling requires, which is impractical with physics-based models. Lastly, a one-dimensional convolutional neural network combined with principal component analysis and k-means clustering (C1PKNet) model is developed that can rapidly predict the peak storm surge from tropical cyclone track time series. Because the C1PKNet model uses the tropical cyclone track time series, it can predict more diverse tropical cyclone scenarios than existing surrogate models that rely on the tropical cyclone condition at a single moment (usually at or near landfall). 
The surrogate models developed in this dissertation have the potential to save lives, mitigate coastal hazard damage, and promote resilient coastal communities.
Doctor of Philosophy
Tsunami and storm surge can cause extensive damage to coastal communities; to reduce this damage, accurate and fast computer models are needed that can predict the water level change caused by these coastal hazards. The problem is that existing physics-based computer models are either accurate but slow, or fast but less accurate. In this dissertation, three new computer models are developed using statistical and machine learning techniques that can rapidly predict tsunamis and storm surges without substantial loss of accuracy compared to the accurate physics-based computer models. The three computer models are as follows: (1) a computer model that can rapidly predict the maximum ground elevation wetted by a tsunami along the coastline from earthquake information; (2) a computer model that can work backward to infer a tsunami source and its impact from observations of the maximum ground elevation wetted by the tsunami; (3) a computer model that can rapidly predict peak storm surges across a wide range of coastal areas from the tropical cyclone's track position over time. These new computer models have the potential to improve forecasting capabilities, advance understanding of historical tsunami and storm surge events, and lead to better preparedness plans for possible future tsunamis and storm surges.
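As an illustrative aside, the k-means step of a C1PKNet-style pipeline, which groups similar peak-surge response patterns before training per-cluster predictors, can be sketched in a few lines. The toy data, cluster count, and deterministic initialisation below are assumptions for the sketch, not details from the dissertation:

```python
def kmeans(points, k, iters=50):
    """Minimal k-means: group vectors by Euclidean distance."""
    # Deterministic initialisation: spread initial centers across the list.
    centers = [points[i * (len(points) - 1) // (k - 1)] for i in range(k)]
    for _ in range(iters):
        # Assign each point to its nearest center.
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])))
            clusters[j].append(p)
        # Recompute each center as the mean of its cluster.
        new_centers = [
            tuple(sum(dim) / len(cl) for dim in zip(*cl)) if cl else centers[j]
            for j, cl in enumerate(clusters)
        ]
        if new_centers == centers:
            break
        centers = new_centers
    return centers, clusters

# Two synthetic "surge response" groups: low-surge and high-surge profiles.
low = [(0.1, 0.2, 0.1), (0.2, 0.1, 0.2), (0.15, 0.15, 0.1)]
high = [(2.0, 2.2, 1.9), (2.1, 2.0, 2.0), (1.9, 2.1, 2.2)]
centers, clusters = kmeans(low + high, k=2)
```

In the actual model, the clustered vectors would be derived from simulated peak-surge fields (e.g. their principal-component scores) rather than raw toy tuples.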
APA, Harvard, Vancouver, ISO, and other styles
2

Suyanto, Adhi. "Estimating the exceedance probabilities of extreme floods using stochastic storm transposition and rainfall-runoff modelling." Thesis, University of Newcastle Upon Tyne, 1994. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.386794.

Full text
Abstract:
Methods of estimating floods with return periods of up to one hundred years are reasonably well established, and in the main rely on extrapolation of historical flood data at the site of interest. However, extrapolating the tails of fitted probability distributions to higher return periods is very unreliable and cannot provide a satisfactory basis for extreme flood estimation. The probable maximum flood concept is an alternative approach, which is often used for critical cases such as the location of nuclear power plants, and is viewed as a consequence of a combination of a probable maximum precipitation with the worst possible prevailing catchment conditions. Return periods are not usually quoted although they are implicitly thought to be of the order of tens of thousands of years. There are many less critical situations which still justify greater flood protection than would be provided for an estimated one-hundred-year flood. There is therefore a need for techniques which can be used to estimate floods with return periods of up to several thousand years. The predictive approach adopted here involves a combination of a probabilistic storm transposition technique with a physically-based distributed rainfall-runoff model. Extreme historical storms within a meteorologically homogeneous region are, conceptually, moved to the catchment of interest, and their return periods are estimated within a probabilistic framework. Known features of storms such as depth, duration, and perhaps approximate shape will, together with catchment characteristics, determine much of the runoff response. But there are other variables which also have an effect, and these include the space-time distribution of rainfall within the storm, storm velocity and antecedent catchment conditions. The effects of all these variables on catchment response are explored.
3

Hanson, Clair Elizabeth. "A cyclone climatology of the North Atlantic and its implications for the insurance market." Thesis, University of East Anglia, 2001. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.365137.

Full text
4

Jafari, Alireza. "Analysis and Prediction of Wave Transformation from Offshore into the Surfzone under Storm Condition." Thesis, Griffith University, 2013. http://hdl.handle.net/10072/366745.

Full text
Abstract:
Surfzone wave transformation under storm conditions is investigated through field and laboratory measurements in this study. The observations have been used to examine currently available models of wave energy dissipation. Detailed field data has been collected by means of a novel method first introduced by Nielsen (1988). This method has been utilised through a common program between Griffith University and The University of Queensland at The Spit on the Gold Coast in Southeast Queensland. The facility primarily consists of a manometer tube array with 12 different manometer tube lengths varying from 60 m to 500 m offshore and a concrete manhole excavated into the dune system to house the monitoring station. Accordingly, this system has enabled the monitoring of a detailed wave height profile across the surfzone under any conditions from the safety of the “bunker” on land. The findings of new laboratory experiments on the frequency response of the semi-rigid manometer tubes are also presented, which extend and improve upon the previous work of Nielsen et al. (1993). Testing was conducted over a range of frequencies (0.0067 Hz < f < 2 Hz) and tube lengths (10 m < L < 900 m). New frequency response factors are determined by fitting the semi-empirical gain function of Nielsen et al. (1993) to the observed gain data. As a result, new predictive formulas for the empirical coefficients as a function of tube parameters are provided in this study. Wave-induced pore pressure in the surfzone seabed is investigated based on the recorded field data. Two well-known models, i.e. Hsu and Jeng (1994) and Sleath (1970), are assessed against the field measurements. The findings validate the accuracy of the models and indicate that the extent of energy dissipation due to the overlying sand is less than 5% and depends on the incident wave length.
Thesis (PhD Doctorate)
Doctor of Philosophy (PhD)
Griffith School of Engineering
Science, Environment, Engineering and Technology
Full Text
5

Anderson, Ian. "Improving Detection And Prediction Of Bridge Scour Damage And Vulnerability Under Extreme Flood Events Using Geomorphic And Watershed Data." ScholarWorks @ UVM, 2018. https://scholarworks.uvm.edu/graddis/823.

Full text
Abstract:
Bridge scour is the leading cause of bridge damage nationwide. Successfully mitigating bridge scour problems depends on our ability to reliably estimate scour potential, design safe and economical foundation elements that account for scour potential, identify vulnerabilities related to extreme events, and recognize changes to the environmental setting that increase risk at existing bridges. This study leverages available information, gathered from several statewide resources, and adds watershed metrics to create a comprehensive, georeferenced dataset to identify parameters that correlate to bridges damaged in an extreme flood event. Understanding the underlying relationships between existing bridge condition, fluvial stresses, and geomorphological changes is key to identifying vulnerabilities in both existing and future bridge infrastructure. In creating this comprehensive database of bridge inspection records and associated damage characterization, features were identified that correlate to and discriminate between levels of bridge damage. Stream geomorphic assessment features were spatially joined to every bridge, marking the first time that geomorphic assessments have been broadly used for estimating bridge vulnerability. Stream power assessments and watershed delineations for every bridge and stream reach were generated to supplement the comprehensive database. Individual features were tested for their significance to discriminate bridge damage, and then used to create empirical fragility curves and probabilistic predictions maps to aid in future bridge vulnerability detection. Damage to over 300 Vermont bridges from a single extreme flood event, the August 28, 2011 Tropical Storm Irene, was used as the basis for this study. Damage to historic bridges was also summarized and tabulated. In some areas of Vermont, the storm rainfall recurrence interval exceeded 500 years, causing widespread flooding and damaging over 300 bridges. 
With a dataset of over 330 features for more than 2,000 observations of bridges that were damaged as well as not damaged in the storm, an advanced evolutionary algorithm performed multivariate feature selection to overcome the shortfalls of traditional logistic regression analysis. The analysis identified distinct combinations of variables that correlate to the observed bridge damage under extreme flood events.
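For illustration only, an empirical fragility curve of the kind described above is often modelled as a logistic function of a damage-related feature; the coefficients and feature values below are hypothetical, not the study's fitted values:

```python
import math

def fragility(x, a, b):
    """Logistic fragility curve: probability of damage given feature x
    (x could be, e.g., a stream-power metric at the bridge)."""
    return 1.0 / (1.0 + math.exp(-(a + b * x)))

# Hypothetical coefficients for the sketch.
a, b = -3.0, 0.8
probs = [fragility(x, a, b) for x in (0, 2, 5, 10)]
```

Fitted against observed damaged/undamaged outcomes, such a curve gives the probabilistic damage predictions used in the vulnerability maps.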
6

Anderson, John W. "An analysis of a dust storm impacting Operation Iraqi Freedom, 25-27 March 2003." Thesis, Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 2004. http://library.nps.navy.mil/uhtbin/hyperion/04Dec%5FAnderson.pdf.

Full text
7

Zhu, Dan. "Electric Distribution Reliability Analysis Considering Time-varying Load, Weather Conditions and Reconfiguration with Distributed Generation." Diss., Virginia Tech, 2007. http://hdl.handle.net/10919/26557.

Full text
Abstract:
This dissertation is a systematic study of electric power distribution system reliability evaluation and improvement. Reliability evaluation of electric power systems has traditionally been an integral part of planning and operation. Changes in the electric utility coupled with aging electric apparatus create a need for more realistic techniques for power system reliability modeling. This work presents a reliability evaluation technique that combines set theory and Graph Trace Analysis (GTA). Unlike the traditional Markov approach, this technique provides a fast solution for large system reliability evaluation by managing computer memory efficiently with iterators, assuming a single failure at a time. A reconfiguration for restoration algorithm is also created to enhance the accuracy of the reliability evaluation, considering multiple concurrent failures. As opposed to most restoration simulation methods used in reliability analysis, which convert restoration problems into mathematical models and can only solve radial systems, this new algorithm seeks the reconfiguration solution from topology characteristics of the network itself. As a result, the new reconfiguration algorithm can handle systems with loops. In analyzing system reliability, this research takes into account time-varying load patterns, and seeks approaches that are financially justified. An exhaustive search scheme is used to calculate optimal locations for Distributed Generators (DG) from the reliability point of view. A Discrete Ascent Optimal Programming (DAOP) load shifting approach is proposed to provide low cost, reliability improvement solutions. As weather conditions have an important effect on distribution component failure rates, the influence of different types of storms has been incorporated into this study. Storm outage models are created based on ten years' worth of weather and power outage data. 
An observer is designed to predict the number of outages for an approaching or ongoing storm. A circuit corridor model is applied to investigate the relationship between power outages and lightning activity.
Ph. D.
8

Geggis, Lorna M. "Do you see what I mean? : Measuring consensus of agreement and understanding of a National Weather Service informational graphic." [Tampa, Fla.] : University of South Florida, 2007. http://purl.fcla.edu/usf/dc/et/SFE0002119.

Full text
9

Frifra, Ayyoub. "Assessing and predicting extreme events in Western France." Electronic Thesis or Diss., Nantes Université, 2024. http://www.theses.fr/2024NANU2012.

Full text
Abstract:
Coastal regions are increasingly exposed to extreme events due to the combined impacts of climate change and urbanization. This thesis examines coastal hazards along France’s western coast, emphasizing storm prediction and the simulation of future vulnerability to coastal urban flooding. The research employs machine learning (ML) and deep learning (DL) approaches to improve hazard prediction and assess potential future risks. It introduces a novel methodology that combines Long Short-Term Memory (LSTM) and Extreme Gradient Boosting (XGBoost) to forecast storm features and occurrences along the western coast of France. Additionally, an urban development modeling system was applied to predict future expansion scenarios in the Vendée region, analyzing potential flood susceptibility under each scenario. An Artificial Neural Network combined with a Markov Chain was utilized to simulate three future urban growth scenarios: business-as-usual, environmental protection, and strategic urban planning. High-risk flood zones and future sea level rise estimates were then used to assess future flood risk under each growth scenario. The research findings demonstrate the efficiency of LSTM and XGBoost in predicting storm characteristics and occurrences. Moreover, the urban growth modeling approach forecasts future development sites and specific urban areas vulnerable to flooding, allowing for the evaluation of the impact of various development trajectories on future flood risk. This thesis contributes to coastal hazard prediction, urban planning, and risk management, providing useful tools for improving resilience and sustainability in coastal zones.
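As a hedged sketch of the Markov-chain component of such an urban growth model (the transition probabilities and class shares below are invented for illustration, not values from the thesis), land-use class proportions can be projected forward by repeated multiplication with a transition matrix:

```python
def step(state, T):
    """One Markov-chain step: next class shares = state times T (row vector)."""
    n = len(state)
    return [sum(state[i] * T[i][j] for i in range(n)) for j in range(n)]

# Land-use classes: [urban, agricultural, natural].
# Hypothetical annual transition probabilities (each row sums to 1).
T = [
    [0.98, 0.01, 0.01],  # urban mostly stays urban
    [0.05, 0.90, 0.05],  # some farmland urbanises
    [0.03, 0.02, 0.95],  # some natural land converts
]
state = [0.20, 0.50, 0.30]  # current shares of each class
for _ in range(10):         # project ten transitions ahead
    state = step(state, T)
```

In the thesis's approach, an Artificial Neural Network would supply spatial transition potentials, while the Markov chain controls the aggregate quantity of change per scenario.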
10

Kimock, Joseph. "Predicting commissary store success." Thesis, Monterey, California: Naval Postgraduate School, 2014. http://hdl.handle.net/10945/44595.

Full text
Abstract:
Approved for public release; distribution is unlimited
What external factors affect a commissary store’s success? This thesis analyzes the impact of demographics, local prices, and competitors on commissary store sales per square foot. These three factors were found to account for approximately 60 percent of the variation in sales per square foot between different store locations. The only influential groups for commissary success were active duty members, retirees, and their dependents; Reservists and National Guard members had no impact. Equally important was the price differential between commercial grocery stores and commissary stores in the local area. The number of competitors did not matter in sales predictions.
11

Whipple, Sean David. "Predictive storm damage modeling and optimizing crew response to improve storm response operations." Thesis, Massachusetts Institute of Technology, 2014. http://hdl.handle.net/1721.1/90166.

Full text
Abstract:
Thesis: S.M., Massachusetts Institute of Technology, Engineering Systems Division, 2014. In conjunction with the Leaders for Global Operations Program at MIT.
Thesis: M.B.A., Massachusetts Institute of Technology, Sloan School of Management, 2014. In conjunction with the Leaders for Global Operations Program at MIT.
Cataloged from PDF version of thesis.
Includes bibliographical references (pages 61-63).
Utility infrastructures are constantly damaged by naturally occurring weather. Such damage results in customer service interruption, and repairs are necessary to return the system to normal operation. In most cases, these events are few and far between, but major storm events (e.g., Hurricane Sandy) cause damage on a significantly larger scale. Large numbers of customers have service interrupted and repair costs run to millions of dollars. The ability to predict damage before the event and optimize response can significantly cut costs. The first task was to develop a model to predict outages on the network. Weather data from the past six storms, outage data from those events, asset information (framing, pole age, etc.), and environmental information were used to understand the interactions that lead to outages (forested areas are more likely to have outages than underground assets, for example). Utilizing data mining and machine learning techniques, we developed a model that gathers the data and applies a classification tree model to predict outages caused by weather. Next, we developed an optimization model to allocate repair crews across Atlantic Electric staging locations in response to the predicted damage to ensure the earliest possible restoration time. Regulators impose constraints such as cost and return-to-service time on utility firms, and these constraints will largely drive the distribution of repair crews. While the model starts with predicted results, the use of robust optimization will allow Atlantic Electric to optimize their response despite the uncertainty of why outages have occurred, which will lead to more effective response planning and execution across a variety of weather-related outages. Using these models, Atlantic Electric will have the data-driven capability not only to predict how much damage an incoming storm will produce, but also to plan how to allocate their repair crews. 
These tools will ensure Atlantic Electric can properly plan for storm events and as more storms occur the tools will increase their efficacy.
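A minimal sketch of the crew-allocation idea, assuming a simple proportional rule in place of the thesis's robust optimization model (the staging-area names and outage counts are hypothetical):

```python
def allocate_crews(predicted_outages, total_crews):
    """Allocate a fixed crew pool across staging areas in proportion to
    predicted outage counts, using largest-remainder rounding."""
    total = sum(predicted_outages.values())
    exact = {s: total_crews * n / total for s, n in predicted_outages.items()}
    alloc = {s: int(x) for s, x in exact.items()}
    leftover = total_crews - sum(alloc.values())
    # Give the remaining crews to the largest fractional remainders.
    by_remainder = sorted(exact, key=lambda s: exact[s] - alloc[s], reverse=True)
    for s in by_remainder[:leftover]:
        alloc[s] += 1
    return alloc

# Hypothetical staging areas and model-predicted outage counts.
pred = {"north": 120, "central": 45, "shore": 85}
crews = allocate_crews(pred, total_crews=25)
```

A robust optimization model would additionally account for uncertainty in the outage predictions and regulator-imposed restoration-time constraints, rather than allocating purely proportionally.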
by Sean David Whipple.
S.M.
M.B.A.
12

Kim, Jun-Young. "ANN wave prediction model for winter storms and hurricanes." W&M ScholarWorks, 2003. https://scholarworks.wm.edu/etd/1539616716.

Full text
Abstract:
Currently available wind-wave prediction models require a prohibitive amount of computing time for simulating non-linear wave-wave interactions. Moreover, some parts of wind-wave generation processes are not yet fully understood, so accurate predictions are not always guaranteed. In contrast, Artificial Neural Network (ANN) techniques are designed to recognize the patterns between input and output, and can save considerable computing time, making real-time wind-wave forecasts available to the navy and commercial ships. This study therefore uses ANN techniques to predict waves for winter storms and hurricanes, with much less computing time, at five National Oceanic and Atmospheric Administration (NOAA) wave stations along the East Coast of the U.S. from Florida to Maine (stations 44007, 44013, 44025, 44009, and 41009). In order to identify the prediction error sources of an ANN model, 100% known wind-wave events simulated from the SMB model were used. The ANN predicted even untrained wind-wave events accurately, which implied that it could be used for winter-storm and hurricane wave predictions. For the prediction of winter-storm waves, 1999 and 2001 winter-storm events with 403 data points and 1998 winter-storm events with 78 points were prepared as training and validation data sets, respectively. In general, because winter storms are relatively evenly distributed over a large area and move slowly, wind information (u and v wind components) over a large domain was considered as ANN inputs. When using a 24-hour time delay to represent the time required for seas to become fully developed, the ANN predicted wave heights accurately (r = 0.88), but the prediction accuracy of zero-crossing wave periods was much lower (r = 0.61). For the prediction of hurricane waves, 15 hurricanes from 1995 to 2001 and Hurricane Bertha in 1998 were prepared for training and validation data sets, respectively. 
Because hurricanes affect a relatively small domain, move quickly, and change dramatically with time, the locations of hurricane centers, the maximum wind speed, the central pressure of hurricane centers, and the longitudinal and latitudinal distances between wave stations and hurricane centers were used as inputs. The ANN predicted wave height accurately when a 24-hour time delay was used (r = 0.82), but the prediction accuracy of peak-wave periods was much lower (r = 0.50). This is because the physical processes governing wave periods are more complicated than those governing wave heights. This study demonstrates the potential of ANN techniques as winter-storm and hurricane-wave prediction models. If more winter-storm and hurricane data become available, and hurricane tracks can be predicted, real-time wind-waves can be forecast more accurately with less computing time.
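The 24-hour time-delay input scheme described above amounts to pairing each wave observation with wind inputs from 24 hours earlier; a minimal sketch with toy series (not the NOAA station data) is:

```python
def lagged_pairs(wind, waves, delay):
    """Pair each wave observation with the wind input `delay` steps earlier,
    mimicking a fixed time-delay ANN input scheme."""
    return [(wind[t - delay], waves[t]) for t in range(delay, len(waves))]

# Toy hourly series; a 24-step delay mimics the 24-hour lag in the study.
wind = list(range(30))             # stand-in hourly wind speeds
waves = [w * 0.1 for w in wind]    # stand-in hourly wave heights
pairs = lagged_pairs(wind, waves, delay=24)
```

The resulting (input, target) pairs are what a time-delay network would be trained on; in the study the inputs are u and v wind components over a spatial grid rather than a single scalar.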
13

Preisler, Frederik. "Predicting peak flows for urbanising catchments." Thesis, Queensland University of Technology, 1992.

Find full text
14

Luitel, Beda Nidhi. "Prediction of North Atlantic tropical cyclone activity and rainfall." Thesis, University of Iowa, 2016. https://ir.uiowa.edu/etd/2113.

Full text
Abstract:
Among natural disasters affecting the United States, North Atlantic tropical cyclones (TCs) and hurricanes are responsible for the highest economic losses and are one of the main causes of fatalities. Although we cannot prevent these storms from occurring, skillful seasonal predictions of North Atlantic TC activity and its associated impacts can provide basic information critical to improved preparedness. Unfortunately, it is not yet possible to predict heavy rainfall and flooding associated with these storms several months in advance, and the lead time is limited to a few days at most. On the other hand, overall North Atlantic TC activity can potentially be predicted with a six- to nine-month lead time. This thesis focuses on the evaluation of the skill in predicting basin-wide North Atlantic TC activity with a long lead time and rainfall with a short lead time. For the seasonal forecast of TC activity, we develop statistical-dynamical forecasting systems for different quantities related to the frequency and intensity of North Atlantic TCs using only tropical Atlantic and tropical mean sea surface temperatures (SSTs) as covariates. Our results show that skillful predictions of North Atlantic TC activity are possible starting from November for a TC season that peaks in the August-October months. The short-term forecasting of rainfall associated with TC activity is based on five numerical weather prediction (NWP) models. Our analyses focused on 15 North Atlantic TCs that made landfall along the U.S. coast over the period 2007-2012. The skill of the NWP models is quantified by visual examination of the distribution of the errors for the different lead times, and numerical examination of the first three moments of the error distribution. Based on our results, we conclude that the NWP models can provide skillful forecasts of TC rainfall with lead times up to 48 hours, without a consistently best or worst NWP model.
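The numerical examination of the first three moments of the error distribution can be sketched as follows; the error values are invented for illustration, not taken from the thesis:

```python
import math

def moments(errors):
    """First three moments of a forecast-error sample:
    mean, standard deviation, and skewness."""
    n = len(errors)
    mean = sum(errors) / n
    var = sum((e - mean) ** 2 for e in errors) / n
    std = math.sqrt(var)
    skew = sum((e - mean) ** 3 for e in errors) / (n * std ** 3) if std else 0.0
    return mean, std, skew

# Invented rainfall-forecast errors (mm) for a single lead time.
errs = [-2.0, -1.0, 0.0, 1.0, 2.0]
m, s, g = moments(errs)
```

Computing these per lead time shows whether a model's errors are biased (mean), dispersed (standard deviation), or asymmetric (skewness) as the forecast horizon grows.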
15

Stern, Joshua Gallant. "STORI: selectable taxon ortholog retrieval iteratively." Thesis, Georgia Institute of Technology, 2013. http://hdl.handle.net/1853/53377.

Full text
Abstract:
Speciation and gene duplication are fundamental evolutionary processes that enable biological innovation. For over a decade, biologists have endeavored to distinguish orthology (homology caused by speciation) from paralogy (homology caused by duplication). Disentangling orthology and paralogy is useful to diverse fields such as phylogenetics, protein engineering, and genome content comparison. A common step in ortholog detection is the computation of Bidirectional Best Hits (BBH). However, we found this computation impractical for more than 24 Eukaryotic proteomes. Attempting to retrieve orthologs in less time than previous methods require, we developed a novel algorithm and implemented it as a suite of Perl scripts. This software, Selectable Taxon Ortholog Retrieval Iteratively (STORI), retrieves orthologous protein sequences for a set of user-defined proteomes and query sequences. While the time complexity of the BBH method is O(#taxa^2), we found that the average CPU time used by STORI may increase linearly with the number of taxa. To demonstrate one aspect of STORI’s usefulness, we used this software to infer the orthologous sequences of 26 ribosomal proteins (rProteins) from the large ribosomal subunit (LSU), for a set of 115 Bacterial and 94 Archaeal proteomes. Next, we used established tree-search methods to seek the most probable evolutionary explanation of these data. The current implementation of STORI runs on Red Hat Enterprise Linux 6.0 with installations of Moab 5.3.7, Perl 5 and several Perl modules. STORI is available at: .
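A Bidirectional Best Hits computation of the kind whose quadratic cost STORI avoids can be sketched as follows; the score dictionaries below are toy values, not real alignment scores:

```python
def bidirectional_best_hits(scores_ab, scores_ba):
    """Return pairs (a, b) where b is a's best hit and a is b's best hit.

    scores_ab[a] maps each protein in proteome A to {protein in B: score};
    scores_ba is the reverse direction.
    """
    best_ab = {a: max(hits, key=hits.get) for a, hits in scores_ab.items()}
    best_ba = {b: max(hits, key=hits.get) for b, hits in scores_ba.items()}
    return [(a, b) for a, b in best_ab.items() if best_ba.get(b) == a]

# Toy similarity scores between two small proteomes.
ab = {"a1": {"b1": 90, "b2": 10}, "a2": {"b1": 20, "b2": 80}, "a3": {"b2": 30}}
ba = {"b1": {"a1": 85, "a2": 15}, "b2": {"a2": 75, "a3": 40}}
pairs = bidirectional_best_hits(ab, ba)
```

Because every proteome pair needs such a comparison, BBH over N proteomes scales as O(N^2), which is the bottleneck STORI's iterative retrieval is designed to sidestep.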
16

Kruschke, Tim [Verfasser]. "Winter wind storms : Identification, verification of decadal predictions, and regionalization / Tim Kruschke." Berlin : Freie Universität Berlin, 2015. http://d-nb.info/107549334X/34.

Full text
17

Faria, Correa Thompson Flores Juliana d. "Strategies to improve the performance of openings subject to water ingress during tropical cyclones and severe storms." Thesis, Griffith University, 2020. http://hdl.handle.net/10072/399426.

Full text
Abstract:
Tropical cyclones and severe storms around the world generate destructive winds and heavy rain, causing devastating effects to buildings. Over the years, many countries have created or improved their building codes after a hazard has occurred. Tropical Cyclone Tracy devastated the city of Darwin in Australia in 1974, and after that event the Australian Building Code was significantly upgraded to ensure that building structures could withstand cyclonic wind speeds. Since then, the incidence and severity of structural failure in both normal and extreme operating conditions has reduced substantially in Australia. However, tropical cyclones and severe storms still cause repeated serviceability issues in Australia that impact local communities, the construction industry, the insurance industry and governments. Insurance losses due to cyclones over the past two decades in northern Australia have totalled $2.4 billion, which averages around $115 million per year. Some non-structural elements remain subject to minor failure, causing loss of amenity and damage to structural building components over time. Building investigations have consistently revealed that windows and external glazed doors are affected by the wind-driven rain associated with each individual storm event, causing internal leakage and subsequent damage issues such as mould and termite infestation. Research indicates that the water ingress may not be excessive, but the repeated serviceability damage has a cumulative cost impact on building owners, insurers and government. This repeated minor to moderate damage has not been sufficiently actioned because it does not lead to structural failure or loss of life. 
To enhance the performance of building envelope openings subject to wind-driven rain during tropical cyclones and severe storms in the north of Queensland, a clear understanding of the interdependencies in current practices across the entire supply chain of windows and external doors is essential. Likewise, reliable tools to predict the performance of building envelope openings are essential for decision-makers to better target and prioritise investments. This aim was achieved by addressing the following three core study objectives: (1) to identify the key factors affecting the performance of windows and external glazed doors subject to wind-driven rainwater ingress during tropical cyclones; (2) to develop an openings’ wind-driven water ingress performance prediction model; and (3) to use scenario analysis to identify the most appropriate management interventions that could lead to better performance of window and door openings subject to wind-driven rainwater ingress during tropical cyclones and severe storms. An integrated approach was used in the study. Firstly, expert interviews and workshops were used to gain clear insight into the entire supply chain and quality oversight of window and external glazed door installations within the Australian construction industry. This was followed by workshops to develop and operationalise a probabilistic Bayesian Network (BN) model that enabled the identification of workable strategic pathways to improve the performance of openings to mitigate water ingress during tropical cyclones and severe storms. The findings from the expert workshops and interviews revealed some key contributing fault factors and correction recommendations to improve current practices. 
These recommendations predominantly related to upgrading practices in documentation, inspection liability assignment and installation training for building windows and doors, especially in locations where severe winds are frequent (i.e., northern Australia), and were designed for use by governments and industry projects. The overall research findings demonstrate the importance of implementing a multi-pronged change to: openings standards (improvements in the serviceability resistance test for water penetration), standards knowledge and training (improvements in the skills and knowledge of designers, builders and installers in design specification, openings installation and waterproofing practices) and construction documentation (level of design specification). Enforced together, these three practices will likely enhance the performance of building envelope openings (an upgrade to 66.6% from the current 32.2%) and substantially reduce the likelihood that window and door openings will experience serviceability failure during their lifespans.
Thesis (Masters)
Master of Philosophy (MPhil)
School of Eng & Built Env
Science, Environment, Engineering and Technology
Full Text
APA, Harvard, Vancouver, ISO, and other styles
18

Forte, Paolo. "Predicting Service Metrics from Device and Network Statistics." Thesis, KTH, Kommunikationsnät, 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-175892.

Full text
Abstract:
For an IT company that provides a service over the Internet, like Facebook or Spotify, it is very important to provide a high quality of service; however, predicting the quality of service is generally a hard task. The goal of this thesis is to investigate whether an approach that makes use of statistical learning to predict the quality of service can obtain accurate predictions for a Voldemort key-value store [1] in the presence of dynamic load patterns and network statistics. The approach follows the idea that the service-level metrics associated with the quality of service can be estimated from server-side statistical observations, like device and network statistics. The advantage of the approach analysed in this thesis is that it can work with virtually any kind of service, since it is based only on device and network statistics, which are unaware of the type of service provided. The approach is structured as follows. During service operations, a large amount of device statistics from the Linux kernel of the operating system (e.g. CPU usage level, disk activity, interrupt rate) and some basic end-to-end network statistics (e.g. average round-trip time, packet loss rate) are periodically collected on the service platform. At the same time, some service-level metrics (e.g. average reading time, average writing time, etc.) are collected on the client machine as indicators of the store's quality of service. To emulate network statistics, such as dynamic delay and packet loss, all the traffic is redirected to flow through a network emulator. Then, different types of statistical learning methods, based on linear and tree-based regression algorithms, are applied to the data collections to obtain a learning model able to accurately predict the service-level metrics from the device and network statistics. 
The results, obtained for different traffic scenarios and configurations, show that the thesis's approach can find learning models that accurately predict the service-level metrics for a single-node store with error rates lower than 20% (NMAE), even in the presence of network impairments.
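As an illustration of the kind of statistical learning the abstract describes, the sketch below fits a least-squares linear model that maps invented device statistics to a service-level metric and scores it with the normalized mean absolute error (NMAE) used in the thesis. The feature names and data are made up for illustration, not the thesis's actual Voldemort traces.

```python
import random

def fit_linear(X, y, lr=0.5, epochs=3000):
    """Batch gradient-descent least-squares fit of y ~ w.x + b
    (a stand-in for the linear regression methods the thesis evaluates)."""
    n, n_feat = len(X), len(X[0])
    w, b = [0.0] * n_feat, 0.0
    for _ in range(epochs):
        gw, gb = [0.0] * n_feat, 0.0
        for xi, yi in zip(X, y):
            err = sum(wj * xj for wj, xj in zip(w, xi)) + b - yi
            for j in range(n_feat):
                gw[j] += err * xi[j]
            gb += err
        for j in range(n_feat):
            w[j] -= lr * gw[j] / n
        b -= lr * gb / n
    return w, b

def nmae(y_true, y_pred):
    """Normalized mean absolute error, the accuracy metric quoted above."""
    mae = sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)
    return mae / (sum(y_true) / len(y_true))

# Invented "device statistics" (CPU load and round-trip time, both scaled
# to [0, 1]) with a service-level metric depending linearly on them.
random.seed(0)
X = [[random.random(), random.random()] for _ in range(80)]
y = [2.0 * cpu + 1.0 * rtt + 0.5 for cpu, rtt in X]

w, b = fit_linear(X, y)
preds = [sum(wj * xj for wj, xj in zip(w, xi)) + b for xi in X]
print(round(nmae(y, preds), 3))
```

On noiseless synthetic data the fitted weights recover the generating coefficients and the NMAE approaches zero; the thesis's real measurements, of course, carry noise that keeps the error above that floor.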
APA, Harvard, Vancouver, ISO, and other styles
19

Marastoni, Gabriele. "Towards predictive maintenance at LHC computing centers: exploration of monitoring data at CNAF." Bachelor's thesis, Alma Mater Studiorum - Università di Bologna, 2018. http://amslaurea.unibo.it/16923/.

Full text
Abstract:
In industrial and scientific applications, the emergence of machine learning is changing the way data collection and analysis are conceived and carried out. In particular, the analysis of massive volumes of log files produced by computing units belonging to extremely large and complex infrastructures is becoming a demanding but promising way to extract actionable information for improving and optimising resource usage and for cost savings. This is particularly interesting for LHC computing, also because future activities are expected to run on a "flat budget" for most of the funding allocated in the coming years. The Worldwide LHC Computing Grid coordinates the operations of a large number of computing centres around the world, each of which is a coherent composition of storage space, processing power and network connections on which a large number of applications run, using software layers that are either common or specific to particular experiments. These services produce very large volumes of heterogeneous, unstructured logs that can be digested through approaches typical of Big Data analytics. At the Tier-1 data centre in Bologna, CNAF has started an activity in this area with initial data-analysis attempts; in the long term it aims to engineer and deploy a predictive-maintenance solution based on machine-learning techniques. This thesis sets out to collect, manipulate and explore the logs produced by a specific CNAF service - StoRM, the storage management service. The goal is to lay the groundwork for this investigation by collecting observations and producing tools that can be used by others on different types of logs, thus taking a first small step towards a predictive-maintenance approach for LHC computing centres.
APA, Harvard, Vancouver, ISO, and other styles
20

Simkin, L. P. "The assessment of retail store locations : UK retailers' location practices and the development of a predictive retail store location performance model." Thesis, University of Bradford, 1986. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.372163.

Full text
APA, Harvard, Vancouver, ISO, and other styles
21

Huffcutt, Allen Ivan. "Development of Biographical Predictors of Cashier Turnover at a Convenience Store Chain." Thesis, University of North Texas, 1989. https://digital.library.unt.edu/ark:/67531/metadc500851/.

Full text
Abstract:
Subjects, 432 convenience store cashiers, were divided into long-tenure and short-tenure groups. Chi-square analysis of application blank information for a weighting sample drawn from both groups revealed two items which significantly (p < .05) differentiated between the long-tenure and short-tenure groups: number of previous jobs and full-time/part-time preference. Response weights were computed for these two items and used to calculate composite scores for the remaining holdout sample. A significant reduction in turnover would have occurred at the highest composite score level, if used as a hiring cutoff. Results were tempered by several considerations, including a high percentage of false negatives and an insignificant linear relationship between composite scores and tenure.
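The weighted application blank technique summarized above (response weights derived from tenure-group differences, then summed into composite scores and compared against a hiring cutoff) can be sketched as follows; the item categories and response weights below are invented for illustration, not Huffcutt's actual values.

```python
# Hypothetical response weights for the two significant items.
# The weights here are illustrative, not the study's actual figures.
ITEM_WEIGHTS = {
    "previous_jobs": {"0-1": 2, "2-3": 1, "4+": 0},     # fewer prior jobs scored higher
    "preference":    {"full_time": 2, "part_time": 0},
}

def composite_score(applicant):
    """Sum the response weights over the weighted items."""
    return sum(ITEM_WEIGHTS[item][resp] for item, resp in applicant.items())

def screen(applicants, cutoff):
    """Keep applicants whose composite score meets the hiring cutoff."""
    return [a for a in applicants if composite_score(a) >= cutoff]

applicants = [
    {"previous_jobs": "0-1", "preference": "full_time"},  # score 4
    {"previous_jobs": "4+",  "preference": "part_time"},  # score 0
    {"previous_jobs": "2-3", "preference": "full_time"},  # score 3
]
print([composite_score(a) for a in applicants])  # -> [4, 0, 3]
```

In the study's design the weights would be estimated on the weighting sample and the cutoff evaluated only on the holdout sample, to avoid capitalizing on chance.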
APA, Harvard, Vancouver, ISO, and other styles
22

Belanger, James Ian. "Predictability and prediction of tropical cyclones on daily to interannual time scales." Diss., Georgia Institute of Technology, 2012. http://hdl.handle.net/1853/44877.

Full text
Abstract:
The spatial and temporal complexity of tropical cyclones (TCs) raises a number of scientific questions regarding their genesis, movement, intensification, and variability. In this dissertation, the principal goal is to determine the current state of predictability for each of these processes. To quantify the current extent of tropical cyclone predictability, we assess probabilistic forecasts from the most advanced global numerical weather prediction system to date, the ECMWF Variable Resolution Ensemble Prediction System (VarEPS). Using a new false alarm clustering technique to maximize the utility of the VarEPS, the ensemble system is shown to provide well-calibrated probabilistic forecasts for TC genesis through a lead-time of one week, and pregenesis track forecasts with similar skill compared to the VarEPS's postgenesis track forecasts. To quantify the predictability of TCs on intraseasonal time scales, forecasts from the ECMWF Monthly Forecast System (ECMFS) are examined for the North Atlantic Ocean. From this assessment, dynamically based forecasts from the ECMFS provide forecast skill exceeding climatology out to weeks three and four for portions of the southern Gulf of Mexico, western Caribbean and the Main Development Region. Forecast skill in these regions is traced to the model's ability to capture correctly the variability in deep-layer vertical wind shear, the relative frequency of easterly waves moving through these regions, and the intraseasonal modulation of the Madden-Julian Oscillation. On interannual time scales, the predictability of TCs is examined by considering their relationship with tropical Atlantic easterly waves. First, a set of easterly wave climatologies for the CFS-R, ERA-Interim, ERA-40, and NCEP/NCAR Reanalysis are developed using a new easterly wave-tracking algorithm. From the reanalysis-derived climatologies, a moderately positive and statistically significant relationship is seen with tropical Atlantic TCs. 
In relation to large-scale climate modes, the Atlantic Multidecadal Oscillation (AMO) and Atlantic Meridional Mode (AMM) exhibit the strongest positive covariability with Atlantic easterly wave frequency. Besides changes in the number of easterly waves, the intensification efficiency of easterly waves has also been evaluated. These findings offer a plausible physical explanation for the recent increase in the number of NATL TCs, as it has been concomitant with an increasing trend in both the number of tropical Atlantic easterly waves and intensification efficiency. The last component of this dissertation examines how the historical variability in U.S. landfalling TCs has impacted the annual TC tornado record. To reconcile the inhomogeneous, historical tornado record, two statistical tornado models, developed from a set of a priori predictors for TC tornado formation, are used to reconstruct the TC tornado climatology. While the synthetic TC tornado record reflects decadal scale variations in association with the AMO, a comparison of the current warm phase of the AMO with the previous warm phase period shows that the median number of tornadoes per Gulf TC landfall has significantly increased. This change likely reflects the increase in median TC size (by 35%) of Gulf landfalling TCs along with an increased frequency of large TCs at landfall.
APA, Harvard, Vancouver, ISO, and other styles
23

Wetzell, Lauren McKinnon. "Simple Models For Predicting Dune Erosion Hazards Along The Outer Banks Of North Carolina." [Tampa, Fla.] : University of South Florida, 2003. http://purl.fcla.edu/fcla/etd/SFE0000191.

Full text
APA, Harvard, Vancouver, ISO, and other styles
24

Stobie, James R. "More to the story a reappraisal of U.S. intelligence prior to the Pacific War /." Fort Leavenworth, Kan. : U.S. Army Command and General Staff College, 2007. http://handle.dtic.mil/100.2/ADA471458.

Full text
Abstract:
Thesis (M. of Military Art and Science)--U.S. Army Command and General Staff College, 2007.
The original document contains color images. Title from title page of PDF document (viewed on May 27, 2008). Includes bibliographic references.
APA, Harvard, Vancouver, ISO, and other styles
25

Praus, Ondřej. "Prediktivní analýza - postup a tvorba prediktivních modelů." Master's thesis, Vysoká škola ekonomická v Praze, 2013. http://www.nusl.cz/ntk/nusl-199233.

Full text
Abstract:
This master's thesis focuses on predictive analytics, a type of analysis that uses historical data and predictive models to predict future phenomena. The main goal of this thesis is to describe predictive analytics and its process from a theoretical as well as a practical point of view. A secondary goal is to implement a predictive analytics project in an important insurance company operating in the Czech market and to improve the current state of detection of fraudulent insurance claims. The thesis is divided into a theoretical and a practical part. The process of predictive analytics and selected types of predictive models are described in the theoretical part. The practical part describes the implementation of predictive analytics in a company. First, the techniques of data organization used in datamart development are described. Predictive models are then implemented based on the data from the prepared datamart. The thesis includes examples and problems with their solutions. The main contribution of this thesis is the detailed description of the project implementation, which makes the field of predictive analytics easier to understand. Another contribution of the successfully implemented predictive analytics is the improved detection of fraudulent insurance claims.
APA, Harvard, Vancouver, ISO, and other styles
26

Martiník, Jan. "Příprava cvičení pro dolování znalostí z báze dat - klasifikace a predikce." Master's thesis, Vysoké učení technické v Brně. Fakulta elektrotechniky a komunikačních technologií, 2009. http://www.nusl.cz/ntk/nusl-218190.

Full text
Abstract:
My master's thesis on the topic of "Design of exercises for data mining - Classification and prediction" deals with the most frequently used classification and prediction methods. The classification methods covered are association rules, Bayesian classification, genetic algorithms, the nearest-neighbour method, neural networks and decision trees; prediction covers linear and non-linear regression. This work also contains a detailed summary of decision trees and a detailed algorithm for building a decision tree, including the development of individual diagrams. The proposed decision-tree algorithm is tested on two data sets downloaded from the Internet. The results are compared and the differences between the two implementations are described. The work is written so as to give the reader a notion of the individual data-mining methods and techniques, their advantages, disadvantages and some of the issues directly related to this topic.
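The core step of decision-tree construction, choosing the attribute with the highest information gain, can be sketched as below. This is a generic ID3-style illustration on toy data, not the thesis's specific algorithm.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(rows, labels, attr):
    """Entropy reduction achieved by splitting the rows on one attribute."""
    gain = entropy(labels)
    n = len(labels)
    for v in set(r[attr] for r in rows):
        subset = [l for r, l in zip(rows, labels) if r[attr] == v]
        gain -= len(subset) / n * entropy(subset)
    return gain

# Toy weather data: should we play outside?
rows = [
    {"outlook": "sunny", "windy": "no"},
    {"outlook": "sunny", "windy": "yes"},
    {"outlook": "rain",  "windy": "no"},
    {"outlook": "rain",  "windy": "yes"},
]
labels = ["no", "no", "yes", "yes"]

best = max(["outlook", "windy"], key=lambda a: information_gain(rows, labels, a))
print(best)  # -> outlook (it splits the labels perfectly, gain = 1 bit)
```

A full tree builder would recurse on each split subset until the labels are pure or no attributes remain, which is the structure the thesis's diagrams walk through.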
APA, Harvard, Vancouver, ISO, and other styles
27

Stokláska, Jiří. "Analýza kompletnosti výrobního procesu rozváděčů." Master's thesis, Vysoké učení technické v Brně. Fakulta elektrotechniky a komunikačních technologií, 2009. http://www.nusl.cz/ntk/nusl-217826.

Full text
Abstract:
This thesis introduces the company ABB Brno and its products. It describes the manufacturing process of a switchgear and focuses on working procedures at particular points of the assembly line. The main part analyzes the root causes of the incompleteness of switchgears in relation to component availability. A time footprint was worked out. In conclusion, changes to the manufacturing process are proposed to improve the level of completeness.
APA, Harvard, Vancouver, ISO, and other styles
28

Wickert, Claudia. "Breeding white storks in former East Prussia : comparing predicted relative occurrences across scales and time using a stochastic gradient boosting method (TreeNet), GIS and public data." Master's thesis, Universität Potsdam, 2007. http://opus.kobv.de/ubp/volltexte/2007/1353/.

Full text
Abstract:
In this study, several GIS-based habitat models were created for the White Stork (Ciconia ciconia) in the area of the former German province of East Prussia (roughly the area of the Russian exclave Kaliningrad and the Polish voivodeship Warmia-Masuria). To characterise the relationship between the White Stork and its environment, several historical data sets on the White Stork population in the 1930s, as well as selected habitat-description variables, were used. The data sets were processed and modelled with a geographical information system (ArcGIS) and a statistical method from the fields of machine learning and data mining (TreeNet, Salford Systems Ltd.). Using the historical habitat parameters and the White Stork occurrence data, quantitative models were built on two scales: (i) on the point scale using a raster with a cell size of 1 km and (ii) on the administrative district scale based on the division of the province of East Prussia into its districts. The evaluation of the models shows that the occurrence of stork nests in former East Prussia, given the variables used here, is largely determined by the variables 'forest', 'settlement area', 'pasture land' and 'coastline'. It can therefore be assumed that good food availability, such as the White Stork finds in meadows and pastures, and proximity to human settlements are decisive for nest-site selection in East Prussia. Closed forest areas appear in the models to be unsuitable as nest sites for the White Stork. The strong influence of the variable 'coastline' is most likely explained by East Prussia's pronounced physiographic structuring parallel to the coastline. 
In a second step, the models created in this study could be used on both scales to make predictions for the period 1981-1993. On the point scale, a decrease in potential breeding habitat was predicted; in contrast, the predicted White Stork density increases when the administrative district scale model is used. The difference between the two predictions is presumably due to the use of different scales and of partly different explanatory variables; further studies are needed to clarify this. Furthermore, the model predictions for 1981-1993 could be compared descriptively with the censuses available for that period. The population figures predicted here are higher than those established by the censuses, so the models created here describe rather the capacity of the habitat. Other factors that determine White Stork population size, such as breeding success or mortality, should be included in future studies. A possible approach was demonstrated for building valuable habitat models with the methods presented here using historical data, and for assessing the effects of land-use change on the White Stork. The models created here are to be seen as a first basis and can be refined with further data on habitat structure and with more exact spatially explicit information on White Stork nest sites. In a further step, a habitat model for the present day should also be created. This would allow a better comparison of the conceivable effects of changes in land use and relevant environmental conditions on the White Stork in the area of former East Prussia and across its entire range.
Different habitat models were created for the White Stork (Ciconia ciconia) in the region of the former German province of East Prussia (corresponding approximately to the current Russian oblast of Kaliningrad and the Polish voivodeship of Warmia-Masuria). Different historical data sets describing the occurrence of the White Stork in the 1930s, as well as selected variables for the description of landscape and habitat, were employed. The processing and modeling of the applied data sets was done with a geographical information system (ArcGIS) and a statistical modeling approach from the disciplines of machine learning and data mining (TreeNet by Salford Systems Ltd.). Applying historical habitat descriptors, as well as data on the occurrence of the White Stork, models on two different scales were created: (i) a point-scale model applying a raster with a cell size of 1 km2 and (ii) an administrative-district-scale model based on the organization of the former province of East Prussia. The evaluation of the created models shows that the occurrence of White Stork nesting grounds in former East Prussia is for the most part defined by the variables 'forest', 'settlement area', 'pasture land' and 'proximity to coastline'. From this set of variables it can be assumed that pastures and meadows, as well as the proximity of human settlements, provide the White Stork with a good food supply and nesting opportunities; these can be seen as crucial factors in the choice of nesting sites in East Prussia. Dense forest areas appear to be unsuited as nesting grounds for White Storks. The high influence of the variable 'coastline' is most likely explained by the specific landscape composition of East Prussia parallel to the coastline and is to be seen as a proximate factor explaining the distribution of breeding White Storks. In a second step, predictions for the period of 1981 to 1993 could be made applying both scales of the models created in this study. 
In doing so, a decline of potential nesting habitat was predicted on the point scale. In contrast, the predicted White Stork occurrence increases when applying the model at the administrative district scale. The difference between the two predictions lies in the application of different scales (density versus suitability as breeding ground) and partly dissimilar explanatory variables. More studies are needed to investigate this phenomenon. The model predictions for the period 1981 to 1993 could be compared with the available inventories of that period, which shows that the figures predicted here are higher than the figures established by the census. This means that the models created here describe rather a capacity of the habitat (potential niche). Other factors affecting the population size, e.g. breeding success or mortality, have to be investigated further. A feasible approach was demonstrated for generating habitat models with the methods presented here, applying historical data, and for assessing the effects of changes in land use on the White Stork. The models presented are the first of their kind, and could be improved by means of further data regarding the structure of the habitat and more exact spatially explicit information on the location of White Stork nesting sites. In a further step, a habitat model of the present time should be created. This would allow for a more precise comparison of the effects of changes in land use and relevant environmental conditions on the White Stork in the region of former East Prussia, e.g. in the light of coming landscape changes brought by the European Union (EU).
APA, Harvard, Vancouver, ISO, and other styles
29

Veselovský, Martin. "Získávání znalostí pro modelování následných akcí." Master's thesis, Vysoké učení technické v Brně. Fakulta informačních technologií, 2017. http://www.nusl.cz/ntk/nusl-363821.

Full text
Abstract:
Knowledge discovery from databases is a complex issue involving integration, data preparation, data mining using machine learning methods and visualization of results. The thesis deals with the whole process of knowledge discovery, especially with the issue of data warehousing, where it offers the design and implementation of a specific data warehouse for the company ROI Hunter, a.s. In the field of data mining, the work focuses on the classification and forecasting of the advertising data available from the prepared data warehouse and, in particular, on the decision tree classification. When predicting the development of new ads, emphasis is put on the rationale for the prediction as well as the proposal to adjust the ad settings so that the prediction ends positively and, with a certain likelihood, the ads actually get better results.
APA, Harvard, Vancouver, ISO, and other styles
30

Dyanati, Badabi Mojtaba. "Seismic Performance Evaluation And Economic Feasibility Of Self-Centering Concentrically Braced Frames." University of Akron / OhioLINK, 2016. http://rave.ohiolink.edu/etdc/view?acc_num=akron1460216523.

Full text
APA, Harvard, Vancouver, ISO, and other styles
31

Haris, Daniel. "Optimalizace strojového učení pro predikci KPI." Master's thesis, Vysoké učení technické v Brně. Fakulta informačních technologií, 2018. http://www.nusl.cz/ntk/nusl-385922.

Full text
Abstract:
This thesis aims to optimize machine learning algorithms for predicting KPI metrics for an organization. The organization uses machine learning to predict whether projects will meet the planned deadlines of the last phase of the development process. The work focuses on the analysis of prediction models and sets the goal of selecting new candidate models for the prediction system. We implemented a system that automatically selects the best feature variables for learning. Trained models were evaluated by several performance metrics and the best candidates were chosen for prediction. The candidate models achieved higher accuracy, which means that the prediction system provides more reliable responses. We suggested further improvements that could increase the accuracy of the forecast.
APA, Harvard, Vancouver, ISO, and other styles
32

Pelikán, Ondřej. "Predikce škodlivosti aminokyselinových mutací s využitím metody MAPP." Master's thesis, Vysoké učení technické v Brně. Fakulta informačních technologií, 2014. http://www.nusl.cz/ntk/nusl-236151.

Full text
Abstract:
This thesis discusses the issue of predicting the effect of amino acid substitutions on protein function using the MAPP method. This method requires a multiple sequence alignment and a phylogenetic tree constructed by third-party tools. The main goal of this thesis is to find a combination of suitable tools and parameters for generating the inputs of the MAPP method, on the basis of an analysis of one massively mutated protein. The MAPP method is then tested with the chosen combination of parameters and tools on two large independent datasets and compared with other tools focused on predicting the effect of mutations. In addition, a web interface for the MAPP method was created. This interface simplifies the use of the method, since the user need not install any tools or set any parameters.
APA, Harvard, Vancouver, ISO, and other styles
33

Palček, Peter. "Předpovídání vývoje více časových řad při burzovním obchodování." Master's thesis, Vysoké učení technické v Brně. Fakulta informačních technologií, 2012. http://www.nusl.cz/ntk/nusl-236567.

Full text
Abstract:
The diploma thesis comprises a general approach to predicting time series, their categorization, basic characteristics and basic statistical methods for their prediction. Neural networks are also covered, together with their categorization with regard to suitability for time series prediction. A program for predicting the progress of multiple time series in the stock market is designed and implemented. It is based on a flexible neuron tree model whose structure is optimized using immune programming and whose parameters are optimized using a modified version of simulated annealing or particle swarm optimization. The program is first tested on its ability to predict simple time series and then on its ability to predict multiple time series.
APA, Harvard, Vancouver, ISO, and other styles
34

Wang, Chi-hung, and 王啟竑. "Apply Neural Network Techniques for Storm Surge Prediction." Thesis, 2010. http://ndltd.ncl.edu.tw/handle/98441654319309498085.

Full text
Abstract:
碩士
國立中山大學
海洋環境及工程學系研究所
98
Taiwan is often threatened by typhoons during summer and autumn. The surges brought by these typhoons not only endanger human lives but also cause severe floods in coastal areas. Storm surge prediction remains a complex coastal engineering problem, since many parameters may affect the predictions. The purpose of this study is to predict storm surges using an Artificial Neural Network (ANN). A non-linear feed-forward neural network with a hidden layer, trained with back-propagation learning algorithms, was developed. The study included a detailed analysis of the factors that may affect the predictions. The factors were obtained from the formulation of storm surge discrepancies after Horikawa (1987). Storm surge behavior may vary with geographical location and weather conditions. A correlation analysis of the parameters was carried out first to select the factors showing high correlations as input parameters for establishing the typhoon surge predictions. The application started with collecting tide and meteorological data (wind speed, wind direction and pressure) for Dapeng Bay and Kaohsiung harbor. A harmonic analysis was utilized to identify surge deviations. The surge deviation recorded at Dapeng Bay was found to be higher than at Kaohsiung harbor for the same typhoon events. Correlation analysis showed positive correlations between the wind field, both wind speed and direction, and the associated storm surge deviations at Dapeng Bay, with correlation coefficients (CC) of 0.6702 and 0.58 respectively. The variation of atmospheric pressure during typhoons was also found to be positively correlated (CC=0.3626). In contrast, the analysis showed that the surges at Kaohsiung harbor were only sensitive to wind speed (CC=0.3723), while the correlation coefficients for wind direction (CC=-0.1559) and atmospheric pressure (CC=-0.0337) are low. 
The wind direction, wind speed and atmospheric pressure variation were then used as input parameters for the training and predictions. An optimum network structure was defined using the Dapeng Bay data. The best results were obtained by using wind speed, wind direction and pressure variation as input parameters. The ANN model can predict the surge deviation better when the empirical mode decomposition (EMD) method is used for training.
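The kind of network the abstract describes, a feed-forward net with one hidden layer trained by back-propagation on wind and pressure inputs, can be sketched as follows. The data are synthetic and the architecture (4 hidden units, sigmoid activations) is illustrative, not the study's actual configuration.

```python
import math
import random

random.seed(1)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# 3 inputs (wind speed, wind direction, pressure drop, all scaled to [0, 1]),
# one hidden layer of 4 units, one output (scaled surge deviation).
N_IN, N_HID = 3, 4
w_hid = [[random.uniform(-1, 1) for _ in range(N_IN)] for _ in range(N_HID)]
b_hid = [0.0] * N_HID
w_out = [random.uniform(-1, 1) for _ in range(N_HID)]
b_out = 0.0

def forward(x):
    h = [sigmoid(sum(w * xi for w, xi in zip(ws, x)) + b)
         for ws, b in zip(w_hid, b_hid)]
    y = sigmoid(sum(w * hi for w, hi in zip(w_out, h)) + b_out)
    return h, y

def train(data, lr=0.5, epochs=5000):
    global b_out
    for _ in range(epochs):
        for x, t in data:
            h, y = forward(x)
            d_out = (y - t) * y * (1 - y)  # output delta (sigmoid derivative)
            for j in range(N_HID):
                d_hid = d_out * w_out[j] * h[j] * (1 - h[j])  # hidden delta
                w_out[j] -= lr * d_out * h[j]
                for i in range(N_IN):
                    w_hid[j][i] -= lr * d_hid * x[i]
                b_hid[j] -= lr * d_hid
            b_out -= lr * d_out

# Synthetic relation: stronger wind and larger pressure drop -> larger surge.
data = [([ws, wd, dp], 0.6 * ws + 0.3 * dp)
        for ws in (0.1, 0.5, 0.9) for wd in (0.2, 0.8) for dp in (0.1, 0.9)]
train(data)
mse = sum((forward(x)[1] - t) ** 2 for x, t in data) / len(data)
print(round(mse, 4))
```

After training, the network reproduces the monotone wind-and-pressure relationship the toy data encode; the thesis's model additionally pre-processes its real inputs with EMD before training.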
APA, Harvard, Vancouver, ISO, and other styles
35

Jordan, Mark Rickman. "Development of a new storm surge index for global prediction of tropical cyclone generated storm surge." 2008. http://etd.lib.fsu.edu/theses/available/etd-06212008-114817.

Full text
Abstract:
Thesis (Ph. D.)--Florida State University, 2008.
Advisor: Carol Anne Clayson, Florida State University, College of Arts and Sciences, Dept. of Meteorology. Title and description from dissertation home page (viewed Sept. 30, 2008). Document formatted into pages; contains xiii, 83 pages. Includes bibliographical references.
APA, Harvard, Vancouver, ISO, and other styles
36

Pan, Kuan-Long, and 潘冠龍. "Prediction of Storm-Built Beach Profile Using Artificial Neural Network." Thesis, 1999. http://ndltd.ncl.edu.tw/handle/18213683514544558887.

Full text
Abstract:
碩士
國立中興大學
土木工程學系
87
This study aims to investigate the applicability of an artificial neural network for predicting the major pertinent parameters of a storm-built beach profile. The prediction model is trained on 18 model bar profiles selected from previous large wave tank tests. A back-propagation procedure was used to adjust the connection weights in the neural network and to minimize the error between the desired outputs and the observed values. Based on the proposed neural network model, the major geometric parameters of a storm-built bar are predicted well when the wave condition is given. The results show that the neural network model works better than the previous empirical predictions of Silvester and Hsu (1993) and Hsu and Wang (1997). In addition, the neural network also performs well in predicting the storm-built beach profile.
APA, Harvard, Vancouver, ISO, and other styles
37

You, Chih-Yu, and 游智宇. "A Study on Storm-Surge Prediction at Tanshui Estuary by Artificial Neural Network." Thesis, 2007. http://ndltd.ncl.edu.tw/handle/27196583524021014952.

Full text
Abstract:
Master's thesis
National Chung Hsing University
Department of Civil Engineering
95
Northern Taiwan is struck by typhoons frequently every year, which induce flood disasters. At present, the Tamsui River basin faces several unfavorable conditions for controlling floods combined with storm surge, including low-lying terrain and land subsidence. Accurate prediction of storm surge is therefore an important issue for the area. However, storm surge is quite complex, and predicting it with numerical methods or empirical formulas is not easy. As an alternative, this study applies artificial neural networks, including the supervised multilayer perceptron network and the radial basis function network, to storm surge prediction. The previous empirical formula for maximum storm surge achieves a correlation coefficient of only 0.565. This study chooses atmospheric pressure variation, wind speed, and wind direction as input neurons for the networks, using data from about 22 typhoon events, and discusses the effect of each parameter on the storm surge forecast. The results agree well with the measured storm surge data, with correlation coefficients all above 0.9. When the atmospheric pressure variation, wind speed, wind direction, and the storm surge of the previous time step are used as inputs to a time-series model, the correlation coefficients of the predicted and test models exceed 0.85. This illustrates that the time-series model forecasts the storm surge well during typhoons.
APA, Harvard, Vancouver, ISO, and other styles
38

Huang, Cheng-Tung, and 黃正同. "Prediction of Storm-Built Beach Profile Using Radial Basis Function Artificial Neural Network." Thesis, 2006. http://ndltd.ncl.edu.tw/handle/70202129058487461760.

Full text
Abstract:
Master's thesis
National Chung Hsing University
Department of Civil Engineering
94
This study investigates the applicability of the radial basis function neural network (RBFN) for predicting the major pertinent parameters of a storm-built beach profile. The prediction model is trained on 18 model bar profiles selected from previous large wave tank tests. A radial basis function network procedure was used to adjust the connection weights of the network and to minimize the error between the desired outputs and the observed values. Based on the proposed RBFN model, which has curve-fitting capability, the major geometric parameters of a storm-built bar are predicted well once the nondimensional wave condition is given. The results show that the model performs better than the earlier empirical predictions of Silvester and Hsu (1993) and than a back-propagation neural network.
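A minimal radial basis function network of the kind described here combines Gaussian basis functions with least-squares output weights; the 1-D target curve, centers, and kernel width below are invented for illustration, not the thesis's bar-geometry data.

```python
import numpy as np

# Illustrative 1-D curve-fitting task standing in for the bar-geometry data;
# 18 samples echo the 18 model profiles mentioned above.
x = np.linspace(0.0, 1.0, 18).reshape(-1, 1)
y = np.sin(2 * np.pi * x).ravel()

centers = np.linspace(0.0, 1.0, 8).reshape(1, -1)   # assumed RBF centers
width = 0.15                                        # assumed kernel width

def rbf_features(x):
    # Gaussian radial basis: phi_j(x) = exp(-(x - c_j)^2 / (2 w^2))
    return np.exp(-((x - centers) ** 2) / (2 * width ** 2))

Phi = rbf_features(x)
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)   # linear output weights

pred = rbf_features(x) @ w
mse = np.mean((pred - y) ** 2)
```

Because the output layer is linear in the basis expansion, the weights have a closed-form least-squares solution, which is what gives the RBFN its curve-fitting capability.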
APA, Harvard, Vancouver, ISO, and other styles
39

Pingree-Shippee, Katherine. "Seasonal predictability of North American coastal extratropical storm activity during the cold months." Thesis, 2018. https://dspace.library.uvic.ca//handle/1828/9324.

Full text
Abstract:
Extratropical cyclones (ETCs) are major features of the weather in the mid- and high-latitudes and are often associated with hazardous conditions such as heavy precipitation, high winds, blizzard conditions, and flooding. Additionally, severe coastal damage and major local impacts, including inundation and erosion, can result from high waves and storm surge due to cyclone interaction with the ocean. Consequently, ETCs can have serious detrimental socio-economic impacts. The west and east coasts of North America are strongly influenced by ETC storm activity. These coastal regions are also host to many land-based, coastal, and maritime socio-economic sectors, all of which can experience strong adverse impacts from extratropical storm activity. Society would therefore benefit if variations in ETC storm activity could be predicted skilfully for the upcoming season. Skilful prediction would enable affected sectors to better anticipate, prepare for, manage, and respond to variations in storm activity and the associated risks. The overall objective of this dissertation is to determine the seasonal predictability of North American coastal extratropical storm activity during the cold months (3-month rolling seasons – OND, NDJ, DJF, JFM – during which storm activity is most frequent and intense) using Environment and Climate Change Canada’s Canadian Seasonal to Interannual Prediction System (CanSIPS). This dissertation describes research focused on three themes: 1.) reanalysis representation of North American coastal storm activity, 2.) potential predictability of storm activity and climate signal-storm activity relationships for the North American coastal regions, and 3.) seasonal prediction of storm activity in CanSIPS. 
Research Theme 1 evaluates six global reanalysis datasets to determine which best reproduces observed storm activity in the North American coastal regions, annually and seasonally, during the 1979-2010 time period using single-station surface pressure-based proxies; ERA-Interim is found to perform best overall. Research Theme 2, using ERA-Interim, investigates the potential predictability of extratropical storm activity (represented by mean sea level pressure [MSLP], absolute pressure tendency, and 10-m wind speed) during the 1979-2015 time period using analysis of variance. The detected potential predictability provides observation-based evidence showing that it may be possible to predict storm activity on the seasonal timescale. Additionally, using composite analysis, the El Niño-Southern Oscillation, Pacific Decadal Oscillation, and North Atlantic Oscillation are identified as possible sources of predictability in the North American coastal regions. Research Theme 2 provides a basis upon which seasonal forecasting of extratropical storm activity can be developed. Research Theme 3 investigates the seasonal prediction of North American coastal storm activity using the CanSIPS multi-model ensemble mean hindcasts (1981-2010). Quantitative deterministic, categorical deterministic, and categorical probabilistic forecasts are constructed using the three equiprobable category framework (below-, near-, and above-normal conditions) and the parametric Gaussian method for determining probabilities. These forecasts are then evaluated against ERA-Interim using the correlation skill score, percent correct score, and Brier skill score to determine forecast skill. Baseline forecast skill is found for the seasonal forecasts of all three storm activity proxies, with MSLP forecasts found to be most skilful and 10-m wind speed forecasts the least skilful. Skilful seasonal forecasting of North American coastal extratropical storm activity is, therefore, possible in CanSIPS.
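The categorical probabilistic verification step described above can be illustrated with a multi-category Brier score against an equiprobable-tercile climatology; the forecast probabilities and outcomes below are toy values, not CanSIPS output.

```python
import numpy as np

# Toy tercile forecasts: probabilities for (below-, near-, above-normal),
# one row per season; outcomes are one-hot observed categories. Invented data.
probs = np.array([[0.6, 0.3, 0.1],
                  [0.2, 0.5, 0.3],
                  [0.1, 0.3, 0.6]])
obs = np.array([[1, 0, 0],
                [0, 1, 0],
                [0, 0, 1]])

def brier(p, o):
    # Multi-category Brier score: mean squared probability error per forecast.
    return np.mean(np.sum((p - o) ** 2, axis=1))

bs_forecast = brier(probs, obs)
bs_climo = brier(np.full_like(probs, 1 / 3), obs)   # equiprobable reference
bss = 1.0 - bs_forecast / bs_climo                  # skill relative to climatology
```

A positive Brier skill score means the forecasts beat the equiprobable three-category climatology, the baseline implied by the framework described in the abstract.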
Graduate
APA, Harvard, Vancouver, ISO, and other styles
40

Yum, Sang Guk. "Extreme Storm Surge Return Period Prediction Using Tidal Gauge Data and Estimation of Damage to Structures from Storm-Induced Wind Speed in South Korea." Thesis, 2019. https://doi.org/10.7916/d8-44c4-3150.

Full text
Abstract:
Global warming, one of the most serious consequences of climate change, can be expected to have different effects on the atmosphere, the ocean, icebergs, and more; it has also brought secondary consequences directly to nature and human society. Its most damaging effect is rising sea level in combination with large typhoons, which can cause flooding of low-lying land, coastal invasion, sea water intrusion into rivers and groundwater, rising river levels, and fluctuation of sea tides. It is crucial to estimate surge levels and their return periods more accurately to prevent loss of human life and property damage caused by typhoons. This study addresses two topics. The first is to develop a statistical model to predict the return period of the storm surge associated with Typhoon Maemi (2003) in South Korea. To estimate the return period, the clustered separated peaks-over-threshold simulation (CSPS) is used, and a Weibull distribution is fitted to the peak storm surge heights. The estimated return period of Typhoon Maemi's peak total water level is 389.11 years (95% confidence interval 342.27 - 476.2 years). The second topic concerns fragility curves built from loss data caused by typhoons. Although previous studies have developed various methods to mitigate typhoon damage, the extent of financial loss has not been investigated sufficiently. In this research, an insurance company provided its loss data arising from the wind speeds of Typhoon Maemi in 2003. Such loss data are very important in evaluating the extent of damage. The damage ratio in the loss dataset, calculated by dividing the direct loss by the insured amount, is used as the main indicator of the extent of damage. In addition, this study constructs fragility curves of properties to estimate the damage from Typhoon Maemi in 2003.
The damage ratios and storm-induced wind speeds are used as the main factors in constructing fragility curves to predict the levels of damage to the properties. A geographic information system (GIS) was applied to derive the spatial wind speeds experienced by the properties during the typhoon. With the damage ratios, wind speeds, and GIS spatial data, this study constructs fragility curves for four damage levels (Level I - Level IV). The findings and results of this study can serve as new baseline references for governments, the engineering industry, and the insurance industry in developing policies and strategies to cope with climate change.
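The peaks-over-threshold return-period calculation with a Weibull fit can be sketched as follows; the shape, scale, and exceedance-rate values are placeholders, not the parameters fitted in the thesis.

```python
import numpy as np

# Return period of a surge level under a peaks-over-threshold model with
# Weibull-distributed peak heights. Shape k, scale c, and the mean number
# of threshold exceedances per year are assumed placeholder values.
k, c = 1.2, 0.8          # Weibull shape and scale (assumed)
events_per_year = 2.0    # mean exceedance/cluster rate (assumed)

def return_period(z):
    # P(peak > z) under a Weibull survival function, then
    # T = 1 / (rate * exceedance probability), in years.
    p_exceed = np.exp(-((z / c) ** k))
    return 1.0 / (events_per_year * p_exceed)

def level_for_return_period(T):
    # Inverse relation: the surge level whose return period is T years.
    return c * np.log(events_per_year * T) ** (1.0 / k)

z100 = level_for_return_period(100.0)   # e.g. the 100-year surge level
```

Inverting the same relation for an observed peak level is what yields a return-period estimate of the kind quoted above for Typhoon Maemi.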
APA, Harvard, Vancouver, ISO, and other styles
41

Chittibabu, Padala. "Development of storm surge prediction models for the bay of Bengal and the arabian sea." Thesis, 1999. http://localhost:8080/xmlui/handle/12345678/2650.

Full text
APA, Harvard, Vancouver, ISO, and other styles
42

Mirabito, Christopher Michael. "Analysis, implementation, and verification of a discontinuous galerkin method for prediction of storm surges and coastal deformation." Thesis, 2011. http://hdl.handle.net/2152/ETD-UT-2011-08-4130.

Full text
Abstract:
Storm surge, the pileup of seawater occurring as a result of high surface stresses and strong currents generated by extreme storm events such as hurricanes, is known to cause greater loss of life than these storms' associated winds. For example, inland flooding from the storm surge along the Gulf Coast during Hurricane Katrina killed hundreds of people. Previous storms produced even larger death tolls. Simultaneously, dune, barrier island, and channel erosion taking place during a hurricane leads to the removal of major flow controls, which significantly affects inland inundation. Also, excessive sea bed scouring around pilings can compromise the structural integrity of bridges, levees, piers, and buildings. Modeling these processes requires tightly coupling a bed morphology equation to the shallow water equations (SWE). Discontinuous Galerkin finite element methods (DGFEMs) are a natural choice for modeling this coupled system, given the need to solve these problems on large, complicated, unstructured computational meshes, as well as the desire to implement hp-adaptivity for capturing the dynamic features of the solution. Comprehensive modeling of these processes in the coastal zone presents several challenges and open questions. Most existing hydrodynamic models use a fixed-bed approach; the bottom is not allowed to evolve in response to the fluid motion. With respect to movable-bed models, there is no single, generally accepted mathematical model in use. Numerical challenges include coupling models of processes that exhibit disparate time scales during fair weather, but possibly similar time scales during intense storms. 
The main goals of this dissertation include implementing a robust, efficient, tightly-coupled morphological model using the local discontinuous Galerkin (LDG) method within the existing Advanced Circulation (ADCIRC) modeling framework, performing systematic code and model verification (using test cases with known solutions, proven convergence rates, or well-documented physical behavior), analyzing the stability and accuracy of the implemented numerical scheme by way of a priori error estimates, and ultimately laying some of the necessary groundwork needed to simultaneously model storm surges and bed morphodynamics during extreme storm events.
APA, Harvard, Vancouver, ISO, and other styles
43

Song, Youn Kyung. "Extreme Hurricane Surge Estimation for Texas Coastal Bridges Using Dimensionless Surge Response Functions." 2009. http://hdl.handle.net/1969.1/ETD-TAMU-2009-08-7065.

Full text
Abstract:
Since the devastating hurricane seasons of 2004, 2005, and 2008, the stability and serviceability of coastal bridges during and following hurricane events have become a main public concern. Twenty coastal bridges, critical for hurricane evacuation and recovery efforts, in Texas have been identified as vulnerable to hurricane surge and wave action. To accurately assess extreme surges at these bridges, a dimensionless surge response function methodology was adopted. The surge response function defines maximum surge in terms of hurricane meteorological parameters such as hurricane size, intensity, and landfall location. The advantage of this approach is that, given a limited set of discrete hurricane surge data (either observed or simulated), all possible hurricane surges within the meteorological parameter space may be described. In this thesis, we will first present development of the surge response function methodology optimized to include the influence of regional continental shelf geometry. We will then demonstrate surge response function skill for surge prediction by comparing results with surge observations for Hurricanes Carla (1961) and Ike (2008) at several stations along the coast. Finally, we apply the improved surge response function methodology to quantify extreme surges for Texas coastal bridge probability and vulnerability assessment.
APA, Harvard, Vancouver, ISO, and other styles
44

Winter, Heather. "Analysis and Prediction of Rainfall and Storm Surge Interactions in the Clear Creek Watershed using Unsteady-State HEC-RAS Hydraulic Modeling." Thesis, 2012. http://hdl.handle.net/1911/64693.

Full text
Abstract:
This study presents an unsteady-state hydraulic model analysis of hurricane storm surge and rainfall-runoff interactions in the Clear Creek Watershed, a basin draining into Galveston Bay that is vulnerable to flooding from both intense local rainfall and storm surge. Storm surge and rainfall-runoff have historically been modeled separately, so the linkage and interactions between the two during a hurricane are not completely understood. This study simulates the two processes simultaneously by using storm surge stage hydrographs as boundary conditions in the Hydrologic Engineering Center's River Analysis System (HEC-RAS) hydraulic model. Storm surge hydrographs for a severe hurricane were generated with the Advanced Circulation Model for Oceanic, Coastal, and Estuarine Waters (ADCIRC) to predict the flooding that could be caused by a worst-case scenario. Using this scenario, zones have been identified to represent areas in the Clear Creek Watershed vulnerable to flooding from storm surge, rainfall, or both.
APA, Harvard, Vancouver, ISO, and other styles
45

Fang, Zheng. "A dynamic hydraulic floodplain map prediction tool for flood alert in a coastal urban watershed considering storm surge issues." Thesis, 2008. http://hdl.handle.net/1911/22228.

Full text
Abstract:
The Flood Alert System (FAS2) incorporates adjusted real-time NEXRAD radar data, GIS, hydrologic models, and the Internet to provide advanced warning to the Texas Medical Center (TMC) in Houston, Texas. It was tested during the 2006 season with excellent performance and was used as the platform for developing a real-time hydraulic prediction tool: the Floodplain Map Library (FPML) system. FPML provides inundation maps in near real time, linked with NEXRAD radar over the watershed. FPML also accepts storm surge input so that it can predict inundation maps during extreme coastal weather conditions, improving emergency personnel's ability to initiate evacuation strategies at many levels.
APA, Harvard, Vancouver, ISO, and other styles
46

Rigney, Matthew C. "Ensemble Statistics and Error Covariance of a Rapidly Intensifying Hurricane." 2009. http://hdl.handle.net/1969.1/ETD-TAMU-2009-05-724.

Full text
Abstract:
This thesis presents an investigation of ensemble Gaussianity, the effect of non-Gaussianity on covariance structures, storm-centered data assimilation techniques, and the relationship between commonly used data assimilation variables and the underlying dynamics for the case of Hurricane Humberto. Using an ensemble Kalman filter (EnKF), data assimilation results in storm-centered and Eulerian coordinate systems are compared. In addition, the extent of the non-Gaussianity of the model ensemble is investigated and quantified, and the effect of this non-Gaussianity on covariance structures, which play an integral role in the EnKF data assimilation scheme, is explored. Finally, the correlation structures calculated from a Weather Research and Forecasting (WRF) model ensemble forecast of several state variables are investigated in order to better understand the dynamics of this rapidly intensifying cyclone. Hurricane Humberto rapidly intensified in the northwestern Gulf of Mexico from a tropical disturbance to a strong Category 1 hurricane with 90 mph winds in 24 hours. Numerical models did not capture Humberto's intensification well, likely due in large part to initial condition error, which data assimilation schemes can address. Because the EnKF is a linear theory built on the assumption that the ensemble distribution is normal, non-Gaussianity in the ensemble could affect the EnKF update. An inspection of statistical moments shows that multiple state variables do indeed exhibit significant non-Gaussianity. In addition, storm-centered data assimilation schemes present an alternative to traditional Eulerian schemes by emphasizing the centrality of the cyclone to the assimilation window, allowing an update that is most effective in the vicinity of the storm center, which is of most concern in mesoscale events such as Humberto.
Finally, the effect of non-Gaussian distributions on covariance structures is examined through data transformations of normal distributions. Various standard transformations of two Gaussian distributions are made. Skewness, kurtosis, and correlation between the two distributions are taken before and after the transformations. It can be seen that there is a relationship between a change in skewness and kurtosis and the correlation between the distributions. These effects are then taken into consideration as the dynamics contributing to the rapid intensification of Humberto are explored through correlation structures.
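The moment-and-correlation experiment described above can be sketched by applying a nonlinear transform to one of two correlated Gaussian samples; the sample size, correlation, and choice of transform are illustrative, not those used in the thesis.

```python
import numpy as np

rng = np.random.default_rng(42)

def skewness(x):
    z = (x - x.mean()) / x.std()
    return np.mean(z ** 3)

def excess_kurtosis(x):
    z = (x - x.mean()) / x.std()
    return np.mean(z ** 4) - 3.0

# Two correlated Gaussian samples, illustrative stand-ins for ensemble
# state variables, then a nonlinear transform applied to one of them.
n = 200_000
a = rng.standard_normal(n)
b = 0.8 * a + 0.6 * rng.standard_normal(n)   # corr(a, b) ≈ 0.8 by construction
corr_before = np.corrcoef(a, b)[0, 1]

t = np.exp(a)                                # lognormal: strongly skewed
corr_after = np.corrcoef(t, b)[0, 1]
```

The transform raises the skewness and kurtosis of the first sample and weakens its Pearson correlation with the second, mirroring the relationship between changing moments and correlation that the abstract describes.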
APA, Harvard, Vancouver, ISO, and other styles
47

Song, Hui. "Automatic prediction of solar flares and super geomagnetic storms." Thesis, 2008. http://library1.njit.edu/etd/fromwebvoyage.cfm?id=njit-etd2008-046.

Full text
APA, Harvard, Vancouver, ISO, and other styles
48

Xie, Jia-Ming, and 謝家銘. "Application of Bayesian Method for Chain Store Sales Prediction." Thesis, 2018. http://ndltd.ncl.edu.tw/handle/av5u9a.

Full text
Abstract:
Master's thesis
National Chengchi University
Department of Statistics
107
The prediction of sales is important. It is common to run a regression analysis to predict a store's sales using its own data. However, for a chain with hundreds of stores, it may be possible to improve prediction accuracy and obtain more reasonable regression coefficients by combining data from different stores. We propose to achieve these goals using two shrinkage methods: a hierarchical Bayesian method and the James-Stein estimator. We found that the shrinkage methods yield limited improvement when the regression coefficients of the separate models are rather close. Moreover, the hierarchical method incorporated data from different stores and improved predictions, while the James-Stein estimator did not improve them much.
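A positive-part James-Stein shrinkage of per-store means toward the grand mean, one of the two shrinkage methods named here, can be sketched as follows; the store count, noise level, and data are simulated for illustration and are not the thesis's setup.

```python
import numpy as np

rng = np.random.default_rng(7)

# Positive-part James-Stein shrinkage of noisy per-store sales means
# toward the grand mean. All quantities below are assumed values.
p = 50                       # number of stores (assumed)
sigma2 = 1.0                 # known per-store sampling variance (assumed)
true_means = rng.normal(10.0, 0.5, p)                    # stores behave similarly
raw = true_means + rng.normal(0.0, np.sqrt(sigma2), p)   # one noisy estimate each

grand = raw.mean()
S = np.sum((raw - grand) ** 2)
# Shrinking toward the estimated grand mean costs one degree of freedom,
# hence (p - 3) rather than (p - 2); the max(0, .) gives the positive part.
shrink = max(0.0, 1.0 - (p - 3) * sigma2 / S)
js = grand + shrink * (raw - grand)          # shrunken per-store estimates
```

Each James-Stein estimate lies between its raw estimate and the grand mean, which is the pooling-across-stores effect the abstract compares against the hierarchical Bayesian method.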
APA, Harvard, Vancouver, ISO, and other styles
49

Dias, Viviana de Oliveira. "Predictive models for in-store workforce optimization." Master's thesis, 2019. https://hdl.handle.net/10216/125693.

Full text
APA, Harvard, Vancouver, ISO, and other styles
50

Dias, Viviana de Oliveira. "Predictive models for in-store workforce optimization." Dissertation, 2019. https://hdl.handle.net/10216/125693.

Full text
APA, Harvard, Vancouver, ISO, and other styles
