Dissertations / Theses on the topic 'Natural hazards'


Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 50 dissertations / theses for your research on the topic 'Natural hazards.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Browse dissertations / theses on a wide variety of disciplines and organise your bibliography correctly.

1

Lagos González, Tomás Ignacio. "Designing resilient power networks against natural hazards." Thesis, Universidad de Chile, 2017. http://repositorio.uchile.cl/handle/2250/148468.

Full text
Abstract:
Magíster en Gestión de Operaciones. Ingeniero Civil Industrial
Resilience in power systems has only recently begun to be studied in the literature; its main concern is ensuring the viability of the grid under high-impact, low-probability (HILP) events. The main contributions of this work are: (1) providing a novel framework that supports strategic decision-making to maximize the resilience of the power system against the threat of natural disasters, in particular earthquakes (the first of its kind, according to the research carried out); (2) providing an alternative to planning driven by economic incentives, against which decisions to add new generation capacity and new lines can be contrasted; and (3) presenting a discrete optimization via simulation (DOvS) approach that addresses problems with two-stage uncertainty. Preliminary computational results show that more robust solutions are obtained for this particular problem. The Industrial Strength COMPASS algorithm is used to tackle this discrete decision problem, where the resilience measure is the expected energy not supplied (EENS). The EENS is evaluated through a simulator that quantifies the impact of natural disasters on demand and that combines historical earthquake data, fragility curves for the network components and an operational model of the power grid. A case study demonstrates the applicability of the method, its main features and, ultimately, how a network planner can design power systems that are more resilient to earthquakes.
This work was partially funded by the UK Research Council and CONICYT through the Newton-Picarte Fund.
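As an illustration of the kind of EENS evaluation the abstract describes, the following minimal Monte Carlo sketch (not the author's simulator) samples a ground-motion level, converts it to line failure probabilities through lognormal fragility curves, and accumulates unserved energy whenever a load point loses all of its supply lines. The network layout, demands, fragility parameters and earthquake rate are all invented for illustration.

```python
import numpy as np
from math import erf, log, sqrt

rng = np.random.default_rng(0)

# Illustrative system: two load points, each fed by two transmission lines (assumed layout).
demand = np.array([80.0, 120.0])               # MW at each load point (assumed)
lines_per_load = [[0, 1], [2, 3]]               # which lines feed each load point
frag_median = np.array([0.6, 0.6, 0.5, 0.5])    # PGA (g) at 50% failure probability (assumed)
frag_beta = 0.4                                 # lognormal dispersion of the fragility curves

def sample_eens(n_years=100_000, quake_rate=0.2, outage_hours=24.0):
    """Crude EENS estimate (MWh/year) by Monte Carlo over earthquake years."""
    eens = 0.0
    for _ in range(n_years):
        if rng.random() > quake_rate:           # no damaging earthquake this year
            continue
        pga = rng.lognormal(mean=np.log(0.3), sigma=0.5)   # sampled ground motion (assumed)
        # Lognormal fragility: P(failure) = Phi(ln(pga/median)/beta)
        p_fail = np.array([0.5 * (1.0 + erf(log(pga / m) / (frag_beta * sqrt(2.0))))
                           for m in frag_median])
        failed = rng.random(frag_median.size) < p_fail
        for load, lines in enumerate(lines_per_load):
            if all(failed[l] for l in lines):   # load point islanded -> energy not supplied
                eens += demand[load] * outage_hours
    return eens / n_years

print(f"EENS ~ {sample_eens():.1f} MWh/year")
```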
APA, Harvard, Vancouver, ISO, and other styles
2

Bergmeister, Konrad, Manfred Curbach, Evelin Kamper, Dirk Proske, Dieter Rickenmann, and Sigrid Wieshofer. "3rd Probabilistic Workshop Technical Systems, Natural Hazards." Universität für Bodenkultur Wien, 2009. https://slub.qucosa.de/id/qucosa%3A287.

Full text
Abstract:
Modern engineering structures should ensure an economic design, construction and operation of structures in compliance with the required safety for persons and the environment. In order to achieve this aim, all contingencies and associated consequences that may possibly occur throughout the life cycle of the considered structure have to be taken into account. Today, the development is often based on decision theory, methods of structural reliability and the modeling of consequences. Failure consequences are one of the significant issues that determine optimal structural reliability. In particular, consequences associated with the failure of structures are of interest, as they may lead to significant indirect consequences, also called follow-up consequences. However, apart from determining safety levels based on failure consequences, it is also crucially important to have effective models for stress forces and maintenance planning ... (from the preface)
APA, Harvard, Vancouver, ISO, and other styles
3

Threatt, Patrick Lee. "NATURAL HAZARDS IN MISSISSIPPI: REGIONAL PERCEPTIONS AND REALITY." MSSTATE, 2008. http://sun.library.msstate.edu/ETD-db/theses/available/etd-11092007-145929/.

Full text
Abstract:
This study comprised a survey of 807 students in geosciences classes at Mississippi State University to determine the perceived level of threat from eight natural hazards: hurricanes, hail, lightning, tornadoes, earthquakes, ice storms, floods, and wildfires. Responses were analyzed to detect spatial differences in perceptions of threats across the state of Mississippi for comparison. Actual occurrences of the natural hazards and preparations for dealing with these hazards were recorded by county and MEMA district. Threat perceptions for hurricanes, ice storms, floods, and lightning showed spatial differences, whereas threats from hail, tornadoes, earthquakes, and wildfire showed no spatial differences. All perceived threats except ice storms paralleled the actual recorded occurrences of the respective hazards spatially. Preparations for each hazard included the adoption of MEMA's Basic Plan for the entire state.
APA, Harvard, Vancouver, ISO, and other styles
4

Threatt, Patrick Lee. "Natural hazards in Mississippi regional perceptions and reality /." Master's thesis, Mississippi State : Mississippi State University, 2007. http://library.msstate.edu/etd/show.asp?etd=etd-11092007-145929.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

García Castillo, Jorge, M. Eng. Massachusetts Institute of Technology. "Effects and mitigation of natural hazards in retail networks." Thesis, Massachusetts Institute of Technology, 2018. http://hdl.handle.net/1721.1/117797.

Full text
Abstract:
Thesis: M. Eng. in Supply Chain Management, Massachusetts Institute of Technology, Supply Chain Management Program, 2018.
This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections.
Cataloged from student-submitted PDF version of thesis.
Includes bibliographical references (pages 87-89).
The number of natural hazards has been increasing over the last 10 years. Understanding the impact of natural hazards on retail networks is crucial for effective planning against disruptions. We used daily sales and inventory data from a country-wide retail network and historical data on natural emergencies to quantify the consequences triggered by these events in product and financial flows. We analyze sales and inventory flow through points of sale and distribution centers. We propose the Resilience Investment Model (RIM) to guide investment in resilience against the effects of natural hazards. This model takes into account the operational details of the organization. RIM is a two-stage, multi-period inventory-flow stochastic program. The resilience investments consist of acquiring additional inventory to buffer against disruptions and of real options contracts with suppliers that can be exercised when a declared emergency occurs. We use a set of risk profiles over future costs to align the investment with the financials and preferences of the organization. This research shows how the risk profile of the decision maker shapes the location and distribution of backup stock in a retail network. We show that risk-averse profiles reduce worst-case cost by 15% while increasing average cost by 2%. We recommend the use of risk profiles with cost targets to quantify the Value at Risk of the network due to natural hazards.
by Jorge García Castillo.
M. Eng. in Supply Chain Management
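The trade-off the abstract reports (risk-averse profiles cut worst-case cost while raising average cost) can be illustrated with a toy two-stage sketch; it is not the RIM formulation itself. The first stage buys backup stock, the second stage pays a penalty for unmet demand in each disruption scenario, and a simple CVaR criterion stands in for the thesis's risk profiles. All scenario probabilities and costs are assumptions.

```python
import numpy as np

# Disruption scenarios: units of demand lost at a store during an emergency (assumed).
shortfall = np.array([0.0, 2.0, 5.0, 10.0, 40.0])
prob = np.array([0.5, 0.2, 0.15, 0.1, 0.05])    # scenario probabilities (assumed)

holding_cost = 1.0   # cost per unit of backup stock bought up front (assumed)
penalty_cost = 8.0   # cost per unit of unmet demand after the disruption (assumed)

def stage2_cost(x, s):
    """Second-stage cost: unmet demand remaining after using x units of backup stock."""
    return penalty_cost * max(s - x, 0.0)

def expected_cost(x):
    return holding_cost * x + float(np.dot(prob, [stage2_cost(x, s) for s in shortfall]))

def cvar_cost(x, alpha=0.9):
    """Mean of the worst (1 - alpha) probability tail of total cost: a risk-averse criterion."""
    total = holding_cost * x + np.array([stage2_cost(x, s) for s in shortfall])
    tail_p, tail_c, remaining = [], [], 1.0 - alpha
    for i in reversed(np.argsort(total)):        # walk down from the worst scenario
        take = min(prob[i], remaining)
        tail_p.append(take); tail_c.append(total[i]); remaining -= take
        if remaining <= 1e-12:
            break
    return float(np.dot(tail_p, tail_c) / (1.0 - alpha))

candidates = range(0, 41, 5)                     # candidate backup-stock levels
risk_neutral = min(candidates, key=expected_cost)
risk_averse = min(candidates, key=cvar_cost)
print("risk-neutral stock:", risk_neutral, "expected cost:", expected_cost(risk_neutral))
print("risk-averse  stock:", risk_averse, "expected cost:", expected_cost(risk_averse))
```

On these invented numbers the risk-averse choice buys the largest backup stock, which raises the expected cost but removes the worst-case penalty, mirroring the qualitative trade-off reported above.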
APA, Harvard, Vancouver, ISO, and other styles
6

Hunter, Alasdair. "Quantifying and understanding the aggregate risk of natural hazards." Thesis, University of Exeter, 2014. http://hdl.handle.net/10871/15719.

Full text
Abstract:
Statistical models are necessary to quantify and understand the risk from natural hazards. A statistical framework is developed here to investigate the effect of dependence between the frequency and intensity of natural hazards on the aggregate risk. The aggregate risk of a natural hazard is defined as the sum of the intensities for all events within a season. This framework is applied to a database of extratropical cyclone tracks from the NCEP-NCAR reanalysis for the October to March extended winters between 1950 and 2003. Large positive correlation is found between cyclone counts and the local mean vorticity over the exit regions of the North Atlantic and North Pacific storm tracks. The aggregate risk is shown to be sensitive to this dependence, especially over Scandinavia. Falsely assuming independence between the frequency and intensity results in large biases in the variance of the aggregate risk. Possible causes for the dependence are investigated by regressing winter cyclone counts and local mean vorticity on teleconnection indices with Poisson and linear models. The indices for the Scandinavian pattern, North Atlantic Oscillation and East Atlantic Pattern are able to account for most of the observed positive correlation over the North Atlantic. The sensitivity of extremes of the aggregate risk distribution to the inclusion of clustering, with and without frequency-intensity dependence, is investigated using Cantelli bounds and a copula simulation approach. The inclusion of dependence is shown to be necessary to model the clustering of extreme events. The implication of these findings for the insurance sector is investigated using the loss component of a catastrophe model. A mixture model approach provides a simple and effective way to incorporate frequency-intensity dependence into the loss model. Including levels of correlation and overdispersion comparable to that observed in the reanalysis data results in an average increase of over 30% in the 200 year return level for the aggregate loss.
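The core point of the abstract, that frequency-intensity dependence inflates the variance of the aggregate seasonal risk, can be reproduced with a toy simulation in which a common latent index drives both the Poisson event count and the mean event intensity; shuffling the intensities breaks the dependence. Parameter values are purely illustrative and are not taken from the thesis.

```python
import numpy as np

rng = np.random.default_rng(1)
n_seasons = 50_000

# Common latent driver (think of a teleconnection index), assumed standard normal.
z = rng.standard_normal(n_seasons)

counts = rng.poisson(np.exp(1.5 + 0.4 * z))     # seasonal event counts depend on z
mean_intensity = 1.0 + 0.3 * z                   # so does the mean per-event intensity

def aggregate(counts, mean_intensity):
    """Seasonal aggregate risk: sum of per-event intensities drawn around the seasonal mean."""
    agg = np.zeros(counts.size)
    for i, (n, m) in enumerate(zip(counts, mean_intensity)):
        if n > 0:
            # Gamma with mean m (floored to stay positive for very negative z)
            agg[i] = rng.gamma(shape=2.0, scale=max(m, 0.1) / 2.0, size=n).sum()
    return agg

dependent = aggregate(counts, mean_intensity)
independent = aggregate(counts, rng.permutation(mean_intensity))  # break the dependence

print("variance with frequency-intensity dependence:", round(dependent.var(), 2))
print("variance assuming independence              :", round(independent.var(), 2))
```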
APA, Harvard, Vancouver, ISO, and other styles
7

Xia, Xilin. "High-performance simulation technologies for water-related natural hazards." Thesis, University of Newcastle upon Tyne, 2017. http://hdl.handle.net/10443/3798.

Full text
Abstract:
Water-related natural hazards, such as flash floods, landslides and debris flows, usually happen in chains. In order to better understand the underlying physical processes and more reliably quantify the associated risk, it is essential to develop a physically based multi-hazard modelling system to simulate these hazards at a catchment scale. An effective multi-hazard modelling system may be developed by solving a set of depth-averaged dynamic equations incorporating adaptive basal resistance terms. High-performance computing, achieved through implementation on modern graphics processing units (GPUs), can be used to accelerate the model to support efficient large-scale simulations. This thesis presents the key simulation technologies for developing such a novel high-performance water-related natural hazards modelling system. A new well-balanced smoothed particle hydrodynamics (SPH) model is first presented for solving the shallow water equations (SWEs) in the context of flood inundation modelling. The performance of the SPH model is compared with an alternative flood inundation model based on a finite volume (FV) method in order to select the better numerical method for the current study. The FV model performs favourably for practical applications and is therefore adopted to develop the proposed multi-hazard model. In order to more accurately describe the rainfall-runoff and overland flow processes that often initiate a hazard chain, a first-order FV Godunov-type model is developed to solve the SWEs, implemented with novel source term discretisation schemes. The new model overcomes the limitations of the current prevailing numerical schemes, such as inaccurate calculation of bed slope or friction source terms, and provides much improved numerical accuracy, efficiency and stability for simulating overland flows and surface flooding. To support large-scale simulation of flow-like landslides or debris flows, a new formulation of depth-averaged governing equations is derived in the Cartesian coordinate system. The new governing equations take into account the effects of non-hydrostatic pressure and centrifugal force, which may become significant over terrain with steep and curved topography. These equations are compatible with various basal resistance terms, effectively leading to a unified mathematical framework for describing different types of water-related natural hazards, including surface flooding, flow-like landslides and debris flows. The new depth-averaged governing equations are then solved using an FV Godunov-type framework based on a second-order accurate scheme. A flexible, GPU-based software framework is further designed to provide much improved computational efficiency for large-scale simulations and to ease the future implementation of new functionalities. This provides an effective codebase for the proposed multi-hazard modelling system, and its potential is confirmed by successfully applying it to simulate flow-like landslides and dam-break floods.
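A drastically reduced illustration of the finite-volume Godunov-type approach mentioned above: a first-order 1D shallow water solver for a frictionless, flat-bed dam break using a simple Rusanov flux. It deliberately omits the source-term discretisation, well-balancing and GPU aspects that are the actual contributions of the thesis; grid size, initial depths and end time are arbitrary.

```python
import numpy as np

g = 9.81  # gravitational acceleration (m/s^2)

def rusanov_flux(hL, huL, hR, huR):
    """Rusanov (local Lax-Friedrichs) numerical flux for the 1D shallow water equations."""
    uL = huL / hL if hL > 1e-12 else 0.0
    uR = huR / hR if hR > 1e-12 else 0.0
    fL = np.array([huL, huL * uL + 0.5 * g * hL**2])
    fR = np.array([huR, huR * uR + 0.5 * g * hR**2])
    smax = max(abs(uL) + np.sqrt(g * hL), abs(uR) + np.sqrt(g * hR))
    return 0.5 * (fL + fR) - 0.5 * smax * np.array([hR - hL, huR - huL])

def dam_break(n=200, length=10.0, t_end=0.5, cfl=0.9):
    dx = length / n
    h = np.where(np.arange(n) < n // 2, 2.0, 1.0)   # initial step in water depth (m)
    hu = np.zeros(n)                                 # water initially at rest
    t = 0.0
    while t < t_end - 1e-12:
        c = np.abs(hu / np.maximum(h, 1e-12)) + np.sqrt(g * h)
        dt = min(cfl * dx / c.max(), t_end - t)      # CFL-limited time step
        flux = np.zeros((n + 1, 2))
        for i in range(1, n):                        # interior cell interfaces
            flux[i] = rusanov_flux(h[i - 1], hu[i - 1], h[i], hu[i])
        flux[0] = rusanov_flux(h[0], hu[0], h[0], hu[0])      # transmissive boundaries
        flux[n] = rusanov_flux(h[-1], hu[-1], h[-1], hu[-1])
        h -= dt / dx * (flux[1:, 0] - flux[:-1, 0])  # conservative finite-volume update
        hu -= dt / dx * (flux[1:, 1] - flux[:-1, 1])
        t += dt
    return h, hu

h, hu = dam_break()
print("depth range after 0.5 s:", round(h.min(), 3), "to", round(h.max(), 3), "m")
```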
APA, Harvard, Vancouver, ISO, and other styles
8

Allen, Matthew Charles. "Stakeholder perceptions of flooding issues in the Wildcat Creek Watershed." Thesis, Kansas State University, 2017. http://hdl.handle.net/2097/35444.

Full text
Abstract:
Master of Arts
Department of Geography
John A. Harrington Jr
Wildcat Creek Watershed near Manhattan, Kansas, experiences damaging flash floods that have required evacuations in recent years (Spicer 2011). The purpose of this study was to qualitatively examine the issue of flooding in the Wildcat Creek Watershed by interviewing stakeholders (those who reside, own a business, or study in the watershed) using a semi-structured approach. The interviews examined stakeholders' perceptions of 1) how they understand the processes that create the flooding hazard, 2) whether or not they value the implementation of mitigation efforts to reduce the negative impacts of flooding, 3) whether they feel at risk from flooding, and 4) whom they consider a trusted source of information about the hydrologic characteristics of the watershed. Based on the results of this study, a spatial relationship in perceptions of flooding issues in the Wildcat Creek Watershed was found. Across the study area, stakeholders understood many of the physical causes of flooding but did not tend to see the connections among the many physical components. Overall, stakeholders believed that mitigation strategies to curb flash flooding were valuable, although many were not supportive of paying for these efforts through potential taxation by a watershed district. Despite the increase in flooding events over the past decade (Anderson 2011), many stakeholders saw neither a change in their personal risk of exposure to flooding nor a change in their flood vulnerability. In the context of the flooding issue in the Wildcat Creek Watershed, most participants trusted their neighbors and community leaders as sources of information rather than professionals who research and/or conduct work on the watershed.
APA, Harvard, Vancouver, ISO, and other styles
9

Lilly, Joseph. "Municipal planning for natural hazards, what is the best approach?" Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1999. http://www.collectionscanada.ca/obj/s4/f2/dsk2/ftp01/MQ39677.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Hammel, Evan Martin. "A multi-attribute framework for risk analysis of natural hazards." Connect to online resource, 2007. http://gateway.proquest.com/openurl?url_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&res_dat=xri:pqdiss&rft_dat=xri:pqdiss:1446078.

Full text
APA, Harvard, Vancouver, ISO, and other styles
11

Kim, Chun Il. "Urban spatial structure, housing markets, and resilience to natural hazards." Thesis, Massachusetts Institute of Technology, 2017. http://hdl.handle.net/1721.1/109024.

Full text
Abstract:
Thesis: Ph. D. in Urban and Regional Planning, Massachusetts Institute of Technology, Department of Urban Studies and Planning, 2017.
Cataloged from PDF version of thesis.
Includes bibliographical references.
This dissertation consists of three essays on urban structure, housing, and environment. The first paper contributes to the existing debate on the co-location hypothesis by devising a proximity measure and controlling for a set of other urban form measures. Multiple regression analysis revealed that job-worker proximity leads to shorter commuting times. In addition, results from subareas suggested that the impacts of job-worker imbalance and of job-worker mismatch on commuting time are both greater in the suburbs than in the city center. The second paper examines the impact of LIHTC construction on nearby housing prices in the Boston metropolitan area using the AITS-DID method. The paper found that the price gap between the LIHTC micro-neighborhood and the area beyond is reduced by approximately 16.5 percentage points after the LIHTC construction. Segmenting the analysis by sub-region showed spatially heterogeneous results. The findings from this research are contrary to the conventional perception that subsidized housing developments persistently lead to neighborhood decline. Measuring resilience to natural hazards is a central issue in the hazard mitigation sciences. The third paper applied a confirmatory factor methodology to operationalize the biophysical, built environment, and socioeconomic resilience dimensions for local jurisdictions in large urban metropolitan areas in South Korea. The factor covariances showed a trade-off relationship between natural infrastructure and human activities. Densely developed and affluent urban areas tend to lack biophysical resilience. Some local governments sorted into the same resilience groups turn out to be located in different metropolitan areas. The spatial variation and inequality in the resilience dimensions suggest the necessity of integrated and flexible governance for sustainable hazard mitigation.
by Chun Il Kim.
Ph. D. in Urban and Regional Planning
APA, Harvard, Vancouver, ISO, and other styles
12

Liu, Huan. "Economic Analysis of Resilience to Natural Hazards in Industrial Sectors." Doctoral thesis, Kyoto University, 2021. http://hdl.handle.net/2433/263777.

Full text
APA, Harvard, Vancouver, ISO, and other styles
13

Duenas-Osorio, Leonardo Augusto. "Interdependent Response of Networked Systems to Natural Hazards and Intentional Disruptions." Diss., Georgia Institute of Technology, 2005. http://hdl.handle.net/1853/7546.

Full text
Abstract:
Critical infrastructure systems are essential for the continuous functionality of modern global societies. Some examples of these systems include electric energy, potable water, oil and gas, telecommunications, and the internet. Different topologies underlie the structure of these networked systems. Each topology (i.e., physical layout) conditions the way in which networks transmit and distribute their flow. Also, their ability to absorb unforeseen natural or intentional disruptions depends on complex relations between network topology and optimal flow patterns. Most of the current research on large networks is focused on understanding their properties using statistical physics, or on developing advanced models to capture network dynamics. Despite these important research efforts, almost all studies concentrate on specific networks. This network-specific approach rules out a fundamental phenomenon that may jeopardize the performance predictions of current sophisticated models: network response is in general interdependent, and its performance is conditioned on the performance of additional interacting networks. Although there are recent conceptual advances in network interdependencies, current studies address the problem from a high-level point of view. For instance, they discuss the problem at the macro-level of interacting industries, or utilize economic input-output models to capture entire infrastructure interactions. This study approaches the problem of network interdependence from a more fundamental level. It focuses on network topology, flow patterns within the networks, and optimal interdependent system performance. This approach also allows for probabilistic response characterization of interdependent networked systems when subjected to disturbances of internal nature (e.g., aging, malfunctioning) or disruptions of external nature (e.g., coordinated attacks, seismic hazards). The methods proposed in this study can identify the role that each network element has in maintaining interdependent network connectivity and optimal flow. This information is used in the selection of effective pre-disaster mitigation and post-disaster recovery actions. Results of this research also provide guides for growth of interacting infrastructure networks and reveal new areas for research on interdependent dynamics. Finally, the algorithmic structure of the proposed methods suggests straightforward implementation of interdependent analysis in advanced computer software applications for multi-hazard loss estimation.
APA, Harvard, Vancouver, ISO, and other styles
14

Vitoontus, Soravit. "Risk assessment of building inventories exposed to large scale natural hazards." Diss., Georgia Institute of Technology, 2012. http://hdl.handle.net/1853/43676.

Full text
Abstract:
Earthquakes are among the most devastating and unpredictable of natural hazards that affect civil infrastructure, and they have the potential to cause numerous casualties and significant economic losses over large areas. Every region that has the potential for great earthquakes should have an integrated plan for seismic design and risk mitigation for civil infrastructure. This plan should include methods for estimating the vulnerability of building inventories and for forecasting economic losses resulting from future events. This study describes a methodology to assess risk to distributed civil infrastructure due to large-scale natural hazards with large geographical footprints, such as earthquakes, hurricanes and floods, and provides a detailed analysis and assessment of building losses due to earthquake. The distinguishing feature of this research, in contrast to previous loss estimation methods incorporated in systems such as HAZUS-MH, is that it considers the correlation in stochastic demand on building inventories due to the hazard, as well as correlation in building response and damage due to common materials, construction technologies, codes and code enforcement. These sources of correlation have been neglected, for the most part, in previous research. The present study has revealed that neglecting these sources of correlation leads to an underestimation of the variance in loss and of the probable maximum loss (PML) used as a basis for underwriting risks. The methodology is illustrated with a seismic risk assessment of building inventories representing different occupancy classes in Shelby County, TN, considering both scenario earthquakes and earthquakes specified probabilistically. It is shown that losses to building inventories estimated under the common assumption that individual losses can be treated as statistically independent may underestimate the PML by a factor ranging from 1.7 to 3.0, depending on which structural and nonstructural elements are included in the assessment. A sensitivity analysis reveals the statistics and sources of correlation that are most significant for loss estimation, and points the way forward for supporting data acquisition and synthesis.
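The study's central claim, that ignoring correlation understates the variance of loss and the PML, can be reproduced with a toy Gaussian-copula portfolio: identical lognormal building losses are summed once with an equicorrelated latent factor and once independently. Portfolio size, marginals and the correlation value are assumptions for illustration only, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(2)
n_buildings, n_sims, rho = 500, 20_000, 0.3   # illustrative portfolio and correlation

def portfolio_losses(correlated):
    """Sum lognormal building losses, with or without a shared equicorrelated latent factor."""
    if correlated:
        common = rng.standard_normal((n_sims, 1))
        idio = rng.standard_normal((n_sims, n_buildings))
        z = np.sqrt(rho) * common + np.sqrt(1 - rho) * idio   # Corr(z_i, z_j) = rho
    else:
        z = rng.standard_normal((n_sims, n_buildings))
    losses = np.exp(0.0 + 0.8 * z)            # identical lognormal marginals in both cases
    return losses.sum(axis=1)

for label, corr in [("independent", False), ("correlated ", True)]:
    total = portfolio_losses(corr)
    pml = np.quantile(total, 0.99)            # probable maximum loss at the 1-in-100 level
    print(f"{label}: mean={total.mean():.0f}  std={total.std():.0f}  PML(99%)={pml:.0f}")
```

With these assumptions the correlated portfolio has roughly the same mean loss but a markedly larger standard deviation and 99% PML, which is the qualitative effect the study quantifies for real building inventories.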
APA, Harvard, Vancouver, ISO, and other styles
15

PEDE, ELENA CAMILLA. "Building resilience towards natural hazards: cross-scale knowledge and institutional linkages." Doctoral thesis, Politecnico di Torino, 2015. http://hdl.handle.net/11583/2611359.

Full text
Abstract:
Our environment is becoming increasingly complex: rapid urbanisation, often accompanied by uncontrolled use of land and occupation of unsafe environments, as well as the increased rate of occurrence of climate events, introduces elements of uncertainty (Pinna, 2002). The idea of certainty or security that was fundamental to risk management in the past collapses. In this context, the notion of 'risk society', introduced by Ulrich Beck in 1992, is considered a shifting paradigm in world security, in which our modern society becomes ever more interdependent and more complex, and consequently more vulnerable to threats and risks. Traditionally, planning plays a central role in the scientific management of risks, mostly based on the control of calculated risks. But increasing uncertainty and the emergence of new types of risks require an alternative path of planning practice that acknowledges and interacts with society's risk implications. Despite its lack of clarity, resilience offers opportunities for dealing with the uncertainty and insecurity of the contemporary context (Davoudi et al., 2012). The overall aim of this research is the exploration of a new path of planning for resilience that responds to the increasing uncertainty in the context of the global 'risk society'. It contributes to the new dynamics of safety and security by drawing on modifications of the risk dimensions. Several areas of knowledge are developed within this research, aimed at improving ways of governing societal resilience that cover the continuous system of mitigation, readiness, and resistance in the context of risks. Out of that, new areas of collaboration are identified among urban and emergency planners, decision makers, and citizens to strengthen societal resilience.
APA, Harvard, Vancouver, ISO, and other styles
16

Vogel, Kristin. "Applications of Bayesian networks in natural hazard assessments." Phd thesis, Universität Potsdam, 2013. http://opus.kobv.de/ubp/volltexte/2014/6977/.

Full text
Abstract:
Even though quite different in occurrence and consequences, from a modeling perspective many natural hazards share similar properties and challenges. Their complex nature, as well as lacking knowledge about their driving forces and potential effects, makes their analysis demanding: uncertainty about the modeling framework, inaccurate or incomplete event observations and the intrinsic randomness of the natural phenomenon add up to different interacting layers of uncertainty, which require careful handling. Nevertheless, deterministic approaches are still widely used in natural hazard assessments, holding the risk of underestimating the hazard with disastrous effects. The all-round probabilistic framework of Bayesian networks constitutes an attractive alternative. In contrast to deterministic approaches, it treats response variables as well as explanatory variables as random variables, making no distinction between input and output variables. Using a graphical representation, Bayesian networks encode the dependency relations between the variables in a directed acyclic graph: variables are represented as nodes and (in-)dependencies between variables as (missing) edges between the nodes. The joint distribution of all variables can thus be described by decomposing it, according to the depicted independences, into a product of local conditional probability distributions, which are defined by the parameters of the Bayesian network. In the framework of this thesis the Bayesian network approach is applied to different natural hazard domains (i.e., seismic hazard, flood damage and landslide assessments). Learning the network structure and parameters from data, Bayesian networks reveal relevant dependency relations between the included variables and help to gain knowledge about the underlying processes. The problem of Bayesian network learning is cast in a Bayesian framework, considering the network structure and parameters as random variables themselves and searching for the most likely combination of both, which corresponds to the maximum a posteriori (MAP score) of their joint distribution given the observed data. Although well studied in theory, the learning of Bayesian networks from real-world data is usually not straightforward and requires an adaptation of existing algorithms. Typical problems are the handling of continuous variables, incomplete observations and the interaction of both. Working with continuous distributions requires assumptions about the allowed families of distributions. To "let the data speak" and avoid wrong assumptions, continuous variables are instead discretized here, thus allowing for completely data-driven and distribution-free learning. An extension of the MAP score, considering the discretization as a random variable as well, is developed for an automatic multivariate discretization that takes interactions between the variables into account. The discretization process is nested within the network learning and requires several iterations. Having to face incomplete observations on top of this may pose a computational burden; iterative procedures for missing-value estimation quickly become infeasible. A more efficient, albeit approximate, method is used instead, estimating the missing values based only on the observations of variables directly interacting with the missing variable. Moreover, natural hazard assessments often have a primary interest in a certain target variable. The discretization learned for this variable does not always have the resolution required for good prediction performance. Finer resolutions for (conditional) continuous distributions are achieved with continuous approximations subsequent to the Bayesian network learning, using kernel density estimations or mixtures of truncated exponential functions. All our procedures are completely data-driven. We thus avoid assumptions that require expert knowledge and instead provide domain-independent solutions that are applicable not only in other natural hazard assessments but in a variety of domains struggling with uncertainties.
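A minimal sketch of score-based Bayesian network structure learning on discretized data, in the spirit of the approach described above: two candidate structures for a pair of discrete variables are compared with a BIC score, which here stands in for the thesis's MAP score with multivariate discretization. The synthetic data and the two-variable restriction are assumptions made only for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)
n, k = 2000, 3

# Synthetic discretized data with a real dependence: Y tends to follow X.
x = rng.integers(0, k, size=n)                        # e.g., binned ground-motion level
y = np.clip(x + rng.integers(-1, 2, size=n), 0, k - 1)  # e.g., binned damage grade

def loglik_marginal(v, k):
    """Multinomial log-likelihood of a discrete sample under its empirical distribution."""
    counts = np.bincount(v, minlength=k)
    p = counts / counts.sum()
    return float(np.sum(counts[counts > 0] * np.log(p[counts > 0])))

def loglik_conditional(child, parent, k):
    """Log-likelihood of child given parent: one multinomial per parent state."""
    return sum(loglik_marginal(child[parent == s], k)
               for s in range(k) if np.any(parent == s))

def bic(loglik, n_params):
    return loglik - 0.5 * n_params * np.log(n)

score_indep = bic(loglik_marginal(x, k) + loglik_marginal(y, k), 2 * (k - 1))
score_x_to_y = bic(loglik_marginal(x, k) + loglik_conditional(y, x, k), (k - 1) + k * (k - 1))
print("BIC, X and Y independent:", round(score_indep, 1))
print("BIC, edge X -> Y        :", round(score_x_to_y, 1))  # should win on this data
```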
APA, Harvard, Vancouver, ISO, and other styles
17

Cencerrado Barraqué, Andrés. "Methodology for time response and quality assessment in natural hazards evolution prediction." Doctoral thesis, Universitat Autònoma de Barcelona, 2012. http://hdl.handle.net/10803/284023.

Full text
Abstract:
This thesis describes a methodology for time response and quality assessment in natural hazards evolution prediction. The work focuses on the specific case of forest fires as an important and worrisome catastrophe, but it can easily be extrapolated to other kinds of natural hazards. There exist many prediction frameworks based on the use of simulators of the evolution of the hazard. Given the increasing computing capabilities allowed by new computing advances such as multicore and manycore architectures, and even distributed-computing paradigms such as Grid and Cloud Computing, the need arises to properly exploit the computational power they offer. This goal is fulfilled by introducing the capability to assess in advance how the constraints present at the time of attending to an ongoing forest fire will affect the results obtained, both in terms of quality (accuracy) and of the time needed to make a decision, and therefore to select the most suitable configuration of both the prediction strategy and the computational resources to be used. As a consequence, the framework derived from the application of this methodology is not supposed to be a new Decision Support System (DSS) for fire departments and Civil Protection agencies, but a tool from which most forest fire (and other natural hazard) DSSs could benefit notably. The problem has been tackled by characterizing the behavior of these two factors during the prediction process. For this purpose, a two-stage prediction framework is presented and considered a suitable and powerful strategy to enhance the quality of the predictions. This methodology involves dealing with Artificial Intelligence techniques, such as Genetic Algorithms and Decision Trees, and also relies on a strong statistical study of training databases composed of the results of thousands of different simulations. The results obtained in this long-term research work are fully satisfactory and give rise to several new challenges. Moreover, the flexibility offered by the methodology allows it to be applied to other kinds of emergency contexts, which turns it into an outstanding and very useful tool in fighting against these catastrophes.
APA, Harvard, Vancouver, ISO, and other styles
18

Bowman, Harry Albert. "Optimizing Transportation Infrastructure Improvements For Networks Under The Threat of Natural Hazards /." The Ohio State University, 1995. http://rave.ohiolink.edu/etdc/view?acc_num=osu1487929745332846.

Full text
APA, Harvard, Vancouver, ISO, and other styles
19

Grahn, Tonje. "Risk assessment of natural hazards : Data availability and applicability for loss quantification." Doctoral thesis, Karlstads universitet, Institutionen för miljö- och livsvetenskaper, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:kau:diva-48324.

Full text
Abstract:
Quantitative risk assessments are a fundamental part of economic analysis and of natural hazard risk management models. They increase the objectivity and transparency of risk assessments and guide policymakers in making efficient decisions when spending public resources on risk reduction. Managing hazard risks calls for an understanding of the relationships between hazard exposure and the vulnerability of humans and assets. The purpose of this thesis is to identify and estimate causal relationships between hazards, exposure and vulnerability, and to evaluate the applicability of systematically collected data sets for producing reliable and generalizable quantitative information for decision support. Several causal relationships have been established. For example, the extent of lake flood damage to residential buildings depends on the duration of the flood, the distance to the waterfront, the age of the house and, in some cases, the water level. Results also show that homeowners' private initiatives to reduce risk, prior to or during a flood, reduced their probability of suffering building damage by as much as 40 percent. Further, a causal relationship has been established between the number of people exposed to quick clay landslides and landslide fatalities. Even though several relationships were identified between flood exposure and vulnerability, the effects can only explain small parts of the total variation in damages, especially at the object level. The availability of damage data in Sweden is generally low. The most comprehensive damage data sets in Sweden are held by private insurance companies and are not publicly available. Data scarcity is a barrier to quantitative natural hazard risk assessment in Sweden. More efforts should therefore be made to collect data systematically for modelling and validating standardized approaches to quantitative damage estimation.
Natural hazard damages have increased worldwide, with impacts caused by hydrological and meteorological hazards increasing the most. An analysis of insurance payments showed that flood damages have been increasing in Sweden as well. With climate change and growing populations, we can expect this trend to continue unless efforts are made to reduce risk and adapt communities to the threats. Economic analysis and quantitative risk assessment of natural hazards are fundamental parts of a risk management process that can support policymakers' decisions on efficient risk reduction. However, in order to develop reliable damage estimation models, knowledge is needed of the relationships between hazard exposure and the vulnerability of exposed objects and persons. This thesis has established causal relationships between residential exposure and flood damage on the basis of insurance data. I also found that private damage-reducing actions decreased the probability of damage to buildings by almost 40 percent. Further, a causal relationship has been established between the number of people exposed to quick clay landslides and fatalities. Even though several relationships have been identified between flood exposure and vulnerability, the effects can explain only small parts of the total variation in damages, especially at the object level, and more effort is needed to develop quantitative models for risk assessment purposes.
APA, Harvard, Vancouver, ISO, and other styles
20

Pineda, Carlos Eduardo Rodriguez Pineda. "Hazard assessment of earthquake-induced landslides on natural slopes : modelling growth and maturation in primate and human evolution." Thesis, Imperial College London, 2001. http://hdl.handle.net/10044/1/8872.

Full text
APA, Harvard, Vancouver, ISO, and other styles
21

Bell, Heather. "Efficient and effective? the hundred year flood in the communication and perception of flood risk." [Tampa, Fla.] : University of South Florida, 2004. http://purl.fcla.edu/fcla/etd/SFE0000522.

Full text
APA, Harvard, Vancouver, ISO, and other styles
22

Angignard, Marjory [Verfasser]. "Applying risk governance principles to natural hazards and risks in mountains / Marjory Angignard." Dortmund : Universitätsbibliothek Technische Universität Dortmund, 2011. http://d-nb.info/1018126872/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
23

Tolo, S. "Uncertainty quantification and risk assessment methods for complex systems subject to natural hazards." Thesis, University of Liverpool, 2016. http://livrepository.liverpool.ac.uk/3005411/.

Full text
Abstract:
The interaction between natural events and technological installations involves complex mechanisms which have the potential to affect several critical systems simultaneously, nullifying the redundancy measures common to industrial safety systems and endangering the integrity of facilities. The concerns related to this kind of event are far from being restricted to a merely economic or industrial nature. On the contrary, due to the sensitivity of most processes performed in industrial plants and the negative consequences of eventual releases of hazardous materials, the impact of simultaneous failures also embraces the environment and the population surrounding the installations. The risk is further widened by the trend of climate extremes: both observations over the past century and projections for the coming decades suggest an increase in the severity of extreme weather events and in their frequency, on both local and global scales. The rise of sea water levels, together with the exacerbation of extreme winds and precipitation, enlarges the geographic area at risk and raises the likelihood of accidents in regions historically susceptible to natural hazards. The prevention of technological accidents triggered by natural hazards lies unavoidably with the development of efficient theoretical and computational tools for the vulnerability assessment of industrial installations and the identification of effective strategies to tackle the growing risks to which they are subject. In spite of the increasing trend of the risk and the high-impact consequences, the current scientific literature still lacks robust means to tackle these issues effectively. The research presented in this dissertation addresses the critical need for novel theoretical and computational methods tailored for the risk assessment of complex systems threatened by extreme natural events. The specific requirements associated with the modelling of the interaction between external hazards and engineering systems have been determined, resulting in the identification of two main bottlenecks. On the one hand, this kind of analysis has to deal with the difficulty of accurately representing the complexity of technological systems and the mutual influence among their subsystems. On the other, the high degree of uncertainty affecting climate variables (due to their inherent aleatory nature and the restricted information generally available) strongly bounds the accuracy and credibility of the results on which risk-informed decisions must be made. In this work, well-known traditional approaches (such as Bayesian networks, Monte Carlo methods, etc.) as well as cutting-edge methods from different sectors of the scientific literature have been adopted and integrated in order to obtain a novel theoretical strategy and computational tool able to overcome the limitations of the current state of the art. The result of the research is a complete tool for risk assessment and decision-making support, based on the use of probabilistic graphical models and able to fully represent a wide spectrum of variable types and their uncertainty, to provide the implementation of flexible computational models, and to support their computation and uncertainty quantification.
APA, Harvard, Vancouver, ISO, and other styles
24

Budimir, Mirianna. "Cascading natural hazards : probability and loss modelling for earthquakes and earthquake-triggered landslides." Thesis, University of Southampton, 2015. https://eprints.soton.ac.uk/378652/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
25

Wedholm, Johanna. "Policy change after natural hazards : A systematic large-N study using narrative analysis." Thesis, Uppsala universitet, Statsvetenskapliga institutionen, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-412404.

Full text
Abstract:
The main purpose of this thesis is to describe empirically the extent to which, and how, countries affected by natural hazards refer to these hazards as drivers of policy change. To this end, a systematic large-N study using the innovative method of narrative analysis was conducted on the national progress reports on the implementation of the Hyogo Framework for Action 2013-2015. Starting from theories derived from previous research on policy change and natural hazards, focusing events, and policy windows, two opposing positions on the connection between natural hazards and policy change are described: one treats natural hazards as a driver of policy change, the other as a non-driver. The results of this study showed an absence of any general pattern in the extent to which, and how, affected countries refer to natural hazards as drivers of policy change in the national progress reports. Hence, partial support could be given to both positions on the connection between natural hazards and policy change. These results highlight new potential research openings for future studies.
APA, Harvard, Vancouver, ISO, and other styles
26

Harvatt, Joanne Elizabeth. "Natural hazards and the public in the UK : integrating understanding and precautionary response." Thesis, University of Birmingham, 2008. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.521967.

Full text
APA, Harvard, Vancouver, ISO, and other styles
27

Wang, Wei. "Automated spatiotemporal and semantic information extraction for hazards." Diss., University of Iowa, 2014. https://ir.uiowa.edu/etd/1415.

Full text
Abstract:
This dissertation explores three research topics related to automated spatiotemporal and semantic information extraction about hazard events from Web news reports and other social media. The dissertation makes a unique contribution by bridging geographic information science, geographic information retrieval, and natural language processing. Geographic information retrieval and natural language processing techniques are applied to extract spatiotemporal and semantic information automatically from Web documents, and to retrieve information about patterns of hazard events that are not explicitly described in the texts. Chapters 2, 3 and 4 can be regarded as three standalone journal papers. The research topics covered by the three chapters are related to each other and are presented sequentially. Chapter 2 begins with an investigation of methods for automatically extracting spatial and temporal information about hazards from Web news reports. A set of rules is developed to combine the spatial and temporal information contained in the reports, based on how this information is presented in the text, in order to capture the dynamics of hazard events (e.g., changes in event locations, new events occurring) as they unfold over space and time. Chapter 3 presents an approach for retrieving semantic information about hazard events using ontologies and semantic gazetteers. With this work, information on the different kinds of events (e.g., impact, response, or recovery events) can be extracted, as well as information about hazard events at different levels of detail. Using the methods presented in Chapters 2 and 3, an approach for automatically extracting spatial, temporal, and semantic information from tweets is discussed in Chapter 4. Four different elements of tweets are used to assign appropriate spatial and temporal information to hazard events in tweets. Since tweets represent shorter but more current information about hazards and how they are impacting a local area, key information about hazards can be retrieved through the spatiotemporal and semantic information extracted from tweets.
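A toy illustration of spatiotemporal extraction from hazard reports, far simpler than the dissertation's geographic information retrieval and NLP pipeline: a regular expression pulls date expressions from a sentence and a small hand-made gazetteer resolves place names to coordinates, so the event can be anchored in space and time. The gazetteer entries and the example sentence are invented.

```python
import re
from datetime import datetime

# Tiny hand-made gazetteer: place name -> (latitude, longitude). Entries are illustrative.
GAZETTEER = {
    "cedar rapids": (41.98, -91.67),
    "iowa city": (41.66, -91.53),
}

DATE_PATTERN = re.compile(r"\b(January|February|March|April|May|June|July|August|"
                          r"September|October|November|December) (\d{1,2}), (\d{4})\b")

def extract_event(text):
    """Return (datetime, place name, (lat, lon)) tuples found in a hazard report sentence."""
    dates = [datetime.strptime(" ".join(m.groups()), "%B %d %Y")
             for m in DATE_PATTERN.finditer(text)]
    places = [(name, coords) for name, coords in GAZETTEER.items()
              if name in text.lower()]
    return [(d, name, coords) for d in dates for name, coords in places]

report = "Flooding reached Cedar Rapids on June 13, 2008, forcing evacuations downtown."
for when, where, coords in extract_event(report):
    print(when.date(), where, coords)
```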
APA, Harvard, Vancouver, ISO, and other styles
28

Ishiwata, Hiroaki. "Dynamic Stochastic Macroeconomic Analysis of Natural Hazards and Disaster Risk Reduction in Developing Countries." Kyoto University, 2018. http://hdl.handle.net/2433/232025.

Full text
APA, Harvard, Vancouver, ISO, and other styles
29

Grahn, Tonje. "A Nordic Perspective on Data Availability for Quantification of Losses due to Natural Hazards." Licentiate thesis, Karlstads universitet, Institutionen för miljö- och livsvetenskaper, 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:kau:diva-41138.

Full text
Abstract:
Natural hazards cause enormous amounts of damage worldwide every year. Since 1994, more than 1.35 million people have lost their lives and more than 116 million homes have been damaged. Understanding disaster risk requires knowledge about vulnerability, capacity, the exposure of persons and assets, hazard characteristics and the environment. Quantitative damage assessments are a fundamental part of disaster risk management. There are, however, substantial challenges in quantifying damage, which stem from the diversity of hazards and the fact that one hazardous event can negatively impact a society in multiple ways. The overall aim of the thesis is to analyze the relationship between climate-related natural hazards and subsequent damage, for the purpose of improving the prerequisites for quantitative risk assessments in the future. The thesis concentrates on two specific types of consequences due to two types of hazards: 1) damage to buildings caused by lake floods, and 2) loss of lives caused by quick clay landslides. Several causal relationships were established between risk factors and the extent of damage. Lake water levels increased the probability of structural building damage, while private damage-reducing measures decreased it. The extent of damage decreased with distance to the waterfront but increased with longer flood duration, while prewar houses suffered lower flood damage than others. Concerning landslides, the number of fatalities increased as the number of people in the exposed population increased. The main challenges for further damage estimation are data scarcity, insufficient detail and the fact that the data are rarely collected systematically for scientific purposes. More efforts are needed to create structured, homogeneous and detailed damage databases with corresponding risk factors in order to further develop quantitative damage assessment of natural hazards in a Nordic perspective.
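The causal relationships summarized in this abstract lend themselves to a simple probabilistic damage model. The sketch below is a minimal, hypothetical illustration, not the author's actual model: it assumes a logistic relationship between the probability of structural building damage and illustrative predictors (lake water level, distance to the waterfront, flood duration, pre-1940 construction, private protection measures), and every coefficient value is invented purely for demonstration.

```python
import numpy as np

def damage_probability(water_level_m, distance_m, duration_days,
                       pre1940, private_measures):
    """Illustrative logistic model for the probability of structural
    building damage in a lake flood. Coefficients are hypothetical and
    chosen only to reproduce the signs reported in the abstract: higher
    water level and longer duration increase the probability; greater
    distance, pre-1940 construction and private damage-reducing
    measures decrease it."""
    beta = {
        "intercept": -2.0,
        "water_level": 1.5,       # per metre above normal lake level
        "distance": -0.01,        # per metre from the waterfront
        "duration": 0.05,         # per day of flooding
        "pre1940": -0.6,          # house built before 1940 (0/1)
        "private_measures": -0.8, # sandbags, pumps, etc. (0/1)
    }
    z = (beta["intercept"]
         + beta["water_level"] * water_level_m
         + beta["distance"] * distance_m
         + beta["duration"] * duration_days
         + beta["pre1940"] * pre1940
         + beta["private_measures"] * private_measures)
    return 1.0 / (1.0 + np.exp(-z))

# Example: house 50 m from the shore, water 1.2 m above normal level,
# 10-day flood, built after 1940, no private protection measures.
print(round(damage_probability(1.2, 50, 10, 0, 0), 3))
```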
APA, Harvard, Vancouver, ISO, and other styles
30

Jiang, Fan. "Three Essays on the Behavioral Responses to Coastal Hazards and Vulnerability." FIU Digital Commons, 2018. https://digitalcommons.fiu.edu/etd/3759.

Full text
Abstract:
This dissertation consists of three papers in environmental and natural resource economics. The first paper estimates the value of a statistical life (VSL) from hurricane evacuation behavior through an empirical analysis. I present empirical models that predict individuals' willingness to pay (WTP) for avoiding hurricane risks, revealed through their evacuation behavior. Using survey data from Texas residents who were affected by Hurricane Ike, I analyze the individuals' hurricane evacuation decisions and their corresponding WTP for evacuation. I also estimate the individuals' WTP for avoiding hurricane risks under both voluntary and mandatory evacuation orders and calculate the associated VSL. The findings can be useful to emergency management agencies for evacuation planning. In the second paper, I study market responses to multiple hurricanes based on evidence from real estate sales data. Unlike earlier studies that examined the effect of hurricane exposure on property values, the present study considers how multiple hurricane hits affect home values. I use repeat sales data from three counties in Florida from 2000 to 2010 and develop a hedonic price model. The findings identify the determinants that influence property values and provide valuable insights for homebuyers and sellers. The study also provides useful insights regarding the benefits of hurricane mitigation to Florida residents and beyond. The third paper investigates time preference and the dynamics of evacuation behavior based on evidence from Hurricane Ike and Hurricane Sandy. This paper contributes to the literature on households' evacuation timing decisions by investigating the factors influencing people's time preference in evacuation behavior. Unlike other studies, I examine residents' evacuation behavior across the Gulf coast as well as the Northeast and Mid-Atlantic coasts from a comparative perspective. I use one survey dataset from Texas residents who experienced Hurricane Ike and another survey dataset from the Northeastern and Mid-Atlantic US states that were affected by Hurricane Sandy. The results provide insights for future hurricane evacuation planning and emergency management.
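A revealed-preference VSL estimate of the kind described above is, at its core, a ratio of willingness to pay to the mortality risk reduction obtained. The sketch below is a hypothetical worked example, not the dissertation's estimation procedure: the WTP and risk-reduction figures are invented purely to show the arithmetic.

```python
# Hypothetical VSL calculation from evacuation behavior.
# Assume a household's estimated willingness to pay (WTP) to evacuate is
# $400, and evacuating reduces the household's fatality probability by
# 0.0001 (1 in 10,000). Both numbers are invented for illustration.
wtp_per_household = 400.0          # dollars
mortality_risk_reduction = 1e-4    # change in fatality probability

vsl = wtp_per_household / mortality_risk_reduction
print(f"Implied value of a statistical life: ${vsl:,.0f}")  # $4,000,000
```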
APA, Harvard, Vancouver, ISO, and other styles
31

Soto, Gómez Agnes Jane. "Geographical Distribution of Disasters Caused by Natural Hazards in Data-scarce Areas : Methodological exploration on the Samala River catchment, Guatemala." Doctoral thesis, Uppsala universitet, Luft-, vatten och landskapslära, 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-260708.

Full text
Abstract:
An increasing trend in both the number of disasters and the number of affected people has been observed, especially during the second half of the 20th century. The physical, economic and social impact that natural hazards have had on a global scale has prompted increasing interest from governments, international institutions and academia. This has contributed immensely to improving knowledge on the subject and has helped multiply the number of initiatives to reduce the negative consequences of natural hazards on people. The scale on which studies supporting disaster risk reduction (DRR) actions are performed is a critical parameter. Given that disasters are recognized to be place-dependent, studying the geographical distribution of disasters on a local scale is essential to make DRR practical and feasible for local authorities, organizations and civilians. However, studying disasters on the local scale is still a challenge due to the constraints posed by scarce data availability. Social vulnerability in many disaster-prone areas is, however, a pressing issue that needs to be addressed swiftly despite the many limitations of the data available for such studies. This thesis explored methodological alternatives for studying the geographical distribution of natural disasters and their potential causes in disaster-prone and data-scarce areas. The Samala River catchment in Guatemala, which is representative of areas with high social vulnerability and data scarcity, was selected as a case study. Exploratory methods to derive critical disaster information in such areas were constructed using the geographical and social data available for the study area. The hindrances posed by the available data were evaluated, and the use of non-traditional datasets such as nightlights imagery to complement the available data was explored as a way of overcoming the observed limitations. The exploratory methods developed in this thesis aim at (a) deriving information on natural disasters under data-scarce circumstances, (b) exploring the correlation between the spatial distribution of natural disasters and the physical context in order to look for causalities, (c) using open data to study the social context as a potential cause of disasters in data-scarce areas, and (d) mapping vulnerabilities to support actions for disaster risk reduction. Although the available data for the case study were limited in quantity and quality, and many sources of uncertainty exist in the proposed methods, this thesis argues that the potential contribution to the development of DRR on a local scale is more important than the identified drawbacks. The use of non-traditional data such as remotely sensed imagery made it possible to derive information on the occurrence of disasters and, in particular, on causal relationships between the location of disasters and their physical and social context.
The number of disasters and of people affected by them worldwide has shown an increasing trend, especially in the second half of the twentieth century. The physical, economic and social impact that natural hazards have caused at the global level has led governments, international institutions and academia to take a growing interest in the disasters caused by these hazards. This interest has helped improve existing knowledge about disasters and has multiplied the initiatives aimed at reducing their negative effects on people. The scale at which disaster risk reduction (DRR) initiatives are carried out is a critical parameter for their implementation. The close relationship between disasters and the places where they occur is now widely recognized. For this reason, studying the distribution of disasters at a local scale is essential for DRR to be practical and feasible for local authorities and organizations, as well as for civil society. However, studying disasters at a local scale remains an unresolved problem because of the restrictions imposed by the scarce availability of high-resolution data. Despite the difficulties and limitations identified, social vulnerability in disaster-prone regions is an important problem that needs to be addressed promptly. This thesis explored methodological alternatives for studying the geographical distribution of natural disasters and their potential causes, particularly in disaster-prone areas under conditions of limited information. The Samala River catchment was selected as a case study because it is representative of disaster-prone areas with high social vulnerability and, in addition, data scarcity. The research proposes exploratory methods for extracting critical disaster information using whatever geographical and social information is available, evaluating the obstacles imposed by the limited availability of data. The existing information was complemented with non-traditional sources, e.g. satellite imagery of nightlights, as a way of overcoming the identified limitations. The methods developed in this thesis aimed to (a) obtain information on natural disasters under conditions of data scarcity, (b) explore the correlation between the spatial distribution of natural disasters and their physical context in order to identify causalities, (c) use open-access information to study the social context of disasters as a potential cause of disasters in data-scarce areas, and (d) map vulnerabilities to support actions for DRR. This thesis argues that the potential contribution of the proposed methods to the development of DRR at the local scale is more important than the uncertainties they involve and the limitations created by the reduced quality and quantity of information for the case study. The use of non-traditional information sources such as satellite imagery made it possible to increase the information on disaster occurrences and, in particular, to search for dependency relationships between the particular places where disasters were recorded and their physical and social context.
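One way to use nightlights imagery as a non-traditional exposure proxy, in the spirit described above, is to compare recorded disaster occurrences with light intensity cell by cell. The sketch below is a minimal, hypothetical illustration with invented arrays; it is not the thesis's actual workflow and assumes the nightlight raster and the disaster-count raster are already aligned on the same grid.

```python
import numpy as np

# Hypothetical, already co-registered grids over a catchment: mean annual
# nightlight intensity (a proxy for settlement/exposure) and the number of
# disaster records per cell from a local database. Values are synthetic.
rng = np.random.default_rng(0)
nightlights = rng.gamma(shape=2.0, scale=5.0, size=(60, 40))
disasters = rng.poisson(lam=0.02 * nightlights)

def rank(a):
    """Crude rank transform (no tie averaging), computed with numpy only."""
    order = a.ravel().argsort()
    r = np.empty_like(order, dtype=float)
    r[order] = np.arange(order.size)
    return r

# Rank correlation as a quick check of whether disaster records
# cluster where the exposure proxy is higher.
rho = np.corrcoef(rank(nightlights), rank(disasters))[0, 1]
print(f"Rank correlation between nightlights and disaster counts: {rho:.2f}")
```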
APA, Harvard, Vancouver, ISO, and other styles
32

Lee, Ji Yun. "Risk-informed decision for civil infrastructure exposed to natural hazards: sharing risk across multiple generations." Diss., Georgia Institute of Technology, 2015. http://hdl.handle.net/1853/53965.

Full text
Abstract:
Civil infrastructure facilities play a central role in the economic, social and political health of modern society and their safety, integrity and functionality must be maintained at manageable cost over their service lives through design and periodic maintenance. Hurricanes and tropical cyclones, tornadoes, earthquakes and floods are paramount among the potentially devastating and costly natural disasters impacting civil infrastructure. Even larger losses may occur in the future, given the population growth and economic development accompanying urbanization in potentially hazardous areas of the world. Moreover, in recent years, the effects that global climate change might have on both the frequency and severity of extreme events from natural hazards and their effect on civil infrastructure facilities have become a major concern for decision makers. Potential influences of climate change on civil infrastructure are even greater for certain facilities with service periods of 100 years or more, which are substantially longer than those previously considered in life-cycle engineering and may extend across multiple generations. Customary risk-informed decision frameworks may not be applicable to such long-term event horizons, because they tend to devalue the importance of current decisions for future generations, causing an ethical and moral dilemma for current decision-makers. Thus, intergenerational risk-informed decision frameworks that consider facility performance over service periods well in excess of 100 years and extend across multiple generations must be developed. This dissertation addresses risk-informed decision-making for civil infrastructure exposed to natural hazards, with a particular focus on the equitable transfer of risk across multiple generations. Risk-informed decision tools applied to extended service periods require careful modifications to current life-cycle engineering analysis methods to account for values and decision preferences of both current and future generations and to achieve decisions that will be sustainable in the long term. The methodology for supporting equitable and socio-economical sustainable decisions regarding long-term public safety incorporates two essential ingredients of such decisions: global climate change effect on stochastic models of extreme events from natural hazards and intergenerational discounting methods for equitable risk-sharing. Several specific civil infrastructure applications are investigated: a levee situated in a flood-prone city; an existing dam built in a strong earthquake-prone area; and a special moment resisting steel frame building designed to withstand hurricanes in Miami, FL. These investigations have led to the conclusion that risks can and should be shared across multiple generations; that the proposed intergenerational decision methods can achieve goals of intergenerational equity and sustainability in engineering decision-making that are reflective of the welfare and aspirations of both current and future generations; and that intergenerational equity can be achieved at reasonable cost.
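The core quantitative issue raised in this abstract, how a discounting scheme weights losses borne by future generations, can be illustrated with a short calculation. The sketch below is hypothetical and is not the dissertation's method: it compares the present value of a constant expected annual loss over a 200-year service period under conventional exponential discounting and under an illustrative declining ("intergenerational") discount-rate schedule; all rates and the loss figure are assumptions.

```python
import numpy as np

annual_expected_loss = 1.0e6   # hypothetical expected loss per year ($)
years = np.arange(1, 201)      # 200-year service period

# Conventional exponential discounting at a constant 4 % per year.
pv_constant = np.sum(annual_expected_loss / (1.04 ** years))

# Illustrative declining-rate schedule: 4 % for the first 30 years,
# 3 % for years 31-100, 2 % thereafter. The steps are invented; they
# simply weight distant-future losses less steeply than a constant rate.
rates = np.where(years <= 30, 0.04, np.where(years <= 100, 0.03, 0.02))
discount_factors = 1.0 / np.cumprod(1.0 + rates)
pv_declining = np.sum(annual_expected_loss * discount_factors)

print(f"PV, constant 4% rate:        ${pv_constant:,.0f}")
print(f"PV, declining-rate schedule: ${pv_declining:,.0f}")
```

Under the declining schedule, losses falling on later generations contribute noticeably more to the present value, which is the quantitative effect the intergenerational framing is intended to capture.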
APA, Harvard, Vancouver, ISO, and other styles
33

O'Grady, Kevin Lawrence. "Facing natural hazards : uncertain and intertemporal elements of choosing shore protection along the Great Lakes /." Diss., This resource online, 1992. http://scholar.lib.vt.edu/theses/available/etd-06062008-165904/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
34

Nehl, Ryan E. "Analysis of naturally-occurring and technology-based hazards in Indiana's District 6 region." Virtual Press, 2007. http://liblink.bsu.edu/uhtbin/catkey/1380106.

Full text
Abstract:
Naturally-occurring and technology-based hazards affect public health and safety to varying degrees. Naturally-occurring hazards include weather-related events and infectious disease epidemics/pandemics. Examples of technology-based hazards include hazardous materials incidents and electrical power outages. Due to limited resources, emergency planners have to prioritize the hazards that may affect local jurisdictions. The purpose of the reported study was to construct a hierarchy of public safety hazards at the county and district levels to aid emergency planners. Public safety representatives from Indiana's District 6 region completed a survey, based on the Oregon Emergency Management Hazard Analysis Methodology, which assigns numerical scores to various hazard categories based on history, vulnerability, maximum potential, and probability of occurrence within a given jurisdiction. Participants also answered an open-ended question, in narrative form, describing any additional hazards that may affect their jurisdiction. Significant differences were found in point totals for various hazards (p = .000). Significant differences were found among public safety disciplines in rating the infectious disease hazard (p = .02). No significant differences were found in point totals between naturally-occurring and technology-based hazards (p = .86). Overall, the high level of agreement between disciplines in rating hazards, together with the significant differences between hazard categories, suggests that prioritization of hazard categories is warranted.
Department of Natural Resources and Environmental Management
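The survey instrument described above follows the general pattern of weighted hazard-scoring methods: each hazard receives ratings for history, vulnerability, maximum potential and probability, and the ratings are combined with fixed weights into a comparable point total. The sketch below is a generic illustration of that pattern; the weights and ratings are invented and should not be read as the values actually prescribed by the Oregon Emergency Management Hazard Analysis Methodology.

```python
# Generic weighted hazard scoring in the spirit of the methodology
# described above. Weights and ratings are purely illustrative.
WEIGHTS = {"history": 2, "vulnerability": 5, "maximum_threat": 10, "probability": 7}

def hazard_score(ratings):
    """ratings: dict mapping category -> rating (e.g. 1 low ... 10 high)."""
    return sum(WEIGHTS[c] * ratings[c] for c in WEIGHTS)

hazards = {
    "tornado":            {"history": 6, "vulnerability": 5, "maximum_threat": 7, "probability": 6},
    "infectious_disease": {"history": 3, "vulnerability": 8, "maximum_threat": 9, "probability": 4},
    "hazmat_incident":    {"history": 5, "vulnerability": 4, "maximum_threat": 6, "probability": 5},
}

# Rank hazards by total score to obtain a planning priority order.
for name, score in sorted(((n, hazard_score(r)) for n, r in hazards.items()),
                          key=lambda item: -item[1]):
    print(f"{name:20s} {score}")
```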
APA, Harvard, Vancouver, ISO, and other styles
35

Marcer, Marco. "Déstabilisation des glaciers rocheux dans les Alpes Françaises : une évaluation à l'échelle régionale et locale." Thesis, Université Grenoble Alpes (ComUE), 2018. http://www.theses.fr/2018GREAU048/document.

Full text
Abstract:
Mountain permafrost is threatened by atmospheric warming, an evolution accompanied by an increase in phenomena such as rockfall, thermokarst formation and rock glacier acceleration. Rock glacier destabilization, which compromises the structural integrity of these landforms, appears to be linked to atmospheric warming and has attracted growing interest in recent years. This phenomenon, which can be triggered by permafrost warming or by external mechanical stresses, is characterized by an anomalous acceleration of the affected rock glaciers and by the appearance of geomorphological signs such as cracks and crevasses on their surface. Although the process can be transitory, it can lead to a crisis phase that brings the rock glacier to collapse. This study aims to provide a first assessment of rock glacier destabilization phenomena at the scale of the French Alps. First, the spatial footprint of permafrost was evaluated in order to produce a regional permafrost distribution map, a tool needed to estimate permafrost conditions in rock glaciers. The second step consisted in identifying destabilized landforms through detailed inspection of aerial images, in order to recognize the features typically observed on destabilized rock glaciers. It then becomes possible to understand the typical topo-climatic conditions in which this phenomenon occurs and to identify the landforms susceptible to undergoing this process. Finally, efforts were concentrated on the destabilized Lou rock glacier, where an active-layer detachment led to a debris flow in August 2015. The analysis aimed to better define the circumstances of this event, focusing on the preconditioning, preparatory and triggering factors and on their interaction with the destabilization process. The results provided rich information on the periglacial zone of the region. Permafrost distribution modelling highlighted the extent of the periglacial zone in the region, which can be found on debris slopes above 2300 - 2500 m a.s.l. depending on solar exposure and regional precipitation characteristics. Inspection of aerial photographs revealed 46 landforms undergoing destabilization, i.e. 12% of the active rock glaciers of the French Alps. Destabilization appears more likely to occur under specific local topo-climatic conditions, in particular on north-facing, steep and convex slopes located at the lower margins of the permafrost zone. A large number of rock glaciers that currently show no destabilization are therefore susceptible to future destabilization. The analysis of the Lou rock glacier revealed that its destabilization is linked to a rapid advance of the front towards a torrential gully. This process appears to have increased the predisposition of the debris material at the front to be mobilized by runoff, so that relatively moderate rainfall was sufficient to trigger the event. Despite the uncertainties associated with the methods involved, the results suggest that conditions favourable to destabilization are widespread and that destabilization can raise the level of risk where a site is connected to human infrastructure. Additional efforts should therefore be undertaken to improve the understanding of these processes, in particular through site monitoring and through comprehensive local assessment of the process cascades linked to this phenomenon.
As with several other geosystems on our planet, mountain permafrost is threatened by climate change, as prolonged warming may compromise the geotechnical properties of the frozen ground. As a result, increasing rockfall activity, thermokarst formation and rock glacier acceleration have been observed in the past decades. Rock glacier destabilization, a process that compromises the structural integrity of these landforms, seems to be linked to atmospheric warming and has gained interest in recent years. The destabilization, which may be triggered by warming permafrost or mechanical stress, is characterized by an anomalous acceleration of the landform and the occurrence of specific features such as cracks and crevasses on its surface. Although these processes are mostly transitory, determining a 'crisis' phase of the landform, in exceptional cases they may lead the rock glacier to structural collapse. This PhD thesis provided an assessment of the occurrence of rock glacier destabilization and of its related processes in the French Alps. First, the spatial occurrence of debris permafrost was assessed in order to produce the permafrost distribution map of the French Alps, a tool that was necessary to evaluate permafrost conditions at rock glacier sites. The second step consisted in the identification of destabilized rock glaciers in the region, carried out by interpreting multiple orthoimages to identify features typically observable on destabilized rock glaciers. Once the destabilized rock glaciers were identified, it was possible to analyse the typical topographical settings in which destabilization occurs and to spot those landforms that are susceptible to this phenomenon. After these efforts at the regional scale, the focus shifted towards local-scale investigations at the Lou rock glacier, a partially destabilized landform that, due to frontal failure, triggered a debris flow in August 2015 that caused significant damage to buildings. The analysis aimed to better define the circumstances of this event, focusing on preconditioning, preparatory and triggering factors and their interaction with the destabilization process. The results provided interesting insights into the issue of destabilizing rock glaciers in the region. Permafrost distribution modeling demonstrated the large extent of the periglacial zone in the region, which can be found on debris slopes above 2300 - 2900 m a.s.l. depending upon solar exposure and regional precipitation characteristics. Rock glacier destabilization was observed on 46 landforms, i.e. 12% of the active rock glaciers. Destabilization was found to be more likely to occur in specific local topo-climatic conditions, consisting of north-facing, steep and convex slopes at the lower margins of the permafrost zone. A large number of rock glaciers currently showing no destabilization were found to be located in these conditions and are suggested to be susceptible to future destabilization. As demonstrated by the Lou rock glacier analysis, destabilization was found to be a relevant phenomenon in the context of permafrost hazards. At this site, rock glacier destabilization was linked to a rapid frontal advance towards a torrential gully. This process seems to have increased the site's predisposition to frontal failure, as a mild rainstorm was sufficient to trigger the event. Despite methodological uncertainties, the results indicate that destabilization is widespread and that it may raise the hazard level of a site connected to human infrastructure. It is therefore suggested that, where destabilization has been modelled and where stakes may be at risk downslope, it deserves to be investigated more carefully. In this sense, further efforts should focus on a better understanding of the destabilization process through site monitoring, as well as on a comprehensive hazard assessment linked to this phenomenon.
APA, Harvard, Vancouver, ISO, and other styles
36

Stolle, Jacob. "Debris Hazard Assessment in Extreme Flooding Events." Thesis, Université d'Ottawa / University of Ottawa, 2019. http://hdl.handle.net/10393/39621.

Full text
Abstract:
Coastal areas are often important to economic, social, and environmental processes throughout the world. With changing climate and growing populations in these areas, coastal communities have become increasingly vulnerable to extreme flooding events, such as tsunami, storm surges, and flash floods. Within this new paradigm, there has been an effort to improve upon current methods of hazard assessment, particularly for tsunami. Recently, the American Society of Civil Engineers (ASCE) released ASCE 7 Chapter 6, the world's first standard written in mandatory language that addresses tsunami-resilient design in a probabilistic manner for several of its prescriptions. While the focus often tends to be on mapping the hazards related to hydraulic loading conditions, post-tsunami field surveys from disaster-stricken coastal communities have also shown the importance of considering the loads exerted by solid objects entrained within the inundating flows, commonly referred to as debris loading. Limited research has addressed debris hazard assessment in a comprehensive manner. Debris loading can be generally divided into two categories: impact and damming. Debris impact loads are caused by the rapid strike of solid objects against a structure. Debris damming loads are the result of the accumulation of debris at the face of or around a structure, thus causing an obstruction to the flow. The primary difference between these loads is the time period over which they act. The rapid loading due to debris impacts requires that structural properties be considered in assessing the associated loads, whereas debris damming loads are generally considered in a quasi-static manner. In assessing the hazard associated with both impact and damming loading conditions, methodologies must be developed to consider the likelihood of the load occurring and the magnitude of that load. The primary objective of this thesis was to develop a probabilistic framework for assessing debris hazards in extreme coastal flooding events. To achieve this objective, the components of the framework were split into three general categories: debris transport, debris damming, and debris impact. Several physical experimental studies were performed to address each of these components, representing the most comprehensive assessment of debris hazards in extreme flooding events to date. Debris transport was addressed to estimate the likelihood of debris loading occurring on a structure. The studies presented herein examine the different parameters that must be considered in assessing the motion of debris with the flow. The studies showed that the initial configuration of the debris and the hydrodynamic conditions were critical in determining the motion of the debris. The stochastic properties of the debris motion were also assessed. It was shown that the lateral displacement of the debris could be approximated by a Gaussian distribution and the debris velocity by a Kumaraswamy (1980) distribution. The study of debris impact was further used to develop the current models for estimating the impact force. The rigid body impact model was compared to models where the structural response was considered. The analysis showed that the effective stiffness model proposed by Haehnel and Daly (2004) was best suited to provide a conservative estimation of the impact force. Additionally, the impact geometry was taken into consideration, examining the influence of various parameters on the impact force.
Furthermore, debris damming was examined for the first time in transient loading conditions. This particular study examined the influence of the transient wave condition on the debris dam formation as well as the influence of different debris geometries. The influence of the debris dam geometry was correlated to increases in loading and overtopping conditions at structures. The assessment of debris hazards is critical in the development of accurate design conditions. The probabilistic framework presented within this thesis is expected to provide a basis for estimating debris hazards and inform future studies in the development of hazard assessment models.
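The single-degree-of-freedom "effective stiffness" idea mentioned above lends itself to a compact numerical illustration. In that family of models (e.g. Haehnel and Daly, 2004), the peak impact force scales as the impact velocity times the square root of effective stiffness times debris mass. The sketch below is a hedged illustration, not the thesis's implementation: the stiffness, mass and distribution parameters are invented, and the Kumaraswamy velocity sampling uses a simple inverse-transform draw.

```python
import numpy as np

rng = np.random.default_rng(42)

def kumaraswamy_sample(a, b, size, v_max):
    """Inverse-transform sampling of a Kumaraswamy(a, b) variable,
    rescaled to [0, v_max] as a crude model of debris impact velocity."""
    u = rng.uniform(size=size)
    return v_max * (1.0 - (1.0 - u) ** (1.0 / b)) ** (1.0 / a)

def peak_impact_force(velocity, mass, k_eff):
    """Peak force of a debris impact in a one-degree-of-freedom
    effective-stiffness model: F = v * sqrt(k_eff * m), SI units."""
    return velocity * np.sqrt(k_eff * mass)

# Hypothetical inputs: a 450 kg log, an effective stiffness of 2.4 MN/m,
# Kumaraswamy shape parameters a=2, b=5, and a 6 m/s velocity cap.
velocities = kumaraswamy_sample(a=2.0, b=5.0, size=10_000, v_max=6.0)
forces = peak_impact_force(velocities, mass=450.0, k_eff=2.4e6)

print(f"Median peak impact force: {np.median(forces) / 1e3:.0f} kN")
print(f"95th percentile:          {np.percentile(forces, 95) / 1e3:.0f} kN")
```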
APA, Harvard, Vancouver, ISO, and other styles
37

Pardoe, Joanna [Verfasser]. "Multiple and more frequent natural hazards : The vulnerability implications for rural West Africa communities / Joanna Pardoe." Bonn : Universitäts- und Landesbibliothek Bonn, 2016. http://d-nb.info/112454013X/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
38

Settembrino, Marc. "When There's No Home to Prepare: Understanding Natural Hazards Vulnerability Among the Homeless in Central Florida." Doctoral diss., University of Central Florida, 2013. http://digital.library.ucf.edu/cdm/ref/collection/ETD/id/5864.

Full text
Abstract:
The current study explores the social construction of natural hazards vulnerability by examining the perceptions of emergency management personnel, homeless service providers and homeless men living in Central Florida. The matrix of vulnerability is proposed as a framework for studying disaster vulnerability, wherein vulnerability is viewed as a complex process consisting of social and physical risk, human agency and time. Using the matrix as a guiding framework, this study examines the risks that natural hazards present to the homeless living in Central Florida and the strategies they use to manage these risks. This study argues that because the homeless experience increased exposure to natural hazards, coupled with potential chronic medical conditions, economic hardship and social stigma, they are more vulnerable to natural hazards than the general population. However, this study finds that homeless men in Central Florida use a variety of strategies that help them manage the risks posed by severe and inclement weather.
Ph.D.
Doctorate
Sociology
Sciences
Sociology
APA, Harvard, Vancouver, ISO, and other styles
39

Landry, Joselin Simoneaux. "An Examination: Using Participatory Action Research in a Marginalized Coastal Community at Risk to Natural Hazards." ScholarWorks@UNO, 2008. http://scholarworks.uno.edu/td/856.

Full text
Abstract:
This extended case study examines the appropriateness of using Participatory Action Research (PAR) in a small, marginalized coastal community at risk to natural hazards. PAR is a method of conducting high-quality research to support the social change goals of diverse cultural and ethnic communities, especially as they relate to community involvement, democracy, emancipation, and liberation (Lindsey and McGuinness, 1998). PAR is not the typical research methodology for hazards research. The community's goal was to "Save our heritage and our land". It consists of 75-80 members who primarily make their living by seafood extraction. This community has experienced social oppression and environmental events associated with living in a coastal Louisiana community. Findings suggest that PAR has its limitations; however, it does appear to be a useful research method for residents, researchers and those who want a more respectful, empowering, ground-up approach to multi-user learning and social change.
APA, Harvard, Vancouver, ISO, and other styles
40

Castillo, Rodríguez Jesica Tamara. "INTEGRATED FLOOD RISK MANAGEMENT: TOWARDS A RISK-INFORMED DECISION MAKING INCORPORATING NATURAL AND HUMAN-INDUCED HAZARDS." Doctoral thesis, Universitat Politècnica de València, 2018. http://hdl.handle.net/10251/82305.

Full text
Abstract:
Flood risk reduction is a global challenge. Society demands higher safety and security levels, including actions related to protecting flood defence infrastructure against natural hazards and man-made threats. Dams and levees, among other flood defence infrastructures, are critical hydraulic structures that aim to reduce the likelihood that people and property will be flooded, but whose failure would have consequences for the community downstream, including not only economic damage but also loss of life. There is always a probability associated with infrastructure failure, although in general it might be very low. The purpose of the PhD research presented here, entitled "Integrated flood risk management: towards a risk-informed decision making incorporating natural and human-induced hazards", is to propose a framework that enhances integrated flood risk management from a multi-hazard perspective (pluvial flooding, river flooding, dam and levee failure, including man-made threats), addressing current needs for decision making on flood risk reduction and analyzing the complexity of systems with multiple components exposed to multiple hazards. The thesis is structured in three main parts: (i) Part I, a methodology that provides a common framework for identifying and characterizing flood risk due to pluvial flooding, river flooding and dam failure, and for incorporating information on loads, system response and consequences into risk models to analyse societal and economic flood risk; (ii) Part II, an approach for quantifying and analyzing risk in complex dam-levee systems, which incorporates information on levee failure into risk models based on the aforementioned methodology and analyses societal and economic flood risk including the potential failure of these infrastructures; and (iii) Part III, a screening tool to characterize the impact of human-induced threats on risk due to dam failure or mission disruption. Results from this research have shown that the use of risk models provides a logical and mathematically rigorous framework for compiling information on different natural hazards and on flood defence performance for flood risk characterization and analysis. The framework proposed in this thesis and its applications aim to encourage key actors in flood risk management (infrastructure managers, authorities, emergency action planners, etc.) to use quantitative risk analysis (QRA), and to demonstrate to what extent QRA can usefully contribute to a better understanding of risk drivers and inform decisions on how to act to reduce flood risk efficiently.
Flood risk reduction is a global challenge. Today's society demands ever higher safety levels, including actions linked to protecting flood defence infrastructure against natural and man-made threats. Dams and levees, among other defence works, are critical infrastructures whose objective is to reduce the probability of flooding. However, their failure can result in consequences for the community located downstream, including not only economic damage but also potential loss of life. There is always a certain probability associated with the failure of these infrastructures, although in general it is very low. The objective of the research carried out in this doctoral thesis, entitled "Integrated flood risk management: towards a risk-informed decision making incorporating natural and human-induced hazards", is to provide a framework that promotes integrated flood risk management from a multi-hazard perspective, considering current needs in decision making for flood risk management and analyzing the complexity of systems with multiple components affected by different hazards. The thesis is structured in three main parts: (a) Part I, a methodology that provides a common framework for identifying and characterizing flood risk from pluvial flooding, river flooding and dam failure, incorporating information on loads, system response and consequences into risk models that allow societal and economic flood risk to be analyzed and evaluated; (b) Part II, a method for quantifying and analyzing risk in complex dam-levee systems, with the aim of incorporating information on levee failure into the methodology proposed in Part I and analyzing societal and economic flood risk including the failure of several defence infrastructures; and (c) Part III, a screening tool for characterizing the impact of man-made threats on the risk associated with dam failure. The results of this research show that the use of risk models provides a logical and mathematically rigorous framework for considering all the information needed to adequately characterize and analyze flood risk from natural hazards and from the failure or malfunction of defence works. The proposed methodological framework and the applications described in this thesis aim to promote the application of risk analysis by the key actors in flood risk management (infrastructure managers, local authorities, emergency managers, etc.) and to demonstrate to what extent such analyses can contribute to a better understanding of the key factors that make up risk and inform decision making towards more efficient risk reduction.
Flood risk reduction is a global challenge. Today's society demands higher safety levels, including actions linked to protecting flood defence infrastructure against natural and man-made threats. Dams and river levees, among other defence works, are critical infrastructures whose objective is to reduce the probability of flooding, but their failure can result in economic damage and also potential loss of life. There is always a certain probability associated with the failure of these infrastructures, although in general it is very low. The objective of the research carried out in this doctoral thesis, entitled "Integrated flood risk management: towards a risk-informed decision making incorporating natural and human-induced hazards", is to provide a framework that promotes integrated flood risk management from a multi-hazard perspective, taking into account current needs for decision making in flood risk management and analyzing complex systems with multiple components affected by different hazards. The thesis is structured in three main parts: (a) Part I, a methodology proposed to provide a common framework for identifying and characterizing flood risk from pluvial flooding, river flooding and dam failure, incorporating information on loads, system response and consequences into risk models that allow societal and economic flood risk to be analyzed; (b) Part II, a method for quantifying and analyzing risk in complex systems, with the aim of incorporating information on river levee failure into the methodology described in Part I and analyzing the societal and economic risk associated with the failure of several defence infrastructures; and (c) Part III, a screening tool for characterizing the impact of man-made threats on the risk associated with dam failure. The results of the research demonstrate the usefulness of applying risk models, which provide a logical and mathematically rigorous framework for considering all the information needed to adequately characterize and analyze flood risk from natural hazards and from the failure of defence works. The methodological framework and the applications derived from this thesis aim to promote the application of quantitative risk analysis by the actors involved in flood risk management (infrastructure managers, local authorities, emergency managers, etc.) and to demonstrate that such analyses can contribute to a better knowledge of the key factors that make up risk and inform the decisions needed for more efficient risk reduction.
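Risk models of the kind described in this abstract ultimately combine event probabilities, system response (e.g. the conditional probability that a dam or levee fails under a given load) and consequences into societal and economic risk figures. The sketch below is a deliberately simplified, hypothetical event-tree calculation of annualized economic risk; the flood frequencies, failure probabilities and damage values are invented and do not come from the thesis.

```python
# Hypothetical annualized flood risk with and without levee failure.
# Each scenario: annual exceedance probability of the flood load,
# conditional probability of levee failure, and damages (million EUR).
scenarios = [
    # (annual prob., P(failure | load), damage if no failure, damage if failure)
    (1 / 10,   0.001,  0.5,  20.0),
    (1 / 100,  0.02,   5.0,  80.0),
    (1 / 1000, 0.15,  15.0, 250.0),
]

expected_annual_damage = 0.0
for p_flood, p_fail, dmg_ok, dmg_fail in scenarios:
    # Law of total expectation over the defence's response for each load.
    expected_annual_damage += p_flood * ((1 - p_fail) * dmg_ok + p_fail * dmg_fail)

print(f"Expected annual damage: {expected_annual_damage:.2f} million EUR/year")
```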
Castillo Rodríguez, JT. (2017). INTEGRATED FLOOD RISK MANAGEMENT: TOWARDS A RISK-INFORMED DECISION MAKING INCORPORATING NATURAL AND HUMAN-INDUCED HAZARDS [Tesis doctoral no publicada]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/82305
TESIS
APA, Harvard, Vancouver, ISO, and other styles
41

Findley, D. Maximillian. "Rationalizing Disaster: Assessing the physical, economic, and cultural impact of natural hazards in Luzon, 1645-1754." Thesis, Findley, D. Maximillian (2020) Rationalizing Disaster: Assessing the physical, economic, and cultural impact of natural hazards in Luzon, 1645-1754. PhD thesis, Murdoch University, 2020. https://researchrepository.murdoch.edu.au/id/eprint/54529/.

Full text
Abstract:
This dissertation asserts that historic natural hazards and the disasters they created are a potent and flexible analytical tool for studying the Philippine archipelago during the mid-colonial period (ca. 1640-1764). Historic hazards, because they occurred in the islands with sufficient regularity in the seventeenth and eighteenth centuries, were not just discrete, disruptive events, but processes that acted over time. These hazards, when viewed as processes, illustrate how a colonial society altered and adapted itself to cope with prolonged disruptions. When comprehended as events, though, responses to individual natural hazards identify the central connections and tensions that defined the colonial Philippines at the moment of disruption. Therefore, this dissertation employs both perspectives to study the physical, economic, and cultural impact of natural hazards in Spanish Luzon between 1645 and 1754, years defined by the most severe disasters experienced in the islands in their respective centuries. By treating each hazard that transpired in the 109-year period as a separate event, the dissertation demonstrates how seismic and meteorological hazards threatened crucial assets of the Spanish Empire, including galleons, fortresses, and churches. The dissertation also identifies how individual hazards amplified the growing poverty of the Philippine colony in the seventeenth century. By treating the same destructive events as a process, the dissertation shows the evolving responses of governing institutions (colonial administrators and members of the clergy) to natural hazards over time. These institutional adaptations are reflected in the ways narratives of disaster shared amongst the colony's literate Spanish elite changed between 1645 and 1754 to emphasize hazards' capacity for destruction over their supposed metaphysical causes. Lastly, through case studies on folk magic and the creolization of Catholic festivals, the dissertation explores, in turn, how Spanish soldiers and the colonized indigenous peoples of Luzon perceived natural hazards.
APA, Harvard, Vancouver, ISO, and other styles
42

Korzeniowska, Karolina [Verfasser], and Oliver [Akademischer Betreuer] Korup. "Object-based image analysis for detecting landforms diagnostic of natural hazards / Karolina Korzeniowska ; Betreuer: Oliver Korup." Potsdam : Universität Potsdam, 2017. http://d-nb.info/1218402792/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
43

Dhanhani, Hamdan Al Ghasyah. "Evaluation of the response capability of the United Arab Emirates (UAE) to the impact of natural hazards." Thesis, University of Bedfordshire, 2010. http://hdl.handle.net/10547/241787.

Full text
Abstract:
The UAE is an Islamic state which has undergone dramatic urbanisation in the last 30 years. It is situated near the eastern margin of the Arabian tectonic plate, close to the seismically active collision zone between the Arabian and Eurasian plates, marked by the Zagros Mountain belt of Iran. In the UAE, the population of Dibba in Fujairah has felt tremors as recently as 26 November 2009, and an earthquake with a magnitude (M) of 5 occurred in Masafi, Fujairah, in March 2002. The most recent earthquake was M 4.3, and awareness of seismic hazard is increasing. In addition to earthquakes, rapid heavy rainfall in the arid environment of the UAE typically results in high levels of discharge and flooding. Tropical storms have also struck the Indian Ocean coast of the UAE and have caused damage in coastal areas. The impact of natural hazard events in Fujairah since 1995 and the responses of the authorities and affected communities illustrate the issues faced by the country and are discussed in this thesis. The Federal Plan for facing disasters in the UAE, prepared by the Civil Defence, sets out the role of government structures in the UAE in managing disasters, with particular reference to the Ministry of Interior, which is the responsible body. A survey of UAE ministries and the Civil Defence shows that in practice there is a lack of clarity between the roles of government bodies and there are many areas of confusion regarding jurisdiction and responsibility between the federal and individual emirate institutions. It was a concern that some supporting ministries were unaware of their role as set out in the overall plan. There is a lack of evidence of an integrated approach and no testing of the effectiveness of emergency procedures through simulation exercises. It is recognised that not only are school children particularly vulnerable to natural disasters, but also that education is an important mitigation tool for raising awareness of hazard exposure amongst the population. A survey of schools in Fujairah showed that there was little preparation for natural disasters and no framework to address this issue or to ensure the structural integrity of school buildings. The survey revealed that there is a willingness to learn among the school children, and this was followed up by a pilot scheme to raise awareness. This is important as the survey also revealed that traditional views about losses are still common amongst parents, particularly in rural areas. The vulnerability of communities to natural hazards is strongly influenced by social and cultural factors. A survey of the population of the UAE was undertaken to investigate their awareness of natural hazards, their perception of risk and how this might be mitigated. The survey revealed a low level of awareness of natural hazards and of what the role of government agencies would be in the event of a disaster. A majority considered that disasters were Acts of God, a punishment, and that the most effective way to mitigate risk was through religious observance. It is clear that even in a developed Islamic country an effective response to mitigate risk needs to recognise and address the cultural and religious contexts. Finally, the thesis evaluates the response capability of the UAE to the impact of natural hazards. This analysis shows that though there is a Federal Plan for Disasters, there is little specific focus on natural hazards. Ministries not directly involved with the Civil Defence were sometimes unclear regarding their roles.
At an operational level there is lack of clarity regarding responsibilities and lines of authority between different bodies and between Federal and emirate structures. The Civil Defence was very much focussed on response with little effort devoted to reducing vulnerability through awareness-raising, hazard assessment and monitoring. These need to be addressed to minimise the risk from natural disasters.
APA, Harvard, Vancouver, ISO, and other styles
44

Homan, Jacqueline. "Realism, social constructionism and 'natural' hazards : a study of people-nature relations in Egypt and the U.K." Thesis, University of Liverpool, 1998. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.366403.

Full text
Abstract:
Recent literature in the social sciences has emphasized the socially constructed nature of knowledge. Consequently, this has had a bearing upon the understanding of science; interpretations of the natural world; and issues associated with understanding 'the Other'. The relevance of these wider social debates can be extended into a consideration of 'natural' hazards in different cultural contexts. This thesis attempts to develop a 'middle ground', drawing on theories of critical realism, that appreciates the socially constructed nature of scientific practice, but that retains the empancipatory, positive potential of science and that allows intervention in other cultural contexts. The remainder of the thesis attempts to put some of these ideas into practice and to develop the implications of these arguments for those interested in understanding and mitigating 'natural' hazards in other cultures. Two case studies are used, relating to Egypt and the U.K., which explore the scientific understandings of 'natural' hazard events in two different cultural contexts. Fundamental to the approach adopted is the need to acknowledge science as a social practice and how it functions within different societies. Examples are given, pertaining to both Egypt and the U.K., of what this might mean in 'practice'. In summary, therefore, there is an appreciation of the implications of recent social science literature for hazards research and the development of a practical approach to hazards with a social and philosophical justification
APA, Harvard, Vancouver, ISO, and other styles
45

Мартиненко, Володимир Олександрович, Владимир Александрович Мартыненко, Volodymyr Oleksandrovych Martynenko, and К. В. Черніговець. "Управління ризиками надзвичайних ситуацій природного та техногенного характеру." Thesis, Сумський національний аграрний університет, 2015. http://essuir.sumdu.edu.ua/handle/123456789/60169.

Full text
APA, Harvard, Vancouver, ISO, and other styles
46

Da, Silva Garcia Bruna. "Analyse des mécanismes d'interaction entre un bloc rocheux et un versant de propagation : application à l'ingénierie." Thesis, Université Grenoble Alpes (ComUE), 2019. http://www.theses.fr/2019GREAI062/document.

Full text
Abstract:
Many uncertainties remain about the interaction mechanisms between rock blocks and the natural slope during rockfall events; forecasting such events therefore remains uncertain. Nevertheless, numerical tools and computing power keep evolving. Whereas trajectory calculations used to be restricted to simplified geometries and two-dimensional ballistic motion, it is now possible to incorporate refinements such as the complex shape of the blocks, high-resolution three-dimensional digital terrain models, or a fine treatment of the dissipative mechanisms at the point of impact between the block and the run-out slope. The main objective of the thesis is to analyze, with a three-dimensional discrete numerical code, the influence of shape and interaction parameters on the nature of the rebound in an engineering context. We first present a methodology for identifying contact parameters and studying their sensitivity, developed and validated on laboratory experiments. This methodology was then applied to two rockfall experiments carried out on real sites at medium and large scale. The medium-scale study allowed the numerical model to be compared with data obtained during an experimental campaign on railway tracks commissioned by SNCF and carried out in collaboration with IRSTEA. The analyses focused on the impact velocities of the blocks on the ballast and on the propagation distances. The large-scale study is based on several series of block releases performed at the experimental site of the Authume quarry as part of a benchmark proposed within the Projet National C2ROP. The main objective of the benchmark is to test and compare trajectory software, numerical computation codes and engineering practices in order to define their relevance and domains of validity. Within the thesis, this work was conducted in several phases (blind, then with partial data measured during the test campaign), and we present the evolution of the analyses at the end of each phase. The study focused mainly on the velocities, heights and energies of the blocks at certain points of the propagation profile, as well as on the stopping positions of the blocks. A study of the influence of block shape on propagation distances is also presented. Finally, an internal benchmark carried out within the company IMSRN shows the importance, for the analyses, of the operator's expertise, and the consequences of using different trajectory tools (in 2D and in 3D). This work highlights the current issues frequently faced by engineering consultancies and the engineers in charge of risk studies.
Numerous uncertainties about the mechanical interaction between rock boulders and the natural slope during rockfall persist, and the forecasting of such events is therefore still uncertain. Nevertheless, digital tools and computing power are constantly evolving. Previously, trajectory calculations were restricted to simplified geometries and two-dimensional ballistic movements, but it is now possible to incorporate refinements such as the complex shape of the blocks and large, detailed three-dimensional digital terrain models, as well as a better accounting of the dissipative mechanisms at the point of impact between the block and the run-out slope. The main objective of this work is to analyze, with a three-dimensional discrete element code, the influence of the shape and interaction parameters on the nature of the rebound in an engineering context. We first present a methodology for identifying and studying the sensitivity of contact parameters, developed and validated from laboratory experiments. This methodology was subsequently applied to two rockfall experiments conducted at real sites at medium and large scale. The medium-scale study allowed the numerical model to be compared with data obtained during an experimental rockfall test campaign on railway tracks commissioned by the SNCF and conducted in collaboration with IRSTEA. The analyses carried out focused mainly on the impact velocities of the blocks on the ballast and on the propagation distances. The large-scale study is based on a series of block releases performed at the experimental site (Authume quarry, France) as part of a benchmark proposed within the National Project C2ROP. The main goal of this benchmark is to assess and compare trajectory software, numerical computation codes and engineering practices in order to define their relevance and domains of validity. As part of the thesis, this work was conducted in several phases (a blind phase, followed by phases taking into account partial data measured during the experimental tests), and we present the evolution of these analyses at the end of each phase. The study focused on the velocities, heights and energies of the blocks at certain points of the propagation profile, as well as on the stopping positions of the blocks. The influence of block shapes on run-out distances is also presented. Lastly, an internal benchmark performed within the IMSRN company shows the importance of the operator's expertise for the analyses, and the consequences of applying different trajectory tools (in 2D and in 3D). This work highlights the current issues that are often faced by engineering offices and engineers in charge of risk quantification.
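The rebound behaviour discussed above is often parameterized, in simple lumped-mass trajectory codes, by normal and tangential restitution coefficients applied to the velocity components at each impact. The sketch below is a deliberately simplified 2D illustration of that idea, not the discrete element approach used in the thesis; the slope angle, restitution coefficients and release speed are invented.

```python
import math

# Simple 2D lumped-mass rebound model on a planar slope.
# Hypothetical parameters; not calibrated values from the thesis.
RN, RT = 0.35, 0.85              # normal / tangential restitution coefficients
SLOPE = math.radians(35.0)       # slope angle

def rebound_speed(impact_speed, incidence=SLOPE):
    """Split the impact velocity into slope-normal and slope-tangential
    components, apply the restitution coefficients, and return the
    rebound speed. The incidence angle is measured from the slope."""
    vn = impact_speed * math.sin(incidence)
    vt = impact_speed * math.cos(incidence)
    return math.hypot(RN * vn, RT * vt)

# Successive impacts, crudely assuming the block hits the slope with the
# same incidence angle each time (no ballistic phase is modelled here).
speed = 12.0  # m/s at the first impact
for i in range(1, 6):
    speed = rebound_speed(speed)
    print(f"impact {i}: rebound speed = {speed:.2f} m/s")
```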
APA, Harvard, Vancouver, ISO, and other styles
47

Phillips, Melissa Catherine Koeka. "Lightning and hurricane safety knowledge and the effects of education modes on elementary school children." Kent State University / OhioLINK, 2016. http://rave.ohiolink.edu/etdc/view?acc_num=kent1465337764.

Full text
APA, Harvard, Vancouver, ISO, and other styles
48

Straßer, Michael. "Quantifying Late Quaternary natural hazards in Swiss lakes subaquatic landslides, slope stability assessments, Paleoseismic reconstructions and lake outbursts." Zürich Schweizerische Geotechnische Kommission, 2008. http://e-collection.ethbib.ethz.ch/view/eth:30202?q=strasser.

Full text
APA, Harvard, Vancouver, ISO, and other styles
49

BIGI, VELIA. "Understanding vulnerability to natural hazards: the importance of a local scale perspective for the production of relevant information." Doctoral thesis, Politecnico di Torino, 2022. http://hdl.handle.net/11583/2963944.

Full text
APA, Harvard, Vancouver, ISO, and other styles
50

Svensson, Britt-Marie. "Methodology for evaluation of hazards from solid waste and landfill-generated leachate." Doctoral thesis, Högskolan Kristianstad, Sektionen för Lärarutbildning, 2008. http://urn.kb.se/resolve?urn=urn:nbn:se:hkr:diva-146.

Full text
Abstract:
A methodology based on an analytical protocol for evaluation of hazards from landfill leachate and solid waste is described. A dynamic analytical protocol, the LAQUA protocol, including measurements of inorganic and water-quality parameters, polar and non-polar organic marker compounds, and toxicity, was constructed. An acute toxicity test, using the brackish water crustacean Artemia salina as test organism, was developed. The methodology was applied to authentic problems such as investigation of different treatment techniques for landfill leachate, evaluation of leaching tests and characterization of solid wastes, and an investigation of a filter material intended for leachate treatment. In all cases, the investigated treatment methods comprised pre-treatment by aeration combined with sedimentation, followed by one of: bioremediation, ozonation, chemical oxidation by Fenton’s reagent, or geo-bed filters. Evaluated filter materials were mixtures of natural or residual waste products. A combination of pre-treatment followed by a geo-bed filter containing a mixture of peat and carbon-containing ash gave an efficient simultaneous removal of metals and organic pollutants. The performance of two leaching tests for characterization of solid waste, the up-flow percolation test (SIS-CEN/TS 14405:2004) and the batch test (SS-EN 12457-3), was investigated. Solid waste materials (sludge from street gutters and fragmented metallic waste) were characterized using leaching tests, and the hazards of the materials were evaluated in the eluate obtained at specific liquid-to-solid ratios (L/S). The L/S 2 and L/S 10 values were compared with limit values included in the waste acceptance criteria (WAC); a simplified illustration of this comparison is sketched after this abstract. The analyses were extended towards specific organic compounds, such as individual phenolic compounds and polychlorinated biphenyls (PCB). Organic compounds were found in eluates from both types of tests, showing that these methods can be used to evaluate the leaching of such compounds from waste materials. The use of authentic leachate as leachant leads to increased concentrations of heavy metals in the eluate, compared to the prescribed use of demineralised water as leachant. Generally good agreement was found between the results of the two leaching methods. A strategy based on batch tests is described for investigation of a filter material for leachate treatment. Batch tests gave suitable information about the leaching from new and used material, and showed high removal efficiencies for metals and non-polar organic compounds. However, for investigation of the removal of polar organic markers (e.g. phenolic compounds), a batch test is not sufficient and should be supplemented by a column test.
Waste has been generated in every society throughout history. The most common way of dealing with it has been to collect the waste in dumps (landfills) on land regarded as unusable, for example in wetlands outside built-up areas. As consumption and production have increased, so have the amounts of waste. The reason that waste and waste management have come increasingly into focus in environmental debates over recent decades is not only the large amounts of waste in the many and growing refuse mountains around the world. Landfills also release environmentally hazardous substances from the increasingly complex products deposited in them. These will continue to leak out long after deposition at the site has ceased, creating problems for people and the environment for several generations. As a consequence of international decisions, including Agenda 21 (the UN environmental conference in Rio) and the EU legislation on waste management that Sweden has incorporated into its own law, the number of landfills will, among other things, decrease. After 31 December 2008, only about 90 landfills for municipal waste are expected to remain in operation in Sweden. The legislation describes a waste hierarchy in which landfilling is the worst and absolutely last option, permitted only when none of the other alternatives can be met. High demands are placed on the landfills approved under current legislation. Different types of landfills are required for different kinds of waste. A landfill for hazardous waste has better protection against leakage, both beneath and above the site, than one for non-hazardous waste. To control what waste is placed in the landfill and thus handle it correctly, the waste must be described and classified before being sent for disposal. This characterisation is partly carried out using leaching tests. The aim of this thesis has been to develop a methodology for evaluating environmentally hazardous substances originating from waste and from landfill leachate. The initial research formed part of a project called Laqua, financed by the European Commission's programme for cooperation in the Baltic Sea region, SweBaltcop. The task of the project was to promote the development of ecologically and economically sustainable treatment methods for leachate. Leachate is formed mainly by precipitation falling on the landfill. The water that enters the landfill carries many of the substances present in the tip with it as it drains out. These substances may originate from what has been deposited or be formed as the waste degrades. The leachate is collected and treated in some way before being discharged to a natural watercourse. A common method of treating leachate is to pump it to the municipal wastewater treatment plant and treat it together with sewage. However, this is not an optimal solution, since leachate contains other pollutants than sewage, such as salts, heavy metals and persistent organic compounds. Wastewater treatment plants are designed to treat sewage, and the pollutants in leachate can disturb the treatment process. Above all, the sensitive microorganisms in the biological treatment step can be adversely affected. The sludge formed during the treatment process will concentrate many of the unwanted pollutants originating from the landfill. The sludge is in principle an excellent fertiliser for agriculture, but because it is contaminated by heavy metals and persistent organic substances it cannot be used. The sludge then becomes a waste that the municipality cannot dispose of without great cost. Many municipalities have therefore opted for separate treatment of the leachate.
The choice of treatment technique depends on several factors, such as the volumes of leachate generated, the content of the various hazardous substances, where the treated water is to be discharged, and the space available for building a treatment plant. To evaluate different techniques for leachate treatment, a pilot plant was built at the landfill in Kristianstad. To evaluate a treatment method, determinations of the concentrations of various substances (analyses) must be made. Because of the growing concern about organic pollutants such as PCBs and phenols, the treatment techniques were to be evaluated with a focus on such or similar substances. Analyses of organic substances are complicated and time-consuming, and it is not possible to analyse all substances. In many studies, only general parameters are used to estimate the content of organic substances, but these methods often do not provide sufficient information on the actual content of the leachate. An evaluation protocol, the LAQUA protocol (Paper I), was therefore developed for determining organic pollutants in various contaminated waters. In addition to analyses of organic pollutants such as PCBs and phenols and an acute toxicity test, this protocol also includes standardised routine analyses of metals and water-chemistry parameters. Since separate analysis of every organic substance is not necessary to assess different leachate treatment methods, the protocol contains a number of markers for polar and non-polar organic compounds, respectively. The biological toxicity test that was developed (Paper III) and is included in the protocol is a so-called acute toxicity test, i.e. the organism is directly affected by high levels of pollutants. The test uses the small salt-tolerant crustacean Artemia salina, which is sold as food for aquarium fish. A fixed volume containing a number of Artemia larvae is placed in small wells with different concentrations of leachate for 24 hours. The dilution of leachate at which half of the crustaceans show impaired mobility is then determined. The results of tests on untreated leachate and on leachate treated with different techniques are compared, and in this way the efficiency of treatment methods can be assessed with respect to an aquatic organism. The evaluation of the pilot plant (Paper II) showed that the pre-treatment, consisting of aeration and sedimentation, removed much of the pollution, and it is recommended that such a treatment step should always be included in a leachate treatment plant. The chemical treatment methods using ozone and Fenton's reagent (ferrous iron and hydrogen peroxide) were effective at removing the organic pollutants, but the more cost-effective filter beds also proved to work well. The good experience with filter beds at the pilot plant led to the efficiency of further filter materials being investigated. Paper VI describes a laboratory experiment in which leachate from a landfill receiving industrial waste (metal waste from, among other things, cars and refrigerators) was passed through columns with different filter materials. The mixture of peat and carbon-containing ash proved to be good at removing both metals and organic substances from the leachate. The knowledge gained from these and other investigations has contributed to the construction of a full-scale plant for on-site leachate treatment at Stena Metall's landfill in Halmstad. The second part of the thesis work focused on leaching tests. Two different standardised leaching test methods are recommended for investigating which substances can be leached from a waste.
In the first method, a liquid is pumped through a column containing a weighed amount of waste until a certain liquid-to-solid ratio (L/S) has been reached. In the second, faster method, a fixed amount of waste is shaken together with a fixed volume of liquid for 24 hours. The liquid obtained after the tests is comparable to a leachate and is intended to simulate the leaching that the waste will produce during its time in the landfill. This eluate is analysed, the measured concentrations of various substances are compared with a table of limit values, and the waste can be assigned to a waste class. The two leaching tests were used to characterise different types of waste: fragmented metal scrap (Paper IV) and sludge from street gullies (Paper V). To gain more knowledge about the methods and be able to develop them further, the investigations and analyses were extended. The eluates were therefore analysed according to the Laqua protocol, i.e. with markers for organic pollutants. In addition, investigations were carried out in which leachate was used as the leaching liquid instead of the deionised water prescribed by the test methods. These showed that the more ionic liquid (the leachate) increased the leaching of metals from the waste. A comparison of the two methods showed that the faster batch (shaking) test usually gave equivalent or higher concentrations of the analysed substances in the eluate, and it can therefore in many cases be used as the first choice. To assess a filter material from a life-cycle perspective, a strategy based on batch tests was developed (Paper VII). A filter material, a mixture of peat and carbon-containing ash, was examined before and after its use in a filter bed for leachate treatment. To make sure that the filter material itself did not release any pollutants, a leaching test was performed. To see how effective the material was at removing metals, PCBs and phenolic compounds, batch tests were performed with liquids containing known concentrations of these pollutants. When the filter material is spent and must be replaced it is regarded as a waste, and it was characterised with a leaching test to determine how it should be handled. The batch-test methodology gives good information about leaching from a material, and batch tests are also good instruments for evaluating a filter material's efficiency in retaining metals and non-polar organic substances such as PCBs. However, a short-term batch test is not a good instrument for evaluating the removal of polar organic substances (e.g. phenols). The reduction of these substances occurs through degradation by microorganisms, and to investigate this, tests are needed that last long enough for a microbiological environment to become established, e.g. column tests. The evaluation of this thesis work points to some further conclusions and suggestions for future work. The Artemia toxicity test should be supplemented with tests on, for example, bacteria and plants, since a test on a single organism is not sufficient to assess the toxicity of a pollutant to a natural ecosystem. A biological test is also needed to detect chronic effects, such as reproductive damage or tumour diseases. Such damage can arise when organisms are exposed over a long time to the low, but not therefore harmless, concentrations of organic pollutants that often occur in leachate. The methodology presented can be used to evaluate environmentally hazardous substances from various contaminated sites.
The thesis has shown that the composition of the LAQUA protocol and its analyses are a good instrument for evaluating leachate treatment techniques and for providing additional information on organic substances in solid waste. Assessment of air quality and characterisation of stormwater are other examples where the methodology can be applied. Finally, it is of course easier to evaluate the hazards of waste when there are smaller volumes of waste to evaluate. This can be achieved by consuming less, reusing products, recycling materials, recovering energy from the waste and choosing environmentally friendly products when buying new ones.
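As a rough, hypothetical illustration of the classification step referred to in the abstract (comparing eluate concentrations at a given liquid-to-solid ratio with waste acceptance criteria), the sketch below checks a set of placeholder L/S 10 concentrations against placeholder class limits. Neither the limits nor the measured values are taken from the thesis; a real assessment follows the standardised tests (SIS-CEN/TS 14405:2004, SS-EN 12457-3) and the official WAC tables.

```python
# Hypothetical sketch of the WAC comparison step: eluate concentrations from a
# batch leaching test at L/S 10 (mg/kg dry matter) are checked against limit
# values for the landfill classes, from the strictest (inert) upward.
# Substance names, limits and measured values are illustrative placeholders.
LANDFILL_CLASSES = ("inert", "non-hazardous", "hazardous")

WAC_LIMITS_LS10 = {          # substance: one limit per class, None = no criterion
    "Cd": (0.04, 1.0, 5.0),
    "Pb": (0.5, 10.0, 50.0),
    "Zn": (4.0, 50.0, 200.0),
    "phenol index": (1.0, None, None),
}

def classify_waste(eluate_ls10):
    """Return the first landfill class whose limit values are all met."""
    for idx, landfill_class in enumerate(LANDFILL_CLASSES):
        compliant = True
        for substance, concentration in eluate_ls10.items():
            limits = WAC_LIMITS_LS10.get(substance)
            if limits is None or limits[idx] is None:
                continue                  # no criterion for this substance/class
            if concentration > limits[idx]:
                compliant = False
                break
        if compliant:
            return landfill_class
    return "not acceptable without further treatment"

if __name__ == "__main__":
    # Placeholder eluate concentrations at L/S 10 from a batch test
    eluate = {"Cd": 0.02, "Pb": 2.3, "Zn": 30.0, "phenol index": 0.4}
    print(classify_waste(eluate))         # -> non-hazardous with these numbers
```

The thesis additionally extends this inorganic comparison with organic markers (phenolic compounds, PCB) and a toxicity test, which the simple limit-value lookup above does not capture.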
APA, Harvard, Vancouver, ISO, and other styles