Academic literature on the topic 'Measure-based modeling'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Measure-based modeling.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Measure-based modeling"

1

Yager, Ronald R. "On the Fusion of Multiple Measure Based Belief Structures." International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems 26, Suppl. 2 (December 2018): 63–88. http://dx.doi.org/10.1142/s0218488518400123.

Abstract:
We introduce the concept of a fuzzy measure and describe the process of combining fuzzy measures to form new measures. We discuss the role of fuzzy measures in modeling uncertain information and their use in modeling granular uncertain information with the aid of measure based belief structures. We turn to the problem of fusing multiple measure based belief structures. First we look at the case when the belief structures being fused have the same focal elements. Then we turn to the case where the structures being fused have different focal elements. Finally we compare measure-based fusion with Dempster’s rule.
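For readers unfamiliar with Dempster's rule, which this abstract uses as a point of comparison, the classical rule of combination can be sketched in a few lines. This is an illustrative implementation of the standard rule only, not Yager's measure-based fusion; the mass functions below are made-up examples.

```python
def dempster_combine(m1, m2):
    """Combine two basic mass assignments (dict: frozenset -> mass)
    using Dempster's classical rule of combination."""
    combined, conflict = {}, 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:  # non-empty intersection contributes mass
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:      # empty intersection accumulates conflicting mass
                conflict += ma * mb
    if conflict >= 1.0:
        raise ValueError("Total conflict: the rule is undefined.")
    # Renormalize by the non-conflicting mass
    return {s: m / (1.0 - conflict) for s, m in combined.items()}

# Hypothetical belief structures over the frame {a, b}
m1 = {frozenset({"a"}): 0.6, frozenset({"a", "b"}): 0.4}
m2 = {frozenset({"b"}): 0.5, frozenset({"a", "b"}): 0.5}
fused = dempster_combine(m1, m2)
```

Here the conflicting mass (0.6 × 0.5 assigned to the empty intersection) is discarded and the remainder renormalized, which is exactly the behavior measure-based fusion approaches revisit.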
2

Wu, Yadong, Hongying Zhang, and Ran Duan. "Total Variation Based Perceptual Image Quality Assessment Modeling." Journal of Applied Mathematics 2014 (2014): 1–10. http://dx.doi.org/10.1155/2014/294870.

Abstract:
Visual quality measurement is one of the fundamental and important issues in numerous applications of image and video processing. In this paper, based on the assumption that the human visual system is sensitive to image structures (edges) and image local luminance (light stimulation), we propose a new perceptual image quality assessment (PIQA) measure based on the total variation (TV) model (TVPIQA) in the spatial domain. The proposed measure compares TVs between a distorted image and its reference image to represent the loss of image structural information. Because of the good performance of the TV model in describing edges, the proposed TVPIQA measure can represent image structure information very well. In addition, the energy of enclosed regions in the difference image between the reference image and its distorted image is used to measure the missing luminance information, to which the human visual system is sensitive. Finally, we validate the performance of the TVPIQA measure on the Cornell-A57, IVC, TID2008, and CSIQ databases and show that TVPIQA outperforms recent state-of-the-art image quality assessment measures.
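The core comparison in this abstract, the total variation of a reference image versus a distorted one, can be illustrated with a toy sketch. This is a simplified anisotropic TV on a tiny array, not the authors' full TVPIQA measure; the example images are made up.

```python
import numpy as np

def total_variation(img):
    # Anisotropic TV: sum of absolute vertical and horizontal pixel differences
    return np.abs(np.diff(img, axis=0)).sum() + np.abs(np.diff(img, axis=1)).sum()

def tv_gap(reference, distorted):
    # Crude proxy for loss (or spurious gain) of structural information
    return abs(total_variation(reference) - total_variation(distorted))

flat = np.zeros((4, 4))     # structure-free image, TV = 0
edges = np.eye(4)           # diagonal edge pattern, TV = 12
print(tv_gap(flat, edges))  # -> 12.0
```

A large TV gap signals that edge structure differs substantially between the two images, which is the intuition the paper builds its perceptual measure on.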
3

Abdelouahad, Abdelkaher Ait, Mohammed El Hassouni, Hocine Cherifi, and Driss Aboutajdine. "A New Image Distortion Measure Based on Natural Scene Statistics Modeling." International Journal of Computer Vision and Image Processing 2, no. 1 (January 2012): 1–15. http://dx.doi.org/10.4018/ijcvip.2012010101.

Abstract:
In the field of Image Quality Assessment (IQA), this paper examines a Reduced Reference (RRIQA) measure based on the bi-dimensional empirical mode decomposition. The proposed measure belongs to the Natural Scene Statistics (NSS) modeling approaches. First, the reference image is decomposed into Intrinsic Mode Functions (IMFs); the authors then use the Generalized Gaussian Density (GGD) to model the distribution of IMF coefficients. At the receiver side, the same number of IMFs is computed on the distorted image, and quality assessment is then performed by measuring the fitting error between the IMF coefficients histogram of the distorted image and the GGD estimate of the IMF coefficients of the reference image, using the Kullback-Leibler Divergence (KLD). In addition, the authors propose a new Support Vector Machine-based classification approach to evaluate the performance of the proposed measure instead of logistic-function-based regression. Experiments were conducted on the LIVE dataset.
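The distance used in this abstract, the Kullback-Leibler divergence between a coefficient histogram and a fitted density, reduces in discrete form to a few lines. This is an illustrative sketch only; the paper first fits a Generalized Gaussian, whereas the histograms below are made-up placeholders.

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """Discrete Kullback-Leibler divergence D(p || q) between two
    histograms, each normalized to a probability distribution."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    p, q = p / p.sum(), q / q.sum()
    # eps guards against log(0) for empty histogram bins
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

hist = [10, 20, 40, 20, 10]    # e.g. coefficient histogram of a distorted image
model = [12, 22, 36, 18, 12]   # e.g. bin masses from a fitted reference density
score = kl_divergence(hist, model)
```

A score of zero means the distorted image's coefficient statistics match the reference model exactly; larger scores indicate more distortion.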
4

Kah, Samah El, Siham Aqel, My Abdelouahed Sabri, and Abdellah Aarab. "Background Modeling Method Based On Quad Tree Decomposition and Contrast Measure." Procedia Computer Science 148 (2019): 610–17. http://dx.doi.org/10.1016/j.procs.2019.01.034.

5

Chen, Yan Hui, De Jian Zhou, and Zhao Hua Wu. "Module Similarity Measure Method of Products Based on Assembly Relationship." Applied Mechanics and Materials 607 (July 2014): 721–26. http://dx.doi.org/10.4028/www.scientific.net/amm.607.721.

Abstract:
A module similarity measure method for mechanical and electrical products is studied in this paper. First, the paper proposes the module assembly type coding method and network modeling method, and establishes a weighted network model by taking a typical module as an example; then it studies assembly relationship similarity, node similarity, and network similarity; finally, it provides the module similarity measure method based on the assembly relationship.
6

Belding, Matthew, Alireza Enshaeian, and Piervincenzo Rizzo. "Vibration-Based Approach to Measure Rail Stress: Modeling and First Field Test." Sensors 22, no. 19 (September 30, 2022): 7447. http://dx.doi.org/10.3390/s22197447.

Abstract:
This paper describes a non-invasive inspection technique for the estimation of longitudinal stress in continuous welded rails (CWR) to infer the rail neutral temperature (RNT), i.e., the temperature at which the net longitudinal force in the rail is zero. The technique is based on the use of the finite element method (FEM), vibration measurements, and machine learning (ML). FEM is used to model the relationship between the boundary conditions and longitudinal stress of any given CWR and the vibration characteristics (mode shapes and frequencies) of the rail. The results of the numerical analysis are used to train a ML algorithm that is then tested using field data obtained by an array of accelerometers polled on the track of interest. In the study presented in this article, the proposed technique was proven in the field during an experimental campaign conducted in Colorado. A commercial FEM software was used to model the rail track as a short rail segment repeated indefinitely, under varying boundary conditions and stress. Three datasets were prepared and fed to ML models developed using hyperparameter search optimization techniques and k-fold cross-validation to infer the stress or the RNT. The frequencies of vibration were extracted from the time waveforms obtained from two accelerometers temporarily attached to the rail. The results of the experiments demonstrated that the success of the technique depends on the accuracy of the model and the ability to properly identify the mode shapes. The results also proved that the ML model was able to successfully predict the neutral temperature of the tested rail using only a limited amount of experimental data for training.
7

Hartge, Florian, Thomas Wetter, and Walter E. Haefeli. "A similarity measure for case based reasoning modeling with temporal abstraction based on cross-correlation." Computer Methods and Programs in Biomedicine 81, no. 1 (January 2006): 41–48. http://dx.doi.org/10.1016/j.cmpb.2005.10.005.

8

Du, Ao, Jamie E. Padgett, and Abdollah Shafieezadeh. "Influence of intensity measure selection on simulation-based regional seismic risk assessment." Earthquake Spectra 36, no. 2 (February 11, 2020): 647–72. http://dx.doi.org/10.1177/8755293019891717.

Abstract:
This study investigates the influence of intensity measure (IM) selection on simulation-based regional seismic risk assessment (RSRA) of spatially distributed structural portfolios. First, a co-simulation method for general spectral averaging vector IMs is derived. Then a portfolio-level surrogate demand modeling approach, which incorporates seismic demand estimation for the non-collapse and collapse states, is proposed. The derived IM co-simulation method enables the first comparative study of different IMs, including conventional IMs and some more advanced scalar and vector IMs, in the context of RSRA. The influence of IM selection on the predictive performance of the portfolio-level surrogate demand models, as well as on the regional seismic risk estimates, is explored using a virtual spatially distributed structural portfolio subjected to a scenario earthquake. The results of this study provide pertinent insights into surrogate demand modeling, IM co-simulation, and IM selection, which can facilitate more accurate and reliable regional seismic risk estimates.
9

Fang, Li Yong, Hui Li, and Jin Ping Bai. "Defect Contour Matching Based on Similarity Measure for 3D Reconstruction." Advanced Materials Research 308-310 (August 2011): 1656–61. http://dx.doi.org/10.4028/www.scientific.net/amr.308-310.1656.

Abstract:
Contour matching is one of the important problems in the field of 3-D reconstruction. Given the difficulties of defect contour matching in defect modeling, a method based on similarity measure is presented in this paper. In this method, the theory of similarity measure is introduced to quantitatively describe the similarity of two contours, and the value of the similarity measure is set as the criterion to judge the matching relation between two contours in consecutive slices. To reduce computational complexity and improve the accuracy of contour matching, a candidate matching field of contours is proposed. The efficiency of this algorithm has been verified by a typical example, and satisfactory results have been obtained.
10

Wang, Xiao-gang, Li-wei Huang, and Ying-wei Zhang. "Modeling and monitoring of nonlinear multi-mode processes based on similarity measure-KPCA." Journal of Central South University 24, no. 3 (March 2017): 665–74. http://dx.doi.org/10.1007/s11771-017-3467-z.


Dissertations / Theses on the topic "Measure-based modeling"

1

Wang, Chen. "A NEW SIMULATION-BASED CONFLICT INDICATOR AS A SURROGATE MEASURE OF SAFETY." UKnowledge, 2012. http://uknowledge.uky.edu/ce_etds/3.

Abstract:
Traffic safety is one of the most essential aspects of transportation engineering. However, most crash prediction models are statistically-based prediction methods, which require significant efforts in crash data collection and may not be applied in particular traffic environments due to the limitation of data sources. Traditional traffic conflict studies are mostly field-based studies depending on manual counting, which is also labor-intensive and oftentimes inaccurate. Nowadays, simulation tools are widely utilized in traffic conflict studies. However, there is not a surrogate indicator that is widely accepted in conflict studies. The primary objective of this research is to develop such a reliable surrogate measure for simulation-based conflict studies. An indicator named Aggregated Crash Propensity Index (ACPI) is proposed to address this void. A Probabilistic model named Crash Propensity Model (CPM) is developed to determine the crash probability of simulated conflicts by introducing probability density functions of reaction time and maximum braking rates. The CPM is able to generate the ACPI for three different conflict types: crossing, rear-end and lane change. A series of comparative and field-based analysis efforts are undertaken to evaluate the accuracy of the proposed metric. Intersections are simulated with the VISSIM micro simulation and the output is processed through SSAM to extract useful conflict data to be used as the entry into CPM model. In the comparative analysis, three studies are conducted to evaluate the safety effect of specific changes in intersection geometry and operations. The comparisons utilize the existing Highway Safety Manual (HSM) processes to determine whether ACPI can identify the same trends as those observed in the HSM. The ACPI outperforms time-to-collision-based indicators and tracks the values suggested by the HSM in terms of identifying the relative safety among various scenarios. 
In the field-based analysis, Spearman’s rank tests indicate that ACPI is able to identify the relative safety among traffic facilities/treatments. Moreover, ACPI-based prediction models are well fitted, suggesting their potential to be directly linked to real crashes. All efforts indicate that ACPI is a promising surrogate measure of safety for simulation-based studies.
2

Ghanipoor Machiani, Sahar. "Modeling Driver Behavior at Signalized Intersections: Decision Dynamics, Human Learning, and Safety Measures of Real-time Control Systems." Diss., Virginia Tech, 2015. http://hdl.handle.net/10919/71798.

Abstract:
Traffic conflicts associated with signalized intersections are one of the major contributing factors to crash occurrences. Driver behavior plays an important role in the safety concerns related to signalized intersections. In this research effort, the dynamics of driver behavior in relation to traffic conflicts occurring at the onset of yellow are investigated. The area ahead of an intersection in which drivers encounter a dilemma whether to pass through or stop when the yellow light commences is called the Dilemma Zone (DZ). Several DZ-protection algorithms and advance signal settings have been developed to address DZ-related safety concerns. The focus of this study is on drivers' decision dynamics, human learning, and choice behavior in the DZ, and on DZ-related safety measures. First, factors influencing drivers' decisions in the DZ were determined using a driver behavior survey. This information was applied to design an adaptive experiment in a driving simulator study. Scenarios in the experimental design aim to capture drivers' learning process while experiencing safe and unsafe signal settings. The results of the experiment revealed that drivers do learn from some of their experience. However, this learning process led to a higher level of risk-aversion behavior. Therefore, DZ-protection algorithms, independent of their approach, need not be concerned about drivers' learning affecting their protection procedure. Next, the possibility of predicting drivers' decisions in different time frames using different datasets was examined. The results showed a promising prediction model if the data collection period is assumed to be 3 seconds after yellow onset. The prediction model serves advance signal protection algorithms to make more intelligent decisions. In the next step, a novel Surrogate Safety Number (SSN) was introduced based on the concept of time to collision.
This measure is applicable to evaluating different DZ-protection algorithms regardless of their embedded methodology, and it has the potential to be used in developing new DZ-protection algorithms. Last, an agent-based human learning model was developed integrating machine learning and human learning techniques. An abstracted model of human memory and cognitive structure was used to model agents' behavior and learning. The model was applied to the DZ decision-making process, and agents were trained using the driving simulator data. The human learning model resulted in lower and faster-merging errors in mimicking drivers' behavior compared to a pure machine learning technique.
Ph.D.
3

Coleman, Mary Angela. "Construct Validity Evidence Based on Internal Structure: Exploring and Comparing the Use of Rasch Measurement Modeling and Factor Analysis with a Measure of Student Motivation." VCU Scholars Compass, 2006. http://hdl.handle.net/10156/1425.

4

Lin, Yun. "Task-based Robotic Grasp Planning." Scholar Commons, 2014. https://scholarcommons.usf.edu/etd/5361.

Abstract:
Grasp should be selected intelligently to fulfill different stability properties and manipulative requirements. Currently, most grasping approaches consider only pick-and-place tasks without any physical interaction with other objects or the environment, which are common in an industry setting with limited uncertainty. When robots move to our daily-living environment and perform a broad range of tasks in an unstructured environment, all sorts of physical interactions will occur, which will result in random physical interactive wrenches: forces and torques on the tool. In addition, for a tool to perform a required task, certain motions need to occur. We call it "functional tool motion," which represents the innate function of the tool and the nature of the task. Grasping with a robotic hand gives flexibility in "mounting" the tool onto the robotic arm - a different grasp will connect the tool to the robotic arm with a different hand posture, then the inverse kinematics approach will result in a different joint motion of the arm in order to achieve the same functional tool motion. Thus, the grasp and the functional tool motion decide the manipulator's motion, as well as the effort to achieve the motion. Therefore, we propose to establish two objectives to serve the purpose of a grasp: the grasp should maintain a firm grip and withstand interactive wrenches on the tool during the task; and the grasp should enable the manipulator to carry out the task most efficiently with little motion effort, and then search for a grasp to optimize both objectives. For this purpose, two grasp criteria are presented to evaluate the grasp: the task wrench coverage criterion and the task motion effort criterion. The two grasp criteria are used as objective functions to search for the optimal grasp for grasp planning. 
To reduce the computational complexity of the search in high-dimensional robotic hand configuration space, we propose a novel grasp synthesis approach that integrates two human grasp strategies - grasp type, and thumb placement (position and direction) - into grasp planning. The grasping strategies abstracted from humans should meet two important criteria: they should reflect the demonstrator's intention, and they should be general enough to be used by various robotic hand models. Different abstractions of human grasp constrain the grasp synthesis and narrow down the solutions of grasp generation to different levels. If a strict constraint is imposed, such as defining all contact points of the fingers on the object, the strategy loses flexibility and becomes rarely achievable for a robotic hand with a different kinematic model. Thus, the choice of grasp strategies should balance the learned constraints and required flexibility to accommodate the difference between a human hand and a robotic hand. The human strategies of grasp type and thumb placement have such a balance while conveying important human intents to the robotic grasping. The proposed approach has been thoroughly evaluated both in simulation and on a real robotic system for multiple objects that would be encountered in daily living.
5

Lauer, Benjamin. "Exploiting space-based optical and radar imagery to measure and model tectonic deformation in continental areas." Thesis, Université de Paris (2019-....), 2019. http://www.theses.fr/2019UNIP7089.

Abstract:
This work aims to illustrate the value of satellite imagery for constraining and modeling tectonic deformation. Tectonic deformation is a signature of fault behavior and a key element in understanding the mechanics of fault systems and the corresponding hazard. We especially intend to demonstrate the benefit of 1) combining satellite optical and radar data to measure coseismic deformation in 3D and provide constraints to model the geometric and kinematic properties of faults, and 2) enhancing the temporal coverage of measurements by using historical satellite images to quantify slow deformation over time. The Balochistan earthquake was dominated by left-lateral slip, with a secondary reverse component. By combining optical (SPOT 5, Landsat 8) and radar satellite data (RADARSAT-2, TerraSAR-X ScanSAR), we derive the full 3D coseismic displacement field and the slip distribution at the surface. This extensive dataset also allows us to explore the fault geometry and the slip distribution at depth. A few segments of the strike-slip Chaman fault, in Pakistan, are prone to shallow aseismic creep at a rate of ~1 cm/yr. We present the current status of an ongoing project that aims to enable creep-rate measurements from Corona historical images. Both the atypical acquisition procedure of these images (panoramic pushbroom) and the lack of metadata impose a reassessment of part of the photogrammetric processing. We thus present an implementation of a camera model and a fully automated method to compute Ground Control Points for Corona images using current SPOT 6/7 imagery, allowing us to calibrate the camera model.
6

Okhrati, Ramin. "Credit Risk Modeling under Jump Processes and under a Risk Measure-Based Approach." Thesis, 2011. http://spectrum.library.concordia.ca/35962/1/Okhrati_PhD_F2011.pdf.

Abstract:
Having a precise idea of how information is used is a key element in studying credit risk models. This concept plays an important role in structural and reduced form models and most recently in information based models. In this thesis the relationship between these models and the idea of information, more specifically through filtration expansions, is studied in depth. Special attention is given to the study of intensity processes under different types of filtration expansions. Credit derivatives are path dependent financial products. Therefore their analysis is based on the history of the underlying risky process. If the underlying process is allowed to have jumps, then this analysis is more challenging. This explains why, normally, risk management techniques for these products assume that the underlying process is continuous, the derivative is path independent, or the probability measure is risk neutral. In our model, in the context of a locally risk-minimization approach, the problems of pricing and hedging of defaultable claims are discussed without imposing any of the above assumptions. The impact of risk measures in financial markets can no longer be ignored. Considering this, a methodological procedure based on risk measures is developed to gauge the credit quality of defaultable bonds in real bond markets. Through this process a new type of indicator is introduced that can be useful to detect inconsistencies in bond markets. This can be helpful in market integration applications.
7

Ghobadi, Razieh. "Geostatistical modelling and simulation of karst systems." Thesis, 2016. http://hdl.handle.net/2440/103460.

Abstract:
Groundwater is a significant water resource and in many parts of the world it occurs in karst aquifers. The modelling of karst systems is a critical component of groundwater resource assessment and flow. Geostatistical techniques have shown useful applications in the area of groundwater research because of their ability to quantify spatial variability, uncertainty and risk. Traditional geostatistical methods, based on variogram models, use only two-point statistics and thus are not capable of modelling the complex, high-connectivity structures of karst networks. This has led to an increasing focus on spatial multiple-point statistics (MPS) to model these complex systems. In this approach, a training image is used instead of a variogram. Patterns are obtained by scanning and sampling the training image and during the simulation they are reproduced using MPS. There are two implementations of MPS: (i) gridded and (ii) non-gridded. In gridded MPS, the training image, templates and simulations are based on rigid grids, whereas the spatially flexible non-gridded approach does not depend on rigidly specified grids. The non-gridded approach is relatively new (Erzeybek Balan 2012), and applications, especially in hydrogeology, are few; however, the method has been used to simulate paleokarsts in petroleum applications. Non-gridded MPS has the potential to improve the modelling of karst systems by replacing the fixed gridding procedure, used in the original form of MPS, with a more flexible grid adapted to each specific application. However, there are some weaknesses in the non-gridded approach reported in the literature. For example, the proposed template cannot properly represent the tortuous nature of a network, and the variation of the passage widths is not taken into account. In the case of a simple channelised system with a constant width, sampling the central line of the passages is sufficient; however, most karst systems have networks with significantly varying widths.
In addition, the variability among the realisations generated by non-gridded MPS is relatively small, indicating that the realisations do not cover the full space of uncertainty. In practical applications, it is not possible to know the exact extent of the full space of uncertainty, but the observed variability of the geology and geomorphology of similar structures would tell us when the variability among the simulations is too small (or too large). A lack of significant variability among simulated realisations makes the method inapplicable. This thesis presents a modified non-gridded MPS method that increases the variability among realisations and adequately captures the tortuosities of karst networks. To do this, it includes the width and constructs an optimal template based on a representative variety of directions adapted to each network instead of considering only a few major directions using a generic template as applied by Erzeybek Balan (2012). The performance of Erzeybek Balan’s (2012) non-gridded MPS method has only been visually demonstrated, which is not a sufficiently robust measure of performance. In this thesis, a systematic measure is developed to evaluate the variability among the realisations. This provides an objective way of comparing an important feature of the simulations generated by gridded MPS and the proposed modified non-gridded MPS. The research starts with an investigation and modification of non-gridded MPS. A widely used demonstration image, which is based on a channelised system, is used to compare the performances of the original non-gridded MPS (Erzeybek Balan 2012) and the modified version proposed in this thesis. A distance-based measure is used to evaluate and compare pattern reproduction and the variability of the realisations generated by the modified non-gridded MPS and standard gridded MPS methods. This distance measure can be used to compare the multiple-point histograms of the realisations and training images. 
Gridded MPS and modified non-gridded MPS are then applied to two different karst systems—Olwolgin Cave and Tank Cave—and the realisations generated by each method are evaluated in terms of pattern reproduction and the extent of the uncertainty space. The comparison examples demonstrate that the proposed modified non-gridded MPS generates a larger uncertainty space than that generated by gridded MPS. The results also confirm that modified non-gridded MPS performs significantly better than the original version of non-gridded MPS in terms of a larger (and more realistic) space of uncertainty and pattern reproduction when applied to a complex karst system.
Thesis (M.Phil.) -- University of Adelaide, School of Civil, Environmental and Mining Engineering, 2016.

Books on the topic "Measure-based modeling"

1

Öhreneder, Christian. A similarity measure for global image matching based on the forward modeling principle. Wien: Institut für Photogrammetrie und Fernerkundung, 1999.

2

Balackiy, Evgeniy, Natal'ya Ekimova, Aleksandr Rudnev, and Aleksandr Gusev. New approaches to modeling economic development. ru: INFRA-M Academic Publishing LLC., 2022. http://dx.doi.org/10.12737/1862597.

Abstract:
The monograph presents new results of the authors' long-term research on various topical issues of economic development. All the proposed new approaches are placed in the broad context of existing theories and models and are illustrated with numerous vivid examples from the history of different countries. Most of the topics covered belong to the most pressing social issues of our time, which lends the work scientific freshness and an element of discussion. All the fundamental theses are accompanied by the necessary models, equations, formulas, graphs, and figures, but overall the material is not overloaded with technical details, which makes it accessible to any interested reader. A peculiarity of the monograph is that all its sections are built on the "paradox principle," the essence of which is to formulate the original problem in its most acute form, as a logical paradox. The range of topics covers the history of humankind from antiquity to the present day. For example, why did humanity, which had vegetated in the Malthusian trap for ten thousand years, break out of it at the turn of the 17th and 18th centuries? What is needed so that economic growth, once begun, does not "choke" within a short time and degenerate again into prolonged stagnation? How are economic growth and the return on capital related? How are income inequality and a country's investment activity related? How can the dialectical properties of institutions, which presuppose both order and freedom, be measured and linked in practice? Is it possible to diagnose "failures" in the regulatory activities of central banks? How can the remarkable technological creativity of Russian researchers and engineers be reconciled with Russia's systematic technological lag behind Western countries? Does Russia have a chance to join the club of the most developed and prosperous countries in the world, and what is needed for this?
And much more. The monograph is addressed both to professional specialists and to everyone interested in modern problems of human development.
APA, Harvard, Vancouver, ISO, and other styles
3

Railsback, Steven F., and Bret C. Harvey. Modeling Populations of Adaptive Individuals. Princeton University Press, 2020. http://dx.doi.org/10.23943/princeton/9780691195285.001.0001.

Full text
Abstract:
Ecologists now recognize that the dynamics of populations, communities, and ecosystems are strongly affected by adaptive individual behaviors. Yet until now, we have lacked effective and flexible methods for modeling such dynamics. Traditional ecological models become impractical with the inclusion of behavior, and the optimization approaches of behavioral ecology cannot be used when future conditions are unpredictable due to feedbacks from the behavior of other individuals. This book provides a comprehensive introduction to state- and prediction-based theory, or SPT, a powerful new approach to modeling trade-off behaviors in contexts such as individual-based population models where feedbacks and variability make optimization impossible. This book features a wealth of examples that range from highly simplified behavior models to complex population models in which individuals make adaptive trade-off decisions about habitat and activity selection in highly heterogeneous environments. The book explains how SPT builds on key concepts from the state-based dynamic modeling theory of behavioral ecology, and how it combines explicit predictions of future conditions with approximations of a fitness measure to represent how individuals make good—not optimal—decisions that they revise as conditions change. The resulting models are realistic, testable, adaptable, and invaluable for answering fundamental questions in ecology and forecasting ecological outcomes of real-world scenarios.
APA, Harvard, Vancouver, ISO, and other styles
4

Ebstein, Richard P., Songfa Zhong, Robin Chark, Poh San Lai, and Soo Hong Chew. Modeling the Genetics of Social Cognition in the Laboratory. Edited by Turhan Canli. Oxford University Press, 2014. http://dx.doi.org/10.1093/oxfordhb/9780199753888.013.017.

Full text
Abstract:
This chapter examines recent advances in the genetics of social cognition, discussing evidence from twin studies that confirms the relevance of genetic hard-wiring to understanding many social phenotypes, with important implications for the social sciences and for genome-wide association studies (GWAS) that may identify specific genes contributing to a wide range of social phenotypes, genoeconomics, and individual and social decision making. Stressing the importance of phenotype definition and precise measurement as keys to success in GWAS, the authors argue that laboratory-based behavioral economic paradigms using ethnically homogeneous student populations offer the best prospects for successful GWAS. Also discussed are the neurochemical and neurogenetic architecture of behavioral economic games that measure individual and social decision making, the considerable progress made in unraveling the neurogenetics of human parenting, and the emerging neuroscience of political attitudes. The authors' own GWAS is used to present a set of guidelines for future research directions.
APA, Harvard, Vancouver, ISO, and other styles
5

Aguilera-Cobos, Lorena, Rebeca Isabel-Gómez, and Juan Antonio Blasco-Amaro. Efectividad de la limitación de la movilidad en la evolución de la pandemia por Covid-19. AETSA Área de Evaluación de Tecnologías Sanitarias de Andalucía, Fundación Progreso y salud. Consejería de Salud y Familias. Junta de Andalucía, 2022. http://dx.doi.org/10.52766/pyui7071.

Full text
Abstract:
Introduction: During the Covid-19 pandemic, non-pharmacological interventions (NPIs) aimed to minimise the spread of the virus as much as possible to avoid the most severe cases and the collapse of health systems. These measures included mobility restrictions in several countries, including Spain. Objective: To assess the impact of mobility restrictions on incidence, transmission, severe cases, and mortality in the evolution of the Covid-19 pandemic. These restrictions include: • Mandatory home confinement. • Recommendation to stay at home. • Perimeter closures for entry and/or exit from established areas. • Restriction of night-time mobility (curfew). Methodology: Systematic literature review, including documents from official bodies, systematic reviews, and meta-analyses. The following reference databases were consulted up to October 2021 (free and controlled language): Medline, EMBASE, Cochrane Library, TripDB, Epistemonikos, Royal College of London, COVID-END, COVID-19 Evidence Reviews, WHO, ECDC, and CDC. Study selection and quality analysis were performed by two independent researchers. References were filtered first by title and abstract and then by full text in the Covidence tool, using a priori inclusion and exclusion criteria. The results were synthesised qualitatively. The quality of the included studies was assessed using the AMSTAR-II tool. Results: The literature search identified 642 studies, of which 38 were excluded as duplicates. Of the 604 potentially relevant studies, 12 (10 systematic reviews and 2 official agency papers) were included in the analysis after filtering. One of the official agency papers was from the European Centre for Disease Prevention and Control (ECDC) and the other from the Ontario Agency for Health Promotion and Protection (OHP).
The quality assessment of the included systematic reviews with the AMSTAR-II tool yielded 3 reviews of moderate quality, 6 of low quality, and 1 of critically low quality. The interventions analysed in the included studies were divided into two categories: the first comprised mandatory home confinement, the recommendation to stay at home, and curfew; the second comprised perimeter closure of entry and/or exit (local, cross-community, national, or international). This division was made because the included reviews analysed mandatory home confinement, advice to stay at home, and curfew together, without a disaggregated analysis being possible. The systematic reviews included for the evaluation of home confinement, stay-at-home advice, and curfew report a decrease in incidence, transmission, and severe cases following the implementation of mobility-limitation interventions compared with a no-measure comparator. These conclusions are supported by the quantitative or qualitative results of the studies they include. All reviews also emphasise that, to increase the effectiveness of these restrictions, it is necessary to combine them with other public health measures. In the systematic reviews included for the assessment of entry and/or exit perimeter closures, most of the underlying studies were modelling studies based on mathematical models. All systematic reviews report a decrease in incidence, transmission, and severe cases following the implementation of travel-restriction interventions. The great heterogeneity of the travel restrictions applied, such as travel bans, border closures, passenger testing or screening, mandatory quarantine of travellers, or optional recommendations for travellers to stay at home, makes data analysis and evaluation of the interventions difficult.
Conclusions: Mobility restrictions were among the main NPI measures implemented during the Covid-19 pandemic. It can be concluded from the review that there is evidence of a positive impact of NPIs on the course of the pandemic. The heterogeneity of the data from the included studies and their low quality make it difficult to assess the effectiveness of mobility limitations in a disaggregated manner. Despite this, all the included reviews show a decrease in incidence, transmission, hospitalisations, and deaths following the application of the measures under study. The measures were more effective when restrictions were implemented earlier in the pandemic, applied for a longer period, and enforced more rigorously.
APA, Harvard, Vancouver, ISO, and other styles

Book chapters on the topic "Measure-based modeling"

1

Cristiani, Emiliano, Benedetto Piccoli, and Andrea Tosin. "Basic Theory of Measure-Based Models." In Multiscale Modeling of Pedestrian Dynamics, 137–68. Cham: Springer International Publishing, 2014. http://dx.doi.org/10.1007/978-3-319-06620-2_6.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Hussain, Tauqeer, Mian M. Awais, and Shafay Shamail. "A Fuzzy Based Approach to Measure Completeness of an Entity-Relationship Model." In Perspectives in Conceptual Modeling, 410–22. Berlin, Heidelberg: Springer Berlin Heidelberg, 2005. http://dx.doi.org/10.1007/11568346_44.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Gómez-Alonso, Cristina, and Aida Valls. "A Similarity Measure for Sequences of Categorical Data Based on the Ordering of Common Elements." In Modeling Decisions for Artificial Intelligence, 134–45. Berlin, Heidelberg: Springer Berlin Heidelberg, 2008. http://dx.doi.org/10.1007/978-3-540-88269-5_13.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Cheng, Cheng, and Min-Sen Chiu. "Nonlinear Process Modeling Based on Just-in-Time Learning and Angle Measure." In Lecture Notes in Computer Science, 1311–18. Berlin, Heidelberg: Springer Berlin Heidelberg, 2003. http://dx.doi.org/10.1007/978-3-540-45224-9_177.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Liu, Hsiang-Chuan. "A Novel Choquet Integral Composition Forecasting Model for Time Series Data Based on Completed Extensional L-Measure." In Time Series Analysis, Modeling and Applications, 119–37. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-33439-9_6.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Tang, Jian, Li-Jie Zhao, Shao-Wei Liu, and Dong Yan. "Ensemble Modeling Difficult-to-Measure Process Variables Based the PLS-LSSVM Algorithm and Information Entropy." In Lecture Notes in Electrical Engineering, 977–84. Berlin, Heidelberg: Springer Berlin Heidelberg, 2011. http://dx.doi.org/10.1007/978-3-642-21697-8_125.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Varghese, Abraham, Shajidmon Kolamban, Vinu Sherimon, Eduardo M. Lacap, Saad Salman Ahmed, Jagath Prasad Sreedhar, Syed Rafeek Ahmed, Hasina Al Harthy, and Huda Salim Al Shuaily. "Modeling Control Measure Score of COVID-19 Outbreak Using Fuzzy c-Means-Based Adaptive Neuro-Fuzzy Inference System." In Information and Communication Technology for Competitive Strategies (ICTCS 2020), 993–1003. Singapore: Springer Singapore, 2021. http://dx.doi.org/10.1007/978-981-16-0739-4_92.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Pham, Phu, Phuc Do, and Chien D. C. Ta. "W-PathSim: Novel Approach of Weighted Similarity Measure in Content-Based Heterogeneous Information Networks by Applying LDA Topic Modeling." In Intelligent Information and Database Systems, 539–49. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-75417-8_51.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Jallais, Maëliss, and Demian Wassermann. "Single Encoding Diffusion MRI: A Probe to Brain Anisotropy." In Mathematics and Visualization, 171–91. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-56215-1_8.

Full text
Abstract:
This chapter covers anisotropy in the context of probing the microstructure of the human brain using single-encoded diffusion MRI. We start by illustrating how diffusion MRI is a technique perfectly adapted to measuring anisotropy in the human brain via water motion, followed by a biological presentation of the human brain. The non-invasive imaging technique based on water motion known as diffusion MRI is then presented in more detail, along with the difficulties that come with it. Within this context, we first review and discuss signal-representation methods that give insight into microstructure anisotropy. We then outline modeling-based methods, which are the state of the art for estimating parameters of human brain tissue.
APA, Harvard, Vancouver, ISO, and other styles
10

Yue, Tao, and Shaukat Ali. "A MOF-Based Framework for Defining Metrics to Measure the Quality of Models." In Modelling Foundations and Applications, 213–29. Cham: Springer International Publishing, 2014. http://dx.doi.org/10.1007/978-3-319-09195-2_14.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Conference papers on the topic "Measure-based modeling"

1

Ji, Xiaogang, and Nan Li. "ACO-Based multiple geometric elements measure path planning for CMM." In 2010 International Conference on Computer Application and System Modeling (ICCASM 2010). IEEE, 2010. http://dx.doi.org/10.1109/iccasm.2010.5622163.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Wang, Chang, and Yi Han. "Design of dynamic volume measure system based on binocular vision." In 2010 International Conference on Computer Application and System Modeling (ICCASM 2010). IEEE, 2010. http://dx.doi.org/10.1109/iccasm.2010.5622784.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Ramu, Palaniappan, Nam-Ho Kim, and Raphael T. Haftka. "Inverse measure-based tail modeling approaches for structural reliability estimation." In 48th AIAA/ASME/ASCE/AHS/ASC Structures, Structural Dynamics, and Materials Conference. Reston, Virigina: American Institute of Aeronautics and Astronautics, 2007. http://dx.doi.org/10.2514/6.2007-1947.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Pap, Endre, Jianzhang Wu, and Aniko Szakal. "Preference information modeling by empty interaction index based on monotone measure." In 2015 16th IEEE International Symposium on Computational Intelligence and Informatics (CINTI). IEEE, 2015. http://dx.doi.org/10.1109/cinti.2015.7382950.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Serre, Damien, Serge Lhomme, Bruno Barroca, and Youssef Diab. "Integrating Flood Defence Fragility to Measure Built Environment Vulnerability-A GIS Based Approach." In First International Symposium on Uncertainty Modeling and Analysis and Management (ICVRAM 2011); and Fifth International Symposium on Uncertainty Modeling and Anaylsis (ISUMA). Reston, VA: American Society of Civil Engineers, 2011. http://dx.doi.org/10.1061/41170(400)96.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Wei, Du, Liu Zhong, Xiu Bao-Xin, Zhang Wei-Ming, and Cheng Qing. "Command and Control Network Modeling and Efficiency Measure Based on Capability Weighted-Node." In 2011 IEEE 9th International Conference on Dependable, Autonomic and Secure Computing (DASC). IEEE, 2011. http://dx.doi.org/10.1109/dasc.2011.176.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Kim, Chongman, Jaeyeong Lee, and Sunwoo Shin. "Agent based Simulation Modeling to Measure the Effectiveness of UGV with Communication Repeater." In Sixth International Conference on Advances in Social Science, Management and Human Behaviour - SMHB 2017. Institute of Research Engineers and Doctors, 2017. http://dx.doi.org/10.15224/978-1-63248-141-2-53.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Razmara, Jafar, and Safaai B. Deris. "A novel method for protein 3D-structure similarity measure based on n-gram modeling." In 2008 8th IEEE International Conference on Bioinformatics and BioEngineering (BIBE). IEEE, 2008. http://dx.doi.org/10.1109/bibe.2008.4696719.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Riemenschneider, Johannes. "Characterization and Modeling of CNT Based Actuators." In ASME 2008 Conference on Smart Materials, Adaptive Structures and Intelligent Systems. ASMEDC, 2008. http://dx.doi.org/10.1115/smasis2008-373.

Full text
Abstract:
In order to understand the general characteristics of carbon nanotube (CNT) based actuators, the system response of the actuator was analyzed. Special techniques were developed to generate a reproducible characteristic measure for the material: the R-Curve. In addition, the dynamic response of the system was evaluated in different states of the actuator. A model was generated to capture the general behavior of the system. Finally, an actuator incorporating a solid electrolyte was built and tested, showing characteristics similar to those of the actuator in aqueous electrolyte.
APA, Harvard, Vancouver, ISO, and other styles
10

Eilers, Kevin, and Juergen Rossmann. "Modeling an AGV based facility logistics system to measure and visualize performance availability in a VR environment." In 2014 Winter Simulation Conference - (WSC 2014). IEEE, 2014. http://dx.doi.org/10.1109/wsc.2014.7019903.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Reports on the topic "Measure-based modeling"

1

Idakwo, Gabriel, Sundar Thangapandian, Joseph Luttrell, Zhaoxian Zhou, Chaoyang Zhang, and Ping Gong. Deep learning-based structure-activity relationship modeling for multi-category toxicity classification : a case study of 10K Tox21 chemicals with high-throughput cell-based androgen receptor bioassay data. Engineer Research and Development Center (U.S.), July 2021. http://dx.doi.org/10.21079/11681/41302.

Full text
Abstract:
Deep learning (DL) has attracted the attention of computational toxicologists as it offers a potentially greater power for in silico predictive toxicology than existing shallow learning algorithms. However, contradicting reports have been documented. To further explore the advantages of DL over shallow learning, we conducted this case study using two cell-based androgen receptor (AR) activity datasets with 10K chemicals generated from the Tox21 program. A nested double-loop cross-validation approach was adopted along with a stratified sampling strategy for partitioning chemicals of multiple AR activity classes (i.e., agonist, antagonist, inactive, and inconclusive) at the same distribution rates amongst the training, validation and test subsets. Deep neural networks (DNN) and random forest (RF), representing deep and shallow learning algorithms, respectively, were chosen to carry out structure-activity relationship-based chemical toxicity prediction. Results suggest that DNN significantly outperformed RF (p < 0.001, ANOVA) by 22–27% for four metrics (precision, recall, F-measure, and AUPRC) and by 11% for another (AUROC). Further in-depth analyses of chemical scaffolds shed light on structural alerts for AR agonists/antagonists and inactive/inconclusive compounds, which may aid in future drug discovery and improvement of toxicity prediction modeling.
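The stratified sampling strategy described in this abstract, partitioning chemicals so that each AR activity class keeps the same distribution rate across the training, validation, and test subsets, can be sketched in a few lines of pure Python. This is an illustrative sketch only, not the authors' implementation; the 70/15/15 split percentages, the function name, and the toy class counts are assumptions.

```python
import random
from collections import Counter, defaultdict

def stratified_three_way_split(labels, pct_train=70, pct_val=15, seed=0):
    """Partition sample indices into train/validation/test subsets so that
    every class appears at (approximately) the same rate in each subset.
    Integer percentages keep the per-class bookkeeping exact."""
    rng = random.Random(seed)
    by_class = defaultdict(list)
    for idx, label in enumerate(labels):
        by_class[label].append(idx)
    train, val, test = [], [], []
    for label in sorted(by_class):      # sorted for reproducibility
        idxs = by_class[label]
        rng.shuffle(idxs)               # random draw within each class
        n = len(idxs)
        n_tr = n * pct_train // 100
        n_va = n * pct_val // 100
        train += idxs[:n_tr]
        val += idxs[n_tr:n_tr + n_va]
        test += idxs[n_tr + n_va:]
    return train, val, test

# Toy labels mimicking the four AR activity classes named in the abstract.
labels = (["agonist"] * 40 + ["antagonist"] * 40
          + ["inactive"] * 200 + ["inconclusive"] * 120)
train, val, test = stratified_three_way_split(labels)
for name, part in (("train", train), ("val", val), ("test", test)):
    print(name, dict(Counter(labels[i] for i in part)))
```

In a nested double-loop cross-validation, a split like this would sit in the outer loop, with hyperparameter selection repeated on further stratified folds of the training portion.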
APA, Harvard, Vancouver, ISO, and other styles
2

McKinnon, Mark, Craig Weinschenk, and Daniel Madrzykowski. Modeling Gas Burner Fires in Ranch and Colonial Style Structures. UL Firefighter Safety Research Institute, June 2020. http://dx.doi.org/10.54206/102376/mwje4818.

Full text
Abstract:
The test scenarios ranged from fires in the structures with no exterior ventilation to room fires with flow paths that connected the fires with remote intake and exhaust vents. In the ranch, two replicate fires were conducted for each room of origin and each ventilation condition. Rooms of fire origin included the living room, bedroom, and kitchen. In the colonial, the focus was on varying the flow paths to examine the change in fire behavior and the resulting damage. No replicates were conducted in the colonial. After each fire scene was documented, the interior finish and furnishings were replaced in affected areas of the structure. Instrumentation was installed to measure gas temperature, gas pressure, and gas movement within the structures. In addition, oxygen sensors were installed to determine when a sufficient level of oxygen was available for flaming combustion. Standard video and firefighting IR cameras were also installed inside of the structures to capture information about the fire dynamics of the experiments. Video cameras were also positioned outside of the structures to monitor the flow of smoke, flames, and air at the exterior vents. Each of the fires was started from a small flaming source. The fires were allowed to develop until they self-extinguished due to a lack of oxygen or until the fire had transitioned through flashover. The times that fires burned post-flashover varied based on the damage occurring within the structure. The goal was to have patterns remaining on the ceiling, walls, and floors post-test. In total, thirteen experiments were conducted in the ranch structure and eight experiments were conducted in the colonial structure. All experiments were conducted at UL's Large Fire Laboratory in Northbrook, IL. Increasing the ventilation available to the fire, in both the ranch and the colonial, resulted in additional burn time, additional fire growth, and a larger area of fire damage within the structures.
These changes are consistent with fire-dynamics-based assessments and were repeatable. Fire patterns within the room of origin led to the area of origin when the ventilation of the structure was considered. Fire patterns generated pre-flashover persisted post-flashover if the ventilation points were remote from the area of origin.
APA, Harvard, Vancouver, ISO, and other styles
3

Smith, S. Jarrell, David W. Perkey, and Kelsey A. Fall. Cohesive Sediment Field Study: James River, Virginia. U.S. Army Engineer Research and Development Center, August 2021. http://dx.doi.org/10.21079/11681/41640.

Full text
Abstract:
Estuaries trap much of the fine sediment delivered to them by rivers. This phenomenon presents challenges to the US Army Corps of Engineers (USACE) navigation mission, which maintains navigable waterways for waterborne commerce through estuarine regions. The USACE Regional Sediment Management Program and the USACE Norfolk District are conducting a regional sediment transport modeling study to identify cost-effective sediment management schemes in the James River, a tributary estuary of Chesapeake Bay. A key element of the sediment transport modeling study is the definition of cohesive sediment transport processes, such as erosion and settling velocity. This report describes field-based measurements of cohesive sediment erosion and settling velocity conducted in November 2017. The team conducted erosion testing on 15 cores collected throughout the tidal system. Additionally, two anchor stations were occupied to measure tidal variations in vertical distributions of suspended sediment concentration, particle size, and settling velocity. Recommended cohesive sediment transport parameters were developed from the field measurements.
APA, Harvard, Vancouver, ISO, and other styles
4

Reis, Evan. Development of Index Buildings, (PEER-CEA Project). Pacific Earthquake Engineering Research Center, University of California, Berkeley, CA, November 2020. http://dx.doi.org/10.55461/fudb2072.

Full text
Abstract:
This report is one of a series of reports documenting the methods and findings of a multi-year, multi-disciplinary project coordinated by the Pacific Earthquake Engineering Research Center (PEER) and funded by the California Earthquake Authority (CEA). The overall project is titled “Quantifying the Performance of Retrofit of Cripple Walls and Sill Anchorage in Single-Family Wood-Frame Buildings,” henceforth referred to as the “PEER–CEA Project.” The overall objective of the PEER–CEA Project is to provide scientifically based information (e.g., testing, analysis, and resulting loss models) that measure and assess the effectiveness of seismic retrofit to reduce the risk of damage and associated losses (repair costs) of wood-frame houses with cripple wall and sill anchorage deficiencies as well as retrofitted conditions that address those deficiencies. Tasks that support and inform the loss-modeling effort are: (1) collecting and summarizing existing information and results of previous research on the performance of wood-frame houses; (2) identifying construction features to characterize alternative variants of wood-frame houses; (3) characterizing earthquake hazard and ground motions at representative sites in California; (4) developing cyclic loading protocols and conducting laboratory tests of cripple wall panels, wood-frame wall subassemblies, and sill anchorages to measure and document their response (strength and stiffness) under cyclic loading; and (5) the computer modeling, simulations, and the development of loss models as informed by a workshop with claims adjustors. This report is a product of Working Group 2: Development of Index Buildings and focuses on the identification of common variations and combinations of materials and construction characteristics of California single-family dwellings. These were used to develop “Index Buildings” that formed the basis of the PEER–CEA Project testing and analytical modeling programs (Working Groups 4 and 5).
The loss modeling component of the Project (Working Group 6) quantified the damage-seismic hazard relationships for each of the Index Buildings.
APA, Harvard, Vancouver, ISO, and other styles
5

Zareian, Farzin, and Joel Lanning. Development of Testing Protocol for Cripple Wall Components (PEER-CEA Project). Pacific Earthquake Engineering Research Center, University of California, Berkeley, CA, November 2020. http://dx.doi.org/10.55461/olpv6741.

Full text
Abstract:
This report is one of a series of reports documenting the methods and findings of a multi-year, multi-disciplinary project coordinated by the Pacific Earthquake Engineering Research Center (PEER) and funded by the California Earthquake Authority (CEA). The overall project is titled “Quantifying the Performance of Retrofit of Cripple Walls and Sill Anchorage in Single-Family Wood-Frame Buildings,” henceforth referred to as the “PEER–CEA Project.” The overall objective of the PEER–CEA project is to provide scientifically-based information (e.g., testing, analysis, and resulting loss models) that measure and assess the effectiveness of seismic retrofit to reduce the risk of damage and associated losses (repair costs) of wood-frame houses with cripple wall and sill anchorage deficiencies as well as retrofitted conditions that address those deficiencies. Tasks that support and inform the loss-modeling effort are: (1) collecting and summarizing existing information and results of previous research on the performance of wood-frame houses; (2) identifying construction features to characterize alternative variants of wood-frame houses; (3) characterizing earthquake hazard and ground motions at representative sites in California; (4) developing cyclic loading protocols and conducting laboratory tests of cripple wall panels, wood-frame wall subassemblies, and sill anchorages to measure and document their response (strength and stiffness) under cyclic loading; and (5) the computer modeling, simulations, and the development of loss models as informed by a workshop with claims adjustors. This report is a product of Working Group 3.2 and focuses on Loading Protocol Development for Component Testing. It presents the background, development process, and recommendations for a quasi-static loading protocol to be used for cyclic testing of cripple wall components of wood-frame structures. 
The recommended loading protocol was developed for component testing to support the development of experimentally informed analytical models for cripple wall components. These analytical models are utilized for the performance-based assessment of wood-frame structures in the context of the PEER–CEA Project. The recommended loading protocol was developed using nonlinear dynamic analysis of representative multi-degree-of-freedom (MDOF) systems subjected to sets of single-component ground motions that varied in location and hazard level. Cumulative damage of the cripple wall components of the MDOF systems was investigated. The result is a testing protocol that captures the loading history that a cripple wall may experience in various seismic regions in California.
APA, Harvard, Vancouver, ISO, and other styles
6

Reis, Evan, Yousef Bozorgnia, Henry Burton, Kelly Cobeen, Gregory Deierlein, Tara Hutchinson, Grace Kang, et al. Project Technical Summary (PEER-CEA Project). Pacific Earthquake Engineering Research Center, University of California, Berkeley, CA, December 2020. http://dx.doi.org/10.55461/feis4651.

Full text
Abstract:
This report is one of a series of reports documenting the methods and findings of a multi-year, multi-disciplinary project coordinated by the Pacific Earthquake Engineering Research Center (PEER) and funded by the California Earthquake Authority (CEA). The overall project is titled “Quantifying the Performance of Retrofit of Cripple Walls and Sill Anchorage in Single-Family Wood-Frame Buildings,” henceforth referred to as the “PEER-CEA Project.” The overall objective of the PEER–CEA project is to provide scientifically based information (e.g., testing, analysis, and resulting loss models) that measure and assess the effectiveness of seismic retrofit to reduce the risk of damage and associated losses (repair costs) of wood-frame houses with cripple wall and sill anchorage deficiencies as well as retrofitted conditions that address those deficiencies. Tasks that support and inform the loss-modeling effort are: (1) collecting and summarizing existing information and results of previous research on the performance of wood-frame houses; (2) identifying construction features to characterize alternative variants of wood-frame houses; (3) characterizing earthquake hazard and ground motions at representative sites in California; (4) developing cyclic loading protocols and conducting laboratory tests of cripple wall panels, wood-frame wall subassemblies, and sill anchorages to measure and document their response (strength and stiffness) under cyclic loading; and (5) the computer modeling, simulations, and the development of loss models as informed by a workshop with claims adjustors. This report is a product of Working Group 7: Reporting and is a summary of the PEER–CEA Project work performed by Working Groups 1–6. This report does not present new information apart from the rest of the project, and its purpose is to serve as a reference for researchers and catastrophe modelers wishing to understand the objectives and key findings of the project. 
The key overall findings of the PEER–CEA Project are summarized in Chapters 8 and 10, which describe the efforts of the WG5 and WG6 Working Groups. The reader is referred to the individual reports prepared by the Working Groups for comprehensive information on the tasks, methodologies, and results of each.
APA, Harvard, Vancouver, ISO, and other styles
7

Cobeen, Kelly, Vahid Mahdavifar, Tara Hutchinson, Brandon Schiller, David Welch, Grace Kang, and Yousef Bozorgnia. Large-Component Seismic Testing for Existing and Retrofitted Single-Family Wood-Frame Dwellings (PEER-CEA Project). Pacific Earthquake Engineering Research Center, University of California, Berkeley, CA, November 2020. http://dx.doi.org/10.55461/hxyx5257.

Full text
Abstract:
This report is one of a series of reports documenting the methods and findings of a multi-year, multi-disciplinary project coordinated by the Pacific Earthquake Engineering Research Center (PEER) and funded by the California Earthquake Authority (CEA). The overall project is titled “Quantifying the Performance of Retrofit of Cripple Walls and Sill Anchorage in Single-Family Wood-Frame Buildings,” henceforth referred to as the “PEER–CEA Project.” The overall objective of the PEER–CEA Project is to provide scientifically based information (e.g., testing, analysis, and resulting loss models) that measure and assess the effectiveness of seismic retrofit to reduce the risk of damage and associated losses (repair costs) of wood-frame houses with cripple wall and sill anchorage deficiencies as well as retrofitted conditions that address those deficiencies. Tasks that support and inform the loss-modeling effort are: (1) collecting and summarizing existing information and results of previous research on the performance of wood-frame houses; (2) identifying construction features to characterize alternative variants of wood-frame houses; (3) characterizing earthquake hazard and ground motions at representative sites in California; (4) developing cyclic loading protocols and conducting laboratory tests of cripple wall panels, wood-frame wall subassemblies, and sill anchorages to measure and document their response (strength and stiffness) under cyclic loading; and (5) the computer modeling, simulations, and the development of loss models as informed by a workshop with claims adjustors. Quantifying the difference of seismic performance of un-retrofitted and retrofitted single-family wood-frame houses has become increasingly important in California due to the high seismicity of the state. Inadequate lateral bracing of cripple walls and inadequate sill bolting are the primary reasons for damage to residential homes, even in the event of moderate earthquakes.
Physical testing tasks were conducted by Working Group 4 (WG4), with testing carried out at the University of California San Diego (UCSD) and University of California Berkeley (UCB). The primary objectives of the testing were as follows: (1) development of descriptions of load-deflection behavior of components and connections for use by Working Group 5 in development of numerical modeling; and (2) collection of descriptions of damage at varying levels of peak transient drift for use by Working Group 6 in development of fragility functions. Both UCSD and UCB testing included companion specimens tested with and without retrofit. This report documents the portions of the WG4 testing conducted at UCB: two large-component cripple wall tests (Tests AL-1 and AL-2), one test of cripple wall load-path connections (Test B-1), and two tests of dwelling superstructure construction (Tests C-1 and C-2). Included in this report are details of specimen design and construction, instrumentation, loading protocols, test data, testing observations, discussion, and conclusions.
APA, Harvard, Vancouver, ISO, and other styles
8

Schiller, Brandon, Tara Hutchinson, and Kelly Cobeen. Cripple Wall Small-Component Test Program: Wet Specimens I (PEER-CEA Project). Pacific Earthquake Engineering Research Center, University of California, Berkeley, CA, November 2020. http://dx.doi.org/10.55461/dqhf2112.

Full text
Abstract:
This report is one of a series of reports documenting the methods and findings of a multi-year, multi-disciplinary project coordinated by the Pacific Earthquake Engineering Research Center (PEER) and funded by the California Earthquake Authority (CEA). The overall project is titled “Quantifying the Performance of Retrofit of Cripple Walls and Sill Anchorage in Single-Family Wood-Frame Buildings,” henceforth referred to as the “PEER–CEA Project.” The overall objective of the PEER–CEA Project is to provide scientifically based information (e.g., testing, analysis, and resulting loss models) that measure and assess the effectiveness of seismic retrofit to reduce the risk of damage and associated losses (repair costs) of wood-frame houses with cripple wall and sill anchorage deficiencies as well as retrofitted conditions that address those deficiencies. Tasks that support and inform the loss-modeling effort are: (1) collecting and summarizing existing information and results of previous research on the performance of wood-frame houses; (2) identifying construction features to characterize alternative variants of wood-frame houses; (3) characterizing earthquake hazard and ground motions at representative sites in California; (4) developing cyclic loading protocols and conducting laboratory tests of cripple wall panels, wood-frame wall subassemblies, and sill anchorages to measure and document their response (strength and stiffness) under cyclic loading; and (5) the computer modeling, simulations, and the development of loss models as informed by a workshop with claims adjustors. This report is a product of Working Group 4: Testing and focuses on the first phase of an experimental investigation to study the seismic performance of retrofitted and existing cripple walls with sill anchorage. Paralleled by a large-component test program conducted at the University of California, Berkeley [Cobeen et al. 
2020], the present study involves the first of multiple phases of small-component tests conducted at UC San Diego. Details representative of era-specific construction, specifically the most vulnerable pre-1960s construction, are of predominant focus in the present effort. Parameters examined are cripple wall height, finish materials, gravity load, boundary conditions, anchorage, and deterioration. This report addresses the first phase of testing, which consisted of six specimens. Phase 1 included quasi-static reversed cyclic lateral load testing of six 12-ft-long, 2-ft-high cripple walls. All specimens in this phase were finished on their exterior with stucco over horizontal sheathing (referred to as a “wet” finish), a finish noted to be common of dwellings built in California before 1945. Parameters addressed in this first phase include: boundary conditions on the top, bottom, and corners of the walls; attachment of the sill to the foundation; and the retrofitted condition. Details of the test specimens, testing protocol, and instrumentation, as well as measured responses and physical observations, are summarized in this report. In addition, this report discusses the rationale and scope of subsequent small-component test phases. Companion reports present these test phases considering, amongst other variables, the impacts of dry finishes and cripple wall height (Phases 2–4). Results from these experiments are intended to provide an experimental basis to support numerical modeling used to develop loss models, which are intended to quantify the reduction of loss achieved by applying state-of-practice retrofit methods as identified in FEMA P-1100, Vulnerability-Based Seismic Assessment and Retrofit of One- and Two-Family Dwellings.
APA, Harvard, Vancouver, ISO, and other styles
9

Schiller, Brandon, Tara Hutchinson, and Kelly Cobeen. Cripple Wall Small-Component - Test Program: Comparisons (PEER-CEA Project). Pacific Earthquake Engineering Research Center, University of California, Berkeley, CA, November 2020. http://dx.doi.org/10.55461/lohh5109.

Full text
Abstract:
This report is one of a series of reports documenting the methods and findings of a multi-year, multi-disciplinary project coordinated by the Pacific Earthquake Engineering Research Center (PEER) and funded by the California Earthquake Authority (CEA). The overall project is titled “Quantifying the Performance of Retrofit of Cripple Walls and Sill Anchorage in Single-Family Wood-Frame Buildings,” henceforth referred to as the “PEER–CEA Project.” The overall objective of the PEER–CEA Project is to provide scientifically based information (e.g., testing, analysis, and resulting loss models) that measure and assess the effectiveness of seismic retrofit to reduce the risk of damage and associated losses (repair costs) of wood-frame houses with cripple wall and sill anchorage deficiencies as well as retrofitted conditions that address those deficiencies. Tasks that support and inform the loss-modeling effort are: (1) collecting and summarizing existing information and results of previous research on the performance of wood-frame houses; (2) identifying construction features to characterize alternative variants of wood-frame houses; (3) characterizing earthquake hazard and ground motions at representative sites in California; (4) developing cyclic loading protocols and conducting laboratory tests of cripple wall panels, wood-frame wall subassemblies, and sill anchorages to measure and document their response (strength and stiffness) under cyclic loading; and (5) the computer modeling, simulations, and the development of loss models as informed by a workshop with claims adjustors. This report is a product of Working Group 4 (WG4): Testing, whose central focus was to experimentally investigate the seismic performance of retrofitted and existing cripple walls. 
Amongst the body of reports from WG4, the present report cross-compares a suite of four small cripple wall test phases, 28 specimens in total, with varied exterior finishes, namely stucco (wet) and non-stucco (dry). Details representative of era-specific construction, specifically the most vulnerable pre-1960s construction, are of predominant focus in the present effort. Experiments involved imposition of combined vertical loading and quasi-static reversed cyclic lateral load onto cripple walls of 12 ft in length and 2 ft or 6 ft in height. All specimens in this report were constructed with the same boundary conditions and tested with the same vertical load. Parameters addressed in this report include: wet exterior finishes (stucco over framing, stucco over horizontal lumber sheathing, and stucco over diagonal lumber sheathing); and dry exterior finishes (horizontal siding, horizontal siding over diagonal sheathing, and T1-11 wood structural panels), with attention towards cripple wall height and the retrofit condition. The present report provides only a brief overview of the test program and setup, whereas a series of three prior reports present results of test groupings nominally by exterior finish type (wet versus dry). As such, the focus herein is to cross-compare key measurements and observations of the in-plane seismic behavior of all 28 specimens.
APA, Harvard, Vancouver, ISO, and other styles
10

Schiller, Brandon, Tara Hutchinson, and Kelly Cobeen. Cripple Wall Small-Component Test Program: Wet Specimens II (PEER-CEA Project). Pacific Earthquake Engineering Research Center, University of California, Berkeley, CA, November 2020. http://dx.doi.org/10.55461/ldbn4070.

Full text
Abstract:
This report is one of a series of reports documenting the methods and findings of a multi-year, multi-disciplinary project coordinated by the Pacific Earthquake Engineering Research Center (PEER) and funded by the California Earthquake Authority (CEA). The overall project is titled “Quantifying the Performance of Retrofit of Cripple Walls and Sill Anchorage in Single-Family Wood-Frame Buildings,” henceforth referred to as the “PEER–CEA Project.” The overall objective of the PEER–CEA Project is to provide scientifically based information (e.g., testing, analysis, and resulting loss models) that measure and assess the effectiveness of seismic retrofit to reduce the risk of damage and associated losses (repair costs) of wood-frame houses with cripple wall and sill anchorage deficiencies as well as retrofitted conditions that address those deficiencies. Tasks that support and inform the loss-modeling effort are: (1) collecting and summarizing existing information and results of previous research on the performance of wood-frame houses; (2) identifying construction features to characterize alternative variants of wood-frame houses; (3) characterizing earthquake hazard and ground motions at representative sites in California; (4) developing cyclic loading protocols and conducting laboratory tests of cripple wall panels, wood-frame wall subassemblies, and sill anchorages to measure and document their response (strength and stiffness) under cyclic loading; and (5) the computer modeling, simulations, and the development of loss models as informed by a workshop with claims adjustors. This report is a product of Working Group 4 (WG4): Testing, whose central focus was to experimentally investigate the seismic performance of retrofitted and existing cripple walls. This report focuses on stucco or “wet” exterior finishes. Paralleled by a large-component test program conducted at the University of California, Berkeley (UC Berkeley) [Cobeen et al. 
2020], the present study involves two of multiple phases of small-component tests conducted at the University of California San Diego (UC San Diego). Details representative of era-specific construction, specifically the most vulnerable pre-1960s construction, are of predominant focus in the present effort. Parameters examined are cripple wall height, finish style, gravity load, boundary conditions, anchorage, and deterioration. This report addresses the third phase of testing, which consisted of eight specimens, as well as half of the fourth phase of testing, which consisted of six specimens, three of which are discussed here. Although conducted in different phases, their results are combined here to co-locate observations regarding the behavior of the second phase of wet (stucco)-finished specimens. The results of the first phase of wet specimen tests were presented in Schiller et al. [2020(a)]. Experiments involved imposition of combined vertical loading and quasi-static reversed cyclic lateral load onto ten cripple walls 12 ft long and 2 ft or 6 ft high. One cripple wall was tested with a monotonic loading protocol. All specimens in this report were constructed with the same boundary conditions on the top and corners of the walls and were tested with the same vertical load. Parameters addressed in this report include: wet exterior finishes (stucco over framing, stucco over horizontal lumber sheathing, and stucco over diagonal lumber sheathing), cripple wall height, loading protocol, anchorage condition, boundary condition at the bottom of the walls, and the retrofitted condition. Details of the test specimens, testing protocol, and instrumentation, as well as measured responses and physical observations, are summarized in this report. Companion reports present phases of the tests considering, amongst other variables, impacts of various boundary conditions, stucco (wet) and non-stucco (dry) finishes, vertical load, cripple wall height, and anchorage condition. 
Results from these experiments are intended to support advancement of numerical modeling tools, which ultimately will inform seismic loss models capable of quantifying the reduction of loss achieved by applying state-of-practice retrofit methods as identified in FEMA P-1100, Vulnerability-Based Seismic Assessment and Retrofit of One- and Two-Family Dwellings.
APA, Harvard, Vancouver, ISO, and other styles