Journal articles on the topic 'Measure-based modeling'

To see the other types of publications on this topic, follow the link: Measure-based modeling.

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 50 journal articles for your research on the topic 'Measure-based modeling.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse journal articles on a wide variety of disciplines and organise your bibliography correctly.

1

Yager, Ronald R. "On the Fusion of Multiple Measure Based Belief Structures." International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems 26, Suppl. 2 (December 2018): 63–88. http://dx.doi.org/10.1142/s0218488518400123.

Abstract:
We introduce the concept of a fuzzy measure and describe the process of combining fuzzy measures to form new measures. We discuss the role of fuzzy measures in modeling uncertain information and their use in modeling granular uncertain information with the aid of measure based belief structures. We then turn to the problem of fusing multiple measure based belief structures. First we look at the case when the belief structures being fused have the same focal elements. Then we turn to the case where the structures being fused have different focal elements. Finally we compare measure-based fusion with Dempster’s rule.
2

Wu, Yadong, Hongying Zhang, and Ran Duan. "Total Variation Based Perceptual Image Quality Assessment Modeling." Journal of Applied Mathematics 2014 (2014): 1–10. http://dx.doi.org/10.1155/2014/294870.

Abstract:
Visual quality measurement is one of the fundamental and important issues in numerous applications of image and video processing. In this paper, based on the assumption that the human visual system is sensitive to image structures (edges) and image local luminance (light stimulation), we propose a new perceptual image quality assessment (PIQA) measure based on the total variation (TV) model (TVPIQA) in the spatial domain. The proposed measure compares the TVs of a distorted image and its reference image to represent the loss of image structural information. Because of the good performance of the TV model in describing edges, the proposed TVPIQA measure can represent image structure information very well. In addition, the energy of enclosed regions in the difference image between the reference image and its distorted image is used to measure the missing luminance information, to which the human visual system is sensitive. Finally, we validate the performance of the TVPIQA measure on the Cornell-A57, IVC, TID2008, and CSIQ databases and show that it outperforms recent state-of-the-art image quality assessment measures.
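As a rough illustration of the total-variation idea behind this abstract (a minimal sketch of plain anisotropic TV, not the authors' TVPIQA formula; `tv_ratio` is a hypothetical helper):

```python
import numpy as np

def total_variation(img):
    """Anisotropic total variation: sum of absolute vertical and horizontal gradients."""
    return np.abs(np.diff(img, axis=0)).sum() + np.abs(np.diff(img, axis=1)).sum()

def tv_ratio(ref, dist):
    """Crude structural-loss score: ratio of the smaller TV to the larger.
    1.0 means equal edge content; near 0 means heavy structural loss.
    (Assumes at least one image has nonzero TV.)"""
    tv_r, tv_d = total_variation(ref), total_variation(dist)
    return min(tv_r, tv_d) / max(tv_r, tv_d)

ref = np.tile([0.0, 1.0], (4, 4))        # sharp vertical edges: high TV
flat = np.full_like(ref, ref.mean())     # structure destroyed: zero TV
```

The paper's full measure additionally accounts for lost luminance via the energy of enclosed regions in the difference image, which this sketch omits.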
3

Abdelouahad, Abdelkaher Ait, Mohammed El Hassouni, Hocine Cherifi, and Driss Aboutajdine. "A New Image Distortion Measure Based on Natural Scene Statistics Modeling." International Journal of Computer Vision and Image Processing 2, no. 1 (January 2012): 1–15. http://dx.doi.org/10.4018/ijcvip.2012010101.

Abstract:
In the field of Image Quality Assessment (IQA), this paper examines a Reduced Reference (RRIQA) measure based on the bi-dimensional empirical mode decomposition. The proposed measure belongs to Natural Scene Statistics (NSS) modeling approaches. First, the reference image is decomposed into Intrinsic Mode Functions (IMF); the authors then use the Generalized Gaussian Density (GGD) to model the distribution of IMF coefficients. At the receiver side, the same number of IMFs is computed on the distorted image, and quality assessment is done by measuring the fitting error between the IMF coefficient histogram of the distorted image and the GGD estimate of the IMF coefficients of the reference image, using the Kullback-Leibler Divergence (KLD). In addition, the authors propose a new Support Vector Machine-based classification approach to evaluate the performance of the proposed measure, instead of logistic-function-based regression. Experiments were conducted on the LIVE dataset.
4

Kah, Samah El, Siham Aqel, My Abdelouahed Sabri, and Abdellah Aarab. "Background Modeling Method Based On Quad Tree Decomposition and Contrast Measure." Procedia Computer Science 148 (2019): 610–17. http://dx.doi.org/10.1016/j.procs.2019.01.034.

5

Chen, Yan Hui, De Jian Zhou, and Zhao Hua Wu. "Module Similarity Measure Method of Products Based on Assembly Relationship." Applied Mechanics and Materials 607 (July 2014): 721–26. http://dx.doi.org/10.4028/www.scientific.net/amm.607.721.

Abstract:
A module similarity measure method for mechanical and electrical products is studied in this paper. First, the paper proposes the module assembly type coding method and the network modeling method, and establishes a weighted network model, taking a typical module as an example; then it studies assembly relationship similarity, node similarity, and network similarity; finally it provides the module similarity measure method based on the assembly relationship.
6

Belding, Matthew, Alireza Enshaeian, and Piervincenzo Rizzo. "Vibration-Based Approach to Measure Rail Stress: Modeling and First Field Test." Sensors 22, no. 19 (September 30, 2022): 7447. http://dx.doi.org/10.3390/s22197447.

Abstract:
This paper describes a non-invasive inspection technique for the estimation of longitudinal stress in continuous welded rails (CWR) to infer the rail neutral temperature (RNT), i.e., the temperature at which the net longitudinal force in the rail is zero. The technique is based on the use of the finite element method (FEM), vibration measurements, and machine learning (ML). FEM is used to model the relationship between the boundary conditions and longitudinal stress of any given CWR and the vibration characteristics (mode shapes and frequencies) of the rail. The results of the numerical analysis are used to train a ML algorithm that is then tested using field data obtained by an array of accelerometers polled on the track of interest. In the study presented in this article, the proposed technique was proven in the field during an experimental campaign conducted in Colorado. Commercial FEM software was used to model the rail track as a short rail segment repeated indefinitely under varying boundary conditions and stress. Three datasets were prepared and fed to ML models developed using hyperparameter search optimization techniques and k-fold cross-validation to infer the stress or the RNT. The frequencies of vibration were extracted from the time waveforms obtained from two accelerometers temporarily attached to the rail. The results of the experiments demonstrated that the success of the technique depends on the accuracy of the model and the ability to properly identify the mode shapes. The results also proved that the ML model was able to successfully predict the neutral temperature of the tested rail using only a limited amount of experimental data for training.
7

Hartge, Florian, Thomas Wetter, and Walter E. Haefeli. "A similarity measure for case based reasoning modeling with temporal abstraction based on cross-correlation." Computer Methods and Programs in Biomedicine 81, no. 1 (January 2006): 41–48. http://dx.doi.org/10.1016/j.cmpb.2005.10.005.

8

Du, Ao, Jamie E. Padgett, and Abdollah Shafieezadeh. "Influence of intensity measure selection on simulation-based regional seismic risk assessment." Earthquake Spectra 36, no. 2 (February 11, 2020): 647–72. http://dx.doi.org/10.1177/8755293019891717.

Abstract:
This study investigates the influence of intensity measure (IM) selection on simulation-based regional seismic risk assessment (RSRA) of spatially distributed structural portfolios. First, a co-simulation method for general spectral averaging vector IMs is derived. Then a portfolio-level surrogate demand modeling approach, which incorporates the seismic demand estimation of the non-collapse and collapse states, is proposed. The derived IM co-simulation method enables the first comparative study of different IMs, including the conventional IMs and some more advanced scalar and vector IMs, in the context of RSRA. The influence of IM selection on the predictive performance of the portfolio-level surrogate demand models, as well as on the regional seismic risk estimates, is explored based on a virtual spatially distributed structural portfolio subjected to a scenario earthquake. The results of this study provide pertinent insights in surrogate demand modeling, IM co-simulation and selection, which can facilitate more accurate and reliable regional seismic risk estimates.
9

Fang, Li Yong, Hui Li, and Jin Ping Bai. "Defect Contour Matching Based on Similarity Measure for 3D Reconstruction." Advanced Materials Research 308-310 (August 2011): 1656–61. http://dx.doi.org/10.4028/www.scientific.net/amr.308-310.1656.

Abstract:
Contour matching is one of the important problems in the field of 3-D reconstruction. To address the difficulties of defect contour matching in defect modeling, a method based on a similarity measure is presented in this paper. In this method, the theory of similarity measures is introduced to quantitatively describe the similarity of two contours, and the value of the similarity measure is set as the criterion to judge the matching relation between two contours in consecutive slices. To reduce computational complexity and improve the accuracy of contour matching, a candidate matching field for contours is proposed. The efficiency of this algorithm has been verified by a typical example, and satisfactory results have been obtained.
10

Wang, Xiao-gang, Li-wei Huang, and Ying-wei Zhang. "Modeling and monitoring of nonlinear multi-mode processes based on similarity measure-KPCA." Journal of Central South University 24, no. 3 (March 2017): 665–74. http://dx.doi.org/10.1007/s11771-017-3467-z.

11

Rodríguez-Martínez, Adán, and Begoña Vitoriano. "Probability-Based Wildfire Risk Measure for Decision-Making." Mathematics 8, no. 4 (April 10, 2020): 557. http://dx.doi.org/10.3390/math8040557.

Abstract:
Wildfire is a natural element of many ecosystems as well as a natural disaster to be prevented. Climate and land usage changes have increased the number and size of wildfires in the last few decades. In this situation, governments must be able to manage wildfire, and a risk measure can be crucial to evaluate any preventive action and to support decision-making. In this paper, a risk measure based on ignition and spread probabilities is developed modeling a forest landscape as an interconnected system of homogeneous sectors. The measure is defined as the expected value of losses due to fire, based on the probabilities of each sector burning. An efficient method based on Bayesian networks to compute the probability of fire in each sector is provided. The risk measure is suitable to support decision-making to compare preventive actions and to choose the best alternatives reducing the risk of a network. The paper is divided into three parts. First, we present the theoretical framework on which the risk measure is based, outlining some necessary properties of the fire probabilistic model as well as discussing the definition of the event ‘fire’. In the second part, we show how to avoid topological restrictions in the network and produce a computable and comprehensible wildfire risk measure. Finally, an illustrative case example is included.
12

Teegavarapu, Ramesh, and Amin Elshorbagy. "Fuzzy set based error measure for hydrologic model evaluation." Journal of Hydroinformatics 7, no. 3 (July 1, 2005): 199–208. http://dx.doi.org/10.2166/hydro.2005.0017.

Abstract:
Traditional error measures (e.g. mean squared error, mean relative error) are often used in the field of water resources to evaluate the performance of models developed for modeling various hydrological processes. However, these measures may not always provide a comprehensive assessment of the performance of the model intended for a specific application. A new error measure is proposed and developed in this paper to fill the gap left by existing traditional error measures for performance evaluation. The measure quantifies the error that corresponds to the hydrologic condition and model application under consideration and also facilitates selection of the best model whenever multiple models are available for that application. Fuzzy set theory is used to model the modeler's perceptions of predictive accuracy in specific applications. The development of the error measure is primarily intended for use with models that provide hydrologic time series predictions. Hypothetical and real-life examples are used to illustrate and evaluate this measure. Results indicate that use of this measure is rational and meaningful in the selection process of an appropriate model from a set of competing models.
13

Pele, Daniel Traian, Emese Lazar, and Miruna Mazurencu-Marinescu-Pele. "Modeling Expected Shortfall Using Tail Entropy." Entropy 21, no. 12 (December 7, 2019): 1204. http://dx.doi.org/10.3390/e21121204.

Abstract:
Given the recent replacement of value-at-risk as the regulatory standard measure of risk with expected shortfall (ES) undertaken by the Basel Committee on Banking Supervision, it is imperative that ES gives correct estimates for the value of expected levels of losses in crisis situations. However, the measurement of ES is affected by a lack of observations in the tail of the distribution. While kernel-based smoothing techniques can be used to partially circumvent this problem, in this paper we propose a simple nonparametric tail measure of risk based on information entropy and compare its backtesting performance with that of other standard ES models.
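For context, the standard historical (nonparametric) ES that such measures are benchmarked against can be sketched as follows; the paper's own tail-entropy estimator is different, and the data here are synthetic:

```python
import numpy as np

def historical_es(returns, alpha=0.975):
    """Historical expected shortfall: mean loss at or beyond the alpha-VaR.
    Losses are negative returns; alpha is the confidence level."""
    losses = -np.asarray(returns, dtype=float)
    var = np.quantile(losses, alpha)     # historical VaR threshold
    return losses[losses >= var].mean()  # average of the tail losses

rng = np.random.default_rng(0)          # hypothetical i.i.d. daily returns
returns = rng.normal(0.0, 0.01, 10_000)
es = historical_es(returns)
```

The abstract's point is precisely that this tail average rests on few observations, which motivates smoothing or entropy-based alternatives.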
14

Golosnoy, Vasyl, Benno Hildebrandt, and Steffen Köhler. "Modeling and Forecasting Realized Portfolio Diversification Benefits." Journal of Risk and Financial Management 12, no. 3 (July 11, 2019): 116. http://dx.doi.org/10.3390/jrfm12030116.

Abstract:
For a financial portfolio, we suggest a realized measure of diversification benefits, which is based on intraday high-frequency returns. Our measure quantifies the volatility reduction that could be achieved by including an additional asset in the portfolio. In order to make our approach feasible for investors, we also provide time series modeling of both the realized diversification measure and the realized portfolio weight. The performance of our approach is evaluated in-sample and out-of-sample. We find that our approach is helpful for the purpose of portfolio variance minimization.
15

Weiss, Brandi A., and William Dardick. "An Entropy-Based Measure for Assessing Fuzziness in Logistic Regression." Educational and Psychological Measurement 76, no. 6 (July 21, 2016): 986–1004. http://dx.doi.org/10.1177/0013164415623820.

Abstract:
This article introduces an entropy-based measure of data–model fit that can be used to assess the quality of logistic regression models. Entropy has previously been used in mixture-modeling to quantify how well individuals are classified into latent classes. The current study proposes the use of entropy for logistic regression models to quantify the quality of classification and separation of group membership. Entropy complements preexisting measures of data–model fit and provides unique information not contained in other measures. Hypothetical data scenarios, an applied example, and Monte Carlo simulation results are used to demonstrate the application of entropy in logistic regression. Entropy should be used in conjunction with other measures of data–model fit to assess how well logistic regression models classify cases into observed categories.
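A common normalized-entropy index from mixture modeling (one plausible form of such a measure; the article's exact definition may differ) can be sketched as:

```python
import numpy as np

def entropy_index(probs):
    """Normalized entropy of predicted class probabilities (rows sum to 1).
    Returns 1 for perfectly separated cases, 0 for maximal fuzziness."""
    p = np.clip(np.asarray(probs, dtype=float), 1e-12, 1.0)  # guard log(0)
    n, k = p.shape
    return 1.0 + (p * np.log(p)).sum() / (n * np.log(k))

# Hypothetical predicted probabilities from a two-class logistic regression.
well_separated = np.array([[0.99, 0.01], [0.02, 0.98]])
fuzzy = np.array([[0.5, 0.5], [0.5, 0.5]])
```

As the abstract argues, such an index complements (rather than replaces) other data-model fit measures.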
16

Xing, Xiaoli, Qihao Chen, and Xiuguo Liu. "HETEROGENEITY MEASUREMENT BASED ON DISTANCE MEASURE FOR POLARIMETRIC SAR DATA." ISPRS Annals of Photogrammetry, Remote Sensing and Spatial Information Sciences IV-3 (April 23, 2018): 233–37. http://dx.doi.org/10.5194/isprs-annals-iv-3-233-2018.

Abstract:
To effectively test scene heterogeneity in polarimetric synthetic aperture radar (PolSAR) data, this paper introduces a distance measure utilizing the similarity between a sample and its pixels. Moreover, given the influence of the distribution and of modeled texture, the K distance measure is deduced from the Wishart distance measure. Specifically, the average of the pixels in the local window replaces the class-center coherency or covariance matrix. The Wishart and K distance measures are calculated between the average matrix and the pixels. Then, the ratio of the standard deviation to the mean is established for the Wishart and K distance measures, and the two features are defined and applied to reflect the complexity of the scene. The proposed heterogeneity measure is obtained by integrating the two features using the Pauli basis. Experiments conducted on single-look and multilook PolSAR data demonstrate the effectiveness of the proposed method for detecting scene heterogeneity.
17

Engemann, Kurt J., and Ronald R. Yager. "Comfort Decision Modeling." International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems 26, Suppl. 1 (December 2018): 141–63. http://dx.doi.org/10.1142/s0218488518400081.

Abstract:
We introduce comfort decision modeling for decision problems in which an alternative is to be selected based on a measure of satisfaction we refer to as comfort. We define comfort as the difference between the payoff received by selecting a particular strategy and the worst payoff that could have been received under the manifestation of the same state-of-nature. We define the effective comfort associated with an alternative as the aggregation of an alternative’s comforts across all possible states-of-nature. We study several methods of aggregating an alternative’s individual comforts across the different states-of-nature, incorporating various types of information about the uncertainty associated with the states-of-nature. We provide a Comfort Decision Model to determine the value of alternatives utilizing attitudinal measures of the decision maker. We demonstrate a process of performing sensitivity of the resulting decision to a measure of the attitude of the decision maker. Lastly, we use an illustration to show the practicability and cogency of the new method.
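The core comfort calculation described here resembles a regret-style transform of the payoff matrix; a minimal sketch with a hypothetical payoff matrix and assumed state probabilities, using expected-value aggregation (one of several aggregations the authors study):

```python
import numpy as np

# Hypothetical payoff matrix: rows = alternatives, columns = states of nature.
payoff = np.array([[7.0, 2.0, 6.0],
                   [4.0, 5.0, 3.0],
                   [1.0, 8.0, 2.0]])

# Comfort: payoff minus the worst payoff under the same state of nature.
comfort = payoff - payoff.min(axis=0)

# Effective comfort: here, expected comfort under assumed state probabilities.
state_probs = np.array([0.5, 0.3, 0.2])
effective_comfort = comfort @ state_probs
best_alternative = int(np.argmax(effective_comfort))
```

Other aggregations in the paper incorporate attitudinal measures of the decision maker rather than known probabilities.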
18

Menin, Boris Michailovich. "Applying Measurement Theory and Information-based Measure in Modeling Physical Phenomena and Technological Processes." European Journal of Engineering Research and Science 3, no. 1 (January 29, 2018): 28. http://dx.doi.org/10.24018/ejers.2018.3.1.594.

Abstract:
In this paper, we compare the features of the application of the theory of measurements and the measure of the similarity of the model to the phenomenon under study on the basis of calculating the amount of information contained in the model. An experimental estimate compared with the standard model’s uncertainty calculation procedure shows that this measure is preferable to the traditional approach to calculating the threshold discrepancy. The article presents an algorithm that is used to calculate the minimum achievable uncertainty in the resolution of the model's fuzziness, as well as experimental results demonstrating its effectiveness.
19

Menin, Boris Michailovich. "Applying Measurement Theory and Information-based Measure in Modeling Physical Phenomena and Technological Processes." European Journal of Engineering and Technology Research 3, no. 1 (January 29, 2018): 28–34. http://dx.doi.org/10.24018/ejeng.2018.3.1.594.

Abstract:
In this paper, we compare the features of the application of the theory of measurements and the measure of the similarity of the model to the phenomenon under study on the basis of calculating the amount of information contained in the model. An experimental estimate compared with the standard model’s uncertainty calculation procedure shows that this measure is preferable to the traditional approach to calculating the threshold discrepancy. The article presents an algorithm that is used to calculate the minimum achievable uncertainty in the resolution of the model's fuzziness, as well as experimental results demonstrating its effectiveness.
20

Vanbelle, Sophie, and Emmanuel Lesaffre. "Modeling agreement on bounded scales." Statistical Methods in Medical Research 27, no. 11 (May 8, 2017): 3460–77. http://dx.doi.org/10.1177/0962280217705709.

Abstract:
Agreement is an important concept in medical and behavioral sciences, in particular in clinical decision making, where disagreements possibly imply a different patient management. The concordance correlation coefficient is an appropriate measure to quantify agreement between two scorers on a quantitative scale. However, this measure is based on the first two moments, which could poorly summarize the shape of the score distribution on bounded scales. Bounded outcome scores are common in medical and behavioral sciences. Typical examples are scores obtained on visual analog scales and scores derived as the number of positive items on a questionnaire. These kinds of scores often show a non-standard distribution, like a J- or U-shape, questioning the usefulness of the concordance correlation coefficient as an agreement measure. The logit-normal distribution has been shown to be successful in modeling bounded outcome scores of two types: (1) when the bounded score is a coarsened version of a latent score with a logit-normal distribution on the [0,1] interval and (2) when the bounded score is a proportion with the true probability having a logit-normal distribution. In the present work, a model-based approach, based on a bivariate generalization of the logit-normal distribution, is developed in a Bayesian framework to assess agreement on bounded scales. This method permits directly studying the impact of predictors on the concordance correlation coefficient and can be simply implemented in standard Bayesian software, like JAGS and WinBUGS. The performance of the new method is compared to the classical approach using simulations. Finally, the methodology is applied in two different medical domains: cardiology and rheumatology.
21

Asri, Marselinus. "MODELING RISK MEASUREMENT IN EMERGING MARKET." Contemporary Journal on Business and Accounting 1, no. 1 (May 7, 2021): 1–22. http://dx.doi.org/10.58792/cjba.v1i1.10.

Abstract:
Purpose – This study aims to model risk measurement in capital market variables.
Design/methodology/approach – A mathematical approach is used to integrate a noticeable increase in firm-level idiosyncratic risk; the volatility measure of the coefficient is greater and has a stronger upward trend than the new idiosyncratic volatility measure.
Findings – Using the model decomposing total risk into market variance extended by Bali et al., integrated with the initial model, the Fama-French idiosyncratic risk model, we suggest the new model: Rit − RFt = ai + bi(RMt − RFt) + var.HLt + var.SBt + var.MWt + var.RWt + var.CMAt + ei.
Originality – This paper introduces a variance measure of aggregate idiosyncratic risk, which does not require estimation of market betas or correlations and is based on the concept of gain from portfolio diversification.
Keywords: Idiosyncratic Risk, New Model.
Paper type: Research result.
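A stripped-down version of the idiosyncratic-risk idea in this abstract (a plain one-factor CAPM regression on synthetic data; the study's suggested model adds further factors such as var.HL and var.SB) might look like:

```python
import numpy as np

rng = np.random.default_rng(1)
T = 1_000
mkt_excess = rng.normal(0.0005, 0.01, T)        # market excess returns (synthetic)
eps = rng.normal(0.0, 0.02, T)                  # firm-specific shocks
stock_excess = 0.0002 + 1.2 * mkt_excess + eps  # single-factor data

# OLS: stock excess returns on a constant and the market factor.
X = np.column_stack([np.ones(T), mkt_excess])
coef, *_ = np.linalg.lstsq(X, stock_excess, rcond=None)
residuals = stock_excess - X @ coef
idio_vol = residuals.std(ddof=2)                # idiosyncratic volatility estimate
```

The residual standard deviation is the firm-level idiosyncratic volatility that the paper aggregates across firms.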
22

Syuhada, Khreshna. "Aggregate Risk Model and Risk Measure-Based Risk Allocation." InPrime: Indonesian Journal of Pure and Applied Mathematics 2, no. 1 (March 31, 2020): 13–23. http://dx.doi.org/10.15408/inprime.v2i1.14494.

Abstract:
In actuarial modeling, aggregate risk is known to be more attractive than individual risk. It is, however, usually difficult to find the exact form of the joint probability distribution. This paper considers an aggregate risk model and employs the translated gamma approximation to handle the formulation of such a distribution function. In addition, we deal with the problem of risk allocation in such a model. In particular, we compute risk allocation based on risk measure forecasts of Value-at-Risk (VaR) and its extensions: improved VaR and Tail VaR. Risk allocation shows the contribution of each individual risk to the aggregate. It has the constraint that the risk measure of the aggregate risk is equal to the aggregate of the risk measures of the individual risks. Keywords: allocation methods; tail-value-at-risk; translated gamma approximation.

Abstract (translated from Indonesian): Aggregate risk is a more interesting subject in actuarial modeling than individual risk. However, the exact form of the aggregate risk distribution function is difficult to determine. This article discusses the aggregate risk model and uses the translated gamma approximation method to determine the aggregate risk distribution function. Based on this distribution function, the aggregate risk allocation can be predicted. The aggregate risk allocation method is applied to the Value-at-Risk (VaR) risk measure and its extensions: improved VaR and Tail-VaR. The risk allocation states the contribution of each individual risk to the aggregate risk measure. The sum, or aggregate, of the individual risk allocations equals the aggregate risk measure. Keywords: translated gamma approximation; risk allocation; Tail-Value-at-Risk.
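The translated gamma approximation mentioned here is standard moment matching: approximate the aggregate loss S by k + G with G ~ Gamma(shape alpha, scale theta), chosen to reproduce the mean, variance, and skewness of S. A sketch with hypothetical moments:

```python
import math

def translated_gamma_params(mean, var, skew):
    """Moment-matching parameters for S ≈ k + Gamma(shape=alpha, scale=theta)."""
    alpha = 4.0 / skew ** 2
    theta = math.sqrt(var) * skew / 2.0
    k = mean - 2.0 * math.sqrt(var) / skew
    return alpha, theta, k

alpha, theta, k = translated_gamma_params(mean=100.0, var=400.0, skew=0.5)

# The approximation reproduces the target moments exactly:
approx_mean = k + alpha * theta
approx_var = alpha * theta ** 2
approx_skew = 2.0 / math.sqrt(alpha)
```

Risk measures such as VaR and Tail VaR are then read off the fitted translated gamma distribution instead of the intractable exact aggregate distribution.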
23

Dui, Hongyan, Chi Zhang, Guanghan Bai, and Liwei Chen. "Mission reliability modeling of UAV swarm and its structure optimization based on importance measure." Reliability Engineering & System Safety 215 (November 2021): 107879. http://dx.doi.org/10.1016/j.ress.2021.107879.

24

Starck, J. L., F. Murtagh, and R. Gastaud. "A new entropy measure based on the wavelet transform and noise modeling [image compression]." IEEE Transactions on Circuits and Systems II: Analog and Digital Signal Processing 45, no. 8 (1998): 1118–24. http://dx.doi.org/10.1109/82.718822.

25

Liang, Baorui, Suhua Zhang, Dongping Li, Yuxin Zhai, Fei Wang, Lexian Shi, and Yongbo Wang. "Safety Risk Evaluation of Construction Site Based on Unascertained Measure and Analytic Hierarchy Process." Discrete Dynamics in Nature and Society 2021 (November 23, 2021): 1–14. http://dx.doi.org/10.1155/2021/7172938.

Abstract:
Due to the high risk of construction sites, it is necessary to carry out safety risk evaluation. Taking into account the complexity and uncertainty of construction sites, a two-level comprehensive evaluation index system has been built based on the emphases and difficulties of current safety work in the construction industry. The index system includes 6 first-level evaluation indexes, such as civilized construction site; management of machinery, equipment, and materials; occupational health protection; and subcontract management, as well as 33 second-level evaluation indexes, such as the safety risk inspection system, safety awareness education, and prevention and control of occupational disease. The research takes 'No. A residential building and other 8 projects' as the empirical analysis object, with the weights of the first-level and second-level indexes calculated by the analytic hierarchy process (AHP) and information entropy, and applies the unascertained measure to comprehensively evaluate the safety risk of the construction project. The overall score of the project is 6.4475 and the evaluation result is good, which is consistent with the actual situation; the feasibility and effectiveness of the evaluation index system and evaluation model are thereby verified, providing a reference for safety risk management during the construction stage of construction projects.
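The AHP weighting step mentioned in this abstract is typically the principal-eigenvector method with a consistency check; a minimal sketch on a hypothetical 3×3 pairwise-comparison matrix (not the paper's actual index data):

```python
import numpy as np

RANDOM_INDEX = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}  # Saaty's RI table (partial)

def ahp_weights(pairwise):
    """Priority weights via the principal eigenvector, plus the consistency ratio."""
    A = np.asarray(pairwise, dtype=float)
    n = A.shape[0]
    eigvals, eigvecs = np.linalg.eig(A)
    i = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, i].real)        # principal eigenvector, made positive
    w /= w.sum()                          # normalize to weights summing to 1
    ci = (eigvals[i].real - n) / (n - 1)  # consistency index
    ri = RANDOM_INDEX[n]
    cr = ci / ri if ri else 0.0           # consistency ratio (want < 0.1)
    return w, cr

A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 3.0],
              [1/5, 1/3, 1.0]])
weights, cr = ahp_weights(A)
```

The paper combines such AHP weights with information-entropy weights before applying the unascertained measure, a step this sketch omits.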
26

Hadarovich, A. Yu, I. V. Anishchenko, P. Kundrotas, I. Vakser, and A. V. Tuzikov. "Structure prediction algorithm for protein complexes based on gene ontology." Doklady of the National Academy of Sciences of Belarus 64, no. 2 (May 17, 2020): 150–58. http://dx.doi.org/10.29235/1561-8323-2020-64-2-150-158.

Abstract:
We propose an algorithm for comparing protein-protein complexes based on their functional properties in terms of Gene Ontology. The proposed measure of a functional similarity between complexes is combined with a structural measure to find templates for the template-based docking of protein complexes. We present the results on the modeling of protein complexes based on this algorithm.
27

HUANG, BUFU, MENG CHEN, KA KEUNG LEE, and YANGSHENG XU. "HUMAN IDENTIFICATION BASED ON GAIT MODELING." International Journal of Information Acquisition 04, no. 01 (March 2007): 27–38. http://dx.doi.org/10.1142/s0219878907001137.

Abstract:
Human gait is a dynamic biometric feature which is complex and difficult to imitate. It is unique and more secure than static features such as passwords, fingerprints, and facial features. In this paper, we present intelligent shoes for human identification based on human gait modeling and similarity evaluation with hidden Markov models (HMMs). First, we describe the intelligent shoe system for collecting dynamic human gait performance. Using the proposed machine learning method, hidden Markov models, an individual wearer's gait model is derived, and we then demonstrate the procedure for recognizing different wearers by analyzing the corresponding models. Next, we define a hidden-Markov-model-based similarity measure which allows us to evaluate the resultant learned models. Together with the most-likely-performance criterion, this helps us derive the similarity of individual behavior and its corresponding model. By utilizing human gait modeling and similarity evaluation based on hidden Markov models, the proposed method has produced satisfactory results for human identification during testing.
28

Zhang, Lan Ting. "Research and Application on Parameterized Modeling Technology of Shaft Part Based on Feature." Applied Mechanics and Materials 148-149 (December 2011): 548–51. http://dx.doi.org/10.4028/www.scientific.net/amm.148-149.548.

Abstract:
At present, parameterized design and feature modeling are the two main development directions of CAD. Integrating feature modeling with parameterized technology, in which product features are modeled by parameters, is an effective method and measure for 3D design and modeling. According to its characteristics, the features of shaft parts are classified and described concretely, and a feature information model is established in this paper. The feature modeling method and parameterized feature modeling technology for shaft parts are studied, and a parameterized feature modeling module for shaft parts based on UG NX is developed.
APA, Harvard, Vancouver, ISO, and other styles
29

BELLINI, FABIO, and GIANNA FIGÀ-TALAMANCA. "DETECTING AND MODELING TAIL DEPENDENCE." International Journal of Theoretical and Applied Finance 07, no. 03 (May 2004): 269–87. http://dx.doi.org/10.1142/s0219024904002426.

Full text
Abstract:
The aim of this work is to develop a nonparametric tool for detecting dependence in the tails of financial data. We provide a simple method to locate and measure serial dependence in the tails, based on runs tests. Our empirical investigations on many financial time series reveal a strong departure from independence for daily log-returns, which is not filtered out by the usual GARCH models.
APA, Harvard, Vancouver, ISO, and other styles
30

Fajarin, Rio Agustian, and Eko Adhi Setiawan. "Analysis Corrected Performance Ratio on Photovoltaic Through Four Temperature Cell Model." E3S Web of Conferences 67 (2018): 01026. http://dx.doi.org/10.1051/e3sconf/20186701026.

Full text
Abstract:
To measure the performance of a photovoltaic system, the performance ratio is a commonly used indicator. However, the performance ratio calculation relies on various methods to determine the photovoltaic cell temperature. The models used in this research are the Sandia, Ross and Smokler, Schott, and Faiman models. Each model has different coefficient values, so each yields a different performance ratio. Comparing these values identifies the model closest to real conditions.
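The kind of temperature-corrected performance-ratio calculation this abstract compares can be sketched as below. The Ross coefficient and power temperature coefficient are typical illustrative values, not figures from the paper:

```python
# Illustrative constants, not values from the paper.
ROSS_K = 0.025   # Ross coefficient, °C per W/m² of irradiance
GAMMA = -0.004   # module power temperature coefficient, 1/°C
T_STC, G_STC = 25.0, 1000.0  # standard test conditions

def cell_temp_ross(t_ambient, irradiance):
    """Ross and Smokler model: cell temperature rises linearly with irradiance."""
    return t_ambient + ROSS_K * irradiance

def corrected_pr(energy_out, p_rated, irradiation, t_cell):
    """Performance ratio corrected back to STC temperature; since GAMMA is
    negative, operation above 25 °C raises the corrected ratio."""
    pr = (energy_out / p_rated) / (irradiation / G_STC)
    return pr / (1.0 + GAMMA * (t_cell - T_STC))
```

Swapping in the Sandia, Schott, or Faiman cell-temperature model for `cell_temp_ross` changes `t_cell` and hence the corrected ratio, which is exactly the comparison the paper performs.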
APA, Harvard, Vancouver, ISO, and other styles
31

Huang, Li, Yu Kun Sun, Xiao Fu Ji, Yong Hong Huang, and Tian Yan Du. "Soft Sensor Modeling of Biological Fermentation Process Based on tPSO-FNN." Key Engineering Materials 464 (January 2011): 482–86. http://dx.doi.org/10.4028/www.scientific.net/kem.464.482.

Full text
Abstract:
Biological fermentation is a complex nonlinear dynamic coupling process. As it is very difficult to measure the key biological parameters online, timely process control is unavailable in industrial production. Soft sensing, however, can solve this problem. To overcome drawbacks of PSO and FNN, such as occasionally falling into local minima and slow convergence, the extremum disturbed particle swarm optimization (tPSO) algorithm is proposed and combined with a fuzzy neural network (FNN) to optimize the network parameters. Furthermore, tPSO-FNN is applied to soft sensor modeling of lysine biological fermentation. Experimental results show that the proposed model can measure the key parameters, and the soft sensor model based on tPSO-FNN has higher precision and better performance than the model based on FNN alone.
APA, Harvard, Vancouver, ISO, and other styles
32

Aliev, Rafik, and Konul Memmedova. "Application of Z-Number Based Modeling in Psychological Research." Computational Intelligence and Neuroscience 2015 (2015): 1–7. http://dx.doi.org/10.1155/2015/760403.

Full text
Abstract:
Pilates exercises have been shown to have a beneficial impact on the physical, physiological, and mental characteristics of human beings. In this paper, a Z-number based fuzzy approach is applied to model the effect of Pilates exercises on motivation, attention, anxiety, and educational achievement. The psychological parameters are measured using internationally recognized instruments: the Academic Motivation Scale (AMS), the Test of Attention (D2 Test), and Spielberger’s Anxiety Test completed by students. The students' GPA was used as the measure of educational achievement. Application of Z-information modeling allows us to increase the precision and reliability of data processing results in the presence of uncertainty in the input data created from completed questionnaires. The basic steps of Z-number based modeling with numerical solutions are presented.
APA, Harvard, Vancouver, ISO, and other styles
33

Zhang, Liming, Ji Qi, Lixin Li, Kai Zhang, Zhixue Sun, Yongfei Yang, and Qin Luo. "A forward modeling method based on electromagnetic theory to measure the parameters of hydraulic fracture." Fuel 251 (September 2019): 466–73. http://dx.doi.org/10.1016/j.fuel.2019.04.075.

Full text
APA, Harvard, Vancouver, ISO, and other styles
34

Eckel, K. T., A. Pfahlberg, O. Gefeller, and T. Hothorn. "Flexible Modeling of Malignant Melanoma Survival." Methods of Information in Medicine 47, no. 01 (2008): 47–55. http://dx.doi.org/10.3414/me0450.

Full text
Abstract:
Objectives: This paper compares the diagnostic capabilities of flexible ensemble methods for modeling the survival time of melanoma patients against the well-established proportional hazards model. Both a random-forest-type algorithm for censored data and a combination of the proportional hazards model with recursive partitioning are investigated. Methods: Benchmark experiments using the integrated Brier score as a measure of goodness of prediction form the basis of the performance assessment for all competing algorithms. To compare the regression relationships represented by the models under test, we describe fitted conditional survival functions by a univariate measure derived from the area under the curve. Based on this measure, we adapt a visualization technique useful for the inspection and comparison of model fits. Results: For the data of malignant melanoma patients, the predictive performance of the competing models is on par, allowing for a fair comparison of the fitted relationships. Newly introduced MODplots visualize differences in the fitting structure of the underlying models. Conclusion: The paper provides a framework for comparing the predictive and diagnostic performance of a parametric, a non-parametric and a combined approach.
APA, Harvard, Vancouver, ISO, and other styles
35

Xiao, Shaoping, and Ruicheng Liu. "Studies of COVID-19 Outbreak Control Using Agent-Based Modeling." Complex Systems 30, no. 3 (September 15, 2021): 297–321. http://dx.doi.org/10.25088/complexsystems.30.3.297.

Full text
Abstract:
An agent-based model was developed to study outbreaks and outbreak control for COVID-19, mainly in urban communities. Rules for people’s interactions and virus infectiousness were derived from previous sociology studies and recently published data-driven analyses of COVID-19 epidemics. The basic reproduction number calculated from the developed model coincided with reported values. Three control measures were considered in this paper: social distancing, self-quarantine and community quarantine. Each control measure was assessed individually at first; an artificial neural network was then used to study the effects of different combinations of control measures. To help quantify the impacts of self-quarantine and community quarantine on outbreak control, each was scaled separately. The results showed that self-quarantine was more effective than the other measures, but no individual control measure was sufficient to control outbreaks in urban communities. The results also showed that a high level of self-quarantine and general community quarantine, assisted with social distancing, would be recommended for outbreak control.
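The structure of such an agent-based outbreak model can be sketched in a few lines. All parameters below are illustrative placeholders, not the paper's calibrated values:

```python
import random

def simulate(n=500, contacts=8, p_transmit=0.05, p_quarantine=0.0,
             days=60, recovery=10, seed=1):
    """Toy agent-based outbreak: each day every non-quarantined infectious
    agent meets `contacts` random others; quarantined agents make no
    contacts. Returns the total number of agents ever infected."""
    rng = random.Random(seed)
    state = ['S'] * n            # S-usceptible, I-nfectious, R-ecovered
    timer = [0] * n
    quarantined = [False] * n
    state[0] = 'I'               # single index case
    for _ in range(days):
        newly = []
        for i in range(n):
            if state[i] == 'I' and not quarantined[i]:
                for _ in range(contacts):
                    j = rng.randrange(n)
                    if state[j] == 'S' and rng.random() < p_transmit:
                        newly.append(j)
        for j in newly:
            if state[j] == 'S':
                state[j] = 'I'
                quarantined[j] = rng.random() < p_quarantine
        for i in range(n):
            if state[i] == 'I':
                timer[i] += 1
                if timer[i] >= recovery:
                    state[i] = 'R'
    return sum(s != 'S' for s in state)
```

Raising `p_quarantine` (the fraction of new cases that self-quarantine) sharply reduces the final outbreak size, mirroring the qualitative finding of the paper.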
APA, Harvard, Vancouver, ISO, and other styles
36

Kozák, Luca, Attila Házy, and Laura Veres. "Application of artificial intelligence in settlement development modeling." Multidiszciplináris tudományok 11, no. 5 (2021): 256–63. http://dx.doi.org/10.35925/j.multi.2021.5.27.

Full text
Abstract:
In the book (Házy-Veres et al., 2020), we presented a model that applies artificial intelligence and neuro-fuzzy systems to settlement development. The model is based on the creation of two knowledge bases: a database of good practices and a database of settlements. Based on this result, we created a web-based application to measure the social innovation potential of settlements and support the implementation of good practices.
APA, Harvard, Vancouver, ISO, and other styles
37

Athanasopoulos, George, Haiyan Song, and Jonathan A. Sun. "Bagging in Tourism Demand Modeling and Forecasting." Journal of Travel Research 57, no. 1 (February 2, 2017): 52–68. http://dx.doi.org/10.1177/0047287516682871.

Full text
Abstract:
This study introduces bootstrap aggregation (bagging) in modeling and forecasting tourism demand. The aim is to improve the forecast accuracy of predictive regressions while considering fully automated variable selection processes, which are particularly useful in industry applications. The procedures considered for variable selection are the general-to-specific (GETS) approach based on statistical inference and stepwise search procedures based on a measure of predictive accuracy (MPA). The evidence based on tourist arrivals from six source markets to Australia overwhelmingly suggests that bagging is effective for improving the forecasting accuracy of the models considered.
APA, Harvard, Vancouver, ISO, and other styles
38

WU, HAOYANG, YUYUAN WU, HAITAO LIU, and HONGJUN ZHANG. "ROUGHNESS OF TYPE-2 FUZZY SET BASED ON SIMILARITY RELATION." International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems 15, no. 04 (August 2007): 503–17. http://dx.doi.org/10.1142/s0218488507004820.

Full text
Abstract:
The theory of fuzzy sets and the theory of rough sets are two useful means of describing and modeling uncertainty in ill-defined environments where precise mathematical analysis is not suitable. Classical rough set theory is based on an equivalence relation; it has been shown that it can be generalized to the case of a similarity relation. So far, theoretical investigations have addressed the roughness measure of a fuzzy set based on an equivalence relation. The intention of this paper is to go further and propose a roughness measure of a type-2 fuzzy set based on a similarity relation, and to prove some properties of this novel measure.
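For orientation, the classical (crisp, equivalence-relation) Pawlak roughness measure that this paper generalizes can be sketched as:

```python
def roughness(blocks, member):
    """Pawlak roughness of a set under an equivalence partition:
    rho = 1 - |lower approximation| / |upper approximation|.
    rho = 0 means the set is exactly definable by the partition;
    rho near 1 means the set is very rough."""
    lower, upper = set(), set()
    for block in blocks:
        inside = {x for x in block if member(x)}
        if inside == set(block):
            lower |= set(block)   # block lies entirely inside the set
        if inside:
            upper |= set(block)   # block touches the set
    return 1.0 - len(lower) / len(upper) if upper else 0.0
```

The paper's contribution replaces the crisp set with a type-2 fuzzy set and the equivalence partition with a similarity relation; this sketch shows only the baseline being extended.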
APA, Harvard, Vancouver, ISO, and other styles
39

Jitendra, Asha K., Danielle N. Dupuis, and Anne F. Zaslofsky. "Curriculum-Based Measurement and Standards-Based Mathematics." Learning Disability Quarterly 37, no. 4 (January 2, 2014): 241–51. http://dx.doi.org/10.1177/0731948713516766.

Full text
Abstract:
The purpose of this study was to examine the reliability and validity of a curriculum-based measure of word problem solving (CBM-WPS) as an indicator of performance and progress in a sample of 136 third-grade students at risk for mathematics difficulties (MDs) instructed in a standards-based mathematics curriculum. Students completed the CBM-WPS measure every 2 weeks across 12 school weeks. Results indicated that the CBM-WPS measure was reliable and significantly correlated with measures of arithmetic WPS, number combinations fluency, and a standardized test of mathematics achievement. Results of growth modeling indicated that students showed significant growth on the CBM-WPS measure, with an average increase of 0.33 problems correct per week. Additional analyses revealed that students identified as high at-risk demonstrated similar growth to students identified as low at-risk. Furthermore, the CBM-WPS growth slopes were a significant predictor of students’ spring performance on a standardized test of mathematics achievement, demonstrating their predictive validity. Implications for practice and future research on assessing mathematics skill development are discussed.
APA, Harvard, Vancouver, ISO, and other styles
40

Specht, Aaron J., Marc G. Weisskopf, and Linda Huiling Nie. "Theoretical modeling of a portable x-ray tube based KXRF system to measure lead in bone." Physiological Measurement 38, no. 3 (February 21, 2017): 575–85. http://dx.doi.org/10.1088/1361-6579/aa5efe.

Full text
APA, Harvard, Vancouver, ISO, and other styles
41

Zhang, Ning, Fanbin Kong, and Chih-Chun Kung. "On Modeling Environmental Production Characteristics: A Slacks-Based Measure for China’s Poyang Lake Ecological Economics Zone." Computational Economics 46, no. 3 (August 28, 2014): 389–404. http://dx.doi.org/10.1007/s10614-014-9467-2.

Full text
APA, Harvard, Vancouver, ISO, and other styles
42

Mallam, Hassane Abba, Natatou Dodo Moutari, Barro Diakarya, and Saley Bisso. "Extremal Copulas and Tail Dependence in Modeling Stochastic Financial Risk." European Journal of Pure and Applied Mathematics 14, no. 3 (August 5, 2021): 1057–81. http://dx.doi.org/10.29020/nybg.ejpam.v14i3.3951.

Full text
Abstract:
In recent years, stochastic modeling has become essential in financial risk management related to the ownership and valuation of financial products such as assets, options, and bonds. This paper presents a contribution to the modeling of stochastic risks in finance by using both extensions of tail dependence coefficients and extremal dependence structures based on copulas. In particular, we show that when the stochastic behavior of a set of risks can be modeled by a multivariate extremal process, a corresponding form of the underlying copula describing their dependence is determined. Moreover, a new tail dependence measure is proposed and its properties are established.
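A standard empirical estimator of the upper tail dependence coefficient, the quantity such measures extend, can be sketched as follows. This is the textbook rank-based estimator, not the new measure proposed in the paper, and it assumes the samples contain no tied values:

```python
def upper_tail_dependence(x, y, u=0.9):
    """Empirical upper tail dependence at level u:
    P(Fx(X) > u | Fy(Y) > u), estimated from ranks.
    lambda_U is the limit of this quantity as u -> 1.
    Assumes no tied values in x or in y."""
    n = len(x)
    rx = {v: (i + 1) / n for i, v in enumerate(sorted(x))}
    ry = {v: (i + 1) / n for i, v in enumerate(sorted(y))}
    both = sum(1 for a, b in zip(x, y) if rx[a] > u and ry[b] > u)
    cond = sum(1 for b in y if ry[b] > u)
    return both / cond if cond else 0.0
```

Comonotone data gives a value of 1 and countermonotone data gives 0, the two extremes between which financial return pairs typically fall.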
APA, Harvard, Vancouver, ISO, and other styles
43

Lei, Qi, Min Wu, and Chun Sheng Wang. "Improved Lazy Learning Based Dynamic Modeling Method on Combustion Process." Applied Mechanics and Materials 52-54 (March 2011): 680–85. http://dx.doi.org/10.4028/www.scientific.net/amm.52-54.680.

Full text
Abstract:
This paper presents a new data-based modeling method for the industrial combustion process. In this method, an angle measure, which represents the change trend of samples, is introduced to evaluate the similarity between sample data and query data, which was not exploited in previous work. An ARX method is used to build the local model. As the working point moves, different models are set up to realize accurate modeling of the combustion process. An example of a coke oven combustion process is presented to illustrate the modeling capability of the proposed method.
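One plausible reading of an angle measure over change trends is the cosine of the angle between first-difference vectors; this sketch is an assumption about the construction, not the authors' exact formula:

```python
import math

def angle_similarity(sample, query):
    """Cosine of the angle between the change-trend (first-difference)
    vectors of two series: 1 for identical trends, -1 for opposite ones."""
    ds = [b - a for a, b in zip(sample, sample[1:])]
    dq = [b - a for a, b in zip(query, query[1:])]
    dot = sum(p * q for p, q in zip(ds, dq))
    ns = math.sqrt(sum(p * p for p in ds))
    nq = math.sqrt(sum(q * q for q in dq))
    return dot / (ns * nq) if ns and nq else 0.0
```

In a lazy-learning scheme, samples whose trend is most aligned with the query would be weighted most heavily when fitting the local ARX model.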
APA, Harvard, Vancouver, ISO, and other styles
44

Siddall, J. N. "Probabilistic Modeling in Design." Journal of Mechanisms, Transmissions, and Automation in Design 108, no. 3 (September 1, 1986): 330–35. http://dx.doi.org/10.1115/1.3258735.

Full text
Abstract:
A general procedure is proposed for evolving the form of a density function that is consistent with the concept of subjective probability. The procedure directly applies new data information to updating the form of a density function without imposing on it any theoretical distribution that could restrict its shape, and permits the direct use of judgment arising from real-world experience. It is based on the simple concept that sample size is a measure of confidence in the shape of a density function. Two possible algorithms are given, and the concept is extended to simple “true” or “false” events. The importance of probability in artificial intelligence is also discussed, and its essentially subjective nature is described. Procedures are briefly suggested.
APA, Harvard, Vancouver, ISO, and other styles
45

Li, Zhi Yang, Xiao Mei Wang, Yu Zhu, Ming Yu Huang, and Hong Jun Ni. "The Modeling Design of Plastic Ashtray Technology and Rapid Prototyping Technology Based on Reverse Engineering." Advanced Materials Research 889-890 (February 2014): 9–13. http://dx.doi.org/10.4028/www.scientific.net/amr.889-890.9.

Full text
Abstract:
Reverse engineering is the process of using physical digital measuring equipment to measure the three-dimensional coordinates of points on the surface of an object accurately and rapidly, and then using 3D geometric modeling methods to reconstruct the object's CAD model from these points. With reverse engineering technology as the theoretical basis, this paper used a three-coordinate measuring machine to measure the surface data of an ashtray. After the data were processed, they were used to reconstruct a 3D entity in Pro/E software. Finally, the 3D entity of the ashtray was printed on a rapid prototyping machine, moving from physical sample to rapid manufacturing of products, shortening the production cycle and reducing production costs.
APA, Harvard, Vancouver, ISO, and other styles
46

Modak, Soumita. "A new nonparametric interpoint distance-based measure for assessment of clustering." Journal of Statistical Computation and Simulation 92, no. 5 (October 6, 2021): 1062–77. http://dx.doi.org/10.1080/00949655.2021.1984487.

Full text
APA, Harvard, Vancouver, ISO, and other styles
47

Tokarska, Magdalena. "Modeling of electro-conductive properties of woven structure based on mixing model." Communications in Development and Assembling of Textile Products 1, no. 1 (August 10, 2020): 12–19. http://dx.doi.org/10.25367/cdatp.2020.1.p12-19.

Full text
Abstract:
The main purpose of the research is to model the electro-conductive properties of the woven structure based on the mixing model. The generalized Archie's law was chosen for the analysis of the woven structure and its conductivity. The electro-conductive components of the woven structure, i.e. strips and contacts of strips, were treated as conducting phases in the structure. Based on generalized Archie’s law, the connectivity, as a measure of how the components of the whole structure are arranged, and the connectedness of a given phase, as a measure of the availability of pathways for conduction through that phase, were determined for all structures. It was found that the connectivity of the strips phase is higher than the connectivity of the contacts-of-strips phase. This means that the strips phase (in terms of quantity) has a greater effect on the conductivity of the woven structure than the contacts-of-strips phase. A decrease in the connectedness of the strips and contacts-of-strips phases (in terms of quality) can be obtained by adding another component to the woven structure, which will reduce the conductivity of the structure.
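The generalized Archie's law used here sums, over conducting phases, each phase's conductivity weighted by its volume fraction raised to a phase-specific exponent. A minimal numeric sketch, with all values illustrative rather than taken from the paper:

```python
def archie_conductivity(phases):
    """Generalized Archie's law: sigma = sum_i sigma_i * phi_i ** m_i,
    where phi_i is the volume fraction of phase i and the exponent m_i
    encodes how well that phase is connected (larger m_i, poorer
    connection, smaller contribution)."""
    return sum(sigma * phi ** m for sigma, phi, m in phases)
```

With two phases (strips and contacts of strips), lowering the exponent of the strips phase raises total conductivity faster than the same change for the contacts phase when its volume fraction is larger, consistent with the paper's observation that the strips phase dominates.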
APA, Harvard, Vancouver, ISO, and other styles
48

Penzin, Maksim, and Nikolay Ilyin. "Modeling of Doppler frequency shift in multipath radio channels." Solar-Terrestrial Physics 2, no. 2 (August 10, 2016): 66–76. http://dx.doi.org/10.12737/21000.

Full text
Abstract:
We discuss the modeling of propagation of a quasi-monochromatic radio signal, represented by a coherent pulse sequence, in a non-stationary multipath radio channel. In such a channel, signal propagation results in the observed frequency shift for each ray (Doppler effect). The modeling is based on the assumption that during propagation of a single pulse a channel can be considered stationary. A phase variation in the channel transfer function is shown to cause the observed frequency shift in the received signal. Thus, instead of measuring the Doppler frequency shift, we can measure the rate of variation in the mean phase of one pulse relative to another. The modeling is carried out within the framework of the method of normal waves. The method enables us to model the dynamics of the electromagnetic field at a given point with the required accuracy. The modeling reveals that a local change in ionospheric conditions more severely affects the rays whose reflection region is in the area where the changes occur.
APA, Harvard, Vancouver, ISO, and other styles
49

Chukanov, Sergey Nikolayevich. "Modeling the structure of a complex system based on estimation of the measure of interaction of subsystems." Computer Research and Modeling 12, no. 4 (August 2020): 707–19. http://dx.doi.org/10.20537/2076-7633-2020-12-4-707-719.

Full text
APA, Harvard, Vancouver, ISO, and other styles
50

Savchenko, V. V., and L. V. Savchenko. "Speech Signal Autoregression Modeling Based on the Discrete Fourier Transform and Scale-Invariant Measure of Information Discrimination." Journal of Communications Technology and Electronics 66, no. 11 (November 2021): 1266–73. http://dx.doi.org/10.1134/s1064226921110085.

Full text
APA, Harvard, Vancouver, ISO, and other styles
