Journal articles on the topic "Information Value Method"

To see the other types of publications on this topic, follow the link: Information Value Method.

Create a correct reference in APA, MLA, Chicago, Harvard, and other styles

Choose a source:

Consult the top 50 journal articles for your research on the topic "Information Value Method".

Next to every source in the list of references there is an "Add to bibliography" button. Click on it, and we will automatically generate the bibliographic reference to the chosen source in your preferred citation style: APA, MLA, Harvard, Vancouver, Chicago, etc.

You can also download the full text of the scholarly publication in PDF format and read its abstract online whenever this information is included in the metadata.

Browse journal articles on a wide variety of disciplines and organize your bibliography correctly.

1

Artamonov, Y., and I. Kamanin. "Analysis method of the information value of indicators". Journal of Physics: Conference Series 1084 (August 2018): 012011. http://dx.doi.org/10.1088/1742-6596/1084/1/012011.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Almansour, Abdulaziz, Stephen E. Laubach, J. Eric Bickel and Richard A. Schultz. "Value-of-Information Analysis of a Fracture Prediction Method". SPE Reservoir Evaluation & Engineering 23, no. 03 (August 1, 2020): 0811–23. http://dx.doi.org/10.2118/198906-pa.

3

Bae, Khee Su, and Kyu-Jin Lee. "The Information Usefulness of Equity Method Fair Value Disclosures". Korea International Accounting Review ll, no. 26 (June 2009): 71–94. http://dx.doi.org/10.21073/kiar.2009..26.004.

4

Matveeva, T. V. "On Method of Revealing Value Information of Conversation Dialogue". Nauchnyy dialog, no. 10 (2018): 89–101. http://dx.doi.org/10.24224/2227-1295-2018-10-89-101.

5

Thew, Sarah, and Alistair Sutcliffe. "Value-based requirements engineering: method and experience". Requirements Engineering 23, no. 4 (June 6, 2017): 443–64. http://dx.doi.org/10.1007/s00766-017-0273-y.

6

Hinchliffe, Lisa Janicke. "Collaboration: a value and a method". Research Strategies 19, no. 1 (January 2003): 1–2. http://dx.doi.org/10.1016/j.resstr.2003.09.001.

7

Pereira, Marco Antonio, Alexandre Evaristo Pinto, João Estevão Barbosa Neto and Eliseu Martins. "Deprival value: information utility analysis". Revista Contabilidade & Finanças 29, no. 76 (April 2018): 16–25. http://dx.doi.org/10.1590/1808-057x201805200.

Abstract:
This article contributes to the perception that the users' learning process plays a key role in applying an accounting concept, and that this requires a presentation that fits the concept's informative potential, free of previous accounting fixations. Deprival value is a useful measure for managerial and corporate purposes, and it may be applied to the current Conceptual Framework of the International Accounting Standards Board (IASB). This study analyzes its utility, taking into account cognitive aspects. Also known as value to the business, deprival value is a measurement system that followed a path on which it was misunderstood, confused with another concept, faced resistance to implementation, and fell into disuse; everything that a standardized measurement method tries to avoid. In contrast, deprival value has found support in academia and in specific applications, such as those related to public service regulation. The accounting area has been affected by the growing sophistication of measurement methods, which increasingly require the ability to analyze accounting facts on an economic basis, at the risk of losing their information content. This development becomes possible only when the potential of a measurement system is known and feasible to achieve. This study consists of a theoretical essay based on a literature review discussing the concept's origin, presentation, and application. Considering the concept's cognitive difficulties, deprival value and its corresponding heteronym, value to the business, were analyzed in order to explain some of these changes. The concept's utility was also explored through a cross-analysis with impairment, and the scheme developed was applied to actual economic situations faced by a company listed on a stock exchange.
8

Choi, Jung Yoon, and Jeong Whon Yu. "Estimation of VMS Traffic Information Value Using Contingent Valuation Method". Journal of The Korea Institute of Intelligent Transport Systems 12, no. 3 (June 30, 2013): 42–52. http://dx.doi.org/10.12815/kits.2013.12.3.042.

10

Tarasova, S. A. "Information Value Factor in Adaptive Time Series Forecasting". Informacionnye Tehnologii 28, no. 4 (April 14, 2022): 219–24. http://dx.doi.org/10.17587/it.28.219-224.

Abstract:
The article considers the task of forecasting time series with unstable dynamics. Adaptive forecasting methods are usually used for such tasks, but they pose the problem of choosing an adequate adaptation parameter. The purpose of the study is to construct and test an adaptive forecasting model in which the adaptation parameter is calculated from the information value. The discount factor is calculated as the ratio of the measure of the information value in the current period to the measure of the information value in the previous period. The values of the adaptation parameter and the discount factor are obtained depending on the information half-life. The effectiveness of the model is analyzed on the example of forecasting a medical-statistical indicator, the morbidity of the population. The average relative error of the forecast obtained for the proposed adaptive model is significantly less than for the linear model. The method of finding the adaptation parameter and the discount factor based on the information value can serve as an additional criterion for choosing these constants, and in some cases as a forecasting method in its own right.
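The abstract does not give the paper's exact information-value measure, but the half-life idea it mentions can be illustrated with a minimal sketch: exponential smoothing whose adaptation parameter is derived from an assumed "information half-life" h (the number of periods after which an old observation's weight is halved). The mapping alpha = 1 - 0.5 ** (1 / h) is a standard half-life-to-smoothing-constant relation, not the author's formula.

```python
def alpha_from_half_life(h: float) -> float:
    """Smoothing constant such that observation weights halve every h periods."""
    return 1.0 - 0.5 ** (1.0 / h)

def adaptive_forecast(series, h=3.0):
    """Exponential smoothing; element i is the level after observing series[i],
    i.e. the forecast for period i + 1."""
    alpha = alpha_from_half_life(h)
    level = series[0]
    forecasts = [level]
    for x in series[1:]:
        level = alpha * x + (1 - alpha) * level
        forecasts.append(level)
    return forecasts
```

With h = 1 the smoothing constant is 0.5, so each forecast is the average of the newest observation and the previous level.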
11

Mackevičius, Jonas, Daiva Raziūnienė and Romualdas Valkauskas. "The Value of Financial Analysis Information: Theoretical View". Buhalterinės apskaitos teorija ir praktika, no. 20 (January 22, 2020): 5. http://dx.doi.org/10.15388/batp.2019.13.

Abstract:
The value of financial analysis information depends on its usability, the effort involved, its duration, etc. These dimensions have not been systematically reviewed, and assessment methods have not yet been established. The purpose of the article is to examine these value dimensions and arrange a model for identifying and determining analysis methods. The research methods of this article are literature analysis, classification, specification, and generalization of information. We focus on financial analysis data and the relation between information and its value. We suggest a theoretical model that determines the dimensions of information value and the costs related to its transfer, and we propose a method for evaluating the parameters of value and how these parameters may influence its exposure.
12

Van Wegen, Bert, and Robert De Hoog. "Measuring the Economic Value of Information Systems". Journal of Information Technology 11, no. 3 (September 1996): 247–60. http://dx.doi.org/10.1177/026839629601100306.

Abstract:
The determination of the value of information or information systems is a basic issue for information management. To solve it, several questions must be answered: what is the object of valuation; how is value defined and measured; and what constitutes a coherent and usable method for valuation. In this paper an approach is outlined that combines the information commodity approach, activity-based costing, and graph modelling. The first is used to define the object of analysis (an information commodity) and the nature of value (the demand value at the marketplace). The third allows the modelling of business processes in terms of activities and cost relations between activities. The second enables the assignment of costs to the activities modelled in the graph. Together they constitute a coherent and usable method for determining the value of information systems. This is illustrated by means of a case study.
13

Potrakhov, N. N., and A. Yu Gryaznov. "Method for Assessment of Information Value of Dental X-Ray Images". Biomedical Engineering 43, no. 1 (January 2009): 17–19. http://dx.doi.org/10.1007/s10527-009-9086-8.

14

Granikov, Vera, Roland Grad, Reem El Sherif, Michael Shulha, Genevieve Chaput, Genevieve Doray, François Lagarde, Annie Rochette, David Li Tang and Pierre Pluye. "The Information Assessment Method: Over 15 years of research evaluating the value of health information". Education for Information 36, no. 1 (April 3, 2020): 7–18. http://dx.doi.org/10.3233/efi-190348.

15

Thanh, Nguyen Chi. "MCA method in information storage and retrieval". Journal of Computer Science and Cybernetics 1, no. 3 (August 6, 2015): 25–28. http://dx.doi.org/10.15625/1813-9663/1/3/6691.

16

Wang, Xiao Feng, and Hai Jian Li. "Research on Information Hiding Method Based on Word Text". Advanced Materials Research 926-930 (May 2014): 2815–18. http://dx.doi.org/10.4028/www.scientific.net/amr.926-930.2815.

Abstract:
This paper proposes an information hiding method based on Word text. According to the theory of the cone cells of the human eye and their colour sensitivity, the eye is least sensitive to blue. The method changes the lowest two bits of the R component, the lowest two bits of the G component, and the lowest four bits of the B component of a carrier text character's RGB colour value, so that each carrier character can hide 8 bits of binary information. The RGB colour value of a character's underline can hide a further 24 bits; the combination of the two methods can hide 32 bits of binary information. Test results show that the algorithm's hiding rate is high and that the algorithm is stable, easy to implement, and of practical value.
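The bit allocation described above (2 bits in R, 2 in G, 4 in B, for 8 payload bits per character) can be sketched as packing arithmetic; the Word-document handling is omitted, and the function names are illustrative, not from the paper.

```python
def embed_byte(rgb, byte):
    """Hide one byte in an (R, G, B) tuple: 2 bits in R, 2 in G, 4 in B."""
    r, g, b = rgb
    r = (r & ~0b11) | (byte >> 6)             # top 2 payload bits -> low bits of R
    g = (g & ~0b11) | ((byte >> 4) & 0b11)    # next 2 payload bits -> low bits of G
    b = (b & ~0b1111) | (byte & 0b1111)       # low 4 payload bits -> low bits of B
    return (r, g, b)

def extract_byte(rgb):
    """Recover the hidden byte from the low bits of the colour components."""
    r, g, b = rgb
    return ((r & 0b11) << 6) | ((g & 0b11) << 4) | (b & 0b1111)
```

Since only the low-order bits of each channel change, the colour shift per character stays small, which is what the blue-sensitivity argument above relies on.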
17

Mohiuddin, Syed, Elisabeth Fenwick and Katherine Payne. "Use of Value of Information in UK Health Technology Assessments". International Journal of Technology Assessment in Health Care 30, no. 6 (December 2014): 553–70. http://dx.doi.org/10.1017/s0266462314000701.

Abstract:
Objectives: The aim of this study was to identify and critically appraise the use of Value of Information (VOI) analyses undertaken as part of health technology assessment (HTA) reports in England and Wales. Methods: A systematic review of National Institute for Health Research (NIHR) funded HTA reports published between 2004 and 2013 identified the use of VOI methods and key analytical details in terms of: (i) types of VOI methodology used; (ii) parameters and key assumptions; and (iii) conclusions drawn in terms of the need for further research. Results: A total of 512 HTA reports were published during the relevant timeframe. Of these, 203 reported systematic review and economic modeling studies, and 25 of these had used VOI method(s). Over half of the twenty-five studies (n = 13) conducted both EVPI (Expected Value of Perfect Information) and EVPPI (Expected Value of Partial Perfect Information) analyses. Eight studies conducted EVPI analysis, three studies conducted EVPI, EVPPI, and EVSI (Expected Value of Sampling Information) analyses, and one study conducted EVSI analysis only. The level of detail reporting the methods used to conduct the VOI analyses varied. Conclusions: This review has shown that the frequency of the use of VOI methods is increasing at a slower pace than the published volume of HTA reports. It also suggests that analysts reporting VOI method(s) in HTA reports should describe the method(s) in sufficient detail to enable and encourage decision-makers guiding research prioritization decisions to use the potentially valuable outputs from quantitative VOI analyses.
18

Purwar, Archana, and Sandeep Kumar Singh. "DBSCANI: Noise-Resistant Method for Missing Value Imputation". Journal of Intelligent Systems 25, no. 3 (July 1, 2016): 431–40. http://dx.doi.org/10.1515/jisys-2014-0172.

Abstract:
The quality of data is an important issue in data mining: the validity of mining algorithms is reduced if the data are not of good quality. Data quality can be assessed in terms of missing values (MV) as well as the noise present in the data set. Various imputation techniques have been studied for MV, but little attention has been given to noise in earlier work. Moreover, to the best of our knowledge, no one has used density-based spatial clustering of applications with noise (DBSCAN) for MV imputation. This paper proposes a novel density-based imputation technique (DBSCANI), built on density-based clustering, to deal with incomplete values in the presence of noise. The density-based clustering algorithm proposed by Kriegel groups objects according to their density in spatial databases: the high-density regions are known as clusters, and the low-density regions contain the noise objects in the data set. Experiments were performed on the Iris data set from the life-science domain and Jain's (2D) data set from the shape data sets. The performance of the proposed method is evaluated using the root mean square error (RMSE) and compared with existing K-means imputation (KMI). Results show that the method is more noise-resistant than KMI on the data sets under study.
19

Analia Sánchez, Marisa, Antonio Carlos Gastaud Maçada and Marcela del Valle Sagardoy. "A strategy-based method of assessing information technology investments". International Journal of Managing Projects in Business 7, no. 1 (December 20, 2013): 43–60. http://dx.doi.org/10.1108/ijmpb-12-2012-0073.

Abstract:
Purpose – The purpose of the paper is to present a theoretical framework and the preliminary results of research on how to assess information technology (IT) investments so as to deliver maximum business value.
Design/methodology/approach – To see whether IT projects fit strategy, the Strategy Map provides a framework for defining the portfolio value, and data envelopment analysis (DEA) is used to measure the efficiency of project portfolios. Subsequently, an application that illustrates the value of the framework is described.
Findings – The authors offer a framework that integrates the Strategy Map and IT project portfolio management (PPM) and suggest that this conceptual framework will allow an organization to enhance the value of IT investments.
Research limitations/implications – This paper is supported by a case study using secondary data only.
Practical implications – The suggested method could help CEOs understand the interactions between projects and strategy, and thus supports decision making to prioritize and track IT investments. The paper illustrates how the proposed framework is applied. It also provides a basis for further research.
Originality/value – By explicitly linking IT investment with organizational goals, this approach produces results that differ from those of previous studies and provides a strategy-based approach to PPM.
20

Zipfel, Alexander, Daniel Herdeg and Philipp Theumer. "Method for quantifying the value of information for production control in cross-company value-adding networks". Procedia Manufacturing 54 (2021): 1–6. http://dx.doi.org/10.1016/j.promfg.2021.07.001.

21

Martínez-Santiago, Fernando, L. Alfonso Ureña-López and Maite Martín-Valdivia. "A merging strategy proposal: The 2-step retrieval status value method". Information Retrieval 9, no. 1 (January 2006): 71–93. http://dx.doi.org/10.1007/s10791-005-5722-4.

22

Eidsvik, Jo, Debarun Bhattacharjya and Tapan Mukerji. "Value of information of seismic amplitude and CSEM resistivity". Geophysics 73, no. 4 (July 2008): R59–R69. http://dx.doi.org/10.1190/1.2938084.

Abstract:
We propose a method for computing the value of information in petroleum exploration, a field in which decisions regarding seismic or electromagnetic data acquisition and processing are critical. We estimate the monetary value, in a certain context, of a seismic amplitude or electromagnetic-resistivity data set before purchasing the data. The method is novel in the way we incorporate spatial dependence to solve large-scale, real-world problems by integrating the decision-theoretical concept of value of information with rock physics and statistics. The method is based on a statistical model for saturation and porosity on a lattice along the top reservoir. Our model treats these variables as spatially correlated. The porosity and saturation are tied to the seismic and electromagnetic data via nonlinear rock-physics relations. We efficiently approximate the posterior distribution for the reservoir variables in a Bayesian model by fitting a Gaussian at the posterior mode for transformed versions of saturation and porosity. The value of information is estimated based on the prior and posterior distributions, the possible revenues from the reservoir, and the cost of drilling wells. We illustrate the method with three examples.
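The decision-theoretic concept the paper builds on can be shown with a deliberately tiny, invented example (the probabilities and payoffs below are hypothetical, not from the study): a drill/walk-away decision under two reservoir states, and a survey assumed to reveal the state perfectly. The value of information is the gain from acting after seeing the data rather than before.

```python
def expected(values, probs):
    """Expectation of a discrete payoff distribution."""
    return sum(v * p for v, p in zip(values, probs))

p_good = 0.4                                   # assumed prior P(commercial reservoir)
payoff_drill = {"good": 100.0, "bad": -30.0}   # assumed NPVs; walking away pays 0
payoff_walk = 0.0

# Value without the data: choose the single best action on prior beliefs.
v_prior = max(
    expected([payoff_drill["good"], payoff_drill["bad"]], [p_good, 1 - p_good]),
    payoff_walk,
)

# Value with perfect information: observe the state first, then act optimally.
v_perfect = expected(
    [max(payoff_drill["good"], payoff_walk), max(payoff_drill["bad"], payoff_walk)],
    [p_good, 1 - p_good],
)

voi = v_perfect - v_prior   # maximum justifiable price for the data set
```

The paper's contribution is doing this at field scale with spatially correlated reservoir variables and imperfect (rock-physics-linked) data, rather than the perfect-information toy case above.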
23

Hameed, Wafaa Mustafa, and Nzar A. Ali. "Comparison of Seventeen Missing Value Imputation Techniques". Journal of Hunan University Natural Sciences 49, no. 7 (July 30, 2022): 26–36. http://dx.doi.org/10.55463/issn.1674-2974.49.7.4.

Abstract:
Copious amounts of data are collected and stored each day, and they can be used to extract interesting patterns. However, the data we collect are ordinarily incomplete, and using them to extract information may give misleading results, so we pre-process the data to eliminate abnormalities. With a low rate of missing values, the affected instances can be ignored, but when the amount is large, ignoring them will not give the desired results. Missing fields in a dataset are a major problem faced by analysts, because they can lead to numerous issues in quantitative investigations. So, before performing any data mining procedure to extract good information from a dataset, some pre-processing of the data can be done to avoid such anomalies and thereby improve the quality of the data. Numerous methods for handling missing values have been proposed since 1980. One procedure is to disregard the records containing missing values. Another is imputation, which involves replacing the missing fields with estimates obtained by certain computations; this increases the quality of the data and improves prediction results. This paper reviews methods for handling missing data such as median imputation (MDI), hot (cold) deck imputation, regression imputation, expectation maximization (EM), support vector machine imputation (SVMI), multivariate imputation by chained equations (MICE), the SICE technique, reinforcement programming, nonparametric iterative imputation algorithms (NIIA), and multilayer perceptrons. The paper also identifies suitable methods for estimating missing values for use by other researchers in this field and aims to show which methods are commonly used now.
The overview may also provide insight into each method and its advantages and limitations for future research in this field. It can serve as a baseline for answering the questions of which techniques have been used and which is the most popular.
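The simplest technique in the list above, median imputation (MDI), can be sketched in a few lines: each missing entry of a column is replaced by the median of the observed entries. This is a generic illustration of the technique, not code from the paper.

```python
from statistics import median

def impute_median(column):
    """Replace None entries with the median of the observed values in the column."""
    observed = [x for x in column if x is not None]
    m = median(observed)
    return [m if x is None else x for x in column]
```

For example, `impute_median([1, None, 3, 5])` fills the gap with 3, the median of the observed values. The median is preferred over the mean here because it is robust to the outliers that noisy data sets contain.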
24

Pishchik, Vlada I. "Value-driven fears of modern information generations". E3S Web of Conferences 258 (2021): 07037. http://dx.doi.org/10.1051/e3sconf/202125807037.

Abstract:
Today, the problem of the formation and life of generations is becoming very relevant. Many researchers in different countries note that representatives of the information-age younger generations face an increased risk of depression, anxiety, and fears. In the study, we note manifestations of modernity: the transitivity of society, fluidity, "uncertainty", and so on. Young people accordingly face fluid socialization, a delayed transition to adulthood, and excessive parental care. There is a tendency for young people's real contacts to be replaced with virtual ones, which can increase the level of anxiety. The trends of changing values in Russia in the direction of survival, established earlier by Inglehart and Baker (2000), may have different prospects today. Some researchers show a high level of anxiety among young Russians, and the COVID-19 situation may aggravate these manifestations. Illusory correlations and false representations in young people's beliefs also increase social anxiety. These realities determined the purpose of the study: to determine the value bases of the fears of the young generations of the modern South of Russia. The sample included 150 schoolchildren of the 9th and 10th grades (born in 2004-2005); 210 students and young workers of the information generation (born in 1995-1999); and 245 working adults of Rostov-on-Don of the transition generation (born in 1965-1982). Values were measured by the S. Schwartz method, and fears were determined using the V. Pishchik method of determining values through actualized fears. The results of the study showed that the values of preservation and self-affirmation are more pronounced in the "Transition" and "Information" generations, while the values of transcendence are expressed in all the studied generations, to a greater extent in the "New" generation.
The loss of culture scares the "Transitional" generation, the loss of oneself scares the "Information" generation, and information overload scares the "New" generation. We defined the value bases of the fears of the young generations of the modern South of Russia.
25

Shang, Zhaowei, Lingfeng Zhang, Hengjun Zhao and Lan Zhang. "Image Fusion Method Based on Multi-directional Support Value Transform". International Journal of Wavelets, Multiresolution and Information Processing 10, no. 05 (September 2012): 1250049. http://dx.doi.org/10.1142/s021969131250049x.

Abstract:
Image fusion is a technique for combining information from multiple images of the same scene into one image, so that the fused image contains a more accurate description of the scene than any of the individual source images. The Support Value Transform (SVT) method is one of the leading multiscale methods for image fusion, achieving better fusion results in both visual inspection and quantitative analysis. One important assumption of the SVT method is that the salient features of an image are represented by support value; however, related studies did not give any mathematical explanation for this. Another drawback of this fusion method is that it ignores the rich directional information of the original image. In this paper, we provide a rational explanation of the physical meaning of support value by introducing the Gaussian curvature of the image. Also, by combining SVT with a Directional Filter Bank (DFB), a multi-directional SVT image fusion method is proposed. Our method can capture different and flexible directional information, which may help obtain the intrinsic geometrical structure of the original image. Experimental results on several pairs of multi-focus images show that the proposed method achieves better results than SVT and other commonly used traditional methods.
26

Karandikar, Jaydeep, and Thomas Kurfess. "Value of information method for optimization and experimental design using surrogate models". Manufacturing Letters 2, no. 4 (October 2014): 108–11. http://dx.doi.org/10.1016/j.mfglet.2014.07.003.

27

Sakao, Tomohiko, and Mattias Lindahl. "A value based evaluation method for Product/Service System using design information". CIRP Annals 61, no. 1 (2012): 51–54. http://dx.doi.org/10.1016/j.cirp.2012.03.108.

28

Nguyen, DuyQuang, and Miguel J. Bagajewicz. "New sensor network design and retrofit method based on value of information". AIChE Journal 57, no. 8 (November 2, 2010): 2136–48. http://dx.doi.org/10.1002/aic.12440.

29

Smolyak, S. A. "New method of liquidation value estimation". Journal of the New Economic Association 55, no. 3 (2022): 12–27. http://dx.doi.org/10.31737/2221-2264-2022-55-3-1.

Abstract:
Valuation standards define the liquidation value of an asset as its value within a shortened (as compared to typical) exposure/sale period. However, such timings (even when assets are sold at market value) are usually random, and the "more"/"less" relations are not applicable. We treat the liquidation value of an asset as its value in a forced sale with proper marketing and a deterministic limit on the exposure period. We propose a model for determining the liquidation value that optimizes the seller's marketing policy according to the criterion of expected discounted benefits. This model takes into account the probabilistic nature of demand for similar assets and the dependence of this demand on price (information on the price elasticity of demand is not required). The formulas obtained also allow taking into account inflation, the salvage value of the asset, its depreciation during the exposure period, the need to incur selling expenses during the exposure period, and the possibility of obtaining additional income from the use of the asset in this period. The dependences of the asset's liquidation value on the remaining exposure period, calculated using the model, differ significantly from those recommended in the valuation literature.
30

Iwata, T., K. Saito and T. Yamada. "Recommendation Method for Improving Customer Lifetime Value". IEEE Transactions on Knowledge and Data Engineering 20, no. 9 (September 2008): 1254–63. http://dx.doi.org/10.1109/tkde.2008.55.

31

He, Jincong, Pallav Sarma, Eric Bhark, Shusei Tanaka, Bailian Chen, Xian-Huan Wen and Jairam Kamath. "Quantifying Expected Uncertainty Reduction and Value of Information Using Ensemble-Variance Analysis". SPE Journal 23, no. 02 (January 9, 2018): 428–48. http://dx.doi.org/10.2118/182609-pa.

Abstract:
Data-acquisition programs, such as surveillance and pilots, play an important role in minimizing subsurface risks and improving decision quality for reservoir management. For design optimization and investment justification of these programs, it is crucial to be able to quantify the expected uncertainty reduction and the value of information (VOI) attainable from a given design. This problem is challenging because the data from the acquisition program are uncertain at the time of the analysis. In this paper, a method called ensemble-variance analysis (EVA) is proposed. Derived from a multivariate Gaussian assumption between the observation data and the objective function, the EVA method quantifies the expected uncertainty reduction from covariance information that is estimated from an ensemble of simulations. The result of EVA can then be used with a decision tree to quantify the VOI of a given data-acquisition program. The proposed method has several novel features compared with existing methods. First, the EVA method directly considers the data/objective-function relationship; therefore, it can handle nonlinear forward models and an arbitrary number of parameters. Second, for cases when the multivariate Gaussian assumption between the data and objective function does not hold, the EVA method still provides a lower bound on expected uncertainty reduction, which can be useful in providing a conservative estimate of surveillance/pilot performance. Finally, EVA also provides an estimate of the shift in the mean of the objective-function distribution, which is crucial for VOI calculation. In this paper, the EVA workflow for expected-uncertainty-reduction quantification is described. The result from EVA is benchmarked against recently proposed rigorous sampling methods, and the capacity of the method for VOI quantification is demonstrated for a pilot-analysis problem using a field-scale reservoir model.
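The core ensemble-variance idea can be sketched in its simplest scalar form: under a joint-Gaussian assumption between one observation d and the objective J, the expected posterior variance of J is Var(J) - Cov(d, J)^2 / Var(d), with the covariances estimated from an ensemble of simulations. This is a one-dimensional illustration of the principle, not the paper's full multivariate workflow.

```python
def mean(xs):
    return sum(xs) / len(xs)

def cov(xs, ys):
    """Sample covariance estimated from paired ensemble members."""
    mx, my = mean(xs), mean(ys)
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (len(xs) - 1)

def expected_variance_reduction(d_ens, j_ens):
    """Expected reduction in Var(J) from observing d, in the scalar Gaussian case:
    Cov(d, J)^2 / Var(d), with moments estimated from the ensemble."""
    return cov(d_ens, j_ens) ** 2 / cov(d_ens, d_ens)
```

In the limiting case where the simulated data are perfectly correlated with the objective, the formula returns the whole of Var(J): observing d would then remove all uncertainty in J.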
32

Zhang, Chang Qing. "An Improved Close Value Method for Multi-Sensor Object Recognition". Applied Mechanics and Materials 707 (December 2014): 487–90. http://dx.doi.org/10.4028/www.scientific.net/amm.707.487.

Abstract:
The multi-sensor information fusion problem involves many characteristic indexes, and it can thus be resolved using a multi-attribute decision-making method. Information entropy is used to determine the attribute weights objectively, which overcomes subjective randomness. The aim of this paper is to develop a new multi-sensor object recognition method based on the close value method. An example of part recognition shows that the proposed method is both feasible and effective.
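The entropy-weighting step mentioned above is a standard technique that can be sketched generically: attributes whose values differ more across alternatives carry more information and therefore receive larger weights. This is the textbook entropy-weight method, not the paper's specific variant.

```python
from math import log

def entropy_weights(matrix):
    """Entropy weights for a decision matrix (rows = alternatives, cols = attributes).
    Assumes all entries are positive."""
    m, n = len(matrix), len(matrix[0])
    raw = []
    for j in range(n):
        col = [row[j] for row in matrix]
        total = sum(col)
        p = [x / total for x in col]
        # Normalized Shannon entropy of the column, in [0, 1].
        e = -sum(pi * log(pi) for pi in p if pi > 0) / log(m)
        raw.append(1.0 - e)   # lower entropy -> more discriminating attribute
    s = sum(raw)
    return [w / s for w in raw]
```

An attribute that takes the same value for every alternative has maximal entropy and receives zero weight, since it cannot help discriminate between objects.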
33

Cui, ChunSheng, and YanLi Cao. "Vague multi-attribute group decision-making method based on evidence theory with a new aspect to solve weights". Journal of Intelligent & Fuzzy Systems 42, no. 4 (March 4, 2022): 3737–47. http://dx.doi.org/10.3233/jifs-211937.

Abstract:
To solve the problems of weight solving and information aggregation in Vague multi-attribute group decision-making, this paper first solves for the weight of each Vague evaluation value and then fuses the information of Vague sets through evidence theory, obtaining an information aggregation algorithm for Vague multi-attribute group decision-making. First, the algorithm draws on the idea of solving evidence weights in the improved evidence theory algorithm to calculate the weight of each Vague evaluation value, and revises the original evaluation information after obtaining these weights. Second, the algorithm analyzes the mathematical relationship between Vague sets and evidence theory, and uses evidence theory to fuse the evaluation information to obtain the final Vague evaluation value of each alternative. Finally, the algorithm uses a score function to calculate the score of each alternative and determine the best one. The algorithm given in the paper enables decision-makers to make rational decisions in uncertain environments and then select the best alternative.
Styles APA, Harvard, Vancouver, ISO, etc.
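The evidence-theory fusion step can be illustrated with the classical Dempster rule of combination. The paper modifies how the masses are weighted before fusion; the sketch below is only the unmodified rule, and the frame of discernment {'A', 'B'} is invented for the example.

```python
def dempster_combine(m1, m2):
    """Dempster's rule: combine two basic probability assignments
    (mass functions keyed by frozensets of hypotheses), renormalizing
    by the total conflict between the two sources."""
    combined = {}
    conflict = 0.0
    for focal1, mass1 in m1.items():
        for focal2, mass2 in m2.items():
            inter = focal1 & focal2
            if inter:
                combined[inter] = combined.get(inter, 0.0) + mass1 * mass2
            else:
                conflict += mass1 * mass2   # mass falling on the empty set
    if conflict >= 1.0:
        raise ValueError("sources are in total conflict")
    return {focal: mass / (1.0 - conflict) for focal, mass in combined.items()}
```

Combining {A: 0.6, {A,B}: 0.4} with {A: 0.7, {A,B}: 0.3} concentrates mass on A, which is the kind of consensus-forming behavior the abstract's aggregation algorithm builds on.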
34

Buckley, James J., Thomas Feuring and Yoichi Hayashi. "Fuzzy Difference Equations: The Initial Value Problem". Journal of Advanced Computational Intelligence and Intelligent Informatics 5, no. 6 (November 20, 2001): 315–25. http://dx.doi.org/10.20965/jaciii.2001.p0315.

Full text
Abstract:
In this paper we study fuzzy solutions to the second order, linear, difference equation with constant coefficients but having fuzzy initial conditions. We look at two methods of solution: (1) in the first method we fuzzify the crisp solution and then check to see if it solves the difference equation; and (2) in the second method we first solve the fuzzy difference equation and then check to see if the solution defines a fuzzy number. Relationships between these two solution methods are also presented. Two applications are given: (1) the first is about a second order difference equation, having fuzzy initial conditions, modeling national income; and (2) the second is from information theory modeling the transmission of information.
APA, Harvard, Vancouver, ISO, etc. styles
35

Town, Stephen. "The value of people". Performance Measurement and Metrics 15, no. 1/2 (July 8, 2014): 67–80. http://dx.doi.org/10.1108/pmm-05-2014-0019.

Full text
Abstract:
Purpose – The purpose of this paper is to reflect on advances in the understanding and practice of people evaluation in libraries. The paper is conceptual and offers a framework for human capital evaluation. Design/methodology/approach – The research approach has been to employ a mixed method research strategy (multi-methodology), combining desk research exploring quantitative capital assessment methods from other industries, sectors and libraries; phenomenological observation of existing data collection and development concepts; and survey data from staff in case studies of the author's own and other organizations. Findings – The synthesis suggests the measures required to populate the library capital dimension of the value scorecard, thereby providing an estimation of the value of a library's human capital. Originality/value – The paper fills a gap through a broad survey of advances in people assessment in libraries, and provides a unique framework for human capital measurement in libraries.
APA, Harvard, Vancouver, ISO, etc. styles
36

Heath, Anna, Ioanna Manolopoulou and Gianluca Baio. "A Review of Methods for Analysis of the Expected Value of Information". Medical Decision Making 37, no. 7 (April 14, 2017): 747–58. http://dx.doi.org/10.1177/0272989x17697692.

Full text
Abstract:
In recent years, value-of-information analysis has become more widespread in health economic evaluations, specifically as a tool to guide further research and perform probabilistic sensitivity analysis. This is partly due to methodological advancements allowing fast computation of a typical summary known as the expected value of partial perfect information (EVPPI). A recent review discussed some approximation methods for calculating the EVPPI, but as research has been active over the intervening years, that review does not cover some key estimation methods. This paper therefore presents a comprehensive review of these new methods. We begin by providing their technical details and then present two case studies to compare their estimation performance. We conclude that a method based on nonparametric regression offers the best approach to calculating the EVPPI in terms of accuracy, computational time, and ease of implementation. This means that the EVPPI can now be used practically in health economic evaluations, especially as all the methods are developed in parallel with R functions and a web app to aid practitioners.
APA, Harvard, Vancouver, ISO, etc. styles
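The regression-based EVPPI estimator the review favors can be sketched in a simplified single-parameter form: regress each option's simulated net benefit on the parameter of interest, then compare the expected maximum of the fitted conditional means with the maximum expected net benefit. Plain polynomial least squares stands in here for the GAM/GP smoothers of the literature, and all names are illustrative.

```python
import numpy as np

def evppi_regression(theta, nb, degree=3):
    """Regression-based EVPPI: the value of learning theta exactly is the
    expected maximum of the fitted conditional mean net benefits minus the
    maximum of the unconditional expected net benefits."""
    theta = np.asarray(theta, dtype=float)
    nb = np.asarray(nb, dtype=float)          # shape: (n_simulations, n_options)
    fitted = np.column_stack([
        np.polyval(np.polyfit(theta, nb[:, d], degree), theta)
        for d in range(nb.shape[1])
    ])
    return float(fitted.max(axis=1).mean() - nb.mean(axis=0).max())
```

For two options with net benefits theta and 1 - theta and theta uniform on [0, 1], the true EVPPI is 0.25, which the sketch recovers.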
37

Krasnoselsky, M. V., T. M. Popovskaya and L. G. Raskin. "Assessment of information value of metabolic indicators in patients with colorectal cancer". Klinical Informatics and Telemedicine 15, no. 16 (December 7, 2020): 79–87. http://dx.doi.org/10.31071/kit2020.16.04.

Full text
Abstract:
Introduction. The problem of assessing the information value of indicators of patients' condition is of general medical importance, given the fundamental role of clinical examination results in making a diagnosis and choosing adequate treatment tactics. The research is aimed at finding effective methods for assessing the information content of controlled indicators. Materials and methods. We examined 32 patients diagnosed with colorectal cancer. Metabolic disorders were studied on the eve of surgery and on the 14th day after surgery. To assess carbohydrate metabolism, the content of glucose (GLUCGOD) and lactate (LACT) in blood serum was studied. To assess lipid metabolism, total cholesterol (CHOL), alpha-lipoproteins (HDLC, high-density lipoproteins), beta-lipoproteins (LDL, low-density lipoproteins) and triglycerides (TRIG) were studied. The levels of the following amino acids were determined: methionine, cysteine, taurine, phenylalanine, tyrosine, tryptophan, glutamate, glutamine, citrulline, aspartate, asparagine, arginine, ornithine, alanine, leucine, isoleucine, valine, histidine, threonine, lysine, gydroxine, serine. Correlations between the indicators were calculated. Results. Given the known shortcomings of the widely used approach of assessing the information content of indicators by calculating the Kullback measure, a search for alternative methods satisfying the requirements formulated in the work was carried out. The proposed method is based on a special procedure for statistical processing of the measurement results of a set of controlled indicators before and after the operation. A simple analytical relationship has been obtained that effectively detects differences in the statistical distributions of the values of the controlled indicators that appear in connection with the operation. In addition, a method for assessing the informativeness of indicators on a small sample of initial data is proposed, based on identifying the dynamics of correlations between indicators as a result of surgery. Conclusion. Effective methods for assessing the informativeness of controlled indicators are proposed, which reveal differences in the statistical distributions of indicator values that appear in connection with the operation. Key words: colorectal cancer; measures for assessing the information value of indicators; small sample of initial data.
APA, Harvard, Vancouver, ISO, etc. styles
38

Ekincioğlu, Caner, and Semra Boran. "SMED methodology based on fuzzy Taguchi method". Journal of Enterprise Information Management 31, no. 6 (October 8, 2018): 867–78. http://dx.doi.org/10.1108/jeim-01-2017-0019.

Full text
Abstract:
Purpose Some activities' setup times cannot be reduced with conventional single-minute exchange of die (SMED) tools; more advanced tools are then needed. The purpose of this paper is to integrate the fuzzy Taguchi method into the SMED method in order to improve setup time. Fuzzy logic is used because experts' assessment of factor levels is subjective, and subjective assessment carries a degree of uncertainty and vagueness. The fuzzy Taguchi method makes it possible to determine optimal setup time parameters within an SMED activity, so setup time can be reduced further than with the conventional SMED method. Design/methodology/approach In this study, the SMED method and the fuzzy Taguchi method are used. Findings The study shows that the setup time is reduced (from 196 to 75 min) and that the fuzzy Taguchi method can place the optimum at an intermediate factor level. Originality/value In the authors' literature search, no prior study using the fuzzy Taguchi method within the SMED method was found.
APA, Harvard, Vancouver, ISO, etc. styles
39

Xia, Xin Tao, Lei Lei Gao and Jian Feng Chen. "Fusion Method for True Value Estimation of Manufacturing Quality under Condition of Poor Information (Part I: Theory)". Applied Mechanics and Materials 34-35 (October 2010): 157–61. http://dx.doi.org/10.4028/www.scientific.net/amm.34-35.157.

Full text
Abstract:
Poor information means incomplete and insufficient information, such as a small sample with unknown distribution. For point estimation under the condition of poor information, statistical methods that rely on large samples and known distributions may become ineffective. To this end, a fusion method is proposed. The fusion method develops five methods, three concepts, and one rule. The five methods are the rolling mean method, the membership function method, the maximum membership grade method, the moving bootstrap method, and the arithmetic mean method. The three concepts are the solution set on the estimated true value, the fusion series, and the final estimated true value. The rule is the range rule. The proposed method provides a foundation for true value estimation of manufacturing quality under the condition of poor information.
APA, Harvard, Vancouver, ISO, etc. styles
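One of the ingredients named above, the bootstrap, can be sketched for small-sample true value estimation. This is an ordinary percentile bootstrap, not the paper's "moving bootstrap"; the sample values and function name are invented for illustration.

```python
import random

def bootstrap_estimate(sample, n_resamples=2000, seed=42):
    """Percentile bootstrap: resample the small sample with replacement,
    average each resample, and report the mean of those averages as the
    estimated true value together with a 95% interval."""
    rng = random.Random(seed)
    means = sorted(
        sum(rng.choice(sample) for _ in sample) / len(sample)
        for _ in range(n_resamples)
    )
    estimate = sum(means) / n_resamples
    return estimate, (means[int(0.025 * n_resamples)],
                      means[int(0.975 * n_resamples)])
```

Resampling sidesteps any distributional assumption, which is why bootstrap-type procedures suit the "unknown distribution, small sample" setting the abstract describes.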
40

Ohshima, Shigeru, Mayuko Ieda, Midori Yamamoto and Daisuke Kobayashi. "Quantitative Evaluation Method of Information Value of Initial Symptoms Based on Bayesian Theory". YAKUGAKU ZASSHI 132, no. 6 (June 1, 2012): 763–68. http://dx.doi.org/10.1248/yakushi.132.763.

Full text
APA, Harvard, Vancouver, ISO, etc. styles
41

Busert, Timo, and Alexander Fay. "Extended Value Stream Mapping Method for Information Based Improvement of Production Logistics Processes". IEEE Engineering Management Review 47, no. 4 (December 1, 2019): 119–27. http://dx.doi.org/10.1109/emr.2019.2934953.

Full text
APA, Harvard, Vancouver, ISO, etc. styles
42

Sarkar, Shraban, Archana K. Roy and Tapas R. Martha. "Landslide susceptibility assessment using Information Value Method in parts of the Darjeeling Himalayas". Journal of the Geological Society of India 82, no. 4 (October 2013): 351–62. http://dx.doi.org/10.1007/s12594-013-0162-z.

Full text
APA, Harvard, Vancouver, ISO, etc. styles
43

Strong, Mark, and Jeremy E. Oakley. "An Efficient Method for Computing Single-Parameter Partial Expected Value of Perfect Information". Medical Decision Making 33, no. 6 (December 28, 2012): 755–66. http://dx.doi.org/10.1177/0272989x12465123.

Full text
APA, Harvard, Vancouver, ISO, etc. styles
44

Liang, Kaixuan, Ming Zhao, Jing Lin and Jinyang Jiao. "An information-based K-singular-value decomposition method for rolling element bearing diagnosis". ISA Transactions 96 (January 2020): 444–56. http://dx.doi.org/10.1016/j.isatra.2019.06.012.

Full text
APA, Harvard, Vancouver, ISO, etc. styles
45

Ma, Jingru, Xiaodong Wang and Guangxiang Yuan. "Evaluation of Geological Hazard Susceptibility Based on the Regional Division Information Value Method". ISPRS International Journal of Geo-Information 12, no. 1 (January 10, 2023): 17. http://dx.doi.org/10.3390/ijgi12010017.

Full text
Abstract:
Traditional susceptibility evaluation of geological hazards usually performs a single global evaluation of the entire study area and ignores differences between local areas caused by spatial non-stationarity. In view of this, the geographically weighted regression (GWR) model was used to divide the study area at regional scale, yielding seven local areas with low spatial auto-correlation of each evaluation factor. Eleven evaluation factors, including aspect, elevation, curvature, ground roughness, relief amplitude, slope, lithology, distance from the fault, height of the cut slope, multiyear average rainfall and the normalized difference vegetation index (NDVI), were selected to establish the evaluation index system of geological hazard susceptibility. The Pearson coefficient was used to remove highly correlated evaluation factors. The global area and the seven local areas were evaluated for susceptibility using the information value model, producing global and regional-division susceptibility evaluation results. The results show that the regional-division information value model had better prediction performance (AUC = 0.893) and better accuracy. This model adequately considers the influence of the geological hazard impact factors in the different local areas on susceptibility and weakens the influence of factors that weigh heavily in the global model but little in local areas. Therefore, the regional-division information value model is more consistent with the actual situation in the study area and more suitable for guiding risk management, hazard prevention and mitigation.
APA, Harvard, Vancouver, ISO, etc. styles
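The information value model used above assigns each factor class the log ratio of the hazard density inside the class to the overall hazard density; a cell's susceptibility score is the sum of these values over all factors. A minimal sketch, with invented cell counts:

```python
import math

def class_information_value(class_cells, class_hazard_cells,
                            total_cells, total_hazard_cells):
    """Information value of one factor class: ln of the ratio between the
    hazard density within the class and the overall hazard density.
    Positive values mark classes more hazard-prone than average."""
    class_density = class_hazard_cells / class_cells
    overall_density = total_hazard_cells / total_cells
    return math.log(class_density / overall_density)
```

A class whose hazard density equals the study-area average scores 0; a class twice as dense scores ln 2, so summing the class scores a cell falls into ranks cells by relative susceptibility.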
46

Ye, Jun. "Single-Valued Neutrosophic Minimum Spanning Tree and Its Clustering Method". Journal of Intelligent Systems 23, no. 3 (September 1, 2014): 311–24. http://dx.doi.org/10.1515/jisys-2013-0075.

Full text
Abstract:
Clustering plays an important role in data mining, pattern recognition, and machine learning. Single-valued neutrosophic sets (SVNSs) are a useful means to describe and handle indeterminate and inconsistent information, which fuzzy sets and intuitionistic fuzzy sets cannot describe and deal with. To cluster data represented by single-valued neutrosophic information, the article proposes a single-valued neutrosophic minimum spanning tree (SVNMST) clustering algorithm. First, we define a generalized distance measure between SVNSs. Then, we present an SVNMST clustering algorithm for clustering single-valued neutrosophic data based on that generalized distance measure. Finally, two illustrative examples demonstrate the application and effectiveness of the developed approach.
APA, Harvard, Vancouver, ISO, etc. styles
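The MST clustering idea can be sketched with an ordinary metric in place of the paper's generalized SVNS distance: build a minimum spanning tree (here with Prim's algorithm), delete the k-1 longest edges, and take the remaining connected components as clusters. Function names and the toy data are illustrative.

```python
def mst_clusters(points, dist, k):
    """Minimum-spanning-tree clustering: build the MST with Prim's
    algorithm, delete the k-1 longest edges, and return component labels."""
    n = len(points)
    in_tree = [False] * n
    best = [float('inf')] * n    # cheapest connection cost to the tree
    parent = [-1] * n
    best[0] = 0.0
    edges = []
    for _ in range(n):
        u = min((i for i in range(n) if not in_tree[i]), key=lambda i: best[i])
        in_tree[u] = True
        if parent[u] >= 0:
            edges.append((best[u], parent[u], u))
        for v in range(n):
            if not in_tree[v]:
                d = dist(points[u], points[v])
                if d < best[v]:
                    best[v], parent[v] = d, u
    edges.sort()                  # keep the n-k shortest MST edges
    root = list(range(n))
    def find(x):                  # union-find with path halving
        while root[x] != x:
            root[x] = root[root[x]]
            x = root[x]
        return x
    for _, a, b in edges[:n - k]:
        root[find(a)] = find(b)
    labels = {}
    return [labels.setdefault(find(i), len(labels)) for i in range(n)]
```

Cutting the longest MST edges separates groups across the largest gaps, which is the same mechanism the SVNMST algorithm applies to neutrosophic distances.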
47

KAYSER, VICTORIA, KERSTIN GOLUCHOWICZ and ANTJE BIERWISCH. "TEXT MINING FOR TECHNOLOGY ROADMAPPING — THE STRATEGIC VALUE OF INFORMATION". International Journal of Innovation Management 18, no. 03 (May 19, 2014): 1440004. http://dx.doi.org/10.1142/s1363919614400040.

Full text
Abstract:
Technology roadmapping is a well-established method used in strategy development to map alternative future paths, while text mining offers untapped potential for early detection and environmental scanning. In this paper, the roadmapping process is split into steps in order to analyse which text mining methods could add value within each. This leads to a two-layered process model that includes text mining techniques to systematically integrate external information into ongoing roadmapping processes. Textual data can be used for a structured analysis and exploration of thematic fields and for an objective, quantitative summary of actual developments. To demonstrate some of the benefits, the field of "cloud computing" is used to illustrate the procedure. As this article shows, the results provided by this approach extend the existing methodology, integrate an external view and complement expert opinion.
APA, Harvard, Vancouver, ISO, etc. styles
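A basic text mining building block for this kind of scanning is TF-IDF term scoring, which surfaces terms concentrated in few documents. The sketch below is a toy illustration only; the paper's actual pipeline is not specified here, and the example documents are invented.

```python
import math
from collections import Counter

def tfidf(docs):
    """Score each term in each document by term frequency times inverse
    document frequency: terms appearing in every document score zero,
    while terms specific to few documents stand out."""
    tokenized = [doc.lower().split() for doc in docs]
    df = Counter()                       # document frequency per term
    for tokens in tokenized:
        df.update(set(tokens))
    n = len(docs)
    return [
        {term: (count / len(tokens)) * math.log(n / df[term])
         for term, count in Counter(tokens).items()}
        for tokens in tokenized
    ]
```

In a corpus where every abstract mentions "cloud", that term scores zero while a term unique to one abstract scores highest, which is how emerging topics can be flagged for a roadmap.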
48

Gong, Siqi, Jiantao Lu, Shunming Li, Huijie Ma, Yanfeng Wang and Guangrong Teng. "Two-Channel Information Fusion Weak Signal Detection Based on Correntropy Method". Applied Sciences 12, no. 3 (January 28, 2022): 1414. http://dx.doi.org/10.3390/app12031414.

Full text
Abstract:
In recent years, singular value decomposition (SVD) has attracted wide attention and application as a simple and effective method of noise reduction. SVD denoising mainly removes singular components (SCs) with small singular values (SVs), which ignores weak signals buried in strong noise. To extract weak signals from strong noise, this paper proposes selecting SCs by the correntropy-induced metric (CIM); the frequency components of the characteristic signals can then be found through the cyclic correntropy spectrum (CCES), an extension of correntropy (CE). The proposed method first merges the signals collected by the two channels, second uses principal components analysis (PCA) to reduce dimensionality, third decomposes the signal by singular value decomposition, fourth calculates the CIM value to select the singular components used for reconstruction, and finally uses the cyclic correntropy spectrum to display the characteristics of the reconstructed signal. The experimental results show that the proposed method extracts features effectively.
APA, Harvard, Vancouver, ISO, etc. styles
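The SVD denoising baseline the abstract starts from can be sketched via Hankel embedding and rank truncation. This keeps the largest singular components, the conventional criterion the paper argues against; the correntropy-based component selection itself is not reproduced here.

```python
import numpy as np

def svd_denoise(signal, window, keep):
    """Embed the 1-D signal in a Hankel matrix, keep only the `keep`
    largest singular components, and reconstruct the signal by averaging
    the anti-diagonals of the low-rank matrix."""
    n = len(signal)
    rows = n - window + 1
    hankel = np.array([signal[i:i + window] for i in range(rows)])
    u, s, vt = np.linalg.svd(hankel, full_matrices=False)
    low_rank = (u[:, :keep] * s[:keep]) @ vt[:keep]
    out = np.zeros(n)
    counts = np.zeros(n)
    for i in range(rows):          # anti-diagonal averaging
        out[i:i + window] += low_rank[i]
        counts[i:i + window] += 1.0
    return out / counts
```

A pure sinusoid has rank 2 in a Hankel embedding, so `keep=2` recovers it from additive noise; the paper's point is that for weak signals the right components are not always the largest ones.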
49

Shao, Hong Xiang, and Xiao Ming Duan. "Video Vehicle Detection Method Based on Multiple Color Space Information Fusion". Advanced Materials Research 546-547 (July 2012): 721–26. http://dx.doi.org/10.4028/www.scientific.net/amr.546-547.721.

Full text
Abstract:
To improve video vehicle detection accuracy, a detection method is proposed that selectively fuses the nine detection results from the RGB, YCbCr and HSI color spaces, exploiting the relative independence and complementarity of the color space components. The method fuses three of the nine component detection results according to the value of H when both S and I are high, and fuses another three when both S and I are low. Experiments show that, compared with the traditional method that uses only the brightness component, the method improves detection substantially, greatly reduces missed detections of vehicles, and increases the accuracy of traffic information data.
APA, Harvard, Vancouver, ISO, etc. styles
50

Řezáč, Martin. "Advanced empirical estimate of information value for credit scoring models". Acta Universitatis Agriculturae et Silviculturae Mendelianae Brunensis 59, no. 2 (2011): 267–74. http://dx.doi.org/10.11118/actaun201159020267.

Full text
Abstract:
Credit scoring is a term for a wide spectrum of predictive models, and their underlying techniques, that aid financial institutions in granting credit. These methods decide who will get credit, how much credit they should get, and what further strategies will enhance the profitability of the borrowers to the lenders. Many statistical tools are available for measuring the quality, in the sense of predictive power, of credit scoring models. Because it is impossible to use a scoring model effectively without knowing how good it is, quality indexes such as Gini, the Kolmogorov-Smirnov statistic and the Information value are used to assess the quality of a given credit scoring model. The paper deals primarily with the Information value, sometimes called divergence. Commonly it is computed by discretising the data into bins using deciles, in which case one constraint must be met: the number of cases has to be nonzero for all bins. If this constraint is not fulfilled, there are practical procedures for preserving finite results. As an alternative to the empirical estimates, one can use kernel smoothing theory, which allows estimating the unknown densities and consequently, using some numerical method for integration, the Information value itself. The main contribution of this paper is a proposal and description of the empirical estimate with supervised interval selection. This advanced estimate is based on the requirement to have at least k observations, where k is a positive integer, of scores of both good and bad clients in each considered interval. A simulation study shows that this estimate outperforms both the empirical estimate using deciles and the kernel estimate, and that it depends strongly on the choice of the parameter k: too small a value overestimates the Information value, and vice versa. An adjusted square root of the number of bad clients seems to be a reasonable compromise.
APA, Harvard, Vancouver, ISO, etc. styles
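The binned empirical estimate the paper improves on can be sketched directly; the bin counts below are illustrative, and the empty-bin error is exactly the nonzero-bin constraint that the supervised interval selection is designed to satisfy.

```python
import math

def information_value(good_counts, bad_counts):
    """Empirical Information value over score bins:
    IV = sum_i (g_i - b_i) * ln(g_i / b_i), where g_i and b_i are the
    shares of good and bad clients in bin i. Every bin must contain at
    least one client of each class for the estimate to stay finite."""
    g_total, b_total = sum(good_counts), sum(bad_counts)
    iv = 0.0
    for good, bad in zip(good_counts, bad_counts):
        if good == 0 or bad == 0:
            raise ValueError("empty bin: Information value is not finite")
        g_share, b_share = good / g_total, bad / b_total
        iv += (g_share - b_share) * math.log(g_share / b_share)  # WoE term
    return iv
```

Identical good/bad distributions give IV = 0, and the better the score separates the classes, the larger the IV, which is why the index serves as a predictive power measure.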