Journal articles on the topic 'Uncertainty (Information theory)'

To see the other types of publications on this topic, follow the link: Uncertainty (Information theory).

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the top 50 journal articles for your research on the topic 'Uncertainty (Information theory).'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Browse journal articles on a wide variety of disciplines and organise your bibliography correctly.

1

Egghe, Leo. "Uncertainty and information: Foundations of generalized information theory." Journal of the American Society for Information Science and Technology 58, no. 5 (2007): 756–58. http://dx.doi.org/10.1002/asi.20519.

2

Shokry, M., and Manar Omran. "The Information System by Uncertainty Theory." Journal of Engineering Research 3, no. 12 (December 1, 2019): 35–40. http://dx.doi.org/10.21608/erjeng.2019.125749.

3

Tang, Yongchuan, Yong Chen, and Deyun Zhou. "Measuring Uncertainty in the Negation Evidence for Multi-Source Information Fusion." Entropy 24, no. 11 (November 2, 2022): 1596. http://dx.doi.org/10.3390/e24111596.

Abstract:
Dempster–Shafer evidence theory is widely used in modeling and reasoning uncertain information in real applications. Recently, a new perspective of modeling uncertain information with the negation of evidence was proposed and has attracted a lot of attention. Both the basic probability assignment (BPA) and the negation of BPA in the evidence theory framework can model and reason uncertain information. However, how to address the uncertainty in the negation information modeled as the negation of BPA is still an open issue. Inspired by the uncertainty measures in Dempster–Shafer evidence theory, a method of measuring the uncertainty in the negation evidence is proposed. The belief entropy named Deng entropy, which has attracted a lot of attention among researchers, is adopted and improved for measuring the uncertainty of negation evidence. The proposed measure is defined based on the negation function of BPA and can quantify the uncertainty of the negation evidence. In addition, an improved method of multi-source information fusion considering uncertainty quantification in the negation evidence with the new measure is proposed. Experimental results on a numerical example and a fault diagnosis problem verify the rationality and effectiveness of the proposed method in measuring and fusing uncertain information.
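For readers who want to experiment with the quantities discussed in this abstract, the sketch below computes Deng entropy for a basic probability assignment (BPA) and for one commonly used negation of that BPA. The negation operator shown (redistributing the complement of each mass uniformly over the other focal elements) is an assumption for illustration; the paper's exact negation function and its improved measure may differ.

```python
import math

def deng_entropy(bpa):
    """Deng entropy: -sum over focal elements of m(A) * log2(m(A) / (2^|A| - 1))."""
    return -sum(mass * math.log2(mass / (2 ** len(focal) - 1))
                for focal, mass in bpa.items() if mass > 0)

def negation_of_bpa(bpa):
    """One common negation operator (illustrative, definitions vary):
    m_neg(A) = (1 - m(A)) / (n - 1), where n is the number of focal elements."""
    n = len(bpa)
    return {focal: (1.0 - mass) / (n - 1) for focal, mass in bpa.items()}

# A BPA over the frame {a, b, c}; focal elements are frozensets of hypotheses.
m = {frozenset('a'): 0.6, frozenset('ab'): 0.3, frozenset('abc'): 0.1}
print(deng_entropy(m))                    # uncertainty of the original evidence
print(deng_entropy(negation_of_bpa(m)))   # uncertainty of the negation evidence
```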
4

Yan, Xi-zu, and Zhong-min Song. "The portfolio models of contained grey profit under uncertainty." Grey Systems: Theory and Application 4, no. 3 (October 28, 2014): 487–94. http://dx.doi.org/10.1108/gs-09-2014-0035.

Abstract:
Purpose – The purpose of this paper is to establish portfolio models of contained grey profit under uncertainty and to apply the results to uncertain investment problems. Design/methodology/approach – In investment problems, uncertainties may exist in model parameters and input data. For investment problems that contain grey profit and incomplete information about the state of nature, the paper puts forward portfolio models and methods based on portfolio theory, grey systems theory and uncertainty decision theory. Findings – Traditional uncertainty decision making is studied under incomplete information about the state of nature; in real investment problems, however, not only the state information but also the income is uncertain. Practical implications – Because investment problems are widely used in economic analysis, decision analysis and economic management, examples are provided at the end to verify the feasibility of the approach. Originality/value – The paper combines portfolio theory, grey systems theory and uncertainty decision theory, and presents new uncertain investment decision-making models and methods.
5

Yang, Bin, Dingyi Gan, Yongchuan Tang, and Yan Lei. "Incomplete Information Management Using an Improved Belief Entropy in Dempster-Shafer Evidence Theory." Entropy 22, no. 9 (September 7, 2020): 993. http://dx.doi.org/10.3390/e22090993.

Abstract:
Quantifying uncertainty is a hot topic for uncertain information processing in the framework of evidence theory, but there is limited research on belief entropy in the open world assumption. In this paper, an uncertainty measurement method that is based on Deng entropy, named Open Deng entropy (ODE), is proposed. In the open world assumption, the frame of discernment (FOD) may be incomplete, and ODE can reasonably and effectively quantify uncertain incomplete information. On the basis of Deng entropy, the ODE adopts the mass value of the empty set, the cardinality of the FOD, and the natural constant e to construct a new uncertainty factor for modeling the uncertainty in the FOD. A numerical example shows that, under the closed world assumption, ODE degenerates to Deng entropy. An ODE-based information fusion method for sensor data fusion in uncertain environments is also proposed. By applying it to a sensor data fusion experiment, the rationality and effectiveness of ODE and its application in uncertain information fusion are verified.
6

YOU, CUILIAN. "UNCERTAINTY EXTENSION THEOREM AND PRODUCT UNCERTAIN MEASURE." International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems 18, no. 02 (April 2010): 197–208. http://dx.doi.org/10.1142/s0218488510006489.

Abstract:
The additivity axiom of classical measure theory has been challenged by many mathematicians. Different replacements of additivity correspond to different theories. In uncertainty theory, additivity is replaced with self-duality and countable subadditivity. As in classical measure theory, a number of properties are studied in uncertainty theory. Given the measure of each singleton set, the measure can be fully and uniquely determined in the sense of the maximum uncertainty principle. Generally speaking, a product uncertain measure may be defined in many ways; in this paper, one such definition is proposed.
7

Li, Xihua, Fuqiang Wang, and Xiaohong Chen. "Trapezoidal Intuitionistic Fuzzy Multiattribute Decision Making Method Based on Cumulative Prospect Theory and Dempster-Shafer Theory." Journal of Applied Mathematics 2014 (2014): 1–8. http://dx.doi.org/10.1155/2014/279138.

Abstract:
With respect to decision making problems under uncertainty, a trapezoidal intuitionistic fuzzy multiattribute decision making method based on cumulative prospect theory and Dempster-Shafer theory is developed. The proposed method reflects behavioral characteristics of decision makers, information fuzziness under uncertainty, and uncertain attribute weight information. Firstly, distance measurement and comparison rule of trapezoidal intuitionistic fuzzy numbers are used to derive value function under trapezoidal intuitionistic fuzzy environment. Secondly, the value function and decision weight function are used to calculate prospect values of attributes for each alternative. Then considering uncertain attribute weight information, Dempster-Shafer theory is used to aggregate prospect values for each alternative, and overall prospect values are obtained and thus the alternatives are sorted consequently. Finally, an illustrative example shows the feasibility of the proposed method.
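As a rough illustration of the prospect-value step described above, the sketch below uses the classical cumulative prospect theory value and probability weighting functions with the standard Tversky-Kahneman parameters. These crisp functions and parameter values are assumptions for illustration; the paper itself works with trapezoidal intuitionistic fuzzy numbers and its own distance-based value function.

```python
def value(x, alpha=0.88, beta=0.88, lam=2.25):
    """CPT value function: concave for gains, convex and steeper for losses."""
    return x ** alpha if x >= 0 else -lam * (-x) ** beta

def decision_weight(p, gamma=0.61):
    """Inverse-S shaped probability weighting function (gain-side parameter)."""
    return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

# Prospect value of a normalized attribute gain of 0.4 judged to occur with p = 0.7.
print(decision_weight(0.7) * value(0.4))
```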
8

Lee, Jejung, Abdallah Sayyed-Ahmad, and Dong-Hoon Sheen. "Basin model inversion using information theory and seismic data." GEOPHYSICS 72, no. 6 (November 2007): R99–R108. http://dx.doi.org/10.1190/1.2757738.

Abstract:
We present a new approach to basin-model inversion in which uncertain parameters in a basin model are estimated using information theory and seismic data. We derive a probability function from information theory to quantify uncertainties in the estimated parameters in basin modeling. The derivation requires two constraints: a normalization and a misfit constraint. The misfit constraint uses seismic information by minimizing the difference between calculated seismograms from a basin simulator and observed seismograms. The information-theory approach emphasizes the relative difference between the so-called expected and calculated minima of the misfit function. The synthetic-model application shows that the greater the difference between the expected and calculated minima of the misfit function, the larger the uncertainty in parameter estimation. Uncertainty analysis provides secondary information on how accurately the inversion process is performed in basin modeling.
9

Zhang, Shao Pu, and Tao Feng. "Uncertainty Measure Based on Evidence Theory." Applied Mechanics and Materials 329 (June 2013): 344–48. http://dx.doi.org/10.4028/www.scientific.net/amm.329.344.

Abstract:
Evidence theory is an effective method for dealing with uncertain information, and an uncertainty measure reflects the uncertainty of an information system. We therefore merge evidence theory with uncertainty measures in order to quantify the roughness of a rough approximation space. This paper discusses information fusion and uncertainty measures based on rough set theory. First, we propose a new method of information fusion based on the Bayes function, and define a pair of belief and plausibility functions using the fused mass function in an information system. Then we construct an entropy for every decision class to measure its roughness, and an entropy for the decision information system to measure the consistency of the decision table.
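The belief and plausibility functions mentioned in the abstract have standard definitions in evidence theory, sketched below for a mass function represented as a dictionary of focal elements. The example frame and masses are hypothetical.

```python
def belief(bpa, A):
    """Bel(A): total mass of focal elements contained in A."""
    return sum(mass for B, mass in bpa.items() if B <= A)

def plausibility(bpa, A):
    """Pl(A): total mass of focal elements that intersect A."""
    return sum(mass for B, mass in bpa.items() if B & A)

# A fused mass function over three hypothetical decision classes.
m = {frozenset({'d1'}): 0.5, frozenset({'d1', 'd2'}): 0.3, frozenset({'d1', 'd2', 'd3'}): 0.2}
A = frozenset({'d1', 'd2'})
print(belief(m, A), plausibility(m, A))   # 0.8 1.0
```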
10

He, Rongheng, Hui Li, Bo Zhang, and Mei Chen. "The multi-level warehouse layout problem with uncertain information: uncertainty theory method." International Journal of General Systems 49, no. 5 (June 22, 2020): 497–520. http://dx.doi.org/10.1080/03081079.2020.1778681.

11

Rains, Stephen A., and Riva Tukachinsky. "Information Seeking in Uncertainty Management Theory: Exposure to Information About Medical Uncertainty and Information-Processing Orientation as Predictors of Uncertainty Management Success." Journal of Health Communication 20, no. 11 (July 2015): 1275–86. http://dx.doi.org/10.1080/10810730.2015.1018641.

12

Wright, Lesley F. "Information Gap Decision Theory: Decisions under Severe Uncertainty." Journal of the Royal Statistical Society: Series A (Statistics in Society) 167, no. 1 (February 2004): 185–86. http://dx.doi.org/10.1111/j.1467-985x.2004.298_4.x.

13

Brennan, M. D., R. Cheong, and A. Levchenko. "How Information Theory Handles Cell Signaling and Uncertainty." Science 338, no. 6105 (October 18, 2012): 334–35. http://dx.doi.org/10.1126/science.1227946.

14

Nearing, Grey S., and Hoshin V. Gupta. "Ensembles vs. information theory: supporting science under uncertainty." Frontiers of Earth Science 12, no. 4 (May 9, 2018): 653–60. http://dx.doi.org/10.1007/s11707-018-0709-9.

15

Zhou, Honggeng. "An Empirical Test of the Information Processing Theory." International Journal of Information Systems and Supply Chain Management 4, no. 1 (January 2011): 45–59. http://dx.doi.org/10.4018/jisscm.2011010103.

Abstract:
According to the propositions in the information processing theory, this study tests the relationship between task uncertainty and three organizational design strategies, i.e., creation of lateral relationships, investment in information systems, and creation of self-contained tasks. Data from 125 North American manufacturing firms are used and business environment uncertainty is employed to measure task uncertainty. Sourcing practice and delivery practice measure the creation of lateral relationships, while Information quality measures the investment in information systems. Also, just-in-time production and human resource management measure the creation of self-contained tasks. Regression analysis shows that business environment uncertainty has significant positive influence on sourcing practice, delivery practice, information quality, just-in-time production, and human resource management. While the information processing theory was proposed more than thirty years ago, this study empirically extends the relevance of information processing theory to today’s supply chain environment.
16

Menin, Boris. "Uncovering the Hidden Information: A Novel Approach to Modeling Physical Phenomena Through Information Theory." Applied Physics Research 15, no. 1 (April 13, 2023): 101. http://dx.doi.org/10.5539/apr.v15n1p101.

Abstract:
The growing need to study more complex physical phenomena and technological processes determines the importance of reducing the uncertainty of formulated models. However, measurement theory does not provide a clear answer to the question of how to calculate and use model structure uncertainty: the presence of certain base quantities and derived variables. The key novelty of this research lies in the informational method, which makes it possible to find the uncertainty of a model of a phenomenon with a given structure. This uncertainty is initial and precedes the definition of uncertainties associated with the implemented computer algorithms, subsequent experiments, data processing, and the people involved in the study. This article aims to provide a detailed explanation of the informational method and its application to the selection of a model that satisfies the chosen universal criterion of comparative uncertainty. This criterion makes it possible to identify the preferred model that meets the requirements and philosophical outlook of the observer. For many decades, no efforts have been made to take this uncertainty into account in scientific and technical practice. In this paper, we apply the informational method to analyze the attainable accuracy, or perfection, of established physical laws.
17

Szmidt, Eulalia. "Uncertainty and Information: Foundations of Generalized Information Theory (Klir, G.J.; 2006)." IEEE Transactions on Neural Networks 18, no. 5 (September 2007): 1551. http://dx.doi.org/10.1109/tnn.2007.906888.

18

Saldanha, Terence, Mariana Andrade-Rojas, Abhishek Kathuria, Jiban Khuntia, and Mayuram Krishnan. "How the Locus of Uncertainty Shapes the Influence of CEO Long-term Compensation on Information Technology Capital Investments." MIS Quarterly 48, no. 2 (June 1, 2024): 459–90. http://dx.doi.org/10.25300/misq/2023/17433.

Abstract:
Firms must allocate resources effectively to cope with uncertainty, which can manifest as a disruption and an opportunity. Although information technology (IT) is a means to cope with uncertainty, chief executive officers (CEOs) may not always support IT investments due to the risky nature of IT, especially when facing uncertain conditions. While prior research suggests that CEO long-term compensation positively incentivizes IT investments, little is known about how different loci of uncertainty impact this relationship. To address this research gap, we study how firm-specific uncertainty and competitive uncertainty shape the influence of CEO long-term compensation on a firm’s IT capital investment. Drawing on agency theory and prospect theory, we develop two hypotheses. First, we hypothesize that firm-specific uncertainty and competitive uncertainty positively moderate the influence of CEO long-term compensation on firm IT capital investment. Second, we hypothesize that competitive uncertainty has a stronger positive moderating effect than firm-specific uncertainty on the influence of CEO long-term compensation on firm IT capital investment. Our analysis of secondary longitudinal data from 2000 to 2007 of 357 public firms in the United States supports our hypotheses. In exploratory analyses, we found that CEO long-term compensation results in a higher risk-oriented dominant logic in the firm, particularly in conditions of firm-specific uncertainty and competitive uncertainty, with competitive uncertainty having a stronger positive moderating effect. These findings uncover risk-oriented dominant logic as a theoretical mechanism that explains how CEO long-term compensation positively influences firm IT capital investment in uncertain conditions. We also conducted exploratory analyses using a different secondary dataset of 286 U.S. public firms from 2004 to 2019 to consider firm investments in transformative IT applications and found support for our theory. This finding triangulates our results across different time periods and different types of IT investments. This study contributes to theory and practice by providing a nuanced understanding of boundary conditions surrounding CEO long-term compensation, and decisions CEOs make vis-à-vis IT capital investments.
19

Westover, M. Brandon, Nathaniel A. Eiseman, Sydney S. Cash, and Matt T. Bianchi. "Information Theoretic Quantification of Diagnostic Uncertainty." Open Medical Informatics Journal 6, no. 1 (December 14, 2012): 36–50. http://dx.doi.org/10.2174/1874431101206010036.

Abstract:
Diagnostic test interpretation remains a challenge in clinical practice. Most physicians receive training in the use of Bayes’ rule, which specifies how the sensitivity and specificity of a test for a given disease combine with the pre-test probability to quantify the change in disease probability incurred by a new test result. However, multiple studies demonstrate physicians’ deficiencies in probabilistic reasoning, especially with unexpected test results. Information theory, a branch of probability theory dealing explicitly with the quantification of uncertainty, has been proposed as an alternative framework for diagnostic test interpretation, but is even less familiar to physicians. We have previously addressed one key challenge in the practical application of Bayes theorem: the handling of uncertainty in the critical first step of estimating the pre-test probability of disease. This essay aims to present the essential concepts of information theory to physicians in an accessible manner, and to extend previous work regarding uncertainty in pre-test probability estimation by placing this type of uncertainty within a principled information theoretic framework. We address several obstacles hindering physicians’ application of information theoretic concepts to diagnostic test interpretation. These include issues of terminology (mathematical meanings of certain information theoretic terms differ from clinical or common parlance) as well as the underlying mathematical assumptions. Finally, we illustrate how, in information theoretic terms, one can understand the effect on diagnostic uncertainty of considering ranges instead of simple point estimates of pre-test probability.
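A minimal sketch of the two ingredients discussed above: Bayes' rule for updating a pre-test probability given a test's sensitivity and specificity, and the binary Shannon entropy as a measure of diagnostic uncertainty. The numerical values are illustrative assumptions, not figures from the article.

```python
import math

def post_test_probability(pre, sensitivity, specificity):
    """Bayes' rule for a positive result on a dichotomous test."""
    true_pos = sensitivity * pre
    false_pos = (1 - specificity) * (1 - pre)
    return true_pos / (true_pos + false_pos)

def binary_entropy(p):
    """Diagnostic uncertainty, in bits, about a disease that is either present or absent."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

pre = 0.30                                            # assumed pre-test probability
post = post_test_probability(pre, sensitivity=0.95, specificity=0.95)
print(post)                                           # updated disease probability
print(binary_entropy(pre) - binary_entropy(post))     # change in uncertainty (bits)
```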
20

Yan, Ruixia, Liangui Peng, Yanxi Xie, and Xiaoli Wang. "Rough Set-Game Theory Information Mining Model Considering Opponents’ Information." Electronics 11, no. 2 (January 13, 2022): 244. http://dx.doi.org/10.3390/electronics11020244.

Abstract:
In multi-strategy games, the increase in the number of strategies makes it difficult to find a solution. To maintain a competitive advantage and obtain maximal profits, one side of the game hopes to predict the opponent's behavior, so building a model to predict that behavior is helpful. In this paper, we propose a rough set-game theory model (RS-GT) considering uncertain information and the opponent's decision rules. The uncertainty of strategies is obtained based on the rough set method, and an accurate solution is obtained based on game theory from the rough set-game theory model. The players obtain their competitors' decision rules to predict the opponents' behavior by mining the information from repeated games in the past. The players determine their strategy to obtain maximum profits by predicting the opponent's actions, i.e., adopting a first-mover or second-mover strategy to build a favorable situation. The result suggests that the rough set-game theory model helps enterprises avoid unnecessary losses and allows them to obtain greater profits.
21

Wang, Jing, Jing Wang, Jingfeng Guo, Liya Wang, Chunying Zhang, and Bin Liu. "Research Progress of Complex Network Modeling Methods Based on Uncertainty Theory." Mathematics 11, no. 5 (March 1, 2023): 1212. http://dx.doi.org/10.3390/math11051212.

Abstract:
A complex network in reality contains a large amount of information, but some of that information cannot be obtained accurately or is missing for various reasons. An uncertain complex network is an effective mathematical model for dealing with this problem, but related research is still in its infancy. In order to facilitate research into uncertainty theory in complex network modeling, this paper summarizes and analyzes the research hotspots of set pair analysis, rough set theory and fuzzy set theory in complex network modeling. The paper first introduces the basic definitions of the three uncertainty theories (set pair analysis, rough sets and fuzzy sets), as well as their basic modeling theory in complex networks. Secondly, focusing on these three uncertainty theories and the establishment of specific models, the latest research progress in complex networks is reviewed, and the main application fields of the three theories are discussed respectively: community discovery, link prediction, influence maximization and decision-making problems. Finally, prospects for the modeling and development of uncertain complex networks are put forward.
22

REED, WILLIAM. "Information, Power, and War." American Political Science Review 97, no. 4 (November 2003): 633–41. http://dx.doi.org/10.1017/s0003055403000923.

Abstract:
Ultimatum bargaining models of international interactions suggest that when conflict is costly and the actors are fully informed, the probability of conflict goes to zero. However, conflict occurs with some positive probability when the challenger is uncertain about the defender's reservation value. I employ a simple ultimatum game of bargaining to evaluate two traditional power-centric theories of world politics, balance of power, and power transition theory. The formal and empirical analyses demonstrate that as states approach power parity, information asymmetries are greatest, thus enhancing the probability of militarized conflict. Uncertainty is a central cause of conflict emergence and is correlated with the distribution of observable capabilities. Recognizing the relationship between the distribution of power and the uncertainty offers a more sophisticated interpretation of power-centric explanations of world politics.
23

Beresford‐Smith, Bryan, and Colin J. Thompson. "An info‐gap approach to managing portfolios of assets with uncertain returns." Journal of Risk Finance 10, no. 3 (May 22, 2009): 277–87. http://dx.doi.org/10.1108/15265940910959393.

Abstract:
Purpose – The purpose of this paper is to provide a quantitative methodology based on information‐gap decision theory for dealing with (true) Knightian uncertainty in the management of portfolios of assets with uncertain returns. Design/methodology/approach – Portfolio managers aim to maximize returns for given levels of risk. Since future returns on assets are uncertain, the expected return on a portfolio of assets can be subject to significant uncertainty. Information‐gap decision theory is used to construct portfolios that are robust against uncertainty. Findings – Using the added dimensions of aspirational parameters and performance requirements in information‐gap theory, the paper shows that one cannot simultaneously have two robust‐optimal portfolios that outperform a specified return and a benchmark portfolio unless one of the portfolios has arbitrarily large long and short positions. Research limitations/implications – The paper has considered only one uncertainty model and two performance requirements in an information‐gap analysis over a particular time frame. Alternative uncertainty models could be introduced, and benchmarking against proxy portfolios and competitors are examples of additional performance requirements that could be incorporated in an information‐gap analysis. Practical implications – An additional methodology for applying information‐gap modeling to portfolio management has been provided. Originality/value – This paper provides a new and novel approach for managing portfolios in the face of uncertainties in future asset returns.
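To make the info-gap notions of robustness and performance requirements concrete, the sketch below computes a robustness horizon for a portfolio under a simple fractional-error uncertainty model: the largest fractional deviation of the nominal returns for which the worst-case portfolio return still meets a required return. The uncertainty model, the portfolios and the numbers are illustrative assumptions, not the paper's specification.

```python
def robustness(weights, nominal_returns, required_return):
    """Info-gap robustness under a fractional-error model: the largest horizon
    alpha such that the worst-case return still meets the requirement, assuming
    each asset return may deviate by at most alpha * |nominal return|."""
    nominal = sum(w * r for w, r in zip(weights, nominal_returns))
    exposure = sum(abs(w * r) for w, r in zip(weights, nominal_returns))
    surplus = nominal - required_return
    return 0.0 if surplus <= 0 else surplus / exposure

# Two hypothetical portfolios; the leveraged long-short one has a higher nominal
# return but a smaller robustness horizon against return uncertainty.
print(robustness([0.5, 0.5], [0.08, 0.06], required_return=0.05))
print(robustness([2.0, -1.0], [0.08, 0.06], required_return=0.05))
```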
24

HAN, DEQIANG, CHONGZHAO HAN, and YONG DENG. "NOVEL APPROACHES FOR THE TRANSFORMATION OF FUZZY MEMBERSHIP FUNCTION INTO BASIC PROBABILITY ASSIGNMENT BASED ON UNCERTAINTY OPTIMIZATION." International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems 21, no. 02 (April 2013): 289–322. http://dx.doi.org/10.1142/s0218488513500165.

Abstract:
With the development of uncertainty reasoning and information fusion, there have emerged several theories including fuzzy set theory, Dempster-Shafer evidence theory, probability theory and rough set theory, etc., for representing and dealing with the uncertain information. When the fusion of the uncertain information originated from different sources is needed, how to construct a general framework for different theories of uncertainty or how to establish the connection between different theoretical frameworks has become a crucial problem. Particularly, to combine two kinds of information represented respectively by the BPA and the FMF, this paper proposes two transformations of an FMF into a BPA by solving a constrained maximization or minimization optimization problem. The objective function is the uncertainty degree of the body of evidence (BOE) and the corresponding constraints are established based on the given FMF. In fact the transformation of an FMF into a BPA is the transformation of fuzzy sets into random sets, which is currently accepted as a unified framework for several theories of uncertainty. Our proposed approaches have no predefinition of focal elements and they can be used as the general transformations of fuzzy sets into random sets. Some examples and analyses are provided to illustrate and justify the rationality and effectiveness of the proposed approaches.
25

Li, Yuchen, and Zaoli Yang. "Games with incomplete information and uncertain payoff: from the perspective of uncertainty theory." Soft Computing 23, no. 24 (March 22, 2019): 13669–78. http://dx.doi.org/10.1007/s00500-019-03906-7.

26

Cui, Yuquan, Xiaolin Zhang, and Xi Lu. "Supply Chain Commitment Contract Model Based on Uncertainty Theory under Uncertain Market Information." Applied Mathematics 06, no. 10 (2015): 1727–39. http://dx.doi.org/10.4236/am.2015.610153.

27

Keyes, Laura M., and Abraham David Benavides. "Chaos theory, uncertainty, and organizational learning." International Journal of Organization Theory & Behavior 21, no. 4 (November 12, 2018): 226–41. http://dx.doi.org/10.1108/ijotb-04-2018-0050.

Abstract:
Purpose – The purpose of this paper is to juxtapose chaos theory with organizational learning theory to examine whether public organizations co-evolve into a new order or rather institutionalize newly gained knowledge in times of a highly complex public health crisis. Design/methodology/approach – The research design utilizes the results from a survey administered to 200 emergency management and public health officials in the Dallas–Fort Worth metroplex. Findings – The findings of this paper suggest that public entities were more likely to represent organizational learning through the coordination of professionals, access to quality information, and participation in daily communication. Leadership was associated with the dissemination of knowledge through the system rather than the development of new standard operating procedures (as suggested by chaos theory and co-evolution). Research limitations/implications – There are limitations to this study given the purposive sample of emergency management and public health officials employed in the Dallas–Fort Worth metroplex. Practical implications – The authors find that public organizations that learn how to respond to unprecedented events through reliance on structure, leadership, and culture connect decision makers to credible information, resulting in organizational learning. Social implications – As a result, public administrators need to focus and rely on their organization's capacity to receive and retain information in a crisis. Originality/value – This research contributes to our understanding of organizational learning in public organizations under highly complex public health situations, finding that decision makers rely on both organizational structure and culture to support the flow of credible information.
28

Peng, Jin, Bo Zhang, and Kiki Ariyanti Sugeng. "Uncertain Hypergraphs: A Conceptual Framework and Some Topological Characteristics Indexes." Symmetry 14, no. 2 (February 5, 2022): 330. http://dx.doi.org/10.3390/sym14020330.

Abstract:
In practical applications of hypergraph theory, we are usually surrounded by the state of indeterminacy. This paper employs uncertainty theory to address indeterministic information. We initially put forward the idea of an uncertain hypergraph by combining hypergraph theory with uncertainty theory in order to provide a useful tool to deal with a variety of uncertain complex systems and to create a new interdisciplinary research field. The main focus of this paper is to propose a conceptual framework of uncertain hypergraphs and to study the operations of uncertain hypergraphs. Moreover, some topological indexes are proposed to describe the characteristics of the structures of uncertain hypergraph. Additionally, some further research directions are discussed.
29

OBLOW, E. M. "O-THEORY—A HYBRID UNCERTAINTY THEORY." International Journal of General Systems 13, no. 2 (January 1987): 95–106. http://dx.doi.org/10.1080/03081078708934960.

30

Lin, Guoping, Jiye Liang, and Yuhua Qian. "Uncertainty Measures for Multigranulation Approximation Space." International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems 23, no. 03 (June 2015): 443–57. http://dx.doi.org/10.1142/s0218488515500191.

Abstract:
Multigranulation rough set theory is a relatively new mathematical tool for solving complex problems in multigranulation or distributed circumstances which are characterized by vagueness and uncertainty. In this paper, we first introduce the multigranulation approximation space. According to the idea of fusing uncertain, imprecise information, we then present three uncertainty measures: fusing information entropy, fusing rough entropy, and fusing knowledge granulation in the multigranulation approximation space. Furthermore, several essential properties (equivalence, maximum, minimum) are examined and the relationship between the fusion information entropy and the fusion rough entropy is also established. Finally, we prove that these three measures increase monotonically as the partitions become finer. These results will be helpful for understanding the essence of uncertainty measures in multigranulation rough space and enriching multigranulation rough set theory.
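For orientation, the sketch below computes the classical single-granulation information entropy and rough entropy of a partition; the paper's fusing variants extend such measures to a multigranulation approximation space and may behave differently. The partitions used here are purely illustrative.

```python
import math

def information_entropy(partition, universe_size):
    """H(P) = -sum (|Xi|/|U|) * log2(|Xi|/|U|) over the blocks of a partition."""
    return -sum((len(X) / universe_size) * math.log2(len(X) / universe_size) for X in partition)

def rough_entropy(partition, universe_size):
    """Er(P) = sum (|Xi|/|U|) * log2(|Xi|): larger blocks mean coarser, rougher knowledge."""
    return sum((len(X) / universe_size) * math.log2(len(X)) for X in partition)

U = list(range(8))
coarse = [U[:4], U[4:]]                    # one equivalence relation (coarse partition)
fine = [U[:2], U[2:4], U[4:6], U[6:]]      # a finer partition of the same universe
for P in (coarse, fine):
    print(information_entropy(P, len(U)), rough_entropy(P, len(U)))
# the finer partition carries more information (higher H) and less roughness (lower Er)
```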
31

Jizba, Petr, Jacob A. Dunningham, and Jaewoo Joo. "Role of information theoretic uncertainty relations in quantum theory." Annals of Physics 355 (April 2015): 87–114. http://dx.doi.org/10.1016/j.aop.2015.01.031.

32

Sun, Lin, and Jiucheng Xu. "Information Entropy and Mutual Information-based Uncertainty Measures in Rough Set Theory." Applied Mathematics & Information Sciences 8, no. 4 (July 1, 2014): 1973–85. http://dx.doi.org/10.12785/amis/080456.

33

Weijs, S. V., G. Schoups, and N. van de Giesen. "Why hydrological forecasts should be evaluated using information theory." Hydrology and Earth System Sciences Discussions 7, no. 4 (July 16, 2010): 4657–85. http://dx.doi.org/10.5194/hessd-7-4657-2010.

Abstract:
Probabilistic forecasting is becoming increasingly popular in hydrology. Equally important are methods to evaluate such forecasts. There is still debate about which scores to use for this evaluation. In this paper we distinguish two scales for evaluation: information-uncertainty and utility-risk. We claim that the information-uncertainty scale is to be preferred for forecast evaluation. We propose the Kullback-Leibler divergence as the appropriate measure of forecast quality. Interpreting a decomposition of this measure into uncertainty, correct information and wrong information, it follows directly that deterministic forecasts, although they can still have value for decisions, increase uncertainty to infinity. We resolve this paradoxical result by proposing that deterministic forecasts are implicitly probabilistic or are implicitly assuming a decision problem. Although forecast value could be the final objective in engineering, we claim that for calibration of models representing a hydrological system, information should be the objective in calibration, because this allows all information to be extracted from the observations and avoids learning from information that is not there. Calibration based on maximizing value trains an implicit decision model, which inevitably results in a loss or distortion of information in the data and more risk of overfitting, possibly leading to less valuable and informative forecasts.
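A minimal sketch of the scoring idea described above: when the observation is treated as a degenerate distribution, the relative-entropy (divergence) score of a probabilistic forecast reduces to the negative log probability assigned to what actually happened, and a wrong deterministic forecast is penalized with infinite uncertainty. Event names and numbers are illustrative.

```python
import math

def divergence_score(observed, forecast_prob):
    """Relative entropy D_KL(o || f) between the observation, treated as a degenerate
    distribution, and the forecast probability of the event (in bits).
    A deterministic forecast (probability 0 or 1) that turns out wrong scores infinity."""
    p = forecast_prob if observed else 1.0 - forecast_prob
    return float('inf') if p == 0.0 else -math.log2(p)

forecasts = [0.8, 0.6, 0.99, 1.0]        # forecast probabilities of flooding (illustrative)
observations = [True, False, True, False]
scores = [divergence_score(o, f) for o, f in zip(observations, forecasts)]
print(scores)   # the last, wrong deterministic forecast receives an infinite penalty
```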
34

Mobekk, Hilde, and Asle Fagerstrøm. "Escalation in Information Technology Projects." International Journal of Information Technology Project Management 6, no. 4 (October 2015): 1–19. http://dx.doi.org/10.4018/ijitpm.2015100101.

Abstract:
According to the information systems literature, many information technology (IT) projects go wildly over budget, drag on long past their originally scheduled completion date, and do not deliver according to initial specification. Theories that have been used to understand the escalation phenomenon in general are the self-justification theory, the prospect theory, the agency theory, and the approach avoidance theory. These theories have contributed to a considerable insight in the phenomenon of escalation, but divergence among them indicates that there are still some unanswered questions. Discounting describes how the subjective value of an outcome is altered because its outcome is either uncertain and/or delayed. Since a key factor in IT project is the uncertainty and/or delay associated with cost, schedules and functionality of the IT solutions that are made, the authors decided to introduce the concept of discounting to expand understanding of escalation behavior in IT projects.
35

Vigo, Ronaldo. "Complexity over Uncertainty in Generalized Representational Information Theory (GRIT): A Structure-Sensitive General Theory of Information." Information 4, no. 1 (December 20, 2012): 1–30. http://dx.doi.org/10.3390/info4010001.

36

Mat Saad, Mohd Zuwairi, and Sharifah Nor Atiqah Syed Lokman Hakim. "The Preference of Social Networking Sites and Uncertainty Reduction Strategies Towards Information on COVID-19 Vaccination." Jurnal Komunikasi: Malaysian Journal of Communication 39, no. 3 (September 30, 2023): 501–14. http://dx.doi.org/10.17576/jkmjc-2023-3903-27.

Abstract:
At the end of 2019, the world witnessed a lethal outbreak of COVID-19, which was declared a pandemic by the World Health Organization (WHO) in March 2020. As the outbreak spread worldwide, including in Malaysia, the government undertook several measures to curb it, including introducing a vaccination program for the community. Because the program was newly introduced, it is believed to have made much of the community worried or uncertain. Searching for information via social networking sites (SNS) is likely to reduce people's worries and uncertainties regarding COVID-19 vaccination. This study aims to determine the SNS most preferred by middle-aged people to reduce uncertainty regarding information on COVID-19 vaccination. It also aims to identify the factors that contribute to middle-aged people's preferences for SNS to reduce their uncertainty about COVID-19 vaccination, as well as the uncertainty reduction strategies (URSs) used by middle-aged people over preferred SNS to reduce their level of uncertainty about COVID-19 vaccination. Uncertainty Reduction Theory was used as the underpinning theory for this study. The study employed a qualitative approach in which in-depth e-interviews were conducted over the Google Meet platform. The results showed that Facebook is the SNS most preferred by the informants for reducing uncertainty regarding COVID-19 vaccination. The study also found information authority to be the most important factor in SNS preference. Finally, the study discovered that the passive strategy is the method most commonly employed by informants to reduce information uncertainties regarding COVID-19 vaccination. Keywords: COVID-19, vaccination, social networking sites, Uncertainty Reduction Theory, information.
37

Blevins, James. "The information-theoretic turn." Psihologija 46, no. 4 (2013): 355–75. http://dx.doi.org/10.2298/psi1304355b.

Abstract:
Over the past decade, information theory has been applied to the analysis of a successively broader range of morphological phenomena. Interestingly, this tradition has arisen independently of the linguistic applications of information theory dating from the 1950s. Instead, the point of origin for current work lies in a series of studies of morphological processing in which Kostic and associates develop a statistical notion of "morphological information" based on "uncertainty" and "uncertainty reduction". From these initial studies, analyses based on statistical notions of information have been applied to general problems of morphological description and typological classification, leading to a formal rehabilitation of the complex system perspective of traditional WP models.
38

Wu, Lei, Yongchuan Tang, Liuyuan Zhang, and Yubo Huang. "Uncertainty Management in Assessment of FMEA Expert Based on Negation Information and Belief Entropy." Entropy 25, no. 5 (May 15, 2023): 800. http://dx.doi.org/10.3390/e25050800.

Abstract:
The failure mode and effects analysis (FMEA) is a commonly adopted approach in engineering failure analysis, wherein the risk priority number (RPN) is utilized to rank failure modes. However, assessments made by FMEA experts are full of uncertainty. To deal with this issue, we propose a new uncertainty management approach for the assessments given by experts based on negation information and belief entropy in the Dempster–Shafer evidence theory framework. First, the assessments of FMEA experts are modeled as basic probability assignments (BPA) in evidence theory. Next, the negation of BPA is calculated to extract more valuable information from a new perspective of uncertain information. Then, by utilizing the belief entropy, the degree of uncertainty of the negation information is measured to represent the uncertainty of different risk factors in the RPN. Finally, the new RPN value of each failure mode is calculated for the ranking of each FMEA item in risk analysis. The rationality and effectiveness of the proposed method is verified through its application in a risk analysis conducted for an aircraft turbine rotor blade.
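For context, the sketch below shows only the conventional risk priority number that the abstract starts from; the paper's contribution, replacing crisp expert ratings with basic probability assignments weighted by the belief entropy of their negation, is not reproduced here. Failure modes and ratings are hypothetical.

```python
def conventional_rpn(severity, occurrence, detection):
    """Classical FMEA risk priority number: RPN = S * O * D, each factor rated 1-10."""
    return severity * occurrence * detection

# Hypothetical ratings (S, O, D) for two failure modes of a turbine rotor blade.
failure_modes = {"blade crack": (8, 4, 6), "coating wear": (5, 6, 3)}
ranking = sorted(failure_modes, key=lambda fm: conventional_rpn(*failure_modes[fm]), reverse=True)
print(ranking)   # failure modes ordered from highest to lowest risk priority
```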
39

Slayton, Rebecca. "Governing Uncertainty or Uncertain Governance? Information Security and the Challenge of Cutting Ties." Science, Technology, & Human Values 46, no. 1 (January 29, 2020): 81–111. http://dx.doi.org/10.1177/0162243919901159.

Abstract:
Information security governance has become an elusive goal and a murky concept. This paper problematizes both information security governance and the broader concept of governance. What does it mean to govern information security, or for that matter, anything? Why have information technologies proven difficult to govern? And what assurances can governance provide for the billions of people who rely on information technologies every day? Drawing together several distinct bodies of literature—including multiple strands of governance theory, actor–network theory, and scholarship on sociotechnical regimes—this paper conceptualizes networked action on a spectrum from uncertain governance to governing uncertainty. I advance a twofold argument. First, I argue that networks can better govern uncertainty as they become more able not only to enroll actors in a collective agenda, but also to cut ties with those who seek to undermine that agenda. And second, I argue that the dominant conception of information security governance, which emphasizes governing uncertainty through risk management, in practice devolves to uncertain governance. This is largely because information technologies have evolved toward greater connectedness—and with it, greater vulnerability—creating a regime of insecurity. This evolution is illustrated using the history of the US government’s efforts to govern information security.
40

Hall, Jim W. "Handling uncertainty in the hydroinformatic process." Journal of Hydroinformatics 5, no. 4 (October 1, 2003): 215–32. http://dx.doi.org/10.2166/hydro.2003.0019.

Abstract:
Hydroinformatics combines topics of modelling and decision-making, both of which have attracted a great deal of attention outside hydroinformatics from the point of view of uncertainty. Epistemic uncertainties are due to the inevitably incomplete evidence about the dependability of a model or set of competing models. Inherent uncertainties are due to the varying information content inherent in measurements or model predictions, be they probabilistic or fuzzy. Decision-making in management of the aquatic environment is, more often than not, a complex, discursive, multi-player process. The requirement for hydroinformatics systems is to support rather than replace human judgment in this process, a requirement that has significant bearing on the treatment of uncertainty. Furthermore, a formal language is required to encode uncertainty in computer systems. We therefore review the modern mathematics of uncertainty, starting first with probability theory and then extending to fuzzy set theory and possibility theory, the theory of evidence (and its random set counterpart), which generalises probability and possibility theory, and higher-order generalisations. A simple example from coastal hydraulics illustrates how a range of types of uncertain information (including probability distributions, interval measurements and fuzzy sets) can be handled in the types of algebraic or numerical functions that form the kernel of most hydroinformatic systems.
41

Ver Steeg, G., and S. Wehner. "Relaxed uncertainty relations and information processing." Quantum Information and Computation 9, no. 9&10 (September 2009): 801–32. http://dx.doi.org/10.26421/qic9.9-10-6.

Abstract:
We consider a range of "theories" that violate the uncertainty relation for anti-commuting observables derived in earlier work. We first show that Tsirelson's bound for the CHSH inequality can be derived from this uncertainty relation, and that relaxing this relation allows for non-local correlations that are stronger than what can be obtained in quantum mechanics. We continue to construct a hierarchy of related non-signaling theories, and show that on one hand they admit superstrong random access encodings and exponential savings for a particular communication problem, while on the other hand it becomes much harder in these theories to learn a state. We show that the existence of these effects stems from the absence of certain constraints on the expectation values of commuting measurements in our non-signaling theories that are present in quantum theory.
42

CAO, TRU H., and HOA NGUYEN. "UNCERTAIN AND FUZZY OBJECT BASES: A DATA MODEL AND ALGEBRAIC OPERATIONS." International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems 19, no. 02 (April 2011): 275–305. http://dx.doi.org/10.1142/s0218488511007003.

Abstract:
Fuzzy set theory and probability theory are complementary for soft computing, in particular for object-oriented systems with imprecise and uncertain object properties. However, current fuzzy object-oriented data models are mainly based on fuzzy set theory or possibility theory, and lack a rigorous algebra for querying and managing uncertain and fuzzy object bases. In this paper, we develop an object base model that incorporates both fuzzy set values and probability degrees to handle imprecision and uncertainty. A probabilistic interpretation of relations on fuzzy sets is introduced as a formal basis to coherently unify the two types of measures into a common framework. The model accommodates both class attributes, representing declarative object properties, and class methods, representing procedural object properties. Two levels of property uncertainty are taken into account, one of which is value uncertainty of a definite property and the other is applicability uncertainty of the property itself. The syntax and semantics of the selection and other main data operations on the proposed object base model are formally defined as a full-fledged algebra.
43

Wu, Haotian, Guo Dong Bai, Shuo Liu, Lianlin Li, Xiang Wan, Qiang Cheng, and Tie Jun Cui. "Information theory of metasurfaces." National Science Review 7, no. 3 (November 27, 2019): 561–71. http://dx.doi.org/10.1093/nsr/nwz195.

Abstract:
We propose a theory to characterize the information and information processing abilities of metasurfaces, and demonstrate the relation between the information of the metasurface and its radiation pattern in the far-field region. By incorporating a general aperture model with uncertainty relation in L2-space, we propose a theory to predict the upper bound of information contained in the radiation pattern of a metasurface, and reveal the theoretical upper limit of orthogonal radiation states. The proposed theory also provides guidance for inverse design of the metasurface with respect to given functionalities. Through investigation of the information of disordered-phase modulated metasurfaces, we find the information invariance (1−γ, where γ is Euler's constant) of chaotic radiation patterns. That is to say, the information of the disordered-phase modulated radiation patterns is always equal to 1−γ, regardless of variations in size, the number of elements and the phase pattern of metasurface. This value might be the lower bound of radiation-pattern information of the metasurface, which can provide a theoretical limit for information modulation applications, including computational imaging, stealth technologies and wireless communications.
44

Chen, Ziyang, and Yang Zhang. "Conflict Evidence Fusion Algorithm Based on Cosine Distance and Information Entropy." 電腦學刊 34, no. 3 (June 2023): 343–55. http://dx.doi.org/10.53106/199115992023063403026.

Abstract:
Dealing with high conflict evidence, traditional evidence theory sometimes has certain limitations and yields fusion results contrary to common sense. In order to solve the problem of fusing highly conflicting evidence, this paper analyzes traditional evidence theory and proposes an evidence fusion method that combines cosine distance and information entropy. Cosine distance can measure the directionality between two vectors: the better the directionality, the more similar the two vectors are. Therefore, this article uses cosine distance to determine the similarity between pieces of evidence, and then calculates the credibility of each piece of evidence. Information entropy can calculate the amount of information in each piece of evidence: the greater the information entropy, the greater the uncertainty of the evidence. Therefore, this article uses information entropy to measure the uncertainty of the evidence. The credibility and uncertainty of the evidence are then fused to calculate the weight of the evidence, and D-S evidence theory is used for evidence fusion. A numerical example shows that the method is feasible and effective in dealing with conflicting evidence.
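The sketch below illustrates the similarity half of the approach: cosine similarity between mass vectors is turned into normalized credibility weights so that conflicting evidence is down-weighted. In the paper these weights are further combined with information entropy before Dempster's rule is applied; that part, and the exact weighting formula, are not reproduced here.

```python
import math

def cosine_similarity(v1, v2):
    """Cosine of the angle between two mass vectors (larger means more similar)."""
    dot = sum(a * b for a, b in zip(v1, v2))
    norm = math.sqrt(sum(a * a for a in v1)) * math.sqrt(sum(b * b for b in v2))
    return dot / norm

def credibility_weights(evidence):
    """Support of each piece of evidence = its total similarity to the others,
    normalized so that the weights sum to one."""
    support = [sum(cosine_similarity(e, other) for j, other in enumerate(evidence) if j != i)
               for i, e in enumerate(evidence)]
    total = sum(support)
    return [s / total for s in support]

# Three bodies of evidence over hypotheses (A, B, C); the third conflicts with the rest.
evidence = [[0.7, 0.2, 0.1], [0.6, 0.3, 0.1], [0.1, 0.1, 0.8]]
print(credibility_weights(evidence))   # the conflicting evidence receives the lowest weight
```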
45

Yang, Yuhong. "Elements of Information Theory." Journal of the American Statistical Association 103, no. 481 (March 1, 2008): 429. http://dx.doi.org/10.1198/jasa.2008.s218.

46

Chen, Yutong, and Yongchuan Tang. "Measuring the Uncertainty in the Original and Negation of Evidence Using Belief Entropy for Conflict Data Fusion." Entropy 23, no. 4 (March 28, 2021): 402. http://dx.doi.org/10.3390/e23040402.

Abstract:
Dempster-Shafer (DS) evidence theory is widely used in various fields of uncertain information processing, but it may produce counterintuitive results when dealing with conflicting data. Therefore, this paper proposes a new data fusion method which combines the Deng entropy and the negation of basic probability assignment (BPA). In this method, the uncertain degree in the original BPA and the negation of BPA are considered simultaneously. The degree of uncertainty of BPA and negation of BPA is measured by the Deng entropy, and the two uncertain measurement results are integrated as the final uncertainty degree of the evidence. This new method can not only deal with the data fusion of conflicting evidence, but it can also obtain more uncertain information through the negation of BPA, which is of great help to improve the accuracy of information processing and to reduce the loss of information. We apply it to numerical examples and fault diagnosis experiments to verify the effectiveness and superiority of the method. In addition, some open issues existing in current work, such as the limitations of the Dempster-Shafer theory (DST) under the open world assumption and the necessary properties of uncertainty measurement methods, are also discussed in this paper.
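For reference, the sketch below implements plain Dempster's rule of combination and reproduces the kind of counterintuitive outcome under high conflict that motivates this line of work: nearly all mass collapses onto a hypothesis both sources consider unlikely. The frame and mass values follow the classic Zadeh-style example and are illustrative.

```python
def dempster_combine(m1, m2):
    """Dempster's rule: combine two BPAs, renormalizing by 1 - K,
    where K is the mass assigned to conflicting (empty) intersections."""
    combined, conflict = {}, 0.0
    for A, mA in m1.items():
        for B, mB in m2.items():
            inter = A & B
            if inter:
                combined[inter] = combined.get(inter, 0.0) + mA * mB
            else:
                conflict += mA * mB
    if conflict >= 1.0:
        raise ValueError("total conflict: Dempster's rule is undefined")
    return {A: v / (1.0 - conflict) for A, v in combined.items()}

# Two highly conflicting bodies of evidence over the frame {a, b, c}.
m1 = {frozenset('a'): 0.9, frozenset('c'): 0.1}
m2 = {frozenset('b'): 0.9, frozenset('c'): 0.1}
print(dempster_combine(m1, m2))   # all mass collapses onto {c}, the counterintuitive result
```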
47

Wang, Jinjing (Jenny), Yang Yang, Carla Macias, and Elizabeth Bonawitz. "Children With More Uncertainty in Their Intuitive Theories Seek Domain-Relevant Information." Psychological Science 32, no. 7 (June 28, 2021): 1147–56. http://dx.doi.org/10.1177/0956797621994230.

Abstract:
How do changes in learners’ knowledge influence information seeking? We showed preschoolers ( N = 100) uncertain outcomes for events and let them choose which event to resolve. We found that children whose intuitive theories were at immature stages were more likely to seek information to resolve uncertainty about an outcome in the related domains, but children with more mature knowledge were not. This result was replicated in a second experiment but with the nuance that children at intermediate stages of belief development—when the causal outcome would be most ambiguous—were the most motivated to resolve the uncertainty. This effect was not driven by general uncertainty at the framework level but, rather, by the impact that framework knowledge has in accessing uncertainty at the model level. These results are the first to show the relationship between a learning preference and the developmental stage of a child’s intuitive theory.
48

Crump, Matthew J. C., Walter Lai, and Nicholaus P. Brosowsky. "Instance theory predicts information theory: Episodic uncertainty as a determinant of keystroke dynamics." Canadian Journal of Experimental Psychology/Revue canadienne de psychologie expérimentale 73, no. 4 (December 2019): 203–15. http://dx.doi.org/10.1037/cep0000182.

49

Weijs, S. V., G. Schoups, and N. van de Giesen. "Why hydrological predictions should be evaluated using information theory." Hydrology and Earth System Sciences 14, no. 12 (December 13, 2010): 2545–58. http://dx.doi.org/10.5194/hess-14-2545-2010.

Abstract:
Probabilistic predictions are becoming increasingly popular in hydrology. Equally important are methods to test such predictions, given the topical debate on uncertainty analysis in hydrology. Also in the special case of hydrological forecasting, there is still discussion about which scores to use for their evaluation. In this paper, we propose to use information theory as the central framework to evaluate predictions. From this perspective, we hope to shed some light on what verification scores measure and should measure. We start from the "divergence score", a relative entropy measure that was recently found to be an appropriate measure for forecast quality. An interpretation of a decomposition of this measure provides insight into additive relations between climatological uncertainty, correct information, wrong information and remaining uncertainty. When the score is applied to deterministic forecasts, it follows that these increase uncertainty to infinity. In practice, however, deterministic forecasts tend to be judged far more mildly and are widely used. We resolve this paradoxical result by proposing that deterministic forecasts either are implicitly probabilistic or are implicitly evaluated with an underlying decision problem or utility in mind. We further propose that calibration of models representing a hydrological system should be based on information-theoretical scores, because this allows extracting all information from the observations and avoids learning from information that is not there. Calibration based on maximizing utility for society trains an implicit decision model rather than the forecasting system itself. This inevitably results in a loss or distortion of information in the data and more risk of overfitting, possibly leading to less valuable and informative forecasts. We also show this in an example. The final conclusion is that models should preferably be explicitly probabilistic and calibrated to maximize the information they provide.
50

Li, Ming, and Kefeng Liu. "Probabilistic Prediction of Significant Wave Height Using Dynamic Bayesian Network and Information Flow." Water 12, no. 8 (July 22, 2020): 2075. http://dx.doi.org/10.3390/w12082075.

Abstract:
Short-term prediction of wave height is paramount in oceanic operation-related activities. Statistical models have advantages in short-term wave prediction because the complex physical process is substantially simplified. However, previous statistical models gave no consideration to the selection of predictive variables or to the treatment of prediction uncertainty. This paper develops a machine learning model by combining the dynamic Bayesian network (DBN) with the information flow (IF), designated DBN-IF. IF is focused on selecting the best predictive variables for the DBN by causal analysis instead of correlation analysis. The DBN for probabilistic prediction is constructed by structure learning and parameter learning with data mining. Based on causal theory, graph theory, and probability theory, the proposed DBN-IF model can deal with uncertainty and shows great performance in significant wave height prediction compared with the artificial neural network (ANN), random forest (RF) and support vector machine (SVM) for all lead times. The interpretable DBN-IF is proven to be a promising tool for nonlinear and uncertain wave height prediction.