To view the other types of publications on this topic, follow this link: Information-Based Complexity.

Journal articles on the topic "Information-Based Complexity"

Cite a source in APA, MLA, Chicago, Harvard, and other citation styles


Consult the top 50 journal articles for research on the topic "Information-Based Complexity".

Next to every work in the list of references there is an "Add to bibliography" option. Use it, and the bibliographic reference for the chosen work will be formatted automatically in the required citation style (APA, MLA, Harvard, Chicago, Vancouver, etc.).

You can also download the full text of the scholarly publication as a PDF and read its online abstract, provided the relevant parameters are available in the metadata.

Browse journal articles from many different fields and compile your bibliography correctly.

1

Woźniakowski, H. "Information-Based Complexity." Annual Review of Computer Science 1, no. 1 (June 1986): 319–80. http://dx.doi.org/10.1146/annurev.cs.01.060186.001535.

2

Packel, Edward W., and J. F. Traub. "Information-based complexity." Nature 328, no. 6125 (July 1987): 29–33. http://dx.doi.org/10.1038/328029a0.

3

Heinrich, Stefan, and Jörg-Detlef Kern. "Parallel information-based complexity." Journal of Complexity 7, no. 4 (December 1991): 339–70. http://dx.doi.org/10.1016/0885-064x(91)90024-r.

4

Kon, Mark A. "Book Review: Information-based complexity." Bulletin of the American Mathematical Society 21, no. 2 (October 1, 1989): 332–40. http://dx.doi.org/10.1090/s0273-0979-1989-15851-5.

5

Traub, J. F., and H. Woźniakowski. "Perspectives on Information-Based Complexity." Bulletin of the American Mathematical Society 26, no. 1 (April 1, 1992): 29–53. http://dx.doi.org/10.1090/s0273-0979-1992-00240-9.

6

Lui, Leong Ting, Germán Terrazas, Hector Zenil, Cameron Alexander, and Natalio Krasnogor. "Complexity Measurement Based on Information Theory and Kolmogorov Complexity." Artificial Life 21, no. 2 (May 2015): 205–24. http://dx.doi.org/10.1162/artl_a_00157.

Abstract:
In the past decades many definitions of complexity have been proposed. Most of these definitions are based either on Shannon's information theory or on Kolmogorov complexity; these two are often compared, but very few studies integrate the two ideas. In this article we introduce a new measure of complexity that builds on both of these theories. As a demonstration of the concept, the technique is applied to elementary cellular automata and simulations of the self-organization of porphyrin molecules.
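The measure introduced in the article is not reproduced here, but the two ingredients the abstract names are easy to illustrate. The sketch below (an illustration only, not the authors' measure) computes the Shannon entropy of a binary string's symbol distribution and a crude compression-based stand-in for Kolmogorov complexity; a periodic string and a random string of the same length have the same per-symbol entropy but very different compressed sizes, which is precisely the gap between the two notions that motivates integrating them.

```python
# Illustration only, not the authors' measure: the two ingredients named in the
# abstract, evaluated on binary strings such as rows of a cellular automaton.
import math
import random
import zlib
from collections import Counter

def shannon_entropy(s: str) -> float:
    """Shannon entropy (bits per symbol) of the empirical symbol distribution."""
    counts = Counter(s)
    n = len(s)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def kolmogorov_estimate(s: str) -> int:
    """Crude upper bound on Kolmogorov complexity: zlib-compressed size in bytes."""
    return len(zlib.compress(s.encode("utf-8"), 9))

if __name__ == "__main__":
    random.seed(0)
    periodic = "01" * 64                                      # ordered pattern
    noisy = "".join(random.choice("01") for _ in range(128))  # random pattern
    for name, s in (("periodic", periodic), ("noisy", noisy)):
        # Same per-symbol entropy (about 1 bit), very different compressed sizes.
        print(name, round(shannon_entropy(s), 3), kolmogorov_estimate(s))
```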
7

Parlett, Beresford N. "Some basic information on information-based complexity theory." Bulletin of the American Mathematical Society 26, no. 1 (January 1, 1992): 3–29. http://dx.doi.org/10.1090/s0273-0979-1992-00239-2.

8

Orme, Anthony Mark, Haining Yao, and Letha H. Etzkorn. "Complexity metrics for ontology based information." International Journal of Technology Management 47, no. 1/2/3 (2009): 161. http://dx.doi.org/10.1504/ijtm.2009.024120.

9

Packel, Edward W., and Henryk Woźniakowski. "Recent developments in information-based complexity." Bulletin of the American Mathematical Society 17, no. 1 (July 1, 1987): 9–37. http://dx.doi.org/10.1090/s0273-0979-1987-15511-x.

10

Galas, David J., Matti Nykter, Gregory W. Carter, Nathan D. Price, and Ilya Shmulevich. "Biological Information as Set-Based Complexity." IEEE Transactions on Information Theory 56, no. 2 (February 2010): 667–77. http://dx.doi.org/10.1109/tit.2009.2037046.

11

Woźniakowski, H. "A survey of information-based complexity." Journal of Complexity 1, no. 1 (October 1985): 11–44. http://dx.doi.org/10.1016/0885-064x(85)90020-2.

12

Woźniakowski, H. "Probabilistic setting of information-based complexity." Journal of Complexity 2, no. 3 (September 1986): 255–69. http://dx.doi.org/10.1016/0885-064x(86)90005-1.

13

Mathé, P. "s-Numbers in information-based complexity." Journal of Complexity 6, no. 1 (March 1990): 41–66. http://dx.doi.org/10.1016/0885-064x(90)90011-2.

14

Traub, J. F., and H. Woźniakowski. "Information-Based complexity: New questions for mathematicians." Mathematical Intelligencer 13, no. 2 (March 1991): 34–43. http://dx.doi.org/10.1007/bf03024085.

15

Nemirovsky, A. S. "Information-based complexity of linear operator equations." Journal of Complexity 8, no. 2 (June 1992): 153–75. http://dx.doi.org/10.1016/0885-064x(92)90013-2.

16

Cheung, Karen S. K., and Douglas Vogel. "Complexity Reduction in Lattice-Based Information Retrieval." Information Retrieval 8, no. 2 (April 2005): 285–99. http://dx.doi.org/10.1007/s10791-005-5663-y.

17

Pozo, Jose M., Arjan J. Geers, Maria-Cruz Villa-Uriol, and Alejandro F. Frangi. "Flow complexity in open systems: interlacing complexity index based on mutual information." Journal of Fluid Mechanics 825 (July 21, 2017): 704–42. http://dx.doi.org/10.1017/jfm.2017.392.

Abstract:
Flow complexity is related to a number of phenomena in science and engineering and has been approached from the perspective of chaotic dynamical systems, ergodic processes or mixing of fluids, just to name a few. To the best of our knowledge, all existing methods to quantify flow complexity are only valid for infinite time evolution, for closed systems or for mixing of two substances. We introduce an index of flow complexity coined the interlacing complexity index (ICI), valid for a single-phase flow in an open system with inlet and outlet regions, involving finite times. ICI is based on Shannon’s mutual information (MI), and inspired by an analogy between inlet–outlet open flow systems and communication systems in communication theory. The roles of transmitter, receiver and communication channel are played, respectively, by the inlet, the outlet and the flow transport between them. A perfectly laminar flow in a straight tube can be compared to an ideal communication channel where the transmitted and received messages are identical and hence the MI between input and output is maximal. For more complex flows, generated by more intricate conditions or geometries, the ability to discriminate the outlet position by knowing the inlet position is decreased, reducing the corresponding MI. The behaviour of the ICI has been tested with numerical experiments on diverse flow cases. The results indicate that the ICI provides a sensitive complexity measure with intuitive interpretation in a diversity of conditions and in agreement with other observations, such as Dean vortices and subjective visual assessments. As a crucial component of the ICI formulation, we also introduce the natural distribution of streamlines and the natural distribution of world-lines, with invariance properties with respect to the cross-section used to parameterize them, valid for any type of mass-preserving flow.
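As a rough illustration of the communication analogy described above (not the paper's full ICI, which is built on the natural distributions of streamlines and world-lines), one can bin the inlet and outlet crossing positions of traced streamlines and estimate the mutual information between the two bin labels: a laminar map gives maximal MI, while thorough mixing drives MI toward zero. A minimal sketch with synthetic bin labels:

```python
# Rough illustration of the communication analogy only (not the paper's full ICI):
# estimate the mutual information between binned inlet and outlet crossing
# positions of traced streamlines.
import numpy as np

def mutual_information(inlet_bins: np.ndarray, outlet_bins: np.ndarray) -> float:
    """Mutual information (bits) between two integer label arrays."""
    joint = np.zeros((inlet_bins.max() + 1, outlet_bins.max() + 1))
    np.add.at(joint, (inlet_bins, outlet_bins), 1)   # joint histogram of (inlet, outlet)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float((pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum())

# "Laminar" transport maps each inlet bin to itself (maximal MI, here log2(16) = 4 bits);
# thorough mixing makes the outlet bin independent of the inlet bin (MI near 0).
rng = np.random.default_rng(0)
inlet = rng.integers(0, 16, size=20000)
print(mutual_information(inlet, inlet))
print(mutual_information(inlet, rng.integers(0, 16, size=20000)))
```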
18

Han, Cuize, and Ming Yuan. "Information based complexity for high dimensional sparse functions." Journal of Complexity 57 (April 2020): 101443. http://dx.doi.org/10.1016/j.jco.2019.101443.

19

Tavares, Gabriela, and Panos Parpas. "On the information-based complexity of stochastic programming." Operations Research Letters 41, no. 6 (November 2013): 622–26. http://dx.doi.org/10.1016/j.orl.2013.08.011.

20

Bonmati, Ester, Anton Bardera, Miquel Feixas, and Imma Boada. "Novel Brain Complexity Measures Based on Information Theory." Entropy 20, no. 7 (June 25, 2018): 491. http://dx.doi.org/10.3390/e20070491.

21

Krivovichev, Sergey V. "Information-based measures of structural complexity of crystals." Acta Crystallographica Section A Foundations and Advances 73, a2 (December 1, 2017): C378. http://dx.doi.org/10.1107/s2053273317091951.

22

Zhang, H. X., Y. S. Zhu, and Z. M. Wang. "Complexity measure and complexity rate information based detection of ventricular tachycardia and fibrillation." Medical & Biological Engineering & Computing 38, no. 5 (September 2000): 553–57. http://dx.doi.org/10.1007/bf02345752.

23

Raginsky, Maxim, and Alexander Rakhlin. "Information-Based Complexity, Feedback and Dynamics in Convex Programming." IEEE Transactions on Information Theory 57, no. 10 (October 2011): 7036–56. http://dx.doi.org/10.1109/tit.2011.2154375.

24

Novak, Erich, Ian H. Sloan, Joseph F. Traub, and Henryk Woźniakowski. "Frances Kuo Wins the 2014 Information-Based Complexity Prize." Journal of Complexity 30, no. 4 (August 2014): v. http://dx.doi.org/10.1016/s0885-064x(14)00056-9.

25

Novak, Erich. "Nominations for 2016 Information-Based Complexity Young Researcher Award." Journal of Complexity 34 (June 2016): vii. http://dx.doi.org/10.1016/s0885-064x(16)30003-6.

26

Drori, Yoel. "The exact information-based complexity of smooth convex minimization." Journal of Complexity 39 (April 2017): 1–16. http://dx.doi.org/10.1016/j.jco.2016.11.001.

27

Le Yi Wang and Lin Lin. "Information-based complexity of uncertainty sets in feedback control." IEEE Transactions on Automatic Control 46, no. 4 (April 2001): 519–33. http://dx.doi.org/10.1109/9.917654.

28

Dale, M. B., M. Anand, and R. E. Desrochers. "Measuring information-based complexity across scales using cluster analysis." Ecological Informatics 2, no. 2 (June 2007): 121–27. http://dx.doi.org/10.1016/j.ecoinf.2007.03.011.

29

Milanese, M., and A. Vicino. "Information-Based Complexity and Nonparametric Worst-Case System Identification." Journal of Complexity 9, no. 4 (December 1993): 427–46. http://dx.doi.org/10.1006/jcom.1993.1028.

30

Pavlenko, Yaryna, and Iryna Yurchak. "Information currency converter based on Telegram messenger." Computer systems and network 4, no. 1 (December 16, 2022): 106–21. http://dx.doi.org/10.23939/csn2022.01.106.

Abstract:
The work is dedicated to the development of a mobile chatbot containing an information currency converter, designed for use by a wide range of people. The chatbot is a subject-oriented, text-based dialog interface that allows a user to perform a limited set of tasks: getting the current rate of major currencies (USD or EUR) relative to the national currency and finding out the current rate of cryptocurrencies (Bitcoin, Ethereum, Litecoin) in dollars or euros. To achieve this goal, the selected subject area was analyzed and appropriate conclusions were drawn. A study of analogues that perform tasks of similar complexity was carried out; only a few chatbots were identified, since a number of bots published on Telegram no longer provide their services or work incorrectly. The algorithm of the currency-conversion service built on the Telegram messenger is described. The chatbot is implemented in the Python programming language using the PyCharm development environment, which is well suited to the intended project and easy to use. Two options are available to the user: cryptocurrency rates from the CoinGecko site or the exchange rate from PrivatBank. The article also reviews similar Telegram bots that function like the one created. The author's bot has been developed, and the architecture and algorithm of the CurrencyBot currency conversion service are presented.
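The article's CurrencyBot code is not reproduced here; the sketch below only illustrates the kind of rate-fetching helpers such a bot needs, using the public CoinGecko and PrivatBank services the abstract mentions. The endpoint URLs and response fields are assumptions about those public APIs rather than details taken from the paper, and the Telegram dialog layer is omitted.

```python
# Hypothetical rate-fetching helpers for a currency chatbot; endpoint URLs and
# response fields are assumptions about the public CoinGecko / PrivatBank APIs,
# not code from the paper. The Telegram dialog layer is omitted.
import requests

def crypto_rate(coin: str = "bitcoin", vs: str = "usd") -> float:
    """Price of a cryptocurrency (e.g. bitcoin, ethereum, litecoin) in USD or EUR."""
    resp = requests.get(
        "https://api.coingecko.com/api/v3/simple/price",
        params={"ids": coin, "vs_currencies": vs},
        timeout=10,
    )
    resp.raise_for_status()
    return float(resp.json()[coin][vs])

def privatbank_rate(currency: str = "USD") -> tuple[float, float]:
    """(buy, sell) rate of USD or EUR against the national currency (UAH)."""
    resp = requests.get(
        "https://api.privatbank.ua/p24api/pubinfo?json&exchange&coursid=5",
        timeout=10,
    )
    resp.raise_for_status()
    for row in resp.json():
        if row.get("ccy") == currency:
            return float(row["buy"]), float(row["sale"])
    raise ValueError(f"no rate found for {currency!r}")

if __name__ == "__main__":
    print("BTC/USD:", crypto_rate("bitcoin", "usd"))
    print("USD/UAH (buy, sell):", privatbank_rate("USD"))
```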
31

Sun, Shuliang. "A New Information Hiding Method Based on Improved BPCS Steganography." Advances in Multimedia 2015 (2015): 1–7. http://dx.doi.org/10.1155/2015/698492.

Abstract:
Bit-plane complexity segmentation (BPCS) steganography is advantageous in its capacity and imperceptibility. The critical step in BPCS steganography is locating the noisy regions of a cover image exactly. The usual method, black-and-white border complexity, is simple, but it is not always reliable, especially for periodic patterns. Run-length irregularity and border noisiness are introduced in this paper to address this problem. Canonical Gray coding (CGC) is also used in place of pure binary coding (PBC), because CGC exploits characteristics of the human visual system. A conjugation operation is applied to convert simple blocks into complex ones. To resist BPCS steganalysis, the improved BPCS algorithm uses a different complexity threshold for each bit-plane: the higher the bit-plane, the smaller the complexity. Experiments show that the improved BPCS steganography is superior to the original BPCS steganography.
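For context, the baseline measure the paper improves on is easy to state: the black-and-white border complexity of an 8x8 bit-plane block is the number of adjacent pixel pairs that differ, divided by the maximum possible number of transitions. The sketch below illustrates only that baseline (not the paper's run-length irregularity or border noisiness measures) and also shows its failure mode on periodic patterns: a checkerboard scores the maximum value of 1.0 even though it is perfectly regular.

```python
# Minimal sketch of the classic black-and-white border complexity used to decide
# whether an 8x8 bit-plane block is "noisy" enough to hide data in; the paper's
# run-length irregularity and border noisiness measures are not reproduced here.
import numpy as np

def border_complexity(block: np.ndarray) -> float:
    """Fraction of adjacent pixel pairs (horizontal and vertical) that differ.

    `block` is a 2-D array of 0/1 values; the result is in [0, 1], with 0 for a
    flat block and roughly 0.5 for independent random noise.
    """
    h_changes = np.abs(np.diff(block, axis=1)).sum()   # transitions along rows
    v_changes = np.abs(np.diff(block, axis=0)).sum()   # transitions along columns
    rows, cols = block.shape
    max_changes = rows * (cols - 1) + cols * (rows - 1)
    return float(h_changes + v_changes) / max_changes

rng = np.random.default_rng(1)
flat = np.zeros((8, 8), dtype=int)
noise = rng.integers(0, 2, size=(8, 8))
checker = np.indices((8, 8)).sum(axis=0) % 2   # periodic pattern, scores 1.0
print(border_complexity(flat), border_complexity(noise), border_complexity(checker))
```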
32

Cheng, Ying, ZhiWei Guan, and HongLin Zhao. "Complexity metrics for auto fault diagnosis based on information entropy." IOP Conference Series: Materials Science and Engineering 392 (August 3, 2018): 062147. http://dx.doi.org/10.1088/1757-899x/392/6/062147.

33

Wu, Xue. "Calculation of the Minimum Computational Complexity Based on Information Entropy." International Journal on Computational Science & Applications 2, no. 1 (February 29, 2012): 73–82. http://dx.doi.org/10.5121/ijcsa.2012.2107.

34

Jiang, Tianzi. "A parallel information-based complexity approach to visual surface reconstruction." International Journal of Computer Mathematics 70, no. 2 (January 1998): 165–77. http://dx.doi.org/10.1080/00207169808804744.

35

Woźniakowski, Henryk. "Why does information-based complexity use the real number model?" Theoretical Computer Science 219, no. 1-2 (May 1999): 451–65. http://dx.doi.org/10.1016/s0304-3975(98)00300-4.

36

Novak, Erich. "2017 Joseph F. Traub Information-Based Complexity Young Researcher Award." Journal of Complexity 39 (April 2017): vi. http://dx.doi.org/10.1016/s0885-064x(17)30019-5.

37

Novak, Erich. "2018 Joseph F. Traub Information-Based Complexity Young Researcher Award." Journal of Complexity 44 (February 2018): v. http://dx.doi.org/10.1016/s0885-064x(17)30097-3.

38

Cha, Shin, In Sang Chung, and Yong Rae Kwon. "Complexity measures for concurrent programs based on information-theoretic metrics." Information Processing Letters 46, no. 1 (April 1993): 43–50. http://dx.doi.org/10.1016/0020-0190(93)90195-f.

39

Cho, S., R. Alamoudi, and S. Asfour. "Interaction-based complexity measure of manufacturing systems using information entropy." International Journal of Computer Integrated Manufacturing 22, no. 10 (October 2009): 909–22. http://dx.doi.org/10.1080/09511920902951393.

40

Kang, Hyun-Seok, and Chi-Hyuck Jun. "Mutual information-based multi-output tree learning algorithm." Intelligent Data Analysis 25, no. 6 (October 29, 2021): 1525–45. http://dx.doi.org/10.3233/ida-205367.

Abstract:
A tree model with low time complexity can support the application of artificial intelligence to industrial systems. Variable-selection-based tree learning algorithms are more time efficient than existing Classification and Regression Tree (CART) algorithms. To the best of our knowledge, no previous work handles categorical input variables in variable-selection-based multi-output tree learning. Also, in the case of multi-output regression trees, conventional variable-selection-based algorithms are not suitable for large datasets. We propose a mutual information-based multi-output tree learning algorithm that consists of variable selection and split optimization. The proposed method discretizes each variable into 2–4 clusters using k-means and selects the splitting variable from the discretized variables using mutual information. This variable selection component has relatively low time complexity and can be applied regardless of output dimension and type. The proposed split optimization component is more efficient than an exhaustive search. The performance of the proposed tree learning algorithm is similar to or better than that of a multi-output version of the CART algorithm on a specific dataset. In addition, on a large dataset, the time complexity of the proposed algorithm is significantly reduced compared to a CART algorithm.
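A minimal sketch of the variable-selection step described above, under simplifying assumptions (a single discrete output and a fixed number of k-means clusters per input); the paper's full multi-output tree learner and its split-optimization step are not reproduced here.

```python
# Sketch of mutual-information-based variable selection with k-means
# discretization; simplified to a single discrete output.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import mutual_info_score

def select_split_variable(X: np.ndarray, y: np.ndarray, n_bins: int = 3) -> int:
    """Return the column of X whose discretized values share the most
    mutual information with the discrete output labels y."""
    scores = []
    for j in range(X.shape[1]):
        km = KMeans(n_clusters=n_bins, n_init=10, random_state=0)
        bins = km.fit_predict(X[:, j].reshape(-1, 1))   # discretize one input column
        scores.append(mutual_info_score(bins, y))
    return int(np.argmax(scores))

# Toy check: only column 0 carries information about y.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))
y = (X[:, 0] > 0).astype(int)
print(select_split_variable(X, y))   # expected: 0
```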
41

Cao, Hai Wang, and Chao Gai Xue. "Self-Organization System Framework of Enterprise Information System Based on CAS." Advanced Materials Research 591-593 (November 2012): 2628–31. http://dx.doi.org/10.4028/www.scientific.net/amr.591-593.2628.

Abstract:
In order to avoid enterprise information system (EIS) risk, the self-organization mechanism of EIS based on complex adaptive systems (CAS) is studied. First, the self-organization properties of EIS are analyzed, which include openness, nonlinearity, operation far from equilibrium, and fluctuations. Second, the complex properties and complex adaptive properties of EIS self-organization are studied. The complex properties include multiple agents, active adaptation of agents, a multi-level nature, technology complexity, organizational complexity, process complexity, and environment complexity. The complex adaptive properties include an aggregation mechanism, an identification mechanism, non-linear characteristics, flow characteristics, diversity characteristics, an internal model mechanism, and block characteristics. Finally, an architecture model of EIS self-organization is proposed, together with its macro and micro models, which provides a new perspective on EIS and helps in understanding the rules of EIS implementation.
42

Pei, Xiao Bing, and Shao Ping Lv. "Research on Effectiveness of Production Resource Allocation Based on Extended Information Entropy." Applied Mechanics and Materials 687-691 (November 2014): 5145–48. http://dx.doi.org/10.4028/www.scientific.net/amm.687-691.5145.

Abstract:
Effective resource allocation in a production system is the key to high performance. This paper first analyses the entropy increase that impairs the operation of production factors; information entropy theory is then extended to establish a complexity model based on size, difficulty, and state diversity. On the basis of this complexity, the decline in the factors' utilization efficiency is described. Moreover, some specific management methods are introduced to illustrate the importance of complexity control.
43

Mattos, Sérgio Henrique Vannucchi Leme de, Luiz Eduardo Vicente, Andrea Koga Vicente, Cláudio Bielenki Júnior, and José Roberto Castilho Piqueira. "Metrics based on information entropy applied to evaluate complexity of landscape patterns." PLOS ONE 17, no. 1 (January 20, 2022): e0262680. http://dx.doi.org/10.1371/journal.pone.0262680.

Abstract:
Landscape is an ecological category represented by a complex system formed by interactions between society and nature. Spatial patterns of different land uses present in a landscape reveal past and present processes responsible for its dynamics and organisation. Measuring the complexity of these patterns (in the sense of their spatial heterogeneity) allows us to evaluate the integrity and resilience of these complex environmental systems. Here, we show how landscape metrics based on information entropy can be applied to evaluate the complexity (in the sense of spatial heterogeneity) of patch patterns, as well as their transition zones, present in a Cerrado conservation area and its surroundings, located in south-eastern Brazil. The analysis in this study aimed to elucidate how changes in land use and the consequent fragmentation affect the complexity of the landscape. The scripts CompPlex HeROI and CompPlex Janus were created to allow calculation of information entropy (He), variability (He/Hmax), and López-Ruiz, Mancini, and Calbet (LMC) and Shiner, Davison, and Landsberg (SDL) measures. CompPlex HeROI enabled the calculation of these measures for different regions of interest (ROIs) selected in a satellite image of the study area, followed by comparison of the complexity of their patterns, in addition to enabling the generation of complexity signatures for each ROI. CompPlex Janus made it possible to spatialise the results for these four measures in landscape complexity maps. As expected, both for the complexity patterns evaluated by CompPlex HeROI and the complexity maps generated by CompPlex Janus, the areas with vegetation located in a region of intermediate spatial heterogeneity had lower values for the He and He/Hmax measures and higher values for the LMC and SDL measures. So, these landscape metrics were able to capture the behaviour of the patterns of different types of land use present in the study area, bringing together uses linked to vegetation with increased canopy coverage and differentiating them from urban areas and transition areas that mix different uses. Thus, the algorithms implemented in these scripts were demonstrated to be robust and capable of measuring the variability in information levels from the landscape, not only in terms of spatial datasets but also spectrally. The automation of these information-entropy calculations by the scripts allows a quick assessment of the complexity of patterns present in a landscape and thus generates indicators of landscape integrity and resilience.
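The four measures named in the abstract have compact definitions in terms of the class proportions p_i of a landscape window: the entropy He = -sum(p_i log2 p_i), the variability He/Hmax with Hmax = log2 N, and the LMC and SDL measures, which combine normalized entropy with a disequilibrium or order term. The sketch below uses one common convention for LMC and SDL; the exact normalizations in the CompPlex scripts may differ.

```python
# Minimal sketch of the four measures named in the abstract, computed from the
# class proportions of a landscape window; normalizations follow one common
# convention and may differ in detail from the CompPlex scripts.
import numpy as np

def landscape_measures(p) -> dict:
    """He, He/Hmax, LMC and SDL for a probability vector p over land-use classes."""
    p = np.asarray(p, dtype=float)
    p = p / p.sum()
    n = p.size
    nz = p > 0
    he = -(p[nz] * np.log2(p[nz])).sum()          # Shannon entropy
    hmax = np.log2(n)
    variability = he / hmax                       # He / Hmax
    diseq = ((p - 1.0 / n) ** 2).sum()            # distance from the uniform distribution
    lmc = variability * diseq                     # Lopez-Ruiz/Mancini/Calbet form
    sdl = variability * (1.0 - variability)       # Shiner/Davison/Landsberg form
    return {"He": he, "He/Hmax": variability, "LMC": lmc, "SDL": sdl}

print(landscape_measures([0.85, 0.05, 0.05, 0.05]))  # window dominated by one class
print(landscape_measures([0.25, 0.25, 0.25, 0.25]))  # maximally heterogeneous window
```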
44

Langer, Carlotta, and Nihat Ay. "Complexity as Causal Information Integration." Entropy 22, no. 10 (September 30, 2020): 1107. http://dx.doi.org/10.3390/e22101107.

Abstract:
Complexity measures in the context of the Integrated Information Theory of consciousness try to quantify the strength of the causal connections between different neurons. This is done by minimizing the KL-divergence between a full system and one without causal cross-connections. Various measures have been proposed and compared in this setting. We will discuss a class of information geometric measures that aim at assessing the intrinsic causal cross-influences in a system. One promising candidate of these measures, denoted by ΦCIS, is based on conditional independence statements and does satisfy all of the properties that have been postulated as desirable. Unfortunately it does not have a graphical representation, which makes it less intuitive and difficult to analyze. We propose an alternative approach using a latent variable, which models a common exterior influence. This leads to a measure ΦCII, Causal Information Integration, that satisfies all of the required conditions. Our measure can be calculated using an iterative information geometric algorithm, the em-algorithm. Therefore we are able to compare its behavior to existing integrated information measures.
45

Abad, Andres G., and Jionghua Jin. "Complexity metrics for mixed model manufacturing systems based on information entropy." International Journal of Information and Decision Sciences 3, no. 4 (2011): 313. http://dx.doi.org/10.1504/ijids.2011.043025.

46

Kamejima, Kohji. "Chromatic Information Adaptation for Complexity-Based Integration of Multi-Viewpoint Imagery." Proceedings of the ISCIE International Symposium on Stochastic Systems Theory and its Applications 2007 (May 5, 2007): 82–87. http://dx.doi.org/10.5687/sss.2007.82.

47

Akman, Olcay. "Information Complexity Based Modeling in the Presence of Length-Biased Sampling." Journal of Statistical Theory and Practice 4, no. 1 (March 2010): 45–55. http://dx.doi.org/10.1080/15598608.2010.10411972.

48

"Information-Based Complexity." Science 243, no. 4895 (March 3, 1989): 1142–43. http://dx.doi.org/10.1126/science.243.4895.1142-a.

49

"Information-based complexity." Mathematics and Computers in Simulation 31, no. 1-2 (February 1989): 142. http://dx.doi.org/10.1016/0378-4754(89)90072-4.

50

"2004 Information-Based Complexity Prize Committee." Journal of Complexity 20, no. 1 (February 2004): 4. http://dx.doi.org/10.1016/j.jco.2003.11.002.

