Ready-made bibliography on the topic "Information theory"

Create an accurate reference in APA, MLA, Chicago, Harvard, and many other citation styles


Browse lists of current articles, books, theses, conference abstracts, and other scholarly sources on the topic "Information theory".

Next to every work in the bibliography there is an "Add to bibliography" button. Click it, and we will automatically generate a bibliographic reference to the selected work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the publication as a .pdf file and read its abstract online, provided that the relevant details are available in the record's metadata.

Journal articles on the topic "Information theory"

1. Syau, Yu-Ru, and En-Bing Lin. "Evidence Theory in Incomplete Information Tables". International Journal of Machine Learning and Computing 5, no. 3 (June 2015): 242–46. http://dx.doi.org/10.7763/ijmlc.2015.v5.514.
2. Hayashi, Masahito. "Role of Quantum Information Theory in Information Theory". IEICE ESS Fundamentals Review 10, no. 1 (2016): 4–13. http://dx.doi.org/10.1587/essfr.10.1_4.
3. Mandusic, Dubravka, and Lucija Blaskovic. "Information Literacy, Theory and Practice in Education". Revista Romaneasca pentru Educatie Multidimensionala 5, no. 1 (June 30, 2013): 47–58. http://dx.doi.org/10.18662/rrem/2013.0501.04.
4. Khouzani, MHR, and Pasquale Malacaria. "Information Theory in Game Theory". Entropy 20, no. 11 (October 24, 2018): 817. http://dx.doi.org/10.3390/e20110817.

Abstract:
Information theory, as the mathematics of communication and storage of information, and game theory, as the mathematics of adversarial and cooperative strategic behaviour, are each successful fields of research on their own. [...]
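The abstract above frames information theory as the mathematics of communication and storage; the quantities both fields build on are the standard Shannon ones, stated here as general background rather than as the paper's own definitions. For a discrete random variable $X \sim p$, the entropy is $H(X) = -\sum_x p(x)\log_2 p(x)$, and for a pair $(X,Y)$ the mutual information is $I(X;Y) = \sum_{x,y} p(x,y)\log_2\frac{p(x,y)}{p(x)\,p(y)} = H(X) + H(Y) - H(X,Y)$, i.e. the number of bits one side's observation reveals about the other, which is the quantity an adversarial or cooperative player would try to limit or exploit.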
5. Ellerman, David. "Logical information theory: new logical foundations for information theory". Logic Journal of the IGPL 25, no. 5 (August 7, 2017): 806–35. http://dx.doi.org/10.1093/jigpal/jzx022.
6. Ashby, F. Gregory, and Kenneth H. Norwich. "Resurrecting Information Theory". American Journal of Psychology 108, no. 4 (1995): 609. http://dx.doi.org/10.2307/1423078.
7. Wills, S. "Quantum Information Theory". Irish Mathematical Society Bulletin 0082 (2018): 35–37. http://dx.doi.org/10.33232/bims.0082.35.37.
8. van Lambalgen, Michiel. "Algorithmic Information Theory". Journal of Symbolic Logic 54, no. 4 (December 1989): 1389. http://dx.doi.org/10.2307/2274821.
9. Bennett, C. H., and P. W. Shor. "Quantum information theory". IEEE Transactions on Information Theory 44, no. 6 (1998): 2724–42. http://dx.doi.org/10.1109/18.720553.
10. McCornack, Steven A. "Information manipulation theory". Communication Monographs 59, no. 1 (March 1992): 1–16. http://dx.doi.org/10.1080/03637759209376245.

Doctoral dissertations on the topic "Information theory"

1. Hjørland, Birger. "Principia Informatica. Foundational Theory of Information and Principles of Information Services". Libraries Unlimited, 2002. http://hdl.handle.net/10150/105735.

Abstract:
Library and information science (LIS) may alternatively be labeled library, information and documentation studies, LID or just information science, IS. In taking IS serious as a research field, this paper presents an understanding of one of its core concepts (information) and outlines its fundamental principles. It is shown that there exist hierarchies of information processing mechanisms in nature and culture and that IS is concerned with only the highest forms of such mechanisms, which consist of libraries, electronic databases and related information services. Theories about such high-level information systems are closely related to theoretical views of knowledge, language, documents, cognition, science and communication. Information scientists are not the only experts involved in the handling of information, and a view of our special role is presented. The aspiration of this article is to provide a synopsis of the fundamentals of IS: Principia Informatica.
2. Bond, Rachael Louise. "Relational information theory". Thesis, University of Sussex, 2018. http://sro.sussex.ac.uk/id/eprint/76664/.
3. Sahai, Anant. "Anytime information theory". Thesis, Massachusetts Institute of Technology, 2001. http://hdl.handle.net/1721.1/8770.

Abstract:
Thesis (Ph.D.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 2001.
Includes bibliographical references (p. 171-175).
We study the reliable communication of delay-sensitive bit streams through noisy channels. To bring the issues into sharp focus, we will focus on the specific problem of communicating the values of an unstable real-valued discrete-time Markov random process through a finite capacity noisy channel so as to have finite average squared error from end-to-end. On the source side, we give a coding theorem for such unstable processes that shows that we can achieve the rate-distortion bound even in the infinite horizon case if we are willing to tolerate bounded delays in encoding and decoding. On the channel side, we define a new parametric notion of capacity called anytime capacity that corresponds to a sense of reliable transmission that is stronger than the traditional Shannon capacity sense but is less demanding than the sense underlying zero-error capacity. We show that anytime capacity exists for memoryless channels without feedback and is connected to standard random coding error exponents. The main result of the thesis is a new source/channel separation theorem that encompasses unstable processes and establishes that the stronger notion of anytime capacity is required to be able to deal with delay-sensitive bit streams. This theorem is then applied in the control systems context to show that anytime capacity is also required to evaluate channels if we intend to use them as part of a feedback link from sensing to actuation. Finally, the theorem is used to shed light on the concept of "quality of service requirements" by examining a toy mathematical example for which we prove the absolute necessity of differentiated service without appealing to human preferences.
by Anant Sahai.
Ph.D.
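As background for the setting the abstract describes (a standard formulation of the problem, not a restatement of the thesis's own theorems), consider the unstable scalar linear model $x_{t+1} = \lambda x_t + w_t$ with $|\lambda| > 1$ and bounded disturbance $w_t$. The uncertainty about the state grows by a factor of $|\lambda|$ per step, so keeping any finite end-to-end estimation-error moment requires the channel to deliver more than $\log_2|\lambda|$ bits per time step; the abstract's point is that this rate must also be delivered with a sufficiently strong anytime reliability, a requirement that ordinary Shannon capacity does not capture for such delay-sensitive streams.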
4. Schumann, Robert Helmut. "Quantum information theory". Thesis, Stellenbosch: Stellenbosch University, 2000. http://hdl.handle.net/10019.1/51892.

Abstract:
Thesis (MSc)--Stellenbosch University, 2000
ENGLISH ABSTRACT: What are the information processing capabilities of physical systems? As recently as the first half of the 20th century this question did not even have a definite meaning. What is information, and how would one process it? It took the development of theories of computing (in the 1930s) and information (late in the 1940s) for us to formulate mathematically what it means to compute or communicate. Yet these theories were abstract, based on axiomatic mathematics: what did physical systems have to do with these axioms? Rolf Landauer had the essential insight - "Information is physical" - that information is always encoded in the state of a physical system, whose dynamics on a microscopic level are well-described by quantum physics. This means that we cannot discuss information without discussing how it is represented, and how nature dictates it should behave. Wigner considered the situation from another perspective when he wrote about "the unreasonable effectiveness of mathematics in the natural sciences". Why are the computational techniques of mathematics so astonishingly useful in describing the physical world [1]? One might begin to suspect foul play in the universe's operating principles. Interesting insights into the physics of information accumulated through the 1970s and 1980s - most sensationally in the proposal for a "quantum computer". If we were to mark a particular year in which an explosion of interest took place in information physics, that year would have to be 1994, when Shor showed that a problem of practical interest (factorisation of integers) could be solved easily on a quantum computer. But the applications of information in physics - and vice versa - have been far more widespread than this popular discovery. These applications range from improved experimental technology, more sophisticated measurement techniques, methods for characterising the quantum/classical boundary, tools for quantum chaos, and deeper insight into quantum theory and nature. In this thesis I present a short review of ideas in quantum information theory. The first chapter contains introductory material, sketching the central ideas of probability and information theory. Quantum mechanics is presented at the level of advanced undergraduate knowledge, together with some useful tools for quantum mechanics of open systems. In the second chapter I outline how classical information is represented in quantum systems and what this means for agents trying to extract information from these systems. The final chapter presents a new resource: quantum information. This resource has some bewildering applications which have been discovered in the last ten years, and continually presents us with unexpected insights into quantum theory and the universe.
5. Huang, Shao-Lun. "Euclidean network information theory". Thesis, Massachusetts Institute of Technology, 2013. http://hdl.handle.net/1721.1/84888.

Abstract:
Thesis (Ph. D.)--Massachusetts Institute of Technology, Department of Electrical Engineering and Computer Science, 2013.
Cataloged from PDF version of thesis.
Includes bibliographical references (pages 121-123).
Many network information theory problems face the similar difficulty of single letterization. We argue that this is due to the lack of a geometric structure on the space of probability distributions. In this thesis, we develop such a structure by assuming that the distributions of interest are all close to each other. Under this assumption, the Kullback-Leibler (K-L) divergence is reduced to the squared Euclidean metric in an Euclidean space. In addition, we construct the notion of coordinate and inner product, which will facilitate solving communication problems. We will present the application of this approach to the point-to-point channels, general broadcast channels (BC), multiple access channels (MAC) with common sources, interference channels, and multi-hop layered communication networks without or with feedback. It can be shown that with this approach, information theory problems, such as the single-letterization, can be reduced to some linear algebra problems. Solving these linear algebra problems, we will show that for the general broadcast channels, transmitting the common message to receivers can be formulated as the trade-off between linear systems. We also provide an example to visualize this trade-off in a geometric way. For the MAC with common sources, we observe a coherent combining gain due to the cooperation between transmitters, and this gain can be obtained quantitively by applying our technique. In addition, the developments of the broadcast channels and multiple access channels suggest a trade-off relation between generating common messages for multiple users and transmitting them as the common sources to exploit the coherent combining gain, when optimizing the throughputs of communication networks. To study the structure of this trade-off and understand its role in optimizing the network throughput, we construct a deterministic model by our local approach that captures the critical channel parameters and well models the network. With this deterministic model, for multi-hop layered networks, we analyze the optimal network throughputs, and illustrate what kinds of common messages should be generated to achieve the optimal throughputs. Our results provide the insight of how users in a network should cooperate with each other to transmit information efficiently.
by Shao-Lun Huang.
Ph.D.
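The geometric structure the abstract refers to comes from how the Kullback-Leibler divergence behaves between nearby distributions. As a standard second-order expansion (given here as background consistent with the abstract, not quoted from the thesis): writing $P = Q + \epsilon J$ with $\sum_x J(x) = 0$ and $\epsilon$ small,
$$D(P\|Q) = \sum_x P(x)\log\frac{P(x)}{Q(x)} \approx \frac{\epsilon^2}{2}\sum_x \frac{J(x)^2}{Q(x)},$$
a squared weighted-Euclidean distance. Treating the perturbations $J$ as vectors in this Euclidean space is what allows single-letterization questions to be rephrased as linear-algebra problems in the framework the abstract outlines.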
6. Faghfoor, Maghrebi Mohammad. "Information gain in quantum theory". Thesis, University of British Columbia, 2008. http://hdl.handle.net/2429/2724.

Abstract:
In this thesis I address the fundamental question that how the information gain is possible in the realm of quantum mechanics where a single measurement alters the state of the system. I study an ensemble of particles in some unknown (but product) state in detail and suggest an optimal way of gaining the maximum information and also quantify the corresponding information exactly. We find a rather novel result which is quite different from other well-known definitions of the information gain in quantum theory.
7. Vedral, Vlatko. "Quantum information theory of entanglement". Thesis, Imperial College London, 1998. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.299786.
8. Girolami, Davide. "Quantum correlations in information theory". Thesis, University of Nottingham, 2013. http://eprints.nottingham.ac.uk/13397/.

Abstract:
The project concerned the study of quantum correlations (QC) in compound systems, i.e. statistical correlations more general than entanglement which are predicted by quantum mechanics but not described in any classical scenario. I aimed to understand the technical and operational properties of the measures of QC, their interplay with entanglement quantifiers and the experimental accessibility. In the first part of my research path, after having acquired the conceptual and technical rudiments of the project, I provided solutions for some computational issues: I developed analytical and numerical algorithms for calculating bipartite QC in finite dimensional systems. Then, I tackled the problem of the experimental detection of QC. There is no Hermitian operator associated with entanglement measures, nor with QC ones. However, the information encoded in a density matrix is redundant to quantify them, thus the full knowledge of the state is not required to accomplish the task. I reported the first protocol to measure the QC of an unknown state by means of a limited number of measurements, without performing the tomography of the state. My proposal has been implemented experimentally in a NMR (Nuclear Magnetic Resonance) setting. In the final stage of the project, I explored the foundational and operational merits of QC. I showed that the QC shared by two subsystems yield a genuinely quantum kind of uncertainty on single local observables. The result is a promising evidence of the potential exploitability of separable (unentangled) states for quantum metrology in noisy conditions.
9. Hawes, Vanessa Lucey. "Music's experiment with information theory". Thesis, University of East Anglia, 2008. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.514351.
10. Daemi, M. F. "Information theory and pattern recognition". Thesis, University of Nottingham, 1990. http://eprints.nottingham.ac.uk/14003/.

Abstract:
This thesis presents an account of an investigation into the use of information theory measures in pattern recognition problems. The objectives were firstly to determine the information content of the set of representations of an input image which are found at the output of an array of sensors; secondly to assess the information which may be used to allocate different patterns to appropriate classes in order to provide a means of recognition; and thirdly to assess the recognition capability of pattern recognition systems and their efficiency of utilization of information. Information assessment techniques were developed using fundamental principles of information theory. These techniques were used to assess the information associated with attributes such as orientation and location, of a variety of input images. The techniques were extended to permit the assessment of recognition capability and to provide a measure of the efficiency with which pattern recognition systems use the information available.
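As a concrete illustration of the kind of assessment the abstract describes, the information that a sensor-array output carries about a pattern attribute such as orientation can be estimated from a joint histogram of attribute values and quantised responses. The sketch below is illustrative only, with invented variable names, and is not code from the thesis.

import numpy as np

def entropy(p):
    # Shannon entropy in bits of a probability vector; zero entries are ignored.
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def mutual_information(joint_counts):
    # I(attribute; response) in bits, computed from a 2-D contingency table of counts.
    joint = joint_counts / joint_counts.sum()   # joint distribution p(attribute, response)
    p_attr = joint.sum(axis=1)                  # marginal over attribute values
    p_resp = joint.sum(axis=0)                  # marginal over sensor responses
    return entropy(p_attr) + entropy(p_resp) - entropy(joint.ravel())

# Hypothetical counts of (orientation bin, quantised sensor response) over a set of images.
counts = np.array([[30.0, 5.0, 1.0],
                   [4.0, 28.0, 6.0],
                   [2.0, 7.0, 25.0]])
print(mutual_information(counts))  # bits of orientation information in the response

A value close to the entropy of the attribute's marginal distribution would indicate that the sensor representation preserves nearly all of the information needed to recover that attribute.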

Books on the topic "Information theory"

1. Duplantier, Bertrand, and Vincent Rivasseau, eds. Information Theory. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-81480-9.
2. Krippendorff, Klaus. Information Theory. Newbury Park, CA: SAGE Publications, Inc., 1986. http://dx.doi.org/10.4135/9781412984485.
3. Information theory. Cambridge [England]: Cambridge University Press, 1997.
4. Information theory. New York: Dover Publications, 1990.
5. Goldman, Stanford. Information theory. Mineola, N.Y.: Dover Publications, 2005.
6. Chambert-Loir, Antoine. Information Theory. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-031-21561-2.
7. Winter, Stephan, Matt Duckham, Lars Kulik, and Ben Kuipers, eds. Spatial Information Theory. Berlin, Heidelberg: Springer Berlin Heidelberg, 2007. http://dx.doi.org/10.1007/978-3-540-74788-8.
8. Hayashi, Masahito. Quantum Information Theory. Berlin, Heidelberg: Springer Berlin Heidelberg, 2017. http://dx.doi.org/10.1007/978-3-662-49725-8.
9. Seibt, Peter. Algorithmic Information Theory. Berlin, Heidelberg: Springer Berlin Heidelberg, 2006. http://dx.doi.org/10.1007/978-3-540-33219-0.
10. Egenhofer, Max, Nicholas Giudice, Reinhard Moratz, and Michael Worboys, eds. Spatial Information Theory. Berlin, Heidelberg: Springer Berlin Heidelberg, 2011. http://dx.doi.org/10.1007/978-3-642-23196-4.

Book chapters on the topic "Information theory"

1. Shekhar, Shashi, and Hui Xiong. "Information Theory". In Encyclopedia of GIS, 582. Boston, MA: Springer US, 2008. http://dx.doi.org/10.1007/978-0-387-35973-1_638.
2. Holt, Anatol W. "Information (Theory)". In Organized Activity and its Support by Computer, 119–37. Dordrecht: Springer Netherlands, 1997. http://dx.doi.org/10.1007/978-94-011-5590-8_8.
3. Bossomaier, Terry, Lionel Barnett, Michael Harré, and Joseph T. Lizier. "Information Theory". In An Introduction to Transfer Entropy, 33–63. Cham: Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-43222-9_3.
4. Maña, Carlos. "Information Theory". In UNITEXT for Physics, 221–44. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-55738-0_4.
5. Jou, David, José Casas-Vázquez, and Georgy Lebon. "Information Theory". In Extended Irreversible Thermodynamics, 143–67. Dordrecht: Springer Netherlands, 2009. http://dx.doi.org/10.1007/978-90-481-3074-0_6.
6. Weik, Martin H. "information theory". In Computer Science and Communications Dictionary, 779. Boston, MA: Springer US, 2000. http://dx.doi.org/10.1007/1-4020-0613-6_8973.
7. Beigi, Homayoon. "Information Theory". In Fundamentals of Speaker Recognition, 265–300. Boston, MA: Springer US, 2011. http://dx.doi.org/10.1007/978-0-387-77592-0_7.
8. Dobrushin, R. L. "Information Theory". In Mathematics and Its Applications, 222–25. Dordrecht: Springer Netherlands, 1993. http://dx.doi.org/10.1007/978-94-017-2973-4_15.
9. Jou, David, José Casas-Vázquez, and Georgy Lebon. "Information Theory". In Extended Irreversible Thermodynamics, 165–90. Berlin, Heidelberg: Springer Berlin Heidelberg, 2001. http://dx.doi.org/10.1007/978-3-642-56565-6_7.
10. Utgoff, Paul E., James Cussens, Stefan Kramer, Sanjay Jain, Frank Stephan, Luc De Raedt, Ljupčo Todorovski, et al. "Information Theory". In Encyclopedia of Machine Learning, 548. Boston, MA: Springer US, 2011. http://dx.doi.org/10.1007/978-0-387-30164-8_404.

Conference abstracts on the topic "Information theory"

1. Jiang, Xianyang, and Jianhua Lu. "Match Information Theory". In 2010 6th International Conference on Wireless Communications, Networking and Mobile Computing (WiCOM). IEEE, 2010. http://dx.doi.org/10.1109/wicom.2010.5600702.
2. Borade, Shashi, and Lizhong Zheng. "Euclidean Information Theory". In 2008 IEEE International Zurich Seminar on Communications (IZS). IEEE, 2008. http://dx.doi.org/10.1109/izs.2008.4497265.
3. Enßlin, Torsten. "Information field theory". In Bayesian Inference and Maximum Entropy Methods in Science and Engineering: 32nd International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering. AIP, 2013. http://dx.doi.org/10.1063/1.4819999.
4. Feria, Erlan H. "Latency-information theory". In 2010 IEEE Sarnoff Symposium. IEEE, 2010. http://dx.doi.org/10.1109/sarnof.2010.5469775.
5. "Network Information Theory". In 2022 IEEE 4th International Conference on Advanced Trends in Information Theory (ATIT). IEEE, 2022. http://dx.doi.org/10.1109/atit58178.2022.10024186.
6. "Network Information Theory". In 2021 IEEE 3rd International Conference on Advanced Trends in Information Theory (ATIT). IEEE, 2021. http://dx.doi.org/10.1109/atit54053.2021.9678710.
7. Feria, Erlan H. "Latency-Information Theory: A Novel Latency Theory Revealed as Time Dual of Information Theory". In 2009 IEEE 13th Digital Signal Processing Workshop and 5th IEEE Signal Processing Education Workshop. IEEE, 2009. http://dx.doi.org/10.1109/dsp.2009.4785904.
8. Yarichin, E. M. "Theory of full information (video-information)". In 2011 International Siberian Conference on Control and Communications (SIBCON 2011). IEEE, 2011. http://dx.doi.org/10.1109/sibcon.2011.6072609.
9. Wang, Yingxu. "A cognitive informatics theory for visual information processing". In 2008 7th IEEE International Conference on Cognitive Informatics (ICCI). IEEE, 2008. http://dx.doi.org/10.1109/coginf.2008.4639184.
10. Moskowitz, Ira S., Pedro N. Safier, and Paul Cotae. "Algebraic Information Theory and Kosko's Forbidden Interval Theorem". In Modelling, Identification and Control. Calgary, AB, Canada: ACTAPRESS, 2013. http://dx.doi.org/10.2316/p.2013.801-053.

Organizational reports on the topic "Information theory"

1. Spivak, David. Categorical Information Theory. Fort Belvoir, VA: Defense Technical Information Center, May 2011. http://dx.doi.org/10.21236/ada543905.
2. Moran, William. Coding Theory Information Theory and Radar. Fort Belvoir, VA: Defense Technical Information Center, September 2005. http://dx.doi.org/10.21236/ada456510.
3. Calderbank, Arthur R. Coding Theory Information Theory and Radar. Fort Belvoir, VA: Defense Technical Information Center, January 2005. http://dx.doi.org/10.21236/ada434253.
4. Adami, Christoph. Relativistic Quantum Information Theory. Fort Belvoir, VA: Defense Technical Information Center, November 2007. http://dx.doi.org/10.21236/ada490967.
5. Parlett, Beresford. Some Basic Information on Information-Based Complexity Theory. Fort Belvoir, VA: Defense Technical Information Center, July 1989. http://dx.doi.org/10.21236/ada256585.
6. Gorecki, Frank D. Passive Tracking and Information Theory. Fort Belvoir, VA: Defense Technical Information Center, May 1999. http://dx.doi.org/10.21236/ada385452.
7. Burnett, Margaret M. Information Foraging Theory in Software Maintenance. Fort Belvoir, VA: Defense Technical Information Center, September 2012. http://dx.doi.org/10.21236/ada579505.
8. Dowski, Edward R., Jr. An Information Theory Approach to Three Incoherent Information Processing Systems. Fort Belvoir, VA: Defense Technical Information Center, January 1995. http://dx.doi.org/10.21236/ada299683.
9. Calabrese, P. G. A Theory of Conditional Information with Applications. Fort Belvoir, VA: Defense Technical Information Center, March 1994. http://dx.doi.org/10.21236/ada278164.
10. Allwein, Gerard T., Ira S. Moskowitz, and LiWu Chang. A New Framework for Shannon Information Theory. Fort Belvoir, VA: Defense Technical Information Center, January 2004. http://dx.doi.org/10.21236/ada420108.