Academic literature on the topic 'Information theory and compression'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Information theory and compression.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Information theory and compression"

1. Lawton, Wayne. "Information theory, wavelets, and image compression." International Journal of Imaging Systems and Technology 7, no. 3 (1996): 180–90. http://dx.doi.org/10.1002/(sici)1098-1098(199623)7:3<180::aid-ima4>3.0.co;2-4.
2. Gibson, Jerry. "Information Theory and Rate Distortion Theory for Communications and Compression." Synthesis Lectures on Communications 6, no. 1 (December 31, 2013): 1–127. http://dx.doi.org/10.2200/s00556ed1v01y201312com009.
3. Bookstein, Abraham, and Shmuel T. Klein. "Compression, information theory, and grammars: a unified approach." ACM Transactions on Information Systems 8, no. 1 (January 3, 1990): 27–49. http://dx.doi.org/10.1145/78915.78917.
4. Cai, Mingjie, and Qingguo Li. "Compression of Dynamic Fuzzy Relation Information Systems." Fundamenta Informaticae 142, no. 1-4 (December 9, 2015): 285–306. http://dx.doi.org/10.3233/fi-2015-1295.
5. Romeo, August, Enrique Gaztañaga, Jose Barriga, and Emilio Elizalde. "Information Content in Uniformly Discretized Gaussian Noise: Optimal Compression Rates." International Journal of Modern Physics C 10, no. 04 (June 1999): 687–716. http://dx.doi.org/10.1142/s0129183199000528.
Abstract: We approach the theoretical problem of compressing a signal dominated by Gaussian noise. We present expressions for the compression ratio which can be reached, in the light of Shannon's noiseless coding theorem, for a linearly quantized stochastic Gaussian signal (noise). The compression ratio decreases logarithmically with the amplitude of the frequency spectrum P(f) of the noise. Entropy values and compression rates are shown to depend on the shape of this power spectrum, given different normalizations. The cases of white noise (w.n.), f^(-p) power-law noise (including 1/f noise), (w.n. + 1/f) noise, and piecewise (w.n. + 1/f | w.n. + 1/f^2) noise are discussed, and quantitative behaviors and useful approximations are provided.
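
The quantities in this abstract are straightforward to reproduce numerically. The short Python/NumPy sketch below is an illustration of the idea, not the paper's derivation: it estimates the Shannon entropy of a linearly quantized Gaussian signal and the compression ratio permitted by the noiseless coding theorem. The 8-bit quantizer, the 4-sigma range, and all variable names are assumptions made for the example; note that doubling sigma raises the entropy by one bit, the logarithmic amplitude dependence the abstract describes.

import numpy as np

# Minimal sketch (assumptions, not the paper's code): entropy of a linearly
# quantized Gaussian signal and the implied optimal compression ratio.
rng = np.random.default_rng(0)
sigma = 1.0
n_bits = 8                              # bits per raw sample
samples = rng.normal(0.0, sigma, size=1_000_000)

# Uniform ("linear") quantization over [-4*sigma, 4*sigma].
lo, hi = -4 * sigma, 4 * sigma
delta = (hi - lo) / 2**n_bits           # quantization step
codes = np.clip(((samples - lo) / delta).astype(int), 0, 2**n_bits - 1)

# Empirical Shannon entropy H in bits/sample; by the noiseless coding
# theorem, H is the shortest attainable average code length.
p = np.bincount(codes, minlength=2**n_bits) / codes.size
p = p[p > 0]
H = -(p * np.log2(p)).sum()

# Theory: H ~= h(X) - log2(delta), where h(X) = 0.5*log2(2*pi*e*sigma^2)
# is the differential entropy of the Gaussian.
H_theory = 0.5 * np.log2(2 * np.pi * np.e * sigma**2) - np.log2(delta)
print(f"H = {H:.3f} bits (theory {H_theory:.3f}); ratio ~= {n_bits / H:.2f}")
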
6. Franz, Arthur, Oleksandr Antonenko, and Roman Soletskyi. "A theory of incremental compression." Information Sciences 547 (February 2021): 28–48. http://dx.doi.org/10.1016/j.ins.2020.08.035.
7. Bu, Yuheng, Weihao Gao, Shaofeng Zou, and Venugopal Veeravalli. "Information-Theoretic Understanding of Population Risk Improvement with Model Compression." Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 04 (April 3, 2020): 3300–3307. http://dx.doi.org/10.1609/aaai.v34i04.5730.
Abstract: We show that model compression can improve the population risk of a pre-trained model, by studying the tradeoff between the decrease in the generalization error and the increase in the empirical risk with model compression. We first prove that model compression reduces an information-theoretic bound on the generalization error; this allows for an interpretation of model compression as a regularization technique to avoid overfitting. We then characterize the increase in empirical risk with model compression using rate distortion theory. These results imply that the population risk could be improved by model compression if the decrease in generalization error exceeds the increase in empirical risk. We show through a linear regression example that such a decrease in population risk due to model compression is indeed possible. Our theoretical results further suggest that the Hessian-weighted K-means clustering compression approach can be improved by regularizing the distance between the clustering centers. We provide experiments with neural networks to support our theoretical assertions.
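
The Hessian-weighted K-means compression scheme mentioned at the end of the abstract can be sketched concretely. The Python code below is a hedged illustration built from the abstract's description, not the authors' implementation: a one-dimensional Lloyd iteration in which each weight's diagonal-Hessian entry acts as its importance, so cluster centers become Hessian-weighted means. The synthetic data and all names are assumptions, and the paper's proposed refinement (regularizing the distance between centers) is omitted.

import numpy as np

# Hedged sketch (my construction from the abstract, not the authors' code):
# Hessian-weighted k-means clustering of a pretrained model's weights.
# Each weight w_i carries an importance h_i (a diagonal Hessian entry), and
# the centers minimize sum_i h_i * (w_i - c_{a(i)})^2.
def hessian_weighted_kmeans(w, h, k, iters=50, seed=0):
    rng = np.random.default_rng(seed)
    centers = rng.choice(w, size=k, replace=False)
    for _ in range(iters):
        # Assign each weight to its nearest center.
        assign = np.abs(w[:, None] - centers[None, :]).argmin(axis=1)
        for j in range(k):
            mask = assign == j
            if mask.any():
                # Each center is the Hessian-weighted mean of its cluster.
                centers[j] = np.average(w[mask], weights=h[mask])
    return centers, assign

rng = np.random.default_rng(1)
w = rng.normal(size=10_000)                # pretrained weights (synthetic)
h = rng.uniform(0.1, 1.0, size=10_000)     # diagonal-Hessian importances
centers, assign = hessian_weighted_kmeans(w, h, k=16)
print(f"{w.size} weights quantized to {centers.size} shared values")
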
8. Chen, Yen-Liang, and Fang-Chi Chi. "Summarization of information systems based on rough set theory." Journal of Intelligent & Fuzzy Systems 40, no. 1 (January 4, 2021): 1001–15. http://dx.doi.org/10.3233/jifs-201160.
Abstract: In the rough set theory proposed by Pawlak, the concept of reduct is very important. The reduct is the minimum attribute set that preserves the partition of the universe. A great deal of research in the past has attempted to reduce the representation of the original table. The advantage of using a reduced representation table is that it can summarize the original table so that it retains the original knowledge without distortion. However, using reduct to summarize tables may encounter the problem of the table still being too large, so users will be overwhelmed by too much information. To solve this problem, this article considers how to further reduce the size of the table without causing too much distortion to the original knowledge. Therefore, we set an upper limit for information distortion, which represents the maximum degree of information distortion we allow. Under this upper limit of distortion, we seek to find the summary table with the highest compression. This paper proposes two algorithms. The first is to find all summary tables that satisfy the maximum distortion constraint, while the second is to further select the summary table with the greatest degree of compression from these tables.
9. Marzen, Sarah E., and Simon DeDeo. "The evolution of lossy compression." Journal of The Royal Society Interface 14, no. 130 (May 2017): 20170166. http://dx.doi.org/10.1098/rsif.2017.0166.
Abstract: In complex environments, there are costs to both ignorance and perception. An organism needs to track fitness-relevant information about its world, but the more information it tracks, the more resources it must devote to perception. As a first step towards a general understanding of this trade-off, we use a tool from information theory, rate–distortion theory, to study large, unstructured environments with fixed, randomly drawn penalties for stimuli confusion ('distortions'). We identify two distinct regimes for organisms in these environments: a high-fidelity regime where perceptual costs grow linearly with environmental complexity, and a low-fidelity regime where perceptual costs are, remarkably, independent of the number of environmental states. This suggests that in environments of rapidly increasing complexity, well-adapted organisms will find themselves able to make, just barely, the most subtle distinctions in their environment.
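
The trade-off studied here is the classical rate-distortion function, which for a finite environment with fixed, randomly drawn confusion penalties can be computed with the standard Blahut-Arimoto algorithm. The Python sketch below is an illustrative assumption, not the authors' code: a uniform source over 64 states, a random distortion matrix, and a sweep over the trade-off parameter beta that traces points on the R(D) curve.

import numpy as np

# Illustrative sketch (assumed setup, not the authors' code): Blahut-Arimoto
# for the rate-distortion function of a uniform source with fixed, randomly
# drawn stimuli-confusion penalties.
def blahut_arimoto(p_x, d, beta, iters=200):
    n, m = d.shape
    q_y = np.full(m, 1.0 / m)                 # output marginal, start uniform
    for _ in range(iters):
        w = q_y * np.exp(-beta * d)           # optimal channel p(y|x), unnormalized
        w /= w.sum(axis=1, keepdims=True)
        q_y = p_x @ w                         # re-estimate the output marginal
    rate = np.sum(p_x[:, None] * w * np.log2(w / q_y[None, :]))  # I(X;Y) in bits
    distortion = np.sum(p_x[:, None] * w * d)                    # expected penalty
    return rate, distortion

n_states = 64                                 # environmental complexity
rng = np.random.default_rng(0)
p_x = np.full(n_states, 1.0 / n_states)       # uniform, unstructured environment
d = rng.uniform(size=(n_states, n_states))    # random confusion costs
np.fill_diagonal(d, 0.0)                      # no penalty for correct perception
for beta in (0.5, 2.0, 8.0):                  # slope of the R(D) trade-off
    R, D = blahut_arimoto(p_x, d, beta)
    print(f"beta={beta:4.1f}  rate={R:.3f} bits  distortion={D:.3f}")
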
10. Maslov, V. P., and V. E. Nazaikinskii. "Remark on the notion of optimal data compression in information theory." Mathematical Notes 99, no. 3-4 (March 2016): 616–18. http://dx.doi.org/10.1134/s0001434616030378.

Dissertations / Theses on the topic "Information theory and compression"

1. Presnell, Stuart. "Minimal resources in quantum information theory: compression and measurement." Thesis, University of Bristol, 2004. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.399944.
2. Reid, Mark Montgomery. "Path-dictated, lossless volumetric data compression." Thesis, University of Ulster, 1996. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.338194.
3. Shih, An-Zen. "Fractal compression analysis of superdeformed nucleus data." Thesis, University of Liverpool, 1998. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.266091.
4. Hong, Edwin S. "Group testing for image compression." Thesis, University of Washington, 2001. http://hdl.handle.net/1773/6900.
5. Zemouri, Rachid. "Data compression of speech using sub-band coding." Thesis, University of Newcastle upon Tyne, 1991. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.316094.
6. Tang, P. S. "Data compression for high precision digital waveform recording." Thesis, City University London, 1988. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.384076.
7. Jiang, Jianmin. "Multi-media data compression and real-time novel architectures implementation." Thesis, University of Southampton, 1993. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.239417.
8. Reeder, Brian Martin. "Application of artificial neural networks for spacecraft instrument data compression." Thesis, University of Sussex, 1997. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.362216.
9. Yamazato, Takaya, Iwao Sasase, and Shinsaku Mori. "Interlace Coding System Involving Data Compression Code, Data Encryption Code and Error Correcting Code." IEICE, 1992. http://hdl.handle.net/2237/7844.
10. Beirami, Ahmad. "Network compression via network memory: fundamental performance limits." Diss., Georgia Institute of Technology, 2014. http://hdl.handle.net/1853/53448.
Abstract: The amount of information that is churned out daily around the world is staggering, and hence, future technological advancements are contingent upon development of scalable acquisition, inference, and communication mechanisms for this massive data. This Ph.D. dissertation draws upon mathematical tools from information theory and statistics to understand the fundamental performance limits of universal compression of this massive data at the packet level, applied just above layer 3 of the network, when the intermediate network nodes are enabled with the capability of memorizing the previous traffic. Universality of compression imposes an inevitable redundancy (overhead) on the compression performance of universal codes, which is due to the learning of the unknown source statistics. In this work, the previous asymptotic results about the redundancy of universal compression are generalized to consider the performance of universal compression in the finite-length regime (applicable to small network packets). Further, network compression via memory is proposed as a compression-based solution for relatively small network packets whenever the network nodes (i.e., the encoder and the decoder) are equipped with memory and have access to massive amounts of previous communication. In a nutshell, network compression via memory learns the patterns and statistics of the packet payloads and uses them for compression and reduction of the traffic. At the cost of increased computational overhead in the network nodes, network compression via memory significantly reduces the transmission cost in the network. This leads to huge performance improvement, as the cost of transmitting one bit is by far greater than the cost of processing it.
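
The dissertation's "memory" idea generalizes a familiar engineering trick: small packets compress poorly in isolation because the encoder must relearn the source statistics each time, but a dictionary built from previously seen traffic removes much of that redundancy. The toy Python sketch below illustrates the principle using zlib's preset-dictionary support; it is not Beirami's framework, and the sample packets are invented.

import zlib

# Invented sample traffic: repetitive small packets, as in the abstract.
previous_traffic = b"GET /api/v1/users HTTP/1.1\r\nHost: example.com\r\n" * 40
packet = b"GET /api/v1/users/42 HTTP/1.1\r\nHost: example.com\r\n"

def deflate(data, zdict=None):
    # Raw deflate (wbits=-15); zdict seeds the encoder with shared "memory".
    if zdict:
        c = zlib.compressobj(9, zlib.DEFLATED, -15, zdict=zdict)
    else:
        c = zlib.compressobj(9, zlib.DEFLATED, -15)
    return c.compress(data) + c.flush()

memory = previous_traffic[-32768:]        # zlib dictionaries cap at 32 KiB
plain = deflate(packet)
primed = deflate(packet, zdict=memory)

# Encoder and decoder must hold the same memory, the abstract's setting.
d = zlib.decompressobj(-15, zdict=memory)
assert d.decompress(primed) == packet
print(f"packet {len(packet)} B -> {len(plain)} B alone, {len(primed)} B with memory")
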

Books on the topic "Information theory and compression"

1. Hankerson, Darrel R. Introduction to Information Theory and Data Compression. 2nd ed. Boca Raton, FL: Chapman & Hall/CRC Press, 2002.
2. Harris, Greg A., and Peter D. Johnson, eds. Introduction to Information Theory and Data Compression. 2nd ed. Boca Raton, FL: Chapman & Hall/CRC Press, 2003.
3. Hankerson, Darrel R. Introduction to Information Theory and Data Compression. Boca Raton, FL: CRC Press, 1998.
4. Gibson, Jerry. Information Theory and Rate Distortion Theory for Communications and Compression. Cham: Springer International Publishing, 2014. http://dx.doi.org/10.1007/978-3-031-01680-6.
5. Seidler, J. Information Systems and Data Compression. Boston, MA: Kluwer Academic, 1997.
6. Moffat, Alistair, and Andrew Turpin. Compression and Coding Algorithms. Boston, MA: Springer US, 2002.
7. Garrett, Paul. The Mathematics of Coding Theory: Information, Compression, Error Correction, and Finite Fields. Upper Saddle River, NJ: Pearson Prentice Hall, 2004.
8. Krichevsky, Rafail. Universal Compression and Retrieval. Dordrecht: Springer Netherlands, 1994.
9. Manstetten, Dietrich. Schranken für die Redundanz des Huffman-Codes [Bounds on the Redundancy of the Huffman Code]. Heidelberg: A. Hüthig, 1988.

Book chapters on the topic "Information theory and compression"

1. Caire, Giuseppe, Shlomo Shamai, Amin Shokrollahi, and Sergio Verdú. "Fountain codes for lossless data compression." In Algebraic Coding Theory and Information Theory, 1–20. Providence, Rhode Island: American Mathematical Society, 2005. http://dx.doi.org/10.1090/dimacs/068/01.
2. Alajaji, Fady, and Po-Ning Chen. "Lossless Data Compression." In An Introduction to Single-User Information Theory, 55–104. Singapore: Springer Singapore, 2018. http://dx.doi.org/10.1007/978-981-10-8001-2_3.
3. Gazi, Orhan. "Typical Sequences and Data Compression." In Information Theory for Electrical Engineers, 175–233. Singapore: Springer Singapore, 2018. http://dx.doi.org/10.1007/978-981-10-8432-4_3.
4. Ivaniš, Predrag, and Dušan Drajić. "Data Compression (Source Encoding)." In Information Theory and Coding - Solved Problems, 45–90. Cham: Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-49370-1_3.
5. Yeung, Raymond W. "Zero-Error Data Compression." In A First Course in Information Theory, 41–59. Boston, MA: Springer US, 2002. http://dx.doi.org/10.1007/978-1-4419-8608-5_3.
6. Mandal, M. K., S. Panchanathan, and T. Aboulnasr. "Choice of wavelets for image compression." In Information Theory and Applications II, 239–49. Berlin, Heidelberg: Springer Berlin Heidelberg, 1996. http://dx.doi.org/10.1007/bfb0025147.
7. Gibson, Jerry. "Entropy and Mutual Information." In Information Theory and Rate Distortion Theory for Communications and Compression, 9–28. Cham: Springer International Publishing, 2014. http://dx.doi.org/10.1007/978-3-031-01680-6_2.
8. Alajaji, Fady, and Po-Ning Chen. "Lossy Data Compression and Transmission." In An Introduction to Single-User Information Theory, 219–62. Singapore: Springer Singapore, 2018. http://dx.doi.org/10.1007/978-981-10-8001-2_6.
9. Gagie, Travis. "On the Value of Multiple Read/Write Streams for Data Compression." In Information Theory, Combinatorics, and Search Theory, 284–97. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-36899-8_12.
10. Kamimura, Ryotaro. "Information-Theoretic Self-compression of Multi-layered Neural Networks." In Theory and Practice of Natural Computing, 401–13. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-030-04070-3_31.

Conference papers on the topic "Information theory and compression"

1. Braverman, Mark, and Gillat Kol. "Interactive compression to external information." In STOC '18: Symposium on Theory of Computing. New York, NY, USA: ACM, 2018. http://dx.doi.org/10.1145/3188745.3188956.
2. Gunduz, Deniz, Elza Erkip, and H. Vincent Poor. "Secure lossless compression with side information." In 2008 IEEE Information Theory Workshop (ITW). IEEE, 2008. http://dx.doi.org/10.1109/itw.2008.4578644.
3. Akyol, Emrah, Cedric Langbort, and Tamer Basar. "Strategic compression and transmission of information." In 2015 IEEE Information Theory Workshop - Fall (ITW). IEEE, 2015. http://dx.doi.org/10.1109/itwf.2015.7360767.
4. Ordentlich, Or, and Ofer Shayevitz. "Subset-universal lossy compression." In 2015 IEEE Information Theory Workshop (ITW). IEEE, 2015. http://dx.doi.org/10.1109/itw.2015.7133146.
5. Rosas, Fernando E., Pedro A. M. Mediano, and Michael Gastpar. "Learning, compression, and leakage: Minimising classification error via meta-universal compression principles." In 2020 IEEE Information Theory Workshop (ITW). IEEE, 2021. http://dx.doi.org/10.1109/itw46852.2021.9457579.
6. Zhong, Jing, Roy D. Yates, and Emina Soljanin. "Backlog-adaptive compression: Age of information." In 2017 IEEE International Symposium on Information Theory (ISIT). IEEE, 2017. http://dx.doi.org/10.1109/isit.2017.8006591.
7. Torokhti, Anatoli, Shmuel Friedland, and Phil Howlett. "Towards generic theory of data compression." In 2007 IEEE International Symposium on Information Theory. IEEE, 2007. http://dx.doi.org/10.1109/isit.2007.4557241.
8. Huang, Ying-zong, and Gregory W. Wornell. "Separation architectures for lossy compression." In 2015 IEEE Information Theory Workshop (ITW). IEEE, 2015. http://dx.doi.org/10.1109/itw.2015.7133164.
9. Emad, Amin, and Olgica Milenkovic. "Compression of noisy signals with information bottlenecks." In 2013 IEEE Information Theory Workshop (ITW 2013). IEEE, 2013. http://dx.doi.org/10.1109/itw.2013.6691344.
10. Chern, B. G., I. Ochoa, A. Manolakos, A. No, K. Venkat, and T. Weissman. "Reference based genome compression." In 2012 IEEE Information Theory Workshop (ITW 2012). IEEE, 2012. http://dx.doi.org/10.1109/itw.2012.6404708.

Reports on the topic "Information theory and compression"

1. Spivak, David. Categorical Information Theory. Fort Belvoir, VA: Defense Technical Information Center, May 2011. http://dx.doi.org/10.21236/ada543905.
2. Moran, William. Coding Theory, Information Theory and Radar. Fort Belvoir, VA: Defense Technical Information Center, September 2005. http://dx.doi.org/10.21236/ada456510.
3. Calderbank, Arthur R. Coding Theory, Information Theory and Radar. Fort Belvoir, VA: Defense Technical Information Center, January 2005. http://dx.doi.org/10.21236/ada434253.
4. Adami, Christoph. Relativistic Quantum Information Theory. Fort Belvoir, VA: Defense Technical Information Center, November 2007. http://dx.doi.org/10.21236/ada490967.
5. Parlett, Beresford. Some Basic Information on Information-Based Complexity Theory. Fort Belvoir, VA: Defense Technical Information Center, July 1989. http://dx.doi.org/10.21236/ada256585.
6. Gorecki, Frank D. Passive Tracking and Information Theory. Fort Belvoir, VA: Defense Technical Information Center, May 1999. http://dx.doi.org/10.21236/ada385452.
7. Gorelenkova, M. V., N. N. Gorelenkov, E. A. Azizov, A. N. Romannikov, and H. W. Herrmann. Kinetic Theory of Plasma Adiabatic Major Radius Compression in Tokamaks. Office of Scientific and Technical Information (OSTI), October 1997. http://dx.doi.org/10.2172/304217.
8. Romannikov, A. N., E. A. Azizov, H. W. Herrmann, M. V. Gorelenkova, and N. N. Gorelenkov. Kinetic Theory of Plasma Adiabatic Major Radius Compression in Tokamaks. Office of Scientific and Technical Information (OSTI), October 1997. http://dx.doi.org/10.2172/4564.
9. Burnett, Margaret M. Information Foraging Theory in Software Maintenance. Fort Belvoir, VA: Defense Technical Information Center, September 2012. http://dx.doi.org/10.21236/ada579505.
10. Dowski, Edward R., Jr. An Information Theory Approach to Three Incoherent Information Processing Systems. Fort Belvoir, VA: Defense Technical Information Center, January 1995. http://dx.doi.org/10.21236/ada299683.