Academic literature on the topic 'Perceptually lossless quality'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Perceptually lossless quality.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Perceptually lossless quality"

1

Sreelekha, G., and P. S. Sathidevi. "A Wavelet-Based Perceptual Image Coder Incorporating a New Model for Compression of Color Images." International Journal of Wavelets, Multiresolution and Information Processing 7, no. 5 (September 2009): 675–92. http://dx.doi.org/10.1142/s0219691309003197.

Abstract:
A wavelet-based perceptual image coder for the compression of color images is proposed here, in which the coding structure is coupled with Human Visual System models to produce high-quality images. The major contribution is the development of a new model for the compression of the color components based on psychovisual experiments, which quantifies the optimum amount of compression that can be applied to the color components for a given rate. The model is developed for the YCbCr color space and the perceptually uniform CIE Lab color space. A complete coding structure for the compression of color images is developed by incorporating the new perceptual model. The performance of the proposed coder is compared with a wavelet-based coder that uses the quantization stage of the JPEG2000 standard. The perceptual quality of the compressed images is tested using wavelet-based subjective and objective perceptual quality metrics such as Mean Opinion Score, Visual Information Fidelity, and Visual Signal to Noise Ratio. Though the model is developed for perceptually lossless, high-quality image compression, the results obtained reveal that the proposed structure gives very good perceptual quality compared to existing schemes at lower bit rates. These advantages make the proposed coder a candidate for replacing the encoder stage of current image compression standards.
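As a concrete illustration of the kind of coupling this abstract describes, the sketch below applies coarser quantization to the wavelet subbands the visual system is less sensitive to. It is only a hedged sketch of the general idea, not the authors' psychovisual model: the wavelet choice, level count, and power-of-two weights are illustrative assumptions, and the code relies on the NumPy and PyWavelets libraries.

```python
import numpy as np
import pywt

def perceptual_quantize(img, wavelet="bior4.4", levels=3, base_step=8.0):
    """Quantize detail subbands with a crude HVS-style weighting (sketch)."""
    coeffs = pywt.wavedec2(img.astype(float), wavelet, level=levels)
    out = [coeffs[0]]  # approximation subband is kept unquantized
    for lvl, details in enumerate(coeffs[1:], start=1):
        # coeffs[1] is the coarsest detail level; the hypothetical weight gives
        # coarse, visually important scales a finer quantization step.
        step = base_step / 2.0 ** (levels - lvl)
        out.append(tuple(np.round(d / step) * step for d in details))
    return pywt.waverec2(out, wavelet)
```

In the paper's terms, the psychovisually measured model would replace the ad hoc weights above, derived separately for the chroma components of the YCbCr and CIE Lab color spaces.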
2

Singh, Abhilasha, and Malay Kishore Dutta. "A Reversible Data Hiding Scheme for Efficient Management of Tele-Ophthalmological Data." International Journal of E-Health and Medical Communications 8, no. 3 (July 2017): 38–54. http://dx.doi.org/10.4018/ijehmc.2017070103.

Abstract:
Advancements in medical sciences and the induction of advanced technologies have led to an increased role for medical images in tele-diagnosis. This paper proposes a technique for easy, efficient, and accurate management of distributed medical databases that alleviates the risk of any distortion in images during transmission. It also remedies issues such as tampering (accidental or intentional), authentication, and reliability without affecting the perceptual properties of the image. The technique is blind and completely reversible. Values of PSNR and BER imply that the changes made to original images are imperceptible to the Human Visual System. The performance of the technique has been evaluated for fundus images and the results are extremely encouraging. The technique is lossless and conforms to the firm requirements of medical data management by maintaining the perceptual quality and diagnostic significance of the images, and is therefore very practical for use in health care centers.
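The imperceptibility claim here rests on PSNR and BER values. For reference, these two metrics are conventionally computed as in the following sketch (our code, not the authors'):

```python
import numpy as np

def psnr(original, marked, peak=255.0):
    """Peak signal-to-noise ratio between cover and marked image, in dB."""
    mse = np.mean((np.asarray(original, float) - np.asarray(marked, float)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak**2 / mse)

def ber(sent_bits, recovered_bits):
    """Bit error rate of the extracted payload; 0.0 means perfect recovery."""
    return float(np.mean(np.asarray(sent_bits) != np.asarray(recovered_bits)))
```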
3

Zadeh, Pooneh Bagheri, Akbar Sheikh Akbari, and Tom Buggy. "DCT image codec using variance of sub-regions." Open Computer Science 5, no. 1 (August 11, 2015): 13–21. http://dx.doi.org/10.1515/comp-2015-0003.

Abstract:
This paper presents a novel image-coding scheme based on the variance of sub-regions and the discrete cosine transform. The proposed encoder divides the input image into a number of non-overlapping blocks. The coefficients in each block are then transformed into their spatial frequencies using a discrete cosine transform. Coefficients with the same spatial frequency index at different blocks are put together, generating a number of matrices, where each matrix contains coefficients of a particular spatial frequency index. The matrix containing DC coefficients is losslessly coded to preserve its visually important information. Matrices containing high-frequency coefficients are coded using the variance-of-sub-regions encoding algorithm proposed in this paper. Perceptual weights are used to regulate the threshold value required in the coding process of the high-frequency matrices. An extension of the system to progressive image transmission is also developed. The proposed coding scheme, JPEG, and JPEG2000 were applied to a number of test images. Results show that the proposed coding scheme outperforms JPEG and JPEG2000 subjectively and objectively at low compression ratios. Results also indicate that the proposed codec's decoded images exhibit superior subjective quality at high compression ratios compared to those of JPEG, while offering satisfactory results compared to those of JPEG2000.
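The regrouping step the abstract describes can be made concrete in a few lines. The sketch below is our illustrative reconstruction (the function name and the 8x8 block size are assumptions), using NumPy and SciPy's DCT:

```python
import numpy as np
from scipy.fft import dctn

def frequency_matrices(img, block=8):
    """Blockwise DCT, regrouped into one matrix per spatial-frequency index."""
    h, w = img.shape
    h, w = h - h % block, w - w % block              # crop to whole blocks
    tiles = (img[:h, :w].astype(float)
             .reshape(h // block, block, w // block, block)
             .transpose(0, 2, 1, 3))                 # (rows, cols, block, block)
    coeffs = dctn(tiles, axes=(2, 3), norm="ortho")  # 2-D DCT of every tile
    # Entry [u, v] is the matrix of (u, v) coefficients gathered across all
    # blocks; [0, 0] is the DC matrix that the paper codes losslessly.
    return coeffs.transpose(2, 3, 0, 1)
```

The remaining high-frequency matrices would then be fed to the variance-of-sub-regions encoder, with perceptual weights regulating its threshold.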
4

"Effects of Compression Algorithms and Identification of Cancer cell using CT Coronel View Lung Image." International Journal of Engineering and Advanced Technology 8, no. 6 (August 30, 2019): 103–5. http://dx.doi.org/10.35940/ijeat.b5552.088619.

Abstract:
Modern radiology techniques provide crucial medical information for radiologists to diagnose diseases and determine appropriate treatments. Hence, medical image compression needs to balance good perceptual quality (i.e., diagnostically lossless results) against a high compression rate. The objective also includes finding an optimal algorithm for medical image compression, with a particular focus on selecting a compression algorithm that does not change the characterization behavior of the image.
5

Rohith, S., and V. Harish. "A Reversible Data Hiding Scheme in Encrypted Images for Medical Applications." International Journal of Advanced Research in Science, Communication and Technology, February 10, 2021, 166–72. http://dx.doi.org/10.48175/ijarsct-776.

Abstract:
Storage and exchange of patients' image data are common in medical applications. To protect patient information and to avoid its mishandling, a data hiding scheme is essential. Reversible Data Hiding (RDH) is one such scheme that has received much attention for hiding data in encrypted images, since it maintains the excellent property that the original cover can be losslessly recovered after the embedded data is extracted, while protecting the image content's confidentiality. In this paper, space is first reserved in the image, which may be used to embed information at a later stage. A histogram-shifting-based Reversible Data Hiding scheme is used to reserve this room before the encryption process. The proposed method can achieve real reversibility; that is, data extraction and image recovery are free of any error. Experiments show that this novel method achieves better perceptual quality.
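As a rough illustration of the histogram-shifting primitive this abstract relies on, here is a minimal embedding sketch for a grayscale image. It is our simplification, not the paper's full reserve-room-before-encryption pipeline, and it assumes the histogram peak is below 255 and that the payload fits within the number of peak-valued pixels.

```python
import numpy as np

def hs_embed(img, bits):
    """Histogram-shifting embed (sketch): capacity = count of peak pixels."""
    img = np.asarray(img, dtype=np.int32)
    hist = np.bincount(img.ravel(), minlength=256)
    peak = int(hist.argmax())                        # most frequent gray level
    zero = peak + 1 + int(hist[peak + 1:].argmin())  # empty/rarest bin above it
    out = img.copy()
    out[(img > peak) & (img < zero)] += 1            # free the bin at peak + 1
    flat = out.ravel()
    carriers = np.flatnonzero(img.ravel() == peak)[:len(bits)]
    flat[carriers] += np.asarray(bits, np.int32)     # peak / peak+1 encodes 0 / 1
    return flat.reshape(img.shape), peak, zero
```

Extraction reads a 0 or 1 from each pixel valued peak or peak + 1 and then shifts the in-between range back down, restoring the cover exactly, provided the chosen zero bin was truly empty.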
6

Fineman, Daniel. "The Anomaly of Anomaly of Anomaly." M/C Journal 23, no. 5 (October 7, 2020). http://dx.doi.org/10.5204/mcj.1649.

Abstract:
‘Bitzer,’ said Thomas Gradgrind. ‘Your definition of a horse.’

‘Quadruped. Graminivorous. Forty teeth, namely twenty-four grinders, four eye-teeth, and twelve incisive. Sheds coat in the spring; in marshy countries, sheds hoofs, too. Hoofs hard, but requiring to be shod with iron. Age known by marks in mouth.’ Thus (and much more) Bitzer.

‘Now girl number twenty,’ said Mr. Gradgrind. ‘You know what a horse is.’

— Charles Dickens, Hard Times (1854)

Dickens’s famous pedant, Thomas Gradgrind, was not an anomaly. He is the pedagogical manifestation of the rise of quantification in modernism that was the necessary adjunct to massive urbanisation and industrialisation. His classroom caricatures the dominant epistemic modality of modern global democracies, our unwavering trust in numbers, “data”, and reproductive predictability. This brief quotation from Hard Times both presents and parodies the 19th century’s displacement of what were previously more commonly living and heterogeneous existential encounters with events and things. The world had not yet been made predictably repetitive through industrialisation, standardisation, law, and ubiquitous codes of construction. Theirs was much more a world of unique events and not the homogenised and orthodox iteration of standardised knowledge. Horses and, by extension, all entities and events gradually were displaced by their rote definitions: individuals of a so-called natural kind were reduced to identicals. Further, these mechanical standardisations were and still are underwritten by mapping them into a numerical and extensive characterisation. On top of standardised objects and procedures appeared assigned numerical equivalents which lent standardisation the seemingly apodictic certainty of deductive demonstrations. The algebraic becomes the socially enforced criterion for the previously more sensory, qualitative, and experiential encounters with becoming that were more likely in pre-industrial life. Here too, we see that the function of this reproductive protocol is not just notational but is the sine qua non for, in Althusser’s famous phrase, the manufacture of citizens as “subject subjects”, those concrete individuals who are educated to understand themselves ideologically in an imaginary relation with their real position in any society’s self-reproduction. Here, however, ideology performs that operation through that nominally least political of cognitive modes, the supposed friend of classical Marxism’s social science, the mathematical. The historical onset of this social and political reproductive hegemony, this uniform supplanting of time’s ineluctable differencing with the parasite of its associated model, can partially be found in the formation of metrics. Before the 19th century, the measures of space and time were local. Units of length and weight varied not just between nations but often by municipality. These parochial standards reflected indigenous traditions, actualities, personalities, and needs. This variation in measurement standards suggested that every exchange or judgment of kind and value relied upon the specificity of that instance. Every evaluation of an instance required perceptual acuity and not the banality of enumeration constituted by commodification and the accounting practices intrinsic to centralised governance. This variability in measure was complicated by similar variability in the currencies of the day.
Thus, barter presented the participants with complexities and engagements of skills and discrete observation completely alien to the modern purchase of duplicate consumer objects with stable currencies. Almost nothing of life was iterative: every exchange was, more or less, an anomaly. However, in 1790, immediately following the French Revolution and as a central manifestation of its movement to rational democratisation, Charles Maurice de Talleyrand proposed a metrical system to the French National Assembly. The units of this metric system, based originally on observable features of nature, are now formally codified in all scientific practice by seven physical constants. Further, they are ubiquitous now in almost all public exchanges between individuals, corporations, and states. These units form a coherent and extensible structure whose elements and rules are subject to seemingly lossless symbolic exchange in a mathematical coherence aided by their conformity to decimal representation. From 1960, their basic contemporary form was established as the International System of Units (SI). Since then, all but three of the countries of the world (Myanmar, Liberia, and the United States), regardless of political organisation and individual history, have adopted these standards for commerce and general measurement. The uniformity and rational advantage of this system are easily demonstrable in just the absurd variation in the numeric bases of the Imperial / British system, which uses base 16 for ounces/pounds, base 12 for inches/feet, base three for feet/yards, base 180 for degrees between freezing and boiling, 43,560 square feet per acre, eights for division of inches, etc. Even with its abiding antagonism to the French, Britain officially adopted the metric system as was required by its admission to the EEC in 1973. The United States is the last great holdout in the public use of the metric system even though SI has long been the standard wanted by the federal government. At first, the move toward U.S. adoption was promising. Following France and rejecting England’s practice, America was founded on a decimal currency system in 1792. In 1793, Jefferson requested a copy of the standard kilogram from France in a first attempt to move to the metric system; however, the ship carrying the copy was captured by pirates. Indeed, the Metric Conversion Act of 1975 expressed a more serious national intention to adopt SI, but after some abortive efforts, the nation fell back into the more archaic measurements dominant since before its revolution. However, the central point remains that while the U.S. is unique in its public measurement standard among dominant powers, it is equally committed to the hegemonic application of a numerical rendition of events.

The massive importance of this underlying uniformity is that it supplies the central global mechanism whereby the world’s chaotic variation is continuously parsed and supplanted into comparable, intelligible, and predictable units that understand individuating difference as anomaly. Difference, then, is understood in this method not as qualitative and intensive, which it necessarily is, but quantitative and extensive. Like Gradgrind’s “horse”, the living and unique thing is rendered through the Apollonian dream of standardisation and enumeration. While differencing is the only inherent quality of time’s chaotic flow, accounting and management require iteration.
To order the reproduction of modern society, the unique individuating differences that render an object as “this one”, what the Medieval logicians called haecceities, are only seen as “accidental” and “non-essential” deviations. This is not just odd but illogical since these very differences allow events to be individuated items so as to appear countable at all. As Leibniz’s principle, the identity of indiscernibles, suggests, the application of the metrical same to different occasions is inherently paradoxical: if each unit were truly the same, there could only be one. As the etymology of “anomaly” suggests, it is that which is unexpected, irregular, out of line, or, going back to the Greek, nomos, at variance with the law. However, as the only “law” that always is at hand is the so-called “Second Law of Thermodynamics”, the inconsistently consistent roiling of entropy, the evident theoretical question might be, “how is anomaly possible when regularity itself is impossible?” The answer lies not in events “themselves” but exactly in the deductive valorisations projected by that most durable invention of the French Revolution adumbrated above, the metric system. This seemingly innocuous system has formed the reproductive and iterative bias of modern post-industrial perceptual homogenisation. Metrical modeling allows – indeed, requires – that one mistake the metrical changeling for the experiential event it replaces. Gilles Deleuze, that most powerful French metaphysician (1925-1995), offers some theories to understand the seminal production (not reproduction) of disparity that is intrinsic to time and to distinguish it from its homogenised representation. For him, and his sometime co-author, Felix Guattari, time’s “chaosmosis” is the host constantly parasitised by its symbolic model. This problem, however, of standardisation in the face of time’s originality, is obscured by its very ubiquity; we must first denaturalise the seemingly self-evident metrical concept of countable and uniform units.

A central disagreement in ancient Greece was between the proponents of physis (often translated as “nature” but etymologically indicative of growth and becoming, process and not fixed form) and nomos (law or custom). This is one of the first ethical and so political debates in Western philosophy. For Heraclitus and other pre-Socratics, the emphatic character of nature was change, its differencing dynamism, its processual but not iterative character. In anticipation of Hume, Sophists disparaged nomos (νόμος) as simply the habituated application of synthetic law and custom to the fluidity of natural phenomena. The historical winners of this debate, Plato and the scientific attitudes of regularity and taxonomy characteristic of his best pupil, Aristotle, have dominated ever since, but not without opponents.

In the modern era, anti-enlightenment figures such as Hamann, Herder, and the Schlegel brothers gave theoretical voice to romanticism’s repudiation of the paradoxical impulses of the democratic state for regulation and uniformity that Talleyrand’s “revolutionary” metrical proposal personified. They saw the correlationalism (as adumbrated by Meillassoux) between thought and thing based upon their hypothetical equitability as a betrayal of the dynamic physis that experience presented. Variable infinity might come either from the character of God or nature or, as famously in Spinoza’s Ethics, both (“deus sive natura”). In any case, the plenum of nature was never iterative.
This rejection of metrical regularity finds its synoptic expression in Nietzsche. As a classicist, Nietzsche supplies the bridge between the pre-Socratics and the “post-structuralists”. His early mobilisation of the Apollonian, the dream of regularity embodied in the sun god, and the Dionysian, the drunken but inarticulate inexpression of the universe’s changing manifold, gives voice to a new resistance to the already dominant metrical system. His is a new spin of the mythic representatives of nomos and physis. For him, this pair, however, are not – as they are often mischaracterised – in dialectical dialogue. To place them into the thesis / antithesis formulation would be to give them the very binary character that they cannot share and to, tacitly, place both under Apollo’s procedure of analysis. Their modalities are not antithetical but mutually exclusive. To represent the chaotic and non-iterative processes of becoming, of physis, under the rubric of a common metrics, nomos, is to mistake the parasite for the host. In its structural hubris, the ideological placebo of metrical knowing thinks it non-reductively captures the multiplicity it only interpellates. In short, the polyvalent, fluid, and inductive phenomena that empiricists try to render are, in their intrinsic character, unavailable to deductive method except, first, under the reductive equivalence (the Gradgrind pedagogy) of metrical modeling. This incompatibility of physis and nomos was made manifest by David Hume in A Treatise of Human Nature (1739-40) just before the cooptation of the 18th century’s democratic revolutions by “representative” governments. There, Hume displays the Apollonian dream’s inability to accurately and non-reductively capture a phenomenon in the wild, free from the stringent requirements of synthetic reproduction. His argument in Book I is succinct.

Now as we call every thing custom, which proceeds from a past repetition, without any new reasoning or conclusion, we may establish it as a certain truth, that all the belief, which follows upon any present impression, is deriv'd solely from that origin. (Part 3, Section 8)

There is nothing in any object, consider'd in itself, which can afford us a reason for drawing a conclusion beyond it; ... even after the observation of the frequent or constant conjunction of objects, we have no reason to draw any inference concerning any object beyond those of which we have had experience. (Part 3, Section 12)

The rest of mankind ... are nothing but a bundle or collection of different perceptions, which succeed each other with an inconceivable rapidity, and are in a perpetual flux and movement. (Part 4, Section 6)

In sum, then, nomos is nothing but habit, a Pavlovian response codified into a symbolic representation and, pragmatically, into a reproductive protocol specifically ordered to exclude anomaly, the inherent chaotic variation that is the hallmark of physis. The Apollonian dream that there can be an adequate metric of unrestricted natural phenomena in their full, open, turbulent, and manifold becoming is just that, a dream. Order, not chaos, is the anomaly. Of course, Kant felt he had overcome this unacceptable challenge to the rational application of induction after Hume woke him from his “dogmatic slumber”.
But what is perhaps one of the most important assertions of the critiques may be only an evasion of Hume’s radical empiricism: “there are only two ways we can account for the necessary agreement of experience with the concepts of its objects: either experience makes these concepts possible or these concepts make experience possible. The former supposition does not hold of the categories (nor of pure sensible intuition) ... . There remains ... only the second—a system ... of the epigenesis of pure reason” (B167). Unless “necessary agreement” means the dictatorial and unrelenting insistence, in a symbolic model of perception, on the equivalence of concept and appearance, this assertion appears circular. This “reading” of Kant’s evasion of the very Humean crux, the necessary inequivalence of a metric or concept to the metered or defined, is manifest in Nietzsche.

In his early “On Truth and Lies in a Nonmoral Sense” (1873), Nietzsche suggests that there is no possible equivalence between a concept and its objects, or, to use Frege’s vocabulary, between sense and reference.

We speak of a "snake" [see “horse” in Dickens]: this designation touches only upon its ability to twist itself and could therefore also fit a worm. What arbitrary differentiations! What one-sided preferences, first for this, then for that property of a thing! The various languages placed side by side show that with words it is never a question of truth, never a question of adequate expression; otherwise, there would not be so many languages. The "thing in itself" (which is precisely what the pure truth, apart from any of its consequences, would be) is likewise something quite incomprehensible to the creator of language and something not in the least worth striving for. This creator only designates the relations of things to men, and for expressing these relations he lays hold of the boldest metaphors.

The literal is always already a reductive—as opposed to literature’s sometimes expansive agency—metaphorisation of events as “one of those” (a token of “its” type). The “necessary” equivalence in nomos is not uncovered but demanded. The same is reproduced by the habitual projection of certain “essential qualities” at the expense of all those others residing in every experiential multiplicity. Only in this prison of nomos can anomaly appear: otherwise all experience would appear as it is, anomalous. With this paradoxical metaphor of the straight and equal, Nietzsche inverts the paradigm of scientific expression. He reveals as a repressive social and political obligation the symbolic assertion of homology where actually none can be. Supposed equality and measurement all transpire within an Apollonian “dream within a dream”. The concept captures not the manifold of chaotic experience but supplies its placebo instead by an analytic tautology worthy of Gradgrind. The equivalence of event and definition is always nothing but a symbolic iteration. Such nominal equivalence is nothing more than shifting events into a symbolic frame where they can be commodified, owned, and controlled in pursuit of that tertiary equivalence which has become the primary repressive modality of modern societies: money. This article has attempted, with absurd rapidity, to hint why some ubiquitous concepts, which are generally considered self-evident and philosophically unassailable, are open not only to metaphysical, political, and ethical challenge, but are existentially unjustified.
All this was done to defend the smaller thesis that the concept of anomaly is itself a reflection of a global misrepresentation of the chaos of becoming. This global substitution expresses a conservative model and measure of the world in the place of the world’s intrinsic heterogenesis, a misrepresentation convenient for those who control the representational powers of governance. In conclusion, let us look, again too briefly, at a philosopher who neither accepts this normative world picture of regularity nor surrenders to Nietzschean irony, Gilles Deleuze.

Throughout his career, Deleuze uses the word “pure” with senses antithetical to so-called common sense and, even more, Kant. In its traditional concept, pure means an entity or substance whose essence is not mixed or adulterated with any other substance or material, uncontaminated by physical pollution, clean and immaculate. The pure is that which is itself itself. To ensure intelligibility, that which is elemental, alphabetic, must be what it is itself and no other. This discrete character forms the necessary, if often tacit, precondition to any analysis and decomposition of beings into their delimited “parts” that are subject to measurement and measured disaggregation. Any entity available for structural decomposition, then, must be pictured as constituted exhaustively by extensive ones, measurable units, its metrically available components. Dualism, having established as its primary axiomatic hypothesis the separability of extension and thought, must now overcome that very separation with an adequacy, a one to one correspondence, between a supposedly neatly measurable world and ideological hegemony that presents itself as rational governance. Thus, what is needed is not only a purity of substance but a matching purity of reason, and it is this clarification of thought, then, which, as indicated above, is the central concern of Kant’s influential and grand opus, The Critique of Pure Reason.

Deleuze heard a repressed alternative to the purity of the measured self-same and equivalent that, as he said about Plato, “rumbled” under the metaphysics of analysis. This was the dark tradition he teased out of the Stoics, Ockham, Gregory of Rimini, Nicholas d’Autrecourt, Spinoza, Meinong, Bergson, Nietzsche, and McLuhan. This is not the purity of identity, A = A, of metrical uniformity and its shadow, anomaly. Rather than repressing, Deleuze revels in the perverse purity of differencing, difference constituted by becoming without the Apollonian imposition of normalcy or definitional identity. One cannot say “difference in itself” because its ontology, its genesis, is not that of anything itself but exactly the impossibility of such a manner of constitution: universal anomaly. No thing or idea can be iterative, separate, or discrete.

In his Difference and Repetition, the idea of the purely same is undone: the Ding an sich is a paradox. While the dogmatic image of thought portrays the possibility of the purely self-same, Deleuze never does. His notions of individuation without individuals, of modulation without models, of simulacra without originals, always find a reflection in his attitudes toward, not language as logical structure, but what necessarily forms the differential making of events, the heterogenesis of ontological symptoms. His theory has none of the categories of Peirce’s triadic construction: not the arbitrariness of symbols, the “self-representation” of icons, or even the causal relation of indices.
His “signs” are symptoms: the non-representational consequences of the forces that are concurrently producing them. Events, then, are the symptoms of the heterogenetic forces that produce, not reproduce them. To measure them is to export them into a representational modality that is ontologically inapplicable as they are not themselves themselves but the consequences of the ongoing differences of their genesis. Thus, the temperature associated with a fever is neither the body nor the disease.

Every event, then, is a diaphora, the pure consequent of the multiplicity of the forces it cannot resemble, an original dynamic anomaly without standard. This term, diaphora, appears at the conclusion of that dialogue some consider Plato’s best, the Theaetetus. There we find perhaps the most important discussion of knowledge in Western metaphysics, which in its final moments attempts to understand how knowledge can be “True Judgement with an Account” (201d-210a). Following this idea leads to a theory, usually known as the “Dream of Socrates”, which posits two kinds of existents, complexes and simples, and proposes that “an account” means “an account of the complexes that analyses them into their simple components … the primary elements (prôta stoikheia)” of which we and everything else are composed (201e2). This—it will be noticed—suggests the ancient heritage of Kant’s own attempted purification of mereological (part/whole relations) nested elementals. He attempts the coordination of pure speculative reason to pure practical reason and, thus, attempts to supply the root of measurement and scientific regularity. However, as adumbrated by the Platonic dialogue, the attempted decompositions, speculative and pragmatic, lead to an impasse, an aporia, as the rational is based upon a correspondence and not the self-synthesis of the diaphorae by their own dynamic disequilibrium. Thus the dialogue ends inconclusively; Socrates rejects the solution, which is the problem itself, and leaves to meet his accusers and quaff his hemlock. The proposal in this article is that the diaphorae are all that exists in Deleuze’s world and indeed any world, including ours. Nor is this production decomposable into pure measured and defined elementals, as such decomposition is indeed exactly the opposite of what differential production is doing. For Deleuze, what exists is disparate conjunction. But in intensive conjunction the same cannot be the same except in so far as it differs. The diaphorae of events are irremediably asymmetric to their inputs: the actual does not resemble the virtual matrix that is its cause. Indeed, any recourse to those supposedly disaggregate inputs, the supposedly intelligible constituents of the measured image, will always but repeat the problematic of metrical representation at another remove. This is not, however, the traditional postmodern trap of infinite meta-shifting, as the diaphoric always is in each instance the very presentation that is sought. Heterogenesis can never be undone, but it can be affirmed. In a heterogenetic monism, what was the insoluble problem of correspondence in dualism is now its paradoxical solution: the problematic per se. What manifests in becoming is not, nor can be, an object or thought as separate or even separable, measured in units of the self-same. Dogmatic thought habitually translates intensity, the differential medium of chaosmosis, into the nominally same or similar so as to suit the Apollonian illusions of “correlational adequacy”.
However, as the measured cannot be other than a calculation’s placebo, the correlation is but the shadow of a shadow. Every diaphora is an event born of an active conjunction of differential forces that give rise to this, their product, an interference pattern. Whatever we know and are is not the correlation of pure entities and thoughts subject to measured analysis but the confused and chaotic confluence of the specific, material, aleatory, differential, and unrepresentable forces under which we subsist not as ourselves but as the always changing product of our milieu. In short, only anomaly without a nominal becomes, and we should view any assertion that maps experience into the “objective” modality of the same, self-evident, and normal as a political prestidigitation motivated, not by “truth”, but by established political interest.

References

Della Volpe, Galvano. Logic as a Positive Science. London: NLB, 1980.

Deleuze, Gilles. Difference and Repetition. Trans. Paul Patton. New York: Columbia UP, 1994.

———. The Logic of Sense. Trans. Mark Lester. New York: Columbia UP, 1990.

Guenon, René. The Reign of Quantity. New York: Penguin, 1972.

Hawley, K. "Identity and Indiscernibility." Mind 118 (2009): 101-9.

Hume, David. A Treatise of Human Nature. Oxford: Clarendon, 2014.

Kant, Immanuel. Critique of Pure Reason. Trans. Norman Kemp Smith. London: Palgrave Macmillan, 1929.

Meillassoux, Quentin. After Finitude: An Essay on the Necessity of Contingency. Trans. Ray Brassier. New York: Continuum, 2008.

Naddaf, Gerard. The Greek Concept of Nature. Albany: SUNY, 2005.

Nietzsche, Friedrich. The Birth of Tragedy. Trans. Douglas Smith. Oxford: Oxford UP, 2008.

———. “On Truth and Lies in a Nonmoral Sense.” Trans. Walter Kaufmann. The Portable Nietzsche. New York: Viking, 1976.

Welch, Kathleen Ethel. "Keywords from Classical Rhetoric: The Example of Physis." Rhetoric Society Quarterly 17.2 (1987): 193–204.

Dissertations / Theses on the topic "Perceptually lossless quality"

1

Wu, David. "Perceptually Lossless Coding of Medical Images - From Abstraction to Reality." RMIT University, Electrical & Computer Engineering, 2007. http://adt.lib.rmit.edu.au/adt/public/adt-VIT20080617.160025.

Abstract:
This work explores a novel vision-model-based coding approach to encode medical images at a perceptually lossless quality, within the framework of the JPEG 2000 coding engine. Perceptually lossless encoding offers the best of both worlds, delivering images free of visual distortions while providing significantly greater compression ratio gains over its information-lossless counterparts. This is achieved through a visual pruning function, embedded with an advanced model of the human visual system, to accurately identify and efficiently remove visually irrelevant/insignificant information. In addition, it maintains bit-stream compliance with the JPEG 2000 coding framework and is subsequently compliant with the Digital Imaging and Communications in Medicine (DICOM) standard. Equally, the pruning function is applicable to other Discrete Wavelet Transform based image coders, e.g., Set Partitioning in Hierarchical Trees. Further significant coding gains are exploited through an artificial edge segmentation algorithm and a novel arithmetic pruning algorithm. The coding effectiveness and qualitative consistency of the algorithm are evaluated through a double-blind subjective assessment with 31 medical experts, performed using a novel two-staged forced-choice assessment devised for medical experts, offering the benefits of greater robustness and accuracy in measuring subjective responses. The assessment showed that no differences of statistical significance were perceivable between the original images and the images encoded by the proposed coder.
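The vision model itself is the thesis's contribution, but the pruning step it drives can be sketched: coefficients whose magnitudes fall below a per-subband visibility threshold are zeroed before entropy coding. In the hedged sketch below, the wavelet choice and threshold values are placeholders, not the thesis's model; NumPy and PyWavelets are assumed.

```python
import numpy as np
import pywt

def visually_prune(img, thresholds, wavelet="bior4.4"):
    """Zero wavelet coefficients below per-subband visibility thresholds."""
    coeffs = pywt.wavedec2(np.asarray(img, float), wavelet,
                           level=len(thresholds))
    pruned = [coeffs[0]]  # approximation subband is left untouched
    # thresholds[0] applies to the coarsest detail level, the last to the finest.
    for details, t in zip(coeffs[1:], thresholds):
        pruned.append(tuple(np.where(np.abs(c) < t, 0.0, c) for c in details))
    return pruned  # would then be handed to a JPEG 2000-style entropy coder
```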
2

Oh, Han. "Perceptual Image Compression using JPEG2000." Diss., The University of Arizona, 2011. http://hdl.handle.net/10150/202996.

Abstract:
Image sizes have increased exponentially in recent years. The resulting high-resolution images are typically encoded in a lossy fashion to achieve high compression ratios. Lossy compression can be categorized into visually lossless and visually lossy compression depending on the visibility of compression artifacts. This dissertation proposes visually lossless coding methods as well as a visually lossy coding method with perceptual quality control. All resulting codestreams are JPEG2000 Part-I compliant.

Visually lossless coding is increasingly considered an alternative to numerically lossless coding. In order to hide compression artifacts caused by quantization, visibility thresholds (VTs) are measured and used for quantization of subbands in JPEG2000. In this work, VTs are experimentally determined from statistically modeled quantization distortion, which is based on the distribution of wavelet coefficients and the dead-zone quantizer of JPEG2000. The resulting VTs are adjusted for locally changing background through a visual masking model, and then used to determine the minimum number of coding passes to be included in a codestream for visually lossless quality under desired viewing conditions. The proposed coding scheme successfully yields visually lossless images at competitive bitrates compared to those of numerically lossless coding and visually lossless algorithms in the literature.

This dissertation also investigates changes in VTs as a function of display resolution and proposes a method which effectively incorporates multiple VTs for various display resolutions into the JPEG2000 framework. The proposed coding method allows for visually lossless decoding at resolutions natively supported by the wavelet transform as well as arbitrary intermediate resolutions, using only a fraction of the full-resolution codestream. When images are browsed remotely, this method can significantly reduce bandwidth usage.

Contrary to images encoded in the visually lossless manner, highly compressed images inevitably have visible compression artifacts. To minimize these artifacts, many compression algorithms exploit the varying sensitivity of the human visual system (HVS) to different frequencies, which is typically obtained at the near-threshold level where distortion is just noticeable. However, it is unclear whether the same frequency sensitivity applies at the supra-threshold level where distortion is highly visible. In this dissertation, the sensitivity of the HVS for several supra-threshold distortion levels is measured based on the JPEG2000 quantization distortion model. Then, a low-complexity JPEG2000 encoder using the measured sensitivity is described. The proposed visually lossy encoder significantly reduces encoding time while maintaining superior visual quality compared with conventional JPEG2000 encoders.
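For context, the dead-zone scalar quantizer mentioned above has the standard JPEG2000 form sketched here. Deriving the step size from a visibility threshold as step = 2 * vt is our simplification, not the dissertation's exact procedure.

```python
import numpy as np

def deadzone_quantize(coeffs, vt):
    """JPEG2000-style dead-zone scalar quantizer (sketch)."""
    step = 2.0 * vt  # assumed mapping from visibility threshold to step size
    return (np.sign(coeffs) * np.floor(np.abs(coeffs) / step)).astype(np.int64)

def dequantize(indices, vt, gamma=0.5):
    """Reconstruct coefficients; gamma biases values within each bin."""
    step = 2.0 * vt
    return np.sign(indices) * (np.abs(indices) + gamma) * step * (indices != 0)
```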

Book chapters on the topic "Perceptually lossless quality"

1

Singh, Abhilasha, and Malay Kishore Dutta. "A Reversible Data Hiding Scheme for Efficient Management of Tele-Ophthalmological Data." In Ophthalmology, 172–88. IGI Global, 2018. http://dx.doi.org/10.4018/978-1-5225-5195-9.ch011.

Abstract:
Advancements in medical sciences and the induction of advanced technologies have led to an increased role for medical images in tele-diagnosis. This paper proposes a technique for easy, efficient, and accurate management of distributed medical databases that alleviates the risk of any distortion in images during transmission. It also remedies issues such as tampering (accidental or intentional), authentication, and reliability without affecting the perceptual properties of the image. The technique is blind and completely reversible. Values of PSNR and BER imply that the changes made to original images are imperceptible to the Human Visual System. The performance of the technique has been evaluated for fundus images and the results are extremely encouraging. The technique is lossless and conforms to the firm requirements of medical data management by maintaining the perceptual quality and diagnostic significance of the images, and is therefore very practical for use in health care centers.
