Dissertations on the topic "Erasure of the subject"

Consult the top 50 dissertations for research on the topic "Erasure of the subject".

Next to each work in the bibliography, an "Add to bibliography" option is available. Use it, and a bibliographic entry for the chosen work will be generated automatically in the required citation style (APA, MLA, Harvard, Chicago, Vancouver, etc.).

You can also download the full text of the scholarly publication in PDF format and read its online abstract, provided the relevant parameters are included in the metadata.

Browse dissertations from a wide range of disciplines and compile your bibliography correctly.

1

Gaudin, Denys. „Ecriture et voix : clinique du recours à l'écrit chez des sujets psychotiques“. Thesis, Aix-Marseille, 2018. http://www.theses.fr/2018AIXM0021.

Full text of the source
Annotation:
We examine the issue of a link between voice and writing in psychosis. Relying on the testimony of psychotic subjects who say they write down what they hear through voices, we study the nature of this shift from voice to letter. As a first step, we specify what we mean by voice. We detail what psychotic subjects teach us about the mechanisms involved in the moment of the hallucinated voice. By doing so, we point out the part of jouissance involved in the voice. As a second step, we focus on their writing practices, the moment when they take note, when they put down on paper what they hear through voices. We endeavour to elucidate the issues at stake in this movement. We are led to examine the regulating function of writing, and we refer to the Lacanian concept of the letter as « littoral » or « godet ». Moreover, patients' words lead us to specify how writing could be a way to make the voice "go away", a way to separate. The works and testimonies of writers lead us to go on exploring a field where the artist always precedes the clinician. We refer to the works of James Joyce, Louis Wolfson and, especially, Samuel Beckett, and we try to grasp the indications they give about a link between voice and writing. The purpose is to test the hypothesis that, in psychosis, the use of writing can be a way to treat the voice.
APA, Harvard, Vancouver, ISO, and other citation styles
2

Jhupsee, Sneha. „Erasure layering“. Master's thesis, University of Cape Town, 2018. http://hdl.handle.net/11427/28065.

Annotation:
This dissertation developed from an interest in sustainability and the current housing crisis within the inner city of Cape Town. The evolution of the city has played a role in developing a layered but fragmented space that lacks a favourable density. New housing developments within the city are developer-led, market-driven schemes that more often than not fail to consider the rich urban and social contexts provided by the city. These schemes remove vast portions of rich urban fabric in order to profit from maximising bulk. While these developments do indeed add density, they lack diversity and equity. This dissertation challenges the contradiction between the positive addition of density and the negative impact of inequitable and unsustainable architecture. From a sustainability point of view, the idea of the continued reuse and transformation of vacant existing buildings is explored. Many existing buildings within the inner city are not fit for their intended purpose and are seen as impediments that generate unsafe spaces. These buildings have become targets for inequitable developer-led schemes because they are located on prime land. This dissertation explores layering the existing by providing different layers of public and private function. The sustainability of retaining an existing building is interrogated through the lens of the value of its structure. Essentially, there is an immense amount of building stock within the inner city that is underutilised and underdeveloped and that may provide an opportunity to layer the urban fabric. This dissertation endeavours to explore a new typology that embraces density for an inclusive city through sustainable practices. The ideas of reuse, densifying the city and expanding its capacity in a sensitive manner, and adding to the character and rich existing urban fabric of the city are pertinent to the dissertation design. Realistic considerations such as bulk and parking, as well as idealistic questions such as how to create an equitable building in a market-driven era, and everything in between, will be explored.
3

Denan, Françoise. „Souffrance au travail et discours capitaliste : une lecture lacanienne subversive“. Electronic Thesis or Diss., Paris 8, 2022. http://www.theses.fr/2022PA080079.

Annotation:
Debates in the media set workers' psychology against working conditions. A review of studies by sociologists of work shows that it is possible to articulate these opposing elements differently, based on various causalities: anomie, a deregulation that blurs limits (Durkheim); a specific, meaningless company language (Gaulejac); a repression of thought (Dejours). Freud and Lacan are both readers of Marx. Freud makes him an adversary who disregards the death drive, whilst Lacan pays him strong homage, retracing in his footsteps the Hegelian dialectic of master and slave thanks to the paradigm of discourses. Two major concepts can be deduced from this: the master-signifier, which demonstrates the injunctive power of language, and surplus-jouissance, the counterpart of the restrictions imposed on jouissance, which Lacan makes the very principle of capitalism. However, science and the financialisation of the contemporary economy pulverise discourse, eradicating the subject in favour of accounting and removing any barrier to jouissance, which has become an inalienable right. The reign of numbers only allows us to think in terms of a continuity that unfolds ad infinitum (1 + 1 + 1 + ...) and no longer in terms of opposing elements (allowed / forbidden) that might put limits on excess. Leaving the capitalist discourse supposes a new way of handling language. By making the jouissance it conceals resonate, the psychoanalyst gives everyone a chance to reappropriate their own language and thereby to say no to the always-more. The desire which arises creates a new social bond and makes it possible to think of a political system that is an alternative to globalised governance by numbers.
4

Aksak, Cagan. „Landauer Erasure For Quantum Systems“. Master's thesis, METU, 2009. http://etd.lib.metu.edu.tr/upload/2/12611047/index.pdf.

Annotation:
Maxwell's thought experiment on a demon performing microscopic actions and violating the second law of thermodynamics was a challenging paradox for a long time. It was finally resolved in the seventies and eighties by means of Landauer's principle, which states that erasing information is necessarily accompanied by heat dumped into the environment. The purpose of this study is to describe the heat dumped into the environment associated with erasure operations on quantum systems. To achieve this, a brief introduction to the necessary tools, such as the density matrix formalism, quantum operators, and entropy, is first given. Second, Maxwell's demon and the Szilard model are described, and the connection between information theory and physics is discussed via this model. Finally, heat transfer operators associated with quantum erasure operations are defined and their properties are obtained.
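The principle invoked in this annotation has a standard quantitative form: erasing one bit of information into an environment at temperature T dissipates at least k_B T ln 2 of heat (k_B is Boltzmann's constant):

```latex
% Landauer's bound: minimum heat dissipated when one bit is erased
Q \ge k_B T \ln 2
% Equivalent statement: the environment's entropy grows by at least one bit's worth
\Delta S_{\mathrm{env}} \ge k_B \ln 2
```

It is this lower bound on dissipated heat that balances the entropy the demon appears to destroy, resolving the paradox.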
5

Demay, Gregory. „Source Coding for Erasure Channels“. Thesis, KTH, Kommunikationsteori, 2011. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-55297.

Annotation:
The main goal of this thesis is to bound the rate-distortion performance of the aforementioned sparse-graph codes for lossy compression of a BES. As our main contributions, we first derive lower bounds on the rate-distortion performance of LDGM codes for the BES, which are valid for any LDGM code of a given rate and generator node degree distribution and any encoding function. Our approach follows that of Kudekar and Urbanke, where lower bounds were derived for the BSS case. They introduced two methods for deriving lower bounds, namely the counting method and the test channel method. Based on numerical results they observed that the two methods lead to the same bound. We generalize these two methods for the BES and prove that indeed both methods lead to identical rate-distortion bounds for the BES and hence, also for the BSS. Secondly, based on the technique introduced by Martinian and Wainwright, we upper bound the rate-distortion performance of the check regular Poisson LDGM (CRP LDGM) ensemble and the compound LDGM-LDPC ensemble for the BES. We also show that there exist compound LDGM-LDPC codes, with degrees independent of the blocklength, which can achieve any given point on the Shannon rate-distortion curve of the BES.
6

Alex, Stacey Margaret. „Resisting Erasure: Undocumented Latinx Narratives“. The Ohio State University, 2019. http://rave.ohiolink.edu/etdc/view?acc_num=osu1563164119840926.

7

Kang, Hyunsook. „Is Plane Conflation Bracket Erasure?“ Department of Linguistics, University of Arizona (Tucson, AZ), 1988. http://hdl.handle.net/10150/227252.

8

Cai, Jing. „A study of erasure correcting codes“. Thesis, University of Plymouth, 2010. http://hdl.handle.net/10026.1/1650.

Annotation:
This work focuses on erasure codes, particularly high-performance ones, and on the related decoding algorithms, especially those with low computational complexity. The work is composed of different pieces, but the main components are developed within the following two themes. Ideas of message passing are applied to solve erasures after transmission. An efficient matrix representation of the belief propagation (BP) decoding algorithm on the BEC is introduced as the recovery algorithm. Gallager's bit-flipping algorithm is further developed into the guess and multi-guess algorithms, especially for recovering the erasures left unsolved by the recovery algorithm. A novel maximum-likelihood decoding algorithm, the In-place algorithm, is proposed with reduced computational complexity. A further study of the marginal number of erasures correctable by the In-place algorithm determines a lower bound on the average number of correctable erasures. Following the spirit of searching for the most likely codeword based on the received vector, we propose a new branch-evaluation-search-on-the-code-tree (BESOT) algorithm, which is powerful enough to approach ML performance for all linear block codes. To maximise the recovery capability of the In-place algorithm in network transmissions, we propose the product packetisation structure to reconcile the computational complexity of the In-place algorithm. Combined with the proposed product packetisation structure, the computational complexity is less than the quadratic complexity bound. We then extend this to the Rayleigh fading channel to solve both errors and erasures. By concatenating an outer code, such as a BCH code, the product-packetised RS codes with the hard-decision In-place algorithm perform significantly better than soft-decision iterative algorithms on optimally designed LDPC codes.
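The message-passing idea for erasures described above can be illustrated with a minimal peeling decoder, which repeatedly finds a parity check involving exactly one erased position and solves for it. This is a generic sketch under simplifying assumptions, not the thesis's matrix-based BP algorithm; the (7,4) Hamming-style check matrix and codeword below are hypothetical examples:

```python
def peel_decode(H, received):
    """Iteratively resolve erasures (None) using parity checks.

    H: list of checks, each a list of codeword positions whose XOR must be 0.
    received: list of bits (0/1) with erased positions set to None.
    Returns the completed codeword, or None if decoding stalls.
    """
    word = list(received)
    progress = True
    while progress and any(b is None for b in word):
        progress = False
        for check in H:
            erased = [i for i in check if word[i] is None]
            if len(erased) == 1:  # exactly one unknown: the check determines it
                i = erased[0]
                word[i] = sum(word[j] for j in check if j != i) % 2
                progress = True
    return None if any(b is None for b in word) else word

# Hypothetical (7,4) Hamming-style code: three parity checks over 7 positions
H = [[0, 1, 2, 4], [1, 2, 3, 5], [0, 1, 3, 6]]
codeword = [1, 0, 1, 1, 0, 0, 0]  # satisfies all three checks
```

Erasing positions 0 and 3 of the example codeword, `peel_decode(H, [None, 0, 1, None, 0, 0, 0])` recovers the original word; with too many erasures no check has a single unknown, the decoder stalls, and it returns `None`.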
9

Paiba, Franklin Antonio Sanchez. „Bidimensional Fountain Codes for Erasure Channels“. Pontifícia Universidade Católica do Rio de Janeiro, 2008. http://www.maxwell.vrac.puc-rio.br/Busca_etds.php?strSecao=resultado&nrSeq=12457@1.

Annotation:
Fountain codes (LT codes and Raptor codes) are a class of codes proposed to transmit data efficiently and reliably through erasure channels. LT codes and Raptor codes are a good approximation to the concept of a digital fountain and as such are named fountain codes. They are said to be rateless codes in the sense that the number of symbols produced by the encoder can grow, potentially, to infinity. With probability of success larger than (1 − δ), a decoder of an LT-code-based scheme can recover the k transmitted symbols from any received block of k + O(√k ln²(k/δ)) correct symbols, with an average of O(k ln(k/δ)) XOR operations. Raptor codes are an extension of the LT-code idea, with a tandem scheme in which a fixed-length block code (namely a pre-code) is followed by an LT code that uses a properly chosen degree distribution. In this dissertation the performance of LT codes with two recently proposed degree distributions, the Improved Robust Soliton and the Truncated Robust Soliton distributions, is investigated. A new scheme, called Bidimensional LT codes, is proposed. In this scheme the input symbols are structured in matrix form, and the blocks corresponding to the rows of the matrix are encoded with an LT code. The columns of the new matrix so obtained are next encoded with a similar LT code. The complexity of the new scheme is doubled, and yet its performance only just surpasses that of the conventional LT scheme when the quality of the BEC is high.
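For reference, the standard degree distribution at the heart of LT encoding, the robust soliton distribution that the variants above modify, can be sketched as follows; the parameter values c and delta in the example call are arbitrary illustrative choices:

```python
import math

def robust_soliton(k, c=0.1, delta=0.5):
    """Robust soliton degree distribution for LT codes over k input symbols.

    Returns a list mu where mu[d-1] is the probability of encoding degree d.
    """
    R = c * math.log(k / delta) * math.sqrt(k)
    # ideal soliton component: rho(1) = 1/k, rho(d) = 1/(d(d-1)) for d >= 2
    rho = [1.0 / k] + [1.0 / (d * (d - 1)) for d in range(2, k + 1)]
    # robust correction tau: boosts low degrees and adds a spike near k/R
    tau = [0.0] * k
    spike = int(round(k / R))
    for d in range(1, spike):
        tau[d - 1] = R / (d * k)
    if 1 <= spike <= k:
        tau[spike - 1] = R * math.log(R / delta) / k
    # normalise so the probabilities sum to 1
    beta = sum(r + t for r, t in zip(rho, tau))
    return [(r + t) / beta for r, t in zip(rho, tau)]

mu = robust_soliton(100)
```

To emit one encoded symbol, an LT encoder draws a degree d from this distribution, picks d input symbols uniformly at random, and XORs them together.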
10

Rehman, Sadia. „This is My Family: An Erasure“. The Ohio State University, 2017. http://rave.ohiolink.edu/etdc/view?acc_num=osu1492399220029598.

11

Davids, Margaret. „Erasure: An Additive and Subtractive Act“. VCU Scholars Compass, 2019. https://scholarscompass.vcu.edu/etd/5866.

Annotation:
MOTIVATION
In the simplest form, a pencil mark on a page is removed by a traditional rubber eraser. However, the marks are often never fully removed, and the paper thins with each attempt to rub out an old idea. But how does one erase a chair? A pilaster? A room? A building? More importantly, how does the subtractive act of erasing become an additive one? The historical fabric of a building is important; it is also imperative that it does not remain stagnant. Erasing is an opportunity to design an interior environment that acknowledges the traces of both the pencil marks and the eraser. It is an opportunity to learn from historic design strategies and thoughtfully transition into the present to create a living, breathing palimpsest (Plesch, 2015).

PROBLEM
Current preservation policies and landmarking tactics arguably contradict preservationists' claims of promoting environmental, economic, and social growth within communities: by exempting historical buildings from complying with codes and regulations, they tie up property that could be more sustainably employed. Historical preservation is largely grounded in social constructs; present policies should therefore reflect societal changes. At times, the act of preserving removes these buildings from the possibility of a relevant and functional future by attempting to keep them wedged within historical restraints (Avrami, 2016).

METHOD
Research into precedent incidents of erasure, applied to concepts of historical preservation and restoration in the fields of interior design and architecture, will inform the design approach. These precedent studies will include works by Carlo Scarpa, Peter Zumthor, and David Chipperfield. To supplement them, other artistic disciplines and artists, including Robert Rauschenberg, will be researched in order to comprehend approaches to the concept of erasing holistically. Explorations of erasing different objects and media, undertaken to better understand the process of erasure, will also be imperative. These experiments will include the strategic erasing of pencil sketches and common objects to investigate how best to represent an object that has been erased.

PRELIMINARY RESULTS
The approach to erasing the historical fabric of a building depends largely on the building itself. This is evident in Scarpa's attention to the physical and metaphorical joinery of new and existing structures in his design of Palazzo Abatellis, Zumthor's weaving of old and new brickwork at Kolumba, and Chipperfield's use of exposed ruins in his design strategy for the Neues Museum (McCarter, 2013; Carrington, 2008; Rykwert, 2009). The process of erasure within the realm of preservation is constant and demonstrates how the act of erasing creates opportunities for the existence of something new (Katz, 2006).

CONCLUSION
Choosing to re-program and systematically erase a section of a historically significant but outdated medical tower as a collective art studio space would introduce the opportunity to design an "erased space" as an environment for post-graduate art students to produce creative work. This space would strengthen the growing bond between a school of the arts and a historic medical school while contributing to the culture of the surrounding neighbourhoods and to the rich tradition of art within the city.
12

Provest, Ian S. „Concepts of viewpoint and erasure: Botany Bay“. Thesis, University of Western Sydney, Nepean, Faculty of Performance, Fine Arts and Design, School of Design, 1996. http://handle.uws.edu.au:8081/1959.7/790.

Annotation:
When Captain James Cook sailed into Botany Bay in Australia for the first time in 1770, his botanist Joseph Banks described the behaviour of the Aboriginals as 'totally unmovd' and 'totally engagd'. During those same few days Cook named the place Stingray Bay; within eight days he changed the name to Botany Bay. Banks' phrases generate oscillating perceptions, and Cook's name change poses questions. The perceptions documented in Banks' journal refer to an invisibility of the Aboriginals themselves. The name 'Stingray' and its change to 'Botany' raise political questions about the necessity for the change. The change also sheds light on a viewpoint at odds with its subject. The events that occurred during the eight days Cook was anchored in Botany Bay will be discussed, firstly in the framework of an analysis of the implications of the terms 'totally unmovd' and 'totally engagd' in Banks' journal, and secondly in a discussion of the various historical notions concerning the name change. Did these curly histories and viewpoints render the indigenous culture invisible? Can the inscriptions made by Cook and Banks, and the subsequent mythologies surrounding them, including those about the actual place, be a metaphor for 'further understanding'?
Master of Arts (Hons) (Visual Arts)
13

Provest, I. S. „Concepts of viewpoint and erasure : Botany Bay /“. View thesis, 1996. http://library.uws.edu.au/adt-NUWS/public/adt-NUWS20030910.162554/index.html.

14

Talebi-Rafsanjan, Siamak. „Image restoration techniques for bursty erasure channels“. Thesis, King's College London (University of London), 2003. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.406409.

15

Mishra-Linger, Richard Nathan. „Irrelevance, Polymorphism, and Erasure in Type Theory“. PDXScholar, 2008. https://pdxscholar.library.pdx.edu/open_access_etds/2674.

Annotation:
Dependent type theory is a proven technology for verified functional programming in which programs and their correctness proofs may be developed using the same rules in a single formal system. In practice, large portions of programs developed in this way have no computational relevance to the ultimate result of the program and should therefore be removed prior to program execution. In previous work on identifying and removing irrelevant portions of programs, computational irrelevance is usually treated as an intrinsic property of program expressions. We find that such an approach forces programmers to maintain two copies of commonly used datatypes: a computationally relevant one and a computationally irrelevant one. We instead develop an extrinsic notion of computational irrelevance and find that it yields several benefits including (1) avoidance of the above mentioned code duplication problem; (2) an identification of computational irrelevance with a highly general form of parametric polymorphism; and (3) an elective (i.e., user-directed) notion of proof irrelevance. We also develop a program analysis for identifying irrelevant expressions and show how previously studied types embodying computational irrelevance (including subset types and squash types) are expressible in the extension of type theory developed herein.
16

Zepeda, Salvatierra Joaquin Alejandro. „Tandem filterbank DFT code for bursty erasure correction“. Thesis, McGill University, 2006. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=99553.

Annotation:
Discrete Fourier Transform (DFT) encoding over the real (or complex) field has been proposed as a means to reconstruct samples lost in multimedia transmissions over packet-based networks. A collection of simple sample reconstruction (and error detection) algorithms makes DFT codes an interesting candidate. A common problem with DFT code sample reconstruction algorithms is that the quantization associated with practical implementations results in reconstruction errors that are particularly large when lost samples occur in bursts (bursty erasures).
Following a survey of DFT decoding algorithms, we present herein the Tandem Filterbank/DFT Code (TFBD). The TFBD code consists of a tandem arrangement of a filterbank and DFT encoder that effectively creates DFT codes along the rows (temporal codevectors) and columns (subband codevectors) of the frame under analysis. The tandem arrangement ensures that subband codevectors (the frame columns) will be DFT codes, and we show how the temporal codevectors (frame rows) can also be interpreted as DFT codes. All the subband and temporal codevectors can be used to reconstruct samples entirely independently of each other. An erasure burst along a particular codevector can then be broken up by reconstructing some lost samples along the remaining orientation; these samples can then be used as received samples in reconstructing the original codevector, a technique that we refer to as pivoting. Expressions related to the performance of the Tandem Filterbank/DFT (TFBD) code, including an expression for the temporal code reconstruction error and for temporal-to-subband pivoting operations, are derived and verified through simulations. The expressions also prove useful in the selection of the many parameters specifying a TFBD encoder. The design process is illustrated for two sample TFBD codes that are then compared to a benchmark DFT code at the same rate. The results show that the TFBD encoder is capable of reconstruction error improvements that are more than four orders of magnitude better than that of the benchmark DFT code.
17

Jeffries, Sean Joseph. „Imprint erasure and DNA demethylation in mouse development“. Thesis, University of Cambridge, 2010. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.608949.

18

Zheng, Chenbo. „Error and erasure decoding for a CDPD system“. Thesis, This resource online, 1996. http://scholar.lib.vt.edu/theses/available/etd-08222008-063125/.

19

Apollonio, Pietrofrancesco. „Erasure error correcting codes applied to DTN communications“. Master's thesis, Alma Mater Studiorum - Università di Bologna, 2014. http://amslaurea.unibo.it/6852/.

Annotation:
The space environment has always been one of the most challenging for communications, at both the physical and network layers. Concerning the latter, the most common challenges are the lack of continuous network connectivity, very long delays, and relatively frequent losses. Because of these problems, the normal TCP/IP suite protocols are hardly applicable. Moreover, in space scenarios reliability is fundamental: it is usually not tolerable to lose important information or to receive it with a very large delay because of a challenging transmission channel. In terrestrial protocols such as TCP, reliability is obtained by means of an ARQ (Automatic Retransmission reQuest) method, which, however, performs poorly when there are long delays on the transmission channel. At the physical layer, Forward Error Correction codes (FECs), based on the insertion of redundant information, are an alternative way to ensure reliability. On binary channels, when single bits are flipped by channel noise, redundancy bits can be exploited to recover the original information. In the presence of binary erasure channels, where bits are not flipped but lost, redundancy can still be used to recover the original information. FEC codes designed for this purpose are usually called Erasure Codes (ECs). It is worth noting that ECs, primarily studied for binary channels, can also be used at upper layers, i.e. applied to packets instead of bits, offering a very interesting alternative to the usual ARQ methods, especially in the presence of long delays. A protocol created to add reliability to DTN networks is the Licklider Transmission Protocol (LTP), designed to obtain better performance on long-delay links. The aim of this thesis is the application of ECs to LTP.
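The packet-level erasure-coding idea the annotation describes (redundancy applied to packets rather than bits) can be shown in miniature with a single XOR parity packet, which suffices to recover any one lost packet without retransmission. This toy sketch is not LTP's actual mechanism, just an illustration of the principle:

```python
def make_parity(packets):
    """XOR a list of equal-length packets into one parity packet."""
    parity = bytes(len(packets[0]))  # all-zero packet of the same length
    for p in packets:
        parity = bytes(a ^ b for a, b in zip(parity, p))
    return parity

def recover(received, parity):
    """Rebuild the single lost packet (marked None) from the survivors.

    XORing the parity with every surviving packet cancels them out,
    leaving exactly the missing packet.
    """
    lost = received.index(None)
    rec = parity
    for i, p in enumerate(received):
        if i != lost:
            rec = bytes(a ^ b for a, b in zip(rec, p))
    return lost, rec

packets = [b"abcd", b"efgh", b"ijkl"]
parity = make_parity(packets)
```

On a long-delay link this matters because the receiver repairs the loss locally instead of waiting a full round trip for an ARQ retransmission; practical erasure codes generalise the same idea to tolerate many losses.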
20

Tomás, Estevan Virtudes. „Complete-MDP convolutional codes over the erasure channel“. Doctoral thesis, Universidad de Alicante, 2010. http://hdl.handle.net/10045/18325.

Der volle Inhalt der Quelle
APA, Harvard, Vancouver, ISO und andere Zitierweisen
21

Giotta, Gina Nicole. „Disappeared: erasure in the age of mechanical writing“. Diss., University of Iowa, 2011. https://ir.uiowa.edu/etd/2705.

Der volle Inhalt der Quelle
Annotation:
This dissertation surveys effaced bodies and the complications and victories left in their wake. While the recent `material turn' in media studies has produced valuable insight into the history of media artifacts and forms (as well as their contemporary progeny), the centrality of writing practices and inscription technologies in such scholarship has generated a rather ironic critical blind spot as regards the corresponding phenomenon of erasure. As inscription and erasure are co-constitutive forces that can only exist through ongoing encounters with one another, it is necessary--if we are to understand mechanical writing in all of its intricacy--to also keep in mind the parallel act of erasure and what has been lost or effaced as a result of the modern drive to write and record the world in so many ways. As such, this project considers three moments of erasure--or, scenes of deletion--between the periods of 1850 and 1950 in which the body serves as the site or object of effacement. In addition to carving out a secret route through which to explore the body's intersection with media technology (and the increasing mutability that has befallen it as a result of this association), this project also throws light on practices and technologies of erasure that have, themselves, become subject to deletion from the evolving historical record. The first case study considers the neglected pre-history of Photoshop by elaborating the retouching practices that grew up alongside the camera during the nineteenth century. It argues that such practices worked to erect a visible difference between the portrait of the criminal and that of his genteel counterpart, thereby helping to secure the class privilege of the latter at a time when the `democratic' representational style of the camera threatened to undo it. 
The second study explores the feminine `container' technology of military camouflage from its origins in World War I as a means of concealing the body of the soldier to its re-invention in the twenty-first century as a strategy for covering over the ongoing danger of war and impotence of hi-technology in postmodern scrimmages against non-state actors. This chapter ultimately builds upon Friedrich Kittler's argument that war is the mother of all media by suggesting that the dialectical tension between camouflage and the optical devices designed to thwart it is the mother of all war. The final case study turns to the breezier technology of the television laugh track and its erasure of the live studio audience from both the production process and the television text. It argues that while the laugh track's erasure of the audience left an irreducible trace that manifested itself in the repetition of the laughter dotting the text, the new formal devices that have come to replace the machine's original functions deftly efface their logic in a way that makes them unrecognizable as the offspring of the maligned technology.
APA, Harvard, Vancouver, ISO und andere Zitierweisen
22

Azeem, Muhammad Moazam. „Erasure Correcting Codes for Opportunistic Spectrum Access (OSA)“. Thesis, Paris, CNAM, 2014. http://www.theses.fr/2014CNAM1002/document.

Der volle Inhalt der Quelle
Annotation:
Recent years have seen an explosion of traffic on mobile networks since the appearance of new terminals (smartphones, tablets) and the uses they enable, in particular multimedia data, while voice traffic has remained roughly constant. One consequence is the need for more spectrum, or the necessity of making better use of the spectrum already allocated. Since there is no coordination between the secondary user(s) and the primary user, before any transmission the former must carry out processing to detect the periods in which the primary user transmits, which is the scenario considered in this thesis. We therefore consider another approach, based on the use of erasure correcting codes at the packet level. The last part of the thesis addresses a scenario in which there is no longer a primary user, all users having the same right to transmit on the channel. We describe a modification of the 802.11 MAC layer consisting of reducing the various waiting times (SIFS, DIFS, backoff, . . .) in order to access the channel more often, at the price of a few additional collisions that can be recovered by using erasure correcting codes
The emergence of new devices, especially smartphones and tablets with many new applications, has rocketed wireless traffic in recent years, and this is the main cause of the surge in demand for radio spectrum. Due to this dramatic increase in demand for limited spectrum, there is a need either for more spectrum or to use the existing spectrum more efficiently. Among the new dynamic access schemes designed to use the spectrum more efficiently, opportunistic spectrum access (OSA) is currently addressed, where one or more secondary users (SU) are allowed to access the channel when the primary user (PU) is not transmitting. Erasure correcting codes are therefore envisioned to recover the data lost due to sensing impairments. We define an efficiency parameter for the SU and optimize it in terms of spectrum utilization, taking into account sensing impairments, code parameters, and the activity of the PU. Finally, spectrum access for multiple secondary users is addressed when there is no primary user and each user has an equal right to access the channel. The interesting scenarios are cognitive radio networks and WiFi, where the 802.11 protocol gives the specification for the MAC layer. The throughput curves achieved by retransmission and by various erasure correcting codes are compared. This modification of the MAC layer reduces the long waiting time to access the channel as the number of users increases
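The efficiency parameter described above can be sketched under simple assumptions (independent per-packet loss with probability p caused by PU returns or sensing errors; the numbers are illustrative, not the thesis model): the SU's goodput is the code rate times the probability that a block's losses stay within the code's correction capability, and the best rate balances redundancy against rate loss.

```python
from math import comb

p = 0.1    # probability a secondary-user packet is lost (hypothetical value)
n = 20     # erasure-code block length in packets (hypothetical value)

def efficiency(k):
    """Goodput of an (n, k) packet erasure code for the SU: code rate times
    the probability that at most n - k packets of a block are lost."""
    p_block_ok = sum(comb(n, i) * p**i * (1 - p)**(n - i)
                     for i in range(n - k + 1))
    return (k / n) * p_block_ok

best_k = max(range(1, n + 1), key=efficiency)
print(f"best code dimension k = {best_k}, efficiency = {efficiency(best_k):.3f}")
```

Uncoded transmission (k = n) succeeds only when an entire block survives, which at p = 0.1 is rare; a moderate amount of redundancy maximizes the SU's useful throughput, which is the trade-off the thesis optimizes jointly with the PU's activity.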
APA, Harvard, Vancouver, ISO und andere Zitierweisen
24

Greenan, Kevin M. „Reliability and power-efficiency in erasure-coded storage systems /“. Diss., Digital Dissertations Database. Restricted to UC campuses, 2009. http://uclibs.org/PID/11984.

Der volle Inhalt der Quelle
APA, Harvard, Vancouver, ISO und andere Zitierweisen
25

Kalichini, Gladys. „FyaMoneka: exploring the erasure of women within Zambian history“. Thesis, Rhodes University, 2018. http://hdl.handle.net/10962/63186.

Der volle Inhalt der Quelle
Annotation:
This Master of Fine Art submission, comprising an exhibition and a mini-thesis, explores the erasure of women’s narratives from Zambian history and collective memory. As a point of entry into the broader conversation of narratives of women marginalised in certain historicised events, this research analyses the narratives of Julia Chikamoneka and Alice Lenshina that are held in the collective memory of Zambian history. It focuses on the representations of narratives of women during and beyond colonial times, while hinging particularly on these two figures’ encounters with and against British rule in Northern Rhodesia (Zambia). Titled FyaMoneka: Exploring the Erasure of Women Within Zambian History, the mini-thesis examines the representations and positioning of women’s political activities within the liberation narrative that is recorded in the National Archives of Zambia (NAZ) and the United National Independence Party (UNIP) Archives. This mini-thesis highlights the fact that women have been written out of Zambia’s liberation narrative in the NAZ and the UNIP Archives, and remains mindful of the inherent modifications and erasures of women’s accounts over time, including the obfuscation or the absence of certain archival materials. It prospectively reconstructs Chikamoneka’s and Lenshina’s narratives using traces of their histories within collective memory through re/visiting processes of re-archivisation. The exhibition, titled ChaMoneka (It Has Become Visible): UnCasting Shadows, explores death and representations of death, where death is conceptualised as a metaphor for the erasure of women’s historical narratives, while the body represents the narrative. Based on an exploration of the relationship and tensions between collective memory and history, death within this exhibition is thematised as the course of fading away, a continuous process in which women’s narratives are erased.
APA, Harvard, Vancouver, ISO und andere Zitierweisen
26

Zeng, Weifei. „Coding and scheduling optimization over packet erasure broadcast channels“. Thesis, Massachusetts Institute of Technology, 2012. http://hdl.handle.net/1721.1/71501.

Der volle Inhalt der Quelle
Annotation:
Thesis (S.M.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 2012.
Cataloged from PDF version of thesis.
Includes bibliographical references (p. 93-97).
Throughput and per-packet delay can present strong trade-offs that are important in the cases of delay sensitive applications. In this thesis, we investigate such trade-offs using a random linear network coding scheme for one or more receivers in single hop wireless packet erasure broadcast channels. We capture the delay sensitivities across different types of network applications using a class of delay metrics based on the norms of packet arrival times. With these delay metrics, we establish a unified framework to characterize the rate and delay requirements of applications and optimize system parameters. In the single receiver case, we demonstrate the trade-off between average packet delay, which we view as the inverse of throughput, and maximum ordered inter-arrival delay for various system parameters. For a single broadcast channel with multiple receivers having different delay constraints and feedback delays, we jointly optimize the coding parameters and time-division scheduling parameters at the transmitters. We formulate the optimization problem as a Generalized Geometric Program (GGP). This approach allows the transmitters to adjust adaptively the coding and scheduling parameters for efficient allocation of network resources under varying delay constraints. In the case where the receivers are served by multiple non-interfering wireless broadcast channels, the same optimization problem is formulated as a Signomial Program, which is NP-hard in general. We provide approximation methods using successive formulation of geometric programs and show the convergence of approximations. Practical issues of implementing proposed coding and optimization scheme on existing layered network architecture are also discussed.
by Weifei Zeng.
S.M.
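The random linear network coding scheme the abstract investigates can be illustrated with a small sketch over GF(2) (an illustrative toy, not the thesis code): the transmitter sends random XOR combinations of K source packets, and a receiver decodes by Gaussian elimination as soon as it has collected K linearly independent combinations, no matter which transmissions were erased.

```python
import random

random.seed(1)

K = 4                                      # number of source packets
packets = [random.getrandbits(32) for _ in range(K)]   # payloads as 32-bit words

def rlnc_symbol():
    """One random linear combination over GF(2): (coefficient bitmask, XOR payload)."""
    coeffs = random.getrandbits(K) or 1    # avoid the useless all-zero combination
    payload = 0
    for i in range(K):
        if coeffs >> i & 1:
            payload ^= packets[i]
    return coeffs, payload

def decode(symbols):
    """Gaussian elimination over GF(2); returns the source packets, or None
    while the received combinations do not yet span all K dimensions."""
    rows = [list(s) for s in symbols]
    solved = [None] * K
    for col in range(K):                   # forward elimination
        pivot = next((r for r in rows if r[0] >> col & 1), None)
        if pivot is None:
            return None                    # rank deficient so far
        rows.remove(pivot)
        for r in rows:
            if r[0] >> col & 1:
                r[0] ^= pivot[0]
                r[1] ^= pivot[1]
        solved[col] = pivot
    for col in range(K - 1, -1, -1):       # back substitution
        pc, pp = solved[col]
        for hi in range(col + 1, K):
            if pc >> hi & 1:
                pc ^= solved[hi][0]
                pp ^= solved[hi][1]
        solved[col] = [pc, pp]
    return [row[1] for row in solved]

received = []
decoded = None
while decoded is None:                     # collect combinations until full rank
    received.append(rlnc_symbol())
    decoded = decode(received)

assert decoded == packets
```

With random coefficients the expected number of coded packets needed is only slightly more than K, which is what makes the scheme attractive on erasure broadcast channels: any K independent combinations suffice, which decouples reliability from which particular packets each receiver lost.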
APA, Harvard, Vancouver, ISO und andere Zitierweisen
27

Ahmad, Shakeel. „Optimized Network-Adaptive Multimedia Transmission Over Packet Erasure Channels“. [S.l. : s.n.], 2008. http://nbn-resolving.de/urn:nbn:de:bsz:352-opus-70706.

Der volle Inhalt der Quelle
APA, Harvard, Vancouver, ISO und andere Zitierweisen
28

Childers, Thomas Edward. „File transfer with erasure coding over wireless sensor networks“. Thesis, Monterey, Calif. : Naval Postgraduate School, 2009. http://edocs.nps.edu/npspubs/scholarly/theses/2009/Mar/09Mar%5FChilders.pdf.

Der volle Inhalt der Quelle
Annotation:
Thesis (M.S. in Electrical Engineering)--Naval Postgraduate School, March 2009.
Thesis Advisors: McEachen, John ; Tummala, Murali. "March 2009." Description based on title screen as viewed on April 23, 2009. Author subject terms: Wireless Communication, Wireless Sensor Networks, Data Transmission. Includes bibliographical references (p. 81-82). Also available in print.
APA, Harvard, Vancouver, ISO und andere Zitierweisen
29

Nagasubramanian, Karthik. „Code design for erasure channels with limited or noisy feedback“. College Station, Tex. : Texas A&M University, 2007. http://hdl.handle.net/1969.1/ETD-TAMU-2065.

Der volle Inhalt der Quelle
APA, Harvard, Vancouver, ISO und andere Zitierweisen
30

Le, Dang Quang. „Opportunistic multicasting scheduling using erasure-correction coding over wireless channels“. Thesis, McGill University, 2010. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=95215.

Der volle Inhalt der Quelle
Annotation:
In wireless communications, the broadcast nature of the medium can be exploited to support multicast services efficiently, while the difference in channel gains among users' wireless links provides multiuser diversity that can be used to improve multicast performance. This thesis proposes an opportunistic multicast scheduling scheme using erasure-correction coding to jointly exploit multicast gain, multiuser diversity, and time/frequency diversity in wireless communications. The proposed scheme sends only one copy to all users in the multicast group, at a selected transmission threshold, on time/frequency slots, and uses erasure-correction coding to recover erased packets when the instantaneous signal-to-noise ratio (SNR) of the link between the Base Station and a user is insufficient. On a flat fading channel, an analytical framework is developed to establish the optimum selection of transmission threshold and erasure-correction code rate to achieve the best multicast throughput under different fading conditions. Numerical results show that the proposed scheme outperforms both the Worst-User and Best-User schemes over a wide range of SNR. The results also show that multiuser diversity dominates in the low SNR region, while multicast gain is most significant in the high SNR region. Moreover, to study the role of channel knowledge, the proposed scheme is considered in two cases: (i) with full channel gain knowledge and (ii) with only partial knowledge of the fading type and average SNR. Our study indicates that full channel knowledge is beneficial for small multicast groups, but at large group sizes it is sufficient to have partial channel knowledge, as the difference in achievable throughput between the two cases is just marginal. The proposed scheme is further extended to Orthogonal Frequency Division Multiplexing systems to take advantage of frequency diversity in a frequency-selective fading environment. Our study on the effect of frequency correlation on multicast throughput
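The threshold-selection trade-off described above can be sketched numerically. Assuming Rayleigh fading and an ideal erasure code that recovers the erased fraction (illustrative assumptions, not the thesis's analytical framework), per-user goodput is the spectral efficiency at the threshold SNR times the probability that a user's instantaneous SNR clears that threshold:

```python
import math

avg_snr = 10.0   # mean SNR in linear scale; Rayleigh fading assumed (illustrative)

def goodput(th):
    """Per-user goodput model: spectral efficiency at the threshold SNR,
    weighted by the probability the instantaneous SNR clears the threshold,
    assuming an ideal erasure code absorbs the remaining losses."""
    p_receive = math.exp(-th / avg_snr)    # Rayleigh fading: P(SNR >= th)
    return math.log2(1 + th) * p_receive

# Grid search over the transmission threshold.
best_gp, best_th = max((goodput(t / 100), t / 100) for t in range(1, 5000))
print(f"optimal threshold ~ {best_th:.2f}, goodput ~ {best_gp:.3f} bit/s/Hz")
```

A low threshold serves every user but at a low rate (the Worst-User regime); a high threshold transmits fast but erases most users' packets (the Best-User regime); the optimum sits in between, mirroring the comparison reported in the abstract.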
APA, Harvard, Vancouver, ISO und andere Zitierweisen
31

Parikh, Salil (Salil Arvind) 1971. „On the use of erasure codes in unreliable data networks“. Thesis, Massachusetts Institute of Technology, 2001. http://hdl.handle.net/1721.1/33159.

Der volle Inhalt der Quelle
Annotation:
Thesis (S.M.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 2001.
This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections.
Includes bibliographical references (p. 62-64).
Modern data networks are approaching a state where a large number of independent and heterogeneous paths are available between a source node and a destination node. In this work, we explore the case where each path has an independent level of reliability characterized by a probability of path failure. Instead of simply repeating the message across all the paths, we use the path diversity to achieve reliable transmission of messages by using a coding technique known as an erasure correcting code. We develop a model of the network and present an analysis of the system that invokes the Central Limit Theorem to approximate the total number of bits received over all the paths. We then optimize the number of bits to send over each path in order to maximize the probability of receiving a sufficient number of bits at the destination to reconstruct the message using the erasure correcting code. Three cases are investigated: when the paths are very reliable, when the paths are very unreliable, and when the paths have a probability of failure within an interval surrounding 0.5. We present an overview of the mechanics of an erasure coding process applicable to packet-based transactions. Finally, as avenues for further research, we discuss several applications of erasure coding in networks that have only a single path between source and destination: for latency reduction in interactive web sessions, as a transport layer for critical messaging, and as an application layer protocol for high-bandwidth networks.
by Salil Parikh.
S.M.
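The bit-allocation problem the abstract analyses via the Central Limit Theorem can be computed exactly for a toy case (the failure probabilities below are hypothetical): enumerating all path-failure outcomes shows how spreading erasure-coded bits across paths trades a little success probability for far fewer transmitted bits than repeating the whole message on every path.

```python
from itertools import product

K = 1000                        # bits needed to reconstruct the message
fail = [0.1, 0.2, 0.3]          # per-path failure probabilities (hypothetical)

def success_prob(alloc):
    """Exact probability that the surviving paths deliver at least K bits."""
    prob = 0.0
    for outcome in product([0, 1], repeat=len(alloc)):    # 1 = path survives
        p = 1.0
        bits = 0
        for up, pf, n in zip(outcome, fail, alloc):
            p *= (1 - pf) if up else pf
            bits += n if up else 0
        if bits >= K:
            prob += p
    return prob

repetition = success_prob([1000, 1000, 1000])  # full copy per path: 3000 bits sent
coded = success_prob([500, 500, 500])          # rate-2/3 code: any 2 paths suffice
print(f"repetition: {repetition:.3f}, coded: {coded:.3f}")
```

The coded allocation sends half as many bits and still succeeds whenever at least two of the three paths survive; the CLT approximation in the thesis replaces this exponential enumeration when the number of paths grows large.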
APA, Harvard, Vancouver, ISO und andere Zitierweisen
32

Small-Clouden, Lystra. „Globalization, assimilation, culture erasure| A review of Trinidad and Tobago“. Thesis, Capella University, 2015. http://pqdtopen.proquest.com/#viewpdf?dispub=3723119.

Der volle Inhalt der Quelle
Annotation:

The objective of this study was to examine the relationship between globalization and assimilation (dependent variables) and four contributing factors of culture, values, norms, and identity (independent variables) to determine whether managers in Trinidad and Tobago devalue their own culture to assimilate into a global culture. A researcher-constructed survey questionnaire was used to collect data from a random sample of respondents. The survey was analyzed utilizing both parametric and nonparametric statistical tools to answer five research subquestions. The one-sample t test was an appropriate tool to establish construct reliability and validity of assumptions for this quantitative study. Values were established to support a level of statistical significance of p < 0.05 as follows: a medium effect size (f² = .15), alpha = .05, and power = .80, yielding an acceptable sample size of 85 participants. Based on the evaluation of the statistical data, it was concluded that (a) there was an impact of demographic factors on culture, values, norms, and identity; (b) global factors had no impact on culture, values, norms, and identity; (c) Trinidad and Tobago managers assimilated during international business meetings; (d) there was an impact of assimilation on culture, values, norms, and identity in Trinidad and Tobago; and (e) there was no change in management behavior during international business meetings. Three implications resulted from the findings. First, from a theoretical perspective, based on the analysis of culture, managers were unaware of culture erasure. Second, from a scientific merit perspective, the ANOVA method optimized and validated the causal-comparative effect of both measurement and structural models with the inclusion of interrelationship effects between variables. Finally, from a practical perspective, respondents perceived that global factors had no impact on culture, but assimilation had a negative impact on culture.
Based on the results, it was assumed that the unique and distinguishable aspects of culture are disappearing, and that the effects of globalization and assimilation have caused an unconscious reprogramming of collective behaviors, resulting in culture erasure.

APA, Harvard, Vancouver, ISO und andere Zitierweisen
33

Phillips, Linzy. „Erasure-correcting codes derived from Sudoku & related combinatorial structures“. Thesis, University of South Wales, 2013. https://pure.southwales.ac.uk/en/studentthesis/erasurecorrecting-codes-derived-from-sudoku--related-combinatorial-structures(b359130e-bfc2-4df0-a6f5-55879212010d).html.

Der volle Inhalt der Quelle
Annotation:
This thesis presents the results of an investigation into the use of puzzle-based combinatorial structures for erasure correction purposes. The research encompasses two main combinatorial structures: the well-known number placement puzzle Sudoku and a novel three component construction designed specifically with puzzle-based erasure correction in mind. The thesis describes the construction of outline erasure correction schemes incorporating each of the two structures. The research identifies that both of the structures contain a number of smaller sub-structures, the removal of which results in a grid with more than one potential solution - a detrimental property for erasure correction purposes. Extensive investigation into the properties of these sub-structures is carried out for each of the two outline erasure correction schemes, and results are determined that indicate that, although the schemes are theoretically feasible, the prevalence of sub-structures results in practically infeasible schemes. The thesis presents detailed classifications for the different cases of sub-structures observed in each of the outline erasure correction schemes. The anticipated similarities in the sub-structures of Sudoku and sub-structures of Latin Squares, an established area of combinatorial research, are observed and investigated, the proportion of Sudoku puzzles free of small sub-structures is calculated and a simulation comparing the recovery rates of small sub-structure free Sudoku and standard Sudoku is carried out. The analysis of sub-structures for the second erasure correction scheme involves detailed classification of a variety of small sub-structures; the thesis also derives probabilistic lower bounds for the expected numbers of case-specific sub-structures within the puzzle structure, indicating that specific types of sub-structure hinder recovery to such an extent that the scheme is infeasible for practical erasure correction. 
The consequences of complex cell inter-relationships and wider issues with puzzle-based erasure correction, beyond the structures investigated in the thesis are also discussed, concluding that while there are suggestions in the literature that Sudoku and other puzzle-based combinatorial structures may be useful for erasure correction, the work of this thesis suggests that this is not the case.
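The sub-structure problem the thesis classifies can be demonstrated with a toy version of the scheme (a sketch, not from the thesis): a 4x4 Latin square used as an erasure code, where an erased cell is recoverable only if row and column constraints pin down a unique symbol. The second case below erases a 2x2 intercalate, which admits two valid completions, so no decoder can recover it uniquely.

```python
# A 4x4 Latin square as a toy erasure code: recover erased cells by
# constraint propagation over row and column constraints.
import copy

square = [[1, 2, 3, 4],
          [2, 1, 4, 3],
          [3, 4, 1, 2],
          [4, 3, 2, 1]]

def recover(grid):
    grid = copy.deepcopy(grid)
    progress = True
    while progress:
        progress = False
        for r in range(4):
            for c in range(4):
                if grid[r][c] is None:
                    used = ({grid[r][j] for j in range(4)} |
                            {grid[i][c] for i in range(4)})
                    candidates = {1, 2, 3, 4} - used
                    if len(candidates) == 1:   # uniquely forced cell
                        grid[r][c] = candidates.pop()
                        progress = True
    return grid

# Scattered erasures: every cell is uniquely forced, so recovery succeeds.
g = copy.deepcopy(square)
for r, c in [(0, 0), (1, 2), (3, 3)]:
    g[r][c] = None
assert recover(g) == square

# An erased 'intercalate' (2x2 sub-square with two symbols): two valid
# completions exist, so the erasures cannot be recovered -- the kind of
# sub-structure the thesis shows makes such schemes impractical.
g = copy.deepcopy(square)
for r, c in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    g[r][c] = None
assert None in recover(g)[0]
```

The intercalate here is the Latin-square analogue of the "unavoidable sets" studied for Sudoku: once all of its cells are erased, the surviving grid no longer determines the original, which is exactly the detrimental property the thesis quantifies.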
APA, Harvard, Vancouver, ISO und andere Zitierweisen
34

Scherer, Frank F. „A culture of erasure, orientalism and Chineseness in Cuba, 1847-1997“. Thesis, National Library of Canada = Bibliothèque nationale du Canada, 2000. http://www.collectionscanada.ca/obj/s4/f2/dsk1/tape2/PQDD_0017/MQ59204.pdf.

Der volle Inhalt der Quelle
APA, Harvard, Vancouver, ISO und andere Zitierweisen
35

Aitken, D. G. „Error control coding for mixed wireless and internet packet erasure channels“. Thesis, University of Surrey, 2008. http://epubs.surrey.ac.uk/804436/.

Der volle Inhalt der Quelle
Annotation:
Recent years have seen dramatic growth in Internet usage and an increasing convergence between Internet and wireless communications. There has also been renewed interest in iteratively decoded low-density parity-check (LDPC) codes due to their capacity-approaching performance on AWGN channels.
APA, Harvard, Vancouver, ISO und andere Zitierweisen
36

Kane, Peter William. „Representation and meaning : the erasure of mediation in contemporary text theory“. Thesis, University of Canterbury. Department of English, 1988. http://hdl.handle.net/10092/3801.

Der volle Inhalt der Quelle
Annotation:
This thesis undertakes an examination of theories describing the generation of meaning in various forms of representation, with specific reference to literary texts, photography and cinema. Proceeding from Derrida's claim that all signification may be classed as a general form of Writing, I consider the consequences of analysing the individual forms of representation from within a general system. The classical representation of self-presence in speech is examined, and the metaphysical notions derived from the apparent proximity of voice to thought, on the basis of instantaneous production, are challenged. As a result, the claims regarding presence and atemporality in the analogical representation of the photograph and the motion picture projection are critically examined; and the closure of mediation and meaning in the speech/writing and author/text hierarchies is subjected to deconstructionist analysis. Text theory often acknowledges Lacanian psychoanalysis, and Lacan's theory of the signification of desire is evaluated for its relationship to traditional models of representation. The Lacanian influence in Barthes's work is traced through the notion of textual desire developed in S/Z, and Barthes's claims regarding the meaning of myth and code are examined for their erasure of textual mediation. With the conclusion that claims to "truth" and the original encounter of "presence" in representation are founded on erasure and the privileging of secondary phenomena, the thesis incorporates examples of texts, such as Desperately Seeking Susan, which evoke the traditionally elided material; because the label "feminine" is usually applied to the excluded features of representation, the contemporary work in this area often takes a feminist approach, challenging the phallogocentric tradition of Western society.
APA, Harvard, Vancouver, ISO und andere Zitierweisen
37

Elliott, Matthew Edwin. „Erasure and reform Los Angeles literature and the reconstruction of the past /“. College Park, Md. : University of Maryland, 2004. http://hdl.handle.net/1903/2075.

Der volle Inhalt der Quelle
Annotation:
Thesis (Ph. D.) -- University of Maryland, College Park, 2004.
Thesis research directed by: English Language and Literature. Title from t.p. of PDF. Includes bibliographical references. Published by UMI Dissertation Services, Ann Arbor, Mich. Also available in paper.
APA, Harvard, Vancouver, ISO und andere Zitierweisen
38

Wang, Chia-Wen. „Bounds on the map threshold of iterative decoding systems with erasure noise“. Thesis, Texas A&M University, 2008. http://hdl.handle.net/1969.1/86027.

Der volle Inhalt der Quelle
Annotation:
Iterative decoding and codes on graphs were first devised by Gallager in 1960 and rediscovered by Berrou, Glavieux and Thitimajshima in 1993. This technique plays an important role in modern communications, especially in coding theory and practice. In particular, low-density parity-check (LDPC) codes, introduced by Gallager in the 1960s, are the class of codes at the heart of iterative coding. Since these codes are quite general and exhibit good performance under message-passing decoding, they play an important role in communications research today. A thorough analysis of iterative decoding systems and the relationship between maximum a posteriori (MAP) and belief propagation (BP) decoding was initiated by Measson, Montanari, and Urbanke. This analysis is based on density evolution (DE) and extrinsic information transfer (EXIT) functions, introduced by ten Brink. Following their work, this thesis considers the MAP decoding thresholds of three iterative decoding systems. First, irregular repeat-accumulate (IRA) and accumulate-repeat-accumulate (ARA) code ensembles are analyzed on the binary erasure channel (BEC). Next, the joint iterative decoding of LDPC codes is studied on the dicode erasure channel (DEC). The DEC is a two-state intersymbol-interference (ISI) channel with erasure noise, and it is the simplest example of an ISI channel with erasure noise. Then, we introduce a slight generalization of the EXIT area theorem and apply the MAP threshold bound to the joint decoder. Both the MAP and BP erasure thresholds are computed and compared with each other. The result quantifies the loss due to iterative decoding. Some open questions include the tightness of these bounds and extensions to non-erasure channels.
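The BP-threshold computation underlying this kind of analysis can be sketched for the simplest standard case, a regular (dv=3, dc=6) LDPC ensemble on the BEC (a textbook density-evolution recursion, not the thesis's IRA/ARA analysis): the erasure fraction evolves as x' = eps * (1 - (1 - x)^(dc-1))^(dv-1), and the BP threshold is the largest channel erasure probability eps for which this recursion converges to zero.

```python
# Density evolution for a regular (3,6) LDPC ensemble on the binary
# erasure channel, with bisection on the BP threshold.

def converges(eps, dv=3, dc=6, iters=2000, tol=1e-10):
    """Does the DE recursion drive the erasure fraction to (near) zero?"""
    x = eps
    for _ in range(iters):
        x = eps * (1 - (1 - x) ** (dc - 1)) ** (dv - 1)
        if x < tol:
            return True
    return False

lo, hi = 0.0, 1.0
while hi - lo > 1e-6:          # bisection on the threshold
    mid = (lo + hi) / 2
    lo, hi = (mid, hi) if converges(mid) else (lo, mid)

print(f"BP threshold for (3,6) LDPC on the BEC ~ {lo:.4f}")
```

The recursion yields a BP threshold of about 0.4294, while the MAP threshold of the same ensemble, obtained from the EXIT area theorem, is about 0.4881; the gap between the two is precisely the loss due to iterative decoding that the abstract refers to.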
39

Chang, Cheng. „Reliable and secure storage with erasure codes for OpenStack Swift in PyECLib“. Thesis, KTH, Skolan för informations- och kommunikationsteknik (ICT), 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-202972.

Annotation:
In the last decade, cloud storage systems have experienced rapid growth and now account for an important part of cloud-based services. Among them, OpenStack Swift is an open-source implementation of an object storage system. Meanwhile, storage providers are making great efforts to ensure the quality of their services. One of the key factors of storage systems is data durability, and fault-tolerance mechanisms play an important role in ensuring data availability. Existing approaches such as replication and RAID are used to protect data from loss, though each has its own drawbacks. Erasure coding has emerged as a novel concept applied in storage systems out of concern for data availability. Studies have shown that it can provide fault tolerance through redundancy while reducing the capacity overhead, offering a tradeoff between performance and cost. This project conducted an in-depth investigation of OpenStack Swift and the erasure coding approach. Analyses of erasure-coded and replicated systems are performed to compare the features of both approaches. A prototype of a custom erasure code is implemented as an extension to Swift, offering data storage with promising reliability and performance.
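The replication-versus-erasure-coding tradeoff the thesis examines can be made concrete with a small probability sketch, assuming independent fragment failures; the per-device failure probability and the (10, 4) layout below are illustrative, not Swift's actual defaults:

```python
from math import comb

def p_loss_replication(p, replicas=3):
    """Data is lost only if every replica fails (independent failures)."""
    return p ** replicas

def p_loss_erasure(p, k, m):
    """(k, m) erasure code: data is lost when more than m of the
    n = k + m fragments fail."""
    n = k + m
    return sum(comb(n, i) * p**i * (1 - p)**(n - i)
               for i in range(m + 1, n + 1))

p = 0.01  # per-device failure probability over some period (illustrative)
print(p_loss_replication(p))         # ~1e-06 at 3x storage overhead
print(p_loss_erasure(p, k=10, m=4))  # ~1.8e-07 at only 1.4x overhead
```

Even this toy model shows the point the thesis makes: the erasure-coded layout reaches a lower loss probability at less than half the capacity overhead of triplication.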
40

Thie, Johnson (Electrical Engineering & Telecommunications, Faculty of Engineering, UNSW). „Optimal erasure protection assignment for scalably compressed data over packet-based networks“. Awarded by: University of New South Wales, Electrical Engineering and Telecommunications, 2004. http://handle.unsw.edu.au/1959.4/19373.

Annotation:
This research is concerned with the reliable delivery of scalable compressed data over lossy communication channels. Recent works proposed several strategies for assigning optimal code redundancies to elements of scalable data, which form a linear structure of dependency, under the assumption that all source elements are encoded onto a common group of network packets. Given large data and small network packets, such schemes require very long channel codes with high computational complexity. In networks with high loss, small packets are more desirable than long packets. The first contribution of this thesis is to propose a strategy for optimally assigning elements of the scalable data to clusters of packets, subject to constraints on packet size and code complexity. Given a packet cluster arrangement, the scheme then assigns optimal code redundancies to the source elements, subject to a constraint on transmission length. Experimental results show that the proposed strategy can outperform the previous code assignment schemes subject to the above-mentioned constraints, particularly at high channel loss rates. Secondly, we modify these schemes to accommodate complex structures of dependency. Source elements are allocated to clusters of packets according to their dependency structure, subject to constraints on packet size and channel codeword length. Given a packet cluster arrangement, the proposed schemes assign optimal code redundancies to the source elements, subject to a constraint on transmission length. Experimental results demonstrate the superiority of the proposed strategies for correctly modelling the dependency structure. The last contribution of this thesis is to propose a scheme for optimizing protection of scalable data where limited retransmission is possible. Previous work assumed that retransmission is not possible. For most real-time or interactive applications, however, retransmission of lost data may be possible up to some limit. 
In the present work we restrict our attention to streaming sources (e.g., video) where each source element can be transmitted in one or both of two time slots. An optimization algorithm determines the transmission and level of protection for each source element, using information about the success of earlier transmissions. Experimental results confirm the benefit of limited retransmission.
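The kind of optimization described above can be illustrated with a toy exhaustive search (not the thesis's algorithm): the layers of a scalable stream form a linear dependency chain, each layer is protected by a hypothetical (n_l, k) MDS code, and the search picks the redundancy split that maximizes the expected received value under a total packet budget:

```python
from itertools import product
from math import comb

def p_decode(n, k, p_loss):
    """Probability that at least k of n packets of an (n, k) MDS code arrive."""
    return sum(comb(n, i) * (1 - p_loss)**i * p_loss**(n - i)
               for i in range(k, n + 1))

def best_assignment(values, k, n_total, p_loss):
    """Exhaustively split a budget of n_total packets over the layers
    (n_l >= k each), maximizing expected value under the linear dependency:
    layer l is useful only if layers 1..l are all decodable."""
    best_val, best_ns = -1.0, None
    for ns in product(range(k, n_total + 1), repeat=len(values)):
        if sum(ns) != n_total:
            continue
        exp_val, p_prefix = 0.0, 1.0
        for v, n in zip(values, ns):
            p_prefix *= p_decode(n, k, p_loss)
            exp_val += v * p_prefix
        if exp_val > best_val:
            best_val, best_ns = exp_val, ns
    return best_ns, best_val

# Three layers of decreasing importance, k = 4 source packets per layer,
# a budget of 18 packets in total, and 20% packet loss.
ns, val = best_assignment([10.0, 5.0, 1.0], k=4, n_total=18, p_loss=0.2)
print(ns)  # e.g. (7, 7, 4): the least important layer gets no redundancy
```

The exhaustive search is exponential in the number of layers; the thesis's contribution is precisely to make this assignment tractable under packet-size and code-complexity constraints.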
41

Vishwanathan, Roopa; Tate, Stephen R. „Power-benefit analysis of erasure encoding with redundant routing in sensor networks“. [Denton, Tex.] : University of North Texas, 2006. http://digital.library.unt.edu/permalink/meta-dc-5426.

42

Alivizatou, M. E. „Preservation, erasure and representation : rethinking 'intangible heritage' in a comparative museum ethnography“. Thesis, University College London (University of London), 2009. http://discovery.ucl.ac.uk/18749/.

Annotation:
In a critical dialogue with museum and cultural heritage studies, this thesis examines the concept of ‘intangible cultural heritage’ (ICH) and its implications for heritage theory, policy and practice. ICH gained international recognition in the 21st century primarily through the activities of UNESCO. Controversies and gaps inherent in the institutional discourse on ICH, however, have led critics to question not only its assumptions but in some cases its very raison d’être. Taking this forward, the purpose of this thesis is to revisit the ICH discourse and explore alternative negotiations entangled in institutional configurations, intellectual quests for parallel/ subversive heritages and new/ postmuseum paradigms. My point of departure is a critique of the preservationist ethos of UNESCO that has led to the construction of the official ICH narrative. Based on the idea of the ‘politics of erasure’, I argue for the re-conceptualisation of ICH not via archival and salvage measures, but through the reworking of the modern/ pre-modern and presence/ absence dynamics embedded in notions of impermanence, renewal and transformation. Parallel to that, I trace the implications of the ICH discourse for heritage and museum practice. As such, I conduct multi-sited fieldwork research and follow the negotiations of ICH from the global sphere of UNESCO to the localised complexities of five museum milieux. These are the National Museum of New Zealand Te Papa Tongarewa (Wellington), the Vanuatu Cultural Centre (Port Vila), the National Museum of the American Indian (Washington, New York, Suitland), the Horniman Museum (London) and the Musée du Quai Branly (Paris); selected as fieldwork destinations for the diverse perspectives they offer on ICH in the museum space and discourse. 
In so doing, I engage with the idea of the new museum, not as a repository of material culture, but as performative space for the empowerment of bottom-up, participatory museology and the reworking of the tangible/ intangible divide. My conclusion suggests that, couched within debates on the politics of recognition, representation and invented traditions and beyond UNESCO’s preservationist schemata, ICH emerges as a contested and critical intervention challenging and reinventing heritage policy and museum-work.
43

Silvaroli, Antonio. „Design and Analysis of Erasure Correcting Codes in Blockchain Data Availability Problems“. Master's thesis, Alma Mater Studiorum - Università di Bologna, 2021.

Annotation:
This work examines Blockchain and Bitcoin, with emphasis on availability attacks against transactions in networks that include so-called "light nodes", which improve the scalability of the system. It analyses how the Bitcoin blockchain behaves when the Merkle tree data structure is encoded so as to increase the probability that light nodes detect the erasure of transactions carried out by attacking nodes. Encoding with erasure codes, in particular low-density parity-check (LDPC) codes, increases the probability of detecting an erasure, and iterative decoding makes it possible to recover the erased data. The problem of stopping sets, the structures that prevent data recovery through iterative decoding, is addressed, and an algorithm for enumerating such structures is designed. Several theoretical results from the literature are then tested empirically. New codes are subsequently designed, following a design method different from those found in the literature. These codes improve performance, since their minimum stopping set is larger than that of previously analysed codes, making availability attacks probabilistically harder. As a consequence, the throughput of the network becomes more stable: with fewer successful attacks, the frequency with which new codes must be generated for a fresh transaction-encoding process tends to be lower. Finally, possible improvements are proposed.
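A minimal sketch of the iterative (peeling) erasure decoder and of a stopping set halting it, using the small (7, 4) Hamming code for illustration rather than the LDPC codes designed in the thesis:

```python
def peel(H, erased):
    """Iterative erasure decoding on the BEC: repeatedly find a check
    (row of H) with exactly one erased position and resolve it.
    Returns the erasures that could NOT be recovered -- if the erasure
    pattern contains a stopping set, decoding halts on it."""
    erased = set(erased)
    progress = True
    while progress and erased:
        progress = False
        for row in H:
            touched = [j for j, h in enumerate(row) if h and j in erased]
            if len(touched) == 1:          # degree-1 check: recover it
                erased.discard(touched[0])
                progress = True
    return erased

# Parity checks of the (7, 4) Hamming code.
H = [
    [1, 1, 1, 0, 1, 0, 0],
    [1, 1, 0, 1, 0, 1, 0],
    [1, 0, 1, 1, 0, 0, 1],
]

print(peel(H, {0}))           # set(): a single erasure is always recovered
print(peel(H, {0, 1, 2, 3}))  # {0, 1, 2, 3}: every check touching this set
                              # touches it at least twice -- a stopping set
```

Maximizing the size of the smallest such set is exactly what makes the erasure of transactions harder for an attacker to hide, which is the design goal pursued in this thesis.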
44

Wu, Chentao. „Improve the Performance and Scalability of RAID-6 Systems Using Erasure Codes“. VCU Scholars Compass, 2012. http://scholarscompass.vcu.edu/etd/2894.

Annotation:
RAID-6 is widely used to tolerate concurrent failures of any two disks, providing a higher level of reliability with the support of erasure codes. Among many implementations, one class of codes called Maximum Distance Separable (MDS) codes aims to offer data protection against disk failures with optimal storage efficiency. Typical MDS codes include horizontal and vertical codes. However, because of the limitations of the horizontal parity or diagonal/anti-diagonal parities used in MDS codes, existing RAID-6 systems suffer from several significant performance and scalability problems, such as low write performance, unbalanced I/O, and high migration cost in the scaling process. To address these problems, in this dissertation we design techniques for high-performance and scalable RAID-6 systems, including high-performance and load-balancing erasure codes (H-Code and HDP Code) and a Stripe-based Data Migration (SDM) scheme. We also propose a flexible MDS Scaling Framework (MDS-Frame), which can integrate H-Code, HDP Code and the SDM scheme. Detailed evaluation results are also given in this dissertation.
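A minimal sketch of the two-parity arithmetic RAID-6 relies on, assuming the common GF(2^8) construction with P as plain XOR parity and Q as a g^i-weighted sum; this illustrates generic P/Q coding, not the H-Code or HDP Code layouts proposed in the dissertation:

```python
# GF(2^8) with the RAID-6 polynomial x^8 + x^4 + x^3 + x^2 + 1 (0x11d);
# P is the plain XOR parity, Q weights data block i by g^i with g = 2.
EXP, LOG = [0] * 512, [0] * 256
x = 1
for i in range(255):
    EXP[i], LOG[x] = x, i
    x <<= 1
    if x & 0x100:
        x ^= 0x11d
for i in range(255, 512):
    EXP[i] = EXP[i - 255]

def gmul(a, b):
    return 0 if a == 0 or b == 0 else EXP[LOG[a] + LOG[b]]

def gdiv(a, b):
    return 0 if a == 0 else EXP[(LOG[a] - LOG[b]) % 255]

def pq_parity(blocks):
    """One byte position across the data disks: P = xor d_i, Q = xor g^i*d_i."""
    p = q = 0
    for i, d in enumerate(blocks):
        p ^= d
        q ^= gmul(EXP[i], d)      # g**i == EXP[i]
    return p, q

def recover_two(blocks, p, q, a, b):
    """Rebuild the bytes of failed data disks a and b from P, Q, survivors."""
    px, qx = p, q                  # parities restricted to the failed pair
    for i, d in enumerate(blocks):
        if i not in (a, b):
            px ^= d
            qx ^= gmul(EXP[i], d)
    # Solve da ^ db = px and g^a*da ^ g^b*db = qx:
    db = gdiv(qx ^ gmul(EXP[a], px), EXP[a] ^ EXP[b])
    da = px ^ db
    return da, db

data = [0x11, 0x22, 0x33, 0x44]
p, q = pq_parity(data)
print(recover_two(data, p, q, 1, 3))   # (0x22, 0x44) == (34, 68)
```

The write-performance and I/O-balance problems the dissertation targets stem from how these P and Q updates are laid out across disks, not from the field arithmetic itself.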
45

Vishwanathan, Roopa. „Power-benefit analysis of erasure encoding with redundant routing in sensor networks“. Thesis, University of North Texas, 2006. https://digital.library.unt.edu/ark:/67531/metadc5426/.

Annotation:
One of the problems sensor networks face is adversaries corrupting nodes along the path to the base station. One way to reduce the effect of these attacks is multipath routing, which introduces some intrusion tolerance into the network by way of redundancy, but at the cost of higher power consumption by the sensor nodes. Erasure coding can be applied to this scenario so that the base station can receive a subset of the total data sent and reconstruct the entire message packet at its end. This thesis uses two commonly used encodings and compares their performance, with respect to power consumed, against unencoded data in multipath routing. It is found that using encoding with multipath routing reduces power consumption while still enabling the user to send reasonably large data sizes. The experiments in this thesis were performed on the TinyOS platform, with simulations done in TOSSIM and power measurements taken in PowerTOSSIM. They were performed on the simple radio model and the lossy radio model provided by TinyOS. The lossy radio model was simulated with distances of 10, 15, and 20 feet between nodes. It was found that by using erasure encoding, double or triple the data size can be sent at the same power consumption rate as unencoded data. All experiments were performed with the radio set at normal transmit power, and later at high transmit power.
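The power-benefit argument can be made concrete with a small probability calculation (illustrative parameters, not the thesis's measured TinyOS numbers): under independent per-packet loss, an erasure-coded transfer reaches a higher delivery probability with far fewer transmissions than replicating the whole message on several paths:

```python
from math import comb

def p_erasure_coded(k, n, p_loss):
    """Success if at least k of the n coded packets reach the base station."""
    return sum(comb(n, i) * (1 - p_loss)**i * p_loss**(n - i)
               for i in range(k, n + 1))

def p_replicated(k, paths, p_loss):
    """The whole k-packet message is duplicated on each of `paths` disjoint
    paths; success if at least one path delivers all k packets."""
    return 1 - (1 - (1 - p_loss)**k) ** paths

p = 0.2  # per-packet loss probability (illustrative)
# 8 source packets coded into 12 packets (12 transmissions) versus the
# same message replicated on 3 paths (24 transmissions):
print(round(p_erasure_coded(8, 12, p), 3))  # 0.927
print(round(p_replicated(8, 3, p), 3))      # 0.424
```

Since transmissions dominate a sensor node's energy budget, this is consistent with the thesis's finding that coded multipath transfer moves more data per unit of power than unencoded replication.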
46

Liu, Chengjian. „ESetStore: an erasure-coding based distributed storage system with fast data recovery“. HKBU Institutional Repository, 2018. https://repository.hkbu.edu.hk/etd_oa/507.

Annotation:
The past decade has witnessed the rapid growth of data in large-scale distributed storage systems. Triplication, a reliability mechanism with 3x storage overhead adopted by large-scale distributed storage systems, incurs heavy storage costs as the amount of data keeps growing. Consequently, erasure codes have been introduced in many storage systems because they can provide higher storage efficiency and fault tolerance than data replication. However, erasure coding has many performance-degrading factors in both I/O and computation operations, resulting in great performance degradation in large-scale erasure-coded storage systems. In this thesis, we investigate how to eliminate some key performance issues in I/O and computation operations when applying erasure coding in large-scale storage systems. We also propose a prototype named ESetStore to improve the recovery performance of erasure-coded storage systems. We introduce our studies as follows. First, we study the encoding and decoding performance of erasure coding, which can be a key bottleneck given state-of-the-art disk I/O throughput and network bandwidth. We propose a graphics processing unit (GPU)-based implementation of erasure coding named G-CRS, which employs the Cauchy Reed-Solomon (CRS) code, to improve encoding and decoding performance. To maximize the coding performance of G-CRS by fully utilizing the GPU's computational power, we designed and implemented a set of optimization strategies. Our evaluation results demonstrate that G-CRS is 10 times faster than most other coding libraries. Second, we investigate the performance degradation introduced by intensive I/O operations during recovery in large-scale erasure-coded storage systems. To improve recovery performance, we propose a data placement algorithm named ESet. We define a configurable parameter named the overlapping factor so that system administrators can easily achieve the desired recovery I/O parallelism.
Our simulation results show that ESet can significantly improve data recovery performance without violating the reliability requirement by distributing data and code blocks across different failure domains. Third, we examine the performance of applying coding techniques to in-memory storage. A reliable in-memory cache for key-value stores named R-Memcached is designed and proposed. This work can serve as a prelude to applying erasure coding to in-memory metadata storage. R-Memcached exploits coding techniques to achieve reliability and can tolerate up to two node failures. Our experimental results show that R-Memcached maintains very good latency and throughput performance even during node failures. Finally, we design and implement a prototype named ESetStore for erasure-coded storage systems. ESetStore integrates our data placement algorithm ESet to bring fast data recovery to storage systems.
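A toy sketch in the spirit of ESet's failure-domain-aware placement (the disk and rack names are hypothetical, and the actual ESet algorithm and its overlapping factor are more involved): blocks of each stripe land in distinct racks, and rotating the starting disk per stripe spreads recovery reads over many survivors:

```python
from collections import defaultdict

def place_stripes(num_stripes, k, m, disks, rack_of):
    """Place the k + m blocks of each stripe on disks in distinct racks,
    rotating the starting disk so that recovery load spreads widely."""
    n = k + m
    placement = []
    for s in range(num_stripes):
        chosen, used_racks = [], set()
        i = (s * n) % len(disks)          # rotate the start per stripe
        scanned = 0
        while len(chosen) < n:
            if scanned > 2 * len(disks):
                raise ValueError("not enough failure domains")
            d = disks[i % len(disks)]
            if rack_of[d] not in used_racks:
                chosen.append(d)
                used_racks.add(rack_of[d])
            i += 1
            scanned += 1
        placement.append(chosen)
    return placement

def recovery_sources(placement, failed_disk):
    """Disks that must serve reads to rebuild the blocks of a failed disk."""
    reads = defaultdict(int)
    for stripe in placement:
        if failed_disk in stripe:
            for d in stripe:
                if d != failed_disk:
                    reads[d] += 1
    return dict(reads)

# 12 disks, two per rack; stripes with k = 3 data and m = 2 code blocks.
disks = [f"disk{j}" for j in range(12)]
rack_of = {f"disk{j}": j // 2 for j in range(12)}
placement = place_stripes(24, 3, 2, disks, rack_of)
srcs = recovery_sources(placement, "disk0")
print(len(srcs))   # recovery reads are spread over many surviving disks
```

The more surviving disks share the rebuild reads, the higher the recovery I/O parallelism, which is the quantity ESet's overlapping factor lets administrators tune.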
47

Hubbertz, Andrew Paul. „Subject clitics and subject extraction in Somali“. Thesis, McGill University, 1991. http://catalog.hathitrust.org/api/volumes/oclc/32079883.html.

48

Laskey, Patrick Joshua. „A Poetics of Translation Transduction: Lifting the Erasure on What Always-already Was“. Thesis, The American University of Paris (France), 2019. http://pqdtopen.proquest.com/#viewpdf?dispub=13871587.

49

Dorman, Andrew. „Cosmetic Japaneseness : cultural erasure and cultural performance in Japanese film exports (2000-2010)“. Thesis, University of St Andrews, 2014. http://hdl.handle.net/10023/6354.

Annotation:
Since the introduction of film to Japan in the 1890s, Japanese cinema has been continually influenced by transnational processes of film production, distribution, promotion, and reception. This has led inevitably to questions about the inherent nationality of Japan's film culture, despite the fact that Japanese cinema has often been subjected to analyses of its fundamental ‘Japaneseness'. This study seeks to make an original contribution to the field of Japanese film studies by investigating the contradictory ways in which Japan has functioned as a global cinematic brand in the period 2000 to 2010, and how this is interrelated with modes of promotion and reception in the English-speaking markets of the UK and the USA. Through textual and empirical analyses of seven films from the selected period and the non-Japanese consumption of them, this thesis argues that contemporary film exports are culturally-decentred in regards to their industrial and, to some extent, aesthetic dimensions. This results from contradictory modes of ‘cultural erasure' and ‘cultural performance' in the production of certain films, whereby aesthetic traces of cultural specificity are concealed or emphasised in relation to external commercial interests. Despite strategies of cultural erasure, explicit cinematic representations of cultural specificity remain highly valued as export commodities. Moreover, in the case of contemporary Japanese film exports, there are significant issues of ‘cultural ownership' to be accounted for given the extent to which non-national industrial consortia (film producers, financers, DVD distributors, film festivals) have invested in the promotion and in some cases the production of Japanese films. 
Thus, both in relation to the aesthetic erasure of Japaneseness and their non-Japanese commercial identities, recent film exports can be viewed as non-national cultural products that have a commercial and cinematic identity connected to external influences as much as internal ones.
50

Kim, Brian Y. „Superficial Seoul: Cultivating the Episodic, Exotic, & Erotic in a Culture of Erasure“. University of Cincinnati / OhioLINK, 2016. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1470043571.
