
Dissertations / Theses on the topic 'Artifacts'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 50 dissertations / theses for your research on the topic 'Artifacts.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse dissertations / theses on a wide variety of disciplines and organise your bibliography correctly.

1

Musgrove, Lacar E. "Artifacts." ScholarWorks@UNO, 2012. http://scholarworks.uno.edu/td/1507.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Tice, Brian (Brian Joseph). "Sonic Artifacts." Thesis, Massachusetts Institute of Technology, 2017. http://hdl.handle.net/1721.1/112554.

Full text
Abstract:
Thesis: S.M., Massachusetts Institute of Technology, School of Architecture and Planning, Program in Media Arts and Sciences, 2017.
Cataloged from PDF version of thesis.
Includes bibliographical references (pages 72-74).
A Sonic Artifact is a physical object that represents and contains a musical album and allows for real-time interaction with the listener. We restore the association of music with the physical artifact of its delivery: a design for the music merchandise of the future, now with the ability to interact with the music in real time. Rather than the audio experience being delivered as a file via a centralized music streaming platform or another method, the music will reside in an active environment associated with the artist. The musical experience has the potential to be unique upon each listen, and the total composition is dependent on the actions of the listener. If the listener chooses, they get to be a part of the composition.
by Brian Tice.
S.M.
APA, Harvard, Vancouver, ISO, and other styles
3

Franklin, Rhonda Deanne. "Feminist artifacts." The Ohio State University, 1992. http://rave.ohiolink.edu/etdc/view?acc_num=osu1314741090.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Sarver, Abbey Lee. "JPG Artifacts." VCU Scholars Compass, 2016. http://scholarscompass.vcu.edu/etd/4164.

Full text
Abstract:
This thesis examines my artistic practice over the past two years at Virginia Commonwealth University, which has led to the installation of my thesis exhibition, JPG Artifacts. My work inspects the current process of image making within a responsive studio practice of deconstructing the digital image into a physical space. While my thesis exhibition is just one culminating formal installation of my experimental studio practice, this paper will examine some of the main points of reference that have led me to the present public iteration of my work. I hope to position my research in the context of contemporary art and the artists who have most heavily influenced and shaped my work.
APA, Harvard, Vancouver, ISO, and other styles
5

Di, Summa Laura Teresa. "Naturalizing artifacts." Thesis, Connect to online version, 2007. http://proquest.umi.com/pqdweb?did=1441194221&sid=1&Fmt=2&clientId=10306&RQT=309&VName=PQD.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Kientzel, Paula. "Artifacts and fantasy." Diss., Columbia, Mo. : University of Missouri-Columbia, 2007. http://hdl.handle.net/10355/4895.

Full text
Abstract:
Thesis (M.F.A.)--University of Missouri-Columbia, 2007.
The entire dissertation/thesis text is included in the research.pdf file; the official abstract appears in the short.pdf file (which also appears in the research.pdf); a non-technical general description, or public abstract, appears in the public.pdf file. Title from title screen of research.pdf file (viewed on March 28, 2008). Includes bibliographical references.
APA, Harvard, Vancouver, ISO, and other styles
7

Stepanek, Ellyn M. "Pop-culture artifacts." Cleveland, Ohio : Cleveland State University, 2008. http://rave.ohiolink.edu/etdc/view?acc_num=csu1209741511.

Full text
Abstract:
Thesis (M.A.)--Cleveland State University, 2008.
Abstract. Title from PDF t.p. (viewed on July 11, 2008). Includes bibliographical references (p. 43-44). Available online via the OhioLINK ETD Center. Also available in print.
APA, Harvard, Vancouver, ISO, and other styles
8

Rastkar, Sarah. "Summarizing software artifacts." Thesis, University of British Columbia, 2013. http://hdl.handle.net/2429/44482.

Full text
Abstract:
To answer an information need while performing a software task, a software developer sometimes has to interact with a lot of software artifacts. This interaction may involve reading through large amounts of information and many details of artifacts to find relevant information. In this dissertation, we propose the use of automatically generated natural language summaries of software artifacts to help a software developer more efficiently interact with software artifacts while trying to answer an information need. We investigated summarization of bug reports as an example of natural language software artifacts, summarization of crosscutting code concerns as an example of structured software artifacts and multi-document summarization of project documents related to a code change as an example of multi-document summarization of software artifacts. We developed summarization techniques for all the above cases. For bug reports, we used an extractive approach based on an existing supervised summarization system for conversational data. For crosscutting code concerns, we developed an abstractive summarization approach. For multi-document summarization of project documents, we developed an extractive supervised summarization approach. To establish the effectiveness of generated summaries in assisting software developers, the summaries were extrinsically evaluated by conducting user studies. Summaries of bug reports were evaluated in the context of bug report duplicate detection tasks. Summaries of crosscutting code concerns were evaluated in the context of software code change tasks. Multi-document summaries of project documents were evaluated by investigating whether project experts find summaries to contain information describing the reason behind the corresponding code changes. The results show that reasonably accurate natural language summaries can be automatically produced for different types of software artifacts and that the generated summaries are effective in helping developers address their information needs.
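To make the distinction between extractive and abstractive summarization mentioned above concrete, here is a toy, frequency-based extractive summarizer in Python. It is purely illustrative and is not the supervised system described in this dissertation; the function name, regexes, and scoring scheme are assumptions for the sketch.

```python
import re
from collections import Counter

def extractive_summary(text, n_sentences=2):
    """Toy extractive summarizer (illustrative only): score each sentence by
    the document-wide frequency of its words and return the top-scoring
    sentences in their original order."""
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    freq = Counter(re.findall(r"[a-z']+", text.lower()))
    scored = [(sum(freq[w] for w in re.findall(r"[a-z']+", s.lower())), i, s)
              for i, s in enumerate(sentences)]
    # Keep the n highest-scoring sentences, then restore document order.
    top = sorted(sorted(scored, reverse=True)[:n_sentences], key=lambda t: t[1])
    return " ".join(s for _, _, s in top)

# Example: the two highest-scoring sentences of a short bug-report-like text.
report = ("The app crashes on startup. The crash happens only on Android 12. "
          "Reinstalling does not help. Logs attached below.")
print(extractive_summary(report, n_sentences=2))
```

A supervised system such as the one evaluated in the dissertation would add conversation-aware features and learned sentence weights on top of this basic extract-and-rank idea.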
APA, Harvard, Vancouver, ISO, and other styles
9

Wikström, Martin. "Compensating for Respiratory Artifacts in Blood Pressure Waveforms." Thesis, Linköping University, Department of Electrical Engineering, 2004. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-2942.

Full text
Abstract:

Cardiac catheterization has long been a valuable way to evaluate the hemodynamics of a patient. One of the benefits is that the entire blood pressure waveform can be recorded and visualized for the cardiologist. These measurements are, however, disturbed by different phenomena, such as respiration and the dynamics of the fluid-filled catheter, which introduce artifacts into the blood pressure waveform. If these disturbances could be removed, the measurement would be more accurate. This report focuses on the effects of respiratory artifacts in blood pressure signals during cardiac catheterization.

Four methods are considered: a standard bandpass filter, two adaptive filters, and one wavelet-based method. The difference between respiratory artifacts in systolic and diastolic pressure is studied and dealt with during compensation. All investigated methods are implemented in Matlab and validated against blood pressure signals from catheterized patients.

The results are algorithms that try to correct for respiratory artifacts. The rate of success is hard to determine since only a few measured blood pressure signals have been available and since the size and appearance of the actual artifacts are unknown.
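As a minimal illustration of the first of the four approaches mentioned in this abstract (a fixed bandpass filter), the Python sketch below subtracts a respiratory-band estimate from a blood pressure waveform. The sampling rate, filter order, and 0.1-0.5 Hz respiratory band are assumptions for the example, not values from the thesis, which was implemented in Matlab.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def remove_respiratory_component(pressure, fs=100.0, resp_band=(0.1, 0.5)):
    """Subtract an estimate of the respiratory oscillation from a blood
    pressure waveform. Illustrative only: a 2nd-order Butterworth bandpass
    isolates the assumed respiratory band, and that estimate is subtracted
    from the original signal."""
    nyq = fs / 2.0
    b, a = butter(2, [resp_band[0] / nyq, resp_band[1] / nyq], btype="band")
    respiratory_estimate = filtfilt(b, a, pressure)   # zero-phase filtering
    return pressure - respiratory_estimate

# Synthetic example: a 1 Hz "cardiac" pulse plus a 0.25 Hz respiratory drift.
t = np.arange(0, 30, 1 / 100.0)
bp = 100 + 20 * np.sin(2 * np.pi * 1.0 * t) + 5 * np.sin(2 * np.pi * 0.25 * t)
cleaned = remove_respiratory_component(bp)
```

A fixed filter like this cannot separate respiratory effects that differ between systolic and diastolic pressure, which is why the thesis also investigates adaptive and wavelet-based methods.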

APA, Harvard, Vancouver, ISO, and other styles
10

Fine, Justin Mitchell. "Artifacts of the Kree." Thesis, Virginia Tech, 2015. http://hdl.handle.net/10919/52972.

Full text
Abstract:
The work presented in this thesis explores the creation and curation of fictional artifacts. The goal was to simultaneously explore and create a wholly fictitious civilization as a means of self-actualization and a grasp at the ineffable. The "Artifacts of the Kree" is a real-time interactive rendering of digitally fabricated objects belonging to a civilization that inhabited a planet far beyond the reaches of humanity. These objects were curated second hand by an unknown sentient species and cataloged in the system presented here.
Master of Fine Arts
APA, Harvard, Vancouver, ISO, and other styles
11

King, Jonathan Lee. "Artifacts of Questions Asked." Thesis, Virginia Tech, 2009. http://hdl.handle.net/10919/76901.

Full text
Abstract:
The cyclic trajectory described here exemplifies a loosely defined, continuously evolving set of questions, results, and methodologies that have emerged during the process of design by making. Through a series of prototypical building components and assemblies this collection presents a design process that began with a top-down program-specific design process that informed the development of a unique building system and enabled a bottom up formal exploration. As the design thesis for the first professional Master of Architecture degree, this exploration surrounds the design, fabrication, and deployment of a series of component-based building assemblies. One example, the SEEDS Pavilion At Hawks Ridge, serves as a remote base of operations for a local youth organization that supports field-based environmental education. The pavilion continues an investigation of user assembled construction and is based on a component group that can be assembled on-site by camp children. Each building component was manufactured using on campus fabrication laboratories and was assembled on-site by a group of supervised SEEDS camp student-volunteers during a two-day design-build workshop at the Hawk's Ridge Preserve in Floyd, Virginia. The form of the structure is derived by the limitation of component number, size, and assembly sequence and represents the conflict between a parametrically derived prescriptive shape and the forms that result from the bottom up exploration of the physical system itself. The component-based construction is made possible by a series of nodal linkage assemblies designed to accommodate variations in on-site conditions using a strategic 'sloppy detail' that enables a high degree of assembly and deployment tolerance. The following collection of sequential images outlines construction of several prototypical components and assemblies and is intended to represent a continuance, not an end, to a long-term effort.
Master of Architecture
APA, Harvard, Vancouver, ISO, and other styles
12

Li, Boyang. "Automatically Documenting Software Artifacts." W&M ScholarWorks, 2017. https://scholarworks.wm.edu/etd/1530192342.

Full text
Abstract:
Software artifacts, such as database schema and unit test cases, constantly change during evolution and maintenance of software systems. Co-evolution of code and DB schemas in Database-Centric Applications (DCAs) often leads to two types of challenging scenarios for developers, where (i) changes to the DB schema need to be incorporated in the source code, and (ii) maintenance of a DCAs code requires understanding of how the features are implemented by relying on DB operations and corresponding schema constraints. On the other hand, the number of unit test cases often grows as new functionality is introduced into the system, and maintaining these unit tests is important to reduce the introduction of regression bugs due to outdated unit tests. Therefore, one critical artifact that developers need to be able to maintain during evolution and maintenance of software systems is up-to-date and complete documentation. In order to understand developer practices regarding documenting and maintaining these software artifacts, we designed two empirical studies both composed of (i) an online survey of contributors of open source projects and (ii) a mining-based analysis of method comments in these projects. We observed that documenting methods with database accesses and unit test cases is not a common practice. Further, motivated by the findings of the studies, we proposed three novel approaches: (i) DBScribe is an approach for automatically documenting database usages and schema constraints, (ii) UnitTestScribe is an approach for automatically documenting test cases, and (iii) TeStereo tags stereotypes for unit tests and generates html reports to improve the comprehension and browsing of unit tests in a large test suite. We evaluated our tools in the case studies with industrial developers and graduate students. In general, developers indicated that descriptions generated by the tools are complete, concise, and easy to read. The reports are useful for source code comprehension tasks as well as other tasks, such as code smell detection and source code navigation.
APA, Harvard, Vancouver, ISO, and other styles
13

Black, Kelsey. "Analysis of Voice Activated Artifacts." Thesis, University of Colorado at Denver, 2017. http://pqdtopen.proquest.com/#viewpdf?dispub=10683253.

Full text
Abstract:

The purpose of this thesis is to analyze voice-activated recording artifacts, using playback audio created in Adobe Audition, in order to show how an automated voice recorder with standby mode treats the silence of a recording. This thesis focuses on the WAV PCM format. The WS-550M, WS-560M, and DM-520 recorders did not have the option to create a WAV PCM file; therefore, the WS-550M and WS-560M created MP3 files and the DM-520 created a WMA file. Each of the recorders has an automated standby mode. The recorders were set to create a WAV PCM file that was 16-bit stereo at 44 kHz. The following devices were used in this study: Olympus DM-520, Olympus DM-620, Olympus WS-550M, Olympus WS-560M, Olympus WS-700M, Olympus WS-700M, Olympus WS-750M, Olympus WS-760M, Olympus WS-802, Olympus WS-822, Olympus WS-823, Philips Voice Tracer.

APA, Harvard, Vancouver, ISO, and other styles
14

Ehn, Pelle. "Work-oriented design of computer artifacts." Doctoral thesis, Umeå universitet, Samhällsvetenskapliga fakulteten, 1988. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-62913.

Full text
Abstract:
This thesis is an inquiry into the human activity of designing computer artifacts that are useful to people in their daily activity at work. The emphasis is on opportunities and constraints for industrial democracy and quality of work. First, the philosophical foundation of design of computer artifacts is considered. The need for a more fundamental understanding of design than the one offered by rationalistic systems thinking is argued. The alternative design philosophy suggested is based on pragmatic interpretations of the philosophies of existential phenomenology, emancipatory practice, and ordinary language. Design is seen as a concerned social and creative activity founded in our traditions, but aiming at transcending them by anticipation and construction of alternative futures. Second, it is argued that the existing disciplinary boundaries between natural sciences, social sciences and humanities are dysfunctional for the subject matter of designing computer artifacts. An alternative understanding of the subject matter and a curriculum for its study is discussed. The alternative emphasizes social systems design methods, a new theoretical foundation of design, and the new potential for design in the use of prototyping software and hardware. The alternative also emphasizes the need to learn from other more mature design disciplines such as architectural design. Against this background, and based on the practical research in two projects (DEMOS and UTOPIA), a view on work-oriented design of computer artifacts is presented. This concerns, thirdly, the collective resource approach to design of computer artifacts: an attempt to widen the design process to also include trade union activities, and the explicit goal of industrial democracy in design and use. It is argued that a participative approach to the design process is not sufficient in the context of democratization. However, it is suggested that it is technically possible to design computer artifacts based on criteria such as skill and democracy at work, and a trade union investigation and negotiation strategy is argued for as a democratic and workable complement to traditional design activities. Finally, a tool perspective, the ideal of skilled workers and designers in cooperation designing computer artifacts as tools for skilled work, is considered. It is concluded that computer artifacts can be designed with the ideal of craft tools for a specific profession, utilizing interactive hardware devices and the computer's capacity for symbol manipulation to create this resemblance, and that a tool perspective, used with care, can be a useful design ideal. However, the ideological use of a tool metaphor is also taken into account, as is the instrumental blindness a tool perspective may create towards the importance of social interaction competence at work.
APA, Harvard, Vancouver, ISO, and other styles
15

Sun, Jim Qile. "Wavelet artifacts and edge based postprocessing." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1997. http://www.collectionscanada.ca/obj/s4/f2/dsk2/ftp04/mq21078.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
16

Stormo, Jeremy M. "Analysis of Windows 8 Registry Artifacts." ScholarWorks@UNO, 2013. http://scholarworks.uno.edu/td/1779.

Full text
Abstract:
Microsoft’s series of Windows operating systems represents some of the most commonly encountered technologies in the field of digital forensics. It is then fair to say that Microsoft’s design decisions greatly affect forensic efforts. Because of this, it is exceptionally important for the forensics community to keep abreast of new developments in the Windows product line. With each new release, the Windows operating system may present investigators with significant new artifacts to explore. Described by some as the heart of the Windows operating system, the Windows registry has been proven to contain many of these forensically interesting artifacts. Given the weight of Microsoft’s influence on digital forensics and the role of the registry within Windows operating systems, this thesis delves into the Windows 8 registry in the hopes of developing new Windows forensics utilities.
APA, Harvard, Vancouver, ISO, and other styles
17

Katayama, Errol G. "Aristotle on artifacts : a metaphysical puzzle /." Albany (N.Y) : State university of New York press, 1999. http://catalogue.bnf.fr/ark:/12148/cb371816451.

Full text
APA, Harvard, Vancouver, ISO, and other styles
18

Elsaesser, Carl. "Invisible artifacts: public impasses, filmic intimacies." Thesis, University of Iowa, 2018. https://ir.uiowa.edu/etd/6100.

Full text
Abstract:
This essay mirrors the structure found in my thesis film—a series of small vignettes—but does not emulate one for the other and vice versa. The essay is an experiment that seeks to allow rather than define and imagines a world already in place rather than built by the accumulation of reading and writing this text; the vignettes serve as a witness to this world rather than symbols building up a system of signs. I’m interested in expanding the conversation around political personhood towards a receptive stance; a politics of receptivity. This doesn’t serve as a counter to any narratives around a politics of action, activism and social justice, especially one that is needed now more than ever with the current political climate. Instead the text pays close attention to the daily, the individual, and the banal, not as an apathetic or reactionary stance, but as one charged with potential. In casting off an overarching relationship to narrative and resolve in the text, which does not deny narrativity found in the moments and individual vignettes, I posit a self of inconclusive gestures, pregnant moments and the feeling of something meaningful happening in a world of interviews, speculative text messages, portraits of binge watching tv, video essays, and case studies. The potential pitfalls and failures of a piece of this nature is to read each vignette as a simple projection of a stable self, that a stable I is seeing himself in all that is around. And while this writing encourages one to hold up a mirror and see oneself in a vignette it is also a piece that explores these affective limitations and symbolic acts of violence. There is the critique that non-narrative potential filled space creates a problematically uncomplicated relationship to the subject matter and viewer. This is a notion further developed in Maggie Nelson’s book, The Art of Cruelty that thinks through arts capability to enact change through often violent or problematic means. In discussing the work of artist Alfredo Jaar in which he rebukes Newsweek’s headlines with his own headlines about the Rwandan genocide, she writes, “since the artist has predetermined what it is, exactly, that we should have been looking at… what is the use of our looking at all?” (Nelson 26). By avoiding the possibility of critique or question to the piece by supplying the question and answer, Jaar’s work, she argues, becomes problematically tame. The receptive stance taken aims to keeps questions at the forefront rather than answer them. It is a writing that tries to keep the question, “What is left? What is still there after?” churning, full of potential and charged. It’s a piece that presumes a self always in the wake of, as Bergson writes with beautiful images of cones in his book Matter and Memory. It is a piece that also assumes a self that is always becoming, as Spinoza suggests with his famous line, “No one has determined what the body can do”. (Spinoza 87). Despite the contradiction of a self pulled to and from, past and future, this piece presumes that selfhood is formed on the basis of both simultaneously. Similar to how CA Conrad articulates his rituals creating space for himself, the text permits associations and cross readings in order to find some kind of residue to reside in. While this residual space from the accretion of texts rubbing up against each other is one in which I’m identifying a receptive selfhood, this space is one that also radically permits another. 
Kathleen Stewart articulates a similar position in her fantastic ethnographic study of coal mining communities in West Virginia, A Space on the Side of the Road. In it she writes, “In the effort to track something of the texture, density, and force of a local cultural real through its mediating forms and their social uses, it tears itself between evocation and representation, mimesis and interpretation.” It here could be both the text and an understanding of selfhood. This polemic of constructing a self through an affective refuge of experiences is one that is not a transaction. The vignettes, unless themselves an exchange, are complete onto themselves. They are curated and designed, but through their excess, unresolved nature and individuality they stand alone, while their impact secretes. I cannot say how the wake of the piece will be felt or what action comes from political passivity. But, as the Vipassana teacher Michele McDonald teaches, I know that in order to act first one has to understand where we are acting from. In this way the piece performs a similar purpose to Raymond William’s, Structures of Feelings that maps out affective terrains. This piece attempts to map, but never seeks to pin point or locate. There is no single source to find for one’s internal landscape, but there might be a way to witness where and how impact occurs in the radically passive personhood, the same way focusing on the breath allows one to witness when and how awareness shifts to thought, sensation or emotion; it’s about a continuous how, rarely about fixing a why. In lieu of the film one can think of the text as a bin system for video editing software. The folder in which this text is archived would be titled, potential. The text moves similarly to scrubbing through material. Then there is | |. I’m hesitant to pin down and strictly label the significance of | | but maybe thinking of it as a direction; | | is a whispered, “now,” or a pinched now, or a hand on the back with a warm now, but most importantly it is a continued now that respects critical/emotional distance. This sign is riddled throughout the text- in some moments performing an experience of time, time passing, or time severed, or performing the self addressing self through the correction of speech and grammar. At times it reveals an emotional temporal sense where the subjective experience extends or shrinks the actual.
APA, Harvard, Vancouver, ISO, and other styles
19

Kutsenko, Olena. "Integrated versioning and collaboration process management of automotive production lines based on an artifact-centric collaboration environment." Thesis, Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation, 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-4646.

Full text
Abstract:
While collaborative engineering in outsourcing projects presents potential benefits to the partners, it involves some risks and reasonable concerns. First, a poor mechanism for data exchange and data communication can lead to a loss of effectiveness and efficiency in the project. Second, collaborative engineering requires partners to adopt a common business process, which often means moving away from a familiar way of working. Thus, it is crucial to find an optimal way in which collaboration can be achieved with the least waste due to changes in communication practices and the least loss of efficiency due to asynchronous processes or large amounts of exchanged data. The main goal of this thesis is to connect two aspects of a collaboration: data exchange and process-based execution. The main reason to do so is to resolve the problem of weak control over the information needed for successful project execution. Three research methods are used in this work: a case study to analyze how collaboration is performed in industry and which problems exist; a literature review to understand how existing collaboration tools can be adapted to help solve the identified problems; and a prototypical implementation to show how automated versioning of engineering knowledge can be added to the union of the data-exchange and process-based aspects. The case study was performed and a list of business requirements was presented. Based on the list of requirements, solutions were sought in the literature. A process-based, artifact-centric concept was applied to the case study scenario. The objectives were achieved; however, the problem was demonstrated using the example of a single company, which presents a limitation, as generalizability has not been proven.
APA, Harvard, Vancouver, ISO, and other styles
20

Holbrook, Elizabeth Ashlee. "SATISFACTION ASSESSMENT OF TEXTUAL SOFTWARE ENGINEERING ARTIFACTS." UKnowledge, 2009. http://uknowledge.uky.edu/gradschool_diss/712.

Full text
Abstract:
A large number of software projects exist and will continue to be developed that have textual requirements and textual design elements where the design elements should fully satisfy the requirements. Current techniques to assess the satisfaction of requirements by corresponding design elements are largely manual processes that lack formal criteria and standard practices. Software projects that require satisfaction assessment are often very large systems containing several hundred requirements and design elements. Often these projects are within a high assurance project domain, where human lives and millions of dollars of funding are at stake. Manual satisfaction assessment is expensive in terms of hours of human effort and project budget. Automated techniques are not currently applied to satisfaction assessment. This dissertation addresses the problem of automated satisfaction assessment for English, textual documents and the generation of candidate satisfaction assessments that can then be verified by a human analyst with far less effort and time expenditure than is required to produce a manual satisfaction assessment. Validation results to date show that automated satisfaction methods produce candidate satisfaction assessments sufficient to greatly reduce the effort required to assess the satisfaction of textual requirements by textual design elements.
APA, Harvard, Vancouver, ISO, and other styles
21

Scholl, Philipp Marcel [Verfasser], Kristof Van [Akademischer Betreuer] Laerhoven, and Bernd [Akademischer Betreuer] Becker. "Activity recognition with instrumented and wearable artifacts." Freiburg : Universität, 2018. http://d-nb.info/1155722396/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
22

Tng, Yan Siong. "Artifacts in Radar Imaging of Moving Targets." Thesis, Monterey, California. Naval Postgraduate School, 2012. http://hdl.handle.net/10945/17470.

Full text
Abstract:
Approved for public release; distribution is unlimited
In this thesis, we study the artifacts that occur when a scene being imaged by radar contains moving targets. The physics of the interaction between radar waves and moving targets was studied to develop a MATLAB model of the received signal that does not make use of the start-stop approximation. The effects of target motion on the image formation process were studied for different radar configurations, including multistatic radars and Synthetic Aperture Radar. The key limitation of this model is its high computational resource requirements when simulating high-bandwidth or long pulses. It was observed that range profiles may experience distortion due to the received waveform's differences from the matched filter. The exact outcome is waveform dependent; generally, both main-lobe broadening and range errors were introduced by target motion. This leads to incorrect object localization and defocusing in the image. For SAR, a moving target's physical location varies throughout the imaging process. This means that standard backprojection fails to yield a focused image even if the range error due to the Doppler shift has been corrected, resulting in smearing. This is similar to the motion blur experienced by optical cameras imaging a fast-moving object.
APA, Harvard, Vancouver, ISO, and other styles
23

Corbin, George. "The Google Chrome operating system forensic artifacts." Thesis, Utica College, 2015. http://pqdtopen.proquest.com/#viewpdf?dispub=1571599.

Full text
Abstract:

The increased popularity of Google Chromebooks, due to their ease of use, security features, and low price, has contributed to explosive growth in their share of the personal computing marketplace. This growing market share will result in Chromebooks becoming part of new and ongoing forensic investigations. It is important for forensic investigators to have a strong understanding of the forensic artifacts found on a Google Chromebook. Investigators need to know what these artifacts mean and how to acquire them. A Google Chromebook uses the Google Chrome Operating System as its operating system. The purpose of the research was to begin developing the necessary art in support of forensic examiners tasked with investigating Google Chromebooks and the data they use.

APA, Harvard, Vancouver, ISO, and other styles
24

Kamışlı, Fatih. "Reduction of blocking artifacts using side information." Thesis, Massachusetts Institute of Technology, 2006. http://hdl.handle.net/1721.1/35602.

Full text
Abstract:
Thesis (S.M.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 2006.
Includes bibliographical references (p. 95-96).
Block-based image and video coding systems are used extensively in practice. In low bit-rate applications, however, they suffer from annoying discontinuities, called blocking artifacts. Prior research shows that incorporating systems that reduce blocking artifacts into codecs is useful because visual quality is improved. Existing methods reduce blocking artifacts by applying various post-processing techniques to the compressed image. Such methods require neither any modification to current encoders nor an increase in the bit-rate. This thesis examines a framework where blocking artifacts are reduced using side information transmitted from the encoder to the decoder. Using side information enables the use of the original image in deblocking, which improves performance. Furthermore, the computational burden at the decoder is reduced. The principal question that arises is whether the gains in performance of this choice can compensate for the increase in the bit-rate due to the transmission of side information. Experiments are carried out to answer this question with the following sample system: The encoder determines block boundaries that exhibit blocking artifacts as well as filters (from a predefined set of filters) that best deblock these block boundaries.
Then it transmits side information that conveys the determined block boundaries together with their selected filters to the decoder. The decoder uses the received side information to perform deblocking. The proposed sample system is compared against an ordinary coding system and a post-processing type deblocking system, with the bit-rate of these systems being equal to the overall bit-rate (regular encoding bits + side information bits) of the proposed system. The results of the comparisons indicate that, both for images and video sequences, the proposed system can perform better in terms of both visual quality and PSNR for some range of coding bit-rates.
by Fatih Kamisli.
S.M.
APA, Harvard, Vancouver, ISO, and other styles
25

Su, Hua. "Statistical design and optimization of engineering artifacts /." The Ohio State University, 1995. http://rave.ohiolink.edu/etdc/view?acc_num=osu1487864986609792.

Full text
APA, Harvard, Vancouver, ISO, and other styles
26

Kanter, Jaimie. "Fan Fiction Crossovers| Artifacts of a Reader." Thesis, Hofstra University, 2017. http://pqdtopen.proquest.com/#viewpdf?dispub=10286513.

Full text
Abstract:

Over twenty-five years ago, Henry Jenkins (1992) wrote that fan fiction writing is evidence of “exceptional reading” (p. 284) in that the fan text reflects a reader’s commentary. This investigation examined the ways in which crossover fan fiction, fan-written fiction that mixes elements of two or more well-known fictional worlds, might reveal evidence of this “exceptional reading.” Using a qualitative content analysis of 5 crossover texts that remix Rowling’s Harry Potter series and Austen’s Pride and Prejudice, the study focused on fan writers-as-readers of the source texts. Drawing on Rosenblatt’s (1988) transactional theory of reading, which posits that meaning resides in the transactions between reader, text, and writer, and that the meaning produced is a “new event,” this research concluded that the fan fiction writers’ crossover texts were, in part, a written record of some of the fan writers’ transactions with the source texts, a partial record of the “new event.” Furthermore, this analysis provided evidence that these fan readers-turned-writers demonstrated a powerful understanding of their intended and anticipated audience, a commanding and controlled use of emulation, and a calculated mingling of worlds—both to sustain and to disrupt the fan canon—in order to present their own interpretations of, comments on, and admiration for the source texts. The crossovers are evidence of “exceptional reading” in that they demonstrate the fan writers’ reading transactions.

APA, Harvard, Vancouver, ISO, and other styles
27

Kwolek, Queston Aureon. "30/30 Museum & Park: Engaging Artifacts." Thesis, The University of Arizona, 2017. http://hdl.handle.net/10150/625028.

Full text
Abstract:
This project is located in the St. Henri neighborhood along the Lachine Canal in Montreal, Quebec. Industrial artifacts along the canal are culturally and historically significant to the people of Montreal. These artifacts are currently disconnected from public access – residents and tourists should be able to fully engage with them. The abandoned malting plant site has the potential to become an engaging destination that visitors want to explore. The proposal honors and reimagines the site's industrial infrastructure and introduces valuable public amenities to the Lachine Canal. The 30/30 concept refers to the juxtaposition of the existing thirty silos and the proposed thirty mounds. Generated from the volumetric capacity of the silos and the natural form of grain, the mounds support vegetation to restore the site's pre-industrial presence of nature. Museum functions and public spaces are integrated into both the silos and the mounds, resulting in an activity-driven experience for visitors that is centered on exploration and discovery. The proposal has the potential to host events, exhibitions, and outdoor activities year-round. By allowing guests to "trespass" through urban artifacts, the proposal invites them to discover the mysterious atmosphere and cultural significance of the former factory and the site's new public amenities.
APA, Harvard, Vancouver, ISO, and other styles
28

Sementilli, Philip Joseph Jr. "Suppression of artifacts in super-resolved images." Diss., The University of Arizona, 1993. http://hdl.handle.net/10150/186313.

Full text
Abstract:
Super-resolution of imagery is an image restoration problem in which the goal is to recover attenuated spatial frequencies from diffraction-limited images. In this dissertation we consider the limits on super-resolution performance in terms of usable bandwidth of the restored frequency spectrum. Based on a characterization of the spectral extrapolation errors (viz. null objects), we derive an expression for an approximate bound on accurate bandwidth extension for the general class of super-resolution algorithms that incorporate a-priori assumptions of a non-negative, space-limited object. For super-resolution of sampled data, we show that it may be possible to achieve significant bandwidth extrapolation depending on the relationship between sampling rate, optical cutoff frequency, and object extent. Simulation results are presented which substantiate the derived bandwidth extrapolation bounds. Given the existence of oscillatory restoration artifacts, we present several artifact suppression techniques as adjuncts to a Poisson maximum a-posteriori (MAP) super-resolution algorithm developed by Hunt. It is shown that "resolution kernels" can be used to accomplish artifact-free restoration with limited bandwidth extension. To achieve increased bandwidth extension, a smoothness constrained MAP estimator is derived which demonstrates substantial artifact reduction in the restoration of natural scenes. It is shown that the constrained Poisson MAP estimator combines a Poisson image observation model with a specific form of Markov random field object prior. Several simulation results demonstrate the artifact suppression capabilities of the regularized algorithms.
APA, Harvard, Vancouver, ISO, and other styles
29

Suresh, Nitin. "Mean Time Between Visible Artifacts in Visual Communications." Diss., Georgia Institute of Technology, 2007. http://hdl.handle.net/1853/16238.

Full text
Abstract:
As digital communication of television content becomes more pervasive, and as the networks supporting such communication become increasingly diverse, the long-standing problem of assessing video quality by objective measurements becomes particularly important. Content owners as well as content distributors stand to benefit from rapid objective measurements that correlate well with subjective assessments and, further, do not depend on the availability of the original reference video. This thesis investigates different techniques of subjective and objective video evaluation. Our research recommends a functional quality metric called Mean Time Between Failures (MTBF), where failure refers to video artifacts deemed to be perceptually noticeable, and investigates objective measurements that correlate well with subjective evaluations of MTBF. Work has been done to determine the usefulness of some existing objective metrics by noting their correlation with MTBF. The research also includes experimentation with network-induced artifacts and a study of statistical methods for correlating candidate objective measurements with the subjective metric. The statistical significance and spread properties of the correlations are studied, and a comparison of subjective MTBF with the existing subjective measure of MOS is performed. These results suggest that MTBF has a direct and predictable relationship with MOS, and that the two have similar variations across different viewers. The research is particularly concerned with the development of new no-reference objective metrics that are easy to compute in real time and correlate better than current metrics with the intuitively appealing MTBF measure. The approach to obtaining greater subjective relevance has included the study of better spatial-temporal models for noise masking and test-data pooling in video perception. A new objective metric, the 'Automatic Video Quality' (AVQ) metric, is described and shown to be implementable in real time with a high degree of correlation with actual subjective scores, the correlation values approaching those of metrics that use full or partial reference. This metric does not need any reference to the original video and, when used to display MPEG-2 streams, calculates and indicates the video quality in terms of MTBF. Certain diagnostics, such as the amount of compression and network artifacts, are also shown.
APA, Harvard, Vancouver, ISO, and other styles
30

Dickerson, Anne E. "Age differences in functional performance : deficits or artifacts?" FIU Digital Commons, 1991. http://digitalcommons.fiu.edu/etd/2810.

Full text
Abstract:
An experiment was conducted to compare the functional performance of 20 young adults and 20 older adults in two types of tasks. One type of task was normal activities of daily living which are meaningful, familiar, and well practiced while the other type was a contrived, relatively unfamiliar task of wrapping a package. While young and old adults did not differ in the ratings of the familiarity of the two tasks, results from an Age by Task Type mixed MANOVA demonstrated a significant age difference in both activities. This suggests that older adults show age-related decline with tasks even when those tasks are familiar, practiced, and ecologically valid.
APA, Harvard, Vancouver, ISO, and other styles
31

Johnson, Denise Evelyn. "Incorporating visual artifacts into the systems development process." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1998. http://www.collectionscanada.ca/obj/s4/f2/dsk2/tape15/PQDD_0008/MQ35505.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
32

Elnaffar, Said Selim. "Semi-automated linking of user interface design artifacts." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1999. http://www.collectionscanada.ca/obj/s4/f2/dsk1/tape7/PQDD_0004/MQ42610.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
33

Sivzattian, Siv Vahe. "Requirements as economic artifacts : a portfolio-based approach." Thesis, Imperial College London, 2004. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.406730.

Full text
APA, Harvard, Vancouver, ISO, and other styles
34

Vogel, Pascal [Verfasser]. "Designing Openness-Infusing Socio-Technical Artifacts / Pascal Vogel." Hamburg : Staats- und Universitätsbibliothek Hamburg Carl von Ossietzky, 2021. http://d-nb.info/1235243982/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
35

Ferreira, Pedro Filipe Araujo de Castro. "First-pass myocardial perfusion MRI : artifacts and advances." Thesis, Imperial College London, 2010. http://hdl.handle.net/10044/1/5937.

Full text
Abstract:
Magnetic resonance imaging of myocardial perfusion during the first pass of a contrast agent has proved valuable for the detection of coronary artery disease. During contrast enhancement, transient dark rim artifacts are sometimes visible in the subendocardium, mimicking real perfusion defects and complicating diagnosis. This thesis studied several different mechanisms behind dark rim artifacts with the aim of exploring possible solutions to minimise them and potentially improve the accuracy of perfusion methods. An in-depth review of current myocardial perfusion imaging techniques is presented. This is followed by a comprehensive study of dark rim artifacts, with realistic phantom and numerical simulations, and in vivo measurements. Simulations for the most common perfusion sequences are made, showing that Gibbs ringing, myocardial radial motion, and frequency offsets are capable of creating endocardial signal loss, although this depends on many sequence parameters. Frequency offsets during the first pass of the contrast agent were measured in vivo; the results show negligible intra-voxel signal dephasing, although careful frequency adjustments need to be considered for the b-SSFP and h-EPI sequences. The investigations of dark rim artifacts led to the development of an ultrafast but robust sequence suitable for first-pass myocardial perfusion imaging, and the assessment of its in vivo performance. The sequence was a multi-slice single-shot spin-echo EPI sequence accelerated by a reduced phase-encode FOV (zonal excitation) and parallel imaging (R=2). When tested in clinical volunteers with contrast agent at rest, the sequence yielded a reasonable CNR with a very short acquisition time, although the spatial resolution needs to be improved.
APA, Harvard, Vancouver, ISO, and other styles
36

Elumeze, Nwanua Onochie. "SmartTiles: Towards room-sized, child-programmable computational artifacts." Connect to online resource, 2007. http://gateway.proquest.com/openurl?url_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&res_dat=xri:pqdiss&rft_dat=xri:pqdiss:1447654.

Full text
APA, Harvard, Vancouver, ISO, and other styles
37

Dillon, Andrew. "Artifacts as theories: Convergence through user-centered design." Medford, N.J.: ASIS, 1995. http://hdl.handle.net/10150/105923.

Full text
Abstract:
This item is not the definitive copy. Please use the following citation when referencing this material: Dillon, A. (1995) Artifacts as Theories: Convergence through User-Centered Design. 1995 Proceedings of the 58th Annual ASIS Conference, Medford NJ: ASIS, 208-210. Abstract: The present paper proposes the artifact-as-theory perspective, which draws together models of scientific practice and design behaviour and, in so doing, offers the view of any information technology system as a conjecture on the part of the design team about the human and organizational requirements to be met. By adopting this perspective, information system design can be seen as an ill-structured problem best tackled by user-centered theories and methods. The present paper will outline this perspective, emphasizing the need for convergence of views at the outset of design, and demonstrate the advantages it offers to both the theory and practice of technology design and the field of information science.
APA, Harvard, Vancouver, ISO, and other styles
38

Allen-Masacek, Marjorie Kirsten. "Teaching ARTifacts: Teaching art with a cultural lens." CSUSB ScholarWorks, 2001. https://scholarworks.lib.csusb.edu/etd-project/1925.

Full text
APA, Harvard, Vancouver, ISO, and other styles
39

Oliva, Enrico <1976&gt. "Argumentation and artifacts for intelligent multi-agent systems." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2008. http://amsdottorato.unibo.it/919/1/Tesi_Oliva_Enrico.pdf.

Full text
Abstract:
Reasoning under uncertainty is a human capacity that in software systems is necessary and often hidden. Argumentation theory and logic make non-monotonic information explicit in order to enable automatic forms of reasoning under uncertainty. In human organizations, Distributed Cognition and Activity Theory explain how artifacts are fundamental in all cognitive processes. In this thesis, we therefore seek to understand the use of cognitive artifacts in a new argumentation framework for an agent-based artificial society.
APA, Harvard, Vancouver, ISO, and other styles
40

Oliva, Enrico <1976&gt. "Argumentation and artifacts for intelligent multi-agent systems." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2008. http://amsdottorato.unibo.it/919/.

Full text
Abstract:
Reasoning under uncertainty is a human capacity that in software systems is necessary and often hidden. Argumentation theory and logic make non-monotonic information explicit in order to enable automatic forms of reasoning under uncertainty. In human organizations, Distributed Cognition and Activity Theory explain how artifacts are fundamental in all cognitive processes. In this thesis, we therefore seek to understand the use of cognitive artifacts in a new argumentation framework for an agent-based artificial society.
APA, Harvard, Vancouver, ISO, and other styles
41

Poli, Nicola. "Game Engines and MAS: BDI & Artifacts in Unity." Master's thesis, Alma Mater Studiorum - Università di Bologna, 2018. http://amslaurea.unibo.it/15657/.

Full text
Abstract:
In this thesis we give a brief overview of the state of Multi-Agent Systems and analyze the limitations that currently prevent game programmers from using them. We then propose a new BDI language, based on Prolog and inspired by Jason, which, thanks to the Prolog interpreter developed by I. Horswill, gives game programmers the ability to express high-level declarative behaviours for autonomous agents inside the Unity game engine. We also propose a version of the Artifact for modelling the environment in a Unity scene, and a communication layer that agents and artifacts can use to interact with each other. Finally, we present a case study to highlight the benefits that this system provides.
APA, Harvard, Vancouver, ISO, and other styles
42

Milich, Kacy. "A ratiometric fluorometer for reduced sensitivity against solvent artifacts." Diss., Columbia, Mo. : University of Missouri-Columbia, 2005. http://hdl.handle.net/10355/5837.

Full text
Abstract:
Thesis (M.S.)--University of Missouri-Columbia, 2005.
The entire dissertation/thesis text is included in the research.pdf file; the official abstract appears in the short.pdf file (which also appears in the research.pdf); a non-technical general description, or public abstract, appears in the public.pdf file. Title from title screen of research.pdf file (viewed on January 24, 2007). Includes bibliographical references.
APA, Harvard, Vancouver, ISO, and other styles
43

Eilevstjønn, Joar. "Removal of Cardiopulmonary Resuscitation Artifacts in the Human Electrocardiogram." Doctoral thesis, Norwegian University of Science and Technology, Department of Electronics and Telecommunications, 2004. http://urn.kb.se/resolve?urn=urn:nbn:no:ntnu:diva-314.

Full text
Abstract:

Death from heart disease is the most common cause of mortality in western countries, and the survival rate of cardiac arrest is dismally low. In the treatment of cardiac arrest, two therapeutic methods are most important: cardiopulmonary resuscitation (CPR; chest compressions and ventilations) and defibrillation (electrical shocks to restart a fibrillating heart).

An automated external defibrillator is commonly used for such shocks; it records and performs signal analysis on the electrocardiogram (ECG) in order to advise when to shock the patient. However, the mechanical activity during CPR introduces artifact components into the ECG. To perform reliable ECG signal analysis, CPR is therefore discontinued for a substantial time before the potential delivery of a shock. This wastes valuable therapy time, and if this hands-off time could be reduced or eliminated by removing these artifacts, it should improve the chance of return of spontaneous circulation.

We propose a method for removing CPR artifacts using a novel multichannel adaptive filter, the computationally efficient and numerically robust MultiChannel Recursive Adaptive Matching Pursuit (MC-RAMP) filter. Using the most realistic data set to date, human out-of-hospital cardiac arrest data of both shockable and non-shockable rhythms, we test MC-RAMP and evaluate the feasibility of ECG analysis during CPR. In our experiments we use a shock advice algorithm and individual ECG signal features to reach the conclusion that, after CPR artifact filtering, ECG rhythm analysis during ongoing CPR is feasible.

Finally, we analyze and quantify the time intervals without blood flow (no-flow time, NFT) during external automatic defibrillation in cardiac arrest patients and show that these patients were not perfused for around half of the time. We propose methods using CPR artifact filtering to reduce the NFT, and show their significant and promising potential. By introducing the proposed methods into defibrillators, the NFT would be significantly reduced, hopefully increasing survival.
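To illustrate the general idea of reference-based adaptive artifact cancellation described in this abstract, here is a minimal Python sketch of a plain normalized-LMS canceller. It is not the MC-RAMP filter proposed in the thesis; the reference signal, tap count, and step size are assumptions for the example.

```python
import numpy as np

def nlms_cancel(ecg, reference, n_taps=32, mu=0.5, eps=1e-8):
    """Generic normalized-LMS noise canceller (illustrative; not MC-RAMP).
    `reference` is a channel correlated with the CPR artifact only, e.g. a
    compression-depth or thoracic-impedance signal. The filter learns how
    the artifact leaks into the ECG and subtracts that estimate."""
    w = np.zeros(n_taps)                       # adaptive filter weights
    cleaned = np.zeros_like(ecg, dtype=float)
    for n in range(n_taps, len(ecg)):
        x = reference[n - n_taps:n][::-1]      # most recent reference samples
        artifact_estimate = w @ x
        e = ecg[n] - artifact_estimate         # error = artifact-reduced ECG
        w = w + mu * e * x / (x @ x + eps)     # normalized LMS update
        cleaned[n] = e
    return cleaned

# Synthetic example: an "ECG" corrupted by a scaled, delayed copy of a
# compression-like reference signal.
rng = np.random.default_rng(0)
t = np.arange(0, 10, 0.004)                    # 250 Hz, 10 s
reference = np.sin(2 * np.pi * 1.8 * t)        # ~108 compressions per minute
ecg_true = 0.1 * rng.standard_normal(t.size)
corrupted = ecg_true + 0.8 * np.roll(reference, 5)
recovered = nlms_cancel(corrupted, reference)
```

The multichannel, matching-pursuit formulation in the thesis extends this single-reference scheme to several simultaneously recorded reference channels.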

APA, Harvard, Vancouver, ISO, and other styles
44

Lins, Otavio G. "Ocular artifacts in recording EEGs and event related potentials." Thesis, University of Ottawa (Canada), 1993. http://hdl.handle.net/10393/6889.

Full text
Abstract:
The ocular artifacts derive from the potential difference between the cornea and the fundus of the eye. This can be represented by an equivalent dipole with its positive pole directed toward the cornea. The DC potential between the cornea and the forehead measures approximately +13 mV. The scalp-distribution of the ocular artifacts can be described in terms of propagation factors--the percentage of the EOG present at the EEG electrodes. These factors are significantly different for blinks and upward eye-movements. The source dipoles for blinks and saccades are different--blink dipoles point radially whereas saccade dipoles point tangentially, in the direction of the eye movement. Blink and eye movement potentials are generated by different mechanisms--blink potentials are generated by the eyelid sliding over the cornea, eye movement potentials by the rotation of the ocular dipole. A very small downward rotation of the eyes may occur during a normal blink. The "rider artifact" at the onset of upward saccade is caused by the eyelid as it lags behind the eyes at the beginning of the movement. Smaller rider artifacts, caused by the horizontal asymmetry of the eyelid, can be noted during horizontal but not downward saccades. Techniques that use scaled EOG to remove ocular artifacts from EEG recordings may remove some of the frontal EEG together with the ocular artifacts. Dipole source techniques allow the ocular generators to be distinguished from the nearby brain generators. A problem with dipole source techniques is that the head model used in the calculation is not accurate at the eyes. A new technique uses principal component analysis to estimate the ocular artifact at each electrode without using a head model. This technique is the most effective way to remove ocular artifacts from EEG recordings. (Abstract shortened by UMI.)
APA, Harvard, Vancouver, ISO, and other styles
45

Sténson, Carl. "Object Placement in AR without Occluding Artifacts in Reality." Thesis, KTH, Skolan för datavetenskap och kommunikation (CSC), 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-211112.

Full text
Abstract:
Placement of virtual objects in Augmented Reality is often done without regard for the artifacts in the physical environment. This thesis investigates how placement can be done with these artifacts taken into account. It considers only the placement of wall-mounted objects. Through the development of two prototypes, which use edges detected in RGB images in combination with volumetric properties to identify the artifacts, areas are suggested for the placement of virtual objects. The first prototype analyzes each triangle in the model, which is computationally intensive and localizes the physical artifacts with low precision. The second prototype analyzes the detected RGB edges in world space, which proved to detect the features with precise localization and reduced calculation time. The second prototype manages this in a controlled setting; however, a more challenging environment would possibly pose other issues. In conclusion, placement in relation to volumetric and edge information from images of the environment is possible and could enhance the experience of being in a mixed reality where physical and virtual objects coexist in the same world.
Placement of virtual objects in Augmented Reality is often done without taking the objects in the physical environment into account. This study investigates how placement can be done with regard to the physical environment and its objects. It only considers placement of objects on vertical surfaces. For the investigation, two prototypes are developed that use edge detection in photographs together with a volumetric representation of the physical environment. Within this environment, the prototypes suggest where objects can be placed. The first prototype analyzes every triangle in the volumetric representation of the room, which proved to be demanding and to localize objects in the environment with low precision. The second prototype analyzes the detected edges in the photographs and projects them to their positions in the environment, which proved to find objects in the room with good precision and faster than the first prototype. The second prototype succeeds with this in a controlled setting; in a more complex and challenging environment, problems may arise. Placement of objects in Augmented Reality with regard to both a volumetric and a textured representation of an environment can be achieved. Placement can then be done in a more natural way, reinforcing the experience that virtual and real objects exist in the same world.
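
The two prototypes are described only at a high level here, so the following is a minimal sketch of the image-based half of the idea under assumed details: detect edges in an RGB photo of the target wall with OpenCV's Canny detector, then use a distance transform to pick the point farthest from any detected edge as a candidate placement location. The thresholds, file name, and placement criterion are illustrative assumptions, and the projection of edges into world space is omitted.

```python
import cv2
import numpy as np

def suggest_placement(image_bgr, canny_lo=50, canny_hi=150):
    """Suggest a placement point on a wall image: the pixel farthest
    from any detected edge, i.e. the centre of the largest edge-free area.
    """
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, canny_lo, canny_hi)
    # Distance from every non-edge pixel to the nearest edge pixel.
    free_space = cv2.distanceTransform(255 - edges, cv2.DIST_L2, 5)
    y, x = np.unravel_index(np.argmax(free_space), free_space.shape)
    clearance = free_space[y, x]   # radius of the edge-free area in pixels
    return (int(x), int(y)), float(clearance)

# Usage (assuming 'wall.jpg' is a photo of the target wall):
# img = cv2.imread("wall.jpg")
# (x, y), clearance = suggest_placement(img)
```
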
APA, Harvard, Vancouver, ISO, and other styles
46

Heyer, Tim. "Semantic Inspection of Software Artifacts From Theory to Practice." Doctoral thesis, Linköping : Univ, 2001. http://www.ep.liu.se/diss/science_technology/07/25/index.html.

Full text
APA, Harvard, Vancouver, ISO, and other styles
47

Faridi, Imran Zafar. "Image Compression Using Bidirectional DCT to Remove Blocking Artifacts." Digital Archive @ GSU, 2005. http://digitalarchive.gsu.edu/cs_theses/9.

Full text
Abstract:
The Discrete Cosine Transform (DCT) is a widely used transform in many areas of the current information age. It is used in signal compression such as voice recognition and shape recognition, and also in FBI fingerprints. The DCT is the standard compression system used in the JPEG format. The quality of the DCT deteriorates at low bit rates. The deterioration is due to the blocking artifact inherent in block DCT. One of the successful attempts to reduce these blocking artifacts was the conversion of Block-DCT into Line-DCT. In this thesis we will explore the Line-DCT and introduce a new form of Line-DCT called Bidirectional-DCT, which retains the properties of Line-DCT while improving computational efficiency. The results obtained in this thesis show a significant reduction in processing time for both one-dimensional and two-dimensional DCT in comparison with the traditional Block-DCT. The quality analysis also shows that the mean square error is considerably lower than with the traditional Block-DCT, which is a consequence of removing the blocking artifacts. Finally, unlike the traditional Block-DCT, the Bidirectional-DCT enables compression with very low bit rates and very low blocking artifacts.
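
The abstract does not define the Bidirectional-DCT itself, so the sketch below only reproduces the baseline it improves on: conventional 8x8 Block-DCT compression with uniform quantization, which is where the blocking artifacts originate. The block size, quantization step, and toy image are assumptions for illustration.

```python
import numpy as np
from scipy.fft import dctn, idctn

def block_dct_compress(image, block=8, q=40.0):
    """Compress a grayscale image with 8x8 block DCT and uniform
    quantization, then reconstruct it. Coarse quantization (large q)
    makes the blocking artifacts at block boundaries visible.
    """
    h, w = image.shape
    h8, w8 = h - h % block, w - w % block        # crop to a multiple of block
    img = image[:h8, :w8].astype(float)
    out = np.empty_like(img)
    for r in range(0, h8, block):
        for c in range(0, w8, block):
            tile = img[r:r + block, c:c + block]
            coeffs = dctn(tile, norm="ortho")    # forward 2-D DCT
            coeffs = np.round(coeffs / q) * q    # uniform quantization
            out[r:r + block, c:c + block] = idctn(coeffs, norm="ortho")
    return out

# Toy usage on a synthetic gradient image.
img = np.outer(np.linspace(0, 255, 64), np.ones(64))
reconstructed = block_dct_compress(img, q=60.0)
mse = np.mean((img - reconstructed) ** 2)
```

Increasing q coarsens the quantization and makes the discontinuities at the 8x8 block boundaries, i.e. the blocking artifacts, progressively more visible in the reconstruction.
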
APA, Harvard, Vancouver, ISO, and other styles
48

Bertino, Leanne. "The significance of bear canine artifacts in Hopewell context." Virtual Press, 1994. http://liblink.bsu.edu/uhtbin/catkey/897529.

Full text
Abstract:
This study has presented a comprehensive overview of the context and significance of real and effigy bear canine artifacts in Hopewell context. The evidence suggests that burials with bear canine artifacts and additional grave goods in an extended position contained high-status individuals. These burials contained the remains of males or male children, with status differences evident in both burial position and quantity of grave goods. Bear canine artifacts found in non-burial contexts were primarily found in "ceremonial caches." The inclusion of bear canine artifacts in such caches is indicative of their spiritual importance in Hopewell culture. Modification, including drilling, splitting, and piercing of bear canine artifacts, occurred in all five regions where these artifacts were found. This was the only class of data that spanned all five regions. Data from burials indicates that these artifacts were commonly used as a form of adornment, especially necklaces. Evidence from a burial at Hopewell Mounds points to an ideological, religious function for these artifacts. Much of the data for effigy bear canine artifacts correlates with that of real canines, and they appear to have served the same function. That people chose to manufacture these artifacts rather than do without them indicates that the meaning behind the image represented by the bear canine may be more important than the artifact itself.
Department of Anthropology
APA, Harvard, Vancouver, ISO, and other styles
49

Kilegran, Johanna. "Correction of cupping artifacts in localized laboratory phase-contrast CT." Thesis, KTH, Tillämpad fysik, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-246330.

Full text
APA, Harvard, Vancouver, ISO, and other styles
50

Hsieh, Mo-Han. "Standards as interdependent artifacts : the case of the Internet." Thesis, Massachusetts Institute of Technology, 2007. http://hdl.handle.net/1721.1/43848.

Full text
Abstract:
Thesis (Ph. D.)--Massachusetts Institute of Technology, Engineering Systems Division, 2008.
Includes bibliographical references.
This thesis has explored a new idea: viewing standards as interdependent artifacts and studying them with network analysis tools. Using the set of Internet standards as an example, the research of this thesis includes the citation network, the author affiliation network, and the co-author network of the Internet standards over the period 1989 to 2004. The major network analysis tools used include cohesive subgroup decomposition (the algorithm by Newman and Girvan is used), regular equivalence class decomposition (the REGE algorithm and the method developed in this thesis are used), nodal prestige and acquaintance (both calculated with Kleinberg's technique), and some social network analysis tools. Qualitative analyses of the historical and technical context of the standards, as well as statistical analyses of various kinds, are also used in this research. A major finding of this thesis is that, for the understanding of the Internet, it is beneficial to consider its standards as interdependent artifacts. Because the basic mission of the Internet (i.e., to be an interoperable system that enables various services and applications) is enabled not by one or a few but by a great number of standards developed upon each other, studying the standards only as stand-alone specifications cannot really produce meaningful understanding of a workable system. Therefore, the general approaches and methodologies introduced in this thesis, which we label a systems approach, are a necessary addition to the existing approaches. A key finding of this thesis is that the citation network of the Internet standards can be decomposed into functionally coherent subgroups by using the Newman-Girvan algorithm.
(cont.) This result shows that the (normative) citations among the standards can meaningfully be used to help us better manage and monitor the standards system. The results in this thesis indicate that organizing the development efforts of the Internet standards into (now) 121 Working Groups was done in a manner reasonably consistent with achieving a modular (and thus more evolvable) standards system. A second decomposition of the standards network was achieved by employing the REGE algorithm together with a new method developed in this thesis (see the Appendix) for identifying regular equivalence classes. Five meaningful subgroups of the Internet standards were identified, and each of them occupies a specific position and plays a specific role in the network. The five positions are reflected in the names we have assigned to them: the Foundations, the Established, the Transients, the Newcomers, and the Stand-alones. The life cycle among these positions was uncovered and is one of the insights that the systems approach provides into the evolution of the overall standards system. Another insight concerning the evolution of the standards system is the development of a predictive model for the promotion of standards to a new status (i.e., Proposed, Draft, and Internet Standards as the three ascending statuses). This model also has practical potential for managers of standards-setting organizations and for firms (and individuals) interested in efficiently participating in standards-setting processes. The model prediction is based on assessing the implicit social influence of the standards (based upon the social network metric, betweenness centrality, of the standards' authors) and the apparent importance of the standard to the network (based upon calculating the standard's prestige from the citation network).
(cont.) A deeper understanding of the factors that go into this model was also developed through the analysis of the factors that can predict increased prestige over time for a standard. The overall systems approach and the tools developed and demonstrated in this thesis for the study of the Internet standards can be applied to other standards systems. Application (and extension) to the World Wide Web, the electric power system, mobile communication, and others would, we believe, lead to important improvements in our practical and scholarly understanding of these systems.
by Mo-Han Hsieh.
Ph.D.
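
For readers who want to experiment with the kind of analysis described in this abstract, the sketch below applies two of the named tools, the Newman-Girvan cohesive subgroup decomposition and betweenness centrality, plus Kleinberg's HITS scores as a stand-in for the prestige and acquaintance measures, to a tiny made-up citation graph. The graph, node names, and parameter choices are illustrative assumptions and are not taken from the thesis's actual standards data.

```python
import networkx as nx
from networkx.algorithms.community import girvan_newman

# A tiny made-up citation network among hypothetical standards documents;
# an edge A -> B means that standard A (normatively) cites standard B.
citations = [
    ("STD-A", "STD-B"), ("STD-A", "STD-C"), ("STD-B", "STD-C"),
    ("STD-D", "STD-E"), ("STD-E", "STD-F"), ("STD-D", "STD-F"),
    ("STD-C", "STD-F"),
]
G = nx.DiGraph(citations)

# Cohesive subgroup decomposition (Newman-Girvan) on the undirected
# version of the citation graph; the first split yields two subgroups.
communities = next(girvan_newman(G.to_undirected()))
print("subgroups:", [sorted(c) for c in communities])

# Betweenness centrality as a simple brokering/influence measure.
print("betweenness:", nx.betweenness_centrality(G.to_undirected()))

# Kleinberg's HITS: authority scores as a prestige proxy (how heavily a
# standard is cited), hub scores as an acquaintance proxy (how much it cites).
hubs, authorities = nx.hits(G, max_iter=500)
print("prestige (authority):", authorities)
print("acquaintance (hub):", hubs)
```
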
APA, Harvard, Vancouver, ISO, and other styles
