Dissertations / Theses on the topic 'Snapshoot'

To see the other types of publications on this topic, follow the link: Snapshoot.

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 50 dissertations / theses for your research on the topic 'Snapshoot.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse dissertations / theses on a wide variety of disciplines and organise your bibliography correctly.

1

Xie, Jiahua. "Moment beyond moment." Click here to access this resource online, 2008. http://hdl.handle.net/10292/452.

Full text
Abstract:
This practice-based project explores the photographic phenomenon of ‘moment beyond moment’, which refers to the combined representations of an existing image in an environment, together with the real-life situation at the moment the photograph is taken. I call this photograph an ‘extended photograph’. Employing practical works of extended photographs and focusing on interactions between the moment in real-life and the moment in an existing image, the research explores the transformation of meanings caused by the interactions of these moments in an extended photograph. The research owes its approach to grounded theory, contrary thinking and Chinese Buddhist ‘Sudden Enlightenment’ to further its aim of exploring the unpredictable interaction of these moments, and to disclose the potentials of meaning transformation. My research outcome intends to initiate a discourse with photographic practitioners and theorists on the phenomenon of moment beyond moment in a working environment that is encaged by the excessive existence of displayed images. The thesis is composed as a creative work that consists of a series of photographic images accompanied by an exegesis component. The images represent a nominal 80%, and the exegesis 20% of the final submission.
APA, Harvard, Vancouver, ISO, and other styles
2

Sabatke, Derek S. "Snapshot spectropolarimetry." Diss., The University of Arizona, 2002. http://hdl.handle.net/10150/289859.

Full text
Abstract:
Channeled spectropolarimetry is a novel method of measuring the spectral dependence of the polarization state of light. Amplitude modulation is employed to encode all four Stokes component spectra into a single optical power spectrum. The encoding is performed with a simple arrangement of two thick birefringent retarders and a linear analyzer. No moving parts are required, and the system is able to acquire its data in a single detector array integration time. We report the results of an in-depth study of channeled spectropolarimetry. The mathematics of the amplitude modulation analogy are explored, providing a basic design procedure. The system's spectral resolution is described in terms of the space bandwidth product. The technique is then analyzed in the general context of linear operator theory, using both analytic and computational approaches to the singular value decomposition and pseudoinversion of the system's operator. This analysis highlights the importance of the choice of object space in constraining linear reconstructions of data from under-determined systems, and provides the underpinnings of the calibration and reconstruction techniques for a hardware prototype. Calibration of the prototype is approached as experimental estimation of the system's operator. Our basic method of reconstruction involves pseudoinversion of the operator while constraining object space to a truncated Fourier basis. Apodization is helpful in reducing the ringing of reconstructions of spectra which extend beyond the edges of the system's spectral range. Experimental results are presented, including comparisons between measurements taken with the channeled spectropolarimeter and a reference rotating compensator, fixed analyzer instrument. We have used measurements of the effects of stress birefringence on light propagated through material subject to time-varying stress to demonstrate time-resolved snapshot spectropolarimetry. 
Continuing efforts include the combination of channeled spectropolarimetry with computed tomography imaging spectrometry to realize a snapshot imaging spectropolarimeter.
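The linear-algebra idea at the heart of this abstract, that constraining object space to a truncated basis makes reconstruction from an under-determined system well-posed, can be illustrated with a toy example. The matrices, basis, and object below are invented for illustration and are not from the dissertation:

```python
def matvec(M, v):
    """Multiply a matrix (list of rows) by a vector."""
    return [sum(m * x for m, x in zip(row, v)) for row in M]

def solve2(A, g):
    """Cramer's rule for a 2x2 linear system A c = g."""
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    return [(g[0] * A[1][1] - A[0][1] * g[1]) / det,
            (A[0][0] * g[1] - g[0] * A[1][0]) / det]

# Under-determined system: 2 measurements, 3 unknowns.
H = [[1, 1, 0],
     [0, 1, 1]]
# Truncated basis for object space (columns): a DC term and one low-order mode.
B = [[1, 1], [1, 0], [1, -1]]

# True object lies in span(B): f = 2*b1 + 1*b2
f = [3, 2, 1]
g = matvec(H, f)                      # measured data: [5, 3]

# The system restricted to basis coefficients: columns of H @ B.
HB = [[sum(H[i][k] * B[k][j] for k in range(3)) for j in range(2)]
      for i in range(2)]
c = solve2(HB, g)                     # coefficients in the truncated basis
f_hat = [sum(B[k][j] * c[j] for j in range(2)) for k in range(3)]
# f_hat == [3.0, 2.0, 1.0]: unique reconstruction once object space is constrained
```

With three unknowns and two measurements, the system alone has infinitely many solutions; restricting the object to two basis coefficients makes the restricted system square, so in this toy case the pseudoinverse step collapses to an exact solve.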
3

Hagen, Nathan. "Snapshot imaging spectropolarimetry." Diss., The University of Arizona, 2007. http://hdl.handle.net/10150/195957.

Full text
Abstract:
The research for this dissertation project began with the goal to construct a snapshot imaging spectropolarimeter for the visible spectrum. The instrument integrates a channeled spectropolarimeter (CHSP) into a computed tomographic imaging spectrometer (CTIS), the result being an instrument that measures the complete spatially- and spectrally-resolved Stokes vectors of a scene. It is not the first of its kind, since a similar instrument has been built before for use in the short-wave infrared. However, that instrument encountered severe difficulties due to limitations of available hardware. Visible spectrum work generally enjoys the best instrumentation available, providing an ideal place to attempt a proof-of-concept demonstration. The main body of the research is focused on finding ways to improve the CTIS measurement technique, especially in ways allowing it to integrate with channeled spectropolarimetry. The first effort is a careful analysis and reworking of the calibration procedure for the instrument, followed by a survey and comparison of ideas for alternative CTIS designs. The second effort makes use of the new calibration approach to develop an alternative way of thinking about CTIS reconstructions based on the geometry and physics of the instrument rather than on abstract matrix mathematics. This opens up ways to improve their accuracy and to achieve reconstructions at a much higher speed. Experimental results from the instrument illustrate the improvements obtained from using the new methods, showing its current capabilities and limitations.
4

Bryntesson, Fredrik. "Snapshot Algorithm Animation with Erlang." Thesis, Uppsala universitet, Institutionen för informationsteknologi, 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-211087.

Full text
Abstract:
Algorithms used in distributed systems for synchronization can often be hard to understand, especially for beginners. Seeing an animation of these concepts could help to gain insight into how they work. The Snapshot algorithm (Chandy-Lamport) is one of these. But what is a good animation of an algorithm? What characteristics does an animation need to be considered good? This thesis describes an analysis of those characteristics and the development of animation software for the Snapshot algorithm using a game engine written in Erlang.
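The Chandy-Lamport algorithm this thesis animates can be sketched in a few lines. The following is a simplified Python simulation, not the author's Erlang implementation: two processes, one FIFO channel each way, with account balances and names invented for illustration:

```python
from collections import deque

class Process:
    """One node in a simplified Chandy-Lamport run: two processes,
    each with a single incoming FIFO channel from the other."""
    def __init__(self, name, balance):
        self.name = name
        self.balance = balance        # local state (an account balance)
        self.inbox = deque()          # incoming FIFO channel
        self.snapshot = None          # recorded local state
        self.recording = False        # recording the incoming channel?
        self.channel_log = []         # in-flight messages captured for the snapshot

    def start_snapshot(self, peer):
        """Initiator: record own state, send a marker, record the incoming channel."""
        self.snapshot = self.balance
        self.recording = True
        peer.inbox.append(("MARKER", None))

    def receive(self, peer):
        """Consume one message from the incoming channel."""
        kind, payload = self.inbox.popleft()
        if kind == "MARKER":
            if self.snapshot is None:
                # First marker seen: record state and echo a marker back.
                # The channel the marker arrived on is recorded as empty.
                self.snapshot = self.balance
                peer.inbox.append(("MARKER", None))
            else:
                # Marker on a channel we were recording: stop recording it.
                self.recording = False
        else:
            if self.snapshot is not None and self.recording:
                self.channel_log.append(payload)   # message was in flight
            self.balance += payload

p1, p2 = Process("p1", 100), Process("p2", 50)

# p1 sends 10 units to p2; the message is still in flight...
p1.balance -= 10
p2.inbox.append(("TRANSFER", 10))

# ...when p2 initiates a snapshot.
p2.start_snapshot(p1)
while p1.inbox:
    p1.receive(p2)
while p2.inbox:
    p2.receive(p1)

# Local states miss the in-flight 10; the channel log recovers it:
total = p1.snapshot + p2.snapshot + sum(p1.channel_log) + sum(p2.channel_log)
# total == 150, the conserved global balance
```

The point the animation has to convey is exactly this: neither recorded local state alone is consistent, but local states plus recorded channel contents form a consistent global snapshot.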
5

Cahill, Michael James. "Serializable Isolation for Snapshot Databases." University of Sydney, 2009. http://hdl.handle.net/2123/5353.

Full text
Abstract:
PhD
Many popular database management systems implement a multiversion concurrency control algorithm called snapshot isolation rather than providing full serializability based on locking. There are well-known anomalies permitted by snapshot isolation that can lead to violations of data consistency by interleaving transactions that would maintain consistency if run serially. Until now, the only way to prevent these anomalies was to modify the applications by introducing explicit locking or artificial update conflicts, following careful analysis of conflicts between all pairs of transactions. This thesis describes a modification to the concurrency control algorithm of a database management system that automatically detects and prevents snapshot isolation anomalies at runtime for arbitrary applications, thus providing serializable isolation. The new algorithm preserves the properties that make snapshot isolation attractive, including that readers do not block writers and vice versa. An implementation of the algorithm in a relational database management system is described, along with a benchmark and performance study, showing that the throughput approaches that of snapshot isolation in most cases.
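The well-known anomalies the abstract refers to include write skew, which a small hypothetical sketch makes concrete. The table and integrity constraint below (two doctors, at least one must stay on call) are the standard textbook example, not code from the thesis:

```python
# Write skew: the classic anomaly snapshot isolation permits.
# Constraint: at least one doctor must remain on call.
db = {"alice": "on_call", "bob": "on_call"}

def txn_take_off_call(snapshot, me):
    """Runs against a private committed snapshot; returns the write set."""
    on_call = [d for d, s in snapshot.items() if s == "on_call"]
    if len(on_call) >= 2:            # constraint looks safe in *my* snapshot
        return {me: "off_call"}
    return {}

# Both transactions start from the same committed snapshot.
snap = dict(db)
w1 = txn_take_off_call(snap, "alice")
w2 = txn_take_off_call(snap, "bob")

# First-committer-wins only checks write-write conflicts; the write sets
# are disjoint, so SI lets both commit.
assert not (w1.keys() & w2.keys())
db.update(w1)
db.update(w2)
# db now has both doctors off call: the constraint is violated even though
# each transaction, run alone, would have preserved it.
```

Run serially, the second transaction would see only one doctor on call and abort its update; under SI, each sees the stale snapshot and both commit, which is the interleaving the thesis's modified concurrency control detects at runtime.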
6

Tebow, Christopher. "A Tunable Snapshot Imaging Spectrometer." Diss., Tucson, Arizona : University of Arizona, 2005. http://etd.library.arizona.edu/etd/GetFileServlet?file=file:///data1/pdf/etd/azu%5Fetd%5F1023%5F1%5Fm.pdf&type=application/pdf.

Full text
7

Volin, Curtis Earl. "Portable snapshot infrared imaging spectrometer." Diss., The University of Arizona, 2000. http://hdl.handle.net/10150/289203.

Full text
Abstract:
A practical, field-capable, 3.0 to 5.0 μm mid-wave infrared Computed-Tomography Imaging Spectrometer (CTIS) has been demonstrated. The CTIS employs a simple optical system in order to measure the object cube without any scanning. The data is not measured directly, but in a manner which requires complicated post-processing to extract an estimate of the object's spectral radiance. The advantage of a snapshot imaging spectrometer is that it can collect information about a dynamic event which a standard scanning spectrometer could either miss or corrupt with temporal artifacts. Results were presented for reconstructions of laboratory targets with sampling up to 46 x 46 x 21 voxels over a variable field-of-view, or 0.1 μm spectral sampling. Demonstration of the snapshot capability has been performed on both static targets and targets with rapidly varying content. The contents of this dissertation are directed towards two ends. The primary undertaking is a realization of the theoretical model of the CTIS as a practical, field-capable MWIR instrument. The design, calibration, and operation of the MWIR CTIS are explained in detail in the text and appendices. Of additional interest is the advancement of the theory to improve the design and functionality of the spectrometer. A new algorithm for the design of the holographic disperser component of the CTIS is introduced. The design process dramatically extends the set of possibilities for the disperser. In order to improve the reconstruction potential of the spectrometer, the analytic expressions which describe the CTIS have been expanded into a principal component basis set. The result is a technique for creating an initial estimate of the object and a technique for improving the reconstruction algorithm.
8

Aumiller, Riley. "Longwave Infrared Snapshot Imaging Spectropolarimeter." Diss., The University of Arizona, 2013. http://hdl.handle.net/10150/301708.

Full text
Abstract:
The goal of this dissertation research is to develop and demonstrate a functioning snapshot imaging spectropolarimeter for the long wavelength infrared region of the electromagnetic spectrum (wavelengths from 8-12 microns). Such an optical system will be able to simultaneously measure both the spectral and polarimetric signatures of all the spatial locations/targets in a scene with just a single integration period of a camera. This will be accomplished by combining the use of computed tomographic imaging spectrometry (CTIS) and channeled spectropolarimetry. The proposed system will be the first instrument of this type specifically designed to operate in the long wavelength infrared region, as well as being the first demonstration of such a system using an uncooled infrared focal plane array. In addition to the design and construction of the proof-of-concept snapshot imaging spectropolarimeter LWIR system, the dissertation research will also focus on a variety of methods for improving CTIS system performance. These enhancements will include some newly proposed methods of system design, calibration, and reconstruction aimed at improving the speed of reconstructions, allowing for the first demonstration of a CTIS system capable of computing reconstructions in 'real time.'
9

Luo, Haitao. "Snapshot Imaging Polarimeters Using Spatial Modulation." Diss., The University of Arizona, 2008. http://hdl.handle.net/10150/193905.

Full text
Abstract:
The recent demonstration of a novel snapshot imaging polarimeter using the fringe modulation technique shows promise for building a compact, moving-parts-free device. Though demonstrated in principle, this technique has not been adequately studied. In an effort to advance it, we build a complete theoretical framework that addresses the key issues regarding the polarization aberrations caused by the functional elements. With this model, we have the knowledge necessary for designing, analyzing and optimizing such systems. We also propose a broader technique that uses arbitrary modulation instead of sinusoidal fringes, which gives more engineering freedom and can serve as a solution for achromatizing the system. On the hardware side, several important advances are made. We extend the polarimeter technique from the visible to the middle wavelength infrared by using yttrium vanadate crystals. Also, we incorporate a Savart plate polarimeter into a fundus camera to measure the retinal retardance of the human eye, information useful for glaucoma diagnosis. Thirdly, the world's smallest imaging polarimeter is proposed and demonstrated, which may open up many applications in security, remote sensing and bioscience.
10

Mihoubi, Sofiane. "Snapshot multispectral image demosaicing and classification." Thesis, Lille 1, 2018. http://www.theses.fr/2018LIL1I062/document.

Full text
Abstract:
Multispectral cameras sample the visible and/or the infrared spectrum according to narrow spectral bands. Available technologies include snapshot multispectral cameras equipped with filter arrays that acquire raw images at video rate. Raw images require a demosaicing procedure to estimate a multispectral image with full spatio-spectral definition. In this manuscript we review multispectral demosaicing methods and propose a new one based on the pseudo-panchromatic image. We highlight the influence of illumination on demosaicing performance, then we propose pre- and post-processing normalization steps that make demosaicing robust to acquisition properties. Experimental results show that our method provides estimated images of better objective quality than classical ones. Multispectral images can be used for texture classification. To perform texture analysis, we extend local binary pattern operators to multispectral texture images at the expense of increased memory and computation requirements. We propose to compute texture descriptors directly from raw images, which both avoids the demosaicing step and reduces the descriptor size. In order to assess classification on multispectral images, we have proposed the first significant multispectral database of close-range textures in the visible and near infrared spectral domains. Extensive experiments on this database show that the proposed descriptor has both reduced computational cost and high discriminating power relative to classical local binary pattern descriptors applied to demosaiced images.
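The local binary pattern operators the thesis extends can be sketched in their classic single-band form. This is a minimal pure-Python version of the standard 8-neighbour LBP on a grayscale patch, invented for illustration; for a raw filter-array image, the thesis's approach would compare neighbours sampled at the mosaic period so that comparisons stay within one band:

```python
def lbp(img, r, c):
    """Classic 8-neighbour local binary pattern code for pixel (r, c):
    each neighbour >= centre contributes one bit to an 8-bit code."""
    center = img[r][c]
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    code = 0
    for bit, (dr, dc) in enumerate(offsets):
        if img[r + dr][c + dc] >= center:
            code |= 1 << bit
    return code

def lbp_histogram(img):
    """256-bin histogram of LBP codes over interior pixels: a texture descriptor."""
    hist = [0] * 256
    for r in range(1, len(img) - 1):
        for c in range(1, len(img[0]) - 1):
            hist[lbp(img, r, c)] += 1
    return hist

texture = [[5, 3, 1],
           [4, 2, 8],
           [7, 6, 9]]
hist = lbp_histogram(texture)
```

Computing such codes directly on the raw mosaic, rather than after demosaicing, is what lets the thesis skip the demosaicing step and shrink the descriptor.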
11

Vosandi, Lauri. "Efficient and Reliable Filesystem Snapshot Distribution." Thesis, KTH, Skolan för informations- och kommunikationsteknik (ICT), 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-177471.

Full text
Abstract:
Linux is a portable operating system kernel devised by Linus Torvalds, and it can be used in conjunction with other userspace utilities such as GNU to build a free and open-source operating system for a multitude of target applications. While Linux-based operating systems have made significant progress on servers and embedded systems, there is still much room for improvement on workstations and laptops. Up to now, Linux-based operating system deployment has been an error-prone, time-consuming process, usually specific to a particular distribution of Linux. Linux-based operating systems also have a reputation for being overly complex to set up for a novice computer user, and even though there are now laptops available with pre-installed Ubuntu [1], installing a Linux-based operating system on an arbitrary device is troublesome due to lack of native support for certain hardware components. In this thesis Butterknife, a B-tree file system (Btrfs) and Linux Containers (LXC) based provisioning suite, is presented. Butterknife can be used to significantly reduce the deployment time of a customized Linux-based operating system. Butterknife makes use of LXC to prepare a template of the root filesystem and Btrfs snapshotting to save the state of the template. The Btrfs send/receive mechanism is then used to transfer the root filesystem to the target machine. Post-deployment scripts are then used to configure the root filesystem for a particular deployment, optionally retaining the hostname, domain membership, configuration management keys, etc. The current implementation of Butterknife uses HTTP(S) and multicast for transport, and various peer-to-peer scenarios are discussed in Section 6 – Conclusions and Future Work. In addition to provisioning, Butterknife makes use of Btrfs incremental snapshots to implement differential upgrades. This approach is especially attractive for mobile devices, embedded systems and the Internet of Things, where software upgrades have to be delivered in a guaranteed manner.
Butterknife brings additional value to the existing ecosystem by bridging the gap between stock installation media and configuration management.
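The Btrfs snapshot/send/receive workflow the abstract builds on can be sketched as the command sequences below. The helper and the subvolume paths are invented for illustration; the `btrfs subvolume snapshot -r`, `btrfs send -p` (incremental, relative to a parent snapshot), and `btrfs receive` subcommands are the standard CLI:

```python
def snapshot_cmds(subvol, snap_path, parent=None, target_dir="/var/butterknife"):
    """Command sequences (as argument lists) for a Btrfs-based deployment
    in the spirit of Butterknife: take a read-only snapshot, then stream it
    (incrementally, if a parent snapshot is given) into a receive directory."""
    take = ["btrfs", "subvolume", "snapshot", "-r", subvol, snap_path]
    send = ["btrfs", "send"] + (["-p", parent] if parent else []) + [snap_path]
    recv = ["btrfs", "receive", target_dir]
    return take, send, recv

# Full send for initial provisioning, incremental send for a differential upgrade:
take, full_send, recv = snapshot_cmds("/@template", "/@template-v2")
_, incr_send, _ = snapshot_cmds("/@template", "/@template-v2",
                                parent="/@template-v1")
```

The incremental variant is what makes the differential-upgrade case cheap: only the blocks that changed between the parent and the new snapshot cross the wire.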
12

Valdez, Ashley. "Snapshot Spectral Domain Optical Coherence Tomography." Thesis, The University of Arizona, 2016. http://hdl.handle.net/10150/613413.

Full text
Abstract:
Optical coherence tomography systems are used to image the retina in 3D, allowing ophthalmologists to diagnose ocular disease. These systems yield large data sets that are often labor-intensive to analyze and require significant expertise in order to draw conclusions, especially when used over time to monitor disease progression. Spectral Domain Optical Coherence Tomography (SD-OCT) instantly acquires depth profiles at a single location with a broadband source. These systems require mechanical scanning to generate two- or three-dimensional images. Instead of mechanically scanning, a 3 x 3 beamlet array was used to permit multiple depth measurements on the retina with a single snapshot. This multi-channel system was designed, assembled, and tested using a 1 x 2 beamlet lens array instead of a 3 x 3 beamlet array as a proof-of-concept prototype. The source was a superluminescent diode centered at 840 nm with a 45 nm bandwidth. Theoretical axial resolution was 6.92 μm and depth of focus was 3.45 mm. Glass samples of varying thickness ranging from 0.18 mm to 1.14 mm were measured with the system to validate that correct depth profiles can be acquired for each channel. The results demonstrated that the prototype system performed as expected and is ready to be modified for in vivo applicability.
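The quoted axial resolution follows from the standard Gaussian-source OCT formula dz = (2 ln 2 / π) · λ₀² / Δλ; a quick numerical check (not from the thesis) reproduces the stated figure from the stated source parameters:

```python
import math

def oct_axial_resolution_nm(center_wavelength_nm, bandwidth_nm):
    """Axial resolution of an OCT system with a Gaussian source spectrum:
    dz = (2 ln 2 / pi) * lambda0**2 / delta_lambda (all lengths in nm)."""
    return (2 * math.log(2) / math.pi) * center_wavelength_nm ** 2 / bandwidth_nm

# 840 nm superluminescent diode with 45 nm bandwidth, as in the prototype:
dz_nm = oct_axial_resolution_nm(840, 45)
print(round(dz_nm / 1000, 2), "um")   # 6.92 um, matching the quoted figure
```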
13

Sardar, Zeeshan Mohammad. "Snapshot based concurrency control protocols for XML." Thesis, McGill University, 2005. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=98790.

Full text
Abstract:
Currently, only a few XML data management systems support concurrent access to an XML document, and those that do typically apply variations of locking to handle XML's nested structure. However, advanced query processing techniques use a wide range of indexes and allow for arbitrary navigation through XML documents, making lock acquisition complex and potentially leading to high blocking rates. In this thesis, we suggest two concurrency control protocols that avoid any read locks by providing transactions with a committed snapshot of the data. OptiX enhances traditional optimistic concurrency control to work on XML, while SnaX offers snapshot isolation as provided by relational database systems like Oracle and PostgreSQL. We evaluate the performance of these two protocols on XML documents of different structure and on the XMark benchmark. For that purpose, we propose suitable update operations to serve as an extension to XMark, which currently supports only queries.
14

Alomari, Mohammad. "Ensuring Serializable Executions with Snapshot Isolation DBMS." University of Sydney, 2009. http://hdl.handle.net/2123/4211.

Full text
Abstract:
Doctor of Philosophy (PhD)
Snapshot Isolation (SI) is a multiversion concurrency control that has been implemented by open-source and commercial database systems such as PostgreSQL and Oracle. The main feature of SI is that a read operation does not block a write operation and vice versa, which allows a higher degree of concurrency than traditional two-phase locking. SI prevents many anomalies that appear in other isolation levels, but it can still result in non-serializable executions, in which database integrity constraints can be violated. Several techniques have been proposed to ensure serializable execution with engines running SI; these techniques are based on modifying the applications by introducing conflicting SQL statements. However, with each of these techniques the DBA has to make a difficult choice among possible transactions to modify. This thesis helps DBAs choose between these different techniques and choices by understanding how the choices affect system performance. It also proposes a novel technique called 'External Lock Manager' (ELM) which introduces conflicts in a separate lock-manager object so that every execution will be serializable. We build a prototype system for ELM and run experiments to demonstrate the robustness of the new technique compared to the previous techniques. Experiments show that modifying the application code for some transactions has a high impact on performance for some choices, which makes it very hard for DBAs to choose wisely. However, ELM has peak performance similar to SI, no matter which transactions are chosen for modification. Thus we say that ELM is a robust technique for ensuring serializable execution.
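The ELM idea of introducing conflicts in a separate lock-manager object, rather than as artificial data writes, can be sketched roughly as follows. This is a hypothetical Python sketch; the conflict item name and transaction bodies are invented, and the real ELM sits outside the DBMS alongside SI transactions:

```python
import threading

class ExternalLockManager:
    """Separate lock-manager object: selected dangerous transactions take an
    exclusive lock on a designated conflict item before running, so two such
    transactions can never overlap, even though SI itself would allow it."""
    def __init__(self):
        self._locks = {}
        self._guard = threading.Lock()

    def lock(self, item):
        with self._guard:
            lk = self._locks.setdefault(item, threading.Lock())
        lk.acquire()

    def unlock(self, item):
        self._locks[item].release()

elm = ExternalLockManager()
log = []

def dangerous_txn(name):
    elm.lock("doctors_on_call")       # introduced conflict, not a data write
    try:
        log.append(f"{name}:start")   # snapshot reads + writes would go here
        log.append(f"{name}:commit")
    finally:
        elm.unlock("doctors_on_call")

t1 = threading.Thread(target=dangerous_txn, args=("T1",))
t2 = threading.Thread(target=dangerous_txn, args=("T2",))
t1.start(); t2.start()
t1.join(); t2.join()
# log shows the two dangerous transactions fully serialized (in either order)
```

Because the lock lives outside the database, readers and writers that are not in the dangerous set keep SI's non-blocking behaviour, which is why ELM's peak performance stays close to plain SI.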
15

Peng, Gang. "A distributed snapshot protocol for virtual machines." Thesis, University of British Columbia, 2007. http://hdl.handle.net/2429/32049.

Full text
Abstract:
The distributed snapshot protocol is a critical technology in the areas of disaster recovery and computer security of distributed systems, and a huge number of projects have worked on this topic since the 1970s. Recently, with the popularity of parallel computing and disaster recovery, this topic has received more and more attention from both academic and industrial researchers. However, all the existing protocols have several common disadvantages. First, existing protocols all require several modifications to the target processes or their OS, which is usually error-prone and sometimes impractical. Second, all the existing protocols aim only at taking snapshots of processes, not of entire OS images, which constrains the areas to which they can be applied. This thesis introduces the design and implementation of our hypervisor-level, coordinated, non-blocking distributed snapshot protocol. Unlike the existing protocols, it provides a simpler and totally transparent snapshot platform to both the target processes and their OS images. Based on several observations of the target environment, we simplify our protocol by intentionally ignoring the channel states, and to hide our protocol from the target processes and their OS, we, on the one hand, exploit VM technology to silently insert our protocol under the target OS, and on the other hand, design and implement two kernel modules and a management daemon system in the control domain. We test our protocol with several popular benchmarks, and all the experimental results confirm the correctness and efficiency of our protocol.
Faculty of Science; Department of Computer Science; Graduate
16

Locke, Ann M. "Design and analysis of a snapshot imaging spectropolarimeter." Diss., The University of Arizona, 2003. http://hdl.handle.net/10150/280365.

Full text
Abstract:
The subject of this dissertation is the implementation of Computed Tomographic Imaging Channeled Spectropolarimetry (CTICS) in the design and analysis of a short wave infrared (SWIR) system with 54 x 46 spatial resolution and 70-band spectral resolution from 1.25-1.99 μm for the purpose of object identification and classification. It is the first of its kind to provide imaging spectropolarimetry with no moving parts and snapshot capability. This spectropolarimeter has applications in many fields, such as mining, military reconnaissance, biomedical imaging, and astronomy. First, motivations are provided for building this unique imaging spectropolarimeter by discussing the current applications of such systems, the drawbacks of previous designs, and a review of some of the current systems being used. A review of basic concepts in imaging systems, linear algebra, and polarimetry is given as an introduction to the technical details of the design of the system that follow: first the design of the Computed Tomography Imaging Spectrometer (CTIS), and then the channeled spectropolarimetry components. The fusion of these two techniques creates the CTICS. An assembled version of the SWIR CTIS is calibrated, and reconstructions of various objects demonstrate the capabilities of this portion of the system. The polarimetry components are added, and a discussion follows on the method used to extract the new data. Two systems, a polarization state generator (PSG) and a rotating retarder fixed analyzer (RRFA) system, are built to verify the CTICS accuracy. The final assembled system is presented and testing results are shown. Error analysis on various sources of noise is performed. To conclude, a novel sub-Nyquist sampling technique is demonstrated and future work is suggested on a reconstruction technique that will streamline the post-processing of the images.
17

Parker, Alistair. "Snapshot photography : a phatic, socially constructed mnemonic technology." Thesis, Lancaster University, 2017. http://eprints.lancs.ac.uk/88475/.

Full text
Abstract:
This practice-related research study explores my cognitive response to a biographical snapshot photograph celebrating my first day at school. The experience triggered an exploration of the relationship between snapshot photographs and memory. The finding of a second, almost identical snapshot photograph of my son, taken twenty years later by me, prompted me to question why my father and I should take almost identical snapshots. I argue that the invention of photography was driven by the desire to capture the images created by the camera obscura by mark-making with the pencil of light as an aide-mémoire. I argue that the desire to externalise memory using mnemonic technology is innate, with primal origins in parietal art and lithic technologies. The discourse explores the cultural evolution of technology through Jacques Derrida's theory of originary technicity and Bernard Stiegler's concept of the cultural evolution of technology by epiphylogenesis and the notion of the externalisation of memory as prosthesis. I explore the emergence of snapshot photography from the canon of photography through the theories of cultural evolution, technological momentum, and social constructivism, together with psycho-social notions of desire, ritual, performativity and intentionality in the establishment of snapshot photography as a ubiquitous, ingrained social practice. The research is informed by a studio practice element that uses the adventures of Lewis Carroll's Alice as a conceptual framework to explore a journey of agency, self and autodidactic knowledge acquisition. I discuss the search for an appropriate methodological framework for art practice based research. My practice is a catalyst for enquiry; a project usually starts with an artefact that forms the locus of a question, and the search for answers to those questions often leads, epistemically, to unexpected places and relationships.
The mode and manner of my enquiry are rhizomatic, pragmatic and serendipitous; the relationship between practice and theory is flexible, one informing the other. Through practice, I explore the deconstruction and textualisation of the visual metaphor of memory through the rhetorical devices of ekphrasis and memory texts, a visualisation of the nature and originary technicity of snapshot photography, and an exploration of self and place. The thesis for this study is founded on the premise that snapshot photography is a socially constructed, phatic, mnemonic mark-making technology with origins in parietal forms of visual expression.
18

Saghazadeh, Sahar. "Implementing a state snapshot for the EMCA Simulator." Thesis, KTH, Skolan för informations- och kommunikationsteknik (ICT), 2014. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-147543.

Full text
Abstract:
The aim of this master thesis was designing and developing snapshot functionality for the EMCA hardware simulator system, which is used for implementing digital signal processing functions at Ericsson AB. The literature review contains background knowledge on modeling, simulation, virtual machines and hardware simulators. The simulator system has been reviewed, and the design of the snapshot functionality for the system has been explained. The performance of the snapshot functionality for saving and retrieving the visible state information has been evaluated. The results show that the developed functionality works well in saving and retrieving the information. Moreover, the effect of the invisible simulator states has been investigated. Some suggestions are presented for further work.
APA, Harvard, Vancouver, ISO, and other styles
19

Shi, Jeff, Tony Mao, James Chesney, and Nicholas Speciale. "Fast Auroral Snapshot Explorer (FAST) Packet Processing System." International Foundation for Telemetering, 1993. http://hdl.handle.net/10150/611832.

Full text
Abstract:
International Telemetering Conference Proceedings / October 25-28, 1993 / Riviera Hotel and Convention Center, Las Vegas, Nevada
This paper describes the design of a space telemetry level zero processing system for National Aeronautics and Space Administration's (NASA's) Fast Auroral Snapshot Explorer (FAST) science mission. The design is based on a prototype Very Large Scale Integration (VLSI) level zero processing system, and utilizes VLSI telemetry data processing functional components, VLSI system technologies, and Object-Oriented Programming. The system performs level zero processing functions based on Consultative Committee for Space Data Systems (CCSDS) data format [1], and features high data processing rates, highly automated operations, and Open Software Foundation (OSF)/Motif based Graphical User Interface (GUI).
APA, Harvard, Vancouver, ISO, and other styles
20

Geierman, Joseph. "Facility management during the 2009 recession a snapshot view /." Thesis, Atlanta, Ga. : Georgia Institute of Technology, 2009. http://hdl.handle.net/1853/31663.

Full text
Abstract:
Thesis (M. S.)--Building Construction, Georgia Institute of Technology, 2010.
Committee Chair: Roper, Kathy; Committee Member: Castro-Lacouture, Daniel; Committee Member: Thomas-Mobley, Linda. Part of the SMARTech Electronic Thesis and Dissertation Collection.
APA, Harvard, Vancouver, ISO, and other styles
21

Chao, Da-Wei David. "Development and evaluation of remote database snapshot refresh methods /." Thesis, Connect to this title online; UW restricted, 1990. http://hdl.handle.net/1773/8830.

Full text
APA, Harvard, Vancouver, ISO, and other styles
22

Fernandez, Ramos Javier. "Snapshot multispectral oximetry using image replication and birefringent spectrometry." Thesis, University of Glasgow, 2017. http://theses.gla.ac.uk/8162/.

Full text
Abstract:
This thesis describes improvements to the image replicating imaging spectrometer (IRIS) and the development of novel applications in the field of oximetry. IRIS is a snapshot multispectral device with a high transmission output and no need for inversion in data recovery, and hence a high signal-to-noise ratio (SNR). IRIS shows great versatility owing to the possibility of choosing multiple contiguous or non-contiguous wavelengths inside its free spectral range. IRIS uses a set of waveplates and Wollaston prisms to demultiplex the spectral information of an object and replicate the image of that object at different wavelengths. The birefringent nature of IRIS means that different wavelengths are separated by the Wollaston prisms at different angles, introducing multiple images of the same object. In addition, the spectral transmission function shows multiple spectral sidelobes that contaminate each IRIS band with light belonging to other wavelengths. These issues can lower the performance of IRIS as a multispectral imaging device. In this thesis, these problems were addressed by introducing a filter plate array placed in the image plane of the optical system. This filter array is a set of narrow-band filters (full width at half maximum (FWHM) = 10 ± 2 nm) that removes undesired wavelengths from each IRIS band. Since the spectral transmission of IRIS is replicated along the free spectral range, the filters can be designed to match any of the spectral lobes present in IRIS. The design and fabrication of a filter array enhance the performance of IRIS as a multispectral imaging device: they allow wavelength selection and improve spectral and spatial image quality. The design and manufacture of the corresponding filter holder and camera adapter were critical in offering an easy filter-camera implementation.
The filter plate allowed the removal of other wavelengths dispersed by the Wollaston prisms, improving image registration between the set of spectral images created by IRIS and thus the quality of the registered spectral 3-D cube. The implemented improvements to IRIS allow high-quality, calibration-free oximetry using eight different wavelengths optimised for oximetry. Two main experiments were performed: 1) Using an inverted microscope interfaced with IRIS and a linear spectral unmixing technique, we measured the deoxygenation of single horse red blood cells (RBC) in vitro in real time. The oximetry was performed with a subcellular spatial resolution of 0.5 μm, a temporal resolution of 30 Hz, and an accuracy (standard error of the mean) of ±1.1% in oxygen saturation. 2) Eight-wavelength calibration-free retinal oximetry performed in nine healthy subjects demonstrated an increase in the stability of the oxygen saturation measurements along retinal vessels when compared with more traditional analysis methods such as two-wavelength oximetry. The stability was measured as the standard deviation along the retinal vessels of the nine subjects and was found to be ∼3% in oxygen saturation for eight-wavelength oximetry and ∼5% for two-wavelength oximetry. A modified physical model was used to improve the characterization of light propagation through the eye, retina and blood vessels by applying a set of feasible physiological assumptions. This model was optimised by an algorithm that solves for the different variables involved in the retinal vessel transmissions in order to accurately calculate the oxygen saturation. The oximetry algorithm was applied to retinal vessels and, in collaboration, in vivo on rat spinal cord to assess hypoxia in inflammatory diseases such as multiple sclerosis and rheumatoid arthritis, and on mouse legs to assess hypoxia in autoimmune diseases.
A third experiment using a microscope interfaced with IRIS was performed. The experiment aimed to replicate the laminar flow conditions observed in retinal vessels and to calculate oxygen diffusion between adjacent streams of blood with different oxygen saturations. For this purpose a PDMS multichannel flow cell with cross sections of 40 × 100 μm was designed and fabricated, allowing us to replicate conditions found in retinal blood vessels. Laminar flow was replicated, but the experiment failed to calculate oxygen diffusion due to flaws in the experimental design. The experiment, with results and recommendations on how to improve it, can be found in Appendix B for future researchers.
APA, Harvard, Vancouver, ISO, and other styles
23

Anderson, J. L. "A COMPACT 500 MS/SEC WIDEBAND SNAPSHOT RECORDER/WORKSTATION." International Foundation for Telemetering, 1991. http://hdl.handle.net/10150/613051.

Full text
Abstract:
International Telemetering Conference Proceedings / November 04-07, 1991 / Riviera Hotel and Convention Center, Las Vegas, Nevada
This paper describes the design of the TCM Plus, an integrated 500 MSample/second snapshot recording system that achieves high performance in a compact, modular implementation. The system can record and play back analog and digital signals at sample rates from 10 kHz to 500 MHz, with RAM-based storage of up to 256 MBytes. High-density multi-layer circuit card designs and custom and semi-custom chips were required to meet the physical-size design objective of a 7" high rack-mount chassis for the memory unit. A highly graphical computer with standard busses was embedded as the system controller, resulting in the capability to tightly couple wideband acquisition with signal processing application software, which can turn the system into an ultra-high-performance signal processing workstation.
APA, Harvard, Vancouver, ISO, and other styles
24

Fleischer, Candace C. "A molecular snapshot of charged nanoparticles in the cellular environment." Diss., Georgia Institute of Technology, 2014. http://hdl.handle.net/1853/53632.

Full text
Abstract:
Nanoparticles are promising platforms for biomedical applications ranging from diagnostic tools to therapeutic delivery agents. During the course of these applications, nanoparticles are exposed to a complex mixture of extracellular serum proteins that nonspecifically adsorb onto the surface. The resulting protein layer, or protein "corona," creates an interface between nanoparticles and the biological environment. Protecting the nanoparticle surface can reduce protein adsorption, but complete inhibition remains a challenge. As a result, the corona, rather than the nanoparticle itself, mediates the cellular response to the nanoparticle. The following dissertation describes the fundamental characterization of the cellular binding of charged nanoparticles, interactions of protein-nanoparticle complexes with cellular receptors, and the structural and thermodynamic properties of adsorbed corona proteins.
APA, Harvard, Vancouver, ISO, and other styles
25

Almeida, Fábio Renato de [UNESP]. "Gerenciamento de transação e mecanismo de serialização baseado em Snapshot." Universidade Estadual Paulista (UNESP), 2014. http://hdl.handle.net/11449/122161.

Full text
Abstract:
Dentre os diversos níveis de isolamento sob os quais uma transação pode executar, Snapshot se destaca pelo fato de lidar com uma visão isolada da base de dados. Uma transação sob o isolamento Snapshot nunca bloqueia e nunca é bloqueada quando solicita uma operação de leitura, permitindo portanto uma maior concorrência quando a mesma é comparada a uma execução sob um isolamento baseado em bloqueios. Entretanto, Snapshot não é imune a todos os problemas decorrentes da concorrência e, portanto, não oferece garantia de serialização. Duas estratégias são comumente empregadas para se obter tal garantia. Na primeira delas o próprio Snapshot é utilizado, mas uma alteração estratégica na aplicação e na base de dados, ou até mesmo a inclusão de um componente de software extra, são empregados como auxiliares para se obter apenas históricos serializáveis. Outra estratégia, explorada nos últimos anos, tem sido a construção de algoritmos fundamentados no protocolo de Snapshot, mas adaptados de modo a impedir as anomalias decorrentes do mesmo e, portanto, garantir serialização. A primeira estratégia traz como vantagem o fato de se aproveitar os benefícios de Snapshot, principalmente no que diz respeito ao monitoramento apenas dos elementos que são escritos pela transação. Contudo, parte da responsabilidade em se lidar com problemas de concorrência é transferida do Sistema Gerenciador de Banco de Dados (SGBD) para a aplicação. Por sua vez, a segunda estratégia deixa apenas o SGBD como responsável pelo controle de concorrência, mas os algoritmos até então apresentados nesta categoria tem exigido também o monitoramento dos elementos lidos. Neste trabalho é desenvolvida uma técnica onde os benefícios de Snapshot são mantidos e a garantia de serialização é obtida sem a necessidade de adaptação do código da aplicação ou da introdução de uma camada de software extra. A técnica proposta é ...
Among the various isolation levels under which a transaction can execute, Snapshot stands out because it works on an isolated view of the database. A transaction under Snapshot isolation never blocks and is never blocked when requesting a read operation, thus allowing a higher level of concurrency than an execution under a lock-based isolation. However, Snapshot is not immune to all the problems that arise from concurrency and therefore offers no serializability guarantee. Two strategies are commonly employed to obtain such assurance. In the first, Snapshot itself is used, but a strategic change in the application and database, or even the addition of an extra software component, is employed to obtain only serializable histories. Another strategy, explored in recent years, has been the design of algorithms based on the Snapshot protocol but adapted to prevent its anomalies and therefore ensure serialization. The first strategy has the advantage of exploiting the benefits of Snapshot, especially with regard to monitoring only the elements that are written by the transaction; however, part of the responsibility for dealing with concurrency issues is transferred from the Database Management System (DBMS) to the application. In turn, the second strategy leaves the DBMS solely responsible for concurrency control, but the algorithms presented so far in this category also require monitoring the elements that the transaction reads. In this work we developed a technique in which the benefits of Snapshot are retained and a serialization guarantee is achieved without adapting application code or adding an extra software layer. The proposed technique is implemented in a prototype of a DBMS that has temporal features and was built to demonstrate the applicability of the technique in systems that employ the object-oriented model. However, the ...
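The snapshot-read semantics the abstract describes, where reads never block and writers are validated against concurrent commits, can be sketched with a toy multi-version store. This is an illustrative sketch only, not the thesis's technique; all names are hypothetical:

```python
class MVCCStore:
    """Toy multi-version store: each key maps to a list of (commit_ts, value)."""
    def __init__(self):
        self.versions = {}   # key -> [(commit_ts, value), ...], in commit order
        self.ts = 0          # global commit timestamp counter

    def begin(self):
        # A transaction remembers the timestamp of its snapshot.
        return {"start_ts": self.ts, "writes": {}}

    def read(self, txn, key):
        # Read the newest version committed at or before the snapshot;
        # never blocks, regardless of concurrent writers.
        if key in txn["writes"]:
            return txn["writes"][key]
        for commit_ts, value in reversed(self.versions.get(key, [])):
            if commit_ts <= txn["start_ts"]:
                return value
        return None

    def write(self, txn, key, value):
        txn["writes"][key] = value   # buffered until commit

    def commit(self, txn):
        # First-committer-wins: abort if any written key changed since start.
        for key in txn["writes"]:
            for commit_ts, _ in self.versions.get(key, []):
                if commit_ts > txn["start_ts"]:
                    raise RuntimeError("write-write conflict: abort")
        self.ts += 1
        for key, value in txn["writes"].items():
            self.versions.setdefault(key, []).append((self.ts, value))

store = MVCCStore()
setup = store.begin()
store.write(setup, "x", 1)
store.commit(setup)

t1 = store.begin()              # t1's snapshot sees x = 1
t2 = store.begin()
store.write(t2, "x", 2)
store.commit(t2)                # t2 commits a newer version
print(store.read(t1, "x"))      # 1: t1 still reads from its own snapshot
```

Note that only write sets are validated here, which is exactly why, as the abstract says, plain Snapshot admits anomalies (e.g. write skew) that lock-based serializable executions do not.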
APA, Harvard, Vancouver, ISO, and other styles
26

Almeida, Fábio Renato de. "Gerenciamento de transação e mecanismo de serialização baseado em Snapshot /." São José do Rio Preto, 2014. http://hdl.handle.net/11449/122161.

Full text
Abstract:
Orientador: Carlos Roberto Valêncio
Banca: Elaine Parros Machado de Sousa
Banca: Rogéria Cristiane Gratão de Souza
Resumo: Dentre os diversos níveis de isolamento sob os quais uma transação pode executar, Snapshot se destaca pelo fato de lidar com uma visão isolada da base de dados. Uma transação sob o isolamento Snapshot nunca bloqueia e nunca é bloqueada quando solicita uma operação de leitura, permitindo portanto uma maior concorrência quando a mesma é comparada a uma execução sob um isolamento baseado em bloqueios. Entretanto, Snapshot não é imune a todos os problemas decorrentes da concorrência e, portanto, não oferece garantia de serialização. Duas estratégias são comumente empregadas para se obter tal garantia. Na primeira delas o próprio Snapshot é utilizado, mas uma alteração estratégica na aplicação e na base de dados, ou até mesmo a inclusão de um componente de software extra, são empregados como auxiliares para se obter apenas históricos serializáveis. Outra estratégia, explorada nos últimos anos, tem sido a construção de algoritmos fundamentados no protocolo de Snapshot, mas adaptados de modo a impedir as anomalias decorrentes do mesmo e, portanto, garantir serialização. A primeira estratégia traz como vantagem o fato de se aproveitar os benefícios de Snapshot, principalmente no que diz respeito ao monitoramento apenas dos elementos que são escritos pela transação. Contudo, parte da responsabilidade em se lidar com problemas de concorrência é transferida do Sistema Gerenciador de Banco de Dados (SGBD) para a aplicação. Por sua vez, a segunda estratégia deixa apenas o SGBD como responsável pelo controle de concorrência, mas os algoritmos até então apresentados nesta categoria tem exigido também o monitoramento dos elementos lidos. Neste trabalho é desenvolvida uma técnica onde os benefícios de Snapshot são mantidos e a garantia de serialização é obtida sem a necessidade de adaptação do código da aplicação ou da introdução de uma camada de software extra. A técnica proposta é ...
Abstract: Among the various isolation levels under which a transaction can execute, Snapshot stands out because it works on an isolated view of the database. A transaction under Snapshot isolation never blocks and is never blocked when requesting a read operation, thus allowing a higher level of concurrency than an execution under a lock-based isolation. However, Snapshot is not immune to all the problems that arise from concurrency and therefore offers no serializability guarantee. Two strategies are commonly employed to obtain such assurance. In the first, Snapshot itself is used, but a strategic change in the application and database, or even the addition of an extra software component, is employed to obtain only serializable histories. Another strategy, explored in recent years, has been the design of algorithms based on the Snapshot protocol but adapted to prevent its anomalies and therefore ensure serialization. The first strategy has the advantage of exploiting the benefits of Snapshot, especially with regard to monitoring only the elements that are written by the transaction; however, part of the responsibility for dealing with concurrency issues is transferred from the Database Management System (DBMS) to the application. In turn, the second strategy leaves the DBMS solely responsible for concurrency control, but the algorithms presented so far in this category also require monitoring the elements that the transaction reads. In this work we developed a technique in which the benefits of Snapshot are retained and a serialization guarantee is achieved without adapting application code or adding an extra software layer. The proposed technique is implemented in a prototype of a DBMS that has temporal features and was built to demonstrate the applicability of the technique in systems that employ the object-oriented model. However, the ...
Mestre
APA, Harvard, Vancouver, ISO, and other styles
27

Wilmsen, Dominik. "Derivation of Change from Sequences of Snapshots." Fogler Library, University of Maine, 2006. http://www.library.umaine.edu/theses/pdf/WilmsenD2006.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
28

Stevens, Charlotte. "Snapshots from the cultural history of taste." Thesis, Loughborough University, 2004. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.416724.

Full text
Abstract:
This thesis explores the cultural, or literary, history of taste as a social construct. Taking the mid-eighteenth century as its starting point, the thesis adopts an historicist approach to five very particular texts from this vast history. It begins by focusing on three novels: firstly, Henry Fielding's Tom Jones (1749), which was published at a time when there was increasing pressure to create 'standards' of taste; secondly, Jane Austen's Sense and Sensibility (1811), which belongs to a moment that scrutinised these 'standards'; and thirdly, Charles Dickens's Oliver Twist (1837), which reflects an era in which taste is driven by commercial forces. The final chapters explore a significant twentieth-century development in the history of taste: namely, the adaptation of text into film. Here, David Lean's Oliver Twist (1948) and Tony Richardson's Tom Jones (1963) become the focus for close investigation. I argue that Lean's Oliver Twist very much belongs to a post-war Britain in which the acquisition of taste was part of a wider framework for maintaining national and social cohesion. Richardson's Tom Jones, I argue, must be read in relation to the cultural revolutions in taste that dominated the early 1960s.
APA, Harvard, Vancouver, ISO, and other styles
29

Stender, Jan. "Snapshots in large-scale distributed file systems." Doctoral thesis, Humboldt-Universität zu Berlin, Mathematisch-Naturwissenschaftliche Fakultät II, 2013. http://dx.doi.org/10.18452/16660.

Full text
Abstract:
Viele moderne Dateisysteme unterstützen Snapshots zur Erzeugung konsistenter Online-Backups, zur Wiederherstellung verfälschter oder ungewollt geänderter Dateien, sowie zur Rückverfolgung von Änderungen an Dateien und Verzeichnissen. Während frühere Arbeiten zu Snapshots in Dateisystemen vorwiegend lokale Dateisysteme behandeln, haben moderne Trends wie Cloud- oder Cluster-Computing dazu geführt, dass die Datenhaltung in verteilten Speichersystemen an Bedeutung gewinnt. Solche Systeme umfassen häufig eine Vielzahl an Speicher-Servern, was besondere Herausforderungen mit Hinblick auf Skalierbarkeit, Verfügbarkeit und Ausfallsicherheit mit sich bringt. Diese Arbeit beschreibt einen Snapshot-Algorithmus für großangelegte verteilte Dateisysteme und dessen Integration in XtreemFS, ein skalierbares objektbasiertes Dateisystem für Grid- und Cloud-Computing-Umgebungen. Die zwei Bausteine des Algorithmus sind ein System zur effizienten Erzeugung und Verwaltung von Dateiinhalts- und Metadaten-Versionen, sowie ein skalierbares, ausfallsicheres Verfahren zur Aggregation bestimmter Versionen in einem Snapshot. Um das Problem einer fehlenden globalen Zeit zu bewältigen, implementiert der Algorithmus ein weniger restriktives, auf Zeitstempeln lose synchronisierter Server-Uhren basierendes Konsistenzmodell für Snapshots. Die wesentlichen Beiträge der Arbeit sind: 1) ein formales Modell von Snapshots und Snapshot-Konsistenz in verteilten Dateisystemen; 2) die Beschreibung effizienter Verfahren zur Verwaltung von Metadaten- und Dateiinhalts-Versionen in objektbasierten Dateisystemen; 3) die formale Darstellung eines skalierbaren, ausfallsicheren Snapshot-Algorithmus für großangelegte objektbasierte Dateisysteme; 4) eine detaillierte Beschreibung der Implementierung des Algorithmus in XtreemFS. Eine umfangreiche Auswertung belegt, dass der vorgestellte Algorithmus die Nutzerdatenrate kaum negativ beeinflusst, und dass er mit großen Zahlen an Snapshots und Versionen skaliert.
Snapshots are present in many modern file systems, where they allow consistent on-line backups to be created, corruptions or inadvertent changes of files to be rolled back, and a record of changes to files and directories to be kept. While most previous work on file system snapshots refers to local file systems, modern trends like cloud and cluster computing have shifted the focus towards distributed storage infrastructures. Such infrastructures often comprise large numbers of storage servers, which presents particular challenges in terms of scalability, availability and failure tolerance. This thesis describes a snapshot algorithm for large-scale distributed file systems and its integration in XtreemFS, a scalable object-based file system for grid and cloud computing environments. The two building blocks of the algorithm are a version management scheme, which efficiently records versions of file content and metadata, as well as a scalable and failure-tolerant mechanism that aggregates specific versions in a snapshot. To overcome the lack of a global time in a distributed system, the algorithm implements a relaxed consistency model for snapshots, which is based on timestamps assigned by loosely synchronized server clocks. The main contributions of the thesis are: 1) a formal model of snapshots and snapshot consistency in distributed file systems; 2) the description of efficient schemes for the management of metadata and file content versions in object-based file systems; 3) the formal presentation of a scalable, fault-tolerant snapshot algorithm for large-scale object-based file systems; 4) a detailed description of the implementation of the algorithm as part of XtreemFS. An extensive evaluation shows that the proposed algorithm has no severe impact on user I/O, and that it scales to large numbers of snapshots and versions.
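The timestamp-based aggregation the abstract describes, where each server independently records versions under its loosely synchronized clock and a snapshot selects the newest version at or before a common time, can be sketched in a few lines. This is only an illustration of the general idea, not XtreemFS code; all names are hypothetical:

```python
import bisect

class VersionLog:
    """Per-server log of (timestamp, version_id) pairs, kept in timestamp order."""
    def __init__(self):
        self._entries = []

    def record(self, timestamp, version_id):
        # Timestamps come from this server's loosely synchronized clock.
        bisect.insort(self._entries, (timestamp, version_id))

    def version_at(self, snapshot_time):
        # Newest version whose timestamp is <= snapshot_time.
        i = bisect.bisect_right(self._entries, (snapshot_time, float("inf")))
        return self._entries[i - 1][1] if i else None

# Two servers with slightly skewed clocks record versions independently.
server_a, server_b = VersionLog(), VersionLog()
server_a.record(100.0, "a-v1"); server_a.record(105.2, "a-v2")
server_b.record(99.7, "b-v1"); server_b.record(106.1, "b-v2")

# A snapshot at T = 105.5 aggregates each server's view at that time,
# without any coordination between servers.
snapshot = (server_a.version_at(105.5), server_b.version_at(105.5))
print(snapshot)  # ('a-v2', 'b-v1')
```

The consistency of such a snapshot is only as tight as the clock skew between servers, which is precisely why the thesis adopts a relaxed, timestamp-based consistency model rather than requiring a global time.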
APA, Harvard, Vancouver, ISO, and other styles
30

Grieve, Stuart Michael. "Development of fast magnetic resonance imaging methods for investigation of the brain." Thesis, University of Oxford, 2000. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.365824.

Full text
APA, Harvard, Vancouver, ISO, and other styles
31

Howell, Randy Keith. "d-MUSIC : an algorithm for single snapshot direction-of-arrival estimation." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1999. http://www.collectionscanada.ca/obj/s4/f2/dsk2/ftp02/NQ37346.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
32

Dockter, Rhyan B. "Genome Snapshot and Molecular Marker Development in Penstemon (Plantaginaceae)." BYU ScholarsArchive, 2011. https://scholarsarchive.byu.edu/etd/2512.

Full text
Abstract:
Penstemon Mitchell (Plantaginaceae) is one of the largest, most diverse plant genera in North America. Their unique diversity, paired with their drought-tolerance and overall hardiness, give Penstemon a vast amount of potential in the landscaping industry—especially in the more arid western United States where they naturally thrive. In order to develop Penstemon lines for more widespread commercial and private landscaping use, we must improve our understanding of the vast genetic diversity of the genus on a molecular level. In this study we utilize genome reduction and barcoding to optimize 454-pyrosequencing in four target species of Penstemon (P. cyananthus, P. davidsonii, P. dissectus and P. fruticosus). Sequencing and assembly produced contigs representing an average of 0.5% of the Penstemon species. From the sequence, SNP information and microsatellite markers were extracted. One hundred and thirty-three interspecific microsatellite markers were discovered, of which 50 met desired primer parameters, and were of high quality with readable bands on 3% Metaphor gels. Of the microsatellite markers, 82% were polymorphic with an average heterozygosity value of 0.51. An average of one SNP in 2,890 bp per species was found within the individual species assemblies and one SNP in 97 bp were found between any two supposed homologous sequences of the four species. An average of 21.5% of the assembled contigs were associated with putative genes involved in cellular components, biological processes, and molecular functions. On average 19.7% of the assembled contigs were identified as repetitive elements of which LTRs, DNA transposons and other unclassified repeats, were discovered. Our study demonstrates the effectiveness of using the GR-RSC technique to selectively reduce the genome size to putative homologous sequence in different species of Penstemon. 
It has also enabled us to gain greater insight into microsatellite, SNP, putative gene and repetitive element content in the Penstemon genome, providing essential tools for further genetic work including plant breeding and phylogenetics.
APA, Harvard, Vancouver, ISO, and other styles
33

Kraay, Andrea L. (Andrea Lorraine) 1976. "Physically constrained maximum likelihood method for snapshot deficient adaptive array processing." Thesis, Massachusetts Institute of Technology, 2003. http://hdl.handle.net/1721.1/87331.

Full text
Abstract:
Thesis (Elec.E. and S.M. in Electrical Engineering)--Joint Program in Applied Ocean Physics and Engineering (Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science; and the Woods Hole Oceanographic Institution), 2003.
"February 2003."
Includes bibliographical references (leaves 139-141).
by Andrea L. Kraay.
Elec.E. and S.M. in Electrical Engineering
APA, Harvard, Vancouver, ISO, and other styles
34

Chamberland, Meredith Scotti. "The wait-list game: a snapshot of those left in limbo." Thesis, Boston University, 2013. https://hdl.handle.net/2144/10959.

Full text
Abstract:
Thesis (Ed.D.)--Boston University
Undergraduate admissions in the United States is a multibillion-dollar industry involving families, higher education institutions, journalists, testing companies, test preparation companies, private consultants, marketing firms, high school guidance counselors, high school teachers, coaches, financial advisors, and publicly funded programs. Pushing all of the citizens of the United States towards postsecondary education has been a goal of many presidents. In an effort to achieve this goal, colleges and universities utilize wait lists so that no seat goes unfilled. Five high school guidance counselors, ten students and one of their parents, and ten college and university admissions personnel participated in this study. The students and parents all come from one private high school in the northeast. Guidance counselors from one public and one private high school participated. The interviews with ten admissions personnel include four-year public and private colleges and universities in the United States. Qualitative methods consisted of audio-recorded interviews, which were later transcribed and coded. Data were analyzed for common themes, which were found among each of the population groups. There are four noteworthy findings. Students want wait lists to exist because they provide an opportunity for acceptance that would otherwise not exist. Parents want wait lists to exist, but they want policy reform that requires colleges and universities to be consistent in their communication. High school counselors call for more transparency and information regarding how college and university admissions offices create wait lists and how students are chosen for enrollment from the wait lists. Lastly, college admissions representatives primarily use wait lists to meet enrollment targets, but may also use wait lists to keep acceptance percentages lower, increase yield percentages, and admit only viable financial candidates.
These findings suggest that the NACAC Statement of Principles of Good Practice needs revision to include more guidelines about communication with wait-listed students and their families; a need for master's programs that lead to certification as a high school counselor to include a course on college admissions counseling; and a need for the US News and World Report to eliminate acceptance percentages as an evaluative measure of quality in its college rankings.
APA, Harvard, Vancouver, ISO, and other styles
35

Gryspeerdt, Edward, Tom Goren, and Tristan W. P. Smith. "Observing the timescales of aerosol–cloud interactions in snapshot satellite images." Copernicus Publications, 2021. https://ul.qucosa.de/id/qucosa%3A74863.

Full text
Abstract:
The response of cloud processes to an aerosol perturbation is one of the largest uncertainties in the anthropogenic forcing of the climate. It occurs at a variety of timescales, from the near-instantaneous Twomey effect to the longer timescales required for cloud adjustments. Understanding the temporal evolution of cloud properties following an aerosol perturbation is necessary to interpret the results of so-called “natural experiments” from a known aerosol source such as a ship or industrial site. This work uses reanalysis wind fields and ship emission information matched to observations of ship tracks to measure the timescales of cloud responses to aerosol in instantaneous (or “snapshot”) images taken by polar-orbiting satellites. As in previous studies, the local meteorological environment is shown to have a strong impact on the occurrence and properties of ship tracks, but there is a strong time dependence in their properties. The largest droplet number concentration (Nd) responses are found within 3 h of emission, while cloud adjustments continue to evolve over periods of 10 h or more. Cloud fraction is increased within the early life of ship tracks, with the formation of ship tracks in otherwise clear skies indicating that around 5%–10% of clear-sky cases in this region may be aerosol-limited. The liquid water path (LWP) enhancement and the Nd–LWP sensitivity are also time dependent and strong functions of the background cloud and meteorological state. The near-instant response of the LWP within ship tracks may be evidence of a bias in estimates of the LWP response to aerosol derived from natural experiments. These results highlight the importance of temporal development and the background cloud field for quantifying the aerosol impact on clouds, even in situations where the aerosol perturbation is clear.
APA, Harvard, Vancouver, ISO, and other styles
36

Robertson, Stephanie. "Microstructural manipulation by laser irradiation of prepared samples : The ’Snapshot Method’." Licentiate thesis, Luleå tekniska universitet, Produkt- och produktionsutveckling, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:ltu:diva-76150.

Full text
Abstract:
Various metallurgical microstructures and their formation are studied in this thesis by using a laser beam to melt a variety of materials with different chemical compositions over a range of thermal cycles. The laser beam was used conventionally in a narrow gap multi-layer weld, used for welding large depths with filler wire additions, as well as a non-traditional simulated welding approach labelled here as the Snapshot method. In laser beam welding, materials go through rapid heating and cooling cycles that are difficult to mimic by other techniques. In welding, any microstructural development depends on complex combinations of chemistry and thermal cycles but is also influenced by melt flow behavior. In turn, microstructural morphologies influence the mechanical behavior which can suffer due to inappropriate microstructural constituents. The Snapshot method, through control of thermal cycling and material composition, can achieve the same rates while guiding microstructural development to form tailored properties. The tunable laser beam properties can be exploited to develop an experimental welding simulation (Snapshot method), which enables the complex interlinked chemical and thermal events which take place during welding to be studied in a controlled manner. Exploring the microstructural relationships to their thermal history provides a greater knowledge into tailoring microstructural compositions to obtain various required mechanical properties for laser welding, additive manufacturing and also non-laser welding techniques. The feasibility of the Snapshot method is investigated in the three appended journal publications. High speed imaging and thermal recording have proved to be essential tools in this work, with analysis from optical microscopy and EDX/EDS to provide additional support. The Snapshot method is introduced as a concept in Papers I and II, demonstrating successfully guided thermal histories after obtaining molten material. 
Application of second and third heating cycles, reheating the structure without melting, yielded altered microstructures. Reaching the austenitisation temperature range allowed for the simulation of complex multi-layer welding thermal histories. Geometrically non-uniform material additions are utilized in Paper III, which investigated the formation of microstructures through the chemical composition route. New chemical compositions were obtained by different degrees of dilution of the weld filler wire by the base material.
APA, Harvard, Vancouver, ISO, and other styles
37

Moore, Ashley Hale. "Moments." The Ohio State University, 2011. http://rave.ohiolink.edu/etdc/view?acc_num=osu1306709446.

Full text
APA, Harvard, Vancouver, ISO, and other styles
38

Katebi, Ataur Rahim. "Supporting snapshots in a log-based file system." [Gainesville, Fla.] : University of Florida, 2004. http://purl.fcla.edu/fcla/etd/UFE0008900.

Full text
APA, Harvard, Vancouver, ISO, and other styles
39

Moh, Chuang-Hue 1975. "Snapshots in a distributed persistent object storage system." Thesis, Massachusetts Institute of Technology, 2003. http://hdl.handle.net/1721.1/87361.

Full text
APA, Harvard, Vancouver, ISO, and other styles
40

Persson, Andreas, and Lukas Landenstad. "Explaining change : Comparing network snapshots for vulnerability management." Thesis, Blekinge Tekniska Högskola, Institutionen för datalogi och datorsystemteknik, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-16710.

Full text
Abstract:
Background. Vulnerability management makes it easier for companies to find, manage and patch vulnerabilities in a network. This is done by scanning the network for known vulnerabilities. The amount of information collected during the scans can be large and prolong the analysis process of the findings. The results of found vulnerabilities are usually presented as a trend in the number of found vulnerabilities over time. These trends do not explain the cause of change in found vulnerabilities. Objectives. The objective of this thesis is to investigate how to explain the cause of change in found vulnerabilities, by comparing vulnerability scanning reports from different points in time. Another objective of this thesis is to create an automated system that connects changes in vulnerabilities to specific events in the network. Methods. A case study was conducted where three reports, from vulnerability scans of Outpost24's internal test network, were examined in order to understand the structure of the reports and map them to events. To complement the case study, an additional simulated test network was set up in order to conduct self-defined tests and obtain higher accuracy when identifying the cause of change in found vulnerabilities. Results. The observations made in the case study provided us with information on how to parse the data and how to identify the cause of change with a rule-based system. Interpretation of the data was done and the changes were grouped into three categories: added, removed or modified. After conducting the test cases, the results were then interpreted to find signatures in order to identify the cause of change in vulnerabilities. These signatures were then made into rules, implemented into a proof-of-concept tool. The proof-of-concept tool compared scan reports in pairs in order to find differences.
These differences were then matched against the rules, and if a difference did not match any rule, the change in the report was flagged as an "unexplained" change. The proof-of-concept tool was then used to investigate the cause of change between the reports from the case study. The framework was validated by evaluating the rules gathered from the simulated test network on the data from the case study. Furthermore, a domain expert verified that the identified causes were accurate by manually comparing the vulnerability reports from the case study. Conclusions. It is possible to identify the cause of change in found vulnerabilities from vulnerability scan reports by constructing signatures for events and using these signatures as rules. This can also be implemented automatically, as software, in order to identify the cause of change faster than manual labor.
APA, Harvard, Vancouver, ISO, and other styles
41

Misyak, Sarah A. "Development of a SNP Assay for the Differentiation of Allelic Variations in the mdx Dystrophic Mouse Model." Thesis, Virginia Tech, 2008. http://hdl.handle.net/10919/32325.

Full text
Abstract:
The purpose of this study was to develop a SNaPshot® assay to simultaneously discriminate between the dystrophic and wild type (wt) alleles in mdx mice. The mdx mouse is an animal model for Duchenne muscular dystrophy (DMD), a severe and fatal muscle wasting disease. To evaluate possible treatments and to carry out genetic studies, it is essential to distinguish between mice that carry the mutant dystrophic or wt allele(s). The current Amplification-Resistant Mutation System (ARMS) assay used to genotype mdx mice is labor intensive and sometimes fails to yield typing results, which reduces its efficiency as a screening tool. An alternative assay based on single nucleotide polymorphism (SNP) extension technology (i.e., SNaPshot®) would be advantageous because its specificity and capability to be automated would reduce the labor involved and increase the fidelity of each assay. A SNaPshot® assay has been developed that provides a robust and potentially automatable assay that discriminates between the wt and dystrophic alleles. The assay has been optimized to use: undiluted DNA in the PCR, a 0.1 µM PCR primer concentration, a full PCR product for the SNP extension reaction, a 50 °C annealing temperature for the SNP extension in accordance with standard SNaPshot® conditions, and a 0.4 µM concentration of the SNP extension primer. The advantages of the resultant SNaPshot® assay over the ARMS assay include higher fidelity, robustness, more consistent performance within and among laboratories, and reduced risk of human error.
Master of Science
APA, Harvard, Vancouver, ISO, and other styles
42

Gorman, Alistair S. "Snapshot spectral imaging using image replication and birefringent interferometry : principles and applications." Thesis, Heriot-Watt University, 2011. http://hdl.handle.net/10399/2480.

Full text
Abstract:
This thesis explores the image-replicating imaging spectrometer (IRIS). This relatively recent invention is a two-dimensional, snapshot spectral-imaging technology, capable of recording the spectral and spatial data from a scene instantaneously. Whereas conventional spectral-imaging technologies require multiple detector frames to record the entire data set, IRIS is able to record the data set in a single frame, a capability which is useful for highly dynamic scenes. The IRIS concept and the design of IRIS systems are explained in detail, and constraints on the performance of IRIS are determined. Practical issues in the use of IRIS systems are identified, and solutions are identified and appraised. Some applications of IRIS are also shown, demonstrating its viability as a spectral imaging technology. Novel aspects of this work include the refinement of the IRIS design, demonstration of a registration algorithm for IRIS, designs for achromatic Wollaston prisms, a comparison of the IRIS technology with conventional spectral imaging technologies, and the application of IRIS to practical problems.
APA, Harvard, Vancouver, ISO, and other styles
43

Tuku, Woldu. "Distributed state estimation using phasor measurement units (PMUs) for a system snapshot." Kansas State University, 2012. http://hdl.handle.net/2097/14129.

Full text
Abstract:
Master of Science
Department of Electrical and Computer Engineering
Noel N. Schulz
As the size of electric power systems increases, the techniques to protect, monitor and control them are becoming more sophisticated. Government, utilities and various organizations are striving to have a more reliable power grid. Various research projects are working to minimize risks on the grid. One of the goals of this research is to discuss a robust and accurate state estimation (SE) of the power grid. Utilities are encouraging teams to change from the conventional way of state estimation to real-time state estimation. Currently most utilities use traditional centralized SE algorithms for transmission systems. Although the traditional methods have been enhanced with advancements in technology, including PMUs, most of these advances have remained localized within individual utility state estimation. There is an opportunity to establish a coordinated SE approach that integrates PMU data across a system, including multiple utilities, using Distributed State Estimation (DSE). This coordination will minimize cascading effects on the power system. DSE could be one of the best options to minimize the required communication time and to provide accurate data to the operators. This project introduces DSE techniques with the help of PMU data for a system snapshot. The proposed DSE algorithm splits the traditional central state estimation into multiple local state estimations and shows how to reduce calculation time compared with centralized state estimation. Additionally, these techniques can be implemented in micro-grids or islanded systems.
APA, Harvard, Vancouver, ISO, and other styles
44

Alverson, Matthew. "A Verbal Snapshot of Visual Scrutiny Primarily in the Sphere of Art." VCU Scholars Compass, 2010. http://scholarscompass.vcu.edu/etd/2155.

Full text
Abstract:
Finding and extracting meaning from the world I encounter every day is the primary motivation behind my creativity. Filtering perception and separating the important bits of information by selection or elimination is the crux of this investigation. This process is one of finding rationale in futility and applying meaning to meaningless encounters. The significance of life is not fixed and it is our responsibility to make it up to best suit the desire to have purpose. The way something is looked at determines the meaning behind it. Anything can have content if it is seen and translated a certain way. Aligning this inquiry to the course of painting is how I examine the pieces of information that have potential importance. Painting allows me to slow down, scrutinize, and evaluate the way I perceive reality. Every image seems unrelated but is actually connected by an undercurrent of doubt at every level of creation.
APA, Harvard, Vancouver, ISO, and other styles
45

Reichwaldt, Kai. "Snapshots from Between : Non-binary identity construction on Instagram." Thesis, Umeå universitet, Umeå centrum för genusstudier (UCGS), 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-153425.

Full text
Abstract:
Though technically not a new phenomenon historically or geographically, non-binary identities are starting to be more visible in Sweden, and have become a topic of both discussion concerning human rights and anti-discriminatory practices, but also mockery. In this thesis I wanted to have a closer look at how Swedish people identifying as neither wholly male nor female present themselves on social media in text and image, as well as how they describe the resistance they meet in regards to their identity and choices of presentation. Its theoretical basis lies in works of Judith Butler, Jack Halberstam and Lee Edelman to examine which societal structures the subjects of this study have to relate to. The source of my empirical material are ten Instagram accounts, which were analysed via an integrated discourse psychology/discourse theory approach. During the period of material collection, an incident in the shape of a public debate concerning trans questions had a considerable impact on the lives of the subjects of this study, consequentially making it a significant theme of the thesis. The results show the difficulty of trying to hold a balance between or outside the gender and/or sex binary in a society which only recognizes male and female, as well as the conflicts of identity which can arise when one’s gender identity clashes with other important personal values.
APA, Harvard, Vancouver, ISO, and other styles
46

Barr, Kenneth C. (Kenneth Charles) 1978. "Summarizing multiprocessor program execution with versatile, microarchitecture-independent snapshots." Thesis, Massachusetts Institute of Technology, 2006. http://hdl.handle.net/1721.1/38224.

Full text
Abstract:
Thesis (Ph. D.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 2006.
Includes bibliographical references (p. 131-137).
Computer architects rely heavily on software simulation to evaluate, refine, and validate new designs before they are implemented. However, simulation time continues to increase as computers become more complex and multicore designs become more common. This thesis investigates software structures and algorithms for quickly simulating modern cache-coherent multiprocessors by amortizing the time spent to simulate the memory system and branch predictors. The Memory Timestamp Record (MTR) summarizes the directory and cache state of a multiprocessor system in a compact data structure. A single MTR snapshot is versatile enough to reconstruct the microarchitectural state resulting from various coherence protocols and cache organizations. The MTR may be quickly updated by each simulated processor during a fast-forwarding phase and optionally stored off-line for reuse. To fill large branch prediction tables, we introduce Branch Predictor-based Compression (BPC) which compactly stores a branch trace so that it may be used to fill in any branch predictor structure. An entire BPC trace requires less space than single discrete predictor snapshots, and it may be decompressed 3-6x faster than performing functional simulation.
by Kenneth C. Barr.
Ph.D.
APA, Harvard, Vancouver, ISO, and other styles
47

Labuschagne, Jeanine. "Molecular methods for genotyping selected detoxification and DNA repair enzymes / J. Labuschagne." Thesis, North-West University, 2010. http://hdl.handle.net/10394/4599.

Full text
Abstract:
The emerging field of personalized medicine and the prediction of side effects experienced due to pharmaceutical drugs is being studied intensively in the post genomic era. The molecular basis of inheritance and disease susceptibility is being unravelled, especially through the use of rapidly evolving new technologies. This in turn facilitates analyses of individual variations in the whole genome of both single subjects and large groups of subjects. Genetic variation is a common occurrence and although most genetic variations do not have any apparent effect on the gene product some do exhibit effects, such as an altered ability to detoxify xenobiotics. The human body has a highly effective detoxification system that detoxifies and excretes endogenous as well as exogenous toxins. Numerous studies have proved that specific genetic variations have an influence on the efficacy of the metabolism of pharmaceutical drugs and consequently the dosage administered. The primary aim of this project was the local implementation and assessment of two different genotyping approaches, namely the Applied Biosystems SNaPshot technique and the Affymetrix DMET microarray. A secondary aim was to investigate if links could be found between the genetic data and the biochemical detoxification profile of participants. I investigated the approaches and gained insight into which method would be better for specific local applications, taking into consideration the robustness and ease of implementation as well as cost effectiveness in terms of data generated. The final study cohort comprised 18 participants whose detoxification profiles were known. Genotyping was performed using the DMET microarray and SNaPshot techniques. The SNaPshot technique was used to genotype 11 SNPs relating to DNA repair and detoxification and was performed locally. Each DMET microarray delivers significantly more data in that it genotypes 1931 genetic markers relating to drug metabolism and transport.
Due to the absence of a local service supplier, the DMET microarrays were outsourced to DNALink in South Korea. DNALink generated raw data which was analysed locally. I experienced many problems with the implementation of the SNaPshot technique. Numerous avenues of troubleshooting were explored with varying degrees of success. I concluded that SNaPshot technology is not the best-suited approach for genotyping. Data obtained from the DMET microarray was fed into the DMET Console software to obtain genotypes and subsequently analysed with the help of the NWU statistical consultation services. Two approaches were followed: firstly, clustering the data and, secondly, a targeted gene approach. Neither of the two methods was able to establish a relationship between the DMET genotyping data and the detoxification profiling. For future studies to successfully correlate SNPs or SNP groups and a specific detoxification profile, two key issues should be addressed: i) The procedure for determining the detoxification profile following substrate loading should be further refined by more frequent sampling after substrate loading. ii) The number of participants should be increased to provide statistical power that will enable a true representation of the particular genetic markers in the specific population. Statistical analyses, such as latent class analyses to cluster the participants, will also be of much more use for data analysis and interpretation if the study is not underpowered.
Thesis (M.Sc. (Biochemistry))--North-West University, Potchefstroom Campus, 2011.
APA, Harvard, Vancouver, ISO, and other styles
48

Santos, Simone Teixeira Bonecker dos. "Estudo da estrutura e filogenia da população do Rio de Janeiro através de SNPs do cromossomo Y." reponame:Repositório Institucional da FIOCRUZ, 2010. https://www.arca.fiocruz.br/handle/icict/6306.

Full text
Abstract:
Fundação Oswaldo Cruz, Instituto Oswaldo Cruz, Rio de Janeiro, RJ, Brasil
The Brazilian population is highly admixed, the result of a relatively recurrent and recent process. Millions of indigenous people had been living here when the colonization process began, initially involving mainly Portuguese men. The immigration of European women during the first centuries was insignificant, making marriage between European men and indigenous women common; hence began the ethnic heterogeneity found in our population nowadays. Subsequently, with the arrival of slaves during the economic cycle of sugarcane, relationships between Europeans and Africans began. Basically, it is a tri-hybrid population with contributions from other groups, such as Italians, Germans, Syrians, Lebanese and Japanese. For a better understanding of Brazilian phylogenetic roots, biallelic markers of the non-recombining region of the Y chromosome were used in this study. The goal was to analyze how these heterogeneous groups contributed to the genetic pool found present-day in the population of Rio de Janeiro, and thus contribute to the understanding of migratory movements in the process of structuring this population. We analyzed, through multiplex minisequencing, 13 single nucleotide polymorphisms (SNPs), through which it was possible to identify nine haplogroups and four sub-haplogroups in a sample of 200 unrelated individuals, residents of the State of Rio de Janeiro, chosen randomly among participants of paternity studies. Of the haplogroups examined, only R1a was not observed in our population. The most frequent haplogroup was of European origin, R1b1, at 51%, while the least frequent, at 1%, was Q1a3a, found among Native Amerindians. 85% of the Y chromosomes analyzed were of European origin, 10.5% of African origin and 1% of Native Amerindian origin; the rest did not have their origin defined. In this study sample, the vast majority of Y chromosomes proved to be of European origin.
Indeed, there were no significant differences when the haplogroup frequencies in Brazil and Portugal were compared by means of an exact test of population differentiation. These results corroborate historical data on the founding of the population of Rio de Janeiro during the 16th century, a period in which a significant reduction of the Amerindian population was observed, with important demographic contributions from the Sub-Saharan region of Africa and from Europe, particularly the Portuguese. In view of the high degree of admixture of the Brazilian population and advances in personalized medicine, studies of human genetic structure have fundamental implications for understanding evolution and the impact on human diseases, since, for this approach, skin color is an unreliable predictor of an individual's ethnic ancestry.
APA, Harvard, Vancouver, ISO, and other styles
49

Monteiro, Guilherme Arthur Brunet. "Estratégia de manutenção em uma oficina de cilindros de laminação de aços longos." Universidade Federal de Pernambuco, 2013. https://repositorio.ufpe.br/handle/123456789/12200.

Full text
Abstract:
The steel market as a whole undergoes constant change driven by new entrants and market fluctuations, and in an extremely competitive context, steel producers follow an arduous path in the relentless pursuit of globally competitive costs. This relentless pursuit of cost reduction prompts the revision of standards and concepts about the business, giving rise to new ideas and new ways of doing what has long been done in the same way. This dissertation presents the application of a methodology based on maintenance concepts applied to a Roll Shop in a long-steel rolling mill. Operating with assembly, disassembly, calibration and operational adjustments, a Roll Shop involves considerable interaction with the recovery and replacement of items, strongly based on operational inspections. The proposed model addresses these inspections, making clear the frequency, parameters and sequencing of the activity, in addition to employing preventive maintenance on specific assemblies, all based on historical data obtained with snapshots, analysing the behavior of failures and breakdowns, and allowing the type of intervention to be decided, divided into technology, methodology, periodicity or frequency, the last two being obtained with the concept of delay time analysis.
APA, Harvard, Vancouver, ISO, and other styles
50

Möller, Björn. "Full frame 3D snapshot : Possibilities and limitations of 3D image acquisition without scanning." Thesis, Linköping University, Department of Electrical Engineering, 2005. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-2857.

Full text
Abstract:

An investigation was initiated, targeting snapshot 3D image sensors, with the objective to match the speed and resolution of a scanning sheet-of-light system, without using a scanning motion. The goal was a system capable of acquiring 25 snapshot images per second from a quadratic scene with a side from 50 mm to 1000 mm, sampled in 512×512 height measurement points, and with a depth resolution of 1 µm and beyond.

A wide search of information about existing 3D measurement techniques resulted in a list of possible schemes, each presented with its advantages and disadvantages. No single scheme proved successful in meeting all the requirements. Pulse modulated time-of-flight is the only scheme capable of depth imaging by using only one exposure. However, a resolution of 1 µm corresponds to a pulse edge detection accuracy of 6.67 fs when visible light or other electromagnetic waves are used. Sequentially coded light projections require a logarithmic number of exposures. By projecting several patterns at the same time, using for instance light of different colours, the required number of exposures is reduced even further. The patterns are, however, not as well focused as a laser sheet-of-light can be.

Using powerful architectural concepts such as matrix array picture processing (MAPP) and near-sensor image processing (NSIP) a sensor proposal was presented, designed to give as much support as possible to a large number of 3D imaging schemes. It allows for delayed decisions about details in the future implementation.

It is necessary to relax at least one of the demands for this project in order to realise a working 3D imaging scheme using current technology. One of the candidates for relaxation is the most obvious demand of snapshot behaviour. Furthermore, there are a number of decisions to make before designing an actual system using the recommendations presented in this thesis. The ongoing development of electronics, optics, and imaging schemes might be able to meet the 3D snapshot demands in the near future. The details of light sensing electronics must be carefully evaluated and the optical components such as lenses, projectors, and fibres should be studied in detail.

APA, Harvard, Vancouver, ISO, and other styles