Dissertations / Theses on the topic 'Trace analysis'

Consult the top 50 dissertations / theses for your research on the topic 'Trace analysis.'


1

Shepard, Timothy Jason. "TCP packet trace analysis." Thesis, Massachusetts Institute of Technology, 1990. http://hdl.handle.net/1721.1/13577.

Full text
Abstract:
Thesis (M.S.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 1990.
Includes bibliographical references (leaves 66-67).
by Timothy Jason Shepard.
M.S.
APA, Harvard, Vancouver, ISO, and other styles
2

Gallouzi, M. Souheil. "Trace analysis of LOTOS behaviours." Thesis, University of Ottawa (Canada), 1989. http://hdl.handle.net/10393/5831.

Full text
3

Ezust, S. Alan (Sam Alan). "Tango : the trace analysis generator." Thesis, McGill University, 1995. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=22729.

Full text
Abstract:
This thesis describes the development of an automatic generator of backtracking protocol trace analysis tools for single-module specifications written in the formal description language Estelle. The generated tool automatically checks the validity of any execution trace against the given specification. The approach taken was to modify an Estelle-to-C++ compiler. Discussions of nondeterministic specifications, multiple observation points, and on-line trace analysis follow. Trace analyzers for the protocols LAPD and TP0 have been tested, and their performance results are evaluated. Issues in the analysis of partial traces are discussed.
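The core of such a generated tool, stripped of everything Estelle-specific, is a replay of the trace against the specification's transition relation. A minimal sketch with a hypothetical toy protocol (the state names and table below are illustrative, not taken from the thesis):

```python
# Minimal sketch of trace validity checking against a state-machine
# specification (hypothetical toy protocol, not Estelle or Tango itself).

SPEC = {  # (state, event) -> next state
    ("idle", "connect_req"): "wait_ack",
    ("wait_ack", "connect_ack"): "connected",
    ("connected", "data"): "connected",
    ("connected", "disconnect"): "idle",
}

def valid(trace, state="idle"):
    """Replay events; the trace is valid if every step is allowed."""
    for event in trace:
        state = SPEC.get((state, event))
        if state is None:
            return False
    return True

print(valid(["connect_req", "connect_ack", "data", "disconnect"]))  # True
print(valid(["connect_req", "data"]))                               # False
```

A backtracking analyzer for a nondeterministic specification generalizes this by tracking a *set* of possible states per step instead of a single one.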
4

Zeng, Zuotao. "Trace analysis by molecular spectroscopy." Thesis, University of Huddersfield, 2001. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.368320.

Full text
Abstract:
This thesis describes new analytical methods for trace or ultra-trace analysis by molecular absorption and emission spectroscopy. The initial part of the thesis is devoted to an introduction to molecular electromagnetic absorption spectroscopy and molecular fluorescence. The principles, advantages and limitations of spectrophotometric and fluorimetric measurement are discussed. The concepts of enzymes and their applications in analytical chemistry are also expounded. Two organic compounds, 5-(4-arsonophenylazo)-8-(p-toluenesulfonamido)quinoline (APTSQ) and 5-(p-methoxyphenylazo)-8-(p-toluenesulfonamido)quinoline (MPTSQ), have been synthesised and used as new chromogenic and/or fluorogenic reagents. Five specific, highly sensitive, simple, precise and inexpensive novel analytical methods have been developed: (1) A spectrophotometric method is described for the determination of cobalt. The maximum absorbance is at 582 nm with a molar absorptivity of 1.18 × 10⁵ l mol⁻¹ cm⁻¹. Beer's law is obeyed for cobalt concentrations in the range 0–0.5 µg ml⁻¹. (2) A fluorimetric method for the determination of cobalt is proposed. The fluorescence intensity is measured at λex 287 nm and λem 376 nm. The response is linear up to 25 ng ml⁻¹ and the detection limit is 0.002 ng ml⁻¹. The mechanism of the fluorescence reaction has been investigated. (3) A fluorimetric method is proposed for the determination of H₂O₂. The response is linear up to 12.2 ng ml⁻¹ H₂O₂. The detection limit is 0.16 ng ml⁻¹. (4) An enzymatic assay for glucose by spectrofluorimetry is described. The fluorescence intensity is proportional to the concentration of glucose up to 180 ng ml⁻¹. A detection limit of 5.4 ng ml⁻¹ was obtained and allowed the determination of glucose in an extremely small amount of serum (0.5 µl) and urine (1 µl). (5) A fluorimetric method for the determination of iron is proposed, based on the reaction between iron(III) and MPTSQ in the presence of cetyltrimethylammonium bromide. The fluorescence intensity (λex = 317 nm, λem = 534 nm) is linear up to 170 ng ml⁻¹ with a detection limit of 0.12 ng ml⁻¹. An investigation of the mechanism of the fluorescence reaction has been made. The applications of the proposed methods for the determination of the concerned analytes at low levels in biological, environmental, pharmaceutical or beverage samples are also reported.
5

Chimpalee, Narong. "Studies in inorganic trace analysis." Thesis, Queen's University Belfast, 1989. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.336038.

Full text
6

Zhou, Yang. "Execution Trace Visualization for Java Pathfinder using Trace Compass." Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-286313.

Full text
Abstract:
Multi-threading is commonly applied in modern computer programs, bringing many conveniences but also causing concurrency issues. Among the various error-debugging tools, Java Pathfinder (JPF) can detect latent errors in multithreaded Java programs through model checking. However, the text-based format of the output trace is hard to read, and previous attempts at visualizing JPF traces have shown limitations. For long-term development, a popular trace analysis platform, Trace Compass (TC), is extended to accept JPF traces. In this thesis, the development of JPF and TC makes it possible to analyze JPF traces in TC through a user interface with visual diagrams. The development resolves the conceptual differences between the tools and successfully visualizes important trace data. The implementation helps provide a generic approach for analyzing JPF traces with visualization.
7

Tetland, Mikkel. "Trace element analysis of placer gold." Thesis, University of British Columbia, 2015. http://hdl.handle.net/2429/52893.

Full text
Abstract:
Trace element analysis provides an efficient method of fingerprinting placer gold populations in order to characterize bedrock-source hypogene mineralization style and the effects of supergene gold mobilization in surficial settings. LA-ICP-MS analysis, using the AuRM2 gold reference material as an external standard, measured concentrations of Mg, Al, Ti, V, Mn, Fe, Ni, Cu, Zn, As, Se, Rh, Pd, Sn, Sb, Te, Pt, Hg, Pb, Bi, and U in placer gold samples from the Prophet Mine (Australia), the Nus River (Colombia), Piaba (Brazil), and four locations in British Columbia. Additional reference materials NIST610 and FAU7 were used to quantify the micro-homogeneity of AuRM2, monitor precision, and confirm accuracy. The AuRM2 reference material had not previously been utilized for micro-analysis, but is shown to be sufficiently homogeneous at a micro-scale (64-108 µm). Precision of analyses of AuRM2 ranges between ±10 and 15%, and accuracy is better than ±10%. Semi-quantitative concentrations for V, Hg, and U (0.3, 3.7, and 0.1 ppm respectively) were determined in AuRM2, along with Hg in NIST610 (0.6 ppm). Quantitative concentrations were determined for Se, Rh, Sb, and Te (24.9, 0.1, 0.3, and 2 ppm respectively) in FAU7. Placer gold in the central Okanagan from Mission Creek and the Winfield paleoplacer shares the same hypogene trace element signature, indicating that modern-day placer gold in Mission Creek is reworked from Miocene paleoplacers similar to the Winfield occurrence. Samples from Lambly Creek, also in the central Okanagan, exhibit two discrete trace element populations of native gold; one has elevated Cu indicative of greenstone-hosted orogenic gold, the other has a low-Cu signature similar to intrusive-hosted mineralization. Both groups contain primary mineral inclusions and limited supergene rims, indicating two proximal hypogene gold sources within the catchment of Lambly Creek. The samples from the Prophet Mine contain biologically precipitated supergene gold.
Gold-rich rims are up to 100 µm thick indicating significant supergene gold accumulation in this placer occurrence. Analyses of regions of gold grains containing primary hypogene inclusions retain a distinct hypogene signature rich in base metals (Ni, Zn, Pb, and V) whereas others had an indistinct signature of supergene gold richer in Sb and Bi.
Irving K. Barber School of Arts and Sciences (Okanagan)
Earth and Environmental Sciences, Department of (Okanagan)
Graduate
8

Waly, Hashem. "Automated Fault Identification - Kernel Trace Analysis." Thesis, Université Laval, 2011. http://www.theses.ulaval.ca/2011/28246/28246.pdf.

Full text
9

Talebpour, Alireza. "Texture analysis using the trace transform." Thesis, University of Surrey, 2004. http://epubs.surrey.ac.uk/804946/.

Full text
10

Feltham, David John. "Trace element studies by proton microprobe analysis." Thesis, University of Oxford, 1990. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.258766.

Full text
11

Juckeland, Guido. "Trace-based Performance Analysis for Hardware Accelerators." Doctoral thesis, Saechsische Landesbibliothek- Staats- und Universitaetsbibliothek Dresden, 2013. http://nbn-resolving.de/urn:nbn:de:bsz:14-qucosa-105859.

Full text
Abstract:
This thesis presents how performance data from hardware accelerators can be included in event logs. It extends the capabilities of trace-based performance analysis to also monitor and record data from this novel parallelization layer. The increasing awareness of the power consumption of computing devices has led to an interest in hybrid computing architectures as well. High-end computers, workstations, and mobile devices start to employ hardware accelerators to offload computationally intense and parallel tasks, while at the same time retaining a highly efficient scalar compute unit for non-parallel tasks. This execution pattern is typically asynchronous so that the scalar unit can resume other work while the hardware accelerator is busy. Performance analysis tools provided by the hardware accelerator vendors cover the situation of one host using one device very well. Yet, they do not address the needs of the high performance computing community. This thesis investigates ways to extend existing methods for recording events from highly parallel applications to also cover scenarios in which hardware accelerators aid these applications. After introducing a generic approach that is suitable for any API-based acceleration paradigm, the thesis derives a suggestion for a generic performance API for hardware accelerators and its implementation with NVIDIA CUPTI. In a next step, the visualization of event logs containing data from execution streams on different levels of parallelism is discussed. In order to overcome the limitations of classic performance profiles and timeline displays, a graph-based visualization using Parallel Performance Flow Graphs (PPFGs) is introduced. This novel technical approach uses program states in order to display similarities and differences between the potentially very large number of event streams and, thus, enables a fast way to spot load imbalances.
The thesis concludes with an in-depth case study of PIConGPU, a highly parallel, multi-hybrid plasma physics simulation, which benefited greatly from the developed performance analysis methods.
12

Dangolle, Champa D. P. "Some aspects of trace analysis of metals." Thesis, Queen's University Belfast, 1996. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.318885.

Full text
13

Fernandes, A. R. "Trace analysis and chemistry of polychlorinated biphenyls." Thesis, University of East Anglia, 1988. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.384580.

Full text
14

Abubaker, Mokhtar Mohamed. "Some novel methods in ultra-trace analysis." Thesis, University of Salford, 1994. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.386511.

Full text
15

Pallavicini, Nicola. "Method development for isotope analysis of trace and ultra-trace elements in environmental matrices." Doctoral thesis, Luleå tekniska universitet, Institutionen för samhällsbyggnad och naturresurser, 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:ltu:diva-59705.

Full text
Abstract:
The increasing load of toxic elements entering ecosystems as a consequence of anthropogenic processes has raised public awareness in recent decades, resulting in a great number of studies focusing on pollution sources, transport, distribution, interactions with living organisms, and remediation. The physical and chemical processes that drive the uptake, assimilation, compartmentation and translocation of heavy metals in biota have received a great deal of attention recently, since elemental concentrations and isotopic composition in biological matrices can be used as probes of both natural and anthropogenic sources. Further, they can help to evaluate the fate of contaminants and to assess the bioavailability of such elements in nature. While poorly defined isotopic pools, multiple sources and fractionating processes add complexity to source identification studies, tracing is hindered mainly by poorly known or unidentified fractionating factors. High-precision isotope ratio measurements have found increasing application in various branches of science, from classical isotope geochronology to complex multi-tracer experiments in environmental studies. Instrumental development and refined separation schemes have allowed higher-quality data to be obtained and have played a major role in the recent progress of the field. The use of modern techniques such as inductively coupled plasma sector field mass spectrometry (ICP-SFMS) and multi-collector inductively coupled plasma mass spectrometry (MC-ICP-MS) for trace and ultra-trace element concentration and isotope ratio measurements has given new opportunities. However, sources of errors must be accurately evaluated and avoided at every procedural step. Moreover, even with sound analytical measurement protocols, source and process tracing in natural systems can be complicated further by spatial and temporal variability.
The work described in the present thesis has focused primarily on analytical method development, optimization and evaluation (including sample preparation, matrix separation, instrumental analysis and data evaluation stages) for isotopic and multi-elemental analyses of environmental samples at trace and ultra-trace levels. Special attention was paid to evaluating the strengths and limitations of the methods as applied to complex natural environments, aiming at correct interpretation of isotopic results in environmental forensics. The analytical protocols covered several isotope systems of both stable (Cd, B, Cr, Cu, Fe, Tl and Zn) and radiogenic (Os, Pb and Sr) elements. Paper I was dedicated to the optimization and testing of a rapid, high-sample-throughput method for Os concentration and isotope measurements by ICP-SFMS. While microwave (MW) digestion followed by sample introduction to ICP-SFMS by traditional solution nebulization (SN) offered unparalleled throughput, important for processing large numbers of samples, high-pressure ashing (HPA) combined with gas-phase introduction (GPI) proved advantageous for samples with low (below 500 pg) analyte content. The method was applied to a large-scale bio-monitoring case, confirming accumulation of anthropogenic Os in animals from an area affected by emissions from a stainless steel foundry. The method for Cr concentrations and isotope ratios in different environmental matrices was optimized in Paper II. Coupling a high-pressure/temperature acid digestion with a one-pass, single-column matrix separation allowed the analysis of chromites, soils, and biological matrices (the first Cr isotope study in lichens and mosses) by ICP-SFMS and MC-ICP-MS. With an overall reproducibility of 0.11‰ (2σ), the results suggested a uniform isotope composition in soil depth profiles.
On the other hand a strong negative correlation found between δ53Cr and Cr concentrations in lichens and mosses indicates that airborne Cr from local anthropogenic source(s) is depleted in heavy isotopes, therefore highlighting the possibility of utilization of Cr isotopes to trace local airborne pollution source from steel foundries.   Paper III describes development of high-precision Cd isotope ratio measurement by MC-ICP-MS in a variety of environmental matrices. Several digestion methods (HPA, MW, ultrawave and ashing) were tested for sample preparation, followed by analyte separation from matrix using ion-exchange chromatography. The reproducibility of the method (2σ for δ114Cd/110Cd) was found to be better than 0.1‰. The method was applied to a large number of birch leaves (n>80) collected at different locations and growth stages. Cd in birch leaves is enriched in heavier isotopes relative to the NIST SRM 3108 Cd standard with a mean δ114Cd/110Cd of 0.7‰. The fractionation is assumed to stem from sample uptake through the root system and element translocation in the plant and it exhibits profound between-tree as well as seasonal variations. The latter were compared with seasonal isotopic variations for other isotopic systems (Zn, Os, Pb) in the same trees to aid a better understanding of underlying processes. In Paper IV the number of isotope systems studied was extended to include B, Cd, Cu, Fe, Pb, Sr, Tl and Zn. The analytical procedure utilized a high pressure acid digestion (UltraCLAVE), which provides complete oxidation of the organic material in biological samples, and a two-column ion-exchange separation which represents further development of the separation scheme described in Paper III. Such sample preparation ensures low blank levels, efficient separation of matrix elements, sufficiently high analyte recoveries and reasonably high sample throughput. 
The method was applied to a large number of biological samples (n>240), and the data obtained represent the first combined characterization of variability in isotopic composition for eight elements in leaves, needles, lichens and mushrooms collected from a geographically confined area. To further explore the reasons for the observed variability, soil profiles from the same area were analyzed for both concentrations and isotopic compositions of B, Cd, Cr, Cu, Fe, Pb, Sr, Tl and Zn in Paper V. Results of this study suggest that the observed high variability can depend on operationally defined fractions (assessed by applying a modified SEP to process soil samples) and on the typology of the individual matrix analyzed (assessed through coupling the soil profile results to those obtained for other matrices: lysimetric waters, mushrooms, litter, needles, leaves and lichens). The method development conducted in this work highlights the importance of considering all possible sources of biases and errors, as well as the possibility of using overlapping sample preparation schemes for multi-isotope studies. The results obtained for different environmental matrices represent a starting point for discussing the role of natural isotopic variability in isotope applications and forensics, and the importance of in-depth knowledge of the multiple parameters affecting the variability observed.
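For readers unfamiliar with the notation, δ values such as δ53Cr and δ114Cd/110Cd in this abstract follow the standard per-mil convention for isotope ratios, written here for the Cd case against the NIST SRM 3108 standard the abstract names:

```latex
\delta^{114}\mathrm{Cd}/^{110}\mathrm{Cd} =
  \left(
    \frac{\left(^{114}\mathrm{Cd}/^{110}\mathrm{Cd}\right)_{\mathrm{sample}}}
         {\left(^{114}\mathrm{Cd}/^{110}\mathrm{Cd}\right)_{\mathrm{NIST\;SRM\;3108}}}
    - 1
  \right) \times 1000\,\text{‰}
```

A positive δ thus means the sample is enriched in the heavier isotope relative to the standard, which is how the abstract's mean of 0.7‰ in birch leaves should be read.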
16

Al-Attar, A. F. "Selenium and trace metals as pollutants." Thesis, University of Bristol, 1987. http://hdl.handle.net/1983/1858b91b-362e-422f-b91c-84aa44e23e90.

Full text
17

Fuller, Barry Stanley. "A trace of a trace : an analysis of trauma in Shoah perpetrator auto/biographical narratives." Thesis, University of British Columbia, 2011. http://hdl.handle.net/2429/36658.

Full text
Abstract:
In this study, I attempt to prove that many Shoah perpetrators potentially suffered the effects of trauma and that traces of their traumatisation exist within their auto/biographical narratives. In my endeavour to demonstrate this belief, I first discuss how many Shoah perpetrators were not merely prompted to take part in the Shoah because they were antisemitic, but that they were heavily influenced by the social and political environment around them. However, even though many of the perpetrators took part in othering processes as a response to their interpellation and socialisation, they still suffered from the effects of trauma. I then discuss the reasons why many of the perpetrators wrote auto/biographical narratives, ultimately stating that an unconscious need to work through the effects of trauma was a possible reason for constructing these narratives and that traces of traumatisation do exist within the pages of these discourses. Finally, I make the case that many of the Shoah perpetrators suffered the effects of what Rachel M. MacNair terms Perpetration-Induced Traumatic Stress and that their traumatisation allowed them to perpetuate trauma on others through traumatic reenactment.
18

Nagapattinam, Ramamurthi Ragavendar. "Dynamic Trace-based Analysis of Vectorization Potential of Programs." The Ohio State University, 2012. http://rave.ohiolink.edu/etdc/view?acc_num=osu1339733528.

Full text
19

Rubidge, Gletwyn Robert. "The analysis of trace gas emissions from landfills." Thesis, Port Elizabeth Technikon, 2000. http://hdl.handle.net/10948/d1006341.

Full text
Abstract:
Numerous informal houses have been built on and adjacent to a landfill in iBayi, Port Elizabeth, South Africa, which accepted domestic and industrial waste. Formal housing surrounds most of the site at a greater distance, some 60 m or further from the landfill. Both formally and informally housed residents complain of odours, burning eyes, sore throats and headaches, symptoms which they believed were caused by the landfill. The landfill gas and ambient air were analyzed to classify and quantify the VOCs (volatile organic compounds) emitted, and the quantitative data were then compared with recognised standards to establish whether the residents are at risk. Eighteen target (potentially hazardous) VOCs were quantified. A wide variety of compounds were detected in both the ambient air and the landfill gas. The results of the VOC analyses were similar to those of other workers in both the qualitative and quantitative studies. The concentrations of the VOCs were mostly lower than the TLVs (threshold limit values) but exceeded the MRLs (minimum recommended levels). The combined concentrations of the VOCs in the ambient air either approached or exceeded the limit values for combined exposure, indicating that a potential health hazard exists. One third of the VOCs were detected in both the ambient air and the subsurface gas; however, external pollution sources also appear to contribute to the VOC concentrations in the ambient air. Dangerously high methane concentrations were repeatedly detected in the landfill gas amongst the informal houses. There has been a vast improvement in the aesthetic qualities of the landfill since the restriction, in July 1997, to accept only domestic refuse and building rubble. The ambient air was less odorous and the landfill site less littered. Fewer informal recyclers were present, and their concomitant squabbling over valuables had almost vanished. The management of the iBayi landfill holds much room for improvement.
There is potential for serious injury or even death if no action is taken to remedy the problems at the iBayi landfill. A holistic solution will have to be found to make the landfill a safe neighbour. Some complementary analyses (such as pH, heavy metal concentrations in the water and sediments etc.) were performed on the leachate and water surrounding the landfill.
20

West, M. J. "Applications of Raman Microscopy to Trace Forensic Analysis." Thesis, University of Kent, 2009. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.520885.

Full text
21

Weston, Daniel John. "Membrane inlet mass spectrometry for trace level analysis." Thesis, Nottingham Trent University, 2000. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.310922.

Full text
22

Shtewi, Farag Zaied. "Stopped-flow ultra-trace analysis of boiler waters." Thesis, University of Salford, 1994. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.261988.

Full text
23

Nilsson, Samuel, and Joakim Eriksson. "Estimating Application Energy Consumption Through Packet Trace Analysis." Thesis, Linköpings universitet, Institutionen för datavetenskap, 2014. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-110348.

Full text
Abstract:
The advancement of mobile clients and applications makes it possible for people to always stay connected, sending and receiving data constantly. The nature of the widely used 3G technology, however, causes a high battery drain in cellular phones, and because of that many tools for measuring mobile phones' energy consumption have been developed. In this report we look into the trace-driven tool EnergyBox and find out how we can use it to estimate the energy consumption of 3G transmissions for an application we have developed ourselves. We begin by identifying the types of traffic our application generates and which parts of it make up our application's background traffic. Different combinations of the identified traffic types are examined in order to decide which ones need to be present in the packet traces for an estimation of our application's energy footprint for 3G transmission. Further, we investigate how long the time span over which the packet traces are collected should be, and how many traces are needed in order to draw a conclusion about our application's energy footprint. We conclude that all traffic types responsible for our application's background traffic need to be present in the analyzed packet traces, and data suggest that collecting more than 10 one-minute packet traces does not improve accuracy significantly (less than 1%). Without user interaction, our application generates traffic which, transmitted over 3G, drains as much as an average of 930 mW, meaning that a Samsung Galaxy S4 battery with a capacity of 9.88 Wh would last for a maximum of 10 hours and 30 minutes (excluding other energy-consuming sources inside the handset).
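The battery-life figure at the end of the abstract follows from dividing capacity by average power; a quick sanity check using the numbers quoted in the abstract (the rounding here is ours):

```python
# Back-of-the-envelope battery-life estimate from the abstract's figures.
battery_capacity_wh = 9.88   # Samsung Galaxy S4 battery capacity, from the abstract
avg_power_w = 0.930          # average drain attributed to 3G background traffic

lifetime_h = battery_capacity_wh / avg_power_w
hours = int(lifetime_h)
minutes = int(round((lifetime_h - hours) * 60))
print(f"{hours} h {minutes} min")  # 10 h 37 min; the thesis rounds to about 10 h 30 min
```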
24

Rohman, Joshua. "A Novel Trace Elemental Analysis of Potassium Phosphates." University of Cincinnati / OhioLINK, 2016. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1459243950.

Full text
25

Bolte, Nils-Ole, and Daniel Christopher Goll. "Potential analysis of track-and-trace systems in the outbound logistics of a Swedish retailer." Thesis, Jönköping University, Internationella Handelshögskolan, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:hj:diva-48986.

Full text
Abstract:
Supply chain visibility has become a crucial factor for companies in times of globalization and customer satisfaction. Track-and-trace technologies are important tools for enhancing supply chain visibility. This thesis was written in cooperation with a Swedish retailer and evaluates potential track-and-trace technologies in order to develop a solution that closes the current track-and-trace gap in its outbound logistics. Currently, the handover point between the retailer and the postal service provider is not clearly defined, so that shipments get lost during the transition. Therefore, a literature review of currently used track-and-trace technologies was carried out. Several technologies with a wide price and applicability range are currently in use and have been analysed regarding their strengths and weaknesses. A qualitative study in the form of interviews was conducted within the Swedish market about how this gap could be closed. Empirical findings show that the existing track-and-trace technologies do not provide a best-practice solution. Especially in the field of outbound logistics, several factors and the individual process requirements of a company have to be considered in order to develop an efficient solution that closes the existing track-and-trace gap. Each company has its unique set of challenges, which have to be solved in order to successfully implement a long-lasting tracking solution. The solution also depends heavily on the postal service provider, since all process steps need to be aligned to guarantee the reliability of the data afterwards. In the case of the Swedish retailer, an automated scanning bow with a separate area for outbound parcels is expected to improve the transparency of the handover and lower the total number of lost shipments. The break-even point would be reached within the next few years, so that operational savings could soon be achieved.
Due to the global outbreak of COVID-19, as well as significant problems of the retailer, the practical application could not be tested. It should therefore be part of further academic studies.
APA, Harvard, Vancouver, ISO, and other styles
26

Oladipo, M. O. A. "Trace element analysis of Corinthian pottery and related clays." Thesis, University of Manchester, 1987. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.233057.

Full text
APA, Harvard, Vancouver, ISO, and other styles
27

Knüpfer, Andreas. "Advanced Memory Data Structures for Scalable Event Trace Analysis." Doctoral thesis, Saechsische Landesbibliothek- Staats- und Universitaetsbibliothek Dresden, 2009. http://nbn-resolving.de/urn:nbn:de:bsz:14-ds-1239979718089-56362.

Full text
Abstract:
The thesis presents a contribution to the analysis and visualization of computational performance based on event traces with a particular focus on parallel programs and High Performance Computing (HPC). Event traces contain detailed information about specified incidents (events) during run-time of programs and allow minute investigation of dynamic program behavior, various performance metrics, and possible causes of performance flaws. Due to long running and highly parallel programs and very fine detail resolutions, event traces can accumulate huge amounts of data which become a challenge for interactive as well as automatic analysis and visualization tools. The thesis proposes a method of exploiting redundancy in the event traces in order to reduce the memory requirements and the computational complexity of event trace analysis. The sources of redundancy are repeated segments of the original program, either through iterative or recursive algorithms or through SPMD-style parallel programs, which produce equal or similar repeated event sequences. The data reduction technique is based on the novel Complete Call Graph (CCG) data structure which allows domain specific data compression for event traces in a combination of lossless and lossy methods. All deviations due to lossy data compression can be controlled by constant bounds. The compression of the CCG data structure is incorporated in the construction process, such that at no point substantial uncompressed parts have to be stored. Experiments with real-world example traces reveal the potential for very high data compression. The results range from factors of 3 to 15 for small scale compression with minimum deviation of the data to factors > 100 for large scale compression with moderate deviation. Based on the CCG data structure, new algorithms for the most common evaluation and analysis methods for event traces are presented, which require no explicit decompression. 
By avoiding repeated evaluation of formerly redundant event sequences, the computational effort of the new algorithms can be reduced to the same extent as the memory consumption. The thesis includes a comprehensive discussion of the state of the art and related work, a detailed presentation of the design of the CCG data structure, an elaborate description of algorithms for construction, compression, and analysis of CCGs, and an extensive experimental validation of all components.
APA, Harvard, Vancouver, ISO, and other styles
28

Pinto, Devanand Michael. "Trace analysis of peptides and proteins by capillary electrophoresis." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1998. http://www.collectionscanada.ca/obj/s4/f2/dsk2/ftp02/NQ34822.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
29

Johnson, David C. "A shift variant filter applied to edge trace analysis /." Online version of thesis, 1989. http://hdl.handle.net/1850/11357.

Full text
APA, Harvard, Vancouver, ISO, and other styles
30

Thornley, David John. "Analysis of trace data from fluorescence based Sanger sequencing." Thesis, Imperial College London, 1997. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.286265.

Full text
APA, Harvard, Vancouver, ISO, and other styles
31

Caldicott, Kenneth James. "Application of trace analysis and chemometrics to environmental problems." Thesis, University of South Wales, 1991. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.480930.

Full text
APA, Harvard, Vancouver, ISO, and other styles
32

Al-Mutawah, Jameela. "Using matrix isolation FTIR for trace atmospheric gas analysis." Thesis, University of Reading, 1998. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.270218.

Full text
APA, Harvard, Vancouver, ISO, and other styles
33

Colombo, Carlo Maurizio. "Flow analysis of trace metals in seawater by voltammetry." Thesis, University of Liverpool, 1997. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.338581.

Full text
APA, Harvard, Vancouver, ISO, and other styles
34

Shahtaheri, Seyyed Jamaleddin. "Trace pesticide analysis using immuno-based solid-phase extraction." Thesis, University of Surrey, 1996. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.336497.

Full text
APA, Harvard, Vancouver, ISO, and other styles
35

Knüpfer, Andreas. "Advanced Memory Data Structures for Scalable Event Trace Analysis." Doctoral thesis, Technische Universität Dresden, 2008. https://tud.qucosa.de/id/qucosa%3A23611.

Full text
Abstract:
The thesis presents a contribution to the analysis and visualization of computational performance based on event traces with a particular focus on parallel programs and High Performance Computing (HPC). Event traces contain detailed information about specified incidents (events) during run-time of programs and allow minute investigation of dynamic program behavior, various performance metrics, and possible causes of performance flaws. Due to long running and highly parallel programs and very fine detail resolutions, event traces can accumulate huge amounts of data which become a challenge for interactive as well as automatic analysis and visualization tools. The thesis proposes a method of exploiting redundancy in the event traces in order to reduce the memory requirements and the computational complexity of event trace analysis. The sources of redundancy are repeated segments of the original program, either through iterative or recursive algorithms or through SPMD-style parallel programs, which produce equal or similar repeated event sequences. The data reduction technique is based on the novel Complete Call Graph (CCG) data structure which allows domain specific data compression for event traces in a combination of lossless and lossy methods. All deviations due to lossy data compression can be controlled by constant bounds. The compression of the CCG data structure is incorporated in the construction process, such that at no point substantial uncompressed parts have to be stored. Experiments with real-world example traces reveal the potential for very high data compression. The results range from factors of 3 to 15 for small scale compression with minimum deviation of the data to factors > 100 for large scale compression with moderate deviation. Based on the CCG data structure, new algorithms for the most common evaluation and analysis methods for event traces are presented, which require no explicit decompression. 
By avoiding repeated evaluation of formerly redundant event sequences, the computational effort of the new algorithms can be reduced to the same extent as the memory consumption. The thesis includes a comprehensive discussion of the state of the art and related work, a detailed presentation of the design of the CCG data structure, an elaborate description of algorithms for construction, compression, and analysis of CCGs, and an extensive experimental validation of all components.
APA, Harvard, Vancouver, ISO, and other styles
36

Wei, Shijun. "Trace Analysis of Crystalline Silica Aerosol Using Vibrational Spectroscopy." University of Cincinnati / OhioLINK, 2020. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1595849455064302.

Full text
APA, Harvard, Vancouver, ISO, and other styles
37

BARRESI, ANDREA. "Development of innovative techniques for ultra-trace elements analysis." Doctoral thesis, Università degli Studi di Milano-Bicocca, 2023. https://hdl.handle.net/10281/403458.

Full text
Abstract:
The JUNO experiment was proposed with the main aim of solving the problem of the neutrino mass ordering through accurate measurements of the antineutrino flux produced by nuclear reactors. Due to the extremely small cross-section of neutrinos, the number of expected signal events is very small, about 60 IBD events per day, and it is therefore essential to keep the rate of background events under control. This can be achieved by minimizing all the sources that contribute to the generation of spurious events, first and foremost those generated by the radioactive background. For each progenitor of the natural chains (238U and 232Th), for 40K, and for some key nuclides, such as 60Co and 210Pb, it is necessary to impose strong limits on the concentration that may be present within the materials of the detector. Given the structure of the JUNO detector, the most critical material is the liquid scintillator, for which uranium and thorium concentrations below 1E-15 g/g and potassium concentrations below 1E-16 g/g are required. In this thesis, I present the work I did in this context with two main purposes. The first is the validation of the Monte Carlo software of the JUNO experiment applied to the background simulations, with the aim of verifying the radiopurity limits imposed for the materials and determining the background budget of the experiment. The second is the implementation of a measurement technique that allows reaching the sensitivities required for the measurement of the content of uranium, thorium, and potassium in the liquid scintillator. The validation of the Monte Carlo software of the JUNO experiment (SNiPER) was performed by comparing its results with those of two other simulation codes, in particular with the software Arby, developed at the University of Milano-Bicocca. 
I was able to study different aspects and many critical issues of the simulation of the background and of the results reported by the official tool, such as the application of the quenching factor and the shape of the radioactive β-decay spectra. The spectra of the deposited energy produced by the contaminations in the main components of the JUNO detector were computed with the Monte Carlo codes. The rate of events induced in the detector was assessed on the basis of the imposed radiopurity limits, obtaining the expected total background event rate. The value obtained is lower than the limit set to ensure the final sensitivity of the experiment. This allowed correcting and validating the response of the official software of the JUNO experiment and verifying that the radiopurity limits initially defined for the components of the detector are still adequate. During my Ph.D. I completed the development of the new measurement system, called GeSparK, which exploits the coincidence between a liquid scintillator and an HPGe detector to reduce the background of the single HPGe detector. I also worked on the development of a new delayed coincidence technique that exploits the nuclear structure of 239Np, the activation product of 238U, in order to obtain an extremely strong marker of this particular decay and significantly increase the measurement sensitivity compared to the traditional approach. The sensitivity obtained was still insufficient compared to the requirements of JUNO, and for this reason it was decided to implement a series of radiochemical treatments. Different treatments have been proposed, tested, and implemented with the two aims of increasing the mass of the measurable sample and reducing the concentration of interfering nuclides. The technique developed for uranium and thorium involves a liquid-liquid extraction phase and extraction chromatography with UTEVA and TEVA resins, respectively before and after irradiation. 
Two measurements conducted on "blank" samples with the final procedure allowed us to achieve a sensitivity compatible with the limits imposed by JUNO for the liquid scintillator at the ppq level.
APA, Harvard, Vancouver, ISO, and other styles
38

Bairu, Semere Ghebru. "Diamond paste based electrodes for inorganic analysis." Diss., Pretoria : [s.n.], 2003. http://hdl.handle.net/2263/27268.

Full text
Abstract:
Differential pulse voltammetry is one of the most widely used polarographic analytical techniques, especially for trace inorganic analysis. Up to now, mercury electrodes and different types of carbon electrodes have been used for such analysis. The emphasis of the present dissertation is on the design of a new class of electrodes, namely monocrystalline diamond paste based electrodes, to be used in differential pulse voltammetry for trace analysis of inorganic compounds. Monocrystalline diamond and boron-doped polycrystalline diamond based electrodes exhibit several superior electrochemical properties that differ significantly from those of electrodes based on other carbon allotropes, e.g., glassy carbon electrodes and highly oriented pyrolytic graphite based electrodes, which have been widely used for many years. The advantages are: (a) lower background currents and noise signals, which lead to improved S/B and S/N ratios and lower detection limits; (b) good electrochemical activity (pre-treatment is not necessary); (c) a wide electrochemical potential window in aqueous media; (d) very low capacitance; (e) extreme electrochemical stability; and (f) high reproducibility of analytical information. Furthermore, later studies have shown the superiority of monocrystalline diamond as an electrode material due to the high mobilities measured for electrons and holes. The design selected for the electrodes is simple, fast and reproducible. The diamond powder was mixed with paraffin oil to give the diamond paste used as the electroactive material in the electrodes. The results obtained by employing the diamond paste based electrodes proved high sensitivity, selectivity, accuracy and reliability. These characteristics make them suitable for the analysis of different cations (e.g., Fe(II), Fe(III), Cr(III), Cr(VI), Pb(II), Ag(I)) as well as anions (e.g., iodide) in pharmaceutical, food and environmental matrices.
Dissertation (MSc (Chemistry))--University of Pretoria, 2006.
Chemistry
unrestricted
APA, Harvard, Vancouver, ISO, and other styles
39

Tsoi, Yeuk Ki. "Prevalent instrumentation and material in trace elements analysis and speciations." HKBU Institutional Repository, 2011. http://repository.hkbu.edu.hk/etd_ra/1250.

Full text
APA, Harvard, Vancouver, ISO, and other styles
40

Williams, Timothy Paul. "Determination of trace metals by ion-chromatography with chemiluminescence detection." Thesis, University of Plymouth, 1990. http://hdl.handle.net/10026.1/2182.

Full text
APA, Harvard, Vancouver, ISO, and other styles
41

Muskara, Uftade. "Provenance Studies On Limestone Archaeological Artifacts Using Trace Element Analysis." Master's thesis, METU, 2007. http://etd.lib.metu.edu.tr/upload/12608429/index.pdf.

Full text
Abstract:
Trace element composition of archaeological artifacts is commonly used for provenance studies. Limestone has generally been studied by geologists, and only a few investigations have been carried out by the archaeological sciences. Although it is a common material for buildings and sculpture, it has been thought that the limestone used was not imported, unlike marble. Limestone figurines from the Datça/Emecik excavations are classified as Cypriote type, which was very popular throughout the 6th century B.C. in the Mediterranean region. Since figurines of this type were found at Emecik in large numbers, determining their provenance was an important problem. The Emecik figurines were examined for selected major and trace element and REE compositions, and the results were compared with geological samples taken from a nearby quarry. Inductively Coupled Plasma-Optical Emission Spectrometry (ICP-OES) and Inductively Coupled Plasma-Mass Spectrometry (ICP-MS) were used for the analysis. The methods were optimized by using the standard reference materials NIST 1d, NCS DC 73306, and IGS40.
APA, Harvard, Vancouver, ISO, and other styles
42

HU, XIAO. "TRACE Analysis of LOCA Transients Performed on FIX-II Facility." Thesis, KTH, Kärnkraftsäkerhet, 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-95746.

Full text
Abstract:
As a recently developed computational code, TRACE is expected to be useful and effective for analyzing thermal-hydraulic behavior in the design, licensing and safety analysis of nuclear power plants. However, its validity and correctness have to be verified and qualified before its application in industry. A loss-of-coolant accident (LOCA) is a transient thermal-hydraulic event that has been emphasized as one of the most important threats to the safety of a nuclear power plant. In the present study, based on the FIX-II LOCA tests, simulation models for tests No. 3025, No. 3061 and No. 5052 were developed to validate the TRACE code (version 5.0 patch 2). The simulated transient thermal-hydraulic behaviors during the LOCA tests, including the pressure in the primary system, the mass flow rate in certain key parts, and the temperature in the core, are compared with experimental data. The simulation results show that the TRACE model can reproduce the transient thermal-hydraulic behavior well under different LOCA situations. In addition, sensitivity analyses are also performed to investigate the influence on the results of particular models and parameters, including the counter-current flow limitation (CCFL) model, the choked flow model, the insulator in the steam dome, the K-factor in the test section, and the pump trip. The sensitivity analyses show that both the models and the parameters have a significant influence on the outcome of the model.
APA, Harvard, Vancouver, ISO, and other styles
43

Morneau, Jean Philippe. "Trace metal analysis of marine zooplankton from Conception Bay, Newfoundland." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1997. http://www.collectionscanada.ca/obj/s4/f2/dsk2/ftp04/mq23164.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
44

French, Holly. "Trace metal analysis of street drugs by atomic absorption spectroscopy." Thesis, University of Kent, 2013. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.591083.

Full text
Abstract:
The aim of the study was to ascertain the capability of the graphite furnace atomic absorption spectroscopy (GF AAS) system to discriminate between two separate ecstasy batches according to their differing concentrations of trace metals. Previous studies have utilised ICP techniques such as ICP-AES and ICP-MS to determine trace metal concentrations in ecstasy tablets, with particular interest in potential synthesis marker metals such as Pt, B and Hg. Five metals, Cu, Ba, Ni, Cr and Pb, were quantified with good repeatability and reproducibility. The selection of elements comprises metals that may be present due to additives, dyes, reducing agents and catalysts, together with elements recorded as commonly present in ecstasy tablets, for example Ba. The elements Cu, Cr and Pb have been observed previously to be more homogeneous within batches than elemental markers for synthesis routes, suggesting these trace metals will be of use for indicating linkages between batches. Two separate batches of seized ecstasy tablets were analysed by GF AAS following digestion with nitric acid and hydrogen peroxide. Ecstasy tablets were acquired from the TICTAC unit at St George's Hospital, London, Tooting. HPLC analysis conducted at St George's indicated that Batch 21573 of 20 tablets contained significant quantities of MDMA, while Batch 26237 of 20 tablets contained predominantly caffeine. The small sample size has allowed careful interpretation of the intra-batch and inter-batch variation in the metal concentrations determined for each ecstasy batch. A previous study has used a similar sample size. The current study has successfully demonstrated the quantitation of five trace metals in ecstasy tablets for the first time via GF AAS. Large intra-batch variations were found, as expected for tablets produced in clandestine laboratories. 
Nickel in the MDMA-containing batch 21573 was present in the range 0.47-13.1 ppm and in the caffeine-containing batch 26237 in the range 0.35-9.06 ppm, very similar to previous reports of 1-25 ppm and 0.3-16 ppm. The Cr levels determined in the current study (MDMA batch: 0.12-3.16 ppm, caffeine batch: 0.10-1.36 ppm) are lower than reported previously (0.7-34 ppm). The ranges of Pb concentrations (MDMA batch: 0.11-3.81 ppm, caffeine batch: 0.12-4.91 ppm) are similar to those previously reported (0.02-10 ppm). The copper levels in these samples were found to be particularly high (MDMA batch: 4-2379 ppm, caffeine batch: 17-1851 ppm) compared with previous studies using ICP techniques, which found ranges of 1-19 ppm and 0.8-38 ppm in ecstasy tablets. Barium concentrations were relatively similar within each batch, with significantly higher concentrations found in the caffeine-containing batch 26237 (MDMA batch: 0.19-0.66 ppm, caffeine batch: 3.77-5.47 ppm). Barium was the only element to offer discrimination between the two batches of tablets. AAS systems are well established, cheaper to install and run, and require less user skill as well as less sample volume. In light of previous literature in the area, the current study offers evidence of the capability of GF AAS to perform analyses on ecstasy tablets, with good detection limits, good precision and small sample volume, using a technique that is well established, financially less demanding and readily available in forensic laboratories in comparison to the more powerful ICP-MS technique.
APA, Harvard, Vancouver, ISO, and other styles
45

Graham, Paul. "Trace chemical analysis and molecular dynamics utilising ultraintense femtosecond lasers." Thesis, University of Glasgow, 2000. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.326471.

Full text
APA, Harvard, Vancouver, ISO, and other styles
46

Zhao, Song. "HTTP Traffic Analysis based on a Lossy Packet-level Trace." Case Western Reserve University School of Graduate Studies / OhioLINK, 2013. http://rave.ohiolink.edu/etdc/view?acc_num=case1373030307.

Full text
APA, Harvard, Vancouver, ISO, and other styles
47

Phillips, Nena Mae Monique. "Vascular flora and gradient analysis of the Natchez Trace Parkway." [College Station, Tex. : Texas A&M University, 2006. http://hdl.handle.net/1969.1/ETD-TAMU-1860.

Full text
APA, Harvard, Vancouver, ISO, and other styles
48

Nelson, James Baird 1967. "Analysis of trace element distributions distal to porphyry copper deposits." Thesis, The University of Arizona, 1996. http://hdl.handle.net/10150/278566.

Full text
Abstract:
Enrichment and depletion of trace elements occurs in host rocks distal to porphyry copper deposits as a result of hydrothermal metasomatism. Subtle geochemical distributions in weakly propylitized host rocks is sufficient to indicate proximity to a mineralized system and may be applied to porphyry copper exploration. Samples collected adjacent to four porphyry copper deposits were analyzed for a multi-element suite, then normalized to the elemental concentrations of the fresh host rocks. The probability that an element has been enriched or depleted is determined by using concentrations in the unaltered host in conjunction with a calculated standard deviation. The probabilities have distinct zoning that is related to alteration around the deposits. Contribution lateral to deposits was observed with: Ag, As, Au, Bi, Br, Ca, Cu, Hg, Mn, Mo, Pb, Sb, Se, V, and Zn. Proximal to the mineralized portion of the systems elemental removal was observed with: Ba, Br, Ca, Mg, Mn, P, Pb, Ti, V, Y, and Zn.
APA, Harvard, Vancouver, ISO, and other styles
49

Tinggi, Ujang. "Trace element analysis and nutritional studies in children with phenylketonuria." Thesis, Queensland University of Technology, 1989. https://eprints.qut.edu.au/36913/1/36913_Tinggi_1989.pdf.

Full text
Abstract:
Techniques suitable for sample preparation and analysis of copper, iron, zinc, chromium, manganese and selenium in plasma, urine and food samples were further developed to achieve the maximum accuracy and reproducibility. The problems and limitations associated with the analytical techniques were discussed. Special emphasis was given to selenium because of prevalent matrix interferences and losses associated with its analysis especially at the low levels present in some of the samples. Precedence was also given to the analysis of selenium, manganese and chromium in selected foods as such data are currently lacking for Australia. Samples of food from three-day weighed dietary diary as well as blood plasma and urine samples were used for the assessment of the above trace element nutritional status in children with phenylketonuria and their healthy siblings. The results of these findings are discussed.
APA, Harvard, Vancouver, ISO, and other styles
50

黃功顯 and Kung-hin Wong. "Clinical application of trace analysis of carbon monoxide in expired air." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 1986. http://hub.hku.hk/bib/B31207972.

Full text
APA, Harvard, Vancouver, ISO, and other styles