Dissertations / Theses on the topic 'Magnetic measurements Data processing'

Consult the top 50 dissertations / theses for your research on the topic 'Magnetic measurements Data processing.'


You can also download the full text of each academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse dissertations / theses on a wide variety of disciplines and organise your bibliography correctly.

1. Friedmann, Arnon A. "Measurements, characterization, and system design for digital storage." Diss., University of California, San Diego, 1997. http://wwwlib.umi.com/cr/ucsd/fullcit?p9732719.

2. Kuwahara, Takuya. "Characterization of gas-liquid two-phase flow regimes using magnetic fluid: setup, measurements, signal processing and data analysis." Thesis, Doshisha University, 2008. https://doors.doshisha.ac.jp/opac/opac_link/bibid/BB10268912/?lang=0.

3. Jones, Jonathan A. "Nuclear magnetic resonance data processing methods." Thesis, University of Oxford, 1992. http://ora.ox.ac.uk/objects/uuid:7df97c9a-4e65-4c10-83eb-dfaccfdccefe.

Abstract:
This thesis describes the application of a wide variety of data processing methods, in particular the Maximum Entropy Method (MEM), to data from Nuclear Magnetic Resonance (NMR) experiments. Chapter 1 provides a brief introduction to NMR and to data processing, which is developed in chapter 2. NMR is described in terms of the classical model due to Bloch, and the principles of conventional (Fourier transform) data processing developed. This is followed by a description of less conventional techniques. The MEM is derived on several grounds, and related to both Bayesian reasoning and Shannon information theory. Chapter 3 describes several methods of evaluating the quality of NMR spectra obtained by a variety of data processing techniques; the simple criterion of spectral appearance is shown to be completely unsatisfactory. A Monte Carlo method is described which allows several different techniques to be compared, and the relative advantages of Fourier transformation and the MEM are assessed. Chapter 4 describes in vivo NMR, particularly the application of the MEM to data from Phase Modulated Rotating Frame Imaging (PMRFI) experiments. In this case the conventional data processing is highly unsatisfactory, and MEM processing results in much clearer spectra. Chapter 5 describes the application of a range of techniques to the estimation and removal of splittings from NMR spectra. The various techniques are discussed using simple examples, and then applied to data from the amino acid iso-leucine. The thesis ends with five appendices which contain historical and philosophical notes, detailed calculations pertaining to PMRFI spectra, and a listing of the MEM computer program.
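
As a rough orientation to the maximum-entropy idea discussed in this abstract, the sketch below reconstructs a spectrum from a free induction decay by trading a chi-squared misfit against Shannon entropy. It is a minimal illustration under assumed conventions (a fixed entropy weight lam, unit-spaced FID samples), not the thesis' MEM program:

```python
import numpy as np
from scipy.optimize import minimize

def mem_spectrum(fid, n_points, noise_sd, lam=1.0):
    # Toy maximum-entropy estimate: find a non-negative spectrum f whose
    # modelled FID fits the data while maximising Shannon entropy.
    t = np.arange(len(fid))
    freqs = np.linspace(-0.5, 0.5, n_points, endpoint=False)
    basis = np.exp(2j * np.pi * np.outer(t, freqs))  # spectrum -> FID model

    def objective(f):
        p = f / f.sum()
        entropy = -np.sum(p * np.log(p + 1e-12))
        chi2 = np.sum(np.abs(basis @ f - fid) ** 2) / noise_sd**2
        return chi2 - lam * entropy  # data misfit traded against entropy

    f0 = np.full(n_points, np.abs(fid).mean() + 1e-6)
    res = minimize(objective, f0, method="L-BFGS-B",
                   bounds=[(1e-9, None)] * n_points)
    return freqs, res.x
```

In the constrained formulation usual in the MEM literature, lam would be adjusted until the chi-squared statistic reaches its expected value rather than being fixed in advance.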

4. Rydell, Joakim. "Advanced MRI Data Processing." Doctoral thesis, Department of Biomedical Engineering, Linköpings universitet, 2007. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-10038.

5. Bance, Simon G. "Data storage and processing using magnetic nanowires." Thesis, University of Sheffield, 2009. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.505475.

Abstract:
This thesis contains data from micromagnetic simulations that investigate new methods for data storage and processing on the nanoscale using ferromagnetic nanowires. First I consider a magnetic memory, domain wall trap memory, which could compete with a number of existing devices currently in widespread use. Domain wall trap memory exhibits a 90% lower coercivity than traditional MRAM designs because, instead of remagnetizing a rectangular or oval magnetic free layer by moment rotation or domain nucleation, an existing domain wall is moved along a structured nanowire to remagnetize part of the wire. I determine the fields for de-pinning, switching and expulsion of domain walls in memory cells to show that the margins between them can be sufficiently large for reliable operation. The nudged elastic band method is used to show that domain wall trap memory is thermally stable at room temperature.

6. Ostroumov, Ivan Victorovich. "Magnetic field data processing with personal electronic device." In Polit. Challenges of Science Today: International Scientific and Practical Conference of Young Researchers and Students (theses), Kyiv, April 6–8, 2016. http://er.nau.edu.ua/handle/NAU/26649.

7. Johnson, Kevin Robert. "In Vivo Coronary Wall Shear Stress Determination Using CT, MRI, and Computational Fluid Dynamics." Diss., Georgia Institute of Technology, 2007. http://hdl.handle.net/1853/14482.

Abstract:
Wall shear stress (WSS) has long been identified as a factor in the development of atherosclerotic lesions. Autopsy studies have revealed a strong tendency for lesion development at arterial branch sites and along the inner walls of curvature areas that, in theory, should experience low WSS. Calculations of coronary artery WSS have typically been based upon average models of coronary artery geometry with average flow conditions and then compared to average lesion distributions. With all the averaging involved, a more detailed knowledge of the correlation between WSS and atherosclerotic lesion development might be obscured. Recent advancements in hemodynamic modeling now enable the calculation of WSS in individual subjects. An image-based approach for patient-specific calculation of in vivo WSS using computational fluid dynamics (CFD) would allow a more direct study of this correlation. New state-of-the-art technologies in multi-detector computed tomography (CT) and 3.0 Tesla magnetic resonance imaging (MRI) offer potential improvements for the measurement of coronary artery geometry and blood flow. The overall objective of this research was to evaluate the quantitative accuracy of multi-detector CT and 3.0 Tesla MRI and incorporate those imaging modalities into a patient-specific CFD model of coronary artery WSS. Using a series of vessel motion phantoms, it has been shown that 64-detector CT can provide accurate measurements of coronary artery geometry for heart rates below 70 beats per minute. A flow phantom was used to validate the use of navigator-echo gated, phase contrast MRI at 3.0 Tesla to measure velocity of coronary blood flow. Patient-specific, time-resolved CFD models of coronary WSS were created for two subjects. Furthermore, it was determined that population-average velocity curves or steady state velocities can predict locations of high or low WSS with high degrees of accuracy compared to the use of patient-specific blood flow velocity measurements as CFD boundary conditions. This work is significant because it constitutes the first technique to non-invasively calculate in vivo coronary artery WSS using image-based, patient-specific modeling.
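
For context on the quantity being computed: wall shear stress is the tangential viscous stress exerted by the blood on the vessel wall. In the idealised limit of fully developed Poiseuille flow in a straight tube of radius R it has a closed form (a textbook relation, not the patient-specific CFD result of this dissertation):

```latex
\tau_w = \mu \left.\frac{\partial u}{\partial r}\right|_{r=R},
\qquad
\tau_w^{\mathrm{Poiseuille}} = \frac{4\mu Q}{\pi R^{3}},
```

where \mu is the dynamic viscosity of blood and Q the volumetric flow rate. The inverse dependence on R^3 is one reason accurate image-based lumen geometry matters so much for WSS estimates.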

8. Maas, Luis C. (Luis Carlos). "Processing strategies for functional magnetic resonance imaging data sets." Thesis, Massachusetts Institute of Technology, 1998. http://hdl.handle.net/1721.1/85262.

Note: Thesis (Ph.D.), Harvard-Massachusetts Institute of Technology Division of Health Sciences and Technology, 1999. Includes bibliographical references (leaves 108-118).

9. Casto, Daniel W. "Calculating depths to shallow magnetic sources using aeromagnetic data from the Tucson Basin." Tucson, Ariz.: U.S. Dept. of the Interior, U.S. Geological Survey, 2001. http://geopubs.wr.usgs.gov/open-file/of01-505/.

10. Lee, Jae-Min. "Characterization of spatial and temporal brain activation patterns in functional magnetic resonance imaging data." Gainesville, Fla.: University of Florida, 2005. http://purl.fcla.edu/fcla/etd/UFE0013024.

11. Davies, S. J. "Frequency-selective excitation and non-linear data processing in nuclear magnetic resonance." Thesis, University of Oxford, 1988. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.233510.

12. Oller Moreno, Sergio. "Data processing for Life Sciences measurements with hyphenated Gas Chromatography-Ion Mobility Spectrometry." Doctoral thesis, Universitat de Barcelona, 2018. http://hdl.handle.net/10803/523539.

Abstract:
Recent progress in analytical chemistry instrumentation has increased the amount of data available for analysis. This progress has been accompanied by computational improvements that have enabled new possibilities for analyzing larger amounts of data. These two factors have made it possible to analyze more complex samples in multiple life science fields, such as biology, medicine, pharmacology, and food science. One of the techniques that has benefited from these improvements is Gas Chromatography - Ion Mobility Spectrometry (GC-IMS). This technique is useful for the detection of Volatile Organic Compounds (VOCs) in complex samples. Ion Mobility Spectrometry is an analytical technique for characterizing chemical substances based on the velocity of gas-phase ions in an electric field. It is able to detect trace levels of volatile chemicals, reaching ppb concentrations for some analytes. While the instrument has moderate selectivity, it is very fast: an ion mobility spectrum can be acquired in tens of milliseconds. As it operates at ambient pressure, it is found not only in laboratory instrumentation but also on-site for screening applications; for instance, it is often used in airports for the detection of drugs and explosives. To enhance the selectivity of the IMS, especially for the analysis of complex samples, a gas chromatograph can be used for sample pre-separation, at the expense of a longer analysis. While there is better instrumentation and more computational power, better algorithms are still needed to exploit and extract all the information present in the samples. In particular, GC-IMS has not received much attention compared to other analytical techniques. In this work we address some of the data analysis issues for GC-IMS. With respect to pre-processing, we explore several baseline estimation methods and suggest a variation of Asymmetric Least Squares, a popular baseline estimation technique, that is able to cope with signals that present large peaks or a large dynamic range. This baseline estimation method is used on Gas Chromatography - Mass Spectrometry signals as well, as it suits both techniques. Furthermore, we characterize spectral misalignments in a study several months long and propose an alignment method based on monotonic cubic splines for their correction. Based on the misalignment characterization, we propose an optimal time span between consecutive calibrant samples. We then explore the use of Multivariate Curve Resolution (MCR) methods for the deconvolution of overlapped peaks and their extraction into pure components. We propose the use of a sliding window along the retention time axis to extract the pure components from smaller windows, tracking the pure components through the windows. This approach is able to extract analytes with lower response than MCR alone, that is, compounds that have a low variance in the overall matrix. Finally, we apply some of these developments to real-world applications: a dataset for the prevention of fraud and quality control in the classification of olive oils, measured with GC-IMS, and data for biomarker discovery of prostate cancer obtained by analyzing the headspace of urine samples with a GC-MS instrument.
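
The baseline step builds on Asymmetric Least Squares. The sketch below is the standard AsLS smoother (second-difference penalty with asymmetric weights), the technique the thesis proposes a variation of; parameter values are typical defaults, not the author's:

```python
import numpy as np
from scipy import sparse
from scipy.sparse.linalg import spsolve

def als_baseline(y, lam=1e5, p=0.001, n_iter=10):
    # Asymmetric least squares (Eilers-style): a smooth baseline that
    # hugs the lower envelope because points above it get weight p << 1.
    n = len(y)
    D = sparse.diags([1, -2, 1], [0, -1, -2], shape=(n, n - 2))
    w = np.ones(n)
    for _ in range(n_iter):
        W = sparse.spdiags(w, 0, n, n)
        z = spsolve((W + lam * (D @ D.T)).tocsc(), w * y)
        w = p * (y > z) + (1 - p) * (y <= z)
    return z  # estimated baseline; y - z is the corrected signal
```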

13. Hood, Lon L., David L. Mitchell, Robert P. Lin, Mario H. Acuna, and Alan B. Binder. "Initial measurements of the lunar induced magnetic dipole moment using Lunar Prospector Magnetometer data." American Geophysical Union, 1999. http://hdl.handle.net/10150/624011.

Abstract:
Twenty-one orbits of Lunar Prospector magnetometer data obtained during an extended passage of the Moon through a lobe of the geomagnetic tail in April 1998 are applied to estimate the residual lunar induced magnetic dipole moment. Editing and averaging of individual orbit segments yields a negative induced moment with amplitude −2.4 ± 1.6 × 10²² Gauss-cm³ per Gauss of applied field. Assuming that the induced field is caused entirely by electrical currents near the surface of a highly electrically conducting metallic core, the preferred core radius is 340 ± 90 km. For an iron-rich composition, such a core would represent 1 to 3% of the lunar mass.
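
A back-of-the-envelope consistency check of the quoted numbers, assuming the ideal limit the abstract invokes (a uniform field applied to a perfectly conducting sphere in Gaussian units, induced moment m = -H R_c^3 / 2):

```latex
R_c = \left(\frac{2\,|m|}{H}\right)^{1/3}
    = \left(2 \times 2.4 \times 10^{22}\ \mathrm{cm^{3}}\right)^{1/3}
    \approx 3.6 \times 10^{7}\ \mathrm{cm} \approx 360\ \mathrm{km},
```

consistent with the preferred core radius of 340 ± 90 km once the quoted uncertainty on the moment is propagated.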

14. Simpson, Charles Robert Jr. "Analysis of Passive End-to-End Network Performance Measurements." Diss., Georgia Institute of Technology, 2007. http://hdl.handle.net/1853/14612.

Abstract:
NETI@home, a distributed network measurement infrastructure that collects passive end-to-end network measurements from Internet end-hosts, was developed and discussed. The data collected by this infrastructure, as well as other datasets, were used to conduct studies on the behavior of the network and of network users, as well as on the security issues affecting the Internet. A flow-based comparison of honeynet traffic, representing malicious traffic, and NETI@home traffic, representing typical end-user traffic, was conducted. This comparison showed that a large portion of flows in both datasets were failed and potentially malicious connection attempts. We additionally found that worm activity can linger for more than a year after the initial release date. Malicious traffic was also found to originate from across the allocated IP address space. Other security-related observations include the suspicious use of ICMP packets and attacks on our own NETI@home server. Utilizing observed TTL values, studies were also conducted into the distance of Internet routes and the frequency with which they vary. The frequency and use of network address translation and the private IP address space were also discussed. Various protocol options and flags were analyzed to determine their adoption and use by the Internet community. Network-independent empirical models of end-user network traffic were derived for use in simulation. Two such models were created: the first modeled traffic for a specific TCP or UDP port, and the second modeled all TCP or UDP traffic for an end-user. These models were implemented and used in GTNetS. Further anonymization of the dataset and the public release of the anonymized data and their associated analysis tools were also discussed.

15. Gupta, Shweta. "Software Development Productivity Metrics, Measurements and Implications." Thesis, University of Oregon, 2018. http://hdl.handle.net/1794/23816.

Abstract:
The rapidly increasing capabilities and complexity of numerical software present a growing challenge to software development productivity. While many open source projects enable the community to share experiences, learn, and collaborate, estimating individual developer productivity becomes more difficult as projects expand. In this work, we analyze several HPC software Git repositories with issue trackers and compute productivity metrics that can be used to better understand and potentially improve development processes. Evaluating productivity in these communities presents additional challenges because bug reports and feature requests are often made on mailing lists instead of in issue trackers, resulting in unstructured data that is difficult to analyze. For such data, we investigate automatic tag generation using natural language processing techniques. We aim to produce metrics that help quantify productivity improvement or degradation over the projects' lifetimes. We also provide an objective measurement of productivity based on effort estimation for the developers' work.
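
As a flavour of the repository mining involved, a crude commit-count proxy can be computed directly from Git history; a minimal sketch (the repository path is hypothetical, and real productivity metrics would weigh far more than commit counts):

```python
import subprocess
from collections import Counter

def commits_per_author(repo_path):
    # One author e-mail per commit; counting them gives a first,
    # very rough productivity signal for a repository.
    log = subprocess.run(
        ["git", "-C", repo_path, "log", "--pretty=%ae"],
        capture_output=True, text=True, check=True,
    ).stdout
    return Counter(log.split())

# print(commits_per_author("/path/to/hpc-project").most_common(10))
```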

16. Mohammadi, Soroor. "Processing and Modeling of Gravity, Magnetic and Electromagnetic Data in the Falkenberg Area, Sweden." Thesis, Uppsala universitet, Geofysik, 2014. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-232714.

Abstract:
The Falkenberg area is located in southwest Sweden, was formed in the Sveconorwegian orogen, and contains an extremely complex geological structure. Multiple geophysical datasets were acquired and, together with available petrophysical information, used to generate models of the subsurface geological structures. The collected data comprise ground magnetic, AMT (Audio Magnetotelluric) and RMT (Radio Magnetotelluric) data. The available airborne magnetic and ground gravity data acquired by the Geological Survey of Sweden (SGU), as well as a reflection seismic section from a study made by Uppsala University, further aid in obtaining a substantially improved interpretation of the geometry of the structures along the AMT profile. The principal objective of this profile was to delineate and map the possible deformation zone crossed by the profile; the AMT study was expected to complement existing geophysical data and improve existing interpretations. The Ullared deformation zone contains decompressed eclogite facies rocks. The presented results were obtained by comparing different geophysical methods along the profile. The susceptibility and resistivity models show that the eclogites have higher resistivity and susceptibility than the surrounding structures. However, the use of an Occam-type inversion on the AMT data makes the resistivity model smoother than the susceptibility model, and as a result it is difficult to estimate the dip of the structures. The AMT profile and the seismic section show the same dip direction (NE) for the eclogite-bearing structures, although due to the smoothing in the AMT model the dips seen in the seismic section cannot be recovered in the resistivity model.

17. McLeish, Kate. "Combining data acquisition and post-processing techniques for magnetic resonance imaging of moving objects." Thesis, King's College London (University of London), 2003. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.406105.

18. Zhang, Kuiwei. "Surface roughness and displacement measurements using a fibre optic sensor and neural networks." Thesis, Brunel University, 1998. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.246145.

19. Monnot, Cyril Gerard Valery. "Development of a data analysis platform for characterizing functional connectivity networks in rodents." Thesis, KTH, Skolan för teknik och hälsa (STH), 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-124391.

Abstract:
This document addresses the development and implementation of a routine for analyzing resting-state functional Magnetic Resonance Imaging (rs-fMRI) data in rodents. Even though resting-state connectivity has been studied in humans for several years, with diverse applications in mental disorders and degenerative brain diseases, interest in this modality is much more recent and less common in rodents. The goal of this project was to assemble a set of tools enabling the experimental MR team of KERIC to analyze rs-fMRI data in rodents in a well-defined and easy way. Several critical choices were made during this project: one was to process the data with Independent Component Analysis (ICA) rather than a seed-based approach; another was to use medetomidine rather than isoflurane as anesthesia for the experiments. The routine developed during this project was applied to a project studying the effects of running on an animal model of depression. The routine is composed of several steps: preprocessing of the data, mainly realized with SPM8; processing using GIFT; and postprocessing, which consists of statistical tests on the results from GIFT to reveal differences between groups using the 2nd-level analysis from SPM8 and to test the correlations between components using the FNC toolbox.
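
GIFT implements group spatial ICA; for orientation, a single-subject spatial ICA can be sketched in a few lines with scikit-learn (a minimal stand-in, not the GIFT pipeline; the data are assumed to be motion-corrected and masked into a time-by-voxel matrix):

```python
import numpy as np
from sklearn.decomposition import FastICA

def spatial_ica(bold, n_components=20):
    # bold: preprocessed data, shape (n_timepoints, n_voxels).
    ica = FastICA(n_components=n_components, max_iter=1000, random_state=0)
    # Feeding voxels as rows makes the recovered sources *spatial* maps.
    maps = ica.fit_transform(bold.T).T   # (n_components, n_voxels)
    time_courses = ica.mixing_           # (n_timepoints, n_components)
    return maps, time_courses
```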

20. Barron, Nicholas Henry. "An Analysis of an Advanced Software Business Model for Magnetic Resonance Imaging Data Post Processing." Case Western Reserve University School of Graduate Studies / OhioLINK, 2016. http://rave.ohiolink.edu/etdc/view?acc_num=case1459422647.

21. Saliaris, Ioannis R. "Real-Time data acquisition and processing of the Magnetic, Angular Rate and Gravity (MARG) sensor." Thesis, Naval Postgraduate School, Monterey, Calif., 2004. http://library.nps.navy.mil/uhtbin/hyperion/04Jun%5FSaliaris.pdf.

Note: Thesis (M.S. in Electrical Engineering and M.S. in Systems Engineering), Naval Postgraduate School, June 2004. Thesis advisor: Xiaoping Yun. Includes bibliographical references (p. 59-60). Also available online.

22. Fortier, Hélène. "AFM Indentation Measurements and Viability Tests on Drug Treated Leukemia Cells." Thesis, Université d'Ottawa / University of Ottawa, 2016. http://hdl.handle.net/10393/34345.

Abstract:
A significant body of literature has reported strategies and techniques to assess the mechanical properties of biological samples such as proteins, cellular and tissue systems. Atomic force microscopy (AFM) has been used to detect elasticity changes of cancer cells; however, only a few studies have provided a detailed and complete protocol of the experimental procedures and data analysis methods for non-adherent blood cancer cells. In this work, the elasticity of NB4 cells derived from acute promyelocytic leukemia (APL) was probed by AFM indentation measurements to investigate the effects of the disease on cellular biomechanics. Understanding how leukemia influences the nanomechanical properties of cells is expected to provide a better understanding of the cellular mechanisms associated with cancer, and promises to become a valuable new tool for cancer detection and staging. In this context, quantifying the mechanical properties of APL cells requires a systematic and optimized approach to data collection and analysis in order to generate reproducible and comparable data. This thesis elucidates an automated data analysis process that integrates programming, force curve collection and analysis optimization to assess variations of cell elasticity in response to processing criteria. A processing algorithm was developed using the IGOR Pro software to automatically analyze large numbers of AFM data sets in an efficient and accurate manner. Since the analysis involves multiple steps that must be repeated for many individual cells, an automated and unbiased processing approach is essential to precisely determine cell elasticity. Different fitting models for extracting the Young's modulus were systematically applied to validate the process, and the best fitting criteria, such as the contact point location and indentation length, were determined in order to obtain consistent results. The automated processing code described in this thesis was used to correlate alterations in the biomechanics of cancer cells as they undergo drug treatments. To fully assess drug effects on NB4 cells, viability assays were first performed using Trypan Blue staining for preliminary insights, followed by microplate fluorescence intensity readings using a LIVE/DEAD viability kit with ethidium and calcein AM labelling components. After treatment with 30 µM arsenic trioxide, relative live cell populations increased until 36 h, while relative populations of dead cells increased until 24 h post-treatment, with a drastic drop in dead cell count observed between 12 and 24 h. With respect to cell mechanics, trapping the non-adherent NB4 cells within fabricated SU8-10 microwell arrays allowed consistent AFM indentation measurements up to 48 h after treatment. Results revealed an increase in cell elasticity up to 12 h post-treatment and a drastic decrease between 12 and 24 h; these arsenic trioxide induced alterations in the elasticity of NB4 cells can be correlated to the cell viability tests. In addition to the indentation and viability testing approaches, morphological appearance was monitored in order to track the apoptosis process of the affected cells. Relationships found between viability and elasticity assays, in conjunction with morphology alterations, revealed distinct stages of apoptosis throughout treatment; 24 h after initial treatment, most cells were observed to have burst or displayed obvious blebbing. These relations between different measurement methods suggest a potential drug screening approach for understanding the specific physical and biological effects of drugs on cancer cells.
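
Extracting a Young's modulus from AFM force curves typically means fitting a Hertz/Sneddon contact model to the post-contact indentation segment. A minimal sketch for a conical tip (tip half-angle and Poisson ratio are assumed values; the thesis' IGOR Pro pipeline, with its automated contact-point search, is not reproduced):

```python
import numpy as np
from scipy.optimize import curve_fit

def hertz_cone(delta, E_pa):
    # Sneddon relation for a conical indenter:
    # F = (2/pi) * tan(alpha) * E / (1 - nu^2) * delta^2
    nu, alpha = 0.5, np.deg2rad(35.0)  # assumed cell Poisson ratio, tip angle
    return (2.0 / np.pi) * np.tan(alpha) * E_pa / (1.0 - nu**2) * delta**2

# delta (m) and force (N), measured past the estimated contact point:
# E_fit, _ = curve_fit(hertz_cone, delta, force, p0=[500.0])
```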

23. Forshed, Jenny. "Processing and analysis of NMR data: Impurity determination and metabolic profiling." Doctoral thesis, Department of Analytical Chemistry, Stockholm University, 2005. http://urn.kb.se/resolve?urn=urn:nbn:se:su:diva-712.

24. Boogaart, Adrianus van den. "The use of signal processing algorithms to obtain biochemically relevant parameters from magnetic resonance data sets." Thesis, St George's, University of London, 1995. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.281778.

25. Ramesh, Maganti V. "Magnetic stripe reader used to collect computer laboratory statistics." Virtual Press, 1990. http://liblink.bsu.edu/uhtbin/catkey/722464.

Abstract:
This thesis is concerned with interfacing a magnetic stripe reader with an AT&T PC 6300 with a 20 MB hard disk, and with collecting laboratory usage statistics. Laboratory usage statistics include the name and social security number of the student, along with other necessary details. This system replaces all manual modes of entering data, checks for typographical errors, renames the file containing a particular day's data to a file that has the current day's date as its filename, and keeps track of the number of students for a particular day. This procedure ensures the security of laboratory equipment and can be modified for each computer laboratory on campus. The program results indicate an acceleration of data entry, favorable student response, and an increase in the accuracy of the data recorded.
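
The daily file-renaming step described above is simple to sketch; the file name and record format here are hypothetical stand-ins for the thesis' system:

```python
from datetime import date
from pathlib import Path

def archive_daily_log(log_path="labusage.dat"):
    # Count the day's records (one swipe per line), then rename the
    # capture file so its name is the current date.
    src = Path(log_path)
    n_students = sum(1 for _ in src.open())
    dst = src.with_name(date.today().strftime("%m-%d-%y") + ".dat")
    src.rename(dst)
    return dst, n_students
```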

26. Lau, Sum-hung. "Adaptive FEM preprocessing for electromagnetic field analysis of electric machines." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 1995. http://hub.hku.hk/bib/B31212451.

27. Derbyshire, John Andrew. "Echo-planar anemometry using conventional magnetic resonance imaging hardware." Thesis, University of Cambridge, 1995. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.364590.

28. Lindemeyer, Johannes. "Optimisation of Phase Data Processing for Susceptibility Reconstruction in Magnetic Resonance Imaging." Dissertation (supervisors: Nadim Joni Shah and Achim Stahl), Aachen: Universitätsbibliothek der RWTH Aachen, 2015. http://d-nb.info/1128316560/34.

29. Chin, Christine Hui Li. "The effects of computer-based tests on the achievement, anxiety and attitudes of grade 10 science students." Thesis, University of British Columbia, 1990. http://hdl.handle.net/2429/29484.

Abstract:
The purpose of this study was to compare the achievement and test anxiety level of students taking a conventional paper-and-pencil science test comprising multiple-choice questions and students taking a computer-based version of the same test. The study assessed the equivalence of the computer-based and paper-and-pencil tests in terms of achievement scores and item characteristics, explored the relationship between computer anxiety and previous computer experience, and investigated the affective impact of computerized testing on the students. A 2 × 2 (mode of test administration by gender) factorial design was used. A sample of 54 male and 51 female Grade 10 students participated in the study. Subjects were blocked by gender and by their scores on a previous school-based science exam, then randomly assigned to take either the computer-based test or the paper-and-pencil test, both versions of which were identical in length, item content and sequence. Three days before the test, all students were given the "Attitude questionnaire", which included pre-measures of test and computer anxiety. Immediately after taking the test, students in the computer-based group completed the "Survey of attitudes towards testing by computers" questionnaire, which assessed their previous computer experience, their test anxiety and computer anxiety level while taking the test, and their reactions towards computer-based testing. Students in the paper-and-pencil test group answered the "Survey of attitudes towards testing" questionnaire, which measured their test anxiety level while taking the paper-and-pencil test. The results indicate that the mean achievement score on the science test was significantly higher for the group taking the computer-based test. No significant difference in mean scores between sexes was observed, and there was no interaction effect between mode of test administration and gender. The test anxiety level was not significantly different between the groups taking the two versions of the test. A significant relationship existed between students' prior computer experience and their computer anxiety before taking the test. However, there was no significant relationship between previous computer experience and the computer anxiety evoked as a result of taking the test on the computer; hence, the change in computer anxiety due to taking the test was not explained by computer experience. Of the students who took the computer-based test, 71.2% said that, given a choice, they would prefer to take the test on a computer. Students indicated that they found the test easier, more convenient to answer (because they did not have to write, erase mistakes or fill in bubbles on a scannable sheet), and faster to take than a paper-and-pencil test. Negative responses to the computer-based test included the difficulty involved in reviewing and changing answers, having to type and use a keyboard, fear of the computer making mistakes, and a feeling of uneasiness because the medium of test presentation was unconventional. Students taking the computer-based test were more willing to guess on an item and tended to avoid the option "I don't know." It is concluded that the computer-based and the paper-and-pencil tests were not equivalent in terms of achievement scores. Modifications in the way test items are presented on a computer-based test may change the strategies with which students approach the items. Extraneous variables incidental to the computer administration, such as the inclination to guess on a question, the ease of getting cues from other questions, differences in test-taking flexibility, familiarity with computers, and attitudes towards computers, may change test-taking behaviour to the extent that a student's performance on a computer-based test and a paper-and-pencil test may not be the same. Also, if the tasks involved in taking a test on a computer are kept simple enough, prior computer experience has little impact on the anxiety evoked in a student taking the test, and even test-takers with minimal computer experience will not be disadvantaged by having to use an unfamiliar machine.
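
The 2 × 2 factorial analysis described above corresponds to a standard two-way ANOVA with an interaction term; a minimal sketch with statsmodels (column names are hypothetical):

```python
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

def factorial_anova(df: pd.DataFrame) -> pd.DataFrame:
    # df: one row per student, columns 'score', 'mode' (computer/paper)
    # and 'gender'; the interaction term tests mode-by-gender effects.
    model = ols("score ~ C(mode) * C(gender)", data=df).fit()
    return sm.stats.anova_lm(model, typ=2)
```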

30. Theodoridis, John Apostolis. "Borehole electromagnetic prospecting for weak conductors." Thesis, Monash University, School of Geosciences, 2004. http://arrow.monash.edu.au/hdl/1959.1/5225.

31. Schönström, Linus. "Programming a TEM for magnetic measurements: DMscript code for acquiring EMCD data in a single scan with a q-E STEM setup." Thesis, Uppsala universitet, Tillämpad materialvetenskap, 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-306167.

Abstract:
Code written in the DigitalMicrograph® scripting language enables a new experimental design for acquiring the magnetic dichroism in EELS. Called the q-E STEM setup, it provides simultaneous acquisition of the dichroic pairs of spectra (eliminating major error sources) while preserving the real-space resolution of STEM. This gives the setup great potential for real-space maps of magnetic moments, which can be instrumental in furthering the understanding of e.g. interfacial magnetic effects. The report includes a thorough presentation of the created acquisition routine, a detailed outline of future work and a fast introduction to the DMscript language.

32. Sonnert, Adrian. "Predicting inter-frequency measurements in an LTE network using supervised machine learning: a comparative study of learning algorithms and data processing techniques." Thesis, Linköpings universitet, Institutionen för datavetenskap, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-148553.

Abstract:
With increasing demands on network reliability and speed, network suppliers need to make their communication algorithms more efficient. Frequency measurements are a core part of mobile network communications, and increasing their effectiveness would improve many network processes such as handovers, load balancing, and carrier aggregation. This study examines the possibility of using supervised learning to predict the signal of inter-frequency measurements by investigating various learning algorithms and pre-processing techniques. We found that random forests have the highest predictive performance on this data set, at 90.7% accuracy. In addition, we have shown that undersampling and varying the discriminator are effective techniques for increasing the performance on the positive class on frequencies where the negative class is prevalent. Finally, we present hybrid algorithms in which the learning algorithm for each model depends on attributes of the training data set. These algorithms perform at a much higher efficiency in terms of memory and run-time without heavily sacrificing predictive performance.
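
The undersampling and threshold-variation ideas in this abstract can be sketched with scikit-learn; a minimal illustration (array names and the 0.3 threshold are hypothetical):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def undersample(X, y, seed=0):
    # Drop random negatives until both classes are equally represented
    # (assumes the negative class is the prevalent one, as in the study).
    rng = np.random.default_rng(seed)
    neg, pos = np.where(y == 0)[0], np.where(y == 1)[0]
    keep = rng.choice(neg, size=len(pos), replace=False)
    idx = np.concatenate([keep, pos])
    return X[idx], y[idx]

# X_bal, y_bal = undersample(X_train, y_train)
# clf = RandomForestClassifier(n_estimators=200).fit(X_bal, y_bal)
# "Varying the discriminator" amounts to moving the decision threshold:
# y_hat = (clf.predict_proba(X_test)[:, 1] > 0.3).astype(int)
```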

33. Tong, Sai-kit. "A computer-aided measurement system for monopolar high-voltage direct-current coronating lines." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 1986. http://hub.hku.hk/bib/B31207467.

34. Lo, Kin-keung. "An investigation of computer assisted testing for civil engineering students in a Hong Kong technical institute." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 1988. http://hub.hku.hk/bib/B38627000.

35. Benn, Kenneth Robert Andrew. "A classroom-based investigation into the potential of a computer-mediated criterion-referenced test as an evaluation instrument for the assessment of primary end user spreadsheet skills." Thesis, Rhodes University, 1994. http://hdl.handle.net/10962/d1003328.

Abstract:
The demand for innovative end users of information technology is increasing along with the proliferation of computer equipment within the workplace. This has resulted in increasing demands being made upon educational institutions responsible for the education of computer end users. The demands placed upon the teachers are particularly high. Large class groups and limited physical resources make the task especially difficult. One of the most time consuming, yet important, tasks is that of student evaluation. To effectively assess the practical work of information technology students requires intensive study of the storage media upon which the students'efforts have been saved. The purpose of this study was to assess the suitability of criterion-referenced testing techniques applied to the evaluation of end user computing students. Objective questions were administered to the students using Question Mark, a computer-managed test delivery system which enabled quick and efficient management of scoring and data manipulation for empirical analysis. The study was limited to the classroom situation and the assessment of primary spreadsheet skills. In order to operate within these boundaries, empirical techniques were used which enabled the timeous analysis of the students' test results. The findings of this study proved to be encouraging. Computer-mediated criterion-referenced testing techniques were found to be sufficiently reliable for classroom practice when used to assess primary spreadsheet skills. The validation of the assessment technique proved to be problematic because of the constraints imposed by normal classroom practice as well as the lack of an established methodology for evaluating spreadsheet skills. However, sufficient evidence was obtained to warrant further research aimed at assessing the use of computer-mediated criterion-referenced tests to evaluate information technology end user learning in situations beyond the boundaries of the classroom, such as a national certification examination.

36. Hao, Jie. "A study on the application of independent component analysis to in vivo ¹H magnetic resonance spectra of childhood brain tumours for data processing." Thesis, University of Birmingham, 2010. http://etheses.bham.ac.uk//id/eprint/1064/.

Abstract:
Independent component analysis (ICA) has the potential to automatically determine the metabolite, macromolecular and lipid (MMLip) components that make up magnetic resonance (MR) spectra. However, the reliability with which this is accomplished, and the optimal ICA approach for investigating in vivo MR spectra, have not yet been determined. A wavelet shrinkage de-noising based enhancement algorithm, utilising a newly derived relationship between the real and imaginary parts of the MR spectrum, is proposed. This algorithm is more robust than conventional de-noising methods. The two approaches for applying ICA, blind source separation (BSS) and feature extraction (FE), are thoroughly examined. A feature dimension selection method, an issue which has not previously been adequately addressed, is proposed to set a theoretical guideline for ICA dimension reduction. Since the advantages and limitations of BSS-ICA and FE-ICA are different, combining them may compensate for their disadvantages and lead to better results. A novel ICA approach involving a hybrid of the two techniques for automated decomposition of an MRS dataset is proposed. It has been demonstrated that hybrid ICA provides more realistic individual metabolite and MMLip components than BSS-ICA or FE-ICA. It can aid metabolite identification and assignment, and has the potential for extracting biologically useful features and discovering biomarkers.
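
For orientation, generic wavelet-shrinkage de-noising (soft thresholding of detail coefficients with the universal threshold) looks as follows; this is the textbook baseline, not the thesis' enhanced algorithm exploiting the real/imaginary-part relationship:

```python
import numpy as np
import pywt  # PyWavelets

def wavelet_denoise(signal, wavelet="db4", level=4):
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    # Noise scale from the finest detail band, then universal threshold.
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    thr = sigma * np.sqrt(2.0 * np.log(len(signal)))
    coeffs[1:] = [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)
```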

37. Remes, J. (Jukka). "Method evaluations in spatial exploratory analyses of resting-state functional magnetic resonance imaging data." Doctoral thesis, Oulun yliopisto, 2013. http://urn.fi/urn:isbn:9789526202228.

Abstract:
Resting-state (RS) measurements during functional magnetic resonance imaging (fMRI) have become an established approach for studying spontaneous brain activity. RS-fMRI results are often obtained using explorative approaches like spatial independent component analysis (sICA). These approaches and their software implementations are rarely evaluated extensively or specifically concerning RS-fMRI. Trust is placed in the software to work according to the published method descriptions. Many methods and parameters are used despite the lack of test data, and the validity of the underlying models remains an open question. A substantially greater number of evaluations would be needed to ensure the quality of exploratory RS-fMRI analyses. This thesis investigates the applicability of sICA methodology and software in the RS-fMRI context, and the experience gained was used to formulate general guidelines to facilitate future method evaluations. Additionally, a novel multiple comparison correction (MCC) method, Maxmad, was devised for adjusting evaluation results statistically. With regard to software, the source code of FSL Melodic, a popular sICA package, was analyzed against its published method descriptions. Unreported and unevaluated details were found, which implies that one should not automatically assume a correspondence between the literature and the software implementations; method implementations should instead be subjected to independent reviews. An experimental contribution of this thesis is that the credibility of the emerging sliding-window sICA has been improved by the validation of sICA-related preprocessing procedures. In addition, the estimation accuracy of the results in the existing RS-fMRI sICA literature was shown not to suffer even though repeatability tools like Icasso have not been used in their computation. Furthermore, the evidence against the conventional sICA model suggests considering different approaches to the analysis of RS-fMRI. The guidelines developed to facilitate evaluations include the adoption of 1) open software development (improved error detection), 2) modular software designs (easier evaluations), 3) data-specific evaluations (increased validity), and 4) extensive coverage of the parameter space (improved credibility). The proposed Maxmad MCC addresses a statistical problem arising from broad evaluations. Large-scale cooperation efforts concerning evaluations are proposed in order to improve the credibility of exploratory RS-fMRI methods.

38. Tung, I.-Pei. "Documenting the use of digital portfolios in an elementary school classroom." Thesis, McGill University, 2004. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=79982.

Abstract:
The Quebec Education Program (QEP) provides educators with detailed descriptions of competencies of learning achievement. However, current approaches used by educators to instruct and assess their students do not target the relevant QEP competencies. The goal of this thesis is to document the efforts of one teacher to implement digital portfolios in her grade one and two classroom in order to instruct and assess her students according to the relevant QEP competencies. The study documents the kinds of technology and skills that are needed to implement digital portfolios in order to instruct students, assess their learning, and communicate it to their parents. Interviews with all participants were used to document the process from multiple perspectives. Overall, digital portfolios were found to be very useful for instructing and assessing students and for communicating with parents.

39. Abufadel, Amer Y. "4D Segmentation of Cardiac MRI Data Using Active Surfaces with Spatiotemporal Shape Priors." Diss., Georgia Institute of Technology, 2006. http://hdl.handle.net/1853/14005.

Abstract:
This dissertation presents a fully automatic segmentation algorithm for cardiac MR data. Some currently published methods are automatic, but they only work well in 2D and sometimes in 3D, and do not perform well near the extremities (apex and base) of the heart. Additionally, they require substantial user input to make them feasible for use in a clinical environment. This dissertation introduces novel approaches to improve the accuracy, robustness, and consistency of existing methods. Segmentation accuracy can be improved by knowing as much about the data as possible. Accordingly, we compute a single 4D active surface that performs segmentation in space and time simultaneously; the segmentation routine can then take advantage of information from neighboring pixels that are adjacent either spatially or temporally. Robustness is improved further by using confidence labels on shape priors. Shape priors are deduced from manual segmentation of training data, which may contain imperfections that impede proper manual segmentation. Confidence labels indicate the level of fidelity of the manual segmentation to the actual data, so the contribution of regions with low confidence levels can be attenuated or excluded from the final result. The specific advantages of using 4D segmentation along with shape priors and regions of confidence are highlighted throughout the dissertation. Performance of the new method is measured by comparing the results to traditional 3D segmentation and to manual segmentation performed by a trained clinician.

40. Herterich, Rebecka, and Anna Sumarokova. "Coil Sensitivity Estimation and Intensity Normalisation for Magnetic Resonance Imaging." Thesis, KTH, Medicinteknik och hälsosystem, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-263149.

Abstract:
The quest for improved efficiency in magnetic resonance imaging has motivated the development of strategies like parallel imaging, where arrays of multiple receiver coils are operated simultaneously. The objective of this project was to estimate the phased-array coil sensitivity profiles of magnetic resonance images of the human body; these sensitivity maps can then be used to perform an intensity inhomogeneity correction of the images. Through investigative work in Matlab, a script was developed that uses data embedded in the raw data of a magnetic resonance scan to generate coil sensitivities for each voxel of the volume of interest and recalculates them into two-dimensional sensitivity maps of the corresponding diagnostic images. The resulting sensitivity profiles can be used in Sensitivity Encoding, where a more exact reconstruction can be obtained using the carefully estimated sensitivity maps of the images.
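
A common first-pass estimate of the kind of sensitivity map discussed above divides each coil image by the sum-of-squares combination of all coils; a minimal sketch (the thesis' Matlab script based on embedded calibration data is not reproduced):

```python
import numpy as np

def coil_sensitivities(coil_imgs, eps=1e-8):
    # coil_imgs: complex coil images, shape (n_coils, ny, nx).
    sos = np.sqrt((np.abs(coil_imgs) ** 2).sum(axis=0))  # combined image
    return coil_imgs / (sos + eps)  # relative sensitivity of each coil
```

Dividing the measured images by these maps is the corresponding intensity inhomogeneity correction.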

41. Tang, Shijun. "Investigation on Segmentation, Recognition and 3D Reconstruction of Objects Based on LiDAR Data or MRI." Thesis, University of North Texas, 2015. https://digital.library.unt.edu/ark:/67531/metadc801920/.

Abstract:
Segmentation, recognition and 3D reconstruction of objects are cutting-edge research topics with many applications, ranging from environmental, medical and geographical applications to intelligent transportation. In this dissertation, I focus on the segmentation, recognition and 3D reconstruction of objects using LiDAR data or MRI. There are three main contributions. (I) A feature extraction algorithm for sparse LiDAR data: a novel method is proposed for feature extraction from sparse LiDAR data; the algorithm and the related principles are described, and the choices and roles of its parameters are tested and discussed. By directly using the correlation of neighboring points, the statistical distribution of normal vectors at each point is effectively used to determine the category of the selected point. (II) Segmentation and 3D reconstruction of objects based on LiDAR or MRI: the 3D LiDAR data are layered, different categories are segmented, and 3D canopy surfaces of individual tree crowns and clusters of trees are reconstructed from LiDAR point data based on a region-based active contour model. The proposed method delineates 3D forest canopy naturally from the contours of raw LiDAR point clouds, and is suitable not only for a series of ideal cone shapes but also for other kinds of 3D shapes, as well as for other kinds of datasets such as MRI. (III) Novel algorithms for recognition of objects based on LiDAR or MRI: the feature extraction algorithm is applied to classify buildings and trees, and, more importantly, novel algorithms based on level set methods are employed to recognize not only buildings and trees, and different kinds of trees (e.g. oak trees and Douglas firs), but also the subthalamic nuclei (STNs). Using these level set based algorithms, a 3D model of the STNs in the brain is successfully reconstructed from the statistical data of previous investigations, with an anatomy atlas as reference; 3D rendering of the subthalamic nuclei and the skull directly from MR imaging is also utilized to determine the 3D coordinates of the STNs in the brain. In summary, novel methods and algorithms for segmentation, recognition and 3D reconstruction of objects are proposed, and the related experiments confirm their validity and demonstrate their accuracy, efficiency and effectiveness. A framework for segmentation, recognition and 3D reconstruction of objects is established, which has been applied to many research areas.
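
The neighbourhood statistics of normal vectors in part (I) presuppose per-point normal estimation, which is commonly done by PCA over k nearest neighbours; a generic sketch, not the dissertation's algorithm:

```python
import numpy as np
from scipy.spatial import cKDTree

def point_normals(points, k=12):
    # points: (N, 3) LiDAR coordinates.  The eigenvector of the local
    # covariance with the smallest eigenvalue approximates the normal.
    tree = cKDTree(points)
    _, idx = tree.query(points, k=k)
    normals = np.empty_like(points, dtype=float)
    for i, nb in enumerate(idx):
        centered = points[nb] - points[nb].mean(axis=0)
        _, vecs = np.linalg.eigh(centered.T @ centered)
        normals[i] = vecs[:, 0]
    return normals
```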
APA, Harvard, Vancouver, ISO, and other styles
43

Whitworth, Clifford K. "Equivalency of paper-pencil tests and computer-administered tests." Thesis, University of North Texas, 2001. https://digital.library.unt.edu/ark:/67531/metadc2741/.

Full text
Abstract:
Are computer-administered versions of a multiple-choice paper-pencil test equivalent to the original? This study determined whether there were any significant differences between taking a traditional paper-pencil test and taking the same test on a computer. The literature has shown that intervening variables cause differences when not controlled. To establish equivalency between test modes, the scores must have similar means, dispersions, and shapes, and the rank order of the scores must also be similar. Four tests were given over a 16-week semester. The sample was divided, half taking paper-pencil tests and half taking the same test administered by computer, and the mode of administration was switched with each test administration. The analysis showed that, when the intervening variables were controlled, the two modes of administration were equivalent. A 2x4 ANOVA showed no difference between test modes but showed that each test administration differed significantly. The Levene statistic tested whether dispersions were equivalent, and confidence intervals were established for the kurtosis and skewness statistics. Finally, each test score was transformed into its Normal Curve Equivalent so that Pearson's coefficient could be used to determine the equivalency of the rank orders.
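
The Normal Curve Equivalent mentioned at the end is a standard rescaling of percentile ranks onto a scale with mean 50 and standard deviation 21.06 (NCE = 50 + 21.06 z). The Python sketch below shows this transformation followed by the Pearson correlation; the score vectors are invented, and the percentile-to-z conversion via scipy is one reasonable implementation, not necessarily the study's.

    import numpy as np
    from scipy import stats

    def normal_curve_equivalent(scores):
        """Map raw scores to Normal Curve Equivalents:
        percentile rank -> z-score -> NCE = 50 + 21.06 * z."""
        pct = stats.rankdata(scores) / (len(scores) + 1)  # percentile in (0, 1)
        z = stats.norm.ppf(pct)
        return 50 + 21.06 * z

    # Hypothetical score vectors for the two administration modes.
    paper = np.array([71, 85, 90, 64, 78, 88, 93, 70])
    computer = np.array([69, 83, 92, 66, 75, 90, 91, 72])
    r, p = stats.pearsonr(normal_curve_equivalent(paper),
                          normal_curve_equivalent(computer))
    print(f"Pearson r = {r:.3f}, p = {p:.3f}")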
APA, Harvard, Vancouver, ISO, and other styles
44

Lee, Thomas Seward. "Software-based gradient nonlinearity distortion correction." CSUSB ScholarWorks, 2006. https://scholarworks.lib.csusb.edu/etd-project/3180.

Full text
Abstract:
The primary purpose of this thesis is to discuss the use of Magnetic Resonance Imaging (MRI) in functional proton radiosurgery. The methods presented were specifically designed to correct gradient nonlinearity distortion, the single greatest hurdle limiting the deployment of MRI-based functional proton radiosurgery systems. The new system central to the thesis fully utilizes MRI to localize anatomical targets with submillimeter accuracy. The thesis analyzes and solves the problems related to gradient nonlinearity distortion. The characteristics of proton radiosurgery are introduced, together with a discussion of its advantages over other current methods of radiation oncology. A historical background for proton radiosurgery is also presented, along with a description of its implementation at Loma Linda University Medical Center (LLUMC), where a new system for functional proton radiosurgery has been proposed and is currently under development.
APA, Harvard, Vancouver, ISO, and other styles
45

Millsap, Claudette M. "Comparison of Computer Testing versus Traditional Paper and Pencil Testing." Thesis, University of North Texas, 2000. https://digital.library.unt.edu/ark:/67531/metadc2621/.

Full text
Abstract:
This study evaluated 227 students attending 12 classes of the Apprentice Medical Services Specialist Resident Course. Six classes, containing a total of 109 students, took the Block One Tests in traditional paper-and-pencil form; another six classes, containing a total of 118 students, took the same Block One Tests on computers. A confidence level of .99 and a significance level of .01 were established. An independent-samples t-test was conducted on the sample, and a one-way analysis of variance was performed among the classes administered the Block One Tests on computers. Several other frequencies and comparisons of Block One Test scores and other variables were examined, including test versions, shifts, student age, student source, and education levels. The study found no significant difference between test administration modes and concluded that computer-administering tests identical to those typically administered in the traditional paper-and-pencil manner had no significant effect on achievement. It is important to note, however, that this conclusion may only be valid if the computer-administered test contains exactly the same items, in the same order and format, with the same layout, structure, and choices as the traditional paper-and-pencil test; unless the tests are identical in every way except the administration mode, the conclusion may not apply.
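
The two analyses named in the abstract, an independent-samples t-test between administration modes and a one-way ANOVA among the computer-administered classes, map directly onto standard library calls. A minimal Python sketch follows, with randomly generated score arrays standing in for the study's data.

    import numpy as np
    from scipy import stats

    # Hypothetical Block One Test scores; the real study had 109
    # paper-and-pencil and 118 computer-administered students.
    paper_scores = np.random.default_rng(0).normal(82, 6, size=109)
    computer_scores = np.random.default_rng(1).normal(81, 6, size=118)

    # Independent-samples t-test between the two administration modes.
    t, p = stats.ttest_ind(paper_scores, computer_scores)
    print(f"t = {t:.2f}, p = {p:.3f}  (significance level .01)")

    # One-way ANOVA among the six computer-administered classes.
    classes = np.array_split(computer_scores, 6)
    f, p_anova = stats.f_oneway(*classes)
    print(f"F = {f:.2f}, p = {p_anova:.3f}")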
APA, Harvard, Vancouver, ISO, and other styles
46

Fielder, Patrick J. (Patrick Joseph). "A Comparison of the Effectiveness of Computer Adaptive Testing and Computer Administered Testing." Thesis, University of North Texas, 1995. https://digital.library.unt.edu/ark:/67531/metadc279192/.

Full text
Abstract:
The problem with which this study is concerned is determining the effectiveness of a computer adaptive test as compared with administering the entire test. The study has a twofold purpose: first, to determine whether the two test versions generate equivalent scores despite being of different lengths; second, to determine whether the time needed to take the computer adaptive test is significantly shorter than that needed for the computer-administered full test.
APA, Harvard, Vancouver, ISO, and other styles
47

Kim, Kye Hyun 1956. "Classification of environmental hydrologic behaviors in Northeastern United States." Thesis, The University of Arizona, 1989. http://hdl.handle.net/10150/277083.

Full text
Abstract:
Environmental response to acidic deposition occurs through the vehicle of water movement in the ecosystem. As part of environmental studies of acidic deposition, an output-based hydrologic classification of basin hydrologies was performed using the distribution of baseflow, snowmelt, and direct runoff sources. Because of differences in flow paths and exposure duration, these components were assumed to represent distinct geochemical responses. As a first step, user-friendly software was developed to calculate baseflow by separating annual hydrographs; it also generates the hydrograph for visual analysis using a trial separation slope. Once the software was completed, about 1200 streamflow gauging stations in the Northeastern U.S. were accessed for flow separation and other hydrologic characteristics. Finally, cluster analysis was performed on the output of the streamflow analysis to classify streamflow behaviors with respect to acidic inflow. Based on regional baseflow properties, the resulting clusters define more efficient boundaries for the subregions than the regional boundaries currently used by the U.S. Environmental Protection Agency (U.S.E.P.A.) for environmental management of acidic deposition.
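
The thesis's own separation applies a trial slope to the annual hydrograph; a widely used alternative that is easy to show compactly is the Lyne-Hollick recursive digital filter, in which a quickflow component is filtered out and baseflow is what remains. The Python sketch below implements that filter, not the thesis's method; the parameter value 0.925 is a conventional choice from the filter literature.

    import numpy as np

    def lyne_hollick_baseflow(q, alpha=0.925):
        """One-pass Lyne-Hollick filter: split a daily streamflow
        series q into quickflow and baseflow components."""
        quick = np.zeros_like(q, dtype=float)
        for t in range(1, len(q)):
            # Recursive high-pass filter on the total flow signal.
            quick[t] = alpha * quick[t - 1] \
                + 0.5 * (1 + alpha) * (q[t] - q[t - 1])
            quick[t] = max(quick[t], 0.0)  # quickflow cannot be negative
        return q - quick  # what remains is baseflow

    # Baseflow index: fraction of total flow that is baseflow.
    q = np.array([5, 6, 30, 22, 14, 10, 8, 7, 6, 5], dtype=float)
    bf = lyne_hollick_baseflow(q)
    print("BFI =", bf.sum() / q.sum())

A per-station baseflow index of this kind is the sort of output variable that could feed the cluster analysis described above.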
APA, Harvard, Vancouver, ISO, and other styles
48

Siu, Ha-ping Angel, and 蕭霞萍. "Using web-based assessment for learning and teaching primary mathematics." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2004. http://hub.hku.hk/bib/B30424574.

Full text
APA, Harvard, Vancouver, ISO, and other styles
49

Kubitza, Andy James. "Using standardized test reading comprehension software to improve student academic achievement in reading comprehension." CSUSB ScholarWorks, 2007. https://scholarworks.lib.csusb.edu/etd-project/3262.

Full text
Abstract:
The purpose of this quantitative research study of fourth-grade students was to examine whether a web-based Standardized Test Preparation Intervention for reading comprehension was more effective and efficient in improving student academic achievement in reading comprehension than a paper-based Standardized Test Preparation Intervention. The paper-based reading comprehension intervention was found to be as effective as the web-based one.
APA, Harvard, Vancouver, ISO, and other styles
50

Neumann, Markus. "Automatic multimodal real-time tracking for image plane alignment in interventional Magnetic Resonance Imaging." PhD thesis, Université de Strasbourg, 2014. http://tel.archives-ouvertes.fr/tel-01038023.

Full text
Abstract:
Interventional magnetic resonance imaging (MRI) aims at performing minimally invasive percutaneous interventions, such as tumor ablations and biopsies, under MRI guidance. During such interventions, the acquired MR image planes are typically aligned with the surgical instrument (needle) axis and with surrounding anatomical structures of interest, in order to monitor the instrument's advancement inside the patient's body efficiently and in real time. Object tracking inside the MRI scanner is expected to facilitate and accelerate MR-guided interventions by allowing the image planes to be aligned to the surgical instrument automatically. In this PhD thesis, an image-based workflow is proposed and refined for automatic image plane alignment. The tracking workflow detects and tracks a passive marker directly in clinical real-time images and is designed for fully automated image plane alignment with minimal tracking-dedicated time. Its main drawback is its inherent dependence on the slow clinical MRI update rate. First, the addition of motion estimation and prediction with a Kalman filter was investigated and shown to improve tracking performance. Second, a complementary optical sensor was used for multi-sensor tracking, decoupling the tracking update rate from the MR image acquisition rate. The workflow was evaluated with both computer simulations and experiments on an MR-compatible testbed. The results show high robustness of the multi-sensor tracking approach for dynamic image plane alignment, owing to the combination of the individual strengths of each sensor.
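
A constant-velocity Kalman filter is the usual way to add the motion estimation and prediction step the abstract mentions: between slow MR updates the marker position is propagated by the state model, and each new detection corrects the state. The Python sketch below shows a 1D version of this standard filter; the state model, noise values, and measurement sequence are illustrative assumptions, not the thesis's tuned parameters.

    import numpy as np

    # Constant-velocity model: state x = [position, velocity].
    dt = 0.5                          # assumed MR image update interval (s)
    F = np.array([[1, dt], [0, 1]])   # state transition
    H = np.array([[1.0, 0.0]])        # only position is measured
    Q = 0.01 * np.eye(2)              # process noise (assumed)
    R = np.array([[0.25]])            # measurement noise (assumed)

    x = np.zeros((2, 1))              # initial state
    P = np.eye(2)                     # initial covariance

    for z in [0.0, 0.9, 2.1, 3.0, 4.2]:   # hypothetical marker positions (mm)
        # Predict: propagate state and covariance to the next frame.
        x = F @ x
        P = F @ P @ F.T + Q
        # Update: correct with the new marker detection.
        y = np.array([[z]]) - H @ x        # innovation
        S = H @ P @ H.T + R                # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)     # Kalman gain
        x = x + K @ y
        P = (np.eye(2) - K @ H) @ P
        print(f"measured {z:.1f} mm -> filtered {x[0, 0]:.2f} mm")

Between measurements, the predict step alone can be run at a higher rate, which is how such a filter bridges the gap left by the slow clinical MRI update rate.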
APA, Harvard, Vancouver, ISO, and other styles