Dissertations / Theses on the topic 'High-resolution simulation'


Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 50 dissertations / theses for your research on the topic 'High-resolution simulation.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Press it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse dissertations / theses in a wide variety of disciplines and organise your bibliography correctly.

1

Eilertsen, Gabriel. "High-resolution simulation and rendering of gaseous phenomena from low-resolution data." Thesis, Linköpings universitet, Medie- och Informationsteknik, 2010. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-70269.

Full text
Abstract:
Numerical simulations are often used in computer graphics to capture the effects of natural phenomena such as fire, water and smoke. However, simulating large-scale events in this way, with the details needed for feature film, poses serious problems. Grid-based simulations at resolutions sufficient to incorporate small-scale details would be costly and use large amounts of memory, and likewise for particle-based techniques. To overcome these problems, a new framework for simulation and rendering of gaseous phenomena is presented in this thesis. It combines different existing concepts for such phenomena to resolve many of the issues in using them separately, and the result is a potent method for highly detailed simulation and rendering at low cost. The developed method utilizes a slice refinement technique, where a coarse particle input is transformed into a set of two-dimensional view-aligned slices, which are simulated at high resolution. These slices are subsequently used in a rendering framework that accounts for light-scattering behaviour in participating media to achieve a final, highly detailed volume rendering. However, the transformations from three to two dimensions and back easily introduce visible artifacts, so a number of techniques have been considered to overcome these problems; for example, a turbulence function is used in the final volume density function to break up possible interpolation artifacts.
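A minimal sketch of the final idea mentioned above: modulating a reconstructed volume density with a procedural turbulence function to mask interpolation artifacts. The noise function, field shapes and blend weight below are illustrative assumptions, not the thesis's actual implementation.

```python
import numpy as np

def turbulence(p, octaves=4):
    """Cheap procedural turbulence (sum of sinusoidal octaves).
    Stands in for a proper noise function such as Perlin/simplex."""
    val = np.zeros(p.shape[:-1])
    freq, amp = 1.0, 0.5
    for _ in range(octaves):
        val += amp * np.abs(np.sin(freq * p[..., 0]) *
                            np.cos(freq * p[..., 1]) *
                            np.sin(freq * p[..., 2]))
        freq *= 2.0
        amp *= 0.5
    return val

def perturbed_density(rho_interp, points, strength=0.3):
    """Break up smooth interpolation artifacts in the reconstructed
    volume density by modulating it with high-frequency turbulence."""
    return rho_interp * (1.0 - strength + strength * turbulence(points))

# Example: a 32^3 block of interpolated density values.
grid = np.stack(np.meshgrid(*[np.linspace(0, 4, 32)] * 3, indexing="ij"), axis=-1)
rho = np.exp(-np.sum((grid - 2.0) ** 2, axis=-1))   # smooth placeholder density
rho_detail = perturbed_density(rho, grid)
```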
APA, Harvard, Vancouver, ISO, and other styles
2

Romero, Cindy G. "High Resolution Simulation of Synthetic Aperture Radar Imaging." DigitalCommons@CalPoly, 2010. https://digitalcommons.calpoly.edu/theses/345.

Full text
Abstract:
The goal of this Master’s thesis is to develop a more realistic simulation of Synthetic Aperture Radar (SAR) that has the ability to image detailed targets and that can be used for Automatic Target Recognition (ATR). This thesis project is part of ongoing SAR ATR research at California Polytechnic State University (Cal Poly) sponsored by Raytheon Space & Airborne Systems and supervised by Dr. John Saghri. SAR is a form of radar that takes advantage of the forward motion of an antenna mounted on a moving platform (such as an airplane or spacecraft) to synthetically produce the effect of a longer antenna. Since most SAR images used for military ATR are classified and not available to the general public, all academic research to date on ATR has been limited to a small data set of Moving and Stationary Target Acquisition and Recognition Radar (MSTAR) images. Due to the unavailability of radar equipment or a greater range of SAR data, it has been necessary to create a SAR image generation scheme in which the parameters of the radar platform can be directly modified and controlled for use in ATR applications. This thesis project focuses on making several improvements to Matthew Schlutz’s ‘Synthetic Aperture Radar Imaging Simulated in Matlab’ thesis. First, the simulation is optimized by porting the antenna pattern and echo generator from Matlab to C++, and the efficiency of the code is improved to reduce processing time. A three-dimensional (3-D) graphics application called Blender is used to create and position the target models in the scene imaged by the radar platform and to give altitude, target range (range of closest approach from the platform to the center area of the target) and elevation angle information to the radar platform. Blender allows the user to take pictures of the target as seen from the radar platform, and outputs range information from the radar platform plane to each point in the image. One of the major advantages of using Blender is that it also outputs range and reflectivity information about each pixel in the image. This is a significant characteristic that was hardcoded in the previous theses, making those simulations less realistic. For this thesis project, once the target scene is created in Blender, an image is rendered and saved as an OpenEXR file. The image is rendered in orthographic mode, a form of projection in which the target plane is parallel to the projection plane. This means that the simulation cannot image point targets that appear and disappear during the platform motion. The echo generation program then uses the range and reflectivity obtained from the OpenEXR file, the optimized antenna pattern, and several other user-defined parameters to create the echo (received signal). Once the echo is created in the echo generation program, it is read into Matlab, where it is passed through the Range Doppler Algorithm (RDA) to produce the final SAR image.
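As a hedged illustration of the kind of processing the Range Doppler Algorithm begins with, the sketch below performs range compression of a simulated linear-FM chirp echo with a frequency-domain matched filter. The pulse parameters and target delay are arbitrary assumptions, not values from the thesis.

```python
import numpy as np

# Assumed chirp parameters (illustrative only).
fs = 100e6          # sample rate [Hz]
T = 10e-6           # pulse duration [s]
B = 30e6            # chirp bandwidth [Hz]
K = B / T           # chirp rate [Hz/s]

t = np.arange(0, T, 1 / fs)
chirp = np.exp(1j * np.pi * K * t**2)          # transmitted LFM pulse

# Simulated received echo: the same chirp delayed by a target range.
c = 3e8
R = 1500.0                                      # target range [m]
delay = int(round(2 * R / c * fs))              # two-way delay in samples
rx = np.zeros(4096, dtype=complex)
rx[delay:delay + len(chirp)] = 0.5 * chirp      # attenuated echo

# Range compression = correlation with the transmitted pulse,
# done as multiplication by the conjugate spectrum.
n = len(rx)
H = np.conj(np.fft.fft(chirp, n))
compressed = np.fft.ifft(np.fft.fft(rx) * H)
peak_sample = np.argmax(np.abs(compressed))
print("estimated range [m]:", peak_sample * c / (2 * fs))
```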
APA, Harvard, Vancouver, ISO, and other styles
3

Saunders, II Charles Phillip. "High Resolution Imaging Ground Penetrating Radar Design and Simulation." Thesis, Virginia Tech, 2014. http://hdl.handle.net/10919/47806.

Full text
Abstract:
This paper describes the design and simulation of a microwave band, high resolution imaging ground penetrating radar. A conceptual explanation is given on the mechanics of wave-based imaging, followed by the governing radar equations. The performance specifications for the imaging system are given as inputs to the radar equations, which output the full system specifications. Those specifications are entered into a MATLAB simulation, and the simulation results are discussed with respect to both the mechanics and the desired performance. Finally, this paper discusses limitations of the design, both with the simulations and anticipated issues if the device is fully realized.
Master of Science
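For intuition about how performance specifications drive the radar equations mentioned above, here is a small hedged example computing the down-range resolution of a GPR in a dielectric medium; the bandwidth and relative permittivity are assumed values, not the thesis's design numbers.

```python
# Down-range resolution of a ground penetrating radar:
# delta_R = v / (2 * B), with propagation speed v = c / sqrt(eps_r).
c = 3e8          # speed of light in vacuum [m/s]
B = 1.0e9        # assumed system bandwidth [Hz]
eps_r = 9.0      # assumed relative permittivity of the soil

v = c / eps_r ** 0.5
delta_R = v / (2 * B)
print(f"propagation speed: {v:.3e} m/s, range resolution: {delta_R*100:.1f} cm")
```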
APA, Harvard, Vancouver, ISO, and other styles
4

Warhola, Paul J. "An analysis of alternative methods to conduct high-resolution activities in a variable-resolution simulation." Thesis, Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 1997. http://handle.dtic.mil/100.2/ADA337495.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Piotrowski, Jesse Alex. "Development of a high-resolution two-dimensional urban/rural flood simulation." Thesis, University of Iowa, 2010. https://ir.uiowa.edu/etd/574.

Full text
Abstract:
Numerical modeling of extreme flooding in an urban area in eastern Iowa is presented. Modeling is performed using SRH-2D, an unstructured-grid, finite-volume model that solves the depth-averaged shallow-water equations. Data from a photogrammetric stereo compilation, contour maps, a hydrographic survey and building records were used to create a digital elevation model depicting the river channel and floodplain. A spatially distributed Manning coefficient, based on a land cover classification derived from aerial photography, is also used. The model is calibrated with high-resolution inundation depth data derived from a 1 m light detection and ranging survey collected during the falling limb of the flood hydrograph, and with discrete global positioning system measurements of water surface elevation at a bankfull condition. The model is validated with discrete high-water marks collected immediately after the flood event. Results show the model adequately represents the water surface elevation in the main channel and floodplain and that exclusion of the discharges from minor creeks did not affect simulation accuracy. Reach-scale results are not affected by the presence of buildings, but local inconsistencies occur in shallow water if buildings are not removed from the mesh. An unsteady hydrograph approximates flood hydrodynamics better than a steady-state simulation, but its extreme computation time makes it impractical for most investigations. The two-dimensional model was also compared with a one-dimensional model of the study reach. The 1D model suffered from an inability to accurately predict inundation depth throughout the entire study area.
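For reference, a hedged statement of the depth-averaged shallow-water equations that such a 2D model solves, written here in a generic conservative form with Manning friction; the exact formulation and source terms used by SRH-2D may differ.

```latex
\begin{aligned}
&\frac{\partial h}{\partial t} + \frac{\partial (hu)}{\partial x} + \frac{\partial (hv)}{\partial y} = 0,\\
&\frac{\partial (hu)}{\partial t} + \frac{\partial}{\partial x}\!\left(hu^2 + \tfrac{1}{2}gh^2\right) + \frac{\partial (huv)}{\partial y}
 = -gh\,\frac{\partial z_b}{\partial x} - \frac{g\,n^2\,u\sqrt{u^2+v^2}}{h^{1/3}},\\
&\frac{\partial (hv)}{\partial t} + \frac{\partial (huv)}{\partial x} + \frac{\partial}{\partial y}\!\left(hv^2 + \tfrac{1}{2}gh^2\right)
 = -gh\,\frac{\partial z_b}{\partial y} - \frac{g\,n^2\,v\sqrt{u^2+v^2}}{h^{1/3}},
\end{aligned}
```

where h is the flow depth, (u, v) the depth-averaged velocities, z_b the bed elevation, g gravity and n the spatially distributed Manning roughness coefficient.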
APA, Harvard, Vancouver, ISO, and other styles
6

Tang, Dong. "Studies of computer aided image interpretation in high resolution electron microscopy." Thesis, University of Cambridge, 1992. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.240064.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Coville, Michael Paul. "A methodology for validation of high resolution combat models." Thesis, Monterey, California. Naval Postgraduate School, 1988. http://hdl.handle.net/10945/23305.

Full text
Abstract:
Approved for public release; distribution is unlimited
Senior officers in the United States Army have a high degree of confidence that National Training Center simulated combat results are representative, under similar circumstances, of actual combat. A validation methodology for high resolution combat models, primarily based on data acquired from the National Training Center, is the focus of this thesis. The validation methodology, where appropriate, translates confidence in National Training Center realism to confidence in the combat model. Theoretical issues, existing methodologies, and the impact of model purpose are considered in this research. The final product is a validation methodology that makes use of a realistic representation of combat, automatically updates validation criteria to account for changes in weapons and tactics, and is responsive to the purpose for which the model was designed.
http://archive.org/details/methodologyforva00covi
Captain, United States Army
APA, Harvard, Vancouver, ISO, and other styles
8

Siu, Christopher E. "Simulating Epidemics and Interventions on High Resolution Social Networks." DigitalCommons@CalPoly, 2019. https://digitalcommons.calpoly.edu/theses/2051.

Full text
Abstract:
Mathematical models of disease spreading are key to ensuring that we are prepared to deal with the next epidemic. They allow us to predict how an infection will spread throughout a population, thereby allowing us to make intelligent choices when attempting to contain the disease. Whether due to a lack of empirical data, a lack of computational power, a lack of biological understanding, or some combination thereof, traditional models must make sweeping assumptions about the behavior of a population during an epidemic. In this thesis, we implement granular epidemic simulations using a rich social network constructed from real-world interactions. We develop computational models for three diseases, and we use these simulations to demonstrate the effects of twelve potential intervention strategies, both before and during a hypothetical epidemic. We show how representing a population as a temporal graph and applying existing graph metrics can lead to more effective interventions.
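A minimal sketch of the kind of granular simulation described above: a discrete-time SIR process run over a list of time-stamped contacts (a temporal graph). The contact data, rates and seeding are illustrative assumptions, not the thesis's models or dataset.

```python
import random
from collections import defaultdict

def sir_on_contacts(contacts, beta=0.1, gamma_days=5, seed_node=0):
    """contacts: list of (day, person_a, person_b) interaction records."""
    state = defaultdict(lambda: "S")          # S, I or R per person
    recover_on = {seed_node: gamma_days}      # day each infected person recovers
    state[seed_node] = "I"

    by_day = defaultdict(list)
    for day, a, b in contacts:
        by_day[day].append((a, b))

    history = []
    for day in sorted(by_day):
        # recoveries scheduled for this day or earlier
        for person, d in list(recover_on.items()):
            if d <= day:
                state[person] = "R"
                del recover_on[person]
        # transmission along today's contacts
        for a, b in by_day[day]:
            for src, dst in ((a, b), (b, a)):
                if state[src] == "I" and state[dst] == "S" and random.random() < beta:
                    state[dst] = "I"
                    recover_on[dst] = day + gamma_days
        history.append((day, sum(1 for s in state.values() if s == "I")))
    return history

# Tiny synthetic contact list: (day, person_a, person_b).
contacts = [(d, random.randrange(50), random.randrange(50))
            for d in range(30) for _ in range(40)]
print(sir_on_contacts(contacts))
```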
APA, Harvard, Vancouver, ISO, and other styles
9

Rice, Matthew Jason. "High Resolution Simulation of Laminar and Transitional Flows in a Mixing Vessel." Diss., Virginia Tech, 2011. http://hdl.handle.net/10919/27716.

Full text
Abstract:
The present work seeks to fully investigate, describe and characterize the distinct flow regimes existing within a mixing vessel at various rotational speeds. This investigation is computational in nature and simulates the flow within a baffled tank containing a Rushton turbine of the standard configuration. For a Reynolds number based on impeller diameter and blade rotational speed (Re ≡ ρND²/μ) the following flow regimes were identified and investigated in detail: reverse/reciprocating flows at very low Re (<10); stalled flows at low Re (≈10); laminar pumping flow for higher Re; and transitional pumping flow (10² < Re < 10⁴). For the three Reynolds numbers 1, 10 and 28, it was found that for the highest Re number (28), the flow exhibited the familiar outward pumping action associated with radial impellers under turbulent flow conditions. However, as the Re number decreases, the net radial flow during one impeller revolution was reduced, and for the lowest Re number a reciprocating motion with negligible net pumping was observed. In order to elucidate the physical mechanism responsible for the observed flow pattern at low Re, the forces acting on a fluid element in the radial direction were analyzed. Based on this analysis, a simplified quasi-analytic model of the flow was developed that gives a satisfactory qualitative, as well as quantitative, representation of the flow at very low Re. Investigation of the transitional flow regime (Re ≈ 3000) includes a compilation and characterization of ensemble and turbulent quantities such as the Reynolds stress components, the dissipation length scale η and time scale τ, as well as a detailed investigation of the near-impeller flow and trailing vortex. Calculation and compilation of all terms in the turbulent kinetic energy transport equation was performed (including generation and the elusive turbulent pressure work). Specifically, the most important transport mechanism was turbulent convection/diffusion from the impeller disk-plane/trailing vortex region. Mean flow transport of turbulent kinetic energy was primarily towards the impeller disk-plane and radially outward from the trailing vortex region. The turbulent pressure work was found to partially counteract turbulent convection. Turbulent dissipation followed by turbulent viscous work were found to be the least important mechanisms responsible for turbulent transport, with both terms being maximized within the vortex region and at the disk-plane downstream from the vortices.
Ph. D.
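For orientation, a small hedged example evaluating the impeller Reynolds number Re = ρND²/μ used above; the fluid properties and geometry are arbitrary assumptions chosen only to land in the regimes discussed.

```python
def impeller_reynolds(rho, N, D, mu):
    """Re = rho * N * D^2 / mu, with N in revolutions per second."""
    return rho * N * D**2 / mu

# Assumed glycerol-like fluid and a small Rushton turbine (illustrative values).
rho, mu, D = 1250.0, 0.9, 0.1        # kg/m^3, Pa*s, m
for N in (0.07, 0.7, 2.0, 215.0):    # rev/s chosen to span the regimes
    print(f"N = {N:7.2f} rev/s  ->  Re = {impeller_reynolds(rho, N, D, mu):10.1f}")
```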
APA, Harvard, Vancouver, ISO, and other styles
10

Groom, Michael Robert. "Direct Numerical Simulation of Shock-Induced Turbulent Mixing with High-Resolution Methods." Thesis, University of Sydney, 2020. https://hdl.handle.net/2123/23721.

Full text
Abstract:
Turbulent mixing evolving from the Richtmyer-Meshkov instability, also known as shock-induced turbulent mixing, is investigated using numerical simulations of fundamental test problems with high-resolution computational methods. An existing state-of-the-art implicit large eddy simulation algorithm for compressible multispecies flows is extended to include the effects of viscous dissipation, thermal conductivity and species diffusion by deriving a novel set of governing equations for binary mixtures. This allows direct numerical simulations of shock-induced turbulent mixing to be performed for arbitrary gas mixtures, including cases where the ratio of specific heats varies with mixture composition, at much greater computational efficiency. Using direct numerical simulation, a detailed study is performed of the effects of Reynolds number on the transition to turbulence in shock-induced mixing evolving from narrowband initial conditions. Even though the turbulence in the highest Reynolds number case is not fully developed, a careful analysis shows that the high Reynolds number limit of several key quantities can be estimated from the present data. The mixing layer is also shown to be persistently anisotropic at all Reynolds numbers, which has important consequences for modelling. At the time of writing, the highest Reynolds number case from this set of simulations is the highest achieved in any fully resolved direct numerical simulation presented in the open literature for this class of problems. Implicit large eddy simulation is employed to investigate the influence of broadband initial conditions on the late-time evolution of a shock-induced turbulent mixing layer. Both the bandwidth of initial modes and their relative amplitudes are varied, showing that both the growth rate of the mixing layer width and the decay rate of fluctuating kinetic energy strongly depend on initial conditions. Finally, both implicit large eddy simulations and direct numerical simulations are performed of an idealised shock tube experiment to analyse the effects of additional long-wavelength, low-amplitude modes in the initial perturbation. These calculations represent the first direct numerical simulations performed of Richtmyer-Meshkov instability evolving from broadband initial conditions.
APA, Harvard, Vancouver, ISO, and other styles
11

Del, Puppo Norman. "High resolution ship hydrodynamics simulations in open source environment." Doctoral thesis, Università degli studi di Trieste, 2015. http://hdl.handle.net/10077/10983.

Full text
Abstract:
2013/2014
The numerical simulation of wake and free-surface flow around ships is a complex topic that involves multiple tasks: the generation of an optimal computational grid and the development of numerical algorithms capable of predicting the flow field around a hull. In this work, a numerical framework is developed, aimed at high-resolution CFD simulations of turbulent, free-surface flows around ship hulls. The framework consists of the concatenation of “tools” in the open-source finite volume library OpenFOAM®. A novel, flexible mesh-generation algorithm is presented, capable of producing high-quality computational grids for free-surface ship hydrodynamics. The numerical framework is used to solve some benchmark problems, providing results that are in excellent agreement with the experimental measurements.
XXVII Ciclo
APA, Harvard, Vancouver, ISO, and other styles
12

Hahn, Marco. "Implicit large-eddy simulation of low-speed separated flows using high-resolution methods." Thesis, Cranfield University, 2008. http://hdl.handle.net/1826/2633.

Full text
Abstract:
Most flows of practical importance are governed by viscous near-wall phenomena leading to separation and subsequent transition to a turbulent state. This type of problem currently poses one of the greatest challenges for computational methods because its characteristics cover a wide range of physical processes that often place contradictory requirements on the numerics employed. This thesis seeks to investigate the physics of complex, separated flows pertinent to aeronautical engineering and to assess the performance of variants of the Implicit Large-Eddy Simulation approach in predicting this type of problem realistically. For this purpose, different numerical solution strategies based on high-resolution methods, distinguished by their order of accuracy, are used in precursor simulations and one selected approach is applied to a fully three-dimensional wing flow. In order to isolate the development from laminar to turbulent flow after separation has occurred, the prototype Taylor-Green vortex is considered. Here, the behaviour of the numerical schemes during the linear, non-linear and fully turbulent stages of the flow evolution is tested for different grid sizes. It is found that the resolving power and the likelihood of symmetry breaking increase with the order of accuracy of the numerical method. These two properties allow the flow to develop more realistically on coarse grids if higher-order schemes are employed. In the next step, flow separation from a gently curved surface is included. The fundamental study of a statistically two-dimensional channel flow with hill-type constrictions demonstrates the basic applicability of ILES to problems featuring massive separation. Without specific wall treatment, high-resolution methods can improve prediction of the detachment location when compared to classical Large-Eddy Simulations. Finally, an ILES simulation of three-dimensional flow over a swept wing geometry at moderate angle of incidence is presented. The results are in excellent agreement with experiment in the fully separated and turbulent region and they are more accurate than a classical hybrid RANS/LES approach, using a grid twice the size, over the majority of the wing. This outcome will likely help settle the dispute that has erupted in the past over the applicability of ILES to complex, wall-bounded flows.
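As a concrete anchor for the Taylor-Green vortex test mentioned above, here is a hedged sketch of the standard single-mode initial condition on a periodic box; the amplitude, box size and grid are assumptions, and the thesis's exact non-dimensionalisation may differ.

```python
import numpy as np

def taylor_green_ic(n=64, U0=1.0, L=2 * np.pi, rho0=1.0, p0=100.0):
    """Classical 3D Taylor-Green vortex initial velocity and pressure fields
    on an n^3 periodic grid of side L."""
    x = np.linspace(0.0, L, n, endpoint=False)
    X, Y, Z = np.meshgrid(x, x, x, indexing="ij")
    u = U0 * np.sin(X) * np.cos(Y) * np.cos(Z)
    v = -U0 * np.cos(X) * np.sin(Y) * np.cos(Z)
    w = np.zeros_like(u)
    p = p0 + rho0 * U0**2 / 16.0 * (np.cos(2 * X) + np.cos(2 * Y)) * (np.cos(2 * Z) + 2.0)
    return u, v, w, p

u, v, w, p = taylor_green_ic()
# Kinetic energy per unit mass of the initial field (should be ~ U0^2 / 8).
print("mean KE:", 0.5 * np.mean(u**2 + v**2 + w**2))
```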
APA, Harvard, Vancouver, ISO, and other styles
13

Fu, Xiaojing S. M. Massachusetts Institute of Technology. "High-resolution simulation of pattern formation and coarsening dynamics in 3D convective mixing." Thesis, Massachusetts Institute of Technology, 2015. http://hdl.handle.net/1721.1/106958.

Full text
Abstract:
Thesis: S.M., Massachusetts Institute of Technology, School of Engineering, Center for Computational Engineering, Computation for Design and Optimization Program, 2015.
Cataloged from PDF version of thesis.
Includes bibliographical references (pages 45-47).
Geologic CO₂ sequestration is considered a promising tool to reduce anthropogenic CO₂ emissions while allowing continued use of fossil fuels for the time being. The process entails capturing CO₂ at point sources such as coal-fired power plants, and injecting it in its supercritical state into deep saline aquifers for long-term storage. Upon injection, CO₂ partially dissolves in groundwater to form an aqueous solution that is denser than groundwater. The local increase in density triggers a gravitational instability at the boundary layer that further develops into columnar CO₂-rich plumes that sink away. This mechanism, also known as convective mixing, greatly accelerates the dissolution rate of CO₂ into water and provides secure storage of CO₂ underground. Understanding convective mixing in the context of CO₂ sequestration is essential for the design of injection and monitoring strategies that prevent leakage of CO₂ back into the atmosphere. While current studies have elucidated various aspects of this phenomenon in 2D, little is known about the process in 3D. In this thesis we investigate the pattern-formation aspects of convective mixing during geological CO₂ sequestration by means of high-resolution three-dimensional simulation. We find that the CO₂ concentration field self-organizes as a cellular network structure in the diffusive boundary layer right beneath the top boundary. By studying the statistics of the cellular network, we identify various regimes of finger coarsening over time, the existence of a nonequilibrium stationary state, and a universal scaling of 3D convective mixing. We explore the correlation between the observed network pattern and the 3D flow structure predicted by hydrodynamic stability theory.
by Xiaojing Fu.
S.M.
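For context, convective dissolution in a porous layer is commonly characterised by a Rayleigh number comparing buoyancy-driven advection to diffusion. A hedged, generic form (definitions vary between studies, and this is not necessarily the one used in the thesis) is

```latex
\mathrm{Ra} \;=\; \frac{\Delta\rho \, g \, k \, H}{\phi \, \mu \, D},
```

where Δρ is the density increase due to dissolved CO₂, k the permeability, H the layer thickness, φ the porosity, μ the brine viscosity and D the effective diffusivity; convective fingering sets in only above a critical Ra.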
APA, Harvard, Vancouver, ISO, and other styles
14

Jimenez, Eduardo Antonio. "Fast history matching of time-lapse seismic and production data for high resolution models." Diss., Texas A&M University, 2008. http://hdl.handle.net/1969.1/85950.

Full text
Abstract:
Integrated reservoir modeling has become an important part of day-to-day decision analysis in oil and gas management practices. A very attractive and promising technology is the use of time-lapse or 4D seismic as an essential component in subsurface modeling. Today, 4D seismic is enabling oil companies to optimize production and increase recovery through monitoring fluid movements throughout the reservoir. 4D seismic advances are also being driven by an increased need by the petroleum engineering community to become more quantitative and accurate in our ability to monitor reservoir processes. Qualitative interpretations of time-lapse anomalies are being replaced by quantitative inversions of 4D seismic data to produce accurate maps of fluid saturations, pore pressure, temperature, among others. Within all the steps involved in this subsurface modeling process, the most demanding one is integrating the geologic model with dynamic field data, including 4D seismic when available. The validation of the geologic model with observed dynamic data is accomplished through a "history matching" (HM) process typically carried out with well-based measurements. Due to the low resolution of production data, the validation process is severely limited in its areal reservoir coverage, compromising the quality of the model and any subsequent predictive exercise. This research will aim to provide a novel history matching approach that can use information from high-resolution seismic data to supplement the areally sparse production data. The proposed approach will utilize streamline-derived sensitivities as a means of relating the forward model performance with the prior geologic model. The essential ideas underlying this approach are similar to those used for high-frequency approximations in seismic wave propagation. In both cases, this leads to solutions that are defined along "streamlines" (fluid flow) or "rays" (seismic wave propagation). Synthetic and field data examples will be used extensively to demonstrate the value and contribution of this work. Our results show that the problem of non-uniqueness in this complex history matching problem is greatly reduced when constraints in the form of saturation maps from spatially closely sampled seismic data are included. Furthermore, our methodology can be used to quickly identify discrepancies between static and dynamic modeling. Reducing this gap will ensure robust and reliable models, leading to accurate predictions and ultimately an optimum hydrocarbon extraction.
APA, Harvard, Vancouver, ISO, and other styles
15

Münnich, Astrid. "Simulation studies for a high resolution time projection chamber at the international linear collider." [S.l.] : [s.n.], 2007. http://deposit.ddb.de/cgi-bin/dokserv?idn=984635068.

Full text
APA, Harvard, Vancouver, ISO, and other styles
16

Shende, Sachin. "Database-driven hydraulic simulation of canal irrigation networks using object-oriented high-resolution methods." Thesis, Loughborough University, 2006. https://dspace.lboro.ac.uk/2134/14208.

Full text
Abstract:
Canal hydraulic models can be used to understand the hydraulic behaviour of large and complex irrigation networks at low cost. A number of computational hydraulic models were developed and tested in the early 1970s and late 80s. Most were developed using finite difference schemes and procedural programming languages. In spite of the importance of these models, little progress was made on improving the numerical algorithms behind them. Software development efforts were focused more on developing the user interface rather than the core algorithm. This research develops a database-driven, object-oriented hydraulic simulation model for canal irrigation networks using modern high-resolution shock-capturing techniques that are capable of handling a variety of flow situations, including trans-critical flow, shock propagation, flows through gated structures and channel networks. The technology platforms were carefully selected by taking into account multi-user support and possible migration of the new software to a web-based one, integrating a Java-based object-oriented model with a relational database management system that is used to store network configuration and simulation parameters. The developed software is tested using a benchmark test suite formulated jointly by the Department for Environment, Food and Rural Affairs (DEFRA) and the Environment Agency (EA). A total of eight tests (seven of them adapted from the DEFRA/EA benchmark suite) were run and the results compiled. The developed software outperformed ISIS, HEC-RAS and MIKE 11 in three of the benchmark tests and performed equally well in the other four. The outcome of this research is therefore a new category of hydraulic simulation software that uses modern shock-capturing methods fully integrated with a relational configuration database, and that has been fully evaluated and tested.
APA, Harvard, Vancouver, ISO, and other styles
17

Gholami, Vida. "Fuzzy rock typing enhancing reservoir simulation and modeling by honoring high resolution geological models /." Morgantown, W. Va. : [West Virginia University Libraries], 2009. http://hdl.handle.net/10450/10555.

Full text
Abstract:
Thesis (M.S.)--West Virginia University, 2009.
Title from document title page. Document formatted into pages; contains xiii, 120 p. : ill. (some col.). Includes abstract. Includes bibliographical references (p. 89-93).
APA, Harvard, Vancouver, ISO, and other styles
18

Moore, Matthew Roger. "Development of a high-resolution 1D/2D coupled flood simulation of Charles City, Iowa." Thesis, University of Iowa, 2011. https://ir.uiowa.edu/etd/1032.

Full text
Abstract:
The development of a high-resolution coupled one-dimensional/two-dimensional hydrodynamic model of Charles City, Iowa is presented in this study as part of a larger Iowa Flood Center initiative to create a library of steady inundation maps for communities in Iowa which have a high risk of flooding. Channel geometry from bathymetric surveys and surface topography from LiDAR were combined to create the digital elevation model (DEM) used in numerical simulations. Coupled one- and two-dimensional models were used to simulate flood events; the river channel and structures were modeled one-dimensionally, and the floodplain was modeled two-dimensionally. Spatially distributed roughness parameters were estimated using the 2001 National Land Cover Dataset. Simulations were performed at a number of mesh resolutions, and the results were used to investigate the effectiveness of re-sampling simulation results using higher-resolution DEMs. The effect of removing buildings from the computational mesh was also investigated. During 2011, the stream channel geometry is being changed as part of a recreational park in downtown Charles City. After incorporating the planned changes to the stream channel, the model was used to create a library of steady inundation maps which are available on the Iowa Flood Center website.
APA, Harvard, Vancouver, ISO, and other styles
19

Denby, Leif Christopher. "Using high-resolution modelling to improve the parameterisation of convection in a climate model." Thesis, University of Cambridge, 2017. https://www.repository.cam.ac.uk/handle/1810/269850.

Full text
Abstract:
In this work high-resolution numerical simulation (Large-Eddy Simulation, LES) has been used to study the characteristic factors causing and influencing the development of moist convective clouds. Through this work a 1D cloud-model was derived from first principles to represent the vertical profile of individual convective clouds. A microphysics framework was implemented to ensure identical behaviour in LES and cloud-model integration, where the microphysical processes represented are numerically integrated using a novel adaptive-step microphysics integration which uses the physical speed at which a process takes place to adjust the integration step size (in space and time). This work also introduces a simple representation of cloud-droplet formation which allows super-saturation to exist in-cloud and thereby provides a more physical representation of the in-cloud state. Together with high-resolution simulation of isolated individual and interacting multiple clouds in environmental conditions leading to shallow convection, the 1D cloud-model was used to infer that the principal influence on moist convective clouds is the entrainment of air from a cloud’s immediate environment, which is significantly different from the environmental mean state. This suggests that convection parameterisations must represent the influence of moist convective downdrafts to properly predict the vertical structure of convective clouds so as to correctly predict the cloud-top height and vertical transport. Finally, it was found that cloud-base radius is not in itself adequate as a means of classification for defining cloud types, as clouds with the same cloud-base radius showed large variation (≈ 600 m) in cloud-top height. Based on simulations of individual convective clouds it was found that 3D simulations are necessary to capture the full dynamic behaviour of convective clouds (2D axisymmetric simulations have too little entrainment) and that agreement with the 1D cloud-model could only be found when entrainment was diagnosed from simulation instead of being parameterised by the traditional Morton-Turner model, and only for 2D axisymmetric simulations, suggesting that the 1D cloud-model will require further extension or the diagnosis of entrainment improved.
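For reference, the traditional Morton-Turner closure mentioned above parameterises entrainment with a constant coefficient. In one common hedged form (notation varies, and this is offered only for orientation, not as the thesis's exact formulation), the fractional entrainment rate of a plume or cloud of radius R is

```latex
\varepsilon \;\equiv\; \frac{1}{M}\frac{dM}{dz} \;=\; \frac{2\alpha}{R},
```

where M is the vertical mass flux and α an empirical entrainment constant (of order 0.1 for simple plumes); the thesis instead argues for diagnosing entrainment directly from the simulated fields.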
APA, Harvard, Vancouver, ISO, and other styles
20

Tarhan, Tanil. "Numerical Simulation Of Laminar Reacting Flows." PhD thesis, METU, 2004. http://etd.lib.metu.edu.tr/upload/2/12605307/index.pdf.

Full text
Abstract:
Novel sequential and parallel computational fluid dynamics (CFD) codes based on the method of lines (MOL) approach were developed for the numerical simulation of multi-component reacting flows using detailed transport and thermodynamic models. Both codes were applied to the prediction of a confined axisymmetric laminar co-flowing methane-air diffusion flame for which experimental data were available in the literature. A flame-sheet model for infinite-rate chemistry and one-, two-, five- and ten-step reduced finite-rate reaction mechanisms were employed for the methane-air combustion sub-model. A second-order high-resolution total variation diminishing (TVD) scheme based on a Lagrange interpolation polynomial was proposed in order to alleviate spurious oscillations encountered in the time evolution of flame propagation. Steady-state velocity, temperature and species profiles obtained by using the infinite- and finite-rate chemistry models were validated against experimental data and other numerical solutions. They were found to be in reasonably good agreement with measurements and numerical results. The proposed difference scheme produced accurate results without the spurious oscillations and numerical diffusion encountered in classical schemes and hence was found to be a successful scheme applicable to strongly convective flow problems with non-uniform grid resolution. The code was also found to be an efficient tool for the prediction and understanding of transient combustion systems. This study constitutes the initial steps in the development of an efficient numerical scheme for direct numerical simulation (DNS) of unsteady, turbulent, multi-dimensional combustion with complex chemistry.
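The thesis's scheme is a Lagrange-polynomial-based TVD method; as a generic, hedged illustration of how a TVD limiter suppresses spurious oscillations in convective transport, here is a minimal minmod-limited MUSCL reconstruction of cell-interface values (not the thesis's actual scheme).

```python
import numpy as np

def minmod(a, b):
    """Minmod slope limiter: picks the smaller slope, or zero at extrema."""
    return np.where(a * b > 0.0, np.where(np.abs(a) < np.abs(b), a, b), 0.0)

def muscl_left_states(q):
    """Second-order, TVD reconstruction of the left state at each interior
    cell face i+1/2 from cell averages q (1D, uniform grid)."""
    dq_minus = q[1:-1] - q[:-2]       # backward differences
    dq_plus = q[2:] - q[1:-1]         # forward differences
    slope = minmod(dq_minus, dq_plus)
    return q[1:-1] + 0.5 * slope      # limited extrapolation to the face

# Example: a step profile; the limited reconstruction adds no new extrema.
q = np.where(np.arange(20) < 10, 1.0, 0.0)
print(muscl_left_states(q))
```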
APA, Harvard, Vancouver, ISO, and other styles
21

Dehlinger, Mael. "XAS-XEOL and XRF spectroscopies using near field microscope probes for high-resolution photon collection." Thesis, Aix-Marseille, 2013. http://www.theses.fr/2013AIXM4048/document.

Full text
Abstract:
Scanning probe microscopes allow sample topography to be obtained with up to atomic resolution. X-ray spectroscopies allow elemental and structural analysis of a sample with accuracy better than 1 Å. The lateral resolution is limited by the primary beam diameter, currently a few µm². We have chosen to couple these two techniques. Local visible luminescence from the sample is collected through a low-aperture sharp optical fibre, the probe of a shear-force microscope. This technique was used to characterize microstructured semiconducting samples to achieve simultaneously the surface topography and luminescence mapping. The results were obtained using either synchrotron radiation or a laboratory microsource equipped with a polycapillary lens. To extend this concept to a wider variety of materials, local XRF collection by an EDX detector equipped with a cylindrical X-ray capillary was tested. A cobalt sample irradiated with the microsource was used for technique evaluation. The dependence of the signal magnitude on the capillary diameter was measured. Modelling and numerical calculations were developed to estimate the signal magnitude that could be detected using a 1 µm diameter capillary. The optimal system geometry was determined. Scanning probe microscopy combined with XRF analysis could thereby lead to simultaneous acquisition of sample topography and chemical mapping. The expected lateral resolution using synchrotron radiation is 100 nm, while sub-1 µm resolution is realistic with a laboratory source. This technique would allow a particular micro- or nano-object on the surface to be selected and its chemical analysis performed.
APA, Harvard, Vancouver, ISO, and other styles
22

Kokkinakis, Ioannis William. "Investigation of high-resolution methods in large-eddy simulation of subsonic and supersonic wall turbulent flows." Thesis, Cranfield University, 2009. http://hdl.handle.net/1826/3749.

Full text
Abstract:
This thesis presents the motivation, objectives and reasoning behind the undertaken PhD to investigate the capability of compressible Implicit Large Eddy Simulation (ILES) in simulating wall-bounded inhomogeneous flows, with particular interest in the near-wall region, and further presents the progress achieved to date. The investigation includes assessing the ability of current ILES methods to resolve inhomogeneous turbulence as well as compressible turbulent boundary layers, and improving those models further. A channel flow is an excellent problem with which to investigate the properties of an SGS model near a wall. The presence of a solid boundary tends to alter the behaviour of the turbulent flow in a number of ways that need to be modeled by the SGS model in order to correctly represent the flow near the wall and, most importantly, the boundary layer. The presence of the wall inhibits the growth of the small scales, alters the exchange mechanisms between the resolved and unresolved scales, and finally gives rise in the SGS near-wall region to important Reynolds-stress-producing events. A literature survey was carried out to identify other numerical investigations simulating channel flow as well as data that could be used for validation purposes. The main parameters used to validate the level of resolution in simulating channel flow are identified and a number of tools are developed. The primary parameters extensively used to validate LES simulations of channel flow throughout the literature are mean flow velocity profiles, turbulent kinetic energy, dissipation and shear stress profiles, wall shear stress and friction velocities, as well as energy spectra in the spanwise and streamwise homogeneous directions. Compressible viscous ILES of inhomogeneous anisotropic turbulence in an incompressible channel flow at wall-normal grid resolutions of 68, 96 and 128 cells are carried out with grid clustering applied in the wall-normal direction. Initial results obtained in the compressible regime show that in order to obtain satisfactory results, medium and fine grids are required, whereas on coarser grids some additional numerical treatment is required. Each reconstruction scheme introduces a numerical dissipation characteristic of itself that may be regarded as a sort of turbulence model. Thus, depending on the required dissipation, a suitable limiter can be chosen. The investigation then moves on to supersonic turbulent flow incorporating shock-boundary layer interaction. Only the slope limiters that prove to simulate the flow in the fully developed turbulent channel best are favoured and then also utilised in the subsequent compressible ramp simulations. The capability of modelling the shock-boundary layer interaction, mean turbulent profiles and shockwave angle is investigated and compared against results obtained by DNS simulations. It is found that the grid at the inlet of the ramp plays a significant role, since it needs to be fine enough to maintain the turbulent inflow at an acceptable level before reaching the shock-boundary layer interaction zone. Further, very high-order numerical reconstructions were found to have difficulties remaining stable in the high-gradient regions of the flow when formulated in conservative form, and therefore solutions could not be obtained. Nonetheless, lower-order reconstruction methods ran smoothly and the momentum profiles obtained closely matched those obtained by DNS.
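As a hedged aside on the wall-resolution quantities mentioned above (friction velocity and wall units), the snippet below converts a wall shear stress into u_tau and the y+ of the first grid point; the numbers are placeholders, not values from the thesis.

```python
import math

def wall_units(tau_w, rho, nu, y1):
    """Friction velocity and y+ of the first off-wall grid point."""
    u_tau = math.sqrt(tau_w / rho)       # friction velocity [m/s]
    y_plus = y1 * u_tau / nu             # first-cell height in wall units
    return u_tau, y_plus

# Placeholder channel-flow numbers (illustrative only).
u_tau, y_plus = wall_units(tau_w=0.04, rho=1.0, nu=1e-4, y1=5e-4)
print(f"u_tau = {u_tau:.3f} m/s, y+ of first cell = {y_plus:.2f}")
```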
APA, Harvard, Vancouver, ISO, and other styles
23

Burns, Kimberly Ann. "Coupled multi-group neutron photon transport for the simulation of high-resolution gamma-ray spectroscopy applications." Diss., Atlanta, Ga. : Georgia Institute of Technology, 2009. http://hdl.handle.net/1853/29737.

Full text
Abstract:
Thesis (Ph.D)--Mechanical Engineering, Georgia Institute of Technology, 2010.
Committee Chair: Hertel, Nolan; Committee Member: Kulp, William David; Committee Member: Lee, Eva; Committee Member: Pagh, Richard; Committee Member: Petrovic, Bojan; Committee Member: Rahnema, Farzad; Committee Member: Smith, Eric; Committee Member: Wang, Chris. Part of the SMARTech Electronic Thesis and Dissertation Collection.
APA, Harvard, Vancouver, ISO, and other styles
24

Anastasiadis, Anastasios. "A Monte Carlo simulation study of collimators for a high-spatial-resolution Gamma Emission Tomography instrument." Thesis, Uppsala universitet, Tillämpad kärnfysik, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-384170.

Full text
Abstract:
The present master thesis concerns a study of collimator designs for a high-spatial-resolution Gamma Emission Tomography (GET) instrument for used nuclear fuel, utilizing Monte Carlo simulation codes. Designing a collimator for this purpose is a multifaceted process that requires many interconnected and conflicting objectives to be taken into consideration. The purpose is to design a high-spatial-resolution GET system that best combines the properties of a high detector count rate, a high photopeak-to-total-spectrum ratio, low detector dead time and low background radiation leaking through the collimator material. To achieve the best trade-off among these objectives, the GEANT4 and Serpent 2 simulation codes were employed. Used fuel contains various γ-ray-emitting radionuclides, and depending on the burnup history and cooling time their absolute intensities vary (i.e. higher γ-ray intensity from the fuel demands a longer collimator). For this reason, Serpent 2 was used to produce gamma emission spectra of long- and short-cooled fuel of low and high burnup. Based on the obtained spectra, the collimator slit dimensions and material were determined. As far as the collimator length and material are concerned, the GEANT4 simulation toolkit was used to deal with shielding problems by applying the geometry splitting/Russian roulette variance reduction techniques. Serpent 2 simulations were performed in order to determine the transmitted signal intensity through the slit for various slit heights and widths. Finally, the change in peak-to-total ratio for different slit sizes, and when a cavity structure was added along the slit length, was investigated.
APA, Harvard, Vancouver, ISO, and other styles
25

ISHIHARA, T., M. KANEDA, K. YOKOKAWA, K. ITAKURA, and A. UNO. "Small-scale statistics in high-resolution direct numerical simulation of turbulence: Reynolds number dependence of one point." Taylor & Francis, 2007. http://hdl.handle.net/2237/11132.

Full text
APA, Harvard, Vancouver, ISO, and other styles
26

Lumsdon, Parivash. "Development and simulation of signal processing algorithms for high resolution wide band direction finding and multipath cancellation." Thesis, University of Newcastle Upon Tyne, 1995. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.294375.

Full text
APA, Harvard, Vancouver, ISO, and other styles
27

Agada, Simeon. "Numerical simulation and optimisation of IOR and EOR processes in high-resolution models for fractured carbonate reservoirs." Thesis, Heriot-Watt University, 2015. http://hdl.handle.net/10399/2893.

Full text
Abstract:
Carbonate reservoirs contain more than half of the world’s conventional hydrocarbon resources. Hydrocarbon recovery in carbonates, however, is typically low, due to multi-scale geological heterogeneities that are a result of complex diagenetic, reactive, depositional and deformational processes. Improved Oil Recovery (IOR) and Enhanced Oil Recovery (EOR) methods are increasingly considered to maximise oil recovery and minimise field development costs. This is particularly important for carbonate reservoirs containing fracture networks, which can act as high-permeability fluid flow pathways or impermeable barriers during interaction with the complex host rock matrix. In this thesis, three important contributions relating to EOR simulation and optimisation in fractured carbonate reservoirs are made using a high-resolution analogue reservoir model for the Arab D formation. First, a systematic approach is employed to investigate, analyse and increase understanding of the fundamental controls on fluid flow in heterogeneous carbonate systems using numerical well testing and secondary and tertiary recovery simulations. Secondly, the interplay between wettability, hysteresis and fracture-matrix exchange during combined CO2 EOR and sequestration is examined. Finally, data-driven surrogates, which construct an approximation of time-consuming numerical simulations, are used for rapid simulation and optimisation of EOR processes in fractured carbonate reservoirs while considering multiple geological uncertainty scenarios.
APA, Harvard, Vancouver, ISO, and other styles
28

Heiple, Shem C. "Using Building Energy Simulation and Geospatial Modeling Techniques to Determine High Resolution Building Sector Energy Consumption Profiles." PDXScholar, 2007. https://pdxscholar.library.pdx.edu/open_access_etds/3399.

Full text
Abstract:
A technique is presented for estimating hourly and seasonal energy consumption profiles in the building sector at spatial scales down to the individual taxlot or parcel. The method combines annual building energy simulations for city-specific prototypical buildings and commonly available geospatial data in a Geographical Information System (GIS) framework. Hourly results can be extracted for any day and exported as a raster output at spatial scales as fine as an individual parcel (
APA, Harvard, Vancouver, ISO, and other styles
29

Burls, Natalie. "Simulation of high resolution winds over the southern Benguela upwelling system with potential application to harmful algal blooms." Master's thesis, University of Cape Town, 2006. http://hdl.handle.net/11427/6464.

Full text
Abstract:
Includes bibliographical references.
The Southern Benguela upwelling system is particularly susceptible to Harmful Algal Blooms (HABs), most of which are attributed to dinoflagellate species. Dinoflagellates are favoured by stratified conditions. Consequently, temporal or spatial variations in ocean and atmospheric conditions that favour stratification will encourage HAB development. Temporally, prolonged relaxation of the dominant equatorward winds during late summer typically results in quiescent phases in upwelling which promote stratification and bloom development.
APA, Harvard, Vancouver, ISO, and other styles
30

Kuny, Silvia [Verfasser], and S. [Akademischer Betreuer] Hinz. "Detection of Building Damages in High Resolution SAR Images based on SAR Simulation / Silvia Kuny ; Betreuer: S. Hinz." Karlsruhe : KIT-Bibliothek, 2021. http://d-nb.info/123507238X/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
31

Zanier, Giulia. "High Resolution Model to Predict Oil Spill Dispersion in Harbour and Coastal Areas." Doctoral thesis, Università degli studi di Trieste, 2015. http://hdl.handle.net/10077/11124.

Full text
Abstract:
2013/2014
We present a novel, state-of-the-art model which accounts for the relevant short-term physical processes governing oil spilled at sea (Zanier, et al., 2014). Particles and tars are modelled as a Lagrangian phase, each with its own density and diameter, taking into account the main forces acting on them, namely buoyancy, drag and Coriolis forces. Oil transported in the form of a thin film is treated by means of an improved Nihoul’s model (Nihoul 1983/84). The latter considers the main forces (gravity, wind and sea-current stresses) governing oil slick spreading and transport in the first hours after spilling, up to 24 h for a large spill. Our main improvement to the classical model consists in the introduction of the Coriolis effect, avoiding the use of empirical formulations (Zanier, et al., 2015). Finally, the relevant short-term (12-24 hours) weathering processes (mainly emulsification and evaporation) are taken into account through established literature models (Mackay, Peterson, et al., 1980 and Mackay, Buist, et al., 1980, respectively). To preserve second-order accuracy of the overall numerical method, convective terms in the Eulerian model are discretized using SMART, a third-order accurate upwind numerical scheme (Gaskell and Lau 1988). We validate the model on standard test cases. The underlying hydrodynamics is resolved using LES-COAST (IEFLUIDS, University of Trieste), a high-resolution numerical model suited for coastal or harbour areas. The LES-COAST model solves the filtered form of the three-dimensional, non-hydrostatic Navier-Stokes equations under the Boussinesq approximation, together with transport equations for salinity and temperature. It makes use of the Large Eddy Simulation approach to parametrize turbulence; the variables are filtered by way of a top-hat filter function represented by the size of the cells. The subgrid-scale (SGS) fluxes, which appear after the filtering operation, are parametrized by a two-eddy-viscosity anisotropic Smagorinsky model, better adapted to coastal flows in which horizontal length scales are much larger than vertical ones (Roman et al., 2010). The subgrid-scale eddy diffusivities of temperature and salinity are set through Prandtl and Schmidt numbers $Pr_{sgs}=Sc_{sgs}=0.8$, assuming that the Reynolds analogy holds for both scalars. The complex geometry that characterizes coastal flows is treated by a combination of curvilinear grids and the Immersed Boundary Method (IBM) (Roman, Napoli, et al., 2009). Wind action on the free surface is taken into account by means of the formula proposed by Wu (Wu, 1982), in which the wind stress on the sea surface is computed from the wind velocity at 10 m above the surface. A 20% variance is added to the stress to ease the generation of turbulence and to account for variations of the wind stress in time and space. Moreover, near obstacles such as docks, ships and breakwaters, the wind stress is linearly reduced to account for the relevant reduction of stress in recirculation regions. On the open boundaries the velocities and scalar quantities are obtained by nesting LES-COAST within large-scale circulation models (Petronio, et al., 2013) or are imposed from in-situ measurements. Near solid walls, velocities are modelled using wall functions (Roman, Armenio, et al., 2009). We apply the coupled oil spill and hydrodynamic models to simulate hypothetical oil spill events in real-case scenarios in Barcelona harbour (North-west Mediterranean Sea, Spain; Galea, et al. 2014) and in Panzano bay (North Adriatic Sea, Italy).
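As a hedged illustration of the Wu (1982) surface-stress closure cited above, the snippet below computes wind stress from the 10 m wind speed using the commonly quoted Wu drag-coefficient fit; treat the exact coefficients as an assumption rather than the thesis's implementation.

```python
def wind_stress_wu(U10, rho_air=1.225):
    """Wind stress [N/m^2] from the 10 m wind speed, using Wu's (1982)
    drag coefficient fit C_D = (0.8 + 0.065 * U10) * 1e-3."""
    C_D = (0.8 + 0.065 * U10) * 1e-3
    return rho_air * C_D * U10**2

for U10 in (5.0, 10.0, 20.0):                  # m/s
    print(f"U10 = {U10:5.1f} m/s  ->  tau = {wind_stress_wu(U10):.3f} N/m^2")
```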
XXVII Ciclo
1986
APA, Harvard, Vancouver, ISO, and other styles
32

Dehlinger, Maël. "XAS-XEOL and XRF spectroscopies using Near-Field Microscope probes for high-resolution photon collection." Phd thesis, Aix-Marseille Université, 2013. http://tel.archives-ouvertes.fr/tel-00880623.

Full text
Abstract:
Near-field microscopes make it possible to obtain the topography of a sample with a resolution that can reach the atomic scale. These techniques also give access to certain local properties of the surface, such as the potential, the elasticity or the density of states. These local spectroscopies are of a 'contrast' type and do not allow a chemical map of the surface to be drawn up without a priori knowledge of the elements it contains. X-ray spectroscopies are powerful characterization methods that can determine the elemental composition and structure of a sample with sub-Ångström precision. Their lateral resolution is essentially limited by the size of the primary beam, commonly several μm². Two routes are possible to improve it: reducing the extent of the exciting primary beam, or limiting the collection of the emitted radiation to a portion of the excited volume while bringing the detector as close as possible to keep a sufficient signal-to-noise ratio. We chose to develop the second option. To this end, the visible luminescence emitted by the sample was collected locally through the probe tip of a shear-force microscope, consisting of a tapered optical fibre with a small aperture. This technique was used to characterize micro- and nano-structured semiconductor samples in order to obtain simultaneously their topography and a map of the local luminescence. These results were obtained not only on a synchrotron beamline but also with a laboratory microsource equipped with a polycapillary lens. In order to extend this concept to other types of materials, the feasibility of collecting local X-ray fluorescence was evaluated with the microsource. For this, the X-ray fluorescence emitted by a cobalt sample was collected through a cylindrical capillary fitted to an EDX detector, and the influence of the capillary diameter on the signal level was measured. A numerical simulation was developed to estimate the signal level obtained with a 1 μm diameter capillary and to optimize the geometry of the system. In the light of these results, coupling near-field microscopy with XRF analysis should make it possible to reach a lateral resolution of 100 nm in a synchrotron environment and of less than 1 μm with a laboratory source. It would then be possible to select a particular object on a surface and perform its elemental analysis.
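The signal-level estimate mentioned at the end of the abstract can be illustrated with a simple geometric collection-efficiency calculation, treating the capillary as a bare circular aperture (no gain from reflections inside the capillary); all dimensions and rates below are assumed for illustration only.

```python
import math

def collection_fraction(capillary_diameter_m, distance_m):
    """Fraction of isotropically emitted fluorescence photons entering a
    circular aperture of the given diameter placed at the given distance
    (small solid-angle approximation: Omega ~ area / distance^2)."""
    area = math.pi * (capillary_diameter_m / 2.0) ** 2
    omega = area / distance_m ** 2          # solid angle in steradians
    return omega / (4.0 * math.pi)          # isotropic emission

# Illustrative numbers only: a 1 um capillary 100 um from the emitting spot,
# with a hypothetical emission rate of 1e9 fluorescence photons per second.
emitted_rate = 1.0e9
rate_at_detector = emitted_rate * collection_fraction(1.0e-6, 100.0e-6)
print(f"collected photons per second: {rate_at_detector:.3g}")
```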
APA, Harvard, Vancouver, ISO, and other styles
33

Angerer, Jay Peter. "Examination of high resolution rainfall products and satellite greenness indices for estimating patch and landscape forage biomass." [College Station, Tex. : Texas A&M University, 2008. http://hdl.handle.net/1969.1/ETD-TAMU-2827.

Full text
APA, Harvard, Vancouver, ISO, and other styles
34

Morozova, Nadezhda S. "Utilizing High Resolution Data to Identify Minimum Vehicle Emissions Cases Considering Platoons and EVP." Thesis, Virginia Tech, 2016. http://hdl.handle.net/10919/78885.

Full text
Abstract:
This paper describes efforts to optimize the parameters of a platoon identification and accommodation algorithm that minimizes vehicle emissions. The algorithm was developed and implemented in the AnyLogic framework and validated by comparing it to the results of prior research. A four-module flowchart was developed to analyze the traffic data and identify platoons. The platoon end time was obtained from the simulation and used to calculate the offset of the downstream intersection. The simulation calculates vehicle emissions with the aid of the VT-Micro microscopic emission model. Optimization experiments were run to determine the relationship between platoon parameters and the minimum- and maximum-emission scenarios. Optimal platoon identification parameters were found from these experiments, and the simulation was run with these parameters. The total time spent by all vehicles in the simulation was also computed for the minimum- and maximum-emission scenarios. Time-space diagrams obtained from the simulations demonstrate that the optimized parameters allow all cars to travel through the downstream intersection without waiting, and therefore reduce emissions by as much as 15.5%. This paper also discusses the outcome of efforts to leverage high-resolution data (HRD) obtained from the WV-705 corridor in Morgantown, WV. The proposed model was developed for that purpose and implemented in the AnyLogic framework to simulate this particular road network with four coordinated signal-controlled intersections. The simulation was also used to calculate vehicle CO, HC and NOx emissions with the aid of the VT-Micro microscopic emission model. Offset variation runs were performed to determine the optimal offsets for this road network, given its traffic volumes, signal phasing and vehicle characteristics. A classifier was developed by discriminant analysis based on significant attributes of the HRD; its equation distinguishes the set of timing plans that produce minimum emissions from the set that produce maximum emissions. The current work also investigates the potential use of GPS-based and similar priority systems that grant preemption through signalized intersections. Two flowcharts are developed to account for the presence of an emergency vehicle (EV) in the system: the so-called EV life cycle and EV preemption (EVP). Three scenarios are implemented, namely a base-case scenario with no EV, an EV scenario in which the EV receives EVP only, and an EV scenario in which the EV receives preemption from the signals and right-of-way from other vehicles. The research compares the emission results of these scenarios to determine whether an EV affects vehicle emissions in the road network and, if so, to what extent.
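The VT-Micro model cited in the abstract is generally described as a regression of the logarithm of each emission rate on third-degree polynomials of instantaneous speed and acceleration, with separate coefficient sets for acceleration and deceleration regimes. The sketch below reproduces only that functional form; the coefficient matrix is a placeholder, not the calibrated VT-Micro tables.

```python
import numpy as np

def vt_micro_rate(speed_kmh, accel_kmh_s, coeffs):
    """Generic VT-Micro-style emission rate.

    ln(rate) = sum_{i=0..3} sum_{j=0..3} K[i, j] * speed**i * accel**j
    `coeffs` is a 4x4 array of regression coefficients (placeholders here;
    the published VT-Micro calibration is not reproduced)."""
    powers_s = np.array([speed_kmh ** i for i in range(4)])
    powers_a = np.array([accel_kmh_s ** j for j in range(4)])
    log_rate = powers_s @ coeffs @ powers_a
    return np.exp(log_rate)

# Placeholder coefficient matrix (NOT the published VT-Micro values).
K_dummy = np.full((4, 4), -1.0e-4)
K_dummy[0, 0] = -1.0   # baseline term

# Emission rate for a vehicle cruising at 50 km/h with zero acceleration.
print(vt_micro_rate(50.0, 0.0, K_dummy))
```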
Master of Science
APA, Harvard, Vancouver, ISO, and other styles
35

Kuny, Silvia [Verfasser], and S. [Akademischer Betreuer] Hinz. "Texture based detection of building damages in high resolution SAR images supported by SAR simulation / Silvia Kuny ; Betreuer: S. Hinz." Karlsruhe : KIT-Bibliothek, 2021. http://d-nb.info/123406362X/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
36

Ramesh, Sathya. "High Resolution Satellite Images and LiDAR Data for Small-Area Building Extraction and Population Estimation." Thesis, University of North Texas, 2009. https://digital.library.unt.edu/ark:/67531/metadc12188/.

Full text
Abstract:
Population estimation in inter-censual years has many important applications. In this research, high-resolution pan-sharpened IKONOS image, LiDAR data, and parcel data are used to estimate small-area population in the eastern part of the city of Denton, Texas. Residential buildings are extracted through object-based classification techniques supported by shape indices and spectral signatures. Three population indicators (building count, building volume and building area) at the block level are derived using spatial joining and zonal statistics in GIS. Linear regression and geographically weighted regression (GWR) models generated using the three variables and the census data are used to estimate population at the census block level. The maximum total estimation accuracy that can be attained by the models is 94.21%. Accuracy assessments suggest that the GWR models outperformed linear regression models due to their better handling of spatial heterogeneity. Models generated from building volume and area gave better results. The models have lower accuracy in both densely populated census blocks and sparsely populated census blocks, which could be partly attributed to the lower accuracy of the LiDAR data used.
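A geographically weighted regression of the kind compared above can be sketched as one weighted least-squares fit per block centroid with a Gaussian distance kernel; the bandwidth, variable names and synthetic data below are assumptions for illustration, not the thesis's model.

```python
import numpy as np

def gwr_predict(coords, X, y, bandwidth):
    """Minimal GWR sketch: for each location, fit weighted least squares with
    Gaussian kernel weights w_i = exp(-(d_i / bandwidth)^2) and return the
    locally predicted value at that location."""
    n = len(y)
    X1 = np.column_stack([np.ones(n), X])          # add intercept
    preds = np.empty(n)
    for k in range(n):
        d = np.linalg.norm(coords - coords[k], axis=1)
        w = np.exp(-(d / bandwidth) ** 2)
        W = np.diag(w)
        beta = np.linalg.solve(X1.T @ W @ X1, X1.T @ W @ y)  # weighted LS
        preds[k] = X1[k] @ beta
    return preds

# Toy example: population ~ building volume + building area (synthetic data).
rng = np.random.default_rng(1)
coords = rng.uniform(0, 10, size=(50, 2))           # block centroids
volume = rng.uniform(1e3, 5e4, size=50)
area = volume / rng.uniform(5, 12, size=50)
population = 0.002 * volume + 0.05 * area + rng.normal(0, 20, size=50)
X = np.column_stack([volume, area])
print(gwr_predict(coords, X, population, bandwidth=3.0)[:5])
```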
APA, Harvard, Vancouver, ISO, and other styles
37

Florin, Madeleine Jill. "Towards Precision Agriculture for whole farms using a combination of simulation modelling and spatially dense soil and crop information." Thesis, The University of Sydney, 2008. http://hdl.handle.net/2123/3169.

Full text
Abstract:
Precision Agriculture (PA) strives towards holistic production and environmental management. A fundamental research challenge is the continuous expansion of ideas about how PA can contribute to sustainable agriculture. Some associated pragmatic research challenges include quantification of spatio-temporal variation of crop yield; crop growth simulation modelling within a PA context; and evaluating long-term financial and environmental outcomes from site-specific crop management (SSCM). In Chapter 1 literature about managing whole farms with a mind towards sustainability was reviewed. Alternative agricultural systems and concepts including systems thinking, agro-ecology, mosaic farming and PA were investigated. With respect to environmental outcomes it was found that PA research is relatively immature. There is scope to thoroughly evaluate PA from a long-term, whole-farm environmental and financial perspective. Comparatively, the emphasis of PA research on managing spatial variability offers promising and innovative ways forward, particularly in terms of designing new farming systems. It was found that using crop growth simulation modelling in a PA context is potentially very useful. Modelling high-resolution spatial and temporal variability with current simulation models poses a number of immediate research issues. This research focused on three whole farms located in Australia that grow predominantly grains without irrigation. These study sites represent three important grain growing regions within Australia: northern NSW, north-east Victoria and South Australia. Noteworthy environmental and climatic differences between these regions, such as rainfall timing, soil type and topographic features, were outlined in Chapter 2. When considering adoption of SSCM, it is essential to understand the impact of temporal variation on the potential value of managing spatial variation. Quantifying spatio-temporal variation of crop yield serves this purpose; however, this is a conceptually and practically challenging undertaking. A small number of previous studies have found that the magnitude of temporal variation far exceeds that of spatial variation. Chapter 3 of this thesis dealt with existing and new approaches to quantifying the relationship between spatial and temporal variability in crop yield. It was found that using pseudo cross variography to obtain spatial and temporal variation ‘equivalents’ is a promising approach to quantitatively comparing spatial and temporal variation. The results from this research indicate that more data in the temporal dimension is required to enable thorough analysis using this approach. This is particularly relevant when questioning the suitability of SSCM. Crop growth simulation modelling offers PA a number of benefits such as the ability to simulate a considerable volume of data in the temporal dimension. A dominant challenge recognised within the PA/modelling literature is the mismatch between the spatial resolution of point-based model output (and therefore input) and the spatial resolution of information demanded by PA. This culminates in questions about the conceptual model underpinning the simulation model and the practicality of using point-based models to simulate spatial variability. The ability of point-based models to simulate appropriate spatial and temporal variability of crop yield and the importance of soil available water capacity (AWC) for these simulations were investigated in Chapter 4.
The results indicated that simulated spatial variation is low compared to some previously reported spatial variability of real yield data for some climate years. It was found that the structure of spatial yield variation was directly related to the structure of the AWC and interactions between AWC and climate. It is apparent that varying AWC spatially is a reasonable starting point for modelling spatial variation of crop yield. A trade-off between capturing adequate spatio-temporal variation of crop yield and the inclusion of realistically obtainable model inputs is identified. A number of practical solutions to model parameterisation for PA purposes are identified in the literature. A popular approach is to minimise the number of simulations required. Another approach that enables modelling at every desired point across a study area involves taking advantage of high-resolution yield information from a number of years to estimate site-specific soil properties with the inverse use of a crop growth simulation model. Inverse meta-modelling was undertaken in Chapter 5 to estimate AWC on 10-metre grids across each of the study farms. This proved to be an efficient approach to obtaining high-resolution AWC information at the spatial extent of whole farms. The AWC estimates proved useful for yield prediction using simple linear regression as opposed to application within a complex crop growth simulation model. The ability of point-based models to simulate spatial variation was re-visited in Chapter 6 with respect to the exclusion of lateral water movement. The addition of a topographic component into the simple point-based yield prediction models substantially improved yield predictions. The value of these additions was interpreted using coefficients of determination and comparing variograms for each of the yield prediction components. A result consistent with the preceding chapter is the importance of further validating the yield prediction models with additional yield data when it becomes available. Finally, some whole-farm management scenarios using SSCM were synthesised in Chapter 7. A framework was established that enables evaluation of the long-term (50-year) farm outcomes of soil carbon sequestration, nitrogen leaching and crop yield. The suitability of SSCM across whole farms over the long term was investigated and it was found that the suitability of SSCM is confined to certain fields. This analysis also enabled identification of parts of the farms that are the least financially and environmentally viable. SSCM in conjunction with other PA management strategies is identified as a promising approach to long-term and whole-farm integrated management.
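The pseudo cross-variography mentioned in the abstract builds on the ordinary empirical semivariogram; a minimal sketch of that building block is shown below, with a synthetic yield surface and an ad-hoc lag binning that are not taken from the thesis.

```python
import numpy as np

def empirical_semivariogram(coords, values, lag_edges):
    """Empirical semivariogram: gamma(h) = mean of 0.5*(z_i - z_j)^2 over all
    point pairs whose separation distance falls in each lag bin."""
    diffs = coords[:, None, :] - coords[None, :, :]
    dist = np.sqrt((diffs ** 2).sum(axis=-1))
    sqdiff = 0.5 * (values[:, None] - values[None, :]) ** 2
    iu = np.triu_indices(len(values), k=1)          # unique pairs only
    dist, sqdiff = dist[iu], sqdiff[iu]
    gamma = []
    for lo, hi in zip(lag_edges[:-1], lag_edges[1:]):
        mask = (dist >= lo) & (dist < hi)
        gamma.append(sqdiff[mask].mean() if mask.any() else np.nan)
    return np.array(gamma)

# Synthetic 'yield' surface with a smooth spatial trend plus noise.
rng = np.random.default_rng(0)
coords = rng.uniform(0, 100, size=(200, 2))              # metres
values = 0.02 * coords[:, 0] + rng.normal(0, 0.3, 200)   # t/ha, illustrative
print(empirical_semivariogram(coords, values, np.arange(0, 60, 10)))
```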
APA, Harvard, Vancouver, ISO, and other styles
38

Florin, Madeleine Jill. "Towards Precision Agriculture for whole farms using a combination of simulation modelling and spatially dense soil and crop information." University of Sydney, 2008. http://hdl.handle.net/2123/3169.

Full text
Abstract:
Doctor of Philosophy
Precision Agriculture (PA) strives towards holistic production and environmental management. A fundamental research challenge is the continuous expansion of ideas about how PA can contribute to sustainable agriculture. Some associated pragmatic research challenges include quantification of spatio-temporal variation of crop yield; crop growth simulation modelling within a PA context; and evaluating long-term financial and environmental outcomes from site-specific crop management (SSCM). In Chapter 1 literature about managing whole farms with a mind towards sustainability was reviewed. Alternative agricultural systems and concepts including systems thinking, agro-ecology, mosaic farming and PA were investigated. With respect to environmental outcomes it was found that PA research is relatively immature. There is scope to thoroughly evaluate PA from a long-term, whole-farm environmental and financial perspective. Comparatively, the emphasis of PA research on managing spatial variability offers promising and innovative ways forward, particularly in terms of designing new farming systems. It was found that using crop growth simulation modelling in a PA context is potentially very useful. Modelling high-resolution spatial and temporal variability with current simulation models poses a number of immediate research issues. This research focused on three whole farms located in Australia that grow predominantly grains without irrigation. These study sites represent three important grain growing regions within Australia: northern NSW, north-east Victoria and South Australia. Noteworthy environmental and climatic differences between these regions, such as rainfall timing, soil type and topographic features, were outlined in Chapter 2. When considering adoption of SSCM, it is essential to understand the impact of temporal variation on the potential value of managing spatial variation. Quantifying spatio-temporal variation of crop yield serves this purpose; however, this is a conceptually and practically challenging undertaking. A small number of previous studies have found that the magnitude of temporal variation far exceeds that of spatial variation. Chapter 3 of this thesis dealt with existing and new approaches to quantifying the relationship between spatial and temporal variability in crop yield. It was found that using pseudo cross variography to obtain spatial and temporal variation ‘equivalents’ is a promising approach to quantitatively comparing spatial and temporal variation. The results from this research indicate that more data in the temporal dimension is required to enable thorough analysis using this approach. This is particularly relevant when questioning the suitability of SSCM. Crop growth simulation modelling offers PA a number of benefits such as the ability to simulate a considerable volume of data in the temporal dimension. A dominant challenge recognised within the PA/modelling literature is the mismatch between the spatial resolution of point-based model output (and therefore input) and the spatial resolution of information demanded by PA. This culminates in questions about the conceptual model underpinning the simulation model and the practicality of using point-based models to simulate spatial variability. The ability of point-based models to simulate appropriate spatial and temporal variability of crop yield and the importance of soil available water capacity (AWC) for these simulations were investigated in Chapter 4.
The results indicated that simulated spatial variation is low compared to some previously reported spatial variability of real yield data for some climate years. It was found that the structure of spatial yield variation was directly related to the structure of the AWC and interactions between AWC and climate. It is apparent that varying AWC spatially is a reasonable starting point for modelling spatial variation of crop yield. A trade-off between capturing adequate spatio-temporal variation of crop yield and the inclusion of realistically obtainable model inputs is identified. A number of practical solutions to model parameterisation for PA purposes are identified in the literature. A popular approach is to minimise the number of simulations required. Another approach that enables modelling at every desired point across a study area involves taking advantage of high-resolution yield information from a number of years to estimate site-specific soil properties with the inverse use of a crop growth simulation model. Inverse meta-modelling was undertaken in Chapter 5 to estimate AWC on 10-metre grids across each of the study farms. This proved to be an efficient approach to obtaining high-resolution AWC information at the spatial extent of whole farms. The AWC estimates proved useful for yield prediction using simple linear regression as opposed to application within a complex crop growth simulation model. The ability of point-based models to simulate spatial variation was re-visited in Chapter 6 with respect to the exclusion of lateral water movement. The addition of a topographic component into the simple point-based yield prediction models substantially improved yield predictions. The value of these additions was interpreted using coefficients of determination and comparing variograms for each of the yield prediction components. A result consistent with the preceding chapter is the importance of further validating the yield prediction models with additional yield data when it becomes available. Finally, some whole-farm management scenarios using SSCM were synthesised in Chapter 7. A framework was established that enables evaluation of the long-term (50-year) farm outcomes of soil carbon sequestration, nitrogen leaching and crop yield. The suitability of SSCM across whole farms over the long term was investigated and it was found that the suitability of SSCM is confined to certain fields. This analysis also enabled identification of parts of the farms that are the least financially and environmentally viable. SSCM in conjunction with other PA management strategies is identified as a promising approach to long-term and whole-farm integrated management.
APA, Harvard, Vancouver, ISO, and other styles
39

O'Brien, Michael J. (Michael James) 1981. "Performance analysis and algorithm enhancement of feature-aided-tracker (FAT) simulation software using 1-D high-range-resolution (HRR) radar signature profiles." Thesis, Massachusetts Institute of Technology, 2005. http://hdl.handle.net/1721.1/30309.

Full text
Abstract:
Thesis (S.M.)--Massachusetts Institute of Technology, Dept. of Mechanical Engineering, 2005.
Includes bibliographical references (p. 94).
The current Lincoln Laboratory (LL) MATLAB Feature-Aided-Tracker (FAT) software was adjusted and extended to provide a robust ground-target radar tracking simulation tool. It utilizes algorithms from the LL UAV Radar Moving Target Tracker (1991) and the LL FAT Tracking Software (2002). One-dimensional High-Range-Resolution (HRR) radar signature target profiles were used to assist in track-to-report data association through classification-aided and signature-aided tracking (CAT and SAT) algorithms. Profiles were obtained from the DARPA-sponsored Moving Target Feature Phenomenology (MTFP) program. Performance analysis of this simulation tool reinforced the hypothesis that target aspect angle error estimation (state estimation) drives the performance of CAT, SAT, and Kinematic Tracking (KT) algorithms. A decaying exponential relationship exists between the Kalman filter estimate of target speed and the expected aspect angle error. This relationship was exploited to optimize the allocation of computational resources while enlarging the database aspect angle search in CAT to improve performance. Vehicle classification accuracy is improved by 70% and data association accuracy is improved by 12% in kinematically ambiguous situations such as when target intersections occur. SAT was improved by 3% using this knowledge. Additionally, the target report HRR profile from each scan was used to generate an "On-The-Fly" SAT HRR profile database. This algorithm tests the similarity between the current target report HRR profile and the database HRR profiles. If there is sufficient resemblance, the report HRR is added to the database; if not, the database is reset. This information can be employed to provide up to a 9% performance improvement over the previous version of SAT in a best-case scenario. In realistic situations, a 6% performance improvement is still attainable. If a large, accurate database exists, near-perfect data association is achieved. Overall, the above technique adjustments provide an improvement of 6.3% (13.6% in realistic, GPS-generated scenarios) in data association accuracy over the initial FAT algorithm and a corresponding 28.8% improvement over the results of the KT itself.
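The reported link between estimated target speed and expected aspect-angle error suggests a simple way to size the database search window; the sketch below uses a hypothetical decaying-exponential error model with placeholder constants, not the relationship fitted in the thesis.

```python
import math

def expected_aspect_error_deg(speed_estimate_mps, a=40.0, b=0.5, c=3.0):
    """Hypothetical decaying-exponential error model: slowly moving (poorly
    observed) targets get a large expected aspect-angle error, fast targets a
    small one. The constants a, b, c are placeholders, not thesis values."""
    return a * math.exp(-b * speed_estimate_mps) + c

def aspect_search_window_deg(speed_estimate_mps, n_sigma=2.0):
    """Half-width of a CAT-style database aspect-angle search: widen the
    search when the expected aspect error is large."""
    return n_sigma * expected_aspect_error_deg(speed_estimate_mps)

for v in (0.5, 2.0, 10.0):
    print(f"speed {v:4.1f} m/s -> search +/- {aspect_search_window_deg(v):.1f} deg")
```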
by Michael J. O'Brien.
S.M.
APA, Harvard, Vancouver, ISO, and other styles
40

Huang, Hsiau-Wen. "Investigation of solution structures of yeast and lupin seed 5S ribosomal RNAs by high resolution nuclear magnetic resonance and molecular dynamics simulation /." The Ohio State University, 1991. http://rave.ohiolink.edu/etdc/view?acc_num=osu1487684245468516.

Full text
APA, Harvard, Vancouver, ISO, and other styles
41

Rata, Mihaela. "Endocavitary applicator of therapeutic ultrasound integrated with RF receiver coil for high resolution MRI-controlled thermal therapy." Phd thesis, Université Claude Bernard - Lyon I, 2009. http://tel.archives-ouvertes.fr/tel-00692346.

Full text
Abstract:
This thesis presents technical and methodological developments aiming to offer a viable alternative for the treatment of digestive cancers (rectum and esophagus). Compared to the standard methods of therapy, high-intensity contact ultrasound guided by MRI is a less invasive approach. MRI offers two advantages: good spatial resolution and real-time temperature control. This treatment method requires both efficacy and safety. Three prototypes of RF coil integrated with ultrasound transducers were built in order to increase the spatial and temporal resolution of the MR images and the accuracy of the temperature measurement. The integrated coils showed a better sensitivity compared to a standard extracorporeal coil. Anatomical (voxel 0.4x0.4x5 mm3) and thermometry (voxel 0.75x0.75x8 mm3, 2 s/image) high-resolution MR images were acquired in-vivo. The temperature was measured, within a radius of 20 mm from the balloon, with a standard deviation <1°C. The flow artifacts caused by the water circulating inside the cooling balloon could be shifted out of the region of interest. The temperature evolution was controlled automatically, at different depths, with one control point per beam. The controller showed a good accuracy during in-vivo experiments (standard deviation less than 5%). The phased-array ultrasound transducer permits the successive activation of multiple beams during the same dynamic of sonication. Simulations were conducted in order to provide optimal treatment planning for a defined tumor. A new design of ultrasound transducer with 256 elements with revolution symmetry, based on natural geometrical focusing, was proposed.
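Automatic control of the temperature evolution, as described above, can be illustrated with a basic proportional-integral power controller running at the 2 s thermometry update rate; the gains, power limit and control law below are generic placeholders rather than the controller developed in the thesis.

```python
def pi_power_controller(t_measured, t_target, state, kp=0.5, ki=0.05,
                        dt=2.0, p_max=20.0):
    """One step of a simple PI controller for MRI-guided heating (illustrative
    only; gains, limits and the control law are placeholders).

    t_measured -- MR-thermometry temperature at the control point (deg C)
    t_target   -- desired temperature (deg C)
    state      -- dict holding the integral term between calls
    returns    -- acoustic power command in watts, clipped to [0, p_max]
    """
    error = t_target - t_measured
    state["integral"] = state.get("integral", 0.0) + error * dt
    power = kp * error + ki * state["integral"]
    return min(max(power, 0.0), p_max)

# Example: one control step at a 2 s thermometry update rate.
state = {}
print(pi_power_controller(t_measured=41.2, t_target=43.0, state=state))
```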
APA, Harvard, Vancouver, ISO, and other styles
42

Davoodi, Ali. "Mechanistic studies of localized corrosion of Al alloys by high resolution in-situ and ex-situ probing techniques." Doctoral thesis, Stockholm : Kemivetenskap, Kungliga Tekniska högskolan, 2007. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-4588.

Full text
APA, Harvard, Vancouver, ISO, and other styles
43

Liu, Chunde. "Creation of hot summer years and evaluation of overheating risk at a high spatial resolution under a changing climate." Thesis, University of Bath, 2017. https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.725405.

Full text
Abstract:
It is believed that the extremely hot European summer of 2003, in which tens of thousands died in buildings, will become the norm by the 2040s, and hence there is an urgent need to accurately assess the risk that buildings pose. Thermal simulations based on warmer-than-typical years will be key to this. Unfortunately, the existing warmer-than-typical years, such as probabilistic Design Summer Years (pDSYs), are not robust measures due to their simple selection method, and can even be cooler than typical years. This study developed two new summer reference years: one (pHSY-1) is suitable for assessing the occurrence and severity of overheating, while the other (pHSY-2) is appropriate for evaluating thermal stress. Both have been proven to be more robust than the pDSYs. In addition, this study investigated the spatial variation in overheating driven by variability in building characteristics and the local environment. This variation had been ignored by previous studies, as most of them either created thermal models using building archetypes with little or no concern for the influence of local shading, or assumed little variation in climate across a landscape. For the first time, approximately a thousand more accurate thermal models were created for a UK city based on remote measurement of building characteristics and their local shading. By producing overheating and mortality maps, this study found that spatial variation in the risk of overheating was considerably higher than previously thought, due to the variability of vernacular forms, contexts and climates, and that heat-related mortality will triple by the 2050s if no building or human thermal adaptations are made. Such maps would be useful to Governments when making cost-effective adaptation strategies against a warming climate.
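Overheating assessments of the kind described above usually reduce an hourly indoor temperature series to a few exceedance statistics; the sketch below computes two such indicators against a fixed comfort threshold that is an illustrative assumption, not the criterion used in the thesis.

```python
import numpy as np

def overheating_metrics(indoor_temp_c, threshold_c=26.0):
    """Two simple overheating indicators for hourly indoor temperatures:
    hours above a comfort threshold and the cumulative degree-hours of
    exceedance. The 26 C threshold is an illustrative assumption."""
    temps = np.asarray(indoor_temp_c, dtype=float)
    exceedance = np.clip(temps - threshold_c, 0.0, None)
    hours_over = int(np.count_nonzero(exceedance))
    degree_hours = float(exceedance.sum())
    return hours_over, degree_hours

# Example: one synthetic summer day with an afternoon peak.
hours = np.arange(24)
indoor = 24.0 + 5.0 * np.exp(-((hours - 16) / 4.0) ** 2)
print(overheating_metrics(indoor))
```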
APA, Harvard, Vancouver, ISO, and other styles
44

Khan, Fayaz A. "Two-dimensional shock capturing numerical simulation of shallow water flow applied to dam break analysis." Thesis, Loughborough University, 2010. https://dspace.lboro.ac.uk/2134/7750.

Full text
Abstract:
With the advances in the computing world, computational fluid dynamics (CFD) is becoming an increasingly critical tool in the field of fluid dynamics. In the past few decades, a huge number of CFD models have been developed with ever-improving performance. In this research a robust CFD model, called Riemann2D, is extended to model flow over a mobile bed and applied to a full-scale dam-break problem. Riemann2D, an object-oriented hyperbolic solver that solves the shallow water equations on an unstructured triangular mesh using high-resolution shock-capturing methods, provides a generic framework for the solution of hyperbolic problems. The object-oriented design of Riemann2D has the flexibility to apply the model to any type of hyperbolic problem by adding new problem-specific information and inheriting the common components from the generic part of the model. In part of this work, this feature of Riemann2D is exploited to extend the model capabilities to compute flow over mobile beds. This is achieved by incorporating a two-dimensional version of the one-dimensional non-capacity model for erodible bed hydraulics by Cao et al. (2004). A few novel, simple algorithms are included to track the wet/dry and dry/wet fronts over abruptly varying topography and to stabilize the solution when using high-resolution shock-capturing methods. The negative depths computed from the surface gradient by the limiters are algebraically adjusted to ensure depth positivity. The friction contribution to the source term, which creates unphysical values near the wet/dry fronts, is handled by introducing a limiting value for the friction term. The model is validated using an extensive variety of tests on both fixed and mobile beds. The results are compared with the analytical, numerical and experimental results available in the literature. The model is also tested against the actual field data of the 1959 Malpasset dam break. Finally, the model is applied to simulate the dam-break flow of the Warsak Dam in Pakistan. Remotely sensed topographic data of the Warsak Dam are used to improve the accuracy of the solution. Thorough testing and application of the model show that the simulated results are in close agreement with the available analytical, numerical and experimental results. The high-resolution shock-capturing methods give far better results than traditional numerical schemes. It is also concluded that the object-oriented CFD model is very easy to adapt and extend without changing the generic part of the model.
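As a flavour of the shock-capturing machinery referred to above, the sketch below evaluates an HLL-type numerical flux for the one-dimensional shallow water equations; this is a textbook building block with simple wave-speed estimates, not code taken from Riemann2D.

```python
import numpy as np

G = 9.81  # gravitational acceleration (m/s^2)

def swe_flux(h, hu):
    """Physical flux of the 1D shallow water equations for state (h, hu)."""
    u = hu / h if h > 0.0 else 0.0
    return np.array([hu, hu * u + 0.5 * G * h * h])

def hll_flux(left, right):
    """HLL approximate Riemann flux between two states (h, hu).

    Wave speeds use the simple estimates S_L = min(u_L - c_L, u_R - c_R) and
    S_R = max(u_L + c_L, u_R + c_R), with c = sqrt(g h)."""
    hL, huL = left
    hR, huR = right
    uL = huL / hL if hL > 0.0 else 0.0
    uR = huR / hR if hR > 0.0 else 0.0
    cL, cR = np.sqrt(G * hL), np.sqrt(G * hR)
    sL = min(uL - cL, uR - cR)
    sR = max(uL + cL, uR + cR)
    fL, fR = swe_flux(hL, huL), swe_flux(hR, huR)
    if sL >= 0.0:
        return fL
    if sR <= 0.0:
        return fR
    qL, qR = np.array([hL, huL]), np.array([hR, huR])
    return (sR * fL - sL * fR + sL * sR * (qR - qL)) / (sR - sL)

# Idealised dam-break interface: 2 m of still water against 0.5 m.
print(hll_flux((2.0, 0.0), (0.5, 0.0)))
```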
APA, Harvard, Vancouver, ISO, and other styles
45

Şensoy, Levent. "Geo-Pet : a novel generic Organ-Pet for small animal organs and tissues." Diss., University of Iowa, 2016. https://ir.uiowa.edu/etd/3186.

Full text
Abstract:
Reconstructed tomographic image resolution of small animal PET imaging systems is improving with advances in radiation detector development. However, the trend towards higher resolution systems has come with an increase in price and system complexity. Recent developments in the area of solid-state photomultiplication devices, such as silicon photomultiplier arrays (SPMA), are creating opportunities for new high-performance tools for PET scanner design. Imaging of excised small animal organs and tissues has been used as part of post-mortem studies in order to gain detailed, high-resolution anatomical information on sacrificed animals. However, this kind of ex-vivo specimen imaging has largely been limited to ultra-high resolution μCT. The inherent limitations to PET resolution have, to date, excluded PET imaging from these ex-vivo imaging studies. In this work, we leverage the diminishing physical size of current generation SPMA designs to create a very small, simple, and high-resolution prototype detector system targeting ex-vivo tomographic imaging of small animal organs and tissues. We investigate sensitivity, spatial resolution, and the reconstructed image quality of a prototype small animal PET scanner designed specifically for imaging of excised murine tissue and organs. We aim to demonstrate that a cost-effective silicon photomultiplier (SiPM) array based design with thin crystals (2 mm), chosen to minimize depth-of-interaction errors, might be able to achieve sub-millimeter resolution. We hypothesize that the substantial decrease in sensitivity associated with the thin crystals can be compensated for with increased solid-angle detection, longer acquisitions, higher activity and wider acceptance energy windows (due to minimal scatter from excised organs). The constructed system has a functional field of view (FoV) of 40 mm diameter, which is adequate for most small animal specimen studies. We use both analytical (3D-FBP) and iterative (ML-EM) methods to reconstruct tomographic images. Results demonstrate good agreement between the simulation and the prototype. Our detector system with pixelated crystals is able to separate small objects as close as 1.25 mm apart, whereas the spatial resolution converges to the theoretical limit of 1.6 mm (half the size of the smallest detecting element), which is comparable to the spatial resolution of existing commercial small animal PET systems. Better system spatial resolution is achievable with new-generation SiPM detector boards with 1 mm x 1 mm cell dimensions. We demonstrate through Monte Carlo simulations that it is possible to achieve sub-millimeter spatial image resolution (0.7 mm for our scanner) in complex objects using monolithic crystals and exploiting the light-sharing mechanism among the neighboring detector cells. Results also suggest that scanner (or object) rotation minimizes artifacts arising from poor angular sampling, which is even more significant in smaller PET designs, as the gaps between the sensitive regions of the detector have a more exaggerated effect on the overall reconstructed image quality when the design is more compact. The sensitivity of the system, on the other hand, can be doubled by adding two additional detector heads, resulting in a fully closed 4π geometry.
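The ML-EM reconstruction named in the abstract has a compact multiplicative update; the toy version below uses a small dense random system matrix and a synthetic phantom in place of the real scanner geometry.

```python
import numpy as np

def mlem(system_matrix, measured_counts, n_iter=50, eps=1e-12):
    """Toy ML-EM reconstruction: x <- x / (A^T 1) * A^T (y / (A x)).
    `system_matrix` (detectors x voxels) is dense here purely for illustration."""
    A = np.asarray(system_matrix, dtype=float)
    y = np.asarray(measured_counts, dtype=float)
    sensitivity = A.sum(axis=0) + eps          # A^T 1
    x = np.ones(A.shape[1])                    # uniform initial image
    for _ in range(n_iter):
        forward = A @ x + eps                  # expected counts
        x *= (A.T @ (y / forward)) / sensitivity
    return x

# Tiny synthetic problem: random geometry, known activity, Poisson counts.
rng = np.random.default_rng(42)
A = rng.uniform(0.0, 1.0, size=(200, 30))      # 200 lines of response, 30 voxels
x_true = rng.uniform(0.5, 2.0, size=30)
y = rng.poisson(A @ x_true)
x_rec = mlem(A, y)
print(np.round(x_rec[:5], 2), np.round(x_true[:5], 2))
```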
APA, Harvard, Vancouver, ISO, and other styles
46

Luu, Nhat Linh. "The role of human-induced climate change on extreme convective precipitation events in the south of France : a high-resolution model simulation approach." Electronic Thesis or Diss., université Paris-Saclay, 2020. http://www.theses.fr/2020UPASV018.

Full text
Abstract:
The France-Mediterranean area is frequently exposed to heavy precipitation events in the autumn, whose daily accumulation can sometimes exceed 300 millimeters. A few studies show an increasing trend in the frequency and intensity of these events (e.g. Vautard et al., 2015; Ribes et al., 2019). However, a formal extreme event attribution that links those changes to human-induced climate change for this area has never been carried out. This PhD project aims at quantifying the role of human-induced climate change in altering the statistical properties of extreme convective precipitation events occurring over the France-Mediterranean area, focusing on the Cévennes mountain range and using, for the first time, a high-resolution model approach that includes a convection-permitting model. I first analyze the EURO-CORDEX ensemble, which includes different combinations of global climate models and regional climate models. I then conduct a set of numerical simulations with the WRF model at a convection-permitting resolution, and compare the simulations with observations and high-resolution re-analyses. The results show that regional models reproduce extreme convective rainfall events in better agreement with observations as their horizontal resolution increases, especially at convection-permitting resolution (approx. 3 km). Using these simulations, I show that human-induced climate change consistently makes the 100-year 3-hourly and daily precipitation events at least twice as likely under the current climate. The results also suggest the need for a multi-model approach to reduce the uncertainties in this type of impact study.
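Statements such as "at least twice as likely" are usually quantified as a probability ratio between factual and counterfactual climates; the sketch below illustrates that calculation with GEV fits to synthetic annual maxima, using scipy's generic fitting routine rather than any method specific to this thesis.

```python
import numpy as np
from scipy.stats import genextreme

def probability_ratio(maxima_factual, maxima_counterfactual, threshold):
    """PR = P_factual(X > threshold) / P_counterfactual(X > threshold),
    with each exceedance probability taken from a GEV fit to annual maxima."""
    p = []
    for sample in (maxima_factual, maxima_counterfactual):
        shape, loc, scale = genextreme.fit(sample)
        p.append(genextreme.sf(threshold, shape, loc=loc, scale=scale))
    return p[0] / p[1]

# Synthetic annual daily-precipitation maxima (mm) for two climates.
rng = np.random.default_rng(7)
counterfactual = genextreme.rvs(-0.1, loc=120, scale=30, size=200, random_state=rng)
factual = genextreme.rvs(-0.1, loc=135, scale=33, size=200, random_state=rng)
threshold = np.quantile(counterfactual, 0.99)   # roughly a 100-year event
print(f"probability ratio: {probability_ratio(factual, counterfactual, threshold):.2f}")
```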
APA, Harvard, Vancouver, ISO, and other styles
47

Maronga, Björn [Verfasser]. "High-resolution large-eddy simulation studies of the turbulent structure of the convective boundary layer over homogeneous and heterogeneous terrain and implications for the interpretation of scintillometer data / Björn Maronga." Hannover : Technische Informationsbibliothek und Universitätsbibliothek Hannover (TIB), 2013. http://d-nb.info/1047414686/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
48

Schaaf, Benjamin [Verfasser], and Hans von [Akademischer Betreuer] Storch. "Added Value and regional effects in the multidecadal trends of a very high-resolution regional climate long-term model simulation at the coasts of Northern Germany / Benjamin Schaaf ; Betreuer: Hans von Storch." Hamburg : Staats- und Universitätsbibliothek Hamburg, 2018. http://d-nb.info/1163394319/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
49

Bates, Thomas Hathaway. "Sclerocactus wrightiae (Cactaceace): An Evaluation of the Impacts Associated with Cattle Grazing and the Use of Remote Sensing to Assess Cactus Detectability." BYU ScholarsArchive, 2020. https://scholarsarchive.byu.edu/etd/8986.

Full text
Abstract:
The Wright fishhook cactus (Sclerocactus wrightiae L.D. Benson) is an endangered cactus species endemic to south-central Utah. Since its listing in 1979 by the U.S. Fish and Wildlife Service, the potential impacts of soil disturbance by cattle have become a central focus of management policies and monitoring efforts. However, little to no empirical data has been collected to substantiate the hypothesis that soil disturbance by cattle has direct or indirect negative effects on Wright fishhook cactus growth or reproduction. Over the years, the Bureau of Land Management (BLM) and Capitol Reef National Park (CRNP) have invested significant resources documenting cactus populations, including several attributes of individual cacti: GPS location, diameter, number of flowers, fruits, or buds, number of stems, and the presence or absence of a cow track within 15 cm of the cactus. While these efforts have been commendable, due to the defining phenological characteristics of this species (flower and filament color) and its short flowering period (April-May), it remains difficult to study, and much basic biological information, including a range-wide population estimate and defined critical habitat, remains unknown. Our research had two primary objectives: 1) evaluate the effects of soil disturbance by cattle on reproduction and diameter of the Wright fishhook cactus (Chapters 1 and 2), and 2) explore the use of drones and GIS to define critical habitat and obtain an accurate range-wide population estimate (Chapters 3 and 4). In Chapter 1, we analyzed cactus attribute data collected by the BLM at 30 macro-plots representing different levels of soil-related cattle disturbance (high, moderate, and low) from 2011-2017. We found no significant association between level of cattle disturbance and flower density or cactus diameter. We did find a significant negative association between flower frequency and increased disturbance. In Chapter 2, we conducted an experimental study in which tracks were simulated within 15 cm of cacti at various levels (Ctrl, 1-Track, 2-Track, 4-Tracks, and 4-Tracks Doubled). No significant association was observed between the number of tracks and the response in diameter, flower production, fruit production, or seed set. In Chapter 3, we conducted drone flights over 14 macro-plots at three different altitudes above ground level (10 m, 15 m, and 20 m) and found that while the 10 m flights provided the best remotely sensed survey results, drones are not a suitable replacement for ground censuses. In Chapter 4, we used a Resource Selection Function to define critical habitat for the Wright fishhook cactus. Our modeling suggests that geology, elevation, and slope are significant factors in defining cactus habitat. Based on the results of our research, we conclude that soil disturbance by cattle may not have a significant influence on Wright fishhook cactus populations or dynamics, and that accurate range-wide population estimates may be best obtained through ground surveys within the predicted critical habitat.
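A Resource Selection Function such as the one used in Chapter 4 is commonly fitted as a used-versus-available logistic regression on habitat covariates; the sketch below follows that pattern with synthetic geology, elevation and slope values that are not the thesis's data.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)

def synth_points(n, elev_mean, slope_scale):
    """Synthetic covariates (geology class, elevation, slope) for n points."""
    return np.column_stack([
        rng.integers(0, 3, size=n),           # geology class (0, 1, 2)
        rng.normal(elev_mean, 120, size=n),   # elevation (m)
        rng.gamma(2.0, slope_scale, size=n),  # slope (degrees)
    ])

used = synth_points(300, elev_mean=1650, slope_scale=2.0)    # cactus present
avail = synth_points(1000, elev_mean=1750, slope_scale=4.0)  # background points
X_raw = np.vstack([used, avail])
y = np.r_[np.ones(len(used)), np.zeros(len(avail))]

# One-hot encode geology, standardise elevation and slope.
geology_onehot = (X_raw[:, [0]] == np.arange(3)).astype(float)
cont = (X_raw[:, 1:] - X_raw[:, 1:].mean(axis=0)) / X_raw[:, 1:].std(axis=0)
X = np.column_stack([geology_onehot, cont])

rsf = LogisticRegression(max_iter=1000).fit(X, y)
print(np.round(np.exp(rsf.coef_), 2))   # relative selection strengths
```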
APA, Harvard, Vancouver, ISO, and other styles
50

Connors, Timothy W. "High resolution simulations of galactic cannibalism." Swinburne Research Bank, 2008. http://hdl.handle.net/1959.3/44962.

Full text
Abstract:
Thesis (PhD) - Swinburne University of Technology, 2008.
A dissertation presented in fulfillment of the requirements for the degree of Doctor of Philosophy, Swinburne University of Technology - 2008. Typescript. Bibliography: p. 133-145.
APA, Harvard, Vancouver, ISO, and other styles
