Selected scholarly literature on the topic "Simulation validée"

Cite a source in APA, MLA, Chicago, Harvard, and many other citation styles

Choose the source type:

Consult the list of current articles, books, theses, conference proceedings, and other scholarly sources related to the topic "Simulation validée".

Next to every source in the list of references there is an "Add to bibliography" button. Press it, and we will automatically generate the bibliographic citation of the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the scholarly publication as a .pdf file and read the abstract (summary) of the work online, if it is included in the metadata.

Journal articles on the topic "Simulation validée":

1

Secretan, Y., M. Leclerc, S. Duchesne, and M. Heniche. "Une méthodologie de modélisation numérique de terrain pour la simulation hydrodynamique bidimensionnelle". Revue des sciences de l'eau 14, no. 2 (April 12, 2005): 187–212. http://dx.doi.org/10.7202/705417ar.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
The article addresses the problem of constructing a Digital Terrain Model (DTM) in the context of two-dimensional hydraulic studies, here related to flooding. The difficulty stems from the heterogeneity of the available data sets, which differ notably in precision, spatial coverage, distribution and density, as well as in georeferencing. Within a hydrodynamic modelling exercise, the whole study area must be documented and the information carried on a homogeneous support. The article proposes an efficient strategy supported by a software tool, MODELEUR, which makes it possible to quickly merge the various data sets available for each variable, whether scalar such as topography or vectorial such as wind, to preserve their integrity, and to give efficient access to them at every stage of the analysis and modelling process. Thus, whatever the environmental use of the digital terrain model (development planning, habitat conservation, flooding, sedimentology), the method allows the data to be projected onto a homogeneous support of the finite element mesh type while the original is kept intact as a reference. The method is based on a partition of the analysis domain by type of information: topography, substrate, surface roughness, etc. A partition is composed of subdomains, each of which associates a data set with a portion of the analysis domain through a declarative procedure. In our view, this conceptual model constitutes the DTM proper. The process of transferring the data from the partitions to an analysis mesh is considered a product of the DTM rather than the DTM itself. It is carried out using an interpolation technique such as the finite element method. Following the Saguenay floods of 1996, the method was tested and validated to demonstrate its effectiveness; this example serves as an illustration.
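
As a rough illustration of the projection step described above (not the MODELEUR tool itself), the following Python sketch interpolates scattered elevation points onto the nodes of a fixed analysis mesh; the survey points and the mesh are made up for the example.

```python
import numpy as np
from scipy.interpolate import griddata

# Hypothetical scattered survey points: (x, y) coordinates and elevations.
rng = np.random.default_rng(0)
survey_xy = rng.uniform(0.0, 100.0, size=(500, 2))
survey_z = 5.0 + 0.02 * survey_xy[:, 0] + np.sin(survey_xy[:, 1] / 10.0)

# Nodes of the (fixed) analysis mesh onto which all data sets are projected.
gx, gy = np.meshgrid(np.linspace(0, 100, 51), np.linspace(0, 100, 51))
mesh_nodes = np.column_stack([gx.ravel(), gy.ravel()])

# Linear interpolation inside the convex hull, nearest-neighbour fill outside,
# so every mesh node carries a value while the original points stay untouched.
z_lin = griddata(survey_xy, survey_z, mesh_nodes, method="linear")
z_near = griddata(survey_xy, survey_z, mesh_nodes, method="nearest")
z_mesh = np.where(np.isnan(z_lin), z_near, z_lin)

print(f"{np.isnan(z_lin).sum()} nodes filled by nearest-neighbour fallback")
```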
2

Mueller, Patrick, Matthias Lehmann, and Alexander Braun. "Simulating tests to test simulation". Electronic Imaging 2020, no. 16 (January 26, 2020): 149–1. http://dx.doi.org/10.2352/issn.2470-1173.2020.16.avm-148.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Simulation is an established tool to develop and validate camera systems. The goal of autonomous driving is pushing simulation into a more important and fundamental role for safety, validation and coverage of billions of miles. Realistic camera models are moving more and more into focus, as simulations need to be more than photo-realistic: they need to be physically realistic, representing the actual camera system onboard the self-driving vehicle in all relevant physical aspects – and this is not only true for cameras, but also for radar and lidar. But as camera simulations become more and more realistic, how is this realism tested? Actual, physical camera samples are tested in laboratories following norms like ISO12233, EMVA1288 or the developing P2020, with test charts like dead leaves, slanted edge or OECF charts. In this article we propose to validate the realism of camera simulations by simulating the physical test bench setup, and then comparing the synthetic simulation result with physical results from the real-world test bench using the established normative metrics and KPIs. While this procedure is used sporadically in industrial settings, we are not aware of a rigorous presentation of these ideas in the context of realistic camera models for autonomous driving. After describing the process, we give concrete examples for several different measurement setups using MTF and SFR, and show how these can be used to characterize the quality of different camera models.
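
The comparison the authors propose rests on normative metrics such as MTF/SFR computed on both the simulated and the physical test bench. The snippet below is a deliberately simplified, non-normative sketch of that comparison: it derives an MTF from a synthetic edge profile by differentiating the edge spread function and taking the Fourier transform, then compares two cameras. A standards-compliant ISO 12233 slanted-edge analysis involves edge registration and oversampling that are omitted here, and all numbers are invented.

```python
import numpy as np

def mtf_from_edge(edge_profile):
    """MTF magnitude from a 1-D edge spread function (ESF):
    differentiate to get the line spread function, then take |FFT|."""
    lsf = np.gradient(edge_profile)
    mtf = np.abs(np.fft.rfft(lsf))
    return mtf / mtf[0]                      # normalise to 1 at zero frequency

x = np.linspace(-5, 5, 256)
esf_measured = 0.5 * (1 + np.tanh(x / 0.8))   # stand-in for lab edge data
esf_simulated = 0.5 * (1 + np.tanh(x / 0.9))  # stand-in for a simulated edge

mtf_meas = mtf_from_edge(esf_measured)
mtf_sim = mtf_from_edge(esf_simulated)

# Simple realism KPI: maximum absolute MTF deviation over the usable band.
band = slice(0, 40)
print("max |MTF_sim - MTF_meas| =", np.max(np.abs(mtf_sim[band] - mtf_meas[band])))
```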
3

BIAŁASZ, Sebastian, and Ramon PAMIES. "NUMERICAL SIMULATION OF THE DESIGN OF EXTRUSION PROCESS OF POLYMERIC MINI-TUBES". Applied Computer Science 14, no. 3 (September 30, 2018): 81–95. http://dx.doi.org/10.35784/acs-2018-23.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
In this paper we present a study reporting the numerical simulation of the extrusion process of small-diameter pipes. Polypropylene and low-density polyethylene were chosen as the plastics, and a selected transverse head as the tool in the simulations. The aim of the study is to examine the temperature distribution in the individual sections of the bagasse and the tools, in order to optimize the parameters and flow of the extrusion process and to validate the tooling, by simulating the flow of plastic through the head.
4

Youngman, Benjamin D., and David B. Stephenson. "A geostatistical extreme-value framework for fast simulation of natural hazard events". Proceedings of the Royal Society A: Mathematical, Physical and Engineering Sciences 472, no. 2189 (May 2016): 20150855. http://dx.doi.org/10.1098/rspa.2015.0855.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
We develop a statistical framework for simulating natural hazard events that combines extreme value theory and geostatistics. Robust generalized additive model forms represent generalized Pareto marginal distribution parameters, while a Student's t-process captures spatial dependence and gives a continuous-space framework for natural hazard event simulations. The efficiency of the simulation method allows many years of data (typically over 10,000) to be obtained at relatively little computational cost. This makes the model viable for forming the hazard module of a catastrophe model. We illustrate the framework by simulating maximum wind gusts for European windstorms, which are found to have realistic marginal and spatial properties and to validate well against wind gust measurements.
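
The construction pairs generalized Pareto margins with Student's t spatial dependence. The sketch below mimics that idea in a few lines (a t-copula over a handful of sites with an exponential correlation function, transformed to generalized Pareto gust excesses); the real framework fits the marginal parameters with generalized additive forms, which is not reproduced here, and every parameter value is invented.

```python
import numpy as np
from scipy.stats import t, genpareto

rng = np.random.default_rng(1)
sites = np.array([[0, 0], [50, 0], [0, 50], [80, 80]], dtype=float)  # km, made up
nu, n_events = 5, 10_000                     # t degrees of freedom, event count

# Exponential spatial correlation between sites.
d = np.linalg.norm(sites[:, None, :] - sites[None, :, :], axis=-1)
corr = np.exp(-d / 100.0)

# Multivariate Student-t fields: Gaussian draws scaled by a chi-square factor.
z = rng.multivariate_normal(np.zeros(len(sites)), corr, size=n_events)
w = np.sqrt(rng.chisquare(nu, size=(n_events, 1)) / nu)
t_field = z / w

# Copula step: t margins -> uniforms -> generalized Pareto gust excesses (m/s).
u = t.cdf(t_field, df=nu)
gusts = 20.0 + genpareto.ppf(u, c=-0.1, scale=8.0)   # 20 m/s threshold, made up

print("per-site 99th percentile gust:", np.percentile(gusts, 99, axis=0).round(1))
```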
5

Jones, Jenny. "Researchers Validate Alternative Seismic Simulation Method". Civil Engineering Magazine Archive 83, no. 9 (October 2013): 38–39. http://dx.doi.org/10.1061/ciegag.0000656.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Cavargna, Augusto, Luigi Mongibello, Marcello Iasiello, and Nicola Bianco. "Analysis of a Phase Change Material-Based Condenser of a Low-Scale Refrigeration System". Energies 16, no. 9 (April 28, 2023): 3798. http://dx.doi.org/10.3390/en16093798.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
This study concerns the numerical simulation and the experimental implementation of a low-scale Phase Change Material-based (PCM-based) condenser, to be included in a PCM-based portable cooling system. In this category of cooling systems, the PCM can be integrated either in the condenser or in the evaporator. In the present study, the PCM is integrated in the condenser of the vapor compression cycle to absorb the heat released by the refrigerant fluid (R134a) during condensation, thus eliminating the need to transfer heat to the external environment. The main objective of the present study is to build and validate a numerical model capable of simulating both the refrigerant fluid and the PCM thermofluid dynamics. For this purpose, a commercial solver was used for the implementation of the developed numerical model, and experimental tests were performed to validate the numerical simulation results. The paper reports the details and test results of both the numerical model and the experimental apparatus. The simulation results indicate good agreement between the numerical and experimental data.
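
The validation step described above amounts to quantifying the agreement between simulated and measured quantities. As a generic illustration (not the authors' solver or data), the following sketch computes a few common agreement metrics for a simulated versus a measured temperature history; all values are invented.

```python
import numpy as np

# Hypothetical measured and simulated PCM temperatures over time (degrees C).
t_meas = np.array([22.0, 24.5, 27.1, 29.0, 30.2, 31.0, 31.5])
t_sim = np.array([22.0, 24.0, 26.8, 29.3, 30.6, 31.2, 31.4])

err = t_sim - t_meas
rmse = np.sqrt(np.mean(err**2))
mae = np.mean(np.abs(err))
max_rel = np.max(np.abs(err) / np.abs(t_meas)) * 100.0

print(f"RMSE = {rmse:.2f} K, MAE = {mae:.2f} K, max relative error = {max_rel:.1f} %")
```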
7

Sheng, Chunyang, Kenichi Nomura, Pankaj Rajak, Aiichiro Nakano, Rajiv K. Kalia, and Priya Vashishta. "Quantum Molecular Dynamics Validation of Nanocarbon Synthesis by High-Temperature Oxidation of Nanoparticles". MRS Advances 1, no. 24 (2016): 1811–16. http://dx.doi.org/10.1557/adv.2016.413.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
This study uses ab initio quantum molecular dynamics (QMD) simulations to validate multimillion-atom reactive molecular dynamics (RMD) simulations, and predicts unexpected condensation of carbon atoms during high-temperature oxidation of silicon-carbide nanoparticles (nSiC). For the validation process, a small nSiC in an oxygen environment is chosen for the QMD simulation. The QMD results provide the number of Si-O and C-O bonds as a function of time. An RMD simulation is then performed under identical conditions. The time evolutions of the different bonds are compared between the QMD and RMD simulations. We observe the condensation of a large number of C-cluster nuclei into larger C clusters in both simulations, thereby validating RMD. Furthermore, we use the QMD simulation results as input to a multi-objective genetic algorithm to train the RMD force-field parameters. The resulting force field reproduces the ground-truth QMD simulation results far better.
8

Nemes, Dániel, and Sándor Hajdu. "Simulation of BLDC Motor Drive Systems for Electric Vehicles Using Matlab Simulink". International Journal of Engineering and Management Sciences 8, no. 1 (April 30, 2023): 48–52. http://dx.doi.org/10.21791/ijems.2023.1.6.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
The defining scientific developments of our time would not have been possible without the use of simulations. The aim of the research is to create a simulation of a BLDC motor. When creating a simulation, great emphasis must be placed on defining the purpose of the simulation. This basically determines the structure and complexity of the model. The model discussed here was created so that an optimization task could be defined more precisely by inserting it as a sub-model into a vehicle dynamics model. Scalability was another aspect, that is, to be able to increase the accuracy of the model with measured data in the future, as well as to be able to validate it. During the research, a BLDC motor efficiency map generation program was created, as well as an environment for testing the generated data. The created system gives researchers the opportunity to use a shape-correct efficiency model when simulating a BLDC motor even without measured data. This makes it possible to discover real relationships between model parameters when performing optimization.
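
The abstract mentions generating a "shape-correct" efficiency map without measured data. A minimal sketch of such a map generator is given below in Python (the cited work uses Matlab Simulink), assuming a simple loss model in which copper losses grow with torque squared and iron and windage losses grow with speed; every coefficient is a placeholder, not a value from the cited work.

```python
import numpy as np

def efficiency_map(speed_rpm, torque_nm, r_cu=0.004, k_fe=0.02, k_w=1e-6):
    """Efficiency grid from a crude loss model:
    copper ~ torque^2, iron ~ speed, windage ~ speed^2 (all coefficients assumed)."""
    w = speed_rpm * 2 * np.pi / 60.0          # mechanical speed, rad/s
    p_out = w * torque_nm                     # mechanical output power, W
    p_loss = r_cu * torque_nm**2 * 1e3 + k_fe * speed_rpm + k_w * speed_rpm**2
    return np.where(p_out > 0, p_out / (p_out + p_loss), 0.0)

speeds = np.linspace(100, 6000, 60)           # rpm
torques = np.linspace(0.5, 20, 40)            # Nm
S, T = np.meshgrid(speeds, torques)
eta = efficiency_map(S, T)
print("peak efficiency in the map: %.3f" % eta.max())
```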
9

Durst, Phillip J., Derek T. Anderson, and Cindy L. Bethel. "A historical review of the development of verification and validation theories for simulation models". International Journal of Modeling, Simulation, and Scientific Computing 08, no. 02 (January 9, 2017): 1730001. http://dx.doi.org/10.1142/s1793962317300011.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Modeling and simulation (M&S) play a critical role in both engineering and basic research processes. Computer-based models have existed since the 1950s, and those early models have given way to the more complex computational and physics-based simulations used today. As such, a great deal of research has been done to establish what level of trust should be given to simulation outputs and how to verify and validate the models used in these simulations. This paper presents an overview of the theoretical work done to date defining formal definitions for, and methods of, verification and validation (V&V) of computer models. Simulation models are broken down into three broad categories: analytical and simulation models, computational and physics-based models, and simulations of autonomous systems, and the unique theories and methods developed to address V&V of these models are presented. This paper also presents the current problems in the theoretical field of V&V for models as simulations move from single-system models and simulations to more complex simulation tools. In particular, this paper highlights the lack of agreed-upon methods for V&V of simulations of autonomous systems, such as autonomous unmanned vehicles, and proposes some next steps needed to address this problem.
10

Bindle, Liam, Randall V. Martin, Matthew J. Cooper, Elizabeth W. Lundgren, Sebastian D. Eastham, Benjamin M. Auer, Thomas L. Clune et al. "Grid-stretching capability for the GEOS-Chem 13.0.0 atmospheric chemistry model". Geoscientific Model Development 14, no. 10 (October 6, 2021): 5977–97. http://dx.doi.org/10.5194/gmd-14-5977-2021.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Modeling atmospheric chemistry at fine resolution globally is computationally expensive; the capability to focus on specific geographic regions using a multiscale grid is desirable. Here, we develop, validate, and demonstrate stretched grids in the GEOS-Chem atmospheric chemistry model in its high-performance implementation (GCHP). These multiscale grids are specified at runtime by four parameters that offer users nimble control of the region that is refined and the resolution of the refinement. We validate the stretched-grid simulation versus global cubed-sphere simulations. We demonstrate the operation and flexibility of stretched-grid simulations with two case studies that compare simulated tropospheric NO2 column densities from stretched-grid and cubed-sphere simulations to retrieved column densities from the TROPOspheric Monitoring Instrument (TROPOMI). The first case study uses a stretched grid with a broad refinement covering the contiguous US to produce simulated columns that perform similarly to a C180 (∼ 50 km) cubed-sphere simulation at less than one-ninth the computational expense. The second case study experiments with a large stretch factor for a global stretched-grid simulation with a highly localized refinement with ∼10 km resolution for California. We find that the refinement improves spatial agreement with TROPOMI columns compared to a C90 cubed-sphere simulation of comparable computational demands. Overall, we find that stretched grids in GEOS-Chem are a practical tool for fine-resolution regional- or continental-scale simulations of atmospheric chemistry. Stretched grids are available in GEOS-Chem version 13.0.0.

Theses on the topic "Simulation validée":

1

Bénard, Vincent. "Evaluation de la sûreté de fonctionnement des systèmes complexes, basée sur un mode fontionnel dynamique : la méthode SAFE-SADT". Valenciennes, 2004. http://ged.univ-valenciennes.fr/nuxeo/site/esupversions/3fb536a1-1028-4880-b5f0-7ff718ea5b94.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
This thesis deals with the design of dependable automated complex systems. A qualitative approach aims at modelling the functional subsets while integrating various constraints (dependability, performance, cost, etc.) as early as the design phase. The problem inherent in the design of automated complex systems can be summarized by the following question: how can the aggregation of these different functions be expressed in terms of the RAMS parameters of the global system? This work starts from the observation that languages and tools for modelling abstract architectures obtained by composition of software and hardware entities are lacking. The proposed functional, dynamic SAFE-SADT method is a first element of response. It allows the modelling, characterization, identification and representation of dependences within the operational architecture, and the quantification of the dependability parameters in order to validate the operational architecture, taking the dynamic aspects into account by means of Monte Carlo simulation.
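
SAFE-SADT quantifies the RAMS parameters of an aggregated architecture by Monte Carlo simulation. The following sketch shows the general idea on a toy architecture (one component in series with a redundant pair, exponential failure times); the structure and failure rates are illustrative only and are not taken from the thesis.

```python
import numpy as np

rng = np.random.default_rng(42)
n_runs, mission_time = 100_000, 10_000.0        # hours, illustrative

# Exponential times to failure for three components (failure rates per hour).
t_a = rng.exponential(1 / 1e-4, n_runs)         # component A (in series)
t_b = rng.exponential(1 / 5e-4, n_runs)         # component B (redundant pair)
t_c = rng.exponential(1 / 5e-4, n_runs)         # component C (redundant pair)

# The system fails when A fails, or when both B and C have failed.
t_system = np.minimum(t_a, np.maximum(t_b, t_c))

reliability = np.mean(t_system > mission_time)  # R(mission_time)
mttf = t_system.mean()
print(f"R({mission_time:.0f} h) = {reliability:.3f}, estimated MTTF = {mttf:.0f} h")
```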
2

Bertin, Étienne. "Robust optimal control for the guidance of autonomous vehicles". Electronic Thesis or Diss., Institut polytechnique de Paris, 2022. http://www.theses.fr/2022IPPAE012.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
The guidance of a reusable launcher is a control problem that requires both precision and robustness: one must compute a trajectory and a control such that the system reaches the landing zone, without crashing into it or exploding mid-flight, all while using as little fuel as possible. Optimal control methods based on Pontryagin's Maximum Principle can compute an optimal trajectory with great precision, but uncertainties, i.e. the discrepancies between the estimated values of the initial state and parameters and their actual values, cause the actual trajectory to deviate, which can be dangerous. In parallel, set-based methods, and notably validated simulation, can enclose all trajectories of a system with uncertainties. This thesis combines those two approaches to enclose sets of optimal trajectories of a problem with uncertainties in order to guarantee the robustness of the guidance of autonomous vehicles. We start by defining sets of optimal trajectories for systems with uncertainties, first for mathematically perfect trajectories, then for the trajectory of a vehicle subject to estimation errors that may or may not use sensor information to compute a new trajectory online. Pontryagin's principle characterizes those sets as solutions of boundary value problems whose dynamics are subject to uncertainties. We develop algorithms that enclose all solutions of these boundary value problems using validated simulation, interval arithmetic and contractor theory. However, validated simulation with intervals is subject to significant over-approximation, which limits our methods. To remedy that, we replace intervals with constrained symbolic zonotopes. We use those zonotopes to simulate hybrid systems, enclose the solutions of boundary value problems and build an inner approximation to complement the classical outer approximation. Finally, we combine all our methods to compute sets of trajectories for aerospace systems and use those sets to assess the robustness of the control.
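
Validated simulation encloses every trajectory compatible with bounded uncertainties. The snippet below illustrates only the flavour of that idea, with naive interval arithmetic applied to a linear ODE (dx/dt = -k·x with k and x0 given as intervals); a genuine validated integrator, as used in the thesis, additionally encloses the truncation error of each step, which this sketch does not do.

```python
# Naive interval Euler propagation for dx/dt = -k * x, with x0 and k uncertain.
# NOT a rigorous validated integrator: the local truncation error is not enclosed.

def mul(a, b):
    """Interval product [a] * [b]."""
    products = [a[0] * b[0], a[0] * b[1], a[1] * b[0], a[1] * b[1]]
    return (min(products), max(products))

def add(a, b):
    return (a[0] + b[0], a[1] + b[1])

x = (0.9, 1.1)          # uncertain initial state
k = (0.45, 0.55)        # uncertain decay rate
h, steps = 0.01, 200    # step size and horizon (t = 2.0)

for _ in range(steps):
    dx = mul((-1.0, -1.0), mul(k, x))       # interval evaluation of -k*x
    x = add(x, mul((h, h), dx))             # interval Euler step

print(f"enclosure of x(2.0): [{x[0]:.4f}, {x[1]:.4f}]")
```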
3

PAGEOT, JEAN-MARC. "Guider la simulation pour valider les protocoles". Rennes 1, 1989. http://www.theses.fr/1989REN10086.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
The notion of guidance is developed: guidance that can be applied independently of the protocol under study, which consists in statically reducing the protocol's state space, and, by contrast, guidance that depends on the protocol. Protocol-dependent guidance can itself be divided into intra-protocol and extra-protocol guidance. The study was carried out in the setting of extra-protocol dependent guidance. The guidance technique was applied during the simulation of a distributed algorithm for detecting process termination and to two protocols (Connectionless Network Protocol and T70).
4

Benson, Kristen D. "Use of centrifuge modelling to validate an unsaturated transport numerical simulation". Thesis, National Library of Canada = Bibliothèque nationale du Canada, 2002. http://www.collectionscanada.ca/obj/s4/f2/dsk3/ftp05/NQ65665.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Papapanagiotou, Nikolaos, Eugen Constantin, and Sanjeev Singh. "Analysis of DDD and VDT simulation techniques to determine feasibility of using VDT simulation to validate DDD models". Monterey, California. Naval Postgraduate School, 2004. http://hdl.handle.net/10945/9925.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Approved for public release; distribution is unlimited.
MBA Professional Report
The purpose of this MBA project was to determine whether and how VDT can emulate the results obtained from A2C2 experiments. To do that, we first focused on learning the basics of the VDT and DDD simulation techniques and then on how the models used in DDD can be analyzed using VDT. To this end, we obtained experimental data from DDD Experiment 8 and created representative models in VDT to determine the similarities and differences. We also kept detailed records of our research to assist individuals who may want to expand on our work in the future. The project involved studying the DDD and VDT techniques, establishing building blocks in VDT, creating a best-effort model for DDD Experiment 8 and studying the various outcomes. In this project we could not successfully replicate the complex DDD Experiment 8 scenarios within VDT. However, important conclusions were drawn that should go a long way towards helping future studies in this regard.
6

Benezech, Laurent Jean-Michel Dimotakis Paul E. "Premixed hydrocarbon stagnation flames : experiments and simulations to validate combustion chemical-kinetic models /". Diss., Pasadena, Calif. : California Institute of Technology, 2008. http://resolver.caltech.edu/CaltechETD:etd-05302008-113043.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Miller, Craig. "A Research Based General Framework for Effective Simulation Development and Methodology to Validate Economic Fidelity". Thesis, Metropolitan State University, 2015. http://pqdtopen.proquest.com/#viewpdf?dispub=3668376.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:

The three primary objectives of this project were: (1) to identify and codify a framework for best practices in developing a simulation; (2) to construct a prototype or test simulation based on these best practices, and (3) to create a methodology to assess pedagogical efficacy and economic fidelity.

While the current body of knowledge is rich in describing the virtues and pitfalls of computer simulation technology, which has existed for close to 60 years, the literature nonetheless lacks a codified set of best practices for developers and objective assessment methods to judge simulation quality, both for pedagogical effectiveness and for economic fidelity. This study addresses both issues and offers a solution that is unique and effective: a General Framework for Effective Simulation Development that is derivative of, and an extension of, existing research in the business simulation domain. A simulation prototype, SimWrite!, has been developed that is consistent with the 12 elements identified in this framework. Each stage of the development of this test simulation is explicitly tied to the best practices that emerged from the literature. A second assessment tool, the Economic Theory Input-Output Matrix, is presented to enable a user to measure the economic fidelity of a simulation. This tool is based on microeconomic theory that is taught at business schools throughout the globe. Both assessment tools are applied to the test simulation in a manner that will enable the user to replicate this research with other simulations they are interested in. The products of this dissertation are intended to help current and future developers make better simulations, and to help faculty users of simulations better select simulations that will help them achieve the goal of all involved in teaching business: to produce greater learning for students.

8

Shaw, J. "Use of plant growth simulations to validate BRDF model parameters derived SPOT-VGT data". Thesis, Swansea University, 2003. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.639015.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
By inverting BRDF models against satellite-sensor measurements of spectral reflectance recorded at different illumination and view angles, it is theoretically possible to obtain estimates of several important environmental variables (e.g., LAI, albedo). In recent years, a growing number of studies has exploited this basic approach; however, the relationships between the resulting model-derived quantities and the corresponding surface properties have yet to be rigorously explored. This thesis examines these relationships by comparing upscaled observations of LAI with the outputs of BRDF model inversions. The research makes use of ground-based measurements of LAI, albedo, spectral reflectance and canopy biometric characteristics for a variety of arable crops over two growing seasons (1999/2000). These data sets are, however, recorded at very different spatial and temporal scales compared to the satellite-sensor data employed as input to the BRDF models (c. 1 km). The former data set was, therefore, upscaled to match the satellite data. Upscaling is achieved using a combination of plant growth models and intermediate-spatial-resolution satellite-sensor images. The plant growth models, in particular SUCROS, are used to provide estimates of LAI over the full growing season. These temporal profiles of LAI are upscaled to the 1 km spatial resolution of the SPOT-VGT image data employed in the BRDF model inversions using image data acquired by the Landsat-TM sensor. These data are used to generate a land cover map of the study area (84% accuracy). The simulated temporal profiles of LAI are applied to this land cover map on a cover-type by cover-type basis to generate images of LAI, initially at 30 m resolution and, subsequently, at 1 km resolution. In theory, the kernel weights are related to surface biophysical properties; however, it was subsequently determined that, in the case of LAI, this was not so. Therefore, evidence that the kernel weights may be related to LAI is sought by comparing the temporal profiles of upscaled LAI with the temporal profiles of the kernel weights. Although some positive correlation with the kernel weights was found, the upscaled LAI was more strongly correlated with the NDVI and the corrected NDVI than with the kernel weights.
9

Bochníček, Štěpán. "Validace numerické simulace průběhu plnění matečné formy voskem a její následná optimalizace". Master's thesis, Vysoké učení technické v Brně. Fakulta strojního inženýrství, 2013. http://www.nusl.cz/ntk/nusl-230975.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
The main topic of the diploma thesis is the simulation of the process in which wax fills the cavity of the "mother" metal die. This knowledge is a necessary prerequisite for the correct design of the gating system and for setting correct injection parameters (temperature, pressure, wax flow) when making wax patterns.
10

Mokrý, Michal. "In silico návrh a validace peptidových derivátů konotoxinu pro nanoterapii neuroblastomu". Master's thesis, Vysoké učení technické v Brně. Fakulta elektrotechniky a komunikačních technologií, 2021. http://www.nusl.cz/ntk/nusl-442491.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
The thesis deals with the in silico design and validation of peptides based on the conotoxin MrIA, isolated from marine snails of the species Conus marmoreus, and with the possibility of using these peptides in the treatment of neuroblastoma by targeting the norepinephrine transporter. Five peptides based on this conotoxin were simulated using molecular dynamics simulations, and their trajectories were analyzed to determine the properties of these peptides. Two homology models of the human norepinephrine transporter were created to analyze the binding of the conotoxin-based peptides to the norepinephrine transporter. The peptides were then synthesized and used to coat apoferritin nanoparticles with ellipticine entrapped inside the apoferritin. The resulting peptides and nanoparticles were further examined to elucidate their physicochemical properties. Interactions and cytotoxicity were investigated by applying the nanoparticles to neuroblastoma and epithelial cells. From the in silico and in vitro analyses, the YKL-6 peptide emerged as the best candidate for further research.

Books on the topic "Simulation validée":

1

Analysis of DDD and VDT Simulation Techniques to Determine Feasibility of Using VDT Simulation to Validate DDD Models. Storming Media, 2004.

Search for the full text
APA, Harvard, Vancouver, ISO, and other styles
2

Monge, Peter R., and Noshir Contractor. Theories of Communication Networks. Oxford University Press, 2003. http://dx.doi.org/10.1093/oso/9780195160369.001.0001.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
To date, most network research contains one or more of five major problems. First, it tends to be atheoretical, ignoring the various social theories that contain network implications. Second, it explores single levels of analysis rather than the multiple levels of which most networks are comprised. Third, network analysis has made little use of the insights from contemporary complex systems analysis and computer simulations. Fourth, it typically uses descriptive rather than inferential statistics, thus robbing it of the ability to make claims about the larger universe of networks. Finally, almost all the research is static and cross-sectional rather than dynamic. Theories of Communication Networks presents solutions to all five problems. The authors develop a multitheoretical model that relates different social science theories to different network properties. This model is multilevel, providing a network decomposition that applies the various social theories to all network levels: individuals, dyads, triples, groups, and the entire network. The book then establishes a model from the perspective of complex adaptive systems and demonstrates how to use Blanche, an agent-based network computer simulation environment, to generate and test network theories and hypotheses. It presents recent developments in network statistical analysis, the p* family, which provides a basis for valid multilevel statistical inferences regarding networks. Finally, it shows how to relate communication networks to other networks, thus providing the basis, in conjunction with computer simulations, for studying the emergence of dynamic organizational networks.
3

Shaikh, Mohd Faraz. Machine Learning in Detecting Auditory Sequences in Magnetoencephalography Data : Research Project in Computational Modelling and Simulation. Technische Universität Dresden, 2021. http://dx.doi.org/10.25368/2022.411.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Does your brain replay your recent life experiences while you are resting? An open question in neuroscience is which events our brain replays and whether there is any correlation between the replay and the duration of the event. In this study I tried to investigate this question using magnetoencephalography data from an active listening experiment. Magnetoencephalography (MEG) is a non-invasive neuroimaging technique used to study brain activity and understand brain dynamics in perception and cognitive tasks, particularly in the fields of speech and hearing. It records the magnetic field generated in our brains to detect brain activity. I built a machine learning pipeline which uses part of the experiment data to learn the sound patterns and then predicts the presence of sound in the later part of the recordings, in which the participants were made to sit idle and no sound was played. The aim of the study is to test the replay of learned sound sequences in the post-listening period. I used a classification scheme to identify patterns of MEG responses to different sound sequences in the post-task period. The study concluded that the sound sequences can be identified and distinguished above the theoretical chance level, and hence proved the validity of our classifier. Further, the classifier could predict the sound sequences in the post-listening period with very high probability, but in order to validate the model results on the post-listening period, more evidence is needed.
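
The pipeline described above classifies MEG responses and checks that accuracy exceeds the theoretical chance level. A bare-bones stand-in for that classification step is sketched below using scikit-learn on synthetic features; the real study works with MEG sensor data and its own preprocessing, none of which is reproduced here.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_trials, n_features, n_classes = 300, 50, 4       # synthetic "epochs"

y = rng.integers(0, n_classes, n_trials)           # which sound sequence was heard
X = rng.normal(size=(n_trials, n_features)) + 0.4 * y[:, None]  # weakly separable

clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
scores = cross_val_score(clf, X, y, cv=5)

print(f"mean CV accuracy: {scores.mean():.2f} (chance level = {1 / n_classes:.2f})")
```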

Book chapters on the topic "Simulation validée":

1

Helmig, Thorsten, Hui Liu, Simon Winter, Thomas Bergs, and Reinhold Kneer. "Development of a Tool Temperature Simulation During Side Milling". In Lecture Notes in Production Engineering, 308–17. Cham: Springer International Publishing, 2023. http://dx.doi.org/10.1007/978-3-031-34486-2_22.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Current modeling approaches for cutting processes require extensive numerical and analytical simulations as well as a user experienced in the field of numerical simulation, which makes large-scale application time-consuming. Therefore, the goal is to implement existing models into an established side-milling simulation program, aiming for a computationally fast and user-friendly simulation approach capable of predicting transient tool temperatures along the cutting edge. The aim of this work is the development of the thermal model, which can later be implemented into existing programs. The modeling process involves the following two major steps: first, a geometric engagement simulation of the milling process with a parameterizable tool geometry is performed. These results are used to form a database linking the specific cutting force components with the heat flux components. Second, a three-dimensional transient heat conduction model of the cutter is established, applying the calculated heat flux components as boundary conditions in the simulation. Finally, first results of the performed simulation are presented and evaluated, in particular to validate the workflow and user accessibility. Future studies will then focus on further parameter analysis and experimental validation.
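
The second step described above, transient heat conduction driven by an imposed heat flux, can be illustrated with a heavily simplified 1-D explicit finite-difference analogue; the geometry, material data and flux value below are placeholders, not values from the chapter, and the real model is three-dimensional.

```python
import numpy as np

# 1-D explicit finite-difference transient conduction with a heat-flux boundary.
k, rho, cp = 50.0, 14_500.0, 300.0        # W/mK, kg/m3, J/kgK (carbide-like, assumed)
alpha = k / (rho * cp)
L, n = 0.005, 51                           # 5 mm of tool material, 51 nodes
dx = L / (n - 1)
dt = 0.4 * dx**2 / alpha                   # stable explicit step (< 0.5 dx^2/alpha)
q_in = 2e6                                 # heat flux entering the cutting edge, W/m2

T = np.full(n, 20.0)                       # initial temperature, degC
for _ in range(2000):
    Tn = T.copy()
    Tn[1:-1] = T[1:-1] + alpha * dt / dx**2 * (T[2:] - 2 * T[1:-1] + T[:-2])
    Tn[0] = Tn[1] + q_in * dx / k          # flux boundary at the loaded face
    Tn[-1] = 20.0                          # far side held at ambient
    T = Tn

print(f"surface temperature after {2000 * dt:.3f} s: {T[0]:.1f} degC")
```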
2

Pfeifer, Denis, Andreas Baumann, Marco Giani, Christian Scheifele, and Jörg Fehr. "Hybrid Digital Twins Using FMUs to Increase the Validity and Domain of Virtual Commissioning Simulations". In Advances in Automotive Production Technology – Towards Software-Defined Manufacturing and Resilient Supply Chains, 200–209. Cham: Springer International Publishing, 2023. http://dx.doi.org/10.1007/978-3-031-27933-1_19.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
The main objective of virtual commissioning is to help design and validate the control systems of entire production plants. Therefore, simulations at a logical and kinematic level are performed, typically in a Software- or Hardware-in-the-Loop configuration using the original control software and controller [1]. However, the lack of detail means that this type of simulation is insufficient for an integrated design of system dynamics and control algorithms. These engineering tasks are currently performed in separate tools, e.g. by finite element analysis, multibody simulations or a combination of the two, i.e. elastic multibody systems (EMBS) [2]. However, the designed components are only considered individually and not in the context of the control technology used. Therefore, primarily synthetic inputs are used rather than the original control behavior. With a higher level of simulation detail, further questions about the system, such as the effect of control algorithms on the dynamic processes, can be virtually validated. Therefore, this paper explores hybrid component-based digital twins to combine the advantages of both VC and EMBS. Hybrid components allow the simulation of the interactions between process, machine and control system with a high level of detail where this is beneficial. Such integration is achieved using the Functional Mock-up Interface (FMI) to couple different simulation models in a co-simulation environment [3]. This is demonstrated in a simulation use case of an inverted pendulum. The level of detail of individual components in the virtual commissioning tool ISG-virtuos [4] is increased by the modular integration of elastic multibody simulations via FMI, so that the swing-up controller can be designed in the simulation.
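
The coupling relies on the Functional Mock-up Interface. As a sketch of how an exported FMU can be exercised from Python, the snippet below uses the FMPy package (assumed to be installed); the file name "Pendulum.fmu" is a placeholder for whatever FMU the modelling tool actually exports, and the chapter itself couples the FMUs inside ISG-virtuos rather than from a script.

```python
from fmpy import read_model_description, simulate_fmu

fmu_path = "Pendulum.fmu"                  # placeholder: an FMU exported elsewhere

# Inspect the interface the FMU exposes before running it.
description = read_model_description(fmu_path)
print("model:", description.modelName)
print("first variables:", [v.name for v in description.modelVariables][:10])

# Run a co-simulation over 10 s; 'result' is a structured NumPy array with one
# column per recorded variable plus 'time'.
result = simulate_fmu(fmu_path, stop_time=10.0)
print("last time stamp:", result["time"][-1])
```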
3

Paegelow, Martin, and David García-Álvarez. "Advanced Pattern Analysis to Validate Land Use Cover Maps". In Land Use Cover Datasets and Validation Tools, 229–54. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-030-90998-7_12.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
In this chapter we explore pattern analysis for categorical LUC maps as a means of validating land use cover maps, land change and land change simulations. In addition to those described in Chap. "Spatial Metrics to Validate Land Use Cover Maps", we present three complementary methods and techniques: a Goodness of Fit metric to measure the agreement between two maps in terms of pattern (Map Curves), the focus on changes on pattern borders as a method for validating on-border processes, and a technique quantifying the magnitude of distance error. Map Curves (Sect. 1) offers a universal pattern-based index, called Goodness of Fit (GOF), which measures the spatial concordance between categorical rasters or vector layers. Complementary to this pattern validation metric, the following Sect. 2 focuses specifically on the changes that take place on pattern borders. This enables changes to be divided into those that take place on the borders of existing features and those that form new, disconnected features. Bringing this chapter on landscape patterns to a close, Sect. 3 presents a technique for quantifying allocation errors in simulation maps, more precisely the minimum distance between the allocation errors in simulation maps and the nearest patch belonging to the same category on the reference map. The comparison between a raster-based and a vector-based approach brings us back to the differences in measurement inherent in the representation of entities in raster and vector mode. These techniques are applied to two datasets. Section 1 uses the Asturias Central Area database, where CORINE maps are compared to SIOSE maps and simulation outputs. For their part, the techniques described in Sects. 2 and 3 are applied to the Ariège Valley database. CORINE maps for 2000 and 2018 are used as reference maps in comparisons with simulated land covers.
4

Fink, Maximilian C., Victoria Reitmeier, Matthias Siebeck, Frank Fischer, and Martin R. Fischer. "Live and Video Simulations of Medical History-Taking: Theoretical Background, Design, Development, and Validation of a Learning Environment". In Learning to Diagnose with Simulations, 109–22. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-030-89147-3_9.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
History-taking is an essential diagnostic situation and has long been an important objective of medical education in European countries and beyond. Thus, the research project presented here investigates facilitating diagnostic competences in live and video history-taking simulations. In this chapter, the theoretical background and the design, development, and validation process of the learning environment for this research project are described. In the first section, an overview of history-taking models is provided, the concept of diagnostic competences for history-taking is specified, and a summary of research on simulation-based learning and assessment of history-taking is given. The second section reports on the creation of knowledge tests and the live and video simulations. In the third section, results from a pilot study and an expert workshop are disclosed and findings from a validation study are provided. These findings indicate that the created simulations and knowledge tests measure separate but related aspects of diagnostic competences reliably and validly and may be used for assessment. In the final section, a summary is provided and future questions for research are presented with a focus on the adaptivity of scaffolds and simulation-based learning from atypical cases.
5

Altherr, N., B. Kirsch, and J. C. Aurich. "Investigation of Micro Grinding via Kinematic Simulations". In Proceedings of the 3rd Conference on Physical Modeling for Virtual Manufacturing Systems and Processes, 233–59. Cham: Springer International Publishing, 2023. http://dx.doi.org/10.1007/978-3-031-35779-4_13.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
With increasing demand for micro-structured surfaces in hard and brittle materials, the importance of micro grinding increases. The application of micro pencil grinding tools (MPGTs) in combination with ultra-precision multi-axis machine tools allows an increased freedom of shaping. However, the small dimensions of the grinding tools, below 500 µm, necessitate high rotational speeds and low feed rates to enable the machining process. Besides, the abrasive grits of the tool can be large in comparison to the tool dimensions. All of these factors influence the resulting surface topography of the workpiece, but some of the topography properties are no longer accessible to optical measurements, making process evaluation and improvement difficult. In the present contribution the measurement results are supplemented by the results of a kinematic simulation model. The construction of such a kinematic simulation, which considers real process and tool properties, is described. The results obtained from the simulation are compared to measurements to validate the model and to point out the advantages of the simulations. In a further step, a principle is shown for how the simulation can be used to make the undeformed chip thickness accessible, a process result which cannot be measured within the real machining process.
6

Reinhardt, Oliver, Tom Warnke, Andreas Ruscheinski, and Adelinde M. Uhrmacher. "Valid and Reproducible Simulation Studies—Making It Explicit". In Simulation Foundations, Methods and Applications, 607–27. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-319-70766-2_25.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Wang, Jing, Jinglin Zhou, and Xiaolu Chen. "Simulation Platform for Fault Diagnosis". In Intelligent Control and Learning Systems, 45–58. Singapore: Springer Singapore, 2022. http://dx.doi.org/10.1007/978-981-16-8044-1_4.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
The previous chapters have described the mathematical principles and algorithms of multivariate statistical methods, as well as the monitoring processes when they are used for fault diagnosis. In order to validate the effectiveness of data-driven multivariate statistical analysis methods in the field of fault diagnosis, it is necessary to conduct the corresponding fault monitoring experiments. Therefore this chapter introduces two kinds of simulation platforms, the Tennessee Eastman (TE) process simulation system and the fed-batch penicillin fermentation process simulation system. They are widely used as test platforms for process monitoring, fault classification, and fault identification in industrial processes. The related experiments based on PCA, CCA, PLS, and FDA are carried out on the TE simulation platform.
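
As a reminder of what the simplest of the monitoring schemes named above looks like, the sketch below fits a PCA model on (synthetic) normal-operation data and flags test samples by their Hotelling T² statistic; the TE process data themselves, and the CCA/PLS/FDA variants, are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic "normal operation" training data and a slowly drifting test batch.
X_train = rng.normal(size=(500, 10))
X_test = rng.normal(size=(100, 10)) + np.linspace(0, 2, 100)[:, None] * 0.3

# PCA via SVD on standardized training data.
mu, sigma = X_train.mean(0), X_train.std(0)
Z = (X_train - mu) / sigma
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
n_pc = 4
P = Vt[:n_pc].T                              # loadings
lam = (s[:n_pc] ** 2) / (len(Z) - 1)         # retained variances

def hotelling_t2(X):
    scores = ((X - mu) / sigma) @ P
    return np.sum(scores**2 / lam, axis=1)

# 99% control limit approximated from the training distribution itself.
limit = np.quantile(hotelling_t2(X_train), 0.99)
alarms = hotelling_t2(X_test) > limit
print(f"samples flagged as faulty: {alarms.sum()} / {len(alarms)}")
```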
8

Heng, Kevin. "Reflections by a Theoretical Astrophysicist". In Synthese Library, 297–303. Cham: Springer International Publishing, 2023. http://dx.doi.org/10.1007/978-3-031-26618-8_16.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
A theoretical astrophysicist discusses the principles and rules of thumb underlying the construction of models and simulations from the perspective of an active practitioner, where it is emphasised that they are designed to address specific scientific questions. That models are valid only within a restricted space of parameters, and that degenerate combinations of parameter values produce the same observable outcome, are features, and not bugs, of competent practice that fit naturally within a Bayesian framework of inference. Idealisations within a model or simulation are strongly tied to the questions they are designed to address and the precision at which they are confronted by data. If the practitioner visualises a hierarchy of models of varying sophistication (which is standard practice in astrophysics and climate science), then de-idealisation becomes an irrelevant concept. Opportunities for future collaborations between astrophysicists and philosophers of science are suggested.
9

Kunkel, Julian Martin. "Using Simulation to Validate Performance of MPI(-IO) Implementations". In Lecture Notes in Computer Science, 181–95. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-38750-0_14.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Dickmann, Christoph, Harald Klein, Thomas Birkhölzer, Wolfgang Fietz, Jürgen Vaupel, and Ludger Meyer. "Deriving a Valid Process Simulation from Real World Experiences". In Software Process Dynamics and Agility, 272–82. Berlin, Heidelberg: Springer Berlin Heidelberg, 2007. http://dx.doi.org/10.1007/978-3-540-72426-1_23.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Conference papers on the topic "Simulation validée":

1

Nour, Ahmed, Lulwa Al-Suwailem, Dalal Al-Jutaili, Ken Monteiro, Sarah Al-Safran, Martijn Bogaerts, M. Aiman Fituri, Mischa Oostendorp, and Mohamed Khalil. "Using Ultrasonic Flexural Measurements to Validate Casing Standoff Simulations: A Kuwait Case Study". In SPE/IADC Middle East Drilling Technology Conference and Exhibition. SPE, 2023. http://dx.doi.org/10.2118/214538-ms.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Removing mud from around the casing or liner and replacing it with drilling or cement fluid is fundamental to achieving zonal isolation. One significant parameter needed to achieve flow around the casing is proper casing centralization. Casing centralization is a function of many wellbore properties, such as fluid and centralizer data, which are obtained from the directional survey and caliper data. Computer simulations are used to optimize centralizer selection and placement prior to running the casing into the wellbore and the cementing operations. This paper presents the method and technology used to compare simulated vs. real centralization and the key lessons learned from a Kuwait project. To complete the continuous improvement cycle, it is important to confirm the casing standoff in a post-cement operation to determine whether the prejob assumptions and the simulations were accurate. Using standard cement evaluation logs, it is not possible to directly measure the casing standoff; conclusions therefore have to be made indirectly, based on the cement evaluation data. The new-generation ultrasonic flexural measurement tools can be used to evaluate casing centralization directly by evaluating the time between the first casing reflection (mud to casing interface) and the third reflection (cement to formation interface). For a Kuwait project, a new one-piece slip-on centralizer was introduced for field operations. Prejob standoff simulations were performed to optimize the casing standoff to meet the operator and service company recommendations. All available well and fluid data were included in the simulations to accurately predict the casing standoff. The simulations used a state-of-the-art, stiff-string simulator to provide the most accurate simulation results. To evaluate the standoff simulations and centralizer performance, the third-interface echo (TIE) measurements were used to determine actual standoff. The ultrasonic measurements were run on three different cemented intervals. These intervals ranged from a vertical 16-in openhole interval to a highly deviated 8½-in openhole section. By comparing the actual measurement with the simulation results, a direct standoff evaluation was made possible regarding the centralizer selection and placement and the assumptions made during the well planning phase. It also provides better understanding of the performance of the centralizers. Using the advanced flexural ultrasonic logging tool and the TIE measurement provided the opportunity to compare actual casing standoff results vs. prejob casing centralization simulation. The results demonstrated the importance of having accurate well data available during the design phase and the impact particular assumptions have on the final casing standoff. By comparing the actual casing standoff results vs. prejob casing centralization simulation, important lessons can be learned about centralizer selection, placement, and how standoff simulations can be implemented during field development to improve casing standoff. Thus, the probability of effective mud removal and zonal isolation increases.
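
For context, casing standoff is commonly reported as the ratio of the remaining annular clearance to the clearance a perfectly centred casing would have. A small sketch of that calculation is given below with made-up diameters; it is a generic formula, not the stiff-string simulator or the TIE log processing used in the paper.

```python
def standoff_percent(hole_diameter, casing_od, min_clearance):
    """Standoff = smallest casing-to-wall clearance divided by the clearance a
    perfectly centred casing would have, expressed in percent."""
    concentric_clearance = (hole_diameter - casing_od) / 2.0
    return 100.0 * min_clearance / concentric_clearance

# Example: 8.5 in hole, 7 in casing, casing wall only 0.45 in from the formation.
print(f"standoff = {standoff_percent(8.5, 7.0, 0.45):.0f} %")   # 100 % = centred
```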
2

Montevechi, Jose Arnaldo Barra, Gustavo Teodoro Gabriel, Afonso Teberga Campos, Carlos Henrique dos Santos, Fabiano Leal, and Michael E. F. H. S. Machado. "Using Generative Adversarial Networks to Validate Discrete Event Simulation Models". In 2022 Winter Simulation Conference (WSC). IEEE, 2022. http://dx.doi.org/10.1109/wsc57314.2022.10015375.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Law, Averill. "How to Build Valid and Credible Simulation Models". In 2006 Winter Simulation Conference. IEEE, 2006. http://dx.doi.org/10.1109/wsc.2006.323038.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Beuran, Razvan, Shingo Yasuda, Tomoya Inoue, Shinsuke Miwa, and Yoichi Shinoda. "Using Emulation to Validate Post-disaster Network Recovery Solutions". In Seventh International Conference on Simulation Tools and Techniques. ICST, 2014. http://dx.doi.org/10.4108/icst.simutools.2014.254619.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Law, Averill M. "How to Build Valid and Credible Simulation Models". In 2019 Winter Simulation Conference (WSC). IEEE, 2019. http://dx.doi.org/10.1109/wsc40007.2019.9004789.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Law, Averill M. "How to build valid and credible simulation models". In 2008 Winter Simulation Conference (WSC). IEEE, 2008. http://dx.doi.org/10.1109/wsc.2008.4736054.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Law, Averill M. "How to Build Valid and Credible Simulation Models". In 2022 Winter Simulation Conference (WSC). IEEE, 2022. http://dx.doi.org/10.1109/wsc57314.2022.10015411.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Ribault, Judicaël, Olivier Dalle, Denis Conan, and Sébastien Leriche. "OSIF: A Framework To Instrument, Validate, and Analyze Simulations". In 3rd International ICST Conference on Simulation Tools and Techniques. ICST, 2010. http://dx.doi.org/10.4108/icst.simutools2010.8729.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Rajadurai, Sivanandi, Guru Prasad Mani, Kavin Raja, and Sundaravadivelu Mohan. "Computational Simulation to Validate Resonator through Bending Moment". In SAE 2015 Noise and Vibration Conference and Exhibition. 400 Commonwealth Drive, Warrendale, PA, United States: SAE International, 2015. http://dx.doi.org/10.4271/2015-01-2290.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Law, Averill M. "How to build valid and credible simulation models". In 2009 Winter Simulation Conference - (WSC 2009). IEEE, 2009. http://dx.doi.org/10.1109/wsc.2009.5429312.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Reports by organizations on the topic "Simulation validée":

1

Selvaraju, Ragul, SHABARIRAJ SIDDESWARAN, and Hariharan Sankarasubramanian. The Validation of Auto Rickshaw Model for Frontal Crash Studies Using Video Capture Data. SAE International, September 2020. http://dx.doi.org/10.4271/2020-28-0490.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Although auto rickshaws are among the most important forms of public transportation in Asian countries, and especially in India, safety standards and regulations for them have not been established to the same extent as for the car segment. Crash simulations have evolved as a way to analyze vehicle crashworthiness, since crash experiments are costly. This work intends to provide a validation of an auto rickshaw model by comparing a frontal crash simulation with a random head-on crash video. The MATLAB video processing tool has been used to process the crash video, and the impact velocity of the frontal crash is obtained. The vehicle modelled in CATIA is imported into the LS-DYNA simulation environment to perform a frontal crash simulation at the captured speed. The simulation is compared with the crash video at 5, 25, and 40 milliseconds respectively. The comparison shows that the crash patterns of the simulation and the real crash video are similar in detail. Thus the modelled auto rickshaw can be used in the future to validate real-time crashes and to provide scope for improvement in three-wheeler safety.
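
The velocity-capture step described above reduces to tracking displacement in pixels across frames and converting it with a known scale and frame rate. A Python illustration with invented numbers is shown below; it is not the authors' MATLAB video-processing code, and the frame rate, pixel scale and tracked positions are assumptions.

```python
import numpy as np

fps = 30.0                 # camera frame rate (assumed)
px_per_metre = 85.0        # pixel scale from a known reference length (assumed)

# Tracked front-bumper x-positions (pixels) over consecutive frames before impact.
x_px = np.array([120, 148, 177, 205, 233, 262])

dx_m = np.diff(x_px) / px_per_metre          # per-frame displacement in metres
speed = dx_m.mean() * fps                    # metres per second
print(f"approach speed ~ {speed:.1f} m/s ({speed * 3.6:.0f} km/h)")
```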
2

Hoppel, Mark. Creation of Robotic Snake to Validate Contact Modeling in Simulation. Fort Belvoir, VA: Defense Technical Information Center, dicembre 2013. http://dx.doi.org/10.21236/ada594656.

Testo completo
Gli stili APA, Harvard, Vancouver, ISO e altri
3

Constantin, Eugen, Nikolaos Papapanagiotou e Sanjeev Singh. Analysis of DDD and VDT Simulation Techniques to Determine Feasibility of Using VDT Simulation to Validate DDD Models. Fort Belvoir, VA: Defense Technical Information Center, giugno 2004. http://dx.doi.org/10.21236/ada424673.

Testo completo
Gli stili APA, Harvard, Vancouver, ISO e altri
4

Li, Honghai, Grace Maze, Kevin Conner e John Hazelton. Sediment transport modeling at Stono Inlet and adjacent beaches, South Carolina. Engineer Research and Development Center (U.S.), dicembre 2021. http://dx.doi.org/10.21079/11681/42501.

Testo completo
Gli stili APA, Harvard, Vancouver, ISO e altri
Abstract (sommario):
This report documents a numerical modeling investigation of material dredged from nearshore borrow areas and placed on Folly Beach, adjacent to Stono Inlet, South Carolina. Historical and newly collected wave and hydrodynamic data around the inlet were assembled and analyzed. The datasets were used to calibrate and validate a coastal wave, hydrodynamic, and sediment transport model, the Coastal Modeling System. Sediment transport and morphology changes within and around the immediate vicinity of the Stono Inlet estuarine system, including the sand borrow areas and the nearshore Folly Beach area, were evaluated. Model simulations show that sand removal in the borrow areas increases material backfilling, and that this effect is more significant in the nearshore borrow areas than in the offshore ones. In the nearshore Folly Beach area, the dominant flow and sediment transport directions are from the northeast to the southwest. Net sediment gain occurs in the central and southwestern sections of Folly Island, while net sediment loss occurs in the northeastern section. A storm simulation and a 1-year simulation developed for the study produce a similar pattern of morphology change, erosion, and deposition around the borrow areas and the nearshore Folly Beach area.
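To illustrate the kind of post-processing behind statements such as "net sediment gain" and "net sediment loss", the hedged Python sketch below differences two bed-elevation grids and integrates the change over the cell area. The grids and cell size are synthetic placeholders, not Coastal Modeling System output.

```python
# Sketch: quantify net erosion/deposition from a before/after pair of bed-elevation grids.
# Grids and cell size are synthetic placeholders for illustration only.

import numpy as np

CELL_AREA_M2 = 25.0  # assumed 5 m x 5 m grid cells

rng = np.random.default_rng(0)
bed_before = rng.normal(loc=-4.0, scale=0.5, size=(50, 50))               # elevations (m)
bed_after = bed_before + rng.normal(loc=0.0, scale=0.05, size=(50, 50))   # after "simulation"

dz = bed_after - bed_before                     # positive = deposition, negative = erosion
deposition = dz[dz > 0].sum() * CELL_AREA_M2    # m^3 gained
erosion = -dz[dz < 0].sum() * CELL_AREA_M2      # m^3 lost
print(f"Deposition: {deposition:.0f} m^3, erosion: {erosion:.0f} m^3, "
      f"net change: {deposition - erosion:.0f} m^3")
```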
5

Aursjø, Olav, Aksel Hiorth, Alexey Khrulenko e Oddbjørn Mathias Nødland. Polymer flooding: Simulation Upscaling Workflow. University of Stavanger, novembre 2021. http://dx.doi.org/10.31265/usps.203.

Testo completo
Gli stili APA, Harvard, Vancouver, ISO e altri
Abstract (sommario):
There are many issues to consider when implementing polymer flooding offshore. On the practical side, one must handle large volumes of polymer in a cost-efficient manner, and it is crucial that the injected polymer solutions maintain their desired rheological properties during transit from surface facilities into the reservoir. On the other hand, to predict polymer flow in the reservoir, one must conduct simulations to find out which of the mechanisms observed at the pore and core scales are important for field behavior. This report focuses on theoretical aspects relevant for upscaling of polymer flooding. To this end, several numerical tools have been developed. In principle, the range of length scales covered by these tools is extremely wide: from the nm (10⁻⁹ m) to the mm (10⁻³ m) range, all the way up to the m and km range. However, practical limitations require the use of other tools as well, as described in the following paragraphs. The simulator BADChIMP is a pore-scale computational fluid dynamics (CFD) solver based on the Lattice Boltzmann method. At the pore scale, fluid flow is described by classical laws of nature; to a large extent, pore-scale simulations can therefore be viewed as numerical experiments, and they have great potential to foster understanding of the detailed physics of polymer flooding. While valid across length scales, pore-scale models require a high numerical resolution and, consequently, large computational resources. To model laboratory experiments, the NIORC has, through project 1.1.1 DOUCS, developed IORCoreSim. This simulator includes a comprehensive model for polymer rheological behavior (Lohne, Stavland, Åsen, Aursjø, & Hiorth, 2021). The model is valid at all continuum scales; however, the simulator implementation cannot handle very large field cases, only smaller sector-scale systems. To capture polymer behavior at the full field scale, simulators designed for that specific purpose must be used. One practical problem is therefore: how can we utilize the state-of-the-art polymer model, found only in IORCoreSim, as a tool to decrease the uncertainty in full-field forecasts? To address this question, we suggest several strategies for how to combine different numerical tools. In the Methodological Approach section, we briefly discuss the more general issue of linking different scales and simulators. In the Validation section, we present two case studies demonstrating the proposed strategies and workflows.
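As a purely illustrative companion to the discussion of polymer rheology, the sketch below evaluates a generic Carreau-type shear-thinning viscosity curve of the kind often used for polymer solutions. This is not the rheology model implemented in IORCoreSim; the functional form is a standard textbook relation and all parameter values are invented.

```python
# Illustrative only: generic Carreau shear-thinning viscosity curve.
# NOT the IORCoreSim rheology model; parameters are invented placeholders.

import numpy as np

def carreau_viscosity(shear_rate, mu0=0.05, mu_inf=0.001, lam=1.0, n=0.5):
    """Apparent viscosity (Pa.s) as a function of shear rate (1/s)."""
    return mu_inf + (mu0 - mu_inf) * (1.0 + (lam * shear_rate) ** 2) ** ((n - 1.0) / 2.0)

for g in np.logspace(-2, 3, 6):
    print(f"shear rate {g:8.2f} 1/s -> viscosity {carreau_viscosity(g) * 1000:6.2f} mPa.s")
```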
6

Watts, Benjamin, e Danielle Kennedy. Additive regulated concrete for thermally extreme conditions. Engineer Research and Development Center (U.S.), maggio 2024. http://dx.doi.org/10.21079/11681/48510.

Testo completo
Gli stili APA, Harvard, Vancouver, ISO e altri
Abstract (sommario):
This study details a multiprong effort to validate the Cold Regions Research and Engineering Laboratory's solution for concrete construction and repair in cold weather, Additive Regulated Concrete for Thermally Extreme Conditions (ARCTEC). ARCTEC is the product of several years of research and consists of a testing and simulation workflow that generates scenario-sensitive guidance for the use of accelerating admixtures in concrete. This report details efforts to validate ARCTEC using real-world, full-scale field demonstrations. These demonstrations were used to collect data on the behavior of concrete obtained through conventional supply chains, assess the accuracy of the simulation component of the workflow, and test the efficacy of ARCTEC guidance in achieving frost protection. Results indicate that ARCTEC is at a high level of maturity and provides additive dosage guidance that ensures frost protection and strength development in concrete placed where overnight lows fall as low as 0°F. The effort and cost required to implement ARCTEC as a cold-weather protection strategy are minimal and significantly less burdensome than conventional methods. Any cold-region installation with winter construction or repair needs and access to conventional concrete supply chains could field ARCTEC and reduce the cost and schedule constraints associated with winter construction.
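For readers unfamiliar with how a temperature history is related to strength development in cold-weather concreting, the sketch below computes the Nurse-Saul maturity index (ASTM C1074), a standard relationship used here for illustration only; it is not the ARCTEC workflow itself, and the temperature history is invented.

```python
# Illustrative only: Nurse-Saul maturity index (ASTM C1074), the classical way to
# map a concrete temperature history to expected strength development.
# Not the ARCTEC workflow; the temperature history below is invented.

DATUM_TEMP_C = -10.0  # commonly used datum temperature

def nurse_saul_maturity(temps_c, interval_hours):
    """Maturity index in degC-hours for a uniformly sampled temperature history."""
    return sum(max(t - DATUM_TEMP_C, 0.0) * interval_hours for t in temps_c)

# Hypothetical hourly in-place concrete temperatures during a cold night (degC)
temps = [15, 12, 10, 8, 6, 5, 4, 3, 3, 2, 2, 1]
print(f"Maturity after {len(temps)} h: {nurse_saul_maturity(temps, 1.0):.0f} degC-h")
```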
7

Baete, Christophe. PR-405-173610-R01 Develop New Criteria for DC Stray Current Interference. Chantilly, Virginia: Pipeline Research Council International, Inc. (PRCI), giugno 2019. http://dx.doi.org/10.55274/r0011602.

Testo completo
Gli stili APA, Harvard, Vancouver, ISO e altri
Abstract (sommario):
This report describes the activities performed within the PRCI project on refining the dynamic DC stray current corrosion criteria by applying an advanced DC corrosion prediction model. To identify realistic stray current interference conditions, an industry survey was performed to retrieve dynamic DC interference signals from real-world pipelines. After analysis of the cases, a simulation matrix was proposed that covers a wide variety of interference conditions; the simulated signals were simplified as square pulses. The European standard EN 50162, Protection against corrosion by stray current from direct current systems, was used as a reference for validation, and some other criteria currently under investigation were considered as well. The criteria were validated against simulated corrosion rates, the final goal being a further refinement of the dynamic DC stray current criteria. The simulations demonstrate that the current criteria are either not valid or too conservative when steel tends to passivate under anodic excursions in high-pH soil due to the development of an Fe3O4 film. The lowest pH value at which the passive film developed was 10.34, with a relatively short cathodic duration (30 s) and a long (50 s), strong (-200 mV vs. CSE) anodic potential. A related webinar is available.
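The abstract's simplification of interference signals as square pulses can be pictured with the short sketch below, which generates a repeating pipe-to-soil potential pulse train. The 30 s cathodic and 50 s anodic durations and the -200 mV vs. CSE anodic level follow the example quoted above; the cathodic level of -1100 mV vs. CSE is an assumed placeholder.

```python
# Sketch: simplified square-pulse pipe-to-soil potential signal alternating between a
# cathodic protection level and an anodic excursion. The cathodic level is assumed.

def square_pulse_potential(t_seconds,
                           cathodic_mv=-1100.0, cathodic_s=30.0,
                           anodic_mv=-200.0, anodic_s=50.0):
    """Pipe-to-soil potential (mV vs CSE) at time t for a repeating pulse train."""
    period = cathodic_s + anodic_s
    phase = t_seconds % period
    return cathodic_mv if phase < cathodic_s else anodic_mv

# Sample the waveform every 10 s over two full periods
print([square_pulse_potential(t) for t in range(0, 160, 10)])
```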
8

Siebke, Christian, Maximilian Bäumler, Madlen Ringhand, Marcus Mai, Mohamed Nadar Ramadan e Günther Prokop. Report on layout of the traffic simulation and trial design of the evaluation. Technische Universität Dresden, 2021. http://dx.doi.org/10.26128/2021.244.

Testo completo
Gli stili APA, Harvard, Vancouver, ISO e altri
Abstract (sommario):
Within the AutoDrive project, openPASS is used to develop a cognitive, stochastic traffic flow simulation for the urban intersection and highway scenarios described in deliverable D1.14. Deliverable D2.16 covers the customizations of the openPASS framework that are required to provide a basis for the development and implementation of the driver behavior model and the evaluated safety function. The trial design for the evaluation of the safety functions is described. Furthermore, the design of the driver behavior study used to parameterize and validate the underlying driver behavior model is introduced.
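As an editorial illustration of what "stochastic" means in a traffic flow simulation of this kind, the sketch below samples per-driver behavioral parameters from probability distributions, as a driver behavior study would provide. The distributions and parameter names are invented placeholders and are not the openPASS/AutoDrive driver model.

```python
# Illustrative only: draw per-driver behavioral parameters from assumed distributions,
# the basic ingredient of a stochastic traffic flow simulation. Not the openPASS model.

import random

random.seed(42)

def sample_driver():
    return {
        "desired_speed_kmh": random.gauss(52.0, 5.0),        # assumed urban desired speed
        "reaction_time_s": random.lognormvariate(0.0, 0.3),  # assumed reaction time spread
        "time_gap_s": max(0.8, random.gauss(1.5, 0.4)),      # assumed car-following gap
    }

for driver in (sample_driver() for _ in range(3)):
    print({k: round(v, 2) for k, v in driver.items()})
```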
9

Chandler, Joseph F., Dain S. Horning, Richard D. Arnold, Jeffrey B. Phillips, Dean S. Horak e D. L. Taylor. The Use of Commercial Flight Simulation Software as a Psychometrically Sound, Ecologically Valid Measure of Fatigued Performance. Fort Belvoir, VA: Defense Technical Information Center, agosto 2011. http://dx.doi.org/10.21236/ada548034.

Testo completo
Gli stili APA, Harvard, Vancouver, ISO e altri
