Academic literature on the topic "Probabilities – Computer simulations"

Create a precise citation in APA, MLA, Chicago, Harvard, and other styles

Choose a source type:

Consult the topical lists of articles, books, theses, conference proceedings, and other academic sources on the topic "Probabilities – Computer simulations".

Next to each source in the reference list there is an "Add to bibliography" button. Click it, and we will automatically generate the bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Vancouver, Chicago, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Journal articles on the topic "Probabilities – Computer simulations"

1

Dobrescu, Gianina, M. Rusu, and M. Vass. "Computer Simulations of Fractal Surfaces: Application in Adsorption". Fractals 01, no. 03 (September 1993): 430–38. http://dx.doi.org/10.1142/s0218348x93000459.

Full text
Abstract
A computer program was developed to simulate adsorption on fractal surfaces. The fractal surfaces are generated as Takagi surfaces, and the program is based on a DLA algorithm. Adsorption was simulated under different conditions: (1) equivalent active sites (homogeneous surfaces); (2) active sites with different adsorption probabilities, where the probability associated with each active site is computed from a van der Waals potential. The simulation allows us to explore the actual structure of the gas-solid interface and to study its sensitivity to energetic disorder. Curves of the fractal dimension of the gas-solid interface versus adsorption coverage are computed.
APA, Harvard, Vancouver, ISO, and other styles
2

Smith, Peter J. "Underestimation of Rare Event Probabilities in Importance Sampling Simulations". SIMULATION 76, no. 3 (March 2001): 140–50. http://dx.doi.org/10.1177/003754970107600301.

Full text
APA, Harvard, Vancouver, ISO, and other styles
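To make the technique concrete, here is a minimal sketch, not taken from the article, of importance sampling for a Gaussian tail probability; the mean-shifted proposal and all parameter values are illustrative assumptions.

```python
# A minimal sketch of importance sampling for a rare-event probability:
# p = P(Z > a) for a standard normal Z. Sampling from a proposal shifted
# into the rare region and reweighting by the likelihood ratio gives an
# estimate that plain Monte Carlo cannot match at this sample size.
import numpy as np

rng = np.random.default_rng(0)
a, n = 5.0, 100_000

# Plain Monte Carlo: almost every sample misses the event {Z > a}.
z = rng.standard_normal(n)
p_mc = np.mean(z > a)

# Importance sampling: draw from N(a, 1), weight by phi(x) / phi(x - a).
x = rng.normal(loc=a, scale=1.0, size=n)
weights = np.exp(0.5 * a**2 - a * x)      # likelihood ratio N(0,1)/N(a,1)
p_is = np.mean((x > a) * weights)

print(f"plain MC: {p_mc:.3e}")            # typically 0 for n = 1e5
print(f"IS:       {p_is:.3e}")            # close to the true 2.87e-7
```

Smith's underestimation theme shows up here if the proposal is shifted too little: the rare large weights are then missed in finite samples, and the estimate tends to come out low.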
3

Peng, Xidan, and Xiangyang Li. "Performance Analysis for Analog Network Coding with Imperfect CSI in FDD Two Way Channels". Journal of Systems Science and Information 3, no. 4 (August 25, 2015): 357–64. http://dx.doi.org/10.1515/jssi-2015-0357.

Full text
Abstract
A time-division duplex (TDD) two-way channel exploits reciprocity to estimate the forward channel gain from the reverse link, and many previous works explore outage probabilities in TDD systems based on this reciprocity property. A frequency-division duplex (FDD) system, however, has no reciprocity property. In this letter, we investigate the impact of CSI estimation error on the performance of non-orthogonal and orthogonal analog network coding (ANC) protocols in an FDD two-way system, where channel gains are independent of each other. Considering imperfect CSI, closed-form expressions for the outage probabilities of the two protocols are derived in the high signal-to-noise ratio (SNR) regime. The derived outage probabilities match the results of Monte Carlo simulations in different communication scenarios. Interestingly, computer simulation shows that ANC in the FDD two-way channel outperforms ANC in the TDD channel.
APA, Harvard, Vancouver, ISO, and other styles
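As a generic illustration of validating a closed-form outage probability against Monte Carlo draws, as the paper does, here is a small sketch under assumed Rayleigh fading; the threshold and average SNR are made-up parameters.

```python
# Under Rayleigh fading the received SNR is exponential with mean snr_bar,
# so P(outage) = P(SNR < snr_th) = 1 - exp(-snr_th / snr_bar). The Monte
# Carlo estimate from simulated channel draws should match the closed form.
import numpy as np

rng = np.random.default_rng(1)
snr_bar, snr_th, n = 10.0, 1.0, 1_000_000

h = (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)
snr = snr_bar * np.abs(h) ** 2             # exponential with mean snr_bar
p_sim = np.mean(snr < snr_th)
p_exact = 1.0 - np.exp(-snr_th / snr_bar)

print(f"simulated: {p_sim:.4f}, closed form: {p_exact:.4f}")
```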
4

Schurz, Gerhard, and Paul D. Thorn. "Reward versus Risk in Uncertain Inference: Theorems and Simulations". Review of Symbolic Logic 5, no. 4 (July 4, 2012): 574–612. http://dx.doi.org/10.1017/s1755020312000184.

Full text
Abstract
Systems of logico-probabilistic (LP) reasoning characterize inference from conditional assertions that express high conditional probabilities. In this paper we investigate four prominent LP systems, the systems O, P, Z, and QC. These systems differ in the number of inferences they license (O ⊂ P ⊂ Z ⊂ QC). LP systems that license more inferences enjoy the possible reward of deriving more true and informative conclusions, but with this possible reward comes the risk of drawing more false or uninformative conclusions. In the first part of the paper, we present the four systems and extend each of them by theorems that allow one to compute almost-tight lower probability bounds for the conclusion of an inference, given lower probability bounds for its premises. In the second part of the paper, we investigate by means of computer simulations which of the four systems provides the best balance of reward versus risk. Our results suggest that system Z offers the best balance.
APA, Harvard, Vancouver, ISO, and other styles
5

Shchur, Lev N., and Sergey S. Kosyakov. "Probability of Incipient Spanning Clusters in Critical Square Bond Percolation". International Journal of Modern Physics C 08, no. 03 (June 1997): 473–81. http://dx.doi.org/10.1142/s0129183197000394.

Full text
Abstract
The probability of the simultaneous occurrence of at least k spanning clusters has been studied by Monte Carlo simulations on the 2D square lattice with free boundaries at the bond percolation threshold p_c = 1/2. It is found that the probabilities of k or more Incipient Spanning Clusters (ISC) take the values P(k > 1) ≈ 0.00658(3) and P(k > 2) ≈ 0.00000148(21), provided that the limit of these probabilities for infinite lattices exists. The probability P(k > 3) of more than three ISC can be estimated to be of the order of 10^-11, which is beyond the reach of present-day computers, so the Aizenman law for the probabilities with k ≫ 1 cannot be checked in simulations. We detected a single sample with four ISC in a total of about 10^10 samples investigated; the probability of this single event is 1/10 for that number of samples. The influence of boundary conditions is discussed in the last section.
APA, Harvard, Vancouver, ISO, and other styles
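The following toy sketch, assuming a small lattice and sample budget of our own choosing, shows the Monte Carlo machinery behind such studies: estimating the probability of at least one left-right spanning cluster in square bond percolation at p = 1/2 with union-find. The tiny multi-cluster probabilities reported in the paper would need vastly larger samples.

```python
# Estimate the left-right crossing probability for bond percolation on an
# L x L square lattice at p = 1/2, using union-find with two virtual sites
# for the left and right borders. At criticality this is close to 0.5.
import numpy as np

def find(parent, i):
    while parent[i] != i:
        parent[i] = parent[parent[i]]      # path halving
        i = parent[i]
    return i

def union(parent, i, j):
    ri, rj = find(parent, i), find(parent, j)
    if ri != rj:
        parent[ri] = rj

def spans(L, p, rng):
    left, right = L * L, L * L + 1         # virtual border sites
    parent = list(range(L * L + 2))
    for y in range(L):
        union(parent, y * L, left)         # first column touches left
        union(parent, y * L + L - 1, right)
        for x in range(L):
            i = y * L + x
            if x + 1 < L and rng.random() < p:   # horizontal bond
                union(parent, i, i + 1)
            if y + 1 < L and rng.random() < p:   # vertical bond
                union(parent, i, i + L)
    return find(parent, left) == find(parent, right)

rng = np.random.default_rng(2)
L, p, n = 32, 0.5, 2000
print(sum(spans(L, p, rng) for _ in range(n)) / n)   # ~0.5 at criticality
```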
6

Paschalidis, I. C., and S. Vassilaras. "Importance Sampling for the Estimation of Buffer Overflow Probabilities via Trace-Driven Simulations". IEEE/ACM Transactions on Networking 12, no. 5 (October 2004): 907–19. http://dx.doi.org/10.1109/tnet.2004.836139.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Ojha, Durga Prasad. "Nematogenic Behaviour of a Cyano-Compound Using Quantum Mechanics and Computer Simulations". Zeitschrift für Naturforschung A 56, no. 3-4 (April 1, 2001): 319–25. http://dx.doi.org/10.1515/zna-2001-0315.

Full text
Abstract
Using quantum mechanics and intermolecular forces, the molecular ordering of a nematogenic cyano-compound, 5-(trans-4-ethylcyclohexyl)-2-(4-cyanophenyl)-pyrimidine (ECCPP), has been examined. The CNDO/2 method has been employed to evaluate the net atomic charge and the dipole moment components at each atomic centre of the molecule. The configuration energy has been computed using the modified Rayleigh-Schrödinger perturbation method at intervals of 1 Å in translation and 10° in rotation, and the corresponding probabilities have been calculated using Maxwell-Boltzmann statistics. The flexibility of various configurations has been studied in terms of the variation of the probability due to small departures from the most probable configuration. All possible geometrical arrangements between a molecular pair have been considered during stacking, in-plane and terminal interactions, and the most favourable pairing configuration has been obtained. An attempt has been made to understand the behaviour of the molecules in terms of their relative order. The results have been compared with those obtained for other nematogens such as DPAB [4,4'-di-n-propoxy-azoxybenzene] and EMBAC [ethyl 4-(4'-methoxybenzylidene amino) cinnamate].
APA, Harvard, Vancouver, ISO, and other styles
8

Chiou, Rong Nan, and Chia-Nian Shyi. "Adaptive Maximums of Random Variables for Network Simulations". Journal of Computer Systems, Networks, and Communications 2009 (2009): 1–6. http://dx.doi.org/10.1155/2009/383720.

Full text
Abstract
In order to enhance the precision of network simulations, the paper proposes an approach that adaptively decides the maximum of the random variables creating the discrete probabilities used to generate nodal traffic on simulated networks. A statistical model is first suggested to manifest the bound of the statistical errors. Then, according to the minimum probability that generates nodal traffic, a formula is proposed to decide the maximum. In the formula, a precision parameter expresses the degree of simulation accuracy. Meanwhile, the maximum varies adaptively with the traffic distribution among nodes, because the decision depends on the minimum probability generating nodal traffic. To verify the effect of the adaptive maximum on simulation precision, an optical network is introduced. After simulating the optical network, the theoretical average waiting time of its nodes is used to validate the exactness of the simulation. The proposed formula for deciding the adaptive maximum can be exploited generally in simulations of various networks. Based on the precision parameter K, a recursive procedure will be developed in the future to produce the adaptive maximum for network simulations automatically.
APA, Harvard, Vancouver, ISO, and other styles
9

Zhang, Xulong, and Xiaoxia Song. "Stability Analysis of a Dynamical Model for Malware Propagation with Generic Nonlinear Countermeasure and Infection Probabilities". Security and Communication Networks 2020 (September 22, 2020): 1–7. http://dx.doi.org/10.1155/2020/8859883.

Full text
Abstract
The dissemination of countermeasures is widely recognized as one of the most effective strategies for inhibiting malware propagation, and the study of generic countermeasure and infection probabilities has important practical significance. To this end, a dynamical model incorporating generic nonlinear countermeasure and infection probabilities is proposed. Theoretical analysis shows that the model has a unique equilibrium that is globally asymptotically stable. Accordingly, a real network based on the model assumptions is constructed, and some numerical simulations are conducted on it. The simulations not only illustrate the theoretical results but also demonstrate the reasonability of the generic countermeasure and infection probabilities.
APA, Harvard, Vancouver, ISO, and other styles
10

Lerche, Ian, and Brett S. Mudford. "How Many Monte Carlo Simulations Does One Need to Do?" Energy Exploration & Exploitation 23, no. 6 (December 2005): 405–27. http://dx.doi.org/10.1260/014459805776986876.

Full text
Abstract
This article derives an estimation procedure to evaluate how many Monte Carlo realizations need to be done in order to achieve prescribed accuracies in the estimated mean value, and also in the cumulative probabilities of achieving values greater than, or less than, a particular value as that value is allowed to vary. In addition, by inverting the argument and asking what accuracies result from a prescribed number of Monte Carlo realizations, one can assess the computer time that would be involved should one choose to carry them out. These two complementary procedures are of great benefit in controlling the worth of undertaking an unknown number of Monte Carlo realizations, and of continuing until the results reach an acceptable level of accuracy. Such a procedure is not only computer intensive but also very open-ended, a less than desirable trait when running a complex computer program that might take many hours or days for even a single run. The procedure presented here allows one to assess, ahead of performing a large number of Monte Carlo realizations, roughly how many are actually needed. Several illustrative numerical examples indicate how one uses this procedure in practical situations.
APA, Harvard, Vancouver, ISO, and other styles
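A hedged sketch of the basic idea, with an assumed toy model standing in for an expensive simulation: size the full Monte Carlo experiment from a pilot run using the central limit theorem. The article's procedure also covers cumulative probabilities; this sketch covers only the mean.

```python
# By the CLT, estimating a mean to within +/- eps at confidence level z
# (1.96 for 95%) needs roughly n = (z * sigma / eps)^2 realizations, with
# sigma estimated from a cheap pilot run.
import numpy as np

rng = np.random.default_rng(3)

def model(u):                 # stand-in for an expensive simulation
    return np.exp(u)          # u ~ N(0, 1): lognormal output

pilot = model(rng.standard_normal(200))
sigma = pilot.std(ddof=1)
z, eps = 1.96, 0.05
n_needed = int(np.ceil((z * sigma / eps) ** 2))
print(f"pilot sigma = {sigma:.3f} -> about {n_needed} realizations "
      f"for a +/-{eps} CI on the mean")
```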

Theses on the topic "Probabilities – Computer simulations"

1

Peng, Linghua. "Normalizing Constant Estimation for Discrete Distribution Simulation". 1998. Digital version accessible at http://wwwlib.umi.com/cr/utexas/main.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Heimbigner, Stephen. "Implications in Using Monte Carlo Simulation in Predicting Cardiovascular Risk Factors among Overweight Children and Adolescents: A Stochastic Computer Model Based on Probabilities from the Bogalusa Heart Study". Georgia State University, 2007. http://etd.gsu.edu/theses/available/etd-07252007-234503/.

Full text
Abstract
Thesis (M.P.H.)--Georgia State University, 2007.
Title from file title page. Russ Toal, committee chair; Michael Eriksen, Valerie Hepburn, committee members. Electronic text (102 p. : ill. (some col.)) : digital, PDF file. Description based on contents viewed Mar. 26, 2008. Includes bibliographical references (p. 71-73).
APA, Harvard, Vancouver, ISO, and other styles
3

Coelho, Renato Schattan Pereira. "Simulação de multidões e planejamento probabilístico para otimização dos tempos de semáforos". Universidade Estadual de Campinas, 2012. http://repositorio.unicamp.br/jspui/handle/REPOSIP/275643.

Full text
Abstract
Advisors: Siome Klein Goldenstein, Jacques Wainer
Master's dissertation, Universidade Estadual de Campinas, Instituto de Computação
Traffic is an ever-increasing problem, draining resources and aggravating pollution. In São Paulo, for instance, financial losses caused by traffic amount to about R$33 billion a year. In this work we developed a system that brings together the areas of crowd simulation and probabilistic planning to optimize fixed-time traffic lights. Although both areas offer good algorithms, their use is limited by their reliance on specialists, whether to describe the probabilistic planning problem or to analyze the data produced by the simulations. Our approach helps minimize this dependence by using cellular automata simulations to generate the data that describe the probabilistic planning problem. This allows us to: (i) reduce the amount of data collection, since the data are now generated by the simulator, and (ii) produce good policies for fixed-time traffic light control without the intervention of specialists to analyze the data. In the two tests performed, the solution proposed by the system reduced average travel times by 18.51% and 13.51%, respectively.
Master's degree in Computer Science
APA, Harvard, Vancouver, ISO, and other styles
4

Deng, Yuxin. "Axiomatisations et types pour des processus probabilistes et mobiles". PhD thesis, École Nationale Supérieure des Mines de Paris, 2005. http://tel.archives-ouvertes.fr/tel-00155225.

Full text
Abstract
This thesis focuses on theoretical foundations useful for the analysis of algorithms and protocols for modern distributed systems. Two important features of models for these systems are probabilities and typed mobility: probabilities can be used to quantify uncertain or unpredictable behaviour, and types can be used to guarantee safe behaviour in mobile systems. In this thesis we develop algebraic and type-based techniques for the behavioural study of probabilistic and mobile processes.

In the first part of the thesis we study the algebraic theory of a process calculus that combines nondeterministic and probabilistic behaviour in the style of the probabilistic automata proposed by Segala and Lynch. We consider various strong and weak behavioural equivalences, and we provide complete axiomatisations for finite-state processes, restricted to guarded recursion in the case of the weak equivalences.

In the second part of the thesis we study the algebraic theory of the pi-calculus in the presence of capability types, which are very useful in mobile process calculi. Capability types distinguish the capability to read on a channel, the capability to write on a channel, and the capability to do both, and they introduce a natural and powerful subtyping relation. We consider two variants of typed bisimilarity, in their late and early versions. For both variants, we give complete axiomatisations for closed terms. For one of the two variants, we provide a complete axiomatisation for all finite terms.

In the last part of the thesis we develop type-based techniques for verifying the termination property of certain mobile processes. We provide four type systems that guarantee this property, obtained by successive refinements of the types of the simply typed pi-calculus. The termination proofs use techniques employed in rewriting systems. These type systems can be used to reason about the termination behaviour of some non-trivial examples: encodings of the primitive recursive functions, a protocol for encoding separate choice in terms of parallel composition, and a symbol table implemented as a dynamic chain of cells.

These results lay the groundwork for a future study of more advanced models that may combine probabilities with types. They also highlight the robustness of algebraic and type-based techniques for behavioural reasoning.
APA, Harvard, Vancouver, ISO, and other styles
5

Reuillon, Romain. "Simulations stochastiques en environnements distribués : application aux grilles de calcul". PhD thesis, Université Blaise Pascal - Clermont-Ferrand II, 2008. http://tel.archives-ouvertes.fr/tel-00731242.

Full text
Abstract
Unlike deterministic models, the execution of a stochastic model is driven by the realisation of random variables. Using randomness makes it possible to approximate results that are most often incomputable deterministically. In return, the parameters of the distributions associated with the random quantities output by the stochastic model must be estimated. This computation requires running many independent replications of the same experiment and, therefore, a large amount of computing. By design, all stochastic simulations have a naturally parallel aspect. They are thus one of the flagship applications for distributed computing environments that share computing power on a worldwide scale, known as computing grids. Although 50% of the cycles of the largest supercomputers on the planet are consumed by stochastic computations, techniques for the parallel generation of pseudorandom numbers remain poorly known, so there is a very real risk of producing and publishing erroneous stochastic simulation results. This thesis presents the state of the art of methods for distributing replications of stochastic simulations and contributes to their development. It proposes novel methods to ensure traceability in the complex process of distributing stochastic simulations. Finally, it presents applications in the fields of nuclear medical imaging and environmental simulations totalling more than 70 years of computation on a sequential computer.
APA, Harvard, Vancouver, ISO, and other styles
6

Geffroy, Arthur. "Contribution à l'étude locale et globale de l'enveloppe convexe d'un échantillon aléatoire". Rouen, 1997. http://www.theses.fr/1997ROUES017.

Full text
Abstract
We define the concept of a support range ("plage d'appui") of a random sample, support ranges being certain parts of the convex hull that may coincide with the complete hull. General formulas are established for the expected number of vertices or edges of an arbitrary support range, as well as for the expectation of its length. These formulas are then computed for a normal law in the plane, a uniform law on a convex polygon, and a uniform law on a smooth convex curve. Computer simulations finally make it possible to study the rates of convergence towards the asymptotic results found previously (at the beginning of this thesis as well as in other works), and to obtain finer estimates for the mean, the variance, and the law of the number of vertices of the convex hull.
APA, Harvard, Vancouver, ISO, and other styles
7

Baudrit, Cédric. "Représentation et propagation de connaissances imprécises et incertaines : application à l'évaluation des risques liés aux sites et aux sols pollués". PhD thesis, Université Paul Sabatier - Toulouse III, 2005. http://tel.archives-ouvertes.fr/tel-00011933.

Full text
Abstract
Currently, decisions about the management of potentially polluted sites rely, in particular, on a risk assessment for humans and the environment. This assessment is carried out with models that simulate the transfer of pollutants from a pollution source to a vulnerable target, for different exposure scenarios. The selection of parameter values for these models relies as far as possible on the data collected during field investigations (the site diagnosis phase). However, for reasons of time and cost, the information collected during this diagnosis phase is always incomplete, and is therefore tainted with uncertainty. Likewise, the transfer and exposure models themselves carry uncertainties that must be integrated into the procedures. This overall notion of uncertainty must be taken into account in the risk assessment so that the results are useful in the decision phase.

Parameter uncertainty can have two origins. The first stems from the random character of the information, due to natural variability resulting from stochastic phenomena; one then speaks of variability or stochastic uncertainty. The second is linked to the imprecise character of the information due to a lack of knowledge, resulting for example from systematic measurement errors or expert opinions; one then speaks of epistemic uncertainty. In risk computations these two notions are often conflated, although they should be treated differently.

Uncertainty in risk assessment has mostly been handled in a purely probabilistic framework. This amounts to assuming that knowledge about model parameters is always of a random nature (variability). The approach consists of representing uncertain parameters by single probability distributions and propagating the uncertainty in these parameters to the risk incurred by the target, generally by applying the so-called Monte Carlo technique. While this approach is well known, the whole difficulty lies in defining probability distributions for the parameters that are coherent with the available knowledge. Indeed, in the context of assessing risks related to pollutant exposure, the information available on some parameters is often imprecise in nature, and fitting a single probability distribution to this type of knowledge becomes subjective and partly arbitrary.

The information actually available is often richer than an interval but less rich than a probability distribution. In practice, information of a random nature is treated rigorously by classical probability distributions, whereas information of an imprecise nature is treated rigorously by families of probability distributions defined by pairs of upper and lower cumulative probabilities or, using more recent theories, by possibility distributions (also called fuzzy intervals) or by random intervals using Dempster-Shafer belief functions.

One of the first objectives of this work is to promote coherence between the way knowledge about risk-model parameters is represented and the knowledge actually available. The second objective is to propose different methods for propagating random and imprecise information through risk models while trying to account for dependencies between parameters. Finally, these alternative methods were tested on synthetic cases and then on simplified real cases, in particular to propose ways of presenting the results for a decision phase:
- Dose calculation: transfer of a radioactive pollutant (strontium) from deposition to humans through the consumption of a foodstuff (cow's milk).
- Toxic risk after an accidental spill of trichloroethylene (TCE) above a water table (semi-analytical model).
- Health risk linked to soils polluted by lead fallout.
APA, Harvard, Vancouver, ISO, and other styles
8

Touya, Thierry. "Méthodes d'optimisation pour l'espace et l'environnement". PhD thesis, Université Paul Sabatier - Toulouse III, 2008. http://tel.archives-ouvertes.fr/tel-00366141.

Full text
Abstract
This work consists of two parts arising from different industrial applications.

The first deals with an active phased-array space antenna. The feeding laws must first be computed to satisfy the radiation constraints. Using the principle of conservation of energy, we transform a problem with many local minima into a convex optimisation problem whose optimum is the global minimum of the initial problem. We then solve a topological optimisation problem: the number of radiating elements (REs) must be reduced. We apply a singular value decomposition to the set of relaxed optimal moduli, and a topological-gradient-type algorithm then decides how the elementary REs are grouped.

The second part concerns a black-box simulation of a chemical accident. We carry out a reliability and sensitivity study over a large number of parameters (failure probabilities, design point, and influential parameters). Lacking gradients, we use a reduced model. In a first test case we compared neural networks with the Sparse Grid (SG) interpolation method. SGs are an emerging technique: thanks to their hierarchical character and an adaptive algorithm, they become particularly effective for real problems (few influential variables). They are then applied to a higher-dimensional test case with specific improvements (successive approximations and data thresholding). In both cases the algorithms resulted in operational software.
APA, Harvard, Vancouver, ISO, and other styles
9

Patrix, Jérémy. "Détection de comportements à travers des modèles multi-agents collaboratifs, appliquée à l'évaluation de la situation, notamment en environnement asymétrique avec des données imprécises et incertaines". PhD thesis, Université de Caen, 2013. http://tel.archives-ouvertes.fr/tel-00991091.

Full text
Abstract
This thesis manuscript presents an innovative, patented method for detecting collective behaviours. Using fusion processes on data from a multi-sensor network, recent surveillance systems obtain sequences of observations of the people under surveillance. This low level of situation assessment has proved insufficient to help security forces during crowd events. In order to achieve a higher level of situation assessment in these asymmetric environments, we propose a multi-agent approach that reduces the complexity of the problem through agents at three levels of observation: macro, meso, and micro. We use a new relative state within state-of-the-art approaches that allows us to detect, in real time, groups, their behaviours, goals, and intentions. In the framework of European projects, we used a serious game simulating a crowd in asymmetric scenarios. The results show better agreement with theoretical predictions and a significant improvement over previous work. The work presented here could be used in future studies of multi-agent behaviour detection and might one day help solve problems related to catastrophic events involving uncontrollable crowds.
APA, Harvard, Vancouver, ISO, and other styles
10

Martin, Victorin. "Modélisation probabiliste et inférence par l'algorithme Belief Propagation". PhD thesis, École Nationale Supérieure des Mines de Paris, 2013. http://tel.archives-ouvertes.fr/tel-00867693.

Full text
Abstract
We are interested in constructing and estimating, from incomplete observations, models of real-valued random variables on a graph. These models must be suited to a non-standard regression problem in which the identity of the observed variables (and hence of the variables to predict) varies from one instance to another. The nature of the problem and of the available data leads us to model the network as a Markov random field, a choice justified by Jaynes' maximum entropy principle. The prediction tool chosen in this work is the Belief Propagation algorithm, in its classical or Gaussian version, whose simplicity and efficiency allow its use on large networks. After providing a new result on the local stability of the algorithm's fixed points, we study an approach based on a latent Ising model in which the dependencies between real variables are encoded through a network of binary variables. To this end, we propose a definition of these variables based on the cumulative distribution functions of the associated real variables. For the prediction step, the Belief Propagation algorithm must be modified to impose Bayesian-type constraints on the marginal distributions of the binary variables. The model parameters can easily be estimated from pairwise observations. This approach is in fact a way of solving the regression problem by working on quantiles. We also propose a greedy algorithm for estimating the structure and parameters of a Gaussian Markov random field, based on the Iterative Proportional Scaling algorithm. At each iteration this algorithm produces a new model whose likelihood, or an approximation thereof in the case of incomplete observations, is greater than that of the previous model. Because this algorithm works by local perturbation, spectral constraints can be imposed to ensure better compatibility of the obtained models with the Gaussian version of Belief Propagation. The performance of the different approaches is illustrated by numerical experiments on synthetic data.
APA, Harvard, Vancouver, ISO, and other styles

Books on the topic "Probabilities – Computer simulations"

1

Ross, Sheldon M. Simulation. 2nd ed. San Diego: Academic Press, 1997.

Search full text
APA, Harvard, Vancouver, ISO, and other styles
2

Ross, Sheldon M. Simulation. 3rd ed. San Diego: Academic Press, 2002.

Search full text
APA, Harvard, Vancouver, ISO, and other styles
3

Probability modeling and computer simulation: An integrated introduction with applications to engineering and computer science. Boston, Mass: PWS-Kent Pub. Co., 1988.

Search full text
APA, Harvard, Vancouver, ISO, and other styles
4

Ross, Sheldon M. A course in simulation. New York: Macmillan, 1990.

Search full text
APA, Harvard, Vancouver, ISO, and other styles
5

Baron, Michael. Probability and statistics for computer scientists. Boca Raton, FL: Chapman & Hall/CRC, 2007.

Search full text
APA, Harvard, Vancouver, ISO, and other styles
6

Ross, Sheldon M. Simulation. 4th ed. Amsterdam: Elsevier Academic Press, 2006.

Search full text
APA, Harvard, Vancouver, ISO, and other styles
7

Baron, Michael. Probability and statistics for computer scientists. Boca Raton, FL: Chapman & Hall/CRC, 2007.

Search full text
APA, Harvard, Vancouver, ISO, and other styles
8

Probability, Markov chains, queues and simulation: The mathematical basis of performance modeling. Princeton: Princeton University Press, 2009.

Search full text
APA, Harvard, Vancouver, ISO, and other styles
9

Intuitive probability and random processes using MATLAB. New York: Springer, 2005.

Search full text
APA, Harvard, Vancouver, ISO, and other styles
10

Hillestad, R. J., Rand Corporation, and European-American Center for Policy Analysis, eds. Modeling the External Risks of Airports for Policy Analysis. Santa Monica, CA: Rand, 1995.

Search full text
APA, Harvard, Vancouver, ISO, and other styles

Book chapters on the topic "Probabilities – Computer simulations"

1

Budde, Carlos E., and Arnd Hartmanns. "Replicating RESTART with Prolonged Retrials: An Experimental Report". In Tools and Algorithms for the Construction and Analysis of Systems, 373–80. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-72013-1_21.

Full text
Abstract
Statistical model checking uses Monte Carlo simulation to analyse stochastic formal models. It avoids state space explosion, but requires rare event simulation techniques to efficiently estimate very low probabilities. One such technique is RESTART. Villén-Altamirano recently showed, by way of a theoretical study and ad-hoc implementation, that a generalisation of RESTART to prolonged retrials offers improved performance. In this paper, we demonstrate our independent replication of the original experimental results. We implemented RESTART with prolonged retrials in two statistical model checking tools and apply them to the models used originally. To do so, we had to resolve ambiguities in the original work and refine our setup multiple times. We ultimately confirm the previous results, but our experience also highlights the need for precise documentation of experiments to enable replicability in computer science.
APA, Harvard, Vancouver, ISO, and other styles
2

Karvo, Jouni. "Efficient Simulation of Blocking Probabilities for Multi-layer Multicast Streams". In NETWORKING 2002: Networking Technologies, Services, and Protocols; Performance of Computer and Communication Networks; Mobile and Wireless Communications, 1020–31. Berlin, Heidelberg: Springer Berlin Heidelberg, 2002. http://dx.doi.org/10.1007/3-540-47906-6_83.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Bacharoudis, Konstantinos, Atanas Popov, and Svetan Ratchev. "Application of Advanced Simulation Methods for the Tolerance Analysis of Mechanical Assemblies". In IFIP Advances in Information and Communication Technology, 153–67. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-72632-4_11.

Full text
Abstract
In the frame of a statistical tolerance analysis of complex assemblies, for example an aircraft wing, the capability to predict accurately and quickly specified, very small quantiles of the distribution of the assembly key characteristic becomes crucial. The problem is significantly magnified when the tolerance synthesis problem is considered, in which several tolerance analyses are performed and thus a reliability analysis problem is nested inside an optimisation one in a fully probabilistic approach. The need to reduce the computational time and accurately estimate the specified probabilities is critical. Therefore, a systematic study of several state-of-the-art simulation methods is performed herein, and they are critically evaluated with respect to their efficiency in dealing with tolerance analysis problems. It is demonstrated that tolerance analysis problems are characterised by high dimensionality, high non-linearity of the state functions, disconnected failure domains, implicit state functions, and small probability estimations; the successful implementation of reliability methods therefore becomes a formidable task. Herein, advanced simulation methods are combined with in-house developed assembly models based on the Homogeneous Transformation Matrix method as well as off-the-shelf Computer Aided Tolerance tools. The main outcome of the work is that by using an appropriate reliability method, computational time can be reduced while the probability of defective products is accurately predicted. Furthermore, the connection of advanced mathematical toolboxes with off-the-shelf 3D tolerance tools in a process integration framework introduces benefits for successfully dealing with the tolerance allocation problem in the future using dedicated and powerful computational tools.
APA, Harvard, Vancouver, ISO, and other styles
4

Ababneh, Jafar, Hussein Abdel-Jaber, Firas Albalas, and Amjad Daoud. "Analyzing and Evaluating Current Computer Networks Simulation Models". In Simulation in Computer Network Design and Modeling, 459–78. IGI Global, 2012. http://dx.doi.org/10.4018/978-1-4666-0191-8.ch022.

Full text
Abstract
Computer simulation is widely used in investigating the performance of existing and proposed systems in many areas of science, engineering, operations research, and management science, especially in applications characterized by complicated geometries and interaction probabilities, and for dealing with system design in the presence of uncertainty. This is particularly true in the case of computer systems and computer networks. It is therefore very important to have efficient, reliable, and accurate methodologies for simulating these systems, to ensure effective, reliable, and accurate evaluation and analysis of performance, and to derive optimum design parameters for communication and computer networks. Although practical experiments and simulations are the most widely used, several efforts have also been directed towards mathematical models. The main objectives of this chapter are to: present the methodologies and techniques used to evaluate and analyze the performance of communication and computer network routers, such as mathematical analysis, computer simulation techniques, and empirical measurements; identify the workload required for accomplishing a simulation model or mathematical analysis; identify the main metrics used to evaluate, manage, and control the performance of computer networks; present the advantages and disadvantages of these techniques; and identify the challenges facing these different methodologies.
APA, Harvard, Vancouver, ISO, and other styles
5

"Faking Probabilities: Computer Simulation". En Probabilities, 245–62. Hoboken, NJ, USA: John Wiley & Sons, Inc., 2006. http://dx.doi.org/10.1002/9780470099797.ch9.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

"Faking Probabilities: Computer Simulation". En Probabilities, 285–304. Hoboken, NJ, USA: John Wiley & Sons, Inc, 2015. http://dx.doi.org/10.1002/9781118898864.ch9.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Osais, Yahya E. "Simulating Probabilities". In Computer Simulation, 27–37. Chapman and Hall/CRC, 2017. http://dx.doi.org/10.1201/9781315120294-3.

Full text
APA, Harvard, Vancouver, ISO, and other styles
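As a generic illustration of the kind of material such a chapter covers, here is a minimal sketch of inverse-transform sampling of a discrete distribution; the probabilities are arbitrary.

```python
# Inverse-transform sampling: build the CDF of the discrete distribution,
# then map uniform random numbers to outcomes via a binary search.
import numpy as np

def sample_discrete(probs, n, rng):
    """Draw n outcome indices with the given probabilities."""
    cdf = np.cumsum(probs)
    assert np.isclose(cdf[-1], 1.0)
    return np.searchsorted(cdf, rng.random(n))

rng = np.random.default_rng(4)
counts = np.bincount(sample_discrete([0.2, 0.5, 0.3], 100_000, rng))
print(counts / counts.sum())   # empirical frequencies near [0.2, 0.5, 0.3]
```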
8

Khan, Mohammad S. "A Study of Computer Virus Propagation on Scale Free Networks Using Differential Equations". In Advanced Methods for Complex Network Analysis, 196–214. IGI Global, 2016. http://dx.doi.org/10.4018/978-1-4666-9964-9.ch008.

Full text
Abstract
The SIR model is used extensively in the field of epidemiology, in particular for the analysis of communal diseases. One problem with SIR and other existing models is that they are tailored to random or Erdős-type networks, since they do not consider the varying probabilities of infection or immunity per node. In this paper, we present the application and simulation results of the pSEIRS model, which takes these probabilities into account and is thus suitable for more realistic scale-free networks. In the pSEIRS model, the death rate and the excess death rate are constant for infective nodes. Latent and immune periods are assumed to be constant, and the infection rate is assumed to be a function of the size of the total population and the size of the infected population. A node recovers from an infection temporarily with probability p and dies from the infection with probability (1 - p).
APA, Harvard, Vancouver, ISO, and other styles
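For context, here is a small sketch of the classical SIR differential equations that the pSEIRS model extends, integrated with a simple Euler scheme; the parameters are illustrative, and this is not the paper's pSEIRS variant.

```python
# Classical SIR: dS/dt = -beta*S*I/N, dI/dt = beta*S*I/N - gamma*I,
# dR/dt = gamma*I, integrated with a fixed-step Euler scheme.
import numpy as np

def simulate_sir(beta, gamma, s0, i0, r0, t_max, dt=0.01):
    steps = int(t_max / dt)
    s, i, r = s0, i0, r0
    history = np.empty((steps, 3))
    for k in range(steps):
        n = s + i + r
        ds = -beta * s * i / n
        di = beta * s * i / n - gamma * i
        dr = gamma * i
        s, i, r = s + dt * ds, i + dt * di, r + dt * dr
        history[k] = (s, i, r)
    return history

hist = simulate_sir(beta=0.3, gamma=0.1, s0=9990, i0=10, r0=0, t_max=200)
print(f"peak infected: {hist[:, 1].max():.0f}, "
      f"final susceptible: {hist[-1, 0]:.0f}")
```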
9

Biswas, Rathindra Nath, Swarup Kumar Mitra, and Mrinal Kanti Naskar. "Preserving Security of Mobile Anchors Against Physical Layer Attacks". In Cryptographic Security Solutions for the Internet of Things, 211–43. IGI Global, 2019. http://dx.doi.org/10.4018/978-1-5225-5742-5.ch008.

Full text
Abstract
This chapter introduces a new security scheme for mobile anchors avoiding the physical layer attacks towards localization in wireless sensor networks (WSNs). In a network, anchors are made location-aware equipping them with GPS (global positioning system) receivers. Direction finding capabilities are also incorporated with smart antennas. The proposed algorithm is based on adaptive beamforming of smart array that always minimizes the probabilities of successful attacks, keeping the adversaries beyond its beam coverage. Particle swarm optimization (PSO) technique is used to compute array excitation coefficients, generating the desired pattern. Thus, anchors remain secured through pattern irregularities, deteriorating the information retrieval process even though chances of occurring adequate RSS (received signal strength)/AoA (angle of arrival) measurements may exist. Moreover, anchors are assumed to send pseudo references towards stationary nodes over private links, preserving data integrity for localization. Simulation results validate its effectiveness over the existing methods.
APA, Harvard, Vancouver, ISO, and other styles
10

Biswas, Rathindra Nath, Swarup Kumar Mitra, and Mrinal Kanti Naskar. "Preserving Security of Mobile Anchors Against Physical Layer Attacks". In Research Anthology on Securing Mobile Technologies and Applications, 93–118. IGI Global, 2021. http://dx.doi.org/10.4018/978-1-7998-8545-0.ch006.

Full text
Abstract
This chapter introduces a new security scheme for mobile anchors avoiding the physical layer attacks towards localization in wireless sensor networks (WSNs). In a network, anchors are made location-aware equipping them with GPS (global positioning system) receivers. Direction finding capabilities are also incorporated with smart antennas. The proposed algorithm is based on adaptive beamforming of smart array that always minimizes the probabilities of successful attacks, keeping the adversaries beyond its beam coverage. Particle swarm optimization (PSO) technique is used to compute array excitation coefficients, generating the desired pattern. Thus, anchors remain secured through pattern irregularities, deteriorating the information retrieval process even though chances of occurring adequate RSS (received signal strength)/AoA (angle of arrival) measurements may exist. Moreover, anchors are assumed to send pseudo references towards stationary nodes over private links, preserving data integrity for localization. Simulation results validate its effectiveness over the existing methods.
APA, Harvard, Vancouver, ISO, and other styles

Conference papers on the topic "Probabilities – Computer simulations"

1

Di Blasi, Martín, and Carlos Muravchik. "Leak Detection in a Pipeline Using Modified Line Volume Balance and Sequential Probability Tests". In 2006 International Pipeline Conference. ASMEDC, 2006. http://dx.doi.org/10.1115/ipc2006-10210.

Full text
Abstract
The use of statistical tools to improve the decision aspect of leak detection is becoming common practice in the area of computer pipeline monitoring. Among these tools, the sequential probability ratio test is one of the techniques most often cited by commercial leak detection systems [1]. This decision mechanism is based on comparing the estimated probabilities of leak and no leak observed from the pipeline data. This paper proposes a leak detection system that uses a simplified statistical model of pipeline operation, allowing a simple implementation in the pipeline control system [2]. By applying linear regression to the volume balance and average pipeline pressure signals, a statistically corrected volume balance signal with reduced variance is introduced. Its expected value is zero during normal operation, whereas it equals the leak flow under a leak condition. Based on the corrected volume balance, differently configured sequential probability ratio tests (SPRT) are presented to extend the dynamic range of detectable leak flow. Simplified mathematical expressions are obtained for several system performance indices, such as the spilled volume until detection, the time to leak detection, and the minimum detectable leak flow. Theoretical results are compared with leak simulations on a real oil pipeline. A description of the system tested on a 500 km oil pipeline is included, showing some real-data results.
APA, Harvard, Vancouver, ISO, and other styles
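A hedged sketch of the decision core described in the abstract: Wald's sequential probability ratio test applied to a corrected volume-balance signal, testing mean 0 (no leak) against an assumed leak flow mu1. All parameter values are illustrative, not from the paper.

```python
# Wald SPRT for a Gaussian signal: H0 mean 0 (no leak) vs H1 mean mu1
# (leak). The log-likelihood ratio is accumulated sample by sample and
# compared against thresholds set by the target error probabilities.
import numpy as np

rng = np.random.default_rng(5)
mu1, sigma = 1.0, 2.0                 # assumed leak flow and signal noise
alpha, beta = 1e-3, 1e-3              # false alarm / missed detection targets
upper = np.log((1 - beta) / alpha)    # declare "leak" above this
lower = np.log(beta / (1 - alpha))    # declare "no leak" below this

def sprt(samples):
    llr = 0.0
    for k, x in enumerate(samples, 1):
        llr += (mu1 / sigma**2) * (x - mu1 / 2)   # Gaussian LLR increment
        if llr >= upper:
            return "leak", k
        if llr <= lower:
            return "no leak", k
    return "undecided", len(samples)

# A simulated leak: the volume balance drifts to mu1 once the leak starts.
signal = rng.normal(mu1, sigma, size=10_000)
print(sprt(signal))     # typically decides "leak" within ~100 samples here
```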
2

Huei, Y. C., P. H. Keng, and N. Krivulin. "Average Network Blocking Probabilities for TDM WDM Optical Networks with OTSIs and without WC". In 2007 15th International Symposium on Modeling, Analysis, and Simulation of Computer and Telecommunication Systems (MASCOTS). IEEE, 2007. http://dx.doi.org/10.1109/mascots.2007.12.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Yassine, Ali A., Daniel E. Whitney, and Tony Zambito. "Assessment of Rework Probabilities for Simulating Product Development Processes Using the Design Structure Matrix (DSM)". In ASME 2001 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. American Society of Mechanical Engineers, 2001. http://dx.doi.org/10.1115/detc2001/dtm-21693.

Full text
Abstract
This paper uses the Design Structure Matrix (DSM) to model and simulate the performance of development processes. Although simulation is a powerful tool for analyzing process performance, its usefulness is limited by the quality of the input information used in the analysis. DSM simulation requires process data that are hard to assess or estimate directly from development participants. In this paper, we propose a methodology that allows a more practical estimation of an important simulation parameter: rework probabilities. Furthermore, we show how this assessment method, combined with simulation, allows managers to evaluate process improvement plans based on two resulting process measures: reliability and robustness. The method is illustrated with a real application from the automotive industry.
APA, Harvard, Vancouver, ISO, and other styles
4

Parameswaran, Nataraj, and Lidvin Kjerengtroen. "Determination of Failure Probabilities and Sensitivity Factors Based on First Order Reliability Method". In ASME 1994 Design Technical Conferences collocated with the ASME 1994 International Computers in Engineering Conference and Exhibition and the ASME 1994 8th Annual Database Symposium. American Society of Mechanical Engineers, 1994. http://dx.doi.org/10.1115/detc1994-0092.

Full text
Abstract
Traditionally, most engineering problems are modeled in such a manner that all the variables involved in the design equations are deterministic. In nature, however, such a phenomenon seldom exists: most of the variables involved are randomly distributed with a certain mean and standard deviation and follow a certain type of statistical distribution. This investigation compares two statistically based design processes for evaluating the failure probabilities of a one-dimensional heat transfer problem with various statistically distributed parameters in its performance function. The methods considered are Monte Carlo simulation and the First Order Reliability Method (FORM). The two are compared on the investigated problem, and the relative advantages and disadvantages of both methods are noted at the end of the investigation. The investigation demonstrates that FORM can be used to determine failure probabilities and sensitivity factors more effectively than Monte Carlo simulation.
APA, Harvard, Vancouver, ISO, and other styles
5

Wang, Zhonglai, Zissimos P. Mourelatos, Jing Li, Amandeep Singh, and Igor Baseski. "Time-Dependent Reliability of Dynamic Systems Using Subset Simulation With Splitting Over a Series of Correlated Time Intervals". In ASME 2013 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. American Society of Mechanical Engineers, 2013. http://dx.doi.org/10.1115/detc2013-12257.

Full text
Abstract
Time-dependent reliability is the probability that a system will perform its intended function successfully for a specified time. Unless many, often unrealistic, assumptions are made, the accuracy and efficiency of time-dependent reliability estimation are major issues that may limit its practicality. Monte Carlo simulation (MCS) is accurate and easy to use, but it is computationally prohibitive for high-dimensional, long-duration, time-dependent (dynamic) systems with a low failure probability. This work addresses systems with random parameters excited by stochastic processes. Their response is calculated by time-integrating a set of differential equations at discrete times. The limit state functions are therefore explicit in time and depend on time-invariant random variables and time-dependent stochastic processes. We present an improved subset simulation with splitting approach that partitions the original high-dimensional random process into a series of correlated, short-duration, low-dimensional random processes. Subset simulation reduces the computational cost by introducing appropriate intermediate failure sub-domains to express the low failure probability as a product of larger conditional failure probabilities. Splitting is an efficient sampling method for estimating the conditional probabilities. The proposed subset simulation with splitting not only estimates the time-dependent probability of failure at a given time but also estimates the cumulative distribution function up to that time at approximately the same cost. A vibration example involving a vehicle on a stochastic road demonstrates the advantages of the proposed approach.
APA, Harvard, Vancouver, ISO, and other styles
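A condensed sketch of plain subset simulation, without the paper's splitting-over-time-intervals extension, on a toy one-dimensional limit state; the sample sizes and the conditional-sampling kernel are simplifications of what a production implementation would use.

```python
# Subset simulation: write a rare probability P(g(X) > b) as a product of
# conditional probabilities over adaptively chosen intermediate levels,
# using a Metropolis kernel restricted to each intermediate domain.
import numpy as np

rng = np.random.default_rng(6)

def g(x):
    return x                       # toy limit-state function, X ~ N(0, 1)

def subset_simulation(b, n=2000, p0=0.1):
    x = rng.standard_normal(n)
    y = g(x)
    prob = 1.0
    for _ in range(50):            # cap on the number of levels
        level = np.quantile(y, 1 - p0)
        if level >= b:             # final level reached
            return prob * np.mean(y > b)
        prob *= p0
        seeds = x[y > level]       # samples already above the level
        chains = list(seeds)
        while len(chains) < n:     # grow back to n conditional samples
            cur = chains[rng.integers(len(seeds))]
            cand = cur + rng.normal(0.0, 1.0)
            # accept w.r.t. the N(0,1) target, then enforce conditioning
            if (np.log(rng.random()) < 0.5 * (cur**2 - cand**2)
                    and g(cand) > level):
                chains.append(cand)
            else:
                chains.append(cur)
        x = np.array(chains[:n])
        y = g(x)
    return prob

p = subset_simulation(b=4.0)
print(f"subset simulation: {p:.2e}  (exact P(X > 4) = 3.17e-5)")
```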
6

Banerjee, Ashis G., Arvind Balijepalli, Satyandra K. Gupta, and Thomas W. LeBrun. "Radial Basis Function Based Simplified Trapping Probability Models for Optical Tweezers". In ASME 2008 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. ASMEDC, 2008. http://dx.doi.org/10.1115/detc2008-49124.

Full text
Abstract
In order to automatically manipulate microspheres with optical tweezers in real time, it is essential to efficiently compute an estimate of the probability with which they will be trapped by a moving laser beam in a spatial region close to its focus. This paper presents a radial basis function approach to generating simplified models for estimating trapping probability from offline simulation data. The difference form of Langevin's equation is used to perform offline particle motion simulation for estimating probabilities at discrete points. Gaussian radial basis functions combined with kd-tree based partitioning of the parameter space are then used to generate simplified models. We show that the proposed approach is computationally efficient in estimating the trapping probability, and that the probability estimated using the simplified models is sufficiently close to the probability estimated from the offline simulation data.
APA, Harvard, Vancouver, ISO, and other styles
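A generic sketch of the interpolation idea on synthetic one-dimensional data: fit Gaussian radial basis functions to probability estimates obtained offline at discrete points, so that the probability can be evaluated quickly online. The paper additionally partitions the parameter space with a kd-tree, which is omitted here, and the data below are made up.

```python
# Fit Gaussian RBF weights by solving the interpolation system
# phi(centers, centers) @ w = values, then evaluate at new points.
import numpy as np

rng = np.random.default_rng(7)

def rbf_fit(centers, values, eps):
    d = np.abs(centers[:, None] - centers[None, :])
    phi = np.exp(-(eps * d) ** 2)          # Gaussian kernel matrix
    return np.linalg.solve(phi, values)

def rbf_eval(x, centers, weights, eps):
    d = np.abs(x[:, None] - centers[None, :])
    return np.exp(-(eps * d) ** 2) @ weights

centers = np.linspace(0.0, 2.0, 15)        # offsets from the beam focus
p_offline = np.exp(-centers**2 / 0.5)      # stand-in trapping probabilities
weights = rbf_fit(centers, p_offline, eps=4.0)

x = np.array([0.25, 0.75, 1.5])
print(rbf_eval(x, centers, weights, eps=4.0))  # fast interpolated estimates
```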
7

Beck, James L., S. K. Au, and K. V. Yuen. "Bayesian Updating of Nonlinear Model Predictions Using Markov Chain Monte Carlo Simulation". In ASME 2001 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. American Society of Mechanical Engineers, 2001. http://dx.doi.org/10.1115/detc2001/vib-21398.

Full text
Abstract
The usual practice in system identification is to use system data to identify one model from a set of possible models and then to use this model for predicting system behavior. In contrast, the present robust predictive approach rigorously combines the predictions of all the possible models, appropriately weighted by their updated probabilities based on the data. This Bayesian system identification approach is applied to update the robust reliability of a dynamical system based on its measured response time histories. A Markov chain simulation method based on the Metropolis-Hastings algorithm and an adaptive scheme is proposed to evaluate the robust reliability integrals. An example of updating the reliability of a Duffing oscillator is given to illustrate the proposed method.
APA, Harvard, Vancouver, ISO, and other styles
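A minimal Metropolis-Hastings sketch of the Bayesian updating idea: sample the posterior of a single model parameter given noisy response data. The model, prior, and data are illustrative stand-ins for the paper's dynamical models and adaptive scheme.

```python
# Random-walk Metropolis-Hastings on a one-parameter posterior with a
# Gaussian likelihood (known noise) and a wide Gaussian prior.
import numpy as np

rng = np.random.default_rng(8)

theta_true = 2.0
data = theta_true + rng.normal(0, 0.5, size=50)   # measured responses

def log_post(theta):
    # N(0, 10^2) prior, Gaussian likelihood with noise sigma = 0.5
    return -theta**2 / 200 - np.sum((data - theta) ** 2) / (2 * 0.5**2)

samples, theta = [], 0.0
for _ in range(20_000):
    cand = theta + rng.normal(0, 0.2)             # random-walk proposal
    if np.log(rng.random()) < log_post(cand) - log_post(theta):
        theta = cand                              # accept
    samples.append(theta)

post = np.array(samples[5000:])                   # discard burn-in
print(f"posterior mean {post.mean():.3f} +/- {post.std():.3f}")
```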
8

Wang, Yan. "Solving Interval Master Equation in Simulation of Jump Processes Under Uncertainties". In ASME 2013 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. American Society of Mechanical Engineers, 2013. http://dx.doi.org/10.1115/detc2013-12740.

Full text
Abstract
Two types of uncertainty are generally recognized in modeling and simulation: variability caused by inherent randomness, and incertitude due to the lack of perfect knowledge. Generalized interval probability is able to model both uncertainty components simultaneously, where epistemic uncertainty is quantified by the generalized interval in addition to the probabilistic measure. With conditioning, independence, and the Markovian property uniquely defined, the calculus structures in generalized interval probability resemble those in classical probability theory. An imprecise Markov chain model is proposed with ease of computation in mind. A Krylov subspace projection method is developed to solve the interval master equation and simulate jump processes with finite state transitions under uncertainties. State transitions with interval-valued probabilities can be simulated, providing lower and upper bound information on the evolving distributions as an alternative to traditional sensitivity analysis.
APA, Harvard, Vancouver, ISO, and other styles
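For contrast with the paper's interval-valued, Krylov-projected computation, here is a bare-bones point-valued sketch: evolving a finite-state jump process by solving the master equation dp/dt = pQ with a dense matrix exponential. The generator below is made up.

```python
# Solve the master equation of a 3-state continuous-time Markov chain:
# p(t) = p(0) @ expm(Q t), with generator rows summing to zero.
import numpy as np
from scipy.linalg import expm

Q = np.array([[-0.5, 0.4, 0.1],
              [0.2, -0.3, 0.1],
              [0.0, 0.6, -0.6]])
p0 = np.array([1.0, 0.0, 0.0])       # start in state 0

for t in (0.5, 1.0, 5.0):
    print(t, np.round(p0 @ expm(Q * t), 4))   # state distribution at time t
```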
9

Thakur, Atul, Petr Svec, and Satyandra K. Gupta. "Generation of State Transition Models Using Simulations for Unmanned Sea Surface Vehicle Trajectory Planning". In ASME 2011 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. ASMEDC, 2011. http://dx.doi.org/10.1115/detc2011-48624.

Full text
Abstract
Trajectory planning for unmanned sea surface vehicles (USSVs) in high sea states is a challenging problem. Large and somewhat stochastic ocean forces can cause significant deviations in the motion of the USSV. Controllers are employed to reject disturbances and return to the desired trajectory, but the position uncertainty can still be high and needs to be accounted for during trajectory planning to avoid collisions with obstacles. We model the motion of the USSV as a Markov decision process and use a trajectory planning approach based on stochastic dynamic programming. A key component of our approach is the estimation of the probabilities of transitioning from one state to another when executing an action. In this paper, we present algorithms for generating the state transition model using Monte Carlo simulation of USSV motion. Our simulations are based on potential-flow 6-DOF dynamics. Using this approach, we are able to generate dynamically feasible trajectories for USSVs that exhibit safe behaviors in high sea states in the vicinity of static obstacles.
APA, Harvard, Vancouver, ISO, and other styles
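A toy sketch of building a state transition model from repeated simulations: run a noisy motion model many times from each state under a given action and count where the vehicle lands. The 1-D dynamics are an assumed stand-in for the paper's 6-DOF model.

```python
# Estimate P(s' | s, action) by Monte Carlo: repeatedly simulate one step
# of a noisy motion model from each discrete state and tally the outcomes.
import numpy as np

rng = np.random.default_rng(9)
n_states, n_runs = 10, 5000

def step(state, action):
    # intended move plus ocean-disturbance noise, clipped to the grid
    target = state + action + rng.normal(0, 0.8)
    return int(np.clip(round(target), 0, n_states - 1))

def transition_model(action):
    counts = np.zeros((n_states, n_states))
    for s in range(n_states):
        for _ in range(n_runs):
            counts[s, step(s, action)] += 1
    return counts / n_runs            # each row is P(s' | s, action)

T = transition_model(action=+1)
print(np.round(T[4], 3))              # outcome distribution from state 4
```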
10

Xu, Yanwen, and Pingfeng Wang. "Sequential Sampling Based Reliability Analysis for High Dimensional Rare Events With Confidence Intervals". In ASME 2020 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. American Society of Mechanical Engineers, 2020. http://dx.doi.org/10.1115/detc2020-22146.

Full text
Abstract
Accurately analyzing rare failure events at an affordable computational cost is often challenging in engineering applications, especially for problems with high-dimensional system inputs. The extremely low probabilities of occurrence of such rare events often lead to large probability estimation errors and low computational efficiency. It is thus vital to develop advanced probability analysis methods capable of providing robust estimates of rare event probabilities with narrow confidence bounds. Generally, confidence intervals of an estimator can be established based on the central limit theorem, but one of the critical obstacles is low computational efficiency, since the widely used Monte Carlo method often requires a large number of simulation samples to derive a reasonably narrow confidence interval. This paper develops a new probability analysis approach that derives estimates of rare event probabilities efficiently, with narrow estimation bounds, for high-dimensional problems. The asymptotic behavior of the developed estimator is proved theoretically without imposing strong assumptions, and an asymptotic confidence interval is established for the estimator. The presented study offers important insights into the robust estimation of the probability of occurrence of rare events. The accuracy and computational efficiency of the developed technique are assessed with numerical and engineering case studies, which demonstrate that narrow bounds can be built efficiently and that the true values always lie within the estimation bounds, indicating good estimation accuracy along with significantly improved efficiency.
APA, Harvard, Vancouver, ISO, and other styles