
Journal articles on the topic 'Probabilities – Computer simulations'


Consult the top 50 journal articles for your research on the topic 'Probabilities – Computer simulations.'


1

DOBRESCU, GIANINA, M. RUSU, and M. VASS. "COMPUTER SIMULATIONS OF FRACTAL SURFACES: APPLICATION IN ADSORPTION." Fractals 01, no. 03 (September 1993): 430–38. http://dx.doi.org/10.1142/s0218348x93000459.

Abstract:
A computer program that simulates adsorption on fractal surfaces was developed. The fractal surfaces are generated as Takagi surfaces, and the program is based on a DLA algorithm. Adsorption was simulated under two conditions: (1) equivalent active sites (homogeneous surfaces); (2) active sites with different adsorption probabilities, where the probability associated with each active site is computed using a van der Waals potential. Our simulation allows us to explore the actual structure of the gas–solid interface and to study its sensitivity to energetic disorder. Curves of the fractal dimension of the gas–solid interface versus adsorption coverage are computed.
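The program itself is not reproduced in this listing; as a rough illustration of one ingredient the abstract names — generating a fractal surface as a Takagi surface — the following sketch builds a height map as a weighted sum of triangle waves. The weight `w` and recursion `depth` are illustrative assumptions, not values from the paper.

```python
def tri(x):
    """Distance from x to the nearest integer (triangle wave)."""
    return abs(x - round(x))

def takagi_surface(nx, ny, w=0.6, depth=12):
    """Height map built as a weighted sum of triangle waves on [0,1)^2.
    w and depth are illustrative parameters, not taken from the paper."""
    surface = []
    for i in range(nx):
        row = []
        for j in range(ny):
            x, y = i / nx, j / ny
            h = sum(w**k * (tri(2**k * x) + tri(2**k * y)) for k in range(depth))
            row.append(h)
        surface.append(row)
    return surface

surf = takagi_surface(64, 64)
```

Smaller `w` gives a smoother surface; values of `w` closer to 1 raise the effective fractal dimension.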
2

Smith, Peter J. "Underestimation of Rare Event Probabilities in Importance Sampling Simulations." SIMULATION 76, no. 3 (March 2001): 140–50. http://dx.doi.org/10.1177/003754970107600301.

3

Peng, Xidan, and Xiangyang Li. "Performance Analysis for Analog Network Coding with Imperfect CSI in FDD Two Way Channels." Journal of Systems Science and Information 3, no. 4 (August 25, 2015): 357–64. http://dx.doi.org/10.1515/jssi-2015-0357.

Abstract:
A time-division duplex (TDD) two-way channel exploits reciprocity to estimate the forward channel gain from the reverse link. Many previous works explore outage probabilities in the TDD system based on this reciprocity property. However, a frequency-division duplex (FDD) system has no such reciprocity. In this letter, we investigate the impact of CSI estimation error on the performance of non-orthogonal and orthogonal analog network coding (ANC) protocols in an FDD two-way system, where the channel gains are independent of each other. Considering imperfect CSI, closed-form expressions for the outage probabilities of the two protocols are derived in the high signal-to-noise ratio (SNR) regime. The derived outage probabilities match Monte Carlo simulation results in different communication scenarios. Interestingly, computer simulations show that ANC in the FDD two-way channel outperforms ANC in the TDD channel.
4

SCHURZ, GERHARD, and PAUL D. THORN. "REWARD VERSUS RISK IN UNCERTAIN INFERENCE: THEOREMS AND SIMULATIONS." Review of Symbolic Logic 5, no. 4 (July 4, 2012): 574–612. http://dx.doi.org/10.1017/s1755020312000184.

Abstract:
Systems of logico-probabilistic (LP) reasoning characterize inference from conditional assertions that express high conditional probabilities. In this paper we investigate four prominent LP systems, the systems O, P, Z, and QC. These systems differ in the number of inferences they license (O ⊂ P ⊂ Z ⊂ QC). LP systems that license more inferences enjoy the possible reward of deriving more true and informative conclusions, but with this possible reward comes the risk of drawing more false or uninformative conclusions. In the first part of the paper, we present the four systems and extend each of them by theorems that allow one to compute almost-tight lower probability bounds for the conclusion of an inference, given lower probability bounds for its premises. In the second part of the paper, we investigate by means of computer simulations which of the four systems provides the best balance of reward versus risk. Our results suggest that system Z offers the best balance.
5

Shchur, Lev N., and Sergey S. Kosyakov. "Probability of Incipient Spanning Clusters in Critical Square Bond Percolation." International Journal of Modern Physics C 08, no. 03 (June 1997): 473–81. http://dx.doi.org/10.1142/s0129183197000394.

Abstract:
The probability of simultaneous occurrence of at least k spanning clusters has been studied by Monte Carlo simulations on the 2D square lattice with free boundaries at the bond percolation threshold p_c = 1/2. It is found that the probabilities of k or more Incipient Spanning Clusters (ISC) have the values P(k>1) ≈ 0.00658(3) and P(k>2) ≈ 0.00000148(21), provided that the limit of these probabilities for infinite lattices exists. The probability P(k>3) of more than three ISC can be estimated to be of the order of 10^-11, which is beyond the reach of present-day computers. It is therefore impossible to check the Aizenman law for these probabilities in simulations when k ≫ 1. We have detected a single sample with four ISC in a total of about 10^10 samples investigated; the probability of this single event is 1/10 for that number of samples. The influence of boundary conditions is discussed in the last section.
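As a minimal sketch of the kind of Monte Carlo experiment the abstract describes — bond percolation on a square lattice at p_c = 1/2 with a test for spanning — the following uses a union-find structure with virtual top and bottom nodes. The lattice size, trial count, and seed are illustrative, and the paper's counting of multiple simultaneous clusters is not reproduced.

```python
import random

def find(parent, a):
    """Union-find root lookup with path halving."""
    while parent[a] != a:
        parent[a] = parent[parent[a]]
        a = parent[a]
    return a

def union(parent, a, b):
    ra, rb = find(parent, a), find(parent, b)
    if ra != rb:
        parent[ra] = rb

def spans(L, p, rng):
    """One bond-percolation sample on an L x L grid with bonds open with
    probability p; returns True if a cluster connects top to bottom."""
    parent = list(range(L * L + 2))
    top, bottom = L * L, L * L + 1   # virtual boundary nodes
    for i in range(L):
        for j in range(L):
            s = i * L + j
            if i == 0:
                union(parent, s, top)
            if i == L - 1:
                union(parent, s, bottom)
            if j + 1 < L and rng.random() < p:   # bond to the right
                union(parent, s, s + 1)
            if i + 1 < L and rng.random() < p:   # bond downward
                union(parent, s, s + L)
    return find(parent, top) == find(parent, bottom)

rng = random.Random(1)
trials = 200
p_span = sum(spans(16, 0.5, rng) for _ in range(trials)) / trials
```

At p = 1/2 the single-direction crossing probability is close to 1/2, so `p_span` hovers around 0.5; estimating the far rarer multi-cluster events of the paper would need vastly more samples.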
6

Paschalidis, I. C., and S. Vassilaras. "Importance Sampling for the Estimation of Buffer Overflow Probabilities via Trace-Driven Simulations." IEEE/ACM Transactions on Networking 12, no. 5 (October 2004): 907–19. http://dx.doi.org/10.1109/tnet.2004.836139.

7

Ojha, Durga Prasad. "Nematogenic Behaviour of a Cyano-Compound Using Quantum Mechanics and Computer Simulations." Zeitschrift für Naturforschung A 56, no. 3-4 (April 1, 2001): 319–25. http://dx.doi.org/10.1515/zna-2001-0315.

Abstract:
Using quantum mechanics and intermolecular forces, the molecular ordering of a nematogenic cyano-compound, 5-(trans-4-ethylcyclohexyl)-2-(4-cyanophenyl)-pyrimidine (ECCPP), has been examined. The CNDO/2 method has been employed to evaluate the net atomic charge and the dipole moment components at each atomic centre of the molecule. The configuration energy has been computed using the modified Rayleigh-Schrödinger perturbation method at intervals of 1 Å in translation and 10° in rotation, and the corresponding probabilities have been calculated using Maxwell-Boltzmann statistics. The flexibility of various configurations has been studied in terms of the variation of the probability due to small departures from the most probable configuration. All possible geometrical arrangements between a molecular pair have been considered during stacking, in-plane and terminal interactions, and the most favourable configuration of pairing has been obtained. An attempt has been made to understand the behaviour of the molecules in terms of their relative order. The results have been compared with those obtained for other nematogens like DPAB [4,4'-di-n-propoxy-azoxybenzene] and EMBAC [ethyl 4-(4'-methoxybenzylidene amino) cinnamate].
8

Chiou, Rong Nan, and Chia-Nian Shyi. "Adaptive Maximums of Random Variables for Network Simulations." Journal of Computer Systems, Networks, and Communications 2009 (2009): 1–6. http://dx.doi.org/10.1155/2009/383720.

Abstract:
To enhance the precision of network simulations, the paper proposes an approach that adaptively decides the maximum of the random variables that create the discrete probabilities used to generate nodal traffic on simulated networks. A statistical model is first suggested to manifest the bound of the statistical errors. Then, according to the minimum probability that generates nodal traffic, a formula is proposed to decide the maximum. In the formula, a precision parameter represents the degree of simulative accuracy. Meanwhile, the maximum adaptively varies with the traffic distribution among nodes because the decision depends on the minimum probability generating nodal traffic. To verify the effect of the adaptive maximum on simulative precision, an optical network is introduced. After simulating the optical network, the theoretical average waiting time of its nodes is used to validate the exactness of the simulation. The proposed formula for deciding the adaptive maximum can be exploited generally in simulations of various networks. Based on the precision parameter K, a recursive procedure will be developed in future work to produce the adaptive maximum automatically for network simulations.
9

Zhang, Xulong, and Xiaoxia Song. "Stability Analysis of a Dynamical Model for Malware Propagation with Generic Nonlinear Countermeasure and Infection Probabilities." Security and Communication Networks 2020 (September 22, 2020): 1–7. http://dx.doi.org/10.1155/2020/8859883.

Abstract:
The dissemination of countermeasures is widely recognized as one of the most effective strategies for inhibiting malware propagation, and the study of generic countermeasure and infection probabilities has important practical significance. To this end, a dynamical model incorporating generic nonlinear countermeasure and infection probabilities is proposed. Theoretical analysis shows that the model has a unique equilibrium that is globally asymptotically stable. Accordingly, a real network based on the model assumptions is constructed, and numerical simulations are conducted on it. The simulations not only illustrate the theoretical results but also demonstrate the reasonableness of the generic countermeasure and infection terms.
10

Lerche, Ian, and Brett S. Mudford. "How Many Monte Carlo Simulations Does One Need to Do?" Energy Exploration & Exploitation 23, no. 6 (December 2005): 405–27. http://dx.doi.org/10.1260/014459805776986876.

Abstract:
This article derives an estimation procedure for evaluating how many Monte Carlo realizations must be done to achieve prescribed accuracies in the estimated mean value, and also in the cumulative probabilities of achieving values greater than, or less than, a particular value as that value is allowed to vary. In addition, by inverting the argument and asking what accuracies result from a prescribed number of Monte Carlo realizations, one can assess the computer time that would be involved should one choose to carry them out. These two complementary procedures are of great benefit in controlling the worth of undertaking an unknown number of Monte Carlo realizations, and of continuing until the results have reached a level of accuracy one deems acceptable. Such a procedure is not only computer intensive but also very open-ended, a less than desirable trait when running a complex computer program that might take many hours or days for even a single run. The procedure presented here allows one to assess, ahead of performing a large number of Monte Carlo realizations, roughly how many are actually needed. Several illustrative numerical examples indicate how one uses this procedure in practical situations.
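The abstract's two complementary questions — how many realizations a prescribed accuracy requires, and what accuracy a prescribed number of realizations delivers — can be sketched for the mean with the standard CLT-based confidence interval. This is a generic textbook estimate, not the paper's exact procedure; `sigma` is assumed to come from a pilot run.

```python
import math

def n_required(sigma, epsilon, z=1.96):
    """Monte Carlo realizations needed so the sample mean lies within
    +/- epsilon of the true mean with ~95% confidence (CLT sketch;
    sigma is a pilot estimate of the standard deviation)."""
    return math.ceil((z * sigma / epsilon) ** 2)

def achieved_epsilon(sigma, n, z=1.96):
    """Inverse view: half-width of the ~95% confidence interval after n runs."""
    return z * sigma / math.sqrt(n)
```

Because the cost grows as 1/epsilon^2, halving the target error quadruples the run count, which is why assessing `n` before launching an expensive simulation matters.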
11

Moen, Jon, Pär K. Ingvarsson, and David WH Walton. "Estimates of structural complexity in clonal plant morphology: comparisons of grazed and ungrazed Acaena magellanica rhizomes." Canadian Journal of Botany 77, no. 6 (October 30, 1999): 869–76. http://dx.doi.org/10.1139/b99-047.

Abstract:
The aim of this study is to examine the information given by various indices of rhizome morphology that describe grazed and ungrazed rhizome systems of Acaena magellanica (Rosaceae). Internode lengths, branching probabilities, and branching angles were estimated from grazed and ungrazed rhizomes in the field. These parameter values were then used in computer simulations of rhizome growth, and the structural complexity of the simulated rhizomes was described using size, topology, and fractal dimensions. Grazed rhizomes had shorter internodes, higher probabilities of branching, and more open branching angles than ungrazed rhizomes. This resulted in more directional growth (a herring-bone pattern) in the simulated ungrazed rhizomes, whereas the grazed rhizomes had a more space-filling growth pattern. Most indices, even though they are based on different mathematical and theoretical backgrounds, were highly correlated and thus equally good at describing the structural complexity exhibited by the rhizomes. However, indices have different relationships to theories about function, and we suggest that any study of structural complexity of branching systems should use several different indices of shape depending on the questions asked. Key words: Acaena magellanica, fractal dimension, grazing, growth simulation, topology.
12

El-Taha, M., and D. E. Clark. "Generation of Correlated Logistic-Normal Random Variates for Medical Decision Trees." Methods of Information in Medicine 37, no. 03 (July 1998): 235–38. http://dx.doi.org/10.1055/s-0038-1634534.

Abstract:
A Logistic-Normal random variable (Y) is obtained from a Normal random variable (X) by the relation Y = e^X/(1 + e^X). In Monte Carlo analysis of decision trees, Logistic-Normal random variates may be used to model the branching probabilities. In some cases, the probabilities to be modeled may not be independent, and a method for generating correlated Logistic-Normal random variates would be useful. A technique for generating correlated Normal random variates has been described previously. Using Taylor series approximations and the algebraic definitions of variance and covariance, we describe methods for estimating the means, variances, and covariances of Normal random variates which, after translation using the above formula, will result in Logistic-Normal random variates having approximately the desired means, variances, and covariances. Multiple simulations of the method using the Mathematica computer algebra system show satisfactory agreement with the theoretical results.
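A minimal sketch of the transform the abstract states, Y = e^X/(1 + e^X), applied to a correlated pair of Normals built from a 2x2 Cholesky factor. The paper's Taylor-series moment matching (choosing the Normal parameters so the Logistic-Normal moments come out right) is not reproduced; the parameters below are on the Normal scale and purely illustrative.

```python
import math
import random

def logistic(x):
    """Y = e^x / (1 + e^x), written in the overflow-safe form."""
    return 1.0 / (1.0 + math.exp(-x))

def correlated_logistic_normal(mu1, mu2, s1, s2, rho, rng):
    """Draw one pair (Y1, Y2) of correlated Logistic-Normal variates:
    correlate two standard Normals via a 2x2 Cholesky factor, shift and
    scale them, then map each through the logistic transform."""
    z1, z2 = rng.gauss(0, 1), rng.gauss(0, 1)
    x1 = mu1 + s1 * z1
    x2 = mu2 + s2 * (rho * z1 + math.sqrt(1.0 - rho**2) * z2)
    return logistic(x1), logistic(x2)

rng = random.Random(42)
pairs = [correlated_logistic_normal(0.0, 0.0, 1.0, 1.0, 0.8, rng)
         for _ in range(5000)]
```

Each draw lands in (0, 1), as a branching probability must, and the logistic transform preserves the sign (though not the exact magnitude) of the Normal-scale correlation — closing that gap is what the paper's moment-matching addresses.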
13

Van den Nest, Maarten. "Simulating quantum computers with probabilistic methods." Quantum Information and Computation 11, no. 9&10 (September 2011): 784–812. http://dx.doi.org/10.26421/qic11.9-10-5.

Abstract:
We investigate the boundary between classical and quantum computational power. This work consists of two parts. First we develop new classical simulation algorithms that are centered on sampling methods. Using these techniques we generate new classes of classically simulatable quantum circuits where standard techniques relying on the exact computation of measurement probabilities fail to provide efficient simulations. For example, we show how various concatenations of matchgate, Toffoli, Clifford, bounded-depth, Fourier transform and other circuits are classically simulatable. We also prove that sparse quantum circuits as well as circuits composed of CNOT and $\exp[{i\theta X}]$ gates can be simulated classically. In a second part, we apply our results to the simulation of quantum algorithms. It is shown that a recent quantum algorithm, concerned with the estimation of Potts model partition functions, can be simulated efficiently classically. Finally, we show that the exponential speed-ups of Simon's and Shor's algorithms crucially depend on the very last stage in these algorithms, dealing with the classical postprocessing of the measurement outcomes. Specifically, we prove that both algorithms would be classically simulatable if the function classically computed in this step had a sufficiently peaked Fourier spectrum.
14

Hennecke, Michael. "Markov Chain Analysis of Single Spin Flip Ising Simulations." International Journal of Modern Physics C 08, no. 02 (April 1997): 207–27. http://dx.doi.org/10.1142/s0129183197000199.

Abstract:
The Markov processes defined by random and loop-based schemes for single spin flip attempts in Monte Carlo simulations of the 2D Ising model are investigated, by explicitly constructing their transition matrices. Their analysis reveals that loops over all lattice sites using a Metropolis-type single spin flip probability often do not define ergodic Markov chains, and have distorted dynamical properties even if they are ergodic. The transition matrices also enable a comparison of the dynamics of random versus loop spin selection and Glauber versus Metropolis probabilities.
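The abstract's approach — explicitly constructing the transition matrix of a single-spin-flip scheme and analysing it — can be sketched on a toy system. The sketch below uses a 1D ring of three spins rather than the paper's 2D model, with random site selection and Metropolis acceptance probabilities; it checks that the matrix is stochastic and that the Boltzmann distribution is stationary. Size and temperature are illustrative choices.

```python
import math
from itertools import product

N, beta = 3, 0.7                      # illustrative ring size and inverse temperature
states = list(product([-1, 1], repeat=N))

def energy(s):
    """Nearest-neighbour Ising energy on a periodic ring."""
    return -sum(s[i] * s[(i + 1) % N] for i in range(N))

def transition_matrix():
    """P[a][b]: one-step probability of moving from state a to state b
    under random-site Metropolis single spin flips."""
    M = len(states)
    index = {s: k for k, s in enumerate(states)}
    P = [[0.0] * M for _ in range(M)]
    for a, s in enumerate(states):
        for i in range(N):
            t = list(s)
            t[i] = -t[i]              # propose flipping site i
            t = tuple(t)
            acc = min(1.0, math.exp(-beta * (energy(t) - energy(s))))
            P[a][index[t]] += acc / N # site chosen uniformly at random
        P[a][a] += 1.0 - sum(P[a])    # rejected proposals stay put
    return P

P = transition_matrix()
boltz = [math.exp(-beta * energy(s)) for s in states]
Z = sum(boltz)
pi = [w / Z for w in boltz]           # Boltzmann distribution
```

Replacing the random site choice with a fixed loop over sites would make each step a product of per-site matrices — exactly the change whose effect on ergodicity the paper analyses.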
15

Ha, Il-Kyu, and You-Ze Cho. "Analysis of factors affecting the speed of probabilistic target search using unmanned aerial vehicles." International Journal of Distributed Sensor Networks 15, no. 9 (September 2019): 155014771987761. http://dx.doi.org/10.1177/1550147719877610.

Abstract:
When searching for targets using unmanned aerial vehicles, speed is important for many applications, such as the discovery of patients in a medical emergency. The speed of operation of actual unmanned aerial vehicles is strongly related to the performance of the camera sensor used for target recognition, the search altitude, and the search algorithm employed by the unmanned aerial vehicle. In this study, the major factors affecting the speed of a probabilistic unmanned aerial vehicle target search are analyzed. In particular, simulations are performed to analyze the influence of the search altitude, sensor false alarm rate, and sensor missed detection rate on the travel distance and time required for a search. Furthermore, the search performance of an unmanned aerial vehicle is analyzed by varying the search altitude with fixed false alarm and missed detection probabilities. The simulation results show that the search performance is significantly affected by changes in the false alarm and missed detection probabilities of the sensor, and confirm that the effect of the missed detection probability is greater than that of the false alarm probability. The second simulation proves that the altitude of an unmanned aerial vehicle is a very important factor in the speed of a target search. In particular, the results show that, for a real data set, the search distance and time at 10 and 5 m are about 2.8 times and 14.3 times larger, respectively, than those at 20 m.
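Probabilistic target search of the kind studied here typically rests on a Bayes update of a cell-occupancy distribution, with the sensor's missed-detection and false-alarm rates entering as likelihoods. The following is a generic sketch of that update, not the paper's algorithm; all parameter values are illustrative.

```python
def bayes_update(belief, cell, detection, p_md, p_fa):
    """Posterior target-location distribution after sensing `cell`.
    p_md: missed-detection probability, p_fa: false-alarm probability."""
    post = []
    for i, prior in enumerate(belief):
        if detection:
            like = (1.0 - p_md) if i == cell else p_fa
        else:
            like = p_md if i == cell else (1.0 - p_fa)
        post.append(prior * like)
    total = sum(post)
    return [p / total for p in post]

belief = [0.25, 0.25, 0.25, 0.25]
# A non-detection in cell 0 shifts probability mass to the other cells.
belief = bayes_update(belief, 0, False, p_md=0.2, p_fa=0.05)
```

The update makes the abstract's asymmetry plausible: a large `p_md` keeps the searched cell's posterior from falling much after a non-detection, so cells must be revisited and the search lengthens more than with a comparable `p_fa`.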
16

Cherry, Joshua L. "Selection in a Subdivided Population With Local Extinction and Recolonization." Genetics 164, no. 2 (June 1, 2003): 789–95. http://dx.doi.org/10.1093/genetics/164.2.789.

Abstract:
In a subdivided population, local extinction and subsequent recolonization affect the fate of alleles. Of particular interest is the interaction of this force with natural selection. The effect of selection can be weakened by this additional source of stochastic change in allele frequency. The behavior of a selected allele in such a population is shown to be equivalent to that of an allele with a different selection coefficient in an unstructured population with a different size. This equivalence allows the use of established results for panmictic populations to predict such quantities as fixation probabilities and mean times to fixation. The magnitude of the quantity N_e s_e (the product of the effective population size and the effective selection coefficient), which determines the fixation probability, is decreased by extinction and recolonization. Thus deleterious alleles are more likely to fix, and advantageous alleles less likely to do so, in the presence of extinction and recolonization. Computer simulations confirm that the theoretical predictions of both fixation probabilities and mean times to fixation are good approximations.
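The panmictic results the abstract invokes include the standard diffusion approximation for fixation probability, here written in one common diploid convention as u(p) = (1 − e^(−4·Ne·s·p)) / (1 − e^(−4·Ne·s)). The sketch below implements only this generic formula; the paper's mapping from a subdivided population to effective N_e and s_e is its own result and is not reproduced.

```python
import math

def fixation_probability(Ne, s, p):
    """Diffusion-approximation fixation probability of an allele at initial
    frequency p, with effective population size Ne and selection
    coefficient s (one common diploid convention)."""
    if s == 0:
        return p   # neutral allele: fixation probability equals its frequency
    return (1.0 - math.exp(-4.0 * Ne * s * p)) / (1.0 - math.exp(-4.0 * Ne * s))
```

Shrinking the magnitude of Ne·s pushes u(p) toward the neutral value p, which is the abstract's point: extinction and recolonization make deleterious alleles more likely to fix and advantageous ones less so.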
17

YAŞAR, F., and K. DEMIR. "STUDY OF TWO BIOACTIVE PEPTIDES IN VACUUM AND SOLVENT BY MOLECULAR MODELING." International Journal of Modern Physics C 17, no. 06 (June 2006): 825–39. http://dx.doi.org/10.1142/s0129183106009382.

Abstract:
The thermodynamic and structural properties of the Tyrosine-Glycine-Leucine-Phenylalanine (YGLF, in one-letter code) and Lysine-Valine-Leucine-Proline-Valine-Proline-Glutamine (KVLPVPQ) peptide sequences were studied by three-dimensional molecular modeling in vacuum and in solution. The three-dimensional conformations of each peptide sequence were obtained by multicanonical simulations using the ECEPP/2 force field, with each simulation started from a completely random initial conformation. Solvation contributions are included by a term proportional to the solvent-accessible surface area of the peptides. In the present study, we calculated the average values of the total energy, specific heat, fourth-order cumulant, and end-to-end distance for these two peptide sequences of milk protein as a function of temperature. Exploiting a major advantage of this simulation technique, Ramachandran plots were prepared and analysed to predict the relative occurrence probabilities of β-turn, γ-turn, and helical structures. Although structural predictions of these sequences indicate both a high level of γ-turns and a low level of β-turns in vacuum and in solvent, the probabilities in vacuum were observed to be higher than those in the solvent model.
18

Yagawa, G., S. Yoshimura, N. Handa, T. Uno, K. Watashi, T. Fujioka, H. Ueda, M. Uno, K. Hojo, and S. Ueda. "Study on Life Extension of Aged RPV Material Based on Probabilistic Fracture Mechanics: Japanese Round Robin." Journal of Pressure Vessel Technology 117, no. 1 (February 1, 1995): 7–13. http://dx.doi.org/10.1115/1.2842095.

Abstract:
This paper is concerned with round-robin analyses of probabilistic fracture mechanics (PFM) problems for aged RPV material. Analyzed here is a plate with a semi-elliptical surface crack subjected to various cyclic tensile and bending stresses. The depth and aspect ratio of the surface crack are assumed to be probabilistic variables. Failure probabilities are calculated using Monte Carlo methods with importance sampling or stratified sampling techniques. Material properties are chosen from the Marshall report, the ASME Code Section XI, and experiments on a Japanese RPV material carried out by the Life Evaluation (LE) subcommittee of the Japan Welding Engineering Society (JWES), while loads are determined by reference to design loading conditions of pressurized water reactors (PWR). Seven organizations participated in this study. First, the procedures for obtaining reliable PFM solutions with low failure probabilities are examined by solving a common problem with seven computer programs. The seven solutions agree well with one another, i.e., within a factor of 2 to 5 in failure probabilities. Next, sensitivity analyses are performed by varying fracture toughness values, loading conditions, and pre- and in-service inspections. Finally, life-extension simulations based on the PFM analyses are performed. These analyses clearly demonstrate that failure probabilities are so sensitive to changes in fracture toughness that the degree of neutron irradiation significantly influences the judgment of plant life extension.
19

Simiu, Emil, and Marek Franaszek. "Melnikov-Based Open-Loop Control of Escape for a Class of Nonlinear Systems." Journal of Dynamic Systems, Measurement, and Control 119, no. 3 (September 1, 1997): 590–94. http://dx.doi.org/10.1115/1.2801302.

Abstract:
The performance of certain nonlinear stochastic systems is deemed acceptable if during a specified time interval, the systems have sufficiently low probabilities of escape from a preferred region of phase space. We propose an open-loop control method for reducing these probabilities. The method is applicable to stochastic systems whose dissipation- and excitation-free counterparts have homoclinic or heteroclinic orbits. The Melnikov relative scale factors are system properties containing information on the frequencies of the random forcing spectral components that are most effective in inducing escapes. Numerical simulations show that substantial advantages can be achieved in some cases by designing control systems that take into account the information contained in the Melnikov scale factors.
20

Li, R., M. Hoover, and F. Gaitan. "High-fidelity single-qubit gates using non-adiabatic rapid passage." Quantum Information and Computation 7, no. 7 (September 2007): 594–608. http://dx.doi.org/10.26421/qic7.7-3.

Abstract:
Numerical simulation results are presented which suggest that a class of non-adiabatic rapid passage sweeps first realized experimentally in 1991 should be capable of implementing a set of quantum gates that is universal for one-qubit unitary operations and whose elements operate with error probabilities $P_{e}<10^{-4}$. The sweeps are non-composite and generate controllable quantum interference effects which allow the one-qubit gates produced to operate non-adiabatically while maintaining high accuracy. The simulations suggest that the one-qubit gates produced by these sweeps show promise as possible elements of a fault-tolerant scheme for quantum computing.
21

Fukś, Henryk, and Yucen Jin. "Approximating dynamics of a number-conserving cellular automaton by a finite-dimensional dynamical system." International Journal of Modern Physics C 31, no. 12 (October 15, 2020): 2050172. http://dx.doi.org/10.1142/s0129183120501727.

Abstract:
The local structure theory for cellular automata (CA) can be viewed as a finite-dimensional approximation of an infinite-dimensional system. While it is well known that this approximation works surprisingly well for some CA, it is still not clear why this is the case, nor which CA rules have this property. To shed some light on this problem, we present an example of a four-input CA for which the probabilities of occurrence of short blocks of symbols can be computed exactly. The rule is number-conserving and possesses a blocking word. Its local structure approximation correctly predicts the steady-state probabilities of short blocks, and we present a rigorous proof of this fact without resorting to numerical simulations. We conjecture that the number-conserving property, together with the existence of the blocking word, is responsible for the observed perfect agreement between the finite-dimensional approximation and the actual infinite-dimensional dynamical system.
22

ARKIN, HANDAN, FATİH YAŞAR, TARIK ÇELİK, SÜEDA ÇELİK, and HAMİT KÖKSEL. "MOLECULAR MODELING OF TWO HEXAPEPTIDE REPEAT MOTIFS OF HMW GLUTENIN SUBUNITS." International Journal of Modern Physics C 12, no. 02 (February 2001): 281–92. http://dx.doi.org/10.1142/s0129183101001675.

Abstract:
The three-dimensional structures of two hexapeptide repeat motifs (PGQGQQ and SGQGQQ, in one-letter code) in the repetitive central domain of HMW glutenin subunits are investigated using the multicanonical simulation procedure. Ramachandran plots were prepared and analyzed to predict the relative occurrence probabilities of β-turn and γ-turn structures and the helical state. Structural predictions of the PGQGQQ repeat motif indicated the presence of a high level of β-turns and a considerable level of γ-turns. The simulations indicated that these turn structures play an important part in the three-dimensional structures of the repeat motifs in the repetitive central domain of HMW glutenin subunits.
23

Nowacki, Amy S., Wenle Zhao, and Yuko Y. Palesch. "A surrogate-primary replacement algorithm for response-adaptive randomization in stroke clinical trials." Statistical Methods in Medical Research 26, no. 3 (January 12, 2015): 1078–92. http://dx.doi.org/10.1177/0962280214567142.

Abstract:
Response-adaptive randomization (RAR) offers clinical investigators benefit by modifying the treatment allocation probabilities to optimize the ethical, operational, or statistical performance of the trial. Delayed primary outcomes and their effect on RAR have been studied in the literature; however, the incorporation of surrogate outcomes has not been fully addressed. We explore the benefits and limitations of surrogate outcome utilization in RAR in the context of acute stroke clinical trials. We propose a novel surrogate-primary (S-P) replacement algorithm where a patient’s surrogate outcome is used in the RAR algorithm only until their primary outcome becomes available to replace it. Computer simulations investigate the effect of both the delay in obtaining the primary outcome and the underlying surrogate and primary outcome distributional discrepancies on complete randomization, standard RAR and the S-P replacement algorithm methods. Results show that when the primary outcome is delayed, the S-P replacement algorithm reduces the variability of the treatment allocation probabilities and achieves stabilization sooner. Additionally, the S-P replacement algorithm benefit proved to be robust in that it preserved power and reduced the expected number of failures across a variety of scenarios.
24

Allison, Jane R. "Computational methods for exploring protein conformations." Biochemical Society Transactions 48, no. 4 (August 5, 2020): 1707–24. http://dx.doi.org/10.1042/bst20200193.

Abstract:
Proteins are dynamic molecules that can transition between a potentially wide range of structures comprising their conformational ensemble. The nature of these conformations and their relative probabilities are described by a high-dimensional free energy landscape. While computer simulation techniques such as molecular dynamics allow the metastable conformational states, the transitions between them, and thus the free energy landscape to be characterised, the barriers between states can be high, precluding efficient sampling without substantial computational resources. Over the past decades, a dizzying array of methods has emerged for enhancing conformational sampling and for projecting the free energy landscape onto a reduced set of dimensions, known as collective variables (CVs), that allow conformational states to be distinguished and along which sampling may be directed. Here, a brief description of what biomolecular simulation entails is followed by a more detailed exposition of the nature of CVs and methods for determining them, and, lastly, an overview of the myriad approaches for enhancing conformational sampling, most of which rely upon CVs, including new advances in both CV determination and conformational sampling due to machine learning.
25

SEIGNEURIC, R. G., J.-L. CHASSÉ, P. M. AUGER, and A. L. BARDOU. "ROLE OF CELLULAR COUPLING AND DISPERSION OF REFRACTORINESS IN CARDIAC ARRHYTHMIAS: A SIMULATION STUDY." Journal of Biological Systems 07, no. 04 (December 1999): 529–40. http://dx.doi.org/10.1142/s0218339099000309.

Abstract:
Computer simulation is applied to study the role of cellular coupling, dispersion of refractoriness, and the combination of both in the mechanisms underlying cardiac arrhythmias. We first assumed that local ischemia mainly induces cell-to-cell dispersion in the coupling resistance (case 1), refractory period (case 2) or both (case 3). Numerical experiments, based on the van Capelle and Durrer model, showed that vortices could not be induced in these conditions. In order to be more realistic about coronary circulation, we simulated a patchy dispersion of cellular properties, each patch corresponding to the zone irrigated by a small coronary artery. In these conditions, a single activation wave could give rise to abnormal activities. Probabilities of reentry, estimated for the three cases cited above, showed that a severe alteration of the coupling resistance may be an important factor in the genesis of reentry. Moreover, use of isochronal maps revealed that vortices were both stable and sustained with an alteration of coupling alone or along with reductions of action potential duration. Conversely, simulations with reduction of the refractoriness alone induced only transient patterns.
26

MHIRECH, ABDELAZIZ, and ASSIA ALAOUI ISMAILI. "VEHICULAR TRAFFIC FLOW CONTROLLED BY TRAFFIC LIGHT ON A STREET WITH OPEN BOUNDARIES." International Journal of Modern Physics C 24, no. 08 (July 3, 2013): 1350050. http://dx.doi.org/10.1142/s0129183113500502.

Abstract:
The Nagel–Schreckenberg (NS) cellular automata (CA) model for describing vehicular traffic flow in a street with open boundaries is studied. To control the traffic flow, a traffic signal operating on a fixed-time scheme is placed in the middle of the street. Extensive Monte Carlo simulations are carried out to calculate various model characteristics. Essentially, we investigate how the queue of cars that forms behind the traffic light depends on the duration of the green light Tg and on the injecting and extracting probabilities α and β, respectively. Two phases of the average traffic queue were found. Besides, the dependence of the car accident probability per site and per time step on Tg, α and β is computed.
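The setup this abstract describes, an NS road with open boundaries, injection probability α, and a fixed-time light mid-street, can be sketched as below. This is a minimal illustration: the extraction probability β is simplified to free exit at the road end, and all parameter values and names are assumptions.

```python
import random

def ns_light_step(road, vel, vmax, p_slow, light_pos, green, rng):
    """One parallel Nagel-Schreckenberg update; None marks an empty cell.
    During red, cells upstream of the light may not cross it."""
    L = len(road)
    new_road, new_vel = [None] * L, {}
    for i in (j for j in range(L) if road[j] is not None):
        v = min(vel[i] + 1, vmax)                       # 1. accelerate
        gap = next((j - i - 1 for j in range(i + 1, L)
                    if road[j] is not None), L - i - 1)
        if not green and i < light_pos:                 # red light blocks
            gap = min(gap, light_pos - i - 1)
        v = min(v, gap)                                 # 2. brake
        if v > 0 and rng.random() < p_slow:             # 3. random slowdown
            v -= 1
        if i + v < L:                                   # 4. move (or exit)
            new_road[i + v] = True
            new_vel[i + v] = v
    return new_road, new_vel

def simulate_queues(L=100, steps=400, vmax=5, p_slow=0.3,
                    alpha=0.5, Tg=20, seed=7):
    """Track the queue of stopped cars just upstream of the light."""
    rng = random.Random(seed)
    road, vel = [None] * L, {}
    light_pos = L // 2
    queues = []
    for t in range(steps):
        green = (t % (2 * Tg)) < Tg          # fixed-time signal cycle
        if road[0] is None and rng.random() < alpha:
            road[0], vel[0] = True, 0        # open-boundary injection
        road, vel = ns_light_step(road, vel, vmax, p_slow,
                                  light_pos, green, rng)
        q, i = 0, light_pos - 1              # contiguous stopped cars
        while i >= 0 and road[i] is not None and vel[i] == 0:
            q, i = q + 1, i - 1
        queues.append(q)
    return queues
```

Sweeping `Tg` and `alpha` in a driver like this is how the queue-length phases the abstract mentions would be mapped out.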
27

Ilic, Radovan, Darko Lalic, and Srboljub Stankovic. "Srna - Monte Carlo codes for proton transport simulation in combined and voxelized geometries." Nuclear Technology and Radiation Protection 17, no. 1-2 (2002): 27–36. http://dx.doi.org/10.2298/ntrp0202027i.

Abstract:
This paper describes new Monte Carlo codes for proton transport simulations in complex geometrical forms and in materials of different composition. The SRNA codes were developed for three-dimensional (3D) dose distribution calculation in proton therapy and dosimetry. The model of these codes is based on the theory of proton multiple scattering and a simple model of compound nucleus decay. The developed package consists of two codes: SRNA-2KG and SRNA-VOX. The first code simulates proton transport in combined geometry that can be described by planes and second-order surfaces. The second one uses the voxelized geometry of material zones and is specifically adapted for the application of patient computed tomography data. Transition probabilities for both codes are given by the SRNADAT program. In this paper, we present the models and algorithms of our programs, as well as the results of the numerical experiments we have carried out applying them, along with the results of proton transport simulation obtained through the PETRA and GEANT programs. The simulation of the proton beam characterization by means of the Multi-Layer Faraday Cup and the spatial distribution of positron emitters obtained by our program indicate the imminent application of Monte Carlo techniques in clinical practice.
28

Balderama, Orlando F. "Development of a decision support system for small reservoir irrigation systems in rainfed and drought prone areas." Water Science and Technology 61, no. 11 (June 1, 2010): 2779–85. http://dx.doi.org/10.2166/wst.2010.193.

Abstract:
An integrated computer program called Cropping System and Water Management Model (CSWM) with a three-step feature (expert system—simulation—optimization) was developed to address a range of decision support for rainfed farming, i.e. crop selection, scheduling and optimisation. The system was used for agricultural planning with emphasis on sustainable agriculture in the rainfed areas through the use of small farm reservoirs for increased production and resource conservation and management. The application of the model was carried out using crop, soil, and climate and water resource data from the Philippines. Primarily, four sets of data representing the different rainfall classification of the country were collected, analysed, and used as input in the model. Simulations were also done on date of planting, probabilities of wet and dry period and with various capacities of the water reservoir used for supplemental irrigation. Through the analysis, useful information was obtained to determine suitable crops in the region, cropping schedule and pattern appropriate to the specific climate conditions. In addition, optimisation of the use of the land and water resources can be achieved in areas partly irrigated by small reservoirs.
29

Fagiolini, Andrea, Alice Matone, Claudio Gaz, Simona Panunzi, and Andrea De Gaetano. "Pharmacoeconomic comparison of ziprasidone with other atypical oral antipsychotic agents in schizophrenia." Farmeconomia. Health economics and therapeutic pathways 12, no. 1 (November 21, 2011): 29–40. http://dx.doi.org/10.7175/fe.v12i1.96.

Abstract:
Objective: to comparatively investigate – by means of computer simulations – the economic cost and clinical outcomes of five atypical oral antipsychotic agents (ziprasidone, olanzapine, risperidone, paliperidone and aripiprazole). Methods: a cyclical stochastic model representing patient evolution, taking into account the main adverse reactions (akathisia, weight gain and extra-pyramidal ARs), drug efficacy on psychosis stabilization and probability of relapse, was developed. Ten different scenarios were compared, each starting with one of the considered antipsychotics, prescribed either at home or in a hospital setting. Switching to another medication was allowed until no untried drugs were available, in which case clozapine treatment or admission to a Psychiatric Therapeutic Rehabilitation Center was irreversibly assigned. Model inputs were probabilities of ARs, probabilities of stabilization and probabilities of destabilization (assumed equal for all), as well as the costs attributable to drugs, hospitalization, outpatient care and the costs of adverse reactions in terms of concomitant medications. Sources for the inputs were the trials reported in the most recent literature (from the year 2000), selected based on the homogeneity of the observational period and antipsychotic dosage used. Results: in each scenario, the hospitalization cost represented the highest component of the overall cost (approximately 67%). Assuming equal drug effectiveness, ziprasidone fared better than all other considered competitors, showing the lowest average annual costs per patient (and also the lowest average annual hospitalization costs) as well as the largest number of controlled months without adverse reactions, independently of the initial setting. Conclusions: the most important determinant of total cost appears to be hospitalization, whose cost is about 600% higher than the medication costs.
Medication effectiveness and tolerability remain, however, of utmost importance for the patients' well-being and reduction of the hospitalization rate.
30

DIAMOND, P., and I. VLADIMIROV. "BRANCHING PROCESSES AND COMPUTATIONAL COLLAPSE OF DISCRETIZED UNIMODAL MAPPINGS." International Journal of Bifurcation and Chaos 12, no. 12 (December 2002): 2847–67. http://dx.doi.org/10.1142/s0218127402006229.

Abstract:
In computer simulations of smooth dynamical systems, the original phase space is replaced by machine arithmetic, which is a finite set. The resulting spatially discretized dynamical systems do not inherit all functional properties of the original systems, such as surjectivity and existence of absolutely continuous invariant measures. This can lead to computational collapse to fixed points or short cycles. The paper studies loss of such properties in spatial discretizations of dynamical systems induced by unimodal mappings of the unit interval. The problem reduces to studying set-valued negative semitrajectories of the discretized system. As the grid is refined, the asymptotic behavior of the cardinality structure of the semitrajectories follows probabilistic laws corresponding to a branching process. The transition probabilities of this process are explicitly calculated. These results are illustrated by the example of the discretized logistic mapping.
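The computational collapse this abstract describes is easy to reproduce: discretizing the logistic map onto a finite grid forces every trajectory into a cycle, whose length can be found by recording first-visit times. This is a generic sketch of the phenomenon, not the authors' branching-process analysis.

```python
def discretized_cycle_length(N, a=4.0, k0=None):
    """Iterate the logistic map x -> a*x*(1-x) on the uniform grid
    {0, 1/N, ..., 1} and return the length of the cycle the finite
    trajectory eventually collapses onto."""
    def step(k):
        x = k / N
        return round(a * x * (1 - x) * N)   # round back onto the grid
    k = k0 if k0 is not None else N // 3
    seen, t = {}, 0
    while k not in seen:                     # finite state space, so the
        seen[k] = t                          # orbit must repeat eventually
        k = step(k)
        t += 1
    return t - seen[k]
```

Because the grid has only N + 1 states, the cycle length is bounded by N + 1, and in practice it is typically far shorter, which is the collapse the paper quantifies probabilistically.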
31

Kalinowski, Steven T. "A Graphical Method for Displaying the Model Fit of Item Response Theory Trace Lines." Educational and Psychological Measurement 79, no. 6 (May 16, 2019): 1064–74. http://dx.doi.org/10.1177/0013164419846234.

Abstract:
Item response theory (IRT) is a statistical paradigm for developing educational tests and assessing students. IRT, however, currently lacks an established graphical method for examining model fit for the three-parameter logistic model, the most flexible and popular IRT model in educational testing. A method is presented here to do this. The graph, which is referred to herein as a “bin plot,” is the IRT equivalent of a scatterplot for linear regression. Bin plots display a conventional IRT trace line (with ability on the horizontal axis and probability correct on the vertical axis). Students are binned according to how well they performed on the entire test, and the proportion of students in each bin who answered the focal question correctly is displayed on the graph as points above or below the trace line. With this arrangement, the difference between each point and the trace line is the residual for the bin. Confidence intervals can be added to the observed proportions in order to display uncertainty. Computer simulations were used to test four alternative ways of binning students. These simulations showed that binning students according to the number of questions they answered correctly on the entire test works best. Simulations also showed that confidence intervals for bin plots had coverage probabilities close to nominal values for common testing scenarios, but that there are scenarios in which confidence intervals had inflated error rates.
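The data points of a bin plot can be sketched as below. This is illustrative only: the simulated 3PL item bank, bin width, and minimum bin count are assumptions, not the article's design.

```python
import math
import random

def p_3pl(theta, a, b, c):
    """Three-parameter logistic trace line: P(correct | ability theta)."""
    return c + (1 - c) / (1 + math.exp(-a * (theta - b)))

def bin_plot_points(n_students=2000, n_items=30, bin_width=3,
                    min_count=20, seed=3):
    """Simulate 3PL responses and return, per number-correct score bin,
    the observed proportion correct on the focal item (item 0)."""
    rng = random.Random(seed)
    items = [(rng.uniform(0.8, 2.0), rng.gauss(0.0, 1.0), 0.2)
             for _ in range(n_items)]          # hypothetical item bank
    bins = {}
    for _ in range(n_students):
        theta = rng.gauss(0.0, 1.0)
        resp = [1 if rng.random() < p_3pl(theta, *it) else 0
                for it in items]
        # bin on the total number-correct score, as the article recommends
        bins.setdefault(sum(resp) // bin_width, []).append(resp[0])
    return [(b, sum(f) / len(f)) for b, f in sorted(bins.items())
            if len(f) >= min_count]            # drop sparse bins
```

Plotting these observed proportions against the fitted trace line, with binomial confidence intervals on each bin, gives the residual display the abstract describes.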
32

Möller, Peter. "Essentials of the macroscopic-microscopic folded-Yukawa approach and examples of its record in providing nuclear-structure data for simulations." EPJ Web of Conferences 184 (2018): 01013. http://dx.doi.org/10.1051/epjconf/201818401013.

Abstract:
The macroscopic-microscopic model based on the folded-Yukawa single-particle potential and a “finite-range” macroscopic model is probably the approach that has provided the most reliable predictions of a large number of nuclear-structure properties for all nuclei between the proton and neutron drip lines. I will describe some basic features of the model and the development philosophy that may be the reason for its success. Examples of quantities modeled within the same model framework are nuclear masses, ground-state level structure, including spins, ground-state shapes, fission barriers, heavy-ion fusion barriers, sub-barrier fusion cross sections, β-decay half-lives and delayed neutron emission probabilities, shape coexistence, and α-decay Qα energies, to name a few. I will show how well it predicted various properties that were measured after the results were published. Rather than giving an incomplete model description here, I will give a timeline of model development and provide references to typical applications, references that are sufficiently complete that several individuals have written computer codes based on them, codes whose results are in excellent agreement with ours.
33

Salathé, Marcel, and Sebastian Bonhoeffer. "The effect of opinion clustering on disease outbreaks." Journal of The Royal Society Interface 5, no. 29 (August 19, 2008): 1505–8. http://dx.doi.org/10.1098/rsif.2008.0271.

Abstract:
Many high-income countries currently experience large outbreaks of vaccine-preventable diseases such as measles despite the availability of highly effective vaccines. This phenomenon lacks an explanation in countries where vaccination rates are rising on an already high level. Here, we build on the growing evidence that belief systems, rather than access to vaccines, are the primary barrier to vaccination in high-income countries, and show how a simple opinion formation process can lead to clusters of unvaccinated individuals, leading to a dramatic increase in disease outbreak probability. In particular, the effect of clustering on outbreak probabilities is strongest when the vaccination coverage is close to the level required to provide herd immunity under the assumption of random mixing. Our results based on computer simulations suggest that the current estimates of vaccination coverage necessary to avoid outbreaks of vaccine-preventable diseases might be too low.
34

WANG, JIAN-WEI, and LI-LI RONG. "CASCADING FAILURES IN BARABÁSI–ALBERT SCALE-FREE NETWORKS WITH A BREAKDOWN PROBABILITY." International Journal of Modern Physics C 20, no. 04 (April 2009): 585–95. http://dx.doi.org/10.1142/s0129183109013819.

Abstract:
In this paper, adopting the initial load of a node j to be Lj = kj^α, where kj is the degree of node j and α is a tunable parameter that controls the strength of the initial load of a node, we propose a cascading model with a breakdown probability and explore cascading failures on a typical network, i.e., the Barabási–Albert (BA) network with the scale-free property. We assume that a failed node leads only to a redistribution of the load passing through it to its neighboring nodes. According to the simulation results, we find that BA networks reach the strongest robustness level against cascading failures when α = 1 and that the robustness of networks is positively correlated with the average degree 〈k〉, regardless of the breakdown probability. In addition, it is found that the robustness against cascading failures is inversely related to the breakdown probability of an overloaded node. Finally, the numerical simulations are verified by the theoretical analysis.
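A minimal sketch of this type of cascading model is shown below, with a pure-Python preferential-attachment graph plus load redistribution. The capacity rule cap = (1 + tolerance) · load and all parameter values are assumptions for illustration, not the paper's exact specification.

```python
import random

def ba_graph(n, m, seed=0):
    """Barabasi-Albert preferential attachment (pure-Python sketch)."""
    rng = random.Random(seed)
    adj = {i: set() for i in range(n)}
    targets = list(range(m))
    repeated = []                      # nodes listed once per incident edge
    for v in range(m, n):
        for u in targets:
            adj[v].add(u)
            adj[u].add(v)
        repeated.extend(targets)
        repeated.extend([v] * m)
        targets = []
        while len(targets) < m:        # degree-biased sampling, m distinct
            u = rng.choice(repeated)
            if u not in targets:
                targets.append(u)
    return adj

def cascade(adj, alpha=1.0, tolerance=0.4, p_break=1.0, start=0, seed=5):
    """Fail `start`, redistribute its load to live neighbours, and let any
    overloaded neighbour break down with probability `p_break`."""
    rng = random.Random(seed)
    load = {v: len(nb) ** alpha for v, nb in adj.items()}  # initial load k^alpha
    cap = {v: (1 + tolerance) * load[v] for v in adj}      # assumed capacity rule
    failed, frontier = {start}, [start]
    while frontier:
        nxt = []
        for v in frontier:
            alive = [u for u in adj[v] if u not in failed]
            if not alive:
                continue
            share = load[v] / len(alive)   # redistribute the failed load
            for u in alive:
                load[u] += share
                if load[u] > cap[u] and rng.random() < p_break:
                    failed.add(u)
                    nxt.append(u)
        frontier = nxt
    return len(failed)                     # total cascade size
```

Averaging the cascade size over many start nodes while sweeping `alpha` and `p_break` is the kind of experiment behind the robustness curves the abstract reports.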
35

Yang, Dan, Liming Pan, Zhidan Zhao, and Tao Zhou. "Identifying the Influential Latent Edges for Promoting the Co-SIR Model." Complexity 2021 (March 24, 2021): 1–11. http://dx.doi.org/10.1155/2021/6614545.

Abstract:
The network-based cooperative information spreading is a widely existing phenomenon in the real world. For instance, the spreading of disease outbreak news and disease prevention information often coexist and interact with each other on the Internet. Promoting the cooperative spreading of information in network-based systems is a subject of great importance from both theoretical and practical perspectives. However, very limited attention has been paid to this specific research area so far. In this study, we propose an effective approach for identifying the influential latent edges (that is, the edges that do not originally exist) which, if added to the original network, can promote the cooperative susceptible-infected-recovered (co-SIR) dynamics. To be specific, we first obtain the probabilities of each node being in different states by the message-passing approach. Then, based on the state probabilities of the nodes obtained, we come up with an indicator, which incorporates both the information of network topology and the co-SIR dynamics, to measure the influence of each latent edge in promoting the co-SIR dynamics. Thus, the most influential latent edges can be located after ranking all the latent edges according to their quantified influence. We verify the rationality and superiority of the proposed indicator in identifying the influential latent edges of both synthetic and real-world networks by extensive numerical simulations. This study provides an effective approach to identify the influential latent edges for promoting the network-based co-SIR information spreading model and offers inspiration for further research on intervening in the cooperative spreading dynamics from the perspective of performing network structural perturbations.
36

Kim, Hyun Woo, and Daewoong Kwon. "Analysis on Tunnel Field-Effect Transistor with Asymmetric Spacer." Applied Sciences 10, no. 9 (April 27, 2020): 3054. http://dx.doi.org/10.3390/app10093054.

Abstract:
Tunnel field-effect transistor (Tunnel FET) with an asymmetric spacer is proposed to obtain high on-current and reduced inverter delay simultaneously. In order to analyze the proposed Tunnel FET, electrical characteristics are evaluated by technology computer-aided design (TCAD) simulations with calibrated tunneling model parameters. The impact of the spacer κ values on the tunneling rate is investigated with the symmetric spacer. As the κ value of the spacer increases, the on-current is enhanced since tunneling probabilities are increased by the fringing field through the spacer. However, on the drain side, the fringing field through the drain-side spacer increases the ambipolar current and gate-to-drain capacitance, which degrades the leakage property and switching response. Therefore, a drain-side low-κ spacer, which produces a weak fringing field, is adopted asymmetrically with a source-side high-κ spacer. This asymmetric spacer results in the reduction of gate-to-drain capacitance and switching delay, with the improved on-current induced by the source-side high-κ spacer.
37

German, Paul W., and Howard L. Fields. "How Prior Reward Experience Biases Exploratory Movements: A Probabilistic Model." Journal of Neurophysiology 97, no. 3 (March 2007): 2083–93. http://dx.doi.org/10.1152/jn.00303.2006.

Abstract:
Animals return to rewarded locations. An example of this is conditioned place preference (CPP), which is widely used in studies of drug reward. Although CPP is expressed as increased time spent in a previously rewarded location, the behavioral strategy underlying this change is unknown. We continuously monitored rats (n = 22) in a three-room in-line configuration, before and after morphine conditioning in one end room. Although sequential room visit durations were variable, their probability distribution was exponential, indicating that the processes controlling visit durations can be modeled by instantaneous room exit probabilities. Further analysis of room transitions and computer simulations of probabilistic models revealed that the exploratory bias toward the morphine room is best explained by an increase in the probability of a subset of rapid, direct transitions from the saline- to the morphine-paired room through the central room. This finding sharply delineates and constrains possible neural mechanisms for a class of self-initiated, goal-directed behaviors toward previously rewarded locations.
38

Watanabe, K., and S. G. Tzafestas. "Stochastic Control for Systems With Faulty Sensors." Journal of Dynamic Systems, Measurement, and Control 112, no. 1 (March 1, 1990): 143–47. http://dx.doi.org/10.1115/1.2894131.

Abstract:
The problem of control of linear discrete-time stochastic systems with faulty sensors is considered. The sensor anomalies are assumed to be modeled by a finite-state Markov chain whose transition probabilities are completely known. A passive-type multiple model adaptive control (MMAC) law is developed by applying a new generalized pseudo-Bayes algorithm (GPBA), which is based on an n-step measurement update method. The present and other existing algorithms are compared through Monte Carlo simulations. It is then shown that, for a case of only measurement noise uncertainty (i.e., a case when the certainty equivalence principle holds), the proposed MMAC has better control performance than MMACs based on other existing GPBAs.
39

Rao, G. Srinivasa, Sauda Mbwambo, and P. K. Josephat. "Estimation of Stress–Strength Reliability from Exponentiated Inverse Rayleigh Distribution." International Journal of Reliability, Quality and Safety Engineering 26, no. 01 (February 2019): 1950005. http://dx.doi.org/10.1142/s0218539319500050.

Abstract:
This paper considers the estimation of stress–strength reliability when the strength and stress follow two independent exponentiated inverse Rayleigh distributions with different shape parameters and a common scale parameter. The maximum likelihood estimator (MLE) of the reliability, its asymptotic distribution and asymptotic confidence intervals are constructed. Comparisons of the performance of the estimators are carried out using Monte Carlo simulations in terms of the mean squared error (MSE), bias, average length and coverage probabilities. Finally, a demonstration is given of how the proposed reliability model may be applied to the analysis of strength data for single carbon fibers.
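If the exponentiated inverse Rayleigh cdf is taken in the proportional-reversed-hazard form F(x) = exp(−θλ/x²) with shape θ, then with a common scale the reliability reduces to R = P(Y < X) = α/(α + β), which a Monte Carlo sketch can confirm. This is an illustration assuming that parameterization, not the paper's MLE-based estimator.

```python
import math
import random

def eir_sample(shape, lam, rng):
    """Inverse-transform draw from F(x) = exp(-shape * lam / x**2), x > 0."""
    u = rng.random()
    while u == 0.0:          # guard against log(0)
        u = rng.random()
    return math.sqrt(-shape * lam / math.log(u))

def reliability_mc(alpha, beta, lam=1.0, n=200_000, seed=11):
    """Monte Carlo estimate of R = P(Y < X) for strength X ~ EIR(alpha, lam)
    and stress Y ~ EIR(beta, lam); closed form: alpha / (alpha + beta)."""
    rng = random.Random(seed)
    hits = sum(1 for _ in range(n)
               if eir_sample(beta, lam, rng) < eir_sample(alpha, lam, rng))
    return hits / n
```

The closed form follows because both cdfs are powers of the same base cdf G(x) = exp(−λ/x²), so P(Y < X) = ∫ G^β d(G^α) = α/(α + β), independent of the common scale λ.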
40

Ho-Van, Khuong, and Thiem Do-Dac. "Relay Selection for Security Improvement in Cognitive Radio Networks with Energy Harvesting." Wireless Communications and Mobile Computing 2021 (June 19, 2021): 1–16. http://dx.doi.org/10.1155/2021/9921782.

Abstract:
This paper selects an unlicensed relay among available self-powered relays to not only sustain but also secure information transmission from an unlicensed source to an unlicensed destination. The relays harvest energy from the signals of the unlicensed source and the licensed transmitter. Then, they spend the harvested energy on their relaying operation. Conditioned on the licensed outage restriction, the peak transmission power restriction, Rayleigh fading, and the licensed interference, the current paper proposes an exact closed-form formula for the secrecy outage probability to quickly evaluate the secrecy performance of the proposed relay selection method in cognitive radio networks with energy harvesting. The proposed formula is corroborated by computer simulations. Several results illustrate the effectiveness of the relay selection in securing information transmission. Additionally, the security capability saturates at large peak transmission powers or large preset outage probabilities of licensed users. Furthermore, the security capability depends on many specifications, among which the power splitting ratio, the relays’ positions, and the time switching ratio can be optimally selected to obtain the best security performance.
41

Fan, Chongjun, Yang Jin, Liang-An Huo, Chen Liu, and Yunpeng Yang. "Epidemic spreading of interacting diseases with activity of nodes reshapes the critical threshold." International Journal of Modern Physics C 28, no. 01 (January 2017): 1750013. http://dx.doi.org/10.1142/s0129183117500139.

Abstract:
In this paper, based on the susceptible–infected–susceptible (SIS) scheme, we introduce a framework that allows us to describe the spreading dynamics of two interacting diseases with active nodes. Different from previous studies, the two diseases, propagating concurrently on the same population, can interact with each other by modifying their transmission rates. Meanwhile, according to certain probabilities, each node in the complex network alternates between an active state and an inactive state. Based on a heterogeneous mean-field approach, we analyze the epidemic thresholds of the two diseases and compute the temporal evolution characterizing the spreading dynamics. In addition, we validate these theoretical predictions by numerical simulations with phase diagrams. Results show that the secondary thresholds for the two opposite scenarios (mutual enhancement and mutual impairment) are different. We also find that the value of the critical threshold and the final size of the spreading dynamics are reduced as the node activity rate decreases.
42

Lin, Chun-Li, and Che An Pai. "NUMERICAL INVESTIGATION OF FAILURE RISK OF CAD/CAM CERAMIC RESTORATION FOR AN ENDODONTICALLY TREATED MAXILLARY PREMOLAR WITH MO PREPARATION." Biomedical Engineering: Applications, Basis and Communications 22, no. 04 (August 2010): 327–35. http://dx.doi.org/10.4015/s1016237210002043.

Abstract:
This study evaluates the risk of failure for an endodontically treated premolar with mesio-occlusal (MO) preparation and four different computer-aided design/manufacturing (CAD/CAM) ceramic restoration configurations. Four three-dimensional finite element (FE) models designed with CAD/CAM ceramic inlay, endoinlay, endocrown, and classical crown restorations were constructed to perform simulations. The Weibull function was incorporated with an FE analysis to calculate the long-term failure probability relative to different load conditions. The results indicated that the stress values on the enamel, dentin, and luting cement for the endocrown restoration were the lowest relative to the other three restorations. According to the Weibull analysis, overall failure probabilities were found at 100, 100, 1 and 1% for the inlay, endoinlay, endocrown, and classical crown restorations, respectively, under normal biting. The corresponding values for clenching were over 100% for inlay and endoinlay restorations and about 87 and 70% for the endocrown and classical crown, respectively. This numerical investigation suggests that endocrown and classical crown restorations for endodontically treated premolars with MO preparation present similar longevity.
43

Leibold, Christian, and Michael H. K. Bendels. "Learning to Discriminate Through Long-Term Changes of Dynamical Synaptic Transmission." Neural Computation 21, no. 12 (December 2009): 3408–28. http://dx.doi.org/10.1162/neco.2009.12-08-929.

Abstract:
Short-term synaptic plasticity is modulated by long-term synaptic changes. There is, however, no general agreement on the computational role of this interaction. Here, we derive a learning rule for the release probability and the maximal synaptic conductance in a circuit model with combined recurrent and feedforward connections that allows learning to discriminate among natural inputs. Short-term synaptic plasticity thereby provides a nonlinear expansion of the input space of a linear classifier, whereas the random recurrent network serves to decorrelate the expanded input space. Computer simulations reveal that the twofold increase in the number of input dimensions through short-term synaptic plasticity improves the performance of a standard perceptron up to 100%. The distributions of release probabilities and maximal synaptic conductances at the capacity limit strongly depend on the balance between excitation and inhibition. The model also suggests a new computational interpretation of spikes evoked by stimuli outside the classical receptive field. These neuronal activities may reflect decorrelation of the expanded stimulus space by intracortical synaptic connections.
44

Huynh, Tan-Phuoc, Pham Ngoc Son, and Miroslav Voznak. "Secrecy Performance of Underlay Cooperative Cognitive Network Using Non-Orthogonal Multiple Access with Opportunistic Relay Selection." Symmetry 11, no. 3 (March 15, 2019): 385. http://dx.doi.org/10.3390/sym11030385.

Abstract:
In this paper, an underlay cooperative cognitive network using non-orthogonal multiple access (UCCN-NOMA) is investigated, in which intermediate relays help to decode and forward two signals x1 and x2 from a source node to two users D1 and D2, respectively, under wiretapping by an eavesdropper (E). We study the best relay selection strategies under three relay selection criteria: the first and second select the relay with the maximum channel gain on the links Ri–D1 and Ri–D2, respectively; the third ensures a minimum value of the channel gain on the Ri–E links. We analyze and evaluate the secrecy performance of the transmissions of x1 and x2 from the source node to the destination nodes D1 and D2, respectively, in the proposed UCCN-NOMA system in terms of the secrecy outage probabilities (SOPs) over Rayleigh fading channels. Simulation and analysis results are presented as follows. The results for the (sum) secrecy outage probability show that the proposed scheme can realize the maximal diversity gain. The security of the system is very good when the eavesdropper node E is far from the source and the cooperative relays. Finally, the theoretical analyses are verified by Monte Carlo simulations.
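The first selection criterion and the resulting secrecy outage probability can be estimated with a quick Monte Carlo sketch. Rayleigh fading gives exponentially distributed channel gains; the average SNRs, target secrecy rate, and function names below are illustrative assumptions, not the paper's system parameters.

```python
import math
import random

def secrecy_outage_mc(n_relays=4, snr_d=10.0, snr_e=2.0,
                      rate_s=0.5, trials=100_000, seed=2):
    """MC sketch of criterion 1: pick the relay with the largest
    relay->destination gain and count a secrecy outage whenever
    log2((1 + g_D) / (1 + g_E)) falls below the target secrecy rate."""
    rng = random.Random(seed)
    outages = 0
    for _ in range(trials):
        # exponential gains (Rayleigh fading), scaled by the average SNRs
        g_d = [rng.expovariate(1.0) * snr_d for _ in range(n_relays)]
        g_e = [rng.expovariate(1.0) * snr_e for _ in range(n_relays)]
        k = max(range(n_relays), key=g_d.__getitem__)   # best-relay index
        if math.log2((1 + g_d[k]) / (1 + g_e[k])) < rate_s:
            outages += 1
    return outages / trials
```

Comparing the estimate against `n_relays=1` shows the selection gain that the closed-form SOP analysis in the paper quantifies exactly.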
45

HERINGA, J. R., H. W. J. BLÖTE, and A. HOOGLAND. "CRITICAL PROPERTIES OF 3D ISING SYSTEMS WITH NON-HAMILTONIAN DYNAMICS." International Journal of Modern Physics C 05, no. 03 (June 1994): 589–98. http://dx.doi.org/10.1142/s0129183194000763.

Abstract:
We investigate two three-dimensional Ising models with non-Hamiltonian Glauber dynamics. The transition probabilities of these models can, just as in the case of equilibrium models, be expressed in terms of Boltzmann factors depending only on the interacting spins and the bond strengths. However, the bond strength associated with each lattice edge assumes different values for the two spins involved. The first model has cubic symmetry and consists of two sublattices at different temperatures. In the second model a preferred direction is present. These two models are investigated by Monte Carlo simulations on the Delft Ising System Processor. Both models undergo a phase transition between an ordered and a disordered state. Their critical properties agree with Ising universality. The second model displays magnetization bistability.
46

Thall, Peter F., Richard C. Herrick, Hoang Q. Nguyen, John J. Venier, and J. Clift Norris. "Effective sample size for computing prior hyperparameters in Bayesian phase I–II dose-finding." Clinical Trials 11, no. 6 (September 1, 2014): 657–66. http://dx.doi.org/10.1177/1740774514547397.

Abstract:
Background: The efficacy–toxicity trade-off based design is a practical Bayesian phase I–II dose-finding methodology. Because the design’s performance is very sensitive to prior hyperparameters and the shape of the target trade-off contour, specifying these two design elements properly is essential. Purpose: The goals are to provide a method that uses elicited mean outcome probabilities to derive a prior that is neither overly informative nor overly disperse, and practical guidelines for specifying the target trade-off contour. Methods: A general algorithm is presented that determines prior hyperparameters using least squares penalized by effective sample size. Guidelines for specifying the trade-off contour are provided. These methods are illustrated by a clinical trial in advanced prostate cancer. A new version of the efficacy–toxicity program is provided for implementation. Results: Together, the algorithm and guidelines provide substantive improvements in the design’s operating characteristics. Limitations: The method requires a substantial number of elicited values and design parameters, and computer simulations are required to obtain an acceptable design. Conclusion: The two key improvements greatly enhance the efficacy–toxicity design’s practical usefulness and are straightforward to implement using the updated computer program. The algorithm for determining prior hyperparameters to ensure a specified level of informativeness is general, and may be applied to models other than that underlying the efficacy–toxicity method.
APA, Harvard, Vancouver, ISO, and other styles
47

Squires, John R., Kevin S. McKelvey, and Leonard F. Ruggiero. "A Snow-tracking Protocol Used to Delineate Local Lynx, Lynx canadensis, Distributions." Canadian Field-Naturalist 118, no. 4 (October 1, 2004): 583. http://dx.doi.org/10.22621/cfn.v118i4.60.

Full text
Abstract:
Determining Canada Lynx (Lynx canadensis) distribution is an important management need, especially at the southern extent of the species' range, where it is listed as threatened under the U.S. Endangered Species Act. We describe a systematic snow-track-based sampling framework that provides reliable distribution data for Canada Lynx. We used computer simulations to evaluate protocol efficacy. Based on these simulations, the probability of detecting lynx tracks during a single visit (8 km transect) to a survey unit ranged from approximately 0.23 for surveys conducted only one day after snowfall to 0.78 for surveys conducted 7 days after a snowfall. If the survey effort was increased to three visits, detection probabilities increased substantially, from 0.58 for one day after snowfall to about 0.95 for surveys conducted 7 days after a snowfall. We tested the protocol in the Garnet Range, Montana, where most lynx were radio-collared. We documented a total of 189 lynx tracks during two winters (2001-2003). Lynx distribution based on snow-track surveys was coincident with the area defined through radio telemetry. Additionally, we conducted snow-track surveys in areas of western Wyoming where lynx were believed present but scarce. We detected a total of six lynx tracks during three winters (1999-2002). In Wyoming, where lynx presence was inferred from a few tracks, we verified species identification by securing genetic samples (hairs from daybeds) along track-lines.
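If repeat visits are treated as independent, the multi-visit detection probability follows the standard complement rule 1 − (1 − p)^k. That simple formula lands close to, but not exactly on, the simulated values quoted above (about 0.54 vs. 0.58 at p = 0.23, and 0.99 vs. 0.95 at p = 0.78), suggesting the visits in the simulations were not fully independent; the sketch below is therefore only a first approximation.

```python
def cumulative_detection(p_single, visits):
    """Probability of at least one detection over `visits` surveys, assuming
    each visit independently detects tracks with probability p_single."""
    return 1.0 - (1.0 - p_single) ** visits

# Reported single-visit probabilities: 0.23 (one day after snowfall) and
# 0.78 (7 days after).  Three independent visits give roughly 0.54 and 0.99.
three_visit = [cumulative_detection(p, 3) for p in (0.23, 0.78)]
```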
APA, Harvard, Vancouver, ISO, and other styles
48

Rykova, Tatiana, Barış Göktepe, Thomas Schierl, Konstantin Samouylov, and Cornelius Hellge. "Analytical Model and Feedback Predictor Optimization for Combined Early-HARQ and HARQ." Mathematics 9, no. 17 (August 31, 2021): 2104. http://dx.doi.org/10.3390/math9172104.

Full text
Abstract:
In order to fulfill the stringent Ultra-Reliable Low Latency Communication (URLLC) requirements of Fifth Generation (5G) mobile networks, early-Hybrid Automatic Repeat reQuest (e-HARQ) schemes have been introduced, aimed at providing faster feedback and thus earlier retransmission. The performance of e-HARQ prediction strongly depends on the classification mechanism, the data length, and the threshold value. In this paper, we propose an analytical model that incorporates e-HARQ and Hybrid Automatic Repeat reQuest (HARQ) functionalities as two phases in discrete time. The model provides a fast and accurate way to obtain the main performance measures and to apply optimization analysis to find the optimal values used in the predictor's classification. We employ realistic data for the transition probabilities, obtained by means of 5G link-level simulations, and conduct extensive experimental analysis. The results show that at a false positive probability of 10⁻¹, e-HARQ prediction with the found optimal parameters can achieve around 20% gain over HARQ at a False Negative (FN) probability of 10⁻¹, and around 7.5% gain at an FN of 10⁻³, in terms of the mean time before successful delivery.
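The false-positive/false-negative trade-off that drives the predictor's threshold optimization can be sketched with a scalar threshold classifier on synthetic Gaussian scores. This is an assumption for illustration only: the paper's predictor classifies partially received codewords using 5G link-level data, and the score distributions, function names, and parameters below are hypothetical.

```python
from statistics import NormalDist

def error_rates(threshold, mu_ok=1.0, mu_bad=-1.0, sigma=1.0):
    """FP/FN rates of a threshold predictor on synthetic scores: decodable
    transmissions score ~ N(mu_ok, sigma), undecodable ones ~ N(mu_bad,
    sigma); predict 'will decode' whenever the score exceeds threshold."""
    fp = 1.0 - NormalDist(mu_bad, sigma).cdf(threshold)  # failure passes as success
    fn = NormalDist(mu_ok, sigma).cdf(threshold)         # success flagged as failure
    return fp, fn

def threshold_for_fp(target_fp, mu_bad=-1.0, sigma=1.0):
    """Invert fp = 1 - Phi((t - mu_bad) / sigma) to hit a target FP rate."""
    return NormalDist(mu_bad, sigma).inv_cdf(1.0 - target_fp)
```

Raising the threshold lowers FP at the cost of FN, which is exactly the trade-off the paper's optimization analysis resolves by choosing threshold and data length jointly.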
APA, Harvard, Vancouver, ISO, and other styles
49

Li, Xiaoou, and Jingchen Liu. "Rare-event simulation and efficient discretization for the supremum of Gaussian random fields." Advances in Applied Probability 47, no. 03 (September 2015): 787–816. http://dx.doi.org/10.1017/s0001867800048837.

Full text
Abstract:
In this paper we consider a classic problem concerning the high excursion probabilities of a Gaussian random field f living on a compact set T. We develop efficient computational methods for the tail probabilities ℙ{sup_T f(t) > b}. For each positive ε, we present Monte Carlo algorithms that run in constant time and compute the probabilities with relative error ε for arbitrarily large b. The efficiency results are applicable to a large class of Hölder continuous Gaussian random fields. Besides computations, the change of measure and its analysis techniques have several theoretical and practical implications in the asymptotic analysis of Gaussian random fields.
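The change-of-measure idea can be illustrated on the crudest possible "field": i.i.d. standard normal values on a finite grid, where the target probability 1 − Φ(b)^n is known exactly. Under the importance measure, one grid point chosen uniformly is forced above b, and the likelihood ratio becomes n·P(X > b) divided by the number of exceeding points. This mirrors the mixture measures used for Gaussian fields but is only a sketch under a white-noise assumption; the paper treats correlated Hölder continuous fields on a continuum index set, with discretization error controlled separately.

```python
import random
from statistics import NormalDist

N01 = NormalDist()

def is_estimate(n_points, b, n_samples, seed=0):
    """Importance-sampling estimate of P(max_i X_i > b) for n_points i.i.d.
    N(0,1) values.  Each sample forces one uniformly chosen coordinate
    above b; the unbiased likelihood-ratio weight is
    n_points * P(X > b) / #{coordinates >= b}."""
    rng = random.Random(seed)
    tail = 1.0 - N01.cdf(b)
    total = 0.0
    for _ in range(n_samples):
        x = [rng.gauss(0.0, 1.0) for _ in range(n_points)]
        j = rng.randrange(n_points)
        # Redraw coordinate j from its conditional law given X_j > b,
        # by restricting the inverse cdf to the upper tail.
        x[j] = N01.inv_cdf(N01.cdf(b) + rng.random() * tail)
        exceed = sum(1 for v in x if v >= b)  # >= 1 by construction
        total += n_points * tail / exceed
    return total / n_samples
```

Because every sample already lies in the rare event, the weights stay between P(X > b) and n·P(X > b), giving bounded relative error for arbitrarily large b; plain Monte Carlo would need on the order of 1/P samples.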
APA, Harvard, Vancouver, ISO, and other styles
50

Li, Xiaoou, and Jingchen Liu. "Rare-event simulation and efficient discretization for the supremum of Gaussian random fields." Advances in Applied Probability 47, no. 3 (September 2015): 787–816. http://dx.doi.org/10.1239/aap/1444308882.

Full text
Abstract:
In this paper we consider a classic problem concerning the high excursion probabilities of a Gaussian random field f living on a compact set T. We develop efficient computational methods for the tail probabilities ℙ{sup_T f(t) > b}. For each positive ε, we present Monte Carlo algorithms that run in constant time and compute the probabilities with relative error ε for arbitrarily large b. The efficiency results are applicable to a large class of Hölder continuous Gaussian random fields. Besides computations, the change of measure and its analysis techniques have several theoretical and practical implications in the asymptotic analysis of Gaussian random fields.
APA, Harvard, Vancouver, ISO, and other styles