Click this link to see other types of publications on this topic: Sequential Monte Carlo (SMC) method.

Journal articles on the topic "Sequential Monte Carlo (SMC) method"

Create a correct reference in APA, MLA, Chicago, Harvard, and many other styles

Choose the type of source:

Check out the top 50 journal articles on the topic "Sequential Monte Carlo (SMC) method".

An "Add to bibliography" button is available next to each work in the bibliography. Use it, and we will automatically generate the bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the scholarly publication as a ".pdf" file and read its abstract online, whenever such details are available in the metadata.

Browse journal articles from a wide range of disciplines and compile an accurate bibliography.

1

Wang, Liangliang, Shijia Wang, and Alexandre Bouchard-Côté. "An Annealed Sequential Monte Carlo Method for Bayesian Phylogenetics". Systematic Biology 69, no. 1 (6.06.2019): 155–83. http://dx.doi.org/10.1093/sysbio/syz028.

Full text of the source
Abstract:
We describe an “embarrassingly parallel” method for Bayesian phylogenetic inference, annealed Sequential Monte Carlo (SMC), based on recent advances in the SMC literature such as adaptive determination of annealing parameters. The algorithm provides an approximate posterior distribution over trees and evolutionary parameters as well as an unbiased estimator for the marginal likelihood. This unbiasedness property can be used for the purpose of testing the correctness of posterior simulation software. We evaluate the performance of phylogenetic annealed SMC by reviewing and comparing with other computational Bayesian phylogenetic methods, in particular, different marginal likelihood estimation methods. Unlike previous SMC methods in phylogenetics, our annealed method can utilize standard Markov chain Monte Carlo (MCMC) tree moves and hence benefit from the large inventory of such moves available in the literature. Consequently, the annealed SMC method should be relatively easy to incorporate into existing phylogenetic software packages based on MCMC algorithms. We illustrate our method using simulation studies and real data analysis.
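
As a rough illustration of the likelihood-tempered scheme summarized in this abstract, the following minimal Python sketch shows a generic annealed SMC sampler. It is not the authors' implementation: the callbacks sample_prior, log_likelihood and mcmc_move and all defaults are placeholders, and the annealing step is chosen here by a simple bisection on the effective sample size.

import numpy as np

def annealed_smc(sample_prior, log_likelihood, mcmc_move,
                 n_particles=1000, ess_ratio=0.5, seed=0):
    # sample_prior(n, rng)            -> (n, d) array of particles drawn from the prior
    # log_likelihood(particles)       -> length-n array of log-likelihood values
    # mcmc_move(particles, beta, rng) -> particles perturbed by an MCMC kernel targeting
    #                                    prior(x) * likelihood(x)**beta
    rng = np.random.default_rng(seed)
    x = sample_prior(n_particles, rng)
    beta, log_Z = 0.0, 0.0
    while beta < 1.0:
        ll = log_likelihood(x)

        def ess(step):
            # effective sample size of the incremental weights for a candidate step
            w = np.exp(step * ll - (step * ll).max())
            w /= w.sum()
            return 1.0 / np.sum(w ** 2)

        lo, hi = 0.0, 1.0 - beta
        if ess(hi) >= ess_ratio * n_particles:
            step = hi                          # we can jump straight to beta = 1
        else:
            for _ in range(40):                # bisect for the largest acceptable step
                mid = 0.5 * (lo + hi)
                lo, hi = (mid, hi) if ess(mid) >= ess_ratio * n_particles else (lo, mid)
            step = max(lo, 1e-6)               # avoid stalling at step = 0
        beta = min(1.0, beta + step)

        logw = step * ll                                        # incremental log-weights
        log_Z += logw.max() + np.log(np.mean(np.exp(logw - logw.max())))
        w = np.exp(logw - logw.max())
        w /= w.sum()
        idx = rng.choice(n_particles, size=n_particles, p=w)    # multinomial resampling
        x = mcmc_move(x[idx], beta, rng)                        # MCMC rejuvenation at the new beta
    return x, log_Z    # final particles and a log marginal-likelihood estimate

The running log_Z accumulator is what makes a marginal-likelihood estimate fall out of the sampler as a by-product.
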
APA, Harvard, Vancouver, ISO, and other styles
2

Finke, Axel, Arnaud Doucet, and Adam M. Johansen. "Limit theorems for sequential MCMC methods". Advances in Applied Probability 52, no. 2 (June 2020): 377–403. http://dx.doi.org/10.1017/apr.2020.9.

Full text of the source
Abstract:
Both sequential Monte Carlo (SMC) methods (a.k.a. ‘particle filters’) and sequential Markov chain Monte Carlo (sequential MCMC) methods constitute classes of algorithms which can be used to approximate expectations with respect to (a sequence of) probability distributions and their normalising constants. While SMC methods sample particles conditionally independently at each time step, sequential MCMC methods sample particles according to a Markov chain Monte Carlo (MCMC) kernel. Introduced over twenty years ago in [6], sequential MCMC methods have attracted renewed interest recently as they empirically outperform SMC methods in some applications. We establish an $\mathbb{L}_r$-inequality (which implies a strong law of large numbers) and a central limit theorem for sequential MCMC methods and provide conditions under which errors can be controlled uniformly in time. In the context of state-space models, we also provide conditions under which sequential MCMC methods can indeed outperform standard SMC methods in terms of asymptotic variance of the corresponding Monte Carlo estimators.
APA, Harvard, Vancouver, ISO, and other styles
3

Cong-An, Xu, Xu Congqi, Dong Yunlong, Xiong Wei, Chai Yong, and Li Tianmei. "A Novel Sequential Monte Carlo-Probability Hypothesis Density Filter for Particle Impoverishment Problem". Journal of Computational and Theoretical Nanoscience 13, no. 10 (1.10.2016): 6872–77. http://dx.doi.org/10.1166/jctn.2016.5640.

Full text of the source
Abstract:
As a typical implementation of the probability hypothesis density (PHD) filter, sequential Monte Carlo PHD (SMC-PHD) is widely employed in highly nonlinear systems. However, diversity loss of particles introduced by the resampling step, which can be called particle impoverishment problem, may lead to performance degradation and restrain the use of SMC-PHD filter in practical applications. In this paper, a novel SMC-PHD filter based on particle compensation is proposed to solve the problem. Firstly, based on an analysis of the particle impoverishment problem, a new particle compensatory method is developed to improve the particle diversity. Then, all the particles are integrated into the SMC-PHD filter framework. Compared with the SMC-PHD filter, simulation results demonstrate that the proposed particle compensatory SMC-PHD filter is capable of overcoming the particle impoverishment problem, which indicate good application prospects.
APA, Harvard, Vancouver, ISO, and other styles
4

Abu Znaid, Ammar M. A., Mohd Yamani Idna Idris, Ainuddin Wahid Abdul Wahab, Liana Khamis Qabajeh, and Omar Adil Mahdi. "Sequential Monte Carlo Localization Methods in Mobile Wireless Sensor Networks: A Review". Journal of Sensors 2017 (2017): 1–19. http://dx.doi.org/10.1155/2017/1430145.

Full text of the source
Abstract:
The advancement of digital technology has increased the deployment of wireless sensor networks (WSNs) in our daily life. However, locating sensor nodes is a challenging task in WSNs. Sensing data without an accurate location is worthless, especially in critical applications. The pioneering technique in range-free localization schemes is a sequential Monte Carlo (SMC) method, which utilizes network connectivity to estimate sensor location without additional hardware. This study presents a comprehensive survey of state-of-the-art SMC localization schemes. We present the schemes as a thematic taxonomy of localization operation in SMC. Moreover, the critical characteristics of each existing scheme are analyzed to identify its advantages and disadvantages. The similarities and differences of each scheme are investigated on the basis of significant parameters, namely, localization accuracy, computational cost, communication cost, and number of samples. We discuss the challenges and direction of the future research work for each parameter.
APA, Harvard, Vancouver, ISO, and other styles
5

Deng, Yue, Yongzhen Pei, Changguo Li, and Bin Zhu. "Model Selection and Parameter Estimation for an Improved Approximate Bayesian Computation Sequential Monte Carlo Algorithm". Discrete Dynamics in Nature and Society 2022 (30.06.2022): 1–14. http://dx.doi.org/10.1155/2022/8969903.

Full text of the source
Abstract:
Model selection and parameter estimation are very important in many fields. However, the existing methods have many problems, such as low efficiency in model selection and inaccuracy in parameter estimation. In this study, we proposed a new algorithm named improved approximate Bayesian computation sequential Monte Carlo algorithm (IABC-SMC) based on approximate Bayesian computation sequential Monte Carlo algorithm (ABC-SMC). Using the IABC-SMC algorithm, given data and the set of two models including logistic and Gompertz models of infectious diseases, we obtained the best fitting model and the values of unknown parameters of the corresponding model. The simulation results showed that the IABC-SMC algorithm can quickly and accurately select a model that best matches the corresponding epidemic data among multiple candidate models and estimate the values of unknown parameters of model very accurately. We further compared the effects of IABC-SMC algorithm with that of ABC-SMC algorithm. Simulations showed that the IABC-SMC algorithm can improve the accuracy of estimated parameter values and the speed of model selection and also avoid the shortage of ABC-SMC algorithm. This study suggests that the IABC-SMC algorithm can be seen as a promising method for model selection and parameter estimation.
APA, Harvard, Vancouver, ISO, and other styles
6

Hsu, Kuo-Lin. "Hydrologic forecasting using artificial neural networks: a Bayesian sequential Monte Carlo approach". Journal of Hydroinformatics 13, no. 1 (2.04.2010): 25–35. http://dx.doi.org/10.2166/hydro.2010.044.

Full text of the source
Abstract:
Sequential Monte Carlo (SMC) methods are known to be very effective for the state and parameter estimation of nonlinear and non-Gaussian systems. In this study, SMC is applied to the parameter estimation of an artificial neural network (ANN) model for streamflow prediction of a watershed. Through SMC simulation, the probability distribution of model parameters and streamflow estimation is calculated. The results also showed the SMC approach is capable of providing reliable streamflow prediction under limited available observations.
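
The state and parameter estimation described in this abstract builds on the generic bootstrap (sampling importance resampling) particle filter for state-space models. For orientation, a minimal sketch is given below; the callbacks and defaults are placeholders and nothing here is taken from the paper.

import numpy as np

def bootstrap_particle_filter(y, sample_x0, propagate, log_obs_density,
                              n_particles=1000, seed=0):
    # y                        : sequence of observations y_1 .. y_T
    # sample_x0(n, rng)        -> (n, d) array of initial state particles
    # propagate(x, rng)        -> particles sampled from the transition f(. | x_{t-1})
    # log_obs_density(y_t, x)  -> per-particle log g(y_t | x_t)
    rng = np.random.default_rng(seed)
    x = sample_x0(n_particles, rng)
    log_lik, means = 0.0, []
    for y_t in y:
        x = propagate(x, rng)                          # prediction step
        logw = log_obs_density(y_t, x)                 # weight by the observation density
        m = logw.max()
        log_lik += m + np.log(np.mean(np.exp(logw - m)))
        w = np.exp(logw - m)
        w /= w.sum()
        means.append(np.sum(w[:, None] * x, axis=0))   # weighted filtered mean
        idx = rng.choice(n_particles, size=n_particles, p=w)
        x = x[idx]                                     # multinomial resampling
    return np.array(means), log_lik                    # filtered means and log-likelihood estimate
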
APA, Harvard, Vancouver, ISO, and other styles
7

Weng, Zhipeng, Jinghua Zhou, and Zhengdong Zhan. "Reliability Evaluation of Standalone Microgrid Based on Sequential Monte Carlo Simulation Method". Energies 15, no. 18 (14.09.2022): 6706. http://dx.doi.org/10.3390/en15186706.

Full text of the source
Abstract:
In order to analyze the influence of uncertainty and an operation strategy on the reliability of a standalone microgrid, a reliability evaluation method based on a sequential Monte Carlo (SMC) simulation was developed. Here, the duty cycles of a microturbine (MT), the stochastic performance of photovoltaics (PV), and wind turbine generators (WTG) were considered. Moreover, the time-varying load with random fluctuation was modeled. In this method, the available capacity of an energy storage system (ESS) was also comprehensively considered by the SMC simulation. Then, the reliability evaluation framework was established from the perspectives of probability, frequency, and duration, and reliability evaluation algorithms under different operation strategies were formulated. Lastly, the influence of WTG and PV penetration and equipment capacity on the reliability was evaluated in the test system. The results showed that the complementary characteristics of wind and solar and the enhancement of the equipment capacity can both improve the reliability; but, with the increase in the penetration rate of WTG and PV, more ESS capacity is needed to cope with the randomness of WTG and PV. In addition, load shedding minimization strategies can minimize the probability and the number of reductions and achieve optimal reliability, which can provide a reference for the formulation of microgrid operation strategies.
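
For readers unfamiliar with sequential (chronological) Monte Carlo reliability evaluation, the toy sketch below shows the basic mechanism for a single component with exponentially distributed up and down times. It is a deliberately simplified illustration of the sampling idea only; the rates, horizon and indices are illustrative assumptions, not values or models from the study above.

import numpy as np

def sequential_mc_reliability(failure_rate, repair_rate, horizon_hours=8760.0,
                              n_years=2000, seed=0):
    # failure_rate, repair_rate : exponential rates (per hour) for up- and down-times
    # returns estimated unavailability and average outage frequency per simulated year
    rng = np.random.default_rng(seed)
    down_time, outages = 0.0, 0
    for _ in range(n_years):                       # simulate many synthetic years
        t, up = 0.0, True
        while t < horizon_hours:
            dur = rng.exponential(1.0 / (failure_rate if up else repair_rate))
            dur = min(dur, horizon_hours - t)      # truncate at the year boundary
            if not up:
                down_time += dur
            elif t + dur < horizon_hours:
                outages += 1                       # a failure occurred inside the year
            t += dur
            up = not up
    unavailability = down_time / (n_years * horizon_hours)
    return unavailability, outages / n_years

# Example: roughly 0.5 failures per year with a 12 h mean repair time (assumed numbers)
print(sequential_mc_reliability(0.5 / 8760.0, 1.0 / 12.0))
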
APA, Harvard, Vancouver, ISO, and other styles
8

Röder, Lenard L., Patrick Dewald, Clara M. Nussbaumer, Jan Schuladen, John N. Crowley, Jos Lelieveld, and Horst Fischer. "Data quality enhancement for field experiments in atmospheric chemistry via sequential Monte Carlo filters". Atmospheric Measurement Techniques 16, no. 5 (7.03.2023): 1167–78. http://dx.doi.org/10.5194/amt-16-1167-2023.

Full text of the source
Abstract:
Abstract. In this study, we explore the applications and limitations of sequential Monte Carlo (SMC) filters to field experiments in atmospheric chemistry. The proposed algorithm is simple, fast, versatile and returns a complete probability distribution. It combines information from measurements with known system dynamics to decrease the uncertainty of measured variables. The method shows high potential to increase data coverage, precision and even possibilities to infer unmeasured variables. We extend the original SMC algorithm with an activity variable that gates the proposed reactions. This extension makes the algorithm more robust when dynamical processes not considered in the calculation dominate and the information provided via measurements is limited. The activity variable also provides a quantitative measure of the dominant processes. Free parameters of the algorithm and their effect on the SMC result are analyzed. The algorithm reacts very sensitively to the estimated speed of stochastic variation. We provide a scheme to choose this value appropriately. In a simulation study, O3, NO, NO2 and jNO2 are tested for interpolation and de-noising using measurement data of a field campaign. Generally, the SMC method performs well under most conditions, with some dependence on the particular variable being analyzed.
APA, Harvard, Vancouver, ISO, and other styles
9

Nakano, S., K. Suzuki, K. Kawamura, F. Parrenin, and T. Higuchi. "A sequential Bayesian approach for the estimation of the age–depth relationship of Dome Fuji ice core". Nonlinear Processes in Geophysics Discussions 2, no. 3 (26.06.2015): 939–68. http://dx.doi.org/10.5194/npgd-2-939-2015.

Full text of the source
Abstract:
Abstract. A technique for estimating the age–depth relationship in an ice core and evaluating its uncertainty is presented. The age–depth relationship is mainly determined by the accumulation of snow at the site of the ice core and the thinning process due to the horizontal stretching and vertical compression of ice layers. However, since neither the accumulation process nor the thinning process are fully understood, it is essential to incorporate observational information into a model that describes the accumulation and thinning processes. In the proposed technique, the age as a function of depth is estimated from age markers and δ18O data. The estimation is achieved using the particle Markov chain Monte Carlo (PMCMC) method, in which the sequential Monte Carlo (SMC) method is combined with the Markov chain Monte Carlo method. In this hybrid method, the posterior distributions for the parameters in the models for the accumulation and thinning processes are computed using the Metropolis method, in which the likelihood is obtained with the SMC method. Meanwhile, the posterior distribution for the age as a function of depth is obtained by collecting the samples generated by the SMC method with Metropolis iterations. The use of this PMCMC method enables us to estimate the age–depth relationship without assuming either linearity or Gaussianity. The performance of the proposed technique is demonstrated by applying it to ice core data from Dome Fuji in Antarctica.
APA, Harvard, Vancouver, ISO, and other styles
10

Rusyda Roslan, Nur Nabihah, NoorFatin Farhanie Mohd Fauzi, and Mohd Ikhwan Muhammad Ridzuan. "Variance reduction technique in reliability evaluation for distribution system by using sequential Monte Carlo simulation". Bulletin of Electrical Engineering and Informatics 11, no. 6 (1.12.2022): 3061–68. http://dx.doi.org/10.11591/eei.v11i6.3950.

Full text of the source
Abstract:
This paper discusses the need for variance reduction in simulations in order to reduce the time required to compute a simulation. The large and complex network is commonly evaluated using a large-scale Monte Carlo simulation. Unfortunately, due to the different sizes of the network, it takes some time to complete a simulation. However, variance reduction techniques (VRT) can help to solve the issues. The effect of VRT changes the behaviors of a simulation, particularly the time required to run the simulation. To evaluate the reliability indices, two sequential Monte Carlo (SMC) methods are used. SMC with VRT and SMC without VRT are the two options. The presence of VRT in the simulation distinguishes the two simulations. Finally, reliability indices: system average interruption frequency index (SAIFI), system average interruption duration index (SAIDI), and customer average interruption duration index (CAIDI) will be calculated at the end of the simulation to determine the efficiency for the SMC with and without VRT. Overall, the SMC with VRT is more efficient because it is more convenient and saves time than the SMC without VRT.
APA, Harvard, Vancouver, ISO, and other styles
11

GU, FENG, and XIAOLIN HU. "ANALYSIS AND QUANTIFICATION OF DATA ASSIMILATION BASED ON SEQUENTIAL MONTE CARLO METHODS FOR WILDFIRE SPREAD SIMULATION". International Journal of Modeling, Simulation, and Scientific Computing 01, no. 04 (December 2010): 445–68. http://dx.doi.org/10.1142/s1793962310000298.

Full text of the source
Abstract:
Data assimilation is an important technique to improve simulation results by assimilating real time sensor data into a simulation model. A data assimilation framework based on Sequential Monte Carlo (SMC) methods for wildfire spread simulation has been developed in previous work. This paper provides systematic analysis and measurement to quantify the effectiveness and robustness of the developed data assimilation method. Measurement metrics are used to evaluate the robustness of SMC methods in data assimilation for wildfire spread simulation. Sensitivity analysis is carried out to examine the influences of important parameters to the data assimilation results. This work of analysis and quantification provides information to assess the effectiveness of the data assimilation method and suggests guidelines to further improve the data assimilation method for wildfire spread simulation.
APA, Harvard, Vancouver, ISO, and other styles
12

Rusyda Roslan, Nur Nabihah, NoorFatin Farhanie Mohd Fauzi, and Mohd Ikhwan Muhammad Ridzuan. "Monte Carlo simulation convergences’ percentage and position in future reliability evaluation". International Journal of Electrical and Computer Engineering (IJECE) 12, no. 6 (1.12.2022): 6218. http://dx.doi.org/10.11591/ijece.v12i6.pp6218-6227.

Full text of the source
Abstract:
Reliability assessment is a necessary assessment in today's world. It is required not only for system design but also to ensure that the power delivered reaches the consumer. Faults are bound to occur, but it is best if a fault can be predicted and the means to overcome it prepared in advance. Monte Carlo simulation is a standard method of assessing reliability since it is a time-based evaluation that closely represents the actual situation. However, sequential Monte Carlo (SMC) typically requires long simulation times. A convergence criterion can be implemented in the simulation to reduce the time needed to compute it. The SMC can be run with or without convergence. SMC with convergence has high accuracy compared to SMC without convergence, which takes a long time and has a high possibility of not producing accurate output. In this research, the SMC is subjected to five different convergence items to determine which convergence setting yields the fastest simulation while providing better performance for reliability evaluation. There are two types of convergence positions, namely input convergence and output convergence. Overall, output convergence shows the best result compared to input convergence.
APA, Harvard, Vancouver, ISO, and other styles
13

Nakano, Shin'ya, Kazue Suzuki, Kenji Kawamura, Frédéric Parrenin, and Tomoyuki Higuchi. "A sequential Bayesian approach for the estimation of the age–depth relationship of the Dome Fuji ice core". Nonlinear Processes in Geophysics 23, no. 1 (29.02.2016): 31–44. http://dx.doi.org/10.5194/npg-23-31-2016.

Full text of the source
Abstract:
Abstract. A technique for estimating the age–depth relationship in an ice core and evaluating its uncertainty is presented. The age–depth relationship is determined by the accumulation of snow at the site of the ice core and the thinning process as a result of the deformation of ice layers. However, since neither the accumulation rate nor the thinning process is fully known, it is essential to incorporate observational information into a model that describes the accumulation and thinning processes. In the proposed technique, the age as a function of depth is estimated by making use of age markers and δ18O data. The age markers provide reliable age information at several depths. The data of δ18O are used as a proxy of the temperature for estimating the accumulation rate. The estimation is achieved using the particle Markov chain Monte Carlo (PMCMC) method, which is a combination of the sequential Monte Carlo (SMC) method and the Markov chain Monte Carlo method. In this hybrid method, the posterior distributions for the parameters in the models for the accumulation and thinning process are computed using the Metropolis method, in which the likelihood is obtained with the SMC method, and the posterior distribution for the age as a function of depth is obtained by collecting the samples generated by the SMC method with Metropolis iterations. The use of this PMCMC method enables us to estimate the age–depth relationship without assuming either linearity or Gaussianity. The performance of the proposed technique is demonstrated by applying it to ice core data from Dome Fuji in Antarctica.
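
The PMCMC construction described above, Metropolis updates whose likelihood term is supplied by an SMC run, can be pictured with a generic particle marginal Metropolis-Hastings loop. The sketch below is illustrative only: smc_log_likelihood stands for any unbiased SMC estimator of the log-likelihood (for instance a bootstrap particle filter), and the Gaussian random-walk proposal and defaults are assumptions, not the authors' choices.

import numpy as np

def particle_marginal_mh(log_prior, smc_log_likelihood, theta0,
                         proposal_scale=0.1, n_iters=5000, seed=0):
    # log_prior(theta)          : log prior density of the parameters
    # smc_log_likelihood(theta) : SMC estimate of log p(data | theta)
    rng = np.random.default_rng(seed)
    theta = np.atleast_1d(np.array(theta0, dtype=float))
    log_post = log_prior(theta) + smc_log_likelihood(theta)
    samples = []
    for _ in range(n_iters):
        prop = theta + proposal_scale * rng.standard_normal(theta.shape)
        log_post_prop = log_prior(prop) + smc_log_likelihood(prop)
        # Accept or reject with the usual MH ratio, using the noisy SMC likelihood estimate.
        if np.log(rng.uniform()) < log_post_prop - log_post:
            theta, log_post = prop, log_post_prop
        samples.append(theta.copy())
    return np.array(samples)        # approximate draws from the parameter posterior
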
APA, Harvard, Vancouver, ISO, and other styles
14

Toni, Tina, David Welch, Natalja Strelkowa, Andreas Ipsen, and Michael P. H. Stumpf. "Approximate Bayesian computation scheme for parameter inference and model selection in dynamical systems". Journal of The Royal Society Interface 6, no. 31 (9.07.2008): 187–202. http://dx.doi.org/10.1098/rsif.2008.0172.

Full text of the source
Abstract:
Approximate Bayesian computation (ABC) methods can be used to evaluate posterior distributions without having to calculate likelihoods. In this paper, we discuss and apply an ABC method based on sequential Monte Carlo (SMC) to estimate parameters of dynamical models. We show that ABC SMC provides information about the inferability of parameters and model sensitivity to changes in parameters, and tends to perform better than other ABC approaches. The algorithm is applied to several well-known biological systems, for which parameters and their credible intervals are inferred. Moreover, we develop ABC SMC as a tool for model selection; given a range of different mathematical descriptions, ABC SMC is able to choose the best model using the standard Bayesian model selection apparatus.
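
A stripped-down, single-model version of the ABC SMC idea, a decreasing tolerance schedule, resampling from the previous population, a Gaussian perturbation kernel and an importance-weight correction, can be sketched as follows. All arguments, the tolerance schedule and the kernel width are placeholders rather than the paper's settings, and model selection is omitted.

import numpy as np

def abc_smc(sample_prior, log_prior, simulate, distance, y_obs,
            epsilons=(2.0, 1.0, 0.5), n_particles=500, kernel_sd=0.2, seed=0):
    # sample_prior(rng) -> parameter vector; log_prior(theta) -> log prior density
    # simulate(theta, rng) -> synthetic data; distance(y_sim, y_obs) -> scalar discrepancy
    rng = np.random.default_rng(seed)
    thetas, weights = None, None
    for eps in epsilons:                                   # decreasing tolerance schedule
        accepted, new_w = [], []
        while len(accepted) < n_particles:
            if thetas is None:
                theta = np.atleast_1d(sample_prior(rng))   # population 1: draw from the prior
            else:
                j = rng.choice(len(thetas), p=weights)     # resample a previous particle
                theta = thetas[j] + kernel_sd * rng.standard_normal(thetas[j].shape)
            if not np.isfinite(log_prior(theta)):
                continue
            if distance(simulate(theta, rng), y_obs) <= eps:
                if thetas is None:
                    w = 1.0
                else:
                    # importance weight: prior(theta) / sum_j weight_j * K(theta | theta_j)
                    k = np.exp(-0.5 * ((theta - thetas) / kernel_sd) ** 2).prod(axis=1)
                    w = np.exp(log_prior(theta)) / np.sum(weights * k)
                accepted.append(theta)
                new_w.append(w)
        thetas = np.asarray(accepted)
        weights = np.asarray(new_w)
        weights /= weights.sum()
    return thetas, weights        # weighted approximate posterior sample
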
APA, Harvard, Vancouver, ISO, and other styles
15

Cameron, Scott, Hans Eggers, and Steve Kroon. "A Sequential Marginal Likelihood Approximation Using Stochastic Gradients". Proceedings 33, no. 1 (3.12.2019): 18. http://dx.doi.org/10.3390/proceedings2019033018.

Full text of the source
Abstract:
Existing algorithms like nested sampling and annealed importance sampling are able to produce accurate estimates of the marginal likelihood of a model, but tend to scale poorly to large data sets. This is because these algorithms need to recalculate the log-likelihood for each iteration by summing over the whole data set. Efficient scaling to large data sets requires that algorithms only visit small subsets (mini-batches) of data on each iteration. To this end, we estimate the marginal likelihood via a sequential decomposition into a product of predictive distributions $p(y_n \mid y_{<n})$. Predictive distributions can be approximated efficiently through Bayesian updating using stochastic gradient Hamiltonian Monte Carlo, which approximates likelihood gradients using mini-batches. Since each data point typically contains little information compared to the whole data set, the convergence to each successive posterior only requires a short burn-in phase. This approach can be viewed as a special case of sequential Monte Carlo (SMC) with a single particle, but differs from typical SMC methods in that it uses stochastic gradients. We illustrate how this approach scales favourably to large data sets with some simple models.
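
Written out, the sequential decomposition referred to in this abstract is (notation paraphrased; the Monte Carlo average over posterior samples is one natural way to approximate each factor and is not spelled out in the abstract):

    \log p(y_{1:N}) = \sum_{n=1}^{N} \log p(y_n \mid y_{<n}),
    \qquad
    p(y_n \mid y_{<n}) = \int p(y_n \mid \theta)\, p(\theta \mid y_{<n})\, \mathrm{d}\theta
    \approx \frac{1}{S} \sum_{s=1}^{S} p(y_n \mid \theta^{(s)}),
    \quad \theta^{(s)} \sim p(\theta \mid y_{<n}),

where, in the approach summarized above, the samples from each partial posterior are produced by stochastic gradient Hamiltonian Monte Carlo updates that only touch mini-batches of the data.
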
APA, Harvard, Vancouver, ISO, and other styles
16

Liu, Qinming, and Ming Dong. "Online Health Management for Complex Nonlinear Systems Based on Hidden Semi-Markov Model Using Sequential Monte Carlo Methods". Mathematical Problems in Engineering 2012 (2012): 1–22. http://dx.doi.org/10.1155/2012/951584.

Full text of the source
Abstract:
Health management for a complex nonlinear system is becoming more important for condition-based maintenance and minimizing the related risks and costs over its entire life. However, a complex nonlinear system often operates under dynamically operational and environmental conditions, and it subjects to high levels of uncertainty and unpredictability so that effective methods for online health management are still few now. This paper combines hidden semi-Markov model (HSMM) with sequential Monte Carlo (SMC) methods. HSMM is used to obtain the transition probabilities among health states and health state durations of a complex nonlinear system, while the SMC method is adopted to decrease the computational and space complexity, and describe the probability relationships between multiple health states and monitored observations of a complex nonlinear system. This paper proposes a novel method of multisteps ahead health recognition based on joint probability distribution for health management of a complex nonlinear system. Moreover, a new online health prognostic method is developed. A real case study is used to demonstrate the implementation and potential applications of the proposed methods for online health management of complex nonlinear systems.
APA, Harvard, Vancouver, ISO, and other styles
17

Noh, S. J., Y. Tachikawa, M. Shiiba, and S. Kim. "Applying sequential Monte Carlo methods into a distributed hydrologic model: lagged particle filtering approach with regularization". Hydrology and Earth System Sciences Discussions 8, no. 2 (4.04.2011): 3383–420. http://dx.doi.org/10.5194/hessd-8-3383-2011.

Full text of the source
Abstract:
Abstract. Applications of data assimilation techniques have been widely used to improve hydrologic prediction. Among various data assimilation techniques, sequential Monte Carlo (SMC) methods, known as "particle filters", provide the capability to handle non-linear and non-Gaussian state-space models. In this paper, we propose an improved particle filtering approach to consider different response time of internal state variables in a hydrologic model. The proposed method adopts a lagged filtering approach to aggregate model response until uncertainty of each hydrologic process is propagated. The regularization with an additional move step based on Markov chain Monte Carlo (MCMC) is also implemented to preserve sample diversity under the lagged filtering approach. A distributed hydrologic model, WEP is implemented for the sequential data assimilation through the updating of state variables. Particle filtering is parallelized and implemented in the multi-core computing environment via open message passing interface (MPI). We compare performance results of particle filters in terms of model efficiency, predictive QQ plots and particle diversity. The improvement of model efficiency and the preservation of particle diversity are found in the lagged regularized particle filter.
APA, Harvard, Vancouver, ISO, and other styles
18

Imani, Mahdi, Seyede Fatemeh Ghoreishi, Douglas Allaire, and Ulisses M. Braga-Neto. "MFBO-SSM: Multi-Fidelity Bayesian Optimization for Fast Inference in State-Space Models". Proceedings of the AAAI Conference on Artificial Intelligence 33 (17.07.2019): 7858–65. http://dx.doi.org/10.1609/aaai.v33i01.33017858.

Full text of the source
Abstract:
Nonlinear state-space models are ubiquitous in modeling real-world dynamical systems. Sequential Monte Carlo (SMC) techniques, also known as particle methods, are a well-known class of parameter estimation methods for this general class of state-space models. Existing SMC-based techniques rely on excessive sampling of the parameter space, which makes their computation intractable for large systems or tall data sets. Bayesian optimization techniques have been used for fast inference in state-space models with intractable likelihoods. These techniques aim to find the maximum of the likelihood function by sequential sampling of the parameter space through a single SMC approximator. Various SMC approximators with different fidelities and computational costs are often available for sample-based likelihood approximation. In this paper, we propose a multi-fidelity Bayesian optimization algorithm for the inference of general nonlinear state-space models (MFBO-SSM), which enables simultaneous sequential selection of parameters and approximators. The accuracy and speed of the algorithm are demonstrated by numerical experiments using synthetic gene expression data from a gene regulatory network model and real data from the VIX stock price index.
APA, Harvard, Vancouver, ISO, and other styles
19

Infante, Saba, Luis Sánchez, Aracelis Hernández, and José Marcano. "Sequential Monte Carlo Filters with Parameters Learning for Commodity Pricing Models". Statistics, Optimization & Information Computing 9, no. 3 (22.06.2021): 694–716. http://dx.doi.org/10.19139/soic-2310-5070-814.

Full text of the source
Abstract:
In this article, an estimation methodology based on the sequential Monte Carlo algorithm is proposed that jointly estimates the states and parameters. The relationship between the prices of futures contracts and the spot prices of primary products is determined, and the evolution of prices and the volatility of the historical data of the primary market (Gold and Soybean) are analyzed. Two stochastic models for estimating the states and parameters are considered; the parameters and states describe the physical measure (associated with the price) and the risk-neutral measure (associated with the futures markets). The price dynamics are determined in the short term through mean reversion and volatility, and in the long term through the futures markets. Other characteristics such as seasonal patterns, price spikes, market-dependent volatilities, and non-seasonality can also be observed. In the methodology, a parameter learning algorithm is used; specifically, three algorithms are proposed for sequential Monte Carlo (SMC) estimation in state space models with unknown parameters. The first method is a particle filter based on the sequential importance sampling with resampling (SISR) algorithm. The second implemented method is the Storvik algorithm [19], in which the states and parameters of the posterior distribution, supported on low-dimensional spaces, are estimated using a sufficient statistic computed from the sample of the filtered distribution. The third method is Carvalho's Particle Learning and Smoothing (PLS) algorithm [31]. The cash prices of the contracts with future delivery dates are analyzed. The results indicate postponement of payment; the futures prices on different maturity dates are highly correlated with the spot price. Likewise, for contracts with a delivery date in the last periods of the year 2017, the spot price is found to be lower than the prices of the contracts expiring in 12 and 24 months, while the opposite occurs for the contracts expiring in 1 and 6 months.
APA, Harvard, Vancouver, ISO, and other styles
20

Shafii, Mahyar, Bryan Tolson, and L. Shawn Matott. "Improving the efficiency of Monte Carlo Bayesian calibration of hydrologic models via model pre-emption". Journal of Hydroinformatics 17, no. 5 (23.02.2015): 763–70. http://dx.doi.org/10.2166/hydro.2015.043.

Full text of the source
Abstract:
Bayesian inference via Markov Chain Monte Carlo (MCMC) sampling and sequential Monte Carlo (SMC) sampling are popular methods for uncertainty analysis in hydrological modelling. However, application of these methodologies can incur significant computational costs. This study investigated using model pre-emption for improving the computational efficiency of MCMC and SMC samplers in the context of hydrological modelling. The proposed pre-emption strategy facilitates early termination of low-likelihood simulations and results in reduction of unnecessary simulation time steps. The proposed approach is incorporated into two samplers and applied to the calibration of three rainfall–runoff models. Results show that overall pre-emption savings range from 5 to 21%. Furthermore, results indicate that pre-emption savings are greatest during the pre-convergence ‘burn-in’ period (i.e., between 8 and 39%) and decrease as the algorithms converge towards high likelihood regions of parameter space. The observed savings are achieved with absolutely no change in the posterior set of parameters.
APA, Harvard, Vancouver, ISO, and other styles
21

Zhan, Ronghui, Liping Wang, and Jun Zhang. "Joint Tracking and Classification of Multiple Targets with Scattering Center Model and CBMeMBer Filter". Sensors 20, no. 6 (17.03.2020): 1679. http://dx.doi.org/10.3390/s20061679.

Full text of the source
Abstract:
This paper deals with joint tracking and classification (JTC) of multiple targets based on scattering center model (SCM) and wideband radar observations. We first introduce an SCM-based JTC method, where the SCM is used to generate the predicted high range resolution profile (HRRP) with the information of the target aspect angle, and target classification is implemented through the data correlation of observed HRRP with predicted HRRPs. To solve the problem of multi-target JTC in the presence of clutter and detection uncertainty, we then integrate the SCM-based JTC method into the CBMeMBer filter framework, and derive a novel SCM-JTC-CBMeMBer filter with Bayesian theory. To further tackle the complex integrals’ calculation involved in targets state and class estimation, we finally provide the sequential Monte Carlo (SMC) implementation of the proposed SCM-JTC-CBMeMBer filter. The effectiveness of the presented multi-target JTC method is validated by simulation results under the application scenario of maritime ship surveillance.
APA, Harvard, Vancouver, ISO, and other styles
22

Ogundijo, Oyetunji E., and Xiaodong Wang. "Characterization of tumor heterogeneity by latent haplotypes: a sequential Monte Carlo approach". PeerJ 6 (30.05.2018): e4838. http://dx.doi.org/10.7717/peerj.4838.

Full text of the source
Abstract:
Tumor samples obtained from a single cancer patient spatially or temporally often consist of varying cell populations, each harboring distinct mutations that uniquely characterize its genome. Thus, in any given samples of a tumor having more than two haplotypes, defined as a scaffold of single nucleotide variants (SNVs) on the same homologous genome, is evidence of heterogeneity because humans are diploid and we would therefore only observe up to two haplotypes if all cells in a tumor sample were genetically homogeneous. We characterize tumor heterogeneity by latent haplotypes and present state-space formulation of the feature allocation model for estimating the haplotypes and their proportions in the tumor samples. We develop an efficient sequential Monte Carlo (SMC) algorithm that estimates the states and the parameters of our proposed state-space model, which are equivalently the haplotypes and their proportions in the tumor samples. The sequential algorithm produces more accurate estimates of the model parameters when compared with existing methods. Also, because our algorithm processes the variant allele frequency (VAF) of a locus as the observation at a single time-step, VAF from newly sequenced candidate SNVs from next-generation sequencing (NGS) can be analyzed to improve existing estimates without re-analyzing the previous datasets, a feature that existing solutions do not possess.
APA, Harvard, Vancouver, ISO, and other styles
23

Yuan, Xianghui, Feng Lian, and Chongzhao Han. "Multiple-Model Cardinality Balanced Multitarget Multi-Bernoulli Filter for Tracking Maneuvering Targets". Journal of Applied Mathematics 2013 (2013): 1–16. http://dx.doi.org/10.1155/2013/727430.

Full text of the source
Abstract:
By integrating the cardinality balanced multitarget multi-Bernoulli (CBMeMBer) filter with the interacting multiple models (IMM) algorithm, an MM-CBMeMBer filter is proposed in this paper for tracking multiple maneuvering targets in clutter. The sequential Monte Carlo (SMC) method is used to implement the filter for generic multi-target models and the Gaussian mixture (GM) method is used to implement the filter for linear-Gaussian multi-target models. Then, the extended Kalman (EK) and unscented Kalman filtering approximations for the GM-MM-CBMeMBer filter to accommodate mildly nonlinear models are described briefly. Simulation results are presented to show the effectiveness of the proposed filter.
APA, Harvard, Vancouver, ISO, and other styles
24

Samuelsson, Oscar, Anders Björk, Jesús Zambrano, and Bengt Carlsson. "Gaussian process regression for monitoring and fault detection of wastewater treatment processes". Water Science and Technology 75, no. 12 (25.03.2017): 2952–63. http://dx.doi.org/10.2166/wst.2017.162.

Full text of the source
Abstract:
Monitoring and fault detection methods are increasingly important to achieve a robust and resource efficient operation of wastewater treatment plants (WWTPs). The purpose of this paper was to evaluate a promising machine learning method, Gaussian process regression (GPR), for WWTP monitoring applications. We evaluated GPR at two WWTP monitoring problems: estimate missing data in a flow rate signal (simulated data), and detect a drift in an ammonium sensor (real data). We showed that GPR with the standard estimation method, maximum likelihood estimation (GPR-MLE), suffered from local optima during estimation of kernel parameters, and did not give satisfactory results in a simulated case study. However, GPR with a state-of-the-art estimation method based on sequential Monte Carlo estimation (GPR-SMC) gave good predictions and did not suffer from local optima. Comparisons with simple standard methods revealed that GPR-SMC performed better than linear interpolation in estimating missing data in a noisy flow rate signal. We conclude that GPR-SMC is both a general and powerful method for monitoring full-scale WWTPs. However, this paper also shows that it does not always pay off to use more sophisticated methods. New methods should be critically compared against simpler methods, which might be good enough for some scenarios.
APA, Harvard, Vancouver, ISO, and other styles
25

Noh, S. J., Y. Tachikawa, M. Shiiba, and S. Kim. "Applying sequential Monte Carlo methods into a distributed hydrologic model: lagged particle filtering approach with regularization". Hydrology and Earth System Sciences 15, no. 10 (25.10.2011): 3237–51. http://dx.doi.org/10.5194/hess-15-3237-2011.

Full text of the source
Abstract:
Abstract. Data assimilation techniques have received growing attention due to their capability to improve prediction. Among various data assimilation techniques, sequential Monte Carlo (SMC) methods, known as "particle filters", are a Bayesian learning process that has the capability to handle non-linear and non-Gaussian state-space models. In this paper, we propose an improved particle filtering approach to consider different response times of internal state variables in a hydrologic model. The proposed method adopts a lagged filtering approach to aggregate model response until the uncertainty of each hydrologic process is propagated. The regularization with an additional move step based on the Markov chain Monte Carlo (MCMC) methods is also implemented to preserve sample diversity under the lagged filtering approach. A distributed hydrologic model, water and energy transfer processes (WEP), is implemented for the sequential data assimilation through the updating of state variables. The lagged regularized particle filter (LRPF) and the sequential importance resampling (SIR) particle filter are implemented for hindcasting of streamflow at the Katsura catchment, Japan. Control state variables for filtering are soil moisture content and overland flow. Streamflow measurements are used for data assimilation. LRPF shows consistent forecasts regardless of the process noise assumption, while SIR has different values of optimal process noise and shows sensitive variation of confidential intervals, depending on the process noise. Improvement of LRPF forecasts compared to SIR is particularly found for rapidly varied high flows due to preservation of sample diversity from the kernel, even if particle impoverishment takes place.
APA, Harvard, Vancouver, ISO, and other styles
26

Zhang, Jungen. "Bearings-only multitarget tracking based on Rao-Blackwellized particle CPHD filter". International Journal of Circuits, Systems and Signal Processing 14 (13.01.2021): 1129–36. http://dx.doi.org/10.46300/9106.2020.14.141.

Full text of the source
Abstract:
Following Mahler's framework for information fusion, this paper develops an implementation of the cardinalized probability hypothesis density (CPHD) filter for bearings-only multitarget tracking. A Rao-Blackwellized method is introduced in the CPHD filtering framework for mixed linear/nonlinear state space models. The sequential Monte Carlo (SMC) method is used to predict and estimate the nonlinear state of targets. A Kalman filter (KF) is adopted to estimate the linear states with the information embedded in the estimated nonlinear states. The multitarget state estimates are extracted by utilizing kernel density estimation (KDE) theory and the mean-shift algorithm to enhance tracking performance. Moreover, the computational load of the filter is analyzed by introducing an equivalent flop measure. Finally, the performance of the proposed Rao-Blackwellized particle CPHD filter is evaluated through a challenging bearings-only multitarget tracking simulation experiment.
APA, Harvard, Vancouver, ISO, and other styles
27

Ikoma, Norikazu, Ryuichi Yamaguchi, Hideaki Kawano, and Hiroshi Maeda. "Tracking of Multiple Moving Objects in Dynamic Image of Omni-Directional Camera Using PHD Filter". Journal of Advanced Computational Intelligence and Intelligent Informatics 12, no. 1 (20.01.2008): 16–25. http://dx.doi.org/10.20965/jaciii.2008.p0016.

Full text of the source
Abstract:
A method of multiple moving objects tracking in dynamic image of omni-directional camera has been proposed. Finite random set (FRS) based state space model is employed in the method due to its inherent nature capable to represent the scene having occlusion and appearance of object as well as missing and false detection in observation. Sequential Monte Carlo (SMC) implementation of Probability hypothesis density (PHD) filter has been used for estimating state of the state space model. The state is a finite random set of single object states, where each element of the set consists of position and velocity of the object in panoramic image coordinate of omni-directional camera image. We propose a new method to display tracking result from weighted particles obtained from the estimation process by SMC implementation of PHD filter. Key idea of the method is to put an integer label on each particle, where the label indicates specific object among multiple objects in the image scene tracked by the particle. Numerical simulation and real image experiments illustrate tracking performance of the proposed method.
APA, Harvard, Vancouver, ISO, and other styles
28

Olivieri, D., J. Faro, I. Gomez-Conde, and C. E. Tadokoro. "Tracking T and B cells from two-photon microscopy imaging using constrained SMC clusters". Journal of Integrative Bioinformatics 8, no. 3 (1.12.2011): 141–57. http://dx.doi.org/10.1515/jib-2011-180.

Full text of the source
Abstract:
This paper describes a novel software algorithm, called constrained Sequential Monte Carlo (SMC) clusters, for tracking a large collection of individual cells from intra-vital two-photon microscopy image sequences. We show how our method and software tool, implemented in python, is useful for quantifying the motility of T and B lymphocytes involved in an immune response versus lymphocytes under non-immune conditions. We describe the theory behind our algorithm and briefly discuss the architecture of our software. Finally, we demonstrate both the functionality and utility of the software by applying it to two practical examples from videos displaying lymphocyte motility in B cell zones (follicles) and T cell zones of lymph nodes.
APA, Harvard, Vancouver, ISO, and other styles
29

Drovandi, C. C., N. Cusimano, S. Psaltis, B. A. J. Lawson, A. N. Pettitt, P. Burrage, and K. Burrage. "Sampling methods for exploring between-subject variability in cardiac electrophysiology experiments". Journal of The Royal Society Interface 13, no. 121 (August 2016): 20160214. http://dx.doi.org/10.1098/rsif.2016.0214.

Full text of the source
Abstract:
Between-subject and within-subject variability is ubiquitous in biology and physiology, and understanding and dealing with this is one of the biggest challenges in medicine. At the same time, it is difficult to investigate this variability by experiments alone. A recent modelling and simulation approach, known as population of models (POM), allows this exploration to take place by building a mathematical model consisting of multiple parameter sets calibrated against experimental data. However, finding such sets within a high-dimensional parameter space of complex electrophysiological models is computationally challenging. By placing the POM approach within a statistical framework, we develop a novel and efficient algorithm based on sequential Monte Carlo (SMC). We compare the SMC approach with Latin hypercube sampling (LHS), a method commonly adopted in the literature for obtaining the POM, in terms of efficiency and output variability in the presence of a drug block through an in-depth investigation via the Beeler–Reuter cardiac electrophysiological model. We show improved efficiency for SMC that produces similar responses to LHS when making out-of-sample predictions in the presence of a simulated drug block. Finally, we show the performance of our approach on a complex atrial electrophysiological model, namely the Courtemanche–Ramirez–Nattel model.
APA, Harvard, Vancouver, ISO, and other styles
30

Kim, Sun Young, Chang Ho Kang, and Chan Gook Park. "SMC-CPHD Filter with Adaptive Survival Probability for Multiple Frequency Tracking". Applied Sciences 12, no. 3 (27.01.2022): 1369. http://dx.doi.org/10.3390/app12031369.

Full text of the source
Abstract:
We propose a sequential Monte Carlo-based cardinalized probability hypothesis density (SMC-CPHD) filter with adaptive survival probability for multiple frequency tracking to enhance the tracking performance. The survival probability of the particles in the filter is adjusted using the pre-designed exponential function related to the distribution of the estimated particle points. In order to ensure whether the proposed survival probability affects the stability of the filter, the error bounds in the prediction process are analyzed. Moreover, an inverse covariance intersection-based compensation method is added to enhance cardinality tracking performance by integrating two types of cardinality information from the CPHD filter and data clustering process. To evaluate the proposed method’s performance, MATLAB-based simulations are performed. As a result, the tracking performance of the multiple frequencies has been confirmed, and the accuracy of cardinality estimates are improved compared to the existing filters.
APA, Harvard, Vancouver, ISO, and other styles
31

Alahmadi, Amani A., Jennifer A. Flegg, Davis G. Cochrane, Christopher C. Drovandi, and Jonathan M. Keith. "A comparison of approximate versus exact techniques for Bayesian parameter inference in nonlinear ordinary differential equation models". Royal Society Open Science 7, no. 3 (March 2020): 191315. http://dx.doi.org/10.1098/rsos.191315.

Full text of the source
Abstract:
The behaviour of many processes in science and engineering can be accurately described by dynamical system models consisting of a set of ordinary differential equations (ODEs). Often these models have several unknown parameters that are difficult to estimate from experimental data, in which case Bayesian inference can be a useful tool. In principle, exact Bayesian inference using Markov chain Monte Carlo (MCMC) techniques is possible; however, in practice, such methods may suffer from slow convergence and poor mixing. To address this problem, several approaches based on approximate Bayesian computation (ABC) have been introduced, including Markov chain Monte Carlo ABC (MCMC ABC) and sequential Monte Carlo ABC (SMC ABC). While the system of ODEs describes the underlying process that generates the data, the observed measurements invariably include errors. In this paper, we argue that several popular ABC approaches fail to adequately model these errors because the acceptance probability depends on the choice of the discrepancy function and the tolerance without any consideration of the error term. We observe that the so-called posterior distributions derived from such methods do not accurately reflect the epistemic uncertainties in parameter values. Moreover, we demonstrate that these methods provide minimal computational advantages over exact Bayesian methods when applied to two ODE epidemiological models with simulated data and one with real data concerning malaria transmission in Afghanistan.
APA, Harvard, Vancouver, ISO, and other styles
32

Martínez-Barberá, Humberto, Pablo Bernal-Polo, and David Herrero-Pérez. "Sensor Modeling for Underwater Localization Using a Particle Filter". Sensors 21, no. 4 (23.02.2021): 1549. http://dx.doi.org/10.3390/s21041549.

Full text of the source
Abstract:
This paper presents a framework for processing, modeling, and fusing underwater sensor signals to provide a reliable perception for underwater localization in structured environments. Submerged sensory information is often affected by diverse sources of uncertainty that can deteriorate the positioning and tracking. By adopting uncertain modeling and multi-sensor fusion techniques, the framework can maintain a coherent representation of the environment, filtering outliers, inconsistencies in sequential observations, and useless information for positioning purposes. We evaluate the framework using cameras and range sensors for modeling uncertain features that represent the environment around the vehicle. We locate the underwater vehicle using a Sequential Monte Carlo (SMC) method initialized from the GPS location obtained on the surface. The experimental results show that the framework provides a reliable environment representation during the underwater navigation to the localization system in real-world scenarios. Besides, they evaluate the improvement of localization compared to the position estimation using reliable dead-reckoning systems.
APA, Harvard, Vancouver, ISO, and other styles
33

Ruchi, Sangeetika, Svetlana Dubinkina, and Jana de Wiljes. "Fast hybrid tempered ensemble transform filter formulation for Bayesian elliptical problems via Sinkhorn approximation". Nonlinear Processes in Geophysics 28, no. 1 (15.01.2021): 23–41. http://dx.doi.org/10.5194/npg-28-23-2021.

Full text of the source
Abstract:
Abstract. Identification of unknown parameters on the basis of partial and noisy data is a challenging task, in particular in high dimensional and non-linear settings. Gaussian approximations to the problem, such as ensemble Kalman inversion, tend to be robust and computationally cheap and often produce astonishingly accurate estimations despite the simplifying underlying assumptions. Yet there is a lot of room for improvement, specifically regarding a correct approximation of a non-Gaussian posterior distribution. The tempered ensemble transform particle filter is an adaptive Sequential Monte Carlo (SMC) method, whereby resampling is based on optimal transport mapping. Unlike ensemble Kalman inversion, it does not require any assumptions regarding the posterior distribution and hence has shown to provide promising results for non-linear non-Gaussian inverse problems. However, the improved accuracy comes with the price of much higher computational complexity, and the method is not as robust as ensemble Kalman inversion in high dimensional problems. In this work, we add an entropy-inspired regularisation factor to the underlying optimal transport problem that allows the high computational cost to be considerably reduced via Sinkhorn iterations. Further, the robustness of the method is increased via an ensemble Kalman inversion proposal step before each update of the samples, which is also referred to as a hybrid approach. The promising performance of the introduced method is numerically verified by testing it on a steady-state single-phase Darcy flow model with two different permeability configurations. The results are compared to the output of ensemble Kalman inversion, and Markov chain Monte Carlo methods results are computed as a benchmark.
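
The Sinkhorn approximation invoked in this abstract refers to entropy-regularised optimal transport solved by alternating scaling iterations. The sketch below shows the generic iteration only, not the paper's tempered ensemble transform filter; the cost matrix, the marginals and the regularisation strength epsilon are placeholders.

import numpy as np

def sinkhorn_plan(cost, a, b, epsilon=0.1, n_iters=200):
    # cost : (n, m) pairwise cost matrix between source and target points
    # a, b : source and target probability vectors (each summing to 1)
    K = np.exp(-cost / epsilon)          # Gibbs kernel of the regularised problem
    u = np.ones_like(a)
    v = np.ones_like(b)
    for _ in range(n_iters):             # alternate the two scaling updates
        u = a / (K @ v)
        v = b / (K.T @ u)
    return u[:, None] * K * v[None, :]   # coupling diag(u) K diag(v): rows ~ a, columns ~ b

In an ensemble transform setting, rows of such a coupling (suitably rescaled) can serve as the matrix that maps forecast particles to analysis particles, which is the role the regularised transport plays in the hybrid filter described above.
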
APA, Harvard, Vancouver, ISO, and other styles
34

Zeng, HongCheng, Jie Chen, PengBo Wang, Wei Liu, XinKai Zhou, and Wei Yang. "Moving Target Detection in Multi-Static GNSS-Based Passive Radar Based on Multi-Bernoulli Filter". Remote Sensing 12, no. 21 (24.10.2020): 3495. http://dx.doi.org/10.3390/rs12213495.

Full text of the source
Abstract:
Over the past few years, the global navigation satellite system (GNSS)-based passive radar (GBPR) has attracted more and more attention and has developed very quickly. However, the low power level of GNSS signal limits its application. To enhance the ability of moving target detection, a multi-static GBPR (MsGBPR) system is considered in this paper, and a modified iterated-corrector multi-Bernoulli (ICMB) filter is also proposed. The likelihood ratio model of the MsGBPR with range-Doppler map is first presented. Then, a signal-to-noise ratio (SNR) online estimation method is proposed, which can estimate the fluctuating and unknown map SNR effectively. After that, a modified ICMB filter and its sequential Monte Carlo (SMC) implementation are proposed, which can update all measurements from multi-transmitters in the optimum order (ascending order). Moreover, based on the proposed method, a moving target detecting framework using MsGBPR data is also presented. Finally, performance of the proposed method is demonstrated by numerical simulations and preliminary experimental results, and it is shown that the position and velocity of the moving target can be estimated accurately.
APA, Harvard, Vancouver, ISO, and other styles
35

Zhang, Xiaoguo, Yujin Kuang, Haoran Yang, Hang Lu, and Yuan Yang. "UWB Indoor Localization Algorithm Using Firefly of Multistage Optimization on Particle Filter". Journal of Sensors 2021 (15.12.2021): 1–9. http://dx.doi.org/10.1155/2021/1383767.

Full text of the source
Abstract:
With the increasing application potential of indoor personnel positioning, ultra-wideband (UWB) positioning technology has attracted more and more attentions of scholars. In practice, an indoor positioning process often involves multipath and Non-Line-Of-Sight (NLOS) problems, and a particle filtering (PF) algorithm has been widely used in the indoor positioning research field because of its outstanding performance in nonlinear and non-Gaussian estimations. Aiming at mitigating the accuracy decreasing caused by the particle degradation and impoverishment in traditional Sequential Monte Carlo (SMC) positioning, we propose a method to integrate the firefly and particle algorithm for multistage optimization. The proposed algorithm not only enhances the searching ability of particles of initialization but also makes the particles propagate out of the local optimal condition in the sequential estimations. In addition, to prevent particles from falling into the oscillatory situation and find the global optimization faster, a decreasing function is designed to improve the reliability of the particle propagation. Real indoor experiments are carried out, and results demonstrate that the positioning accuracy can be improved up to 36%, and the number of needed particles is significantly reduced.
APA, Harvard, Vancouver, ISO, and other styles
36

J., Kiruba, and Rajesh T. "Enabling accurate range free localization for mobile sensor networks". International Journal of Engineering & Technology 7, no. 1.3 (31.12.2017): 1. http://dx.doi.org/10.14419/ijet.v7i1.3.8975.

Full text of the source
Abstract:
The existing localization algorithm for mobile sensor networks was the Sequential Monte Carlo (SMC) method. In this paper, we propose an energy-efficient algorithm, called novel localization, which achieves high localization accuracy in a better way. In the existing algorithm, high localization accuracy is achieved by taking a larger number of beacon nodes as input, which increases the computational cost and sharply decreases the sampling efficiency. In the proposed algorithm, we achieve high localization accuracy with fewer beacon nodes, so the computational cost is lower and the sampling efficiency is higher. Our algorithm uses the approximately calculated position information of sensor nodes to increase the localization accuracy. Existing algorithms do not achieve high localization accuracy when nodes move very fast, so we propose a new algorithm, novel localization, which is executed based on the speed of the nodes. The novel algorithm improves the localization accuracy even when the nodes move fast.
Style APA, Harvard, Vancouver, ISO itp.
37

Wang, Shuqiang, Yanyan Shen, Changhong Shi, Tao Wang, Zhiming Wei i Hanxiong Li. "Defining Biological Networks for Noise Buffering and Signaling Sensitivity Using Approximate Bayesian Computation". Scientific World Journal 2014 (2014): 1–12. http://dx.doi.org/10.1155/2014/625754.

Pełny tekst źródła
Streszczenie:
Reliable information processing in cells requires high sensitivity to changes in the input signal but low sensitivity to random fluctuations in the transmitted signal. There are often many alternative biological circuits qualifying for this biological function. Distinguishing these biological models and finding the most suitable one are essential, as such model ranking, based on experimental evidence, helps to judge the support for the working hypotheses forming each model. Here, we employ the approximate Bayesian computation (ABC) method based on sequential Monte Carlo (SMC) to search for biological circuits that can maintain signaling sensitivity while minimizing noise propagation, focusing on cases where the noise is characterized by rapid fluctuations. By systematically analyzing three-component circuits, we rank these biological circuits and identify three basic biological motifs that buffer noise while maintaining sensitivity to long-term changes in input signals. We discuss in detail a particular implementation in the control of nutrient homeostasis in yeast. A principal component analysis of the posterior provides insight into the nature of the reactions between the nodes.
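The abstract does not detail the sampler; as a generic illustration of ABC based on sequential Monte Carlo, the sketch below implements a standard ABC-SMC (population Monte Carlo) loop with placeholder simulate, distance, and prior functions, not the authors' specific implementation for three-component circuits:

```python
import numpy as np

def abc_smc(simulate, distance, prior_sample, prior_pdf, y_obs,
            epsilons=(2.0, 1.0, 0.5), n_pop=200, sigma=0.1):
    """Generic ABC-SMC (population Monte Carlo) sketch.

    simulate(theta) -> synthetic data; distance(y_sim, y_obs) -> scalar discrepancy;
    prior_sample() / prior_pdf(theta) describe the prior over parameter vectors.
    """
    thetas, weights = None, None
    for t, eps in enumerate(epsilons):            # decreasing tolerance schedule
        new_thetas, new_weights = [], []
        while len(new_thetas) < n_pop:
            if t == 0:
                theta = prior_sample()
            else:
                # Resample from the previous population and perturb with a Gaussian kernel.
                idx = np.random.choice(n_pop, p=weights)
                theta = thetas[idx] + sigma * np.random.randn(*np.shape(thetas[idx]))
            if prior_pdf(theta) == 0:
                continue
            if distance(simulate(theta), y_obs) <= eps:
                if t == 0:
                    w = 1.0
                else:
                    kern = np.exp(-np.sum((thetas - theta) ** 2, axis=1) / (2 * sigma ** 2))
                    w = prior_pdf(theta) / np.sum(weights * kern)
                new_thetas.append(theta)
                new_weights.append(w)
        thetas = np.array(new_thetas)
        weights = np.array(new_weights)
        weights /= weights.sum()
    return thetas, weights                        # weighted approximate posterior sample
```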
Style APA, Harvard, Vancouver, ISO itp.
38

Sun, Ming, i Chao Shi. "Application of Particle Filtering in Visual Tracking". Advanced Materials Research 485 (luty 2012): 207–12. http://dx.doi.org/10.4028/www.scientific.net/amr.485.207.

Pełny tekst źródła
Streszczenie:
Particle filtering, also known as the sequential Monte Carlo (SMC) method, is a simulation-based model estimation technique with important applications in localization, tracking, and other fields. It represents a probability distribution by a set of particles and can be used with any state-space model. Its core idea is to represent the posterior distribution by a set of random samples drawn from the state space. In general, particle filtering uses a set of random samples propagating through the state space to approximate the probability density function, replacing integration with the sample mean to obtain a minimum-variance estimate of the state. It removes the restriction that nonlinear filtering must assume Gaussian distributions, can represent a much wider range of distributions, and has a strong ability to model nonlinear characteristics of the variance parameters. This paper introduces the application of particle filtering in visual tracking and, finally, puts forward some improved algorithms to address the inherent deficiencies of particle filtering.
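As a concrete reference point for the description above, the sketch below shows a minimal bootstrap (sampling-importance-resampling) particle filter for a generic state-space model; propagate and likelihood are placeholders for whatever motion and observation models a visual tracker would use:

```python
import numpy as np

def bootstrap_particle_filter(y_seq, propagate, likelihood, init_particles):
    """Minimal bootstrap (SIR) particle filter for a generic state-space model.

    propagate(particles) samples x_t ~ p(x_t | x_{t-1}) for each particle;
    likelihood(y, particles) evaluates p(y_t | x_t) for each particle.
    """
    particles = init_particles.copy()
    n = len(particles)
    estimates = []
    for y in y_seq:
        particles = propagate(particles)                  # predict
        weights = likelihood(y, particles)                # weight by the observation
        weights /= weights.sum()
        estimates.append(np.average(particles, axis=0, weights=weights))
        idx = np.random.choice(n, size=n, p=weights)      # multinomial resampling
        particles = particles[idx]
    return np.array(estimates)
```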
Style APA, Harvard, Vancouver, ISO itp.
39

Roldán-Blay, Carlos, Carlos Roldán-Porta, Eduardo Quiles i Guillermo Escrivá-Escrivá. "Smart Cooperative Energy Supply Strategy to Increase Reliability in Residential Stand-Alone Photovoltaic Systems". Applied Sciences 11, nr 24 (10.12.2021): 11723. http://dx.doi.org/10.3390/app112411723.

Pełny tekst źródła
Streszczenie:
In reliability studies of isolated energy supply systems for residential buildings, supply failures due to insufficient generation are generally analysed. Recent studies conclude that this kind of analysis makes it possible to optimally design the sizes of the elements of the generation system. However, in isolated communities or rural areas, it is common to find groups of dwellings in which micro-renewable sources, such as photovoltaic (PV) systems, can be installed. In this situation, the generation and storage of several houses can be considered as an interconnected system forming a cooperative microgrid (CoMG). This work analyses the benefits that sharing two autonomous installations can bring to each one, from the point of view of reliability. The method consists of the application of a random sequential Monte Carlo (SMC) simulation to the CoMG to evaluate the impact of a simple cooperative strategy on the reliability of the set. The study considers random failures in the generation systems. The results show that the reliability of the system increases when cooperation is allowed. Additionally, at the design stage, this allows more cost-effective solutions than single sizing with a similar level of reliability.
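As a rough illustration of the sequential Monte Carlo reliability simulation described above, the sketch below estimates a loss-of-supply probability for a single PV unit with exponentially distributed failure and repair times; the constant demand, single unit, and absence of storage or cooperation rules are simplifying assumptions, not the paper's model:

```python
import numpy as np

def loss_of_supply_probability(mttf, mttr, demand_kw, pv_capacity_kw,
                               years=1000, hours_per_year=8760):
    """Chronological (sequential) Monte Carlo sketch of supply reliability.

    A single generator alternates between up and down states with exponentially
    distributed times to failure (mttf) and repair (mttr), in hours.
    """
    short_hours, total_hours = 0.0, years * hours_per_year
    for _ in range(years):
        t, up = 0.0, True
        while t < hours_per_year:
            duration = np.random.exponential(mttf if up else mttr)
            duration = min(duration, hours_per_year - t)
            if not up and demand_kw > 0:
                short_hours += duration          # no generation while the unit is down
            elif up and demand_kw > pv_capacity_kw:
                short_hours += duration          # insufficient generation while up
            t += duration
            up = not up
    return short_hours / total_hours             # loss-of-load probability estimate

# Example: 2 kW demand, 3 kW PV, failures every ~2000 h, repairs ~48 h on average.
print(loss_of_supply_probability(mttf=2000, mttr=48, demand_kw=2.0, pv_capacity_kw=3.0))
```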
Style APA, Harvard, Vancouver, ISO itp.
40

Chen, Zhikun, Bin’an Wang, Ruiheng Yang i Yuchao Lou. "Joint Direction of Arrival-Polarization Parameter Tracking Algorithm Based on Multi-Target Multi-Bernoulli Filter". Remote Sensing 15, nr 16 (8.08.2023): 3929. http://dx.doi.org/10.3390/rs15163929.

Pełny tekst źródła
Streszczenie:
This paper presents a tracking algorithm for the joint estimation of direction of arrival (DOA) and polarization parameters, which exhibit dynamic behavior due to the movement of signal source carriers. The proposed algorithm addresses the challenge of real-time estimation in scenarios with an unknown number of targets. It is built upon the multi-target multi-Bernoulli (MeMBer) filter and makes use of a sensor array called the Circular Orthogonal Double-Dipole (CODD) array. The algorithm begins by constructing a Minimum Description Length (MDL) criterion that takes advantage of the characteristics of the polarization-sensitive array; this allows the number of signal sources to be estimated adaptively and facilitates separation of the noise subspace. Subsequently, the joint-parameter Multiple Signal Classification (MUSIC) spatial spectrum function is employed as the pseudo-likelihood function, overcoming the limitations imposed by unknown prior information. To approximate the posterior distribution of the MeMBer filter, the sequential Monte Carlo (SMC) method is utilized. Simulation results demonstrate that the proposed algorithm achieves excellent tracking accuracy in joint DOA-polarization parameter estimation, whether the number of signal sources is known or unknown. Moreover, the algorithm exhibits robust tracking convergence even under low signal-to-noise ratio (SNR) conditions.
Style APA, Harvard, Vancouver, ISO itp.
41

Wang, Xilu, i Yaochu Jin. "Knowledge Transfer Based on Particle Filters for Multi-Objective Optimization". Mathematical and Computational Applications 28, nr 1 (18.01.2023): 14. http://dx.doi.org/10.3390/mca28010014.

Pełny tekst źródła
Streszczenie:
Particle filters, also known as sequential Monte Carlo (SMC) methods, constitute a class of importance sampling and resampling techniques designed to use simulations to perform on-line filtering. Recently, particle filters have been extended for optimization by utilizing the ability to track a sequence of distributions. In this work, we incorporate transfer learning capabilities into the optimizer by using particle filters. To achieve this, we propose a novel particle-filter-based multi-objective optimization algorithm (PF-MOA) by transferring knowledge acquired from the search experience. The key insight adopted here is that, if we can construct a sequence of target distributions that can balance the multiple objectives and make the degree of the balance controllable, we can approximate the Pareto optimal solutions by simulating each target distribution via particle filters. As the importance weight updating step takes the previous target distribution as the proposal distribution and takes the current target distribution as the target distribution, the knowledge acquired from the previous run can be utilized in the current run by carefully designing the set of target distributions. The experimental results on the DTLZ and WFG test suites show that the proposed PF-MOA achieves competitive performance compared with state-of-the-art multi-objective evolutionary algorithms on most test instances.
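As a minimal sketch of the reweighting idea described above (using the previous population as the proposal for the next target in the sequence), the code below shows one generic SMC-sampler transfer step; it is not the PF-MOA algorithm itself, and the move kernel is a placeholder:

```python
import numpy as np

def reweight_and_move(particles, log_prev_target, log_curr_target, move):
    """One knowledge-transfer step between successive target distributions.

    particles approximate the previous target; the incremental importance weight is
    pi_t(x) / pi_{t-1}(x), after which particles are resampled and rejuvenated by a
    user-supplied mutation or MCMC kernel (placeholder `move`).
    """
    log_w = log_curr_target(particles) - log_prev_target(particles)
    w = np.exp(log_w - log_w.max())               # stabilise before normalising
    w /= w.sum()
    idx = np.random.choice(len(particles), size=len(particles), p=w)
    return move(particles[idx])                   # rejuvenate the resampled population
```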
Style APA, Harvard, Vancouver, ISO itp.
42

Thorn, Graeme J., i John R. King. "The metabolic network of Clostridium acetobutylicum: Comparison of the approximate Bayesian computation via sequential Monte Carlo (ABC-SMC) and profile likelihood estimation (PLE) methods for determinability analysis". Mathematical Biosciences 271 (styczeń 2016): 62–79. http://dx.doi.org/10.1016/j.mbs.2015.10.016.

Pełny tekst źródła
Style APA, Harvard, Vancouver, ISO itp.
43

Lin, Ye, i Sean B. Andersson. "Expectation maximization based framework for joint localization and parameter estimation in single particle tracking from segmented images". PLOS ONE 16, nr 5 (21.05.2021): e0243115. http://dx.doi.org/10.1371/journal.pone.0243115.

Pełny tekst źródła
Streszczenie:
Single Particle Tracking (SPT) is a well-known class of tools for studying the dynamics of biological macromolecules moving inside living cells. In this paper, we focus on the problem of localization and parameter estimation given a sequence of segmented images. In the standard paradigm, the location of the emitter inside each frame of a sequence of camera images is estimated using, for example, Gaussian fitting (GF), and these locations are linked to provide an estimate of the trajectory. Trajectories are then analyzed by using Mean Square Displacement (MSD) or Maximum Likelihood Estimation (MLE) techniques to determine motion parameters such as diffusion coefficients. However, the problems of localization and parameter estimation are clearly coupled. Motivated by this, we have created an Expectation Maximization (EM)-based framework for simultaneous localization and parameter estimation. We demonstrate this framework through two representative methods, namely, Sequential Monte Carlo combined with Expectation Maximization (SMC-EM) and Unscented Kalman Filter combined with Expectation Maximization (U-EM). Using diffusion in two dimensions as a prototypical example, we conduct quantitative investigations of localization and parameter estimation performance across a wide range of signal-to-background ratios and diffusion coefficients and compare our methods to the standard techniques based on GF-MSD/MLE. To demonstrate the flexibility of the EM-based framework, we make comparisons using two different camera models: an ideal camera with Poisson-distributed shot noise but no readout noise, and a camera with both shot noise and the pixel-dependent readout noise that is common to scientific complementary metal-oxide semiconductor (sCMOS) cameras. Our results indicate that our EM-based methods outperform the standard techniques, especially at low signal levels. While U-EM and SMC-EM have similar accuracy, U-EM is significantly more computationally efficient, though the use of the Unscented Kalman Filter limits U-EM to lower diffusion rates.
Style APA, Harvard, Vancouver, ISO itp.
44

Schaaf, Alexander, Miguel de la Varga, Florian Wellmann i Clare E. Bond. "Constraining stochastic 3-D structural geological models with topology information using approximate Bayesian computation in GemPy 2.1". Geoscientific Model Development 14, nr 6 (28.06.2021): 3899–913. http://dx.doi.org/10.5194/gmd-14-3899-2021.

Pełny tekst źródła
Streszczenie:
Abstract. Structural geomodeling is a key technology for the visualization and quantification of subsurface systems. Given the limited data and the resulting necessity for geological interpretation to construct these geomodels, uncertainty is pervasive and traditionally unquantified. Probabilistic geomodeling allows for the simulation of uncertainties by automatically constructing geomodel ensembles from perturbed input data sampled from probability distributions. But random sampling of input parameters can lead to construction of geomodels that are unrealistic, either due to modeling artifacts or by not matching known information about the regional geology of the modeled system. We present a method to incorporate geological information in the form of known geomodel topology into stochastic simulations to constrain resulting probabilistic geomodel ensembles using the open-source geomodeling software GemPy. Simulated geomodel realizations are checked against topology information using an approximate Bayesian computation approach to avoid the specification of a likelihood function. We demonstrate how we can infer the posterior distributions of the model parameters using topology information in two experiments: (1) a synthetic geomodel using a rejection sampling scheme (ABC-REJ) to demonstrate the approach and (2) a geomodel of a subset of the Gullfaks field in the North Sea comparing both rejection sampling and a sequential Monte Carlo sampler (ABC-SMC). Possible improvements to processing speed of up to 10.1 times are discussed, focusing on the use of more advanced sampling techniques to avoid the simulation of unfeasible geomodels in the first place. Results demonstrate the feasibility of using topology graphs as a summary statistic to restrict the generation of geomodel ensembles with known geological information and to obtain improved ensembles of probable geomodels which respect the known topology information and exhibit reduced uncertainty using stochastic simulation methods.
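The workflow described above pairs a geomodel simulator with a topology summary statistic; the sketch below is a hypothetical illustration of the rejection-sampling variant (ABC-REJ), with simulate_geomodel, topology_graph, and prior_sample as stand-in names rather than GemPy API calls:

```python
import numpy as np

def abc_rejection_topology(prior_sample, simulate_geomodel, topology_graph,
                           observed_topology, n_draws=10000):
    """ABC rejection sketch using a topology graph as the summary statistic.

    A parameter draw is accepted only when its simulated topology graph matches
    the known (observed) topology; accepted draws approximate the posterior.
    """
    accepted = []
    for _ in range(n_draws):
        theta = prior_sample()
        model = simulate_geomodel(theta)
        if topology_graph(model) == observed_topology:   # zero-tolerance comparison
            accepted.append(theta)
    return np.array(accepted)                             # approximate posterior draws
```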
Style APA, Harvard, Vancouver, ISO itp.
45

Avecilla, Grace, Julie N. Chuong, Fangfei Li, Gavin Sherlock, David Gresham i Yoav Ram. "Neural networks enable efficient and accurate simulation-based inference of evolutionary parameters from adaptation dynamics". PLOS Biology 20, nr 5 (27.05.2022): e3001633. http://dx.doi.org/10.1371/journal.pbio.3001633.

Pełny tekst źródła
Streszczenie:
The rate of adaptive evolution depends on the rate at which beneficial mutations are introduced into a population and the fitness effects of those mutations. The rate of beneficial mutations and their expected fitness effects are often difficult to quantify empirically. As these 2 parameters determine the pace of evolutionary change in a population, the dynamics of adaptive evolution may enable inference of their values. Copy number variants (CNVs) are a pervasive source of heritable variation that can facilitate rapid adaptive evolution. Previously, we developed a locus-specific fluorescent CNV reporter to quantify CNV dynamics in evolving populations maintained in nutrient-limiting conditions using chemostats. Here, we use CNV adaptation dynamics to estimate the rate at which beneficial CNVs are introduced through de novo mutation and their fitness effects using simulation-based likelihood-free inference approaches. We tested the suitability of 2 evolutionary models: a standard Wright-Fisher model and a chemostat model. We evaluated 2 likelihood-free inference algorithms: the well-established Approximate Bayesian Computation with Sequential Monte Carlo (ABC-SMC) algorithm, and the recently developed Neural Posterior Estimation (NPE) algorithm, which applies an artificial neural network to directly estimate the posterior distribution. By systematically evaluating the suitability of different inference methods and models, we show that NPE has several advantages over ABC-SMC and that a Wright-Fisher evolutionary model suffices in most cases. Using our validated inference framework, we estimate the CNV formation rate at the GAP1 locus in the yeast Saccharomyces cerevisiae to be 10^-4.7 to 10^-4 CNVs per cell division and a fitness coefficient of 0.04 to 0.1 per generation for GAP1 CNVs in glutamine-limited chemostats. We experimentally validated our inference-based estimates using 2 distinct experimental methods, barcode lineage tracking and pairwise fitness assays, which provide independent confirmation of the accuracy of our approach. Our results are consistent with a beneficial CNV supply rate that is 10-fold greater than the estimated rates of beneficial single-nucleotide mutations, explaining the outsized importance of CNVs in rapid adaptive evolution. More generally, our study demonstrates the utility of novel neural network-based likelihood-free inference methods for inferring the rates and effects of evolutionary processes from empirical data, with possible applications ranging from tumor to viral evolution.
Style APA, Harvard, Vancouver, ISO itp.
46

Beskos, Alexandros, Dan O. Crisan, Ajay Jasra i Nick Whiteley. "Error Bounds and Normalising Constants for Sequential Monte Carlo Samplers in High Dimensions". Advances in Applied Probability 46, nr 01 (marzec 2014): 279–306. http://dx.doi.org/10.1017/s0001867800007047.

Pełny tekst źródła
Streszczenie:
In this paper we develop a collection of results associated to the analysis of the sequential Monte Carlo (SMC) samplers algorithm, in the context of high-dimensional independent and identically distributed target probabilities. The SMC samplers algorithm can be designed to sample from a single probability distribution, using Monte Carlo to approximate expectations with respect to this law. Given a target density in d dimensions, our results are concerned with d → ∞, while the number of Monte Carlo samples, N, remains fixed. We deduce an explicit bound on the Monte Carlo error for estimates derived using the SMC sampler and the exact asymptotic relative error of the estimate of the normalising constant associated to the target. We also establish marginal propagation of chaos properties of the algorithm. These results are deduced when the cost of the algorithm is O(Nd²).
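The abstract refers to the estimate of the normalising constant; for orientation, the quantities usually analysed in this setting take the following standard form (notation ours, following common SMC samplers conventions rather than the paper's):

\[
w_n\!\left(x_{n-1}^{i}\right) \;=\; \frac{\gamma_n\!\left(x_{n-1}^{i}\right)}{\gamma_{n-1}\!\left(x_{n-1}^{i}\right)},
\qquad
\widehat{Z} \;=\; \prod_{n=1}^{p} \frac{1}{N}\sum_{i=1}^{N} w_n\!\left(x_{n-1}^{i}\right),
\]

where \(\gamma_n\) are the unnormalised target densities in the sequence, \(w_n\) are the incremental importance weights when the particles are moved by kernels invariant with respect to each target, \(N\) is the number of particles, and \(p\) is the number of steps.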
Style APA, Harvard, Vancouver, ISO itp.
47

Beskos, Alexandros, Dan O. Crisan, Ajay Jasra i Nick Whiteley. "Error Bounds and Normalising Constants for Sequential Monte Carlo Samplers in High Dimensions". Advances in Applied Probability 46, nr 1 (marzec 2014): 279–306. http://dx.doi.org/10.1239/aap/1396360114.

Pełny tekst źródła
Streszczenie:
In this paper we develop a collection of results associated to the analysis of the sequential Monte Carlo (SMC) samplers algorithm, in the context of high-dimensional independent and identically distributed target probabilities. The SMC samplers algorithm can be designed to sample from a single probability distribution, using Monte Carlo to approximate expectations with respect to this law. Given a target density in d dimensions, our results are concerned with d → ∞, while the number of Monte Carlo samples, N, remains fixed. We deduce an explicit bound on the Monte Carlo error for estimates derived using the SMC sampler and the exact asymptotic relative error of the estimate of the normalising constant associated to the target. We also establish marginal propagation of chaos properties of the algorithm. These results are deduced when the cost of the algorithm is O(Nd²).
Style APA, Harvard, Vancouver, ISO itp.
48

Chen, Shoudong, Yan-lin Sun i Yang Liu. "Forecast of stock price fluctuation based on the perspective of volume information in stock and exchange market". China Finance Review International 8, nr 3 (20.08.2018): 297–314. http://dx.doi.org/10.1108/cfri-08-2017-0184.

Pełny tekst źródła
Streszczenie:
Purpose: In examining the relationship between volume and price in the stock market, this paper considers how to take the flow of foreign capital into account in order to determine whether the inclusion of volume information really contributes to predicting stock price volatility. Design/methodology/approach: By comparing the relative advantages and disadvantages of the two mainstream non-parametric methods, and taking the characteristics of the volume time series into account, the stochastic volatility with volume (SV-VOL) model based on the APF-LW simulation method is adopted, and a more efficient estimation algorithm is explored and implemented. Volume is incorporated into the model as a latent quantity, which addresses the insufficient use of volume information in previous research and thereby extends the SV model. Findings: The SV-VOL model is estimated effectively through a programmed sequential Monte Carlo (SMC) algorithm. It is found that stock market volume information helps predict stock price volatility. Exchange market volume information affects stock returns and the price-volume relationship, acting indirectly through the net capital flowing into the stock market. The current exchange rate devaluation and fluctuation are not conducive to the recovery of the stock market. Research limitations/implications: Whether the inclusion of volume information really contributes to predicting stock price volatility, and how to incorporate exchange market volume information, are still at an exploratory stage. This paper attempts to determine the information weight of the exchange market volume through direct and indirect channels from the perspective of causality; the related practices and conclusions still need to be tested and refined. Practical implications: Previous studies have neglected the influence of the information contained in exchange market volume on stock price volatility. To a certain extent, this research usefully supplements the existing work in terms of research problems, paradigms, methods, and conclusions. Originality/value: An SV model with volume information not only resolves the traditionally inefficient use of the information contained in volume but also further improves the estimation accuracy of the model by introducing exchange market volume information through weighted processing, which is a useful supplement to the existing literature. The programmed SMC algorithm supports the further development of non-parametric methods, and the paper makes a useful attempt to determine the weight of exchange market volume information, from which several useful conclusions are drawn.
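The SV-VOL specification itself is not given in the abstract; for orientation, the canonical stochastic volatility model that volume-augmented variants typically extend can be written as (our notation, an assumption about the general setup rather than the paper's exact model):

\[
y_t = \exp(h_t/2)\,\varepsilon_t, \qquad
h_t = \mu + \phi\,(h_{t-1} - \mu) + \sigma_\eta\,\eta_t, \qquad
\varepsilon_t,\ \eta_t \sim \mathcal{N}(0,1),
\]

where \(y_t\) is the return and \(h_t\) the latent log-volatility; a volume-augmented variant would add, for example, a term in the \(h_t\) equation driven by observed trading volume, and the latent states would then be filtered with a sequential Monte Carlo (particle filtering) algorithm.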
Style APA, Harvard, Vancouver, ISO itp.
49

Lian, Feng, Chen Li, Chongzhao Han i Hui Chen. "Convergence Analysis for the SMC-MeMBer and SMC-CBMeMBer Filters". Journal of Applied Mathematics 2012 (2012): 1–25. http://dx.doi.org/10.1155/2012/584140.

Pełny tekst źródła
Streszczenie:
The convergence of the sequential Monte Carlo (SMC) implementations of the multi-target multi-Bernoulli (MeMBer) filter and the cardinality-balanced MeMBer (CBMeMBer) filter is studied here. This paper proves that the SMC-MeMBer and SMC-CBMeMBer filters converge to the true MeMBer and CBMeMBer filters, respectively, in the mean-square sense, and the corresponding bounds on the mean-square errors are given. The theoretical significance of this paper is to present the convergence results for the SMC-MeMBer and SMC-CBMeMBer filters and the conditions under which the two filters achieve mean-square convergence.
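The abstract does not reproduce the bounds; for orientation, mean-square error bounds for SMC approximations in this literature typically take the generic form below (our notation; the paper's exact constants and conditions differ):

\[
\mathbb{E}\!\left[\left(\int f(x)\,\hat{\pi}_k^N(dx) - \int f(x)\,\pi_k(dx)\right)^{2}\right] \;\le\; \frac{c_k\,\|f\|^{2}}{N},
\]

where \(\hat{\pi}_k^N\) is the N-particle approximation at time k, \(f\) is a bounded test function, and \(c_k\) is a constant independent of \(N\); uniformity of such bounds in time requires additional mixing conditions on the model.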
Style APA, Harvard, Vancouver, ISO itp.
50

Ahmed, Imtiaz. "Dolphin Whistle Track Estimation Using Sequential Monte Carlo Probability Hypothesis Density Filter". Dhaka University Journal of Science 62, nr 1 (7.02.2015): 17–20. http://dx.doi.org/10.3329/dujs.v62i1.21954.

Pełny tekst źródła
Streszczenie:
This article focuses on the possible automation of the dolphin whistle track estimation process within the context of Multiple Target Tracking (MTT). It provides automatic whistle track estimation from raw hydrophone measurements using the Sequential Monte Carlo Probability Hypothesis Density (SMC-PHD) filter. Hydrophone measurements for three different species, namely the bottlenose dolphin (Tursiops truncatus), the common dolphin (Delphinus delphis), and the striped dolphin (Stenella coeruleoalba), have been used to benchmark the tracking performance of the SMC-PHD filter against three major challenges: the presence of multiple whistles, the spontaneous birth and death of whistles, and multiple whistles crossing each other. Quantitative analysis of the whistle track estimation accuracy is not possible since there is no ground-truth track for the dolphin whistles; hence, visual inspection of the estimated tracks has been used to corroborate the satisfactory tracking performance in the presence of all three challenges.
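For context, the measurement-update step of the PHD recursion, which the SMC-PHD filter approximates with weighted particles, has the standard form (notation ours, following the general PHD filtering literature rather than this article):

\[
D_{k|k}(x) = \left[\,1 - p_D(x) + \sum_{z \in Z_k} \frac{p_D(x)\, g_k(z \mid x)}{\kappa_k(z) + \int p_D(\xi)\, g_k(z \mid \xi)\, D_{k|k-1}(\xi)\, d\xi}\,\right] D_{k|k-1}(x),
\]

where \(D_{k|k-1}\) is the predicted intensity, \(p_D\) the detection probability, \(g_k(z \mid x)\) the measurement likelihood, \(Z_k\) the measurement set, and \(\kappa_k\) the clutter intensity; the expected number of whistles is the integral of \(D_{k|k}\).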
Style APA, Harvard, Vancouver, ISO itp.
