Journal articles on the topic 'Approximate sampling'

To see the other types of publications on this topic, follow the link: Approximate sampling.

Create an accurate reference in APA, MLA, Chicago, Harvard, and other citation styles.

Consult the top 50 journal articles for your research on the topic 'Approximate sampling.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Browse journal articles on a wide variety of disciplines and organise your bibliography correctly.

1. Von Collani, Elart. "Approximate a-optimal sampling plans." Statistics 18, no. 3 (January 1987): 333–44. http://dx.doi.org/10.1080/02331888708802025.

2. Carrizosa, Emilio. "On approximate Monetary Unit Sampling." European Journal of Operational Research 217, no. 2 (March 2012): 479–82. http://dx.doi.org/10.1016/j.ejor.2011.09.037.

3. Dimitrakakis, Christos, and Michail G. Lagoudakis. "Rollout sampling approximate policy iteration." Machine Learning 72, no. 3 (July 10, 2008): 157–71. http://dx.doi.org/10.1007/s10994-008-5069-3.

4. Rodrigues, G. S., David J. Nott, and S. A. Sisson. "Likelihood-free approximate Gibbs sampling." Statistics and Computing 30, no. 4 (March 11, 2020): 1057–73. http://dx.doi.org/10.1007/s11222-020-09933-x.

5. Ryan, Kenneth J. "Approximate Confidence Intervals for p When Double Sampling." American Statistician 63, no. 2 (May 2009): 132–40. http://dx.doi.org/10.1198/tast.2009.0027.

6. Geng, Bo, HuiJuan Zhang, Heng Wang, and GuoPing Wang. "Approximate Poisson disk sampling on mesh." Science China Information Sciences 56, no. 9 (September 9, 2011): 1–12. http://dx.doi.org/10.1007/s11432-011-4322-8.

7. Wang, Z., J. K. Kim, and S. Yang. "Approximate Bayesian inference under informative sampling." Biometrika 105, no. 1 (December 18, 2017): 91–102. http://dx.doi.org/10.1093/biomet/asx073.

8. Shaltiel, Ronen, and Christopher Umans. "Pseudorandomness for Approximate Counting and Sampling." Computational Complexity 15, no. 4 (December 2006): 298–341. http://dx.doi.org/10.1007/s00037-007-0218-9.

9. Monaco, Salvatore, and Dorothée Normand-Cyrot. "Linearization by Output Injection under Approximate Sampling." European Journal of Control 15, no. 2 (January 2009): 205–17. http://dx.doi.org/10.3166/ejc.15.205-217.

10. Chaudhuri, Surajit, Gautam Das, and Vivek Narasayya. "Optimized stratified sampling for approximate query processing." ACM Transactions on Database Systems 32, no. 2 (June 2007): 9. http://dx.doi.org/10.1145/1242524.1242526.

11. Warne, David J., Ruth E. Baker, and Matthew J. Simpson. "Multilevel rejection sampling for approximate Bayesian computation." Computational Statistics & Data Analysis 124 (August 2018): 71–86. http://dx.doi.org/10.1016/j.csda.2018.02.009.

12. Saad, Feras A., Cameron E. Freer, Martin C. Rinard, and Vikash K. Mansinghka. "Optimal approximate sampling from discrete probability distributions." Proceedings of the ACM on Programming Languages 4, POPL (January 2020): 1–31. http://dx.doi.org/10.1145/3371104.

13. Etheridge, Alison, Peter Pfaffelhuber, and Anton Wakolbinger. "An approximate sampling formula under genetic hitchhiking." Annals of Applied Probability 16, no. 2 (May 2006): 685–729. http://dx.doi.org/10.1214/105051606000000114.

14. Shou-zhi, Yang, Cheng Zheng-xing, and Tang Yuan-yan. "Approximate sampling theorem for bivariate continuous function." Applied Mathematics and Mechanics 24, no. 11 (November 2003): 1355–61. http://dx.doi.org/10.1007/bf02439660.

15. Cervellera, Cristiano, and Marco Muselli. "Efficient sampling in approximate dynamic programming algorithms." Computational Optimization and Applications 38, no. 3 (June 23, 2007): 417–43. http://dx.doi.org/10.1007/s10589-007-9054-8.
16. Ding, Lizhong, Yong Liu, Shizhong Liao, Yu Li, Peng Yang, Yijie Pan, Chao Huang, Ling Shao, and Xin Gao. "Approximate Kernel Selection with Strong Approximate Consistency." Proceedings of the AAAI Conference on Artificial Intelligence 33 (July 17, 2019): 3462–69. http://dx.doi.org/10.1609/aaai.v33i01.33013462.
Abstract: Kernel selection is fundamental to the generalization performance of kernel-based learning algorithms. Approximate kernel selection is an efficient kernel selection approach that exploits the convergence property of the kernel selection criteria and the computational virtue of kernel matrix approximation. The convergence property is measured by the notion of approximate consistency. For the existing Nyström approximations, whose sampling distributions are independent of the specific learning task at hand, it is difficult to establish the strong approximate consistency. They mainly focus on the quality of the low-rank matrix approximation, rather than the performance of the kernel selection criterion used in conjunction with the approximate matrix. In this paper, we propose a novel Nyström approximate kernel selection algorithm by customizing a criterion-driven adaptive sampling distribution for the Nyström approximation, which adaptively reduces the error between the approximate and accurate criteria. We theoretically derive the strong approximate consistency of the proposed Nyström approximate kernel selection algorithm. Finally, we empirically evaluate the approximate consistency of our algorithm as compared to state-of-the-art methods.
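The Nyström approximation discussed in this abstract can be illustrated with a minimal sketch, assuming plain uniform landmark sampling rather than the criterion-driven adaptive distribution the paper proposes; the function names and the RBF kernel choice are illustrative only:

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    """Gaussian (RBF) kernel matrix between the rows of X and Y."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def nystrom_approximation(X, m, gamma=1.0, rng=None):
    """Rank-m Nystrom approximation of the full n x n kernel matrix.

    Samples m landmark points uniformly at random (the baseline scheme;
    a task-adaptive distribution would replace this step) and returns
    K_approx = C @ pinv(W) @ C.T, built from O(n*m) kernel evaluations.
    """
    rng = np.random.default_rng(rng)
    n = X.shape[0]
    idx = rng.choice(n, size=m, replace=False)
    C = rbf_kernel(X, X[idx], gamma)        # n x m cross-kernel block
    W = rbf_kernel(X[idx], X[idx], gamma)   # m x m landmark kernel
    return C @ np.linalg.pinv(W) @ C.T

# Tiny demonstration: the approximation error typically shrinks
# as more landmark columns are sampled.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 3))
K = rbf_kernel(X, X)
err_small = np.linalg.norm(K - nystrom_approximation(X, 10, rng=1))
err_large = np.linalg.norm(K - nystrom_approximation(X, 100, rng=1))
print(f"m=10 error {err_small:.3f}, m=100 error {err_large:.3f}")
```

The approximation needs only the sampled columns of the kernel matrix, which is what makes kernel selection criteria cheap to evaluate repeatedly.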
17. Jia, Cai-Yan, and Xie-Ping Gao. "Multi-Scaling Sampling: An Adaptive Sampling Method for Discovering Approximate Association Rules." Journal of Computer Science and Technology 20, no. 3 (May 2005): 309–18. http://dx.doi.org/10.1007/s11390-005-0309-5.
18. Rengaraj, Varadarajan, Michael Lass, Christian Plessl, and Thomas D. Kühne. "Accurate Sampling with Noisy Forces from Approximate Computing." Computation 8, no. 2 (April 28, 2020): 39. http://dx.doi.org/10.3390/computation8020039.
Abstract: In scientific computing, the acceleration of atomistic computer simulations by means of custom hardware is finding ever-growing application. A major limitation, however, is that the high efficiency in terms of performance and low power consumption entails the massive usage of low-precision computing units. Here, based on the approximate computing paradigm, we present an algorithmic method that rigorously compensates for numerical inaccuracies due to low-accuracy arithmetic operations, yet still obtains exact expectation values using a properly modified Langevin-type equation.
19. Booth, Thomas E. "An Approximate Monte Carlo Adaptive Importance Sampling Method." Nuclear Science and Engineering 138, no. 1 (May 2001): 96–103. http://dx.doi.org/10.13182/nse01-a2204.

20. Ma, Jianwei, Gerlind Plonka, and M. Yousuff Hussaini. "Compressive Video Sampling With Approximate Message Passing Decoding." IEEE Transactions on Circuits and Systems for Video Technology 22, no. 9 (September 2012): 1354–64. http://dx.doi.org/10.1109/tcsvt.2012.2201673.

21. Berliner, L. Mark, and Christopher K. Wikle. "Approximate importance sampling Monte Carlo for data assimilation." Physica D: Nonlinear Phenomena 230, no. 1-2 (June 2007): 37–49. http://dx.doi.org/10.1016/j.physd.2006.07.031.

22. Boichard, D., L. R. Schaeffer, and A. J. Lee. "Approximate restricted maximum likelihood and approximate prediction error variance of the Mendelian sampling effect." Genetics Selection Evolution 24, no. 4 (1992): 331. http://dx.doi.org/10.1186/1297-9686-24-4-331.

23. Zheng, Haifeng, Min Gao, Zhizhang Chen, Xiao-Yang Liu, and Xinxin Feng. "An Adaptive Sampling Scheme via Approximate Volume Sampling for Fingerprint-Based Indoor Localization." IEEE Internet of Things Journal 6, no. 2 (April 2019): 2338–53. http://dx.doi.org/10.1109/jiot.2019.2906489.

24. Spagnolo, Fanny, Stefania Perri, and Pasquale Corsonello. "Approximate Down-Sampling Strategy for Power-Constrained Intelligent Systems." IEEE Access 10 (2022): 7073–81. http://dx.doi.org/10.1109/access.2022.3142292.

25. Antoniou, Dimitri, and Steven D. Schwartz. "Approximate inclusion of quantum effects in transition path sampling." Journal of Chemical Physics 131, no. 22 (December 14, 2009): 224111. http://dx.doi.org/10.1063/1.3272793.

26. Sakata, Sei-ichiro, Fumihiro Ashida, and Masaru Zako. "106 Kriging-based Approximate Optimization using Dispersed Sampling Data." Proceedings of The Computational Mechanics Conference 2006.19 (2006): 31–32. http://dx.doi.org/10.1299/jsmecmd.2006.19.31.

27. Cervellera, Cristiano, and Danilo Macciò. "F-Discrepancy for Efficient Sampling in Approximate Dynamic Programming." IEEE Transactions on Cybernetics 46, no. 7 (July 2016): 1628–39. http://dx.doi.org/10.1109/tcyb.2015.2453123.

28. Gelb, Lev D. "Monte Carlo simulations using sampling from an approximate potential." Journal of Chemical Physics 118, no. 17 (May 2003): 7747–50. http://dx.doi.org/10.1063/1.1563597.

29. Jiang, Feng, Xu Yu, Junwei Du, Dunwei Gong, Youqiang Zhang, and Yanjun Peng. "Ensemble learning based on approximate reducts and bootstrap sampling." Information Sciences 547 (February 2021): 797–813. http://dx.doi.org/10.1016/j.ins.2020.08.069.
30. Singh, Anubhav, Nir Lipovetzky, Miquel Ramirez, and Javier Segovia-Aguas. "Approximate Novelty Search." Proceedings of the International Conference on Automated Planning and Scheduling 31 (May 17, 2021): 349–57. http://dx.doi.org/10.1609/icaps.v31i1.15980.
Abstract: Width-based search algorithms seek plans by prioritizing states according to a suitably defined measure of novelty, which maps states into a set of novelty categories. Space and time complexity to evaluate state novelty is known to be exponential in the cardinality of the set. We present novel methods to obtain polynomial approximations of novelty and width-based search. First, we approximate novelty computation via random sampling and Bloom filters, reducing the runtime and memory footprint. Second, we approximate best-first search using an adaptive policy that decides whether to forgo the expansion of nodes in the open list. These two techniques are integrated into existing width-based algorithms, resulting in new planners that perform significantly better than other state-of-the-art planners over benchmarks from the International Planning Competitions.
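The combination of Bloom filters and novelty tests described in this abstract can be sketched roughly as follows: a generic width-1 novelty check over state atoms, where the class name, hash scheme, and parameters are hypothetical illustrations, not the planners' actual implementation:

```python
import hashlib

class BloomNoveltyTable:
    """Approximate width-1 novelty: a state is 'novel' if it contains at
    least one atom not seen in any earlier state. A Bloom filter stands in
    for the exact seen-atom set, trading a small false-positive rate
    (a genuinely new atom may be misjudged as seen) for constant memory."""

    def __init__(self, num_bits=2 ** 16, num_hashes=4):
        self.num_bits = num_bits
        self.num_hashes = num_hashes
        self.bits = bytearray(num_bits // 8)

    def _positions(self, atom):
        # Derive k bit positions from slices of a single SHA-256 digest.
        digest = hashlib.sha256(repr(atom).encode()).digest()
        for k in range(self.num_hashes):
            chunk = int.from_bytes(digest[4 * k:4 * k + 4], "big")
            yield chunk % self.num_bits

    def _contains(self, atom):
        return all(self.bits[p // 8] >> (p % 8) & 1 for p in self._positions(atom))

    def _add(self, atom):
        for p in self._positions(atom):
            self.bits[p // 8] |= 1 << (p % 8)

    def is_novel(self, state):
        """True if some atom of the state is (apparently) unseen;
        afterwards, record all of the state's atoms."""
        novel = any(not self._contains(a) for a in state)
        for a in state:
            self._add(a)
        return novel

table = BloomNoveltyTable()
print(table.is_novel({("at", "A"), ("holding", "x")}))  # True: nothing seen yet
print(table.is_novel({("at", "A")}))                    # False: atom already recorded
```

Because Bloom filters never produce false negatives, a state reported as not novel really does contain only previously recorded atoms; the approximation errs only by occasionally discarding a truly novel state.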
31. Pham, Trung, and Alex A. Gorodetsky. "Ensemble Approximate Control Variate Estimators: Applications to MultiFidelity Importance Sampling." SIAM/ASA Journal on Uncertainty Quantification 10, no. 3 (September 28, 2022): 1250–92. http://dx.doi.org/10.1137/21m1390426.

32. Prescott, Thomas P., and Ruth E. Baker. "Multifidelity Approximate Bayesian Computation with Sequential Monte Carlo Parameter Sampling." SIAM/ASA Journal on Uncertainty Quantification 9, no. 2 (January 2021): 788–817. http://dx.doi.org/10.1137/20m1316160.

33. Gibbons, Phillip B., and Yossi Matias. "New sampling-based summary statistics for improving approximate query answers." ACM SIGMOD Record 27, no. 2 (June 1998): 331–42. http://dx.doi.org/10.1145/276305.276334.

34. Tian, J. L., and X. Y. Tong. "An Adaptive Sampling Algorithm and Approximate-Model-Based Optimization Method." Journal of Physics: Conference Series 1060 (July 2018): 012080. http://dx.doi.org/10.1088/1742-6596/1060/1/012080.

35. Jabot, Franck, Thierry Faure, and Nicolas Dumoulin. "EasyABC: performing efficient approximate Bayesian computation sampling schemes using R." Methods in Ecology and Evolution 4, no. 7 (April 8, 2013): 684–87. http://dx.doi.org/10.1111/2041-210x.12050.

36. Jokei, Yosuke, Sei-ichiro Sakata, and Fumihiro Ashida. "1118 On Improvement of Uniform Sampling Strategy for Approximate Optimization." Proceedings of The Computational Mechanics Conference 2011.24 (2011): 394–95. http://dx.doi.org/10.1299/jsmecmd.2011.24.394.

37. Bhaskar, Anand, John A. Kamm, and Yun S. Song. "Approximate Sampling Formulae for General Finite-Alleles Models of Mutation." Advances in Applied Probability 44, no. 2 (June 2012): 408–28. http://dx.doi.org/10.1239/aap/1339878718.
Abstract: Many applications in genetic analyses utilize sampling distributions, which describe the probability of observing a sample of DNA sequences randomly drawn from a population. In the one-locus case with special models of mutation, such as the infinite-alleles model or the finite-alleles parent-independent mutation model, closed-form sampling distributions under the coalescent have been known for many decades. However, no exact formula is currently known for more general models of mutation that are of biological interest. In this paper, models with finitely-many alleles are considered, and an urn construction related to the coalescent is used to derive approximate closed-form sampling formulae for an arbitrary irreducible recurrent mutation model or for a reversible recurrent mutation model, depending on whether the number of distinct observed allele types is at most three or four, respectively. It is demonstrated empirically that the formulae derived here are highly accurate when the per-base mutation rate is low, which holds for many biological organisms.
38. L'Ecuyer, Pierre, Gerardo Rubino, Samira Saggadi, and Bruno Tuffin. "Approximate Zero-Variance Importance Sampling for Static Network Reliability Estimation." IEEE Transactions on Reliability 60, no. 3 (September 2011): 590–604. http://dx.doi.org/10.1109/tr.2011.2135670.

39. Bhaskar, Anand, John A. Kamm, and Yun S. Song. "Approximate Sampling Formulae for General Finite-Alleles Models of Mutation." Advances in Applied Probability 44, no. 2 (June 2012): 408–28. http://dx.doi.org/10.1017/s0001867800005668.
40. Tang, Man-Lai, and Maozai Tian. "Approximate confidence interval construction for risk difference under inverse sampling." Statistics and Computing 20, no. 1 (March 26, 2009): 87–98. http://dx.doi.org/10.1007/s11222-009-9118-y.

41. Maire, Florian, Nial Friel, and Pierre Alquier. "Informed sub-sampling MCMC: approximate Bayesian inference for large datasets." Statistics and Computing 29, no. 3 (June 9, 2018): 449–82. http://dx.doi.org/10.1007/s11222-018-9817-3.

42. Robinson, P. M. "Approximate optimal allocation in repeated sampling from a finite population." Journal of Statistical Planning and Inference 11, no. 2 (February 1985): 135–48. http://dx.doi.org/10.1016/0378-3758(85)90001-1.

43. Singh, Manjit, Butta Singh, and Vijay Kumar Banga. "Effect of ECG Sampling Frequency on Approximate Entropy based HRV." International Journal of Bio-Science and Bio-Technology 6, no. 4 (August 31, 2014): 179–86. http://dx.doi.org/10.14257/ijbsbt.2014.6.4.17.

44. Li, Ji, Akshita Maradapu Vera Venkata Sai, Xiuzhen Cheng, Wei Cheng, Zhi Tian, and Yingshu Li. "Sampling-based approximate skyline query in sensor equipped IoT networks." Tsinghua Science and Technology 26, no. 2 (April 2021): 219–29. http://dx.doi.org/10.26599/tst.2019.9010060.

45. Guo, Ling, Akil Narayan, Liang Yan, and Tao Zhou. "Weighted Approximate Fekete Points: Sampling for Least-Squares Polynomial Approximation." SIAM Journal on Scientific Computing 40, no. 1 (January 2018): A366–A387. http://dx.doi.org/10.1137/17m1140960.

46. Cervellera, Cristiano, Mauro Gaggero, and Danilo Macciò. "Lattice point sets for state sampling in approximate dynamic programming." Optimal Control Applications and Methods 38, no. 6 (May 24, 2017): 1193–207. http://dx.doi.org/10.1002/oca.2325.

47. Wang, Liang, James C. Bezdek, Christopher Leckie, and Ramamohanarao Kotagiri. "Selective sampling for approximate clustering of very large data sets." International Journal of Intelligent Systems 23, no. 3 (2008): 313–31. http://dx.doi.org/10.1002/int.20268.

48. Cervellera, C., M. Gaggero, and D. Macciò. "Low-discrepancy sampling for approximate dynamic programming with local approximators." Computers & Operations Research 43 (March 2014): 108–15. http://dx.doi.org/10.1016/j.cor.2013.09.006.

49. Boje, Edward. "Approximate models for continuous-time linear systems with sampling jitter." Automatica 41, no. 12 (December 2005): 2091–98. http://dx.doi.org/10.1016/j.automatica.2005.06.011.
50. Romanuke, Vadim. "Sampling Individually Fundamental Simplexes as Sets of Players' Mixed Strategies in Finite Noncooperative Game for Applicable Approximate Nash Equilibrium Situations with Possible Concessions." Journal of Information and Organizational Sciences 40, no. 1 (June 16, 2016): 105–43. http://dx.doi.org/10.31341/jios.40.1.6.
Abstract: In a finite noncooperative game, a method for finding approximate Nash equilibrium situations is developed. The method is based on sampling the fundamental simplexes that form the sets of players' mixed strategies. As the sampling is exercised, the sets of players' mixed strategies are mapped into finite lattices. Sampling steps may differ: each player, within every dimension of its simplex, selects and controls its own sampling step individually. To prevent low approximation quality, however, the sampling steps are restricted: a player acting singly with minimal spacing over its lattice cannot change any player's payoff by more than a predetermined magnitude, specific to each player. The finite lattice is built explicitly by the presented routine, in which the player's mixed strategies are calculated and arranged. The product of all the players' finite lattices approximates the product of the continuous fundamental simplexes, which re-defines the finite noncooperative game in its finite mixed extension on the product of the lattices. In such a finite-mixed-extension-defined game, the set of Nash equilibrium situations may be empty; therefore, approximate Nash equilibrium situations are defined through introduced possible payoff concessions. A routine for finding approximate equilibrium situations is presented; approximate strong Nash equilibria with possible concessions are defined, and a routine for finding them is presented as well. Acceleration of finding approximate equilibria is also argued. Finally, the developed method is discussed as a basis for a universal approach to approximating solutions of finite noncooperative games, one that unifies the game solvability, applicability, realizability, and adaptability.
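As a rough illustration of the sampling idea in this abstract, the sketch below enumerates a single player's mixed strategies on a uniformly sampled fundamental simplex. It assumes one shared step size per player; the paper's method uses individually controlled, dimension-wise steps and builds the equilibrium search on top, which is not shown here.

```python
from fractions import Fraction
from itertools import product

def simplex_lattice(num_strategies, steps):
    """Enumerate the points of the fundamental simplex
    {p : p_i >= 0, sum(p_i) = 1} whose coordinates are integer
    multiples of 1/steps -- a finite lattice of mixed strategies."""
    grid = [Fraction(k, steps) for k in range(steps + 1)]
    # Choose the first (num_strategies - 1) coordinates freely on the
    # grid; the last coordinate is forced by the sum-to-one constraint.
    for point in product(grid, repeat=num_strategies - 1):
        if sum(point) <= 1:
            yield point + (1 - sum(point),)

# A player with 3 pure strategies, sampled with step 1/2:
points = list(simplex_lattice(3, 2))
print(len(points))  # 6 lattice points
for p in points:
    print(p)
```

The lattice has C(steps + n - 1, n - 1) points for n pure strategies, so finer steps enlarge the finite game quickly, which is why the paper restricts the sampling steps via bounds on how much any single player can shift payoffs.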