Selected scientific literature on the topic "Bernoulli-Gaussian"
Create an accurate reference in APA, MLA, Chicago, Harvard, and other citation styles
Consult the list of current articles, books, theses, conference proceedings, and other relevant scientific sources on the topic "Bernoulli-Gaussian".
Next to each source in the reference list there is an "Add to bibliography" button. Click it and we will automatically generate the bibliographic citation of the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the scientific publication in .pdf format and read the abstract of the work online, if it is included in the metadata.
Journal articles on the topic "Bernoulli-Gaussian"
Hrabovets, Anastasiia. "Feynman diagrams and their limits for Bernoulli noise". Theory of Stochastic Processes 27(43), no. 1 (November 16, 2023): 11–30. http://dx.doi.org/10.3842/tsp-4311781209-33.
Törő, Olivér, Tamás Bécsi, Szilárd Aradi, and Péter Gáspár. "IMM Bernoulli Gaussian Particle Filter". IFAC-PapersOnLine 51, no. 22 (2018): 274–79. http://dx.doi.org/10.1016/j.ifacol.2018.11.554.
Xie, Shaohao, Shaohua Zhuang, and Yusong Du. "Improved Bernoulli Sampling for Discrete Gaussian Distributions over the Integers". Mathematics 9, no. 4 (February 13, 2021): 378. http://dx.doi.org/10.3390/math9040378.
Finamore, Weiler, Marcelo Pinho, Manish Sharma, and Moises Ribeiro. "Modeling Noise as a Bernoulli-Gaussian Process". Journal of Communication and Information Systems 38 (2023): 175–86. http://dx.doi.org/10.14209/jcis.2023.20.
Bobkov, Sergey G., Friedrich Gotze, and Christian Houdre. "On Gaussian and Bernoulli Covariance Representations". Bernoulli 7, no. 3 (June 2001): 439. http://dx.doi.org/10.2307/3318495.
Lavielle, Marc. "Bayesian deconvolution of Bernoulli-Gaussian processes". Signal Processing 33, no. 1 (July 1993): 67–79. http://dx.doi.org/10.1016/0165-1684(93)90079-p.
De La Rue, Thierry. "Systèmes dynamiques gaussiens d'entropie nulle, lâchement et non lâchement Bernoulli". Ergodic Theory and Dynamical Systems 16, no. 2 (April 1996): 379–404. http://dx.doi.org/10.1017/s0143385700008865.
Al-Zuhairi, Dheyaa T., and Abbas Salman Hameed. "DOA estimation under Bernoulli-Gaussian impulsive noise". IOP Conference Series: Materials Science and Engineering 1090, no. 1 (March 1, 2021): 012096. http://dx.doi.org/10.1088/1757-899x/1090/1/012096.
Chi, Chong-Yung, and J. Mendel. "Viterbi algorithm detector for Bernoulli-Gaussian processes". IEEE Transactions on Acoustics, Speech, and Signal Processing 33, no. 3 (June 1985): 511–19. http://dx.doi.org/10.1109/tassp.1985.1164580.
Talagrand, Michel. "Gaussian averages, Bernoulli averages, and Gibbs' measures". Random Structures and Algorithms 21, no. 3-4 (October 2002): 197–204. http://dx.doi.org/10.1002/rsa.10059.
Theses / dissertations on the topic "Bernoulli-Gaussian"
Gaerke, Tiffani M. "Characteristic Functions and Bernoulli-Gaussian Impulsive Noise Channels". University of Akron / OhioLINK, 2014. http://rave.ohiolink.edu/etdc/view?acc_num=akron1408040080.
Vu, Hung Van. "Capacities of Bernoulli-Gaussian Impulsive Noise Channels in Rayleigh Fading". University of Akron / OhioLINK, 2014. http://rave.ohiolink.edu/etdc/view?acc_num=akron1407370145.
Texto completo da fonteBoudineau, Mégane. "Vers la résolution "optimale" de problèmes inverses non linéaires parcimonieux grâce à l'exploitation de variables binaires sur dictionnaires continus : applications en astrophysique". Thesis, Toulouse 3, 2019. http://www.theses.fr/2019TOU30020/document.
This thesis deals with solutions of nonlinear inverse problems using a sparsity prior; more specifically, when the data can be modelled as a linear combination of a few functions which depend non-linearly on a "location" parameter, e.g. frequencies for spectral analysis or time delays for spike train deconvolution. These problems are generally reformulated as linear sparse approximation problems, thanks to an evaluation of the nonlinear functions at location parameters discretised on a thin grid, building a "discrete dictionary". However, such an approach has two major drawbacks. On the one hand, the discrete dictionary is highly correlated; classical sub-optimal methods such as L1-penalisation or greedy algorithms can then fail. On the other hand, the estimated location parameter, which belongs to the discretisation grid, is necessarily discrete, and that leads to model errors. To deal with these issues, we propose in this work an exact sparsity model, thanks to the introduction of binary variables, and an optimal solution of the problem with a "continuous dictionary" allowing a continuous estimation of the location parameter. We focus on two research axes, which we illustrate with problems such as spike train deconvolution and spectral analysis of unevenly sampled data. The first axis focuses on the "dictionary interpolation" principle, which consists in a linearisation of the continuous dictionary in order to get a constrained linear sparse approximation problem. The introduction of binary variables allows us to reformulate this problem as a "Mixed Integer Program" (MIP) and to exactly model the sparsity thanks to the "pseudo-norm L0". We study different kinds of dictionary interpolation and constraint relaxation, in order to solve the problem optimally thanks to classical MIP algorithms. For the second axis, in a Bayesian framework, the binary variables are supposed random with a Bernoulli distribution and we model the sparsity through a Bernoulli-Gaussian prior. This model is extended to take into account continuous location parameters (BGE model). We then estimate the parameters from samples drawn using Markov chain Monte Carlo algorithms. In particular, we show that marginalising the amplitudes allows us to improve the sampling of a Gibbs algorithm in a supervised case (when the model's hyperparameters are known). In an unsupervised case, we propose to take advantage of such a marginalisation through a "Partially Collapsed Gibbs Sampler". Finally, we adapt the BGE model and associated samplers to a topical science case in Astrophysics: the detection of exoplanets from radial velocity measurements. The efficiency of our method is illustrated with simulated data, as well as actual astrophysical data.
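The spike-train model underlying this abstract can be illustrated with a short simulation. The following is a minimal sketch, assuming illustrative values for the spike probability, amplitude variance, noise level, and a Gaussian convolution kernel; it only generates data from the Bernoulli-Gaussian forward model and does not implement the MIP or Gibbs-sampling solvers developed in the thesis.

# Minimal forward-model sketch of a Bernoulli-Gaussian spike train (illustrative parameters).
import numpy as np

rng = np.random.default_rng(0)

n = 200          # signal length
p = 0.05         # Bernoulli prior probability of a spike
sigma_a = 1.0    # standard deviation of spike amplitudes
sigma_n = 0.1    # standard deviation of the observation noise

# Bernoulli-Gaussian spike train: x_k = q_k * a_k with q_k ~ Bern(p), a_k ~ N(0, sigma_a^2)
q = rng.random(n) < p
x = q * rng.normal(0.0, sigma_a, n)

# Known convolution kernel (here an assumed short Gaussian pulse) and noisy observations
t = np.arange(-10, 11)
h = np.exp(-0.5 * (t / 3.0) ** 2)
y = np.convolve(x, h, mode="same") + rng.normal(0.0, sigma_n, n)

print(f"{q.sum()} spikes out of {n} samples")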
Barbault, Pierre. "Un ticket pour le sparse : de l'estimation des signaux et des paramètres en problèmes inverses bernoulli-gaussiens". Electronic Thesis or Diss., université Paris-Saclay, 2023. http://www.theses.fr/2023UPASG049.
Magneto/electroencephalography (M/EEG) imaging can be used to reconstruct focal points of cerebral activity by measuring the electromagnetic field they produce. Even if the characteristic time of the recorded signals is low enough to allow a linear acquisition model, the number of possible sources remains very large compared with the number of sensors: the problem is ill-posed and, moreover, large-scale. In order to reduce it to a well-posed problem, a common assumption, which makes sense for neurons, is that the sources are sparse, i.e. that the number of non-zero values is very small. We then model the problem from a probabilistic point of view using a Bernoulli-Gaussian (BG) prior for the sources. Many methods can solve such a problem, but most of them require knowledge of the parameters of the BG distribution. The objective of this thesis is to propose a completely unsupervised approach which estimates the parameters of the BG distribution as well as, if possible, the sources. To do this, Expectation-Maximization (EM) algorithms are explored. First, the simplest case is treated: that of denoising, where the linear operator is the identity. In this framework, three algorithms are proposed: a method of moments based on data statistics, an EM algorithm, and a joint estimation algorithm for sources and parameters. The results show that the EM algorithm initialised by the method of moments is the best candidate for parameter estimation. Secondly, the previous results are extended to the general case of any linear operator thanks to the introduction of a latent variable. This variable, by decoupling the sources from the observations, makes it possible to derive so-called "latent" algorithms which alternate between a gradient descent step and a denoising step corresponding exactly to the problem dealt with previously. The results then show that the most effective strategy is to use the "latent" joint estimate to initialise the "latent" EM. Finally, the last part of this work is devoted to theoretical considerations concerning the choice of joint or marginal estimators in the supervised case, in particular with regard to the sources and their supports. This work shows that it is possible to frame marginal problems by joint problems thanks to a reparameterization of the problem, which makes it possible to propose a general estimation strategy based on the initialization of marginal estimation algorithms by joint estimation algorithms.
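As a rough illustration of the denoising case mentioned above (identity operator), the sketch below estimates the Bernoulli-Gaussian parameters from noisy samples by a method of moments. It assumes zero-mean amplitudes and a known noise variance, and the closed-form estimator is an illustration of the idea, not necessarily the exact algorithm of the thesis.

# Method-of-moments sketch for y = x + n with x Bernoulli-Gaussian and known noise variance.
import numpy as np

rng = np.random.default_rng(1)

def moments_estimate(y, sigma_n2):
    """Estimate (p, sigma_x^2) of a Bernoulli-Gaussian source from noisy samples y."""
    m2 = np.mean(y ** 2)
    m4 = np.mean(y ** 4)
    s = m2 - sigma_n2                                   # equals p * sigma_x^2
    t = m4 - 6.0 * s * sigma_n2 - 3.0 * sigma_n2 ** 2   # equals 3 * p * sigma_x^4
    sigma_x2 = t / (3.0 * s)
    p = s / sigma_x2
    return p, sigma_x2

# Synthetic check: simulate y = x + n and recover the parameters
n, p_true, sx2_true, sn2_true = 100_000, 0.1, 4.0, 0.25
x = (rng.random(n) < p_true) * rng.normal(0.0, np.sqrt(sx2_true), n)
y = x + rng.normal(0.0, np.sqrt(sn2_true), n)
print(moments_estimate(y, sn2_true))   # roughly (0.1, 4.0)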
Yu, Jia. "Distributed parameter and state estimation for wireless sensor networks". Thesis, University of Edinburgh, 2017. http://hdl.handle.net/1842/28929.
De la Rue, Thierry. "Quelques résultats sur les systèmes dynamiques gaussiens réels". Rouen, 1994. https://tel.archives-ouvertes.fr/tel-01546012.
Guan, Feng-Bo (馮博冠). "Parameter Estimations of Condition Gaussian Distribution Given Bernoulli Distribution". Thesis, 2002. http://ndltd.ncl.edu.tw/handle/87455161476925419316.
Texto completo da fonte國立中央大學
數學研究所
90
Abstract: The sample mean and sample variance of a Gaussian distribution have the following nice statistical properties: (1) both are sufficient; (2) they are independent; (3) the sample mean is the m.l.e., the UMVUE, and the method-of-moments estimator; (4) the sample variance is the m.l.e. and the method-of-moments estimator, and is the UMVUE if multiplied by a constant; (5) the variances of both estimators achieve the Cramér-Rao lower bound; (6) both estimators are asymptotically efficient. Based on samples obtained from the conditional Gaussian distribution given a Bernoulli distribution, we study the conditional sample mean and conditional sample variance and check whether they also have the above statistical properties.
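As a quick illustration of the conditional setting in this abstract, the sketch below simulates observations that are Gaussian conditionally on a Bernoulli variable and computes the conditional sample mean and conditional sample variance within each class. The success probability, means, and variances are illustrative assumptions; the abstract does not specify the exact model.

# Simulation sketch: Gaussian observations conditional on a Bernoulli variable (assumed parameters).
import numpy as np

rng = np.random.default_rng(2)

n, p = 10_000, 0.3
mu = {0: 0.0, 1: 2.0}        # assumed conditional means given B = 0, 1
sigma = {0: 1.0, 1: 0.5}     # assumed conditional standard deviations given B = 0, 1

b = (rng.random(n) < p).astype(int)
x = np.where(b == 1,
             rng.normal(mu[1], sigma[1], n),
             rng.normal(mu[0], sigma[0], n))

for value in (0, 1):
    xs = x[b == value]
    print(f"B={value}: conditional sample mean={xs.mean():.3f}, "
          f"conditional sample variance={xs.var(ddof=1):.3f}")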
"Bayesian Inference Frameworks for Fluorescence Microscopy Data Analysis". Master's thesis, 2019. http://hdl.handle.net/2286/R.I.53545.
Master's thesis, Applied Mathematics, 2019.
Fantacci, Claudio. "Distributed multi-object tracking over sensor networks: a random finite set approach". Doctoral thesis, 2015. http://hdl.handle.net/2158/1003256.
Book chapters on the topic "Bernoulli-Gaussian"
Cho, KyungHyun, Alexander Ilin, and Tapani Raiko. "Improved Learning of Gaussian-Bernoulli Restricted Boltzmann Machines". In Lecture Notes in Computer Science, 10–17. Berlin, Heidelberg: Springer Berlin Heidelberg, 2011. http://dx.doi.org/10.1007/978-3-642-21735-7_2.
Ershov, M. A., and A. S. Voroshilov. "UCB Strategy for Gaussian and Bernoulli Multi-armed Bandits". In Communications in Computer and Information Science, 67–78. Cham: Springer Nature Switzerland, 2023. http://dx.doi.org/10.1007/978-3-031-43257-6_6.
Li, Ziqiang, Xun Cai, and Ti Liang. "Gaussian-Bernoulli Based Convolutional Restricted Boltzmann Machine for Images Feature Extraction". In Neural Information Processing, 593–602. Cham: Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-46672-9_66.
Veziroğlu, Merve, Erkan Eziroğlu, and İhsan Ömür Bucak. "Performance Comparison Between Naive Bayes and Machine Learning Algorithms for News Classification". In Bayesian Inference - Recent Trends. IntechOpen, 2024. http://dx.doi.org/10.5772/intechopen.1002778.
Sivia, D. S. "Assigning probabilities". In Data Analysis, 103–26. Oxford: Oxford University Press, 2006. http://dx.doi.org/10.1093/oso/9780198568315.003.0005.
Conference papers on the topic "Bernoulli-Gaussian"
Cho, Kyung Hyun, Tapani Raiko, and Alexander Ilin. "Gaussian-Bernoulli deep Boltzmann machine". In 2013 International Joint Conference on Neural Networks (IJCNN 2013 - Dallas). IEEE, 2013. http://dx.doi.org/10.1109/ijcnn.2013.6706831.
Jones, George, Ángel F. García-Fernández, and Prudence W. H. Wong. "GOSPA-Driven Gaussian Bernoulli Sensor Management". In 2023 26th International Conference on Information Fusion (FUSION). IEEE, 2023. http://dx.doi.org/10.23919/fusion52260.2023.10224220.
Liu, Lei, Chongwen Huang, Yuhao Chi, Chau Yuen, Yong Liang Guan, and Ying Li. "Sparse Vector Recovery: Bernoulli-Gaussian Message Passing". In 2017 IEEE Global Communications Conference (GLOBECOM 2017). IEEE, 2017. http://dx.doi.org/10.1109/glocom.2017.8254836.
Vila, Jeremy, and Philip Schniter. "Expectation-maximization Bernoulli-Gaussian approximate message passing". In 2011 45th Asilomar Conference on Signals, Systems and Computers. IEEE, 2011. http://dx.doi.org/10.1109/acssc.2011.6190117.
Bazot, Cecile, Nicolas Dobigeon, Jean-Yves Tourneret, and Alfred O. Hero. "A Bernoulli-Gaussian model for gene factor analysis". In 2011 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). IEEE, 2011. http://dx.doi.org/10.1109/icassp.2011.5947728.
Bassi, Francesca, Michel Kieffer, and Cagatay Dikici. "Multiterminal source coding of Bernoulli-Gaussian correlated sources". In ICASSP 2009 - 2009 IEEE International Conference on Acoustics, Speech and Signal Processing. IEEE, 2009. http://dx.doi.org/10.1109/icassp.2009.4960130.
Yin, Jianjun, Jianqiu Zhang, and Jin Zhao. "The Gaussian Particle multi-target multi-Bernoulli filter". In 2010 2nd International Conference on Advanced Computer Control. IEEE, 2010. http://dx.doi.org/10.1109/icacc.2010.5486859.
García-Fernández, Ángel F., Yuxuan Xia, Karl Granström, Lennart Svensson, and Jason L. Williams. "Gaussian implementation of the multi-Bernoulli mixture filter". In 2019 22th International Conference on Information Fusion (FUSION). IEEE, 2019. http://dx.doi.org/10.23919/fusion43075.2019.9011346.
Chaari, Lotfi, Jean-Yves Tourneret, and Caroline Chaux. "Sparse signal recovery using a Bernoulli generalized Gaussian prior". In 2015 23rd European Signal Processing Conference (EUSIPCO). IEEE, 2015. http://dx.doi.org/10.1109/eusipco.2015.7362676.
Yildirim, Sinan, A. Taylan Cemgil, and Aysin B. Ertuzun. "A hybrid method for deconvolution of Bernoulli-Gaussian processes". In ICASSP 2009 - 2009 IEEE International Conference on Acoustics, Speech and Signal Processing. IEEE, 2009. http://dx.doi.org/10.1109/icassp.2009.4960359.