To see the other types of publications on this topic, follow the link: Machines de Boltzmann restreintes.

Journal articles on the topic 'Machines de Boltzmann restreintes'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the top 50 journal articles for your research on the topic 'Machines de Boltzmann restreintes' (restricted Boltzmann machines).

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Browse journal articles across a wide variety of disciplines and organise your bibliography correctly.

1

Apolloni, B., A. Bertoni, P. Campadelli, and D. de Falco. "Asymmetric Boltzmann machines." Biological Cybernetics 66, no. 1 (November 1991): 61–70. http://dx.doi.org/10.1007/bf00196453.

2

Lu, Wenhao, Chi-Sing Leung, and John Sum. "Analysis on Noisy Boltzmann Machines and Noisy Restricted Boltzmann Machines." IEEE Access 9 (2021): 112955–65. http://dx.doi.org/10.1109/access.2021.3102275.

3

Livesey, M. "Clamping in Boltzmann machines." IEEE Transactions on Neural Networks 2, no. 1 (1991): 143–48. http://dx.doi.org/10.1109/72.80301.

4

Fischer, Asja. "Training Restricted Boltzmann Machines." KI - Künstliche Intelligenz 29, no. 4 (May 12, 2015): 441–44. http://dx.doi.org/10.1007/s13218-015-0371-2.

5

Bojnordi, Mahdi Nazm, and Engin Ipek. "The Memristive Boltzmann Machines." IEEE Micro 37, no. 3 (2017): 22–29. http://dx.doi.org/10.1109/mm.2017.53.

6

Liu, Jeremy, Ke-Thia Yao, and Federico Spedalieri. "Dynamic Topology Reconfiguration of Boltzmann Machines on Quantum Annealers." Entropy 22, no. 11 (October 24, 2020): 1202. http://dx.doi.org/10.3390/e22111202.

Abstract:
Boltzmann machines have useful roles in deep learning applications, such as generative data modeling, initializing weights for other types of networks, or extracting efficient representations from high-dimensional data. Most Boltzmann machines use restricted topologies that exclude looping connectivity, as such connectivity creates complex distributions that are difficult to sample. We have used an open-system quantum annealer to sample from complex distributions and implement Boltzmann machines with looping connectivity. Further, we have created policies mapping Boltzmann machine variables to the quantum bits of an annealer. These policies, based on correlation and entropy metrics, dynamically reconfigure the topology of Boltzmann machines during training and improve performance.
7

Luo, Heng, Ruimin Shen, Changyong Niu, and Carsten Ullrich. "Sparse Group Restricted Boltzmann Machines." Proceedings of the AAAI Conference on Artificial Intelligence 25, no. 1 (August 4, 2011): 429–34. http://dx.doi.org/10.1609/aaai.v25i1.7923.

Abstract:
Since learning in Boltzmann machines is typically quite slow, there is a need to restrict connections within hidden layers. However, the resulting states of hidden units exhibit statistical dependencies. Based on this observation, we propose using l1/l2 regularization upon the activation probabilities of hidden units in restricted Boltzmann machines to capture the local dependencies among hidden units. This regularization not only encourages hidden units of many groups to be inactive given observed data but also makes hidden units within a group compete with each other for modeling observed data. Thus, the l1/l2 regularization on RBMs yields sparsity at both the group and the hidden-unit levels. We call RBMs trained with the regularizer sparse group RBMs (SGRBMs). The proposed SGRBMs are applied to model patches of natural images, handwritten digits and OCR English letters. Then, to emphasize that SGRBMs can learn more discriminative features, we applied SGRBMs to pretrain deep networks for classification tasks. Furthermore, we illustrate that the regularizer can also be applied to deep Boltzmann machines, which leads to sparse group deep Boltzmann machines. When adapted to the MNIST data set, a two-layer sparse group Boltzmann machine achieves an error rate of 0.84%, which is, to our knowledge, the best published result on the permutation-invariant version of the MNIST task.
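For readers who want a concrete picture of the penalty described in this abstract, here is a minimal sketch (not the authors' code) of an l1/l2 group regularizer applied to the hidden-unit activation probabilities of a binary RBM; the group partition, weight shapes, and the reg_strength parameter are illustrative assumptions.

    import numpy as np

    def hidden_activation_probs(v, W, b_h):
        # P(h_j = 1 | v) for a binary-binary RBM: sigmoid of the hidden pre-activation.
        return 1.0 / (1.0 + np.exp(-(v @ W + b_h)))

    def group_l1_l2_penalty(probs, groups, reg_strength=0.1):
        # l1 norm across groups of the l2 norm within each group: the mixed norm
        # that yields sparsity at both the group and the hidden-unit level.
        return reg_strength * sum(np.sqrt(np.sum(probs[g] ** 2)) for g in groups)

    # Illustrative usage: 12 hidden units partitioned into 3 groups of 4.
    rng = np.random.default_rng(0)
    v = rng.integers(0, 2, size=8).astype(float)
    W, b_h = rng.normal(scale=0.1, size=(8, 12)), np.zeros(12)
    groups = [np.arange(0, 4), np.arange(4, 8), np.arange(8, 12)]
    penalty = group_l1_l2_penalty(hidden_activation_probs(v, W, b_h), groups)

In training, such a penalty would be added to the RBM's usual objective so that its gradient discourages whole groups from activating unless they are needed to model the data.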
8

Decelle, Aurélien, and Cyril Furtlehner. "Gaussian-spherical restricted Boltzmann machines." Journal of Physics A: Mathematical and Theoretical 53, no. 18 (April 16, 2020): 184002. http://dx.doi.org/10.1088/1751-8121/ab79f3.

9

Apolloni, B., and D. de Falco. "Learning by parallel Boltzmann machines." IEEE Transactions on Information Theory 37, no. 4 (July 1991): 1162–65. http://dx.doi.org/10.1109/18.87009.

10

d'Anjou, A., M. Grana, F. J. Torrealdea, and M. C. Hernandez. "Solving satisfiability via Boltzmann machines." IEEE Transactions on Pattern Analysis and Machine Intelligence 15, no. 5 (May 1993): 514–21. http://dx.doi.org/10.1109/34.211473.

11

Ticknor, Anthony J., and Harrison H. Barrett. "Optical Implementations In Boltzmann Machines." Optical Engineering 26, no. 1 (January 1, 1987): 260116. http://dx.doi.org/10.1117/12.7974015.

12

Amari, S., K. Kurata, and H. Nagaoka. "Information geometry of Boltzmann machines." IEEE Transactions on Neural Networks 3, no. 2 (March 1992): 260–71. http://dx.doi.org/10.1109/72.125867.

13

Welling, Max, and Yee Whye Teh. "Approximate inference in Boltzmann machines." Artificial Intelligence 143, no. 1 (January 2003): 19–50. http://dx.doi.org/10.1016/s0004-3702(02)00361-2.

14

Prager, R. W., T. D. Harrison, and F. Fallside. "Boltzmann machines for speech recognition." Computer Speech & Language 1, no. 1 (March 1986): 3–27. http://dx.doi.org/10.1016/s0885-2308(86)80008-0.

15

Balzer, Wolfgang, Masanobu Takahashi, Jun Ohta, and Kazuo Kyuma. "Weight quantization in Boltzmann machines." Neural Networks 4, no. 3 (January 1991): 405–9. http://dx.doi.org/10.1016/0893-6080(91)90077-i.

16

Passos, Leandro Aparecido, and João Paulo Papa. "Temperature-Based Deep Boltzmann Machines." Neural Processing Letters 48, no. 1 (September 8, 2017): 95–107. http://dx.doi.org/10.1007/s11063-017-9707-2.

17

Kobayashi, M. "Boltzmann Machines with Identified States." IEICE Transactions on Fundamentals of Electronics, Communications and Computer Sciences E91-A, no. 3 (March 1, 2008): 887–90. http://dx.doi.org/10.1093/ietfec/e91-a.3.887.

18

Miasnikof, Pierre, Mohammad Bagherbeik, and Ali Sheikholeslami. "Graph clustering with Boltzmann machines." Discrete Applied Mathematics 343 (January 2024): 208–23. http://dx.doi.org/10.1016/j.dam.2023.10.012.

19

Liu, Kai, Li Min Zhang, and Yong Wei Sun. "Deep Boltzmann Machines Aided Design Based on Genetic Algorithms." Applied Mechanics and Materials 568-570 (June 2014): 848–51. http://dx.doi.org/10.4028/www.scientific.net/amm.568-570.848.

Abstract:
To address the lack of guidance on how to set the values of numerical meta-parameters and the difficulty of optimizing Deep Boltzmann Machines, genetic algorithms are used to develop an automatic optimization method named GA-RBMs (Genetic Algorithm-Restricted Boltzmann Machines) for this model's aided design. Based on the Restricted Boltzmann Machines' features and an evaluation function, a genetic algorithm is designed that realizes a global search for a satisfactory structure. We also initialize the network's weights to determine the number of visible units and hidden units. The experiments were conducted on the MNIST handwritten digit dataset. The results show that this optimization reduced the dimension of the visible units and improved the performance of the features extracted by Deep Boltzmann Machines. The optimized network has good generalization performance and meets the demands of Deep Boltzmann Machines' aided design.
20

Saul, Lawrence, and Michael I. Jordan. "Learning in Boltzmann Trees." Neural Computation 6, no. 6 (November 1994): 1174–84. http://dx.doi.org/10.1162/neco.1994.6.6.1174.

Abstract:
We introduce a large family of Boltzmann machines that can be trained by standard gradient descent. The networks can have one or more layers of hidden units, with tree-like connectivity. We show how to implement the supervised learning algorithm for these Boltzmann machines exactly, without resort to simulated or mean-field annealing. The stochastic averages that yield the gradients in weight space are computed by the technique of decimation. We present results on the problems of N-bit parity and the detection of hidden symmetries.
21

Yasuda, Muneki, and Kazuyuki Tanaka. "Approximate Learning Algorithm in Boltzmann Machines." Neural Computation 21, no. 11 (November 2009): 3130–78. http://dx.doi.org/10.1162/neco.2009.08-08-844.

Abstract:
Boltzmann machines can be regarded as Markov random fields. For binary cases, they are equivalent to the Ising spin model in statistical mechanics. Learning in Boltzmann machines is, in general, NP-hard, so we have to use approximate methods to construct practical learning algorithms in this context. In this letter, we propose new and practical learning algorithms for Boltzmann machines by using the belief propagation algorithm and the linear response approximation, which are often referred to as advanced mean-field methods. Finally, we show the validity of our algorithm using numerical experiments.
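As context for the approximations discussed in this abstract, the quantity they target is the standard maximum-likelihood gradient of a Boltzmann machine (notation ours, not the letter's):

    \Delta w_{ij} \propto \langle s_i s_j \rangle_{\mathrm{data}} - \langle s_i s_j \rangle_{\mathrm{model}}

The model expectation requires a sum over exponentially many states, which is what makes exact learning intractable and why belief propagation and the linear response approximation are used to estimate it.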
22

Crawford, Daniel, Anna Levit, Navid Ghadermarzy, Jaspreet S. Oberoi, and Pooya Ronagh. "Reinforcement learning using quantum Boltzmann machines." Quantum Information and Computation 18, no. 1&2 (February 2018): 51–74. http://dx.doi.org/10.26421/qic18.1-2-3.

Abstract:
We investigate whether quantum annealers with select chip layouts can outperform classical computers in reinforcement learning tasks. We associate a transverse-field Ising spin Hamiltonian with a layout of qubits similar to that of a deep Boltzmann machine (DBM) and use simulated quantum annealing (SQA) to numerically simulate quantum sampling from this system. We design a reinforcement learning algorithm in which the sets of visible nodes representing the states and actions of an optimal policy are the first and last layers of the deep network. In the absence of a transverse field, our simulations show that DBMs are trained more effectively than restricted Boltzmann machines (RBMs) with the same number of nodes. We then develop a framework for training the network as a quantum Boltzmann machine (QBM) in the presence of a significant transverse field for reinforcement learning. This method also outperforms the reinforcement learning method that uses RBMs.
23

Kobayashi, Masaki. "Information geometry of rotor Boltzmann machines." Nonlinear Theory and Its Applications, IEICE 7, no. 2 (2016): 266–82. http://dx.doi.org/10.1587/nolta.7.266.

24

Yasuda, Muneki, and Tomoyuki Obuchi. "Empirical Bayes method for Boltzmann machines." Journal of Physics A: Mathematical and Theoretical 53, no. 1 (December 10, 2019): 014004. http://dx.doi.org/10.1088/1751-8121/ab57a7.

25

Genovese, Giuseppe, and Daniele Tantari. "Legendre equivalences of spherical Boltzmann machines." Journal of Physics A: Mathematical and Theoretical 53, no. 9 (February 4, 2020): 094001. http://dx.doi.org/10.1088/1751-8121/ab6b92.

26

Melko, Roger G., Giuseppe Carleo, Juan Carrasquilla, and J. Ignacio Cirac. "Restricted Boltzmann machines in quantum physics." Nature Physics 15, no. 9 (June 24, 2019): 887–92. http://dx.doi.org/10.1038/s41567-019-0545-1.

27

Ackley, David H., Geoffrey E. Hinton, and Terrence J. Sejnowski. "A Learning Algorithm for Boltzmann Machines." Cognitive Science 9, no. 1 (January 1985): 147–69. http://dx.doi.org/10.1207/s15516709cog0901_7.

28

Apolloni, Bruno, and Diego de Falco. "Learning by Asymmetric Parallel Boltzmann Machines." Neural Computation 3, no. 3 (September 1991): 402–8. http://dx.doi.org/10.1162/neco.1991.3.3.402.

Abstract:
We consider the Little, Shaw, Vasudevan model as a parallel asymmetric Boltzmann machine, in the sense that we extend to this model the entropic learning rule first studied by Ackley, Hinton, and Sejnowski in the case of a sequentially activated network with symmetric synaptic matrix. The resulting Hebbian learning rule for the parallel asymmetric model draws the signal for the updating of synaptic weights from time averages of the discrepancy between expected and actual transitions along the past history of the network. As we work without the hypothesis of symmetry of the weights, we can include in our analysis also feedforward networks, for which the entropic learning rule turns out to be complementary to the error backpropagation rule, in that it “rewards the correct behavior” instead of “penalizing the wrong answers.”
29

Zhang, Jian, Shifei Ding, Nan Zhang, and Weikuan Jia. "Adversarial Training Methods for Boltzmann Machines." IEEE Access 8 (2020): 4594–604. http://dx.doi.org/10.1109/access.2019.2962758.

30

Fischer, Asja, and Christian Igel. "Training restricted Boltzmann machines: An introduction." Pattern Recognition 47, no. 1 (January 2014): 25–39. http://dx.doi.org/10.1016/j.patcog.2013.05.025.

31

Zhang, Nan, Shifei Ding, Jian Zhang, and Yu Xue. "An overview on Restricted Boltzmann Machines." Neurocomputing 275 (January 2018): 1186–99. http://dx.doi.org/10.1016/j.neucom.2017.09.065.

32

Schulz, Hannes, Andreas Müller, and Sven Behnke. "Exploiting local structure in Boltzmann machines." Neurocomputing 74, no. 9 (April 2011): 1411–17. http://dx.doi.org/10.1016/j.neucom.2010.12.014.

33

Kappen, Hilbert J. "Deterministic learning rules for Boltzmann machines." Neural Networks 8, no. 4 (January 1995): 537–48. http://dx.doi.org/10.1016/0893-6080(94)00112-y.

34

Zwietering, Patrick, and Emile Aarts. "Parallel Boltzmann machines: A mathematical model." Journal of Parallel and Distributed Computing 13, no. 1 (September 1991): 65–75. http://dx.doi.org/10.1016/0743-7315(91)90110-u.

35

Aarts, Emile H. L., and Jan H. M. Korst. "Boltzmann machines for travelling salesman problems." European Journal of Operational Research 39, no. 1 (March 1989): 79–95. http://dx.doi.org/10.1016/0377-2217(89)90355-x.

36

Upadhya, Vidyadhar, and P. S. Sastry. "An Overview of Restricted Boltzmann Machines." Journal of the Indian Institute of Science 99, no. 2 (February 18, 2019): 225–36. http://dx.doi.org/10.1007/s41745-019-0102-z.

37

Cheng, Song, Jing Chen, and Lei Wang. "Information Perspective to Probabilistic Modeling: Boltzmann Machines versus Born Machines." Entropy 20, no. 8 (August 7, 2018): 583. http://dx.doi.org/10.3390/e20080583.

Abstract:
We compare and contrast the statistical physics and quantum physics inspired approaches for unsupervised generative modeling of classical data. The two approaches represent probabilities of observed data using energy-based models and quantum states, respectively. Classical and quantum information patterns of the target datasets therefore provide principled guidelines for structural design and learning in these two approaches. Taking Restricted Boltzmann Machines (RBMs) as an example, we analyze the information theoretical bounds of the two approaches. We also estimate the classical mutual information of the standard MNIST dataset and the quantum Rényi entropy of corresponding Matrix Product States (MPS) representations. Both information measures are much smaller than their theoretical upper bounds and exhibit similar patterns, which imply a common inductive bias of low information complexity. By comparing the performance of RBMs with various architectures on the standard MNIST dataset, we find that RBMs with local sparse connections exhibit high learning efficiency, which supports the application of tensor network states in machine learning problems.
38

Teng, Da, Zhang Li, Guanghong Gong, and Liang Han. "Boltzmann machines with clusters of stochastic binary units." International Journal of Modeling, Simulation, and Scientific Computing 07, no. 02 (June 2016): 1650018. http://dx.doi.org/10.1142/s1793962316500185.

Abstract:
The original restricted Boltzmann machines (RBMs) are extended by replacing the binary visible and hidden variables with clusters of binary units, and a new learning algorithm for training deep Boltzmann machines of this new variant is proposed. The sum of the binary units of each cluster is approximated by a Gaussian distribution. Experiments demonstrate that the proposed Boltzmann machines can achieve good performance on the MNIST handwritten digit recognition task.
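The Gaussian approximation mentioned in this abstract presumably follows the usual central-limit reasoning; as a minimal illustration under the simplifying assumption of n independent units in a cluster, each active with probability p,

    \sum_{k=1}^{n} h_k \approx \mathcal{N}\big(np,\; np(1-p)\big), \qquad h_k \sim \mathrm{Bernoulli}(p),

so a whole cluster can be summarized by a single continuous variable during learning.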
39

Salakhutdinov, Ruslan, and Geoffrey Hinton. "An Efficient Learning Procedure for Deep Boltzmann Machines." Neural Computation 24, no. 8 (August 2012): 1967–2006. http://dx.doi.org/10.1162/neco_a_00311.

Abstract:
We present a new learning algorithm for Boltzmann machines that contain many layers of hidden variables. Data-dependent statistics are estimated using a variational approximation that tends to focus on a single mode, and data-independent statistics are estimated using persistent Markov chains. The use of two quite different techniques for estimating the two types of statistic that enter into the gradient of the log likelihood makes it practical to learn Boltzmann machines with multiple hidden layers and millions of parameters. The learning can be made more efficient by using a layer-by-layer pretraining phase that initializes the weights sensibly. The pretraining also allows the variational inference to be initialized sensibly with a single bottom-up pass. We present results on the MNIST and NORB data sets showing that deep Boltzmann machines learn very good generative models of handwritten digits and 3D objects. We also show that the features discovered by deep Boltzmann machines are a very effective way to initialize the hidden layers of feedforward neural nets, which are then discriminatively fine-tuned.
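To make the "persistent Markov chains" part of this abstract concrete, here is a minimal persistent-chain sketch for the simpler restricted (single-hidden-layer, binary) case; it is not the authors' DBM implementation, and the names, shapes, and learning rate are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(1)
    sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))

    def pcd_gradient(v_data, W, persistent_v, k=1):
        # Data-dependent statistics come from the training batch; data-independent
        # statistics come from Markov chains that persist across parameter updates.
        h_data = sigmoid(v_data @ W)                      # E[h | v] on the data
        v_model = persistent_v
        for _ in range(k):                                # advance the persistent chains
            h_model = (sigmoid(v_model @ W) > rng.random((v_model.shape[0], W.shape[1]))).astype(float)
            v_model = (sigmoid(h_model @ W.T) > rng.random(v_model.shape)).astype(float)
        h_model_prob = sigmoid(v_model @ W)
        grad_W = v_data.T @ h_data / len(v_data) - v_model.T @ h_model_prob / len(v_model)
        return grad_W, v_model                            # return the updated chain state

    # Illustrative usage with random binary data (biases omitted for brevity).
    W = rng.normal(scale=0.01, size=(6, 4))
    batch = rng.integers(0, 2, size=(10, 6)).astype(float)
    chains = rng.integers(0, 2, size=(10, 6)).astype(float)
    grad, chains = pcd_gradient(batch, W, chains)
    W += 0.05 * grad

In the paper's deep setting, the data-dependent term is additionally replaced by a variational (mean-field) approximation rather than the exact conditional used here.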
40

Suykens, Johan A. K. "Deep Restricted Kernel Machines Using Conjugate Feature Duality." Neural Computation 29, no. 8 (August 2017): 2123–63. http://dx.doi.org/10.1162/neco_a_00984.

Abstract:
The aim of this letter is to propose a theory of deep restricted kernel machines offering new foundations for deep learning with kernel machines. From the viewpoint of deep learning, it is partially related to restricted Boltzmann machines, which are characterized by visible and hidden units in a bipartite graph without hidden-to-hidden connections and deep learning extensions as deep belief networks and deep Boltzmann machines. From the viewpoint of kernel machines, it includes least squares support vector machines for classification and regression, kernel principal component analysis (PCA), matrix singular value decomposition, and Parzen-type models. A key element is to first characterize these kernel machines in terms of so-called conjugate feature duality, yielding a representation with visible and hidden units. It is shown how this is related to the energy form in restricted Boltzmann machines, with continuous variables in a nonprobabilistic setting. In this new framework of so-called restricted kernel machine (RKM) representations, the dual variables correspond to hidden features. Deep RKM are obtained by coupling the RKMs. The method is illustrated for deep RKM, consisting of three levels with a least squares support vector machine regression level and two kernel PCA levels. In its primal form also deep feedforward neural networks can be trained within this framework.
41

Kobayashi, Masaki. "Information geometry of hyperbolic-valued Boltzmann machines." Neurocomputing 431 (March 2021): 163–68. http://dx.doi.org/10.1016/j.neucom.2020.12.048.

42

Yasuda, Muneki, and Kazuyuki Tanaka. "Boltzmann Machines with Bounded Continuous Random Variables." Interdisciplinary Information Sciences 13, no. 1 (2007): 25–31. http://dx.doi.org/10.4036/iis.2007.25.

43

Mohan, Ankith, Aiichiro Nakano, and Emilio Ferrara. "Graph signal recovery using restricted Boltzmann machines." Expert Systems with Applications 185 (December 2021): 115635. http://dx.doi.org/10.1016/j.eswa.2021.115635.

44

Giuffrida, Mario Valerio, and Sotirios A. Tsaftaris. "Unsupervised Rotation Factorization in Restricted Boltzmann Machines." IEEE Transactions on Image Processing 29 (2020): 2166–75. http://dx.doi.org/10.1109/tip.2019.2946455.

45

Apolloni, Bruno, Egidio Battistini, and Diego de Falco. "Higher-order Boltzmann machines and entropy bounds." Journal of Physics A: Mathematical and General 32, no. 30 (July 20, 1999): 5529–38. http://dx.doi.org/10.1088/0305-4470/32/30/301.

46

Li, Qing, Yang Chen, and Yongjune Kim. "Compression by and for Deep Boltzmann Machines." IEEE Transactions on Communications 68, no. 12 (December 2020): 7498–510. http://dx.doi.org/10.1109/tcomm.2020.3020796.

47

Bounds, D. G. "A statistical mechanical study of Boltzmann machines." Journal of Physics A: Mathematical and General 20, no. 8 (June 1, 1987): 2133–45. http://dx.doi.org/10.1088/0305-4470/20/8/027.

48

Azencott, Robert, Antoine Doutriaux, and Laurent Younes. "Synchronous Boltzmann machines and curve identification tasks." Network: Computation in Neural Systems 4, no. 4 (November 1, 1993): 461–80. http://dx.doi.org/10.1088/0954-898x/4/4/004.

49

Azencott, Robert, Antoine Doutriaux, and Laurent Younes. "Synchronous Boltzmann machines and curve identification tasks." Network: Computation in Neural Systems 4, no. 4 (January 1993): 461–80. http://dx.doi.org/10.1088/0954-898x_4_4_004.

50

Neal, Radford M. "Asymmetric Parallel Boltzmann Machines are Belief Networks." Neural Computation 4, no. 6 (November 1992): 832–34. http://dx.doi.org/10.1162/neco.1992.4.6.832.
