Boltzmann machine


A Boltzmann machine is a stochastic artificial neural network developed by Geoffrey Hinton and Terrence J. Sejnowski in 1985.[1] These networks are named after the Boltzmann distribution. Boltzmann machines without restrictions on their connections are very difficult to train. If the connections between the neurons are restricted, however, the learning process becomes much simpler, so that restricted Boltzmann machines can be used to solve practical problems.


Structure


A Boltzmann machine, like a Hopfield network, is a network of neurons for which an energy \(E\) is defined. As in Hopfield networks, the neurons only take binary values (0 or 1), but in contrast to Hopfield networks they behave stochastically. The energy \(E\) of a Boltzmann machine is defined as in a Hopfield network:

\(E = -\left(\sum_{i<j} w_{ij}\, s_i\, s_j + \sum_i \theta_i\, s_i\right)\)

where:

  • \(w_{ij}\) is the weight of the connection between neuron \(i\) and neuron \(j\).
  • \(s_i\) is the state \(s_i \in \{0,1\}\) of neuron \(i\).
  • \(\theta_i\) is the threshold of neuron \(i\). (\(-\theta_i\) is the value from which a neuron is activated.)

The connections of a Boltzmann machine have two limitations:

  • \(w_{ii} = 0 \qquad \forall i\). (No neuron has a connection to itself.)
  • \(w_{ij} = w_{ji} \qquad \forall i, j\). (All connections are symmetric.)

The weights can be represented as a symmetric matrix \(W\) whose main diagonal consists of zeros.
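As an illustrative sketch (not part of the original article; NumPy and the names used here, such as boltzmann_energy, are choices made for illustration), the energy of a given binary state vector can be evaluated directly from this definition:

    import numpy as np

    def boltzmann_energy(W, theta, s):
        """Energy E = -(sum_{i<j} w_ij s_i s_j + sum_i theta_i s_i)."""
        # Because W is symmetric with a zero diagonal, the sum over i < j
        # equals one half of s^T W s.
        return -(0.5 * s @ W @ s + theta @ s)

    # Small example with three neurons.
    W = np.array([[ 0.0, 1.0, -2.0],
                  [ 1.0, 0.0,  0.5],
                  [-2.0, 0.5,  0.0]])   # symmetric, zero diagonal
    theta = np.array([0.1, -0.3, 0.2])  # thresholds
    s = np.array([1, 0, 1])             # binary states
    print(boltzmann_energy(W, theta, s))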

Just as with the Hopfield network, successive updates of a Boltzmann machine tend to reduce the energy defined in this way, so that the network ultimately settles into a low-energy, stable state.
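These stochastic updates are commonly realized by Gibbs sampling, where each neuron is switched on with a probability given by the logistic function of its input. The following sketch assumes the usual formulation with a temperature parameter T, which the article does not introduce explicitly; it is an illustration, not a prescribed procedure:

    import numpy as np

    def gibbs_sweep(W, theta, s, rng, T=1.0):
        """Resample every neuron once; returns the new state vector."""
        s = s.copy()
        for i in rng.permutation(len(s)):
            # Switching neuron i on changes the energy by -(W[i] @ s + theta[i]);
            # the Boltzmann rule turns it on with the corresponding probability.
            activation = W[i] @ s + theta[i]   # w_ii = 0, so s_i itself does not contribute
            p_on = 1.0 / (1.0 + np.exp(-activation / T))
            s[i] = 1 if rng.random() < p_on else 0
        return s

    rng = np.random.default_rng(0)
    W = np.array([[ 0.0, 1.0, -2.0],
                  [ 1.0, 0.0,  0.5],
                  [-2.0, 0.5,  0.0]])
    theta = np.array([0.1, -0.3, 0.2])
    s = rng.integers(0, 2, size=3)
    for _ in range(100):                       # repeated sweeps drive the state toward low-energy configurations
        s = gibbs_sweep(W, theta, s, rng)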

Restricted Boltzmann machine


A so-called Restricted Boltzmann Machine (RBM) consists of visible units and hidden units. The feature vector is applied to the visible units.

The "restricted" comes from the fact that the visible units are not connected to each other and the hidden units are not connected to each other. However, the visible units are fully connected to the hidden units. So they form a bipartite, undirected graph. This is illustrated below:

The parameters to be learned are the weights of the edges between visible and hidden units and the bias vectors \(b_h, b_v\) of the hidden and visible units. These are learned using the contrastive divergence algorithm.[2]
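A minimal sketch of a single CD-1 update for a binary RBM follows; it is based on the usual formulation in Hinton's practical guide[2] rather than on the article itself, and the learning rate, the single Gibbs step, and names such as cd1_step are assumptions made here:

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def cd1_step(v0, W, b_v, b_h, rng, lr=0.1):
        """One contrastive-divergence (CD-1) update for a binary RBM.

        v0 : data batch, shape (batch, n_visible)
        W  : weights, shape (n_visible, n_hidden); b_v, b_h : bias vectors
        """
        # Positive phase: hidden probabilities and samples given the data.
        p_h0 = sigmoid(v0 @ W + b_h)
        h0 = (rng.random(p_h0.shape) < p_h0).astype(float)

        # Negative phase: one Gibbs step back to the visible units and up again.
        p_v1 = sigmoid(h0 @ W.T + b_v)
        v1 = (rng.random(p_v1.shape) < p_v1).astype(float)
        p_h1 = sigmoid(v1 @ W + b_h)

        # Approximate gradient: correlations under the data minus under the model.
        batch = v0.shape[0]
        W += lr * (v0.T @ p_h0 - v1.T @ p_h1) / batch
        b_v += lr * (v0 - v1).mean(axis=0)
        b_h += lr * (p_h0 - p_h1).mean(axis=0)
        return W, b_v, b_h

    # Toy usage on random binary data.
    rng = np.random.default_rng(0)
    n_v, n_h = 6, 3
    W = 0.01 * rng.standard_normal((n_v, n_h))
    b_v, b_h = np.zeros(n_v), np.zeros(n_h)
    data = rng.integers(0, 2, size=(32, n_v)).astype(float)
    for _ in range(10):
        W, b_v, b_h = cd1_step(data, W, b_v, b_h, rng)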

Restricted Boltzmann Machines were used for collaborative filtering on Netflix.[3]

References


  1. ↑ David H. Ackley, Geoffrey E. Hinton, Terrence J. Sejnowski: A Learning Algorithm for Boltzmann Machines. In: Cognitive Science, Volume 9, Issue 1, January 1985, pp. 147-169, doi:10.1207/s15516709cog0901_7. Accessed on February 13, 2021.
  2. ↑ Geoffrey Hinton: A Practical Guide to Training Restricted Boltzmann Machines. 2010.
  3. ↑ Ruslan Salakhutdinov, Andriy Mnih, Geoffrey Hinton: Restricted Boltzmann Machines for Collaborative Filtering. In: Proceedings of the 24th International Conference on Machine Learning. 2007, pp. 791-798.









Categories: Neuroinformatics | Computational Neuroscience | Machine learning | Ludwig Boltzmann



