Hello and welcome. In this video, we'll be providing an introduction to restricted Boltzmann machines. So let's get started.

Imagine that we have access to a matrix of viewer ratings for a certain number of movies on Netflix, where each row corresponds to a movie and each column corresponds to a user. For the sake of simplicity, let's say we're only examining eight users and their ratings of seven movies. The value in each cell shows the score the user gave the movie after watching it, on a rating scale of one to five. Also, imagine that we have a type of neural network with only two layers: the input layer and the hidden layer. Let's also assume that this network has learned in such a way that it can reconstruct input vectors. For example, when you feed the first user's vector into the network, it goes through the network and finally fires up some units in the hidden layer. Then the values of the hidden layer are fed back into the network, and a vector that is almost the same as the input vector is reconstructed as output. We can think of this as the network making guesses about the input data.

Next, you feed in the second user's ratings, which are not very different from the first user's, so the same hidden units are turned on and the network's output is almost the same as the first reconstructed vector. We can repeat this for the third user and for user number 4. Now, let's feed in a user who has a completely different opinion of these movies. As you can see, this particular user was not a fan of the first movie. When we feed the respective rating values into the network, different hidden units get turned on, and a different vector is reconstructed, one that is almost the same as user number 5's preferences. It is the same for user number 6, and the process can be repeated for the other users. Now, let's look at user number 8. He hasn't watched movie 6, but he does have preferences that are almost the same as those of users 5 and 6, right?
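To make the reconstruction mechanism concrete, here is a minimal sketch in Python. Everything in it is invented for illustration: the layer sizes, the example rating vector, and especially the weights, which are random here but would be learned by a trained network.

```python
import numpy as np

# A minimal sketch of the two-layer "reconstruction" idea described above.
# The layer sizes, ratings, and weights are all made up for illustration;
# the weights are random here, whereas a trained network would have learned them.
rng = np.random.default_rng(0)

n_movies, n_hidden = 7, 3
W = rng.normal(0.0, 0.1, size=(n_movies, n_hidden))  # visible-to-hidden weights

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def reconstruct(user_ratings):
    """Feed a rating vector forward to the hidden layer, then feed the
    hidden activations back to rebuild (guess) the input vector."""
    v = np.asarray(user_ratings) / 5.0   # scale 1-to-5 ratings into [0, 1]
    h = sigmoid(v @ W)                   # which hidden units "fire"
    v_guess = sigmoid(h @ W.T)           # the network's guess at the input
    return h, v_guess * 5.0              # back on the 1-to-5 scale

# A made-up rating vector standing in for user 1's seven movie scores.
hidden, guess = reconstruct([5, 4, 1, 1, 5, 4, 2])
print(hidden.shape, guess.shape)  # (3,) (7,)
```

With random weights the reconstruction is meaningless; the point is only the shape of the computation: input vector, to hidden layer, back to a reconstructed input.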
Let's feed this vector into our network. It fires up the same hidden units as for users 5 and 6, and feeding back these values reconstructs a new vector. Look at the expected value for movie 6 in the reconstructed vector. Given this value, it is not hard to imagine that user number 8 might be interested in this movie even though he hasn't watched it yet. Maybe we could even recommend it to him. In fact, this is one way of solving collaborative filtering, a type of recommender system, and a network that can build such a model is called a restricted Boltzmann machine.

Restricted Boltzmann machines, or RBMs for short, are shallow neural networks that have only two layers. They are an unsupervised method used to find patterns in data by reconstructing the input. The first layer of the RBM is called the visible layer and the second layer is the hidden layer. We say they are restricted because neurons within the same layer are not connected to each other. By feeding in the input data, the network learns its weights. Then, when an input vector is fed in, the values that appear in the hidden layer can be considered features learned automatically from the input data. Since an RBM's hidden layer has fewer units than its visible layer, the values in the hidden units form a good representation of the data that is lower in dimensionality than the original data.

Restricted Boltzmann machines are useful in many applications, such as dimensionality reduction, feature extraction, and collaborative filtering, just to name a few. On top of that, RBMs are used as the main building block of another type of deep neural network, called a deep belief network, which we'll be talking about later. By now, you should have a basic understanding of RBMs and their applications. Thanks for watching this video.
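As a supplement, here is a hedged sketch of how the weight learning and the movie-6 prediction described above might look in code. It uses one step of contrastive divergence (CD-1), a standard RBM training rule, but simplifies by treating the scaled 1-to-5 ratings directly as visible-unit probabilities; the ratings matrix, layer sizes, learning rate, and epoch count are all made up.

```python
import numpy as np

rng = np.random.default_rng(42)
sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))

# Invented data: 8 users x 7 movies, ratings 1-5 scaled to [0, 1].
# Users 1-4 share one taste, users 5-7 another; user 8 resembles
# users 5-7 but has not watched movie 6 (marked 0 = unrated).
ratings = np.array([
    [5, 4, 1, 1, 5, 4, 2],
    [4, 5, 2, 1, 5, 5, 1],
    [5, 5, 1, 2, 4, 4, 2],
    [4, 4, 1, 1, 5, 4, 1],
    [1, 2, 5, 5, 1, 5, 4],
    [2, 1, 4, 5, 2, 5, 5],
    [1, 1, 5, 4, 1, 4, 5],
    [2, 1, 5, 5, 1, 0, 4],   # user 8: movie 6 unwatched
]) / 5.0

n_visible, n_hidden = 7, 2
W = rng.normal(0.0, 0.1, (n_visible, n_hidden))
a = np.zeros(n_visible)   # visible-layer biases
b = np.zeros(n_hidden)    # hidden-layer biases

def cd1(v0, W, a, b, lr=0.1):
    """One CD-1 update: compare hidden-unit statistics on the data
    against the same statistics on the reconstruction."""
    p_h0 = sigmoid(v0 @ W + b)                        # positive phase
    h0 = (rng.random(n_hidden) < p_h0).astype(float)  # sample hidden states
    p_v1 = sigmoid(h0 @ W.T + a)                      # reconstruction
    p_h1 = sigmoid(p_v1 @ W + b)
    W = W + lr * (np.outer(v0, p_h0) - np.outer(p_v1, p_h1))
    return W, a + lr * (v0 - p_v1), b + lr * (p_h0 - p_h1)

for _ in range(500):          # a few sweeps over the training users
    for v0 in ratings[:7]:    # train on users 1 through 7
        W, a, b = cd1(v0, W, a, b)

# Reconstruct user 8's vector; index 5 is the network's guess for movie 6.
h8 = sigmoid(ratings[7] @ W + b)
recon8 = sigmoid(h8 @ W.T + a)
print(round(float(recon8[5]) * 5, 1))  # predicted score for the unwatched movie
```

With data this small and a bare CD-1 sketch, the predicted score is not reliable; the example only illustrates the mechanism by which a trained RBM fills in a value for an unwatched movie.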