In this post, I provide a detailed description and explanation of the Convolutional Neural Network example included in Rasmus Berg Palm’s DeepLearnToolbox for MATLAB. His example code applies a relatively simple CNN with 2 hidden layers and only 18 neurons to the MNIST dataset. The CNN’s accuracy is 98.92% on the test set, which seems very impressive to me given the small number of neurons.
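For orientation, the toolbox’s example script (test_example_CNN.m, if I recall correctly) looks roughly like the sketch below. The layer structure matches the description above (6 + 12 = 18 feature maps), but the training options shown are illustrative placeholders rather than the exact settings behind the 98.92% figure.

```matlab
% Rough sketch of the DeepLearnToolbox CNN example (layer structure as I
% understand it; training options here are illustrative assumptions).
load mnist_uint8;                                  % MNIST data shipped with the toolbox
train_x = double(reshape(train_x', 28, 28, 60000)) / 255;
test_x  = double(reshape(test_x',  28, 28, 10000)) / 255;
train_y = double(train_y');
test_y  = double(test_y');

% Two convolution ('c') layers with 6 and 12 feature maps (18 total),
% each followed by a 2x2 subsampling ('s') layer.
cnn.layers = {
    struct('type', 'i')                            % input layer
    struct('type', 'c', 'outputmaps', 6,  'kernelsize', 5)
    struct('type', 's', 'scale', 2)
    struct('type', 'c', 'outputmaps', 12, 'kernelsize', 5)
    struct('type', 's', 'scale', 2)
};

opts.alpha     = 1;    % learning rate (illustrative)
opts.batchsize = 50;   % mini-batch size (illustrative)
opts.numepochs = 1;    % more epochs are needed to approach the accuracy quoted above

cnn = cnnsetup(cnn, train_x, train_y);
cnn = cnntrain(cnn, train_x, train_y, opts);
[er, bad] = cnntest(cnn, test_x, test_y);          % er = error rate on the test set
fprintf('Test accuracy: %.2f%%\n', (1 - er) * 100);
```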
The Euclidean distance (also called the L2 distance) has many applications in machine learning, such as in K-Nearest Neighbor, K-Means Clustering, and the Gaussian kernel (which is used, for example, in Radial Basis Function Networks).
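To make the connection concrete, here is a minimal MATLAB sketch of the Euclidean distance and of the Gaussian (RBF) kernel built on top of it; the vectors and the sigma value are just made up for illustration.

```matlab
% Euclidean (L2) distance between two feature vectors p and q,
% and the Gaussian (RBF) kernel value computed from that distance.
p = [1 2 3];
q = [4 0 3];
sigma = 1.5;                                  % kernel width (illustrative)

d = sqrt(sum((p - q).^2));                    % Euclidean distance ||p - q||
k = exp(-sum((p - q).^2) / (2 * sigma^2));    % Gaussian kernel value in (0, 1]

fprintf('distance = %.4f, kernel = %.4f\n', d, k);
```

The kernel is largest (equal to 1) when the two vectors are identical and decays toward 0 as the Euclidean distance between them grows, which is exactly how Radial Basis Function Networks measure similarity to their prototype vectors.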
You can think of a Gaussian Mixture Model as a type of clustering algorithm. It is fit using an iterative technique called Expectation Maximization, and both the process and the result are very similar to k-means clustering. The difference is that each cluster is assumed to be generated by its own Gaussian distribution, with its own mean and covariance matrix.
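As a rough illustration of the parallel with k-means, the MATLAB sketch below clusters the same synthetic 2-D data both ways. It assumes the Statistics and Machine Learning Toolbox (for mvnrnd, kmeans, fitgmdist, and cluster), and the means and covariances used to generate the data are arbitrary.

```matlab
% Synthetic 2-D data drawn from two Gaussians with different covariances.
rng(1);                                              % for reproducibility
X = [mvnrnd([0 0], [1  0.5;  0.5 1], 200);
     mvnrnd([4 4], [2 -0.8; -0.8 1], 200)];

% k-means: hard assignments based on distance to each cluster centroid.
kmIdx = kmeans(X, 2);

% Gaussian Mixture Model: fit by Expectation Maximization, then assign
% each point to the component with the highest posterior probability.
gm    = fitgmdist(X, 2);
gmIdx = cluster(gm, X);

disp(gm.mu);      % estimated component means
disp(gm.Sigma);   % estimated covariance matrices, one per component
```

The two index vectors will usually agree on most points, but because the GMM estimates a full covariance matrix per component, it can capture elongated or tilted clusters that k-means (which effectively assumes round, equally sized clusters) tends to split incorrectly.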