Jan 01, 2018

Accelerating Deep Learning with Memcomputing

Fabio Lorenzo Traversa, Massimiliano Di Ventra, Haik Manukian

Restricted Boltzmann machines (RBMs) and their extensions, often called “deep-belief networks”, are very powerful neural networks that have found widespread applicability in the fields of machine learning and big data. The standard way of training these models resorts to an iterative unsupervised procedure based on Gibbs sampling, called “contrastive divergence”, and additional supervised tuning via back-propagation. However, this procedure has been shown not to follow any gradient and can lead to suboptimal solutions. In this paper, we show a very efficient alternative to contrastive divergence by means of simulations of digital memcomputing machines (DMMs). We test our approach on pattern recognition using the standard MNIST data set of hand-written numbers…
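For context, the contrastive-divergence baseline that the paper compares against can be sketched in a few lines. The snippet below is an illustrative CD-1 update for a Bernoulli RBM (one Gibbs step per parameter update); it is not the memcomputing method of the paper, and the function name `cd1_update` and its parameters are hypothetical choices for this example.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_update(v0, W, b, c, lr=0.01, rng=np.random.default_rng(0)):
    """One CD-1 update for a Bernoulli RBM (illustrative sketch).

    v0: batch of binary visible vectors, shape (batch, n_visible)
    W:  weights, shape (n_visible, n_hidden)
    b:  visible biases, shape (n_visible,)
    c:  hidden biases, shape (n_hidden,)
    """
    # Positive phase: hidden probabilities and samples given the data
    ph0 = sigmoid(v0 @ W + c)
    h0 = (rng.random(ph0.shape) < ph0).astype(float)

    # Negative phase: one Gibbs step back to the visibles and hiddens
    pv1 = sigmoid(h0 @ W.T + b)
    v1 = (rng.random(pv1.shape) < pv1).astype(float)
    ph1 = sigmoid(v1 @ W + c)

    # Approximate gradient: data statistics minus model statistics
    batch = v0.shape[0]
    W += lr * (v0.T @ ph0 - v1.T @ ph1) / batch
    b += lr * (v0 - v1).mean(axis=0)
    c += lr * (ph0 - ph1).mean(axis=0)
    return W, b, c
```

Because the negative phase is truncated to a single Gibbs step, the update only approximates the log-likelihood gradient, which is precisely the limitation the abstract points to and that the DMM-based pre-training is proposed to overcome.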

Go To Publication  →

View arXiv version  →