Unsupervised Pre-Training by Evolving for Diverse Features
Abstract
Deep neural networks (DNNs) excel at extracting intricate patterns from data to solve complex, non-linear problems across many domains. The initialization strategy used can greatly affect both the accuracy of the resulting trained network and the efficiency of the training process. We propose an evolutionary pre-training technique that initializes a network's convolutional filters so as to optimize toward orthogonality of their feature activations. Relative to randomly initialized parameters, this evolutionary pre-training improves the accuracy of CNNs subsequently trained on the CIFAR-100 image classification benchmark.
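The abstract does not spell out the evolutionary algorithm or fitness function, so the following is a minimal sketch of the idea, assuming truncation selection, Gaussian mutation, and a fitness defined as the negative mean squared off-diagonal cosine similarity between filter activation vectors; the function names and hyperparameters are illustrative, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv_activations(filters, images):
    """Valid 2-D convolution of each filter over each image;
    returns an (n_filters, n_patches) activation matrix."""
    k = filters.shape[-1]
    patches = np.lib.stride_tricks.sliding_window_view(images, (k, k), axis=(1, 2))
    patches = patches.reshape(-1, k * k)          # every k x k image patch
    return filters.reshape(len(filters), -1) @ patches.T

def orthogonality_fitness(filters, images):
    """Higher fitness = more mutually orthogonal activation vectors
    (penalizes squared off-diagonal cosine similarity)."""
    acts = conv_activations(filters, images)
    acts /= np.linalg.norm(acts, axis=1, keepdims=True) + 1e-8
    gram = acts @ acts.T                          # pairwise cosine similarities
    off_diag = gram - np.eye(len(filters))
    return -np.mean(off_diag ** 2)

def evolve_filters(images, n_filters=8, k=3, pop_size=20,
                   generations=50, sigma=0.05):
    """Simple truncation-selection evolution of a bank of conv filters."""
    pop = [rng.standard_normal((n_filters, k, k)) * 0.1 for _ in range(pop_size)]
    for _ in range(generations):
        ranked = sorted(pop, key=lambda f: orthogonality_fitness(f, images),
                        reverse=True)
        parents = ranked[:pop_size // 2]          # keep the fitter half
        children = [p + rng.standard_normal(p.shape) * sigma for p in parents]
        pop = parents + children                  # Gaussian mutation of parents
    return max(pop, key=lambda f: orthogonality_fitness(f, images))

# Usage: evolve an initialization on a batch of unlabeled images, then copy
# the result into the first conv layer before supervised training begins.
images = rng.standard_normal((16, 32, 32))        # stand-in for a CIFAR-100 batch
init_filters = evolve_filters(images)
```

No gradients are used during this pre-training stage; only the fitness ranking drives the search, which is what distinguishes the evolutionary initialization from gradient-based unsupervised pre-training.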