Unsupervised Pre-Training by Evolving for Diverse Features
Conference Year
January 2023
Abstract
Deep neural networks (DNNs) excel at extracting complex patterns from data to solve non-linear problems across many domains. The initialization strategy used can greatly affect both the accuracy of the resulting trained network and the efficiency of the training process. We propose an evolutionary pre-training technique that initializes networks by optimizing the convolutional filters of a CNN toward orthogonality of their feature activations. Relative to randomly initialized parameters, this evolutionary pre-training improves the accuracy of networks whose convolutional filters are subsequently trained on the CIFAR-100 image classification benchmark.
Primary Faculty Mentor Name
Nicholas Cheney
Status
Graduate
Student College
College of Engineering and Mathematical Sciences
Program/Major
Computer Science
Primary Research Category
Engineering and Math Science
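To make the idea concrete, the sketch below illustrates one plausible form of the approach: a fitness function that rewards mutually orthogonal filter activations on unlabeled images, driven by a simple (1+λ)-style hill climber over the filter weights. The abstract does not specify the exact fitness measure or evolutionary operators, so the function names, hyperparameters, and the hill-climbing scheme here are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn.functional as F

def orthogonality_fitness(filters, images):
    """Negative mean |cosine similarity| between per-filter activation maps.

    filters: (num_filters, in_channels, k, k) conv weights
    images:  (batch, in_channels, H, W) unlabeled inputs
    Higher is better (activations closer to mutually orthogonal).
    Assumed fitness definition; the paper's exact measure may differ.
    """
    acts = F.conv2d(images, filters)                     # (B, F, H', W')
    flat = acts.permute(1, 0, 2, 3).reshape(filters.shape[0], -1)
    flat = F.normalize(flat, dim=1)                      # unit-norm per filter
    gram = flat @ flat.t()                               # pairwise cosine similarities
    mask = ~torch.eye(gram.shape[0], dtype=torch.bool)   # ignore self-similarity
    return -gram[mask].abs().mean().item()

def evolve_filters(images, num_filters=32, in_ch=3, k=3,
                   offspring_per_gen=20, generations=100, sigma=0.05):
    """Simple (1+lambda) hill climber over conv filter weights (assumed scheme)."""
    torch.manual_seed(0)
    best = torch.randn(num_filters, in_ch, k, k) * 0.1
    best_fit = orthogonality_fitness(best, images)
    for _ in range(generations):
        # Mutate the current best filters with Gaussian noise to form offspring.
        offspring = best + sigma * torch.randn(offspring_per_gen, *best.shape)
        for child in offspring:
            fit = orthogonality_fitness(child, images)
            if fit > best_fit:
                best, best_fit = child, fit
    return best, best_fit

if __name__ == "__main__":
    # Random stand-in data; in practice this would be a batch of CIFAR-100 images.
    imgs = torch.rand(64, 3, 32, 32)
    filters, fit = evolve_filters(imgs)
    print(f"best orthogonality fitness: {fit:.4f}")
```

The evolved filters would then serve as the initialization for the first convolutional layer before standard supervised training, in place of a random initialization.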