New Publication in IEEE International Conference on Electronics, Circuits and Systems

Hyper-parameters define the design of a machine learning architecture. Tuning them is costly, and for large data sets outright impractical, whether done manually or algorithmically. In this study we propose a Neocognitron-based method for reducing the training set to a small fraction while preserving the dynamics and complexity of the domain. Our approach does not require processing the entire training set, making it feasible for large data sets. In our experiments we successfully reduced the MNIST training set to less than 2.5% (1,489 images) by processing fewer than 10% of its 60K images. We showed that the reduced data set can be used to tune the number of hidden neurons in a multi-layer perceptron.
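The general idea of tuning on a reduced subset can be sketched as follows. This is a minimal illustration, not the paper's Neocognitron-based selection method: the subset here is drawn at random, and the data, network sizes, and learning rate are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for a large training set: 2-class data in 20 dimensions.
# (Illustrative only; the paper uses MNIST.)
n_full, dim = 6000, 20
w_true = rng.normal(size=dim)
X_full = rng.normal(size=(n_full, dim))
y_full = (X_full @ w_true + 0.5 * rng.normal(size=n_full) > 0).astype(int)

# Reduced subset: a small fraction of the full set (here ~2.5%), drawn at
# random rather than by the paper's method.
idx = rng.choice(n_full, size=150, replace=False)
X_sub, y_sub = X_full[idx], y_full[idx]

# Held-out validation data for comparing candidate architectures.
X_val = rng.normal(size=(1000, dim))
y_val = (X_val @ w_true > 0).astype(int)

def train_mlp(X, y, n_hidden, epochs=200, lr=0.1):
    """Train a tiny one-hidden-layer MLP with full-batch gradient descent."""
    W1 = rng.normal(scale=0.1, size=(X.shape[1], n_hidden))
    W2 = rng.normal(scale=0.1, size=n_hidden)
    for _ in range(epochs):
        h = np.tanh(X @ W1)                  # hidden activations
        p = 1.0 / (1.0 + np.exp(-(h @ W2)))  # output probability
        g = p - y                            # gradient of logistic loss
        W2 -= lr * h.T @ g / len(y)
        W1 -= lr * X.T @ (np.outer(g, W2) * (1 - h**2)) / len(y)
    return W1, W2

def accuracy(W1, W2, X, y):
    p = 1.0 / (1.0 + np.exp(-(np.tanh(X @ W1) @ W2)))
    return float(np.mean((p > 0.5) == y))

# Sweep candidate hidden-layer sizes, training on the reduced subset only.
candidates = [2, 8, 32, 128]
scores = {h: accuracy(*train_mlp(X_sub, y_sub, h), X_val, y_val)
          for h in candidates}
best = max(scores, key=scores.get)
print(f"validation accuracy per hidden size: {scores}")
print(f"selected hidden size: {best}")
```

Each candidate architecture is trained only on the 150-sample subset, so the sweep touches a small fraction of the data; the selected hidden size would then be used to train on the full set.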

K.K.E. Akı, T. Erkoç, M.T. Eskil, “Subset Selection for Tuning of Hyper-parameters in Artificial Neural Networks,” 24th IEEE International Conference on Electronics, Circuits and Systems (ICECS 2017), accepted.