Pattern Recognition (Second Semester of MAIA at University of Cassino and Southern Lazio, June 2018)
As the learning rate is one of the most important hyper-parameters to tune when training convolutional neural networks, this paper implements a powerful technique for selecting a range of learning rates, called the cyclical learning rate (CLR), with two different degrees of skewness. In this approach the learning rate is cycled between a lower bound and an upper bound during training. CLR policies are computationally simple and avoid the computational expense of fine-tuning a fixed learning rate. The results clearly show that varying the learning rate during the training phase yields far better results than fixed values, with a similar or even smaller number of epochs.
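As a minimal sketch of the cycling idea described above, the following Python function implements the standard triangular CLR schedule (Smith, 2017), in which the rate rises linearly from a lower bound to an upper bound over `step_size` iterations and then falls back. The bound values, `step_size`, and the `skew` exponent are illustrative assumptions; the abstract does not specify the paper's exact skewness formulation.

```python
import math

def cyclical_lr(iteration, base_lr=1e-4, max_lr=1e-2, step_size=2000, skew=1.0):
    """Triangular cyclical learning rate.

    The rate climbs linearly from base_lr to max_lr over step_size
    iterations, then descends back, and repeats. The `skew` exponent is
    a hypothetical way to shape the cycle asymmetrically; the paper's
    actual skewness scheme is not given in the abstract.
    """
    cycle = math.floor(1 + iteration / (2 * step_size))        # which cycle we are in
    x = abs(iteration / step_size - 2 * cycle + 1)             # position within the cycle, in [0, 1]
    scale = max(0.0, 1.0 - x) ** skew                          # triangular wave, optionally skewed
    return base_lr + (max_lr - base_lr) * scale

# Example: the rate over the first full cycle (4000 iterations).
for it in (0, 1000, 2000, 3000, 4000):
    print(it, round(cyclical_lr(it), 5))
```

With these assumed bounds, the rate starts at `base_lr`, peaks at `max_lr` halfway through the cycle (iteration 2000), and returns to `base_lr` at the end, which is the cycling-between-bounds behavior the abstract refers to.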