Question about MLP learning rate

At the end of today's lecture, I got a question about how the learning rate should be adjusted to the training set size, and whether a larger set would require a smaller learning rate.

My answer was vague, so afterwards, with the help of my good friend Google, we found the following paper containing this table, which confirms that the learning rate should normally be smaller for a larger data set. Further, it confirms that updating the weights after each training vector (online) is more effective than batch training (the opposite of what our textbook claims).
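To make the distinction concrete, here is a minimal toy sketch (my own example, not taken from the paper or the textbook) contrasting the two schemes on a small linear model: online training updates the weights from one sample's gradient at a time, while batch training accumulates the gradient over the whole training set before each update.

```python
import numpy as np

# Toy data: a linear target with a little noise (made up for illustration).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
true_w = np.array([1.0, -2.0, 0.5])
y = X @ true_w + 0.1 * rng.normal(size=200)

def mse(w):
    return float(np.mean((X @ w - y) ** 2))

def train_online(lr, epochs=20):
    """One weight update per training vector (online / stochastic)."""
    w = np.zeros(3)
    for _ in range(epochs):
        for i in rng.permutation(len(X)):
            grad = 2 * (X[i] @ w - y[i]) * X[i]  # gradient from a single sample
            w -= lr * grad
    return w

def train_batch(lr, epochs=20):
    """One weight update per full pass over the training set (batch)."""
    w = np.zeros(3)
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(X)  # mean gradient over all samples
        w -= lr * grad
    return w

w_online = train_online(lr=0.01)
w_batch = train_batch(lr=0.1)
print("online MSE:", mse(w_online))
print("batch MSE: ", mse(w_batch))
```

Note that with the same number of epochs the online variant performs many more (noisier) weight updates, which is one intuition for why it can progress faster per pass and why its learning rate typically needs to be smaller than the batch one.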


Published Oct. 5, 2015 6:44 PM - Last modified Oct. 5, 2015 6:47 PM