What happens if you over-train a neural net?

mlcvGru

When you let a machine learn too much, it may end up doing worse. It is just like us as human beings: we start forgetting things, or even go a little crazy, when we are forced to study excessively.

All jokes aside, I have had some chances to play with deep and (reasonably) big neural networks, and I have just observed exactly what is said above.

In the first experiment, I trained a feed-forward neural network on MNIST. The net has 3 hidden layers with 1024 – 1024 – 2048 hidden units, fully connected, and is trained by stochastic gradient descent with momentum, L2 weight decay, and a decaying learning rate. The cost function is cross-entropy. The net is similar to the one described here, but 2 layers deeper. The numbers of errors on the training/validation/test sets are displayed in the figure below.

It went wild, eventually. After 700*2000 =…
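If you want to reproduce this kind of over-training curve yourself, here is a minimal sketch in PyTorch (my assumption; the original post does not say which framework was used, and details such as the ReLU activations, batch size, and the exact learning-rate, momentum, and weight-decay values below are illustrative, not the author's). It builds the fully connected 1024 – 1024 – 2048 net, trains it with SGD with momentum, L2 weight decay, and a decaying learning rate under a cross-entropy cost, and prints the error counts on the training and test sets (a validation split is omitted for brevity) so you can watch them diverge when training runs far too long.

```python
# Hypothetical reproduction sketch; hyperparameters are illustrative, not the author's.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

device = "cuda" if torch.cuda.is_available() else "cpu"

# MNIST, flattened to 784-dimensional vectors.
tfm = transforms.Compose([transforms.ToTensor(), transforms.Lambda(lambda x: x.view(-1))])
train_set = datasets.MNIST("data", train=True, download=True, transform=tfm)
test_set = datasets.MNIST("data", train=False, download=True, transform=tfm)
train_loader = DataLoader(train_set, batch_size=128, shuffle=True)
test_loader = DataLoader(test_set, batch_size=1000)

# Fully connected net: 3 hidden layers with 1024 - 1024 - 2048 units.
model = nn.Sequential(
    nn.Linear(784, 1024), nn.ReLU(),
    nn.Linear(1024, 1024), nn.ReLU(),
    nn.Linear(1024, 2048), nn.ReLU(),
    nn.Linear(2048, 10),
).to(device)

criterion = nn.CrossEntropyLoss()                               # cross-entropy cost
optimizer = torch.optim.SGD(model.parameters(), lr=0.1,
                            momentum=0.9, weight_decay=1e-4)    # momentum + L2 decay
scheduler = torch.optim.lr_scheduler.ExponentialLR(optimizer, gamma=0.98)  # decaying LR

def count_errors(loader):
    """Count misclassified examples over a data loader."""
    model.eval()
    errors = 0
    with torch.no_grad():
        for x, y in loader:
            x, y = x.to(device), y.to(device)
            errors += (model(x).argmax(dim=1) != y).sum().item()
    return errors

# Deliberately train for far too many epochs to provoke over-training.
for epoch in range(200):
    model.train()
    for x, y in train_loader:
        x, y = x.to(device), y.to(device)
        optimizer.zero_grad()
        criterion(model(x), y).backward()
        optimizer.step()
    scheduler.step()
    print(f"epoch {epoch}: train errors {count_errors(train_loader)}, "
          f"test errors {count_errors(test_loader)}")
```

Plotting the two error counts should show the training errors falling toward zero while the test errors bottom out and then creep back up, which is the "going wild" behaviour the original post describes.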

View original post 414 more words



Posted on May 6, 2013, in Artificial Intelligence.
