Okay – I was watching paint dry – er, my neural network learning today, and something surprising happened about an hour into the sequence. Suddenly, instead of getting the tests 90%+ correct, it was getting them almost 100% incorrect!
It took me a few moments to realise what had happened. For some reason, I had decided to try teaching the network the difference between two objects for a lot of sessions before adding a third. I thought that would teach the network to discriminate very finely between them.
It turns out, though, that the network was smarter than me. It learned that the answer to the current test is… the exact opposite of whatever the previous answer was. The damn thing learned to predict my tests!
So – what I’ve done to correct this is erase the neural memory in the network at the beginning of every test. In effect, I’m whacking the network on the side of the head so it can’t learn to predict my sequences and has to learn the actual photos instead.
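The post doesn't say what framework or architecture is involved, so here's a toy sketch in plain Python of the failure mode and the fix. The `ToyRecurrentNet` class and its methods are hypothetical stand-ins: a network that carries memory between tests can "cheat" on an alternating test sequence by answering the opposite of its previous answer, while resetting that memory before each test forces it to judge each photo on its own.

```python
import random

class ToyRecurrentNet:
    """Hypothetical stand-in for a network with internal memory
    carried between tests (not the author's actual architecture)."""

    def __init__(self):
        self.last_answer = None  # the "neural memory" that survives between tests

    def reset_state(self):
        # The "whack on the side of the head": wipe the carried-over
        # memory so the network can't key off the previous test.
        self.last_answer = None

    def predict(self, photo):
        # The cheating strategy the network discovered: if memory survives
        # and the test sequence alternates, answer the opposite of last time.
        if self.last_answer is not None:
            guess = 1 - self.last_answer
        else:
            guess = random.randint(0, 1)  # no memory yet: guess
        self.last_answer = guess
        return guess

# An alternating two-object test sequence – the pattern being exploited.
tests = [0, 1] * 10

# Without resets: the net locks onto the alternation, so it scores
# either almost 100% correct or almost 100% incorrect.
net = ToyRecurrentNet()
cheating_score = sum(net.predict(p) == p for p in tests)

# With a reset before every test: no sequence information survives,
# so it's forced back to chance until it learns the photos themselves.
net = ToyRecurrentNet()
honest_score = 0
for p in tests:
    net.reset_state()
    honest_score += net.predict(p) == p
```

Note the `photo` argument is ignored by the cheating strategy entirely, which is exactly the problem: the sequence, not the image, was carrying the answer.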