11 – Solution Random Vs Preinitialized Thoughts

And the big surprise to us, what makes it better, was that we had significantly better results when the network was pretrained on completely different objects than if we trained it from scratch. Somehow, the features that develop inside the layers of a neural network, irrespective of what images you are training on, have enough commonality that you get a better classifier with pretraining. That, to me, is interesting because I am a father, and for the first few years my son would babble and look around randomly, and I could not quite understand why nature makes us do this. But now I know that, very likely, in these random acts of perception, structure evolves in the visual cortex that later becomes useful, and my son will eventually become a medical...
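The transfer effect described here can be sketched with a toy experiment. Everything below is an illustrative assumption, not the setup from the talk: we pretrain the hidden layer of a tiny network on one synthetic task, freeze it, train only a linear head on a different task, and compare against a frozen randomly initialized hidden layer.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def make_task(seed, n=300, d=20):
    # Synthetic binary task: labels come from a random linear rule.
    r = np.random.default_rng(seed)
    X = r.normal(size=(n, d))
    y = (X @ r.normal(size=d) > 0).astype(float)
    return X, y

def train_net(X, y, W1, steps=300, lr=0.3, train_hidden=True):
    # Tiny one-hidden-layer net; optionally freeze the hidden layer
    # so that only the linear head is fit (the "fine-tune" regime).
    w2 = np.zeros(W1.shape[1])
    for _ in range(steps):
        H = np.tanh(X @ W1)          # hidden features
        p = sigmoid(H @ w2)          # predicted probabilities
        g = (p - y) / len(y)         # loss gradient w.r.t. logits
        if train_hidden:
            W1 -= lr * X.T @ (np.outer(g, w2) * (1 - H ** 2))
        w2 -= lr * H.T @ g
    return W1, w2

# Task A plays the role of "completely different objects";
# task B is the target classification task.
XA, yA = make_task(seed=1)
XB, yB = make_task(seed=2)

d, h = 20, 32
W_pre = rng.normal(size=(d, h)) * 0.5
W_pre, _ = train_net(XA, yA, W_pre.copy())       # pretrain on task A

# Train only a head on task B, hidden weights frozen.
_, head_pre = train_net(XB, yB, W_pre, train_hidden=False)

# Baseline: a frozen hidden layer that was never pretrained.
W_rnd = rng.normal(size=(d, h)) * 0.5
_, head_rnd = train_net(XB, yB, W_rnd, train_hidden=False)

def acc(X, y, W1, w2):
    return np.mean((sigmoid(np.tanh(X @ W1) @ w2) > 0.5) == y)

print("pretrained features:", acc(XB, yB, W_pre, head_pre))
print("random features:   ", acc(XB, yB, W_rnd, head_rnd))
```

On real image data the pretrained features tend to win, which is the surprise described above; on a toy linear task like this the gap may be small, so the sketch is only meant to show the workflow, not to reproduce the result.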
