Hi all

I was wondering: can Dropout cause a large training cost error, rather than a cost error close to zero, when the training set is very small, e.g. 2 samples (as used when plotting a learning curve)?
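To make the question concrete, here is a minimal numpy sketch (my own illustration, not from any particular library): a linear model that fits 2 samples exactly, so its eval-mode MSE is 0, but with inverted dropout applied to the inputs during the forward pass the training-mode loss stays well above zero, even on just those 2 samples.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two training samples that the weight vector w fits exactly (eval-mode loss = 0).
X = np.array([[1.0, 2.0], [3.0, 4.0]])
w = np.array([0.5, -1.0])
y = X @ w  # targets generated by w itself, so the fit is perfect

def mse_with_dropout(p_drop, n_passes=10_000):
    """Average MSE over stochastic forward passes with inverted dropout on inputs."""
    keep = 1.0 - p_drop
    losses = []
    for _ in range(n_passes):
        mask = rng.random(X.shape) < keep
        X_drop = X * mask / keep  # inverted dropout scaling, as used at training time
        pred = X_drop @ w
        losses.append(np.mean((pred - y) ** 2))
    return float(np.mean(losses))

print("eval-mode MSE:", np.mean((X @ w - y) ** 2))          # exactly 0
print("train-mode MSE, p=0.5 dropout:", mse_with_dropout(0.5))  # clearly > 0
```

So yes, in this toy setting the dropout noise keeps the training-time cost strictly positive even though the model interpolates the 2 samples perfectly, which would show up as a nonzero training error at the left end of a learning curve.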