Optimizer clipvalue and clipnorm not working in TensorFlow 2.0
Describe the current behavior: clipvalue and clipnorm in Optimizers do nothing!
Describe the expected behavior: Setting clipvalue=0 or clipnorm=0 should prevent any training (all gradients should be clipped to 0), but the network still trains, and with a large learning rate the loss goes to NaN.
Code to reproduce the issue: The gradient is clearly not zero, since the network is modified at each iteration.
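The original reproduction code did not survive in this thread. As a stand-in, here is a minimal, hypothetical pure-Python sketch of the clipping semantics the report assumes: with clipvalue=0 or clipnorm=0, every gradient component must come out as zero, so no weight update is possible (the function names and values below are illustrative, not TensorFlow API).

```python
import math

def clip_by_value(grad, clipvalue):
    # Element-wise clipping to the interval [-clipvalue, clipvalue].
    return [max(-clipvalue, min(clipvalue, g)) for g in grad]

def clip_by_global_norm(grad, clipnorm):
    # Rescale the gradient so its L2 norm is at most clipnorm.
    norm = math.sqrt(sum(g * g for g in grad))
    if norm <= clipnorm:
        return list(grad)
    return [g * clipnorm / norm for g in grad]

grad = [0.5, -2.0, 3.0]

# Both calls with a threshold of 0 produce all-zero gradients,
# so an optimizer applying them could not change any weight.
print(clip_by_value(grad, 0.0))
print(clip_by_global_norm(grad, 0.0))
```

If the Keras optimizers honored clipvalue/clipnorm in eager mode, this is the behavior they would have to exhibit; the bug is that they silently skip this step.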
Sanity check by setting lr=0: No training occurs when lr=0, as expected.
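The two sanity checks can be contrasted with a hand-rolled SGD step (a hypothetical pure-Python sketch; the names sgd_step, weights, and grads are illustrative):

```python
def sgd_step(weights, grads, lr):
    # Plain SGD update: w <- w - lr * g
    return [w - lr * g for w, g in zip(weights, grads)]

weights = [1.0, -0.5]
grads = [0.3, 0.7]

# lr = 0: weights stay unchanged -- this matches the observed behavior.
assert sgd_step(weights, grads, lr=0.0) == weights

# clipvalue = 0 should have exactly the same effect, because every
# clipped gradient is zero -- but in the reported TF 2.0 eager-mode
# bug the clipping is silently skipped and the weights still move.
clipped = [max(-0.0, min(0.0, g)) for g in grads]
assert sgd_step(weights, clipped, lr=0.1) == weights
```

The point of the comparison: lr=0 proves the training loop itself can freeze the weights, so the fact that clipvalue=0 does not freeze them isolates the bug to the clipping step inside the optimizer.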
tomerk commented:
Hi Goldie, someone wants to know if a bug fix in 2.2 can be backported to 2.1.x. What's our policy on minor release versions?
On Sun, Mar 22, 2020 at 9:35 AM, jlherren wrote:
Oh man, I've just been debugging my NaN losses for over an hour now. How could this even go unnoticed...
Will this be fixed for 2.1.x as well or is 2.2 the only way to get this working again? I can't disable eager mode because then I lose the cuDNN implementation of LSTMs.
View this comment on GitHub: https://github.com/tensorflow/tensorflow/issues/33929#issuecomment-602236143