Optimizer clipvalue and clipnorm not working in Tensorflow 2.0
Describe the current behavior
Setting clipvalue or clipnorm on an Optimizer does nothing.
Describe the expected behavior
With clipvalue=0 or clipnorm=0, every gradient should be clipped to 0 and no training should occur. Instead, the network still trains, and with a large learning rate the loss goes to nan.
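To make the expectation concrete, here is a minimal numpy sketch of what clipvalue=0 should mean for a plain SGD step (the variable names are illustrative, not TensorFlow internals):

```python
import numpy as np

# clipvalue=c clips each gradient component to [-c, c].
# For c = 0 every clipped component is 0, so the SGD update
# w - lr * clip(g) must leave the weights unchanged,
# no matter how large the learning rate is.
grad = np.array([0.5, -2.0, 3.0])
clipped = np.clip(grad, -0.0, 0.0)  # all components become 0

lr = 10.0
w = np.array([1.0, 1.0, 1.0])
w_new = w - lr * clipped  # identical to w: no training can occur
```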
Code to reproduce the issue
The gradient is clearly not zero, since the network weights change at every iteration.
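A minimal reproduction sketch along these lines (a toy Dense model on random data; the exact model and data are placeholders, not from the original report):

```python
import numpy as np
import tensorflow as tf

# Tiny model trained with SGD and clipvalue=0.0. If clipping is
# applied, the kernel must be identical before and after fit();
# in TF 2.0 with eager execution the weights changed anyway.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(1),
])
model.compile(
    optimizer=tf.keras.optimizers.SGD(learning_rate=1.0, clipvalue=0.0),
    loss="mse",
)

x = np.random.rand(32, 4).astype("float32")
y = np.random.rand(32, 1).astype("float32")

before = model.get_weights()[0].copy()
model.fit(x, y, epochs=1, verbose=0)
after = model.get_weights()[0]

# Expected True; on affected TF 2.0 builds this was False.
print(np.allclose(before, after))
```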
Sanity check by setting lr=0
No training occurs when lr=0, as expected.
tomerk:
Yes, norm clipping still worked correctly when eager execution was disabled in 2.0 and 2.1.