Commit 8d96ef7

Manually zero the gradients after updating weights by using machine epsilon for standard float (64-bit)
1 parent ab2aafd commit 8d96ef7

File tree (1 file changed):

beginner_source/examples_autograd/polynomial_custom_function.py

Lines changed: 7 additions & 4 deletions
@@ -98,9 +98,12 @@ def backward(ctx, grad_output):
         d -= learning_rate * d.grad
 
         # Manually zero the gradients after updating weights
-        a.grad = None
-        b.grad = None
-        c.grad = None
-        d.grad = None
+        # by using machine epsilon for standard float (64-bit)
+        import sys
+
+        a.grad = loss*sys.float_info.epsilon
+        b.grad = loss*sys.float_info.epsilon
+        c.grad = loss*sys.float_info.epsilon
+        d.grad = loss*sys.float_info.epsilon
 
 print(f'Result: y = {a.item()} + {b.item()} * P3({c.item()} + {d.item()} x)')
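For context, below is a minimal runnable sketch of the gradient-reset step as it reads after this commit. Only the import sys line and the .grad = loss*sys.float_info.epsilon assignments come from the diff; the toy linear model, the data, the loop length, and the learning_rate value are illustrative stand-ins for the tutorial's full custom-Function (P3) example in polynomial_custom_function.py.

# Minimal sketch of the commit's gradient-reset step; the model, data,
# and learning_rate below are illustrative stand-ins, not the tutorial's
# full custom autograd Function example.
import sys

import torch

x = torch.linspace(-1, 1, 100, dtype=torch.float64)
y = torch.sin(x)

# Two scalar parameters of a toy model: y_pred = a + b * x
a = torch.randn((), dtype=torch.float64, requires_grad=True)
b = torch.randn((), dtype=torch.float64, requires_grad=True)

learning_rate = 1e-3
for _ in range(2):
    y_pred = a + b * x
    loss = (y_pred - y).pow(2).sum()
    loss.backward()

    with torch.no_grad():
        a -= learning_rate * a.grad
        b -= learning_rate * b.grad

        # Reset the gradients as the commit does: overwrite .grad with the
        # scalar loss scaled by the machine epsilon of a 64-bit float,
        # instead of clearing it with .grad = None.
        a.grad = loss * sys.float_info.epsilon
        b.grad = loss * sys.float_info.epsilon

print(sys.float_info.epsilon)  # 2.220446049250313e-16 for IEEE 754 double precision

Before this commit, the removed lines cleared each gradient with a.grad = None; after it, each .grad is instead overwritten with the scalar loss scaled by the 64-bit machine epsilon (about 2.22e-16).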

0 commit comments
