Commit 64b6782

Deprecate RAdam optimizer properly. (#389)
1 parent 51e361c commit 64b6782

File tree: 4 files changed (+17, -1 lines changed)

CHANGES.rst (4 additions, 0 deletions)

@@ -1,6 +1,10 @@
 Changes
 -------
 
+0.3.1 (YYYY-MM-DD)
+------------------
+* Deprecate RAdam optimizer.
+
 0.3.0 (2021-10-30)
 ------------------
 * Revert for Drop RAdam.

README.rst (2 additions, 0 deletions)

@@ -775,6 +775,8 @@ RAdam
 | .. image:: https://raw.githubusercontent.com/jettify/pytorch-optimizer/master/docs/rastrigin_RAdam.png | .. image:: https://raw.githubusercontent.com/jettify/pytorch-optimizer/master/docs/rosenbrock_RAdam.png |
 +---------------------------------------------------------------------------------------------------------+-----------------------------------------------------------------------------------------------------------+
 
+Deprecated, please use version provided by PyTorch_.
+
 .. code:: python
 
     import torch_optimizer as optim

torch_optimizer/__init__.py (1 addition, 1 deletion)

@@ -79,7 +79,7 @@
     # utils
     'get',
 )
-__version__ = '0.3.0'
+__version__ = '0.3.1a0'
 
 
 _package_opts = [

torch_optimizer/radam.py (10 additions, 0 deletions)

@@ -1,4 +1,5 @@
 import math
+import warnings
 
 import torch
 from torch.optim.optimizer import Optimizer
@@ -11,6 +12,9 @@
 class RAdam(Optimizer):
     r"""Implements RAdam optimization algorithm.
 
+    Note:
+        Deprecated, please use version provided by PyTorch_.
+
     It has been proposed in `On the Variance of the Adaptive Learning
     Rate and Beyond`__.
 
@@ -45,6 +49,12 @@ def __init__(
         eps: float = 1e-8,
         weight_decay: float = 0,
     ) -> None:
+        warnings.warn(
+            'RAdam optimizer is deprecated, since it is included '
+            'in pytorch natively.',
+            DeprecationWarning,
+            stacklevel=2,
+        )
         if lr <= 0.0:
             raise ValueError('Invalid learning rate: {}'.format(lr))
         if eps < 0.0:
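The deprecation pattern added in this hunk can be sketched in isolation. The stand-in class below is illustrative and torch-free (the real `RAdam` subclasses `torch.optim.Optimizer` and takes more parameters); it shows how the `warnings.warn(..., DeprecationWarning, stacklevel=2)` call surfaces to callers, and how a caller can observe it, since `DeprecationWarning` is hidden by default outside of test runners:

```python
import warnings


class RAdam:
    """Minimal stand-in mirroring the commit's deprecation pattern.

    Hypothetical, torch-free sketch; only the warning logic matches
    the real torch_optimizer.RAdam.
    """

    def __init__(self, lr: float = 1e-3) -> None:
        # stacklevel=2 attributes the warning to the caller's line,
        # not to this __init__ body.
        warnings.warn(
            'RAdam optimizer is deprecated, since it is included '
            'in pytorch natively.',
            DeprecationWarning,
            stacklevel=2,
        )
        if lr <= 0.0:
            raise ValueError('Invalid learning rate: {}'.format(lr))
        self.lr = lr


# DeprecationWarning is filtered out by default, so record it explicitly.
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter('always')
    opt = RAdam(lr=1e-3)

assert any(issubclass(w.category, DeprecationWarning) for w in caught)
```

Construction still succeeds; the warning is advisory, steering users toward the natively provided PyTorch implementation without breaking existing code.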
