Commit 9b4bef8

Verbose fix (#2957)

* remove codecov
* RankProcessFirst
* annotations
* from class to contextlib
* from class to contextlib and test
* del test file
* uniq folder for test
* refactor tests + new assert_test
* add to __all__, remove idist import
* Apply suggestions from code review
* Apply suggestions from code review
* Update tests/ignite/distributed/utils/test_native.py
* Added local arg and renamed function
* verbose del
* remove cast

Co-authored-by: vfdev <[email protected]>
1 parent e9e5b45 commit 9b4bef8

File tree: 2 files changed (+3 −7 lines)


ignite/handlers/param_scheduler.py

Lines changed: 2 additions & 6 deletions

@@ -809,12 +809,8 @@ def optimizer(self) -> torch.optim.Optimizer:
         return self._lr_scheduler.optimizer

     def get_lr(self, epoch: Optional[int] = None) -> List[float]:
-        # TODO: Remove this workaround when pytorch has fixed wrong type hints:
-        # https://github.com/pytorch/pytorch/pull/102067
-        # Replace below T_mult -> self._lr_scheduler.T_mult
-        # Replace below eta_min -> self._lr_scheduler.eta_min
-        T_mult = cast(int, self._lr_scheduler.T_mult)
-        eta_min = cast(float, self._lr_scheduler.eta_min)
+        T_mult = self._lr_scheduler.T_mult
+        eta_min = self._lr_scheduler.eta_min

         if epoch is None and self.last_epoch < 0:
             epoch = 0
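The hunk above drops a `typing.cast` workaround: ignite had cast `T_mult` and `eta_min` because PyTorch's type hints for `CosineAnnealingWarmRestarts` attributes were wrong until pytorch/pytorch#102067. A minimal sketch (values hypothetical, not from the diff) of reading those attributes directly now that the upstream hints are fixed:

```python
import torch
from torch.optim.lr_scheduler import CosineAnnealingWarmRestarts

# Build a throwaway optimizer/scheduler pair to inspect the attributes.
param = torch.nn.Parameter(torch.zeros(1))
optimizer = torch.optim.SGD([param], lr=0.1)
scheduler = CosineAnnealingWarmRestarts(optimizer, T_0=10, T_mult=2, eta_min=0.0)

# No typing.cast needed: the attributes already carry the right runtime types.
T_mult = scheduler.T_mult    # int
eta_min = scheduler.eta_min  # float
assert isinstance(T_mult, int) and isinstance(eta_min, float)
```

The runtime behavior was never wrong; only the static annotations were, which is why the fix is a pure deletion.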

tests/ignite/handlers/test_param_scheduler.py

Lines changed: 1 addition & 1 deletion

@@ -1378,7 +1378,7 @@ def get_optim():
         return torch.optim.SGD([t1], lr=lr)

     def get_cos_shed():
-        return CosineAnnealingWarmRestarts(optimizer, T_0=T_0, T_mult=T_mult, verbose=False)
+        return CosineAnnealingWarmRestarts(optimizer, T_0=T_0, T_mult=T_mult)

     optimizer = get_optim()
     scheduler = get_cos_shed()
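The test change removes `verbose=False` from the scheduler constructor; recent PyTorch releases deprecate the `verbose` argument on LR schedulers, so passing it triggers warnings (and newer versions remove it). A minimal sketch of the updated construction pattern, with hypothetical `T_0`/`T_mult` values standing in for the test's constants:

```python
import torch
from torch.optim.lr_scheduler import CosineAnnealingWarmRestarts

t1 = torch.nn.Parameter(torch.zeros(1))
optimizer = torch.optim.SGD([t1], lr=0.01)

# Construct without the deprecated `verbose` kwarg; defaults are silent anyway.
scheduler = CosineAnnealingWarmRestarts(optimizer, T_0=10, T_mult=1)
scheduler.step()  # advance one epoch within the first restart cycle
```

Omitting the kwarg keeps the test compatible across torch versions instead of pinning to ones that still accept it.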
