Release time: 2020-11-12 15:31:54
🙇 We'd like to thank all contributors for this new release! In particular, wangfeng 🙇
- [f722a1c8] - scheduler: update step by gradient_accumulation_steps (wangfeng)
- [ba0dfd71] - changelog: update change log to v1.2.24 (wangfeng)
Release time: 2020-11-12 15:02:11
🙇 We'd like to thank all contributors for this new release! In particular, wangfeng 🙇
- [5440697a] - dataio: enable shuffle in distributed data parallel training (wangfeng)
- [09b42ea1] - logging: minor revision of logging info (wangfeng)
- [d6544319] - changelog: update change log to v1.2.23 (wangfeng)
Release time: 2020-11-12 12:48:46
🙇 We'd like to thank all contributors for this new release! In particular, wangfeng 🙇
- [d2b84c15] - trainer: move data to cuda with non_blocking (wangfeng)
- [fc09ad8f] - api: fix lr_decay update error (wangfeng)
- [7f971696] - changelog: update change log to v1.2.22 (wangfeng)
Release time: 2020-11-12 11:59:50
🙇 We'd like to thank all contributors for this new release! In particular, wangfeng 🙇
- [68d71be8] - api: output the updated lr_decay related params (wangfeng)
- [e732bbb0] - dataio: enable drop_last in distributed training (wangfeng)
- [2ec884a4] - scheduler: update scheduler each step without considering gradient accumulation (wangfeng)
- [112ab798] - fix: unable to publish to pypi (wangfeng)
- [1d478300] - changelog: update change log to v1.2.21 (wangfeng)
Release time: 2020-11-05 18:51:08
🙇 We'd like to thank all contributors for this new release! In particular, wangfeng 🙇
- [ccc7c058] - fix rumal.yaml (wangfeng)
- [cde13f13] - changelog: update change log to v1.2.20 (wangfeng)
Release time: 2020-10-22 11:20:32
🙇 We'd like to thank all contributors for this new release! In particular, wangfeng 🙇
- [aad96f91] - fix model and dataio import (wangfeng)
- [b7e869f0] - changelog: update change log to v1.2.19 (wangfeng)
Release time: 2020-10-12 14:10:40
🙇 We'd like to thank all contributors for this new release! In particular, wangfeng 🙇
- [6d5ad1e4] - fix setup for py3.5 (wangfeng)
- [963d067a] - changelog: update change log to v1.2.18 (wangfeng)
Release time: 2020-09-30 19:46:28
🙇 We'd like to thank all contributors for this new release! In particular, wangfeng 🙇
- [bba1b44b] - dataio: try to support streaming dataset with easy use of multi-worker (wangfeng)
Release time: 2020-09-30 11:33:27
🙇 We'd like to thank all contributors for this new release! In particular, wangfeng 🙇
- [8e5c3693] - trainer: optimize training loop (wangfeng)
Release time: 2020-09-29 15:43:48
🙇 We'd like to thank all contributors for this new release! In particular, wangfeng 🙇
- [83943bf2] - visualizer: check gradient flow in tensorboard (wangfeng)
- [c7458628] - changelog: update change log to v1.2.15 (wangfeng)
Release time: 2020-09-28 13:19:47
🙇 We'd like to thank all contributors for this new release! In particular, wangfeng 🙇
- [f66fa420] - minor update (wangfeng)
- [ae66b9a0] - changelog: update change log to v1.2.14 (wangfeng)
Release time: 2020-09-28 12:11:53
🙇 We'd like to thank all contributors for this new release! In particular, wangfeng 🙇
- [dee66e36] - trainer: make gradient_accumulation work correctly (wangfeng)
- [fcb34067] - changelog: update change log to v1.2.13 (wangfeng)
Release time: 2020-09-11 14:23:24
🙇 We'd like to thank all contributors for this new release! In particular, wangfeng 🙇
- [17f31c6c] - optimizer: correct number of steps per epoch (wangfeng)
- [0fd328ac] - changelog: update change log to v1.2.12 (wangfeng)
Release time: 2020-09-10 16:13:24
🙇 We'd like to thank all contributors for this new release! In particular, wangfeng 🙇
- [793933e4] - optimizer: add LAMB optimizer (wangfeng)
- [b95eb9e3] - changelog: update change log to v1.2.11 (wangfeng)
Release time: 2020-09-07 17:22:19
🙇 We'd like to thank all contributors for this new release! In particular, wangfeng 🙇
- [04f3d7b7] - trainer: revert commit (wangfeng)
- [960c0757] - changelog: update change log to v1.2.10 (wangfeng)
Release time: 2020-09-07 17:12:41
🙇 We'd like to thank all contributors for this new release! In particular, wangfeng 🙇
- [d0451480] - add package to manage utilization scripts (wangfeng)
- [05e78989] - optimizer: add adadelta and adafactor optimizers (wangfeng)
- [6ecec44c] - trainer: minor code improvement (wangfeng)
- [7ab30c5f] - scheduler: warm up lr from zero instead of min_lr (wangfeng)
Release time: 2020-08-28 20:03:04
🙇 We'd like to thank all contributors for this new release! In particular, wangfeng 🙇
- [70a7263d] - scheduler: fix bugs to enable minimal_lr (wangfeng)
- [1abe15af] - changelog: update change log to v1.2.8 (wangfeng)
Release time: 2020-08-28 18:23:22
🙇 We'd like to thank all contributors for this new release! In particular, wangfeng 🙇
- [8624d471] - scheduler: support min_lr param for only (wangfeng)
- [52cad880] - changelog: update change log to v1.2.7 (wangfeng)
Release time: 2020-08-28 13:13:18
🙇 We'd like to thank all contributors for this new release! In particular, wangfeng 🙇
- [8f126c1a] - checkpoint: enable resume of model params from checkpoint only (wangfeng)
- [455a608e] - evaluator: fix eval mode behavior (wangfeng)
- [bf3128d6] - changelog: update change log to v1.2.6 (wangfeng)
Release time: 2020-08-27 18:02:50
🙇 We'd like to thank all contributors for this new release! In particular, wangfeng 🙇
- [48855041] - optimizer: add make_parameter_groups implementation from allennlp (wangfeng)
- [5d7ab4e7] - readme: add more references (wangfeng)
- [be288ebe] - optimizer: fix missing import (wangfeng)
- [9bce131e] - trainer: fix prefetcher to make it work with DDP (wangfeng)
- [b275f08b] - changelog: update change log to v1.2.5 (wangfeng)
Release time: 2020-08-25 15:45:22
🙇 We'd like to thank all contributors for this new release! In particular, wangfeng 🙇
- [255f4c54] - prefetcher: minor prefetcher fixes (wangfeng)
- [bdec4125] - readme: fix readme yaml code (wangfeng)
- [6e318647] - prefetcher: use async dataloader in standalone and prefetcher in DDP (wangfeng)
- [85ae85f7] - changelog: update change log to v1.2.4 (wangfeng)
Release time: 2020-08-25 14:17:13
🙇 We'd like to thank all contributors for this new release! In particular, wangfeng 🙇
- [efccbf8a] - dataloader: use async dataloader as prefetcher (wangfeng)
- [5137d156] - changelog: update change log to v1.2.3 (wangfeng)
Release time: 2020-08-25 13:35:39
🙇 We'd like to thank all contributors for this new release! In particular, wangfeng 🙇
- [9d73fe07] - enable find_unused_parameters in DDP (wangfeng)
- [e7018ee3] - changelog: update change log to v1.2.2 (wangfeng)
Release time: 2020-08-25 12:42:01
🙇 We'd like to thank all contributors for this new release! In particular, wangfeng 🙇
- [64f0dfc9] - trainer: enable gradient_clip (wangfeng)
- [c109a734] - changelog: update change log to v1.2.1 (wangfeng)
Release time: 2020-08-25 12:31:50
🙇 We'd like to thank all contributors for this new release! In particular, wangfeng 🙇
- [5ddca50b] - rework (wangfeng)
Release time: 2020-08-25 12:28:36
🙇 We'd like to thank all contributors for this new release! In particular, wangfeng 🙇
- [9c78afc6] - add sync_batch_norm (wangfeng)
- [ca6c92aa] - release v1.2.0 (wangfeng)
- [93115913] - changelog: update change log to v1.1.17 (wangfeng)
Release time: 2020-08-25 12:28:17
🙇 We'd like to thank all contributors for this new release! In particular, wangfeng 🙇
- [9c78afc6] - add sync_batch_norm (wangfeng)
- [ca6c92aa] - release v1.2.0 (wangfeng)
- [93115913] - changelog: update change log to v1.1.17 (wangfeng)
Release time: 2020-08-25 11:39:06
🙇 We'd like to thank all contributors for this new release! In particular, wangfeng 🙇
- [959af865] - enable use_prefetcher (wangfeng)
- [6f714656] - changelog: update change log to v1.1.16 (wangfeng)
Release time: 2020-08-20 12:47:14
🙇 We'd like to thank all contributors for this new release! In particular, wangfeng 🙇
- [aa0fef0f] - lr decay strategy in distributed mode (wangfeng)
- [3c92b149] - fix amp under distributed env (wangfeng)
- [cded3538] - changelog: update change log to v1.1.15 (wangfeng)
Release time: 2020-08-06 11:12:24
🙇 We'd like to thank all contributors for this new release! In particular, wangfeng 🙇
- [1d6f1f79] - load model's hparams from yaml (wangfeng)
- [8e875692] - trainer: scale the learning rate by the number of workers (half of workers) (wangfeng)
- [0ff7f1ae] - add isort.cfg (wangfeng)
- [b1816616] - changelog: update change log to v1.1.14 (wangfeng)
Release time: 2020-07-16 14:49:06
🙇 We'd like to thank all contributors for this new release! In particular, wangfeng 🙇
- [b7b426cb] - trainer: fix issue #22049 Cannot update part of the parameters in DistributedDataParallel (wangfeng)
- [a552c464] - format codes to pass pre-commit (wangfeng)
- [146dc6f5] - changelog: update change log to v1.1.13 (wangfeng)
Release time: 2020-07-16 11:39:10
🙇 We'd like to thank all contributors for this new release! In particular, wangfeng 🙇
- [8a259140] - remove unused requirement (wangfeng)
- [2b3df569] - changelog: update change log to v1.1.12 (wangfeng)
Release time: 2020-07-16 11:29:58
🙇 We'd like to thank all contributors for this new release! In particular, wangfeng 🙇
- [f19551d7] - lr_decay: enable decaying learning rate by epoch (wangfeng)
- [294b02ca] - trainer: remove mistaken logging (wangfeng)
- [88a75de5] - format codes (wangfeng)
- [240a15cc] - changelog: update change log to v1.1.11 (wangfeng)
Release time: 2020-07-14 18:23:52
🙇 We'd like to thank all contributors for this new release! In particular, wangfeng 🙇
- [f268af6a] - docker: fix docker dev container (wangfeng)
- [cd4c8a1b] - add missing release template (wangfeng)
- [98794850] - model_api: enable resume from pretrained model in create_model func (wangfeng)
- [e2ac2f85] - changelog: update change log to v1.1.10 (wangfeng)
Release time: 2020-07-13 17:42:17
🙇 We'd like to thank all contributors for this new release! In particular, wangfeng 🙇
- [1f76af9c] - evaluator: add evaluate command, supports standalone mode only for now (wangfeng)
- [c9ca9514] - avg_meter: override keys() func (wangfeng)
- [ea7e1c37] - trainer: revise logx printing (wangfeng)
- [92704b7e] - changelog: update change log to v1.1.9 (wangfeng)
Release time: 2020-07-12 21:23:35
🙇 We'd like to thank all contributors for this new release! In particular, wangfeng 🙇
- [0a63e115] - core: fix reporting error (wangfeng)
- [bd3fccef] - changelog: update change log to v1.1.8 (wangfeng)
Release time: 2020-07-12 20:59:22
🙇 We'd like to thank all contributors for this new release! In particular, wangfeng 🙇
- [82407570] - setup: add missing package pytest (wangfeng)
- [a9c3bf22] - report all valid metrics after each epoch (wangfeng)
- [5a8cb688] - changelog: update change log to v1.1.7 (wangfeng)
Release time: 2020-07-12 18:00:43
🙇 We'd like to thank all contributors for this new release! In particular, wangfeng 🙇
- [334999a3] - replace runx with forked package to fix log_dir issue (wangfeng)
- [d322052d] - merge chore-bumping locally (temporary option) (wangfeng)
- [af315d1d] - ... (wangfeng)
- [4090317b] - fix release.sh (wangfeng)
- [2d0ac832] - changelog: update change log to v1.1.6 (wangfeng)
Release time: 2020-07-11 16:54:18
🙇 We'd like to thank all contributors for this new release! In particular, wangfeng 🙇
- [ace08ea1] - enable release note (wangfeng)
- [32b7113c] - load_checkpoint returns state_dict and metas (wangfeng)
- [1ca73b8b] - change AverageMeter's average func (wangfeng)
- [82805fa6] - update readme (wangfeng)