README.md: 7 additions & 11 deletions

@@ -23,6 +23,9 @@

## What's New

+### April 13, 2021
+* Add Swin Transformer models and weights from https://github.com/microsoft/Swin-Transformer
+
### April 12, 2021
* Add ECA-NFNet-L1 (slimmed down F1 w/ SiLU, 41M params) trained with this code. 84% top-1 @ 320x320. Trained at 256x256.
* Add EfficientNet-V2S model (unverified model definition) weights. 83.3 top-1 @ 288x288. Only trained single res 224. Working on progressive training.
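The Swin Transformer bullet above adds new model classes and weights but shows no usage; for orientation, a minimal sketch of pulling one through `timm.create_model`. The model name `swin_base_patch4_window7_224` is an assumption here, not confirmed by this diff; the names actually registered in the release can be checked with `timm.list_models('swin*')`.

```python
import timm
import torch

# Assumed name for one of the ported Swin Transformer weights; verify with timm.list_models('swin*').
model = timm.create_model('swin_base_patch4_window7_224', pretrained=True)
model.eval()

# Swin variants run at a fixed resolution tied to their window/patch config (224x224 assumed here).
with torch.no_grad():
    logits = model(torch.randn(1, 3, 224, 224))
print(logits.shape)  # expected: torch.Size([1, 1000]) for ImageNet-1k weights
```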
@@ -164,17 +167,6 @@
* EdgeTPU-M (`efficientnet_em`) model trained in PyTorch, 79.3 top-1
* Pip release, doc updates pending a few more changes...
-* Support for native Torch AMP and channels_last memory format added to train/validate scripts (`--channels-last`, `--native-amp` vs `--apex-amp`)
-* Models tested with channels_last on latest NGC 20.08 container. AdaptiveAvgPool in attn layers changed to mean((2,3)) to work around bug with NHWC kernel.

## Introduction

@@ -189,6 +181,7 @@
A full version of the list below with source links can be found in the [documentation](https://rwightman.github.io/pytorch-image-models/models/).

* Big Transfer ResNetV2 (BiT) - https://arxiv.org/abs/1912.11370
Another changed file (file name not captured in this view):

+* Support for native Torch AMP and channels_last memory format added to train/validate scripts (`--channels-last`, `--native-amp` vs `--apex-amp`)
+* Models tested with channels_last on latest NGC 20.08 container. AdaptiveAvgPool in attn layers changed to mean((2,3)) to work around bug with NHWC kernel.

### Aug 12, 2020
* New/updated weights from training experiments
* EfficientNet-B3 - 82.1 top-1 (vs 81.6 for official with AA and 81.9 for AdvProp)
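The two added bullets above reference the `--channels-last` and `--native-amp` script flags and the AdaptiveAvgPool-to-mean workaround; the PyTorch mechanics behind them are standard. A small sketch of what those flags enable, using a toy model rather than the repo's train.py and assuming a CUDA device is available; the `mean((2, 3))` equivalence at the end is what the NHWC workaround relies on.

```python
import torch
import torch.nn as nn

# Toy stand-in for a timm model; not the repo's code.
model = nn.Sequential(
    nn.Conv2d(3, 64, 3, padding=1), nn.SiLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(64, 10),
).cuda()

# channels_last (NHWC) memory format, what --channels-last turns on for model and inputs.
model = model.to(memory_format=torch.channels_last)
x = torch.randn(8, 3, 224, 224, device='cuda').to(memory_format=torch.channels_last)

# Native Torch AMP autocast, what --native-amp selects (vs --apex-amp for NVIDIA Apex).
with torch.cuda.amp.autocast():
    out = model(x)

# The workaround mentioned above: a mean over H, W matches AdaptiveAvgPool2d(1),
# while avoiding the NHWC pooling kernel bug seen in that container release.
feat = torch.randn(8, 64, 7, 7)
assert torch.allclose(nn.AdaptiveAvgPool2d(1)(feat).flatten(1), feat.mean((2, 3)))
```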
docs/changes.md: 28 additions & 0 deletions

@@ -1,5 +1,33 @@
# Recent Changes

+### April 13, 2021
+* Add Swin Transformer models and weights from https://github.com/microsoft/Swin-Transformer
+
+### April 12, 2021
+* Add ECA-NFNet-L1 (slimmed down F1 w/ SiLU, 41M params) trained with this code. 84% top-1 @ 320x320. Trained at 256x256.
+* Add EfficientNet-V2S model (unverified model definition) weights. 83.3 top-1 @ 288x288. Only trained single res 224. Working on progressive training.
+* Add ByoaNet model definition (Bring-your-own-attention) w/ SelfAttention block and corresponding SA/SA-like modules and model defs
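Several of the April entries quote accuracy at an evaluation resolution that differs from the training resolution; when validating such weights, the per-model input size, crop percentage, and normalization can be read from the model's default config rather than hard-coded. A sketch using timm's data-config helpers, assuming `eca_nfnet_l1` is the registered name for the new ECA-NFNet-L1 weights (confirm with `timm.list_models('*nfnet*')`):

```python
import timm
from timm.data import resolve_data_config, create_transform

# Assumed registered name for the ECA-NFNet-L1 weights described above.
model = timm.create_model('eca_nfnet_l1', pretrained=True)

# Evaluation settings (input size, crop pct, interpolation, mean/std) come from the model's default config.
config = resolve_data_config({}, model=model)
transform = create_transform(**config)
print(config['input_size'], config['crop_pct'])
```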