Commit 81089b1

Remove unnecessary LongTensor in EfficientFormer. Possibly fixes #1878

1 parent 4224529 commit 81089b1

1 file changed (+1, -1)


timm/models/efficientformer.py (+1, -1)
@@ -67,7 +67,7 @@ def __init__(
         rel_pos = (pos[..., :, None] - pos[..., None, :]).abs()
         rel_pos = (rel_pos[0] * resolution[1]) + rel_pos[1]
         self.attention_biases = torch.nn.Parameter(torch.zeros(num_heads, resolution[0] * resolution[1]))
-        self.register_buffer('attention_bias_idxs', torch.LongTensor(rel_pos))
+        self.register_buffer('attention_bias_idxs', rel_pos)
         self.attention_bias_cache = {}  # per-device attention_biases cache (data-parallel compat)

     @torch.no_grad()
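The LongTensor wrapper is redundant because `rel_pos` is derived from `torch.arange` outputs, which default to `torch.int64` (the "long" dtype) and keep that dtype through `abs`, multiplication, and addition. A minimal sketch of the index construction, assuming a hypothetical `resolution` of (7, 7) for illustration:

```python
import torch

# Integer grid coordinates for every position in the attention window.
resolution = (7, 7)  # assumed value for illustration
pos = torch.stack(torch.meshgrid(
    torch.arange(resolution[0]),
    torch.arange(resolution[1]),
    indexing='ij')).flatten(1)  # shape (2, N), dtype int64

# Pairwise absolute coordinate differences, flattened to a single index.
rel_pos = (pos[..., :, None] - pos[..., None, :]).abs()
rel_pos = (rel_pos[0] * resolution[1]) + rel_pos[1]

# rel_pos is already int64, so torch.LongTensor(rel_pos) would only
# make a needless copy; registering rel_pos directly is equivalent.
print(rel_pos.dtype)  # torch.int64
```

This index buffer is what `attention_bias_idxs` selects into the learned `attention_biases` parameter, so it only needs to be an integer tensor, which it already is.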

0 commit comments