Traceback (most recent call last):
File "basicsr/test.py", line 44, in <module>
test_pipeline(root_path)
File "basicsr/test.py", line 34, in test_pipeline
model = build_model(opt)
File "/root/dataDisk/DAT-main/basicsr/models/__init__.py", line 27, in build_model
model = MODEL_REGISTRY.get(opt['model_type'])(opt)
File "/root/dataDisk/DAT-main/basicsr/models/sr_model.py", line 30, in __init__
self.load_network(self.net_g, load_path, self.opt['path'].get('strict_load_g', True), param_key)
File "/root/dataDisk/DAT-main/basicsr/models/base_model.py", line 303, in load_network
net.load_state_dict(load_net, strict=strict)
File "/root/.local/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1223, in load_state_dict
raise RuntimeError('Error(s) in loading state_dict for {}:\n\t{}'.format(
RuntimeError: Error(s) in loading state_dict for DAT:
size mismatch for conv_first.weight: copying a param with shape torch.Size([60, 3, 3, 3]) from checkpoint, the shape in current model is torch.Size([180, 3, 3, 3]).
size mismatch for conv_first.bias: copying a param with shape torch.Size([60]) from checkpoint, the shape in current model is torch.Size([180]).
size mismatch for before_RG.1.weight: copying a param with shape torch.Size([60]) from checkpoint, the shape in current model is torch.Size([180]).
size mismatch for before_RG.1.bias: copying a param with shape torch.Size([60]) from checkpoint, the shape in current model is torch.Size([180]).
size mismatch for layers.0.blocks.0.norm1.weight: copying a param with shape torch.Size([60]) from checkpoint, the shape in current model is torch.Size([180]).
size mismatch for layers.0.blocks.0.norm1.bias: copying a param with shape torch.Size([60]) from checkpoint, the shape in current model is torch.Size([180]).
size mismatch for layers.0.blocks.0.attn.qkv.weight: copying a param with shape torch.Size([180, 60]) from checkpoint, the shape in current model is torch.Size([540, 180]).
size mismatch for layers.0.blocks.0.attn.qkv.bias: copying a param with shape torch.Size([180]) from checkpoint, the shape in current model is torch.Size([540]).
size mismatch for layers.0.blocks.0.attn.proj.weight: copying a param with shape torch.Size([60, 60]) from checkpoint, the shape in current model is torch.Size([180, 180]).
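The pattern in the mismatches above (every 60 in the checkpoint vs. 180 in the model) suggests the checkpoint was trained with a smaller embedding dimension (60, the DAT-light configuration) than the network built from the test YAML (180, the full DAT). A quick way to confirm is to diff the parameter shapes directly. The sketch below is a minimal, hedged illustration: the toy dicts mimic the error above, and the commented `torch.load` lines show how the real shape dicts could be obtained (assuming a BasicSR-style checkpoint that stores weights under a `'params'` key, which may differ for your file):

```python
def find_shape_mismatches(ckpt_shapes, model_shapes):
    """Return {name: (ckpt_shape, model_shape)} for params whose shapes differ."""
    return {
        name: (ckpt_shapes[name], model_shapes[name])
        for name in ckpt_shapes.keys() & model_shapes.keys()
        if ckpt_shapes[name] != model_shapes[name]
    }

# With PyTorch, the shape dicts would come from something like:
#   ckpt = torch.load('DAT_light_x2.pth', map_location='cpu')
#   ckpt_shapes = {k: tuple(v.shape) for k, v in ckpt['params'].items()}
#   model_shapes = {k: tuple(v.shape) for k, v in model.state_dict().items()}

# Toy illustration mirroring the traceback above:
ckpt_shapes = {'conv_first.weight': (60, 3, 3, 3), 'conv_first.bias': (60,)}
model_shapes = {'conv_first.weight': (180, 3, 3, 3), 'conv_first.bias': (180,)}
print(find_shape_mismatches(ckpt_shapes, model_shapes))
```

If the mismatches all follow the 60-vs-180 pattern, the fix is to test with the YAML whose architecture options match the checkpoint (e.g. the DAT-light config for DAT_light_x2.pth), rather than to disable strict loading.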
Using the DAT_light_x2.pth and DAT_2_x2.pth models gives the same error:
Traceback (most recent call last):
File "basicsr/test.py", line 44, in <module>
test_pipeline(root_path)
File "basicsr/test.py", line 34, in test_pipeline
model = build_model(opt)
File "/root/dataDisk/DAT-main/basicsr/models/__init__.py", line 27, in build_model
model = MODEL_REGISTRY.get(opt['model_type'])(opt)
File "/root/dataDisk/DAT-main/basicsr/models/sr_model.py", line 30, in __init__
self.load_network(self.net_g, load_path, self.opt['path'].get('strict_load_g', True), param_key)
File "/root/dataDisk/DAT-main/basicsr/models/base_model.py", line 303, in load_network
net.load_state_dict(load_net, strict=strict)
File "/root/.local/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1223, in load_state_dict
raise RuntimeError('Error(s) in loading state_dict for {}:\n\t{}'.format(
RuntimeError: Error(s) in loading state_dict for DAT: