
test single reports an error #56

Open
thorory opened this issue Aug 9, 2024 · 1 comment
Comments


thorory commented Aug 9, 2024

Using the DAT_light_x2.pth and DAT_2_x2.pth models:

Traceback (most recent call last):
File "basicsr/test.py", line 44, in <module>
test_pipeline(root_path)
File "basicsr/test.py", line 34, in test_pipeline
model = build_model(opt)
File "/root/dataDisk/DAT-main/basicsr/models/__init__.py", line 27, in build_model
model = MODEL_REGISTRY.get(opt['model_type'])(opt)
File "/root/dataDisk/DAT-main/basicsr/models/sr_model.py", line 30, in __init__
self.load_network(self.net_g, load_path, self.opt['path'].get('strict_load_g', True), param_key)
File "/root/dataDisk/DAT-main/basicsr/models/base_model.py", line 303, in load_network
net.load_state_dict(load_net, strict=strict)
File "/root/.local/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1223, in load_state_dict
raise RuntimeError('Error(s) in loading state_dict for {}:\n\t{}'.format(
RuntimeError: Error(s) in loading state_dict for DAT:

size mismatch for conv_first.weight: copying a param with shape torch.Size([60, 3, 3, 3]) from checkpoint, the shape in current model is torch.Size([180, 3, 3, 3]).
size mismatch for conv_first.bias: copying a param with shape torch.Size([60]) from checkpoint, the shape in current model is torch.Size([180]).
size mismatch for before_RG.1.weight: copying a param with shape torch.Size([60]) from checkpoint, the shape in current model is torch.Size([180]).
size mismatch for before_RG.1.bias: copying a param with shape torch.Size([60]) from checkpoint, the shape in current model is torch.Size([180]).
size mismatch for layers.0.blocks.0.norm1.weight: copying a param with shape torch.Size([60]) from checkpoint, the shape in current model is torch.Size([180]).
size mismatch for layers.0.blocks.0.norm1.bias: copying a param with shape torch.Size([60]) from checkpoint, the shape in current model is torch.Size([180]).
size mismatch for layers.0.blocks.0.attn.qkv.weight: copying a param with shape torch.Size([180, 60]) from checkpoint, the shape in current model is torch.Size([540, 180]).
size mismatch for layers.0.blocks.0.attn.qkv.bias: copying a param with shape torch.Size([180]) from checkpoint, the shape in current model is torch.Size([540]).
size mismatch for layers.0.blocks.0.attn.proj.weight: copying a param with shape torch.Size([60, 60]) from checkpoint, the shape in current model is torch.Size([180, 180]).

HuaqingHe commented Aug 16, 2024

I have the same issue while using DAT_S_x4.pth.
Changing the `network_g` settings (under `# network structures`) in test single.yml to match the checkpoint's architecture solves it.
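For the DAT_light_x2.pth case above, the `network_g` edit would look roughly like the sketch below. The `embed_dim: 60` value is taken from the size-mismatch messages; the remaining values are assumptions based on the DAT release configs and should be checked against the options file shipped with the checkpoint:

```yml
# network structures
network_g:
  type: DAT
  upscale: 2
  in_chans: 3
  img_size: 64
  embed_dim: 60   # checkpoint expects 60 (see the size-mismatch lines above), not 180
  # depth, num_heads, split_size, expansion_factor, resi_connection must also
  # match the released DAT-light options file for this checkpoint
```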
