Weights for PVT and Swin #4

Closed
thangnn123456 opened this issue Oct 9, 2022 · 3 comments

Comments

@thangnn123456

Could you provide the weights, or the correct link to download them, for the two Transformer models (e.g. PVT-v2 and Swin-T)?

I have tried all the weights from their official repositories, but the results are extremely low (AP is only 15), and there are errors while loading the weights:

fvcore.common.checkpoint WARNING: Some model parameters or buffers are not found in the checkpoint:
backbone.patch_embed1.{proj, norm}.{bias, weight} … backbone.patch_embed4.{proj, norm}.{bias, weight}
backbone.block1.0 … backbone.block4.2: attn.{q, kv, proj, sr, norm}.{bias, weight}, mlp.{fc1, fc2, dwconv.dwconv}.{bias, weight}, norm1.{bias, weight}, norm2.{bias, weight} for every block
backbone.norm1.{bias, weight} … backbone.norm4.{bias, weight}
cate_head.{cate_pred, kernel_pred, qs_pred}.{bias, weight}
cate_head.trans_decoder: decoder layers 0–2 (self_attn.{attention_weights, output_proj, sampling_offsets, value_proj}, ffn, norm1), level_embed, reference_points
cate_head.trans_encoder: encoder layers 0–5 (same sub-modules), level_embed, reference_points
dcin.{affine_bias, affine_scale}.{bias, weight}
mask_head.{conv_pred, convs_all_levels.0–2, convs_all_sums.0–3, edge_all_levels.0–2}
res_modules.0–3
fvcore.common.checkpoint WARNING: The checkpoint state_dict contains keys that are not used by the model:
patch_embed1 … patch_embed4, block1.0 … block4.2, norm1 … norm4, head.{bias, weight} — i.e. the same backbone parameters as above, but without the 'backbone.' prefix

Thank you so much.

@Patrickctyyx
Collaborator

Thanks for your attention. We have checked the weights: the results are exactly the same as reported, and the warnings did not occur.
Please check the downloaded weights:

  1. Size
  • osformer-pvt.pth: 154,885,121 bytes
  • osformer-swin.pth: 180,823,607 bytes
  2. Keys

import torch

sd = torch.load('osformer-pvt.pth', map_location='cpu')
print(sd.keys())  # inspect the checkpoint keys
print('backbone.block1.0.attn.kv.weight' in sd)  # confirm the parameter exists in the checkpoint
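A quick way to see whether such warnings come from a key-name mismatch rather than truly missing weights (a minimal sketch — the key sets below are illustrative placeholders, not the real checkpoint contents):

```python
# Illustrative placeholders: in practice these would come from
# set(torch.load(...).keys()) and set(model.state_dict().keys()).
ckpt_keys = {'patch_embed1.proj.weight', 'block1.0.attn.kv.weight'}
model_keys = {'backbone.patch_embed1.proj.weight', 'backbone.block1.0.attn.kv.weight'}

# A checkpoint key that the model only knows under a 'backbone.' prefix
# indicates a naming mismatch, not a genuinely missing parameter.
mismatched = {k for k in ckpt_keys if 'backbone.' + k in model_keys}
print(sorted(mismatched))
```

If every "unused" checkpoint key shows up here, renaming the keys fixes the load.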

@Patrickctyyx
Collaborator

Patrickctyyx commented Oct 11, 2022

Sorry for our negligence. We changed the pth keys to fit detectron2; the code is as below:

import torch
from collections import OrderedDict

sd = torch.load('pvt_v2_b2_li.pth', map_location='cpu')
new_sd = OrderedDict()

for k, v in sd.items():
    new_sd['backbone.' + k] = v  # prepend 'backbone.' to each original key

torch.save(new_sd, 'pvt_v2_b2_li.pth.backbone')

The pth for Swin is https://github.com/SwinTransformer/storage/releases/download/v1.0.0/swin_tiny_patch4_window7_224.pth and the pth for PVT is https://github.com/whai362/PVT/releases/download/v2/pvt_v2_b2_li.pth.

You can modify the code to revise the pth yourself, or we'll upload the modified pth later.
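The renaming script above can be exercised on a toy state dict (a sketch only; the keys and values are placeholders for real tensors):

```python
from collections import OrderedDict

# Placeholder standing in for torch.load('pvt_v2_b2_li.pth', map_location='cpu').
sd = OrderedDict([('patch_embed1.proj.weight', 0), ('block1.0.attn.kv.weight', 1)])

# Prepend 'backbone.' to every key, preserving insertion order.
new_sd = OrderedDict(('backbone.' + k, v) for k, v in sd.items())
print(list(new_sd))
```

After this transformation the keys match what the detectron2 model expects, so the fvcore warnings above should disappear.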
