[03/21 06:50:15] detectron2 INFO: Rank of current process: 0. World size: 1
[03/21 06:50:15] detectron2 INFO: Environment info:
----------------------  ----------------------------------------------------------------
sys.platform            linux
Python                  3.9.13 (main, May 23 2022, 22:01:06) [GCC 9.4.0]
numpy                   1.22.4
detectron2              0.6 @/notebooks/detrex/detectron2/detectron2
Compiler                GCC 9.4
CUDA compiler           CUDA 11.2
detectron2 arch flags   8.6
DETECTRON2_ENV_MODULE   <not set>
PyTorch                 1.12.0+cu116 @/usr/local/lib/python3.9/dist-packages/torch
PyTorch debug build     False
GPU available           Yes
GPU 0                   NVIDIA RTX A6000 (arch=8.6)
Driver version          510.73.05
CUDA_HOME               /usr/local/cuda
Pillow                  9.2.0
torchvision             0.13.0+cu116 @/usr/local/lib/python3.9/dist-packages/torchvision
torchvision arch flags  3.5, 5.0, 6.0, 7.0, 7.5, 8.0, 8.6
fvcore                  0.1.5.post20221221
iopath                  0.1.9
cv2                     4.6.0
----------------------  ----------------------------------------------------------------
PyTorch built with:
  - GCC 9.3
  - C++ Version: 201402
  - Intel(R) Math Kernel Library Version 2020.0.0 Product Build 20191122 for Intel(R) 64 architecture applications
  - Intel(R) MKL-DNN v2.6.0 (Git Hash 52b5f107dd9cf10910aaa19cb47f3abf9b349815)
  - OpenMP 201511 (a.k.a. OpenMP 4.5)
  - LAPACK is enabled (usually provided by MKL)
  - NNPACK is enabled
  - CPU capability usage: AVX2
  - CUDA Runtime 11.6
  - NVCC architecture flags: -gencode;arch=compute_37,code=sm_37;-gencode;arch=compute_50,code=sm_50;-gencode;arch=compute_60,code=sm_60;-gencode;arch=compute_70,code=sm_70;-gencode;arch=compute_75,code=sm_75;-gencode;arch=compute_80,code=sm_80;-gencode;arch=compute_86,code=sm_86
  - CuDNN 8.3.2 (built against CUDA 11.5)
  - Magma 2.6.1
  - Build settings: BLAS_INFO=mkl, BUILD_TYPE=Release, CUDA_VERSION=11.6, CUDNN_VERSION=8.3.2, CXX_COMPILER=/opt/rh/devtoolset-9/root/usr/bin/c++, CXX_FLAGS= -Wno-deprecated -fvisibility-inlines-hidden -DUSE_PTHREADPOOL -fopenmp -DNDEBUG -DUSE_KINETO -DUSE_FBGEMM -DUSE_QNNPACK -DUSE_PYTORCH_QNNPACK -DUSE_XNNPACK -DSYMBOLICATE_MOBILE_DEBUG_HANDLE -DEDGE_PROFILER_USE_KINETO -O2 -fPIC -Wno-narrowing -Wall -Wextra -Werror=return-type -Wno-missing-field-initializers -Wno-type-limits -Wno-array-bounds -Wno-unknown-pragmas -Wno-unused-parameter -Wno-unused-function -Wno-unused-result -Wno-unused-local-typedefs -Wno-strict-overflow -Wno-strict-aliasing -Wno-error=deprecated-declarations -Wno-stringop-overflow -Wno-psabi -Wno-error=pedantic -Wno-error=redundant-decls -Wno-error=old-style-cast -fdiagnostics-color=always -faligned-new -Wno-unused-but-set-variable -Wno-maybe-uninitialized -fno-math-errno -fno-trapping-math -Werror=format -Werror=cast-function-type -Wno-stringop-overflow, LAPACK_INFO=mkl, PERF_WITH_AVX=1, PERF_WITH_AVX2=1, PERF_WITH_AVX512=1, TORCH_VERSION=1.12.0, USE_CUDA=ON, USE_CUDNN=ON, USE_EXCEPTION_PTR=1, USE_GFLAGS=OFF, USE_GLOG=OFF, USE_MKL=ON, USE_MKLDNN=OFF, USE_MPI=OFF, USE_NCCL=ON, USE_NNPACK=ON, USE_OPENMP=ON, USE_ROCM=OFF,
[03/21 06:50:15] detectron2 INFO: Command line arguments: Namespace(config_file='projects/dino/configs/dino_r50_corpus.py', resume=False, eval_only=False, num_gpus=1, num_machines=1, machine_rank=0, dist_url='tcp://127.0.0.1:49152', opts=[])
[03/21 06:50:15] detectron2 INFO: Contents of args.config_file=projects/dino/configs/dino_r50_corpus.py:
from detrex.config import get_config
from .models.dino_r50 import model

# get default config
dataloader = get_config("common/data/custom.py").dataloader
# from detectron2.data.datasets import register_coco_instances
# register_coco_instances("corpus", {}, "json_annotation.json", "path/to/image/dir")
optimizer = get_config("common/optim.py").AdamW
lr_multiplier = get_config("common/coco_schedule.py").lr_multiplier_12ep
train = get_config("common/train.py").train

# modify training config
train.init_checkpoint = "detectron2://ImageNetPretrained/torchvision/R-50.pkl"
train.output_dir = "./output/dino_r50_4scale_12ep"

# max training iterations
train.max_iter = 90000
# run evaluation every 2000 iters
train.eval_period = 2000
# log training information every 50 iters
train.log_period = 50
# save a checkpoint every 2000 iters
train.checkpointer.period = 2000

# gradient clipping for training
train.clip_grad.enabled = True
train.clip_grad.params.max_norm = 0.1
train.clip_grad.params.norm_type = 2

# set training devices
train.device = "cuda"
model.device = train.device

# note that this is the total batch size across all GPUs;
# e.g. with 4 GPUs, the per-GPU batch size would be total_batch_size / 4
dataloader.train.total_batch_size = 14

# modify optimizer config
# linear scaling rule: lr = 1e-4 * 14 / 16 = 8.75e-5, the rate that appears
# in the event lines below
optimizer.lr = 1e-4 * dataloader.train.total_batch_size / 16
optimizer.betas = (0.9, 0.999)
optimizer.weight_decay = 1e-4
# train the backbone at 0.1x the base learning rate
optimizer.params.lr_factor_func = lambda module_name: 0.1 if "backbone" in module_name else 1

# modify dataloader config
dataloader.train.num_workers = 8

# dump the testing results into output_dir for visualization
dataloader.evaluator.output_dir = train.output_dir
[03/21 06:50:15] d2.config.lazy WARNING: The config contains objects that cannot serialize to a valid yaml. ./output/dino_r50_4scale_12ep/config.yaml is human-readable but cannot be loaded.
[03/21 06:50:16] d2.config.lazy WARNING: Config is saved using cloudpickle at ./output/dino_r50_4scale_12ep/config.yaml.pkl.
[03/21 06:50:16] detectron2 INFO: Full config saved to ./output/dino_r50_4scale_12ep/config.yaml
[03/21 06:50:16] d2.utils.env INFO: Using a generated random seed 16060255
[03/21 06:50:17] detectron2 INFO: Model: DINO( (backbone): ResNet( (stem): BasicStem( (conv1): Conv2d( 3, 64, kernel_size=(7, 7), stride=(2, 2), padding=(3, 3), bias=False (norm): FrozenBatchNorm2d(num_features=64, eps=1e-05) ) ) (res2): Sequential( (0): BottleneckBlock( (shortcut): Conv2d( 64, 256, kernel_size=(1, 1), stride=(1, 1), bias=False (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05) ) (conv1): Conv2d( 64, 64, kernel_size=(1, 1), stride=(1, 1), bias=False (norm): FrozenBatchNorm2d(num_features=64, eps=1e-05) ) (conv2): Conv2d( 64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False (norm): FrozenBatchNorm2d(num_features=64, eps=1e-05) ) (conv3): Conv2d( 64, 256, kernel_size=(1, 1), stride=(1, 1), bias=False (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05) ) ) (1): BottleneckBlock( (conv1): Conv2d( 256, 64, kernel_size=(1, 1), stride=(1, 1), bias=False (norm): FrozenBatchNorm2d(num_features=64, eps=1e-05) ) (conv2): Conv2d( 64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False (norm): FrozenBatchNorm2d(num_features=64, eps=1e-05) ) (conv3): Conv2d( 64, 256, kernel_size=(1, 1), stride=(1, 1), bias=False (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05) ) ) (2): BottleneckBlock( (conv1): Conv2d( 256, 64, kernel_size=(1, 1), stride=(1, 1), bias=False (norm): FrozenBatchNorm2d(num_features=64, eps=1e-05) ) (conv2): Conv2d( 64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False (norm): FrozenBatchNorm2d(num_features=64, eps=1e-05) ) (conv3): Conv2d( 64, 256, kernel_size=(1, 1), stride=(1, 1), bias=False (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05) ) ) ) (res3): Sequential( (0): BottleneckBlock( (shortcut): Conv2d( 256, 512, kernel_size=(1, 1), stride=(2, 2), bias=False (norm): FrozenBatchNorm2d(num_features=512, eps=1e-05) ) (conv1): Conv2d( 256, 128, kernel_size=(1, 1), stride=(1, 1), bias=False (norm): FrozenBatchNorm2d(num_features=128, eps=1e-05) ) (conv2): Conv2d( 128, 128, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1), bias=False (norm): FrozenBatchNorm2d(num_features=128, eps=1e-05) ) (conv3): Conv2d( 128, 512, kernel_size=(1, 1), stride=(1, 1), bias=False (norm): FrozenBatchNorm2d(num_features=512, eps=1e-05) ) ) (1): BottleneckBlock( (conv1): Conv2d( 512, 128, kernel_size=(1, 1), stride=(1, 1), bias=False (norm): FrozenBatchNorm2d(num_features=128, eps=1e-05) ) (conv2): Conv2d( 128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False (norm): FrozenBatchNorm2d(num_features=128,
eps=1e-05) ) (conv3): Conv2d( 128, 512, kernel_size=(1, 1), stride=(1, 1), bias=False (norm): FrozenBatchNorm2d(num_features=512, eps=1e-05) ) ) (2): BottleneckBlock( (conv1): Conv2d( 512, 128, kernel_size=(1, 1), stride=(1, 1), bias=False (norm): FrozenBatchNorm2d(num_features=128, eps=1e-05) ) (conv2): Conv2d( 128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False (norm): FrozenBatchNorm2d(num_features=128, eps=1e-05) ) (conv3): Conv2d( 128, 512, kernel_size=(1, 1), stride=(1, 1), bias=False (norm): FrozenBatchNorm2d(num_features=512, eps=1e-05) ) ) (3): BottleneckBlock( (conv1): Conv2d( 512, 128, kernel_size=(1, 1), stride=(1, 1), bias=False (norm): FrozenBatchNorm2d(num_features=128, eps=1e-05) ) (conv2): Conv2d( 128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False (norm): FrozenBatchNorm2d(num_features=128, eps=1e-05) ) (conv3): Conv2d( 128, 512, kernel_size=(1, 1), stride=(1, 1), bias=False (norm): FrozenBatchNorm2d(num_features=512, eps=1e-05) ) ) ) (res4): Sequential( (0): BottleneckBlock( (shortcut): Conv2d( 512, 1024, kernel_size=(1, 1), stride=(2, 2), bias=False (norm): FrozenBatchNorm2d(num_features=1024, eps=1e-05) ) (conv1): Conv2d( 512, 256, kernel_size=(1, 1), stride=(1, 1), bias=False (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05) ) (conv2): Conv2d( 256, 256, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1), bias=False (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05) ) (conv3): Conv2d( 256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False (norm): FrozenBatchNorm2d(num_features=1024, eps=1e-05) ) ) (1): BottleneckBlock( (conv1): Conv2d( 1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05) ) (conv2): Conv2d( 256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05) ) (conv3): Conv2d( 256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False (norm): FrozenBatchNorm2d(num_features=1024, eps=1e-05) ) ) (2): BottleneckBlock( (conv1): Conv2d( 1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05) ) (conv2): Conv2d( 256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05) ) (conv3): Conv2d( 256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False (norm): FrozenBatchNorm2d(num_features=1024, eps=1e-05) ) ) (3): BottleneckBlock( (conv1): Conv2d( 1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05) ) (conv2): Conv2d( 256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05) ) (conv3): Conv2d( 256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False (norm): FrozenBatchNorm2d(num_features=1024, eps=1e-05) ) ) (4): BottleneckBlock( (conv1): Conv2d( 1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05) ) (conv2): Conv2d( 256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05) ) (conv3): Conv2d( 256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False (norm): FrozenBatchNorm2d(num_features=1024, eps=1e-05) ) ) (5): BottleneckBlock( (conv1): Conv2d( 1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05) ) (conv2): Conv2d( 256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 
1), bias=False (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05) ) (conv3): Conv2d( 256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False (norm): FrozenBatchNorm2d(num_features=1024, eps=1e-05) ) ) ) (res5): Sequential( (0): BottleneckBlock( (shortcut): Conv2d( 1024, 2048, kernel_size=(1, 1), stride=(2, 2), bias=False (norm): FrozenBatchNorm2d(num_features=2048, eps=1e-05) ) (conv1): Conv2d( 1024, 512, kernel_size=(1, 1), stride=(1, 1), bias=False (norm): FrozenBatchNorm2d(num_features=512, eps=1e-05) ) (conv2): Conv2d( 512, 512, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1), bias=False (norm): FrozenBatchNorm2d(num_features=512, eps=1e-05) ) (conv3): Conv2d( 512, 2048, kernel_size=(1, 1), stride=(1, 1), bias=False (norm): FrozenBatchNorm2d(num_features=2048, eps=1e-05) ) ) (1): BottleneckBlock( (conv1): Conv2d( 2048, 512, kernel_size=(1, 1), stride=(1, 1), bias=False (norm): FrozenBatchNorm2d(num_features=512, eps=1e-05) ) (conv2): Conv2d( 512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False (norm): FrozenBatchNorm2d(num_features=512, eps=1e-05) ) (conv3): Conv2d( 512, 2048, kernel_size=(1, 1), stride=(1, 1), bias=False (norm): FrozenBatchNorm2d(num_features=2048, eps=1e-05) ) ) (2): BottleneckBlock( (conv1): Conv2d( 2048, 512, kernel_size=(1, 1), stride=(1, 1), bias=False (norm): FrozenBatchNorm2d(num_features=512, eps=1e-05) ) (conv2): Conv2d( 512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False (norm): FrozenBatchNorm2d(num_features=512, eps=1e-05) ) (conv3): Conv2d( 512, 2048, kernel_size=(1, 1), stride=(1, 1), bias=False (norm): FrozenBatchNorm2d(num_features=2048, eps=1e-05) ) ) ) ) (position_embedding): PositionEmbeddingSine() (neck): ChannelMapper( (convs): ModuleList( (0): ConvNormAct( (conv): Conv2d(512, 256, kernel_size=(1, 1), stride=(1, 1)) (norm): GroupNorm(32, 256, eps=1e-05, affine=True) ) (1): ConvNormAct( (conv): Conv2d(1024, 256, kernel_size=(1, 1), stride=(1, 1)) (norm): GroupNorm(32, 256, eps=1e-05, affine=True) ) (2): ConvNormAct( (conv): Conv2d(2048, 256, kernel_size=(1, 1), stride=(1, 1)) (norm): GroupNorm(32, 256, eps=1e-05, affine=True) ) ) (extra_convs): ModuleList( (0): ConvNormAct( (conv): Conv2d(2048, 256, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1)) (norm): GroupNorm(32, 256, eps=1e-05, affine=True) ) ) ) (transformer): DINOTransformer( (encoder): DINOTransformerEncoder( (layers): ModuleList( (0): BaseTransformerLayer( (attentions): ModuleList( (0): MultiScaleDeformableAttention( (dropout): Dropout(p=0.0, inplace=False) (sampling_offsets): Linear(in_features=256, out_features=256, bias=True) (attention_weights): Linear(in_features=256, out_features=128, bias=True) (value_proj): Linear(in_features=256, out_features=256, bias=True) (output_proj): Linear(in_features=256, out_features=256, bias=True) ) ) (ffns): ModuleList( (0): FFN( (activation): ReLU(inplace=True) (layers): Sequential( (0): Sequential( (0): Linear(in_features=256, out_features=2048, bias=True) (1): ReLU(inplace=True) (2): Dropout(p=0.0, inplace=False) ) (1): Linear(in_features=2048, out_features=256, bias=True) (2): Dropout(p=0.0, inplace=False) ) ) ) (norms): ModuleList( (0): LayerNorm((256,), eps=1e-05, elementwise_affine=True) (1): LayerNorm((256,), eps=1e-05, elementwise_affine=True) ) ) (1): BaseTransformerLayer( (attentions): ModuleList( (0): MultiScaleDeformableAttention( (dropout): Dropout(p=0.0, inplace=False) (sampling_offsets): Linear(in_features=256, out_features=256, bias=True) (attention_weights): Linear(in_features=256, 
out_features=128, bias=True) (value_proj): Linear(in_features=256, out_features=256, bias=True) (output_proj): Linear(in_features=256, out_features=256, bias=True) ) ) (ffns): ModuleList( (0): FFN( (activation): ReLU(inplace=True) (layers): Sequential( (0): Sequential( (0): Linear(in_features=256, out_features=2048, bias=True) (1): ReLU(inplace=True) (2): Dropout(p=0.0, inplace=False) ) (1): Linear(in_features=2048, out_features=256, bias=True) (2): Dropout(p=0.0, inplace=False) ) ) ) (norms): ModuleList( (0): LayerNorm((256,), eps=1e-05, elementwise_affine=True) (1): LayerNorm((256,), eps=1e-05, elementwise_affine=True) ) ) (2): BaseTransformerLayer( (attentions): ModuleList( (0): MultiScaleDeformableAttention( (dropout): Dropout(p=0.0, inplace=False) (sampling_offsets): Linear(in_features=256, out_features=256, bias=True) (attention_weights): Linear(in_features=256, out_features=128, bias=True) (value_proj): Linear(in_features=256, out_features=256, bias=True) (output_proj): Linear(in_features=256, out_features=256, bias=True) ) ) (ffns): ModuleList( (0): FFN( (activation): ReLU(inplace=True) (layers): Sequential( (0): Sequential( (0): Linear(in_features=256, out_features=2048, bias=True) (1): ReLU(inplace=True) (2): Dropout(p=0.0, inplace=False) ) (1): Linear(in_features=2048, out_features=256, bias=True) (2): Dropout(p=0.0, inplace=False) ) ) ) (norms): ModuleList( (0): LayerNorm((256,), eps=1e-05, elementwise_affine=True) (1): LayerNorm((256,), eps=1e-05, elementwise_affine=True) ) ) (3): BaseTransformerLayer( (attentions): ModuleList( (0): MultiScaleDeformableAttention( (dropout): Dropout(p=0.0, inplace=False) (sampling_offsets): Linear(in_features=256, out_features=256, bias=True) (attention_weights): Linear(in_features=256, out_features=128, bias=True) (value_proj): Linear(in_features=256, out_features=256, bias=True) (output_proj): Linear(in_features=256, out_features=256, bias=True) ) ) (ffns): ModuleList( (0): FFN( (activation): ReLU(inplace=True) (layers): Sequential( (0): Sequential( (0): Linear(in_features=256, out_features=2048, bias=True) (1): ReLU(inplace=True) (2): Dropout(p=0.0, inplace=False) ) (1): Linear(in_features=2048, out_features=256, bias=True) (2): Dropout(p=0.0, inplace=False) ) ) ) (norms): ModuleList( (0): LayerNorm((256,), eps=1e-05, elementwise_affine=True) (1): LayerNorm((256,), eps=1e-05, elementwise_affine=True) ) ) (4): BaseTransformerLayer( (attentions): ModuleList( (0): MultiScaleDeformableAttention( (dropout): Dropout(p=0.0, inplace=False) (sampling_offsets): Linear(in_features=256, out_features=256, bias=True) (attention_weights): Linear(in_features=256, out_features=128, bias=True) (value_proj): Linear(in_features=256, out_features=256, bias=True) (output_proj): Linear(in_features=256, out_features=256, bias=True) ) ) (ffns): ModuleList( (0): FFN( (activation): ReLU(inplace=True) (layers): Sequential( (0): Sequential( (0): Linear(in_features=256, out_features=2048, bias=True) (1): ReLU(inplace=True) (2): Dropout(p=0.0, inplace=False) ) (1): Linear(in_features=2048, out_features=256, bias=True) (2): Dropout(p=0.0, inplace=False) ) ) ) (norms): ModuleList( (0): LayerNorm((256,), eps=1e-05, elementwise_affine=True) (1): LayerNorm((256,), eps=1e-05, elementwise_affine=True) ) ) (5): BaseTransformerLayer( (attentions): ModuleList( (0): MultiScaleDeformableAttention( (dropout): Dropout(p=0.0, inplace=False) (sampling_offsets): Linear(in_features=256, out_features=256, bias=True) (attention_weights): Linear(in_features=256, out_features=128, bias=True) 
(value_proj): Linear(in_features=256, out_features=256, bias=True) (output_proj): Linear(in_features=256, out_features=256, bias=True) ) ) (ffns): ModuleList( (0): FFN( (activation): ReLU(inplace=True) (layers): Sequential( (0): Sequential( (0): Linear(in_features=256, out_features=2048, bias=True) (1): ReLU(inplace=True) (2): Dropout(p=0.0, inplace=False) ) (1): Linear(in_features=2048, out_features=256, bias=True) (2): Dropout(p=0.0, inplace=False) ) ) ) (norms): ModuleList( (0): LayerNorm((256,), eps=1e-05, elementwise_affine=True) (1): LayerNorm((256,), eps=1e-05, elementwise_affine=True) ) ) ) ) (decoder): DINOTransformerDecoder( (layers): ModuleList( (0): BaseTransformerLayer( (attentions): ModuleList( (0): MultiheadAttention( (attn): MultiheadAttention( (out_proj): NonDynamicallyQuantizableLinear(in_features=256, out_features=256, bias=True) ) (proj_drop): Dropout(p=0.0, inplace=False) ) (1): MultiScaleDeformableAttention( (dropout): Dropout(p=0.0, inplace=False) (sampling_offsets): Linear(in_features=256, out_features=256, bias=True) (attention_weights): Linear(in_features=256, out_features=128, bias=True) (value_proj): Linear(in_features=256, out_features=256, bias=True) (output_proj): Linear(in_features=256, out_features=256, bias=True) ) ) (ffns): ModuleList( (0): FFN( (activation): ReLU(inplace=True) (layers): Sequential( (0): Sequential( (0): Linear(in_features=256, out_features=2048, bias=True) (1): ReLU(inplace=True) (2): Dropout(p=0.0, inplace=False) ) (1): Linear(in_features=2048, out_features=256, bias=True) (2): Dropout(p=0.0, inplace=False) ) ) ) (norms): ModuleList( (0): LayerNorm((256,), eps=1e-05, elementwise_affine=True) (1): LayerNorm((256,), eps=1e-05, elementwise_affine=True) (2): LayerNorm((256,), eps=1e-05, elementwise_affine=True) ) ) (1): BaseTransformerLayer( (attentions): ModuleList( (0): MultiheadAttention( (attn): MultiheadAttention( (out_proj): NonDynamicallyQuantizableLinear(in_features=256, out_features=256, bias=True) ) (proj_drop): Dropout(p=0.0, inplace=False) ) (1): MultiScaleDeformableAttention( (dropout): Dropout(p=0.0, inplace=False) (sampling_offsets): Linear(in_features=256, out_features=256, bias=True) (attention_weights): Linear(in_features=256, out_features=128, bias=True) (value_proj): Linear(in_features=256, out_features=256, bias=True) (output_proj): Linear(in_features=256, out_features=256, bias=True) ) ) (ffns): ModuleList( (0): FFN( (activation): ReLU(inplace=True) (layers): Sequential( (0): Sequential( (0): Linear(in_features=256, out_features=2048, bias=True) (1): ReLU(inplace=True) (2): Dropout(p=0.0, inplace=False) ) (1): Linear(in_features=2048, out_features=256, bias=True) (2): Dropout(p=0.0, inplace=False) ) ) ) (norms): ModuleList( (0): LayerNorm((256,), eps=1e-05, elementwise_affine=True) (1): LayerNorm((256,), eps=1e-05, elementwise_affine=True) (2): LayerNorm((256,), eps=1e-05, elementwise_affine=True) ) ) (2): BaseTransformerLayer( (attentions): ModuleList( (0): MultiheadAttention( (attn): MultiheadAttention( (out_proj): NonDynamicallyQuantizableLinear(in_features=256, out_features=256, bias=True) ) (proj_drop): Dropout(p=0.0, inplace=False) ) (1): MultiScaleDeformableAttention( (dropout): Dropout(p=0.0, inplace=False) (sampling_offsets): Linear(in_features=256, out_features=256, bias=True) (attention_weights): Linear(in_features=256, out_features=128, bias=True) (value_proj): Linear(in_features=256, out_features=256, bias=True) (output_proj): Linear(in_features=256, out_features=256, bias=True) ) ) (ffns): ModuleList( 
(0): FFN( (activation): ReLU(inplace=True) (layers): Sequential( (0): Sequential( (0): Linear(in_features=256, out_features=2048, bias=True) (1): ReLU(inplace=True) (2): Dropout(p=0.0, inplace=False) ) (1): Linear(in_features=2048, out_features=256, bias=True) (2): Dropout(p=0.0, inplace=False) ) ) ) (norms): ModuleList( (0): LayerNorm((256,), eps=1e-05, elementwise_affine=True) (1): LayerNorm((256,), eps=1e-05, elementwise_affine=True) (2): LayerNorm((256,), eps=1e-05, elementwise_affine=True) ) ) (3): BaseTransformerLayer( (attentions): ModuleList( (0): MultiheadAttention( (attn): MultiheadAttention( (out_proj): NonDynamicallyQuantizableLinear(in_features=256, out_features=256, bias=True) ) (proj_drop): Dropout(p=0.0, inplace=False) ) (1): MultiScaleDeformableAttention( (dropout): Dropout(p=0.0, inplace=False) (sampling_offsets): Linear(in_features=256, out_features=256, bias=True) (attention_weights): Linear(in_features=256, out_features=128, bias=True) (value_proj): Linear(in_features=256, out_features=256, bias=True) (output_proj): Linear(in_features=256, out_features=256, bias=True) ) ) (ffns): ModuleList( (0): FFN( (activation): ReLU(inplace=True) (layers): Sequential( (0): Sequential( (0): Linear(in_features=256, out_features=2048, bias=True) (1): ReLU(inplace=True) (2): Dropout(p=0.0, inplace=False) ) (1): Linear(in_features=2048, out_features=256, bias=True) (2): Dropout(p=0.0, inplace=False) ) ) ) (norms): ModuleList( (0): LayerNorm((256,), eps=1e-05, elementwise_affine=True) (1): LayerNorm((256,), eps=1e-05, elementwise_affine=True) (2): LayerNorm((256,), eps=1e-05, elementwise_affine=True) ) ) (4): BaseTransformerLayer( (attentions): ModuleList( (0): MultiheadAttention( (attn): MultiheadAttention( (out_proj): NonDynamicallyQuantizableLinear(in_features=256, out_features=256, bias=True) ) (proj_drop): Dropout(p=0.0, inplace=False) ) (1): MultiScaleDeformableAttention( (dropout): Dropout(p=0.0, inplace=False) (sampling_offsets): Linear(in_features=256, out_features=256, bias=True) (attention_weights): Linear(in_features=256, out_features=128, bias=True) (value_proj): Linear(in_features=256, out_features=256, bias=True) (output_proj): Linear(in_features=256, out_features=256, bias=True) ) ) (ffns): ModuleList( (0): FFN( (activation): ReLU(inplace=True) (layers): Sequential( (0): Sequential( (0): Linear(in_features=256, out_features=2048, bias=True) (1): ReLU(inplace=True) (2): Dropout(p=0.0, inplace=False) ) (1): Linear(in_features=2048, out_features=256, bias=True) (2): Dropout(p=0.0, inplace=False) ) ) ) (norms): ModuleList( (0): LayerNorm((256,), eps=1e-05, elementwise_affine=True) (1): LayerNorm((256,), eps=1e-05, elementwise_affine=True) (2): LayerNorm((256,), eps=1e-05, elementwise_affine=True) ) ) (5): BaseTransformerLayer( (attentions): ModuleList( (0): MultiheadAttention( (attn): MultiheadAttention( (out_proj): NonDynamicallyQuantizableLinear(in_features=256, out_features=256, bias=True) ) (proj_drop): Dropout(p=0.0, inplace=False) ) (1): MultiScaleDeformableAttention( (dropout): Dropout(p=0.0, inplace=False) (sampling_offsets): Linear(in_features=256, out_features=256, bias=True) (attention_weights): Linear(in_features=256, out_features=128, bias=True) (value_proj): Linear(in_features=256, out_features=256, bias=True) (output_proj): Linear(in_features=256, out_features=256, bias=True) ) ) (ffns): ModuleList( (0): FFN( (activation): ReLU(inplace=True) (layers): Sequential( (0): Sequential( (0): Linear(in_features=256, out_features=2048, bias=True) (1): ReLU(inplace=True) 
(2): Dropout(p=0.0, inplace=False) ) (1): Linear(in_features=2048, out_features=256, bias=True) (2): Dropout(p=0.0, inplace=False) ) ) ) (norms): ModuleList( (0): LayerNorm((256,), eps=1e-05, elementwise_affine=True) (1): LayerNorm((256,), eps=1e-05, elementwise_affine=True) (2): LayerNorm((256,), eps=1e-05, elementwise_affine=True) ) ) ) (ref_point_head): MLP( (layers): ModuleList( (0): Linear(in_features=512, out_features=256, bias=True) (1): Linear(in_features=256, out_features=256, bias=True) ) ) (norm): LayerNorm((256,), eps=1e-05, elementwise_affine=True) (class_embed): ModuleList( (0): Linear(in_features=256, out_features=1, bias=True) (1): Linear(in_features=256, out_features=1, bias=True) (2): Linear(in_features=256, out_features=1, bias=True) (3): Linear(in_features=256, out_features=1, bias=True) (4): Linear(in_features=256, out_features=1, bias=True) (5): Linear(in_features=256, out_features=1, bias=True) (6): Linear(in_features=256, out_features=1, bias=True) ) (bbox_embed): ModuleList( (0): MLP( (layers): ModuleList( (0): Linear(in_features=256, out_features=256, bias=True) (1): Linear(in_features=256, out_features=256, bias=True) (2): Linear(in_features=256, out_features=4, bias=True) ) ) (1): MLP( (layers): ModuleList( (0): Linear(in_features=256, out_features=256, bias=True) (1): Linear(in_features=256, out_features=256, bias=True) (2): Linear(in_features=256, out_features=4, bias=True) ) ) (2): MLP( (layers): ModuleList( (0): Linear(in_features=256, out_features=256, bias=True) (1): Linear(in_features=256, out_features=256, bias=True) (2): Linear(in_features=256, out_features=4, bias=True) ) ) (3): MLP( (layers): ModuleList( (0): Linear(in_features=256, out_features=256, bias=True) (1): Linear(in_features=256, out_features=256, bias=True) (2): Linear(in_features=256, out_features=4, bias=True) ) ) (4): MLP( (layers): ModuleList( (0): Linear(in_features=256, out_features=256, bias=True) (1): Linear(in_features=256, out_features=256, bias=True) (2): Linear(in_features=256, out_features=4, bias=True) ) ) (5): MLP( (layers): ModuleList( (0): Linear(in_features=256, out_features=256, bias=True) (1): Linear(in_features=256, out_features=256, bias=True) (2): Linear(in_features=256, out_features=4, bias=True) ) ) (6): MLP( (layers): ModuleList( (0): Linear(in_features=256, out_features=256, bias=True) (1): Linear(in_features=256, out_features=256, bias=True) (2): Linear(in_features=256, out_features=4, bias=True) ) ) ) ) (tgt_embed): Embedding(900, 256) (enc_output): Linear(in_features=256, out_features=256, bias=True) (enc_output_norm): LayerNorm((256,), eps=1e-05, elementwise_affine=True) ) (class_embed): ModuleList( (0): Linear(in_features=256, out_features=1, bias=True) (1): Linear(in_features=256, out_features=1, bias=True) (2): Linear(in_features=256, out_features=1, bias=True) (3): Linear(in_features=256, out_features=1, bias=True) (4): Linear(in_features=256, out_features=1, bias=True) (5): Linear(in_features=256, out_features=1, bias=True) (6): Linear(in_features=256, out_features=1, bias=True) ) (bbox_embed): ModuleList( (0): MLP( (layers): ModuleList( (0): Linear(in_features=256, out_features=256, bias=True) (1): Linear(in_features=256, out_features=256, bias=True) (2): Linear(in_features=256, out_features=4, bias=True) ) ) (1): MLP( (layers): ModuleList( (0): Linear(in_features=256, out_features=256, bias=True) (1): Linear(in_features=256, out_features=256, bias=True) (2): Linear(in_features=256, out_features=4, bias=True) ) ) (2): MLP( (layers): ModuleList( (0): 
Linear(in_features=256, out_features=256, bias=True) (1): Linear(in_features=256, out_features=256, bias=True) (2): Linear(in_features=256, out_features=4, bias=True) ) ) (3): MLP( (layers): ModuleList( (0): Linear(in_features=256, out_features=256, bias=True) (1): Linear(in_features=256, out_features=256, bias=True) (2): Linear(in_features=256, out_features=4, bias=True) ) ) (4): MLP( (layers): ModuleList( (0): Linear(in_features=256, out_features=256, bias=True) (1): Linear(in_features=256, out_features=256, bias=True) (2): Linear(in_features=256, out_features=4, bias=True) ) ) (5): MLP( (layers): ModuleList( (0): Linear(in_features=256, out_features=256, bias=True) (1): Linear(in_features=256, out_features=256, bias=True) (2): Linear(in_features=256, out_features=4, bias=True) ) ) (6): MLP( (layers): ModuleList( (0): Linear(in_features=256, out_features=256, bias=True) (1): Linear(in_features=256, out_features=256, bias=True) (2): Linear(in_features=256, out_features=4, bias=True) ) ) ) (criterion): Criterion DINOCriterion matcher: Matcher HungarianMatcher cost_class: 2.0 cost_bbox: 5.0 cost_giou: 2.0 cost_class_type: focal_loss_cost focal cost alpha: 0.25 focal cost gamma: 2.0 losses: ['class', 'boxes'] loss_class_type: focal_loss weight_dict: {'loss_class': 1, 'loss_bbox': 5.0, 'loss_giou': 2.0, 'loss_class_dn': 1, 'loss_bbox_dn': 5.0, 'loss_giou_dn': 2.0, 'loss_class_enc': 1, 'loss_bbox_enc': 5.0, 'loss_giou_enc': 2.0, 'loss_class_dn_enc': 1, 'loss_bbox_dn_enc': 5.0, 'loss_giou_dn_enc': 2.0, 'loss_class_0': 1, 'loss_bbox_0': 5.0, 'loss_giou_0': 2.0, 'loss_class_dn_0': 1, 'loss_bbox_dn_0': 5.0, 'loss_giou_dn_0': 2.0, 'loss_class_1': 1, 'loss_bbox_1': 5.0, 'loss_giou_1': 2.0, 'loss_class_dn_1': 1, 'loss_bbox_dn_1': 5.0, 'loss_giou_dn_1': 2.0, 'loss_class_2': 1, 'loss_bbox_2': 5.0, 'loss_giou_2': 2.0, 'loss_class_dn_2': 1, 'loss_bbox_dn_2': 5.0, 'loss_giou_dn_2': 2.0, 'loss_class_3': 1, 'loss_bbox_3': 5.0, 'loss_giou_3': 2.0, 'loss_class_dn_3': 1, 'loss_bbox_dn_3': 5.0, 'loss_giou_dn_3': 2.0, 'loss_class_4': 1, 'loss_bbox_4': 5.0, 'loss_giou_4': 2.0, 'loss_class_dn_4': 1, 'loss_bbox_dn_4': 5.0, 'loss_giou_dn_4': 2.0} num_classes: 1 eos_coef: None focal loss alpha: 0.25 focal loss gamma: 2.0 (label_enc): Embedding(1, 256) )
[03/21 06:50:18] d2.data.datasets.coco WARNING: Category ids in annotations are not in [1, #categories]! We'll apply a mapping for you.
[03/21 06:50:18] d2.data.datasets.coco INFO: Loaded 739 images in COCO format from datasets/corpus/annotations/train.json
[03/21 06:50:18] d2.data.build INFO: Removed 0 images with no usable annotations. 739 images left.
[03/21 06:50:18] d2.data.build INFO: Distribution of instances among all 1 categories:
|  category  | #instances   |
|:----------:|:-------------|
|   object   | 32866        |
|            |              |
[03/21 06:50:18] d2.data.common INFO: Serializing 739 elements to byte tensors and concatenating them all ...
[03/21 06:50:18] d2.data.common INFO: Serialized dataset takes 11.55 MiB
[03/21 06:50:20] fvcore.common.checkpoint INFO: [Checkpointer] Loading from detectron2://ImageNetPretrained/torchvision/R-50.pkl ...
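The "corpus" dataset loaded above is a COCO-format dataset with a single category; the registration call that makes the name visible to the dataloader is commented out in the config, so it presumably happens elsewhere in the project. A minimal sketch of that registration, using the annotation path the loader reports (the image root is an assumed directory, not taken from this log):

    from detectron2.data.datasets import register_coco_instances

    # Register the single-category "corpus" dataset. The json path is the one
    # reported by the loader above; "datasets/corpus/images" is a guessed root.
    register_coco_instances(
        "corpus",
        {},
        "datasets/corpus/annotations/train.json",
        "datasets/corpus/images",
    )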
[03/21 06:50:23] fvcore.common.checkpoint INFO: Reading a file from 'torchvision' [03/21 06:50:23] d2.checkpoint.c2_model_loading INFO: Following weights matched with submodule backbone: | Names in Model | Names in Checkpoint | Shapes | |:------------------|:----------------------------------------------------------------------------------|:------------------------------------------------| | res2.0.conv1.* | res2.0.conv1.{norm.bias,norm.running_mean,norm.running_var,norm.weight,weight} | (64,) (64,) (64,) (64,) (64,64,1,1) | | res2.0.conv2.* | res2.0.conv2.{norm.bias,norm.running_mean,norm.running_var,norm.weight,weight} | (64,) (64,) (64,) (64,) (64,64,3,3) | | res2.0.conv3.* | res2.0.conv3.{norm.bias,norm.running_mean,norm.running_var,norm.weight,weight} | (256,) (256,) (256,) (256,) (256,64,1,1) | | res2.0.shortcut.* | res2.0.shortcut.{norm.bias,norm.running_mean,norm.running_var,norm.weight,weight} | (256,) (256,) (256,) (256,) (256,64,1,1) | | res2.1.conv1.* | res2.1.conv1.{norm.bias,norm.running_mean,norm.running_var,norm.weight,weight} | (64,) (64,) (64,) (64,) (64,256,1,1) | | res2.1.conv2.* | res2.1.conv2.{norm.bias,norm.running_mean,norm.running_var,norm.weight,weight} | (64,) (64,) (64,) (64,) (64,64,3,3) | | res2.1.conv3.* | res2.1.conv3.{norm.bias,norm.running_mean,norm.running_var,norm.weight,weight} | (256,) (256,) (256,) (256,) (256,64,1,1) | | res2.2.conv1.* | res2.2.conv1.{norm.bias,norm.running_mean,norm.running_var,norm.weight,weight} | (64,) (64,) (64,) (64,) (64,256,1,1) | | res2.2.conv2.* | res2.2.conv2.{norm.bias,norm.running_mean,norm.running_var,norm.weight,weight} | (64,) (64,) (64,) (64,) (64,64,3,3) | | res2.2.conv3.* | res2.2.conv3.{norm.bias,norm.running_mean,norm.running_var,norm.weight,weight} | (256,) (256,) (256,) (256,) (256,64,1,1) | | res3.0.conv1.* | res3.0.conv1.{norm.bias,norm.running_mean,norm.running_var,norm.weight,weight} | (128,) (128,) (128,) (128,) (128,256,1,1) | | res3.0.conv2.* | res3.0.conv2.{norm.bias,norm.running_mean,norm.running_var,norm.weight,weight} | (128,) (128,) (128,) (128,) (128,128,3,3) | | res3.0.conv3.* | res3.0.conv3.{norm.bias,norm.running_mean,norm.running_var,norm.weight,weight} | (512,) (512,) (512,) (512,) (512,128,1,1) | | res3.0.shortcut.* | res3.0.shortcut.{norm.bias,norm.running_mean,norm.running_var,norm.weight,weight} | (512,) (512,) (512,) (512,) (512,256,1,1) | | res3.1.conv1.* | res3.1.conv1.{norm.bias,norm.running_mean,norm.running_var,norm.weight,weight} | (128,) (128,) (128,) (128,) (128,512,1,1) | | res3.1.conv2.* | res3.1.conv2.{norm.bias,norm.running_mean,norm.running_var,norm.weight,weight} | (128,) (128,) (128,) (128,) (128,128,3,3) | | res3.1.conv3.* | res3.1.conv3.{norm.bias,norm.running_mean,norm.running_var,norm.weight,weight} | (512,) (512,) (512,) (512,) (512,128,1,1) | | res3.2.conv1.* | res3.2.conv1.{norm.bias,norm.running_mean,norm.running_var,norm.weight,weight} | (128,) (128,) (128,) (128,) (128,512,1,1) | | res3.2.conv2.* | res3.2.conv2.{norm.bias,norm.running_mean,norm.running_var,norm.weight,weight} | (128,) (128,) (128,) (128,) (128,128,3,3) | | res3.2.conv3.* | res3.2.conv3.{norm.bias,norm.running_mean,norm.running_var,norm.weight,weight} | (512,) (512,) (512,) (512,) (512,128,1,1) | | res3.3.conv1.* | res3.3.conv1.{norm.bias,norm.running_mean,norm.running_var,norm.weight,weight} | (128,) (128,) (128,) (128,) (128,512,1,1) | | res3.3.conv2.* | res3.3.conv2.{norm.bias,norm.running_mean,norm.running_var,norm.weight,weight} | (128,) (128,) (128,) (128,) (128,128,3,3) | | res3.3.conv3.* | 
res3.3.conv3.{norm.bias,norm.running_mean,norm.running_var,norm.weight,weight} | (512,) (512,) (512,) (512,) (512,128,1,1) | | res4.0.conv1.* | res4.0.conv1.{norm.bias,norm.running_mean,norm.running_var,norm.weight,weight} | (256,) (256,) (256,) (256,) (256,512,1,1) | | res4.0.conv2.* | res4.0.conv2.{norm.bias,norm.running_mean,norm.running_var,norm.weight,weight} | (256,) (256,) (256,) (256,) (256,256,3,3) | | res4.0.conv3.* | res4.0.conv3.{norm.bias,norm.running_mean,norm.running_var,norm.weight,weight} | (1024,) (1024,) (1024,) (1024,) (1024,256,1,1) | | res4.0.shortcut.* | res4.0.shortcut.{norm.bias,norm.running_mean,norm.running_var,norm.weight,weight} | (1024,) (1024,) (1024,) (1024,) (1024,512,1,1) | | res4.1.conv1.* | res4.1.conv1.{norm.bias,norm.running_mean,norm.running_var,norm.weight,weight} | (256,) (256,) (256,) (256,) (256,1024,1,1) | | res4.1.conv2.* | res4.1.conv2.{norm.bias,norm.running_mean,norm.running_var,norm.weight,weight} | (256,) (256,) (256,) (256,) (256,256,3,3) | | res4.1.conv3.* | res4.1.conv3.{norm.bias,norm.running_mean,norm.running_var,norm.weight,weight} | (1024,) (1024,) (1024,) (1024,) (1024,256,1,1) | | res4.2.conv1.* | res4.2.conv1.{norm.bias,norm.running_mean,norm.running_var,norm.weight,weight} | (256,) (256,) (256,) (256,) (256,1024,1,1) | | res4.2.conv2.* | res4.2.conv2.{norm.bias,norm.running_mean,norm.running_var,norm.weight,weight} | (256,) (256,) (256,) (256,) (256,256,3,3) | | res4.2.conv3.* | res4.2.conv3.{norm.bias,norm.running_mean,norm.running_var,norm.weight,weight} | (1024,) (1024,) (1024,) (1024,) (1024,256,1,1) | | res4.3.conv1.* | res4.3.conv1.{norm.bias,norm.running_mean,norm.running_var,norm.weight,weight} | (256,) (256,) (256,) (256,) (256,1024,1,1) | | res4.3.conv2.* | res4.3.conv2.{norm.bias,norm.running_mean,norm.running_var,norm.weight,weight} | (256,) (256,) (256,) (256,) (256,256,3,3) | | res4.3.conv3.* | res4.3.conv3.{norm.bias,norm.running_mean,norm.running_var,norm.weight,weight} | (1024,) (1024,) (1024,) (1024,) (1024,256,1,1) | | res4.4.conv1.* | res4.4.conv1.{norm.bias,norm.running_mean,norm.running_var,norm.weight,weight} | (256,) (256,) (256,) (256,) (256,1024,1,1) | | res4.4.conv2.* | res4.4.conv2.{norm.bias,norm.running_mean,norm.running_var,norm.weight,weight} | (256,) (256,) (256,) (256,) (256,256,3,3) | | res4.4.conv3.* | res4.4.conv3.{norm.bias,norm.running_mean,norm.running_var,norm.weight,weight} | (1024,) (1024,) (1024,) (1024,) (1024,256,1,1) | | res4.5.conv1.* | res4.5.conv1.{norm.bias,norm.running_mean,norm.running_var,norm.weight,weight} | (256,) (256,) (256,) (256,) (256,1024,1,1) | | res4.5.conv2.* | res4.5.conv2.{norm.bias,norm.running_mean,norm.running_var,norm.weight,weight} | (256,) (256,) (256,) (256,) (256,256,3,3) | | res4.5.conv3.* | res4.5.conv3.{norm.bias,norm.running_mean,norm.running_var,norm.weight,weight} | (1024,) (1024,) (1024,) (1024,) (1024,256,1,1) | | res5.0.conv1.* | res5.0.conv1.{norm.bias,norm.running_mean,norm.running_var,norm.weight,weight} | (512,) (512,) (512,) (512,) (512,1024,1,1) | | res5.0.conv2.* | res5.0.conv2.{norm.bias,norm.running_mean,norm.running_var,norm.weight,weight} | (512,) (512,) (512,) (512,) (512,512,3,3) | | res5.0.conv3.* | res5.0.conv3.{norm.bias,norm.running_mean,norm.running_var,norm.weight,weight} | (2048,) (2048,) (2048,) (2048,) (2048,512,1,1) | | res5.0.shortcut.* | res5.0.shortcut.{norm.bias,norm.running_mean,norm.running_var,norm.weight,weight} | (2048,) (2048,) (2048,) (2048,) (2048,1024,1,1) | | res5.1.conv1.* | 
res5.1.conv1.{norm.bias,norm.running_mean,norm.running_var,norm.weight,weight} | (512,) (512,) (512,) (512,) (512,2048,1,1) | | res5.1.conv2.* | res5.1.conv2.{norm.bias,norm.running_mean,norm.running_var,norm.weight,weight} | (512,) (512,) (512,) (512,) (512,512,3,3) | | res5.1.conv3.* | res5.1.conv3.{norm.bias,norm.running_mean,norm.running_var,norm.weight,weight} | (2048,) (2048,) (2048,) (2048,) (2048,512,1,1) | | res5.2.conv1.* | res5.2.conv1.{norm.bias,norm.running_mean,norm.running_var,norm.weight,weight} | (512,) (512,) (512,) (512,) (512,2048,1,1) | | res5.2.conv2.* | res5.2.conv2.{norm.bias,norm.running_mean,norm.running_var,norm.weight,weight} | (512,) (512,) (512,) (512,) (512,512,3,3) | | res5.2.conv3.* | res5.2.conv3.{norm.bias,norm.running_mean,norm.running_var,norm.weight,weight} | (2048,) (2048,) (2048,) (2048,) (2048,512,1,1) | | stem.conv1.* | stem.conv1.{norm.bias,norm.running_mean,norm.running_var,norm.weight,weight} | (64,) (64,) (64,) (64,) (64,3,7,7) | [03/21 06:50:24] fvcore.common.checkpoint WARNING: Some model parameters or buffers are not found in the checkpoint: bbox_embed.0.layers.0.{bias, weight} bbox_embed.0.layers.1.{bias, weight} bbox_embed.0.layers.2.{bias, weight} bbox_embed.1.layers.0.{bias, weight} bbox_embed.1.layers.1.{bias, weight} bbox_embed.1.layers.2.{bias, weight} bbox_embed.2.layers.0.{bias, weight} bbox_embed.2.layers.1.{bias, weight} bbox_embed.2.layers.2.{bias, weight} bbox_embed.3.layers.0.{bias, weight} bbox_embed.3.layers.1.{bias, weight} bbox_embed.3.layers.2.{bias, weight} bbox_embed.4.layers.0.{bias, weight} bbox_embed.4.layers.1.{bias, weight} bbox_embed.4.layers.2.{bias, weight} bbox_embed.5.layers.0.{bias, weight} bbox_embed.5.layers.1.{bias, weight} bbox_embed.5.layers.2.{bias, weight} bbox_embed.6.layers.0.{bias, weight} bbox_embed.6.layers.1.{bias, weight} bbox_embed.6.layers.2.{bias, weight} class_embed.0.{bias, weight} class_embed.1.{bias, weight} class_embed.2.{bias, weight} class_embed.3.{bias, weight} class_embed.4.{bias, weight} class_embed.5.{bias, weight} class_embed.6.{bias, weight} label_enc.weight neck.convs.0.conv.{bias, weight} neck.convs.0.norm.{bias, weight} neck.convs.1.conv.{bias, weight} neck.convs.1.norm.{bias, weight} neck.convs.2.conv.{bias, weight} neck.convs.2.norm.{bias, weight} neck.extra_convs.0.conv.{bias, weight} neck.extra_convs.0.norm.{bias, weight} transformer.decoder.bbox_embed.0.layers.0.{bias, weight} transformer.decoder.bbox_embed.0.layers.1.{bias, weight} transformer.decoder.bbox_embed.0.layers.2.{bias, weight} transformer.decoder.bbox_embed.1.layers.0.{bias, weight} transformer.decoder.bbox_embed.1.layers.1.{bias, weight} transformer.decoder.bbox_embed.1.layers.2.{bias, weight} transformer.decoder.bbox_embed.2.layers.0.{bias, weight} transformer.decoder.bbox_embed.2.layers.1.{bias, weight} transformer.decoder.bbox_embed.2.layers.2.{bias, weight} transformer.decoder.bbox_embed.3.layers.0.{bias, weight} transformer.decoder.bbox_embed.3.layers.1.{bias, weight} transformer.decoder.bbox_embed.3.layers.2.{bias, weight} transformer.decoder.bbox_embed.4.layers.0.{bias, weight} transformer.decoder.bbox_embed.4.layers.1.{bias, weight} transformer.decoder.bbox_embed.4.layers.2.{bias, weight} transformer.decoder.bbox_embed.5.layers.0.{bias, weight} transformer.decoder.bbox_embed.5.layers.1.{bias, weight} transformer.decoder.bbox_embed.5.layers.2.{bias, weight} transformer.decoder.bbox_embed.6.layers.0.{bias, weight} transformer.decoder.bbox_embed.6.layers.1.{bias, weight} 
transformer.decoder.bbox_embed.6.layers.2.{bias, weight} transformer.decoder.class_embed.0.{bias, weight} transformer.decoder.class_embed.1.{bias, weight} transformer.decoder.class_embed.2.{bias, weight} transformer.decoder.class_embed.3.{bias, weight} transformer.decoder.class_embed.4.{bias, weight} transformer.decoder.class_embed.5.{bias, weight} transformer.decoder.class_embed.6.{bias, weight} transformer.decoder.layers.0.attentions.0.attn.out_proj.{bias, weight} transformer.decoder.layers.0.attentions.0.attn.{in_proj_bias, in_proj_weight} transformer.decoder.layers.0.attentions.1.attention_weights.{bias, weight} transformer.decoder.layers.0.attentions.1.output_proj.{bias, weight} transformer.decoder.layers.0.attentions.1.sampling_offsets.{bias, weight} transformer.decoder.layers.0.attentions.1.value_proj.{bias, weight} transformer.decoder.layers.0.ffns.0.layers.0.0.{bias, weight} transformer.decoder.layers.0.ffns.0.layers.1.{bias, weight} transformer.decoder.layers.0.norms.0.{bias, weight} transformer.decoder.layers.0.norms.1.{bias, weight} transformer.decoder.layers.0.norms.2.{bias, weight} transformer.decoder.layers.1.attentions.0.attn.out_proj.{bias, weight} transformer.decoder.layers.1.attentions.0.attn.{in_proj_bias, in_proj_weight} transformer.decoder.layers.1.attentions.1.attention_weights.{bias, weight} transformer.decoder.layers.1.attentions.1.output_proj.{bias, weight} transformer.decoder.layers.1.attentions.1.sampling_offsets.{bias, weight} transformer.decoder.layers.1.attentions.1.value_proj.{bias, weight} transformer.decoder.layers.1.ffns.0.layers.0.0.{bias, weight} transformer.decoder.layers.1.ffns.0.layers.1.{bias, weight} transformer.decoder.layers.1.norms.0.{bias, weight} transformer.decoder.layers.1.norms.1.{bias, weight} transformer.decoder.layers.1.norms.2.{bias, weight} transformer.decoder.layers.2.attentions.0.attn.out_proj.{bias, weight} transformer.decoder.layers.2.attentions.0.attn.{in_proj_bias, in_proj_weight} transformer.decoder.layers.2.attentions.1.attention_weights.{bias, weight} transformer.decoder.layers.2.attentions.1.output_proj.{bias, weight} transformer.decoder.layers.2.attentions.1.sampling_offsets.{bias, weight} transformer.decoder.layers.2.attentions.1.value_proj.{bias, weight} transformer.decoder.layers.2.ffns.0.layers.0.0.{bias, weight} transformer.decoder.layers.2.ffns.0.layers.1.{bias, weight} transformer.decoder.layers.2.norms.0.{bias, weight} transformer.decoder.layers.2.norms.1.{bias, weight} transformer.decoder.layers.2.norms.2.{bias, weight} transformer.decoder.layers.3.attentions.0.attn.out_proj.{bias, weight} transformer.decoder.layers.3.attentions.0.attn.{in_proj_bias, in_proj_weight} transformer.decoder.layers.3.attentions.1.attention_weights.{bias, weight} transformer.decoder.layers.3.attentions.1.output_proj.{bias, weight} transformer.decoder.layers.3.attentions.1.sampling_offsets.{bias, weight} transformer.decoder.layers.3.attentions.1.value_proj.{bias, weight} transformer.decoder.layers.3.ffns.0.layers.0.0.{bias, weight} transformer.decoder.layers.3.ffns.0.layers.1.{bias, weight} transformer.decoder.layers.3.norms.0.{bias, weight} transformer.decoder.layers.3.norms.1.{bias, weight} transformer.decoder.layers.3.norms.2.{bias, weight} transformer.decoder.layers.4.attentions.0.attn.out_proj.{bias, weight} transformer.decoder.layers.4.attentions.0.attn.{in_proj_bias, in_proj_weight} transformer.decoder.layers.4.attentions.1.attention_weights.{bias, weight} transformer.decoder.layers.4.attentions.1.output_proj.{bias, weight} 
transformer.decoder.layers.4.attentions.1.sampling_offsets.{bias, weight} transformer.decoder.layers.4.attentions.1.value_proj.{bias, weight} transformer.decoder.layers.4.ffns.0.layers.0.0.{bias, weight} transformer.decoder.layers.4.ffns.0.layers.1.{bias, weight} transformer.decoder.layers.4.norms.0.{bias, weight} transformer.decoder.layers.4.norms.1.{bias, weight} transformer.decoder.layers.4.norms.2.{bias, weight} transformer.decoder.layers.5.attentions.0.attn.out_proj.{bias, weight} transformer.decoder.layers.5.attentions.0.attn.{in_proj_bias, in_proj_weight} transformer.decoder.layers.5.attentions.1.attention_weights.{bias, weight} transformer.decoder.layers.5.attentions.1.output_proj.{bias, weight} transformer.decoder.layers.5.attentions.1.sampling_offsets.{bias, weight} transformer.decoder.layers.5.attentions.1.value_proj.{bias, weight} transformer.decoder.layers.5.ffns.0.layers.0.0.{bias, weight} transformer.decoder.layers.5.ffns.0.layers.1.{bias, weight} transformer.decoder.layers.5.norms.0.{bias, weight} transformer.decoder.layers.5.norms.1.{bias, weight} transformer.decoder.layers.5.norms.2.{bias, weight} transformer.decoder.norm.{bias, weight} transformer.decoder.ref_point_head.layers.0.{bias, weight} transformer.decoder.ref_point_head.layers.1.{bias, weight} transformer.enc_output.{bias, weight} transformer.enc_output_norm.{bias, weight} transformer.encoder.layers.0.attentions.0.attention_weights.{bias, weight} transformer.encoder.layers.0.attentions.0.output_proj.{bias, weight} transformer.encoder.layers.0.attentions.0.sampling_offsets.{bias, weight} transformer.encoder.layers.0.attentions.0.value_proj.{bias, weight} transformer.encoder.layers.0.ffns.0.layers.0.0.{bias, weight} transformer.encoder.layers.0.ffns.0.layers.1.{bias, weight} transformer.encoder.layers.0.norms.0.{bias, weight} transformer.encoder.layers.0.norms.1.{bias, weight} transformer.encoder.layers.1.attentions.0.attention_weights.{bias, weight} transformer.encoder.layers.1.attentions.0.output_proj.{bias, weight} transformer.encoder.layers.1.attentions.0.sampling_offsets.{bias, weight} transformer.encoder.layers.1.attentions.0.value_proj.{bias, weight} transformer.encoder.layers.1.ffns.0.layers.0.0.{bias, weight} transformer.encoder.layers.1.ffns.0.layers.1.{bias, weight} transformer.encoder.layers.1.norms.0.{bias, weight} transformer.encoder.layers.1.norms.1.{bias, weight} transformer.encoder.layers.2.attentions.0.attention_weights.{bias, weight} transformer.encoder.layers.2.attentions.0.output_proj.{bias, weight} transformer.encoder.layers.2.attentions.0.sampling_offsets.{bias, weight} transformer.encoder.layers.2.attentions.0.value_proj.{bias, weight} transformer.encoder.layers.2.ffns.0.layers.0.0.{bias, weight} transformer.encoder.layers.2.ffns.0.layers.1.{bias, weight} transformer.encoder.layers.2.norms.0.{bias, weight} transformer.encoder.layers.2.norms.1.{bias, weight} transformer.encoder.layers.3.attentions.0.attention_weights.{bias, weight} transformer.encoder.layers.3.attentions.0.output_proj.{bias, weight} transformer.encoder.layers.3.attentions.0.sampling_offsets.{bias, weight} transformer.encoder.layers.3.attentions.0.value_proj.{bias, weight} transformer.encoder.layers.3.ffns.0.layers.0.0.{bias, weight} transformer.encoder.layers.3.ffns.0.layers.1.{bias, weight} transformer.encoder.layers.3.norms.0.{bias, weight} transformer.encoder.layers.3.norms.1.{bias, weight} transformer.encoder.layers.4.attentions.0.attention_weights.{bias, weight} 
transformer.encoder.layers.4.attentions.0.output_proj.{bias, weight} transformer.encoder.layers.4.attentions.0.sampling_offsets.{bias, weight} transformer.encoder.layers.4.attentions.0.value_proj.{bias, weight} transformer.encoder.layers.4.ffns.0.layers.0.0.{bias, weight} transformer.encoder.layers.4.ffns.0.layers.1.{bias, weight} transformer.encoder.layers.4.norms.0.{bias, weight} transformer.encoder.layers.4.norms.1.{bias, weight} transformer.encoder.layers.5.attentions.0.attention_weights.{bias, weight} transformer.encoder.layers.5.attentions.0.output_proj.{bias, weight} transformer.encoder.layers.5.attentions.0.sampling_offsets.{bias, weight} transformer.encoder.layers.5.attentions.0.value_proj.{bias, weight} transformer.encoder.layers.5.ffns.0.layers.0.0.{bias, weight} transformer.encoder.layers.5.ffns.0.layers.1.{bias, weight} transformer.encoder.layers.5.norms.0.{bias, weight} transformer.encoder.layers.5.norms.1.{bias, weight} transformer.level_embeds transformer.tgt_embed.weight [03/21 06:50:24] fvcore.common.checkpoint WARNING: The checkpoint state_dict contains keys that are not used by the model: stem.fc.{bias, weight} [03/21 06:50:24] d2.engine.train_loop INFO: Starting training from iteration 0 [03/21 06:52:09] d2.utils.events INFO: eta: 2 days, 1:31:46 iter: 49 total_loss: 31.39 loss_class: 0.3901 loss_bbox: 0.7303 loss_giou: 1.51 loss_class_0: 0.341 loss_bbox_0: 0.7715 loss_giou_0: 1.571 loss_class_1: 0.3615 loss_bbox_1: 0.7546 loss_giou_1: 1.535 loss_class_2: 0.3759 loss_bbox_2: 0.7453 loss_giou_2: 1.521 loss_class_3: 0.3834 loss_bbox_3: 0.7373 loss_giou_3: 1.516 loss_class_4: 0.3906 loss_bbox_4: 0.7321 loss_giou_4: 1.512 loss_class_enc: 0.3899 loss_bbox_enc: 0.7735 loss_giou_enc: 1.582 loss_class_dn: 0.2283 loss_bbox_dn: 0.5829 loss_giou_dn: 1.389 loss_class_dn_0: 0.1446 loss_bbox_dn_0: 0.5395 loss_giou_dn_0: 1.367 loss_class_dn_1: 0.1446 loss_bbox_dn_1: 0.5443 loss_giou_dn_1: 1.368 loss_class_dn_2: 0.1487 loss_bbox_dn_2: 0.5482 loss_giou_dn_2: 1.373 loss_class_dn_3: 0.1529 loss_bbox_dn_3: 0.5562 loss_giou_dn_3: 1.38 loss_class_dn_4: 0.1795 loss_bbox_dn_4: 0.5691 loss_giou_dn_4: 1.387 time: 2.0147 data_time: 0.0903 lr: 8.75e-05 max_mem: 36046M [03/21 06:53:52] d2.utils.events INFO: eta: 2 days, 2:07:38 iter: 99 total_loss: 30.54 loss_class: 0.331 loss_bbox: 0.7423 loss_giou: 1.5 loss_class_0: 0.3013 loss_bbox_0: 0.7618 loss_giou_0: 1.53 loss_class_1: 0.3064 loss_bbox_1: 0.7627 loss_giou_1: 1.531 loss_class_2: 0.3067 loss_bbox_2: 0.7609 loss_giou_2: 1.528 loss_class_3: 0.3131 loss_bbox_3: 0.7545 loss_giou_3: 1.518 loss_class_4: 0.316 loss_bbox_4: 0.7509 loss_giou_4: 1.512 loss_class_enc: 0.3749 loss_bbox_enc: 0.716 loss_giou_enc: 1.478 loss_class_dn: 0.1461 loss_bbox_dn: 0.5777 loss_giou_dn: 1.381 loss_class_dn_0: 0.1433 loss_bbox_dn_0: 0.5753 loss_giou_dn_0: 1.378 loss_class_dn_1: 0.1435 loss_bbox_dn_1: 0.574 loss_giou_dn_1: 1.381 loss_class_dn_2: 0.1434 loss_bbox_dn_2: 0.5739 loss_giou_dn_2: 1.382 loss_class_dn_3: 0.1435 loss_bbox_dn_3: 0.5762 loss_giou_dn_3: 1.38 loss_class_dn_4: 0.1438 loss_bbox_dn_4: 0.5775 loss_giou_dn_4: 1.381 time: 2.0318 data_time: 0.1357 lr: 8.75e-05 max_mem: 36619M [03/21 06:55:34] d2.utils.events INFO: eta: 2 days, 2:25:28 iter: 149 total_loss: 24.47 loss_class: 0.3549 loss_bbox: 0.4026 loss_giou: 0.9389 loss_class_0: 0.3301 loss_bbox_0: 0.4307 loss_giou_0: 0.9966 loss_class_1: 0.3402 loss_bbox_1: 0.4235 loss_giou_1: 0.9753 loss_class_2: 0.3471 loss_bbox_2: 0.4163 loss_giou_2: 0.9661 loss_class_3: 0.3491 loss_bbox_3: 0.4112 loss_giou_3: 0.9601 
loss_class_4: 0.3509 loss_bbox_4: 0.4087 loss_giou_4: 0.9502 loss_class_enc: 0.3683 loss_bbox_enc: 0.3938 loss_giou_enc: 0.9409 loss_class_dn: 0.1444 loss_bbox_dn: 0.5297 loss_giou_dn: 1.382 loss_class_dn_0: 0.1423 loss_bbox_dn_0: 0.5334 loss_giou_dn_0: 1.378 loss_class_dn_1: 0.1417 loss_bbox_dn_1: 0.5321 loss_giou_dn_1: 1.379 loss_class_dn_2: 0.1424 loss_bbox_dn_2: 0.5312 loss_giou_dn_2: 1.379 loss_class_dn_3: 0.1425 loss_bbox_dn_3: 0.5309 loss_giou_dn_3: 1.38 loss_class_dn_4: 0.143 loss_bbox_dn_4: 0.5303 loss_giou_dn_4: 1.38 time: 2.0356 data_time: 0.1326 lr: 8.75e-05 max_mem: 36619M [03/21 06:57:14] d2.utils.events INFO: eta: 2 days, 2:24:09 iter: 199 total_loss: 22.3 loss_class: 0.3353 loss_bbox: 0.3162 loss_giou: 0.761 loss_class_0: 0.3089 loss_bbox_0: 0.3335 loss_giou_0: 0.7983 loss_class_1: 0.3152 loss_bbox_1: 0.3301 loss_giou_1: 0.7905 loss_class_2: 0.3203 loss_bbox_2: 0.3241 loss_giou_2: 0.7786 loss_class_3: 0.3226 loss_bbox_3: 0.3228 loss_giou_3: 0.7727 loss_class_4: 0.3271 loss_bbox_4: 0.321 loss_giou_4: 0.7688 loss_class_enc: 0.3035 loss_bbox_enc: 0.325 loss_giou_enc: 0.7735 loss_class_dn: 0.1359 loss_bbox_dn: 0.546 loss_giou_dn: 1.36 loss_class_dn_0: 0.1351 loss_bbox_dn_0: 0.5514 loss_giou_dn_0: 1.37 loss_class_dn_1: 0.1345 loss_bbox_dn_1: 0.5488 loss_giou_dn_1: 1.368 loss_class_dn_2: 0.1348 loss_bbox_dn_2: 0.5475 loss_giou_dn_2: 1.367 loss_class_dn_3: 0.1344 loss_bbox_dn_3: 0.5469 loss_giou_dn_3: 1.364 loss_class_dn_4: 0.1345 loss_bbox_dn_4: 0.5467 loss_giou_dn_4: 1.361 time: 2.0300 data_time: 0.1378 lr: 8.75e-05 max_mem: 36619M [03/21 06:58:54] d2.utils.events INFO: eta: 2 days, 2:19:14 iter: 249 total_loss: 20.31 loss_class: 0.2914 loss_bbox: 0.2757 loss_giou: 0.7097 loss_class_0: 0.272 loss_bbox_0: 0.2963 loss_giou_0: 0.7372 loss_class_1: 0.2696 loss_bbox_1: 0.2909 loss_giou_1: 0.7312 loss_class_2: 0.2708 loss_bbox_2: 0.2912 loss_giou_2: 0.7382 loss_class_3: 0.2799 loss_bbox_3: 0.2854 loss_giou_3: 0.7274 loss_class_4: 0.2835 loss_bbox_4: 0.2792 loss_giou_4: 0.7185 loss_class_enc: 0.2559 loss_bbox_enc: 0.3015 loss_giou_enc: 0.7576 loss_class_dn: 0.1238 loss_bbox_dn: 0.4917 loss_giou_dn: 1.239 loss_class_dn_0: 0.1335 loss_bbox_dn_0: 0.5369 loss_giou_dn_0: 1.31 loss_class_dn_1: 0.1297 loss_bbox_dn_1: 0.5191 loss_giou_dn_1: 1.278 loss_class_dn_2: 0.1276 loss_bbox_dn_2: 0.5093 loss_giou_dn_2: 1.258 loss_class_dn_3: 0.1256 loss_bbox_dn_3: 0.5021 loss_giou_dn_3: 1.258 loss_class_dn_4: 0.1243 loss_bbox_dn_4: 0.4955 loss_giou_dn_4: 1.246 time: 2.0233 data_time: 0.1387 lr: 8.75e-05 max_mem: 36619M [03/21 07:00:37] d2.utils.events INFO: eta: 2 days, 2:15:53 iter: 299 total_loss: 19.29 loss_class: 0.2711 loss_bbox: 0.2571 loss_giou: 0.6883 loss_class_0: 0.255 loss_bbox_0: 0.2784 loss_giou_0: 0.7191 loss_class_1: 0.2484 loss_bbox_1: 0.2768 loss_giou_1: 0.7346 loss_class_2: 0.25 loss_bbox_2: 0.2685 loss_giou_2: 0.7175 loss_class_3: 0.2527 loss_bbox_3: 0.2642 loss_giou_3: 0.7215 loss_class_4: 0.2625 loss_bbox_4: 0.2593 loss_giou_4: 0.6972 loss_class_enc: 0.2447 loss_bbox_enc: 0.2967 loss_giou_enc: 0.7578 loss_class_dn: 0.1182 loss_bbox_dn: 0.4419 loss_giou_dn: 1.123 loss_class_dn_0: 0.1339 loss_bbox_dn_0: 0.4889 loss_giou_dn_0: 1.265 loss_class_dn_1: 0.1241 loss_bbox_dn_1: 0.4744 loss_giou_dn_1: 1.229 loss_class_dn_2: 0.1216 loss_bbox_dn_2: 0.4521 loss_giou_dn_2: 1.163 loss_class_dn_3: 0.1188 loss_bbox_dn_3: 0.4486 loss_giou_dn_3: 1.153 loss_class_dn_4: 0.1183 loss_bbox_dn_4: 0.4442 loss_giou_dn_4: 1.131 time: 2.0279 data_time: 0.1322 lr: 8.75e-05 max_mem: 36619M [03/21 07:02:17] 
d2.utils.events INFO: eta: 2 days, 2:13:37 iter: 349 total_loss: 18.18 loss_class: 0.2393 loss_bbox: 0.2547 loss_giou: 0.6594 loss_class_0: 0.2444 loss_bbox_0: 0.2738 loss_giou_0: 0.6896 loss_class_1: 0.2343 loss_bbox_1: 0.2712 loss_giou_1: 0.6913 loss_class_2: 0.2257 loss_bbox_2: 0.2641 loss_giou_2: 0.6819 loss_class_3: 0.2277 loss_bbox_3: 0.2618 loss_giou_3: 0.6761 loss_class_4: 0.2318 loss_bbox_4: 0.2588 loss_giou_4: 0.6695 loss_class_enc: 0.2299 loss_bbox_enc: 0.312 loss_giou_enc: 0.7611 loss_class_dn: 0.1091 loss_bbox_dn: 0.4225 loss_giou_dn: 0.9755 loss_class_dn_0: 0.1308 loss_bbox_dn_0: 0.5209 loss_giou_dn_0: 1.232 loss_class_dn_1: 0.1211 loss_bbox_dn_1: 0.4938 loss_giou_dn_1: 1.173 loss_class_dn_2: 0.1152 loss_bbox_dn_2: 0.4417 loss_giou_dn_2: 1.033 loss_class_dn_3: 0.1131 loss_bbox_dn_3: 0.4355 loss_giou_dn_3: 1.013 loss_class_dn_4: 0.1109 loss_bbox_dn_4: 0.425 loss_giou_dn_4: 0.9864 time: 2.0232 data_time: 0.1387 lr: 8.75e-05 max_mem: 36619M [03/21 07:03:57] d2.utils.events INFO: eta: 2 days, 2:11:56 iter: 399 total_loss: 17.14 loss_class: 0.2111 loss_bbox: 0.237 loss_giou: 0.6286 loss_class_0: 0.2274 loss_bbox_0: 0.2622 loss_giou_0: 0.6648 loss_class_1: 0.2101 loss_bbox_1: 0.2546 loss_giou_1: 0.6524 loss_class_2: 0.2008 loss_bbox_2: 0.2503 loss_giou_2: 0.6466 loss_class_3: 0.2039 loss_bbox_3: 0.2455 loss_giou_3: 0.6404 loss_class_4: 0.2049 loss_bbox_4: 0.2419 loss_giou_4: 0.6364 loss_class_enc: 0.2093 loss_bbox_enc: 0.2891 loss_giou_enc: 0.7289 loss_class_dn: 0.106 loss_bbox_dn: 0.3959 loss_giou_dn: 0.9171 loss_class_dn_0: 0.1286 loss_bbox_dn_0: 0.4928 loss_giou_dn_0: 1.201 loss_class_dn_1: 0.1187 loss_bbox_dn_1: 0.4481 loss_giou_dn_1: 1.099 loss_class_dn_2: 0.1113 loss_bbox_dn_2: 0.4096 loss_giou_dn_2: 0.9694 loss_class_dn_3: 0.1098 loss_bbox_dn_3: 0.4042 loss_giou_dn_3: 0.9508 loss_class_dn_4: 0.107 loss_bbox_dn_4: 0.3966 loss_giou_dn_4: 0.9269 time: 2.0222 data_time: 0.1333 lr: 8.75e-05 max_mem: 36619M [03/21 07:05:37] d2.utils.events INFO: eta: 2 days, 2:06:55 iter: 449 total_loss: 15.78 loss_class: 0.1969 loss_bbox: 0.1984 loss_giou: 0.5929 loss_class_0: 0.2093 loss_bbox_0: 0.2177 loss_giou_0: 0.6243 loss_class_1: 0.1941 loss_bbox_1: 0.2103 loss_giou_1: 0.6118 loss_class_2: 0.1906 loss_bbox_2: 0.2066 loss_giou_2: 0.6012 loss_class_3: 0.1926 loss_bbox_3: 0.2041 loss_giou_3: 0.6002 loss_class_4: 0.1924 loss_bbox_4: 0.2003 loss_giou_4: 0.5982 loss_class_enc: 0.2043 loss_bbox_enc: 0.2511 loss_giou_enc: 0.67 loss_class_dn: 0.1016 loss_bbox_dn: 0.3355 loss_giou_dn: 0.837 loss_class_dn_0: 0.125 loss_bbox_dn_0: 0.4579 loss_giou_dn_0: 1.152 loss_class_dn_1: 0.1166 loss_bbox_dn_1: 0.3991 loss_giou_dn_1: 1.009 loss_class_dn_2: 0.1078 loss_bbox_dn_2: 0.3527 loss_giou_dn_2: 0.8861 loss_class_dn_3: 0.1049 loss_bbox_dn_3: 0.3438 loss_giou_dn_3: 0.8651 loss_class_dn_4: 0.1026 loss_bbox_dn_4: 0.336 loss_giou_dn_4: 0.844 time: 2.0181 data_time: 0.1367 lr: 8.75e-05 max_mem: 36619M [03/21 07:07:15] d2.utils.events INFO: eta: 2 days, 1:47:47 iter: 499 total_loss: 15.48 loss_class: 0.182 loss_bbox: 0.1927 loss_giou: 0.5816 loss_class_0: 0.2002 loss_bbox_0: 0.209 loss_giou_0: 0.6115 loss_class_1: 0.1829 loss_bbox_1: 0.2031 loss_giou_1: 0.608 loss_class_2: 0.1773 loss_bbox_2: 0.1979 loss_giou_2: 0.6008 loss_class_3: 0.1796 loss_bbox_3: 0.1971 loss_giou_3: 0.5926 loss_class_4: 0.1795 loss_bbox_4: 0.194 loss_giou_4: 0.5853 loss_class_enc: 0.1918 loss_bbox_enc: 0.2393 loss_giou_enc: 0.6642 loss_class_dn: 0.1007 loss_bbox_dn: 0.3052 loss_giou_dn: 0.8087 loss_class_dn_0: 0.1218 loss_bbox_dn_0: 0.4415 
loss_giou_dn_0: 1.129 loss_class_dn_1: 0.1137 loss_bbox_dn_1: 0.3693 loss_giou_dn_1: 0.9531 loss_class_dn_2: 0.104 loss_bbox_dn_2: 0.3235 loss_giou_dn_2: 0.8518 loss_class_dn_3: 0.1021 loss_bbox_dn_3: 0.314 loss_giou_dn_3: 0.8337 loss_class_dn_4: 0.1009 loss_bbox_dn_4: 0.3067 loss_giou_dn_4: 0.8152 time: 2.0126 data_time: 0.1270 lr: 8.75e-05 max_mem: 36619M [03/21 07:08:54] d2.utils.events INFO: eta: 2 days, 1:37:11 iter: 549 total_loss: 15.3 loss_class: 0.1856 loss_bbox: 0.2082 loss_giou: 0.5833 loss_class_0: 0.2041 loss_bbox_0: 0.2285 loss_giou_0: 0.6337 loss_class_1: 0.1887 loss_bbox_1: 0.219 loss_giou_1: 0.6124 loss_class_2: 0.1822 loss_bbox_2: 0.2169 loss_giou_2: 0.5997 loss_class_3: 0.1796 loss_bbox_3: 0.2142 loss_giou_3: 0.5946 loss_class_4: 0.1838 loss_bbox_4: 0.2118 loss_giou_4: 0.5876 loss_class_enc: 0.1929 loss_bbox_enc: 0.258 loss_giou_enc: 0.6919 loss_class_dn: 0.1 loss_bbox_dn: 0.3317 loss_giou_dn: 0.8251 loss_class_dn_0: 0.1218 loss_bbox_dn_0: 0.4352 loss_giou_dn_0: 1.092 loss_class_dn_1: 0.1108 loss_bbox_dn_1: 0.3741 loss_giou_dn_1: 0.949 loss_class_dn_2: 0.1028 loss_bbox_dn_2: 0.3456 loss_giou_dn_2: 0.8668 loss_class_dn_3: 0.1022 loss_bbox_dn_3: 0.3414 loss_giou_dn_3: 0.8433 loss_class_dn_4: 0.1002 loss_bbox_dn_4: 0.334 loss_giou_dn_4: 0.83 time: 2.0094 data_time: 0.1241 lr: 8.75e-05 max_mem: 36619M [03/21 07:10:33] d2.utils.events INFO: eta: 2 days, 1:32:29 iter: 599 total_loss: 14.6 loss_class: 0.1724 loss_bbox: 0.2056 loss_giou: 0.5658 loss_class_0: 0.1878 loss_bbox_0: 0.2142 loss_giou_0: 0.5867 loss_class_1: 0.1749 loss_bbox_1: 0.2163 loss_giou_1: 0.5782 loss_class_2: 0.1726 loss_bbox_2: 0.2125 loss_giou_2: 0.5744 loss_class_3: 0.1698 loss_bbox_3: 0.2108 loss_giou_3: 0.5704 loss_class_4: 0.1693 loss_bbox_4: 0.2073 loss_giou_4: 0.566 loss_class_enc: 0.1859 loss_bbox_enc: 0.2432 loss_giou_enc: 0.6461 loss_class_dn: 0.09496 loss_bbox_dn: 0.3211 loss_giou_dn: 0.7744 loss_class_dn_0: 0.1159 loss_bbox_dn_0: 0.4339 loss_giou_dn_0: 1.057 loss_class_dn_1: 0.1056 loss_bbox_dn_1: 0.3556 loss_giou_dn_1: 0.8849 loss_class_dn_2: 0.09853 loss_bbox_dn_2: 0.3293 loss_giou_dn_2: 0.8043 loss_class_dn_3: 0.0961 loss_bbox_dn_3: 0.3279 loss_giou_dn_3: 0.7874 loss_class_dn_4: 0.09421 loss_bbox_dn_4: 0.3216 loss_giou_dn_4: 0.777 time: 2.0064 data_time: 0.1364 lr: 8.75e-05 max_mem: 36619M [03/21 07:12:11] d2.utils.events INFO: eta: 2 days, 1:28:27 iter: 649 total_loss: 14.37 loss_class: 0.1636 loss_bbox: 0.2115 loss_giou: 0.5631 loss_class_0: 0.184 loss_bbox_0: 0.222 loss_giou_0: 0.5931 loss_class_1: 0.1643 loss_bbox_1: 0.2217 loss_giou_1: 0.5851 loss_class_2: 0.1646 loss_bbox_2: 0.2155 loss_giou_2: 0.5736 loss_class_3: 0.1598 loss_bbox_3: 0.2143 loss_giou_3: 0.5732 loss_class_4: 0.1577 loss_bbox_4: 0.2127 loss_giou_4: 0.5687 loss_class_enc: 0.1782 loss_bbox_enc: 0.2546 loss_giou_enc: 0.6472 loss_class_dn: 0.09388 loss_bbox_dn: 0.3066 loss_giou_dn: 0.7467 loss_class_dn_0: 0.116 loss_bbox_dn_0: 0.3885 loss_giou_dn_0: 1.01 loss_class_dn_1: 0.1024 loss_bbox_dn_1: 0.3323 loss_giou_dn_1: 0.8501 loss_class_dn_2: 0.09734 loss_bbox_dn_2: 0.3139 loss_giou_dn_2: 0.7769 loss_class_dn_3: 0.09568 loss_bbox_dn_3: 0.3108 loss_giou_dn_3: 0.7685 loss_class_dn_4: 0.09312 loss_bbox_dn_4: 0.3069 loss_giou_dn_4: 0.7505 time: 2.0031 data_time: 0.1351 lr: 8.75e-05 max_mem: 36619M [03/21 07:13:49] d2.utils.events INFO: eta: 2 days, 1:23:53 iter: 699 total_loss: 14.27 loss_class: 0.1606 loss_bbox: 0.1839 loss_giou: 0.5593 loss_class_0: 0.1823 loss_bbox_0: 0.2028 loss_giou_0: 0.5849 loss_class_1: 0.1689 loss_bbox_1: 
0.1937 loss_giou_1: 0.578 loss_class_2: 0.1624 loss_bbox_2: 0.1893 loss_giou_2: 0.5634 loss_class_3: 0.1617 loss_bbox_3: 0.1881 loss_giou_3: 0.5622 loss_class_4: 0.1591 loss_bbox_4: 0.1851 loss_giou_4: 0.5575 loss_class_enc: 0.1796 loss_bbox_enc: 0.2238 loss_giou_enc: 0.6337 loss_class_dn: 0.09332 loss_bbox_dn: 0.2967 loss_giou_dn: 0.7659 loss_class_dn_0: 0.1123 loss_bbox_dn_0: 0.3806 loss_giou_dn_0: 0.9949 loss_class_dn_1: 0.1016 loss_bbox_dn_1: 0.3181 loss_giou_dn_1: 0.8532 loss_class_dn_2: 0.09715 loss_bbox_dn_2: 0.2987 loss_giou_dn_2: 0.7946 loss_class_dn_3: 0.09364 loss_bbox_dn_3: 0.2994 loss_giou_dn_3: 0.7864 loss_class_dn_4: 0.09295 loss_bbox_dn_4: 0.2956 loss_giou_dn_4: 0.7707 time: 2.0010 data_time: 0.1074 lr: 8.75e-05 max_mem: 36619M [03/21 07:15:30] d2.utils.events INFO: eta: 2 days, 1:19:24 iter: 749 total_loss: 13.83 loss_class: 0.1607 loss_bbox: 0.1844 loss_giou: 0.5554 loss_class_0: 0.1778 loss_bbox_0: 0.1965 loss_giou_0: 0.5858 loss_class_1: 0.1605 loss_bbox_1: 0.1933 loss_giou_1: 0.5773 loss_class_2: 0.1614 loss_bbox_2: 0.1908 loss_giou_2: 0.5598 loss_class_3: 0.1613 loss_bbox_3: 0.1885 loss_giou_3: 0.5583 loss_class_4: 0.1616 loss_bbox_4: 0.1862 loss_giou_4: 0.5558 loss_class_enc: 0.1764 loss_bbox_enc: 0.2168 loss_giou_enc: 0.6204 loss_class_dn: 0.09268 loss_bbox_dn: 0.2855 loss_giou_dn: 0.7466 loss_class_dn_0: 0.1148 loss_bbox_dn_0: 0.3648 loss_giou_dn_0: 0.9884 loss_class_dn_1: 0.1022 loss_bbox_dn_1: 0.3138 loss_giou_dn_1: 0.8346 loss_class_dn_2: 0.09479 loss_bbox_dn_2: 0.293 loss_giou_dn_2: 0.7726 loss_class_dn_3: 0.09375 loss_bbox_dn_3: 0.2869 loss_giou_dn_3: 0.7581 loss_class_dn_4: 0.09495 loss_bbox_dn_4: 0.2847 loss_giou_dn_4: 0.7495 time: 2.0010 data_time: 0.1348 lr: 8.75e-05 max_mem: 36619M [03/21 07:17:05] d2.utils.events INFO: eta: 2 days, 1:10:07 iter: 799 total_loss: 13.17 loss_class: 0.1553 loss_bbox: 0.1913 loss_giou: 0.5281 loss_class_0: 0.173 loss_bbox_0: 0.2124 loss_giou_0: 0.558 loss_class_1: 0.1574 loss_bbox_1: 0.2032 loss_giou_1: 0.5456 loss_class_2: 0.1551 loss_bbox_2: 0.1974 loss_giou_2: 0.5339 loss_class_3: 0.1528 loss_bbox_3: 0.1946 loss_giou_3: 0.5338 loss_class_4: 0.1548 loss_bbox_4: 0.1921 loss_giou_4: 0.5262 loss_class_enc: 0.176 loss_bbox_enc: 0.2435 loss_giou_enc: 0.6125 loss_class_dn: 0.08625 loss_bbox_dn: 0.2904 loss_giou_dn: 0.7087 loss_class_dn_0: 0.1083 loss_bbox_dn_0: 0.3818 loss_giou_dn_0: 0.9614 loss_class_dn_1: 0.09576 loss_bbox_dn_1: 0.3174 loss_giou_dn_1: 0.8043 loss_class_dn_2: 0.09236 loss_bbox_dn_2: 0.296 loss_giou_dn_2: 0.7373 loss_class_dn_3: 0.08803 loss_bbox_dn_3: 0.293 loss_giou_dn_3: 0.7231 loss_class_dn_4: 0.08636 loss_bbox_dn_4: 0.2905 loss_giou_dn_4: 0.7102 time: 1.9958 data_time: 0.1163 lr: 8.75e-05 max_mem: 36619M [03/21 07:18:45] d2.utils.events INFO: eta: 2 days, 1:02:30 iter: 849 total_loss: 13.52 loss_class: 0.1554 loss_bbox: 0.1913 loss_giou: 0.5249 loss_class_0: 0.1714 loss_bbox_0: 0.2059 loss_giou_0: 0.5621 loss_class_1: 0.1556 loss_bbox_1: 0.201 loss_giou_1: 0.548 loss_class_2: 0.1529 loss_bbox_2: 0.1971 loss_giou_2: 0.5362 loss_class_3: 0.1506 loss_bbox_3: 0.1929 loss_giou_3: 0.5395 loss_class_4: 0.1511 loss_bbox_4: 0.1921 loss_giou_4: 0.5268 loss_class_enc: 0.1707 loss_bbox_enc: 0.2371 loss_giou_enc: 0.6287 loss_class_dn: 0.0899 loss_bbox_dn: 0.2927 loss_giou_dn: 0.7078 loss_class_dn_0: 0.1081 loss_bbox_dn_0: 0.3796 loss_giou_dn_0: 0.944 loss_class_dn_1: 0.09809 loss_bbox_dn_1: 0.3192 loss_giou_dn_1: 0.7914 loss_class_dn_2: 0.0927 loss_bbox_dn_2: 0.298 loss_giou_dn_2: 0.7287 loss_class_dn_3: 0.08943 
loss_bbox_dn_3: 0.2965 loss_giou_dn_3: 0.7181 loss_class_dn_4: 0.0897 loss_bbox_dn_4: 0.2923 loss_giou_dn_4: 0.708 time: 1.9951 data_time: 0.1432 lr: 8.75e-05 max_mem: 36619M [03/21 07:20:22] d2.utils.events INFO: eta: 2 days, 0:57:18 iter: 899 total_loss: 13.58 loss_class: 0.1572 loss_bbox: 0.1977 loss_giou: 0.5381 loss_class_0: 0.1701 loss_bbox_0: 0.2168 loss_giou_0: 0.5655 loss_class_1: 0.1528 loss_bbox_1: 0.2101 loss_giou_1: 0.5612 loss_class_2: 0.1516 loss_bbox_2: 0.2043 loss_giou_2: 0.5513 loss_class_3: 0.1535 loss_bbox_3: 0.2041 loss_giou_3: 0.5489 loss_class_4: 0.1551 loss_bbox_4: 0.199 loss_giou_4: 0.542 loss_class_enc: 0.1686 loss_bbox_enc: 0.2339 loss_giou_enc: 0.6494 loss_class_dn: 0.09017 loss_bbox_dn: 0.2944 loss_giou_dn: 0.7108 loss_class_dn_0: 0.1071 loss_bbox_dn_0: 0.3648 loss_giou_dn_0: 0.9343 loss_class_dn_1: 0.09719 loss_bbox_dn_1: 0.3109 loss_giou_dn_1: 0.7894 loss_class_dn_2: 0.09185 loss_bbox_dn_2: 0.3002 loss_giou_dn_2: 0.7372 loss_class_dn_3: 0.09009 loss_bbox_dn_3: 0.2968 loss_giou_dn_3: 0.7232 loss_class_dn_4: 0.08935 loss_bbox_dn_4: 0.294 loss_giou_dn_4: 0.7119 time: 1.9929 data_time: 0.1326 lr: 8.75e-05 max_mem: 36619M [03/21 07:22:01] d2.utils.events INFO: eta: 2 days, 0:54:16 iter: 949 total_loss: 12.87 loss_class: 0.1421 loss_bbox: 0.1679 loss_giou: 0.5261 loss_class_0: 0.1571 loss_bbox_0: 0.1852 loss_giou_0: 0.5519 loss_class_1: 0.1415 loss_bbox_1: 0.1766 loss_giou_1: 0.5423 loss_class_2: 0.141 loss_bbox_2: 0.1726 loss_giou_2: 0.5386 loss_class_3: 0.1398 loss_bbox_3: 0.1721 loss_giou_3: 0.5326 loss_class_4: 0.141 loss_bbox_4: 0.1711 loss_giou_4: 0.529 loss_class_enc: 0.1522 loss_bbox_enc: 0.2036 loss_giou_enc: 0.6027 loss_class_dn: 0.08535 loss_bbox_dn: 0.264 loss_giou_dn: 0.6966 loss_class_dn_0: 0.102 loss_bbox_dn_0: 0.3393 loss_giou_dn_0: 0.9249 loss_class_dn_1: 0.09096 loss_bbox_dn_1: 0.2807 loss_giou_dn_1: 0.7711 loss_class_dn_2: 0.08737 loss_bbox_dn_2: 0.2676 loss_giou_dn_2: 0.7158 loss_class_dn_3: 0.08515 loss_bbox_dn_3: 0.2665 loss_giou_dn_3: 0.7083 loss_class_dn_4: 0.08451 loss_bbox_dn_4: 0.2636 loss_giou_dn_4: 0.6996 time: 1.9920 data_time: 0.1278 lr: 8.75e-05 max_mem: 36619M [03/21 07:23:40] d2.utils.events INFO: eta: 2 days, 0:53:13 iter: 999 total_loss: 14.07 loss_class: 0.155 loss_bbox: 0.1956 loss_giou: 0.568 loss_class_0: 0.1637 loss_bbox_0: 0.2166 loss_giou_0: 0.5809 loss_class_1: 0.1544 loss_bbox_1: 0.2112 loss_giou_1: 0.5862 loss_class_2: 0.1502 loss_bbox_2: 0.2051 loss_giou_2: 0.5815 loss_class_3: 0.1532 loss_bbox_3: 0.1984 loss_giou_3: 0.5737 loss_class_4: 0.1526 loss_bbox_4: 0.1977 loss_giou_4: 0.5677 loss_class_enc: 0.1639 loss_bbox_enc: 0.2445 loss_giou_enc: 0.6595 loss_class_dn: 0.08664 loss_bbox_dn: 0.2966 loss_giou_dn: 0.7383 loss_class_dn_0: 0.1016 loss_bbox_dn_0: 0.3524 loss_giou_dn_0: 0.9314 loss_class_dn_1: 0.0925 loss_bbox_dn_1: 0.3063 loss_giou_dn_1: 0.8018 loss_class_dn_2: 0.08864 loss_bbox_dn_2: 0.2961 loss_giou_dn_2: 0.7604 loss_class_dn_3: 0.08548 loss_bbox_dn_3: 0.2954 loss_giou_dn_3: 0.7514 loss_class_dn_4: 0.08612 loss_bbox_dn_4: 0.2951 loss_giou_dn_4: 0.7393 time: 1.9907 data_time: 0.1188 lr: 8.75e-05 max_mem: 36619M [03/21 07:25:18] d2.utils.events INFO: eta: 2 days, 0:50:08 iter: 1049 total_loss: 12.61 loss_class: 0.1402 loss_bbox: 0.1697 loss_giou: 0.498 loss_class_0: 0.1522 loss_bbox_0: 0.1904 loss_giou_0: 0.5455 loss_class_1: 0.1418 loss_bbox_1: 0.1781 loss_giou_1: 0.5258 loss_class_2: 0.1423 loss_bbox_2: 0.1729 loss_giou_2: 0.513 loss_class_3: 0.1423 loss_bbox_3: 0.1723 loss_giou_3: 0.5079 loss_class_4: 0.1377 
loss_bbox_4: 0.1708 loss_giou_4: 0.5015 loss_class_enc: 0.1548 loss_bbox_enc: 0.2239 loss_giou_enc: 0.6022 loss_class_dn: 0.08238 loss_bbox_dn: 0.2563 loss_giou_dn: 0.6754 loss_class_dn_0: 0.1021 loss_bbox_dn_0: 0.3431 loss_giou_dn_0: 0.8955 loss_class_dn_1: 0.09147 loss_bbox_dn_1: 0.2881 loss_giou_dn_1: 0.7608 loss_class_dn_2: 0.08719 loss_bbox_dn_2: 0.2648 loss_giou_dn_2: 0.7004 loss_class_dn_3: 0.08442 loss_bbox_dn_3: 0.2608 loss_giou_dn_3: 0.6924 loss_class_dn_4: 0.0834 loss_bbox_dn_4: 0.257 loss_giou_dn_4: 0.682 time: 1.9897 data_time: 0.1326 lr: 8.75e-05 max_mem: 36619M [03/21 07:26:56] d2.utils.events INFO: eta: 2 days, 0:43:27 iter: 1099 total_loss: 11.79 loss_class: 0.135 loss_bbox: 0.1644 loss_giou: 0.4716 loss_class_0: 0.1537 loss_bbox_0: 0.1842 loss_giou_0: 0.5177 loss_class_1: 0.1378 loss_bbox_1: 0.1762 loss_giou_1: 0.4958 loss_class_2: 0.1365 loss_bbox_2: 0.1669 loss_giou_2: 0.4882 loss_class_3: 0.1346 loss_bbox_3: 0.1653 loss_giou_3: 0.4801 loss_class_4: 0.1351 loss_bbox_4: 0.1642 loss_giou_4: 0.4739 loss_class_enc: 0.1533 loss_bbox_enc: 0.2098 loss_giou_enc: 0.5448 loss_class_dn: 0.08058 loss_bbox_dn: 0.2455 loss_giou_dn: 0.6456 loss_class_dn_0: 0.09777 loss_bbox_dn_0: 0.3198 loss_giou_dn_0: 0.8498 loss_class_dn_1: 0.08765 loss_bbox_dn_1: 0.2678 loss_giou_dn_1: 0.7217 loss_class_dn_2: 0.08346 loss_bbox_dn_2: 0.2492 loss_giou_dn_2: 0.6665 loss_class_dn_3: 0.08079 loss_bbox_dn_3: 0.2464 loss_giou_dn_3: 0.6543 loss_class_dn_4: 0.0802 loss_bbox_dn_4: 0.2447 loss_giou_dn_4: 0.6468 time: 1.9886 data_time: 0.1447 lr: 8.75e-05 max_mem: 36619M [03/21 07:28:35] d2.utils.events INFO: eta: 2 days, 0:38:01 iter: 1149 total_loss: 12.03 loss_class: 0.1403 loss_bbox: 0.1656 loss_giou: 0.4747 loss_class_0: 0.1541 loss_bbox_0: 0.1813 loss_giou_0: 0.5215 loss_class_1: 0.1423 loss_bbox_1: 0.1717 loss_giou_1: 0.4944 loss_class_2: 0.1437 loss_bbox_2: 0.1685 loss_giou_2: 0.4786 loss_class_3: 0.1425 loss_bbox_3: 0.1677 loss_giou_3: 0.4767 loss_class_4: 0.141 loss_bbox_4: 0.1651 loss_giou_4: 0.4738 loss_class_enc: 0.1516 loss_bbox_enc: 0.203 loss_giou_enc: 0.5705 loss_class_dn: 0.08421 loss_bbox_dn: 0.2664 loss_giou_dn: 0.6486 loss_class_dn_0: 0.1014 loss_bbox_dn_0: 0.3333 loss_giou_dn_0: 0.8585 loss_class_dn_1: 0.09038 loss_bbox_dn_1: 0.2863 loss_giou_dn_1: 0.7179 loss_class_dn_2: 0.08666 loss_bbox_dn_2: 0.2705 loss_giou_dn_2: 0.6687 loss_class_dn_3: 0.08492 loss_bbox_dn_3: 0.2698 loss_giou_dn_3: 0.658 loss_class_dn_4: 0.08338 loss_bbox_dn_4: 0.2668 loss_giou_dn_4: 0.6502 time: 1.9875 data_time: 0.1429 lr: 8.75e-05 max_mem: 36619M [03/21 07:30:16] d2.utils.events INFO: eta: 2 days, 0:35:06 iter: 1199 total_loss: 12.15 loss_class: 0.1373 loss_bbox: 0.1587 loss_giou: 0.4645 loss_class_0: 0.1468 loss_bbox_0: 0.1779 loss_giou_0: 0.5294 loss_class_1: 0.142 loss_bbox_1: 0.1671 loss_giou_1: 0.4984 loss_class_2: 0.1389 loss_bbox_2: 0.1601 loss_giou_2: 0.4745 loss_class_3: 0.1386 loss_bbox_3: 0.1592 loss_giou_3: 0.4709 loss_class_4: 0.1379 loss_bbox_4: 0.1585 loss_giou_4: 0.4643 loss_class_enc: 0.1526 loss_bbox_enc: 0.2125 loss_giou_enc: 0.5869 loss_class_dn: 0.08555 loss_bbox_dn: 0.2521 loss_giou_dn: 0.6596 loss_class_dn_0: 0.09937 loss_bbox_dn_0: 0.3153 loss_giou_dn_0: 0.8656 loss_class_dn_1: 0.09008 loss_bbox_dn_1: 0.2678 loss_giou_dn_1: 0.7307 loss_class_dn_2: 0.08608 loss_bbox_dn_2: 0.2549 loss_giou_dn_2: 0.6812 loss_class_dn_3: 0.08464 loss_bbox_dn_3: 0.2525 loss_giou_dn_3: 0.6721 loss_class_dn_4: 0.08471 loss_bbox_dn_4: 0.2517 loss_giou_dn_4: 0.6621 time: 1.9888 data_time: 0.1460 lr: 8.75e-05 
max_mem: 36619M [03/21 07:31:54] d2.utils.events INFO: eta: 2 days, 0:29:30 iter: 1249 total_loss: 11.83 loss_class: 0.1343 loss_bbox: 0.1652 loss_giou: 0.4554 loss_class_0: 0.147 loss_bbox_0: 0.1893 loss_giou_0: 0.5044 loss_class_1: 0.1367 loss_bbox_1: 0.1756 loss_giou_1: 0.4846 loss_class_2: 0.1342 loss_bbox_2: 0.1705 loss_giou_2: 0.4673 loss_class_3: 0.1352 loss_bbox_3: 0.1683 loss_giou_3: 0.4593 loss_class_4: 0.135 loss_bbox_4: 0.1666 loss_giou_4: 0.4562 loss_class_enc: 0.1521 loss_bbox_enc: 0.2122 loss_giou_enc: 0.574 loss_class_dn: 0.08068 loss_bbox_dn: 0.2501 loss_giou_dn: 0.6389 loss_class_dn_0: 0.09763 loss_bbox_dn_0: 0.3196 loss_giou_dn_0: 0.8373 loss_class_dn_1: 0.08624 loss_bbox_dn_1: 0.2702 loss_giou_dn_1: 0.705 loss_class_dn_2: 0.08192 loss_bbox_dn_2: 0.2556 loss_giou_dn_2: 0.6594 loss_class_dn_3: 0.08103 loss_bbox_dn_3: 0.2524 loss_giou_dn_3: 0.6484 loss_class_dn_4: 0.0805 loss_bbox_dn_4: 0.2502 loss_giou_dn_4: 0.6409 time: 1.9879 data_time: 0.1380 lr: 8.75e-05 max_mem: 36619M [03/21 07:33:33] d2.utils.events INFO: eta: 2 days, 0:24:13 iter: 1299 total_loss: 11.65 loss_class: 0.1404 loss_bbox: 0.1492 loss_giou: 0.4474 loss_class_0: 0.1474 loss_bbox_0: 0.175 loss_giou_0: 0.5164 loss_class_1: 0.1394 loss_bbox_1: 0.1605 loss_giou_1: 0.4805 loss_class_2: 0.1373 loss_bbox_2: 0.1535 loss_giou_2: 0.4616 loss_class_3: 0.1381 loss_bbox_3: 0.1536 loss_giou_3: 0.4569 loss_class_4: 0.1377 loss_bbox_4: 0.1514 loss_giou_4: 0.4502 loss_class_enc: 0.1488 loss_bbox_enc: 0.1964 loss_giou_enc: 0.5679 loss_class_dn: 0.08258 loss_bbox_dn: 0.2438 loss_giou_dn: 0.6188 loss_class_dn_0: 0.09773 loss_bbox_dn_0: 0.3057 loss_giou_dn_0: 0.8349 loss_class_dn_1: 0.08758 loss_bbox_dn_1: 0.2627 loss_giou_dn_1: 0.6863 loss_class_dn_2: 0.08376 loss_bbox_dn_2: 0.2482 loss_giou_dn_2: 0.634 loss_class_dn_3: 0.08255 loss_bbox_dn_3: 0.247 loss_giou_dn_3: 0.6276 loss_class_dn_4: 0.08259 loss_bbox_dn_4: 0.2446 loss_giou_dn_4: 0.6202 time: 1.9874 data_time: 0.1266 lr: 8.75e-05 max_mem: 36619M [03/21 07:35:10] d2.utils.events INFO: eta: 2 days, 0:19:20 iter: 1349 total_loss: 11.45 loss_class: 0.1319 loss_bbox: 0.1521 loss_giou: 0.438 loss_class_0: 0.1408 loss_bbox_0: 0.1775 loss_giou_0: 0.5015 loss_class_1: 0.1342 loss_bbox_1: 0.1581 loss_giou_1: 0.4746 loss_class_2: 0.1313 loss_bbox_2: 0.155 loss_giou_2: 0.4498 loss_class_3: 0.13 loss_bbox_3: 0.1545 loss_giou_3: 0.4438 loss_class_4: 0.1308 loss_bbox_4: 0.1525 loss_giou_4: 0.4373 loss_class_enc: 0.1426 loss_bbox_enc: 0.2002 loss_giou_enc: 0.5545 loss_class_dn: 0.08152 loss_bbox_dn: 0.2427 loss_giou_dn: 0.6172 loss_class_dn_0: 0.09606 loss_bbox_dn_0: 0.3129 loss_giou_dn_0: 0.8498 loss_class_dn_1: 0.08729 loss_bbox_dn_1: 0.2626 loss_giou_dn_1: 0.6882 loss_class_dn_2: 0.08256 loss_bbox_dn_2: 0.2462 loss_giou_dn_2: 0.6339 loss_class_dn_3: 0.08124 loss_bbox_dn_3: 0.2444 loss_giou_dn_3: 0.6236 loss_class_dn_4: 0.0811 loss_bbox_dn_4: 0.2425 loss_giou_dn_4: 0.6189 time: 1.9860 data_time: 0.1324 lr: 8.75e-05 max_mem: 36619M [03/21 07:36:49] d2.utils.events INFO: eta: 2 days, 0:12:27 iter: 1399 total_loss: 11.53 loss_class: 0.1345 loss_bbox: 0.1554 loss_giou: 0.4294 loss_class_0: 0.1467 loss_bbox_0: 0.1767 loss_giou_0: 0.5096 loss_class_1: 0.1388 loss_bbox_1: 0.168 loss_giou_1: 0.4688 loss_class_2: 0.1342 loss_bbox_2: 0.1599 loss_giou_2: 0.4367 loss_class_3: 0.1321 loss_bbox_3: 0.1586 loss_giou_3: 0.4319 loss_class_4: 0.1312 loss_bbox_4: 0.1564 loss_giou_4: 0.4308 loss_class_enc: 0.1418 loss_bbox_enc: 0.2001 loss_giou_enc: 0.5582 loss_class_dn: 0.08176 loss_bbox_dn: 0.2541 
loss_giou_dn: 0.6105 loss_class_dn_0: 0.09839 loss_bbox_dn_0: 0.3065 loss_giou_dn_0: 0.8378 loss_class_dn_1: 0.08842 loss_bbox_dn_1: 0.2646 loss_giou_dn_1: 0.6786 loss_class_dn_2: 0.08283 loss_bbox_dn_2: 0.2563 loss_giou_dn_2: 0.629 loss_class_dn_3: 0.08044 loss_bbox_dn_3: 0.2548 loss_giou_dn_3: 0.6191 loss_class_dn_4: 0.082 loss_bbox_dn_4: 0.2533 loss_giou_dn_4: 0.6131 time: 1.9860 data_time: 0.1453 lr: 8.75e-05 max_mem: 36619M [03/21 07:38:28] d2.utils.events INFO: eta: 2 days, 0:07:08 iter: 1449 total_loss: 11.04 loss_class: 0.1324 loss_bbox: 0.1416 loss_giou: 0.4046 loss_class_0: 0.1428 loss_bbox_0: 0.163 loss_giou_0: 0.4664 loss_class_1: 0.134 loss_bbox_1: 0.152 loss_giou_1: 0.4333 loss_class_2: 0.1322 loss_bbox_2: 0.1437 loss_giou_2: 0.415 loss_class_3: 0.1294 loss_bbox_3: 0.1427 loss_giou_3: 0.4123 loss_class_4: 0.131 loss_bbox_4: 0.1426 loss_giou_4: 0.4105 loss_class_enc: 0.1405 loss_bbox_enc: 0.1873 loss_giou_enc: 0.5324 loss_class_dn: 0.07947 loss_bbox_dn: 0.221 loss_giou_dn: 0.5871 loss_class_dn_0: 0.09145 loss_bbox_dn_0: 0.2864 loss_giou_dn_0: 0.8024 loss_class_dn_1: 0.08302 loss_bbox_dn_1: 0.2381 loss_giou_dn_1: 0.6559 loss_class_dn_2: 0.07934 loss_bbox_dn_2: 0.2256 loss_giou_dn_2: 0.6096 loss_class_dn_3: 0.07908 loss_bbox_dn_3: 0.2235 loss_giou_dn_3: 0.5976 loss_class_dn_4: 0.07858 loss_bbox_dn_4: 0.2216 loss_giou_dn_4: 0.5878 time: 1.9857 data_time: 0.1189 lr: 8.75e-05 max_mem: 36619M [03/21 07:40:08] d2.utils.events INFO: eta: 2 days, 0:11:48 iter: 1499 total_loss: 10.58 loss_class: 0.1224 loss_bbox: 0.1314 loss_giou: 0.3885 loss_class_0: 0.133 loss_bbox_0: 0.1569 loss_giou_0: 0.4719 loss_class_1: 0.1249 loss_bbox_1: 0.1424 loss_giou_1: 0.4179 loss_class_2: 0.1234 loss_bbox_2: 0.1352 loss_giou_2: 0.3975 loss_class_3: 0.1222 loss_bbox_3: 0.1349 loss_giou_3: 0.3915 loss_class_4: 0.1225 loss_bbox_4: 0.1332 loss_giou_4: 0.3905 loss_class_enc: 0.1371 loss_bbox_enc: 0.1918 loss_giou_enc: 0.5306 loss_class_dn: 0.07592 loss_bbox_dn: 0.214 loss_giou_dn: 0.5565 loss_class_dn_0: 0.08998 loss_bbox_dn_0: 0.2675 loss_giou_dn_0: 0.7697 loss_class_dn_1: 0.08105 loss_bbox_dn_1: 0.2286 loss_giou_dn_1: 0.61 loss_class_dn_2: 0.07715 loss_bbox_dn_2: 0.2184 loss_giou_dn_2: 0.5691 loss_class_dn_3: 0.07649 loss_bbox_dn_3: 0.2155 loss_giou_dn_3: 0.5624 loss_class_dn_4: 0.07561 loss_bbox_dn_4: 0.2143 loss_giou_dn_4: 0.5555 time: 1.9860 data_time: 0.1315 lr: 8.75e-05 max_mem: 36619M [03/21 07:41:48] d2.utils.events INFO: eta: 2 days, 0:09:32 iter: 1549 total_loss: 11.04 loss_class: 0.1368 loss_bbox: 0.1446 loss_giou: 0.4024 loss_class_0: 0.145 loss_bbox_0: 0.1694 loss_giou_0: 0.4818 loss_class_1: 0.1339 loss_bbox_1: 0.1552 loss_giou_1: 0.437 loss_class_2: 0.1338 loss_bbox_2: 0.1493 loss_giou_2: 0.412 loss_class_3: 0.1339 loss_bbox_3: 0.1467 loss_giou_3: 0.4078 loss_class_4: 0.1342 loss_bbox_4: 0.1448 loss_giou_4: 0.4037 loss_class_enc: 0.1407 loss_bbox_enc: 0.1938 loss_giou_enc: 0.5368 loss_class_dn: 0.07754 loss_bbox_dn: 0.2421 loss_giou_dn: 0.5956 loss_class_dn_0: 0.09216 loss_bbox_dn_0: 0.3006 loss_giou_dn_0: 0.7889 loss_class_dn_1: 0.08347 loss_bbox_dn_1: 0.2546 loss_giou_dn_1: 0.6594 loss_class_dn_2: 0.07949 loss_bbox_dn_2: 0.2448 loss_giou_dn_2: 0.6123 loss_class_dn_3: 0.07765 loss_bbox_dn_3: 0.2435 loss_giou_dn_3: 0.6029 loss_class_dn_4: 0.07681 loss_bbox_dn_4: 0.2421 loss_giou_dn_4: 0.5979 time: 1.9864 data_time: 0.1233 lr: 8.75e-05 max_mem: 36619M [03/21 07:43:28] d2.utils.events INFO: eta: 2 days, 0:11:48 iter: 1599 total_loss: 10.67 loss_class: 0.1298 loss_bbox: 0.1392 loss_giou: 0.4023 
loss_class_0: 0.1345 loss_bbox_0: 0.1623 loss_giou_0: 0.4728 loss_class_1: 0.1264 loss_bbox_1: 0.1504 loss_giou_1: 0.4331 loss_class_2: 0.1255 loss_bbox_2: 0.1447 loss_giou_2: 0.4118 loss_class_3: 0.1261 loss_bbox_3: 0.1423 loss_giou_3: 0.4071 loss_class_4: 0.1266 loss_bbox_4: 0.1402 loss_giou_4: 0.4019 loss_class_enc: 0.1332 loss_bbox_enc: 0.1847 loss_giou_enc: 0.5298 loss_class_dn: 0.07832 loss_bbox_dn: 0.2224 loss_giou_dn: 0.5643 loss_class_dn_0: 0.09172 loss_bbox_dn_0: 0.2845 loss_giou_dn_0: 0.774 loss_class_dn_1: 0.08148 loss_bbox_dn_1: 0.2402 loss_giou_dn_1: 0.6255 loss_class_dn_2: 0.07802 loss_bbox_dn_2: 0.2278 loss_giou_dn_2: 0.581 loss_class_dn_3: 0.07772 loss_bbox_dn_3: 0.2249 loss_giou_dn_3: 0.5761 loss_class_dn_4: 0.07826 loss_bbox_dn_4: 0.2227 loss_giou_dn_4: 0.5661 time: 1.9869 data_time: 0.1347 lr: 8.75e-05 max_mem: 36619M [03/21 07:45:08] d2.utils.events INFO: eta: 2 days, 0:12:03 iter: 1649 total_loss: 10.55 loss_class: 0.1304 loss_bbox: 0.1376 loss_giou: 0.3944 loss_class_0: 0.1378 loss_bbox_0: 0.1608 loss_giou_0: 0.4568 loss_class_1: 0.1282 loss_bbox_1: 0.1482 loss_giou_1: 0.417 loss_class_2: 0.1274 loss_bbox_2: 0.1405 loss_giou_2: 0.4013 loss_class_3: 0.1294 loss_bbox_3: 0.1392 loss_giou_3: 0.3967 loss_class_4: 0.1297 loss_bbox_4: 0.1384 loss_giou_4: 0.394 loss_class_enc: 0.1331 loss_bbox_enc: 0.1879 loss_giou_enc: 0.5231 loss_class_dn: 0.07697 loss_bbox_dn: 0.2122 loss_giou_dn: 0.5675 loss_class_dn_0: 0.09165 loss_bbox_dn_0: 0.2946 loss_giou_dn_0: 0.7744 loss_class_dn_1: 0.08155 loss_bbox_dn_1: 0.2307 loss_giou_dn_1: 0.6295 loss_class_dn_2: 0.07786 loss_bbox_dn_2: 0.2177 loss_giou_dn_2: 0.5834 loss_class_dn_3: 0.07674 loss_bbox_dn_3: 0.2149 loss_giou_dn_3: 0.5757 loss_class_dn_4: 0.07724 loss_bbox_dn_4: 0.2125 loss_giou_dn_4: 0.5683 time: 1.9872 data_time: 0.1346 lr: 8.75e-05 max_mem: 36619M [03/21 07:46:48] d2.utils.events INFO: eta: 2 days, 0:08:36 iter: 1699 total_loss: 10.58 loss_class: 0.1299 loss_bbox: 0.1451 loss_giou: 0.4062 loss_class_0: 0.134 loss_bbox_0: 0.1678 loss_giou_0: 0.4664 loss_class_1: 0.1274 loss_bbox_1: 0.1523 loss_giou_1: 0.4288 loss_class_2: 0.1268 loss_bbox_2: 0.1468 loss_giou_2: 0.4141 loss_class_3: 0.1266 loss_bbox_3: 0.1443 loss_giou_3: 0.4111 loss_class_4: 0.1277 loss_bbox_4: 0.1432 loss_giou_4: 0.4083 loss_class_enc: 0.1339 loss_bbox_enc: 0.1841 loss_giou_enc: 0.5189 loss_class_dn: 0.07636 loss_bbox_dn: 0.228 loss_giou_dn: 0.5725 loss_class_dn_0: 0.0906 loss_bbox_dn_0: 0.2785 loss_giou_dn_0: 0.7693 loss_class_dn_1: 0.0813 loss_bbox_dn_1: 0.2397 loss_giou_dn_1: 0.6252 loss_class_dn_2: 0.07767 loss_bbox_dn_2: 0.2316 loss_giou_dn_2: 0.5884 loss_class_dn_3: 0.0762 loss_bbox_dn_3: 0.2293 loss_giou_dn_3: 0.5828 loss_class_dn_4: 0.07618 loss_bbox_dn_4: 0.2269 loss_giou_dn_4: 0.5755 time: 1.9873 data_time: 0.1392 lr: 8.75e-05 max_mem: 36619M [03/21 07:48:24] d2.utils.events INFO: eta: 2 days, 0:00:37 iter: 1749 total_loss: 10.61 loss_class: 0.1266 loss_bbox: 0.1348 loss_giou: 0.3781 loss_class_0: 0.1354 loss_bbox_0: 0.1575 loss_giou_0: 0.4494 loss_class_1: 0.1296 loss_bbox_1: 0.1432 loss_giou_1: 0.4069 loss_class_2: 0.1305 loss_bbox_2: 0.1384 loss_giou_2: 0.3865 loss_class_3: 0.1252 loss_bbox_3: 0.1366 loss_giou_3: 0.3798 loss_class_4: 0.1267 loss_bbox_4: 0.1352 loss_giou_4: 0.379 loss_class_enc: 0.133 loss_bbox_enc: 0.1959 loss_giou_enc: 0.5317 loss_class_dn: 0.07486 loss_bbox_dn: 0.2265 loss_giou_dn: 0.5583 loss_class_dn_0: 0.08962 loss_bbox_dn_0: 0.3042 loss_giou_dn_0: 0.7647 loss_class_dn_1: 0.07882 loss_bbox_dn_1: 0.2449 loss_giou_dn_1: 
0.6119 loss_class_dn_2: 0.07545 loss_bbox_dn_2: 0.2295 loss_giou_dn_2: 0.5708 loss_class_dn_3: 0.07491 loss_bbox_dn_3: 0.2286 loss_giou_dn_3: 0.5636 loss_class_dn_4: 0.07452 loss_bbox_dn_4: 0.2264 loss_giou_dn_4: 0.5588 time: 1.9854 data_time: 0.0939 lr: 8.75e-05 max_mem: 36619M [03/21 07:50:01] d2.utils.events INFO: eta: 2 days, 0:02:38 iter: 1799 total_loss: 10.45 loss_class: 0.1204 loss_bbox: 0.1368 loss_giou: 0.3764 loss_class_0: 0.1284 loss_bbox_0: 0.1592 loss_giou_0: 0.4608 loss_class_1: 0.1238 loss_bbox_1: 0.1478 loss_giou_1: 0.4108 loss_class_2: 0.1228 loss_bbox_2: 0.1403 loss_giou_2: 0.3923 loss_class_3: 0.1199 loss_bbox_3: 0.138 loss_giou_3: 0.3853 loss_class_4: 0.1195 loss_bbox_4: 0.1376 loss_giou_4: 0.3803 loss_class_enc: 0.1324 loss_bbox_enc: 0.1825 loss_giou_enc: 0.5122 loss_class_dn: 0.07415 loss_bbox_dn: 0.222 loss_giou_dn: 0.5603 loss_class_dn_0: 0.09017 loss_bbox_dn_0: 0.2926 loss_giou_dn_0: 0.7645 loss_class_dn_1: 0.07932 loss_bbox_dn_1: 0.2438 loss_giou_dn_1: 0.6164 loss_class_dn_2: 0.07604 loss_bbox_dn_2: 0.229 loss_giou_dn_2: 0.5704 loss_class_dn_3: 0.07482 loss_bbox_dn_3: 0.2262 loss_giou_dn_3: 0.5648 loss_class_dn_4: 0.07327 loss_bbox_dn_4: 0.2225 loss_giou_dn_4: 0.5607 time: 1.9844 data_time: 0.1221 lr: 8.75e-05 max_mem: 36619M [03/21 07:51:39] d2.utils.events INFO: eta: 2 days, 0:01:47 iter: 1849 total_loss: 10.61 loss_class: 0.1215 loss_bbox: 0.1354 loss_giou: 0.4063 loss_class_0: 0.1277 loss_bbox_0: 0.1572 loss_giou_0: 0.464 loss_class_1: 0.1214 loss_bbox_1: 0.1457 loss_giou_1: 0.4404 loss_class_2: 0.1202 loss_bbox_2: 0.138 loss_giou_2: 0.4175 loss_class_3: 0.1183 loss_bbox_3: 0.1364 loss_giou_3: 0.414 loss_class_4: 0.1196 loss_bbox_4: 0.1363 loss_giou_4: 0.4097 loss_class_enc: 0.1266 loss_bbox_enc: 0.178 loss_giou_enc: 0.5309 loss_class_dn: 0.07699 loss_bbox_dn: 0.2203 loss_giou_dn: 0.5954 loss_class_dn_0: 0.09066 loss_bbox_dn_0: 0.2747 loss_giou_dn_0: 0.7782 loss_class_dn_1: 0.08084 loss_bbox_dn_1: 0.2324 loss_giou_dn_1: 0.646 loss_class_dn_2: 0.07711 loss_bbox_dn_2: 0.2226 loss_giou_dn_2: 0.6085 loss_class_dn_3: 0.0757 loss_bbox_dn_3: 0.2207 loss_giou_dn_3: 0.5994 loss_class_dn_4: 0.07658 loss_bbox_dn_4: 0.2201 loss_giou_dn_4: 0.5955 time: 1.9840 data_time: 0.1328 lr: 8.75e-05 max_mem: 36619M [03/21 07:53:18] d2.utils.events INFO: eta: 1 day, 23:59:51 iter: 1899 total_loss: 9.742 loss_class: 0.1162 loss_bbox: 0.1234 loss_giou: 0.355 loss_class_0: 0.127 loss_bbox_0: 0.1478 loss_giou_0: 0.4255 loss_class_1: 0.1207 loss_bbox_1: 0.129 loss_giou_1: 0.3754 loss_class_2: 0.1176 loss_bbox_2: 0.1249 loss_giou_2: 0.3624 loss_class_3: 0.1161 loss_bbox_3: 0.1243 loss_giou_3: 0.3592 loss_class_4: 0.1152 loss_bbox_4: 0.1241 loss_giou_4: 0.3575 loss_class_enc: 0.1263 loss_bbox_enc: 0.1766 loss_giou_enc: 0.5027 loss_class_dn: 0.07288 loss_bbox_dn: 0.2009 loss_giou_dn: 0.5252 loss_class_dn_0: 0.08713 loss_bbox_dn_0: 0.2629 loss_giou_dn_0: 0.7201 loss_class_dn_1: 0.07607 loss_bbox_dn_1: 0.2152 loss_giou_dn_1: 0.572 loss_class_dn_2: 0.07399 loss_bbox_dn_2: 0.2039 loss_giou_dn_2: 0.5367 loss_class_dn_3: 0.07253 loss_bbox_dn_3: 0.2021 loss_giou_dn_3: 0.532 loss_class_dn_4: 0.07237 loss_bbox_dn_4: 0.2009 loss_giou_dn_4: 0.5265 time: 1.9834 data_time: 0.1334 lr: 8.75e-05 max_mem: 36619M [03/21 07:54:56] d2.utils.events INFO: eta: 1 day, 23:58:31 iter: 1949 total_loss: 9.396 loss_class: 0.1109 loss_bbox: 0.1186 loss_giou: 0.3575 loss_class_0: 0.1243 loss_bbox_0: 0.1374 loss_giou_0: 0.4224 loss_class_1: 0.1161 loss_bbox_1: 0.1276 loss_giou_1: 0.3805 loss_class_2: 0.1147 loss_bbox_2: 
0.1204 loss_giou_2: 0.3632 loss_class_3: 0.1115 loss_bbox_3: 0.1194 loss_giou_3: 0.3626 loss_class_4: 0.1103 loss_bbox_4: 0.119 loss_giou_4: 0.357 loss_class_enc: 0.1216 loss_bbox_enc: 0.1648 loss_giou_enc: 0.4832 loss_class_dn: 0.07272 loss_bbox_dn: 0.1959 loss_giou_dn: 0.5162 loss_class_dn_0: 0.0874 loss_bbox_dn_0: 0.2469 loss_giou_dn_0: 0.7063 loss_class_dn_1: 0.07787 loss_bbox_dn_1: 0.2056 loss_giou_dn_1: 0.574 loss_class_dn_2: 0.07427 loss_bbox_dn_2: 0.1987 loss_giou_dn_2: 0.5319 loss_class_dn_3: 0.07316 loss_bbox_dn_3: 0.1977 loss_giou_dn_3: 0.5221 loss_class_dn_4: 0.07328 loss_bbox_dn_4: 0.196 loss_giou_dn_4: 0.5152 time: 1.9832 data_time: 0.1295 lr: 8.75e-05 max_mem: 36619M
[03/21 07:56:37] fvcore.common.checkpoint INFO: Saving checkpoint to ./output/dino_r50_4scale_12ep/model_0001999.pth
[03/21 07:56:38] detectron2 INFO: Run evaluation without EMA.
[03/21 07:56:38] d2.data.datasets.coco WARNING: Category ids in annotations are not in [1, #categories]! We'll apply a mapping for you.
[03/21 07:56:38] d2.data.datasets.coco INFO: Loaded 83 images in COCO format from datasets/corpus/annotations/test.json
[03/21 07:56:38] d2.data.build INFO: Distribution of instances among all 1 categories:
| category   | #instances   |
|:----------:|:-------------|
|   object   | 3516         |
[03/21 07:56:38] d2.data.common INFO: Serializing 83 elements to byte tensors and concatenating them all ...
[03/21 07:56:38] d2.data.common INFO: Serialized dataset takes 1.17 MiB
[03/21 07:56:38] d2.evaluation.evaluator INFO: Start inference on 83 batches
[03/21 07:56:39] d2.evaluation.evaluator INFO: Inference done 11/83. Dataloading: 0.0564 s/iter. Inference: 0.0650 s/iter. Eval: 0.0004 s/iter. Total: 0.1218 s/iter. ETA=0:00:08
[03/21 07:56:45] d2.evaluation.evaluator INFO: Inference done 54/83. Dataloading: 0.0545 s/iter. Inference: 0.0632 s/iter. Eval: 0.0004 s/iter. Total: 0.1181 s/iter. ETA=0:00:03
[03/21 07:56:48] d2.evaluation.evaluator INFO: Total inference time: 0:00:08.986979 (0.115218 s / iter per device, on 1 devices)
[03/21 07:56:48] d2.evaluation.evaluator INFO: Total inference pure compute time: 0:00:04 (0.061954 s / iter per device, on 1 devices)
[03/21 07:56:48] d2.evaluation.coco_evaluation INFO: Preparing results for COCO format ...
[03/21 07:56:48] d2.evaluation.coco_evaluation INFO: Saving results to ./output/dino_r50_4scale_12ep/coco_instances_results.json
[03/21 07:56:48] d2.evaluation.coco_evaluation INFO: Evaluating predictions with unofficial COCO API...
[03/21 07:56:48] d2.evaluation.fast_eval_api INFO: Evaluate annotation type *bbox*
[03/21 07:56:48] d2.evaluation.fast_eval_api INFO: COCOeval_opt.evaluate() finished in 0.12 seconds.
[03/21 07:56:48] d2.evaluation.fast_eval_api INFO: Accumulating evaluation results...
[03/21 07:56:48] d2.evaluation.fast_eval_api INFO: COCOeval_opt.accumulate() finished in 0.01 seconds.
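Each d2.utils.events line above carries the same scalar fields: the per-decoder-layer `loss_class_*`/`loss_bbox_*`/`loss_giou_*` terms, the `_enc` encoder losses, the `_dn` denoising losses, plus `iter`, `time`, `data_time`, `lr`, and `max_mem`. A minimal sketch for recovering them for plotting, assuming this console output was saved to a text file; `parse_events_log` is a hypothetical helper, not a detectron2 API:

```python
# Sketch: collect the scalars from each d2.utils.events line of a saved
# log file into one dict per logged iteration (hypothetical helper).
import re

# Matches "key: value" pairs whose value starts with a digit,
# e.g. "iter: 1999", "total_loss: 10.22", "lr: 8.75e-05".
PAIR = re.compile(r"(\w+): ([0-9][0-9.e+-]*)")

def parse_events_log(path):
    records = []
    with open(path) as f:
        for line in f:
            if "d2.utils.events" not in line:
                continue  # keep only the trainer's metric lines
            # Drop "eta", whose value ("1 day, 23:56:53") is not a scalar.
            rec = {k: float(v) for k, v in PAIR.findall(line) if k != "eta"}
            records.append(rec)
    return records

# e.g. total-loss curve:
# recs = parse_events_log("train.log")
# xs, ys = [r["iter"] for r in recs], [r["total_loss"] for r in recs]
```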
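The checkpoint at iter 1999 triggers the periodic evaluation, and COCOEvaluator writes the raw detections to coco_instances_results.json under the output directory, so the bbox table printed next can be reproduced offline. A minimal sketch, assuming pycocotools is installed and using the ground-truth and results paths named in this log (detectron2 should map predicted category ids back to the original annotation ids before saving, so the remapping warning above ought not to change the numbers):

```python
# Sketch: re-run the bbox evaluation offline with pycocotools, using the
# files named in this log. COCOeval reports AP in [0, 1]; the table below
# shows the same values multiplied by 100.
from pycocotools.coco import COCO
from pycocotools.cocoeval import COCOeval

gt = COCO("datasets/corpus/annotations/test.json")
dt = gt.loadRes("./output/dino_r50_4scale_12ep/coco_instances_results.json")

coco_eval = COCOeval(gt, dt, iouType="bbox")
coco_eval.evaluate()
coco_eval.accumulate()
coco_eval.summarize()  # AP, AP50, AP75, APs, APm, APl
```

The COCOeval_opt in the log is detectron2's C++-accelerated drop-in for COCOeval, so the plain COCOeval above is slower but should report the same metrics.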
[03/21 07:56:48] d2.evaluation.coco_evaluation INFO: Evaluation results for bbox:
|   AP   |  AP50  |  AP75  |  APs   |  APm   |  APl   |
|:------:|:------:|:------:|:------:|:------:|:------:|
| 57.735 | 80.629 | 66.406 | 21.287 | 59.223 | 61.058 |
[03/21 07:56:48] d2.evaluation.testing INFO: copypaste: Task: bbox
[03/21 07:56:48] d2.evaluation.testing INFO: copypaste: AP,AP50,AP75,APs,APm,APl
[03/21 07:56:48] d2.evaluation.testing INFO: copypaste: 57.7351,80.6290,66.4063,21.2871,59.2229,61.0579
[03/21 07:56:48] d2.utils.events INFO: eta: 1 day, 23:56:53 iter: 1999 total_loss: 10.22 loss_class: 0.1237 loss_bbox: 0.1309 loss_giou: 0.3808 loss_class_0: 0.1279 loss_bbox_0: 0.1603 loss_giou_0: 0.4532 loss_class_1: 0.1242 loss_bbox_1: 0.1443 loss_giou_1: 0.4049 loss_class_2: 0.1232 loss_bbox_2: 0.1359 loss_giou_2: 0.3903 loss_class_3: 0.1238 loss_bbox_3: 0.1335 loss_giou_3: 0.3859 loss_class_4: 0.1226 loss_bbox_4: 0.1317 loss_giou_4: 0.3814 loss_class_enc: 0.1288 loss_bbox_enc: 0.189 loss_giou_enc: 0.5189 loss_class_dn: 0.07363 loss_bbox_dn: 0.2227 loss_giou_dn: 0.5558 loss_class_dn_0: 0.08759 loss_bbox_dn_0: 0.289 loss_giou_dn_0: 0.7393 loss_class_dn_1: 0.07783 loss_bbox_dn_1: 0.238 loss_giou_dn_1: 0.6089 loss_class_dn_2: 0.0744 loss_bbox_dn_2: 0.2257 loss_giou_dn_2: 0.5694 loss_class_dn_3: 0.07371 loss_bbox_dn_3: 0.2249 loss_giou_dn_3: 0.5643 loss_class_dn_4: 0.07328 loss_bbox_dn_4: 0.2228 loss_giou_dn_4: 0.557 time: 1.9839 data_time: 0.1238 lr: 8.75e-05 max_mem: 36619M
[03/21 07:58:27] d2.utils.events INFO: eta: 1 day, 23:56:56 iter: 2049 total_loss: 9.655 loss_class: 0.1128 loss_bbox: 0.1208 loss_giou: 0.3781 loss_class_0: 0.1221 loss_bbox_0: 0.1435 loss_giou_0: 0.4455 loss_class_1: 0.1174 loss_bbox_1: 0.1291 loss_giou_1: 0.4013 loss_class_2: 0.1149 loss_bbox_2: 0.1237 loss_giou_2: 0.3833 loss_class_3: 0.1144 loss_bbox_3: 0.1221 loss_giou_3: 0.3807 loss_class_4: 0.11 loss_bbox_4: 0.1207 loss_giou_4: 0.3763 loss_class_enc: 0.1187 loss_bbox_enc: 0.1617 loss_giou_enc: 0.4981 loss_class_dn: 0.07465 loss_bbox_dn: 0.2058 loss_giou_dn: 0.5566 loss_class_dn_0: 0.0886 loss_bbox_dn_0: 0.2742 loss_giou_dn_0: 0.7553 loss_class_dn_1: 0.07867 loss_bbox_dn_1: 0.2213 loss_giou_dn_1: 0.6019 loss_class_dn_2: 0.07558 loss_bbox_dn_2: 0.2089 loss_giou_dn_2: 0.5674 loss_class_dn_3: 0.07453 loss_bbox_dn_3: 0.2072 loss_giou_dn_3: 0.5609 loss_class_dn_4: 0.07445 loss_bbox_dn_4: 0.2052 loss_giou_dn_4: 0.5577 time: 1.9838 data_time: 0.1410 lr: 8.75e-05 max_mem: 36619M
[03/21 08:00:06] d2.utils.events INFO: eta: 1 day, 23:53:37 iter: 2099 total_loss: 9.979 loss_class: 0.1137 loss_bbox: 0.124 loss_giou: 0.3714 loss_class_0: 0.118 loss_bbox_0: 0.1498 loss_giou_0: 0.4416 loss_class_1: 0.1138 loss_bbox_1: 0.1359 loss_giou_1: 0.3971 loss_class_2: 0.1129 loss_bbox_2: 0.1308 loss_giou_2: 0.3831 loss_class_3: 0.1129 loss_bbox_3: 0.1283 loss_giou_3: 0.377 loss_class_4: 0.1132 loss_bbox_4: 0.1256 loss_giou_4: 0.3734 loss_class_enc: 0.125 loss_bbox_enc: 0.171 loss_giou_enc: 0.4951 loss_class_dn: 0.07468 loss_bbox_dn: 0.2173 loss_giou_dn: 0.5429 loss_class_dn_0: 0.08767 loss_bbox_dn_0: 0.282 loss_giou_dn_0: 0.7498 loss_class_dn_1: 0.07845 loss_bbox_dn_1: 0.2352 loss_giou_dn_1: 0.5967 loss_class_dn_2: 0.07669 loss_bbox_dn_2: 0.222 loss_giou_dn_2: 0.5547 loss_class_dn_3: 0.07515 loss_bbox_dn_3: 0.2194 loss_giou_dn_3: 0.5502 loss_class_dn_4: 0.07499 loss_bbox_dn_4: 0.2175 loss_giou_dn_4: 0.5439 time: 1.9834 data_time: 0.1291 lr: 8.75e-05 max_mem: 36619M
[03/21 08:01:46] d2.utils.events INFO: eta: 1 day, 23:53:47 iter: 2149 total_loss: 9.636
loss_class: 0.1122 loss_bbox: 0.1321 loss_giou: 0.3474 loss_class_0: 0.1206 loss_bbox_0: 0.1559 loss_giou_0: 0.4188 loss_class_1: 0.1139 loss_bbox_1: 0.1424 loss_giou_1: 0.3735 loss_class_2: 0.114 loss_bbox_2: 0.1369 loss_giou_2: 0.357 loss_class_3: 0.1099 loss_bbox_3: 0.1358 loss_giou_3: 0.3522 loss_class_4: 0.111 loss_bbox_4: 0.1332 loss_giou_4: 0.3514 loss_class_enc: 0.1276 loss_bbox_enc: 0.1796 loss_giou_enc: 0.472 loss_class_dn: 0.07266 loss_bbox_dn: 0.2103 loss_giou_dn: 0.5129 loss_class_dn_0: 0.08656 loss_bbox_dn_0: 0.2759 loss_giou_dn_0: 0.7014 loss_class_dn_1: 0.0777 loss_bbox_dn_1: 0.2306 loss_giou_dn_1: 0.5671 loss_class_dn_2: 0.07341 loss_bbox_dn_2: 0.215 loss_giou_dn_2: 0.5294 loss_class_dn_3: 0.07239 loss_bbox_dn_3: 0.213 loss_giou_dn_3: 0.5208 loss_class_dn_4: 0.0721 loss_bbox_dn_4: 0.2108 loss_giou_dn_4: 0.5146 time: 1.9841 data_time: 0.1411 lr: 8.75e-05 max_mem: 36619M [03/21 08:03:26] d2.utils.events INFO: eta: 1 day, 23:50:49 iter: 2199 total_loss: 9.771 loss_class: 0.1135 loss_bbox: 0.1198 loss_giou: 0.3533 loss_class_0: 0.1173 loss_bbox_0: 0.1446 loss_giou_0: 0.4304 loss_class_1: 0.1126 loss_bbox_1: 0.1301 loss_giou_1: 0.377 loss_class_2: 0.1124 loss_bbox_2: 0.1238 loss_giou_2: 0.3614 loss_class_3: 0.1135 loss_bbox_3: 0.1217 loss_giou_3: 0.3621 loss_class_4: 0.1133 loss_bbox_4: 0.1206 loss_giou_4: 0.3554 loss_class_enc: 0.1191 loss_bbox_enc: 0.1632 loss_giou_enc: 0.4787 loss_class_dn: 0.07146 loss_bbox_dn: 0.21 loss_giou_dn: 0.5254 loss_class_dn_0: 0.08593 loss_bbox_dn_0: 0.2705 loss_giou_dn_0: 0.7293 loss_class_dn_1: 0.07546 loss_bbox_dn_1: 0.2245 loss_giou_dn_1: 0.5817 loss_class_dn_2: 0.07224 loss_bbox_dn_2: 0.2148 loss_giou_dn_2: 0.5392 loss_class_dn_3: 0.07115 loss_bbox_dn_3: 0.2124 loss_giou_dn_3: 0.5364 loss_class_dn_4: 0.07121 loss_bbox_dn_4: 0.2104 loss_giou_dn_4: 0.5285 time: 1.9842 data_time: 0.1363 lr: 8.75e-05 max_mem: 36619M [03/21 08:05:04] d2.utils.events INFO: eta: 1 day, 23:49:11 iter: 2249 total_loss: 9.462 loss_class: 0.1045 loss_bbox: 0.1225 loss_giou: 0.3371 loss_class_0: 0.1136 loss_bbox_0: 0.1464 loss_giou_0: 0.423 loss_class_1: 0.1052 loss_bbox_1: 0.1315 loss_giou_1: 0.3678 loss_class_2: 0.1035 loss_bbox_2: 0.1238 loss_giou_2: 0.3458 loss_class_3: 0.1035 loss_bbox_3: 0.125 loss_giou_3: 0.3443 loss_class_4: 0.1046 loss_bbox_4: 0.1232 loss_giou_4: 0.3405 loss_class_enc: 0.116 loss_bbox_enc: 0.1744 loss_giou_enc: 0.4875 loss_class_dn: 0.06936 loss_bbox_dn: 0.2182 loss_giou_dn: 0.5188 loss_class_dn_0: 0.08334 loss_bbox_dn_0: 0.2832 loss_giou_dn_0: 0.7087 loss_class_dn_1: 0.07388 loss_bbox_dn_1: 0.2341 loss_giou_dn_1: 0.566 loss_class_dn_2: 0.0702 loss_bbox_dn_2: 0.221 loss_giou_dn_2: 0.5298 loss_class_dn_3: 0.07056 loss_bbox_dn_3: 0.2195 loss_giou_dn_3: 0.525 loss_class_dn_4: 0.07006 loss_bbox_dn_4: 0.2181 loss_giou_dn_4: 0.5204 time: 1.9840 data_time: 0.1337 lr: 8.75e-05 max_mem: 36619M [03/21 08:06:44] d2.utils.events INFO: eta: 1 day, 23:50:47 iter: 2299 total_loss: 9.343 loss_class: 0.1079 loss_bbox: 0.1156 loss_giou: 0.3401 loss_class_0: 0.1155 loss_bbox_0: 0.1344 loss_giou_0: 0.4126 loss_class_1: 0.11 loss_bbox_1: 0.1241 loss_giou_1: 0.3689 loss_class_2: 0.1082 loss_bbox_2: 0.1187 loss_giou_2: 0.3534 loss_class_3: 0.1073 loss_bbox_3: 0.1175 loss_giou_3: 0.3477 loss_class_4: 0.1069 loss_bbox_4: 0.1167 loss_giou_4: 0.3437 loss_class_enc: 0.1145 loss_bbox_enc: 0.1626 loss_giou_enc: 0.4853 loss_class_dn: 0.07099 loss_bbox_dn: 0.1933 loss_giou_dn: 0.5048 loss_class_dn_0: 0.08487 loss_bbox_dn_0: 0.2542 loss_giou_dn_0: 0.7118 loss_class_dn_1: 
0.07428 loss_bbox_dn_1: 0.2097 loss_giou_dn_1: 0.5636 loss_class_dn_2: 0.07151 loss_bbox_dn_2: 0.1988 loss_giou_dn_2: 0.5197 loss_class_dn_3: 0.07071 loss_bbox_dn_3: 0.1966 loss_giou_dn_3: 0.5127 loss_class_dn_4: 0.07105 loss_bbox_dn_4: 0.1938 loss_giou_dn_4: 0.506 time: 1.9843 data_time: 0.1323 lr: 8.75e-05 max_mem: 36619M [03/21 08:08:23] d2.utils.events INFO: eta: 1 day, 23:50:28 iter: 2349 total_loss: 9.6 loss_class: 0.1079 loss_bbox: 0.1212 loss_giou: 0.3584 loss_class_0: 0.1165 loss_bbox_0: 0.1462 loss_giou_0: 0.4312 loss_class_1: 0.1098 loss_bbox_1: 0.1283 loss_giou_1: 0.3836 loss_class_2: 0.108 loss_bbox_2: 0.1233 loss_giou_2: 0.3674 loss_class_3: 0.1078 loss_bbox_3: 0.1212 loss_giou_3: 0.3657 loss_class_4: 0.1057 loss_bbox_4: 0.1208 loss_giou_4: 0.3611 loss_class_enc: 0.1169 loss_bbox_enc: 0.1668 loss_giou_enc: 0.4804 loss_class_dn: 0.06975 loss_bbox_dn: 0.2008 loss_giou_dn: 0.527 loss_class_dn_0: 0.08334 loss_bbox_dn_0: 0.257 loss_giou_dn_0: 0.7168 loss_class_dn_1: 0.07419 loss_bbox_dn_1: 0.2163 loss_giou_dn_1: 0.5746 loss_class_dn_2: 0.07126 loss_bbox_dn_2: 0.2046 loss_giou_dn_2: 0.5421 loss_class_dn_3: 0.06973 loss_bbox_dn_3: 0.2025 loss_giou_dn_3: 0.5372 loss_class_dn_4: 0.06948 loss_bbox_dn_4: 0.201 loss_giou_dn_4: 0.5297 time: 1.9839 data_time: 0.1221 lr: 8.75e-05 max_mem: 36619M [03/21 08:10:03] d2.utils.events INFO: eta: 1 day, 23:50:53 iter: 2399 total_loss: 9.675 loss_class: 0.1019 loss_bbox: 0.1228 loss_giou: 0.3618 loss_class_0: 0.1121 loss_bbox_0: 0.1476 loss_giou_0: 0.4169 loss_class_1: 0.1045 loss_bbox_1: 0.133 loss_giou_1: 0.3737 loss_class_2: 0.1011 loss_bbox_2: 0.1268 loss_giou_2: 0.3668 loss_class_3: 0.1018 loss_bbox_3: 0.1247 loss_giou_3: 0.366 loss_class_4: 0.09988 loss_bbox_4: 0.1236 loss_giou_4: 0.3622 loss_class_enc: 0.1147 loss_bbox_enc: 0.1615 loss_giou_enc: 0.4697 loss_class_dn: 0.07095 loss_bbox_dn: 0.2008 loss_giou_dn: 0.5209 loss_class_dn_0: 0.08427 loss_bbox_dn_0: 0.257 loss_giou_dn_0: 0.7117 loss_class_dn_1: 0.07359 loss_bbox_dn_1: 0.2138 loss_giou_dn_1: 0.5687 loss_class_dn_2: 0.07078 loss_bbox_dn_2: 0.2053 loss_giou_dn_2: 0.5344 loss_class_dn_3: 0.07046 loss_bbox_dn_3: 0.203 loss_giou_dn_3: 0.5294 loss_class_dn_4: 0.07034 loss_bbox_dn_4: 0.2011 loss_giou_dn_4: 0.5218 time: 1.9844 data_time: 0.1367 lr: 8.75e-05 max_mem: 36619M [03/21 08:11:41] d2.utils.events INFO: eta: 1 day, 23:48:10 iter: 2449 total_loss: 9.5 loss_class: 0.115 loss_bbox: 0.1263 loss_giou: 0.3466 loss_class_0: 0.1219 loss_bbox_0: 0.1467 loss_giou_0: 0.4031 loss_class_1: 0.1139 loss_bbox_1: 0.1336 loss_giou_1: 0.3645 loss_class_2: 0.1128 loss_bbox_2: 0.1273 loss_giou_2: 0.352 loss_class_3: 0.1144 loss_bbox_3: 0.1265 loss_giou_3: 0.3519 loss_class_4: 0.1161 loss_bbox_4: 0.1238 loss_giou_4: 0.3471 loss_class_enc: 0.1216 loss_bbox_enc: 0.1666 loss_giou_enc: 0.4653 loss_class_dn: 0.0702 loss_bbox_dn: 0.2127 loss_giou_dn: 0.5134 loss_class_dn_0: 0.08222 loss_bbox_dn_0: 0.2673 loss_giou_dn_0: 0.7084 loss_class_dn_1: 0.07233 loss_bbox_dn_1: 0.2261 loss_giou_dn_1: 0.5711 loss_class_dn_2: 0.07023 loss_bbox_dn_2: 0.2157 loss_giou_dn_2: 0.5281 loss_class_dn_3: 0.07038 loss_bbox_dn_3: 0.2152 loss_giou_dn_3: 0.5217 loss_class_dn_4: 0.07039 loss_bbox_dn_4: 0.2131 loss_giou_dn_4: 0.5145 time: 1.9838 data_time: 0.1479 lr: 8.75e-05 max_mem: 36619M [03/21 08:13:19] d2.utils.events INFO: eta: 1 day, 23:45:49 iter: 2499 total_loss: 9.119 loss_class: 0.1071 loss_bbox: 0.1164 loss_giou: 0.3255 loss_class_0: 0.1142 loss_bbox_0: 0.1299 loss_giou_0: 0.4009 loss_class_1: 0.1085 loss_bbox_1: 0.121 
loss_giou_1: 0.3524 loss_class_2: 0.1079 loss_bbox_2: 0.1188 loss_giou_2: 0.3397 loss_class_3: 0.106 loss_bbox_3: 0.1172 loss_giou_3: 0.3327 loss_class_4: 0.1057 loss_bbox_4: 0.1164 loss_giou_4: 0.3279 loss_class_enc: 0.1178 loss_bbox_enc: 0.1536 loss_giou_enc: 0.4479 loss_class_dn: 0.06972 loss_bbox_dn: 0.1983 loss_giou_dn: 0.4984 loss_class_dn_0: 0.08224 loss_bbox_dn_0: 0.2528 loss_giou_dn_0: 0.6957 loss_class_dn_1: 0.0728 loss_bbox_dn_1: 0.2104 loss_giou_dn_1: 0.5463 loss_class_dn_2: 0.07006 loss_bbox_dn_2: 0.2023 loss_giou_dn_2: 0.511 loss_class_dn_3: 0.06856 loss_bbox_dn_3: 0.2003 loss_giou_dn_3: 0.5049 loss_class_dn_4: 0.06933 loss_bbox_dn_4: 0.1984 loss_giou_dn_4: 0.4991 time: 1.9835 data_time: 0.1247 lr: 8.75e-05 max_mem: 36619M [03/21 08:15:02] d2.utils.events INFO: eta: 1 day, 23:47:09 iter: 2549 total_loss: 9.062 loss_class: 0.1031 loss_bbox: 0.115 loss_giou: 0.3303 loss_class_0: 0.1136 loss_bbox_0: 0.1372 loss_giou_0: 0.3944 loss_class_1: 0.1073 loss_bbox_1: 0.1228 loss_giou_1: 0.352 loss_class_2: 0.1051 loss_bbox_2: 0.117 loss_giou_2: 0.3335 loss_class_3: 0.1048 loss_bbox_3: 0.1161 loss_giou_3: 0.3336 loss_class_4: 0.1035 loss_bbox_4: 0.1147 loss_giou_4: 0.3306 loss_class_enc: 0.1063 loss_bbox_enc: 0.1641 loss_giou_enc: 0.4569 loss_class_dn: 0.06894 loss_bbox_dn: 0.1893 loss_giou_dn: 0.4906 loss_class_dn_0: 0.08304 loss_bbox_dn_0: 0.2555 loss_giou_dn_0: 0.6709 loss_class_dn_1: 0.07371 loss_bbox_dn_1: 0.2048 loss_giou_dn_1: 0.5404 loss_class_dn_2: 0.0712 loss_bbox_dn_2: 0.195 loss_giou_dn_2: 0.5032 loss_class_dn_3: 0.06917 loss_bbox_dn_3: 0.1918 loss_giou_dn_3: 0.4974 loss_class_dn_4: 0.06867 loss_bbox_dn_4: 0.1894 loss_giou_dn_4: 0.4917 time: 1.9847 data_time: 0.1317 lr: 8.75e-05 max_mem: 36619M [03/21 08:16:42] d2.utils.events INFO: eta: 1 day, 23:41:38 iter: 2599 total_loss: 9.086 loss_class: 0.1068 loss_bbox: 0.1125 loss_giou: 0.3291 loss_class_0: 0.1082 loss_bbox_0: 0.136 loss_giou_0: 0.4069 loss_class_1: 0.1042 loss_bbox_1: 0.1224 loss_giou_1: 0.3536 loss_class_2: 0.1058 loss_bbox_2: 0.1167 loss_giou_2: 0.3387 loss_class_3: 0.1095 loss_bbox_3: 0.1146 loss_giou_3: 0.334 loss_class_4: 0.11 loss_bbox_4: 0.1136 loss_giou_4: 0.3317 loss_class_enc: 0.1122 loss_bbox_enc: 0.1598 loss_giou_enc: 0.4655 loss_class_dn: 0.07071 loss_bbox_dn: 0.1911 loss_giou_dn: 0.4864 loss_class_dn_0: 0.08157 loss_bbox_dn_0: 0.2523 loss_giou_dn_0: 0.6956 loss_class_dn_1: 0.07341 loss_bbox_dn_1: 0.2075 loss_giou_dn_1: 0.5385 loss_class_dn_2: 0.07217 loss_bbox_dn_2: 0.1946 loss_giou_dn_2: 0.5001 loss_class_dn_3: 0.07157 loss_bbox_dn_3: 0.193 loss_giou_dn_3: 0.4949 loss_class_dn_4: 0.07213 loss_bbox_dn_4: 0.1911 loss_giou_dn_4: 0.4881 time: 1.9850 data_time: 0.1330 lr: 8.75e-05 max_mem: 36619M [03/21 08:18:20] d2.utils.events INFO: eta: 1 day, 23:38:08 iter: 2649 total_loss: 9.279 loss_class: 0.111 loss_bbox: 0.1166 loss_giou: 0.3311 loss_class_0: 0.1154 loss_bbox_0: 0.1398 loss_giou_0: 0.4011 loss_class_1: 0.1059 loss_bbox_1: 0.1257 loss_giou_1: 0.3557 loss_class_2: 0.1039 loss_bbox_2: 0.1188 loss_giou_2: 0.3393 loss_class_3: 0.1047 loss_bbox_3: 0.1174 loss_giou_3: 0.3363 loss_class_4: 0.1051 loss_bbox_4: 0.1173 loss_giou_4: 0.3321 loss_class_enc: 0.117 loss_bbox_enc: 0.1625 loss_giou_enc: 0.4597 loss_class_dn: 0.06894 loss_bbox_dn: 0.2032 loss_giou_dn: 0.5015 loss_class_dn_0: 0.0819 loss_bbox_dn_0: 0.2642 loss_giou_dn_0: 0.676 loss_class_dn_1: 0.07177 loss_bbox_dn_1: 0.2173 loss_giou_dn_1: 0.5472 loss_class_dn_2: 0.06891 loss_bbox_dn_2: 0.2072 loss_giou_dn_2: 0.5143 loss_class_dn_3: 0.06844 
loss_bbox_dn_3: 0.2054 loss_giou_dn_3: 0.5101 loss_class_dn_4: 0.0688 loss_bbox_dn_4: 0.2034 loss_giou_dn_4: 0.503 time: 1.9849 data_time: 0.1337 lr: 8.75e-05 max_mem: 36619M
[03/21 08:20:00] d2.utils.events INFO: eta: 1 day, 23:38:52 iter: 2699 total_loss: 9.33 loss_class: 0.1079 loss_bbox: 0.1157 loss_giou: 0.3331 loss_class_0: 0.1111 loss_bbox_0: 0.1416 loss_giou_0: 0.4003 loss_class_1: 0.1068 loss_bbox_1: 0.1245 loss_giou_1: 0.3609 loss_class_2: 0.107 loss_bbox_2: 0.1187 loss_giou_2: 0.3429 loss_class_3: 0.1083 loss_bbox_3: 0.1172 loss_giou_3: 0.3367 loss_class_4: 0.1083 loss_bbox_4: 0.1165 loss_giou_4: 0.3333 loss_class_enc: 0.1139 loss_bbox_enc: 0.1567 loss_giou_enc: 0.4488 loss_class_dn: 0.06863 loss_bbox_dn: 0.2087 loss_giou_dn: 0.5096 loss_class_dn_0: 0.08106 loss_bbox_dn_0: 0.2632 loss_giou_dn_0: 0.686 loss_class_dn_1: 0.07206 loss_bbox_dn_1: 0.2178 loss_giou_dn_1: 0.5556 loss_class_dn_2: 0.06968 loss_bbox_dn_2: 0.211 loss_giou_dn_2: 0.5225 loss_class_dn_3: 0.06901 loss_bbox_dn_3: 0.2101 loss_giou_dn_3: 0.5182 loss_class_dn_4: 0.06921 loss_bbox_dn_4: 0.2087 loss_giou_dn_4: 0.511 time: 1.9850 data_time: 0.1316 lr: 8.75e-05 max_mem: 36619M
[03/21 08:21:38] d2.utils.events INFO: eta: 1 day, 23:37:26 iter: 2749 total_loss: 9.269 loss_class: 0.106 loss_bbox: 0.1193 loss_giou: 0.3347 loss_class_0: 0.1104 loss_bbox_0: 0.1475 loss_giou_0: 0.3916 loss_class_1: 0.1055 loss_bbox_1: 0.1262 loss_giou_1: 0.3541 loss_class_2: 0.1024 loss_bbox_2: 0.1217 loss_giou_2: 0.3445 loss_class_3: 0.1029 loss_bbox_3: 0.119 loss_giou_3: 0.3411 loss_class_4: 0.1041 loss_bbox_4: 0.1208 loss_giou_4: 0.337 loss_class_enc: 0.1142 loss_bbox_enc: 0.1582 loss_giou_enc: 0.4319 loss_class_dn: 0.06915 loss_bbox_dn: 0.2012 loss_giou_dn: 0.4996 loss_class_dn_0: 0.08152 loss_bbox_dn_0: 0.2606 loss_giou_dn_0: 0.6687 loss_class_dn_1: 0.07195 loss_bbox_dn_1: 0.2148 loss_giou_dn_1: 0.5425 loss_class_dn_2: 0.06987 loss_bbox_dn_2: 0.2059 loss_giou_dn_2: 0.513 loss_class_dn_3: 0.06879 loss_bbox_dn_3: 0.2033 loss_giou_dn_3: 0.5058 loss_class_dn_4: 0.06872 loss_bbox_dn_4: 0.2014 loss_giou_dn_4: 0.5009 time: 1.9844 data_time: 0.1282 lr: 8.75e-05 max_mem: 36619M
[03/21 08:23:17] d2.utils.events INFO: eta: 1 day, 23:36:18 iter: 2799 total_loss: 8.87 loss_class: 0.1083 loss_bbox: 0.1156 loss_giou: 0.3343 loss_class_0: 0.1169 loss_bbox_0: 0.1339 loss_giou_0: 0.404 loss_class_1: 0.1055 loss_bbox_1: 0.1217 loss_giou_1: 0.359 loss_class_2: 0.1038 loss_bbox_2: 0.1169 loss_giou_2: 0.3405 loss_class_3: 0.1055 loss_bbox_3: 0.1159 loss_giou_3: 0.3395 loss_class_4: 0.1072 loss_bbox_4: 0.116 loss_giou_4: 0.3369 loss_class_enc: 0.1077 loss_bbox_enc: 0.153 loss_giou_enc: 0.4511 loss_class_dn: 0.06718 loss_bbox_dn: 0.1906 loss_giou_dn: 0.5038 loss_class_dn_0: 0.07902 loss_bbox_dn_0: 0.2467 loss_giou_dn_0: 0.6799 loss_class_dn_1: 0.06916 loss_bbox_dn_1: 0.2045 loss_giou_dn_1: 0.5457 loss_class_dn_2: 0.0675 loss_bbox_dn_2: 0.1942 loss_giou_dn_2: 0.5156 loss_class_dn_3: 0.06737 loss_bbox_dn_3: 0.1929 loss_giou_dn_3: 0.506 loss_class_dn_4: 0.06698 loss_bbox_dn_4: 0.191 loss_giou_dn_4: 0.504 time: 1.9845 data_time: 0.1302 lr: 8.75e-05 max_mem: 36619M
[03/21 08:24:55] d2.utils.events INFO: eta: 1 day, 23:35:00 iter: 2849 total_loss: 8.958 loss_class: 0.1055 loss_bbox: 0.1086 loss_giou: 0.3071 loss_class_0: 0.1106 loss_bbox_0: 0.1316 loss_giou_0: 0.382 loss_class_1: 0.1069 loss_bbox_1: 0.1176 loss_giou_1: 0.3383 loss_class_2: 0.1054 loss_bbox_2: 0.111 loss_giou_2: 0.3134 loss_class_3: 0.1058 loss_bbox_3: 0.1099 loss_giou_3: 0.31 loss_class_4: 0.1063 loss_bbox_4: 0.1087 loss_giou_4: 0.3073 loss_class_enc: 0.1125 loss_bbox_enc: 0.1488 loss_giou_enc: 0.432 loss_class_dn: 0.06787 loss_bbox_dn: 0.1913 loss_giou_dn: 0.4638 loss_class_dn_0: 0.08036 loss_bbox_dn_0: 0.2395 loss_giou_dn_0: 0.6517 loss_class_dn_1: 0.07055 loss_bbox_dn_1: 0.2029 loss_giou_dn_1: 0.5219 loss_class_dn_2: 0.0689 loss_bbox_dn_2: 0.1953 loss_giou_dn_2: 0.4824 loss_class_dn_3: 0.06798 loss_bbox_dn_3: 0.1932 loss_giou_dn_3: 0.4757 loss_class_dn_4: 0.06853 loss_bbox_dn_4: 0.1918 loss_giou_dn_4: 0.4678 time: 1.9841 data_time: 0.1259 lr: 8.75e-05 max_mem: 36619M
[03/21 08:26:34] d2.utils.events INFO: eta: 1 day, 23:33:57 iter: 2899 total_loss: 9.763 loss_class: 0.09927 loss_bbox: 0.1266 loss_giou: 0.3501 loss_class_0: 0.1115 loss_bbox_0: 0.1486 loss_giou_0: 0.3971 loss_class_1: 0.1081 loss_bbox_1: 0.1343 loss_giou_1: 0.3633 loss_class_2: 0.1019 loss_bbox_2: 0.1299 loss_giou_2: 0.3572 loss_class_3: 0.09932 loss_bbox_3: 0.1288 loss_giou_3: 0.353 loss_class_4: 0.09862 loss_bbox_4: 0.1274 loss_giou_4: 0.35 loss_class_enc: 0.1082 loss_bbox_enc: 0.1626 loss_giou_enc: 0.4466 loss_class_dn: 0.06777 loss_bbox_dn: 0.2158 loss_giou_dn: 0.519 loss_class_dn_0: 0.08194 loss_bbox_dn_0: 0.2776 loss_giou_dn_0: 0.7079 loss_class_dn_1: 0.07263 loss_bbox_dn_1: 0.2306 loss_giou_dn_1: 0.5689 loss_class_dn_2: 0.06966 loss_bbox_dn_2: 0.2203 loss_giou_dn_2: 0.5333 loss_class_dn_3: 0.06845 loss_bbox_dn_3: 0.2183 loss_giou_dn_3: 0.5303 loss_class_dn_4: 0.06855 loss_bbox_dn_4: 0.2162 loss_giou_dn_4: 0.5208 time: 1.9839 data_time: 0.1318 lr: 8.75e-05 max_mem: 36619M
[03/21 08:28:11] d2.utils.events INFO: eta: 1 day, 23:30:18 iter: 2949 total_loss: 9.062 loss_class: 0.1006 loss_bbox: 0.1144 loss_giou: 0.3212 loss_class_0: 0.1109 loss_bbox_0: 0.1321 loss_giou_0: 0.3771 loss_class_1: 0.1022 loss_bbox_1: 0.1215 loss_giou_1: 0.3432 loss_class_2: 0.1022 loss_bbox_2: 0.1179 loss_giou_2: 0.33 loss_class_3: 0.1023 loss_bbox_3: 0.1154 loss_giou_3: 0.3244 loss_class_4: 0.1014 loss_bbox_4: 0.1145 loss_giou_4: 0.3206 loss_class_enc: 0.1085 loss_bbox_enc: 0.149 loss_giou_enc: 0.4411 loss_class_dn: 0.06883 loss_bbox_dn: 0.1988 loss_giou_dn: 0.4951 loss_class_dn_0: 0.08214 loss_bbox_dn_0: 0.2552 loss_giou_dn_0: 0.678 loss_class_dn_1: 0.0734 loss_bbox_dn_1: 0.2123 loss_giou_dn_1: 0.5401 loss_class_dn_2: 0.06872 loss_bbox_dn_2: 0.2034 loss_giou_dn_2: 0.5077 loss_class_dn_3: 0.06854 loss_bbox_dn_3: 0.2015 loss_giou_dn_3: 0.5017 loss_class_dn_4: 0.06941 loss_bbox_dn_4: 0.1991 loss_giou_dn_4: 0.4964 time: 1.9833 data_time: 0.1095 lr: 8.75e-05 max_mem: 36619M
[03/21 08:29:51] d2.utils.events INFO: eta: 1 day, 23:28:19 iter: 2999 total_loss: 9.092 loss_class: 0.1042 loss_bbox: 0.1131 loss_giou: 0.3373 loss_class_0: 0.1095 loss_bbox_0: 0.1371 loss_giou_0: 0.4035 loss_class_1: 0.1066 loss_bbox_1: 0.1221 loss_giou_1: 0.3569 loss_class_2: 0.1081 loss_bbox_2: 0.1168 loss_giou_2: 0.3444 loss_class_3: 0.1036 loss_bbox_3: 0.1151 loss_giou_3: 0.3382 loss_class_4: 0.1043 loss_bbox_4: 0.1136 loss_giou_4: 0.3394 loss_class_enc: 0.1152 loss_bbox_enc: 0.1521 loss_giou_enc: 0.4459 loss_class_dn: 0.06974 loss_bbox_dn: 0.206 loss_giou_dn: 0.5045 loss_class_dn_0: 0.08262 loss_bbox_dn_0: 0.258 loss_giou_dn_0: 0.6704 loss_class_dn_1: 0.07198 loss_bbox_dn_1: 0.2163 loss_giou_dn_1: 0.5411 loss_class_dn_2: 0.07021 loss_bbox_dn_2: 0.2082 loss_giou_dn_2: 0.5135 loss_class_dn_3: 0.06911 loss_bbox_dn_3: 0.2065 loss_giou_dn_3: 0.5094 loss_class_dn_4: 0.06965 loss_bbox_dn_4: 0.2056 loss_giou_dn_4: 0.5048 time: 1.9836 data_time: 0.1264 lr: 8.75e-05 max_mem: 36619M
[03/21 08:31:31] d2.utils.events INFO: eta: 1 day, 23:26:00 iter: 3049 total_loss: 9.02 loss_class: 0.09686 loss_bbox: 0.1145 loss_giou: 0.317 loss_class_0: 0.1078 loss_bbox_0: 0.1309 loss_giou_0: 0.3791 loss_class_1: 0.1018 loss_bbox_1: 0.1214 loss_giou_1: 0.338 loss_class_2: 0.09738 loss_bbox_2: 0.1179 loss_giou_2: 0.3186 loss_class_3: 0.0968 loss_bbox_3: 0.1167 loss_giou_3: 0.3188 loss_class_4: 0.09685 loss_bbox_4: 0.1146 loss_giou_4: 0.3168 loss_class_enc: 0.1109 loss_bbox_enc: 0.1504 loss_giou_enc: 0.4268 loss_class_dn: 0.06684 loss_bbox_dn: 0.2095 loss_giou_dn: 0.4829 loss_class_dn_0: 0.07891 loss_bbox_dn_0: 0.2626 loss_giou_dn_0: 0.6643 loss_class_dn_1: 0.06929 loss_bbox_dn_1: 0.2213 loss_giou_dn_1: 0.529 loss_class_dn_2: 0.06794 loss_bbox_dn_2: 0.2125 loss_giou_dn_2: 0.4948 loss_class_dn_3: 0.06777 loss_bbox_dn_3: 0.2113 loss_giou_dn_3: 0.4907 loss_class_dn_4: 0.06668 loss_bbox_dn_4: 0.2098 loss_giou_dn_4: 0.4845 time: 1.9838 data_time: 0.1524 lr: 8.75e-05 max_mem: 36619M
[03/21 08:33:15] d2.utils.events INFO: eta: 1 day, 23:29:02 iter: 3099 total_loss: 8.549 loss_class: 0.09448 loss_bbox: 0.1091 loss_giou: 0.313 loss_class_0: 0.105 loss_bbox_0: 0.1262 loss_giou_0: 0.3683 loss_class_1: 0.09813 loss_bbox_1: 0.1147 loss_giou_1: 0.3382 loss_class_2: 0.09574 loss_bbox_2: 0.1089 loss_giou_2: 0.3199 loss_class_3: 0.09412 loss_bbox_3: 0.1083 loss_giou_3: 0.3176 loss_class_4: 0.09377 loss_bbox_4: 0.1087 loss_giou_4: 0.3139 loss_class_enc: 0.1084 loss_bbox_enc: 0.1442 loss_giou_enc: 0.4142 loss_class_dn: 0.06678 loss_bbox_dn: 0.1811 loss_giou_dn: 0.4705 loss_class_dn_0: 0.07797 loss_bbox_dn_0: 0.2411 loss_giou_dn_0: 0.6385 loss_class_dn_1: 0.06942 loss_bbox_dn_1: 0.1978 loss_giou_dn_1: 0.5236 loss_class_dn_2: 0.06732 loss_bbox_dn_2: 0.1854 loss_giou_dn_2: 0.4867 loss_class_dn_3: 0.06551 loss_bbox_dn_3: 0.1838 loss_giou_dn_3: 0.4804 loss_class_dn_4: 0.06582 loss_bbox_dn_4: 0.1816 loss_giou_dn_4: 0.4721 time: 1.9853 data_time: 0.1435 lr: 8.75e-05 max_mem: 36619M
[03/21 08:34:56] d2.utils.events INFO: eta: 1 day, 23:30:15 iter: 3149 total_loss: 8.592 loss_class: 0.09426 loss_bbox: 0.1089 loss_giou: 0.3234 loss_class_0: 0.1052 loss_bbox_0: 0.1263 loss_giou_0: 0.3881 loss_class_1: 0.09716 loss_bbox_1: 0.1154 loss_giou_1: 0.3402 loss_class_2: 0.09539 loss_bbox_2: 0.1111 loss_giou_2: 0.3256 loss_class_3: 0.09422 loss_bbox_3: 0.1102 loss_giou_3: 0.3233 loss_class_4: 0.09352 loss_bbox_4: 0.11 loss_giou_4: 0.3232 loss_class_enc: 0.1065 loss_bbox_enc: 0.1486 loss_giou_enc: 0.4309 loss_class_dn: 0.067 loss_bbox_dn: 0.1955 loss_giou_dn: 0.4866 loss_class_dn_0: 0.07996 loss_bbox_dn_0: 0.2505 loss_giou_dn_0: 0.6597 loss_class_dn_1: 0.06967 loss_bbox_dn_1: 0.2088 loss_giou_dn_1: 0.5251 loss_class_dn_2: 0.06725 loss_bbox_dn_2: 0.1983 loss_giou_dn_2: 0.4971 loss_class_dn_3: 0.0662 loss_bbox_dn_3: 0.1967 loss_giou_dn_3: 0.492 loss_class_dn_4: 0.06655 loss_bbox_dn_4: 0.1954 loss_giou_dn_4: 0.4871 time: 1.9859 data_time: 0.1303 lr: 8.75e-05 max_mem: 37506M
[03/21 08:36:35] d2.utils.events INFO: eta: 1 day, 23:29:14 iter: 3199 total_loss: 8.858 loss_class: 0.1022 loss_bbox: 0.1126 loss_giou: 0.3189 loss_class_0: 0.1065 loss_bbox_0: 0.1353 loss_giou_0: 0.383 loss_class_1: 0.1009 loss_bbox_1: 0.1196 loss_giou_1: 0.3456 loss_class_2: 0.1011 loss_bbox_2: 0.1157 loss_giou_2: 0.3313 loss_class_3: 0.1029 loss_bbox_3: 0.1138 loss_giou_3: 0.3276 loss_class_4: 0.1015 loss_bbox_4: 0.1124 loss_giou_4: 0.3215 loss_class_enc: 0.1084 loss_bbox_enc: 0.1494 loss_giou_enc: 0.4199 loss_class_dn: 0.06669 loss_bbox_dn: 0.1946 loss_giou_dn: 0.4845 loss_class_dn_0: 0.07852 loss_bbox_dn_0: 0.2543 loss_giou_dn_0: 0.6613 loss_class_dn_1: 0.06867 loss_bbox_dn_1: 0.2086 loss_giou_dn_1: 0.532 loss_class_dn_2: 0.06713 loss_bbox_dn_2: 0.1979 loss_giou_dn_2: 0.4987 loss_class_dn_3: 0.06585 loss_bbox_dn_3: 0.1961 loss_giou_dn_3: 0.4924 loss_class_dn_4: 0.06624 loss_bbox_dn_4: 0.1947 loss_giou_dn_4: 0.4866 time: 1.9859 data_time: 0.1410 lr: 8.75e-05 max_mem: 37506M
[03/21 08:38:16] d2.utils.events INFO: eta: 1 day, 23:31:11 iter: 3249 total_loss: 8.705 loss_class: 0.09816 loss_bbox: 0.1161 loss_giou: 0.3139 loss_class_0: 0.1013 loss_bbox_0: 0.132 loss_giou_0: 0.3821 loss_class_1: 0.0966 loss_bbox_1: 0.1225 loss_giou_1: 0.3311 loss_class_2: 0.09641 loss_bbox_2: 0.1182 loss_giou_2: 0.319 loss_class_3: 0.09608 loss_bbox_3: 0.1164 loss_giou_3: 0.3159 loss_class_4: 0.09601 loss_bbox_4: 0.1155 loss_giou_4: 0.3137 loss_class_enc: 0.1031 loss_bbox_enc: 0.1595 loss_giou_enc: 0.4217 loss_class_dn: 0.06646 loss_bbox_dn: 0.1987 loss_giou_dn: 0.4889 loss_class_dn_0: 0.07765 loss_bbox_dn_0: 0.2507 loss_giou_dn_0: 0.644 loss_class_dn_1: 0.06774 loss_bbox_dn_1: 0.2058 loss_giou_dn_1: 0.5266 loss_class_dn_2: 0.06653 loss_bbox_dn_2: 0.1983 loss_giou_dn_2: 0.5007 loss_class_dn_3: 0.06703 loss_bbox_dn_3: 0.1973 loss_giou_dn_3: 0.4959 loss_class_dn_4: 0.06683 loss_bbox_dn_4: 0.1982 loss_giou_dn_4: 0.4913 time: 1.9864 data_time: 0.1332 lr: 8.75e-05 max_mem: 37506M
[03/21 08:39:53] d2.utils.events INFO: eta: 1 day, 23:20:54 iter: 3299 total_loss: 8.413 loss_class: 0.09398 loss_bbox: 0.1023 loss_giou: 0.3037 loss_class_0: 0.0992 loss_bbox_0: 0.1244 loss_giou_0: 0.3707 loss_class_1: 0.09582 loss_bbox_1: 0.1088 loss_giou_1: 0.3299 loss_class_2: 0.09323 loss_bbox_2: 0.1055 loss_giou_2: 0.3144 loss_class_3: 0.09314 loss_bbox_3: 0.1042 loss_giou_3: 0.3093 loss_class_4: 0.09342 loss_bbox_4: 0.1026 loss_giou_4: 0.304 loss_class_enc: 0.1029 loss_bbox_enc: 0.1499 loss_giou_enc: 0.4208 loss_class_dn: 0.06469 loss_bbox_dn: 0.1738 loss_giou_dn: 0.4608 loss_class_dn_0: 0.07709 loss_bbox_dn_0: 0.2409 loss_giou_dn_0: 0.6289 loss_class_dn_1: 0.06746 loss_bbox_dn_1: 0.189 loss_giou_dn_1: 0.5055 loss_class_dn_2: 0.066 loss_bbox_dn_2: 0.1789 loss_giou_dn_2: 0.4742 loss_class_dn_3: 0.06433 loss_bbox_dn_3: 0.1759 loss_giou_dn_3: 0.4667 loss_class_dn_4: 0.06422 loss_bbox_dn_4: 0.1739 loss_giou_dn_4: 0.4616 time: 1.9855 data_time: 0.1308 lr: 8.75e-05 max_mem: 37506M
[03/21 08:41:35] d2.utils.events INFO: eta: 1 day, 23:24:59 iter: 3349 total_loss: 8.662 loss_class: 0.09052 loss_bbox: 0.1083 loss_giou: 0.3067 loss_class_0: 0.1078 loss_bbox_0: 0.1299 loss_giou_0: 0.3592 loss_class_1: 0.09697 loss_bbox_1: 0.1204 loss_giou_1: 0.3374 loss_class_2: 0.09192 loss_bbox_2: 0.1129 loss_giou_2: 0.3225 loss_class_3: 0.08818 loss_bbox_3: 0.1107 loss_giou_3: 0.3178 loss_class_4: 0.08875 loss_bbox_4: 0.1087 loss_giou_4: 0.3089 loss_class_enc: 0.1012 loss_bbox_enc: 0.1515 loss_giou_enc: 0.413 loss_class_dn: 0.06513 loss_bbox_dn: 0.1932 loss_giou_dn: 0.4681 loss_class_dn_0: 0.07791 loss_bbox_dn_0: 0.242 loss_giou_dn_0: 0.6475 loss_class_dn_1: 0.06791 loss_bbox_dn_1: 0.207 loss_giou_dn_1: 0.5205 loss_class_dn_2: 0.066 loss_bbox_dn_2: 0.1992 loss_giou_dn_2: 0.4877 loss_class_dn_3: 0.06602 loss_bbox_dn_3: 0.1966 loss_giou_dn_3: 0.4801 loss_class_dn_4: 0.06512 loss_bbox_dn_4: 0.1934 loss_giou_dn_4: 0.4697 time: 1.9863 data_time: 0.1409 lr: 8.75e-05 max_mem: 37506M
[03/21 08:43:12] d2.utils.events INFO: eta: 1 day, 23:17:15 iter: 3399 total_loss: 8.869 loss_class: 0.09357 loss_bbox: 0.1094 loss_giou: 0.3235 loss_class_0: 0.09988 loss_bbox_0: 0.1347 loss_giou_0: 0.4016 loss_class_1: 0.09961 loss_bbox_1: 0.1194 loss_giou_1: 0.3557 loss_class_2: 0.09463 loss_bbox_2: 0.1145 loss_giou_2: 0.3365 loss_class_3: 0.09128 loss_bbox_3: 0.1131 loss_giou_3: 0.3308 loss_class_4: 0.09196 loss_bbox_4: 0.1112 loss_giou_4: 0.3258 loss_class_enc: 0.1006 loss_bbox_enc: 0.159 loss_giou_enc: 0.4559 loss_class_dn: 0.06518 loss_bbox_dn: 0.1951 loss_giou_dn: 0.489 loss_class_dn_0: 0.07918 loss_bbox_dn_0: 0.2603 loss_giou_dn_0: 0.6612 loss_class_dn_1: 0.06851 loss_bbox_dn_1: 0.2116 loss_giou_dn_1: 0.5324 loss_class_dn_2: 0.06583 loss_bbox_dn_2: 0.2002 loss_giou_dn_2: 0.5002 loss_class_dn_3: 0.06487 loss_bbox_dn_3: 0.1977 loss_giou_dn_3: 0.4945 loss_class_dn_4: 0.06453 loss_bbox_dn_4: 0.1955 loss_giou_dn_4: 0.4897 time: 1.9858 data_time: 0.1422 lr: 8.75e-05 max_mem: 37506M
[03/21 08:44:52] d2.utils.events INFO: eta: 1 day, 23:16:36 iter: 3449 total_loss: 8.439 loss_class: 0.09033 loss_bbox: 0.1041 loss_giou: 0.3044 loss_class_0: 0.1022 loss_bbox_0: 0.1205 loss_giou_0: 0.3589 loss_class_1: 0.09145 loss_bbox_1: 0.1082 loss_giou_1: 0.3207 loss_class_2: 0.09171 loss_bbox_2: 0.1054 loss_giou_2: 0.31 loss_class_3: 0.0896 loss_bbox_3: 0.1047 loss_giou_3: 0.3069 loss_class_4: 0.08939 loss_bbox_4: 0.1049 loss_giou_4: 0.3036 loss_class_enc: 0.0982 loss_bbox_enc: 0.1355 loss_giou_enc: 0.3999 loss_class_dn: 0.06603 loss_bbox_dn: 0.1911 loss_giou_dn: 0.4642 loss_class_dn_0: 0.07828 loss_bbox_dn_0: 0.2421 loss_giou_dn_0: 0.6321 loss_class_dn_1: 0.0692 loss_bbox_dn_1: 0.2017 loss_giou_dn_1: 0.5079 loss_class_dn_2: 0.06675 loss_bbox_dn_2: 0.1943 loss_giou_dn_2: 0.4781 loss_class_dn_3: 0.06586 loss_bbox_dn_3: 0.1922 loss_giou_dn_3: 0.4712 loss_class_dn_4: 0.06588 loss_bbox_dn_4: 0.191 loss_giou_dn_4: 0.4657 time: 1.9860 data_time: 0.1167 lr: 8.75e-05 max_mem: 37506M
[03/21 08:46:31] d2.utils.events INFO: eta: 1 day, 23:13:59 iter: 3499 total_loss: 8.199 loss_class: 0.09238 loss_bbox: 0.09849 loss_giou: 0.2994 loss_class_0: 0.09931 loss_bbox_0: 0.1213 loss_giou_0: 0.3707 loss_class_1: 0.09352 loss_bbox_1: 0.1067 loss_giou_1: 0.32 loss_class_2: 0.0922 loss_bbox_2: 0.102 loss_giou_2: 0.3057 loss_class_3: 0.09259 loss_bbox_3: 0.1007 loss_giou_3: 0.3049 loss_class_4: 0.09262 loss_bbox_4: 0.09942 loss_giou_4: 0.2988 loss_class_enc: 0.1028 loss_bbox_enc: 0.1434 loss_giou_enc: 0.4246 loss_class_dn: 0.06386 loss_bbox_dn: 0.1774 loss_giou_dn: 0.4459 loss_class_dn_0: 0.07713 loss_bbox_dn_0: 0.2352 loss_giou_dn_0: 0.6322 loss_class_dn_1: 0.06752 loss_bbox_dn_1: 0.1885 loss_giou_dn_1: 0.483 loss_class_dn_2: 0.06413 loss_bbox_dn_2: 0.18 loss_giou_dn_2: 0.4584 loss_class_dn_3: 0.06376 loss_bbox_dn_3: 0.1787 loss_giou_dn_3: 0.4527 loss_class_dn_4: 0.06412 loss_bbox_dn_4: 0.177 loss_giou_dn_4: 0.447 time: 1.9856 data_time: 0.1291 lr: 8.75e-05 max_mem: 37506M
[03/21 08:48:11] d2.utils.events INFO: eta: 1 day, 23:09:25 iter: 3549 total_loss: 8.649 loss_class: 0.08886 loss_bbox: 0.1044 loss_giou: 0.3097 loss_class_0: 0.09839 loss_bbox_0: 0.1267 loss_giou_0: 0.3686 loss_class_1: 0.09194 loss_bbox_1: 0.1125 loss_giou_1: 0.3312 loss_class_2: 0.09004 loss_bbox_2: 0.107 loss_giou_2: 0.3174 loss_class_3: 0.08834 loss_bbox_3: 0.1051 loss_giou_3: 0.318 loss_class_4: 0.08824 loss_bbox_4: 0.104 loss_giou_4: 0.3124 loss_class_enc: 0.1016 loss_bbox_enc: 0.1369 loss_giou_enc: 0.4157 loss_class_dn: 0.06446 loss_bbox_dn: 0.1839 loss_giou_dn: 0.4771 loss_class_dn_0: 0.07573 loss_bbox_dn_0: 0.2289 loss_giou_dn_0: 0.6475 loss_class_dn_1: 0.06726 loss_bbox_dn_1: 0.1934 loss_giou_dn_1: 0.5272 loss_class_dn_2: 0.06469 loss_bbox_dn_2: 0.1874 loss_giou_dn_2: 0.491 loss_class_dn_3: 0.06453 loss_bbox_dn_3: 0.1854 loss_giou_dn_3: 0.4861 loss_class_dn_4: 0.06447 loss_bbox_dn_4: 0.1841 loss_giou_dn_4: 0.4791 time: 1.9858 data_time: 0.1292 lr: 8.75e-05 max_mem: 37797M
[03/21 08:49:52] d2.utils.events INFO: eta: 1 day, 23:08:39 iter: 3599 total_loss: 8.322 loss_class: 0.09431 loss_bbox: 0.107 loss_giou: 0.2988 loss_class_0: 0.0999 loss_bbox_0: 0.1299 loss_giou_0: 0.3638 loss_class_1: 0.09512 loss_bbox_1: 0.1147 loss_giou_1: 0.322 loss_class_2: 0.0933 loss_bbox_2: 0.1102 loss_giou_2: 0.3068 loss_class_3: 0.09227 loss_bbox_3: 0.1088 loss_giou_3: 0.3049 loss_class_4: 0.09264 loss_bbox_4: 0.1075 loss_giou_4: 0.3023 loss_class_enc: 0.1096 loss_bbox_enc: 0.1423 loss_giou_enc: 0.4049 loss_class_dn: 0.0643 loss_bbox_dn: 0.1891 loss_giou_dn: 0.4505 loss_class_dn_0: 0.07747 loss_bbox_dn_0: 0.2437 loss_giou_dn_0: 0.6371 loss_class_dn_1: 0.06806 loss_bbox_dn_1: 0.2028 loss_giou_dn_1: 0.4883 loss_class_dn_2: 0.06646 loss_bbox_dn_2: 0.1932 loss_giou_dn_2: 0.4622 loss_class_dn_3: 0.06635 loss_bbox_dn_3: 0.1911 loss_giou_dn_3: 0.4561 loss_class_dn_4: 0.06468 loss_bbox_dn_4: 0.1891 loss_giou_dn_4: 0.451 time: 1.9863 data_time: 0.1219 lr: 8.75e-05 max_mem: 37797M
[03/21 08:51:33] d2.utils.events INFO: eta: 1 day, 23:07:36 iter: 3649 total_loss: 8.563 loss_class: 0.09649 loss_bbox: 0.1077 loss_giou: 0.3102 loss_class_0: 0.1022 loss_bbox_0: 0.1295 loss_giou_0: 0.366 loss_class_1: 0.1044 loss_bbox_1: 0.1177 loss_giou_1: 0.3272 loss_class_2: 0.1018 loss_bbox_2: 0.1112 loss_giou_2: 0.3173 loss_class_3: 0.09575 loss_bbox_3: 0.1093 loss_giou_3: 0.3169 loss_class_4: 0.09629 loss_bbox_4: 0.1085 loss_giou_4: 0.3093 loss_class_enc: 0.1045 loss_bbox_enc: 0.1451 loss_giou_enc: 0.4291 loss_class_dn: 0.06445 loss_bbox_dn: 0.189 loss_giou_dn: 0.4738 loss_class_dn_0: 0.07485 loss_bbox_dn_0: 0.2445 loss_giou_dn_0: 0.6353 loss_class_dn_1: 0.06617 loss_bbox_dn_1: 0.2049 loss_giou_dn_1: 0.5133 loss_class_dn_2: 0.06389 loss_bbox_dn_2: 0.1956 loss_giou_dn_2: 0.484 loss_class_dn_3: 0.06344 loss_bbox_dn_3: 0.193 loss_giou_dn_3: 0.4797 loss_class_dn_4: 0.06346 loss_bbox_dn_4: 0.1894 loss_giou_dn_4: 0.4754 time: 1.9870 data_time: 0.1355 lr: 8.75e-05 max_mem: 37797M
[03/21 08:53:15] d2.utils.events INFO: eta: 1 day, 23:07:53 iter: 3699 total_loss: 8.059 loss_class: 0.08114 loss_bbox: 0.09902 loss_giou: 0.2938 loss_class_0: 0.09276 loss_bbox_0: 0.1201 loss_giou_0: 0.3537 loss_class_1: 0.08496 loss_bbox_1: 0.1068 loss_giou_1: 0.3117 loss_class_2: 0.08374 loss_bbox_2: 0.101 loss_giou_2: 0.2981 loss_class_3: 0.08158 loss_bbox_3: 0.1009 loss_giou_3: 0.2979 loss_class_4: 0.08214 loss_bbox_4: 0.09946 loss_giou_4: 0.2937 loss_class_enc: 0.09716 loss_bbox_enc: 0.1342 loss_giou_enc: 0.4015 loss_class_dn: 0.06125 loss_bbox_dn: 0.1764 loss_giou_dn: 0.4552 loss_class_dn_0: 0.07375 loss_bbox_dn_0: 0.2441 loss_giou_dn_0: 0.6276 loss_class_dn_1: 0.06449 loss_bbox_dn_1: 0.1898 loss_giou_dn_1: 0.4963 loss_class_dn_2: 0.06151 loss_bbox_dn_2: 0.1792 loss_giou_dn_2: 0.467 loss_class_dn_3: 0.06059 loss_bbox_dn_3: 0.1784 loss_giou_dn_3: 0.4633 loss_class_dn_4: 0.06043 loss_bbox_dn_4: 0.1764 loss_giou_dn_4: 0.4566 time: 1.9877 data_time: 0.1323 lr: 8.75e-05 max_mem: 37797M
[03/21 08:54:56] d2.utils.events INFO: eta: 1 day, 23:13:47 iter: 3749 total_loss: 8.919 loss_class: 0.09483 loss_bbox: 0.1119 loss_giou: 0.3092 loss_class_0: 0.1045 loss_bbox_0: 0.1346 loss_giou_0: 0.3718 loss_class_1: 0.0969 loss_bbox_1: 0.1207 loss_giou_1: 0.3392 loss_class_2: 0.09742 loss_bbox_2: 0.113 loss_giou_2: 0.3265 loss_class_3: 0.09399 loss_bbox_3: 0.1141 loss_giou_3: 0.3202 loss_class_4: 0.0941 loss_bbox_4: 0.1131 loss_giou_4: 0.3113 loss_class_enc: 0.1016 loss_bbox_enc: 0.15 loss_giou_enc: 0.4126 loss_class_dn: 0.06581 loss_bbox_dn: 0.1963 loss_giou_dn: 0.4717 loss_class_dn_0: 0.07812 loss_bbox_dn_0: 0.2525 loss_giou_dn_0: 0.6449 loss_class_dn_1: 0.06885 loss_bbox_dn_1: 0.2049 loss_giou_dn_1: 0.5229 loss_class_dn_2: 0.06617 loss_bbox_dn_2: 0.1976 loss_giou_dn_2: 0.4927 loss_class_dn_3: 0.06566 loss_bbox_dn_3: 0.1961 loss_giou_dn_3: 0.484 loss_class_dn_4: 0.06578 loss_bbox_dn_4: 0.1963 loss_giou_dn_4: 0.473 time: 1.9881 data_time: 0.1258 lr: 8.75e-05 max_mem: 37797M
[03/21 08:56:38] d2.utils.events INFO: eta: 1 day, 23:14:51 iter: 3799 total_loss: 8.145 loss_class: 0.09252 loss_bbox: 0.09925 loss_giou: 0.2978 loss_class_0: 0.09935 loss_bbox_0: 0.1213 loss_giou_0: 0.3523 loss_class_1: 0.09263 loss_bbox_1: 0.1074 loss_giou_1: 0.3192 loss_class_2: 0.09165 loss_bbox_2: 0.1012 loss_giou_2: 0.304 loss_class_3: 0.09007 loss_bbox_3: 0.1003 loss_giou_3: 0.3015 loss_class_4: 0.0907 loss_bbox_4: 0.09879 loss_giou_4: 0.3013 loss_class_enc: 0.09885 loss_bbox_enc: 0.15 loss_giou_enc: 0.4243 loss_class_dn: 0.06435 loss_bbox_dn: 0.1884 loss_giou_dn: 0.4454 loss_class_dn_0: 0.07502 loss_bbox_dn_0: 0.2453 loss_giou_dn_0: 0.6196 loss_class_dn_1: 0.06633 loss_bbox_dn_1: 0.204 loss_giou_dn_1: 0.4924 loss_class_dn_2: 0.0638 loss_bbox_dn_2: 0.1932 loss_giou_dn_2: 0.4614 loss_class_dn_3: 0.06294 loss_bbox_dn_3: 0.191 loss_giou_dn_3: 0.4559 loss_class_dn_4: 0.06325 loss_bbox_dn_4: 0.1885 loss_giou_dn_4: 0.4477 time: 1.9886 data_time: 0.1546 lr: 8.75e-05 max_mem: 37797M
[03/21 08:58:21] d2.utils.events INFO: eta: 1 day, 23:21:27 iter: 3849 total_loss: 8.287 loss_class: 0.0906 loss_bbox: 0.101 loss_giou: 0.2975 loss_class_0: 0.09751 loss_bbox_0: 0.1173 loss_giou_0: 0.354 loss_class_1: 0.09181 loss_bbox_1: 0.1096 loss_giou_1: 0.314 loss_class_2: 0.09022 loss_bbox_2: 0.1053 loss_giou_2: 0.3026 loss_class_3: 0.08837 loss_bbox_3: 0.1036 loss_giou_3: 0.2998 loss_class_4: 0.08848 loss_bbox_4: 0.1026 loss_giou_4: 0.2973 loss_class_enc: 0.09963 loss_bbox_enc: 0.1335 loss_giou_enc: 0.3909 loss_class_dn: 0.06192 loss_bbox_dn: 0.1823 loss_giou_dn: 0.4546 loss_class_dn_0: 0.0738 loss_bbox_dn_0: 0.2316 loss_giou_dn_0: 0.6134 loss_class_dn_1: 0.06502 loss_bbox_dn_1: 0.1932 loss_giou_dn_1: 0.4934 loss_class_dn_2: 0.06168 loss_bbox_dn_2: 0.1859 loss_giou_dn_2: 0.4636 loss_class_dn_3: 0.06206 loss_bbox_dn_3: 0.1833 loss_giou_dn_3: 0.4608 loss_class_dn_4: 0.06163 loss_bbox_dn_4: 0.1824 loss_giou_dn_4: 0.4573 time: 1.9897 data_time: 0.1470 lr: 8.75e-05 max_mem: 37797M
[03/21 09:00:03] d2.utils.events INFO: eta: 1 day, 23:26:01 iter: 3899 total_loss: 7.369 loss_class: 0.07692 loss_bbox: 0.08628 loss_giou: 0.2775 loss_class_0: 0.08952 loss_bbox_0: 0.1021 loss_giou_0: 0.3317 loss_class_1: 0.07878 loss_bbox_1: 0.08956 loss_giou_1: 0.298 loss_class_2: 0.0784 loss_bbox_2: 0.08685 loss_giou_2: 0.2848 loss_class_3: 0.07794 loss_bbox_3: 0.087 loss_giou_3: 0.283 loss_class_4: 0.07685 loss_bbox_4: 0.08638 loss_giou_4: 0.2786 loss_class_enc: 0.08652 loss_bbox_enc: 0.1158 loss_giou_enc: 0.3687 loss_class_dn: 0.06052 loss_bbox_dn: 0.1465 loss_giou_dn: 0.4238 loss_class_dn_0: 0.07238 loss_bbox_dn_0: 0.2027 loss_giou_dn_0: 0.5825 loss_class_dn_1: 0.06311 loss_bbox_dn_1: 0.1591 loss_giou_dn_1: 0.4624 loss_class_dn_2: 0.06072 loss_bbox_dn_2: 0.1485 loss_giou_dn_2: 0.44 loss_class_dn_3: 0.06085 loss_bbox_dn_3: 0.1473 loss_giou_dn_3: 0.432 loss_class_dn_4: 0.06083 loss_bbox_dn_4: 0.1469 loss_giou_dn_4: 0.424 time: 1.9904 data_time: 0.1041 lr: 8.75e-05 max_mem: 37797M
[03/21 09:01:46] d2.utils.events INFO: eta: 1 day, 23:36:22 iter: 3949 total_loss: 8.038 loss_class: 0.08514 loss_bbox: 0.1028 loss_giou: 0.2915 loss_class_0: 0.09387 loss_bbox_0: 0.1198 loss_giou_0: 0.3506 loss_class_1: 0.0862 loss_bbox_1: 0.1096 loss_giou_1: 0.3137 loss_class_2: 0.08567 loss_bbox_2: 0.1047 loss_giou_2: 0.3025 loss_class_3: 0.08482 loss_bbox_3: 0.1033 loss_giou_3: 0.3001 loss_class_4: 0.08376 loss_bbox_4: 0.1035 loss_giou_4: 0.2933 loss_class_enc: 0.09626 loss_bbox_enc: 0.129 loss_giou_enc: 0.3843 loss_class_dn: 0.06396 loss_bbox_dn: 0.1767 loss_giou_dn: 0.4554 loss_class_dn_0: 0.07379 loss_bbox_dn_0: 0.24 loss_giou_dn_0: 0.615 loss_class_dn_1: 0.06715 loss_bbox_dn_1: 0.1907 loss_giou_dn_1: 0.5018 loss_class_dn_2: 0.0643 loss_bbox_dn_2: 0.1806 loss_giou_dn_2: 0.4701 loss_class_dn_3: 0.06329 loss_bbox_dn_3: 0.1781 loss_giou_dn_3: 0.4639 loss_class_dn_4: 0.06357 loss_bbox_dn_4: 0.1765 loss_giou_dn_4: 0.4567 time: 1.9912 data_time: 0.1052 lr: 8.75e-05 max_mem: 37797M
[03/21 09:03:28] fvcore.common.checkpoint INFO: Saving checkpoint to ./output/dino_r50_4scale_12ep/model_0003999.pth
[03/21 09:03:29] detectron2 INFO: Run evaluation without EMA.
[03/21 09:03:29] d2.data.datasets.coco WARNING: Category ids in annotations are not in [1, #categories]! We'll apply a mapping for you.
[03/21 09:03:29] d2.data.datasets.coco INFO: Loaded 83 images in COCO format from datasets/corpus/annotations/test.json
[03/21 09:03:29] d2.data.common INFO: Serializing 83 elements to byte tensors and concatenating them all ...
[03/21 09:03:29] d2.data.common INFO: Serialized dataset takes 1.17 MiB
[03/21 09:03:29] d2.evaluation.evaluator INFO: Start inference on 83 batches
[03/21 09:03:30] d2.evaluation.evaluator INFO: Inference done 11/83. Dataloading: 0.0524 s/iter. Inference: 0.0609 s/iter. Eval: 0.0005 s/iter. Total: 0.1138 s/iter. ETA=0:00:08
[03/21 09:03:35] d2.evaluation.evaluator INFO: Inference done 55/83. Dataloading: 0.0525 s/iter. Inference: 0.0619 s/iter. Eval: 0.0004 s/iter. Total: 0.1149 s/iter. ETA=0:00:03
[03/21 09:03:38] d2.evaluation.evaluator INFO: Total inference time: 0:00:08.866202 (0.113669 s / iter per device, on 1 devices)
[03/21 09:03:38] d2.evaluation.evaluator INFO: Total inference pure compute time: 0:00:04 (0.062106 s / iter per device, on 1 devices)
[03/21 09:03:38] d2.evaluation.coco_evaluation INFO: Preparing results for COCO format ...
[03/21 09:03:38] d2.evaluation.coco_evaluation INFO: Saving results to ./output/dino_r50_4scale_12ep/coco_instances_results.json
[03/21 09:03:39] d2.evaluation.coco_evaluation INFO: Evaluating predictions with unofficial COCO API...
[03/21 09:03:39] d2.evaluation.fast_eval_api INFO: Evaluate annotation type *bbox*
[03/21 09:03:39] d2.evaluation.fast_eval_api INFO: COCOeval_opt.evaluate() finished in 0.32 seconds.
[03/21 09:03:39] d2.evaluation.fast_eval_api INFO: Accumulating evaluation results...
[03/21 09:03:39] d2.evaluation.fast_eval_api INFO: COCOeval_opt.accumulate() finished in 0.01 seconds.
[03/21 09:03:39] d2.evaluation.coco_evaluation INFO: Evaluation results for bbox:
|   AP   |  AP50  |  AP75  |  APs   |  APm   |  APl   |
|:------:|:------:|:------:|:------:|:------:|:------:|
| 65.875 | 87.872 | 74.959 | 26.922 | 66.299 | 72.306 |
[03/21 09:03:39] d2.evaluation.testing INFO: copypaste: Task: bbox
[03/21 09:03:39] d2.evaluation.testing INFO: copypaste: AP,AP50,AP75,APs,APm,APl
[03/21 09:03:39] d2.evaluation.testing INFO: copypaste: 65.8750,87.8723,74.9586,26.9217,66.2989,72.3059
[03/21 09:03:39] d2.utils.events INFO: eta: 1 day, 23:35:24 iter: 3999 total_loss: 8.031 loss_class: 0.08332 loss_bbox: 0.1008 loss_giou: 0.2752 loss_class_0: 0.0937 loss_bbox_0: 0.116 loss_giou_0: 0.3248 loss_class_1: 0.0877 loss_bbox_1: 0.106 loss_giou_1: 0.2944 loss_class_2: 0.08725 loss_bbox_2: 0.1031 loss_giou_2: 0.2805 loss_class_3: 0.0858 loss_bbox_3: 0.1015 loss_giou_3: 0.2791 loss_class_4: 0.08411 loss_bbox_4: 0.1012 loss_giou_4: 0.2769 loss_class_enc: 0.1004 loss_bbox_enc: 0.128 loss_giou_enc: 0.3658 loss_class_dn: 0.05947 loss_bbox_dn: 0.1912 loss_giou_dn: 0.4312 loss_class_dn_0: 0.07317 loss_bbox_dn_0: 0.2521 loss_giou_dn_0: 0.6005 loss_class_dn_1: 0.06306 loss_bbox_dn_1: 0.2064 loss_giou_dn_1: 0.4702 loss_class_dn_2: 0.06036 loss_bbox_dn_2: 0.1959 loss_giou_dn_2: 0.4398 loss_class_dn_3: 0.05965 loss_bbox_dn_3: 0.1939 loss_giou_dn_3: 0.4372 loss_class_dn_4: 0.05916 loss_bbox_dn_4: 0.1916 loss_giou_dn_4: 0.4316 time: 1.9917 data_time: 0.1420 lr: 8.75e-05 max_mem: 37797M
[03/21 09:05:22] d2.utils.events INFO: eta: 1 day, 23:41:11 iter: 4049 total_loss: 8.291 loss_class: 0.08868 loss_bbox: 0.1019 loss_giou: 0.3037 loss_class_0: 0.09511 loss_bbox_0: 0.1196 loss_giou_0: 0.3593 loss_class_1: 0.09071 loss_bbox_1: 0.1091 loss_giou_1: 0.325 loss_class_2: 0.08956 loss_bbox_2: 0.1024 loss_giou_2: 0.3085 loss_class_3: 0.08888 loss_bbox_3: 0.103 loss_giou_3: 0.3077 loss_class_4: 0.08889 loss_bbox_4: 0.1025 loss_giou_4: 0.3052 loss_class_enc: 0.1041 loss_bbox_enc: 0.1393 loss_giou_enc: 0.3978 loss_class_dn: 0.06432 loss_bbox_dn: 0.1836 loss_giou_dn: 0.4789 loss_class_dn_0: 0.07485 loss_bbox_dn_0: 0.2354 loss_giou_dn_0: 0.6209 loss_class_dn_1: 0.06706 loss_bbox_dn_1: 0.1979 loss_giou_dn_1: 0.5126 loss_class_dn_2: 0.06559 loss_bbox_dn_2: 0.1865 loss_giou_dn_2: 0.4859 loss_class_dn_3: 0.06488 loss_bbox_dn_3: 0.1847 loss_giou_dn_3: 0.4831 loss_class_dn_4: 0.06478 loss_bbox_dn_4: 0.1838 loss_giou_dn_4: 0.4783 time: 1.9925 data_time: 0.1423 lr: 8.75e-05 max_mem: 37797M
[03/21 09:05:51] d2.engine.hooks INFO: Overall training speed: 4062 iterations in 2:14:54 (1.9928 s / it)
[03/21 09:05:51] d2.engine.hooks INFO: Total training time: 2:15:18 (0:00:23 on hooks)
[03/21 09:05:51] d2.utils.events INFO: eta: 1 day, 23:41:09 iter: 4064 total_loss: 7.713 loss_class: 0.08595 loss_bbox: 0.09989 loss_giou: 0.2754 loss_class_0: 0.0931 loss_bbox_0: 0.1141 loss_giou_0: 0.3275 loss_class_1: 0.08866 loss_bbox_1: 0.1042 loss_giou_1: 0.2949 loss_class_2: 0.08542 loss_bbox_2: 0.1002 loss_giou_2: 0.2838 loss_class_3: 0.08571 loss_bbox_3: 0.09974 loss_giou_3: 0.2811 loss_class_4: 0.08572 loss_bbox_4: 0.09909 loss_giou_4: 0.2773 loss_class_enc: 0.09943 loss_bbox_enc: 0.1333 loss_giou_enc: 0.3657 loss_class_dn: 0.06147 loss_bbox_dn: 0.182 loss_giou_dn: 0.4216 loss_class_dn_0: 0.07321 loss_bbox_dn_0: 0.2368 loss_giou_dn_0: 0.5977 loss_class_dn_1: 0.06375 loss_bbox_dn_1: 0.1943 loss_giou_dn_1: 0.4654 loss_class_dn_2: 0.06166 loss_bbox_dn_2: 0.1846 loss_giou_dn_2: 0.435 loss_class_dn_3: 0.06163 loss_bbox_dn_3: 0.1831 loss_giou_dn_3: 0.4267 loss_class_dn_4: 0.06141 loss_bbox_dn_4: 0.1821 loss_giou_dn_4: 0.4216 time: 1.9926 data_time: 0.1459 lr: 8.75e-05 max_mem: 37797M
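
The periodic d2.utils.events records above follow a fixed "key: value" layout, and the "copypaste:" lines emitted after each evaluation round are machine-readable by design, so a log like this can be reduced to loss/AP summaries with a few lines of Python. The sketch below is illustrative only: it assumes the log has been saved to a file (the name "log.txt" is hypothetical) with one record per line, and it relies solely on the "iter: ... total_loss: ..." and numeric "copypaste:" formats visible above.

import re

# Matches the per-iteration d2.utils.events records, e.g.
#   "... iter: 3999 total_loss: 8.031 ..."
ITER_RE = re.compile(r"iter: (\d+) total_loss: ([\d.]+)")
# Matches the numeric d2.evaluation.testing row, e.g.
#   "... copypaste: 65.8750,87.8723,74.9586,26.9217,66.2989,72.3059"
AP_RE = re.compile(r"copypaste: ([\d.]+(?:,[\d.]+){5})\s*$")

losses, evals = [], []
with open("log.txt") as f:  # hypothetical saved copy of this log
    for line in f:
        m = ITER_RE.search(line)
        if m:
            losses.append((int(m.group(1)), float(m.group(2))))
        m = AP_RE.search(line)
        if m:
            # Column order printed by d2.evaluation.testing: AP, AP50, AP75, APs, APm, APl
            evals.append([float(x) for x in m.group(1).split(",")])

print(f"{len(losses)} loss records, {len(evals)} evaluation rounds")
if losses:
    it, tl = losses[-1]
    print(f"latest: iter={it} total_loss={tl}")

On this excerpt the parser would report one evaluation round (the checkpoint at iteration 3999, bbox AP 65.875) and a total_loss that falls from 9.33 at iteration 2699 to 7.713 at iteration 4064.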