2020·0318·22:30
----------------------------------------------------------------
        Layer (type)               Output Shape       Param #
================================================================
            Conv2d-1         [-1, 64, 14, 14]         3,136
       BatchNorm2d-2         [-1, 64, 14, 14]           128
              ReLU-3         [-1, 64, 14, 14]             0
         MaxPool2d-4           [-1, 64, 7, 7]             0
            Conv2d-5           [-1, 64, 7, 7]        36,864
       BatchNorm2d-6           [-1, 64, 7, 7]           128
              ReLU-7           [-1, 64, 7, 7]             0
            Conv2d-8           [-1, 64, 7, 7]        36,864
       BatchNorm2d-9           [-1, 64, 7, 7]           128
             ReLU-10           [-1, 64, 7, 7]             0
       BasicBlock-11           [-1, 64, 7, 7]             0
           Conv2d-12           [-1, 64, 7, 7]        36,864
      BatchNorm2d-13           [-1, 64, 7, 7]           128
             ReLU-14           [-1, 64, 7, 7]             0
           Conv2d-15           [-1, 64, 7, 7]        36,864
      BatchNorm2d-16           [-1, 64, 7, 7]           128
             ReLU-17           [-1, 64, 7, 7]             0
       BasicBlock-18           [-1, 64, 7, 7]             0
           Conv2d-19          [-1, 128, 4, 4]        73,728
      BatchNorm2d-20          [-1, 128, 4, 4]           256
             ReLU-21          [-1, 128, 4, 4]             0
           Conv2d-22          [-1, 128, 4, 4]       147,456
      BatchNorm2d-23          [-1, 128, 4, 4]           256
           Conv2d-24          [-1, 128, 4, 4]         8,192
      BatchNorm2d-25          [-1, 128, 4, 4]           256
             ReLU-26          [-1, 128, 4, 4]             0
       BasicBlock-27          [-1, 128, 4, 4]             0
           Conv2d-28          [-1, 128, 4, 4]       147,456
      BatchNorm2d-29          [-1, 128, 4, 4]           256
             ReLU-30          [-1, 128, 4, 4]             0
           Conv2d-31          [-1, 128, 4, 4]       147,456
      BatchNorm2d-32          [-1, 128, 4, 4]           256
             ReLU-33          [-1, 128, 4, 4]             0
       BasicBlock-34          [-1, 128, 4, 4]             0
           Conv2d-35          [-1, 256, 2, 2]       294,912
      BatchNorm2d-36          [-1, 256, 2, 2]           512
             ReLU-37          [-1, 256, 2, 2]             0
           Conv2d-38          [-1, 256, 2, 2]       589,824
      BatchNorm2d-39          [-1, 256, 2, 2]           512
           Conv2d-40          [-1, 256, 2, 2]        32,768
      BatchNorm2d-41          [-1, 256, 2, 2]           512
             ReLU-42          [-1, 256, 2, 2]             0
       BasicBlock-43          [-1, 256, 2, 2]             0
           Conv2d-44          [-1, 256, 2, 2]       589,824
      BatchNorm2d-45          [-1, 256, 2, 2]           512
             ReLU-46          [-1, 256, 2, 2]             0
           Conv2d-47          [-1, 256, 2, 2]       589,824
      BatchNorm2d-48          [-1, 256, 2, 2]           512
             ReLU-49          [-1, 256, 2, 2]             0
       BasicBlock-50          [-1, 256, 2, 2]             0
           Conv2d-51          [-1, 512, 1, 1]     1,179,648
      BatchNorm2d-52          [-1, 512, 1, 1]         1,024
             ReLU-53          [-1, 512, 1, 1]             0
           Conv2d-54          [-1, 512, 1, 1]     2,359,296
      BatchNorm2d-55          [-1, 512, 1, 1]         1,024
           Conv2d-56          [-1, 512, 1, 1]       131,072
      BatchNorm2d-57          [-1, 512, 1, 1]         1,024
             ReLU-58          [-1, 512, 1, 1]             0
       BasicBlock-59          [-1, 512, 1, 1]             0
           Conv2d-60          [-1, 512, 1, 1]     2,359,296
      BatchNorm2d-61          [-1, 512, 1, 1]         1,024
             ReLU-62          [-1, 512, 1, 1]             0
           Conv2d-63          [-1, 512, 1, 1]     2,359,296
      BatchNorm2d-64          [-1, 512, 1, 1]         1,024
             ReLU-65          [-1, 512, 1, 1]             0
       BasicBlock-66          [-1, 512, 1, 1]             0
           Linear-67                 [-1, 10]         5,130
================================================================
Total params: 11,175,370
Trainable params: 11,175,370
Non-trainable params: 0
----------------------------------------------------------------
Input size (MB): 0.00
Forward/backward pass size (MB): 1.08
Params size (MB): 42.63
Estimated Total Size (MB): 43.72
----------------------------------------------------------------
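The totals in the summary can be reproduced by hand. Below is a minimal sketch, assuming the architecture is a ResNet-18-style network with a single-channel (grayscale) input and a 10-class output, which matches the per-layer counts in the table; the helper names are illustrative, not from the original script:

```python
def conv(c_in, c_out, k):
    # ResNet conv layers use bias=False (each is followed by a BatchNorm).
    return c_in * c_out * k * k

def bn(c):
    # BatchNorm2d learns one scale and one shift per channel.
    return 2 * c

def basic_block(c_in, c_out):
    p = conv(c_in, c_out, 3) + bn(c_out) + conv(c_out, c_out, 3) + bn(c_out)
    if c_in != c_out:
        # 1x1 conv + BatchNorm on the shortcut when the channel count changes.
        p += conv(c_in, c_out, 1) + bn(c_out)
    return p

# Stem: 7x7 conv on a 1-channel input, then BatchNorm (3,136 + 128 above).
total = conv(1, 64, 7) + bn(64)
# Four stages of two BasicBlocks each -- the ResNet-18 layout.
for c_in, c_out in [(64, 64), (64, 64), (64, 128), (128, 128),
                    (128, 256), (256, 256), (256, 512), (512, 512)]:
    total += basic_block(c_in, c_out)
# Final Linear layer: 512 features -> 10 classes, with bias (5,130 above).
total += 512 * 10 + 10

print(f"Total params: {total:,}")                     # Total params: 11,175,370
print(f"Params size (MB): {total * 4 / 2**20:.2f}")   # Params size (MB): 42.63
```

The 42.63 MB figure follows directly from the parameter count: 11,175,370 float32 values at 4 bytes each.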
Traceback (most recent call last):
File "myresnet.py", line 170, in
Epoch: 2 | Batch index: 0 | Batch size: 128
Epoch: 001/010 | Batch 0000/0469 | Cost: 2.4471
Epoch: 001/010 | Batch 0050/0469 | Cost: 0.0794
Epoch: 001/010 | Batch 0100/0469 | Cost: 0.1959
Epoch: 001/010 | Batch 0150/0469 | Cost: 0.2234
Epoch: 001/010 | Batch 0200/0469 | Cost: 0.1466
Epoch: 001/010 | Batch 0250/0469 | Cost: 0.0707
Epoch: 001/010 | Batch 0300/0469 | Cost: 0.2032
Epoch: 001/010 | Batch 0350/0469 | Cost: 0.0515
Epoch: 001/010 | Batch 0400/0469 | Cost: 0.0456
Epoch: 001/010 | Batch 0450/0469 | Cost: 0.1236
Epoch: 001/010 | Train: 98.042% Time elapsed: 0.72 min
Epoch: 002/010 | Batch 0000/0469 | Cost: 0.0186
Epoch: 002/010 | Batch 0050/0469 | Cost: 0.0141
Epoch: 002/010 | Batch 0100/0469 | Cost: 0.0558
Epoch: 002/010 | Batch 0150/0469 | Cost: 0.0507
Epoch: 002/010 | Batch 0200/0469 | Cost: 0.0244
Epoch: 002/010 | Batch 0250/0469 | Cost: 0.0421
Epoch: 002/010 | Batch 0300/0469 | Cost: 0.0517
Epoch: 002/010 | Batch 0350/0469 | Cost: 0.0915
Epoch: 002/010 | Batch 0400/0469 | Cost: 0.0398
Epoch: 002/010 | Batch 0450/0469 | Cost: 0.0630
Epoch: 002/010 | Train: 99.123% Time elapsed: 1.43 min
Epoch: 003/010 | Batch 0000/0469 | Cost: 0.0102
Epoch: 003/010 | Batch 0050/0469 | Cost: 0.0206
Epoch: 003/010 | Batch 0100/0469 | Cost: 0.0652
Epoch: 003/010 | Batch 0150/0469 | Cost: 0.0203
Epoch: 003/010 | Batch 0200/0469 | Cost: 0.0252
Epoch: 003/010 | Batch 0250/0469 | Cost: 0.1252
Epoch: 003/010 | Batch 0300/0469 | Cost: 0.0362
Epoch: 003/010 | Batch 0350/0469 | Cost: 0.0165
Epoch: 003/010 | Batch 0400/0469 | Cost: 0.0699
Epoch: 003/010 | Batch 0450/0469 | Cost: 0.2126
Epoch: 003/010 | Train: 98.872% Time elapsed: 2.15 min
Epoch: 004/010 | Batch 0000/0469 | Cost: 0.0051
Epoch: 004/010 | Batch 0050/0469 | Cost: 0.0108
Epoch: 004/010 | Batch 0100/0469 | Cost: 0.0157
Epoch: 004/010 | Batch 0150/0469 | Cost: 0.0337
Epoch: 004/010 | Batch 0200/0469 | Cost: 0.0489
Epoch: 004/010 | Batch 0250/0469 | Cost: 0.0286
Epoch: 004/010 | Batch 0300/0469 | Cost: 0.0057
Epoch: 004/010 | Batch 0350/0469 | Cost: 0.0065
Epoch: 004/010 | Batch 0400/0469 | Cost: 0.0052
Epoch: 004/010 | Batch 0450/0469 | Cost: 0.0078
Epoch: 004/010 | Train: 99.170% Time elapsed: 2.87 min
Epoch: 005/010 | Batch 0000/0469 | Cost: 0.0249
Epoch: 005/010 | Batch 0050/0469 | Cost: 0.0530
Epoch: 005/010 | Batch 0100/0469 | Cost: 0.0053
Epoch: 005/010 | Batch 0150/0469 | Cost: 0.0095
Epoch: 005/010 | Batch 0200/0469 | Cost: 0.0411
Epoch: 005/010 | Batch 0250/0469 | Cost: 0.0113
Epoch: 005/010 | Batch 0300/0469 | Cost: 0.0020
Epoch: 005/010 | Batch 0350/0469 | Cost: 0.0192
Epoch: 005/010 | Batch 0400/0469 | Cost: 0.0531
Epoch: 005/010 | Batch 0450/0469 | Cost: 0.0302
Epoch: 005/010 | Train: 99.140% Time elapsed: 3.60 min
Epoch: 006/010 | Batch 0000/0469 | Cost: 0.0081
Epoch: 006/010 | Batch 0050/0469 | Cost: 0.0146
Epoch: 006/010 | Batch 0100/0469 | Cost: 0.0309
Epoch: 006/010 | Batch 0150/0469 | Cost: 0.0188
Epoch: 006/010 | Batch 0200/0469 | Cost: 0.0309
Epoch: 006/010 | Batch 0250/0469 | Cost: 0.0264
Epoch: 006/010 | Batch 0300/0469 | Cost: 0.0502
Epoch: 006/010 | Batch 0350/0469 | Cost: 0.0055
Epoch: 006/010 | Batch 0400/0469 | Cost: 0.0011
Epoch: 006/010 | Batch 0450/0469 | Cost: 0.0033
Epoch: 006/010 | Train: 99.452% Time elapsed: 4.31 min
Epoch: 007/010 | Batch 0000/0469 | Cost: 0.0266
Epoch: 007/010 | Batch 0050/0469 | Cost: 0.0021
Epoch: 007/010 | Batch 0100/0469 | Cost: 0.0042
Epoch: 007/010 | Batch 0150/0469 | Cost: 0.0095
Epoch: 007/010 | Batch 0200/0469 | Cost: 0.0050
Epoch: 007/010 | Batch 0250/0469 | Cost: 0.0252
Epoch: 007/010 | Batch 0300/0469 | Cost: 0.0019
Epoch: 007/010 | Batch 0350/0469 | Cost: 0.0045
Epoch: 007/010 | Batch 0400/0469 | Cost: 0.0083
Epoch: 007/010 | Batch 0450/0469 | Cost: 0.0076
Epoch: 007/010 | Train: 99.603% Time elapsed: 5.02 min
Epoch: 008/010 | Batch 0000/0469 | Cost: 0.0010
Epoch: 008/010 | Batch 0050/0469 | Cost: 0.0081
Epoch: 008/010 | Batch 0100/0469 | Cost: 0.0009
Epoch: 008/010 | Batch 0150/0469 | Cost: 0.0310
Epoch: 008/010 | Batch 0200/0469 | Cost: 0.0130
Epoch: 008/010 | Batch 0250/0469 | Cost: 0.0183
Epoch: 008/010 | Batch 0300/0469 | Cost: 0.0025
Epoch: 008/010 | Batch 0350/0469 | Cost: 0.0068
Epoch: 008/010 | Batch 0400/0469 | Cost: 0.0132
Epoch: 008/010 | Batch 0450/0469 | Cost: 0.0028
Epoch: 008/010 | Train: 99.585% Time elapsed: 5.73 min
Epoch: 009/010 | Batch 0000/0469 | Cost: 0.0174
Epoch: 009/010 | Batch 0050/0469 | Cost: 0.0088
Epoch: 009/010 | Batch 0100/0469 | Cost: 0.0075
Epoch: 009/010 | Batch 0150/0469 | Cost: 0.0003
Epoch: 009/010 | Batch 0200/0469 | Cost: 0.0020
Epoch: 009/010 | Batch 0250/0469 | Cost: 0.0039
Epoch: 009/010 | Batch 0300/0469 | Cost: 0.0118
Epoch: 009/010 | Batch 0350/0469 | Cost: 0.0083
Epoch: 009/010 | Batch 0400/0469 | Cost: 0.0407
Epoch: 009/010 | Batch 0450/0469 | Cost: 0.0061
Epoch: 009/010 | Train: 99.630% Time elapsed: 6.45 min
Epoch: 010/010 | Batch 0000/0469 | Cost: 0.0375
Epoch: 010/010 | Batch 0050/0469 | Cost: 0.0106
Epoch: 010/010 | Batch 0100/0469 | Cost: 0.0027
Epoch: 010/010 | Batch 0150/0469 | Cost: 0.0111
Epoch: 010/010 | Batch 0200/0469 | Cost: 0.0048
Epoch: 010/010 | Batch 0250/0469 | Cost: 0.0053
Epoch: 010/010 | Batch 0300/0469 | Cost: 0.0156
Epoch: 010/010 | Batch 0350/0469 | Cost: 0.0050
Epoch: 010/010 | Batch 0400/0469 | Cost: 0.0043
Epoch: 010/010 | Batch 0450/0469 | Cost: 0.0264
Epoch: 010/010 | Train: 99.665% Time elapsed: 7.14 min
Total Training Time: 7.14 min
Test accuracy: 99.09%
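The per-epoch summaries can be pulled out of a raw log like this with a small parser. A sketch, assuming the exact `Epoch: NNN/010 | Train: X% Time elapsed: Y min` format shown above (the two sample lines are copied from the log; the variable names are illustrative):

```python
import re

# Two epoch-summary lines copied verbatim from the training log.
log = """Epoch: 001/010 | Train: 98.042% Time elapsed: 0.72 min
Epoch: 002/010 | Train: 99.123% Time elapsed: 1.43 min"""

# Only summary lines have "| Train:" right after the epoch counter,
# so batch-cost lines would not match this pattern.
pattern = re.compile(r"Epoch: (\d+)/\d+ \| Train: ([\d.]+)%")
train_acc = {int(m.group(1)): float(m.group(2)) for m in pattern.finditer(log)}
print(train_acc)  # {1: 98.042, 2: 99.123}
```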