
Copyright notice: This is the author's original article; reproduction without permission is prohibited. https://blog.csdn.net/u012193416/article/details/87876643

RuntimeError: Input type (torch.cuda.FloatTensor) and weight type (torch.FloatTensor) should be the same

The error above was raised by torchsummary.summary(). torchsummary is a utility for printing a layer-by-layer structural summary of a PyTorch model.
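The error itself is not specific to torchsummary; it occurs whenever a CUDA tensor is fed into a model whose weights are still on the CPU. A minimal sketch of how the mismatch arises (the module here is just an illustrative stand-in):

```python
import torch
import torch.nn as nn

model = nn.Conv2d(3, 8, kernel_size=3)   # weights live on the CPU
x = torch.randn(1, 3, 224, 224)          # input also starts on the CPU

if torch.cuda.is_available():
    x = x.cuda()       # moving only the input to the GPU...
    # model(x)         # ...would raise: RuntimeError: Input type
    #                  # (torch.cuda.FloatTensor) and weight type
    #                  # (torch.FloatTensor) should be the same
    model = model.cuda()  # fix: move the weights to the same device

out = model(x)  # input and weights now agree, so the forward pass runs
```

torchsummary triggers exactly this situation because it creates a test input tensor internally on whichever device it was told to use (CUDA by default), independently of where the model's weights happen to be.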

For example, the structure of fishnet110 looks like this:

----------------------------------------------------------------
        Layer (type)               Output Shape         Param #
================================================================
            Conv2d-1         [-1, 32, 112, 112]             864
       BatchNorm2d-2         [-1, 32, 112, 112]              64
              ReLU-3         [-1, 32, 112, 112]               0
            Conv2d-4         [-1, 32, 112, 112]           9,216
       BatchNorm2d-5         [-1, 32, 112, 112]              64
              ReLU-6         [-1, 32, 112, 112]               0
            Conv2d-7         [-1, 64, 112, 112]          18,432
       BatchNorm2d-8         [-1, 64, 112, 112]             128
              ReLU-9         [-1, 64, 112, 112]               0
        MaxPool2d-10           [-1, 64, 56, 56]               0
      BatchNorm2d-11           [-1, 64, 56, 56]             128
             ReLU-12           [-1, 64, 56, 56]               0
             ReLU-13           [-1, 64, 56, 56]               0
           Conv2d-14           [-1, 32, 56, 56]           2,048
      BatchNorm2d-15           [-1, 32, 56, 56]              64
             ReLU-16           [-1, 32, 56, 56]               0
             ReLU-17           [-1, 32, 56, 56]               0
           Conv2d-18           [-1, 32, 56, 56]           9,216
      BatchNorm2d-19           [-1, 32, 56, 56]              64
             ReLU-20           [-1, 32, 56, 56]               0
             ReLU-21           [-1, 32, 56, 56]               0
           Conv2d-22          [-1, 128, 56, 56]           4,096
      BatchNorm2d-23           [-1, 64, 56, 56]             128
             ReLU-24           [-1, 64, 56, 56]               0
             ReLU-25           [-1, 64, 56, 56]               0
           Conv2d-26          [-1, 128, 56, 56]           8,192
       Bottleneck-27          [-1, 128, 56, 56]               0
      BatchNorm2d-28          [-1, 128, 56, 56]             256
             ReLU-29          [-1, 128, 56, 56]               0
           Conv2d-30           [-1, 32, 56, 56]           4,096
      BatchNorm2d-31           [-1, 32, 56, 56]              64
             ReLU-32           [-1, 32, 56, 56]               0
           Conv2d-33           [-1, 32, 56, 56]           9,216
      BatchNorm2d-34           [-1, 32, 56, 56]              64
             ReLU-35           [-1, 32, 56, 56]               0
           Conv2d-36          [-1, 128, 56, 56]           4,096
       Bottleneck-37          [-1, 128, 56, 56]               0
        MaxPool2d-38          [-1, 128, 28, 28]               0
        MaxPool2d-39          [-1, 128, 28, 28]               0
        MaxPool2d-40          [-1, 128, 28, 28]               0
        MaxPool2d-41          [-1, 128, 28, 28]               0
        MaxPool2d-42          [-1, 128, 28, 28]               0
        MaxPool2d-43          [-1, 128, 28, 28]               0
        MaxPool2d-44          [-1, 128, 28, 28]               0
      BatchNorm2d-45          [-1, 128, 28, 28]             256
             ReLU-46          [-1, 128, 28, 28]               0
             ReLU-47          [-1, 128, 28, 28]               0
           Conv2d-48           [-1, 64, 28, 28]           8,192
      BatchNorm2d-49           [-1, 64, 28, 28]             128
             ReLU-50           [-1, 64, 28, 28]               0
             ReLU-51           [-1, 64, 28, 28]               0
           Conv2d-52           [-1, 64, 28, 28]          36,864
      BatchNorm2d-53           [-1, 64, 28, 28]             128
             ReLU-54           [-1, 64, 28, 28]               0
             ReLU-55           [-1, 64, 28, 28]               0
           Conv2d-56          [-1, 256, 28, 28]          16,384
      BatchNorm2d-57          [-1, 128, 28, 28]             256
             ReLU-58          [-1, 128, 28, 28]               0
             ReLU-59          [-1, 128, 28, 28]               0
           Conv2d-60          [-1, 256, 28, 28]          32,768
       Bottleneck-61          [-1, 256, 28, 28]               0
      BatchNorm2d-62          [-1, 256, 28, 28]             512
             ReLU-63          [-1, 256, 28, 28]               0
           Conv2d-64           [-1, 64, 28, 28]          16,384
      BatchNorm2d-65           [-1, 64, 28, 28]             128
             ReLU-66           [-1, 64, 28, 28]               0
           Conv2d-67           [-1, 64, 28, 28]          36,864
      BatchNorm2d-68           [-1, 64, 28, 28]             128
             ReLU-69           [-1, 64, 28, 28]               0
           Conv2d-70          [-1, 256, 28, 28]          16,384
       Bottleneck-71          [-1, 256, 28, 28]               0
        MaxPool2d-72          [-1, 256, 14, 14]               0
        MaxPool2d-73          [-1, 256, 14, 14]               0
        MaxPool2d-74          [-1, 256, 14, 14]               0
        MaxPool2d-75          [-1, 256, 14, 14]               0
        MaxPool2d-76          [-1, 256, 14, 14]               0
        MaxPool2d-77          [-1, 256, 14, 14]               0
        MaxPool2d-78          [-1, 256, 14, 14]               0
      BatchNorm2d-79          [-1, 256, 14, 14]             512
             ReLU-80          [-1, 256, 14, 14]               0
             ReLU-81          [-1, 256, 14, 14]               0
           Conv2d-82          [-1, 128, 14, 14]          32,768
      BatchNorm2d-83          [-1, 128, 14, 14]             256
             ReLU-84          [-1, 128, 14, 14]               0
             ReLU-85          [-1, 128, 14, 14]               0
           Conv2d-86          [-1, 128, 14, 14]         147,456
      BatchNorm2d-87          [-1, 128, 14, 14]             256
             ReLU-88          [-1, 128, 14, 14]               0
             ReLU-89          [-1, 128, 14, 14]               0
           Conv2d-90          [-1, 512, 14, 14]          65,536
      BatchNorm2d-91          [-1, 256, 14, 14]             512
             ReLU-92          [-1, 256, 14, 14]               0
             ReLU-93          [-1, 256, 14, 14]               0
           Conv2d-94          [-1, 512, 14, 14]         131,072
       Bottleneck-95          [-1, 512, 14, 14]               0
      BatchNorm2d-96          [-1, 512, 14, 14]           1,024
             ReLU-97          [-1, 512, 14, 14]               0
           Conv2d-98          [-1, 128, 14, 14]          65,536
      BatchNorm2d-99          [-1, 128, 14, 14]             256
            ReLU-100          [-1, 128, 14, 14]               0
          Conv2d-101          [-1, 128, 14, 14]         147,456
     BatchNorm2d-102          [-1, 128, 14, 14]             256
            ReLU-103          [-1, 128, 14, 14]               0
          Conv2d-104          [-1, 512, 14, 14]          65,536
      Bottleneck-105          [-1, 512, 14, 14]               0
     BatchNorm2d-106          [-1, 512, 14, 14]           1,024
            ReLU-107          [-1, 512, 14, 14]               0
          Conv2d-108          [-1, 128, 14, 14]          65,536
     BatchNorm2d-109          [-1, 128, 14, 14]             256
            ReLU-110          [-1, 128, 14, 14]               0
          Conv2d-111          [-1, 128, 14, 14]         147,456
     BatchNorm2d-112          [-1, 128, 14, 14]             256
            ReLU-113          [-1, 128, 14, 14]               0
          Conv2d-114          [-1, 512, 14, 14]          65,536
      Bottleneck-115          [-1, 512, 14, 14]               0
     BatchNorm2d-116          [-1, 512, 14, 14]           1,024
            ReLU-117          [-1, 512, 14, 14]               0
          Conv2d-118          [-1, 128, 14, 14]          65,536
     BatchNorm2d-119          [-1, 128, 14, 14]             256
            ReLU-120          [-1, 128, 14, 14]               0
          Conv2d-121          [-1, 128, 14, 14]         147,456
     BatchNorm2d-122          [-1, 128, 14, 14]             256
            ReLU-123          [-1, 128, 14, 14]               0
          Conv2d-124          [-1, 512, 14, 14]          65,536
      Bottleneck-125          [-1, 512, 14, 14]               0
     BatchNorm2d-126          [-1, 512, 14, 14]           1,024
            ReLU-127          [-1, 512, 14, 14]               0
          Conv2d-128          [-1, 128, 14, 14]          65,536
     BatchNorm2d-129          [-1, 128, 14, 14]             256
            ReLU-130          [-1, 128, 14, 14]               0
          Conv2d-131          [-1, 128, 14, 14]         147,456
     BatchNorm2d-132          [-1, 128, 14, 14]             256
            ReLU-133          [-1, 128, 14, 14]               0
          Conv2d-134          [-1, 512, 14, 14]          65,536
      Bottleneck-135          [-1, 512, 14, 14]               0
     BatchNorm2d-136          [-1, 512, 14, 14]           1,024
            ReLU-137          [-1, 512, 14, 14]               0
          Conv2d-138          [-1, 128, 14, 14]          65,536
     BatchNorm2d-139          [-1, 128, 14, 14]             256
            ReLU-140          [-1, 128, 14, 14]               0
          Conv2d-141          [-1, 128, 14, 14]         147,456
     BatchNorm2d-142          [-1, 128, 14, 14]             256
            ReLU-143          [-1, 128, 14, 14]               0
          Conv2d-144          [-1, 512, 14, 14]          65,536
      Bottleneck-145          [-1, 512, 14, 14]               0
       MaxPool2d-146            [-1, 512, 7, 7]               0
       MaxPool2d-147            [-1, 512, 7, 7]               0
       MaxPool2d-148            [-1, 512, 7, 7]               0
       MaxPool2d-149            [-1, 512, 7, 7]               0
       MaxPool2d-150            [-1, 512, 7, 7]               0
       MaxPool2d-151            [-1, 512, 7, 7]               0
       MaxPool2d-152            [-1, 512, 7, 7]               0
     BatchNorm2d-153            [-1, 512, 7, 7]           1,024
            ReLU-154            [-1, 512, 7, 7]               0
          Conv2d-155            [-1, 256, 7, 7]         131,072
     BatchNorm2d-156            [-1, 256, 7, 7]             512
            ReLU-157            [-1, 256, 7, 7]               0
          Conv2d-158           [-1, 1024, 7, 7]         263,168
     BatchNorm2d-159           [-1, 1024, 7, 7]           2,048
            ReLU-160           [-1, 1024, 7, 7]               0
AdaptiveAvgPool2d-161           [-1, 1024, 1, 1]               0
          Conv2d-162             [-1, 32, 1, 1]          32,800
            ReLU-163             [-1, 32, 1, 1]               0
          Conv2d-164            [-1, 512, 1, 1]          16,896
         Sigmoid-165            [-1, 512, 1, 1]               0
     BatchNorm2d-166           [-1, 1024, 7, 7]           2,048
            ReLU-167           [-1, 1024, 7, 7]               0
            ReLU-168           [-1, 1024, 7, 7]               0
          Conv2d-169            [-1, 128, 7, 7]         131,072
     BatchNorm2d-170            [-1, 128, 7, 7]             256
            ReLU-171            [-1, 128, 7, 7]               0
            ReLU-172            [-1, 128, 7, 7]               0
          Conv2d-173            [-1, 128, 7, 7]         147,456
     BatchNorm2d-174            [-1, 128, 7, 7]             256
            ReLU-175            [-1, 128, 7, 7]               0
            ReLU-176            [-1, 128, 7, 7]               0
          Conv2d-177            [-1, 512, 7, 7]          65,536
     BatchNorm2d-178           [-1, 1024, 7, 7]           2,048
            ReLU-179           [-1, 1024, 7, 7]               0
            ReLU-180           [-1, 1024, 7, 7]               0
          Conv2d-181            [-1, 512, 7, 7]         524,288
      Bottleneck-182            [-1, 512, 7, 7]               0
     BatchNorm2d-183            [-1, 512, 7, 7]           1,024
            ReLU-184            [-1, 512, 7, 7]               0
          Conv2d-185            [-1, 128, 7, 7]          65,536
     BatchNorm2d-186            [-1, 128, 7, 7]             256
            ReLU-187            [-1, 128, 7, 7]               0
          Conv2d-188            [-1, 128, 7, 7]         147,456
     BatchNorm2d-189            [-1, 128, 7, 7]             256
            ReLU-190            [-1, 128, 7, 7]               0
          Conv2d-191            [-1, 512, 7, 7]          65,536
      Bottleneck-192            [-1, 512, 7, 7]               0
     BatchNorm2d-193            [-1, 512, 7, 7]           1,024
            ReLU-194            [-1, 512, 7, 7]               0
          Conv2d-195            [-1, 128, 7, 7]          65,536
     BatchNorm2d-196            [-1, 128, 7, 7]             256
            ReLU-197            [-1, 128, 7, 7]               0
          Conv2d-198            [-1, 128, 7, 7]         147,456
     BatchNorm2d-199            [-1, 128, 7, 7]             256
            ReLU-200            [-1, 128, 7, 7]               0
          Conv2d-201            [-1, 512, 7, 7]          65,536
      Bottleneck-202            [-1, 512, 7, 7]               0
        Upsample-203          [-1, 512, 14, 14]               0
        Upsample-204          [-1, 512, 14, 14]               0
        Upsample-205          [-1, 512, 14, 14]               0
        Upsample-206          [-1, 512, 14, 14]               0
     BatchNorm2d-207          [-1, 256, 14, 14]             512
            ReLU-208          [-1, 256, 14, 14]               0
          Conv2d-209           [-1, 64, 14, 14]          16,384
     BatchNorm2d-210           [-1, 64, 14, 14]             128
            ReLU-211           [-1, 64, 14, 14]               0
          Conv2d-212           [-1, 64, 14, 14]          36,864
     BatchNorm2d-213           [-1, 64, 14, 14]             128
            ReLU-214           [-1, 64, 14, 14]               0
          Conv2d-215          [-1, 256, 14, 14]          16,384
      Bottleneck-216          [-1, 256, 14, 14]               0
     BatchNorm2d-217          [-1, 768, 14, 14]           1,536
            ReLU-218          [-1, 768, 14, 14]               0
          Conv2d-219           [-1, 96, 14, 14]          73,728
     BatchNorm2d-220           [-1, 96, 14, 14]             192
            ReLU-221           [-1, 96, 14, 14]               0
          Conv2d-222           [-1, 96, 14, 14]          82,944
     BatchNorm2d-223           [-1, 96, 14, 14]             192
            ReLU-224           [-1, 96, 14, 14]               0
          Conv2d-225          [-1, 384, 14, 14]          36,864
      Bottleneck-226          [-1, 384, 14, 14]               0
        Upsample-227          [-1, 384, 28, 28]               0
        Upsample-228          [-1, 384, 28, 28]               0
        Upsample-229          [-1, 384, 28, 28]               0
        Upsample-230          [-1, 384, 28, 28]               0
     BatchNorm2d-231          [-1, 128, 28, 28]             256
            ReLU-232          [-1, 128, 28, 28]               0
          Conv2d-233           [-1, 32, 28, 28]           4,096
     BatchNorm2d-234           [-1, 32, 28, 28]              64
            ReLU-235           [-1, 32, 28, 28]               0
          Conv2d-236           [-1, 32, 28, 28]           9,216
     BatchNorm2d-237           [-1, 32, 28, 28]              64
            ReLU-238           [-1, 32, 28, 28]               0
          Conv2d-239          [-1, 128, 28, 28]           4,096
      Bottleneck-240          [-1, 128, 28, 28]               0
     BatchNorm2d-241          [-1, 512, 28, 28]           1,024
            ReLU-242          [-1, 512, 28, 28]               0
          Conv2d-243           [-1, 64, 28, 28]          32,768
     BatchNorm2d-244           [-1, 64, 28, 28]             128
            ReLU-245           [-1, 64, 28, 28]               0
          Conv2d-246           [-1, 64, 28, 28]          36,864
     BatchNorm2d-247           [-1, 64, 28, 28]             128
            ReLU-248           [-1, 64, 28, 28]               0
          Conv2d-249          [-1, 256, 28, 28]          16,384
      Bottleneck-250          [-1, 256, 28, 28]               0
        Upsample-251          [-1, 256, 56, 56]               0
        Upsample-252          [-1, 256, 56, 56]               0
        Upsample-253          [-1, 256, 56, 56]               0
        Upsample-254          [-1, 256, 56, 56]               0
     BatchNorm2d-255           [-1, 64, 56, 56]             128
            ReLU-256           [-1, 64, 56, 56]               0
          Conv2d-257           [-1, 16, 56, 56]           1,024
     BatchNorm2d-258           [-1, 16, 56, 56]              32
            ReLU-259           [-1, 16, 56, 56]               0
          Conv2d-260           [-1, 16, 56, 56]           2,304
     BatchNorm2d-261           [-1, 16, 56, 56]              32
            ReLU-262           [-1, 16, 56, 56]               0
          Conv2d-263           [-1, 64, 56, 56]           1,024
      Bottleneck-264           [-1, 64, 56, 56]               0
     BatchNorm2d-265          [-1, 320, 56, 56]             640
            ReLU-266          [-1, 320, 56, 56]               0
          Conv2d-267           [-1, 80, 56, 56]          25,600
     BatchNorm2d-268           [-1, 80, 56, 56]             160
            ReLU-269           [-1, 80, 56, 56]               0
          Conv2d-270           [-1, 80, 56, 56]          57,600
     BatchNorm2d-271           [-1, 80, 56, 56]             160
            ReLU-272           [-1, 80, 56, 56]               0
          Conv2d-273          [-1, 320, 56, 56]          25,600
      Bottleneck-274          [-1, 320, 56, 56]               0
       MaxPool2d-275          [-1, 320, 28, 28]               0
       MaxPool2d-276          [-1, 320, 28, 28]               0
       MaxPool2d-277          [-1, 320, 28, 28]               0
       MaxPool2d-278          [-1, 320, 28, 28]               0
       MaxPool2d-279          [-1, 320, 28, 28]               0
       MaxPool2d-280          [-1, 320, 28, 28]               0
       MaxPool2d-281          [-1, 320, 28, 28]               0
     BatchNorm2d-282          [-1, 512, 28, 28]           1,024
            ReLU-283          [-1, 512, 28, 28]               0
          Conv2d-284          [-1, 128, 28, 28]          65,536
     BatchNorm2d-285          [-1, 128, 28, 28]             256
            ReLU-286          [-1, 128, 28, 28]               0
          Conv2d-287          [-1, 128, 28, 28]         147,456
     BatchNorm2d-288          [-1, 128, 28, 28]             256
            ReLU-289          [-1, 128, 28, 28]               0
          Conv2d-290          [-1, 512, 28, 28]          65,536
      Bottleneck-291          [-1, 512, 28, 28]               0
     BatchNorm2d-292          [-1, 832, 28, 28]           1,664
            ReLU-293          [-1, 832, 28, 28]               0
          Conv2d-294          [-1, 208, 28, 28]         173,056
     BatchNorm2d-295          [-1, 208, 28, 28]             416
            ReLU-296          [-1, 208, 28, 28]               0
          Conv2d-297          [-1, 208, 28, 28]         389,376
     BatchNorm2d-298          [-1, 208, 28, 28]             416
            ReLU-299          [-1, 208, 28, 28]               0
          Conv2d-300          [-1, 832, 28, 28]         173,056
      Bottleneck-301          [-1, 832, 28, 28]               0
     BatchNorm2d-302          [-1, 832, 28, 28]           1,664
            ReLU-303          [-1, 832, 28, 28]               0
          Conv2d-304          [-1, 208, 28, 28]         173,056
     BatchNorm2d-305          [-1, 208, 28, 28]             416
            ReLU-306          [-1, 208, 28, 28]               0
          Conv2d-307          [-1, 208, 28, 28]         389,376
     BatchNorm2d-308          [-1, 208, 28, 28]             416
            ReLU-309          [-1, 208, 28, 28]               0
          Conv2d-310          [-1, 832, 28, 28]         173,056
      Bottleneck-311          [-1, 832, 28, 28]               0
       MaxPool2d-312          [-1, 832, 14, 14]               0
       MaxPool2d-313          [-1, 832, 14, 14]               0
       MaxPool2d-314          [-1, 832, 14, 14]               0
       MaxPool2d-315          [-1, 832, 14, 14]               0
       MaxPool2d-316          [-1, 832, 14, 14]               0
       MaxPool2d-317          [-1, 832, 14, 14]               0
       MaxPool2d-318          [-1, 832, 14, 14]               0
     BatchNorm2d-319          [-1, 768, 14, 14]           1,536
            ReLU-320          [-1, 768, 14, 14]               0
          Conv2d-321          [-1, 192, 14, 14]         147,456
     BatchNorm2d-322          [-1, 192, 14, 14]             384
            ReLU-323          [-1, 192, 14, 14]               0
          Conv2d-324          [-1, 192, 14, 14]         331,776
     BatchNorm2d-325          [-1, 192, 14, 14]             384
            ReLU-326          [-1, 192, 14, 14]               0
          Conv2d-327          [-1, 768, 14, 14]         147,456
      Bottleneck-328          [-1, 768, 14, 14]               0
     BatchNorm2d-329         [-1, 1600, 14, 14]           3,200
            ReLU-330         [-1, 1600, 14, 14]               0
          Conv2d-331          [-1, 400, 14, 14]         640,000
     BatchNorm2d-332          [-1, 400, 14, 14]             800
            ReLU-333          [-1, 400, 14, 14]               0
          Conv2d-334          [-1, 400, 14, 14]       1,440,000
     BatchNorm2d-335          [-1, 400, 14, 14]             800
            ReLU-336          [-1, 400, 14, 14]               0
          Conv2d-337         [-1, 1600, 14, 14]         640,000
      Bottleneck-338         [-1, 1600, 14, 14]               0
     BatchNorm2d-339         [-1, 1600, 14, 14]           3,200
            ReLU-340         [-1, 1600, 14, 14]               0
          Conv2d-341          [-1, 400, 14, 14]         640,000
     BatchNorm2d-342          [-1, 400, 14, 14]             800
            ReLU-343          [-1, 400, 14, 14]               0
          Conv2d-344          [-1, 400, 14, 14]       1,440,000
     BatchNorm2d-345          [-1, 400, 14, 14]             800
            ReLU-346          [-1, 400, 14, 14]               0
          Conv2d-347         [-1, 1600, 14, 14]         640,000
      Bottleneck-348         [-1, 1600, 14, 14]               0
       MaxPool2d-349           [-1, 1600, 7, 7]               0
       MaxPool2d-350           [-1, 1600, 7, 7]               0
       MaxPool2d-351           [-1, 1600, 7, 7]               0
       MaxPool2d-352           [-1, 1600, 7, 7]               0
       MaxPool2d-353           [-1, 1600, 7, 7]               0
       MaxPool2d-354           [-1, 1600, 7, 7]               0
       MaxPool2d-355           [-1, 1600, 7, 7]               0
     BatchNorm2d-356            [-1, 512, 7, 7]           1,024
            ReLU-357            [-1, 512, 7, 7]               0
          Conv2d-358            [-1, 128, 7, 7]          65,536
     BatchNorm2d-359            [-1, 128, 7, 7]             256
            ReLU-360            [-1, 128, 7, 7]               0
          Conv2d-361            [-1, 128, 7, 7]         147,456
     BatchNorm2d-362            [-1, 128, 7, 7]             256
            ReLU-363            [-1, 128, 7, 7]               0
          Conv2d-364            [-1, 512, 7, 7]          65,536
      Bottleneck-365            [-1, 512, 7, 7]               0
     BatchNorm2d-366            [-1, 512, 7, 7]           1,024
            ReLU-367            [-1, 512, 7, 7]               0
          Conv2d-368            [-1, 128, 7, 7]          65,536
     BatchNorm2d-369            [-1, 128, 7, 7]             256
            ReLU-370            [-1, 128, 7, 7]               0
          Conv2d-371            [-1, 128, 7, 7]         147,456
     BatchNorm2d-372            [-1, 128, 7, 7]             256
            ReLU-373            [-1, 128, 7, 7]               0
          Conv2d-374            [-1, 512, 7, 7]          65,536
      Bottleneck-375            [-1, 512, 7, 7]               0
     BatchNorm2d-376            [-1, 512, 7, 7]           1,024
            ReLU-377            [-1, 512, 7, 7]               0
          Conv2d-378            [-1, 128, 7, 7]          65,536
     BatchNorm2d-379            [-1, 128, 7, 7]             256
            ReLU-380            [-1, 128, 7, 7]               0
          Conv2d-381            [-1, 128, 7, 7]         147,456
     BatchNorm2d-382            [-1, 128, 7, 7]             256
            ReLU-383            [-1, 128, 7, 7]               0
          Conv2d-384            [-1, 512, 7, 7]          65,536
      Bottleneck-385            [-1, 512, 7, 7]               0
     BatchNorm2d-386            [-1, 512, 7, 7]           1,024
            ReLU-387            [-1, 512, 7, 7]               0
          Conv2d-388            [-1, 128, 7, 7]          65,536
     BatchNorm2d-389            [-1, 128, 7, 7]             256
            ReLU-390            [-1, 128, 7, 7]               0
          Conv2d-391            [-1, 128, 7, 7]         147,456
     BatchNorm2d-392            [-1, 128, 7, 7]             256
            ReLU-393            [-1, 128, 7, 7]               0
          Conv2d-394            [-1, 512, 7, 7]          65,536
      Bottleneck-395            [-1, 512, 7, 7]               0
     BatchNorm2d-396           [-1, 2112, 7, 7]           4,224
            ReLU-397           [-1, 2112, 7, 7]               0
          Conv2d-398           [-1, 1056, 7, 7]       2,230,272
     BatchNorm2d-399           [-1, 1056, 7, 7]           2,112
            ReLU-400           [-1, 1056, 7, 7]               0
AdaptiveAvgPool2d-401           [-1, 1056, 1, 1]               0
          Conv2d-402           [-1, 1000, 1, 1]       1,057,000
            Fish-403           [-1, 1000, 1, 1]               0
================================================================
Total params: 16,628,904
Trainable params: 16,628,904
Non-trainable params: 0
----------------------------------------------------------------
Input size (MB): 0.57
Forward/backward pass size (MB): 390.95
Params size (MB): 63.43
Estimated Total Size (MB): 454.96
----------------------------------------------------------------

Process finished with exit code 0

There are two ways to fix the error:

1.

import torchsummary
# fishnet99 comes from the FishNet model definition used in the post

if __name__ == '__main__':
    model = fishnet99()
    torchsummary.summary(model.cuda(), (3, 224, 224))

2.

import torchsummary
# fishnet99 comes from the FishNet model definition used in the post

if __name__ == '__main__':
    model = fishnet99()
    torchsummary.summary(model, (3, 224, 224), device='cpu')

Either approach avoids the error. The first converts the network to CUDA, so the weights match the CUDA test input that torchsummary creates by default; the second calls torchsummary.summary with device='cpu', so the test input is created on the CPU, matching the CPU weights.
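A more portable pattern is to pick the device once and move both the model and the input to it, so the same script runs on machines with or without a GPU. A sketch with a stand-in model (substitute fishnet99() for the nn.Sequential here):

```python
import torch
import torch.nn as nn

# Choose the device once; everything else follows it.
device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')

# Stand-in for fishnet99(); any nn.Module works the same way.
model = nn.Sequential(nn.Conv2d(3, 16, kernel_size=3), nn.ReLU()).to(device)

# Create the input directly on the same device as the weights.
x = torch.randn(1, 3, 224, 224, device=device)

out = model(x)  # no device mismatch regardless of whether CUDA is present
```

The same idea carries over to torchsummary: pass device='cuda' or device='cpu' to summary() to match wherever the model's weights actually live.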
