DDP error: ValueError: batch_size should be a positive integer value, but got batch_size=<torch.utils.data.sampler.BatchSampler object>

ValueError: batch_size should be a positive integer value, but got batch_size=<torch.utils.data.sampler.BatchSampler object at 0x7f4e023dca58>

Traceback (most recent call last):
  File "/opt/conda/envs/env_cp36_microDL/lib/python3.6/multiprocessing/process.py", line 258, in _bootstrap
    self.run()
  File "/opt/conda/envs/env_cp36_microDL/lib/python3.6/multiprocessing/process.py", line 93, in run
    self._target(*self._args, **self._kwargs)
  File "/home/yingmuzhi/microDL_3_0/train/train_ddp.py", line 168, in run_ddp
    collate_fn=train_dataset.collate_fn,
  File "/opt/conda/envs/env_cp36_microDL/lib/python3.6/site-packages/torch/utils/data/dataloader.py", line 268, in __init__
    batch_sampler = BatchSampler(sampler, batch_size, drop_last)
  File "/opt/conda/envs/env_cp36_microDL/lib/python3.6/site-packages/torch/utils/data/sampler.py", line 217, in __init__
    "but got batch_size={}".format(batch_size))
ValueError: batch_size should be a positive integer value, but got batch_size=<torch.utils.data.sampler.BatchSampler object at 0x7f4e023dca58>
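
DataLoader treats batch_size and batch_sampler as two separate keyword arguments: batch_size must be a positive integer, while a BatchSampler object goes to batch_sampler. For reference, the relevant part of the torch 1.x constructor signature (defaults may vary slightly between versions) looks like:

    torch.utils.data.DataLoader(dataset, batch_size=1, shuffle=False, sampler=None,
                                batch_sampler=None, num_workers=0, collate_fn=None,
                                pin_memory=False, drop_last=False)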
The offending call passes the BatchSampler object to batch_size:

    train_loader = torch.utils.data.DataLoader(
        train_dataset, 
        batch_size=train_batch_sampler, 
        shuffle=args.shuffle_images, 
        # num_workers=args.num_of_workers,
        num_workers=num_workers,
        pin_memory=args.pin_memory,
        collate_fn=train_dataset.collate_fn,
    )

Change it so that the sampler is passed through the batch_sampler keyword instead:

    train_loader = torch.utils.data.DataLoader(
        train_dataset, 
        batch_sampler=train_batch_sampler, 
        # batch_size, shuffle, sampler, and drop_last must not be passed together
        # with batch_sampler (DataLoader treats them as mutually exclusive);
        # shuffling is handled by the sampler itself
        # num_workers=args.num_of_workers,
        num_workers=num_workers,
        pin_memory=args.pin_memory,
        collate_fn=train_dataset.collate_fn,
    )
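
For context, here is a minimal sketch of how such a batch sampler is typically built for DDP training. The names train_dataset, num_workers, args.shuffle_images, args.pin_memory, and collate_fn come from the snippet above; args.batch_size, args.epochs, and the drop_last choice are assumptions, and the DDP process group is assumed to be already initialized by the launcher.

    from torch.utils.data import BatchSampler, DataLoader
    from torch.utils.data.distributed import DistributedSampler

    # Each rank gets a disjoint shard of the dataset; shuffling happens here,
    # not in the DataLoader.
    train_sampler = DistributedSampler(train_dataset, shuffle=args.shuffle_images)

    # Group the per-rank indices into batches; this object goes to batch_sampler.
    train_batch_sampler = BatchSampler(
        train_sampler, batch_size=args.batch_size, drop_last=True
    )

    train_loader = DataLoader(
        train_dataset,
        batch_sampler=train_batch_sampler,
        num_workers=num_workers,
        pin_memory=args.pin_memory,
        collate_fn=train_dataset.collate_fn,
    )

    for epoch in range(args.epochs):
        # Re-seed the DistributedSampler so each epoch uses a different shuffle order.
        train_sampler.set_epoch(epoch)
        for batch in train_loader:
            ...

With this arrangement the batch size, shuffling, and drop_last behavior are all encoded in the batch sampler, so none of them are passed to the DataLoader itself.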

Reposted from blog.csdn.net/qq_43369406/article/details/130750492