PyTorch & front-end work: an overview


Datasets & Classic Models

Image classification data (MNIST, CIFAR10, STL10, SVHN): VGG16, ResNet, AlexNet, LeNet, GoogLeNet, DenseNet, Inception

Image segmentation data (PortraitDataset): U-Net

Object detection data (PennFudanPed): Faster R-CNN

Image generation data (img_align_celeba_2k): DCGAN

Name classification data (names): RNN

The front end needs a panel for selecting one of these datasets.

Network Layers

Pooling layers: MaxPool2d, AvgPool2d, MaxUnpool2d (see Pytorch_Part3_model module)

Convolution layers: Convxd (Conv1d/2d/3d), ConvTranspose2d

Activation functions: ReLU, Sigmoid, Tanh, RReLU, LeakyReLU

Softmax

Dropout (see Pytorch_Part6_regularization)

Normalization: BatchNormxd (BatchNorm1d/2d/3d), LayerNorm, InstanceNorm2d, GroupNorm

The front end adds a panel for the network layer types above.
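The layer types above compose into a network via `nn.Sequential`. Below is a minimal sketch combining one of each category; the channel counts and dropout rate are arbitrary illustrative choices.

```python
import torch
import torch.nn as nn

# Illustrative network using the listed layer types; sizes are arbitrary.
net = nn.Sequential(
    nn.Conv2d(1, 8, kernel_size=3, padding=1),  # convolution layer
    nn.BatchNorm2d(8),                          # normalization
    nn.ReLU(),                                  # activation
    nn.MaxPool2d(2),                            # pooling: 28x28 -> 14x14
    nn.Dropout(p=0.25),                         # regularization
    nn.Flatten(),
    nn.Linear(8 * 14 * 14, 10),
    nn.Softmax(dim=1),                          # class probabilities
)

x = torch.randn(4, 1, 28, 28)  # e.g. a batch of MNIST-sized images
out = net(x)
print(out.shape)  # torch.Size([4, 10])
```

Each row of the output sums to 1, since Softmax turns the logits into a probability distribution over the 10 classes.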

Loss Function & Optimizer & Learning-Rate Scheduling

Loss functions: CrossEntropyLoss, NLLLoss, BCELoss, BCEWithLogitsLoss (see Pytorch_Part4_loss function)

Optimizers: SGD, RMSprop, Adam (see PyTorch study notes (7): PyTorch's ten optimizers)

Learning-rate scheduling: StepLR, MultiStepLR, ExponentialLR, ReduceLROnPlateau (see Pytorch_Part5_iterative training)

Add a selection panel for these options to the front end.
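The three choices above plug into one training loop: the loss drives backpropagation, the optimizer updates the parameters, and the scheduler decays the learning rate. A minimal sketch with a toy model; the hyperparameters are illustrative assumptions.

```python
import torch
import torch.nn as nn

# Toy model standing in for whatever the front end configured.
model = nn.Linear(10, 3)
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9)
# StepLR decays the learning rate by `gamma` every `step_size` epochs.
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=5, gamma=0.1)

for epoch in range(10):
    inputs = torch.randn(16, 10)            # stand-in batch
    targets = torch.randint(0, 3, (16,))
    optimizer.zero_grad()
    loss = criterion(model(inputs), targets)
    loss.backward()
    optimizer.step()
    scheduler.step()  # advance the LR schedule once per epoch

# After 10 epochs with step_size=5, the LR has decayed twice:
# 0.1 * 0.1**2, i.e. approximately 0.001.
print(optimizer.param_groups[0]["lr"])
```

Swapping in Adam, MultiStepLR, or ReduceLROnPlateau changes only the construction lines, which is what makes these choices easy to expose as front-end options.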

Image Augmentation

Cropping: CenterCrop, RandomCrop, RandomResizedCrop (see Pytorch_Part2_data module)

Flipping and rotation: RandomHorizontalFlip, RandomVerticalFlip, RandomRotation

Image transforms: Pad, ColorJitter, Grayscale, RandomGrayscale, RandomAffine, RandomErasing

Random selection among the above: RandomChoice, RandomApply, RandomOrder

Shuffling and batch size: the shuffle and batch_size parameters of DataLoader

Add an image augmentation panel to the front end.

Network Layer Encapsulation

Uniformly wrap layers with the Sequential container (see Pytorch_Part3_model module)

Repeating the same layer pattern: ModuleList

Selecting among layers by name: ModuleDict

Unify the front-end and back-end interfaces and implement the encapsulation layer.
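The three containers above cover the common cases: `Sequential` for a fixed forward order, `ModuleList` for repeated blocks, and `ModuleDict` for runtime selection by name. A minimal sketch; the class name `WrappedNet` and all sizes are illustrative assumptions.

```python
import torch
import torch.nn as nn

class WrappedNet(nn.Module):
    """Illustrative module using all three wrapper containers."""
    def __init__(self):
        super().__init__()
        # Sequential: layers run in a fixed order.
        self.features = nn.Sequential(nn.Conv2d(3, 8, 3, padding=1),
                                      nn.ReLU())
        # ModuleList: repeat the same block; parameters are registered.
        self.blocks = nn.ModuleList([nn.Linear(8, 8) for _ in range(3)])
        # ModuleDict: pick a layer by name at forward time.
        self.acts = nn.ModuleDict({"relu": nn.ReLU(),
                                   "sigmoid": nn.Sigmoid()})

    def forward(self, x, act="relu"):
        x = self.features(x).mean(dim=(2, 3))  # global average pool -> (N, 8)
        for block in self.blocks:
            x = block(x)
        return self.acts[act](x)

net = WrappedNet()
out = net(torch.randn(2, 3, 16, 16), act="sigmoid")
print(out.shape)  # torch.Size([2, 8])
```

A plain Python list of layers would compute the same thing, but its parameters would not be registered with the module, so the optimizer would silently skip them; that is the reason to prefer `ModuleList`/`ModuleDict`.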

Other

Epoch count (number of passes over the training set)

CPU/GPU: model.to('cuda'), tensor.to('cuda') (see Pytorch_Part7_model usage)

Model saving and loading (still in beta)
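Device placement and save/load fit in a few lines. A minimal sketch; the file name `model.pt` and the toy model are illustrative assumptions, and saving the `state_dict` (rather than the whole model object) is the commonly recommended pattern.

```python
import torch
import torch.nn as nn

# Move the model and data to GPU when available, otherwise stay on CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model = nn.Linear(4, 2).to(device)
inputs = torch.randn(3, 4).to(device)
outputs = model(inputs)

# Save only the parameters, then load them into a freshly built model.
torch.save(model.state_dict(), "model.pt")
restored = nn.Linear(4, 2)
restored.load_state_dict(torch.load("model.pt", map_location="cpu"))
```

`map_location="cpu"` lets a checkpoint saved on a GPU machine be loaded on a CPU-only one, which matters if the front end and training back end run on different hardware.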


Origin: www.cnblogs.com/NAG2020/p/12721528.html