Foreword
This article continues the classic backbone-model cheat sheet (1) for the CV field, recording some additional interesting backbone models.
Model
DynaMixer
ICML 2022
DynaMixer: A Vision MLP Architecture with Dynamic Mixing
Code: https://github.com/ziyuwwang/DynaMixer (as of 2022.8.5, no pre-trained weights have been released).
![](https://img-blog.csdnimg.cn/76bfe9c8f7624b91b44664c54b1efe38.png#pic_center)
Block pseudocode, from the original paper:
```python
###### initialization #######
proj_c = nn.Linear(D, D)
proj_o = nn.Linear(D, D)

###### code in forward ######
def dyna_mixer_block(self, X):
    H, W, D = X.shape
    # row mixing
    for h = 1:H
        Y_h[h,:,:] = DynaMixerOp_h(X[h,:,:])
    # column mixing
    for w = 1:W
        Y_w[:,w,:] = DynaMixerOp_w(X[:,w,:])
    # channel mixing
    Y_c = proj_c(X)
    Y_out = Y_h + Y_w + Y_c
    return proj_o(Y_out)
```
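The pseudocode above can be turned into a runnable PyTorch sketch. The `DynaMixerOp` below follows the paper's idea (compress the channel dimension to a small `d`, generate an N×N token-mixing matrix from the tokens themselves, softmax it, then mix), but the class names, the `reduced_dim` default, and the reshaping details are my assumptions, not the official implementation:

```python
import torch
import torch.nn as nn

class DynaMixerOp(nn.Module):
    """Generates a dynamic N x N token-mixing matrix from the input itself."""
    def __init__(self, num_tokens, dim, reduced_dim=2):
        super().__init__()
        self.compress = nn.Linear(dim, reduced_dim)           # D -> d
        self.generate = nn.Linear(num_tokens * reduced_dim,
                                  num_tokens * num_tokens)    # -> N*N mixing weights
        self.out = nn.Linear(dim, dim)

    def forward(self, x):                                     # x: (B, N, D)
        B, N, D = x.shape
        p = self.compress(x).reshape(B, -1)                   # (B, N*d)
        m = self.generate(p).reshape(B, N, N).softmax(dim=-1) # dynamic mixing matrix
        return self.out(torch.matmul(m, x))                   # (B, N, D)

class DynaMixerBlock(nn.Module):
    def __init__(self, height, width, dim, reduced_dim=2):
        super().__init__()
        self.row_mix = DynaMixerOp(width, dim, reduced_dim)   # mixes W tokens per row
        self.col_mix = DynaMixerOp(height, dim, reduced_dim)  # mixes H tokens per column
        self.proj_c = nn.Linear(dim, dim)
        self.proj_o = nn.Linear(dim, dim)

    def forward(self, x):                                     # x: (B, H, W, D)
        B, H, W, D = x.shape
        # row mixing: each of the H rows mixes its W tokens
        y_h = self.row_mix(x.reshape(B * H, W, D)).reshape(B, H, W, D)
        # column mixing: each of the W columns mixes its H tokens
        xc = x.permute(0, 2, 1, 3).reshape(B * W, H, D)
        y_w = self.col_mix(xc).reshape(B, W, H, D).permute(0, 2, 1, 3)
        # channel mixing
        y_c = self.proj_c(x)
        return self.proj_o(y_h + y_w + y_c)
```

The batched reshapes replace the per-row/per-column loops in the pseudocode: all rows (or columns) go through one `DynaMixerOp` call, which is how you would actually run this on a GPU.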