No idea whether this is even the original ResNet structure; a Keras implementation....

1.

All these different network structures, and I'm just reading other people's blog posts.
No time to read the original paper, and it's making my head spin.

Whatever, I just want to grab a resblock and use it…

Setting with_conv_shortcut to True means the block is going to change the feature-map size.

The code below was copied from somewhere, I forget where…

The original Res_Block structure seems to be Conv-BN-ReLU -> Conv-BN-ReLU -> Conv-BN-ReLU.

But the code below is probably a variant: it does Conv-ReLU-BN instead…

from keras.models import Model
from keras.layers import Dense, add, Input, Flatten, Conv2D, MaxPooling2D, Dropout, BatchNormalization

def Conv2d_BN(x, nb_filter, kernel_size, strides=(1, 1), padding='same', name=None):
    if name is not None:
        bn_name = name + '_bn'
        conv_name = name + '_conv'
    else:
        bn_name = None
        conv_name = None

    x = Conv2D(nb_filter, kernel_size, padding=padding, strides=strides, activation='relu', name=conv_name)(x)  # ReLU baked into the conv, so the effective order is Conv-ReLU-BN
    x = BatchNormalization(axis=3, name=bn_name)(x)
    return x
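
A quick way to see what padding='same' plus a stride does to the size (the 32x32 RGB input is just an example, not from the original post):

from keras import backend as K

inp = Input(shape=(32, 32, 3))
out = Conv2d_BN(inp, nb_filter=16, kernel_size=(3, 3), strides=(2, 2))
print(K.int_shape(out))  # (None, 16, 16, 16): stride 2 halves H and W, nb_filter sets the channels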

def identity_Block(inpt, nb_filter, kernel_size, strides=(1, 1), with_conv_shortcut=False):
    # The first Conv2d_BN is allowed to change the spatial size; if it does,
    # the shortcut branch has to change to match,
    x = Conv2d_BN(inpt, nb_filter=nb_filter, kernel_size=kernel_size, strides=strides, padding='same')
    # The second one always keeps the size: padding='same', stride defaults to 1
    x = Conv2d_BN(x, nb_filter=nb_filter, kernel_size=kernel_size, padding='same')
    if with_conv_shortcut:
        shortcut = Conv2d_BN(inpt, nb_filter=nb_filter, strides=strides, kernel_size=kernel_size)  # padding defaults to 'same' in Conv2d_BN
        x = add([x, shortcut])
    else:
        x = add([x, inpt])
    return x
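
To see why the flag matters, compare the two cases on a hypothetical 56x56x64 feature map (the sizes are my example, roughly ResNet's conv2 stage):

feat = Input(shape=(56, 56, 64))
# Downsampling block: the conv shortcut re-projects the input to (28, 28, 128) so add() works
down = identity_Block(feat, nb_filter=128, kernel_size=(3, 3), strides=(2, 2), with_conv_shortcut=True)
# Without the conv shortcut this would try to add (28, 28, 128) to (56, 56, 64) and fail:
# bad = identity_Block(feat, nb_filter=128, kernel_size=(3, 3), strides=(2, 2))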



# conv1
inputs = Input(shape=(224, 224, 3))  # hypothetical ImageNet-style input, assumed here so the snippet runs on its own
x = Conv2d_BN(inputs, nb_filter=64, kernel_size=(7, 7), strides=(2, 2), padding='same')
# stride=2 above already amounts to a downsampling... so what would the pooling here even be for
# x = MaxPooling2D(pool_size=(3, 3), strides=(2, 2), padding='same')(x)
# conv2
x = identity_Block(x, nb_filter=64, kernel_size=(3, 3))
x = identity_Block(x, nb_filter=64, kernel_size=(3, 3))
# conv3
# with_conv_shortcut is set here because the size changes: stride=2
# the blocks below it definitely keep the size, stride=1
x = identity_Block(x, nb_filter=128, kernel_size=(3, 3), strides=(2, 2), with_conv_shortcut=True)
x = identity_Block(x, nb_filter=128, kernel_size=(3, 3))
# x = identity_Block(x, nb_filter=128, kernel_size=(3, 3))
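
To check that everything wires up, close the graph with a head and print the shapes (the Flatten/Dense head with 10 classes is my own sketch, not part of the copied code):

x = Flatten()(x)
outputs = Dense(10, activation='softmax')(x)  # hypothetical 10-class head
model = Model(inputs, outputs)
model.summary()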

2. Just change the order of the layers

Just import Activation along with the other layer imports at the top.

As for the axis: well, writing axis=-1 works too, since with the default channels_last format the channel axis of a (batch, H, W, C) tensor is the last one anyway.

from keras.layers import Input, Dense, Activation
def Conv2d_BN(x, nb_filter, kernel_size, strides=(1, 1), padding='same', name=None):
    if name is not None:
        bn_name = name + '_bn'
        conv_name = name + '_conv'
    else:
        bn_name = None
        conv_name = None

    x = Conv2D(nb_filter, kernel_size, padding=padding, strides=strides, name=conv_name)(x)  # no activation baked in this time
    x = BatchNormalization(axis=3, name=bn_name)(x)
    x = Activation('relu')(x)  # ReLU after BN: Conv-BN-ReLU
    return x
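
With this Conv2d_BN in hand, the block itself can also be brought closer to the paper: there, the second conv ends in Conv-BN only, the projection shortcut is a 1x1 conv, and the final ReLU comes after the add. A sketch under those assumptions (the _v2 name and the exact layer choices are mine, not from the post):

def identity_Block_v2(inpt, nb_filter, kernel_size, strides=(1, 1), with_conv_shortcut=False):
    x = Conv2d_BN(inpt, nb_filter=nb_filter, kernel_size=kernel_size, strides=strides, padding='same')
    # Second conv stops at Conv-BN; the ReLU is applied after the addition instead
    x = Conv2D(nb_filter, kernel_size, padding='same')(x)
    x = BatchNormalization(axis=3)(x)
    if with_conv_shortcut:
        # The paper uses a 1x1 conv for the projection shortcut
        shortcut = Conv2D(nb_filter, (1, 1), strides=strides, padding='same')(inpt)
        shortcut = BatchNormalization(axis=3)(shortcut)
    else:
        shortcut = inpt
    x = add([x, shortcut])
    return Activation('relu')(x)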


Reposted from blog.csdn.net/weixin_47289438/article/details/109765852