Keras custom layers: why a custom layer shows 0 parameters in summary()

Today I happily implemented a residual layer with a denoising function: inside the custom layer, other Keras layers are used.

__author__ = 'dk'
'''
    Define the denoising and enhancement module
'''
from keras import backend as K
from keras.engine.topology import Layer
from keras.layers import add,Flatten,Conv1D, AveragePooling1D,Dot
import keras

class Denoising_layer(Layer):
    def __init__(self, filter_type = 'non_local_mean',**kwargs):
        '''
            :param filter_type: denoising type, either 'non_local_mean'
                (non-local-mean filtering) or 'mean' (mean filtering)
        '''

        assert filter_type in ['non_local_mean','mean']
        self.filter_type = filter_type
        super(Denoising_layer,self).__init__(**kwargs)

    def build(self, input_shape):

        self.average_pooling = AveragePooling1D(strides=1,padding='same')
        self.conv1d = Conv1D(kernel_size=1,padding='same',filters=input_shape[-1])
        super(Denoising_layer,self).build(input_shape)

    def call(self, inputs, **kwargs):

        if self.filter_type == 'mean' :

            delta_x = self.average_pooling(inputs)

        if self.filter_type == 'non_local_mean' :
            delta_x = Dot(axes=(2,1))([inputs , K.permute_dimensions(inputs , (0,2,1))])
            delta_x = Dot(axes=(1))([delta_x,inputs])
        delta_x = self.conv1d(delta_x)

        return add([inputs,delta_x])

Then I happily added it to an existing model, called model.summary(), and found that this layer reports 0 parameters. How can that be?

[screenshot: model.summary() reporting 0 parameters for the custom layer]
Worse still, this error does not affect training at all. But once the model is saved and later reloaded for prediction, it simply falls apart.
That makes the bug remarkably well hidden.
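(For context: this pitfall is specific to the older multi-backend Keras used here. In tf.keras from TensorFlow 2.x, a sublayer assigned to an attribute, even inside build(), is tracked automatically, so its weights show up in count_params() once the layer has been called. A minimal sketch, assuming tf.keras is installed; the `Wrapper` layer is hypothetical, just to illustrate the tracking behavior:)

```python
import tensorflow as tf
from tensorflow.keras import layers

class Wrapper(layers.Layer):
    # A layer that creates an inner Dense layer in build(),
    # exactly the pattern that silently lost weights in old Keras.
    def build(self, input_shape):
        self.dense = layers.Dense(8)
        super().build(input_shape)

    def call(self, inputs):
        return self.dense(inputs)

layer = Wrapper()
_ = layer(tf.zeros((2, 4)))        # first call builds the inner Dense
# In tf.keras the inner Dense's weights are tracked automatically:
# kernel (4*8) + bias (8) = 40 parameters.
print(layer.count_params())
```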

Finally, I figured out that it has to be written like this:

__author__ = 'dk'
'''
    Define the denoising and enhancement module
'''
from keras import backend as K
from keras.engine.topology import Layer
from keras.layers import add,Flatten,Conv1D, AveragePooling1D,Dot
import keras

class Denoising_layer(Layer):
    def __init__(self, filter_type = 'non_local_mean',**kwargs):
        '''
            :param filter_type: denoising type, either 'non_local_mean'
                (non-local-mean filtering) or 'mean' (mean filtering)
        '''

        assert filter_type in ['non_local_mean','mean']
        self.filter_type = filter_type
        super(Denoising_layer,self).__init__(**kwargs)

    def build(self, input_shape):

        self.average_pooling = AveragePooling1D(strides=1,padding='same')
        self.conv1d = Conv1D(kernel_size=1,padding='same',filters=input_shape[-1])
        self.conv1d.build(input_shape)  # key line 1: build the inner layer explicitly
        self._trainable_weights += self.conv1d._trainable_weights  # key line 2: adopt its weights
        super(Denoising_layer,self).build(input_shape)

    def call(self, inputs, **kwargs):

        if self.filter_type == 'mean' :

            delta_x = self.average_pooling(inputs)

        if self.filter_type == 'non_local_mean' :
            delta_x = Dot(axes=(2,1))([inputs , K.permute_dimensions(inputs , (0,2,1))])
            delta_x = Dot(axes=(1))([delta_x,inputs])
        delta_x = self.conv1d(delta_x)

        return add([inputs,delta_x])

You have to call the inner Keras layer's build() yourself inside your own build(), and then append that layer's weights to the custom layer's weight list.
Wonderful.
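As a sanity check, the fixed layer's parameter count in summary() should now equal the inner Conv1D's. For a 1x1 Conv1D over C input channels producing C filters, that is C*C kernel weights plus C biases. A quick helper for checking (hypothetical, just to verify the summary by hand):

```python
def conv1d_param_count(kernel_size, in_channels, filters, use_bias=True):
    # A Conv1D kernel has shape (kernel_size, in_channels, filters),
    # plus one bias per filter when use_bias is True.
    n = kernel_size * in_channels * filters
    if use_bias:
        n += filters
    return n

# For the denoising layer above: kernel_size=1, filters = input channels C.
# With C = 64 channels, summary() should report 64*64 + 64 = 4160 params.
print(conv1d_param_count(1, 64, 64))
```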

Look at the summary again:
[screenshot: model.summary() now reporting the correct parameter count for the custom layer]
Everything is back to normal!

Summary: if you use Keras's built-in layers inside a custom layer (in multi-backend Keras 2.x), you need to build them yourself in the custom layer's build() function and add their weights to the custom layer's weight list there.
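An alternative that sidesteps the private `_trainable_weights` attribute entirely is to create the variables with `self.add_weight`, which registers them for both summary() and serialization. A minimal sketch of the save/reload round trip that originally failed (the `Scale` layer and file name are hypothetical; assumes tf.keras with HDF5 support):

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

class Scale(layers.Layer):
    # Minimal custom layer: its weight is created with add_weight,
    # so it is always tracked and saved without manual bookkeeping.
    def build(self, input_shape):
        self.w = self.add_weight(name="w", shape=(input_shape[-1],),
                                 initializer="ones")
        super().build(input_shape)

    def call(self, inputs):
        return inputs * self.w

inp = layers.Input(shape=(4,))
model = models.Model(inp, Scale()(inp))
model.save("scale_model.h5")

# Custom layers must be passed back in at load time.
reloaded = models.load_model("scale_model.h5",
                             custom_objects={"Scale": Scale})
x = np.ones((2, 4), dtype="float32")
assert np.allclose(model.predict(x), reloaded.predict(x))
```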

Origin blog.csdn.net/jmh1996/article/details/109783600