tf.nn.relu_layer(x, weights, biases, name=None)
`tf.nn.relu(x)` simply applies the ReLU activation to its input (for example, to the result of a convolution). `tf.nn.relu_layer`, by contrast, is a fused fully-connected layer: it multiplies the input `x` by `weights`, adds `biases`, and then applies the ReLU nonlinearity, returning the resulting activation. It is equivalent to `tf.nn.relu(tf.matmul(x, weights) + biases)`.
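The computation `relu_layer` performs can be sketched in plain NumPy (a minimal illustration of the math, not TensorFlow itself):

```python
import numpy as np

def relu_layer(x, weights, biases):
    # Same computation as tf.nn.relu_layer:
    # matrix multiply, add biases, then ReLU.
    return np.maximum(x @ weights + biases, 0.0)

x = np.array([[1.0, -2.0]])          # batch of 1, 2 features
weights = np.array([[1.0, 0.0],
                    [0.0, 1.0]])     # (in_features, out_features)
biases = np.array([0.5, 0.5])

print(relu_layer(x, weights, biases))  # [[1.5 0. ]]
```

Here `x @ weights + biases` gives `[[1.5, -1.5]]`, and ReLU clamps the negative entry to zero.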
Reposted from blog.csdn.net/weixin_43486780/article/details/105048605