TensorFlow: Deconvolution, Fixing an AdamOptimizer ValueError

I had just finished building a GAN in TensorFlow when debugging threw ValueError: Shapes (100, 14, 14, 64) and (100, 12, 12, 64) are not compatible. This small error nearly drove me crazy; after a good deal of digging I finally tracked it down. Here is the code where the error originates:

import tensorflow as tf

def deconv_layer(input, kernel_size_x, kernel_size_y, channel_in, channel_out, output_shape_n, isnorm=True, name="conv", active="relu"):
	with tf.name_scope(name):
		# conv2d_transpose expects its filter as [height, width, output_channels, in_channels],
		# so channel_in here acts as the layer's output depth and channel_out must match the input tensor.
		w = tf.Variable(tf.truncated_normal(shape=[kernel_size_x, kernel_size_y, channel_in, channel_out], stddev=0.01), name="W")
		# bias over the output channels (the third filter dimension)
		b = tf.Variable(tf.zeros([channel_in]) + 0.1, name="B")
		# padding is hard-coded to "SAME" here, which is what triggers the error described below
		conv = tf.nn.conv2d_transpose(input, w, output_shape=output_shape_n, strides=[1, 2, 2, 1], padding="SAME")
		if isnorm:
			conv = tf.contrib.layers.batch_norm(inputs=conv, center=True, scale=True, is_training=True)
		if active == "relu":
			act = tf.nn.relu(conv + b)
		elif active == "tanh":
			act = tf.nn.tanh(conv + b)
		else:
			act = conv + b  # fall back to a linear output for unrecognized activation names
		return act
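
For reference, a single upsampling step built with this layer might be called as in the sketch below (the tensor names h4/h5, the batch size of 100, and the 5×5 kernel are assumptions for illustration):

# Upsample a batchsize×6×6×128 feature map to batchsize×12×12×64.
# Because of the conv2d_transpose filter layout noted above, channel_in is the
# *output* depth (64) and channel_out must equal the depth of the input tensor (128).
h5 = deconv_layer(h4, 5, 5, 64, 128, output_shape_n=[100, 12, 12, 64], name="deconv5")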

In my code the generator's deconvolution pipeline is: 1. batchsize×100 → 2. batchsize×2×2×25 → 3. batchsize×3×3×256 → 4. batchsize×6×6×128 → 5. batchsize×12×12×64 → 6. batchsize×28×28×1. The error comes from the step from 5. batchsize×12×12×64 to 6. batchsize×28×28×1: the padding mode of tf.nn.conv2d_transpose was "SAME", so Adam could not compute gradients during backpropagation. With "SAME" padding and stride 2, the equivalent forward convolution of a 28×28 output yields ceil(28/2) = 14×14, which clashes with the declared 12×12 input, hence the incompatible shapes (100, 14, 14, 64) and (100, 12, 12, 64).
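
To make the failing step concrete, here is a minimal sketch of the spatial-size arithmetic behind tf.nn.conv2d_transpose at stride 2 (the 6×6 kernel in the "VALID" case is an assumption; it is the size that satisfies (12 - 1) * 2 + k = 28):

# Spatial output size of conv2d_transpose (standard formulas):
#   "SAME"  : out = in * stride
#   "VALID" : out = (in - 1) * stride + kernel
def deconv_output_size(in_size, stride, kernel, padding):
    if padding == "SAME":
        return in_size * stride
    return (in_size - 1) * stride + kernel

print(deconv_output_size(6, 2, 6, "SAME"))    # 12 -> step 4 -> 5 is fine with "SAME"
print(deconv_output_size(12, 2, 6, "SAME"))   # 24 -> a 28x28 output_shape is unreachable; the
                                              #       gradient of a 28x28 map at stride 2 with
                                              #       "SAME" is 14x14, clashing with 12x12
print(deconv_output_size(12, 2, 6, "VALID"))  # 28 -> matches the declared output_shape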

Solution: change the padding mode in the step from 5. batchsize×12×12×64 to 6. batchsize×28×28×1 to "VALID".
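
A sketch of what the corrected last layer could look like when written out directly with tf.nn.conv2d_transpose (the tensor name h5, the batch size of 100, and the 6×6 kernel are assumptions; only the shapes follow the pipeline above):

# Final generator layer: batchsize×12×12×64 -> batchsize×28×28×1
# conv2d_transpose filters are laid out as [height, width, output_channels, in_channels].
w6 = tf.Variable(tf.truncated_normal([6, 6, 1, 64], stddev=0.01), name="W6")
b6 = tf.Variable(tf.zeros([1]) + 0.1, name="B6")
deconv6 = tf.nn.conv2d_transpose(h5, w6,
                                 output_shape=[100, 28, 28, 1],
                                 strides=[1, 2, 2, 1],
                                 padding="VALID")   # "VALID": (12 - 1) * 2 + 6 = 28
g_out = tf.nn.tanh(deconv6 + b6)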


Reposted from blog.csdn.net/LiGuang923/article/details/84246399