TensorFlow Study Notes: Optimizer

TensorFlow currently provides 11 different classic optimizers (a construction sketch follows the list):

  • tf.train.Optimizer (the base class; the others inherit from it)
  • tf.train.GradientDescentOptimizer
  • tf.train.AdadeltaOptimizer
  • tf.train.AdagradOptimizer
  • tf.train.AdagradDAOptimizer
  • tf.train.MomentumOptimizer
  • tf.train.AdamOptimizer
  • tf.train.FtrlOptimizer
  • tf.train.ProximalGradientDescentOptimizer
  • tf.train.ProximalAdagradOptimizer
  • tf.train.RMSPropOptimizer
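
All of these expose the same tf.train.Optimizer interface, so swapping one for another only changes the constructor call. A minimal sketch, assuming the TensorFlow 1.x API; the learning rate value here is illustrative, not from the original post:

import tensorflow as tf

learning_rate = 0.01  # illustrative value, not from the original post

# Each optimizer is constructed the same way; only the hyperparameters differ.
sgd      = tf.train.GradientDescentOptimizer(learning_rate)
momentum = tf.train.MomentumOptimizer(learning_rate, momentum=0.9)
adam     = tf.train.AdamOptimizer(learning_rate)
rmsprop  = tf.train.RMSPropOptimizer(learning_rate)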

Example:

with tf.name_scope("cross_ent"):
	# compute the mean softmax cross-entropy loss over the batch
	# (score, y, var_list, and learning_rate are assumed to be defined earlier)
	loss = tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(logits=score, labels=y))

with tf.name_scope("train"):
	# get gradients of the loss w.r.t. all trainable variables
	gradients = tf.gradients(loss, var_list)
	# pair each gradient with its variable, as apply_gradients expects
	gradients = list(zip(gradients, var_list))

	# create the optimizer and apply gradient descent to the trainable variables
	optimizer = tf.train.GradientDescentOptimizer(learning_rate)
	train_op = optimizer.apply_gradients(grads_and_vars=gradients)
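
Splitting compute/zip/apply as above is useful when the gradients need inspection or processing (e.g. clipping) before being applied. When they don't, Optimizer.minimize wraps both steps in one call; a minimal equivalent sketch, reusing the loss, var_list, and learning_rate from above:

with tf.name_scope("train"):
	optimizer = tf.train.GradientDescentOptimizer(learning_rate)
	# minimize() computes gradients for var_list and applies them in one step
	train_op = optimizer.minimize(loss, var_list=var_list)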

