The tf.add_to_collection and tf.get_collection functions

Reprinted from: https://blog.csdn.net/nini_coded/article/details/80528466

tf.add_to_collection puts one or more variables into a collection whose name you choose yourself (given as a string), i.e. it gathers those variables into a single named list.

tf.get_collection does the opposite: it retrieves all the elements of that collection and returns them as a new list.

At first glance tf.get_collection may seem to do nothing, since the contents are the same list before and after. The reason it is needed is as follows:

Example:
Suppose several variables were previously put into a collection named 'regularizer' with tf.add_to_collection, forming a list. But 'regularizer' itself is only a string key, not a list, so list operations cannot be applied to it directly; only the list obtained via tf.get_collection('regularizer') can be operated on.
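A minimal sketch of this behavior (using throwaway constants rather than real weights; not from the original post):

import tensorflow as tf

# add two tensors to the collection keyed by the string 'regularizer'
tf.add_to_collection('regularizer', tf.constant(1.0))
tf.add_to_collection('regularizer', tf.constant(2.0))

# 'regularizer' is only a key; to operate on the contents,
# first retrieve them as a Python list
terms = tf.get_collection('regularizer')  # [<tf.Tensor>, <tf.Tensor>]
total = tf.add_n(terms)

with tf.Session() as sess:
    print(sess.run(total))  # 3.0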

In deep learning, this is typically used to collect regularization terms for the weight and bias parameters of different layers: for each parameter to be learned, compute its norm with tf.contrib.layers.l2_regularizer(regular_num)(w), put it into the 'regularizer' collection as a regularization term, then sum all the terms with tf.add_n and add the result to the original loss to obtain a loss that includes regularization.

# ksize, in_dim, out_dim, stride, padding and the input tensor
# are assumed to be defined elsewhere
w = tf.get_variable('weight', [ksize, ksize, in_dim, out_dim], dtype=tf.float32,
                    initializer=tf.contrib.layers.xavier_initializer())
# add the L2 penalty of w to the 'regularizer' collection
tf.add_to_collection('regularizer', tf.contrib.layers.l2_regularizer(0.001)(w))

shared = tf.nn.conv2d(input, w, [1, stride, stride, 1], padding=padding)

b = tf.get_variable('bias', [out_dim], dtype=tf.float32,
                    initializer=tf.constant_initializer(0.))
# likewise add the L2 penalty of b to the same collection
tf.add_to_collection('regularizer', tf.contrib.layers.l2_regularizer(0.001)(b))

out = tf.nn.bias_add(shared, b)


Above, when defining the conv2d layer's parameters w and b, their L2 regularization terms are put into the 'regularizer' collection. Below, the collection is retrieved to define the loss:

# sum every term collected in 'regularizer'; tf.add_n(inputs, name=None)
regular = tf.add_n(tf.get_collection('regularizer'), name='loss')
with tf.variable_scope('loss') as scope:
    # cross entropy + L2 norm as the total loss
    loss = -tf.reduce_sum(label * tf.log(y)) + regular
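As an aside, TensorFlow 1.x also provides a built-in shortcut for this pattern: tf.get_variable accepts a regularizer argument, and the resulting penalty is added automatically to the predefined collection tf.GraphKeys.REGULARIZATION_LOSSES. A minimal sketch (variable name and shape chosen arbitrarily for illustration):

w = tf.get_variable('weight2', [3, 3, 16, 32], dtype=tf.float32,
                    initializer=tf.contrib.layers.xavier_initializer(),
                    regularizer=tf.contrib.layers.l2_regularizer(0.001))

# the penalty for w was appended to the built-in collection automatically
regular = tf.add_n(tf.get_collection(tf.GraphKeys.REGULARIZATION_LOSSES))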


Origin: blog.csdn.net/qq_38409301/article/details/94206801