Adding Fully Connected Layers in TensorFlow

TensorFlow provides both tf.layers.dense() and tf.contrib.layers.fully_connected() as functions for adding a fully connected layer; the latter is implemented as a wrapper on top of the former.

1. tf.layers.dense()

tf.layers.dense(
    inputs,
    units,
    activation=None,
    use_bias=True,
    kernel_initializer=None,
    bias_initializer=tf.zeros_initializer(),
    kernel_regularizer=None,
    bias_regularizer=None,
    activity_regularizer=None,
    kernel_constraint=None,
    bias_constraint=None,
    trainable=True,
    name=None,
    reuse=None
)
  • inputs: the input tensor (typically the output of the previous layer).
  • units: the size (dimensionality) of the output; an integer or long.
  • activation: the activation function (the nonlinearity of the layer). Defaults to None, meaning no activation function is applied.
  • use_bias: whether to use a bias. True (the default) adds a bias term; set to False for no bias.
  • kernel_initializer: initializer function for the weight matrix. If None (the default), weights are initialized with the default initializer used by tf.get_variable.
  • bias_initializer: initializer function for the bias.
  • kernel_regularizer: regularizer function for the weight matrix.
  • bias_regularizer: regularizer function for the bias.
  • activity_regularizer: regularizer function for the output.
  • kernel_constraint: an optional projection function applied to the kernel after it is updated by the optimizer (e.g., to implement norm constraints or value constraints on the layer's weights). The function must take the unprojected variable as input and return the projected variable, which must have the same shape. Using constraints is not safe during asynchronous distributed training.
  • bias_constraint: an optional projection function applied to the bias after it is updated by the optimizer.
  • trainable: Boolean; if True, the variables are added to the graph collection GraphKeys.TRAINABLE_VARIABLES (see tf.Variable).
  • name: the name of the layer.
  • reuse: Boolean; whether to reuse the weights of a previous layer of the same name.
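
For illustration, here is a minimal sketch (TensorFlow 1.x) of a two-layer network built with tf.layers.dense(); the placeholder shape and unit counts are arbitrary examples, not part of the API:

import tensorflow as tf

# Input placeholder: a batch of 784-dimensional feature vectors.
x = tf.placeholder(tf.float32, shape=[None, 784])

# Hidden layer: 256 units with ReLU activation.
hidden = tf.layers.dense(inputs=x, units=256, activation=tf.nn.relu)

# Output layer: 10 units; activation=None keeps the output linear (logits).
logits = tf.layers.dense(inputs=hidden, units=10, activation=None)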

2. tf.contrib.layers.fully_connected

tf.contrib.layers.fully_connected(
    inputs,
    num_outputs,
    activation_fn=tf.nn.relu,
    normalizer_fn=None,
    normalizer_params=None,
    weights_initializer=initializers.xavier_initializer(),
    weights_regularizer=None,
    biases_initializer=tf.zeros_initializer(),
    biases_regularizer=None,
    reuse=None,
    variables_collections=None,
    outputs_collections=None,
    trainable=True,
    scope=None
)
  • inputs: a tensor of rank at least 2 with a static value for the last dimension, e.g. [batch_size, depth] or [None, None, None, channels].
  • num_outputs: integer or long, the number of output units in the layer.
  • activation_fn: the activation function. Defaults to ReLU (tf.nn.relu). Explicitly set it to None to skip it and keep a linear activation.
  • normalizer_fn: a normalization function to use instead of biases. If normalizer_fn is provided, biases_initializer and biases_regularizer are ignored and biases are neither created nor added. Defaults to None, meaning no normalization function.
  • normalizer_params: parameters for the normalization function.
  • weights_initializer: initializer for the weights.
  • weights_regularizer: optional regularizer for the weights.
  • biases_initializer: initializer for the biases. If None, the biases are skipped.
  • biases_regularizer: optional regularizer for the biases.
  • reuse: whether the layer and its variables should be reused. To be able to reuse them, the scope must be given.
  • variables_collections: an optional list of collections for all variables, or a dictionary containing a different list of collections per variable.
  • outputs_collections: the collections to which the outputs are added.
  • trainable: if True, the variables are added to the graph collection GraphKeys.TRAINABLE_VARIABLES (see tf.Variable).
  • scope: optional scope for variable_scope.
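
For comparison, the same sketch built with tf.contrib.layers.fully_connected() (again TensorFlow 1.x, with illustrative shapes). Note the different default: activation_fn defaults to tf.nn.relu, so it must be set to None explicitly for a linear output layer:

import tensorflow as tf

# Input placeholder: a batch of 784-dimensional feature vectors.
x = tf.placeholder(tf.float32, shape=[None, 784])

# Hidden layer: ReLU is applied by default (activation_fn=tf.nn.relu).
hidden = tf.contrib.layers.fully_connected(inputs=x, num_outputs=256)

# Output layer: pass activation_fn=None to get linear logits.
logits = tf.contrib.layers.fully_connected(inputs=hidden, num_outputs=10,
                                           activation_fn=None)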