TF2.0—tf.keras.losses.BinaryCrossentropy

BinaryCrossentropy

tf.keras.losses.BinaryCrossentropy(
    from_logits=False, label_smoothing=0, reduction=losses_utils.ReductionV2.AUTO,
    name='binary_crossentropy'
)

Description

Computes the cross-entropy loss between true labels and predicted labels.
Use this cross-entropy loss when there are only two label classes (assumed to be 0 and 1). For each example, there should be a single floating-point value per prediction.
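The per-element loss is the standard binary cross-entropy, -(y*log(p) + (1-y)*log(1-p)), averaged over the last axis for each sample and then over the batch. A minimal hand computation in plain Python (an illustrative sketch, not TensorFlow's implementation) reproduces the 0.815 result from the example below:

```python
import math

def binary_crossentropy(y_true, y_pred):
    """Hand-computed binary cross-entropy: per-element loss averaged
    over the last axis of each sample, then averaged over the batch."""
    per_sample = []
    for yt_row, yp_row in zip(y_true, y_pred):
        losses = [-(y * math.log(p) + (1 - y) * math.log(1 - p))
                  for y, p in zip(yt_row, yp_row)]
        per_sample.append(sum(losses) / len(losses))
    return sum(per_sample) / len(per_sample)

y_true = [[0., 1.], [0., 0.]]
y_pred = [[0.6, 0.4], [0.4, 0.6]]
print(round(binary_crossentropy(y_true, y_pred), 3))  # 0.815
```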

Parameters

from_logits
Whether to interpret y_pred as a tensor of logit values.
By default, y_pred is assumed to contain probabilities (i.e., values in [0, 1]).
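In other words, passing raw logits with from_logits=True is equivalent to applying a sigmoid yourself and passing probabilities with from_logits=False. The helpers below are illustrative sketches, not TensorFlow's internals; the logit path is written in the standard numerically stable form that avoids overflow for large |z|:

```python
import math

def sigmoid(z):
    """Logistic sigmoid: maps a raw logit to a probability in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

def bce_from_prob(y, p):
    # from_logits=False case: p is already a probability in [0, 1].
    return -(y * math.log(p) + (1 - y) * math.log(1 - p))

def bce_from_logit(y, z):
    # from_logits=True case: z is a raw logit; this numerically stable
    # rearrangement equals bce_from_prob(y, sigmoid(z)).
    return max(z, 0.0) - z * y + math.log(1.0 + math.exp(-abs(z)))

# The two paths agree once the logit is squashed to a probability.
z = 1.5  # an arbitrary raw logit
assert abs(bce_from_logit(1.0, z) - bce_from_prob(1.0, sigmoid(z))) < 1e-9
```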

label_smoothing
Float in [0, 1]. If 0, no smoothing is applied.
When > 0, the loss is computed between the predicted labels and a smoothed version of the true labels, where the smoothing squeezes the labels toward 0.5.
Larger values of label_smoothing correspond to heavier smoothing.
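For binary targets, this smoothing replaces each label y with y*(1 - label_smoothing) + 0.5*label_smoothing, so with label_smoothing=0.1 a label of 0 becomes 0.05 and a label of 1 becomes 0.95. A small sketch of the transformation (illustrative, not TensorFlow's code):

```python
def smooth_labels(y_true, label_smoothing):
    """Squeeze binary labels toward 0.5: y -> y*(1-a) + 0.5*a."""
    a = label_smoothing
    return [[y * (1 - a) + 0.5 * a for y in row] for row in y_true]

y_true = [[0., 1.], [0., 0.]]
smoothed = smooth_labels(y_true, 0.1)
print([[round(v, 2) for v in row] for row in smoothed])
# [[0.05, 0.95], [0.05, 0.05]]
```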

reduction
(Optional) Type of tf.keras.losses.Reduction to apply to the loss. Defaults to AUTO.
AUTO means the reduction option is determined by the usage context.

name
(Optional) Name for the op. Defaults to 'binary_crossentropy'.

Arguments when calling the instance

y_true: ground-truth values

y_pred: predicted values

sample_weight: (Optional) Coefficient for the loss.
If a scalar is provided, the loss is simply scaled by the given value.
If sample_weight is a tensor of size [batch_size], the total loss for each sample of the batch is rescaled by the corresponding element in the sample_weight vector.
If the shape of sample_weight is [batch_size, d0, ... dN-1] (or can be broadcast to this shape), each loss element of y_pred is scaled by the corresponding value of sample_weight.
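With the default reduction, a [batch_size] sample_weight rescales each sample's loss before the batch average. A hand computation in plain Python (mirroring, not calling, the Keras behavior) reproduces the 0.458 result shown in the weighted example below:

```python
import math

def weighted_bce(y_true, y_pred, sample_weight):
    """Per-sample binary cross-entropy, scaled by sample_weight,
    then averaged over the batch size."""
    per_sample = []
    for yt_row, yp_row, w in zip(y_true, y_pred, sample_weight):
        losses = [-(y * math.log(p) + (1 - y) * math.log(1 - p))
                  for y, p in zip(yt_row, yp_row)]
        per_sample.append(w * sum(losses) / len(losses))
    return sum(per_sample) / len(per_sample)

y_true = [[0., 1.], [0., 0.]]
y_pred = [[0.6, 0.4], [0.4, 0.6]]
# Weight 1 keeps the first sample's loss; weight 0 drops the second.
print(round(weighted_bce(y_true, y_pred, [1, 0]), 3))  # 0.458
```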

Examples

Standalone usage

y_true = [[0., 1.], [0., 0.]]
y_pred = [[0.6, 0.4], [0.4, 0.6]]
# Using 'auto'/'sum_over_batch_size' reduction type.
bce = tf.keras.losses.BinaryCrossentropy()
bce(y_true, y_pred).numpy()
0.815

# Calling with 'sample_weight'.
bce(y_true, y_pred, sample_weight=[1, 0]).numpy()
0.458
# Using 'sum' reduction type.
bce = tf.keras.losses.BinaryCrossentropy(
    reduction=tf.keras.losses.Reduction.SUM)
bce(y_true, y_pred).numpy()
1.630
# Using 'none' reduction type.
bce = tf.keras.losses.BinaryCrossentropy(
    reduction=tf.keras.losses.Reduction.NONE)
bce(y_true, y_pred).numpy()
array([0.916, 0.714], dtype=float32)

**Usage with the tf.keras `compile()` API**

model.compile(optimizer='sgd', loss=tf.keras.losses.BinaryCrossentropy())


Origin: blog.csdn.net/weixin_46649052/article/details/112707572