tf.keras.layers operations

tf.keras.layers.Permute

Function prototype:

tf.keras.layers.Permute(
    dims, **kwargs
)

Usage example:

from tensorflow.keras.layers import Permute
from tensorflow.keras.models import Sequential

model = Sequential()
model.add(Permute((2, 1), input_shape=(10, 64)))
# now: model.output_shape == (None, 64, 10)
# note: `None` is the batch dimension

Arguments:

dims : Tuple of integers. Permutation pattern does not include the samples dimension. Indexing starts at 1. For instance, (2, 1) permutes the first and second dimensions of the input.
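The permutation above can be checked on a concrete tensor. This is a minimal sketch: Permute((2, 1)) swaps the two non-batch axes, which is the same as tf.transpose on axes 1 and 2.

```python
import numpy as np
import tensorflow as tf

x = np.arange(6, dtype=np.float32).reshape(1, 2, 3)  # (batch=1, 2, 3)
y = tf.keras.layers.Permute((2, 1))(x)               # -> (1, 3, 2)
print(y.shape)  # (1, 3, 2)
# Same result as transposing the non-batch axes directly:
print(np.array_equal(y.numpy(), np.transpose(x, (0, 2, 1))))  # True
```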

tf.keras.layers.Multiply

Function prototype:

tf.keras.layers.Multiply(
    **kwargs
)

Usage example:

import numpy as np
import tensorflow as tf

tf.keras.layers.Multiply()([np.arange(5).reshape(5, 1),
                            np.arange(5, 10).reshape(5, 1)])

Output:

tf.Tensor(
[[ 0]
 [ 6]
 [14]
 [24]
 [36]], shape=(5, 1), dtype=int32)

As the example shows, tf.keras.layers.Multiply performs element-wise multiplication of tensors with the same shape.
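In a model, Multiply is typically used to merge two layer outputs, for example to gate one branch with a sigmoid mask produced by another. The sketch below is illustrative only; the layer sizes and activations are assumptions, not part of the original example.

```python
import tensorflow as tf

inp = tf.keras.Input(shape=(8,))
features = tf.keras.layers.Dense(8, activation="relu")(inp)
gate = tf.keras.layers.Dense(8, activation="sigmoid")(inp)  # values in (0, 1)
gated = tf.keras.layers.Multiply()([features, gate])        # element-wise product
model = tf.keras.Model(inp, gated)
print(model.output_shape)  # (None, 8)
```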

tf.keras.layers.Reshape

Function prototype:

tf.keras.layers.Reshape(
    target_shape, **kwargs
)

Usage example:

import numpy as np
import tensorflow as tf

n = np.arange(32).reshape(2, 16)
k = tf.convert_to_tensor(n)
xx = tf.keras.layers.Reshape((8, 2))(k)
xn = tf.keras.layers.Reshape((4, 4))(xx)

n1 = np.arange(32).reshape(16, 2)
k1 = tf.convert_to_tensor(n1)
xx1 = tf.keras.layers.Reshape((1, 2))(k1)

As these examples show, the output shape is (batch_size,) + target_shape.
The first dimension is always batch_size; the remaining elements are rearranged in row-major order, filling dimensions from first to last. Every Reshape follows this same order, so the element layout is always predictable.

model = tf.keras.Sequential()
model.add(tf.keras.layers.Reshape((3, 4), input_shape=(12,)))
# model.output_shape == (None, 3, 4), `None` is the batch size.
model.output_shape

Output: (None, 3, 4)
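The row-major claim above can be verified directly: on the non-batch dimensions, Reshape rearranges elements exactly like numpy.reshape. A minimal sketch:

```python
import numpy as np
import tensorflow as tf

n = np.arange(12).reshape(1, 12)          # batch of 1, 12 features
out = tf.keras.layers.Reshape((3, 4))(n)  # -> (1, 3, 4)
# Elements land in the same positions as a row-major numpy reshape:
print(np.array_equal(out.numpy(), n.reshape(1, 3, 4)))  # True
```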

tf.keras.layers.RepeatVector

Function prototype:

tf.keras.layers.RepeatVector(
    n, **kwargs
)

Usage example:

from tensorflow.keras.layers import Dense, RepeatVector
from tensorflow.keras.models import Sequential

model = Sequential()
model.add(Dense(32, input_dim=32))
# now: model.output_shape == (None, 32)
# note: `None` is the batch dimension

model.add(RepeatVector(3))
# now: model.output_shape == (None, 3, 32)

Input shape:
2D tensor of shape (num_samples, features).

Output shape:
3D tensor of shape (num_samples, n, features).
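On a concrete batch, the shapes above mean each feature vector is copied n times along a new time axis, which is equivalent to tiling. A small sketch:

```python
import numpy as np
import tensorflow as tf

x = np.array([[1.0, 2.0]])              # (num_samples=1, features=2)
y = tf.keras.layers.RepeatVector(3)(x)  # -> (1, 3, 2)
# Each row of x is repeated 3 times along the new middle axis,
# the same as np.tile(x[:, None, :], (1, 3, 1)).
print(y.numpy())
# [[[1. 2.]
#   [1. 2.]
#   [1. 2.]]]
```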


Reprinted from blog.csdn.net/u011913417/article/details/110855175