Rules for specifying the input size of the first layer in Keras

1. Sequential models

If you want to fix the batch size, the first layer must use batch_input_shape instead of input_shape, because input_shape cannot specify the batch size: its batch dimension is always None.

input_shape vs. batch_input_shape:
input_shape does not include the batch size.
batch_input_shape is the shape of the whole input, including the batch size.
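A minimal sketch of the difference, assuming TensorFlow's bundled Keras (`tensorflow.keras`). With input_shape the batch dimension stays None, so any batch size is accepted; with batch_input_shape the batch size is pinned (here to 10):

```python
import numpy as np
from tensorflow.keras import Sequential
from tensorflow.keras.layers import Dense

# input_shape: the batch dimension is implicitly None, any batch size works
m1 = Sequential()
m1.add(Dense(32, input_shape=(16,)))

# batch_input_shape: the full input shape, batch size fixed at 10
m2 = Sequential()
m2.add(Dense(32, batch_input_shape=(10, 16)))

out1 = m1(np.zeros((4, 16), dtype="float32"))   # a batch of 4 is fine
out2 = m2(np.zeros((10, 16), dtype="float32"))  # the batch must be 10
```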

2. Functional models

Parameters of Input:
shape: a shape tuple (integers), not including the batch size. For example, shape=(32,) indicates that the expected input will be batches of 32-dimensional vectors.
batch_shape: a shape tuple (integers), including the batch size. For example, batch_shape=(10, 32) indicates that the expected input will be batches of 10 32-dimensional vectors, while batch_shape=(None, 32) indicates batches of 32-dimensional vectors with any batch size.
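The same two parameters in the functional API, again assuming `tensorflow.keras`. Note how the batch dimension shows up in the resulting tensor shapes:

```python
from tensorflow import keras
from tensorflow.keras import Input
from tensorflow.keras.layers import Dense

# shape excludes the batch size: batches of 32-dim vectors, any batch size
x = Input(shape=(32,))

# batch_shape includes the batch size: batches of exactly 10 vectors
xb = Input(batch_shape=(10, 32))

model = keras.Model(x, Dense(10)(x))
```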

The official documentation explains it like this:

# As the first layer of a Sequential model
model = Sequential()
model.add(Dense(32, input_shape=(16,)))
# The model now takes arrays of shape (*, 16) as input,
# and its output arrays have shape (*, 32)

# After the first layer, you no longer need to specify the input size:
model.add(Dense(32))
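To confirm the (*, 16) → (*, 32) behavior described above, a quick check (again assuming `tensorflow.keras`) shows the built model accepting several different batch sizes:

```python
import numpy as np
from tensorflow.keras import Sequential
from tensorflow.keras.layers import Dense

model = Sequential()
model.add(Dense(32, input_shape=(16,)))
model.add(Dense(32))  # no input size needed after the first layer

# Because the batch dimension is None, any batch size is accepted
shapes = [tuple(model(np.zeros((b, 16), dtype="float32")).shape)
          for b in (1, 5, 64)]
```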

Origin blog.csdn.net/ALZFterry/article/details/109764969