Original link: https://arxiv.org/abs/1901.10444
Published: ICLR 2019
--------------------------------------------------------------------------------------------------------------------------------------------------------------------
Section 3 of the paper (RANDOM SENTENCE ENCODERS) introduces three ways to build a sentence representation without training the encoder:
- Bag of random embedding projections
- Random LSTMs
- Echo State Networks
Idea: use pretrained word embeddings as input, pass them through a sentence encoder whose weights are randomly initialized and never trained, pool the outputs into a fixed-size sentence vector, and train only a logistic regression classifier on top.
BAG OF RANDOM EMBEDDING PROJECTIONS (BOREP)
A random projection matrix W ∈ R^{d×d_e} is initialized once and never trained; each element is sampled uniformly from [-1/sqrt(d), 1/sqrt(d)]. Each word embedding e_t is projected as We_t, and the sentence representation is

h = f_pool(We_1, ..., We_n),

where f_pool is a pooling function such as max pooling or mean pooling, optionally followed by a nonlinearity such as ReLU(h) = max(0, h).
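The projection-and-pool step above can be sketched in a few lines of numpy. The hidden size d and the uniform init range follow the paper; the function name, seed handling, and default pooling choice are illustrative assumptions.

```python
import numpy as np

def borep(embeddings, d=4096, pooling="max", seed=0):
    """Bag of random embedding projections (sketch).

    embeddings: (n_words, d_e) array of pretrained word vectors.
    Returns a fixed-size sentence vector of shape (d,).
    """
    rng = np.random.default_rng(seed)
    d_e = embeddings.shape[1]
    # Random projection, each element ~ U[-1/sqrt(d), 1/sqrt(d)], never trained.
    W = rng.uniform(-1 / np.sqrt(d), 1 / np.sqrt(d), size=(d, d_e))
    H = embeddings @ W.T              # (n_words, d): one projected vector per word
    H = np.maximum(H, 0)              # optional ReLU nonlinearity
    return H.max(axis=0) if pooling == "max" else H.mean(axis=0)
```

Only the downstream classifier ever sees gradients; W stays fixed, so this is essentially a cheap random feature map over the word embeddings.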
RANDOM LSTMS
Likewise, all LSTM weight matrices are randomly initialized (uniformly in [-1/sqrt(d), 1/sqrt(d)], where d is the LSTM hidden size) and kept frozen. A bidirectional LSTM is run over the word embeddings, and its hidden states are pooled to obtain the sentence representation:

h = f_pool(BiLSTM(e_1, ..., e_n)).
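A minimal numpy sketch of the random BiLSTM encoder, under stated assumptions: the gate ordering, weight layout, and max pooling are illustrative choices, while the frozen uniform init in [-1/sqrt(d), 1/sqrt(d)] follows the paper.

```python
import numpy as np

def random_bilstm_encoder(embeddings, d=64, seed=0):
    """Encode a sentence with an untrained bidirectional LSTM (sketch).

    embeddings: (n_words, d_e) pretrained word vectors.
    Returns a (2*d,) sentence vector: both directions, max-pooled over time.
    """
    rng = np.random.default_rng(seed)
    d_e = embeddings.shape[1]
    s = 1 / np.sqrt(d)

    def run_lstm(seq):
        # All weights random uniform in [-1/sqrt(d), 1/sqrt(d)] and frozen.
        Wx = rng.uniform(-s, s, size=(4 * d, d_e))
        Wh = rng.uniform(-s, s, size=(4 * d, d))
        b = rng.uniform(-s, s, size=4 * d)
        h, c, states = np.zeros(d), np.zeros(d), []
        for x in seq:
            z = Wx @ x + Wh @ h + b
            # Input, forget, output gates (sigmoid) and candidate cell (tanh).
            i, f, o = (1 / (1 + np.exp(-z[k * d:(k + 1) * d])) for k in range(3))
            g = np.tanh(z[3 * d:])
            c = f * c + i * g
            h = o * np.tanh(c)
            states.append(h)
        return np.array(states)

    fwd = run_lstm(embeddings)
    bwd = run_lstm(embeddings[::-1])[::-1]   # backward pass, re-aligned in time
    # Concatenate directions per time step, then max-pool over time.
    return np.concatenate([fwd, bwd], axis=1).max(axis=0)
```

In practice one would use a framework BiLSTM with gradients disabled; the point is only that the recurrence itself is never trained.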
ECHO STATE NETWORKS
An echo state network can be written in the following form:

x~_t = f(W^i e_t + W^h x_{t-1} + b^i)
x_t = (1 - alpha) x_{t-1} + alpha x~_t,

where the input weights W^i, the sparse reservoir W^h, and the bias b^i are all randomly initialized and fixed; W^h is rescaled so that its spectral radius stays below a threshold, which gives the echo state property. Here too a bidirectional ESN is used, and the final sentence representation pools the concatenated forward and backward states:

h = f_pool([ESN_fwd(e_1, ..., e_n); ESN_bwd(e_1, ..., e_n)]).
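The ESN recurrence above can be sketched as follows. Rescaling the reservoir by its spectral radius is standard ESN practice; the specific parameter values (alpha, rho, sparsity, init ranges) are illustrative assumptions, not the paper's tuned settings.

```python
import numpy as np

def random_esn_encoder(embeddings, d=64, alpha=0.5, rho=0.9, sparsity=0.5, seed=0):
    """Bidirectional echo-state sentence encoder (sketch).

    embeddings: (n_words, d_e) pretrained word vectors.
    Returns a (2*d,) sentence vector: both directions, mean-pooled over time.
    """
    rng = np.random.default_rng(seed)
    d_e = embeddings.shape[1]
    Wi = rng.uniform(-0.1, 0.1, size=(d, d_e))   # input weights, fixed
    bi = rng.uniform(-0.1, 0.1, size=d)
    # Sparse random reservoir, rescaled so its spectral radius equals rho < 1
    # (echo state property: old inputs are gradually forgotten).
    Wh = rng.uniform(-1, 1, size=(d, d)) * (rng.random((d, d)) < sparsity)
    Wh *= rho / np.max(np.abs(np.linalg.eigvals(Wh)))

    def run(seq):
        x, states = np.zeros(d), []
        for e in seq:
            x_new = np.tanh(Wi @ e + Wh @ x + bi)    # candidate state x~_t
            x = (1 - alpha) * x + alpha * x_new      # leaky integration
            states.append(x)
        return np.array(states)

    fwd = run(embeddings)
    bwd = run(embeddings[::-1])[::-1]
    return np.concatenate([fwd, bwd], axis=1).mean(axis=0)
```

Nothing here is ever trained; as with BOREP and the random LSTM, only the logistic regression classifier on top of h learns.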
Jumping straight to the authors' conclusions: