Reposted from: FesianXu
Expanding tensors with tf.tile()
tf.tile() is used when a tensor needs to be tiled (expanded). Concretely: given a tensor of shape [width, height], you may need a tensor of shape [batch_size, width, height] built from the original, where every batch entry is identical to the original tensor. The usage of tf.tile is:
tile(
    input,
    multiples,
    name=None
)
The output repeats the input along each axis i according to multiples[i]. For example:
import tensorflow as tf

raw = tf.Variable(tf.random_normal(shape=(1, 3, 2)))
# Repeat axis 0 twice; leave axes 1 and 2 unchanged.
multi = tf.tile(raw, multiples=[2, 1, 1])
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    print(raw.eval())
    print('-----------------------------')
    print(sess.run(multi))
The output is:
[[[-0.50027871 -0.48475555]
[-0.52617502 -0.2396145 ]
[ 1.74173343 -0.20627949]]]
-----------------------------
[[[-0.50027871 -0.48475555]
[-0.52617502 -0.2396145 ]
[ 1.74173343 -0.20627949]]
[[-0.50027871 -0.48475555]
[-0.52617502 -0.2396145 ]
[ 1.74173343 -0.20627949]]]
As you can see, multi repeats raw twice along axis 0, while axes 1 and 2 are unchanged.
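The same semantics can be checked without a TensorFlow session using NumPy's np.tile, which follows the same repetition rule as tf.tile in these cases (a minimal sketch, not the TensorFlow API itself; the array contents are arbitrary):

```python
import numpy as np

# A (1, 3, 2) array analogous to `raw` above.
raw = np.arange(6).reshape(1, 3, 2)

# Repeat axis 0 twice; axes 1 and 2 stay unchanged.
multi = np.tile(raw, (2, 1, 1))

print(raw.shape)    # (1, 3, 2)
print(multi.shape)  # (2, 3, 2)
# Both slices along axis 0 are identical copies of the original.
print(np.array_equal(multi[0], raw[0]) and np.array_equal(multi[1], raw[0]))  # True
```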
A few more examples:
import tensorflow as tf

temp = tf.tile([1, 2, 3], [2])
temp2 = tf.tile([[1, 2], [3, 4], [5, 6]], [2, 3])
with tf.Session() as sess:
    print(sess.run(temp))
    print(sess.run(temp2))
[1 2 3 1 2 3]
[[1 2 1 2 1 2]
[3 4 3 4 3 4]
[5 6 5 6 5 6]
[1 2 1 2 1 2]
[3 4 3 4 3 4]
[5 6 5 6 5 6]]
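A useful rule of thumb: the output shape is the input shape multiplied elementwise by multiples, so a (3, 2) input tiled with [2, 3] yields a (6, 6) result. This can be verified with NumPy's np.tile, which applies the same rule (a sketch, not the TensorFlow call itself):

```python
import numpy as np

a = np.array([[1, 2], [3, 4], [5, 6]])  # shape (3, 2)
# Each output dimension = input dimension * corresponding reps entry.
tiled = np.tile(a, (2, 3))              # shape (3*2, 2*3) = (6, 6)

print(tiled.shape)  # (6, 6)
print(tiled)
```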
import tensorflow as tf

temp = tf.tile([[1, 2, 3], [1, 2, 3]], [1, 1])
temp2 = tf.tile([[1, 2, 3], [1, 2, 3]], [2, 1])
temp3 = tf.tile([[1, 2, 3], [1, 2, 3]], [2, 2])
with tf.Session() as sess:
    print(sess.run(temp))
    print(sess.run(temp2))
    print(sess.run(temp3))
[[1 2 3]
[1 2 3]]
[[1 2 3]
[1 2 3]
[1 2 3]
[1 2 3]]
[[1 2 3 1 2 3]
[1 2 3 1 2 3]
[1 2 3 1 2 3]
[1 2 3 1 2 3]]
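Returning to the opening scenario of expanding a [width, height] tensor into [batch_size, width, height]: the idiom is to first add a leading axis of size 1 (e.g. with tf.expand_dims) and then tile along it. A NumPy sketch of the idea, with hypothetical sizes chosen only for illustration:

```python
import numpy as np

width, height, batch_size = 4, 5, 3  # hypothetical sizes

x = np.random.randn(width, height)                 # shape (4, 5)
x_expanded = np.expand_dims(x, axis=0)             # shape (1, 4, 5)
batched = np.tile(x_expanded, (batch_size, 1, 1))  # shape (3, 4, 5)

print(batched.shape)  # (3, 4, 5)
# Every batch entry is identical to the original tensor.
print(all(np.array_equal(batched[i], x) for i in range(batch_size)))  # True
```

In TensorFlow the same two steps would be tf.expand_dims(x, 0) followed by tf.tile(..., [batch_size, 1, 1]).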