Random tensors: random seed operations (tf.set_random_seed(integer))

Random seeds actually come in two kinds: graph-level and op-level. The seed parameterizes the random number generators of the ops in the dataflow graph. Let me introduce each case specifically.

The first case: to generate different sequences across Sessions, set neither a graph-level nor an op-level seed:

import tensorflow as tf

a = tf.random_uniform([1])
b = tf.random_normal([1])

print("Session 1")
with tf.Session() as sess1:
  print(sess1.run(a))  # generates 'A1'
  print(sess1.run(a))  # generates 'A2'
  print(sess1.run(b))  # generates 'B1'
  print(sess1.run(b))  # generates 'B2'

print("Session 2")
with tf.Session() as sess2:
  print(sess2.run(a))  # generates 'A3'
  print(sess2.run(a))  # generates 'A4'
  print(sess2.run(b))  # generates 'B3'
  print(sess2.run(b))  # generates 'B4'

The results:

As is apparent, whether within the same Session or across different Sessions, the generated sequences are all different.
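The same behaviour can be mimicked outside TensorFlow. Here is a minimal NumPy sketch (an analogy only, not how TensorFlow is implemented) of case one, where no seed is fixed anywhere: each fresh generator plays the role of a fresh Session.

```python
import numpy as np

# Two generators created without an explicit seed draw their entropy
# from the OS, so their sequences are (almost surely) different --
# just like two unseeded TensorFlow Sessions.
rng1 = np.random.default_rng()  # plays the role of "Session 1"
rng2 = np.random.default_rng()  # plays the role of "Session 2"

a1 = rng1.uniform(size=1)
a2 = rng1.uniform(size=1)
a3 = rng2.uniform(size=1)
print(a1, a2, a3)  # all different with overwhelming probability
```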

The second case: to generate the same sequence repeatably across Sessions, set a seed on the op:

import tensorflow as tf

a = tf.random_uniform([1], seed=1)  # op-level random seed
b = tf.random_normal([1])

print("Session 1")
with tf.Session() as sess1:
  print(sess1.run(a))  # generates 'A1'
  print(sess1.run(a))  # generates 'A2'
  print(sess1.run(b))  # generates 'B1'
  print(sess1.run(b))  # generates 'B2'

print("Session 2")
with tf.Session() as sess2:
  print(sess2.run(a))  # generates 'A3'
  print(sess2.run(a))  # generates 'A4'
  print(sess2.run(b))  # generates 'B3'
  print(sess2.run(b))  # generates 'B4'

The results:

It is apparent that with an op-level seed, op a generates a different sequence within a single Session but identical sequences across Sessions, while the unseeded op b still differs everywhere.
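The op-level behaviour has the same shape as seeding one generator but not another. A hedged NumPy analogy (the seed value 1 is arbitrary; `new_session` is just an illustrative helper, not a TensorFlow API):

```python
import numpy as np

# Re-creating a generator with the same seed replays its sequence,
# analogous to re-running a seeded op in a fresh Session.
def new_session():
    seeded = np.random.default_rng(seed=1)  # like op a with seed=1
    unseeded = np.random.default_rng()      # like op b with no seed
    return seeded, unseeded

a_s1, b_s1 = new_session()
a_s2, b_s2 = new_session()

a1, a2 = a_s1.uniform(size=1), a_s1.uniform(size=1)
a3, a4 = a_s2.uniform(size=1), a_s2.uniform(size=1)

print(a1 == a3, a2 == a4)  # seeded: identical across "sessions"
print(a1 == a2)            # but successive draws within one differ
```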

The third case: to make every generated random sequence repeatable across Sessions, set the graph-level seed:

import tensorflow as tf

tf.set_random_seed(1234)
a = tf.random_uniform([1])
b = tf.random_normal([1])

print("Session 1")
with tf.Session() as sess1:
  print(sess1.run(a))  # generates 'A1'
  print(sess1.run(a))  # generates 'A2'
  print(sess1.run(b))  # generates 'B1'
  print(sess1.run(b))  # generates 'B2'

print("Session 2")
with tf.Session() as sess2:
  print(sess2.run(a))  # generates 'A3'
  print(sess2.run(a))  # generates 'A4'
  print(sess2.run(b))  # generates 'B3'
  print(sess2.run(b))  # generates 'B4'

 

 

It is clearly apparent that all sequences generated across Sessions are repeated, yet within a single Session they still differ: this is the graph-level random seed. In tf.set_random_seed(integer), the particular integer makes no real difference; the point is simply that the same integer produces the same fixed sequences every time.
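The graph-level seed can likewise be sketched with NumPy's module-level seed (again only an analogy; `run_session` is a hypothetical helper standing in for one program run):

```python
import numpy as np

# Setting the module-level seed before any draws plays the role of the
# graph-level seed: a fresh "run" with the same seed reproduces every
# sequence, while successive draws within one run still differ.
def run_session(graph_seed=1234):
    np.random.seed(graph_seed)  # analogue of tf.set_random_seed
    a = np.random.uniform(size=1), np.random.uniform(size=1)
    b = np.random.normal(size=1), np.random.normal(size=1)
    return a, b

(a1, a2), (b1, b2) = run_session()
(a3, a4), (b3, b4) = run_session()
print(a1 == a3, a2 == a4, b1 == b3, b2 == b4)  # all repeat across runs
print(a1 == a2)                                # but differ within a run
```

Note that in TensorFlow 2.x this API was renamed: tf.set_random_seed became tf.random.set_seed.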

 


Origin www.cnblogs.com/happy-sir/p/11530528.html