Author: 手机用户2602916141 | Source: Internet | 2023-09-09 18:19
In TensorFlow v2, the code below can cause a GraphDef reconciliation error.
```python
import tensorflow as tf

writer = tf.summary.create_file_writer('logs')

@tf.function
def foo(x):
    return x ** 2

with writer.as_default():
    tf.summary.trace_on()
    foo(1)
    foo(2)
    tf.summary.trace_export("foo")
```
Depending on the argument, `tf.function` (really, AutoGraph) creates ops whose names are unique within a GraphDef but not globally unique. In the example above, two GraphDefs (one from `foo(1)` and another from `foo(2)`) are written out, and they can collide badly in both names and content.
In such cases, rather than showing wrong graph content, TensorBoard throws an error.
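A quick way to see why two GraphDefs are produced is to count traces with a Python side effect (a minimal sketch; `trace_count` is just an illustrative name). The counter only increments while `tf.function` is tracing, not on every call:

```python
import tensorflow as tf

trace_count = 0

@tf.function
def foo(x):
    global trace_count
    trace_count += 1  # executed only during tracing, not on each call
    return x ** 2

foo(1)
foo(2)
print(trace_count)  # each distinct Python value triggers a separate trace

foo(tf.constant(3))
foo(tf.constant(4))
print(trace_count)  # tensors with the same dtype/shape reuse one trace
```

Each retrace yields its own GraphDef, which is what `tf.summary.trace_export` ends up writing out.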
This question comes from the open-source project: tensorflow/tensorboard
I had the same issue. TensorBoard needs unique names for the graph variables (I don't know why, and I hope this issue will be fixed). In your case, this piece of code should fix it:
```python
import tensorflow as tf

@tf.function
def foo(x):
    return x ** 2

writer = tf.summary.create_file_writer('logs\\')
with writer.as_default():
    tf.summary.trace_on()
    foo(tf.Variable(1, name='foo1'))  # define a unique name for the variable
    foo(tf.Variable(2, name='foo2'))
    tf.summary.trace_export("foo", step=0)
```
This issue also exists when subclassing tf.Module. There, self.name_scope (or tf.name_scope) can be used when defining the module's variables (whether or not it also wraps the other operations). Here is an example of a custom Dense layer:
```python
import tensorflow as tf
import numpy as np

class Dense(tf.Module):
    # Fully-connected layer.
    def __init__(self, out_fmaps, name=None):
        super().__init__(name=name)
        self.is_built = False
        self.out_fmaps = out_fmaps

    def __call__(self, x):
        if not self.is_built:
            with self.name_scope:  # creates the variable under the module's name scope
                he_init = np.sqrt(2 / x.shape[-1])
                init_val = tf.random.normal([x.shape[-1], self.out_fmaps]) * he_init
                self.w = tf.Variable(init_val, name='dense')
            self.is_built = True
        return tf.matmul(x, self.w)
```