Hello,
I received the following error:
"TypeError: Exception encountered when calling CLIPEncoderLayer.call().
Could not automatically infer the output shape / dtype of 'clip_encoder_layer' (of type CLIPEncoderLayer). Either the CLIPEncoderLayer.call() method is incorrect, or you need to implement the CLIPEncoderLayer.compute_output_spec() / compute_output_shape() method. Error encountered:
Exception encountered when calling CLIPAttention.call().
pred must not be a Python bool
Arguments received by CLIPAttention.call():
• inputs=tf.Tensor(shape=(None, 77, 768), dtype=float32)
• attention_mask=None
Arguments received by CLIPEncoderLayer.call():
• args=('<KerasTensor shape=(None, 77, 768), dtype=float32, sparse=False, name=keras_tensor_60>',)
• kwargs=<class 'inspect._empty'>"
for the line
"unconditional_context = self.text_encoder.predict_on_batch([unconditional_tokens, self._get_pos_ids()])"
in the function "_get_unconditional_context" in the file "stable_diffusion.py".
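
As far as I can tell, "pred must not be a Python bool" is the error tf.cond raises when it receives a plain Python bool instead of a boolean tensor, so I suspect some flag (for example a training argument) is reaching a tf.cond inside CLIPAttention.call(). A minimal illustration of that underlying TensorFlow behaviour (my own repro, not the KerasCV code):

import tensorflow as tf

try:
    tf.cond(True, lambda: 1, lambda: 2)  # plain Python bool
except TypeError as e:
    print(e)  # pred must not be a Python bool

tf.cond(tf.constant(True), lambda: 1, lambda: 2)  # boolean tensor: works

The message also suggests implementing CLIPEncoderLayer.compute_output_shape(). Since the traceback shows the layer receiving tensors of shape (None, 77, 768), and an encoder block should preserve that shape, I assume the override would look roughly like the sketch below (my assumption, not the actual KerasCV code):

from keras import layers

class CLIPEncoderLayer(layers.Layer):
    # ... existing call() and weights omitted ...

    def compute_output_shape(self, input_shape):
        # A transformer encoder block maps (batch, seq_len, hidden_dim)
        # to the same shape, here (None, 77, 768).
        return input_shape

I am not sure either of these is on the right track, though.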
Please help me get this code running.
Thanks.