Fix to get fp8 working on T5 base.
comfyanonymous committed Jul 31, 2024
1 parent a5991a7 commit c24f897
Showing 1 changed file with 2 additions and 0 deletions.
comfy/text_encoders/t5.py (2 additions, 0 deletions)

@@ -236,4 +236,6 @@ def set_input_embeddings(self, embeddings):
 
     def forward(self, input_ids, *args, **kwargs):
         x = self.shared(input_ids, out_dtype=kwargs.get("dtype", torch.float32))
+        if self.dtype not in [torch.float32, torch.float16, torch.bfloat16]:
+            x = torch.nan_to_num(x) #Fix for fp8 T5 base
         return self.encoder(x, *args, **kwargs)
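
For context (not part of the commit): the added guard fires only when self.dtype is outside {float32, float16, bfloat16}, which in practice means the fp8 weight dtypes, where the embedding lookup can emit NaN values that would otherwise propagate through every encoder layer. Below is a minimal, hypothetical sketch of the effect, with the NaN written in by hand rather than produced by a real fp8 cast:

    import torch

    # Hypothetical stand-in for the T5 "shared" embedding table; the NaN
    # written into row 3 simulates a value corrupted by fp8 quantization.
    shared = torch.nn.Embedding(32, 8)
    with torch.no_grad():
        shared.weight[3, :] = float("nan")

    x = shared(torch.tensor([1, 3, 5]))
    print(torch.isnan(x).any())  # tensor(True): the NaN would poison the encoder

    # The committed fix: nan_to_num maps NaN -> 0.0 and +/-inf -> finite values.
    x = torch.nan_to_num(x)
    print(torch.isnan(x).any())  # tensor(False)

By default torch.nan_to_num replaces NaN with 0.0 and +/-inf with the dtype's largest finite values, so the cleaned tensor is safe for the downstream attention layers at the cost of zeroing the corrupted entries.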
