FP8 range for E4M3 dtype

The E4M3 data type is supposed to have a maximum value of 448. When I test that data type in version 1.16.2, I see the maximum value is 240. Why is there such a discrepancy?

To reproduce the issue, you can use:

```python
import torch
import habana_frameworks.torch.core as htcore
import habana_frameworks.torch.hpex.experimental.transformer_engine as te

x = torch.arange(0, 448., device='hpu').to(torch.float8_e4m3fn)
print(x)
```


Thanks for posting; we are looking into this.

Hi, is there any update on this range?

This is being worked on by internal teams. Once they provide guidelines on exactly which FP8 variant is supported, we will post an update here.
