The E4M3 data type is supposed to have a maximum value of 448, but when I test it in 1.16.2 the maximum value I see is 240. Why is there such a discrepancy?
To reproduce the issue, you can use:
```
import torch
import habana_frameworks.torch.core as htcore
import habana_frameworks.torch.hpex.experimental.transformer_engine as te
x = torch.arange(0, 448., device='hpu').to(torch.float8_e4m3fn)
print(x)
```
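For reference, the nominal 448 maximum of E4M3 follows directly from the format definition (1 sign bit, 4 exponent bits with bias 7, 3 mantissa bits, with the "fn" finite variant using the top exponent code for normal values). A quick sketch of that arithmetic, independent of any hardware:

```python
# OCP FP8 E4M3 (the "fn" / finite variant used by torch.float8_e4m3fn):
# largest normal value has exponent field 1111 (unbiased exponent 15 - 7 = 8)
# and mantissa 110 (mantissa 111 at that exponent encodes NaN),
# i.e. 1.110b = 1.75, so the maximum is 1.75 * 2**8.
max_e4m3 = (1 + 6 / 8) * 2 ** 8
print(max_e4m3)  # 448.0
```

This is the value the question expects; the observed 240 suggests the hardware is using a different FP8 variant or saturation behavior, which is what the repro above demonstrates.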