Does HPU support the complex datatype in torch?

Does HPU support torch.stft()?

No, the complex datatype isn't supported.

Workaround: if your model contains stft, you can run the stft on the CPU by moving its input there (input = input.to('cpu')), assuming real-number inputs, and possibly setting return_complex=False. Once the stft is done on the CPU, you can move the result back to the HPU and run the rest of the model there (see the sketch after the error output below).

>>> import torch
>>> import habana_frameworks.torch.core as htcore
>>> torch.stft(torch.tensor([2,3,2,3,4,2,3,2.0]).to('hpu'), 8, return_complex=False)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/local/lib/python3.10/dist-packages/torch/functional.py", line 660, in stft
    return _VF.stft(input, n_fft, hop_length, win_length, window,  # type: ignore[attr-defined]
RuntimeError: Complex datatype is not supported on HPU device.
>>> torch.stft(torch.tensor([2,3,2,3,4,2,3,2.0]).to('hpu'), 8, return_complex=True)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/local/lib/python3.10/dist-packages/torch/functional.py", line 660, in stft
    return _VF.stft(input, n_fft, hop_length, win_length, window,  # type: ignore[attr-defined]
RuntimeError: Complex datatype is not supported on HPU device.
>>>
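A minimal sketch of the CPU-fallback workaround, assuming a real-valued input signal that already lives on the HPU and omitting the window argument for brevity:

import torch
import habana_frameworks.torch.core as htcore  # registers the 'hpu' device

signal = torch.tensor([2, 3, 2, 3, 4, 2, 3, 2.0]).to('hpu')

# Run stft on the CPU; return_complex=False keeps the output real-valued
# (the last dimension holds the real and imaginary parts), so no complex
# dtype is needed when moving the result back to the HPU.
spec = torch.stft(signal.to('cpu'), n_fft=8, return_complex=False).to('hpu')

# Continue the rest of the model on the HPU using `spec`.
print(spec.shape)

If a downstream op expects the complex form, the CPU-side result of return_complex=True can be converted with torch.view_as_real() before the transfer back to the HPU.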