AttributeError: module 'habana_frameworks.torch.hpu' has no attribute 'wrap_in_hpu_graph'

The Sentence Transformers module tries to call habana_frameworks.torch.hpu.wrap_in_hpu_graph, which seemingly does not exist.
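For context, here is a minimal reproduction outside Sentence Transformers (a sketch only; the library hits the same attribute internally when it wraps a model for HPU inference):

import torch
import habana_frameworks.torch.hpu as hpu  # importing this also loads the HPU device integration

model = torch.nn.Linear(4, 4).to("hpu")
model = hpu.wrap_in_hpu_graph(model)  # raises the AttributeError above in this environment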

See the error messages here.

  • I use the Docker image vault.habana.ai/gaudi-docker/1.18.0/ubuntu22.04/habanalabs/pytorch-installer-2.4.0:latest and then run pip install torch_geometric
  • Specifically, torch-geometric==2.6.1
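For reference, a quick way to confirm what that container actually ships (a hypothetical check snippet; the expected versions are the ones listed above):

import torch
import torch_geometric
import habana_frameworks.torch.hpu as hpu

print(torch.__version__)            # 2.4.0 according to the image tag
print(torch_geometric.__version__)  # 2.6.1
print(hpu.is_available())           # True when an HPU device is visible to the process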

See the code here.

Can you try replacing it with:
from habana_frameworks.torch.hpu import wrap_in_hpu_graph

As far as I can see, at least in the version of habana_frameworks I am currently using, wrap_in_hpu_graph is defined in habana_frameworks/torch/hpu/graphs.py, but the functions from .graphs are only imported in __init__.py when is_lazy() is True.

I manually modified habana_frameworks/torch/hpu/__init__.py by removing the if is_lazy() condition, and it works now.
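For anyone who would rather not patch the installed package, here is a less invasive sketch (assuming, as described above, that the function is still defined in habana_frameworks/torch/hpu/graphs.py):

import habana_frameworks.torch.hpu as hpu

try:
    # Works when __init__.py re-exports the symbol (lazy mode).
    wrap_in_hpu_graph = hpu.wrap_in_hpu_graph
except AttributeError:
    # Otherwise import it straight from the submodule where it is defined.
    from habana_frameworks.torch.hpu.graphs import wrap_in_hpu_graph

Whether the wrapped graph actually runs outside lazy mode is a separate question; see the reply about eager + torch.compile below.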

@Sayantan_S Thanks for the suggestion. We still have an ImportError. See below.

╭─────────────────────────────── Traceback (most recent call last) ────────────────────────────────╮
 in <module>:1                                                                                    
                                                                                                  
 1 from habana_frameworks.torch.hpu import wrap_in_hpu_graph                                    
   2                                                                                              
╰──────────────────────────────────────────────────────────────────────────────────────────────────╯
ImportError: cannot import name 'wrap_in_hpu_graph' from 'habana_frameworks.torch.hpu' 
(/usr/local/lib/python3.10/dist-packages/habana_frameworks/torch/hpu/__init__.py)

@vezenbu Are you using HPU graphs with eager mode + torch.compile?
HPU graphs are not supported in eager + torch.compile.
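Here is a sketch of how one might branch on the execution mode instead of assuming HPU graphs are available (is_lazy() is the same check mentioned earlier in the thread; the import path for it and the "hpu_backend" torch.compile backend name are assumptions):

import torch
import habana_frameworks.torch.hpu as hpu
from habana_frameworks.torch.utils.internal import is_lazy  # assumed location of is_lazy()

model = torch.nn.Linear(4, 4).to("hpu")

if is_lazy():
    # Lazy mode: HPU graphs are supported, so wrapping is fine.
    from habana_frameworks.torch.hpu.graphs import wrap_in_hpu_graph
    model = wrap_in_hpu_graph(model)
else:
    # Eager + torch.compile: HPU graphs are not supported, so compile the model instead.
    model = torch.compile(model, backend="hpu_backend")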