optimum[habana] version mismatch

When I install optimum[habana] with

pip install optimum[habana]

I get a version mismatch for transformers: pip reports that I need a later version of transformers than the one pinned on the Habana SynapseAI GitHub (GitHub - HabanaAI/Model-References: TensorFlow and PyTorch Reference models for Gaudi(R)).

ERROR: optimum-habana 1.1.2 has requirement transformers>=4.20.0, but you'll have transformers 4.19.2 which is incompatible.
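Just to make the mismatch concrete, here is a minimal stdlib-only sketch comparing the installed version reported in the error above against the minimum that optimum-habana 1.1.2 requires; the `version_tuple` helper is mine, not part of any library:

```python
def version_tuple(v):
    # Convert "4.19.2" -> (4, 19, 2) so versions compare numerically.
    return tuple(int(p) for p in v.split("."))

installed = "4.19.2"  # version reported by the pip error
required = "4.20.0"   # minimum required by optimum-habana 1.1.2

print(version_tuple(installed) >= version_tuple(required))  # prints False
```

So the transformers release pinned by the Model-References repo is strictly older than what optimum-habana 1.1.2 declares as its minimum.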

I also get another error when I simply try to import from optimum.habana:

from optimum.habana import GaudiTrainer, GaudiTrainingArguments
ImportError                               Traceback (most recent call last)
Input In [34], in <cell line: 10>()
      8 from transformers import DistilBertForSequenceClassification, Trainer, TrainingArguments
      9 # from transformers import DistilBertForSequenceClassification, GaudiTrainer, GaudiTrainingArguments
---> 10 from optimum.habana import GaudiTrainer, GaudiTrainingArguments

File /usr/local/lib/python3.8/dist-packages/optimum/habana/__init__.py:22, in <module>
     20 from .models.gpt2 import GaudiGPT2LMHeadModel
     21 from .models.t5 import GaudiT5ForConditionalGeneration
---> 22 from .trainer import GaudiTrainer
     23 from .trainer_seq2seq import GaudiSeq2SeqTrainer
     24 from .training_args import GaudiTrainingArguments

File /usr/local/lib/python3.8/dist-packages/optimum/habana/trainer.py:41, in <module>
     39 from transformers.modeling_utils import ModuleUtilsMixin, PreTrainedModel, unwrap_model
     40 from transformers.models.albert.modeling_albert import AlbertModel
---> 41 from transformers.pytorch_utils import ALL_LAYERNORM_LAYERS
     42 from transformers.tokenization_utils_base import PreTrainedTokenizerBase
     43 from transformers.trainer_callback import TrainerCallback, TrainerState

ImportError: cannot import name 'ALL_LAYERNORM_LAYERS' from 'transformers.pytorch_utils' (/usr/local/lib/python3.8/dist-packages/transformers/pytorch_utils.py)

Can you please let me know which version you are using? Is it 1.5 or 1.6?