`AcceleratorState` object has no attribute `distributed_type`

Problem description

I am trying to use the accelerator together with the trainer, using the following code:

    tokenizer = AutoTokenizer.from_pretrained(model_args.model_name_or_path)

    config = AutoConfig.from_pretrained(model_args.model_name_or_path)
    model = AutoModelForSeq2SeqLM.from_pretrained(
        model_args.model_name_or_path, config=config)
    collator = DataCollatorForSeq2Seq(tokenizer, model=model)

    train_set = CorefDataset(tokenizer, data_args, training_args, 'train')
    tb_callback = TensorBoardCallback()

    accelerator = Accelerator()
    trainer = accelerator.prepare(CorefTrainer(
        tokenizer=tokenizer,
        model=model,
        args=training_args,
        train_dataset=train_set,
        #        eval_dataset=dev_set,
        data_collator=collator,
        callbacks=[tb_callback]
    ))
    trainer.train()

Then, following the instructions in this post, I ran the code in Google Colab with this command:

!accelerate launch --config_file /root/.cache/huggingface/accelerate/default_config.yaml Seq2seqCoref/main.py

Then I got the following error:

Traceback (most recent call last):
  File "/content/Seq2seqCoref/main.py", line 41, in <module>
    trainer = accelerator.prepare(CorefTrainer(
  File "/usr/local/lib/python3.10/dist-packages/accelerate/accelerator.py", line 1248, in prepare
    if self.distributed_type == DistributedType.DEEPSPEED:
  File "/usr/local/lib/python3.10/dist-packages/accelerate/accelerator.py", line 529, in distributed_type
    return self.state.distributed_type
  File "/usr/local/lib/python3.10/dist-packages/accelerate/state.py", line 1076, in __getattr__
    raise AttributeError(
AttributeError: `AcceleratorState` object has no attribute `distributed_type`. This happens if `AcceleratorState._reset_state()` was called and an `Accelerator` or `PartialState` was not reinitialized.

The versions of the transformers and accelerate libraries are 4.40.2 and 0.30.0, respectively.

Earlier, I tried running the code directly in Google Colab instead of through main.py, but the same error occurred.

nlp huggingface-transformers accelerate huggingface-trainer
1 Answer

Downgrading to these versions worked for me:

!pip install accelerate==0.27.2
!pip install transformers==4.40.2
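
If you would rather stay on newer versions, a minimal sketch (my assumption, not part of the accepted answer) is to let the Trainer manage Accelerate itself instead of wrapping it in `accelerator.prepare()`. The Hugging Face Trainer already integrates with Accelerate internally, and `accelerate launch` still takes care of the distributed setup:

    # Sketch, assuming CorefTrainer subclasses transformers.Trainer:
    # construct the trainer directly and skip Accelerator()/accelerator.prepare(),
    # since the Trainer creates and manages its own Accelerator internally.
    trainer = CorefTrainer(
        tokenizer=tokenizer,
        model=model,
        args=training_args,
        train_dataset=train_set,
        data_collator=collator,
        callbacks=[tb_callback],
    )
    trainer.train()
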