Bugs encountered while using transformers
- 2025-08-16 20:48:01

1. The following model_kwargs are not used by the model: ['encoder_hidden_states', 'encoder_attention_mask'] (note: typos in the generate arguments will also show up in this list)
The above error appears when calling `generate` on the text_decoder; it is caused by an incompatible transformers version.
```python
from transformers import AutoConfig, BertGenerationDecoder

# Build the decoder from a pretrained checkpoint's config
decoder_config = AutoConfig.from_pretrained(args['text_checkpoint'])
text_decoder = BertGenerationDecoder(config=decoder_config)

# Cross-attention inputs are forwarded to the model as model_kwargs;
# incompatible transformers versions reject them with the error above
output = text_decoder.generate(
    input_ids=cls_input_ids,
    encoder_hidden_states=encoder_hidden_states,
    encoder_attention_mask=encoder_attention_mask,
    max_length=args['max_seq_length'],
    do_sample=True,
    num_beams=args['beam_size'],
    length_penalty=1.0,
    use_cache=True,
)
```

Solution: switch transformers to a version in one of the following ranges: 4.15.0 <= transformers < 4.22.0, or transformers >= 4.25.0.
For example: `pip install transformers==4.25.1` or `pip install transformers==4.20.1`
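Before re-running, it can be handy to check whether the installed version actually falls inside one of the compatible ranges above. A minimal sketch of such a check, using only the standard library (the helper name `is_compatible` and the simple dotted-number parsing are my own; a real project might prefer `packaging.version`):

```python
def parse(ver: str) -> tuple:
    """Parse a simple dotted version string like '4.20.1' into a tuple of ints."""
    return tuple(int(part) for part in ver.split("."))

def is_compatible(ver: str) -> bool:
    """Return True if this transformers version avoids the model_kwargs error:
    4.15.0 <= version < 4.22.0, or version >= 4.25.0."""
    v = parse(ver)
    return (parse("4.15.0") <= v < parse("4.22.0")) or v >= parse("4.25.0")

# Check the versions suggested above, plus one from the broken range
for ver in ("4.20.1", "4.25.1", "4.23.1"):
    print(ver, "compatible:", is_compatible(ver))
```

In a real script you would pass `transformers.__version__` to the check instead of hard-coded strings.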
Published in the 手机 (Mobile) column of 讯客互联.