
TypeError: forward() got an unexpected keyword argument 'input_embeds' in BartForConditionalGeneration

I am trying to re-map the token IDs to the ones from my custom tokenizer, but the TypeError in the title is raised. Here is my custom model:

import torch.nn as nn
from transformers import BartForConditionalGeneration

class MyModel(BartForConditionalGeneration):
    def __init__(self, config):
        super().__init__(config)
        # Embedding sized for my custom tokenizer's vocabulary
        self.shared = nn.Embedding(config.vocab_size, config.d_model, config.pad_token_id)

    def forward(self, input_ids, **kwargs):
        input_ids = input_ids.long().cuda()
        input_embeds = self.shared(input_ids)
        return super().forward(input_ids, input_embeds=input_embeds, **kwargs)  # TypeError appears here

The error appears when I use the model for training.
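For context, the error can be reproduced with something like the following (a minimal repro sketch, not my real training loop; the config values and dummy batch are placeholders, and it needs a GPU because forward() calls .cuda()):

import torch
from transformers import BartConfig

# Placeholder config and dummy batch, just enough to call the overridden forward()
config = BartConfig(vocab_size=32000)   # hypothetical vocab size
model = MyModel(config).cuda()          # requires a CUDA device

dummy_ids = torch.randint(0, config.vocab_size, (2, 16)).cuda()
outputs = model(input_ids=dummy_ids, labels=dummy_ids)  # TypeError is raised here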

In the source code of transformers 4.28.1 (the version I am using), the input_embeds argument is there.
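To double-check which keyword names my installed version actually accepts, I can print the signature directly (a small sketch using inspect, nothing specific to my model):

import inspect
from transformers import BartForConditionalGeneration

# List the parameter names that forward() accepts in the installed transformers build
sig = inspect.signature(BartForConditionalGeneration.forward)
print(list(sig.parameters))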

What did I miss? How do I resolve this unexpected keyword argument issue?

