tjns

Reputation: 41

Can I train a seq2seq model in which the same input sequence may have many possible outputs?

I'm trying to figure out a way to augment data for a Seq2Seq model, but I'm limited in the number of training samples: I only have 200.

My idea is to use ChatGPT to generate several similar output sequences for each input. Would this approach confuse the model?
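For context, one common way to represent such a one-to-many mapping is to duplicate each input once per target, so the training set contains ordinary (input, output) pairs and the model implicitly learns a distribution over valid outputs. A minimal sketch of that flattening step, where the hard-coded paraphrase lists are hypothetical stand-ins for ChatGPT-generated outputs:

```python
# Each source sentence may map to several acceptable target sentences.
# Duplicating the source once per target turns the one-to-many mapping
# into flat (input, output) pairs that any seq2seq trainer can consume.

raw_data = {
    "how old are you": [        # one input ...
        "i am twenty",          # ... with several valid outputs
        "i'm 20 years old",
    ],
    "where do you live": [
        "i live in hanoi",
    ],
}

def expand_pairs(data):
    """Flatten {input: [outputs]} into a list of (input, output) pairs."""
    pairs = []
    for src, targets in data.items():
        for tgt in targets:
            pairs.append((src, tgt))
    return pairs

pairs = expand_pairs(raw_data)
print(len(pairs))  # 3 pairs from 2 unique inputs
```

Trained with cross-entropy, the model averages over the targets seen for an input, so diverse but consistent paraphrases generally act as regularization rather than confusion.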

Upvotes: 1

Views: 23

Answers (0)
