Reputation: 1
I've just learned about how BERT produces embeddings, though I might not understand it fully.
I was thinking of doing a project that leverages those embeddings: feed them into an autoencoder to generate a latent space for my text data.
My understanding is that a BERT embedding encodes relationships between words in a single vector, so I am a bit skeptical: wouldn't that relationship information be lost if I feed the embeddings into an autoencoder?
It would be nice if someone could share an opinion on this.
My goal is to cluster texts with similar emotions close together in the latent space, while at the same time applying contrastive learning in a supervised manner.
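To make the idea concrete, here is a minimal sketch of the pipeline I have in mind (the model name `bert-base-uncased`, CLS pooling, the layer sizes, the temperature, and the loss weight `alpha` are all placeholder assumptions on my part, not a settled design):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F
from transformers import AutoTokenizer, AutoModel

# 1) Sentence-level BERT embeddings (CLS pooling is just one common choice).
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
bert = AutoModel.from_pretrained("bert-base-uncased")

def embed(texts):
    inputs = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        out = bert(**inputs)
    return out.last_hidden_state[:, 0]  # (batch, 768) CLS vectors

# 2) A small autoencoder whose bottleneck is the latent space for clustering.
class AutoEncoder(nn.Module):
    def __init__(self, in_dim=768, latent_dim=64):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(in_dim, 256), nn.ReLU(), nn.Linear(256, latent_dim))
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 256), nn.ReLU(), nn.Linear(256, in_dim))

    def forward(self, x):
        z = self.encoder(x)
        return z, self.decoder(z)

# 3) Supervised contrastive loss (SupCon-style): pull same-emotion latents
#    together, push different-emotion latents apart.
def supcon_loss(z, labels, temperature=0.1):
    z = F.normalize(z, dim=1)
    sim = z @ z.t() / temperature
    n = z.size(0)
    self_mask = torch.eye(n, dtype=torch.bool, device=z.device)
    sim = sim.masked_fill(self_mask, float("-inf"))  # exclude self-pairs
    pos_mask = (labels.unsqueeze(0) == labels.unsqueeze(1)) & ~self_mask
    log_prob = sim - torch.logsumexp(sim, dim=1, keepdim=True)
    pos_counts = pos_mask.sum(1)
    valid = pos_counts > 0  # anchors with at least one same-label sample
    if valid.sum() == 0:    # batch has no positive pairs; skip the term
        return z.sum() * 0.0
    loss = -(log_prob.masked_fill(~pos_mask, 0.0)).sum(1)[valid] / pos_counts[valid]
    return loss.mean()

# 4) Training step: reconstruction keeps the embedding information,
#    the contrastive term shapes the latent space by emotion label.
model = AutoEncoder()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
alpha = 0.5  # placeholder weight between the two objectives

def train_step(texts, labels):
    x = embed(texts)
    z, x_hat = model(x)
    loss = F.mse_loss(x_hat, x) + alpha * supcon_loss(z, labels)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

The intuition behind the combined loss is that the reconstruction term discourages the encoder from throwing away the relational information in the BERT vector, while the supervised contrastive term organizes the latent space by emotion; whether that balance actually preserves enough of the original structure is exactly what I'm unsure about.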
Upvotes: 0
Views: 44