This repository has been archived by the owner on Jul 6, 2021. It is now read-only.

Do you consider introducing a pre-trained model for embedding, such as BERT? #46

Open
Alexkerl opened this issue Apr 7, 2021 · 0 comments

Comments


Alexkerl commented Apr 7, 2021

Is the structure BERT embedding + Transformer decoder + PGN (pointer-generator network)?
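For reference, a minimal sketch of what such a combination could look like, assuming PyTorch and the HuggingFace `transformers` library. The class, parameter names, and the simple dot-product copy attention here are illustrative assumptions, not this repository's actual implementation:

```python
import torch
import torch.nn as nn
from transformers import BertModel


class BertTransformerPGN(nn.Module):
    """Hypothetical sketch: BERT encoder -> Transformer decoder -> pointer-generator."""

    def __init__(self, bert_name="bert-base-uncased", num_decoder_layers=6, nhead=8):
        super().__init__()
        # The pre-trained BERT supplies the source-side embeddings / encoder states.
        self.encoder = BertModel.from_pretrained(bert_name)
        d_model = self.encoder.config.hidden_size
        vocab_size = self.encoder.config.vocab_size

        self.tgt_embed = nn.Embedding(vocab_size, d_model)
        layer = nn.TransformerDecoderLayer(d_model=d_model, nhead=nhead, batch_first=True)
        self.decoder = nn.TransformerDecoder(layer, num_layers=num_decoder_layers)

        self.vocab_proj = nn.Linear(d_model, vocab_size)   # "generate" distribution
        self.p_gen_proj = nn.Linear(2 * d_model, 1)        # gate between generate / copy

    def forward(self, src_ids, src_mask, tgt_ids):
        # 1) Encode the source with the pre-trained model.
        memory = self.encoder(input_ids=src_ids, attention_mask=src_mask).last_hidden_state

        # 2) Transformer decoder over (shifted) target tokens with a causal mask.
        tgt_len = tgt_ids.size(1)
        causal = torch.triu(
            torch.full((tgt_len, tgt_len), float("-inf"), device=tgt_ids.device), diagonal=1
        )
        dec_out = self.decoder(
            self.tgt_embed(tgt_ids), memory,
            tgt_mask=causal, memory_key_padding_mask=~src_mask.bool()
        )

        # 3) Copy attention over source positions (simple dot-product attention).
        scores = torch.matmul(dec_out, memory.transpose(1, 2))
        scores = scores.masked_fill(~src_mask.bool().unsqueeze(1), float("-inf"))
        copy_attn = torch.softmax(scores, dim=-1)           # (B, T_tgt, T_src)
        context = torch.bmm(copy_attn, memory)              # (B, T_tgt, d_model)

        # 4) Pointer-generator: mix vocab distribution with copy distribution.
        vocab_dist = torch.softmax(self.vocab_proj(dec_out), dim=-1)
        p_gen = torch.sigmoid(self.p_gen_proj(torch.cat([dec_out, context], dim=-1)))
        copy_dist = torch.zeros_like(vocab_dist).scatter_add(
            -1, src_ids.unsqueeze(1).repeat(1, tgt_len, 1), copy_attn
        )
        return p_gen * vocab_dist + (1.0 - p_gen) * copy_dist
```

In this sketch, BERT replaces a trained-from-scratch encoder/embedding layer, the Transformer decoder attends to its hidden states, and the pointer-generator gate `p_gen` interpolates between generating from the vocabulary and copying source tokens.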
