
where is the setting about position embedding #51

Open
liwenssss opened this issue Jan 5, 2022 · 1 comment

Comments

@liwenssss

Hello, thanks for your nice work. I want to know how you set the position embedding. Do you just follow the setting of BERT?

@kevinlin311tw
Member

There are two pieces of code related to the position encoding/embedding.

First, when preparing the input tokens, we concatenate a reference template mesh with the image features for position encoding. You can find the relevant code here
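A minimal sketch of this first step, using hypothetical shapes and names (not the repo's actual code): each input token carries the 3D coordinates of a template mesh vertex or joint alongside the image feature, so the template coordinates act as a positional signal.

```python
import torch

def build_input_tokens(template, img_feat):
    """Concatenate reference template coordinates with image features.

    template: reference mesh vertices/joints, shape (num_tokens, 3)
    img_feat: global image feature from the CNN backbone, shape (batch, feat_dim)
    Returns tokens of shape (batch, num_tokens, 3 + feat_dim).
    """
    batch = img_feat.shape[0]
    num_tokens = template.shape[0]
    # Repeat the template for each sample in the batch: (batch, num_tokens, 3)
    ref = template.unsqueeze(0).expand(batch, -1, -1)
    # Tile the image feature across all tokens: (batch, num_tokens, feat_dim)
    feat = img_feat.unsqueeze(1).expand(-1, num_tokens, -1)
    # Each token = its template 3D position + the shared image feature.
    return torch.cat([ref, feat], dim=2)
```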

Second, inside the transformer encoder module, we set the position embedding following conventional BERT. The relevant code can be found here
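For reference, a minimal sketch of BERT-style learned position embeddings as used inside a transformer encoder (hyperparameters here are illustrative defaults, not necessarily the repo's configuration):

```python
import torch
import torch.nn as nn

class BertPositionEmbedding(nn.Module):
    """One learnable vector per position index, added to token embeddings,
    as in BERT."""

    def __init__(self, max_position_embeddings=512, hidden_size=768):
        super().__init__()
        self.position_embeddings = nn.Embedding(
            max_position_embeddings, hidden_size)

    def forward(self, input_embeds):
        # input_embeds: (batch, seq_len, hidden_size)
        seq_len = input_embeds.shape[1]
        position_ids = torch.arange(seq_len, device=input_embeds.device)
        # Broadcast-add the per-position embedding to every token.
        return input_embeds + self.position_embeddings(position_ids)
```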
