
About Attention-RPN fine-tuning #141

Open
kdongyi opened this issue Dec 6, 2023 · 0 comments

kdongyi commented Dec 6, 2023

The Attention-RPN paper explicitly states that fine-tuning is not required, so why does the README suggest fine-tuning?
