What category does the M2 model belong to? #34
Great question! Every convolution is an SSM, so that's what we mean by an SSM model. The dimension mixer is orthogonal to that.
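The claim that every (causal, finite) convolution is an SSM can be made concrete with a small sketch. This is an illustrative NumPy example, not code from this repository: it realizes a length-L convolution kernel as a linear state-space recurrence h_t = A h_{t-1} + B u_t, y_t = C h_t, where A is a shift matrix and the readout C holds the kernel taps.

```python
import numpy as np

L = 4
rng = np.random.default_rng(0)
k = rng.normal(size=L)      # convolution kernel (filter taps)
u = rng.normal(size=10)     # input sequence

# SSM realization of the convolution:
# A shifts the state down by one slot, B writes the new input into slot 0,
# and C reads the state out through the kernel taps.
A = np.eye(L, k=-1)         # subdiagonal shift matrix
B = np.zeros(L); B[0] = 1.0
C = k

h = np.zeros(L)
y_ssm = []
for u_t in u:
    h = A @ h + B * u_t     # state update
    y_ssm.append(C @ h)     # readout
y_ssm = np.array(y_ssm)

# Direct causal convolution for comparison
y_conv = np.convolve(u, k)[:len(u)]

assert np.allclose(y_ssm, y_conv)
```

So the token-mixing convolution in M2 falls under the SSM umbrella regardless of how the convolution itself is computed; the dimension mixer is a separate component.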
On Wed, May 29, 2024, 41924076 wrote:

Hello, thank you for your great work! The M2-BERT paper mentions that "Monarch Mixer is part of a new class of architectures called state-space models (SSMs), which include S4, Mamba, and BiGS".

Are Monarch Mixer and M2-BERT part of the SSM class?

My understanding is that M2-BERT:
(1) replaces attention with bidirectional gated convolutions plus a residual convolution, and sets the Monarch matrices to the DFT and inverse DFT matrices to speed up the convolution;
(2) in the dimension mixer, replaces the two dense matrices in the MLPs with learned block-diagonal matrices to speed up the MLP computation.

Which part of this is related to SSMs? I would be very grateful if you could help me with the answer :)
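The two mechanisms described in the question can be sketched in a few lines of NumPy. This is an illustration under my own naming, not the repository's implementation: part (1) shows that fixing the Monarch matrices to the DFT and inverse DFT matrices yields an FFT-style circular convolution, and part (2) shows a block-diagonal weight replacing a dense MLP matrix.

```python
import numpy as np

rng = np.random.default_rng(0)

# (1) Monarch matrices set to DFT / inverse DFT: long convolution becomes
#     y = IDFT( DFT(x) * DFT(k) ), i.e. circular convolution via the FFT.
n = 8
x = rng.normal(size=n)
k = rng.normal(size=n)
F = np.fft.fft(np.eye(n))       # explicit DFT matrix
Finv = np.fft.ifft(np.eye(n))   # explicit inverse DFT matrix
y_monarch = (Finv @ ((F @ x) * (F @ k))).real
y_circ = np.real(np.fft.ifft(np.fft.fft(x) * np.fft.fft(k)))
assert np.allclose(y_monarch, y_circ)

# (2) Block-diagonal weight in place of a dense MLP matrix:
#     a matvec costs O(d * block_size) instead of O(d^2).
d, nblocks = 16, 4
bs = d // nblocks
blocks = rng.normal(size=(nblocks, bs, bs))  # learned blocks
W = np.zeros((d, d))                         # equivalent dense view
for i in range(nblocks):
    W[i*bs:(i+1)*bs, i*bs:(i+1)*bs] = blocks[i]
v = rng.normal(size=d)
out_block = np.concatenate(
    [blocks[i] @ v[i*bs:(i+1)*bs] for i in range(nblocks)]
)
assert np.allclose(W @ v, out_block)
```

Per the answer above, only mechanism (1), the convolution, is what makes the model an SSM; the block-diagonal dimension mixer in (2) is an orthogonal efficiency choice.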
Thank you so much for your answer!