
kansformers and transformer loss curve #2

Open
HaiFengZeng opened this issue May 9, 2024 · 1 comment

Comments

@HaiFengZeng

What does the loss curve look like for a transformer with the same config settings?

@WhatMelonGua

Yes, I'm also interested in this comparison. Does a KAN, beyond its formula-based derivation, actually perform clearly better than an MLP in practice?
Is it worth replacing the MLP with a KAN?
And for image or matrix-like inputs, how can we interpret the formulas the KAN has learned?
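The kind of comparison asked about here can be prototyped on a toy problem before running full transformer configs. The sketch below is hypothetical and not from the kansformers repo: it fits y = sin(πx) with (a) a KAN-style edge, i.e. learnable coefficients over a fixed RBF basis standing in for the B-spline basis real KAN layers use, and (b) a small one-hidden-layer MLP, using the same data, seed, and full-batch gradient descent, then prints the endpoints of the two loss curves.

```python
import math, random

def make_data(n=128, seed=0):
    rng = random.Random(seed)
    xs = [rng.uniform(-1.0, 1.0) for _ in range(n)]
    ys = [math.sin(math.pi * x) for x in xs]  # toy 1D target
    return xs, ys

# KAN-style edge: learnable coefficients over a fixed RBF basis
# (a stand-in for the learnable splines used by real KAN layers).
CENTERS = [-1.0 + 2.0 * k / 7 for k in range(8)]

def rbf(x):
    return [math.exp(-(((x - c) / 0.3) ** 2)) for c in CENTERS]

def train_kan(xs, ys, steps=500, lr=0.2):
    coef = [0.0] * len(CENTERS)
    curve, n = [], len(xs)
    for _ in range(steps):
        loss, grad = 0.0, [0.0] * len(coef)
        for x, y in zip(xs, ys):
            phi = rbf(x)
            err = sum(c * p for c, p in zip(coef, phi)) - y
            loss += err * err
            for i, p in enumerate(phi):
                grad[i] += 2.0 * err * p
        coef = [c - lr * g / n for c, g in zip(coef, grad)]
        curve.append(loss / n)  # per-step mean squared error
    return curve

def train_mlp(xs, ys, hidden=8, steps=800, lr=0.05, seed=0):
    rng = random.Random(seed)
    w1 = [rng.uniform(-1.0, 1.0) for _ in range(hidden)]
    b1 = [0.0] * hidden
    w2 = [rng.uniform(-1.0, 1.0) for _ in range(hidden)]
    curve, n = [], len(xs)
    for _ in range(steps):
        loss = 0.0
        gw1, gb1, gw2 = [0.0] * hidden, [0.0] * hidden, [0.0] * hidden
        for x, y in zip(xs, ys):
            h = [math.tanh(w1[i] * x + b1[i]) for i in range(hidden)]
            err = sum(w2[i] * h[i] for i in range(hidden)) - y
            loss += err * err
            for i in range(hidden):  # manual backprop through tanh
                gw2[i] += 2.0 * err * h[i]
                dh = 2.0 * err * w2[i] * (1.0 - h[i] ** 2)
                gw1[i] += dh * x
                gb1[i] += dh
        w1 = [w - lr * g / n for w, g in zip(w1, gw1)]
        b1 = [b - lr * g / n for b, g in zip(b1, gb1)]
        w2 = [w - lr * g / n for w, g in zip(w2, gw2)]
        curve.append(loss / n)
    return curve

xs, ys = make_data()
kan_curve = train_kan(xs, ys)
mlp_curve = train_mlp(xs, ys)
print(f"KAN final MSE: {kan_curve[-1]:.4f}")
print(f"MLP final MSE: {mlp_curve[-1]:.4f}")
```

Interpretability falls out of the same structure: because each KAN edge is a univariate function, you can plot `sum(c * p for c, p in zip(coef, rbf(x)))` over a grid of x to see exactly what the edge learned, which is not possible for an MLP's entangled weights. Whether this toy advantage carries over to transformer-scale training is exactly what the requested loss curves would show.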
