
feat: pretraining matches paper #302

Merged: 1 commit from feat/update-pretraining into develop on May 27, 2021

Conversation

Optimox (Collaborator) commented on May 25, 2021

What kind of change does this PR introduce?

This PR takes the discussion from #291 into account in order to bring the implementation closer to the research paper.

The decoder now uses GLU blocks and a final mapping layer to reconstruct the output from the different steps.
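
For intuition, a minimal sketch of this structure is shown below. It is illustrative only, with assumed names (GLUBlockSketch, DecoderSketch, n_d, n_steps); the actual TabNetDecoder also shares layers across steps and differs in normalization details.

```python
# Illustrative sketch only: each decoding step output goes through a GLU block,
# the per-step results are summed, and a final linear mapping reconstructs the
# original feature space.
import torch
import torch.nn as nn


class GLUBlockSketch(nn.Module):
    """Linear layer followed by a gated linear unit (GLU)."""

    def __init__(self, input_dim, output_dim):
        super().__init__()
        self.fc = nn.Linear(input_dim, 2 * output_dim)
        self.bn = nn.BatchNorm1d(2 * output_dim)

    def forward(self, x):
        x = self.bn(self.fc(x))
        out, gate = x.chunk(2, dim=-1)
        return out * torch.sigmoid(gate)


class DecoderSketch(nn.Module):
    """One GLU block per decoding step plus a final mapping layer."""

    def __init__(self, input_dim, n_d=8, n_steps=3):
        super().__init__()
        self.step_blocks = nn.ModuleList(
            [GLUBlockSketch(n_d, n_d) for _ in range(n_steps)]
        )
        # Final mapping layer: projects back to the original feature space.
        self.reconstruction_layer = nn.Linear(n_d, input_dim, bias=False)

    def forward(self, steps_output):
        # steps_output: list of (batch_size, n_d) tensors, one per step.
        res = sum(block(step) for block, step in zip(self.step_blocks, steps_output))
        return self.reconstruction_layer(res)
```

The key point is that each step contributes a decoded representation, and a single final mapping layer turns their sum back into the input feature space.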

Does this PR introduce a breaking change?

I added new parameters so that the size of the decoding blocks can be chosen (see the usage sketch below).
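
A hedged usage sketch follows; the parameter names n_shared_decoder and n_indep_decoder (and the values used) are assumptions and should be checked against the merged API and README.

```python
# Assumed usage sketch; parameter names and defaults may differ from the merged API.
import numpy as np
from pytorch_tabnet.pretraining import TabNetPretrainer

X_train = np.random.rand(256, 10)
X_valid = np.random.rand(64, 10)

pretrainer = TabNetPretrainer(
    n_d=8,
    n_a=8,
    n_steps=3,
    n_shared_decoder=1,  # assumed: number of shared GLU blocks in the decoder
    n_indep_decoder=1,   # assumed: number of step-specific GLU blocks in the decoder
)
pretrainer.fit(
    X_train,
    eval_set=[X_valid],
    pretraining_ratio=0.8,
    max_epochs=5,
    batch_size=128,
    virtual_batch_size=64,
)
```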

What needs to be documented once your changes are merged?

I changed the README but I'm not sure it's clear enough.

Closing issues
closes #291

Optimox merged commit 5adb804 into develop on May 27, 2021
Optimox deleted the feat/update-pretraining branch on May 27, 2021 at 09:01

Successfully merging this pull request may close these issues.

What is the intention of this part in TabNetDecoder?