
Generator_loss and discriminator_loss curves look wrong? #1

Answered by jhauret
jdwang125 asked this question in Q&A

Hi,

Your curves are not incorrect; you have just pushed the training too far! 😄 If you stop training at the 13th epoch ($\approx 200$k steps), you will obtain the same model as the one in the project.
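As a minimal sketch of that advice, training can be capped at a fixed step budget instead of running until the adversarial losses look "converged". The ~200k-step figure comes from this thread; `STEPS_PER_EPOCH` is a hypothetical value that depends on your dataset and batch size:

```python
# Sketch: stop GAN training at a fixed step budget rather than at loss
# convergence. ~200k steps corresponds to the 13th epoch in this setup.
MAX_STEPS = 200_000
STEPS_PER_EPOCH = 15_400  # hypothetical; depends on dataset and batch size


def train(steps_per_epoch=STEPS_PER_EPOCH, max_steps=MAX_STEPS):
    step, epoch = 0, 0
    while step < max_steps:
        epoch += 1
        for _ in range(steps_per_epoch):
            # ... one generator + discriminator update would go here ...
            step += 1
            if step >= max_steps:
                break
    return epoch, step
```

With these assumed values, the loop stops mid-way through the 13th epoch once the step budget is reached.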

The real-time and on-device constraints forced us to use far fewer generator parameters than discriminator parameters (1.9M vs 27.8M). Given this imbalance, it was hard to reach a Nash equilibrium during training. However, this does not prevent obtaining a well-performing generator.

If you want to go beyond simply reproducing the results, we tried two interesting techniques from the EnCodec paper that helped stabilize training and improve results:

  • Normali…

Replies: 1 comment, 15 replies (from @jhauret, @SayeedChowdhury, and @jiaweiru; not shown)
Answer selected by jhauret
Category: Q&A
5 participants