Show layer specified learning rates with combine_tessdata -l #3468
The `combine_tessdata -l ...` command shows the initial value of the learning rate, but not the layer (final) learning rates that are actually used in finetuning (`lstmtraining` with the `-continue_from` option). This PR adds the layer learning rates to the output of the `combine_tessdata -l` command.

Example output:

In this output, we can see that the learning rate of jpn_vert has decayed to 0.000125.
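As a rough illustration of where a value like 0.000125 can come from: if the trainer repeatedly halves the rate (an assumption for this sketch; `lstmtraining` actually adjusts rates adaptively per layer), a starting rate of 0.001 reaches 0.000125 after three halvings:

```python
# Hypothetical sketch: geometric halving of a learning rate.
# The 0.001 starting value and the factor-of-2 decay are assumptions
# for illustration, not the exact schedule used by lstmtraining.
initial_rate = 0.001
rate = initial_rate
halvings = 0
while rate > 0.000125:
    rate *= 0.5
    halvings += 1

print(f"{initial_rate} -> {rate} after {halvings} halvings")
# prints "0.001 -> 0.000125 after 3 halvings"
```

Seeing the actual per-layer values in `combine_tessdata -l` removes the need to guess at this arithmetic.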
This information can be useful for debugging when finetuning performs poorly, or for choosing a learning rate when doing a full training.