This repository has been archived by the owner on Jan 24, 2024. It is now read-only.

[High-Level-API] Update MNIST to use optimizer_func #535

Merged

Conversation

jetfuel
Contributor

@jetfuel jetfuel commented Jun 6, 2018

No description provided.

@jetfuel jetfuel self-assigned this Jun 6, 2018
@@ -146,7 +146,7 @@ Here is a quick overview of the major fluid API components.
This is where you specify the network flow.
1. `train_program`: A function that specifies how to get `avg_cost` from `inference_program` and labels.
This is where you specify the loss calculations.
- 1. `optimizer`: Configure how to minimize the loss. Paddle supports most major optimization methods.
+ 1. `optimizer_func`: Configure how to minimize the loss. Paddle supports most major optimization methods.
Contributor


Can we elaborate a bit? I like the description of the train_program. Can we write something like "A function that specifies the configuration of the optimizer. The optimizer is responsible for minimizing the loss and driving the training. Paddle supports many different optimizers."
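The `optimizer_func` pattern being discussed can be sketched in plain Python: rather than handing the trainer a ready-made optimizer instance, the caller passes a function that builds one, so the trainer decides when and where the optimizer is constructed. The `SGD` and `Trainer` classes below are toy stand-ins for illustration, not Paddle's actual API.

```python
class SGD:
    """Toy optimizer: plain gradient descent on a list of parameters."""
    def __init__(self, learning_rate):
        self.learning_rate = learning_rate

    def step(self, params, grads):
        # One update: move each parameter against its gradient.
        return [p - self.learning_rate * g for p, g in zip(params, grads)]


def optimizer_func():
    # Deferred construction: the trainer calls this internally.
    return SGD(learning_rate=0.1)


class Trainer:
    """Toy trainer accepting optimizer_func, mirroring the API shape."""
    def __init__(self, optimizer_func):
        self.optimizer = optimizer_func()  # built inside the trainer

    def train_step(self, params, grads):
        return self.optimizer.step(params, grads)


trainer = Trainer(optimizer_func)
print(trainer.train_step([1.0], [0.5]))  # [0.95]
```

Passing a constructor function instead of an instance keeps optimizer creation under the trainer's control, which matters when the trainer manages its own scopes or devices.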

@@ -245,6 +245,15 @@ def train_program():
return [avg_cost, acc]
```

#### Optimizer Function Configuration

In the following `Adam` optimizer, `learning_rate` means the speed at which the network training converges.
Contributor


Can we rephrase: learning_rate specifies the learning rate in the optimization procedure.

Contributor Author


sure, sounds good.
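To make concrete what `learning_rate` controls in the `Adam` optimizer under discussion, here is a toy single-step Adam update using the standard default constants. This is an illustrative implementation, not Paddle code; note that after bias correction, the very first step moves each parameter by roughly `learning_rate` in the direction of the gradient.

```python
import math

def adam_step(p, g, m, v, t, learning_rate,
              beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update for a scalar parameter p with gradient g."""
    m = beta1 * m + (1 - beta1) * g        # first-moment (mean) estimate
    v = beta2 * v + (1 - beta2) * g * g    # second-moment estimate
    m_hat = m / (1 - beta1 ** t)           # bias correction
    v_hat = v / (1 - beta2 ** t)
    p = p - learning_rate * m_hat / (math.sqrt(v_hat) + eps)
    return p, m, v

p, m, v = 1.0, 0.0, 0.0
p, m, v = adam_step(p, 0.5, m, v, t=1, learning_rate=0.001)
print(round(p, 6))  # 0.999 -- the first step moves by about learning_rate
```

A larger `learning_rate` therefore takes bigger steps (faster but potentially unstable convergence); a smaller one takes smaller, steadier steps.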

Contributor

@sidgoyal78 sidgoyal78 left a comment


LGTM

@jetfuel jetfuel merged commit 34b152f into PaddlePaddle:high-level-api-branch Jun 6, 2018
@jetfuel jetfuel deleted the fixOptimzerFuncMnist branch June 6, 2018 20:55