
increase protobuffer message size #170

Closed
wants to merge 1 commit into develop from protobuf
Conversation

@luotao1 (Contributor) commented on Oct 8, 2016

No description provided.

@reyoung (Collaborator) left a comment

Looks good to me, but it needs more reviews.

This change lets trainer_config return a large config, and it lets PyDataProvider serialize many more arguments.

But the disadvantages are:

  • It uses more memory to hold the trainer config. We could address this by dropping PyDataProvider's arguments immediately after they are parsed, but TrainerConfig may be copied by value in a few places in Paddle, and those copies would also need to be cleaned up.
  • The underlying problem is not really solved, because a size limit still exists here; it is just larger.

The related issue is #166.
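
To make the tradeoff concrete, this is roughly the pattern that inflates the serialized config (a minimal sketch, assuming the legacy trainer_config_helpers API; dict.txt, train.list, and the dataprovider module/function names are made up for illustration):

```python
# trainer_config.py -- illustrative only; file names and sizes are hypothetical.
from paddle.trainer_config_helpers import *

# A large Python dict built while the config is parsed...
word_dict = {w.strip(): i for i, w in enumerate(open('dict.txt'))}

# ...gets serialized into the TrainerConfig protobuf as the data provider's
# args, which is what runs into the protobuf message size limit.
define_py_data_sources2(
    train_list='train.list',
    test_list=None,
    module='dataprovider',
    obj='process',
    args={'dictionary': word_dict})
```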

@emailweixu (Collaborator) left a comment

As we discussed, it is not recommended to transmit a large Python data structure through the TrainerConfig. For a dictionary, we can just transmit the file name and let the data provider load it. Please change the demos to handle the dictionary this way instead of increasing the protobuf size limit.
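
For example (a minimal sketch, assuming the legacy PyDataProvider2 interface; the file names, init_hook, and process function here are illustrative, not the actual demo code):

```python
# dataprovider.py -- load the dictionary inside the provider instead of
# shipping it through the serialized TrainerConfig.
from paddle.trainer.PyDataProvider2 import provider, integer_value_sequence

def init_hook(settings, dict_file, **kwargs):
    # Only the file name travels through the config; the dict is built here.
    settings.word_dict = {}
    with open(dict_file) as f:
        for i, line in enumerate(f):
            settings.word_dict[line.strip()] = i
    settings.input_types = [integer_value_sequence(len(settings.word_dict))]

@provider(init_hook=init_hook)
def process(settings, file_name):
    with open(file_name) as f:
        for line in f:
            words = line.strip().split()
            yield [settings.word_dict.get(w, 0) for w in words]
```

On the config side, args would then carry only the file name, e.g. args={'dict_file': 'dict.txt'}, rather than the dictionary itself.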

@reyoung changed the base branch from master to develop on Oct 26, 2016
@reyoung closed this on Oct 26, 2016
@luotao1 deleted the protobuf branch on Oct 28, 2016