
Clean train #201

Closed
wants to merge 3 commits into from

Conversation

@backyes (Contributor) commented Oct 13, 2016

I found that 3 variables are always null, and verified this against the source code by compiling.

I checked internal open-source, external open-source, and metric-learning usage; all of them still compile. So, shall we just remove these variables? Maybe I am wrong, but if I am right the cleanup is necessary.

@emailweixu @hedaoyuan @reyoung

@backyes (Contributor, Author) commented Oct 13, 2016

Tracking down the related commit log:

```
commit 623a419538564aa91fb02b71d57e2576979c127b
Author: xuwei06 <xuwei06@1ad973e4-5ce8-4261-8a94-b56d1f490c56>
Date:   Wed Jun 3 00:15:21 2015 +0000

    Change Trainer to support multiple trainer in one process.

    And fixed a bug for Stat.cpp

    git-svn-id: https://svn.baidu.com/idl/trunk/paddle@630 1ad973e4-5ce8-4261-8a94-b56d1f490c56

commit ae24f3d3b08517f6b6ee9cc07a89ebf22e059809
```

`git show 623a419538564aa91fb02b71d57e2576979c127b` shows where the parameter list changed:

```cpp
const std::shared_ptr<GradientMachine> &gradientMachine,
const std::shared_ptr<DataProvider> &dataProvider,
const std::shared_ptr<DataProvider> &testDataProvider) {
bool testing) {
```
Collaborator


This interface is used by other projects, and also by a recent pull request: https://github.com/baidu/Paddle/pull/193/files#diff-71eb11a1ef9cbe77f5e207dc915cc8b6

@backyes closed this Oct 14, 2016
zhhsplendid pushed a commit to zhhsplendid/Paddle that referenced this pull request Sep 25, 2019
Meiyim pushed a commit to Meiyim/Paddle that referenced this pull request May 21, 2021
thisjiang pushed a commit to thisjiang/Paddle that referenced this pull request Oct 28, 2021
gglin001 pushed a commit to graphcore/Paddle-fork that referenced this pull request Dec 8, 2021
wangxicoding pushed a commit to wangxicoding/Paddle that referenced this pull request Dec 9, 2021
zhoutianzi666 pushed a commit to zhoutianzi666/Paddle that referenced this pull request May 23, 2022
DesmonDay pushed a commit to DesmonDay/Paddle that referenced this pull request Jan 18, 2023
zmxdream pushed a commit to zmxdream/Paddle that referenced this pull request Jan 30, 2023
zmxdream added a commit that referenced this pull request Feb 6, 2023
lizexu123 pushed a commit to lizexu123/Paddle that referenced this pull request Feb 23, 2024
WAYKEN-TSE pushed a commit to WAYKEN-TSE/Paddle that referenced this pull request Dec 6, 2024