Get rid of pickle for Storing format of inference #7221

Closed
kexinzhao opened this issue Jan 4, 2018 · 2 comments · Fixed by #7712
Labels
预测 (Inference — originally named "Inference"; covers C-API inference issues, etc.)

Comments

kexinzhao (Contributor) commented Jan 4, 2018

Currently, pickle is used in fluid.io.save_inference_model to save the inference protobuf string and the feed and fetch variable names.

However, the C++ side cannot easily parse the serialization format produced by pickle.

To keep the inference API concise, we need to get rid of pickle.
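
For context, here is a minimal sketch of the difference between the two formats. It assumes `program` is a fluid Program whose `desc.serialize_to_string()` returns the binary ProgramDesc; the helper function names are illustrative, not the actual fluid.io API:

```python
import pickle


def save_with_pickle(program, feed_names, fetch_names, path):
    # Current behaviour: the ProgramDesc bytes and the feed/fetch variable
    # names are wrapped in a pickled Python list, which the C++ side
    # cannot parse.
    with open(path, "wb") as f:
        pickle.dump(
            [program.desc.serialize_to_string(), feed_names, fetch_names], f)


def save_without_pickle(program, path):
    # Pickle-free alternative: write only the raw protobuf bytes, so the
    # C++ side can call ProgramDesc::ParseFromString() on the file
    # contents directly. The feed/fetch variable names would then have to
    # live inside the program itself (e.g. as feed/fetch ops) rather than
    # in a pickled side structure.
    with open(path, "wb") as f:
        f.write(program.desc.serialize_to_string())
```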

abhinavarora (Contributor) commented:
One strategy that I am exploring right now is to create a new ProtoBuf message called InferenceDesc that incorporates the ProgramDesc and the model parameters. #7328 is a related issue for that.
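
For illustration only, a rough sketch of what such a container message could look like. The message name comes from the comment above, but the fields, numbering, package, and import path are assumptions, not the definition proposed in #7328:

```protobuf
// Hypothetical sketch of an InferenceDesc container message.
syntax = "proto2";

package paddle.framework.proto;

import "framework.proto";  // assumed to provide ProgramDesc

message InferenceDesc {
  // The pruned program used at inference time.
  required ProgramDesc program = 1;
  // Names of the variables to feed before running and to fetch afterwards.
  repeated string feed_var_names = 2;
  repeated string fetch_var_names = 3;
  // Serialized parameter tensors (layout here is an assumption).
  repeated bytes parameters = 4;
}
```

Bundling the program and its parameters into one message would let the C++ side load a single file with a single ParseFromString call, with no pickle involved.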

kavyasrinet commented:
I added a proposal to that same issue (#7328) to avoid duplication, since I have started working on it as well. @abhinavarora and I can work together on this, I guess.
