Python generated code import paths #881
I just ran into the same problem! Let's make that a bit clearer: I have several `file1_pb2.py` files.
The directory where these files end up is not necessarily on the Python path. Relative imports worked for Python 2, but apparently they don't for Python 3. For Python 3 we need explicit relative imports.
Also, relative imports are already supported as of Python 2.5.
Hi, I had similar issues with the new Python 3 import rules, but resolved them by changing the way the `.proto` files import each other. Suppose you have the following project tree:

where `file1.proto` contains:

```
# file1.proto
import 'file2.proto';
```

Indeed, in this case Python 3 complained about the absolute imports. But if you replace the import directive with the following one:

```
# file1.proto
import 'mymodule/file2.proto';
```

then the generated import becomes package-qualified, and Python 3 will be happy with that. Hope it helps.
tl;dr: can we talk about the mapping of proto packages to Python packages? I'm having trouble with including different proto packages together in one Python package because of how the imports get translated. I believe this problem still applies when using more than one subdirectory. Consider this structure:
In
Note that the import expects the package root level to be available on the Python path. This is a problem for me because the outer directory structure looks like this:
I want to run the server. In order to make the relative imports work (in accordance with PEP 328), I run it from outside the myproject folder as
and promptly get an import error. Now, I've managed to solve this by doing a path wrangle. Update: this path wrangle doesn't work and causes some strange errors; I am working on a workaround to this workaround. It's possible/likely that my use case falls into the category of "not supported", since I believe protobufs are currently designed to produce single modules rather than whole packages. However, I'd like to hear what others think about this use case, and about having more "package-style" proto projects. Am I the only one wanting to do this?
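The failure mode described above can be reproduced without protoc at all. The sketch below uses hypothetical module names, with stub `_pb2.py` files standing in for generated code: the inner module performs the absolute `import file2_pb2` that protoc emits, which fails under Python 3 unless the package directory itself is on `sys.path`.

```python
import os
import subprocess
import sys
import tempfile

# Build a throwaway package tree that mimics the generated layout
# (module names are hypothetical; the *_pb2.py files are stubs):
#   <tmp>/mypkg/__init__.py
#   <tmp>/mypkg/file2_pb2.py   <- stands in for generated code
#   <tmp>/mypkg/file1_pb2.py   <- contains "import file2_pb2", as protoc emits
tmp = tempfile.mkdtemp()
pkg = os.path.join(tmp, "mypkg")
os.makedirs(pkg)
open(os.path.join(pkg, "__init__.py"), "w").close()
with open(os.path.join(pkg, "file2_pb2.py"), "w") as f:
    f.write("VALUE = 42\n")
with open(os.path.join(pkg, "file1_pb2.py"), "w") as f:
    f.write("import file2_pb2\n")

# With only the package *root* on sys.path, the absolute import inside
# file1_pb2.py cannot resolve under Python 3:
fail = subprocess.run(
    [sys.executable, "-c",
     "import sys; sys.path.insert(0, %r); import mypkg.file1_pb2" % tmp],
    capture_output=True, text=True)
print("ModuleNotFoundError" in fail.stderr)  # True

# The common workaround: also put the package directory itself on sys.path,
# so the _pb2 modules are importable as top-level names:
ok = subprocess.run(
    [sys.executable, "-c",
     "import sys; sys.path[:0] = [%r, %r]; import mypkg.file1_pb2" % (tmp, pkg)],
    capture_output=True, text=True)
print(ok.returncode == 0)  # True
```

This is exactly why the workarounds below revolve around either adding the output directory to the path or rewriting the imports to be relative.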
@awan1, have you solved the problem? I've encountered this problem too: I split the `.proto` files into different packages, and these relative imports are a big problem.
@zhongql for now I've merged the different generated folders into one larger folder, and I've put that folder directly on my Python path.
@awan1, thank you. I also wrote a script to post-process the generated `*_pb2.py` files to fix the relative import errors, and now the program works!
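The post-processing scripts mentioned in this thread typically just rewrite the import lines of the generated files. A minimal sketch of that idea (the function name and regex are illustrative, not taken from any commenter's actual script):

```python
import re

# Matches the absolute import lines protoc emits for sibling protos, e.g.
#   import file2_pb2 as file2__pb2
_PB2_IMPORT = re.compile(r"^import (\w+_pb2) as (\w+)$", re.MULTILINE)

def relativize_pb2_imports(source: str) -> str:
    """Rewrite absolute _pb2 imports into explicit relative imports."""
    return _PB2_IMPORT.sub(r"from . import \1 as \2", source)

generated = "import file2_pb2 as file2__pb2\n"
print(relativize_pb2_imports(generated), end="")
# from . import file2_pb2 as file2__pb2
```

Running a function like this over every generated `*_pb2.py` and `*_pb2_grpc.py` file is the scripted equivalent of the sed-based workarounds mentioned later in the thread.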
I think this is related... I don't know Python too well, but I'm trying to set up Python packages for other developers to use gRPC from Python. I'm using gRPC and a few of the Google service protos.
This makes the generated code unusable, because Python looks in the protobuf egg for google/api. The only way I have figured out to fix this is to post-process the generated code. This only works if your package is one level deep... I'm sure it's going to get unmanageable as we add more services. The directory structure looks like:
and run with: Any thoughts on how to fix this? For now I've just copied the protobuf files from google/api that I'm using and changed their packages.
See #1491 for discussion of relative imports.
It is weird that this problem has not been fully solved yet!
and for Python 2 compatibility: Am I missing something?
This is not solved yet! If you generate code for a package that you then want to distribute (so you effectively cannot tell users they need to add extra directories to the path), you must put the generated files under your package directory... but then you can no longer import them, because the proto files have no knowledge of that top-level directory. The "solution" that involves moving source directories around is NOT a solution; it's an attempt at a temporary patch. A real solution would need to be something like
Any update on this? It seems like a (maybe simple) problem that should be fixed, right?
Actually, after researching this issue somewhat, I realized that all those "option" things are handled by plugins. Those plugins don't even need to be part of the main protobuf codebase. As an aside, and sort of as self-promotion (not really): I started to work on an alternative Protobuf implementation, https://github.com/wvxvw/protopy, but I encountered some issues with it, and as of right now I don't have time to work on the project. If anyone wanted to take over, I wouldn't mind, and I would provide some support to clue them into how the project works / is supposed to work.
I've run into a similar issue as well, and went with a custom workaround.
Read more here: protocolbuffers/protobuf#881 Signed-off-by: Leonardo Di Donato <[email protected]>
Add grpc channel credentials
Use makefile instead of bash script
Fix time
build: makefile to download and build the protos
chore: do not commit .proto files
docs: example plus build instructions into the README
chore: remove protos and generated python code from the root
new: workaround the issues python has with code generated from protobuf (read more here: protocolbuffers/protobuf#881)
new(falco): generated python API into falco/schema and falco/svc
update(falco/domain): update the import paths
update(falco): client import paths updated
Signed-off-by: Leonardo Di Donato <[email protected]>
When running under Python 3, PEP 328 formalized `import package` as an absolute import. The PEP introduced new syntax for relative imports of a "sub" package (`from .package import name`), and, finally, the form `from . import bar` to import a "peer" package. Python gRPC code generated by the gRPC compiler uses the gRPC package name as the absolute import, which works great if you own that particular namespace, can afford to allocate that namespace to that protobuf globally, or are running under the (now EOL) Python 2.

There are some existing reports of this problem on GitHub and elsewhere on the internet, such as protocolbuffers/protobuf#881, https://www.bountysource.com/issues/27402421-python-generated-code-import-paths, and https://stackoverflow.com/questions/57213543/grpc-generated-python-script-fails-to-find-a-local-module-in-the-same-directory, that outline this issue; they tend to be closed WONTFIX, or people work around them by sed'ing the output protobuf files.

This change allows the use of the prefixes (and a previously broken codepath) to work around this bug by doing something that semantically feels right-ish. The gRPC Python plugin accepts arguments to strip prefixes via the `--grpc_python_out` flag. For example:

```
--grpc_python_out=grpc_2_0,mycorp.:.
```

will cause the generated imports to become `from myservice.v1 import baz` rather than `from mycorp.myservice.v1 import baz`.

Now, when this is taken to the extreme and the *entire* prefix except for the final dot is stripped, for instance:

```
--grpc_python_out=grpc_2_0,mycorp.myservice.v1:.
```

the existing gRPC plugin will generate the (buggy) output `from import baz`. This patch changes that (buggy) output to become `from . import baz`, which allows the import of a peer package under Python 3 without having to exist in a global namespace that matches the protobuf package tree.

Beyond the fact that this feels like the correct usage of the prefix-stripping feature, it also cannot result in broken output: the output was previously broken, so no one is relying on the old behavior.
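The prefix-stripping behavior described in that commit message can be sketched as a small pure function. The function name is hypothetical (the real logic lives inside the gRPC Python plugin), but the inputs and outputs mirror the examples above:

```python
def strip_prefix_import(proto_package: str, module: str, strip: str) -> str:
    """Form an import line after removing `strip` from the proto package.

    Mirrors the behavior described for --grpc_python_out=grpc_2_0,<strip>:.
    """
    if proto_package.startswith(strip):
        proto_package = proto_package[len(strip):]
    if proto_package:
        return "from %s import %s" % (proto_package, module)
    # The whole package was stripped: emit a relative ("peer") import
    # instead of the old broken "from  import ..." output.
    return "from . import %s" % module

print(strip_prefix_import("mycorp.myservice.v1", "baz", "mycorp."))
# from myservice.v1 import baz
print(strip_prefix_import("mycorp.myservice.v1", "baz", "mycorp.myservice.v1"))
# from . import baz
```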
Hi all, I followed the suggestion from @awan1. However, when I run it, I get an error.
I am running into the same issue.
@aaly00 there's a workaround hack:
to use the proto_out (py) files without import errors, you can add the generated output directory to your Python path.
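A minimal sketch of that kind of path hack, assuming the generated files live in a hypothetical `proto_out` directory (adjust the path to your own layout):

```python
import os
import sys

# Hypothetical location of the protoc output; adjust to your layout.
GENERATED_DIR = os.path.abspath("proto_out")

# Putting the output directory itself on sys.path lets the generated
# modules find each other via their absolute top-level imports.
if GENERATED_DIR not in sys.path:
    sys.path.insert(0, GENERATED_DIR)

print(GENERATED_DIR in sys.path)  # True
```

Placing this at the top of your entry point (before importing any `_pb2` modules) is the usual shape of the hack; it works, but it leaks the generated module names into the global import namespace.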
By default, the imports of other `.proto`-generated files are translated to `import <module> as <alias>`. They probably should be `from . import <module> as <alias>`, as currently all the `_pb2.py` files need to be on a top-level Python path.