Custom Op Linux ABI Incompatibility: Undefined Symbol #987
Comments
I am seeing the undefined symbol problem with a custom-built TF 1.14 Python wheel. What bazel build options should I use besides --cxxopt="-D_GLIBCXX_USE_CXX11_ABI=0"? How do I specify compatibility with manylinux2010 and GCC 7.3.1?
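One way to see which ABI-related flags the installed TensorFlow wheel expects is to query `tf.sysconfig`, which is a public TensorFlow API; the exact values printed depend on how that particular wheel was built, so treat this as a minimal diagnostic sketch rather than a build recipe:

```python
# Minimal sketch: print the compile/link flags the installed TensorFlow
# package expects, so a custom-op build can be matched against them.
import tensorflow as tf

print(tf.sysconfig.get_compile_flags())  # typically includes -D_GLIBCXX_USE_CXX11_ABI=0 for pip wheels
print(tf.sysconfig.get_link_flags())     # library search path and framework library to link against
```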
TensorFlow Addons only supports TF 2.x. We have an issue for build-from-source compatibility, but it would still only support TF2+: #770
Addons are similar to custom ops in 1.x, right? So they have the same issue.
TFA must be built from source to support custom ops. This is necessary since the TensorFlow C++ API is not ABI compatible. See: tensorflow/addons#987.
Note: TFA functionality relying on custom ops won't work. See: tensorflow/addons#987. BUG=145555176
Creating this issue to consolidate the information, as seen in #676 and #574.
Overview:
The TF C++ API for kernel implementations is not ABI stable. Because of this, we compile our pip package to be manylinux2010 compatible with GCC 7.3.1. This means that TF-Addons custom ops will work with pip-installed TF, but are not guaranteed to work with TF built from source or installed from conda.
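For context, the mismatch only surfaces when the custom-op shared object is actually loaded. The sketch below shows the typical failure mode; the `.so` path is hypothetical and the exact symbol name in the error message will vary:

```python
# Minimal sketch: loading a custom-op shared library built against a
# different compiler/ABI fails at load time with an "undefined symbol" error.
import tensorflow as tf

try:
    _ops = tf.load_op_library("path/to/_custom_ops.so")  # hypothetical path
except tf.errors.NotFoundError as e:
    # Typical message: "... undefined symbol: _ZN10tensorflow..."
    print("Custom op failed to load due to an ABI mismatch:", e)
```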
This should not affect our macOS and Windows versions.
Temporary "fixes"
We implemented lazy loading of custom ops so the majority of TFA functionality is not affected by this issue (see the sketch below):
#869
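The following is a minimal sketch of the lazy-loading idea, not TFA's actual implementation from #869; the shared-object path and op name are hypothetical. The point is that `tf.load_op_library` is deferred until an op is first used, so simply importing the package does not fail on installations with an ABI mismatch:

```python
# Sketch of lazy loading: the shared library is only loaded on first use,
# so an ABI-related "undefined symbol" error is raised when the custom op
# is called, not when the package is imported.
import tensorflow as tf

class LazyOpLibrary:
    def __init__(self, so_path):
        self._so_path = so_path  # hypothetical path to the compiled .so
        self._lib = None

    @property
    def ops(self):
        if self._lib is None:
            # Any undefined-symbol error surfaces here, at first use.
            self._lib = tf.load_op_library(self._so_path)
        return self._lib

# Importing this module is safe; only calling the op can fail.
_activations = LazyOpLibrary("path/to/_activation_ops.so")  # hypothetical

def hardshrink(x, lower=-0.5, upper=0.5):
    # Hypothetical op name used purely for illustration.
    return _activations.ops.addons_hardshrink(x, lower=lower, upper=upper)
```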
Long-term solution
RFC 133 is in the works to make a stable C ABI.