Support keras style programming in eager mode and enable tf.function #196
Conversation
tensorflow_recommenders_addons/dynamic_embedding/python/ops/layers.py
""" | ||
ids = tf.convert_to_tensor(ids) | ||
input_shape = tf.shape(ids) | ||
lookup_result = de.shadow_ops.embedding_lookup(self.shadow, ids) |
Would embedding_lookup_unique be a better choice here, to follow the Keras Embedding layer?
It does! In dynamic_embedding_ops.py, the xxx_embedding_lookup APIs are wrappers of embedding_lookup. So I'm trying to reuse the code in these wrappers by applying a code transform on de.embedding_lookup. That will take some time; maybe just using unique_embedding_lookup in the current Keras layer is more realistic for now.
The CI failure was caused by keras-team/keras#15586.
Accept
LGTM
Background
Some of the design of Variable in current TensorFlow is built on the prerequisite that variables are dense. There is a gap between sparse Variables and dense Variables, since they have different shape mechanisms, initializer implications, updating logic, etc., in agreement with the RFC. A sparse Variable can be considered a superset of a dense one, with more functionality and a more complex working mechanism.
Currently, in graph mode, dynamic_embedding works fine, but in eager mode it both lacks performance and is not easy to use. This PR aims to improve the situation by supporting tf.function on dynamic_embedding and providing keras programming support.

In past versions of dynamic_embedding, we create a local variable to store the embedding lookup result temporarily by calling embedding_lookup. If it was called inside the scope of tf.function, the temporary variables would show up as EagerTensors in the outputs. The same issue also exists when optimizers try to create slots.
Changes