The Wayback Machine - https://web.archive.org/web/20250503045651/https://github.com/tensorflow/tensorflow/pull/45342

[TFLite] Add TABLE operator for LUT-based operators lowering #45342


Merged: 12 commits merged into tensorflow:master on Jul 29, 2021

Conversation

@Tessil (Contributor) commented Dec 2, 2020

Hi,

This PR adds a TFLite TABLE operator similar to the TABLE operator of the TOSA specification. The operator takes int8 or int16 inputs and looks them up in the table associated with the operator to produce the outputs.

This operator can be used to lower non-linear quantized functions, like the exponential, to a LUT over the quantized input range. The following proof-of-concept commit quantizes the EXP operator by generating a LUT over the input range of the operator and replacing it with a TABLE operator in the exported TFLite model.

This PR only adds the operator and does not provide any transformation yet; those will be part of a different PR. The gen_lut function has also been extended to support int8->int8, int8->int16 and int16->int8 tables in addition to the previously supported int16->int16 table.

Thibaut
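The LUT-based lowering described above can be sketched in Python. This is a hedged illustration, not the actual gen_lut implementation: the function names build_int8_lut and lookup_int8 and the quantization parameters are hypothetical.

```python
# Hedged sketch of LUT-based lowering (an illustration, not the actual
# TFLite gen_lut implementation): tabulate a non-linear function such as
# exp for every possible int8 input, then evaluate inputs by table lookup.
import numpy as np

def build_int8_lut(func, in_scale, in_zero_point, out_scale, out_zero_point):
    """Tabulate `func` over all 256 possible int8 input values."""
    q_in = np.arange(-128, 128, dtype=np.int32)
    real_in = in_scale * (q_in - in_zero_point)   # dequantize
    real_out = func(real_in)                      # apply the function
    q_out = np.round(real_out / out_scale) + out_zero_point
    return np.clip(q_out, -128, 127).astype(np.int8)

def lookup_int8(inputs, table):
    # An int8 value shifted by 128 indexes the 256-entry table.
    return table[inputs.astype(np.int32) + 128]

# Example: exp over an input range of roughly [-4, 0] (made-up parameters).
table = build_int8_lut(np.exp,
                       in_scale=4.0 / 255, in_zero_point=127,
                       out_scale=1.0 / 255, out_zero_point=-128)
x = np.array([127, 0, -128], dtype=np.int8)  # real values ~0.0, ~-2.0, ~-4.0
y = lookup_int8(x, table)
```

At inference time the operator then reduces to a single indexed load per element, regardless of how expensive the original function is.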

@google-ml-butler google-ml-butler bot added the size:L CL Change Size: Large label Dec 2, 2020
@google-ml-butler google-ml-butler bot requested a review from joker-eph December 2, 2020 17:10
@google-cla google-cla bot added the cla: yes label Dec 2, 2020
@gbaned gbaned self-assigned this Dec 3, 2020
@gbaned gbaned added the comp:lite TF Lite related issues label Dec 3, 2020
@abattery (Contributor) commented Dec 7, 2020

Thanks for your contribution.

The new TABLE operator will not be used by the TF-to-TFLite op lowering yet, so it is hard to use it for actual use cases. Are there any future plans for that? If not, how about just having this table operator as a custom op?

@gbaned gbaned requested a review from abattery December 8, 2020 15:13
@gbaned (Contributor) commented Dec 8, 2020

@Tessil Can you please check @abattery's comments and keep us posted? Thanks!

@gbaned gbaned added awaiting review Pull request awaiting review stat:awaiting response Status - Awaiting response from author and removed awaiting review Pull request awaiting review labels Dec 8, 2020
@Tessil Tessil force-pushed the toupstream/table_operator branch from 184a9dc to a24d523 Compare December 11, 2020 16:06
@tensorflowbutler tensorflowbutler removed the stat:awaiting response Status - Awaiting response from author label Dec 13, 2020
@gbaned gbaned added the awaiting review Pull request awaiting review label Dec 21, 2020
@abattery (Contributor) commented Dec 21, 2020

Since this operator is still somewhat experimental, how about adding it as a custom op as a first step, until it resolves the unsupported cases or unlocks new opportunities with concrete examples? Adding it as a builtin op would impose the extra binary-size cost on most Android developers.

@Tessil (Contributor, Author) commented Dec 22, 2020

Sorry for my late answer. I think it would be easier for now to leave the PR on the side, as there are still discussions in progress regarding the advantages of a separate TABLE operator compared to generating the LUT inside the Prepare method of each operator. Once we have a clearer understanding of how to move on with all this, we could eventually add the operator as a custom op as a first step.

@tensorflowbutler tensorflowbutler removed the awaiting review Pull request awaiting review label Dec 24, 2020
@gbaned gbaned added the stat:awaiting response Status - Awaiting response from author label Jan 15, 2021
@tensorflowbutler tensorflowbutler added the stale This label marks the issue/pr stale - to be closed automatically if no activity label Feb 1, 2021
@gbaned (Contributor) commented Feb 5, 2021

@Tessil Any update on this PR? Please. Thanks!

@gbaned gbaned removed the stale This label marks the issue/pr stale - to be closed automatically if no activity label Feb 5, 2021
@Tessil (Contributor, Author) commented Feb 5, 2021

Hi @gbaned ,

The PR is on hold for now, as some discussions are still required internally and with some of the TFLite team members to decide whether we move forward with a separate TABLE operator or not. Sorry for that.

@tensorflowbutler tensorflowbutler removed the stat:awaiting response Status - Awaiting response from author label Feb 7, 2021
@gbaned gbaned added the stat:awaiting response Status - Awaiting response from author label Mar 1, 2021
@tensorflowbutler tensorflowbutler added the stale This label marks the issue/pr stale - to be closed automatically if no activity label Mar 17, 2021
@tensorflowbutler (Member) commented

It has been 14 days with no activity and the awaiting response label was assigned. Is this PR still valid? Assigning the stalled label. Please comment to reassure me that this is still being worked on.

@talumbau talumbau requested a review from jianlijianli May 5, 2021 17:11
@jianlijianli (Contributor) commented Jul 14, 2021

Thanks Tessil for making the change.

the change I did in tensorflow/lite/micro/kernels/softmax.cc will ....

If I understand correctly, the change in softmax is only refactoring (so the common functions can support the new TABLE operator). Is there any tangible change in the implementation?

@Tessil (Contributor, Author) commented Jul 14, 2021

Yes, the change is only a refactoring to adapt to the new, more flexible gen_lut function signature. There is no tangible change in the implementation itself; the results will be exactly the same for the given parameters.

@jianlijianli (Contributor) left a review:

Thanks Tessil.

@google-ml-butler google-ml-butler bot added kokoro:force-run Tests on submitted change ready to pull PR ready for merge process labels Jul 14, 2021
@kokoro-team kokoro-team removed the kokoro:force-run Tests on submitted change label Jul 14, 2021
@gbaned gbaned added ready to pull PR ready for merge process and removed ready to pull PR ready for merge process stat:awaiting tensorflower Status - Awaiting response from tensorflower labels Jul 15, 2021
@google-ml-butler google-ml-butler bot removed the ready to pull PR ready for merge process label Jul 15, 2021
@gbaned gbaned requested a review from jianlijianli July 15, 2021 14:04
@gbaned gbaned added the awaiting review Pull request awaiting review label Jul 16, 2021
@Tessil (Contributor, Author) commented Jul 19, 2021

@jianlijianli I fixed an implicit-cast warning which caused an error in one of the CI jobs. The PR will need re-approval, thanks!

@jianlijianli (Contributor) left a review:

Thanks Tessil. GitHub still shows a build failure on Intel oneDNN but it appears to be unrelated.

@google-ml-butler google-ml-butler bot added kokoro:force-run Tests on submitted change ready to pull PR ready for merge process labels Jul 26, 2021
@kokoro-team kokoro-team removed the kokoro:force-run Tests on submitted change label Jul 26, 2021
@gbaned gbaned removed the awaiting review Pull request awaiting review label Jul 26, 2021
@abattery (Contributor) commented Jul 27, 2021

Is it possible to remove the new custom op from the default op registration? Instead, we can keep it as part of the custom_ops target under the tensorflow/lite/BUILD file.

Most common TFLite users do not require the new table custom op for now, and we would like to keep the TFLite library as compact as possible.

@@ -637,6 +637,7 @@ BUILTIN_KERNEL_SRCS = [
"strided_slice.cc",
"sub.cc",
"svdf.cc",
"table.cc",

@Tessil (Contributor, Author) commented Jul 27, 2021

I can remove it from the default registration and add it to the custom_ops library, but out of curiosity, are there specific builds generated with these custom ops? And how should we compile such builds, as the custom_ops library target doesn't seem to be referenced anywhere as a dependency outside of the tests?

@abattery (Contributor) commented Jul 27, 2021

In Bazel, it is possible to create a custom TFLite build target that depends on the custom_ops build target. This is the way to use user-generated custom ops.
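Under that assumption, a downstream BUILD file might look roughly like the following sketch. All target names here are illustrative assumptions except custom_ops, which the thread references under tensorflow/lite/BUILD:

```
cc_binary(
    name = "my_tflite_app",   # hypothetical downstream binary
    srcs = ["main.cc"],
    deps = [
        "@org_tensorflow//tensorflow/lite:framework",
        # Pull in the custom ops (including TABLE) that are not
        # part of the default builtin op registration.
        "@org_tensorflow//tensorflow/lite:custom_ops",
    ],
)
```

The app would then register the custom op with its op resolver at runtime rather than relying on the builtin resolver.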

@copybara-service copybara-service bot merged commit f2907e7 into tensorflow:master Jul 29, 2021
@google-ml-butler google-ml-butler bot removed cla: yes comp:lite TF Lite related issues ready to pull PR ready for merge process size:L CL Change Size: Large labels Jul 29, 2021
copybara-service bot pushed a commit that referenced this pull request Jul 29, 2021
…rs lowering

PiperOrigin-RevId: 387605434
Change-Id: I40ca1c45b169f5f801cc5663e856d7c20ee6f022
copybara-service bot pushed a commit that referenced this pull request Aug 5, 2021
PiperOrigin-RevId: 388849112
Change-Id: Iaac0005b519ffe32f7f298ee80a14735bff9c9c2
6 participants