Fixes for DORY #11

Status: Open
lukamac wants to merge 4 commits into devel from pr/fix-dory.

    @@ -19,5 +19,4 @@
    # limitations under the License.
    #

    from . import grrules
    from .cutie_export import convert_net, export_net

Six files were deleted in this pull request (their contents are not shown in the rendered diff).

Please motivate this more; are you sure you are not making a mistake at some other point? This is critical code for many applications.

It is hard for me to motivate this more with my limited knowledge of quantlib. Do you have an alternative way to handle this?

This creates an extra addition layer in the produced ONNX graph, and DORY expects a precise pattern of layers to recognize requantization.
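
For concreteness, a minimal sketch of the mismatch, assuming the backend matches a Mul, an Add, and a power-of-two Div in sequence; the exact pattern DORY matches is not shown in this thread, and all names below are illustrative:

```python
import numpy as np

def requant_bit_true(x, mul, add, shift, n_bits=8):
    # Three integer ops that export as Mul -> Add -> Div; a pattern
    # matcher can recognize this chain as one requantization step.
    y = (x * mul + add) >> shift
    return np.clip(y, -(2 ** (n_bits - 1)), 2 ** (n_bits - 1) - 1)

def requant_with_half(x, mul, add, shift, n_bits=8):
    # Adding 0.5 after the division introduces a fourth node (an extra
    # Add) and a non-integer intermediate value, so the chain no longer
    # matches the original three-op pattern.
    y = np.floor((x * mul + add) / 2 ** shift + 0.5)
    return np.clip(y, -(2 ** (n_bits - 1)), 2 ** (n_bits - 1) - 1)
```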

This will definitely break compatibility with the RQS strategy in Deeploy, where we do rounding by default. I suggest making arithmetic rounding in the RequantShift layer configurable and disabling it in flows targeting DORY as the backend.
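
A hedged sketch of that suggestion as a PyTorch module; this is not quantlib's actual RequantShift class, and the constructor signature is invented for illustration:

```python
import torch
from torch import nn

class RequantShiftSketch(nn.Module):
    # Illustrative stand-in, not quantlib's API: arithmetic rounding is a
    # constructor switch, to be disabled in flows that target DORY.
    def __init__(self, mul, add, shift, rounding=True):
        super().__init__()
        self.register_buffer("mul", mul)
        self.register_buffer("add", add)
        self.shift = shift
        self.rounding = rounding

    def forward(self, x):
        bias = self.add
        if self.rounding:
            # fold "half the shift" into the bias, keeping the op integer
            bias = bias + 2 ** (self.shift - 1)
        return torch.floor((x * self.mul + bias) / 2 ** self.shift)
```

With `rounding=False` the exported graph keeps the bare Mul -> Add -> Div chain; with `rounding=True` it reproduces round-by-default behavior without any post-division +0.5.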

I am unfortunately not very familiar with DORY, but for Deeploy we (or at least I) export fused RQS nodes directly.
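
As a contrast to the decomposed pattern above, a sketch of emitting one fused node with onnx.helper; the op type "RequantShift", the domain, and the attribute names are hypothetical, not Deeploy's actual schema:

```python
from onnx import helper

# One custom node replaces the whole Mul -> Add -> Div chain; backends
# that know the op consume it directly, but generic ONNX tooling will not.
rqs_node = helper.make_node(
    "RequantShift",          # hypothetical custom op type
    inputs=["conv_out", "mul", "add"],
    outputs=["act_out"],
    domain="custom",         # hypothetical custom-op domain
    div=2 ** 24,             # power-of-two divisor, illustrative value
    signed=1,
)
```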

We should probably talk with @da-gazzi as well - I know that Georg implemented rounding by adding "half the shift" to the bias; it seems to me that adding 0.5 here does pretty much the same thing. We should disentangle this a bit before merging, but if there are multiple places where rounding biases are added, we should fold that into one spot.
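
A quick worked check that the two placements agree, assuming the shift is a floor division by 2**s:

```python
import math

s = 8  # example shift amount
for acc in range(-(1 << 15), 1 << 15):
    via_bias = (acc + (1 << (s - 1))) >> s     # "half the shift" folded into the bias
    via_half = math.floor(acc / 2 ** s + 0.5)  # +0.5 added after the division
    assert via_bias == via_half
```

Both evaluate floor((acc + 2**(s-1)) / 2**s), so the only difference is where the constant appears in the exported graph; that is why folding it into the bias in one spot is attractive.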

I concur with @Scheremo: the only "good" solution is to fuse the rounding with the bias value and not expose this `+0.5` here. I do not know how that is handled in Deeploy, but as this is in any case an addition of 0.5 happening after requantization, it cannot really represent an integer op in a bit-true fashion. Fusing this inside QuantLib avoids any confusion.

Seems also related to this issue.

Agree - the idea of the "RequantShift" layer is that it represents the integer operations performed on the device 1:1. The activation rounding is handled by statically adding half an eps to the bias value; adding 0.5 here would achieve the same thing, but it breaks the exported net if you don't use custom nodes. Is there something keeping us from just using the "integer rounding" approach in all cases? It is already configurable; i.e., you can turn it on/off as desired with the `rounding` flag to the `PACTActivation` classes.
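
To illustrate what such a flag toggles, a self-contained sketch; the function and parameter names are assumptions, only the flag name `rounding` comes from this thread:

```python
import math

def quantize_activation(x, eps, rounding):
    # With rounding off, values are truncated toward minus infinity;
    # with rounding on, half a step is added first (round-half-up).
    q = x / eps + (0.5 if rounding else 0.0)
    return math.floor(q)

print(quantize_activation(3.7, 1.0, rounding=False))  # 3
print(quantize_activation(3.7, 1.0, rounding=True))   # 4
```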