Added training files with quantized weights #1
Conversation
Thank you. But please comment:
In principle yes, but first it is necessary to review the performance of the new files.
I plan to add a version in the DPF config to remove the dependency on the file name.
"2017" is a typo, I'll correct it.
Yes, I plan to change the DeepTauId producer to reduce the amount of memory consumption.
OK, thanks for the explanations. I am going to prepare a special branch for the files with quantized training and then accept the PR.
Merged to the
P.S. I checked and found that the original DPF files have "2017" in their names, so it makes sense to keep it also in the names of the quantized ones. What about "v0"/"v1", which is expected to be just before ".pb" to correctly recognize the version of the discriminant? Is it possible to detect the DPF version based on the structure of the graph in the *.pb file?
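To illustrate the filename-based option discussed here (the file names and the helper function below are hypothetical, for illustration only, not the actual repository contents), a "v0"/"v1" tag placed just before ".pb" could be extracted like this:

```python
import re

def parse_graph_version(filename):
    """Extract a trailing version tag (e.g. 'v0', 'v1') placed
    immediately before the .pb extension. Returns None when no
    version suffix is present in that position."""
    m = re.search(r'_v(\d+)\.pb$', filename)
    return int(m.group(1)) if m else None

# Hypothetical file names, for illustration only:
print(parse_graph_version("DPFTauId_2017_v0.pb"))            # -> 0
print(parse_graph_version("DPFTauId_2017_v1_quantized.pb"))  # -> None (tag not just before .pb)
```

Detecting the version from the graph structure inside the *.pb file itself would instead require loading the GraphDef and inspecting its nodes, which ties the check to a TensorFlow installation.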
@mbluj @ocolegrove After some tries, I can't find a way to create quantized files for DPF that would work for 94X, because 94X uses an old tensorflow version (1.3.0). In 10_4_X the quantized files work fine, because the tensorflow version there is 1.6.0. Any ideas/suggestions on how to proceed?
The quantized version of deepTau will work for both 94X and 10_4_X, because it has a simpler network structure.
What about the TF version associated with 102X?
I propose to focus on 104X for now (and 102X if it contains a TF version which is new enough). If DeepTau/DPFTau are accepted for 104X we will think about backports. Does that sound reasonable?
1.6.0, so it should be compatible.
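The compatibility constraint being worked out above (quantized DPF graphs load with TF 1.6.0 but not 1.3.0) can be restated as a simple version check. The release-to-TF mapping below only repeats the versions quoted in this thread; the helper function is illustrative, not part of CMSSW:

```python
def tf_version_tuple(version):
    """Turn a dotted version string like '1.6.0' into a comparable tuple."""
    return tuple(int(x) for x in version.split("."))

# TF versions quoted in this conversation for each CMSSW release cycle
cmssw_tf = {"94X": "1.3.0", "102X": "1.6.0", "10_4_X": "1.6.0"}
MIN_TF_FOR_QUANTIZED = "1.6.0"  # quantized DPF graphs failed to load below this

for release, tf in sorted(cmssw_tf.items()):
    ok = tf_version_tuple(tf) >= tf_version_tuple(MIN_TF_FOR_QUANTIZED)
    print(f"{release}: TF {tf} -> quantized graphs {'supported' if ok else 'NOT supported'}")
```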
Ok, I agree. However, to not create a mess, I propose that @MRD2F first commits the code in the 94X branch (with the quantized version disabled by default) and then forward-ports it to 10_4_X (the same procedure as was done before). Do you agree? Just for the record: since the original version of DPF is compatible with 1.3.0, in theory it should be possible to produce a quantized graph that is also compatible with this version by using a standalone tensorflow v1.3.0 installed from sources with the transform_graph tool compiled.
OK, good!
Yes, please proceed with it.
OK, it is good to know. If the integration with CMSSW 104X is successful, we will think about this option for 94X. I hope that a graph quantized with TF 1.3.0 will also be usable with 1.6.0, to not expand the training-file repo too much. Or do you think it makes sense to try the quantization with TF 1.3.0 already now? Would it be more handy?
For sure it would be handier to have the final quantized files compatible with 1.3.0 and above straight away, but tensorflow 1.3.0 is old and it is not compatible with recent versions of bazel (the build system used to compile tensorflow from sources). So, unless someone already has it installed, one would have to do the whole setup from scratch to make it work, which would require some time...
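For reference, the weight-quantization workflow with the transform_graph tool mentioned above typically looks like the sketch below. It is a build/CLI fragment, not runnable standalone: it needs a TensorFlow 1.x source checkout and bazel, and the file and node names are placeholders, not the actual DPF ones.

```shell
# Build the graph-transform tool from a TensorFlow 1.x source checkout
bazel build tensorflow/tools/graph_transforms:transform_graph

# Quantize the weights of a frozen graph; in_graph/out_graph and the
# input/output node names are placeholders for illustration
bazel-bin/tensorflow/tools/graph_transforms/transform_graph \
  --in_graph=model.pb \
  --out_graph=model_quantized.pb \
  --inputs='input' \
  --outputs='output' \
  --transforms='quantize_weights'
```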
OK, so let's not explore this now. It could be simpler to update TF in a (hypothetical) future 94X release than to go back to TF 1.3.0...
Of course, one can ask the b-tagging people; maybe someone by chance still has an old TF installed somewhere.
Training files for DNN-based Tau-Ids