Hi,

I'm using onnx.js v0.1.8 to run MobileNetV2 exported from PyTorch. I had to make some modifications because of unsupported operators created by an adaptive average pooling 2d layer (see #265), but now it works and produces the same output as PyTorch for the same image.

New issue: the webgl backend does not seem to work.

A tensorflow.js version of MobileNetV2 achieved approx. 10 fps inference.
onnx.js - backend - cpu: approx. 0.5 fps
onnx.js - backend - wasm: approx. 0.9 fps
onnx.js - backend - webgl: approx. 0.5 fps ... so I think webgl is not actually used there.
I used the following code to activate webgl. What is wrong?

Thx, Klaus

I'm loading the library with <script src="./onnx.min.js"></script>; I downloaded all the files from the CDN and placed them in the same directory as index.html.
```js
// Path to the MobileNetV2 model exported from PyTorch
const mdl = "./pytorch_mobilenetv2.onnx";
// Ask for the webgl backend explicitly
const myOnnxSession = new onnx.InferenceSession({ backendHint: 'webgl' });
// Load the model; inference is triggered later
myOnnxSession.loadModel(mdl).then(() => {});
```
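For reference, the full load-and-run flow looks roughly like this; the input dims and dummy data below are only an illustrative sketch for a 1x3x224x224 MobileNetV2 input, not my actual preprocessing code:

```js
// Sketch only: load the model with the webgl backend hint and run one inference.
const session = new onnx.InferenceSession({ backendHint: 'webgl' });

async function runOnce() {
  await session.loadModel('./pytorch_mobilenetv2.onnx');

  // Placeholder float32 NCHW tensor; a real app fills this from a canvas/image.
  const data = new Float32Array(1 * 3 * 224 * 224);
  const input = new onnx.Tensor(data, 'float32', [1, 3, 224, 224]);

  const outputMap = await session.run([input]);
  const output = outputMap.values().next().value;
  console.log('output dims:', output.dims);
}

runOnce();
```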
When using webgl I get the following warnings, but as mentioned there is no speed improvement compared to cpu.
WebGL warning: drawArraysInstanced: Using format enabled by implicitly enabled extension: EXT_float_blend. For maximal portability enable it explicitly. 31 webgl-context.ts:191:12
WebGL: No further warnings will be reported for this WebGL context. (already reported 32 warnings)
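As far as I understand, this warning is only about portability, not performance; on a WebGL context created by hand the extension would be enabled explicitly like this (sketch only, onnx.js creates its own context internally):

```js
// Sketch: explicitly enabling EXT_float_blend, which is what the warning suggests.
const canvas = document.createElement('canvas');
const gl = canvas.getContext('webgl2') || canvas.getContext('webgl');
if (gl) {
  const ext = gl.getExtension('EXT_float_blend');
  console.log('EXT_float_blend enabled:', ext !== null);
}
```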
For reference, this is the conversion pipeline I used to create the tensorflow.js version of the model:

```python
import tensorflow as tf
import tensorflowjs as tfjs
import onnx
from onnx_tf.backend import prepare
from tensorflow.python.tools.import_pb_to_tensorboard import import_to_tensorboard

# Load the ONNX model exported from PyTorch
onnx_model = onnx.load("output/pytorch_mobilenetv2.onnx")
# prepare() returns a TensorflowRep object representing the ONNX model
tf_rep = prepare(onnx_model)
# Export the TensorFlow graph
tf_rep.export_graph("output/pytorch_mobilenetv2.pb")
# Optional: inspect the exported graph in TensorBoard
# import_to_tensorboard("output/pytorch_mobilenetv2.pb", "tb_log", "")

mdl2 = tf.saved_model.load("output/pytorch_mobilenetv2.pb")
# Convert the exported model to the tensorflow.js graph-model format
tfjs.converters.convert_tf_saved_model("output/pytorch_mobilenetv2.pb", "output/js")
```
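On the browser side I then load the converted model with tf.loadGraphModel; the snippet below is only a rough sketch (the model path comes from the conversion output above, and the 1x3x224x224 input shape is an assumption matching the ONNX model, so check what the converted graph actually expects):

```js
import * as tf from '@tensorflow/tfjs';

// Sketch of the tensorflow.js comparison: load the converted graph model
// and run a single prediction with a placeholder input.
async function runTfjs() {
  const model = await tf.loadGraphModel('output/js/model.json');
  const input = tf.zeros([1, 3, 224, 224]); // placeholder, not a real image
  const output = model.predict(input);
  output.print();
}

runTfjs();
```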