Test your device's CPU/GPU AI speed.

AISpeedTest.net



Log

Chart

Load custom model/images
Drag & Drop files here (Models)
Drag & Drop files here (Images)
Help

- How do I use it?
  Open aispeedtest.net in each device's web browser; results from every device are fed back to all views so they can be compared.

- How does it work?
  An ONNX model is loaded in the browser and emoji images are passed to it; the reported inference time shows how quickly the model ran in your browser.
  The model was trained on real images; emojis are used only so the test is quick and needs no upload.

- Can I load a custom image for detection?
  Yes, drop an image into the image drop zone above to run inference on it.

- Can I load a custom model?
  Yes, your own ONNX model can be tried using the file drop zones above.
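The flow described above can be sketched with onnxruntime-web. This is a hedged sketch, not the site's actual code: the `ort` API usage, tensor shape, and input name are assumptions based on the MobileNetV2 model listed under References.

```javascript
// Sketch of the test loop (assumption: onnxruntime-web "ort" API).
// Wrap any async inference call and report how long it took in milliseconds.
async function timeInference(runFn) {
  const t0 = performance.now();
  const result = await runFn();
  return { result, inferenceTimeMs: performance.now() - t0 };
}

// In the browser it would be used roughly like this (names assumed):
// const session = await ort.InferenceSession.create(
//   "https://microsoft.github.io/onnxruntime-web-demo/mobilenetv2-7.onnx");
// const input = new ort.Tensor("float32", pixels, [1, 3, 224, 224]);
// const { result, inferenceTimeMs } =
//   await timeInference(() => session.run({ input }));
```

The timing wrapper works for any model, which is why a custom model dropped into the zone above can be measured the same way as the default one.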

References

MS Demo: MS ONNX Runtime demo
Model: https://microsoft.github.io/onnxruntime-web-demo/mobilenetv2-7.onnx




Changelog
 
 Oct-2024.2 - added name tag
 Oct-2024.1 - initial graph
Known issues

- no WebSocket fallback, e.g. for Apple Watch devices
- older iOS: canvas UTF-8 character draw scaling / WebSockets
- Chart.js replacement for older devices (needs a different time-series chart)
- add top scores/averages to the view
- prediction issues on Fire Tablet/Silk

Contact
grahams.attic-0z[at]icloud[dot]com