wonnx

Rust-based GPU-Accelerated ONNX Inference Tool for Web Applications

Product Description

A GPU-accelerated ONNX inference runtime written entirely in Rust and designed for use on the web. It runs on Vulkan, Metal, and DX12, and can be used from the command line, from Rust or Python, or in the browser via WebGPU and WebAssembly. Supported platforms include Windows, Linux, macOS, and Android. The project ships with examples, CLI tools, and documentation, making it a fit for developers who need efficient, cross-platform inference in Rust. It works with models such as SqueezeNet, MNIST, and BERT.
Project Details