
OnnxStream

Enhancing Deep Learning with Efficient Memory Management

Product Description

OnnxStream is a memory-efficient inference library for running large models such as Stable Diffusion on devices with very little RAM. It decouples the inference engine from the component that supplies the model weights, so weights can be loaded, streamed, or discarded on demand instead of being held entirely in memory, which yields large memory reductions compared to conventional inference runtimes. Designed for resource-constrained environments, recent updates add WebAssembly and Stable Diffusion XL support, extending what is possible on compact hardware.
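To make the weights/engine separation concrete, below is a minimal C++ sketch of loading and running a model with OnnxStream. The class and member names (Model, Tensor, read_file, push_tensor, run, m_data) follow the usage example in the OnnxStream README; the model path, input name, and input shape are placeholders and the exact API details may differ between versions.

```cpp
// Minimal sketch of running inference with OnnxStream.
// Names follow the README example; paths and shapes below are illustrative.
#include "onnxstream.h"
using namespace onnxstream;

int main()
{
    Model model;

    // The model is read from OnnxStream's text-based format; weights are
    // supplied by a separate weights provider rather than loaded up front,
    // which is what keeps peak memory low.
    model.read_file("./model/model.txt");

    // Prepare an example input tensor (the name and shape are placeholders).
    tensor_vector<float> input_data(1 * 4 * 64 * 64, 0.0f);

    Tensor t;
    t.m_name = "input";
    t.m_shape = { 1, 4, 64, 64 };
    t.set_vector(std::move(input_data));
    model.push_tensor(std::move(t));

    // Run inference; output tensors accumulate in model.m_data.
    model.run();

    Tensor& output = model.m_data[0];
    (void)output;
    return 0;
}
```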
Project Details