multi-model-server
Multi Model Server (MMS) is a flexible tool for serving deep learning models trained with any ML/DL framework. It provides a server CLI, prebuilt Docker images, and HTTP endpoints for inference on both CPU and GPU hosts. Key features include quick model deployment, a REST-based management API, and a model archiving tool that packages model artifacts for serving. Docker-based deployment adds isolation and simplifies rolling the server out securely in production. Comprehensive documentation and community support are available.
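As an illustration of the CLI and HTTP inference flow, a minimal session might look like the sketch below. The model name, archive URL, image file, and port numbers are assumptions based on common MMS defaults (inference on 8080, management on 8081); substitute your own model archive and inputs.

```bash
# Install the server (Python package); assumes Python 3 and pip are available.
pip install multi-model-server

# Start the server with a model archive (.mar). The squeezenet archive URL
# is illustrative; point --models at whatever archive you want to serve.
multi-model-server --start \
    --models squeezenet=https://s3.amazonaws.com/model-server/model_archive_1.0/squeezenet_v1.1.mar

# Send an inference request to the default inference endpoint (port 8080).
curl -O https://s3.amazonaws.com/model-server/inputs/kitten.jpg
curl -X POST http://127.0.0.1:8080/predictions/squeezenet -T kitten.jpg

# Inspect loaded models through the management endpoint (port 8081).
curl http://127.0.0.1:8081/models

# Stop the server when done.
multi-model-server --stop
```

The same endpoints are exposed when the server runs inside the provided Docker images, so the curl calls above apply unchanged to containerized deployments once the container's ports are published.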