About AIKit ✨
AIKit is a comprehensive platform designed to simplify hosting, deploying, building, and fine-tuning large language models (LLMs). It provides first-class support for both inference and fine-tuning, so users can stand up and customize open LLMs with minimal setup.
Key Capabilities
Inference
AIKit's inference is powered by LocalAI, which supports a wide range of model formats and inference backends. It exposes a REST API that is drop-in compatible with the OpenAI API, so any OpenAI-compatible client, such as Kubectl AI or Chatbot-UI, can send requests to open LLMs without changes to existing tooling.
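For example, most OpenAI-compatible clients only need their API base URL pointed at the local AIKit endpoint. The environment variables below are common client conventions rather than AIKit-specific settings, and the exact names vary by client; the API key value is a placeholder, since local inference typically requires no real key.
export OPENAI_API_BASE=http://localhost:8080/v1   # convention used by many tools and older SDKs
export OPENAI_BASE_URL=http://localhost:8080/v1   # convention used by newer OpenAI SDKs
export OPENAI_API_KEY=placeholder                 # typically no real key is needed for local inference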
Fine-Tuning
AIKit offers an extensible interface for fine-tuning models. It ships with Unsloth support for fast, memory-efficient, and easy-to-use fine-tuning.
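In practice, fine-tuning is configured declaratively (see the Customizability feature below) and executed as a container build. The command below is only a rough sketch of that workflow: the config file name, build target, and output path are assumptions, and the exact config schema is described in the AIKit fine-tuning documentation.
# Rough sketch: build a fine-tuned model from a declarative config
# (file name, target name, and output path are illustrative)
docker buildx build --file aikitfile-finetune.yaml --target unsloth --output _output .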
For further reading on AIKit's features, users are encouraged to visit the AIKit website.
Exciting Features
- No Special Hardware Required: Operate without a GPU or internet, needing only Docker.
- Security and Size: Enjoy a minimal image size with a reduced attack surface thanks to a distroless-based setup.
- Customizability: Easily fine-tune models and configure them declaratively for both inference and model building.
- OpenAI Compatibility: Seamlessly integrate with any client compatible with OpenAI's API.
- Versatile Model Support: Supports multi-modal models, image generation, and several model types like GGUF and GPTQ.
- Kubernetes and More: Ready for deployment in Kubernetes environments (see the deployment sketch after this list).
- Wide Platform Compatibility: Supports AMD64 and ARM64 CPUs, with GPU-accelerated inference on NVIDIA GPUs.
- Secure and Isolated Operations: Run air-gapped edge inference in fully isolated environments, pulling model images from self-hosted or remote registries.
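As referenced in the Kubernetes item above, a minimal way to try one of the pre-made images on a cluster is to run it as a standard Deployment. The resource names below are illustrative, and a production setup would typically add GPU scheduling, resource limits, and an Ingress.
kubectl create deployment aikit-llama3 --image=ghcr.io/sozercan/llama3.1:8b
kubectl expose deployment aikit-llama3 --port=8080 --target-port=8080
kubectl port-forward service/aikit-llama3 8080:8080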
How to Get Started
Getting started on a local machine is straightforward, and no GPU is required.
docker run -d --rm -p 8080:8080 ghcr.io/sozercan/llama3.1:8b
Users can access the platform's WebUI by visiting http://localhost:8080/chat in their browsers.
API Access
AIKit provides a REST endpoint compatible with the OpenAI API, allowing developers to send requests as they would with OpenAI's services.
curl http://localhost:8080/v1/chat/completions -H "Content-Type: application/json" -d '{
  "model": "llama-3.1-8b-instruct",
  "messages": [{"role": "user", "content": "explain kubernetes in a sentence"}]
}'
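Because the endpoint follows the OpenAI API surface, related routes should work as well; for example, the standard model-listing route can be used to check which models are being served.
curl http://localhost:8080/v1/models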
Ready-to-Use Models
AIKit provides a set of pre-made model images that are ready to run out of the box, and users can also build and host their own custom models.
Explore Further
For those interested in more detailed guidance on fine-tuning models or creating custom images, the AIKit website offers comprehensive resources and support.
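As a rough illustration, building a custom model image generally comes down to writing a declarative config and running a container build against it. The file name and image tag below are placeholders, and the exact config schema is documented on the AIKit website.
# Sketch: build a custom model image from a declarative config
# (aikitfile.yaml and the image tag are illustrative)
docker buildx build . --file aikitfile.yaml --tag my-custom-model --load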