
llama_ros

Integrate LLMs and VLMs into ROS 2 with llama_ros Packages

Product Description

llama_ros brings llama.cpp's optimized inference into ROS 2 projects. It supports GGUF-based LLMs and VLMs, real-time LoRA switching, and GBNF grammars for constrained output, enhancing robotic applications. The repository provides detailed installation guides, Docker options, CUDA support, and usage examples, along with LangChain integration and related demos for expanding project capabilities.
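To illustrate how a ROS 2 node might request text generation from llama_ros, here is a minimal rclpy sketch. The action type, server name, and field names (llama_msgs/action/GenerateResponse, /llama/generate_response, goal.prompt, result.response.text) are assumptions for illustration only; consult the llama_msgs package and the repository's usage examples for the actual interface.

```python
# Hypothetical sketch: calling a llama_ros text-generation action from a ROS 2 node.
# Action type, server name, and message fields are assumed for illustration;
# check llama_msgs in the llama_ros repository for the real definitions.
import rclpy
from rclpy.action import ActionClient
from rclpy.node import Node

from llama_msgs.action import GenerateResponse  # assumed action type


class LlamaClient(Node):
    def __init__(self) -> None:
        super().__init__("llama_client")
        # Server name assumed; a llama_bringup launch file would normally start it.
        self._client = ActionClient(self, GenerateResponse, "/llama/generate_response")

    def generate(self, prompt: str) -> str:
        """Send a prompt to the LLM action server and return the generated text."""
        self._client.wait_for_server()
        goal = GenerateResponse.Goal()
        goal.prompt = prompt  # assumed goal field
        goal_future = self._client.send_goal_async(goal)
        rclpy.spin_until_future_complete(self, goal_future)
        result_future = goal_future.result().get_result_async()
        rclpy.spin_until_future_complete(self, result_future)
        return result_future.result().result.response.text  # assumed result layout


def main() -> None:
    rclpy.init()
    node = LlamaClient()
    print(node.generate("Describe the robot's surroundings in one sentence."))
    node.destroy_node()
    rclpy.shutdown()


if __name__ == "__main__":
    main()
```

In practice, the model would be loaded by a launch file from llama_bringup before running a client like this; the same action-based pattern is what the LangChain wrappers in the repository build on.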
Project Details