Code Llama for VSCode: An Introduction
Overview
Code Llama for VSCode provides an innovative solution for developers looking to integrate the Code Llama AI models into their Visual Studio Code environment. By exposing an API that mimics llama.cpp's server interface, this project enables the seamless use of Code Llama with the Continue Visual Studio Code extension.
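As a rough sketch of the idea, the mock server accepts completion requests over HTTP and answers with model output. Note this is an illustrative stand-in, not the project's actual `llamacpp_mock_api.py`: the route name, response shape, and `generate` helper here are assumptions for demonstration.

```python
# Minimal sketch of a llama.cpp-style mock API (hypothetical, for illustration).
from flask import Flask, jsonify, request

app = Flask(__name__)

def generate(prompt: str) -> str:
    # Placeholder for the real call into the loaded Code Llama model.
    return f"// completion for: {prompt}"

@app.route("/completion", methods=["POST"])
def completion():
    # Read the prompt from the JSON request body and return generated text.
    prompt = request.get_json().get("prompt", "")
    return jsonify({"content": generate(prompt)})

# In a real deployment this would be served with app.run(port=8080) so the
# Continue extension can reach it like a llama.cpp server.
```

Because the extension only sees HTTP requests and responses, it cannot tell the difference between this mock and a genuine llama.cpp server, which is what makes the integration work.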
Why Use Code Llama for VSCode?
For developers who wish to run Code Llama locally on their machines, Code Llama for VSCode stands out as a unique solution. Unlike services that require sign-ups or API keys, this project works without those constraints. An alternative called Ollama exists, but at the time of writing it does not support Windows or Linux. In contrast, Code Llama for VSCode is entirely cross-platform: it runs on any platform where Meta's codellama code runs.
Setup Instructions
Prerequisites
Before getting started, ensure that you have the necessary components set up:
- Download and run one of the Code Llama Instruct models from the Code Llama GitHub repository.
- Install the Continue VSCode extension via the Visual Studio Marketplace.
Once both components are functioning independently, you can integrate them using Code Llama for VSCode.
Integration Steps
1. Preparation: Move the `llamacpp_mock_api.py` file into your `codellama` directory. You'll also need Flask installed in your environment (`pip install flask`).

2. Running the API: Execute `llamacpp_mock_api.py` using your Code Llama Instruct torchrun command. Here's an example command to guide you:

   ```shell
   torchrun --nproc_per_node 1 llamacpp_mock_api.py \
       --ckpt_dir CodeLlama-7b-Instruct/ \
       --tokenizer_path CodeLlama-7b-Instruct/tokenizer.model \
       --max_seq_len 512 --max_batch_size 4
   ```

3. Configuration: Open Continue's UI in VSCode and click the settings button at the bottom right. Modify `config.json` to align with the specified configuration here [archive], making sure to replace `MODEL_NAME` with `codellama-7b`.
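The linked configuration is authoritative, but as a rough sketch of what the edited `config.json` entry could look like, assuming Continue's usual model-list format with a llama.cpp-style provider (the `provider` name, `apiBase` port, and overall shape here are assumptions, not taken from the project):

```json
{
  "models": [
    {
      "title": "Code Llama",
      "provider": "llama.cpp",
      "model": "codellama-7b",
      "apiBase": "http://localhost:8080"
    }
  ]
}
```

The `apiBase` must point at wherever `llamacpp_mock_api.py` is listening, since Continue sends its completion requests there.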
After performing these steps, restart VSCode or reload the Continue extension. You should now be fully equipped to harness the power of Code Llama within VSCode, enhancing your development experience with AI-driven code assistance.