Project Overview: llm-movieagent
The llm-movieagent project enhances interactions between language models and graph databases, specifically Neo4j, through a semantic layer and OpenAI function calling. The project builds an intelligent agent that understands and works with a graph database, providing a user-friendly interface for querying and retrieving its data.
Project Goals and Workflow
The primary goal of the llm-movieagent project is to create an agent that can answer user queries about movies or people within the context of a graph database. The agent interprets user intent and uses a semantic layer to fetch or recommend relevant data from Neo4j; the project's accompanying blog post describes this workflow in more detail, along with a diagram.
To launch the project, run docker-compose up and then interact with the agent through the web interface at http://localhost:8501.
Tools and Functionality
The llm-movieagent employs several key tools to interact effectively with Neo4j; a sketch of how one of these tools might be exposed to the model follows the list:
- Information Tool: retrieves current, accurate data about movies or individuals, so the agent can answer factual requests reliably.
- Recommendation Tool: suggests movies based on user preferences and input, providing tailored recommendations.
- Memory Tool: stores user preferences in the knowledge graph, personalizing the experience and making interactions more meaningful over time.
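To make this concrete, here is a minimal sketch of how an Information-style tool could be exposed to the model through OpenAI function calling. It is an illustration under assumptions, not code from the repository: the schema, the Cypher query, and the Movie label and title property are placeholders.

```python
# Sketch of an "Information"-style tool exposed via OpenAI function calling.
# The schema, Cypher query, and node labels are illustrative assumptions,
# not the exact definitions used by llm-movieagent.
import os
from neo4j import GraphDatabase

driver = GraphDatabase.driver(
    os.environ["NEO4J_URI"],
    auth=(os.environ["NEO4J_USERNAME"], os.environ["NEO4J_PASSWORD"]),
)

# JSON schema the language model sees when deciding whether to call the tool.
information_tool_schema = {
    "name": "information",
    "description": "Look up details about a movie or person in the graph.",
    "parameters": {
        "type": "object",
        "properties": {
            "entity": {
                "type": "string",
                "description": "Movie title or person name to look up",
            }
        },
        "required": ["entity"],
    },
}

def information(entity: str) -> list[dict]:
    """Run a parameterized Cypher query and return matching movie records."""
    query = (
        "MATCH (m:Movie) "
        "WHERE toLower(m.title) CONTAINS toLower($entity) "
        "RETURN m.title AS title LIMIT 5"
    )
    with driver.session() as session:
        return [record.data() for record in session.run(query, entity=entity)]
```

The agent advertises a schema like information_tool_schema to the model and invokes the corresponding function whenever the model decides to call it, passing along the structured arguments the model produced.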
Setting Up the Environment
For proper functioning, several environment variables must be defined in a .env file:
OPENAI_API_KEY=<YOUR_OPENAI_API_KEY>
NEO4J_URI=<YOUR_NEO4J_URI>
NEO4J_USERNAME=<YOUR_NEO4J_USERNAME>
NEO4J_PASSWORD=<YOUR_NEO4J_PASSWORD>
These variables configure the OpenAI and Neo4j connections used by the project's services.
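As a quick sanity check that the Neo4j variables are set correctly, a short script along these lines (not part of the project) can verify connectivity:

```python
# Hypothetical connectivity check; not part of the llm-movieagent codebase.
import os
from neo4j import GraphDatabase

driver = GraphDatabase.driver(
    os.environ["NEO4J_URI"],
    auth=(os.environ["NEO4J_USERNAME"], os.environ["NEO4J_PASSWORD"]),
)
driver.verify_connectivity()  # raises an exception if the URI or credentials are wrong
print("Connected to Neo4j at", os.environ["NEO4J_URI"])
```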
Docker Container Services
The project is structured as several services, each packaged as a Docker container:
- Neo4j: stores the data about actors, movies, and their ratings.
- API: uses LangChain's neo4j-semantic-layer template to integrate OpenAI's language models and function calling.
- UI: a simple Streamlit chat application, reachable at localhost:8501 (a minimal sketch of such a chat front end follows this list).
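For orientation, here is a minimal sketch of what a Streamlit chat front end talking to the API service could look like. The service hostname, port, route, and payload shape are assumptions for illustration, not the ones used by the actual UI container.

```python
# Hypothetical Streamlit chat front end; the API endpoint and payload shape
# below are assumptions, not the exact ones used by the UI service.
import requests
import streamlit as st

API_URL = "http://api:8080/chat"  # assumed service name and route inside docker-compose

st.title("Movie agent")

if "messages" not in st.session_state:
    st.session_state.messages = []

# Replay the conversation so far.
for message in st.session_state.messages:
    with st.chat_message(message["role"]):
        st.write(message["content"])

if prompt := st.chat_input("Ask about movies or people"):
    st.session_state.messages.append({"role": "user", "content": prompt})
    with st.chat_message("user"):
        st.write(prompt)

    # Forward the question to the API service and display its reply.
    response = requests.post(API_URL, json={"input": prompt}, timeout=60)
    answer = response.json().get("output", "")
    st.session_state.messages.append({"role": "assistant", "content": answer})
    with st.chat_message("assistant"):
        st.write(answer)
```

Whatever the exact API contract, the pattern stays the same: keep the conversation in st.session_state, send each new message to the API container, and render the reply in the chat.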
Database Population
To populate the database with an example movie dataset, users can run the ingest.py script, which imports movie information and user ratings into the graph database. The script is designed to run from within the API Docker container:
# Enter the container shell
docker exec -it <container id for llm-movieagent-api> bash
# Execute the ingestion script
python ingest.py
The dataset is based on the MovieLens dataset and is also used in Neo4j's "Recommendation" project.
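After ingestion completes, a short check along these lines (not part of the project; the Movie, Person, and User labels are assumptions based on the dataset description) can confirm that data landed in the graph:

```python
# Hypothetical post-ingestion check; the node labels are assumptions based on
# the dataset description (movies, actors, and user ratings).
import os
from neo4j import GraphDatabase

driver = GraphDatabase.driver(
    os.environ["NEO4J_URI"],
    auth=(os.environ["NEO4J_USERNAME"], os.environ["NEO4J_PASSWORD"]),
)

with driver.session() as session:
    for label in ("Movie", "Person", "User"):
        count = session.run(f"MATCH (n:{label}) RETURN count(n) AS c").single()["c"]
        print(f"{label}: {count} nodes")
```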
Invitation for Contributions
The llm-movieagent project welcomes contributions from enthusiasts and developers eager to collaborate and innovate further.
In essence, llm-movieagent marries language models with a graph database to deliver personalized data retrieval and conversational interaction.