Autonomous HR Chatbot
The Autonomous HR Chatbot is a prototype enterprise application that answers human resources (HR) inquiries autonomously. It combines ChatGPT, LangChain, Pinecone, and Streamlit to deliver efficient, accurate responses to HR-related queries.
Overview
The primary function of the Autonomous HR Chatbot is to serve as an autonomous agent that answers HR-related questions using a set of dedicated tools. It is built with LangChain's agents and tools modules and uses Pinecone as its vector database. The application is powered by ChatGPT (the gpt-3.5-turbo model), with Streamlit providing the frontend interface via the streamlit_chat component.
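As a rough illustration of that architecture, the sketch below shows how a LangChain agent backed by gpt-3.5-turbo might be served through a streamlit_chat frontend. The tool list, widget layout, and variable names here are assumptions for illustration only, not the project's actual code.

```python
# Minimal sketch (assumptions, not the project's actual code): a LangChain agent
# powered by gpt-3.5-turbo, served through a streamlit_chat chat interface.
import streamlit as st
from streamlit_chat import message
from langchain.chat_models import ChatOpenAI
from langchain.agents import AgentType, Tool, initialize_agent

llm = ChatOpenAI(model_name="gpt-3.5-turbo", temperature=0)

# Placeholder tool; the real agent is equipped with the Timekeeping Policies,
# Employee Data, and Calculator tools described in the next section.
tools = [
    Tool(
        name="Timekeeping Policies",
        func=lambda q: "policy text would be retrieved from Pinecone here",
        description="Answers questions about timekeeping and leave policies.",
    )
]
agent = initialize_agent(tools, llm, agent=AgentType.ZERO_SHOT_REACT_DESCRIPTION)

st.title("Autonomous HR Chatbot")
if "history" not in st.session_state:
    st.session_state.history = []  # list of (question, answer) turns

user_input = st.text_input("Ask an HR question:")
if user_input and (not st.session_state.history or st.session_state.history[-1][0] != user_input):
    st.session_state.history.append((user_input, agent.run(user_input)))

# streamlit_chat renders each turn as a chat bubble.
for i, (question, answer) in enumerate(st.session_state.history):
    message(question, is_user=True, key=f"user_{i}")
    message(answer, key=f"bot_{i}")
```

Keeping the conversation in st.session_state is what lets the chat history survive Streamlit's reruns between user inputs.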
Technologies and Tools
- Timekeeping Policies: This tool includes a sample HR policy document generated by ChatGPT. The document's contents are embedded using OpenAI’s text-embedding-ada-002 model and stored in a Pinecone index for easy retrieval and reference.
- Employee Data: Employee data is maintained in a CSV file that includes mock information such as employee names, supervisors, and leave details. This data is loaded into a pandas dataframe, enabling manipulation by the language model through LangChain's PythonAstREPLTool.
- Calculator: The chatbot uses LangChain's calculator chain module, LLMMathChain, to perform arithmetic operations as needed during interactions.
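A rough sketch of how these three tools might be wired together with LangChain's pre-0.1 imports follows; the index name, CSV path, model settings, and tool descriptions are illustrative assumptions rather than the project's actual code.

```python
# Sketch of the three agent tools (assumed names and paths, not the project's code).
import pandas as pd
import pinecone
from langchain.agents import Tool
from langchain.chains import LLMMathChain, RetrievalQA
from langchain.chat_models import ChatOpenAI
from langchain.embeddings.openai import OpenAIEmbeddings
from langchain.tools.python.tool import PythonAstREPLTool
from langchain.vectorstores import Pinecone

llm = ChatOpenAI(model_name="gpt-3.5-turbo", temperature=0)

# 1) Timekeeping Policies: retrieve the embedded HR policy document from Pinecone.
pinecone.init(api_key="<your-pinecone-api-key>", environment="<your-pinecone-environment>")
embeddings = OpenAIEmbeddings(model="text-embedding-ada-002")
policy_store = Pinecone.from_existing_index("tk-policy", embeddings)  # index name assumed
policy_qa = RetrievalQA.from_chain_type(llm=llm, retriever=policy_store.as_retriever())

# 2) Employee Data: expose a pandas dataframe of mock employee records to the model.
df = pd.read_csv("employee_data.csv")  # file name assumed
python_tool = PythonAstREPLTool(locals={"df": df})

# 3) Calculator: LangChain's math chain for arithmetic.
calculator = LLMMathChain.from_llm(llm=llm)

tools = [
    Tool(name="Timekeeping Policies", func=policy_qa.run,
         description="Useful for questions about timekeeping and leave policies."),
    Tool(name="Employee Data", func=python_tool.run,
         description="Useful for questions about employee records stored in the dataframe df."),
    Tool(name="Calculator", func=calculator.run,
         description="Useful for arithmetic on numbers returned by the other tools."),
]
```

The tool descriptions matter: the agent decides which tool to call based on them, so each one should state clearly what the tool is useful for.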
Demonstrations
- Sample Chat: The chatbot can carry out conversations and provide insightful responses to inquiries, as shown in a sample chat.
- Sample Tool Use: The chatbot efficiently uses the available tools to deliver accurate and contextually appropriate responses.
Getting Started
To use the Autonomous HR Chatbot:
- Install Python 3.10.
- Clone the project repository locally.
- Install the required modules with pip install -r requirements.txt.
- Enter your API keys in the hr_agent_backend_local.py file (a sketch of this configuration follows the list).
- Start the application with streamlit run hr_agent_frontent.py.
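For reference, the key configuration inside the backend file often looks something like the hypothetical sketch below; the variable names are assumptions, so check hr_agent_backend_local.py itself for the actual placeholders.

```python
# Hypothetical sketch of the API-key section of hr_agent_backend_local.py.
# Variable names are assumptions; the file in the repository is authoritative.
import os

# OpenAI key used for gpt-3.5-turbo and the text-embedding-ada-002 embeddings.
os.environ["OPENAI_API_KEY"] = "<your-openai-api-key>"

# Pinecone credentials for the index that stores the HR policy embeddings.
PINECONE_API_KEY = "<your-pinecone-api-key>"
PINECONE_ENVIRONMENT = "<your-pinecone-environment>"
```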
Embedding Storage
To store embeddings in Pinecone:
- Register for a Pinecone account to obtain the necessary API keys and environment details.
- Execute the store_embeddings_in_pinecone.ipynb notebook, inserting your Pinecone and OpenAI API keys where indicated.
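The notebook handles this end to end; conceptually, the flow resembles the sketch below, where the document path, chunk sizes, and index name are illustrative assumptions rather than the notebook's exact code.

```python
# Sketch of embedding an HR policy document into Pinecone
# (assumed file name, chunk sizes, and index name; the notebook is authoritative).
import os
import pinecone
from langchain.document_loaders import TextLoader
from langchain.text_splitter import RecursiveCharacterTextSplitter
from langchain.embeddings.openai import OpenAIEmbeddings
from langchain.vectorstores import Pinecone

os.environ["OPENAI_API_KEY"] = "<your-openai-api-key>"
pinecone.init(api_key="<your-pinecone-api-key>", environment="<your-pinecone-environment>")

# Load and chunk the ChatGPT-generated timekeeping policy document.
docs = TextLoader("timekeeping_policy.txt").load()
chunks = RecursiveCharacterTextSplitter(chunk_size=1000, chunk_overlap=100).split_documents(docs)

# Embed each chunk with text-embedding-ada-002 and upsert into a Pinecone index.
embeddings = OpenAIEmbeddings(model="text-embedding-ada-002")
Pinecone.from_documents(chunks, embeddings, index_name="tk-policy")
```

Once the index is populated, the chatbot's Timekeeping Policies tool can query it at answer time.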
Technology Stack
- Azure OpenAI Service: Hosts the language model (gpt-3.5-turbo) that powers the chatbot.
- LangChain: A framework for developing applications around large language models (LLMs).
- Pinecone: A vector database used to store embeddings.
- Streamlit: A framework enabling the quick deployment of Python web applications.
- Azure Data Lake and Data Factory: Manage and create data pipelines for employee data.
- SAP HCM: Supplies the source employee data.
Additional Resources
A video demonstration is available on YouTube to illustrate how the chatbot functions.
About the Author
Stephen Bonifacio is the creator of this project. He can be reached for further engagement on LinkedIn and Twitter.