The GenAI Stack is a project designed to help users build their own GenAI application quickly. It includes demo applications that can serve as inspiration or a starting point. The project is configured through a `.env` file, which includes variables for the Ollama LLM API URL, the Neo4j database URL and credentials, the LLM model tag, the embedding model, and API keys for various services. Users can run any LLM available via Ollama on macOS and Linux, or use GPT-3.5 and GPT-4 on all platforms with their own OpenAI API key.
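A minimal `.env` sketch illustrating the variables described above. The variable names and values here are assumptions based on that description, not copied from the project; check the repository's own `env.example` for the authoritative names:

```env
# Ollama LLM API URL (assumed variable name)
OLLAMA_BASE_URL=http://host.docker.internal:11434

# Neo4j database URL and credentials
NEO4J_URI=neo4j://database:7687
NEO4J_USERNAME=neo4j
NEO4J_PASSWORD=password

# LLM model tag and embedding model
LLM=llama2
EMBEDDING_MODEL=sentence_transformer

# API keys, only needed for the corresponding services (placeholder value)
OPENAI_API_KEY=sk-...
```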
- `docker compose up`: Start all containers in the project.
- `docker compose up --build`: Build and start all containers in the project.
- `docker compose watch`: Start all containers in the project and automatically rebuild when files change.
Shoulder.dev transforms codebases into tailored learning experiences. The categories below organize the codebase to help you choose an initial focus.
Detailed explanation of how to configure the Large Language Model (LLM) using environment variables, including setting up Ollama, Neo4j, and other dependencies.
Overview of the five applications included in the GenAI Stack: the Support Bot, Stack Overflow Loader, PDF Reader, Standalone Bot API, and Standalone Bot UI.
Detailed explanation of the Support Bot application, including its features, such as answering support questions, providing summarized answers, and generating high-quality support tickets.
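The retrieval-augmented flow behind a support bot like this can be sketched in plain Python. The embedding vectors, vector index, and `llm` callable below are stand-ins for illustration, not the project's actual code:

```python
import math

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def retrieve(question_vec: list[float], index: list[tuple[list[float], str]], k: int = 2) -> list[str]:
    """Return the k stored texts whose vectors are most similar to the question."""
    ranked = sorted(index, key=lambda item: cosine(question_vec, item[0]), reverse=True)
    return [text for _, text in ranked[:k]]

def answer(question: str, question_vec: list[float], index, llm) -> str:
    """Assemble retrieved context into a prompt and hand it to the LLM callable."""
    context = "\n".join(retrieve(question_vec, index))
    prompt = f"Answer using only this context:\n{context}\nQuestion: {question}"
    return llm(prompt)
```

The same shape applies whether the index is an in-memory list, as here, or a Neo4j vector index: retrieve the nearest stored question/answer pairs, then ground the model's reply in them.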
Detailed explanation of the Loader application, including its features, such as importing Stack Overflow data, embedding questions and answers, and storing them in a vector index.
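A sketch of how such a loader might request question data from the public Stack Exchange REST API (v2.3 at `api.stackexchange.com`). The endpoint shape is real, but the parameter choices and function name here are illustrative, not the project's code:

```python
from urllib.parse import urlencode

BASE = "https://api.stackexchange.com/2.3/questions"

def build_questions_url(tag: str, page: int = 1, pagesize: int = 100) -> str:
    """Build a URL that fetches Stack Overflow questions for one tag, newest first."""
    params = {
        "tagged": tag,
        "page": page,
        "pagesize": pagesize,
        "order": "desc",
        "sort": "creation",
        "site": "stackoverflow",
        "filter": "withbody",  # include post bodies so they can be embedded
    }
    return f"{BASE}?{urlencode(params)}"
```

Each fetched question and its answers would then be embedded and written to the vector index, paging through results until enough data is loaded.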
Detailed explanation of the PDF Reader application, including its features, such as loading local PDFs, embedding text chunks, and asking questions about the contents.
Detailed explanation of the Standalone Bot API, including its non-streaming and SSE streaming endpoints and how to use them to answer questions.
Detailed explanation of the Standalone Bot UI, including how it uses the Standalone Bot API to interact with the model, and its modern front-end stack built with Vite, Svelte, and Tailwind.
Explanation of the Continuous Integration and Continuous Deployment (CI/CD) processes used in the GenAI Stack, including how to automate testing, building, and deployment.