The GenAI Stack is a project designed to help users build their own GenAI application quickly. It includes demo applications that can serve as inspiration or a starting point. The project is configured through a `.env` file, whose variables cover the Ollama LLM API URL, the Neo4j database URL and credentials, the LLM model tag, the embedding model, and API keys for various services. Users can run any LLM available via Ollama on macOS and Linux, or use GPT-3.5 and GPT-4 on all platforms with their own OpenAI API keys.
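As a sketch of what such a `.env` file might look like, the fragment below covers the configuration areas named above; the exact variable names and values are illustrative assumptions, not a verbatim copy of the project's template:

```shell
# Ollama LLM API URL (local Ollama instance assumed)
OLLAMA_BASE_URL=http://host.docker.internal:11434

# Neo4j database URL and credentials (placeholder values)
NEO4J_URI=neo4j://database:7687
NEO4J_USERNAME=neo4j
NEO4J_PASSWORD=changeme

# LLM model tag and embedding model (any Ollama model, or gpt-3.5 / gpt-4)
LLM=llama2
EMBEDDING_MODEL=sentence_transformer

# API keys for external services (only needed when using those services)
OPENAI_API_KEY=sk-...
```

Check the repository's own `env.example` for the authoritative variable names before relying on these.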
- `docker compose up` — start all containers in the project.
- `docker compose up --build` — build and start all containers in the project.
- `docker compose watch` — start all containers and automatically rebuild when files change.
Codebase Insights for docker/genai-stack with Shoulder.dev
Shoulder.dev transforms codebases into tailored learning experiences. Below, the codebase is organized into categories to help you choose an initial focus.