LLM Configuration - docker/genai-stack

The GenAI Stack is a pre-configured development environment for building GenAI applications, bundling Large Language Models (LLMs) managed through Ollama, a Neo4j database, and LangChain-based orchestration. The stack includes the following key technologies and dependencies:

  • Docker
  • Compose
  • Python
  • Neo4j
  • OpenAI
  • Boto3
  • FastAPI
  • Torch
  • Sentence Transformers
  • Langchain
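Model selection and database credentials are typically supplied through environment variables that Docker Compose passes to the services. The fragment below is an illustrative sketch; the variable names are assumptions based on common setups, so check the project's `.env.example` for the authoritative list:

```shell
# Hypothetical .env fragment -- variable names are illustrative; consult
# the repository's .env.example for the real ones.
LLM=llama2                        # e.g. llama2 via Ollama, or gpt-3.5 / gpt-4
EMBEDDING_MODEL=sentence_transformer
OPENAI_API_KEY=...                # only required when using the GPT models
NEO4J_URI=neo4j://database:7687
NEO4J_USERNAME=neo4j
NEO4J_PASSWORD=password
```

Switching between a local Ollama-served model and a hosted OpenAI model is then a one-line change in this file rather than a code change.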

The GenAI Stack includes pre-configured LLMs such as Llama2 (served locally via Ollama) and OpenAI's GPT-3.5 and GPT-4, and uses Neo4j as the default database, providing both graph and native vector search capabilities. Neo4j knowledge graphs ground the LLMs' responses in stored facts, improving the precision of GenAI predictions and outcomes. LangChain coordinates communication between the LLM, your application, and the database, including queries against the vector index.
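To make the vector-search step concrete, here is a minimal, self-contained sketch of what an index lookup does conceptually: embed the question, score every stored embedding by cosine similarity, and return the closest documents. This is a toy illustration in pure Python, not the stack's actual code; in the real stack the embeddings come from Sentence Transformers or OpenAI, and the index lives inside Neo4j.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def top_k(query_vec, index, k=2):
    """Return the k document ids most similar to query_vec.

    `index` maps document id -> embedding vector, standing in for
    Neo4j's native vector index in this toy example.
    """
    scored = sorted(index.items(),
                    key=lambda item: cosine_similarity(query_vec, item[1]),
                    reverse=True)
    return [doc_id for doc_id, _ in scored[:k]]

# Toy 3-dimensional "embeddings"; a real deployment would use a
# learned embedding model with hundreds of dimensions.
index = {
    "doc-neo4j":  [0.9, 0.1, 0.0],
    "doc-docker": [0.1, 0.8, 0.1],
    "doc-llm":    [0.0, 0.2, 0.9],
}

query = [0.85, 0.15, 0.05]      # embedding of the user's question
print(top_k(query, index, k=2))  # -> ['doc-neo4j', 'doc-docker']
```

The retrieved documents are then passed to the LLM as context, which is how the knowledge graph "grounds" the model's answer.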
