
Conversation

@seyeong-han (Contributor) commented Oct 10, 2025

What does this PR do?

This pull request adds MongoDB Atlas as a remote vector database provider for Llama Stack, enabling cloud-native vector search. The changes introduce configuration, documentation, and integration for MongoDB Atlas Vector Search across the codebase, covering distribution templates, the provider registry, and sample configurations.

MongoDB Atlas Vector Search Integration

  • Added remote::mongodb as a supported vector database provider in the distribution build/run YAML files for starter, starter-gpu, and CI-tests, allowing MongoDB Atlas to be used for vector storage and search.
  • Integrated MongoDB Atlas configuration into the starter distribution Python template, including environment variables, provider registration, and sample config generation using MongoDBVectorIOConfig.
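To illustrate the environment-variable wiring described above, here is a minimal sketch of how a sample config might read MONGODB_CONNECTION_STRING (the variable this PR introduces). The dict keys and the MONGODB_DATABASE variable are illustrative assumptions, not the PR's actual schema.

```python
import os

# Sketch: build a sample provider config from environment variables.
# MONGODB_CONNECTION_STRING comes from this PR; the other names are
# hypothetical placeholders for illustration only.
def mongodb_sample_config() -> dict:
    return {
        "connection_str": os.environ.get("MONGODB_CONNECTION_STRING", ""),
        "database_name": os.environ.get("MONGODB_DATABASE", "llama_stack"),
    }

# Provide a dummy value so the sketch runs without a real cluster.
os.environ.setdefault("MONGODB_CONNECTION_STRING", "mongodb+srv://user:pass@cluster.example.net")
print(mongodb_sample_config()["database_name"])
```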

Provider Implementation

  • Added the MongoDB Atlas provider implementation in llama_stack/providers/remote/vector_io/mongodb, including adapter loading logic and a pydantic-based configuration class for connection and search options.
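The shape of such a configuration class can be sketched as follows. The PR's actual MongoDBVectorIOConfig is pydantic-based; this stdlib dataclass sketch and its field names are assumptions for illustration, not the real class.

```python
from dataclasses import dataclass

# Illustrative sketch of a MongoDB Atlas vector IO config.
# Field names and defaults are hypothetical; the PR's real class
# is a pydantic model with its own validation.
@dataclass
class MongoDBVectorIOConfigSketch:
    connection_str: str               # Atlas connection string (mongodb+srv://...)
    database_name: str = "llama_stack"
    index_name: str = "vector_index"  # Atlas Search index to query
    embedding_dimension: int = 384    # must match the embedding model

cfg = MongoDBVectorIOConfigSketch(connection_str="mongodb+srv://user:pass@cluster.example.net")
print(cfg.index_name)
```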

Documentation

  • Created comprehensive documentation for the MongoDB Atlas provider, detailing features, search modes (vector, keyword, hybrid), configuration, installation, and usage instructions.
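Since the docs cover vector, keyword, and hybrid modes, here is a sketch of one common way hybrid search fuses the two rankings: Reciprocal Rank Fusion (RRF). Whether this provider or Atlas uses RRF specifically is an assumption; the snippet only illustrates the general technique.

```python
# Reciprocal Rank Fusion: combine ranked result lists from vector and
# keyword search into one ranking. Each document scores 1/(k + rank + 1)
# per list it appears in; higher total score ranks first.
def rrf_fuse(rankings: list[list[str]], k: int = 60) -> list[str]:
    scores: dict[str, float] = {}
    for ranking in rankings:
        for rank, doc_id in enumerate(ranking):
            scores[doc_id] = scores.get(doc_id, 0.0) + 1.0 / (k + rank + 1)
    return sorted(scores, key=scores.get, reverse=True)

vector_hits = ["doc_a", "doc_b", "doc_c"]   # hypothetical vector-search order
keyword_hits = ["doc_b", "doc_d", "doc_a"]  # hypothetical keyword-search order
print(rrf_fuse([vector_hits, keyword_hits]))  # → ['doc_b', 'doc_a', 'doc_d', 'doc_c']
```

doc_b ranks first because it places highly in both lists, which is exactly the behavior hybrid search is meant to reward.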

Provider Registry

  • Registered the MongoDB Atlas provider in llama_stack/providers/registry/vector_io.py, including metadata, pip dependencies, and a detailed description for discoverability.
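The kind of metadata such a registry entry carries can be sketched as a plain dict. The real entry uses the project's ProviderSpec types in llama_stack/providers/registry/vector_io.py; the provider type remote::mongodb is from this PR, while the pip dependency and other values here are assumptions.

```python
# Illustrative sketch of a provider registry entry. Only "remote::mongodb"
# is taken from the PR; "pymongo", the config class path, and the
# description text are hypothetical placeholders.
MONGODB_PROVIDER_ENTRY = {
    "provider_type": "remote::mongodb",
    "pip_packages": ["pymongo"],
    "config_class": "llama_stack.providers.remote.vector_io.mongodb.MongoDBVectorIOConfig",
    "description": "MongoDB Atlas Vector Search: vector, keyword, and hybrid search modes.",
}
print(MONGODB_PROVIDER_ENTRY["provider_type"])
```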

Miscellaneous

  • Added a copyright header to the providers test module.

Test Plan

Example RAG with response API

  1. Get MongoDB connection key and Fireworks API key

  2. Run the llama-stack server with Fireworks as the inference provider

MONGODB_CONNECTION_STRING="mongodb+srv://{GET_THIS_FROM_MONGODB}" \
FIREWORKS_API_KEY={YOUR_API_KEY} uv run --with llama-stack llama stack build --distro starter --image-type venv --run
  3. Run RAG
from io import BytesIO
from llama_stack_client import LlamaStackClient

# Initialize client (use port 8321 which is the default for starter distribution)
client = LlamaStackClient(base_url="http://localhost:8321")

model_id = "accounts/fireworks/models/llama4-maverick-instruct-basic"

knowledge_base = [
    (
        "Python is a high-level programming language known for its readable syntax and versatility.",
        {"category": "Programming"},
    ),
]
file_ids = []
for content, metadata in knowledge_base:
    with BytesIO(content.encode()) as file_buffer:
        file_buffer.name = f"{metadata['category'].lower()}_{len(file_ids)}.txt"
        response = client.files.create(file=file_buffer, purpose="assistants")
        file_ids.append(response.id)

print(f"✅ Uploaded {len(file_ids)} documents")

print("Creating vector store with sentence-transformers/all-MiniLM-L6-v2...")

vector_store = client.vector_stores.create(
    name="mongodb_tech_knowledge_base",
    file_ids=file_ids,
    embedding_model="sentence-transformers/all-MiniLM-L6-v2",
    embedding_dimension=384,
    provider_id="mongodb_atlas",  # MongoDB Atlas provider
)

print(f"✅ Created MongoDB Atlas vector store: {vector_store.name}")
print(f"Vector store id: {vector_store.id}")

query = "What is Python?"
try:
    response = client.responses.create(
        model=model_id,
        input=query,
        tools=[
            {
                "type": "file_search",
                "vector_store_ids": [vector_store.id],
            },
        ],
    )
    # print(f"Response: {response}\n")
    print(f"File search results: {response.output[-1].content[0].text}\n")
except Exception as e:
    print(f"❌ Query failed: {str(e)[:150]}...")

Result:

source .venv/bin/activate
python examples/mongodb_response_api_simple.py
INFO:httpx:HTTP Request: POST http://localhost:8321/v1/openai/v1/files "HTTP/1.1 200 OK"
✅ Uploaded 1 documents
Creating vector store with sentence-transformers/all-MiniLM-L6-v2...
INFO:httpx:HTTP Request: POST http://localhost:8321/v1/openai/v1/vector_stores "HTTP/1.1 200 OK"
✅ Created MongoDB Atlas vector store: mongodb_tech_knowledge_base
Vector store id: vs_f09d3c4b-0163-43b2-8f49-d6701643cf7a
INFO:httpx:HTTP Request: POST http://localhost:8321/v1/openai/v1/responses "HTTP/1.1 200 OK"
File search results: Python is a high-level, interpreted programming language known for its simplicity and readability. It is widely used for various purposes such as web development, data analysis, artificial intelligence, and more. Python's syntax is designed to be easy to understand, making it a great language for beginners and experienced developers alike.

MongoDB Atlas llama_stack.vs_f09d3c4b_0163_43b2_8f49_d6701643cf7a collection

@meta-cla meta-cla bot added the CLA Signed This label is managed by the Meta Open Source bot. label Oct 10, 2025
@seyeong-han seyeong-han changed the title Add: mongodb vector io feat: mongodb vector io Oct 10, 2025
Labels: CLA Signed, new-in-tree-provider