QDrant Loader


📋 Release Notes v0.4.6 - Latest improvements and bug fixes (June 3, 2025)

A comprehensive toolkit for loading data into the Qdrant vector database, with advanced MCP server support for AI-powered development workflows.

🎯 What is QDrant Loader?

QDrant Loader is a powerful data ingestion and retrieval system that bridges the gap between your technical content and AI development tools. It collects, processes, and vectorizes content from multiple sources, then provides intelligent search capabilities through a Model Context Protocol (MCP) server.

Perfect for:

  • 🤖 AI-powered development with Cursor, Windsurf, and GitHub Copilot
  • 📚 Knowledge base creation from scattered documentation
  • 🔍 Intelligent code assistance with contextual documentation
  • 🏢 Enterprise content integration from Confluence, JIRA, and Git repositories

📦 Packages

This monorepo contains two complementary packages:

🔄 QDrant Loader

Data ingestion and processing engine

Collects and vectorizes content from multiple sources into the Qdrant vector database.

Key Features:

  • Multi-source connectors: Git, Confluence (Cloud & Data Center), JIRA (Cloud & Data Center), Public Docs, Local Files
  • Advanced file conversion: 20+ file types, including PDFs, Office documents, and images, with AI-powered processing
  • Intelligent chunking: Smart document processing with metadata extraction
  • Incremental updates: Change detection and efficient synchronization
  • Flexible embeddings: OpenAI, local models, and custom endpoints

🔌 QDrant Loader MCP Server

AI development integration layer

A Model Context Protocol server that provides retrieval-augmented generation (RAG) capabilities to AI development tools.

Key Features:

  • MCP protocol compliance: Full integration with Cursor, Windsurf, and Claude Desktop
  • Advanced search tools: Semantic, hierarchy-aware, and attachment-focused search
  • Confluence intelligence: Deep understanding of page hierarchies and relationships
  • File attachment support: Comprehensive attachment discovery with parent document context
  • Real-time processing: Streaming responses for large result sets
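
Under the hood, clients such as Cursor talk to the server using JSON-RPC messages defined by the Model Context Protocol. The sketch below shows the general shape of a tool invocation; the tool name "search" and its argument names are illustrative assumptions, so check the MCP server documentation for the actual tool schema.

# Illustrative shape of an MCP "tools/call" request sent to the server over stdio.
# The tool name and arguments below are assumptions for illustration only.
search_request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "search",
        "arguments": {"query": "How is authentication handled in our API?"},
    },
}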

🚀 Quick Start

Installation

# Install both packages
pip install qdrant-loader qdrant-loader-mcp-server

# Or install individually
pip install qdrant-loader          # Data ingestion only
pip install qdrant-loader-mcp-server  # MCP server only
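
To verify the installation, you can check the installed versions from Python (a minimal sketch using only the standard library; the package names are the ones published on PyPI):

# Print the installed versions of both packages
from importlib.metadata import version

print(version("qdrant-loader"))
print(version("qdrant-loader-mcp-server"))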

5-Minute Setup

  1. Create a workspace

mkdir my-qdrant-workspace && cd my-qdrant-workspace

  2. Download configuration templates

curl -o config.yaml https://raw.githubusercontent.com/martin-papy/qdrant-loader/main/packages/qdrant-loader/conf/config.template.yaml
curl -o .env https://raw.githubusercontent.com/martin-papy/qdrant-loader/main/packages/qdrant-loader/conf/.env.template

  3. Configure your environment (edit .env)

QDRANT_URL=http://localhost:6333
QDRANT_COLLECTION_NAME=my_docs
OPENAI_API_KEY=your_openai_key

  4. Configure data sources (edit config.yaml)

sources:
  git:
    - url: "https://github.com/your-org/your-repo.git"
      branch: "main"

  5. Load your data

qdrant-loader --workspace . init
qdrant-loader --workspace . ingest

  6. Start the MCP server

mcp-qdrant-loader

🎉 You're ready! Your content is now searchable through AI development tools.
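
If you want to double-check the ingestion before wiring up an editor, you can inspect the collection directly with the qdrant-client package (a minimal sketch assuming the local Qdrant instance and collection name from the .env example above):

# Confirm that documents were ingested into the collection
from qdrant_client import QdrantClient

client = QdrantClient(url="http://localhost:6333")
info = client.get_collection("my_docs")
print(f"Points in collection: {info.points_count}")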

🔧 Integration Examples

Cursor IDE Integration

Add to .cursor/mcp.json:

{
  "mcpServers": {
    "qdrant-loader": {
      "command": "/path/to/venv/bin/mcp-qdrant-loader",
      "env": {
        "QDRANT_URL": "http://localhost:6333",
        "QDRANT_COLLECTION_NAME": "my_docs",
        "OPENAI_API_KEY": "your_key",
        "MCP_DISABLE_CONSOLE_LOGGING": "true"
      }
    }
  }
}

Example Queries in Cursor

  • "Find documentation about authentication in our API"
  • "Show me examples of error handling patterns"
  • "What are the deployment requirements for this service?"
  • "Find all attachments related to database schema"

📁 Project Structure

qdrant-loader/
├── packages/
│   ├── qdrant-loader/           # Core data ingestion package
│   └── qdrant-loader-mcp-server/ # MCP server for AI integration
├── docs/                        # Comprehensive documentation
├── website/                     # Documentation website generator
└── README.md                   # This file

📚 Documentation

🚀 Getting Started

👥 For Users

🛠️ For Developers

📦 Package Documentation

🤝 Contributing

We welcome contributions! Please see our Contributing Guide for details on:

  • Setting up the development environment
  • Code style and standards
  • Pull request process
  • Issue reporting guidelines

Quick Development Setup

# Clone the repository
git clone https://github.com/martin-papy/qdrant-loader.git
cd qdrant-loader

# Create virtual environment
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate

# Install in development mode
pip install -e packages/qdrant-loader[dev]
pip install -e packages/qdrant-loader-mcp-server[dev]

# Run tests
pytest

🆘 Support

📄 License

This project is licensed under the GNU GPLv3 - see the LICENSE file for details.


Ready to supercharge your AI development workflow? Start with our Quick Start Guide or explore the complete documentation.
