QDrant Loader Core

PyPI Python License: GPL v3

Shared core library for the QDrant Loader ecosystem. It provides a provider‑agnostic LLM layer (embeddings and chat), configuration mapping, safe logging, and normalized error handling used by the CLI and MCP Server packages.

For provider, configuration, and architecture details, use the documentation links below.

🎯 What It Provides

  • Provider-agnostic LLM facade for OpenAI, Azure OpenAI, OpenAI-compatible endpoints, and Ollama
  • Unified async APIs for embeddings and chat clients
  • Typed configuration mapping via LLMSettings.from_global_config(...)
  • Structured logging with secret redaction
  • Normalized provider exceptions for predictable handling across backends

πŸ“¦ Installation

pip install qdrant-loader-core

With extras:

pip install "qdrant-loader-core[openai]"
pip install "qdrant-loader-core[ollama]"

πŸ“„ Logging

Use built-in structured logging:

from qdrant_loader_core.logging import LoggingConfig

# Configure logging once at startup; format="console" selects
# human-readable output, file=None disables file logging.
LoggingConfig.setup(level="INFO", format="console", file=None)
logger = LoggingConfig.get_logger(__name__)
logger.info("LLM ready")

πŸ“ Notes

  • Secrets (API keys/tokens) are redacted in logs
  • For MCP integrations, set MCP_DISABLE_CONSOLE_LOGGING=true so console log output does not interfere with the MCP stdio transport
  • Environment variable reference - Required and optional environment variables for setup, authentication, and runtime behavior.

❗ Error Handling

Catch provider-normalized exceptions from qdrant_loader_core.llm.errors:

  • TimeoutError
  • RateLimitedError
  • InvalidRequestError
  • AuthError
  • ServerError
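Because transient failures (timeouts, rate limits, 5xx responses) are normalized across providers, a single retry helper can cover every backend. The sketch below is a generic pattern, not part of the library; the commented wiring uses the exception names listed above, and `client.embed` is a hypothetical call:

```python
import asyncio

async def call_with_retry(fn, retryable, attempts=3, base_delay=0.5):
    """Retry an async LLM call when a transient, provider-normalized error is raised."""
    for attempt in range(attempts):
        try:
            return await fn()
        except retryable:
            if attempt == attempts - 1:
                raise  # out of attempts: surface the normalized error
            # Exponential backoff between attempts.
            await asyncio.sleep(base_delay * (2 ** attempt))

# Hypothetical wiring with the real exceptions:
# from qdrant_loader_core.llm.errors import RateLimitedError, ServerError, TimeoutError
# vectors = await call_with_retry(lambda: client.embed(texts),
#                                 retryable=(RateLimitedError, ServerError, TimeoutError))
```

AuthError and InvalidRequestError are deliberately left out of the retryable tuple: retrying a bad credential or a malformed request cannot succeed.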

πŸ“š Canonical Documentation

🀝 Contributing

See CONTRIBUTING - Contribution guidelines, development standards, and pull request process.