Portkey’s Enterprise Components provide the core infrastructure needed for production deployments. Each component handles a specific function - analytics, logging, or caching - with multiple implementation options to match your requirements.

Analytics Store

Portkey uses ClickHouse as the primary Analytics Store for the Control Panel, offering powerful capabilities for handling large-scale analytical workloads.
Portkey supports exporting your ClickHouse analytics data to OpenTelemetry (OTel)-compatible collectors, allowing you to integrate Portkey’s analytics with your existing observability infrastructure.
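If you want to verify a collector endpoint before wiring up the export, a minimal OTLP smoke test can help. The sketch below uses the standard Python OpenTelemetry SDK and is not part of Portkey itself; the endpoint address is a placeholder for your own collector.

```python
# Minimal OTLP reachability check for the collector you plan to export to.
# Assumes: pip install opentelemetry-sdk opentelemetry-exporter-otlp
# The endpoint below is a placeholder; substitute your collector's OTLP/gRPC address.
from opentelemetry.metrics import get_meter, set_meter_provider
from opentelemetry.sdk.metrics import MeterProvider
from opentelemetry.sdk.metrics.export import PeriodicExportingMetricReader
from opentelemetry.exporter.otlp.proto.grpc.metric_exporter import OTLPMetricExporter

exporter = OTLPMetricExporter(endpoint="http://otel-collector:4317", insecure=True)
reader = PeriodicExportingMetricReader(exporter, export_interval_millis=5_000)
provider = MeterProvider(metric_readers=[reader])
set_meter_provider(provider)

# Emit a single test counter increment and flush it to the collector.
meter = get_meter("portkey-collector-smoke-test")
counter = meter.create_counter("smoke_test_events")
counter.add(1)
ok = provider.force_flush()  # returns False (and the SDK logs an error) on failure
print("OTLP export flushed:", ok)
```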

Log Store

Portkey provides flexible options for storing and managing logs in your enterprise deployment. Choose from various storage solutions including MongoDB for document-based storage, AWS S3 for cloud-native object storage, or Wasabi for cost-effective cloud storage. Each option offers different benefits in terms of scalability, cost, and integration capabilities.
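If you opt for S3 (or an S3-compatible store such as Wasabi), it is worth confirming up front that the credentials your deployment will use can actually write to the log bucket. The sketch below uses boto3; the bucket name and object key are placeholders.

```python
# Sanity-check that the credentials Portkey will use can write to the log bucket.
# Assumes: pip install boto3; bucket name and key below are placeholders.
import boto3

s3 = boto3.client("s3")  # for Wasabi, also pass endpoint_url="https://s3.wasabisys.com"
s3.put_object(
    Bucket="my-portkey-logs",            # placeholder bucket name
    Key="connectivity-check/test.json",  # throwaway test object
    Body=b'{"ok": true}',
    ContentType="application/json",
)
s3.delete_object(Bucket="my-portkey-logs", Key="connectivity-check/test.json")
print("log bucket is writable")
```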

Log Object Path Format

Configure how log files are organized in your blob storage using the LOG_STORE_FILE_PATH_FORMAT environment variable.
Value          Format              Example Path
v1 (default)   Flat structure      30/<organisation-id>/<log-id>.json
v2             Time-hierarchical   30/<organisation-id>/<workspace-slug>/<year>/<month>/<day>/<hour>/<log-id>.json
The v2 format organizes logs by time hierarchy, making it easier to manage retention policies and query logs for specific time periods.
Changing LOG_STORE_FILE_PATH_FORMAT only affects newly written logs. Previously written logs retain their original path format and are not migrated.
The v2 format is not supported for air-gapped deployments where LOG_STORE is set to control_plane.
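As an illustration of how the two layouts differ, here is a sketch of the key construction. This is not Portkey’s internal code; the field names mirror the placeholders in the table above, and the leading 30 segment is copied verbatim from the example paths as an opaque prefix.

```python
# Illustrative only: how a v1 vs. v2 log object key is laid out.
from datetime import datetime, timezone

def log_object_key(fmt: str, org_id: str, log_id: str,
                   workspace_slug: str = "", prefix: str = "30") -> str:
    if fmt == "v1":  # flat: everything under the organisation folder
        return f"{prefix}/{org_id}/{log_id}.json"
    if fmt == "v2":  # time-hierarchical: easier retention and time-range queries
        now = datetime.now(timezone.utc)
        return (f"{prefix}/{org_id}/{workspace_slug}/"
                f"{now:%Y}/{now:%m}/{now:%d}/{now:%H}/{log_id}.json")
    raise ValueError(f"unknown LOG_STORE_FILE_PATH_FORMAT: {fmt}")

print(log_object_key("v2", "org-123", "log-abc", workspace_slug="default"))
# e.g. 30/org-123/default/2026/02/05/13/log-abc.json
```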

Cache Store

Portkey supports robust caching solutions to optimize performance and reduce latency in your enterprise deployment. Choose between Redis for in-memory caching or AWS ElastiCache for a fully managed caching service.
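Before deployment, a simple connectivity check against the Redis or ElastiCache endpoint you plan to use can rule out networking issues early. A minimal sketch with the redis-py client, with placeholder connection details:

```python
# Quick connectivity check for the Redis (or ElastiCache) endpoint Portkey
# will use as its cache store. Host and port are placeholders.
# Assumes: pip install redis
import redis

r = redis.Redis(host="my-cache.example.internal", port=6379, socket_timeout=3)
assert r.ping()  # returns True when the server answers PING
print("cache store reachable")
```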

Semantic Cache Store

Portkey supports semantic caching capabilities that allow for intelligent caching based on the semantic meaning of requests rather than exact matches. For this functionality, Portkey requires a vector database to efficiently store and retrieve embeddings. Both Milvus and Pinecone are supported vector databases for Portkey’s semantic cache implementation.
  • Milvus: you need a deployment with a collection configured for 1536-dimensional vectors
  • Pinecone: you need an index of 1536 dimensions to store the embeddings
This specific dimension size is required because Portkey uses OpenAI’s text-embedding-3-small model to create the embeddings.
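For reference, provisioning a matching 1536-dimension index or collection might look like the sketch below. It uses the official Pinecone and pymilvus clients; every name, key, and URI is a placeholder.

```python
# Sketches for provisioning a 1536-dimension vector store for the semantic cache.
# All names, regions, keys, and URIs below are placeholders.

# Option A: Pinecone (pip install pinecone)
from pinecone import Pinecone, ServerlessSpec

pc = Pinecone(api_key="YOUR_API_KEY")  # placeholder API key
pc.create_index(
    name="portkey-semantic-cache",     # placeholder index name
    dimension=1536,                    # must match the embedding model
    metric="cosine",
    spec=ServerlessSpec(cloud="aws", region="us-east-1"),
)

# Option B: Milvus (pip install pymilvus)
from pymilvus import MilvusClient

client = MilvusClient(uri="http://milvus.example.internal:19530")
client.create_collection(
    collection_name="portkey_semantic_cache",  # placeholder collection name
    dimension=1536,                            # 1536-dim vectors, as required
)
```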