You can self-host CORE on your own infrastructure using Docker.
The following instructions use Docker Compose to spin up a CORE instance.
Make sure to read the self-hosting overview first.
Warning:
As self-hosted deployments tend to have unique requirements and configurations, we don't provide specific advice for securing your deployment, scaling up, or improving reliability. This guide alone is unlikely to result in a production-ready deployment.
Should the burden ever get too much, we'd be happy to see you on CORE Cloud, where we handle these concerns for you.

Requirements

These are the minimum requirements for running CORE.

Prerequisites

To run CORE, you will need:
  • Docker 20.10.0+
  • Docker Compose 2.20.0+

System Requirements

  • 4+ vCPU
  • 8+ GB RAM
  • 20+ GB Storage

Deployment Options

CORE offers multiple deployment approaches depending on your needs:

Quick Deploy with Railway

For a one-click deployment experience, use the Deploy on Railway button. Railway will automatically set up all required services and handle the infrastructure for you.

Manual Docker Deployment

Prerequisites: Before starting any deployment, ensure you have your OPENAI_API_KEY ready. This is required for AI functionality in CORE. If you use an OpenAI-compatible proxy, OPENAI_API_KEY can typically be any non-empty value, and you can also set OPENAI_BASE_URL.
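As a sketch of what these prerequisites look like in practice, a minimal set of AI-provider entries for your .env might resemble the following. All values are placeholders, and the Ollama service hostname is an assumption based on the optional ollama service in docker-compose.yaml:

```shell
# Required for AI functionality:
OPENAI_API_KEY=sk-your-key-here

# Only when routing through an OpenAI-compatible proxy; with a proxy,
# OPENAI_API_KEY can typically be any non-empty value.
OPENAI_BASE_URL=https://your-proxy.example.com/v1

# Optional: Ollama for chat and/or embeddings (hostname is a placeholder).
OLLAMA_URL=http://ollama:11434
```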

Combined Setup

For a self-hosted deployment:
  1. Clone the CORE repository:
    # Clone the repository
    git clone https://github.com/RedPlanetHQ/core.git
    cd core/hosting/docker
    
  2. Start the services:
    Using an OpenAI-compatible proxy and/or Ollama? CORE supports OpenAI-compatible proxies via OPENAI_BASE_URL and optional Ollama (chat and/or embeddings) via OLLAMA_URL. Configure these in .env, then (optionally) uncomment the ollama service in docker-compose.yaml. After changing env vars, restart the services for changes to take effect.
    docker compose up -d
    
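Once the stack is up, you can sanity-check it with standard Docker Compose commands. This is a sketch; exact service names come from docker-compose.yaml:

```shell
# List services and their current status.
docker compose ps

# Tail logs for the whole stack (Ctrl-C to stop).
docker compose logs -f

# Restart the stack after changing .env so new values take effect.
docker compose down && docker compose up -d
```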

Configuring Your Host URL

After deployment, set your public host URL so that Gateway and channel integrations (Slack, WhatsApp, etc.) can reach your instance. In your .env file, update the following variables to match your actual host:
APP_ORIGIN=https://your-domain.com
LOGIN_ORIGIN=https://your-domain.com
  • APP_ORIGIN: The public URL of your CORE instance. Used by the Gateway and channel integrations to construct callback and webhook URLs.
  • LOGIN_ORIGIN: The URL used for authentication flows (magic links, OAuth redirects). Usually the same as APP_ORIGIN.
If these are not set correctly, Gateway connections and channel webhooks (e.g. Slack events, WhatsApp callbacks) will fail to reach your instance.
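Before restarting, it can help to confirm both origin variables are actually present in .env. The snippet below is a hypothetical check, run from core/hosting/docker; it writes a sample .env here only for illustration, since in a real deployment the file already exists:

```shell
# Hypothetical sketch: verify the origin variables exist in .env.
ENV_FILE=".env"

# Sample file for illustration only; skip this in a real deployment.
cat > "$ENV_FILE" <<'EOF'
APP_ORIGIN=https://your-domain.com
LOGIN_ORIGIN=https://your-domain.com
EOF

for var in APP_ORIGIN LOGIN_ORIGIN; do
  if grep -q "^${var}=" "$ENV_FILE"; then
    echo "${var} is set"
  else
    echo "${var} is MISSING" >&2
  fi
done
```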

Next Steps

Once deployed, you can:
  • Configure your AI providers (OpenAI, Anthropic, etc.)
  • Set up integrations (Slack, GitHub, Gmail)
  • Start building your memory graph
  • Explore the CORE API and SDK