Self-Hosted Deployment: Run G8KEPR on Your Infrastructure
Self-hosting is coming to G8KEPR in Q1 2025. Deploy our API security gateway as a single Docker container on your own infrastructure while keeping access to our cloud dashboard for analytics and monitoring.
Why Self-Host?
Hosting G8KEPR on your own servers gives you control, reduces costs, and keeps your traffic private:
90% Bandwidth Cost Reduction
If you process 100M API requests per month with 5KB average response size, that's 500GB of egress traffic. Cloud providers charge ~$0.08/GB for data transfer:
- Cloud-hosted: 500GB × $0.08 = $40/mo bandwidth costs
- Self-hosted: internal network transfer = $0/mo
- Savings: $480/year per 100M requests
At 1 billion requests/month, you'd save $4,800/year in bandwidth costs alone.
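If your traffic profile differs, the same back-of-envelope math is easy to rerun. Here is a small shell sketch using the assumptions above (5KB average response, ~$0.08/GB egress; it uses bc for the decimal math). Adjust the numbers for your own volume and your provider's pricing:
# Back-of-envelope egress cost estimate (same assumptions as the example above)
REQUESTS=1000000000          # API requests per month (1B here)
AVG_KB=5                     # average response size in KB
GB=$(( REQUESTS * AVG_KB / 1000000 ))
echo "${GB} GB egress/mo  ->  \$$(echo "${GB} * 0.08" | bc)/mo if cloud-hosted"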
The Architecture
G8KEPR self-hosted deployment consists of a single, lightweight Docker container that sits between your clients and backend services:
┌─────────┐       ┌───────────────────┐       ┌─────────────┐
│ Clients │ ────▶ │ G8KEPR Container  │ ────▶ │  Your APIs  │
│  (Web)  │       │  (Docker on K8s)  │       │ (Services)  │
└─────────┘       └───────────────────┘       └─────────────┘
                            │
                            ▼
                  ┌──────────────────┐
                  │ Cloud Dashboard  │
                  │ (Analytics only) │
                  └──────────────────┘
What Stays On-Premise vs Cloud
| Component | Location | Why |
|---|---|---|
| Traffic Processing | Your Servers | Zero egress costs, lowest latency |
| Security Rules | Your Servers | No external calls for every request |
| Redis/PostgreSQL | Your Servers | Full data ownership and privacy |
| Analytics Dashboard | Cloud (Optional) | Beautiful UI, managed infrastructure |
| Metrics Sync | Cloud (Optional) | Batched every 60s, minimal bandwidth |
Deployment Options
Option 1: Docker Compose (Fastest)
# docker-compose.yml
version: '3.8'
services:
  g8kepr:
    image: g8kepr/gateway:latest
    ports:
      - "8000:8000"
    environment:
      - MODE=block                    # or "monitor" for shadow testing
      - LICENSE_KEY=${LICENSE_KEY}
      - UPSTREAM_URL=http://your-api:3000
    volumes:
      - ./config.yaml:/app/config.yaml
    depends_on:
      - redis
      - postgres
  redis:
    image: redis:7-alpine
  postgres:
    image: postgres:15-alpine
    environment:
      - POSTGRES_PASSWORD=secure-password
# Start everything:
docker-compose up -d
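Once the containers are up, you can sanity-check the gateway by sending requests through port 8000. This is a rough sketch: the /api/users path is a placeholder for one of your own endpoints, and the exact status code returned for blocked requests may differ from a plain 4xx.
# Normal request: proxied through to your upstream API
curl -i "http://localhost:8000/api/users"
# Obvious SQL injection payload: should be rejected when MODE=block
curl -i "http://localhost:8000/api/users?id=1%27%20OR%20%271%27%3D%271"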
Option 2: Kubernetes with Helm (Production)
# Install G8KEPR on Kubernetes
helm repo add g8kepr https://charts.g8kepr.com
helm repo update
# Custom values.yaml
cat > values.yaml <<EOF
replicaCount: 5
autoscaling:
  enabled: true
  minReplicas: 3
  maxReplicas: 10
ingress:
  enabled: true
  hosts:
    - host: api.yourdomain.com
config:
  mode: "block"
  licenseKey: "your-license-key"
redis:
  enabled: true
postgresql:
  enabled: true
EOF
helm install g8kepr g8kepr/g8kepr -f values.yaml
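After the release is installed, a quick check that the gateway pods are up and serving traffic. The label selector and service name below assume the chart follows standard Helm naming conventions; adjust them to whatever resources the chart actually creates:
# Verify the gateway pods are running (label selector assumes standard Helm labels)
kubectl get pods -l app.kubernetes.io/instance=g8kepr
# Port-forward the service (service name is an assumption) and send a test request
kubectl port-forward svc/g8kepr 8000:8000 &
curl -i "http://localhost:8000/"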
Pricing for Self-Hosted
Community Edition
$0/mo
- ✓ Self-hosted Docker deployment
- ✓ Unlimited API requests
- ✓ All security features (30+ patterns)
- ✓ Basic rate limiting
- ✓ Community support (GitHub)
- ✗ No cloud dashboard
Starter (Self-Hosted)
$99/mo
- ✓ Everything in Community
- ✓ Cloud dashboard & analytics
- ✓ 10M requests/mo tracked
- ✓ Prometheus metrics
- ✓ Email support (48hr SLA)
- ✓ Helm chart for K8s
Configuration File
G8KEPR uses a single YAML config file for all settings. Changes take effect via a zero-downtime reload triggered by sending SIGHUP to the container (see the reload command after the example below):
# config.yaml
mode: block                      # "monitor" for shadow testing
upstream:
  url: http://backend-api:3000
  timeout: 30s
security:
  patterns:
    - sql_injection
    - xss
    - path_traversal
    - command_injection
rate_limit:
  default: 1000/minute
  endpoints:
    /api/auth/login: 5/minute
    /api/search: 100/minute
circuit_breaker:
  failure_threshold: 50%
  timeout: 30s
redis:
  url: redis://redis:6379
postgres:
  url: postgresql://user:pass@postgres:5432/g8kepr
# Optional: Send metrics to cloud dashboard
cloud:
  enabled: true
  api_key: ${CLOUD_API_KEY}
  sync_interval: 60s             # Batch metrics every 60s
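To apply config changes with the zero-downtime reload mentioned above, send SIGHUP to the running gateway. With the Docker Compose setup from Option 1, for example:
# Reload config.yaml without restarting the container
docker-compose kill -s SIGHUP g8kepr
# Equivalent for a standalone container (substitute your container name)
docker kill --signal=SIGHUP g8kepr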
Performance & Resource Usage
The G8KEPR container is lightweight and efficient:
| Metric | Value |
|---|---|
| Container Image Size | ~150MB (Alpine-based) |
| Memory (Idle) | 50-80MB |
| Memory (Under Load) | 200-500MB |
| CPU (10k req/s) | 1-2 cores |
| Latency Overhead | <5ms (p99) |
| Startup Time | <2 seconds |
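If you run the gateway on Kubernetes, these figures suggest a conservative starting point for resource requests and limits. Below is a sketch for values.yaml that assumes the Helm chart exposes a standard resources block; tune it against your observed traffic:
# Starting-point resource settings (assumes a standard Helm `resources` block)
resources:
  requests:
    cpu: 500m        # roughly half a core at moderate load
    memory: 256Mi    # above the idle range, below load peaks
  limits:
    cpu: "2"         # table above: 1-2 cores at ~10k req/s
    memory: 512Mi    # upper end of the under-load range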
When Self-Hosting Makes Sense
Choose self-hosting if you:
- ✓ Process >10M API requests/month (bandwidth savings add up)
- ✓ Need to keep all traffic within your VPC/network
- ✓ Have compliance requirements for data residency (GDPR, HIPAA)
- ✓ Already run Kubernetes and want to deploy alongside your apps
- ✓ Want full control over updates and configuration
- ✓ Prefer CapEx over OpEx (own the infrastructure)
Coming Q1 2025
Self-hosted deployment will be available in Q1 2025. Join the beta waitlist to get early access and help shape the final features.
Join Beta Waitlist →
Ready to Secure Your APIs?
Deploy enterprise-grade API security in 5 minutes. No credit card required.
Start Free Trial