Documentation
Everything you need to set up and use Bananalytics
Quick Start
Get Bananalytics running in 5 minutes with Docker.
1. Start the backend.
git clone https://github.com/bananalytics-analytics/bananalytics.git
cd bananalytics/server
docker-compose up -d

That's it for the backend. Postgres + the Go server start together, and database migrations are applied automatically on startup — no manual SQL needed. Verify with docker-compose logs bananalytics — you should see migrations: applied successfully and server starting port=8080.
2. Create your admin account. Open http://localhost:3000 in your browser. The first time you visit, you'll be redirected to /setup to register the first admin user (name, email, password). This page is one-time only — once an admin exists, it returns 410 Gone and everyone has to sign in via /login.
3. Create your first project. After signup you're dropped into the dashboard. Click "New Project" (or use the project switcher in the topbar), give it a name, and submit. You'll immediately see two keys:
- rk_… — write key. Goes into your React Native app (the SDK's apiKey option). Used for ingesting events. Safe to ship in the bundle.
- sk_… — secret key. Used for querying your data via the API (e.g. /v1/query/events). The dashboard stores it server-side per session — never expose it in client code.
Copy both with the buttons in the modal. You can always re-view and rotate them later from Settings → API Keys on any project.
4. Drop the write key into your app. Jump to the React Native SDK section below — install the package, paste your rk_… key, and you're tracking events.
Server Setup
Before installing Bananalytics on a fresh VPS, harden Ubuntu so you're not running production on root with a wide-open firewall. If your server is already locked down (non-root sudo user, key-only SSH, UFW, fail2ban), skip ahead to Production Deploy.
1. First login + system update
ssh root@your-new-vps-ip
apt update && apt upgrade -y
apt autoremove -y

2. Create a non-root sudo user
Don't use root for daily work. Replace max with whatever username you want.
adduser max # set a password when prompted (used for sudo)
usermod -aG sudo max

3. Copy your SSH key to the new user
mkdir -p /home/max/.ssh
cp /root/.ssh/authorized_keys /home/max/.ssh/authorized_keys
chown -R max:max /home/max/.ssh
chmod 700 /home/max/.ssh
chmod 600 /home/max/.ssh/authorized_keys

Test from your laptop in a new terminal (keep the root session open as a fallback): ssh max@your-vps-ip
4. Lock down SSH (disable root + password auth)
Add a drop-in config (cleanest — won't be overwritten by cloud-init updates):
sudo tee /etc/ssh/sshd_config.d/00-hardening.conf > /dev/null <<EOF
PermitRootLogin no
PasswordAuthentication no
PubkeyAuthentication yes
EOF
sudo sshd -t # must print nothing
sudo systemctl reload ssh

From a new terminal: ssh root@your-vps-ip must fail with “Permission denied”; ssh max@your-vps-ip must succeed. Only then close the existing root session.
5. Firewall (UFW)
sudo ufw default deny incoming
sudo ufw default allow outgoing
sudo ufw allow 22/tcp # SSH
sudo ufw allow 80/tcp # HTTP (Caddy needs for Let's Encrypt)
sudo ufw allow 443/tcp # HTTPS
sudo ufw --force enable
sudo ufw status verbose

Don't open 5432, 8080, or 3000 — Bananalytics binds those to the internal Docker network only. They should never be publicly reachable.
6. Fail2ban (brute-force protection)
sudo apt install -y fail2ban
sudo systemctl enable --now fail2ban
sudo fail2ban-client status sshd # see banned IPs anytime

7. Automatic security updates
sudo apt install -y unattended-upgrades
sudo dpkg-reconfigure -plow unattended-upgrades
# Press ENTER when asked "Automatically install stable updates? Yes"

8. Timezone, hostname, swap
# Set your timezone (affects log timestamps)
sudo timedatectl set-timezone Europe/Berlin
# Memorable hostname
sudo hostnamectl set-hostname bananalytics-prod
# 2 GB swap file — safety net for the 4 GB box
sudo fallocate -l 2G /swapfile
sudo chmod 600 /swapfile
sudo mkswap /swapfile
sudo swapon /swapfile
echo '/swapfile none swap sw 0 0' | sudo tee -a /etc/fstab
echo 'vm.swappiness=10' | sudo tee -a /etc/sysctl.conf
sudo sysctl -p

9. Useful tools + reboot
sudo apt install -y htop ncdu git curl wget tmux jq
sudo reboot

Wait ~30 seconds, SSH back in as your new user. The box is now ready for the Bananalytics install. Continue to Production Deploy.
Realistic time: ~10 minutes if you copy-paste straight through. Once you've done it once, it's a 5-minute ritual for any new box.
Production Deploy
Deploy Bananalytics on any Ubuntu VPS with Docker. Recommended: Hetzner CX22 (2 vCPU, 4 GB, 40 GB) at €4.75/month.
This guide assumes a hardened VPS — non-root sudo user, key-only SSH, UFW with ports 22/80/443 open. If your server isn't set up yet, follow Server Setup first.
What you're deploying
Four containers, all on one VPS. Caddy is the only thing exposed to the internet — everything else lives on the internal Docker network.
| Service | Port | What it does |
|---|---|---|
| caddy | 80, 443 (public) | HTTPS reverse proxy. Routes /v1/*, /health to the backend; everything else to the dashboard. Auto-provisions Let's Encrypt certificates. |
| dashboard | 3000 (internal) | Next.js admin UI. Login, project management, charts, retention, geography, settings. |
| bananalytics | 8080 (internal) | Go API server. Event ingestion (/v1/ingest), queries (/v1/query/*), auth (/v1/auth/*), GeoIP enrichment, rate limiting. |
| postgres | 5432 (internal) | PostgreSQL 16. Stores users, projects, sessions, and the partitioned events table. Migrations apply automatically on backend startup. |
1. Add a DNS A record
In your DNS provider, point the subdomain you want (e.g. analytics.yourdomain.com) at your VPS IP. Verify after ~1 minute:
dig analytics.yourdomain.com +short
# should print your VPS IP

Critical: Caddy can't fetch a Let's Encrypt cert until DNS resolves correctly.
2. Install Docker
curl -fsSL https://get.docker.com | sh
# Let your sudo user run docker without sudo
sudo usermod -aG docker $USER
exit # then SSH back in for group membership to take effect

3. Clone the repo
sudo mkdir -p /opt/bananalytics
sudo chown $USER:$USER /opt/bananalytics
cd /opt/bananalytics
git clone https://github.com/TableTennisCoder/bananalytics.git .

4. Configure env vars
Create /opt/bananalytics/server/.env:
# Caddy uses this for the SSL cert hostname
BANANA_DOMAIN=analytics.yourdomain.com
# Origins allowed to call the API (your dashboard + marketing site)
BANANA_CORS_ORIGINS=https://analytics.yourdomain.com,https://yourdomain.com
BANANA_LOG_LEVEL=info

5. (Optional) Copy GeoIP database
Without this, country/city features show empty data. From your local machine:
scp ./server/geoip/GeoLite2-City.mmdb \
user@your-vps-ip:/opt/bananalytics/server/geoip/

See GeoIP Setup for how to download the database.
6. Build and start
cd /opt/bananalytics/server
docker compose up -d --build

First build takes ~3-4 minutes. Then check:
docker compose ps # all 4 services should be "healthy"
docker compose logs --tail 50
# Look for:
# bananalytics → "migrations: applied successfully"
# bananalytics → "server starting port=8080"
# dashboard → "Ready in Xms"
# caddy → "certificate obtained successfully"

7. Claim your instance — DO THIS IMMEDIATELY
The /setup endpoint is publicly reachable until the first admin is created. If anyone else hits it before you, they own the instance.
Open in your browser right now:
https://analytics.yourdomain.com/setup

Register your admin user. From then on, /setup returns 410 Gone forever.
8. Daily Postgres backup
Add to root's crontab (sudo crontab -e):
0 3 * * * docker exec server-postgres-1 pg_dump -U bananalytics bananalytics | gzip > /opt/backups/bananalytics-$(date +\%F).sql.gz && find /opt/backups -name "bananalytics-*.sql.gz" -mtime +14 -delete

Daily backup at 3 AM, keeps 14 days of history. Don't forget sudo mkdir -p /opt/backups first.
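The find … -mtime +14 -delete clause in that cron line is the retention policy. A standalone rehearsal of it in a throwaway directory (GNU find/touch, as shipped on Ubuntu; the filenames and dates are placeholders):

```shell
# Rehearse the backup retention rule without touching real backups.
tmp=$(mktemp -d)
touch -d '20 days ago' "$tmp/bananalytics-2024-01-01.sql.gz"   # stale backup
touch "$tmp/bananalytics-2024-01-20.sql.gz"                    # fresh backup
find "$tmp" -name "bananalytics-*.sql.gz" -mtime +14 -delete
ls "$tmp"    # only the fresh backup remains
```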
How to deploy updates
cd /opt/bananalytics
git pull
cd server
docker compose up -d --build

~30 seconds for code-only changes (cached layers), ~3 minutes for dependency changes.
Realistic time for the full deploy: ~20 minutes (mostly DNS propagation + first Docker build). The whole flow is one git pull + one docker command — no manual database creation, no migration scripts, no SSL certs to renew. Caddy + the Go backend's auto-migrations handle it all.
Capacity & Scaling
Bananalytics is designed to run lean. The Go backend is a single static binary, Postgres uses a partitioned events table, and the SDK batches events to keep network traffic minimal. The result: you can run the entire stack — including your Next.js dashboard — on a $5 server for a long time before you need to scale up.
What's actually using your RAM
On a typical Hetzner CX22 (2 vCPU, 4 GB RAM) running everything co-located, here's the memory budget at idle vs. modest load:
| Service | Idle | Under load |
|---|---|---|
| Ubuntu base + sshd | ~300 MB | ~300 MB |
| Docker daemon | ~100 MB | ~100 MB |
| PostgreSQL 16 | ~150 MB | ~400–600 MB |
| bananalytics (Go) | ~30 MB | ~80–150 MB |
| Next.js dashboard | ~250 MB | ~400–500 MB |
| Total | ~830 MB | ~1.3–1.7 GB |
You'll sit at ~25% RAM idle, ~40% under normal use. Plenty of headroom on a 4 GB box.
Event throughput
The Go server is never the bottleneck — Postgres is. With the default config on 2 vCPU + 4 GB:
- Sustained ingest: ~500–1,000 events/second
- Peak burst: ~2,000 events/second (batched)
- Per day sustained: ~40–80 million events
- Per month theoretical max: ~1–2 billion events
For comparison, Mixpanel charges ~$2,800/month for 1B events (at $0.28 per 100K events).
Disk is the real limit
Each event row is ~300–700 bytes in Postgres (event name, properties JSON, IDs, timestamps, geo, indexes). Including index overhead and WAL:
| Events stored | Approx disk used |
|---|---|
| 1 million | ~1 GB |
| 10 million | ~8–12 GB |
| 30 million | ~25–30 GB |
| 50 million | ~40 GB — disk full |
Practical capacity of a CX22: ~30–40 million events stored. If you average 50 events per active user per day:
- 1K MAU → ~50K events/day → years of headroom
- 10K MAU → ~500K events/day → ~2 months on disk
- 100K MAU → ~5M events/day → ~6 days on disk — needs upgrade
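These runway figures follow from simple arithmetic. A sketch of the calculation, using the assumed ~1 GB per million stored events and the ~30 GB practical budget from the tables above (tune the inputs to your workload):

```shell
# Back-of-envelope disk-runway estimator for a co-located Postgres.
mau=10000                    # monthly active users
events_per_user_per_day=50   # average instrumentation density
gb_per_million_events=1      # ~1 GB per 1M events incl. indexes + WAL
budget_gb=30                 # practical event budget on a 40 GB CX22

events_per_day=$((mau * events_per_user_per_day))
runway_days=$(awk -v e="$events_per_day" -v g="$gb_per_million_events" -v b="$budget_gb" \
  'BEGIN { printf "%d", b * 1000000 / (e * g) }')
echo "$events_per_day events/day gives ~$runway_days days of disk runway"
```

With the defaults above this reproduces the 10K-MAU row: ~2 months before the disk budget is gone.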
Recommended specs by app stage
| App stage | Hetzner box | Cost / mo | Notes |
|---|---|---|---|
| MVP — first 1K users | CX22 · 2/4/40 | €4.75 | 6+ months runway |
| 10K–50K MAU | CX32 · 4/8/80 | ~€7 | More disk runway, smoother queries |
| 50K–200K MAU | CX42 · 8/16/160 | ~€15 | Better Postgres caching, big-query headroom |
| 200K+ MAU | Dedicated DB box | ~€30+ | Split Postgres onto its own VM via private network |
When to upgrade
Watch for these signals:
- Disk > 70% full → attach a Hetzner Volume (€0.044/GB/mo, separate block storage) and mount it on /var/lib/docker/volumes/server_pgdata. Cheaper than upgrading the whole VM.
- Postgres queries > 2s on the dashboard → bump to CX32 (more shared_buffers cache hits).
- Ingest p99 latency > 100ms → add CPU; or move rate-limit/auth caches to a Postgres-backed store and load-balance two backends behind your reverse proxy.
Monitoring commands
# Container resource usage (run on the server)
docker stats
# Disk usage
df -h
docker exec server-postgres-1 psql -U bananalytics -d bananalytics \
-c "SELECT pg_size_pretty(pg_database_size('bananalytics'));"
# Slow queries (last 24h)
docker compose logs --since 24h bananalytics | grep -i "duration_ms.*[0-9]\{4,\}"

Tip: Hetzner lets you resize the VM with the data volume intact — ~30 seconds of downtime. No reason to over-provision now. Start small, grow as needed.
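To see what the slow-query grep is doing, here is the same pattern run against two made-up log lines. The log format below is an assumption for illustration; adapt the pattern to whatever your backend actually emits:

```shell
# The pattern matches any line whose duration_ms value has four or more
# digits, i.e. queries slower than roughly one second.
matches=$(printf '%s\n' \
  'level=info msg="query ok" duration_ms=87' \
  'level=warn msg="slow query" duration_ms=2431' \
  | grep -i "duration_ms.*[0-9]\{4,\}")
echo "$matches"    # only the 2431 ms line survives the filter
```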
GeoIP Setup
Bananalytics uses MaxMind's free GeoLite2 database to map user IP addresses to countries and cities. This powers the 3D globe, the geography dashboard, and the "Top Country" KPI on your overview.
Without this setup, all geo features will be empty. The server runs fine — it just won't enrich events with location data. Lookups happen locally on your server, so no IP data ever leaves your infrastructure.
1. Get a free MaxMind license key
Sign up at maxmind.com/en/geolite2/signup (free, takes 30 seconds). After confirming your email, go to "My License Key" → "Generate new license key".
2. Download the database
The repository ships with a download script. Run it in the server/ directory:
export MAXMIND_LICENSE_KEY=your_key_here
./scripts/download-geoip.sh

On Windows (PowerShell):

$env:MAXMIND_LICENSE_KEY = "your_key_here"
.\scripts\download-geoip.ps1

The script downloads ~70 MB to ./geoip/GeoLite2-City.mmdb. That directory is already mounted into the Docker container and gitignored.
3. Restart the server
docker-compose restart bananalytics

You should see GeoIP database loaded in the logs (instead of GeoIP disabled). All new events will now be enriched with country, city, and coordinates.
4. Keep it fresh (optional)
MaxMind updates the database weekly. To stay current, re-run the script monthly — or add a cron job:
0 3 1 * * cd /path/to/bananalytics/server && \
MAXMIND_LICENSE_KEY=xxx ./scripts/download-geoip.sh && \
docker-compose restart bananalytics

Privacy note: The lookup happens entirely on your server. The MaxMind database is local — no IP addresses are ever sent to MaxMind or any third party. Only the resolved country / city are stored alongside the event.
Configuration
| Variable | Default | Description |
|---|---|---|
| BANANA_PORT | 8080 | HTTP server port |
| BANANA_DB_DSN | required | PostgreSQL connection string |
| BANANA_LOG_LEVEL | info | debug, info, warn, error |
| BANANA_RATE_LIMIT_RPM | 1000 | Requests/min per API key |
| BANANA_IP_RATE_LIMIT_RPM | 300 | Requests/min per IP |
| BANANA_CORS_ORIGINS | * | Allowed origins |
| BANANA_DB_MAX_CONNS | 25 | Max DB connections |
| BANANA_GEOIP_DB | (unset) | Path to GeoLite2-City.mmdb |
| BANANA_DOMAIN | localhost | Domain for Caddy HTTPS |
AI Setup
Copy the prompt below and paste it into Claude Code, Cursor, Copilot, or any AI coding agent. It will integrate Bananalytics into your React Native app in one run.
Integrate Bananalytics analytics into this React Native app. Bananalytics is a self-hosted, privacy-first product analytics tool. Follow these steps exactly: ## 1. Install dependencies ...
How it works
- Copy the prompt above into Claude Code, Cursor, or any AI coding agent
- Replace YOUR_WRITE_KEY and YOUR_ENDPOINT with your actual values
- Answer the AI when it asks what events you want to track
- Done — the AI installs the SDK, adds tracking calls, and instruments your app
React Native SDK
Install the SDK in your React Native app.
npm install @bananalytics/react-native
npm install @react-native-async-storage/async-storage

import { Bananalytics } from '@bananalytics/react-native';
Bananalytics.init({
apiKey: 'rk_your_write_key',
endpoint: 'https://your-server.com',
debug: true,
});

// Track custom events
Bananalytics.track('button_clicked', { button: 'signup' });
// Track screen views
Bananalytics.screen('HomeScreen');
// Identify users
Bananalytics.identify('user-123', { plan: 'pro' });
// Flush events immediately
await Bananalytics.flush();

React Provider
import { BananalyticsProvider, useBananalytics, useTrackScreen } from '@bananalytics/react-native';
function App() {
return (
<BananalyticsProvider config={{ apiKey: 'rk_...', endpoint: '...' }}>
<HomeScreen />
</BananalyticsProvider>
);
}
function HomeScreen() {
useTrackScreen('HomeScreen');
const bananalytics = useBananalytics();
return (
<Button onPress={() => bananalytics.track('tapped')} title="Tap" />
);
}

Configuration Options
| Option | Default | Description |
|---|---|---|
| apiKey | required | Write-only API key |
| endpoint | required | Backend URL |
| flushInterval | 30000 | Auto-flush interval (ms) |
| flushAt | 20 | Events before auto-flush |
| maxQueueSize | 1000 | Max events in memory |
| maxRetries | 3 | Retry attempts |
| debug | false | Console logging |
| trackAppLifecycle | true | Auto-track foreground/background |
| sessionTimeout | 1800000 | Session timeout (ms) |
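For orientation, the two millisecond-valued defaults converted to human units (the SDK presumably flushes on whichever of flushAt / flushInterval triggers first; check the SDK source for the exact semantics):

```shell
# Convert the table's millisecond defaults into readable units.
flush_interval_ms=30000
session_timeout_ms=1800000
echo "auto-flush every $((flush_interval_ms / 1000)) s"
echo "new session after $((session_timeout_ms / 60000)) min of inactivity"
```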
Event Strategy
A good event strategy is the difference between a dashboard full of noise and one that drives decisions. Here is how to instrument your app to get the most out of Bananalytics.
Core Events You Should Track
These events power the dashboard features and give you a complete picture of user behavior.
Onboarding & Activation
Measure how users get from install to value. Build funnels to find where they drop off.
app_opened — Distinguish first launch from returning users
first_open: boolean
signup_started — See which auth methods convert best
method: 'email' | 'google' | 'apple'
signup_completed — Measure signup friction. Funnel: started → completed
method, time_to_complete_ms
onboarding_step_viewed — Find which onboarding step loses users
step: number, step_name: string
onboarding_completed — Track activation rate
steps_completed: number
Core Product Usage
Track the actions that define your product's value. These power retention cohorts.
feature_used — See which features drive retention
feature: string
content_viewed — Understand what users engage with
content_id, content_type, source
search_performed — Discover unmet needs (zero-result searches)
query, results_count
item_created — Measure creation activity as an engagement signal
item_type, item_id
share_tapped — Track organic virality loops
content_type, share_method
Revenue & Conversion
Track the money path. Build funnels from browse to purchase to optimize conversion.
product_viewed — Top of the purchase funnel
product_id, price, category
add_to_cart — Mid-funnel intent signal
product_id, quantity, price
checkout_started — High-intent moment; track abandonment
cart_value, item_count
purchase_completed — Revenue tracking. Compare to checkout_started for drop-off
order_id, total, currency, items
subscription_started — SaaS conversion tracking
plan, price, trial: boolean
subscription_cancelled — Understand churn reasons
plan, reason, days_active
Engagement & Retention Signals
These events feed your retention heatmap and help predict churn.
session_started — Session count per user = engagement health
(auto-tracked)
notification_received — Measure push notification effectiveness
type, campaign_id
notification_tapped — Tap rate = notification quality signal
type, campaign_id
rating_prompted — Optimize when to ask for reviews
days_since_install
rating_submitted — Track app store rating health
stars, days_since_install
Errors & Friction
Track where users hit walls. These often reveal the biggest conversion opportunities.
error_occurred — Surface bugs that affect real users
error_code, screen, message
payment_failed — Lost revenue you can recover
error_type, retry_count
form_abandoned — Find the field that kills your form
form_name, last_field_filled
permission_denied — Users refusing permissions = feature blockers
permission_type
Using Events with Dashboard Features
| Dashboard Feature | Events to Track | Insight You Get |
|---|---|---|
| Funnels | signup_started → signup_completed → first_purchase | Where users drop off in your conversion flow |
| Retention | Any recurring action (session_started, feature_used) | How many users come back on day 1, 7, 30 |
| Live View | All events in real-time | Verify tracking works, monitor launches & campaigns |
| Geography | All events (geo is extracted from IP) | Where your users are, localization priorities |
| Sessions | session_started + any user-identified events | Debug individual user journeys |
Best Practices
- Use past tense for event names — purchase_completed, not purchase. It's clear the action happened.
- Use snake_case consistently — Bananalytics groups events by name. Mixed casing creates duplicates.
- Keep properties flat — { price: 49.99, currency: "USD" }, not { payment: { price: 49.99 } }. Easier to query.
- Call identify() early — as soon as the user logs in. This links anonymous events to a real user for session tracking.
- Track screens with screen() — it auto-creates screen_view events, which power the top events dashboard and retention.
- Start with 10-15 events max — you can always add more. Too many events early on create noise and make dashboards hard to read.
// After user logs in
Bananalytics.identify('user-123', { plan: 'free' });
// Screen views (auto-tracked with useTrackScreen hook)
Bananalytics.screen('HomeScreen');
Bananalytics.screen('ProductScreen');
// Core conversion funnel
Bananalytics.track('product_viewed', {
product_id: 'prod_abc', price: 49.99, category: 'shoes'
});
Bananalytics.track('add_to_cart', {
product_id: 'prod_abc', quantity: 1, price: 49.99
});
Bananalytics.track('checkout_started', {
cart_value: 49.99, item_count: 1
});
Bananalytics.track('purchase_completed', {
order_id: 'ord_xyz', total: 49.99, currency: 'USD'
});
// Engagement signals
Bananalytics.track('search_performed', {
query: 'running shoes', results_count: 24
});
Bananalytics.track('share_tapped', {
content_type: 'product', share_method: 'instagram'
});
// Error tracking
Bananalytics.track('payment_failed', {
error_type: 'card_declined', retry_count: 0
});

API Reference
Include your API key in every request:
curl -H "Authorization: Bearer sk_your_secret_key" http://localhost:8080/v1/query/events

rk_* — Write Key (ingestion)
sk_* — Secret Key (queries)

Error Codes
| Status | Code | Description |
|---|---|---|
| 400 | BAD_REQUEST | Invalid request body or parameters |
| 400 | VALIDATION_FAILED | Event validation failed |
| 401 | UNAUTHORIZED | Missing or invalid API key |
| 413 | PAYLOAD_TOO_LARGE | Request body exceeds 5MB |
| 429 | RATE_LIMITED | Too many requests |
| 500 | INTERNAL_ERROR | Server error |
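On a 429 RATE_LIMITED response, clients should back off before retrying rather than hammer the limit. A hypothetical schedule sized to the SDK's default maxRetries=3 (the doubling delays are an illustration, not what the SDK itself implements):

```shell
# Sketch of an exponential backoff schedule for rate-limited requests.
delay_ms=1000
for attempt in 1 2 3; do
  echo "attempt $attempt: wait ${delay_ms} ms before retrying"
  delay_ms=$((delay_ms * 2))   # double the wait after each failure
done
```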
Privacy & Compliance
Bananalytics is designed with privacy in mind. All data stays on your infrastructure — no third-party services, no data sharing.
- Self-hosted: Data never leaves your server
- Opt-out support: Built-in consent management in the SDK
- PII sanitization: Auto-strips email, phone, SSN from auto-captured events
- No cookies: Uses device storage, not browser cookies
- GDPR-friendly: You control the data, you handle deletion requests