# NerdCabalMCP API Key Management Guide
A comprehensive guide to securely managing API keys for all major AI, development, and creative platforms integrated with NerdCabalMCP.
NerdCabalMCP integrates with 20+ external services, each requiring API authentication. This guide shows you how to obtain, store, rotate, and troubleshoot those credentials securely.
Before you start, ensure you haven't accidentally committed keys:
# Check git history for potential secrets
git log -p | grep -i "api.*key\|secret\|token" | head -20
# Use gitleaks to scan for secrets
docker run -v $(pwd):/repo zricethezav/gitleaks:latest detect --source /repo
# Use TruffleHog
trufflehog git file://. --only-verified
If you find leaked keys, revoke them immediately, then use `git filter-branch` or BFG Repo-Cleaner to remove them from history.

| Level | Use Case | Example |
|---|---|---|
| Read-Only | Fetching data, monitoring | GitHub repo read, HuggingFace model download |
| Read-Write | Creating/updating resources | Creating GitHub issues, uploading datasets |
| Admin | Full control, billing | Organization settings, key rotation |
Principle: Always use the minimum required permission level.
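The principle can also be checked mechanically before a task runs. A minimal sketch (the `audit_scopes` helper and the scope names are hypothetical, not part of NerdCabalMCP):

```python
# Map each task to the smallest scope set it actually needs (illustrative names).
REQUIRED_SCOPES = {
    "fetch_repo": {"repo:read"},
    "create_issue": {"repo:read", "repo:write"},
}

def audit_scopes(task: str, granted: set) -> dict:
    """Report scopes the token is missing, and scopes it carries beyond what
    the task needs (excess scope violates least privilege)."""
    needed = REQUIRED_SCOPES[task]
    return {
        "missing": sorted(needed - granted),
        "excess": sorted(granted - needed),
    }

print(audit_scopes("fetch_repo", {"repo:read", "repo:write"}))
# {'missing': [], 'excess': ['repo:write']}
```

A non-empty `excess` list is a cue to mint a narrower key.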
Create a .env file in the project root:
# /home/user/NerdCabalMCP/.env
# ==============================================================================
# LLM PROVIDERS
# ==============================================================================
# Anthropic (Claude)
ANTHROPIC_API_KEY=sk-ant-api03-xxxxxxxxxxxxxxxxxxxxxxxxxxxx
# OpenAI (GPT-4, DALL-E)
OPENAI_API_KEY=sk-proj-xxxxxxxxxxxxxxxxxxxxxxxxxxxx
# Google (Gemini)
GOOGLE_API_KEY=AIzaXxXxXxXxXxXxXxXxXxXxXxXxXxXxXxXx
GOOGLE_APPLICATION_CREDENTIALS=/path/to/service-account.json
# Cohere
COHERE_API_KEY=xxxxxxxxxxxxxxxxxxxxxxxxxxxx
# Mistral AI
MISTRAL_API_KEY=xxxxxxxxxxxxxxxxxxxxxxxxxxxx
# Together AI
TOGETHER_API_KEY=xxxxxxxxxxxxxxxxxxxxxxxxxxxx
# Replicate
REPLICATE_API_TOKEN=r8_xxxxxxxxxxxxxxxxxxxxxxxxxxxx
# ==============================================================================
# DEVELOPMENT PLATFORMS
# ==============================================================================
# GitHub
GITHUB_TOKEN=ghp_xxxxxxxxxxxxxxxxxxxxxxxxxxxx
GITHUB_PERSONAL_ACCESS_TOKEN=ghp_xxxxxxxxxxxxxxxxxxxxxxxxxxxx
# Replit
REPLIT_TOKEN=xxxxxxxxxxxxxxxxxxxxxxxxxxxx
# Vercel
VERCEL_TOKEN=xxxxxxxxxxxxxxxxxxxxxxxxxxxx
# Railway
RAILWAY_TOKEN=xxxxxxxxxxxxxxxxxxxxxxxxxxxx
# ==============================================================================
# CREATIVE TOOLS
# ==============================================================================
# Figma
FIGMA_ACCESS_TOKEN=figd_xxxxxxxxxxxxxxxxxxxxxxxxxxxx
# Canva
CANVA_API_KEY=xxxxxxxxxxxxxxxxxxxxxxxxxxxx
# ==============================================================================
# ML & DATA PLATFORMS
# ==============================================================================
# HuggingFace
HUGGINGFACE_TOKEN=hf_xxxxxxxxxxxxxxxxxxxxxxxxxxxx
# Kaggle
KAGGLE_USERNAME=your_username
KAGGLE_KEY=xxxxxxxxxxxxxxxxxxxxxxxxxxxx
# Weights & Biases
WANDB_API_KEY=xxxxxxxxxxxxxxxxxxxxxxxxxxxx
# MLflow
MLFLOW_TRACKING_URI=https://your-mlflow-server.com
MLFLOW_TRACKING_USERNAME=admin
MLFLOW_TRACKING_PASSWORD=xxxxxxxxxxxxxxxxxxxxxxxxxxxx
# FiftyOne
FIFTYONE_DATABASE_URI=mongodb://localhost:27017
# ==============================================================================
# INFRASTRUCTURE
# ==============================================================================
# Cloudflare
CLOUDFLARE_API_TOKEN=xxxxxxxxxxxxxxxxxxxxxxxxxxxx
# AWS
AWS_ACCESS_KEY_ID=AKIA...
AWS_SECRET_ACCESS_KEY=xxxxxxxxxxxxxxxxxxxxxxxxxxxx
AWS_DEFAULT_REGION=us-east-1
# Google Cloud
GCP_PROJECT_ID=your-project-id
# ==============================================================================
# MISCELLANEOUS
# ==============================================================================
# Slack
SLACK_BOT_TOKEN=xoxb-xxxxxxxxxxxxxxxxxxxxxxxxxxxx
# Discord
DISCORD_BOT_TOKEN=xxxxxxxxxxxxxxxxxxxxxxxxxxxx
# Twilio
TWILIO_ACCOUNT_SID=ACxxxxxxxxxxxxxxxxxxxxxxxxxxxx
TWILIO_AUTH_TOKEN=xxxxxxxxxxxxxxxxxxxxxxxxxxxx
# Brave Search
BRAVE_SEARCH_API_KEY=BSAxxxxxxxxxxxxxxxxxxxxxxxxxxxx
Secure the .env file:
# Make sure .env is in .gitignore
echo ".env" >> .gitignore
# Set restrictive permissions (Unix/Linux/macOS)
chmod 600 .env
# Verify it's not tracked
git status | grep .env # Should not appear
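A startup script can refuse to run when the file is too permissive. A sketch of that check (the `is_locked_down` helper is hypothetical; Unix-only, since it relies on POSIX mode bits):

```python
import os
import stat
import tempfile

def is_locked_down(path: str) -> bool:
    """True if the file has no group or other permission bits (i.e. mode 600 or stricter)."""
    mode = stat.S_IMODE(os.stat(path).st_mode)
    return mode & (stat.S_IRWXG | stat.S_IRWXO) == 0

# Demonstrate on a throwaway file standing in for .env -- never write real keys here.
with tempfile.NamedTemporaryFile("w", suffix=".env", delete=False) as f:
    f.write("ANTHROPIC_API_KEY=placeholder\n")
    path = f.name
os.chmod(path, 0o600)
print(is_locked_down(path))  # True
os.remove(path)
```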
Set environment variables at the system level:
# Add to ~/.bashrc, ~/.zshrc, or ~/.profile
export ANTHROPIC_API_KEY="sk-ant-api03-xxxxxxxxxxxxxxxxxxxxxxxxxxxx"
export OPENAI_API_KEY="sk-proj-xxxxxxxxxxxxxxxxxxxxxxxxxxxx"
export HUGGINGFACE_TOKEN="hf_xxxxxxxxxxxxxxxxxxxxxxxxxxxx"
# Reload shell
source ~/.bashrc # or ~/.zshrc
# User-level environment variable
[System.Environment]::SetEnvironmentVariable('ANTHROPIC_API_KEY', 'sk-ant-api03-xxx', 'User')
# Or edit via GUI: System Properties > Environment Variables
# docker-compose.yml
version: '3.8'
services:
nerdcabal:
image: nerdcabal-mcp:latest
env_file:
- .env
# Or individual variables
environment:
- ANTHROPIC_API_KEY=${ANTHROPIC_API_KEY}
- OPENAI_API_KEY=${OPENAI_API_KEY}
See Secret Management Systems section below.
Get API Key:
Keys start with `sk-ant-api03-`.

Environment Variables:
export ANTHROPIC_API_KEY="sk-ant-api03-xxxxxxxxxxxxxxxxxxxxxxxxxxxx"
Testing:
# Test with curl
curl https://api.anthropic.com/v1/messages \
-H "x-api-key: $ANTHROPIC_API_KEY" \
-H "anthropic-version: 2023-06-01" \
-H "content-type: application/json" \
-d '{
"model": "claude-3-5-sonnet-20241022",
"max_tokens": 1024,
"messages": [{"role": "user", "content": "Hello!"}]
}'
Usage Monitoring:
Get API Key:
Keys start with `sk-proj-` or `sk-`.

Environment Variables:
export OPENAI_API_KEY="sk-proj-xxxxxxxxxxxxxxxxxxxxxxxxxxxx"
Testing:
import os

from openai import OpenAI

client = OpenAI(api_key=os.environ['OPENAI_API_KEY'])
response = client.chat.completions.create(
model="gpt-4o",
messages=[{"role": "user", "content": "Hello!"}]
)
print(response.choices[0].message.content)
Usage Monitoring:
Get API Key:
Keys start with `AIza`.

Environment Variables:
export GOOGLE_API_KEY="AIzaXxXxXxXxXxXxXxXxXxXxXxXxXxXxXxXx"
Testing:
import os

import google.generativeai as genai

genai.configure(api_key=os.environ['GOOGLE_API_KEY'])
model = genai.GenerativeModel('gemini-1.5-pro')
response = model.generate_content("Hello!")
print(response.text)
Similar process for each:
Quick Reference:
# Cohere
export COHERE_API_KEY="..."
# https://dashboard.cohere.com/api-keys
# Mistral
export MISTRAL_API_KEY="..."
# https://console.mistral.ai
# Together AI
export TOGETHER_API_KEY="..."
# https://api.together.xyz/settings/api-keys
# Replicate
export REPLICATE_API_TOKEN="r8_..."
# https://replicate.com/account/api-tokens
Get Personal Access Token (PAT):
Recommended scopes: `repo` (for private repos), `read:org` (for organization access), `workflow` (for GitHub Actions). Tokens start with `ghp_` or `github_pat_`.

Environment Variables:
export GITHUB_TOKEN="ghp_xxxxxxxxxxxxxxxxxxxxxxxxxxxx"
Testing:
# Test with GitHub CLI
gh auth status
# Or with curl
curl -H "Authorization: token $GITHUB_TOKEN" https://api.github.com/user
MCP Server Configuration:
{
"mcpServers": {
"github": {
"command": "npx",
"args": ["-y", "@modelcontextprotocol/server-github"],
"env": {
"GITHUB_TOKEN": "ghp_xxxxxxxxxxxxxxxxxxxxxxxxxxxx"
}
}
}
}
Get API Token:
Environment Variables:
export REPLIT_TOKEN="xxxxxxxxxxxxxxxxxxxxxxxxxxxx"
Get Token:
Environment Variables:
export VERCEL_TOKEN="xxxxxxxxxxxxxxxxxxxxxxxxxxxx"
Testing:
# Install Vercel CLI
npm i -g vercel
# Test the token (--token is a global flag accepted by Vercel CLI commands)
vercel whoami --token $VERCEL_TOKEN
Get Access Token:
Tokens start with `figd_`.

Environment Variables:
export FIGMA_ACCESS_TOKEN="figd_xxxxxxxxxxxxxxxxxxxxxxxxxxxx"
Testing:
# Get user info
curl -H "X-Figma-Token: $FIGMA_ACCESS_TOKEN" https://api.figma.com/v1/me
Usage with NerdCabal Creative Director:
{
"tool": "creative-director",
"style": "cyberpunk-brutalist-bauhaus",
"export_to_figma": true,
"figma_token": "${FIGMA_ACCESS_TOKEN}"
}
Get API Key:
Environment Variables:
export CANVA_API_KEY="xxxxxxxxxxxxxxxxxxxxxxxxxxxx"
export CANVA_CLIENT_ID="xxxxxxxxxxxxxxxxxxxxxxxxxxxx"
export CANVA_CLIENT_SECRET="xxxxxxxxxxxxxxxxxxxxxxxxxxxx"
Get Token:
Tokens start with `hf_`.

Environment Variables:
export HUGGINGFACE_TOKEN="hf_xxxxxxxxxxxxxxxxxxxxxxxxxxxx"
Testing:
# Login via CLI
huggingface-cli login
# Or programmatically
python -c "from huggingface_hub import HfApi; api = HfApi(token='$HUGGINGFACE_TOKEN'); print(api.whoami())"
Usage with NerdCabal Dataset Builder:
{
"tool": "dataset-builder",
"dataset_type": "SFT",
"upload_to_huggingface": true,
"hf_token": "${HUGGINGFACE_TOKEN}",
"hf_repo": "username/dataset-name"
}
Get API Credentials:
Download `kaggle.json` from your account settings.

Setup:
# Place kaggle.json in the right location
mkdir -p ~/.kaggle
mv ~/Downloads/kaggle.json ~/.kaggle/
chmod 600 ~/.kaggle/kaggle.json
# Or set environment variables
export KAGGLE_USERNAME="your_username"
export KAGGLE_KEY="your_api_key"
Testing:
# Install Kaggle CLI
pip install kaggle
# Test
kaggle datasets list
Get API Key:
Environment Variables:
export WANDB_API_KEY="xxxxxxxxxxxxxxxxxxxxxxxxxxxx"
Testing:
# Login
wandb login
# Or programmatically
python -c "import wandb; wandb.login(key='$WANDB_API_KEY')"
Self-Hosted Setup:
# Start MLflow server
mlflow server \
--backend-store-uri sqlite:///mlflow.db \
--default-artifact-root ./mlruns \
--host 0.0.0.0 \
--port 5000
# Set tracking URI
export MLFLOW_TRACKING_URI="http://localhost:5000"
With Authentication:
export MLFLOW_TRACKING_USERNAME="admin"
export MLFLOW_TRACKING_PASSWORD="your_password"
Testing:
import os

import mlflow

mlflow.set_tracking_uri(os.environ['MLFLOW_TRACKING_URI'])
with mlflow.start_run():
mlflow.log_param("test", "value")
Get Access Keys:
Environment Variables:
export AWS_ACCESS_KEY_ID="AKIA..."
export AWS_SECRET_ACCESS_KEY="xxxxxxxxxxxxxxxxxxxxxxxxxxxx"
export AWS_DEFAULT_REGION="us-east-1"
Testing:
# Install AWS CLI
pip install awscli
# Test
aws s3 ls
Get Service Account Key:
Environment Variables:
export GOOGLE_APPLICATION_CREDENTIALS="/path/to/service-account.json"
export GCP_PROJECT_ID="your-project-id"
Testing:
# Install gcloud CLI
gcloud auth activate-service-account --key-file=$GOOGLE_APPLICATION_CREDENTIALS
# Test
gcloud projects list
Get API Token:
Environment Variables:
export CLOUDFLARE_API_TOKEN="xxxxxxxxxxxxxxxxxxxxxxxxxxxx"
For production deployments and team collaboration, use a proper secret management system.
Setup:
# Start Vault server (dev mode)
vault server -dev
# Set Vault address
export VAULT_ADDR='http://127.0.0.1:8200'
# Store secrets
vault kv put secret/nerdcabal \
anthropic_api_key="sk-ant-..." \
openai_api_key="sk-proj-..." \
github_token="ghp_..."
# Retrieve secret
vault kv get secret/nerdcabal
Integration with NerdCabal:
# Create startup script
cat > load-secrets.sh << 'EOF'
#!/bin/bash
export ANTHROPIC_API_KEY=$(vault kv get -field=anthropic_api_key secret/nerdcabal)
export OPENAI_API_KEY=$(vault kv get -field=openai_api_key secret/nerdcabal)
export GITHUB_TOKEN=$(vault kv get -field=github_token secret/nerdcabal)
EOF
# Source before running MCP server
source load-secrets.sh
node mcp-server/dist/index.js
Store Secrets:
# Create secret
aws secretsmanager create-secret \
--name nerdcabal/api-keys \
--secret-string '{
"anthropic_api_key": "sk-ant-...",
"openai_api_key": "sk-proj-...",
"github_token": "ghp_..."
}'
# Retrieve secret
aws secretsmanager get-secret-value \
--secret-id nerdcabal/api-keys \
--query SecretString \
--output text | jq -r '.anthropic_api_key'
Auto-load Script:
#!/bin/bash
# load-aws-secrets.sh
SECRET_JSON=$(aws secretsmanager get-secret-value \
--secret-id nerdcabal/api-keys \
--query SecretString \
--output text)
export ANTHROPIC_API_KEY=$(echo "$SECRET_JSON" | jq -r '.anthropic_api_key')
export OPENAI_API_KEY=$(echo "$SECRET_JSON" | jq -r '.openai_api_key')
export GITHUB_TOKEN=$(echo "$SECRET_JSON" | jq -r '.github_token')
Store Secrets:
# Create secret
echo -n "sk-ant-..." | gcloud secrets create anthropic-api-key --data-file=-
# Access secret
gcloud secrets versions access latest --secret="anthropic-api-key"
Auto-load Script:
#!/bin/bash
export ANTHROPIC_API_KEY=$(gcloud secrets versions access latest --secret="anthropic-api-key")
export OPENAI_API_KEY=$(gcloud secrets versions access latest --secret="openai-api-key")
Setup:
# Install Doppler CLI
brew install dopplerhq/cli/doppler # macOS
# or curl -Ls https://cli.doppler.com/install.sh | sh
# Login
doppler login
# Setup project
doppler setup
# Set secrets
doppler secrets set ANTHROPIC_API_KEY="sk-ant-..."
# Run with secrets injected
doppler run -- node mcp-server/dist/index.js
Benefits:
Setup:
# Install 1Password CLI
brew install 1password-cli
# Sign in
op signin
# Create item
op item create \
--category=Login \
--title="NerdCabal API Keys" \
--vault="Development" \
anthropic_api_key="sk-ant-..." \
openai_api_key="sk-proj-..."
# Retrieve and export
export ANTHROPIC_API_KEY=$(op read "op://Development/NerdCabal API Keys/anthropic_api_key")
| Solution | Hosting | Cost | Best For | Complexity |
|---|---|---|---|---|
| Vault | Self-hosted or Cloud | Free (OSS) / Paid (Enterprise) | Enterprise | High |
| AWS Secrets Manager | AWS Cloud | $0.40/secret/month + API calls | AWS users | Medium |
| GCP Secret Manager | GCP Cloud | $0.06/secret/month + API calls | GCP users | Medium |
| Doppler | Cloud (SaaS) | Free tier / $7/user/month | Startups, teams | Low |
| 1Password | Cloud (SaaS) | $7.99/user/month | Individuals, small teams | Low |
Regular key rotation is essential for security.
Recommended Schedule:
#!/bin/bash
# rotate-keys.sh
# 1. Generate new keys (provider-specific).
# Note: GitHub personal access tokens cannot currently be created via the API;
# generate the replacement in the web UI (Settings > Developer settings > Tokens)
# and paste it here:
NEW_GITHUB_TOKEN="ghp_paste-new-token-here"
# 2. Update in secret manager
vault kv put secret/nerdcabal github_token="$NEW_GITHUB_TOKEN"
# 3. Test new key
export GITHUB_TOKEN="$NEW_GITHUB_TOKEN"
if gh auth status; then
  echo "✅ New key works"
else
  echo "❌ New key failed, rolling back"
  exit 1
fi
# 4. Revoke old key
# (provider-specific, e.g., via API)
# 5. Update documentation
echo "$(date): Rotated GitHub token" >> rotation-log.txt
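The rotation schedule can also be audited mechanically from a record of last-rotation dates. A sketch (the intervals and the `overdue_keys` helper are illustrative, not part of NerdCabalMCP):

```python
from datetime import date, timedelta

# Hypothetical per-provider rotation intervals in days; adjust to your policy.
ROTATION_DAYS = {"github": 90, "anthropic": 90, "aws": 60}

def overdue_keys(last_rotated: dict, today: date) -> list:
    """Return providers whose keys are older than their rotation interval."""
    overdue = []
    for provider, rotated in last_rotated.items():
        limit = timedelta(days=ROTATION_DAYS.get(provider, 90))  # default 90 days
        if today - rotated > limit:
            overdue.append(provider)
    return sorted(overdue)

print(overdue_keys({"github": date(2026, 1, 1), "aws": date(2025, 9, 1)},
                   date(2026, 1, 15)))  # ['aws']
```

Run this from cron or CI to get a nudge before keys go stale.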
Use a dual-key strategy for zero-downtime rotation:
Implementation:
# Support multiple keys temporarily
export PRIMARY_API_KEY="new-key"
export FALLBACK_API_KEY="old-key"
# In your code, try primary first, fallback if fails
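The try-primary-then-fallback logic might look like this (all names are illustrative; substitute your real API call for the stub):

```python
def call_with_fallback(request, primary_key: str, fallback_key: str):
    """Try the primary (new) key first; fall back to the old key if it is rejected.
    During the rotation overlap window, either key may be the live one."""
    try:
        return request(primary_key)
    except PermissionError:
        return request(fallback_key)

# Stub "API" standing in for a real client: mid-rotation it still only
# accepts the old key.
def fake_api(key: str) -> str:
    if key != "old-key":
        raise PermissionError("invalid key")
    return "ok"

print(call_with_fallback(fake_api, "new-key", "old-key"))  # ok
```

Once the new key is confirmed live everywhere, drop the fallback and revoke the old key.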
Define roles for your team:
| Role | Access Level | Permissions |
|---|---|---|
| Admin | Full | Create/revoke keys, manage billing, all agents |
| Developer | Read-Write | Use all agents, create resources, no billing |
| Analyst | Read-Only | Query data, run analyses, no writes |
| Auditor | Logs-Only | View usage logs, no resource access |
Instead of sharing one key, create individual keys:
GitHub Example:
# Alice's key (repo read/write)
alice_token="ghp_alice..."
# Bob's key (repo read only)
bob_token="ghp_bob..."
# Charlie's key (workflow management)
charlie_token="ghp_charlie..."
Benefits:
Option 1: Shared 1Password Vault
Option 2: Doppler Teams
Team members run `doppler run` locally to inject shared secrets.

Checklist:
- [ ] Create accounts on required platforms
- [ ] Generate scoped API keys
- [ ] Add to secret management system
- [ ] Grant appropriate RBAC role
- [ ] Share documentation (this guide!)
- [ ] Test access with sample task
- [ ] Add to rotation schedule
Offboarding:
- [ ] Revoke all API keys
- [ ] Remove from secret management
- [ ] Audit recent activity
- [ ] Rotate any shared keys they had access to
- [ ] Update team documentation
Create mcp-server/config/secrets.json (excluded from git):
{
"providers": {
"anthropic": {
"api_key": "${ANTHROPIC_API_KEY}",
"max_tokens": 4096,
"model": "claude-sonnet-4-5-20250929"
},
"openai": {
"api_key": "${OPENAI_API_KEY}",
"organization": "${OPENAI_ORG_ID}",
"model": "gpt-4o"
},
"google": {
"api_key": "${GOOGLE_API_KEY}",
"model": "gemini-1.5-pro"
}
},
"integrations": {
"github": {
"token": "${GITHUB_TOKEN}"
},
"huggingface": {
"token": "${HUGGINGFACE_TOKEN}"
},
"mlflow": {
"tracking_uri": "${MLFLOW_TRACKING_URI}",
"username": "${MLFLOW_TRACKING_USERNAME}",
"password": "${MLFLOW_TRACKING_PASSWORD}"
},
"figma": {
"access_token": "${FIGMA_ACCESS_TOKEN}"
},
"wandb": {
"api_key": "${WANDB_API_KEY}",
"project": "nerdcabal-experiments"
}
}
}
Load in TypeScript:
// mcp-server/src/config.ts
import * as fs from 'fs';
import * as path from 'path';
interface Config {
providers: {
[key: string]: {
api_key: string;
[key: string]: any;
};
};
integrations: {
[key: string]: {
[key: string]: string;
};
};
}
export function loadConfig(): Config {
const configPath = path.join(__dirname, '../config/secrets.json');
const configTemplate = fs.readFileSync(configPath, 'utf-8');
// Replace ${VAR} with process.env.VAR
const configResolved = configTemplate.replace(
/\$\{(\w+)\}/g,
(_, varName) => process.env[varName] || ''
);
return JSON.parse(configResolved);
}
// Usage
const config = loadConfig();
const anthropicKey = config.providers.anthropic.api_key;
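For tooling outside the TypeScript server, the same `${VAR}` substitution can be sketched in Python (the `resolve_env` helper is hypothetical, mirroring the regex replacement above):

```python
import json
import os
import re

def resolve_env(template: str) -> dict:
    """Replace ${VAR} placeholders in a JSON template with environment values,
    then parse the result. Missing variables become empty strings."""
    resolved = re.sub(
        r"\$\{(\w+)\}",
        lambda m: os.environ.get(m.group(1), ""),
        template,
    )
    return json.loads(resolved)

os.environ["DEMO_TOKEN"] = "abc123"  # stand-in; never hard-code real keys
cfg = resolve_env('{"github": {"token": "${DEMO_TOKEN}"}}')
print(cfg["github"]["token"])  # abc123
```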
Allow agents to choose provider based on task:
// mcp-server/src/providers.ts
import Anthropic from '@anthropic-ai/sdk';
import OpenAI from 'openai';
import { loadConfig } from './config';
const config = loadConfig();
export class LLMProvider {
private anthropic: Anthropic;
private openai: OpenAI;
constructor() {
this.anthropic = new Anthropic({
apiKey: config.providers.anthropic.api_key,
});
this.openai = new OpenAI({
apiKey: config.providers.openai.api_key,
});
}
async chat(provider: 'anthropic' | 'openai', prompt: string) {
if (provider === 'anthropic') {
const response = await this.anthropic.messages.create({
model: config.providers.anthropic.model,
max_tokens: config.providers.anthropic.max_tokens,
messages: [{ role: 'user', content: prompt }],
});
return response.content[0].text;
} else {
const response = await this.openai.chat.completions.create({
model: config.providers.openai.model,
messages: [{ role: 'user', content: prompt }],
});
return response.choices[0].message.content;
}
}
}
Solutions:
echo $ANTHROPIC_API_KEY # Should print key
Solutions:
const sleep = (ms: number) => new Promise((resolve) => setTimeout(resolve, ms));

async function retryWithBackoff<T>(fn: () => Promise<T>, maxRetries = 3): Promise<T> {
  for (let i = 0; i < maxRetries; i++) {
    try {
      return await fn();
    } catch (error: any) {
      if (error.status === 429 && i < maxRetries - 1) {
        await sleep(2 ** i * 1000); // 1s, 2s, 4s
      } else {
        throw error;
      }
    }
  }
  throw new Error('unreachable');
}
Solutions:
- Check that the `.env` file exists in the expected location
- Verify `.env` is actually loaded:

# Add to start script (skips comment lines)
export $(grep -v '^#' .env | xargs)

- Or load it in code with dotenv:
import dotenv from 'dotenv';
dotenv.config();
Solutions:
# AWS
aws secretsmanager list-secrets
# GCP
gcloud secrets list
# Vault
vault kv list secret/
- [ ] API key is valid and not expired
- [ ] Environment variable is set correctly
- [ ] Key has required permissions
- [ ] Not hitting rate limits
- [ ] Network connectivity is working
- [ ] Provider service is not down (check status page)
- [ ] Correct API endpoint URL
- [ ] SSL/TLS certificate is valid
Log API calls:
// mcp-server/src/middleware/logger.ts
import * as fs from 'fs';
export function logAPICall(
provider: string,
endpoint: string,
status: number,
timestamp: Date
) {
const logEntry = {
provider,
endpoint,
status,
timestamp: timestamp.toISOString(),
};
fs.appendFileSync(
'api-usage.log',
JSON.stringify(logEntry) + '\n'
);
}
Analyze logs:
# Count calls per provider
cat api-usage.log | jq -r '.provider' | sort | uniq -c
# Find failed requests
cat api-usage.log | jq 'select(.status >= 400)'
# Daily usage summary
cat api-usage.log | jq -r '.timestamp[:10]' | sort | uniq -c
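The same summaries can be computed in Python when `jq` is unavailable; here with inline sample data standing in for `api-usage.log`:

```python
import json
from collections import Counter

# Sample JSON-lines records in the format emitted by the logger above.
LOG = """\
{"provider": "anthropic", "endpoint": "/v1/messages", "status": 200, "timestamp": "2026-01-05T10:00:00Z"}
{"provider": "openai", "endpoint": "/v1/chat", "status": 429, "timestamp": "2026-01-05T10:01:00Z"}
{"provider": "anthropic", "endpoint": "/v1/messages", "status": 200, "timestamp": "2026-01-06T09:00:00Z"}
"""

def summarize(lines: str):
    """Count calls per provider and collect failed (>= 400) requests."""
    entries = [json.loads(line) for line in lines.splitlines() if line.strip()]
    per_provider = Counter(e["provider"] for e in entries)
    failures = [e for e in entries if e["status"] >= 400]
    return per_provider, failures

counts, failed = summarize(LOG)
print(counts["anthropic"], len(failed))  # 2 1
```

To analyze the real log, replace `LOG` with `open("api-usage.log").read()`.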
Estimate costs:
// mcp-server/src/utils/cost-estimator.ts
const PRICING = {
'claude-3-5-sonnet': {
input: 0.003 / 1000, // $3 per million input tokens
output: 0.015 / 1000, // $15 per million output tokens
},
'gpt-4o': {
input: 0.005 / 1000,
output: 0.015 / 1000,
},
};
export function estimateCost(
model: string,
inputTokens: number,
outputTokens: number
): number {
const pricing = PRICING[model];
return (
inputTokens * pricing.input +
outputTokens * pricing.output
);
}
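A worked example of the same calculation in Python, using the per-token rates from the snippet above (verify against current provider pricing before relying on these numbers):

```python
# Per-token rates copied from the TypeScript estimator above ($ per token).
PRICING = {
    "claude-3-5-sonnet": {"input": 0.003 / 1000, "output": 0.015 / 1000},
    "gpt-4o": {"input": 0.005 / 1000, "output": 0.015 / 1000},
}

def estimate_cost(model: str, input_tokens: int, output_tokens: int) -> float:
    """Dollar cost of one call: tokens times per-token rate, summed."""
    p = PRICING[model]
    return input_tokens * p["input"] + output_tokens * p["output"]

# 10,000 input + 2,000 output tokens on Claude:
# 10000 * $0.000003 + 2000 * $0.000015 = $0.03 + $0.03 = $0.06
print(round(estimate_cost("claude-3-5-sonnet", 10_000, 2_000), 4))  # 0.06
```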
Monthly cost report:
# Generate cost report (assumes each log entry also records a "cost" field)
cat api-usage.log | jq -r '.cost' | awk '{sum+=$1} END {print "Total: $" sum}'
Last Updated: January 2026 Version: 1.0.0
Built with ❤️ by the NerdCabal community