Compare commits

..

22 Commits

Author SHA1 Message Date
26e3f70cd9 feat: add sub-tasks for todos
Some checks are pending: CI/CD test waiting to run; CI/CD deploy blocked by required conditions; Security Scan jobs (Semgrep SAST, Trivy dependency scan, Gitleaks secret detection) waiting to run.
- Schema: add subtasks JSONB column to todos table with migration
- Backend: POST/PATCH/DELETE endpoints for todo subtasks
- Frontend: expandable subtask list under each todo item
- Toggle, add, delete subtasks inline
- Subtask count badge (completed/total)
- Clipboard icon in action buttons to open subtask panel
2026-01-30 17:32:57 +00:00
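Since the commit above stores subtasks in a single JSONB column on `todos`, toggling one entry and recomputing the completed/total badge can be done entirely on the decoded array. A minimal TypeScript sketch (field names such as `id`, `title`, and `completed` are assumptions, not taken from the diff):

```typescript
// Hypothetical shape of one entry in the todos.subtasks JSONB array.
interface Subtask {
  id: string;
  title: string;
  completed: boolean;
}

// Toggle one subtask immutably and recompute the "completed/total" badge.
function toggleSubtask(
  subtasks: Subtask[],
  id: string
): { subtasks: Subtask[]; badge: string } {
  const next = subtasks.map((s) =>
    s.id === id ? { ...s, completed: !s.completed } : s
  );
  const done = next.filter((s) => s.completed).length;
  return { subtasks: next, badge: `${done}/${next.length}` };
}
```

The PATCH endpoint would then write the returned array back to the JSONB column in one update.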
ad003730d3 feat: add Today View and Upcoming View for todos
Some checks failed: CI/CD (test, deploy) and all Security Scan jobs (Semgrep, Trivy, Gitleaks) were cancelled.
- Backend: add view=today and view=upcoming query params to GET /api/todos
- Today View shows todos due today + overdue, grouped by section
- Upcoming View shows future todos grouped by date
- Sidebar views section with Today badge count
- Mobile-friendly view selector
- Empty states for both views
- Today count refreshes on toggle/delete/create
2026-01-30 17:29:16 +00:00
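The `view=today` / `view=upcoming` split described above reduces to a date filter: Today is everything due today or overdue, Upcoming is everything strictly in the future. A pure-function sketch for illustration only (the real API filters server-side and may compare full timestamps; field names are assumed):

```typescript
interface Todo {
  title: string;
  dueDate: string | null; // ISO date, e.g. "2026-01-30"
}

// "today" = due today or overdue; "upcoming" = due after today.
// ISO date strings compare correctly with plain string comparison.
function filterByView(
  todos: Todo[],
  view: "today" | "upcoming",
  today: string
): Todo[] {
  return todos.filter((t) => {
    if (!t.dueDate) return false; // undated todos appear in neither view
    const due = t.dueDate.slice(0, 10);
    return view === "today" ? due <= today : due > today;
  });
}
```

The Upcoming view would additionally group the filtered result by `dueDate` before rendering.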
5ab38f1dd0 feat: enhance todo sections UX - section selector in add form, move-to-section on todos, inline add within sections
- AddTodoForm now shows section dropdown when viewing a project with sections
- TodoItem gains 'move to section' action button (list icon) with dropdown
- InlineSectionAdd component adds quick-add input at bottom of each section
- Moving between projects now clears sectionId
- updateTodo API type updated to include sectionId
2026-01-30 17:24:24 +00:00
3d8f465952 feat: add OWASP findings seed to entrypoint
Some checks failed: CI/CD (test, deploy) and all Security Scan jobs (Semgrep, Trivy, Gitleaks) were cancelled.
2026-01-30 17:23:03 +00:00
d625c593ea feat: OWASP security audit system (HQ-59 phase 1)
Some checks failed: CI/CD (test, deploy) and all Security Scan jobs (Semgrep, Trivy, Gitleaks) were cancelled.
2026-01-30 17:21:13 +00:00
04e421117a feat: comprehensive security seed with real OWASP findings from code audit
Some checks failed: CI/CD (test, deploy) and all Security Scan jobs (Semgrep, Trivy, Gitleaks) were cancelled.
- seed-all-security.ts: Real findings from inspecting all 4 repos
  - OWASP API Top 10 per project (40 scored risks with findings + mitigations)
  - Security checklists: 37 items per app + 22 infrastructure items
  - Audit entries with findings from OWASP data
- init-tables.sql: Fixed table schemas to match Drizzle
  - security_checklists (plural, boolean checked) replaces security_checklist
  - security_score_history with new columns (overall_score, category_scores, etc.)
  - Added todo_projects + todo_sections tables
- security.ts: Cleaned up — removed old conflicting endpoints
  - Posture endpoint now aggregates audits + OWASP + checklists
  - Dynamic import to avoid circular deps
2026-01-30 17:08:24 +00:00
b683bed4e2 fix: add todo_projects and todo_sections to init-tables.sql for production DB 2026-01-30 16:49:37 +00:00
84b42841d5 fix: remove duplicate securityScoreHistory schema, generate migration for todo projects/sections tables 2026-01-30 16:42:20 +00:00
5ec0ec66ab feat: add todo sections within projects (HQ-41)
- New todo_sections table (name, isCollapsed, sortOrder per project)
- Added sectionId FK to todos table
- Full CRUD API for /api/todos/sections with todo counts
- Updated todos API: filter/create/update with sectionId
- Frontend: collapsible section headers with rename/delete
- Grouped todos by section when viewing a project
- Add section inline form within project views
- Unsectioned todos shown at top of project view
- Section collapse state persisted via API
2026-01-30 16:39:55 +00:00
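The project-view rendering described above (unsectioned todos at the top, then todos grouped per section) can be sketched as a single grouping pass. This is an illustration under assumed field names, not the actual frontend code:

```typescript
interface Todo {
  title: string;
  sectionId: string | null; // null = unsectioned
}

// Bucket todos by section, keeping the unsectioned group first.
function groupBySection(
  todos: Todo[]
): { sectionId: string | null; todos: Todo[] }[] {
  // Pre-seed the null bucket so unsectioned todos always render at the top.
  const buckets = new Map<string | null, Todo[]>([[null, []]]);
  for (const t of todos) {
    if (!buckets.has(t.sectionId)) buckets.set(t.sectionId, []);
    buckets.get(t.sectionId)!.push(t);
  }
  return [...buckets.entries()]
    .filter(([id, ts]) => id !== null || ts.length > 0) // drop empty null group
    .map(([sectionId, ts]) => ({ sectionId, todos: ts }));
}
```

Map insertion order guarantees the section groups appear in first-seen order after the unsectioned group.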
3196b94cf1 feat: add todo projects/lists (HQ-40)
- New todo_projects table with name, color, icon, sort order
- Added projectId FK to todos table
- Full CRUD API for /api/todos/projects with todo count aggregation
- Updated todos API: filter by projectId, create/update with projectId
- Frontend: project sidebar with All/Inbox views + project list
- Project CRUD modal with icon picker and color palette
- Move-to-project dropdown on each todo item
- Mobile project selector dropdown
- Inbox view shows unassigned todos
2026-01-30 16:36:19 +00:00
ae671d5462 fix: align init-tables.sql with Drizzle schema for security tables 2026-01-30 16:19:49 +00:00
28477e0a99 fix: robust entrypoint script for reliable seed execution 2026-01-30 16:12:54 +00:00
cb711ecefd fix: update init-tables.sql for new security tables, remove broken seeds
- Add security_scans, security_checklists, owasp_scores tables to init SQL
- Migrate security_score_history to new schema (add columns)
- Remove seed scripts that reference deleted schema exports
- Remove seed from Dockerfile CMD (data populated via API)
2026-01-30 16:07:20 +00:00
68c35c627f feat: comprehensive security audit system
- New DB tables: security_scans, security_checklists, owasp_scores, security_score_history
- Security scan runner: npm audit, code pattern analysis, secret detection (clones repos, runs scans)
- OWASP API Security Top 10 scorecard per project with findings/mitigations editor
- Security checklists with per-category checkable items, severity badges, completion tracking
- Score history/trends with bar chart visualization and snapshot recording
- Create Fix Task button on every finding (creates queued HQ task with security tag)
- Comprehensive dashboard endpoint aggregating all security data
- Frontend: 5-tab layout (Overview, OWASP Top 10, Scans, Checklists, Trends)
- OWASP seed script with real assessments for all 4 apps
- Checklist templates for 8 security categories (38 items per project)
2026-01-30 16:04:07 +00:00
3c63d73419 feat: consolidated security seed + robust deployment
Some checks failed: CI/CD (test, deploy) and all Security Scan jobs (Semgrep, Trivy, Gitleaks) were cancelled.
- Created seed-all-security.ts: single script combining OWASP audits,
  category audits, checklist items (150+), and score history
- Each step has individual error handling (won't fail silently)
- Batch inserts with fallback to individual inserts
- Updated Dockerfile CMD to use consolidated seed script
- Cache buster v6 for forced rebuild
2026-01-30 15:48:00 +00:00
cd8877429a fix: add checklist seed to Dockerfile CMD + bump cache buster
Some checks failed: CI/CD (test, deploy) and all Security Scan jobs (Semgrep, Trivy, Gitleaks) were cancelled.
2026-01-30 15:17:52 +00:00
061618cfab feat: comprehensive security audit system - OWASP Top 10, checklist, score history, scan pipeline
Some checks failed: CI/CD (test, deploy) and all Security Scan jobs (Semgrep, Trivy, Gitleaks) were cancelled.
Phase 1: OWASP API Top 10 per API with real findings from code inspection
- Hammer Dashboard, Network App, Todo App, nKode all audited against 10 OWASP risks
- Per-API scorecards with visual grid, color-coded by status

Phase 2: Full security checklist
- 9 categories: Auth, Authz, Input Validation, Transport, Rate Limiting, etc
- Interactive checklist UI with click-to-cycle status
- Per-project checklist with progress tracking
- Comprehensive category audits (Auth, Data Protection, Logging, Infrastructure, etc)

Phase 3: Automated pipeline
- Semgrep SAST, Trivy dependency scan, Gitleaks secret detection
- Gitea Actions CI workflow (security-scan.yml)
- Scan results stored in DB and displayed in dashboard

Phase 4: Dashboard polish
- Overall security posture score with weighted calculation
- Score trend charts (SVG) with 7-day history
- Critical findings highlight section
- Score history snapshots API
- Tab-based navigation (Overview, Checklist, per-project)

New DB tables: security_score_history, security_checklist, security_scan_results
Seed data populated from real code review of all repos
2026-01-30 15:16:10 +00:00
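The "weighted calculation" behind the overall posture score is not shown in this diff. Purely as an illustration, a hypothetical blend of the inputs the history table records (OWASP score, checklist completion, scan issue count) might look like this; the weights here are invented, not the dashboard's actual values:

```typescript
// Hypothetical weighted posture score in 0..100.
// Inputs mirror the security_score_history columns: owasp_score,
// checklist_completion (percent), scan_issue_count.
function overallScore(
  owaspScore: number,
  checklistPct: number,
  scanIssueCount: number
): number {
  const scanComponent = Math.max(0, 100 - scanIssueCount); // each issue costs a point
  const raw = 0.5 * owaspScore + 0.3 * checklistPct + 0.2 * scanComponent;
  return Math.round(Math.min(100, Math.max(0, raw)));
}
```

Recording one such value per project per snapshot is what makes the 7-day SVG trend chart possible.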
797396497a feat: add OWASP API Security Top 10 audit for all 4 APIs
Some checks failed: CI/CD test and deploy were cancelled.
- Real code audit of Hammer Dashboard, Network App, Todo App, and nKode APIs
- Each API assessed against all 10 OWASP API Security risks with actual findings
- Frontend: OWASP scorecard component with visual grid showing pass/warn/critical
- Scorecard displayed prominently above regular category cards in project detail view
- Each finding has description, status, recommendation, and Create Fix Task support
- Added 'OWASP API Top 10' as category option in Add Audit modal
- Dark mode support throughout
2026-01-30 14:57:52 +00:00
8b8d56370e feat: add app health monitoring section
Some checks failed: CI/CD test and deploy were cancelled.
- Backend: GET /api/health/apps and POST /api/health/check endpoints
- Checks 8 apps (dashboard, network, todo, nkode, gitea)
- 30s caching to avoid hammering endpoints
- Frontend: Health widget on dashboard page
- Dedicated /health page with detailed status cards
- Sidebar nav with colored status dot indicator
- Dark mode support throughout
2026-01-30 14:19:55 +00:00
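The 30s cache mentioned above is the standard time-bounded memoization pattern: reuse the last result while it is fresh, refetch once it expires. A minimal sketch (the clock is injected here for testability; the real endpoint presumably just uses `Date.now()`):

```typescript
// Cache a single value for ttlMs; refetch only after it goes stale.
function makeCache<T>(ttlMs: number, now: () => number) {
  let value: T | undefined;
  let fetchedAt = -Infinity;
  return (fetcher: () => T): T => {
    if (value !== undefined && now() - fetchedAt < ttlMs) {
      return value; // still fresh: skip hitting the upstream apps
    }
    value = fetcher();
    fetchedAt = now();
    return value;
  };
}
```

With a 30_000 ms TTL, repeated dashboard polls within half a minute hit the cache instead of re-probing all eight apps.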
30d1892a7d fix: reorder migrate-owner route before /:id param route
Some checks failed: CI/CD test succeeded in 23s; deploy failed after 2s.
2026-01-30 13:56:30 +00:00
73bf9a69b1 feat: add migrate-owner endpoint for todos reassignment
Some checks failed: CI/CD test succeeded in 28s; deploy failed after 2s.
2026-01-30 13:52:36 +00:00
8407dde30b fix: comprehensive init-tables.sql for all new tables (todos, security_audits, daily_summaries, task_comments) 2026-01-30 05:06:52 +00:00
36 changed files with 12754 additions and 1413 deletions


@@ -0,0 +1,176 @@
name: Security Scan

on:
  push:
    branches: [main]
  pull_request:
    branches: [main]
  schedule:
    - cron: '0 6 * * 1' # Weekly Monday 6am UTC

jobs:
  semgrep:
    name: SAST - Semgrep
    runs-on: ubuntu-latest
    container:
      image: semgrep/semgrep:latest
    steps:
      - uses: actions/checkout@v4
      - name: Run Semgrep
        run: |
          semgrep scan --config auto --json --output semgrep-results.json . || true
          echo "=== Semgrep Results ==="
          cat semgrep-results.json | python3 -c "
          import json, sys
          data = json.load(sys.stdin)
          results = data.get('results', [])
          print(f'Found {len(results)} issues')
          for r in results[:20]:
              sev = r.get('extra', {}).get('severity', 'unknown')
              msg = r.get('extra', {}).get('message', 'No message')[:100]
              path = r.get('path', '?')
              line = r.get('start', {}).get('line', '?')
              print(f' [{sev}] {path}:{line} - {msg}')
          " 2>/dev/null || echo "No results to parse"
      - name: Report to Dashboard
        if: always()
        run: |
          FINDINGS=$(cat semgrep-results.json 2>/dev/null | python3 -c "
          import json, sys
          data = json.load(sys.stdin)
          results = data.get('results', [])
          findings = []
          for r in results:
              findings.append({
                  'rule': r.get('check_id', 'unknown'),
                  'severity': r.get('extra', {}).get('severity', 'unknown'),
                  'message': r.get('extra', {}).get('message', '')[:200],
                  'path': r.get('path', ''),
                  'line': r.get('start', {}).get('line', 0),
              })
          print(json.dumps(findings))
          " 2>/dev/null || echo '[]')
          COUNT=$(echo "$FINDINGS" | python3 -c "import json,sys; print(len(json.load(sys.stdin)))" 2>/dev/null || echo 0)
          curl -s -X POST "$DASHBOARD_URL/api/security/scans" \
            -H "Authorization: Bearer $DASHBOARD_TOKEN" \
            -H "Content-Type: application/json" \
            -d "{
              \"projectName\": \"Hammer Dashboard\",
              \"scanType\": \"semgrep\",
              \"status\": \"completed\",
              \"findings\": $FINDINGS,
              \"summary\": {\"totalFindings\": $COUNT},
              \"triggeredBy\": \"ci\",
              \"commitSha\": \"$GITHUB_SHA\",
              \"branch\": \"$GITHUB_REF_NAME\"
            }" || true
        env:
          DASHBOARD_URL: https://dash.donovankelly.xyz
          DASHBOARD_TOKEN: ${{ secrets.DASHBOARD_TOKEN }}
  trivy:
    name: Dependency Scan - Trivy
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Run Trivy filesystem scan
        run: |
          curl -sfL https://raw.githubusercontent.com/aquasecurity/trivy/main/contrib/install.sh | sh -s -- -b /usr/local/bin
          trivy fs --format json --output trivy-results.json . || true
          echo "=== Trivy Results ==="
          trivy fs --severity HIGH,CRITICAL . || true
      - name: Report to Dashboard
        if: always()
        run: |
          FINDINGS=$(cat trivy-results.json 2>/dev/null | python3 -c "
          import json, sys
          data = json.load(sys.stdin)
          findings = []
          for result in data.get('Results', []):
              for vuln in result.get('Vulnerabilities', []):
                  findings.append({
                      'id': vuln.get('VulnerabilityID', ''),
                      'severity': vuln.get('Severity', ''),
                      'package': vuln.get('PkgName', ''),
                      'version': vuln.get('InstalledVersion', ''),
                      'fixedVersion': vuln.get('FixedVersion', ''),
                      'title': vuln.get('Title', '')[:200],
                  })
          print(json.dumps(findings[:50]))
          " 2>/dev/null || echo '[]')
          COUNT=$(echo "$FINDINGS" | python3 -c "import json,sys; print(len(json.load(sys.stdin)))" 2>/dev/null || echo 0)
          curl -s -X POST "$DASHBOARD_URL/api/security/scans" \
            -H "Authorization: Bearer $DASHBOARD_TOKEN" \
            -H "Content-Type: application/json" \
            -d "{
              \"projectName\": \"Hammer Dashboard\",
              \"scanType\": \"trivy\",
              \"status\": \"completed\",
              \"findings\": $FINDINGS,
              \"summary\": {\"totalFindings\": $COUNT},
              \"triggeredBy\": \"ci\",
              \"commitSha\": \"$GITHUB_SHA\",
              \"branch\": \"$GITHUB_REF_NAME\"
            }" || true
        env:
          DASHBOARD_URL: https://dash.donovankelly.xyz
          DASHBOARD_TOKEN: ${{ secrets.DASHBOARD_TOKEN }}
  gitleaks:
    name: Secret Detection - Gitleaks
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
        with:
          fetch-depth: 0
      - name: Run Gitleaks
        run: |
          curl -sSfL https://github.com/gitleaks/gitleaks/releases/download/v8.18.4/gitleaks_8.18.4_linux_x64.tar.gz | tar xz
          ./gitleaks detect --source . --report-format json --report-path gitleaks-results.json || true
          echo "=== Gitleaks Results ==="
          cat gitleaks-results.json | python3 -c "
          import json, sys
          data = json.load(sys.stdin)
          print(f'Found {len(data)} secrets')
          for r in data[:10]:
              print(f' [{r.get(\"RuleID\",\"?\")}] {r.get(\"File\",\"?\")}:{r.get(\"StartLine\",\"?\")}')
          " 2>/dev/null || echo "No leaks found"
      - name: Report to Dashboard
        if: always()
        run: |
          FINDINGS=$(cat gitleaks-results.json 2>/dev/null | python3 -c "
          import json, sys
          data = json.load(sys.stdin)
          findings = []
          for r in data:
              findings.append({
                  'ruleId': r.get('RuleID', ''),
                  'file': r.get('File', ''),
                  'line': r.get('StartLine', 0),
                  'commit': r.get('Commit', '')[:12],
                  'author': r.get('Author', ''),
              })
          print(json.dumps(findings[:50]))
          " 2>/dev/null || echo '[]')
          COUNT=$(echo "$FINDINGS" | python3 -c "import json,sys; print(len(json.load(sys.stdin)))" 2>/dev/null || echo 0)
          curl -s -X POST "$DASHBOARD_URL/api/security/scans" \
            -H "Authorization: Bearer $DASHBOARD_TOKEN" \
            -H "Content-Type: application/json" \
            -d "{
              \"projectName\": \"Hammer Dashboard\",
              \"scanType\": \"gitleaks\",
              \"status\": \"completed\",
              \"findings\": $FINDINGS,
              \"summary\": {\"totalFindings\": $COUNT},
              \"triggeredBy\": \"ci\",
              \"commitSha\": \"$GITHUB_SHA\",
              \"branch\": \"$GITHUB_REF_NAME\"
            }" || true
        env:
          DASHBOARD_URL: https://dash.donovankelly.xyz
          DASHBOARD_TOKEN: ${{ secrets.DASHBOARD_TOKEN }}
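All three jobs above POST the same payload shape to `/api/security/scans`. A minimal type guard the receiving endpoint might apply before inserting a row; the field list is inferred from the curl bodies in the workflow, and this guard itself is a sketch, not the dashboard's actual validation:

```typescript
// Payload shape each CI job reports, inferred from the workflow's -d bodies.
interface ScanReport {
  projectName: string;
  scanType: string;      // "semgrep" | "trivy" | "gitleaks"
  status: string;        // e.g. "completed"
  findings: unknown[];
  summary: { totalFindings: number };
  triggeredBy?: string;
  commitSha?: string;
  branch?: string;
}

// Narrow an unknown request body to a ScanReport, rejecting malformed posts.
function isScanReport(x: any): x is ScanReport {
  return (
    typeof x?.projectName === "string" &&
    typeof x?.scanType === "string" &&
    typeof x?.status === "string" &&
    Array.isArray(x?.findings) &&
    typeof x?.summary?.totalFindings === "number"
  );
}
```

Rejecting bad bodies early keeps a flaky CI run (empty `$FINDINGS`, failed JSON parse) from writing garbage rows into `security_scans`.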


@@ -1,15 +1,18 @@
FROM oven/bun:1 AS base
WORKDIR /app
# Install postgresql-client for SQL fallback
RUN apt-get update && apt-get install -y postgresql-client && rm -rf /var/lib/apt/lists/*
# Install dependencies
COPY package.json bun.lock* ./
RUN bun install --frozen-lockfile 2>/dev/null || bun install
# Copy source
# Copy source and init script
COPY . .
# Generate migrations and run
# Cache buster: 2026-01-30-v11-owasp-findings
EXPOSE 3100
RUN apt-get update && apt-get install -y postgresql-client && rm -rf /var/lib/apt/lists/*
COPY init-todos.sql /app/init-todos.sql
CMD ["sh", "-c", "echo 'Running init SQL...' && psql \"$DATABASE_URL\" -f /app/init-todos.sql 2>&1 && echo 'Init SQL done' && echo 'Running db:push...' && yes | bun run db:push 2>&1; echo 'db:push exit code:' $? && echo 'Starting server...' && bun run start"]
COPY entrypoint.sh /app/entrypoint.sh
RUN chmod +x /app/entrypoint.sh
CMD ["/app/entrypoint.sh"]


@@ -0,0 +1,164 @@
CREATE TYPE "public"."scan_severity" AS ENUM('critical', 'high', 'medium', 'low', 'info');--> statement-breakpoint
CREATE TYPE "public"."scan_tool" AS ENUM('npm_audit', 'bun_audit', 'semgrep', 'gitleaks', 'trivy', 'eslint_security', 'custom');--> statement-breakpoint
CREATE TYPE "public"."security_audit_status" AS ENUM('strong', 'needs_improvement', 'critical');--> statement-breakpoint
CREATE TYPE "public"."security_checklist_status" AS ENUM('pass', 'fail', 'partial', 'not_applicable', 'not_checked');--> statement-breakpoint
CREATE TYPE "public"."todo_priority" AS ENUM('high', 'medium', 'low', 'none');--> statement-breakpoint
CREATE TABLE "daily_summaries" (
"id" uuid PRIMARY KEY DEFAULT gen_random_uuid() NOT NULL,
"date" text NOT NULL,
"content" text NOT NULL,
"highlights" jsonb DEFAULT '[]'::jsonb,
"stats" jsonb DEFAULT '{}'::jsonb,
"created_at" timestamp with time zone DEFAULT now() NOT NULL,
"updated_at" timestamp with time zone DEFAULT now() NOT NULL,
CONSTRAINT "daily_summaries_date_unique" UNIQUE("date")
);
--> statement-breakpoint
CREATE TABLE "owasp_scores" (
"id" uuid PRIMARY KEY DEFAULT gen_random_uuid() NOT NULL,
"project_name" text NOT NULL,
"risk_id" text NOT NULL,
"risk_name" text NOT NULL,
"score" integer DEFAULT 0 NOT NULL,
"status" text DEFAULT 'not_assessed' NOT NULL,
"findings" jsonb DEFAULT '[]'::jsonb,
"mitigations" jsonb DEFAULT '[]'::jsonb,
"last_assessed" timestamp with time zone DEFAULT now() NOT NULL,
"created_at" timestamp with time zone DEFAULT now() NOT NULL,
"updated_at" timestamp with time zone DEFAULT now() NOT NULL
);
--> statement-breakpoint
CREATE TABLE "security_audits" (
"id" uuid PRIMARY KEY DEFAULT gen_random_uuid() NOT NULL,
"project_name" text NOT NULL,
"category" text NOT NULL,
"findings" jsonb DEFAULT '[]'::jsonb,
"score" integer DEFAULT 0 NOT NULL,
"last_audited" timestamp with time zone DEFAULT now() NOT NULL,
"created_at" timestamp with time zone DEFAULT now() NOT NULL,
"updated_at" timestamp with time zone DEFAULT now() NOT NULL
);
--> statement-breakpoint
CREATE TABLE "security_checklist" (
"id" uuid PRIMARY KEY DEFAULT gen_random_uuid() NOT NULL,
"project_name" text NOT NULL,
"checklist_category" text NOT NULL,
"item" text NOT NULL,
"status" "security_checklist_status" DEFAULT 'not_checked' NOT NULL,
"notes" text,
"checked_by" text,
"checked_at" timestamp with time zone,
"created_at" timestamp with time zone DEFAULT now() NOT NULL,
"updated_at" timestamp with time zone DEFAULT now() NOT NULL
);
--> statement-breakpoint
CREATE TABLE "security_checklists" (
"id" uuid PRIMARY KEY DEFAULT gen_random_uuid() NOT NULL,
"project_name" text NOT NULL,
"category" text NOT NULL,
"item" text NOT NULL,
"description" text,
"checked" boolean DEFAULT false NOT NULL,
"checked_by" text,
"checked_at" timestamp with time zone,
"severity" text DEFAULT 'medium' NOT NULL,
"notes" text,
"created_at" timestamp with time zone DEFAULT now() NOT NULL,
"updated_at" timestamp with time zone DEFAULT now() NOT NULL
);
--> statement-breakpoint
CREATE TABLE "security_scan_results" (
"id" uuid PRIMARY KEY DEFAULT gen_random_uuid() NOT NULL,
"project_name" text NOT NULL,
"scan_type" text NOT NULL,
"scan_status" text DEFAULT 'pending' NOT NULL,
"findings" jsonb DEFAULT '[]'::jsonb,
"summary" jsonb DEFAULT '{}'::jsonb,
"triggered_by" text,
"commit_sha" text,
"branch" text,
"duration" integer,
"started_at" timestamp with time zone,
"completed_at" timestamp with time zone,
"created_at" timestamp with time zone DEFAULT now() NOT NULL
);
--> statement-breakpoint
CREATE TABLE "security_scans" (
"id" uuid PRIMARY KEY DEFAULT gen_random_uuid() NOT NULL,
"project_name" text NOT NULL,
"tool" text NOT NULL,
"status" text DEFAULT 'running' NOT NULL,
"issues" jsonb DEFAULT '[]'::jsonb,
"summary" jsonb DEFAULT '{}'::jsonb,
"raw_output" text,
"duration_ms" integer,
"started_at" timestamp with time zone DEFAULT now() NOT NULL,
"completed_at" timestamp with time zone,
"created_at" timestamp with time zone DEFAULT now() NOT NULL
);
--> statement-breakpoint
CREATE TABLE "security_score_history" (
"id" uuid PRIMARY KEY DEFAULT gen_random_uuid() NOT NULL,
"project_name" text NOT NULL,
"overall_score" integer NOT NULL,
"category_scores" jsonb DEFAULT '{}'::jsonb,
"owasp_score" integer,
"scan_issue_count" integer DEFAULT 0,
"checklist_completion" integer DEFAULT 0,
"recorded_at" timestamp with time zone DEFAULT now() NOT NULL
);
--> statement-breakpoint
CREATE TABLE "task_comments" (
"id" uuid PRIMARY KEY DEFAULT gen_random_uuid() NOT NULL,
"task_id" uuid NOT NULL,
"author_id" text,
"author_name" text NOT NULL,
"content" text NOT NULL,
"created_at" timestamp with time zone DEFAULT now() NOT NULL
);
--> statement-breakpoint
CREATE TABLE "todo_projects" (
"id" uuid PRIMARY KEY DEFAULT gen_random_uuid() NOT NULL,
"user_id" text NOT NULL,
"name" text NOT NULL,
"color" text DEFAULT '#6b7280' NOT NULL,
"icon" text DEFAULT '📁',
"sort_order" integer DEFAULT 0 NOT NULL,
"created_at" timestamp with time zone DEFAULT now() NOT NULL,
"updated_at" timestamp with time zone DEFAULT now() NOT NULL
);
--> statement-breakpoint
CREATE TABLE "todo_sections" (
"id" uuid PRIMARY KEY DEFAULT gen_random_uuid() NOT NULL,
"project_id" uuid NOT NULL,
"name" text NOT NULL,
"is_collapsed" boolean DEFAULT false NOT NULL,
"sort_order" integer DEFAULT 0 NOT NULL,
"created_at" timestamp with time zone DEFAULT now() NOT NULL,
"updated_at" timestamp with time zone DEFAULT now() NOT NULL
);
--> statement-breakpoint
CREATE TABLE "todos" (
"id" uuid PRIMARY KEY DEFAULT gen_random_uuid() NOT NULL,
"user_id" text NOT NULL,
"title" text NOT NULL,
"description" text,
"is_completed" boolean DEFAULT false NOT NULL,
"priority" "todo_priority" DEFAULT 'none' NOT NULL,
"category" text,
"project_id" uuid,
"section_id" uuid,
"due_date" timestamp with time zone,
"completed_at" timestamp with time zone,
"sort_order" integer DEFAULT 0 NOT NULL,
"created_at" timestamp with time zone DEFAULT now() NOT NULL,
"updated_at" timestamp with time zone DEFAULT now() NOT NULL
);
--> statement-breakpoint
ALTER TABLE "tasks" ADD COLUMN "estimated_hours" integer;--> statement-breakpoint
ALTER TABLE "tasks" ADD COLUMN "tags" jsonb DEFAULT '[]'::jsonb;--> statement-breakpoint
ALTER TABLE "tasks" ADD COLUMN "recurrence" jsonb;--> statement-breakpoint
ALTER TABLE "task_comments" ADD CONSTRAINT "task_comments_task_id_tasks_id_fk" FOREIGN KEY ("task_id") REFERENCES "public"."tasks"("id") ON DELETE cascade ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "todo_sections" ADD CONSTRAINT "todo_sections_project_id_todo_projects_id_fk" FOREIGN KEY ("project_id") REFERENCES "public"."todo_projects"("id") ON DELETE cascade ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "todos" ADD CONSTRAINT "todos_project_id_todo_projects_id_fk" FOREIGN KEY ("project_id") REFERENCES "public"."todo_projects"("id") ON DELETE set null ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "todos" ADD CONSTRAINT "todos_section_id_todo_sections_id_fk" FOREIGN KEY ("section_id") REFERENCES "public"."todo_sections"("id") ON DELETE set null ON UPDATE no action;


@@ -0,0 +1,15 @@
CREATE TYPE "public"."owasp_finding_severity" AS ENUM('critical', 'high', 'medium', 'low', 'info');--> statement-breakpoint
CREATE TYPE "public"."owasp_finding_status" AS ENUM('open', 'mitigated', 'accepted', 'false_positive');--> statement-breakpoint
CREATE TABLE "security_findings" (
"id" uuid PRIMARY KEY DEFAULT gen_random_uuid() NOT NULL,
"app_name" text NOT NULL,
"category" text NOT NULL,
"severity" text DEFAULT 'medium' NOT NULL,
"title" text NOT NULL,
"description" text NOT NULL,
"recommendation" text NOT NULL,
"status" text DEFAULT 'open' NOT NULL,
"owasp_id" text NOT NULL,
"created_at" timestamp with time zone DEFAULT now() NOT NULL,
"updated_at" timestamp with time zone DEFAULT now() NOT NULL
);


@@ -0,0 +1 @@
ALTER TABLE "todos" ADD COLUMN IF NOT EXISTS "subtasks" jsonb DEFAULT '[]'::jsonb;

(File diff suppressed because it is too large.)

(File diff suppressed because it is too large.)


@@ -15,6 +15,20 @@
"when": 1769670192629,
"tag": "0001_mighty_callisto",
"breakpoints": true
},
{
"idx": 2,
"version": "7",
"when": 1769791306443,
"tag": "0002_vengeful_junta",
"breakpoints": true
},
{
"idx": 3,
"version": "7",
"when": 1769793555260,
"tag": "0003_tricky_random",
"breakpoints": true
}
]
}

backend/entrypoint.sh (new file, 21 lines)

@@ -0,0 +1,21 @@
#!/bin/sh
set -e
echo "=== Hammer Queue Backend Startup ==="
echo "Waiting for database..."
sleep 5
echo "--- Running init SQL ---"
psql "$DATABASE_URL" -f /app/init-tables.sql 2>&1 || echo "Init SQL had issues (continuing)"
echo "--- Running db:push ---"
yes | bun run db:push 2>&1 || echo "db:push had issues (continuing)"
echo "--- Seeding security data ---"
bun run src/seed-all-security.ts 2>&1 || echo "Security seed had issues (continuing)"
echo "--- Seeding OWASP findings ---"
bun run src/seed-owasp-findings.ts 2>&1 || echo "OWASP findings seed had issues (continuing)"
echo "--- Starting server ---"
exec bun run start

backend/init-tables.sql (new file, 207 lines)

@@ -0,0 +1,207 @@
-- Create all tables and enums
-- Idempotent: safe to run multiple times
-- ═══ Enums ═══
DO $$ BEGIN
CREATE TYPE todo_priority AS ENUM ('high', 'medium', 'low', 'none');
EXCEPTION WHEN duplicate_object THEN null;
END $$;
DO $$ BEGIN
CREATE TYPE security_audit_status AS ENUM ('strong', 'needs_improvement', 'critical');
EXCEPTION WHEN duplicate_object THEN null;
END $$;
DO $$ BEGIN
CREATE TYPE security_checklist_status AS ENUM ('pass', 'fail', 'partial', 'not_applicable', 'not_checked');
EXCEPTION WHEN duplicate_object THEN null;
END $$;
DO $$ BEGIN
CREATE TYPE scan_tool AS ENUM ('npm_audit', 'bun_audit', 'semgrep', 'gitleaks', 'trivy', 'eslint_security', 'custom');
EXCEPTION WHEN duplicate_object THEN null;
END $$;
DO $$ BEGIN
CREATE TYPE scan_severity AS ENUM ('critical', 'high', 'medium', 'low', 'info');
EXCEPTION WHEN duplicate_object THEN null;
END $$;
-- ═══ Todos ═══
CREATE TABLE IF NOT EXISTS todos (
id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
user_id TEXT NOT NULL,
title TEXT NOT NULL,
description TEXT,
is_completed BOOLEAN NOT NULL DEFAULT false,
priority todo_priority NOT NULL DEFAULT 'none',
category TEXT,
due_date TIMESTAMPTZ,
completed_at TIMESTAMPTZ,
sort_order INTEGER NOT NULL DEFAULT 0,
created_at TIMESTAMPTZ NOT NULL DEFAULT now(),
updated_at TIMESTAMPTZ NOT NULL DEFAULT now()
);
-- ═══ Todo Projects ═══
CREATE TABLE IF NOT EXISTS todo_projects (
id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
user_id TEXT NOT NULL,
name TEXT NOT NULL,
color TEXT DEFAULT '#6366f1',
icon TEXT DEFAULT '📁',
description TEXT,
is_favorite BOOLEAN NOT NULL DEFAULT false,
sort_order INTEGER NOT NULL DEFAULT 0,
created_at TIMESTAMPTZ NOT NULL DEFAULT now(),
updated_at TIMESTAMPTZ NOT NULL DEFAULT now()
);
-- Add project_id to todos if not exists
DO $$ BEGIN
ALTER TABLE todos ADD COLUMN project_id UUID REFERENCES todo_projects(id) ON DELETE SET NULL;
EXCEPTION WHEN duplicate_column THEN null;
END $$;
-- ═══ Todo Sections ═══
CREATE TABLE IF NOT EXISTS todo_sections (
id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
project_id UUID NOT NULL REFERENCES todo_projects(id) ON DELETE CASCADE,
name TEXT NOT NULL,
sort_order INTEGER NOT NULL DEFAULT 0,
is_collapsed BOOLEAN NOT NULL DEFAULT false,
created_at TIMESTAMPTZ NOT NULL DEFAULT now(),
updated_at TIMESTAMPTZ NOT NULL DEFAULT now()
);
-- Add section_id to todos if not exists
DO $$ BEGIN
ALTER TABLE todos ADD COLUMN section_id UUID REFERENCES todo_sections(id) ON DELETE SET NULL;
EXCEPTION WHEN duplicate_column THEN null;
END $$;
-- ═══ Security Audits ═══
CREATE TABLE IF NOT EXISTS security_audits (
id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
project_name TEXT NOT NULL,
category TEXT NOT NULL,
findings JSONB DEFAULT '[]'::jsonb,
score INTEGER NOT NULL DEFAULT 0,
last_audited TIMESTAMPTZ NOT NULL DEFAULT now(),
created_at TIMESTAMPTZ NOT NULL DEFAULT now(),
updated_at TIMESTAMPTZ NOT NULL DEFAULT now()
);
-- ═══ Daily Summaries ═══
CREATE TABLE IF NOT EXISTS daily_summaries (
id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
date TEXT NOT NULL UNIQUE,
content TEXT NOT NULL,
highlights JSONB DEFAULT '[]'::jsonb,
stats JSONB DEFAULT '{}'::jsonb,
created_at TIMESTAMPTZ NOT NULL DEFAULT now(),
updated_at TIMESTAMPTZ NOT NULL DEFAULT now()
);
-- ═══ Task Comments ═══
CREATE TABLE IF NOT EXISTS task_comments (
id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
task_id UUID NOT NULL REFERENCES tasks(id) ON DELETE CASCADE,
author_id TEXT,
author_name TEXT NOT NULL,
content TEXT NOT NULL,
created_at TIMESTAMPTZ NOT NULL DEFAULT now()
);
-- ═══ Security Scans ═══
CREATE TABLE IF NOT EXISTS security_scans (
id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
project_name TEXT NOT NULL,
tool TEXT NOT NULL,
status TEXT NOT NULL DEFAULT 'running',
issues JSONB DEFAULT '[]'::jsonb,
summary JSONB DEFAULT '{}'::jsonb,
raw_output TEXT,
duration_ms INTEGER,
started_at TIMESTAMPTZ NOT NULL DEFAULT now(),
completed_at TIMESTAMPTZ,
created_at TIMESTAMPTZ NOT NULL DEFAULT now()
);
-- ═══ Security Checklists (new — plural, boolean checked) ═══
CREATE TABLE IF NOT EXISTS security_checklists (
id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
project_name TEXT NOT NULL,
category TEXT NOT NULL,
item TEXT NOT NULL,
description TEXT,
checked BOOLEAN NOT NULL DEFAULT false,
checked_by TEXT,
checked_at TIMESTAMPTZ,
severity TEXT NOT NULL DEFAULT 'medium',
notes TEXT,
created_at TIMESTAMPTZ NOT NULL DEFAULT now(),
updated_at TIMESTAMPTZ NOT NULL DEFAULT now()
);
-- Drop old singular table name if exists (was security_checklist with status enum)
DROP TABLE IF EXISTS security_checklist;
-- ═══ OWASP API Security Top 10 Scores ═══
CREATE TABLE IF NOT EXISTS owasp_scores (
id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
project_name TEXT NOT NULL,
risk_id TEXT NOT NULL,
risk_name TEXT NOT NULL,
score INTEGER NOT NULL DEFAULT 0,
status TEXT NOT NULL DEFAULT 'not_assessed',
findings JSONB DEFAULT '[]'::jsonb,
mitigations JSONB DEFAULT '[]'::jsonb,
last_assessed TIMESTAMPTZ NOT NULL DEFAULT now(),
created_at TIMESTAMPTZ NOT NULL DEFAULT now(),
updated_at TIMESTAMPTZ NOT NULL DEFAULT now()
);
-- ═══ Security Score History (trends) ═══
-- New schema with overall_score, category_scores, owasp_score, scan_issue_count, checklist_completion
DROP TABLE IF EXISTS security_score_history;
CREATE TABLE security_score_history (
id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
project_name TEXT NOT NULL,
overall_score INTEGER NOT NULL DEFAULT 0,
category_scores JSONB DEFAULT '{}'::jsonb,
owasp_score INTEGER,
scan_issue_count INTEGER DEFAULT 0,
checklist_completion INTEGER DEFAULT 0,
recorded_at TIMESTAMPTZ NOT NULL DEFAULT now()
);
-- ═══ Security Scan Results (legacy compat) ═══
CREATE TABLE IF NOT EXISTS security_scan_results (
id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
project_name TEXT NOT NULL,
scan_type TEXT NOT NULL,
scan_status TEXT NOT NULL DEFAULT 'pending',
findings JSONB DEFAULT '[]'::jsonb,
summary JSONB DEFAULT '{}'::jsonb,
triggered_by TEXT,
commit_sha TEXT,
branch TEXT,
duration INTEGER,
started_at TIMESTAMPTZ,
completed_at TIMESTAMPTZ,
created_at TIMESTAMPTZ NOT NULL DEFAULT now()
);

backend/package-lock.json generated Normal file

@@ -0,0 +1,805 @@
{
"name": "hammer-queue-backend",
"version": "0.1.0",
"lockfileVersion": 3,
"requires": true,
"packages": {
"": {
"name": "hammer-queue-backend",
"version": "0.1.0",
"dependencies": {
"@elysiajs/cors": "^1.2.0",
"better-auth": "^1.4.17",
"drizzle-orm": "^0.44.2",
"elysia": "^1.2.25",
"postgres": "^3.4.5"
},
"devDependencies": {
"@types/bun": "latest",
"drizzle-kit": "^0.31.1"
}
},
"node_modules/@better-auth/core": {
"version": "1.4.17",
"dependencies": {
"@standard-schema/spec": "^1.0.0",
"zod": "^4.3.5"
},
"peerDependencies": {
"@better-auth/utils": "0.3.0",
"@better-fetch/fetch": "1.1.21",
"better-call": "1.1.8",
"jose": "^6.1.0",
"kysely": "^0.28.5",
"nanostores": "^1.0.1"
}
},
"node_modules/@better-auth/telemetry": {
"version": "1.4.17",
"dependencies": {
"@better-auth/utils": "0.3.0",
"@better-fetch/fetch": "1.1.21"
},
"peerDependencies": {
"@better-auth/core": "1.4.17"
}
},
"node_modules/@better-auth/utils": {
"version": "0.3.0",
"license": "MIT"
},
"node_modules/@better-fetch/fetch": {
"version": "1.1.21"
},
"node_modules/@borewit/text-codec": {
"version": "0.2.1",
"license": "MIT",
"peer": true,
"funding": {
"type": "github",
"url": "https://github.com/sponsors/Borewit"
}
},
"node_modules/@drizzle-team/brocli": {
"version": "0.10.2",
"devOptional": true,
"license": "Apache-2.0"
},
"node_modules/@elysiajs/cors": {
"version": "1.4.1",
"license": "MIT",
"peerDependencies": {
"elysia": ">= 1.4.0"
}
},
"node_modules/@esbuild-kit/core-utils": {
"version": "3.3.2",
"devOptional": true,
"license": "MIT",
"dependencies": {
"esbuild": "~0.18.20",
"source-map-support": "^0.5.21"
}
},
"node_modules/@esbuild-kit/core-utils/node_modules/esbuild": {
"version": "0.18.20",
"devOptional": true,
"hasInstallScript": true,
"license": "MIT",
"bin": {
"esbuild": "bin/esbuild"
},
"engines": {
"node": ">=12"
},
"optionalDependencies": {
"@esbuild/android-arm": "0.18.20",
"@esbuild/android-arm64": "0.18.20",
"@esbuild/android-x64": "0.18.20",
"@esbuild/darwin-arm64": "0.18.20",
"@esbuild/darwin-x64": "0.18.20",
"@esbuild/freebsd-arm64": "0.18.20",
"@esbuild/freebsd-x64": "0.18.20",
"@esbuild/linux-arm": "0.18.20",
"@esbuild/linux-arm64": "0.18.20",
"@esbuild/linux-ia32": "0.18.20",
"@esbuild/linux-loong64": "0.18.20",
"@esbuild/linux-mips64el": "0.18.20",
"@esbuild/linux-ppc64": "0.18.20",
"@esbuild/linux-riscv64": "0.18.20",
"@esbuild/linux-s390x": "0.18.20",
"@esbuild/linux-x64": "0.18.20",
"@esbuild/netbsd-x64": "0.18.20",
"@esbuild/openbsd-x64": "0.18.20",
"@esbuild/sunos-x64": "0.18.20",
"@esbuild/win32-arm64": "0.18.20",
"@esbuild/win32-ia32": "0.18.20",
"@esbuild/win32-x64": "0.18.20"
}
},
"node_modules/@esbuild-kit/core-utils/node_modules/esbuild/node_modules/@esbuild/linux-x64": {
"version": "0.18.20",
"cpu": [
"x64"
],
"dev": true,
"license": "MIT",
"optional": true,
"os": [
"linux"
],
"engines": {
"node": ">=12"
}
},
"node_modules/@esbuild-kit/esm-loader": {
"version": "2.6.5",
"devOptional": true,
"license": "MIT",
"dependencies": {
"@esbuild-kit/core-utils": "^3.3.2",
"get-tsconfig": "^4.7.0"
}
},
"node_modules/@esbuild/linux-x64": {
"version": "0.25.12",
"cpu": [
"x64"
],
"dev": true,
"license": "MIT",
"optional": true,
"os": [
"linux"
],
"engines": {
"node": ">=18"
}
},
"node_modules/@noble/ciphers": {
"version": "2.1.1",
"license": "MIT",
"engines": {
"node": ">= 20.19.0"
},
"funding": {
"url": "https://paulmillr.com/funding/"
}
},
"node_modules/@noble/hashes": {
"version": "2.0.1",
"license": "MIT",
"engines": {
"node": ">= 20.19.0"
},
"funding": {
"url": "https://paulmillr.com/funding/"
}
},
"node_modules/@sinclair/typebox": {
"version": "0.34.48",
"license": "MIT",
"peer": true
},
"node_modules/@standard-schema/spec": {
"version": "1.1.0",
"license": "MIT"
},
"node_modules/@tokenizer/inflate": {
"version": "0.4.1",
"license": "MIT",
"peer": true,
"dependencies": {
"debug": "^4.4.3",
"token-types": "^6.1.1"
},
"engines": {
"node": ">=18"
},
"funding": {
"type": "github",
"url": "https://github.com/sponsors/Borewit"
}
},
"node_modules/@tokenizer/token": {
"version": "0.3.0",
"license": "MIT",
"peer": true
},
"node_modules/@types/bun": {
"version": "1.3.8",
"resolved": "https://registry.npmjs.org/@types/bun/-/bun-1.3.8.tgz",
"integrity": "sha512-3LvWJ2q5GerAXYxO2mffLTqOzEu5qnhEAlh48Vnu8WQfnmSwbgagjGZV6BoHKJztENYEDn6QmVd949W4uESRJA==",
"devOptional": true,
"license": "MIT",
"dependencies": {
"bun-types": "1.3.8"
}
},
"node_modules/@types/node": {
"version": "25.1.0",
"devOptional": true,
"license": "MIT",
"dependencies": {
"undici-types": "~7.16.0"
}
},
"node_modules/better-auth": {
"version": "1.4.17",
"license": "MIT",
"dependencies": {
"@better-auth/core": "1.4.17",
"@better-auth/telemetry": "1.4.17",
"@better-auth/utils": "0.3.0",
"@better-fetch/fetch": "1.1.21",
"@noble/ciphers": "^2.0.0",
"@noble/hashes": "^2.0.0",
"better-call": "1.1.8",
"defu": "^6.1.4",
"jose": "^6.1.0",
"kysely": "^0.28.5",
"nanostores": "^1.0.1",
"zod": "^4.3.5"
},
"peerDependencies": {
"@lynx-js/react": "*",
"@prisma/client": "^5.0.0 || ^6.0.0 || ^7.0.0",
"@sveltejs/kit": "^2.0.0",
"@tanstack/react-start": "^1.0.0",
"@tanstack/solid-start": "^1.0.0",
"better-sqlite3": "^12.0.0",
"drizzle-kit": ">=0.31.4",
"drizzle-orm": ">=0.41.0",
"mongodb": "^6.0.0 || ^7.0.0",
"mysql2": "^3.0.0",
"next": "^14.0.0 || ^15.0.0 || ^16.0.0",
"pg": "^8.0.0",
"prisma": "^5.0.0 || ^6.0.0 || ^7.0.0",
"react": "^18.0.0 || ^19.0.0",
"react-dom": "^18.0.0 || ^19.0.0",
"solid-js": "^1.0.0",
"svelte": "^4.0.0 || ^5.0.0",
"vitest": "^2.0.0 || ^3.0.0 || ^4.0.0",
"vue": "^3.0.0"
},
"peerDependenciesMeta": {
"@lynx-js/react": {
"optional": true
},
"@prisma/client": {
"optional": true
},
"@sveltejs/kit": {
"optional": true
},
"@tanstack/react-start": {
"optional": true
},
"@tanstack/solid-start": {
"optional": true
},
"better-sqlite3": {
"optional": true
},
"drizzle-kit": {
"optional": true
},
"drizzle-orm": {
"optional": true
},
"mongodb": {
"optional": true
},
"mysql2": {
"optional": true
},
"next": {
"optional": true
},
"pg": {
"optional": true
},
"prisma": {
"optional": true
},
"react": {
"optional": true
},
"react-dom": {
"optional": true
},
"solid-js": {
"optional": true
},
"svelte": {
"optional": true
},
"vitest": {
"optional": true
},
"vue": {
"optional": true
}
}
},
"node_modules/better-call": {
"version": "1.1.8",
"license": "MIT",
"dependencies": {
"@better-auth/utils": "^0.3.0",
"@better-fetch/fetch": "^1.1.4",
"rou3": "^0.7.10",
"set-cookie-parser": "^2.7.1"
},
"peerDependencies": {
"zod": "^4.0.0"
},
"peerDependenciesMeta": {
"zod": {
"optional": true
}
}
},
"node_modules/buffer-from": {
"version": "1.1.2",
"devOptional": true,
"license": "MIT"
},
"node_modules/bun-types": {
"version": "1.3.8",
"resolved": "https://registry.npmjs.org/bun-types/-/bun-types-1.3.8.tgz",
"integrity": "sha512-fL99nxdOWvV4LqjmC+8Q9kW3M4QTtTR1eePs94v5ctGqU8OeceWrSUaRw3JYb7tU3FkMIAjkueehrHPPPGKi5Q==",
"devOptional": true,
"license": "MIT",
"dependencies": {
"@types/node": "*"
}
},
"node_modules/cookie": {
"version": "1.1.1",
"license": "MIT",
"engines": {
"node": ">=18"
},
"funding": {
"type": "opencollective",
"url": "https://opencollective.com/express"
}
},
"node_modules/debug": {
"version": "4.4.3",
"license": "MIT",
"dependencies": {
"ms": "^2.1.3"
},
"engines": {
"node": ">=6.0"
},
"peerDependenciesMeta": {
"supports-color": {
"optional": true
}
}
},
"node_modules/defu": {
"version": "6.1.4",
"license": "MIT"
},
"node_modules/drizzle-kit": {
"version": "0.31.8",
"devOptional": true,
"license": "MIT",
"dependencies": {
"@drizzle-team/brocli": "^0.10.2",
"@esbuild-kit/esm-loader": "^2.5.5",
"esbuild": "^0.25.4",
"esbuild-register": "^3.5.0"
},
"bin": {
"drizzle-kit": "bin.cjs"
}
},
"node_modules/drizzle-orm": {
"version": "0.44.7",
"license": "Apache-2.0",
"peerDependencies": {
"@aws-sdk/client-rds-data": ">=3",
"@cloudflare/workers-types": ">=4",
"@electric-sql/pglite": ">=0.2.0",
"@libsql/client": ">=0.10.0",
"@libsql/client-wasm": ">=0.10.0",
"@neondatabase/serverless": ">=0.10.0",
"@op-engineering/op-sqlite": ">=2",
"@opentelemetry/api": "^1.4.1",
"@planetscale/database": ">=1.13",
"@prisma/client": "*",
"@tidbcloud/serverless": "*",
"@types/better-sqlite3": "*",
"@types/pg": "*",
"@types/sql.js": "*",
"@upstash/redis": ">=1.34.7",
"@vercel/postgres": ">=0.8.0",
"@xata.io/client": "*",
"better-sqlite3": ">=7",
"bun-types": "*",
"expo-sqlite": ">=14.0.0",
"gel": ">=2",
"knex": "*",
"kysely": "*",
"mysql2": ">=2",
"pg": ">=8",
"postgres": ">=3",
"sql.js": ">=1",
"sqlite3": ">=5"
},
"peerDependenciesMeta": {
"@aws-sdk/client-rds-data": {
"optional": true
},
"@cloudflare/workers-types": {
"optional": true
},
"@electric-sql/pglite": {
"optional": true
},
"@libsql/client": {
"optional": true
},
"@libsql/client-wasm": {
"optional": true
},
"@neondatabase/serverless": {
"optional": true
},
"@op-engineering/op-sqlite": {
"optional": true
},
"@opentelemetry/api": {
"optional": true
},
"@planetscale/database": {
"optional": true
},
"@prisma/client": {
"optional": true
},
"@tidbcloud/serverless": {
"optional": true
},
"@types/better-sqlite3": {
"optional": true
},
"@types/pg": {
"optional": true
},
"@types/sql.js": {
"optional": true
},
"@upstash/redis": {
"optional": true
},
"@vercel/postgres": {
"optional": true
},
"@xata.io/client": {
"optional": true
},
"better-sqlite3": {
"optional": true
},
"bun-types": {
"optional": true
},
"expo-sqlite": {
"optional": true
},
"gel": {
"optional": true
},
"knex": {
"optional": true
},
"kysely": {
"optional": true
},
"mysql2": {
"optional": true
},
"pg": {
"optional": true
},
"postgres": {
"optional": true
},
"prisma": {
"optional": true
},
"sql.js": {
"optional": true
},
"sqlite3": {
"optional": true
}
}
},
"node_modules/elysia": {
"version": "1.4.22",
"license": "MIT",
"dependencies": {
"cookie": "^1.1.1",
"exact-mirror": "^0.2.6",
"fast-decode-uri-component": "^1.0.1",
"memoirist": "^0.4.0"
},
"peerDependencies": {
"@sinclair/typebox": ">= 0.34.0 < 1",
"@types/bun": ">= 1.2.0",
"exact-mirror": ">= 0.0.9",
"file-type": ">= 20.0.0",
"openapi-types": ">= 12.0.0",
"typescript": ">= 5.0.0"
},
"peerDependenciesMeta": {
"@types/bun": {
"optional": true
},
"typescript": {
"optional": true
}
}
},
"node_modules/esbuild": {
"version": "0.25.12",
"devOptional": true,
"hasInstallScript": true,
"license": "MIT",
"bin": {
"esbuild": "bin/esbuild"
},
"engines": {
"node": ">=18"
},
"optionalDependencies": {
"@esbuild/aix-ppc64": "0.25.12",
"@esbuild/android-arm": "0.25.12",
"@esbuild/android-arm64": "0.25.12",
"@esbuild/android-x64": "0.25.12",
"@esbuild/darwin-arm64": "0.25.12",
"@esbuild/darwin-x64": "0.25.12",
"@esbuild/freebsd-arm64": "0.25.12",
"@esbuild/freebsd-x64": "0.25.12",
"@esbuild/linux-arm": "0.25.12",
"@esbuild/linux-arm64": "0.25.12",
"@esbuild/linux-ia32": "0.25.12",
"@esbuild/linux-loong64": "0.25.12",
"@esbuild/linux-mips64el": "0.25.12",
"@esbuild/linux-ppc64": "0.25.12",
"@esbuild/linux-riscv64": "0.25.12",
"@esbuild/linux-s390x": "0.25.12",
"@esbuild/linux-x64": "0.25.12",
"@esbuild/netbsd-arm64": "0.25.12",
"@esbuild/netbsd-x64": "0.25.12",
"@esbuild/openbsd-arm64": "0.25.12",
"@esbuild/openbsd-x64": "0.25.12",
"@esbuild/openharmony-arm64": "0.25.12",
"@esbuild/sunos-x64": "0.25.12",
"@esbuild/win32-arm64": "0.25.12",
"@esbuild/win32-ia32": "0.25.12",
"@esbuild/win32-x64": "0.25.12"
}
},
"node_modules/esbuild-register": {
"version": "3.6.0",
"devOptional": true,
"license": "MIT",
"dependencies": {
"debug": "^4.3.4"
},
"peerDependencies": {
"esbuild": ">=0.12 <1"
}
},
"node_modules/exact-mirror": {
"version": "0.2.6",
"license": "MIT",
"peerDependencies": {
"@sinclair/typebox": "^0.34.15"
},
"peerDependenciesMeta": {
"@sinclair/typebox": {
"optional": true
}
}
},
"node_modules/fast-decode-uri-component": {
"version": "1.0.1",
"license": "MIT"
},
"node_modules/file-type": {
"version": "21.3.0",
"license": "MIT",
"peer": true,
"dependencies": {
"@tokenizer/inflate": "^0.4.1",
"strtok3": "^10.3.4",
"token-types": "^6.1.1",
"uint8array-extras": "^1.4.0"
},
"engines": {
"node": ">=20"
},
"funding": {
"url": "https://github.com/sindresorhus/file-type?sponsor=1"
}
},
"node_modules/get-tsconfig": {
"version": "4.13.0",
"devOptional": true,
"license": "MIT",
"dependencies": {
"resolve-pkg-maps": "^1.0.0"
},
"funding": {
"url": "https://github.com/privatenumber/get-tsconfig?sponsor=1"
}
},
"node_modules/ieee754": {
"version": "1.2.1",
"funding": [
{
"type": "github",
"url": "https://github.com/sponsors/feross"
},
{
"type": "patreon",
"url": "https://www.patreon.com/feross"
},
{
"type": "consulting",
"url": "https://feross.org/support"
}
],
"license": "BSD-3-Clause",
"peer": true
},
"node_modules/jose": {
"version": "6.1.3",
"license": "MIT",
"funding": {
"url": "https://github.com/sponsors/panva"
}
},
"node_modules/kysely": {
"version": "0.28.10",
"license": "MIT",
"engines": {
"node": ">=20.0.0"
}
},
"node_modules/memoirist": {
"version": "0.4.0",
"license": "MIT"
},
"node_modules/ms": {
"version": "2.1.3",
"license": "MIT"
},
"node_modules/nanostores": {
"version": "1.1.0",
"funding": [
{
"type": "github",
"url": "https://github.com/sponsors/ai"
}
],
"license": "MIT",
"engines": {
"node": "^20.0.0 || >=22.0.0"
}
},
"node_modules/openapi-types": {
"version": "12.1.3",
"license": "MIT",
"peer": true
},
"node_modules/postgres": {
"version": "3.4.8",
"license": "Unlicense",
"engines": {
"node": ">=12"
},
"funding": {
"type": "individual",
"url": "https://github.com/sponsors/porsager"
}
},
"node_modules/resolve-pkg-maps": {
"version": "1.0.0",
"devOptional": true,
"license": "MIT",
"funding": {
"url": "https://github.com/privatenumber/resolve-pkg-maps?sponsor=1"
}
},
"node_modules/rou3": {
"version": "0.7.12",
"license": "MIT"
},
"node_modules/set-cookie-parser": {
"version": "2.7.2",
"license": "MIT"
},
"node_modules/source-map": {
"version": "0.6.1",
"devOptional": true,
"license": "BSD-3-Clause",
"engines": {
"node": ">=0.10.0"
}
},
"node_modules/source-map-support": {
"version": "0.5.21",
"devOptional": true,
"license": "MIT",
"dependencies": {
"buffer-from": "^1.0.0",
"source-map": "^0.6.0"
}
},
"node_modules/strtok3": {
"version": "10.3.4",
"license": "MIT",
"peer": true,
"dependencies": {
"@tokenizer/token": "^0.3.0"
},
"engines": {
"node": ">=18"
},
"funding": {
"type": "github",
"url": "https://github.com/sponsors/Borewit"
}
},
"node_modules/token-types": {
"version": "6.1.2",
"license": "MIT",
"peer": true,
"dependencies": {
"@borewit/text-codec": "^0.2.1",
"@tokenizer/token": "^0.3.0",
"ieee754": "^1.2.1"
},
"engines": {
"node": ">=14.16"
},
"funding": {
"type": "github",
"url": "https://github.com/sponsors/Borewit"
}
},
"node_modules/uint8array-extras": {
"version": "1.5.0",
"license": "MIT",
"peer": true,
"engines": {
"node": ">=18"
},
"funding": {
"url": "https://github.com/sponsors/sindresorhus"
}
},
"node_modules/undici-types": {
"version": "7.16.0",
"devOptional": true,
"license": "MIT"
},
"node_modules/zod": {
"version": "4.3.6",
"license": "MIT",
"funding": {
"url": "https://github.com/sponsors/colinhacks"
}
}
}
}


@@ -9,8 +9,7 @@
"db:push": "drizzle-kit push",
"db:studio": "drizzle-kit studio",
"test": "bun test",
-    "test:ci": "bun test --bail",
-    "seed:security": "bun run src/seed-security.ts"
+    "test:ci": "bun test --bail"
},
"dependencies": {
"@elysiajs/cors": "^1.2.0",


@@ -131,6 +131,7 @@ export interface SecurityFinding {
title: string;
description: string;
recommendation: string;
taskId?: string;
}
export const securityAudits = pgTable("security_audits", {
@@ -147,6 +148,52 @@ export const securityAudits = pgTable("security_audits", {
export type SecurityAudit = typeof securityAudits.$inferSelect;
export type NewSecurityAudit = typeof securityAudits.$inferInsert;
// ─── Security Checklist ───
export const securityChecklistStatusEnum = pgEnum("security_checklist_status", [
"pass",
"fail",
"partial",
"not_applicable",
"not_checked",
]);
export const securityChecklist = pgTable("security_checklist", {
id: uuid("id").defaultRandom().primaryKey(),
projectName: text("project_name").notNull(),
category: text("checklist_category").notNull(),
item: text("item").notNull(),
status: securityChecklistStatusEnum("status").notNull().default("not_checked"),
notes: text("notes"),
checkedBy: text("checked_by"),
checkedAt: timestamp("checked_at", { withTimezone: true }),
createdAt: timestamp("created_at", { withTimezone: true }).defaultNow().notNull(),
updatedAt: timestamp("updated_at", { withTimezone: true }).defaultNow().notNull(),
});
export type SecurityChecklistItem = typeof securityChecklist.$inferSelect;
export type NewSecurityChecklistItem = typeof securityChecklist.$inferInsert;
// ─── Security Scan Results ───
export const securityScanResults = pgTable("security_scan_results", {
id: uuid("id").defaultRandom().primaryKey(),
projectName: text("project_name").notNull(),
scanType: text("scan_type").notNull(), // semgrep, trivy, gitleaks
status: text("scan_status").notNull().default("pending"), // pending, running, completed, failed
findings: jsonb("findings").$type<any[]>().default([]),
summary: jsonb("summary").$type<Record<string, any>>().default({}),
triggeredBy: text("triggered_by"), // ci, manual
commitSha: text("commit_sha"),
branch: text("branch"),
duration: integer("duration"), // seconds
startedAt: timestamp("started_at", { withTimezone: true }),
completedAt: timestamp("completed_at", { withTimezone: true }),
createdAt: timestamp("created_at", { withTimezone: true }).defaultNow().notNull(),
});
export type SecurityScanResult = typeof securityScanResults.$inferSelect;
// ─── Daily Summaries ───
export interface SummaryHighlight {
@@ -184,6 +231,41 @@ export const todoPriorityEnum = pgEnum("todo_priority", [
"none",
]);
export const todoProjects = pgTable("todo_projects", {
id: uuid("id").defaultRandom().primaryKey(),
userId: text("user_id").notNull(),
name: text("name").notNull(),
color: text("color").notNull().default("#6b7280"), // hex color
icon: text("icon").default("📁"), // emoji icon
sortOrder: integer("sort_order").notNull().default(0),
createdAt: timestamp("created_at", { withTimezone: true }).defaultNow().notNull(),
updatedAt: timestamp("updated_at", { withTimezone: true }).defaultNow().notNull(),
});
export type TodoProject = typeof todoProjects.$inferSelect;
export type NewTodoProject = typeof todoProjects.$inferInsert;
export const todoSections = pgTable("todo_sections", {
id: uuid("id").defaultRandom().primaryKey(),
projectId: uuid("project_id").notNull().references(() => todoProjects.id, { onDelete: "cascade" }),
name: text("name").notNull(),
isCollapsed: boolean("is_collapsed").notNull().default(false),
sortOrder: integer("sort_order").notNull().default(0),
createdAt: timestamp("created_at", { withTimezone: true }).defaultNow().notNull(),
updatedAt: timestamp("updated_at", { withTimezone: true }).defaultNow().notNull(),
});
export type TodoSection = typeof todoSections.$inferSelect;
export type NewTodoSection = typeof todoSections.$inferInsert;
export interface TodoSubtask {
id: string;
title: string;
completed: boolean;
completedAt?: string;
createdAt: string;
}
export const todos = pgTable("todos", {
id: uuid("id").defaultRandom().primaryKey(),
userId: text("user_id").notNull(),
@@ -192,8 +274,11 @@ export const todos = pgTable("todos", {
isCompleted: boolean("is_completed").notNull().default(false),
priority: todoPriorityEnum("priority").notNull().default("none"),
category: text("category"),
projectId: uuid("project_id").references(() => todoProjects.id, { onDelete: "set null" }),
sectionId: uuid("section_id").references(() => todoSections.id, { onDelete: "set null" }),
dueDate: timestamp("due_date", { withTimezone: true }),
completedAt: timestamp("completed_at", { withTimezone: true }),
subtasks: jsonb("subtasks").$type<TodoSubtask[]>().default([]),
sortOrder: integer("sort_order").notNull().default(0),
createdAt: timestamp("created_at", { withTimezone: true }).defaultNow().notNull(),
updatedAt: timestamp("updated_at", { withTimezone: true }).defaultNow().notNull(),
@@ -202,6 +287,142 @@ export const todos = pgTable("todos", {
export type Todo = typeof todos.$inferSelect;
export type NewTodo = typeof todos.$inferInsert;
// ─── Security Scans (automated) ───
export const scanToolEnum = pgEnum("scan_tool", [
"npm_audit",
"bun_audit",
"semgrep",
"gitleaks",
"trivy",
"eslint_security",
"custom",
]);
export const scanSeverityEnum = pgEnum("scan_severity", [
"critical",
"high",
"medium",
"low",
"info",
]);
export interface ScanIssue {
id: string;
severity: "critical" | "high" | "medium" | "low" | "info";
title: string;
description: string;
file?: string;
line?: number;
rule?: string;
fixAvailable?: boolean;
}
export const securityScans = pgTable("security_scans", {
id: uuid("id").defaultRandom().primaryKey(),
projectName: text("project_name").notNull(),
tool: text("tool").notNull(), // npm_audit, semgrep, gitleaks, etc.
status: text("status").notNull().default("running"), // running, completed, failed
issues: jsonb("issues").$type<ScanIssue[]>().default([]),
summary: jsonb("summary").$type<Record<string, number>>().default({}), // {critical: 2, high: 5, ...}
rawOutput: text("raw_output"),
durationMs: integer("duration_ms"),
startedAt: timestamp("started_at", { withTimezone: true }).defaultNow().notNull(),
completedAt: timestamp("completed_at", { withTimezone: true }),
createdAt: timestamp("created_at", { withTimezone: true }).defaultNow().notNull(),
});
export type SecurityScan = typeof securityScans.$inferSelect;
export type NewSecurityScan = typeof securityScans.$inferInsert;
// ─── Security Checklists ───
export const securityChecklists = pgTable("security_checklists", {
id: uuid("id").defaultRandom().primaryKey(),
projectName: text("project_name").notNull(),
category: text("category").notNull(), // OWASP API1, Authentication, etc.
item: text("item").notNull(),
description: text("description"),
checked: boolean("checked").notNull().default(false),
checkedBy: text("checked_by"),
checkedAt: timestamp("checked_at", { withTimezone: true }),
severity: text("severity").notNull().default("medium"), // critical, high, medium, low
notes: text("notes"),
createdAt: timestamp("created_at", { withTimezone: true }).defaultNow().notNull(),
updatedAt: timestamp("updated_at", { withTimezone: true }).defaultNow().notNull(),
});
export type SecurityChecklist = typeof securityChecklists.$inferSelect;
export type NewSecurityChecklist = typeof securityChecklists.$inferInsert;
// ─── OWASP API Security Top 10 Scores ───
export const owaspScores = pgTable("owasp_scores", {
id: uuid("id").defaultRandom().primaryKey(),
projectName: text("project_name").notNull(),
riskId: text("risk_id").notNull(), // API1, API2, ... API10
riskName: text("risk_name").notNull(),
score: integer("score").notNull().default(0), // 0-100
status: text("status").notNull().default("not_assessed"), // not_assessed, pass, partial, fail
findings: jsonb("findings").$type<string[]>().default([]),
mitigations: jsonb("mitigations").$type<string[]>().default([]),
lastAssessed: timestamp("last_assessed", { withTimezone: true }).defaultNow().notNull(),
createdAt: timestamp("created_at", { withTimezone: true }).defaultNow().notNull(),
updatedAt: timestamp("updated_at", { withTimezone: true }).defaultNow().notNull(),
});
export type OwaspScore = typeof owaspScores.$inferSelect;
export type NewOwaspScore = typeof owaspScores.$inferInsert;
// ─── Security Score History (trends) ───
export const securityScoreHistory = pgTable("security_score_history", {
id: uuid("id").defaultRandom().primaryKey(),
projectName: text("project_name").notNull(),
overallScore: integer("overall_score").notNull(),
categoryScores: jsonb("category_scores").$type<Record<string, number>>().default({}),
owaspScore: integer("owasp_score"),
scanIssueCount: integer("scan_issue_count").default(0),
checklistCompletion: integer("checklist_completion").default(0), // percentage
recordedAt: timestamp("recorded_at", { withTimezone: true }).defaultNow().notNull(),
});
export type SecurityScoreHistory = typeof securityScoreHistory.$inferSelect;
// ─── OWASP API Top 10 Security Findings ───
export const owaspFindingSeverityEnum = pgEnum("owasp_finding_severity", [
"critical",
"high",
"medium",
"low",
"info",
]);
export const owaspFindingStatusEnum = pgEnum("owasp_finding_status", [
"open",
"mitigated",
"accepted",
"false_positive",
]);
export const securityFindings = pgTable("security_findings", {
id: uuid("id").defaultRandom().primaryKey(),
appName: text("app_name").notNull(),
category: text("category").notNull(), // OWASP API Top 10 category
severity: text("severity").notNull().default("medium"), // critical, high, medium, low, info
title: text("title").notNull(),
description: text("description").notNull(),
recommendation: text("recommendation").notNull(),
status: text("status").notNull().default("open"), // open, mitigated, accepted, false_positive
owaspId: text("owasp_id").notNull(), // API1, API2, ... API10
createdAt: timestamp("created_at", { withTimezone: true }).defaultNow().notNull(),
updatedAt: timestamp("updated_at", { withTimezone: true }).defaultNow().notNull(),
});
export type SecurityFindingRow = typeof securityFindings.$inferSelect;
export type NewSecurityFinding = typeof securityFindings.$inferInsert;
// ─── BetterAuth tables ───
export const users = pgTable("users", {


@@ -7,7 +7,12 @@ import { commentRoutes } from "./routes/comments";
import { activityRoutes } from "./routes/activity";
import { summaryRoutes } from "./routes/summaries";
import { securityRoutes } from "./routes/security";
import { securityScanRoutes } from "./routes/security-scans";
import { securityFindingsRoutes } from "./routes/security-findings";
import { todoRoutes } from "./routes/todos";
import { todoProjectRoutes } from "./routes/todo-projects";
import { todoSectionRoutes } from "./routes/todo-sections";
import { healthRoutes } from "./routes/health";
import { auth } from "./lib/auth";
import { db } from "./db";
import { tasks, users } from "./db/schema";
@@ -124,8 +129,13 @@ const app = new Elysia()
.use(projectRoutes)
.use(adminRoutes)
.use(securityRoutes)
.use(securityScanRoutes)
.use(securityFindingsRoutes)
.use(summaryRoutes)
.use(todoProjectRoutes)
.use(todoSectionRoutes)
.use(todoRoutes)
.use(healthRoutes)
// Current user info (role, etc.)
.get("/api/me", async ({ request }) => {


@@ -0,0 +1,154 @@
import { Elysia } from "elysia";
import { auth } from "../lib/auth";
const BEARER_TOKEN = process.env.API_BEARER_TOKEN || "hammer-dev-token";
async function requireSessionOrBearer(request: Request, headers: Record<string, string | undefined>) {
const authHeader = headers["authorization"];
if (authHeader === `Bearer ${BEARER_TOKEN}`) {
return { userId: "bearer" };
}
try {
const session = await auth.api.getSession({ headers: request.headers });
if (session?.user) return { userId: session.user.id };
} catch {}
throw new Error("Unauthorized");
}
// Apps to monitor
const APPS = [
{ name: "Hammer Dashboard", url: "https://dash.donovankelly.xyz", type: "web" as const },
{ name: "Network App API", url: "https://api.thenetwork.donovankelly.xyz", type: "api" as const },
{ name: "Network App Web", url: "https://app.thenetwork.donovankelly.xyz", type: "web" as const },
{ name: "Todo App API", url: "https://api.todo.donovankelly.xyz", type: "api" as const },
{ name: "Todo App Web", url: "https://app.todo.donovankelly.xyz", type: "web" as const },
{ name: "nKode Frontend", url: "https://app.nkode.donovankelly.xyz", type: "web" as const },
{ name: "nKode Backend", url: "https://api.nkode.donovankelly.xyz", type: "api" as const },
{ name: "Gitea", url: "https://git.infra.donovankelly.xyz", type: "web" as const },
];
interface AppHealthResult {
name: string;
url: string;
type: "web" | "api";
status: "healthy" | "degraded" | "unhealthy";
responseTime: number;
httpStatus: number | null;
lastChecked: string;
error?: string;
}
// Cache
let cachedResults: AppHealthResult[] | null = null;
let cacheTimestamp = 0;
const CACHE_TTL = 30_000; // 30 seconds
async function checkApp(app: typeof APPS[number]): Promise<AppHealthResult> {
const start = Date.now();
const checkUrl = app.type === "api"
? (() => {
// Try common API health endpoints
if (app.url.includes("api.thenetwork")) return `${app.url}/api/auth/session`;
if (app.url.includes("api.todo")) return `${app.url}/api/auth/session`;
if (app.url.includes("api.nkode")) return `${app.url}/api/auth/session`;
return `${app.url}/`;
})()
: app.url;
try {
const controller = new AbortController();
const timeout = setTimeout(() => controller.abort(), 10_000);
const res = await fetch(checkUrl, {
signal: controller.signal,
redirect: "follow",
headers: { "User-Agent": "HammerHealthCheck/1.0" },
});
clearTimeout(timeout);
const responseTime = Date.now() - start;
const httpStatus = res.status;
let status: AppHealthResult["status"];
if (httpStatus >= 200 && httpStatus < 300) {
status = responseTime > 5000 ? "degraded" : "healthy";
} else if (httpStatus >= 300 && httpStatus < 500) {
status = "degraded";
} else {
status = "unhealthy";
}
return {
name: app.name,
url: app.url,
type: app.type,
status,
responseTime,
httpStatus,
lastChecked: new Date().toISOString(),
};
} catch (err: any) {
const responseTime = Date.now() - start;
return {
name: app.name,
url: app.url,
type: app.type,
status: "unhealthy",
responseTime,
httpStatus: null,
lastChecked: new Date().toISOString(),
error: err.name === "AbortError" ? "Timeout (10s)" : (err.message || "Connection failed"),
};
}
}
async function checkAllApps(): Promise<AppHealthResult[]> {
const results = await Promise.all(APPS.map(checkApp));
cachedResults = results;
cacheTimestamp = Date.now();
return results;
}
export const healthRoutes = new Elysia({ prefix: "/api/health" })
.onError(({ error, set }) => {
const msg = (error as any)?.message || String(error);
if (msg === "Unauthorized") {
set.status = 401;
return { error: "Unauthorized" };
}
console.error("Health route error:", msg);
set.status = 500;
return { error: "Internal server error" };
})
// GET cached health status (or fresh if cache expired)
.get("/apps", async ({ request, headers }) => {
await requireSessionOrBearer(request, headers);
if (cachedResults && Date.now() - cacheTimestamp < CACHE_TTL) {
return {
apps: cachedResults,
cached: true,
cacheAge: Date.now() - cacheTimestamp,
};
}
const results = await checkAllApps();
return {
apps: results,
cached: false,
cacheAge: 0,
};
})
// POST force fresh check
.post("/check", async ({ request, headers }) => {
await requireSessionOrBearer(request, headers);
const results = await checkAllApps();
return {
apps: results,
cached: false,
cacheAge: 0,
};
});


@@ -0,0 +1,195 @@
import { Elysia, t } from "elysia";
import { db } from "../db";
import { securityFindings } from "../db/schema";
import { eq, asc, desc, and, sql } from "drizzle-orm";
import { auth } from "../lib/auth";
const BEARER_TOKEN = process.env.API_BEARER_TOKEN || "hammer-dev-token";
async function requireSessionOrBearer(
request: Request,
headers: Record<string, string | undefined>
) {
const authHeader = headers["authorization"];
if (authHeader === `Bearer ${BEARER_TOKEN}`) return;
try {
const session = await auth.api.getSession({ headers: request.headers });
if (session) return;
} catch {}
throw new Error("Unauthorized");
}
export const securityFindingsRoutes = new Elysia({ prefix: "/api/security/findings" })
.onError(({ error, set }) => {
const msg = (error as any)?.message || String(error);
if (msg === "Unauthorized") {
set.status = 401;
return { error: "Unauthorized" };
}
if (msg === "Finding not found") {
set.status = 404;
return { error: "Finding not found" };
}
console.error("Security findings route error:", msg);
set.status = 500;
return { error: "Internal server error" };
})
// GET all findings (with optional filters)
.get("/", async ({ request, headers, query }) => {
await requireSessionOrBearer(request, headers);
const conditions: any[] = [];
if (query.appName) conditions.push(eq(securityFindings.appName, query.appName));
if (query.severity) conditions.push(eq(securityFindings.severity, query.severity));
if (query.status) conditions.push(eq(securityFindings.status, query.status));
if (query.owaspId) conditions.push(eq(securityFindings.owaspId, query.owaspId));
const results = conditions.length > 0
? await db.select().from(securityFindings).where(and(...conditions)).orderBy(asc(securityFindings.appName), asc(securityFindings.owaspId))
: await db.select().from(securityFindings).orderBy(asc(securityFindings.appName), asc(securityFindings.owaspId));
return results;
}, {
query: t.Object({
appName: t.Optional(t.String()),
severity: t.Optional(t.String()),
status: t.Optional(t.String()),
owaspId: t.Optional(t.String()),
}),
})
// GET summary (aggregated counts per app)
.get("/summary", async ({ request, headers }) => {
await requireSessionOrBearer(request, headers);
const all = await db.select().from(securityFindings);
// Group by app
const appMap: Record<string, {
total: number;
bySeverity: Record<string, number>;
byCategory: Record<string, { total: number; bySeverity: Record<string, number> }>;
byStatus: Record<string, number>;
}> = {};
for (const f of all) {
if (!appMap[f.appName]) {
appMap[f.appName] = { total: 0, bySeverity: {}, byCategory: {}, byStatus: {} };
}
const app = appMap[f.appName];
app.total++;
app.bySeverity[f.severity] = (app.bySeverity[f.severity] || 0) + 1;
app.byStatus[f.status] = (app.byStatus[f.status] || 0) + 1;
if (!app.byCategory[f.category]) {
app.byCategory[f.category] = { total: 0, bySeverity: {} };
}
app.byCategory[f.category].total++;
app.byCategory[f.category].bySeverity[f.severity] = (app.byCategory[f.category].bySeverity[f.severity] || 0) + 1;
}
// Compute risk score per app (lower is better)
const severityWeights: Record<string, number> = { critical: 10, high: 5, medium: 2, low: 1, info: 0 };
const summary = Object.entries(appMap).map(([appName, data]) => {
const riskScore = Object.entries(data.bySeverity).reduce(
(sum, [sev, count]) => sum + (severityWeights[sev] || 0) * count, 0
);
return { appName, ...data, riskScore };
});
// Overall stats
const totalFindings = all.length;
const totalCritical = all.filter(f => f.severity === "critical").length;
const totalHigh = all.filter(f => f.severity === "high").length;
const totalOpen = all.filter(f => f.status === "open").length;
return {
apps: summary,
overall: { totalFindings, totalCritical, totalHigh, totalOpen },
};
})
// POST create finding
.post("/", async ({ body, request, headers }) => {
await requireSessionOrBearer(request, headers);
const newFinding = await db
.insert(securityFindings)
.values({
appName: body.appName,
category: body.category,
severity: body.severity,
title: body.title,
description: body.description,
recommendation: body.recommendation,
status: body.status || "open",
owaspId: body.owaspId,
})
.returning();
return newFinding[0];
}, {
body: t.Object({
appName: t.String(),
category: t.String(),
severity: t.Union([
t.Literal("critical"),
t.Literal("high"),
t.Literal("medium"),
t.Literal("low"),
t.Literal("info"),
]),
title: t.String(),
description: t.String(),
recommendation: t.String(),
status: t.Optional(t.Union([
t.Literal("open"),
t.Literal("mitigated"),
t.Literal("accepted"),
t.Literal("false_positive"),
])),
owaspId: t.String(),
}),
})
// PATCH update finding
.patch("/:id", async ({ params, body, request, headers }) => {
await requireSessionOrBearer(request, headers);
const updates: Record<string, any> = { updatedAt: new Date() };
if (body.severity !== undefined) updates.severity = body.severity;
if (body.status !== undefined) updates.status = body.status;
if (body.title !== undefined) updates.title = body.title;
if (body.description !== undefined) updates.description = body.description;
if (body.recommendation !== undefined) updates.recommendation = body.recommendation;
const updated = await db
.update(securityFindings)
.set(updates)
.where(eq(securityFindings.id, params.id))
.returning();
if (!updated.length) {
throw new Error("Finding not found");
}
return updated[0];
}, {
params: t.Object({ id: t.String() }),
body: t.Object({
severity: t.Optional(t.Union([
t.Literal("critical"),
t.Literal("high"),
t.Literal("medium"),
t.Literal("low"),
t.Literal("info"),
])),
status: t.Optional(t.Union([
t.Literal("open"),
t.Literal("mitigated"),
t.Literal("accepted"),
t.Literal("false_positive"),
])),
title: t.Optional(t.String()),
description: t.Optional(t.String()),
recommendation: t.Optional(t.String()),
}),
})
// DELETE finding
.delete("/:id", async ({ params, request, headers }) => {
await requireSessionOrBearer(request, headers);
const deleted = await db
.delete(securityFindings)
.where(eq(securityFindings.id, params.id))
.returning();
if (!deleted.length) throw new Error("Finding not found");
return { success: true };
}, {
params: t.Object({ id: t.String() }),
});

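The `/summary` handler above folds per-severity counts into a single risk score. A hedged standalone sketch of that computation, with weights copied from the handler (critical 10, high 5, medium 2, low 1, info 0); `riskScore` is an illustrative name, not an export of the route file:

```typescript
// Weights mirror the /summary handler; lower total is better.
const SEVERITY_WEIGHTS: Record<string, number> = {
  critical: 10,
  high: 5,
  medium: 2,
  low: 1,
  info: 0,
};

// Sum of weight x count across severity buckets; unknown severities weigh 0.
function riskScore(bySeverity: Record<string, number>): number {
  return Object.entries(bySeverity).reduce(
    (sum, [sev, count]) => sum + (SEVERITY_WEIGHTS[sev] ?? 0) * count,
    0,
  );
}
```

The weighting deliberately makes one critical finding outweigh five lows, so apps are ranked by worst-case exposure rather than raw finding count.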
View File

@@ -0,0 +1,774 @@
import { Elysia, t } from "elysia";
import { db } from "../db";
import {
securityScans,
securityChecklists,
owaspScores,
securityScoreHistory,
securityAudits,
type ScanIssue,
} from "../db/schema";
import { eq, desc, asc, and, sql } from "drizzle-orm";
import { auth } from "../lib/auth";
const BEARER_TOKEN = process.env.API_BEARER_TOKEN || "hammer-dev-token";
async function requireAuth(
request: Request,
headers: Record<string, string | undefined>
) {
const authHeader = headers["authorization"];
if (authHeader === `Bearer ${BEARER_TOKEN}`) return;
try {
const session = await auth.api.getSession({ headers: request.headers });
if (session) return;
} catch {}
throw new Error("Unauthorized");
}
// ─── OWASP API Security Top 10 definitions ───
const OWASP_RISKS = [
{ id: "API1", name: "Broken Object Level Authorization (BOLA)" },
{ id: "API2", name: "Broken Authentication" },
{ id: "API3", name: "Broken Object Property Level Authorization" },
{ id: "API4", name: "Unrestricted Resource Consumption" },
{ id: "API5", name: "Broken Function Level Authorization" },
{ id: "API6", name: "Unrestricted Access to Sensitive Business Flows" },
{ id: "API7", name: "Server Side Request Forgery (SSRF)" },
{ id: "API8", name: "Security Misconfiguration" },
{ id: "API9", name: "Improper Inventory Management" },
{ id: "API10", name: "Unsafe Consumption of APIs" },
];
// ─── Default checklist items per category ───
const DEFAULT_CHECKLISTS: Record<string, { item: string; description: string; severity: string }[]> = {
"Authentication & Sessions": [
{ item: "Password policy enforced (12+ chars, complexity)", description: "Ensure passwords meet minimum length and complexity requirements", severity: "high" },
{ item: "MFA available and encouraged", description: "Multi-factor authentication is implemented for all users", severity: "high" },
{ item: "Session expiry configured", description: "Sessions expire after reasonable timeout (e.g., 7 days)", severity: "medium" },
{ item: "Secure cookie flags (HttpOnly, Secure, SameSite)", description: "Session cookies have all security flags set", severity: "high" },
{ item: "Brute force protection on login", description: "Account lockout or rate limiting on failed login attempts", severity: "critical" },
{ item: "Password reset flow secure", description: "Reset tokens are time-limited and single-use", severity: "high" },
],
"Authorization": [
{ item: "Object-level authorization on all endpoints", description: "Every API endpoint verifies user owns/can access the resource", severity: "critical" },
{ item: "Function-level authorization", description: "Admin functions restricted to admin role", severity: "critical" },
{ item: "Field-level authorization", description: "Sensitive fields filtered based on user role", severity: "high" },
{ item: "No IDOR vulnerabilities", description: "Direct object references validated against user permissions", severity: "critical" },
],
"Input Validation": [
{ item: "All inputs validated with schema", description: "Every endpoint uses type/schema validation (e.g., Elysia t.Object)", severity: "high" },
{ item: "SQL injection prevented", description: "ORM or parameterized queries used exclusively", severity: "critical" },
{ item: "XSS prevented", description: "User input sanitized before rendering, CSP headers set", severity: "critical" },
{ item: "Path traversal prevented", description: "File paths validated, no user-controlled path concatenation", severity: "high" },
{ item: "File upload validation", description: "Uploaded files checked for type, size, and content", severity: "high" },
],
"Transport & Data Protection": [
{ item: "HTTPS enforced on all endpoints", description: "All traffic served over TLS, HTTP redirects to HTTPS", severity: "critical" },
{ item: "TLS 1.2+ only", description: "Older TLS versions disabled", severity: "high" },
{ item: "Sensitive data encrypted at rest", description: "PII and credentials encrypted in database", severity: "high" },
{ item: "CORS properly configured", description: "Only allowed origins, no wildcards in production", severity: "high" },
{ item: "Security headers set", description: "CSP, X-Content-Type-Options, X-Frame-Options, HSTS", severity: "medium" },
],
"Rate Limiting & Abuse": [
{ item: "Global rate limiting", description: "Overall request rate limit per IP/user", severity: "high" },
{ item: "Auth endpoint rate limiting", description: "Stricter limits on login/register endpoints", severity: "critical" },
{ item: "API endpoint rate limiting", description: "Per-endpoint rate limits for expensive operations", severity: "medium" },
{ item: "Request size limits", description: "Maximum body size enforced", severity: "medium" },
],
"Error Handling": [
{ item: "No stack traces in production", description: "Error responses don't expose internal details", severity: "high" },
{ item: "Generic error messages", description: "Errors don't reveal system architecture or data", severity: "medium" },
{ item: "Proper HTTP status codes", description: "401/403/404/500 used correctly", severity: "low" },
],
"Logging & Monitoring": [
{ item: "Auth events logged", description: "Login, logout, failed attempts, password changes recorded", severity: "high" },
{ item: "Data access logged", description: "CRUD operations on sensitive data are audit-logged", severity: "high" },
{ item: "Anomaly detection", description: "Unusual patterns (mass exports, rapid auth failures) trigger alerts", severity: "medium" },
{ item: "Structured logging", description: "Logs are structured (JSON) with request IDs for correlation", severity: "medium" },
{ item: "External monitoring", description: "Uptime monitoring with alerts configured", severity: "high" },
],
"Infrastructure": [
{ item: "Firewall configured", description: "Only required ports open, UFW/iptables rules documented", severity: "critical" },
{ item: "SSH hardened", description: "Key-only auth, non-default port, fail2ban enabled", severity: "critical" },
{ item: "OS auto-updates enabled", description: "unattended-upgrades for security patches", severity: "high" },
{ item: "Container images scanned", description: "Docker images checked for vulnerabilities before deploy", severity: "medium" },
{ item: "Secrets management", description: "No hardcoded secrets, environment variables or vault used", severity: "critical" },
{ item: "Backup strategy implemented", description: "Automated database backups to offsite location", severity: "high" },
{ item: "Docker health checks", description: "HEALTHCHECK defined in Dockerfiles", severity: "low" },
],
};
export const securityScanRoutes = new Elysia({ prefix: "/api/security" })
.onError(({ error, set }) => {
const msg = (error as any)?.message || String(error);
if (msg === "Unauthorized") { set.status = 401; return { error: "Unauthorized" }; }
if (msg === "Not found") { set.status = 404; return { error: "Not found" }; }
console.error("Security scan route error:", msg);
set.status = 500;
return { error: "Internal server error" };
})
// ═══════════════════════════════════════════
// AUTOMATED SCANS
// ═══════════════════════════════════════════
// GET all scans (optionally filter by project)
.get("/scans", async ({ request, headers, query }) => {
await requireAuth(request, headers);
const conditions = [];
if (query.project) {
conditions.push(eq(securityScans.projectName, query.project));
}
const scans = await db
.select()
.from(securityScans)
.where(conditions.length ? and(...conditions) : undefined)
.orderBy(desc(securityScans.startedAt))
.limit(Number(query.limit) || 50);
return scans;
}, {
query: t.Object({
project: t.Optional(t.String()),
limit: t.Optional(t.String()),
}),
})
// GET single scan
.get("/scans/:id", async ({ request, headers, params }) => {
await requireAuth(request, headers);
const scan = await db.select().from(securityScans).where(eq(securityScans.id, params.id));
if (!scan.length) throw new Error("Not found");
return scan[0];
}, { params: t.Object({ id: t.String() }) })
// POST run a scan (create scan record, execute in background)
.post("/scans/run", async ({ request, headers, body }) => {
await requireAuth(request, headers);
const scanRecord = await db.insert(securityScans).values({
projectName: body.projectName,
tool: body.tool,
status: "running",
issues: [],
summary: {},
}).returning();
// Run scan in background
runScan(scanRecord[0].id, body.projectName, body.tool, body.repoUrl).catch(err => {
console.error(`Scan ${scanRecord[0].id} failed:`, err);
db.update(securityScans)
.set({ status: "failed", rawOutput: String(err), completedAt: new Date() })
.where(eq(securityScans.id, scanRecord[0].id))
.catch(console.error);
});
return scanRecord[0];
}, {
body: t.Object({
projectName: t.String(),
tool: t.String(),
repoUrl: t.Optional(t.String()),
}),
})
// POST run all scans for a project
.post("/scans/run-all", async ({ request, headers, body }) => {
await requireAuth(request, headers);
const tools = ["npm_audit", "eslint_security", "gitleaks_pattern"];
const results = [];
for (const tool of tools) {
const scanRecord = await db.insert(securityScans).values({
projectName: body.projectName,
tool,
status: "running",
issues: [],
summary: {},
}).returning();
runScan(scanRecord[0].id, body.projectName, tool, body.repoUrl).catch(err => {
console.error(`Scan ${scanRecord[0].id} failed:`, err);
db.update(securityScans)
.set({ status: "failed", rawOutput: String(err), completedAt: new Date() })
.where(eq(securityScans.id, scanRecord[0].id))
.catch(console.error);
});
results.push(scanRecord[0]);
}
return results;
}, {
body: t.Object({
projectName: t.String(),
repoUrl: t.Optional(t.String()),
}),
})
// DELETE a scan
.delete("/scans/:id", async ({ request, headers, params }) => {
await requireAuth(request, headers);
const deleted = await db.delete(securityScans).where(eq(securityScans.id, params.id)).returning();
if (!deleted.length) throw new Error("Not found");
return { success: true };
}, { params: t.Object({ id: t.String() }) })
// ═══════════════════════════════════════════
// OWASP API SECURITY TOP 10
// ═══════════════════════════════════════════
// GET OWASP scores for a project
.get("/owasp/:projectName", async ({ request, headers, params }) => {
await requireAuth(request, headers);
const projectName = decodeURIComponent(params.projectName);
let scores = await db
.select()
.from(owaspScores)
.where(eq(owaspScores.projectName, projectName))
.orderBy(asc(owaspScores.riskId));
// If no scores exist, initialize them
if (scores.length === 0) {
for (const risk of OWASP_RISKS) {
await db.insert(owaspScores).values({
projectName,
riskId: risk.id,
riskName: risk.name,
score: 0,
status: "not_assessed",
});
}
scores = await db
.select()
.from(owaspScores)
.where(eq(owaspScores.projectName, projectName))
.orderBy(asc(owaspScores.riskId));
}
return scores;
}, { params: t.Object({ projectName: t.String() }) })
// GET OWASP summary (all projects)
.get("/owasp-summary", async ({ request, headers }) => {
await requireAuth(request, headers);
const all = await db.select().from(owaspScores).orderBy(asc(owaspScores.projectName), asc(owaspScores.riskId));
const projectMap: Record<string, { scores: number[]; assessed: number; total: number }> = {};
for (const score of all) {
if (!projectMap[score.projectName]) {
projectMap[score.projectName] = { scores: [], assessed: 0, total: 0 };
}
projectMap[score.projectName].scores.push(score.score);
projectMap[score.projectName].total++;
if (score.status !== "not_assessed") projectMap[score.projectName].assessed++;
}
return Object.entries(projectMap).map(([name, data]) => ({
projectName: name,
averageScore: data.scores.length ? Math.round(data.scores.reduce((a, b) => a + b, 0) / data.scores.length) : 0,
assessed: data.assessed,
total: data.total,
}));
})
// PATCH update an OWASP score
.patch("/owasp/:id", async ({ request, headers, params, body }) => {
await requireAuth(request, headers);
const updates: Record<string, any> = { updatedAt: new Date() };
if (body.score !== undefined) updates.score = body.score;
if (body.status !== undefined) updates.status = body.status;
if (body.findings !== undefined) updates.findings = body.findings;
if (body.mitigations !== undefined) updates.mitigations = body.mitigations;
updates.lastAssessed = new Date();
const updated = await db.update(owaspScores)
.set(updates)
.where(eq(owaspScores.id, params.id))
.returning();
if (!updated.length) throw new Error("Not found");
return updated[0];
}, {
params: t.Object({ id: t.String() }),
body: t.Object({
score: t.Optional(t.Number()),
status: t.Optional(t.String()),
findings: t.Optional(t.Array(t.String())),
mitigations: t.Optional(t.Array(t.String())),
}),
})
// ═══════════════════════════════════════════
// CHECKLISTS
// ═══════════════════════════════════════════
// GET checklists for a project
.get("/checklists/:projectName", async ({ request, headers, params }) => {
await requireAuth(request, headers);
const projectName = decodeURIComponent(params.projectName);
const items = await db
.select()
.from(securityChecklists)
.where(eq(securityChecklists.projectName, projectName))
.orderBy(asc(securityChecklists.category), asc(securityChecklists.createdAt));
return items;
}, { params: t.Object({ projectName: t.String() }) })
// POST initialize checklists for a project (from templates)
.post("/checklists/init", async ({ request, headers, body }) => {
await requireAuth(request, headers);
// Check if already initialized
const existing = await db
.select()
.from(securityChecklists)
.where(eq(securityChecklists.projectName, body.projectName))
.limit(1);
if (existing.length > 0) {
return { message: "Checklists already initialized", count: 0 };
}
let count = 0;
for (const [category, items] of Object.entries(DEFAULT_CHECKLISTS)) {
for (const item of items) {
await db.insert(securityChecklists).values({
projectName: body.projectName,
category,
item: item.item,
description: item.description,
severity: item.severity,
});
count++;
}
}
return { message: "Checklists initialized", count };
}, {
body: t.Object({ projectName: t.String() }),
})
// PATCH toggle or update a checklist item
.patch("/checklists/:id", async ({ request, headers, params, body }) => {
await requireAuth(request, headers);
const updates: Record<string, any> = { updatedAt: new Date() };
if (body.checked !== undefined) {
updates.checked = body.checked;
updates.checkedAt = body.checked ? new Date() : null;
updates.checkedBy = body.checked ? (body.checkedBy || "manual") : null;
}
if (body.notes !== undefined) updates.notes = body.notes;
const updated = await db.update(securityChecklists)
.set(updates)
.where(eq(securityChecklists.id, params.id))
.returning();
if (!updated.length) throw new Error("Not found");
return updated[0];
}, {
params: t.Object({ id: t.String() }),
body: t.Object({
checked: t.Optional(t.Boolean()),
checkedBy: t.Optional(t.String()),
notes: t.Optional(t.String()),
}),
})
// GET checklist summary (all projects)
.get("/checklists-summary", async ({ request, headers }) => {
await requireAuth(request, headers);
const all = await db.select().from(securityChecklists);
const projectMap: Record<string, { total: number; checked: number; critical: number; criticalChecked: number }> = {};
for (const item of all) {
if (!projectMap[item.projectName]) {
projectMap[item.projectName] = { total: 0, checked: 0, critical: 0, criticalChecked: 0 };
}
projectMap[item.projectName].total++;
if (item.checked) projectMap[item.projectName].checked++;
if (item.severity === "critical") {
projectMap[item.projectName].critical++;
if (item.checked) projectMap[item.projectName].criticalChecked++;
}
}
return Object.entries(projectMap).map(([name, data]) => ({
projectName: name,
total: data.total,
checked: data.checked,
completion: data.total ? Math.round((data.checked / data.total) * 100) : 0,
critical: data.critical,
criticalChecked: data.criticalChecked,
}));
})
// ═══════════════════════════════════════════
// SCORE HISTORY / TRENDS
// ═══════════════════════════════════════════
// POST record a score snapshot (call periodically)
.post("/history/snapshot", async ({ request, headers }) => {
await requireAuth(request, headers);
// Get all audit data
const audits = await db.select().from(securityAudits);
const projects = [...new Set(audits.map(a => a.projectName))];
const snapshots = [];
for (const projectName of projects) {
const projectAudits = audits.filter(a => a.projectName === projectName);
const overallScore = projectAudits.length
? Math.round(projectAudits.reduce((s, a) => s + a.score, 0) / projectAudits.length)
: 0;
const categoryScores: Record<string, number> = {};
for (const a of projectAudits) {
categoryScores[a.category] = a.score;
}
// Get OWASP score
const owasp = await db.select().from(owaspScores).where(eq(owaspScores.projectName, projectName));
const owaspAvg = owasp.length
? Math.round(owasp.reduce((s, o) => s + o.score, 0) / owasp.length)
: null;
// Get scan issue count (latest scans)
const latestScans = await db.select().from(securityScans)
.where(and(eq(securityScans.projectName, projectName), eq(securityScans.status, "completed")))
.orderBy(desc(securityScans.completedAt))
.limit(5);
const scanIssueCount = latestScans.reduce((s, scan) => s + (scan.issues?.length || 0), 0);
// Get checklist completion
const checklists = await db.select().from(securityChecklists)
.where(eq(securityChecklists.projectName, projectName));
const checklistCompletion = checklists.length
? Math.round((checklists.filter(c => c.checked).length / checklists.length) * 100)
: 0;
const snapshot = await db.insert(securityScoreHistory).values({
projectName,
overallScore,
categoryScores,
owaspScore: owaspAvg,
scanIssueCount,
checklistCompletion,
}).returning();
snapshots.push(snapshot[0]);
}
return snapshots;
})
// GET score history for a project
.get("/history/:projectName", async ({ request, headers, params, query }) => {
await requireAuth(request, headers);
const projectName = decodeURIComponent(params.projectName);
const limit = Number(query.limit) || 30;
const history = await db.select().from(securityScoreHistory)
.where(eq(securityScoreHistory.projectName, projectName))
.orderBy(desc(securityScoreHistory.recordedAt))
.limit(limit);
return history.reverse(); // Oldest first for charting
}, {
params: t.Object({ projectName: t.String() }),
query: t.Object({ limit: t.Optional(t.String()) }),
})
// GET comprehensive dashboard data
.get("/dashboard", async ({ request, headers }) => {
await requireAuth(request, headers);
// Audits summary
const audits = await db.select().from(securityAudits);
const projects = [...new Set(audits.map(a => a.projectName))];
const projectSummaries = projects.map(name => {
const pa = audits.filter(a => a.projectName === name);
const avgScore = pa.length ? Math.round(pa.reduce((s, a) => s + a.score, 0) / pa.length) : 0;
const allFindings = pa.flatMap(a => a.findings || []);
return {
projectName: name,
avgScore,
categories: pa.length,
totalFindings: allFindings.length,
critical: allFindings.filter(f => f.status === "critical").length,
needsImprovement: allFindings.filter(f => f.status === "needs_improvement").length,
strong: allFindings.filter(f => f.status === "strong").length,
};
});
// Recent scans
const recentScans = await db.select().from(securityScans)
.orderBy(desc(securityScans.startedAt))
.limit(10);
// Overall stats
const allFindings = audits.flatMap(a => a.findings || []);
const overallScore = projectSummaries.length
? Math.round(projectSummaries.reduce((s, p) => s + p.avgScore, 0) / projectSummaries.length)
: 0;
return {
overallScore,
projectCount: projects.length,
totalFindings: allFindings.length,
criticalFindings: allFindings.filter(f => f.status === "critical").length,
projectSummaries,
recentScans,
};
})
// ═══════════════════════════════════════════
// CREATE FIX TASK FROM FINDING
// ═══════════════════════════════════════════
.post("/create-fix-task", async ({ request, headers, body }) => {
await requireAuth(request, headers);
// Import tasks table
const { tasks } = await import("../db/schema");
// Get next task number
const maxNum = await db
.select({ max: sql<number>`COALESCE(MAX(${tasks.taskNumber}), 0)` })
.from(tasks);
const nextNum = (maxNum[0]?.max ?? 0) + 1;
const newTask = await db.insert(tasks).values({
taskNumber: nextNum,
title: `🔒 ${body.title}`,
description: body.description || `Security fix: ${body.title}\n\nProject: ${body.projectName}\nCategory: ${body.category}\nSeverity: ${body.severity || "medium"}\n\n${body.recommendation || ""}`,
source: "hammer",
status: "queued",
priority: body.severity === "critical" || body.severity === "high" ? "high" : "medium",
tags: ["security", body.projectName.toLowerCase().replace(/\s+/g, "-")],
}).returning();
return newTask[0];
}, {
body: t.Object({
title: t.String(),
projectName: t.String(),
category: t.Optional(t.String()),
severity: t.Optional(t.String()),
description: t.Optional(t.String()),
recommendation: t.Optional(t.String()),
}),
});
// ═══════════════════════════════════════════
// SCAN RUNNER (background)
// ═══════════════════════════════════════════
const PROJECT_REPOS: Record<string, string> = {
"Hammer Dashboard": "https://git.infra.donovankelly.xyz/hammer/hammer-queue",
"Network App": "https://git.infra.donovankelly.xyz/hammer/network-app-api",
"Todo App": "https://git.infra.donovankelly.xyz/hammer/todo-app",
"nKode": "https://git.infra.donovankelly.xyz/dkelly/nkode",
};
async function runScan(scanId: string, projectName: string, tool: string, repoUrl?: string) {
const startTime = Date.now();
const issues: ScanIssue[] = [];
let rawOutput = "";
try {
const { execSync } = await import("child_process");
const { mkdtempSync, rmSync } = await import("fs");
const { join } = await import("path");
const os = await import("os");
const resolvedUrl = repoUrl || PROJECT_REPOS[projectName];
if (!resolvedUrl) {
throw new Error(`No repo URL for project: ${projectName}`);
}
// Clone repo to temp dir
const tmpDir = mkdtempSync(join(os.tmpdir(), "security-scan-"));
try {
execSync(`git clone --depth 1 ${resolvedUrl} ${tmpDir}/repo 2>&1`, {
timeout: 60000,
encoding: "utf-8",
});
const repoDir = join(tmpDir, "repo");
switch (tool) {
case "npm_audit": {
// Try bun first, then npm
try {
const output = execSync(`cd ${repoDir} && ([ -f backend/package.json ] && cd backend; bun install --frozen-lockfile 2>/dev/null || bun install 2>/dev/null; bun pm audit 2>&1 || npm audit --json 2>&1) || true`, {
timeout: 120000,
encoding: "utf-8",
maxBuffer: 10 * 1024 * 1024,
});
rawOutput = output.slice(0, 50000);
// Parse npm audit JSON output
try {
const auditData = JSON.parse(output);
if (auditData.vulnerabilities) {
for (const [name, vuln] of Object.entries(auditData.vulnerabilities) as any[]) {
issues.push({
id: crypto.randomUUID(),
severity: vuln.severity || "medium",
title: `${name}@${vuln.range || "unknown"} - ${vuln.severity}`,
description: vuln.via?.[0]?.title || vuln.via?.[0] || `Vulnerability in ${name}`,
fixAvailable: !!vuln.fixAvailable,
});
}
}
} catch {
// Non-JSON output, parse text
const lines = output.split("\n");
for (const line of lines) {
if (line.includes("moderate") || line.includes("high") || line.includes("critical") || line.includes("low")) {
if (line.trim() && !line.startsWith("#")) {
issues.push({
id: crypto.randomUUID(),
severity: line.includes("critical") ? "critical" : line.includes("high") ? "high" : line.includes("moderate") ? "medium" : "low",
title: line.trim().slice(0, 200),
description: line.trim(),
});
}
}
}
}
} catch (e: any) {
rawOutput = e.stdout || e.message;
}
break;
}
case "eslint_security": {
// Pattern-based security check - look for common issues
try {
const output = execSync(`cd ${repoDir} && grep -rn --include="*.ts" --include="*.tsx" --include="*.js" -E "(eval\\(|new Function|innerHTML|dangerouslySetInnerHTML|exec\\(|spawn\\(|child_process|\\.env\\.)" . 2>/dev/null || echo "No security patterns found"`, {
timeout: 30000,
encoding: "utf-8",
maxBuffer: 5 * 1024 * 1024,
});
rawOutput = output.slice(0, 50000);
const lines = output.split("\n").filter(l => l.trim() && l !== "No security patterns found");
for (const line of lines) {
const match = line.match(/^\.\/(.+?):(\d+):(.*)/);
if (match) {
const [, file, lineNum, content] = match;
let severity: ScanIssue["severity"] = "info";
let title = "Potential security pattern";
if (content.includes("eval(") || content.includes("new Function")) {
severity = "high";
title = "Code injection risk: eval/Function";
} else if (content.includes("innerHTML") || content.includes("dangerouslySetInnerHTML")) {
severity = "medium";
title = "XSS risk: innerHTML/dangerouslySetInnerHTML";
} else if (content.includes("exec(") || content.includes("spawn(") || content.includes("child_process")) {
severity = "high";
title = "Command injection risk: exec/spawn";
} else if (content.includes(".env.")) {
severity = "info";
title = "Environment variable usage";
}
// Skip node_modules and common safe patterns
if (file.includes("node_modules") || file.includes(".git/")) continue;
issues.push({
id: crypto.randomUUID(),
severity,
title,
description: content.trim().slice(0, 300),
file,
line: parseInt(lineNum),
});
}
}
} catch (e: any) {
rawOutput = e.stdout || e.message;
}
break;
}
case "gitleaks_pattern": {
// Pattern-based secret detection
try {
const output = execSync(`cd ${repoDir} && grep -rn --include="*.ts" --include="*.tsx" --include="*.js" --include="*.json" --include="*.yml" --include="*.yaml" --include="*.env*" --include="*.md" -E "(password|secret|api.?key|token|private.?key)\\s*[=:]\\s*['\"][^'\"]{8,}" . 2>/dev/null | grep -v node_modules | grep -v ".git/" | grep -v "bun.lock" || echo "No secrets found"`, {
timeout: 30000,
encoding: "utf-8",
maxBuffer: 5 * 1024 * 1024,
});
rawOutput = output.slice(0, 50000);
const lines = output.split("\n").filter(l => l.trim() && l !== "No secrets found");
for (const line of lines) {
const match = line.match(/^\.\/(.+?):(\d+):(.*)/);
if (match) {
const [, file, lineNum, content] = match;
// Skip .env.example, docs, types, schema definitions
if (file.includes(".example") || file.includes("schema") || file.includes("types") || file.includes("README")) continue;
// Skip type definitions and comments
if (content.trim().startsWith("//") || content.trim().startsWith("*") || content.includes("process.env")) continue;
issues.push({
id: crypto.randomUUID(),
severity: "critical",
title: "Potential hardcoded secret",
description: `Found in ${file}:${lineNum}: ${content.trim().slice(0, 100)}...`,
file,
line: parseInt(lineNum),
rule: "hardcoded-secret",
});
}
}
} catch (e: any) {
rawOutput = e.stdout || e.message;
}
break;
}
default:
rawOutput = `Unknown tool: ${tool}`;
}
} finally {
// Cleanup
try { rmSync(tmpDir, { recursive: true, force: true }); } catch {}
}
const durationMs = Date.now() - startTime;
const summary: Record<string, number> = {};
for (const issue of issues) {
summary[issue.severity] = (summary[issue.severity] || 0) + 1;
}
await db.update(securityScans)
.set({
status: "completed",
issues,
summary,
rawOutput: rawOutput.slice(0, 100000),
durationMs,
completedAt: new Date(),
})
.where(eq(securityScans.id, scanId));
} catch (err: any) {
const durationMs = Date.now() - startTime;
await db.update(securityScans)
.set({
status: "failed",
rawOutput: (rawOutput + "\n\nError: " + err.message).slice(0, 100000),
durationMs,
completedAt: new Date(),
})
.where(eq(securityScans.id, scanId));
}
}
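The scan runner above tallies issues by severity into a `summary` map before persisting the completed scan. A minimal, standalone sketch of that aggregation (the `Issue` shape is trimmed to the one field the tally needs):

```typescript
// Sketch of the severity tally performed when a scan completes.
// Issue is reduced to the single field the summary uses.
type Issue = { severity: string };

function summarize(issues: Issue[]): Record<string, number> {
  const summary: Record<string, number> = {};
  for (const issue of issues) {
    summary[issue.severity] = (summary[issue.severity] || 0) + 1;
  }
  return summary;
}
```

The resulting map (e.g. `{ critical: 2, info: 1 }`) is what lands in the `summary` column of `securityScans`.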

View File

@@ -1,7 +1,7 @@
import { Elysia, t } from "elysia";
import { db } from "../db";
import { securityAudits } from "../db/schema";
import { eq, asc, and } from "drizzle-orm";
import { eq, asc, desc, and, sql } from "drizzle-orm";
import { auth } from "../lib/auth";
const BEARER_TOKEN = process.env.API_BEARER_TOKEN || "hammer-dev-token";
@@ -29,6 +29,7 @@ const findingSchema = t.Object({
title: t.String(),
description: t.String(),
recommendation: t.String(),
taskId: t.Optional(t.String()),
});
export const securityRoutes = new Elysia({ prefix: "/api/security" })
@@ -38,16 +39,17 @@ export const securityRoutes = new Elysia({ prefix: "/api/security" })
set.status = 401;
return { error: "Unauthorized" };
}
if (msg === "Audit not found") {
if (msg === "Audit not found" || msg === "Not found") {
set.status = 404;
return { error: "Audit not found" };
return { error: msg };
}
console.error("Security route error:", msg);
set.status = 500;
return { error: "Internal server error" };
})
// GET all audits
// ─── Audit CRUD ───
.get("/", async ({ request, headers }) => {
await requireSessionOrBearer(request, headers);
const all = await db
@@ -57,7 +59,6 @@ export const securityRoutes = new Elysia({ prefix: "/api/security" })
return all;
})
// GET summary (aggregate scores per project)
.get("/summary", async ({ request, headers }) => {
await requireSessionOrBearer(request, headers);
const all = await db
@@ -98,7 +99,6 @@ export const securityRoutes = new Elysia({ prefix: "/api/security" })
return summary;
})
// GET audits for a specific project
.get(
"/project/:projectName",
async ({ params, request, headers }) => {
@@ -113,7 +113,6 @@ export const securityRoutes = new Elysia({ prefix: "/api/security" })
{ params: t.Object({ projectName: t.String() }) }
)
// POST create audit entry
.post(
"/",
async ({ body, request, headers }) => {
@@ -140,7 +139,6 @@ export const securityRoutes = new Elysia({ prefix: "/api/security" })
}
)
// PATCH update audit entry
.patch(
"/:id",
async ({ params, body, request, headers }) => {
@@ -172,7 +170,6 @@ export const securityRoutes = new Elysia({ prefix: "/api/security" })
}
)
// DELETE audit entry
.delete(
"/:id",
async ({ params, request, headers }) => {
@@ -185,4 +182,111 @@ export const securityRoutes = new Elysia({ prefix: "/api/security" })
return { success: true };
},
{ params: t.Object({ id: t.String() }) }
);
)
// ─── Posture Score (computed from all sources) ───
.get("/posture", async ({ request, headers }) => {
await requireSessionOrBearer(request, headers);
const { securityChecklists, owaspScores, securityScans } = await import("../db/schema");
const audits = await db.select().from(securityAudits);
// Compute per-project posture
const projects: Record<string, any> = {};
for (const audit of audits) {
if (!projects[audit.projectName]) {
projects[audit.projectName] = {
auditScores: [],
findings: [],
checklistPass: 0,
checklistTotal: 0,
checklistFail: 0,
owaspAvg: null,
lastScan: null,
};
}
projects[audit.projectName].auditScores.push(audit.score);
projects[audit.projectName].findings.push(...(audit.findings || []));
}
// Add checklist data
try {
const checklistItems = await db.select().from(securityChecklists);
for (const item of checklistItems) {
if (!projects[item.projectName]) {
projects[item.projectName] = {
auditScores: [],
findings: [],
checklistPass: 0,
checklistTotal: 0,
checklistFail: 0,
owaspAvg: null,
lastScan: null,
};
}
projects[item.projectName].checklistTotal++;
if (item.checked) projects[item.projectName].checklistPass++;
else projects[item.projectName].checklistFail++;
}
} catch (e) {
console.error("Posture: checklist query error:", e);
}
// Add OWASP data
try {
const owasp = await db.select().from(owaspScores);
const owaspByProject: Record<string, number[]> = {};
for (const o of owasp) {
if (!owaspByProject[o.projectName]) owaspByProject[o.projectName] = [];
owaspByProject[o.projectName].push(o.score);
}
for (const [name, scores] of Object.entries(owaspByProject)) {
if (projects[name]) {
projects[name].owaspAvg = Math.round(scores.reduce((a, b) => a + b, 0) / scores.length);
}
}
} catch (e) {
console.error("Posture: owasp query error:", e);
}
const posture = Object.entries(projects).map(([name, data]: [string, any]) => {
const auditScore = data.auditScores.length
? Math.round(data.auditScores.reduce((a: number, b: number) => a + b, 0) / data.auditScores.length)
: 0;
const checklistScore = data.checklistTotal
? Math.round((data.checklistPass / data.checklistTotal) * 100)
: 0;
let overallScore = auditScore;
if (data.owaspAvg !== null && data.checklistTotal > 0) {
overallScore = Math.round(auditScore * 0.5 + data.owaspAvg * 0.3 + checklistScore * 0.2);
} else if (data.checklistTotal > 0) {
overallScore = Math.round(auditScore * 0.7 + checklistScore * 0.3);
}
return {
projectName: name,
overallScore,
auditScore,
owaspScore: data.owaspAvg,
checklistScore,
totalFindings: data.findings.length,
criticalFindings: data.findings.filter((f: any) => f.status === "critical").length,
warningFindings: data.findings.filter((f: any) => f.status === "needs_improvement").length,
strongFindings: data.findings.filter((f: any) => f.status === "strong").length,
checklistPass: data.checklistPass,
checklistTotal: data.checklistTotal,
checklistFail: data.checklistFail,
};
});
const overallScore = posture.length
? Math.round(posture.reduce((s, p) => s + p.overallScore, 0) / posture.length)
: 0;
return {
overallScore,
projects: posture,
};
});
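The posture endpoint blends three signals with fixed weights: 0.5/0.3/0.2 for audit/OWASP/checklist when all three exist, 0.7/0.3 for audit/checklist when OWASP data is missing, and the raw audit score otherwise. A self-contained sketch of that blend:

```typescript
// Sketch of the posture blend above. owaspAvg is null when no OWASP rows
// exist for the project; checklistTotal of 0 means no checklist data.
function blendPosture(
  auditScore: number,
  owaspAvg: number | null,
  checklistPass: number,
  checklistTotal: number,
): number {
  const checklistScore = checklistTotal
    ? Math.round((checklistPass / checklistTotal) * 100)
    : 0;
  if (owaspAvg !== null && checklistTotal > 0) {
    return Math.round(auditScore * 0.5 + owaspAvg * 0.3 + checklistScore * 0.2);
  }
  if (checklistTotal > 0) {
    return Math.round(auditScore * 0.7 + checklistScore * 0.3);
  }
  return auditScore;
}
```

For example, an audit score of 80, OWASP average of 60, and 9/10 checklist items passing blends to 76.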

View File

@@ -0,0 +1,148 @@
import { Elysia, t } from "elysia";
import { db } from "../db";
import { todoProjects, todos } from "../db/schema";
import { eq, and, asc, sql, count } from "drizzle-orm";
import { auth } from "../lib/auth";
const BEARER_TOKEN = process.env.API_BEARER_TOKEN || "hammer-dev-token";
async function requireSessionOrBearer(request: Request, headers: Record<string, string | undefined>) {
const authHeader = headers["authorization"];
if (authHeader === `Bearer ${BEARER_TOKEN}`) {
return { userId: "bearer" };
}
try {
const session = await auth.api.getSession({ headers: request.headers });
if (session?.user) return { userId: session.user.id };
} catch {}
throw new Error("Unauthorized");
}
export const todoProjectRoutes = new Elysia({ prefix: "/api/todos/projects" })
.onError(({ error, set }) => {
const msg = (error as any)?.message || String(error);
if (msg === "Unauthorized") { set.status = 401; return { error: "Unauthorized" }; }
if (msg === "Not found") { set.status = 404; return { error: "Not found" }; }
console.error("Todo project route error:", msg);
set.status = 500;
return { error: "Internal server error", debug: msg };
})
// GET all projects for user (with todo counts)
.get("/", async ({ request, headers }) => {
const { userId } = await requireSessionOrBearer(request, headers);
const projects = await db
.select({
id: todoProjects.id,
userId: todoProjects.userId,
name: todoProjects.name,
color: todoProjects.color,
icon: todoProjects.icon,
sortOrder: todoProjects.sortOrder,
createdAt: todoProjects.createdAt,
updatedAt: todoProjects.updatedAt,
todoCount: sql<number>`COALESCE((
SELECT COUNT(*) FROM todos
WHERE todos.project_id = ${todoProjects.id}
AND todos.is_completed = false
), 0)`.as("todoCount"),
completedCount: sql<number>`COALESCE((
SELECT COUNT(*) FROM todos
WHERE todos.project_id = ${todoProjects.id}
AND todos.is_completed = true
), 0)`.as("completedCount"),
})
.from(todoProjects)
.where(eq(todoProjects.userId, userId))
.orderBy(asc(todoProjects.sortOrder), asc(todoProjects.name));
return projects;
})
// POST create project
.post("/", async ({ body, request, headers }) => {
const { userId } = await requireSessionOrBearer(request, headers);
// Get max sort order
const maxOrder = await db
.select({ max: sql<number>`COALESCE(MAX(${todoProjects.sortOrder}), 0)` })
.from(todoProjects)
.where(eq(todoProjects.userId, userId));
const [project] = await db
.insert(todoProjects)
.values({
userId,
name: body.name,
color: body.color || "#6b7280",
icon: body.icon || "📁",
sortOrder: (maxOrder[0]?.max ?? 0) + 1,
})
.returning();
return project;
}, {
body: t.Object({
name: t.String({ minLength: 1 }),
color: t.Optional(t.String()),
icon: t.Optional(t.String()),
}),
})
// PATCH update project
.patch("/:id", async ({ params, body, request, headers }) => {
const { userId } = await requireSessionOrBearer(request, headers);
const existing = await db
.select()
.from(todoProjects)
.where(and(eq(todoProjects.id, params.id), eq(todoProjects.userId, userId)));
if (!existing.length) throw new Error("Not found");
const updates: Record<string, any> = { updatedAt: new Date() };
if (body.name !== undefined) updates.name = body.name;
if (body.color !== undefined) updates.color = body.color;
if (body.icon !== undefined) updates.icon = body.icon;
if (body.sortOrder !== undefined) updates.sortOrder = body.sortOrder;
const [updated] = await db
.update(todoProjects)
.set(updates)
.where(eq(todoProjects.id, params.id))
.returning();
return updated;
}, {
params: t.Object({ id: t.String() }),
body: t.Object({
name: t.Optional(t.String()),
color: t.Optional(t.String()),
icon: t.Optional(t.String()),
sortOrder: t.Optional(t.Number()),
}),
})
// DELETE project (moves todos to inbox / no project)
.delete("/:id", async ({ params, request, headers }) => {
const { userId } = await requireSessionOrBearer(request, headers);
const existing = await db
.select()
.from(todoProjects)
.where(and(eq(todoProjects.id, params.id), eq(todoProjects.userId, userId)));
if (!existing.length) throw new Error("Not found");
// Move all todos in this project to no project (inbox)
await db
.update(todos)
.set({ projectId: null, updatedAt: new Date() })
.where(eq(todos.projectId, params.id));
await db.delete(todoProjects).where(eq(todoProjects.id, params.id));
return { success: true };
}, {
params: t.Object({ id: t.String() }),
});
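The PATCH handler above copies only the fields the client actually sent, plus a fresh `updatedAt`, so an omitted field is never overwritten with `undefined`. A standalone sketch of that partial-update builder:

```typescript
// Sketch of the partial-update builder used by PATCH /:id above:
// only explicitly provided fields are written, plus a fresh updatedAt.
type ProjectPatch = { name?: string; color?: string; icon?: string; sortOrder?: number };

function buildUpdates(body: ProjectPatch): Record<string, unknown> {
  const updates: Record<string, unknown> = { updatedAt: new Date() };
  if (body.name !== undefined) updates.name = body.name;
  if (body.color !== undefined) updates.color = body.color;
  if (body.icon !== undefined) updates.icon = body.icon;
  if (body.sortOrder !== undefined) updates.sortOrder = body.sortOrder;
  return updates;
}
```

This is why the route's TypeBox body makes every field `t.Optional` rather than nullable: absence means "leave unchanged".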

View File

@@ -0,0 +1,148 @@
import { Elysia, t } from "elysia";
import { db } from "../db";
import { todoSections, todoProjects, todos } from "../db/schema";
import { eq, and, asc, sql } from "drizzle-orm";
import { auth } from "../lib/auth";
const BEARER_TOKEN = process.env.API_BEARER_TOKEN || "hammer-dev-token";
async function requireSessionOrBearer(request: Request, headers: Record<string, string | undefined>) {
const authHeader = headers["authorization"];
if (authHeader === `Bearer ${BEARER_TOKEN}`) return { userId: "bearer" };
try {
const session = await auth.api.getSession({ headers: request.headers });
if (session?.user) return { userId: session.user.id };
} catch {}
throw new Error("Unauthorized");
}
export const todoSectionRoutes = new Elysia({ prefix: "/api/todos/sections" })
.onError(({ error, set }) => {
const msg = (error as any)?.message || String(error);
if (msg === "Unauthorized") { set.status = 401; return { error: "Unauthorized" }; }
if (msg === "Not found") { set.status = 404; return { error: "Not found" }; }
console.error("Todo section route error:", msg);
set.status = 500;
return { error: "Internal server error", debug: msg };
})
// GET sections for a project
.get("/by-project/:projectId", async ({ params, request, headers }) => {
const { userId } = await requireSessionOrBearer(request, headers);
// Verify user owns the project
const project = await db
.select()
.from(todoProjects)
.where(and(eq(todoProjects.id, params.projectId), eq(todoProjects.userId, userId)));
if (!project.length) throw new Error("Not found");
const sections = await db
.select({
id: todoSections.id,
projectId: todoSections.projectId,
name: todoSections.name,
isCollapsed: todoSections.isCollapsed,
sortOrder: todoSections.sortOrder,
createdAt: todoSections.createdAt,
updatedAt: todoSections.updatedAt,
todoCount: sql<number>`COALESCE((
SELECT COUNT(*) FROM todos
WHERE todos.section_id = ${todoSections.id}
AND todos.is_completed = false
), 0)`.as("todoCount"),
})
.from(todoSections)
.where(eq(todoSections.projectId, params.projectId))
.orderBy(asc(todoSections.sortOrder), asc(todoSections.name));
return sections;
}, {
params: t.Object({ projectId: t.String() }),
})
// POST create section
.post("/", async ({ body, request, headers }) => {
const { userId } = await requireSessionOrBearer(request, headers);
// Verify user owns the project
const project = await db
.select()
.from(todoProjects)
.where(and(eq(todoProjects.id, body.projectId), eq(todoProjects.userId, userId)));
if (!project.length) throw new Error("Not found");
const maxOrder = await db
.select({ max: sql<number>`COALESCE(MAX(${todoSections.sortOrder}), 0)` })
.from(todoSections)
.where(eq(todoSections.projectId, body.projectId));
const [section] = await db
.insert(todoSections)
.values({
projectId: body.projectId,
name: body.name,
sortOrder: (maxOrder[0]?.max ?? 0) + 1,
})
.returning();
return section;
}, {
body: t.Object({
projectId: t.String(),
name: t.String({ minLength: 1 }),
}),
})
// PATCH update section
.patch("/:id", async ({ params, body, request, headers }) => {
await requireSessionOrBearer(request, headers);
const existing = await db
.select()
.from(todoSections)
.where(eq(todoSections.id, params.id));
if (!existing.length) throw new Error("Not found");
const updates: Record<string, any> = { updatedAt: new Date() };
if (body.name !== undefined) updates.name = body.name;
if (body.isCollapsed !== undefined) updates.isCollapsed = body.isCollapsed;
if (body.sortOrder !== undefined) updates.sortOrder = body.sortOrder;
const [updated] = await db
.update(todoSections)
.set(updates)
.where(eq(todoSections.id, params.id))
.returning();
return updated;
}, {
params: t.Object({ id: t.String() }),
body: t.Object({
name: t.Optional(t.String()),
isCollapsed: t.Optional(t.Boolean()),
sortOrder: t.Optional(t.Number()),
}),
})
// DELETE section (moves todos to no section)
.delete("/:id", async ({ params, request, headers }) => {
await requireSessionOrBearer(request, headers);
const existing = await db
.select()
.from(todoSections)
.where(eq(todoSections.id, params.id));
if (!existing.length) throw new Error("Not found");
// Move todos to no section
await db
.update(todos)
.set({ sectionId: null, updatedAt: new Date() })
.where(eq(todos.sectionId, params.id));
await db.delete(todoSections).where(eq(todoSections.id, params.id));
return { success: true };
}, {
params: t.Object({ id: t.String() }),
});

View File

@@ -1,8 +1,10 @@
import { Elysia, t } from "elysia";
import { db } from "../db";
import { todos } from "../db/schema";
import { eq, and, asc, desc, sql } from "drizzle-orm";
import type { TodoSubtask } from "../db/schema";
import { eq, and, asc, desc, sql, isNull, lte, gte, isNotNull } from "drizzle-orm";
import type { SQL } from "drizzle-orm";
import { randomUUID } from "crypto";
import { auth } from "../lib/auth";
const BEARER_TOKEN = process.env.API_BEARER_TOKEN || "hammer-dev-token";
@@ -56,7 +58,7 @@ export const todoRoutes = new Elysia({ prefix: "/api/todos" })
.get("/", async ({ request, headers, query }) => {
const { userId } = await requireSessionOrBearer(request, headers);
const conditions = [eq(todos.userId, userId)];
const conditions: SQL[] = [eq(todos.userId, userId)];
// Filter by completion
if (query.completed === "true") {
@@ -70,6 +72,38 @@ export const todoRoutes = new Elysia({ prefix: "/api/todos" })
conditions.push(eq(todos.category, query.category));
}
// Filter by project
if (query.projectId === "inbox") {
conditions.push(isNull(todos.projectId));
} else if (query.projectId) {
conditions.push(eq(todos.projectId, query.projectId));
}
// Filter by section
if (query.sectionId === "none") {
conditions.push(isNull(todos.sectionId));
} else if (query.sectionId) {
conditions.push(eq(todos.sectionId, query.sectionId));
}
// Smart views
if (query.view === "today") {
// Todos due today or overdue (has a due date <= end of today)
const endOfToday = new Date();
endOfToday.setHours(23, 59, 59, 999);
conditions.push(isNotNull(todos.dueDate));
conditions.push(lte(todos.dueDate, endOfToday));
conditions.push(eq(todos.isCompleted, false));
} else if (query.view === "upcoming") {
// Todos with a due date in the future (after today)
const startOfTomorrow = new Date();
startOfTomorrow.setDate(startOfTomorrow.getDate() + 1);
startOfTomorrow.setHours(0, 0, 0, 0);
conditions.push(isNotNull(todos.dueDate));
conditions.push(gte(todos.dueDate, startOfTomorrow));
conditions.push(eq(todos.isCompleted, false));
}
const userTodos = await db
.select()
.from(todos)
@@ -86,6 +120,9 @@ export const todoRoutes = new Elysia({ prefix: "/api/todos" })
query: t.Object({
completed: t.Optional(t.String()),
category: t.Optional(t.String()),
projectId: t.Optional(t.String()),
sectionId: t.Optional(t.String()),
view: t.Optional(t.String()),
}),
})
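The smart-view filters in the hunk above hinge on two local-time boundaries: "today" matches `dueDate <= end of today` (which also sweeps in overdue items), while "upcoming" matches `dueDate >= start of tomorrow`. The boundary computation can be sketched in isolation:

```typescript
// Sketch of the "today" / "upcoming" date boundaries used by the view filters:
// today => dueDate <= endOfToday; upcoming => dueDate >= startOfTomorrow.
function viewBounds(now: Date): { endOfToday: Date; startOfTomorrow: Date } {
  const endOfToday = new Date(now);
  endOfToday.setHours(23, 59, 59, 999);
  const startOfTomorrow = new Date(now);
  startOfTomorrow.setDate(startOfTomorrow.getDate() + 1); // setDate rolls months over
  startOfTomorrow.setHours(0, 0, 0, 0);
  return { endOfToday, startOfTomorrow };
}
```

The two ranges are adjacent but disjoint, so a todo with a due date never appears in both views.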
@@ -120,6 +157,8 @@ export const todoRoutes = new Elysia({ prefix: "/api/todos" })
description: body.description || null,
priority: body.priority || "none",
category: body.category || null,
projectId: body.projectId || null,
sectionId: body.sectionId || null,
dueDate: body.dueDate ? new Date(body.dueDate) : null,
sortOrder: (maxOrder[0]?.max ?? 0) + 1,
})
@@ -137,11 +176,31 @@ export const todoRoutes = new Elysia({ prefix: "/api/todos" })
t.Literal("none"),
])),
category: t.Optional(t.String()),
projectId: t.Optional(t.Union([t.String(), t.Null()])),
sectionId: t.Optional(t.Union([t.String(), t.Null()])),
dueDate: t.Optional(t.Union([t.String(), t.Null()])),
}),
})
// PATCH update todo
// POST reassign all bearer todos to a real user (one-time migration)
.post("/migrate-owner", async ({ body, request, headers }) => {
const { userId } = await requireSessionOrBearer(request, headers);
const result = await db
.update(todos)
.set({ userId: body.targetUserId, updatedAt: new Date() })
.where(eq(todos.userId, body.fromUserId))
.returning({ id: todos.id });
return { migrated: result.length };
}, {
body: t.Object({
fromUserId: t.String(),
targetUserId: t.String(),
}),
})
.patch("/:id", async ({ params, body, request, headers }) => {
const { userId } = await requireSessionOrBearer(request, headers);
@@ -157,6 +216,8 @@ export const todoRoutes = new Elysia({ prefix: "/api/todos" })
if (body.description !== undefined) updates.description = body.description;
if (body.priority !== undefined) updates.priority = body.priority;
if (body.category !== undefined) updates.category = body.category || null;
if (body.projectId !== undefined) updates.projectId = body.projectId || null;
if (body.sectionId !== undefined) updates.sectionId = body.sectionId || null;
if (body.dueDate !== undefined) updates.dueDate = body.dueDate ? new Date(body.dueDate) : null;
if (body.sortOrder !== undefined) updates.sortOrder = body.sortOrder;
if (body.isCompleted !== undefined) {
@@ -183,6 +244,8 @@ export const todoRoutes = new Elysia({ prefix: "/api/todos" })
t.Literal("none"),
])),
category: t.Optional(t.Union([t.String(), t.Null()])),
projectId: t.Optional(t.Union([t.String(), t.Null()])),
sectionId: t.Optional(t.Union([t.String(), t.Null()])),
dueDate: t.Optional(t.Union([t.String(), t.Null()])),
isCompleted: t.Optional(t.Boolean()),
sortOrder: t.Optional(t.Number()),
@@ -277,4 +340,87 @@ export const todoRoutes = new Elysia({ prefix: "/api/todos" })
createdAt: t.Optional(t.String()),
})),
}),
});
})
// ─── Subtasks ───
// POST add subtask
.post("/:id/subtasks", async ({ params, body, request, headers }) => {
const { userId } = await requireSessionOrBearer(request, headers);
const existing = await db.select().from(todos).where(and(eq(todos.id, params.id), eq(todos.userId, userId)));
if (!existing.length) throw new Error("Not found");
const current = (existing[0].subtasks as TodoSubtask[] | null) || [];
const newSubtask: TodoSubtask = {
id: randomUUID(),
title: body.title,
completed: false,
createdAt: new Date().toISOString(),
};
const [updated] = await db
.update(todos)
.set({ subtasks: [...current, newSubtask], updatedAt: new Date() })
.where(eq(todos.id, params.id))
.returning();
return updated;
}, {
params: t.Object({ id: t.String() }),
body: t.Object({ title: t.String({ minLength: 1 }) }),
})
// PATCH toggle/update subtask
.patch("/:id/subtasks/:subtaskId", async ({ params, body, request, headers }) => {
const { userId } = await requireSessionOrBearer(request, headers);
const existing = await db.select().from(todos).where(and(eq(todos.id, params.id), eq(todos.userId, userId)));
if (!existing.length) throw new Error("Not found");
const current = (existing[0].subtasks as TodoSubtask[] | null) || [];
const updated = current.map((s) => {
if (s.id !== params.subtaskId) return s;
return {
...s,
title: body.title !== undefined ? body.title : s.title,
completed: body.completed !== undefined ? body.completed : s.completed,
completedAt: body.completed !== undefined
? (body.completed ? new Date().toISOString() : undefined)
: s.completedAt,
};
});
const [result] = await db
.update(todos)
.set({ subtasks: updated, updatedAt: new Date() })
.where(eq(todos.id, params.id))
.returning();
return result;
}, {
params: t.Object({ id: t.String(), subtaskId: t.String() }),
body: t.Object({
title: t.Optional(t.String()),
completed: t.Optional(t.Boolean()),
}),
})
// DELETE subtask
.delete("/:id/subtasks/:subtaskId", async ({ params, request, headers }) => {
const { userId } = await requireSessionOrBearer(request, headers);
const existing = await db.select().from(todos).where(and(eq(todos.id, params.id), eq(todos.userId, userId)));
if (!existing.length) throw new Error("Not found");
const current = (existing[0].subtasks as TodoSubtask[] | null) || [];
const filtered = current.filter((s) => s.id !== params.subtaskId);
const [result] = await db
.update(todos)
.set({ subtasks: filtered, updatedAt: new Date() })
.where(eq(todos.id, params.id))
.returning();
return result;
}, {
params: t.Object({ id: t.String(), subtaskId: t.String() }),
});
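Because subtasks live in a single JSONB column, the PATCH handler above rewrites the whole array: it maps over the list, touches only the matching subtask, and stamps or clears `completedAt` when `completed` flips. A pure-function sketch of that merge:

```typescript
// Sketch of the subtask PATCH merge above: only the matching subtask changes,
// and completedAt is stamped/cleared when the `completed` flag flips.
type Subtask = { id: string; title: string; completed: boolean; completedAt?: string };

function patchSubtask(
  list: Subtask[],
  id: string,
  patch: { title?: string; completed?: boolean },
): Subtask[] {
  return list.map((s) => {
    if (s.id !== id) return s;
    return {
      ...s,
      title: patch.title !== undefined ? patch.title : s.title,
      completed: patch.completed !== undefined ? patch.completed : s.completed,
      completedAt:
        patch.completed !== undefined
          ? patch.completed
            ? new Date().toISOString()
            : undefined
          : s.completedAt,
    };
  });
}
```

Re-toggling to incomplete clears `completedAt`, so the timestamp always reflects the most recent completion.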

View File

@@ -0,0 +1,841 @@
/**
* Comprehensive Security Seed — Real findings from code inspection
* Seeds: OWASP scores, checklists, and security audits for all 4 apps
* Idempotent: skips if data already exists
*/
import { db } from "./db";
import { owaspScores, securityChecklists, securityAudits } from "./db/schema";
import { eq, sql } from "drizzle-orm";
const PROJECTS = ["Hammer Dashboard", "Network App", "Todo App", "nKode"];
// ═══════════════════════════════════════════
// OWASP API Security Top 10 — Real findings
// ═══════════════════════════════════════════
interface OwaspEntry {
riskId: string;
riskName: string;
score: number;
status: "pass" | "partial" | "fail" | "not_assessed";
findings: string[];
mitigations: string[];
}
const OWASP_DATA: Record<string, OwaspEntry[]> = {
"Hammer Dashboard": [
{
riskId: "API1", riskName: "Broken Object Level Authorization (BOLA)",
score: 30, status: "fail",
findings: [
"No ownership checks on tasks — any authenticated user can read/update/delete any task",
"Security audits accessible to all authenticated users without ownership validation",
"Todos route uses userId from session but no verification the todo belongs to that user on update/delete",
"Comments can be deleted by bearer token without ownership check (admin override)",
],
mitigations: [
"Session-based auth extracts userId for creating new resources",
"Bearer token acts as admin with full access (by design for Hammer bot)",
],
},
{
riskId: "API2", riskName: "Broken Authentication",
score: 75, status: "partial",
findings: [
"Static shared bearer token (API_BEARER_TOKEN env var) — single token for all API consumers",
"No brute-force protection on login endpoint",
"No MFA support",
"Default bearer token 'hammer-dev-token' hardcoded as fallback",
],
mitigations: [
"BetterAuth with secure cookie settings (HttpOnly, Secure, SameSite)",
"Session expiry handled by BetterAuth",
"CSRF protection enabled in BetterAuth config",
"Dual auth: session cookies + bearer token",
],
},
{
riskId: "API3", riskName: "Broken Object Property Level Authorization",
score: 45, status: "partial",
findings: [
"All task fields returned to all users — no field-level filtering",
"User role visible in /api/me response (acceptable but notable)",
"Security findings with recommendations visible to all authenticated users",
"No data masking for potentially sensitive fields",
],
mitigations: [
"TypeBox schema validation on input prevents unexpected field injection",
"PATCH endpoints only update explicitly listed fields",
],
},
{
riskId: "API4", riskName: "Unrestricted Resource Consumption",
score: 25, status: "fail",
findings: [
"No rate limiting middleware on any endpoint",
"GET /api/tasks returns all tasks without pagination limits",
"GET /api/security returns all audits without limits",
"No request body size limits configured",
"Security scan runner clones git repos — could exhaust disk/CPU",
"Score history endpoint loads unlimited data",
],
mitigations: [
"Activity endpoint has configurable limit (max 200)",
"Scan results capped at 50 per query",
],
},
{
riskId: "API5", riskName: "Broken Function Level Authorization",
score: 40, status: "partial",
findings: [
"Admin routes exist but no middleware enforces admin-only access on security scan execution",
"Anyone with bearer token can run security scans (resource intensive)",
"Invite endpoint accessible to any authenticated user (should be admin-only)",
"Score history snapshot can be triggered by any authenticated user",
],
mitigations: [
"Admin routes (/api/admin/*) check for admin role",
"Bearer token auth serves as admin-equivalent by design",
],
},
{
riskId: "API6", riskName: "Unrestricted Access to Sensitive Business Flows",
score: 55, status: "partial",
findings: [
"No audit logging on task creation/modification",
"Security scan can be triggered repeatedly without cooldown",
"Bulk checklist init has no duplicate prevention (only checks if any exist)",
],
mitigations: [
"Task progress notes provide some change history",
"Create Fix Task tracks origin via description",
],
},
{
riskId: "API7", riskName: "Server Side Request Forgery (SSRF)",
score: 60, status: "partial",
findings: [
"Security scan runner accepts repoUrl parameter — could be pointed at internal services",
"Git clone uses user-provided URL directly",
"Health check routes make HTTP requests to external services",
],
mitigations: [
"Default repo URLs are hardcoded constants",
"Git clone has 60-second timeout",
"No general-purpose URL fetch endpoints",
],
},
{
riskId: "API8", riskName: "Security Misconfiguration",
score: 50, status: "partial",
findings: [
"CORS allows http://localhost:5173 in production",
"No security response headers (X-Content-Type-Options, X-Frame-Options, CSP, HSTS)",
"Error handler returns generic messages (good) but console-only logging",
"Default bearer token fallback if env var not set",
"Docker runs as root user (no USER directive in Dockerfile)",
],
mitigations: [
"CORS restricted to specific origins (not wildcard)",
"Error responses don't expose stack traces",
"HTTPS enforced via Traefik reverse proxy",
],
},
{
riskId: "API9", riskName: "Improper Inventory Management",
score: 65, status: "partial",
findings: [
"No API versioning strategy",
"Health endpoint exposes service name",
"Debug endpoints may have been left in code (cbfeb6d commit)",
"Multiple route files with overlapping /api/security prefix (security.ts + security-scans.ts)",
],
mitigations: [
"Clean route organization by feature",
"No swagger/docs endpoint exposing internal API surface",
],
},
{
riskId: "API10", riskName: "Unsafe API Consumption",
score: 80, status: "pass",
findings: [
"Git clone of external repos in scan runner trusts repo content",
],
mitigations: [
"Scan runs in temp directory, cleaned up after",
"No external API dependencies for core functionality",
"Database queries use Drizzle ORM (parameterized)",
],
},
],
"Network App": [
{
riskId: "API1", riskName: "Broken Object Level Authorization (BOLA)",
score: 70, status: "partial",
findings: [
"Client endpoints require session but need to verify user owns the client record",
"Document download endpoint should verify client ownership",
],
mitigations: [
"Session-based auth on all routes extracts userId",
"Audit logging tracks all data modifications with user context",
"Client goals, notes, referrals all linked to authenticated user",
],
},
{
riskId: "API2", riskName: "Broken Authentication",
score: 80, status: "pass",
findings: [
"No MFA support",
"Password policy not enforced at API level",
],
mitigations: [
"BetterAuth with secure defaults",
"Session cookies HttpOnly + Secure + SameSite",
"Nginx reverse proxy ensures same-origin cookies (Brave fix)",
"CSRF protection enabled",
],
},
{
riskId: "API3", riskName: "Broken Object Property Level Authorization",
score: 65, status: "partial",
findings: [
"All client fields returned to all authenticated users",
"Meeting prep endpoint returns AI-generated insights without field filtering",
"Communication style settings contain personal tone preferences (low risk)",
],
mitigations: [
"TypeBox validation on all input",
"344 validation schemas across routes",
"PATCH only updates listed fields",
],
},
{
riskId: "API4", riskName: "Unrestricted Resource Consumption",
score: 85, status: "pass",
findings: [
"AI endpoints (meeting prep, email generation) call external LLM — costly if spammed",
],
mitigations: [
"Rate limiting implemented: global 100/min, auth 5/min, AI 10/min per IP",
"Client list pagination with max 200/page",
"Request cleanup on rate limiter (60s intervals)",
"429 Retry-After headers returned",
],
},
{
riskId: "API5", riskName: "Broken Function Level Authorization",
score: 70, status: "partial",
findings: [
"Bulk email send accessible to all authenticated users (should require elevated permission)",
"Data export (JSON/CSV) accessible to all users — could exfiltrate full database",
"Tag management (rename/delete/merge) accessible to all users",
],
mitigations: [
"Audit log tracks all modifications with user context",
"Admin routes check role",
],
},
{
riskId: "API6", riskName: "Unrestricted Access to Sensitive Business Flows",
score: 80, status: "pass",
findings: [
"Bulk email could be used to spam clients if account compromised",
],
mitigations: [
"Full audit logging on all data modifications",
"Communication logs track all sent emails",
"Interaction logging provides full activity timeline",
],
},
{
riskId: "API7", riskName: "Server Side Request Forgery (SSRF)",
score: 90, status: "pass",
findings: [],
mitigations: [
"No user-controlled URL fetch endpoints",
"AI calls only to configured provider URLs",
"Email sending via Resend API (no arbitrary SMTP)",
],
},
{
riskId: "API8", riskName: "Security Misconfiguration",
score: 60, status: "partial",
findings: [
"CORS origins from env var — could be misconfigured",
"No security response headers at app level",
"Error boundaries on frontend but generic server errors may leak in dev",
],
mitigations: [
"CORS restricted to specific origins",
"Error boundaries with generic UI messages",
"Toast system for user-friendly error display",
"HTTPS via Traefik",
],
},
{
riskId: "API9", riskName: "Improper Inventory Management",
score: 70, status: "partial",
findings: [
"No API versioning",
"Large API surface (clients, events, emails, interactions, notes, documents, goals, referrals, segments, templates, tags, search, export, audit, meeting-prep)",
],
mitigations: [
"Clean route organization",
"Consistent endpoint patterns",
"56+ API tests covering routes",
],
},
{
riskId: "API10", riskName: "Unsafe API Consumption",
score: 75, status: "partial",
findings: [
"AI provider responses parsed and displayed without sanitization",
"Resend email API responses trusted",
],
mitigations: [
"Drizzle ORM prevents SQL injection",
"TypeBox validates all input",
],
},
],
"Todo App": [
{
riskId: "API1", riskName: "Broken Object Level Authorization (BOLA)",
score: 75, status: "partial",
findings: [
"Need to verify all task endpoints check userId ownership",
"Hammer service account has full access via bearer token",
],
mitigations: [
"BetterAuth session extracts userId",
"Tasks linked to userId via schema",
"Hammer auth uses separate HAMMER_API_KEY env var",
],
},
{
riskId: "API2", riskName: "Broken Authentication",
score: 70, status: "partial",
findings: [
"No rate limiting on login/register endpoints",
"No MFA support",
"No password policy enforcement visible",
"Separate Hammer API key for service account (single static token)",
],
mitigations: [
"BetterAuth with secure session handling",
"Dedicated auth route with proper session management",
"Email verification flow (via Resend)",
],
},
{
riskId: "API3", riskName: "Broken Object Property Level Authorization",
score: 65, status: "partial",
findings: [
"All task fields returned to owner — no sensitive field filtering needed",
"Admin endpoints could expose other users' data",
],
mitigations: [
"154 TypeBox validation schemas",
"PATCH operations use explicit field lists",
],
},
{
riskId: "API4", riskName: "Unrestricted Resource Consumption",
score: 20, status: "fail",
findings: [
"No rate limiting middleware at all",
"No pagination on task list endpoints",
"No request body size limits",
"Comment creation unlimited",
],
mitigations: [
"Small user base (personal app) reduces risk",
],
},
{
riskId: "API5", riskName: "Broken Function Level Authorization",
score: 65, status: "partial",
findings: [
"Admin routes check role but unclear if consistently applied",
"Hammer route accessible with just bearer token — no IP restriction",
],
mitigations: [
"Admin routes exist with role checks",
"Hammer routes use separate API key",
],
},
{
riskId: "API6", riskName: "Unrestricted Access to Sensitive Business Flows",
score: 50, status: "partial",
findings: [
"No audit logging on task modifications",
"No change history for tasks",
],
mitigations: [
"Comments provide some discussion trail",
"Projects/labels provide organizational structure",
],
},
{
riskId: "API7", riskName: "Server Side Request Forgery (SSRF)",
score: 90, status: "pass",
findings: [],
mitigations: [
"No URL fetch endpoints",
"Email via Resend API only",
"No external service calls except auth provider",
],
},
{
riskId: "API8", riskName: "Security Misconfiguration",
score: 50, status: "partial",
findings: [
"CORS allows localhost origins in production",
"No security response headers",
"No structured logging",
"Docker configuration not reviewed for security",
],
mitigations: [
"CORS limited to specific origins",
"HTTPS via Traefik",
"Error responses generic",
],
},
{
riskId: "API9", riskName: "Improper Inventory Management",
score: 70, status: "partial",
findings: [
"No API versioning",
"Debug/test endpoints may exist",
],
mitigations: [
"Clean route structure (tasks, projects, labels, comments, auth)",
"Auth and admin routes separated",
"35+ API tests",
],
},
{
riskId: "API10", riskName: "Unsafe API Consumption",
score: 85, status: "pass",
findings: [],
mitigations: [
"Drizzle ORM parameterized queries",
"Minimal external API dependencies (just Resend)",
"TypeBox validation on all inputs",
],
},
],
"nKode": [
{
riskId: "API1", riskName: "Broken Object Level Authorization (BOLA)",
score: 80, status: "pass",
findings: [
"Need to verify all endpoints check user ownership of resources",
],
mitigations: [
"OPAQUE-ke protocol ensures strong user authentication",
"Session extractors in Axum verify auth before route handlers",
"Repository pattern with user-scoped queries",
],
},
{
riskId: "API2", riskName: "Broken Authentication",
score: 85, status: "pass",
findings: [
"OPAQUE registration/login are CPU-intensive (Argon2) — could be DoS vector",
"Session management is in-memory (lost on restart)",
],
mitigations: [
"OPAQUE-ke v4 with Argon2 — password never leaves client",
"Zero-knowledge password proof (industry best practice)",
"Session extractors in Axum middleware",
"No password stored server-side — only OPAQUE server state",
],
},
{
riskId: "API3", riskName: "Broken Object Property Level Authorization",
score: 70, status: "partial",
findings: [
"Icon data returned to all authenticated users",
"User account details need field-level review",
],
mitigations: [
"Rust type system enforces data structures",
"Serde serialization controls what fields are exposed",
"Repository pattern separates storage from API concerns",
],
},
{
riskId: "API4", riskName: "Unrestricted Resource Consumption",
score: 30, status: "fail",
findings: [
"No rate limiting in Axum router (no tower-governor or similar)",
"OPAQUE registration/login use Argon2 — expensive, abusable for CPU DoS",
"No request body size limits visible in config",
"Icon pool replenishment could be triggered excessively",
],
mitigations: [
"Small user base currently",
"Axum has some built-in protections",
],
},
{
riskId: "API5", riskName: "Broken Function Level Authorization",
score: 70, status: "partial",
findings: [
"Admin vs user role separation needs review",
"Service management endpoints need auth verification",
],
mitigations: [
"Axum extractors enforce auth on protected routes",
"Clean separation between public and protected endpoints",
],
},
{
riskId: "API6", riskName: "Unrestricted Access to Sensitive Business Flows",
score: 60, status: "partial",
findings: [
"No audit logging visible in codebase",
"Registration flow could be automated (no CAPTCHA)",
],
mitigations: [
"OPAQUE protocol prevents credential stuffing",
"Cleanup service manages expired sessions",
],
},
{
riskId: "API7", riskName: "Server Side Request Forgery (SSRF)",
score: 95, status: "pass",
findings: [],
mitigations: [
"No URL fetch or proxy endpoints",
"Pure Rust backend — no shell command execution",
"No external API calls from server",
],
},
{
riskId: "API8", riskName: "Security Misconfiguration",
score: 55, status: "partial",
findings: [
"CORS configuration needs production review",
"No security response headers at app level",
"In-memory storage means data loss on restart",
"No structured logging framework",
],
mitigations: [
"tower-http CORS layer configured",
"HTTPS via Traefik",
"Rust compiler prevents memory safety issues",
],
},
{
riskId: "API9", riskName: "Improper Inventory Management",
score: 65, status: "partial",
findings: [
"No API versioning",
"OIDC endpoints expand attack surface",
"Client WASM adds complexity to security model",
],
mitigations: [
"Clean Rust module organization",
"Test suite for protected routes",
"Workspace structure separates concerns",
],
},
{
riskId: "API10", riskName: "Unsafe API Consumption",
score: 90, status: "pass",
findings: [],
mitigations: [
"Minimal external dependencies",
"Rust type safety prevents injection attacks",
"OPAQUE protocol eliminates password transmission",
"No external API consumption in core flow",
],
},
],
};
// ═══════════════════════════════════════════
// CHECKLIST ITEMS PER PROJECT — Real assessments
// ═══════════════════════════════════════════
interface ChecklistSeed {
category: string;
item: string;
description: string;
severity: string;
checked: boolean;
notes?: string;
}
const CHECKLIST_COMMON: ChecklistSeed[] = [
// Authentication & Sessions
{ category: "Authentication & Sessions", item: "Password policy enforced (12+ chars, complexity)", description: "Passwords meet minimum length and complexity", severity: "high", checked: false, notes: "BetterAuth default policy — needs explicit enforcement" },
{ category: "Authentication & Sessions", item: "MFA available", description: "Multi-factor authentication implemented", severity: "high", checked: false, notes: "Not implemented on any app" },
{ category: "Authentication & Sessions", item: "Session expiry configured", description: "Sessions expire after reasonable timeout", severity: "medium", checked: true, notes: "BetterAuth handles session expiry" },
{ category: "Authentication & Sessions", item: "Secure cookie flags (HttpOnly, Secure, SameSite)", description: "Session cookies have all security flags", severity: "high", checked: true, notes: "BetterAuth sets these by default" },
{ category: "Authentication & Sessions", item: "Brute force protection on login", description: "Account lockout or rate limiting on failed attempts", severity: "critical", checked: false },
{ category: "Authentication & Sessions", item: "Password reset flow secure", description: "Reset tokens are time-limited and single-use", severity: "high", checked: false, notes: "Not yet implemented" },
// Authorization
{ category: "Authorization", item: "Object-level authorization on all endpoints", description: "Every endpoint verifies user owns the resource", severity: "critical", checked: false, notes: "Missing on most endpoints" },
{ category: "Authorization", item: "Function-level authorization", description: "Admin functions restricted to admin role", severity: "critical", checked: true, notes: "Admin routes check role" },
{ category: "Authorization", item: "Field-level authorization", description: "Sensitive fields filtered by role", severity: "high", checked: false },
{ category: "Authorization", item: "No IDOR vulnerabilities", description: "Direct object references validated", severity: "critical", checked: false, notes: "UUID primary keys help but not sufficient" },
// Input Validation
{ category: "Input Validation", item: "All inputs validated with schema", description: "TypeBox/schema validation on endpoints", severity: "high", checked: true, notes: "Elysia TypeBox on all routes" },
{ category: "Input Validation", item: "SQL injection prevented", description: "ORM or parameterized queries used", severity: "critical", checked: true, notes: "Drizzle ORM throughout" },
{ category: "Input Validation", item: "XSS prevented", description: "User input sanitized before rendering", severity: "critical", checked: false, notes: "No sanitization — React escapes by default but markdown is rendered raw" },
{ category: "Input Validation", item: "Path traversal prevented", description: "File paths validated", severity: "high", checked: false, notes: "Document uploads in Network App need review" },
{ category: "Input Validation", item: "File upload validation", description: "Files checked for type, size, content", severity: "high", checked: false, notes: "Network App documents — basic checks only" },
// Transport & Data Protection
{ category: "Transport & Data Protection", item: "HTTPS enforced on all endpoints", description: "All traffic over TLS", severity: "critical", checked: true, notes: "Traefik handles TLS termination" },
{ category: "Transport & Data Protection", item: "TLS 1.2+ only", description: "Older TLS versions disabled", severity: "high", checked: true, notes: "Traefik defaults" },
{ category: "Transport & Data Protection", item: "Sensitive data encrypted at rest", description: "PII encrypted in database", severity: "high", checked: false, notes: "Plaintext in PostgreSQL" },
{ category: "Transport & Data Protection", item: "CORS properly configured", description: "Only allowed origins, no wildcards", severity: "high", checked: true, notes: "Specific origins — but localhost included in some apps" },
{ category: "Transport & Data Protection", item: "Security headers set (CSP, X-Frame, HSTS)", description: "Browser security headers configured", severity: "medium", checked: false, notes: "No security headers on any app" },
// Rate Limiting
{ category: "Rate Limiting & Abuse", item: "Global rate limiting", description: "Overall request rate limit per IP", severity: "high", checked: false, notes: "Only Network App has rate limiting" },
{ category: "Rate Limiting & Abuse", item: "Auth endpoint rate limiting", description: "Stricter limits on login/register", severity: "critical", checked: false, notes: "Only Network App (5/min)" },
{ category: "Rate Limiting & Abuse", item: "Request size limits", description: "Maximum body size enforced", severity: "medium", checked: false },
// Error Handling
{ category: "Error Handling", item: "No stack traces in production", description: "Errors don't expose internals", severity: "high", checked: true, notes: "Generic error responses" },
{ category: "Error Handling", item: "Generic error messages", description: "Errors don't reveal architecture", severity: "medium", checked: true },
{ category: "Error Handling", item: "Proper HTTP status codes", description: "401/403/404/500 used correctly", severity: "low", checked: true },
// Logging & Monitoring
{ category: "Logging & Monitoring", item: "Auth events logged", description: "Login/logout/failures recorded", severity: "high", checked: false, notes: "Only console.log" },
{ category: "Logging & Monitoring", item: "Data access logged", description: "CRUD operations audit-logged", severity: "high", checked: false, notes: "Only Network App has audit logging" },
{ category: "Logging & Monitoring", item: "Structured logging", description: "JSON logs with request IDs", severity: "medium", checked: false },
{ category: "Logging & Monitoring", item: "External monitoring", description: "Uptime monitoring with alerts", severity: "high", checked: true, notes: "Health page in Hammer Dashboard monitors all apps" },
// Infrastructure
{ category: "Infrastructure", item: "Firewall configured", description: "Only required ports open", severity: "critical", checked: true, notes: "VPS firewall configured" },
{ category: "Infrastructure", item: "SSH hardened", description: "Key-only auth, fail2ban", severity: "critical", checked: true, notes: "Key-based SSH access" },
{ category: "Infrastructure", item: "OS auto-updates enabled", description: "unattended-upgrades for security patches", severity: "high", checked: false },
{ category: "Infrastructure", item: "Container images scanned", description: "Docker images checked for vulnerabilities", severity: "medium", checked: false },
{ category: "Infrastructure", item: "Secrets management", description: "No hardcoded secrets", severity: "critical", checked: true, notes: "Environment variables + Bitwarden" },
{ category: "Infrastructure", item: "Backup strategy implemented", description: "Automated database backups", severity: "high", checked: false, notes: "No automated backups" },
{ category: "Infrastructure", item: "Docker non-root user", description: "Containers run as non-root", severity: "medium", checked: false, notes: "All Dockerfiles run as root" },
];
// Project-specific overrides
const PROJECT_CHECKLIST_OVERRIDES: Record<string, Partial<Record<string, boolean>>> = {
"Network App": {
"Global rate limiting": true,
"Auth endpoint rate limiting": true,
"Data access logged": true,
"Path traversal prevented": false,
"File upload validation": false,
},
"nKode": {
"SQL injection prevented": true,
"Secure cookie flags (HttpOnly, Secure, SameSite)": false, // OPAQUE, different auth model
"Session expiry configured": false, // In-memory sessions
},
};
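A minimal sketch of how an override is resolved against the common checklist (hypothetical helper; the same lookup appears inline in `seedChecklists` below — a project-specific value wins, otherwise the common default applies):

```typescript
// Hypothetical helper mirroring the override lookup in seedChecklists.
function resolveChecked(
  overrides: Partial<Record<string, boolean>>,
  itemName: string,
  commonDefault: boolean,
): boolean {
  // A project override (true OR false) wins; otherwise fall back to the
  // common checklist default. `??` only skips undefined, so an explicit
  // `false` override is respected.
  return overrides[itemName] ?? commonDefault;
}
```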
// ═══════════════════════════════════════════
// SEED FUNCTIONS
// ═══════════════════════════════════════════
async function seedOwasp() {
for (const project of PROJECTS) {
// Check if already seeded
const existing = await db
.select({ count: sql<number>`count(*)` })
.from(owaspScores)
.where(eq(owaspScores.projectName, project));
if (Number(existing[0]?.count) > 0) {
console.log(` OWASP: ${project} already seeded (${existing[0].count} entries), skipping`);
continue;
}
const data = OWASP_DATA[project];
if (!data) {
console.log(` OWASP: No data for ${project}, skipping`);
continue;
}
for (const entry of data) {
await db.insert(owaspScores).values({
projectName: project,
riskId: entry.riskId,
riskName: entry.riskName,
score: entry.score,
status: entry.status,
findings: entry.findings,
mitigations: entry.mitigations,
});
}
console.log(` OWASP: Seeded ${data.length} entries for ${project}`);
}
}
async function seedChecklists() {
for (const project of PROJECTS) {
// Check if already seeded
const existing = await db
.select({ count: sql<number>`count(*)` })
.from(securityChecklists)
.where(eq(securityChecklists.projectName, project));
if (Number(existing[0]?.count) > 0) {
console.log(` Checklists: ${project} already seeded (${existing[0].count} items), skipping`);
continue;
}
const overrides = PROJECT_CHECKLIST_OVERRIDES[project] || {};
let count = 0;
for (const item of CHECKLIST_COMMON) {
const checked = overrides[item.item] ?? item.checked;
await db.insert(securityChecklists).values({
projectName: project,
category: item.category,
item: item.item,
description: item.description,
severity: item.severity,
checked: !!checked,
checkedBy: checked ? "seed" : undefined,
checkedAt: checked ? new Date() : undefined,
notes: item.notes || undefined,
});
count++;
}
console.log(` Checklists: Seeded ${count} items for ${project}`);
}
// Also seed Infrastructure checklists
const infraExists = await db
.select({ count: sql<number>`count(*)` })
.from(securityChecklists)
.where(eq(securityChecklists.projectName, "Infrastructure"));
if (Number(infraExists[0]?.count) === 0) {
const infraItems: ChecklistSeed[] = [
{ category: "Network Security", item: "UFW firewall enabled with deny-by-default", description: "Only required ports open", severity: "critical", checked: true },
{ category: "Network Security", item: "SSH key-only authentication", description: "Password auth disabled", severity: "critical", checked: true },
{ category: "Network Security", item: "Fail2ban active", description: "Brute force protection on SSH", severity: "high", checked: false },
{ category: "Network Security", item: "Non-default SSH port", description: "SSH on non-standard port", severity: "low", checked: false },
{ category: "Server Maintenance", item: "Unattended-upgrades enabled", description: "Automatic security patches", severity: "high", checked: false },
{ category: "Server Maintenance", item: "Swap configured", description: "Prevent OOM kills", severity: "medium", checked: false },
{ category: "Docker Security", item: "Docker images use specific tags (not :latest)", description: "Reproducible builds", severity: "medium", checked: true, notes: "oven/bun:1 used" },
{ category: "Docker Security", item: "Container health checks defined", description: "HEALTHCHECK in Dockerfiles", severity: "low", checked: false },
{ category: "Docker Security", item: "Containers run as non-root", description: "USER directive in Dockerfiles", severity: "medium", checked: false },
{ category: "Docker Security", item: "No --privileged containers", description: "Minimal container capabilities", severity: "critical", checked: true },
{ category: "Secrets Management", item: "Bitwarden CLI for credential storage", description: "Centralized secret management", severity: "high", checked: true },
{ category: "Secrets Management", item: "No secrets in git repos", description: "All secrets in env vars", severity: "critical", checked: true, notes: "Env vars via Dokploy" },
{ category: "Secrets Management", item: "Docker secrets or external vault", description: "Secrets not in compose files", severity: "high", checked: false, notes: "Compose files contain DB credentials" },
{ category: "Backup & Recovery", item: "Automated database backups", description: "Regular PostgreSQL dumps", severity: "critical", checked: false },
{ category: "Backup & Recovery", item: "Backup restoration tested", description: "Recovery procedure verified", severity: "high", checked: false },
{ category: "Backup & Recovery", item: "Off-site backup storage", description: "Backups stored externally", severity: "high", checked: false },
{ category: "Monitoring", item: "Application health monitoring", description: "Automated uptime checks", severity: "high", checked: true, notes: "Hammer Dashboard Health page" },
{ category: "Monitoring", item: "Log aggregation", description: "Centralized logging", severity: "medium", checked: false },
{ category: "Monitoring", item: "Alert notifications", description: "Alerts on failures/anomalies", severity: "high", checked: false },
{ category: "TLS & Certificates", item: "Auto-renewing TLS certs", description: "Let's Encrypt via Traefik", severity: "critical", checked: true },
{ category: "TLS & Certificates", item: "HSTS enabled", description: "Strict-Transport-Security header", severity: "high", checked: false },
{ category: "TLS & Certificates", item: "TLS 1.2+ only", description: "Older versions disabled", severity: "high", checked: true, notes: "Traefik defaults" },
];
for (const item of infraItems) {
await db.insert(securityChecklists).values({
projectName: "Infrastructure",
category: item.category,
item: item.item,
description: item.description,
severity: item.severity,
checked: item.checked,
checkedBy: item.checked ? "seed" : undefined,
checkedAt: item.checked ? new Date() : undefined,
notes: item.notes || undefined,
});
}
console.log(` Checklists: Seeded ${infraItems.length} items for Infrastructure`);
}
}
async function seedAudits() {
// Only seed if empty
const existing = await db
.select({ count: sql<number>`count(*)` })
.from(securityAudits);
if (Number(existing[0]?.count) > 0) {
console.log(` Audits: Already seeded (${existing[0].count} entries), skipping`);
return;
}
// Create audit entries from OWASP data (one per project with "OWASP Top 10" category)
for (const [project, entries] of Object.entries(OWASP_DATA)) {
const avgScore = Math.round(entries.reduce((s, e) => s + e.score, 0) / entries.length);
const findings = entries.flatMap((e) =>
e.findings.map((f) => ({
id: crypto.randomUUID(),
title: `${e.riskId}: ${f}`,
status: e.status === "fail" ? ("critical" as const) : e.status === "partial" ? ("needs_improvement" as const) : ("strong" as const),
description: f,
recommendation: e.mitigations[0] || "Review and address this finding",
}))
);
await db.insert(securityAudits).values({
projectName: project,
category: "OWASP API Top 10",
findings: findings.slice(0, 20), // Cap at 20 findings per audit
score: avgScore,
});
}
console.log(` Audits: Seeded ${Object.keys(OWASP_DATA).length} OWASP audit entries`);
}
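The per-project score in `seedAudits` is a plain rounded mean of the ten OWASP risk scores; a self-contained sketch (hypothetical `Scored` type for illustration):

```typescript
// Minimal sketch of the average-score calculation used in seedAudits.
type Scored = { score: number };

function averageScore(entries: Scored[]): number {
  // Mean of all risk scores, rounded to the nearest integer.
  return Math.round(entries.reduce((sum, e) => sum + e.score, 0) / entries.length);
}

// e.g. averageScore([{ score: 70 }, { score: 80 }, { score: 75 }]) === 75
```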
// ═══════════════════════════════════════════
// MAIN
// ═══════════════════════════════════════════
async function main() {
console.log("🛡️ Security Seed — Starting...");
try {
console.log("📊 Seeding OWASP API Top 10 scores...");
await seedOwasp();
} catch (e) {
console.error("OWASP seed error:", e);
}
try {
console.log("📋 Seeding security checklists...");
await seedChecklists();
} catch (e) {
console.error("Checklist seed error:", e);
}
try {
console.log("🔍 Seeding security audits...");
await seedAudits();
} catch (e) {
console.error("Audit seed error:", e);
}
console.log("✅ Security seed complete!");
process.exit(0);
}
main().catch((e) => {
console.error("Fatal seed error:", e);
process.exit(1);
});


@@ -0,0 +1,334 @@
import { db } from "./db";
import { securityChecklist } from "./db/schema";
interface ChecklistItem {
projectName: string;
category: string;
item: string;
status: "pass" | "fail" | "partial" | "not_applicable" | "not_checked";
notes: string | null;
}
const items: ChecklistItem[] = [
// ═══════════════════════════════════════════
// HAMMER DASHBOARD
// ═══════════════════════════════════════════
// Auth & Session Management
{ projectName: "Hammer Dashboard", category: "Auth & Session Management", item: "Passwords hashed with bcrypt/argon2/scrypt", status: "pass", notes: "BetterAuth handles password hashing securely" },
{ projectName: "Hammer Dashboard", category: "Auth & Session Management", item: "Session tokens are cryptographically random", status: "pass", notes: "BetterAuth generates secure session tokens" },
{ projectName: "Hammer Dashboard", category: "Auth & Session Management", item: "Session expiry enforced", status: "pass", notes: "Sessions expire per BetterAuth defaults" },
{ projectName: "Hammer Dashboard", category: "Auth & Session Management", item: "Secure cookie attributes (HttpOnly, Secure, SameSite)", status: "pass", notes: "Cookie config: secure=true, sameSite=none, httpOnly=true" },
{ projectName: "Hammer Dashboard", category: "Auth & Session Management", item: "CSRF protection enabled", status: "pass", notes: "disableCSRFCheck: false explicitly set" },
{ projectName: "Hammer Dashboard", category: "Auth & Session Management", item: "MFA / 2FA available", status: "fail", notes: "No MFA support configured" },
{ projectName: "Hammer Dashboard", category: "Auth & Session Management", item: "Password complexity requirements enforced", status: "fail", notes: "No password policy configured in BetterAuth" },
{ projectName: "Hammer Dashboard", category: "Auth & Session Management", item: "Account lockout after failed attempts", status: "not_checked", notes: "BetterAuth may handle this — needs verification" },
{ projectName: "Hammer Dashboard", category: "Auth & Session Management", item: "Registration restricted (invite-only or approval)", status: "fail", notes: "Open signup enabled — emailAndPassword.enabled without disableSignUp" },
{ projectName: "Hammer Dashboard", category: "Auth & Session Management", item: "Session invalidation on password change", status: "pass", notes: "BetterAuth invalidates sessions on credential change" },
// Authorization
{ projectName: "Hammer Dashboard", category: "Authorization", item: "Object-level access control (users can only access own data)", status: "partial", notes: "Task queue is shared by design — no per-user isolation. Admin role exists." },
{ projectName: "Hammer Dashboard", category: "Authorization", item: "Function-level access control (admin vs user)", status: "pass", notes: "requireAdmin() check on admin-only routes" },
{ projectName: "Hammer Dashboard", category: "Authorization", item: "Field-level access control", status: "pass", notes: "Elysia t.Object() schemas restrict accepted fields" },
{ projectName: "Hammer Dashboard", category: "Authorization", item: "API tokens scoped per client/service", status: "fail", notes: "Single static API_BEARER_TOKEN shared across all consumers" },
{ projectName: "Hammer Dashboard", category: "Authorization", item: "Principle of least privilege applied", status: "partial", notes: "Admin/user roles exist but token gives full access" },
// Input Validation
{ projectName: "Hammer Dashboard", category: "Input Validation", item: "All API inputs validated with schemas", status: "partial", notes: "Most routes use Elysia t.Object() — some routes lack validation" },
{ projectName: "Hammer Dashboard", category: "Input Validation", item: "SQL injection prevented (parameterized queries)", status: "pass", notes: "Drizzle ORM handles parameterization" },
{ projectName: "Hammer Dashboard", category: "Input Validation", item: "XSS prevention (output encoding)", status: "pass", notes: "React auto-escapes output. API returns JSON." },
{ projectName: "Hammer Dashboard", category: "Input Validation", item: "Path traversal prevented", status: "not_applicable", notes: "No file system operations in API" },
{ projectName: "Hammer Dashboard", category: "Input Validation", item: "File upload validation", status: "not_applicable", notes: "No file uploads in this app" },
{ projectName: "Hammer Dashboard", category: "Input Validation", item: "Request body size limits", status: "fail", notes: "No body size limits configured" },
// Transport & Data Protection
{ projectName: "Hammer Dashboard", category: "Transport & Data Protection", item: "HTTPS enforced on all endpoints", status: "pass", notes: "Let's Encrypt TLS via Traefik/Dokploy" },
{ projectName: "Hammer Dashboard", category: "Transport & Data Protection", item: "HSTS header set", status: "partial", notes: "May be set by Traefik — needs verification at app level" },
{ projectName: "Hammer Dashboard", category: "Transport & Data Protection", item: "CORS properly restricted", status: "partial", notes: "CORS includes localhost:5173 in production" },
{ projectName: "Hammer Dashboard", category: "Transport & Data Protection", item: "Encryption at rest for sensitive data", status: "fail", notes: "No disk or column-level encryption" },
{ projectName: "Hammer Dashboard", category: "Transport & Data Protection", item: "Database backups encrypted", status: "fail", notes: "No backup strategy exists" },
{ projectName: "Hammer Dashboard", category: "Transport & Data Protection", item: "Secrets stored securely (env vars / vault)", status: "pass", notes: "Env vars via Dokploy environment config" },
// Rate Limiting
{ projectName: "Hammer Dashboard", category: "Rate Limiting & Abuse Prevention", item: "Rate limiting on auth endpoints", status: "fail", notes: "No rate limiting middleware" },
{ projectName: "Hammer Dashboard", category: "Rate Limiting & Abuse Prevention", item: "Rate limiting on API endpoints", status: "fail", notes: "No rate limiting middleware" },
{ projectName: "Hammer Dashboard", category: "Rate Limiting & Abuse Prevention", item: "Bot/CAPTCHA protection on registration", status: "fail", notes: "No CAPTCHA or bot detection" },
{ projectName: "Hammer Dashboard", category: "Rate Limiting & Abuse Prevention", item: "Request throttling for expensive operations", status: "fail", notes: "No throttling configured" },
// Error Handling
{ projectName: "Hammer Dashboard", category: "Error Handling", item: "Generic error messages in production", status: "pass", notes: "Returns 'Internal server error' without stack traces" },
{ projectName: "Hammer Dashboard", category: "Error Handling", item: "No stack traces leaked to clients", status: "pass", notes: "Error handler is generic" },
{ projectName: "Hammer Dashboard", category: "Error Handling", item: "Consistent error response format", status: "pass", notes: "All errors return { error: string }" },
{ projectName: "Hammer Dashboard", category: "Error Handling", item: "Uncaught exception handler", status: "partial", notes: "Elysia onError catches route errors; no process-level handler" },
// Logging & Monitoring
{ projectName: "Hammer Dashboard", category: "Logging & Monitoring", item: "Structured logging (not just console.log)", status: "fail", notes: "Console-only logging" },
{ projectName: "Hammer Dashboard", category: "Logging & Monitoring", item: "Auth events logged (login, logout, failed attempts)", status: "fail", notes: "No auth event logging" },
{ projectName: "Hammer Dashboard", category: "Logging & Monitoring", item: "Data access audit trail", status: "fail", notes: "No audit logging" },
{ projectName: "Hammer Dashboard", category: "Logging & Monitoring", item: "Error alerting configured", status: "fail", notes: "No alerting system" },
{ projectName: "Hammer Dashboard", category: "Logging & Monitoring", item: "Uptime monitoring", status: "fail", notes: "No external monitoring" },
{ projectName: "Hammer Dashboard", category: "Logging & Monitoring", item: "Log aggregation / centralized logging", status: "fail", notes: "No log aggregation — stdout only" },
// Infrastructure
{ projectName: "Hammer Dashboard", category: "Infrastructure", item: "Container isolation (separate containers per service)", status: "pass", notes: "Docker compose with separate backend + db containers" },
{ projectName: "Hammer Dashboard", category: "Infrastructure", item: "Minimal base images", status: "partial", notes: "Uses oven/bun — not minimal but purpose-built" },
{ projectName: "Hammer Dashboard", category: "Infrastructure", item: "No root user in containers", status: "not_checked", notes: "Need to verify Dockerfile USER directive" },
{ projectName: "Hammer Dashboard", category: "Infrastructure", item: "Docker health checks defined", status: "fail", notes: "No HEALTHCHECK in Dockerfile" },
{ projectName: "Hammer Dashboard", category: "Infrastructure", item: "Secrets not baked into images", status: "pass", notes: "Secrets via env vars at runtime" },
{ projectName: "Hammer Dashboard", category: "Infrastructure", item: "Automated deployment (CI/CD)", status: "pass", notes: "Gitea Actions + Dokploy deploy" },
// Security Headers
{ projectName: "Hammer Dashboard", category: "Security Headers", item: "Content-Security-Policy (CSP)", status: "fail", notes: "No CSP header set at application level" },
{ projectName: "Hammer Dashboard", category: "Security Headers", item: "X-Content-Type-Options: nosniff", status: "not_checked", notes: "May be set by Traefik — needs verification" },
{ projectName: "Hammer Dashboard", category: "Security Headers", item: "X-Frame-Options: DENY", status: "not_checked", notes: "May be set by Traefik — needs verification" },
{ projectName: "Hammer Dashboard", category: "Security Headers", item: "X-XSS-Protection", status: "not_checked", notes: "Deprecated but worth checking" },
{ projectName: "Hammer Dashboard", category: "Security Headers", item: "Referrer-Policy", status: "not_checked", notes: "Not configured at app level" },
{ projectName: "Hammer Dashboard", category: "Security Headers", item: "Permissions-Policy", status: "fail", notes: "Not configured" },
// ═══════════════════════════════════════════
// NETWORK APP
// ═══════════════════════════════════════════
// Auth & Session Management
{ projectName: "Network App", category: "Auth & Session Management", item: "Passwords hashed with bcrypt/argon2/scrypt", status: "pass", notes: "BetterAuth handles hashing" },
{ projectName: "Network App", category: "Auth & Session Management", item: "Session tokens are cryptographically random", status: "pass", notes: "BetterAuth secure tokens" },
{ projectName: "Network App", category: "Auth & Session Management", item: "Session expiry enforced", status: "pass", notes: "7-day expiry with daily refresh" },
{ projectName: "Network App", category: "Auth & Session Management", item: "Secure cookie attributes", status: "pass", notes: "secure=true, sameSite=none, httpOnly, cross-subdomain scoped" },
{ projectName: "Network App", category: "Auth & Session Management", item: "CSRF protection enabled", status: "pass", notes: "BetterAuth CSRF enabled" },
{ projectName: "Network App", category: "Auth & Session Management", item: "MFA / 2FA available", status: "fail", notes: "No MFA support" },
{ projectName: "Network App", category: "Auth & Session Management", item: "Registration restricted (invite-only)", status: "pass", notes: "disableSignUp: true + 403 on signup endpoint" },
{ projectName: "Network App", category: "Auth & Session Management", item: "Bearer token support for mobile", status: "pass", notes: "BetterAuth bearer plugin enabled" },
{ projectName: "Network App", category: "Auth & Session Management", item: "Password complexity requirements", status: "fail", notes: "No password policy enforced" },
{ projectName: "Network App", category: "Auth & Session Management", item: "Account lockout after failed attempts", status: "not_checked", notes: "Needs verification" },
// Authorization
{ projectName: "Network App", category: "Authorization", item: "Object-level access control (user-scoped queries)", status: "pass", notes: "All queries use eq(clients.userId, user.id)" },
{ projectName: "Network App", category: "Authorization", item: "Function-level access control (admin vs user)", status: "pass", notes: "Admin routes check user.role === 'admin'" },
{ projectName: "Network App", category: "Authorization", item: "Centralized auth middleware", status: "pass", notes: "authMiddleware Elysia plugin with 'as: scoped'" },
{ projectName: "Network App", category: "Authorization", item: "Field-level input validation", status: "partial", notes: "Most fields validated — 'role' field accepts arbitrary strings" },
// Input Validation
{ projectName: "Network App", category: "Input Validation", item: "All API inputs validated", status: "pass", notes: "34+ route files use Elysia t.Object() schemas" },
{ projectName: "Network App", category: "Input Validation", item: "SQL injection prevented", status: "pass", notes: "Drizzle ORM parameterized queries" },
{ projectName: "Network App", category: "Input Validation", item: "XSS prevention", status: "pass", notes: "React auto-escapes; API returns JSON" },
{ projectName: "Network App", category: "Input Validation", item: "File upload validation", status: "partial", notes: "Document uploads exist — need to verify size/type checks" },
{ projectName: "Network App", category: "Input Validation", item: "Request body size limits", status: "not_checked", notes: "Needs verification" },
// Transport & Data Protection
{ projectName: "Network App", category: "Transport & Data Protection", item: "HTTPS enforced", status: "pass", notes: "Let's Encrypt TLS" },
{ projectName: "Network App", category: "Transport & Data Protection", item: "CORS properly restricted", status: "partial", notes: "Falls back to localhost:3000 if env not set" },
{ projectName: "Network App", category: "Transport & Data Protection", item: "PII encryption at rest", status: "fail", notes: "Contact data (names, emails, phones) stored as plain text" },
{ projectName: "Network App", category: "Transport & Data Protection", item: "Secrets stored securely", status: "pass", notes: "Env vars via Dokploy" },
{ projectName: "Network App", category: "Transport & Data Protection", item: "API key rotation for external services", status: "fail", notes: "Resend API key not rotated" },
// Rate Limiting
{ projectName: "Network App", category: "Rate Limiting & Abuse Prevention", item: "Rate limiting on auth endpoints", status: "pass", notes: "5 req/min per IP on auth" },
{ projectName: "Network App", category: "Rate Limiting & Abuse Prevention", item: "Rate limiting on API endpoints", status: "pass", notes: "100 req/min global per IP" },
{ projectName: "Network App", category: "Rate Limiting & Abuse Prevention", item: "Rate limiting on AI endpoints", status: "pass", notes: "10 req/min on AI routes" },
{ projectName: "Network App", category: "Rate Limiting & Abuse Prevention", item: "Rate limit headers in responses", status: "pass", notes: "Returns Retry-After on 429" },
// Error Handling
{ projectName: "Network App", category: "Error Handling", item: "Generic error messages in production", status: "fail", notes: "Stack traces included in error responses" },
{ projectName: "Network App", category: "Error Handling", item: "No stack traces leaked", status: "fail", notes: "Error handler sends stack to client" },
{ projectName: "Network App", category: "Error Handling", item: "Consistent error response format", status: "pass", notes: "Standardized error format" },
{ projectName: "Network App", category: "Error Handling", item: "Error boundary in frontend", status: "pass", notes: "ErrorBoundary + ToastContainer implemented" },
// Logging & Monitoring
{ projectName: "Network App", category: "Logging & Monitoring", item: "Audit logging implemented", status: "pass", notes: "audit_logs table tracks all CRUD operations" },
{ projectName: "Network App", category: "Logging & Monitoring", item: "Structured logging", status: "fail", notes: "Console-based logging only" },
{ projectName: "Network App", category: "Logging & Monitoring", item: "Error alerting", status: "fail", notes: "No alerting configured" },
{ projectName: "Network App", category: "Logging & Monitoring", item: "Uptime monitoring", status: "fail", notes: "No external monitoring" },
// Infrastructure
{ projectName: "Network App", category: "Infrastructure", item: "Container isolation", status: "pass", notes: "Separate Docker containers" },
{ projectName: "Network App", category: "Infrastructure", item: "Production Dockerfile with minimal deps", status: "pass", notes: "Multi-stage build, --production flag, NODE_ENV=production" },
{ projectName: "Network App", category: "Infrastructure", item: "Docker health checks", status: "fail", notes: "No HEALTHCHECK in Dockerfile" },
{ projectName: "Network App", category: "Infrastructure", item: "Automated CI/CD", status: "pass", notes: "Gitea Actions + Dokploy" },
// Security Headers
{ projectName: "Network App", category: "Security Headers", item: "Content-Security-Policy (CSP)", status: "fail", notes: "Not configured" },
{ projectName: "Network App", category: "Security Headers", item: "X-Content-Type-Options: nosniff", status: "not_checked", notes: "Needs verification" },
{ projectName: "Network App", category: "Security Headers", item: "X-Frame-Options", status: "not_checked", notes: "Needs verification" },
{ projectName: "Network App", category: "Security Headers", item: "Referrer-Policy", status: "not_checked", notes: "Needs verification" },
// ═══════════════════════════════════════════
// TODO APP
// ═══════════════════════════════════════════
// Auth & Session Management
{ projectName: "Todo App", category: "Auth & Session Management", item: "Passwords hashed securely", status: "pass", notes: "BetterAuth handles hashing" },
{ projectName: "Todo App", category: "Auth & Session Management", item: "Session tokens cryptographically random", status: "pass", notes: "BetterAuth secure tokens" },
{ projectName: "Todo App", category: "Auth & Session Management", item: "Session expiry enforced", status: "pass", notes: "BetterAuth defaults" },
{ projectName: "Todo App", category: "Auth & Session Management", item: "Secure cookie attributes", status: "pass", notes: "Configured in BetterAuth" },
{ projectName: "Todo App", category: "Auth & Session Management", item: "MFA / 2FA available", status: "fail", notes: "No MFA" },
{ projectName: "Todo App", category: "Auth & Session Management", item: "Registration restricted (invite-only)", status: "pass", notes: "Invite system with expiring tokens" },
{ projectName: "Todo App", category: "Auth & Session Management", item: "Hammer service auth separated", status: "pass", notes: "Dedicated HAMMER_API_KEY for service account" },
// Authorization
{ projectName: "Todo App", category: "Authorization", item: "Object-level access control", status: "pass", notes: "Tasks filtered by eq(tasks.userId, userId)" },
{ projectName: "Todo App", category: "Authorization", item: "Function-level access control", status: "pass", notes: "Admin role checking on admin routes" },
{ projectName: "Todo App", category: "Authorization", item: "Service account scope limited", status: "partial", notes: "Hammer service has broad access to create/update for any user" },
// Input Validation
{ projectName: "Todo App", category: "Input Validation", item: "API inputs validated with schemas", status: "pass", notes: "Elysia t.Object() type validation on routes" },
{ projectName: "Todo App", category: "Input Validation", item: "SQL injection prevented", status: "pass", notes: "Drizzle ORM" },
{ projectName: "Todo App", category: "Input Validation", item: "XSS prevention", status: "pass", notes: "React + JSON API" },
{ projectName: "Todo App", category: "Input Validation", item: "Webhook URL validation", status: "partial", notes: "Webhook URLs stored by admin — no scheme/host validation" },
// Transport & Data Protection
{ projectName: "Todo App", category: "Transport & Data Protection", item: "HTTPS enforced", status: "pass", notes: "Let's Encrypt TLS" },
{ projectName: "Todo App", category: "Transport & Data Protection", item: "CORS properly restricted", status: "partial", notes: "Falls back to localhost:5173 if env not set" },
{ projectName: "Todo App", category: "Transport & Data Protection", item: "Database backups", status: "fail", notes: "No backup strategy" },
// Rate Limiting
{ projectName: "Todo App", category: "Rate Limiting & Abuse Prevention", item: "Rate limiting on auth endpoints", status: "fail", notes: "No rate limiting" },
{ projectName: "Todo App", category: "Rate Limiting & Abuse Prevention", item: "Rate limiting on API endpoints", status: "fail", notes: "No rate limiting" },
// Error Handling
{ projectName: "Todo App", category: "Error Handling", item: "Generic error messages in production", status: "pass", notes: "Checks NODE_ENV for stack traces" },
{ projectName: "Todo App", category: "Error Handling", item: "Consistent error format", status: "pass", notes: "Standardized error responses" },
// Logging & Monitoring
{ projectName: "Todo App", category: "Logging & Monitoring", item: "Audit logging", status: "fail", notes: "No audit logging" },
{ projectName: "Todo App", category: "Logging & Monitoring", item: "Structured logging", status: "fail", notes: "Console-only" },
{ projectName: "Todo App", category: "Logging & Monitoring", item: "Error alerting", status: "fail", notes: "No alerting" },
{ projectName: "Todo App", category: "Logging & Monitoring", item: "Uptime monitoring", status: "fail", notes: "No monitoring" },
// Infrastructure
{ projectName: "Todo App", category: "Infrastructure", item: "Container isolation", status: "pass", notes: "Docker compose" },
{ projectName: "Todo App", category: "Infrastructure", item: "Docker health checks", status: "fail", notes: "No HEALTHCHECK" },
{ projectName: "Todo App", category: "Infrastructure", item: "Automated CI/CD", status: "pass", notes: "Gitea Actions + Dokploy" },
// Security Headers
{ projectName: "Todo App", category: "Security Headers", item: "CSP header", status: "fail", notes: "Not configured" },
{ projectName: "Todo App", category: "Security Headers", item: "X-Content-Type-Options", status: "not_checked", notes: "Needs verification" },
{ projectName: "Todo App", category: "Security Headers", item: "X-Frame-Options", status: "not_checked", notes: "Needs verification" },
// ═══════════════════════════════════════════
// NKODE
// ═══════════════════════════════════════════
// Auth & Session Management
{ projectName: "nKode", category: "Auth & Session Management", item: "OPAQUE protocol (zero-knowledge password)", status: "pass", notes: "Server never sees plaintext passwords — state-of-the-art" },
{ projectName: "nKode", category: "Auth & Session Management", item: "Argon2 password hashing in OPAQUE", status: "pass", notes: "Configured via opaque-ke features" },
{ projectName: "nKode", category: "Auth & Session Management", item: "OIDC token-based sessions", status: "pass", notes: "Full OIDC implementation with JWK signing" },
{ projectName: "nKode", category: "Auth & Session Management", item: "MFA / 2FA available", status: "fail", notes: "No second factor — OPAQUE is single-factor" },
{ projectName: "nKode", category: "Auth & Session Management", item: "Cryptographic session signatures", status: "pass", notes: "HEADER_SIGNATURE + HEADER_TIMESTAMP verification" },
// Authorization
{ projectName: "nKode", category: "Authorization", item: "Token-based authorization", status: "pass", notes: "OIDC JWT tokens for API auth" },
{ projectName: "nKode", category: "Authorization", item: "Auth extractors for route protection", status: "pass", notes: "extractors.rs provides consistent auth extraction" },
{ projectName: "nKode", category: "Authorization", item: "Role-based access control", status: "fail", notes: "No visible RBAC — all authenticated users have equal access" },
// Input Validation
{ projectName: "nKode", category: "Input Validation", item: "Type-safe deserialization (serde)", status: "pass", notes: "Rust serde enforces strict type contracts" },
{ projectName: "nKode", category: "Input Validation", item: "Memory safety (Rust)", status: "pass", notes: "Eliminates buffer overflows, use-after-free, data races" },
{ projectName: "nKode", category: "Input Validation", item: "SQL injection prevented", status: "pass", notes: "SQLx with parameterized queries" },
// Transport & Data Protection
{ projectName: "nKode", category: "Transport & Data Protection", item: "HTTPS enforced", status: "pass", notes: "Let's Encrypt TLS" },
{ projectName: "nKode", category: "Transport & Data Protection", item: "OPAQUE prevents password exposure", status: "pass", notes: "DB breach doesn't expose passwords" },
{ projectName: "nKode", category: "Transport & Data Protection", item: "Login data encryption at rest", status: "fail", notes: "Stored login data not encrypted at application level" },
{ projectName: "nKode", category: "Transport & Data Protection", item: "CORS properly restricted", status: "fail", notes: "Hardcoded localhost origins in production code" },
// Rate Limiting
{ projectName: "nKode", category: "Rate Limiting & Abuse Prevention", item: "Rate limiting on auth endpoints", status: "fail", notes: "No tower-governor or rate limiting middleware" },
{ projectName: "nKode", category: "Rate Limiting & Abuse Prevention", item: "Rate limiting on API endpoints", status: "fail", notes: "No rate limiting" },
{ projectName: "nKode", category: "Rate Limiting & Abuse Prevention", item: "Argon2 DoS protection", status: "fail", notes: "Expensive OPAQUE/Argon2 flows could be abused for resource exhaustion" },
// Error Handling
{ projectName: "nKode", category: "Error Handling", item: "Proper Axum error types", status: "pass", notes: "Uses Axum error handling properly" },
{ projectName: "nKode", category: "Error Handling", item: "No stack traces leaked", status: "pass", notes: "Rust error handling is explicit" },
// Logging & Monitoring
{ projectName: "nKode", category: "Logging & Monitoring", item: "Structured logging (tracing crate)", status: "pass", notes: "Uses Rust tracing ecosystem" },
{ projectName: "nKode", category: "Logging & Monitoring", item: "Log aggregation", status: "fail", notes: "Logs to stdout only" },
{ projectName: "nKode", category: "Logging & Monitoring", item: "Error alerting", status: "fail", notes: "No alerting" },
{ projectName: "nKode", category: "Logging & Monitoring", item: "Uptime monitoring", status: "fail", notes: "No monitoring" },
// Infrastructure
{ projectName: "nKode", category: "Infrastructure", item: "Container isolation", status: "pass", notes: "Docker on Dokploy" },
{ projectName: "nKode", category: "Infrastructure", item: "Minimal base image (Rust binary)", status: "pass", notes: "Small attack surface" },
{ projectName: "nKode", category: "Infrastructure", item: "Docker health checks", status: "fail", notes: "No HEALTHCHECK" },
// Security Headers
{ projectName: "nKode", category: "Security Headers", item: "CSP header", status: "fail", notes: "Not configured" },
{ projectName: "nKode", category: "Security Headers", item: "X-Content-Type-Options", status: "not_checked", notes: "Needs verification" },
{ projectName: "nKode", category: "Security Headers", item: "X-Frame-Options", status: "not_checked", notes: "Needs verification" },
// ═══════════════════════════════════════════
// INFRASTRUCTURE
// ═══════════════════════════════════════════
// Auth & Session Management
{ projectName: "Infrastructure", category: "Auth & Session Management", item: "SSH key authentication", status: "pass", notes: "VPS supports SSH key auth" },
{ projectName: "Infrastructure", category: "Auth & Session Management", item: "SSH password auth disabled", status: "not_checked", notes: "Needs audit on both VPSes" },
{ projectName: "Infrastructure", category: "Auth & Session Management", item: "Gitea auth properly configured", status: "pass", notes: "Self-hosted with authenticated access" },
{ projectName: "Infrastructure", category: "Auth & Session Management", item: "Git credentials not in URLs", status: "fail", notes: "Credentials embedded in remote URLs" },
// Transport & Data Protection
{ projectName: "Infrastructure", category: "Transport & Data Protection", item: "TLS on all public endpoints", status: "pass", notes: "All 7+ domains have valid Let's Encrypt certs" },
{ projectName: "Infrastructure", category: "Transport & Data Protection", item: "DNSSEC enabled", status: "fail", notes: "No DNSSEC on donovankelly.xyz" },
{ projectName: "Infrastructure", category: "Transport & Data Protection", item: "Centralized backup strategy", status: "fail", notes: "No unified backup across services" },
{ projectName: "Infrastructure", category: "Transport & Data Protection", item: "Secrets rotation policy", status: "fail", notes: "No rotation schedule for tokens/passwords" },
// Infrastructure
{ projectName: "Infrastructure", category: "Infrastructure", item: "Firewall rules documented and audited", status: "fail", notes: "No documentation of iptables/ufw rules" },
{ projectName: "Infrastructure", category: "Infrastructure", item: "Exposed ports audited", status: "fail", notes: "No port scan audit performed" },
{ projectName: "Infrastructure", category: "Infrastructure", item: "SSH on non-default port", status: "not_checked", notes: "Needs verification" },
{ projectName: "Infrastructure", category: "Infrastructure", item: "Fail2ban installed and configured", status: "fail", notes: "No IDS/IPS verified" },
{ projectName: "Infrastructure", category: "Infrastructure", item: "Unattended security updates enabled", status: "not_checked", notes: "Needs verification on both VPSes" },
{ projectName: "Infrastructure", category: "Infrastructure", item: "Container vulnerability scanning", status: "fail", notes: "No Trivy or similar scanning" },
// Logging & Monitoring
{ projectName: "Infrastructure", category: "Logging & Monitoring", item: "Centralized log aggregation", status: "fail", notes: "Each container logs independently to stdout" },
{ projectName: "Infrastructure", category: "Logging & Monitoring", item: "Uptime monitoring for all domains", status: "fail", notes: "No UptimeRobot or similar" },
{ projectName: "Infrastructure", category: "Logging & Monitoring", item: "Intrusion detection system", status: "fail", notes: "No IDS on either VPS" },
{ projectName: "Infrastructure", category: "Logging & Monitoring", item: "System log monitoring", status: "fail", notes: "No syslog analysis" },
// Security Headers (Traefik/reverse proxy level)
{ projectName: "Infrastructure", category: "Security Headers", item: "HSTS on all domains", status: "not_checked", notes: "Needs verification at Traefik level" },
{ projectName: "Infrastructure", category: "Security Headers", item: "Security headers middleware in Traefik", status: "not_checked", notes: "Needs verification" },
];
async function seedChecklist() {
console.log("📋 Seeding security checklist data...");
// Clear existing
await db.delete(securityChecklist);
console.log(" Cleared existing checklist data");
// Bulk insert
const values = items.map(i => ({
projectName: i.projectName,
category: i.category,
item: i.item,
status: i.status,
notes: i.notes,
}));
// Insert in batches of 50
for (let i = 0; i < values.length; i += 50) {
const batch = values.slice(i, i + 50);
await db.insert(securityChecklist).values(batch);
}
console.log(` ✅ Inserted ${items.length} checklist items`);
// Summary
const projects = [...new Set(items.map(i => i.projectName))];
for (const project of projects) {
const projectItems = items.filter(i => i.projectName === project);
const pass = projectItems.filter(i => i.status === "pass").length;
const fail = projectItems.filter(i => i.status === "fail").length;
const partial = projectItems.filter(i => i.status === "partial").length;
console.log(` ${project}: ${pass} pass, ${fail} fail, ${partial} partial, ${projectItems.length} total`);
}
process.exit(0);
}
seedChecklist().catch((err) => {
console.error("Failed to seed checklist:", err);
process.exit(1);
});


@@ -0,0 +1,602 @@
/**
* Seed OWASP API Top 10 findings from real code inspection
* of Hammer Dashboard API, Network App API, and Todo App API.
*
* Run: bun run src/seed-owasp-findings.ts
*/
import { db } from "./db";
import { securityFindings } from "./db/schema";
import { sql } from "drizzle-orm";
interface Finding {
appName: string;
category: string;
severity: "critical" | "high" | "medium" | "low" | "info";
title: string;
description: string;
recommendation: string;
status: "open" | "mitigated" | "accepted" | "false_positive";
owaspId: string;
}
const findings: Finding[] = [
// ════════════════════════════════════════════════════════════════
// HAMMER DASHBOARD API (hammer-queue/backend)
// ════════════════════════════════════════════════════════════════
// API1: Broken Object Level Authorization
{
appName: "Hammer Dashboard API",
category: "Broken Object Level Authorization",
severity: "high",
title: "No ownership validation on task operations",
description: "Task CRUD endpoints (GET/PATCH/DELETE /api/tasks/:id) verify authentication via bearer token or session but do not check whether the authenticated user owns or is assigned to the task. Any authenticated user can read, modify, or delete any task by UUID.",
recommendation: "Add ownership checks: verify the requesting user is the task creator, assignee, or an admin before allowing read/write operations on task resources.",
status: "open",
owaspId: "API1",
},
{
appName: "Hammer Dashboard API",
category: "Broken Object Level Authorization",
severity: "medium",
title: "Comment creation lacks task ownership verification",
description: "The POST /api/tasks/:id/notes and subtask endpoints only check for valid auth but not whether the user should have access to the specific task. Bearer token grants blanket access to all tasks.",
recommendation: "Implement per-resource authorization checks that verify the caller has legitimate access to the specific task before allowing modifications.",
status: "open",
owaspId: "API1",
},
// API2: Broken Authentication
{
appName: "Hammer Dashboard API",
category: "Broken Authentication",
severity: "critical",
title: "Hardcoded fallback bearer token in source code",
description: "The API_BEARER_TOKEN falls back to 'hammer-dev-token' if the environment variable is not set (tasks.ts line: `const BEARER_TOKEN = process.env.API_BEARER_TOKEN || \"hammer-dev-token\"`). This default token is committed to source control and would allow unauthorized access if the env var is missing in production.",
recommendation: "Remove the hardcoded fallback. Fail closed: if API_BEARER_TOKEN is not set, reject all bearer-token auth attempts. Add startup validation to ensure required env vars are present.",
status: "open",
owaspId: "API2",
},
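  // The fail-closed pattern recommended above, sketched inline as comments
  // (illustrative only; not part of the findings data):
  //   const BEARER_TOKEN = process.env.API_BEARER_TOKEN;
  //   if (!BEARER_TOKEN) {
  //     throw new Error("API_BEARER_TOKEN is required; refusing to start");
  //   }
  // Startup now fails loudly instead of silently accepting "hammer-dev-token".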
{
appName: "Hammer Dashboard API",
category: "Broken Authentication",
severity: "medium",
title: "Bearer token provides equivalent access to admin session",
description: "The static bearer token grants the same privileges as an admin session across all endpoints. There is no distinction between API-level and user-level access, making the single token a high-value target.",
recommendation: "Implement scoped API keys with granular permissions. Consider JWT tokens with claims rather than a single static bearer token.",
status: "open",
owaspId: "API2",
},
// API3: Broken Object Property Level Authorization
{
appName: "Hammer Dashboard API",
category: "Broken Object Property Level Authorization",
severity: "medium",
title: "PATCH /api/tasks/:id accepts arbitrary field updates",
description: "The task update endpoint accepts a wide set of fields including status, priority, assigneeId, assigneeName, and progressNotes. Any authenticated user can change task assignment or escalate priority without role-based restrictions on which fields they can modify.",
recommendation: "Implement field-level authorization: restrict which fields different roles can update. Only admins/task owners should be able to reassign tasks or change priority.",
status: "open",
owaspId: "API3",
},
{
appName: "Hammer Dashboard API",
category: "Broken Object Property Level Authorization",
severity: "low",
title: "Admin role update has no enum validation on PATCH body",
description: "The admin route PATCH /api/admin/users/:id/role accepts t.String() for role with no validation against allowed values. An attacker could set role to arbitrary strings like 'superadmin'.",
recommendation: "Use t.Union([t.Literal('admin'), t.Literal('user')]) for role validation, matching the expected set of roles.",
status: "open",
owaspId: "API3",
},
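  // The enum validation recommended above, sketched inline as comments
  // (assumed Elysia/TypeBox schema shape; illustrative only):
  //   body: t.Object({
  //     role: t.Union([t.Literal("admin"), t.Literal("user")]),
  //   })
  // Any other role value is rejected by the schema before the handler runs.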
// API4: Unrestricted Resource Consumption
{
appName: "Hammer Dashboard API",
category: "Unrestricted Resource Consumption",
severity: "high",
title: "No rate limiting on any API endpoints",
description: "The Hammer Dashboard API has zero rate limiting. All endpoints including auth, task creation, and admin operations can be called without throttling, enabling brute-force attacks, DoS, and resource exhaustion.",
recommendation: "Implement rate limiting middleware (e.g., elysia-rate-limit or custom in-memory/Redis limiter). Apply stricter limits to auth endpoints (5 req/min) and general limits (100 req/min).",
status: "open",
owaspId: "API4",
},
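  // A minimal fixed-window limiter matching the recommendation above,
  // sketched inline as comments (hypothetical names; illustrative only):
  //   const hits = new Map<string, { count: number; reset: number }>();
  //   function allow(ip: string, limit = 100, windowMs = 60_000): boolean {
  //     const now = Date.now();
  //     const entry = hits.get(ip);
  //     if (!entry || now > entry.reset) {
  //       hits.set(ip, { count: 1, reset: now + windowMs });
  //       return true;
  //     }
  //     return ++entry.count <= limit;
  //   }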
{
appName: "Hammer Dashboard API",
category: "Unrestricted Resource Consumption",
severity: "medium",
title: "GET /api/tasks returns all tasks without pagination",
description: "The task list endpoint returns all tasks in a single response with no pagination, limit, or cursor support. As the task count grows, this will cause increasing memory usage and response times.",
recommendation: "Add pagination parameters (page, limit, cursor) to the task list endpoint. Default to a reasonable page size (50-100 items).",
status: "open",
owaspId: "API4",
},
// API5: Broken Function Level Authorization
{
appName: "Hammer Dashboard API",
category: "Broken Function Level Authorization",
severity: "high",
title: "Admin endpoints accessible with bearer token - no role check",
description: "Admin routes (DELETE /api/admin/users/:id, PATCH /api/admin/users/:id/role) treat bearer token auth as equivalent to admin. If the bearer token leaks, an attacker gains full admin capabilities including user deletion and role escalation.",
recommendation: "Require session-based admin auth for destructive admin operations. Bearer tokens should have limited scope and not grant admin privileges.",
status: "open",
owaspId: "API5",
},
{
appName: "Hammer Dashboard API",
category: "Broken Function Level Authorization",
severity: "medium",
title: "POST /api/invite accessible with bearer token",
description: "The invite endpoint allows creating new user accounts with just the bearer token. If the token is compromised, attackers can create arbitrary accounts.",
recommendation: "Restrict user creation to admin sessions only. Require explicit admin role verification for account management operations.",
status: "open",
owaspId: "API5",
},
// API6: Unrestricted Access to Sensitive Business Flows
{
appName: "Hammer Dashboard API",
category: "Unrestricted Access to Sensitive Business Flows",
severity: "medium",
title: "Task activation triggers webhook without confirmation",
description: "Setting a task to 'active' status automatically fires a webhook to Clawdbot which can trigger automated work. There's no confirmation step or undo mechanism, and the webhook URL is configurable via env vars.",
recommendation: "Add a confirmation mechanism for task activation. Implement webhook signature verification and ensure the webhook URL cannot be tampered with at runtime.",
status: "open",
owaspId: "API6",
},
// API7: Server Side Request Forgery
{
appName: "Hammer Dashboard API",
category: "Server Side Request Forgery",
severity: "medium",
title: "Webhook URL from env var used in server-side fetch",
description: "The CLAWDBOT_HOOK_URL is read from environment and used in a server-side fetch call. While the code checks for HTTPS prefix, if an attacker modifies the env var, they could redirect webhook calls to internal services.",
recommendation: "Validate the webhook URL against an allowlist of known-good hosts. Add SSRF protections to prevent requests to internal IPs (127.0.0.1, 10.x, 169.254.x, etc.).",
status: "open",
owaspId: "API7",
},
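  // The allowlist check recommended above, sketched inline as comments
  // (hypothetical host list; illustrative only):
  //   const ALLOWED_WEBHOOK_HOSTS = new Set(["clawdbot.example.com"]);
  //   function isSafeWebhookUrl(raw: string): boolean {
  //     try {
  //       const u = new URL(raw);
  //       return u.protocol === "https:" && ALLOWED_WEBHOOK_HOSTS.has(u.hostname);
  //     } catch {
  //       return false; // unparseable URLs are rejected
  //     }
  //   }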
// API8: Security Misconfiguration
{
appName: "Hammer Dashboard API",
category: "Security Misconfiguration",
severity: "medium",
title: "CORS allows localhost origin in production config",
description: "The CORS configuration includes 'http://localhost:5173' as an allowed origin alongside production domains. This could be exploited if an attacker runs a local server on a user's machine.",
recommendation: "Use environment-specific CORS configuration. Only include localhost origins in development mode. In production, strictly limit to production domains.",
status: "open",
owaspId: "API8",
},
{
appName: "Hammer Dashboard API",
category: "Security Misconfiguration",
severity: "low",
title: "Error handler exposes generic internal server error",
description: "The global error handler returns a generic 'Internal server error' without detail, which is good, but errors are logged to the console without structured logging, so stack traces could leak into logs in shared environments.",
recommendation: "Use structured logging (e.g., pino) with appropriate log levels. Ensure stack traces are never exposed in API responses.",
status: "mitigated",
owaspId: "API8",
},
// API9: Improper Inventory Management
{
appName: "Hammer Dashboard API",
category: "Improper Inventory Management",
severity: "low",
title: "No API versioning strategy",
description: "All API endpoints are served under /api/ with no version prefix. There's no mechanism to deprecate endpoints or manage breaking changes, which could lead to clients using outdated or insecure endpoints.",
recommendation: "Implement API versioning (e.g., /api/v1/) and maintain an API inventory document. Add deprecation headers for endpoints being phased out.",
status: "open",
owaspId: "API9",
},
{
appName: "Hammer Dashboard API",
category: "Improper Inventory Management",
severity: "info",
title: "Health endpoint is unauthenticated",
description: "The /health endpoint returns service status without authentication. While this is standard for health checks, it confirms the service exists and is running to unauthenticated callers.",
recommendation: "Consider returning minimal info on unauthenticated health checks. Add a detailed health endpoint behind auth for operational monitoring.",
status: "accepted",
owaspId: "API9",
},
// API10: Unsafe Consumption of APIs
{
appName: "Hammer Dashboard API",
category: "Unsafe Consumption of APIs",
severity: "low",
title: "Webhook response not validated",
description: "When firing webhooks to Clawdbot, the response from the external service is not checked for errors or validated. A compromised webhook endpoint could return malicious data that is silently ignored.",
recommendation: "Validate webhook responses. Log failed deliveries with detail. Implement retry logic with exponential backoff for transient failures.",
status: "open",
owaspId: "API10",
},
// ════════════════════════════════════════════════════════════════
// NETWORK APP API (network-app-api)
// ════════════════════════════════════════════════════════════════
// API1: Broken Object Level Authorization
{
appName: "Network App API",
category: "Broken Object Level Authorization",
severity: "medium",
title: "User ID scoping relies on session-derived userId",
description: "Client routes properly scope queries to the authenticated user's ID (eq(clients.userId, user.id)). This is a strong pattern. However, the export routes bypass the auth middleware and manually extract session tokens from cookies with regex parsing, which could be fragile.",
recommendation: "Refactor export routes to use the standard authMiddleware pattern rather than manual cookie parsing. This ensures consistent authorization across all endpoints.",
status: "open",
owaspId: "API1",
},
{
appName: "Network App API",
category: "Broken Object Level Authorization",
severity: "high",
title: "Export routes fall back to any admin user for bearer auth",
description: "Export routes (GET /api/export/json, /clients/csv, etc.) have a dangerous fallback: if bearer auth is used, they find ANY admin user and use that userId, effectively granting access to the first admin's data. This is incorrect object-level authorization.",
recommendation: "Bearer token auth should map to a specific user, not fall back to 'any admin'. Require explicit user identification in API key auth flows.",
status: "open",
owaspId: "API1",
},
// API2: Broken Authentication
{
appName: "Network App API",
category: "Broken Authentication",
severity: "medium",
title: "Session token extracted via regex from cookie header",
description: "Export routes parse session tokens using regex: `headers['cookie']?.match(/better-auth\\.session_token=([^;]+)/)?.[1]`. This manual parsing is fragile, could miss URL-encoded values, and bypasses BetterAuth's built-in session validation.",
recommendation: "Use BetterAuth's auth.api.getSession({ headers: request.headers }) consistently across all routes instead of manual cookie parsing.",
status: "open",
owaspId: "API2",
},
{
appName: "Network App API",
category: "Broken Authentication",
severity: "info",
title: "Open signup blocked at route level",
description: "Registration is invite-only, with a route-level block on POST /api/auth/sign-up/email that returns 403. This is a good security practice for a CRM application.",
recommendation: "No action needed. This is noted as a positive finding. Consider also blocking at the BetterAuth config level for defense in depth.",
status: "mitigated",
owaspId: "API2",
},
// API3: Broken Object Property Level Authorization
{
appName: "Network App API",
category: "Broken Object Property Level Authorization",
severity: "low",
title: "Client update accepts all fields without field-level restrictions",
description: "PUT /api/clients/:id accepts a partial update of any client field. While ownership is checked (userId), there's no distinction between which fields different user roles can modify.",
recommendation: "For multi-user deployments, implement field-level authorization. For single-user apps, this is acceptable risk.",
status: "accepted",
owaspId: "API3",
},
// API4: Unrestricted Resource Consumption
{
appName: "Network App API",
category: "Unrestricted Resource Consumption",
severity: "medium",
title: "Rate limiting uses an in-memory store that is not distributed",
description: "The rate limiter (middleware/rate-limit.ts) uses an in-memory Map. In a multi-instance deployment, each instance maintains its own counters, effectively multiplying the rate limit by the number of instances.",
recommendation: "For production multi-instance deployments, use Redis-backed rate limiting. The current in-memory approach is fine for single-instance deployments.",
status: "open",
owaspId: "API4",
},
{
appName: "Network App API",
category: "Unrestricted Resource Consumption",
severity: "high",
title: "AI endpoints (meeting prep, email generation) lack cost controls",
description: "AI-powered endpoints call OpenAI/Anthropic APIs. While rate-limited to 10 req/min, there's no per-user daily/monthly budget cap. A compromised account could generate significant API costs.",
recommendation: "Implement per-user daily/monthly usage limits for AI features. Add cost tracking and alerting. Consider requiring explicit user confirmation for expensive operations.",
status: "open",
owaspId: "API4",
},
{
appName: "Network App API",
category: "Unrestricted Resource Consumption",
severity: "medium",
title: "CSV import has no file size limit",
description: "The POST /api/clients/import endpoint accepts CSV file uploads without enforcing a maximum file size. A large CSV could exhaust memory as the entire file is read with body.file.text().",
recommendation: "Add file size limits (e.g., 10MB max). Process large imports in chunks or as background jobs rather than loading everything into memory.",
status: "open",
owaspId: "API4",
},
// API5: Broken Function Level Authorization
{
appName: "Network App API",
category: "Broken Function Level Authorization",
severity: "low",
title: "Admin guard uses onBeforeHandle - proper implementation",
description: "Admin routes use .onBeforeHandle to check role === 'admin' after authMiddleware. This is a correct implementation pattern that ensures all sub-routes are protected.",
recommendation: "No action needed. This is noted as a positive finding. Continue using this pattern for function-level auth.",
status: "mitigated",
owaspId: "API5",
},
// API6: Unrestricted Access to Sensitive Business Flows
{
appName: "Network App API",
category: "Unrestricted Access to Sensitive Business Flows",
severity: "medium",
title: "Bulk email generation lacks abuse prevention",
description: "The bulk email generation endpoint could be used to generate a large volume of emails quickly. While rate-limited, there's no business logic check to prevent sending duplicate or excessive communications to the same client.",
recommendation: "Add business logic checks: warn when contacting a client who was emailed within the last 24h. Implement daily send limits per client.",
status: "open",
owaspId: "API6",
},
// API7: Server Side Request Forgery
{
appName: "Network App API",
category: "Server Side Request Forgery",
severity: "low",
title: "Webhook URL stored in database with no validation",
description: "Hammer webhooks store URLs in the database. While the current implementation sends to pre-registered URLs, there's no validation that the URL doesn't point to internal services.",
recommendation: "Validate webhook URLs against an allowlist or block internal IP ranges. Verify URLs use HTTPS only.",
status: "open",
owaspId: "API7",
},
// API8: Security Misconfiguration
{
appName: "Network App API",
category: "Security Misconfiguration",
severity: "medium",
title: "Error handler exposes error details in response",
description: "The global error handler returns `details: message` in the 500 error response body. Internal error messages could leak implementation details (database schema, file paths, etc.) to API consumers.",
recommendation: "Never include internal error details in production API responses. Log details server-side but return only generic error messages to clients.",
status: "open",
owaspId: "API8",
},
{
appName: "Network App API",
category: "Security Misconfiguration",
severity: "low",
title: "Stack traces logged regardless of environment",
description: "The error handler always logs stack traces regardless of NODE_ENV. While logging is important, structured logging with appropriate levels would be more maintainable.",
recommendation: "Use a structured logging library. Adjust verbosity based on environment. Ensure logs are shipped to a centralized logging service.",
status: "open",
owaspId: "API8",
},
// API9: Improper Inventory Management
{
appName: "Network App API",
category: "Improper Inventory Management",
severity: "medium",
title: "Large number of API routes with no documentation",
description: "The API has 30+ route modules (clients, emails, events, interactions, templates, segments, etc.) with no OpenAPI/Swagger documentation. This makes it difficult to audit the full API surface and track deprecated endpoints.",
recommendation: "Generate OpenAPI documentation from Elysia route schemas. Maintain an API inventory with ownership and deprecation status for each endpoint group.",
status: "open",
owaspId: "API9",
},
// API10: Unsafe Consumption of APIs
{
appName: "Network App API",
category: "Unsafe Consumption of APIs",
severity: "medium",
title: "AI service responses parsed without validation",
description: "The meeting prep endpoint (services/ai.ts) calls external AI APIs and attempts to parse JSON from the response. While there's a try/catch fallback, the parsed JSON structure is not validated against a schema, potentially allowing injection of unexpected data.",
recommendation: "Validate AI response JSON against a strict schema (e.g., Zod/TypeBox). Sanitize all AI-generated content before storing in the database or returning to clients.",
status: "open",
owaspId: "API10",
},
{
appName: "Network App API",
category: "Unsafe Consumption of APIs",
severity: "high",
title: "AI API keys stored in environment without rotation policy",
description: "OpenAI and Anthropic API keys are read from environment variables with no mechanism for key rotation, expiration tracking, or separate keys per environment.",
recommendation: "Use a secrets manager (e.g., Vault, AWS Secrets Manager). Implement key rotation policies. Use separate API keys for dev/staging/prod.",
status: "open",
owaspId: "API10",
},
// ════════════════════════════════════════════════════════════════
// TODO APP API (todo-app/apps/api)
// ════════════════════════════════════════════════════════════════
// API1: Broken Object Level Authorization
{
appName: "Todo App API",
category: "Broken Object Level Authorization",
severity: "high",
title: "Task operations lack user ownership validation on GET/DELETE",
description: "GET /api/tasks/:id and DELETE /api/tasks/:id query by task ID only without filtering by userId. An authenticated user could read or delete another user's tasks by guessing UUIDs. The list endpoint correctly filters by project ownership, but individual operations do not.",
recommendation: "Add userId ownership check to all single-resource operations: WHERE id = :id AND userId = :userId. Apply consistently across GET, PATCH, and DELETE.",
status: "open",
owaspId: "API1",
},
{
appName: "Todo App API",
category: "Broken Object Level Authorization",
severity: "medium",
title: "Hammer API routes bypass user scoping for task access",
description: "The Hammer API routes (routes/hammer.ts) scope operations to the service user's assigned tasks, which is correct. However, the task creation endpoint allows creating tasks for any user by email (body.userEmail), which could be abused.",
recommendation: "Validate that the target userEmail is a known, active user. Add audit logging for cross-user task creation. Consider requiring explicit admin authorization.",
status: "open",
owaspId: "API1",
},
// API2: Broken Authentication
{
appName: "Todo App API",
category: "Broken Authentication",
severity: "medium",
title: "Hammer API key validation uses simple string comparison",
description: "The Hammer API key is validated with a simple string equality check. This is vulnerable to timing attacks that could allow an attacker to guess the key character by character.",
recommendation: "Use a constant-time comparison function (e.g., crypto.timingSafeEqual) for API key validation to prevent timing side-channel attacks.",
status: "open",
owaspId: "API2",
},
{
appName: "Todo App API",
category: "Broken Authentication",
severity: "low",
title: "Password reset tokens use 24-hour expiry",
description: "Admin-generated password reset tokens expire after 24 hours. While reasonable, shorter expiry (1-2 hours) would reduce the window of exposure if a reset link is intercepted.",
recommendation: "Consider reducing password reset token expiry to 1-2 hours. Add single-use enforcement (mark token as used after first use).",
status: "open",
owaspId: "API2",
},
// API3: Broken Object Property Level Authorization
{
appName: "Todo App API",
category: "Broken Object Property Level Authorization",
severity: "medium",
title: "Task update exposes userId through ownership transfer",
description: "While PATCH /api/tasks/:id doesn't directly accept userId, the assigneeId field can be set to any user ID. Combined with the Hammer API's cross-user capabilities, this could enable unauthorized data access.",
recommendation: "Validate assigneeId values against the set of users the caller is authorized to assign tasks to. Log assignment changes for audit purposes.",
status: "open",
owaspId: "API3",
},
// API4: Unrestricted Resource Consumption
{
appName: "Todo App API",
category: "Unrestricted Resource Consumption",
severity: "high",
title: "No rate limiting on any endpoints",
description: "The Todo App API has no rate limiting middleware at all. Authentication endpoints, task creation, and webhook operations are all unlimited.",
recommendation: "Add rate limiting middleware. Prioritize auth endpoints (5 req/min), webhook endpoints (10 req/min), and general API (100 req/min).",
status: "open",
owaspId: "API4",
},
{
appName: "Todo App API",
category: "Unrestricted Resource Consumption",
severity: "medium",
title: "No pagination on task list queries",
description: "Task listing returns all matching tasks without pagination. While filtered by user and project, power users with hundreds of tasks could trigger expensive queries.",
recommendation: "Implement cursor-based pagination. Add a default limit of 100 items per request.",
status: "open",
owaspId: "API4",
},
// API5: Broken Function Level Authorization
{
appName: "Todo App API",
category: "Broken Function Level Authorization",
severity: "medium",
title: "Admin route protection uses derive middleware correctly",
description: "Admin routes use authMiddleware + .derive() to check for admin role. This is a correct implementation. However, the admin can delete any user and create invites, which should be logged.",
recommendation: "Add audit logging for all admin operations (user deletion, role changes, invite creation). The current auth pattern is sound.",
status: "open",
owaspId: "API5",
},
// API6: Unrestricted Access to Sensitive Business Flows
{
appName: "Todo App API",
category: "Unrestricted Access to Sensitive Business Flows",
severity: "low",
title: "Webhook registration has no URL validation",
description: "The POST /api/hammer/webhooks endpoint accepts any URL for webhook registration. While this requires the Hammer API key, a compromised key could register webhooks pointing to malicious servers.",
recommendation: "Validate webhook URLs against an allowlist of trusted domains. Require HTTPS for all webhook URLs.",
status: "open",
owaspId: "API6",
},
// API7: Server Side Request Forgery
{
appName: "Todo App API",
category: "Server Side Request Forgery",
severity: "medium",
title: "Webhook delivery sends HTTP requests to user-controlled URLs",
description: "The triggerHammerWebhooks function (tasks.ts) sends POST requests to URLs stored in the database (hammerWebhooks table). If an attacker registers a webhook pointing to internal services (http://localhost:*, http://169.254.*), they could probe internal infrastructure.",
recommendation: "Validate webhook URLs before sending. Block requests to private IP ranges (127.0.0.0/8, 10.0.0.0/8, 172.16.0.0/12, 169.254.0.0/16). Require HTTPS.",
status: "open",
owaspId: "API7",
},
// API8: Security Misconfiguration
{
appName: "Todo App API",
category: "Security Misconfiguration",
severity: "low",
title: "CORS uses environment-based origin list",
description: "CORS origins are configurable via ALLOWED_ORIGINS env var. This is good practice. The fallback includes localhost for development.",
recommendation: "Ensure ALLOWED_ORIGINS is always explicitly set in production. Consider removing localhost fallback and requiring explicit dev configuration.",
status: "mitigated",
owaspId: "API8",
},
{
appName: "Todo App API",
category: "Security Misconfiguration",
severity: "medium",
title: "Validation error details may leak schema information",
description: "The error handler conditionally includes stack traces based on NODE_ENV, which is good. However, error messages like 'Validation error' with details could leak schema information.",
recommendation: "Sanitize validation error details in production. Only expose field names, not schema structure or internal type information.",
status: "open",
owaspId: "API8",
},
// API9: Improper Inventory Management
{
appName: "Todo App API",
category: "Improper Inventory Management",
severity: "low",
title: "No API documentation or endpoint inventory",
description: "The API lacks OpenAPI/Swagger documentation. Endpoint discovery relies on reading source code, which makes security auditing and client integration more difficult.",
recommendation: "Generate OpenAPI documentation from Elysia route definitions. Publish API docs alongside the application.",
status: "open",
owaspId: "API9",
},
// API10: Unsafe Consumption of APIs
{
appName: "Todo App API",
category: "Unsafe Consumption of APIs",
severity: "low",
title: "Webhook responses not handled or validated",
description: "When firing webhooks via triggerHammerWebhooks, the function catches errors but doesn't validate response bodies or check HTTP status codes. The lastTriggeredAt is updated regardless of success.",
recommendation: "Track webhook delivery status (success/failure). Log response status codes. Only update lastTriggeredAt on successful delivery.",
status: "open",
owaspId: "API10",
},
{
appName: "Todo App API",
category: "Unsafe Consumption of APIs",
severity: "medium",
title: "Email service integration lacks error handling for sensitive data",
description: "The sendInviteEmail function in admin routes catches errors but continues execution, returning the setup URL. If the email service is compromised, invite tokens could be sent to wrong recipients without the admin knowing.",
recommendation: "Clearly indicate to the admin when email delivery fails. Consider making email delivery a required step for invite completion, with a manual fallback option.",
status: "open",
owaspId: "API10",
},
];
async function seed() {
console.log("Clearing existing security findings...");
await db.delete(securityFindings);
console.log(`Seeding ${findings.length} OWASP findings...`);
for (const finding of findings) {
await db.insert(securityFindings).values(finding);
}
console.log("Done! Seeded findings by app:");
const apps = [...new Set(findings.map(f => f.appName))];
for (const app of apps) {
const appFindings = findings.filter(f => f.appName === app);
const bySeverity = appFindings.reduce((acc, f) => {
acc[f.severity] = (acc[f.severity] || 0) + 1;
return acc;
}, {} as Record<string, number>);
console.log(`  ${app}: ${appFindings.length} findings`, bySeverity);
}
process.exit(0);
}
seed().catch((e) => {
console.error("Seed failed:", e);
process.exit(1);
});
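The Todo App API2 finding above recommends replacing string equality with a constant-time comparison for API key validation. A minimal sketch in Node-flavoured TypeScript; the function name is illustrative, not taken from the audited codebase. Hashing both sides first keeps the buffers a fixed length, since `timingSafeEqual` throws on length mismatch (and this also avoids leaking the expected key's length):

```typescript
import { createHash, timingSafeEqual } from "node:crypto";

// Constant-time API key check (hedged sketch for the API2 finding).
// Comparing SHA-256 digests rather than raw strings guarantees equal-length
// buffers, which timingSafeEqual requires, without revealing key length.
function safeKeyCompare(presented: string, expected: string): boolean {
  const a = createHash("sha256").update(presented).digest();
  const b = createHash("sha256").update(expected).digest();
  return timingSafeEqual(a, b);
}
```

The comparison cost is now independent of where the first differing character sits, which removes the character-by-character guessing channel the finding describes.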

File diff suppressed because it is too large


@@ -16,6 +16,7 @@ const SummariesPage = lazy(() => import("./pages/SummariesPage").then(m => ({ de
const AdminPage = lazy(() => import("./components/AdminPage").then(m => ({ default: m.AdminPage })));
const SecurityPage = lazy(() => import("./pages/SecurityPage").then(m => ({ default: m.SecurityPage })));
const TodosPage = lazy(() => import("./pages/TodosPage").then(m => ({ default: m.TodosPage })));
const HealthPage = lazy(() => import("./pages/HealthPage").then(m => ({ default: m.HealthPage })));
function PageLoader() {
return (
@@ -42,6 +43,7 @@ function AuthenticatedApp() {
<Route path="/summaries" element={<Suspense fallback={<PageLoader />}><SummariesPage /></Suspense>} />
<Route path="/security" element={<Suspense fallback={<PageLoader />}><SecurityPage /></Suspense>} />
<Route path="/todos" element={<Suspense fallback={<PageLoader />}><TodosPage /></Suspense>} />
<Route path="/health" element={<Suspense fallback={<PageLoader />}><HealthPage /></Suspense>} />
<Route path="/admin" element={<Suspense fallback={<PageLoader />}><AdminPage /></Suspense>} />
<Route path="*" element={<Navigate to="/" replace />} />
</Route>
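Several of the API7 findings seeded earlier recommend rejecting webhook URLs that point at private address space before any server-side fetch. A minimal syntactic guard, sketched under the assumption of Node's WHATWG `URL` global; the function name is illustrative, and a production defense would also resolve DNS and re-check the resulting IP, since hostname checks alone can be bypassed by a DNS record pointing at an internal address:

```typescript
// Hedged sketch: reject obviously-internal webhook targets before fetching.
// Purely syntactic checks; real SSRF defenses must also pin DNS results.
function isSafeWebhookUrl(raw: string): boolean {
  let url: URL;
  try {
    url = new URL(raw);
  } catch {
    return false; // unparseable input is rejected outright
  }
  if (url.protocol !== "https:") return false; // findings require HTTPS only
  const privatePatterns = [
    /^localhost$/i,
    /^127\./,                      // loopback 127.0.0.0/8
    /^10\./,                       // RFC 1918 10.0.0.0/8
    /^192\.168\./,                 // RFC 1918 192.168.0.0/16
    /^172\.(1[6-9]|2\d|3[01])\./,  // RFC 1918 172.16.0.0/12
    /^169\.254\./,                 // link-local / cloud metadata
    /^\[?::1\]?$/,                 // IPv6 loopback
  ];
  return !privatePatterns.some((p) => p.test(url.hostname));
}
```

This single predicate covers the allowlist-or-blocklist advice in both the Hammer Dashboard and Todo App API7 entries, applied at registration time and again immediately before delivery.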


@@ -0,0 +1,177 @@
import { useState, useEffect, useMemo } from "react";
import { Link } from "react-router-dom";
import { fetchAppHealth, forceHealthCheck } from "../lib/api";
import type { AppHealth, AppHealthResponse } from "../lib/types";
function StatusDot({ status, size = "sm" }: { status: AppHealth["status"]; size?: "sm" | "xs" }) {
const dotSize = size === "xs" ? "h-2 w-2" : "h-2.5 w-2.5";
const color =
status === "healthy" ? "bg-green-500" : status === "degraded" ? "bg-yellow-500" : "bg-red-500";
return <span className={`inline-flex rounded-full ${dotSize} ${color}`} />;
}
function timeAgo(dateStr: string): string {
const seconds = Math.floor((Date.now() - new Date(dateStr).getTime()) / 1000);
if (seconds < 5) return "just now";
if (seconds < 60) return `${seconds}s ago`;
const minutes = Math.floor(seconds / 60);
if (minutes < 60) return `${minutes}m ago`;
const hours = Math.floor(minutes / 60);
return `${hours}h ago`;
}
export function AppHealthWidget() {
const [data, setData] = useState<AppHealthResponse | null>(null);
const [loading, setLoading] = useState(true);
const [refreshing, setRefreshing] = useState(false);
const loadHealth = async () => {
try {
const result = await fetchAppHealth();
setData(result);
} catch (err) {
console.error("Failed to load health:", err);
} finally {
setLoading(false);
}
};
const handleRefresh = async (e: React.MouseEvent) => {
e.preventDefault();
e.stopPropagation();
setRefreshing(true);
try {
const result = await forceHealthCheck();
setData(result);
} catch (err) {
console.error("Failed to refresh:", err);
} finally {
setRefreshing(false);
}
};
useEffect(() => {
loadHealth();
const interval = setInterval(loadHealth, 30_000);
return () => clearInterval(interval);
}, []);
const overallStatus = useMemo(() => {
if (!data) return null;
if (data.apps.some((a) => a.status === "unhealthy")) return "unhealthy";
if (data.apps.some((a) => a.status === "degraded")) return "degraded";
return "healthy";
}, [data]);
const counts = useMemo(() => {
if (!data) return { healthy: 0, degraded: 0, unhealthy: 0 };
return {
healthy: data.apps.filter((a) => a.status === "healthy").length,
degraded: data.apps.filter((a) => a.status === "degraded").length,
unhealthy: data.apps.filter((a) => a.status === "unhealthy").length,
};
}, [data]);
return (
<div className="bg-white dark:bg-gray-900 rounded-xl border border-gray-200 dark:border-gray-800 shadow-sm">
<div className="px-5 py-4 border-b border-gray-100 dark:border-gray-800 flex items-center justify-between">
<div className="flex items-center gap-2">
<h2 className="font-semibold text-gray-900 dark:text-gray-100">🏥 App Health</h2>
{overallStatus && (
<StatusDot status={overallStatus} />
)}
</div>
<div className="flex items-center gap-2">
<button
onClick={handleRefresh}
disabled={refreshing}
className="text-xs text-gray-400 hover:text-amber-500 dark:hover:text-amber-400 transition disabled:opacity-50"
title="Force refresh"
>
<svg
className={`w-4 h-4 ${refreshing ? "animate-spin" : ""}`}
fill="none"
stroke="currentColor"
viewBox="0 0 24 24"
>
<path
strokeLinecap="round"
strokeLinejoin="round"
strokeWidth={2}
d="M4 4v5h.582m15.356 2A8.001 8.001 0 004.582 9m0 0H9m11 11v-5h-.581m0 0a8.003 8.003 0 01-15.357-2m15.357 2H15"
/>
</svg>
</button>
<Link
to="/health"
className="text-xs text-amber-600 dark:text-amber-400 hover:text-amber-700 dark:hover:text-amber-300 font-medium"
>
Details
</Link>
</div>
</div>
<div className="p-5">
{loading ? (
<div className="text-sm text-gray-400 dark:text-gray-500 italic py-4 text-center">
Checking services...
</div>
) : !data ? (
<div className="text-sm text-gray-400 dark:text-gray-500 italic py-4 text-center">
Failed to load health data
</div>
) : (
<>
{/* Summary bar */}
<div className="flex items-center gap-3 mb-4 text-xs text-gray-500 dark:text-gray-400">
{counts.healthy > 0 && (
<span className="flex items-center gap-1">
<StatusDot status="healthy" size="xs" /> {counts.healthy} healthy
</span>
)}
{counts.degraded > 0 && (
<span className="flex items-center gap-1">
<StatusDot status="degraded" size="xs" /> {counts.degraded} degraded
</span>
)}
{counts.unhealthy > 0 && (
<span className="flex items-center gap-1">
<StatusDot status="unhealthy" size="xs" /> {counts.unhealthy} unhealthy
</span>
)}
</div>
{/* Compact app list */}
<div className="grid grid-cols-1 sm:grid-cols-2 gap-2">
{data.apps.map((app) => (
<a
key={app.name}
href={app.url}
target="_blank"
rel="noopener noreferrer"
className="flex items-center gap-2 px-3 py-2 rounded-lg hover:bg-gray-50 dark:hover:bg-gray-800 transition group"
>
<StatusDot status={app.status} size="xs" />
<span className="text-sm text-gray-700 dark:text-gray-300 truncate flex-1 group-hover:text-amber-600 dark:group-hover:text-amber-400 transition">
{app.name}
</span>
<span className="text-xs text-gray-400 dark:text-gray-500 font-mono shrink-0">
{app.responseTime}ms
</span>
</a>
))}
</div>
{/* Last checked */}
{data.apps[0] && (
<p className="text-[10px] text-gray-400 dark:text-gray-600 text-center mt-3">
Last checked {timeAgo(data.apps[0].lastChecked)}
{data.cached && " (cached)"}
</p>
)}
</>
)}
</div>
</div>
);
}
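The `overallStatus` derivation above is a precedence rollup: any unhealthy app wins, then degraded, else healthy. The sidebar indicator in `DashboardLayout` below re-implements the same rule inline, so factoring it into a pure helper would let both call sites share one tested implementation. A sketch, with a minimal stand-in for the `AppHealth` shape (the real type lives in `lib/types`):

```typescript
type Status = "healthy" | "degraded" | "unhealthy";

// Precedence rollup mirroring the widget's useMemo:
// unhealthy > degraded > healthy; an empty list reads as healthy.
function rollupStatus(apps: { status: Status }[]): Status {
  if (apps.some((a) => a.status === "unhealthy")) return "unhealthy";
  if (apps.some((a) => a.status === "degraded")) return "degraded";
  return "healthy";
}
```

A pure function like this can be unit-tested without rendering React, and keeps the widget and the sidebar from drifting if the precedence rule ever changes.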


@@ -1,4 +1,4 @@
import { useState, useMemo } from "react";
import { useState, useMemo, useEffect } from "react";
import { NavLink, Outlet } from "react-router-dom";
import { useCurrentUser } from "../hooks/useCurrentUser";
import { useTasks } from "../hooks/useTasks";
@@ -6,6 +6,8 @@ import { useTheme } from "../hooks/useTheme";
import { KeyboardShortcutsModal } from "./KeyboardShortcutsModal";
import { CommandPalette } from "./CommandPalette";
import { signOut } from "../lib/auth-client";
import { fetchAppHealth } from "../lib/api";
import type { AppHealthStatus } from "../lib/types";
const navItems = [
{ to: "/", label: "Dashboard", icon: "🔨", badgeKey: null },
@@ -15,6 +17,7 @@ const navItems = [
{ to: "/activity", label: "Activity", icon: "📝", badgeKey: null },
{ to: "/summaries", label: "Summaries", icon: "📅", badgeKey: null },
{ to: "/security", label: "Security", icon: "🛡️", badgeKey: null },
{ to: "/health", label: "Health", icon: "🏥", badgeKey: "health" },
] as const;
export function DashboardLayout() {
@@ -22,10 +25,28 @@ export function DashboardLayout() {
const { tasks } = useTasks(15000);
const { theme, setTheme } = useTheme();
const [sidebarOpen, setSidebarOpen] = useState(false);
const [healthStatus, setHealthStatus] = useState<AppHealthStatus | null>(null);
const activeTasks = useMemo(() => tasks.filter((t) => t.status === "active").length, [tasks]);
const blockedTasks = useMemo(() => tasks.filter((t) => t.status === "blocked").length, [tasks]);
// Fetch health status for sidebar indicator
useEffect(() => {
const checkHealth = async () => {
try {
const data = await fetchAppHealth();
if (data.apps.some((a) => a.status === "unhealthy")) setHealthStatus("unhealthy");
else if (data.apps.some((a) => a.status === "degraded")) setHealthStatus("degraded");
else setHealthStatus("healthy");
} catch {
setHealthStatus(null);
}
};
checkHealth();
const interval = setInterval(checkHealth, 60_000);
return () => clearInterval(interval);
}, []);
const cycleTheme = () => {
const next = theme === "light" ? "dark" : theme === "dark" ? "system" : "light";
setTheme(next);
@@ -107,6 +128,10 @@ export function DashboardLayout() {
<nav className="flex-1 px-3 py-4 space-y-1">
{navItems.map((item) => {
const badge = item.badgeKey === "queue" && activeTasks > 0 ? activeTasks : 0;
const healthDotColor =
healthStatus === "healthy" ? "bg-green-500" :
healthStatus === "degraded" ? "bg-yellow-500" :
healthStatus === "unhealthy" ? "bg-red-500" : null;
return (
<NavLink
key={item.to}
@@ -123,6 +148,9 @@ export function DashboardLayout() {
>
<span className="text-lg">{item.icon}</span>
<span className="flex-1">{item.label}</span>
{item.badgeKey === "health" && healthDotColor && (
<span className={`inline-flex rounded-full h-2 w-2 ${healthDotColor}`} />
)}
{badge > 0 && (
<span className="text-[10px] font-bold bg-amber-500 text-white px-1.5 py-0.5 rounded-full min-w-[18px] text-center leading-none">
{badge}


@@ -1,4 +1,4 @@
import type { Task, Project, ProjectWithTasks, VelocityStats, Recurrence, Todo, TodoPriority } from "./types";
import type { Task, Project, ProjectWithTasks, VelocityStats, Recurrence, Todo, TodoPriority, TodoProject, TodoSection, AppHealthResponse } from "./types";
const BASE = "/api/tasks";
@@ -233,10 +233,12 @@ export async function deleteUser(userId: string): Promise<void> {
const TODOS_BASE = "/api/todos";
export async function fetchTodos(params?: { completed?: string; category?: string }): Promise<Todo[]> {
export async function fetchTodos(params?: { completed?: string; category?: string; projectId?: string; view?: string }): Promise<Todo[]> {
const url = new URL(TODOS_BASE, window.location.origin);
if (params?.completed) url.searchParams.set("completed", params.completed);
if (params?.category) url.searchParams.set("category", params.category);
if (params?.projectId) url.searchParams.set("projectId", params.projectId);
if (params?.view) url.searchParams.set("view", params.view);
const res = await fetch(url.toString(), { credentials: "include" });
if (!res.ok) throw new Error(res.status === 401 ? "Unauthorized" : "Failed to fetch todos");
return res.json();
@@ -253,6 +255,7 @@ export async function createTodo(todo: {
description?: string;
priority?: TodoPriority;
category?: string;
projectId?: string | null;
dueDate?: string | null;
}): Promise<Todo> {
const res = await fetch(TODOS_BASE, {
@@ -270,6 +273,8 @@ export async function updateTodo(id: string, updates: Partial<{
description: string;
priority: TodoPriority;
category: string | null;
projectId: string | null;
sectionId: string | null;
dueDate: string | null;
isCompleted: boolean;
sortOrder: number;
@@ -300,3 +305,146 @@ export async function deleteTodo(id: string): Promise<void> {
});
if (!res.ok) throw new Error("Failed to delete todo");
}
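The new `projectId` and `view` parameters simply extend fetchTodos's query string. The URL-building logic can be checked in isolation; a hypothetical standalone sketch using a fixed origin in place of `window.location.origin`:

```typescript
// Illustrative copy of the query-string logic in fetchTodos; the helper name
// and fixed origin are assumptions for the sketch, not part of the app.
function buildTodosUrl(
  base: string,
  params?: { completed?: string; category?: string; projectId?: string; view?: string },
): string {
  const url = new URL(base, "https://example.test");
  if (params?.completed) url.searchParams.set("completed", params.completed);
  if (params?.category) url.searchParams.set("category", params.category);
  if (params?.projectId) url.searchParams.set("projectId", params.projectId);
  if (params?.view) url.searchParams.set("view", params.view);
  return url.toString();
}
```

Parameters are appended in insertion order, so `view=today` lands after `projectId` when both are set.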
// ─── Todo Subtasks API ───
export async function addTodoSubtask(todoId: string, title: string): Promise<Todo> {
const res = await fetch(`${TODOS_BASE}/${todoId}/subtasks`, {
method: "POST",
credentials: "include",
headers: { "Content-Type": "application/json" },
body: JSON.stringify({ title }),
});
if (!res.ok) throw new Error("Failed to add subtask");
return res.json();
}
export async function toggleTodoSubtask(todoId: string, subtaskId: string, completed: boolean): Promise<Todo> {
const res = await fetch(`${TODOS_BASE}/${todoId}/subtasks/${subtaskId}`, {
method: "PATCH",
credentials: "include",
headers: { "Content-Type": "application/json" },
body: JSON.stringify({ completed }),
});
if (!res.ok) throw new Error("Failed to toggle subtask");
return res.json();
}
export async function deleteTodoSubtask(todoId: string, subtaskId: string): Promise<Todo> {
const res = await fetch(`${TODOS_BASE}/${todoId}/subtasks/${subtaskId}`, {
method: "DELETE",
credentials: "include",
});
if (!res.ok) throw new Error("Failed to delete subtask");
return res.json();
}
// ─── Todo Projects API ───
const TODO_PROJECTS_BASE = "/api/todos/projects";
export async function fetchTodoProjects(): Promise<TodoProject[]> {
const res = await fetch(TODO_PROJECTS_BASE, { credentials: "include" });
if (!res.ok) throw new Error(res.status === 401 ? "Unauthorized" : "Failed to fetch projects");
return res.json();
}
export async function createTodoProject(project: {
name: string;
color?: string;
icon?: string;
}): Promise<TodoProject> {
const res = await fetch(TODO_PROJECTS_BASE, {
method: "POST",
credentials: "include",
headers: { "Content-Type": "application/json" },
body: JSON.stringify(project),
});
if (!res.ok) throw new Error("Failed to create project");
return res.json();
}
export async function updateTodoProject(id: string, updates: Partial<{
name: string;
color: string;
icon: string;
sortOrder: number;
}>): Promise<TodoProject> {
const res = await fetch(`${TODO_PROJECTS_BASE}/${id}`, {
method: "PATCH",
credentials: "include",
headers: { "Content-Type": "application/json" },
body: JSON.stringify(updates),
});
if (!res.ok) throw new Error("Failed to update project");
return res.json();
}
export async function deleteTodoProject(id: string): Promise<void> {
const res = await fetch(`${TODO_PROJECTS_BASE}/${id}`, {
method: "DELETE",
credentials: "include",
});
if (!res.ok) throw new Error("Failed to delete project");
}
// ─── Todo Sections API ───
const TODO_SECTIONS_BASE = "/api/todos/sections";
export async function fetchTodoSections(projectId: string): Promise<TodoSection[]> {
const res = await fetch(`${TODO_SECTIONS_BASE}/by-project/${projectId}`, { credentials: "include" });
if (!res.ok) throw new Error("Failed to fetch sections");
return res.json();
}
export async function createTodoSection(data: { projectId: string; name: string }): Promise<TodoSection> {
const res = await fetch(TODO_SECTIONS_BASE, {
method: "POST",
credentials: "include",
headers: { "Content-Type": "application/json" },
body: JSON.stringify(data),
});
if (!res.ok) throw new Error("Failed to create section");
return res.json();
}
export async function updateTodoSection(id: string, updates: Partial<{
name: string;
isCollapsed: boolean;
sortOrder: number;
}>): Promise<TodoSection> {
const res = await fetch(`${TODO_SECTIONS_BASE}/${id}`, {
method: "PATCH",
credentials: "include",
headers: { "Content-Type": "application/json" },
body: JSON.stringify(updates),
});
if (!res.ok) throw new Error("Failed to update section");
return res.json();
}
export async function deleteTodoSection(id: string): Promise<void> {
const res = await fetch(`${TODO_SECTIONS_BASE}/${id}`, {
method: "DELETE",
credentials: "include",
});
if (!res.ok) throw new Error("Failed to delete section");
}
// ─── App Health API ───
export async function fetchAppHealth(): Promise<AppHealthResponse> {
const res = await fetch("/api/health/apps", { credentials: "include" });
if (!res.ok) throw new Error("Failed to fetch app health");
return res.json();
}
export async function forceHealthCheck(): Promise<AppHealthResponse> {
const res = await fetch("/api/health/check", {
method: "POST",
credentials: "include",
});
if (!res.ok) throw new Error("Failed to force health check");
return res.json();
}


@@ -55,6 +55,38 @@ export interface Recurrence {
export type TodoPriority = "high" | "medium" | "low" | "none";
export interface TodoProject {
id: string;
userId: string;
name: string;
color: string;
icon: string;
sortOrder: number;
todoCount: number;
completedCount: number;
createdAt: string;
updatedAt: string;
}
export interface TodoSection {
id: string;
projectId: string;
name: string;
isCollapsed: boolean;
sortOrder: number;
todoCount: number;
createdAt: string;
updatedAt: string;
}
export interface TodoSubtask {
id: string;
title: string;
completed: boolean;
completedAt?: string;
createdAt: string;
}
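The commit message mentions a subtask count badge (completed/total) on each todo. One way to derive it from `TodoSubtask` entries; an illustrative sketch with inlined types, not the app's actual component code:

```typescript
// Minimal shape of a subtask for this sketch (mirrors TodoSubtask above).
interface SubtaskLike {
  completed: boolean;
}

// Returns "done/total" for the badge, or null when there are no subtasks
// (so no badge is rendered at all).
function subtaskBadge(subtasks: SubtaskLike[]): string | null {
  if (subtasks.length === 0) return null;
  const done = subtasks.filter((s) => s.completed).length;
  return `${done}/${subtasks.length}`;
}
```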
export interface Todo {
id: string;
userId: string;
@@ -63,13 +95,37 @@ export interface Todo {
isCompleted: boolean;
priority: TodoPriority;
category: string | null;
projectId: string | null;
sectionId: string | null;
dueDate: string | null;
completedAt: string | null;
subtasks: TodoSubtask[];
sortOrder: number;
createdAt: string;
updatedAt: string;
}
// ─── App Health ───
export type AppHealthStatus = "healthy" | "degraded" | "unhealthy";
export interface AppHealth {
name: string;
url: string;
type: "web" | "api";
status: AppHealthStatus;
responseTime: number;
httpStatus: number | null;
lastChecked: string;
error?: string;
}
export interface AppHealthResponse {
apps: AppHealth[];
cached: boolean;
cacheAge: number;
}
// ─── Tasks ───
export interface Task {


@@ -2,6 +2,7 @@ import { useMemo, useEffect, useState } from "react";
import { Link } from "react-router-dom";
import { useTasks } from "../hooks/useTasks";
import { fetchProjects, fetchVelocityStats } from "../lib/api";
import { AppHealthWidget } from "../components/AppHealthWidget";
import type { Task, ProgressNote, Project, VelocityStats } from "../lib/types";
function StatCard({ label, value, icon, color }: { label: string; value: number; icon: string; color: string }) {
@@ -223,6 +224,9 @@ export function DashboardPage() {
{/* Velocity Chart */}
<VelocityChart stats={velocityStats} />
{/* App Health Widget */}
<AppHealthWidget />
<div className="grid grid-cols-1 lg:grid-cols-2 gap-6">
{/* Currently Working On */}
<div className="bg-white dark:bg-gray-900 rounded-xl border border-gray-200 dark:border-gray-800 shadow-sm">


@@ -0,0 +1,240 @@
import { useState, useEffect, useMemo } from "react";
import { fetchAppHealth, forceHealthCheck } from "../lib/api";
import type { AppHealth, AppHealthResponse } from "../lib/types";
function statusColor(status: AppHealth["status"]) {
switch (status) {
case "healthy":
return {
dot: "bg-green-500",
bg: "bg-green-50 dark:bg-green-900/20",
border: "border-green-200 dark:border-green-800",
text: "text-green-700 dark:text-green-400",
badge: "bg-green-100 dark:bg-green-900/30 text-green-700 dark:text-green-400",
};
case "degraded":
return {
dot: "bg-yellow-500",
bg: "bg-yellow-50 dark:bg-yellow-900/20",
border: "border-yellow-200 dark:border-yellow-800",
text: "text-yellow-700 dark:text-yellow-400",
badge: "bg-yellow-100 dark:bg-yellow-900/30 text-yellow-700 dark:text-yellow-400",
};
case "unhealthy":
return {
dot: "bg-red-500",
bg: "bg-red-50 dark:bg-red-900/20",
border: "border-red-200 dark:border-red-800",
text: "text-red-700 dark:text-red-400",
badge: "bg-red-100 dark:bg-red-900/30 text-red-700 dark:text-red-400",
};
}
}
function StatusDot({ status }: { status: AppHealth["status"] }) {
const colors = statusColor(status);
return (
<span className="relative flex h-3 w-3">
{status === "healthy" && (
<span className={`animate-ping absolute inline-flex h-full w-full rounded-full ${colors.dot} opacity-75`} />
)}
<span className={`relative inline-flex rounded-full h-3 w-3 ${colors.dot}`} />
</span>
);
}
function timeAgo(dateStr: string): string {
const seconds = Math.floor((Date.now() - new Date(dateStr).getTime()) / 1000);
if (seconds < 5) return "just now";
if (seconds < 60) return `${seconds}s ago`;
const minutes = Math.floor(seconds / 60);
if (minutes < 60) return `${minutes}m ago`;
const hours = Math.floor(minutes / 60);
return `${hours}h ago`;
}
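Because timeAgo reads `Date.now()` directly, its bucket boundaries are awkward to test. A standalone variant for illustration, with the current time passed in explicitly (a sketch, not a proposed change to the component):

```typescript
// Testable copy of the timeAgo helper above; nowMs replaces Date.now().
function timeAgoAt(dateStr: string, nowMs: number): string {
  const seconds = Math.floor((nowMs - new Date(dateStr).getTime()) / 1000);
  if (seconds < 5) return "just now";
  if (seconds < 60) return `${seconds}s ago`;
  const minutes = Math.floor(seconds / 60);
  if (minutes < 60) return `${minutes}m ago`;
  const hours = Math.floor(minutes / 60);
  return `${hours}h ago`;
}
```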
function HealthCard({ app, large }: { app: AppHealth; large?: boolean }) {
const colors = statusColor(app.status);
return (
<a
href={app.url}
target="_blank"
rel="noopener noreferrer"
className={`block rounded-xl border ${colors.border} ${colors.bg} ${large ? "p-5" : "p-4"} hover:shadow-md transition group`}
>
<div className="flex items-start justify-between mb-2">
<div className="flex items-center gap-2">
<StatusDot status={app.status} />
<h3 className={`font-semibold ${large ? "text-base" : "text-sm"} text-gray-900 dark:text-gray-100`}>
{app.name}
</h3>
</div>
<span className={`text-xs px-2 py-0.5 rounded-full font-medium capitalize ${colors.badge}`}>
{app.status}
</span>
</div>
<div className="space-y-1.5">
<p className="text-xs text-gray-500 dark:text-gray-400 truncate group-hover:text-amber-600 dark:group-hover:text-amber-400 transition">
{app.url}
</p>
<div className="flex items-center gap-3 text-xs text-gray-500 dark:text-gray-400">
<span className={`font-mono ${app.responseTime > 5000 ? "text-yellow-600 dark:text-yellow-400" : app.responseTime > 2000 ? "text-amber-600 dark:text-amber-400" : ""}`}>
{app.responseTime}ms
</span>
{app.httpStatus && (
<span className="font-mono">
HTTP {app.httpStatus}
</span>
)}
<span className="text-gray-400 dark:text-gray-500">
{app.type === "api" ? "🔌 API" : "🌐 Web"}
</span>
</div>
{app.error && (
<p className="text-xs text-red-600 dark:text-red-400 mt-1">
{app.error}
</p>
)}
{large && (
<p className="text-xs text-gray-400 dark:text-gray-500 mt-2">
Last checked: {timeAgo(app.lastChecked)}
</p>
)}
</div>
</a>
);
}
export function HealthPage() {
const [data, setData] = useState<AppHealthResponse | null>(null);
const [loading, setLoading] = useState(true);
const [refreshing, setRefreshing] = useState(false);
const loadHealth = async () => {
try {
const result = await fetchAppHealth();
setData(result);
} catch (err) {
console.error("Failed to load health:", err);
} finally {
setLoading(false);
}
};
const handleRefresh = async () => {
setRefreshing(true);
try {
const result = await forceHealthCheck();
setData(result);
} catch (err) {
console.error("Failed to refresh health:", err);
} finally {
setRefreshing(false);
}
};
useEffect(() => {
loadHealth();
const interval = setInterval(loadHealth, 30_000);
return () => clearInterval(interval);
}, []);
const overallStatus = useMemo(() => {
if (!data) return null;
if (data.apps.some((a) => a.status === "unhealthy")) return "unhealthy";
if (data.apps.some((a) => a.status === "degraded")) return "degraded";
return "healthy";
}, [data]);
const counts = useMemo(() => {
if (!data) return { healthy: 0, degraded: 0, unhealthy: 0 };
return {
healthy: data.apps.filter((a) => a.status === "healthy").length,
degraded: data.apps.filter((a) => a.status === "degraded").length,
unhealthy: data.apps.filter((a) => a.status === "unhealthy").length,
};
}, [data]);
if (loading) {
return (
<div className="min-h-screen flex items-center justify-center text-gray-400 dark:text-gray-500">
Checking services...
</div>
);
}
const bannerColors = overallStatus === "healthy"
? "bg-green-50 dark:bg-green-900/20 border-green-200 dark:border-green-800 text-green-800 dark:text-green-300"
: overallStatus === "degraded"
? "bg-yellow-50 dark:bg-yellow-900/20 border-yellow-200 dark:border-yellow-800 text-yellow-800 dark:text-yellow-300"
: "bg-red-50 dark:bg-red-900/20 border-red-200 dark:border-red-800 text-red-800 dark:text-red-300";
const bannerIcon = overallStatus === "healthy" ? "✅" : overallStatus === "degraded" ? "⚠️" : "🔴";
const bannerText = overallStatus === "healthy"
? "All Systems Operational"
: overallStatus === "degraded"
? "Some Systems Degraded"
: "System Issues Detected";
return (
<div className="min-h-screen">
<header className="bg-white dark:bg-gray-900 border-b border-gray-200 dark:border-gray-800 sticky top-14 md:top-0 z-30">
<div className="max-w-5xl mx-auto px-4 sm:px-6 py-4 flex items-center justify-between">
<div>
<h1 className="text-xl font-bold text-gray-900 dark:text-gray-100">🏥 App Health</h1>
<p className="text-sm text-gray-400 dark:text-gray-500">Monitor all deployed services</p>
</div>
<button
onClick={handleRefresh}
disabled={refreshing}
className="px-3 py-2 text-sm font-medium rounded-lg bg-amber-500 hover:bg-amber-600 text-white transition disabled:opacity-50 flex items-center gap-2"
>
<svg
className={`w-4 h-4 ${refreshing ? "animate-spin" : ""}`}
fill="none"
stroke="currentColor"
viewBox="0 0 24 24"
>
<path
strokeLinecap="round"
strokeLinejoin="round"
strokeWidth={2}
d="M4 4v5h.582m15.356 2A8.001 8.001 0 004.582 9m0 0H9m11 11v-5h-.581m0 0a8.003 8.003 0 01-15.357-2m15.357 2H15"
/>
</svg>
{refreshing ? "Checking..." : "Refresh"}
</button>
</div>
</header>
<div className="max-w-5xl mx-auto px-4 sm:px-6 py-6 space-y-6">
{/* Overall Status Banner */}
<div className={`rounded-xl border p-4 ${bannerColors} flex items-center justify-between`}>
<div className="flex items-center gap-3">
<span className="text-2xl">{bannerIcon}</span>
<div>
<h2 className="font-semibold text-lg">{bannerText}</h2>
<p className="text-sm opacity-80">
{counts.healthy} healthy · {counts.degraded} degraded · {counts.unhealthy} unhealthy
</p>
</div>
</div>
{data && (
<span className="text-xs opacity-60">
{data.cached ? `Cached (${Math.round(data.cacheAge / 1000)}s ago)` : "Fresh check"}
</span>
)}
</div>
{/* App Grid */}
<div className="grid grid-cols-1 sm:grid-cols-2 gap-4">
{data?.apps.map((app) => (
<HealthCard key={app.name} app={app} large />
))}
</div>
</div>
</div>
);
}
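The `overallStatus` memo above resolves the banner with a fixed precedence: any unhealthy app wins, then degraded, then healthy. Extracted as a standalone sketch so that precedence is testable on its own:

```typescript
// Same precedence as HealthPage's overallStatus memo: unhealthy > degraded > healthy.
type AppHealthStatus = "healthy" | "degraded" | "unhealthy";

function overallStatus(statuses: AppHealthStatus[]): AppHealthStatus {
  if (statuses.some((s) => s === "unhealthy")) return "unhealthy";
  if (statuses.some((s) => s === "degraded")) return "degraded";
  return "healthy"; // an empty list reads as all-clear
}
```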

File diff suppressed because it is too large

File diff suppressed because it is too large

seed-checklist.sh (executable file, 204 lines added)

@@ -0,0 +1,204 @@
#!/bin/bash
# Seeds the security checklist via the bulk API endpoint, one payload per project.
TOKEN="62490648ae3f8712e2a30eb0ca46ac2f"
BASE="https://dash.donovankelly.xyz/api/security"
post_bulk() {
# -f makes curl exit non-zero on HTTP >= 400, so the exit-status checks below are meaningful
curl -sf -X POST "$BASE/checklist/bulk" \
-H "Authorization: Bearer $TOKEN" \
-H "Content-Type: application/json" \
-d "$1"
}
echo "=== Seeding Security Checklist ==="
# Hammer Dashboard checklist
post_bulk '{"items":[
{"projectName":"Hammer Dashboard","category":"Auth & Session Management","item":"Passwords hashed with bcrypt/argon2/scrypt","status":"pass","notes":"BetterAuth handles password hashing securely"},
{"projectName":"Hammer Dashboard","category":"Auth & Session Management","item":"Session tokens are cryptographically random","status":"pass","notes":"BetterAuth generates secure session tokens"},
{"projectName":"Hammer Dashboard","category":"Auth & Session Management","item":"Session expiry enforced","status":"pass","notes":"Sessions expire per BetterAuth defaults"},
{"projectName":"Hammer Dashboard","category":"Auth & Session Management","item":"Secure cookie attributes (HttpOnly, Secure, SameSite)","status":"pass","notes":"Cookie config: secure=true, sameSite=none, httpOnly=true"},
{"projectName":"Hammer Dashboard","category":"Auth & Session Management","item":"CSRF protection enabled","status":"pass","notes":"disableCSRFCheck: false explicitly set"},
{"projectName":"Hammer Dashboard","category":"Auth & Session Management","item":"MFA / 2FA available","status":"fail","notes":"No MFA support configured"},
{"projectName":"Hammer Dashboard","category":"Auth & Session Management","item":"Password complexity requirements enforced","status":"fail","notes":"No password policy configured in BetterAuth"},
{"projectName":"Hammer Dashboard","category":"Auth & Session Management","item":"Account lockout after failed attempts","status":"not_checked","notes":"BetterAuth may handle this - needs verification"},
{"projectName":"Hammer Dashboard","category":"Auth & Session Management","item":"Registration restricted (invite-only or approval)","status":"fail","notes":"Open signup enabled"},
{"projectName":"Hammer Dashboard","category":"Auth & Session Management","item":"Session invalidation on password change","status":"pass","notes":"BetterAuth invalidates sessions on credential change"},
{"projectName":"Hammer Dashboard","category":"Authorization","item":"Object-level access control (users can only access own data)","status":"partial","notes":"Task queue is shared by design. Admin role exists."},
{"projectName":"Hammer Dashboard","category":"Authorization","item":"Function-level access control (admin vs user)","status":"pass","notes":"requireAdmin() check on admin-only routes"},
{"projectName":"Hammer Dashboard","category":"Authorization","item":"Field-level access control","status":"pass","notes":"Elysia t.Object() schemas restrict accepted fields"},
{"projectName":"Hammer Dashboard","category":"Authorization","item":"API tokens scoped per client/service","status":"fail","notes":"Single static API_BEARER_TOKEN shared across all consumers"},
{"projectName":"Hammer Dashboard","category":"Authorization","item":"Principle of least privilege applied","status":"partial","notes":"Admin/user roles exist but token gives full access"},
{"projectName":"Hammer Dashboard","category":"Input Validation","item":"All API inputs validated with schemas","status":"partial","notes":"Most routes use Elysia t.Object() - some routes lack validation"},
{"projectName":"Hammer Dashboard","category":"Input Validation","item":"SQL injection prevented (parameterized queries)","status":"pass","notes":"Drizzle ORM handles parameterization"},
{"projectName":"Hammer Dashboard","category":"Input Validation","item":"XSS prevention (output encoding)","status":"pass","notes":"React auto-escapes output. API returns JSON."},
{"projectName":"Hammer Dashboard","category":"Input Validation","item":"Path traversal prevented","status":"not_applicable","notes":"No file system operations in API"},
{"projectName":"Hammer Dashboard","category":"Input Validation","item":"File upload validation","status":"not_applicable","notes":"No file uploads in this app"},
{"projectName":"Hammer Dashboard","category":"Input Validation","item":"Request body size limits","status":"fail","notes":"No body size limits configured"},
{"projectName":"Hammer Dashboard","category":"Transport & Data Protection","item":"HTTPS enforced on all endpoints","status":"pass","notes":"Let'\''s Encrypt TLS via Traefik/Dokploy"},
{"projectName":"Hammer Dashboard","category":"Transport & Data Protection","item":"HSTS header set","status":"partial","notes":"May be set by Traefik - needs verification at app level"},
{"projectName":"Hammer Dashboard","category":"Transport & Data Protection","item":"CORS properly restricted","status":"partial","notes":"CORS includes localhost:5173 in production"},
{"projectName":"Hammer Dashboard","category":"Transport & Data Protection","item":"Encryption at rest for sensitive data","status":"fail","notes":"No disk or column-level encryption"},
{"projectName":"Hammer Dashboard","category":"Transport & Data Protection","item":"Database backups encrypted","status":"fail","notes":"No backup strategy exists"},
{"projectName":"Hammer Dashboard","category":"Transport & Data Protection","item":"Secrets stored securely (env vars / vault)","status":"pass","notes":"Env vars via Dokploy environment config"},
{"projectName":"Hammer Dashboard","category":"Rate Limiting & Abuse Prevention","item":"Rate limiting on auth endpoints","status":"fail","notes":"No rate limiting middleware"},
{"projectName":"Hammer Dashboard","category":"Rate Limiting & Abuse Prevention","item":"Rate limiting on API endpoints","status":"fail","notes":"No rate limiting middleware"},
{"projectName":"Hammer Dashboard","category":"Rate Limiting & Abuse Prevention","item":"Bot/CAPTCHA protection on registration","status":"fail","notes":"No CAPTCHA or bot detection"},
{"projectName":"Hammer Dashboard","category":"Rate Limiting & Abuse Prevention","item":"Request throttling for expensive operations","status":"fail","notes":"No throttling configured"},
{"projectName":"Hammer Dashboard","category":"Error Handling","item":"Generic error messages in production","status":"pass","notes":"Returns Internal server error without stack traces"},
{"projectName":"Hammer Dashboard","category":"Error Handling","item":"No stack traces leaked to clients","status":"pass","notes":"Error handler is generic"},
{"projectName":"Hammer Dashboard","category":"Error Handling","item":"Consistent error response format","status":"pass","notes":"All errors return { error: string }"},
{"projectName":"Hammer Dashboard","category":"Error Handling","item":"Uncaught exception handler","status":"partial","notes":"Elysia onError catches route errors; no process-level handler"},
{"projectName":"Hammer Dashboard","category":"Logging & Monitoring","item":"Structured logging (not just console.log)","status":"fail","notes":"Console-only logging"},
{"projectName":"Hammer Dashboard","category":"Logging & Monitoring","item":"Auth events logged","status":"fail","notes":"No auth event logging"},
{"projectName":"Hammer Dashboard","category":"Logging & Monitoring","item":"Data access audit trail","status":"fail","notes":"No audit logging"},
{"projectName":"Hammer Dashboard","category":"Logging & Monitoring","item":"Error alerting configured","status":"fail","notes":"No alerting system"},
{"projectName":"Hammer Dashboard","category":"Logging & Monitoring","item":"Uptime monitoring","status":"fail","notes":"No external monitoring"},
{"projectName":"Hammer Dashboard","category":"Logging & Monitoring","item":"Log aggregation / centralized logging","status":"fail","notes":"No log aggregation"},
{"projectName":"Hammer Dashboard","category":"Infrastructure","item":"Container isolation (separate containers per service)","status":"pass","notes":"Docker compose with separate backend + db containers"},
{"projectName":"Hammer Dashboard","category":"Infrastructure","item":"Minimal base images","status":"partial","notes":"Uses oven/bun - not minimal but purpose-built"},
{"projectName":"Hammer Dashboard","category":"Infrastructure","item":"No root user in containers","status":"not_checked","notes":"Need to verify Dockerfile USER directive"},
{"projectName":"Hammer Dashboard","category":"Infrastructure","item":"Docker health checks defined","status":"fail","notes":"No HEALTHCHECK in Dockerfile"},
{"projectName":"Hammer Dashboard","category":"Infrastructure","item":"Secrets not baked into images","status":"pass","notes":"Secrets via env vars at runtime"},
{"projectName":"Hammer Dashboard","category":"Infrastructure","item":"Automated deployment (CI/CD)","status":"pass","notes":"Gitea Actions + Dokploy deploy"},
{"projectName":"Hammer Dashboard","category":"Security Headers","item":"Content-Security-Policy (CSP)","status":"fail","notes":"No CSP header set at application level"},
{"projectName":"Hammer Dashboard","category":"Security Headers","item":"X-Content-Type-Options: nosniff","status":"not_checked","notes":"May be set by Traefik"},
{"projectName":"Hammer Dashboard","category":"Security Headers","item":"X-Frame-Options: DENY","status":"not_checked","notes":"May be set by Traefik"},
{"projectName":"Hammer Dashboard","category":"Security Headers","item":"Referrer-Policy","status":"not_checked","notes":"Not configured at app level"},
{"projectName":"Hammer Dashboard","category":"Security Headers","item":"Permissions-Policy","status":"fail","notes":"Not configured"}
]}'
echo " Hammer Dashboard: $([ $? -eq 0 ] && echo "OK" || echo "FAIL")"
# Network App checklist
post_bulk '{"items":[
{"projectName":"Network App","category":"Auth & Session Management","item":"Passwords hashed securely","status":"pass","notes":"BetterAuth handles hashing"},
{"projectName":"Network App","category":"Auth & Session Management","item":"Session tokens cryptographically random","status":"pass","notes":"BetterAuth secure tokens"},
{"projectName":"Network App","category":"Auth & Session Management","item":"Session expiry enforced","status":"pass","notes":"7-day expiry with daily refresh"},
{"projectName":"Network App","category":"Auth & Session Management","item":"Secure cookie attributes","status":"pass","notes":"secure=true, sameSite=none, httpOnly, cross-subdomain scoped"},
{"projectName":"Network App","category":"Auth & Session Management","item":"CSRF protection enabled","status":"pass","notes":"BetterAuth CSRF enabled"},
{"projectName":"Network App","category":"Auth & Session Management","item":"MFA / 2FA available","status":"fail","notes":"No MFA support"},
{"projectName":"Network App","category":"Auth & Session Management","item":"Registration restricted (invite-only)","status":"pass","notes":"disableSignUp: true + 403 on signup endpoint"},
{"projectName":"Network App","category":"Auth & Session Management","item":"Password complexity requirements","status":"fail","notes":"No password policy enforced"},
{"projectName":"Network App","category":"Authorization","item":"Object-level access control (user-scoped queries)","status":"pass","notes":"All queries use eq(clients.userId, user.id)"},
{"projectName":"Network App","category":"Authorization","item":"Function-level access control (admin vs user)","status":"pass","notes":"Admin routes check user.role === admin"},
{"projectName":"Network App","category":"Authorization","item":"Centralized auth middleware","status":"pass","notes":"authMiddleware Elysia plugin"},
{"projectName":"Network App","category":"Authorization","item":"Field-level input validation","status":"partial","notes":"Most fields validated. Role field accepts arbitrary strings."},
{"projectName":"Network App","category":"Input Validation","item":"All API inputs validated","status":"pass","notes":"34+ route files use Elysia t.Object() schemas"},
{"projectName":"Network App","category":"Input Validation","item":"SQL injection prevented","status":"pass","notes":"Drizzle ORM parameterized queries"},
{"projectName":"Network App","category":"Input Validation","item":"XSS prevention","status":"pass","notes":"React auto-escapes; API returns JSON"},
{"projectName":"Network App","category":"Input Validation","item":"File upload validation","status":"partial","notes":"Document uploads exist. Need to verify size/type checks"},
{"projectName":"Network App","category":"Transport & Data Protection","item":"HTTPS enforced","status":"pass","notes":"Let'\''s Encrypt TLS"},
{"projectName":"Network App","category":"Transport & Data Protection","item":"CORS properly restricted","status":"partial","notes":"Falls back to localhost:3000 if env not set"},
{"projectName":"Network App","category":"Transport & Data Protection","item":"PII encryption at rest","status":"fail","notes":"Contact data stored as plain text"},
{"projectName":"Network App","category":"Transport & Data Protection","item":"API key rotation for external services","status":"fail","notes":"Resend API key not rotated"},
{"projectName":"Network App","category":"Rate Limiting & Abuse Prevention","item":"Rate limiting on auth endpoints","status":"pass","notes":"5 req/min per IP on auth"},
{"projectName":"Network App","category":"Rate Limiting & Abuse Prevention","item":"Rate limiting on API endpoints","status":"pass","notes":"100 req/min global per IP"},
{"projectName":"Network App","category":"Rate Limiting & Abuse Prevention","item":"Rate limiting on AI endpoints","status":"pass","notes":"10 req/min on AI routes"},
{"projectName":"Network App","category":"Rate Limiting & Abuse Prevention","item":"Rate limit headers in responses","status":"pass","notes":"Returns Retry-After on 429"},
{"projectName":"Network App","category":"Error Handling","item":"Generic error messages in production","status":"fail","notes":"Stack traces included in error responses"},
{"projectName":"Network App","category":"Error Handling","item":"No stack traces leaked","status":"fail","notes":"Error handler sends stack to client"},
{"projectName":"Network App","category":"Error Handling","item":"Consistent error response format","status":"pass","notes":"Standardized error format"},
{"projectName":"Network App","category":"Error Handling","item":"Error boundary in frontend","status":"pass","notes":"ErrorBoundary + ToastContainer implemented"},
{"projectName":"Network App","category":"Logging & Monitoring","item":"Audit logging implemented","status":"pass","notes":"audit_logs table tracks all CRUD operations"},
{"projectName":"Network App","category":"Logging & Monitoring","item":"Structured logging","status":"fail","notes":"Console-based logging only"},
{"projectName":"Network App","category":"Logging & Monitoring","item":"Error alerting","status":"fail","notes":"No alerting configured"},
{"projectName":"Network App","category":"Logging & Monitoring","item":"Uptime monitoring","status":"fail","notes":"No external monitoring"},
{"projectName":"Network App","category":"Infrastructure","item":"Container isolation","status":"pass","notes":"Separate Docker containers"},
{"projectName":"Network App","category":"Infrastructure","item":"Docker health checks","status":"fail","notes":"No HEALTHCHECK in Dockerfile"},
{"projectName":"Network App","category":"Infrastructure","item":"Automated CI/CD","status":"pass","notes":"Gitea Actions + Dokploy"},
{"projectName":"Network App","category":"Security Headers","item":"Content-Security-Policy (CSP)","status":"fail","notes":"Not configured"},
{"projectName":"Network App","category":"Security Headers","item":"X-Content-Type-Options: nosniff","status":"not_checked","notes":"Needs verification"},
{"projectName":"Network App","category":"Security Headers","item":"X-Frame-Options","status":"not_checked","notes":"Needs verification"}
]}'
echo " Network App checklist done"
# Todo App checklist
post_bulk '{"items":[
{"projectName":"Todo App","category":"Auth & Session Management","item":"Passwords hashed securely","status":"pass","notes":"BetterAuth handles hashing"},
{"projectName":"Todo App","category":"Auth & Session Management","item":"Session tokens cryptographically random","status":"pass","notes":"BetterAuth secure tokens"},
{"projectName":"Todo App","category":"Auth & Session Management","item":"Session expiry enforced","status":"pass","notes":"BetterAuth defaults"},
{"projectName":"Todo App","category":"Auth & Session Management","item":"Secure cookie attributes","status":"pass","notes":"Configured in BetterAuth"},
{"projectName":"Todo App","category":"Auth & Session Management","item":"MFA / 2FA available","status":"fail","notes":"No MFA"},
{"projectName":"Todo App","category":"Auth & Session Management","item":"Registration restricted (invite-only)","status":"pass","notes":"Invite system with expiring tokens"},
{"projectName":"Todo App","category":"Authorization","item":"Object-level access control","status":"pass","notes":"Tasks filtered by eq(tasks.userId, userId)"},
{"projectName":"Todo App","category":"Authorization","item":"Function-level access control","status":"pass","notes":"Admin role checking on admin routes"},
{"projectName":"Todo App","category":"Authorization","item":"Service account scope limited","status":"partial","notes":"Hammer service has broad access to create/update for any user"},
{"projectName":"Todo App","category":"Input Validation","item":"API inputs validated with schemas","status":"pass","notes":"Elysia t.Object() type validation on routes"},
{"projectName":"Todo App","category":"Input Validation","item":"SQL injection prevented","status":"pass","notes":"Drizzle ORM"},
{"projectName":"Todo App","category":"Input Validation","item":"XSS prevention","status":"pass","notes":"React + JSON API"},
{"projectName":"Todo App","category":"Transport & Data Protection","item":"HTTPS enforced","status":"pass","notes":"Let'\''s Encrypt TLS"},
{"projectName":"Todo App","category":"Transport & Data Protection","item":"CORS properly restricted","status":"partial","notes":"Falls back to localhost:5173 if env not set"},
{"projectName":"Todo App","category":"Transport & Data Protection","item":"Database backups","status":"fail","notes":"No backup strategy"},
{"projectName":"Todo App","category":"Rate Limiting & Abuse Prevention","item":"Rate limiting on auth endpoints","status":"fail","notes":"No rate limiting"},
{"projectName":"Todo App","category":"Rate Limiting & Abuse Prevention","item":"Rate limiting on API endpoints","status":"fail","notes":"No rate limiting"},
{"projectName":"Todo App","category":"Error Handling","item":"Generic error messages in production","status":"pass","notes":"Checks NODE_ENV for stack traces"},
{"projectName":"Todo App","category":"Error Handling","item":"Consistent error format","status":"pass","notes":"Standardized error responses"},
{"projectName":"Todo App","category":"Logging & Monitoring","item":"Audit logging","status":"fail","notes":"No audit logging"},
{"projectName":"Todo App","category":"Logging & Monitoring","item":"Structured logging","status":"fail","notes":"Console-only"},
{"projectName":"Todo App","category":"Logging & Monitoring","item":"Error alerting","status":"fail","notes":"No alerting"},
{"projectName":"Todo App","category":"Infrastructure","item":"Container isolation","status":"pass","notes":"Docker compose"},
{"projectName":"Todo App","category":"Infrastructure","item":"Docker health checks","status":"fail","notes":"No HEALTHCHECK"},
{"projectName":"Todo App","category":"Infrastructure","item":"Automated CI/CD","status":"pass","notes":"Gitea Actions + Dokploy"},
{"projectName":"Todo App","category":"Security Headers","item":"CSP header","status":"fail","notes":"Not configured"},
{"projectName":"Todo App","category":"Security Headers","item":"X-Content-Type-Options","status":"not_checked","notes":"Needs verification"}
]}'
echo " Todo App checklist done"
# nKode checklist
post_bulk '{"items":[
{"projectName":"nKode","category":"Auth & Session Management","item":"OPAQUE protocol (zero-knowledge password)","status":"pass","notes":"Server never sees plaintext passwords"},
{"projectName":"nKode","category":"Auth & Session Management","item":"Argon2 password hashing in OPAQUE","status":"pass","notes":"Configured via opaque-ke features"},
{"projectName":"nKode","category":"Auth & Session Management","item":"OIDC token-based sessions","status":"pass","notes":"Full OIDC implementation with JWK signing"},
{"projectName":"nKode","category":"Auth & Session Management","item":"MFA / 2FA available","status":"fail","notes":"No second factor. OPAQUE is single-factor"},
{"projectName":"nKode","category":"Auth & Session Management","item":"Cryptographic session signatures","status":"pass","notes":"HEADER_SIGNATURE + HEADER_TIMESTAMP verification"},
{"projectName":"nKode","category":"Authorization","item":"Token-based authorization","status":"pass","notes":"OIDC JWT tokens for API auth"},
{"projectName":"nKode","category":"Authorization","item":"Auth extractors for route protection","status":"pass","notes":"extractors.rs provides consistent auth extraction"},
{"projectName":"nKode","category":"Authorization","item":"Role-based access control","status":"fail","notes":"No visible RBAC. All authenticated users have equal access"},
{"projectName":"nKode","category":"Input Validation","item":"Type-safe deserialization (serde)","status":"pass","notes":"Rust serde enforces strict type contracts"},
{"projectName":"nKode","category":"Input Validation","item":"Memory safety (Rust)","status":"pass","notes":"Eliminates buffer overflows, use-after-free, data races"},
{"projectName":"nKode","category":"Input Validation","item":"SQL injection prevented","status":"pass","notes":"SQLx with parameterized queries"},
{"projectName":"nKode","category":"Transport & Data Protection","item":"HTTPS enforced","status":"pass","notes":"Let'\''s Encrypt TLS"},
{"projectName":"nKode","category":"Transport & Data Protection","item":"OPAQUE prevents password exposure","status":"pass","notes":"DB breach does not expose passwords"},
{"projectName":"nKode","category":"Transport & Data Protection","item":"Login data encryption at rest","status":"fail","notes":"Stored login data not encrypted at application level"},
{"projectName":"nKode","category":"Transport & Data Protection","item":"CORS properly restricted","status":"fail","notes":"Hardcoded localhost origins in production code"},
{"projectName":"nKode","category":"Rate Limiting & Abuse Prevention","item":"Rate limiting on auth endpoints","status":"fail","notes":"No tower-governor or rate limiting middleware"},
{"projectName":"nKode","category":"Rate Limiting & Abuse Prevention","item":"Rate limiting on API endpoints","status":"fail","notes":"No rate limiting"},
{"projectName":"nKode","category":"Rate Limiting & Abuse Prevention","item":"Argon2 DoS protection","status":"fail","notes":"Expensive OPAQUE/Argon2 flows could be abused for resource exhaustion"},
{"projectName":"nKode","category":"Error Handling","item":"Proper Axum error types","status":"pass","notes":"Uses Axum error handling properly"},
{"projectName":"nKode","category":"Error Handling","item":"No stack traces leaked","status":"pass","notes":"Rust error handling is explicit"},
{"projectName":"nKode","category":"Logging & Monitoring","item":"Structured logging (tracing crate)","status":"pass","notes":"Uses Rust tracing ecosystem"},
{"projectName":"nKode","category":"Logging & Monitoring","item":"Log aggregation","status":"fail","notes":"Logs to stdout only"},
{"projectName":"nKode","category":"Logging & Monitoring","item":"Error alerting","status":"fail","notes":"No alerting"},
{"projectName":"nKode","category":"Infrastructure","item":"Container isolation","status":"pass","notes":"Docker on Dokploy"},
{"projectName":"nKode","category":"Infrastructure","item":"Minimal base image (Rust binary)","status":"pass","notes":"Small attack surface"},
{"projectName":"nKode","category":"Infrastructure","item":"Docker health checks","status":"fail","notes":"No HEALTHCHECK"},
{"projectName":"nKode","category":"Security Headers","item":"CSP header","status":"fail","notes":"Not configured"}
]}'
echo " nKode checklist done"
# Infrastructure checklist
post_bulk '{"items":[
{"projectName":"Infrastructure","category":"Auth & Session Management","item":"SSH key authentication","status":"pass","notes":"VPS supports SSH key auth"},
{"projectName":"Infrastructure","category":"Auth & Session Management","item":"SSH password auth disabled","status":"not_checked","notes":"Needs audit on both VPS"},
{"projectName":"Infrastructure","category":"Auth & Session Management","item":"Gitea auth properly configured","status":"pass","notes":"Self-hosted with authenticated access"},
{"projectName":"Infrastructure","category":"Auth & Session Management","item":"Git credentials not in URLs","status":"fail","notes":"Credentials embedded in remote URLs"},
{"projectName":"Infrastructure","category":"Transport & Data Protection","item":"TLS on all public endpoints","status":"pass","notes":"All 7+ domains have valid Let'\''s Encrypt certs"},
{"projectName":"Infrastructure","category":"Transport & Data Protection","item":"DNSSEC enabled","status":"fail","notes":"No DNSSEC on donovankelly.xyz"},
{"projectName":"Infrastructure","category":"Transport & Data Protection","item":"Centralized backup strategy","status":"fail","notes":"No unified backup across services"},
{"projectName":"Infrastructure","category":"Transport & Data Protection","item":"Secrets rotation policy","status":"fail","notes":"No rotation schedule for tokens/passwords"},
{"projectName":"Infrastructure","category":"Infrastructure","item":"Firewall rules documented and audited","status":"fail","notes":"No documentation of iptables/ufw rules"},
{"projectName":"Infrastructure","category":"Infrastructure","item":"Exposed ports audited","status":"fail","notes":"No port scan audit performed"},
{"projectName":"Infrastructure","category":"Infrastructure","item":"SSH on non-default port","status":"not_checked","notes":"Needs verification"},
{"projectName":"Infrastructure","category":"Infrastructure","item":"Fail2ban installed and configured","status":"fail","notes":"No IDS/IPS verified"},
{"projectName":"Infrastructure","category":"Infrastructure","item":"Unattended security updates enabled","status":"not_checked","notes":"Needs verification on both VPS"},
{"projectName":"Infrastructure","category":"Infrastructure","item":"Container vulnerability scanning","status":"fail","notes":"No Trivy or similar scanning"},
{"projectName":"Infrastructure","category":"Logging & Monitoring","item":"Centralized log aggregation","status":"fail","notes":"Each container logs independently to stdout"},
{"projectName":"Infrastructure","category":"Logging & Monitoring","item":"Uptime monitoring for all domains","status":"fail","notes":"No UptimeRobot or similar"},
{"projectName":"Infrastructure","category":"Logging & Monitoring","item":"Intrusion detection system","status":"fail","notes":"No IDS on either VPS"},
{"projectName":"Infrastructure","category":"Logging & Monitoring","item":"System log monitoring","status":"fail","notes":"No syslog analysis"},
{"projectName":"Infrastructure","category":"Security Headers","item":"HSTS on all domains","status":"not_checked","notes":"Needs verification at Traefik level"},
{"projectName":"Infrastructure","category":"Security Headers","item":"Security headers middleware in Traefik","status":"not_checked","notes":"Needs verification"}
]}'
echo " Infrastructure checklist done"
echo ""
echo "=== All checklist items seeded! ==="
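The bulk payloads above are hand-written JSON inside shell single quotes, so a single stray quote would yield a 4xx response that the helper silently discards. A minimal pre-flight lint could catch that before anything is POSTed; this is a sketch only, and `validate_json` is a hypothetical helper that is not part of the seed scripts:

```shell
#!/bin/bash
# Hypothetical pre-flight check, not part of the original seed scripts:
# lint a payload with python3's stdlib JSON parser before POSTing it.
validate_json() {
  printf '%s' "$1" | python3 -m json.tool > /dev/null 2>&1
}

validate_json '{"items":[{"projectName":"Todo App"}]}' && echo "payload ok"
validate_json '{"items":['                             || echo "payload malformed"
```

It relies only on `python3` being present (no jq dependency), which fits a seed script run from a bare VPS shell.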

seed-via-api.sh Executable file

@@ -0,0 +1,155 @@
#!/bin/bash
# Seed security data via API (more reliable than startup seed)
TOKEN="62490648ae3f8712e2a30eb0ca46ac2f"
BASE="https://dash.donovankelly.xyz/api/security"
post() {
  curl -s -X POST "$BASE$1" \
    -H "Authorization: Bearer $TOKEN" \
    -H "Content-Type: application/json" \
    -d "$2" > /dev/null 2>&1
}
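Because `post()` redirects everything to /dev/null, a failed request is indistinguishable from a successful one. A failure-aware variant is sketched below; `post_checked` is an assumption, not part of this script, and it uses curl's `--write-out` to capture the HTTP status instead of discarding all output:

```shell
#!/bin/bash
# Sketch of a failure-aware variant of post(); post_checked is an
# assumption, not part of seed-via-api.sh. Non-2xx responses are
# reported on stderr instead of being swallowed.
BASE="${BASE:-https://dash.example.invalid/api/security}"  # placeholder
TOKEN="${TOKEN:-changeme}"                                 # placeholder

post_checked() {
  local code
  # -o /dev/null drops the body; -w '%{http_code}' prints only the status.
  code=$(curl -s -o /dev/null -w '%{http_code}' -X POST "$BASE$1" \
    -H "Authorization: Bearer $TOKEN" \
    -H "Content-Type: application/json" \
    -d "$2")
  case "$code" in
    2??) return 0 ;;
    *)   echo "POST $1 failed with HTTP $code" >&2; return 1 ;;
  esac
}
```

With this in place, each `echo " ... done"` line could be gated on the return code so the seed run stops at the first failing payload.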
echo "=== Seeding OWASP API Top 10 Audits ==="
# Hammer Dashboard - OWASP
post "/" '{
"projectName": "Hammer Dashboard",
"category": "OWASP API Top 10",
"score": 62,
"findings": [
{"id":"hd-o1","status":"needs_improvement","title":"API1 - Broken Object Level Authorization","description":"Task, comment, and audit endpoints use UUID-based IDs but do not verify the requesting user owns the resource. Any authenticated user can read/modify any task via /api/tasks/:id. Bearer token grants full access to all resources.","recommendation":"Add owner checks on all CRUD operations. Ensure bearer token scoping per-client."},
{"id":"hd-o2","status":"strong","title":"API2 - Broken Authentication","description":"BetterAuth with email/password, CSRF protection enabled (disableCSRFCheck:false), cookie-based sessions scoped to .donovankelly.xyz. Dual auth: session + bearer token.","recommendation":""},
{"id":"hd-o3","status":"needs_improvement","title":"API3 - Broken Object Property Level Authorization","description":"Task PATCH endpoint accepts any field in the body including status, assigneeId, progressNotes. No field-level restrictions based on user role.","recommendation":"Implement field-level access control. Restrict which fields each role can modify."},
{"id":"hd-o4","status":"critical","title":"API4 - Unrestricted Resource Consumption","description":"No rate limiting middleware on any endpoint. GET /api/tasks returns all tasks without pagination. No request body size limits configured in Elysia.","recommendation":"Add rate limiting middleware (per-IP, per-user). Implement pagination. Add body size limits."},
{"id":"hd-o5","status":"needs_improvement","title":"API5 - Broken Function Level Authorization","description":"Admin routes check role but /api/invite allows any bearer token holder to create users. /api/security endpoints use same bearer token for read and write.","recommendation":"Implement separate admin vs. user bearer tokens. Add function-level permission checks."},
{"id":"hd-o6","status":"strong","title":"API6 - Unrestricted Access to Sensitive Business Flows","description":"No sensitive business flows (no payment, no password reset via email). Task creation and audit management are internal tools only.","recommendation":""},
{"id":"hd-o7","status":"strong","title":"API7 - Server Side Request Forgery","description":"No endpoints that accept URLs or make outbound requests based on user input. No SSRF vectors identified.","recommendation":""},
{"id":"hd-o8","status":"needs_improvement","title":"API8 - Security Misconfiguration","description":"CORS allows http://localhost:5173 in production. Error logging is console-only, no structured logging. No security headers at app level.","recommendation":"Remove localhost from production CORS. Add structured logging. Add security response headers."},
{"id":"hd-o9","status":"needs_improvement","title":"API9 - Improper Inventory Management","description":"No API documentation or OpenAPI spec. No versioning on endpoints. Bearer token is static across all environments.","recommendation":"Add OpenAPI/Swagger documentation. Implement API versioning. Use environment-specific tokens."},
{"id":"hd-o10","status":"strong","title":"API10 - Unsafe Consumption of APIs","description":"Dashboard does not consume external APIs. All data is internal. No third-party API dependencies.","recommendation":""}
]
}'
echo " Hammer Dashboard OWASP ✓"
# Network App - OWASP
post "/" '{
"projectName": "Network App",
"category": "OWASP API Top 10",
"score": 72,
"findings": [
{"id":"na-o1","status":"strong","title":"API1 - Broken Object Level Authorization","description":"All client/interaction/email endpoints filter by session userId. Users can only access their own data. Properly scoped.","recommendation":""},
{"id":"na-o2","status":"strong","title":"API2 - Broken Authentication","description":"BetterAuth with email/password. Session middleware on all protected routes. Invite-based signup with role assignment. CSRF protection enabled.","recommendation":""},
{"id":"na-o3","status":"needs_improvement","title":"API3 - Broken Object Property Level Authorization","description":"Client PATCH accepts any body fields. communicationStyle JSONB stored without field validation. AI-generated email content stored without sanitization.","recommendation":"Validate JSONB fields against schema. Sanitize AI output properties."},
{"id":"na-o4","status":"strong","title":"API4 - Unrestricted Resource Consumption","description":"Rate limiting implemented via custom middleware with per-IP buckets. Different limits: auth=5/min, AI=10/min, general=100/min. Client list has pagination support.","recommendation":""},
{"id":"na-o5","status":"needs_improvement","title":"API5 - Broken Function Level Authorization","description":"Admin routes properly check role. However, Hammer API key grants write access to all endpoints. No per-tenant isolation.","recommendation":"Scope API key permissions. Add tenant isolation."},
{"id":"na-o6","status":"needs_improvement","title":"API6 - Unrestricted Access to Sensitive Business Flows","description":"Bulk email endpoint could send unlimited emails. No daily send limits. Meeting prep AI endpoint has no cooldown.","recommendation":"Add daily email send limits per user. Add cooldown on AI endpoints."},
{"id":"na-o7","status":"strong","title":"API7 - Server Side Request Forgery","description":"No endpoints accept arbitrary URLs. Resend SDK and AI SDK configured with fixed endpoints. No SSRF vectors.","recommendation":""},
{"id":"na-o8","status":"needs_improvement","title":"API8 - Security Misconfiguration","description":"CORS origin list includes localhost in production. Error handler exposes stack traces in non-production mode. No security response headers.","recommendation":"Set NODE_ENV=production. Remove localhost CORS. Add security headers."},
{"id":"na-o9","status":"needs_improvement","title":"API9 - Improper Inventory Management","description":"No API documentation. 25+ route files with no centralized inventory. Multiple auth methods not documented.","recommendation":"Generate OpenAPI spec from Elysia schema. Document all auth methods."},
{"id":"na-o10","status":"needs_improvement","title":"API10 - Unsafe Consumption of APIs","description":"Uses LangChain with Anthropic/OpenAI for AI features. Resend SDK for emails. Third-party API responses served to users without sanitization.","recommendation":"Validate and sanitize AI-generated content. Add circuit breakers for external API calls."}
]
}'
echo " Network App OWASP ✓"
# Todo App - OWASP
post "/" '{
"projectName": "Todo App",
"category": "OWASP API Top 10",
"score": 55,
"findings": [
{"id":"ta-o1","status":"needs_improvement","title":"API1 - Broken Object Level Authorization","description":"Task endpoints filter by userId from session. However, comment and label endpoints use project-scoped access without verifying project membership in all paths.","recommendation":"Verify project membership on all project-scoped endpoints."},
{"id":"ta-o2","status":"strong","title":"API2 - Broken Authentication","description":"BetterAuth with email/password. authMiddleware derives user from session on all protected routes. Invite-based registration.","recommendation":""},
{"id":"ta-o3","status":"needs_improvement","title":"API3 - Broken Object Property Level Authorization","description":"Task PATCH accepts arbitrary body fields. No validation that users cannot modify fields like assigneeId to assign to non-project-members.","recommendation":"Add field-level validation. Restrict assignable users to project members."},
{"id":"ta-o4","status":"critical","title":"API4 - Unrestricted Resource Consumption","description":"No rate limiting middleware on any endpoint. Task list has no pagination limits. Comment creation has no rate limit. No request body size limits.","recommendation":"Implement rate limiting. Add pagination with max limits. Add body size limits."},
{"id":"ta-o5","status":"needs_improvement","title":"API5 - Broken Function Level Authorization","description":"Admin routes check role properly. Hammer routes use API key auth. But no granular permissions within projects.","recommendation":"Add project-level roles (admin, member, viewer)."},
{"id":"ta-o6","status":"strong","title":"API6 - Unrestricted Access to Sensitive Business Flows","description":"No sensitive business flows. Todo management is straightforward CRUD.","recommendation":""},
{"id":"ta-o7","status":"strong","title":"API7 - Server Side Request Forgery","description":"No endpoints accept URLs or make outbound requests based on user input.","recommendation":""},
{"id":"ta-o8","status":"needs_improvement","title":"API8 - Security Misconfiguration","description":"CORS uses ALLOWED_ORIGINS env var with fallback to localhost. Error handler exposes stack traces when NODE_ENV is not production.","recommendation":"Ensure NODE_ENV=production in deploy. Add structured logging."},
{"id":"ta-o9","status":"needs_improvement","title":"API9 - Improper Inventory Management","description":"No API documentation. Multiple auth methods not documented. No versioning.","recommendation":"Add OpenAPI spec. Document auth methods."},
{"id":"ta-o10","status":"strong","title":"API10 - Unsafe Consumption of APIs","description":"No external API consumption. Email sending uses Resend SDK with fixed config.","recommendation":""}
]
}'
echo " Todo App OWASP ✓"
# nKode - OWASP
post "/" '{
"projectName": "nKode",
"category": "OWASP API Top 10",
"score": 78,
"findings": [
{"id":"nk-o1","status":"strong","title":"API1 - Broken Object Level Authorization","description":"Rust Axum backend with user-scoped data access via auth extractors. Login data tied to authenticated user sessions.","recommendation":""},
{"id":"nk-o2","status":"strong","title":"API2 - Broken Authentication","description":"OPAQUE zero-knowledge password protocol. Server never sees plaintext passwords. Argon2 KSF. HMAC-signed sessions with timestamp validation.","recommendation":""},
{"id":"nk-o3","status":"needs_improvement","title":"API3 - Broken Object Property Level Authorization","description":"Serde deserialization enforces type safety but no explicit field-level access control for different user roles.","recommendation":"Add explicit field whitelisting per role if multi-role access is added."},
{"id":"nk-o4","status":"needs_improvement","title":"API4 - Unrestricted Resource Consumption","description":"No rate limiting middleware (no tower-governor). OPAQUE uses Argon2 which is CPU-intensive. No protection against auth endpoint abuse.","recommendation":"Add tower-governor rate limiting. Implement progressive delays on auth attempts."},
{"id":"nk-o5","status":"needs_improvement","title":"API5 - Broken Function Level Authorization","description":"No visible RBAC system. All authenticated users have equal access. No admin vs user distinction.","recommendation":"Add role-based access when admin features are needed."},
{"id":"nk-o6","status":"strong","title":"API6 - Unrestricted Access to Sensitive Business Flows","description":"Password manager operations are user-scoped. No sensitive business flows beyond standard CRUD.","recommendation":""},
{"id":"nk-o7","status":"strong","title":"API7 - Server Side Request Forgery","description":"No endpoints accept arbitrary URLs. Backend only communicates with its own database.","recommendation":""},
{"id":"nk-o8","status":"needs_improvement","title":"API8 - Security Misconfiguration","description":"CORS hardcodes localhost origins in production code. No security response headers. OIDC configuration could leak implementation details.","recommendation":"Configure CORS via environment. Add security headers middleware."},
{"id":"nk-o9","status":"needs_improvement","title":"API9 - Improper Inventory Management","description":"No API documentation. No OpenAPI spec generated. No API versioning.","recommendation":"Add utoipa for OpenAPI generation from Rust types."},
{"id":"nk-o10","status":"strong","title":"API10 - Unsafe Consumption of APIs","description":"No external API consumption. All operations are local database reads/writes.","recommendation":""}
]
}'
echo " nKode OWASP ✓"
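Each audit above is a multi-kilobyte JSON literal wedged into a shell single-quoted string, which is why every apostrophe needs the `'\''` dance. One alternative layout, sketched here under assumed paths (the `post()` stub stands in for the real helper defined at the top of this file), keeps each payload in its own .json file and loops over them:

```shell
#!/bin/bash
# Sketch only: drive the seeding loop from payload files instead of
# inline literals. The post() stub below stands in for the real helper
# defined at the top of seed-via-api.sh; the payload directory is assumed.
post() { echo "would POST $1 (${#2} bytes)"; }

mkdir -p /tmp/seed-payloads
printf '%s' '{"projectName":"nKode","category":"demo"}' \
  > /tmp/seed-payloads/nkode-demo.json

for payload in /tmp/seed-payloads/*.json; do
  post "/" "$(cat "$payload")"
done
```

Payloads kept as plain files can be linted and diffed independently of shell quoting, at the cost of the script no longer being a single self-contained file.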
echo ""
echo "=== Seeding Category Audits ==="
# Hammer Dashboard categories
post "/" '{"projectName":"Hammer Dashboard","category":"Authentication","score":80,"findings":[{"id":"hd-a1","status":"strong","title":"BetterAuth integration","description":"Properly configured BetterAuth with email/password auth, CSRF protection, secure cookie settings.","recommendation":""},{"id":"hd-a2","status":"strong","title":"Role-based access control","description":"Users have roles (admin/user). Admin routes check role before processing.","recommendation":""},{"id":"hd-a3","status":"strong","title":"Bearer token + session dual auth","description":"API supports both session cookies and bearer token for programmatic access.","recommendation":""},{"id":"hd-a4","status":"critical","title":"Static shared bearer token","description":"API_BEARER_TOKEN is a single static token for all API consumers. Compromise of one integration exposes all.","recommendation":"Implement per-client API keys with scoped permissions."},{"id":"hd-a5","status":"needs_improvement","title":"No object-level authorization","description":"Tasks and audits do not check ownership. Any authenticated user can modify any resource.","recommendation":"Add ownership checks on all CRUD operations."}]}'
echo " HD/Authentication ✓"
post "/" '{"projectName":"Hammer Dashboard","category":"Input Validation","score":70,"findings":[{"id":"hd-iv1","status":"strong","title":"TypeBox schema validation","description":"Elysia uses TypeBox for request body validation on most endpoints.","recommendation":""},{"id":"hd-iv2","status":"needs_improvement","title":"JSONB fields not deeply validated","description":"progressNotes, findings, subtasks stored as JSONB without deep schema validation.","recommendation":"Add runtime validation for JSONB structures before DB storage."},{"id":"hd-iv3","status":"needs_improvement","title":"No input sanitization","description":"User input stored and returned as-is. Markdown content could contain scripts.","recommendation":"Add HTML/XSS sanitization on text fields."}]}'
echo " HD/Input Validation ✓"
post "/" '{"projectName":"Hammer Dashboard","category":"Infrastructure","score":78,"findings":[{"id":"hd-i1","status":"strong","title":"HTTPS everywhere","description":"All services behind Traefik with automatic Let'\''s Encrypt TLS certificates.","recommendation":""},{"id":"hd-i2","status":"strong","title":"Docker containerization","description":"Apps run in Docker with compose. No host-level service exposure.","recommendation":""},{"id":"hd-i3","status":"needs_improvement","title":"Database credentials in compose files","description":"PostgreSQL credentials visible in docker-compose files. Not using Docker secrets.","recommendation":"Use Docker secrets or external secret management."},{"id":"hd-i4","status":"needs_improvement","title":"No container image scanning","description":"Docker images built from source without vulnerability scanning.","recommendation":"Add Trivy container scanning in CI pipeline."}]}'
echo " HD/Infrastructure ✓"
post "/" '{"projectName":"Hammer Dashboard","category":"Logging & Monitoring","score":45,"findings":[{"id":"hd-lm1","status":"critical","title":"Console-only logging","description":"All logging via console.log/console.error. No structured logging, no log aggregation, no retention policy.","recommendation":"Implement structured logging (pino/winston). Add log aggregation (Loki, ELK)."},{"id":"hd-lm2","status":"critical","title":"No security event logging","description":"Failed auth attempts, permission denials, suspicious activity not logged separately.","recommendation":"Add dedicated security event logging with alerting."},{"id":"hd-lm3","status":"needs_improvement","title":"No monitoring or alerting","description":"No health check monitoring, no error rate tracking, no uptime alerts.","recommendation":"Add monitoring (Prometheus/Grafana) and alerting."}]}'
echo " HD/Logging ✓"
# Network App categories
post "/" '{"projectName":"Network App","category":"Authentication","score":82,"findings":[{"id":"na-a1","status":"strong","title":"BetterAuth with invite-only registration","description":"Email/password auth with invite-based signup. Session middleware on all protected routes.","recommendation":""},{"id":"na-a2","status":"strong","title":"Rate limiting on auth endpoints","description":"Auth endpoints limited to 5 requests/min per IP.","recommendation":""},{"id":"na-a3","status":"needs_improvement","title":"No MFA support","description":"Single-factor auth only.","recommendation":"Add TOTP MFA."},{"id":"na-a4","status":"needs_improvement","title":"No account lockout","description":"Failed login attempts not tracked. No lockout after repeated failures.","recommendation":"Add account lockout after 5 failed attempts."}]}'
echo " NA/Authentication ✓"
post "/" '{"projectName":"Network App","category":"Authorization","score":85,"findings":[{"id":"na-az1","status":"strong","title":"User-scoped data access","description":"All queries filter by userId from session. Users can only access their own clients, emails, events.","recommendation":""},{"id":"na-az2","status":"strong","title":"Admin role checks","description":"Admin endpoints verify role before processing.","recommendation":""},{"id":"na-az3","status":"needs_improvement","title":"Hammer API key is overprivileged","description":"Single API key grants full write access to all endpoints.","recommendation":"Scope API key to specific operations."}]}'
echo " NA/Authorization ✓"
post "/" '{"projectName":"Network App","category":"Data Protection","score":72,"findings":[{"id":"na-dp1","status":"strong","title":"HTTPS in transit","description":"All traffic encrypted via Traefik TLS termination.","recommendation":""},{"id":"na-dp2","status":"needs_improvement","title":"No encryption at rest","description":"Client PII (names, emails, phones, addresses) stored in plaintext in PostgreSQL.","recommendation":"Encrypt sensitive fields at rest or use PostgreSQL pgcrypto."},{"id":"na-dp3","status":"needs_improvement","title":"File uploads stored on local filesystem","description":"Client documents stored in uploads/documents/ without encryption. No virus scanning.","recommendation":"Encrypt uploaded files. Add antivirus scanning. Consider S3 with SSE."},{"id":"na-dp4","status":"critical","title":"Export endpoint dumps all user data","description":"GET /api/export/json returns complete database dump including PII. No audit trail for exports.","recommendation":"Add export audit logging. Require MFA for data exports. Add watermarking."}]}'
echo " NA/Data Protection ✓"
post "/" '{"projectName":"Network App","category":"Logging & Monitoring","score":65,"findings":[{"id":"na-lm1","status":"strong","title":"Audit logging implemented","description":"audit_logs table tracks create/update/delete/send operations with JSONB diffs, IP, user agent.","recommendation":""},{"id":"na-lm2","status":"needs_improvement","title":"No log aggregation","description":"Audit logs in DB but no centralized log aggregation or monitoring.","recommendation":"Add log forwarding to centralized system."},{"id":"na-lm3","status":"needs_improvement","title":"No anomaly detection","description":"No alerting on unusual patterns (bulk exports, mass deletes, off-hours access).","recommendation":"Add anomaly detection rules."}]}'
echo " NA/Logging ✓"
# Todo App categories
post "/" '{"projectName":"Todo App","category":"Authentication","score":70,"findings":[{"id":"ta-a1","status":"strong","title":"BetterAuth session auth","description":"Proper session-based authentication with invite-only registration.","recommendation":""},{"id":"ta-a2","status":"needs_improvement","title":"No rate limiting on auth","description":"No rate limiting on login/register endpoints. Vulnerable to brute-force.","recommendation":"Add rate limiting middleware."},{"id":"ta-a3","status":"needs_improvement","title":"No password policy","description":"No minimum password requirements configured.","recommendation":"Configure password policy."}]}'
echo " TA/Authentication ✓"
post "/" '{"projectName":"Todo App","category":"Authorization","score":65,"findings":[{"id":"ta-az1","status":"strong","title":"Session-based user scoping","description":"Tasks and projects filtered by user from session.","recommendation":""},{"id":"ta-az2","status":"needs_improvement","title":"No project-level roles","description":"Any project member can do anything. No viewer/editor/admin distinction.","recommendation":"Add project-level role-based access."},{"id":"ta-az3","status":"needs_improvement","title":"Comment access not fully scoped","description":"Comments on tasks may be visible across project boundaries.","recommendation":"Verify project membership on all comment operations."}]}'
echo " TA/Authorization ✓"
post "/" '{"projectName":"Todo App","category":"Error Handling","score":60,"findings":[{"id":"ta-eh1","status":"needs_improvement","title":"Stack traces in dev mode","description":"Error handler exposes stack traces when NODE_ENV is not production. Deployment may not set this.","recommendation":"Ensure NODE_ENV=production in deployment config."},{"id":"ta-eh2","status":"strong","title":"Generic error messages","description":"Production error responses return Internal server error without details.","recommendation":""},{"id":"ta-eh3","status":"needs_improvement","title":"Error codes not standardized","description":"Different error formats across routes (string vs object vs custom).","recommendation":"Standardize error response format with error codes."}]}'
echo " TA/Error Handling ✓"
# nKode categories
post "/" '{"projectName":"nKode","category":"Authentication","score":95,"findings":[{"id":"nk-a1","status":"strong","title":"OPAQUE zero-knowledge password protocol","description":"Uses OPAQUE protocol. Server never sees plaintext passwords. Argon2 KSF. HMAC-signed sessions.","recommendation":""},{"id":"nk-a2","status":"strong","title":"Cryptographic session validation","description":"Every request validated via HMAC signature of session ID + timestamp. Replay protection.","recommendation":""},{"id":"nk-a3","status":"needs_improvement","title":"No account recovery mechanism","description":"If user loses password, no recovery flow exists.","recommendation":"Add secure account recovery flow."}]}'
echo " nK/Authentication ✓"
post "/" '{"projectName":"nKode","category":"Cryptography","score":92,"findings":[{"id":"nk-c1","status":"strong","title":"OPAQUE-ke for password auth","description":"Industry-standard OPAQUE implementation with proper Argon2 key stretching.","recommendation":""},{"id":"nk-c2","status":"strong","title":"Audited Rust crypto crates","description":"Uses opaque-ke, argon2, hmac. Well-maintained, memory-safe implementations.","recommendation":""},{"id":"nk-c3","status":"needs_improvement","title":"No key rotation mechanism","description":"Server-side OPAQUE keys and HMAC secrets have no rotation mechanism.","recommendation":"Implement key rotation for OPAQUE server keys and HMAC secrets."}]}'
echo " nK/Cryptography ✓"
# Infrastructure categories
post "/" '{"projectName":"Infrastructure","category":"Transport Security","score":80,"findings":[{"id":"inf-ts1","status":"strong","title":"TLS everywhere via Traefik","description":"All public endpoints served over HTTPS with automatic Let'\''s Encrypt certificates.","recommendation":""},{"id":"inf-ts2","status":"strong","title":"HTTP to HTTPS redirect","description":"Traefik configured with automatic HTTP to HTTPS redirect.","recommendation":""},{"id":"inf-ts3","status":"needs_improvement","title":"TLS version not enforced","description":"Default Traefik TLS config may accept TLS 1.0/1.1.","recommendation":"Configure minimum TLS 1.2. Disable weak cipher suites."},{"id":"inf-ts4","status":"needs_improvement","title":"No HSTS header","description":"Strict-Transport-Security header not configured.","recommendation":"Add HSTS with min 1 year, includeSubDomains, preload."}]}'
echo " Infra/Transport Security ✓"
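# For reference -- a hedged sketch (not executed here) of how the TLS-version
# finding above could be addressed in Traefik v2 dynamic configuration; where
# this fragment lives depends on the deployment:
#
#   tls:
#     options:
#       default:
#         minVersion: VersionTLS12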
post "/" '{"projectName":"Infrastructure","category":"Security Headers","score":40,"findings":[{"id":"inf-sh1","status":"critical","title":"No Content-Security-Policy","description":"No CSP header on any application. XSS protection relies entirely on framework.","recommendation":"Add CSP headers. Start with report-only mode."},{"id":"inf-sh2","status":"critical","title":"No X-Frame-Options","description":"No clickjacking protection header. Apps could be embedded in malicious iframes.","recommendation":"Add X-Frame-Options: DENY or SAMEORIGIN."},{"id":"inf-sh3","status":"needs_improvement","title":"No X-Content-Type-Options","description":"Browser MIME type sniffing not prevented.","recommendation":"Add X-Content-Type-Options: nosniff."},{"id":"inf-sh4","status":"needs_improvement","title":"No Referrer-Policy","description":"No control over referrer information sent to third parties.","recommendation":"Add Referrer-Policy: strict-origin-when-cross-origin."},{"id":"inf-sh5","status":"needs_improvement","title":"No Permissions-Policy","description":"No restrictions on browser features (camera, microphone, geolocation).","recommendation":"Add Permissions-Policy header restricting unnecessary features."}]}'
echo " Infra/Security Headers ✓"
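# For reference -- a hedged sketch of a Traefik headers middleware covering the
# header findings above (HSTS, clickjacking, MIME sniffing, referrer policy,
# CSP). The middleware name "sec-headers" and the CSP value are illustrative,
# and the middleware would still need to be attached to each router:
#
#   http:
#     middlewares:
#       sec-headers:
#         headers:
#           stsSeconds: 31536000
#           stsIncludeSubdomains: true
#           stsPreload: true
#           customFrameOptionsValue: "DENY"
#           contentTypeNosniff: true
#           referrerPolicy: "strict-origin-when-cross-origin"
#           contentSecurityPolicy: "default-src 'self'"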
post "/" '{"projectName":"Infrastructure","category":"Secret Management","score":55,"findings":[{"id":"inf-sm1","status":"strong","title":"Bitwarden for credential storage","description":"Credentials stored in Bitwarden organizational vault.","recommendation":""},{"id":"inf-sm2","status":"critical","title":"Credentials in compose files","description":"Database passwords, API keys visible in docker-compose.yml and docker-compose.dokploy.yml.","recommendation":"Use Docker secrets or external vault (HashiCorp Vault)."},{"id":"inf-sm3","status":"needs_improvement","title":"Static API tokens","description":"Bearer tokens are static strings in env vars. No rotation, no expiry.","recommendation":"Implement token rotation. Add expiry dates."},{"id":"inf-sm4","status":"needs_improvement","title":"Git credential in URL","description":"Authenticated Git URLs (user:password@) used in Dokploy compose contexts.","recommendation":"Use SSH keys or deploy tokens instead of URL-embedded credentials."}]}'
echo " Infra/Secret Management ✓"
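# For reference -- a hedged sketch of moving compose-file credentials into
# Docker secrets, per the recommendation above (Compose spec syntax; service
# and secret names are illustrative):
#
#   services:
#     api:
#       secrets:
#         - db_password
#   secrets:
#     db_password:
#       file: ./secrets/db_password.txt
#
# The application then reads /run/secrets/db_password instead of an env var.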
post "/" '{"projectName":"Infrastructure","category":"Container Security","score":50,"findings":[{"id":"inf-cs1","status":"needs_improvement","title":"No Dockerfile linting","description":"Dockerfiles not validated with Hadolint or equivalent.","recommendation":"Add Hadolint to CI pipeline."},{"id":"inf-cs2","status":"needs_improvement","title":"No image vulnerability scanning","description":"Docker images built without Trivy or Grype scanning.","recommendation":"Add Trivy image scanning to CI."},{"id":"inf-cs3","status":"critical","title":"Containers may run as root","description":"No USER directive visible in Dockerfiles. Containers likely run as root.","recommendation":"Add non-root USER to all Dockerfiles. Run with read-only filesystem where possible."},{"id":"inf-cs4","status":"needs_improvement","title":"No resource limits","description":"Docker compose files do not set memory/CPU limits on containers.","recommendation":"Add resource limits to prevent container resource exhaustion."}]}'
echo " Infra/Container Security ✓"
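# For reference -- hedged sketches of the non-root and resource-limit
# recommendations above (user name, uid scheme, and limit values are
# illustrative):
#
#   # Dockerfile (Alpine-style user creation)
#   RUN addgroup -S app && adduser -S app -G app
#   USER app
#
#   # docker-compose.yml (Compose spec; honored by recent `docker compose`)
#   services:
#     api:
#       deploy:
#         resources:
#           limits:
#             cpus: "0.50"
#             memory: 256M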
echo ""
echo "=== All audits seeded! ==="
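# Note: the `post` helper used throughout this seed block is defined earlier
# in the entrypoint (outside this excerpt). It is assumed to be roughly a curl
# wrapper along these lines -- the variable names here are hypothetical:
#
#   post() {
#     curl -sf -X POST "${AUDIT_API_BASE}$1" \
#       -H "Authorization: Bearer ${AUDIT_API_TOKEN}" \
#       -H "Content-Type: application/json" \
#       --data "$2"
#   }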