feat: comprehensive security audit system - OWASP Top 10, checklist, score history, scan pipeline
Some checks failed: CI/CD test and deploy, and Security Scan (SAST - Semgrep, Dependency Scan - Trivy, Secret Detection - Gitleaks) were all cancelled.

Phase 1: OWASP API Top 10 audit per API, with real findings from code inspection
- Hammer Dashboard, Network App, Todo App, and nKode all audited against the 10 OWASP risks
- Per-API scorecards with visual grid, color-coded by status

Phase 2: Full security checklist
- 9 categories: Auth, Authz, Input Validation, Transport, Rate Limiting, etc.
- Interactive checklist UI with click-to-cycle status
- Per-project checklist with progress tracking
- Comprehensive category audits (Auth, Data Protection, Logging, Infrastructure, etc.)

Phase 3: Automated pipeline
- Semgrep SAST, Trivy dependency scan, Gitleaks secret detection
- Gitea Actions CI workflow (security-scan.yml)
- Scan results stored in DB and displayed in dashboard

Phase 4: Dashboard polish
- Overall security posture score with weighted calculation
- Score trend charts (SVG) with 7-day history
- Critical findings highlight section
- Score history snapshots API
- Tab-based navigation (Overview, Checklist, per-project)

New DB tables: security_score_history, security_checklist, security_scan_results
Seed data populated from real code review of all repos
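The Phase 4 weighted calculation (70% average audit score, 30% checklist pass rate, falling back to the audit average when a project has no checklist items) can be sketched as a pure function. The function name and shape here are illustrative, not the committed API:

```typescript
// Hypothetical sketch of the Phase 4 posture weighting:
// 70% average audit score + 30% checklist pass rate; when a project
// has no checklist items, the audit average stands alone.
function postureScore(
  auditScores: number[],
  checklistPass: number,
  checklistTotal: number
): number {
  const auditScore = auditScores.length
    ? Math.round(auditScores.reduce((a, b) => a + b, 0) / auditScores.length)
    : 0;
  if (!checklistTotal) return auditScore;
  const checklistScore = Math.round((checklistPass / checklistTotal) * 100);
  return Math.round(auditScore * 0.7 + checklistScore * 0.3);
}

console.log(postureScore([80, 100], 6, 10)); // audit avg 90, checklist 60 → 81
console.log(postureScore([60, 80], 0, 0));   // no checklist → audit avg 70
```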
Committed: 2026-01-30 15:16:10 +00:00
parent 797396497a · commit 061618cfab
8 changed files with 2207 additions and 1768 deletions


@@ -0,0 +1,176 @@
name: Security Scan

on:
  push:
    branches: [main]
  pull_request:
    branches: [main]
  schedule:
    - cron: '0 6 * * 1' # Weekly Monday 6am UTC

jobs:
  semgrep:
    name: SAST - Semgrep
    runs-on: ubuntu-latest
    container:
      image: semgrep/semgrep:latest
    steps:
      - uses: actions/checkout@v4
      - name: Run Semgrep
        run: |
          semgrep scan --config auto --json --output semgrep-results.json . || true
          echo "=== Semgrep Results ==="
          cat semgrep-results.json | python3 -c "
          import json, sys
          data = json.load(sys.stdin)
          results = data.get('results', [])
          print(f'Found {len(results)} issues')
          for r in results[:20]:
              sev = r.get('extra', {}).get('severity', 'unknown')
              msg = r.get('extra', {}).get('message', 'No message')[:100]
              path = r.get('path', '?')
              line = r.get('start', {}).get('line', '?')
              print(f' [{sev}] {path}:{line} - {msg}')
          " 2>/dev/null || echo "No results to parse"
      - name: Report to Dashboard
        if: always()
        run: |
          FINDINGS=$(cat semgrep-results.json 2>/dev/null | python3 -c "
          import json, sys
          data = json.load(sys.stdin)
          results = data.get('results', [])
          findings = []
          for r in results:
              findings.append({
                  'rule': r.get('check_id', 'unknown'),
                  'severity': r.get('extra', {}).get('severity', 'unknown'),
                  'message': r.get('extra', {}).get('message', '')[:200],
                  'path': r.get('path', ''),
                  'line': r.get('start', {}).get('line', 0),
              })
          print(json.dumps(findings))
          " 2>/dev/null || echo '[]')
          COUNT=$(echo "$FINDINGS" | python3 -c "import json,sys; print(len(json.load(sys.stdin)))" 2>/dev/null || echo 0)
          curl -s -X POST "$DASHBOARD_URL/api/security/scans" \
            -H "Authorization: Bearer $DASHBOARD_TOKEN" \
            -H "Content-Type: application/json" \
            -d "{
              \"projectName\": \"Hammer Dashboard\",
              \"scanType\": \"semgrep\",
              \"status\": \"completed\",
              \"findings\": $FINDINGS,
              \"summary\": {\"totalFindings\": $COUNT},
              \"triggeredBy\": \"ci\",
              \"commitSha\": \"$GITHUB_SHA\",
              \"branch\": \"$GITHUB_REF_NAME\"
            }" || true
        env:
          DASHBOARD_URL: https://dash.donovankelly.xyz
          DASHBOARD_TOKEN: ${{ secrets.DASHBOARD_TOKEN }}

  trivy:
    name: Dependency Scan - Trivy
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Run Trivy filesystem scan
        run: |
          curl -sfL https://raw.githubusercontent.com/aquasecurity/trivy/main/contrib/install.sh | sh -s -- -b /usr/local/bin
          trivy fs --format json --output trivy-results.json . || true
          echo "=== Trivy Results ==="
          trivy fs --severity HIGH,CRITICAL . || true
      - name: Report to Dashboard
        if: always()
        run: |
          FINDINGS=$(cat trivy-results.json 2>/dev/null | python3 -c "
          import json, sys
          data = json.load(sys.stdin)
          findings = []
          for result in data.get('Results', []):
              for vuln in result.get('Vulnerabilities', []):
                  findings.append({
                      'id': vuln.get('VulnerabilityID', ''),
                      'severity': vuln.get('Severity', ''),
                      'package': vuln.get('PkgName', ''),
                      'version': vuln.get('InstalledVersion', ''),
                      'fixedVersion': vuln.get('FixedVersion', ''),
                      'title': vuln.get('Title', '')[:200],
                  })
          print(json.dumps(findings[:50]))
          " 2>/dev/null || echo '[]')
          COUNT=$(echo "$FINDINGS" | python3 -c "import json,sys; print(len(json.load(sys.stdin)))" 2>/dev/null || echo 0)
          curl -s -X POST "$DASHBOARD_URL/api/security/scans" \
            -H "Authorization: Bearer $DASHBOARD_TOKEN" \
            -H "Content-Type: application/json" \
            -d "{
              \"projectName\": \"Hammer Dashboard\",
              \"scanType\": \"trivy\",
              \"status\": \"completed\",
              \"findings\": $FINDINGS,
              \"summary\": {\"totalFindings\": $COUNT},
              \"triggeredBy\": \"ci\",
              \"commitSha\": \"$GITHUB_SHA\",
              \"branch\": \"$GITHUB_REF_NAME\"
            }" || true
        env:
          DASHBOARD_URL: https://dash.donovankelly.xyz
          DASHBOARD_TOKEN: ${{ secrets.DASHBOARD_TOKEN }}

  gitleaks:
    name: Secret Detection - Gitleaks
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
        with:
          fetch-depth: 0
      - name: Run Gitleaks
        run: |
          curl -sSfL https://github.com/gitleaks/gitleaks/releases/download/v8.18.4/gitleaks_8.18.4_linux_x64.tar.gz | tar xz
          ./gitleaks detect --source . --report-format json --report-path gitleaks-results.json || true
          echo "=== Gitleaks Results ==="
          cat gitleaks-results.json | python3 -c "
          import json, sys
          data = json.load(sys.stdin)
          print(f'Found {len(data)} secrets')
          for r in data[:10]:
              print(f' [{r.get(\"RuleID\",\"?\")}] {r.get(\"File\",\"?\")}:{r.get(\"StartLine\",\"?\")}')
          " 2>/dev/null || echo "No leaks found"
      - name: Report to Dashboard
        if: always()
        run: |
          FINDINGS=$(cat gitleaks-results.json 2>/dev/null | python3 -c "
          import json, sys
          data = json.load(sys.stdin)
          findings = []
          for r in data:
              findings.append({
                  'ruleId': r.get('RuleID', ''),
                  'file': r.get('File', ''),
                  'line': r.get('StartLine', 0),
                  'commit': r.get('Commit', '')[:12],
                  'author': r.get('Author', ''),
              })
          print(json.dumps(findings[:50]))
          " 2>/dev/null || echo '[]')
          COUNT=$(echo "$FINDINGS" | python3 -c "import json,sys; print(len(json.load(sys.stdin)))" 2>/dev/null || echo 0)
          curl -s -X POST "$DASHBOARD_URL/api/security/scans" \
            -H "Authorization: Bearer $DASHBOARD_TOKEN" \
            -H "Content-Type: application/json" \
            -d "{
              \"projectName\": \"Hammer Dashboard\",
              \"scanType\": \"gitleaks\",
              \"status\": \"completed\",
              \"findings\": $FINDINGS,
              \"summary\": {\"totalFindings\": $COUNT},
              \"triggeredBy\": \"ci\",
              \"commitSha\": \"$GITHUB_SHA\",
              \"branch\": \"$GITHUB_REF_NAME\"
            }" || true
        env:
          DASHBOARD_URL: https://dash.donovankelly.xyz
          DASHBOARD_TOKEN: ${{ secrets.DASHBOARD_TOKEN }}
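All three "Report to Dashboard" steps POST the same JSON shape to `/api/security/scans`. A small TypeScript sketch of that payload; the builder function is hypothetical, only the field names come from the workflow above:

```typescript
// Illustrative builder for the scan-report payload the workflow POSTs.
// Field names mirror the curl -d bodies above; the function itself is
// not part of the commit.
type ScanType = "semgrep" | "trivy" | "gitleaks";

function buildScanReport(
  scanType: ScanType,
  findings: unknown[],
  commitSha: string,
  branch: string
) {
  return {
    projectName: "Hammer Dashboard",
    scanType,
    status: "completed",
    findings,
    summary: { totalFindings: findings.length },
    triggeredBy: "ci",
    commitSha,
    branch,
  };
}

const report = buildScanReport("semgrep", [{ rule: "demo" }], "abc1234", "main");
console.log(JSON.stringify(report));
```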


@@ -11,6 +11,6 @@ RUN bun install --frozen-lockfile 2>/dev/null || bun install
# Copy source and init script
COPY . .
-# Cache buster: 2026-01-30-v3
+# Cache buster: 2026-01-30-v4-security
EXPOSE 3100
-CMD ["sh", "-c", "echo 'Waiting for DB...' && sleep 5 && echo 'Running init SQL...' && psql \"$DATABASE_URL\" -f /app/init-tables.sql 2>&1 && echo 'Init SQL done' && echo 'Running db:push...' && yes | bun run db:push 2>&1; echo 'db:push exit code:' $? && echo 'Starting server...' && bun run start"]
+CMD ["sh", "-c", "echo 'Waiting for DB...' && sleep 5 && echo 'Running init SQL...' && psql \"$DATABASE_URL\" -f /app/init-tables.sql 2>&1 && echo 'Init SQL done' && echo 'Running db:push...' && yes | bun run db:push 2>&1; echo 'db:push exit code:' $? && echo 'Seeding security data...' && bun run src/seed-security.ts 2>&1; echo 'Seed exit:' $? && echo 'Starting server...' && bun run start"]


@@ -65,3 +65,54 @@ CREATE TABLE IF NOT EXISTS task_comments (
content TEXT NOT NULL,
created_at TIMESTAMPTZ NOT NULL DEFAULT now()
);
-- ═══ Security Score History ═══
CREATE TABLE IF NOT EXISTS security_score_history (
id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
project_name TEXT NOT NULL,
score INTEGER NOT NULL,
total_findings INTEGER NOT NULL DEFAULT 0,
critical_count INTEGER NOT NULL DEFAULT 0,
warning_count INTEGER NOT NULL DEFAULT 0,
strong_count INTEGER NOT NULL DEFAULT 0,
recorded_at TIMESTAMPTZ NOT NULL DEFAULT now()
);
-- ═══ Security Checklist ═══
DO $$ BEGIN
CREATE TYPE security_checklist_status AS ENUM ('pass', 'fail', 'partial', 'not_applicable', 'not_checked');
EXCEPTION WHEN duplicate_object THEN null;
END $$;
CREATE TABLE IF NOT EXISTS security_checklist (
id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
project_name TEXT NOT NULL,
checklist_category TEXT NOT NULL,
item TEXT NOT NULL,
status security_checklist_status NOT NULL DEFAULT 'not_checked',
notes TEXT,
checked_by TEXT,
checked_at TIMESTAMPTZ,
created_at TIMESTAMPTZ NOT NULL DEFAULT now(),
updated_at TIMESTAMPTZ NOT NULL DEFAULT now()
);
-- ═══ Security Scan Results ═══
CREATE TABLE IF NOT EXISTS security_scan_results (
id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
project_name TEXT NOT NULL,
scan_type TEXT NOT NULL,
scan_status TEXT NOT NULL DEFAULT 'pending',
findings JSONB DEFAULT '[]'::jsonb,
summary JSONB DEFAULT '{}'::jsonb,
triggered_by TEXT,
commit_sha TEXT,
branch TEXT,
duration INTEGER,
started_at TIMESTAMPTZ,
completed_at TIMESTAMPTZ,
created_at TIMESTAMPTZ NOT NULL DEFAULT now()
);


@@ -148,6 +148,67 @@ export const securityAudits = pgTable("security_audits", {
export type SecurityAudit = typeof securityAudits.$inferSelect;
export type NewSecurityAudit = typeof securityAudits.$inferInsert;
// ─── Security Score History ───
export const securityScoreHistory = pgTable("security_score_history", {
id: uuid("id").defaultRandom().primaryKey(),
projectName: text("project_name").notNull(),
score: integer("score").notNull(),
totalFindings: integer("total_findings").notNull().default(0),
criticalCount: integer("critical_count").notNull().default(0),
warningCount: integer("warning_count").notNull().default(0),
strongCount: integer("strong_count").notNull().default(0),
recordedAt: timestamp("recorded_at", { withTimezone: true }).defaultNow().notNull(),
});
export type SecurityScoreHistory = typeof securityScoreHistory.$inferSelect;
// ─── Security Checklist ───
export const securityChecklistStatusEnum = pgEnum("security_checklist_status", [
"pass",
"fail",
"partial",
"not_applicable",
"not_checked",
]);
export const securityChecklist = pgTable("security_checklist", {
id: uuid("id").defaultRandom().primaryKey(),
projectName: text("project_name").notNull(),
category: text("checklist_category").notNull(),
item: text("item").notNull(),
status: securityChecklistStatusEnum("status").notNull().default("not_checked"),
notes: text("notes"),
checkedBy: text("checked_by"),
checkedAt: timestamp("checked_at", { withTimezone: true }),
createdAt: timestamp("created_at", { withTimezone: true }).defaultNow().notNull(),
updatedAt: timestamp("updated_at", { withTimezone: true }).defaultNow().notNull(),
});
export type SecurityChecklistItem = typeof securityChecklist.$inferSelect;
export type NewSecurityChecklistItem = typeof securityChecklist.$inferInsert;
// ─── Security Scan Results ───
export const securityScanResults = pgTable("security_scan_results", {
id: uuid("id").defaultRandom().primaryKey(),
projectName: text("project_name").notNull(),
scanType: text("scan_type").notNull(), // semgrep, trivy, gitleaks
status: text("scan_status").notNull().default("pending"), // pending, running, completed, failed
findings: jsonb("findings").$type<any[]>().default([]),
summary: jsonb("summary").$type<Record<string, any>>().default({}),
triggeredBy: text("triggered_by"), // ci, manual
commitSha: text("commit_sha"),
branch: text("branch"),
duration: integer("duration"), // seconds
startedAt: timestamp("started_at", { withTimezone: true }),
completedAt: timestamp("completed_at", { withTimezone: true }),
createdAt: timestamp("created_at", { withTimezone: true }).defaultNow().notNull(),
});
export type SecurityScanResult = typeof securityScanResults.$inferSelect;
// ─── Daily Summaries ───
export interface SummaryHighlight {


@@ -1,7 +1,7 @@
import { Elysia, t } from "elysia";
import { db } from "../db";
-import { securityAudits } from "../db/schema";
-import { eq, asc, and } from "drizzle-orm";
+import { securityAudits, securityChecklist, securityScoreHistory, securityScanResults } from "../db/schema";
+import { eq, asc, desc, and, sql } from "drizzle-orm";
import { auth } from "../lib/auth";
const BEARER_TOKEN = process.env.API_BEARER_TOKEN || "hammer-dev-token";
@@ -39,16 +39,17 @@ export const securityRoutes = new Elysia({ prefix: "/api/security" })
set.status = 401;
return { error: "Unauthorized" };
}
-if (msg === "Audit not found") {
+if (msg === "Audit not found" || msg === "Not found") {
set.status = 404;
-return { error: "Audit not found" };
+return { error: msg };
}
console.error("Security route error:", msg);
set.status = 500;
return { error: "Internal server error" };
})
-// GET all audits
+// ─── Audit CRUD ───
.get("/", async ({ request, headers }) => {
await requireSessionOrBearer(request, headers);
const all = await db
@@ -58,7 +59,6 @@ export const securityRoutes = new Elysia({ prefix: "/api/security" })
return all;
})
-// GET summary (aggregate scores per project)
.get("/summary", async ({ request, headers }) => {
await requireSessionOrBearer(request, headers);
const all = await db
@@ -99,7 +99,6 @@ export const securityRoutes = new Elysia({ prefix: "/api/security" })
return summary;
})
-// GET audits for a specific project
.get(
"/project/:projectName",
async ({ params, request, headers }) => {
@@ -114,7 +113,6 @@ export const securityRoutes = new Elysia({ prefix: "/api/security" })
{ params: t.Object({ projectName: t.String() }) }
)
-// POST create audit entry
.post(
"/",
async ({ body, request, headers }) => {
@@ -141,7 +139,6 @@ export const securityRoutes = new Elysia({ prefix: "/api/security" })
}
)
-// PATCH update audit entry
.patch(
"/:id",
async ({ params, body, request, headers }) => {
@@ -173,7 +170,6 @@ export const securityRoutes = new Elysia({ prefix: "/api/security" })
}
)
-// DELETE audit entry
.delete(
"/:id",
async ({ params, request, headers }) => {
@@ -186,4 +182,324 @@ export const securityRoutes = new Elysia({ prefix: "/api/security" })
return { success: true };
},
{ params: t.Object({ id: t.String() }) }
-);
+)
// ─── Security Checklist ───
.get("/checklist", async ({ request, headers, query }) => {
await requireSessionOrBearer(request, headers);
const conditions = [];
if (query.projectName) conditions.push(eq(securityChecklist.projectName, query.projectName));
if (query.category) conditions.push(eq(securityChecklist.category, query.category));
const items = await db
.select()
.from(securityChecklist)
.where(conditions.length ? and(...conditions) : undefined)
.orderBy(asc(securityChecklist.projectName), asc(securityChecklist.category), asc(securityChecklist.item));
return items;
})
.post(
"/checklist",
async ({ body, request, headers }) => {
await requireSessionOrBearer(request, headers);
const result = await db
.insert(securityChecklist)
.values({
projectName: body.projectName,
category: body.category,
item: body.item,
status: body.status || "not_checked",
notes: body.notes,
})
.returning();
return result[0];
},
{
body: t.Object({
projectName: t.String(),
category: t.String(),
item: t.String(),
status: t.Optional(t.String()),
notes: t.Optional(t.String()),
}),
}
)
.post(
"/checklist/bulk",
async ({ body, request, headers }) => {
await requireSessionOrBearer(request, headers);
if (!body.items || body.items.length === 0) return { inserted: 0 };
const values = body.items.map((item: any) => ({
projectName: item.projectName,
category: item.category,
item: item.item,
status: item.status || "not_checked",
notes: item.notes || null,
}));
const result = await db.insert(securityChecklist).values(values).returning();
return { inserted: result.length };
},
{
body: t.Object({
items: t.Array(
t.Object({
projectName: t.String(),
category: t.String(),
item: t.String(),
status: t.Optional(t.String()),
notes: t.Optional(t.String()),
})
),
}),
}
)
.patch(
"/checklist/:id",
async ({ params, body, request, headers }) => {
await requireSessionOrBearer(request, headers);
const updates: Record<string, any> = { updatedAt: new Date() };
if (body.status !== undefined) {
updates.status = body.status;
updates.checkedAt = new Date();
updates.checkedBy = body.checkedBy || "manual";
}
if (body.notes !== undefined) updates.notes = body.notes;
const updated = await db
.update(securityChecklist)
.set(updates)
.where(eq(securityChecklist.id, params.id))
.returning();
if (!updated.length) throw new Error("Not found");
return updated[0];
},
{
params: t.Object({ id: t.String() }),
body: t.Object({
status: t.Optional(t.String()),
notes: t.Optional(t.String()),
checkedBy: t.Optional(t.String()),
}),
}
)
// ─── Score History ───
.get("/score-history", async ({ request, headers, query }) => {
await requireSessionOrBearer(request, headers);
const conditions = [];
if (query.projectName) conditions.push(eq(securityScoreHistory.projectName, query.projectName));
const history = await db
.select()
.from(securityScoreHistory)
.where(conditions.length ? and(...conditions) : undefined)
.orderBy(asc(securityScoreHistory.recordedAt))
.limit(100);
return history;
})
.post(
"/score-history/snapshot",
async ({ request, headers }) => {
await requireSessionOrBearer(request, headers);
// Take a snapshot of current scores for all projects
const all = await db.select().from(securityAudits);
const projectMap: Record<string, { scores: number[]; findings: any[] }> = {};
for (const audit of all) {
if (!projectMap[audit.projectName]) {
projectMap[audit.projectName] = { scores: [], findings: [] };
}
projectMap[audit.projectName].scores.push(audit.score);
projectMap[audit.projectName].findings.push(...(audit.findings || []));
}
const snapshots = [];
for (const [name, data] of Object.entries(projectMap)) {
const avgScore = Math.round(data.scores.reduce((a, b) => a + b, 0) / data.scores.length);
snapshots.push({
projectName: name,
score: avgScore,
totalFindings: data.findings.length,
criticalCount: data.findings.filter((f: any) => f.status === "critical").length,
warningCount: data.findings.filter((f: any) => f.status === "needs_improvement").length,
strongCount: data.findings.filter((f: any) => f.status === "strong").length,
});
}
// Also snapshot "Overall"
const allFindings = all.flatMap(a => a.findings || []);
const allScores = all.map(a => a.score);
if (allScores.length > 0) {
snapshots.push({
projectName: "Overall",
score: Math.round(allScores.reduce((a, b) => a + b, 0) / allScores.length),
totalFindings: allFindings.length,
criticalCount: allFindings.filter((f: any) => f.status === "critical").length,
warningCount: allFindings.filter((f: any) => f.status === "needs_improvement").length,
strongCount: allFindings.filter((f: any) => f.status === "strong").length,
});
}
if (snapshots.length > 0) {
await db.insert(securityScoreHistory).values(snapshots);
}
return { snapshots: snapshots.length };
}
)
// ─── Scan Results ───
.get("/scans", async ({ request, headers, query }) => {
await requireSessionOrBearer(request, headers);
const conditions = [];
if (query.projectName) conditions.push(eq(securityScanResults.projectName, query.projectName));
if (query.scanType) conditions.push(eq(securityScanResults.scanType, query.scanType));
const scans = await db
.select()
.from(securityScanResults)
.where(conditions.length ? and(...conditions) : undefined)
.orderBy(desc(securityScanResults.createdAt))
.limit(50);
return scans;
})
.post(
"/scans",
async ({ body, request, headers }) => {
await requireSessionOrBearer(request, headers);
const result = await db
.insert(securityScanResults)
.values({
projectName: body.projectName,
scanType: body.scanType,
status: body.status || "completed",
findings: body.findings || [],
summary: body.summary || {},
triggeredBy: body.triggeredBy || "manual",
commitSha: body.commitSha,
branch: body.branch,
duration: body.duration,
startedAt: body.startedAt ? new Date(body.startedAt) : null,
completedAt: body.completedAt ? new Date(body.completedAt) : new Date(),
})
.returning();
return result[0];
},
{
body: t.Object({
projectName: t.String(),
scanType: t.String(),
status: t.Optional(t.String()),
findings: t.Optional(t.Array(t.Any())),
summary: t.Optional(t.Record(t.String(), t.Any())),
triggeredBy: t.Optional(t.String()),
commitSha: t.Optional(t.String()),
branch: t.Optional(t.String()),
duration: t.Optional(t.Number()),
startedAt: t.Optional(t.String()),
completedAt: t.Optional(t.String()),
}),
}
)
// ─── Posture Score (computed) ───
.get("/posture", async ({ request, headers }) => {
await requireSessionOrBearer(request, headers);
const audits = await db.select().from(securityAudits);
const checklistItems = await db.select().from(securityChecklist);
const recentScans = await db
.select()
.from(securityScanResults)
.orderBy(desc(securityScanResults.createdAt))
.limit(20);
// Compute per-project posture
const projects: Record<string, any> = {};
for (const audit of audits) {
if (!projects[audit.projectName]) {
projects[audit.projectName] = {
auditScores: [],
findings: [],
checklistPass: 0,
checklistTotal: 0,
checklistFail: 0,
lastScan: null,
};
}
projects[audit.projectName].auditScores.push(audit.score);
projects[audit.projectName].findings.push(...(audit.findings || []));
}
for (const item of checklistItems) {
if (!projects[item.projectName]) {
projects[item.projectName] = {
auditScores: [],
findings: [],
checklistPass: 0,
checklistTotal: 0,
checklistFail: 0,
lastScan: null,
};
}
projects[item.projectName].checklistTotal++;
if (item.status === "pass") projects[item.projectName].checklistPass++;
if (item.status === "fail") projects[item.projectName].checklistFail++;
}
const posture = Object.entries(projects).map(([name, data]: [string, any]) => {
const auditScore = data.auditScores.length
? Math.round(data.auditScores.reduce((a: number, b: number) => a + b, 0) / data.auditScores.length)
: 0;
const checklistScore = data.checklistTotal
? Math.round((data.checklistPass / data.checklistTotal) * 100)
: 0;
// Weighted: 70% audit, 30% checklist
const overallScore = data.checklistTotal
? Math.round(auditScore * 0.7 + checklistScore * 0.3)
: auditScore;
return {
projectName: name,
overallScore,
auditScore,
checklistScore,
totalFindings: data.findings.length,
criticalFindings: data.findings.filter((f: any) => f.status === "critical").length,
warningFindings: data.findings.filter((f: any) => f.status === "needs_improvement").length,
strongFindings: data.findings.filter((f: any) => f.status === "strong").length,
checklistPass: data.checklistPass,
checklistTotal: data.checklistTotal,
checklistFail: data.checklistFail,
};
});
const overallScore = posture.length
? Math.round(posture.reduce((s, p) => s + p.overallScore, 0) / posture.length)
: 0;
return {
overallScore,
projects: posture,
recentScans: recentScans.map(s => ({
id: s.id,
projectName: s.projectName,
scanType: s.scanType,
status: s.status,
findingsCount: (s.findings as any[])?.length || 0,
completedAt: s.completedAt?.toISOString(),
})),
};
});
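The snapshot and posture endpoints above both bucket findings by status (`critical`, `needs_improvement`, `strong`). Extracted as a standalone helper for clarity; the function name is hypothetical, the bucketing logic mirrors the route code:

```typescript
// Hypothetical helper mirroring the status bucketing used by the
// score-history snapshot and posture endpoints above.
interface Finding {
  status: string;
}

function countByStatus(findings: Finding[]) {
  return {
    total: findings.length,
    critical: findings.filter((f) => f.status === "critical").length,
    warning: findings.filter((f) => f.status === "needs_improvement").length,
    strong: findings.filter((f) => f.status === "strong").length,
  };
}

const counts = countByStatus([
  { status: "critical" },
  { status: "strong" },
  { status: "needs_improvement" },
  { status: "strong" },
]);
console.log(counts); // total 4, critical 1, warning 1, strong 2
```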


@@ -0,0 +1,334 @@
import { db } from "./db";
import { securityChecklist } from "./db/schema";
interface ChecklistItem {
projectName: string;
category: string;
item: string;
status: "pass" | "fail" | "partial" | "not_applicable" | "not_checked";
notes: string | null;
}
const items: ChecklistItem[] = [
// ═══════════════════════════════════════════
// HAMMER DASHBOARD
// ═══════════════════════════════════════════
// Auth & Session Management
{ projectName: "Hammer Dashboard", category: "Auth & Session Management", item: "Passwords hashed with bcrypt/argon2/scrypt", status: "pass", notes: "BetterAuth handles password hashing securely" },
{ projectName: "Hammer Dashboard", category: "Auth & Session Management", item: "Session tokens are cryptographically random", status: "pass", notes: "BetterAuth generates secure session tokens" },
{ projectName: "Hammer Dashboard", category: "Auth & Session Management", item: "Session expiry enforced", status: "pass", notes: "Sessions expire per BetterAuth defaults" },
{ projectName: "Hammer Dashboard", category: "Auth & Session Management", item: "Secure cookie attributes (HttpOnly, Secure, SameSite)", status: "pass", notes: "Cookie config: secure=true, sameSite=none, httpOnly=true" },
{ projectName: "Hammer Dashboard", category: "Auth & Session Management", item: "CSRF protection enabled", status: "pass", notes: "disableCSRFCheck: false explicitly set" },
{ projectName: "Hammer Dashboard", category: "Auth & Session Management", item: "MFA / 2FA available", status: "fail", notes: "No MFA support configured" },
{ projectName: "Hammer Dashboard", category: "Auth & Session Management", item: "Password complexity requirements enforced", status: "fail", notes: "No password policy configured in BetterAuth" },
{ projectName: "Hammer Dashboard", category: "Auth & Session Management", item: "Account lockout after failed attempts", status: "not_checked", notes: "BetterAuth may handle this — needs verification" },
{ projectName: "Hammer Dashboard", category: "Auth & Session Management", item: "Registration restricted (invite-only or approval)", status: "fail", notes: "Open signup enabled — emailAndPassword.enabled without disableSignUp" },
{ projectName: "Hammer Dashboard", category: "Auth & Session Management", item: "Session invalidation on password change", status: "pass", notes: "BetterAuth invalidates sessions on credential change" },
// Authorization
{ projectName: "Hammer Dashboard", category: "Authorization", item: "Object-level access control (users can only access own data)", status: "partial", notes: "Task queue is shared by design — no per-user isolation. Admin role exists." },
{ projectName: "Hammer Dashboard", category: "Authorization", item: "Function-level access control (admin vs user)", status: "pass", notes: "requireAdmin() check on admin-only routes" },
{ projectName: "Hammer Dashboard", category: "Authorization", item: "Field-level access control", status: "pass", notes: "Elysia t.Object() schemas restrict accepted fields" },
{ projectName: "Hammer Dashboard", category: "Authorization", item: "API tokens scoped per client/service", status: "fail", notes: "Single static API_BEARER_TOKEN shared across all consumers" },
{ projectName: "Hammer Dashboard", category: "Authorization", item: "Principle of least privilege applied", status: "partial", notes: "Admin/user roles exist but token gives full access" },
// Input Validation
{ projectName: "Hammer Dashboard", category: "Input Validation", item: "All API inputs validated with schemas", status: "partial", notes: "Most routes use Elysia t.Object() — some routes lack validation" },
{ projectName: "Hammer Dashboard", category: "Input Validation", item: "SQL injection prevented (parameterized queries)", status: "pass", notes: "Drizzle ORM handles parameterization" },
{ projectName: "Hammer Dashboard", category: "Input Validation", item: "XSS prevention (output encoding)", status: "pass", notes: "React auto-escapes output. API returns JSON." },
{ projectName: "Hammer Dashboard", category: "Input Validation", item: "Path traversal prevented", status: "not_applicable", notes: "No file system operations in API" },
{ projectName: "Hammer Dashboard", category: "Input Validation", item: "File upload validation", status: "not_applicable", notes: "No file uploads in this app" },
{ projectName: "Hammer Dashboard", category: "Input Validation", item: "Request body size limits", status: "fail", notes: "No body size limits configured" },
// Transport & Data Protection
{ projectName: "Hammer Dashboard", category: "Transport & Data Protection", item: "HTTPS enforced on all endpoints", status: "pass", notes: "Let's Encrypt TLS via Traefik/Dokploy" },
{ projectName: "Hammer Dashboard", category: "Transport & Data Protection", item: "HSTS header set", status: "partial", notes: "May be set by Traefik — needs verification at app level" },
{ projectName: "Hammer Dashboard", category: "Transport & Data Protection", item: "CORS properly restricted", status: "partial", notes: "CORS includes localhost:5173 in production" },
{ projectName: "Hammer Dashboard", category: "Transport & Data Protection", item: "Encryption at rest for sensitive data", status: "fail", notes: "No disk or column-level encryption" },
{ projectName: "Hammer Dashboard", category: "Transport & Data Protection", item: "Database backups encrypted", status: "fail", notes: "No backup strategy exists" },
{ projectName: "Hammer Dashboard", category: "Transport & Data Protection", item: "Secrets stored securely (env vars / vault)", status: "pass", notes: "Env vars via Dokploy environment config" },
// Rate Limiting
{ projectName: "Hammer Dashboard", category: "Rate Limiting & Abuse Prevention", item: "Rate limiting on auth endpoints", status: "fail", notes: "No rate limiting middleware" },
{ projectName: "Hammer Dashboard", category: "Rate Limiting & Abuse Prevention", item: "Rate limiting on API endpoints", status: "fail", notes: "No rate limiting middleware" },
{ projectName: "Hammer Dashboard", category: "Rate Limiting & Abuse Prevention", item: "Bot/CAPTCHA protection on registration", status: "fail", notes: "No CAPTCHA or bot detection" },
{ projectName: "Hammer Dashboard", category: "Rate Limiting & Abuse Prevention", item: "Request throttling for expensive operations", status: "fail", notes: "No throttling configured" },
// Error Handling
{ projectName: "Hammer Dashboard", category: "Error Handling", item: "Generic error messages in production", status: "pass", notes: "Returns 'Internal server error' without stack traces" },
{ projectName: "Hammer Dashboard", category: "Error Handling", item: "No stack traces leaked to clients", status: "pass", notes: "Error handler is generic" },
{ projectName: "Hammer Dashboard", category: "Error Handling", item: "Consistent error response format", status: "pass", notes: "All errors return { error: string }" },
{ projectName: "Hammer Dashboard", category: "Error Handling", item: "Uncaught exception handler", status: "partial", notes: "Elysia onError catches route errors; no process-level handler" },
// Logging & Monitoring
{ projectName: "Hammer Dashboard", category: "Logging & Monitoring", item: "Structured logging (not just console.log)", status: "fail", notes: "Console-only logging" },
{ projectName: "Hammer Dashboard", category: "Logging & Monitoring", item: "Auth events logged (login, logout, failed attempts)", status: "fail", notes: "No auth event logging" },
{ projectName: "Hammer Dashboard", category: "Logging & Monitoring", item: "Data access audit trail", status: "fail", notes: "No audit logging" },
{ projectName: "Hammer Dashboard", category: "Logging & Monitoring", item: "Error alerting configured", status: "fail", notes: "No alerting system" },
{ projectName: "Hammer Dashboard", category: "Logging & Monitoring", item: "Uptime monitoring", status: "fail", notes: "No external monitoring" },
{ projectName: "Hammer Dashboard", category: "Logging & Monitoring", item: "Log aggregation / centralized logging", status: "fail", notes: "No log aggregation — stdout only" },
// Infrastructure
{ projectName: "Hammer Dashboard", category: "Infrastructure", item: "Container isolation (separate containers per service)", status: "pass", notes: "Docker compose with separate backend + db containers" },
{ projectName: "Hammer Dashboard", category: "Infrastructure", item: "Minimal base images", status: "partial", notes: "Uses oven/bun — not minimal but purpose-built" },
{ projectName: "Hammer Dashboard", category: "Infrastructure", item: "No root user in containers", status: "not_checked", notes: "Need to verify Dockerfile USER directive" },
{ projectName: "Hammer Dashboard", category: "Infrastructure", item: "Docker health checks defined", status: "fail", notes: "No HEALTHCHECK in Dockerfile" },
{ projectName: "Hammer Dashboard", category: "Infrastructure", item: "Secrets not baked into images", status: "pass", notes: "Secrets via env vars at runtime" },
{ projectName: "Hammer Dashboard", category: "Infrastructure", item: "Automated deployment (CI/CD)", status: "pass", notes: "Gitea Actions + Dokploy deploy" },
// Security Headers
{ projectName: "Hammer Dashboard", category: "Security Headers", item: "Content-Security-Policy (CSP)", status: "fail", notes: "No CSP header set at application level" },
{ projectName: "Hammer Dashboard", category: "Security Headers", item: "X-Content-Type-Options: nosniff", status: "not_checked", notes: "May be set by Traefik — needs verification" },
{ projectName: "Hammer Dashboard", category: "Security Headers", item: "X-Frame-Options: DENY", status: "not_checked", notes: "May be set by Traefik — needs verification" },
{ projectName: "Hammer Dashboard", category: "Security Headers", item: "X-XSS-Protection", status: "not_checked", notes: "Deprecated but worth checking" },
{ projectName: "Hammer Dashboard", category: "Security Headers", item: "Referrer-Policy", status: "not_checked", notes: "Not configured at app level" },
{ projectName: "Hammer Dashboard", category: "Security Headers", item: "Permissions-Policy", status: "fail", notes: "Not configured" },
// ═══════════════════════════════════════════
// NETWORK APP
// ═══════════════════════════════════════════
// Auth & Session Management
{ projectName: "Network App", category: "Auth & Session Management", item: "Passwords hashed with bcrypt/argon2/scrypt", status: "pass", notes: "BetterAuth handles hashing" },
{ projectName: "Network App", category: "Auth & Session Management", item: "Session tokens are cryptographically random", status: "pass", notes: "BetterAuth secure tokens" },
{ projectName: "Network App", category: "Auth & Session Management", item: "Session expiry enforced", status: "pass", notes: "7-day expiry with daily refresh" },
{ projectName: "Network App", category: "Auth & Session Management", item: "Secure cookie attributes", status: "pass", notes: "secure=true, sameSite=none, httpOnly, cross-subdomain scoped" },
{ projectName: "Network App", category: "Auth & Session Management", item: "CSRF protection enabled", status: "pass", notes: "BetterAuth CSRF enabled" },
{ projectName: "Network App", category: "Auth & Session Management", item: "MFA / 2FA available", status: "fail", notes: "No MFA support" },
{ projectName: "Network App", category: "Auth & Session Management", item: "Registration restricted (invite-only)", status: "pass", notes: "disableSignUp: true + 403 on signup endpoint" },
{ projectName: "Network App", category: "Auth & Session Management", item: "Bearer token support for mobile", status: "pass", notes: "BetterAuth bearer plugin enabled" },
{ projectName: "Network App", category: "Auth & Session Management", item: "Password complexity requirements", status: "fail", notes: "No password policy enforced" },
{ projectName: "Network App", category: "Auth & Session Management", item: "Account lockout after failed attempts", status: "not_checked", notes: "Needs verification" },
// Authorization
{ projectName: "Network App", category: "Authorization", item: "Object-level access control (user-scoped queries)", status: "pass", notes: "All queries use eq(clients.userId, user.id)" },
{ projectName: "Network App", category: "Authorization", item: "Function-level access control (admin vs user)", status: "pass", notes: "Admin routes check user.role === 'admin'" },
{ projectName: "Network App", category: "Authorization", item: "Centralized auth middleware", status: "pass", notes: "authMiddleware Elysia plugin with 'as: scoped'" },
{ projectName: "Network App", category: "Authorization", item: "Field-level input validation", status: "partial", notes: "Most fields validated — 'role' field accepts arbitrary strings" },
// Input Validation
{ projectName: "Network App", category: "Input Validation", item: "All API inputs validated", status: "pass", notes: "34+ route files use Elysia t.Object() schemas" },
{ projectName: "Network App", category: "Input Validation", item: "SQL injection prevented", status: "pass", notes: "Drizzle ORM parameterized queries" },
{ projectName: "Network App", category: "Input Validation", item: "XSS prevention", status: "pass", notes: "React auto-escapes; API returns JSON" },
{ projectName: "Network App", category: "Input Validation", item: "File upload validation", status: "partial", notes: "Document uploads exist — need to verify size/type checks" },
{ projectName: "Network App", category: "Input Validation", item: "Request body size limits", status: "not_checked", notes: "Needs verification" },
// Transport & Data Protection
{ projectName: "Network App", category: "Transport & Data Protection", item: "HTTPS enforced", status: "pass", notes: "Let's Encrypt TLS" },
{ projectName: "Network App", category: "Transport & Data Protection", item: "CORS properly restricted", status: "partial", notes: "Falls back to localhost:3000 if env not set" },
{ projectName: "Network App", category: "Transport & Data Protection", item: "PII encryption at rest", status: "fail", notes: "Contact data (names, emails, phones) stored as plain text" },
{ projectName: "Network App", category: "Transport & Data Protection", item: "Secrets stored securely", status: "pass", notes: "Env vars via Dokploy" },
{ projectName: "Network App", category: "Transport & Data Protection", item: "API key rotation for external services", status: "fail", notes: "Resend API key not rotated" },
// Rate Limiting
{ projectName: "Network App", category: "Rate Limiting & Abuse Prevention", item: "Rate limiting on auth endpoints", status: "pass", notes: "5 req/min per IP on auth" },
{ projectName: "Network App", category: "Rate Limiting & Abuse Prevention", item: "Rate limiting on API endpoints", status: "pass", notes: "100 req/min global per IP" },
{ projectName: "Network App", category: "Rate Limiting & Abuse Prevention", item: "Rate limiting on AI endpoints", status: "pass", notes: "10 req/min on AI routes" },
{ projectName: "Network App", category: "Rate Limiting & Abuse Prevention", item: "Rate limit headers in responses", status: "pass", notes: "Returns Retry-After on 429" },
// Error Handling
{ projectName: "Network App", category: "Error Handling", item: "Generic error messages in production", status: "fail", notes: "Stack traces included in error responses" },
{ projectName: "Network App", category: "Error Handling", item: "No stack traces leaked", status: "fail", notes: "Error handler sends stack to client" },
{ projectName: "Network App", category: "Error Handling", item: "Consistent error response format", status: "pass", notes: "Standardized error format" },
{ projectName: "Network App", category: "Error Handling", item: "Error boundary in frontend", status: "pass", notes: "ErrorBoundary + ToastContainer implemented" },
// Logging & Monitoring
{ projectName: "Network App", category: "Logging & Monitoring", item: "Audit logging implemented", status: "pass", notes: "audit_logs table tracks all CRUD operations" },
{ projectName: "Network App", category: "Logging & Monitoring", item: "Structured logging", status: "fail", notes: "Console-based logging only" },
{ projectName: "Network App", category: "Logging & Monitoring", item: "Error alerting", status: "fail", notes: "No alerting configured" },
{ projectName: "Network App", category: "Logging & Monitoring", item: "Uptime monitoring", status: "fail", notes: "No external monitoring" },
// Infrastructure
{ projectName: "Network App", category: "Infrastructure", item: "Container isolation", status: "pass", notes: "Separate Docker containers" },
{ projectName: "Network App", category: "Infrastructure", item: "Production Dockerfile with minimal deps", status: "pass", notes: "Multi-stage build, --production flag, NODE_ENV=production" },
{ projectName: "Network App", category: "Infrastructure", item: "Docker health checks", status: "fail", notes: "No HEALTHCHECK in Dockerfile" },
{ projectName: "Network App", category: "Infrastructure", item: "Automated CI/CD", status: "pass", notes: "Gitea Actions + Dokploy" },
// Security Headers
{ projectName: "Network App", category: "Security Headers", item: "Content-Security-Policy (CSP)", status: "fail", notes: "Not configured" },
{ projectName: "Network App", category: "Security Headers", item: "X-Content-Type-Options: nosniff", status: "not_checked", notes: "Needs verification" },
{ projectName: "Network App", category: "Security Headers", item: "X-Frame-Options", status: "not_checked", notes: "Needs verification" },
{ projectName: "Network App", category: "Security Headers", item: "Referrer-Policy", status: "not_checked", notes: "Needs verification" },
// ═══════════════════════════════════════════
// TODO APP
// ═══════════════════════════════════════════
// Auth & Session Management
{ projectName: "Todo App", category: "Auth & Session Management", item: "Passwords hashed securely", status: "pass", notes: "BetterAuth handles hashing" },
{ projectName: "Todo App", category: "Auth & Session Management", item: "Session tokens cryptographically random", status: "pass", notes: "BetterAuth secure tokens" },
{ projectName: "Todo App", category: "Auth & Session Management", item: "Session expiry enforced", status: "pass", notes: "BetterAuth defaults" },
{ projectName: "Todo App", category: "Auth & Session Management", item: "Secure cookie attributes", status: "pass", notes: "Configured in BetterAuth" },
{ projectName: "Todo App", category: "Auth & Session Management", item: "MFA / 2FA available", status: "fail", notes: "No MFA" },
{ projectName: "Todo App", category: "Auth & Session Management", item: "Registration restricted (invite-only)", status: "pass", notes: "Invite system with expiring tokens" },
{ projectName: "Todo App", category: "Auth & Session Management", item: "Hammer service auth separated", status: "pass", notes: "Dedicated HAMMER_API_KEY for service account" },
// Authorization
{ projectName: "Todo App", category: "Authorization", item: "Object-level access control", status: "pass", notes: "Tasks filtered by eq(tasks.userId, userId)" },
{ projectName: "Todo App", category: "Authorization", item: "Function-level access control", status: "pass", notes: "Admin role checking on admin routes" },
{ projectName: "Todo App", category: "Authorization", item: "Service account scope limited", status: "partial", notes: "Hammer service has broad access to create/update for any user" },
// Input Validation
{ projectName: "Todo App", category: "Input Validation", item: "API inputs validated with schemas", status: "pass", notes: "Elysia t.Object() type validation on routes" },
{ projectName: "Todo App", category: "Input Validation", item: "SQL injection prevented", status: "pass", notes: "Drizzle ORM" },
{ projectName: "Todo App", category: "Input Validation", item: "XSS prevention", status: "pass", notes: "React + JSON API" },
{ projectName: "Todo App", category: "Input Validation", item: "Webhook URL validation", status: "partial", notes: "Webhook URLs stored by admin — no scheme/host validation" },
// Transport & Data Protection
{ projectName: "Todo App", category: "Transport & Data Protection", item: "HTTPS enforced", status: "pass", notes: "Let's Encrypt TLS" },
{ projectName: "Todo App", category: "Transport & Data Protection", item: "CORS properly restricted", status: "partial", notes: "Falls back to localhost:5173 if env not set" },
{ projectName: "Todo App", category: "Transport & Data Protection", item: "Database backups", status: "fail", notes: "No backup strategy" },
// Rate Limiting
{ projectName: "Todo App", category: "Rate Limiting & Abuse Prevention", item: "Rate limiting on auth endpoints", status: "fail", notes: "No rate limiting" },
{ projectName: "Todo App", category: "Rate Limiting & Abuse Prevention", item: "Rate limiting on API endpoints", status: "fail", notes: "No rate limiting" },
// Error Handling
{ projectName: "Todo App", category: "Error Handling", item: "Generic error messages in production", status: "pass", notes: "Checks NODE_ENV for stack traces" },
{ projectName: "Todo App", category: "Error Handling", item: "Consistent error format", status: "pass", notes: "Standardized error responses" },
// Logging & Monitoring
{ projectName: "Todo App", category: "Logging & Monitoring", item: "Audit logging", status: "fail", notes: "No audit logging" },
{ projectName: "Todo App", category: "Logging & Monitoring", item: "Structured logging", status: "fail", notes: "Console-only" },
{ projectName: "Todo App", category: "Logging & Monitoring", item: "Error alerting", status: "fail", notes: "No alerting" },
{ projectName: "Todo App", category: "Logging & Monitoring", item: "Uptime monitoring", status: "fail", notes: "No monitoring" },
// Infrastructure
{ projectName: "Todo App", category: "Infrastructure", item: "Container isolation", status: "pass", notes: "Docker compose" },
{ projectName: "Todo App", category: "Infrastructure", item: "Docker health checks", status: "fail", notes: "No HEALTHCHECK" },
{ projectName: "Todo App", category: "Infrastructure", item: "Automated CI/CD", status: "pass", notes: "Gitea Actions + Dokploy" },
// Security Headers
{ projectName: "Todo App", category: "Security Headers", item: "CSP header", status: "fail", notes: "Not configured" },
{ projectName: "Todo App", category: "Security Headers", item: "X-Content-Type-Options", status: "not_checked", notes: "Needs verification" },
{ projectName: "Todo App", category: "Security Headers", item: "X-Frame-Options", status: "not_checked", notes: "Needs verification" },
// ═══════════════════════════════════════════
// NKODE
// ═══════════════════════════════════════════
// Auth & Session Management
{ projectName: "nKode", category: "Auth & Session Management", item: "OPAQUE protocol (zero-knowledge password)", status: "pass", notes: "Server never sees plaintext passwords — state-of-the-art" },
{ projectName: "nKode", category: "Auth & Session Management", item: "Argon2 password hashing in OPAQUE", status: "pass", notes: "Configured via opaque-ke features" },
{ projectName: "nKode", category: "Auth & Session Management", item: "OIDC token-based sessions", status: "pass", notes: "Full OIDC implementation with JWK signing" },
{ projectName: "nKode", category: "Auth & Session Management", item: "MFA / 2FA available", status: "fail", notes: "No second factor — OPAQUE is single-factor" },
{ projectName: "nKode", category: "Auth & Session Management", item: "Cryptographic session signatures", status: "pass", notes: "HEADER_SIGNATURE + HEADER_TIMESTAMP verification" },
// Authorization
{ projectName: "nKode", category: "Authorization", item: "Token-based authorization", status: "pass", notes: "OIDC JWT tokens for API auth" },
{ projectName: "nKode", category: "Authorization", item: "Auth extractors for route protection", status: "pass", notes: "extractors.rs provides consistent auth extraction" },
{ projectName: "nKode", category: "Authorization", item: "Role-based access control", status: "fail", notes: "No visible RBAC — all authenticated users have equal access" },
// Input Validation
{ projectName: "nKode", category: "Input Validation", item: "Type-safe deserialization (serde)", status: "pass", notes: "Rust serde enforces strict type contracts" },
{ projectName: "nKode", category: "Input Validation", item: "Memory safety (Rust)", status: "pass", notes: "Eliminates buffer overflows, use-after-free, data races" },
{ projectName: "nKode", category: "Input Validation", item: "SQL injection prevented", status: "pass", notes: "SQLx with parameterized queries" },
// Transport & Data Protection
{ projectName: "nKode", category: "Transport & Data Protection", item: "HTTPS enforced", status: "pass", notes: "Let's Encrypt TLS" },
{ projectName: "nKode", category: "Transport & Data Protection", item: "OPAQUE prevents password exposure", status: "pass", notes: "DB breach doesn't expose passwords" },
{ projectName: "nKode", category: "Transport & Data Protection", item: "Login data encryption at rest", status: "fail", notes: "Stored login data not encrypted at application level" },
{ projectName: "nKode", category: "Transport & Data Protection", item: "CORS properly restricted", status: "fail", notes: "Hardcoded localhost origins in production code" },
// Rate Limiting
{ projectName: "nKode", category: "Rate Limiting & Abuse Prevention", item: "Rate limiting on auth endpoints", status: "fail", notes: "No tower-governor or rate limiting middleware" },
{ projectName: "nKode", category: "Rate Limiting & Abuse Prevention", item: "Rate limiting on API endpoints", status: "fail", notes: "No rate limiting" },
{ projectName: "nKode", category: "Rate Limiting & Abuse Prevention", item: "Argon2 DoS protection", status: "fail", notes: "Expensive OPAQUE/Argon2 flows could be abused for resource exhaustion" },
// Error Handling
{ projectName: "nKode", category: "Error Handling", item: "Proper Axum error types", status: "pass", notes: "Uses Axum error handling properly" },
{ projectName: "nKode", category: "Error Handling", item: "No stack traces leaked", status: "pass", notes: "Rust error handling is explicit" },
// Logging & Monitoring
{ projectName: "nKode", category: "Logging & Monitoring", item: "Structured logging (tracing crate)", status: "pass", notes: "Uses Rust tracing ecosystem" },
{ projectName: "nKode", category: "Logging & Monitoring", item: "Log aggregation", status: "fail", notes: "Logs to stdout only" },
{ projectName: "nKode", category: "Logging & Monitoring", item: "Error alerting", status: "fail", notes: "No alerting" },
{ projectName: "nKode", category: "Logging & Monitoring", item: "Uptime monitoring", status: "fail", notes: "No monitoring" },
// Infrastructure
{ projectName: "nKode", category: "Infrastructure", item: "Container isolation", status: "pass", notes: "Docker on Dokploy" },
{ projectName: "nKode", category: "Infrastructure", item: "Minimal base image (Rust binary)", status: "pass", notes: "Small attack surface" },
{ projectName: "nKode", category: "Infrastructure", item: "Docker health checks", status: "fail", notes: "No HEALTHCHECK" },
// Security Headers
{ projectName: "nKode", category: "Security Headers", item: "CSP header", status: "fail", notes: "Not configured" },
{ projectName: "nKode", category: "Security Headers", item: "X-Content-Type-Options", status: "not_checked", notes: "Needs verification" },
{ projectName: "nKode", category: "Security Headers", item: "X-Frame-Options", status: "not_checked", notes: "Needs verification" },
// ═══════════════════════════════════════════
// INFRASTRUCTURE
// ═══════════════════════════════════════════
// Auth & Session Management
{ projectName: "Infrastructure", category: "Auth & Session Management", item: "SSH key authentication", status: "pass", notes: "VPS supports SSH key auth" },
{ projectName: "Infrastructure", category: "Auth & Session Management", item: "SSH password auth disabled", status: "not_checked", notes: "Needs audit on both VPS" },
{ projectName: "Infrastructure", category: "Auth & Session Management", item: "Gitea auth properly configured", status: "pass", notes: "Self-hosted with authenticated access" },
{ projectName: "Infrastructure", category: "Auth & Session Management", item: "Git credentials not in URLs", status: "fail", notes: "Credentials embedded in remote URLs" },
// Transport & Data Protection
{ projectName: "Infrastructure", category: "Transport & Data Protection", item: "TLS on all public endpoints", status: "pass", notes: "All 7+ domains have valid Let's Encrypt certs" },
{ projectName: "Infrastructure", category: "Transport & Data Protection", item: "DNSSEC enabled", status: "fail", notes: "No DNSSEC on donovankelly.xyz" },
{ projectName: "Infrastructure", category: "Transport & Data Protection", item: "Centralized backup strategy", status: "fail", notes: "No unified backup across services" },
{ projectName: "Infrastructure", category: "Transport & Data Protection", item: "Secrets rotation policy", status: "fail", notes: "No rotation schedule for tokens/passwords" },
// Infrastructure
{ projectName: "Infrastructure", category: "Infrastructure", item: "Firewall rules documented and audited", status: "fail", notes: "No documentation of iptables/ufw rules" },
{ projectName: "Infrastructure", category: "Infrastructure", item: "Exposed ports audited", status: "fail", notes: "No port scan audit performed" },
{ projectName: "Infrastructure", category: "Infrastructure", item: "SSH on non-default port", status: "not_checked", notes: "Needs verification" },
{ projectName: "Infrastructure", category: "Infrastructure", item: "Fail2ban installed and configured", status: "fail", notes: "No IDS/IPS verified" },
{ projectName: "Infrastructure", category: "Infrastructure", item: "Unattended security updates enabled", status: "not_checked", notes: "Needs verification on both VPS" },
{ projectName: "Infrastructure", category: "Infrastructure", item: "Container vulnerability scanning", status: "fail", notes: "No Trivy or similar scanning" },
// Logging & Monitoring
{ projectName: "Infrastructure", category: "Logging & Monitoring", item: "Centralized log aggregation", status: "fail", notes: "Each container logs independently to stdout" },
{ projectName: "Infrastructure", category: "Logging & Monitoring", item: "Uptime monitoring for all domains", status: "fail", notes: "No UptimeRobot or similar" },
{ projectName: "Infrastructure", category: "Logging & Monitoring", item: "Intrusion detection system", status: "fail", notes: "No IDS on either VPS" },
{ projectName: "Infrastructure", category: "Logging & Monitoring", item: "System log monitoring", status: "fail", notes: "No syslog analysis" },
// Security Headers (Traefik/reverse proxy level)
{ projectName: "Infrastructure", category: "Security Headers", item: "HSTS on all domains", status: "not_checked", notes: "Needs verification at Traefik level" },
{ projectName: "Infrastructure", category: "Security Headers", item: "Security headers middleware in Traefik", status: "not_checked", notes: "Needs verification" },
];
async function seedChecklist() {
console.log("📋 Seeding security checklist data...");
// Clear existing
await db.delete(securityChecklist);
console.log(" Cleared existing checklist data");
// Bulk insert
const values = items.map(i => ({
projectName: i.projectName,
category: i.category,
item: i.item,
status: i.status,
notes: i.notes,
}));
// Insert in batches of 50
for (let i = 0; i < values.length; i += 50) {
const batch = values.slice(i, i + 50);
await db.insert(securityChecklist).values(batch);
}
console.log(` ✅ Inserted ${items.length} checklist items`);
// Summary
const projects = [...new Set(items.map(i => i.projectName))];
for (const project of projects) {
const projectItems = items.filter(i => i.projectName === project);
const pass = projectItems.filter(i => i.status === "pass").length;
const fail = projectItems.filter(i => i.status === "fail").length;
const partial = projectItems.filter(i => i.status === "partial").length;
console.log(` ${project}: ${pass} pass, ${fail} fail, ${partial} partial, ${projectItems.length} total`);
}
process.exit(0);
}
seedChecklist().catch((err) => {
console.error("Failed to seed checklist:", err);
process.exit(1);
});
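The dashboard's "overall security posture score with weighted calculation" mentioned in the commit message is computed elsewhere, not in this seed file. As a minimal sketch of one plausible weighting over the seeded statuses — the weight values and the `postureScore` name are assumptions, not the dashboard's actual implementation:

```typescript
// Hypothetical posture score: maps each checklist status to a weight and
// averages across all items, scaled to 0-100. Weights here are illustrative.
type ChecklistStatus = "pass" | "partial" | "fail" | "not_checked";

const STATUS_WEIGHT: Record<ChecklistStatus, number> = {
  pass: 1,
  partial: 0.5,
  fail: 0,
  not_checked: 0, // unverified items count against the score until audited
};

function postureScore(statuses: ChecklistStatus[]): number {
  if (statuses.length === 0) return 0;
  const total = statuses.reduce((sum, s) => sum + STATUS_WEIGHT[s], 0);
  return Math.round((total / statuses.length) * 100);
}
```

Under this scheme, a project with two passes, one partial, and one fail scores 63; treating `not_checked` as zero keeps the score honest until each item is verified.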