Compare commits

...

93 Commits

Author SHA1 Message Date
m
63b43ff7c8 feat: Patentprozesskostenrechner — frontend UI at /kosten/rechner 2026-03-31 17:44:59 +02:00
f43f6e3eea feat: Patentprozesskostenrechner — backend fee engine + API 2026-03-31 17:44:44 +02:00
850f3a62c8 feat: add Patentprozesskostenrechner fee calculation engine + API
Pure Go implementation of patent litigation cost calculator with:
- Step-based GKG/RVG fee accumulator across 4 historical schedules (2005/2013/2021/2025 + Aktuell alias)
- Instance multiplier tables for 8 court types (LG, OLG, BGH NZB/Rev, BPatG, BGH Null, DPMA, BPatG Canc)
- Full attorney fee calculation (VG, TG, Erhöhungsgebühr Nr. 1008 VV RVG, Auslagenpauschale)
- Prozesskostensicherheit computation
- UPC fee data (pre-2026 and 2026 schedules with value-based brackets, recoverable costs ceilings)
- Public API: POST /api/fees/calculate, GET /api/fees/schedules (no auth required)
- 22 unit tests covering all calculation paths

Fixes 3 Excel bugs:
- Bug 1: Prozesskostensicherheit VAT formula (subtract → add)
- Bug 2: Security for costs uses GKG base for court fee, not RVG
- Bug 3: Expert fees included in BPatG instance total
2026-03-31 17:43:17 +02:00
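The "step-based GKG/RVG fee accumulator" named above can be sketched as follows. This is a minimal illustration with placeholder bracket values; the real GKG/RVG figures, the engine's actual types, and the schedule-version handling are not shown in this log.

```go
package main

import "fmt"

// feeStep models one bracket of a step-based fee schedule: above From,
// the fee grows by Inc per started Per euros of dispute value.
// The values used below are placeholders, NOT the real GKG/RVG figures.
type feeStep struct {
	From int64   // bracket lower bound (EUR)
	Per  int64   // step width (EUR)
	Inc  float64 // fee increment per started step
}

// baseFee accumulates the base court fee for a dispute value by walking
// the brackets in order and counting started ("angefangene") steps.
func baseFee(value int64, start float64, steps []feeStep) float64 {
	fee := start
	for i, s := range steps {
		if value <= s.From {
			break
		}
		upper := value
		if i+1 < len(steps) && steps[i+1].From < value {
			upper = steps[i+1].From
		}
		span := upper - s.From
		started := (span + s.Per - 1) / s.Per // ceiling division
		fee += float64(started) * s.Inc
	}
	return fee
}

func main() {
	// Placeholder schedule: 38 EUR base up to 500 EUR, then +20 per
	// started 500 EUR up to 2000 EUR, then +21 per started 1000 EUR.
	steps := []feeStep{{From: 500, Per: 500, Inc: 20}, {From: 2000, Per: 1000, Inc: 21}}
	fmt.Println(baseFee(400, 38, steps))  // 38
	fmt.Println(baseFee(1200, 38, steps)) // 38 + 2*20 = 78
	fmt.Println(baseFee(2500, 38, steps)) // 38 + 3*20 + 1*21 = 119
}
```

The instance multipliers (court fee factor, VG/TG attorney fees) would then scale this base fee per court type.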
08399bbb0a feat: add Patentprozesskostenrechner at /kosten/rechner
Full patent litigation cost calculator supporting:
- DE courts: LG, OLG, BGH (NZB/Revision), BPatG, BGH nullity
- UPC: first instance + appeal with SME reduction
- All 5 GKG/RVG fee schedule versions (2005-2025)
- Per-instance config: attorneys, patent attorneys, hearing, clients
- Live cost breakdown with per-instance detail cards
- DE vs UPC comparison bar chart
- Streitwert slider with presets (500 - 30M EUR)
- German labels, EUR formatting, responsive layout

New files:
- lib/costs/types.ts, fee-tables.ts, calculator.ts (pure calculation)
- components/costs/CostCalculator, InstanceCard, UPCCard, CostSummary, CostComparison
- app/(app)/kosten/rechner/page.tsx

Sidebar: added "Kostenrechner" with Calculator icon between Berichte and AI Analyse.
Types: added FeeCalculateRequest/Response to lib/types.ts.
2026-03-31 17:42:11 +02:00
d4092acc33 docs: Patentprozesskostenrechner implementation plan 2026-03-31 17:31:37 +02:00
7c70649494 docs: add Patentprozesskostenrechner implementation plan
Comprehensive analysis of the Excel-based patent litigation cost calculator
with implementation plan for the web version:

- Fee calculation logic (GKG/RVG step-based accumulator, all multipliers)
- Exact fee schedule data for all 5 versions (extracted from Excel)
- UPC fee structure research (fixed fees, value-based brackets, recoverable costs)
- Architecture: new page at /kosten/rechner within KanzlAI-mGMT (pure frontend)
- Complete input/output specifications
- 3 bugs to fix from the Excel (VAT formula, wrong fee type, missing expert fees)
- Side-by-side DE vs UPC cost comparison data
2026-03-31 17:28:39 +02:00
3599e302df feat: redesign Fristenrechner — cards, no tabs, inline calculation 2026-03-30 21:00:14 +02:00
899b461833 feat: redesign Fristenrechner as single-flow card-based UI
Replace the Schnell/Wizard tab layout with a unified flow:
1. Proceeding type selection via compact clickable cards grouped
   by jurisdiction + category (UPC Hauptverfahren, im Verfahren,
   Rechtsbehelfe, Deutsche Patentverfahren)
2. Vertical deadline rule list for the selected type showing name,
   duration, rule code, and acting party
3. Inline expansion on click with date picker, auto-calculated due
   date (via selected_rule_ids API), holiday/weekend adjustment
   note, and save-to-case option

Old DeadlineCalculator.tsx and DeadlineWizard.tsx are no longer
imported but kept for reference.
2026-03-30 20:55:46 +02:00
260f65ea02 feat: auto-calculate deadlines on proceeding type selection (no click needed) 2026-03-30 19:41:02 +02:00
501b573967 fix: use typed category field instead of Record cast 2026-03-30 19:37:52 +02:00
23b8ef4bba chore: gitignore server binary and local state files 2026-03-30 19:34:34 +02:00
54c6eb8dae feat: 15 UPC proceeding types in 3 groups + category field
Added 10 new UPC types: DNI, EPO, AMD, CCI, EVP, DAM, COS, REH, DEF, RST.
Grouped as: Hauptverfahren / Verfahren im Verfahren / Rechtsbehelfe.
Frontend dropdown shows sub-groups within jurisdiction. German names throughout.
2026-03-30 19:34:07 +02:00
967f2f6d09 feat: direct SMTP email sending via Hostinger (replaces m CLI) 2026-03-30 17:28:40 +02:00
e5387734aa fix: use mgmt@msbls.de as default MAIL_FROM (alias now exists) 2026-03-30 17:28:11 +02:00
6cb87c6868 feat: replace m CLI email with direct SMTP over TLS
The m CLI isn't available in Docker containers. Replace exec.Command("m", "mail", "send")
with direct SMTP using crypto/tls + net/smtp (implicit TLS on port 465).

Env vars: SMTP_HOST, SMTP_PORT, SMTP_USER, SMTP_PASS, MAIL_FROM
Gracefully skips sending if SMTP is not configured.

Note: mgmt@msbls.de rejected by Hostinger as not owned by mail@msbls.de.
Default from address set to mail@msbls.de until alias is created.
2026-03-30 17:23:54 +02:00
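The switch from the m CLI to implicit TLS on port 465 can be sketched like this: net/smtp's SendMail only speaks STARTTLS, so the client is assembled by hand on top of a tls.Conn. The buildMessage helper, host, and credentials here are illustrative, not the app's actual code.

```go
package main

import (
	"crypto/tls"
	"fmt"
	"net/smtp"
	"strings"
)

// buildMessage assembles a minimal RFC 5322 mail with CRLF line endings.
func buildMessage(from, to, subject, body string) []byte {
	var b strings.Builder
	fmt.Fprintf(&b, "From: %s\r\nTo: %s\r\nSubject: %s\r\n\r\n%s\r\n", from, to, subject, body)
	return []byte(b.String())
}

// sendMail speaks SMTP over implicit TLS: dial a TLS connection first,
// then drive the SMTP conversation manually (AUTH, MAIL, RCPT, DATA).
func sendMail(host, port, user, pass, from, to string, msg []byte) error {
	conn, err := tls.Dial("tcp", host+":"+port, &tls.Config{ServerName: host})
	if err != nil {
		return err
	}
	c, err := smtp.NewClient(conn, host)
	if err != nil {
		return err
	}
	defer c.Quit()
	if err := c.Auth(smtp.PlainAuth("", user, pass, host)); err != nil {
		return err
	}
	if err := c.Mail(from); err != nil {
		return err
	}
	if err := c.Rcpt(to); err != nil {
		return err
	}
	w, err := c.Data()
	if err != nil {
		return err
	}
	if _, err := w.Write(msg); err != nil {
		return err
	}
	return w.Close()
}

func main() {
	// Demonstrate only the message framing; sending needs a live server.
	fmt.Printf("%s", buildMessage("mail@msbls.de", "to@example.com", "Fristerinnerung", "Test"))
}
```

"Gracefully skips sending if SMTP is not configured" would simply mean returning early when SMTP_HOST is empty.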
d38719db2f fix: add email field to UserTenant TypeScript type 2026-03-30 17:19:15 +02:00
b21efccfb5 fix: add MAIL_FROM env (default mgmt@msbls.de) + graceful fallback when m CLI unavailable 2026-03-30 17:10:25 +02:00
f51d189a3b fix: show member email instead of UUID in team management 2026-03-30 17:09:14 +02:00
481b299e03 test: comprehensive integration tests for all API endpoints 2026-03-30 14:43:32 +02:00
68d48100b9 test: comprehensive integration tests for all API endpoints
Replace the existing integration test suite with a complete suite covering
every registered API route. Tests use httptest with the real router and
a real DB connection (youpc.org mgmt schema).

Endpoint groups tested:
- Health, Auth (JWT validation, expired/invalid/wrong-secret)
- Current user (GET /api/me)
- Tenants (CRUD, auto-assign)
- Cases (CRUD with search/status filters)
- Parties (CRUD)
- Deadlines (CRUD, complete, batch create)
- Appointments (CRUD)
- Notes (CRUD)
- Dashboard
- Proceeding types & deadline rules
- Deadline calculator & determination (timeline, determine)
- Reports (cases, deadlines, workload, billing)
- Templates (CRUD, render)
- Time entries (CRUD, summary)
- Invoices (CRUD, status update)
- Billing rates (list, upsert)
- Notifications (list, unread count, mark read, preferences)
- Audit log (list, filtered)
- Case assignments (assign, unassign)
- Documents (list, meta)
- AI endpoints (availability check)
- Critical path E2E (case -> deadline -> appointment -> note -> time entry -> dashboard -> complete)
2026-03-30 14:41:59 +02:00
40a11a4c49 feat: group proceeding types by jurisdiction (UPC/DE) + add German patent proceedings 2026-03-30 14:33:28 +02:00
eca0cde5e7 fix: timeline 404 + calculate endpoint fixes 2026-03-30 14:32:51 +02:00
cf3711b2e4 fix: update seed files to use mgmt schema after migration
The search_path was changed from kanzlai to mgmt but seed files
still referenced the old schema. Also added missing is_spawn and
spawn_label columns to mgmt.deadline_rules via direct DB migration.

Root cause of timeline 404 / calculate+determine 400: the ruleColumns
query selected is_spawn and spawn_label which didn't exist in the
mgmt.deadline_rules table, causing all deadline rule queries to fail.
2026-03-30 14:30:40 +02:00
dea49f6f8e feat: group proceeding types by jurisdiction in UI dropdowns
- DeadlineCalculator: use optgroup to group by UPC/DE
- DeadlineWizard: add section headers for each jurisdiction
- CaseForm: replace hardcoded TYPE_OPTIONS with API-fetched
  proceeding types grouped by jurisdiction
- Added 3 new DE proceeding types to DB: DE_PATENT,
  DE_NULLITY, DE_OPPOSITION
2026-03-30 14:29:42 +02:00
5e401d2eac fix: default deadline calculator date to today 2026-03-30 14:21:08 +02:00
3f90904e0c fix: update search_path from kanzlai to mgmt after migration 2026-03-30 14:18:35 +02:00
f285d4451d refactor: switch to youpc.org Supabase, remove separate YouPCDatabaseURL 2026-03-30 14:09:52 +02:00
bf1b1cdd82 refactor: remove YouPCDatabaseURL, use same DB connection for case finder
Now that KanzlAI is on the youpc.org Supabase instance, the separate
YouPCDatabaseURL connection is unnecessary. The main database connection
can query mlex.* tables directly since they're on the same Postgres.

- Remove YouPCDatabaseURL from config
- Remove separate sqlx.Connect block in main.go
- Pass main database handle as youpcDB parameter to router
- Update CLAUDE.md: mgmt schema in youpc.org (was kanzlai in flexsiebels)
2026-03-30 14:01:19 +02:00
9d89b97ad5 fix: open reports endpoints to all roles, only billing restricted 2026-03-30 13:44:04 +02:00
2f572fafc9 fix: wire all missing routes (reports, time entries, invoices, templates, billing) 2026-03-30 13:14:18 +02:00
d76ffec758 fix: wire all missing routes in router.go
Register routes for reports, time entries, invoices, billing rates,
and document templates. All handlers and services already existed but
were not connected in the router.

Permission mapping:
- Reports, invoices, billing rates: PermManageBilling (partners+owners)
- Templates create/update/delete: PermCreateCase
- Time entries, template read/render: all authenticated users
2026-03-30 13:11:17 +02:00
4b0ccac384 fix: auto-strip /api/ prefix in api client + document convention
The api client now calls normalizePath() to strip accidental /api/
prefixes. This prevents the recurring /api/api/ double-prefix bug.
Added convention note to .claude/CLAUDE.md so future workers know.
2026-03-30 13:05:02 +02:00
3030ef1e8b fix: add all missing type exports (TimeEntry, Invoice, reports, notifications, audit) 2026-03-30 11:52:10 +02:00
2578060638 fix: add missing TEMPLATE_CATEGORY_LABELS export to types.ts 2026-03-30 11:43:36 +02:00
8f91feee0e feat: UPC deadline determination — event-driven proceeding timeline wizard 2026-03-30 11:38:08 +02:00
a89ef26ebd feat: UPC deadline determination — event-driven model with proceeding timeline
Full event-driven deadline determination system ported from youpc.org:

Backend:
- DetermineService: walks proceeding event tree, calculates cascading
  dates with holiday adjustment and conditional logic
- GET /api/proceeding-types/{code}/timeline — full event tree structure
- POST /api/deadlines/determine — calculate timeline with conditions
- POST /api/cases/{caseID}/deadlines/batch — batch-create deadlines
- DeadlineRule model: added is_spawn, spawn_label fields
- GetFullTimeline: recursive CTE following cross-type spawn branches
- Conditional deadlines: condition_rule_id toggles alt_duration/rule_code
  (e.g. Reply changes from RoP.029b to RoP.029a when CCR is filed)
- Seed SQL with full UPC event trees (INF, REV, CCR, APM, APP, AMD)

Frontend:
- DeadlineWizard: interactive proceeding timeline with step-by-step flow
  1. Select proceeding type (visual cards)
  2. Enter trigger event date
  3. Toggle conditional branches (CCR, Appeal, Amend)
  4. See full calculated timeline with color-coded urgency
  5. Batch-create all deadlines on a selected case
- Visual timeline tree with party icons, rule codes, duration badges
- Kept existing DeadlineCalculator as "Schnell" quick mode

Also resolved merge conflicts across 6 files (auth, router, handlers)
merging role-based permissions + audit trail features.
2026-03-30 11:33:59 +02:00
6b8c6f761d feat: HL tenant + email domain auto-assignment 2026-03-30 11:29:53 +02:00
93a25e3d72 feat: AI features — drafting, strategy, similar cases (P2) 2026-03-30 11:29:41 +02:00
81c2bb29b9 feat: reporting dashboard with charts (P1) 2026-03-30 11:29:35 +02:00
9f18fbab80 feat: document templates with auto-fill (P1) 2026-03-30 11:29:23 +02:00
ae55d9814a feat: time tracking + billing (P1) 2026-03-30 11:29:10 +02:00
642877ae54 feat: document templates with auto-fill from case data (P1)
- Database: kanzlai.document_templates table with RLS policies
- Seed: 4 system templates (Klageerwiderung UPC, Berufungsschrift,
  Mandatsbestätigung, Kostenrechnung)
- Backend: TemplateService (CRUD + render), TemplateHandler with
  endpoints: GET/POST /api/templates, GET/PUT/DELETE /api/templates/{id},
  POST /api/templates/{id}/render?case_id=X
- Template variables: case.*, party.*, tenant.*, user.*, date.*, deadline.*
- Frontend: /vorlagen page with category filters, template detail/editor,
  render flow (select case -> preview -> copy/download), variable toolbar
- Quick action: "Schriftsatz erstellen" button on case detail page
- Also: resolved merge conflicts between audit-trail and role-based branches,
  added missing Notification/AuditLog types to frontend
2026-03-30 11:26:25 +02:00
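The variable substitution behind the render flow (case.*, party.*, date.* and so on) can be sketched as a flat key replacement. The {{...}} placeholder syntax is an assumption; the actual TemplateService may use a different delimiter or a real template engine.

```go
package main

import (
	"fmt"
	"strings"
)

// renderTemplate substitutes {{key}}-style placeholders with values from
// a flat variable map (e.g. "case.number" filled from the case record).
func renderTemplate(tpl string, vars map[string]string) string {
	pairs := make([]string, 0, len(vars)*2)
	for k, v := range vars {
		pairs = append(pairs, "{{"+k+"}}", v)
	}
	return strings.NewReplacer(pairs...).Replace(tpl)
}

func main() {
	out := renderTemplate("Az. {{case.number}}, Mandant: {{party.name}}",
		map[string]string{"case.number": "UPC_CFI_1/2026", "party.name": "Muster GmbH"})
	fmt.Println(out)
}
```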
fdb4ac55a1 feat: frontend AI tab — KI-Strategie, KI-Entwurf, Ähnliche Fälle
New "KI" tab on case detail page with three sub-panels:
- KI-Strategie: one-click strategic analysis with next steps, risks, timeline
- KI-Entwurf: document drafting with template selection, language, instructions
- Ähnliche Fälle: UPC similar case search with relevance scores

Components: CaseStrategy, DocumentDrafter, SimilarCaseFinder
Types: StrategyRecommendation, DocumentDraft, SimilarCase, etc.
2026-03-30 11:26:01 +02:00
dd683281e0 feat: AI-powered features — document drafting, case strategy, similar case finder (P2)
Backend:
- DraftDocument: Claude generates legal documents from case data + template type
  (14 template types: Klageschrift, UPC claims, Abmahnung, etc.)
- CaseStrategy: Opus-powered strategic analysis with next steps, risk assessment,
  and timeline optimization (structured tool output)
- FindSimilarCases: queries youpc.org Supabase for UPC cases, Claude ranks by
  relevance with explanations and key holdings

Endpoints: POST /api/ai/draft-document, /case-strategy, /similar-cases
All rate-limited (5 req/min) and permission-gated (PermAIExtraction).
YouPC database connection is optional (YOUPC_DATABASE_URL env var).
2026-03-30 11:25:52 +02:00
bfd5e354ad fix: resolve merge conflicts from P0 role-based + audit trail branches
Combine role-based permissions (VerifyAccess/GetUserRole) with audit trail
(IP/user-agent context capture) in auth middleware and tenant resolver.
2026-03-30 11:25:41 +02:00
118bae1ae3 feat: HL tenant setup + email domain auto-assignment
- Create pre-configured Hogan Lovells tenant with demo flag and
  auto_assign_domains: ["hoganlovells.com"]
- Add POST /api/tenants/auto-assign endpoint: checks email domain
  against tenant settings, auto-assigns user as associate if match
- Add AutoAssignByDomain to TenantService
- Update registration flow: after signup, check auto-assign before
  showing tenant creation form. Skip tenant creation if auto-assigned.
- Add DemoBanner component shown when tenant.settings.demo is true
- Extend GET /api/me to return is_demo flag from tenant settings
2026-03-30 11:24:52 +02:00
fdef5af32e feat: reporting dashboard — case stats, deadline compliance, workload, billing (P1)
Backend:
- ReportingService with aggregation queries (CTEs, FILTER clauses)
- 4 API endpoints: /api/reports/{cases,deadlines,workload,billing}
- Date range filtering via ?from=&to= query params

Frontend:
- /berichte page with 4 tabs: Akten, Fristen, Auslastung, Abrechnung
- recharts: bar/pie/line charts for all report types
- Date range picker, CSV export, print-friendly view
- Sidebar nav entry with BarChart3 icon

Also resolves merge conflicts between role-based, notification, and
audit trail branches, and adds missing TS types (AuditLogResponse,
Notification, NotificationPreferences).
2026-03-30 11:24:45 +02:00
34dcbb74fe fix: resolve merge conflicts from role-based permissions + audit trail branches
Combines auth context keys (user role, IP, user-agent), tenant resolver
(GetUserRole-based access verification), middleware (deferred tenant
resolution + request info capture), and router (audit log + notifications
+ assignments).
2026-03-30 11:24:43 +02:00
238811727d feat: time tracking + billing — hourly rates, time entries, invoices (P1)
Database: time_entries, billing_rates, invoices tables with RLS.
Backend: CRUD services+handlers for time entries, billing rates, invoices.
  - Time entries: list/create/update/delete, summary by case/user/month
  - Billing rates: upsert with auto-close previous, current rate lookup
  - Invoices: create with auto-number (RE-YYYY-NNN), status transitions
    (draft->sent->paid, cancellation), link time entries on invoice create
API: 11 new endpoints under /api/time-entries, /api/billing-rates, /api/invoices
Frontend: Zeiterfassung tab on case detail, /abrechnung overview with filters,
  /abrechnung/rechnungen list+detail with status actions, billing rates settings
Also: resolved merge conflicts between audit-trail and role-based branches,
  added missing types (Notification, AuditLogResponse, NotificationPreferences)
2026-03-30 11:24:36 +02:00
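The auto-number scheme (RE-YYYY-NNN) and the status transitions can be sketched like this. How the last sequence number is obtained (presumably a per-year, per-tenant MAX()) and the exact set of allowed status moves are assumptions.

```go
package main

import "fmt"

// nextInvoiceNumber renders the RE-YYYY-NNN scheme from the commit;
// lastSeq is the highest sequence already issued for that year.
func nextInvoiceNumber(year, lastSeq int) string {
	return fmt.Sprintf("RE-%d-%03d", year, lastSeq+1)
}

// transitions encodes draft -> sent -> paid plus cancellation;
// the precise set of allowed moves in the app is assumed.
var transitions = map[string]map[string]bool{
	"draft": {"sent": true, "cancelled": true},
	"sent":  {"paid": true, "cancelled": true},
}

func canTransition(from, to string) bool { return transitions[from][to] }

func main() {
	fmt.Println(nextInvoiceNumber(2026, 41))    // RE-2026-042
	fmt.Println(canTransition("draft", "sent")) // true
	fmt.Println(canTransition("draft", "paid")) // false
}
```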
8e65463130 feat: role-based permissions — owner/partner/associate/paralegal/secretary (P0) 2026-03-30 11:09:05 +02:00
a307b29db8 feat: email notifications + deadline reminder system (P0) 2026-03-30 11:08:53 +02:00
5e88384fab feat: append-only audit trail for all mutations (P0) 2026-03-30 11:08:41 +02:00
0a0ec016d8 feat: role-based permissions (owner/partner/associate/paralegal/secretary)
Backend:
- auth/permissions.go: full permission matrix with RequirePermission/RequireRole
  middleware, CanEditCase, CanDeleteDocument helpers
- auth/context.go: add user role to request context
- auth/middleware.go: resolve role alongside tenant in auth flow
- auth/tenant_resolver.go: verify membership + resolve role for X-Tenant-ID
- handlers/case_assignments.go: CRUD for case-level user assignments
- handlers/tenant_handler.go: UpdateMemberRole, GetMe (/api/me) endpoints
- handlers/documents.go: permission-based delete (own vs all)
- router/router.go: permission-wrapped routes for all endpoints
- services/case_assignment_service.go: assign/unassign with tenant validation
- services/tenant_service.go: UpdateMemberRole with owner protection
- models/case_assignment.go: CaseAssignment model

Database:
- user_tenants.role: CHECK constraint (owner/partner/associate/paralegal/secretary)
- case_assignments table: case_id, user_id, role (lead/team/viewer)
- Migrated existing admin->partner, member->associate

Frontend:
- usePermissions hook: fetches /api/me, provides can() helper
- TeamSettings: 5-role dropdown, role change, permission-gated invite
- CaseAssignments: new component for case-level team management
- Sidebar: conditionally hides AI/Settings based on permissions
- Cases page: hides "Neue Akte" button for non-authorized roles
- Case detail: new "Mitarbeiter" tab for assignment management
2026-03-30 11:04:57 +02:00
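The "full permission matrix" in auth/permissions.go presumably boils down to a role-to-permission lookup like the sketch below. The five roles come from the commit; the matrix contents and permission names here are illustrative, not the real mapping.

```go
package main

import "fmt"

type Role string
type Permission string

const (
	RoleOwner     Role = "owner"
	RolePartner   Role = "partner"
	RoleAssociate Role = "associate"
	RoleParalegal Role = "paralegal"
	RoleSecretary Role = "secretary"

	PermCreateCase    Permission = "create_case"
	PermManageBilling Permission = "manage_billing"
)

// matrix: which role holds which permission. Contents are illustrative.
var matrix = map[Role]map[Permission]bool{
	RoleOwner:     {PermCreateCase: true, PermManageBilling: true},
	RolePartner:   {PermCreateCase: true, PermManageBilling: true},
	RoleAssociate: {PermCreateCase: true},
	RoleParalegal: {},
	RoleSecretary: {},
}

// can is the check a RequirePermission-style middleware would run
// after resolving the user's role from the request context.
func can(r Role, p Permission) bool { return matrix[r][p] }

func main() {
	fmt.Println(can(RolePartner, PermManageBilling))   // true
	fmt.Println(can(RoleAssociate, PermManageBilling)) // false
}
```

The frontend usePermissions hook would mirror the same lookup against the role returned by /api/me.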
ac20c03f01 feat: email notifications + deadline reminder system
Database:
- notification_preferences table (user_id, tenant_id, reminder days, email/digest toggles)
- notifications table (type, entity link, read/sent tracking, dedup index)

Backend:
- NotificationService with background goroutine checking reminders hourly
- CheckDeadlineReminders: finds deadlines due in N days per user prefs, creates notifications
- Overdue deadline detection and notification
- Daily digest at 8am: compiles pending notifications into one email
- SendEmail via `m mail send` CLI command
- Deduplication: same notification type + entity + day = skip
- API: GET/PATCH notifications, unread count, mark read/all-read
- API: GET/PUT notification-preferences with upsert

Frontend:
- NotificationBell in header with unread count badge (polls every 30s)
- Dropdown panel with notification list, type-colored dots, time-ago, entity links
- Mark individual/all as read
- NotificationSettings in Einstellungen page: reminder day toggles, email toggle, digest toggle
2026-03-30 11:03:17 +02:00
c324a2b5c7 fix: critical security hardening — tenant isolation, CORS, error masking, input validation 2026-03-30 11:02:52 +02:00
b36247dfb9 feat: append-only audit trail for all mutations (P0)
- Database: kanzlai.audit_log table with RLS, append-only policies
  (no UPDATE/DELETE), indexes for entity, user, and time queries
- Backend: AuditService.Log() with context-based tenant/user/IP/UA
  extraction, wired into all 7 services (case, deadline, appointment,
  document, note, party, tenant)
- API: GET /api/audit-log with entity_type, entity_id, user_id,
  from/to date, and pagination filters
- Frontend: Protokoll tab on case detail page with chronological
  audit entries, diff preview, and pagination

Required by § 50 BRAO and DSGVO Art. 5(2).
2026-03-30 11:02:28 +02:00
c15d5b72f2 fix: critical security hardening — tenant isolation, CORS, error leaking, input validation
1. Tenant isolation bypass (CRITICAL): TenantResolver now verifies user
   has access to X-Tenant-ID via user_tenants lookup before setting context.
   Added VerifyAccess method to TenantLookup interface and TenantService.

2. Consolidated tenant resolution: Removed duplicate resolveTenant() from
   helpers.go and tenant resolution from auth middleware. TenantResolver is
   now the single source of truth. Deadlines and AI handlers use
   auth.TenantFromContext() instead of direct DB queries.

3. CalDAV credential masking: tenant settings responses now mask CalDAV
   passwords with "********" via maskSettingsPassword helper. Applied to
   GetTenant, ListTenants, and UpdateSettings responses.

4. CORS + security headers: New middleware/security.go with CORS
   (restricted to FRONTEND_ORIGIN) and security headers (X-Frame-Options,
   X-Content-Type-Options, HSTS, Referrer-Policy, X-XSS-Protection).

5. Internal error leaking: All writeError(w, 500, err.Error()) replaced
   with internalError() that logs via slog and returns generic "internal
   error" to client. Same for jsonError in tenant handler.

6. Input validation: Max length on title (500), description (10000),
   case_number (100), search (200). Pagination clamped to max 100.
   Content-Disposition filename sanitized against header injection.

Regression test added for tenant access denial (403 on unauthorized
X-Tenant-ID). All existing tests pass, go vet clean.
2026-03-30 11:01:14 +02:00
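Point 6 (input validation and pagination clamping) can be sketched as below. The limits are the ones stated in the commit; the function names are hypothetical.

```go
package main

import "fmt"

// clamp bounds a requested page size: non-positive falls back to the
// default, anything above max is capped (commit caps pagination at 100).
func clamp(limit, def, max int) int {
	if limit <= 0 {
		return def
	}
	if limit > max {
		return max
	}
	return limit
}

// validateLen enforces per-field maximum lengths
// (title 500, description 10000, case_number 100, search 200).
func validateLen(field, value string, max int) error {
	if len(value) > max {
		return fmt.Errorf("%s exceeds %d characters", field, max)
	}
	return nil
}

func main() {
	fmt.Println(clamp(0, 20, 100))               // 20 (default)
	fmt.Println(clamp(500, 20, 100))             // 100 (clamped)
	fmt.Println(validateLen("title", "ok", 500)) // <nil>
}
```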
82878dffd5 docs: full system roadmap — from MVP to complete Kanzleimanagement 2026-03-28 02:35:20 +01:00
ac04930667 feat: comprehensive KanzlAI-mGMT system roadmap
Full system vision document covering 23 features across 4 priority tiers:
- P0 (must-have): audit trail, conflict checks, roles/permissions,
  notifications, time tracking, RVG calculator, invoicing, DATEV export
- P1 (should-have): document templates, beA integration, full-text search,
  Wiedervorlagen, email integration, reporting
- P2 (differentiator): patent family tracking, claim charts, UPC case law
  intelligence via mLex, AI document drafting, AI strategy analysis
- P3 (nice-to-have): client portal, PWA, multi-language, EDA

Includes data model designs (24 new tables), API specifications,
implementation phases, competitive analysis, and risk register.
2026-03-28 02:30:39 +01:00
909f14062c docs: comprehensive MVP audit — security, architecture, UX, competitive analysis 2026-03-28 02:26:39 +01:00
m
4b86dfa4ad feat: update AUDIT.md with sub-agent findings
Added 7 additional issues from deep-dive agents:
- Race condition in HolidayService cache (critical)
- Rate limiter X-Forwarded-For bypass (critical)
- German umlaut typos throughout frontend
- Silent error swallowing in createEvent
- Missing React error boundaries
- No RLS policies at database level
- Updated priority roadmap with new items
2026-03-28 02:23:50 +01:00
60f1f4ef4a feat: comprehensive MVP audit — security, architecture, UX, competitive analysis
Structured assessment covering code quality, security (critical tenant isolation
bypass found), architecture, UX gaps, testing coverage, deployment, and
competitive positioning vs RA-MICRO/ADVOWARE/AnNoText/Actaport.

Includes prioritized roadmap (P0-P3) with actionable items.
2026-03-28 02:22:07 +01:00
7c7ae396f4 feat: Phase D — case detail refactor to URL-based nested routes 2026-03-25 19:32:41 +01:00
433a0408f2 feat: Phase C — detail pages for deadlines, appointments, events, creation forms 2026-03-25 19:32:17 +01:00
cabea83784 feat: Phase B — interactive dashboard, breadcrumbs, clickable navigation 2026-03-25 19:31:59 +01:00
8863878b39 feat: Phase A backend — notes CRUD, detail endpoints, dashboard fix 2026-03-25 19:31:54 +01:00
84b178edbf feat: Phase B — interactive dashboard, breadcrumbs, clickable navigation
- Breadcrumb component: reusable nav with items array (label+href)
- DeadlineTrafficLights: buttons → Links to /fristen?status={filter}
- CaseOverviewGrid: static metrics → clickable Links to /cases?status={filter}
- UpcomingTimeline: items → clickable Links to /fristen/{id} or /termine/{id}
  with case number links and hover chevron
- QuickActions: swap CalDAV Sync for "Neuer Termin" → /termine/neu,
  fix "Frist eintragen" → /fristen/neu
- AISummaryCard: add RefreshCw button with spinning animation
- RecentActivityList: new component showing recent case events
- DeadlineList: accept initialStatus prop, add this_week/ok filters
- fristen/page.tsx: read searchParams.status for initial filter
- Add breadcrumbs to dashboard, fristen, cases, termine pages
- Add RecentActivity type, update DashboardData type
2026-03-25 19:29:13 +01:00
7094212dcf feat: Phase C frontend detail pages for deadlines, appointments, events
- Deadline detail page (/fristen/[id]) with status badge, due date,
  case context, complete button, and notes
- Appointment detail page (/termine/[id]) with datetime, location,
  type badge, case link, description, and notes
- Case event detail page (/cases/[id]/ereignisse/[eventId]) with
  event type icon, description, metadata, and notes
- Standalone deadline creation (/fristen/neu) with case dropdown
- Standalone appointment creation (/termine/neu) with optional case
- Reusable Breadcrumb component for navigation hierarchy
- Reusable NotesList component with inline create/edit/delete
- Added Note and RecentActivity types to lib/types.ts
2026-03-25 19:29:12 +01:00
9787450d91 feat: refactor case detail from useState tabs to URL-based nested routes
Refactors the monolithic cases/[id]/page.tsx into Next.js nested routes
with a shared layout for the case header and tab navigation bar.

Route structure:
- cases/[id]/layout.tsx — case header + tab bar (active tab from URL)
- cases/[id]/page.tsx — redirects to ./verlauf
- cases/[id]/verlauf/page.tsx — timeline tab
- cases/[id]/fristen/page.tsx — deadlines tab
- cases/[id]/dokumente/page.tsx — documents tab (with upload)
- cases/[id]/parteien/page.tsx — parties tab
- cases/[id]/notizen/page.tsx — notes tab (new, uses NotesList)

New shared components:
- Breadcrumb.tsx — reusable breadcrumb navigation
- NotesList.tsx — reusable notes CRUD (inline create/edit/delete)
- Note type added to types.ts

Benefits: deep linking, browser back/forward, bookmarkable tabs.
2026-03-25 19:28:29 +01:00
1e88dffd82 feat: Phase A backend — notes CRUD, detail endpoints, dashboard fix
- Create kanzlai.notes table (polymorphic FK with CHECK constraint,
  partial indexes, RLS)
- Add Note model, NoteService (ListByParent, Create, Update, Delete),
  and NoteHandler with endpoints: GET/POST /api/notes, PUT/DELETE /api/notes/{id}
- Add GET /api/deadlines/{deadlineID} detail endpoint
- Add GET /api/appointments/{id} detail endpoint
- Add GET /api/case-events/{id} detail endpoint (new CaseEventHandler)
- Fix dashboard query: add case_id to upcoming_deadlines SELECT,
  add id and case_id to recent_activity SELECT
- Register all new routes in router.go
2026-03-25 19:26:21 +01:00
9ad58e1ba3 docs: design document for dashboard redesign + detail pages 2026-03-25 18:51:44 +01:00
0712d9a367 docs: design document for dashboard redesign + detail pages (t-kz-060)
Comprehensive design covering:
- Dashboard interactivity (click-to-filter traffic lights, clickable timeline,
  fixed quick actions, AI summary refresh)
- New detail pages (deadline, appointment, case event)
- Notes system with polymorphic table design
- Case detail URL-based tab navigation
- Breadcrumb navigation system
- Backend API additions and data model changes
- Phased implementation plan for coders
2026-03-25 18:49:48 +01:00
cd31e76d07 fix: TenantSwitcher shows dropdown for single tenant, wider name display 2026-03-25 18:40:15 +01:00
f42b7ddec7 fix: add array guards to all frontend components consuming API responses 2026-03-25 18:35:28 +01:00
50bfa3deb4 fix: add array guards to all frontend components consuming API responses
Prevents "M.forEach is not a function" crashes when API returns error
objects or unexpected shapes instead of arrays. Guards all useQuery
consumers with Array.isArray checks and safe defaults for object props.

Files fixed: DeadlineList, AppointmentList, TenantSwitcher,
DeadlineTrafficLights, UpcomingTimeline, CaseOverviewGrid,
AISummaryCard, TeamSettings, and all page-level components
(dashboard, cases, fristen, termine, ai/extract).
2026-03-25 18:34:11 +01:00
e635efa71e fix: remove remaining /api/ double-prefix from template literal API calls
Previous fix missed backtick template strings. Fixed 7 more api.*()
calls in appointments, deadlines, settings, and einstellungen pages.
2026-03-25 18:20:35 +01:00
12e0407025 test: comprehensive E2E and API test suite for full KanzlAI stack 2026-03-25 16:21:32 +01:00
325fbeb5de test: comprehensive E2E and API test suite for full KanzlAI stack
Backend (Go):
- Expanded integration_test.go: health, auth middleware (expired/invalid/wrong-secret JWT),
  tenant CRUD, case CRUD (create/list/get/update/delete + filters + validation),
  deadline CRUD (create/list/update/complete/delete), appointment CRUD,
  dashboard (verifies all sections), deadline calculator (valid/invalid/unknown type),
  proceeding types & rules, document endpoints, AI extraction (no-key path),
  and full critical path E2E (auth -> case -> deadline -> appointment -> dashboard -> complete)
- New handler unit tests: case (10), appointment (11), dashboard (1), calculate (5),
  document (10), AI (4) — all testing validation, auth guards, and error paths without DB
- Total: ~80 backend tests (unit + integration)

Frontend (TypeScript/Vitest):
- Installed vitest 2.x, @testing-library/react, @testing-library/jest-dom, jsdom 24, msw
- vitest.config.ts with jsdom env, esbuild JSX automatic, path aliases
- API client tests (13): URL construction, no double /api/, auth header, tenant header,
  POST/PUT/PATCH/DELETE methods, error handling, 204 responses
- DeadlineTrafficLights tests (5): renders cards, correct counts, zero state, onFilter callback
- CaseOverviewGrid tests (4): renders categories, counts, header, zero state
- LoginPage tests (8): form rendering, mode toggle, password login, redirect, error display,
  magic link, registration link
- Total: 30 frontend tests

Makefile: test-frontend target now runs vitest instead of placeholder echo.
2026-03-25 16:19:00 +01:00
19bea8d058 fix: remove /api/ double-prefix from all frontend API calls
Frontend api.ts baseUrl is already "/api", so paths like
"/api/cases" produced "/api/api/cases". Stripped the redundant
prefix from all component calls. Rewrite destination correctly
adds /api/ back for the Go backend.
2026-03-25 16:05:50 +01:00
661135d137 fix: exclude /api/ routes from Next.js auth middleware
The middleware was intercepting API proxy requests and redirecting
to /login. API routes should pass through to the Go backend which
handles its own JWT auth.
2026-03-25 15:58:42 +01:00
f8d97546e9 fix: preserve /api/ prefix in Next.js rewrite to backend
The rewrite was stripping /api/ from the path, but the Go backend
expects routes at /api/tenants, /api/dashboard etc.
2026-03-25 15:55:58 +01:00
45605c803b fix: pass NEXT_PUBLIC_* env vars as build args for Supabase client
Next.js inlines NEXT_PUBLIC_* vars at build time. They must be
available as ARGs during the Docker build, not just as runtime
environment variables.
2026-03-25 15:53:32 +01:00
m
e57b7c48ed feat: production hardening — slog, rate limiting, tests, seed data (Phase 4) 2026-03-25 14:35:49 +01:00
m
c5c3f41e08 feat: production hardening — slog, rate limiting, integration tests, seed data (Phase 4)
- Structured logging: replace log.* with log/slog JSON output across backend
- Request logger middleware: logs method, path, status, duration for all non-health requests
- Rate limiting: token bucket (5 req/min, burst 10) on AI endpoints (/api/ai/*)
- Integration tests: full critical path test (auth -> create case -> add deadline -> dashboard)
- Seed demo data: 1 tenant, 5 cases with deadlines/appointments/parties/events
- docker-compose.yml: add all required env vars (DATABASE_URL, SUPABASE_*, ANTHROPIC_API_KEY)
- .env.example: document all env vars including DATABASE_URL and CalDAV note
2026-03-25 14:32:27 +01:00
m
d0197a091c feat: add CalDAV settings UI and team management (Phase 3P) 2026-03-25 14:28:08 +01:00
m
fe97fed56d feat: add CalDAV settings UI and team management pages (Phase 3P)
Backend: PUT /api/tenants/{id}/settings endpoint for updating tenant
settings (JSONB merge). Frontend: /einstellungen page with CalDAV
config (URL, credentials, calendar path, sync toggle, interval),
manual sync button, live sync status display. /einstellungen/team
page with member list, invite-by-email, role management.
2026-03-25 14:26:05 +01:00
m
b49992b9c0 feat: UI polish — responsive, loading/empty/error states, German (Phase 3Q) 2026-03-25 14:20:08 +01:00
m
f81a2492c6 feat: UI polish — responsive, loading/empty/error states, German fixes (Phase 3Q)
- Responsive sidebar: collapses on mobile with hamburger menu, slide-in animation
- Skeleton loaders: dashboard cards, case table, case detail page
- Empty states: friendly messages with icons for cases, deadlines, parties, documents
- Error states: retry button on dashboard, proper error message on case not found
- Form validation: inline error messages on case creation form
- German language: fix all missing umlauts (Zurück, wählen, Anhängig, Verfügung, etc.)
- Status labels: display German translations instead of raw status values
- Transitions: fade-in animations on page load, hover/transition-colors on all interactive elements
- Focus states: focus-visible ring for keyboard accessibility
- Mobile layout: stacking for filters, forms, tabs; horizontal scroll for tables
- Extraction results: card layout on mobile, table on desktop
- Missing types: add DashboardData, DeadlineSummary, CaseSummary, ExtractedDeadline etc.
- Fix QuickActions links to use correct routes (/cases/new, /ai/extract)
- Consistent input focus styles across all forms
2026-03-25 14:16:30 +01:00
m
8bb8d7fed8 feat: add CalDAV bidirectional sync service (Phase 3O) 2026-03-25 14:04:38 +01:00
m
b4f3b26cbe feat: add document management frontend (Phase 2N) 2026-03-25 14:04:28 +01:00
m
6e9345fcfe feat: add appointment calendar frontend (Phase 1H) 2026-03-25 14:04:12 +01:00
m
785df2ced4 feat: add CalDAV bidirectional sync service (Phase 3O)
Implements CalDAV sync using github.com/emersion/go-webdav:

- CalDAVService with background polling (configurable per-tenant interval)
- Push: deadlines -> VTODO, appointments -> VEVENT on create/update/delete
- Pull: periodic fetch from CalDAV, reconcile with local DB
- Conflict resolution: KanzlAI wins dates/status, CalDAV wins notes/description
- Conflicts logged as case_events with caldav_conflict type
- UID pattern: kanzlai-{deadline|appointment}-{uuid}@kanzlai.msbls.de
- CalDAV config per tenant in tenants.settings JSONB

Endpoints:
- POST /api/caldav/sync — trigger full sync for current tenant
- GET /api/caldav/status — last sync time, item counts, errors

8 unit tests for UID generation, parsing, path construction, config parsing.
2026-03-25 14:01:30 +01:00
m
0ab2e8b383 feat: add document management frontend (Phase 2N)
- DocumentUpload: dropzone with multi-file support, upload via
  POST /api/cases/{id}/documents, progress feedback with toast
- DocumentList: type badges, file size, upload date, download links,
  delete with inline confirmation
- Integrated as Dokumente tab in case detail page with count badge
- Eagerly fetches document count for tab badge display
2026-03-25 13:59:48 +01:00
194 changed files with 28117 additions and 978 deletions


@@ -18,6 +18,7 @@
- ESLint must pass before committing
- Import aliases: `@/` maps to `src/`
- Bun as package manager (not npm/yarn/pnpm)
- **API paths: NEVER include `/api/` prefix.** The `api` client in `lib/api.ts` already has `baseUrl="/api"`. Write `api.get("/cases")` NOT `api.get("/api/cases")`. The client auto-strips accidental `/api/` prefixes but don't rely on it.
## General


@@ -3,11 +3,16 @@
# Backend
PORT=8080
DATABASE_URL=postgresql://user:pass@host:5432/dbname
# Supabase (required for database access)
SUPABASE_URL=
# Supabase (required for database + auth)
SUPABASE_URL=https://your-project.supabase.co
SUPABASE_ANON_KEY=
SUPABASE_SERVICE_KEY=
SUPABASE_JWT_SECRET=
# Claude API (required for AI features)
ANTHROPIC_API_KEY=
# CalDAV (configured per-tenant in tenant settings, not env vars)
# See tenant.settings.caldav JSON field

.gitignore (vendored, 6 lines changed)

@@ -46,3 +46,9 @@ tmp/
# TypeScript
*.tsbuildinfo
.worktrees/
backend/server
backend/.m/
.m/inbox_lastread
backend/server
backend/.m/
.m/inbox_lastread


@@ -52,7 +52,7 @@ head:
infinity_mode: false
capacity:
global:
max_workers: 5
max_workers: 6
max_heads: 3
per_worker:
max_tasks_lifetime: 0

AUDIT.md (new file, 482 lines)

@@ -0,0 +1,482 @@
# KanzlAI-mGMT MVP Audit
**Date:** 2026-03-28
**Auditor:** athena (consultant)
**Scope:** Full-stack audit of KanzlAI-mGMT — Go backend, Next.js frontend, Supabase database, deployment, security, UX, competitive positioning.
**Codebase:** ~16,500 lines across ~60 source files, built 2026-03-25 in a single session with parallel workers.
---
## Executive Summary
KanzlAI-mGMT is an impressive MVP built in ~2 hours. It covers the core Kanzleimanagement primitives: cases, deadlines, appointments, parties, documents, notes, dashboard, CalDAV sync, and AI-powered deadline extraction. The architecture is sound — clean separation between Go API and Next.js frontend, proper multi-tenant design with Supabase Auth, parameterized SQL throughout.
However, the speed of construction shows. There are **critical security gaps** that must be fixed before any external user touches this. The frontend has good bones but lacks the polish and completeness a lawyer would expect. And the feature gap vs. established competitors (RA-MICRO, ADVOWARE, AnNoText, Actaport) is enormous — particularly around beA integration, billing/RVG, and document generation, which are table-stakes for German law firms.
**Bottom line:** Fix the security issues, add error recovery and multi-tenant auth verification, then decide whether to pursue the Kanzleimanagement market (massive feature gap) or pivot back to the UPC niche (where you had a genuine competitive advantage).
---
## 1. Critical Issues (Fix Immediately)
### 1.1 Tenant Isolation Bypass in TenantResolver
**File:** `backend/internal/auth/tenant_resolver.go:37-42`
When the `X-Tenant-ID` header is provided, the TenantResolver parses it and sets it in context **without verifying the user has access to that tenant**. Any authenticated user can access any tenant's data by setting this header.
```go
if header := r.Header.Get("X-Tenant-ID"); header != "" {
    parsed, err := uuid.Parse(header)
    // ... sets tenantID = parsed — NO ACCESS CHECK
}
```
Compare with `helpers.go:32-44` where `resolveTenant()` correctly verifies access via `user_tenants` — but this function is unused in the middleware path. The TenantResolver middleware is what actually runs for all scoped routes.
**Impact:** Complete tenant data isolation breach. User A can read/modify/delete User B's cases, deadlines, appointments, documents.
**Fix:** Add `user_tenants` lookup in TenantResolver when X-Tenant-ID is provided, same as `resolveTenant()` does.
### 1.2 Duplicate Tenant Resolution Logic
**Files:** `backend/internal/auth/tenant_resolver.go` and `backend/internal/handlers/helpers.go:25-57`
Two independent implementations of tenant resolution exist. The middleware (`TenantResolver`) is used for the scoped routes. The handler-level `resolveTenant()` function exists in helpers.go. The auth middleware in `middleware.go:39-47` also resolves a tenant into context. This triple-resolution creates confusion and the security bug above.
**Fix:** Consolidate to a single path. Remove the handler-level `resolveTenant()` and the auth middleware's tenant resolution. Let TenantResolver be the single source of truth, but make it verify access.
### 1.3 CalDAV Credentials Stored in Plaintext
**File:** `backend/internal/services/caldav_service.go:29-35`
CalDAV username and password are stored as plain JSON in the `tenants.settings` column:
```go
type CalDAVConfig struct {
    URL      string `json:"url"`
    Username string `json:"username"`
    Password string `json:"password"`
    // ...
}
```
Combined with the tenant isolation bypass above, any authenticated user can read any tenant's CalDAV credentials.
**Fix:** Encrypt CalDAV credentials at rest (e.g., using `pgcrypto` or application-level encryption). At minimum, never return the password in API responses.
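A sketch of the application-level option using AES-256-GCM from the standard library (the key-management detail is an assumption: the 32-byte key would come from an environment variable, never from the database the ciphertext lives in):

```go
package main

import (
	"crypto/aes"
	"crypto/cipher"
	"crypto/rand"
	"fmt"
	"io"
)

// encrypt seals plaintext with AES-256-GCM and prepends the random
// nonce so decrypt can recover it. The ciphertext is what would be
// stored in tenants.settings instead of the raw password.
func encrypt(key, plaintext []byte) ([]byte, error) {
	block, err := aes.NewCipher(key)
	if err != nil {
		return nil, err
	}
	gcm, err := cipher.NewGCM(block)
	if err != nil {
		return nil, err
	}
	nonce := make([]byte, gcm.NonceSize())
	if _, err := io.ReadFull(rand.Reader, nonce); err != nil {
		return nil, err
	}
	return gcm.Seal(nonce, nonce, plaintext, nil), nil
}

func decrypt(key, data []byte) ([]byte, error) {
	block, err := aes.NewCipher(key)
	if err != nil {
		return nil, err
	}
	gcm, err := cipher.NewGCM(block)
	if err != nil {
		return nil, err
	}
	if len(data) < gcm.NonceSize() {
		return nil, fmt.Errorf("ciphertext too short")
	}
	nonce, ct := data[:gcm.NonceSize()], data[gcm.NonceSize():]
	return gcm.Open(nil, nonce, ct, nil)
}

func main() {
	key := make([]byte, 32) // AES-256; in production, load from env, not all zeros
	ct, _ := encrypt(key, []byte("caldav-password"))
	pt, _ := decrypt(key, ct)
	fmt.Println(string(pt)) // caldav-password
}
```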
### 1.4 No CORS Configuration
**File:** `backend/internal/router/router.go`, `backend/cmd/server/main.go`
There is zero CORS handling anywhere in the backend. The frontend uses Next.js rewrites to proxy `/api/` to the backend, which works in production. But:
- If anyone accesses the backend directly (different origin), there's no CORS protection.
- No `X-Frame-Options`, `X-Content-Type-Options`, or other security headers are set.
**Fix:** Add CORS middleware restricting to the frontend origin. Add standard security headers.
### 1.5 Internal Error Messages Leaked to Clients
**Files:** Multiple handlers (e.g., `cases.go:44`, `cases.go:73`, `appointments.go`)
```go
writeError(w, http.StatusInternalServerError, err.Error())
```
Internal error messages (including SQL errors, connection errors, etc.) are sent directly to the client. This leaks implementation details.
**Fix:** Log the full error server-side, return a generic message to the client.
### 1.6 Race Condition in HolidayService Cache
**File:** `backend/internal/services/holidays.go`
The `HolidayService` uses a `map[int][]Holiday` cache without any mutex protection. Concurrent requests (e.g., multiple deadline calculations) will cause a data race. The Go race detector would flag this.
**Fix:** Add `sync.RWMutex` to HolidayService.
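A sketch of the guarded cache (names are illustrative, not the repo's actual fields; the re-check under the write lock prevents duplicate computation):

```go
package main

import (
	"fmt"
	"sync"
)

type Holiday struct {
	Name string
	Date string // ISO date, simplified for the sketch
}

// HolidayCache guards the year→holidays map with a sync.RWMutex so
// concurrent deadline calculations no longer race.
type HolidayCache struct {
	mu     sync.RWMutex
	byYear map[int][]Holiday
}

func NewHolidayCache() *HolidayCache {
	return &HolidayCache{byYear: make(map[int][]Holiday)}
}

// Get returns the cached holidays for year, computing them at most once.
func (c *HolidayCache) Get(year int, compute func(int) []Holiday) []Holiday {
	c.mu.RLock()
	hs, ok := c.byYear[year]
	c.mu.RUnlock()
	if ok {
		return hs
	}
	c.mu.Lock()
	defer c.mu.Unlock()
	if hs, ok := c.byYear[year]; ok { // re-check: another goroutine may have filled it
		return hs
	}
	hs = compute(year)
	c.byYear[year] = hs
	return hs
}

func main() {
	cache := NewHolidayCache()
	compute := func(y int) []Holiday {
		return []Holiday{{Name: "Neujahr", Date: fmt.Sprintf("%d-01-01", y)}}
	}
	var wg sync.WaitGroup
	for i := 0; i < 50; i++ { // safe under `go run -race`
		wg.Add(1)
		go func() { defer wg.Done(); cache.Get(2026, compute) }()
	}
	wg.Wait()
	fmt.Println(cache.Get(2026, compute)[0].Name) // Neujahr
}
```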
### 1.7 Rate Limiter Trivially Bypassable
**File:** `backend/internal/middleware/ratelimit.go:78-79`
```go
ip := r.Header.Get("X-Forwarded-For")
if ip == "" { ip = r.RemoteAddr }
```
Rate limiting keys off `X-Forwarded-For`, which any client can spoof. An attacker can bypass AI endpoint rate limits by rotating this header.
**Fix:** Only trust `X-Forwarded-For` from configured reverse proxy IPs, or use `r.RemoteAddr` exclusively behind a trusted proxy.
---
## 2. Important Gaps (Fix Before Showing to Anyone)
### 2.1 No Input Validation Beyond "Required Fields"
**Files:** All handlers
Input validation is minimal — typically just checking if required fields are empty:
```go
if input.CaseNumber == "" || input.Title == "" {
    writeError(w, http.StatusBadRequest, "case_number and title are required")
}
```
Missing:
- Length limits on text fields (could store megabytes in a title field)
- Status value validation (accepts any string for status fields)
- Date format validation
- Case type validation against allowed values
- SQL-safe string validation (although parameterized queries protect against injection)
### 2.2 No Pagination Defaults on Most List Endpoints
**File:** `backend/internal/services/case_service.go:57-63`
`CaseService.List` has sane defaults (limit=20, max=100). But other list endpoints (`appointments`, `deadlines`, `notes`, `parties`, `case_events`) have no pagination at all — they return all records for a tenant/case. As data grows, these become performance problems.
### 2.3 Dashboard Page is Entirely Client-Side
**File:** `frontend/src/app/(app)/dashboard/page.tsx`
The entire dashboard is a `"use client"` component that fetches data via API. This means:
- No SSR benefit — the page is blank until JS loads and API responds
- SEO doesn't matter for a SaaS app, but initial load time does
- The skeleton is nice but adds 200-400ms of perceived latency
For an internal tool this is acceptable, but for a commercial product it should use server components for the initial render.
### 2.4 Frontend Auth Uses `getSession()` Instead of `getUser()`
**File:** `frontend/src/lib/api.ts:10-12`
```typescript
const { data: { session } } = await supabase.auth.getSession();
```
`getSession()` reads from local storage without server verification. If a session is expired or revoked server-side, the frontend will still try to use it until the backend rejects it. The middleware correctly uses `getUser()` (which validates server-side), but the API client does not.
### 2.5 Missing Error Recovery in Frontend
Throughout the frontend, API errors are handled with basic error states, but there's no:
- Retry logic for transient failures
- Token refresh on 401 responses
- Optimistic UI rollback on mutation failures
- Offline detection
### 2.6 Missing `Content-Disposition` Header Sanitization
**File:** `backend/internal/handlers/documents.go:133`
```go
w.Header().Set("Content-Disposition", fmt.Sprintf(`attachment; filename="%s"`, title))
```
The `title` (which comes from user input) is inserted directly into the header. A filename containing `"` or newlines could be used for response header injection.
**Fix:** Sanitize the filename — strip or encode special characters.
### 2.7 No Graceful Shutdown
**File:** `backend/cmd/server/main.go:42`
```go
http.ListenAndServe(":"+cfg.Port, handler)
```
No signal handling or graceful shutdown. When the process receives SIGTERM (e.g., during deployment), in-flight requests are dropped, CalDAV sync operations may be interrupted mid-write, and database connections are not cleanly closed.
### 2.8 Database Connection Pool — search_path is Session-Level
**File:** `backend/internal/db/connection.go:17`
```go
db.Exec("SET search_path TO kanzlai, public")
```
`SET search_path` is session-level in PostgreSQL. With connection pooling (`MaxOpenConns: 25`), this SET runs once on the initial connection. If a connection is recycled or a new one opened from the pool, it may not have the kanzlai search_path. This could cause queries to silently hit the wrong schema.
**Fix:** Use `SET LOCAL search_path` in a transaction, or set it at the database/role level, or qualify all table references with the schema name.
### 2.9 go.sum Missing from Dockerfile
**File:** `backend/Dockerfile:4`
```dockerfile
COPY go.mod ./
RUN go mod download
```
Only `go.mod` is copied, not `go.sum`. This means the build isn't reproducible and doesn't verify checksums. Should be `COPY go.mod go.sum ./`.
### 2.10 German Umlaut Typos Throughout Frontend
**Files:** Multiple frontend components
German strings use ASCII approximations instead of proper characters:
- `login/page.tsx`: "Zurueck" instead of "Zurück"
- `cases/[id]/layout.tsx`: "Anhaengig" instead of "Anhängig"
- `cases/[id]/fristen/page.tsx`: "Ueberfaellig" instead of "Überfällig"
- `termine/page.tsx`: "Uberblick" instead of "Überblick"
A German lawyer would notice this immediately. It signals "this was built by a machine, not tested by a human."
### 2.11 Silent Error Swallowing in Event Creation
**File:** `backend/internal/services/case_service.go:260-266`
```go
func createEvent(ctx context.Context, db *sqlx.DB, ...) {
    db.ExecContext(ctx, /* ... */) // Error completely ignored
}
```
Case events (audit trail) silently fail to create. The calling functions don't check the return. This means you could have cases with no events and no way to know why.
### 2.12 Missing Error Boundaries in Frontend
No React error boundaries are implemented. If any component throws, the entire page crashes with a white screen. For a law firm tool where data integrity matters, this is unacceptable.
### 2.13 No RLS Policies Defined at Database Level
Multi-tenant isolation relies entirely on `WHERE tenant_id = $X` clauses in Go code. If any query forgets this clause, data leaks across tenants. There are no PostgreSQL RLS policies as a safety net.
**Fix:** Enable RLS on all tenant-scoped tables and create policies tied to `auth.uid()` via `user_tenants`.
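A sketch for one table, assuming the audit's data-model names (`kanzlai.cases`, `kanzlai.user_tenants`) and Supabase's `auth.uid()` helper; the policy name is illustrative and the same pattern would repeat for every tenant-scoped table:

```sql
-- Deny-by-default: once RLS is enabled, rows are invisible unless a
-- policy grants access.
ALTER TABLE kanzlai.cases ENABLE ROW LEVEL SECURITY;

-- Grant access only to rows whose tenant the current user belongs to.
CREATE POLICY tenant_member_access ON kanzlai.cases
  USING (
    tenant_id IN (
      SELECT ut.tenant_id
      FROM kanzlai.user_tenants ut
      WHERE ut.user_id = auth.uid()
    )
  );
```

Note that the Go backend connects with the Supabase service key, which bypasses RLS; for RLS to act as a safety net, queries would need to run under a role subject to it (or with the request's JWT claims applied).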
---
## 3. Architecture Assessment
### 3.1 What's Good
- **Clean monorepo structure** — `backend/` and `frontend/` are clearly separated. Each has its own Dockerfile. The Makefile provides unified commands.
- **Go backend is well-organized** — `cmd/server/`, `internal/{auth,config,db,handlers,middleware,models,router,services}` follows Go best practices.
- **Handler/Service separation** — handlers do HTTP concerns (parse request, write response), services do business logic. This is correct.
- **Parameterized SQL everywhere** — no string concatenation in queries. All user input goes through `$N` placeholders.
- **Multi-tenant design** — `tenant_id` on every row with context-based tenant resolution (though database-level RLS is still missing; see 2.13).
- **Smart use of Go 1.22+ routing** — method+path patterns like `GET /api/cases/{id}` eliminate the need for a third-party router.
- **CalDAV sync is genuinely impressive** — bidirectional sync with conflict resolution, etag tracking, background polling per-tenant. This is a differentiator.
- **Deadline calculator** — ported from youpc.org with holiday awareness. Legally important and hard to build.
- **Frontend routing structure** — German URL paths (`/fristen`, `/termine`, `/einstellungen`), nested case detail routes with layout.tsx for shared chrome. Proper use of App Router patterns.
### 3.2 Structural Concerns
- **No database migrations** — the schema was apparently created via SQL scripts run manually. There's a `seed/demo_data.sql` but no migration system. For a production system, this is unsustainable.
- **No CI/CD pipeline** — no `.github/workflows/`, `.gitea/`, or any CI configuration. Tests run locally but not automatically.
- **No API versioning** — all routes are at `/api/`. Adding breaking changes will break clients.
- **Services take raw `*sqlx.DB`** — no transaction support across service boundaries. Creating a case + event is not atomic (if the event insert fails, the case still exists).
- **Models are just struct definitions** — no validation methods, no constructor functions. Validation is scattered across handlers.
### 3.3 Data Model
Based on the seed data and model files, the schema is reasonable:
- `tenants`, `user_tenants` (multi-tenancy)
- `cases`, `parties` (case management)
- `deadlines`, `appointments` (time management)
- `documents`, `case_events`, `notes` (supporting data)
- `proceeding_types`, `deadline_rules`, `holidays` (reference data)
**Missing indexes likely needed:**
- `deadlines(tenant_id, status, due_date)` — for dashboard queries
- `appointments(tenant_id, start_at)` — for calendar queries
- `case_events(case_id, created_at)` — for event feeds
- `cases(tenant_id, status)` — for filtered lists
**Missing constraints:**
- No CHECK constraint on status values (cases, deadlines, appointments)
- No UNIQUE constraint on `case_number` per tenant
- No foreign key from `notes` to the parent entity (if polymorphic)
---
## 4. Security Assessment
### 4.1 Authentication
- **JWT validation is correct** — algorithm check (HMAC only), expiry check, sub claim extraction. Using `golang-jwt/v5`.
- **Supabase Auth on frontend** — proper cookie-based session with server-side verification in middleware.
- **No refresh token rotation** — the API client uses `getSession()` which may serve stale tokens.
### 4.2 Authorization
- **Critical: Tenant isolation bypass** (see 1.1)
- **No role-based access control** — `user_tenants` has a `role` column but it's never checked. Any member can do anything.
- **No resource-level permissions** — any user in a tenant can delete any case, document, etc.
### 4.3 Input Validation
- **SQL injection: Protected** — all queries use parameterized placeholders.
- **XSS: Partially protected** — React auto-escapes, but the API returns raw strings that could contain HTML. The `Content-Disposition` header is vulnerable (see 2.6).
- **File upload: Partially protected** — `MaxBytesReader` limits to 50MB, but no file type validation (could upload .exe, .html with scripts, etc.).
- **Rate limiting: AI endpoints only** — the rest of the API has no rate limiting. Login/register go through Supabase (which has its own limits), but all CRUD endpoints are unlimited.
### 4.4 Secrets
- **No hardcoded secrets** — all via environment variables. Good.
- **CalDAV credentials in plaintext** — see 1.3.
- **Supabase service key in backend** — necessary for storage, but this key has full DB access. Should be scoped.
---
## 5. Testing Assessment
### 5.1 Backend Tests (15 files)
- **Integration test** — sets up real DB connection, creates JWT, tests full HTTP flow. Excellent pattern but requires DATABASE_URL (skips otherwise).
- **Handler tests** — mock-based unit tests for most handlers. Test JSON parsing, error responses, basic happy paths.
- **Service tests** — deadline calculator has solid date arithmetic tests. Holiday service tested. CalDAV service tested with mocks. AI service tested with mocked HTTP.
- **Middleware tests** — rate limiter tested.
- **Auth tests** — tenant resolver tested.
### 5.2 Frontend Tests (4 files)
- `api.test.ts` — tests the API client
- `DeadlineTrafficLights.test.tsx` — component test
- `CaseOverviewGrid.test.tsx` — component test
- `LoginPage.test.tsx` — auth page test
### 5.3 What's Missing
- **No E2E tests** — no Playwright/Cypress. Critical for a law firm app where correctness matters.
- **No contract tests** — frontend and backend are tested independently. A schema change could break the frontend without any test catching it.
- **Deadline calculation edge cases** — needs tests for year boundaries, leap years, holidays falling on weekends, multiple consecutive holidays.
- **Multi-tenant security tests** — no test verifying that User A can't access Tenant B's data. This is the most important test to add.
- **Frontend test coverage is thin** — 4 tests for ~30 components. The dashboard, all forms, navigation, error states are untested.
- **No load testing** — unknown how the system behaves under concurrent users.
---
## 6. UX Assessment
### 6.1 What Works
- **Dashboard is strong** — traffic light deadline indicators, upcoming timeline, case overview, quick actions. A lawyer can see what matters at a glance.
- **German localization** — UI is in German with proper legal terminology (Akten, Fristen, Termine, Parteien).
- **Mobile responsive** — sidebar collapses to hamburger menu, layout uses responsive grids.
- **Loading states** — skeleton screens on dashboard, not just spinners.
- **Breadcrumbs** — navigation trail on all pages.
- **Deadline calculator** — unique feature that provides real value for UPC litigation.
### 6.2 What a Lawyer Would Stumble On
1. **No onboarding flow** — after registration, user has no tenant, no cases. The app shows empty states but doesn't guide the user to create a tenant or import data.
2. **No search** — there's no global search. A lawyer with 100+ cases needs to find things fast.
3. **No keyboard shortcuts** — power users (lawyers are keyboard-heavy) have no shortcuts.
4. **Sidebar mixes languages** — "Akten" (German) vs "AI Analyse" (English). Should be consistent.
5. **No notifications** — overdue deadlines don't trigger any alert beyond the dashboard color. No email alerts, no push notifications.
6. **No print view** — lawyers need to print deadline lists, case summaries. No print stylesheet.
7. **No bulk operations** — can't mark multiple deadlines as complete, can't bulk-assign parties.
8. **Document upload has no preview** — uploaded PDFs can't be viewed inline.
9. **AI features require manual trigger** — AI summary and deadline extraction are manual. Should auto-trigger on document upload.
10. **No activity log per user** — no audit trail of who changed what. Critical for law firm compliance.
---
## 7. Deployment Assessment
### 7.1 Docker Setup
- **Multi-stage builds** — both Dockerfiles use builder pattern. Good.
- **Backend is minimal** — Alpine + static binary + ca-certificates. ~15MB image.
- **Frontend** — Bun for deps/build, Node for runtime (standalone output). Reasonable.
- **Missing:** go.sum not copied in backend Dockerfile (see 2.9).
- **Missing:** No docker-compose.yml for local development.
- **Missing:** No health check in Dockerfile (`HEALTHCHECK` instruction).
### 7.2 Environment Handling
- **Config validates required vars** — `DATABASE_URL` and `SUPABASE_JWT_SECRET` are checked at startup.
- **Supabase URL/keys not validated** — if missing, features silently fail or crash at runtime.
- **No .env.example** — new developers don't know what env vars are needed.
### 7.3 Reliability
- **No graceful shutdown** (see 2.7)
- **No readiness/liveness probes** — `/health` exists but only checks DB connectivity. No readiness distinction.
- **CalDAV sync runs in-process** — if the sync goroutine panics, it takes down the API server.
- **No structured error recovery** — panics in handlers will crash the process (no recovery middleware).
---
## 8. Competitive Analysis
### 8.1 The Market
German Kanzleisoftware is a mature, crowded market:
| Tool | Type | Price | Key Strength |
|------|------|-------|-------------|
| **RA-MICRO** | Desktop + Cloud | ~100-200 EUR/user/mo | Market leader, 30+ years, full beA integration |
| **ADVOWARE** | Desktop + Cloud | from 20 EUR/mo | Budget-friendly, strong for small firms |
| **AnNoText** (Wolters Kluwer) | Desktop + Cloud | Custom pricing | Enterprise, AI document analysis, DictNow |
| **Actaport** | Cloud-native | from 79.80 EUR/mo | Modern UI, Mandantenportal, integrated Office |
| **Haufe Advolux** | Cloud | Custom | User-friendly, full-featured |
| **Renostar Legal Cloud** | Cloud | Custom | Browser-based, no installation |
### 8.2 Table-Stakes Features KanzlAI is Missing
These are **mandatory** for any German Kanzleisoftware to be taken seriously:
1. **beA Integration** — since 2022, German lawyers must use the electronic court mailbox (besonderes elektronisches Anwaltspostfach). No Kanzleisoftware sells without it. This is a **massive** implementation effort (KSW-Schnittstelle from BRAK).
2. **RVG Billing (Gebührenrechner)** — automated fee calculation per RVG (Rechtsanwaltsvergütungsgesetz). Every competitor has this built-in. Without it, lawyers can't bill clients.
3. **Document Generation** — templates for Schriftsätze, Klageschriften, Mahnbescheide with auto-populated case data. Usually integrated with Word.
4. **Accounting (FiBu)** — client trust accounts (Fremdgeld), DATEV export, tax-relevant bookkeeping. Legal requirement.
5. **Conflict Check (Kollisionsprüfung)** — check if the firm has a conflict of interest before taking a case. Legally required (§ 43a BRAO).
6. **Dictation System** — voice-to-text for lawyers. RA-MICRO has DictaNet, AnNoText has DictNow.
### 8.3 Where KanzlAI Could Differentiate
Despite the feature gap, KanzlAI has some advantages:
1. **AI-native** — competitors are bolting AI onto 20-year-old software. KanzlAI has Claude API integration from day one. The deadline extraction from PDFs is genuinely useful.
2. **UPC specialization** — the deadline calculator with UPC Rules of Procedure knowledge is unique. No competitor has deep UPC litigation support.
3. **CalDAV sync** — bidirectional sync with external calendars is not common in German Kanzleisoftware.
4. **Modern tech stack** — React + Go + Supabase vs. the .NET/Java/Desktop world of RA-MICRO et al.
5. **Multi-tenant from day 1** — designed for SaaS, not converted from desktop software.
### 8.4 Strategic Recommendation
**Don't compete head-on with RA-MICRO.** The feature gap is 10+ person-years of work. Instead:
**Option A: UPC Niche Tool** — Pivot back to UPC patent litigation. Build the best deadline calculator, case tracker, and AI-powered brief analysis tool for UPC practitioners. There are ~1000 UPC practitioners in Europe who need specialized tooling that RA-MICRO doesn't provide. Charge 200-500 EUR/mo.
**Option B: AI-First Legal Assistant** — Don't call it "Kanzleimanagement." Position as an AI assistant that reads court documents, extracts deadlines, and syncs to the lawyer's existing Kanzleisoftware via CalDAV/iCal. This sidesteps the feature gap entirely.
**Option C: Full Kanzleisoftware** — If you pursue this, beA integration is the first priority, then RVG billing. Without these two, no German lawyer will switch.
---
## 9. Strengths (What's Good, Keep Doing It)
1. **Architecture is solid** — the Go + Next.js + Supabase stack is well-chosen. Clean separation of concerns.
2. **SQL is safe** — parameterized queries throughout. No injection vectors.
3. **Multi-tenant design** — tenant_id scoping is the right approach; adding RLS as a database-level safety net (see 2.13) would complete it.
4. **CalDAV implementation** — genuinely impressive for an MVP. Bidirectional sync with conflict resolution.
5. **Deadline calculator** — ported from youpc.org with holiday awareness. Real domain value.
6. **AI integration** — Claude API with tool use for structured extraction. Clean implementation.
7. **Dashboard UX** — traffic lights, timeline, quick actions. Lawyers will get this immediately.
8. **German-first** — proper legal terminology, German date formats, localized UI.
9. **Test foundation** — 15 backend test files with integration tests. Good starting point.
10. **Docker builds are lean** — multi-stage, Alpine-based, standalone Next.js output.
---
## 10. Priority Roadmap
### P0 — This Week
- [ ] Fix tenant isolation bypass in TenantResolver (1.1)
- [ ] Consolidate tenant resolution logic (1.2)
- [ ] Encrypt CalDAV credentials at rest (1.3)
- [ ] Add CORS middleware + security headers (1.4)
- [ ] Stop leaking internal errors to clients (1.5)
- [ ] Add mutex to HolidayService cache (1.6)
- [ ] Fix rate limiter X-Forwarded-For bypass (1.7)
- [ ] Fix Dockerfile go.sum copy (2.9)
### P1 — Before Demo/Beta
- [ ] Add input validation (length limits, allowed values) (2.1)
- [ ] Add pagination to all list endpoints (2.2)
- [ ] Fix `search_path` connection pool issue (2.8)
- [ ] Add graceful shutdown with signal handling (2.7)
- [ ] Sanitize Content-Disposition filename (2.6)
- [ ] Fix German umlaut typos throughout frontend (2.10)
- [ ] Handle createEvent errors instead of swallowing (2.11)
- [ ] Add React error boundaries (2.12)
- [ ] Implement RLS policies on all tenant-scoped tables (2.13)
- [ ] Add multi-tenant security tests
- [ ] Add database migrations system
- [ ] Add `.env.example` file
- [ ] Add onboarding flow for new users
### P2 — Next Iteration
- [ ] Role-based access control (admin/member/readonly)
- [ ] Global search
- [ ] Email notifications for overdue deadlines
- [ ] Audit trail / activity log per user
- [ ] Auto-trigger AI extraction on document upload
- [ ] Print-friendly views
- [ ] E2E tests with Playwright
- [ ] CI/CD pipeline
### P3 — Strategic
- [ ] Decide market positioning (UPC niche vs. AI assistant vs. full Kanzleisoftware)
- [ ] If Kanzleisoftware: begin beA integration research
- [ ] If Kanzleisoftware: RVG Gebührenrechner
- [ ] If UPC niche: integrate lex-research case law database
---
*This audit was conducted by reading every source file in the repository, running all tests, analyzing the database schema via seed data, and comparing against established German Kanzleisoftware competitors.*


@@ -18,7 +18,7 @@ frontend/ Next.js 15 (TypeScript, Tailwind CSS, App Router)
- **Frontend:** Next.js 15 with TypeScript, Tailwind CSS v4, App Router, Bun
- **Backend:** Go (standard library HTTP server)
- **Database:** Supabase (PostgreSQL) — `kanzlai` schema in flexsiebels instance
- **Database:** Supabase (PostgreSQL) — `mgmt` schema in youpc.org instance
- **Deploy:** Dokploy on mLake, domain: kanzlai.msbls.de
## Development

View File

@@ -0,0 +1,665 @@
# Design: Dashboard Redesign + Detail Pages
**Task:** t-kz-060
**Author:** cronus (inventor)
**Date:** 2026-03-25
**Status:** Design proposal
---
## Problem Statement
The current dashboard is a read-only status board. Cards show counts but don't link anywhere. Timeline items are inert. Quick actions navigate to list pages rather than creation flows. There are no detail pages for individual events, deadlines, or appointments. Notes don't exist as a first-class entity. Case detail tabs use local state instead of URL segments, breaking deep linking and back navigation.
## Design Principles
1. **Everything clickable goes somewhere** — no dead-end UI
2. **Breadcrumb navigation** — always know where you are, one click to go back
3. **German labels throughout** — consistent with existing convention
4. **Mobile responsive** — sidebar collapses, cards stack, touch targets >= 44px
5. **Information density over whitespace** — law firm users want data, not decoration
6. **URL-driven state** — tabs, filters, and views reflected in the URL for deep linking
---
## Part 1: Dashboard Redesign
### 1.1 Traffic Light Cards → Click-to-Filter
**Current:** Three cards (Überfällig / Diese Woche / Im Zeitplan) show counts. `onFilter` prop exists but is never wired up in `dashboard/page.tsx`.
**Proposed:**
Clicking a traffic light card navigates to `/fristen?status={filter}`:
| Card | Navigation Target |
|------|------------------|
| Überfällig (red) | `/fristen?status=overdue` |
| Diese Woche (amber) | `/fristen?status=this_week` |
| Im Zeitplan (green) | `/fristen?status=ok` |
**Implementation:**
- Replace `onFilter` callback with `next/link` navigation using `href`
- `DeadlineTrafficLights` becomes a pure link-based component (no callback needed)
- `/fristen` page reads `searchParams.status` and pre-applies the filter
- The DeadlineList component already supports status filtering — just needs to read from URL
**Changes:**
- `DeadlineTrafficLights.tsx`: Replace `<button onClick>` with `<Link href="/fristen?status=...">`
- `fristen/page.tsx`: Read `searchParams` and pass as initial filter to DeadlineList
- `DeadlineList.tsx`: Accept `initialStatus` prop from URL params
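To illustrate, the card-to-URL mapping and the URL-param parsing could look like the following sketch. The names (`trafficLightHref`, `initialStatusFromParams`, `DEADLINE_STATUSES`) are hypothetical, not taken from the codebase:

```typescript
// Hypothetical helpers for the traffic-light → /fristen mapping (1.1).
const DEADLINE_STATUSES = ["overdue", "this_week", "ok"] as const;
type DeadlineStatus = (typeof DEADLINE_STATUSES)[number];

export function trafficLightHref(status: DeadlineStatus): string {
  return `/fristen?status=${status}`;
}

// Parses searchParams.status on /fristen; unknown values fall back to null
// so the list renders unfiltered instead of breaking on a bad URL.
export function initialStatusFromParams(params: URLSearchParams): DeadlineStatus | null {
  const raw = params.get("status") ?? "";
  return (DEADLINE_STATUSES as readonly string[]).includes(raw)
    ? (raw as DeadlineStatus)
    : null;
}
```

Falling back to `null` (rather than throwing) keeps a hand-edited or stale URL from crashing the list page.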
### 1.2 Case Overview Grid → Click-to-Filter
**Current:** Three static metrics (Aktive Akten / Neu / Abgeschlossen). No links.
**Proposed:**
| Card | Navigation Target |
|------|------------------|
| Aktive Akten | `/cases?status=active` |
| Neu (Monat) | `/cases?status=active&since=month` |
| Abgeschlossen | `/cases?status=closed` |
**Implementation:**
- Wrap each metric row in `<Link>` to `/cases` with appropriate query params
- Cases list page already has filtering — needs to read URL params on mount
- Add visual hover state (arrow icon on hover, background highlight)
**Changes:**
- `CaseOverviewGrid.tsx`: Each row becomes a `<Link>` with hover arrow
- `cases/page.tsx`: Read `searchParams` for initial filter state
### 1.3 Timeline Items → Click-to-Navigate
**Current:** Timeline entries show deadline/appointment info but are not clickable. No link to the parent case or the item itself.
**Proposed:**
Each timeline entry becomes a clickable row:
- **Deadline entries**: Click navigates to `/fristen/{id}` (new deadline detail page)
- **Appointment entries**: Click navigates to `/termine/{id}` (new appointment detail page)
- **Case reference** (Az. / case_number): Secondary click target linking to `/cases/{case_id}`
**Visual changes:**
- Add `cursor-pointer` and hover state (`hover:bg-neutral-100` transition)
- Add a small chevron-right icon on the right edge
- Case number becomes a subtle underlined link (click stops propagation)
**Data changes:**
- `UpcomingDeadline` needs `case_id` field (currently missing from the dashboard query — the backend model has it but the SQL join doesn't select it)
- `UpcomingAppointment` already has `case_id`
**Backend change:**
- `dashboard_service.go` line 112: Add `d.case_id` to the upcoming deadlines SELECT
- `DashboardService.UpcomingDeadline` struct: Add `CaseID uuid.UUID` field
- Frontend `UpcomingDeadline` type: Already has `case_id` (it's in types.ts but the backend doesn't send it)
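The two click targets per timeline row (item detail vs. parent case) can be sketched as pure route helpers. The `TimelineEntry` union below is illustrative, not the actual `UpcomingDeadline`/`UpcomingAppointment` shapes:

```typescript
// Hypothetical route helpers for clickable timeline rows (1.3).
type TimelineEntry =
  | { kind: "deadline"; id: string; case_id: string }
  | { kind: "appointment"; id: string; case_id: string };

// Primary click target: the entry's own detail page.
export function timelineItemHref(entry: TimelineEntry): string {
  return entry.kind === "deadline" ? `/fristen/${entry.id}` : `/termine/${entry.id}`;
}

// Secondary click target: the parent case (the Az. link, which stops propagation).
export function timelineCaseHref(entry: TimelineEntry): string {
  return `/cases/${entry.case_id}`;
}
```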
### 1.4 Quick Actions → Proper Navigation
**Current:** "Frist eintragen" goes to `/fristen` (list page), not a creation flow. "CalDAV Sync" goes to `/einstellungen`.
**Proposed:**
| Action | Current Target | New Target |
|--------|---------------|------------|
| Neue Akte | `/cases/new` | `/cases/new` (keep) |
| Frist eintragen | `/fristen` | `/fristen/neu` (new creation page) |
| Neuer Termin | (missing) | `/termine/neu` (new creation page) |
| AI Analyse | `/ai/extract` | `/ai/extract` (keep) |
Replace "CalDAV Sync" with "Neuer Termin" — CalDAV sync is a settings function, not a daily quick action. Creating an appointment is something a secretary does multiple times per day.
**Changes:**
- `QuickActions.tsx`: Update hrefs, swap CalDAV for appointment creation
- Create `/fristen/neu/page.tsx` — standalone deadline creation form (select case, fill fields)
- Create `/termine/neu/page.tsx` — standalone appointment creation form
### 1.5 AI Summary Card → Refresh Button
**Current:** Rule-based summary text, no refresh mechanism. Card regenerates on page load but not on demand.
**Proposed:**
- Add a small refresh icon button (RefreshCw) in the card header, next to "KI-Zusammenfassung"
- Clicking it calls `refetch()` on the dashboard query (passed as prop)
- Show a brief spinning animation during refetch
- If/when real AI summarization is wired up, this button triggers `POST /api/ai/summarize-dashboard` (future endpoint)
**Changes:**
- `AISummaryCard.tsx`: Accept `onRefresh` prop, add button with spinning state
- `dashboard/page.tsx`: Pass `refetch` to AISummaryCard
### 1.6 Dashboard Layout: Add Recent Activity Section
**Current:** The backend returns `recent_activity` (last 10 case events) but the frontend ignores it entirely.
**Proposed:**
- Add a "Letzte Aktivität" section below the timeline, full width
- Shows the 10 most recent case events in a compact list
- Each row: event icon (by type) | title | case number (linked) | relative time
- Clicking a row navigates to the case event detail page `/cases/{case_id}/ereignisse/{event_id}`
**Changes:**
- New component: `RecentActivityList.tsx` in `components/dashboard/`
- `dashboard/page.tsx`: Add section below the main grid
- Add `RecentActivity` type to `types.ts` (needs `case_id` and `event_id` fields from backend)
- Backend: Add `case_id` and `id` to the recent activity query
---
## Part 2: New Pages
### 2.1 Deadline Detail Page — `/fristen/{id}`
**Route:** `src/app/(app)/fristen/[id]/page.tsx`
**Layout:**
```
Breadcrumb: Dashboard > Fristen > {deadline.title}
+---------------------------------------------------------+
| [Status Badge] {deadline.title} [Erledigen] |
| Fällig: 28. März 2026 |
+---------------------------------------------------------+
| Akte: Az. 2024/001 — Müller v. Schmidt [→ Zur Akte] |
| Quelle: Berechnet (R.118 RoP) |
| Ursprüngliches Datum: 25. März 2026 |
| Warnungsdatum: 21. März 2026 |
+---------------------------------------------------------+
| Notizen [Bearbeiten]|
| Fristverlängerung beantragt am 20.03. |
+---------------------------------------------------------+
| Verlauf |
| ○ Erstellt am 15.03.2026 |
| ○ Warnung gesendet am 21.03.2026 |
+---------------------------------------------------------+
```
**Data requirements:**
- `GET /api/deadlines/{id}` — new endpoint returning full deadline with case info
- Returns: Deadline + associated case (number, title, id) + notes
**Sections:**
1. **Header**: Status badge (Offen/Erledigt/Überfällig), title, "Erledigen" action button
2. **Due date**: Large, with relative time ("in 3 Tagen" / "vor 2 Tagen überfällig")
3. **Context panel**: Parent case (linked), source (manual/calculated/caldav), rule reference, original vs adjusted date
4. **Notes section**: Free-text notes (existing `notes` field on deadline), inline edit
5. **Activity log**: Timeline of changes to this deadline (future: from case_events filtered by deadline)
**Backend additions:**
- `GET /api/deadlines/{id}` — new handler returning single deadline with case join
- Handler: `deadlines.go` add `Get` method
- Service: `deadline_service.go` add `GetByID` with case join
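The relative due-date line from the header section could be computed roughly as below. This is a sketch: `relativeDueLabel` is a hypothetical name, and calendar-day rounding plus German singular/plural handling ("1 Tag" vs. "Tagen") are deliberately simplified:

```typescript
// Sketch of the relative due-date label ("in 3 Tagen" / "vor 2 Tagen überfällig").
// Rounds to whole days; pluralization is intentionally naive.
export function relativeDueLabel(due: Date, now: Date): string {
  const dayMs = 24 * 60 * 60 * 1000;
  const days = Math.round((due.getTime() - now.getTime()) / dayMs);
  if (days > 0) return `in ${days} Tagen`;
  if (days === 0) return "heute fällig";
  return `vor ${-days} Tagen überfällig`;
}
```

A production version would likely round on calendar-day boundaries in the user's timezone rather than raw millisecond deltas.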
### 2.2 Appointment Detail Page — `/termine/{id}`
**Route:** `src/app/(app)/termine/[id]/page.tsx`
**Layout:**
```
Breadcrumb: Dashboard > Termine > {appointment.title}
+---------------------------------------------------------+
| {appointment.title} [Bearbeiten] [X] |
| Typ: Verhandlung |
+---------------------------------------------------------+
| Datum: 28. März 2026, 10:00–12:00 Uhr |
| Ort: UPC München, Saal 3 |
+---------------------------------------------------------+
| Akte: Az. 2024/001 — Müller v. Schmidt [→ Zur Akte] |
+---------------------------------------------------------+
| Beschreibung |
| Erste mündliche Verhandlung... |
+---------------------------------------------------------+
| Notizen [+ Neu] |
| ○ 25.03. — Mandant über Termin informiert |
| ○ 24.03. — Schriftsatz vorbereitet |
+---------------------------------------------------------+
```
**Data requirements:**
- `GET /api/appointments/{id}` — new endpoint returning single appointment with case info
- Notes: Uses new `notes` table (see Part 3)
**Backend additions:**
- `GET /api/appointments/{id}` — new handler
- Handler: `appointments.go` add `Get` method
- Service: `appointment_service.go` add `GetByID` with optional case join
### 2.3 Case Event Detail Page — `/cases/{id}/ereignisse/{eventId}`
**Route:** `src/app/(app)/cases/[id]/ereignisse/[eventId]/page.tsx`
**Layout:**
```
Breadcrumb: Akten > Az. 2024/001 > Verlauf > {event.title}
+---------------------------------------------------------+
| [Event Type Icon] {event.title} |
| 25. März 2026, 14:30 |
+---------------------------------------------------------+
| Beschreibung |
| Statusänderung: aktiv → geschlossen |
+---------------------------------------------------------+
| Metadaten |
| Erstellt von: max.mustermann@kanzlei.de |
| Typ: status_changed |
+---------------------------------------------------------+
| Notizen [+ Neu] |
| (keine Notizen) |
+---------------------------------------------------------+
```
**Data requirements:**
- `GET /api/case-events/{id}` — new endpoint
- Notes: Uses new `notes` table
**Backend additions:**
- New handler: `case_events.go` with `Get` method
- New service method: `CaseEventService.GetByID`
- Or extend existing case handler to include event fetching
### 2.4 Standalone Deadline Creation — `/fristen/neu`
**Route:** `src/app/(app)/fristen/neu/page.tsx`
**Layout:**
```
Breadcrumb: Fristen > Neue Frist
+---------------------------------------------------------+
| Neue Frist anlegen |
+---------------------------------------------------------+
| Akte*: [Dropdown: Aktenauswahl] |
| Bezeichnung*: [________________________] |
| Beschreibung: [________________________] |
| Fällig am*: [Datumsauswahl] |
| Warnung am: [Datumsauswahl] |
| Notizen: [Textarea] |
+---------------------------------------------------------+
| [Abbrechen] [Frist anlegen]|
+---------------------------------------------------------+
```
Reuses existing deadline creation logic but as a standalone page rather than requiring the user to first navigate to a case. Case is selected via dropdown.
### 2.5 Standalone Appointment Creation — `/termine/neu`
**Route:** `src/app/(app)/termine/neu/page.tsx`
Same pattern as deadline creation. Reuses AppointmentModal fields but as a full page form. Appointment can optionally be linked to a case.
### 2.6 Case Detail Tabs → URL Segments
**Current:** Tabs use `useState<TabKey>` — no URL change, no deep linking, no browser back.
**Proposed route structure:**
```
/cases/{id} → redirects to /cases/{id}/verlauf
/cases/{id}/verlauf → Timeline tab
/cases/{id}/fristen → Deadlines tab
/cases/{id}/dokumente → Documents tab
/cases/{id}/parteien → Parties tab
/cases/{id}/notizen → Notes tab (new)
```
**Implementation approach:**
Use Next.js nested layouts with a shared layout for the case header + tab bar:
```
src/app/(app)/cases/[id]/
layout.tsx # Case header + tab navigation
page.tsx # Redirect to ./verlauf
verlauf/page.tsx # Timeline
fristen/page.tsx # Deadlines
dokumente/page.tsx # Documents
parteien/page.tsx # Parties
notizen/page.tsx # Notes (new)
```
The `layout.tsx` fetches case data and renders the header + tab bar. Each child page renders its tab content. The active tab is determined by the current pathname.
**Benefits:**
- Deep linking: `/cases/abc123/fristen` opens directly to the deadlines tab
- Browser back button works between tabs
- Each tab can have its own loading state
- Bookmarkable
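Determining the active tab from the pathname in `layout.tsx` could be done with a small helper like this (names hypothetical; unknown or missing segments fall back to `"verlauf"`, matching the redirect from `/cases/{id}`):

```typescript
// Hypothetical helper: derives the active case-detail tab from the pathname.
const CASE_TABS = ["verlauf", "fristen", "dokumente", "parteien", "notizen"] as const;
type CaseTab = (typeof CASE_TABS)[number];

export function activeTabFromPath(pathname: string): CaseTab {
  const last = pathname.split("/").filter(Boolean).pop() ?? "";
  return (CASE_TABS as readonly string[]).includes(last) ? (last as CaseTab) : "verlauf";
}
```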
---
## Part 3: Notes System
### 3.1 Data Model
Notes are a polymorphic entity — they can be attached to cases, deadlines, appointments, or case events.
**New table: `kanzlai.notes`**
```sql
CREATE TABLE kanzlai.notes (
id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
tenant_id UUID NOT NULL REFERENCES kanzlai.tenants(id),
-- Polymorphic parent reference (exactly one must be set)
case_id UUID REFERENCES kanzlai.cases(id) ON DELETE CASCADE,
deadline_id UUID REFERENCES kanzlai.deadlines(id) ON DELETE CASCADE,
appointment_id UUID REFERENCES kanzlai.appointments(id) ON DELETE CASCADE,
case_event_id UUID REFERENCES kanzlai.case_events(id) ON DELETE CASCADE,
content TEXT NOT NULL,
created_by UUID, -- auth.users reference
created_at TIMESTAMPTZ NOT NULL DEFAULT now(),
updated_at TIMESTAMPTZ NOT NULL DEFAULT now(),
-- Ensure exactly one parent is set
CONSTRAINT notes_single_parent CHECK (
(CASE WHEN case_id IS NOT NULL THEN 1 ELSE 0 END +
CASE WHEN deadline_id IS NOT NULL THEN 1 ELSE 0 END +
CASE WHEN appointment_id IS NOT NULL THEN 1 ELSE 0 END +
CASE WHEN case_event_id IS NOT NULL THEN 1 ELSE 0 END) = 1
)
);
-- Indexes for efficient lookup by parent
CREATE INDEX idx_notes_case ON kanzlai.notes(tenant_id, case_id) WHERE case_id IS NOT NULL;
CREATE INDEX idx_notes_deadline ON kanzlai.notes(tenant_id, deadline_id) WHERE deadline_id IS NOT NULL;
CREATE INDEX idx_notes_appointment ON kanzlai.notes(tenant_id, appointment_id) WHERE appointment_id IS NOT NULL;
CREATE INDEX idx_notes_case_event ON kanzlai.notes(tenant_id, case_event_id) WHERE case_event_id IS NOT NULL;
-- RLS
ALTER TABLE kanzlai.notes ENABLE ROW LEVEL SECURITY;
CREATE POLICY notes_tenant_isolation ON kanzlai.notes
USING (tenant_id IN (
SELECT tenant_id FROM kanzlai.user_tenants WHERE user_id = auth.uid()
));
```
### 3.2 Why Polymorphic Table vs Separate Tables
**Considered alternatives:**
1. **Separate notes per entity** (case_notes, deadline_notes, etc.) — More tables, duplicated logic, harder to search across all notes.
2. **Generic `entity_type` + `entity_id` pattern** — Loses FK constraints, can't cascade delete, harder to query with joins.
3. **Polymorphic with nullable FKs** (chosen) — FK constraints maintained, cascade deletes work, partial indexes keep queries fast, single service/handler. The CHECK constraint ensures data integrity.
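The same "exactly one parent" invariant enforced by the CHECK constraint can be mirrored client-side before submitting a note, so bad payloads never reach the API. A sketch (the `NoteParentRefs` name and shape are illustrative):

```typescript
// Client-side mirror of the notes_single_parent CHECK constraint (illustrative).
export interface NoteParentRefs {
  case_id?: string;
  deadline_id?: string;
  appointment_id?: string;
  case_event_id?: string;
}

export function hasExactlyOneParent(refs: NoteParentRefs): boolean {
  const parents = [refs.case_id, refs.deadline_id, refs.appointment_id, refs.case_event_id];
  return parents.filter((p) => p != null && p !== "").length === 1;
}
```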
### 3.3 Backend Model & API
**Go model:**
```go
type Note struct {
ID uuid.UUID `db:"id" json:"id"`
TenantID uuid.UUID `db:"tenant_id" json:"tenant_id"`
CaseID *uuid.UUID `db:"case_id" json:"case_id,omitempty"`
DeadlineID *uuid.UUID `db:"deadline_id" json:"deadline_id,omitempty"`
AppointmentID *uuid.UUID `db:"appointment_id" json:"appointment_id,omitempty"`
CaseEventID *uuid.UUID `db:"case_event_id" json:"case_event_id,omitempty"`
Content string `db:"content" json:"content"`
CreatedBy *uuid.UUID `db:"created_by" json:"created_by,omitempty"`
CreatedAt time.Time `db:"created_at" json:"created_at"`
UpdatedAt time.Time `db:"updated_at" json:"updated_at"`
}
```
**API endpoints:**
```
GET /api/notes?case_id={id} # List notes for a case
GET /api/notes?deadline_id={id} # List notes for a deadline
GET /api/notes?appointment_id={id} # List notes for an appointment
GET /api/notes?case_event_id={id} # List notes for a case event
POST /api/notes # Create note (body includes parent ID)
PUT /api/notes/{id} # Update note content
DELETE /api/notes/{id} # Delete note
```
Single endpoint with query parameter filtering — simpler than nested routes, works uniformly across all parent types.
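On the frontend, the query-parameter scheme maps naturally onto a single URL builder (a sketch; `notesUrl` and `NoteParentType` are hypothetical names):

```typescript
// Hypothetical URL builder matching the query-parameter scheme above.
export type NoteParentType = "case" | "deadline" | "appointment" | "case_event";

export function notesUrl(parentType: NoteParentType, parentId: string): string {
  return `/api/notes?${parentType}_id=${encodeURIComponent(parentId)}`;
}
```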
**Service methods:**
```go
type NoteService struct { db *sqlx.DB }
func (s *NoteService) ListByParent(ctx context.Context, tenantID uuid.UUID, parentType string, parentID uuid.UUID) ([]Note, error)
func (s *NoteService) Create(ctx context.Context, tenantID uuid.UUID, note *Note) (*Note, error)
func (s *NoteService) Update(ctx context.Context, tenantID, noteID uuid.UUID, content string) (*Note, error)
func (s *NoteService) Delete(ctx context.Context, tenantID, noteID uuid.UUID) error
```
### 3.4 Notes UI Component
Reusable `<NotesList>` component used on every detail page:
```
+------------------------------------------------------------+
| Notizen [+ Neu] |
+------------------------------------------------------------+
| m@kanzlei.de · 25. Mär 2026, 14:30 [X][E] |
| Fristverlängerung beim Gericht beantragt. |
+------------------------------------------------------------+
| m@kanzlei.de · 24. Mär 2026, 10:15 [X][E] |
| Mandant telefonisch über Sachstand informiert. |
+------------------------------------------------------------+
```
**Props:**
```typescript
interface NotesListProps {
parentType: "case" | "deadline" | "appointment" | "case_event";
parentId: string;
}
```
**Features:**
- Fetches notes via `GET /api/notes?{parentType}_id={parentId}`
- "Neu" button opens inline textarea (not a modal — faster for quick notes)
- Each note shows: author, timestamp, content, edit/delete buttons
- Edit is inline (textarea replaces content)
- Optimistic updates via react-query mutation + invalidation
- Empty state: "Keine Notizen vorhanden. Klicken Sie +, um eine Notiz hinzuzufügen."
### 3.5 Migration from `deadlines.notes` Field
The existing `deadlines.notes` text field should be migrated:
1. For each deadline with a non-null `notes` value, create a corresponding row in the `notes` table with `deadline_id` set
2. Drop the `deadlines.notes` column after migration
3. This can be a one-time SQL migration script
---
## Part 4: Breadcrumb Navigation
### 4.1 Breadcrumb Component
New shared component: `src/components/layout/Breadcrumb.tsx`
```typescript
import Link from "next/link";

interface BreadcrumbItem {
  label: string;
  href?: string; // omit for current page (last item)
}

// Renders: Home > Parent > Current.
// Items with href become Links; the last item is plain text.
export function Breadcrumb({ items }: { items: BreadcrumbItem[] }) {
  return (
    <nav aria-label="Breadcrumb" className="text-sm text-neutral-500">
      {items.map((item, i) => (
        <span key={i}>
          {i > 0 && " > "}
          {item.href ? <Link href={item.href}>{item.label}</Link> : item.label}
        </span>
      ))}
    </nav>
  );
}
```
**Placement:** At the top of every page, inside the main content area (not in the layout — different pages have different breadcrumbs).
### 4.2 Breadcrumb Patterns
| Page | Breadcrumb |
|------|-----------|
| Dashboard | Dashboard |
| Fristen | Dashboard > Fristen |
| Fristen Detail | Dashboard > Fristen > {title} |
| Fristen Neu | Dashboard > Fristen > Neue Frist |
| Termine | Dashboard > Termine |
| Termine Detail | Dashboard > Termine > {title} |
| Termine Neu | Dashboard > Termine > Neuer Termin |
| Akten | Dashboard > Akten |
| Akte Detail | Dashboard > Akten > {case_number} |
| Akte > Fristen | Dashboard > Akten > {case_number} > Fristen |
| Akte > Notizen | Dashboard > Akten > {case_number} > Notizen |
| Ereignis Detail | Dashboard > Akten > {case_number} > Verlauf > {title} |
| Einstellungen | Dashboard > Einstellungen |
| AI Analyse | Dashboard > AI Analyse |
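A builder for the case-tab pattern in the table above could look like this (a sketch; `caseTabBreadcrumb` is a hypothetical helper, and `BreadcrumbItem` mirrors the component props from 4.1):

```typescript
// Hypothetical builder for the "Akten > {case_number} > {tab}" breadcrumb pattern.
interface BreadcrumbItem {
  label: string;
  href?: string; // omitted for the current page (last item)
}

export function caseTabBreadcrumb(
  caseId: string,
  caseNumber: string,
  tabLabel: string
): BreadcrumbItem[] {
  return [
    { label: "Dashboard", href: "/dashboard" },
    { label: "Akten", href: "/cases" },
    { label: caseNumber, href: `/cases/${caseId}` },
    { label: tabLabel }, // current page: plain text, no href
  ];
}
```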
---
## Part 5: Summary of Backend Changes
### New Endpoints
| Method | Path | Handler | Purpose |
|--------|------|---------|---------|
| GET | `/api/deadlines/{id}` | `deadlineH.Get` | Single deadline with case context |
| GET | `/api/appointments/{id}` | `apptH.Get` | Single appointment with case context |
| GET | `/api/case-events/{id}` | `eventH.Get` | Single case event |
| GET | `/api/notes` | `noteH.List` | List notes (filtered by parent) |
| POST | `/api/notes` | `noteH.Create` | Create note |
| PUT | `/api/notes/{id}` | `noteH.Update` | Update note |
| DELETE | `/api/notes/{id}` | `noteH.Delete` | Delete note |
### Modified Endpoints
| Endpoint | Change |
|----------|--------|
| `GET /api/dashboard` | Add `case_id`, `id` to recent_activity; add `case_id` to upcoming_deadlines query |
### New Files
| File | Purpose |
|------|---------|
| `backend/internal/models/note.go` | Note model |
| `backend/internal/services/note_service.go` | Note CRUD service |
| `backend/internal/handlers/notes.go` | Note HTTP handlers |
| `backend/internal/handlers/case_events.go` | Case event detail handler |
### Database Migration
1. Create `kanzlai.notes` table with polymorphic FK pattern
2. Migrate existing `deadlines.notes` data
3. Drop `deadlines.notes` column
---
## Part 6: Summary of Frontend Changes
### New Files
| File | Purpose |
|------|---------|
| `src/app/(app)/fristen/[id]/page.tsx` | Deadline detail page |
| `src/app/(app)/fristen/neu/page.tsx` | Standalone deadline creation |
| `src/app/(app)/termine/[id]/page.tsx` | Appointment detail page |
| `src/app/(app)/termine/neu/page.tsx` | Standalone appointment creation |
| `src/app/(app)/cases/[id]/layout.tsx` | Case detail shared layout (header + tabs) |
| `src/app/(app)/cases/[id]/verlauf/page.tsx` | Case timeline tab |
| `src/app/(app)/cases/[id]/fristen/page.tsx` | Case deadlines tab |
| `src/app/(app)/cases/[id]/dokumente/page.tsx` | Case documents tab |
| `src/app/(app)/cases/[id]/parteien/page.tsx` | Case parties tab |
| `src/app/(app)/cases/[id]/notizen/page.tsx` | Case notes tab (new) |
| `src/app/(app)/cases/[id]/ereignisse/[eventId]/page.tsx` | Case event detail |
| `src/components/layout/Breadcrumb.tsx` | Reusable breadcrumb |
| `src/components/notes/NotesList.tsx` | Reusable notes list + inline creation |
| `src/components/dashboard/RecentActivityList.tsx` | Recent activity feed |
### Modified Files
| File | Change |
|------|--------|
| `src/components/dashboard/DeadlineTrafficLights.tsx` | Buttons → Links with navigation |
| `src/components/dashboard/CaseOverviewGrid.tsx` | Static metrics → clickable links |
| `src/components/dashboard/UpcomingTimeline.tsx` | Items → clickable with navigation |
| `src/components/dashboard/AISummaryCard.tsx` | Add refresh button |
| `src/components/dashboard/QuickActions.tsx` | Fix targets, swap CalDAV for Termin |
| `src/app/(app)/dashboard/page.tsx` | Wire navigation, add RecentActivity section |
| `src/app/(app)/fristen/page.tsx` | Read URL params for initial filter |
| `src/app/(app)/cases/page.tsx` | Read URL params for initial filter |
| `src/app/(app)/cases/[id]/page.tsx` | Refactor into layout + nested routes |
| `src/lib/types.ts` | Add Note, RecentActivity types; update UpcomingDeadline |
### Types to Add
```typescript
export interface Note {
id: string;
tenant_id: string;
case_id?: string;
deadline_id?: string;
appointment_id?: string;
case_event_id?: string;
content: string;
created_by?: string;
created_at: string;
updated_at: string;
}
export interface RecentActivity {
id: string;
event_type?: string;
title: string;
case_id: string;
case_number: string;
event_date?: string;
created_at: string;
}
```
---
## Part 7: Implementation Plan
Recommended order for a coder to implement:
### Phase A: Backend Foundation (can be done in parallel)
1. Create `notes` table migration + model + service + handler
2. Add `GET /api/deadlines/{id}` endpoint
3. Add `GET /api/appointments/{id}` endpoint
4. Add `GET /api/case-events/{id}` endpoint
5. Fix dashboard query to include `case_id` in upcoming deadlines and recent activity
### Phase B: Frontend — Dashboard Interactivity
1. Create `Breadcrumb` component
2. Make traffic light cards clickable (Links)
3. Make case overview grid clickable (Links)
4. Make timeline items clickable (Links)
5. Fix quick actions (swap CalDAV for Termin, update hrefs)
6. Add refresh button to AI Summary card
7. Add RecentActivityList component + wire to dashboard
### Phase C: Frontend — New Detail Pages
1. Deadline detail page (`/fristen/[id]`)
2. Appointment detail page (`/termine/[id]`)
3. Case event detail page (`/cases/[id]/ereignisse/[eventId]`)
4. Standalone deadline creation (`/fristen/neu`)
5. Standalone appointment creation (`/termine/neu`)
### Phase D: Frontend — Case Detail Refactor
1. Extract case header + tabs into layout.tsx
2. Create sub-route pages (verlauf, fristen, dokumente, parteien)
3. Add notes tab
4. Wire `NotesList` component into all detail pages
### Phase E: Polish
1. URL filter params on `/fristen` and `/cases` pages
2. Breadcrumbs on all pages
3. Mobile responsive testing
4. Migration of existing `deadlines.notes` data
---
## Appendix: What This Design Does NOT Cover
- Real AI-powered summary (currently rule-based — kept as-is with refresh button)
- Notification system (toast-based alerts for approaching deadlines)
- Audit log / change history per entity
- Batch operations (mark multiple deadlines complete)
- Print views
These are separate features that can be designed independently.

View File

@@ -37,7 +37,7 @@ test-backend:
cd backend && go test ./...
test-frontend:
@echo "No frontend tests configured yet"
cd frontend && bun run test
# Clean
clean:

ROADMAP.md (new file, 1321 lines)

File diff suppressed because it is too large

View File

@@ -1,32 +1,53 @@
package main
import (
"log"
"log/slog"
"net/http"
"os"
_ "github.com/lib/pq"
"mgit.msbls.de/m/KanzlAI-mGMT/internal/auth"
"mgit.msbls.de/m/KanzlAI-mGMT/internal/config"
"mgit.msbls.de/m/KanzlAI-mGMT/internal/db"
"mgit.msbls.de/m/KanzlAI-mGMT/internal/logging"
"mgit.msbls.de/m/KanzlAI-mGMT/internal/router"
"mgit.msbls.de/m/KanzlAI-mGMT/internal/services"
)
func main() {
logging.Setup()
cfg, err := config.Load()
if err != nil {
log.Fatalf("Failed to load config: %v", err)
slog.Error("failed to load config", "error", err)
os.Exit(1)
}
database, err := db.Connect(cfg.DatabaseURL)
if err != nil {
log.Fatalf("Failed to connect to database: %v", err)
slog.Error("failed to connect to database", "error", err)
os.Exit(1)
}
defer database.Close()
authMW := auth.NewMiddleware(cfg.SupabaseJWTSecret, database)
handler := router.New(database, authMW, cfg)
log.Printf("Starting KanzlAI API server on :%s", cfg.Port)
// Start CalDAV sync service
calDAVSvc := services.NewCalDAVService(database)
calDAVSvc.Start()
defer calDAVSvc.Stop()
// Start notification reminder service
notifSvc := services.NewNotificationService(database)
notifSvc.Start()
defer notifSvc.Stop()
handler := router.New(database, authMW, cfg, calDAVSvc, notifSvc, database)
slog.Info("starting KanzlAI API server", "port", cfg.Port)
if err := http.ListenAndServe(":"+cfg.Port, handler); err != nil {
log.Fatal(err)
slog.Error("server failed", "error", err)
os.Exit(1)
}
}

View File

@@ -3,11 +3,17 @@ module mgit.msbls.de/m/KanzlAI-mGMT
go 1.25.5
require (
github.com/anthropics/anthropic-sdk-go v1.27.1 // indirect
github.com/golang-jwt/jwt/v5 v5.3.1 // indirect
github.com/google/uuid v1.6.0 // indirect
github.com/jmoiron/sqlx v1.4.0 // indirect
github.com/lib/pq v1.12.0 // indirect
github.com/anthropics/anthropic-sdk-go v1.27.1
github.com/emersion/go-ical v0.0.0-20250609112844-439c63cef608
github.com/emersion/go-webdav v0.7.0
github.com/golang-jwt/jwt/v5 v5.3.1
github.com/google/uuid v1.6.0
github.com/jmoiron/sqlx v1.4.0
github.com/lib/pq v1.12.0
)
require (
github.com/teambition/rrule-go v1.8.2 // indirect
github.com/tidwall/gjson v1.18.0 // indirect
github.com/tidwall/match v1.1.1 // indirect
github.com/tidwall/pretty v1.2.1 // indirect

View File

@@ -1,6 +1,18 @@
filippo.io/edwards25519 v1.1.0 h1:FNf4tywRC1HmFuKW5xopWpigGjJKiJSV0Cqo0cJWDaA=
filippo.io/edwards25519 v1.1.0/go.mod h1:BxyFTGdWcka3PhytdK4V28tE5sGfRvvvRV7EaN4VDT4=
github.com/anthropics/anthropic-sdk-go v1.27.1 h1:7DgMZ2Ng3C2mPzJGHA30NXQTZolcF07mHd0tGaLwfzk=
github.com/anthropics/anthropic-sdk-go v1.27.1/go.mod h1:qUKmaW+uuPB64iy1l+4kOSvaLqPXnHTTBKH6RVZ7q5Q=
github.com/davecgh/go-spew v1.1.1 h1:vj9j/u1bqnvCEfJOwUhtlOARqs3+rkHYY13jYWTU97c=
github.com/davecgh/go-spew v1.1.1/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
github.com/dnaeon/go-vcr v1.2.0 h1:zHCHvJYTMh1N7xnV7zf1m1GPBF9Ad0Jk/whtQ1663qI=
github.com/dnaeon/go-vcr v1.2.0/go.mod h1:R4UdLID7HZT3taECzJs4YgbbH6PIGXB6W/sc5OLb6RQ=
github.com/emersion/go-ical v0.0.0-20240127095438-fc1c9d8fb2b6/go.mod h1:BEksegNspIkjCQfmzWgsgbu6KdeJ/4LwUZs7DMBzjzw=
github.com/emersion/go-ical v0.0.0-20250609112844-439c63cef608 h1:5XWaET4YAcppq3l1/Yh2ay5VmQjUdq6qhJuucdGbmOY=
github.com/emersion/go-ical v0.0.0-20250609112844-439c63cef608/go.mod h1:BEksegNspIkjCQfmzWgsgbu6KdeJ/4LwUZs7DMBzjzw=
github.com/emersion/go-vcard v0.0.0-20230815062825-8fda7d206ec9/go.mod h1:HMJKR5wlh/ziNp+sHEDV2ltblO4JD2+IdDOWtGcQBTM=
github.com/emersion/go-webdav v0.7.0 h1:cp6aBWXBf8Sjzguka9VJarr4XTkGc2IHxXI1Gq3TKpA=
github.com/emersion/go-webdav v0.7.0/go.mod h1:mI8iBx3RAODwX7PJJ7qzsKAKs/vY429YfS2/9wKnDbQ=
github.com/go-sql-driver/mysql v1.8.1 h1:LedoTUt/eveggdHS9qUFC1EFSa8bU2+1pZjSRpvNJ1Y=
github.com/go-sql-driver/mysql v1.8.1/go.mod h1:wEBSXgmK//2ZFJyE+qWnIsVGmvmEKlqwuVSjsCm7DZg=
github.com/golang-jwt/jwt/v5 v5.3.1 h1:kYf81DTWFe7t+1VvL7eS+jKFVWaUnK9cB1qbwn63YCY=
github.com/golang-jwt/jwt/v5 v5.3.1/go.mod h1:fxCRLWMO43lRc8nhHWY6LGqRcf+1gQWArsqaEUEa5bE=
@@ -11,7 +23,14 @@ github.com/jmoiron/sqlx v1.4.0/go.mod h1:ZrZ7UsYB/weZdl2Bxg6jCRO9c3YHl8r3ahlKmRT
github.com/lib/pq v1.10.9/go.mod h1:AlVN5x4E4T544tWzH6hKfbfQvm3HdbOxrmggDNAPY9o=
github.com/lib/pq v1.12.0 h1:mC1zeiNamwKBecjHarAr26c/+d8V5w/u4J0I/yASbJo=
github.com/lib/pq v1.12.0/go.mod h1:/p+8NSbOcwzAEI7wiMXFlgydTwcgTr3OSKMsD2BitpA=
github.com/mattn/go-sqlite3 v1.14.22 h1:2gZY6PC6kBnID23Tichd1K+Z0oS6nE/XwU+Vz/5o4kU=
github.com/mattn/go-sqlite3 v1.14.22/go.mod h1:Uh1q+B4BYcTPb+yiD3kU8Ct7aC0hY9fxUwlHK0RXw+Y=
github.com/pmezard/go-difflib v1.0.0 h1:4DBwDE0NGyQoBHbLQYPwSUPoCMWR5BEzIk/f1lZbAQM=
github.com/pmezard/go-difflib v1.0.0/go.mod h1:iKH77koFhYxTK1pcRnkKkqfTogsbg7gZNVY4sRDYZ/4=
github.com/stretchr/testify v1.8.4 h1:CcVxjf3Q8PM0mHUKJCdn+eZZtm5yQwehR5yeSVQQcUk=
github.com/stretchr/testify v1.8.4/go.mod h1:sz/lmYIOXD/1dqDmKjjqLyZ2RngseejIcXlSw2iwfAo=
github.com/teambition/rrule-go v1.8.2 h1:lIjpjvWTj9fFUZCmuoVDrKVOtdiyzbzc93qTmRVe/J8=
github.com/teambition/rrule-go v1.8.2/go.mod h1:Ieq5AbrKGciP1V//Wq8ktsTXwSwJHDD5mD/wLBGl3p4=
github.com/tidwall/gjson v1.14.2/go.mod h1:/wbyibRr2FHMks5tjHJ5F8dMZh3AcwJEMf5vlfC0lxk=
github.com/tidwall/gjson v1.18.0 h1:FIDeeyB800efLX89e5a8Y0BNH+LOngJyGrIWxG2FKQY=
github.com/tidwall/gjson v1.18.0/go.mod h1:/wbyibRr2FHMks5tjHJ5F8dMZh3AcwJEMf5vlfC0lxk=
@@ -24,3 +43,7 @@ github.com/tidwall/sjson v1.2.5 h1:kLy8mja+1c9jlljvWTlSazM7cKDRfJuR/bOJhcY5NcY=
github.com/tidwall/sjson v1.2.5/go.mod h1:Fvgq9kS/6ociJEDnK0Fk1cpYF4FIW6ZF7LAe+6jwd28=
golang.org/x/sync v0.16.0 h1:ycBJEhp9p4vXvUZNszeOq0kGTPghopOL8q0fq3vstxw=
golang.org/x/sync v0.16.0/go.mod h1:1dzgHSNfp02xaA81J2MS99Qcpr2w7fw1gpm99rleRqA=
gopkg.in/yaml.v2 v2.2.8 h1:obN1ZagJSUGI0Ek/LBmuj4SNLPfIny3KsKFopxRdj10=
gopkg.in/yaml.v2 v2.2.8/go.mod h1:hI93XBmqTisBFMUTm0b8Fm+jr3Dg1NNxqwp+5A1VGuI=
gopkg.in/yaml.v3 v3.0.1 h1:fxVm/GzAzEWqLHuvctI91KS9hhNmmWOoWu0XTYJS7CA=
gopkg.in/yaml.v3 v3.0.1/go.mod h1:K4uyk7z7BCEPqu6E+C64Yfv1cQ7kz7rIZviUmN+EgEM=

View File

@@ -9,8 +9,11 @@ import (
type contextKey string
const (
userIDKey contextKey = "user_id"
tenantIDKey contextKey = "tenant_id"
userIDKey contextKey = "user_id"
tenantIDKey contextKey = "tenant_id"
ipKey contextKey = "ip_address"
userAgentKey contextKey = "user_agent"
userRoleKey contextKey = "user_role"
)
func ContextWithUserID(ctx context.Context, userID uuid.UUID) context.Context {
@@ -30,3 +33,32 @@ func TenantFromContext(ctx context.Context) (uuid.UUID, bool) {
id, ok := ctx.Value(tenantIDKey).(uuid.UUID)
return id, ok
}
func ContextWithRequestInfo(ctx context.Context, ip, userAgent string) context.Context {
ctx = context.WithValue(ctx, ipKey, ip)
ctx = context.WithValue(ctx, userAgentKey, userAgent)
return ctx
}
func IPFromContext(ctx context.Context) *string {
if v, ok := ctx.Value(ipKey).(string); ok && v != "" {
return &v
}
return nil
}
func UserAgentFromContext(ctx context.Context) *string {
if v, ok := ctx.Value(userAgentKey).(string); ok && v != "" {
return &v
}
return nil
}
func ContextWithUserRole(ctx context.Context, role string) context.Context {
return context.WithValue(ctx, userRoleKey, role)
}
func UserRoleFromContext(ctx context.Context) string {
role, _ := ctx.Value(userRoleKey).(string)
return role
}


@@ -24,28 +24,26 @@ func (m *Middleware) RequireAuth(next http.Handler) http.Handler {
return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
token := extractBearerToken(r)
if token == "" {
http.Error(w, "missing authorization token", http.StatusUnauthorized)
http.Error(w, `{"error":"missing authorization token"}`, http.StatusUnauthorized)
return
}
userID, err := m.verifyJWT(token)
if err != nil {
http.Error(w, fmt.Sprintf("invalid token: %v", err), http.StatusUnauthorized)
http.Error(w, `{"error":"invalid token"}`, http.StatusUnauthorized)
return
}
ctx := ContextWithUserID(r.Context(), userID)
// Resolve tenant from user_tenants
var tenantID uuid.UUID
err = m.db.GetContext(r.Context(), &tenantID,
"SELECT tenant_id FROM user_tenants WHERE user_id = $1 LIMIT 1", userID)
if err != nil {
http.Error(w, "no tenant found for user", http.StatusForbidden)
return
// Capture IP and user-agent for audit logging
ip := r.Header.Get("X-Forwarded-For")
if ip == "" {
ip = r.RemoteAddr
}
ctx = ContextWithTenantID(ctx, tenantID)
ctx = ContextWithRequestInfo(ctx, ip, r.UserAgent())
// Tenant and role resolution handled by TenantResolver middleware for scoped routes.
next.ServeHTTP(w, r.WithContext(ctx))
})
}


@@ -0,0 +1,213 @@
package auth
import (
"context"
"net/http"
"github.com/google/uuid"
"github.com/jmoiron/sqlx"
)
// Valid roles ordered by privilege level (highest first).
var ValidRoles = []string{"owner", "partner", "associate", "paralegal", "secretary"}
// IsValidRole checks if a role string is one of the defined roles.
func IsValidRole(role string) bool {
for _, r := range ValidRoles {
if r == role {
return true
}
}
return false
}
// Permission represents an action that can be checked against roles.
type Permission int
const (
PermManageTeam Permission = iota
PermManageBilling
PermCreateCase
PermEditAllCases
PermEditAssignedCase
PermViewAllCases
PermManageDeadlines
PermManageAppointments
PermUploadDocuments
PermDeleteDocuments
PermDeleteOwnDocuments
PermViewAuditLog
PermManageSettings
PermAIExtraction
)
// rolePermissions maps each role to its set of permissions.
var rolePermissions = map[string]map[Permission]bool{
"owner": {
PermManageTeam: true,
PermManageBilling: true,
PermCreateCase: true,
PermEditAllCases: true,
PermEditAssignedCase: true,
PermViewAllCases: true,
PermManageDeadlines: true,
PermManageAppointments: true,
PermUploadDocuments: true,
PermDeleteDocuments: true,
PermDeleteOwnDocuments: true,
PermViewAuditLog: true,
PermManageSettings: true,
PermAIExtraction: true,
},
"partner": {
PermManageTeam: true,
PermManageBilling: true,
PermCreateCase: true,
PermEditAllCases: true,
PermEditAssignedCase: true,
PermViewAllCases: true,
PermManageDeadlines: true,
PermManageAppointments: true,
PermUploadDocuments: true,
PermDeleteDocuments: true,
PermDeleteOwnDocuments: true,
PermViewAuditLog: true,
PermManageSettings: true,
PermAIExtraction: true,
},
"associate": {
PermCreateCase: true,
PermEditAssignedCase: true,
PermViewAllCases: true,
PermManageDeadlines: true,
PermManageAppointments: true,
PermUploadDocuments: true,
PermDeleteOwnDocuments: true,
PermAIExtraction: true,
},
"paralegal": {
PermEditAssignedCase: true,
PermViewAllCases: true,
PermManageDeadlines: true,
PermManageAppointments: true,
PermUploadDocuments: true,
},
"secretary": {
PermViewAllCases: true,
PermManageAppointments: true,
PermUploadDocuments: true,
},
}
// HasPermission checks if the given role has the specified permission.
func HasPermission(role string, perm Permission) bool {
perms, ok := rolePermissions[role]
if !ok {
return false
}
return perms[perm]
}
// RequirePermission returns middleware that checks if the user's role has the given permission.
func RequirePermission(perm Permission) func(http.Handler) http.Handler {
return func(next http.Handler) http.Handler {
return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
role := UserRoleFromContext(r.Context())
if role == "" || !HasPermission(role, perm) {
writeJSONError(w, "insufficient permissions", http.StatusForbidden)
return
}
next.ServeHTTP(w, r)
})
}
}
// RequireRole returns middleware that checks if the user has one of the specified roles.
func RequireRole(roles ...string) func(http.Handler) http.Handler {
allowed := make(map[string]bool, len(roles))
for _, r := range roles {
allowed[r] = true
}
return func(next http.Handler) http.Handler {
return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
role := UserRoleFromContext(r.Context())
if !allowed[role] {
writeJSONError(w, "insufficient permissions", http.StatusForbidden)
return
}
next.ServeHTTP(w, r)
})
}
}
// IsAssignedToCase checks if a user is assigned to a specific case.
func IsAssignedToCase(ctx context.Context, db *sqlx.DB, userID, caseID uuid.UUID) (bool, error) {
var exists bool
err := db.GetContext(ctx, &exists,
`SELECT EXISTS(SELECT 1 FROM case_assignments WHERE user_id = $1 AND case_id = $2)`,
userID, caseID)
return exists, err
}
// CanEditCase checks if a user can edit a specific case based on role and assignment.
func CanEditCase(ctx context.Context, db *sqlx.DB, userID, caseID uuid.UUID, role string) (bool, error) {
// Owner and partner can edit all cases
if HasPermission(role, PermEditAllCases) {
return true, nil
}
// Others need to be assigned
if !HasPermission(role, PermEditAssignedCase) {
return false, nil
}
return IsAssignedToCase(ctx, db, userID, caseID)
}
// CanDeleteDocument checks if a user can delete a specific document.
func CanDeleteDocument(role string, docUploaderID, userID uuid.UUID) bool {
if HasPermission(role, PermDeleteDocuments) {
return true
}
if HasPermission(role, PermDeleteOwnDocuments) {
return docUploaderID == userID
}
return false
}
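The fallback logic in CanDeleteDocument — blanket delete beats the ownership check, otherwise only the uploader may delete — is easiest to see against a trimmed role table. A standalone sketch, with string IDs standing in for uuid.UUID and only two permissions kept:

```go
package main

import "fmt"

type Permission int

const (
	PermDeleteDocuments Permission = iota
	PermDeleteOwnDocuments
)

// Trimmed-down role table (subset of the full map in this file).
var rolePermissions = map[string]map[Permission]bool{
	"partner":   {PermDeleteDocuments: true, PermDeleteOwnDocuments: true},
	"associate": {PermDeleteOwnDocuments: true},
	"secretary": {},
}

// HasPermission: indexing a nil map yields false, so unknown roles are denied.
func HasPermission(role string, perm Permission) bool {
	return rolePermissions[role][perm]
}

// CanDeleteDocument: blanket delete short-circuits; otherwise the
// ownership permission only applies to the uploader's own documents.
func CanDeleteDocument(role string, docUploaderID, userID string) bool {
	if HasPermission(role, PermDeleteDocuments) {
		return true
	}
	if HasPermission(role, PermDeleteOwnDocuments) {
		return docUploaderID == userID
	}
	return false
}

func main() {
	fmt.Println(CanDeleteDocument("partner", "u1", "u2"))   // true: any document
	fmt.Println(CanDeleteDocument("associate", "u1", "u1")) // true: own document
	fmt.Println(CanDeleteDocument("associate", "u1", "u2")) // false
	fmt.Println(CanDeleteDocument("secretary", "u1", "u1")) // false
}
```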
// permissionNames maps Permission constants to their string names for frontend use.
var permissionNames = map[Permission]string{
PermManageTeam: "manage_team",
PermManageBilling: "manage_billing",
PermCreateCase: "create_case",
PermEditAllCases: "edit_all_cases",
PermEditAssignedCase: "edit_assigned_case",
PermViewAllCases: "view_all_cases",
PermManageDeadlines: "manage_deadlines",
PermManageAppointments: "manage_appointments",
PermUploadDocuments: "upload_documents",
PermDeleteDocuments: "delete_documents",
PermDeleteOwnDocuments: "delete_own_documents",
PermViewAuditLog: "view_audit_log",
PermManageSettings: "manage_settings",
PermAIExtraction: "ai_extraction",
}
// GetRolePermissions returns a list of permission name strings for the given role.
func GetRolePermissions(role string) []string {
perms, ok := rolePermissions[role]
if !ok {
return nil
}
var names []string
for p := range perms {
if name, ok := permissionNames[p]; ok {
names = append(names, name)
}
}
return names
}
func writeJSONError(w http.ResponseWriter, msg string, status int) {
w.Header().Set("Content-Type", "application/json")
w.WriteHeader(status)
// msg is interpolated into the JSON body verbatim; callers must pass only
// static strings, since quotes or backslashes would break the payload.
w.Write([]byte(`{"error":"` + msg + `"}`))
}


@@ -2,20 +2,21 @@ package auth
import (
"context"
"fmt"
"log/slog"
"net/http"
"github.com/google/uuid"
)
// TenantLookup resolves the default tenant for a user.
// TenantLookup resolves and verifies tenant access for a user.
// Defined as an interface to avoid circular dependency with services.
type TenantLookup interface {
FirstTenantForUser(ctx context.Context, userID uuid.UUID) (*uuid.UUID, error)
GetUserRole(ctx context.Context, userID, tenantID uuid.UUID) (string, error)
}
// TenantResolver is middleware that resolves the tenant from X-Tenant-ID header
// or defaults to the user's first tenant.
// or defaults to the user's first tenant. Always verifies user has access.
type TenantResolver struct {
lookup TenantLookup
}
@@ -28,34 +29,59 @@ func (tr *TenantResolver) Resolve(next http.Handler) http.Handler {
return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
userID, ok := UserFromContext(r.Context())
if !ok {
http.Error(w, "unauthorized", http.StatusUnauthorized)
http.Error(w, `{"error":"unauthorized"}`, http.StatusUnauthorized)
return
}
var tenantID uuid.UUID
ctx := r.Context()
if header := r.Header.Get("X-Tenant-ID"); header != "" {
parsed, err := uuid.Parse(header)
if err != nil {
http.Error(w, fmt.Sprintf("invalid X-Tenant-ID: %v", err), http.StatusBadRequest)
http.Error(w, `{"error":"invalid X-Tenant-ID"}`, http.StatusBadRequest)
return
}
// Verify user has access and get their role
role, err := tr.lookup.GetUserRole(r.Context(), userID, parsed)
if err != nil {
slog.Error("tenant access check failed", "error", err, "user_id", userID, "tenant_id", parsed)
http.Error(w, `{"error":"internal error"}`, http.StatusInternalServerError)
return
}
if role == "" {
http.Error(w, `{"error":"no access to tenant"}`, http.StatusForbidden)
return
}
tenantID = parsed
ctx = ContextWithUserRole(ctx, role)
} else {
// Default to user's first tenant
first, err := tr.lookup.FirstTenantForUser(r.Context(), userID)
if err != nil {
http.Error(w, fmt.Sprintf("resolving tenant: %v", err), http.StatusInternalServerError)
slog.Error("failed to resolve default tenant", "error", err, "user_id", userID)
http.Error(w, `{"error":"internal error"}`, http.StatusInternalServerError)
return
}
if first == nil {
http.Error(w, "no tenant found for user", http.StatusBadRequest)
http.Error(w, `{"error":"no tenant found for user"}`, http.StatusBadRequest)
return
}
tenantID = *first
// Look up role for default tenant
role, err := tr.lookup.GetUserRole(r.Context(), userID, tenantID)
if err != nil {
slog.Error("failed to get user role", "error", err, "user_id", userID, "tenant_id", tenantID)
http.Error(w, `{"error":"internal error"}`, http.StatusInternalServerError)
return
}
ctx = ContextWithUserRole(ctx, role)
}
ctx := ContextWithTenantID(r.Context(), tenantID)
ctx = ContextWithTenantID(ctx, tenantID)
next.ServeHTTP(w, r.WithContext(ctx))
})
}


@@ -11,6 +11,7 @@ import (
type mockTenantLookup struct {
tenantID *uuid.UUID
role string
err error
}
@@ -18,17 +19,23 @@ func (m *mockTenantLookup) FirstTenantForUser(ctx context.Context, userID uuid.U
return m.tenantID, m.err
}
func (m *mockTenantLookup) GetUserRole(ctx context.Context, userID, tenantID uuid.UUID) (string, error) {
return m.role, m.err
}
func TestTenantResolver_FromHeader(t *testing.T) {
tenantID := uuid.New()
tr := NewTenantResolver(&mockTenantLookup{})
tr := NewTenantResolver(&mockTenantLookup{role: "partner"})
var gotTenantID uuid.UUID
var gotRole string
next := http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
id, ok := TenantFromContext(r.Context())
if !ok {
t.Fatal("tenant ID not in context")
}
gotTenantID = id
gotRole = UserRoleFromContext(r.Context())
w.WriteHeader(http.StatusOK)
})
@@ -45,11 +52,34 @@ func TestTenantResolver_FromHeader(t *testing.T) {
if gotTenantID != tenantID {
t.Errorf("expected tenant %s, got %s", tenantID, gotTenantID)
}
if gotRole != "partner" {
t.Errorf("expected role partner, got %s", gotRole)
}
}
func TestTenantResolver_FromHeader_NoAccess(t *testing.T) {
tenantID := uuid.New()
tr := NewTenantResolver(&mockTenantLookup{role: ""})
next := http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
t.Fatal("next should not be called")
})
r := httptest.NewRequest("GET", "/api/cases", nil)
r.Header.Set("X-Tenant-ID", tenantID.String())
r = r.WithContext(ContextWithUserID(r.Context(), uuid.New()))
w := httptest.NewRecorder()
tr.Resolve(next).ServeHTTP(w, r)
if w.Code != http.StatusForbidden {
t.Errorf("expected 403, got %d", w.Code)
}
}
func TestTenantResolver_DefaultsToFirst(t *testing.T) {
tenantID := uuid.New()
tr := NewTenantResolver(&mockTenantLookup{tenantID: &tenantID})
tr := NewTenantResolver(&mockTenantLookup{tenantID: &tenantID, role: "associate"})
var gotTenantID uuid.UUID
next := http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {


@@ -13,6 +13,14 @@ type Config struct {
SupabaseServiceKey string
SupabaseJWTSecret string
AnthropicAPIKey string
FrontendOrigin string
// SMTP settings (optional — email sending disabled if SMTPHost is empty)
SMTPHost string
SMTPPort string
SMTPUser string
SMTPPass string
MailFrom string
}
func Load() (*Config, error) {
@@ -24,6 +32,13 @@ func Load() (*Config, error) {
SupabaseServiceKey: os.Getenv("SUPABASE_SERVICE_KEY"),
SupabaseJWTSecret: os.Getenv("SUPABASE_JWT_SECRET"),
AnthropicAPIKey: os.Getenv("ANTHROPIC_API_KEY"),
FrontendOrigin: getEnv("FRONTEND_ORIGIN", "https://kanzlai.msbls.de"),
SMTPHost: os.Getenv("SMTP_HOST"),
SMTPPort: getEnv("SMTP_PORT", "465"),
SMTPUser: os.Getenv("SMTP_USER"),
SMTPPass: os.Getenv("SMTP_PASS"),
MailFrom: getEnv("MAIL_FROM", "mgmt@msbls.de"),
}
if cfg.DatabaseURL == "" {


@@ -13,8 +13,8 @@ func Connect(databaseURL string) (*sqlx.DB, error) {
return nil, fmt.Errorf("connecting to database: %w", err)
}
// Set search_path so queries use kanzlai schema by default
if _, err := db.Exec("SET search_path TO kanzlai, public"); err != nil {
// Set search_path so queries use mgmt schema by default
if _, err := db.Exec("SET search_path TO mgmt, public"); err != nil {
db.Close()
return nil, fmt.Errorf("setting search_path: %w", err)
}


@@ -5,18 +5,18 @@ import (
"io"
"net/http"
"github.com/jmoiron/sqlx"
"github.com/google/uuid"
"mgit.msbls.de/m/KanzlAI-mGMT/internal/auth"
"mgit.msbls.de/m/KanzlAI-mGMT/internal/services"
)
type AIHandler struct {
ai *services.AIService
db *sqlx.DB
}
func NewAIHandler(ai *services.AIService, db *sqlx.DB) *AIHandler {
return &AIHandler{ai: ai, db: db}
func NewAIHandler(ai *services.AIService) *AIHandler {
return &AIHandler{ai: ai}
}
// ExtractDeadlines handles POST /api/ai/extract-deadlines
@@ -61,10 +61,14 @@ func (h *AIHandler) ExtractDeadlines(w http.ResponseWriter, r *http.Request) {
writeError(w, http.StatusBadRequest, "provide either a PDF file or text")
return
}
if len(text) > maxDescriptionLen {
writeError(w, http.StatusBadRequest, "text exceeds maximum length")
return
}
deadlines, err := h.ai.ExtractDeadlines(r.Context(), pdfData, text)
if err != nil {
writeError(w, http.StatusInternalServerError, "AI extraction failed: "+err.Error())
internalError(w, "AI deadline extraction failed", err)
return
}
@@ -77,9 +81,9 @@ func (h *AIHandler) ExtractDeadlines(w http.ResponseWriter, r *http.Request) {
// SummarizeCase handles POST /api/ai/summarize-case
// Accepts JSON {"case_id": "uuid"}.
func (h *AIHandler) SummarizeCase(w http.ResponseWriter, r *http.Request) {
tenantID, err := resolveTenant(r, h.db)
if err != nil {
handleTenantError(w, err)
tenantID, ok := auth.TenantFromContext(r.Context())
if !ok {
writeError(w, http.StatusForbidden, "missing tenant")
return
}
@@ -104,7 +108,7 @@ func (h *AIHandler) SummarizeCase(w http.ResponseWriter, r *http.Request) {
summary, err := h.ai.SummarizeCase(r.Context(), tenantID, caseID)
if err != nil {
writeError(w, http.StatusInternalServerError, "AI summarization failed: "+err.Error())
internalError(w, "AI case summarization failed", err)
return
}
@@ -113,3 +117,139 @@ func (h *AIHandler) SummarizeCase(w http.ResponseWriter, r *http.Request) {
"summary": summary,
})
}
// DraftDocument handles POST /api/ai/draft-document
// Accepts JSON {"case_id": "uuid", "template_type": "string", "instructions": "string", "language": "de|en|fr"}.
func (h *AIHandler) DraftDocument(w http.ResponseWriter, r *http.Request) {
tenantID, ok := auth.TenantFromContext(r.Context())
if !ok {
writeError(w, http.StatusForbidden, "missing tenant")
return
}
var body struct {
CaseID string `json:"case_id"`
TemplateType string `json:"template_type"`
Instructions string `json:"instructions"`
Language string `json:"language"`
}
if err := json.NewDecoder(r.Body).Decode(&body); err != nil {
writeError(w, http.StatusBadRequest, "invalid request body")
return
}
if body.CaseID == "" {
writeError(w, http.StatusBadRequest, "case_id is required")
return
}
if body.TemplateType == "" {
writeError(w, http.StatusBadRequest, "template_type is required")
return
}
caseID, err := parseUUID(body.CaseID)
if err != nil {
writeError(w, http.StatusBadRequest, "invalid case_id")
return
}
if len(body.Instructions) > maxDescriptionLen {
writeError(w, http.StatusBadRequest, "instructions exceed maximum length")
return
}
draft, err := h.ai.DraftDocument(r.Context(), tenantID, caseID, body.TemplateType, body.Instructions, body.Language)
if err != nil {
internalError(w, "AI document drafting failed", err)
return
}
writeJSON(w, http.StatusOK, draft)
}
// CaseStrategy handles POST /api/ai/case-strategy
// Accepts JSON {"case_id": "uuid"}.
func (h *AIHandler) CaseStrategy(w http.ResponseWriter, r *http.Request) {
tenantID, ok := auth.TenantFromContext(r.Context())
if !ok {
writeError(w, http.StatusForbidden, "missing tenant")
return
}
var body struct {
CaseID string `json:"case_id"`
}
if err := json.NewDecoder(r.Body).Decode(&body); err != nil {
writeError(w, http.StatusBadRequest, "invalid request body")
return
}
if body.CaseID == "" {
writeError(w, http.StatusBadRequest, "case_id is required")
return
}
caseID, err := parseUUID(body.CaseID)
if err != nil {
writeError(w, http.StatusBadRequest, "invalid case_id")
return
}
strategy, err := h.ai.CaseStrategy(r.Context(), tenantID, caseID)
if err != nil {
internalError(w, "AI case strategy analysis failed", err)
return
}
writeJSON(w, http.StatusOK, strategy)
}
// SimilarCases handles POST /api/ai/similar-cases
// Accepts JSON {"case_id": "uuid", "description": "string"}.
func (h *AIHandler) SimilarCases(w http.ResponseWriter, r *http.Request) {
tenantID, ok := auth.TenantFromContext(r.Context())
if !ok {
writeError(w, http.StatusForbidden, "missing tenant")
return
}
var body struct {
CaseID string `json:"case_id"`
Description string `json:"description"`
}
if err := json.NewDecoder(r.Body).Decode(&body); err != nil {
writeError(w, http.StatusBadRequest, "invalid request body")
return
}
if body.CaseID == "" && body.Description == "" {
writeError(w, http.StatusBadRequest, "either case_id or description is required")
return
}
if len(body.Description) > maxDescriptionLen {
writeError(w, http.StatusBadRequest, "description exceeds maximum length")
return
}
var caseID uuid.UUID
if body.CaseID != "" {
var err error
caseID, err = parseUUID(body.CaseID)
if err != nil {
writeError(w, http.StatusBadRequest, "invalid case_id")
return
}
}
cases, err := h.ai.FindSimilarCases(r.Context(), tenantID, caseID, body.Description)
if err != nil {
internalError(w, "AI similar case search failed", err)
return
}
writeJSON(w, http.StatusOK, map[string]any{
"cases": cases,
"count": len(cases),
})
}


@@ -0,0 +1,74 @@
package handlers
import (
"bytes"
"encoding/json"
"net/http"
"net/http/httptest"
"testing"
)
func TestAIExtractDeadlines_EmptyInput(t *testing.T) {
h := &AIHandler{}
body := `{"text":""}`
r := httptest.NewRequest("POST", "/api/ai/extract-deadlines", bytes.NewBufferString(body))
r.Header.Set("Content-Type", "application/json")
w := httptest.NewRecorder()
h.ExtractDeadlines(w, r)
if w.Code != http.StatusBadRequest {
t.Errorf("expected 400, got %d", w.Code)
}
var resp map[string]string
json.NewDecoder(w.Body).Decode(&resp)
if resp["error"] != "provide either a PDF file or text" {
t.Errorf("unexpected error: %s", resp["error"])
}
}
func TestAIExtractDeadlines_InvalidJSON(t *testing.T) {
h := &AIHandler{}
r := httptest.NewRequest("POST", "/api/ai/extract-deadlines", bytes.NewBufferString(`{broken`))
r.Header.Set("Content-Type", "application/json")
w := httptest.NewRecorder()
h.ExtractDeadlines(w, r)
if w.Code != http.StatusBadRequest {
t.Errorf("expected 400, got %d", w.Code)
}
}
func TestAISummarizeCase_MissingTenant(t *testing.T) {
h := &AIHandler{}
body := `{"case_id":""}`
r := httptest.NewRequest("POST", "/api/ai/summarize-case", bytes.NewBufferString(body))
r.Header.Set("Content-Type", "application/json")
w := httptest.NewRecorder()
h.SummarizeCase(w, r)
// Without tenant context, TenantFromContext returns !ok → 403
if w.Code != http.StatusForbidden {
t.Errorf("expected 403, got %d", w.Code)
}
}
func TestAISummarizeCase_InvalidJSON(t *testing.T) {
h := &AIHandler{}
r := httptest.NewRequest("POST", "/api/ai/summarize-case", bytes.NewBufferString(`not-json`))
r.Header.Set("Content-Type", "application/json")
w := httptest.NewRecorder()
h.SummarizeCase(w, r)
// Without tenant context, TenantFromContext returns !ok → 403
if w.Code != http.StatusForbidden {
t.Errorf("expected 403, got %d", w.Code)
}
}


@@ -0,0 +1,196 @@
package handlers
import (
"bytes"
"encoding/json"
"net/http"
"net/http/httptest"
"testing"
"github.com/google/uuid"
"mgit.msbls.de/m/KanzlAI-mGMT/internal/auth"
)
func TestAppointmentCreate_NoTenant(t *testing.T) {
h := &AppointmentHandler{}
r := httptest.NewRequest("POST", "/api/appointments", bytes.NewBufferString(`{}`))
w := httptest.NewRecorder()
h.Create(w, r)
if w.Code != http.StatusUnauthorized {
t.Errorf("expected 401, got %d", w.Code)
}
}
func TestAppointmentCreate_MissingTitle(t *testing.T) {
h := &AppointmentHandler{}
body := `{"start_at":"2026-04-01T10:00:00Z"}`
r := httptest.NewRequest("POST", "/api/appointments", bytes.NewBufferString(body))
ctx := auth.ContextWithTenantID(
auth.ContextWithUserID(r.Context(), uuid.New()),
uuid.New(),
)
r = r.WithContext(ctx)
w := httptest.NewRecorder()
h.Create(w, r)
if w.Code != http.StatusBadRequest {
t.Errorf("expected 400, got %d", w.Code)
}
var resp map[string]string
json.NewDecoder(w.Body).Decode(&resp)
if resp["error"] != "title is required" {
t.Errorf("unexpected error: %s", resp["error"])
}
}
func TestAppointmentCreate_MissingStartAt(t *testing.T) {
h := &AppointmentHandler{}
body := `{"title":"Test Appointment"}`
r := httptest.NewRequest("POST", "/api/appointments", bytes.NewBufferString(body))
ctx := auth.ContextWithTenantID(
auth.ContextWithUserID(r.Context(), uuid.New()),
uuid.New(),
)
r = r.WithContext(ctx)
w := httptest.NewRecorder()
h.Create(w, r)
if w.Code != http.StatusBadRequest {
t.Errorf("expected 400, got %d", w.Code)
}
var resp map[string]string
json.NewDecoder(w.Body).Decode(&resp)
if resp["error"] != "start_at is required" {
t.Errorf("unexpected error: %s", resp["error"])
}
}
func TestAppointmentCreate_InvalidJSON(t *testing.T) {
h := &AppointmentHandler{}
r := httptest.NewRequest("POST", "/api/appointments", bytes.NewBufferString(`{broken`))
ctx := auth.ContextWithTenantID(
auth.ContextWithUserID(r.Context(), uuid.New()),
uuid.New(),
)
r = r.WithContext(ctx)
w := httptest.NewRecorder()
h.Create(w, r)
if w.Code != http.StatusBadRequest {
t.Errorf("expected 400, got %d", w.Code)
}
}
func TestAppointmentList_NoTenant(t *testing.T) {
h := &AppointmentHandler{}
r := httptest.NewRequest("GET", "/api/appointments", nil)
w := httptest.NewRecorder()
h.List(w, r)
if w.Code != http.StatusUnauthorized {
t.Errorf("expected 401, got %d", w.Code)
}
}
func TestAppointmentUpdate_NoTenant(t *testing.T) {
h := &AppointmentHandler{}
r := httptest.NewRequest("PUT", "/api/appointments/"+uuid.New().String(), bytes.NewBufferString(`{}`))
r.SetPathValue("id", uuid.New().String())
w := httptest.NewRecorder()
h.Update(w, r)
if w.Code != http.StatusUnauthorized {
t.Errorf("expected 401, got %d", w.Code)
}
}
func TestAppointmentUpdate_InvalidID(t *testing.T) {
h := &AppointmentHandler{}
r := httptest.NewRequest("PUT", "/api/appointments/not-uuid", bytes.NewBufferString(`{}`))
r.SetPathValue("id", "not-uuid")
ctx := auth.ContextWithTenantID(
auth.ContextWithUserID(r.Context(), uuid.New()),
uuid.New(),
)
r = r.WithContext(ctx)
w := httptest.NewRecorder()
h.Update(w, r)
if w.Code != http.StatusBadRequest {
t.Errorf("expected 400, got %d", w.Code)
}
}
func TestAppointmentDelete_NoTenant(t *testing.T) {
h := &AppointmentHandler{}
r := httptest.NewRequest("DELETE", "/api/appointments/"+uuid.New().String(), nil)
r.SetPathValue("id", uuid.New().String())
w := httptest.NewRecorder()
h.Delete(w, r)
if w.Code != http.StatusUnauthorized {
t.Errorf("expected 401, got %d", w.Code)
}
}
func TestAppointmentDelete_InvalidID(t *testing.T) {
h := &AppointmentHandler{}
r := httptest.NewRequest("DELETE", "/api/appointments/bad", nil)
r.SetPathValue("id", "bad")
ctx := auth.ContextWithTenantID(
auth.ContextWithUserID(r.Context(), uuid.New()),
uuid.New(),
)
r = r.WithContext(ctx)
w := httptest.NewRecorder()
h.Delete(w, r)
if w.Code != http.StatusBadRequest {
t.Errorf("expected 400, got %d", w.Code)
}
}
func TestAppointmentList_InvalidCaseID(t *testing.T) {
h := &AppointmentHandler{}
r := httptest.NewRequest("GET", "/api/appointments?case_id=bad", nil)
ctx := auth.ContextWithTenantID(
auth.ContextWithUserID(r.Context(), uuid.New()),
uuid.New(),
)
r = r.WithContext(ctx)
w := httptest.NewRecorder()
h.List(w, r)
if w.Code != http.StatusBadRequest {
t.Errorf("expected 400, got %d", w.Code)
}
}
func TestAppointmentList_InvalidStartFrom(t *testing.T) {
h := &AppointmentHandler{}
r := httptest.NewRequest("GET", "/api/appointments?start_from=not-a-date", nil)
ctx := auth.ContextWithTenantID(
auth.ContextWithUserID(r.Context(), uuid.New()),
uuid.New(),
)
r = r.WithContext(ctx)
w := httptest.NewRecorder()
h.List(w, r)
if w.Code != http.StatusBadRequest {
t.Errorf("expected 400, got %d", w.Code)
}
}


@@ -22,6 +22,33 @@ func NewAppointmentHandler(svc *services.AppointmentService) *AppointmentHandler
return &AppointmentHandler{svc: svc}
}
// Get handles GET /api/appointments/{id}
func (h *AppointmentHandler) Get(w http.ResponseWriter, r *http.Request) {
tenantID, ok := auth.TenantFromContext(r.Context())
if !ok {
writeError(w, http.StatusUnauthorized, "missing tenant")
return
}
id, err := uuid.Parse(r.PathValue("id"))
if err != nil {
writeError(w, http.StatusBadRequest, "invalid appointment id")
return
}
appt, err := h.svc.GetByID(r.Context(), tenantID, id)
if err != nil {
if errors.Is(err, sql.ErrNoRows) {
writeError(w, http.StatusNotFound, "appointment not found")
return
}
writeError(w, http.StatusInternalServerError, "failed to fetch appointment")
return
}
writeJSON(w, http.StatusOK, appt)
}
func (h *AppointmentHandler) List(w http.ResponseWriter, r *http.Request) {
tenantID, ok := auth.TenantFromContext(r.Context())
if !ok {
@@ -94,6 +121,10 @@ func (h *AppointmentHandler) Create(w http.ResponseWriter, r *http.Request) {
writeError(w, http.StatusBadRequest, "title is required")
return
}
if msg := validateStringLength("title", req.Title, maxTitleLen); msg != "" {
writeError(w, http.StatusBadRequest, msg)
return
}
if req.StartAt.IsZero() {
writeError(w, http.StatusBadRequest, "start_at is required")
return
@@ -161,6 +192,10 @@ func (h *AppointmentHandler) Update(w http.ResponseWriter, r *http.Request) {
writeError(w, http.StatusBadRequest, "title is required")
return
}
if msg := validateStringLength("title", req.Title, maxTitleLen); msg != "" {
writeError(w, http.StatusBadRequest, msg)
return
}
if req.StartAt.IsZero() {
writeError(w, http.StatusBadRequest, "start_at is required")
return


@@ -0,0 +1,63 @@
package handlers
import (
"net/http"
"strconv"
"github.com/google/uuid"
"mgit.msbls.de/m/KanzlAI-mGMT/internal/auth"
"mgit.msbls.de/m/KanzlAI-mGMT/internal/services"
)
type AuditLogHandler struct {
svc *services.AuditService
}
func NewAuditLogHandler(svc *services.AuditService) *AuditLogHandler {
return &AuditLogHandler{svc: svc}
}
func (h *AuditLogHandler) List(w http.ResponseWriter, r *http.Request) {
tenantID, ok := auth.TenantFromContext(r.Context())
if !ok {
writeError(w, http.StatusForbidden, "missing tenant")
return
}
q := r.URL.Query()
page, _ := strconv.Atoi(q.Get("page"))
limit, _ := strconv.Atoi(q.Get("limit"))
filter := services.AuditFilter{
EntityType: q.Get("entity_type"),
From: q.Get("from"),
To: q.Get("to"),
Page: page,
Limit: limit,
}
if idStr := q.Get("entity_id"); idStr != "" {
if id, err := uuid.Parse(idStr); err == nil {
filter.EntityID = &id
}
}
if idStr := q.Get("user_id"); idStr != "" {
if id, err := uuid.Parse(idStr); err == nil {
filter.UserID = &id
}
}
entries, total, err := h.svc.List(r.Context(), tenantID, filter)
if err != nil {
writeError(w, http.StatusInternalServerError, "failed to fetch audit log")
return
}
writeJSON(w, http.StatusOK, map[string]any{
"entries": entries,
"total": total,
"page": filter.Page,
"limit": filter.Limit,
})
}


@@ -0,0 +1,66 @@
package handlers
import (
"encoding/json"
"net/http"
"mgit.msbls.de/m/KanzlAI-mGMT/internal/auth"
"mgit.msbls.de/m/KanzlAI-mGMT/internal/services"
)
type BillingRateHandler struct {
svc *services.BillingRateService
}
func NewBillingRateHandler(svc *services.BillingRateService) *BillingRateHandler {
return &BillingRateHandler{svc: svc}
}
// List handles GET /api/billing-rates
func (h *BillingRateHandler) List(w http.ResponseWriter, r *http.Request) {
tenantID, ok := auth.TenantFromContext(r.Context())
if !ok {
writeError(w, http.StatusForbidden, "missing tenant")
return
}
rates, err := h.svc.List(r.Context(), tenantID)
if err != nil {
internalError(w, "failed to list billing rates", err)
return
}
writeJSON(w, http.StatusOK, map[string]any{"billing_rates": rates})
}
// Upsert handles PUT /api/billing-rates
func (h *BillingRateHandler) Upsert(w http.ResponseWriter, r *http.Request) {
tenantID, ok := auth.TenantFromContext(r.Context())
if !ok {
writeError(w, http.StatusForbidden, "missing tenant")
return
}
var input services.UpsertBillingRateInput
if err := json.NewDecoder(r.Body).Decode(&input); err != nil {
writeError(w, http.StatusBadRequest, "invalid JSON body")
return
}
if input.Rate < 0 {
writeError(w, http.StatusBadRequest, "rate must be non-negative")
return
}
if input.ValidFrom == "" {
writeError(w, http.StatusBadRequest, "valid_from is required")
return
}
rate, err := h.svc.Upsert(r.Context(), tenantID, input)
if err != nil {
internalError(w, "failed to upsert billing rate", err)
return
}
writeJSON(w, http.StatusOK, rate)
}


@@ -0,0 +1,83 @@
package handlers
import (
"bytes"
"encoding/json"
"net/http"
"net/http/httptest"
"testing"
)
func TestCalculate_MissingFields(t *testing.T) {
h := &CalculateHandlers{}
tests := []struct {
name string
body string
want string
}{
{
name: "empty body",
body: `{}`,
want: "proceeding_type and trigger_event_date are required",
},
{
name: "missing trigger_event_date",
body: `{"proceeding_type":"INF"}`,
want: "proceeding_type and trigger_event_date are required",
},
{
name: "missing proceeding_type",
body: `{"trigger_event_date":"2026-06-01"}`,
want: "proceeding_type and trigger_event_date are required",
},
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
r := httptest.NewRequest("POST", "/api/deadlines/calculate", bytes.NewBufferString(tt.body))
w := httptest.NewRecorder()
h.Calculate(w, r)
if w.Code != http.StatusBadRequest {
t.Errorf("expected 400, got %d", w.Code)
}
var resp map[string]string
json.NewDecoder(w.Body).Decode(&resp)
if resp["error"] != tt.want {
t.Errorf("expected error %q, got %q", tt.want, resp["error"])
}
})
}
}
func TestCalculate_InvalidDateFormat(t *testing.T) {
h := &CalculateHandlers{}
body := `{"proceeding_type":"INF","trigger_event_date":"01-06-2026"}`
r := httptest.NewRequest("POST", "/api/deadlines/calculate", bytes.NewBufferString(body))
w := httptest.NewRecorder()
h.Calculate(w, r)
if w.Code != http.StatusBadRequest {
t.Errorf("expected 400, got %d", w.Code)
}
var resp map[string]string
json.NewDecoder(w.Body).Decode(&resp)
if resp["error"] != "invalid trigger_event_date format, expected YYYY-MM-DD" {
t.Errorf("unexpected error: %s", resp["error"])
}
}
func TestCalculate_InvalidJSON(t *testing.T) {
h := &CalculateHandlers{}
r := httptest.NewRequest("POST", "/api/deadlines/calculate", bytes.NewBufferString(`not-json`))
w := httptest.NewRecorder()
h.Calculate(w, r)
if w.Code != http.StatusBadRequest {
t.Errorf("expected 400, got %d", w.Code)
}
}
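
The handler behavior these tests pin down (invalid JSON, then required fields, then strict date parsing) can be sketched with stdlib tools alone; Go's `time.Parse` expresses YYYY-MM-DD via the reference layout `"2006-01-02"`. This is an illustrative stand-in, not the real `CalculateHandlers` implementation:

```go
package main

import (
	"encoding/json"
	"fmt"
	"time"
)

// validateCalculateBody mirrors the validation order the tests assert on.
// It returns the error message a handler would send with a 400 status,
// or "" when the body passes all checks.
func validateCalculateBody(body []byte) string {
	var req struct {
		ProceedingType   string `json:"proceeding_type"`
		TriggerEventDate string `json:"trigger_event_date"`
	}
	if err := json.Unmarshal(body, &req); err != nil {
		return "invalid request body"
	}
	if req.ProceedingType == "" || req.TriggerEventDate == "" {
		return "proceeding_type and trigger_event_date are required"
	}
	// time.Parse's reference layout "2006-01-02" enforces zero-padded YYYY-MM-DD.
	if _, err := time.Parse("2006-01-02", req.TriggerEventDate); err != nil {
		return "invalid trigger_event_date format, expected YYYY-MM-DD"
	}
	return ""
}

func main() {
	fmt.Println(validateCalculateBody([]byte(`{"proceeding_type":"INF","trigger_event_date":"01-06-2026"}`)))
}
```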

View File

@@ -0,0 +1,68 @@
package handlers
import (
"net/http"
"mgit.msbls.de/m/KanzlAI-mGMT/internal/auth"
"mgit.msbls.de/m/KanzlAI-mGMT/internal/services"
)
// CalDAVHandler handles CalDAV sync HTTP endpoints.
type CalDAVHandler struct {
svc *services.CalDAVService
}
// NewCalDAVHandler creates a new CalDAV handler.
func NewCalDAVHandler(svc *services.CalDAVService) *CalDAVHandler {
return &CalDAVHandler{svc: svc}
}
// TriggerSync handles POST /api/caldav/sync — triggers a full sync for the current tenant.
func (h *CalDAVHandler) TriggerSync(w http.ResponseWriter, r *http.Request) {
tenantID, ok := auth.TenantFromContext(r.Context())
if !ok {
writeError(w, http.StatusUnauthorized, "no tenant context")
return
}
cfg, err := h.svc.LoadTenantConfig(tenantID)
if err != nil {
writeError(w, http.StatusBadRequest, "CalDAV not configured for this tenant")
return
}
status, err := h.svc.SyncTenant(r.Context(), tenantID, *cfg)
if err != nil {
// Still return the status — it contains partial results + error info
writeJSON(w, http.StatusOK, map[string]any{
"status": "completed_with_errors",
"sync": status,
})
return
}
writeJSON(w, http.StatusOK, map[string]any{
"status": "ok",
"sync": status,
})
}
// GetStatus handles GET /api/caldav/status — returns last sync status.
func (h *CalDAVHandler) GetStatus(w http.ResponseWriter, r *http.Request) {
tenantID, ok := auth.TenantFromContext(r.Context())
if !ok {
writeError(w, http.StatusUnauthorized, "no tenant context")
return
}
status := h.svc.GetStatus(tenantID)
if status == nil {
writeJSON(w, http.StatusOK, map[string]any{
"status": "no_sync_yet",
"last_sync_at": nil,
})
return
}
writeJSON(w, http.StatusOK, status)
}

View File

@@ -0,0 +1,119 @@
package handlers
import (
"encoding/json"
"net/http"
"github.com/google/uuid"
"mgit.msbls.de/m/KanzlAI-mGMT/internal/auth"
"mgit.msbls.de/m/KanzlAI-mGMT/internal/services"
)
type CaseAssignmentHandler struct {
svc *services.CaseAssignmentService
}
func NewCaseAssignmentHandler(svc *services.CaseAssignmentService) *CaseAssignmentHandler {
return &CaseAssignmentHandler{svc: svc}
}
// List handles GET /api/cases/{id}/assignments
func (h *CaseAssignmentHandler) List(w http.ResponseWriter, r *http.Request) {
tenantID, ok := auth.TenantFromContext(r.Context())
if !ok {
writeError(w, http.StatusForbidden, "missing tenant")
return
}
caseID, err := uuid.Parse(r.PathValue("id"))
if err != nil {
writeError(w, http.StatusBadRequest, "invalid case ID")
return
}
assignments, err := h.svc.ListByCase(r.Context(), tenantID, caseID)
if err != nil {
internalError(w, "failed to list assignments", err)
return
}
writeJSON(w, http.StatusOK, map[string]any{
"assignments": assignments,
"total": len(assignments),
})
}
// Assign handles POST /api/cases/{id}/assignments
func (h *CaseAssignmentHandler) Assign(w http.ResponseWriter, r *http.Request) {
tenantID, ok := auth.TenantFromContext(r.Context())
if !ok {
writeError(w, http.StatusForbidden, "missing tenant")
return
}
caseID, err := uuid.Parse(r.PathValue("id"))
if err != nil {
writeError(w, http.StatusBadRequest, "invalid case ID")
return
}
var req struct {
UserID string `json:"user_id"`
Role string `json:"role"`
}
if err := json.NewDecoder(r.Body).Decode(&req); err != nil {
writeError(w, http.StatusBadRequest, "invalid request body")
return
}
userID, err := uuid.Parse(req.UserID)
if err != nil {
writeError(w, http.StatusBadRequest, "invalid user_id")
return
}
if req.Role == "" {
req.Role = "team"
}
if req.Role != "lead" && req.Role != "team" && req.Role != "viewer" {
writeError(w, http.StatusBadRequest, "role must be lead, team, or viewer")
return
}
assignment, err := h.svc.Assign(r.Context(), tenantID, caseID, userID, req.Role)
if err != nil {
writeError(w, http.StatusBadRequest, err.Error())
return
}
writeJSON(w, http.StatusCreated, assignment)
}
// Unassign handles DELETE /api/cases/{id}/assignments/{uid}
func (h *CaseAssignmentHandler) Unassign(w http.ResponseWriter, r *http.Request) {
tenantID, ok := auth.TenantFromContext(r.Context())
if !ok {
writeError(w, http.StatusForbidden, "missing tenant")
return
}
caseID, err := uuid.Parse(r.PathValue("id"))
if err != nil {
writeError(w, http.StatusBadRequest, "invalid case ID")
return
}
userID, err := uuid.Parse(r.PathValue("uid"))
if err != nil {
writeError(w, http.StatusBadRequest, "invalid user ID")
return
}
if err := h.svc.Unassign(r.Context(), tenantID, caseID, userID); err != nil {
writeError(w, http.StatusNotFound, "assignment not found")
return
}
writeJSON(w, http.StatusOK, map[string]string{"status": "removed"})
}
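
The role check in `Assign` is a chain of `!=` comparisons; an equivalent map-based allowlist keeps the valid set in one place if more roles are added later. A sketch (`validRoles` and `normalizeRole` are names invented here, not from the repo):

```go
package main

import "fmt"

// validRoles is the allowlist equivalent of
// role != "lead" && role != "team" && role != "viewer".
var validRoles = map[string]bool{
	"lead":   true,
	"team":   true,
	"viewer": true,
}

// normalizeRole applies the handler's defaulting rule ("" becomes "team")
// and reports whether the result is an allowed role.
func normalizeRole(role string) (string, bool) {
	if role == "" {
		role = "team"
	}
	return role, validRoles[role]
}

func main() {
	role, ok := normalizeRole("")
	fmt.Println(role, ok)
}
```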

View File

@@ -0,0 +1,52 @@
package handlers
import (
"database/sql"
"errors"
"net/http"
"github.com/google/uuid"
"mgit.msbls.de/m/KanzlAI-mGMT/internal/auth"
"mgit.msbls.de/m/KanzlAI-mGMT/internal/models"
"github.com/jmoiron/sqlx"
)
type CaseEventHandler struct {
db *sqlx.DB
}
func NewCaseEventHandler(db *sqlx.DB) *CaseEventHandler {
return &CaseEventHandler{db: db}
}
// Get handles GET /api/case-events/{id}
func (h *CaseEventHandler) Get(w http.ResponseWriter, r *http.Request) {
tenantID, ok := auth.TenantFromContext(r.Context())
if !ok {
writeError(w, http.StatusUnauthorized, "missing tenant")
return
}
eventID, err := uuid.Parse(r.PathValue("id"))
if err != nil {
writeError(w, http.StatusBadRequest, "invalid event ID")
return
}
var event models.CaseEvent
err = h.db.GetContext(r.Context(), &event,
`SELECT id, tenant_id, case_id, event_type, title, description, event_date, created_by, metadata, created_at, updated_at
FROM case_events
WHERE id = $1 AND tenant_id = $2`, eventID, tenantID)
if err != nil {
if errors.Is(err, sql.ErrNoRows) {
writeError(w, http.StatusNotFound, "case event not found")
return
}
internalError(w, "failed to fetch case event", err)
return
}
writeJSON(w, http.StatusOK, event)
}
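
`h.db.GetContext` relies on sqlx mapping the selected columns onto `models.CaseEvent` via `db` struct tags. The real struct is not in this diff; a minimal shape consistent with the SELECT column list might look like the following (field types are assumptions):

```go
package main

import (
	"fmt"
	"reflect"
	"time"
)

// CaseEvent sketches the fields implied by the SELECT column list in Get.
// sqlx matches each `db` tag against a result column of the same name.
type CaseEvent struct {
	ID          string    `db:"id" json:"id"`
	TenantID    string    `db:"tenant_id" json:"tenant_id"`
	CaseID      string    `db:"case_id" json:"case_id"`
	EventType   string    `db:"event_type" json:"event_type"`
	Title       string    `db:"title" json:"title"`
	Description *string   `db:"description" json:"description,omitempty"`
	EventDate   time.Time `db:"event_date" json:"event_date"`
	CreatedBy   *string   `db:"created_by" json:"created_by,omitempty"`
	Metadata    []byte    `db:"metadata" json:"metadata,omitempty"`
	CreatedAt   time.Time `db:"created_at" json:"created_at"`
	UpdatedAt   time.Time `db:"updated_at" json:"updated_at"`
}

// dbColumns lists the db tags in field order, i.e. the columns sqlx expects.
func dbColumns() []string {
	t := reflect.TypeOf(CaseEvent{})
	cols := make([]string, 0, t.NumField())
	for i := 0; i < t.NumField(); i++ {
		cols = append(cols, t.Field(i).Tag.Get("db"))
	}
	return cols
}

func main() {
	fmt.Println(dbColumns())
}
```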

View File

@@ -0,0 +1,177 @@
package handlers
import (
"bytes"
"encoding/json"
"net/http"
"net/http/httptest"
"testing"
"github.com/google/uuid"
"mgit.msbls.de/m/KanzlAI-mGMT/internal/auth"
)
func TestCaseCreate_NoAuth(t *testing.T) {
h := &CaseHandler{}
r := httptest.NewRequest("POST", "/api/cases", bytes.NewBufferString(`{}`))
w := httptest.NewRecorder()
h.Create(w, r)
if w.Code != http.StatusForbidden {
t.Errorf("expected 403, got %d", w.Code)
}
}
func TestCaseCreate_MissingFields(t *testing.T) {
h := &CaseHandler{}
body := `{"case_number":"","title":""}`
r := httptest.NewRequest("POST", "/api/cases", bytes.NewBufferString(body))
ctx := auth.ContextWithTenantID(
auth.ContextWithUserID(r.Context(), uuid.New()),
uuid.New(),
)
r = r.WithContext(ctx)
w := httptest.NewRecorder()
h.Create(w, r)
if w.Code != http.StatusBadRequest {
t.Errorf("expected 400, got %d", w.Code)
}
var resp map[string]string
if err := json.NewDecoder(w.Body).Decode(&resp); err != nil {
t.Fatalf("failed to decode response: %v", err)
}
if resp["error"] != "case_number and title are required" {
t.Errorf("unexpected error: %s", resp["error"])
}
}
func TestCaseCreate_InvalidJSON(t *testing.T) {
h := &CaseHandler{}
r := httptest.NewRequest("POST", "/api/cases", bytes.NewBufferString(`not-json`))
ctx := auth.ContextWithTenantID(
auth.ContextWithUserID(r.Context(), uuid.New()),
uuid.New(),
)
r = r.WithContext(ctx)
w := httptest.NewRecorder()
h.Create(w, r)
if w.Code != http.StatusBadRequest {
t.Errorf("expected 400, got %d", w.Code)
}
}
func TestCaseGet_InvalidID(t *testing.T) {
h := &CaseHandler{}
r := httptest.NewRequest("GET", "/api/cases/not-a-uuid", nil)
r.SetPathValue("id", "not-a-uuid")
ctx := auth.ContextWithTenantID(
auth.ContextWithUserID(r.Context(), uuid.New()),
uuid.New(),
)
r = r.WithContext(ctx)
w := httptest.NewRecorder()
h.Get(w, r)
if w.Code != http.StatusBadRequest {
t.Errorf("expected 400, got %d", w.Code)
}
}
func TestCaseGet_NoTenant(t *testing.T) {
h := &CaseHandler{}
r := httptest.NewRequest("GET", "/api/cases/"+uuid.New().String(), nil)
r.SetPathValue("id", uuid.New().String())
w := httptest.NewRecorder()
h.Get(w, r)
if w.Code != http.StatusForbidden {
t.Errorf("expected 403, got %d", w.Code)
}
}
func TestCaseList_NoTenant(t *testing.T) {
h := &CaseHandler{}
r := httptest.NewRequest("GET", "/api/cases", nil)
w := httptest.NewRecorder()
h.List(w, r)
if w.Code != http.StatusForbidden {
t.Errorf("expected 403, got %d", w.Code)
}
}
func TestCaseUpdate_InvalidID(t *testing.T) {
h := &CaseHandler{}
body := `{"title":"Updated"}`
r := httptest.NewRequest("PUT", "/api/cases/bad-id", bytes.NewBufferString(body))
r.SetPathValue("id", "bad-id")
ctx := auth.ContextWithTenantID(
auth.ContextWithUserID(r.Context(), uuid.New()),
uuid.New(),
)
r = r.WithContext(ctx)
w := httptest.NewRecorder()
h.Update(w, r)
if w.Code != http.StatusBadRequest {
t.Errorf("expected 400, got %d", w.Code)
}
}
func TestCaseUpdate_InvalidJSON(t *testing.T) {
h := &CaseHandler{}
caseID := uuid.New().String()
r := httptest.NewRequest("PUT", "/api/cases/"+caseID, bytes.NewBufferString(`{bad`))
r.SetPathValue("id", caseID)
ctx := auth.ContextWithTenantID(
auth.ContextWithUserID(r.Context(), uuid.New()),
uuid.New(),
)
r = r.WithContext(ctx)
w := httptest.NewRecorder()
h.Update(w, r)
if w.Code != http.StatusBadRequest {
t.Errorf("expected 400, got %d", w.Code)
}
}
func TestCaseDelete_NoTenant(t *testing.T) {
h := &CaseHandler{}
r := httptest.NewRequest("DELETE", "/api/cases/"+uuid.New().String(), nil)
r.SetPathValue("id", uuid.New().String())
w := httptest.NewRecorder()
h.Delete(w, r)
if w.Code != http.StatusForbidden {
t.Errorf("expected 403, got %d", w.Code)
}
}
func TestCaseDelete_InvalidID(t *testing.T) {
h := &CaseHandler{}
r := httptest.NewRequest("DELETE", "/api/cases/bad-id", nil)
r.SetPathValue("id", "bad-id")
ctx := auth.ContextWithTenantID(
auth.ContextWithUserID(r.Context(), uuid.New()),
uuid.New(),
)
r = r.WithContext(ctx)
w := httptest.NewRecorder()
h.Delete(w, r)
if w.Code != http.StatusBadRequest {
t.Errorf("expected 400, got %d", w.Code)
}
}

View File

@@ -28,18 +28,25 @@ func (h *CaseHandler) List(w http.ResponseWriter, r *http.Request) {
limit, _ := strconv.Atoi(r.URL.Query().Get("limit"))
offset, _ := strconv.Atoi(r.URL.Query().Get("offset"))
limit, offset = clampPagination(limit, offset)
search := r.URL.Query().Get("search")
if msg := validateStringLength("search", search, maxSearchLen); msg != "" {
writeError(w, http.StatusBadRequest, msg)
return
}
filter := services.CaseFilter{
Status: r.URL.Query().Get("status"),
Type: r.URL.Query().Get("type"),
Search: r.URL.Query().Get("search"),
Search: search,
Limit: limit,
Offset: offset,
}
cases, total, err := h.svc.List(r.Context(), tenantID, filter)
if err != nil {
writeError(w, http.StatusInternalServerError, err.Error())
internalError(w, "failed to list cases", err)
return
}
@@ -66,10 +73,18 @@ func (h *CaseHandler) Create(w http.ResponseWriter, r *http.Request) {
writeError(w, http.StatusBadRequest, "case_number and title are required")
return
}
if msg := validateStringLength("case_number", input.CaseNumber, maxCaseNumberLen); msg != "" {
writeError(w, http.StatusBadRequest, msg)
return
}
if msg := validateStringLength("title", input.Title, maxTitleLen); msg != "" {
writeError(w, http.StatusBadRequest, msg)
return
}
c, err := h.svc.Create(r.Context(), tenantID, userID, input)
if err != nil {
writeError(w, http.StatusInternalServerError, err.Error())
internalError(w, "failed to create case", err)
return
}
@@ -91,7 +106,7 @@ func (h *CaseHandler) Get(w http.ResponseWriter, r *http.Request) {
detail, err := h.svc.GetByID(r.Context(), tenantID, caseID)
if err != nil {
writeError(w, http.StatusInternalServerError, err.Error())
internalError(w, "failed to get case", err)
return
}
if detail == nil {
@@ -121,10 +136,22 @@ func (h *CaseHandler) Update(w http.ResponseWriter, r *http.Request) {
writeError(w, http.StatusBadRequest, "invalid JSON body")
return
}
if input.Title != nil {
if msg := validateStringLength("title", *input.Title, maxTitleLen); msg != "" {
writeError(w, http.StatusBadRequest, msg)
return
}
}
if input.CaseNumber != nil {
if msg := validateStringLength("case_number", *input.CaseNumber, maxCaseNumberLen); msg != "" {
writeError(w, http.StatusBadRequest, msg)
return
}
}
updated, err := h.svc.Update(r.Context(), tenantID, caseID, userID, input)
if err != nil {
writeError(w, http.StatusInternalServerError, err.Error())
internalError(w, "failed to update case", err)
return
}
if updated == nil {

View File

@@ -24,7 +24,7 @@ func (h *DashboardHandler) Get(w http.ResponseWriter, r *http.Request) {
data, err := h.svc.Get(r.Context(), tenantID)
if err != nil {
writeError(w, http.StatusInternalServerError, err.Error())
internalError(w, "failed to load dashboard", err)
return
}

View File

@@ -0,0 +1,19 @@
package handlers
import (
"net/http"
"net/http/httptest"
"testing"
)
func TestDashboardGet_NoTenant(t *testing.T) {
h := &DashboardHandler{}
r := httptest.NewRequest("GET", "/api/dashboard", nil)
w := httptest.NewRecorder()
h.Get(w, r)
if w.Code != http.StatusForbidden {
t.Errorf("expected 403, got %d", w.Code)
}
}

View File

@@ -4,33 +4,58 @@ import (
"encoding/json"
"net/http"
"github.com/jmoiron/sqlx"
"mgit.msbls.de/m/KanzlAI-mGMT/internal/auth"
"mgit.msbls.de/m/KanzlAI-mGMT/internal/services"
)
// DeadlineHandlers holds handlers for deadline CRUD endpoints
type DeadlineHandlers struct {
deadlines *services.DeadlineService
db *sqlx.DB
}
// NewDeadlineHandlers creates deadline handlers
func NewDeadlineHandlers(ds *services.DeadlineService, db *sqlx.DB) *DeadlineHandlers {
return &DeadlineHandlers{deadlines: ds, db: db}
func NewDeadlineHandlers(ds *services.DeadlineService) *DeadlineHandlers {
return &DeadlineHandlers{deadlines: ds}
}
// Get handles GET /api/deadlines/{deadlineID}
func (h *DeadlineHandlers) Get(w http.ResponseWriter, r *http.Request) {
tenantID, ok := auth.TenantFromContext(r.Context())
if !ok {
writeError(w, http.StatusForbidden, "missing tenant")
return
}
deadlineID, err := parsePathUUID(r, "deadlineID")
if err != nil {
writeError(w, http.StatusBadRequest, "invalid deadline ID")
return
}
deadline, err := h.deadlines.GetByID(tenantID, deadlineID)
if err != nil {
internalError(w, "failed to fetch deadline", err)
return
}
if deadline == nil {
writeError(w, http.StatusNotFound, "deadline not found")
return
}
writeJSON(w, http.StatusOK, deadline)
}
// ListAll handles GET /api/deadlines
func (h *DeadlineHandlers) ListAll(w http.ResponseWriter, r *http.Request) {
tenantID, err := resolveTenant(r, h.db)
if err != nil {
handleTenantError(w, err)
tenantID, ok := auth.TenantFromContext(r.Context())
if !ok {
writeError(w, http.StatusForbidden, "missing tenant")
return
}
deadlines, err := h.deadlines.ListAll(tenantID)
if err != nil {
writeError(w, http.StatusInternalServerError, "failed to list deadlines")
internalError(w, "failed to list deadlines", err)
return
}
@@ -39,9 +64,9 @@ func (h *DeadlineHandlers) ListAll(w http.ResponseWriter, r *http.Request) {
// ListForCase handles GET /api/cases/{caseID}/deadlines
func (h *DeadlineHandlers) ListForCase(w http.ResponseWriter, r *http.Request) {
tenantID, err := resolveTenant(r, h.db)
if err != nil {
handleTenantError(w, err)
tenantID, ok := auth.TenantFromContext(r.Context())
if !ok {
writeError(w, http.StatusForbidden, "missing tenant")
return
}
@@ -53,7 +78,7 @@ func (h *DeadlineHandlers) ListForCase(w http.ResponseWriter, r *http.Request) {
deadlines, err := h.deadlines.ListForCase(tenantID, caseID)
if err != nil {
writeError(w, http.StatusInternalServerError, "failed to list deadlines")
internalError(w, "failed to list deadlines for case", err)
return
}
@@ -62,9 +87,9 @@ func (h *DeadlineHandlers) ListForCase(w http.ResponseWriter, r *http.Request) {
// Create handles POST /api/cases/{caseID}/deadlines
func (h *DeadlineHandlers) Create(w http.ResponseWriter, r *http.Request) {
tenantID, err := resolveTenant(r, h.db)
if err != nil {
handleTenantError(w, err)
tenantID, ok := auth.TenantFromContext(r.Context())
if !ok {
writeError(w, http.StatusForbidden, "missing tenant")
return
}
@@ -85,10 +110,14 @@ func (h *DeadlineHandlers) Create(w http.ResponseWriter, r *http.Request) {
writeError(w, http.StatusBadRequest, "title and due_date are required")
return
}
if msg := validateStringLength("title", input.Title, maxTitleLen); msg != "" {
writeError(w, http.StatusBadRequest, msg)
return
}
deadline, err := h.deadlines.Create(tenantID, input)
deadline, err := h.deadlines.Create(r.Context(), tenantID, input)
if err != nil {
writeError(w, http.StatusInternalServerError, "failed to create deadline")
internalError(w, "failed to create deadline", err)
return
}
@@ -97,9 +126,9 @@ func (h *DeadlineHandlers) Create(w http.ResponseWriter, r *http.Request) {
// Update handles PUT /api/deadlines/{deadlineID}
func (h *DeadlineHandlers) Update(w http.ResponseWriter, r *http.Request) {
tenantID, err := resolveTenant(r, h.db)
if err != nil {
handleTenantError(w, err)
tenantID, ok := auth.TenantFromContext(r.Context())
if !ok {
writeError(w, http.StatusForbidden, "missing tenant")
return
}
@@ -115,9 +144,9 @@ func (h *DeadlineHandlers) Update(w http.ResponseWriter, r *http.Request) {
return
}
deadline, err := h.deadlines.Update(tenantID, deadlineID, input)
deadline, err := h.deadlines.Update(r.Context(), tenantID, deadlineID, input)
if err != nil {
writeError(w, http.StatusInternalServerError, "failed to update deadline")
internalError(w, "failed to update deadline", err)
return
}
if deadline == nil {
@@ -130,9 +159,9 @@ func (h *DeadlineHandlers) Update(w http.ResponseWriter, r *http.Request) {
// Complete handles PATCH /api/deadlines/{deadlineID}/complete
func (h *DeadlineHandlers) Complete(w http.ResponseWriter, r *http.Request) {
tenantID, err := resolveTenant(r, h.db)
if err != nil {
handleTenantError(w, err)
tenantID, ok := auth.TenantFromContext(r.Context())
if !ok {
writeError(w, http.StatusForbidden, "missing tenant")
return
}
@@ -142,9 +171,9 @@ func (h *DeadlineHandlers) Complete(w http.ResponseWriter, r *http.Request) {
return
}
deadline, err := h.deadlines.Complete(tenantID, deadlineID)
deadline, err := h.deadlines.Complete(r.Context(), tenantID, deadlineID)
if err != nil {
writeError(w, http.StatusInternalServerError, "failed to complete deadline")
internalError(w, "failed to complete deadline", err)
return
}
if deadline == nil {
@@ -157,9 +186,9 @@ func (h *DeadlineHandlers) Complete(w http.ResponseWriter, r *http.Request) {
// Delete handles DELETE /api/deadlines/{deadlineID}
func (h *DeadlineHandlers) Delete(w http.ResponseWriter, r *http.Request) {
tenantID, err := resolveTenant(r, h.db)
if err != nil {
handleTenantError(w, err)
tenantID, ok := auth.TenantFromContext(r.Context())
if !ok {
writeError(w, http.StatusForbidden, "missing tenant")
return
}
@@ -169,9 +198,8 @@ func (h *DeadlineHandlers) Delete(w http.ResponseWriter, r *http.Request) {
return
}
err = h.deadlines.Delete(tenantID, deadlineID)
if err != nil {
writeError(w, http.StatusNotFound, err.Error())
if err := h.deadlines.Delete(r.Context(), tenantID, deadlineID); err != nil {
writeError(w, http.StatusNotFound, "deadline not found")
return
}
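
This hunk replaces the per-handler `resolveTenant(r, h.db)` lookup with `auth.TenantFromContext`, which implies middleware that resolves the tenant once and stashes it in the request context. A self-contained sketch of that pattern (the key type and header-based resolver are invented here for illustration):

```go
package main

import (
	"context"
	"fmt"
	"net/http"
	"net/http/httptest"
)

type tenantKeyType struct{}

var tenantKey tenantKeyType

// requireTenant resolves the tenant up front (from a header, purely for
// illustration) and rejects the request with 403 when none is found, so
// downstream handlers can rely on the context value being present.
func requireTenant(next http.Handler) http.Handler {
	return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		id := r.Header.Get("X-Tenant-ID")
		if id == "" {
			http.Error(w, `{"error":"missing tenant"}`, http.StatusForbidden)
			return
		}
		next.ServeHTTP(w, r.WithContext(context.WithValue(r.Context(), tenantKey, id)))
	})
}

// tenantFromContext stands in for auth.TenantFromContext.
func tenantFromContext(ctx context.Context) (string, bool) {
	v, ok := ctx.Value(tenantKey).(string)
	return v, ok
}

// call exercises the middleware chain and returns the status code.
func call(withHeader bool) int {
	h := requireTenant(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		if _, ok := tenantFromContext(r.Context()); !ok {
			http.Error(w, "unreachable", http.StatusInternalServerError)
			return
		}
		w.WriteHeader(http.StatusOK)
	}))
	r := httptest.NewRequest("GET", "/api/deadlines", nil)
	if withHeader {
		r.Header.Set("X-Tenant-ID", "t-1")
	}
	w := httptest.NewRecorder()
	h.ServeHTTP(w, r)
	return w.Code
}

func main() {
	fmt.Println(call(true), call(false))
}
```

This also explains why the handlers can drop their `*sqlx.DB` dependency: tenant membership is checked once, before routing reaches them.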

View File

@@ -0,0 +1,127 @@
package handlers
import (
"encoding/json"
"net/http"
"github.com/google/uuid"
"mgit.msbls.de/m/KanzlAI-mGMT/internal/auth"
"mgit.msbls.de/m/KanzlAI-mGMT/internal/services"
)
// DetermineHandlers holds handlers for deadline determination endpoints
type DetermineHandlers struct {
determine *services.DetermineService
deadlines *services.DeadlineService
}
// NewDetermineHandlers creates determine handlers
func NewDetermineHandlers(determine *services.DetermineService, deadlines *services.DeadlineService) *DetermineHandlers {
return &DetermineHandlers{determine: determine, deadlines: deadlines}
}
// GetTimeline handles GET /api/proceeding-types/{code}/timeline
// Returns the full event tree for a proceeding type (no date calculations)
func (h *DetermineHandlers) GetTimeline(w http.ResponseWriter, r *http.Request) {
code := r.PathValue("code")
if code == "" {
writeError(w, http.StatusBadRequest, "proceeding type code required")
return
}
timeline, pt, err := h.determine.GetTimeline(code)
if err != nil {
writeError(w, http.StatusNotFound, "proceeding type not found")
return
}
writeJSON(w, http.StatusOK, map[string]any{
"proceeding_type": pt,
"timeline": timeline,
})
}
// Determine handles POST /api/deadlines/determine
// Calculates the full timeline with cascading dates and conditional logic
func (h *DetermineHandlers) Determine(w http.ResponseWriter, r *http.Request) {
var req services.DetermineRequest
if err := json.NewDecoder(r.Body).Decode(&req); err != nil {
writeError(w, http.StatusBadRequest, "invalid request body")
return
}
if req.ProceedingType == "" || req.TriggerEventDate == "" {
writeError(w, http.StatusBadRequest, "proceeding_type and trigger_event_date are required")
return
}
resp, err := h.determine.Determine(req)
if err != nil {
writeError(w, http.StatusBadRequest, err.Error())
return
}
writeJSON(w, http.StatusOK, resp)
}
// BatchCreate handles POST /api/cases/{caseID}/deadlines/batch
// Creates multiple deadlines on a case from determined timeline
func (h *DetermineHandlers) BatchCreate(w http.ResponseWriter, r *http.Request) {
tenantID, ok := auth.TenantFromContext(r.Context())
if !ok {
writeError(w, http.StatusForbidden, "missing tenant")
return
}
caseID, err := parsePathUUID(r, "caseID")
if err != nil {
writeError(w, http.StatusBadRequest, "invalid case ID")
return
}
var req struct {
Deadlines []struct {
Title string `json:"title"`
DueDate string `json:"due_date"`
OriginalDueDate *string `json:"original_due_date,omitempty"`
RuleID *uuid.UUID `json:"rule_id,omitempty"`
RuleCode *string `json:"rule_code,omitempty"`
Notes *string `json:"notes,omitempty"`
} `json:"deadlines"`
}
if err := json.NewDecoder(r.Body).Decode(&req); err != nil {
writeError(w, http.StatusBadRequest, "invalid request body")
return
}
if len(req.Deadlines) == 0 {
writeError(w, http.StatusBadRequest, "at least one deadline is required")
return
}
var created int
for _, d := range req.Deadlines {
if d.Title == "" || d.DueDate == "" {
continue
}
input := services.CreateDeadlineInput{
CaseID: caseID,
Title: d.Title,
DueDate: d.DueDate,
Source: "determined",
RuleID: d.RuleID,
Notes: d.Notes,
}
_, err := h.deadlines.Create(r.Context(), tenantID, input)
if err != nil {
internalError(w, "failed to create deadline", err)
return
}
created++
}
writeJSON(w, http.StatusCreated, map[string]any{
"created": created,
})
}
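
`BatchCreate` silently skips entries with an empty title or due date and reports only the `created` count, so clients that care about skipped rows should pre-validate. A client-side sketch of the same rule (the struct mirrors the request entry shape, not a repo type):

```go
package main

import "fmt"

// batchDeadline mirrors the fields BatchCreate requires per entry.
type batchDeadline struct {
	Title   string `json:"title"`
	DueDate string `json:"due_date"`
}

// splitValid applies BatchCreate's skip rule: entries missing title or
// due_date are dropped rather than rejected with an error.
func splitValid(in []batchDeadline) (valid, skipped []batchDeadline) {
	for _, d := range in {
		if d.Title == "" || d.DueDate == "" {
			skipped = append(skipped, d)
			continue
		}
		valid = append(valid, d)
	}
	return valid, skipped
}

func main() {
	v, s := splitValid([]batchDeadline{
		{Title: "Einspruchsfrist", DueDate: "2026-06-01"},
		{Title: "", DueDate: "2026-07-01"}, // skipped: no title
	})
	fmt.Println(len(v), len(s))
}
```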

View File

@@ -0,0 +1,166 @@
package handlers
import (
"net/http"
"net/http/httptest"
"testing"
"github.com/google/uuid"
"mgit.msbls.de/m/KanzlAI-mGMT/internal/auth"
)
func TestDocumentListByCase_NoTenant(t *testing.T) {
h := &DocumentHandler{}
r := httptest.NewRequest("GET", "/api/cases/"+uuid.New().String()+"/documents", nil)
r.SetPathValue("id", uuid.New().String())
w := httptest.NewRecorder()
h.ListByCase(w, r)
if w.Code != http.StatusForbidden {
t.Errorf("expected 403, got %d", w.Code)
}
}
func TestDocumentListByCase_InvalidCaseID(t *testing.T) {
h := &DocumentHandler{}
r := httptest.NewRequest("GET", "/api/cases/bad-id/documents", nil)
r.SetPathValue("id", "bad-id")
ctx := auth.ContextWithTenantID(
auth.ContextWithUserID(r.Context(), uuid.New()),
uuid.New(),
)
r = r.WithContext(ctx)
w := httptest.NewRecorder()
h.ListByCase(w, r)
if w.Code != http.StatusBadRequest {
t.Errorf("expected 400, got %d", w.Code)
}
}
func TestDocumentUpload_NoTenant(t *testing.T) {
h := &DocumentHandler{}
r := httptest.NewRequest("POST", "/api/cases/"+uuid.New().String()+"/documents", nil)
r.SetPathValue("id", uuid.New().String())
w := httptest.NewRecorder()
h.Upload(w, r)
if w.Code != http.StatusForbidden {
t.Errorf("expected 403, got %d", w.Code)
}
}
func TestDocumentUpload_InvalidCaseID(t *testing.T) {
h := &DocumentHandler{}
r := httptest.NewRequest("POST", "/api/cases/bad-id/documents", nil)
r.SetPathValue("id", "bad-id")
ctx := auth.ContextWithTenantID(
auth.ContextWithUserID(r.Context(), uuid.New()),
uuid.New(),
)
r = r.WithContext(ctx)
w := httptest.NewRecorder()
h.Upload(w, r)
if w.Code != http.StatusBadRequest {
t.Errorf("expected 400, got %d", w.Code)
}
}
func TestDocumentDownload_NoTenant(t *testing.T) {
h := &DocumentHandler{}
r := httptest.NewRequest("GET", "/api/documents/"+uuid.New().String(), nil)
r.SetPathValue("docId", uuid.New().String())
w := httptest.NewRecorder()
h.Download(w, r)
if w.Code != http.StatusForbidden {
t.Errorf("expected 403, got %d", w.Code)
}
}
func TestDocumentDownload_InvalidID(t *testing.T) {
h := &DocumentHandler{}
r := httptest.NewRequest("GET", "/api/documents/bad-id", nil)
r.SetPathValue("docId", "bad-id")
ctx := auth.ContextWithTenantID(
auth.ContextWithUserID(r.Context(), uuid.New()),
uuid.New(),
)
r = r.WithContext(ctx)
w := httptest.NewRecorder()
h.Download(w, r)
if w.Code != http.StatusBadRequest {
t.Errorf("expected 400, got %d", w.Code)
}
}
func TestDocumentGetMeta_NoTenant(t *testing.T) {
h := &DocumentHandler{}
r := httptest.NewRequest("GET", "/api/documents/"+uuid.New().String()+"/meta", nil)
r.SetPathValue("docId", uuid.New().String())
w := httptest.NewRecorder()
h.GetMeta(w, r)
if w.Code != http.StatusForbidden {
t.Errorf("expected 403, got %d", w.Code)
}
}
func TestDocumentGetMeta_InvalidID(t *testing.T) {
h := &DocumentHandler{}
r := httptest.NewRequest("GET", "/api/documents/bad-id/meta", nil)
r.SetPathValue("docId", "bad-id")
ctx := auth.ContextWithTenantID(
auth.ContextWithUserID(r.Context(), uuid.New()),
uuid.New(),
)
r = r.WithContext(ctx)
w := httptest.NewRecorder()
h.GetMeta(w, r)
if w.Code != http.StatusBadRequest {
t.Errorf("expected 400, got %d", w.Code)
}
}
func TestDocumentDelete_NoTenant(t *testing.T) {
h := &DocumentHandler{}
r := httptest.NewRequest("DELETE", "/api/documents/"+uuid.New().String(), nil)
r.SetPathValue("docId", uuid.New().String())
w := httptest.NewRecorder()
h.Delete(w, r)
if w.Code != http.StatusForbidden {
t.Errorf("expected 403, got %d", w.Code)
}
}
func TestDocumentDelete_InvalidID(t *testing.T) {
h := &DocumentHandler{}
r := httptest.NewRequest("DELETE", "/api/documents/bad-id", nil)
r.SetPathValue("docId", "bad-id")
ctx := auth.ContextWithTenantID(
auth.ContextWithUserID(r.Context(), uuid.New()),
uuid.New(),
)
r = r.WithContext(ctx)
w := httptest.NewRecorder()
h.Delete(w, r)
if w.Code != http.StatusBadRequest {
t.Errorf("expected 400, got %d", w.Code)
}
}

View File

@@ -36,7 +36,7 @@ func (h *DocumentHandler) ListByCase(w http.ResponseWriter, r *http.Request) {
docs, err := h.svc.ListByCase(r.Context(), tenantID, caseID)
if err != nil {
writeError(w, http.StatusInternalServerError, err.Error())
internalError(w, "failed to list documents", err)
return
}
@@ -98,7 +98,7 @@ func (h *DocumentHandler) Upload(w http.ResponseWriter, r *http.Request) {
writeError(w, http.StatusNotFound, "case not found")
return
}
writeError(w, http.StatusInternalServerError, err.Error())
internalError(w, "failed to upload document", err)
return
}
@@ -121,16 +121,16 @@ func (h *DocumentHandler) Download(w http.ResponseWriter, r *http.Request) {
body, contentType, title, err := h.svc.Download(r.Context(), tenantID, docID)
if err != nil {
if err.Error() == "document not found" || err.Error() == "document has no file" {
writeError(w, http.StatusNotFound, err.Error())
writeError(w, http.StatusNotFound, "document not found")
return
}
writeError(w, http.StatusInternalServerError, err.Error())
internalError(w, "failed to download document", err)
return
}
defer body.Close()
w.Header().Set("Content-Type", contentType)
w.Header().Set("Content-Disposition", fmt.Sprintf(`attachment; filename="%s"`, title))
w.Header().Set("Content-Disposition", fmt.Sprintf(`attachment; filename="%s"`, sanitizeFilename(title)))
io.Copy(w, body)
}
@@ -149,7 +149,7 @@ func (h *DocumentHandler) GetMeta(w http.ResponseWriter, r *http.Request) {
doc, err := h.svc.GetByID(r.Context(), tenantID, docID)
if err != nil {
writeError(w, http.StatusInternalServerError, err.Error())
internalError(w, "failed to get document metadata", err)
return
}
if doc == nil {
@@ -167,6 +167,7 @@ func (h *DocumentHandler) Delete(w http.ResponseWriter, r *http.Request) {
return
}
userID, _ := auth.UserFromContext(r.Context())
role := auth.UserRoleFromContext(r.Context())
docID, err := uuid.Parse(r.PathValue("docId"))
if err != nil {
@@ -174,6 +175,26 @@ func (h *DocumentHandler) Delete(w http.ResponseWriter, r *http.Request) {
return
}
// Check permission: owner/partner can delete any, associate can delete own
doc, err := h.svc.GetByID(r.Context(), tenantID, docID)
if err != nil {
internalError(w, "failed to load document for delete", err)
return
}
if doc == nil {
writeError(w, http.StatusNotFound, "document not found")
return
}
uploaderID := uuid.Nil
if doc.UploadedBy != nil {
uploaderID = *doc.UploadedBy
}
if !auth.CanDeleteDocument(role, uploaderID, userID) {
writeError(w, http.StatusForbidden, "insufficient permissions to delete this document")
return
}
if err := h.svc.Delete(r.Context(), tenantID, docID, userID); err != nil {
writeError(w, http.StatusNotFound, "document not found")
return

View File

@@ -0,0 +1,53 @@
package handlers
import (
"encoding/json"
"net/http"
"mgit.msbls.de/m/KanzlAI-mGMT/internal/models"
"mgit.msbls.de/m/KanzlAI-mGMT/internal/services"
)
// FeeCalculatorHandler handles fee calculation API endpoints.
type FeeCalculatorHandler struct {
calc *services.FeeCalculator
}
func NewFeeCalculatorHandler(calc *services.FeeCalculator) *FeeCalculatorHandler {
return &FeeCalculatorHandler{calc: calc}
}
// Calculate handles POST /api/fees/calculate.
func (h *FeeCalculatorHandler) Calculate(w http.ResponseWriter, r *http.Request) {
var req models.FeeCalculateRequest
if err := json.NewDecoder(r.Body).Decode(&req); err != nil {
writeError(w, http.StatusBadRequest, "invalid request body")
return
}
if req.Streitwert <= 0 {
writeError(w, http.StatusBadRequest, "streitwert must be positive")
return
}
if req.VATRate < 0 || req.VATRate > 1 {
writeError(w, http.StatusBadRequest, "vat_rate must be between 0 and 1")
return
}
if len(req.Instances) == 0 {
writeError(w, http.StatusBadRequest, "at least one instance is required")
return
}
resp, err := h.calc.CalculateFullLitigation(req)
if err != nil {
writeError(w, http.StatusBadRequest, err.Error())
return
}
writeJSON(w, http.StatusOK, resp)
}
// Schedules handles GET /api/fees/schedules.
func (h *FeeCalculatorHandler) Schedules(w http.ResponseWriter, r *http.Request) {
writeJSON(w, http.StatusOK, h.calc.GetSchedules())
}
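
From the validation in `Calculate` and its error strings, the request body presumably carries `streitwert`, `vat_rate` (a 0-1 fraction, e.g. 0.19 for German VAT), and an `instances` array; the exact JSON tags are an assumption since `models.FeeCalculateRequest` is not shown, and `instances` is simplified to strings here:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// feeRequest sketches the fields Calculate validates; JSON tag names are
// inferred from the handler's error messages, not from the repo's models.
type feeRequest struct {
	Streitwert float64  `json:"streitwert"`
	VATRate    float64  `json:"vat_rate"`
	Instances  []string `json:"instances"`
}

// validate reproduces the three checks in Calculate, returning the
// 400-error message or "" when the request would reach the calculator.
func validate(req feeRequest) string {
	if req.Streitwert <= 0 {
		return "streitwert must be positive"
	}
	if req.VATRate < 0 || req.VATRate > 1 {
		return "vat_rate must be between 0 and 1"
	}
	if len(req.Instances) == 0 {
		return "at least one instance is required"
	}
	return ""
}

func main() {
	body := []byte(`{"streitwert":1000000,"vat_rate":0.19,"instances":["LG","OLG"]}`)
	var req feeRequest
	if err := json.Unmarshal(body, &req); err != nil {
		panic(err)
	}
	fmt.Println(validate(req))
}
```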

View File

@@ -2,12 +2,12 @@ package handlers
import (
"encoding/json"
"log/slog"
"net/http"
"strings"
"unicode/utf8"
"github.com/google/uuid"
"github.com/jmoiron/sqlx"
"mgit.msbls.de/m/KanzlAI-mGMT/internal/auth"
)
func writeJSON(w http.ResponseWriter, status int, v any) {
@@ -20,62 +20,9 @@ func writeError(w http.ResponseWriter, status int, msg string) {
writeJSON(w, status, map[string]string{"error": msg})
}
// resolveTenant gets the tenant ID for the authenticated user.
// Checks X-Tenant-ID header first, then falls back to user's first tenant.
func resolveTenant(r *http.Request, db *sqlx.DB) (uuid.UUID, error) {
userID, ok := auth.UserFromContext(r.Context())
if !ok {
return uuid.Nil, errUnauthorized
}
// Check header first
if headerVal := r.Header.Get("X-Tenant-ID"); headerVal != "" {
tenantID, err := uuid.Parse(headerVal)
if err != nil {
return uuid.Nil, errInvalidTenant
}
// Verify user has access to this tenant
var count int
err = db.Get(&count,
`SELECT COUNT(*) FROM user_tenants WHERE user_id = $1 AND tenant_id = $2`,
userID, tenantID)
if err != nil || count == 0 {
return uuid.Nil, errTenantAccess
}
return tenantID, nil
}
// Fall back to user's first tenant
var tenantID uuid.UUID
err := db.Get(&tenantID,
`SELECT tenant_id FROM user_tenants WHERE user_id = $1 ORDER BY created_at LIMIT 1`,
userID)
if err != nil {
return uuid.Nil, errNoTenant
}
return tenantID, nil
}
type apiError struct {
msg string
status int
}
func (e *apiError) Error() string { return e.msg }
var (
errUnauthorized = &apiError{msg: "unauthorized", status: http.StatusUnauthorized}
errInvalidTenant = &apiError{msg: "invalid tenant ID", status: http.StatusBadRequest}
errTenantAccess = &apiError{msg: "no access to tenant", status: http.StatusForbidden}
errNoTenant = &apiError{msg: "no tenant found for user", status: http.StatusBadRequest}
)
// handleTenantError writes the appropriate error response for tenant resolution errors
func handleTenantError(w http.ResponseWriter, err error) {
if ae, ok := err.(*apiError); ok {
writeError(w, ae.status, ae.msg)
return
}
// Fallback for unexpected error types: log nothing sensitive, return generic 500.
writeError(w, http.StatusInternalServerError, "internal error")
}
// internalError logs the real error and returns a generic message to the client.
func internalError(w http.ResponseWriter, msg string, err error) {
slog.Error(msg, "error", err)
writeError(w, http.StatusInternalServerError, "internal error")
}
@@ -88,3 +35,74 @@ func parsePathUUID(r *http.Request, key string) (uuid.UUID, error) {
func parseUUID(s string) (uuid.UUID, error) {
return uuid.Parse(s)
}
// --- Input validation helpers ---
const (
maxTitleLen = 500
maxDescriptionLen = 10000
maxCaseNumberLen = 100
maxSearchLen = 200
maxPaginationLimit = 100
)
// validateStringLength checks if a string exceeds the given max length.
func validateStringLength(field, value string, maxLen int) string {
if utf8.RuneCountInString(value) > maxLen {
return field + " exceeds maximum length"
}
return ""
}
// clampPagination enforces sane pagination defaults and limits.
func clampPagination(limit, offset int) (int, int) {
if limit <= 0 {
limit = 20
}
if limit > maxPaginationLimit {
limit = maxPaginationLimit
}
if offset < 0 {
offset = 0
}
return limit, offset
}
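A self-contained sketch of the clamping behavior (the helper body and the `maxPaginationLimit` constant are restated verbatim so the snippet compiles on its own):

```go
package main

import "fmt"

const maxPaginationLimit = 100

// clampPagination defaults limit to 20, caps it at maxPaginationLimit,
// and floors offset at 0.
func clampPagination(limit, offset int) (int, int) {
	if limit <= 0 {
		limit = 20
	}
	if limit > maxPaginationLimit {
		limit = 100
	}
	if offset < 0 {
		offset = 0
	}
	return limit, offset
}

func main() {
	l, o := clampPagination(0, -5) // missing/negative params
	fmt.Println(l, o)              // 20 0
	l, o = clampPagination(500, 10) // oversized limit is capped
	fmt.Println(l, o)              // 100 10
}
```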
// sanitizeFilename removes characters unsafe for Content-Disposition headers.
func sanitizeFilename(name string) string {
// Replace control characters, quotes, backslashes, and slashes with underscores
var b strings.Builder
for _, r := range name {
if r < 32 || r == '"' || r == '\\' || r == '/' {
b.WriteRune('_')
} else {
b.WriteRune(r)
}
}
return b.String()
}
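A runnable sketch of the sanitizer (function body restated verbatim; the filename is an arbitrary example):

```go
package main

import (
	"fmt"
	"strings"
)

// sanitizeFilename replaces control characters, quotes, backslashes,
// and slashes with underscores so the name is safe in a
// Content-Disposition header.
func sanitizeFilename(name string) string {
	var b strings.Builder
	for _, r := range name {
		if r < 32 || r == '"' || r == '\\' || r == '/' {
			b.WriteRune('_')
		} else {
			b.WriteRune(r)
		}
	}
	return b.String()
}

func main() {
	fmt.Println(sanitizeFilename(`report"2026/03.pdf`)) // report_2026_03.pdf
}
```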
// maskSettingsPassword masks the CalDAV password in tenant settings JSON before returning to clients.
func maskSettingsPassword(settings json.RawMessage) json.RawMessage {
if len(settings) == 0 {
return settings
}
var m map[string]json.RawMessage
if err := json.Unmarshal(settings, &m); err != nil {
return settings
}
caldavRaw, ok := m["caldav"]
if !ok {
return settings
}
var caldav map[string]json.RawMessage
if err := json.Unmarshal(caldavRaw, &caldav); err != nil {
return settings
}
if _, ok := caldav["password"]; ok {
caldav["password"], _ = json.Marshal("********")
}
m["caldav"], _ = json.Marshal(caldav)
result, _ := json.Marshal(m)
return result
}

View File

@@ -0,0 +1,170 @@
package handlers
import (
"encoding/json"
"net/http"
"mgit.msbls.de/m/KanzlAI-mGMT/internal/auth"
"mgit.msbls.de/m/KanzlAI-mGMT/internal/services"
"github.com/google/uuid"
)
type InvoiceHandler struct {
svc *services.InvoiceService
}
func NewInvoiceHandler(svc *services.InvoiceService) *InvoiceHandler {
return &InvoiceHandler{svc: svc}
}
// List handles GET /api/invoices?case_id=&status=
func (h *InvoiceHandler) List(w http.ResponseWriter, r *http.Request) {
tenantID, ok := auth.TenantFromContext(r.Context())
if !ok {
writeError(w, http.StatusForbidden, "missing tenant")
return
}
var caseID *uuid.UUID
if caseStr := r.URL.Query().Get("case_id"); caseStr != "" {
parsed, err := uuid.Parse(caseStr)
if err != nil {
writeError(w, http.StatusBadRequest, "invalid case_id")
return
}
caseID = &parsed
}
invoices, err := h.svc.List(r.Context(), tenantID, caseID, r.URL.Query().Get("status"))
if err != nil {
internalError(w, "failed to list invoices", err)
return
}
writeJSON(w, http.StatusOK, map[string]any{"invoices": invoices})
}
// Get handles GET /api/invoices/{id}
func (h *InvoiceHandler) Get(w http.ResponseWriter, r *http.Request) {
tenantID, ok := auth.TenantFromContext(r.Context())
if !ok {
writeError(w, http.StatusForbidden, "missing tenant")
return
}
invoiceID, err := parsePathUUID(r, "id")
if err != nil {
writeError(w, http.StatusBadRequest, "invalid invoice ID")
return
}
inv, err := h.svc.GetByID(r.Context(), tenantID, invoiceID)
if err != nil {
internalError(w, "failed to get invoice", err)
return
}
if inv == nil {
writeError(w, http.StatusNotFound, "invoice not found")
return
}
writeJSON(w, http.StatusOK, inv)
}
// Create handles POST /api/invoices
func (h *InvoiceHandler) Create(w http.ResponseWriter, r *http.Request) {
tenantID, ok := auth.TenantFromContext(r.Context())
if !ok {
writeError(w, http.StatusForbidden, "missing tenant")
return
}
userID, _ := auth.UserFromContext(r.Context())
var input services.CreateInvoiceInput
if err := json.NewDecoder(r.Body).Decode(&input); err != nil {
writeError(w, http.StatusBadRequest, "invalid JSON body")
return
}
if input.ClientName == "" {
writeError(w, http.StatusBadRequest, "client_name is required")
return
}
if input.CaseID == uuid.Nil {
writeError(w, http.StatusBadRequest, "case_id is required")
return
}
inv, err := h.svc.Create(r.Context(), tenantID, userID, input)
if err != nil {
internalError(w, "failed to create invoice", err)
return
}
writeJSON(w, http.StatusCreated, inv)
}
// Update handles PUT /api/invoices/{id}
func (h *InvoiceHandler) Update(w http.ResponseWriter, r *http.Request) {
tenantID, ok := auth.TenantFromContext(r.Context())
if !ok {
writeError(w, http.StatusForbidden, "missing tenant")
return
}
invoiceID, err := parsePathUUID(r, "id")
if err != nil {
writeError(w, http.StatusBadRequest, "invalid invoice ID")
return
}
var input services.UpdateInvoiceInput
if err := json.NewDecoder(r.Body).Decode(&input); err != nil {
writeError(w, http.StatusBadRequest, "invalid JSON body")
return
}
inv, err := h.svc.Update(r.Context(), tenantID, invoiceID, input)
if err != nil {
writeError(w, http.StatusBadRequest, err.Error())
return
}
writeJSON(w, http.StatusOK, inv)
}
// UpdateStatus handles PATCH /api/invoices/{id}/status
func (h *InvoiceHandler) UpdateStatus(w http.ResponseWriter, r *http.Request) {
tenantID, ok := auth.TenantFromContext(r.Context())
if !ok {
writeError(w, http.StatusForbidden, "missing tenant")
return
}
invoiceID, err := parsePathUUID(r, "id")
if err != nil {
writeError(w, http.StatusBadRequest, "invalid invoice ID")
return
}
var body struct {
Status string `json:"status"`
}
if err := json.NewDecoder(r.Body).Decode(&body); err != nil {
writeError(w, http.StatusBadRequest, "invalid JSON body")
return
}
if body.Status == "" {
writeError(w, http.StatusBadRequest, "status is required")
return
}
inv, err := h.svc.UpdateStatus(r.Context(), tenantID, invoiceID, body.Status)
if err != nil {
writeError(w, http.StatusBadRequest, err.Error())
return
}
writeJSON(w, http.StatusOK, inv)
}

View File

@@ -0,0 +1,167 @@
package handlers
import (
"encoding/json"
"fmt"
"net/http"
"github.com/google/uuid"
"mgit.msbls.de/m/KanzlAI-mGMT/internal/auth"
"mgit.msbls.de/m/KanzlAI-mGMT/internal/services"
)
type NoteHandler struct {
svc *services.NoteService
}
func NewNoteHandler(svc *services.NoteService) *NoteHandler {
return &NoteHandler{svc: svc}
}
// List handles GET /api/notes?{parent_type}_id={id}
func (h *NoteHandler) List(w http.ResponseWriter, r *http.Request) {
tenantID, ok := auth.TenantFromContext(r.Context())
if !ok {
writeError(w, http.StatusUnauthorized, "missing tenant")
return
}
parentType, parentID, err := parseNoteParent(r)
if err != nil {
writeError(w, http.StatusBadRequest, err.Error())
return
}
notes, err := h.svc.ListByParent(r.Context(), tenantID, parentType, parentID)
if err != nil {
writeError(w, http.StatusInternalServerError, "failed to list notes")
return
}
writeJSON(w, http.StatusOK, notes)
}
// Create handles POST /api/notes
func (h *NoteHandler) Create(w http.ResponseWriter, r *http.Request) {
tenantID, ok := auth.TenantFromContext(r.Context())
if !ok {
writeError(w, http.StatusUnauthorized, "missing tenant")
return
}
userID, _ := auth.UserFromContext(r.Context())
var input services.CreateNoteInput
if err := json.NewDecoder(r.Body).Decode(&input); err != nil {
writeError(w, http.StatusBadRequest, "invalid request body")
return
}
if input.Content == "" {
writeError(w, http.StatusBadRequest, "content is required")
return
}
if msg := validateStringLength("content", input.Content, maxDescriptionLen); msg != "" {
writeError(w, http.StatusBadRequest, msg)
return
}
var createdBy *uuid.UUID
if userID != uuid.Nil {
createdBy = &userID
}
note, err := h.svc.Create(r.Context(), tenantID, createdBy, input)
if err != nil {
writeError(w, http.StatusInternalServerError, "failed to create note")
return
}
writeJSON(w, http.StatusCreated, note)
}
// Update handles PUT /api/notes/{id}
func (h *NoteHandler) Update(w http.ResponseWriter, r *http.Request) {
tenantID, ok := auth.TenantFromContext(r.Context())
if !ok {
writeError(w, http.StatusUnauthorized, "missing tenant")
return
}
noteID, err := uuid.Parse(r.PathValue("id"))
if err != nil {
writeError(w, http.StatusBadRequest, "invalid note ID")
return
}
var req struct {
Content string `json:"content"`
}
if err := json.NewDecoder(r.Body).Decode(&req); err != nil {
writeError(w, http.StatusBadRequest, "invalid request body")
return
}
if req.Content == "" {
writeError(w, http.StatusBadRequest, "content is required")
return
}
if msg := validateStringLength("content", req.Content, maxDescriptionLen); msg != "" {
writeError(w, http.StatusBadRequest, msg)
return
}
note, err := h.svc.Update(r.Context(), tenantID, noteID, req.Content)
if err != nil {
writeError(w, http.StatusInternalServerError, "failed to update note")
return
}
if note == nil {
writeError(w, http.StatusNotFound, "note not found")
return
}
writeJSON(w, http.StatusOK, note)
}
// Delete handles DELETE /api/notes/{id}
func (h *NoteHandler) Delete(w http.ResponseWriter, r *http.Request) {
tenantID, ok := auth.TenantFromContext(r.Context())
if !ok {
writeError(w, http.StatusUnauthorized, "missing tenant")
return
}
noteID, err := uuid.Parse(r.PathValue("id"))
if err != nil {
writeError(w, http.StatusBadRequest, "invalid note ID")
return
}
if err := h.svc.Delete(r.Context(), tenantID, noteID); err != nil {
writeError(w, http.StatusNotFound, "note not found")
return
}
w.WriteHeader(http.StatusNoContent)
}
// parseNoteParent extracts the parent type and ID from query parameters.
// Candidates are checked in a fixed order so behavior is deterministic
// when more than one parent parameter is supplied (ranging over a map
// would pick an arbitrary winner).
func parseNoteParent(r *http.Request) (string, uuid.UUID, error) {
params := []struct {
param      string
parentType string
}{
{"case_id", "case"},
{"deadline_id", "deadline"},
{"appointment_id", "appointment"},
{"case_event_id", "case_event"},
}
for _, p := range params {
if v := r.URL.Query().Get(p.param); v != "" {
id, err := uuid.Parse(v)
if err != nil {
return "", uuid.Nil, fmt.Errorf("invalid %s", p.param)
}
return p.parentType, id, nil
}
}
return "", uuid.Nil, fmt.Errorf("one of case_id, deadline_id, appointment_id, or case_event_id is required")
}

View File

@@ -0,0 +1,171 @@
package handlers
import (
"encoding/json"
"net/http"
"strconv"
"github.com/jmoiron/sqlx"
"mgit.msbls.de/m/KanzlAI-mGMT/internal/auth"
"mgit.msbls.de/m/KanzlAI-mGMT/internal/services"
)
// NotificationHandler handles notification API endpoints.
type NotificationHandler struct {
svc *services.NotificationService
db *sqlx.DB
}
// NewNotificationHandler creates a new notification handler.
func NewNotificationHandler(svc *services.NotificationService, db *sqlx.DB) *NotificationHandler {
return &NotificationHandler{svc: svc, db: db}
}
// List returns paginated notifications for the authenticated user.
func (h *NotificationHandler) List(w http.ResponseWriter, r *http.Request) {
tenantID, ok := auth.TenantFromContext(r.Context())
if !ok {
writeError(w, http.StatusUnauthorized, "unauthorized")
return
}
userID, ok := auth.UserFromContext(r.Context())
if !ok {
writeError(w, http.StatusUnauthorized, "unauthorized")
return
}
limit, _ := strconv.Atoi(r.URL.Query().Get("limit"))
offset, _ := strconv.Atoi(r.URL.Query().Get("offset"))
notifications, total, err := h.svc.ListForUser(r.Context(), tenantID, userID, limit, offset)
if err != nil {
writeError(w, http.StatusInternalServerError, "failed to list notifications")
return
}
writeJSON(w, http.StatusOK, map[string]any{
"data": notifications,
"total": total,
})
}
// UnreadCount returns the count of unread notifications.
func (h *NotificationHandler) UnreadCount(w http.ResponseWriter, r *http.Request) {
tenantID, ok := auth.TenantFromContext(r.Context())
if !ok {
writeError(w, http.StatusUnauthorized, "unauthorized")
return
}
userID, ok := auth.UserFromContext(r.Context())
if !ok {
writeError(w, http.StatusUnauthorized, "unauthorized")
return
}
count, err := h.svc.UnreadCount(r.Context(), tenantID, userID)
if err != nil {
writeError(w, http.StatusInternalServerError, "failed to count notifications")
return
}
writeJSON(w, http.StatusOK, map[string]int{"unread_count": count})
}
// MarkRead marks a single notification as read.
func (h *NotificationHandler) MarkRead(w http.ResponseWriter, r *http.Request) {
tenantID, ok := auth.TenantFromContext(r.Context())
if !ok {
writeError(w, http.StatusUnauthorized, "unauthorized")
return
}
userID, ok := auth.UserFromContext(r.Context())
if !ok {
writeError(w, http.StatusUnauthorized, "unauthorized")
return
}
notifID, err := parsePathUUID(r, "id")
if err != nil {
writeError(w, http.StatusBadRequest, "invalid notification ID")
return
}
if err := h.svc.MarkRead(r.Context(), tenantID, userID, notifID); err != nil {
writeError(w, http.StatusNotFound, err.Error())
return
}
writeJSON(w, http.StatusOK, map[string]string{"status": "ok"})
}
// MarkAllRead marks all notifications as read.
func (h *NotificationHandler) MarkAllRead(w http.ResponseWriter, r *http.Request) {
tenantID, ok := auth.TenantFromContext(r.Context())
if !ok {
writeError(w, http.StatusUnauthorized, "unauthorized")
return
}
userID, ok := auth.UserFromContext(r.Context())
if !ok {
writeError(w, http.StatusUnauthorized, "unauthorized")
return
}
if err := h.svc.MarkAllRead(r.Context(), tenantID, userID); err != nil {
writeError(w, http.StatusInternalServerError, "failed to mark all read")
return
}
writeJSON(w, http.StatusOK, map[string]string{"status": "ok"})
}
// GetPreferences returns notification preferences for the authenticated user.
func (h *NotificationHandler) GetPreferences(w http.ResponseWriter, r *http.Request) {
tenantID, ok := auth.TenantFromContext(r.Context())
if !ok {
writeError(w, http.StatusUnauthorized, "unauthorized")
return
}
userID, ok := auth.UserFromContext(r.Context())
if !ok {
writeError(w, http.StatusUnauthorized, "unauthorized")
return
}
pref, err := h.svc.GetPreferences(r.Context(), tenantID, userID)
if err != nil {
writeError(w, http.StatusInternalServerError, "failed to get preferences")
return
}
writeJSON(w, http.StatusOK, pref)
}
// UpdatePreferences updates notification preferences for the authenticated user.
func (h *NotificationHandler) UpdatePreferences(w http.ResponseWriter, r *http.Request) {
tenantID, ok := auth.TenantFromContext(r.Context())
if !ok {
writeError(w, http.StatusUnauthorized, "unauthorized")
return
}
userID, ok := auth.UserFromContext(r.Context())
if !ok {
writeError(w, http.StatusUnauthorized, "unauthorized")
return
}
var input services.UpdatePreferencesInput
if err := json.NewDecoder(r.Body).Decode(&input); err != nil {
writeError(w, http.StatusBadRequest, "invalid request body")
return
}
pref, err := h.svc.UpdatePreferences(r.Context(), tenantID, userID, input)
if err != nil {
writeError(w, http.StatusInternalServerError, "failed to update preferences")
return
}
writeJSON(w, http.StatusOK, pref)
}

View File

@@ -34,7 +34,7 @@ func (h *PartyHandler) List(w http.ResponseWriter, r *http.Request) {
parties, err := h.svc.ListByCase(r.Context(), tenantID, caseID)
if err != nil {
internalError(w, "failed to list parties", err)
return
}
@@ -67,13 +67,18 @@ func (h *PartyHandler) Create(w http.ResponseWriter, r *http.Request) {
return
}
if msg := validateStringLength("name", input.Name, maxTitleLen); msg != "" {
writeError(w, http.StatusBadRequest, msg)
return
}
party, err := h.svc.Create(r.Context(), tenantID, caseID, userID, input)
if err != nil {
if err == sql.ErrNoRows {
writeError(w, http.StatusNotFound, "case not found")
return
}
internalError(w, "failed to create party", err)
return
}
@@ -101,7 +106,7 @@ func (h *PartyHandler) Update(w http.ResponseWriter, r *http.Request) {
updated, err := h.svc.Update(r.Context(), tenantID, partyID, input)
if err != nil {
internalError(w, "failed to update party", err)
return
}
if updated == nil {

View File

@@ -0,0 +1,109 @@
package handlers
import (
"net/http"
"time"
"mgit.msbls.de/m/KanzlAI-mGMT/internal/auth"
"mgit.msbls.de/m/KanzlAI-mGMT/internal/services"
)
type ReportHandler struct {
svc *services.ReportingService
}
func NewReportHandler(svc *services.ReportingService) *ReportHandler {
return &ReportHandler{svc: svc}
}
// parseDateRange extracts from/to query params, defaulting to last 12 months.
func parseDateRange(r *http.Request) (time.Time, time.Time) {
now := time.Now()
from := time.Date(now.Year()-1, now.Month(), 1, 0, 0, 0, 0, time.UTC)
to := time.Date(now.Year(), now.Month(), now.Day(), 23, 59, 59, 0, time.UTC)
if v := r.URL.Query().Get("from"); v != "" {
if t, err := time.Parse("2006-01-02", v); err == nil {
from = t
}
}
if v := r.URL.Query().Get("to"); v != "" {
if t, err := time.Parse("2006-01-02", v); err == nil {
to = time.Date(t.Year(), t.Month(), t.Day(), 23, 59, 59, 0, time.UTC)
}
}
return from, to
}
func (h *ReportHandler) Cases(w http.ResponseWriter, r *http.Request) {
tenantID, ok := auth.TenantFromContext(r.Context())
if !ok {
writeError(w, http.StatusForbidden, "missing tenant")
return
}
from, to := parseDateRange(r)
data, err := h.svc.CaseReport(r.Context(), tenantID, from, to)
if err != nil {
internalError(w, "failed to generate case report", err)
return
}
writeJSON(w, http.StatusOK, data)
}
func (h *ReportHandler) Deadlines(w http.ResponseWriter, r *http.Request) {
tenantID, ok := auth.TenantFromContext(r.Context())
if !ok {
writeError(w, http.StatusForbidden, "missing tenant")
return
}
from, to := parseDateRange(r)
data, err := h.svc.DeadlineReport(r.Context(), tenantID, from, to)
if err != nil {
internalError(w, "failed to generate deadline report", err)
return
}
writeJSON(w, http.StatusOK, data)
}
func (h *ReportHandler) Workload(w http.ResponseWriter, r *http.Request) {
tenantID, ok := auth.TenantFromContext(r.Context())
if !ok {
writeError(w, http.StatusForbidden, "missing tenant")
return
}
from, to := parseDateRange(r)
data, err := h.svc.WorkloadReport(r.Context(), tenantID, from, to)
if err != nil {
internalError(w, "failed to generate workload report", err)
return
}
writeJSON(w, http.StatusOK, data)
}
func (h *ReportHandler) Billing(w http.ResponseWriter, r *http.Request) {
tenantID, ok := auth.TenantFromContext(r.Context())
if !ok {
writeError(w, http.StatusForbidden, "missing tenant")
return
}
from, to := parseDateRange(r)
data, err := h.svc.BillingReport(r.Context(), tenantID, from, to)
if err != nil {
internalError(w, "failed to generate billing report", err)
return
}
writeJSON(w, http.StatusOK, data)
}

View File

@@ -0,0 +1,328 @@
package handlers
import (
"encoding/json"
"net/http"
"strconv"
"mgit.msbls.de/m/KanzlAI-mGMT/internal/auth"
"mgit.msbls.de/m/KanzlAI-mGMT/internal/services"
)
type TemplateHandler struct {
templates *services.TemplateService
cases *services.CaseService
parties *services.PartyService
deadlines *services.DeadlineService
tenants *services.TenantService
}
func NewTemplateHandler(
templates *services.TemplateService,
cases *services.CaseService,
parties *services.PartyService,
deadlines *services.DeadlineService,
tenants *services.TenantService,
) *TemplateHandler {
return &TemplateHandler{
templates: templates,
cases: cases,
parties: parties,
deadlines: deadlines,
tenants: tenants,
}
}
// List handles GET /api/templates
func (h *TemplateHandler) List(w http.ResponseWriter, r *http.Request) {
tenantID, ok := auth.TenantFromContext(r.Context())
if !ok {
writeError(w, http.StatusForbidden, "missing tenant")
return
}
q := r.URL.Query()
limit, _ := strconv.Atoi(q.Get("limit"))
offset, _ := strconv.Atoi(q.Get("offset"))
limit, offset = clampPagination(limit, offset)
filter := services.TemplateFilter{
Category: q.Get("category"),
Search: q.Get("search"),
Limit: limit,
Offset: offset,
}
if filter.Search != "" {
if msg := validateStringLength("search", filter.Search, maxSearchLen); msg != "" {
writeError(w, http.StatusBadRequest, msg)
return
}
}
templates, total, err := h.templates.List(r.Context(), tenantID, filter)
if err != nil {
internalError(w, "failed to list templates", err)
return
}
writeJSON(w, http.StatusOK, map[string]any{
"data": templates,
"total": total,
})
}
// Get handles GET /api/templates/{id}
func (h *TemplateHandler) Get(w http.ResponseWriter, r *http.Request) {
tenantID, ok := auth.TenantFromContext(r.Context())
if !ok {
writeError(w, http.StatusForbidden, "missing tenant")
return
}
templateID, err := parsePathUUID(r, "id")
if err != nil {
writeError(w, http.StatusBadRequest, "invalid template ID")
return
}
t, err := h.templates.GetByID(r.Context(), tenantID, templateID)
if err != nil {
internalError(w, "failed to get template", err)
return
}
if t == nil {
writeError(w, http.StatusNotFound, "template not found")
return
}
writeJSON(w, http.StatusOK, t)
}
// Create handles POST /api/templates
func (h *TemplateHandler) Create(w http.ResponseWriter, r *http.Request) {
tenantID, ok := auth.TenantFromContext(r.Context())
if !ok {
writeError(w, http.StatusForbidden, "missing tenant")
return
}
var raw struct {
Name string `json:"name"`
Description *string `json:"description,omitempty"`
Category string `json:"category"`
Content string `json:"content"`
Variables any `json:"variables,omitempty"`
}
if err := json.NewDecoder(r.Body).Decode(&raw); err != nil {
writeError(w, http.StatusBadRequest, "invalid request body")
return
}
if raw.Name == "" {
writeError(w, http.StatusBadRequest, "name is required")
return
}
if msg := validateStringLength("name", raw.Name, maxTitleLen); msg != "" {
writeError(w, http.StatusBadRequest, msg)
return
}
if raw.Category == "" {
writeError(w, http.StatusBadRequest, "category is required")
return
}
var variables []byte
if raw.Variables != nil {
var err error
variables, err = json.Marshal(raw.Variables)
if err != nil {
writeError(w, http.StatusBadRequest, "invalid variables")
return
}
}
input := services.CreateTemplateInput{
Name: raw.Name,
Description: raw.Description,
Category: raw.Category,
Content: raw.Content,
Variables: variables,
}
t, err := h.templates.Create(r.Context(), tenantID, input)
if err != nil {
writeError(w, http.StatusBadRequest, err.Error())
return
}
writeJSON(w, http.StatusCreated, t)
}
// Update handles PUT /api/templates/{id}
func (h *TemplateHandler) Update(w http.ResponseWriter, r *http.Request) {
tenantID, ok := auth.TenantFromContext(r.Context())
if !ok {
writeError(w, http.StatusForbidden, "missing tenant")
return
}
templateID, err := parsePathUUID(r, "id")
if err != nil {
writeError(w, http.StatusBadRequest, "invalid template ID")
return
}
var raw struct {
Name *string `json:"name,omitempty"`
Description *string `json:"description,omitempty"`
Category *string `json:"category,omitempty"`
Content *string `json:"content,omitempty"`
Variables any `json:"variables,omitempty"`
}
if err := json.NewDecoder(r.Body).Decode(&raw); err != nil {
writeError(w, http.StatusBadRequest, "invalid request body")
return
}
if raw.Name != nil {
if msg := validateStringLength("name", *raw.Name, maxTitleLen); msg != "" {
writeError(w, http.StatusBadRequest, msg)
return
}
}
var variables []byte
if raw.Variables != nil {
variables, err = json.Marshal(raw.Variables)
if err != nil {
writeError(w, http.StatusBadRequest, "invalid variables")
return
}
}
input := services.UpdateTemplateInput{
Name: raw.Name,
Description: raw.Description,
Category: raw.Category,
Content: raw.Content,
Variables: variables,
}
t, err := h.templates.Update(r.Context(), tenantID, templateID, input)
if err != nil {
writeError(w, http.StatusBadRequest, err.Error())
return
}
if t == nil {
writeError(w, http.StatusNotFound, "template not found")
return
}
writeJSON(w, http.StatusOK, t)
}
// Delete handles DELETE /api/templates/{id}
func (h *TemplateHandler) Delete(w http.ResponseWriter, r *http.Request) {
tenantID, ok := auth.TenantFromContext(r.Context())
if !ok {
writeError(w, http.StatusForbidden, "missing tenant")
return
}
templateID, err := parsePathUUID(r, "id")
if err != nil {
writeError(w, http.StatusBadRequest, "invalid template ID")
return
}
if err := h.templates.Delete(r.Context(), tenantID, templateID); err != nil {
writeError(w, http.StatusBadRequest, err.Error())
return
}
writeJSON(w, http.StatusOK, map[string]string{"status": "deleted"})
}
// Render handles POST /api/templates/{id}/render?case_id=X
func (h *TemplateHandler) Render(w http.ResponseWriter, r *http.Request) {
tenantID, ok := auth.TenantFromContext(r.Context())
if !ok {
writeError(w, http.StatusForbidden, "missing tenant")
return
}
userID, _ := auth.UserFromContext(r.Context())
templateID, err := parsePathUUID(r, "id")
if err != nil {
writeError(w, http.StatusBadRequest, "invalid template ID")
return
}
// Get template
tmpl, err := h.templates.GetByID(r.Context(), tenantID, templateID)
if err != nil {
internalError(w, "failed to get template", err)
return
}
if tmpl == nil {
writeError(w, http.StatusNotFound, "template not found")
return
}
// Build render data
data := services.RenderData{}
// Case data (optional)
caseIDStr := r.URL.Query().Get("case_id")
if caseIDStr != "" {
caseID, err := parseUUID(caseIDStr)
if err != nil {
writeError(w, http.StatusBadRequest, "invalid case_id")
return
}
caseDetail, err := h.cases.GetByID(r.Context(), tenantID, caseID)
if err != nil {
internalError(w, "failed to get case", err)
return
}
if caseDetail == nil {
writeError(w, http.StatusNotFound, "case not found")
return
}
data.Case = &caseDetail.Case
data.Parties = caseDetail.Parties
// Get next upcoming deadline for this case
deadlines, err := h.deadlines.ListForCase(tenantID, caseID)
if err == nil && len(deadlines) > 0 {
// Find next non-completed deadline
for i := range deadlines {
if deadlines[i].Status != "completed" {
data.Deadline = &deadlines[i]
break
}
}
}
}
// Tenant data
tenant, err := h.tenants.GetByID(r.Context(), tenantID)
if err == nil && tenant != nil {
data.Tenant = tenant
}
// User data (userID from context — detailed name/email would need a user table lookup)
data.UserName = userID.String()
data.UserEmail = ""
rendered := h.templates.Render(tmpl, data)
writeJSON(w, http.StatusOK, map[string]any{
"content": rendered,
"template_id": tmpl.ID,
"name": tmpl.Name,
})
}

View File

@@ -2,6 +2,7 @@ package handlers
import (
"encoding/json"
"log/slog"
"net/http"
"github.com/google/uuid"
@@ -41,7 +42,8 @@ func (h *TenantHandler) CreateTenant(w http.ResponseWriter, r *http.Request) {
tenant, err := h.svc.Create(r.Context(), userID, req.Name, req.Slug)
if err != nil {
slog.Error("failed to create tenant", "error", err)
jsonError(w, "internal error", http.StatusInternalServerError)
return
}
@@ -58,10 +60,16 @@ func (h *TenantHandler) ListTenants(w http.ResponseWriter, r *http.Request) {
tenants, err := h.svc.ListForUser(r.Context(), userID)
if err != nil {
slog.Error("failed to list tenants", "error", err)
jsonError(w, "internal error", http.StatusInternalServerError)
return
}
// Mask CalDAV passwords in tenant settings
for i := range tenants {
tenants[i].Settings = maskSettingsPassword(tenants[i].Settings)
}
jsonResponse(w, tenants, http.StatusOK)
}
@@ -82,7 +90,8 @@ func (h *TenantHandler) GetTenant(w http.ResponseWriter, r *http.Request) {
// Verify user has access to this tenant
role, err := h.svc.GetUserRole(r.Context(), userID, tenantID)
if err != nil {
slog.Error("failed to get user role", "error", err)
jsonError(w, "internal error", http.StatusInternalServerError)
return
}
if role == "" {
@@ -92,7 +101,8 @@ func (h *TenantHandler) GetTenant(w http.ResponseWriter, r *http.Request) {
tenant, err := h.svc.GetByID(r.Context(), tenantID)
if err != nil {
slog.Error("failed to get tenant", "error", err)
jsonError(w, "internal error", http.StatusInternalServerError)
return
}
if tenant == nil {
@@ -100,6 +110,9 @@ func (h *TenantHandler) GetTenant(w http.ResponseWriter, r *http.Request) {
return
}
// Mask CalDAV password before returning
tenant.Settings = maskSettingsPassword(tenant.Settings)
jsonResponse(w, tenant, http.StatusOK)
}
@@ -117,14 +130,15 @@ func (h *TenantHandler) InviteUser(w http.ResponseWriter, r *http.Request) {
return
}
// Only owners and partners can invite
role, err := h.svc.GetUserRole(r.Context(), userID, tenantID)
if err != nil {
slog.Error("failed to get user role", "error", err)
jsonError(w, "internal error", http.StatusInternalServerError)
return
}
if role != "owner" && role != "partner" {
jsonError(w, "only owners and partners can invite users", http.StatusForbidden)
return
}
@@ -141,16 +155,22 @@ func (h *TenantHandler) InviteUser(w http.ResponseWriter, r *http.Request) {
return
}
if req.Role == "" {
req.Role = "associate"
}
if !auth.IsValidRole(req.Role) {
jsonError(w, "invalid role", http.StatusBadRequest)
return
}
// Non-owners cannot invite as owner
if role != "owner" && req.Role == "owner" {
jsonError(w, "only owners can invite as owner", http.StatusForbidden)
return
}
ut, err := h.svc.InviteByEmail(r.Context(), tenantID, req.Email, req.Role)
if err != nil {
// These are user-facing validation errors (user not found, already member)
jsonError(w, "failed to invite user", http.StatusBadRequest)
return
}
@@ -177,25 +197,72 @@ func (h *TenantHandler) RemoveMember(w http.ResponseWriter, r *http.Request) {
return
}
// Only owners and partners can remove members (or user removing themselves)
role, err := h.svc.GetUserRole(r.Context(), userID, tenantID)
if err != nil {
slog.Error("failed to get user role", "error", err)
jsonError(w, "internal error", http.StatusInternalServerError)
return
}
if role != "owner" && role != "partner" && userID != memberID {
jsonError(w, "insufficient permissions", http.StatusForbidden)
return
}
if err := h.svc.RemoveMember(r.Context(), tenantID, memberID); err != nil {
// These are user-facing validation errors (not a member, last owner, etc.)
jsonError(w, "failed to remove member", http.StatusBadRequest)
return
}
jsonResponse(w, map[string]string{"status": "removed"}, http.StatusOK)
}
// UpdateSettings handles PUT /api/tenants/{id}/settings
func (h *TenantHandler) UpdateSettings(w http.ResponseWriter, r *http.Request) {
userID, ok := auth.UserFromContext(r.Context())
if !ok {
http.Error(w, "unauthorized", http.StatusUnauthorized)
return
}
tenantID, err := uuid.Parse(r.PathValue("id"))
if err != nil {
jsonError(w, "invalid tenant ID", http.StatusBadRequest)
return
}
// Only owners and partners can update settings
role, err := h.svc.GetUserRole(r.Context(), userID, tenantID)
if err != nil {
slog.Error("failed to get user role", "error", err)
jsonError(w, "internal error", http.StatusInternalServerError)
return
}
if role != "owner" && role != "partner" {
jsonError(w, "only owners and partners can update settings", http.StatusForbidden)
return
}
var settings json.RawMessage
if err := json.NewDecoder(r.Body).Decode(&settings); err != nil {
jsonError(w, "invalid request body", http.StatusBadRequest)
return
}
tenant, err := h.svc.UpdateSettings(r.Context(), tenantID, settings)
if err != nil {
slog.Error("failed to update settings", "error", err)
jsonError(w, "internal error", http.StatusInternalServerError)
return
}
// Mask CalDAV password before returning
tenant.Settings = maskSettingsPassword(tenant.Settings)
jsonResponse(w, tenant, http.StatusOK)
}
// ListMembers handles GET /api/tenants/{id}/members
func (h *TenantHandler) ListMembers(w http.ResponseWriter, r *http.Request) {
userID, ok := auth.UserFromContext(r.Context())
@@ -213,7 +280,8 @@ func (h *TenantHandler) ListMembers(w http.ResponseWriter, r *http.Request) {
// Verify user has access
role, err := h.svc.GetUserRole(r.Context(), userID, tenantID)
if err != nil {
jsonError(w, err.Error(), http.StatusInternalServerError)
slog.Error("failed to get user role", "error", err)
jsonError(w, "internal error", http.StatusInternalServerError)
return
}
if role == "" {
@@ -223,13 +291,173 @@ func (h *TenantHandler) ListMembers(w http.ResponseWriter, r *http.Request) {
members, err := h.svc.ListMembers(r.Context(), tenantID)
if err != nil {
jsonError(w, err.Error(), http.StatusInternalServerError)
slog.Error("failed to list members", "error", err)
jsonError(w, "internal error", http.StatusInternalServerError)
return
}
jsonResponse(w, members, http.StatusOK)
}
// UpdateMemberRole handles PUT /api/tenants/{id}/members/{uid}/role
func (h *TenantHandler) UpdateMemberRole(w http.ResponseWriter, r *http.Request) {
userID, ok := auth.UserFromContext(r.Context())
if !ok {
http.Error(w, "unauthorized", http.StatusUnauthorized)
return
}
tenantID, err := uuid.Parse(r.PathValue("id"))
if err != nil {
jsonError(w, "invalid tenant ID", http.StatusBadRequest)
return
}
memberID, err := uuid.Parse(r.PathValue("uid"))
if err != nil {
jsonError(w, "invalid member ID", http.StatusBadRequest)
return
}
// Only owners and partners can change roles
role, err := h.svc.GetUserRole(r.Context(), userID, tenantID)
if err != nil {
slog.Error("failed to get user role", "error", err)
jsonError(w, "internal error", http.StatusInternalServerError)
return
}
if role != "owner" && role != "partner" {
jsonError(w, "only owners and partners can change roles", http.StatusForbidden)
return
}
var req struct {
Role string `json:"role"`
}
if err := json.NewDecoder(r.Body).Decode(&req); err != nil {
jsonError(w, "invalid request body", http.StatusBadRequest)
return
}
if !auth.IsValidRole(req.Role) {
jsonError(w, "invalid role", http.StatusBadRequest)
return
}
// Non-owners cannot promote to owner
if role != "owner" && req.Role == "owner" {
jsonError(w, "only owners can promote to owner", http.StatusForbidden)
return
}
if err := h.svc.UpdateMemberRole(r.Context(), tenantID, memberID, req.Role); err != nil {
// These are user-facing validation errors
jsonError(w, "failed to update member role", http.StatusBadRequest)
return
}
jsonResponse(w, map[string]string{"status": "updated"}, http.StatusOK)
}
// AutoAssign handles POST /api/tenants/auto-assign — checks if the user's email domain
// matches any tenant's auto_assign_domains and assigns them if so.
func (h *TenantHandler) AutoAssign(w http.ResponseWriter, r *http.Request) {
userID, ok := auth.UserFromContext(r.Context())
if !ok {
http.Error(w, "unauthorized", http.StatusUnauthorized)
return
}
var req struct {
Email string `json:"email"`
}
if err := json.NewDecoder(r.Body).Decode(&req); err != nil {
jsonError(w, "invalid request body", http.StatusBadRequest)
return
}
if req.Email == "" {
jsonError(w, "email is required", http.StatusBadRequest)
return
}
// Extract domain from email
parts := splitEmail(req.Email)
if parts == "" {
jsonError(w, "invalid email format", http.StatusBadRequest)
return
}
result, err := h.svc.AutoAssignByDomain(r.Context(), userID, parts)
if err != nil {
slog.Error("auto-assign failed", "error", err)
jsonError(w, "internal error", http.StatusInternalServerError)
return
}
if result == nil {
jsonResponse(w, map[string]any{"assigned": false}, http.StatusOK)
return
}
jsonResponse(w, map[string]any{
"assigned": true,
"tenant_id": result.ID,
"name": result.Name,
"slug": result.Slug,
"role": result.Role,
"settings": result.Settings,
}, http.StatusOK)
}
// splitEmail extracts the domain part from an email address.
func splitEmail(email string) string {
at := -1
for i, c := range email {
if c == '@' {
at = i
break
}
}
if at < 0 || at >= len(email)-1 {
return ""
}
return email[at+1:]
}
// GetMe handles GET /api/me — returns the current user's ID and role in the active tenant.
func (h *TenantHandler) GetMe(w http.ResponseWriter, r *http.Request) {
userID, ok := auth.UserFromContext(r.Context())
if !ok {
http.Error(w, "unauthorized", http.StatusUnauthorized)
return
}
role := auth.UserRoleFromContext(r.Context())
tenantID, _ := auth.TenantFromContext(r.Context())
// Get user's permissions for frontend UI
perms := auth.GetRolePermissions(role)
// Check if tenant is in demo mode
isDemo := false
if tenant, err := h.svc.GetByID(r.Context(), tenantID); err == nil && tenant != nil {
var settings map[string]json.RawMessage
if json.Unmarshal(tenant.Settings, &settings) == nil {
if demoRaw, ok := settings["demo"]; ok {
var demo bool
if json.Unmarshal(demoRaw, &demo) == nil {
isDemo = demo
}
}
}
}
jsonResponse(w, map[string]any{
"user_id": userID,
"tenant_id": tenantID,
"role": role,
"permissions": perms,
"is_demo": isDemo,
}, http.StatusOK)
}
func jsonResponse(w http.ResponseWriter, data interface{}, status int) {
w.Header().Set("Content-Type", "application/json")
w.WriteHeader(status)


@@ -0,0 +1,209 @@
package handlers
import (
"encoding/json"
"net/http"
"strconv"
"mgit.msbls.de/m/KanzlAI-mGMT/internal/auth"
"mgit.msbls.de/m/KanzlAI-mGMT/internal/services"
"github.com/google/uuid"
)
type TimeEntryHandler struct {
svc *services.TimeEntryService
}
func NewTimeEntryHandler(svc *services.TimeEntryService) *TimeEntryHandler {
return &TimeEntryHandler{svc: svc}
}
// ListForCase handles GET /api/cases/{id}/time-entries
func (h *TimeEntryHandler) ListForCase(w http.ResponseWriter, r *http.Request) {
tenantID, ok := auth.TenantFromContext(r.Context())
if !ok {
writeError(w, http.StatusForbidden, "missing tenant")
return
}
caseID, err := parsePathUUID(r, "id")
if err != nil {
writeError(w, http.StatusBadRequest, "invalid case ID")
return
}
entries, err := h.svc.ListForCase(r.Context(), tenantID, caseID)
if err != nil {
internalError(w, "failed to list time entries", err)
return
}
writeJSON(w, http.StatusOK, map[string]any{"time_entries": entries})
}
// List handles GET /api/time-entries?case_id=&user_id=&from=&to=
func (h *TimeEntryHandler) List(w http.ResponseWriter, r *http.Request) {
tenantID, ok := auth.TenantFromContext(r.Context())
if !ok {
writeError(w, http.StatusForbidden, "missing tenant")
return
}
limit, _ := strconv.Atoi(r.URL.Query().Get("limit"))
offset, _ := strconv.Atoi(r.URL.Query().Get("offset"))
limit, offset = clampPagination(limit, offset)
filter := services.TimeEntryFilter{
From: r.URL.Query().Get("from"),
To: r.URL.Query().Get("to"),
Limit: limit,
Offset: offset,
}
if caseStr := r.URL.Query().Get("case_id"); caseStr != "" {
caseID, err := uuid.Parse(caseStr)
if err != nil {
writeError(w, http.StatusBadRequest, "invalid case_id")
return
}
filter.CaseID = &caseID
}
if userStr := r.URL.Query().Get("user_id"); userStr != "" {
userID, err := uuid.Parse(userStr)
if err != nil {
writeError(w, http.StatusBadRequest, "invalid user_id")
return
}
filter.UserID = &userID
}
entries, total, err := h.svc.List(r.Context(), tenantID, filter)
if err != nil {
internalError(w, "failed to list time entries", err)
return
}
writeJSON(w, http.StatusOK, map[string]any{
"time_entries": entries,
"total": total,
})
}
// Create handles POST /api/cases/{id}/time-entries
func (h *TimeEntryHandler) Create(w http.ResponseWriter, r *http.Request) {
tenantID, ok := auth.TenantFromContext(r.Context())
if !ok {
writeError(w, http.StatusForbidden, "missing tenant")
return
}
userID, _ := auth.UserFromContext(r.Context())
caseID, err := parsePathUUID(r, "id")
if err != nil {
writeError(w, http.StatusBadRequest, "invalid case ID")
return
}
var input services.CreateTimeEntryInput
if err := json.NewDecoder(r.Body).Decode(&input); err != nil {
writeError(w, http.StatusBadRequest, "invalid JSON body")
return
}
input.CaseID = caseID
if input.Description == "" {
writeError(w, http.StatusBadRequest, "description is required")
return
}
if input.DurationMinutes <= 0 {
writeError(w, http.StatusBadRequest, "duration_minutes must be positive")
return
}
if input.Date == "" {
writeError(w, http.StatusBadRequest, "date is required")
return
}
entry, err := h.svc.Create(r.Context(), tenantID, userID, input)
if err != nil {
internalError(w, "failed to create time entry", err)
return
}
writeJSON(w, http.StatusCreated, entry)
}
// Update handles PUT /api/time-entries/{id}
func (h *TimeEntryHandler) Update(w http.ResponseWriter, r *http.Request) {
tenantID, ok := auth.TenantFromContext(r.Context())
if !ok {
writeError(w, http.StatusForbidden, "missing tenant")
return
}
entryID, err := parsePathUUID(r, "id")
if err != nil {
writeError(w, http.StatusBadRequest, "invalid time entry ID")
return
}
var input services.UpdateTimeEntryInput
if err := json.NewDecoder(r.Body).Decode(&input); err != nil {
writeError(w, http.StatusBadRequest, "invalid JSON body")
return
}
entry, err := h.svc.Update(r.Context(), tenantID, entryID, input)
if err != nil {
writeError(w, http.StatusBadRequest, err.Error())
return
}
writeJSON(w, http.StatusOK, entry)
}
// Delete handles DELETE /api/time-entries/{id}
func (h *TimeEntryHandler) Delete(w http.ResponseWriter, r *http.Request) {
tenantID, ok := auth.TenantFromContext(r.Context())
if !ok {
writeError(w, http.StatusForbidden, "missing tenant")
return
}
entryID, err := parsePathUUID(r, "id")
if err != nil {
writeError(w, http.StatusBadRequest, "invalid time entry ID")
return
}
if err := h.svc.Delete(r.Context(), tenantID, entryID); err != nil {
writeError(w, http.StatusBadRequest, err.Error())
return
}
writeJSON(w, http.StatusOK, map[string]string{"status": "deleted"})
}
// Summary handles GET /api/time-entries/summary?group_by=case|user|month&from=&to=
func (h *TimeEntryHandler) Summary(w http.ResponseWriter, r *http.Request) {
tenantID, ok := auth.TenantFromContext(r.Context())
if !ok {
writeError(w, http.StatusForbidden, "missing tenant")
return
}
groupBy := r.URL.Query().Get("group_by")
if groupBy == "" {
groupBy = "case"
}
summaries, err := h.svc.Summary(r.Context(), tenantID, groupBy,
r.URL.Query().Get("from"), r.URL.Query().Get("to"))
if err != nil {
internalError(w, "failed to get summary", err)
return
}
writeJSON(w, http.StatusOK, map[string]any{"summary": summaries})
}

File diff suppressed because it is too large


@@ -0,0 +1,14 @@
package logging
import (
"log/slog"
"os"
)
// Setup initializes the global slog logger with JSON output for production.
func Setup() {
handler := slog.NewJSONHandler(os.Stdout, &slog.HandlerOptions{
Level: slog.LevelInfo,
})
slog.SetDefault(slog.New(handler))
}


@@ -0,0 +1,98 @@
package middleware
import (
"log/slog"
"net/http"
"sync"
"time"
)
// TokenBucket implements a simple per-IP token bucket rate limiter.
type TokenBucket struct {
mu sync.Mutex
buckets map[string]*bucket
rate float64 // tokens per second
burst int // max tokens
}
type bucket struct {
tokens float64
lastTime time.Time
}
// NewTokenBucket creates a rate limiter allowing rate requests per second with burst capacity.
func NewTokenBucket(rate float64, burst int) *TokenBucket {
tb := &TokenBucket{
buckets: make(map[string]*bucket),
rate: rate,
burst: burst,
}
// Periodically clean up stale buckets
go tb.cleanup()
return tb
}
func (tb *TokenBucket) allow(key string) bool {
tb.mu.Lock()
defer tb.mu.Unlock()
b, ok := tb.buckets[key]
if !ok {
b = &bucket{tokens: float64(tb.burst), lastTime: time.Now()}
tb.buckets[key] = b
}
now := time.Now()
elapsed := now.Sub(b.lastTime).Seconds()
b.tokens += elapsed * tb.rate
if b.tokens > float64(tb.burst) {
b.tokens = float64(tb.burst)
}
b.lastTime = now
if b.tokens < 1 {
return false
}
b.tokens--
return true
}
func (tb *TokenBucket) cleanup() {
ticker := time.NewTicker(5 * time.Minute)
defer ticker.Stop()
for range ticker.C {
tb.mu.Lock()
cutoff := time.Now().Add(-10 * time.Minute)
for key, b := range tb.buckets {
if b.lastTime.Before(cutoff) {
delete(tb.buckets, key)
}
}
tb.mu.Unlock()
}
}
// Limit wraps an http.Handler with rate limiting.
func (tb *TokenBucket) Limit(next http.Handler) http.Handler {
return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
// Key buckets by client IP. X-Forwarded-For is only meaningful when set
// by a trusted fronting proxy; otherwise fall back to RemoteAddr.
ip := r.Header.Get("X-Forwarded-For")
if ip == "" {
ip = r.RemoteAddr
}
if !tb.allow(ip) {
slog.Warn("rate limit exceeded", "ip", ip, "path", r.URL.Path)
w.Header().Set("Content-Type", "application/json")
w.Header().Set("Retry-After", "10")
w.WriteHeader(http.StatusTooManyRequests)
w.Write([]byte(`{"error":"rate limit exceeded, try again later"}`))
return
}
next.ServeHTTP(w, r)
})
}
// LimitFunc wraps an http.HandlerFunc with rate limiting.
func (tb *TokenBucket) LimitFunc(next http.HandlerFunc) http.HandlerFunc {
limited := tb.Limit(http.HandlerFunc(next))
return limited.ServeHTTP
}


@@ -0,0 +1,70 @@
package middleware
import (
"net/http"
"net/http/httptest"
"testing"
)
func TestTokenBucket_AllowsBurst(t *testing.T) {
tb := NewTokenBucket(1.0, 5) // 1/sec, burst 5
handler := tb.LimitFunc(func(w http.ResponseWriter, r *http.Request) {
w.WriteHeader(http.StatusOK)
})
// Should allow burst of 5 requests
for i := 0; i < 5; i++ {
req := httptest.NewRequest("GET", "/test", nil)
w := httptest.NewRecorder()
handler.ServeHTTP(w, req)
if w.Code != http.StatusOK {
t.Fatalf("request %d: expected 200, got %d", i+1, w.Code)
}
}
// 6th request should be rate limited
req := httptest.NewRequest("GET", "/test", nil)
w := httptest.NewRecorder()
handler.ServeHTTP(w, req)
if w.Code != http.StatusTooManyRequests {
t.Fatalf("request 6: expected 429, got %d", w.Code)
}
}
func TestTokenBucket_DifferentIPs(t *testing.T) {
tb := NewTokenBucket(1.0, 2) // 1/sec, burst 2
handler := tb.LimitFunc(func(w http.ResponseWriter, r *http.Request) {
w.WriteHeader(http.StatusOK)
})
// Exhaust IP1's bucket
for i := 0; i < 2; i++ {
req := httptest.NewRequest("GET", "/test", nil)
req.Header.Set("X-Forwarded-For", "1.2.3.4")
w := httptest.NewRecorder()
handler.ServeHTTP(w, req)
if w.Code != http.StatusOK {
t.Fatalf("ip1 request %d: expected 200, got %d", i+1, w.Code)
}
}
// IP1 should now be limited
req := httptest.NewRequest("GET", "/test", nil)
req.Header.Set("X-Forwarded-For", "1.2.3.4")
w := httptest.NewRecorder()
handler.ServeHTTP(w, req)
if w.Code != http.StatusTooManyRequests {
t.Fatalf("ip1 request 3: expected 429, got %d", w.Code)
}
// IP2 should still work
req = httptest.NewRequest("GET", "/test", nil)
req.Header.Set("X-Forwarded-For", "5.6.7.8")
w = httptest.NewRecorder()
handler.ServeHTTP(w, req)
if w.Code != http.StatusOK {
t.Fatalf("ip2 request 1: expected 200, got %d", w.Code)
}
}


@@ -0,0 +1,49 @@
package middleware
import (
"net/http"
"strings"
)
// SecurityHeaders adds standard security headers to all responses.
func SecurityHeaders(next http.Handler) http.Handler {
return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
w.Header().Set("X-Frame-Options", "DENY")
w.Header().Set("X-Content-Type-Options", "nosniff")
w.Header().Set("X-XSS-Protection", "1; mode=block")
w.Header().Set("Strict-Transport-Security", "max-age=31536000; includeSubDomains")
w.Header().Set("Referrer-Policy", "strict-origin-when-cross-origin")
next.ServeHTTP(w, r)
})
}
// CORS returns middleware that restricts cross-origin requests to the given origin.
// If allowedOrigin is empty, CORS headers are not set (same-origin only).
func CORS(allowedOrigin string) func(http.Handler) http.Handler {
return func(next http.Handler) http.Handler {
return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
origin := r.Header.Get("Origin")
if allowedOrigin != "" && origin != "" && matchOrigin(origin, allowedOrigin) {
w.Header().Set("Access-Control-Allow-Origin", allowedOrigin)
w.Header().Set("Access-Control-Allow-Methods", "GET, POST, PUT, PATCH, DELETE, OPTIONS")
w.Header().Set("Access-Control-Allow-Headers", "Content-Type, Authorization, X-Tenant-ID")
w.Header().Set("Access-Control-Max-Age", "86400")
w.Header().Set("Vary", "Origin")
}
// Handle preflight: short-circuit OPTIONS. Disallowed origins received no
// CORS headers above, so the browser rejects the response anyway.
if r.Method == http.MethodOptions {
w.WriteHeader(http.StatusNoContent)
return
}
next.ServeHTTP(w, r)
})
}
}
// matchOrigin checks if the request origin matches the allowed origin.
func matchOrigin(origin, allowed string) bool {
return strings.EqualFold(strings.TrimRight(origin, "/"), strings.TrimRight(allowed, "/"))
}


@@ -0,0 +1,22 @@
package models
import (
"encoding/json"
"time"
"github.com/google/uuid"
)
type AuditLog struct {
ID int64 `db:"id" json:"id"`
TenantID uuid.UUID `db:"tenant_id" json:"tenant_id"`
UserID *uuid.UUID `db:"user_id" json:"user_id,omitempty"`
Action string `db:"action" json:"action"`
EntityType string `db:"entity_type" json:"entity_type"`
EntityID *uuid.UUID `db:"entity_id" json:"entity_id,omitempty"`
OldValues *json.RawMessage `db:"old_values" json:"old_values,omitempty"`
NewValues *json.RawMessage `db:"new_values" json:"new_values,omitempty"`
IPAddress *string `db:"ip_address" json:"ip_address,omitempty"`
UserAgent *string `db:"user_agent" json:"user_agent,omitempty"`
CreatedAt time.Time `db:"created_at" json:"created_at"`
}


@@ -0,0 +1,18 @@
package models
import (
"time"
"github.com/google/uuid"
)
type BillingRate struct {
ID uuid.UUID `db:"id" json:"id"`
TenantID uuid.UUID `db:"tenant_id" json:"tenant_id"`
UserID *uuid.UUID `db:"user_id" json:"user_id,omitempty"`
Rate float64 `db:"rate" json:"rate"`
Currency string `db:"currency" json:"currency"`
ValidFrom string `db:"valid_from" json:"valid_from"`
ValidTo *string `db:"valid_to" json:"valid_to,omitempty"`
CreatedAt time.Time `db:"created_at" json:"created_at"`
}


@@ -0,0 +1,15 @@
package models
import (
"time"
"github.com/google/uuid"
)
type CaseAssignment struct {
ID uuid.UUID `db:"id" json:"id"`
CaseID uuid.UUID `db:"case_id" json:"case_id"`
UserID uuid.UUID `db:"user_id" json:"user_id"`
Role string `db:"role" json:"role"`
AssignedAt time.Time `db:"assigned_at" json:"assigned_at"`
}


@@ -26,6 +26,8 @@ type DeadlineRule struct {
AltDurationValue *int `db:"alt_duration_value" json:"alt_duration_value,omitempty"`
AltDurationUnit *string `db:"alt_duration_unit" json:"alt_duration_unit,omitempty"`
AltRuleCode *string `db:"alt_rule_code" json:"alt_rule_code,omitempty"`
IsSpawn bool `db:"is_spawn" json:"is_spawn"`
SpawnLabel *string `db:"spawn_label" json:"spawn_label,omitempty"`
IsActive bool `db:"is_active" json:"is_active"`
CreatedAt time.Time `db:"created_at" json:"created_at"`
UpdatedAt time.Time `db:"updated_at" json:"updated_at"`
@@ -37,6 +39,7 @@ type ProceedingType struct {
Name string `db:"name" json:"name"`
Description *string `db:"description" json:"description,omitempty"`
Jurisdiction *string `db:"jurisdiction" json:"jurisdiction,omitempty"`
Category *string `db:"category" json:"category,omitempty"`
DefaultColor string `db:"default_color" json:"default_color"`
SortOrder int `db:"sort_order" json:"sort_order"`
IsActive bool `db:"is_active" json:"is_active"`


@@ -0,0 +1,21 @@
package models
import (
"encoding/json"
"time"
"github.com/google/uuid"
)
type DocumentTemplate struct {
ID uuid.UUID `db:"id" json:"id"`
TenantID *uuid.UUID `db:"tenant_id" json:"tenant_id,omitempty"`
Name string `db:"name" json:"name"`
Description *string `db:"description" json:"description,omitempty"`
Category string `db:"category" json:"category"`
Content string `db:"content" json:"content"`
Variables json.RawMessage `db:"variables" json:"variables"`
IsSystem bool `db:"is_system" json:"is_system"`
CreatedAt time.Time `db:"created_at" json:"created_at"`
UpdatedAt time.Time `db:"updated_at" json:"updated_at"`
}


@@ -0,0 +1,125 @@
package models
// FeeScheduleVersion identifies a fee schedule version.
type FeeScheduleVersion string
const (
FeeVersion2005 FeeScheduleVersion = "2005"
FeeVersion2013 FeeScheduleVersion = "2013"
FeeVersion2021 FeeScheduleVersion = "2021"
FeeVersion2025 FeeScheduleVersion = "2025"
FeeVersionAktuell FeeScheduleVersion = "Aktuell"
)
// InstanceType identifies a court instance.
type InstanceType string
const (
InstanceLG InstanceType = "LG"
InstanceOLG InstanceType = "OLG"
InstanceBGHNZB InstanceType = "BGH_NZB"
InstanceBGHRev InstanceType = "BGH_Rev"
InstanceBPatG InstanceType = "BPatG"
InstanceBGHNull InstanceType = "BGH_Null"
InstanceDPMA InstanceType = "DPMA"
InstanceBPatGCanc InstanceType = "BPatG_Canc"
)
// ProceedingPath identifies the type of patent litigation proceeding.
type ProceedingPath string
const (
PathInfringement ProceedingPath = "infringement"
PathNullity ProceedingPath = "nullity"
PathCancellation ProceedingPath = "cancellation"
)
// --- Request ---
// FeeCalculateRequest is the request body for POST /api/fees/calculate.
type FeeCalculateRequest struct {
Streitwert float64 `json:"streitwert"`
VATRate float64 `json:"vat_rate"`
ProceedingPath ProceedingPath `json:"proceeding_path"`
Instances []InstanceInput `json:"instances"`
IncludeSecurityCosts bool `json:"include_security_costs"`
}
// InstanceInput configures one court instance in the calculation request.
type InstanceInput struct {
Type InstanceType `json:"type"`
Enabled bool `json:"enabled"`
FeeVersion FeeScheduleVersion `json:"fee_version"`
NumAttorneys int `json:"num_attorneys"`
NumPatentAttorneys int `json:"num_patent_attorneys"`
NumClients int `json:"num_clients"`
OralHearing bool `json:"oral_hearing"`
ExpertFees float64 `json:"expert_fees"`
}
// --- Response ---
// FeeCalculateResponse is the response for POST /api/fees/calculate.
type FeeCalculateResponse struct {
Instances []InstanceResult `json:"instances"`
Totals []FeeTotal `json:"totals"`
SecurityForCosts *SecurityForCosts `json:"security_for_costs,omitempty"`
}
// InstanceResult contains the cost breakdown for one court instance.
type InstanceResult struct {
Type InstanceType `json:"type"`
Label string `json:"label"`
CourtFeeBase float64 `json:"court_fee_base"`
CourtFeeMultiplier float64 `json:"court_fee_multiplier"`
CourtFeeSource string `json:"court_fee_source"`
CourtFee float64 `json:"court_fee"`
ExpertFees float64 `json:"expert_fees"`
CourtSubtotal float64 `json:"court_subtotal"`
AttorneyBreakdown *AttorneyBreakdown `json:"attorney_breakdown,omitempty"`
PatentAttorneyBreakdown *AttorneyBreakdown `json:"patent_attorney_breakdown,omitempty"`
AttorneySubtotal float64 `json:"attorney_subtotal"`
PatentAttorneySubtotal float64 `json:"patent_attorney_subtotal"`
InstanceTotal float64 `json:"instance_total"`
}
// AttorneyBreakdown details the fee computation for one attorney type.
type AttorneyBreakdown struct {
BaseFee float64 `json:"base_fee"`
VGFactor float64 `json:"vg_factor"`
VGFee float64 `json:"vg_fee"`
IncreaseFee float64 `json:"increase_fee"`
TGFactor float64 `json:"tg_factor"`
TGFee float64 `json:"tg_fee"`
Pauschale float64 `json:"pauschale"`
SubtotalNet float64 `json:"subtotal_net"`
VAT float64 `json:"vat"`
SubtotalGross float64 `json:"subtotal_gross"`
Count int `json:"count"`
TotalGross float64 `json:"total_gross"`
}
// FeeTotal is a labeled total amount.
type FeeTotal struct {
Label string `json:"label"`
Total float64 `json:"total"`
}
// SecurityForCosts is the Prozesskostensicherheit calculation result.
type SecurityForCosts struct {
Instance1 float64 `json:"instance_1"`
Instance2 float64 `json:"instance_2"`
NZB float64 `json:"nzb"`
SubtotalNet float64 `json:"subtotal_net"`
VAT float64 `json:"vat"`
TotalGross float64 `json:"total_gross"`
}
// FeeScheduleInfo describes a fee schedule version for the schedules endpoint.
type FeeScheduleInfo struct {
Key string `json:"key"`
Label string `json:"label"`
ValidFrom string `json:"valid_from"`
IsAlias bool `json:"is_alias,omitempty"`
AliasOf string `json:"alias_of,omitempty"`
}


@@ -0,0 +1,38 @@
package models
import (
"encoding/json"
"time"
"github.com/google/uuid"
)
type Invoice struct {
ID uuid.UUID `db:"id" json:"id"`
TenantID uuid.UUID `db:"tenant_id" json:"tenant_id"`
CaseID uuid.UUID `db:"case_id" json:"case_id"`
InvoiceNumber string `db:"invoice_number" json:"invoice_number"`
ClientName string `db:"client_name" json:"client_name"`
ClientAddress *string `db:"client_address" json:"client_address,omitempty"`
Items json.RawMessage `db:"items" json:"items"`
Subtotal float64 `db:"subtotal" json:"subtotal"`
TaxRate float64 `db:"tax_rate" json:"tax_rate"`
TaxAmount float64 `db:"tax_amount" json:"tax_amount"`
Total float64 `db:"total" json:"total"`
Status string `db:"status" json:"status"`
IssuedAt *string `db:"issued_at" json:"issued_at,omitempty"`
DueAt *string `db:"due_at" json:"due_at,omitempty"`
PaidAt *time.Time `db:"paid_at" json:"paid_at,omitempty"`
Notes *string `db:"notes" json:"notes,omitempty"`
CreatedBy uuid.UUID `db:"created_by" json:"created_by"`
CreatedAt time.Time `db:"created_at" json:"created_at"`
UpdatedAt time.Time `db:"updated_at" json:"updated_at"`
}
type InvoiceItem struct {
Description string `json:"description"`
DurationMinutes int `json:"duration_minutes,omitempty"`
HourlyRate float64 `json:"hourly_rate,omitempty"`
Amount float64 `json:"amount"`
TimeEntryID *string `json:"time_entry_id,omitempty"`
}


@@ -0,0 +1,20 @@
package models
import (
"time"
"github.com/google/uuid"
)
type Note struct {
ID uuid.UUID `db:"id" json:"id"`
TenantID uuid.UUID `db:"tenant_id" json:"tenant_id"`
CaseID *uuid.UUID `db:"case_id" json:"case_id,omitempty"`
DeadlineID *uuid.UUID `db:"deadline_id" json:"deadline_id,omitempty"`
AppointmentID *uuid.UUID `db:"appointment_id" json:"appointment_id,omitempty"`
CaseEventID *uuid.UUID `db:"case_event_id" json:"case_event_id,omitempty"`
Content string `db:"content" json:"content"`
CreatedBy *uuid.UUID `db:"created_by" json:"created_by,omitempty"`
CreatedAt time.Time `db:"created_at" json:"created_at"`
UpdatedAt time.Time `db:"updated_at" json:"updated_at"`
}


@@ -0,0 +1,32 @@
package models
import (
"time"
"github.com/google/uuid"
"github.com/lib/pq"
)
type Notification struct {
ID uuid.UUID `db:"id" json:"id"`
TenantID uuid.UUID `db:"tenant_id" json:"tenant_id"`
UserID uuid.UUID `db:"user_id" json:"user_id"`
Type string `db:"type" json:"type"`
EntityType *string `db:"entity_type" json:"entity_type,omitempty"`
EntityID *uuid.UUID `db:"entity_id" json:"entity_id,omitempty"`
Title string `db:"title" json:"title"`
Body *string `db:"body" json:"body,omitempty"`
SentAt *time.Time `db:"sent_at" json:"sent_at,omitempty"`
ReadAt *time.Time `db:"read_at" json:"read_at,omitempty"`
CreatedAt time.Time `db:"created_at" json:"created_at"`
}
type NotificationPreferences struct {
UserID uuid.UUID `db:"user_id" json:"user_id"`
TenantID uuid.UUID `db:"tenant_id" json:"tenant_id"`
DeadlineReminderDays pq.Int64Array `db:"deadline_reminder_days" json:"deadline_reminder_days"`
EmailEnabled bool `db:"email_enabled" json:"email_enabled"`
DailyDigest bool `db:"daily_digest" json:"daily_digest"`
CreatedAt time.Time `db:"created_at" json:"created_at"`
UpdatedAt time.Time `db:"updated_at" json:"updated_at"`
}


@@ -20,6 +20,7 @@ type UserTenant struct {
UserID uuid.UUID `db:"user_id" json:"user_id"`
TenantID uuid.UUID `db:"tenant_id" json:"tenant_id"`
Role string `db:"role" json:"role"`
Email string `db:"email" json:"email"`
CreatedAt time.Time `db:"created_at" json:"created_at"`
}


@@ -0,0 +1,24 @@
package models
import (
"time"
"github.com/google/uuid"
)
type TimeEntry struct {
ID uuid.UUID `db:"id" json:"id"`
TenantID uuid.UUID `db:"tenant_id" json:"tenant_id"`
CaseID uuid.UUID `db:"case_id" json:"case_id"`
UserID uuid.UUID `db:"user_id" json:"user_id"`
Date string `db:"date" json:"date"`
DurationMinutes int `db:"duration_minutes" json:"duration_minutes"`
Description string `db:"description" json:"description"`
Activity *string `db:"activity" json:"activity,omitempty"`
Billable bool `db:"billable" json:"billable"`
Billed bool `db:"billed" json:"billed"`
InvoiceID *uuid.UUID `db:"invoice_id" json:"invoice_id,omitempty"`
HourlyRate *float64 `db:"hourly_rate" json:"hourly_rate,omitempty"`
CreatedAt time.Time `db:"created_at" json:"created_at"`
UpdatedAt time.Time `db:"updated_at" json:"updated_at"`
}


@@ -2,53 +2,91 @@ package router
import (
"encoding/json"
"log/slog"
"net/http"
"time"
"github.com/jmoiron/sqlx"
"mgit.msbls.de/m/KanzlAI-mGMT/internal/auth"
"mgit.msbls.de/m/KanzlAI-mGMT/internal/config"
"mgit.msbls.de/m/KanzlAI-mGMT/internal/handlers"
"mgit.msbls.de/m/KanzlAI-mGMT/internal/middleware"
"mgit.msbls.de/m/KanzlAI-mGMT/internal/services"
)
func New(db *sqlx.DB, authMW *auth.Middleware, cfg *config.Config) http.Handler {
func New(db *sqlx.DB, authMW *auth.Middleware, cfg *config.Config, calDAVSvc *services.CalDAVService, notifSvc *services.NotificationService, youpcDB ...*sqlx.DB) http.Handler {
mux := http.NewServeMux()
// Services
tenantSvc := services.NewTenantService(db)
caseSvc := services.NewCaseService(db)
partySvc := services.NewPartyService(db)
appointmentSvc := services.NewAppointmentService(db)
auditSvc := services.NewAuditService(db)
tenantSvc := services.NewTenantService(db, auditSvc)
caseSvc := services.NewCaseService(db, auditSvc)
partySvc := services.NewPartyService(db, auditSvc)
appointmentSvc := services.NewAppointmentService(db, auditSvc)
holidaySvc := services.NewHolidayService(db)
deadlineSvc := services.NewDeadlineService(db)
deadlineSvc := services.NewDeadlineService(db, auditSvc)
deadlineRuleSvc := services.NewDeadlineRuleService(db)
calculator := services.NewDeadlineCalculator(holidaySvc)
determineSvc := services.NewDetermineService(db, calculator)
storageCli := services.NewStorageClient(cfg.SupabaseURL, cfg.SupabaseServiceKey)
documentSvc := services.NewDocumentService(db, storageCli)
documentSvc := services.NewDocumentService(db, storageCli, auditSvc)
assignmentSvc := services.NewCaseAssignmentService(db)
reportSvc := services.NewReportingService(db)
timeEntrySvc := services.NewTimeEntryService(db, auditSvc)
invoiceSvc := services.NewInvoiceService(db, auditSvc)
billingRateSvc := services.NewBillingRateService(db, auditSvc)
templateSvc := services.NewTemplateService(db, auditSvc)
// AI service (optional — only if API key is configured)
var aiH *handlers.AIHandler
if cfg.AnthropicAPIKey != "" {
aiSvc := services.NewAIService(cfg.AnthropicAPIKey, db)
aiH = handlers.NewAIHandler(aiSvc, db)
var ydb *sqlx.DB
if len(youpcDB) > 0 {
ydb = youpcDB[0]
}
aiSvc := services.NewAIService(cfg.AnthropicAPIKey, db, ydb)
aiH = handlers.NewAIHandler(aiSvc)
}
// Middleware
tenantResolver := auth.NewTenantResolver(tenantSvc)
noteSvc := services.NewNoteService(db, auditSvc)
dashboardSvc := services.NewDashboardService(db)
// Notification handler (optional — nil in tests)
var notifH *handlers.NotificationHandler
if notifSvc != nil {
notifH = handlers.NewNotificationHandler(notifSvc, db)
}
// Handlers
auditH := handlers.NewAuditLogHandler(auditSvc)
tenantH := handlers.NewTenantHandler(tenantSvc)
caseH := handlers.NewCaseHandler(caseSvc)
partyH := handlers.NewPartyHandler(partySvc)
apptH := handlers.NewAppointmentHandler(appointmentSvc)
deadlineH := handlers.NewDeadlineHandlers(deadlineSvc, db)
deadlineH := handlers.NewDeadlineHandlers(deadlineSvc)
ruleH := handlers.NewDeadlineRuleHandlers(deadlineRuleSvc)
calcH := handlers.NewCalculateHandlers(calculator, deadlineRuleSvc)
determineH := handlers.NewDetermineHandlers(determineSvc, deadlineSvc)
dashboardH := handlers.NewDashboardHandler(dashboardSvc)
noteH := handlers.NewNoteHandler(noteSvc)
eventH := handlers.NewCaseEventHandler(db)
docH := handlers.NewDocumentHandler(documentSvc)
assignmentH := handlers.NewCaseAssignmentHandler(assignmentSvc)
reportH := handlers.NewReportHandler(reportSvc)
timeH := handlers.NewTimeEntryHandler(timeEntrySvc)
invoiceH := handlers.NewInvoiceHandler(invoiceSvc)
billingH := handlers.NewBillingRateHandler(billingRateSvc)
templateH := handlers.NewTemplateHandler(templateSvc, caseSvc, partySvc, deadlineSvc, tenantSvc)
// Fee calculator (public — no auth required, pure computation)
feeCalc := services.NewFeeCalculator()
feeCalcH := handlers.NewFeeCalculatorHandler(feeCalc)
mux.HandleFunc("POST /api/fees/calculate", feeCalcH.Calculate)
mux.HandleFunc("GET /api/fees/schedules", feeCalcH.Schedules)
// Public routes
mux.HandleFunc("GET /health", handleHealth(db))
api := http.NewServeMux()
// Tenant management (no tenant resolver — these operate across tenants)
api.HandleFunc("POST /api/tenants/auto-assign", tenantH.AutoAssign)
api.HandleFunc("POST /api/tenants", tenantH.CreateTenant)
api.HandleFunc("GET /api/tenants", tenantH.ListTenants)
api.HandleFunc("GET /api/tenants/{id}", tenantH.GetTenant)
api.HandleFunc("PUT /api/tenants/{id}/settings", tenantH.UpdateSettings)
api.HandleFunc("POST /api/tenants/{id}/invite", tenantH.InviteUser)
api.HandleFunc("DELETE /api/tenants/{id}/members/{uid}", tenantH.RemoveMember)
api.HandleFunc("GET /api/tenants/{id}/members", tenantH.ListMembers)
api.HandleFunc("PUT /api/tenants/{id}/members/{uid}/role", tenantH.UpdateMemberRole)
// Permission-wrapping helper: wraps a HandlerFunc with a permission check
perm := func(p auth.Permission, fn http.HandlerFunc) http.HandlerFunc {
return func(w http.ResponseWriter, r *http.Request) {
role := auth.UserRoleFromContext(r.Context())
if !auth.HasPermission(role, p) {
w.Header().Set("Content-Type", "application/json")
w.WriteHeader(http.StatusForbidden)
w.Write([]byte(`{"error":"insufficient permissions"}`))
return
}
fn(w, r)
}
}
// Tenant-scoped routes (require tenant context)
scoped := http.NewServeMux()
// Current user info (role, permissions) — all authenticated users
scoped.HandleFunc("GET /api/me", tenantH.GetMe)
// Cases — all can view, create needs PermCreateCase, archive needs PermCreateCase
scoped.HandleFunc("GET /api/cases", caseH.List)
scoped.HandleFunc("POST /api/cases", perm(auth.PermCreateCase, caseH.Create))
scoped.HandleFunc("GET /api/cases/{id}", caseH.Get)
scoped.HandleFunc("PUT /api/cases/{id}", caseH.Update)
scoped.HandleFunc("DELETE /api/cases/{id}", perm(auth.PermCreateCase, caseH.Delete))
// Parties — same access as case editing
scoped.HandleFunc("GET /api/cases/{id}/parties", partyH.List)
scoped.HandleFunc("POST /api/cases/{id}/parties", partyH.Create)
scoped.HandleFunc("PUT /api/parties/{partyId}", partyH.Update)
scoped.HandleFunc("DELETE /api/parties/{partyId}", partyH.Delete)
// Deadlines — manage needs PermManageDeadlines, view is open
scoped.HandleFunc("GET /api/deadlines/{deadlineID}", deadlineH.Get)
scoped.HandleFunc("GET /api/deadlines", deadlineH.ListAll)
scoped.HandleFunc("GET /api/cases/{caseID}/deadlines", deadlineH.ListForCase)
scoped.HandleFunc("POST /api/cases/{caseID}/deadlines", perm(auth.PermManageDeadlines, deadlineH.Create))
scoped.HandleFunc("PUT /api/deadlines/{deadlineID}", perm(auth.PermManageDeadlines, deadlineH.Update))
scoped.HandleFunc("PATCH /api/deadlines/{deadlineID}/complete", perm(auth.PermManageDeadlines, deadlineH.Complete))
scoped.HandleFunc("DELETE /api/deadlines/{deadlineID}", perm(auth.PermManageDeadlines, deadlineH.Delete))
// Deadline rules (reference data) — all can read
scoped.HandleFunc("GET /api/deadline-rules", ruleH.List)
scoped.HandleFunc("GET /api/deadline-rules/{type}", ruleH.GetRuleTree)
scoped.HandleFunc("GET /api/proceeding-types", ruleH.ListProceedingTypes)
// Deadline calculator — all can use
scoped.HandleFunc("POST /api/deadlines/calculate", calcH.Calculate)
// Deadline determination — full timeline calculation with conditions
scoped.HandleFunc("GET /api/proceeding-types/{code}/timeline", determineH.GetTimeline)
scoped.HandleFunc("POST /api/deadlines/determine", determineH.Determine)
scoped.HandleFunc("POST /api/cases/{caseID}/deadlines/batch", perm(auth.PermManageDeadlines, determineH.BatchCreate))
// Appointments — all can manage (PermManageAppointments granted to all)
scoped.HandleFunc("GET /api/appointments/{id}", apptH.Get)
scoped.HandleFunc("GET /api/appointments", apptH.List)
scoped.HandleFunc("POST /api/appointments", perm(auth.PermManageAppointments, apptH.Create))
scoped.HandleFunc("PUT /api/appointments/{id}", perm(auth.PermManageAppointments, apptH.Update))
scoped.HandleFunc("DELETE /api/appointments/{id}", perm(auth.PermManageAppointments, apptH.Delete))
// Case assignments — manage team required for assign/unassign
scoped.HandleFunc("GET /api/cases/{id}/assignments", assignmentH.List)
scoped.HandleFunc("POST /api/cases/{id}/assignments", perm(auth.PermManageTeam, assignmentH.Assign))
scoped.HandleFunc("DELETE /api/cases/{id}/assignments/{uid}", perm(auth.PermManageTeam, assignmentH.Unassign))
// Case events — all can view
scoped.HandleFunc("GET /api/case-events/{id}", eventH.Get)
// Notes — all can manage
scoped.HandleFunc("GET /api/notes", noteH.List)
scoped.HandleFunc("POST /api/notes", noteH.Create)
scoped.HandleFunc("PUT /api/notes/{id}", noteH.Update)
scoped.HandleFunc("DELETE /api/notes/{id}", noteH.Delete)
// Dashboard — all can view
scoped.HandleFunc("GET /api/dashboard", dashboardH.Get)
// Audit log
scoped.HandleFunc("GET /api/audit-log", perm(auth.PermViewAuditLog, auditH.List))
// Documents — all can upload, delete checked in handler (own vs all)
scoped.HandleFunc("GET /api/cases/{id}/documents", docH.ListByCase)
scoped.HandleFunc("POST /api/cases/{id}/documents", perm(auth.PermUploadDocuments, docH.Upload))
scoped.HandleFunc("GET /api/documents/{docId}", docH.Download)
scoped.HandleFunc("GET /api/documents/{docId}/meta", docH.GetMeta)
scoped.HandleFunc("DELETE /api/documents/{docId}", docH.Delete)
// AI endpoints (rate limited: 5 req/min burst 10 per IP)
if aiH != nil {
aiLimiter := middleware.NewTokenBucket(5.0/60.0, 10)
scoped.HandleFunc("POST /api/ai/extract-deadlines", perm(auth.PermAIExtraction, aiLimiter.LimitFunc(aiH.ExtractDeadlines)))
scoped.HandleFunc("POST /api/ai/summarize-case", perm(auth.PermAIExtraction, aiLimiter.LimitFunc(aiH.SummarizeCase)))
scoped.HandleFunc("POST /api/ai/draft-document", perm(auth.PermAIExtraction, aiLimiter.LimitFunc(aiH.DraftDocument)))
scoped.HandleFunc("POST /api/ai/case-strategy", perm(auth.PermAIExtraction, aiLimiter.LimitFunc(aiH.CaseStrategy)))
scoped.HandleFunc("POST /api/ai/similar-cases", perm(auth.PermAIExtraction, aiLimiter.LimitFunc(aiH.SimilarCases)))
}
// Notifications
if notifH != nil {
scoped.HandleFunc("GET /api/notifications", notifH.List)
scoped.HandleFunc("GET /api/notifications/unread-count", notifH.UnreadCount)
scoped.HandleFunc("PATCH /api/notifications/{id}/read", notifH.MarkRead)
scoped.HandleFunc("PATCH /api/notifications/read-all", notifH.MarkAllRead)
scoped.HandleFunc("GET /api/notification-preferences", notifH.GetPreferences)
scoped.HandleFunc("PUT /api/notification-preferences", notifH.UpdatePreferences)
}
// CalDAV sync endpoints — settings permission required
if calDAVSvc != nil {
calDAVH := handlers.NewCalDAVHandler(calDAVSvc)
scoped.HandleFunc("POST /api/caldav/sync", perm(auth.PermManageSettings, calDAVH.TriggerSync))
scoped.HandleFunc("GET /api/caldav/status", calDAVH.GetStatus)
}
// Reports — cases/deadlines/workload open to all, billing restricted
scoped.HandleFunc("GET /api/reports/cases", reportH.Cases)
scoped.HandleFunc("GET /api/reports/deadlines", reportH.Deadlines)
scoped.HandleFunc("GET /api/reports/workload", reportH.Workload)
scoped.HandleFunc("GET /api/reports/billing", perm(auth.PermManageBilling, reportH.Billing))
// Time entries — all can view/create, tied to cases
scoped.HandleFunc("GET /api/cases/{id}/time-entries", timeH.ListForCase)
scoped.HandleFunc("GET /api/time-entries", timeH.List)
scoped.HandleFunc("POST /api/cases/{id}/time-entries", timeH.Create)
scoped.HandleFunc("PUT /api/time-entries/{id}", timeH.Update)
scoped.HandleFunc("DELETE /api/time-entries/{id}", timeH.Delete)
scoped.HandleFunc("GET /api/time-entries/summary", timeH.Summary)
// Invoices — billing permission required
scoped.HandleFunc("GET /api/invoices", perm(auth.PermManageBilling, invoiceH.List))
scoped.HandleFunc("GET /api/invoices/{id}", perm(auth.PermManageBilling, invoiceH.Get))
scoped.HandleFunc("POST /api/invoices", perm(auth.PermManageBilling, invoiceH.Create))
scoped.HandleFunc("PUT /api/invoices/{id}", perm(auth.PermManageBilling, invoiceH.Update))
scoped.HandleFunc("PATCH /api/invoices/{id}/status", perm(auth.PermManageBilling, invoiceH.UpdateStatus))
// Billing rates — billing permission required
scoped.HandleFunc("GET /api/billing-rates", perm(auth.PermManageBilling, billingH.List))
scoped.HandleFunc("PUT /api/billing-rates", perm(auth.PermManageBilling, billingH.Upsert))
// Document templates — all can view/use, manage needs case creation permission
scoped.HandleFunc("GET /api/templates", templateH.List)
scoped.HandleFunc("GET /api/templates/{id}", templateH.Get)
scoped.HandleFunc("POST /api/templates", perm(auth.PermCreateCase, templateH.Create))
scoped.HandleFunc("PUT /api/templates/{id}", perm(auth.PermCreateCase, templateH.Update))
scoped.HandleFunc("DELETE /api/templates/{id}", perm(auth.PermCreateCase, templateH.Delete))
scoped.HandleFunc("POST /api/templates/{id}/render", templateH.Render)
// Wire: auth -> tenant routes go directly, scoped routes get tenant resolver
api.Handle("/api/", tenantResolver.Resolve(scoped))
mux.Handle("/api/", authMW.RequireAuth(api))
// Apply security middleware stack: CORS -> Security Headers -> Request Logger -> Routes
var handler http.Handler = mux
handler = requestLogger(handler)
handler = middleware.SecurityHeaders(handler)
handler = middleware.CORS(cfg.FrontendOrigin)(handler)
return handler
}
func handleHealth(db *sqlx.DB) http.HandlerFunc {
return func(w http.ResponseWriter, r *http.Request) {
// Set Content-Type up front so both the error and success paths return JSON.
w.Header().Set("Content-Type", "application/json")
if err := db.Ping(); err != nil {
w.WriteHeader(http.StatusServiceUnavailable)
json.NewEncoder(w).Encode(map[string]string{"status": "error"})
return
}
}
}
type statusWriter struct {
http.ResponseWriter
status int
}
func (w *statusWriter) WriteHeader(code int) {
w.status = code
w.ResponseWriter.WriteHeader(code)
}
func requestLogger(next http.Handler) http.Handler {
return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
// Skip health checks to reduce noise
if r.URL.Path == "/health" {
next.ServeHTTP(w, r)
return
}
sw := &statusWriter{ResponseWriter: w, status: http.StatusOK}
start := time.Now()
next.ServeHTTP(sw, r)
slog.Info("request",
"method", r.Method,
"path", r.URL.Path,
"status", sw.status,
"duration_ms", time.Since(start).Milliseconds(),
)
})
}
"encoding/base64"
"encoding/json"
"fmt"
"strings"
"time"
"github.com/anthropics/anthropic-sdk-go"
type AIService struct {
client anthropic.Client
db *sqlx.DB
youpcDB *sqlx.DB // read-only connection to youpc.org for similar case finder (may be nil)
}
func NewAIService(apiKey string, db *sqlx.DB, youpcDB *sqlx.DB) *AIService {
client := anthropic.NewClient(option.WithAPIKey(apiKey))
return &AIService{client: client, db: db, youpcDB: youpcDB}
}
// ExtractedDeadline represents a deadline extracted by AI from a document.
return summary, nil
}
// --- Document Drafting ---
// DocumentDraft represents an AI-generated document draft.
type DocumentDraft struct {
Title string `json:"title"`
Content string `json:"content"`
Language string `json:"language"`
}
// templateDescriptions maps template type IDs to descriptions for Claude.
var templateDescriptions = map[string]string{
"klageschrift": "Klageschrift (Statement of Claim) — formal complaint initiating legal proceedings",
"klageerwiderung": "Klageerwiderung (Statement of Defence) — formal response to a statement of claim",
"abmahnung": "Abmahnung (Cease and Desist Letter) — formal warning letter demanding cessation of an activity",
"schriftsatz": "Schriftsatz (Legal Brief) — formal legal submission to the court",
"berufung": "Berufungsschrift (Appeal Brief) — formal appeal against a court decision",
"antrag": "Antrag (Motion/Application) — formal application or motion to the court",
"stellungnahme": "Stellungnahme (Statement/Position Paper) — formal response or position paper",
"gutachten": "Gutachten (Legal Opinion/Expert Report) — detailed legal analysis or opinion",
"vertrag": "Vertrag (Contract/Agreement) — legal contract or agreement between parties",
"vollmacht": "Vollmacht (Power of Attorney) — formal authorization document",
"upc_claim": "UPC Statement of Claim — claim filed at the Unified Patent Court",
"upc_defence": "UPC Statement of Defence — defence filed at the Unified Patent Court",
"upc_counterclaim": "UPC Counterclaim for Revocation — counterclaim for patent revocation at the UPC",
"upc_injunction": "UPC Application for Provisional Measures — application for injunctive relief at the UPC",
}
const draftDocumentSystemPrompt = `You are an expert legal document drafter for German and UPC (Unified Patent Court) patent litigation.
You draft professional legal documents in the requested language, following proper legal formatting conventions.
Guidelines:
- Use proper legal structure with numbered sections and paragraphs
- Include standard legal formalities (headers, salutations, signatures block)
- Reference relevant legal provisions (BGB, ZPO, UPC Rules of Procedure, etc.)
- Use precise legal terminology appropriate for the jurisdiction
- Include placeholders in [BRACKETS] for information that needs to be filled in
- Base the content on the provided case data and instructions
- Output the document as clean text with proper formatting`
// DraftDocument generates an AI-drafted legal document based on case data and a template type.
func (s *AIService) DraftDocument(ctx context.Context, tenantID, caseID uuid.UUID, templateType, instructions, language string) (*DocumentDraft, error) {
if language == "" {
language = "de"
}
langLabel := "German"
if language == "en" {
langLabel = "English"
} else if language == "fr" {
langLabel = "French"
}
// Load case data
var c models.Case
if err := s.db.GetContext(ctx, &c,
"SELECT * FROM cases WHERE id = $1 AND tenant_id = $2", caseID, tenantID); err != nil {
return nil, fmt.Errorf("loading case: %w", err)
}
// Load parties
var parties []models.Party
_ = s.db.SelectContext(ctx, &parties,
"SELECT * FROM parties WHERE case_id = $1 AND tenant_id = $2", caseID, tenantID)
// Load recent events
var events []models.CaseEvent
_ = s.db.SelectContext(ctx, &events,
"SELECT * FROM case_events WHERE case_id = $1 AND tenant_id = $2 ORDER BY created_at DESC LIMIT 15",
caseID, tenantID)
// Load active deadlines
var deadlines []models.Deadline
_ = s.db.SelectContext(ctx, &deadlines,
"SELECT * FROM deadlines WHERE case_id = $1 AND tenant_id = $2 AND status = 'active' ORDER BY due_date ASC LIMIT 10",
caseID, tenantID)
// Load documents metadata for context
var documents []models.Document
_ = s.db.SelectContext(ctx, &documents,
"SELECT id, title, doc_type, created_at FROM documents WHERE case_id = $1 AND tenant_id = $2 ORDER BY created_at DESC LIMIT 10",
caseID, tenantID)
// Build context
var b strings.Builder
b.WriteString(fmt.Sprintf("Case: %s — %s\nStatus: %s\n", c.CaseNumber, c.Title, c.Status))
if c.Court != nil {
b.WriteString(fmt.Sprintf("Court: %s\n", *c.Court))
}
if c.CourtRef != nil {
b.WriteString(fmt.Sprintf("Court Reference: %s\n", *c.CourtRef))
}
if c.CaseType != nil {
b.WriteString(fmt.Sprintf("Type: %s\n", *c.CaseType))
}
if len(parties) > 0 {
b.WriteString("\nParties:\n")
for _, p := range parties {
role := "unknown role"
if p.Role != nil {
role = *p.Role
}
b.WriteString(fmt.Sprintf("- %s (%s)", p.Name, role))
if p.Representative != nil {
b.WriteString(fmt.Sprintf(" — represented by %s", *p.Representative))
}
b.WriteString("\n")
}
}
if len(events) > 0 {
b.WriteString("\nRecent Events:\n")
for _, e := range events {
b.WriteString(fmt.Sprintf("- [%s] %s", e.CreatedAt.Format("2006-01-02"), e.Title))
if e.Description != nil {
b.WriteString(fmt.Sprintf(": %s", *e.Description))
}
b.WriteString("\n")
}
}
if len(deadlines) > 0 {
b.WriteString("\nUpcoming Deadlines:\n")
for _, d := range deadlines {
b.WriteString(fmt.Sprintf("- %s: due %s\n", d.Title, d.DueDate))
}
}
templateDesc, ok := templateDescriptions[templateType]
if !ok {
templateDesc = templateType
}
prompt := fmt.Sprintf(`Draft a %s for this case in %s.
Document type: %s
Case context:
%s
Additional instructions from the lawyer:
%s
Generate the complete document now.`, templateDesc, langLabel, templateDesc, b.String(), instructions)
msg, err := s.client.Messages.New(ctx, anthropic.MessageNewParams{
Model: anthropic.ModelClaudeSonnet4_20250514,
MaxTokens: 8192,
System: []anthropic.TextBlockParam{
{Text: draftDocumentSystemPrompt},
},
Messages: []anthropic.MessageParam{
anthropic.NewUserMessage(anthropic.NewTextBlock(prompt)),
},
})
if err != nil {
return nil, fmt.Errorf("claude API call: %w", err)
}
var content string
for _, block := range msg.Content {
if block.Type == "text" {
content += block.Text
}
}
if content == "" {
return nil, fmt.Errorf("empty response from Claude")
}
title := fmt.Sprintf("%s — %s", templateDesc, c.CaseNumber)
return &DocumentDraft{
Title: title,
Content: content,
Language: language,
}, nil
}
// --- Case Strategy ---
// StrategyRecommendation represents an AI-generated case strategy analysis.
type StrategyRecommendation struct {
Summary string `json:"summary"`
NextSteps []StrategyStep `json:"next_steps"`
RiskAssessment []RiskItem `json:"risk_assessment"`
Timeline []TimelineItem `json:"timeline"`
}
type StrategyStep struct {
Priority string `json:"priority"` // high, medium, low
Action string `json:"action"`
Reasoning string `json:"reasoning"`
Deadline string `json:"deadline,omitempty"`
}
type RiskItem struct {
Level string `json:"level"` // high, medium, low
Risk string `json:"risk"`
Mitigation string `json:"mitigation"`
}
type TimelineItem struct {
Date string `json:"date"`
Event string `json:"event"`
Importance string `json:"importance"` // critical, important, routine
}
type strategyToolInput struct {
Summary string `json:"summary"`
NextSteps []StrategyStep `json:"next_steps"`
RiskAssessment []RiskItem `json:"risk_assessment"`
Timeline []TimelineItem `json:"timeline"`
}
var caseStrategyTool = anthropic.ToolParam{
Name: "case_strategy",
Description: anthropic.String("Provide strategic case analysis with next steps, risk assessment, and timeline optimization."),
InputSchema: anthropic.ToolInputSchemaParam{
Properties: map[string]any{
"summary": map[string]any{
"type": "string",
"description": "Executive summary of the case situation and strategic outlook (2-4 sentences)",
},
"next_steps": map[string]any{
"type": "array",
"description": "Recommended next actions in priority order",
"items": map[string]any{
"type": "object",
"properties": map[string]any{
"priority": map[string]any{
"type": "string",
"enum": []string{"high", "medium", "low"},
},
"action": map[string]any{
"type": "string",
"description": "Specific recommended action",
},
"reasoning": map[string]any{
"type": "string",
"description": "Why this action is recommended",
},
"deadline": map[string]any{
"type": "string",
"description": "Suggested deadline in YYYY-MM-DD format, if applicable",
},
},
"required": []string{"priority", "action", "reasoning"},
},
},
"risk_assessment": map[string]any{
"type": "array",
"description": "Key risks and mitigation strategies",
"items": map[string]any{
"type": "object",
"properties": map[string]any{
"level": map[string]any{
"type": "string",
"enum": []string{"high", "medium", "low"},
},
"risk": map[string]any{
"type": "string",
"description": "Description of the risk",
},
"mitigation": map[string]any{
"type": "string",
"description": "Recommended mitigation strategy",
},
},
"required": []string{"level", "risk", "mitigation"},
},
},
"timeline": map[string]any{
"type": "array",
"description": "Optimized timeline of upcoming milestones and events",
"items": map[string]any{
"type": "object",
"properties": map[string]any{
"date": map[string]any{
"type": "string",
"description": "Date in YYYY-MM-DD format",
},
"event": map[string]any{
"type": "string",
"description": "Description of the milestone or event",
},
"importance": map[string]any{
"type": "string",
"enum": []string{"critical", "important", "routine"},
},
},
"required": []string{"date", "event", "importance"},
},
},
},
Required: []string{"summary", "next_steps", "risk_assessment", "timeline"},
},
}
const caseStrategySystemPrompt = `You are a senior litigation strategist specializing in German law and UPC (Unified Patent Court) patent proceedings.
Analyze the case thoroughly and provide:
1. An executive summary of the current strategic position
2. Prioritized next steps with clear reasoning
3. Risk assessment with mitigation strategies
4. An optimized timeline of upcoming milestones
Consider:
- Procedural deadlines and their implications
- Strength of the parties' positions based on available information
- Potential settlement opportunities
- Cost-efficiency of different strategic approaches
- UPC-specific procedural peculiarities if applicable (bifurcation, preliminary injunctions, etc.)
Be practical and actionable. Avoid generic advice — tailor recommendations to the specific case data provided.`
// CaseStrategy analyzes a case and returns strategic recommendations.
func (s *AIService) CaseStrategy(ctx context.Context, tenantID, caseID uuid.UUID) (*StrategyRecommendation, error) {
// Load case
var c models.Case
if err := s.db.GetContext(ctx, &c,
"SELECT * FROM cases WHERE id = $1 AND tenant_id = $2", caseID, tenantID); err != nil {
return nil, fmt.Errorf("loading case: %w", err)
}
// Load parties
var parties []models.Party
_ = s.db.SelectContext(ctx, &parties,
"SELECT * FROM parties WHERE case_id = $1 AND tenant_id = $2", caseID, tenantID)
// Load all events
var events []models.CaseEvent
_ = s.db.SelectContext(ctx, &events,
"SELECT * FROM case_events WHERE case_id = $1 AND tenant_id = $2 ORDER BY created_at DESC LIMIT 25",
caseID, tenantID)
// Load all deadlines (active + completed for context)
var deadlines []models.Deadline
_ = s.db.SelectContext(ctx, &deadlines,
"SELECT * FROM deadlines WHERE case_id = $1 AND tenant_id = $2 ORDER BY due_date ASC LIMIT 20",
caseID, tenantID)
// Load documents metadata
var documents []models.Document
_ = s.db.SelectContext(ctx, &documents,
"SELECT id, title, doc_type, created_at FROM documents WHERE case_id = $1 AND tenant_id = $2 ORDER BY created_at DESC LIMIT 15",
caseID, tenantID)
// Build comprehensive context
var b strings.Builder
b.WriteString(fmt.Sprintf("Case: %s — %s\nStatus: %s\n", c.CaseNumber, c.Title, c.Status))
if c.Court != nil {
b.WriteString(fmt.Sprintf("Court: %s\n", *c.Court))
}
if c.CourtRef != nil {
b.WriteString(fmt.Sprintf("Court Reference: %s\n", *c.CourtRef))
}
if c.CaseType != nil {
b.WriteString(fmt.Sprintf("Type: %s\n", *c.CaseType))
}
if len(parties) > 0 {
b.WriteString("\nParties:\n")
for _, p := range parties {
role := "unknown"
if p.Role != nil {
role = *p.Role
}
b.WriteString(fmt.Sprintf("- %s (%s)", p.Name, role))
if p.Representative != nil {
b.WriteString(fmt.Sprintf(" — represented by %s", *p.Representative))
}
b.WriteString("\n")
}
}
if len(events) > 0 {
b.WriteString("\nCase Events (chronological):\n")
for _, e := range events {
b.WriteString(fmt.Sprintf("- [%s] %s", e.CreatedAt.Format("2006-01-02"), e.Title))
if e.Description != nil {
b.WriteString(fmt.Sprintf(": %s", *e.Description))
}
b.WriteString("\n")
}
}
if len(deadlines) > 0 {
b.WriteString("\nDeadlines:\n")
for _, d := range deadlines {
b.WriteString(fmt.Sprintf("- %s: due %s (status: %s)\n", d.Title, d.DueDate, d.Status))
}
}
if len(documents) > 0 {
b.WriteString("\nDocuments on file:\n")
for _, d := range documents {
docType := ""
if d.DocType != nil {
docType = fmt.Sprintf(" [%s]", *d.DocType)
}
b.WriteString(fmt.Sprintf("- %s%s (%s)\n", d.Title, docType, d.CreatedAt.Format("2006-01-02")))
}
}
msg, err := s.client.Messages.New(ctx, anthropic.MessageNewParams{
Model: anthropic.ModelClaudeOpus4_6,
MaxTokens: 4096,
System: []anthropic.TextBlockParam{
{Text: caseStrategySystemPrompt},
},
Messages: []anthropic.MessageParam{
anthropic.NewUserMessage(anthropic.NewTextBlock("Analyze this case and provide strategic recommendations:\n\n" + b.String())),
},
Tools: []anthropic.ToolUnionParam{
{OfTool: &caseStrategyTool},
},
ToolChoice: anthropic.ToolChoiceParamOfTool("case_strategy"),
})
if err != nil {
return nil, fmt.Errorf("claude API call: %w", err)
}
for _, block := range msg.Content {
if block.Type == "tool_use" && block.Name == "case_strategy" {
var input strategyToolInput
if err := json.Unmarshal(block.Input, &input); err != nil {
return nil, fmt.Errorf("parsing strategy output: %w", err)
}
result := &StrategyRecommendation{
Summary: input.Summary,
NextSteps: input.NextSteps,
RiskAssessment: input.RiskAssessment,
Timeline: input.Timeline,
}
// Cache in database
strategyJSON, _ := json.Marshal(result)
_, _ = s.db.ExecContext(ctx,
"UPDATE cases SET ai_summary = $1, updated_at = $2 WHERE id = $3 AND tenant_id = $4",
string(strategyJSON), time.Now(), caseID, tenantID)
return result, nil
}
}
return nil, fmt.Errorf("no tool_use block in response")
}
// --- Similar Case Finder ---
// SimilarCase represents a UPC case found to be similar.
type SimilarCase struct {
CaseNumber string `json:"case_number"`
Title string `json:"title"`
Court string `json:"court"`
Date string `json:"date"`
Relevance float64 `json:"relevance"` // 0.0-1.0
Explanation string `json:"explanation"` // why this case is similar
KeyHoldings string `json:"key_holdings"` // relevant holdings
URL string `json:"url,omitempty"` // link to youpc.org
}
// youpcCase represents a case from the youpc.org database.
type youpcCase struct {
ID string `db:"id" json:"id"`
CaseNumber *string `db:"case_number" json:"case_number"`
Title *string `db:"title" json:"title"`
Court *string `db:"court" json:"court"`
DecisionDate *string `db:"decision_date" json:"decision_date"`
CaseType *string `db:"case_type" json:"case_type"`
Outcome *string `db:"outcome" json:"outcome"`
PatentNumbers *string `db:"patent_numbers" json:"patent_numbers"`
Summary *string `db:"summary" json:"summary"`
Claimant *string `db:"claimant" json:"claimant"`
Defendant *string `db:"defendant" json:"defendant"`
}
type similarCaseToolInput struct {
Cases []struct {
CaseID string `json:"case_id"`
Relevance float64 `json:"relevance"`
Explanation string `json:"explanation"`
KeyHoldings string `json:"key_holdings"`
} `json:"cases"`
}
var similarCaseTool = anthropic.ToolParam{
Name: "rank_similar_cases",
Description: anthropic.String("Rank the provided UPC cases by relevance to the query case and explain why each is similar."),
InputSchema: anthropic.ToolInputSchemaParam{
Properties: map[string]any{
"cases": map[string]any{
"type": "array",
"description": "UPC cases ranked by relevance (most relevant first)",
"items": map[string]any{
"type": "object",
"properties": map[string]any{
"case_id": map[string]any{
"type": "string",
"description": "The ID of the UPC case from the provided list",
},
"relevance": map[string]any{
"type": "number",
"minimum": 0,
"maximum": 1,
"description": "Relevance score from 0.0 to 1.0",
},
"explanation": map[string]any{
"type": "string",
"description": "Why this case is relevant — what legal issues, parties, patents, or procedural aspects are similar",
},
"key_holdings": map[string]any{
"type": "string",
"description": "Key holdings or legal principles from this case that are relevant",
},
},
"required": []string{"case_id", "relevance", "explanation", "key_holdings"},
},
},
},
Required: []string{"cases"},
},
}
const similarCaseSystemPrompt = `You are a UPC (Unified Patent Court) case law expert.
Given a case description and a list of UPC cases from the database, rank the cases by relevance and explain why each one is similar or relevant.
Consider:
- Similar patents or technology areas
- Same parties or representatives
- Similar legal issues (infringement, validity, injunctions, etc.)
- Similar procedural situations
- Relevant legal principles that could apply
Only include cases that are genuinely relevant (relevance > 0.3). Order by relevance descending.`
// FindSimilarCases searches the youpc.org database for similar UPC cases.
func (s *AIService) FindSimilarCases(ctx context.Context, tenantID, caseID uuid.UUID, description string) ([]SimilarCase, error) {
if s.youpcDB == nil {
return nil, fmt.Errorf("youpc.org database not configured")
}
// Build query context from the case (if provided) or description
var queryText string
if caseID != uuid.Nil {
var c models.Case
if err := s.db.GetContext(ctx, &c,
"SELECT * FROM cases WHERE id = $1 AND tenant_id = $2", caseID, tenantID); err != nil {
return nil, fmt.Errorf("loading case: %w", err)
}
var parties []models.Party
_ = s.db.SelectContext(ctx, &parties,
"SELECT * FROM parties WHERE case_id = $1 AND tenant_id = $2", caseID, tenantID)
var b strings.Builder
b.WriteString(fmt.Sprintf("Case: %s — %s\n", c.CaseNumber, c.Title))
if c.CaseType != nil {
b.WriteString(fmt.Sprintf("Type: %s\n", *c.CaseType))
}
if c.Court != nil {
b.WriteString(fmt.Sprintf("Court: %s\n", *c.Court))
}
for _, p := range parties {
role := ""
if p.Role != nil {
role = *p.Role
}
b.WriteString(fmt.Sprintf("Party: %s (%s)\n", p.Name, role))
}
if description != "" {
b.WriteString(fmt.Sprintf("\nAdditional context: %s\n", description))
}
queryText = b.String()
} else if description != "" {
queryText = description
} else {
return nil, fmt.Errorf("either case_id or description must be provided")
}
// Query the youpc.org database for candidate cases.
// Loads the 50 most recent decisions as candidates; Claude then ranks them for relevance.
var candidates []youpcCase
err := s.youpcDB.SelectContext(ctx, &candidates, `
SELECT
id,
case_number,
title,
court,
decision_date,
case_type,
outcome,
patent_numbers,
summary,
claimant,
defendant
FROM mlex.cases
ORDER BY decision_date DESC NULLS LAST
LIMIT 50
`)
if err != nil {
return nil, fmt.Errorf("querying youpc.org cases: %w", err)
}
if len(candidates) == 0 {
return []SimilarCase{}, nil
}
// Build candidate list for Claude
var candidateText strings.Builder
for _, c := range candidates {
candidateText.WriteString(fmt.Sprintf("ID: %s\n", c.ID))
if c.CaseNumber != nil {
candidateText.WriteString(fmt.Sprintf("Case Number: %s\n", *c.CaseNumber))
}
if c.Title != nil {
candidateText.WriteString(fmt.Sprintf("Title: %s\n", *c.Title))
}
if c.Court != nil {
candidateText.WriteString(fmt.Sprintf("Court: %s\n", *c.Court))
}
if c.DecisionDate != nil {
candidateText.WriteString(fmt.Sprintf("Decision Date: %s\n", *c.DecisionDate))
}
if c.CaseType != nil {
candidateText.WriteString(fmt.Sprintf("Type: %s\n", *c.CaseType))
}
if c.Outcome != nil {
candidateText.WriteString(fmt.Sprintf("Outcome: %s\n", *c.Outcome))
}
if c.PatentNumbers != nil {
candidateText.WriteString(fmt.Sprintf("Patents: %s\n", *c.PatentNumbers))
}
if c.Claimant != nil {
candidateText.WriteString(fmt.Sprintf("Claimant: %s\n", *c.Claimant))
}
if c.Defendant != nil {
candidateText.WriteString(fmt.Sprintf("Defendant: %s\n", *c.Defendant))
}
if c.Summary != nil {
candidateText.WriteString(fmt.Sprintf("Summary: %s\n", *c.Summary))
}
candidateText.WriteString("---\n")
}
prompt := fmt.Sprintf(`Find UPC cases relevant to this matter:
%s
Here are the UPC cases from the database to evaluate:
%s
Rank only the genuinely relevant cases by similarity.`, queryText, candidateText.String())
msg, err := s.client.Messages.New(ctx, anthropic.MessageNewParams{
Model: anthropic.ModelClaudeSonnet4_20250514,
MaxTokens: 4096,
System: []anthropic.TextBlockParam{
{Text: similarCaseSystemPrompt},
},
Messages: []anthropic.MessageParam{
anthropic.NewUserMessage(anthropic.NewTextBlock(prompt)),
},
Tools: []anthropic.ToolUnionParam{
{OfTool: &similarCaseTool},
},
ToolChoice: anthropic.ToolChoiceParamOfTool("rank_similar_cases"),
})
if err != nil {
return nil, fmt.Errorf("claude API call: %w", err)
}
for _, block := range msg.Content {
if block.Type == "tool_use" && block.Name == "rank_similar_cases" {
var input similarCaseToolInput
if err := json.Unmarshal(block.Input, &input); err != nil {
return nil, fmt.Errorf("parsing similar cases output: %w", err)
}
// Build lookup map for candidate data
candidateMap := make(map[string]youpcCase)
for _, c := range candidates {
candidateMap[c.ID] = c
}
var results []SimilarCase
for _, ranked := range input.Cases {
if ranked.Relevance < 0.3 {
continue
}
c, ok := candidateMap[ranked.CaseID]
if !ok {
continue
}
sc := SimilarCase{
Relevance: ranked.Relevance,
Explanation: ranked.Explanation,
KeyHoldings: ranked.KeyHoldings,
}
if c.CaseNumber != nil {
sc.CaseNumber = *c.CaseNumber
}
if c.Title != nil {
sc.Title = *c.Title
}
if c.Court != nil {
sc.Court = *c.Court
}
if c.DecisionDate != nil {
sc.Date = *c.DecisionDate
}
if c.CaseNumber != nil {
sc.URL = fmt.Sprintf("https://youpc.org/cases/%s", *c.CaseNumber)
}
results = append(results, sc)
}
return results, nil
}
}
return nil, fmt.Errorf("no tool_use block in response")
}

View File

@@ -12,11 +12,12 @@ import (
)
 type AppointmentService struct {
-	db *sqlx.DB
+	db    *sqlx.DB
+	audit *AuditService
 }

-func NewAppointmentService(db *sqlx.DB) *AppointmentService {
-	return &AppointmentService{db: db}
+func NewAppointmentService(db *sqlx.DB, audit *AuditService) *AppointmentService {
+	return &AppointmentService{db: db, audit: audit}
 }

 type AppointmentFilter struct {
@@ -86,6 +87,7 @@ func (s *AppointmentService) Create(ctx context.Context, a *models.Appointment)
 	if err != nil {
 		return fmt.Errorf("creating appointment: %w", err)
 	}
+	s.audit.Log(ctx, "create", "appointment", &a.ID, nil, a)
 	return nil
 }
@@ -116,6 +118,7 @@ func (s *AppointmentService) Update(ctx context.Context, a *models.Appointment)
 	if rows == 0 {
 		return fmt.Errorf("appointment not found")
 	}
+	s.audit.Log(ctx, "update", "appointment", &a.ID, nil, a)
 	return nil
 }
@@ -131,5 +134,6 @@ func (s *AppointmentService) Delete(ctx context.Context, tenantID, id uuid.UUID)
 	if rows == 0 {
 		return fmt.Errorf("appointment not found")
 	}
+	s.audit.Log(ctx, "delete", "appointment", &id, nil, nil)
 	return nil
 }

View File

@@ -0,0 +1,141 @@
package services
import (
"context"
"encoding/json"
"fmt"
"log/slog"
"github.com/google/uuid"
"github.com/jmoiron/sqlx"
"mgit.msbls.de/m/KanzlAI-mGMT/internal/auth"
"mgit.msbls.de/m/KanzlAI-mGMT/internal/models"
)
type AuditService struct {
db *sqlx.DB
}
func NewAuditService(db *sqlx.DB) *AuditService {
return &AuditService{db: db}
}
// Log records an audit entry. It extracts tenant, user, IP, and user-agent from context.
// Errors are logged but not returned — audit logging must not break business operations.
func (s *AuditService) Log(ctx context.Context, action, entityType string, entityID *uuid.UUID, oldValues, newValues any) {
tenantID, ok := auth.TenantFromContext(ctx)
if !ok {
slog.Warn("audit: missing tenant_id in context", "action", action, "entity_type", entityType)
return
}
var userID *uuid.UUID
if uid, ok := auth.UserFromContext(ctx); ok {
userID = &uid
}
var oldJSON, newJSON *json.RawMessage
if oldValues != nil {
if b, err := json.Marshal(oldValues); err == nil {
raw := json.RawMessage(b)
oldJSON = &raw
}
}
if newValues != nil {
if b, err := json.Marshal(newValues); err == nil {
raw := json.RawMessage(b)
newJSON = &raw
}
}
ip := auth.IPFromContext(ctx)
ua := auth.UserAgentFromContext(ctx)
_, err := s.db.ExecContext(ctx,
`INSERT INTO audit_log (tenant_id, user_id, action, entity_type, entity_id, old_values, new_values, ip_address, user_agent)
VALUES ($1, $2, $3, $4, $5, $6, $7, $8, $9)`,
tenantID, userID, action, entityType, entityID, oldJSON, newJSON, ip, ua)
if err != nil {
slog.Error("audit: failed to write log entry",
"error", err,
"action", action,
"entity_type", entityType,
"entity_id", entityID,
)
}
}
// AuditFilter holds query parameters for listing audit log entries.
type AuditFilter struct {
EntityType string
EntityID *uuid.UUID
UserID *uuid.UUID
From string // RFC3339 date
To string // RFC3339 date
Page int
Limit int
}
// List returns paginated audit log entries for a tenant.
func (s *AuditService) List(ctx context.Context, tenantID uuid.UUID, filter AuditFilter) ([]models.AuditLog, int, error) {
if filter.Limit <= 0 {
filter.Limit = 50
}
if filter.Limit > 200 {
filter.Limit = 200
}
if filter.Page <= 0 {
filter.Page = 1
}
offset := (filter.Page - 1) * filter.Limit
where := "WHERE tenant_id = $1"
args := []any{tenantID}
argIdx := 2
if filter.EntityType != "" {
where += fmt.Sprintf(" AND entity_type = $%d", argIdx)
args = append(args, filter.EntityType)
argIdx++
}
if filter.EntityID != nil {
where += fmt.Sprintf(" AND entity_id = $%d", argIdx)
args = append(args, *filter.EntityID)
argIdx++
}
if filter.UserID != nil {
where += fmt.Sprintf(" AND user_id = $%d", argIdx)
args = append(args, *filter.UserID)
argIdx++
}
if filter.From != "" {
where += fmt.Sprintf(" AND created_at >= $%d", argIdx)
args = append(args, filter.From)
argIdx++
}
if filter.To != "" {
where += fmt.Sprintf(" AND created_at <= $%d", argIdx)
args = append(args, filter.To)
argIdx++
}
var total int
if err := s.db.GetContext(ctx, &total, "SELECT COUNT(*) FROM audit_log "+where, args...); err != nil {
return nil, 0, fmt.Errorf("counting audit entries: %w", err)
}
query := fmt.Sprintf("SELECT * FROM audit_log %s ORDER BY created_at DESC LIMIT $%d OFFSET $%d",
where, argIdx, argIdx+1)
args = append(args, filter.Limit, offset)
var entries []models.AuditLog
if err := s.db.SelectContext(ctx, &entries, query, args...); err != nil {
return nil, 0, fmt.Errorf("listing audit entries: %w", err)
}
if entries == nil {
entries = []models.AuditLog{}
}
return entries, total, nil
}

View File

@@ -0,0 +1,88 @@
package services
import (
"context"
"fmt"
"github.com/google/uuid"
"github.com/jmoiron/sqlx"
"mgit.msbls.de/m/KanzlAI-mGMT/internal/models"
)
type BillingRateService struct {
db *sqlx.DB
audit *AuditService
}
func NewBillingRateService(db *sqlx.DB, audit *AuditService) *BillingRateService {
return &BillingRateService{db: db, audit: audit}
}
type UpsertBillingRateInput struct {
UserID *uuid.UUID `json:"user_id,omitempty"`
Rate float64 `json:"rate"`
Currency string `json:"currency"`
ValidFrom string `json:"valid_from"`
ValidTo *string `json:"valid_to,omitempty"`
}
func (s *BillingRateService) List(ctx context.Context, tenantID uuid.UUID) ([]models.BillingRate, error) {
var rates []models.BillingRate
err := s.db.SelectContext(ctx, &rates,
`SELECT id, tenant_id, user_id, rate, currency, valid_from, valid_to, created_at
FROM billing_rates
WHERE tenant_id = $1
ORDER BY valid_from DESC, user_id NULLS LAST`,
tenantID)
if err != nil {
return nil, fmt.Errorf("list billing rates: %w", err)
}
return rates, nil
}
func (s *BillingRateService) Upsert(ctx context.Context, tenantID uuid.UUID, input UpsertBillingRateInput) (*models.BillingRate, error) {
if input.Currency == "" {
input.Currency = "EUR"
}
// Close any existing open-ended rate for this user
_, err := s.db.ExecContext(ctx,
`UPDATE billing_rates SET valid_to = $3
WHERE tenant_id = $1
AND (($2::uuid IS NULL AND user_id IS NULL) OR user_id = $2)
AND valid_to IS NULL
AND valid_from < $3`,
tenantID, input.UserID, input.ValidFrom)
if err != nil {
return nil, fmt.Errorf("close existing rate: %w", err)
}
var rate models.BillingRate
err = s.db.QueryRowxContext(ctx,
`INSERT INTO billing_rates (tenant_id, user_id, rate, currency, valid_from, valid_to)
VALUES ($1, $2, $3, $4, $5, $6)
RETURNING id, tenant_id, user_id, rate, currency, valid_from, valid_to, created_at`,
tenantID, input.UserID, input.Rate, input.Currency, input.ValidFrom, input.ValidTo,
).StructScan(&rate)
if err != nil {
return nil, fmt.Errorf("upsert billing rate: %w", err)
}
s.audit.Log(ctx, "create", "billing_rate", &rate.ID, nil, rate)
return &rate, nil
}
func (s *BillingRateService) GetCurrentRate(ctx context.Context, tenantID uuid.UUID, userID uuid.UUID, date string) (*float64, error) {
var rate float64
err := s.db.GetContext(ctx, &rate,
`SELECT rate FROM billing_rates
WHERE tenant_id = $1 AND (user_id = $2 OR user_id IS NULL)
AND valid_from <= $3 AND (valid_to IS NULL OR valid_to >= $3)
ORDER BY user_id NULLS LAST LIMIT 1`,
tenantID, userID, date)
if err != nil {
return nil, err
}
return &rate, nil
}

View File

@@ -0,0 +1,687 @@
package services
import (
"context"
"encoding/json"
"fmt"
"log/slog"
"strings"
"sync"
"time"
"github.com/emersion/go-ical"
"github.com/emersion/go-webdav"
"github.com/emersion/go-webdav/caldav"
"github.com/google/uuid"
"github.com/jmoiron/sqlx"
"mgit.msbls.de/m/KanzlAI-mGMT/internal/models"
)
const (
calDAVDomain = "kanzlai.msbls.de"
calDAVProdID = "-//KanzlAI//KanzlAI-mGMT//EN"
defaultSyncMin = 15
)
// CalDAVConfig holds per-tenant CalDAV configuration from tenants.settings.
type CalDAVConfig struct {
URL string `json:"url"`
Username string `json:"username"`
Password string `json:"password"`
CalendarPath string `json:"calendar_path"`
SyncEnabled bool `json:"sync_enabled"`
SyncIntervalMinutes int `json:"sync_interval_minutes"`
}
// SyncStatus holds the last sync result for a tenant.
type SyncStatus struct {
TenantID uuid.UUID `json:"tenant_id"`
LastSyncAt time.Time `json:"last_sync_at"`
ItemsPushed int `json:"items_pushed"`
ItemsPulled int `json:"items_pulled"`
Errors []string `json:"errors,omitempty"`
SyncDuration string `json:"sync_duration"`
}
// CalDAVService handles bidirectional CalDAV synchronization.
type CalDAVService struct {
db *sqlx.DB
mu sync.RWMutex
statuses map[uuid.UUID]*SyncStatus // per-tenant sync status
stopCh chan struct{}
wg sync.WaitGroup
}
// NewCalDAVService creates a new CalDAV sync service.
func NewCalDAVService(db *sqlx.DB) *CalDAVService {
return &CalDAVService{
db: db,
statuses: make(map[uuid.UUID]*SyncStatus),
stopCh: make(chan struct{}),
}
}
// GetStatus returns the last sync status for a tenant.
func (s *CalDAVService) GetStatus(tenantID uuid.UUID) *SyncStatus {
s.mu.RLock()
defer s.mu.RUnlock()
return s.statuses[tenantID]
}
// setStatus stores the sync status for a tenant.
func (s *CalDAVService) setStatus(status *SyncStatus) {
s.mu.Lock()
defer s.mu.Unlock()
s.statuses[status.TenantID] = status
}
// Start begins the background sync goroutine that polls per-tenant
// (sync.WaitGroup.Go requires Go 1.25+).
func (s *CalDAVService) Start() {
s.wg.Go(func() {
s.backgroundLoop()
})
slog.Info("CalDAV sync service started")
}
// Stop gracefully stops the background sync.
func (s *CalDAVService) Stop() {
close(s.stopCh)
s.wg.Wait()
slog.Info("CalDAV sync service stopped")
}
// backgroundLoop polls tenants at their configured interval.
func (s *CalDAVService) backgroundLoop() {
// Check every minute, but only sync tenants whose interval has elapsed.
ticker := time.NewTicker(1 * time.Minute)
defer ticker.Stop()
for {
select {
case <-s.stopCh:
return
case <-ticker.C:
s.syncAllTenants()
}
}
}
// syncAllTenants checks all tenants and syncs those due for a sync.
func (s *CalDAVService) syncAllTenants() {
configs, err := s.loadAllTenantConfigs()
if err != nil {
slog.Error("CalDAV: failed to load tenant configs", "error", err)
return
}
for tenantID, cfg := range configs {
if !cfg.SyncEnabled {
continue
}
interval := cfg.SyncIntervalMinutes
if interval <= 0 {
interval = defaultSyncMin
}
// Check if enough time has passed since last sync
status := s.GetStatus(tenantID)
if status != nil && time.Since(status.LastSyncAt) < time.Duration(interval)*time.Minute {
continue
}
// Track the per-tenant sync in the WaitGroup so Stop() waits for it.
// tenantID and cfg are per-iteration variables (Go 1.22+), safe to capture.
s.wg.Go(func() {
ctx, cancel := context.WithTimeout(context.Background(), 2*time.Minute)
defer cancel()
if _, err := s.SyncTenant(ctx, tenantID, cfg); err != nil {
slog.Error("CalDAV: sync failed", "tenant_id", tenantID, "error", err)
}
})
}
}
// loadAllTenantConfigs reads CalDAV configs from all tenants.
func (s *CalDAVService) loadAllTenantConfigs() (map[uuid.UUID]CalDAVConfig, error) {
type row struct {
ID uuid.UUID `db:"id"`
Settings json.RawMessage `db:"settings"`
}
var rows []row
if err := s.db.Select(&rows, "SELECT id, settings FROM tenants"); err != nil {
return nil, fmt.Errorf("querying tenants: %w", err)
}
result := make(map[uuid.UUID]CalDAVConfig)
for _, r := range rows {
cfg, err := parseCalDAVConfig(r.Settings)
if err != nil || cfg.URL == "" {
continue
}
result[r.ID] = cfg
}
return result, nil
}
// LoadTenantConfig reads CalDAV config for a single tenant.
func (s *CalDAVService) LoadTenantConfig(tenantID uuid.UUID) (*CalDAVConfig, error) {
var settings json.RawMessage
if err := s.db.Get(&settings, "SELECT settings FROM tenants WHERE id = $1", tenantID); err != nil {
return nil, fmt.Errorf("loading tenant settings: %w", err)
}
cfg, err := parseCalDAVConfig(settings)
if err != nil {
return nil, err
}
if cfg.URL == "" {
return nil, fmt.Errorf("no CalDAV configuration for tenant")
}
return &cfg, nil
}
func parseCalDAVConfig(settings json.RawMessage) (CalDAVConfig, error) {
if len(settings) == 0 {
return CalDAVConfig{}, nil
}
var wrapper struct {
CalDAV CalDAVConfig `json:"caldav"`
}
if err := json.Unmarshal(settings, &wrapper); err != nil {
return CalDAVConfig{}, fmt.Errorf("parsing CalDAV settings: %w", err)
}
return wrapper.CalDAV, nil
}
// newCalDAVClient creates a caldav.Client from config.
func newCalDAVClient(cfg CalDAVConfig) (*caldav.Client, error) {
httpClient := webdav.HTTPClientWithBasicAuth(nil, cfg.Username, cfg.Password)
return caldav.NewClient(httpClient, cfg.URL)
}
// SyncTenant performs a full bidirectional sync for a tenant.
func (s *CalDAVService) SyncTenant(ctx context.Context, tenantID uuid.UUID, cfg CalDAVConfig) (*SyncStatus, error) {
start := time.Now()
status := &SyncStatus{
TenantID: tenantID,
}
client, err := newCalDAVClient(cfg)
if err != nil {
status.Errors = append(status.Errors, fmt.Sprintf("creating client: %v", err))
status.LastSyncAt = time.Now()
s.setStatus(status)
return status, err
}
// Push local changes to CalDAV
pushed, pushErrs := s.pushAll(ctx, client, tenantID, cfg)
status.ItemsPushed = pushed
status.Errors = append(status.Errors, pushErrs...)
// Pull remote changes from CalDAV
pulled, pullErrs := s.pullAll(ctx, client, tenantID, cfg)
status.ItemsPulled = pulled
status.Errors = append(status.Errors, pullErrs...)
status.LastSyncAt = time.Now()
status.SyncDuration = time.Since(start).String()
s.setStatus(status)
if len(status.Errors) > 0 {
return status, fmt.Errorf("sync completed with %d errors", len(status.Errors))
}
return status, nil
}
// --- Push: Local -> CalDAV ---
// pushAll pushes all deadlines and appointments to CalDAV.
func (s *CalDAVService) pushAll(ctx context.Context, client *caldav.Client, tenantID uuid.UUID, cfg CalDAVConfig) (int, []string) {
var pushed int
var errs []string
// Push deadlines as VTODO
deadlines, err := s.loadDeadlines(tenantID)
if err != nil {
return 0, []string{fmt.Sprintf("loading deadlines: %v", err)}
}
for _, d := range deadlines {
if err := s.pushDeadline(ctx, client, cfg, &d); err != nil {
errs = append(errs, fmt.Sprintf("push deadline %s: %v", d.ID, err))
} else {
pushed++
}
}
// Push appointments as VEVENT
appointments, err := s.loadAppointments(ctx, tenantID)
if err != nil {
errs = append(errs, fmt.Sprintf("loading appointments: %v", err))
return pushed, errs
}
for _, a := range appointments {
if err := s.pushAppointment(ctx, client, cfg, &a); err != nil {
errs = append(errs, fmt.Sprintf("push appointment %s: %v", a.ID, err))
} else {
pushed++
}
}
return pushed, errs
}
// PushDeadline pushes a single deadline to CalDAV (called on create/update).
func (s *CalDAVService) PushDeadline(ctx context.Context, tenantID uuid.UUID, deadline *models.Deadline) error {
cfg, err := s.LoadTenantConfig(tenantID)
if err != nil || !cfg.SyncEnabled {
return nil // CalDAV not configured or disabled — silently skip
}
client, err := newCalDAVClient(*cfg)
if err != nil {
return fmt.Errorf("creating CalDAV client: %w", err)
}
return s.pushDeadline(ctx, client, *cfg, deadline)
}
func (s *CalDAVService) pushDeadline(ctx context.Context, client *caldav.Client, cfg CalDAVConfig, d *models.Deadline) error {
uid := deadlineUID(d.ID)
cal := ical.NewCalendar()
cal.Props.SetText(ical.PropProductID, calDAVProdID)
cal.Props.SetText(ical.PropVersion, "2.0")
todo := ical.NewComponent(ical.CompToDo)
todo.Props.SetText(ical.PropUID, uid)
todo.Props.SetText(ical.PropSummary, d.Title)
todo.Props.SetDateTime(ical.PropDateTimeStamp, time.Now().UTC())
// Build DESCRIPTION from description and notes (notes appended after a blank line).
desc := ""
if d.Description != nil {
desc = *d.Description
}
if d.Notes != nil {
if desc != "" {
desc += "\n\n"
}
desc += *d.Notes
}
if desc != "" {
todo.Props.SetText(ical.PropDescription, desc)
}
// Parse due_date (stored as string "YYYY-MM-DD")
if due, err := time.Parse("2006-01-02", d.DueDate); err == nil {
todo.Props.SetDate(ical.PropDue, due)
}
// Map status
switch d.Status {
case "completed":
todo.Props.SetText(ical.PropStatus, "COMPLETED")
if d.CompletedAt != nil {
todo.Props.SetDateTime(ical.PropCompleted, d.CompletedAt.UTC())
}
case "pending":
todo.Props.SetText(ical.PropStatus, "NEEDS-ACTION")
default:
todo.Props.SetText(ical.PropStatus, "IN-PROCESS")
}
cal.Children = append(cal.Children, todo)
path := calendarObjectPath(cfg.CalendarPath, uid)
obj, err := client.PutCalendarObject(ctx, path, cal)
if err != nil {
return fmt.Errorf("putting VTODO: %w", err)
}
// Update caldav_uid and etag in DB
return s.updateDeadlineCalDAV(d.ID, uid, obj.ETag)
}
// PushAppointment pushes a single appointment to CalDAV (called on create/update).
func (s *CalDAVService) PushAppointment(ctx context.Context, tenantID uuid.UUID, appointment *models.Appointment) error {
cfg, err := s.LoadTenantConfig(tenantID)
if err != nil || !cfg.SyncEnabled {
return nil
}
client, err := newCalDAVClient(*cfg)
if err != nil {
return fmt.Errorf("creating CalDAV client: %w", err)
}
return s.pushAppointment(ctx, client, *cfg, appointment)
}
func (s *CalDAVService) pushAppointment(ctx context.Context, client *caldav.Client, cfg CalDAVConfig, a *models.Appointment) error {
uid := appointmentUID(a.ID)
cal := ical.NewCalendar()
cal.Props.SetText(ical.PropProductID, calDAVProdID)
cal.Props.SetText(ical.PropVersion, "2.0")
event := ical.NewEvent()
event.Props.SetText(ical.PropUID, uid)
event.Props.SetText(ical.PropSummary, a.Title)
event.Props.SetDateTime(ical.PropDateTimeStamp, time.Now().UTC())
event.Props.SetDateTime(ical.PropDateTimeStart, a.StartAt.UTC())
if a.EndAt != nil {
event.Props.SetDateTime(ical.PropDateTimeEnd, a.EndAt.UTC())
}
if a.Description != nil {
event.Props.SetText(ical.PropDescription, *a.Description)
}
if a.Location != nil {
event.Props.SetText(ical.PropLocation, *a.Location)
}
cal.Children = append(cal.Children, event.Component)
path := calendarObjectPath(cfg.CalendarPath, uid)
obj, err := client.PutCalendarObject(ctx, path, cal)
if err != nil {
return fmt.Errorf("putting VEVENT: %w", err)
}
return s.updateAppointmentCalDAV(a.ID, uid, obj.ETag)
}
// DeleteDeadlineCalDAV removes a deadline's VTODO from CalDAV.
func (s *CalDAVService) DeleteDeadlineCalDAV(ctx context.Context, tenantID uuid.UUID, deadline *models.Deadline) error {
if deadline.CalDAVUID == nil || *deadline.CalDAVUID == "" {
return nil
}
cfg, err := s.LoadTenantConfig(tenantID)
if err != nil || !cfg.SyncEnabled {
return nil
}
client, err := newCalDAVClient(*cfg)
if err != nil {
return fmt.Errorf("creating CalDAV client: %w", err)
}
path := calendarObjectPath(cfg.CalendarPath, *deadline.CalDAVUID)
return client.RemoveAll(ctx, path)
}
// DeleteAppointmentCalDAV removes an appointment's VEVENT from CalDAV.
func (s *CalDAVService) DeleteAppointmentCalDAV(ctx context.Context, tenantID uuid.UUID, appointment *models.Appointment) error {
if appointment.CalDAVUID == nil || *appointment.CalDAVUID == "" {
return nil
}
cfg, err := s.LoadTenantConfig(tenantID)
if err != nil || !cfg.SyncEnabled {
return nil
}
client, err := newCalDAVClient(*cfg)
if err != nil {
return fmt.Errorf("creating CalDAV client: %w", err)
}
path := calendarObjectPath(cfg.CalendarPath, *appointment.CalDAVUID)
return client.RemoveAll(ctx, path)
}
// --- Pull: CalDAV -> Local ---
// pullAll fetches all calendar objects from CalDAV and reconciles with local DB.
func (s *CalDAVService) pullAll(ctx context.Context, client *caldav.Client, tenantID uuid.UUID, cfg CalDAVConfig) (int, []string) {
var pulled int
var errs []string
query := &caldav.CalendarQuery{
CompFilter: caldav.CompFilter{
Name: ical.CompCalendar,
},
}
objects, err := client.QueryCalendar(ctx, cfg.CalendarPath, query)
if err != nil {
return 0, []string{fmt.Sprintf("querying calendar: %v", err)}
}
for _, obj := range objects {
if obj.Data == nil {
continue
}
for _, child := range obj.Data.Children {
switch child.Name {
case ical.CompToDo:
uid, _ := child.Props.Text(ical.PropUID)
if uid == "" || !isKanzlAIUID(uid, "deadline") {
continue
}
if err := s.reconcileDeadline(ctx, tenantID, child, obj.ETag); err != nil {
errs = append(errs, fmt.Sprintf("reconcile deadline %s: %v", uid, err))
} else {
pulled++
}
case ical.CompEvent:
uid, _ := child.Props.Text(ical.PropUID)
if uid == "" || !isKanzlAIUID(uid, "appointment") {
continue
}
if err := s.reconcileAppointment(ctx, tenantID, child, obj.ETag); err != nil {
errs = append(errs, fmt.Sprintf("reconcile appointment %s: %v", uid, err))
} else {
pulled++
}
}
}
}
return pulled, errs
}
// reconcileDeadline handles conflict resolution for a pulled VTODO.
// KanzlAI wins for dates/status, CalDAV wins for notes/description.
func (s *CalDAVService) reconcileDeadline(ctx context.Context, tenantID uuid.UUID, comp *ical.Component, remoteEtag string) error {
uid, _ := comp.Props.Text(ical.PropUID)
deadlineID := extractIDFromUID(uid, "deadline")
if deadlineID == uuid.Nil {
return fmt.Errorf("invalid UID: %s", uid)
}
// Load existing deadline
var d models.Deadline
err := s.db.Get(&d, `SELECT id, tenant_id, case_id, title, description, due_date, original_due_date,
warning_date, source, rule_id, status, completed_at,
caldav_uid, caldav_etag, notes, created_at, updated_at
FROM deadlines WHERE id = $1 AND tenant_id = $2`, deadlineID, tenantID)
if err != nil {
return fmt.Errorf("loading deadline: %w", err)
}
// Check if remote changed (etag mismatch)
if d.CalDAVEtag != nil && *d.CalDAVEtag == remoteEtag {
return nil // No change
}
// CalDAV wins for description/notes
description, _ := comp.Props.Text(ical.PropDescription)
hasConflict := false
if description != "" {
existingDesc := ""
if d.Description != nil {
existingDesc = *d.Description
}
existingNotes := ""
if d.Notes != nil {
existingNotes = *d.Notes
}
// CalDAV wins for notes/description
if description != existingDesc && description != existingNotes {
hasConflict = true
_, err = s.db.Exec(`UPDATE deadlines SET notes = $1, caldav_etag = $2, updated_at = NOW()
WHERE id = $3 AND tenant_id = $4`, description, remoteEtag, deadlineID, tenantID)
if err != nil {
return fmt.Errorf("updating deadline notes: %w", err)
}
}
}
if !hasConflict {
// Just update etag
_, err = s.db.Exec(`UPDATE deadlines SET caldav_etag = $1, updated_at = NOW()
WHERE id = $2 AND tenant_id = $3`, remoteEtag, deadlineID, tenantID)
if err != nil {
return fmt.Errorf("updating deadline etag: %w", err)
}
}
// Log conflict in case_events if detected
if hasConflict {
s.logConflictEvent(ctx, tenantID, d.CaseID, "deadline", deadlineID, "CalDAV description updated from remote")
}
return nil
}
// reconcileAppointment handles conflict resolution for a pulled VEVENT.
func (s *CalDAVService) reconcileAppointment(ctx context.Context, tenantID uuid.UUID, comp *ical.Component, remoteEtag string) error {
uid, _ := comp.Props.Text(ical.PropUID)
appointmentID := extractIDFromUID(uid, "appointment")
if appointmentID == uuid.Nil {
return fmt.Errorf("invalid UID: %s", uid)
}
var a models.Appointment
err := s.db.GetContext(ctx, &a, `SELECT * FROM appointments WHERE id = $1 AND tenant_id = $2`, appointmentID, tenantID)
if err != nil {
return fmt.Errorf("loading appointment: %w", err)
}
if a.CalDAVEtag != nil && *a.CalDAVEtag == remoteEtag {
return nil
}
// CalDAV wins for description
description, _ := comp.Props.Text(ical.PropDescription)
location, _ := comp.Props.Text(ical.PropLocation)
hasConflict := false
updates := []string{"caldav_etag = $1", "updated_at = NOW()"}
args := []any{remoteEtag}
argN := 2
if description != "" {
existingDesc := ""
if a.Description != nil {
existingDesc = *a.Description
}
if description != existingDesc {
hasConflict = true
updates = append(updates, fmt.Sprintf("description = $%d", argN))
args = append(args, description)
argN++
}
}
if location != "" {
existingLoc := ""
if a.Location != nil {
existingLoc = *a.Location
}
if location != existingLoc {
hasConflict = true
updates = append(updates, fmt.Sprintf("location = $%d", argN))
args = append(args, location)
argN++
}
}
args = append(args, appointmentID, tenantID)
query := fmt.Sprintf("UPDATE appointments SET %s WHERE id = $%d AND tenant_id = $%d",
strings.Join(updates, ", "), argN, argN+1)
if _, err := s.db.ExecContext(ctx, query, args...); err != nil {
return fmt.Errorf("updating appointment: %w", err)
}
if hasConflict {
caseID := uuid.Nil
if a.CaseID != nil {
caseID = *a.CaseID
}
s.logConflictEvent(ctx, tenantID, caseID, "appointment", appointmentID, "CalDAV description/location updated from remote")
}
return nil
}
// --- DB helpers ---
func (s *CalDAVService) loadDeadlines(tenantID uuid.UUID) ([]models.Deadline, error) {
var deadlines []models.Deadline
err := s.db.Select(&deadlines, `SELECT id, tenant_id, case_id, title, description, due_date,
original_due_date, warning_date, source, rule_id, status, completed_at,
caldav_uid, caldav_etag, notes, created_at, updated_at
FROM deadlines WHERE tenant_id = $1`, tenantID)
return deadlines, err
}
func (s *CalDAVService) loadAppointments(ctx context.Context, tenantID uuid.UUID) ([]models.Appointment, error) {
var appointments []models.Appointment
err := s.db.SelectContext(ctx, &appointments, "SELECT * FROM appointments WHERE tenant_id = $1", tenantID)
return appointments, err
}
func (s *CalDAVService) updateDeadlineCalDAV(id uuid.UUID, calDAVUID, etag string) error {
_, err := s.db.Exec(`UPDATE deadlines SET caldav_uid = $1, caldav_etag = $2, updated_at = NOW()
WHERE id = $3`, calDAVUID, etag, id)
return err
}
func (s *CalDAVService) updateAppointmentCalDAV(id uuid.UUID, calDAVUID, etag string) error {
_, err := s.db.Exec(`UPDATE appointments SET caldav_uid = $1, caldav_etag = $2, updated_at = NOW()
WHERE id = $3`, calDAVUID, etag, id)
return err
}
func (s *CalDAVService) logConflictEvent(ctx context.Context, tenantID, caseID uuid.UUID, objectType string, objectID uuid.UUID, msg string) {
if caseID == uuid.Nil {
return
}
metadata, _ := json.Marshal(map[string]string{
"object_type": objectType,
"object_id": objectID.String(),
"source": "caldav_sync",
})
_, err := s.db.ExecContext(ctx, `INSERT INTO case_events (id, tenant_id, case_id, event_type, title, description, metadata, created_at, updated_at)
VALUES ($1, $2, $3, 'caldav_conflict', $4, $5, $6, NOW(), NOW())`,
uuid.New(), tenantID, caseID, "CalDAV sync conflict", msg, metadata)
if err != nil {
slog.Error("CalDAV: failed to log conflict event", "error", err)
}
}
// --- UID helpers ---
func deadlineUID(id uuid.UUID) string {
return fmt.Sprintf("kanzlai-deadline-%s@%s", id, calDAVDomain)
}
func appointmentUID(id uuid.UUID) string {
return fmt.Sprintf("kanzlai-appointment-%s@%s", id, calDAVDomain)
}
func isKanzlAIUID(uid, objectType string) bool {
return strings.HasPrefix(uid, "kanzlai-"+objectType+"-") && strings.HasSuffix(uid, "@"+calDAVDomain)
}
func extractIDFromUID(uid, objectType string) uuid.UUID {
prefix := "kanzlai-" + objectType + "-"
suffix := "@" + calDAVDomain
if !strings.HasPrefix(uid, prefix) || !strings.HasSuffix(uid, suffix) {
return uuid.Nil
}
idStr := uid[len(prefix) : len(uid)-len(suffix)]
id, err := uuid.Parse(idStr)
if err != nil {
return uuid.Nil
}
return id
}
func calendarObjectPath(calendarPath, uid string) string {
path := strings.TrimSuffix(calendarPath, "/")
return path + "/" + uid + ".ics"
}

View File

@@ -0,0 +1,124 @@
package services
import (
"testing"
"github.com/google/uuid"
)
func TestDeadlineUID(t *testing.T) {
id := uuid.MustParse("550e8400-e29b-41d4-a716-446655440000")
uid := deadlineUID(id)
want := "kanzlai-deadline-550e8400-e29b-41d4-a716-446655440000@kanzlai.msbls.de"
if uid != want {
t.Errorf("deadlineUID = %q, want %q", uid, want)
}
}
func TestAppointmentUID(t *testing.T) {
id := uuid.MustParse("550e8400-e29b-41d4-a716-446655440000")
uid := appointmentUID(id)
want := "kanzlai-appointment-550e8400-e29b-41d4-a716-446655440000@kanzlai.msbls.de"
if uid != want {
t.Errorf("appointmentUID = %q, want %q", uid, want)
}
}
func TestIsKanzlAIUID(t *testing.T) {
tests := []struct {
uid string
objectType string
want bool
}{
{"kanzlai-deadline-550e8400-e29b-41d4-a716-446655440000@kanzlai.msbls.de", "deadline", true},
{"kanzlai-appointment-550e8400-e29b-41d4-a716-446655440000@kanzlai.msbls.de", "appointment", true},
{"kanzlai-deadline-550e8400-e29b-41d4-a716-446655440000@kanzlai.msbls.de", "appointment", false},
{"random-uid@other.com", "deadline", false},
{"", "deadline", false},
}
for _, tt := range tests {
got := isKanzlAIUID(tt.uid, tt.objectType)
if got != tt.want {
t.Errorf("isKanzlAIUID(%q, %q) = %v, want %v", tt.uid, tt.objectType, got, tt.want)
}
}
}
func TestExtractIDFromUID(t *testing.T) {
id := uuid.MustParse("550e8400-e29b-41d4-a716-446655440000")
tests := []struct {
uid string
objectType string
want uuid.UUID
}{
{"kanzlai-deadline-550e8400-e29b-41d4-a716-446655440000@kanzlai.msbls.de", "deadline", id},
{"kanzlai-appointment-550e8400-e29b-41d4-a716-446655440000@kanzlai.msbls.de", "appointment", id},
{"invalid-uid", "deadline", uuid.Nil},
{"kanzlai-deadline-not-a-uuid@kanzlai.msbls.de", "deadline", uuid.Nil},
}
for _, tt := range tests {
got := extractIDFromUID(tt.uid, tt.objectType)
if got != tt.want {
t.Errorf("extractIDFromUID(%q, %q) = %v, want %v", tt.uid, tt.objectType, got, tt.want)
}
}
}
func TestCalendarObjectPath(t *testing.T) {
tests := []struct {
calendarPath string
uid string
want string
}{
{"/dav/calendars/user/cal", "kanzlai-deadline-abc@kanzlai.msbls.de", "/dav/calendars/user/cal/kanzlai-deadline-abc@kanzlai.msbls.de.ics"},
{"/dav/calendars/user/cal/", "kanzlai-deadline-abc@kanzlai.msbls.de", "/dav/calendars/user/cal/kanzlai-deadline-abc@kanzlai.msbls.de.ics"},
}
for _, tt := range tests {
got := calendarObjectPath(tt.calendarPath, tt.uid)
if got != tt.want {
t.Errorf("calendarObjectPath(%q, %q) = %q, want %q", tt.calendarPath, tt.uid, got, tt.want)
}
}
}
func TestParseCalDAVConfig(t *testing.T) {
settings := []byte(`{"caldav": {"url": "https://dav.example.com", "username": "user", "password": "pass", "calendar_path": "/cal", "sync_enabled": true, "sync_interval_minutes": 30}}`)
cfg, err := parseCalDAVConfig(settings)
if err != nil {
t.Fatalf("parseCalDAVConfig: %v", err)
}
if cfg.URL != "https://dav.example.com" {
t.Errorf("URL = %q, want %q", cfg.URL, "https://dav.example.com")
}
if cfg.Username != "user" {
t.Errorf("Username = %q, want %q", cfg.Username, "user")
}
if cfg.SyncIntervalMinutes != 30 {
t.Errorf("SyncIntervalMinutes = %d, want 30", cfg.SyncIntervalMinutes)
}
if !cfg.SyncEnabled {
t.Error("SyncEnabled = false, want true")
}
}
func TestParseCalDAVConfig_Empty(t *testing.T) {
cfg, err := parseCalDAVConfig(nil)
if err != nil {
t.Fatalf("parseCalDAVConfig(nil): %v", err)
}
if cfg.URL != "" {
t.Errorf("expected empty config, got URL=%q", cfg.URL)
}
}
func TestParseCalDAVConfig_NoCalDAV(t *testing.T) {
settings := []byte(`{"other_setting": true}`)
cfg, err := parseCalDAVConfig(settings)
if err != nil {
t.Fatalf("parseCalDAVConfig: %v", err)
}
if cfg.URL != "" {
t.Errorf("expected empty caldav config, got URL=%q", cfg.URL)
}
}
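The three `parseCalDAVConfig` tests pin down its contract: the function reads a nested `"caldav"` object out of the tenant's raw settings JSON, and both `nil` input and a missing key yield a zero-valued config rather than an error. A minimal sketch of an implementation satisfying that contract (the real struct and function live elsewhere in the package; field names here are inferred from the tests):

```go
package main

import (
	"encoding/json"
	"fmt"
)

// CalDAVConfig mirrors the fields asserted in the tests above.
type CalDAVConfig struct {
	URL                 string `json:"url"`
	Username            string `json:"username"`
	Password            string `json:"password"`
	CalendarPath        string `json:"calendar_path"`
	SyncEnabled         bool   `json:"sync_enabled"`
	SyncIntervalMinutes int    `json:"sync_interval_minutes"`
}

// parseCalDAVConfig extracts the "caldav" object from raw tenant settings.
// nil/empty input and a missing key both return a zero config, no error.
func parseCalDAVConfig(settings []byte) (CalDAVConfig, error) {
	if len(settings) == 0 {
		return CalDAVConfig{}, nil
	}
	var wrapper struct {
		CalDAV CalDAVConfig `json:"caldav"`
	}
	if err := json.Unmarshal(settings, &wrapper); err != nil {
		return CalDAVConfig{}, err
	}
	return wrapper.CalDAV, nil
}

func main() {
	cfg, _ := parseCalDAVConfig([]byte(`{"caldav": {"url": "https://dav.example.com", "sync_interval_minutes": 30}}`))
	fmt.Println(cfg.URL, cfg.SyncIntervalMinutes)
}
```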

View File

@@ -0,0 +1,92 @@
package services
import (
"context"
"fmt"
"github.com/google/uuid"
"github.com/jmoiron/sqlx"
"mgit.msbls.de/m/KanzlAI-mGMT/internal/models"
)
type CaseAssignmentService struct {
db *sqlx.DB
}
func NewCaseAssignmentService(db *sqlx.DB) *CaseAssignmentService {
return &CaseAssignmentService{db: db}
}
// ListByCase returns all assignments for a case.
func (s *CaseAssignmentService) ListByCase(ctx context.Context, tenantID, caseID uuid.UUID) ([]models.CaseAssignment, error) {
var assignments []models.CaseAssignment
err := s.db.SelectContext(ctx, &assignments,
`SELECT ca.id, ca.case_id, ca.user_id, ca.role, ca.assigned_at
FROM case_assignments ca
JOIN cases c ON c.id = ca.case_id
WHERE ca.case_id = $1 AND c.tenant_id = $2
ORDER BY ca.assigned_at`,
caseID, tenantID)
if err != nil {
return nil, fmt.Errorf("list case assignments: %w", err)
}
return assignments, nil
}
// Assign adds a user to a case with the given role.
func (s *CaseAssignmentService) Assign(ctx context.Context, tenantID, caseID, userID uuid.UUID, role string) (*models.CaseAssignment, error) {
// Verify user is a member of this tenant
var memberExists bool
err := s.db.GetContext(ctx, &memberExists,
`SELECT EXISTS(SELECT 1 FROM user_tenants WHERE user_id = $1 AND tenant_id = $2)`,
userID, tenantID)
if err != nil {
return nil, fmt.Errorf("check membership: %w", err)
}
if !memberExists {
return nil, fmt.Errorf("user is not a member of this tenant")
}
// Verify case belongs to tenant
var caseExists bool
err = s.db.GetContext(ctx, &caseExists,
`SELECT EXISTS(SELECT 1 FROM cases WHERE id = $1 AND tenant_id = $2)`,
caseID, tenantID)
if err != nil {
return nil, fmt.Errorf("check case: %w", err)
}
if !caseExists {
return nil, fmt.Errorf("case not found")
}
var assignment models.CaseAssignment
err = s.db.QueryRowxContext(ctx,
`INSERT INTO case_assignments (case_id, user_id, role)
VALUES ($1, $2, $3)
ON CONFLICT (case_id, user_id) DO UPDATE SET role = EXCLUDED.role
RETURNING id, case_id, user_id, role, assigned_at`,
caseID, userID, role,
).StructScan(&assignment)
if err != nil {
return nil, fmt.Errorf("assign user to case: %w", err)
}
return &assignment, nil
}
// Unassign removes a user from a case.
func (s *CaseAssignmentService) Unassign(ctx context.Context, tenantID, caseID, userID uuid.UUID) error {
result, err := s.db.ExecContext(ctx,
`DELETE FROM case_assignments ca
USING cases c
WHERE ca.case_id = c.id AND ca.case_id = $1 AND ca.user_id = $2 AND c.tenant_id = $3`,
caseID, userID, tenantID)
if err != nil {
return fmt.Errorf("unassign: %w", err)
}
rows, err := result.RowsAffected()
if err != nil {
return fmt.Errorf("unassign: %w", err)
}
if rows == 0 {
return fmt.Errorf("assignment not found")
}
return nil
}

View File

@@ -13,11 +13,12 @@ import (
)
type CaseService struct {
db *sqlx.DB
db *sqlx.DB
audit *AuditService
}
func NewCaseService(db *sqlx.DB) *CaseService {
return &CaseService{db: db}
func NewCaseService(db *sqlx.DB, audit *AuditService) *CaseService {
return &CaseService{db: db, audit: audit}
}
type CaseFilter struct {
@@ -162,6 +163,9 @@ func (s *CaseService) Create(ctx context.Context, tenantID uuid.UUID, userID uui
if err := s.db.GetContext(ctx, &c, "SELECT * FROM cases WHERE id = $1", id); err != nil {
return nil, fmt.Errorf("fetching created case: %w", err)
}
s.audit.Log(ctx, "create", "case", &id, nil, c)
return &c, nil
}
@@ -239,6 +243,9 @@ func (s *CaseService) Update(ctx context.Context, tenantID, caseID uuid.UUID, us
if err := s.db.GetContext(ctx, &updated, "SELECT * FROM cases WHERE id = $1", caseID); err != nil {
return nil, fmt.Errorf("fetching updated case: %w", err)
}
s.audit.Log(ctx, "update", "case", &caseID, current, updated)
return &updated, nil
}
@@ -254,6 +261,7 @@ func (s *CaseService) Delete(ctx context.Context, tenantID, caseID uuid.UUID, us
return sql.ErrNoRows
}
createEvent(ctx, s.db, tenantID, caseID, userID, "case_archived", "Case archived", nil)
s.audit.Log(ctx, "delete", "case", &caseID, map[string]string{"status": "active"}, map[string]string{"status": "archived"})
return nil
}

View File

@@ -42,6 +42,7 @@ type UpcomingDeadline struct {
ID uuid.UUID `json:"id" db:"id"`
Title string `json:"title" db:"title"`
DueDate string `json:"due_date" db:"due_date"`
CaseID uuid.UUID `json:"case_id" db:"case_id"`
CaseNumber string `json:"case_number" db:"case_number"`
CaseTitle string `json:"case_title" db:"case_title"`
Status string `json:"status" db:"status"`
@@ -56,8 +57,10 @@ type UpcomingAppointment struct {
}
type RecentActivity struct {
ID uuid.UUID `json:"id" db:"id"`
EventType *string `json:"event_type" db:"event_type"`
Title string `json:"title" db:"title"`
CaseID uuid.UUID `json:"case_id" db:"case_id"`
CaseNumber string `json:"case_number" db:"case_number"`
EventDate *time.Time `json:"event_date" db:"event_date"`
}
@@ -109,7 +112,7 @@ func (s *DashboardService) Get(ctx context.Context, tenantID uuid.UUID) (*Dashbo
// Upcoming deadlines (next 7 days)
deadlineQuery := `
SELECT d.id, d.title, d.due_date, c.case_number, c.title AS case_title, d.status
SELECT d.id, d.title, d.due_date, d.case_id, c.case_number, c.title AS case_title, d.status
FROM deadlines d
JOIN cases c ON c.id = d.case_id AND c.tenant_id = d.tenant_id
WHERE d.tenant_id = $1 AND d.status = 'pending' AND d.due_date >= $2 AND d.due_date <= $3
@@ -135,7 +138,7 @@ func (s *DashboardService) Get(ctx context.Context, tenantID uuid.UUID) (*Dashbo
// Recent activity (last 10 case events)
activityQuery := `
SELECT ce.event_type, ce.title, c.case_number, ce.event_date
SELECT ce.id, ce.event_type, ce.title, ce.case_id, c.case_number, ce.event_date
FROM case_events ce
JOIN cases c ON c.id = ce.case_id AND c.tenant_id = ce.tenant_id
WHERE ce.tenant_id = $1

View File

@@ -8,6 +8,12 @@ import (
"mgit.msbls.de/m/KanzlAI-mGMT/internal/models"
)
const ruleColumns = `id, proceeding_type_id, parent_id, code, name, description,
primary_party, event_type, is_mandatory, duration_value, duration_unit,
timing, rule_code, deadline_notes, sequence_order, condition_rule_id,
alt_duration_value, alt_duration_unit, alt_rule_code,
is_spawn, spawn_label, is_active, created_at, updated_at`
// DeadlineRuleService handles deadline rule queries
type DeadlineRuleService struct {
db *sqlx.DB
@@ -25,21 +31,13 @@ func (s *DeadlineRuleService) List(proceedingTypeID *int) ([]models.DeadlineRule
if proceedingTypeID != nil {
err = s.db.Select(&rules,
`SELECT id, proceeding_type_id, parent_id, code, name, description,
primary_party, event_type, is_mandatory, duration_value, duration_unit,
timing, rule_code, deadline_notes, sequence_order, condition_rule_id,
alt_duration_value, alt_duration_unit, alt_rule_code, is_active,
created_at, updated_at
`SELECT `+ruleColumns+`
FROM deadline_rules
WHERE proceeding_type_id = $1 AND is_active = true
ORDER BY sequence_order`, *proceedingTypeID)
} else {
err = s.db.Select(&rules,
`SELECT id, proceeding_type_id, parent_id, code, name, description,
primary_party, event_type, is_mandatory, duration_value, duration_unit,
timing, rule_code, deadline_notes, sequence_order, condition_rule_id,
alt_duration_value, alt_duration_unit, alt_rule_code, is_active,
created_at, updated_at
`SELECT `+ruleColumns+`
FROM deadline_rules
WHERE is_active = true
ORDER BY proceeding_type_id, sequence_order`)
@@ -72,11 +70,7 @@ func (s *DeadlineRuleService) GetRuleTree(proceedingTypeCode string) ([]RuleTree
// Get all rules for this proceeding type
var rules []models.DeadlineRule
err = s.db.Select(&rules,
`SELECT id, proceeding_type_id, parent_id, code, name, description,
primary_party, event_type, is_mandatory, duration_value, duration_unit,
timing, rule_code, deadline_notes, sequence_order, condition_rule_id,
alt_duration_value, alt_duration_unit, alt_rule_code, is_active,
created_at, updated_at
`SELECT `+ruleColumns+`
FROM deadline_rules
WHERE proceeding_type_id = $1 AND is_active = true
ORDER BY sequence_order`, pt.ID)
@@ -87,6 +81,36 @@ func (s *DeadlineRuleService) GetRuleTree(proceedingTypeCode string) ([]RuleTree
return buildTree(rules), nil
}
// GetFullTimeline returns the full event tree for a proceeding type using a recursive CTE.
// Unlike GetRuleTree, this follows parent_id across proceeding types (includes cross-type spawns).
func (s *DeadlineRuleService) GetFullTimeline(proceedingTypeCode string) ([]models.DeadlineRule, *models.ProceedingType, error) {
var pt models.ProceedingType
err := s.db.Get(&pt,
`SELECT id, code, name, description, jurisdiction, default_color, sort_order, is_active
FROM proceeding_types
WHERE code = $1 AND is_active = true`, proceedingTypeCode)
if err != nil {
return nil, nil, fmt.Errorf("resolving proceeding type %q: %w", proceedingTypeCode, err)
}
var rules []models.DeadlineRule
err = s.db.Select(&rules, `
WITH RECURSIVE tree AS (
SELECT * FROM deadline_rules
WHERE proceeding_type_id = $1 AND parent_id IS NULL AND is_active = true
UNION ALL
SELECT dr.* FROM deadline_rules dr
JOIN tree t ON dr.parent_id = t.id
WHERE dr.is_active = true
)
SELECT `+ruleColumns+` FROM tree ORDER BY sequence_order`, pt.ID)
if err != nil {
return nil, nil, fmt.Errorf("fetching timeline for type %q: %w", proceedingTypeCode, err)
}
return rules, &pt, nil
}
// GetByIDs returns deadline rules by their IDs
func (s *DeadlineRuleService) GetByIDs(ids []string) ([]models.DeadlineRule, error) {
if len(ids) == 0 {
@@ -94,11 +118,7 @@ func (s *DeadlineRuleService) GetByIDs(ids []string) ([]models.DeadlineRule, err
}
query, args, err := sqlx.In(
`SELECT id, proceeding_type_id, parent_id, code, name, description,
primary_party, event_type, is_mandatory, duration_value, duration_unit,
timing, rule_code, deadline_notes, sequence_order, condition_rule_id,
alt_duration_value, alt_duration_unit, alt_rule_code, is_active,
created_at, updated_at
`SELECT `+ruleColumns+`
FROM deadline_rules
WHERE id IN (?) AND is_active = true
ORDER BY sequence_order`, ids)
@@ -119,11 +139,7 @@ func (s *DeadlineRuleService) GetByIDs(ids []string) ([]models.DeadlineRule, err
func (s *DeadlineRuleService) GetRulesForProceedingType(proceedingTypeID int) ([]models.DeadlineRule, error) {
var rules []models.DeadlineRule
err := s.db.Select(&rules,
`SELECT id, proceeding_type_id, parent_id, code, name, description,
primary_party, event_type, is_mandatory, duration_value, duration_unit,
timing, rule_code, deadline_notes, sequence_order, condition_rule_id,
alt_duration_value, alt_duration_unit, alt_rule_code, is_active,
created_at, updated_at
`SELECT `+ruleColumns+`
FROM deadline_rules
WHERE proceeding_type_id = $1 AND is_active = true
ORDER BY sequence_order`, proceedingTypeID)

View File

@@ -1,6 +1,7 @@
package services
import (
"context"
"database/sql"
"fmt"
"time"
@@ -13,12 +14,13 @@ import (
// DeadlineService handles CRUD operations for case deadlines
type DeadlineService struct {
db *sqlx.DB
db *sqlx.DB
audit *AuditService
}
// NewDeadlineService creates a new deadline service
func NewDeadlineService(db *sqlx.DB) *DeadlineService {
return &DeadlineService{db: db}
func NewDeadlineService(db *sqlx.DB, audit *AuditService) *DeadlineService {
return &DeadlineService{db: db, audit: audit}
}
// ListAll returns all deadlines for a tenant, ordered by due_date
@@ -87,7 +89,7 @@ type CreateDeadlineInput struct {
}
// Create inserts a new deadline
func (s *DeadlineService) Create(tenantID uuid.UUID, input CreateDeadlineInput) (*models.Deadline, error) {
func (s *DeadlineService) Create(ctx context.Context, tenantID uuid.UUID, input CreateDeadlineInput) (*models.Deadline, error) {
id := uuid.New()
source := input.Source
if source == "" {
@@ -108,6 +110,7 @@ func (s *DeadlineService) Create(tenantID uuid.UUID, input CreateDeadlineInput)
if err != nil {
return nil, fmt.Errorf("creating deadline: %w", err)
}
s.audit.Log(ctx, "create", "deadline", &id, nil, d)
return &d, nil
}
@@ -123,7 +126,7 @@ type UpdateDeadlineInput struct {
}
// Update modifies an existing deadline
func (s *DeadlineService) Update(tenantID, deadlineID uuid.UUID, input UpdateDeadlineInput) (*models.Deadline, error) {
func (s *DeadlineService) Update(ctx context.Context, tenantID, deadlineID uuid.UUID, input UpdateDeadlineInput) (*models.Deadline, error) {
// First check it exists and belongs to tenant
existing, err := s.GetByID(tenantID, deadlineID)
if err != nil {
@@ -154,11 +157,12 @@ func (s *DeadlineService) Update(tenantID, deadlineID uuid.UUID, input UpdateDea
if err != nil {
return nil, fmt.Errorf("updating deadline: %w", err)
}
s.audit.Log(ctx, "update", "deadline", &deadlineID, existing, d)
return &d, nil
}
// Complete marks a deadline as completed
func (s *DeadlineService) Complete(tenantID, deadlineID uuid.UUID) (*models.Deadline, error) {
func (s *DeadlineService) Complete(ctx context.Context, tenantID, deadlineID uuid.UUID) (*models.Deadline, error) {
query := `UPDATE deadlines SET
status = 'completed',
completed_at = $1,
@@ -176,11 +180,12 @@ func (s *DeadlineService) Complete(tenantID, deadlineID uuid.UUID) (*models.Dead
}
return nil, fmt.Errorf("completing deadline: %w", err)
}
s.audit.Log(ctx, "update", "deadline", &deadlineID, map[string]string{"status": "pending"}, map[string]string{"status": "completed"})
return &d, nil
}
// Delete removes a deadline
func (s *DeadlineService) Delete(tenantID, deadlineID uuid.UUID) error {
func (s *DeadlineService) Delete(ctx context.Context, tenantID, deadlineID uuid.UUID) error {
query := `DELETE FROM deadlines WHERE id = $1 AND tenant_id = $2`
result, err := s.db.Exec(query, deadlineID, tenantID)
if err != nil {
@@ -193,5 +198,6 @@ func (s *DeadlineService) Delete(tenantID, deadlineID uuid.UUID) error {
if rows == 0 {
return fmt.Errorf("deadline not found")
}
s.audit.Log(ctx, "delete", "deadline", &deadlineID, nil, nil)
return nil
}

View File

@@ -0,0 +1,236 @@
package services
import (
"fmt"
"time"
"github.com/jmoiron/sqlx"
"mgit.msbls.de/m/KanzlAI-mGMT/internal/models"
)
// DetermineService handles event-driven deadline determination.
// It walks the proceeding event tree and calculates cascading dates.
type DetermineService struct {
rules *DeadlineRuleService
calculator *DeadlineCalculator
}
// NewDetermineService creates a new determine service
func NewDetermineService(db *sqlx.DB, calculator *DeadlineCalculator) *DetermineService {
return &DetermineService{
rules: NewDeadlineRuleService(db),
calculator: calculator,
}
}
// TimelineEvent represents a calculated event in the proceeding timeline
type TimelineEvent struct {
ID string `json:"id"`
Code string `json:"code,omitempty"`
Name string `json:"name"`
Description string `json:"description,omitempty"`
PrimaryParty string `json:"primary_party,omitempty"`
EventType string `json:"event_type,omitempty"`
IsMandatory bool `json:"is_mandatory"`
DurationValue int `json:"duration_value"`
DurationUnit string `json:"duration_unit"`
RuleCode string `json:"rule_code,omitempty"`
DeadlineNotes string `json:"deadline_notes,omitempty"`
IsSpawn bool `json:"is_spawn"`
SpawnLabel string `json:"spawn_label,omitempty"`
HasCondition bool `json:"has_condition"`
ConditionRuleID string `json:"condition_rule_id,omitempty"`
AltRuleCode string `json:"alt_rule_code,omitempty"`
AltDurationValue *int `json:"alt_duration_value,omitempty"`
AltDurationUnit string `json:"alt_duration_unit,omitempty"`
Date string `json:"date,omitempty"`
OriginalDate string `json:"original_date,omitempty"`
WasAdjusted bool `json:"was_adjusted"`
Children []TimelineEvent `json:"children,omitempty"`
}
// DetermineRequest is the input for POST /api/deadlines/determine
type DetermineRequest struct {
ProceedingType string `json:"proceeding_type"`
TriggerEventDate string `json:"trigger_event_date"`
Conditions map[string]bool `json:"conditions"`
}
// DetermineResponse is the output of the determine endpoint
type DetermineResponse struct {
ProceedingType string `json:"proceeding_type"`
ProceedingName string `json:"proceeding_name"`
ProceedingColor string `json:"proceeding_color"`
TriggerDate string `json:"trigger_event_date"`
Timeline []TimelineEvent `json:"timeline"`
TotalDeadlines int `json:"total_deadlines"`
}
// GetTimeline returns the proceeding event tree (without date calculations)
func (s *DetermineService) GetTimeline(proceedingTypeCode string) ([]TimelineEvent, *models.ProceedingType, error) {
rules, pt, err := s.rules.GetFullTimeline(proceedingTypeCode)
if err != nil {
return nil, nil, err
}
tree := buildTimelineTree(rules)
return tree, pt, nil
}
// Determine calculates the full timeline with cascading dates
func (s *DetermineService) Determine(req DetermineRequest) (*DetermineResponse, error) {
timeline, pt, err := s.GetTimeline(req.ProceedingType)
if err != nil {
return nil, fmt.Errorf("loading timeline: %w", err)
}
triggerDate, err := time.Parse("2006-01-02", req.TriggerEventDate)
if err != nil {
return nil, fmt.Errorf("invalid trigger_event_date: %w", err)
}
conditions := req.Conditions
if conditions == nil {
conditions = make(map[string]bool)
}
total := s.calculateDates(timeline, triggerDate, conditions)
return &DetermineResponse{
ProceedingType: pt.Code,
ProceedingName: pt.Name,
ProceedingColor: pt.DefaultColor,
TriggerDate: req.TriggerEventDate,
Timeline: timeline,
TotalDeadlines: total,
}, nil
}
// calculateDates walks the tree and calculates dates for each node
func (s *DetermineService) calculateDates(events []TimelineEvent, parentDate time.Time, conditions map[string]bool) int {
total := 0
for i := range events {
ev := &events[i]
// Skip inactive spawns: if this is a spawn node and conditions don't include it, skip
if ev.IsSpawn && !conditions[ev.ID] {
continue
}
durationValue := ev.DurationValue
durationUnit := ev.DurationUnit
ruleCode := ev.RuleCode
// Apply conditional logic
if ev.HasCondition && ev.ConditionRuleID != "" {
if conditions[ev.ConditionRuleID] {
if ev.AltDurationValue != nil {
durationValue = *ev.AltDurationValue
}
if ev.AltDurationUnit != "" {
durationUnit = ev.AltDurationUnit
}
if ev.AltRuleCode != "" {
ruleCode = ev.AltRuleCode
}
}
}
// Calculate this node's date
if durationValue > 0 {
rule := models.DeadlineRule{
DurationValue: durationValue,
DurationUnit: durationUnit,
}
adjusted, original, wasAdjusted := s.calculator.CalculateEndDate(parentDate, rule)
ev.Date = adjusted.Format("2006-01-02")
ev.OriginalDate = original.Format("2006-01-02")
ev.WasAdjusted = wasAdjusted
} else {
ev.Date = parentDate.Format("2006-01-02")
ev.OriginalDate = parentDate.Format("2006-01-02")
}
ev.RuleCode = ruleCode
total++
// Recurse: children's dates cascade from this node's date
if len(ev.Children) > 0 {
childDate, _ := time.Parse("2006-01-02", ev.Date)
total += s.calculateDates(ev.Children, childDate, conditions)
}
}
return total
}
// buildTimelineTree converts flat rules to a tree of TimelineEvents.
// Edges are recorded by ID first and materialized depth-first at the end:
// appending *ev value copies eagerly would drop grandchildren attached after
// their parent has already been copied into its own parent's Children slice
// (the input is ordered by sequence_order, not parent-before-child).
func buildTimelineTree(rules []models.DeadlineRule) []TimelineEvent {
nodeMap := make(map[string]*TimelineEvent, len(rules))
childIDs := make(map[string][]string, len(rules))
var rootIDs []string
// Create event nodes
for _, r := range rules {
ev := ruleToEvent(r)
nodeMap[r.ID.String()] = &ev
}
// Record parent/child edges by ID, preserving sibling order
for _, r := range rules {
id := r.ID.String()
if r.ParentID != nil {
parentKey := r.ParentID.String()
if _, ok := nodeMap[parentKey]; ok {
childIDs[parentKey] = append(childIDs[parentKey], id)
continue
}
}
rootIDs = append(rootIDs, id)
}
// Materialize depth-first so every copy already contains its full subtree
var materialize func(id string) TimelineEvent
materialize = func(id string) TimelineEvent {
ev := *nodeMap[id]
for _, childID := range childIDs[id] {
ev.Children = append(ev.Children, materialize(childID))
}
return ev
}
roots := make([]TimelineEvent, 0, len(rootIDs))
for _, id := range rootIDs {
roots = append(roots, materialize(id))
}
return roots
}
func ruleToEvent(r models.DeadlineRule) TimelineEvent {
ev := TimelineEvent{
ID: r.ID.String(),
Name: r.Name,
IsMandatory: r.IsMandatory,
DurationValue: r.DurationValue,
DurationUnit: r.DurationUnit,
IsSpawn: r.IsSpawn,
HasCondition: r.ConditionRuleID != nil,
}
if r.Code != nil {
ev.Code = *r.Code
}
if r.Description != nil {
ev.Description = *r.Description
}
if r.PrimaryParty != nil {
ev.PrimaryParty = *r.PrimaryParty
}
if r.EventType != nil {
ev.EventType = *r.EventType
}
if r.RuleCode != nil {
ev.RuleCode = *r.RuleCode
}
if r.DeadlineNotes != nil {
ev.DeadlineNotes = *r.DeadlineNotes
}
if r.SpawnLabel != nil {
ev.SpawnLabel = *r.SpawnLabel
}
if r.ConditionRuleID != nil {
ev.ConditionRuleID = r.ConditionRuleID.String()
}
if r.AltRuleCode != nil {
ev.AltRuleCode = *r.AltRuleCode
}
ev.AltDurationValue = r.AltDurationValue
if r.AltDurationUnit != nil {
ev.AltDurationUnit = *r.AltDurationUnit
}
return ev
}

View File

@@ -18,10 +18,11 @@ const documentBucket = "kanzlai-documents"
type DocumentService struct {
db *sqlx.DB
storage *StorageClient
audit *AuditService
}
func NewDocumentService(db *sqlx.DB, storage *StorageClient) *DocumentService {
return &DocumentService{db: db, storage: storage}
func NewDocumentService(db *sqlx.DB, storage *StorageClient, audit *AuditService) *DocumentService {
return &DocumentService{db: db, storage: storage, audit: audit}
}
type CreateDocumentInput struct {
@@ -97,6 +98,7 @@ func (s *DocumentService) Create(ctx context.Context, tenantID, caseID, userID u
if err := s.db.GetContext(ctx, &doc, "SELECT * FROM documents WHERE id = $1", id); err != nil {
return nil, fmt.Errorf("fetching created document: %w", err)
}
s.audit.Log(ctx, "create", "document", &id, nil, doc)
return &doc, nil
}
@@ -151,6 +153,7 @@ func (s *DocumentService) Delete(ctx context.Context, tenantID, docID, userID uu
// Log case event
createEvent(ctx, s.db, tenantID, doc.CaseID, userID, "document_deleted",
fmt.Sprintf("Document deleted: %s", doc.Title), nil)
s.audit.Log(ctx, "delete", "document", &docID, doc, nil)
return nil
}

View File

@@ -0,0 +1,393 @@
package services
import (
"fmt"
"math"
"mgit.msbls.de/m/KanzlAI-mGMT/internal/models"
)
// FeeCalculator computes patent litigation costs based on GKG/RVG fee schedules.
type FeeCalculator struct{}
func NewFeeCalculator() *FeeCalculator {
return &FeeCalculator{}
}
// resolveSchedule returns the actual schedule data, resolving aliases.
func resolveSchedule(version string) (*feeScheduleData, error) {
sched, ok := feeSchedules[version]
if !ok {
return nil, fmt.Errorf("unknown fee schedule: %s", version)
}
if sched.AliasOf != "" {
target, ok := feeSchedules[sched.AliasOf]
if !ok {
return nil, fmt.Errorf("alias target not found: %s", sched.AliasOf)
}
return target, nil
}
return sched, nil
}
// CalculateBaseFee computes the base 1.0x fee using the step-based accumulator.
// feeType must be "GKG" or "RVG".
//
// Algorithm: The first bracket defines the minimum fee (one step). Each subsequent
// bracket accumulates ceil(portion / stepSize) * increment for the portion of the
// Streitwert that falls within that bracket's range.
func (fc *FeeCalculator) CalculateBaseFee(streitwert float64, version string, feeType string) (float64, error) {
sched, err := resolveSchedule(version)
if err != nil {
return 0, err
}
if streitwert <= 0 {
return 0, nil
}
brackets := sched.Brackets
fee := 0.0
prevUpper := 0.0
for i, b := range brackets {
increment := b.GKGIncrement
if feeType == "RVG" {
increment = b.RVGIncrement
}
if i == 0 {
// First bracket: minimum fee = one increment
fee = increment
prevUpper = b.UpperBound
continue
}
if streitwert <= prevUpper {
break
}
portion := math.Min(streitwert, b.UpperBound) - prevUpper
if portion <= 0 {
break
}
steps := math.Ceil(portion / b.StepSize)
fee += steps * increment
prevUpper = b.UpperBound
}
return fee, nil
}
// CalculateCourtFees computes court fees for a given instance type.
// Returns the base fee, multiplier, computed court fee, and the fee source.
func (fc *FeeCalculator) CalculateCourtFees(streitwert float64, instanceType string, version string) (baseFee, multiplier, courtFee float64, source string, err error) {
cfg, ok := instanceConfigs[instanceType]
if !ok {
return 0, 0, 0, "", fmt.Errorf("unknown instance type: %s", instanceType)
}
source = cfg.CourtSource
if cfg.CourtSource == "fixed" {
return 0, 0, cfg.FixedCourtFee, source, nil
}
// Both GKG and PatKostG use the same step-based GKG bracket lookup
baseFee, err = fc.CalculateBaseFee(streitwert, version, "GKG")
if err != nil {
return 0, 0, 0, "", err
}
multiplier = cfg.CourtFactor
courtFee = baseFee * multiplier
return baseFee, multiplier, courtFee, source, nil
}
// CalculateAttorneyFees computes the fees for one attorney type at one instance.
// Returns nil if numAttorneys <= 0.
func (fc *FeeCalculator) CalculateAttorneyFees(
streitwert float64,
version string,
vgFactor, tgFactor float64,
numAttorneys, numClients int,
oralHearing bool,
vatRate float64,
) (*models.AttorneyBreakdown, error) {
if numAttorneys <= 0 {
return nil, nil
}
baseFee, err := fc.CalculateBaseFee(streitwert, version, "RVG")
if err != nil {
return nil, err
}
vgFee := vgFactor * baseFee
// Increase fee (Nr. 1008 VV RVG) for multiple clients
increaseFee := 0.0
if numClients > 1 {
factor := float64(numClients-1) * erhoehungsfaktor
if factor > erhoehungsfaktorMax {
factor = erhoehungsfaktorMax
}
increaseFee = factor * baseFee
}
tgFee := 0.0
if oralHearing {
tgFee = tgFactor * baseFee
}
subtotalNet := vgFee + increaseFee + tgFee + auslagenpauschale
vat := subtotalNet * vatRate
subtotalGross := subtotalNet + vat
totalGross := subtotalGross * float64(numAttorneys)
return &models.AttorneyBreakdown{
BaseFee: baseFee,
VGFactor: vgFactor,
VGFee: vgFee,
IncreaseFee: increaseFee,
TGFactor: tgFactor,
TGFee: tgFee,
Pauschale: auslagenpauschale,
SubtotalNet: subtotalNet,
VAT: vat,
SubtotalGross: subtotalGross,
Count: numAttorneys,
TotalGross: totalGross,
}, nil
}
// CalculateInstanceTotal computes the full cost for one court instance.
// Bug 3 fix: expert fees are included in the court subtotal (not silently dropped).
func (fc *FeeCalculator) CalculateInstanceTotal(streitwert float64, inst models.InstanceInput, vatRate float64) (*models.InstanceResult, error) {
cfg, ok := instanceConfigs[string(inst.Type)]
if !ok {
return nil, fmt.Errorf("unknown instance type: %s", inst.Type)
}
version := string(inst.FeeVersion)
if version == "" {
version = "Aktuell"
}
baseFee, multiplier, courtFee, source, err := fc.CalculateCourtFees(streitwert, string(inst.Type), version)
if err != nil {
return nil, fmt.Errorf("court fees: %w", err)
}
// Bug 3 fix: include expert fees in court subtotal
courtSubtotal := courtFee + inst.ExpertFees
// Attorney (Rechtsanwalt) fees
raBreakdown, err := fc.CalculateAttorneyFees(
streitwert, version,
cfg.RAVGFactor, cfg.RATGFactor,
inst.NumAttorneys, inst.NumClients,
inst.OralHearing, vatRate,
)
if err != nil {
return nil, fmt.Errorf("attorney fees: %w", err)
}
// Patent attorney (Patentanwalt) fees
var paBreakdown *models.AttorneyBreakdown
if cfg.HasPA && inst.NumPatentAttorneys > 0 {
paBreakdown, err = fc.CalculateAttorneyFees(
streitwert, version,
cfg.PAVGFactor, cfg.PATGFactor,
inst.NumPatentAttorneys, inst.NumClients,
inst.OralHearing, vatRate,
)
if err != nil {
return nil, fmt.Errorf("patent attorney fees: %w", err)
}
}
attorneyTotal := 0.0
if raBreakdown != nil {
attorneyTotal = raBreakdown.TotalGross
}
paTotal := 0.0
if paBreakdown != nil {
paTotal = paBreakdown.TotalGross
}
return &models.InstanceResult{
Type: inst.Type,
Label: cfg.Label,
CourtFeeBase: baseFee,
CourtFeeMultiplier: multiplier,
CourtFeeSource: source,
CourtFee: courtFee,
ExpertFees: inst.ExpertFees,
CourtSubtotal: courtSubtotal,
AttorneyBreakdown: raBreakdown,
PatentAttorneyBreakdown: paBreakdown,
AttorneySubtotal: attorneyTotal,
PatentAttorneySubtotal: paTotal,
InstanceTotal: courtSubtotal + attorneyTotal + paTotal,
}, nil
}
// CalculateSecurityForCosts computes the Prozesskostensicherheit.
//
// Bug 1 fix: uses (1 + VAT) for the total, not (1 - VAT) as in the Excel.
// Bug 2 fix: uses GKG base fee for the court fee component, not RVG.
func (fc *FeeCalculator) CalculateSecurityForCosts(streitwert float64, version string, numClients int, vatRate float64) (*models.SecurityForCosts, error) {
rvgBase, err := fc.CalculateBaseFee(streitwert, version, "RVG")
if err != nil {
return nil, err
}
// Bug 2 fix: use GKG base for court fees, not RVG
gkgBase, err := fc.CalculateBaseFee(streitwert, version, "GKG")
if err != nil {
return nil, err
}
// Increase fee (Nr. 1008 VV RVG) for multiple clients
increaseFee := 0.0
if numClients > 1 {
factor := float64(numClients-1) * erhoehungsfaktor
if factor > erhoehungsfaktorMax {
factor = erhoehungsfaktorMax
}
increaseFee = factor * rvgBase
}
// 1. Instanz: 2.5x RA + increase + 2.5x PA + increase + EUR 5,000
inst1 := 2.5*rvgBase + increaseFee + 2.5*rvgBase + increaseFee + 5000
// 2. Instanz: 2.8x RA + increase + 2.8x PA + increase + 4.0x GKG court + EUR 5,000
inst2 := 2.8*rvgBase + increaseFee + 2.8*rvgBase + increaseFee + 4.0*gkgBase + 5000
// NZB: 2.3x RA + increase + 2.3x PA + increase
nzb := 2.3*rvgBase + increaseFee + 2.3*rvgBase + increaseFee
subtotalNet := inst1 + inst2 + nzb
// Bug 1 fix: add VAT, don't subtract
vat := subtotalNet * vatRate
totalGross := subtotalNet + vat
return &models.SecurityForCosts{
Instance1: inst1,
Instance2: inst2,
NZB: nzb,
SubtotalNet: subtotalNet,
VAT: vat,
TotalGross: totalGross,
}, nil
}
// CalculateFullLitigation computes costs for all enabled instances in a proceeding path.
func (fc *FeeCalculator) CalculateFullLitigation(req models.FeeCalculateRequest) (*models.FeeCalculateResponse, error) {
if req.Streitwert <= 0 {
return nil, fmt.Errorf("streitwert must be positive")
}
resp := &models.FeeCalculateResponse{}
grandTotal := 0.0
for _, inst := range req.Instances {
if !inst.Enabled {
continue
}
result, err := fc.CalculateInstanceTotal(req.Streitwert, inst, req.VATRate)
if err != nil {
return nil, fmt.Errorf("instance %s: %w", inst.Type, err)
}
resp.Instances = append(resp.Instances, *result)
grandTotal += result.InstanceTotal
}
// Build totals based on proceeding path
switch req.ProceedingPath {
case models.PathInfringement:
resp.Totals = fc.infringementTotals(resp.Instances)
case models.PathNullity:
resp.Totals = []models.FeeTotal{{Label: "Gesamtkosten Nichtigkeitsverfahren", Total: grandTotal}}
case models.PathCancellation:
resp.Totals = []models.FeeTotal{{Label: "Gesamtkosten Löschungsverfahren", Total: grandTotal}}
default:
resp.Totals = []models.FeeTotal{{Label: "Gesamtkosten", Total: grandTotal}}
}
// Security for costs (only for infringement proceedings)
if req.IncludeSecurityCosts && req.ProceedingPath == models.PathInfringement {
version := "Aktuell"
numClients := 1
for _, inst := range req.Instances {
if inst.Enabled {
version = string(inst.FeeVersion)
if version == "" {
version = "Aktuell"
}
numClients = inst.NumClients
break
}
}
sec, err := fc.CalculateSecurityForCosts(req.Streitwert, version, numClients, req.VATRate)
if err != nil {
return nil, fmt.Errorf("security for costs: %w", err)
}
resp.SecurityForCosts = sec
}
return resp, nil
}
// infringementTotals builds summary totals for infringement proceedings.
func (fc *FeeCalculator) infringementTotals(instances []models.InstanceResult) []models.FeeTotal {
byType := make(map[models.InstanceType]float64)
for _, inst := range instances {
byType[inst.Type] = inst.InstanceTotal
}
var totals []models.FeeTotal
// "Gesamtkosten bei Nichtzulassung" = LG + OLG + BGH NZB
nzb := byType[models.InstanceLG] + byType[models.InstanceOLG] + byType[models.InstanceBGHNZB]
if nzb > 0 {
totals = append(totals, models.FeeTotal{Label: "Gesamtkosten bei Nichtzulassung", Total: nzb})
}
// "Gesamtkosten bei Revision" = LG + OLG + BGH Rev
rev := byType[models.InstanceLG] + byType[models.InstanceOLG] + byType[models.InstanceBGHRev]
if rev > 0 {
totals = append(totals, models.FeeTotal{Label: "Gesamtkosten bei Revision", Total: rev})
}
if len(totals) == 0 {
total := 0.0
for _, v := range byType {
total += v
}
totals = append(totals, models.FeeTotal{Label: "Gesamtkosten Verletzungsverfahren", Total: total})
}
return totals
}
// GetSchedules returns information about all available fee schedules.
func (fc *FeeCalculator) GetSchedules() []models.FeeScheduleInfo {
order := []string{"2005", "2013", "2021", "2025", "Aktuell"}
result := make([]models.FeeScheduleInfo, 0, len(order))
for _, key := range order {
sched := feeSchedules[key]
info := models.FeeScheduleInfo{
Key: key,
Label: sched.Label,
ValidFrom: sched.ValidFrom,
}
if sched.AliasOf != "" {
info.IsAlias = true
info.AliasOf = sched.AliasOf
}
result = append(result, info)
}
return result
}


@@ -0,0 +1,421 @@
package services
import (
"math"
"testing"
"mgit.msbls.de/m/KanzlAI-mGMT/internal/models"
)
func TestCalculateBaseFee_GKG_2025(t *testing.T) {
fc := NewFeeCalculator()
tests := []struct {
name string
streitwert float64
want float64
}{
{"minimum (500 EUR)", 500, 40},
{"1000 EUR", 1000, 61}, // 40 + ceil(500/500)*21 = 40+21
{"2000 EUR", 2000, 103}, // 40 + ceil(1500/500)*21 = 40+63
{"500k EUR", 500000, 4138}, // verified against GKG Anlage 2
{"1M EUR", 1000000, 6238}, // 4138 + ceil(500k/50k)*210 = 4138+2100
{"3M EUR", 3000000, 14638}, // 4138 + ceil(2.5M/50k)*210 = 4138+10500
{"5M EUR", 5000000, 23038}, // 4138 + ceil(4.5M/50k)*210 = 4138+18900
{"10M EUR", 10000000, 44038}, // 4138 + ceil(9.5M/50k)*210 = 4138+39900
{"30M EUR", 30000000, 128038}, // 4138 + ceil(29.5M/50k)*210 = 4138+123900
{"50M EUR", 50000000, 212038}, // 4138 + ceil(49.5M/50k)*210 = 4138+207900
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
got, err := fc.CalculateBaseFee(tt.streitwert, "2025", "GKG")
if err != nil {
t.Fatalf("unexpected error: %v", err)
}
if math.Abs(got-tt.want) > 0.01 {
t.Errorf("CalculateBaseFee(%v, 2025, GKG) = %v, want %v", tt.streitwert, got, tt.want)
}
})
}
}
func TestCalculateBaseFee_RVG_2025(t *testing.T) {
fc := NewFeeCalculator()
tests := []struct {
name string
streitwert float64
want float64
}{
{"minimum (500 EUR)", 500, 51.5},
{"1000 EUR", 1000, 93}, // 51.5 + ceil(500/500)*41.5 = 51.5+41.5
{"2000 EUR", 2000, 176}, // 51.5 + ceil(1500/500)*41.5 = 51.5+124.5
{"500k EUR", 500000, 3752}, // computed via brackets
{"1M EUR", 1000000, 5502}, // 3752 + ceil(500k/50k)*175 = 3752+1750
{"3M EUR", 3000000, 12502}, // 3752 + ceil(2.5M/50k)*175 = 3752+8750
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
got, err := fc.CalculateBaseFee(tt.streitwert, "2025", "RVG")
if err != nil {
t.Fatalf("unexpected error: %v", err)
}
if math.Abs(got-tt.want) > 0.01 {
t.Errorf("CalculateBaseFee(%v, 2025, RVG) = %v, want %v", tt.streitwert, got, tt.want)
}
})
}
}
func TestCalculateBaseFee_GKG_2013(t *testing.T) {
fc := NewFeeCalculator()
tests := []struct {
name string
streitwert float64
want float64
}{
{"minimum (500 EUR)", 500, 35},
{"1000 EUR", 1000, 53}, // 35 + ceil(500/500)*18 = 35+18
{"1500 EUR", 1500, 71}, // 35 + ceil(1000/500)*18 = 35+36
{"2000 EUR", 2000, 89}, // 35 + ceil(1500/500)*18 = 35+54
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
got, err := fc.CalculateBaseFee(tt.streitwert, "2013", "GKG")
if err != nil {
t.Fatalf("unexpected error: %v", err)
}
if math.Abs(got-tt.want) > 0.01 {
t.Errorf("CalculateBaseFee(%v, 2013, GKG) = %v, want %v", tt.streitwert, got, tt.want)
}
})
}
}
func TestCalculateBaseFee_Aktuell_IsAlias(t *testing.T) {
fc := NewFeeCalculator()
aktuell, err := fc.CalculateBaseFee(1000000, "Aktuell", "GKG")
if err != nil {
t.Fatalf("unexpected error: %v", err)
}
v2025, err := fc.CalculateBaseFee(1000000, "2025", "GKG")
if err != nil {
t.Fatalf("unexpected error: %v", err)
}
if aktuell != v2025 {
t.Errorf("Aktuell (%v) should equal 2025 (%v)", aktuell, v2025)
}
}
func TestCalculateBaseFee_InvalidSchedule(t *testing.T) {
fc := NewFeeCalculator()
_, err := fc.CalculateBaseFee(1000, "1999", "GKG")
if err == nil {
t.Error("expected error for unknown schedule")
}
}
func TestCalculateCourtFees(t *testing.T) {
fc := NewFeeCalculator()
tests := []struct {
name string
streitwert float64
instanceType string
version string
wantCourtFee float64
wantSource string
}{
{"LG 1M (3.0x GKG)", 1000000, "LG", "2025", 18714, "GKG"},
{"LG 3M (3.0x GKG)", 3000000, "LG", "2025", 43914, "GKG"},
{"OLG 3M (4.0x GKG)", 3000000, "OLG", "2025", 58552, "GKG"},
{"BPatG 1M (4.5x PatKostG)", 1000000, "BPatG", "2025", 28071, "PatKostG"},
{"DPMA (fixed 300)", 1000000, "DPMA", "2025", 300, "fixed"},
{"BPatG_Canc (fixed 500)", 1000000, "BPatG_Canc", "2025", 500, "fixed"},
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
_, _, courtFee, source, err := fc.CalculateCourtFees(tt.streitwert, tt.instanceType, tt.version)
if err != nil {
t.Fatalf("unexpected error: %v", err)
}
if math.Abs(courtFee-tt.wantCourtFee) > 0.01 {
t.Errorf("court fee = %v, want %v", courtFee, tt.wantCourtFee)
}
if source != tt.wantSource {
t.Errorf("source = %v, want %v", source, tt.wantSource)
}
})
}
}
func TestCalculateAttorneyFees(t *testing.T) {
fc := NewFeeCalculator()
// 1 RA at LG, 1M Streitwert, 1 client, oral hearing, no VAT
bd, err := fc.CalculateAttorneyFees(1000000, "2025", 1.3, 1.2, 1, 1, true, 0)
if err != nil {
t.Fatalf("unexpected error: %v", err)
}
if bd == nil {
t.Fatal("expected non-nil breakdown")
}
// RVG base at 1M = 5502
if math.Abs(bd.BaseFee-5502) > 0.01 {
t.Errorf("base fee = %v, want 5502", bd.BaseFee)
}
// VG = 1.3 * 5502 = 7152.6
if math.Abs(bd.VGFee-7152.6) > 0.01 {
t.Errorf("VG fee = %v, want 7152.6", bd.VGFee)
}
// No increase (1 client)
if bd.IncreaseFee != 0 {
t.Errorf("increase fee = %v, want 0", bd.IncreaseFee)
}
// TG = 1.2 * 5502 = 6602.4
if math.Abs(bd.TGFee-6602.4) > 0.01 {
t.Errorf("TG fee = %v, want 6602.4", bd.TGFee)
}
// Subtotal net = 7152.6 + 0 + 6602.4 + 20 = 13775
if math.Abs(bd.SubtotalNet-13775) > 0.01 {
t.Errorf("subtotal net = %v, want 13775", bd.SubtotalNet)
}
}
func TestCalculateAttorneyFees_WithVAT(t *testing.T) {
fc := NewFeeCalculator()
bd, err := fc.CalculateAttorneyFees(1000000, "2025", 1.3, 1.2, 1, 1, true, 0.19)
if err != nil {
t.Fatalf("unexpected error: %v", err)
}
// Net = 13775, VAT = 13775 * 0.19 = 2617.25
wantVAT := 13775 * 0.19
if math.Abs(bd.VAT-wantVAT) > 0.01 {
t.Errorf("VAT = %v, want %v", bd.VAT, wantVAT)
}
wantGross := 13775 + wantVAT
if math.Abs(bd.SubtotalGross-wantGross) > 0.01 {
t.Errorf("subtotal gross = %v, want %v", bd.SubtotalGross, wantGross)
}
}
func TestCalculateAttorneyFees_MultipleClients(t *testing.T) {
fc := NewFeeCalculator()
// 3 clients → increase factor = min((3-1)*0.3, 2.0) = 0.6
bd, err := fc.CalculateAttorneyFees(1000000, "2025", 1.3, 1.2, 1, 3, true, 0)
if err != nil {
t.Fatalf("unexpected error: %v", err)
}
wantIncrease := 0.6 * 5502 // 3301.2
if math.Abs(bd.IncreaseFee-wantIncrease) > 0.01 {
t.Errorf("increase fee = %v, want %v", bd.IncreaseFee, wantIncrease)
}
}
func TestCalculateAttorneyFees_IncreaseCapped(t *testing.T) {
fc := NewFeeCalculator()
// 10 clients → factor = min((10-1)*0.3, 2.0) = min(2.7, 2.0) = 2.0
bd, err := fc.CalculateAttorneyFees(1000000, "2025", 1.3, 1.2, 1, 10, true, 0)
if err != nil {
t.Fatalf("unexpected error: %v", err)
}
wantIncrease := 2.0 * 5502 // 11004
if math.Abs(bd.IncreaseFee-wantIncrease) > 0.01 {
t.Errorf("increase fee = %v, want %v (should be capped at 2.0x)", bd.IncreaseFee, wantIncrease)
}
}
func TestCalculateAttorneyFees_NoHearing(t *testing.T) {
fc := NewFeeCalculator()
bd, err := fc.CalculateAttorneyFees(1000000, "2025", 1.3, 1.2, 1, 1, false, 0)
if err != nil {
t.Fatalf("unexpected error: %v", err)
}
if bd.TGFee != 0 {
t.Errorf("TG fee = %v, want 0 (no hearing)", bd.TGFee)
}
}
func TestCalculateAttorneyFees_ZeroAttorneys(t *testing.T) {
fc := NewFeeCalculator()
bd, err := fc.CalculateAttorneyFees(1000000, "2025", 1.3, 1.2, 0, 1, true, 0)
if err != nil {
t.Fatalf("unexpected error: %v", err)
}
if bd != nil {
t.Error("expected nil breakdown for 0 attorneys")
}
}
func TestCalculateInstanceTotal_ExpertFees(t *testing.T) {
fc := NewFeeCalculator()
// Bug 3 fix: expert fees must be included in court subtotal
inst := models.InstanceInput{
Type: models.InstanceBPatG,
Enabled: true,
FeeVersion: "2025",
NumAttorneys: 1,
NumPatentAttorneys: 1,
NumClients: 1,
OralHearing: true,
ExpertFees: 10000,
}
result, err := fc.CalculateInstanceTotal(1000000, inst, 0)
if err != nil {
t.Fatalf("unexpected error: %v", err)
}
// Court subtotal should include expert fees
wantCourtSubtotal := result.CourtFee + 10000
if math.Abs(result.CourtSubtotal-wantCourtSubtotal) > 0.01 {
t.Errorf("court subtotal = %v, want %v (should include expert fees)", result.CourtSubtotal, wantCourtSubtotal)
}
// Instance total should include expert fees
if result.InstanceTotal < result.CourtFee+10000 {
t.Errorf("instance total %v should include expert fees (court fee %v + expert 10000)", result.InstanceTotal, result.CourtFee)
}
}
func TestCalculateSecurityForCosts_BugFixes(t *testing.T) {
fc := NewFeeCalculator()
// Bug 1: VAT should be added (1 + VAT), not subtracted
sec, err := fc.CalculateSecurityForCosts(1000000, "2025", 1, 0.19)
if err != nil {
t.Fatalf("unexpected error: %v", err)
}
if sec.TotalGross < sec.SubtotalNet {
t.Errorf("Bug 1: total %v should be > subtotal %v (VAT should add, not subtract)", sec.TotalGross, sec.SubtotalNet)
}
wantTotal := sec.SubtotalNet * 1.19
if math.Abs(sec.TotalGross-wantTotal) > 0.01 {
t.Errorf("Bug 1: total = %v, want %v", sec.TotalGross, wantTotal)
}
// Bug 2: verify inst2 uses GKG base, not RVG base
// At 1M: GKG base = 6238, RVG base = 5502
// inst2 includes "4.0x court" = 4.0 * GKG_base = 24952
// If it incorrectly used RVG: 4.0 * 5502 = 22008
rvgBase := 5502.0
gkgBase := 6238.0
expectedInst2 := 2.8*rvgBase + 2.8*rvgBase + 4.0*gkgBase + 5000
if math.Abs(sec.Instance2-expectedInst2) > 0.01 {
t.Errorf("Bug 2: instance2 = %v, want %v (should use GKG base for court fee)", sec.Instance2, expectedInst2)
}
}
func TestCalculateSecurityForCosts_ZeroVAT(t *testing.T) {
fc := NewFeeCalculator()
sec, err := fc.CalculateSecurityForCosts(1000000, "2025", 1, 0)
if err != nil {
t.Fatalf("unexpected error: %v", err)
}
// With 0% VAT, total should equal subtotal (bug is invisible)
if sec.TotalGross != sec.SubtotalNet {
t.Errorf("with 0%% VAT: total %v should equal subtotal %v", sec.TotalGross, sec.SubtotalNet)
}
}
func TestCalculateFullLitigation_Infringement(t *testing.T) {
fc := NewFeeCalculator()
req := models.FeeCalculateRequest{
Streitwert: 1000000,
VATRate: 0,
ProceedingPath: models.PathInfringement,
Instances: []models.InstanceInput{
{Type: models.InstanceLG, Enabled: true, FeeVersion: "2025", NumAttorneys: 1, NumPatentAttorneys: 1, NumClients: 1, OralHearing: true},
{Type: models.InstanceOLG, Enabled: true, FeeVersion: "2025", NumAttorneys: 1, NumPatentAttorneys: 1, NumClients: 1, OralHearing: true},
{Type: models.InstanceBGHNZB, Enabled: true, FeeVersion: "2025", NumAttorneys: 1, NumPatentAttorneys: 1, NumClients: 1, OralHearing: true},
{Type: models.InstanceBGHRev, Enabled: false, FeeVersion: "2025", NumAttorneys: 1, NumPatentAttorneys: 1, NumClients: 1, OralHearing: true},
},
}
resp, err := fc.CalculateFullLitigation(req)
if err != nil {
t.Fatalf("unexpected error: %v", err)
}
// 3 enabled instances
if len(resp.Instances) != 3 {
t.Errorf("got %d instances, want 3", len(resp.Instances))
}
// Should have at least "Gesamtkosten bei Nichtzulassung"
found := false
for _, total := range resp.Totals {
if total.Label == "Gesamtkosten bei Nichtzulassung" {
found = true
if total.Total <= 0 {
t.Errorf("Nichtzulassung total = %v, should be > 0", total.Total)
}
}
}
if !found {
t.Error("missing 'Gesamtkosten bei Nichtzulassung' total")
}
}
func TestCalculateFullLitigation_Nullity(t *testing.T) {
fc := NewFeeCalculator()
req := models.FeeCalculateRequest{
Streitwert: 1000000,
VATRate: 0,
ProceedingPath: models.PathNullity,
Instances: []models.InstanceInput{
{Type: models.InstanceBPatG, Enabled: true, FeeVersion: "2025", NumAttorneys: 1, NumPatentAttorneys: 1, NumClients: 1, OralHearing: true},
{Type: models.InstanceBGHNull, Enabled: true, FeeVersion: "2025", NumAttorneys: 1, NumPatentAttorneys: 1, NumClients: 1, OralHearing: true},
},
}
resp, err := fc.CalculateFullLitigation(req)
if err != nil {
t.Fatalf("unexpected error: %v", err)
}
if len(resp.Instances) != 2 {
t.Errorf("got %d instances, want 2", len(resp.Instances))
}
if len(resp.Totals) != 1 || resp.Totals[0].Label != "Gesamtkosten Nichtigkeitsverfahren" {
t.Errorf("unexpected totals: %+v", resp.Totals)
}
}
func TestGetSchedules(t *testing.T) {
fc := NewFeeCalculator()
schedules := fc.GetSchedules()
if len(schedules) != 5 {
t.Errorf("got %d schedules, want 5", len(schedules))
}
// Aktuell should be an alias
last := schedules[len(schedules)-1]
if !last.IsAlias || last.AliasOf != "2025" {
t.Errorf("Aktuell should be alias of 2025, got IsAlias=%v AliasOf=%v", last.IsAlias, last.AliasOf)
}
}


@@ -0,0 +1,249 @@
package services
import "math"
// feeBracket defines one bracket in the GKG/RVG fee schedule.
// Each bracket covers a range of Streitwert values.
type feeBracket struct {
UpperBound float64 // upper bound of this bracket
StepSize float64 // step size within this bracket
GKGIncrement float64 // GKG fee increment per step
RVGIncrement float64 // RVG fee increment per step
}
// feeScheduleData holds the bracket data for a fee schedule version.
type feeScheduleData struct {
Label string
ValidFrom string
Brackets []feeBracket
AliasOf string // if non-empty, this schedule is an alias for another
}
// feeSchedules contains all historical GKG/RVG fee schedule versions.
// Data extracted from Patentprozesskostenrechner.xlsm ListObjects.
var feeSchedules = map[string]*feeScheduleData{
"2005": {
Label: "GKG/RVG 2006-09-01",
ValidFrom: "2006-09-01",
Brackets: []feeBracket{
{300, 300, 25, 25},
{1500, 300, 10, 20},
{5000, 500, 8, 28},
{10000, 1000, 15, 37},
{25000, 3000, 23, 40},
{50000, 5000, 29, 72},
{200000, 15000, 100, 77},
{500000, 30000, 150, 118},
{math.MaxFloat64, 50000, 150, 150},
},
},
"2013": {
Label: "GKG/RVG 2013-08-01",
ValidFrom: "2013-08-01",
Brackets: []feeBracket{
{500, 300, 35, 45},
{2000, 500, 18, 35},
{10000, 1000, 19, 51},
{25000, 3000, 26, 46},
{50000, 5000, 35, 75},
{200000, 15000, 120, 85},
{500000, 30000, 179, 120},
{math.MaxFloat64, 50000, 180, 150},
},
},
"2021": {
Label: "GKG/RVG 2021-01-01",
ValidFrom: "2021-01-01",
Brackets: []feeBracket{
{500, 300, 38, 49},
{2000, 500, 20, 39},
{10000, 1000, 21, 56},
{25000, 3000, 29, 52},
{50000, 5000, 38, 81},
{200000, 15000, 132, 94},
{500000, 30000, 198, 132},
{math.MaxFloat64, 50000, 198, 165},
},
},
"2025": {
Label: "GKG/RVG 2025-06-01",
ValidFrom: "2025-06-01",
Brackets: []feeBracket{
{500, 300, 40, 51.5},
{2000, 500, 21, 41.5},
{10000, 1000, 22.5, 59.5},
{25000, 3000, 30.5, 55},
{50000, 5000, 40.5, 86},
{200000, 15000, 140, 99.5},
{500000, 30000, 210, 140},
{math.MaxFloat64, 50000, 210, 175},
},
},
"Aktuell": {
Label: "Aktuell (= 2025-06-01)",
ValidFrom: "2025-06-01",
AliasOf: "2025",
},
}
// instanceConfig holds the multipliers and fee basis for each court instance.
type instanceConfig struct {
Label string
CourtFactor float64 // multiplier for base court fee
CourtSource string // "GKG", "PatKostG", or "fixed"
FixedCourtFee float64 // only used when CourtSource is "fixed"
RAVGFactor float64 // Rechtsanwalt Verfahrensgebühr factor
RATGFactor float64 // Rechtsanwalt Terminsgebühr factor
PAVGFactor float64 // Patentanwalt Verfahrensgebühr factor
PATGFactor float64 // Patentanwalt Terminsgebühr factor
HasPA bool // whether patent attorneys are applicable
}
// instanceConfigs defines the fee parameters for each court instance type.
var instanceConfigs = map[string]instanceConfig{
"LG": {"LG (Verletzung 1. Instanz)", 3.0, "GKG", 0, 1.3, 1.2, 1.3, 1.2, true},
"OLG": {"OLG (Berufung)", 4.0, "GKG", 0, 1.6, 1.2, 1.6, 1.2, true},
"BGH_NZB": {"BGH (Nichtzulassungsbeschwerde)", 2.0, "GKG", 0, 2.3, 1.2, 1.6, 1.2, true},
"BGH_Rev": {"BGH (Revision)", 5.0, "GKG", 0, 2.3, 1.5, 1.6, 1.5, true},
"BPatG": {"BPatG (Nichtigkeitsverfahren)", 4.5, "PatKostG", 0, 1.3, 1.2, 1.3, 1.2, true},
"BGH_Null": {"BGH (Nichtigkeitsberufung)", 6.0, "GKG", 0, 1.6, 1.5, 1.6, 1.5, true},
"DPMA": {"DPMA (Löschungsverfahren)", 0, "fixed", 300, 1.3, 1.2, 0, 0, false},
"BPatG_Canc": {"BPatG (Löschungsbeschwerde)", 0, "fixed", 500, 1.3, 1.2, 0, 0, false},
}
// --- UPC Fee Data ---
// upcFeeBracket defines one bracket in a UPC fee table.
// MaxValue 0 means unlimited (last bracket).
type upcFeeBracket struct {
MaxValue float64
Fee float64
}
type upcFixedFees struct {
Infringement float64
CounterclaimInfringement float64
NonInfringement float64
LicenseCompensation float64
DetermineDamages float64
RevocationStandalone float64
CounterclaimRevocation float64
ProvisionalMeasures float64
}
type upcScheduleData struct {
Label string
ValidFrom string
FixedFees upcFixedFees
ValueBased []upcFeeBracket
Recoverable []upcFeeBracket
SMReduction float64
}
// upcSchedules contains UPC fee data for pre-2026 and 2026+ schedules.
var upcSchedules = map[string]*upcScheduleData{
"pre2026": {
Label: "UPC (vor 2026)",
ValidFrom: "2023-06-01",
FixedFees: upcFixedFees{
Infringement: 11000,
CounterclaimInfringement: 11000,
NonInfringement: 11000,
LicenseCompensation: 11000,
DetermineDamages: 3000,
RevocationStandalone: 20000,
CounterclaimRevocation: 20000,
ProvisionalMeasures: 11000,
},
ValueBased: []upcFeeBracket{
{500000, 0},
{750000, 2500},
{1000000, 4000},
{1500000, 8000},
{2000000, 13000},
{3000000, 20000},
{4000000, 26000},
{5000000, 32000},
{6000000, 39000},
{7000000, 46000},
{8000000, 52000},
{9000000, 58000},
{10000000, 65000},
{15000000, 75000},
{20000000, 100000},
{25000000, 125000},
{30000000, 150000},
{50000000, 250000},
{0, 325000},
},
Recoverable: []upcFeeBracket{
{250000, 38000},
{500000, 56000},
{1000000, 112000},
{2000000, 200000},
{4000000, 400000},
{8000000, 600000},
{16000000, 800000},
{30000000, 1200000},
{50000000, 1500000},
{0, 2000000},
},
SMReduction: 0.40,
},
"2026": {
Label: "UPC (ab 2026)",
ValidFrom: "2026-01-01",
FixedFees: upcFixedFees{
Infringement: 14600,
CounterclaimInfringement: 14600,
NonInfringement: 14600,
LicenseCompensation: 14600,
DetermineDamages: 4000,
RevocationStandalone: 26500,
CounterclaimRevocation: 26500,
ProvisionalMeasures: 14600,
},
// Estimated ~32% increase on pre-2026 values; replace with official data when available.
ValueBased: []upcFeeBracket{
{500000, 0},
{750000, 3300},
{1000000, 5300},
{1500000, 10600},
{2000000, 17200},
{3000000, 26400},
{4000000, 34300},
{5000000, 42200},
{6000000, 51500},
{7000000, 60700},
{8000000, 68600},
{9000000, 76600},
{10000000, 85800},
{15000000, 99000},
{20000000, 132000},
{25000000, 165000},
{30000000, 198000},
{50000000, 330000},
{0, 429000},
},
Recoverable: []upcFeeBracket{
{250000, 38000},
{500000, 56000},
{1000000, 112000},
{2000000, 200000},
{4000000, 400000},
{8000000, 600000},
{16000000, 800000},
{30000000, 1200000},
{50000000, 1500000},
{0, 2000000},
},
SMReduction: 0.50,
},
}
// Fee calculation constants (RVG)
const (
erhoehungsfaktor = 0.3 // Nr. 1008 VV RVG increase per additional client
erhoehungsfaktorMax = 2.0 // maximum increase factor
auslagenpauschale = 20.0 // Nr. 7002 VV RVG flat expense allowance (EUR)
)


@@ -0,0 +1,292 @@
package services
import (
"context"
"database/sql"
"encoding/json"
"fmt"
"time"
"github.com/google/uuid"
"github.com/jmoiron/sqlx"
"mgit.msbls.de/m/KanzlAI-mGMT/internal/models"
)
type InvoiceService struct {
db *sqlx.DB
audit *AuditService
}
func NewInvoiceService(db *sqlx.DB, audit *AuditService) *InvoiceService {
return &InvoiceService{db: db, audit: audit}
}
type CreateInvoiceInput struct {
CaseID uuid.UUID `json:"case_id"`
ClientName string `json:"client_name"`
ClientAddress *string `json:"client_address,omitempty"`
Items []models.InvoiceItem `json:"items"`
TaxRate *float64 `json:"tax_rate,omitempty"`
IssuedAt *string `json:"issued_at,omitempty"`
DueAt *string `json:"due_at,omitempty"`
Notes *string `json:"notes,omitempty"`
TimeEntryIDs []uuid.UUID `json:"time_entry_ids,omitempty"`
}
type UpdateInvoiceInput struct {
ClientName *string `json:"client_name,omitempty"`
ClientAddress *string `json:"client_address,omitempty"`
Items []models.InvoiceItem `json:"items,omitempty"`
TaxRate *float64 `json:"tax_rate,omitempty"`
IssuedAt *string `json:"issued_at,omitempty"`
DueAt *string `json:"due_at,omitempty"`
Notes *string `json:"notes,omitempty"`
}
const invoiceCols = `id, tenant_id, case_id, invoice_number, client_name, client_address,
items, subtotal, tax_rate, tax_amount, total, status, issued_at, due_at, paid_at, notes,
created_by, created_at, updated_at`
func (s *InvoiceService) List(ctx context.Context, tenantID uuid.UUID, caseID *uuid.UUID, status string) ([]models.Invoice, error) {
where := "WHERE tenant_id = $1"
args := []any{tenantID}
argIdx := 2
if caseID != nil {
where += fmt.Sprintf(" AND case_id = $%d", argIdx)
args = append(args, *caseID)
argIdx++
}
if status != "" {
where += fmt.Sprintf(" AND status = $%d", argIdx)
args = append(args, status)
argIdx++
}
var invoices []models.Invoice
err := s.db.SelectContext(ctx, &invoices,
fmt.Sprintf("SELECT %s FROM invoices %s ORDER BY created_at DESC", invoiceCols, where),
args...)
if err != nil {
return nil, fmt.Errorf("list invoices: %w", err)
}
return invoices, nil
}
func (s *InvoiceService) GetByID(ctx context.Context, tenantID, invoiceID uuid.UUID) (*models.Invoice, error) {
var inv models.Invoice
err := s.db.GetContext(ctx, &inv,
`SELECT `+invoiceCols+` FROM invoices WHERE tenant_id = $1 AND id = $2`,
tenantID, invoiceID)
if err == sql.ErrNoRows {
return nil, nil
}
if err != nil {
return nil, fmt.Errorf("get invoice: %w", err)
}
return &inv, nil
}
func (s *InvoiceService) Create(ctx context.Context, tenantID, userID uuid.UUID, input CreateInvoiceInput) (*models.Invoice, error) {
tx, err := s.db.BeginTxx(ctx, nil)
if err != nil {
return nil, fmt.Errorf("begin tx: %w", err)
}
defer tx.Rollback()
// Generate invoice number: RE-YYYY-NNN.
// Note: COUNT(*)+1 assumes invoices are never deleted and that creates for the
// same tenant/year do not run concurrently; otherwise numbers can collide.
year := time.Now().Year()
var seq int
err = tx.GetContext(ctx, &seq,
`SELECT COUNT(*) + 1 FROM invoices WHERE tenant_id = $1 AND invoice_number LIKE $2`,
tenantID, fmt.Sprintf("RE-%d-%%", year))
if err != nil {
return nil, fmt.Errorf("generate invoice number: %w", err)
}
invoiceNumber := fmt.Sprintf("RE-%d-%03d", year, seq)
// Calculate totals
taxRate := 19.00
if input.TaxRate != nil {
taxRate = *input.TaxRate
}
var subtotal float64
for _, item := range input.Items {
subtotal += item.Amount
}
taxAmount := subtotal * taxRate / 100
total := subtotal + taxAmount
itemsJSON, err := json.Marshal(input.Items)
if err != nil {
return nil, fmt.Errorf("marshal items: %w", err)
}
var inv models.Invoice
err = tx.QueryRowxContext(ctx,
`INSERT INTO invoices (tenant_id, case_id, invoice_number, client_name, client_address,
items, subtotal, tax_rate, tax_amount, total, issued_at, due_at, notes, created_by)
VALUES ($1, $2, $3, $4, $5, $6, $7, $8, $9, $10, $11, $12, $13, $14)
RETURNING `+invoiceCols,
tenantID, input.CaseID, invoiceNumber, input.ClientName, input.ClientAddress,
itemsJSON, subtotal, taxRate, taxAmount, total, input.IssuedAt, input.DueAt, input.Notes, userID,
).StructScan(&inv)
if err != nil {
return nil, fmt.Errorf("create invoice: %w", err)
}
// Mark linked time entries as billed
if len(input.TimeEntryIDs) > 0 {
query, args, err := sqlx.In(
`UPDATE time_entries SET billed = true, invoice_id = ? WHERE tenant_id = ? AND id IN (?)`,
inv.ID, tenantID, input.TimeEntryIDs)
if err != nil {
return nil, fmt.Errorf("build time entry update: %w", err)
}
query = tx.Rebind(query)
_, err = tx.ExecContext(ctx, query, args...)
if err != nil {
return nil, fmt.Errorf("mark time entries billed: %w", err)
}
}
if err := tx.Commit(); err != nil {
return nil, fmt.Errorf("commit: %w", err)
}
s.audit.Log(ctx, "create", "invoice", &inv.ID, nil, inv)
return &inv, nil
}
func (s *InvoiceService) Update(ctx context.Context, tenantID, invoiceID uuid.UUID, input UpdateInvoiceInput) (*models.Invoice, error) {
old, err := s.GetByID(ctx, tenantID, invoiceID)
if err != nil {
return nil, err
}
if old == nil {
return nil, fmt.Errorf("invoice not found")
}
if old.Status != "draft" {
return nil, fmt.Errorf("can only update draft invoices")
}
// Recalculate totals if items changed
var itemsJSON json.RawMessage
var subtotal float64
taxRate := old.TaxRate
if input.Items != nil {
for _, item := range input.Items {
subtotal += item.Amount
}
itemsJSON, _ = json.Marshal(input.Items)
}
if input.TaxRate != nil {
taxRate = *input.TaxRate
}
if input.Items != nil {
taxAmount := subtotal * taxRate / 100
total := subtotal + taxAmount
var inv models.Invoice
err = s.db.QueryRowxContext(ctx,
`UPDATE invoices SET
client_name = COALESCE($3, client_name),
client_address = COALESCE($4, client_address),
items = $5,
subtotal = $6,
tax_rate = $7,
tax_amount = $8,
total = $9,
issued_at = COALESCE($10, issued_at),
due_at = COALESCE($11, due_at),
notes = COALESCE($12, notes),
updated_at = now()
WHERE tenant_id = $1 AND id = $2
RETURNING `+invoiceCols,
tenantID, invoiceID, input.ClientName, input.ClientAddress,
itemsJSON, subtotal, taxRate, taxAmount, total,
input.IssuedAt, input.DueAt, input.Notes,
).StructScan(&inv)
if err != nil {
return nil, fmt.Errorf("update invoice: %w", err)
}
s.audit.Log(ctx, "update", "invoice", &inv.ID, old, inv)
return &inv, nil
}
// Update without changing items. If the tax rate changes, recompute
// tax_amount and total from the stored subtotal so they stay consistent.
var inv models.Invoice
err = s.db.QueryRowxContext(ctx,
`UPDATE invoices SET
client_name = COALESCE($3, client_name),
client_address = COALESCE($4, client_address),
tax_rate = COALESCE($5, tax_rate),
tax_amount = subtotal * COALESCE($5, tax_rate) / 100,
total = subtotal + subtotal * COALESCE($5, tax_rate) / 100,
issued_at = COALESCE($6, issued_at),
due_at = COALESCE($7, due_at),
notes = COALESCE($8, notes),
updated_at = now()
WHERE tenant_id = $1 AND id = $2
RETURNING `+invoiceCols,
tenantID, invoiceID, input.ClientName, input.ClientAddress,
input.TaxRate, input.IssuedAt, input.DueAt, input.Notes,
).StructScan(&inv)
if err != nil {
return nil, fmt.Errorf("update invoice: %w", err)
}
s.audit.Log(ctx, "update", "invoice", &inv.ID, old, inv)
return &inv, nil
}
func (s *InvoiceService) UpdateStatus(ctx context.Context, tenantID, invoiceID uuid.UUID, newStatus string) (*models.Invoice, error) {
old, err := s.GetByID(ctx, tenantID, invoiceID)
if err != nil {
return nil, err
}
if old == nil {
return nil, fmt.Errorf("invoice not found")
}
// Validate transitions
validTransitions := map[string][]string{
"draft": {"sent", "cancelled"},
"sent": {"paid", "cancelled"},
"paid": {},
"cancelled": {},
}
allowed := validTransitions[old.Status]
valid := false
for _, s := range allowed {
if s == newStatus {
valid = true
break
}
}
if !valid {
return nil, fmt.Errorf("invalid status transition from %s to %s", old.Status, newStatus)
}
var paidAt *time.Time
if newStatus == "paid" {
now := time.Now()
paidAt = &now
}
var inv models.Invoice
err = s.db.QueryRowxContext(ctx,
`UPDATE invoices SET status = $3, paid_at = COALESCE($4, paid_at), updated_at = now()
WHERE tenant_id = $1 AND id = $2
RETURNING `+invoiceCols,
tenantID, invoiceID, newStatus, paidAt,
).StructScan(&inv)
if err != nil {
return nil, fmt.Errorf("update invoice status: %w", err)
}
s.audit.Log(ctx, "update", "invoice", &inv.ID, old, inv)
return &inv, nil
}


@@ -0,0 +1,124 @@
package services
import (
"context"
"database/sql"
"fmt"
"time"
"github.com/google/uuid"
"github.com/jmoiron/sqlx"
"mgit.msbls.de/m/KanzlAI-mGMT/internal/models"
)
type NoteService struct {
db *sqlx.DB
audit *AuditService
}
func NewNoteService(db *sqlx.DB, audit *AuditService) *NoteService {
return &NoteService{db: db, audit: audit}
}
// ListByParent returns all notes for a given parent entity, scoped to tenant.
func (s *NoteService) ListByParent(ctx context.Context, tenantID uuid.UUID, parentType string, parentID uuid.UUID) ([]models.Note, error) {
col, err := parentColumn(parentType)
if err != nil {
return nil, err
}
// col comes from the parentColumn whitelist below, so interpolating it here is SQL-injection safe.
query := fmt.Sprintf(
`SELECT id, tenant_id, case_id, deadline_id, appointment_id, case_event_id,
content, created_by, created_at, updated_at
FROM notes
WHERE tenant_id = $1 AND %s = $2
ORDER BY created_at DESC`, col)
var notes []models.Note
if err := s.db.SelectContext(ctx, &notes, query, tenantID, parentID); err != nil {
return nil, fmt.Errorf("listing notes by %s: %w", parentType, err)
}
if notes == nil {
notes = []models.Note{}
}
return notes, nil
}
type CreateNoteInput struct {
CaseID *uuid.UUID `json:"case_id,omitempty"`
DeadlineID *uuid.UUID `json:"deadline_id,omitempty"`
AppointmentID *uuid.UUID `json:"appointment_id,omitempty"`
CaseEventID *uuid.UUID `json:"case_event_id,omitempty"`
Content string `json:"content"`
}
// Create inserts a new note.
func (s *NoteService) Create(ctx context.Context, tenantID uuid.UUID, createdBy *uuid.UUID, input CreateNoteInput) (*models.Note, error) {
id := uuid.New()
now := time.Now().UTC()
query := `INSERT INTO notes (id, tenant_id, case_id, deadline_id, appointment_id, case_event_id, content, created_by, created_at, updated_at)
VALUES ($1, $2, $3, $4, $5, $6, $7, $8, $9, $9)
RETURNING id, tenant_id, case_id, deadline_id, appointment_id, case_event_id, content, created_by, created_at, updated_at`
var n models.Note
err := s.db.GetContext(ctx, &n, query,
id, tenantID, input.CaseID, input.DeadlineID, input.AppointmentID, input.CaseEventID,
input.Content, createdBy, now)
if err != nil {
return nil, fmt.Errorf("creating note: %w", err)
}
s.audit.Log(ctx, "create", "note", &id, nil, n)
return &n, nil
}
// Update modifies a note's content.
func (s *NoteService) Update(ctx context.Context, tenantID, noteID uuid.UUID, content string) (*models.Note, error) {
query := `UPDATE notes SET content = $1, updated_at = $2
WHERE id = $3 AND tenant_id = $4
RETURNING id, tenant_id, case_id, deadline_id, appointment_id, case_event_id, content, created_by, created_at, updated_at`
var n models.Note
err := s.db.GetContext(ctx, &n, query, content, time.Now().UTC(), noteID, tenantID)
if err != nil {
if err == sql.ErrNoRows {
return nil, nil
}
return nil, fmt.Errorf("updating note: %w", err)
}
s.audit.Log(ctx, "update", "note", &noteID, nil, n)
return &n, nil
}
// Delete removes a note.
func (s *NoteService) Delete(ctx context.Context, tenantID, noteID uuid.UUID) error {
result, err := s.db.ExecContext(ctx, "DELETE FROM notes WHERE id = $1 AND tenant_id = $2", noteID, tenantID)
if err != nil {
return fmt.Errorf("deleting note: %w", err)
}
rows, err := result.RowsAffected()
if err != nil {
return fmt.Errorf("checking delete result: %w", err)
}
if rows == 0 {
return fmt.Errorf("note not found")
}
s.audit.Log(ctx, "delete", "note", &noteID, nil, nil)
return nil
}
func parentColumn(parentType string) (string, error) {
switch parentType {
case "case":
return "case_id", nil
case "deadline":
return "deadline_id", nil
case "appointment":
return "appointment_id", nil
case "case_event":
return "case_event_id", nil
default:
return "", fmt.Errorf("invalid parent type: %s", parentType)
}
}


@@ -0,0 +1,571 @@
package services
import (
"context"
"crypto/tls"
"fmt"
"log/slog"
"net"
"net/smtp"
"os"
"strings"
"sync"
"time"
"github.com/google/uuid"
"github.com/jmoiron/sqlx"
"github.com/lib/pq"
"mgit.msbls.de/m/KanzlAI-mGMT/internal/models"
)
// NotificationService handles notification CRUD, deadline reminders, and email sending.
type NotificationService struct {
db *sqlx.DB
stopCh chan struct{}
wg sync.WaitGroup
}
// NewNotificationService creates a new notification service.
func NewNotificationService(db *sqlx.DB) *NotificationService {
return &NotificationService{
db: db,
stopCh: make(chan struct{}),
}
}
// Start launches the background reminder checker (every hour) and daily digest (8am).
func (s *NotificationService) Start() {
s.wg.Add(1)
go s.backgroundLoop()
}
// Stop gracefully shuts down background workers.
func (s *NotificationService) Stop() {
close(s.stopCh)
s.wg.Wait()
}
func (s *NotificationService) backgroundLoop() {
defer s.wg.Done()
// Check reminders on startup
ctx, cancel := context.WithTimeout(context.Background(), 2*time.Minute)
s.CheckDeadlineReminders(ctx)
cancel()
reminderTicker := time.NewTicker(1 * time.Hour)
defer reminderTicker.Stop()
// Digest ticker: check every 15 minutes, send at 8am
digestTicker := time.NewTicker(15 * time.Minute)
defer digestTicker.Stop()
var lastDigestDate string
for {
select {
case <-s.stopCh:
return
case <-reminderTicker.C:
ctx, cancel := context.WithTimeout(context.Background(), 2*time.Minute)
s.CheckDeadlineReminders(ctx)
cancel()
case now := <-digestTicker.C:
today := now.Format("2006-01-02")
hour := now.Hour()
if hour >= 8 && lastDigestDate != today {
lastDigestDate = today
ctx, cancel := context.WithTimeout(context.Background(), 5*time.Minute)
s.SendDailyDigests(ctx)
cancel()
}
}
}
}
// CheckDeadlineReminders finds deadlines due in N days matching user preferences and creates notifications.
func (s *NotificationService) CheckDeadlineReminders(ctx context.Context) {
slog.Info("checking deadline reminders")
// Get all user preferences with email enabled
var prefs []models.NotificationPreferences
err := s.db.SelectContext(ctx, &prefs,
`SELECT user_id, tenant_id, deadline_reminder_days, email_enabled, daily_digest, created_at, updated_at
FROM notification_preferences`)
if err != nil {
slog.Error("failed to load notification preferences", "error", err)
return
}
if len(prefs) == 0 {
return
}
// Collect all unique reminder day values across all users
daySet := make(map[int64]bool)
for _, p := range prefs {
for _, d := range p.DeadlineReminderDays {
daySet[d] = true
}
}
if len(daySet) == 0 {
return
}
// Build array of target dates
today := time.Now().UTC().Truncate(24 * time.Hour) // Truncate counts whole days from the UTC epoch; convert to UTC first so Format yields the matching date
var targetDates []string
dayToDate := make(map[string]int64)
for d := range daySet {
target := today.AddDate(0, 0, int(d))
dateStr := target.Format("2006-01-02")
targetDates = append(targetDates, dateStr)
dayToDate[dateStr] = d
}
// Also check overdue deadlines
todayStr := today.Format("2006-01-02")
// Find pending deadlines matching target dates
type deadlineRow struct {
models.Deadline
CaseTitle string `db:"case_title"`
CaseNumber string `db:"case_number"`
}
// Reminder deadlines (due in N days)
var reminderDeadlines []deadlineRow
query, args, err := sqlx.In(
`SELECT d.*, c.title AS case_title, c.case_number
FROM deadlines d
JOIN cases c ON c.id = d.case_id
WHERE d.status = 'pending' AND d.due_date IN (?)`,
targetDates)
if err == nil {
query = s.db.Rebind(query)
err = s.db.SelectContext(ctx, &reminderDeadlines, query, args...)
}
if err != nil {
slog.Error("failed to query reminder deadlines", "error", err)
}
// Overdue deadlines
var overdueDeadlines []deadlineRow
err = s.db.SelectContext(ctx, &overdueDeadlines,
`SELECT d.*, c.title AS case_title, c.case_number
FROM deadlines d
JOIN cases c ON c.id = d.case_id
WHERE d.status = 'pending' AND d.due_date < $1`, todayStr)
if err != nil {
slog.Error("failed to query overdue deadlines", "error", err)
}
// Create notifications for each user based on their tenant and preferences
for _, pref := range prefs {
// Reminder notifications
for _, dl := range reminderDeadlines {
if dl.TenantID != pref.TenantID {
continue
}
daysUntil := dayToDate[dl.DueDate]
// Check if this user cares about this many days
if !containsDay(pref.DeadlineReminderDays, daysUntil) {
continue
}
title := fmt.Sprintf("Frist in %d Tagen: %s", daysUntil, dl.Title)
body := fmt.Sprintf("Akte %s — %s\nFällig am %s", dl.CaseNumber, dl.CaseTitle, dl.DueDate)
entityType := "deadline"
s.CreateNotification(ctx, CreateNotificationInput{
TenantID: pref.TenantID,
UserID: pref.UserID,
Type: "deadline_reminder",
EntityType: &entityType,
EntityID: &dl.ID,
Title: title,
Body: &body,
SendEmail: pref.EmailEnabled && !pref.DailyDigest,
})
}
// Overdue notifications
for _, dl := range overdueDeadlines {
if dl.TenantID != pref.TenantID {
continue
}
title := fmt.Sprintf("Frist überfällig: %s", dl.Title)
body := fmt.Sprintf("Akte %s — %s\nFällig seit %s", dl.CaseNumber, dl.CaseTitle, dl.DueDate)
entityType := "deadline"
s.CreateNotification(ctx, CreateNotificationInput{
TenantID: pref.TenantID,
UserID: pref.UserID,
Type: "deadline_overdue",
EntityType: &entityType,
EntityID: &dl.ID,
Title: title,
Body: &body,
SendEmail: pref.EmailEnabled && !pref.DailyDigest,
})
}
}
}
// SendDailyDigests compiles pending notifications into one email per user.
func (s *NotificationService) SendDailyDigests(ctx context.Context) {
slog.Info("sending daily digests")
// Find users with daily_digest enabled
var prefs []models.NotificationPreferences
err := s.db.SelectContext(ctx, &prefs,
`SELECT user_id, tenant_id, deadline_reminder_days, email_enabled, daily_digest, created_at, updated_at
FROM notification_preferences
WHERE daily_digest = true AND email_enabled = true`)
if err != nil {
slog.Error("failed to load digest preferences", "error", err)
return
}
for _, pref := range prefs {
// Get unsent notifications for this user from the last 24 hours
var notifications []models.Notification
err := s.db.SelectContext(ctx, &notifications,
`SELECT id, tenant_id, user_id, type, entity_type, entity_id, title, body, sent_at, read_at, created_at
FROM notifications
WHERE user_id = $1 AND tenant_id = $2 AND sent_at IS NULL
AND created_at > now() - interval '24 hours'
ORDER BY created_at DESC`,
pref.UserID, pref.TenantID)
if err != nil {
slog.Error("failed to load unsent notifications", "error", err, "user_id", pref.UserID)
continue
}
if len(notifications) == 0 {
continue
}
// Get user email
email := s.getUserEmail(ctx, pref.UserID)
if email == "" {
continue
}
// Build digest
var lines []string
lines = append(lines, fmt.Sprintf("Guten Morgen! Hier ist Ihre Tagesübersicht mit %d Benachrichtigungen:\n", len(notifications)))
for _, n := range notifications {
body := ""
if n.Body != nil {
body = " — " + *n.Body
}
lines = append(lines, fmt.Sprintf("• %s%s", n.Title, body))
}
lines = append(lines, "\n---\nKanzlAI Kanzleimanagement")
subject := fmt.Sprintf("KanzlAI Tagesübersicht — %d Benachrichtigungen", len(notifications))
bodyText := strings.Join(lines, "\n")
if err := SendEmail(email, subject, bodyText); err != nil {
slog.Error("failed to send digest email", "error", err, "user_id", pref.UserID)
continue
}
// Mark all as sent
ids := make([]uuid.UUID, len(notifications))
for i, n := range notifications {
ids[i] = n.ID
}
query, args, err := sqlx.In(
`UPDATE notifications SET sent_at = now() WHERE id IN (?)`, ids)
if err == nil {
query = s.db.Rebind(query)
_, err = s.db.ExecContext(ctx, query, args...)
}
if err != nil {
slog.Error("failed to mark digest notifications sent", "error", err)
}
slog.Info("sent daily digest", "user_id", pref.UserID, "count", len(notifications))
}
}
// CreateNotificationInput holds the data for creating a notification.
type CreateNotificationInput struct {
TenantID uuid.UUID
UserID uuid.UUID
Type string
EntityType *string
EntityID *uuid.UUID
Title string
Body *string
SendEmail bool
}
// CreateNotification stores a notification in the DB and optionally sends an email.
func (s *NotificationService) CreateNotification(ctx context.Context, input CreateNotificationInput) (*models.Notification, error) {
// Dedup: check if we already sent this notification today
if input.EntityID != nil {
var count int
err := s.db.GetContext(ctx, &count,
`SELECT COUNT(*) FROM notifications
WHERE user_id = $1 AND entity_id = $2 AND type = $3
AND created_at::date = CURRENT_DATE`,
input.UserID, input.EntityID, input.Type)
if err == nil && count > 0 {
return nil, nil // Already notified today
}
}
var n models.Notification
err := s.db.QueryRowxContext(ctx,
`INSERT INTO notifications (tenant_id, user_id, type, entity_type, entity_id, title, body)
VALUES ($1, $2, $3, $4, $5, $6, $7)
RETURNING id, tenant_id, user_id, type, entity_type, entity_id, title, body, sent_at, read_at, created_at`,
input.TenantID, input.UserID, input.Type, input.EntityType, input.EntityID,
input.Title, input.Body).StructScan(&n)
if err != nil {
slog.Error("failed to create notification", "error", err)
return nil, fmt.Errorf("create notification: %w", err)
}
// Send email immediately if requested (non-digest users)
if input.SendEmail {
email := s.getUserEmail(ctx, input.UserID)
if email != "" {
go func() {
if err := SendEmail(email, input.Title, derefStr(input.Body)); err != nil {
slog.Error("failed to send notification email", "error", err, "user_id", input.UserID)
} else {
// Mark as sent
_, _ = s.db.Exec(`UPDATE notifications SET sent_at = now() WHERE id = $1`, n.ID)
}
}()
}
}
return &n, nil
}
// ListForUser returns notifications for a user in a tenant, paginated.
func (s *NotificationService) ListForUser(ctx context.Context, tenantID, userID uuid.UUID, limit, offset int) ([]models.Notification, int, error) {
if limit <= 0 {
limit = 50
}
if limit > 200 {
limit = 200
}
var total int
err := s.db.GetContext(ctx, &total,
`SELECT COUNT(*) FROM notifications WHERE user_id = $1 AND tenant_id = $2`,
userID, tenantID)
if err != nil {
return nil, 0, fmt.Errorf("count notifications: %w", err)
}
var notifications []models.Notification
err = s.db.SelectContext(ctx, &notifications,
`SELECT id, tenant_id, user_id, type, entity_type, entity_id, title, body, sent_at, read_at, created_at
FROM notifications
WHERE user_id = $1 AND tenant_id = $2
ORDER BY created_at DESC
LIMIT $3 OFFSET $4`,
userID, tenantID, limit, offset)
if err != nil {
return nil, 0, fmt.Errorf("list notifications: %w", err)
}
return notifications, total, nil
}
// UnreadCount returns the number of unread notifications for a user.
func (s *NotificationService) UnreadCount(ctx context.Context, tenantID, userID uuid.UUID) (int, error) {
var count int
err := s.db.GetContext(ctx, &count,
`SELECT COUNT(*) FROM notifications WHERE user_id = $1 AND tenant_id = $2 AND read_at IS NULL`,
userID, tenantID)
return count, err
}
// MarkRead marks a single notification as read.
func (s *NotificationService) MarkRead(ctx context.Context, tenantID, userID, notificationID uuid.UUID) error {
result, err := s.db.ExecContext(ctx,
`UPDATE notifications SET read_at = now()
WHERE id = $1 AND user_id = $2 AND tenant_id = $3 AND read_at IS NULL`,
notificationID, userID, tenantID)
if err != nil {
return fmt.Errorf("mark notification read: %w", err)
}
rows, _ := result.RowsAffected()
if rows == 0 {
return fmt.Errorf("notification not found or already read")
}
return nil
}
// MarkAllRead marks all notifications as read for a user.
func (s *NotificationService) MarkAllRead(ctx context.Context, tenantID, userID uuid.UUID) error {
_, err := s.db.ExecContext(ctx,
`UPDATE notifications SET read_at = now()
WHERE user_id = $1 AND tenant_id = $2 AND read_at IS NULL`,
userID, tenantID)
return err
}
// GetPreferences returns notification preferences for a user, creating defaults if needed.
func (s *NotificationService) GetPreferences(ctx context.Context, tenantID, userID uuid.UUID) (*models.NotificationPreferences, error) {
var pref models.NotificationPreferences
err := s.db.GetContext(ctx, &pref,
`SELECT user_id, tenant_id, deadline_reminder_days, email_enabled, daily_digest, created_at, updated_at
FROM notification_preferences
WHERE user_id = $1 AND tenant_id = $2`,
userID, tenantID)
if err != nil {
// Return defaults if no preferences set
return &models.NotificationPreferences{
UserID: userID,
TenantID: tenantID,
DeadlineReminderDays: pq.Int64Array{7, 3, 1},
EmailEnabled: true,
DailyDigest: false,
}, nil
}
return &pref, nil
}
// UpdatePreferences upserts notification preferences for a user.
func (s *NotificationService) UpdatePreferences(ctx context.Context, tenantID, userID uuid.UUID, input UpdatePreferencesInput) (*models.NotificationPreferences, error) {
var pref models.NotificationPreferences
err := s.db.QueryRowxContext(ctx,
`INSERT INTO notification_preferences (user_id, tenant_id, deadline_reminder_days, email_enabled, daily_digest)
VALUES ($1, $2, $3, $4, $5)
ON CONFLICT (user_id, tenant_id)
DO UPDATE SET deadline_reminder_days = $3, email_enabled = $4, daily_digest = $5, updated_at = now()
RETURNING user_id, tenant_id, deadline_reminder_days, email_enabled, daily_digest, created_at, updated_at`,
userID, tenantID, pq.Int64Array(input.DeadlineReminderDays), input.EmailEnabled, input.DailyDigest).StructScan(&pref)
if err != nil {
return nil, fmt.Errorf("update preferences: %w", err)
}
return &pref, nil
}
// UpdatePreferencesInput holds the data for updating notification preferences.
type UpdatePreferencesInput struct {
DeadlineReminderDays []int64 `json:"deadline_reminder_days"`
EmailEnabled bool `json:"email_enabled"`
DailyDigest bool `json:"daily_digest"`
}
// SendEmail sends an email via direct SMTP over TLS.
// Requires SMTP_HOST, SMTP_USER, SMTP_PASS env vars. Falls back to no-op if unconfigured.
func SendEmail(to, subject, body string) error {
host := os.Getenv("SMTP_HOST")
port := os.Getenv("SMTP_PORT")
user := os.Getenv("SMTP_USER")
pass := os.Getenv("SMTP_PASS")
from := os.Getenv("MAIL_FROM")
if port == "" {
port = "465"
}
if from == "" {
from = "mgmt@msbls.de"
}
if host == "" || user == "" || pass == "" {
slog.Warn("SMTP not configured, skipping email", "to", to, "subject", subject)
return nil
}
// Build an RFC 5322 message (plain text, UTF-8)
msg := fmt.Sprintf("From: \"KanzlAI-mGMT\" <%s>\r\n"+
"To: %s\r\n"+
"Subject: [KanzlAI] %s\r\n"+
"MIME-Version: 1.0\r\n"+
"Content-Type: text/plain; charset=utf-8\r\n"+
"Date: %s\r\n"+
"\r\n%s",
from, to, subject,
time.Now().Format(time.RFC1123Z),
body)
addr := net.JoinHostPort(host, port)
// Connect with implicit TLS (port 465)
tlsConfig := &tls.Config{ServerName: host}
conn, err := tls.Dial("tcp", addr, tlsConfig)
if err != nil {
return fmt.Errorf("smtp tls dial: %w", err)
}
client, err := smtp.NewClient(conn, host)
if err != nil {
conn.Close()
return fmt.Errorf("smtp new client: %w", err)
}
defer client.Close()
// Authenticate
auth := smtp.PlainAuth("", user, pass, host)
if err := client.Auth(auth); err != nil {
return fmt.Errorf("smtp auth: %w", err)
}
// Send
if err := client.Mail(from); err != nil {
return fmt.Errorf("smtp mail from: %w", err)
}
if err := client.Rcpt(to); err != nil {
return fmt.Errorf("smtp rcpt to: %w", err)
}
w, err := client.Data()
if err != nil {
return fmt.Errorf("smtp data: %w", err)
}
if _, err := w.Write([]byte(msg)); err != nil {
return fmt.Errorf("smtp write: %w", err)
}
if err := w.Close(); err != nil {
return fmt.Errorf("smtp close data: %w", err)
}
if err := client.Quit(); err != nil {
slog.Warn("smtp quit error (non-fatal)", "error", err)
}
slog.Info("email sent via SMTP", "from", from, "to", to, "subject", subject)
return nil
}
// getUserEmail looks up the email for a user from Supabase auth.users.
func (s *NotificationService) getUserEmail(ctx context.Context, userID uuid.UUID) string {
var email string
err := s.db.GetContext(ctx, &email,
`SELECT email FROM auth.users WHERE id = $1`, userID)
if err != nil {
slog.Error("failed to get user email", "error", err, "user_id", userID)
return ""
}
return email
}
func containsDay(arr pq.Int64Array, day int64) bool {
for _, d := range arr {
if d == day {
return true
}
}
return false
}
func derefStr(s *string) string {
if s == nil {
return ""
}
return *s
}


@@ -13,11 +13,12 @@ import (
)
type PartyService struct {
-	db *sqlx.DB
+	db    *sqlx.DB
+	audit *AuditService
}
-func NewPartyService(db *sqlx.DB) *PartyService {
-	return &PartyService{db: db}
+func NewPartyService(db *sqlx.DB, audit *AuditService) *PartyService {
+	return &PartyService{db: db, audit: audit}
}
type CreatePartyInput struct {
@@ -79,6 +80,7 @@ func (s *PartyService) Create(ctx context.Context, tenantID, caseID uuid.UUID, u
if err := s.db.GetContext(ctx, &party, "SELECT * FROM parties WHERE id = $1", id); err != nil {
return nil, fmt.Errorf("fetching created party: %w", err)
}
+	s.audit.Log(ctx, "create", "party", &id, nil, party)
return &party, nil
}
@@ -135,6 +137,7 @@ func (s *PartyService) Update(ctx context.Context, tenantID, partyID uuid.UUID,
if err := s.db.GetContext(ctx, &updated, "SELECT * FROM parties WHERE id = $1", partyID); err != nil {
return nil, fmt.Errorf("fetching updated party: %w", err)
}
+	s.audit.Log(ctx, "update", "party", &partyID, current, updated)
return &updated, nil
}
@@ -148,5 +151,6 @@ func (s *PartyService) Delete(ctx context.Context, tenantID, partyID uuid.UUID)
if rows == 0 {
return sql.ErrNoRows
}
+	s.audit.Log(ctx, "delete", "party", &partyID, nil, nil)
return nil
}


@@ -0,0 +1,329 @@
package services
import (
"context"
"fmt"
"time"
"github.com/google/uuid"
"github.com/jmoiron/sqlx"
)
type ReportingService struct {
db *sqlx.DB
}
func NewReportingService(db *sqlx.DB) *ReportingService {
return &ReportingService{db: db}
}
// --- Case Statistics ---
type CaseStats struct {
Period string `json:"period" db:"period"`
Opened int `json:"opened" db:"opened"`
Closed int `json:"closed" db:"closed"`
Active int `json:"active" db:"active"`
}
type CasesByType struct {
CaseType string `json:"case_type" db:"case_type"`
Count int `json:"count" db:"count"`
}
type CasesByCourt struct {
Court string `json:"court" db:"court"`
Count int `json:"count" db:"count"`
}
type CaseReport struct {
Monthly []CaseStats `json:"monthly"`
ByType []CasesByType `json:"by_type"`
ByCourt []CasesByCourt `json:"by_court"`
Total CaseReportTotals `json:"total"`
}
type CaseReportTotals struct {
Opened int `json:"opened"`
Closed int `json:"closed"`
Active int `json:"active"`
}
func (s *ReportingService) CaseReport(ctx context.Context, tenantID uuid.UUID, from, to time.Time) (*CaseReport, error) {
report := &CaseReport{}
// Monthly breakdown
monthlyQuery := `
SELECT
TO_CHAR(DATE_TRUNC('month', created_at), 'YYYY-MM') AS period,
COUNT(*) AS opened,
COUNT(*) FILTER (WHERE status IN ('closed', 'archived')) AS closed,
COUNT(*) FILTER (WHERE status = 'active') AS active
FROM cases
WHERE tenant_id = $1 AND created_at >= $2 AND created_at <= $3
GROUP BY DATE_TRUNC('month', created_at)
ORDER BY DATE_TRUNC('month', created_at)`
report.Monthly = []CaseStats{}
if err := s.db.SelectContext(ctx, &report.Monthly, monthlyQuery, tenantID, from, to); err != nil {
return nil, fmt.Errorf("case report monthly: %w", err)
}
// By type
typeQuery := `
SELECT COALESCE(case_type, 'Sonstiges') AS case_type, COUNT(*) AS count
FROM cases
WHERE tenant_id = $1 AND created_at >= $2 AND created_at <= $3
GROUP BY case_type
ORDER BY count DESC`
report.ByType = []CasesByType{}
if err := s.db.SelectContext(ctx, &report.ByType, typeQuery, tenantID, from, to); err != nil {
return nil, fmt.Errorf("case report by type: %w", err)
}
// By court
courtQuery := `
SELECT COALESCE(court, 'Ohne Gericht') AS court, COUNT(*) AS count
FROM cases
WHERE tenant_id = $1 AND created_at >= $2 AND created_at <= $3
GROUP BY court
ORDER BY count DESC`
report.ByCourt = []CasesByCourt{}
if err := s.db.SelectContext(ctx, &report.ByCourt, courtQuery, tenantID, from, to); err != nil {
return nil, fmt.Errorf("case report by court: %w", err)
}
// Totals
totalsQuery := `
SELECT
COUNT(*) AS opened,
COUNT(*) FILTER (WHERE status IN ('closed', 'archived')) AS closed,
COUNT(*) FILTER (WHERE status = 'active') AS active
FROM cases
WHERE tenant_id = $1 AND created_at >= $2 AND created_at <= $3`
if err := s.db.GetContext(ctx, &report.Total, totalsQuery, tenantID, from, to); err != nil {
return nil, fmt.Errorf("case report totals: %w", err)
}
return report, nil
}
// --- Deadline Compliance ---
type DeadlineCompliance struct {
Period string `json:"period" db:"period"`
Total int `json:"total" db:"total"`
Met int `json:"met" db:"met"`
Missed int `json:"missed" db:"missed"`
Pending int `json:"pending" db:"pending"`
ComplianceRate float64 `json:"compliance_rate"`
}
type MissedDeadline struct {
ID uuid.UUID `json:"id" db:"id"`
Title string `json:"title" db:"title"`
DueDate string `json:"due_date" db:"due_date"`
CaseID uuid.UUID `json:"case_id" db:"case_id"`
CaseNumber string `json:"case_number" db:"case_number"`
CaseTitle string `json:"case_title" db:"case_title"`
DaysOverdue int `json:"days_overdue" db:"days_overdue"`
}
type DeadlineReport struct {
Monthly []DeadlineCompliance `json:"monthly"`
Missed []MissedDeadline `json:"missed"`
Total DeadlineReportTotals `json:"total"`
}
type DeadlineReportTotals struct {
Total int `json:"total"`
Met int `json:"met"`
Missed int `json:"missed"`
Pending int `json:"pending"`
ComplianceRate float64 `json:"compliance_rate"`
}
func (s *ReportingService) DeadlineReport(ctx context.Context, tenantID uuid.UUID, from, to time.Time) (*DeadlineReport, error) {
report := &DeadlineReport{}
// Monthly compliance
monthlyQuery := `
SELECT
TO_CHAR(DATE_TRUNC('month', due_date), 'YYYY-MM') AS period,
COUNT(*) AS total,
COUNT(*) FILTER (WHERE status = 'completed' AND (completed_at IS NULL OR completed_at::date <= due_date)) AS met,
COUNT(*) FILTER (WHERE (status = 'completed' AND completed_at::date > due_date) OR (status = 'pending' AND due_date < CURRENT_DATE)) AS missed,
COUNT(*) FILTER (WHERE status = 'pending' AND due_date >= CURRENT_DATE) AS pending
FROM deadlines
WHERE tenant_id = $1 AND due_date >= $2 AND due_date <= $3
GROUP BY DATE_TRUNC('month', due_date)
ORDER BY DATE_TRUNC('month', due_date)`
report.Monthly = []DeadlineCompliance{}
if err := s.db.SelectContext(ctx, &report.Monthly, monthlyQuery, tenantID, from, to); err != nil {
return nil, fmt.Errorf("deadline report monthly: %w", err)
}
// Calculate compliance rates
for i := range report.Monthly {
completed := report.Monthly[i].Met + report.Monthly[i].Missed
if completed > 0 {
report.Monthly[i].ComplianceRate = float64(report.Monthly[i].Met) / float64(completed) * 100
}
}
// Missed deadlines list
missedQuery := `
SELECT d.id, d.title, d.due_date, d.case_id, c.case_number, c.title AS case_title,
(CURRENT_DATE - d.due_date::date) AS days_overdue
FROM deadlines d
JOIN cases c ON c.id = d.case_id AND c.tenant_id = d.tenant_id
WHERE d.tenant_id = $1 AND d.due_date >= $2 AND d.due_date <= $3
AND ((d.status = 'pending' AND d.due_date < CURRENT_DATE)
OR (d.status = 'completed' AND d.completed_at::date > d.due_date))
ORDER BY d.due_date ASC
LIMIT 50`
report.Missed = []MissedDeadline{}
if err := s.db.SelectContext(ctx, &report.Missed, missedQuery, tenantID, from, to); err != nil {
return nil, fmt.Errorf("deadline report missed: %w", err)
}
// Totals
totalsQuery := `
SELECT
COUNT(*) AS total,
COUNT(*) FILTER (WHERE status = 'completed' AND (completed_at IS NULL OR completed_at::date <= due_date)) AS met,
COUNT(*) FILTER (WHERE (status = 'completed' AND completed_at::date > due_date) OR (status = 'pending' AND due_date < CURRENT_DATE)) AS missed,
COUNT(*) FILTER (WHERE status = 'pending' AND due_date >= CURRENT_DATE) AS pending
FROM deadlines
WHERE tenant_id = $1 AND due_date >= $2 AND due_date <= $3`
if err := s.db.GetContext(ctx, &report.Total, totalsQuery, tenantID, from, to); err != nil {
return nil, fmt.Errorf("deadline report totals: %w", err)
}
completed := report.Total.Met + report.Total.Missed
if completed > 0 {
report.Total.ComplianceRate = float64(report.Total.Met) / float64(completed) * 100
}
return report, nil
}
// --- Workload ---
type UserWorkload struct {
UserID uuid.UUID `json:"user_id" db:"user_id"`
ActiveCases int `json:"active_cases" db:"active_cases"`
Deadlines int `json:"deadlines" db:"deadlines"`
Overdue int `json:"overdue" db:"overdue"`
Completed int `json:"completed" db:"completed"`
}
type WorkloadReport struct {
Users []UserWorkload `json:"users"`
}
func (s *ReportingService) WorkloadReport(ctx context.Context, tenantID uuid.UUID, from, to time.Time) (*WorkloadReport, error) {
report := &WorkloadReport{}
query := `
WITH user_cases AS (
SELECT ca.user_id, COUNT(DISTINCT ca.case_id) AS active_cases
FROM case_assignments ca
JOIN cases c ON c.id = ca.case_id AND c.tenant_id = $1
WHERE c.status = 'active'
GROUP BY ca.user_id
),
user_deadlines AS (
SELECT ca.user_id,
COUNT(*) AS deadlines,
COUNT(*) FILTER (WHERE d.status = 'pending' AND d.due_date < CURRENT_DATE) AS overdue,
COUNT(*) FILTER (WHERE d.status = 'completed' AND d.completed_at >= $2 AND d.completed_at <= $3) AS completed
FROM case_assignments ca
JOIN deadlines d ON d.case_id = ca.case_id AND d.tenant_id = $1
WHERE d.due_date >= $2 AND d.due_date <= $3
GROUP BY ca.user_id
)
SELECT
COALESCE(uc.user_id, ud.user_id) AS user_id,
COALESCE(uc.active_cases, 0) AS active_cases,
COALESCE(ud.deadlines, 0) AS deadlines,
COALESCE(ud.overdue, 0) AS overdue,
COALESCE(ud.completed, 0) AS completed
FROM user_cases uc
FULL OUTER JOIN user_deadlines ud ON uc.user_id = ud.user_id
ORDER BY active_cases DESC`
report.Users = []UserWorkload{}
if err := s.db.SelectContext(ctx, &report.Users, query, tenantID, from, to); err != nil {
return nil, fmt.Errorf("workload report: %w", err)
}
return report, nil
}
// --- Billing (summary from case data) ---
type BillingByMonth struct {
Period string `json:"period" db:"period"`
CasesActive int `json:"cases_active" db:"cases_active"`
CasesClosed int `json:"cases_closed" db:"cases_closed"`
CasesNew int `json:"cases_new" db:"cases_new"`
}
type BillingByType struct {
CaseType string `json:"case_type" db:"case_type"`
Active int `json:"active" db:"active"`
Closed int `json:"closed" db:"closed"`
Total int `json:"total" db:"total"`
}
type BillingReport struct {
Monthly []BillingByMonth `json:"monthly"`
ByType []BillingByType `json:"by_type"`
}
func (s *ReportingService) BillingReport(ctx context.Context, tenantID uuid.UUID, from, to time.Time) (*BillingReport, error) {
report := &BillingReport{}
// Monthly activity for billing overview
monthlyQuery := `
SELECT
TO_CHAR(DATE_TRUNC('month', created_at), 'YYYY-MM') AS period,
COUNT(*) FILTER (WHERE status = 'active') AS cases_active,
COUNT(*) FILTER (WHERE status IN ('closed', 'archived')) AS cases_closed,
COUNT(*) AS cases_new
FROM cases
WHERE tenant_id = $1 AND created_at >= $2 AND created_at <= $3
GROUP BY DATE_TRUNC('month', created_at)
ORDER BY DATE_TRUNC('month', created_at)`
report.Monthly = []BillingByMonth{}
if err := s.db.SelectContext(ctx, &report.Monthly, monthlyQuery, tenantID, from, to); err != nil {
return nil, fmt.Errorf("billing report monthly: %w", err)
}
// By case type
typeQuery := `
SELECT
COALESCE(case_type, 'Sonstiges') AS case_type,
COUNT(*) FILTER (WHERE status = 'active') AS active,
COUNT(*) FILTER (WHERE status IN ('closed', 'archived')) AS closed,
COUNT(*) AS total
FROM cases
WHERE tenant_id = $1 AND created_at >= $2 AND created_at <= $3
GROUP BY case_type
ORDER BY total DESC`
report.ByType = []BillingByType{}
if err := s.db.SelectContext(ctx, &report.ByType, typeQuery, tenantID, from, to); err != nil {
return nil, fmt.Errorf("billing report by type: %w", err)
}
return report, nil
}
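
Both the monthly and total compliance rates above use the same rule: pending deadlines are excluded from the denominator, so the rate reflects only deadlines whose outcome is already decided. A standalone sketch of that calculation:

```go
package main

import "fmt"

// complianceRate mirrors the calculation in DeadlineReport: the denominator
// is met + missed (decided deadlines only); pending deadlines do not dilute
// the rate. A period with no decided deadlines reports 0.
func complianceRate(met, missed int) float64 {
	completed := met + missed
	if completed == 0 {
		return 0
	}
	return float64(met) / float64(completed) * 100
}

func main() {
	fmt.Printf("%.1f\n", complianceRate(18, 2)) // 90.0
	fmt.Printf("%.1f\n", complianceRate(0, 0))  // 0.0
}
```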


@@ -0,0 +1,330 @@
package services
import (
"context"
"database/sql"
"encoding/json"
"fmt"
"strings"
"time"
"github.com/google/uuid"
"github.com/jmoiron/sqlx"
"mgit.msbls.de/m/KanzlAI-mGMT/internal/models"
)
type TemplateService struct {
db *sqlx.DB
audit *AuditService
}
func NewTemplateService(db *sqlx.DB, audit *AuditService) *TemplateService {
return &TemplateService{db: db, audit: audit}
}
type TemplateFilter struct {
Category string
Search string
Limit int
Offset int
}
type CreateTemplateInput struct {
Name string `json:"name"`
Description *string `json:"description,omitempty"`
Category string `json:"category"`
Content string `json:"content"`
Variables []byte `json:"variables,omitempty"`
}
type UpdateTemplateInput struct {
Name *string `json:"name,omitempty"`
Description *string `json:"description,omitempty"`
Category *string `json:"category,omitempty"`
Content *string `json:"content,omitempty"`
Variables []byte `json:"variables,omitempty"`
}
var validCategories = map[string]bool{
"schriftsatz": true,
"vertrag": true,
"korrespondenz": true,
"intern": true,
}
func (s *TemplateService) List(ctx context.Context, tenantID uuid.UUID, filter TemplateFilter) ([]models.DocumentTemplate, int, error) {
if filter.Limit <= 0 {
filter.Limit = 50
}
if filter.Limit > 100 {
filter.Limit = 100
}
// Show system templates + tenant's own templates
where := "WHERE (tenant_id = $1 OR is_system = true)"
args := []any{tenantID}
argIdx := 2
if filter.Category != "" {
where += fmt.Sprintf(" AND category = $%d", argIdx)
args = append(args, filter.Category)
argIdx++
}
if filter.Search != "" {
where += fmt.Sprintf(" AND (name ILIKE $%d OR description ILIKE $%d)", argIdx, argIdx)
args = append(args, "%"+filter.Search+"%")
argIdx++
}
var total int
countQ := "SELECT COUNT(*) FROM document_templates " + where
if err := s.db.GetContext(ctx, &total, countQ, args...); err != nil {
return nil, 0, fmt.Errorf("counting templates: %w", err)
}
query := "SELECT * FROM document_templates " + where + " ORDER BY is_system DESC, name ASC"
query += fmt.Sprintf(" LIMIT $%d OFFSET $%d", argIdx, argIdx+1)
args = append(args, filter.Limit, filter.Offset)
var templates []models.DocumentTemplate
if err := s.db.SelectContext(ctx, &templates, query, args...); err != nil {
return nil, 0, fmt.Errorf("listing templates: %w", err)
}
return templates, total, nil
}
func (s *TemplateService) GetByID(ctx context.Context, tenantID, templateID uuid.UUID) (*models.DocumentTemplate, error) {
var t models.DocumentTemplate
err := s.db.GetContext(ctx, &t,
"SELECT * FROM document_templates WHERE id = $1 AND (tenant_id = $2 OR is_system = true)",
templateID, tenantID)
if err == sql.ErrNoRows {
return nil, nil
}
if err != nil {
return nil, fmt.Errorf("getting template: %w", err)
}
return &t, nil
}
func (s *TemplateService) Create(ctx context.Context, tenantID uuid.UUID, input CreateTemplateInput) (*models.DocumentTemplate, error) {
if input.Name == "" {
return nil, fmt.Errorf("name is required")
}
if !validCategories[input.Category] {
return nil, fmt.Errorf("invalid category: %s", input.Category)
}
variables := input.Variables
if variables == nil {
variables = []byte("[]")
}
var t models.DocumentTemplate
err := s.db.GetContext(ctx, &t,
`INSERT INTO document_templates (tenant_id, name, description, category, content, variables, is_system)
VALUES ($1, $2, $3, $4, $5, $6, false)
RETURNING *`,
tenantID, input.Name, input.Description, input.Category, input.Content, variables)
if err != nil {
return nil, fmt.Errorf("creating template: %w", err)
}
s.audit.Log(ctx, "create", "document_template", &t.ID, nil, t)
return &t, nil
}
func (s *TemplateService) Update(ctx context.Context, tenantID, templateID uuid.UUID, input UpdateTemplateInput) (*models.DocumentTemplate, error) {
// Don't allow editing system templates
existing, err := s.GetByID(ctx, tenantID, templateID)
if err != nil {
return nil, err
}
if existing == nil {
return nil, nil
}
if existing.IsSystem {
return nil, fmt.Errorf("system templates cannot be edited")
}
if existing.TenantID == nil || *existing.TenantID != tenantID {
return nil, fmt.Errorf("template does not belong to tenant")
}
sets := []string{}
args := []any{}
argIdx := 1
if input.Name != nil {
sets = append(sets, fmt.Sprintf("name = $%d", argIdx))
args = append(args, *input.Name)
argIdx++
}
if input.Description != nil {
sets = append(sets, fmt.Sprintf("description = $%d", argIdx))
args = append(args, *input.Description)
argIdx++
}
if input.Category != nil {
if !validCategories[*input.Category] {
return nil, fmt.Errorf("invalid category: %s", *input.Category)
}
sets = append(sets, fmt.Sprintf("category = $%d", argIdx))
args = append(args, *input.Category)
argIdx++
}
if input.Content != nil {
sets = append(sets, fmt.Sprintf("content = $%d", argIdx))
args = append(args, *input.Content)
argIdx++
}
if input.Variables != nil {
sets = append(sets, fmt.Sprintf("variables = $%d", argIdx))
args = append(args, input.Variables)
argIdx++
}
if len(sets) == 0 {
return existing, nil
}
sets = append(sets, "updated_at = now()")
query := fmt.Sprintf("UPDATE document_templates SET %s WHERE id = $%d AND tenant_id = $%d RETURNING *",
strings.Join(sets, ", "), argIdx, argIdx+1)
args = append(args, templateID, tenantID)
var t models.DocumentTemplate
if err := s.db.GetContext(ctx, &t, query, args...); err != nil {
return nil, fmt.Errorf("updating template: %w", err)
}
s.audit.Log(ctx, "update", "document_template", &t.ID, existing, t)
return &t, nil
}
func (s *TemplateService) Delete(ctx context.Context, tenantID, templateID uuid.UUID) error {
// Don't allow deleting system templates
existing, err := s.GetByID(ctx, tenantID, templateID)
if err != nil {
return err
}
if existing == nil {
return fmt.Errorf("template not found")
}
if existing.IsSystem {
return fmt.Errorf("system templates cannot be deleted")
}
if existing.TenantID == nil || *existing.TenantID != tenantID {
return fmt.Errorf("template does not belong to tenant")
}
_, err = s.db.ExecContext(ctx, "DELETE FROM document_templates WHERE id = $1 AND tenant_id = $2", templateID, tenantID)
if err != nil {
return fmt.Errorf("deleting template: %w", err)
}
s.audit.Log(ctx, "delete", "document_template", &templateID, existing, nil)
return nil
}
// RenderData holds all the data available for template variable replacement.
type RenderData struct {
Case *models.Case
Parties []models.Party
Tenant *models.Tenant
Deadline *models.Deadline
UserName string
UserEmail string
}
// Render replaces {{placeholders}} in the template content with actual data.
func (s *TemplateService) Render(template *models.DocumentTemplate, data RenderData) string {
content := template.Content
now := time.Now()
replacements := map[string]string{
"{{date.today}}": now.Format("02.01.2006"),
"{{date.today_long}}": formatGermanDate(now),
}
// Case data
if data.Case != nil {
replacements["{{case.number}}"] = data.Case.CaseNumber
replacements["{{case.title}}"] = data.Case.Title
if data.Case.Court != nil {
replacements["{{case.court}}"] = *data.Case.Court
}
if data.Case.CourtRef != nil {
replacements["{{case.court_ref}}"] = *data.Case.CourtRef
}
}
// Party data
for _, p := range data.Parties {
role := ""
if p.Role != nil {
role = *p.Role
}
switch role {
case "claimant", "plaintiff", "klaeger":
replacements["{{party.claimant.name}}"] = p.Name
if p.Representative != nil {
replacements["{{party.claimant.representative}}"] = *p.Representative
}
case "defendant", "beklagter":
replacements["{{party.defendant.name}}"] = p.Name
if p.Representative != nil {
replacements["{{party.defendant.representative}}"] = *p.Representative
}
}
}
// Tenant data
if data.Tenant != nil {
replacements["{{tenant.name}}"] = data.Tenant.Name
// Extract address from settings if available
replacements["{{tenant.address}}"] = extractSettingsField(data.Tenant.Settings, "address")
}
// User data
replacements["{{user.name}}"] = data.UserName
replacements["{{user.email}}"] = data.UserEmail
// Deadline data
if data.Deadline != nil {
replacements["{{deadline.title}}"] = data.Deadline.Title
replacements["{{deadline.due_date}}"] = data.Deadline.DueDate
}
for placeholder, value := range replacements {
content = strings.ReplaceAll(content, placeholder, value)
}
return content
}
func formatGermanDate(t time.Time) string {
months := []string{
"Januar", "Februar", "März", "April", "Mai", "Juni",
"Juli", "August", "September", "Oktober", "November", "Dezember",
}
return fmt.Sprintf("%d. %s %d", t.Day(), months[t.Month()-1], t.Year())
}
func extractSettingsField(settings []byte, field string) string {
if len(settings) == 0 {
return ""
}
var m map[string]any
if err := json.Unmarshal(settings, &m); err != nil {
return ""
}
if v, ok := m[field]; ok {
if s, ok := v.(string); ok {
return s
}
}
return ""
}


@@ -3,6 +3,7 @@ package services
import (
"context"
"database/sql"
"encoding/json"
"fmt"
"github.com/google/uuid"
@@ -12,11 +13,12 @@ import (
)
type TenantService struct {
db *sqlx.DB
audit *AuditService
}
func NewTenantService(db *sqlx.DB, audit *AuditService) *TenantService {
return &TenantService{db: db, audit: audit}
}
// Create creates a new tenant and assigns the creator as owner.
@@ -48,6 +50,7 @@ func (s *TenantService) Create(ctx context.Context, userID uuid.UUID, name, slug
return nil, fmt.Errorf("commit: %w", err)
}
s.audit.Log(ctx, "create", "tenant", &tenant.ID, nil, tenant)
return &tenant, nil
}
@@ -100,6 +103,19 @@ func (s *TenantService) GetUserRole(ctx context.Context, userID, tenantID uuid.U
return role, nil
}
// VerifyAccess checks if a user has access to a given tenant.
func (s *TenantService) VerifyAccess(ctx context.Context, userID, tenantID uuid.UUID) (bool, error) {
var exists bool
err := s.db.GetContext(ctx, &exists,
`SELECT EXISTS(SELECT 1 FROM user_tenants WHERE user_id = $1 AND tenant_id = $2)`,
userID, tenantID,
)
if err != nil {
return false, fmt.Errorf("verify tenant access: %w", err)
}
return exists, nil
}
// FirstTenantForUser returns the user's first tenant (by name), used as default.
func (s *TenantService) FirstTenantForUser(ctx context.Context, userID uuid.UUID) (*uuid.UUID, error) {
var tenantID uuid.UUID
@@ -123,7 +139,11 @@ func (s *TenantService) FirstTenantForUser(ctx context.Context, userID uuid.UUID
func (s *TenantService) ListMembers(ctx context.Context, tenantID uuid.UUID) ([]models.UserTenant, error) {
var members []models.UserTenant
err := s.db.SelectContext(ctx, &members,
`SELECT ut.user_id, ut.tenant_id, ut.role, ut.created_at, COALESCE(au.email, '') as email
FROM user_tenants ut
LEFT JOIN auth.users au ON au.id = ut.user_id
WHERE ut.tenant_id = $1
ORDER BY ut.created_at`,
tenantID,
)
if err != nil {
@@ -170,9 +190,108 @@ func (s *TenantService) InviteByEmail(ctx context.Context, tenantID uuid.UUID, e
return nil, fmt.Errorf("invite user: %w", err)
}
s.audit.Log(ctx, "create", "membership", &tenantID, nil, ut)
return &ut, nil
}
// UpdateSettings merges new settings into the tenant's existing settings JSONB.
func (s *TenantService) UpdateSettings(ctx context.Context, tenantID uuid.UUID, settings json.RawMessage) (*models.Tenant, error) {
var tenant models.Tenant
err := s.db.QueryRowxContext(ctx,
`UPDATE tenants SET settings = COALESCE(settings, '{}'::jsonb) || $1::jsonb, updated_at = NOW()
WHERE id = $2
RETURNING id, name, slug, settings, created_at, updated_at`,
settings, tenantID,
).StructScan(&tenant)
if err != nil {
return nil, fmt.Errorf("update settings: %w", err)
}
s.audit.Log(ctx, "update", "settings", &tenantID, nil, settings)
return &tenant, nil
}
// UpdateMemberRole changes a member's role in a tenant.
func (s *TenantService) UpdateMemberRole(ctx context.Context, tenantID, userID uuid.UUID, newRole string) error {
// Get current role
currentRole, err := s.GetUserRole(ctx, userID, tenantID)
if err != nil {
return fmt.Errorf("get current role: %w", err)
}
if currentRole == "" {
return fmt.Errorf("user is not a member of this tenant")
}
// If demoting the last owner, block it
if currentRole == "owner" && newRole != "owner" {
var ownerCount int
err := s.db.GetContext(ctx, &ownerCount,
`SELECT COUNT(*) FROM user_tenants WHERE tenant_id = $1 AND role = 'owner'`,
tenantID)
if err != nil {
return fmt.Errorf("count owners: %w", err)
}
if ownerCount <= 1 {
return fmt.Errorf("cannot demote the last owner")
}
}
_, err = s.db.ExecContext(ctx,
`UPDATE user_tenants SET role = $1 WHERE user_id = $2 AND tenant_id = $3`,
newRole, userID, tenantID)
if err != nil {
return fmt.Errorf("update role: %w", err)
}
return nil
}
// AutoAssignByDomain finds a tenant with a matching auto_assign_domains setting
// and adds the user as a member. Returns the tenant and role, or nil if no match.
func (s *TenantService) AutoAssignByDomain(ctx context.Context, userID uuid.UUID, emailDomain string) (*models.TenantWithRole, error) {
// Find tenant where settings.auto_assign_domains contains this domain
var tenant models.Tenant
err := s.db.GetContext(ctx, &tenant,
`SELECT id, name, slug, settings, created_at, updated_at
FROM tenants
WHERE settings->'auto_assign_domains' ? $1
LIMIT 1`,
emailDomain,
)
if err == sql.ErrNoRows {
return nil, nil // no matching domain; not an error
}
if err != nil {
return nil, fmt.Errorf("find tenant by domain: %w", err)
}
// Check if already a member
var exists bool
err = s.db.GetContext(ctx, &exists,
`SELECT EXISTS(SELECT 1 FROM user_tenants WHERE user_id = $1 AND tenant_id = $2)`,
userID, tenant.ID,
)
if err != nil {
return nil, fmt.Errorf("check membership: %w", err)
}
if exists {
// Already a member — return the existing membership
role, err := s.GetUserRole(ctx, userID, tenant.ID)
if err != nil {
return nil, fmt.Errorf("get existing role: %w", err)
}
return &models.TenantWithRole{Tenant: tenant, Role: role}, nil
}
// Add as member (associate by default for auto-assigned users)
role := "associate"
_, err = s.db.ExecContext(ctx,
`INSERT INTO user_tenants (user_id, tenant_id, role) VALUES ($1, $2, $3)`,
userID, tenant.ID, role,
)
if err != nil {
return nil, fmt.Errorf("auto-assign user: %w", err)
}
s.audit.Log(ctx, "create", "auto_membership", &tenant.ID, map[string]any{"domain": emailDomain}, map[string]any{"user_id": userID, "role": role})
return &models.TenantWithRole{Tenant: tenant, Role: role}, nil
}
// RemoveMember removes a user from a tenant. Cannot remove the last owner.
func (s *TenantService) RemoveMember(ctx context.Context, tenantID, userID uuid.UUID) error {
// Check if the user being removed is an owner
@@ -207,5 +326,6 @@ func (s *TenantService) RemoveMember(ctx context.Context, tenantID, userID uuid.
return fmt.Errorf("remove member: %w", err)
}
s.audit.Log(ctx, "delete", "membership", &tenantID, map[string]any{"user_id": userID, "role": role}, nil)
return nil
}


@@ -0,0 +1,276 @@
package services
import (
"context"
"database/sql"
"fmt"
"github.com/google/uuid"
"github.com/jmoiron/sqlx"
"mgit.msbls.de/m/KanzlAI-mGMT/internal/models"
)
type TimeEntryService struct {
db *sqlx.DB
audit *AuditService
}
func NewTimeEntryService(db *sqlx.DB, audit *AuditService) *TimeEntryService {
return &TimeEntryService{db: db, audit: audit}
}
type CreateTimeEntryInput struct {
CaseID uuid.UUID `json:"case_id"`
Date string `json:"date"`
DurationMinutes int `json:"duration_minutes"`
Description string `json:"description"`
Activity *string `json:"activity,omitempty"`
Billable *bool `json:"billable,omitempty"`
HourlyRate *float64 `json:"hourly_rate,omitempty"`
}
type UpdateTimeEntryInput struct {
Date *string `json:"date,omitempty"`
DurationMinutes *int `json:"duration_minutes,omitempty"`
Description *string `json:"description,omitempty"`
Activity *string `json:"activity,omitempty"`
Billable *bool `json:"billable,omitempty"`
HourlyRate *float64 `json:"hourly_rate,omitempty"`
}
type TimeEntryFilter struct {
CaseID *uuid.UUID
UserID *uuid.UUID
From string
To string
Limit int
Offset int
}
type TimeEntrySummary struct {
GroupKey string `db:"group_key" json:"group_key"`
TotalMinutes int `db:"total_minutes" json:"total_minutes"`
BillableMinutes int `db:"billable_minutes" json:"billable_minutes"`
TotalAmount float64 `db:"total_amount" json:"total_amount"`
EntryCount int `db:"entry_count" json:"entry_count"`
}
const timeEntryCols = `id, tenant_id, case_id, user_id, date, duration_minutes, description,
activity, billable, billed, invoice_id, hourly_rate, created_at, updated_at`
func (s *TimeEntryService) ListForCase(ctx context.Context, tenantID, caseID uuid.UUID) ([]models.TimeEntry, error) {
var entries []models.TimeEntry
err := s.db.SelectContext(ctx, &entries,
`SELECT `+timeEntryCols+` FROM time_entries
WHERE tenant_id = $1 AND case_id = $2
ORDER BY date DESC, created_at DESC`,
tenantID, caseID)
if err != nil {
return nil, fmt.Errorf("list time entries for case: %w", err)
}
return entries, nil
}
func (s *TimeEntryService) List(ctx context.Context, tenantID uuid.UUID, filter TimeEntryFilter) ([]models.TimeEntry, int, error) {
if filter.Limit <= 0 {
filter.Limit = 20
}
if filter.Limit > 100 {
filter.Limit = 100
}
where := "WHERE tenant_id = $1"
args := []any{tenantID}
argIdx := 2
if filter.CaseID != nil {
where += fmt.Sprintf(" AND case_id = $%d", argIdx)
args = append(args, *filter.CaseID)
argIdx++
}
if filter.UserID != nil {
where += fmt.Sprintf(" AND user_id = $%d", argIdx)
args = append(args, *filter.UserID)
argIdx++
}
if filter.From != "" {
where += fmt.Sprintf(" AND date >= $%d", argIdx)
args = append(args, filter.From)
argIdx++
}
if filter.To != "" {
where += fmt.Sprintf(" AND date <= $%d", argIdx)
args = append(args, filter.To)
argIdx++
}
var total int
err := s.db.GetContext(ctx, &total,
"SELECT COUNT(*) FROM time_entries "+where, args...)
if err != nil {
return nil, 0, fmt.Errorf("count time entries: %w", err)
}
query := fmt.Sprintf("SELECT %s FROM time_entries %s ORDER BY date DESC, created_at DESC LIMIT $%d OFFSET $%d",
timeEntryCols, where, argIdx, argIdx+1)
args = append(args, filter.Limit, filter.Offset)
var entries []models.TimeEntry
err = s.db.SelectContext(ctx, &entries, query, args...)
if err != nil {
return nil, 0, fmt.Errorf("list time entries: %w", err)
}
return entries, total, nil
}
func (s *TimeEntryService) GetByID(ctx context.Context, tenantID, entryID uuid.UUID) (*models.TimeEntry, error) {
var entry models.TimeEntry
err := s.db.GetContext(ctx, &entry,
`SELECT `+timeEntryCols+` FROM time_entries WHERE tenant_id = $1 AND id = $2`,
tenantID, entryID)
if err == sql.ErrNoRows {
return nil, nil
}
if err != nil {
return nil, fmt.Errorf("get time entry: %w", err)
}
return &entry, nil
}
func (s *TimeEntryService) Create(ctx context.Context, tenantID, userID uuid.UUID, input CreateTimeEntryInput) (*models.TimeEntry, error) {
billable := true
if input.Billable != nil {
billable = *input.Billable
}
// If no hourly rate provided, look up the current billing rate
hourlyRate := input.HourlyRate
if hourlyRate == nil {
var rate float64
err := s.db.GetContext(ctx, &rate,
`SELECT rate FROM billing_rates
WHERE tenant_id = $1 AND (user_id = $2 OR user_id IS NULL)
AND valid_from <= $3 AND (valid_to IS NULL OR valid_to >= $3)
ORDER BY user_id NULLS LAST LIMIT 1`,
tenantID, userID, input.Date)
if err == nil {
hourlyRate = &rate
}
}
var entry models.TimeEntry
err := s.db.QueryRowxContext(ctx,
`INSERT INTO time_entries (tenant_id, case_id, user_id, date, duration_minutes, description, activity, billable, hourly_rate)
VALUES ($1, $2, $3, $4, $5, $6, $7, $8, $9)
RETURNING `+timeEntryCols,
tenantID, input.CaseID, userID, input.Date, input.DurationMinutes, input.Description, input.Activity, billable, hourlyRate,
).StructScan(&entry)
if err != nil {
return nil, fmt.Errorf("create time entry: %w", err)
}
s.audit.Log(ctx, "create", "time_entry", &entry.ID, nil, entry)
return &entry, nil
}
func (s *TimeEntryService) Update(ctx context.Context, tenantID, entryID uuid.UUID, input UpdateTimeEntryInput) (*models.TimeEntry, error) {
old, err := s.GetByID(ctx, tenantID, entryID)
if err != nil {
return nil, err
}
if old == nil {
return nil, fmt.Errorf("time entry not found")
}
if old.Billed {
return nil, fmt.Errorf("cannot update a billed time entry")
}
var entry models.TimeEntry
err = s.db.QueryRowxContext(ctx,
`UPDATE time_entries SET
date = COALESCE($3, date),
duration_minutes = COALESCE($4, duration_minutes),
description = COALESCE($5, description),
activity = COALESCE($6, activity),
billable = COALESCE($7, billable),
hourly_rate = COALESCE($8, hourly_rate),
updated_at = now()
WHERE tenant_id = $1 AND id = $2
RETURNING `+timeEntryCols,
tenantID, entryID, input.Date, input.DurationMinutes, input.Description, input.Activity, input.Billable, input.HourlyRate,
).StructScan(&entry)
if err != nil {
return nil, fmt.Errorf("update time entry: %w", err)
}
s.audit.Log(ctx, "update", "time_entry", &entry.ID, old, entry)
return &entry, nil
}
func (s *TimeEntryService) Delete(ctx context.Context, tenantID, entryID uuid.UUID) error {
old, err := s.GetByID(ctx, tenantID, entryID)
if err != nil {
return err
}
if old == nil {
return fmt.Errorf("time entry not found")
}
if old.Billed {
return fmt.Errorf("cannot delete a billed time entry")
}
_, err = s.db.ExecContext(ctx,
`DELETE FROM time_entries WHERE tenant_id = $1 AND id = $2`,
tenantID, entryID)
if err != nil {
return fmt.Errorf("delete time entry: %w", err)
}
s.audit.Log(ctx, "delete", "time_entry", &entryID, old, nil)
return nil
}
func (s *TimeEntryService) Summary(ctx context.Context, tenantID uuid.UUID, groupBy string, from, to string) ([]TimeEntrySummary, error) {
var groupExpr string
switch groupBy {
case "user":
groupExpr = "user_id::text"
case "month":
groupExpr = "to_char(date, 'YYYY-MM')"
default:
groupExpr = "case_id::text"
}
where := "WHERE tenant_id = $1"
args := []any{tenantID}
argIdx := 2
if from != "" {
where += fmt.Sprintf(" AND date >= $%d", argIdx)
args = append(args, from)
argIdx++
}
if to != "" {
where += fmt.Sprintf(" AND date <= $%d", argIdx)
args = append(args, to)
argIdx++
}
query := fmt.Sprintf(`SELECT %s AS group_key,
SUM(duration_minutes) AS total_minutes,
SUM(CASE WHEN billable THEN duration_minutes ELSE 0 END) AS billable_minutes,
SUM(CASE WHEN billable AND hourly_rate IS NOT NULL THEN duration_minutes * hourly_rate / 60.0 ELSE 0 END) AS total_amount,
COUNT(*) AS entry_count
FROM time_entries %s
GROUP BY %s
ORDER BY %s`,
groupExpr, where, groupExpr, groupExpr)
var summaries []TimeEntrySummary
err := s.db.SelectContext(ctx, &summaries, query, args...)
if err != nil {
return nil, fmt.Errorf("time entry summary: %w", err)
}
return summaries, nil
}
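The amount column in the Summary query converts minute totals to hours before applying the rate (`duration_minutes * hourly_rate / 60.0`); a minimal standalone check of that arithmetic, with a made-up rate:

```go
package main

import "fmt"

// billableAmount mirrors the SQL expression above:
// duration_minutes * hourly_rate / 60.0
func billableAmount(durationMinutes int, hourlyRate float64) float64 {
	return float64(durationMinutes) * hourlyRate / 60.0
}

func main() {
	// 90 billable minutes at a hypothetical 250 EUR/h rate.
	fmt.Printf("%.2f\n", billableAmount(90, 250)) // → 375.00
}
```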

backend/seed/demo_data.sql (new file)

@@ -0,0 +1,167 @@
-- KanzlAI Demo Data
-- Creates 1 test tenant, 5 cases with deadlines and appointments
-- Run with: psql $DATABASE_URL -f demo_data.sql
SET search_path TO mgmt, public;
-- Demo tenant
INSERT INTO tenants (id, name, slug, settings) VALUES
('a0000000-0000-0000-0000-000000000001', 'Kanzlei Siebels & Partner', 'siebels-partner', '{}')
ON CONFLICT (id) DO NOTHING;
-- Link both users to the demo tenant
INSERT INTO user_tenants (user_id, tenant_id, role) VALUES
('1da9374d-a8a6-49fc-a2ec-5ddfa91d522d', 'a0000000-0000-0000-0000-000000000001', 'owner'),
('ac6c9501-3757-4a6d-8b97-2cff4288382b', 'a0000000-0000-0000-0000-000000000001', 'member')
ON CONFLICT DO NOTHING;
-- ============================================================
-- Case 1: Patentverletzung (patent infringement) — active
-- ============================================================
INSERT INTO cases (id, tenant_id, case_number, title, case_type, court, court_ref, status) VALUES
('c0000000-0000-0000-0000-000000000001',
'a0000000-0000-0000-0000-000000000001',
'2026/001', 'TechCorp GmbH ./. InnovatAG — Patentverletzung EP 1234567',
'patent', 'UPC München (Lokalkammer)', 'UPC_CFI-123/2026',
'active');
INSERT INTO parties (id, tenant_id, case_id, name, role, representative) VALUES
(gen_random_uuid(), 'a0000000-0000-0000-0000-000000000001', 'c0000000-0000-0000-0000-000000000001',
'TechCorp GmbH', 'claimant', 'RA Dr. Siebels'),
(gen_random_uuid(), 'a0000000-0000-0000-0000-000000000001', 'c0000000-0000-0000-0000-000000000001',
'InnovatAG', 'defendant', 'RA Müller');
INSERT INTO deadlines (id, tenant_id, case_id, title, due_date, warning_date, status, source) VALUES
(gen_random_uuid(), 'a0000000-0000-0000-0000-000000000001', 'c0000000-0000-0000-0000-000000000001',
'Klageerwiderung einreichen', CURRENT_DATE + INTERVAL '3 days', CURRENT_DATE + INTERVAL '1 day', 'pending', 'manual'),
(gen_random_uuid(), 'a0000000-0000-0000-0000-000000000001', 'c0000000-0000-0000-0000-000000000001',
'Beweisangebote nachreichen', CURRENT_DATE + INTERVAL '14 days', CURRENT_DATE + INTERVAL '10 days', 'pending', 'manual'),
(gen_random_uuid(), 'a0000000-0000-0000-0000-000000000001', 'c0000000-0000-0000-0000-000000000001',
'Schriftsatz Anspruch 3', CURRENT_DATE - INTERVAL '2 days', CURRENT_DATE - INTERVAL '5 days', 'pending', 'manual');
INSERT INTO appointments (id, tenant_id, case_id, title, start_at, end_at, location, appointment_type) VALUES
(gen_random_uuid(), 'a0000000-0000-0000-0000-000000000001', 'c0000000-0000-0000-0000-000000000001',
'Mündliche Verhandlung', CURRENT_DATE + INTERVAL '21 days' + TIME '10:00', CURRENT_DATE + INTERVAL '21 days' + TIME '12:00',
'UPC München, Saal 4', 'hearing');
-- ============================================================
-- Case 2: Markenrecht (trademark) — active
-- ============================================================
INSERT INTO cases (id, tenant_id, case_number, title, case_type, court, court_ref, status) VALUES
('c0000000-0000-0000-0000-000000000002',
'a0000000-0000-0000-0000-000000000001',
'2026/002', 'BrandHouse ./. CopyShop UG — Markenverletzung DE 30201234',
'trademark', 'LG Hamburg', '315 O 78/26',
'active');
INSERT INTO parties (id, tenant_id, case_id, name, role, representative) VALUES
(gen_random_uuid(), 'a0000000-0000-0000-0000-000000000001', 'c0000000-0000-0000-0000-000000000002',
'BrandHouse SE', 'claimant', 'RA Dr. Siebels'),
(gen_random_uuid(), 'a0000000-0000-0000-0000-000000000001', 'c0000000-0000-0000-0000-000000000002',
'CopyShop UG', 'defendant', 'RA Weber');
INSERT INTO deadlines (id, tenant_id, case_id, title, due_date, warning_date, status, source) VALUES
(gen_random_uuid(), 'a0000000-0000-0000-0000-000000000001', 'c0000000-0000-0000-0000-000000000002',
'Antrag einstweilige Verfügung', CURRENT_DATE + INTERVAL '5 days', CURRENT_DATE + INTERVAL '2 days', 'pending', 'manual'),
(gen_random_uuid(), 'a0000000-0000-0000-0000-000000000001', 'c0000000-0000-0000-0000-000000000002',
'Abmahnung Fristablauf', CURRENT_DATE + INTERVAL '30 days', CURRENT_DATE + INTERVAL '25 days', 'pending', 'manual');
INSERT INTO appointments (id, tenant_id, case_id, title, start_at, end_at, location, appointment_type) VALUES
(gen_random_uuid(), 'a0000000-0000-0000-0000-000000000001', 'c0000000-0000-0000-0000-000000000002',
'Mandantenbesprechung BrandHouse', CURRENT_DATE + INTERVAL '2 days' + TIME '14:00', CURRENT_DATE + INTERVAL '2 days' + TIME '15:30',
'Kanzlei, Besprechungsraum 1', 'consultation');
-- ============================================================
-- Case 3: Arbeitsgericht (labor law) — active
-- ============================================================
INSERT INTO cases (id, tenant_id, case_number, title, case_type, court, court_ref, status) VALUES
('c0000000-0000-0000-0000-000000000003',
'a0000000-0000-0000-0000-000000000001',
'2026/003', 'Schmidt ./. AutoWerk Bayern GmbH — Kündigungsschutz',
'labor', 'ArbG München', '12 Ca 456/26',
'active');
INSERT INTO parties (id, tenant_id, case_id, name, role, representative) VALUES
(gen_random_uuid(), 'a0000000-0000-0000-0000-000000000001', 'c0000000-0000-0000-0000-000000000003',
'Klaus Schmidt', 'claimant', 'RA Dr. Siebels'),
(gen_random_uuid(), 'a0000000-0000-0000-0000-000000000001', 'c0000000-0000-0000-0000-000000000003',
'AutoWerk Bayern GmbH', 'defendant', 'RA Fischer');
INSERT INTO deadlines (id, tenant_id, case_id, title, due_date, warning_date, status, source) VALUES
(gen_random_uuid(), 'a0000000-0000-0000-0000-000000000001', 'c0000000-0000-0000-0000-000000000003',
'Kündigungsschutzklage einreichen (3-Wochen-Frist)', CURRENT_DATE + INTERVAL '7 days', CURRENT_DATE + INTERVAL '4 days', 'pending', 'manual'),
(gen_random_uuid(), 'a0000000-0000-0000-0000-000000000001', 'c0000000-0000-0000-0000-000000000003',
'Stellungnahme Arbeitgeber', CURRENT_DATE + INTERVAL '28 days', CURRENT_DATE + INTERVAL '21 days', 'pending', 'manual');
INSERT INTO appointments (id, tenant_id, case_id, title, start_at, end_at, location, appointment_type) VALUES
(gen_random_uuid(), 'a0000000-0000-0000-0000-000000000001', 'c0000000-0000-0000-0000-000000000003',
'Güteverhandlung', CURRENT_DATE + INTERVAL '35 days' + TIME '09:00', CURRENT_DATE + INTERVAL '35 days' + TIME '10:00',
'ArbG München, Saal 12', 'hearing');
-- ============================================================
-- Case 4: Mietrecht (tenancy) — active
-- ============================================================
INSERT INTO cases (id, tenant_id, case_number, title, case_type, court, court_ref, status) VALUES
('c0000000-0000-0000-0000-000000000004',
'a0000000-0000-0000-0000-000000000001',
'2026/004', 'Hausverwaltung Zentral ./. Meier — Mietrückstand',
'civil', 'AG München', '432 C 1234/26',
'active');
INSERT INTO parties (id, tenant_id, case_id, name, role, representative) VALUES
(gen_random_uuid(), 'a0000000-0000-0000-0000-000000000001', 'c0000000-0000-0000-0000-000000000004',
'Hausverwaltung Zentral GmbH', 'claimant', 'RA Dr. Siebels'),
(gen_random_uuid(), 'a0000000-0000-0000-0000-000000000001', 'c0000000-0000-0000-0000-000000000004',
'Thomas Meier', 'defendant', NULL);
INSERT INTO deadlines (id, tenant_id, case_id, title, due_date, warning_date, status, source) VALUES
(gen_random_uuid(), 'a0000000-0000-0000-0000-000000000001', 'c0000000-0000-0000-0000-000000000004',
'Mahnbescheid beantragen', CURRENT_DATE + INTERVAL '10 days', CURRENT_DATE + INTERVAL '7 days', 'pending', 'manual'),
(gen_random_uuid(), 'a0000000-0000-0000-0000-000000000001', 'c0000000-0000-0000-0000-000000000004',
'Räumungsfrist prüfen', CURRENT_DATE + INTERVAL '60 days', CURRENT_DATE + INTERVAL '50 days', 'pending', 'manual');
INSERT INTO appointments (id, tenant_id, case_id, title, start_at, end_at, location, appointment_type) VALUES
(gen_random_uuid(), 'a0000000-0000-0000-0000-000000000001', 'c0000000-0000-0000-0000-000000000004',
'Besprechung Hausverwaltung', CURRENT_DATE + INTERVAL '4 days' + TIME '11:00', CURRENT_DATE + INTERVAL '4 days' + TIME '12:00',
'Kanzlei, Besprechungsraum 2', 'meeting');
-- ============================================================
-- Case 5: Erbrecht (inheritance) — closed
-- ============================================================
INSERT INTO cases (id, tenant_id, case_number, title, case_type, court, court_ref, status) VALUES
('c0000000-0000-0000-0000-000000000005',
'a0000000-0000-0000-0000-000000000001',
'2025/042', 'Nachlass Wagner — Erbauseinandersetzung',
'civil', 'AG Starnberg', '3 VI 891/25',
'closed');
INSERT INTO parties (id, tenant_id, case_id, name, role, representative) VALUES
(gen_random_uuid(), 'a0000000-0000-0000-0000-000000000001', 'c0000000-0000-0000-0000-000000000005',
'Maria Wagner', 'claimant', 'RA Dr. Siebels'),
(gen_random_uuid(), 'a0000000-0000-0000-0000-000000000001', 'c0000000-0000-0000-0000-000000000005',
'Peter Wagner', 'defendant', 'RA Braun');
INSERT INTO deadlines (id, tenant_id, case_id, title, due_date, warning_date, status, source, completed_at) VALUES
(gen_random_uuid(), 'a0000000-0000-0000-0000-000000000001', 'c0000000-0000-0000-0000-000000000005',
'Erbscheinsantrag einreichen', CURRENT_DATE - INTERVAL '30 days', CURRENT_DATE - INTERVAL '37 days', 'completed', 'manual', CURRENT_DATE - INTERVAL '32 days');
-- ============================================================
-- Case events for realistic activity feed
-- ============================================================
INSERT INTO case_events (id, tenant_id, case_id, event_type, title, description, created_at, updated_at) VALUES
(gen_random_uuid(), 'a0000000-0000-0000-0000-000000000001', 'c0000000-0000-0000-0000-000000000001',
'case_created', 'Akte angelegt', 'Patentverletzungsklage TechCorp ./. InnovatAG eröffnet', NOW() - INTERVAL '10 days', NOW() - INTERVAL '10 days'),
(gen_random_uuid(), 'a0000000-0000-0000-0000-000000000001', 'c0000000-0000-0000-0000-000000000001',
'party_added', 'Partei hinzugefügt', 'TechCorp GmbH als Kläger eingetragen', NOW() - INTERVAL '10 days', NOW() - INTERVAL '10 days'),
(gen_random_uuid(), 'a0000000-0000-0000-0000-000000000001', 'c0000000-0000-0000-0000-000000000002',
'case_created', 'Akte angelegt', 'Markenrechtsstreit BrandHouse ./. CopyShop eröffnet', NOW() - INTERVAL '7 days', NOW() - INTERVAL '7 days'),
(gen_random_uuid(), 'a0000000-0000-0000-0000-000000000001', 'c0000000-0000-0000-0000-000000000003',
'case_created', 'Akte angelegt', 'Kündigungsschutzklage Schmidt eröffnet', NOW() - INTERVAL '5 days', NOW() - INTERVAL '5 days'),
(gen_random_uuid(), 'a0000000-0000-0000-0000-000000000001', 'c0000000-0000-0000-0000-000000000004',
'case_created', 'Akte angelegt', 'Mietrückstand Hausverwaltung ./. Meier eröffnet', NOW() - INTERVAL '3 days', NOW() - INTERVAL '3 days'),
(gen_random_uuid(), 'a0000000-0000-0000-0000-000000000001', 'c0000000-0000-0000-0000-000000000001',
'status_changed', 'Fristablauf überschritten', 'Schriftsatz Anspruch 3 ist überfällig', NOW() - INTERVAL '1 day', NOW() - INTERVAL '1 day'),
(gen_random_uuid(), 'a0000000-0000-0000-0000-000000000001', 'c0000000-0000-0000-0000-000000000005',
'case_created', 'Akte angelegt', 'Erbauseinandersetzung Wagner eröffnet', NOW() - INTERVAL '60 days', NOW() - INTERVAL '60 days'),
(gen_random_uuid(), 'a0000000-0000-0000-0000-000000000001', 'c0000000-0000-0000-0000-000000000005',
'status_changed', 'Akte geschlossen', 'Erbscheinsverfahren abgeschlossen', NOW() - INTERVAL '20 days', NOW() - INTERVAL '20 days');


@@ -0,0 +1,466 @@
-- UPC Proceeding Timeline: Full event tree with conditional deadlines
-- Ported from youpc.org migrations 039 + 040
-- Run against mgmt schema in youpc.org Supabase instance
-- ========================================
-- 1. Add is_spawn + spawn_label columns
-- ========================================
ALTER TABLE deadline_rules
ADD COLUMN IF NOT EXISTS is_spawn BOOLEAN DEFAULT false,
ADD COLUMN IF NOT EXISTS spawn_label TEXT;
-- ========================================
-- 2. Clear existing UPC rules (fresh seed)
-- ========================================
DELETE FROM deadline_rules WHERE proceeding_type_id IN (
SELECT id FROM proceeding_types WHERE code IN ('INF', 'REV', 'CCR', 'APM', 'APP', 'AMD')
);
-- ========================================
-- 3. Ensure all proceeding types exist
-- ========================================
INSERT INTO proceeding_types (code, name, description, is_active, sort_order, default_color)
VALUES
('INF', 'Infringement', 'Patent infringement proceedings', true, 1, '#3b82f6'),
('REV', 'Revocation', 'Standalone revocation proceedings', true, 2, '#ef4444'),
('CCR', 'Counterclaim for Revocation', 'Counterclaim for revocation within infringement', true, 3, '#ef4444'),
('APM', 'Provisional Measures', 'Application for preliminary injunction', true, 4, '#f59e0b'),
('APP', 'Appeal', 'Appeal to the Court of Appeal', true, 5, '#8b5cf6'),
('AMD', 'Application to Amend Patent', 'Sub-proceeding for patent amendment during revocation', true, 6, '#10b981')
ON CONFLICT (code) DO UPDATE SET
name = EXCLUDED.name,
description = EXCLUDED.description,
default_color = EXCLUDED.default_color,
sort_order = EXCLUDED.sort_order,
is_active = EXCLUDED.is_active;
-- ========================================
-- 4. Seed all proceeding events
-- ========================================
DO $$
DECLARE
v_inf INTEGER;
v_rev INTEGER;
v_ccr INTEGER;
v_apm INTEGER;
v_app INTEGER;
v_amd INTEGER;
-- INF event IDs
v_inf_soc UUID;
v_inf_sod UUID;
v_inf_reply UUID;
v_inf_rejoin UUID;
v_inf_interim UUID;
v_inf_oral UUID;
v_inf_decision UUID;
v_inf_prelim UUID;
-- CCR event IDs
v_ccr_root UUID;
v_ccr_defence UUID;
v_ccr_reply UUID;
v_ccr_rejoin UUID;
v_ccr_interim UUID;
v_ccr_oral UUID;
v_ccr_decision UUID;
-- REV event IDs
v_rev_app UUID;
v_rev_defence UUID;
v_rev_reply UUID;
v_rev_rejoin UUID;
v_rev_interim UUID;
v_rev_oral UUID;
v_rev_decision UUID;
-- PI event IDs
v_pi_app UUID;
v_pi_resp UUID;
v_pi_oral UUID;
-- APP event IDs
v_app_notice UUID;
v_app_grounds UUID;
v_app_response UUID;
v_app_oral UUID;
BEGIN
SELECT id INTO v_inf FROM proceeding_types WHERE code = 'INF';
SELECT id INTO v_rev FROM proceeding_types WHERE code = 'REV';
SELECT id INTO v_ccr FROM proceeding_types WHERE code = 'CCR';
SELECT id INTO v_apm FROM proceeding_types WHERE code = 'APM';
SELECT id INTO v_app FROM proceeding_types WHERE code = 'APP';
SELECT id INTO v_amd FROM proceeding_types WHERE code = 'AMD';
-- ========================================
-- INFRINGEMENT PROCEEDINGS
-- ========================================
-- Root: Statement of Claim
v_inf_soc := gen_random_uuid();
INSERT INTO deadline_rules (id, proceeding_type_id, parent_id, code, name, description,
primary_party, event_type, is_mandatory, duration_value, duration_unit,
rule_code, deadline_notes, is_spawn, spawn_label, sequence_order, is_active)
VALUES (v_inf_soc, v_inf, NULL, 'inf.soc', 'Statement of Claim',
'Claimant files the statement of claim with the Registry',
'claimant', 'filing', true, 0, 'months', NULL, NULL, false, NULL, 0, true);
-- Preliminary Objection (from SoC)
v_inf_prelim := gen_random_uuid();
INSERT INTO deadline_rules (id, proceeding_type_id, parent_id, code, name, description,
primary_party, event_type, is_mandatory, duration_value, duration_unit,
rule_code, deadline_notes, is_spawn, spawn_label, sequence_order, is_active)
VALUES (v_inf_prelim, v_inf, v_inf_soc, 'inf.prelim', 'Preliminary Objection',
'Defendant raises preliminary objection (jurisdiction, admissibility)',
'defendant', 'filing', false, 1, 'months', 'R.19',
'Rarely triggers separate decision; usually decided with main case',
false, NULL, 1, true);
-- Statement of Defence (from SoC)
v_inf_sod := gen_random_uuid();
INSERT INTO deadline_rules (id, proceeding_type_id, parent_id, code, name, description,
primary_party, event_type, is_mandatory, duration_value, duration_unit,
rule_code, deadline_notes, is_spawn, spawn_label, sequence_order, is_active)
VALUES (v_inf_sod, v_inf, v_inf_soc, 'inf.sod', 'Statement of Defence',
'Defendant files the statement of defence',
'defendant', 'filing', true, 3, 'months', 'RoP.023', NULL,
false, NULL, 2, true);
-- Reply to Defence (from SoD) — CONDITIONAL: rule code changes if CCR
v_inf_reply := gen_random_uuid();
INSERT INTO deadline_rules (id, proceeding_type_id, parent_id, code, name, description,
primary_party, event_type, is_mandatory, duration_value, duration_unit,
rule_code, deadline_notes, is_spawn, spawn_label, sequence_order, is_active)
VALUES (v_inf_reply, v_inf, v_inf_sod, 'inf.reply', 'Reply to Defence',
'Claimant''s reply to the statement of defence (includes Defence to Counterclaim if CCR active)',
'claimant', 'filing', true, 2, 'months', 'RoP.029b', NULL,
false, NULL, 1, true);
-- Rejoinder (from Reply) — CONDITIONAL: duration changes if CCR
v_inf_rejoin := gen_random_uuid();
INSERT INTO deadline_rules (id, proceeding_type_id, parent_id, code, name, description,
primary_party, event_type, is_mandatory, duration_value, duration_unit,
rule_code, deadline_notes, is_spawn, spawn_label, sequence_order, is_active)
VALUES (v_inf_rejoin, v_inf, v_inf_reply, 'inf.rejoin', 'Rejoinder',
'Defendant''s rejoinder to the reply',
'defendant', 'filing', true, 1, 'months', 'RoP.029c', NULL,
false, NULL, 0, true);
-- Interim Conference
v_inf_interim := gen_random_uuid();
INSERT INTO deadline_rules (id, proceeding_type_id, parent_id, code, name, description,
primary_party, event_type, is_mandatory, duration_value, duration_unit,
rule_code, deadline_notes, is_spawn, spawn_label, sequence_order, is_active)
VALUES (v_inf_interim, v_inf, v_inf_rejoin, 'inf.interim', 'Interim Conference',
'Interim conference with the judge-rapporteur',
'court', 'hearing', true, 0, 'months', NULL, NULL, false, NULL, 0, true);
-- Oral Hearing
v_inf_oral := gen_random_uuid();
INSERT INTO deadline_rules (id, proceeding_type_id, parent_id, code, name, description,
primary_party, event_type, is_mandatory, duration_value, duration_unit,
rule_code, deadline_notes, is_spawn, spawn_label, sequence_order, is_active)
VALUES (v_inf_oral, v_inf, v_inf_interim, 'inf.oral', 'Oral Hearing',
'Oral hearing before the panel',
'court', 'hearing', true, 0, 'months', NULL, NULL, false, NULL, 0, true);
-- Decision
v_inf_decision := gen_random_uuid();
INSERT INTO deadline_rules (id, proceeding_type_id, parent_id, code, name, description,
primary_party, event_type, is_mandatory, duration_value, duration_unit,
rule_code, deadline_notes, is_spawn, spawn_label, sequence_order, is_active)
VALUES (v_inf_decision, v_inf, v_inf_oral, 'inf.decision', 'Decision',
'Panel delivers its decision',
'court', 'decision', true, 0, 'months', NULL, NULL, false, NULL, 0, true);
-- Appeal (spawn from Decision — cross-type to APP)
INSERT INTO deadline_rules (id, proceeding_type_id, parent_id, code, name, description,
primary_party, event_type, is_mandatory, duration_value, duration_unit,
rule_code, deadline_notes, is_spawn, spawn_label, sequence_order, is_active)
VALUES (gen_random_uuid(), v_app, v_inf_decision, 'inf.appeal', 'Appeal',
'Appeal against infringement decision to Court of Appeal',
'both', 'filing', true, 2, 'months', 'RoP.220.1', NULL,
true, 'Appeal filed', 0, true);
-- ========================================
-- COUNTERCLAIM FOR REVOCATION (spawn from SoD)
-- ========================================
v_ccr_root := gen_random_uuid();
INSERT INTO deadline_rules (id, proceeding_type_id, parent_id, code, name, description,
primary_party, event_type, is_mandatory, duration_value, duration_unit,
rule_code, deadline_notes, is_spawn, spawn_label, sequence_order, is_active)
VALUES (v_ccr_root, v_ccr, v_inf_sod, 'ccr.counterclaim', 'Counterclaim for Revocation',
'Defendant files counterclaim challenging patent validity (included in SoD)',
'defendant', 'filing', true, 0, 'months', NULL, NULL,
true, 'Includes counterclaim for revocation', 0, true);
-- Defence to Counterclaim
v_ccr_defence := gen_random_uuid();
INSERT INTO deadline_rules (id, proceeding_type_id, parent_id, code, name, description,
primary_party, event_type, is_mandatory, duration_value, duration_unit,
rule_code, deadline_notes, is_spawn, spawn_label, sequence_order, is_active)
VALUES (v_ccr_defence, v_ccr, v_ccr_root, 'ccr.defence', 'Defence to Counterclaim',
'Patent proprietor files defence to revocation counterclaim',
'claimant', 'filing', true, 3, 'months', 'RoP.050', NULL,
false, NULL, 0, true);
-- Reply in CCR
v_ccr_reply := gen_random_uuid();
INSERT INTO deadline_rules (id, proceeding_type_id, parent_id, code, name, description,
primary_party, event_type, is_mandatory, duration_value, duration_unit,
rule_code, deadline_notes, is_spawn, spawn_label, sequence_order, is_active)
VALUES (v_ccr_reply, v_ccr, v_ccr_defence, 'ccr.reply', 'Reply in CCR',
'Reply in the counterclaim for revocation',
'defendant', 'filing', true, 2, 'months', NULL,
'Timing overlaps with infringement Rejoinder',
false, NULL, 1, true);
-- Rejoinder in CCR
v_ccr_rejoin := gen_random_uuid();
INSERT INTO deadline_rules (id, proceeding_type_id, parent_id, code, name, description,
primary_party, event_type, is_mandatory, duration_value, duration_unit,
rule_code, deadline_notes, is_spawn, spawn_label, sequence_order, is_active)
VALUES (v_ccr_rejoin, v_ccr, v_ccr_reply, 'ccr.rejoin', 'Rejoinder in CCR',
'Rejoinder in the counterclaim for revocation',
'claimant', 'filing', true, 2, 'months', NULL, NULL,
false, NULL, 0, true);
-- Interim Conference
v_ccr_interim := gen_random_uuid();
INSERT INTO deadline_rules (id, proceeding_type_id, parent_id, code, name, description,
primary_party, event_type, is_mandatory, duration_value, duration_unit,
rule_code, deadline_notes, is_spawn, spawn_label, sequence_order, is_active)
VALUES (v_ccr_interim, v_ccr, v_ccr_rejoin, 'ccr.interim', 'Interim Conference',
'Interim conference covering revocation issues',
'court', 'hearing', true, 0, 'months', NULL,
'May be combined with infringement IC',
false, NULL, 0, true);
-- Oral Hearing
v_ccr_oral := gen_random_uuid();
INSERT INTO deadline_rules (id, proceeding_type_id, parent_id, code, name, description,
primary_party, event_type, is_mandatory, duration_value, duration_unit,
rule_code, deadline_notes, is_spawn, spawn_label, sequence_order, is_active)
VALUES (v_ccr_oral, v_ccr, v_ccr_interim, 'ccr.oral', 'Oral Hearing',
'Oral hearing on validity',
'court', 'hearing', true, 0, 'months', NULL, NULL,
false, NULL, 0, true);
-- Decision
v_ccr_decision := gen_random_uuid();
INSERT INTO deadline_rules (id, proceeding_type_id, parent_id, code, name, description,
primary_party, event_type, is_mandatory, duration_value, duration_unit,
rule_code, deadline_notes, is_spawn, spawn_label, sequence_order, is_active)
VALUES (v_ccr_decision, v_ccr, v_ccr_oral, 'ccr.decision', 'Decision',
'Decision on validity of the patent',
'court', 'decision', true, 0, 'months', NULL, NULL,
false, NULL, 0, true);
-- Appeal from CCR
INSERT INTO deadline_rules (id, proceeding_type_id, parent_id, code, name, description,
primary_party, event_type, is_mandatory, duration_value, duration_unit,
rule_code, deadline_notes, is_spawn, spawn_label, sequence_order, is_active)
VALUES (gen_random_uuid(), v_app, v_ccr_decision, 'ccr.appeal', 'Appeal',
'Appeal against revocation decision to Court of Appeal',
'both', 'filing', true, 2, 'months', 'RoP.220.1', NULL,
true, 'Appeal filed', 0, true);
-- Application to Amend Patent (spawn from Defence to CCR)
INSERT INTO deadline_rules (id, proceeding_type_id, parent_id, code, name, description,
primary_party, event_type, is_mandatory, duration_value, duration_unit,
rule_code, deadline_notes, is_spawn, spawn_label, sequence_order, is_active)
VALUES (gen_random_uuid(), v_amd, v_ccr_defence, 'ccr.amend', 'Application to Amend Patent',
'Patent proprietor applies to amend the patent during revocation proceedings',
'claimant', 'filing', false, 0, 'months', NULL, NULL,
true, 'Includes application to amend patent', 2, true);
-- ========================================
-- STANDALONE REVOCATION
-- ========================================
v_rev_app := gen_random_uuid();
INSERT INTO deadline_rules (id, proceeding_type_id, parent_id, code, name, description,
primary_party, event_type, is_mandatory, duration_value, duration_unit,
rule_code, deadline_notes, is_spawn, spawn_label, sequence_order, is_active)
VALUES (v_rev_app, v_rev, NULL, 'rev.app', 'Application for Revocation',
'Applicant files standalone application for revocation of the patent',
'claimant', 'filing', true, 0, 'months', NULL, NULL,
false, NULL, 0, true);
v_rev_defence := gen_random_uuid();
INSERT INTO deadline_rules (id, proceeding_type_id, parent_id, code, name, description,
primary_party, event_type, is_mandatory, duration_value, duration_unit,
rule_code, deadline_notes, is_spawn, spawn_label, sequence_order, is_active)
VALUES (v_rev_defence, v_rev, v_rev_app, 'rev.defence', 'Defence to Revocation',
'Patent proprietor files defence to revocation application',
'defendant', 'filing', true, 3, 'months', NULL, NULL,
false, NULL, 0, true);
v_rev_reply := gen_random_uuid();
INSERT INTO deadline_rules (id, proceeding_type_id, parent_id, code, name, description,
primary_party, event_type, is_mandatory, duration_value, duration_unit,
rule_code, deadline_notes, is_spawn, spawn_label, sequence_order, is_active)
VALUES (v_rev_reply, v_rev, v_rev_defence, 'rev.reply', 'Reply',
'Reply in standalone revocation proceedings',
'claimant', 'filing', true, 2, 'months', NULL, NULL,
false, NULL, 1, true);
v_rev_rejoin := gen_random_uuid();
INSERT INTO deadline_rules (id, proceeding_type_id, parent_id, code, name, description,
primary_party, event_type, is_mandatory, duration_value, duration_unit,
rule_code, deadline_notes, is_spawn, spawn_label, sequence_order, is_active)
VALUES (v_rev_rejoin, v_rev, v_rev_reply, 'rev.rejoin', 'Rejoinder',
'Rejoinder in standalone revocation proceedings',
'defendant', 'filing', true, 2, 'months', NULL, NULL,
false, NULL, 0, true);
v_rev_interim := gen_random_uuid();
INSERT INTO deadline_rules (id, proceeding_type_id, parent_id, code, name, description,
primary_party, event_type, is_mandatory, duration_value, duration_unit,
rule_code, deadline_notes, is_spawn, spawn_label, sequence_order, is_active)
VALUES (v_rev_interim, v_rev, v_rev_rejoin, 'rev.interim', 'Interim Conference',
'Interim conference with the judge-rapporteur',
'court', 'hearing', true, 0, 'months', NULL, NULL,
false, NULL, 0, true);
v_rev_oral := gen_random_uuid();
INSERT INTO deadline_rules (id, proceeding_type_id, parent_id, code, name, description,
primary_party, event_type, is_mandatory, duration_value, duration_unit,
rule_code, deadline_notes, is_spawn, spawn_label, sequence_order, is_active)
VALUES (v_rev_oral, v_rev, v_rev_interim, 'rev.oral', 'Oral Hearing',
'Oral hearing on validity in standalone revocation',
'court', 'hearing', true, 0, 'months', NULL, NULL,
false, NULL, 0, true);
v_rev_decision := gen_random_uuid();
INSERT INTO deadline_rules (id, proceeding_type_id, parent_id, code, name, description,
primary_party, event_type, is_mandatory, duration_value, duration_unit,
rule_code, deadline_notes, is_spawn, spawn_label, sequence_order, is_active)
VALUES (v_rev_decision, v_rev, v_rev_oral, 'rev.decision', 'Decision',
'Decision on patent validity',
'court', 'decision', true, 0, 'months', NULL, NULL,
false, NULL, 0, true);
-- Appeal from REV
INSERT INTO deadline_rules (id, proceeding_type_id, parent_id, code, name, description,
primary_party, event_type, is_mandatory, duration_value, duration_unit,
rule_code, deadline_notes, is_spawn, spawn_label, sequence_order, is_active)
VALUES (gen_random_uuid(), v_app, v_rev_decision, 'rev.appeal', 'Appeal',
'Appeal against revocation decision to Court of Appeal',
'both', 'filing', true, 2, 'months', 'RoP.220.1', NULL,
true, 'Appeal filed', 0, true);
-- Application to Amend Patent from REV Defence
INSERT INTO deadline_rules (id, proceeding_type_id, parent_id, code, name, description,
primary_party, event_type, is_mandatory, duration_value, duration_unit,
rule_code, deadline_notes, is_spawn, spawn_label, sequence_order, is_active)
VALUES (gen_random_uuid(), v_amd, v_rev_defence, 'rev.amend', 'Application to Amend Patent',
'Patent proprietor applies to amend the patent',
'claimant', 'filing', false, 0, 'months', NULL, NULL,
true, 'Includes application to amend patent', 2, true);
-- ========================================
-- PRELIMINARY INJUNCTION
-- ========================================
v_pi_app := gen_random_uuid();
INSERT INTO deadline_rules (id, proceeding_type_id, parent_id, code, name, description,
primary_party, event_type, is_mandatory, duration_value, duration_unit,
rule_code, deadline_notes, is_spawn, spawn_label, sequence_order, is_active)
VALUES (v_pi_app, v_apm, NULL, 'pi.app', 'Application for Provisional Measures',
'Claimant files application for preliminary injunction',
'claimant', 'filing', true, 0, 'months', NULL, NULL,
false, NULL, 0, true);
v_pi_resp := gen_random_uuid();
INSERT INTO deadline_rules (id, proceeding_type_id, parent_id, code, name, description,
primary_party, event_type, is_mandatory, duration_value, duration_unit,
rule_code, deadline_notes, is_spawn, spawn_label, sequence_order, is_active)
VALUES (v_pi_resp, v_apm, v_pi_app, 'pi.response', 'Response to PI Application',
'Defendant files response to preliminary injunction application',
'defendant', 'filing', true, 0, 'months', NULL,
'Deadline set by court',
false, NULL, 0, true);
v_pi_oral := gen_random_uuid();
INSERT INTO deadline_rules (id, proceeding_type_id, parent_id, code, name, description,
primary_party, event_type, is_mandatory, duration_value, duration_unit,
rule_code, deadline_notes, is_spawn, spawn_label, sequence_order, is_active)
VALUES (v_pi_oral, v_apm, v_pi_resp, 'pi.oral', 'Oral Hearing',
'Oral hearing on provisional measures',
'court', 'hearing', true, 0, 'months', NULL, NULL,
false, NULL, 0, true);
INSERT INTO deadline_rules (id, proceeding_type_id, parent_id, code, name, description,
primary_party, event_type, is_mandatory, duration_value, duration_unit,
rule_code, deadline_notes, is_spawn, spawn_label, sequence_order, is_active)
VALUES (gen_random_uuid(), v_apm, v_pi_oral, 'pi.order', 'Order on Provisional Measures',
'Court issues order on preliminary injunction',
'court', 'decision', true, 0, 'months', NULL, NULL,
false, NULL, 0, true);
-- ========================================
-- APPEAL (standalone)
-- ========================================
v_app_notice := gen_random_uuid();
INSERT INTO deadline_rules (id, proceeding_type_id, parent_id, code, name, description,
primary_party, event_type, is_mandatory, duration_value, duration_unit,
rule_code, deadline_notes, is_spawn, spawn_label, sequence_order, is_active)
VALUES (v_app_notice, v_app, NULL, 'app.notice', 'Notice of Appeal',
'Appellant files notice of appeal with the Court of Appeal',
'both', 'filing', true, 0, 'months', NULL, NULL,
false, NULL, 0, true);
v_app_grounds := gen_random_uuid();
INSERT INTO deadline_rules (id, proceeding_type_id, parent_id, code, name, description,
primary_party, event_type, is_mandatory, duration_value, duration_unit,
rule_code, deadline_notes, is_spawn, spawn_label, sequence_order, is_active)
VALUES (v_app_grounds, v_app, v_app_notice, 'app.grounds', 'Statement of Grounds of Appeal',
'Appellant files statement of grounds',
'both', 'filing', true, 2, 'months', 'RoP.220.1', NULL,
false, NULL, 0, true);
v_app_response := gen_random_uuid();
INSERT INTO deadline_rules (id, proceeding_type_id, parent_id, code, name, description,
primary_party, event_type, is_mandatory, duration_value, duration_unit,
rule_code, deadline_notes, is_spawn, spawn_label, sequence_order, is_active)
VALUES (v_app_response, v_app, v_app_grounds, 'app.response', 'Response to Appeal',
'Respondent files response to the appeal',
'both', 'filing', true, 2, 'months', NULL, NULL,
false, NULL, 0, true);
v_app_oral := gen_random_uuid();
INSERT INTO deadline_rules (id, proceeding_type_id, parent_id, code, name, description,
primary_party, event_type, is_mandatory, duration_value, duration_unit,
rule_code, deadline_notes, is_spawn, spawn_label, sequence_order, is_active)
VALUES (v_app_oral, v_app, v_app_response, 'app.oral', 'Oral Hearing',
'Oral hearing before the Court of Appeal',
'court', 'hearing', true, 0, 'months', NULL, NULL,
false, NULL, 0, true);
INSERT INTO deadline_rules (id, proceeding_type_id, parent_id, code, name, description,
primary_party, event_type, is_mandatory, duration_value, duration_unit,
rule_code, deadline_notes, is_spawn, spawn_label, sequence_order, is_active)
VALUES (gen_random_uuid(), v_app, v_app_oral, 'app.decision', 'Decision',
'Court of Appeal delivers its decision',
'court', 'decision', true, 0, 'months', NULL, NULL,
false, NULL, 0, true);
-- ========================================
-- 5. Set conditional deadlines (from 040)
-- ========================================
-- Reply to Defence: rule code changes when CCR is active
-- Default: RoP.029b | With CCR: RoP.029a
UPDATE deadline_rules
SET condition_rule_id = v_ccr_root,
alt_rule_code = 'RoP.029a'
WHERE id = v_inf_reply;
-- Rejoinder: duration changes when CCR is active
-- Default: 1 month RoP.029c | With CCR: 2 months RoP.029d
UPDATE deadline_rules
SET condition_rule_id = v_ccr_root,
alt_duration_value = 2,
alt_duration_unit = 'months',
alt_rule_code = 'RoP.029d'
WHERE id = v_inf_rejoin;
END $$;
@@ -6,6 +6,12 @@ services:
- "8080"
environment:
- PORT=8080
- DATABASE_URL=${DATABASE_URL}
- SUPABASE_URL=${SUPABASE_URL}
- SUPABASE_ANON_KEY=${SUPABASE_ANON_KEY}
- SUPABASE_SERVICE_KEY=${SUPABASE_SERVICE_KEY}
- SUPABASE_JWT_SECRET=${SUPABASE_JWT_SECRET}
- ANTHROPIC_API_KEY=${ANTHROPIC_API_KEY}
healthcheck:
test: ["CMD", "wget", "--spider", "-q", "http://localhost:8080/health"]
interval: 30s
@@ -16,6 +22,9 @@ services:
frontend:
build:
context: ./frontend
args:
NEXT_PUBLIC_SUPABASE_URL: ${SUPABASE_URL}
NEXT_PUBLIC_SUPABASE_ANON_KEY: ${SUPABASE_ANON_KEY}
expose:
- "3000"
depends_on:
@@ -23,6 +32,8 @@ services:
condition: service_healthy
environment:
- API_URL=http://backend:8080
- NEXT_PUBLIC_SUPABASE_URL=${SUPABASE_URL}
- NEXT_PUBLIC_SUPABASE_ANON_KEY=${SUPABASE_ANON_KEY}
healthcheck:
test: ["CMD", "node", "-e", "fetch('http://localhost:3000').then(r=>{if(!r.ok)throw r.status;process.exit(0)}).catch(()=>process.exit(1))"]
interval: 30s

docs/kostenrechner-plan.md Normal file

@@ -0,0 +1,514 @@
# Patentprozesskostenrechner — Implementation Plan
**Date:** 2026-03-31
**Source:** Analysis of `Patentprozesskostenrechner.xlsm` (c) 2021 M. Siebels
**Status:** Research complete, ready for implementation
---
## 1. Fee Calculation Logic Summary
The calculator computes costs for German patent litigation using two statutory fee systems (GKG for court fees, RVG for attorney fees), plus PatKostG for BPatG/DPMA proceedings:
### GKG (Gerichtskostengesetz) — Court Fees
A **step-based accumulator**. The Streitwert is divided into brackets, each with a step size and per-step increment. The algorithm:
1. Start with the minimum fee (first row of the fee table)
2. For each bracket: compute `steps = ceil(portion_in_bracket / step_size)`
3. Accumulate: `fee += steps * increment`
4. Result = "einfache Gebühr" (1.0x base fee)
5. Multiply by the instance-specific factor (e.g., 3.0x for LG, 4.0x for OLG)
For Streitwert > EUR 500,000 (post-2025 schedule): `base = 4,138 + ceil((Streitwert - 500,000) / 50,000) * 210`
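The steps above can be sketched as a small TypeScript function (a sketch, not the shipped implementation; the bracket rows reuse the 2025 GKG column from the fee data in section 2, and the names `Bracket` and `baseFee` are illustrative):

```typescript
// Step-based GKG/RVG accumulator. Each row: [upperBound, stepSize, increment].
// Illustrative data: the 2025 GKG column from the fee schedules in section 2.
type Bracket = [upper: number, step: number, inc: number];

const gkg2025: Bracket[] = [
  [500, 300, 40],
  [2000, 500, 21],
  [10000, 1000, 22.5],
  [25000, 3000, 30.5],
  [50000, 5000, 40.5],
  [200000, 15000, 140],
  [500000, 30000, 210],
  [Infinity, 50000, 210],
];

/** 1.0x ("einfache") fee for a given Streitwert. */
function baseFee(streitwert: number, brackets: Bracket[]): number {
  let fee = brackets[0][2]; // minimum fee: the first row's increment
  let lower = brackets[0][0];
  for (const [upper, step, inc] of brackets.slice(1)) {
    if (streitwert <= lower) break;
    const portion = Math.min(streitwert, upper) - lower;
    fee += Math.ceil(portion / step) * inc; // started steps count in full
    lower = upper;
  }
  return fee;
}
```

With these values, `baseFee(500000, gkg2025)` reproduces the 4,138 EUR constant in the over-500,000 formula, and the open-ended last bracket then adds 210 EUR per started 50,000 EUR step.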
### RVG (Rechtsanwaltsvergütungsgesetz) — Attorney Fees
Same step-based lookup but with its own column in the fee table. Per attorney, the formula is:
```
attorney_cost = (VG_factor * base_RVG + increase_fee + TG_factor * base_RVG + Pauschale) * (1 + VAT)
```
Where:
- **VG** = Verfahrensgebühr (procedural fee): 1.3x (LG/BPatG), 1.6x (OLG/BGH nullity), 2.3x (BGH NZB/Rev for RA)
- **TG** = Terminsgebühr (hearing fee): 1.2x or 1.5x (BGH), only if hearing held
- **Increase fee** (Nr. 1008 VV RVG): `MIN((clients - 1) * 0.3, 2.0) * base_RVG` for multiple clients
- **Pauschale** = EUR 20 (Auslagenpauschale Nr. 7002 VV RVG)
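The per-attorney formula above can be sketched like this (a minimal sketch with illustrative names; the factors come from the multiplier tables below, and `baseRVG` is the 1.0x RVG fee from the step-based lookup):

```typescript
// Per-attorney cost: VG + Erhoehungsgebuehr + TG + Auslagenpauschale, plus VAT.
interface AttorneyInputs {
  baseRVG: number;      // 1.0x RVG fee from the step-based lookup
  vgFactor: number;     // Verfahrensgebuehr factor, e.g. 1.3 at the LG
  tgFactor: number;     // Terminsgebuehr factor, e.g. 1.2
  hearingHeld: boolean; // TG accrues only if a hearing took place
  numClients: number;   // >= 1; drives the Erhoehungsgebuehr
  vatRate: number;      // e.g. 0.19
}

function attorneyCost(p: AttorneyInputs): number {
  const vg = p.vgFactor * p.baseRVG;
  // Nr. 1008 VV RVG: +0.3 per client beyond the first, capped at 2.0
  const increase = Math.min((p.numClients - 1) * 0.3, 2.0) * p.baseRVG;
  const tg = p.hearingHeld ? p.tgFactor * p.baseRVG : 0;
  const pauschale = 20; // Auslagenpauschale Nr. 7002 VV RVG
  return (vg + increase + tg + pauschale) * (1 + p.vatRate);
}
```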
### PatKostG — Patent Court Fees
BPatG nullity uses PatKostG instead of GKG for court fees (but same step-based lookup from the same table). DPMA/BPatG cancellation uses fixed fees (EUR 300 / EUR 500).
### Instance Multipliers (Complete Reference)
| Instance | Court Fee Factor | Source | Fee Basis |
|---|---|---|---|
| **LG** (infringement 1st) | 3.0x GKG | Nr. 1210 KV GKG | GKG |
| **OLG** (infringement appeal) | 4.0x GKG | Nr. 1420 KV GKG | GKG |
| **BGH NZB** (leave to appeal) | 2.0x GKG | Nr. 1242 KV GKG | GKG |
| **BGH Revision** | 5.0x GKG | Nr. 1230 KV GKG | GKG |
| **BPatG** (nullity 1st) | 4.5x | Nr. 402 100 Anl. PatKostG | PatKostG |
| **BGH** (nullity appeal) | 6.0x GKG | Nr. 1250 KV GKG | GKG |
| **DPMA** (cancellation) | EUR 300 flat | Nr. 323 100 Anl. PatKostG | Fixed |
| **BPatG** (cancellation appeal) | EUR 500 flat | Nr. 401 100 Anl. PatKostG | Fixed |

| Instance | RA VG Factor | RA TG Factor | PA VG Factor | PA TG Factor |
|---|---|---|---|---|
| **LG** | 1.3x | 1.2x | 1.3x | 1.2x |
| **OLG** | 1.6x | 1.2x | 1.6x | 1.2x |
| **BGH NZB** | 2.3x | 1.2x | 1.6x | 1.2x |
| **BGH Revision** | 2.3x | 1.5x | 1.6x | 1.5x |
| **BPatG** (nullity) | 1.3x | 1.2x | 1.3x | 1.2x |
| **BGH** (nullity appeal) | 1.6x | 1.5x | 1.6x | 1.5x |
| **DPMA** | 1.3x | 1.2x | — | — |
| **BPatG** (cancellation) | 1.3x | 1.2x | — | — |
---
## 2. Fee Schedule Data (JSON)
Five historical versions of the fee table, extracted directly from the Excel ListObjects. Each row: `[upperBound, stepSize, gkgIncrement, rvgIncrement]`. Note: increments may be decimal (e.g., 51.5 EUR per step), and the last bracket uses a very large upper bound (effectively infinity).
```json
{
"feeSchedules": {
"2005": {
"label": "GKG/RVG 2006-09-01",
"validFrom": "2006-09-01",
"brackets": [
[300, 300, 25, 25],
[1500, 300, 10, 20],
[5000, 500, 8, 28],
[10000, 1000, 15, 37],
[25000, 3000, 23, 40],
[50000, 5000, 29, 72],
[200000, 15000, 100, 77],
[500000, 30000, 150, 118],
[1e20, 50000, 150, 150]
]
},
"2013": {
"label": "GKG/RVG 2013-08-01",
"validFrom": "2013-08-01",
"brackets": [
[500, 300, 35, 45],
[2000, 500, 18, 35],
[10000, 1000, 19, 51],
[25000, 3000, 26, 46],
[50000, 5000, 35, 75],
[200000, 15000, 120, 85],
[500000, 30000, 179, 120],
[1e20, 50000, 180, 150]
]
},
"2021": {
"label": "GKG/RVG 2021-01-01",
"validFrom": "2021-01-01",
"brackets": [
[500, 300, 38, 49],
[2000, 500, 20, 39],
[10000, 1000, 21, 56],
[25000, 3000, 29, 52],
[50000, 5000, 38, 81],
[200000, 15000, 132, 94],
[500000, 30000, 198, 132],
[1e20, 50000, 198, 165]
]
},
"2025": {
"label": "GKG/RVG 2025-06-01",
"validFrom": "2025-06-01",
"brackets": [
[500, 300, 40, 51.5],
[2000, 500, 21, 41.5],
[10000, 1000, 22.5, 59.5],
[25000, 3000, 30.5, 55],
[50000, 5000, 40.5, 86],
[200000, 15000, 140, 99.5],
[500000, 30000, 210, 140],
[1e20, 50000, 210, 175]
]
},
"Aktuell": {
"label": "Aktuell (= 2025-06-01)",
"validFrom": "2025-06-01",
"aliasOf": "2025"
}
},
"constants": {
"erhoehungsfaktor": 0.3,
"erhoehungsfaktorMax": 2.0,
"auslagenpauschale": 20
}
}
```
**Notes on the data:**
- The 2005 version has 9 brackets and starts at 300 rather than 500; the bracket boundaries differ between versions, not just the increments.
- Increments are **not integers** in the 2025 version (e.g., 51.5, 41.5, 59.5) — the implementation must handle decimal arithmetic.
- The last bracket upper bound is `1e+20` in the Excel (effectively infinity). Use `Infinity` in TypeScript or a sentinel value.
- "Aktuell" is currently identical to "2025" — implement as an alias that can diverge when fees are next updated.
### UPC Fee Data (New — Not in Excel)
```json
{
"upcFees": {
"pre2026": {
"label": "UPC (vor 2026)",
"validFrom": "2023-06-01",
"fixedFees": {
"infringement": 11000,
"counterclaim_infringement": 11000,
"non_infringement": 11000,
"license_compensation": 11000,
"determine_damages": 3000,
"revocation_standalone": 20000,
"counterclaim_revocation": 20000,
"provisional_measures": 11000
},
"valueBased": [
{ "maxValue": 500000, "fee": 0 },
{ "maxValue": 750000, "fee": 2500 },
{ "maxValue": 1000000, "fee": 4000 },
{ "maxValue": 1500000, "fee": 8000 },
{ "maxValue": 2000000, "fee": 13000 },
{ "maxValue": 3000000, "fee": 20000 },
{ "maxValue": 4000000, "fee": 26000 },
{ "maxValue": 5000000, "fee": 32000 },
{ "maxValue": 6000000, "fee": 39000 },
{ "maxValue": 7000000, "fee": 46000 },
{ "maxValue": 8000000, "fee": 52000 },
{ "maxValue": 9000000, "fee": 58000 },
{ "maxValue": 10000000, "fee": 65000 },
{ "maxValue": 15000000, "fee": 75000 },
{ "maxValue": 20000000, "fee": 100000 },
{ "maxValue": 25000000, "fee": 125000 },
{ "maxValue": 30000000, "fee": 150000 },
{ "maxValue": 50000000, "fee": 250000 },
{ "maxValue": null, "fee": 325000 }
],
"recoverableCosts": [
{ "maxValue": 250000, "ceiling": 38000 },
{ "maxValue": 500000, "ceiling": 56000 },
{ "maxValue": 1000000, "ceiling": 112000 },
{ "maxValue": 2000000, "ceiling": 200000 },
{ "maxValue": 4000000, "ceiling": 400000 },
{ "maxValue": 8000000, "ceiling": 600000 },
{ "maxValue": 16000000, "ceiling": 800000 },
{ "maxValue": 30000000, "ceiling": 1200000 },
{ "maxValue": 50000000, "ceiling": 1500000 },
{ "maxValue": null, "ceiling": 2000000 }
],
"smReduction": 0.40
},
"2026": {
"label": "UPC (ab 2026)",
"validFrom": "2026-01-01",
"fixedFees": {
"infringement": 14600,
"counterclaim_infringement": 14600,
"non_infringement": 14600,
"license_compensation": 14600,
"determine_damages": 4000,
"revocation_standalone": 26500,
"counterclaim_revocation": 26500,
"provisional_measures": 14600
},
"valueBased": "TODO: exact 2026 table not yet published in extractable form. Estimated ~32% increase on pre-2026 values. Replace with official data when available.",
"smReduction": 0.50
}
}
}
```
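A bracket lookup for the value-based fee plus the SME reduction might look like the following (a sketch; only a few pre-2026 brackets are repeated here, and treating `maxValue: null` as "no upper bound" is the convention assumed in the JSON above):

```typescript
// UPC court fee: fixed fee per action type plus the value-based component,
// with an optional SME reduction applied to the total.
interface UpcBracket { maxValue: number | null; fee: number; }

// Abbreviated copy of the pre-2026 value-based table above.
const valueBasedPre2026: UpcBracket[] = [
  { maxValue: 500_000, fee: 0 },
  { maxValue: 750_000, fee: 2_500 },
  { maxValue: 1_000_000, fee: 4_000 },
  { maxValue: null, fee: 325_000 }, // open-ended top bracket
];

function upcCourtFee(
  disputeValue: number,
  fixedFee: number,
  brackets: UpcBracket[],
  smeReduction = 0, // 0.40 pre-2026, 0.50 from 2026
): number {
  // First bracket whose upper bound covers the dispute value.
  const row = brackets.find(
    (b) => b.maxValue === null || disputeValue <= b.maxValue,
  )!;
  return (fixedFee + row.fee) * (1 - smeReduction);
}
```

For example, an infringement action (fixed fee 11,000 EUR) at a 600,000 EUR value falls into the 750,000 bracket, giving 13,500 EUR before any SME reduction.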
---
## 3. Architecture Decision
### Recommendation: New page within KanzlAI-mGMT at `/kosten/rechner`
**Reasons:**
1. **Existing infrastructure**: KanzlAI already has billing/cost infrastructure (time tracking, invoices, RVG rates). The Kostenrechner is a natural extension.
2. **Shared UI patterns**: Sidebar nav, card layout, Tailwind styling, Recharts for any comparison charts — all already established.
3. **Future integration**: Cost calculations can link to cases (attach estimated costs to a case), feed into Prozesskostensicherheit calculations, and inform billing.
4. **No auth required for core calculator**: The page can work without login (public tool for marketing), but logged-in users get case-linking and save functionality.
5. **No backend needed initially**: All fee calculations are deterministic lookups + arithmetic — pure frontend. Data lives as static JSON/TypeScript constants.
**Against standalone deployment:**
- Maintaining a separate deploy adds operational overhead for zero benefit
- Can't integrate with cases or billing later without cross-origin complexity
- Duplicates styling/build tooling
### Proposed Route Structure
```
/kosten/ — Overview page (links to sub-calculators)
/kosten/rechner — Patentprozesskostenrechner (main calculator)
/kosten/rechner/vergleich — (future) Venue comparison tool (DE vs UPC)
```
### Frontend Architecture
```
frontend/src/
app/(app)/kosten/
page.tsx — Overview
rechner/
page.tsx — Calculator page (client component)
lib/
costs/
fee-tables.ts — All fee schedule data (GKG, RVG, UPC)
calculator.ts — Pure calculation functions
types.ts — TypeScript types for inputs/outputs
components/
costs/
CostCalculator.tsx — Main calculator component
InstanceCard.tsx — Per-instance input card (LG, OLG, etc.)
CostSummary.tsx — Results display with breakdown
CostComparison.tsx — (future) Side-by-side venue comparison
```
**No backend changes needed.** All calculation logic is client-side. If we later want to save calculations to a case, we add one API endpoint.
---
## 4. All Inputs
### Global Inputs
| Input | Type | Range | Default | Description |
|---|---|---|---|---|
| `vatRate` | enum | 0%, 16%, 19% | 0% | Umsatzsteuer |
| `streitwert` | number | 500 - 30,000,000 | 100,000 | Amount in dispute (EUR) |
| `erhoehungsStreitwert` | number | >= streitwert | = streitwert | Increased amount (for Erhoehungsgebuehr) |
| `proceedingType` | enum | infringement, nullity, cancellation, security | infringement | Which proceeding to calculate |
### Per-Instance Inputs (Infringement: LG, OLG, BGH-NZB, BGH-Rev)
| Input | Type | Default | Description |
|---|---|---|---|
| `enabled` | boolean | true (LG), false (others) | Include this instance |
| `feeVersion` | enum | "Aktuell" | Fee schedule version (2005/2013/2021/2025/Aktuell) |
| `numAttorneys` | integer >= 0 | 1 | Rechtsanwälte |
| `numPatentAttorneys` | integer >= 0 | 1 | Patentanwälte |
| `oralHearing` | boolean | true | Mündliche Verhandlung held? |
| `expertFees` | number >= 0 | 0 | Sachverständigenvergütung (EUR) |
| `terminationType` | enum | "Urteil" | How case ended (Urteil/Vergleich/Klagerücknahme/etc.) |
| `numClients` | integer >= 1 | 1 | Mandanten (for Erhöhungsgebühr) |
### Per-Instance Inputs (Nullity: BPatG, BGH)
Same structure as infringement instances.
### Per-Instance Inputs (Cancellation: DPMA, BPatG)
Same structure but no patent attorneys at DPMA level, fixed court fees.
### UPC-Specific Inputs (New)
| Input | Type | Default | Description |
|---|---|---|---|
| `actionType` | enum | "infringement" | UPC action type (affects fixed fee + value-based applicability) |
| `feeVersion` | enum | "2026" | pre-2026 or 2026 |
| `isSME` | boolean | false | Small/micro enterprise (40%/50% court fee reduction) |
| `includeAppeal` | boolean | false | Include Court of Appeal |
| `includeRevocation` | boolean | false | Include counterclaim for revocation |
---
## 5. All Outputs
### Per-Instance Breakdown
| Output | Description |
|---|---|
| Court fee (base) | e.g., "3.0x GKG = EUR 18,714" |
| Expert fees | If applicable |
| **Court subtotal** | Court fee + expert fees |
| Per-attorney cost | VG + Erhöhung + TG + Pauschale, before VAT |
| Per-attorney cost (incl. VAT) | × (1 + VAT) |
| Attorney subtotal | Per-attorney × num_attorneys |
| Patent attorney subtotal | Same calculation × num_patent_attorneys |
| **Instance total** | Court subtotal + attorney subtotal + patent attorney subtotal |
### Summary Totals (Infringement)
| Output | Description |
|---|---|
| Gesamtkosten bei Nichtzulassung | LG + OLG + BGH-NZB |
| Gesamtkosten bei Revision | LG + OLG + BGH-Rev |
### Summary Totals (Nullity)
| Output | Description |
|---|---|
| Gesamtkosten Nichtigkeitsverfahren | BPatG + BGH |
### Summary Totals (Cancellation)
| Output | Description |
|---|---|
| Gesamtkosten Löschungsverfahren | DPMA + BPatG |
### Security for Costs (Prozesskostensicherheit)
| Output | Description |
|---|---|
| 1. Instanz | 2.5x RA + increase + 2.5x PA + increase + EUR 5,000 |
| 2. Instanz | 2.8x RA + increase + 2.8x PA + increase + 4.0x court + EUR 5,000 |
| NZB | 2.3x RA + increase + 2.3x PA + increase |
| Total (incl. VAT) | Sum × (1 + VAT) |
### UPC Outputs (New)
| Output | Description |
|---|---|
| Fixed court fee | Per action type |
| Value-based fee | Per Streitwert bracket |
| Total court fees | Fixed + value-based |
| Court fees (SME) | With 40%/50% reduction |
| Recoverable costs ceiling | Per Streitwert bracket |
| Appeal court fees | If appeal enabled |
| **Total cost risk** | All court fees + recoverable costs ceiling |
### Fee Schedule Reference (Quick Lookup)
| Output | Description |
|---|---|
| Base 1.0x GKG fee | For each fee version at given Streitwert |
| Base 1.0x RVG fee | For each fee version at given Streitwert |
---
## 6. Bugs to Fix (from Excel)
### Bug 1: Prozesskostensicherheit VAT Formula (CRITICAL)
- **Location:** Excel cell C31 on Prozesskostensicherheit sheet
- **Problem:** Formula `=C30*(1-Umsatzsteuer)` subtracts VAT instead of adding it
- **Label says:** "inkl. Umsatzsteuer" (including VAT)
- **Fix:** `=C30*(1+Umsatzsteuer)` → in web version: `total * (1 + vatRate)`
- **Impact:** 32% error when VAT = 19% (EUR 35,394 vs correct EUR 51,998)
- **Why hidden:** Default VAT is 0%, so `1-0 = 1+0 = 1` — bug only manifests with non-zero VAT
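In code the fix is a one-character sign change; a sketch (EUR 43,696 is the net total implied by the figures above):

```typescript
// Corrected web-version equivalent of Excel =C30*(1+Umsatzsteuer).
function securityTotalInclVat(netTotal: number, vatRate: number): number {
  return netTotal * (1 + vatRate); // add VAT; the Excel bug subtracted it
}
// At 19% VAT on EUR 43,696:
// buggy:   43,696 * (1 - 0.19) ≈ EUR 35,394
// correct: 43,696 * (1 + 0.19) ≈ EUR 51,998
```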
### Bug 2: Prozesskostensicherheit Uses Wrong Fee Type
- **Location:** Excel cell C22 on Prozesskostensicherheit sheet
- **Problem:** `mGebuehrensatz(Streitwert, 1, SelectedVersion)` — parameter `1` selects RVG (attorney) fees
- **Should be:** `mGebuehrensatz(Streitwert, 0, SelectedVersion)` — parameter `0` selects GKG (court) fees
- **Context:** This cell calculates "4-fache Gerichts-Verfahrensgebühr (Nr. 1420 KV)" — clearly a court fee
- **Fix:** Use GKG fee schedule for court fee calculations
- **Impact:** GKG and RVG fees differ, so the result is always wrong
### Bug 3: Nichtigkeitsverfahren Missing Expert Fees in Total
- **Location:** Excel cell D24 on Nichtigkeitsverfahren sheet
- **Problem:** `=C23+C13` adds attorney total (C23) + bare court fee (C13), but C13 is only the 4.5x fee line item
- **Should be:** `=C23+C15` where C15 is the Zwischensumme (court fees + expert fees)
- **Fix:** Include expert fees subtotal in instance total
- **Impact:** Expert fees are silently dropped from the BPatG total. The fix matches the Verletzungsverfahren pattern, where D26 = C25 + C17 correctly uses the Zwischensumme
---
## 7. UPC Fee Structure (New Feature)
### How UPC Fees Differ from German Courts
| Aspect | German Courts (GKG/RVG) | UPC |
|---|---|---|
| **Court fee model** | Step-based accumulator | Fixed fee + bracket lookup |
| **Attorney fees** | RVG statutory table | Contractual (market rates) |
| **Recoverable costs** | RVG-based (predictable) | Ceiling table (much higher) |
| **Scope** | Single country | Pan-European |
| **Nullity** | Separate BPatG proceeding | Counterclaim in same action |
### UPC Court Fees: Two Components (Plus SME Reduction)
1. **Fixed fee** — always due, depends on action type:
- Infringement: EUR 14,600 (2026) / EUR 11,000 (pre-2026)
- Revocation: EUR 26,500 (2026) / EUR 20,000 (pre-2026) — flat, no value-based component
- Appeal: same fixed fees as first instance
2. **Value-based fee** — only for infringement-type actions when Streitwert > EUR 500,000:
- 19 brackets from EUR 0 (≤500k) to EUR 325,000 (>50M) — pre-2026
- ~32% increase in 2026 (exact table pending official publication)
- Revocation actions have NO value-based fee
3. **SME reduction**: 50% (2026) / 40% (pre-2026) on all court fees
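The two components plus the reduction can be sketched as one function, here with the verified pre-2026 figures (fixed infringement fee EUR 11,000, 40% SME reduction) and a deliberately abbreviated bracket table:

```typescript
interface Bracket {
  maxValue: number; // upper bound of the Streitwert bracket
  fee: number; // value-based fee in EUR
}

// Abbreviated pre-2026 value-based brackets (the full table has 19).
const valueFeesPre2026: Bracket[] = [
  { maxValue: 500_000, fee: 0 },
  { maxValue: 750_000, fee: 2_500 },
  { maxValue: 1_000_000, fee: 4_000 },
  { maxValue: 3_000_000, fee: 20_000 }, // intermediate brackets omitted
  { maxValue: Infinity, fee: 325_000 },
];

function upcCourtFee(streitwert: number, isSME: boolean): number {
  const fixed = 11_000; // pre-2026 infringement action
  const valueBased = valueFeesPre2026.find((b) => streitwert <= b.maxValue)!.fee;
  const total = fixed + valueBased;
  return isSME ? Math.round(total * 0.6) : total; // 40% pre-2026 SME reduction
}
// upcCourtFee(3_000_000, false) → 31,000, matching the EUR 31,000
// pre-2026 comparison figure for a EUR 3M infringement action
```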
### Recoverable Costs Ceilings (Attorney Fee Caps)
Per instance, the losing party reimburses up to:
| Streitwert | Ceiling |
|---|---|
| ≤ EUR 250,000 | EUR 38,000 |
| ≤ EUR 500,000 | EUR 56,000 |
| ≤ EUR 1,000,000 | EUR 112,000 |
| ≤ EUR 2,000,000 | EUR 200,000 |
| ≤ EUR 4,000,000 | EUR 400,000 |
| ≤ EUR 8,000,000 | EUR 600,000 |
| ≤ EUR 16,000,000 | EUR 800,000 |
| ≤ EUR 30,000,000 | EUR 1,200,000 |
| ≤ EUR 50,000,000 | EUR 1,500,000 |
| > EUR 50,000,000 | EUR 2,000,000 |
The court can raise the ceiling by up to 50% (cases ≤ EUR 1M) or 25% (cases EUR 1-50M), with an absolute cap of EUR 5M above EUR 50M. Expert/translator fees are recoverable separately on top.
### Cost Comparison (Key Insight)
At EUR 3M Streitwert (infringement, 1st instance):
| | UPC (2026) | German LG |
|---|---|---|
| Court fees | ~EUR 41,000 | EUR 43,914 |
| Recoverable attorney costs | up to EUR 400,000 | ~EUR 100,388 |
| **Total cost risk** | **~EUR 441,000** | **~EUR 144,302** |
**Key takeaway:** UPC court fees are comparable to or lower than German court fees. But the recoverable attorney-cost ceiling is roughly 4x the German RVG-based figure, which makes total cost risk at the UPC roughly 3x that of German courts. This is the critical information patent litigators need.
### UPC Sources
- Rule 370 RoP (court fees), Rule 152 RoP (recoverable costs), Art. 69 UPCA
- UPC Administrative Committee fee table AC/05/08072022 (pre-2026)
- UPC Administrative Committee amendment, 4 Nov 2025 (2026 changes)
- Scale of Ceilings for Recoverable Costs: D-AC/10/24042023
- Maiwald MAIinsight April 2025 (practitioner analysis with verified figures)
---
## 8. Implementation Recommendations
### Phase 1: Core Calculator (MVP)
- Implement `fee-tables.ts` with all 5 GKG/RVG schedule versions as typed constants
- Implement `calculator.ts` with pure functions: `computeBaseFee(streitwert, isRVG, version)`, per-instance calculations, totals
- Build the UI as a single `"use client"` page at `/kosten/rechner`
- Card-based layout: global inputs at top, collapsible instance cards, results summary at bottom
- Fix all 3 bugs in the implementation (don't port them from Excel)
- German language UI throughout
### Phase 2: UPC Extension
- Add UPC fee data and calculation functions (bracket lookup, not step-based)
- Add UPC section to the calculator (separate card or tab)
- Add venue comparison view: side-by-side DE vs UPC for the same Streitwert
### Phase 3: Integration
- Allow saving calculations to a case (requires one backend endpoint)
- PDF export of cost breakdown
- Wire up Verfahrensbeendigung (termination type affects fee multipliers)
### Data Extraction TODO
Before implementation begins, the exact fee table values must be extracted from the Excel file. The analysis doc describes the structure but doesn't list every cell value. The implementer should either:
1. Open the Excel and manually read the Hebesaetze sheet values, or
2. Use a Python script with `openpyxl` to extract the ListObject data programmatically
This is critical — the older fee versions (2005, 2013, 2021) have different step sizes and increments.


@@ -0,0 +1,391 @@
# UPC Fee Structure Research
> Research date: 2026-03-31
> Status: Complete (pre-2026 tables verified from official sources; 2026 amendments documented with confirmed changes)
> Purpose: Data for implementing a patent litigation cost calculator
## Overview
The UPC fee system consists of:
1. **Fixed fees** (always due, depend on action type)
2. **Value-based fees** (additional, for infringement-type actions with value > EUR 500,000)
3. **Recoverable costs ceilings** (caps on lawyer fees the losing party must reimburse)
Legal basis: Rule 370 RoP (court fees), Rule 152 RoP (recoverable costs), Art. 69 UPCA (cost allocation).
---
## 1. Fixed Court Fees (Court of First Instance)
### Pre-2026 Schedule (actions filed before 1 Jan 2026)
| Action Type | Fixed Fee |
|---|---|
| Infringement action | EUR 11,000 |
| Counterclaim for infringement | EUR 11,000 |
| Declaration of non-infringement | EUR 11,000 |
| Compensation for license of right | EUR 11,000 |
| Application to determine damages | EUR 3,000 |
| **Standalone revocation action** | **EUR 20,000** (flat, no value-based fee) |
| **Counterclaim for revocation** | **EUR 20,000** (flat, no value-based fee) |
| Application for provisional measures | EUR 11,000 |
| Order to preserve evidence | EUR 350 |
| Order for inspection | EUR 350 |
| Order to freeze assets | EUR 1,000 |
| Protective letter filing | EUR 200 |
| Protective letter extension (per 6 months) | EUR 100 |
| Action against EPO decision | EUR 1,000 |
| Re-establishment of rights | EUR 350 |
| Review case management order | EUR 300 |
| Set aside decision by default | EUR 1,000 |
### 2026 Schedule (actions filed from 1 Jan 2026)
~33% increase for the main action types; several procedural fees rose far more sharply:
| Action Type | Old Fee | New Fee (2026) | Change |
|---|---|---|---|
| Infringement action | EUR 11,000 | **EUR 14,600** | +33% |
| Counterclaim for infringement | EUR 11,000 | **EUR 14,600** | +33% |
| Declaration of non-infringement | EUR 11,000 | **EUR 14,600** | +33% |
| Standalone revocation action | EUR 20,000 | **EUR 26,500** | +33% |
| Counterclaim for revocation | EUR 20,000 | **EUR 26,500** | +33% |
| Application for provisional measures | EUR 11,000 | **EUR 14,600** | +33% |
| Application to determine damages | EUR 3,000 | **EUR 4,000** | +33% |
| Order to preserve evidence | EUR 350 | **EUR 5,000** | +1,329% |
| Order for inspection | EUR 350 | **EUR 5,000** | +1,329% |
| Order to freeze assets | EUR 1,000 | **EUR 5,000** | +400% |
| Protective letter filing | EUR 200 | **EUR 300** | +50% |
| Protective letter extension | EUR 100 | **EUR 130** | +30% |
| Application for rehearing | EUR 2,500 | **EUR 14,600** | +484% |
**Key 2026 change**: Provisional measures now also have a **value-based fee** (new). The value is deemed to be 2/3 of the value of the merits proceedings.
---
## 2. Value-Based Fees (Additional to Fixed Fee)
Applies to: infringement actions, counterclaim for infringement, declaration of non-infringement, compensation for license of right, application to determine damages.
Does NOT apply to: revocation actions (standalone or counterclaim) -- these are flat fee only.
### Pre-2026 Value-Based Fee Table
| Value of the Action | Additional Value-Based Fee |
|---|---|
| <= EUR 500,000 | EUR 0 |
| <= EUR 750,000 | EUR 2,500 |
| <= EUR 1,000,000 | EUR 4,000 |
| <= EUR 1,500,000 | EUR 8,000 |
| <= EUR 2,000,000 | EUR 13,000 |
| <= EUR 3,000,000 | EUR 20,000 |
| <= EUR 4,000,000 | EUR 26,000 |
| <= EUR 5,000,000 | EUR 32,000 |
| <= EUR 6,000,000 | EUR 39,000 |
| <= EUR 7,000,000 | EUR 46,000 |
| <= EUR 8,000,000 | EUR 52,000 |
| <= EUR 9,000,000 | EUR 58,000 |
| <= EUR 10,000,000 | EUR 65,000 |
| <= EUR 15,000,000 | EUR 75,000 |
| <= EUR 20,000,000 | EUR 100,000 |
| <= EUR 25,000,000 | EUR 125,000 |
| <= EUR 30,000,000 | EUR 150,000 |
| <= EUR 50,000,000 | EUR 250,000 |
| > EUR 50,000,000 | EUR 325,000 |
Source: Maiwald MAIinsight April 2025 (verified against commentedupc.com and official UPC fee table AC/05/08072022).
### 2026 Value-Based Fee Table (estimated)
The 2026 amendment increases value-based fees by ~32% at first instance and ~45% on appeal. Applying the 32% increase:
| Value of the Action | Old Fee | New Fee (est. ~32%) |
|---|---|---|
| <= EUR 500,000 | EUR 0 | EUR 0 |
| <= EUR 750,000 | EUR 2,500 | ~EUR 3,300 |
| <= EUR 1,000,000 | EUR 4,000 | ~EUR 5,300 |
| <= EUR 1,500,000 | EUR 8,000 | ~EUR 10,600 |
| <= EUR 2,000,000 | EUR 13,000 | ~EUR 17,200 |
| <= EUR 3,000,000 | EUR 20,000 | ~EUR 26,400 |
| <= EUR 5,000,000 | EUR 32,000 | ~EUR 42,200 |
| <= EUR 10,000,000 | EUR 65,000 | ~EUR 85,800 |
| <= EUR 30,000,000 | EUR 150,000 | ~EUR 198,000 |
| <= EUR 50,000,000 | EUR 250,000 | ~EUR 330,000 |
| > EUR 50,000,000 | EUR 325,000 | ~EUR 429,000 |
**Note**: The official consolidated 2026 fee table PDF was not extractable (403 on UPC website). The Maiwald blog reports that for a dispute valued at EUR 5M, total court fees (fixed + value-based) went from EUR 43,000 to EUR 44,600, which would put the 2026 value-based fee for <=5M at EUR 30,000 (14,600 + 30,000 = 44,600) -- well below the ~32% estimate of ~EUR 42,200, so one of the two sources is off for this bracket. When the exact 2026 table becomes available, these estimates should be replaced.
**Verified 2026 data point** (Secerna): For a typical infringement action valued at EUR 2,000,000, total court fees increase from EUR 24,000 to EUR 31,800. This means: new value-based fee for <=2M = 31,800 - 14,600 = EUR 17,200 (vs. old: 13,000 + 11,000 = 24,000).
---
## 3. Appeal Fees (Court of Appeal)
### Pre-2026
| Appeal Type | Fixed Fee | Value-Based Fee |
|---|---|---|
| Appeal on infringement action | EUR 11,000 | Same table as CFI |
| Appeal on counterclaim for infringement | EUR 11,000 | Same table as CFI |
| Appeal on non-infringement declaration | EUR 11,000 | Same table as CFI |
| Appeal on license compensation | EUR 11,000 | Same table as CFI |
| Appeal on damages determination | EUR 3,000 | Same table as CFI |
| Appeal on revocation action | EUR 20,000 | None |
| Appeal on counterclaim for revocation | Same as first instance | None |
| Appeal on provisional measures | EUR 11,000 | None |
| Appeal on other interlocutory orders | EUR 3,000 | None |
| Application for rehearing | EUR 2,500 | None |
| Appeal on cost decision | EUR 3,000 | None |
| Leave to appeal costs | EUR 1,500 | None |
| Discretionary review request | EUR 350 | None |
### 2026 Appeal Changes
- Fixed fees: ~33% increase (same as CFI)
- Value-based fees: ~45% increase (steeper than the ~32% increase at first instance)
- Revocation appeal: EUR 20,000 -> EUR 29,200 (+46%)
- Rehearing application: EUR 2,500 -> EUR 14,600 (+484%)
---
## 4. Recoverable Costs (Attorney Fee Ceilings)
Per Rule 152(2) RoP and the Administrative Committee's Scale of Ceilings. These are caps on what the winning party can recover from the losing party for legal representation costs. Apply **per instance**.
| Value of the Proceedings | Ceiling for Recoverable Costs |
|---|---|
| <= EUR 250,000 | EUR 38,000 |
| <= EUR 500,000 | EUR 56,000 |
| <= EUR 1,000,000 | EUR 112,000 |
| <= EUR 2,000,000 | EUR 200,000 |
| <= EUR 4,000,000 | EUR 400,000 |
| <= EUR 8,000,000 | EUR 600,000 |
| <= EUR 16,000,000 | EUR 800,000 |
| <= EUR 30,000,000 | EUR 1,200,000 |
| <= EUR 50,000,000 | EUR 1,500,000 |
| > EUR 50,000,000 | EUR 2,000,000 |
### Ceiling Adjustments
The court may **raise** the ceiling upon party request:
- Up to **50% increase** for cases valued up to EUR 1 million
- Up to **25% increase** for cases valued EUR 1-50 million
- Up to an **absolute cap of EUR 5 million** for cases valued above EUR 50 million
The court may **lower** the ceiling if it would threaten the economic existence of a party (SMEs, non-profits, universities, individuals).
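Read as a maximum, the upward-adjustment rules sketch into a small helper (an interpretation, not official guidance; the >EUR 50M branch applies the EUR 5M absolute cap, and actual awards remain at the court's discretion):

```typescript
// Maximum ceiling after an upward adjustment request, per the rules above.
function maxRaisedCeiling(caseValue: number, baseCeiling: number): number {
  if (caseValue <= 1_000_000) return baseCeiling * 1.5; // up to +50%
  if (caseValue <= 50_000_000) return baseCeiling * 1.25; // up to +25%
  return 5_000_000; // above EUR 50M: absolute cap of EUR 5M
}
// maxRaisedCeiling(3_000_000, 400_000) → 500,000
```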
### What is recoverable
- Lawyer fees (Rechtsanwalt)
- Patent attorney fees (Patentanwalt)
- Expert fees
- Witness costs
- Interpreter and translator fees
- Note: Expert/translator/witness fees are NOT subject to the ceiling -- they must be reasonable but are recoverable on top of the ceiling
---
## 5. SME / Micro-Enterprise Reductions
### Pre-2026
- Small and micro enterprises: **40% reduction** on all court fees (pay 60%)
- Conditions per Rule 370(8) RoP
### 2026
- Increased to **50% reduction** (pay 50%)
### Legal Aid
- Available in principle under Rule 375 et seq. RoP
---
## 6. Comparison: UPC vs. German National Courts
### German Court Fee Calculation (GKG)
German court fees are calculated using **Anlage 2 zu § 34 GKG**:
- Fee table covers Streitwert up to EUR 500,000 (base fee at 500k: EUR 4,138)
- Above EUR 500,000: fee increases by **EUR 210 per each additional EUR 50,000**
- This gives a "simple fee" (einfache Gebuehr)
- Actual court fees = simple fee x multiplier from Kostenverzeichnis
**Multipliers** for patent cases:
- LG infringement, 1st instance: **3.0x** simple fee
- OLG appeal (Berufung): **4.0x** simple fee
- BPatG nullity, 1st instance: **4.5x** simple fee (separate proceedings!)
- BGH nullity appeal: **6.0x** simple fee
### GKG Simple Fee Calculation for Key Streitwerte
Formula for Streitwert > 500,000: `4,138 + ceil((Streitwert - 500,000) / 50,000) * 210`
| Streitwert | Simple Fee | LG 1st (3.0x) | OLG Appeal (4.0x) |
|---|---|---|---|
| EUR 500,000 | EUR 4,138 | EUR 12,414 | EUR 16,552 |
| EUR 1,000,000 | EUR 6,238 | EUR 18,714 | EUR 24,952 |
| EUR 2,000,000 | EUR 10,438 | EUR 31,314 | EUR 41,752 |
| EUR 3,000,000 | EUR 14,638 | EUR 43,914 | EUR 58,552 |
| EUR 5,000,000 | EUR 23,038 | EUR 69,114 | EUR 92,152 |
| EUR 10,000,000 | EUR 44,038 | EUR 132,114 | EUR 176,152 |
| EUR 30,000,000 | EUR 128,038 | EUR 384,114 | EUR 512,152 |
| EUR 50,000,000 | EUR 212,038 | EUR 636,114 | EUR 848,152 |
Note: GKG was updated effective 01.06.2025 -- the increment changed from EUR 198 to EUR 210 per 50k.
### German Attorney Fees (RVG)
RVG fees are calculated from the same Streitwert, but via the RVG's own fee table (Anlage 2 zu § 13 RVG; its amounts differ from the GKG table) and with their own multipliers:
- Verfahrensgebuehr (procedural fee): 1.3x
- Terminsgebuehr (hearing fee): 1.2x
- Einigungsgebuehr (settlement fee): 1.0x (if applicable)
- Patent attorney (Patentanwalt) adds same fees again
Recoverable costs in Germany = court fees + 1 Rechtsanwalt (RVG) + 1 Patentanwalt (RVG).
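A minimal sketch of that composition, using the GKG simple fee at EUR 1M (6,238) as a stand-in base for illustration (the actual RVG table values differ somewhat) and ignoring VAT and the Auslagenpauschale:

```typescript
// Recoverable attorney costs: (1.3 VG + 1.2 TG) per attorney,
// once for the Rechtsanwalt and once again for the Patentanwalt.
function germanRecoverableAttorneyCosts(simpleFee: number): number {
  const perAttorney = simpleFee * (1.3 + 1.2); // Verfahrens- + Terminsgebuehr
  return perAttorney * 2; // 1 Rechtsanwalt + 1 Patentanwalt
}
// germanRecoverableAttorneyCosts(6_238) → 31,190, in the ballpark of the
// "~EUR 30,000" figure used in the comparison tables
```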
### Side-by-Side Comparison
#### Example: Infringement action, Streitwert EUR 1,000,000
| Cost Component | UPC (pre-2026) | UPC (2026) | Germany LG |
|---|---|---|---|
| Court fees | EUR 15,000 | ~EUR 19,900 | EUR 18,714 |
| Max recoverable attorney costs | EUR 112,000 | EUR 112,000 | ~EUR 30,000* |
| **Total cost risk (loser)** | **~EUR 127,000** | **~EUR 131,900** | **~EUR 48,714** |
#### Example: Infringement action, Streitwert EUR 3,000,000
| Cost Component | UPC (pre-2026) | UPC (2026) | Germany LG |
|---|---|---|---|
| Court fees | EUR 31,000 | ~EUR 41,000 | EUR 43,914 |
| Max recoverable attorney costs | EUR 400,000 | EUR 400,000 | ~EUR 100,388** |
| **Total cost risk (loser)** | **~EUR 431,000** | **~EUR 441,000** | **~EUR 144,302** |
*German RVG: 1x RA (1.3 VG + 1.2 TG) + 1x PA (same) + Auslagenpauschale + USt on Streitwert EUR 1M
**Maiwald comparison figure for EUR 3M (1x PA + 1x RA, including court fees, based on RVG)
#### Example: Infringement action, Streitwert EUR 10,000,000
| Cost Component | UPC (pre-2026) | UPC (2026) | Germany LG |
|---|---|---|---|
| Court fees | EUR 76,000 | ~EUR 100,400 | EUR 132,114 |
| Max recoverable attorney costs | EUR 600,000 | EUR 600,000 | ~EUR 222,000 |
| **Total cost risk (loser)** | **~EUR 676,000** | **~EUR 700,400** | **~EUR 354,114** |
### Key Insight
- **Court fees**: German courts are **more expensive** than the UPC for court fees alone at moderate Streitwerte (EUR 1-5M), and the gap widens at high Streitwerte (EUR 10M+), where GKG fees keep growing linearly while the UPC value-based brackets flatten.
- **Recoverable attorney costs**: UPC is **dramatically more expensive** -- ceilings are 3-5x higher than German RVG-based costs.
- **Total cost risk**: UPC total exposure is typically **2-3x German courts** for the same Streitwert, primarily driven by the high attorney cost ceilings.
- **Territorial scope**: UPC covers multiple countries in one action, so the higher cost may be justified by the broader geographic effect compared to a single German LG action.
---
## 7. Implementation Notes for Calculator
### Data structures needed
```typescript
// Value-based fee brackets (UPC)
interface FeeBracket {
  maxValue: number; // upper bound of bracket (Infinity for last)
  fee: number; // EUR amount
}

// Pre-2026 brackets
const upcValueFees2025: FeeBracket[] = [
  { maxValue: 500_000, fee: 0 },
  { maxValue: 750_000, fee: 2_500 },
  { maxValue: 1_000_000, fee: 4_000 },
  { maxValue: 1_500_000, fee: 8_000 },
  { maxValue: 2_000_000, fee: 13_000 },
  { maxValue: 3_000_000, fee: 20_000 },
  { maxValue: 4_000_000, fee: 26_000 },
  { maxValue: 5_000_000, fee: 32_000 },
  { maxValue: 6_000_000, fee: 39_000 },
  { maxValue: 7_000_000, fee: 46_000 },
  { maxValue: 8_000_000, fee: 52_000 },
  { maxValue: 9_000_000, fee: 58_000 },
  { maxValue: 10_000_000, fee: 65_000 },
  { maxValue: 15_000_000, fee: 75_000 },
  { maxValue: 20_000_000, fee: 100_000 },
  { maxValue: 25_000_000, fee: 125_000 },
  { maxValue: 30_000_000, fee: 150_000 },
  { maxValue: 50_000_000, fee: 250_000 },
  { maxValue: Infinity, fee: 325_000 },
];

// Recoverable costs ceilings
const upcRecoverableCosts: FeeBracket[] = [
  { maxValue: 250_000, fee: 38_000 },
  { maxValue: 500_000, fee: 56_000 },
  { maxValue: 1_000_000, fee: 112_000 },
  { maxValue: 2_000_000, fee: 200_000 },
  { maxValue: 4_000_000, fee: 400_000 },
  { maxValue: 8_000_000, fee: 600_000 },
  { maxValue: 16_000_000, fee: 800_000 },
  { maxValue: 30_000_000, fee: 1_200_000 },
  { maxValue: 50_000_000, fee: 1_500_000 },
  { maxValue: Infinity, fee: 2_000_000 },
];

// German GKG simple fee calculation
function gkgSimpleFee(streitwert: number): number {
  if (streitwert <= 500_000) {
    // Use lookup table from Anlage 2 GKG (values still to be extracted)
    return gkgLookupTable(streitwert);
  }
  // Above 500k: base + 210 EUR per 50k increment
  const base = 4_138; // simple fee at 500k (as of 2025-06-01)
  const excess = streitwert - 500_000;
  const increments = Math.ceil(excess / 50_000);
  return base + increments * 210;
}

// German court fee = simple fee * multiplier
// LG 1st instance infringement: 3.0x
// OLG appeal: 4.0x
// BPatG nullity 1st: 4.5x
// BGH nullity appeal: 6.0x
```
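Both bracket tables resolve through one ordered lookup; a self-contained sketch (the `FeeBracket` type is repeated and the ceilings table abbreviated so the snippet runs on its own):

```typescript
interface FeeBracket {
  maxValue: number;
  fee: number;
}

// First matching bracket wins; assumes ascending maxValue order.
function bracketFee(brackets: FeeBracket[], value: number): number {
  const hit = brackets.find((b) => value <= b.maxValue);
  return hit ? hit.fee : 0;
}

// Abbreviated recoverable-costs ceilings from Section 4.
const ceilings: FeeBracket[] = [
  { maxValue: 2_000_000, fee: 200_000 },
  { maxValue: 4_000_000, fee: 400_000 },
  { maxValue: Infinity, fee: 2_000_000 },
];
// bracketFee(ceilings, 3_000_000) → 400,000
```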
### Fixed fees by action type (for calculator dropdown)
```typescript
type ActionType =
  | 'infringement'
  | 'counterclaim_infringement'
  | 'non_infringement'
  | 'license_compensation'
  | 'determine_damages'
  | 'revocation_standalone'
  | 'counterclaim_revocation'
  | 'provisional_measures'
  | 'preserve_evidence'
  | 'inspection_order'
  | 'freeze_assets'
  | 'protective_letter'
  | 'protective_letter_extension';

interface ActionFees {
  fixedFee2025: number;
  fixedFee2026: number;
  hasValueBasedFee: boolean;
  hasValueBasedFee2026: boolean; // provisional measures gained value-based in 2026
}
```
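Concrete entries for this record follow directly from the fee tables in Sections 1-2; a sketch for three representative action types (the remaining types fill in the same way, and the type declarations above are omitted here so the snippet stands alone):

```typescript
// Fixed fees and value-based applicability per action type (sketch).
const ACTION_FEES = {
  infringement: {
    fixedFee2025: 11_000,
    fixedFee2026: 14_600,
    hasValueBasedFee: true,
    hasValueBasedFee2026: true,
  },
  revocation_standalone: {
    fixedFee2025: 20_000,
    fixedFee2026: 26_500,
    hasValueBasedFee: false,
    hasValueBasedFee2026: false, // flat fee only, no value-based component
  },
  provisional_measures: {
    fixedFee2025: 11_000,
    fixedFee2026: 14_600,
    hasValueBasedFee: false,
    hasValueBasedFee2026: true, // gained a value-based fee in 2026
  },
};
```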
---
## Sources
- Maiwald MAIinsight Issue No. 2, April 2025: "Court fees and recoverable costs at the UPC" (PDF, authoritative practitioner analysis)
- UPC Official Table of Court Fees: AC/05/08072022 (original schedule)
- UPC Administrative Committee amendment decision, 4 Nov 2025 (2026 changes)
- commentedupc.com/table-of-court-fees/ (complete pre-2026 table)
- Scale of Ceilings for Recoverable Costs: D-AC/10/24042023
- GKG Anlage 2 zu § 34 (German court fee table, as of 01.06.2025)
- Secerna: "Major Financial Overhaul: Significant UPC Fee Increase" (2026 data points)
- Casalonga: "Unified Patent Court: Significant Increase in Court Costs in 2026"
- Bird & Bird: "Unified Patent Court Fees increase from 1 January 2026"
- Vossius: "Costs and Cost Risk in UPC Proceedings"
- Haug Partners: "A U.S. View on the UPC -- Part 5: Of Costs and Fees"

frontend/.m/spawn.lock Normal file


@@ -10,6 +10,10 @@ WORKDIR /app
COPY --from=deps /app/node_modules ./node_modules
COPY . .
ENV API_URL=http://backend:8080
ARG NEXT_PUBLIC_SUPABASE_URL
ARG NEXT_PUBLIC_SUPABASE_ANON_KEY
ENV NEXT_PUBLIC_SUPABASE_URL=$NEXT_PUBLIC_SUPABASE_URL
ENV NEXT_PUBLIC_SUPABASE_ANON_KEY=$NEXT_PUBLIC_SUPABASE_ANON_KEY
RUN mkdir -p public
RUN bun run build


@@ -14,30 +14,103 @@
"react": "19.1.0",
"react-dom": "19.1.0",
"react-dropzone": "^15.0.0",
"recharts": "^3.8.1",
"sonner": "^2.0.7",
},
"devDependencies": {
"@eslint/eslintrc": "^3",
"@tailwindcss/postcss": "^4",
"@testing-library/jest-dom": "^6.9.1",
"@testing-library/react": "^16.3.2",
"@testing-library/user-event": "^14.6.1",
"@types/node": "^20",
"@types/react": "^19",
"@types/react-dom": "^19",
"eslint": "^9",
"eslint-config-next": "15.5.14",
"jsdom": "24.1.3",
"msw": "^2.12.14",
"tailwindcss": "^4",
"typescript": "^5",
"vitest": "2.1.8",
},
},
},
"packages": {
"@adobe/css-tools": ["@adobe/css-tools@4.4.4", "", {}, "sha512-Elp+iwUx5rN5+Y8xLt5/GRoG20WGoDCQ/1Fb+1LiGtvwbDavuSk0jhD/eZdckHAuzcDzccnkv+rEjyWfRx18gg=="],
"@alloc/quick-lru": ["@alloc/quick-lru@5.2.0", "", {}, "sha512-UrcABB+4bUrFABwbluTIBErXwvbsU/V7TZWfmbgJfbkwiBuziS9gxdODUyuiecfdGQ85jglMW6juS3+z5TsKLw=="],
"@asamuzakjp/css-color": ["@asamuzakjp/css-color@3.2.0", "", { "dependencies": { "@csstools/css-calc": "^2.1.3", "@csstools/css-color-parser": "^3.0.9", "@csstools/css-parser-algorithms": "^3.0.4", "@csstools/css-tokenizer": "^3.0.3", "lru-cache": "^10.4.3" } }, "sha512-K1A6z8tS3XsmCMM86xoWdn7Fkdn9m6RSVtocUrJYIwZnFVkng/PvkEoWtOWmP+Scc6saYWHWZYbndEEXxl24jw=="],
"@babel/code-frame": ["@babel/code-frame@7.29.0", "", { "dependencies": { "@babel/helper-validator-identifier": "^7.28.5", "js-tokens": "^4.0.0", "picocolors": "^1.1.1" } }, "sha512-9NhCeYjq9+3uxgdtp20LSiJXJvN0FeCtNGpJxuMFZ1Kv3cWUNb6DOhJwUvcVCzKGR66cw4njwM6hrJLqgOwbcw=="],
"@babel/helper-validator-identifier": ["@babel/helper-validator-identifier@7.28.5", "", {}, "sha512-qSs4ifwzKJSV39ucNjsvc6WVHs6b7S03sOh2OcHF9UHfVPqWWALUsNUVzhSBiItjRZoLHx7nIarVjqKVusUZ1Q=="],
"@babel/runtime": ["@babel/runtime@7.29.2", "", {}, "sha512-JiDShH45zKHWyGe4ZNVRrCjBz8Nh9TMmZG1kh4QTK8hCBTWBi8Da+i7s1fJw7/lYpM4ccepSNfqzZ/QvABBi5g=="],
"@csstools/color-helpers": ["@csstools/color-helpers@5.1.0", "", {}, "sha512-S11EXWJyy0Mz5SYvRmY8nJYTFFd1LCNV+7cXyAgQtOOuzb4EsgfqDufL+9esx72/eLhsRdGZwaldu/h+E4t4BA=="],
"@csstools/css-calc": ["@csstools/css-calc@2.1.4", "", { "peerDependencies": { "@csstools/css-parser-algorithms": "^3.0.5", "@csstools/css-tokenizer": "^3.0.4" } }, "sha512-3N8oaj+0juUw/1H3YwmDDJXCgTB1gKU6Hc/bB502u9zR0q2vd786XJH9QfrKIEgFlZmhZiq6epXl4rHqhzsIgQ=="],
"@csstools/css-color-parser": ["@csstools/css-color-parser@3.1.0", "", { "dependencies": { "@csstools/color-helpers": "^5.1.0", "@csstools/css-calc": "^2.1.4" }, "peerDependencies": { "@csstools/css-parser-algorithms": "^3.0.5", "@csstools/css-tokenizer": "^3.0.4" } }, "sha512-nbtKwh3a6xNVIp/VRuXV64yTKnb1IjTAEEh3irzS+HkKjAOYLTGNb9pmVNntZ8iVBHcWDA2Dof0QtPgFI1BaTA=="],
"@csstools/css-parser-algorithms": ["@csstools/css-parser-algorithms@3.0.5", "", { "peerDependencies": { "@csstools/css-tokenizer": "^3.0.4" } }, "sha512-DaDeUkXZKjdGhgYaHNJTV9pV7Y9B3b644jCLs9Upc3VeNGg6LWARAT6O+Q+/COo+2gg/bM5rhpMAtf70WqfBdQ=="],
"@csstools/css-tokenizer": ["@csstools/css-tokenizer@3.0.4", "", {}, "sha512-Vd/9EVDiu6PPJt9yAh6roZP6El1xHrdvIVGjyBsHR0RYwNHgL7FJPyIIW4fANJNG6FtyZfvlRPpFI4ZM/lubvw=="],
"@emnapi/core": ["@emnapi/core@1.9.1", "", { "dependencies": { "@emnapi/wasi-threads": "1.2.0", "tslib": "^2.4.0" } }, "sha512-mukuNALVsoix/w1BJwFzwXBN/dHeejQtuVzcDsfOEsdpCumXb/E9j8w11h5S54tT1xhifGfbbSm/ICrObRb3KA=="],
"@emnapi/runtime": ["@emnapi/runtime@1.9.1", "", { "dependencies": { "tslib": "^2.4.0" } }, "sha512-VYi5+ZVLhpgK4hQ0TAjiQiZ6ol0oe4mBx7mVv7IflsiEp0OWoVsp/+f9Vc1hOhE0TtkORVrI1GvzyreqpgWtkA=="],
"@emnapi/wasi-threads": ["@emnapi/wasi-threads@1.2.0", "", { "dependencies": { "tslib": "^2.4.0" } }, "sha512-N10dEJNSsUx41Z6pZsXU8FjPjpBEplgH24sfkmITrBED1/U2Esum9F3lfLrMjKHHjmi557zQn7kR9R+XWXu5Rg=="],
"@esbuild/aix-ppc64": ["@esbuild/aix-ppc64@0.21.5", "", { "os": "aix", "cpu": "ppc64" }, "sha512-1SDgH6ZSPTlggy1yI6+Dbkiz8xzpHJEVAlF/AM1tHPLsf5STom9rwtjE4hKAF20FfXXNTFqEYXyJNWh1GiZedQ=="],
"@esbuild/android-arm": ["@esbuild/android-arm@0.21.5", "", { "os": "android", "cpu": "arm" }, "sha512-vCPvzSjpPHEi1siZdlvAlsPxXl7WbOVUBBAowWug4rJHb68Ox8KualB+1ocNvT5fjv6wpkX6o/iEpbDrf68zcg=="],
"@esbuild/android-arm64": ["@esbuild/android-arm64@0.21.5", "", { "os": "android", "cpu": "arm64" }, "sha512-c0uX9VAUBQ7dTDCjq+wdyGLowMdtR/GoC2U5IYk/7D1H1JYC0qseD7+11iMP2mRLN9RcCMRcjC4YMclCzGwS/A=="],
"@esbuild/android-x64": ["@esbuild/android-x64@0.21.5", "", { "os": "android", "cpu": "x64" }, "sha512-D7aPRUUNHRBwHxzxRvp856rjUHRFW1SdQATKXH2hqA0kAZb1hKmi02OpYRacl0TxIGz/ZmXWlbZgjwWYaCakTA=="],
"@esbuild/darwin-arm64": ["@esbuild/darwin-arm64@0.21.5", "", { "os": "darwin", "cpu": "arm64" }, "sha512-DwqXqZyuk5AiWWf3UfLiRDJ5EDd49zg6O9wclZ7kUMv2WRFr4HKjXp/5t8JZ11QbQfUS6/cRCKGwYhtNAY88kQ=="],
"@esbuild/darwin-x64": ["@esbuild/darwin-x64@0.21.5", "", { "os": "darwin", "cpu": "x64" }, "sha512-se/JjF8NlmKVG4kNIuyWMV/22ZaerB+qaSi5MdrXtd6R08kvs2qCN4C09miupktDitvh8jRFflwGFBQcxZRjbw=="],
"@esbuild/freebsd-arm64": ["@esbuild/freebsd-arm64@0.21.5", "", { "os": "freebsd", "cpu": "arm64" }, "sha512-5JcRxxRDUJLX8JXp/wcBCy3pENnCgBR9bN6JsY4OmhfUtIHe3ZW0mawA7+RDAcMLrMIZaf03NlQiX9DGyB8h4g=="],
"@esbuild/freebsd-x64": ["@esbuild/freebsd-x64@0.21.5", "", { "os": "freebsd", "cpu": "x64" }, "sha512-J95kNBj1zkbMXtHVH29bBriQygMXqoVQOQYA+ISs0/2l3T9/kj42ow2mpqerRBxDJnmkUDCaQT/dfNXWX/ZZCQ=="],
"@esbuild/linux-arm": ["@esbuild/linux-arm@0.21.5", "", { "os": "linux", "cpu": "arm" }, "sha512-bPb5AHZtbeNGjCKVZ9UGqGwo8EUu4cLq68E95A53KlxAPRmUyYv2D6F0uUI65XisGOL1hBP5mTronbgo+0bFcA=="],
"@esbuild/linux-arm64": ["@esbuild/linux-arm64@0.21.5", "", { "os": "linux", "cpu": "arm64" }, "sha512-ibKvmyYzKsBeX8d8I7MH/TMfWDXBF3db4qM6sy+7re0YXya+K1cem3on9XgdT2EQGMu4hQyZhan7TeQ8XkGp4Q=="],
"@esbuild/linux-ia32": ["@esbuild/linux-ia32@0.21.5", "", { "os": "linux", "cpu": "ia32" }, "sha512-YvjXDqLRqPDl2dvRODYmmhz4rPeVKYvppfGYKSNGdyZkA01046pLWyRKKI3ax8fbJoK5QbxblURkwK/MWY18Tg=="],
"@esbuild/linux-loong64": ["@esbuild/linux-loong64@0.21.5", "", { "os": "linux", "cpu": "none" }, "sha512-uHf1BmMG8qEvzdrzAqg2SIG/02+4/DHB6a9Kbya0XDvwDEKCoC8ZRWI5JJvNdUjtciBGFQ5PuBlpEOXQj+JQSg=="],
"@esbuild/linux-mips64el": ["@esbuild/linux-mips64el@0.21.5", "", { "os": "linux", "cpu": "none" }, "sha512-IajOmO+KJK23bj52dFSNCMsz1QP1DqM6cwLUv3W1QwyxkyIWecfafnI555fvSGqEKwjMXVLokcV5ygHW5b3Jbg=="],
"@esbuild/linux-ppc64": ["@esbuild/linux-ppc64@0.21.5", "", { "os": "linux", "cpu": "ppc64" }, "sha512-1hHV/Z4OEfMwpLO8rp7CvlhBDnjsC3CttJXIhBi+5Aj5r+MBvy4egg7wCbe//hSsT+RvDAG7s81tAvpL2XAE4w=="],
"@esbuild/linux-riscv64": ["@esbuild/linux-riscv64@0.21.5", "", { "os": "linux", "cpu": "none" }, "sha512-2HdXDMd9GMgTGrPWnJzP2ALSokE/0O5HhTUvWIbD3YdjME8JwvSCnNGBnTThKGEB91OZhzrJ4qIIxk/SBmyDDA=="],
"@esbuild/linux-s390x": ["@esbuild/linux-s390x@0.21.5", "", { "os": "linux", "cpu": "s390x" }, "sha512-zus5sxzqBJD3eXxwvjN1yQkRepANgxE9lgOW2qLnmr8ikMTphkjgXu1HR01K4FJg8h1kEEDAqDcZQtbrRnB41A=="],
"@esbuild/linux-x64": ["@esbuild/linux-x64@0.21.5", "", { "os": "linux", "cpu": "x64" }, "sha512-1rYdTpyv03iycF1+BhzrzQJCdOuAOtaqHTWJZCWvijKD2N5Xu0TtVC8/+1faWqcP9iBCWOmjmhoH94dH82BxPQ=="],
"@esbuild/netbsd-x64": ["@esbuild/netbsd-x64@0.21.5", "", { "os": "none", "cpu": "x64" }, "sha512-Woi2MXzXjMULccIwMnLciyZH4nCIMpWQAs049KEeMvOcNADVxo0UBIQPfSmxB3CWKedngg7sWZdLvLczpe0tLg=="],
"@esbuild/openbsd-x64": ["@esbuild/openbsd-x64@0.21.5", "", { "os": "openbsd", "cpu": "x64" }, "sha512-HLNNw99xsvx12lFBUwoT8EVCsSvRNDVxNpjZ7bPn947b8gJPzeHWyNVhFsaerc0n3TsbOINvRP2byTZ5LKezow=="],
"@esbuild/sunos-x64": ["@esbuild/sunos-x64@0.21.5", "", { "os": "sunos", "cpu": "x64" }, "sha512-6+gjmFpfy0BHU5Tpptkuh8+uw3mnrvgs+dSPQXQOv3ekbordwnzTVEb4qnIvQcYXq6gzkyTnoZ9dZG+D4garKg=="],
"@esbuild/win32-arm64": ["@esbuild/win32-arm64@0.21.5", "", { "os": "win32", "cpu": "arm64" }, "sha512-Z0gOTd75VvXqyq7nsl93zwahcTROgqvuAcYDUr+vOv8uHhNSKROyU961kgtCD1e95IqPKSQKH7tBTslnS3tA8A=="],
"@esbuild/win32-ia32": ["@esbuild/win32-ia32@0.21.5", "", { "os": "win32", "cpu": "ia32" }, "sha512-SWXFF1CL2RVNMaVs+BBClwtfZSvDgtL//G/smwAc5oVK/UPu2Gu9tIaRgFmYFFKrmg3SyAjSrElf0TiJ1v8fYA=="],
"@esbuild/win32-x64": ["@esbuild/win32-x64@0.21.5", "", { "os": "win32", "cpu": "x64" }, "sha512-tQd/1efJuzPC6rCFwEvLtci/xNFcTZknmXs98FYDfGE4wP9ClFV98nyKrzJKVPMhdDnjzLhdUyMX4PsQAPjwIw=="],
"@eslint-community/eslint-utils": ["@eslint-community/eslint-utils@4.9.1", "", { "dependencies": { "eslint-visitor-keys": "^3.4.3" }, "peerDependencies": { "eslint": "^6.0.0 || ^7.0.0 || >=8.0.0" } }, "sha512-phrYmNiYppR7znFEdqgfWHXR6NCkZEK7hwWDHZUjit/2/U0r6XvkDl0SYnoM51Hq7FhCGdLDT6zxCCOY1hexsQ=="],
"@eslint-community/regexpp": ["@eslint-community/regexpp@4.12.2", "", {}, "sha512-EriSTlt5OC9/7SXkRSCAhfSxxoSUgBm33OH+IkwbdpgoqsSsUg7y3uh+IICI/Qg4BBWr3U2i39RpmycbxMq4ew=="],
@@ -114,6 +187,16 @@
"@img/sharp-win32-x64": ["@img/sharp-win32-x64@0.34.5", "", { "os": "win32", "cpu": "x64" }, "sha512-+29YMsqY2/9eFEiW93eqWnuLcWcufowXewwSNIT6UwZdUUCrM3oFjMWH/Z6/TMmb4hlFenmfAVbpWeup2jryCw=="],
"@inquirer/ansi": ["@inquirer/ansi@1.0.2", "", {}, "sha512-S8qNSZiYzFd0wAcyG5AXCvUHC5Sr7xpZ9wZ2py9XR88jUz8wooStVx5M6dRzczbBWjic9NP7+rY0Xi7qqK/aMQ=="],
"@inquirer/confirm": ["@inquirer/confirm@5.1.21", "", { "dependencies": { "@inquirer/core": "^10.3.2", "@inquirer/type": "^3.0.10" }, "peerDependencies": { "@types/node": ">=18" }, "optionalPeers": ["@types/node"] }, "sha512-KR8edRkIsUayMXV+o3Gv+q4jlhENF9nMYUZs9PA2HzrXeHI8M5uDag70U7RJn9yyiMZSbtF5/UexBtAVtZGSbQ=="],
"@inquirer/core": ["@inquirer/core@10.3.2", "", { "dependencies": { "@inquirer/ansi": "^1.0.2", "@inquirer/figures": "^1.0.15", "@inquirer/type": "^3.0.10", "cli-width": "^4.1.0", "mute-stream": "^2.0.0", "signal-exit": "^4.1.0", "wrap-ansi": "^6.2.0", "yoctocolors-cjs": "^2.1.3" }, "peerDependencies": { "@types/node": ">=18" }, "optionalPeers": ["@types/node"] }, "sha512-43RTuEbfP8MbKzedNqBrlhhNKVwoK//vUFNW3Q3vZ88BLcrs4kYpGg+B2mm5p2K/HfygoCxuKwJJiv8PbGmE0A=="],
"@inquirer/figures": ["@inquirer/figures@1.0.15", "", {}, "sha512-t2IEY+unGHOzAaVM5Xx6DEWKeXlDDcNPeDyUpsRc6CUhBfU3VQOEl+Vssh7VNp1dR8MdUJBWhuObjXCsVpjN5g=="],
"@inquirer/type": ["@inquirer/type@3.0.10", "", { "peerDependencies": { "@types/node": ">=18" }, "optionalPeers": ["@types/node"] }, "sha512-BvziSRxfz5Ov8ch0z/n3oijRSEcEsHnhggm4xFZe93DHcUCTlutlq9Ox4SVENAfcRD22UQq7T/atg9Wr3k09eA=="],
"@jridgewell/gen-mapping": ["@jridgewell/gen-mapping@0.3.13", "", { "dependencies": { "@jridgewell/sourcemap-codec": "^1.5.0", "@jridgewell/trace-mapping": "^0.3.24" } }, "sha512-2kkt/7niJ6MgEPxF0bYdQ6etZaA+fQvDcLKckhy1yIQOzaoKjBBjSj63/aLVjYE3qhRt5dvM+uUyfCg6UKCBbA=="],
"@jridgewell/remapping": ["@jridgewell/remapping@2.3.5", "", { "dependencies": { "@jridgewell/gen-mapping": "^0.3.5", "@jridgewell/trace-mapping": "^0.3.24" } }, "sha512-LI9u/+laYG4Ds1TDKSJW2YPrIlcVYOwi2fUC6xB43lueCjgxV4lffOCZCtYFiH6TNOX+tQKXx97T4IKHbhyHEQ=="],
@@ -124,6 +207,8 @@
"@jridgewell/trace-mapping": ["@jridgewell/trace-mapping@0.3.31", "", { "dependencies": { "@jridgewell/resolve-uri": "^3.1.0", "@jridgewell/sourcemap-codec": "^1.4.14" } }, "sha512-zzNR+SdQSDJzc8joaeP8QQoCQr8NuYx2dIIytl1QeBEZHJ9uW6hebsrYgbz8hJwUQao3TWCMtmfV8Nu1twOLAw=="],
"@mswjs/interceptors": ["@mswjs/interceptors@0.41.3", "", { "dependencies": { "@open-draft/deferred-promise": "^2.2.0", "@open-draft/logger": "^0.3.0", "@open-draft/until": "^2.0.0", "is-node-process": "^1.2.0", "outvariant": "^1.4.3", "strict-event-emitter": "^0.5.1" } }, "sha512-cXu86tF4VQVfwz8W1SPbhoRyHJkti6mjH/XJIxp40jhO4j2k1m4KYrEykxqWPkFF3vrK4rgQppBh//AwyGSXPA=="],
"@napi-rs/wasm-runtime": ["@napi-rs/wasm-runtime@0.2.12", "", { "dependencies": { "@emnapi/core": "^1.4.3", "@emnapi/runtime": "^1.4.3", "@tybys/wasm-util": "^0.10.0" } }, "sha512-ZVWUcfwY4E/yPitQJl481FjFo3K22D6qF0DuFH6Y/nbnE11GY5uguDxZMGXPQ8WQ0128MXQD7TnfHyK4oWoIJQ=="],
"@next/env": ["@next/env@15.5.14", "", {}, "sha512-aXeirLYuASxEgi4X4WhfXsShCFxWDfNn/8ZeC5YXAS2BB4A8FJi1kwwGL6nvMVboE7fZCzmJPNdMvVHc8JpaiA=="],
@@ -154,10 +239,72 @@
"@nolyfill/is-core-module": ["@nolyfill/is-core-module@1.0.39", "", {}, "sha512-nn5ozdjYQpUCZlWGuxcJY/KpxkWQs4DcbMCmKojjyrYDEAGy4Ce19NN4v5MduafTwJlbKc99UA8YhSVqq9yPZA=="],
"@open-draft/deferred-promise": ["@open-draft/deferred-promise@2.2.0", "", {}, "sha512-CecwLWx3rhxVQF6V4bAgPS5t+So2sTbPgAzafKkVizyi7tlwpcFpdFqq+wqF2OwNBmqFuu6tOyouTuxgpMfzmA=="],
"@open-draft/logger": ["@open-draft/logger@0.3.0", "", { "dependencies": { "is-node-process": "^1.2.0", "outvariant": "^1.4.0" } }, "sha512-X2g45fzhxH238HKO4xbSr7+wBS8Fvw6ixhTDuvLd5mqh6bJJCFAPwU9mPDxbcrRtfxv4u5IHCEH77BmxvXmmxQ=="],
"@open-draft/until": ["@open-draft/until@2.1.0", "", {}, "sha512-U69T3ItWHvLwGg5eJ0n3I62nWuE6ilHlmz7zM0npLBRvPRd7e6NYmg54vvRtP5mZG7kZqZCFVdsTWo7BPtBujg=="],
"@reduxjs/toolkit": ["@reduxjs/toolkit@2.11.2", "", { "dependencies": { "@standard-schema/spec": "^1.0.0", "@standard-schema/utils": "^0.3.0", "immer": "^11.0.0", "redux": "^5.0.1", "redux-thunk": "^3.1.0", "reselect": "^5.1.0" }, "peerDependencies": { "react": "^16.9.0 || ^17.0.0 || ^18 || ^19", "react-redux": "^7.2.1 || ^8.1.3 || ^9.0.0" }, "optionalPeers": ["react", "react-redux"] }, "sha512-Kd6kAHTA6/nUpp8mySPqj3en3dm0tdMIgbttnQ1xFMVpufoj+ADi8pXLBsd4xzTRHQa7t/Jv8W5UnCuW4kuWMQ=="],
"@rollup/rollup-android-arm-eabi": ["@rollup/rollup-android-arm-eabi@4.60.0", "", { "os": "android", "cpu": "arm" }, "sha512-WOhNW9K8bR3kf4zLxbfg6Pxu2ybOUbB2AjMDHSQx86LIF4rH4Ft7vmMwNt0loO0eonglSNy4cpD3MKXXKQu0/A=="],
"@rollup/rollup-android-arm64": ["@rollup/rollup-android-arm64@4.60.0", "", { "os": "android", "cpu": "arm64" }, "sha512-u6JHLll5QKRvjciE78bQXDmqRqNs5M/3GVqZeMwvmjaNODJih/WIrJlFVEihvV0MiYFmd+ZyPr9wxOVbPAG2Iw=="],
"@rollup/rollup-darwin-arm64": ["@rollup/rollup-darwin-arm64@4.60.0", "", { "os": "darwin", "cpu": "arm64" }, "sha512-qEF7CsKKzSRc20Ciu2Zw1wRrBz4g56F7r/vRwY430UPp/nt1x21Q/fpJ9N5l47WWvJlkNCPJz3QRVw008fi7yA=="],
"@rollup/rollup-darwin-x64": ["@rollup/rollup-darwin-x64@4.60.0", "", { "os": "darwin", "cpu": "x64" }, "sha512-WADYozJ4QCnXCH4wPB+3FuGmDPoFseVCUrANmA5LWwGmC6FL14BWC7pcq+FstOZv3baGX65tZ378uT6WG8ynTw=="],
"@rollup/rollup-freebsd-arm64": ["@rollup/rollup-freebsd-arm64@4.60.0", "", { "os": "freebsd", "cpu": "arm64" }, "sha512-6b8wGHJlDrGeSE3aH5mGNHBjA0TTkxdoNHik5EkvPHCt351XnigA4pS7Wsj/Eo9Y8RBU6f35cjN9SYmCFBtzxw=="],
"@rollup/rollup-freebsd-x64": ["@rollup/rollup-freebsd-x64@4.60.0", "", { "os": "freebsd", "cpu": "x64" }, "sha512-h25Ga0t4jaylMB8M/JKAyrvvfxGRjnPQIR8lnCayyzEjEOx2EJIlIiMbhpWxDRKGKF8jbNH01NnN663dH638mA=="],
"@rollup/rollup-linux-arm-gnueabihf": ["@rollup/rollup-linux-arm-gnueabihf@4.60.0", "", { "os": "linux", "cpu": "arm" }, "sha512-RzeBwv0B3qtVBWtcuABtSuCzToo2IEAIQrcyB/b2zMvBWVbjo8bZDjACUpnaafaxhTw2W+imQbP2BD1usasK4g=="],
"@rollup/rollup-linux-arm-musleabihf": ["@rollup/rollup-linux-arm-musleabihf@4.60.0", "", { "os": "linux", "cpu": "arm" }, "sha512-Sf7zusNI2CIU1HLzuu9Tc5YGAHEZs5Lu7N1ssJG4Tkw6e0MEsN7NdjUDDfGNHy2IU+ENyWT+L2obgWiguWibWQ=="],
"@rollup/rollup-linux-arm64-gnu": ["@rollup/rollup-linux-arm64-gnu@4.60.0", "", { "os": "linux", "cpu": "arm64" }, "sha512-DX2x7CMcrJzsE91q7/O02IJQ5/aLkVtYFryqCjduJhUfGKG6yJV8hxaw8pZa93lLEpPTP/ohdN4wFz7yp/ry9A=="],
"@rollup/rollup-linux-arm64-musl": ["@rollup/rollup-linux-arm64-musl@4.60.0", "", { "os": "linux", "cpu": "arm64" }, "sha512-09EL+yFVbJZlhcQfShpswwRZ0Rg+z/CsSELFCnPt3iK+iqwGsI4zht3secj5vLEs957QvFFXnzAT0FFPIxSrkQ=="],
"@rollup/rollup-linux-loong64-gnu": ["@rollup/rollup-linux-loong64-gnu@4.60.0", "", { "os": "linux", "cpu": "none" }, "sha512-i9IcCMPr3EXm8EQg5jnja0Zyc1iFxJjZWlb4wr7U2Wx/GrddOuEafxRdMPRYVaXjgbhvqalp6np07hN1w9kAKw=="],
"@rollup/rollup-linux-loong64-musl": ["@rollup/rollup-linux-loong64-musl@4.60.0", "", { "os": "linux", "cpu": "none" }, "sha512-DGzdJK9kyJ+B78MCkWeGnpXJ91tK/iKA6HwHxF4TAlPIY7GXEvMe8hBFRgdrR9Ly4qebR/7gfUs9y2IoaVEyog=="],
"@rollup/rollup-linux-ppc64-gnu": ["@rollup/rollup-linux-ppc64-gnu@4.60.0", "", { "os": "linux", "cpu": "ppc64" }, "sha512-RwpnLsqC8qbS8z1H1AxBA1H6qknR4YpPR9w2XX0vo2Sz10miu57PkNcnHVaZkbqyw/kUWfKMI73jhmfi9BRMUQ=="],
"@rollup/rollup-linux-ppc64-musl": ["@rollup/rollup-linux-ppc64-musl@4.60.0", "", { "os": "linux", "cpu": "ppc64" }, "sha512-Z8pPf54Ly3aqtdWC3G4rFigZgNvd+qJlOE52fmko3KST9SoGfAdSRCwyoyG05q1HrrAblLbk1/PSIV+80/pxLg=="],
"@rollup/rollup-linux-riscv64-gnu": ["@rollup/rollup-linux-riscv64-gnu@4.60.0", "", { "os": "linux", "cpu": "none" }, "sha512-3a3qQustp3COCGvnP4SvrMHnPQ9d1vzCakQVRTliaz8cIp/wULGjiGpbcqrkv0WrHTEp8bQD/B3HBjzujVWLOA=="],
"@rollup/rollup-linux-riscv64-musl": ["@rollup/rollup-linux-riscv64-musl@4.60.0", "", { "os": "linux", "cpu": "none" }, "sha512-pjZDsVH/1VsghMJ2/kAaxt6dL0psT6ZexQVrijczOf+PeP2BUqTHYejk3l6TlPRydggINOeNRhvpLa0AYpCWSQ=="],
"@rollup/rollup-linux-s390x-gnu": ["@rollup/rollup-linux-s390x-gnu@4.60.0", "", { "os": "linux", "cpu": "s390x" }, "sha512-3ObQs0BhvPgiUVZrN7gqCSvmFuMWvWvsjG5ayJ3Lraqv+2KhOsp+pUbigqbeWqueGIsnn+09HBw27rJ+gYK4VQ=="],
"@rollup/rollup-linux-x64-gnu": ["@rollup/rollup-linux-x64-gnu@4.60.0", "", { "os": "linux", "cpu": "x64" }, "sha512-EtylprDtQPdS5rXvAayrNDYoJhIz1/vzN2fEubo3yLE7tfAw+948dO0g4M0vkTVFhKojnF+n6C8bDNe+gDRdTg=="],
"@rollup/rollup-linux-x64-musl": ["@rollup/rollup-linux-x64-musl@4.60.0", "", { "os": "linux", "cpu": "x64" }, "sha512-k09oiRCi/bHU9UVFqD17r3eJR9bn03TyKraCrlz5ULFJGdJGi7VOmm9jl44vOJvRJ6P7WuBi/s2A97LxxHGIdw=="],
"@rollup/rollup-openbsd-x64": ["@rollup/rollup-openbsd-x64@4.60.0", "", { "os": "openbsd", "cpu": "x64" }, "sha512-1o/0/pIhozoSaDJoDcec+IVLbnRtQmHwPV730+AOD29lHEEo4F5BEUB24H0OBdhbBBDwIOSuf7vgg0Ywxdfiiw=="],
"@rollup/rollup-openharmony-arm64": ["@rollup/rollup-openharmony-arm64@4.60.0", "", { "os": "none", "cpu": "arm64" }, "sha512-pESDkos/PDzYwtyzB5p/UoNU/8fJo68vcXM9ZW2V0kjYayj1KaaUfi1NmTUTUpMn4UhU4gTuK8gIaFO4UGuMbA=="],
"@rollup/rollup-win32-arm64-msvc": ["@rollup/rollup-win32-arm64-msvc@4.60.0", "", { "os": "win32", "cpu": "arm64" }, "sha512-hj1wFStD7B1YBeYmvY+lWXZ7ey73YGPcViMShYikqKT1GtstIKQAtfUI6yrzPjAy/O7pO0VLXGmUVWXQMaYgTQ=="],
"@rollup/rollup-win32-ia32-msvc": ["@rollup/rollup-win32-ia32-msvc@4.60.0", "", { "os": "win32", "cpu": "ia32" }, "sha512-SyaIPFoxmUPlNDq5EHkTbiKzmSEmq/gOYFI/3HHJ8iS/v1mbugVa7dXUzcJGQfoytp9DJFLhHH4U3/eTy2Bq4w=="],
"@rollup/rollup-win32-x64-gnu": ["@rollup/rollup-win32-x64-gnu@4.60.0", "", { "os": "win32", "cpu": "x64" }, "sha512-RdcryEfzZr+lAr5kRm2ucN9aVlCCa2QNq4hXelZxb8GG0NJSazq44Z3PCCc8wISRuCVnGs0lQJVX5Vp6fKA+IA=="],
"@rollup/rollup-win32-x64-msvc": ["@rollup/rollup-win32-x64-msvc@4.60.0", "", { "os": "win32", "cpu": "x64" }, "sha512-PrsWNQ8BuE00O3Xsx3ALh2Df8fAj9+cvvX9AIA6o4KpATR98c9mud4XtDWVvsEuyia5U4tVSTKygawyJkjm60w=="],
"@rtsao/scc": ["@rtsao/scc@1.1.0", "", {}, "sha512-zt6OdqaDoOnJ1ZYsCYGt9YmWzDXl4vQdKTyJev62gFhRGKdx7mcT54V9KIjg+d2wi9EXsPvAPKe7i7WjfVWB8g=="],
"@rushstack/eslint-patch": ["@rushstack/eslint-patch@1.16.1", "", {}, "sha512-TvZbIpeKqGQQ7X0zSCvPH9riMSFQFSggnfBjFZ1mEoILW+UuXCKwOoPcgjMwiUtRqFZ8jWhPJc4um14vC6I4ag=="],
"@standard-schema/spec": ["@standard-schema/spec@1.1.0", "", {}, "sha512-l2aFy5jALhniG5HgqrD6jXLi/rUWrKvqN/qJx6yoJsgKhblVd+iqqU4RCXavm/jPityDo5TCvKMnpjKnOriy0w=="],
"@standard-schema/utils": ["@standard-schema/utils@0.3.0", "", {}, "sha512-e7Mew686owMaPJVNNLs55PUvgz371nKgwsc4vxE49zsODpJEnxgxRo2y/OKrqueavXgZNMDVj3DdHFlaSAeU8g=="],
"@supabase/auth-js": ["@supabase/auth-js@2.100.0", "", { "dependencies": { "tslib": "2.8.1" } }, "sha512-pdT3ye3UVRN1Cg0wom6BmyY+XTtp5DiJaYnPi6j8ht5i8Lq8kfqxJMJz9GI9YDKk3w1nhGOPnh6Qz5qpyYm+1w=="],
"@supabase/functions-js": ["@supabase/functions-js@2.100.0", "", { "dependencies": { "tslib": "2.8.1" } }, "sha512-keLg79RPwP+uiwHuxFPTFgDRxPV46LM4j/swjyR2GKJgWniTVSsgiBHfbIBDcrQwehLepy09b/9QSHUywtKRWQ=="],
@@ -210,8 +357,36 @@
"@tanstack/react-query": ["@tanstack/react-query@5.95.2", "", { "dependencies": { "@tanstack/query-core": "5.95.2" }, "peerDependencies": { "react": "^18 || ^19" } }, "sha512-/wGkvLj/st5Ud1Q76KF1uFxScV7WeqN1slQx5280ycwAyYkIPGaRZAEgHxe3bjirSd5Zpwkj6zNcR4cqYni/ZA=="],
"@testing-library/dom": ["@testing-library/dom@10.4.1", "", { "dependencies": { "@babel/code-frame": "^7.10.4", "@babel/runtime": "^7.12.5", "@types/aria-query": "^5.0.1", "aria-query": "5.3.0", "dom-accessibility-api": "^0.5.9", "lz-string": "^1.5.0", "picocolors": "1.1.1", "pretty-format": "^27.0.2" } }, "sha512-o4PXJQidqJl82ckFaXUeoAW+XysPLauYI43Abki5hABd853iMhitooc6znOnczgbTYmEP6U6/y1ZyKAIsvMKGg=="],
"@testing-library/jest-dom": ["@testing-library/jest-dom@6.9.1", "", { "dependencies": { "@adobe/css-tools": "^4.4.0", "aria-query": "^5.0.0", "css.escape": "^1.5.1", "dom-accessibility-api": "^0.6.3", "picocolors": "^1.1.1", "redent": "^3.0.0" } }, "sha512-zIcONa+hVtVSSep9UT3jZ5rizo2BsxgyDYU7WFD5eICBE7no3881HGeb/QkGfsJs6JTkY1aQhT7rIPC7e+0nnA=="],
"@testing-library/react": ["@testing-library/react@16.3.2", "", { "dependencies": { "@babel/runtime": "^7.12.5" }, "peerDependencies": { "@testing-library/dom": "^10.0.0", "@types/react": "^18.0.0 || ^19.0.0", "@types/react-dom": "^18.0.0 || ^19.0.0", "react": "^18.0.0 || ^19.0.0", "react-dom": "^18.0.0 || ^19.0.0" }, "optionalPeers": ["@types/react", "@types/react-dom"] }, "sha512-XU5/SytQM+ykqMnAnvB2umaJNIOsLF3PVv//1Ew4CTcpz0/BRyy/af40qqrt7SjKpDdT1saBMc42CUok5gaw+g=="],
"@testing-library/user-event": ["@testing-library/user-event@14.6.1", "", { "peerDependencies": { "@testing-library/dom": ">=7.21.4" } }, "sha512-vq7fv0rnt+QTXgPxr5Hjc210p6YKq2kmdziLgnsZGgLJ9e6VAShx1pACLuRjd/AS/sr7phAR58OIIpf0LlmQNw=="],
"@tybys/wasm-util": ["@tybys/wasm-util@0.10.1", "", { "dependencies": { "tslib": "^2.4.0" } }, "sha512-9tTaPJLSiejZKx+Bmog4uSubteqTvFrVrURwkmHixBo0G4seD0zUxp98E1DzUBJxLQ3NPwXrGKDiVjwx/DpPsg=="],
"@types/aria-query": ["@types/aria-query@5.0.4", "", {}, "sha512-rfT93uj5s0PRL7EzccGMs3brplhcrghnDoV26NqKhCAS1hVo+WdNsPvE/yb6ilfr5hi2MEk6d5EWJTKdxg8jVw=="],
"@types/d3-array": ["@types/d3-array@3.2.2", "", {}, "sha512-hOLWVbm7uRza0BYXpIIW5pxfrKe0W+D5lrFiAEYR+pb6w3N2SwSMaJbXdUfSEv+dT4MfHBLtn5js0LAWaO6otw=="],
"@types/d3-color": ["@types/d3-color@3.1.3", "", {}, "sha512-iO90scth9WAbmgv7ogoq57O9YpKmFBbmoEoCHDB2xMBY0+/KVrqAaCDyCE16dUspeOvIxFFRI+0sEtqDqy2b4A=="],
"@types/d3-ease": ["@types/d3-ease@3.0.2", "", {}, "sha512-NcV1JjO5oDzoK26oMzbILE6HW7uVXOHLQvHshBUW4UMdZGfiY6v5BeQwh9a9tCzv+CeefZQHJt5SRgK154RtiA=="],
"@types/d3-interpolate": ["@types/d3-interpolate@3.0.4", "", { "dependencies": { "@types/d3-color": "*" } }, "sha512-mgLPETlrpVV1YRJIglr4Ez47g7Yxjl1lj7YKsiMCb27VJH9W8NVM6Bb9d8kkpG/uAQS5AmbA48q2IAolKKo1MA=="],
"@types/d3-path": ["@types/d3-path@3.1.1", "", {}, "sha512-VMZBYyQvbGmWyWVea0EHs/BwLgxc+MKi1zLDCONksozI4YJMcTt8ZEuIR4Sb1MMTE8MMW49v0IwI5+b7RmfWlg=="],
"@types/d3-scale": ["@types/d3-scale@4.0.9", "", { "dependencies": { "@types/d3-time": "*" } }, "sha512-dLmtwB8zkAeO/juAMfnV+sItKjlsw2lKdZVVy6LRr0cBmegxSABiLEpGVmSJJ8O08i4+sGR6qQtb6WtuwJdvVw=="],
"@types/d3-shape": ["@types/d3-shape@3.1.8", "", { "dependencies": { "@types/d3-path": "*" } }, "sha512-lae0iWfcDeR7qt7rA88BNiqdvPS5pFVPpo5OfjElwNaT2yyekbM0C9vK+yqBqEmHr6lDkRnYNoTBYlAgJa7a4w=="],
"@types/d3-time": ["@types/d3-time@3.0.4", "", {}, "sha512-yuzZug1nkAAaBlBBikKZTgzCeA+k1uy4ZFwWANOfKw5z5LRhV0gNA7gNkKm7HoK+HRN0wX3EkxGk0fpbWhmB7g=="],
"@types/d3-timer": ["@types/d3-timer@3.0.2", "", {}, "sha512-Ps3T8E8dZDam6fUyNiMkekK3XUsaUEik+idO9/YjPtfj2qruF8tFBXS7XhtE4iIXBLxhmLjP3SXpLhVf21I9Lw=="],
"@types/estree": ["@types/estree@1.0.8", "", {}, "sha512-dWHzHa2WqEXI/O1E9OjrocMTKJl2mSrEolh1Iomrv6U+JuNwaHXsXx9bLu5gG7BUWFIN0skIQJQ/L1rIex4X6w=="],
"@types/json-schema": ["@types/json-schema@7.0.15", "", {}, "sha512-5+fP8P8MFNC+AyZCDxrB2pkZFPGzqQWUzpSeuuVLvm8VMcorNYavBqoFcxK8bQz4Qsbn4oUEEem4wDLfcysGHA=="],
@@ -224,6 +399,10 @@
"@types/react-dom": ["@types/react-dom@19.2.3", "", { "peerDependencies": { "@types/react": "^19.2.0" } }, "sha512-jp2L/eY6fn+KgVVQAOqYItbF0VY/YApe5Mz2F0aykSO8gx31bYCZyvSeYxCHKvzHG5eZjc+zyaS5BrBWya2+kQ=="],
"@types/statuses": ["@types/statuses@2.0.6", "", {}, "sha512-xMAgYwceFhRA2zY+XbEA7mxYbA093wdiW8Vu6gZPGWy9cmOyU9XesH1tNcEWsKFd5Vzrqx5T3D38PWx1FIIXkA=="],
"@types/use-sync-external-store": ["@types/use-sync-external-store@0.0.6", "", {}, "sha512-zFDAD+tlpf2r4asuHEj0XH6pY6i0g5NeAHPn+15wk3BV6JA69eERFXC1gyGThDkVa1zCyKr5jox1+2LbV/AMLg=="],
"@types/ws": ["@types/ws@8.18.1", "", { "dependencies": { "@types/node": "*" } }, "sha512-ThVF6DCVhA8kUGy+aazFQ4kXQ7E1Ty7A3ypFOe0IcJV8O/M511G99AW24irKrW56Wt44yG9+ij8FaqoBGkuBXg=="],
"@typescript-eslint/eslint-plugin": ["@typescript-eslint/eslint-plugin@8.57.2", "", { "dependencies": { "@eslint-community/regexpp": "^4.12.2", "@typescript-eslint/scope-manager": "8.57.2", "@typescript-eslint/type-utils": "8.57.2", "@typescript-eslint/utils": "8.57.2", "@typescript-eslint/visitor-keys": "8.57.2", "ignore": "^7.0.5", "natural-compare": "^1.4.0", "ts-api-utils": "^2.4.0" }, "peerDependencies": { "@typescript-eslint/parser": "^8.57.2", "eslint": "^8.57.0 || ^9.0.0 || ^10.0.0", "typescript": ">=4.8.4 <6.0.0" } }, "sha512-NZZgp0Fm2IkD+La5PR81sd+g+8oS6JwJje+aRWsDocxHkjyRw0J5L5ZTlN3LI1LlOcGL7ph3eaIUmTXMIjLk0w=="],
@@ -284,12 +463,30 @@
"@unrs/resolver-binding-win32-x64-msvc": ["@unrs/resolver-binding-win32-x64-msvc@1.11.1", "", { "os": "win32", "cpu": "x64" }, "sha512-lrW200hZdbfRtztbygyaq/6jP6AKE8qQN2KvPcJ+x7wiD038YtnYtZ82IMNJ69GJibV7bwL3y9FgK+5w/pYt6g=="],
"@vitest/expect": ["@vitest/expect@2.1.8", "", { "dependencies": { "@vitest/spy": "2.1.8", "@vitest/utils": "2.1.8", "chai": "^5.1.2", "tinyrainbow": "^1.2.0" } }, "sha512-8ytZ/fFHq2g4PJVAtDX57mayemKgDR6X3Oa2Foro+EygiOJHUXhCqBAAKQYYajZpFoIfvBCF1j6R6IYRSIUFuw=="],
"@vitest/mocker": ["@vitest/mocker@2.1.8", "", { "dependencies": { "@vitest/spy": "2.1.8", "estree-walker": "^3.0.3", "magic-string": "^0.30.12" }, "peerDependencies": { "msw": "^2.4.9", "vite": "^5.0.0" }, "optionalPeers": ["msw", "vite"] }, "sha512-7guJ/47I6uqfttp33mgo6ga5Gr1VnL58rcqYKyShoRK9ebu8T5Rs6HN3s1NABiBeVTdWNrwUMcHH54uXZBN4zA=="],
"@vitest/pretty-format": ["@vitest/pretty-format@2.1.9", "", { "dependencies": { "tinyrainbow": "^1.2.0" } }, "sha512-KhRIdGV2U9HOUzxfiHmY8IFHTdqtOhIzCpd8WRdJiE7D/HUcZVD0EgQCVjm+Q9gkUXWgBvMmTtZgIG48wq7sOQ=="],
"@vitest/runner": ["@vitest/runner@2.1.8", "", { "dependencies": { "@vitest/utils": "2.1.8", "pathe": "^1.1.2" } }, "sha512-17ub8vQstRnRlIU5k50bG+QOMLHRhYPAna5tw8tYbj+jzjcspnwnwtPtiOlkuKC4+ixDPTuLZiqiWWQ2PSXHVg=="],
"@vitest/snapshot": ["@vitest/snapshot@2.1.8", "", { "dependencies": { "@vitest/pretty-format": "2.1.8", "magic-string": "^0.30.12", "pathe": "^1.1.2" } }, "sha512-20T7xRFbmnkfcmgVEz+z3AU/3b0cEzZOt/zmnvZEctg64/QZbSDJEVm9fLnnlSi74KibmRsO9/Qabi+t0vCRPg=="],
"@vitest/spy": ["@vitest/spy@2.1.8", "", { "dependencies": { "tinyspy": "^3.0.2" } }, "sha512-5swjf2q95gXeYPevtW0BLk6H8+bPlMb4Vw/9Em4hFxDcaOxS+e0LOX4yqNxoHzMR2akEB2xfpnWUzkZokmgWDg=="],
"@vitest/utils": ["@vitest/utils@2.1.8", "", { "dependencies": { "@vitest/pretty-format": "2.1.8", "loupe": "^3.1.2", "tinyrainbow": "^1.2.0" } }, "sha512-dwSoui6djdwbfFmIgbIjX2ZhIoG7Ex/+xpxyiEgIGzjliY8xGkcpITKTlp6B4MgtGkF2ilvm97cPM96XZaAgcA=="],
"acorn": ["acorn@8.16.0", "", { "bin": { "acorn": "bin/acorn" } }, "sha512-UVJyE9MttOsBQIDKw1skb9nAwQuR5wuGD3+82K6JgJlm/Y+KI92oNsMNGZCYdDsVtRHSak0pcV5Dno5+4jh9sw=="],
"acorn-jsx": ["acorn-jsx@5.3.2", "", { "peerDependencies": { "acorn": "^6.0.0 || ^7.0.0 || ^8.0.0" } }, "sha512-rq9s+JNhf0IChjtDXxllJ7g41oZk5SlXtp0LHwyA5cejwn7vKmKp4pPri6YEePv2PU65sAsegbXtIinmDFDXgQ=="],
"agent-base": ["agent-base@7.1.4", "", {}, "sha512-MnA+YT8fwfJPgBx3m60MNqakm30XOkyIoH1y6huTQvC0PwZG7ki8NacLBcrPbNoo8vEZy7Jpuk7+jMO+CUovTQ=="],
"ajv": ["ajv@6.14.0", "", { "dependencies": { "fast-deep-equal": "^3.1.1", "fast-json-stable-stringify": "^2.0.0", "json-schema-traverse": "^0.4.1", "uri-js": "^4.2.2" } }, "sha512-IWrosm/yrn43eiKqkfkHis7QioDleaXQHdDVPKg0FSwwd/DuvyX79TZnFOnYpB7dcsFAMmtFztZuXPDvSePkFw=="],
"ansi-regex": ["ansi-regex@5.0.1", "", {}, "sha512-quJQXlTSUGL2LH9SUXo8VwsY4soanhgo6LNSm84E1LBcE8s3O0wpdiRzyR9z/ZZJMlMWv37qOOb9pdJlMUEKFQ=="],
"ansi-styles": ["ansi-styles@4.3.0", "", { "dependencies": { "color-convert": "^2.0.1" } }, "sha512-zbB9rCJAT1rbjiVDb2hqKFHNYLxgtk8NURxZ3IZwD3F6NtxbXZQCnnSi1Lkx+IDohdPlFp222wVALIheZJQSEg=="],
"argparse": ["argparse@2.0.1", "", {}, "sha512-8+9WqebbFzpX9OR+Wa6O29asIogeRMzcGtAINdpMHHyAg10f05aSFVBbcEqGf/PXw1EjAZ+q2/bEBg3DvurK3Q=="],
@@ -312,10 +509,14 @@
"arraybuffer.prototype.slice": ["arraybuffer.prototype.slice@1.0.4", "", { "dependencies": { "array-buffer-byte-length": "^1.0.1", "call-bind": "^1.0.8", "define-properties": "^1.2.1", "es-abstract": "^1.23.5", "es-errors": "^1.3.0", "get-intrinsic": "^1.2.6", "is-array-buffer": "^3.0.4" } }, "sha512-BNoCY6SXXPQ7gF2opIP4GBE+Xw7U+pHMYKuzjgCN3GwiaIR09UUeKfheyIry77QtrCBlC0KK0q5/TER/tYh3PQ=="],
"assertion-error": ["assertion-error@2.0.1", "", {}, "sha512-Izi8RQcffqCeNVgFigKli1ssklIbpHnCYc6AknXGYoB6grJqyeby7jv12JUQgmTAnIDnbck1uxksT4dzN3PWBA=="],
"ast-types-flow": ["ast-types-flow@0.0.8", "", {}, "sha512-OH/2E5Fg20h2aPrbe+QL8JZQFko0YZaF+j4mnQ7BGhfavO7OpSLa8a0y9sBwomHdSbkhTS8TQNayBfnW5DwbvQ=="],
"async-function": ["async-function@1.0.0", "", {}, "sha512-hsU18Ae8CDTR6Kgu9DYf0EbCr/a5iGL0rytQDobUcdpYOKokk8LEjVphnXkDkgpi0wYVsqrXuP0bZxJaTqdgoA=="],
"asynckit": ["asynckit@0.4.0", "", {}, "sha512-Oei9OH4tRh0YqU3GxhX79dM/mwVgvbZJaSNaRk+bshkj0S5cfHcgYakreBjrHwatXKbz+IoIdYLxrKim2MjW0Q=="],
"attr-accept": ["attr-accept@2.2.5", "", {}, "sha512-0bDNnY/u6pPwHDMoF0FieU354oBi0a8rD9FcsLwzcGWbc8KS8KPIi7y+s13OlVY+gMWc/9xEMUgNE6Qm8ZllYQ=="],
"available-typed-arrays": ["available-typed-arrays@1.0.7", "", { "dependencies": { "possible-typed-array-names": "^1.0.0" } }, "sha512-wvUjBtSGN7+7SjNpq/9M2Tg350UZD3q62IFZLbRAR1bSMlCo1ZaeW+BJ+D090e4hIIZLBcTDWe4Mh4jvUDajzQ=="],
@@ -330,6 +531,8 @@
"braces": ["braces@3.0.3", "", { "dependencies": { "fill-range": "^7.1.1" } }, "sha512-yQbXgO/OSZVD2IsiLlro+7Hf6Q18EJrKSEsdoMzKePKXct3gvD8oLcOQdIzGupr5Fj+EDe8gO/lxc1BzfMpxvA=="],
"cac": ["cac@6.7.14", "", {}, "sha512-b6Ilus+c3RrdDk+JhLKUAQfzzgLEPy6wcXqS7f/xe1EETvsDP6GORG7SFuOs6cID5YkqchW/LXZbX5bc8j7ZcQ=="],
"call-bind": ["call-bind@1.0.8", "", { "dependencies": { "call-bind-apply-helpers": "^1.0.0", "es-define-property": "^1.0.0", "get-intrinsic": "^1.2.4", "set-function-length": "^1.2.2" } }, "sha512-oKlSFMcMwpUg2ednkhQ454wfWiU/ul3CkJe/PEHcTKuiX6RpbehUiFMXu13HalGZxfUwCQzZG747YXBn1im9ww=="],
"call-bind-apply-helpers": ["call-bind-apply-helpers@1.0.2", "", { "dependencies": { "es-errors": "^1.3.0", "function-bind": "^1.1.2" } }, "sha512-Sp1ablJ0ivDkSzjcaJdxEunN5/XvksFJ2sMBFfq6x0ryhQV/2b/KwFe21cMpmHtPOSij8K99/wSfoEuTObmuMQ=="],
@@ -340,24 +543,64 @@
"caniuse-lite": ["caniuse-lite@1.0.30001781", "", {}, "sha512-RdwNCyMsNBftLjW6w01z8bKEvT6e/5tpPVEgtn22TiLGlstHOVecsX2KHFkD5e/vRnIE4EGzpuIODb3mtswtkw=="],
"chai": ["chai@5.3.3", "", { "dependencies": { "assertion-error": "^2.0.1", "check-error": "^2.1.1", "deep-eql": "^5.0.1", "loupe": "^3.1.0", "pathval": "^2.0.0" } }, "sha512-4zNhdJD/iOjSH0A05ea+Ke6MU5mmpQcbQsSOkgdaUMJ9zTlDTD/GYlwohmIE2u0gaxHYiVHEn1Fw9mZ/ktJWgw=="],
"chalk": ["chalk@4.1.2", "", { "dependencies": { "ansi-styles": "^4.1.0", "supports-color": "^7.1.0" } }, "sha512-oKnbhFyRIXpUuez8iBMmyEa4nbj4IOQyuhc/wy9kY7/WVPcwIO9VA668Pu8RkO7+0G76SLROeyw9CpQ061i4mA=="],
"check-error": ["check-error@2.1.3", "", {}, "sha512-PAJdDJusoxnwm1VwW07VWwUN1sl7smmC3OKggvndJFadxxDRyFJBX/ggnu/KE4kQAB7a3Dp8f/YXC1FlUprWmA=="],
"cli-width": ["cli-width@4.1.0", "", {}, "sha512-ouuZd4/dm2Sw5Gmqy6bGyNNNe1qt9RpmxveLSO7KcgsTnU7RXfsw+/bukWGo1abgBiMAic068rclZsO4IWmmxQ=="],
"client-only": ["client-only@0.0.1", "", {}, "sha512-IV3Ou0jSMzZrd3pZ48nLkT9DA7Ag1pnPzaiQhpW7c3RbcqqzvzzVu+L8gfqMp/8IM2MQtSiqaCxrrcfu8I8rMA=="],
"cliui": ["cliui@8.0.1", "", { "dependencies": { "string-width": "^4.2.0", "strip-ansi": "^6.0.1", "wrap-ansi": "^7.0.0" } }, "sha512-BSeNnyus75C4//NQ9gQt1/csTXyo/8Sb+afLAkzAptFuMsod9HFokGNudZpi/oQV73hnVK+sR+5PVRMd+Dr7YQ=="],
"clsx": ["clsx@2.1.1", "", {}, "sha512-eYm0QWBtUrBWZWG0d386OGAw16Z995PiOVo2B7bjWSbHedGl5e0ZWaq65kOGgUSNesEIDkB9ISbTg/JK9dhCZA=="],
"color-convert": ["color-convert@2.0.1", "", { "dependencies": { "color-name": "~1.1.4" } }, "sha512-RRECPsj7iu/xb5oKYcsFHSppFNnsj/52OVTRKb4zP5onXwVF3zVmmToNcOfGC+CRDpfK/U584fMg38ZHCaElKQ=="],
"color-name": ["color-name@1.1.4", "", {}, "sha512-dOy+3AuW3a2wNbZHIuMZpTcgjGuLU/uBL/ubcZF9OXbDo8ff4O8yVp5Bf0efS8uEoYo5q4Fx7dY9OgQGXgAsQA=="],
"combined-stream": ["combined-stream@1.0.8", "", { "dependencies": { "delayed-stream": "~1.0.0" } }, "sha512-FQN4MRfuJeHf7cBbBMJFXhKSDq+2kAArBlmRBvcvFE5BB1HZKXtSFASDhdlz9zOYwxh8lDdnvmMOe/+5cdoEdg=="],
"concat-map": ["concat-map@0.0.1", "", {}, "sha512-/Srv4dswyQNBfohGpz9o6Yb3Gz3SrUDqBH5rTuhGR7ahtlbYKnVxw2bCFMRljaA7EXHaXZ8wsHdodFvbkhKmqg=="],
"cookie": ["cookie@1.1.1", "", {}, "sha512-ei8Aos7ja0weRpFzJnEA9UHJ/7XQmqglbRwnf2ATjcB9Wq874VKH9kfjjirM6UhU2/E5fFYadylyhFldcqSidQ=="],
"cross-spawn": ["cross-spawn@7.0.6", "", { "dependencies": { "path-key": "^3.1.0", "shebang-command": "^2.0.0", "which": "^2.0.1" } }, "sha512-uV2QOWP2nWzsy2aMp8aRibhi9dlzF5Hgh5SHaB9OiTGEyDTiJJyx0uy51QXdyWbtAHNua4XJzUKca3OzKUd3vA=="],
"css.escape": ["css.escape@1.5.1", "", {}, "sha512-YUifsXXuknHlUsmlgyY0PKzgPOr7/FjCePfHNt0jxm83wHZi44VDMQ7/fGNkjY3/jV1MC+1CmZbaHzugyeRtpg=="],
"cssstyle": ["cssstyle@4.6.0", "", { "dependencies": { "@asamuzakjp/css-color": "^3.2.0", "rrweb-cssom": "^0.8.0" } }, "sha512-2z+rWdzbbSZv6/rhtvzvqeZQHrBaqgogqt85sqFNbabZOuFbCVFb8kPeEtZjiKkbrm395irpNKiYeFeLiQnFPg=="],
"csstype": ["csstype@3.2.3", "", {}, "sha512-z1HGKcYy2xA8AGQfwrn0PAy+PB7X/GSj3UVJW9qKyn43xWa+gl5nXmU4qqLMRzWVLFC8KusUX8T/0kCiOYpAIQ=="],
"d3-array": ["d3-array@3.2.4", "", { "dependencies": { "internmap": "1 - 2" } }, "sha512-tdQAmyA18i4J7wprpYq8ClcxZy3SC31QMeByyCFyRt7BVHdREQZ5lpzoe5mFEYZUWe+oq8HBvk9JjpibyEV4Jg=="],
"d3-color": ["d3-color@3.1.0", "", {}, "sha512-zg/chbXyeBtMQ1LbD/WSoW2DpC3I0mpmPdW+ynRTj/x2DAWYrIY7qeZIHidozwV24m4iavr15lNwIwLxRmOxhA=="],
"d3-ease": ["d3-ease@3.0.1", "", {}, "sha512-wR/XK3D3XcLIZwpbvQwQ5fK+8Ykds1ip7A2Txe0yxncXSdq1L9skcG7blcedkOX+ZcgxGAmLX1FrRGbADwzi0w=="],
"d3-format": ["d3-format@3.1.2", "", {}, "sha512-AJDdYOdnyRDV5b6ArilzCPPwc1ejkHcoyFarqlPqT7zRYjhavcT3uSrqcMvsgh2CgoPbK3RCwyHaVyxYcP2Arg=="],
"d3-interpolate": ["d3-interpolate@3.0.1", "", { "dependencies": { "d3-color": "1 - 3" } }, "sha512-3bYs1rOD33uo8aqJfKP3JWPAibgw8Zm2+L9vBKEHJ2Rg+viTR7o5Mmv5mZcieN+FRYaAOWX5SJATX6k1PWz72g=="],
"d3-path": ["d3-path@3.1.0", "", {}, "sha512-p3KP5HCf/bvjBSSKuXid6Zqijx7wIfNW+J/maPs+iwR35at5JCbLUT0LzF1cnjbCHWhqzQTIN2Jpe8pRebIEFQ=="],
"d3-scale": ["d3-scale@4.0.2", "", { "dependencies": { "d3-array": "2.10.0 - 3", "d3-format": "1 - 3", "d3-interpolate": "1.2.0 - 3", "d3-time": "2.1.1 - 3", "d3-time-format": "2 - 4" } }, "sha512-GZW464g1SH7ag3Y7hXjf8RoUuAFIqklOAq3MRl4OaWabTFJY9PN/E1YklhXLh+OQ3fM9yS2nOkCoS+WLZ6kvxQ=="],
"d3-shape": ["d3-shape@3.2.0", "", { "dependencies": { "d3-path": "^3.1.0" } }, "sha512-SaLBuwGm3MOViRq2ABk3eLoxwZELpH6zhl3FbAoJ7Vm1gofKx6El1Ib5z23NUEhF9AsGl7y+dzLe5Cw2AArGTA=="],
"d3-time": ["d3-time@3.1.0", "", { "dependencies": { "d3-array": "2 - 3" } }, "sha512-VqKjzBLejbSMT4IgbmVgDjpkYrNWUYJnbCGo874u7MMKIWsILRX+OpX/gTk8MqjpT1A/c6HY2dCA77ZN0lkQ2Q=="],
"d3-time-format": ["d3-time-format@4.1.0", "", { "dependencies": { "d3-time": "1 - 3" } }, "sha512-dJxPBlzC7NugB2PDLwo9Q8JiTR3M3e4/XANkreKSUxF8vvXKqm1Yfq4Q5dl8budlunRVlUUaDUgFt7eA8D6NLg=="],
"d3-timer": ["d3-timer@3.0.1", "", {}, "sha512-ndfJ/JxxMd3nw31uyKoY2naivF+r29V+Lc0svZxe1JvvIRmi8hUsrMvdOwgS1o6uBHmiz91geQ0ylPP0aj1VUA=="],
"damerau-levenshtein": ["damerau-levenshtein@1.0.8", "", {}, "sha512-sdQSFB7+llfUcQHUQO3+B8ERRj0Oa4w9POWMI/puGtuf7gFywGmkaLCElnudfTiKZV+NvHqL0ifzdrI8Ro7ESA=="],
"data-urls": ["data-urls@5.0.0", "", { "dependencies": { "whatwg-mimetype": "^4.0.0", "whatwg-url": "^14.0.0" } }, "sha512-ZYP5VBHshaDAiVZxjbRVcFJpc+4xGgT0bK3vzy1HLN8jTO975HEbuYzZJcHoQEY5K1a0z8YayJkyVETa08eNTg=="],
"data-view-buffer": ["data-view-buffer@1.0.2", "", { "dependencies": { "call-bound": "^1.0.3", "es-errors": "^1.3.0", "is-data-view": "^1.0.2" } }, "sha512-EmKO5V3OLXh1rtK2wgXRansaK1/mtVdTUEiEI0W8RkvgT05kfxaH29PliLnpLP73yYO6142Q72QNa8Wx/A5CqQ=="],
"data-view-byte-length": ["data-view-byte-length@1.0.2", "", { "dependencies": { "call-bound": "^1.0.3", "es-errors": "^1.3.0", "is-data-view": "^1.0.2" } }, "sha512-tuhGbE6CfTM9+5ANGf+oQb72Ky/0+s3xKUpHvShfiz2RxMFgFPjsXuRLBVMtvMs15awe45SRb83D6wH4ew6wlQ=="],
@@ -368,22 +611,36 @@
"debug": ["debug@4.4.3", "", { "dependencies": { "ms": "^2.1.3" } }, "sha512-RGwwWnwQvkVfavKVt22FGLw+xYSdzARwm0ru6DhTVA3umU5hZc28V3kO4stgYryrTlLpuvgI9GiijltAjNbcqA=="],
"decimal.js": ["decimal.js@10.6.0", "", {}, "sha512-YpgQiITW3JXGntzdUmyUR1V812Hn8T1YVXhCu+wO3OpS4eU9l4YdD3qjyiKdV6mvV29zapkMeD390UVEf2lkUg=="],
"decimal.js-light": ["decimal.js-light@2.5.1", "", {}, "sha512-qIMFpTMZmny+MMIitAB6D7iVPEorVw6YQRWkvarTkT4tBeSLLiHzcwj6q0MmYSFCiVpiqPJTJEYIrpcPzVEIvg=="],
"deep-eql": ["deep-eql@5.0.2", "", {}, "sha512-h5k/5U50IJJFpzfL6nO9jaaumfjO/f2NjK/oYB2Djzm4p9L+3T9qWpZqZ2hAbLPuuYq9wrU08WQyBTL5GbPk5Q=="],
"deep-is": ["deep-is@0.1.4", "", {}, "sha512-oIPzksmTg4/MriiaYGO+okXDT7ztn/w3Eptv/+gSIdMdKsJo0u4CfYNFJPy+4SKMuCqGw2wxnA+URMg3t8a/bQ=="],
"define-data-property": ["define-data-property@1.1.4", "", { "dependencies": { "es-define-property": "^1.0.0", "es-errors": "^1.3.0", "gopd": "^1.0.1" } }, "sha512-rBMvIzlpA8v6E+SJZoo++HAYqsLrkg7MSfIinMPFhmkorw7X+dOXVJQs+QT69zGkzMyfDnIMN2Wid1+NbL3T+A=="],
"define-properties": ["define-properties@1.2.1", "", { "dependencies": { "define-data-property": "^1.0.1", "has-property-descriptors": "^1.0.0", "object-keys": "^1.1.1" } }, "sha512-8QmQKqEASLd5nx0U1B1okLElbUuuttJ/AnYmRXbbbGDWh6uS208EjD4Xqq/I9wK7u0v6O08XhTWnt5XtEbR6Dg=="],
"delayed-stream": ["delayed-stream@1.0.0", "", {}, "sha512-ZySD7Nf91aLB0RxL4KGrKHBXl7Eds1DAmEdcoVawXnLD7SDhpNgtuII2aAkg7a7QS41jxPSZ17p4VdGnMHk3MQ=="],
"dequal": ["dequal@2.0.3", "", {}, "sha512-0je+qPKHEMohvfRTCEo3CrPG6cAzAYgmzKyxRiYSSDkS6eGJdyVJm7WaYA5ECaAD9wLB2T4EEeymA5aFVcYXCA=="],
"detect-libc": ["detect-libc@2.1.2", "", {}, "sha512-Btj2BOOO83o3WyH59e8MgXsxEQVcarkUOpEYrubB0urwnN10yQ364rsiByU11nZlqWYZm05i/of7io4mzihBtQ=="],
"doctrine": ["doctrine@2.1.0", "", { "dependencies": { "esutils": "^2.0.2" } }, "sha512-35mSku4ZXK0vfCuHEDAwt55dg2jNajHZ1odvF+8SSr82EsZY4QmXfuWso8oEd8zRhVObSN18aM0CjSdoBX7zIw=="],
"dom-accessibility-api": ["dom-accessibility-api@0.6.3", "", {}, "sha512-7ZgogeTnjuHbo+ct10G9Ffp0mif17idi0IyWNVA/wcwcm7NPOD/WEHVP3n7n3MhXqxoIYm8d6MuZohYWIZ4T3w=="],
"dunder-proto": ["dunder-proto@1.0.1", "", { "dependencies": { "call-bind-apply-helpers": "^1.0.1", "es-errors": "^1.3.0", "gopd": "^1.2.0" } }, "sha512-KIN/nDJBQRcXw0MLVhZE9iQHmG68qAVIBg9CqmUYjmQIhgij9U5MFvrqkUL5FbtyyzZuOeOt0zdeRe4UY7ct+A=="],
"emoji-regex": ["emoji-regex@9.2.2", "", {}, "sha512-L18DaJsXSUk2+42pv8mLs5jJT2hqFkFE4j21wOmgbUqsZ2hL72NsUU785g9RXgo3s0ZNgVl42TiHp3ZtOv/Vyg=="],
"enhanced-resolve": ["enhanced-resolve@5.20.1", "", { "dependencies": { "graceful-fs": "^4.2.4", "tapable": "^2.3.0" } }, "sha512-Qohcme7V1inbAfvjItgw0EaxVX5q2rdVEZHRBrEQdRZTssLDGsL8Lwrznl8oQ/6kuTJONLaDcGjkNP247XEhcA=="],
"entities": ["entities@6.0.1", "", {}, "sha512-aN97NXWF6AWBTahfVOIrB/NShkzi5H7F9r1s9mD3cDj4Ko5f2qhhVoYMibXF7GlLveb/D2ioWay8lxI97Ven3g=="],
"es-abstract": ["es-abstract@1.24.1", "", { "dependencies": { "array-buffer-byte-length": "^1.0.2", "arraybuffer.prototype.slice": "^1.0.4", "available-typed-arrays": "^1.0.7", "call-bind": "^1.0.8", "call-bound": "^1.0.4", "data-view-buffer": "^1.0.2", "data-view-byte-length": "^1.0.2", "data-view-byte-offset": "^1.0.1", "es-define-property": "^1.0.1", "es-errors": "^1.3.0", "es-object-atoms": "^1.1.1", "es-set-tostringtag": "^2.1.0", "es-to-primitive": "^1.3.0", "function.prototype.name": "^1.1.8", "get-intrinsic": "^1.3.0", "get-proto": "^1.0.1", "get-symbol-description": "^1.1.0", "globalthis": "^1.0.4", "gopd": "^1.2.0", "has-property-descriptors": "^1.0.2", "has-proto": "^1.2.0", "has-symbols": "^1.1.0", "hasown": "^2.0.2", "internal-slot": "^1.1.0", "is-array-buffer": "^3.0.5", "is-callable": "^1.2.7", "is-data-view": "^1.0.2", "is-negative-zero": "^2.0.3", "is-regex": "^1.2.1", "is-set": "^2.0.3", "is-shared-array-buffer": "^1.0.4", "is-string": "^1.1.1", "is-typed-array": "^1.1.15", "is-weakref": "^1.1.1", "math-intrinsics": "^1.1.0", "object-inspect": "^1.13.4", "object-keys": "^1.1.1", "object.assign": "^4.1.7", "own-keys": "^1.0.1", "regexp.prototype.flags": "^1.5.4", "safe-array-concat": "^1.1.3", "safe-push-apply": "^1.0.0", "safe-regex-test": "^1.1.0", "set-proto": "^1.0.0", "stop-iteration-iterator": "^1.1.0", "string.prototype.trim": "^1.2.10", "string.prototype.trimend": "^1.0.9", "string.prototype.trimstart": "^1.0.8", "typed-array-buffer": "^1.0.3", "typed-array-byte-length": "^1.0.3", "typed-array-byte-offset": "^1.0.4", "typed-array-length": "^1.0.7", "unbox-primitive": "^1.1.0", "which-typed-array": "^1.1.19" } }, "sha512-zHXBLhP+QehSSbsS9Pt23Gg964240DPd6QCf8WpkqEXxQ7fhdZzYsocOr5u7apWonsS5EjZDmTF+/slGMyasvw=="],
"es-define-property": ["es-define-property@1.0.1", "", {}, "sha512-e3nRfgfUZ4rNGL232gUgX06QNyyez04KdjFrF+LTRoOXmrOgFKDg4BCdsjW8EnT69eqdYGmRpJwiPVYNrCaW3g=="],
@@ -392,6 +649,8 @@
"es-iterator-helpers": ["es-iterator-helpers@1.3.1", "", { "dependencies": { "call-bind": "^1.0.8", "call-bound": "^1.0.4", "define-properties": "^1.2.1", "es-abstract": "^1.24.1", "es-errors": "^1.3.0", "es-set-tostringtag": "^2.1.0", "function-bind": "^1.1.2", "get-intrinsic": "^1.3.0", "globalthis": "^1.0.4", "gopd": "^1.2.0", "has-property-descriptors": "^1.0.2", "has-proto": "^1.2.0", "has-symbols": "^1.1.0", "internal-slot": "^1.1.0", "iterator.prototype": "^1.1.5", "math-intrinsics": "^1.1.0", "safe-array-concat": "^1.1.3" } }, "sha512-zWwRvqWiuBPr0muUG/78cW3aHROFCNIQ3zpmYDpwdbnt2m+xlNyRWpHBpa2lJjSBit7BQ+RXA1iwbSmu5yJ/EQ=="],
"es-module-lexer": ["es-module-lexer@1.7.0", "", {}, "sha512-jEQoCwk8hyb2AZziIOLhDqpm5+2ww5uIE6lkO/6jcOCusfk6LhMHpXXfBLXTZ7Ydyt0j4VoUQv6uGNYbdW+kBA=="],
"es-object-atoms": ["es-object-atoms@1.1.1", "", { "dependencies": { "es-errors": "^1.3.0" } }, "sha512-FGgH2h8zKNim9ljj7dankFPcICIK9Cp5bm+c2gQSYePhpaG5+esrLODihIorn+Pe6FGJzWhXQotPv73jTaldXA=="],
"es-set-tostringtag": ["es-set-tostringtag@2.1.0", "", { "dependencies": { "es-errors": "^1.3.0", "get-intrinsic": "^1.2.6", "has-tostringtag": "^1.0.2", "hasown": "^2.0.2" } }, "sha512-j6vWzfrGVfyXxge+O0x5sh6cvxAog0a/4Rdd2K36zCMV5eJ+/+tOAngRO8cODMNWbVRdVlmGZQL2YS3yR8bIUA=="],
@@ -400,6 +659,12 @@
"es-to-primitive": ["es-to-primitive@1.3.0", "", { "dependencies": { "is-callable": "^1.2.7", "is-date-object": "^1.0.5", "is-symbol": "^1.0.4" } }, "sha512-w+5mJ3GuFL+NjVtJlvydShqE1eN3h3PbI7/5LAsYJP/2qtuMXjfL2LpHSRqo4b4eSF5K/DH1JXKUAHSB2UW50g=="],
"es-toolkit": ["es-toolkit@1.45.1", "", {}, "sha512-/jhoOj/Fx+A+IIyDNOvO3TItGmlMKhtX8ISAHKE90c4b/k1tqaqEZ+uUqfpU8DMnW5cgNJv606zS55jGvza0Xw=="],
"esbuild": ["esbuild@0.21.5", "", { "optionalDependencies": { "@esbuild/aix-ppc64": "0.21.5", "@esbuild/android-arm": "0.21.5", "@esbuild/android-arm64": "0.21.5", "@esbuild/android-x64": "0.21.5", "@esbuild/darwin-arm64": "0.21.5", "@esbuild/darwin-x64": "0.21.5", "@esbuild/freebsd-arm64": "0.21.5", "@esbuild/freebsd-x64": "0.21.5", "@esbuild/linux-arm": "0.21.5", "@esbuild/linux-arm64": "0.21.5", "@esbuild/linux-ia32": "0.21.5", "@esbuild/linux-loong64": "0.21.5", "@esbuild/linux-mips64el": "0.21.5", "@esbuild/linux-ppc64": "0.21.5", "@esbuild/linux-riscv64": "0.21.5", "@esbuild/linux-s390x": "0.21.5", "@esbuild/linux-x64": "0.21.5", "@esbuild/netbsd-x64": "0.21.5", "@esbuild/openbsd-x64": "0.21.5", "@esbuild/sunos-x64": "0.21.5", "@esbuild/win32-arm64": "0.21.5", "@esbuild/win32-ia32": "0.21.5", "@esbuild/win32-x64": "0.21.5" }, "bin": { "esbuild": "bin/esbuild" } }, "sha512-mg3OPMV4hXywwpoDxu3Qda5xCKQi+vCTZq8S9J/EpkhB2HzKXq4SNFZE3+NK93JYxc8VMSep+lOUSC/RVKaBqw=="],
"escalade": ["escalade@3.2.0", "", {}, "sha512-WUj2qlxaQtO4g6Pq5c29GTcWGDyd8itL8zTlipgECz3JesAiiOKotd8JU6otB3PACgG6xkJUyVhboMS+bje/jA=="],
"escape-string-regexp": ["escape-string-regexp@4.0.0", "", {}, "sha512-TtpcNJ3XAzx3Gq8sWRzJaVajRs0uVxA2YAkdb1jm2YkPz4G6egUFAyA3n5vtEIZefPk5Wa4UXbKuS5fKkJWdgA=="],
"eslint": ["eslint@9.39.4", "", { "dependencies": { "@eslint-community/eslint-utils": "^4.8.0", "@eslint-community/regexpp": "^4.12.1", "@eslint/config-array": "^0.21.2", "@eslint/config-helpers": "^0.4.2", "@eslint/core": "^0.17.0", "@eslint/eslintrc": "^3.3.5", "@eslint/js": "9.39.4", "@eslint/plugin-kit": "^0.4.1", "@humanfs/node": "^0.16.6", "@humanwhocodes/module-importer": "^1.0.1", "@humanwhocodes/retry": "^0.4.2", "@types/estree": "^1.0.6", "ajv": "^6.14.0", "chalk": "^4.0.0", "cross-spawn": "^7.0.6", "debug": "^4.3.2", "escape-string-regexp": "^4.0.0", "eslint-scope": "^8.4.0", "eslint-visitor-keys": "^4.2.1", "espree": "^10.4.0", "esquery": "^1.5.0", "esutils": "^2.0.2", "fast-deep-equal": "^3.1.3", "file-entry-cache": "^8.0.0", "find-up": "^5.0.0", "glob-parent": "^6.0.2", "ignore": "^5.2.0", "imurmurhash": "^0.1.4", "is-glob": "^4.0.0", "json-stable-stringify-without-jsonify": "^1.0.1", "lodash.merge": "^4.6.2", "minimatch": "^3.1.5", "natural-compare": "^1.4.0", "optionator": "^0.9.3" }, "peerDependencies": { "jiti": "*" }, "optionalPeers": ["jiti"], "bin": { "eslint": "bin/eslint.js" } }, "sha512-XoMjdBOwe/esVgEvLmNsD3IRHkm7fbKIUGvrleloJXUZgDHig2IPWNniv+GwjyJXzuNqVjlr5+4yVUZjycJwfQ=="],
@@ -432,8 +697,14 @@
"estraverse": ["estraverse@5.3.0", "", {}, "sha512-MMdARuVEQziNTeJD8DgMqmhwR11BRQ/cBP+pLtYdSTnf3MIO8fFeiINEbX36ZdNlfU/7A9f3gUw49B3oQsvwBA=="],
"estree-walker": ["estree-walker@3.0.3", "", { "dependencies": { "@types/estree": "^1.0.0" } }, "sha512-7RUKfXgSMMkzt6ZuXmqapOurLGPPfgj6l9uRZ7lRGolvk0y2yocc35LdcxKC5PQZdn2DMqioAQ2NoWcrTKmm6g=="],
"esutils": ["esutils@2.0.3", "", {}, "sha512-kVscqXk4OCp68SZ0dkgEKVi6/8ij300KBWTJq32P/dYeWTSwK41WyTxalN1eRmA5Z9UU/LX9D7FWSmV9SAYx6g=="],
"eventemitter3": ["eventemitter3@5.0.4", "", {}, "sha512-mlsTRyGaPBjPedk6Bvw+aqbsXDtoAyAzm5MO7JgU+yVRyMQ5O8bD4Kcci7BS85f93veegeCPkL8R4GLClnjLFw=="],
"expect-type": ["expect-type@1.3.0", "", {}, "sha512-knvyeauYhqjOYvQ66MznSMs83wmHrCycNEN6Ao+2AeYEfxUIkuiVxdEa1qlGEPK+We3n0THiDciYSsCcgW/DoA=="],
"fast-deep-equal": ["fast-deep-equal@3.1.3", "", {}, "sha512-f3qQ9oQy9j2AhBe/H9VC91wLmKBCCU/gDOnKNAYG5hswO7BLKj09Hc5HYNz9cGI++xlpDCIgDaitVs03ATR84Q=="],
"fast-glob": ["fast-glob@3.3.1", "", { "dependencies": { "@nodelib/fs.stat": "^2.0.2", "@nodelib/fs.walk": "^1.2.3", "glob-parent": "^5.1.2", "merge2": "^1.3.0", "micromatch": "^4.0.4" } }, "sha512-kNFPyjhh5cKjrUltxs+wFx+ZkbRaxxmZ+X0ZU31SOsxCEtP9VPgtq2teZw1DebupL5GmDaNQ6yKMMVcM41iqDg=="],
@@ -460,6 +731,10 @@
"for-each": ["for-each@0.3.5", "", { "dependencies": { "is-callable": "^1.2.7" } }, "sha512-dKx12eRCVIzqCxFGplyFKJMPvLEWgmNtUrpTiJIR5u97zEhRG8ySrtboPHZXx7daLxQVrl643cTzbab2tkQjxg=="],
"form-data": ["form-data@4.0.5", "", { "dependencies": { "asynckit": "^0.4.0", "combined-stream": "^1.0.8", "es-set-tostringtag": "^2.1.0", "hasown": "^2.0.2", "mime-types": "^2.1.12" } }, "sha512-8RipRLol37bNs2bhoV67fiTEvdTrbMUYcFTiy3+wuuOnUog2QBHCZWXDRijWQfAkhBj2Uf5UnVaiWwA5vdd82w=="],
"fsevents": ["fsevents@2.3.3", "", { "os": "darwin" }, "sha512-5xoDfX+fL7faATnagmWPpbFtwh/R77WmMMqqHGS65C3vvB0YHrgF+B1YmZ3441tMj5n63k0212XNoJwzlhffQw=="],
"function-bind": ["function-bind@1.1.2", "", {}, "sha512-7XHNxH7qX9xG5mIwxkhumTox/MIRNcOgDrxWsMt2pAr23WHp6MrRlN7FBSFpCpr+oVO0F744iUgR82nJMfG2SA=="],
"function.prototype.name": ["function.prototype.name@1.1.8", "", { "dependencies": { "call-bind": "^1.0.8", "call-bound": "^1.0.3", "define-properties": "^1.2.1", "functions-have-names": "^1.2.3", "hasown": "^2.0.2", "is-callable": "^1.2.7" } }, "sha512-e5iwyodOHhbMr/yNrc7fDYG4qlbIvI5gajyzPnb5TCwyhjApznQh1BMFou9b30SevY43gCJKXycoCBjMbsuW0Q=="],
@@ -468,6 +743,8 @@
"generator-function": ["generator-function@2.0.1", "", {}, "sha512-SFdFmIJi+ybC0vjlHN0ZGVGHc3lgE0DxPAT0djjVg+kjOnSqclqmj0KQ7ykTOLP6YxoqOvuAODGdcHJn+43q3g=="],
"get-caller-file": ["get-caller-file@2.0.5", "", {}, "sha512-DyFP3BM/3YHTQOCUL/w0OZHR0lpKeGrxotcHWcqNEdnltqFwXVfhEBQ94eIo34AfQpo0rGki4cyIiftY06h2Fg=="],
"get-intrinsic": ["get-intrinsic@1.3.0", "", { "dependencies": { "call-bind-apply-helpers": "^1.0.2", "es-define-property": "^1.0.1", "es-errors": "^1.3.0", "es-object-atoms": "^1.1.1", "function-bind": "^1.1.2", "get-proto": "^1.0.1", "gopd": "^1.2.0", "has-symbols": "^1.1.0", "hasown": "^2.0.2", "math-intrinsics": "^1.1.0" } }, "sha512-9fSjSaos/fRIVIp+xSJlE6lfwhES7LNtKaCBIamHsjr2na1BiABJPo0mOjjz8GJDURarmCPGqaiVg5mfjb98CQ=="],
"get-proto": ["get-proto@1.0.1", "", { "dependencies": { "dunder-proto": "^1.0.1", "es-object-atoms": "^1.0.0" } }, "sha512-sTSfBjoXBp89JvIKIefqw7U2CCebsc74kiY6awiGogKtoSGbgjYE/G/+l9sF3MWFPNc9IcoOC4ODfKHfxFmp0g=="],
@@ -486,6 +763,8 @@
"graceful-fs": ["graceful-fs@4.2.11", "", {}, "sha512-RbJ5/jmFcNNCcDV5o9eTnBLJ/HszWV0P73bc+Ff4nS/rJj+YaS6IGyiOL0VoBYX+l1Wrl3k63h/KrH+nhJ0XvQ=="],
"graphql": ["graphql@16.13.2", "", {}, "sha512-5bJ+nf/UCpAjHM8i06fl7eLyVC9iuNAjm9qzkiu2ZGhM0VscSvS6WDPfAwkdkBuoXGM9FJSbKl6wylMwP9Ktig=="],
"has-bigints": ["has-bigints@1.1.0", "", {}, "sha512-R3pbpkcIqv2Pm3dUwgjclDRVmWpTJW2DcMzcIhEXEx1oh/CEMObMm3KLmRJOdvhM7o4uQBnwr8pzRK2sJWIqfg=="],
"has-flag": ["has-flag@4.0.0", "", {}, "sha512-EykJT/Q1KjTWctppgIAgfSO0tKVuZUjhgMr17kqTumMl6Afv3EISleU7qZUzoXDFTAHTDC4NOoG/ZxU3EvlMPQ=="],
@@ -500,16 +779,32 @@
"hasown": ["hasown@2.0.2", "", { "dependencies": { "function-bind": "^1.1.2" } }, "sha512-0hJU9SCPvmMzIBdZFqNPXWa6dqh7WdH0cII9y+CyS8rG3nL48Bclra9HmKhVVUHyPWNH5Y7xDwAB7bfgSjkUMQ=="],
"headers-polyfill": ["headers-polyfill@4.0.3", "", {}, "sha512-IScLbePpkvO846sIwOtOTDjutRMWdXdJmXdMvk6gCBHxFO8d+QKOQedyZSxFTTFYRSmlgSTDtXqqq4pcenBXLQ=="],
"html-encoding-sniffer": ["html-encoding-sniffer@4.0.0", "", { "dependencies": { "whatwg-encoding": "^3.1.1" } }, "sha512-Y22oTqIU4uuPgEemfz7NDJz6OeKf12Lsu+QC+s3BVpda64lTiMYCyGwg5ki4vFxkMwQdeZDl2adZoqUgdFuTgQ=="],
"http-proxy-agent": ["http-proxy-agent@7.0.2", "", { "dependencies": { "agent-base": "^7.1.0", "debug": "^4.3.4" } }, "sha512-T1gkAiYYDWYx3V5Bmyu7HcfcvL7mUrTWiM6yOfa3PIphViJ/gFPbvidQ+veqSOHci/PxBcDabeUNCzpOODJZig=="],
"https-proxy-agent": ["https-proxy-agent@7.0.6", "", { "dependencies": { "agent-base": "^7.1.2", "debug": "4" } }, "sha512-vK9P5/iUfdl95AI+JVyUuIcVtd4ofvtrOr3HNtM2yxC9bnMbEdp3x01OhQNnjb8IJYi38VlTE3mBXwcfvywuSw=="],
"iceberg-js": ["iceberg-js@0.8.1", "", {}, "sha512-1dhVQZXhcHje7798IVM+xoo/1ZdVfzOMIc8/rgVSijRK38EDqOJoGula9N/8ZI5RD8QTxNQtK/Gozpr+qUqRRA=="],
"iconv-lite": ["iconv-lite@0.6.3", "", { "dependencies": { "safer-buffer": ">= 2.1.2 < 3.0.0" } }, "sha512-4fCk79wshMdzMp2rH06qWrJE4iolqLhCUH+OiuIgU++RB0+94NlDL81atO7GX55uUKueo0txHNtvEyI6D7WdMw=="],
"ignore": ["ignore@5.3.2", "", {}, "sha512-hsBTNUqQTDwkWtcdYI2i06Y/nUBEsNEDJKjWdigLvegy8kDuJAS8uRlpkkcQpyEXL0Z/pjDy5HBmMjRCJ2gq+g=="],
"immer": ["immer@10.2.0", "", {}, "sha512-d/+XTN3zfODyjr89gM3mPq1WNX2B8pYsu7eORitdwyA2sBubnTl3laYlBk4sXY5FUa5qTZGBDPJICVbvqzjlbw=="],
"import-fresh": ["import-fresh@3.3.1", "", { "dependencies": { "parent-module": "^1.0.0", "resolve-from": "^4.0.0" } }, "sha512-TR3KfrTZTYLPB6jUjfx6MF9WcWrHL9su5TObK4ZkYgBdWKPOFoSoQIdEuTuR82pmtxH2spWG9h6etwfr1pLBqQ=="],
"imurmurhash": ["imurmurhash@0.1.4", "", {}, "sha512-JmXMZ6wuvDmLiHEml9ykzqO6lwFbof0GG4IkcGaENdCRDDmMVnny7s5HsIgHCbaq0w2MyPhDqkhTUgS2LU2PHA=="],
"indent-string": ["indent-string@4.0.0", "", {}, "sha512-EdDDZu4A2OyIK7Lr/2zG+w5jmbuk1DVBnEwREQvBzspBJkCEbRa8GxU1lghYcaGJCnRWibjDXlq779X1/y5xwg=="],
"internal-slot": ["internal-slot@1.1.0", "", { "dependencies": { "es-errors": "^1.3.0", "hasown": "^2.0.2", "side-channel": "^1.1.0" } }, "sha512-4gd7VpWNQNB4UKKCFFVcp1AVv+FMOgs9NKzjHKusc8jTMhd5eL1NqQqOpE0KzMds804/yHlglp3uxgluOqAPLw=="],
"internmap": ["internmap@2.0.3", "", {}, "sha512-5Hh7Y1wQbvY5ooGgPbDaL5iYLAPzMTUrjMulskHLH6wnv/A+1q5rgEaiuqEjB+oxGXIVZs1FF+R/KPN3ZSQYYg=="],
"is-array-buffer": ["is-array-buffer@3.0.5", "", { "dependencies": { "call-bind": "^1.0.8", "call-bound": "^1.0.3", "get-intrinsic": "^1.2.6" } }, "sha512-DDfANUiiG2wC1qawP66qlTugJeL5HyzMpfr8lLK+jMQirGzNod0B12cFB/9q838Ru27sBwfw78/rdoU7RERz6A=="],
"is-async-function": ["is-async-function@2.1.1", "", { "dependencies": { "async-function": "^1.0.0", "call-bound": "^1.0.3", "get-proto": "^1.0.1", "has-tostringtag": "^1.0.2", "safe-regex-test": "^1.1.0" } }, "sha512-9dgM/cZBnNvjzaMYHVoxxfPj2QXt22Ev7SuuPrs+xav0ukGB0S6d4ydZdEiM48kLx5kDV+QBPrpVnFyefL8kkQ=="],
@@ -532,6 +827,8 @@
"is-finalizationregistry": ["is-finalizationregistry@1.1.1", "", { "dependencies": { "call-bound": "^1.0.3" } }, "sha512-1pC6N8qWJbWoPtEjgcL2xyhQOP491EQjeUo3qTKcmV8YSDDJrOepfG8pcC7h/QgnQHYSv0mJ3Z/ZWxmatVrysg=="],
"is-fullwidth-code-point": ["is-fullwidth-code-point@3.0.0", "", {}, "sha512-zymm5+u+sCsSWyD9qNaejV3DFvhCKclKdizYaJUuHA83RLjb7nSuGnddCHGv0hk+KY7BMAlsWeK4Ueg6EV6XQg=="],
"is-generator-function": ["is-generator-function@1.1.2", "", { "dependencies": { "call-bound": "^1.0.4", "generator-function": "^2.0.0", "get-proto": "^1.0.1", "has-tostringtag": "^1.0.2", "safe-regex-test": "^1.1.0" } }, "sha512-upqt1SkGkODW9tsGNG5mtXTXtECizwtS2kA161M+gJPc1xdb/Ax629af6YrTwcOeQHbewrPNlE5Dx7kzvXTizA=="],
"is-glob": ["is-glob@4.0.3", "", { "dependencies": { "is-extglob": "^2.1.1" } }, "sha512-xelSayHH36ZgE7ZWhli7pW34hNbNl8Ojv5KVmkJD4hBdD3th8Tfk9vYasLM+mXWOZhFkgZfxhLSnrwRr4elSSg=="],
@@ -540,10 +837,14 @@
"is-negative-zero": ["is-negative-zero@2.0.3", "", {}, "sha512-5KoIu2Ngpyek75jXodFvnafB6DJgr3u8uuK0LEZJjrU19DrMD3EVERaR8sjz8CCGgpZvxPl9SuE1GMVPFHx1mw=="],
"is-node-process": ["is-node-process@1.2.0", "", {}, "sha512-Vg4o6/fqPxIjtxgUH5QLJhwZ7gW5diGCVlXpuUfELC62CuxM1iHcRe51f2W1FDy04Ai4KJkagKjx3XaqyfRKXw=="],
"is-number": ["is-number@7.0.0", "", {}, "sha512-41Cifkg6e8TylSpdtTpeLVMqvSBEVzTttHvERD741+pnZ8ANv0004MRL43QKPDlK9cGvNp6NZWZUBlbGXYxxng=="],
"is-number-object": ["is-number-object@1.1.1", "", { "dependencies": { "call-bound": "^1.0.3", "has-tostringtag": "^1.0.2" } }, "sha512-lZhclumE1G6VYD8VHe35wFaIif+CTy5SJIi5+3y4psDgWu4wPDoBhF8NxUOinEc7pHgiTsT6MaBb92rKhhD+Xw=="],
"is-potential-custom-element-name": ["is-potential-custom-element-name@1.0.1", "", {}, "sha512-bCYeRA2rVibKZd+s2625gGnGF/t7DSqDs4dP7CrLA1m7jKWz6pps0LpYLJN8Q64HtmPKJ1hrN3nzPNKFEKOUiQ=="],
"is-regex": ["is-regex@1.2.1", "", { "dependencies": { "call-bound": "^1.0.2", "gopd": "^1.2.0", "has-tostringtag": "^1.0.2", "hasown": "^2.0.2" } }, "sha512-MjYsKHO5O7mCsmRGxWcLWheFqN9DJ/2TmngvjKXihe6efViPqc274+Fx/4fYj/r03+ESvBdTXK0V6tA3rgez1g=="],
"is-set": ["is-set@2.0.3", "", {}, "sha512-iPAjerrse27/ygGLxw+EBR9agv9Y6uLeYVJMu+QNCoouJ1/1ri0mGrcWpfCqFZuzzx3WjtwxG098X+n4OuRkPg=="],
@@ -574,6 +875,8 @@
"js-yaml": ["js-yaml@4.1.1", "", { "dependencies": { "argparse": "^2.0.1" }, "bin": { "js-yaml": "bin/js-yaml.js" } }, "sha512-qQKT4zQxXl8lLwBtHMWwaTcGfFOZviOJet3Oy/xmGk2gZH677CJM9EvtfdSkgWcATZhj/55JZ0rmy3myCT5lsA=="],
"jsdom": ["jsdom@24.1.3", "", { "dependencies": { "cssstyle": "^4.0.1", "data-urls": "^5.0.0", "decimal.js": "^10.4.3", "form-data": "^4.0.0", "html-encoding-sniffer": "^4.0.0", "http-proxy-agent": "^7.0.2", "https-proxy-agent": "^7.0.5", "is-potential-custom-element-name": "^1.0.1", "nwsapi": "^2.2.12", "parse5": "^7.1.2", "rrweb-cssom": "^0.7.1", "saxes": "^6.0.0", "symbol-tree": "^3.2.4", "tough-cookie": "^4.1.4", "w3c-xmlserializer": "^5.0.0", "webidl-conversions": "^7.0.0", "whatwg-encoding": "^3.1.1", "whatwg-mimetype": "^4.0.0", "whatwg-url": "^14.0.0", "ws": "^8.18.0", "xml-name-validator": "^5.0.0" }, "peerDependencies": { "canvas": "^2.11.2" }, "optionalPeers": ["canvas"] }, "sha512-MyL55p3Ut3cXbeBEG7Hcv0mVM8pp8PBNWxRqchZnSfAiES1v1mRnMeFfaHWIPULpwsYfvO+ZmMZz5tGCnjzDUQ=="],
"json-buffer": ["json-buffer@3.0.1", "", {}, "sha512-4bV5BfR2mqfQTJm+V5tPPdf+ZpuhiIvTuAB5g8kcrXOZpTT/QwwVRWBywX1ozr6lEuPdbHxwaJlm9G6mI2sfSQ=="],
"json-schema-traverse": ["json-schema-traverse@0.4.1", "", {}, "sha512-xbbCH5dCYU5T8LcEhhuh7HJ88HXuW3qsI3Y0zOZFKfZEHcpWiHU/Jxzk629Brsab/mMiHQti9wMP+845RPe3Vg=="],
@@ -622,8 +925,14 @@
"loose-envify": ["loose-envify@1.4.0", "", { "dependencies": { "js-tokens": "^3.0.0 || ^4.0.0" }, "bin": { "loose-envify": "cli.js" } }, "sha512-lyuxPGr/Wfhrlem2CL/UcnUc1zcqKAImBDzukY7Y5F/yQiNdko6+fRLevlw1HgMySw7f611UIY408EtxRSoK3Q=="],
"loupe": ["loupe@3.2.1", "", {}, "sha512-CdzqowRJCeLU72bHvWqwRBBlLcMEtIvGrlvef74kMnV2AolS9Y8xUv1I0U/MNAWMhBlKIoyuEgoJ0t/bbwHbLQ=="],
"lru-cache": ["lru-cache@10.4.3", "", {}, "sha512-JNAzZcXrCt42VGLuYz0zfAzDfAvJWW6AfYlDBQyDV5DClI2m5sAmK+OIO7s59XfsRsWHp02jAJrRadPRGTt6SQ=="],
"lucide-react": ["lucide-react@1.6.0", "", { "peerDependencies": { "react": "^16.5.1 || ^17.0.0 || ^18.0.0 || ^19.0.0" } }, "sha512-YxLKVCOF5ZDI1AhKQE5IBYMY9y/Nr4NT15+7QEWpsTSVCdn4vmZhww+6BP76jWYjQx8rSz1Z+gGme1f+UycWEw=="],
"lz-string": ["lz-string@1.5.0", "", { "bin": { "lz-string": "bin/bin.js" } }, "sha512-h5bgJWpxJNswbU7qCrV0tIKQCaS3blPDrqKWx+QxzuzL1zGUzij9XCWLrSLsJPu5t+eWA/ycetzYAO5IOMcWAQ=="],
"magic-string": ["magic-string@0.30.21", "", { "dependencies": { "@jridgewell/sourcemap-codec": "^1.5.5" } }, "sha512-vd2F4YUyEXKGcLHoq+TEyCjxueSeHnFxyyjNp80yg0XV4vUhnDer/lvvlqM/arB5bXQN5K2/3oinyCRyx8T2CQ=="],
"math-intrinsics": ["math-intrinsics@1.1.0", "", {}, "sha512-/IXtbwEk5HTPyEwyKX6hGkYXxM9nbj64B+ilVJnC/R6B0pH5G4V3b0pVbL7DBj4tkhBAppbQUlf6F6Xl9LHu1g=="],
@@ -632,12 +941,22 @@
"micromatch": ["micromatch@4.0.8", "", { "dependencies": { "braces": "^3.0.3", "picomatch": "^2.3.1" } }, "sha512-PXwfBhYu0hBCPw8Dn0E+WDYb7af3dSLVWKi3HGv84IdF4TyFoC0ysxFd0Goxw7nSv4T/PzEJQxsYsEiFCKo2BA=="],
"mime-db": ["mime-db@1.52.0", "", {}, "sha512-sPU4uV7dYlvtWJxwwxHD0PuihVNiE7TyAbQ5SWxDCB9mUYvOgroQOwYQQOKPJ8CIbE+1ETVlOoK1UC2nU3gYvg=="],
"mime-types": ["mime-types@2.1.35", "", { "dependencies": { "mime-db": "1.52.0" } }, "sha512-ZDY+bPm5zTTF+YpCrAU9nK0UgICYPT0QtT1NZWFv4s++TNkcgVaT0g6+4R2uI4MjQjzysHB1zxuWL50hzaeXiw=="],
"min-indent": ["min-indent@1.0.1", "", {}, "sha512-I9jwMn07Sy/IwOj3zVkVik2JTvgpaykDZEigL6Rx6N9LbMywwUSMtxET+7lVoDLLd3O3IXwJwvuuns8UB/HeAg=="],
"minimatch": ["minimatch@3.1.5", "", { "dependencies": { "brace-expansion": "^1.1.7" } }, "sha512-VgjWUsnnT6n+NUk6eZq77zeFdpW2LWDzP6zFGrCbHXiYNul5Dzqk2HHQ5uFH2DNW5Xbp8+jVzaeNt94ssEEl4w=="],
"minimist": ["minimist@1.2.8", "", {}, "sha512-2yyAR8qBkN3YuheJanUpWC5U3bb5osDywNB8RzDVlDwDHbocAJveqqj1u8+SVD7jkWT4yvsHCpWqqWqAxb0zCA=="],
"ms": ["ms@2.1.3", "", {}, "sha512-6FlzubTLZG3J2a/NVCAleEhjzq5oxgHyaCU9yYXvcLsvoVaHJq/s5xXI6/XXP6tz7R9xAOtHnSO/tXtF3WRTlA=="],
"msw": ["msw@2.12.14", "", { "dependencies": { "@inquirer/confirm": "^5.0.0", "@mswjs/interceptors": "^0.41.2", "@open-draft/deferred-promise": "^2.2.0", "@types/statuses": "^2.0.6", "cookie": "^1.0.2", "graphql": "^16.12.0", "headers-polyfill": "^4.0.2", "is-node-process": "^1.2.0", "outvariant": "^1.4.3", "path-to-regexp": "^6.3.0", "picocolors": "^1.1.1", "rettime": "^0.10.1", "statuses": "^2.0.2", "strict-event-emitter": "^0.5.1", "tough-cookie": "^6.0.0", "type-fest": "^5.2.0", "until-async": "^3.0.2", "yargs": "^17.7.2" }, "peerDependencies": { "typescript": ">= 4.8.x" }, "optionalPeers": ["typescript"], "bin": { "msw": "cli/index.js" } }, "sha512-4KXa4nVBIBjbDbd7vfQNuQ25eFxug0aropCQFoI0JdOBuJWamkT1yLVIWReFI8SiTRc+H1hKzaNk+cLk2N9rtQ=="],
"mute-stream": ["mute-stream@2.0.0", "", {}, "sha512-WWdIxpyjEn+FhQJQQv9aQAYlHoNVdzIzUySNV1gHUPDSdZJ3yZn7pAAbQcV7B56Mvu881q9FZV+0Vx2xC44VWA=="],
"nanoid": ["nanoid@3.3.11", "", { "bin": { "nanoid": "bin/nanoid.cjs" } }, "sha512-N8SpfPUnUp1bK+PMYW8qSWdl9U+wwNWI4QKxOYDy9JAro3WMX7p2OeVRF9v+347pnakNevPmiHhNmZ2HbFA76w=="],
"napi-postinstall": ["napi-postinstall@0.3.4", "", { "bin": { "napi-postinstall": "lib/cli.js" } }, "sha512-PHI5f1O0EP5xJ9gQmFGMS6IZcrVvTjpXjz7Na41gTE7eE2hK11lg04CECCYEEjdc17EV4DO+fkGEtt7TpTaTiQ=="],
@@ -648,6 +967,8 @@
"node-exports-info": ["node-exports-info@1.6.0", "", { "dependencies": { "array.prototype.flatmap": "^1.3.3", "es-errors": "^1.3.0", "object.entries": "^1.1.9", "semver": "^6.3.1" } }, "sha512-pyFS63ptit/P5WqUkt+UUfe+4oevH+bFeIiPPdfb0pFeYEu/1ELnJu5l+5EcTKYL5M7zaAa7S8ddywgXypqKCw=="],
"nwsapi": ["nwsapi@2.2.23", "", {}, "sha512-7wfH4sLbt4M0gCDzGE6vzQBo0bfTKjU7Sfpqy/7gs1qBfYz2vEJH6vXcBKpO3+6Yu1telwd0t9HpyOoLEQQbIQ=="],
"object-assign": ["object-assign@4.1.1", "", {}, "sha512-rJgTQnkUnH1sFw8yT6VSU3zD3sWmu6sZhIseY8VX+GRu3P6F7Fu+JNDoXfklElbLJSnc3FUQHVe4cU5hj+BcUg=="],
"object-inspect": ["object-inspect@1.13.4", "", {}, "sha512-W67iLl4J2EXEGTbfeHCffrjDfitvLANg0UlX3wFUUSTx92KXRFegMHUVgSqE+wvhAbi4WqjGg9czysTV2Epbew=="],
@@ -666,6 +987,8 @@
"optionator": ["optionator@0.9.4", "", { "dependencies": { "deep-is": "^0.1.3", "fast-levenshtein": "^2.0.6", "levn": "^0.4.1", "prelude-ls": "^1.2.1", "type-check": "^0.4.0", "word-wrap": "^1.2.5" } }, "sha512-6IpQ7mKUxRcZNLIObR0hz7lxsapSSIYNZJwXPGeF0mTVqGKFIXj1DQcMoT22S3ROcLyY/rz0PWaWZ9ayWmad9g=="],
"outvariant": ["outvariant@1.4.3", "", {}, "sha512-+Sl2UErvtsoajRDKCE5/dBz4DIvHXQQnAxtQTF04OJxY0+DyZXSo5P5Bb7XYWOh81syohlYL24hbDwxedPUJCA=="],
"own-keys": ["own-keys@1.0.1", "", { "dependencies": { "get-intrinsic": "^1.2.6", "object-keys": "^1.1.1", "safe-push-apply": "^1.0.0" } }, "sha512-qFOyK5PjiWZd+QQIh+1jhdb9LpxTF0qs7Pm8o5QHYZ0M3vKqSqzsZaEB6oWlxZ+q2sJBMI/Ktgd2N5ZwQoRHfg=="],
"p-limit": ["p-limit@3.1.0", "", { "dependencies": { "yocto-queue": "^0.1.0" } }, "sha512-TYOanM3wGwNGsZN2cVTYPArw454xnXj5qmWF1bEoAc4+cU/ol7GVh7odevjp1FNHduHc3KZMcFduxU5Xc6uJRQ=="],
@@ -674,12 +997,20 @@
"parent-module": ["parent-module@1.0.1", "", { "dependencies": { "callsites": "^3.0.0" } }, "sha512-GQ2EWRpQV8/o+Aw8YqtfZZPfNRWZYkbidE9k5rpl/hC3vtHHBfGm2Ifi6qWV+coDGkrUKZAxE3Lot5kcsRlh+g=="],
"parse5": ["parse5@7.3.0", "", { "dependencies": { "entities": "^6.0.0" } }, "sha512-IInvU7fabl34qmi9gY8XOVxhYyMyuH2xUNpb2q8/Y+7552KlejkRvqvD19nMoUW/uQGGbqNpA6Tufu5FL5BZgw=="],
"path-exists": ["path-exists@4.0.0", "", {}, "sha512-ak9Qy5Q7jYb2Wwcey5Fpvg2KoAc/ZIhLSLOSBmRmygPsGwkVVt0fZa0qrtMz+m6tJTAHfZQ8FnmB4MG4LWy7/w=="],
"path-key": ["path-key@3.1.1", "", {}, "sha512-ojmeN0qd+y0jszEtoY48r0Peq5dwMEkIlCOu6Q5f41lfkswXuKtYrhgoTpLnyIcHm24Uhqx+5Tqm2InSwLhE6Q=="],
"path-parse": ["path-parse@1.0.7", "", {}, "sha512-LDJzPVEEEPR+y48z93A0Ed0yXb8pAByGWo/k5YYdYgpY2/2EsOsksJrq7lOHxryrVOn1ejG6oAp8ahvOIQD8sw=="],
"path-to-regexp": ["path-to-regexp@6.3.0", "", {}, "sha512-Yhpw4T9C6hPpgPeA28us07OJeqZ5EzQTkbfwuhsUg0c237RomFoETJgmp2sa3F/41gfLE6G5cqcYwznmeEeOlQ=="],
"pathe": ["pathe@1.1.2", "", {}, "sha512-whLdWMYL2TwI08hn8/ZqAbrVemu0LNaNNJZX73O6qaIdCTfXutsLhMkjdENX0qhsQ9uIimo4/aQOmXkoon2nDQ=="],
"pathval": ["pathval@2.0.1", "", {}, "sha512-//nshmD55c46FuFw26xV/xFAaB5HF9Xdap7HJBBnrKdAd6/GxDBaNA1870O79+9ueg61cZLSVc+OaFlfmObYVQ=="],
"picocolors": ["picocolors@1.1.1", "", {}, "sha512-xceH2snhtb5M9liqDsmEw56le376mTZkEX/jEb/RxNFyegNul7eNslCXP9FDj/Lcu0X8KEyMceP2ntpaHrDEVA=="],
"picomatch": ["picomatch@4.0.4", "", {}, "sha512-QP88BAKvMam/3NxH6vj2o21R6MjxZUAd6nlwAS/pnGvN9IVLocLHxGYIzFhg6fUQ+5th6P4dv4eW9jX3DSIj7A=="],
@@ -690,10 +1021,16 @@
"prelude-ls": ["prelude-ls@1.2.1", "", {}, "sha512-vkcDPrRZo1QZLbn5RLGPpg/WmIQ65qoWWhcGKf/b5eplkkarX0m9z8ppCat4mlOqUsWpyNuYgO3VRyrYHSzX5g=="],
"pretty-format": ["pretty-format@27.5.1", "", { "dependencies": { "ansi-regex": "^5.0.1", "ansi-styles": "^5.0.0", "react-is": "^17.0.1" } }, "sha512-Qb1gy5OrP5+zDf2Bvnzdl3jsTf1qXVMazbvCoKhtKqVs4/YK4ozX4gKQJJVyNe+cajNPn0KoC0MC3FUmaHWEmQ=="],
"prop-types": ["prop-types@15.8.1", "", { "dependencies": { "loose-envify": "^1.4.0", "object-assign": "^4.1.1", "react-is": "^16.13.1" } }, "sha512-oj87CgZICdulUohogVAR7AjlC0327U4el4L6eAvOqCeudMDVU0NThNaV+b9Df4dXgSP1gXMTnPdhfe/2qDH5cg=="],
"psl": ["psl@1.15.0", "", { "dependencies": { "punycode": "^2.3.1" } }, "sha512-JZd3gMVBAVQkSs6HdNZo9Sdo0LNcQeMNP3CozBJb3JYC/QUYZTnKxP+f8oWRX4rHP5EurWxqAHTSwUCjlNKa1w=="],
"punycode": ["punycode@2.3.1", "", {}, "sha512-vYt7UD1U9Wg6138shLtLOvdAu+8DsC/ilFtEVHcH+wydcSpNE20AfSOduf6MkRFahL5FY7X1oU7nKVZFtfq8Fg=="],
"querystringify": ["querystringify@2.2.0", "", {}, "sha512-FIqgj2EUvTa7R50u0rGsyTftzjYmv/a3hO345bZNrqabNqjtgiDMgmo4mkUjd+nzU5oF3dClKqFIPUKybUyqoQ=="],
"queue-microtask": ["queue-microtask@1.2.3", "", {}, "sha512-NuaNSa6flKT5JaSYQzJok04JzTL1CA6aGhv5rfLW3PgqA+M2ChpZQnAC8h8i4ZFkBS8X5RqkDBHA7r4hej3K9A=="],
"react": ["react@19.1.0", "", {}, "sha512-FS+XFBNvn3GTAWq26joslQgWNoFu08F4kl0J4CgdNKADkdSGXQyTCnKteIAJy96Br6YbpEU1LSzV5dYtjMkMDg=="],
@@ -702,20 +1039,42 @@
"react-dropzone": ["react-dropzone@15.0.0", "", { "dependencies": { "attr-accept": "^2.2.4", "file-selector": "^2.1.0", "prop-types": "^15.8.1" }, "peerDependencies": { "react": ">= 16.8 || 18.0.0" } }, "sha512-lGjYV/EoqEjEWPnmiSvH4v5IoIAwQM2W4Z1C0Q/Pw2xD0eVzKPS359BQTUMum+1fa0kH2nrKjuavmTPOGhpLPg=="],
"react-is": ["react-is@16.13.1", "", {}, "sha512-24e6ynE2H+OKt4kqsOvNd8kBpV65zoxbA4BVsEOB3ARVWQki/DHzaUoC5KuON/BiccDaCCTZBuOcfZs70kR8bQ=="],
"react-is": ["react-is@17.0.2", "", {}, "sha512-w2GsyukL62IJnlaff/nRegPQR94C/XXamvMWmSHRJ4y7Ts/4ocGRmTHvOs8PSE6pB3dWOrD/nueuU5sduBsQ4w=="],
"react-redux": ["react-redux@9.2.0", "", { "dependencies": { "@types/use-sync-external-store": "^0.0.6", "use-sync-external-store": "^1.4.0" }, "peerDependencies": { "@types/react": "^18.2.25 || ^19", "react": "^18.0 || ^19", "redux": "^5.0.0" }, "optionalPeers": ["@types/react", "redux"] }, "sha512-ROY9fvHhwOD9ySfrF0wmvu//bKCQ6AeZZq1nJNtbDC+kk5DuSuNX/n6YWYF/SYy7bSba4D4FSz8DJeKY/S/r+g=="],
"recharts": ["recharts@3.8.1", "", { "dependencies": { "@reduxjs/toolkit": "^1.9.0 || 2.x.x", "clsx": "^2.1.1", "decimal.js-light": "^2.5.1", "es-toolkit": "^1.39.3", "eventemitter3": "^5.0.1", "immer": "^10.1.1", "react-redux": "8.x.x || 9.x.x", "reselect": "5.1.1", "tiny-invariant": "^1.3.3", "use-sync-external-store": "^1.2.2", "victory-vendor": "^37.0.2" }, "peerDependencies": { "react": "^16.8.0 || ^17.0.0 || ^18.0.0 || ^19.0.0", "react-dom": "^16.0.0 || ^17.0.0 || ^18.0.0 || ^19.0.0", "react-is": "^16.8.0 || ^17.0.0 || ^18.0.0 || ^19.0.0" } }, "sha512-mwzmO1s9sFL0TduUpwndxCUNoXsBw3u3E/0+A+cLcrSfQitSG62L32N69GhqUrrT5qKcAE3pCGVINC6pqkBBQg=="],
"redent": ["redent@3.0.0", "", { "dependencies": { "indent-string": "^4.0.0", "strip-indent": "^3.0.0" } }, "sha512-6tDA8g98We0zd0GvVeMT9arEOnTw9qM03L9cJXaCjrip1OO764RDBLBfrB4cwzNGDj5OA5ioymC9GkizgWJDUg=="],
"redux": ["redux@5.0.1", "", {}, "sha512-M9/ELqF6fy8FwmkpnF0S3YKOqMyoWJ4+CS5Efg2ct3oY9daQvd/Pc71FpGZsVsbl3Cpb+IIcjBDUnnyBdQbq4w=="],
"redux-thunk": ["redux-thunk@3.1.0", "", { "peerDependencies": { "redux": "^5.0.0" } }, "sha512-NW2r5T6ksUKXCabzhL9z+h206HQw/NJkcLm1GPImRQ8IzfXwRGqjVhKJGauHirT0DAuyy6hjdnMZaRoAcy0Klw=="],
"reflect.getprototypeof": ["reflect.getprototypeof@1.0.10", "", { "dependencies": { "call-bind": "^1.0.8", "define-properties": "^1.2.1", "es-abstract": "^1.23.9", "es-errors": "^1.3.0", "es-object-atoms": "^1.0.0", "get-intrinsic": "^1.2.7", "get-proto": "^1.0.1", "which-builtin-type": "^1.2.1" } }, "sha512-00o4I+DVrefhv+nX0ulyi3biSHCPDe+yLv5o/p6d/UVlirijB8E16FtfwSAi4g3tcqrQ4lRAqQSoFEZJehYEcw=="],
"regexp.prototype.flags": ["regexp.prototype.flags@1.5.4", "", { "dependencies": { "call-bind": "^1.0.8", "define-properties": "^1.2.1", "es-errors": "^1.3.0", "get-proto": "^1.0.1", "gopd": "^1.2.0", "set-function-name": "^2.0.2" } }, "sha512-dYqgNSZbDwkaJ2ceRd9ojCGjBq+mOm9LmtXnAnEGyHhN/5R7iDW2TRw3h+o/jCFxus3P2LfWIIiwowAjANm7IA=="],
"require-directory": ["require-directory@2.1.1", "", {}, "sha512-fGxEI7+wsG9xrvdjsrlmL22OMTTiHRwAMroiEeMgq8gzoLC/PQr7RsRDSTLUg/bZAZtF+TVIkHc6/4RIKrui+Q=="],
"requires-port": ["requires-port@1.0.0", "", {}, "sha512-KigOCHcocU3XODJxsu8i/j8T9tzT4adHiecwORRQ0ZZFcp7ahwXuRU1m+yuO90C5ZUyGeGfocHDI14M3L3yDAQ=="],
"reselect": ["reselect@5.1.1", "", {}, "sha512-K/BG6eIky/SBpzfHZv/dd+9JBFiS4SWV7FIujVyJRux6e45+73RaUHXLmIR1f7WOMaQ0U1km6qwklRQxpJJY0w=="],
"resolve": ["resolve@1.22.11", "", { "dependencies": { "is-core-module": "^2.16.1", "path-parse": "^1.0.7", "supports-preserve-symlinks-flag": "^1.0.0" }, "bin": { "resolve": "bin/resolve" } }, "sha512-RfqAvLnMl313r7c9oclB1HhUEAezcpLjz95wFH4LVuhk9JF/r22qmVP9AMmOU4vMX7Q8pN8jwNg/CSpdFnMjTQ=="],
"resolve-from": ["resolve-from@4.0.0", "", {}, "sha512-pb/MYmXstAkysRFx8piNI1tGFNQIFA3vkE3Gq4EuA1dF6gHp/+vgZqsCGJapvy8N3Q+4o7FwvquPJcnZ7RYy4g=="],
"resolve-pkg-maps": ["resolve-pkg-maps@1.0.0", "", {}, "sha512-seS2Tj26TBVOC2NIc2rOe2y2ZO7efxITtLZcGSOnHHNOQ7CkiUBfw0Iw2ck6xkIhPwLhKNLS8BO+hEpngQlqzw=="],
"rettime": ["rettime@0.10.1", "", {}, "sha512-uyDrIlUEH37cinabq0AX4QbgV4HbFZ/gqoiunWQ1UqBtRvTTytwhNYjE++pO/MjPTZL5KQCf2bEoJ/BJNVQ5Kw=="],
"reusify": ["reusify@1.1.0", "", {}, "sha512-g6QUff04oZpHs0eG5p83rFLhHeV00ug/Yf9nZM6fLeUrPguBTkTQOdpAWWspMh55TZfVQDPaN3NQJfbVRAxdIw=="],
"rollup": ["rollup@4.60.0", "", { "dependencies": { "@types/estree": "1.0.8" }, "optionalDependencies": { "@rollup/rollup-android-arm-eabi": "4.60.0", "@rollup/rollup-android-arm64": "4.60.0", "@rollup/rollup-darwin-arm64": "4.60.0", "@rollup/rollup-darwin-x64": "4.60.0", "@rollup/rollup-freebsd-arm64": "4.60.0", "@rollup/rollup-freebsd-x64": "4.60.0", "@rollup/rollup-linux-arm-gnueabihf": "4.60.0", "@rollup/rollup-linux-arm-musleabihf": "4.60.0", "@rollup/rollup-linux-arm64-gnu": "4.60.0", "@rollup/rollup-linux-arm64-musl": "4.60.0", "@rollup/rollup-linux-loong64-gnu": "4.60.0", "@rollup/rollup-linux-loong64-musl": "4.60.0", "@rollup/rollup-linux-ppc64-gnu": "4.60.0", "@rollup/rollup-linux-ppc64-musl": "4.60.0", "@rollup/rollup-linux-riscv64-gnu": "4.60.0", "@rollup/rollup-linux-riscv64-musl": "4.60.0", "@rollup/rollup-linux-s390x-gnu": "4.60.0", "@rollup/rollup-linux-x64-gnu": "4.60.0", "@rollup/rollup-linux-x64-musl": "4.60.0", "@rollup/rollup-openbsd-x64": "4.60.0", "@rollup/rollup-openharmony-arm64": "4.60.0", "@rollup/rollup-win32-arm64-msvc": "4.60.0", "@rollup/rollup-win32-ia32-msvc": "4.60.0", "@rollup/rollup-win32-x64-gnu": "4.60.0", "@rollup/rollup-win32-x64-msvc": "4.60.0", "fsevents": "~2.3.2" }, "bin": { "rollup": "dist/bin/rollup" } }, "sha512-yqjxruMGBQJ2gG4HtjZtAfXArHomazDHoFwFFmZZl0r7Pdo7qCIXKqKHZc8yeoMgzJJ+pO6pEEHa+V7uzWlrAQ=="],
"rrweb-cssom": ["rrweb-cssom@0.7.1", "", {}, "sha512-TrEMa7JGdVm0UThDJSx7ddw5nVm3UJS9o9CCIZ72B1vSyEZoziDqBYP3XIoi/12lKrJR8rE3jeFHMok2F/Mnsg=="],
"run-parallel": ["run-parallel@1.2.0", "", { "dependencies": { "queue-microtask": "^1.2.2" } }, "sha512-5l4VyZR86LZ/lDxZTR6jqL8AFE2S0IFLMP26AbjsLVADxHdhB/c0GUsH+y39UfCi3dzz8OlQuPmnaJOMoDHQBA=="],
"safe-array-concat": ["safe-array-concat@1.1.3", "", { "dependencies": { "call-bind": "^1.0.8", "call-bound": "^1.0.2", "get-intrinsic": "^1.2.6", "has-symbols": "^1.1.0", "isarray": "^2.0.5" } }, "sha512-AURm5f0jYEOydBj7VQlVvDrjeFgthDdEF5H1dP+6mNpoXOMo1quQqJ4wvJDyRZ9+pO3kGWoOdmV08cSv2aJV6Q=="],
@@ -724,6 +1083,10 @@
"safe-regex-test": ["safe-regex-test@1.1.0", "", { "dependencies": { "call-bound": "^1.0.2", "es-errors": "^1.3.0", "is-regex": "^1.2.1" } }, "sha512-x/+Cz4YrimQxQccJf5mKEbIa1NzeCRNI5Ecl/ekmlYaampdNLPalVyIcCZNNH3MvmqBugV5TMYZXv0ljslUlaw=="],
"safer-buffer": ["safer-buffer@2.1.2", "", {}, "sha512-YZo3K82SD7Riyi0E1EQPojLz7kpepnSQI9IyPbHHg1XXXevb5dJI7tpyN2ADxGcQbHG7vcyRHk0cbwqcQriUtg=="],
"saxes": ["saxes@6.0.0", "", { "dependencies": { "xmlchars": "^2.2.0" } }, "sha512-xAg7SOnEhrm5zI3puOOKyy1OMcMlIJZYNJY7xLBwSze0UjhPLnWfj2GF2EpT0jmzaJKIWKHLsaSSajf35bcYnA=="],
"scheduler": ["scheduler@0.26.0", "", {}, "sha512-NlHwttCI/l5gCPR3D1nNXtWABUmBwvZpEQiD4IXSbIDq8BzLIK/7Ir5gTFSGZDUu37K5cMNp0hFtzO38sC7gWA=="],
"semver": ["semver@6.3.1", "", { "bin": { "semver": "bin/semver.js" } }, "sha512-BR7VvDCVHO+q2xBEWskxS6DJE1qRnb7DxzUrogb71CWoSficBxYsiAGd+Kl0mmq/MprG9yArRkyrQxTO6XjMzA=="],
@@ -748,14 +1111,28 @@
"side-channel-weakmap": ["side-channel-weakmap@1.0.2", "", { "dependencies": { "call-bound": "^1.0.2", "es-errors": "^1.3.0", "get-intrinsic": "^1.2.5", "object-inspect": "^1.13.3", "side-channel-map": "^1.0.1" } }, "sha512-WPS/HvHQTYnHisLo9McqBHOJk2FkHO/tlpvldyrnem4aeQp4hai3gythswg6p01oSoTl58rcpiFAjF2br2Ak2A=="],
"siginfo": ["siginfo@2.0.0", "", {}, "sha512-ybx0WO1/8bSBLEWXZvEd7gMW3Sn3JFlW3TvX1nREbDLRNQNaeNN8WK0meBwPdAaOI7TtRRRJn/Es1zhrrCHu7g=="],
"signal-exit": ["signal-exit@4.1.0", "", {}, "sha512-bzyZ1e88w9O1iNJbKnOlvYTrWPDl46O1bG0D3XInv+9tkPrxrN8jUUTiFlDkkmKWgn1M6CfIA13SuGqOa9Korw=="],
"sonner": ["sonner@2.0.7", "", { "peerDependencies": { "react": "^18.0.0 || ^19.0.0 || ^19.0.0-rc", "react-dom": "^18.0.0 || ^19.0.0 || ^19.0.0-rc" } }, "sha512-W6ZN4p58k8aDKA4XPcx2hpIQXBRAgyiWVkYhT7CvK6D3iAu7xjvVyhQHg2/iaKJZ1XVJ4r7XuwGL+WGEK37i9w=="],
"source-map-js": ["source-map-js@1.2.1", "", {}, "sha512-UXWMKhLOwVKb728IUtQPXxfYU+usdybtUrK/8uGE8CQMvrhOpwvzDBwj0QhSL7MQc7vIsISBG8VQ8+IDQxpfQA=="],
"stable-hash": ["stable-hash@0.0.5", "", {}, "sha512-+L3ccpzibovGXFK+Ap/f8LOS0ahMrHTf3xu7mMLSpEGU0EO9ucaysSylKo9eRDFNhWve/y275iPmIZ4z39a9iA=="],
"stackback": ["stackback@0.0.2", "", {}, "sha512-1XMJE5fQo1jGH6Y/7ebnwPOBEkIEnT4QF32d5R1+VXdXveM0IBMJt8zfaxX1P3QhVwrYe+576+jkANtSS2mBbw=="],
"statuses": ["statuses@2.0.2", "", {}, "sha512-DvEy55V3DB7uknRo+4iOGT5fP1slR8wQohVdknigZPMpMstaKJQWhwiYBACJE3Ul2pTnATihhBYnRhZQHGBiRw=="],
"std-env": ["std-env@3.10.0", "", {}, "sha512-5GS12FdOZNliM5mAOxFRg7Ir0pWz8MdpYm6AY6VPkGpbA7ZzmbzNcBJQ0GPvvyWgcY7QAhCgf9Uy89I03faLkg=="],
"stop-iteration-iterator": ["stop-iteration-iterator@1.1.0", "", { "dependencies": { "es-errors": "^1.3.0", "internal-slot": "^1.1.0" } }, "sha512-eLoXW/DHyl62zxY4SCaIgnRhuMr6ri4juEYARS8E6sCEqzKpOiE521Ucofdx+KnDZl5xmvGYaaKCk5FEOxJCoQ=="],
"strict-event-emitter": ["strict-event-emitter@0.5.1", "", {}, "sha512-vMgjE/GGEPEFnhFub6pa4FmJBRBVOLpIII2hvCZ8Kzb7K0hlHo7mQv6xYrBvCL2LtAIBwFUK8wvuJgTVSQ5MFQ=="],
"string-width": ["string-width@4.2.3", "", { "dependencies": { "emoji-regex": "^8.0.0", "is-fullwidth-code-point": "^3.0.0", "strip-ansi": "^6.0.1" } }, "sha512-wKyQRQpjJ0sIp62ErSZdGsjMJWsap5oRNihHhu6G7JVO/9jIB6UyevL+tXuOqrng8j/cxKTWyWUwvSTriiZz/g=="],
"string.prototype.includes": ["string.prototype.includes@2.0.1", "", { "dependencies": { "call-bind": "^1.0.7", "define-properties": "^1.2.1", "es-abstract": "^1.23.3" } }, "sha512-o7+c9bW6zpAdJHTtujeePODAhkuicdAryFsfVKwA+wGw89wJ4GTY484WTucM9hLtDEOpOvI+aHnzqnC5lHp4Rg=="],
"string.prototype.matchall": ["string.prototype.matchall@4.0.12", "", { "dependencies": { "call-bind": "^1.0.8", "call-bound": "^1.0.3", "define-properties": "^1.2.1", "es-abstract": "^1.23.6", "es-errors": "^1.3.0", "es-object-atoms": "^1.0.0", "get-intrinsic": "^1.2.6", "gopd": "^1.2.0", "has-symbols": "^1.1.0", "internal-slot": "^1.1.0", "regexp.prototype.flags": "^1.5.3", "set-function-name": "^2.0.2", "side-channel": "^1.1.0" } }, "sha512-6CC9uyBL+/48dYizRf7H7VAYCMCNTBeM78x/VTUe9bFEaxBepPJDa1Ow99LqI/1yF7kuy7Q3cQsYMrcjGUcskA=="],
@@ -768,8 +1145,12 @@
"string.prototype.trimstart": ["string.prototype.trimstart@1.0.8", "", { "dependencies": { "call-bind": "^1.0.7", "define-properties": "^1.2.1", "es-object-atoms": "^1.0.0" } }, "sha512-UXSH262CSZY1tfu3G3Secr6uGLCFVPMhIqHjlgCUtCCcgihYc/xKs9djMTMUOb2j1mVSeU8EU6NWc/iQKU6Gfg=="],
"strip-ansi": ["strip-ansi@6.0.1", "", { "dependencies": { "ansi-regex": "^5.0.1" } }, "sha512-Y38VPSHcqkFrCpFnQ9vuSXmquuv5oXOKpGeT6aGrr3o3Gc9AlVa6JBfUSOCnbxGGZF+/0ooI7KrPuUSztUdU5A=="],
"strip-bom": ["strip-bom@3.0.0", "", {}, "sha512-vavAMRXOgBVNF6nyEEmL3DBK19iRpDcoIwW+swQ+CbGiu7lju6t+JklA1MHweoWtadgt4ISVUsXLyDq34ddcwA=="],
"strip-indent": ["strip-indent@3.0.0", "", { "dependencies": { "min-indent": "^1.0.0" } }, "sha512-laJTa3Jb+VQpaC6DseHhF7dXVqHTfJPCRDaEbid/drOhgitgYku/letMUqOXFoWV0zIIUbjpdH2t+tYj4bQMRQ=="],
"strip-json-comments": ["strip-json-comments@3.1.1", "", {}, "sha512-6fPc+R4ihwqP6N/aIv2f1gMH8lOVtWQHoqC4yK6oSDVVocumAsfCqjkXnqiYMhmMwS/mEHLp7Vehlt3ql6lEig=="],
"styled-jsx": ["styled-jsx@5.1.6", "", { "dependencies": { "client-only": "0.0.1" }, "peerDependencies": { "react": ">= 16.8.0 || 17.x.x || ^18.0.0-0 || ^19.0.0-0" } }, "sha512-qSVyDTeMotdvQYoHWLNGwRFJHC+i+ZvdBRYosOFgC+Wg1vx4frN2/RG/NA7SYqqvKNLf39P2LSRA2pu6n0XYZA=="],
@@ -778,14 +1159,38 @@
"supports-preserve-symlinks-flag": ["supports-preserve-symlinks-flag@1.0.0", "", {}, "sha512-ot0WnXS9fgdkgIcePe6RHNk1WA8+muPa6cSjeR3V8K27q9BB1rTE3R1p7Hv0z1ZyAc8s6Vvv8DIyWf681MAt0w=="],
"symbol-tree": ["symbol-tree@3.2.4", "", {}, "sha512-9QNk5KwDF+Bvz+PyObkmSYjI5ksVUYtjW7AU22r2NKcfLJcXp96hkDWU3+XndOsUb+AQ9QhfzfCT2O+CNWT5Tw=="],
"tagged-tag": ["tagged-tag@1.0.0", "", {}, "sha512-yEFYrVhod+hdNyx7g5Bnkkb0G6si8HJurOoOEgC8B/O0uXLHlaey/65KRv6cuWBNhBgHKAROVpc7QyYqE5gFng=="],
"tailwindcss": ["tailwindcss@4.2.2", "", {}, "sha512-KWBIxs1Xb6NoLdMVqhbhgwZf2PGBpPEiwOqgI4pFIYbNTfBXiKYyWoTsXgBQ9WFg/OlhnvHaY+AEpW7wSmFo2Q=="],
"tapable": ["tapable@2.3.2", "", {}, "sha512-1MOpMXuhGzGL5TTCZFItxCc0AARf1EZFQkGqMm7ERKj8+Hgr5oLvJOVFcC+lRmR8hCe2S3jC4T5D7Vg/d7/fhA=="],
"tiny-invariant": ["tiny-invariant@1.3.3", "", {}, "sha512-+FbBPE1o9QAYvviau/qC5SE3caw21q3xkvWKBtja5vgqOWIHHJ3ioaq1VPfn/Szqctz2bU/oYeKd9/z5BL+PVg=="],
"tinybench": ["tinybench@2.9.0", "", {}, "sha512-0+DUvqWMValLmha6lr4kD8iAMK1HzV0/aKnCtWb9v9641TnP/MFb7Pc2bxoxQjTXAErryXVgUOfv2YqNllqGeg=="],
"tinyexec": ["tinyexec@0.3.2", "", {}, "sha512-KQQR9yN7R5+OSwaK0XQoj22pwHoTlgYqmUscPYoknOoWCWfj/5/ABTMRi69FrKU5ffPVh5QcFikpWJI/P1ocHA=="],
"tinyglobby": ["tinyglobby@0.2.15", "", { "dependencies": { "fdir": "^6.5.0", "picomatch": "^4.0.3" } }, "sha512-j2Zq4NyQYG5XMST4cbs02Ak8iJUdxRM0XI5QyxXuZOzKOINmWurp3smXu3y5wDcJrptwpSjgXHzIQxR0omXljQ=="],
"tinypool": ["tinypool@1.1.1", "", {}, "sha512-Zba82s87IFq9A9XmjiX5uZA/ARWDrB03OHlq+Vw1fSdt0I+4/Kutwy8BP4Y/y/aORMo61FQ0vIb5j44vSo5Pkg=="],
"tinyrainbow": ["tinyrainbow@1.2.0", "", {}, "sha512-weEDEq7Z5eTHPDh4xjX789+fHfF+P8boiFB+0vbWzpbnbsEr/GRaohi/uMKxg8RZMXnl1ItAi/IUHWMsjDV7kQ=="],
"tinyspy": ["tinyspy@3.0.2", "", {}, "sha512-n1cw8k1k0x4pgA2+9XrOkFydTerNcJ1zWCO5Nn9scWHTD+5tp8dghT2x1uduQePZTZgd3Tupf+x9BxJjeJi77Q=="],
"tldts": ["tldts@7.0.27", "", { "dependencies": { "tldts-core": "^7.0.27" }, "bin": { "tldts": "bin/cli.js" } }, "sha512-I4FZcVFcqCRuT0ph6dCDpPuO4Xgzvh+spkcTr1gK7peIvxWauoloVO0vuy1FQnijT63ss6AsHB6+OIM4aXHbPg=="],
"tldts-core": ["tldts-core@7.0.27", "", {}, "sha512-YQ7uPjgWUibIK6DW5lrKujGwUKhLevU4hcGbP5O6TcIUb+oTjJYJVWPS4nZsIHrEEEG6myk/oqAJUEQmpZrHsg=="],
"to-regex-range": ["to-regex-range@5.0.1", "", { "dependencies": { "is-number": "^7.0.0" } }, "sha512-65P7iz6X5yEr1cwcgvQxbbIw7Uk3gOy5dIdtZ4rDveLqhrdJP+Li/Hx6tyK0NEb+2GCyneCMJiGqrADCSNk8sQ=="],
"tough-cookie": ["tough-cookie@4.1.4", "", { "dependencies": { "psl": "^1.1.33", "punycode": "^2.1.1", "universalify": "^0.2.0", "url-parse": "^1.5.3" } }, "sha512-Loo5UUvLD9ScZ6jh8beX1T6sO1w2/MpCRpEP7V280GKMVUQ0Jzar2U3UJPsrdbziLEMMhu3Ujnq//rhiFuIeag=="],
"tr46": ["tr46@5.1.1", "", { "dependencies": { "punycode": "^2.3.1" } }, "sha512-hdF5ZgjTqgAntKkklYw0R03MG2x/bSzTtkxmIRw/sTNV8YXsCJ1tfLAX23lhxhHJlEf3CRCOCGGWw3vI3GaSPw=="],
"ts-api-utils": ["ts-api-utils@2.5.0", "", { "peerDependencies": { "typescript": ">=4.8.4" } }, "sha512-OJ/ibxhPlqrMM0UiNHJ/0CKQkoKF243/AEmplt3qpRgkW8VG7IfOS41h7V8TjITqdByHzrjcS/2si+y4lIh8NA=="],
"tsconfig-paths": ["tsconfig-paths@3.15.0", "", { "dependencies": { "@types/json5": "^0.0.29", "json5": "^1.0.2", "minimist": "^1.2.6", "strip-bom": "^3.0.0" } }, "sha512-2Ac2RgzDe/cn48GvOe3M+o82pEFewD3UPbyoUHHdKasHwJKjds4fLXWf/Ux5kATBKN20oaFGu+jbElp1pos0mg=="],
@@ -794,6 +1199,8 @@
"type-check": ["type-check@0.4.0", "", { "dependencies": { "prelude-ls": "^1.2.1" } }, "sha512-XleUoc9uwGXqjWwXaUTZAmzMcFZ5858QA2vvx1Ur5xIcixXIP+8LnFDgRplU30us6teqdlskFfu+ae4K79Ooew=="],
"type-fest": ["type-fest@5.5.0", "", { "dependencies": { "tagged-tag": "^1.0.0" } }, "sha512-PlBfpQwiUvGViBNX84Yxwjsdhd1TUlXr6zjX7eoirtCPIr08NAmxwa+fcYBTeRQxHo9YC9wwF3m9i700sHma8g=="],
"typed-array-buffer": ["typed-array-buffer@1.0.3", "", { "dependencies": { "call-bound": "^1.0.3", "es-errors": "^1.3.0", "is-typed-array": "^1.1.14" } }, "sha512-nAYYwfY3qnzX30IkA6AQZjVbtK6duGontcQm1WSG1MD94YLqK0515GNApXkoxKOWMusVssAHWLh9SeaoefYFGw=="],
"typed-array-byte-length": ["typed-array-byte-length@1.0.3", "", { "dependencies": { "call-bind": "^1.0.8", "for-each": "^0.3.3", "gopd": "^1.2.0", "has-proto": "^1.2.0", "is-typed-array": "^1.1.14" } }, "sha512-BaXgOuIxz8n8pIq3e7Atg/7s+DpiYrxn4vdot3w9KbnBhcRQq6o3xemQdIfynqSeXeDrF32x+WvfzmOjPiY9lg=="],
@@ -808,10 +1215,36 @@
"undici-types": ["undici-types@6.21.0", "", {}, "sha512-iwDZqg0QAGrg9Rav5H4n0M64c3mkR59cJ6wQp+7C4nI0gsmExaedaYLNO44eT4AtBBwjbTiGPMlt2Md0T9H9JQ=="],
"universalify": ["universalify@0.2.0", "", {}, "sha512-CJ1QgKmNg3CwvAv/kOFmtnEN05f0D/cn9QntgNOQlQF9dgvVTHj3t+8JPdjqawCHk7V/KA+fbUqzZ9XWhcqPUg=="],
"unrs-resolver": ["unrs-resolver@1.11.1", "", { "dependencies": { "napi-postinstall": "^0.3.0" }, "optionalDependencies": { "@unrs/resolver-binding-android-arm-eabi": "1.11.1", "@unrs/resolver-binding-android-arm64": "1.11.1", "@unrs/resolver-binding-darwin-arm64": "1.11.1", "@unrs/resolver-binding-darwin-x64": "1.11.1", "@unrs/resolver-binding-freebsd-x64": "1.11.1", "@unrs/resolver-binding-linux-arm-gnueabihf": "1.11.1", "@unrs/resolver-binding-linux-arm-musleabihf": "1.11.1", "@unrs/resolver-binding-linux-arm64-gnu": "1.11.1", "@unrs/resolver-binding-linux-arm64-musl": "1.11.1", "@unrs/resolver-binding-linux-ppc64-gnu": "1.11.1", "@unrs/resolver-binding-linux-riscv64-gnu": "1.11.1", "@unrs/resolver-binding-linux-riscv64-musl": "1.11.1", "@unrs/resolver-binding-linux-s390x-gnu": "1.11.1", "@unrs/resolver-binding-linux-x64-gnu": "1.11.1", "@unrs/resolver-binding-linux-x64-musl": "1.11.1", "@unrs/resolver-binding-wasm32-wasi": "1.11.1", "@unrs/resolver-binding-win32-arm64-msvc": "1.11.1", "@unrs/resolver-binding-win32-ia32-msvc": "1.11.1", "@unrs/resolver-binding-win32-x64-msvc": "1.11.1" } }, "sha512-bSjt9pjaEBnNiGgc9rUiHGKv5l4/TGzDmYw3RhnkJGtLhbnnA/5qJj7x3dNDCRx/PJxu774LlH8lCOlB4hEfKg=="],
"until-async": ["until-async@3.0.2", "", {}, "sha512-IiSk4HlzAMqTUseHHe3VhIGyuFmN90zMTpD3Z3y8jeQbzLIq500MVM7Jq2vUAnTKAFPJrqwkzr6PoTcPhGcOiw=="],
"uri-js": ["uri-js@4.4.1", "", { "dependencies": { "punycode": "^2.1.0" } }, "sha512-7rKUyy33Q1yc98pQ1DAmLtwX109F7TIfWlW1Ydo8Wl1ii1SeHieeh0HHfPeL2fMXK6z0s8ecKs9frCuLJvndBg=="],
"url-parse": ["url-parse@1.5.10", "", { "dependencies": { "querystringify": "^2.1.1", "requires-port": "^1.0.0" } }, "sha512-WypcfiRhfeUP9vvF0j6rw0J3hrWrw6iZv3+22h6iRMJ/8z1Tj6XfLP4DsUix5MhMPnXpiHDoKyoZ/bdCkwBCiQ=="],
"use-sync-external-store": ["use-sync-external-store@1.6.0", "", { "peerDependencies": { "react": "^16.8.0 || ^17.0.0 || ^18.0.0 || ^19.0.0" } }, "sha512-Pp6GSwGP/NrPIrxVFAIkOQeyw8lFenOHijQWkUTrDvrF4ALqylP2C/KCkeS9dpUM3KvYRQhna5vt7IL95+ZQ9w=="],
"victory-vendor": ["victory-vendor@37.3.6", "", { "dependencies": { "@types/d3-array": "^3.0.3", "@types/d3-ease": "^3.0.0", "@types/d3-interpolate": "^3.0.1", "@types/d3-scale": "^4.0.2", "@types/d3-shape": "^3.1.0", "@types/d3-time": "^3.0.0", "@types/d3-timer": "^3.0.0", "d3-array": "^3.1.6", "d3-ease": "^3.0.1", "d3-interpolate": "^3.0.1", "d3-scale": "^4.0.2", "d3-shape": "^3.1.0", "d3-time": "^3.0.0", "d3-timer": "^3.0.1" } }, "sha512-SbPDPdDBYp+5MJHhBCAyI7wKM3d5ivekigc2Dk2s7pgbZ9wIgIBYGVw4zGHBml/qTFbexrofXW6Gu4noGxrOwQ=="],
"vite": ["vite@5.4.21", "", { "dependencies": { "esbuild": "^0.21.3", "postcss": "^8.4.43", "rollup": "^4.20.0" }, "optionalDependencies": { "fsevents": "~2.3.3" }, "peerDependencies": { "@types/node": "^18.0.0 || >=20.0.0", "less": "*", "lightningcss": "^1.21.0", "sass": "*", "sass-embedded": "*", "stylus": "*", "sugarss": "*", "terser": "^5.4.0" }, "optionalPeers": ["@types/node", "less", "lightningcss", "sass", "sass-embedded", "stylus", "sugarss", "terser"], "bin": { "vite": "bin/vite.js" } }, "sha512-o5a9xKjbtuhY6Bi5S3+HvbRERmouabWbyUcpXXUA1u+GNUKoROi9byOJ8M0nHbHYHkYICiMlqxkg1KkYmm25Sw=="],
"vite-node": ["vite-node@2.1.8", "", { "dependencies": { "cac": "^6.7.14", "debug": "^4.3.7", "es-module-lexer": "^1.5.4", "pathe": "^1.1.2", "vite": "^5.0.0" }, "bin": { "vite-node": "vite-node.mjs" } }, "sha512-uPAwSr57kYjAUux+8E2j0q0Fxpn8M9VoyfGiRI8Kfktz9NcYMCenwY5RnZxnF1WTu3TGiYipirIzacLL3VVGFg=="],
"vitest": ["vitest@2.1.8", "", { "dependencies": { "@vitest/expect": "2.1.8", "@vitest/mocker": "2.1.8", "@vitest/pretty-format": "^2.1.8", "@vitest/runner": "2.1.8", "@vitest/snapshot": "2.1.8", "@vitest/spy": "2.1.8", "@vitest/utils": "2.1.8", "chai": "^5.1.2", "debug": "^4.3.7", "expect-type": "^1.1.0", "magic-string": "^0.30.12", "pathe": "^1.1.2", "std-env": "^3.8.0", "tinybench": "^2.9.0", "tinyexec": "^0.3.1", "tinypool": "^1.0.1", "tinyrainbow": "^1.2.0", "vite": "^5.0.0", "vite-node": "2.1.8", "why-is-node-running": "^2.3.0" }, "peerDependencies": { "@edge-runtime/vm": "*", "@types/node": "^18.0.0 || >=20.0.0", "@vitest/browser": "2.1.8", "@vitest/ui": "2.1.8", "happy-dom": "*", "jsdom": "*" }, "optionalPeers": ["@edge-runtime/vm", "@types/node", "@vitest/browser", "@vitest/ui", "happy-dom", "jsdom"], "bin": { "vitest": "vitest.mjs" } }, "sha512-1vBKTZskHw/aosXqQUlVWWlGUxSJR8YtiyZDJAFeW2kPAeX6S3Sool0mjspO+kXLuxVWlEDDowBAeqeAQefqLQ=="],
"w3c-xmlserializer": ["w3c-xmlserializer@5.0.0", "", { "dependencies": { "xml-name-validator": "^5.0.0" } }, "sha512-o8qghlI8NZHU1lLPrpi2+Uq7abh4GGPpYANlalzWxyWteJOCsr/P+oPBA49TOLu5FTZO4d3F9MnWJfiMo4BkmA=="],
"webidl-conversions": ["webidl-conversions@7.0.0", "", {}, "sha512-VwddBukDzu71offAQR975unBIGqfKZpM+8ZX6ySk8nYhVoo5CYaZyzt3YBvYtRtO+aoGlqxPg/B87NGVZ/fu6g=="],
"whatwg-encoding": ["whatwg-encoding@3.1.1", "", { "dependencies": { "iconv-lite": "0.6.3" } }, "sha512-6qN4hJdMwfYBtE3YBTTHhoeuUrDBPZmbQaxWAqSALV/MeEnR5z1xd8UKud2RAkFoPkmB+hli1TZSnyi84xz1vQ=="],
"whatwg-mimetype": ["whatwg-mimetype@4.0.0", "", {}, "sha512-QaKxh0eNIi2mE9p2vEdzfagOKHCcj1pJ56EEHGQOVxp8r9/iszLUUV7v89x9O1p/T+NlTM5W7jW6+cz4Fq1YVg=="],
"whatwg-url": ["whatwg-url@14.2.0", "", { "dependencies": { "tr46": "^5.1.0", "webidl-conversions": "^7.0.0" } }, "sha512-De72GdQZzNTUBBChsXueQUnPKDkg/5A5zp7pFDuQAj5UFoENpiACU0wlCvzpAGnTkj++ihpKwKyYewn/XNUbKw=="],
"which": ["which@2.0.2", "", { "dependencies": { "isexe": "^2.0.0" }, "bin": { "node-which": "./bin/node-which" } }, "sha512-BLI3Tl1TW3Pvl70l3yq3Y64i+awpwXqsGBYWkkqMtnbXgrMD+yj7rhW0kuEDxzJaYXGjEW5ogapKNMEKNMjibA=="],
"which-boxed-primitive": ["which-boxed-primitive@1.1.1", "", { "dependencies": { "is-bigint": "^1.1.0", "is-boolean-object": "^1.2.1", "is-number-object": "^1.1.1", "is-string": "^1.1.1", "is-symbol": "^1.1.1" } }, "sha512-TbX3mj8n0odCBFVlY8AxkqcHASw3L60jIuF8jFP78az3C2YhmGvqbHBpAjTRH2/xqYunrJ9g1jSyjCjpoWzIAA=="],
@@ -822,14 +1255,32 @@
"which-typed-array": ["which-typed-array@1.1.20", "", { "dependencies": { "available-typed-arrays": "^1.0.7", "call-bind": "^1.0.8", "call-bound": "^1.0.4", "for-each": "^0.3.5", "get-proto": "^1.0.1", "gopd": "^1.2.0", "has-tostringtag": "^1.0.2" } }, "sha512-LYfpUkmqwl0h9A2HL09Mms427Q1RZWuOHsukfVcKRq9q95iQxdw0ix1JQrqbcDR9PH1QDwf5Qo8OZb5lksZ8Xg=="],
"why-is-node-running": ["why-is-node-running@2.3.0", "", { "dependencies": { "siginfo": "^2.0.0", "stackback": "0.0.2" }, "bin": { "why-is-node-running": "cli.js" } }, "sha512-hUrmaWBdVDcxvYqnyh09zunKzROWjbZTiNy8dBEjkS7ehEDQibXJ7XvlmtbwuTclUiIyN+CyXQD4Vmko8fNm8w=="],
"word-wrap": ["word-wrap@1.2.5", "", {}, "sha512-BN22B5eaMMI9UMtjrGd5g5eCYPpCPDUy0FJXbYsaT5zYxjFOckS53SQDE3pWkVoWpHXVb3BrYcEN4Twa55B5cA=="],
"wrap-ansi": ["wrap-ansi@6.2.0", "", { "dependencies": { "ansi-styles": "^4.0.0", "string-width": "^4.1.0", "strip-ansi": "^6.0.0" } }, "sha512-r6lPcBGxZXlIcymEu7InxDMhdW0KDxpLgoFLcguasxCaJ/SOIZwINatK9KY/tf+ZrlywOKU0UDj3ATXUBfxJXA=="],
"ws": ["ws@8.20.0", "", { "peerDependencies": { "bufferutil": "^4.0.1", "utf-8-validate": ">=5.0.2" }, "optionalPeers": ["bufferutil", "utf-8-validate"] }, "sha512-sAt8BhgNbzCtgGbt2OxmpuryO63ZoDk/sqaB/znQm94T4fCEsy/yV+7CdC1kJhOU9lboAEU7R3kquuycDoibVA=="],
"xml-name-validator": ["xml-name-validator@5.0.0", "", {}, "sha512-EvGK8EJ3DhaHfbRlETOWAS5pO9MZITeauHKJyb8wyajUfQUenkIg2MvLDTZ4T/TgIcm3HU0TFBgWWboAZ30UHg=="],
"xmlchars": ["xmlchars@2.2.0", "", {}, "sha512-JZnDKK8B0RCDw84FNdDAIpZK+JuJw+s7Lz8nksI7SIuU3UXJJslUthsi+uWBUYOwPFwW7W7PRLRfUKpxjtjFCw=="],
"y18n": ["y18n@5.0.8", "", {}, "sha512-0pfFzegeDWJHJIAmTLRP2DwHjdF5s7jo9tuztdQxAhINCdvS+3nGINqPd00AphqJR/0LhANUS6/+7SCb98YOfA=="],
"yargs": ["yargs@17.7.2", "", { "dependencies": { "cliui": "^8.0.1", "escalade": "^3.1.1", "get-caller-file": "^2.0.5", "require-directory": "^2.1.1", "string-width": "^4.2.3", "y18n": "^5.0.5", "yargs-parser": "^21.1.1" } }, "sha512-7dSzzRQ++CKnNI/krKnYRV7JKKPUXMEh61soaHKg9mrWEhzFWhFnxPxGl+69cD1Ou63C13NUPCnmIcrvqCuM6w=="],
"yargs-parser": ["yargs-parser@21.1.1", "", {}, "sha512-tVpsJW7DdjecAiFpbIB1e3qxIQsE6NoPc5/eTdrbbIC4h0LVsWhnoa3g+m2HclBIujHzsxZ4VJVA+GUuc2/LBw=="],
"yocto-queue": ["yocto-queue@0.1.0", "", {}, "sha512-rVksvsnNCdJ/ohGc6xgPwyN8eheCxsiLM8mxuE/t/mOVqJewPuO1miLpTHQiRgTKCLexL4MeAFVagts7HmNZ2Q=="],
"yoctocolors-cjs": ["yoctocolors-cjs@2.1.3", "", {}, "sha512-U/PBtDf35ff0D8X8D0jfdzHYEPFxAI7jJlxZXwCSez5M3190m+QobIfh+sWDWSHMCWWJN2AWamkegn6vr6YBTw=="],
"@eslint-community/eslint-utils/eslint-visitor-keys": ["eslint-visitor-keys@3.4.3", "", {}, "sha512-wpc+LXeiyiisxPlEkUzU6svyS1frIO3Mgxj1fdy7Pm8Ygzguax2N3Fa/D/ag1WqbOprdI+uY6wMUl8/a2G+iag=="],
"@reduxjs/toolkit/immer": ["immer@11.1.4", "", {}, "sha512-XREFCPo6ksxVzP4E0ekD5aMdf8WMwmdNaz6vuvxgI40UaEiu6q3p8X52aU6GdyvLY3XXX/8R7JOTXStz/nBbRw=="],
"@tailwindcss/oxide-wasm32-wasi/@emnapi/core": ["@emnapi/core@1.9.1", "", { "dependencies": { "@emnapi/wasi-threads": "1.2.0", "tslib": "^2.4.0" }, "bundled": true }, "sha512-mukuNALVsoix/w1BJwFzwXBN/dHeejQtuVzcDsfOEsdpCumXb/E9j8w11h5S54tT1xhifGfbbSm/ICrObRb3KA=="],
"@tailwindcss/oxide-wasm32-wasi/@emnapi/runtime": ["@emnapi/runtime@1.9.1", "", { "dependencies": { "tslib": "^2.4.0" }, "bundled": true }, "sha512-VYi5+ZVLhpgK4hQ0TAjiQiZ6ol0oe4mBx7mVv7IflsiEp0OWoVsp/+f9Vc1hOhE0TtkORVrI1GvzyreqpgWtkA=="],
@@ -842,6 +1293,10 @@
"@tailwindcss/oxide-wasm32-wasi/tslib": ["tslib@2.8.1", "", { "bundled": true }, "sha512-oJFu94HQb+KVduSUQL7wnpmqnfmLsOA/nAh6b6EH0wCEoK0/mPeXU6c3wKDV83MkOuHPRHtSXKKU99IBazS/2w=="],
"@testing-library/dom/aria-query": ["aria-query@5.3.0", "", { "dependencies": { "dequal": "^2.0.3" } }, "sha512-b0P0sZPKtyu8HkeRAfCq0IfURZK+SuwMjY1UXGBU27wpAiTwQAIlq56IbIO+ytk/JjS1fMR14ee5WBBfKi5J6A=="],
"@testing-library/dom/dom-accessibility-api": ["dom-accessibility-api@0.5.16", "", {}, "sha512-X7BJ2yElsnOJ30pZF4uIIDfBEVgF4XEBxL9Bxhy6dnrm5hkzqmsWHGTiHqRiITNhMyFLyAiWndIJP7Z1NTteDg=="],
"@typescript-eslint/eslint-plugin/ignore": ["ignore@7.0.5", "", {}, "sha512-Hs59xBNfUIunMFgWAbGX5cq6893IbWg4KnrjbYwX3tx0ztorVgTDA6B2sxf8ejHJ4wz8BqGUMYlnzNBer5NvGg=="],
"@typescript-eslint/typescript-estree/minimatch": ["minimatch@10.2.4", "", { "dependencies": { "brace-expansion": "^5.0.2" } }, "sha512-oRjTw/97aTBN0RHbYCdtF1MQfvusSIBQM0IZEgzl6426+8jSC0nF1a/GmnVLpfB9yyr6g6FTqWqiZVbxrtaCIg=="],
@@ -850,6 +1305,14 @@
"@typescript-eslint/visitor-keys/eslint-visitor-keys": ["eslint-visitor-keys@5.0.1", "", {}, "sha512-tD40eHxA35h0PEIZNeIjkHoDR4YjjJp34biM0mDvplBe//mB+IHCqHDGV7pxF+7MklTvighcCPPZC7ynWyjdTA=="],
"@vitest/snapshot/@vitest/pretty-format": ["@vitest/pretty-format@2.1.8", "", { "dependencies": { "tinyrainbow": "^1.2.0" } }, "sha512-9HiSZ9zpqNLKlbIDRWOnAWqgcA7xu+8YxXSekhr0Ykab7PAYFkhkwoqVArPOtJhPmYeE2YHgKZlj3CP36z2AJQ=="],
"@vitest/utils/@vitest/pretty-format": ["@vitest/pretty-format@2.1.8", "", { "dependencies": { "tinyrainbow": "^1.2.0" } }, "sha512-9HiSZ9zpqNLKlbIDRWOnAWqgcA7xu+8YxXSekhr0Ykab7PAYFkhkwoqVArPOtJhPmYeE2YHgKZlj3CP36z2AJQ=="],
"cliui/wrap-ansi": ["wrap-ansi@7.0.0", "", { "dependencies": { "ansi-styles": "^4.0.0", "string-width": "^4.1.0", "strip-ansi": "^6.0.0" } }, "sha512-YVGIj2kamLSTxw6NsZjoBxfSwsn0ycdesmc4p+Q21c5zPuZ1pl+NfxVdxPtdHvmNVOQ6XSYG4AUtyt/Fi7D16Q=="],
"cssstyle/rrweb-cssom": ["rrweb-cssom@0.8.0", "", {}, "sha512-guoltQEx+9aMf2gDZ0s62EcV8lsXR+0w8915TC3ITdn2YueuNjdAYh/levpU9nFaoChh9RUS5ZdQMrKfVEN9tw=="],
"eslint-import-resolver-node/debug": ["debug@3.2.7", "", { "dependencies": { "ms": "^2.1.1" } }, "sha512-CFjzYYAi4ThfiQvizrFQevTTXHtnCqWfe7x1AhgEscTz6ZbLbfoLRLPugTQyBth6f8ZERVUSyWHFD/7Wu4t1XQ=="],
"eslint-module-utils/debug": ["debug@3.2.7", "", { "dependencies": { "ms": "^2.1.1" } }, "sha512-CFjzYYAi4ThfiQvizrFQevTTXHtnCqWfe7x1AhgEscTz6ZbLbfoLRLPugTQyBth6f8ZERVUSyWHFD/7Wu4t1XQ=="],
@@ -864,10 +1327,18 @@
"micromatch/picomatch": ["picomatch@2.3.2", "", {}, "sha512-V7+vQEJ06Z+c5tSye8S+nHUfI51xoXIXjHQ99cQtKUkQqqO1kO/KCJUfZXuB47h/YBlDhah2H3hdUGXn8ie0oA=="],
"msw/tough-cookie": ["tough-cookie@6.0.1", "", { "dependencies": { "tldts": "^7.0.5" } }, "sha512-LktZQb3IeoUWB9lqR5EWTHgW/VTITCXg4D21M+lvybRVdylLrRMnqaIONLVb5mav8vM19m44HIcGq4qASeu2Qw=="],
"next/postcss": ["postcss@8.4.31", "", { "dependencies": { "nanoid": "^3.3.6", "picocolors": "^1.0.0", "source-map-js": "^1.0.2" } }, "sha512-PS08Iboia9mts/2ygV3eLpY5ghnUcfLV/EXTOW1E2qYxJKGGBUtNjN76FYHnMs36RmARn41bC0AZmn+rR0OVpQ=="],
"pretty-format/ansi-styles": ["ansi-styles@5.2.0", "", {}, "sha512-Cxwpt2SfTzTtXcfOlzGEee8O+c+MmUgGrNiBcXnuWxuFJHe6a5Hz7qwhwe5OgaSYI0IJvkLqWX1ASG+cJOkEiA=="],
"prop-types/react-is": ["react-is@16.13.1", "", {}, "sha512-24e6ynE2H+OKt4kqsOvNd8kBpV65zoxbA4BVsEOB3ARVWQki/DHzaUoC5KuON/BiccDaCCTZBuOcfZs70kR8bQ=="],
"sharp/semver": ["semver@7.7.4", "", { "bin": { "semver": "bin/semver.js" } }, "sha512-vFKC2IEtQnVhpT78h1Yp8wzwrf8CM+MzKMHGJZfBtzhZNycRFnXsHk6E5TxIkkMsgNS7mdX3AGB7x2QM2di4lA=="],
"string-width/emoji-regex": ["emoji-regex@8.0.0", "", {}, "sha512-MSjYzcWNOA0ewAHpz0MxpYFvwg6yjy1NG3xteoqz644VCo/RPgnr1/GGt+ic3iJTzQ8Eu3TdM14SawnVUmGE6A=="],
"@typescript-eslint/typescript-estree/minimatch/brace-expansion": ["brace-expansion@5.0.4", "", { "dependencies": { "balanced-match": "^4.0.2" } }, "sha512-h+DEnpVvxmfVefa4jFbCf5HdH5YMDXRsmKflpf1pILZWRFlTbJpxeU55nJl4Smt5HQaGzg1o6RHFPJaOqnmBDg=="],
"@typescript-eslint/typescript-estree/minimatch/brace-expansion/balanced-match": ["balanced-match@4.0.4", "", {}, "sha512-BLrgEcRTwX2o6gGxGOCNyMvGSp35YofuYzw9h1IMTRmKqttAZZVU67bdb9Pr2vUHA8+j3i2tJfjO6C6+4myGTA=="],

View File

@@ -5,7 +5,7 @@ const nextConfig: NextConfig = {
   rewrites: async () => [
     {
       source: "/api/:path*",
-      destination: `${process.env.API_URL || "http://localhost:8080"}/:path*`,
+      destination: `${process.env.API_URL || "http://localhost:8080"}/api/:path*`,
     },
   ],
 };
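
The one-line fix above re-adds the `/api` prefix that the rewrite's `source` pattern consumes before `:path*` is expanded. A minimal sketch of the behavioral difference, assuming the fallback `API_URL` — `rewriteBefore`, `rewriteAfter`, and `apiUrl` are illustrative names, not code from the repo:

```typescript
// apiUrl mirrors the config's fallback value.
const apiUrl = "http://localhost:8080";

// Old destination `${apiUrl}/:path*`: the /api prefix is consumed by the
// source matcher and never reaches the backend.
function rewriteBefore(path: string): string {
  return `${apiUrl}/${path.replace(/^\/api\//, "")}`;
}

// Fixed destination `${apiUrl}/api/:path*`: the prefix is re-added, so the
// backend sees the same /api/... route the browser requested.
function rewriteAfter(path: string): string {
  return `${apiUrl}/api/${path.replace(/^\/api\//, "")}`;
}

console.log(rewriteBefore("/api/fees/calculate")); // http://localhost:8080/fees/calculate
console.log(rewriteAfter("/api/fees/calculate"));  // http://localhost:8080/api/fees/calculate
```

This matters for the new public fee endpoints (`POST /api/fees/calculate`), which the Go backend registers under `/api/...`: without the fix, the proxy would forward them as `/fees/calculate` and 404.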

View File

@@ -6,7 +6,9 @@
"dev": "next dev --turbopack",
"build": "next build --turbopack",
"start": "next start",
"lint": "eslint"
"lint": "eslint",
"test": "vitest run",
"test:watch": "vitest"
},
"dependencies": {
"@supabase/ssr": "^0.9.0",
@@ -18,17 +20,24 @@
"react": "19.1.0",
"react-dom": "19.1.0",
"react-dropzone": "^15.0.0",
"recharts": "^3.8.1",
"sonner": "^2.0.7"
},
"devDependencies": {
"typescript": "^5",
"@eslint/eslintrc": "^3",
"@tailwindcss/postcss": "^4",
"@testing-library/jest-dom": "^6.9.1",
"@testing-library/react": "^16.3.2",
"@testing-library/user-event": "^14.6.1",
"@types/node": "^20",
"@types/react": "^19",
"@types/react-dom": "^19",
"@tailwindcss/postcss": "^4",
"tailwindcss": "^4",
"eslint": "^9",
"eslint-config-next": "15.5.14",
"@eslint/eslintrc": "^3"
"jsdom": "24.1.3",
"msw": "^2.12.14",
"tailwindcss": "^4",
"typescript": "^5",
"vitest": "2.1.8"
}
}

View File

@@ -0,0 +1,47 @@
import { describe, it, expect } from "vitest";
import { render, screen } from "@testing-library/react";
import { CaseOverviewGrid } from "@/components/dashboard/CaseOverviewGrid";
import type { CaseSummary } from "@/lib/types";
describe("CaseOverviewGrid", () => {
const defaultData: CaseSummary = {
active_count: 15,
new_this_month: 4,
closed_count: 8,
};
it("renders all three case categories", () => {
render(<CaseOverviewGrid data={defaultData} />);
expect(screen.getByText("Aktive Akten")).toBeInTheDocument();
expect(screen.getByText("Neu (Monat)")).toBeInTheDocument();
expect(screen.getByText("Abgeschlossen")).toBeInTheDocument();
});
it("displays correct counts", () => {
render(<CaseOverviewGrid data={defaultData} />);
expect(screen.getByText("15")).toBeInTheDocument();
expect(screen.getByText("4")).toBeInTheDocument();
expect(screen.getByText("8")).toBeInTheDocument();
});
it("renders the section header", () => {
render(<CaseOverviewGrid data={defaultData} />);
expect(screen.getByText("Aktenübersicht")).toBeInTheDocument();
});
it("handles zero counts", () => {
const zeroData: CaseSummary = {
active_count: 0,
new_this_month: 0,
closed_count: 0,
};
render(<CaseOverviewGrid data={zeroData} />);
const zeros = screen.getAllByText("0");
expect(zeros).toHaveLength(3);
});
});

View File

@@ -0,0 +1,67 @@
import { describe, it, expect, vi } from "vitest";
import { render, screen, fireEvent } from "@testing-library/react";
import { DeadlineTrafficLights } from "@/components/dashboard/DeadlineTrafficLights";
import type { DeadlineSummary } from "@/lib/types";
describe("DeadlineTrafficLights", () => {
const defaultData: DeadlineSummary = {
overdue_count: 3,
due_this_week: 5,
due_next_week: 2,
ok_count: 10,
};
it("renders all three traffic light cards", () => {
render(<DeadlineTrafficLights data={defaultData} />);
expect(screen.getByText("Überfällig")).toBeInTheDocument();
expect(screen.getByText("Diese Woche")).toBeInTheDocument();
expect(screen.getByText("Im Zeitplan")).toBeInTheDocument();
});
it("displays correct counts", () => {
render(<DeadlineTrafficLights data={defaultData} />);
// Overdue: 3
expect(screen.getByText("3")).toBeInTheDocument();
// This week: 5
expect(screen.getByText("5")).toBeInTheDocument();
// OK: ok_count + due_next_week = 10 + 2 = 12
expect(screen.getByText("12")).toBeInTheDocument();
});
it("displays zero counts correctly", () => {
const zeroData: DeadlineSummary = {
overdue_count: 0,
due_this_week: 0,
due_next_week: 0,
ok_count: 0,
};
render(<DeadlineTrafficLights data={zeroData} />);
const zeros = screen.getAllByText("0");
expect(zeros).toHaveLength(3);
});
it("calls onFilter with correct key when clicked", () => {
const onFilter = vi.fn();
render(<DeadlineTrafficLights data={defaultData} onFilter={onFilter} />);
fireEvent.click(screen.getByText("Überfällig"));
expect(onFilter).toHaveBeenCalledWith("overdue");
fireEvent.click(screen.getByText("Diese Woche"));
expect(onFilter).toHaveBeenCalledWith("this_week");
fireEvent.click(screen.getByText("Im Zeitplan"));
expect(onFilter).toHaveBeenCalledWith("ok");
});
it("renders without onFilter prop (no crash)", () => {
expect(() => {
render(<DeadlineTrafficLights data={defaultData} />);
fireEvent.click(screen.getByText("Überfällig"));
}).not.toThrow();
});
});

Some files were not shown because too many files have changed in this diff.