Compare commits

...

222 Commits

Author SHA1 Message Date
Admin
50a13447a4 docs: add homelab secrets setup instructions
Some checks failed
Release / Test backend (push) Successful in 1m2s
Release / Check ui (push) Successful in 1m0s
Release / Docker (push) Successful in 4m39s
Release / Deploy to prod (push) Successful in 2m18s
Release / Deploy to homelab (push) Failing after 4s
Release / Gitea Release (push) Successful in 29s
2026-04-16 19:08:25 +05:00
Admin
ce34d2c75f feat: add homelab runner deployment step to release workflow
- Add deploy-homelab job to sync homelab/runner/docker-compose.yml
- Rename deploy → deploy-prod for clarity
- Both deployments run in parallel after Docker images are pushed
- Homelab runner pulls only the runner image and restarts

Required secrets (to be added in Gitea):
- HOMELAB_HOST (192.168.0.109)
- HOMELAB_USER (root)
- HOMELAB_SSH_KEY (same as PROD_SSH_KEY or separate)
- HOMELAB_SSH_KNOWN_HOSTS (ssh-keyscan -H 192.168.0.109)
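A hedged sketch of how such a deploy-homelab job might look in the release workflow — the job layout, remote paths, and commands are illustrative assumptions; only the secret names and the synced file come from the commit:

```yaml
# Illustrative only — not the repo's actual release.yaml.
deploy-homelab:
  needs: docker            # runs after images are pushed, parallel to deploy-prod
  runs-on: ubuntu-latest
  steps:
    - uses: actions/checkout@v4
    - name: Sync runner compose file and restart
      run: |
        mkdir -p ~/.ssh
        echo "${{ secrets.HOMELAB_SSH_KEY }}" > ~/.ssh/id_ed25519
        chmod 600 ~/.ssh/id_ed25519
        echo "${{ secrets.HOMELAB_SSH_KNOWN_HOSTS }}" >> ~/.ssh/known_hosts
        scp homelab/runner/docker-compose.yml \
          "${{ secrets.HOMELAB_USER }}@${{ secrets.HOMELAB_HOST }}:/opt/runner/docker-compose.yml"
        ssh "${{ secrets.HOMELAB_USER }}@${{ secrets.HOMELAB_HOST }}" \
          'cd /opt/runner && docker compose pull runner && docker compose up -d runner'
```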
2026-04-16 19:07:59 +05:00
Admin
d394ac454b Remove duplicate action buttons on discover page
All checks were successful
Release / Test backend (push) Successful in 57s
Release / Check ui (push) Successful in 59s
Release / Docker (push) Successful in 4m26s
Release / Gitea Release (push) Successful in 26s
Release / Deploy to prod (push) Successful in 2m18s
- Remove redundant Skip/Read Now/Like buttons from desktop right panel
- Main action buttons in center area remain visible on both mobile and desktop
- Keyboard shortcuts hint still available at bottom
- Cleaner UI with no repeated functionality
2026-04-16 14:36:40 +05:00
Admin
f24720b087 Enhance UI/UX for book info and discover pages
All checks were successful
Release / Test backend (push) Successful in 57s
Release / Check ui (push) Successful in 54s
Release / Docker (push) Successful in 4m24s
Release / Deploy to prod (push) Successful in 2m14s
Release / Gitea Release (push) Successful in 27s
Book Info Page Improvements:
- Add quick stats row (chapters, rating, readers)
- Enhance author display with better typography
- Improve genre tags with amber-branded pills
- Add green badge for 'ongoing' status
- Wrap summary in card with better styling
- Enhance 'More' button with proper design
- Improve StarRating component to show rating more prominently

Discover Page Improvements:
- Add progress bar showing completion through deck
- Enhance keyboard shortcuts with visual kbd elements
- Improve empty state with better visuals and CTA hierarchy
- Enhance toast notifications with icons and color-coded backgrounds
- Add auto-dismiss for toast (4s timeout)
- Improve preferences modal button labels
- Add undo functionality for last vote
- Better stat display (X of Y remaining)

All changes maintain consistency with LibNovel's design system.
2026-04-16 14:19:54 +05:00
Admin
71a628673d feat(ui): homepage UX polish — headings, placeholders, genre highlight, view-all
All checks were successful
Release / Test backend (push) Successful in 55s
Release / Check ui (push) Successful in 1m3s
Release / Docker (push) Successful in 4m26s
Release / Deploy to prod (push) Successful in 2m38s
Release / Gitea Release (push) Successful in 27s
- Bump all section headings to text-lg for visual hierarchy
- Replace SVG book icon no-cover placeholders with first-letter avatars
  across Completed, Ready to Listen, Trending, Recommendations, Recently
  Updated, and From Following shelves
- Highlight user's top genre pill in Browse by Genre strip
- Add "View all → /catalogue?genre=…" link to Because You Read section

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-16 13:00:50 +05:00
Admin
5f5aac5e3e fix(admin): UX and bug fixes across admin pages
All checks were successful
Release / Test backend (push) Successful in 54s
Release / Check ui (push) Successful in 1m1s
Release / Docker (push) Successful in 5m15s
Release / Deploy to prod (push) Successful in 2m0s
Release / Gitea Release (push) Successful in 26s
- Import: fix "1/1/1" date display (Go zero time.Time → show dash)
- AI Jobs: guard fmtDate against zero-time dates
- AI Jobs: add "Cancel all in-flight (N)" bulk action button
- AI Jobs sidebar: show live running+pending count badge from layout
- Notifications: add broadcast panel linking to push.libnovel.cc, relabel inbox section

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-16 12:48:33 +05:00
Admin
e65883cc9e feat(catalogue): UX improvements and bug fixes
All checks were successful
Release / Test backend (push) Successful in 56s
Release / Check ui (push) Successful in 58s
Release / Docker (push) Successful in 8m2s
Release / Deploy to prod (push) Successful in 1m52s
Release / Gitea Release (push) Successful in 27s
- Persist audio filter in URL (?audio=1) via history.replaceState
- Show total novel count in browse mode subtitle
- Close filter panel automatically on Apply
- Replace missing-cover SVG placeholder with styled first-letter avatar
- Add forbidden scrape badge to list view (was missing, grid had it)
- Carry audio param through applyFilters() navigation

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-16 12:34:55 +05:00
Admin
b19af1e8f3 fix: simplify Docker build workflow, remove PREBUILT artifact workaround
All checks were successful
Release / Test backend (push) Successful in 1m6s
Release / Check ui (push) Successful in 1m4s
Release / Docker (push) Successful in 8m43s
Release / Deploy to prod (push) Successful in 2m37s
Release / Gitea Release (push) Successful in 40s
- Remove UI build artifact upload/download steps (18 lines removed)
- Remove .dockerignore manipulation workaround
- Always build UI from source inside Docker (more reliable)
- Remove PREBUILT arg from ui/Dockerfile and docker-bake.hcl

This fixes the '/app/build not found' error in CI by eliminating the fragile
artifact-passing mechanism. UI now builds fresh in Docker using build cache,
same as local development.
2026-04-15 21:35:13 +05:00
Admin
2864c4a6c0 chore: clean up release workflow and document Doppler usage
Some checks failed
Release / Test backend (push) Successful in 54s
Release / Check ui (push) Successful in 2m0s
Release / Docker (push) Failing after 2m20s
Release / Deploy to prod (push) Has been skipped
Release / Gitea Release (push) Has been skipped
2026-04-15 20:14:45 +05:00
Admin
6d0dac256d fix: simplify bake file to avoid locals/function blocks (buildx compat)
Some checks failed
Release / Test backend (push) Successful in 59s
Release / Check ui (push) Successful in 1m56s
Release / Docker (push) Failing after 2m21s
Release / Deploy to prod (push) Has been skipped
Release / Gitea Release (push) Has been skipped
The Gitea runner's docker buildx doesn't support HCL locals{} or function{}
blocks (added in buildx 0.12+). Replace with plain variables: VERSION and
MAJOR_MINOR are pre-computed in a CI step and passed as env vars to bake.
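The plain-variable approach could look roughly like this in docker-bake.hcl — image names and defaults are illustrative; only VERSION and MAJOR_MINOR come from the commit:

```hcl
# Illustrative sketch. No locals{} or function{} blocks, so older buildx
# versions on the Gitea runner can parse the file. VERSION and MAJOR_MINOR
# are computed in a CI step and passed to bake as environment variables.
variable "VERSION"     { default = "dev" }
variable "MAJOR_MINOR" { default = "dev" }

target "backend" {
  dockerfile = "Dockerfile"
  tags = [
    "kalekber/libnovel-backend:${VERSION}",
    "kalekber/libnovel-backend:${MAJOR_MINOR}",
    "kalekber/libnovel-backend:latest",
  ]
}
```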

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-15 19:43:54 +05:00
Admin
8922111471 fix: pin meilisearch to v1.40.0
Some checks failed
Release / Test backend (push) Successful in 56s
Release / Check ui (push) Successful in 2m10s
Release / Docker (push) Failing after 1m36s
Release / Deploy to prod (push) Has been skipped
Release / Gitea Release (push) Has been skipped
v1.42.1 (pulled by latest) is incompatible with existing data (db version
1.40.0). Pinning to v1.40.0 until a deliberate migration is planned.
Upgrade path: https://www.meilisearch.com/docs/learn/update_and_migration/updating

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-15 19:24:57 +05:00
Admin
74e7c8e8d1 chore: move tag logic into bake file, drop all metadata-action steps
Some checks failed
Release / Test backend (push) Successful in 59s
Release / Check ui (push) Successful in 1m55s
Release / Docker (push) Failing after 1m27s
Release / Deploy to prod (push) Has been skipped
Release / Gitea Release (push) Has been skipped
docker-bake.hcl now owns semver tag generation via HCL locals + a reusable
img_tags() function: GIT_TAG="v4.1.5" → :4.1.5, :4.1, :latest on all images.
The Docker job shrinks from 13 steps to 6 — no docker/metadata-action needed.

CI simply passes GIT_TAG, COMMIT, BUILD_TIME as env vars to bake-action.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-15 19:22:06 +05:00
Admin
2f74b2b229 fix: only pull app images during deploy, not infra images
Some checks failed
Release / Test backend (push) Successful in 1m8s
Release / Check ui (push) Successful in 1m59s
Release / Docker (push) Failing after 1m37s
Release / Deploy to prod (push) Has been skipped
Release / Gitea Release (push) Has been skipped
docker compose pull without arguments pulls ALL services including
getmeili/meilisearch:latest — a recent Meilisearch version bump broke
compatibility with existing data on the server, causing meilisearch to
crash immediately on restart.

CI deploy now only pulls the 5 custom app images (backend, runner, ui,
caddy, pocketbase). Infrastructure images (meilisearch, valkey, minio,
redis, crowdsec) are updated deliberately via `just pull-infra`.

Also add pocketbase to `just pull-images` now that it's a custom image.
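The app/infra split described above might be expressed in the justfile along these lines — the recipe bodies are assumptions; only the recipe names and the service split come from the commit:

```
# Hypothetical justfile recipes, not the repo's actual file.
pull-images:
    docker compose pull backend runner ui caddy pocketbase

pull-infra:
    # run deliberately: infra bumps (e.g. meilisearch) may need a data migration
    docker compose pull meilisearch valkey minio redis crowdsec
```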

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-15 19:19:31 +05:00
Admin
cb9598a786 chore: migrate Docker builds to docker buildx bake
Some checks failed
Release / Test backend (push) Successful in 54s
Release / Check ui (push) Successful in 2m10s
Release / Docker (push) Failing after 3m43s
Release / Deploy to prod (push) Has been skipped
Release / Gitea Release (push) Has been skipped
Replace 5 sequential build-push-action steps with a single docker/bake-action
call backed by docker-bake.hcl. BuildKit now builds all images in parallel:
backend/runner/pocketbase share one Go builder stage (compiled once), while
caddy and ui build concurrently alongside the Go targets.

Workflow YAML goes from ~130 lines of build steps to ~35. Adding a new image
now only requires a new target block in docker-bake.hcl.
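A minimal sketch of the bake-file structure this migration implies — the stage names and Dockerfile layout are assumptions beyond what the commit states:

```hcl
# Illustrative docker-bake.hcl skeleton.
group "default" {
  targets = ["backend", "runner", "pocketbase", "caddy", "ui"]
}

# backend, runner, and pocketbase point at stages of one multi-stage
# Dockerfile, so the shared Go builder stage compiles once and BuildKit
# fans the result out while caddy and ui build concurrently.
target "backend" {
  dockerfile = "Dockerfile"
  target     = "backend"
}

target "runner" {
  dockerfile = "Dockerfile"
  target     = "runner"
}
```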

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-15 19:09:27 +05:00
Admin
fc73756308 fix: run pocketbase as root + add healthcheck start_period
Some checks failed
Release / Test backend (push) Successful in 56s
Release / Check ui (push) Successful in 1m59s
Release / Docker (push) Successful in 7m49s
Release / Deploy to prod (push) Failing after 33s
Release / Gitea Release (push) Successful in 29s
- Remove non-root user from pocketbase Docker image; the existing pb_data
  volume was created by the previous root-running image so files are owned
  by root — running as a non-root appuser caused an immediate permission
  error and container exit
- Increase healthcheck retries to 10 and add start_period=30s so migrations
  have time to run on first boot before liveness checks begin
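In docker-compose.yml the healthcheck change might look like this — the interval value is an assumption; retries, start_period, and the wget endpoint come from this commit and the one below:

```yaml
# Illustrative compose fragment.
pocketbase:
  healthcheck:
    test: ["CMD", "wget", "-q", "-O-", "http://localhost:8090/api/health"]
    interval: 10s        # assumed value
    retries: 10
    start_period: 30s    # failures in the first 30s (migrations) don't count
```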

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-15 19:00:21 +05:00
Admin
3f436877ee fix: pocketbase healthcheck (add wget) + saveIfAbsent infinite recursion
Some checks failed
Release / Test backend (push) Successful in 1m6s
Release / Check ui (push) Successful in 1m56s
Release / Docker (push) Successful in 8m19s
Release / Deploy to prod (push) Failing after 36s
Release / Gitea Release (push) Successful in 22s
- Add wget to pocketbase Alpine image so the docker-compose healthcheck
  (wget http://localhost:8090/api/health) can actually run
- Fix saveIfAbsent calling itself instead of app.Save(c) — was an
  infinite recursion that would stack-overflow on a fresh install

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-15 18:05:02 +05:00
Admin
812028e50d ci: add pocketbase image build + automated prod deploy step
Some checks failed
Release / Test backend (push) Successful in 5m39s
Release / Check ui (push) Successful in 2m1s
Release / Docker (push) Successful in 7m46s
Release / Deploy to prod (push) Failing after 1m53s
Release / Gitea Release (push) Successful in 35s
release.yaml:
  - Build and push kalekber/libnovel-pocketbase image on every release tag
  - Add deploy job (runs after docker): copies docker-compose.yml from the
    tagged commit to /opt/libnovel on prod, pulls new images, restarts
    changed services with --remove-orphans (cleans up removed pb-init)

ci.yaml:
  - Validate cmd/pocketbase builds on every branch push

Required new Gitea secrets: PROD_HOST, PROD_USER, PROD_SSH_KEY,
PROD_SSH_KNOWN_HOSTS (see deploy job comments for instructions).

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-15 11:06:27 +05:00
Admin
38cf1c82a1 fix: make migration 1 idempotent for existing installs
Each collection creator now skips creation if the collection already exists,
so deploying to an existing prod install no longer requires running
`migrate history-sync` manually. Migration 2 (missing fields) still applies
as normal since those fields genuinely don't exist yet.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-14 19:38:26 +05:00
Admin
fd0f2afe16 feat: replace pb-init-v3.sh with PocketBase Go migrations
Adds a custom PocketBase binary (cmd/pocketbase) that embeds PocketBase as a
Go framework with version-controlled migrations, replacing the fragile 419-line
shell script. Migrations apply automatically on every `serve` startup.

Migration 1 (20260414000001): full baseline schema — all 21 collections, both
  indexes on chapters_idx, and initial superuser creation from env vars.
Migration 2 (20260414000002): adds three fields found in code but missing from
  the old script — books.rating, app_users.notify_new_chapters_push,
  book_comments.chapter.

docker-compose: pocketbase service now uses the custom kalekber/libnovel-pocketbase
image; pb-init one-shot container removed (migrations replace it entirely).

Existing installs: run `migrate history-sync` once to mark migration 1 as done,
then restart — migration 2 will apply the three previously missing fields.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-14 19:30:42 +05:00
Admin
0f9977744a feat: enforce bearer token auth on all /api/admin/* endpoints
All checks were successful
Release / Test backend (push) Successful in 1m1s
Release / Check ui (push) Successful in 2m12s
Release / Docker (push) Successful in 6m30s
Release / Gitea Release (push) Successful in 29s
Adds BACKEND_ADMIN_TOKEN env var (set in Doppler) as a required Bearer
token for every admin route. Also fixes PocketBase filter injection in
notification queries and wires BACKEND_ADMIN_TOKEN through docker-compose
to both backend and ui services. Includes CLAUDE.md for AI assistant guidance.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-14 18:04:10 +05:00
root
9f1c82fe05 chore: add site_config collection to pb-init-v3.sh
2026-04-14 10:09:59 +05:00
root
419bb7e366 feat: seasonal decoration overlay + logo animation, admin site-theme config page 2026-04-13 21:42:12 +05:00
root
734ba68eed perf: cache translated chapter responses for 1 hour
Translation content is immutable once generated — add Cache-Control: public,
max-age=3600, stale-while-revalidate=86400 to handleTranslationRead so the
browser and any CDN reuse the response without hitting MinIO on repeat views.
Forward the header through the SvelteKit /api/translation/[slug]/[n] proxy
which previously stripped it by constructing a bare Content-Type-only Response.
2026-04-13 21:34:02 +05:00
root
708f8bcd6f fix: ai-jobs page empty list + missing Review button + no results in modal
Three bugs:

1. +page.server.ts returned an unawaited Promise — SvelteKit awaits it on
   the server anyway so data.jobs arrived as a plain AIJob[] on the client,
   not a Promise. The $effect calling .then() on an array silently failed,
   leaving jobs=[] and the table empty. Fixed by awaiting in the load fn.

2. +page.svelte $effect updated to assign data.jobs directly (plain array)
   instead of calling .then() on it.

3. handlers_textgen.go: final UpdateAIJob (payload+status write) used
   r.Context() which is cancelled when the SSE client disconnects. If the
   browser navigated away mid-job, results were silently dropped and the
   payload stayed as the initial {pattern} stub with no results array.
   Fixed by using context.Background() for the final write, matching the
   pattern already used in handlers_image.go.
2026-04-13 21:25:52 +05:00
root
7009b24568 fix: repair notification system (broken type assertion + wrong PocketBase filter quotes)
All checks were successful
Release / Test backend (push) Successful in 55s
Release / Check ui (push) Successful in 2m0s
Release / Docker (push) Successful in 7m19s
Release / Gitea Release (push) Successful in 26s
- Add bookstore.NotificationStore interface and wire it directly to *storage.Store
  in Dependencies, bypassing the Asynq wrapper that caused Producer.(*storage.Store)
  to fail with a 500 on every notification endpoint when Redis is configured
- Replace s.deps.Producer.(*storage.Store) type assertion in all 5 notification
  handlers with s.deps.NotificationStore (nil-safe, always works)
- Fix PocketBase filter single-quote bug in ListNotifications, ClearAllNotifications,
  and MarkAllNotificationsRead (PocketBase requires double quotes for string values)
2026-04-13 21:18:49 +05:00
root
5b90667b4b fix: replace {#await} IIFE trick with $effect for streamed data on discover page
All checks were successful
Release / Test backend (push) Successful in 49s
Release / Check ui (push) Successful in 2m0s
Release / Docker (push) Successful in 7m4s
Release / Gitea Release (push) Successful in 28s
The {#await ... then} + {@const} IIFE pattern for assigning to $state
variables stopped working reliably in Svelte 5.53+. Replaced with a
proper $effect that awaits both streamed promises and assigns to state,
which correctly triggers reactivity.
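A sketch of the replacement pattern in Svelte 5 runes — prop and state names are illustrative, not the discover page's actual code:

```svelte
<script>
  let { data } = $props();   // data.books is a streamed Promise from load()
  let books = $state([]);

  $effect(() => {
    const p = data.books;    // read the prop so the effect re-runs on navigation
    (async () => {
      books = await p;       // assigning to $state triggers reactivity
    })();
  });
</script>
```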

Also: switch library page selection mode entry from long-press to a
'Select' button in the page header.
2026-04-13 21:14:14 +05:00
root
dec11f0c01 fix: hero carousel — horizontal book spine stack instead of vertical overlap
All checks were successful
Release / Test backend (push) Successful in 49s
Release / Check ui (push) Successful in 1m55s
Release / Docker (push) Successful in 7m23s
Release / Gitea Release (push) Successful in 53s
2026-04-13 21:08:51 +05:00
root
0f1ded2269 feat: stacked card effect on home hero carousel (desktop sm+)
All checks were successful
Release / Test backend (push) Successful in 47s
Release / Check ui (push) Successful in 1m54s
Release / Docker (push) Successful in 7m15s
Release / Gitea Release (push) Successful in 27s
2026-04-13 19:56:51 +05:00
root
2473a0213e feat: redesign discover page — desktop two-col, full-screen mobile, skeleton, streaming, keyboard shortcuts
Some checks failed
Release / Test backend (push) Successful in 47s
Release / Check ui (push) Successful in 1m48s
Release / Docker (push) Failing after 3m23s
Release / Gitea Release (push) Has been skipped
2026-04-13 17:11:09 +05:00
root
1064c784d4 fix: clamp hero carousel card height to cover aspect ratio, prevent text overflow
All checks were successful
Release / Test backend (push) Successful in 58s
Release / Check ui (push) Successful in 1m59s
Release / Docker (push) Successful in 6m24s
Release / Gitea Release (push) Successful in 23s
2026-04-13 10:32:51 +05:00
root
ed9eeb6262 feat: admin archive/delete UI for books (Danger Zone panel)
All checks were successful
Release / Test backend (push) Successful in 51s
Release / Check ui (push) Successful in 1m48s
Release / Docker (push) Successful in 6m54s
Release / Gitea Release (push) Successful in 28s
2026-04-12 22:49:15 +05:00
root
e6f7f7297d feat: add sticky sidebar to chapter reader with ToC, progress, book info, and chapter nav
Some checks failed
Release / Test backend (push) Successful in 51s
Release / Check ui (push) Successful in 2m2s
Release / Docker (push) Failing after 7m3s
Release / Gitea Release (push) Has been skipped
2026-04-12 22:44:24 +05:00
root
93cc0b6eb0 perf: fix discover page 4s load — parallel fetches + per-user caching
Three compounding issues caused the 4+ second load:

1. getAllRatings() ran sequentially after the first Promise.all group,
   adding it unnecessarily to the critical path. Now runs in parallel
   with listBooks/getVotedSlugs/getSavedSlugs (all 4 concurrent).

2. discovery_votes was fetched twice on every page load — once inside
   getBooksForDiscovery (via getVotedSlugs) and again by getVotedBooks.
   Fixed by caching getVotedSlugs results with a 30s TTL so the second
   call hits cache instead of PocketBase.

3. getVotedSlugs and getSavedSlugs were always uncached, hitting
   PocketBase on every navigation. Added short-TTL per-user Valkey
   cache entries (voted: 30s, saved: 60s). Cache is invalidated
   immediately after each write (upsertDiscoveryVote, clearDiscoveryVotes,
   undoDiscoveryVote, saveBook) so stale data is never served.
2026-04-12 22:34:46 +05:00
root
6af5a4966f fix: remove redundant X icons from SearchModal search input
Removed the custom clear button (shown when query is non-empty) and
suppressed the browser-native webkit search cancel button via CSS.
Only the single Cancel button remains, avoiding the double/triple X
clutter on wider screens.
2026-04-12 22:24:58 +05:00
root
14388e8186 fix: persist chapter-names results into job payload from sync SSE handler
All checks were successful
Release / Test backend (push) Successful in 48s
Release / Check ui (push) Successful in 1m56s
Release / Docker (push) Successful in 5m35s
Release / Gitea Release (push) Successful in 23s
The SSE (non-async) chapter-names handler streamed results to the client
but never wrote them into the PocketBase job payload — only the initial
{pattern} stub was stored. The Review button then fetched the job and
found no results, showing "No results found in this job's payload."

Fix: accumulate allResults across batches (same as the async handler) and
write the full {pattern, slug, results:[...]} payload when marking done.
2026-04-12 18:44:09 +05:00
root
5cebbb1692 fix: restore pointer-events on ListeningMode and ChapterPickerOverlay
The wrapper div in +layout.svelte had pointer-events:none which blocked
all taps inside ListeningMode (chapter rows, buttons, scrolling). Removed
the wrapper div and moved the fly transition onto ListeningMode's own root
element so the slide-in animation works without stealing pointer events.
2026-04-12 18:31:50 +05:00
root
a0e705beec feat: redesign notifications settings with per-category in-app/push table
All checks were successful
Release / Test backend (push) Successful in 53s
Release / Check ui (push) Successful in 1m49s
Release / Docker (push) Successful in 5m49s
Release / Gitea Release (push) Successful in 21s
- Add notify_new_chapters_push field to AppUser, PATCH /api/profile, and profile loader
- Fix bell panel to reload notifications on every open (not just once on mount)
- Replace flat in-app + push toggles with structured category table (Category | In-app | Push)
- Add browser push master subscribe/unsubscribe row above the table
- Push column toggle disabled until browser is subscribed; shows — when unsupported/denied
- Update Notifications row hint to summarise active channels (In-app · Push / Off)
2026-04-12 17:56:53 +05:00
root
761ca83da5 fix: add push_subscriptions collection and notify_new_chapters migration to pb-init-v3.sh
2026-04-12 17:49:23 +05:00
root
48d0ae63bf feat: unified chapter picker overlay + currently reading quick-switch modal on reader
All checks were successful
Release / Test backend (push) Successful in 47s
Release / Check ui (push) Successful in 1m41s
Release / Docker (push) Successful in 5m40s
Release / Gitea Release (push) Successful in 21s
2026-04-12 17:42:45 +05:00
root
44f81bbf5c surface audio-ready chapters: headphones badge on chapter list, instant-play prompt on reader
All checks were successful
Release / Test backend (push) Successful in 1m1s
Release / Check ui (push) Successful in 1m42s
Release / Docker (push) Successful in 6m13s
Release / Gitea Release (push) Successful in 21s
- getReadyChaptersForSlug(slug, preferredVoice) in pocketbase.ts: per-slug done jobs map, cached 60s, prefers user's voice
- GET /api/audio/chapters?slug=&voice= endpoint
- Chapter list (/books/[slug]/chapters): amber headphones icon on ready rows, play button for instant listen, banner showing ready count
- Chapter reader: audioReady + availableVoice from server load; 'Audio ready — Listen now' banner shown when audio exists and player is not yet active; sets voice preference before expanding player
2026-04-12 11:13:36 +05:00
root
a2ce907480 fix svelte-check errors in /listen page: use meta_updated for sort, untrack data props
All checks were successful
Release / Test backend (push) Successful in 48s
Release / Check ui (push) Successful in 2m6s
Release / Docker (push) Successful in 6m7s
Release / Gitea Release (push) Successful in 21s
2026-04-12 10:25:53 +05:00
root
e4631e7486 refactor: profile page grouped menu layout inspired by iOS settings style
Some checks failed
Release / Test backend (push) Successful in 51s
Release / Check ui (push) Failing after 32s
Release / Docker (push) Has been skipped
Release / Gitea Release (push) Has been skipped
2026-04-12 10:21:20 +05:00
root
015cb8a0cd add Ready to Listen feature: audio book shelf on home + /listen browse page
Some checks failed
Release / Test backend (push) Successful in 53s
Release / Check ui (push) Failing after 45s
Release / Docker (push) Has been skipped
Release / Gitea Release (push) Has been skipped
- getBooksWithAudioCount() in pocketbase.ts aggregates done audio_jobs, deduplicates by chapter per slug, caches 5 min
- GET /api/audio/books endpoint
- home page: readyToListen shelf with headphones badge, chapter count, Listen button, hideable
- /listen page: full grid with search, sort (most narrated / A-Z / recent), empty state
2026-04-12 10:18:40 +05:00
root
53edb6fdef fix: seek bars work on iOS (onchange+oninput), minimal bar is range input, float drag direction corrected
All checks were successful
Release / Test backend (push) Successful in 54s
Release / Check ui (push) Successful in 1m43s
Release / Docker (push) Successful in 6m0s
Release / Gitea Release (push) Successful in 20s
2026-04-12 08:28:59 +05:00
root
f79538f6b2 fix: use untrack() in float clamp effect to prevent reactive loop that locked up the page
All checks were successful
Release / Test backend (push) Successful in 50s
Release / Check ui (push) Successful in 1m46s
Release / Docker (push) Successful in 6m18s
Release / Gitea Release (push) Successful in 21s
2026-04-12 07:49:08 +05:00
root
a3a218fef1 fix: float circle releases pointer capture on pointerup/cancel so page stays responsive
All checks were successful
Release / Test backend (push) Successful in 49s
Release / Check ui (push) Successful in 1m53s
Release / Docker (push) Successful in 6m28s
Release / Gitea Release (push) Successful in 21s
2026-04-11 23:56:14 +05:00
root
0c6c3b8c43 feat: show search button on chapter reader pages
All checks were successful
Release / Test backend (push) Successful in 1m3s
Release / Check ui (push) Successful in 1m51s
Release / Docker (push) Successful in 6m16s
Release / Gitea Release (push) Successful in 25s
2026-04-11 23:37:12 +05:00
root
a47cc0e711 feat: float player is now a draggable circle with viewport clamping and tap-to-play/pause
All checks were successful
Release / Test backend (push) Successful in 47s
Release / Check ui (push) Successful in 2m15s
Release / Docker (push) Successful in 6m36s
Release / Gitea Release (push) Successful in 39s
2026-04-11 23:35:49 +05:00
root
ac3d6e1784 fix: move hamburger backdrop outside <header> so drawer items are not blurred
All checks were successful
Release / Test backend (push) Successful in 47s
Release / Check ui (push) Successful in 1m49s
Release / Docker (push) Successful in 6m21s
Release / Gitea Release (push) Successful in 28s
2026-04-11 23:22:32 +05:00
root
adacd8944b fix: AudioPlayer chapter picker highlights audioStore.chapter (playing) not page chapter prop
All checks were successful
Release / Test backend (push) Successful in 59s
Release / Check ui (push) Successful in 1m56s
Release / Docker (push) Successful in 6m24s
Release / Gitea Release (push) Successful in 28s
2026-04-11 23:13:24 +05:00
root
ea58dab71c fix: hamburger backdrop starts below header so menu items are not blurred
All checks were successful
Release / Test backend (push) Successful in 48s
Release / Check ui (push) Successful in 1m58s
Release / Docker (push) Successful in 6m31s
Release / Gitea Release (push) Successful in 37s
2026-04-11 18:39:32 +05:00
root
cf3a3ad910 feat: add backdrop blur overlay when mobile hamburger menu is open
All checks were successful
Release / Test backend (push) Successful in 47s
Release / Check ui (push) Successful in 1m46s
Release / Docker (push) Successful in 6m5s
Release / Gitea Release (push) Successful in 34s
2026-04-11 17:30:55 +05:00
root
8660c675b6 fix: suppress mini-bar for float/minimal player styles; persist float position
All checks were successful
Release / Test backend (push) Successful in 1m0s
Release / Check ui (push) Successful in 1m42s
Release / Docker (push) Successful in 5m55s
Release / Gitea Release (push) Successful in 39s
2026-04-11 17:20:09 +05:00
root
1f4d67dc77 fix: player float mode now works; add minimal player style
All checks were successful
Release / Test backend (push) Successful in 52s
Release / Check ui (push) Successful in 1m49s
Release / Docker (push) Successful in 6m29s
Release / Gitea Release (push) Successful in 35s
Float mode was broken because AudioPlayer was unmounted the moment
audioStore.active became true — exactly when the float overlay needs
to render. Fix: keep AudioPlayer mounted in float and minimal modes
regardless of audioStore.active; only standard mode shows the
'Controls below' message.

Adds a third 'minimal' style: a compact single-row bar (skip ±,
play/pause, seek, time) with no voice picker or chapter browser.
Voice picker and chapter button are hidden in the idle pill too.

Settings UI updated to show all three options with a live
description of what each style does.
2026-04-11 16:00:46 +05:00
root
b0e23cb50a feat: floating scroll nav buttons in scroll reader mode
All checks were successful
Release / Test backend (push) Successful in 46s
Release / Check ui (push) Successful in 1m38s
Release / Docker (push) Successful in 6m29s
Release / Gitea Release (push) Successful in 31s
Up/down chevron buttons fixed to the bottom-right of the viewport.
At the top of the chapter the up button becomes a Prev chapter link;
at the bottom the down button becomes an amber Next chapter link.
Hidden in focus mode (uses its own pill). Lifts above the audio
mini-player when it is active.
2026-04-11 15:52:14 +05:00
root
1e886a705d feat: notifications modal, admin dedup, and in-app notification preferences
All checks were successful
Release / Test backend (push) Successful in 48s
Release / Check ui (push) Successful in 1m53s
Release / Docker (push) Successful in 6m22s
Release / Gitea Release (push) Successful in 35s
- Replace bell dropdown with full-screen NotificationsModal (mirrors SearchModal pattern)
- Notifications visible to all logged-in users (not just admin)
- Admin users excluded from new-chapter fan-out (dedup vs Scrape Complete notification)
- Users with notify_new_chapters=false are excluded from new-chapter in-app notifications
- Toggle in profile page to enable/disable in-app new-chapter notifications
- PATCH /api/profile endpoint to save notification preferences
- User-facing /notifications page (admin redirects to /admin/notifications)
2026-04-11 15:31:37 +05:00
root
19b5b44454 feat: hold-to-repeat page buttons and tap-counter slider in paginated reader
All checks were successful
Release / Test backend (push) Successful in 49s
Release / Check ui (push) Successful in 1m47s
Release / Docker (push) Successful in 6m8s
Release / Gitea Release (push) Successful in 36s
2026-04-11 15:13:34 +05:00
root
b95c811898 feat: web push notifications for new chapters
All checks were successful
Release / Test backend (push) Successful in 4m12s
Release / Check ui (push) Successful in 1m53s
Release / Docker (push) Successful in 5m46s
Release / Gitea Release (push) Successful in 35s
- Service worker (src/service-worker.ts) handles push events and
  notification clicks, navigating to the book page on tap
- Web app manifest (manifest.webmanifest) linked in app.html
- Profile page: push notification toggle (subscribe/unsubscribe)
  using the browser Notification + PushManager API with VAPID
- API route POST/DELETE /api/push-subscription proxies to backend
- Go backend: push_subscriptions PocketBase collection storage
  methods (SavePushSubscription, DeletePushSubscription,
  ListPushSubscriptionsByBook) in storage/store.go
- handlers_push.go: GET vapid-public-key, POST/DELETE subscription
- webpush package: VAPID-signed sends via webpush-go, SendToBook
  fans out to all users who have the book in their library
- Runner fires push to subscribers whenever ChaptersScraped > 0
  after a successful book scrape
- Config: VAPID_PUBLIC_KEY, VAPID_PRIVATE_KEY, VAPID_SUBJECT env vars
- domain.ScrapeResult gets a Slug field; orchestrator populates it
2026-04-11 14:59:21 +05:00
root
3a9f3b773e fix: reduce log noise during catalogue/book scrapes
All checks were successful
Release / Test backend (push) Successful in 55s
Release / Check ui (push) Successful in 2m0s
Release / Docker (push) Successful in 5m57s
Release / Gitea Release (push) Successful in 32s
Demote per-book and per-chapter-list-page Info logs to Debug — these
fire hundreds of times per catalogue run and drown out meaningful signals:
- orchestrator: RunBook starting (per book)
- metadata saved (per book)
- chapter list fetched (per book)
- scraping chapter list page N (per pagination page per book)

The 'book scrape finished' summary log (with scraped/skipped/errors
counters) remains at Info — it is the useful signal per book.
2026-04-11 12:39:41 +05:00
root
6776d9106f fix: catalogue job always shows 0 counters after cancel/finish
All checks were successful
Release / Test backend (push) Successful in 48s
Release / Check ui (push) Successful in 1m55s
Release / Docker (push) Successful in 6m5s
Release / Gitea Release (push) Successful in 32s
Two bugs fixed in runScrapeTask / runCatalogueTask:

1. FinishScrapeTask was called with the task's own context, which is
   already cancelled when the task is stopped. The PATCH to PocketBase
   failed silently, leaving all counters at their initial zero values.
   Fix: use a fresh context.WithTimeout(Background, 15s) for the write.

2. BooksFound was double-counted: RunBook already sets BooksFound=1 on
   success, but the accumulation loop added an extra +1 unconditionally,
   reporting 2 books per successful scrape.
   Fix: result.BooksFound += bookResult.BooksFound  (drop the + 1).
2026-04-11 12:33:30 +05:00
root
ada7de466a perf: remove voice picker from profile, parallelize server load
All checks were successful
Release / Test backend (push) Successful in 50s
Release / Check ui (push) Successful in 1m52s
Release / Docker (push) Successful in 5m58s
Release / Gitea Release (push) Successful in 33s
Remove the TTS voice section from the profile page — it fetched
/api/voices on every mount, blocking paint for the full round-trip.
Voice selection lives on the chapter page where voices are already loaded.

Rewrite the server load to run avatar, sessions+stats, and reading history
all concurrently via Promise.allSettled instead of sequentially, cutting
SSR latency by ~2-3x on the profile route.
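The sequential-to-concurrent change can be sketched roughly like this (an illustrative TypeScript sketch; the loader names are stand-ins, not the repo's actual functions):

```typescript
// Run the independent profile loaders concurrently instead of awaiting
// them one after another. loadAvatar/loadSessions/loadHistory are
// hypothetical stand-ins for the real sub-loads.
async function loadProfile(
  loadAvatar: () => Promise<string>,
  loadSessions: () => Promise<number>,
  loadHistory: () => Promise<string[]>
) {
  const [avatar, sessions, history] = await Promise.allSettled([
    loadAvatar(),
    loadSessions(),
    loadHistory(),
  ]);
  // allSettled never rejects: each entry is either {status:'fulfilled', value}
  // or {status:'rejected', reason}, so one failing sub-request degrades to a
  // fallback value instead of failing the whole server load.
  return {
    avatar: avatar.status === "fulfilled" ? avatar.value : null,
    sessions: sessions.status === "fulfilled" ? sessions.value : 0,
    history: history.status === "fulfilled" ? history.value : [],
  };
}
```

Promise.allSettled rather than Promise.all is the relevant design choice here: total latency drops to that of the slowest loader, and a single failure no longer aborts the page render.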
2026-04-11 10:41:35 +05:00
root
c91dd20c8c refactor: clean up profile page UI — remove decorative icons
All checks were successful
Release / Test backend (push) Successful in 48s
Release / Check ui (push) Successful in 1m51s
Release / Docker (push) Successful in 6m21s
Release / Gitea Release (push) Successful in 36s
Remove all decorative SVG icons (checkmarks, chevrons, stars, fire,
external-link arrows, empty-state illustrations). Replace icon-only
interactive elements with text (avatar hover shows 'Edit', voice sample
buttons show 'Play'/'Stop', danger zone toggle shows 'Open'/'Close').
Replace SVG avatar placeholder with the user's initial. Strip emoji
from stats cards and genre chips. Tighten playback toggle descriptions.
2026-04-11 10:21:14 +05:00
root
3b24f4560f feat: add OG/Twitter meta tags on book and chapter pages
All checks were successful
Release / Test backend (push) Successful in 45s
Release / Check ui (push) Successful in 1m53s
Release / Docker (push) Successful in 6m13s
Release / Gitea Release (push) Successful in 37s
Add og:title, og:description, og:image (book cover), og:url, og:type,
og:site_name, twitter:card, twitter:image, and rel=canonical to the
book detail and chapter reader pages so link previews in Telegram,
WhatsApp, Twitter/X, Discord etc. show the cover image instead of
the site logo.
2026-04-11 09:35:21 +05:00
root
973e639274 refactor: extract shared ChapterPickerOverlay component
All checks were successful
Release / Test backend (push) Successful in 48s
Release / Check ui (push) Successful in 1m55s
Release / Docker (push) Successful in 6m19s
Release / Gitea Release (push) Successful in 32s
Unify the duplicated chapter picker overlays from AudioPlayer and
ListeningMode into a single ChapterPickerOverlay component.
Both callers keep their own onselect handlers; the overlay owns
search state internally and includes safe-area insets + scrollIfActive.
2026-04-11 09:01:24 +05:00
root
e78c44459e refactor(profile): visual voice picker, playback toggles, danger zone
All checks were successful
Release / Test backend (push) Successful in 1m2s
Release / Check ui (push) Successful in 1m46s
Release / Docker (push) Successful in 6m9s
Release / Gitea Release (push) Successful in 29s
- Replace voice <select> with a two-column card grid grouped by engine
  (Kokoro GPU / Pocket TTS CPU / Cloudflare AI); each card has a per-voice
  sample play/pause button matching AudioPlayer behaviour
- Add Announce chapter and Audio mode (Stream/Generate) toggles to a
  unified Playback row in Preferences; Audio mode toggle disabled for
  CF AI voices
- Remove duplicate PUT /api/settings from the profile page; all writes
  go directly into audioStore / theme context and the layout's single
  debounced effect persists them
- Add Danger Zone section: collapsible, requires typing username to
  unlock Delete account button; calls DELETE /api/profile
- Add deleteUserAccount() to pocketbase.ts: purges user_settings,
  user_library, progress, comment_votes, book_ratings,
  user_subscriptions, notifications, user_sessions then the
  app_users record
- Add DELETE /api/profile server route (auth-guarded)
2026-04-10 22:30:39 +05:00
root
f8c66fcf63 feat: stream/generate audio mode toggle
All checks were successful
Release / Test backend (push) Successful in 51s
Release / Check ui (push) Successful in 2m6s
Release / Docker (push) Successful in 6m15s
Release / Gitea Release (push) Successful in 29s
Add a user-selectable playback mode stored in user_settings:
- 'stream' (default): /api/audio-stream starts playing within seconds,
  saves to MinIO concurrently — low latency
- 'generate': queue runner task, poll until full audio is ready in
  MinIO, then play via presigned URL — legacy behaviour

UI toggles in two places:
- AudioPlayer idle pill: compact '· Stream / · Generate' inline next
  to voice name and estimated duration
- ListeningMode controls row: pill alongside Auto, Announce, Sleep;
  disabled and grayed out for CF AI voices (batch-only, no streaming)

startPlayback() now branches on audioStore.audioMode for non-CF AI
voices; generate mode uses the same runner task + progress bar flow
as CF AI but without the preview clip.

PocketBase: audio_mode text field added to user_settings on
pb.libnovel.cc (live) and in pb-init-v3.sh (create block +
add_field migration line).
2026-04-10 20:06:56 +05:00
root
a1def0f0f8 feat: admin soft-delete and hard-delete for books
Some checks failed
Release / Test backend (push) Successful in 58s
Release / Docker (push) Has been cancelled
Release / Gitea Release (push) Has been cancelled
Release / Check ui (push) Has been cancelled
- Add `archived` bool to domain.BookMeta, pbBook, and Meilisearch bookDoc
- ArchiveBook / UnarchiveBook patch the PocketBase record; ListBooks filters
  archived=false so hidden books disappear from all public responses
- Meilisearch: add `archived` as a filterable attribute; Search and Catalogue
  always prepend `archived = false` to exclude archived books from results
- DeleteBook permanently removes the PocketBase record, all chapters_idx rows,
  MinIO chapter objects, cover image, and the Meilisearch document
- New BookAdminStore interface with ArchiveBook, UnarchiveBook, DeleteBook
- Admin HTTP endpoints: PATCH /api/admin/books/{slug}/archive|unarchive,
  DELETE /api/admin/books/{slug}
- PocketBase schema: archived field added to live pb.libnovel.cc and to
  pb-init-v3.sh (both create block and add_field migration)
2026-04-10 19:31:33 +05:00
root
e0dec05885 fix: chunk large chapter text for Kokoro TTS to prevent EOF on big inputs
All checks were successful
Release / Test backend (push) Successful in 49s
Release / Check ui (push) Successful in 1m48s
Release / Docker (push) Successful in 5m58s
Release / Gitea Release (push) Successful in 34s
Split chapter text into ~1000-char sentence-boundary chunks before sending
to kokoro-fastapi. Each chunk is generated individually and the raw MP3 bytes
are concatenated. This prevents the EOF / timeout failures that occur when
the server receives a very large single request (e.g. a full PDF 'Full Text'
chapter). chunkText() breaks at sentence endings (. ! ? newlines) to preserve
natural speech flow.
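The real chunkText lives in the Go backend; this is a minimal TypeScript sketch of the same idea (the ~1000-char target and sentence-ending break rule come from the commit, everything else is an assumption):

```typescript
// Split text into roughly maxLen-character chunks, breaking only at
// sentence endings (. ! ? or newlines) so each TTS request stays small
// and speech flow is preserved.
function chunkText(text: string, maxLen = 1000): string[] {
  const chunks: string[] = [];
  let current = "";
  // Split into sentence-ish units; the lookbehind keeps the terminator
  // attached to its sentence.
  const parts = text.split(/(?<=[.!?\n])\s*/);
  for (const part of parts) {
    if (current.length + part.length > maxLen && current.length > 0) {
      chunks.push(current.trim());
      current = "";
    }
    current += part + " ";
  }
  if (current.trim().length > 0) chunks.push(current.trim());
  return chunks;
}
```

A chunk only closes at a sentence boundary, so a single sentence longer than maxLen still goes out as one oversized request rather than being cut mid-sentence; each chunk's generated MP3 bytes can then be concatenated in order, as the commit describes.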
2026-04-10 09:24:37 +05:00
root
8662aed565 feat: PDF single-chapter import, EPUB numbering fix, admin chapter split tool
All checks were successful
Release / Check ui (push) Successful in 2m10s
Release / Test backend (push) Successful in 53s
Release / Docker (push) Successful in 6m7s
Release / Gitea Release (push) Successful in 23s
- parsePDF: return all text as single 'Full Text' chapter (admin splits manually)
- parseEPUB: fix chapter numbering to use sequential counter not spine index
- Remove dead code: chaptersFromBookmarks, cleanChapterText, extractChaptersFromText, chapterHeadingRE; drop pdfcpu alias and regexp imports
- Backend: POST /api/admin/books/:slug/split-chapters endpoint — splits text on '---' dividers, optional '## Title' headers, writes chapters via WriteChapter
- UI: admin panel now shows for all admin users regardless of source_url; chapter split tool shown when book has single 'Full Text' chapter, pre-fills from MinIO content
2026-04-09 23:59:24 +05:00
root
cdfa1ac5b2 fix(pdf): fix page ordering, Win-1252 quotes, and chapter header cleanup
Some checks failed
Release / Test backend (push) Successful in 54s
Release / Check ui (push) Successful in 1m49s
Release / Gitea Release (push) Has been cancelled
Release / Docker (push) Has been cancelled
Three fixes to PDF chapter extraction quality:

1. Page ordering: parse page number from pdfcpu filename (out_Content_page_N.txt)
   instead of using lexicographic sort index — fixes chapters bleeding into each
   other (e.g. Prologue text appearing inside Chapter 1).

2. Windows-1252 chars: map bytes 0x91-0x9F to proper Unicode (curly quotes U+2018/
   U+2019/U+201C/U+201D, em-dash U+2014, etc.) instead of raw Latin-1 control
   bytes that rendered as ◆ in the browser.

3. Chapter header cleanup: skip the first page of each bookmark range (decorative
   title art page) and strip any run-on title fragment at the start of the first
   body page (e.g. 'for New Journeys!I stood atop...' → 'I stood atop...'). The
   remaining sentence truncation is a fundamental limitation of this PDF's
   PUA-encoded body font (C2_1/Literata) — those glyphs cannot be decoded without
   the publisher's private ToUnicode mapping.
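The first two fixes are small and mechanical; a hedged TypeScript sketch (the real code is Go inside the pdfcpu pipeline, and the mapping below covers only the most common 0x91-0x9F bytes):

```typescript
// 1. Parse the page number out of pdfcpu's extracted-content filename
//    (out_Content_page_N.txt) so pages sort numerically — lexicographic
//    order would put page_10 before page_2.
function pageNumber(filename: string): number {
  const m = filename.match(/page_(\d+)\.txt$/);
  return m ? parseInt(m[1], 10) : Number.MAX_SAFE_INTEGER;
}

// 2. Map Windows-1252 bytes in the 0x91-0x9F range (control codes in
//    Latin-1) to their proper Unicode punctuation. Partial table only.
const cp1252: Record<number, string> = {
  0x91: "\u2018", // left single quote
  0x92: "\u2019", // right single quote
  0x93: "\u201C", // left double quote
  0x94: "\u201D", // right double quote
  0x96: "\u2013", // en dash
  0x97: "\u2014", // em dash
};
function fixWin1252(input: string): string {
  return [...input].map((ch) => cp1252[ch.charCodeAt(0)] ?? ch).join("");
}
```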
2026-04-09 23:43:09 +05:00
root
ffcdf5ee10 fix(pdf): replace dslipak/pdf with pdfcpu bookmark+content-stream extraction
All checks were successful
Release / Test backend (push) Successful in 4m55s
Release / Check ui (push) Successful in 1m52s
Release / Docker (push) Successful in 7m13s
Release / Gitea Release (push) Successful in 46s
Use PDF outline (bookmarks) for chapter titles and page ranges, then
extract text from per-page content streams via pdfcpu ExtractContent.
This avoids the indefinite hang caused by dslipak/pdf trying to resolve
custom font ToUnicode CMaps on publisher PDFs.

- parsePDF: decrypt → ExtractContent → Bookmarks → chaptersFromBookmarks
- chaptersFromBookmarks: flatten bookmark tree, skip front/back matter
  (Cover, Insert, Title Page, Copyright, Appendix), assign page ranges
- extractTextFromContentStream: handle TJ arrays (concat literal strings,
  skip hex glyph arrays and kerning numbers) + single Tj strings
- Falls back to paragraph-splitting when no bookmarks present
- Build verified; test PDF produces 9 chapters with proper titles
2026-04-09 22:36:58 +05:00
root
899c504d1f feat(import): move PDF parsing to backend; fix heartbeat/reap for import_tasks
All checks were successful
Release / Test backend (push) Successful in 51s
Release / Check ui (push) Successful in 2m2s
Release / Docker (push) Successful in 7m32s
Release / Gitea Release (push) Successful in 1m0s
- parsePDF function restored in import.go (its body had been orphaned outside the function)

- ParseImportFile() called at upload time with 3-min timeout; chapters stored as JSON in MinIO
- runner.go: prefer ChaptersKey path (read pre-parsed JSON) over BookImport.Import()
- ImportChapterStore interface added; store wired in runner/main.go
- HeartbeatTask and ReapStaleTasks now include import_tasks collection
- parseImportTask now returns ChaptersKey in domain.ImportTask
- asynq_runner.go handleImportTask passes ChaptersKey
- pb-init-v3.sh: chapters_key field added to import_tasks schema
2026-04-09 21:19:43 +05:00
root
d82aa9d4b4 fix(import): decrypt owner-encrypted PDFs with pdfcpu; add imports bucket to minio-init
All checks were successful
Release / Test backend (push) Successful in 2m10s
Release / Check ui (push) Successful in 1m55s
Release / Docker (push) Successful in 7m17s
Release / Gitea Release (push) Successful in 48s
- parsePDF now attempts to strip encryption via pdfcpu (empty user password)
  before handing bytes to dslipak/pdf — fixes '256-bit encryption key' error
  on publisher PDFs that use owner-only encryption (copy/print restrictions)
- Add pdfcpu v0.11.1 as direct dependency (was already indirect)
- docker-compose.yml minio-init: add 'imports' and 'translations' buckets
  so a fresh deploy creates all required buckets
2026-04-09 20:08:12 +05:00
root
ae08382b81 fix(import): wire ImportFileStore to bypass Asynq type assertion; add pb-init collections
All checks were successful
Release / Test backend (push) Successful in 44s
Release / Check ui (push) Successful in 1m56s
Release / Docker (push) Successful in 6m10s
Release / Gitea Release (push) Successful in 37s
- Add ImportFileStore interface to bookstore package
- Add ImportFileStore field to backend.Dependencies
- Wire ImportFileStore: store in cmd/backend/main.go
- handlers_import.go: use s.deps.ImportFileStore.PutImportFile instead of
  broken s.deps.Producer.(*storage.Store) type assertion (fails when Asynq active)
- pb-init-v3.sh: add import_tasks and notifications collection definitions
2026-04-09 19:05:11 +05:00
root
b9f8008c2c chore: embed git credentials in remote URL; update AGENTS.md 2026-04-09 17:09:53 +05:00
root
d25cee3d8c fix(ci): track generated admin_nav_notifications.js to avoid CDN-dependent paraglide failure
All checks were successful
Release / Test backend (push) Successful in 37s
Release / Check ui (push) Successful in 1m54s
Release / Docker (push) Successful in 5m45s
Release / Gitea Release (push) Successful in 35s
2026-04-09 17:03:30 +05:00
root
48714cd98b fix(import): persist object_key + metadata; add nav + logout session cleanup
Some checks failed
Release / Test backend (push) Successful in 47s
Release / Check ui (push) Failing after 29s
Release / Docker (push) Has been skipped
Release / Gitea Release (push) Has been skipped
- Import task: persist object_key, author, cover_url, genres, summary,
  book_status in PocketBase so the runner can fetch the file and write
  book metadata on completion
- Runner poll mode: pass task.ObjectKey instead of empty string
- Runner: write BookMeta + UpsertBook in Meilisearch after chapter ingest
  so imported books appear in catalogue and search
- Import UI: add author, cover URL, genres, summary, status fields; add
  AI tasks panel (chapter names, description, image gen, tagline) after
  import completes; add AI tasks button on each done task in the list
- Admin nav: add Notifications entry to sidebar (all 5 locales)
- Logout: delete user_sessions row on sign-out so sessions don't
  accumulate as phantoms after each login/logout cycle
2026-04-09 16:59:40 +05:00
root
1a2bf580cd v2.6.51: fix PDF import — raise UI body limit, wire real analyze
All checks were successful
Release / Test backend (push) Successful in 38s
Release / Check ui (push) Successful in 1m49s
Release / Docker (push) Successful in 6m4s
Release / Gitea Release (push) Successful in 40s
- docker-compose.yml: BODY_SIZE_LIMIT=52428800 (50MB) on UI service
  — adapter-node was rejecting PDFs >512KB with 'Content-length exceeds limit'
- storage/import.go: add AnalyzeFile() public function using real PDF/EPUB parsers
- handlers_import.go: analyzeImportFile() now calls storage.AnalyzeFile() instead
  of the file-size stub, so chapter count is accurate in the preview
2026-04-09 15:55:06 +05:00
root
2ca1ab2250 v2.6.50: notifications overhaul, fix blank page, fix chapter review loading
All checks were successful
Release / Test backend (push) Successful in 3m15s
Release / Check ui (push) Successful in 1m49s
Release / Docker (push) Successful in 5m53s
Release / Gitea Release (push) Successful in 40s
- svelte.config.js: paths.relative=false so CSS uses absolute /_app/ paths (fixes blank home page after redirect)
- ai-jobs: fix openReview() mutating stale alias r instead of $state review — was causing 'Loading results...' to never resolve for chapter-names/image-gen/description
- notifications bell: redesign with All/Unread tabs, per-item dismiss (×), mark-all-read, clear-all, 'View all' footer link
- /admin/notifications: new dedicated full-page notifications view
- api/notifications proxy: add PATCH (mark-all-read) and DELETE (clear-all, dismiss) handlers
- runner: add CreateNotification calls on success/failure in runScrapeTask, runAudioTask, runTranslationTask
- storage/import.go: real PDF (dslipak/pdf) and EPUB (archive/zip + x/net/html) parsing replacing stubs
- translation admin page: stream jobs Promise instead of blocking navigation
- store.go: DeleteNotification, ClearAllNotifications, MarkAllNotificationsRead methods
- handlers_notifications.go + server.go: PATCH /api/notifications, DELETE /api/notifications, DELETE /api/notifications/{id}
2026-04-09 15:14:00 +05:00
root
2571c243c9 perf: stream slow load functions in admin pages to unblock navigation
All checks were successful
Release / Test backend (push) Successful in 50s
Release / Check ui (push) Successful in 2m4s
Release / Docker (push) Successful in 5m42s
Release / Gitea Release (push) Successful in 36s
- image-gen, text-gen: books list streamed (listBooks is expensive on cold cache)
- ai-jobs: jobs list streamed; add 30s cache to listAIJobs (was uncached listAll)
- changelog: Gitea releases streamed on cold cache; cached path stays synchronous
- admin/+layout.svelte: remove duplicate audio/translation/image-gen nav links
2026-04-09 13:02:32 +05:00
root
89f0d6a546 fix: forward multipart/form-data correctly in import API proxy
All checks were successful
Release / Test backend (push) Successful in 41s
Release / Check ui (push) Successful in 1m43s
Release / Docker (push) Successful in 5m51s
Release / Gitea Release (push) Successful in 39s
The SvelteKit proxy was calling request.json() unconditionally,
consuming the body before forwarding. File uploads (multipart/form-data)
now use request.formData() and pass the FormData directly to backendFetch
so the fetch API sets the correct Content-Type boundary automatically.
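The shape of the fix, sketched in TypeScript (backendFetch and the SvelteKit route wiring are omitted; this only shows the body-forwarding branch):

```typescript
// Branch on Content-Type instead of unconditionally calling
// request.json(), which consumes the body before it can be forwarded.
async function forwardBody(request: Request): Promise<FormData | string> {
  const contentType = request.headers.get("content-type") ?? "";
  if (contentType.startsWith("multipart/form-data")) {
    // Hand the FormData straight to fetch. Do NOT copy the original
    // Content-Type header along with it — fetch generates a fresh
    // multipart boundary, and a stale header would not match it.
    return await request.formData();
  }
  return JSON.stringify(await request.json());
}
```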
2026-04-09 12:39:20 +05:00
root
8bc9460989 fix: force-add missing admin_nav_import.js paraglide generated file
All checks were successful
Release / Test backend (push) Successful in 42s
Release / Check ui (push) Successful in 1m47s
Release / Docker (push) Successful in 7m11s
Release / Gitea Release (push) Successful in 41s
The paraglide .gitignore uses '*' to ignore all generated files.
admin_nav_import.js was never force-added after the key was introduced
in v2.6.44, causing svelte-check to fail in CI with 'Cannot find module'.
2026-04-09 12:21:44 +05:00
root
fcd4b3ad7f fix: wire import chapter ingestion, live task polling, a11y labels, notification user targeting
Some checks failed
Release / Test backend (push) Successful in 47s
Release / Check ui (push) Failing after 36s
Release / Docker (push) Has been skipped
Release / Gitea Release (push) Has been skipped
- NewBookImporter now takes *Store instead of raw *minio.Client (same package access)
- Runner Dependencies gains ChapterIngester interface; storeImportedChapters no-op removed
- store.IngestChapters is now actually called after PDF/EPUB extraction
- BookImport and ChapterIngester wired in runner main.go
- CreateImportTask interface gains initiatorUserID param; threaded through store/asynq/handler
- domain.ImportTask gains InitiatorUserID field; parseImportTask populates it
- Runner notifies initiator (falls back to 'admin' when empty)
- UI: import task list polls every 3s while any task is pending/running
- UI: label[for] + input[id] fix for a11y warnings in admin import page
2026-04-09 11:00:01 +05:00
root
ab92bf84bb feat: import review step + admin notifications
Some checks failed
Release / Test backend (push) Failing after 16s
Release / Check ui (push) Failing after 33s
Release / Docker (push) Has been skipped
Release / Gitea Release (push) Has been skipped
- Import page: add file upload with review step before committing
- Backend: add analyze endpoint to preview chapters before import
- Add notifications collection + API for admin alerts
- Add bell icon in header with notification dropdown (admin only)
- Runner creates notification on import completion
- Notifications link to /admin/import for easy review
2026-04-09 10:30:36 +05:00
root
bb55afb562 fix: add missing admin_nav_import i18n key for import page
Some checks failed
Release / Test backend (push) Successful in 42s
Release / Check ui (push) Failing after 36s
Release / Docker (push) Has been skipped
Release / Gitea Release (push) Has been skipped
2026-04-09 10:18:12 +05:00
root
e088bc056e feat: add PDF/EPUB import functionality
Some checks failed
Release / Test backend (push) Successful in 43s
Release / Check ui (push) Failing after 35s
Release / Docker (push) Has been skipped
Release / Gitea Release (push) Has been skipped
- Add ImportTask/ImportResult types to domain.go
- Add TypeImportBook to asynqqueue for task routing
- Add CreateImportTask to producer and storage layers
- Add ClaimNextImportTask/FinishImportTask to Consumer interfaces
- Add import task handling to runner (polling + Asynq handler)
- Add BookImporter interface to bookstore for PDF/EPUB parsing
- Add backend API endpoints: POST/GET /api/admin/import
- Add SvelteKit UI at /admin/import with task list
- Add nav link in admin layout

Note: PDF/EPUB parsing is a placeholder; it needs external library integration.
2026-04-09 10:01:20 +05:00
root
a904ff4e21 fix: update CF AI image gen to use multipart for FLUX.2 models
All checks were successful
Release / Test backend (push) Successful in 42s
Release / Check ui (push) Successful in 1m41s
Release / Docker (push) Successful in 5m39s
Release / Gitea Release (push) Successful in 30s
Cloudflare Workers AI changed the API for flux-2-dev, flux-2-klein-4b,
and flux-2-klein-9b to require multipart/form-data (instead of JSON) and
now returns {"image":"<base64>"} instead of raw PNG bytes.

- Add requiresMultipart() helper for the three FLUX.2 models
- callImageAPI builds multipart body for those models, JSON for others
- Parse {"image":"<base64>"} JSON response; fall back to raw bytes for legacy models
- Use "steps" field name (not "num_steps") in multipart forms per CF docs
- Book page: capture and display actual backend error message instead of blank 'Error'
2026-04-08 22:28:36 +05:00
root
04e63414a3 fix: restore swipeStartX declaration (newline eaten in prior edit)
All checks were successful
Release / Test backend (push) Successful in 47s
Release / Check ui (push) Successful in 1m41s
Release / Docker (push) Successful in 5m34s
Release / Gitea Release (push) Successful in 33s
2026-04-08 21:23:23 +05:00
root
bae363893b fix: include generated id in createComment POST body
Some checks failed
Release / Test backend (push) Successful in 42s
Release / Check ui (push) Failing after 32s
Release / Docker (push) Has been skipped
Release / Gitea Release (push) Has been skipped
The book_comments collection has a custom 'id' field (type: text,
required: true) distinct from PocketBase's system record ID.
Every comment POST was returning 400 validation_required because
the id field was never sent.

Fix: generate a 15-char hex ID via crypto.randomUUID() and include
it in the payload, matching PocketBase's own ID alphabet.
2026-04-08 21:06:23 +05:00
root
b7306877f1 refactor: clean up home page UI
Some checks failed
Release / Test backend (push) Successful in 43s
Release / Check ui (push) Failing after 36s
Release / Docker (push) Has been skipped
Release / Gitea Release (push) Has been skipped
- Remove all emojis (flame icon in streak widget, 🔥 in stats footer,
  ✓ in completed badge) — they cheapened the overall feel
- Fix double carousel indicator: drop the animated progress sub-line
  below the active dot; the expanding pill shape is sufficient signal.
  Also removes the rAF animation loop and progressStart state.
- Remove 'X left' badge from Continue Reading shelf cards
- Remove 'X chapters ahead' text from hero card info row
2026-04-08 20:54:37 +05:00
root
0723049e0c ci: fail if Paraglide generated files are out of sync with messages JSON
All checks were successful
Release / Test backend (push) Successful in 41s
Release / Check ui (push) Successful in 1m46s
Release / Docker (push) Successful in 5m39s
Release / Gitea Release (push) Successful in 29s
2026-04-08 20:42:21 +05:00
root
b206994459 fix: generate missing Paraglide message modules for new i18n keys
Hand-authored the 12 missing .js message modules and updated _index.js
since npm/paraglide-js codegen cannot run in this environment.
These are equivalent to what 'npm run paraglide' would generate.

Going forward: run 'npm run paraglide' after adding keys to messages/*.json.
2026-04-08 20:40:58 +05:00
root
956594ae7b feat: replace compact player with draggable floating overlay
Some checks failed
Release / Test backend (push) Successful in 40s
Release / Check ui (push) Failing after 32s
Release / Docker (push) Has been skipped
Release / Gitea Release (push) Has been skipped
- Remove compact player mode (seekable bar inline was undifferentiated
  from standard; didn't have voice/chapter/warm-up controls)
- Add 'float' player style: a fixed, draggable mini-overlay positioned
  above the bottom mini-bar (bottom-right corner, z-55)
  · Seek bar, skip ±15/30s, play/pause, time, speed indicator
  · Drag handle (pointer capture) to reposition anywhere on screen
  · Animated playing pulse dot in the title row
  · Suppresses the standard non-idle inline block when active so
    there is no duplicate UI
- PlayerStyle type: 'standard' | 'float' (was 'compact')
- Bump localStorage key to reader_layout_v2 so old 'compact' pref
  does not pollute the new default
- Listening tab Style row now shows Standard / Float
2026-04-08 20:28:47 +05:00
root
e399b1ce01 feat: admin UX overhaul — status filters, retry/cancel, mobile cards, i18n, shelf pre-populate
Some checks failed
Release / Test backend (push) Successful in 41s
Release / Check ui (push) Failing after 32s
Release / Docker (push) Has been skipped
Release / Gitea Release (push) Has been skipped
- Admin layout: SVG icons, active highlight, divider between nav sections
- Scrape page: status filter pills with counts, text + status combined search
- Audio page: status filter pills, cancel jobs, retry failed jobs, mobile cards for cache tab
- Translation page: status filter pills (incl. cancelled), cancel + retry jobs, mobile cancel/retry cards, i18n for all labels
- AI Jobs page: fix concurrent cancel (Set instead of single slot), per-job cancel errors inline, full mobile card layout, i18n title/heading
- Text-gen page: tagline editable input + copy, warnings copy, i18n title/heading
- Book page: chapter cover Save button, audio monitor link, currentShelf pre-populated from server
- pocketbase.ts: add getBookShelf(), shelf field on UserLibraryEntry
- New API route: POST /api/admin/translation/bulk (proxy for translation retry)
- i18n: 15 new admin_translation_*, admin_ai_jobs_*, admin_text_gen_* keys across all 5 locales
2026-04-08 18:30:35 +05:00
root
320f9fc76b feat: add page fade transition and CSS performance improvements
All checks were successful
Release / Test backend (push) Successful in 41s
Release / Check ui (push) Successful in 1m35s
Release / Docker (push) Successful in 5m53s
Release / Gitea Release (push) Successful in 29s
- Wrap {#key} children in fade transition (out 100ms, in 180ms+60ms delay)
  for a smooth cross-fade between pages with no added dependencies
- Halve navigation progress bar animation duration (8s → 4s) for
  more realistic feedback on typical navigations
- Add prefers-reduced-motion media query to collapse all animation/transition
  durations for users with accessibility needs
- Add content-visibility: auto on footer to skip browser paint of
  off-screen content and improve rendering performance
2026-04-08 16:54:51 +05:00
root
7bcc481483 feat: listening settings tab controls, warm-up button, resume button
All checks were successful
Release / Test backend (push) Successful in 43s
Release / Check ui (push) Successful in 1m41s
Release / Docker (push) Successful in 5m37s
Release / Gitea Release (push) Successful in 31s
- Listening tab: expose Speed, Auto-next, and Announce chapter controls
  alongside the existing Player Style row (chapters/[n]/+page.svelte)
- AudioPlayer: add Warm up Ch. N button in standard player ready state
  when autoNext is off; shows prefetching spinner and completion status
- Book page: fetch saved audioTime on mount; show Resume Ch. N button
  (desktop + mobile) that sets autoStartChapter and navigates directly
2026-04-08 15:14:51 +05:00
root
16f277354b fix: close missing {/if} for playerStyle block in AudioPlayer
All checks were successful
Release / Test backend (push) Successful in 54s
Release / Check ui (push) Successful in 1m40s
Release / Docker (push) Successful in 5m50s
Release / Gitea Release (push) Successful in 37s
The {#if playerStyle === 'compact'} block was left unclosed after the
idle-state refactor, causing a svelte-check parse error.
2026-04-08 14:45:56 +05:00
root
3c33b22511 feat: redesign standard player idle state as pill-style row
Some checks failed
Release / Test backend (push) Successful in 38s
Release / Check ui (push) Failing after 35s
Release / Docker (push) Has been skipped
Release / Gitea Release (push) Has been skipped
Replace the two-row layout (toolbar + lonely 'Play narration' button)
with a single pill row:
- Large circular play button on the left (brand color, active:scale-95)
- 'Play narration' label + voice selector button + estimated duration
  (based on word count at ~150wpm) in the centre
- Chapters icon button on the right

Voice panel drops inline below the pill when open.
Non-idle states (loading / generating / ready / other-chapter-playing)
keep the existing toolbar + status layout unchanged.

Also adds wordCount prop to AudioPlayer and passes it from the chapter page.
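The ~150 wpm estimate reduces to a small helper along these lines (a sketch — the actual function name and output format are assumptions):

```javascript
// Rough narration-length estimate from word count at ~150 wpm, formatted
// as an "~m:ss" hint for the pill row. Name and format are illustrative.
function estimateNarration(wordCount, wpm = 150) {
  const totalSec = Math.round((wordCount / wpm) * 60);
  const m = Math.floor(totalSec / 60);
  const s = totalSec % 60;
  return `~${m}:${String(s).padStart(2, '0')}`;
}
```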
2026-04-08 13:52:52 +05:00
root
85492fae73 fix: replace speechSynthesis announce with real audio clip via /api/tts-announce
All checks were successful
Release / Test backend (push) Successful in 41s
Release / Check ui (push) Successful in 1m48s
Release / Docker (push) Successful in 13m20s
Release / Gitea Release (push) Successful in 40s
speechSynthesis is silently muted on iOS Safari and Chrome Android after
the audio session ends (onended), so chapter announcements never played.

Fix:
- Add GET /api/tts-announce backend endpoint: streams a short TTS clip
  for arbitrary text without MinIO caching (backend/internal/backend/)
- Add GET /api/announce SvelteKit proxy route (no paywall)
- Add announceNavigatePending/announcePendingSlug/announcePendingChapter
  to AudioStore
- Rewrite onended announce branch: sets audioStore.audioUrl to the
  announcement clip URL so the persistent <audio> element plays it;
  the next onended detects announceNavigatePending and navigates
- 10s safety timeout in case the clip fails to load/end
2026-04-08 11:57:04 +05:00
root
559b6234e7 fix: update otel-collector telemetry.metrics config for v0.103+ (address → readers) 2026-04-07 18:15:21 +05:00
root
75cac363fc fix: chapter/voice modals in ListeningMode use fixed inset-0 with safe-area insets to fill full screen
All checks were successful
Release / Test backend (push) Successful in 45s
Release / Check ui (push) Successful in 1m41s
Release / Docker (push) Successful in 6m32s
Release / Gitea Release (push) Successful in 1m9s
2026-04-07 17:54:49 +05:00
root
68c7ae55e7 fix: carousel no longer reshuffles continue-reading shelf; remove arrows, add swipe; animate progress line under active dot
All checks were successful
Release / Test backend (push) Successful in 54s
Release / Check ui (push) Successful in 1m41s
Release / Docker (push) Successful in 5m50s
Release / Gitea Release (push) Successful in 29s
2026-04-07 12:32:45 +05:00
root
c900fc476f fix: add forest, mono, cyber to settings API validThemes allowlist
All checks were successful
Release / Test backend (push) Successful in 37s
Release / Check ui (push) Successful in 1m40s
Release / Docker (push) Successful in 5m43s
Release / Gitea Release (push) Successful in 31s
2026-04-07 12:09:56 +05:00
root
d612b40fdb ci: consolidate Docker jobs into one + comment out sourcemaps step
- Merged docker-backend, docker-runner, docker-ui, docker-caddy into a
  single 'docker' job. Docker Hub is now authenticated once; the credential
  in ~/.docker/config.json is reused by all four build-push-action steps.
  Eliminates 3 redundant login, checkout, setup-buildx, and scheduler
  round-trips. Builds still run sequentially within the job so inline layer
  cache pushes don't race each other.

- docker-ui now downloads the plain ui-build artifact from check-ui directly
  (previously it depended on upload-sourcemaps which produced ui-build-injected).

- release job now only needs: [docker] instead of the previous 5-job list.

- Entire upload-sourcemaps job commented out as requested. Re-enable by
  uncommenting the job block and adding 'upload-sourcemaps' back to the
  docker job's needs list (also swap ui-build → ui-build-injected artifact).
2026-04-07 12:01:34 +05:00
root
faa4c42f20 fix: add missing Paraglide compiled message files for new themes + aria-label
All checks were successful
Release / Test backend (push) Successful in 40s
Release / Check ui (push) Successful in 1m39s
Release / Docker / backend (push) Successful in 2m50s
Release / Docker / runner (push) Successful in 2m53s
Release / Upload source maps (push) Successful in 1m27s
Release / Docker / ui (push) Successful in 2m53s
Release / Docker / caddy (push) Successful in 48s
Release / Gitea Release (push) Successful in 36s

CI svelte-check failed because Paraglide generates per-key .js files that
must be committed: the compiler runs at build time in CI (svelte-kit sync),
but the output directory is tracked in git, so new keys added to messages/*.json
require the corresponding generated files to be committed manually when Node
is not available locally.

Added:
- messages/profile_theme_forest.js
- messages/profile_theme_mono.js
- messages/profile_theme_cyber.js
- messages/_index.js: three new export lines

Also fixed a11y warn: added aria-label='Close history' to the icon-only
close button in the discover page history drawer.
2026-04-07 11:40:38 +05:00
root
17fa913ba9 feat: add Forest, Mono, and Cyberpunk color themes
Some checks failed
Release / Test backend (push) Successful in 38s
Release / Check ui (push) Failing after 31s
Release / Upload source maps (push) Has been skipped
Release / Docker / ui (push) Has been skipped
Release / Docker / caddy (push) Successful in 41s
Release / Docker / backend (push) Successful in 2m29s
Release / Docker / runner (push) Successful in 2m39s
Release / Gitea Release (push) Has been skipped
Forest — deep green surfaces (#0a130d base) with green-400 accent.
  Inspired by terminal emulators and dark mode IDE forest schemes.

Mono — near-black zinc-950 surface with pure white/zinc-100 accent.
  Minimal high-contrast monochromatic palette for distraction-free reading.

Cyber — near-black blue surface with neon cyan-400 accent and dracula-palette
  success/danger colors. Cyberpunk aesthetic with neon glow feel.

Changes:
- app.css: three new [data-theme] blocks with full token sets
- +layout.svelte: added forest/mono/cyber to THEMES array (before light themes)
- profile/+page.svelte: same additions with i18n label functions
- messages/*.json: translated names in all 5 locales (en/fr/ru/pt/id)
2026-04-07 11:27:30 +05:00
root
95f45a5f13 fix: chapter list modal full-screen clip + add clear close button
All checks were successful
Release / Test backend (push) Successful in 45s
Release / Check ui (push) Successful in 2m2s
Release / Docker / caddy (push) Successful in 52s
Release / Docker / backend (push) Successful in 3m31s
Release / Docker / runner (push) Successful in 3m33s
Release / Upload source maps (push) Successful in 1m37s
Release / Docker / ui (push) Successful in 5m16s
Release / Gitea Release (push) Successful in 44s
Root cause: the chapter picker overlay (fixed inset-0) was rendered as a
descendant of the AudioPlayer's wrapping div, which is itself inside a
'rounded-b-lg overflow-hidden' container on the chapter page. Chrome clips
position:fixed descendants when a parent has overflow:hidden + border-radius,
so the overlay was constrained to the parent's bounding box instead of
covering the full viewport.

Fix: moved the {#if showChapterPanel} block from inside the standard player
div to a top-level sibling at the end of the component template. Svelte
components support multiple root nodes, so the overlay is now a sibling of
all player containers — no ancestor overflow-hidden can clip it.

Also replaced the subtle chevron-down close button with a clear X (×) button
in the top-right of the header, making it obvious how to dismiss the panel.
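In Svelte terms the move looks roughly like this (simplified sketch — class names and structure are illustrative, not the component's actual markup):

```svelte
<!-- Before: overlay nested inside the player div, clipped by an ancestor
     with overflow-hidden + rounded corners. After: overlay as a second
     root node — Svelte components may have multiple root nodes. -->
<div class="player rounded-b-lg">
  <!-- player toolbar and status UI -->
</div>

{#if showChapterPanel}
  <div class="fixed inset-0 z-50"><!-- full-viewport chapter picker --></div>
{/if}
```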
2026-04-07 11:00:58 +05:00
root
2ed37f78c7 fix: announce chapter reliability — timeout fallback + eager chapters sync
All checks were successful
Release / Test backend (push) Successful in 44s
Release / Check ui (push) Successful in 1m53s
Release / Docker / caddy (push) Successful in 44s
Release / Docker / backend (push) Successful in 2m37s
Release / Docker / runner (push) Successful in 2m33s
Release / Upload source maps (push) Successful in 1m29s
Release / Docker / ui (push) Successful in 2m30s
Release / Gitea Release (push) Successful in 32s
Two issues causing announce to silently fail or permanently block navigation:

1. No hard timeout fallback on speechSynthesis.speak():
   Chrome Android (and some desktop) silently drops utterances not triggered
   within a user-gesture window. If both onend and onerror fail to fire (a
   known browser bug), doNavigate() was never called and the chapter
   transition was permanently lost. Added an 8-second setTimeout fallback
   (safeNavigate) that forces navigation if the speech engine never resolves.
   safeNavigate is idempotent — guarded by a 'navigated' flag so it only
   fires once even if onend, onerror, and the timeout all fire.

2. audioStore.chapters only written inside startPlayback():
   The onended handler reads audioStore.chapters to build the utterance text
   (Chapter N — Title). If auto-next navigated to this chapter and the user
   never manually pressed play (startPlayback was never called), chapters
   held whatever the previous AudioPlayer had written — potentially stale or
   empty on a book switch. Added a reactive $effect that keeps chapters in
   sync whenever the prop changes, same pattern as nextChapter.
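The idempotent guard in point 1 can be sketched like this (names mirror the description above; the factory shape is an assumption):

```javascript
// safeNavigate fires doNavigate() exactly once, whether it is reached via
// utterance onend, onerror, or the 8-second hard-timeout fallback.
function makeSafeNavigate(doNavigate, timeoutMs = 8000) {
  let navigated = false;
  const safeNavigate = () => {
    if (navigated) return; // 'navigated' flag makes repeat calls no-ops
    navigated = true;
    doNavigate();
  };
  // Fallback: force navigation if the speech engine never resolves.
  const timer = setTimeout(safeNavigate, timeoutMs);
  return { safeNavigate, cancel: () => clearTimeout(timer) };
}
```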
2026-04-06 22:35:54 +05:00
root
963ecdd89b fix: auto-next transition deadlock and resume-at-end bug
All checks were successful
Release / Test backend (push) Successful in 47s
Release / Check ui (push) Successful in 1m38s
Release / Docker / caddy (push) Successful in 42s
Release / Docker / backend (push) Successful in 2m38s
Release / Docker / runner (push) Successful in 2m38s
Release / Upload source maps (push) Successful in 1m23s
Release / Docker / ui (push) Successful in 2m26s
Release / Gitea Release (push) Successful in 31s
Bug 1 — Auto-next not transitioning:
audioExpanded defaulted to false on the new chapter page because
audioStore.chapter still held the old chapter number when the page script
initialized. The $effect only opened the panel when isPlaying was already
true — a circular dependency (can't play without the panel, panel only opens
when playing). Fix: also set audioExpanded=true when autoStartChapter targets
this chapter, both in the initial $state and in the reactive $effect.

Bug 2 — Resume starts at the end:
onended called saveAudioTime() which captured currentTime≈duration and fired a
PATCH 2 seconds later (after navigation had already completed). Next visit to
that chapter restored the end-of-file position. Fix: in onended, cancel the
debounced timer (clearTimeout) and immediately PATCH audioTime=0 for the
finished chapter, so it always resumes from the beginning on re-visit.
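The Bug 2 fix amounts to the following (sketch only — the saver is injected here for clarity, whereas the real handler calls the PATCH endpoint directly):

```javascript
// On chapter end: drop the pending debounced save (which would persist
// currentTime ≈ duration) and immediately save position 0 instead.
function finishChapter(state, saveAudioTime) {
  clearTimeout(state.saveTimer); // cancel the debounced end-of-file save
  state.saveTimer = null;
  saveAudioTime(state.chapter, 0); // immediate PATCH audioTime=0
}
```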
2026-04-06 21:51:47 +05:00
root
12963342bb fix: update votedBooks state immediately on swipe so history drawer isn't empty
All checks were successful
Release / Test backend (push) Successful in 38s
Release / Check ui (push) Successful in 1m35s
Release / Docker / caddy (push) Successful in 41s
Release / Docker / backend (push) Successful in 2m32s
Release / Docker / runner (push) Successful in 2m42s
Release / Upload source maps (push) Successful in 1m33s
Release / Docker / ui (push) Successful in 2m25s
Release / Gitea Release (push) Successful in 29s
doAction() was fire-and-forgetting the POST but never updating the client-side
votedBooks array. History was only populated from SSR data.votedBooks (loaded
at page init), so any votes cast during the current session were invisible in
the drawer until a full page reload. Now we prepend/replace an entry in
votedBooks optimistically the moment a swipe action fires.
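The optimistic update reduces to an upsert on the client-side array (entry shape and field names are assumptions):

```javascript
// Prepend the new vote, replacing any earlier entry for the same book, so
// the history drawer reflects the current session without a reload.
function upsertVote(votedBooks, entry) {
  return [entry, ...votedBooks.filter((v) => v.bookId !== entry.bookId)];
}
```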
2026-04-06 21:45:29 +05:00
root
bdbec3ae16 feat: redesign discover page with bigger cards and prominent action buttons
All checks were successful
Release / Test backend (push) Successful in 1m12s
Release / Check ui (push) Successful in 2m35s
Release / Docker / caddy (push) Successful in 48s
Release / Docker / backend (push) Successful in 2m38s
Release / Docker / runner (push) Successful in 2m35s
Release / Upload source maps (push) Successful in 1m33s
Release / Docker / ui (push) Successful in 2m10s
Release / Gitea Release (push) Successful in 30s
- Remove tab switcher, move history behind a modal drawer (clock icon in header with badge count)
- Increase card aspect ratio from 3/4.2 to 3/4.6 for more cover real estate
- Replace 5 small icon-only buttons with 3 large labeled buttons (Skip / Read Now / Like)
- Read Now is solid blue as the center primary CTA; Skip and Like use tinted bg with colored border
- Swipe indicators are larger (text-2xl, border-[3px], bg tint) for better visibility
- Remove swipe hint text to reclaim vertical space
- Larger title text on card (text-2xl)
2026-04-06 21:36:28 +05:00
root
c98d43a503 chore: add announce_chapter field to user_settings pb-init script
- Added announce_chapter (bool) to the user_settings create block
- Added add_field migration line for existing installs
- Also backfilled missing user_settings fields in the create block
  (theme, locale, font_family, font_size were already migrated but
  absent from the create definition)
- Migrated live prod PocketBase (pb.libnovel.cc) — field confirmed present
2026-04-06 21:18:59 +05:00
root
1f83a7c05f feat: add chapter announcing setting to audio player
All checks were successful
Release / Test backend (push) Successful in 43s
Release / Check ui (push) Successful in 1m51s
Release / Docker / caddy (push) Successful in 41s
Release / Docker / backend (push) Successful in 2m24s
Release / Docker / runner (push) Successful in 2m49s
Release / Upload source maps (push) Successful in 1m25s
Release / Docker / ui (push) Successful in 2m12s
Release / Gitea Release (push) Successful in 30s
When enabled, the Web Speech API speaks the upcoming chapter number and
title (e.g. 'Chapter 12 — The Final Battle') between auto-next chapters,
giving an audible cue before the next narration begins.

- AudioStore.announceChapter (boolean, default false)
- PBUserSettings.announce_chapter persisted to PocketBase
- GET/PUT /api/settings includes announceChapter field
- +layout.server.ts loads + defaults the field
- +layout.svelte applies on load, saves in debounced PUT, and fires
  SpeechSynthesisUtterance in onended before navigating (falls back to
  immediate navigation if speechSynthesis is unavailable)
- ListeningMode: 'Announce' pill added to the Speed · Auto · Sleep row
2026-04-06 21:12:10 +05:00
root
93e9d88066 fix: add missing fmtBytes helper in image-gen page
All checks were successful
Release / Test backend (push) Successful in 41s
Release / Check ui (push) Successful in 1m45s
Release / Docker / caddy (push) Successful in 42s
Release / Docker / backend (push) Successful in 2m30s
Release / Docker / runner (push) Successful in 3m37s
Release / Upload source maps (push) Successful in 1m35s
Release / Docker / ui (push) Successful in 2m39s
Release / Gitea Release (push) Successful in 35s
Fixes CI type-check failure: fmtBytes was used on line 592 to format
the reference image file size but was never defined.
2026-04-06 20:37:02 +05:00
root
5b8987a191 fix: use theme tokens on catalogue scrape buttons
Some checks failed
Release / Test backend (push) Has been cancelled
Release / Check ui (push) Has been cancelled
Release / Docker / backend (push) Has been cancelled
Release / Docker / runner (push) Has been cancelled
Release / Upload source maps (push) Has been cancelled
Release / Docker / ui (push) Has been cancelled
Release / Docker / caddy (push) Has been cancelled
Release / Gitea Release (push) Has been cancelled
Replace hardcoded bg-amber-500/text-zinc-900 with bg-(--color-brand)/text-(--color-surface)
to match the rest of the UI's button palette (both grid and list views).
2026-04-06 20:36:25 +05:00
root
b6904bcb6e perf: cache admin job lists + targeted polling for audio/translation pages
Some checks failed
Release / Test backend (push) Successful in 41s
Release / Check ui (push) Failing after 34s
Release / Upload source maps (push) Has been skipped
Release / Docker / ui (push) Has been skipped
Release / Docker / caddy (push) Successful in 40s
Release / Docker / runner (push) Has been cancelled
Release / Gitea Release (push) Has been cancelled
Release / Docker / backend (push) Has been cancelled
- Add 30s Valkey cache to listScrapingTasks, listAudioJobs, listTranslationJobs
  (use listN(500) instead of unbounded listAll to cap at one request)
- Delete listAudioCache() — derive AudioCacheEntry[] from jobs in server load
- Add listBookSlugs() with 10min cache — replaces full listBooks() in translation load
- Add GET /api/admin/audio-jobs, /api/admin/translation-jobs, /api/admin/scrape-tasks
  (lightweight polling endpoints backed by the Valkey cache)
- Replace invalidateAll() interval polling in audio+translation pages with
  targeted fetch to the new endpoints (avoids re-running full server load)
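The cache-aside shape behind those list functions looks roughly like this (the store API is a stand-in for the Valkey client; names are assumptions):

```javascript
// Return the cached JSON list if present; otherwise fetch, cache with a
// TTL (e.g. 30 s for job lists, 10 min for book slugs), and return it.
async function cachedList(store, key, ttlSec, fetchList) {
  const hit = await store.get(key);
  if (hit !== undefined && hit !== null) return JSON.parse(hit);
  const fresh = await fetchList();
  await store.set(key, JSON.stringify(fresh), ttlSec);
  return fresh;
}
```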
2026-04-06 20:33:46 +05:00
root
75e6a870d3 feat: async image-gen and description jobs with review panels
Some checks failed
Release / Test backend (push) Successful in 43s
Release / Check ui (push) Failing after 38s
Release / Upload source maps (push) Has been skipped
Release / Docker / ui (push) Has been skipped
Release / Docker / caddy (push) Successful in 44s
Release / Docker / backend (push) Successful in 2m44s
Release / Docker / runner (push) Successful in 2m45s
Release / Gitea Release (push) Has been skipped
- Add POST /api/admin/image-gen/async: fire-and-forget image generation
  that stores the result (base64) in an ai_job payload and returns 202
  immediately — no more 60-120s blocking on FLUX models
- Add POST /api/admin/text-gen/description/async: same pattern for book
  description generation
- Register both new routes in server.go
- Rewrite image-gen admin page to use the async path (submit → redirect
  to AI Jobs for monitoring)
- Extend ai-jobs page with Review panels for image-gen jobs (show image,
  Save as cover / Download / Discard) and description jobs (diff old vs
  new, editable textarea, Apply / Discard)
2026-04-06 19:46:59 +05:00
root
5098acea20 feat: universal search modal
All checks were successful
Release / Test backend (push) Successful in 41s
Release / Check ui (push) Successful in 1m42s
Release / Docker / caddy (push) Successful in 56s
Release / Docker / backend (push) Successful in 2m35s
Release / Docker / runner (push) Successful in 3m37s
Release / Upload source maps (push) Successful in 1m29s
Release / Docker / ui (push) Successful in 2m33s
Release / Gitea Release (push) Successful in 37s
- New SearchModal.svelte: full-screen modal with blurred backdrop
  - Live results as you type (300ms debounce, min 2 chars)
  - Local vs Novelfire badge on each result card (cover + title + author +
    genres + chapter count)
  - Local/remote counts shown in result header
  - 'See all in catalogue' shortcut button + footer repeat link
  - Recent searches (localStorage, max 8, per-item remove + clear all)
  - Genre suggestion chips shown when query is empty or no results found
  - Keyboard navigation: ArrowUp/Down to select, Enter to open, Escape to close
  - Body scroll lock while open
- +layout.svelte:
  - Imports SearchModal, adds searchOpen state
  - Search icon button in nav header (hidden on chapter reader pages)
  - Global keyboard shortcut: '/' or Cmd/Ctrl+K opens modal
  - Shortcut ignored when focused in input/textarea or on chapter pages
  - Modal not shown while ListeningMode is open
  - Auto-closes on route change
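The live-results behaviour rests on a standard debounce (minimal sketch, not the modal's actual helper):

```javascript
// Delay fn until the caller has been quiet for `ms` milliseconds; each new
// call cancels the previous pending one — e.g. 300 ms after the last keystroke.
function debounce(fn, ms = 300) {
  let timer;
  return (...args) => {
    clearTimeout(timer);
    timer = setTimeout(() => fn(...args), ms);
  };
}
```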
2026-04-06 18:41:32 +05:00
root
3e4d7b54d7 feat: merge page nav into focus mode floating pill
All checks were successful
Release / Test backend (push) Successful in 44s
Release / Check ui (push) Successful in 1m40s
Release / Docker / caddy (push) Successful in 40s
Release / Docker / backend (push) Successful in 2m38s
Release / Docker / runner (push) Successful in 2m43s
Release / Upload source maps (push) Successful in 1m33s
Release / Docker / ui (push) Successful in 2m18s
Release / Gitea Release (push) Successful in 35s
In paginated + focus mode there were two separate UI elements: an inline
Prev/counter/Next bar and a floating chapter-nav pill. Merged into one:

- ‹ Ch.N | ‹ (page) N/M (page) › | × Exit focus | Ch.N ›
- Inline page bar + hint text are now hidden when focusMode is active
- Floating pill grows to include page controls only in paginated mode;
  scroll mode pill is unchanged (just chapter nav + exit)
- Added max-w-[calc(100vw-2rem)] so pill never overflows on small screens
2026-04-06 18:23:25 +05:00
Admin
495f386b4f fix(player): standard skip icons, next/prev start playback, fix auto-next stale presign
All checks were successful
Release / Test backend (push) Successful in 39s
Release / Check ui (push) Successful in 1m40s
Release / Docker / caddy (push) Successful in 48s
Release / Docker / backend (push) Successful in 2m34s
Release / Docker / runner (push) Successful in 2m43s
Release / Upload source maps (push) Successful in 1m44s
Release / Docker / ui (push) Successful in 2m19s
Release / Gitea Release (push) Successful in 34s
- Replace double-triangle icons with proper skip-prev (|◄) and skip-next (►|) icons
- Convert prev/next chapter <a> links to buttons calling playChapter() so navigation auto-starts audio
- Fix auto-next silent failure: fast path A now re-presigns instead of reusing the cached URL, preventing stale/expired MinIO presigned URL from silently failing on the audio element
2026-04-06 17:22:42 +05:00
root
bb61a4654a feat: portrait cover card + blurred bg in ListeningMode
Some checks failed
Release / Check ui (push) Has been cancelled
Release / Docker / backend (push) Has been cancelled
Release / Docker / runner (push) Has been cancelled
Release / Upload source maps (push) Has been cancelled
Release / Docker / ui (push) Has been cancelled
Release / Docker / caddy (push) Has been cancelled
Release / Gitea Release (push) Has been cancelled
Release / Test backend (push) Has been cancelled
Replace the full-bleed landscape hero with the Apple Music / Spotify
layout pattern:
- Full-screen blurred+darkened cover as atmospheric background layer
- Centered portrait 2/3 cover card (38svh tall, rounded-2xl, shadow-2xl)
- Track info (chapter label, title, book name) moved below cover card
- Radial vignette overlay for depth
- No dead empty space between art and controls
- Header bar and controls area lifted to z-index 2 above the bg layers
2026-04-06 17:22:31 +05:00
root
1cdc7275f8 fix: homepage overlay blocker and carousel auto-advance
Some checks failed
Release / Test backend (push) Successful in 41s
Release / Check ui (push) Successful in 1m43s
Release / Docker / caddy (push) Successful in 42s
Release / Docker / backend (push) Successful in 2m31s
Release / Docker / runner (push) Successful in 2m36s
Release / Docker / ui (push) Has been cancelled
Release / Gitea Release (push) Has been cancelled
Release / Upload source maps (push) Has been cancelled
CI / Backend (push) Successful in 43s
CI / UI (push) Successful in 49s
- Add pointer-events:none to ListeningMode fly-transition wrapper div in
  +layout.svelte so the exiting animation div never blocks page interaction
- Add pointer-events:auto to ListeningMode root div so it still captures
  all touch/click events correctly despite the parent being pointer-events:none
- Rewrite carousel auto-advance using $effect + autoAdvanceSeed pattern:
  replaces the stale-closure setInterval in resetAutoAdvance() with a
  reactive $effect that owns the interval and re-starts cleanly on manual
  navigation by bumping a seed counter
2026-04-06 16:58:23 +05:00
root
9d925382b3 fix(player): register touchmove with passive:false via $effect
All checks were successful
Release / Test backend (push) Successful in 38s
Release / Check ui (push) Successful in 1m34s
Release / Docker / caddy (push) Successful in 41s
Release / Docker / backend (push) Successful in 2m36s
Release / Docker / runner (push) Successful in 2m34s
Release / Upload source maps (push) Successful in 1m35s
Release / Docker / ui (push) Successful in 3m18s
Release / Gitea Release (push) Successful in 1m27s
Svelte 5 has no |nonpassive modifier. Register the touchmove listener
manually so e.preventDefault() can suppress page scroll during the
pull-down gesture.
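The manual registration looks roughly like this (a sketch — element and handler names are assumptions):

```svelte
<script>
  let overlay; // bound root element of the pull-down surface

  $effect(() => {
    const onTouchMove = (e) => {
      e.preventDefault(); // allowed only because passive is false
      // ...track the drag offset here
    };
    overlay.addEventListener('touchmove', onTouchMove, { passive: false });
    return () => overlay.removeEventListener('touchmove', onTouchMove);
  });
</script>

<div bind:this={overlay}><!-- ListeningMode content --></div>
```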
2026-04-06 16:32:25 +05:00
root
718929e9cd feat(player): pull-down-to-dismiss gesture on ListeningMode
Some checks failed
Release / Test backend (push) Successful in 45s
Release / Check ui (push) Failing after 31s
Release / Upload source maps (push) Has been skipped
Release / Docker / ui (push) Has been skipped
Release / Docker / caddy (push) Successful in 44s
Release / Docker / backend (push) Successful in 2m33s
Release / Docker / runner (push) Successful in 2m30s
Release / Gitea Release (push) Has been skipped
- Slide-up transition on open (fly from bottom, 320ms)
- Drag-down on the overlay follows the finger in real time with no
  transition; on release springs back (0.32s cubic-bezier) if drag
  < 130px and velocity < 0.4px/ms, otherwise slides off-screen and
  calls onclose after 220ms
- Opacity fades as the overlay is pulled down (fully transparent at 500px)
- Touch guard: gesture does not activate if touch starts inside an
  .overflow-y-auto element (chapter/voice lists) or while a modal is open
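The release decision reduces to the two thresholds above (sketch; the function name is an assumption):

```javascript
// Dismiss when the drag travelled >= 130 px or the release velocity is
// >= 0.4 px/ms; otherwise the overlay springs back into place.
function shouldDismiss(dragPx, velocityPxPerMs) {
  return dragPx >= 130 || velocityPxPerMs >= 0.4;
}
```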
2026-04-06 16:16:09 +05:00
root
e8870a11da feat(player): redesign ListeningMode UI
All checks were successful
Release / Test backend (push) Successful in 38s
Release / Check ui (push) Successful in 1m39s
Release / Docker / caddy (push) Successful in 41s
Release / Docker / backend (push) Successful in 2m57s
Release / Docker / runner (push) Successful in 2m44s
Release / Upload source maps (push) Successful in 1m35s
Release / Docker / ui (push) Successful in 2m15s
Release / Gitea Release (push) Successful in 40s
- Full-bleed cover fills top ~52% of screen with top+bottom gradient
  overlays for header and track info legibility; eliminates the large
  dead space between cover and seek bar
- Chapter number shown as brand-coloured label above the chapter title
- Remaining time (−m:ss) displayed in the centre of the seek bar row
- Transport row uses justify-between; chapter-skip buttons are smaller
  (w-5, muted/60 opacity) vs time-skip (w-7, full muted) to clearly
  distinguish secondary from primary seek controls
- Speed / Auto-next / Sleep now sit on a single tidy row — no more
  wrapping or mixed visual styles between the segmented speed control
  and the two pill buttons
- Header buttons use frosted-glass style (bg-black/25 backdrop-blur)
  so they remain legible over the cover image
2026-04-06 15:51:06 +05:00
root
b70fed5cd7 fix(reader): clean up focus mode footer
All checks were successful
Release / Test backend (push) Successful in 39s
Release / Check ui (push) Successful in 1m35s
Release / Docker / caddy (push) Successful in 41s
Release / Docker / backend (push) Successful in 2m45s
Release / Docker / runner (push) Successful in 2m32s
Release / Upload source maps (push) Successful in 1m31s
Release / Docker / ui (push) Successful in 2m48s
Release / Gitea Release (push) Successful in 45s
- Hide the global site footer on chapter pages (not useful mid-reading)
- Merge the three separate floating nav pills into a single unified pill
  with dividers, removing the visual clutter of multiple bordered bubbles
- Float the pill lower (bottom-6) when the mini-player is not active
2026-04-06 15:42:45 +05:00
root
5dd9dd2ebb feat(nav): make book title in chapter header a link back to the book page
Some checks failed
Release / Check ui (push) Has been cancelled
Release / Docker / backend (push) Has been cancelled
Release / Docker / runner (push) Has been cancelled
Release / Upload source maps (push) Has been cancelled
Release / Docker / ui (push) Has been cancelled
Release / Docker / caddy (push) Has been cancelled
Release / Gitea Release (push) Has been cancelled
Release / Test backend (push) Has been cancelled
2026-04-06 15:38:18 +05:00
root
1c5c25e5dd feat(reader): add lines-per-page setting for paginated mode
Some checks failed
Release / Test backend (push) Successful in 58s
Release / Docker / backend (push) Has been cancelled
Release / Docker / runner (push) Has been cancelled
Release / Upload source maps (push) Has been cancelled
Release / Docker / ui (push) Has been cancelled
Release / Docker / caddy (push) Has been cancelled
Release / Gitea Release (push) Has been cancelled
Release / Check ui (push) Has been cancelled
Adds a PageLines preference (Few/Normal/Many) that adjusts the paginated
container height via a rem offset on the existing calc(). The setting row
appears in Reader Settings → Layout only when Pages mode is active, matching
the style of all other setting rows. Persisted in localStorage (reader_layout_v1).
2026-04-06 15:37:29 +05:00
root
5177320418 feat(player): scroll chapter list to current chapter on open
Some checks failed
Release / Test backend (push) Successful in 39s
Release / Check ui (push) Successful in 1m41s
Release / Docker / caddy (push) Successful in 52s
Release / Docker / backend (push) Successful in 2m42s
Release / Docker / runner (push) Successful in 2m35s
Release / Upload source maps (push) Successful in 1m31s
Release / Docker / ui (push) Has been cancelled
Release / Gitea Release (push) Has been cancelled
Use a Svelte action on each chapter button that calls scrollIntoView with
behavior:'instant' so the list opens centred on the active chapter with
no visible scroll animation.
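As a Svelte action this is only a few lines (sketch; names are assumptions):

```svelte
<script>
  // Action: when attached to the active chapter's row, centre it in the
  // list instantly — no scroll animation on open.
  function scrollIfActive(node, isActive) {
    if (isActive) node.scrollIntoView({ behavior: 'instant', block: 'center' });
  }
</script>

<button use:scrollIfActive={chapter.number === currentChapter}>…</button>
```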
2026-04-06 15:31:24 +05:00
root
836c9855af fix(player): use untrack() in toggleRequest effect to prevent play/pause loop
All checks were successful
Release / Test backend (push) Successful in 37s
Release / Check ui (push) Successful in 1m40s
Release / Docker / caddy (push) Successful in 46s
Release / Docker / backend (push) Successful in 2m29s
Release / Docker / runner (push) Successful in 2m43s
Release / Upload source maps (push) Successful in 1m31s
Release / Docker / ui (push) Successful in 2m21s
Release / Gitea Release (push) Successful in 30s
Reading audioStore.isPlaying inside the toggleRequest $effect caused Svelte 5
to subscribe to it, so the effect re-ran on every isPlaying change. When
resuming from ListeningMode, play() would fire onplay → isPlaying=true →
effect re-ran → called pause() → onpause → isPlaying=false → effect re-ran
→ called play() → infinite loop. Wrapping the isPlaying read in untrack()
limits the effect's subscription to toggleRequest only.
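The fix is the standard Svelte 5 untrack() pattern (illustrative names, not the component's actual code):

```svelte
<script>
  import { untrack } from 'svelte';

  $effect(() => {
    const req = audioStore.toggleRequest;                // tracked: effect re-runs on this
    if (!req) return;
    const playing = untrack(() => audioStore.isPlaying); // read without subscribing
    playing ? audio.pause() : audio.play();
  });
</script>
```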
2026-04-06 14:57:27 +05:00
Admin
5c2c9b1b67 feat(home): hero carousel with auto-advance, arrows, and dot indicators
All checks were successful
Release / Test backend (push) Successful in 39s
Release / Check ui (push) Successful in 1m37s
Release / Docker / caddy (push) Successful in 39s
Release / Docker / backend (push) Successful in 2m32s
Release / Docker / runner (push) Successful in 2m28s
Release / Upload source maps (push) Successful in 1m30s
Release / Docker / ui (push) Successful in 2m14s
Release / Gitea Release (push) Successful in 32s
Cycles through all in-progress books every 6s; prev/next arrow buttons
overlay the card edges; active dot stretches to a pill; cover fades in
on slide change via {#key} + animate-fade-in; shelf excludes the current
hero to avoid duplication.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-06 00:16:39 +05:00
Admin
79b3de3e8d fix(player): fill ListeningMode empty space + global audio context + chapter picker from mini-bar
Some checks failed
Release / Test backend (push) Successful in 49s
Release / Docker / backend (push) Has been cancelled
Release / Docker / runner (push) Has been cancelled
Release / Upload source maps (push) Has been cancelled
Release / Docker / ui (push) Has been cancelled
Release / Docker / caddy (push) Has been cancelled
Release / Gitea Release (push) Has been cancelled
Release / Check ui (push) Has been cancelled
- justify-between on scrollable body so cover art sits top, controls
  sit bottom — no more half-empty screen on tall phones
- Move ListeningMode outside {#if audioStore.active} so pausing never
  tears down the overlay and loses resume context
- Mini-bar time/track click now opens ListeningMode with chapter picker
  pre-shown (same view as the Chapters button inside ListeningMode)
- Remove the old chapterDrawerOpen mini-bar drawer (replaced by above)
- Add openChapters prop to ListeningMode for pre-opening chapter modal
2026-04-06 00:15:12 +05:00
Admin
5804cd629a feat(reader): convert settings bottom sheet to full-screen overlay
Replaces the bottom drawer + backdrop with a fixed full-screen overlay
matching the voice/chapter picker style in ListeningMode — chevron header,
tab bar with brand-color active state, scrollable content. Escape closes it.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-06 00:12:38 +05:00
Admin
b130ba4e1b feat(player): refactor chapter picker to full-screen overlay like voice picker
All checks were successful
Release / Test backend (push) Successful in 41s
Release / Check ui (push) Successful in 1m41s
Release / Docker / caddy (push) Successful in 50s
Release / Docker / backend (push) Successful in 2m37s
Release / Docker / runner (push) Successful in 2m36s
Release / Upload source maps (push) Successful in 1m32s
Release / Docker / ui (push) Successful in 2m18s
Release / Gitea Release (push) Successful in 40s
Both ListeningMode and the standard AudioPlayer now open chapters via a
full-screen overlay (same UX as the voice selector) — header + search bar +
rows with circular chapter-number badge, title, and active indicator.
Removes the cramped inline card from the bottom of ListeningMode.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-05 23:42:49 +05:00
Admin
cc1f6b87e4 fix(player): show '--:--' for unknown duration instead of '0:00'
Prevents the silly "2:01 / 0:00" display when audio src is being swapped
from preview to full audio and duration hasn't loaded yet.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-05 23:26:33 +05:00
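The guard described above can be sketched as a small formatter for the duration slot; the function name and exact checks are assumptions, not the repo's code:

```typescript
// Hypothetical duration formatter: while the audio src is being swapped,
// duration is NaN or 0, so show "--:--" instead of a misleading "0:00".
function formatTime(seconds: number): string {
  if (!Number.isFinite(seconds) || seconds <= 0) return "--:--";
  const m = Math.floor(seconds / 60);
  const s = Math.floor(seconds % 60);
  return `${m}:${s.toString().padStart(2, "0")}`;
}
```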
Admin
8279bd5caa fix(reader): clean chapter title display and declutter audio panel
- Strip leading digit prefix (e.g. "6Chapter 6 → Chapter 6") and
  content after first newline (scraped date artifacts) from chapter titles
- Add "CHAPTER N" eyebrow label above the h1 for clear hierarchy
- Show date_label as small muted text in the meta row
- Remove double-border / mt-6 gap from standard AudioPlayer inside the
  chapter page's collapsible panel (was rendering two nested boxes)
- Remove redundant "Audio Narration" label (toggle already says "Listen")

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-05 23:18:17 +05:00
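The title cleanup above might look roughly like this; the helper name and the exact prefix rule are illustrative assumptions:

```typescript
// Drop a chapter-number prefix fused onto the title ("6Chapter 6" -> "Chapter 6")
// and anything after the first newline (scraped date artifacts).
function cleanChapterTitle(raw: string, chapterNumber: number): string {
  let title = raw.split("\n")[0].trim();
  const prefix = String(chapterNumber);
  // Only strip when the remainder doesn't start with another digit,
  // so a real title like "12 Days" (chapter 1) survives intact.
  if (title.startsWith(prefix) && !/^\d/.test(title.slice(prefix.length))) {
    title = title.slice(prefix.length).trim();
  }
  return title;
}
```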
Admin
59794e3694 fix(sessions): remove IP from device fingerprint to prevent duplicate sessions on network change
All checks were successful
Release / Test backend (push) Successful in 37s
Release / Check ui (push) Successful in 1m43s
Release / Docker / caddy (push) Successful in 44s
Release / Docker / backend (push) Successful in 2m40s
Release / Docker / runner (push) Successful in 4m16s
Release / Upload source maps (push) Successful in 1m41s
Release / Docker / ui (push) Successful in 2m47s
Release / Gitea Release (push) Successful in 40s
- deviceFingerprint now hashes only User-Agent (not UA+IP) so switching
  networks (VPN, mobile data, wifi) no longer creates a new session row
- On re-login with same device, also refresh the stored IP field so the
  sessions page shows the current network address
- feat(library): bulk remove and bulk shelf-change actions on /books
  Long-press any card to enter selection mode; sticky action bar with
  Move to shelf dropdown and Remove button; POST /api/library/bulk-remove
  and POST /api/library/bulk-shelf endpoints
- fix(catalogue): make Scrape button visible with solid amber-500 fill
  and dark text instead of low-opacity ghost style that blended into card
2026-04-05 23:12:31 +05:00
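The fingerprint change can be sketched as hashing the User-Agent alone; the function name and hash algorithm here are assumptions:

```typescript
import { createHash } from "node:crypto";

// Hash only the User-Agent (not UA+IP), so the same device keeps a single
// session row when its network address changes (VPN, mobile data, wifi).
function deviceFingerprint(userAgent: string): string {
  return createHash("sha256").update(userAgent).digest("hex");
}
```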
Admin
150eb2a2af fix(book): move description to full-width section below header
The description was crammed into the narrow right column beside the cover,
creating a wall of text on mobile. Now it renders full-width below the
cover+title row with better line-height, 5-line collapse, gradient fade,
and a chevron-annotated show-more button.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-05 23:11:49 +05:00
Admin
a0404cea57 feat(home): add streak widget, trending, genre recs, completed shelf, audio quick-play
- Reading streak + books-in-progress mini-widget (derived from progress timestamps)
- "N chapters left" badge on continue-reading shelf cards
- Audio listen button on hero card and hover-overlay on shelf cards (autoStartChapter + goto)
- Completed shelf section for books where chapter >= total_chapters
- Trending Now section (books sorted by ranking field, 15-min cache)
- "Because you read [Genre]" recommendations (genre-matched, excludes user's books, 10-min cache)
- Both new sections are hideable via the existing show/hide mechanism
- getTrendingBooks / getRecommendedBooks added to pocketbase.ts
- Cache invalidation for trending/recs added to invalidateBooksCache

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-05 23:08:36 +05:00
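One plausible way to derive a streak from progress timestamps, as the widget above does — count consecutive UTC days with activity ending today (or yesterday, if today has none yet). Purely illustrative; the repo's definition may differ:

```typescript
// Count consecutive days with at least one progress update.
function readingStreak(timestamps: Date[], today: Date): number {
  const days = new Set(timestamps.map((t) => t.toISOString().slice(0, 10)));
  let streak = 0;
  const cursor = new Date(today);
  // Let the streak survive if the user simply hasn't read yet today.
  if (!days.has(cursor.toISOString().slice(0, 10))) {
    cursor.setUTCDate(cursor.getUTCDate() - 1);
  }
  while (days.has(cursor.toISOString().slice(0, 10))) {
    streak++;
    cursor.setUTCDate(cursor.getUTCDate() - 1);
  }
  return streak;
}
```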
Admin
45a0190d75 feat: async chapter-names AI generation with review & apply
All checks were successful
Release / Test backend (push) Successful in 40s
Release / Check ui (push) Successful in 1m38s
Release / Docker / caddy (push) Successful in 41s
Release / Docker / backend (push) Successful in 3m15s
Release / Docker / runner (push) Successful in 4m34s
Release / Upload source maps (push) Successful in 1m33s
Release / Docker / ui (push) Successful in 3m39s
Release / Gitea Release (push) Successful in 40s
- Add POST /api/admin/text-gen/chapter-names/async backend endpoint: fire-and-forget,
  returns job_id immediately (HTTP 202), runs batch generation in background goroutine,
  persists proposed titles in ai_job payload when done
- Register new route in server.go alongside existing SSE endpoint (backward compat)
- Add SvelteKit proxy at /api/admin/text-gen/chapter-names/async
- Add SvelteKit proxy for GET /api/admin/ai-jobs/[id] (job detail with payload)
- Add Review button on ai-jobs page for done chapter-names jobs; inline panel shows
  editable title table (old to new) with Apply All button that POSTs to chapter-names/apply
2026-04-05 22:53:47 +05:00
Admin
1abb4cd714 feat(player): CF AI preview/swap + fix PB token expiry + local build time
All checks were successful
Release / Test backend (push) Successful in 42s
Release / Check ui (push) Successful in 1m44s
Release / Docker / caddy (push) Successful in 41s
Release / Docker / backend (push) Successful in 2m42s
Release / Docker / runner (push) Successful in 2m56s
Release / Upload source maps (push) Successful in 1m30s
Release / Docker / ui (push) Successful in 2m42s
Release / Gitea Release (push) Successful in 1m14s
- Backend: add GET /api/audio-preview/{slug}/{n} — generates first ~1800-char
  chunk via CF AI so playback starts immediately; full chapter cached in MinIO
- Frontend: replace CF AI spinner with preview blob URL + background swap to
  full presigned URL when runner finishes, preserving currentTime
- AudioPlayer: isPreview state + 'preview' badge in mini-bar during swap
- pocketbase.ts: fix 403 on stale token — reduce TTL to 50 min + retry once
  on 401/403 with forced re-auth (was cached 12 h, PB tokens expire in 1 h)
- Footer build time now rendered in user's local timezone via toLocaleString()
2026-04-05 22:12:22 +05:00
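The retry-once behaviour on 401/403 can be sketched as a generic wrapper; the name `withAuthRetry` and the error shape are assumptions:

```typescript
// On a 401/403 (stale PocketBase token), force a re-auth and repeat the
// call exactly once; a second failure propagates to the caller.
async function withAuthRetry<T>(
  call: () => Promise<T>,
  reauth: () => Promise<void>,
): Promise<T> {
  try {
    return await call();
  } catch (err: any) {
    if (err?.status === 401 || err?.status === 403) {
      await reauth();
      return await call();
    }
    throw err;
  }
}
```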
Admin
a308672317 fix(cache): reduce Valkey connectTimeout to 1.5s to avoid 10s hang
When Valkey is unreachable, ioredis was holding cache.get() calls in
the offline queue for the default 10s connectTimeout before failing.
This caused admin/image-gen and text-gen pages to stall on every load.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-05 22:04:05 +05:00
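An options fragment of the kind the fix above implies, passed to `new Redis(options)`; the exact option set in the repo may differ:

```typescript
// ioredis client options: fail the connection attempt after 1.5 s instead
// of the 10 s default, so cache.get() calls aren't held in the offline
// queue while Valkey is unreachable.
const valkeyOptions = {
  connectTimeout: 1500, // ms; ioredis default is 10_000
};
```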
Admin
5d7c3b42fa fix(player): move speed and auto-next controls out of mini-player bar
Speed and auto-next are already available in the full listening mode
overlay — no need to clutter the compact bottom bar with them.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-05 21:58:09 +05:00
Admin
45f5c51da6 fix(ci): strip 'build' from .dockerignore before docker-ui build
All checks were successful
Release / Test backend (push) Successful in 41s
Release / Check ui (push) Successful in 1m41s
Release / Docker / caddy (push) Successful in 41s
Release / Docker / backend (push) Successful in 3m23s
Release / Docker / runner (push) Successful in 2m53s
Release / Upload source maps (push) Successful in 1m27s
Release / Docker / ui (push) Successful in 2m32s
Release / Gitea Release (push) Successful in 42s
When PREBUILT=1 the pre-built artifact is downloaded into ui/build/ but
.dockerignore excludes 'build', so Docker never sees it and /app/build
doesn't exist in the builder stage — causing the runtime COPY to fail.

Fix: rewrite ui/.dockerignore on the CI runner (grep -v '^build$') so the
pre-built directory is included in the Docker context.

Also in this commit:
- book page: gate EPUB download on isPro (UI upsell + server 403 guard)
- book page: chapter names default pattern changed to '{scene}'
2026-04-05 21:26:05 +05:00
Admin
55df88c3e5 fix(player): remove duplicate inline player and declutter bottom bar
Some checks failed
Release / Test backend (push) Successful in 37s
Release / Check ui (push) Successful in 1m47s
Release / Docker / caddy (push) Successful in 50s
Release / Docker / backend (push) Successful in 3m11s
Release / Docker / runner (push) Successful in 2m42s
Release / Upload source maps (push) Successful in 1m30s
Release / Docker / ui (push) Failing after 1m35s
Release / Gitea Release (push) Has been skipped
When the mini-player is active on the current chapter, the collapsible
'Listen to this chapter' panel now shows a brief note instead of rendering
a full second AudioPlayer.

Bottom bar track info column: removed chapter title and book title lines
so the time display fits on one line without crowding the controls.
2026-04-05 20:31:01 +05:00
Admin
eb137fdbf5 fix: source maps — ship pre-built artifact with injected debug IDs to Docker
Some checks failed
Release / Test backend (push) Successful in 46s
Release / Check ui (push) Successful in 1m38s
Release / Docker / caddy (push) Successful in 49s
Release / Docker / backend (push) Successful in 2m54s
Release / Docker / runner (push) Successful in 2m52s
Release / Upload source maps (push) Successful in 1m39s
Release / Docker / ui (push) Failing after 1m34s
Release / Gitea Release (push) Has been skipped
The docker-ui job was rebuilding the UI from scratch inside Docker, producing
chunk hashes and JS files that didn't match the source maps uploaded to
GlitchTip, causing all stack traces to appear minified.

Fix:
- ui/Dockerfile: add PREBUILT=1 ARG; skip npm run build when set
- release.yaml upload-sourcemaps: re-upload artifact after sentry-cli inject
- release.yaml docker-ui: download injected artifact into ui/build/ and pass
  PREBUILT=1 so Docker reuses the exact same JS files whose debug IDs are in
  GlitchTip
2026-04-05 20:25:27 +05:00
Admin
385c9cd8f2 feat(reader): voice modal with search, chapter search, comments collapse, audio stream fix
All checks were successful
Release / Test backend (push) Successful in 46s
Release / Check ui (push) Successful in 1m41s
Release / Docker / caddy (push) Successful in 44s
Release / Docker / backend (push) Successful in 4m15s
Release / Docker / runner (push) Successful in 4m28s
Release / Upload source maps (push) Successful in 53s
Release / Docker / ui (push) Successful in 7m30s
Release / Gitea Release (push) Successful in 49s
- ListeningMode: replace inline voice dropdown with full-screen modal + voice search
- ListeningMode: add chapter search box (shown when > 6 chapters) + click-to-play via autoStartChapter
- CommentsSection: collapse by default when empty, auto-expand once comments load
- AudioPlayer: always use streaming endpoint for Kokoro/PocketTTS (backend deduplicates via AudioExists)
2026-04-05 20:20:19 +05:00
Admin
e3bb19892c feat: add AI Jobs admin page, cache model lists, improve image-gen 502 UX
All checks were successful
Release / Test backend (push) Successful in 40s
Release / Check ui (push) Successful in 1m36s
Release / Docker / caddy (push) Successful in 45s
Release / Docker / backend (push) Successful in 4m27s
Release / Docker / runner (push) Successful in 3m32s
Release / Upload source maps (push) Successful in 44s
Release / Docker / ui (push) Successful in 2m46s
Release / Gitea Release (push) Successful in 58s
- Add /admin/ai-jobs page with live-polling jobs table, status badges, progress bars, and cancel action
- Add listAIJobs() helper in pocketbase.ts; AIJob type
- Add POST /api/admin/ai-jobs/[id]/cancel SvelteKit proxy
- Add ai-jobs link to admin sidebar nav (all 5 locales)
- Cache image-gen and text-gen model lists in Valkey (10 min TTL) via scraper.ts helpers
- Cache changelog Gitea API response (5 min TTL)
- Improve 502/504 error message on image-gen page with CF AI timeout hint
- Show CF AI timeout warning in advanced options when a Cloudflare/FLUX model is selected
2026-04-05 20:07:54 +05:00
Admin
6ca704ec9a fix: use upload/download-artifact@v3 for Gitea GHES compatibility
All checks were successful
Release / Test backend (push) Successful in 38s
Release / Check ui (push) Successful in 1m40s
Release / Docker / caddy (push) Successful in 38s
Release / Docker / backend (push) Successful in 5m3s
Release / Docker / runner (push) Successful in 3m41s
Release / Upload source maps (push) Successful in 1m44s
Release / Docker / ui (push) Successful in 4m8s
Release / Gitea Release (push) Successful in 2m42s
2026-04-05 18:15:43 +05:00
Admin
2bdb5e29af perf: reuse UI build artifact for source map upload in CI
Some checks failed
Release / Test backend (push) Successful in 44s
Release / Check ui (push) Failing after 51s
Release / Upload source maps (push) Has been skipped
Release / Docker / ui (push) Has been skipped
Release / Docker / caddy (push) Successful in 52s
Release / Docker / backend (push) Successful in 4m28s
Release / Docker / runner (push) Successful in 4m56s
Release / Gitea Release (push) Has been skipped
Pass build output from check-ui to upload-sourcemaps via artifact
instead of rebuilding from scratch. Saves ~4-5 min (npm ci + npm run build)
from the upload-sourcemaps job.
2026-04-05 18:01:00 +05:00
Admin
222627a18c refactor: clean up chapter reader UI — nav, audio, lang switcher, bottom CTA
- Consolidate top nav into a single row: ← back | ← → chapter arrows | settings gear
  Removes the duplicate prev/next buttons that appeared at both top and bottom
- Language switcher moved inline into chapter meta line (after word count);
  lock icons made smaller and less distracting for free users; removes the
  separate "Upgrade to Pro" text link
- Audio player wrapped in a collapsible "Listen to this chapter" panel;
  auto-expands if audio is already playing for the chapter, collapsed otherwise
  so content is immediately visible on page load
- Bottom nav redesigned: next chapter is a full-width card CTA with hover
  accent, previous is a small secondary text link — clear visual hierarchy
- Remove floating gear button (settings now triggered from top nav settings icon)

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-05 17:57:44 +05:00
Admin
0ae71c62f9 fix: switch CI source map upload from glitchtip-cli to sentry-cli
All checks were successful
Release / Test backend (push) Successful in 1m9s
Release / Check ui (push) Successful in 51s
Release / Docker / caddy (push) Successful in 46s
Release / Docker / backend (push) Successful in 4m30s
Release / Docker / runner (push) Successful in 3m54s
Release / Upload source maps (push) Successful in 53s
Release / Docker / ui (push) Successful in 2m57s
Release / Gitea Release (push) Successful in 57s
glitchtip-cli v0.1.0 uploads chunks but never calls the assemble endpoint,
leaving releases with 0 files. sentry-cli correctly calls assemble after
chunk upload — confirmed working in manual test (2.5.84-test3 = 211 files).
2026-04-05 17:52:42 +05:00
Admin
d0c95889ca fix: persist GlitchTip GzipChunk patch and prune old releases in CI
All checks were successful
Release / Test backend (push) Successful in 45s
Release / Check ui (push) Successful in 50s
Release / Docker / caddy (push) Successful in 43s
Release / Docker / backend (push) Successful in 2m46s
Release / Docker / runner (push) Successful in 2m43s
Release / Upload source maps (push) Successful in 4m36s
Release / Docker / ui (push) Successful in 4m38s
Release / Gitea Release (push) Successful in 4m46s
- Add homelab/glitchtip/files_api.py with GzipChunk fallback for sentry-cli 3.x
  raw zip uploads; bind-mount into both glitchtip-web and glitchtip-worker
- Add release.yaml prune step to delete all but the 10 newest GlitchTip releases
- Reader page: remove dead code and simplify layout
2026-04-05 17:30:39 +05:00
Admin
a3ad54db70 feat: add Listening Mode overlay with voice picker, speed, sleep timer, and chapter list
Some checks failed
Release / Test backend (push) Has been cancelled
Release / Check ui (push) Has been cancelled
Release / Docker / backend (push) Has been cancelled
Release / Docker / runner (push) Has been cancelled
Release / Upload source maps (push) Has been cancelled
Release / Docker / ui (push) Has been cancelled
Release / Docker / caddy (push) Has been cancelled
Release / Gitea Release (push) Has been cancelled
Adds a full-screen listening mode accessible via the headphones button in the
mini-player bar. Moves voice selector, speed, auto-next, and sleep timer out of
the reader settings panel into the new overlay. Voices are stored in AudioStore
so ListeningMode can read them without prop drilling.
2026-04-05 17:06:56 +05:00
Admin
48bc206c4e Fix GlitchTip source map assembly: shared uploads volume + MEDIA_ROOT
All checks were successful
Release / Test backend (push) Successful in 41s
Release / Check ui (push) Successful in 49s
Release / Docker / caddy (push) Successful in 51s
Release / Docker / backend (push) Successful in 2m47s
Release / Docker / runner (push) Successful in 2m38s
Release / Upload source maps (push) Successful in 3m20s
Release / Docker / ui (push) Successful in 2m59s
Release / Gitea Release (push) Successful in 1m45s
- Mount glitchtip_uploads named volume to /code/uploads on both
  glitchtip-web and glitchtip-worker so chunk blobs written by the
  web container are readable by the worker during assembly
- Set MEDIA_ROOT=/code/uploads explicitly on both services so the
  Django storage path matches the volume mount and DB blob records
2026-04-05 16:58:56 +05:00
Admin
4c1ad84fa9 Fix source maps + reader UI redesign
All checks were successful
Release / Test backend (push) Successful in 43s
Release / Check ui (push) Successful in 47s
Release / Docker / caddy (push) Successful in 53s
Release / Docker / backend (push) Successful in 3m13s
Release / Docker / runner (push) Successful in 2m54s
Release / Upload source maps (push) Successful in 3m46s
Release / Docker / ui (push) Successful in 2m25s
Release / Gitea Release (push) Successful in 1m26s
- Strip v prefix from GlitchTip release name in upload-sourcemaps job
  so it matches PUBLIC_BUILD_VERSION reported by the deployed app
- Focus mode: hide bottom nav/comments, show floating prev/next/exit pill
- Listening mode: full-screen overlay with transport, speed pills, voice selector
- Settings panel: dedup speed/auto-next/sleep controls (single source of truth)
- Mini-bar: unified speed steps, headphones button opens ListeningMode
2026-04-05 16:14:28 +05:00
Admin
9c79fd5deb feat: AI job tracking, range support, auto-prompt, and resume
All checks were successful
Release / Test backend (push) Successful in 48s
Release / Check ui (push) Successful in 55s
Release / Docker / caddy (push) Successful in 38s
Release / Docker / backend (push) Successful in 3m21s
Release / Docker / runner (push) Successful in 2m41s
Release / Upload source maps (push) Successful in 3m44s
Release / Docker / ui (push) Successful in 2m42s
Release / Gitea Release (push) Successful in 1m22s
- New `ai_jobs` PocketBase collection tracks all long-running AI tasks
  (batch-covers, chapter-names) with status, progress, and cancellation
- `handlers_aijobs.go`: GET/cancel endpoints for ai_jobs; centralised
  cancel registry (moved from handlers_catalogue)
- Batch-covers and chapter-names SSE handlers now create/resume ai_job
  records, support from_item/to_item ranges, and resume from items_done
  on restart via job_id
- New `POST /api/admin/image-gen/auto-prompt`: generates an image prompt
  from book description (cover) or chapter title (chapter) via LLM
- image-gen page: "Auto-prompt" button calls auto-prompt API when a slug
  is selected; falls back gracefully if TextGen not configured
- text-gen chapter-names: from/to chapter range inputs + job ID display
- catalogue-tools batch-covers: from/to item range + resume job ID input
- pb-init-v3.sh: adds ai_jobs collection (idempotent)

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-05 15:40:52 +05:00
Admin
7aad42834f fix: GlitchTip source map upload flow; add AGENTS.md
All checks were successful
Release / Test backend (push) Successful in 48s
Release / Check ui (push) Successful in 54s
Release / Docker / caddy (push) Successful in 38s
Release / Docker / backend (push) Successful in 3m7s
Release / Docker / runner (push) Successful in 2m58s
Release / Upload source maps (push) Successful in 1m56s
Release / Docker / ui (push) Successful in 2m17s
Release / Gitea Release (push) Successful in 47s
Add 'releases new' and 'releases finalize' steps around sourcemaps
upload in release.yaml — without an explicit 'releases new' call,
GlitchTip creates the release entry but associates 0 files.

Add root AGENTS.md (picked up by Claude, Cursor, Copilot, etc.) with
full project context: stack, repo layout, Gitea CI conventions,
GlitchTip DSN/upload flow, infra, and iOS notes.
2026-04-05 14:52:41 +05:00
Admin
15a31a5c64 fix: chapter menu drawer — constrain width on desktop, fix scroll
All checks were successful
Release / Test backend (push) Successful in 39s
Release / Check ui (push) Successful in 46s
Release / Docker / caddy (push) Successful in 39s
Release / Docker / backend (push) Successful in 3m12s
Release / Docker / runner (push) Successful in 3m41s
Release / Upload source maps (push) Successful in 2m13s
Release / Docker / ui (push) Successful in 2m16s
Release / Gitea Release (push) Successful in 36s
On md+ screens the drawer is now right-aligned and 320px wide (w-80)
instead of full-viewport-width. The sticky header is pulled out of the
scroll container so it never scrolls away, and overflow-y-auto is
applied only to the chapter list itself so both mobile and desktop can
scroll through long chapter lists.
2026-04-05 14:35:03 +05:00
Admin
4d3b91af30 feat: add Grafana Faro RUM, fix dashboards, add Grafana to admin nav
All checks were successful
Release / Test backend (push) Successful in 40s
Release / Check ui (push) Successful in 1m5s
Release / Docker / caddy (push) Successful in 1m9s
Release / Docker / backend (push) Successful in 3m14s
Release / Docker / runner (push) Successful in 4m7s
Release / Upload source maps (push) Successful in 2m11s
Release / Docker / ui (push) Successful in 2m23s
Release / Gitea Release (push) Successful in 40s
- Add @grafana/faro-web-sdk to UI; wire initializeFaro in hooks.client.ts
  gated on PUBLIC_FARO_COLLECTOR_URL (no-op in dev)
- Add Grafana Alloy service (faro.receiver) to homelab compose;
  Faro endpoint → alloy:12347 (faro.libnovel.cc via cloudflared)
- Add PUBLIC_FARO_COLLECTOR_URL env var to docker-compose.yml UI service
- Add Web Vitals dashboard (web-vitals.json): LCP/INP/CLS/TTFB/FCP p75
  stats + LCP/TTFB time-series + Faro exception logs from Loki
- Fix runner.json: strip libnovel_ prefix from all metric names
- Fix backend.json: replace 5 dead http_client_* panels with
  spanmetrics-based equivalents (Request Rate by Span Name + Latency
  by Span Name p95)
- Fix OTel collector: add service.telemetry.metrics.address: 0.0.0.0:8888
  so Prometheus can scrape collector self-metrics
- Add Grafana link to admin nav external tools; add admin_nav_grafana
  message key to all 5 locale files; recompile paraglide
2026-04-05 12:58:16 +05:00
Admin
eb8a92f0c1 ci: fix GlitchTip project slug (ui, not libnovel-ui)
All checks were successful
Release / Test backend (push) Successful in 41s
Release / Check ui (push) Successful in 46s
Release / Docker / caddy (push) Successful in 42s
Release / Docker / backend (push) Successful in 5m16s
Release / Docker / runner (push) Successful in 4m32s
Release / Upload source maps (push) Successful in 2m8s
Release / Docker / ui (push) Successful in 2m45s
Release / Gitea Release (push) Successful in 59s
2026-04-05 12:27:02 +05:00
Admin
fa2803c164 ci: re-enable GlitchTip source map upload job for UI releases
Some checks failed
Release / Test backend (push) Successful in 41s
Release / Check ui (push) Successful in 47s
Release / Docker / caddy (push) Successful in 56s
Release / Docker / runner (push) Has been cancelled
Release / Upload source maps (push) Has been cancelled
Release / Docker / ui (push) Has been cancelled
Release / Gitea Release (push) Has been cancelled
Release / Docker / backend (push) Has been cancelled
2026-04-05 12:26:23 +05:00
Admin
787942b172 fix: split GLITCHTIP_DSN into per-service vars (backend/runner/ui) 2026-04-05 12:17:03 +05:00
Admin
cb858bf4c9 chore: delete all ios-ux skills 2026-04-05 11:51:23 +05:00
Admin
4c3c160102 fix: downscale reference image before CF AI img2img to avoid 502 payload limit
CF Workers AI has a ~4MB JSON body limit. Large cover images base64-encoded
can easily exceed this, causing Cloudflare to return a 502 Bad Gateway.

Added resizeRefImage() which scales the reference down so its longest side
≤ 768px (nearest-neighbour, re-encoded as JPEG) before passing it to the
GenerateImageFromReference call. Images already within the limit pass through
unchanged.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-05 11:40:37 +05:00
Admin
37deac1eb3 fix: resolve all svelte-check warnings and errors (0 errors, 0 warnings)
All checks were successful
Release / Test backend (push) Successful in 45s
Release / Check ui (push) Successful in 44s
Release / Docker / caddy (push) Successful in 40s
Release / Docker / backend (push) Successful in 4m54s
Release / Docker / runner (push) Successful in 56s
Release / Docker / ui (push) Successful in 2m9s
Release / Gitea Release (push) Successful in 48s
2026-04-05 11:03:23 +05:00
Admin
6f0069daca feat: catalogue enrichment — tagline, genres, warnings, quality score, batch covers
Backend (handlers_catalogue.go):
- POST /api/admin/text-gen/tagline — 1-sentence marketing hook
- POST /api/admin/text-gen/genres + /apply — LLM genre suggestions, editable + persist
- POST /api/admin/text-gen/content-warnings — mature theme detection
- POST /api/admin/text-gen/quality-score — 1–5 description quality rating
- POST /api/admin/catalogue/batch-covers (SSE) — generate covers for books missing one
- POST /api/admin/catalogue/batch-covers/cancel — cancel via in-memory job registry
- POST /api/admin/catalogue/refresh-metadata/{slug} (SSE) — description + cover refresh

Frontend:
- text-gen: 4 new tabs (Tagline, Genres, Warnings, Quality) with book autocomplete
- image-gen: localStorage style presets (save/apply/delete named prompt templates)
- catalogue-tools: new admin page with batch cover SSE progress + cancel
- admin nav: "Catalogue Tools" link added

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-05 10:52:38 +05:00
Admin
0fc30d1328 fix: handle Llama 4 Scout array response shape in CF AI text decoder
Llama 4 Scout returns `result.response` as an array of objects
[{"generated_text":"..."}] instead of a plain string. Decode into
json.RawMessage and try both shapes; fall back to generated_text[0].

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-05 10:28:14 +05:00
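The repo's decoder is Go (json.RawMessage with both shapes tried); this TypeScript sketch only illustrates the two response shapes being handled:

```typescript
// CF AI text responses arrive either as a plain string or, for Llama 4
// Scout, as an array of { generated_text } objects.
type CfTextResponse = {
  result: { response: string | { generated_text: string }[] };
};

function decodeCfText(body: CfTextResponse): string {
  const r = body.result.response;
  if (typeof r === "string") return r;
  if (Array.isArray(r) && r.length > 0) return r[0].generated_text;
  throw new Error("unrecognised CF AI response shape");
}
```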
Admin
40151f2f33 feat: enhance admin panel on book page — prompts, img2img, SSE chapter names
Some checks failed
Release / Test backend (push) Successful in 40s
Release / Check ui (push) Failing after 35s
Release / Docker / ui (push) Has been skipped
Release / Docker / caddy (push) Successful in 31s
Release / Docker / backend (push) Successful in 2m21s
Release / Docker / runner (push) Successful in 2m23s
Release / Gitea Release (push) Has been skipped
- Cover generation: editable prompt (pre-filled from title+summary),
  img2img toggle to use existing cover as reference, Full editor link
- Chapter cover: editable prompt textarea
- Description: instructions input field, Full editor link
- Chapter names: editable pattern field, SSE streaming with live batch
  progress, inline-editable title proposals, batch warning display

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-05 00:45:12 +05:00
Admin
ad2d1a2603 feat: stream chapter-name generation via SSE batching
All checks were successful
Release / Test backend (push) Successful in 51s
Release / Check ui (push) Successful in 47s
Release / Docker / caddy (push) Successful in 44s
Release / Docker / backend (push) Successful in 3m0s
Release / Docker / runner (push) Successful in 2m38s
Release / Docker / ui (push) Successful in 2m16s
Release / Gitea Release (push) Successful in 36s
Split chapter-name LLM requests into 100-chapter batches and stream
results back as SSE so large books (e.g. Shadow Slave: 2916 chapters)
never time out or truncate. Frontend shows live batch progress inline
and accumulates proposals as they arrive.
2026-04-05 00:32:18 +05:00
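The batching above reduces to a simple chunking step; this helper is illustrative (name and generic shape assumed), showing how 2916 chapters split into 100-item requests:

```typescript
// Split a list into fixed-size batches so each LLM request stays small
// enough to avoid timeouts and truncation.
function toBatches<T>(items: T[], size = 100): T[][] {
  const out: T[][] = [];
  for (let i = 0; i < items.length; i += size) {
    out.push(items.slice(i, i + size));
  }
  return out;
}
```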
Admin
b0d8c02787 fix: add created field to chapters_idx to fix recentlyUpdatedBooks 400
All checks were successful
Release / Test backend (push) Successful in 42s
Release / Check ui (push) Successful in 43s
Release / Docker / caddy (push) Successful in 40s
Release / Docker / backend (push) Successful in 2m46s
Release / Docker / runner (push) Successful in 2m43s
Release / Docker / ui (push) Successful in 2m25s
Release / Gitea Release (push) Successful in 50s
chapters_idx was missing created/updated columns (never defined in the
PocketBase schema), causing PocketBase to return 400 for any query
sorted by -created. recentlyUpdatedBooks() uses this sort.

- Add created date field to chapters_idx schema in pb-init-v3.sh
  (also added via add_field for existing installations)
- Add idx_chapters_idx_created index for sort performance
- Set created timestamp on first insert in upsertChapterIdx so new
  chapters are immediately sortable; existing records retain empty created
  and will sort to the back (acceptable — only affects home page recency)
2026-04-04 23:47:23 +05:00
Admin
5b4c1db931 fix: add watchtower label to runner service so auto-updates work
Without com.centurylinklabs.watchtower.enable=true the homelab watchtower
(running with --label-enable) silently skipped the runner container,
leaving it stuck on v2.5.60 while fixes accumulated on newer tags.
2026-04-04 23:39:19 +05:00
Admin
0c54c59586 fix: guard against font_size=0 collapsing chapter text
All checks were successful
Release / Test backend (push) Successful in 51s
Release / Check ui (push) Successful in 47s
Release / Docker / caddy (push) Successful in 38s
Release / Docker / backend (push) Successful in 2m30s
Release / Docker / runner (push) Successful in 2m38s
Release / Docker / ui (push) Successful in 2m1s
Release / Gitea Release (push) Successful in 41s
- Replace ?? with || when reading font_size so 0 falls back to 1.0
  (affects GET /api/settings, layout.server.ts, +layout.svelte)
- Remove the explicit 'body.fontSize !== 0' exception in PUT /api/settings
  validation so 0 is now correctly rejected as an invalid font size
- Add add_index helper + idx_chapters_idx_slug_number declaration to
  scripts/pb-init-v3.sh (idempotent UNIQUE INDEX on chapters_idx)
2026-04-04 23:26:54 +05:00
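The `??` vs `||` distinction at the heart of this fix, as a minimal worked example:

```typescript
// `??` only falls back on null/undefined, so a stored font_size of 0
// passes through and collapses the text; `||` treats 0 as falsy and
// substitutes the 1.0 default.
const stored: number | null = 0;
const withNullish = stored ?? 1.0; // 0 — the bug: zero is kept
const withOr = stored || 1.0;      // 1 — the fix: zero falls back
```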
Admin
0e5eb84097 feat: add SvelteKit proxy route for admin dedup-chapters endpoint
All checks were successful
Release / Test backend (push) Successful in 48s
Release / Check ui (push) Successful in 46s
Release / Docker / caddy (push) Successful in 43s
Release / Docker / backend (push) Successful in 6m12s
Release / Docker / runner (push) Successful in 3m8s
Release / Docker / ui (push) Successful in 2m15s
Release / Gitea Release (push) Successful in 47s
2026-04-04 22:34:26 +05:00
Admin
6ef82a1d12 fix: add DeduplicateChapters stub to test mocks to satisfy BookWriter interface
All checks were successful
Release / Test backend (push) Successful in 44s
Release / Check ui (push) Successful in 44s
Release / Docker / caddy (push) Successful in 43s
Release / Docker / backend (push) Successful in 2m46s
Release / Docker / runner (push) Successful in 3m19s
Release / Docker / ui (push) Successful in 3m12s
Release / Gitea Release (push) Successful in 1m22s
2026-04-04 21:17:55 +05:00
Admin
7a418ee62b fix: await marked() to prevent Promise being passed as chapter HTML
Some checks failed
Release / Test backend (push) Failing after 15s
Release / Docker / backend (push) Has been skipped
Release / Docker / runner (push) Has been skipped
Release / Check ui (push) Successful in 44s
Release / Docker / caddy (push) Successful in 39s
Release / Docker / ui (push) Successful in 2m41s
Release / Gitea Release (push) Has been skipped
marked() returns string | Promise<string>; the previous cast 'as string'
silently passed a Promise object, which Svelte rendered as nothing.
Free users saw blank content even though SSR HTML was correct.
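A minimal model of the bug (the renderer below is a stand-in, not marked itself): a `string | Promise<string>` return type means an `as string` cast compiles but can hand a Promise object to the template, which renders no visible content. Awaiting unwraps the Promise and passes a plain string through unchanged.

```typescript
type MaybeAsync = (md: string) => string | Promise<string>;

// An async implementation of the union-typed signature, like marked().
const render: MaybeAsync = async (md) =>
  md.replace(/^# (.*)$/m, "<h1>$1</h1>");

async function toHtml(md: string): Promise<string> {
  return await render(md); // correct: always a plain string here
}

const broken = render("# Title") as string; // the old cast — compiles fine
console.log(typeof broken); // "object" — it's a Promise, not markup
```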
2026-04-04 21:15:06 +05:00
Admin
d4f35a4899 fix: prevent duplicate chapters_idx records + add dedup endpoint
Some checks failed
Release / Test backend (push) Failing after 18s
Release / Docker / backend (push) Has been skipped
Release / Docker / runner (push) Has been skipped
Release / Check ui (push) Successful in 45s
Release / Docker / caddy (push) Successful in 38s
Release / Docker / ui (push) Successful in 2m45s
Release / Gitea Release (push) Has been skipped
- Fix upsertChapterIdx race: use conflict-retry pattern (mirrors WriteMetadata)
  so concurrent goroutines don't double-POST the same chapter number
- Add DeduplicateChapters to BookWriter interface and Store implementation;
  keeps the latest record per (slug, number) and deletes extras
- Wire POST /api/admin/dedup-chapters/{slug} handler in server.go
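The conflict-retry pattern in the first bullet can be sketched as follows — the in-memory store and error shape are illustrative, not the PocketBase client API:

```typescript
// A store with a uniqueness constraint: create() throws on a duplicate key.
class UniqueStore {
  private rows = new Map<string, string>();
  create(key: string, value: string): void {
    if (this.rows.has(key)) throw new Error("conflict: key exists");
    this.rows.set(key, value);
  }
  update(key: string, value: string): void {
    if (!this.rows.has(key)) throw new Error("not found");
    this.rows.set(key, value);
  }
  get(key: string) { return this.rows.get(key); }
  size(): number { return this.rows.size; }
}

// Upsert with conflict-retry: try to create; if another caller won the
// race, fall back to updating the record it created. No duplicates result.
function upsertChapter(db: UniqueStore, key: string, value: string): void {
  try {
    db.create(key, value); // fast path: we won the race
  } catch {
    db.update(key, value); // lost the race — update the winner's record
  }
}
```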
2026-04-04 21:00:10 +05:00
Admin
6559a8c015 fix: split long text into chunks before sending to Cloudflare AI TTS
Some checks failed
Release / Test backend (push) Successful in 41s
Release / Check ui (push) Successful in 1m5s
Release / Docker / caddy (push) Successful in 37s
Release / Docker / backend (push) Has been cancelled
Release / Docker / runner (push) Has been cancelled
Release / Docker / ui (push) Has been cancelled
Release / Gitea Release (push) Has been cancelled
The aura-2-en model enforces a hard 2,000-character limit per request.
Chapters routinely exceed this, producing 413 errors.

GenerateAudio now splits the stripped text into ≤1,800-char chunks at
paragraph → sentence → space → hard-cut boundaries, calls the API once
per chunk, and concatenates the MP3 frames. Callers (runner, streaming
handler) are unchanged. StreamAudioMP3/WAV inherit the fix automatically
since they delegate to GenerateAudio.
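The paragraph → sentence → space → hard-cut cascade can be sketched like this (a simplified model of the described approach, not the Go implementation):

```typescript
// Find the last "good" break point in a window, preferring paragraph
// breaks, then sentence ends, then any whitespace; hard-cut as last resort.
function lastBoundary(window: string): number {
  for (const re of [/\n\n/g, /[.!?]\s/g, /\s+/g]) {
    let cut = -1;
    for (let m = re.exec(window); m !== null; m = re.exec(window)) {
      cut = m.index + m[0].length;
    }
    if (cut > 0) return cut;
  }
  return window.length; // no boundary at all: hard cut at the limit
}

function splitForTTS(text: string, limit = 1800): string[] {
  const chunks: string[] = [];
  let rest = text.trim();
  while (rest.length > limit) {
    const cut = lastBoundary(rest.slice(0, limit));
    const piece = rest.slice(0, cut).trim();
    if (piece) chunks.push(piece);
    rest = rest.slice(cut).trim();
  }
  if (rest) chunks.push(rest);
  return chunks;
}
```

Each chunk stays under the provider limit while cutting at the most natural boundary available, so the concatenated audio has no mid-word artifacts.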
2026-04-04 20:45:22 +05:00
Admin
05bfd110b8 refactor: replace floating settings panel with bottom sheet + Reading/Listening tabs
Some checks failed
Release / Test backend (push) Successful in 42s
Release / Check ui (push) Successful in 45s
Release / Docker / caddy (push) Successful in 39s
Release / Docker / runner (push) Has been cancelled
Release / Docker / ui (push) Has been cancelled
Release / Gitea Release (push) Has been cancelled
Release / Docker / backend (push) Has been cancelled
The old vertical dropdown was too tall on mobile — 14 stacked groups
required heavy scrolling and had no visual hierarchy.

New design:
- Full-width bottom sheet slides from screen edge (natural mobile gesture)
- Drag handle + dimmed backdrop for clarity
- Two tabs split the settings: Reading (typography + layout) and Listening
  (player style, speed, auto-next, sleep timer)
- Each row uses label-on-left + pill-group-on-right layout — saves one line
  per setting and makes the list scannable at a glance
- Settings are grouped into titled cards (Typography, Layout, Player)
  with dividers between rows instead of floating individual blocks
- Gear button moved to bottom-[4.5rem] to clear the mini-player bar
2026-04-04 20:35:15 +05:00
Admin
bfd0ad8fb7 fix: chapter content vanishes — replace stale untrack snapshots with $derived
Some checks failed
Release / Test backend (push) Successful in 44s
Release / Check ui (push) Successful in 46s
Release / Docker / caddy (push) Successful in 48s
Release / Docker / backend (push) Successful in 2m53s
Release / Docker / runner (push) Successful in 2m58s
Release / Docker / ui (push) Successful in 4m58s
Release / Gitea Release (push) Has been cancelled
html and fetchingContent were captured with untrack() at mount time.
When SvelteKit re-ran the page load (triggered by the layout's settings PUT),
data.html updated but html stayed stale. The {#key} block in the layout then
destroyed and recreated the component, and on remount data.html was momentarily
empty so html became '' and the live-scrape fallback ran unnecessarily.

Fix:
- html is now $derived(scrapedHtml || data.html || '') — always tracks load
- scrapedHtml is a separate $state only set by the live-scrape fallback
- fetchingContent starts false; the fallback sets it true only when actually fetching
- translationStatus/translatingLang: dropped untrack() so they also react to re-runs
- Removed unused untrack import
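The stale-snapshot failure mode has a loose plain-TS analogy (this models the idea only — Svelte 5's `$derived` does dependency tracking, not just lazy reads):

```typescript
let scrapedHtml = "";                  // set only by the live-scrape fallback
const data = { html: "<p>first</p>" };

const snapshot = data.html;            // untrack-style capture at "mount"
const html = () => scrapedHtml || data.html || ""; // $derived-style getter

data.html = "<p>reloaded</p>";         // the page load re-ran

console.log(snapshot); // "<p>first</p>" — stale, never sees the re-run
console.log(html());   // "<p>reloaded</p>" — always reads current load data
```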
2026-04-04 20:26:43 +05:00
Admin
4b7fcf432b fix: pass POLAR_API_TOKEN and POLAR_WEBHOOK_SECRET to ui container
All checks were successful
Release / Test backend (push) Successful in 39s
Release / Check ui (push) Successful in 47s
Release / Docker / caddy (push) Successful in 45s
Release / Docker / backend (push) Successful in 2m35s
Release / Docker / runner (push) Successful in 2m36s
Release / Docker / ui (push) Successful in 2m25s
Release / Gitea Release (push) Successful in 43s
Both vars were present in Doppler but never injected into the ui service
environment, causing all checkout requests to fail with 500 and webhooks
to be silently rejected.
2026-04-04 20:18:19 +05:00
Admin
c4a0256f6e feat: /subscribe pricing page + Pro nav link + fix checkout token scope
All checks were successful
Release / Test backend (push) Successful in 40s
Release / Check ui (push) Successful in 46s
Release / Docker / caddy (push) Successful in 42s
Release / Docker / backend (push) Successful in 2m35s
Release / Docker / runner (push) Successful in 2m38s
Release / Docker / ui (push) Successful in 2m12s
Release / Gitea Release (push) Successful in 41s
- Add /subscribe route with hero, benefits list, and pricing cards
  (annual featured with Save 33% badge, monthly secondary)
- Add 'Pro' link in nav for non-Pro users
- Add 'See plans' link in profile subscription section
- i18n keys across en/fr/id/pt/ru for all subscribe strings

Note: checkout still requires POLAR_API_TOKEN with checkouts:write scope.
Regenerate the token at polar.sh to fix the 502 error on subscribe buttons.
2026-04-04 20:12:24 +05:00
Admin
18f490f790 feat: admin controls on book detail page (cover/desc/chapter-names/audio TTS)
All checks were successful
Release / Test backend (push) Successful in 48s
Release / Check ui (push) Successful in 44s
Release / Docker / caddy (push) Successful in 44s
Release / Docker / backend (push) Successful in 3m22s
Release / Docker / runner (push) Successful in 2m51s
Release / Docker / ui (push) Successful in 3m17s
Release / Gitea Release (push) Successful in 1m12s
Add 5 admin sections to /books/[slug] for admins:
- Book cover generation (CF AI image-gen with preview + save)
- Chapter cover generation (chapter number input + preview)
- Description regeneration (preview + apply/discard)
- Chapter names generation (preview table + apply/discard)
- Audio TTS bulk enqueue (voice selector, chapter range, cancel)

Also adds /api/admin/audio/bulk and /api/admin/audio/cancel-bulk proxy routes,
and all i18n keys across en/fr/id/pt/ru.
2026-04-04 19:57:16 +05:00
Admin
6456e8cf5d perf: skip ffmpeg transcode for PocketTTS streaming — use WAV directly
All checks were successful
Release / Test backend (push) Successful in 42s
Release / Check ui (push) Successful in 46s
Release / Docker / caddy (push) Successful in 41s
Release / Docker / backend (push) Successful in 2m46s
Release / Docker / runner (push) Successful in 3m12s
Release / Docker / ui (push) Successful in 2m19s
Release / Gitea Release (push) Successful in 36s
PocketTTS emits 16-bit PCM WAV (16 kHz mono). WAV is natively supported
on all browsers including iOS/macOS Safari, so the ffmpeg MP3 transcode
is unnecessary for the streaming path.

Using format=wav for PocketTTS voices eliminates the ffmpeg subprocess
startup delay (~200–400 ms) and a pipeline stage, giving lower latency
to first audio frame. Kokoro and CF AI continue using MP3 (they output
MP3 natively or via the OpenAI-compatible endpoint).

The runner (MinIO storage) is unaffected — it still stores MP3 for
space efficiency.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-04 19:55:05 +05:00
Admin
25150c2284 feat: TTS streaming — Kokoro/PocketTTS audio starts immediately
All checks were successful
Release / Test backend (push) Successful in 37s
Release / Check ui (push) Successful in 44s
Release / Docker / caddy (push) Successful in 48s
Release / Docker / backend (push) Successful in 4m38s
Release / Docker / runner (push) Successful in 2m49s
Release / Docker / ui (push) Successful in 2m13s
Release / Gitea Release (push) Successful in 57s
Previously all voices waited for full TTS generation before first byte
reached the browser (POST → poll → presign flow). For long chapters this
meant 2–5 minutes of silence.

New flow for Kokoro and PocketTTS:
- tryPresign fast path unchanged (audio in MinIO → seekable presigned URL)
- On cache miss: set audioEl.src to /api/audio-stream/{slug}/{n} and mark
  status=ready immediately — audio starts playing within seconds
- Backend streams bytes to browser while concurrently uploading to MinIO;
  subsequent plays use the fast path

New SvelteKit route: GET /api/audio-stream/[slug]/[n]
- Proxies backend handleAudioStream (already implemented in Go)
- Same 3 chapters/day free paywall as POST route
- HEAD handler for paywall pre-check (no side effects, no counter increment)
  so AudioPlayer can surface upgrade CTA before pointing <audio> at URL

CF AI voices keep the old POST+poll flow (batch-only API, no real streaming
benefit, preserves the generating progress bar UX).

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-04 19:49:53 +05:00
Admin
0e0a70a786 feat: listening settings panel + compact player style
All checks were successful
Release / Test backend (push) Successful in 39s
Release / Check ui (push) Successful in 46s
Release / Docker / caddy (push) Successful in 48s
Release / Docker / backend (push) Successful in 3m16s
Release / Docker / runner (push) Successful in 2m53s
Release / Docker / ui (push) Successful in 2m30s
Release / Gitea Release (push) Successful in 49s
- AudioPlayer: add 'compact' playerStyle prop — slim seekable player with
  progress bar, skip ±15/30s, play/pause circle button, and speed cycle
- Chapter reader settings gear panel: new Listening section with player
  style picker (Standard/Compact), speed control (0.75–2×), auto-next
  toggle, and sleep timer — all persisted in reader_layout_v1 localStorage
- audioStore.speed/autoNext/sleep now accessible directly from settings
  panel without opening the audio player

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-04 18:34:41 +05:00
Admin
bdbe48ce1a fix: register MediaSession action handlers for iOS lock screen resume
All checks were successful
Release / Test backend (push) Successful in 37s
Release / Check ui (push) Successful in 56s
Release / Docker / caddy (push) Successful in 41s
Release / Docker / backend (push) Successful in 2m53s
Release / Docker / runner (push) Successful in 3m13s
Release / Docker / ui (push) Successful in 2m25s
Release / Gitea Release (push) Successful in 50s
Without explicit setActionHandler('play'/'pause'/...) the browser uses
default handling which on iOS Safari stops working after ~1 min of
pause in background/locked state (AudioSession gets suspended and the
lock screen button can't resume it without a direct user gesture).

Registering handlers in the layout (where audioEl lives) ensures:
- play/pause call audioEl directly — iOS treats this as a trusted
  gesture and allows .play() even from the lock screen
- seekbackward/seekforward map to ±15s / ±30s
- playbackState stays in sync so the lock screen shows the right icon

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-04 18:22:09 +05:00
Admin
87c541b178 fix: text-gen chapter-names truncation and bad title format
All checks were successful
Release / Test backend (push) Successful in 41s
Release / Check ui (push) Successful in 43s
Release / Docker / caddy (push) Successful in 51s
Release / Docker / backend (push) Successful in 2m45s
Release / Docker / runner (push) Successful in 2m50s
Release / Docker / ui (push) Successful in 2m14s
Release / Gitea Release (push) Successful in 44s
- Default max_tokens to 4096 for chapter-names so large chapter lists
  are not cut off mid-JSON by the model's token limit
- Rewrite system prompt to clarify placeholder semantics ({n} = number,
  {scene} = scene hint) and explicitly forbid echoing the number inside
  the title field — prevents "Chapter 1 - 1: ..." style duplications
- UI: surface raw_response in the error area when chapters:[] is returned
  so the admin can see what the model actually produced
2026-04-04 13:43:29 +05:00
Admin
0b82d96798 feat: admin Text Gen tool — chapter names + book description via CF Workers AI
All checks were successful
Release / Test backend (push) Successful in 40s
Release / Check ui (push) Successful in 45s
Release / Docker / caddy (push) Successful in 42s
Release / Docker / backend (push) Successful in 2m55s
Release / Docker / runner (push) Successful in 3m3s
Release / Docker / ui (push) Successful in 2m31s
Release / Gitea Release (push) Successful in 43s
Adds backend handlers and SvelteKit UI for an admin text generation tool.
The tool lets admins propose and apply AI-generated chapter titles and book
descriptions using Cloudflare Workers AI (12 LLM models, model selector shared
across both tabs).
2026-04-04 13:16:10 +05:00
Admin
a2dd0681d2 fix: pass CFAI_ACCOUNT_ID/CFAI_API_TOKEN into backend and runner containers
All checks were successful
Release / Test backend (push) Successful in 54s
Release / Check ui (push) Successful in 43s
Release / Docker / caddy (push) Successful in 41s
Release / Docker / backend (push) Successful in 2m35s
Release / Docker / runner (push) Successful in 2m48s
Release / Docker / ui (push) Successful in 2m45s
Release / Gitea Release (push) Successful in 55s
Both vars were in Doppler but never forwarded via docker-compose.yml, causing
the image generation handler to return 503 "CFAI_ACCOUNT_ID/CFAI_API_TOKEN missing".
2026-04-04 12:50:19 +05:00
Admin
ad50bd21ea chore: add .githooks/pre-commit to auto-recompile paraglide on JSON changes
All checks were successful
Release / Test backend (push) Successful in 39s
Release / Check ui (push) Successful in 46s
Release / Docker / caddy (push) Successful in 41s
Release / Docker / backend (push) Successful in 2m47s
Release / Docker / runner (push) Successful in 2m41s
Release / Docker / ui (push) Successful in 2m7s
Release / Gitea Release (push) Successful in 39s
Committed hooks directory (.githooks/pre-commit) auto-runs `npm run paraglide`
and force-stages the generated JS output whenever ui/messages/*.json files are
staged, preventing svelte-check CI failures from stale paraglide output.

Added `just setup` recipe to configure core.hooksPath for new contributors.
2026-04-04 12:11:42 +05:00
Admin
6572e7c849 fix: cover_url bug, add save-cover endpoint, fix svelte-check CI failure
All checks were successful
Release / Test backend (push) Successful in 40s
Release / Check ui (push) Successful in 44s
Release / Docker / caddy (push) Successful in 42s
Release / Docker / backend (push) Successful in 2m50s
Release / Docker / runner (push) Successful in 2m42s
Release / Docker / ui (push) Successful in 2m9s
Release / Gitea Release (push) Successful in 38s
- Fix handlers_image.go: cover_url now uses /api/cover/novelfire.net/{slug} (was 'local')
- Add POST /api/admin/image-gen/save-cover: persists pre-generated base64 directly to
  MinIO via PutCover without re-calling Cloudflare AI
- Add SvelteKit proxy route api/admin/image-gen/save-cover/+server.ts
- Update UI saveAsCover(): send existing image_b64 to save-cover instead of re-generating
- Run npm run paraglide to compile admin_nav_image_gen message; force-add gitignored files
- svelte-check: 0 errors, 21 warnings (all pre-existing)
2026-04-04 12:05:19 +05:00
Admin
74ece7e94e feat: reading view modes — progress bar, paginated, spacing, width, focus
Some checks failed
Release / Test backend (push) Successful in 38s
Release / Check ui (push) Failing after 27s
Release / Docker / ui (push) Has been skipped
Release / Docker / caddy (push) Successful in 43s
Release / Docker / backend (push) Successful in 2m50s
Release / Docker / runner (push) Successful in 3m3s
Release / Gitea Release (push) Has been skipped
Five new reader settings (persisted in localStorage as reader_layout_v1):

- Read mode: Scroll (default) vs Pages — paginated mode splits content
  into viewport-height pages, navigate with tap left/right, arrow keys,
  or Prev/Next buttons. Recalculates on content change.

- Line spacing: Tight (1.55) / Normal (1.85) / Loose (2.2) via
  --reading-line-height CSS var on :root.

- Reading width: Narrow (58ch) / Normal (72ch) / Wide (90ch) via
  --reading-max-width CSS var on :root.

- Paragraph style: Spaced (default) vs Indented (text-indent: 2em,
  tight margin — book-like feel).

- Focus mode: hides audio player, language switcher, bottom nav and
  comments so only the text remains.

Scroll progress bar: thin 2px brand-colored bar fixed at top of
viewport in scroll mode, fills as you read through the chapter.

All options added to the floating settings gear panel.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-04 11:54:44 +05:00
Admin
d1b7d3e36c feat: admin image generation via Cloudflare Workers AI
Some checks failed
Release / Test backend (push) Successful in 46s
Release / Check ui (push) Failing after 27s
Release / Docker / ui (push) Has been skipped
Release / Docker / caddy (push) Successful in 42s
Release / Docker / backend (push) Successful in 3m3s
Release / Docker / runner (push) Successful in 2m57s
Release / Gitea Release (push) Has been skipped
- Add cfai/image.go: ImageGenClient with GenerateImage, GenerateImageFromReference, AllImageModels (9 models)
- Add handlers_image.go: GET /api/admin/image-gen/models + POST /api/admin/image-gen (JSON + multipart)
- Wire ImageGen client in main.go + server.go Dependencies
- Add admin/image-gen SvelteKit page: type toggle, model selector, prompt, reference img2img, advanced options, result panel, history, save-as-cover, download
- Add SvelteKit proxy route api/admin/image-gen forwarding to Go backend
- Add admin_nav_image_gen message key to all 5 locale files
- Add Image Gen nav link to admin layout
2026-04-04 11:46:22 +05:00
Admin
aaa008ac99 feat: add Cloudflare AI TTS engine (aura-2-en) with voice grouping in UI
All checks were successful
Release / Test backend (push) Successful in 43s
Release / Check ui (push) Successful in 43s
Release / Docker / caddy (push) Successful in 46s
Release / Docker / backend (push) Successful in 2m45s
Release / Docker / runner (push) Successful in 2m53s
Release / Docker / ui (push) Successful in 2m5s
Release / Gitea Release (push) Successful in 41s
2026-04-04 11:12:55 +05:00
Admin
9806f0d894 feat: live Gitea changelog, Gitea sidebar link, release title/body extraction
All checks were successful
Release / Test backend (push) Successful in 37s
Release / Check ui (push) Successful in 43s
Release / Docker / caddy (push) Successful in 41s
Release / Docker / backend (push) Successful in 2m23s
Release / Docker / runner (push) Successful in 2m48s
Release / Docker / ui (push) Successful in 2m5s
Release / Gitea Release (push) Successful in 21s
- Admin changelog page now fetches releases live from the public Gitea
  API (https://gitea.kalekber.cc) instead of a baked releases.json file
- Remove /static/releases.json from ui/.gitignore (no longer generated)
- Add Gitea to admin sidebar external tools (admin_nav_gitea i18n key,
  all 5 locales, compiled paraglide JS force-added)
- release.yaml: remove 'Fetch releases from Gitea API' step; extract
  release title and body from the tagged commit message for Gitea releases
2026-04-03 23:20:30 +05:00
Admin
e862135775 fix: commit missing paraglide feed message JS files (were gitignored, needed force-add)
All checks were successful
Release / Test backend (push) Successful in 47s
Release / Check ui (push) Successful in 45s
Release / Docker / caddy (push) Successful in 44s
Release / Docker / backend (push) Successful in 2m23s
Release / Docker / runner (push) Successful in 2m35s
Release / Docker / ui (push) Successful in 1m57s
Release / Gitea Release (push) Successful in 20s
New feed_*.js and nav_feed.js outputs from npm run paraglide were not tracked
because src/lib/paraglide/.gitignore ignores everything by default. Force-add
them so CI can run svelte-check without running paraglide compile first.

Also include recentlyUpdatedBooks fallback improvements from working tree.
2026-04-03 23:13:46 +05:00
Admin
8588475d03 feat: in-reader settings, more like this, reading history, audio filter, feed page
Some checks failed
Release / Test backend (push) Successful in 38s
Release / Check ui (push) Failing after 33s
Release / Docker / ui (push) Has been skipped
Release / Docker / caddy (push) Successful in 51s
Release / Docker / backend (push) Successful in 2m20s
Release / Docker / runner (push) Successful in 2m28s
Release / Gitea Release (push) Has been skipped
- Add floating gear button + theme/font/size drawer in chapter reader
- Add 'More like this' horizontal scroll row on book detail page
- Add reading history tab on profile page (chronological progress timeline)
- Add audio-available badge + filter toggle on catalogue (GET /api/audio/slugs)
- Fix discover card entry animation pop (split entry vs snap-back cubic-bezier)
- Add dedicated /feed page showing books followed users are reading
- Add Feed nav link (desktop + mobile) in all 5 locales
2026-04-03 22:32:37 +05:00
Admin
28fee7aee3 feat: fix recently-updated section + hideable home sections
All checks were successful
Release / Test backend (push) Successful in 37s
Release / Check ui (push) Successful in 43s
Release / Docker / caddy (push) Successful in 44s
Release / Docker / backend (push) Successful in 2m28s
Release / Docker / runner (push) Successful in 2m28s
Release / Docker / ui (push) Successful in 1m52s
Release / Gitea Release (push) Successful in 19s
Fix "Recently Updated" showing stale books: replace meta_updated
sorting (only changes on metadata writes) with chapters_idx sorted
by -created, so the section now reflects actual chapter activity.

Add per-section show/hide toggles on the home page, persisted in
localStorage via Svelte 5 $state. Each section header gets a small
hide button; hidden sections appear as restore chips above the footer.
Toggleable: Recently Updated, Browse by Genre, From Following.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-03 22:09:48 +05:00
Admin
a888d9a0f5 fix: clean up book detail mobile layout + make genre tags linkable
All checks were successful
Release / Test backend (push) Successful in 41s
Release / Check ui (push) Successful in 49s
Release / Docker / caddy (push) Successful in 53s
Release / Docker / backend (push) Successful in 2m53s
Release / Docker / runner (push) Successful in 2m22s
Release / Docker / ui (push) Successful in 1m52s
Release / Gitea Release (push) Successful in 19s
Mobile action area was cluttered with Continue, Start ch.1, bookmark,
stars, and shelf dropdown all competing in a single unstructured row.

- Split mobile CTAs into two clear rows:
  Row 1: primary read button(s) only (Continue / Start from ch.1)
  Row 2: bookmark icon + shelf dropdown + star rating inline
- 'Start from ch.1' no longer stretches to flex-1 when Continue is
  present — it's a compact secondary button instead
- Stars and shelf dropdown moved out of the CTA row into their own line

Genre tags were plain <span> elements with no interaction. Tapping
'fantasy' or 'action' now navigates to /catalogue?genre=fantasy,
pre-selecting the genre filter on the catalogue page.
2026-04-03 21:34:38 +05:00
Admin
ac7b686fba fix: don't save settings immediately after login
All checks were successful
Release / Test backend (push) Successful in 41s
Release / Check ui (push) Successful in 45s
Release / Docker / caddy (push) Successful in 45s
Release / Docker / backend (push) Successful in 2m53s
Release / Docker / runner (push) Successful in 2m34s
Release / Docker / ui (push) Successful in 1m56s
Release / Gitea Release (push) Successful in 20s
The save-settings $effect was firing on the initial data load because
settingsApplied was set to true synchronously in the apply effect, then
currentTheme/fontFamily/fontSize were written in the same tick — causing
the save effect to immediately fire with uninitialized default values
(theme: "", fontFamily: "", fontSize: 0), producing a 400 error.

- Add settingsDirty flag, set via setTimeout(0) after initial apply so
  the save effect is blocked for the first load and only runs on real
  user-driven changes
- Also accept empty string / 0 as 'not provided' in PUT /api/settings
  validation as a defensive backstop
2026-04-03 21:07:01 +05:00
Admin
24d73cb730 fix: add device_fingerprint to PB schema + fix homelab Redis routing
All checks were successful
Release / Test backend (push) Successful in 39s
Release / Check ui (push) Successful in 43s
Release / Docker / caddy (push) Successful in 54s
Release / Docker / runner (push) Successful in 2m33s
Release / Docker / backend (push) Successful in 3m0s
Release / Docker / ui (push) Successful in 1m56s
Release / Gitea Release (push) Successful in 20s
OAuth login was silently failing: upsertUserSession() queried the
device_fingerprint column which didn't exist in the user_sessions
collection, PocketBase returned 400, the fallback authSessionId was
never written to the DB, and isSessionRevoked() immediately revoked
the cookie on first load after the OAuth redirect.

- scripts/pb-init-v3.sh: add device_fingerprint text field to the
  user_sessions create block (new installs) and add an idempotent
  add_field migration line (existing installs)

Audio jobs were stuck pending because the homelab runner was
connecting to its own local Redis instead of the prod VPS Redis.

- homelab/docker-compose.yml: change hardcoded REDIS_ADDR=redis:6379
  to ${REDIS_ADDR} so Doppler injects rediss://redis.libnovel.cc:6380
  (the Caddy TLS proxy that bridges the homelab runner to prod Redis)
2026-04-03 20:37:10 +05:00
Admin
19aeb90403 perf: cache home stats + ratings, fix discover card pop animation
All checks were successful
Release / Test backend (push) Successful in 39s
Release / Check ui (push) Successful in 48s
Release / Docker / caddy (push) Successful in 48s
Release / Docker / runner (push) Successful in 2m48s
Release / Docker / backend (push) Successful in 2m52s
Release / Docker / ui (push) Successful in 2m4s
Release / Gitea Release (push) Successful in 21s
Cache home stats (10 min) and recently added books (5 min) to avoid
hitting PocketBase on every homepage load. Cache all ratings for
discovery ranking (5 min) with invalidation on setBookRating.
invalidateBooksCache now clears all related keys atomically.
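The TTL-plus-invalidation scheme can be sketched as a small cache (names and the prefix-based invalidation are illustrative, not the actual server code):

```typescript
// TTL cache with an injectable clock for testability.
class TTLCache<V> {
  private store = new Map<string, { value: V; expires: number }>();
  constructor(private now: () => number = Date.now) {}
  set(key: string, value: V, ttlMs: number): void {
    this.store.set(key, { value, expires: this.now() + ttlMs });
  }
  get(key: string): V | undefined {
    const e = this.store.get(key);
    if (!e) return undefined;
    if (this.now() > e.expires) {
      this.store.delete(key); // lazily evict on read
      return undefined;
    }
    return e.value;
  }
  // Clear every related key in one pass, e.g. on setBookRating.
  invalidatePrefix(prefix: string): void {
    for (const k of this.store.keys()) {
      if (k.startsWith(prefix)) this.store.delete(k);
    }
  }
}
```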

Fix discover card pop-to-full-size bug: new card now transitions from
scale(0.95) to scale(1.0) matching its back-card position, instead of
snapping to full size instantly after each swipe.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-03 20:15:58 +05:00
Admin
06d4a7bfd4 feat: profile stats, discover history, end-of-chapter sleep, rating-ranked deck
All checks were successful
Release / Test backend (push) Successful in 43s
Release / Check ui (push) Successful in 46s
Release / Docker / caddy (push) Successful in 52s
Release / Docker / backend (push) Successful in 2m32s
Release / Docker / ui (push) Successful in 2m15s
Release / Docker / runner (push) Successful in 2m50s
Release / Gitea Release (push) Successful in 22s
**Profile stats tab**
- New Stats tab on /profile page (Profile / Stats switcher)
- Reading overview: chapters read, completed, reading, plan-to-read counts
- Activity cards: day streak + avg rating given
- Favourite genres (top 3 by frequency across library/progress)
- getUserStats() in pocketbase.ts — computes streak, shelf counts, genre freq

**Discover history tab**
- New History tab on /discover with full voted-book list
- Per-entry: cover thumbnail, title link, author, action label (Liked/Skipped/etc.)
- Undo button: optimistic update + DELETE /api/discover/vote?slug=...
- Clear all history button; tab shows vote count badge
- getVotedBooks(), undoDiscoveryVote() in pocketbase.ts

**Rating-ranked discovery deck**
- getBooksForDiscovery now sorts by community avg rating before returning
- Tier-based shuffle: books within the same ±0.5 star bucket are still randomised
- Higher-rated books surface earlier without making the deck fully deterministic
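The tier-based shuffle can be sketched as: bucket into half-star bands, shuffle within each band, emit bands best-first. (Bucketing by `Math.round(rating * 2)` is an assumption about the exact bucket edges.)

```typescript
type Book = { slug: string; rating: number };

// Fisher–Yates shuffle with an injectable RNG for determinism in tests.
function shuffleInPlace<T>(arr: T[], rand: () => number): T[] {
  for (let i = arr.length - 1; i > 0; i--) {
    const j = Math.floor(rand() * (i + 1));
    [arr[i], arr[j]] = [arr[j], arr[i]];
  }
  return arr;
}

function rankDeck(books: Book[], rand: () => number = Math.random): Book[] {
  const tiers = new Map<number, Book[]>();
  for (const b of books) {
    const bucket = Math.round(b.rating * 2); // half-star granularity
    const tier = tiers.get(bucket) ?? [];
    tier.push(b);
    tiers.set(bucket, tier);
  }
  return [...tiers.entries()]
    .sort(([a], [b]) => b - a)                          // best tier first
    .flatMap(([, tier]) => shuffleInPlace(tier, rand)); // random within tier
}
```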

**End-of-chapter sleep timer**
- New cycle option: Off → End of Chapter → 15m → 30m → 45m → 60m → Off
- sleepAfterChapter flag in AudioStore; layout handles it in onended (skips auto-next)
- Button shows "End Ch." label when active in this mode

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-03 07:26:54 +05:00
Admin
73a92ccf8f fix: deduplicate sessions with device fingerprint upsert
All checks were successful
Release / Test backend (push) Successful in 29s
Release / Check ui (push) Successful in 41s
Release / Docker / caddy (push) Successful in 50s
Release / Docker / ui (push) Successful in 2m8s
Release / Docker / backend (push) Successful in 3m17s
Release / Docker / runner (push) Successful in 3m13s
Release / Gitea Release (push) Successful in 12s
OAuth callbacks were creating a new session record on every login from
the same device because user-agent/IP were hardcoded as empty strings,
producing a pile-up of 6+ identical 'Unknown browser' sessions.

- Add upsertUserSession(): looks up existing session by user_id +
  device_fingerprint (SHA-256 of ua::ip, first 16 hex chars); reuses
  and touches it (returning the same authSessionId) if found, creates
  a new record otherwise
- Add device_fingerprint field to UserSession interface
- Fix OAuth callback to read real user-agent/IP from request headers
  (they are available in RequestHandler via request.headers)
- Switch both OAuth and password login to upsertUserSession so the
  returned authSessionId is used for the auth token
- Extend pruneStaleUserSessions to also cap sessions at 10 per user
- Keep createUserSession as a deprecated shim for gradual migration
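The fingerprint described in the first bullet is straightforward to sketch (function name is illustrative):

```typescript
import { createHash } from "node:crypto";

// SHA-256 of "ua::ip", truncated to the first 16 hex characters. Stable for
// the same user-agent + IP, so repeat logins from one device look up the
// same session record instead of creating a new one each time.
function deviceFingerprint(userAgent: string, ip: string): string {
  return createHash("sha256")
    .update(`${userAgent}::${ip}`)
    .digest("hex")
    .slice(0, 16);
}
```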
2026-04-02 22:22:17 +05:00
Admin
08361172c6 feat: ratings, shelves, sleep timer, EPUB export + fix TS errors
All checks were successful
Release / Docker / caddy (push) Successful in 38s
Release / Check ui (push) Successful in 39s
Release / Test backend (push) Successful in 57s
Release / Docker / ui (push) Successful in 1m49s
Release / Docker / runner (push) Successful in 3m12s
Release / Docker / backend (push) Successful in 3m46s
Release / Gitea Release (push) Successful in 13s
**Ratings (1–5 stars)**
- New `book_ratings` PB collection (session_id, user_id, slug, rating)
- `getBookRating`, `getBookAvgRating`, `setBookRating` in pocketbase.ts
- GET/POST /api/ratings/[slug] API route
- StarRating.svelte component with hover, animated stars, avg display
- Star rating shown on book detail page (desktop + mobile)

**Plan-to-Read shelf**
- `shelf` field added to `user_library` (reading/plan_to_read/completed/dropped)
- `updateBookShelf`, `getShelfMap` in pocketbase.ts
- PATCH /api/library/[slug] for shelf updates
- Shelf selector dropdown on book detail page (only when saved)
- Shelf tabs on library page to filter by category

**Sleep timer**
- `sleepUntil` state added to AudioStore
- Layout handles timer lifecycle (survives chapter navigation)
- Cycles Off → 15m → 30m → 45m → 60m → Off
- Shows live countdown in AudioPlayer when active

**EPUB export**
- Go backend: GET /api/export/{slug}?from=N&to=N
- Generates valid EPUB2 zip (mimetype uncompressed, OPF, NCX, XHTML chapters)
- Markdown → HTML via goldmark
- SvelteKit proxy at /api/export/[slug]
- Download button on book detail page (only when in library)

**Fix TS errors**
- discover/+page.svelte: currentBook possibly undefined (use {@const book})
- cardEl now $state for reactive binding

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-02 21:50:04 +05:00
Admin
809dc8d898 fix: make asynq consumer actually claim and heartbeat translation tasks
All checks were successful
Release / Check ui (push) Successful in 27s
Release / Test backend (push) Successful in 43s
Release / Docker / caddy (push) Successful in 1m3s
Release / Docker / ui (push) Successful in 1m58s
Release / Docker / runner (push) Successful in 3m23s
Release / Docker / backend (push) Successful in 4m26s
Release / Gitea Release (push) Successful in 13s
ClaimNextTranslationTask and HeartbeatTask were no-ops in the asynq
Consumer, so translation tasks created in PocketBase were never picked
up by the runner. Translation tasks live in PocketBase (not Redis),
so they must be claimed/heartbeated via the underlying pb consumer.
ReapStaleTasks is also delegated so stale translation tasks get reset.

Also removes the LibreTranslate healthcheck from homelab/runner
docker-compose.yml and relaxes depends_on to service_started — the
healthcheck was blocking runner startup until models loaded (~2 min)
and the models are already pre-downloaded in the volume.
2026-04-02 21:16:48 +05:00
Admin
e9c3426fbe feat: scroll active chapter into view when chapter drawer opens
All checks were successful
Release / Check ui (push) Successful in 40s
Release / Test backend (push) Successful in 43s
Release / Docker / caddy (push) Successful in 51s
Release / Docker / ui (push) Successful in 2m30s
Release / Docker / runner (push) Successful in 3m26s
Release / Docker / backend (push) Successful in 4m3s
Release / Gitea Release (push) Successful in 12s
When the mini-player chapter drawer is opened, the current chapter is
now immediately scrolled into the center of the list instead of always
starting from the top. Uses a Svelte action (setIfActive) to track the
active chapter element and a $effect to call scrollIntoView on open.
2026-04-02 20:44:12 +05:00
Admin
8e611840d1 fix: add 30s timeout to PB HTTP client; halve heartbeat tick interval
All checks were successful
Release / Test backend (push) Successful in 32s
Release / Docker / caddy (push) Successful in 42s
Release / Check ui (push) Successful in 44s
Release / Docker / ui (push) Successful in 2m32s
Release / Docker / backend (push) Successful in 2m49s
Release / Docker / runner (push) Successful in 3m26s
Release / Gitea Release (push) Successful in 15s
- storage/pocketbase.go: replace http.DefaultClient (no timeout) with a
  dedicated pbHTTPClient{Timeout: 30s} so a slow/hung PocketBase cannot
  stall the backend or runner indefinitely
- runner/asynq_runner.go: heartbeat ticker was firing at StaleTaskThreshold
  (2 min) == the Docker healthcheck deadline, so a single missed tick would
  mark the container unhealthy; halved to StaleTaskThreshold/2 (1 min)
2026-04-02 18:49:04 +05:00
Admin
b9383570e3 ci: fix duplicate runs — ignore tag pushes and remove pull_request trigger
All checks were successful
Release / Test backend (push) Successful in 24s
Release / Docker / caddy (push) Successful in 50s
Release / Check ui (push) Successful in 57s
Release / Docker / backend (push) Successful in 2m15s
Release / Docker / runner (push) Successful in 2m47s
Release / Docker / ui (push) Successful in 2m42s
Release / Gitea Release (push) Successful in 13s
2026-04-02 18:00:48 +05:00
Admin
eac9358c6f fix(discover): guard active card with {#if currentBook} to fix TS errors; use $state for cardEl bind
All checks were successful
CI / Backend (pull_request) Successful in 45s
CI / UI (pull_request) Successful in 26s
Release / Test backend (push) Successful in 23s
CI / UI (push) Successful in 39s
CI / Backend (push) Successful in 42s
Release / Check ui (push) Successful in 28s
Release / Docker / caddy (push) Successful in 54s
Release / Docker / runner (push) Successful in 2m12s
Release / Docker / ui (push) Successful in 1m58s
Release / Docker / backend (push) Successful in 3m16s
Release / Gitea Release (push) Successful in 13s
2026-04-02 17:48:25 +05:00
Admin
9cb11bc5e4 chore(pb): add discovery_votes collection to pb-init script
Some checks failed
CI / UI (pull_request) Failing after 30s
CI / Backend (pull_request) Successful in 34s
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-02 17:38:15 +05:00
Admin
7196f8e930 feat(discover): Tinder-style book discovery + fix duplicate books
Some checks failed
CI / UI (push) Failing after 22s
CI / Backend (push) Successful in 53s
Release / Test backend (push) Successful in 24s
Release / Check ui (push) Failing after 24s
Release / Docker / ui (push) Has been skipped
CI / Backend (pull_request) Successful in 25s
CI / UI (pull_request) Failing after 22s
Release / Docker / caddy (push) Successful in 56s
Release / Docker / backend (push) Successful in 1m38s
Release / Docker / runner (push) Successful in 3m14s
Release / Gitea Release (push) Has been skipped
- New /discover page with swipe UI: left=skip, right=like, up=read now, down=nope
- Onboarding modal to collect genre/status preferences (persisted in localStorage)
- 3-card stack with pointer-event drag, CSS fly-out animation, 5 action buttons
- Tap card for preview modal; empty state with deck reset
- Like/read-now auto-saves book to user library
- POST /api/discover/vote + DELETE for deck reset
- Discovery vote persistence via PocketBase discovery_votes collection
- Fix duplicate books: dedup by slug in getBooksBySlugs
- Fix WriteMetadata TOCTOU race: conflict-retry on concurrent insert
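The TOCTOU fix follows a generic pattern: when the insert half of a check-then-insert races a concurrent writer and reports a conflict, fall back to reading the record the winner created. A sketch of that pattern — the real WriteMetadata works against PocketBase, and these types and names are illustrative:

```go
package main

import (
	"errors"
	"fmt"
)

// errConflict stands in for a unique-constraint violation returned when a
// concurrent writer inserted the same record between our check and insert.
var errConflict = errors.New("conflict")

// upsertOnce tries to insert; on conflict it fetches the record the
// concurrent writer created instead of failing the whole request.
func upsertOnce(insert func() error, fetch func() (string, error)) (string, error) {
	err := insert()
	if err == nil {
		return "inserted", nil
	}
	if errors.Is(err, errConflict) {
		return fetch() // lost the race: reuse the winner's record
	}
	return "", err
}

func main() {
	id, err := upsertOnce(
		func() error { return errConflict },
		func() (string, error) { return "existing-id", nil },
	)
	fmt.Println(id, err)
}
```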

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-02 17:33:27 +05:00
Admin
a771405db8 feat(audio): WAV streaming, bulk audio generation admin endpoints, cancel/resume
Some checks failed
CI / Backend (push) Successful in 48s
CI / UI (push) Successful in 28s
Release / Check ui (push) Successful in 39s
Release / Test backend (push) Successful in 49s
Release / Docker / caddy (push) Successful in 59s
CI / Backend (pull_request) Successful in 41s
CI / UI (pull_request) Successful in 46s
Release / Docker / ui (push) Successful in 1m31s
Release / Docker / backend (push) Successful in 3m27s
Release / Docker / runner (push) Successful in 3m47s
Release / Gitea Release (push) Failing after 32s
- Add StreamAudioWAV() to pocket-tts and Kokoro clients; pocket-tts streams
  raw WAV directly (no ffmpeg), Kokoro requests response_format:wav with stream:true
- GET /api/audio-stream supports ?format=wav for lower-latency first-byte delivery;
  WAV cached separately in MinIO as {slug}/{n}/{voice}.wav
- Add GET /api/admin/audio/jobs with optional ?slug filter
- Add POST /api/admin/audio/bulk {slug, voice, from, to, skip_existing, force}
  where skip_existing=true (default) resumes interrupted bulk jobs
- Add POST /api/admin/audio/cancel-bulk {slug} to cancel all pending/running tasks
- Add CancelAudioTasksBySlug to taskqueue.Producer + asynqqueue implementation
- Add AudioObjectKeyExt to bookstore.AudioStore for format-aware MinIO keys

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-02 16:19:14 +05:00
Admin
1e9a96aa0f fix(payments): fix TypeScript cast errors in polar webhook handler
Some checks failed
CI / UI (push) Successful in 25s
Release / Test backend (push) Successful in 22s
CI / Backend (push) Successful in 1m59s
Release / Check ui (push) Successful in 27s
Release / Docker / caddy (push) Successful in 44s
CI / UI (pull_request) Successful in 27s
CI / Backend (pull_request) Failing after 1m26s
Release / Docker / runner (push) Successful in 1m40s
Release / Docker / backend (push) Successful in 2m34s
Release / Docker / ui (push) Successful in 3m21s
Release / Gitea Release (push) Successful in 13s
Cast through unknown to satisfy TS strict overlap check for
PolarSubscription and PolarOrder types from Record<string, unknown>.
2026-03-31 23:40:11 +05:00
Admin
23ae1ed500 feat(payments): lock checkout email via Polar server-side checkout sessions
Some checks failed
CI / UI (pull_request) Failing after 23s
CI / Backend (push) Successful in 27s
CI / Backend (pull_request) Successful in 53s
CI / UI (push) Failing after 25s
Release / Test backend (push) Successful in 28s
Release / Check ui (push) Failing after 30s
Release / Docker / ui (push) Has been skipped
Release / Docker / caddy (push) Successful in 42s
Release / Docker / backend (push) Successful in 1m50s
Release / Docker / runner (push) Successful in 3m47s
Release / Gitea Release (push) Has been skipped
Replace static Polar checkout links with a server-side POST /api/checkout
route that creates a checkout session with customer_external_id = user ID
and customer_email locked (not editable). Adds loading/error states and
a post-checkout success banner on the profile page.
2026-03-31 23:36:53 +05:00
Admin
e7cb460f9b fix(payments): point manage subscription to org customer portal
Some checks failed
CI / Backend (pull_request) Successful in 32s
CI / UI (pull_request) Failing after 28s
CI / Backend (push) Successful in 26s
CI / UI (push) Failing after 25s
Release / Check ui (push) Failing after 16s
Release / Docker / ui (push) Has been skipped
Release / Test backend (push) Successful in 47s
Release / Docker / caddy (push) Successful in 50s
Release / Docker / runner (push) Successful in 2m0s
Release / Docker / backend (push) Successful in 2m59s
Release / Gitea Release (push) Has been skipped
2026-03-31 23:26:57 +05:00
Admin
392248e8a6 fix(payments): update Polar checkout links to use checkout link IDs
Some checks failed
CI / Backend (pull_request) Successful in 25s
CI / UI (pull_request) Failing after 16s
CI / UI (push) Failing after 17s
CI / Backend (push) Successful in 41s
Release / Test backend (push) Successful in 40s
Release / Docker / caddy (push) Successful in 54s
Release / Docker / backend (push) Successful in 1m55s
Release / Docker / runner (push) Successful in 2m52s
Release / Docker / ui (push) Has been cancelled
Release / Gitea Release (push) Has been cancelled
Release / Check ui (push) Has been cancelled
2026-03-31 23:25:26 +05:00
Admin
68ea2d2808 feat(payments): fix Polar webhook + pre-fill checkout email
Some checks failed
CI / Backend (pull_request) Successful in 26s
CI / UI (pull_request) Failing after 24s
CI / Backend (push) Successful in 26s
Release / Check ui (push) Failing after 16s
Release / Docker / ui (push) Has been skipped
CI / UI (push) Failing after 31s
Release / Test backend (push) Successful in 41s
Release / Docker / caddy (push) Successful in 33s
Release / Docker / runner (push) Successful in 2m58s
Release / Docker / backend (push) Successful in 3m43s
Release / Gitea Release (push) Has been skipped
- Fix customer email path: was data.customer_email, is actually
  data.customer.email per Polar v1 API schema
- Add resolveUser() helper: tries polar_customer_id → email → external_id
- Add subscription.active and subscription.canceled event handling
- Handle order.created for fast-path pro upgrade on purchase
- Profile page: fetch user email + polarCustomerId from PocketBase
- Profile page: pre-fill ?customer_email= on checkout links
- Profile page: link to polar.sh/purchases for existing customers
2026-03-31 23:11:34 +05:00
Admin
7b1df9b592 fix(infra): fix libretranslate healthcheck; fix scrollbar-none css
All checks were successful
CI / Backend (push) Successful in 25s
Release / Test backend (push) Successful in 24s
CI / UI (push) Successful in 51s
Release / Check ui (push) Successful in 29s
CI / UI (pull_request) Successful in 27s
CI / Backend (pull_request) Successful in 44s
Release / Docker / caddy (push) Successful in 54s
Release / Docker / ui (push) Successful in 2m4s
Release / Docker / runner (push) Successful in 2m56s
Release / Docker / backend (push) Successful in 3m23s
Release / Gitea Release (push) Successful in 12s
2026-03-31 22:36:19 +05:00
Admin
f4089fe111 fix(admin): add layout guard and redirect /admin to /admin/scrape
All checks were successful
CI / Backend (push) Successful in 45s
CI / UI (push) Successful in 56s
Release / Check ui (push) Successful in 26s
Release / Test backend (push) Successful in 39s
CI / Backend (pull_request) Successful in 24s
Release / Docker / caddy (push) Successful in 48s
CI / UI (pull_request) Successful in 40s
Release / Docker / runner (push) Successful in 2m52s
Release / Docker / backend (push) Successful in 3m27s
Release / Docker / ui (push) Successful in 3m38s
Release / Gitea Release (push) Successful in 14s
- Add +layout.server.ts to enforce admin role check at layout level,
  preventing 404 on /admin and protecting all sub-routes centrally
- Add +page.server.ts to redirect /admin → /admin/scrape (was 404)
2026-03-31 22:33:39 +05:00
Admin
87b5ad1460 feat(auth): add debug-login bypass endpoint secured by DEBUG_LOGIN_TOKEN
All checks were successful
CI / Backend (push) Successful in 43s
CI / UI (push) Successful in 1m10s
Release / Check ui (push) Successful in 26s
Release / Test backend (push) Successful in 41s
CI / Backend (pull_request) Successful in 25s
Release / Docker / caddy (push) Successful in 47s
CI / UI (pull_request) Successful in 41s
Release / Docker / ui (push) Successful in 2m32s
Release / Docker / backend (push) Successful in 3m57s
Release / Docker / runner (push) Successful in 4m8s
Release / Gitea Release (push) Successful in 12s
2026-03-31 21:59:58 +05:00
285 changed files with 33013 additions and 2584 deletions


@@ -2,11 +2,8 @@ name: CI
on:
push:
paths:
- "backend/**"
- "ui/**"
- ".gitea/workflows/ci.yaml"
pull_request:
tags-ignore:
- "v*"
paths:
- "backend/**"
- "ui/**"
@@ -44,6 +41,9 @@ jobs:
- name: Build healthcheck
run: go build -o /dev/null ./cmd/healthcheck
- name: Build pocketbase
run: go build -o /dev/null ./cmd/pocketbase
- name: Run tests
run: go test -short -race -count=1 -timeout=60s ./...
@@ -66,6 +66,9 @@ jobs:
- name: Install dependencies
run: npm ci
- name: Check Paraglide codegen is up to date
run: npm run paraglide && git diff --exit-code src/lib/paraglide/
- name: Type check
run: npm run check


@@ -55,11 +55,11 @@ jobs:
- name: Build
run: npm run build
# ── docker: backend ───────────────────────────────────────────────────────────
docker-backend:
name: Docker / backend
# ── docker: build + push all images via docker bake ──────────────────────────
docker:
name: Docker
runs-on: ubuntu-latest
needs: [test-backend]
needs: [test-backend, check-ui]
steps:
- uses: actions/checkout@v4
@@ -71,215 +71,133 @@ jobs:
username: ${{ secrets.DOCKER_USER }}
password: ${{ secrets.DOCKER_TOKEN }}
- name: Docker meta
id: meta
uses: docker/metadata-action@v5
with:
images: ${{ secrets.DOCKER_USER }}/libnovel-backend
tags: |
type=semver,pattern={{version}}
type=semver,pattern={{major}}.{{minor}}
type=raw,value=latest
- name: Build and push
uses: docker/build-push-action@v6
with:
context: backend
target: backend
push: true
tags: ${{ steps.meta.outputs.tags }}
labels: ${{ steps.meta.outputs.labels }}
build-args: |
VERSION=${{ steps.meta.outputs.version }}
COMMIT=${{ gitea.sha }}
cache-from: type=registry,ref=${{ secrets.DOCKER_USER }}/libnovel-backend:latest
cache-to: type=inline
# ── docker: runner ────────────────────────────────────────────────────────────
docker-runner:
name: Docker / runner
runs-on: ubuntu-latest
needs: [test-backend]
steps:
- uses: actions/checkout@v4
- uses: docker/setup-buildx-action@v3
- name: Log in to Docker Hub
uses: docker/login-action@v3
with:
username: ${{ secrets.DOCKER_USER }}
password: ${{ secrets.DOCKER_TOKEN }}
- name: Docker meta
id: meta
uses: docker/metadata-action@v5
with:
images: ${{ secrets.DOCKER_USER }}/libnovel-runner
tags: |
type=semver,pattern={{version}}
type=semver,pattern={{major}}.{{minor}}
type=raw,value=latest
- name: Build and push
uses: docker/build-push-action@v6
with:
context: backend
target: runner
push: true
tags: ${{ steps.meta.outputs.tags }}
labels: ${{ steps.meta.outputs.labels }}
build-args: |
VERSION=${{ steps.meta.outputs.version }}
COMMIT=${{ gitea.sha }}
cache-from: type=registry,ref=${{ secrets.DOCKER_USER }}/libnovel-runner:latest
cache-to: type=inline
# ── ui: source map upload ─────────────────────────────────────────────────────
# Commented out: GlitchTip project/auth token needs to be recreated after
# the GlitchTip DB wipe. Re-enable once GLITCHTIP_AUTH_TOKEN is updated.
# upload-sourcemaps:
# name: Upload source maps
# runs-on: ubuntu-latest
# needs: [check-ui]
# defaults:
# run:
# working-directory: ui
# steps:
# - uses: actions/checkout@v4
#
# - uses: actions/setup-node@v4
# with:
# node-version: "22"
# cache: npm
# cache-dependency-path: ui/package-lock.json
#
# - name: Install dependencies
# run: npm ci
#
# - name: Build with source maps
# run: npm run build
#
# - name: Download glitchtip-cli
# run: |
# curl -L "https://gitlab.com/glitchtip/glitchtip-cli/-/jobs/artifacts/v0.1.0/raw/artifacts/glitchtip-cli-linux-x86_64?job=build-linux-x86_64" \
# -o /usr/local/bin/glitchtip-cli
# chmod +x /usr/local/bin/glitchtip-cli
#
# - name: Inject debug IDs into build artifacts
# run: glitchtip-cli sourcemaps inject ./build
# env:
# SENTRY_URL: https://errors.libnovel.cc/
# SENTRY_AUTH_TOKEN: ${{ secrets.GLITCHTIP_AUTH_TOKEN }}
# SENTRY_ORG: libnovel
# SENTRY_PROJECT: libnovel-ui
#
# - name: Upload source maps to GlitchTip
# run: glitchtip-cli sourcemaps upload ./build --release ${{ gitea.ref_name }}
# env:
# SENTRY_URL: https://errors.libnovel.cc/
# SENTRY_AUTH_TOKEN: ${{ secrets.GLITCHTIP_AUTH_TOKEN }}
# SENTRY_ORG: libnovel
# SENTRY_PROJECT: libnovel-ui
# ── docker: ui ────────────────────────────────────────────────────────────────
docker-ui:
name: Docker / ui
runs-on: ubuntu-latest
needs: [check-ui]
steps:
- uses: actions/checkout@v4
- name: Fetch releases from Gitea API
- name: Compute version tags
id: ver
run: |
set -euo pipefail
RESPONSE=$(curl -sfL \
-H "Accept: application/json" \
"http://gitea.kalekber.cc/api/v1/repos/kamil/libnovel/releases?limit=50&page=1")
# Validate JSON before writing — fails hard if response is not a JSON array
COUNT=$(echo "$RESPONSE" | jq 'if type == "array" then length else error("expected array, got \(type)") end')
echo "$RESPONSE" > ui/static/releases.json
echo "Fetched $COUNT releases"
V="${{ gitea.ref_name }}"
VER="${V#v}"
echo "version=$VER" >> "$GITHUB_OUTPUT"
echo "major_minor=$(echo "$VER" | cut -d. -f1-2)" >> "$GITHUB_OUTPUT"
- uses: docker/setup-buildx-action@v3
- name: Log in to Docker Hub
uses: docker/login-action@v3
- name: Build and push all images
uses: docker/bake-action@v6
with:
username: ${{ secrets.DOCKER_USER }}
password: ${{ secrets.DOCKER_TOKEN }}
files: docker-bake.hcl
set: |
*.output=type=image,push=true
env:
VERSION: ${{ steps.ver.outputs.version }}
MAJOR_MINOR: ${{ steps.ver.outputs.major_minor }}
COMMIT: ${{ gitea.sha }}
BUILD_TIME: ${{ gitea.event.head_commit.timestamp }}
- name: Docker meta
id: meta
uses: docker/metadata-action@v5
with:
images: ${{ secrets.DOCKER_USER }}/libnovel-ui
tags: |
type=semver,pattern={{version}}
type=semver,pattern={{major}}.{{minor}}
type=raw,value=latest
- name: Build and push
uses: docker/build-push-action@v6
with:
context: ui
push: true
tags: ${{ steps.meta.outputs.tags }}
labels: ${{ steps.meta.outputs.labels }}
build-args: |
BUILD_VERSION=${{ steps.meta.outputs.version }}
BUILD_COMMIT=${{ gitea.sha }}
BUILD_TIME=${{ gitea.event.head_commit.timestamp }}
cache-from: type=registry,ref=${{ secrets.DOCKER_USER }}/libnovel-ui:latest
cache-to: type=inline
# ── docker: caddy ─────────────────────────────────────────────────────────────
docker-caddy:
name: Docker / caddy
# ── deploy: sync docker-compose.yml + restart prod ───────────────────────────
# Runs after all images are pushed to Docker Hub.
# Copies the compose file from the tagged commit to the server, pulls the new
# images, and restarts only the services whose image or config changed.
# --remove-orphans cleans up containers no longer defined in the compose file
# (e.g. the now-removed pb-init container).
#
# Required Gitea secrets:
# PROD_HOST — prod server IP or hostname
# PROD_USER — SSH login user (typically root)
# PROD_SSH_KEY — private key whose public half is in authorized_keys
# PROD_SSH_KNOWN_HOSTS — output of: ssh-keyscan -H <PROD_HOST>
deploy-prod:
name: Deploy to prod
runs-on: ubuntu-latest
needs: [docker]
steps:
- uses: actions/checkout@v4
- uses: docker/setup-buildx-action@v3
- name: Install SSH key
run: |
mkdir -p ~/.ssh
printf '%s\n' "${{ secrets.PROD_SSH_KEY }}" > ~/.ssh/deploy_key
chmod 600 ~/.ssh/deploy_key
printf '%s\n' "${{ secrets.PROD_SSH_KNOWN_HOSTS }}" >> ~/.ssh/known_hosts
- name: Log in to Docker Hub
uses: docker/login-action@v3
with:
username: ${{ secrets.DOCKER_USER }}
password: ${{ secrets.DOCKER_TOKEN }}
- name: Copy docker-compose.yml to prod
run: |
scp -i ~/.ssh/deploy_key \
docker-compose.yml \
"${{ secrets.PROD_USER }}@${{ secrets.PROD_HOST }}:/opt/libnovel/docker-compose.yml"
- name: Docker meta
id: meta
uses: docker/metadata-action@v5
with:
images: ${{ secrets.DOCKER_USER }}/libnovel-caddy
tags: |
type=semver,pattern={{version}}
type=semver,pattern={{major}}.{{minor}}
type=raw,value=latest
- name: Pull new images and restart changed services
run: |
ssh -i ~/.ssh/deploy_key \
"${{ secrets.PROD_USER }}@${{ secrets.PROD_HOST }}" \
'set -euo pipefail
cd /opt/libnovel
doppler run -- docker compose pull backend runner ui caddy pocketbase
doppler run -- docker compose up -d --remove-orphans'
- name: Build and push
uses: docker/build-push-action@v6
with:
context: caddy
push: true
tags: ${{ steps.meta.outputs.tags }}
labels: ${{ steps.meta.outputs.labels }}
cache-from: type=registry,ref=${{ secrets.DOCKER_USER }}/libnovel-caddy:latest
cache-to: type=inline
# ── deploy homelab runner ─────────────────────────────────────────────────────
# Syncs the homelab runner compose file and restarts the runner service.
#
# Required Gitea secrets:
# HOMELAB_HOST — homelab server IP (192.168.0.109)
# HOMELAB_USER — SSH login user (typically root)
# HOMELAB_SSH_KEY — private key whose public half is in authorized_keys
# HOMELAB_SSH_KNOWN_HOSTS — output of: ssh-keyscan -H <HOMELAB_HOST>
deploy-homelab:
name: Deploy to homelab
runs-on: ubuntu-latest
needs: [docker]
steps:
- uses: actions/checkout@v4
- name: Install SSH key
run: |
mkdir -p ~/.ssh
printf '%s\n' "${{ secrets.HOMELAB_SSH_KEY }}" > ~/.ssh/homelab_key
chmod 600 ~/.ssh/homelab_key
printf '%s\n' "${{ secrets.HOMELAB_SSH_KNOWN_HOSTS }}" >> ~/.ssh/known_hosts
- name: Copy docker-compose.yml to homelab
run: |
scp -i ~/.ssh/homelab_key \
homelab/runner/docker-compose.yml \
"${{ secrets.HOMELAB_USER }}@${{ secrets.HOMELAB_HOST }}:/opt/libnovel-runner/docker-compose.yml"
- name: Pull new runner image and restart
run: |
ssh -i ~/.ssh/homelab_key \
"${{ secrets.HOMELAB_USER }}@${{ secrets.HOMELAB_HOST }}" \
'set -euo pipefail
cd /opt/libnovel-runner
doppler run --project libnovel --config prd_homelab -- docker compose pull runner
doppler run --project libnovel --config prd_homelab -- docker compose up -d runner'
# ── Gitea release ─────────────────────────────────────────────────────────────
release:
name: Gitea Release
runs-on: ubuntu-latest
needs: [docker-backend, docker-runner, docker-ui, docker-caddy]
needs: [docker]
steps:
- uses: actions/checkout@v4
with:
fetch-depth: 0
- name: Extract release notes from tag commit
id: notes
run: |
set -euo pipefail
# Subject line (first line of commit message) → release title
SUBJECT=$(git log -1 --format="%s" "${{ gitea.sha }}")
# Body (everything after the blank line) → release body
BODY=$(git log -1 --format="%b" "${{ gitea.sha }}" | sed '/^Co-Authored-By:/d' | sed '/^[[:space:]]*$/{ N; /^\n$/d }' | sed 's/^[[:space:]]*$//' | awk 'NF || !p; {p = !NF}')
echo "title=${SUBJECT}" >> "$GITHUB_OUTPUT"
# Use a heredoc delimiter to safely handle multi-line body
{
echo "body<<RELEASE_BODY_EOF"
echo "${BODY}"
echo "RELEASE_BODY_EOF"
} >> "$GITHUB_OUTPUT"
- name: Create release
uses: https://gitea.com/actions/gitea-release-action@v1
with:
token: ${{ secrets.GITEA_TOKEN }}
generate_release_notes: true
title: ${{ steps.notes.outputs.title }}
body: ${{ steps.notes.outputs.body }}

.githooks/pre-commit (Executable file, +14)

@@ -0,0 +1,14 @@
#!/usr/bin/env bash
# Auto-recompile paraglide messages when ui/messages/*.json files are staged.
# Prevents svelte-check / CI failures caused by stale generated JS files.
set -euo pipefail
STAGED=$(git diff --cached --name-only)
if echo "$STAGED" | grep -q '^ui/messages/'; then
echo "[pre-commit] ui/messages/*.json changed — recompiling paraglide..."
(cd ui && npm run paraglide --silent)
git add -f ui/src/lib/paraglide/messages/
echo "[pre-commit] paraglide output re-staged."
fi


@@ -1,156 +0,0 @@
---
name: ios-ux
description: iOS/SwiftUI UI & UX review and implementation guidelines for LibNovel. Enforces Apple HIG, iOS 17+ APIs, spring animations, haptics, accessibility, performance, and offline handling. Load this skill for any iOS view work.
compatibility: opencode
---
# iOS UI/UX Skill — LibNovel
Load this skill whenever working on SwiftUI views in `ios/`. It defines design standards, review process for screenshots, and implementation rules.
---
## Screenshot Review Process
When the user provides a screenshot of the app:
1. **Analyze first** — identify specific UI/UX issues across these categories:
- Visual hierarchy and spacing
- Typography (size, weight, contrast)
- Color and material usage
- Animation and interactivity gaps
- Accessibility problems
- Deprecated or non-native patterns
2. **Present a numbered list** of suggested improvements with brief rationale for each.
3. **Ask for confirmation** before writing any code: "Should I apply all of these, or only specific ones?"
4. Apply only what the user confirms.
---
## Design System
### Colors & Materials
- **Accent**: `Color.amber` (project-defined). Use for active state, selection indicators, progress fills, and CTAs.
- **Backgrounds**: Prefer `.regularMaterial`, `.ultraThinMaterial`, or `.thinMaterial` over hard-coded `Color.black.opacity(x)` or `Color(.systemBackground)`.
- **Dark overlays** (e.g. full-screen players): Use `KFImage` blurred background + `Color.black.opacity(0.5–0.6)` overlay. Never use a flat solid black background.
- **Semantic colors**: Use `.primary`, `.secondary`, `.tertiary` foreground styles. Avoid hard-coded `Color.white` except on dark material contexts (full-screen player).
- **No hardcoded color literals** — use `Color+App.swift` extensions or system semantic colors.
### Typography
- Use the SF Pro system font via `.font(.title)`, `.font(.body)`, etc. — never hardcode font names except for intentional stylistic accents (e.g. "Snell Roundhand" for voice watermark).
- Apply `.fontWeight()` and `.fontDesign()` modifiers rather than custom font families.
- Support Dynamic Type — never hardcode a fixed font size as the sole option without a `.minimumScaleFactor` or system font size modifier.
- Hierarchy: title3.bold for primary labels, subheadline for secondary, caption/caption2 for metadata.
### Spacing & Layout
- Minimum touch target: **44×44 pt**. Use `.frame(minWidth: 44, minHeight: 44)` or `.contentShape(Rectangle())` on small icons.
- Prefer 16–20 pt horizontal padding on full-width containers; 12 pt for compact inner elements.
- Use `VStack(spacing:)` and `HStack(spacing:)` explicitly — never rely on default spacing for production UI.
- Corner radii: 12–14 pt for cards/chips, 10 pt for small badges, 20–24 pt for large cover art.
---
## Animation Rules
### Spring Animations (default for all interactive transitions)
- Use `.spring(response:dampingFraction:)` for state-driven layout changes, selection feedback, and appear/disappear transitions.
- Recommended defaults:
- Interactive elements: `response: 0.3, dampingFraction: 0.7`
- Entrance animations: `response: 0.45–0.5, dampingFraction: 0.7`
- Quick snappy feedback: `response: 0.2, dampingFraction: 0.6`
- Reserve `.easeInOut` only for non-interactive, ambient animations (e.g. opacity pulses, generating overlays).
### SF Symbol Transitions
- Always use `contentTransition(.symbolEffect(.replace.downUp))` when a symbol name changes based on state (play/pause, checkmark/circle, etc.).
- Use `.symbolEffect(.variableColor.cumulative)` for continuous animations (waveform, loading indicators).
- Use `.symbolEffect(.bounce)` for one-shot entrance emphasis (e.g. completion checkmark appearing).
- Use `.symbolEffect(.pulse)` for error/warning states that need attention.
### Repeating Animations
- Use `phaseAnimator` for any looping animation that previously used manual `@State` + `withAnimation` chains.
- Do not use `Timer` publishers for UI animation — prefer `phaseAnimator` or `TimelineView`.
---
## Haptic Feedback
Add `UIImpactFeedbackGenerator` to every user-initiated interactive control:
- `.light` — toggle switches, selection chips, secondary actions, slider drag start.
- `.medium` — primary transport buttons (play/pause, chapter skip), significant confirmations.
- `.heavy` — destructive actions (only if no confirmation dialog).
Pattern:
```swift
Button {
UIImpactFeedbackGenerator(style: .light).impactOccurred()
// action
} label: { ... }
```
Do **not** add haptics to:
- Programmatic state changes not directly triggered by a tap.
- Buttons inside `List` rows that already use swipe actions.
- Scroll events.
---
## iOS 17+ API Usage
Flag and replace any of the following deprecated patterns:
| Deprecated | Replace with |
|---|---|
| `NavigationView` | `NavigationStack` |
| `@StateObject` / `ObservableObject` (new types only) | `@Observable` macro |
| `DispatchQueue.main.async` | `await MainActor.run` or `@MainActor` |
| Manual `@State` animation chains for repeating loops | `phaseAnimator` |
| `.animation(_:)` without `value:` | `.animation(_:value:)` |
| `AnyView` wrapping for conditional content | `@ViewBuilder` + `Group` |
Do **not** refactor existing `ObservableObject` types to `@Observable` unless explicitly asked — only apply `@Observable` to new types.
---
## Accessibility
Every view must:
- Support VoiceOver: add `.accessibilityLabel()` to icon-only buttons and image views.
- Support Dynamic Type: test that text doesn't truncate at xxxLarge without a layout adjustment.
- Meet contrast ratio: text on tinted backgrounds must be legible — avoid `.opacity(0.25)` or lower for any user-readable text.
- Touch targets ≥ 44pt (see Spacing above).
- Interactive controls must have `.accessibilityAddTraits(.isButton)` if not using `Button`.
- Do not rely solely on color to convey state — pair color with icon or label.
---
## Performance
- **Isolate high-frequency observers**: Any view that observes a `PlaybackProgress` (timer-tick updates) must be a separate sub-view that `@ObservedObject`-observes only the progress object — not the parent view. This prevents the entire parent from re-rendering every 0.5 seconds.
- **Avoid `id()` overuse**: Only use `.id()` to force view recreation when necessary (e.g. background image on track change). Prefer `onChange(of:)` for side effects.
- **Lazy containers**: Use `LazyVStack` / `LazyHStack` inside `ScrollView` for lists of 20+ items. `List` is inherently lazy and does not need this.
- **Image loading**: Always use `KFImage` (Kingfisher) with `.placeholder` for remote images. Never use `AsyncImage` for cover art — it has no disk cache.
- **Avoid `AnyView`**: It breaks structural identity and hurts diffing. Use `@ViewBuilder` or `Group { }` instead.
---
## Offline & Error States
Every view that makes network calls must:
1. Wrap the body in a `VStack` with `OfflineBanner` at the top, gated on `networkMonitor.isConnected`.
2. Suppress network errors silently when offline via `ErrorAlertModifier` — do not show an alert when the device is offline.
3. Gate `.task` / `.onAppear` network calls: `guard networkMonitor.isConnected else { return }`.
4. Show a non-blocking inline empty state (not a full-screen error) for failed loads when online.
---
## Component Checklist (before submitting any view change)
- [ ] All interactive elements ≥ 44pt touch target
- [ ] SF Symbol state changes use `contentTransition(.symbolEffect(...))`
- [ ] State-driven layout transitions use `.spring(response:dampingFraction:)`
- [ ] Tappable controls have haptic feedback
- [ ] No `NavigationView`, no `DispatchQueue.main.async`, no `.animation(_:)` without `value:`
- [ ] High-frequency observers are isolated sub-views
- [ ] Offline state handled with `OfflineBanner` + `NetworkMonitor`
- [ ] VoiceOver labels on icon-only buttons
- [ ] No hardcoded `Color.black` / `Color.white` / `Color(.systemBackground)` where a material applies

AGENTS.md (new file, 193 lines)

@@ -0,0 +1,193 @@
# LibNovel v2 — Agent Context
This file is the root-level knowledge base for LLM coding agents (OpenCode, Claude, Cursor, Copilot, etc.).
Sub-directories have their own `AGENTS.md` with deeper context (e.g. `ios/AGENTS.md`).
---
## Stack
| Layer | Technology |
|---|---|
| UI | SvelteKit 2 + Svelte 5, TypeScript, TailwindCSS |
| Backend / Runner | Go (single repo, two binaries: `backend`, `runner`) |
| iOS app | SwiftUI, iOS 17+, Swift 5.9+ |
| Database | PocketBase (SQLite) + MinIO (object storage) |
| Search | Meilisearch |
| Queue | Asynq over Redis (local) / Valkey (prod) |
| Scraping | Novelfire scraper in `backend/novelfire/` |
---
## Repository Layout
```
.
├── .gitea/workflows/ # CI/CD — Gitea Actions (NOT .github/)
├── .opencode/ # OpenCode agent config (memory, skills)
├── backend/ # Go backend + runner (single module)
├── caddy/ # Caddy reverse proxy Dockerfile
├── homelab/ # Homelab docker-compose + observability stack
├── ios/ # SwiftUI iOS app (see ios/AGENTS.md)
├── scripts/ # Utility scripts
├── ui/ # SvelteKit UI
├── docker-compose.yml # Prod compose (all services)
├── AGENTS.md # This file
└── opencode.json # OpenCode config
```
---
## CI/CD — Gitea Actions
- Workflows live in `.gitea/workflows/`, **not** `.github/workflows/`
- Self-hosted Gitea instance; use `gitea.ref_name` / `gitea.sha` (not `github.*`)
- Two workflows:
- `ci.yaml` — runs on every push to `main` (test + type-check)
- `release.yaml` — runs on `v*` tags (build Docker images, upload source maps, create Gitea release)
- Secrets: `DOCKER_USER`, `DOCKER_TOKEN`, `GITEA_TOKEN`, `GLITCHTIP_AUTH_TOKEN`
### Git credentials
Credentials are embedded in the remote URL — no `HOME=/root` or credential helper needed for push:
```
https://kamil:95782641Apple%24@gitea.kalekber.cc/kamil/libnovel.git
```
All git commands still use `HOME=/root` prefix for consistency (picks up `/root/.gitconfig` for user name/email), but push auth works without it.
### Releasing a new version
```bash
HOME=/root git tag v2.6.X -m "Short title"
HOME=/root git push origin v3-cleanup --tags
```
CI will build all Docker images, upload source maps to GlitchTip, and create a Gitea release automatically.
---
## GlitchTip Error Tracking
- Instance: `https://errors.libnovel.cc/`
- Org: `libnovel`
- Projects: `ui` (id/1), `backend` (id/2), `runner` (id/3)
- Tool: `glitchtip-cli` v0.1.0
### Per-service DSNs (stored in Doppler)
| Service | Doppler key | GlitchTip project |
|---|---|---|
| UI (SvelteKit) | `PUBLIC_GLITCHTIP_DSN` | ui (1) |
| Backend (Go) | `GLITCHTIP_DSN_BACKEND` | backend (2) |
| Runner (Go) | `GLITCHTIP_DSN_RUNNER` | runner (3) |
### Source map upload flow (release.yaml)
The correct order is **critical** — uploading before `releases new` results in 0 files shown in GlitchTip UI:
```
glitchtip-cli sourcemaps inject ./build # inject debug IDs
glitchtip-cli releases new <version> # MUST come before upload
glitchtip-cli sourcemaps upload ./build \
--release <version> # associate files with release
glitchtip-cli releases finalize <version> # mark release complete
```
---
## Infrastructure
| Environment | Host | Path | Doppler config |
|---|---|---|---|
| Prod | `165.22.70.138` | `/opt/libnovel/` | `prd` |
| Homelab runner | `192.168.0.109` | `/opt/libnovel-runner/` | `prd_homelab` |
### Docker Compose — always use Doppler
```bash
# Prod
doppler run --project libnovel --config prd -- docker compose <cmd>
# Homelab full-stack (runs from .bak file on server)
doppler run --project libnovel --config prd_homelab -- docker compose -f homelab/docker-compose.yml.bak <cmd>
# Homelab runner only
doppler run --project libnovel --config prd_homelab -- docker compose -f homelab/runner/docker-compose.yml <cmd>
```
- Prod runner has `profiles: [runner]`, so a plain `docker compose up -d` will NOT accidentally start it
- When deploying, always sync `docker-compose.yml` to the server before running `up -d`
- **Caddyfile is NOT in git** — lives at `/opt/libnovel/Caddyfile` on prod server only. Edit directly on the server and restart the `caddy` container.
---
## Observability
| Tool | Purpose |
|---|---|
| GlitchTip | Error tracking (UI + backend + runner) |
| Grafana Faro | RUM / Web Vitals (collector at `faro.libnovel.cc/collect`) → Alloy (port 12347) |
| OpenTelemetry | Distributed tracing (OTLP → cloudflared → OTel collector → Tempo) |
| Grafana | Dashboards at `https://grafana.libnovel.cc` |
### Grafana dashboards: `homelab/otel/grafana/provisioning/dashboards/`
Key dashboards:
- `backend.json` — Backend logs (Loki: `{service_name="backend"}`, plain text)
- `runner.json` — Runner logs (Loki: `{service_name="runner"}`) + Asynq Prometheus metrics
- `web-vitals.json` — Web Vitals (Loki: `{service_name="unknown_service"} kind=measurement` + pattern parser)
- `catalogue.json` — Scrape progress (Loki: `{service_name="runner"} | json | body="..."`)
### Data pipeline (2026-04-07 working state)
**Browser → Grafana Faro:**
Browser sends RUM data → `https://faro.libnovel.cc/collect` → **Alloy** `faro.receiver` (port 12347) → Loki (logs/exceptions) + OTel collector → **Tempo** (traces)
**Backend/Runner → OTel:**
Backend/Runner Go SDK → `https://otel.libnovel.cc` (cloudflared tunnel) → **OTel collector** (port 4318) → Tempo (traces) + Loki (logs via `otlphttp/loki` exporter)
Runner also sends to **Alloy** `otelcol.receiver.otlp` (port 4318) → `otelcol.exporter.loki` → Loki
### Loki log format per service
- `service_name="backend"`: Plain text (e.g. `backend: asynq task dispatch enabled`)
- `service_name="runner"`: JSON with `body`, `attributes{slug,chapters,page}`, `severity`
- `service_name="unknown_service"`: Faro RUM text format (e.g. `kind=measurement lcp=5428.0 ...`)
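A minimal sketch of producing a runner-style record with stdlib `log/slog` (attribute names taken from the list above; the `body`/`severity` envelope is added downstream by the otelslog bridge, so only the raw JSON record is shown here):

```go
package main

import (
	"bytes"
	"fmt"
	"log/slog"
	"strings"
)

// runnerLogLine renders one runner-style record as JSON. In production the
// otelslog bridge wraps the message as `body` and the attrs as
// `attributes{slug,chapters,page}` before the record reaches Loki.
func runnerLogLine(slug string, chapters, page int) string {
	var buf bytes.Buffer
	log := slog.New(slog.NewJSONHandler(&buf, nil))
	log.Info("catalogue scrape progress",
		slog.String("slug", slug),
		slog.Int("chapters", chapters),
		slog.Int("page", page),
	)
	return buf.String()
}

// hasAttr is a crude stand-in for the `| json` field filters used in the
// Grafana dashboards.
func hasAttr(line, attr string) bool { return strings.Contains(line, attr) }

func main() {
	line := runnerLogLine("shadow-slave", 120, 3)
	fmt.Print(line)
	fmt.Println(hasAttr(line, `"slug":"shadow-slave"`)) // true
}
```

The catalogue dashboard query (`| json | body="..."`) filters on the ingested equivalents of these fields.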
### OTel Collector ports (homelab)
- gRPC: `4317` — receives from cloudflared (`otel.libnovel.cc`)
- HTTP: `4318` — receives from cloudflared + Alloy
- Metrics: `8888`
### Known issues / pending fixes
- Web Vitals use `service_name="unknown_service"` (Faro SDK doesn't set service.name in browser) — works with `unknown_service` label
- Runner logs go to both Alloy→Loki AND OTel collector→Loki (dual pipeline — intentional for resilience)
---
## Go Backend
- Primary files: `orchestrator.go`, `server/handlers_*.go`, `novelfire/scraper.go`, `storage/hybrid.go`, `storage/pocketbase.go`
- Store interface: `store.go` — never touch MinIO/PocketBase clients directly outside `storage/`
- Two binaries built from the same module: `backend` (HTTP API) and `runner` (Asynq worker)
---
## SvelteKit UI
- Source: `ui/src/`
- i18n: Paraglide — translation files in `ui/messages/*.json` (5 locales)
- Auth debug bypass: `GET /api/auth/debug-login?token=<DEBUG_LOGIN_TOKEN>&username=<username>&next=<path>`
---
## iOS App
Full context in `ios/AGENTS.md`. Quick notes:
- SwiftUI, iOS 17+, `@Observable` for new types
- Download key separator: `::` (not `-`)
- Voice fallback: book override → global default → `"af_bella"`
- Offline pattern: `NetworkMonitor` env object + `OfflineBanner` + `ErrorAlertModifier`
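The separator and voice-fallback rules above are easy to get wrong; a hedged sketch in Go (the real implementation is Swift, and these helper names are illustrative, not from the repo):

```go
package main

import "fmt"

// downloadKey joins book and chapter identifiers with the "::" separator
// used by the iOS download store (not "-", which can appear inside slugs).
func downloadKey(bookID, chapterID string) string {
	return bookID + "::" + chapterID
}

// pickVoice walks the documented fallback chain:
// per-book override → global default → hard-coded "af_bella".
func pickVoice(bookOverride, globalDefault string) string {
	for _, v := range []string{bookOverride, globalDefault} {
		if v != "" {
			return v
		}
	}
	return "af_bella"
}

func main() {
	fmt.Println(downloadKey("shadow-slave", "ch-12")) // shadow-slave::ch-12
	fmt.Println(pickVoice("", ""))                    // af_bella
}
```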

CLAUDE.md (new file, 97 lines)

@@ -0,0 +1,97 @@
# CLAUDE.md
This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.
## Environment / Secrets
All project environment variables are stored in **Doppler**. When you need to access any secret or env var (e.g. API tokens, database URLs, credentials), fetch them via:
```bash
doppler run -- <command> # inject all secrets into a command
doppler secrets get SECRET_NAME # inspect a specific secret
```
Never use `.env` files. Do not ask the user to provide secrets manually — they are available via Doppler.
## Commands
### Docker (via `just` — the primary way to run services)
All services use Doppler for secrets injection. The `just` commands handle this automatically.
```bash
just up # Start all services in background
just up-fg # Start all services, stream logs
just down # Stop all services
just down-volumes # Full reset (destructive — removes all volumes)
just build # Rebuild all Docker images
just build-svc backend # Rebuild a specific service
just restart # Stop + rebuild + start
just logs # Tail all logs
just log backend # Tail a specific service
just shell backend # Open shell in running container
just init # One-shot init: MinIO buckets, PocketBase collections, Postgres
```
### Backend (Go)
```bash
cd backend
go vet ./...
go test -short -race -count=1 -timeout=60s ./...
go test -short -race -count=1 -run TestFoo ./internal/somepackage/
go build ./cmd/backend
go build ./cmd/runner
```
### Frontend (SvelteKit)
```bash
cd ui
npm run dev # Dev server at localhost:5173
npm run build # Production build
npm run check # svelte-check (type-check)
npm run paraglide # Regenerate i18n messages (run after editing messages/*.json)
```
## Architecture
Three services communicate via PocketBase records and a Redis/Valkey task queue:
**Backend** (`backend/cmd/backend`) — HTTP REST API. Handles reads, enqueues tasks to Redis via Asynq, returns presigned MinIO URLs. Minimal processing; delegates heavy work to the runner.
**Runner** (`backend/cmd/runner`) — Asynq task worker. Processes scraping, TTS audio generation, AI text/image generation. Reads/writes PocketBase and MinIO directly.
**UI** (`ui/`) — SvelteKit 2 + Svelte 5 SSR app. Consumes the backend API. Uses Paraglide JS for i18n (5 locales).
### Data layer
| Service | Role |
|---------|------|
| **PocketBase** (SQLite) | Auth, structured records (books, chapters, tasks, subscriptions) |
| **MinIO** (S3-compatible) | Object storage — chapter text, audio files, images |
| **Meilisearch** | Full-text search (runner indexes, backend reads) |
| **Redis/Valkey** | Asynq task queue + presigned URL cache |
### Key backend packages
- `internal/backend/` — HTTP handlers and server setup
- `internal/runner/` — Task processor implementations
- `internal/storage/` — Unified MinIO + PocketBase interface (all data access goes through here)
- `internal/orchestrator/` — Task orchestration across services
- `internal/taskqueue/` — Enqueue helpers (backend side)
- `internal/asynqqueue/` — Asynq queue setup (runner side)
- `internal/config/` — Environment variable loading (Doppler-injected at runtime, no .env files)
- `internal/presigncache/` — Redis cache for MinIO presigned URLs
### UI routing conventions (SvelteKit)
- `+page.svelte` / `+page.server.ts` — Page + server-side load
- `+layout.svelte` / `+layout.server.ts` — Layouts
- `routes/api/` — API routes (`+server.ts`)
- `lib/audio.svelte.ts` — Client-side audio playback store (Svelte 5 runes)
## Key Conventions
- **Svelte 5 runes only** — use `$state`, `$derived`, `$effect`; do not use Svelte 4 stores or reactive statements.
- **Modern Go idioms** — structured logging via `log/slog`, OpenTelemetry tracing throughout.
- **No direct MinIO/PocketBase client calls** outside the `internal/storage/` package.
- **Secrets via Doppler** — never use `.env` files. All secrets are injected by Doppler CLI.
- **CI/CD is Gitea Actions** (`.gitea/workflows/`), not GitHub Actions. Use `gitea.ref_name`/`gitea.sha` variables.
- **Git hooks** in `.githooks/` — enable with `just setup`.
- **i18n**: translation files live in `ui/messages/{en,es,fr,de,pt}.json`; run `npm run paraglide` after editing them.
- **Error tracking**: GlitchTip with per-service DSNs (backend id/2, runner id/3, UI id/1) stored in Doppler.

HOMELAB_SECRETS_SETUP.md (new file, 60 lines)

@@ -0,0 +1,60 @@
# Homelab Deployment Secrets Setup
The release workflow now includes automatic deployment to the homelab runner server. You need to add these secrets to Gitea.
## Required Secrets
Go to: `https://gitea.kalekber.cc/kamil/libnovel/settings/secrets/actions`
### 1. HOMELAB_HOST
```
192.168.0.109
```
### 2. HOMELAB_USER
```
root
```
### 3. HOMELAB_SSH_KEY
If you want to use the same SSH key as prod:
- Copy the value from `PROD_SSH_KEY` secret
If you want a separate key:
```bash
# On your local machine or CI runner
cat ~/.ssh/id_rsa # or your preferred key
```
### 4. HOMELAB_SSH_KNOWN_HOSTS
Run this when the homelab server is reachable:
```bash
ssh-keyscan -H 192.168.0.109 2>/dev/null
```
Expected output format:
```
|1|base64hash...|192.168.0.109 ssh-rsa AAAAB3NzaC...
|1|base64hash...|192.168.0.109 ecdsa-sha2-nistp256 AAAAE2...
|1|base64hash...|192.168.0.109 ssh-ed25519 AAAAC3...
```
## Testing
After adding the secrets, the next release (e.g., v4.1.10) will automatically:
1. Build all Docker images
2. Deploy to prod (165.22.70.138) ✅
3. Deploy to homelab (192.168.0.109) ✅ NEW
4. Create a Gitea release
Both deployments run in parallel for faster releases.
## Troubleshooting
If the homelab deployment fails:
- Check that the secrets are set correctly
- Verify SSH access: `ssh root@192.168.0.109`
- Check Doppler config exists: `doppler configs --project libnovel`
- Manually test: `cd /opt/libnovel-runner && doppler run --project libnovel --config prd_homelab -- docker compose pull runner`


@@ -27,7 +27,10 @@ RUN --mount=type=cache,target=/root/go/pkg/mod \
-o /out/runner ./cmd/runner && \
CGO_ENABLED=0 GOOS=linux go build \
-ldflags="-s -w" \
-    -o /out/healthcheck ./cmd/healthcheck
+    -o /out/healthcheck ./cmd/healthcheck && \
+    CGO_ENABLED=0 GOOS=linux go build \
+    -ldflags="-s -w" \
+    -o /out/pocketbase ./cmd/pocketbase
# ── backend service ──────────────────────────────────────────────────────────
# Uses Alpine (not distroless) so ffmpeg is available for on-demand voice
@@ -40,6 +43,18 @@ COPY --from=builder /out/backend /backend
USER appuser
ENTRYPOINT ["/backend"]
# ── pocketbase service ───────────────────────────────────────────────────────
# Runs the custom PocketBase binary with Go migrations baked in.
# On every `serve` startup it applies any pending migrations automatically.
# Data is stored in /pb_data (mounted as a Docker volume in production).
FROM alpine:3.21 AS pocketbase
RUN apk add --no-cache ca-certificates wget
COPY --from=builder /out/pocketbase /pocketbase
RUN mkdir -p /pb_data
VOLUME /pb_data
EXPOSE 8090
CMD ["/pocketbase", "serve", "--dir", "/pb_data", "--http", "0.0.0.0:8090"]
# ── runner service ───────────────────────────────────────────────────────────
# Uses Alpine (not distroless) so ffmpeg is available for WAV→MP3 transcoding
# when pocket-tts voices are used.


@@ -26,6 +26,7 @@ import (
"github.com/hibiken/asynq"
"github.com/libnovel/backend/internal/asynqqueue"
"github.com/libnovel/backend/internal/backend"
"github.com/libnovel/backend/internal/cfai"
"github.com/libnovel/backend/internal/config"
"github.com/libnovel/backend/internal/kokoro"
"github.com/libnovel/backend/internal/meili"
@@ -114,6 +115,33 @@ func run() error {
log.Info("POCKET_TTS_URL not set — pocket-tts voices unavailable in backend")
}
// ── Cloudflare Workers AI (voice sample generation + audio-stream live TTS) ──
var cfaiClient cfai.Client
if cfg.CFAI.AccountID != "" && cfg.CFAI.APIToken != "" {
cfaiClient = cfai.New(cfg.CFAI.AccountID, cfg.CFAI.APIToken, cfg.CFAI.Model)
log.Info("cloudflare AI TTS enabled", "model", cfg.CFAI.Model)
} else {
log.Info("CFAI_ACCOUNT_ID/CFAI_API_TOKEN not set — CF AI voices unavailable in backend")
}
// ── Cloudflare Workers AI Image Generation ────────────────────────────────
var imageGenClient cfai.ImageGenClient
if cfg.CFAI.AccountID != "" && cfg.CFAI.APIToken != "" {
imageGenClient = cfai.NewImageGen(cfg.CFAI.AccountID, cfg.CFAI.APIToken)
log.Info("cloudflare AI image generation enabled")
} else {
log.Info("CFAI_ACCOUNT_ID/CFAI_API_TOKEN not set — image generation unavailable")
}
// ── Cloudflare Workers AI Text Generation ─────────────────────────────────
var textGenClient cfai.TextGenClient
if cfg.CFAI.AccountID != "" && cfg.CFAI.APIToken != "" {
textGenClient = cfai.NewTextGen(cfg.CFAI.AccountID, cfg.CFAI.APIToken)
log.Info("cloudflare AI text generation enabled")
} else {
log.Info("CFAI_ACCOUNT_ID/CFAI_API_TOKEN not set — text generation unavailable")
}
// ── Meilisearch (search reads only; indexing is the runner's job) ────────
var searchIndex meili.Client
if cfg.Meilisearch.URL != "" {
@@ -149,21 +177,31 @@ func run() error {
DefaultVoice: cfg.Kokoro.DefaultVoice,
Version: version,
Commit: commit,
AdminToken: cfg.HTTP.AdminToken,
},
backend.Dependencies{
BookReader: store,
RankingStore: store,
AudioStore: store,
TranslationStore: store,
PresignStore: store,
ProgressStore: store,
CoverStore: store,
Producer: producer,
TaskReader: store,
SearchIndex: searchIndex,
Kokoro: kokoroClient,
PocketTTS: pocketTTSClient,
Log: log,
BookReader: store,
RankingStore: store,
AudioStore: store,
TranslationStore: store,
PresignStore: store,
ProgressStore: store,
CoverStore: store,
ChapterImageStore: store,
Producer: producer,
TaskReader: store,
ImportFileStore: store,
SearchIndex: searchIndex,
Kokoro: kokoroClient,
PocketTTS: pocketTTSClient,
CFAI: cfaiClient,
ImageGen: imageGenClient,
TextGen: textGenClient,
BookWriter: store,
AIJobStore: store,
BookAdminStore: store,
NotificationStore: store,
Log: log,
},
)
@@ -200,6 +238,10 @@ func (n *noopKokoro) StreamAudioMP3(_ context.Context, _, _ string) (io.ReadClos
return nil, fmt.Errorf("kokoro not configured (KOKORO_URL is empty)")
}
func (n *noopKokoro) StreamAudioWAV(_ context.Context, _, _ string) (io.ReadCloser, error) {
return nil, fmt.Errorf("kokoro not configured (KOKORO_URL is empty)")
}
func (n *noopKokoro) ListVoices(_ context.Context) ([]string, error) {
return nil, nil
}


@@ -0,0 +1,47 @@
// Command pocketbase is a thin wrapper that runs PocketBase as a Go framework
// with version-controlled Go migrations.
//
// On every `serve`, PocketBase automatically applies any pending migrations from
// the migrations/ package before accepting traffic.
//
// Usage (Docker):
//
// ./pocketbase serve --dir /pb_data --http 0.0.0.0:8090
//
// Migration workflow:
//
// # Generate a timestamped stub:
// go run ./cmd/pocketbase migrate create "description"
// # Apply manually (also runs automatically on serve):
// go run ./cmd/pocketbase migrate up
// # Revert last migration:
// go run ./cmd/pocketbase migrate down 1
// # After migrating an existing install, mark existing schema as done:
// go run ./cmd/pocketbase migrate history-sync
package main
import (
"log"
"github.com/pocketbase/pocketbase"
"github.com/pocketbase/pocketbase/plugins/migratecmd"
// Register all migrations via init().
_ "github.com/libnovel/backend/migrations"
)
func main() {
app := pocketbase.New()
// Register the migrate sub-command.
// Automigrate: false — migrations are written by hand, never auto-generated
// from Admin UI changes. Pending migrations still apply automatically on
// every `serve` regardless of this flag.
migratecmd.MustRegister(app, app.RootCmd, migratecmd.Config{
Automigrate: false,
})
if err := app.Start(); err != nil {
log.Fatal(err)
}
}


@@ -23,6 +23,7 @@ import (
"github.com/getsentry/sentry-go"
"github.com/libnovel/backend/internal/asynqqueue"
"github.com/libnovel/backend/internal/browser"
"github.com/libnovel/backend/internal/cfai"
"github.com/libnovel/backend/internal/config"
"github.com/libnovel/backend/internal/kokoro"
"github.com/libnovel/backend/internal/libretranslate"
@@ -33,6 +34,7 @@ import (
"github.com/libnovel/backend/internal/runner"
"github.com/libnovel/backend/internal/storage"
"github.com/libnovel/backend/internal/taskqueue"
"github.com/libnovel/backend/internal/webpush"
)
// version and commit are set at build time via -ldflags.
@@ -130,6 +132,15 @@ func run() error {
log.Warn("POCKET_TTS_URL not set — pocket-tts voice tasks will fail")
}
// ── Cloudflare Workers AI ────────────────────────────────────────────────
var cfaiClient cfai.Client
if cfg.CFAI.AccountID != "" && cfg.CFAI.APIToken != "" {
cfaiClient = cfai.New(cfg.CFAI.AccountID, cfg.CFAI.APIToken, cfg.CFAI.Model)
log.Info("cloudflare AI TTS enabled", "model", cfg.CFAI.Model)
} else {
log.Info("CFAI_ACCOUNT_ID/CFAI_API_TOKEN not set — CF AI voice tasks will fail")
}
// ── LibreTranslate ──────────────────────────────────────────────────────
ltClient := libretranslate.New(cfg.LibreTranslate.URL, cfg.LibreTranslate.APIKey)
if ltClient != nil {
@@ -180,19 +191,35 @@ func run() error {
log.Info("runner: poll mode — using PocketBase for task dispatch")
}
// ── Web Push ─────────────────────────────────────────────────────────────
var pushSender *webpush.Sender
if cfg.VAPID.PublicKey != "" && cfg.VAPID.PrivateKey != "" {
pushSender = webpush.New(cfg.VAPID.PublicKey, cfg.VAPID.PrivateKey, cfg.VAPID.Subject, log)
log.Info("runner: web push notifications enabled")
} else {
log.Info("runner: VAPID_PUBLIC_KEY/VAPID_PRIVATE_KEY not set — push notifications disabled")
}
deps := runner.Dependencies{
Consumer: consumer,
BookWriter: store,
BookReader: store,
AudioStore: store,
CoverStore: store,
TranslationStore: store,
SearchIndex: searchIndex,
Novel: novel,
Kokoro: kokoroClient,
PocketTTS: pocketTTSClient,
LibreTranslate: ltClient,
Log: log,
BookWriter: store,
BookReader: store,
AudioStore: store,
CoverStore: store,
TranslationStore: store,
BookImport: storage.NewBookImporter(store),
ImportChapterStore: store,
ChapterIngester: store,
SearchIndex: searchIndex,
Novel: novel,
Kokoro: kokoroClient,
PocketTTS: pocketTTSClient,
CFAI: cfaiClient,
LibreTranslate: ltClient,
Notifier: store,
WebPush: pushSender,
Store: store,
Log: log,
}
r := runner.New(rCfg, deps)
@@ -227,6 +254,10 @@ func (n *noopKokoro) StreamAudioMP3(_ context.Context, _, _ string) (io.ReadClos
return nil, fmt.Errorf("kokoro not configured (KOKORO_URL is empty)")
}
func (n *noopKokoro) StreamAudioWAV(_ context.Context, _, _ string) (io.ReadCloser, error) {
return nil, fmt.Errorf("kokoro not configured (KOKORO_URL is empty)")
}
func (n *noopKokoro) ListVoices(_ context.Context) ([]string, error) {
return nil, nil
}


@@ -3,70 +3,99 @@ module github.com/libnovel/backend
go 1.26.1
require (
github.com/SherClockHolmes/webpush-go v1.4.0
github.com/getsentry/sentry-go v0.43.0
github.com/hibiken/asynq v0.26.0
github.com/hibiken/asynq/x v0.0.0-20260203063626-d704b68a426d
github.com/meilisearch/meilisearch-go v0.36.1
github.com/minio/minio-go/v7 v7.0.98
golang.org/x/net v0.51.0
github.com/pdfcpu/pdfcpu v0.11.1
github.com/pocketbase/pocketbase v0.36.9
github.com/prometheus/client_golang v1.23.2
github.com/redis/go-redis/v9 v9.18.0
github.com/yuin/goldmark v1.8.2
go.opentelemetry.io/contrib/bridges/otelslog v0.17.0
go.opentelemetry.io/contrib/instrumentation/net/http/otelhttp v0.67.0
go.opentelemetry.io/otel v1.42.0
go.opentelemetry.io/otel/exporters/otlp/otlplog/otlploghttp v0.18.0
go.opentelemetry.io/otel/exporters/otlp/otlptrace/otlptracehttp v1.42.0
go.opentelemetry.io/otel/log v0.18.0
go.opentelemetry.io/otel/sdk v1.42.0
go.opentelemetry.io/otel/sdk/log v0.18.0
golang.org/x/net v0.52.0
)
require (
github.com/andybalholm/brotli v1.1.1 // indirect
github.com/asaskevich/govalidator v0.0.0-20230301143203-a9d515a09cc2 // indirect
github.com/beorn7/perks v1.0.1 // indirect
github.com/cenkalti/backoff/v5 v5.0.3 // indirect
github.com/cespare/xxhash/v2 v2.3.0 // indirect
github.com/davecgh/go-spew v1.1.1 // indirect
github.com/clipperhouse/uax29/v2 v2.2.0 // indirect
github.com/dgryski/go-rendezvous v0.0.0-20200823014737-9f7001d12a5f // indirect
github.com/disintegration/imaging v1.6.2 // indirect
github.com/domodwyer/mailyak/v3 v3.6.2 // indirect
github.com/dustin/go-humanize v1.0.1 // indirect
github.com/fatih/color v1.19.0 // indirect
github.com/felixge/httpsnoop v1.0.4 // indirect
github.com/getsentry/sentry-go v0.43.0 // indirect
github.com/gabriel-vasile/mimetype v1.4.13 // indirect
github.com/ganigeorgiev/fexpr v0.5.0 // indirect
github.com/go-ini/ini v1.67.0 // indirect
github.com/go-logr/logr v1.4.3 // indirect
github.com/go-logr/stdr v1.2.2 // indirect
github.com/go-ozzo/ozzo-validation/v4 v4.3.0 // indirect
github.com/golang-jwt/jwt/v5 v5.3.1 // indirect
github.com/google/uuid v1.6.0 // indirect
github.com/grpc-ecosystem/grpc-gateway/v2 v2.28.0 // indirect
github.com/hibiken/asynq v0.26.0 // indirect
github.com/hibiken/asynq/x v0.0.0-20260203063626-d704b68a426d // indirect
github.com/hhrutter/lzw v1.0.0 // indirect
github.com/hhrutter/pkcs7 v0.2.0 // indirect
github.com/hhrutter/tiff v1.0.2 // indirect
github.com/inconshreveable/mousetrap v1.1.0 // indirect
github.com/klauspost/compress v1.18.2 // indirect
github.com/klauspost/cpuid/v2 v2.2.11 // indirect
github.com/klauspost/crc32 v1.3.0 // indirect
github.com/meilisearch/meilisearch-go v0.36.1 // indirect
github.com/mattn/go-colorable v0.1.14 // indirect
github.com/mattn/go-isatty v0.0.21 // indirect
github.com/mattn/go-runewidth v0.0.19 // indirect
github.com/minio/crc64nvme v1.1.1 // indirect
github.com/minio/md5-simd v1.1.2 // indirect
github.com/munnerz/goautoneg v0.0.0-20191010083416-a7dc8b61c822 // indirect
github.com/ncruces/go-strftime v1.0.0 // indirect
github.com/philhofer/fwd v1.2.0 // indirect
github.com/pmezard/go-difflib v1.0.0 // indirect
github.com/prometheus/client_golang v1.23.2 // indirect
github.com/pkg/errors v0.9.1 // indirect
github.com/pocketbase/dbx v1.12.0 // indirect
github.com/prometheus/client_model v0.6.2 // indirect
github.com/prometheus/common v0.66.1 // indirect
github.com/prometheus/procfs v0.16.1 // indirect
github.com/redis/go-redis/v9 v9.18.0 // indirect
github.com/remyoudompheng/bigfft v0.0.0-20230129092748-24d4a6f8daec // indirect
github.com/robfig/cron/v3 v3.0.1 // indirect
github.com/rs/xid v1.6.0 // indirect
github.com/spf13/cast v1.10.0 // indirect
github.com/spf13/cobra v1.10.2 // indirect
github.com/spf13/pflag v1.0.10 // indirect
github.com/tinylib/msgp v1.6.1 // indirect
github.com/yuin/goldmark v1.8.2 // indirect
go.opentelemetry.io/auto/sdk v1.2.1 // indirect
go.opentelemetry.io/contrib/bridges/otelslog v0.17.0 // indirect
go.opentelemetry.io/contrib/instrumentation/net/http/otelhttp v0.67.0 // indirect
go.opentelemetry.io/otel v1.42.0 // indirect
go.opentelemetry.io/otel/exporters/otlp/otlplog/otlploghttp v0.18.0 // indirect
go.opentelemetry.io/otel/exporters/otlp/otlptrace v1.42.0 // indirect
go.opentelemetry.io/otel/exporters/otlp/otlptrace/otlptracehttp v1.42.0 // indirect
go.opentelemetry.io/otel/log v0.18.0 // indirect
go.opentelemetry.io/otel/metric v1.42.0 // indirect
go.opentelemetry.io/otel/sdk v1.42.0 // indirect
go.opentelemetry.io/otel/sdk/log v0.18.0 // indirect
go.opentelemetry.io/otel/trace v1.42.0 // indirect
go.opentelemetry.io/proto/otlp v1.9.0 // indirect
go.uber.org/atomic v1.11.0 // indirect
go.yaml.in/yaml/v2 v2.4.2 // indirect
go.yaml.in/yaml/v3 v3.0.4 // indirect
golang.org/x/crypto v0.48.0 // indirect
golang.org/x/sys v0.41.0 // indirect
golang.org/x/text v0.34.0 // indirect
golang.org/x/crypto v0.49.0 // indirect
golang.org/x/image v0.38.0 // indirect
golang.org/x/oauth2 v0.36.0 // indirect
golang.org/x/sync v0.20.0 // indirect
golang.org/x/sys v0.43.0 // indirect
golang.org/x/text v0.36.0 // indirect
golang.org/x/time v0.14.0 // indirect
google.golang.org/genproto/googleapis/api v0.0.0-20260209200024-4cfbd4190f57 // indirect
google.golang.org/genproto/googleapis/rpc v0.0.0-20260209200024-4cfbd4190f57 // indirect
google.golang.org/grpc v1.79.2 // indirect
google.golang.org/protobuf v1.36.11 // indirect
gopkg.in/yaml.v3 v3.0.1 // indirect
gopkg.in/yaml.v2 v2.4.0 // indirect
modernc.org/libc v1.70.0 // indirect
modernc.org/mathutil v1.7.1 // indirect
modernc.org/memory v1.11.0 // indirect
modernc.org/sqlite v1.48.2 // indirect
)


@@ -1,21 +1,48 @@
github.com/SherClockHolmes/webpush-go v1.4.0 h1:ocnzNKWN23T9nvHi6IfyrQjkIc0oJWv1B1pULsf9i3s=
github.com/SherClockHolmes/webpush-go v1.4.0/go.mod h1:XSq8pKX11vNV8MJEMwjrlTkxhAj1zKfxmyhdV7Pd6UA=
github.com/andybalholm/brotli v1.1.1 h1:PR2pgnyFznKEugtsUo0xLdDop5SKXd5Qf5ysW+7XdTA=
github.com/andybalholm/brotli v1.1.1/go.mod h1:05ib4cKhjx3OQYUY22hTVd34Bc8upXjOLL2rKwwZBoA=
github.com/asaskevich/govalidator v0.0.0-20200108200545-475eaeb16496/go.mod h1:oGkLhpf+kjZl6xBf758TQhh5XrAeiJv/7FRz/2spLIg=
github.com/asaskevich/govalidator v0.0.0-20230301143203-a9d515a09cc2 h1:DklsrG3dyBCFEj5IhUbnKptjxatkF07cF2ak3yi77so=
github.com/asaskevich/govalidator v0.0.0-20230301143203-a9d515a09cc2/go.mod h1:WaHUgvxTVq04UNunO+XhnAqY/wQc+bxr74GqbsZ/Jqw=
github.com/beorn7/perks v1.0.1 h1:VlbKKnNfV8bJzeqoa4cOKqO6bYr3WgKZxO8Z16+hsOM=
github.com/beorn7/perks v1.0.1/go.mod h1:G2ZrVWU2WbWT9wwq4/hrbKbnv/1ERSJQ0ibhJ6rlkpw=
github.com/bsm/ginkgo/v2 v2.12.0 h1:Ny8MWAHyOepLGlLKYmXG4IEkioBysk6GpaRTLC8zwWs=
github.com/bsm/ginkgo/v2 v2.12.0/go.mod h1:SwYbGRRDovPVboqFv0tPTcG1sN61LM1Z4ARdbAV9g4c=
github.com/bsm/gomega v1.27.10 h1:yeMWxP2pV2fG3FgAODIY8EiRE3dy0aeFYt4l7wh6yKA=
github.com/bsm/gomega v1.27.10/go.mod h1:JyEr/xRbxbtgWNi8tIEVPUYZ5Dzef52k01W3YH0H+O0=
github.com/cenkalti/backoff/v5 v5.0.3 h1:ZN+IMa753KfX5hd8vVaMixjnqRZ3y8CuJKRKj1xcsSM=
github.com/cenkalti/backoff/v5 v5.0.3/go.mod h1:rkhZdG3JZukswDf7f0cwqPNk4K0sa+F97BxZthm/crw=
github.com/cespare/xxhash/v2 v2.3.0 h1:UL815xU9SqsFlibzuggzjXhog7bL6oX9BbNZnL2UFvs=
github.com/cespare/xxhash/v2 v2.3.0/go.mod h1:VGX0DQ3Q6kWi7AoAeZDth3/j3BFtOZR5XLFGgcrjCOs=
github.com/clipperhouse/uax29/v2 v2.2.0 h1:ChwIKnQN3kcZteTXMgb1wztSgaU+ZemkgWdohwgs8tY=
github.com/clipperhouse/uax29/v2 v2.2.0/go.mod h1:EFJ2TJMRUaplDxHKj1qAEhCtQPW2tJSwu5BF98AuoVM=
github.com/cpuguy83/go-md2man/v2 v2.0.6/go.mod h1:oOW0eioCTA6cOiMLiUPZOpcVxMig6NIQQ7OS05n1F4g=
github.com/davecgh/go-spew v1.1.0/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
github.com/davecgh/go-spew v1.1.1 h1:vj9j/u1bqnvCEfJOwUhtlOARqs3+rkHYY13jYWTU97c=
github.com/davecgh/go-spew v1.1.1/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
github.com/dgryski/go-rendezvous v0.0.0-20200823014737-9f7001d12a5f h1:lO4WD4F/rVNCu3HqELle0jiPLLBs70cWOduZpkS1E78=
github.com/dgryski/go-rendezvous v0.0.0-20200823014737-9f7001d12a5f/go.mod h1:cuUVRXasLTGF7a8hSLbxyZXjz+1KgoB3wDUb6vlszIc=
github.com/disintegration/imaging v1.6.2 h1:w1LecBlG2Lnp8B3jk5zSuNqd7b4DXhcjwek1ei82L+c=
github.com/disintegration/imaging v1.6.2/go.mod h1:44/5580QXChDfwIclfc/PCwrr44amcmDAg8hxG0Ewe4=
github.com/domodwyer/mailyak/v3 v3.6.2 h1:x3tGMsyFhTCaxp6ycgR0FE/bu5QiNp+hetUuCOBXMn8=
github.com/domodwyer/mailyak/v3 v3.6.2/go.mod h1:lOm/u9CyCVWHeaAmHIdF4RiKVxKUT/H5XX10lIKAL6c=
github.com/dustin/go-humanize v1.0.1 h1:GzkhY7T5VNhEkwH0PVJgjz+fX1rhBrR7pRT3mDkpeCY=
github.com/dustin/go-humanize v1.0.1/go.mod h1:Mu1zIs6XwVuF/gI1OepvI0qD18qycQx+mFykh5fBlto=
github.com/fatih/color v1.19.0 h1:Zp3PiM21/9Ld6FzSKyL5c/BULoe/ONr9KlbYVOfG8+w=
github.com/fatih/color v1.19.0/go.mod h1:zNk67I0ZUT1bEGsSGyCZYZNrHuTkJJB+r6Q9VuMi0LE=
github.com/felixge/httpsnoop v1.0.4 h1:NFTV2Zj1bL4mc9sqWACXbQFVBBg2W3GPvqp8/ESS2Wg=
github.com/felixge/httpsnoop v1.0.4/go.mod h1:m8KPJKqk1gH5J9DgRY2ASl2lWCfGKXixSwevea8zH2U=
github.com/frankban/quicktest v1.14.6 h1:7Xjx+VpznH+oBnejlPUj8oUpdxnVs4f8XU8WnHkI4W8=
github.com/frankban/quicktest v1.14.6/go.mod h1:4ptaffx2x8+WTWXmUCuVU6aPUX1/Mz7zb5vbUoiM6w0=
github.com/gabriel-vasile/mimetype v1.4.13 h1:46nXokslUBsAJE/wMsp5gtO500a4F3Nkz9Ufpk2AcUM=
github.com/gabriel-vasile/mimetype v1.4.13/go.mod h1:d+9Oxyo1wTzWdyVUPMmXFvp4F9tea18J8ufA774AB3s=
github.com/ganigeorgiev/fexpr v0.5.0 h1:XA9JxtTE/Xm+g/JFI6RfZEHSiQlk+1glLvRK1Lpv/Tk=
github.com/ganigeorgiev/fexpr v0.5.0/go.mod h1:RyGiGqmeXhEQ6+mlGdnUleLHgtzzu/VGO2WtJkF5drE=
github.com/getsentry/sentry-go v0.43.0 h1:XbXLpFicpo8HmBDaInk7dum18G9KSLcjZiyUKS+hLW4=
github.com/getsentry/sentry-go v0.43.0/go.mod h1:XDotiNZbgf5U8bPDUAfvcFmOnMQQceESxyKaObSssW0=
github.com/go-errors/errors v1.4.2 h1:J6MZopCL4uSllY1OfXM374weqZFFItUbrImctkmUxIA=
github.com/go-errors/errors v1.4.2/go.mod h1:sIVyrIiJhuEF+Pj9Ebtd6P/rEYROXFi3BopGUQ5a5Og=
github.com/go-ini/ini v1.67.0 h1:z6ZrTEZqSWOTyH2FlglNbNgARyHG8oLW9gMELqKr06A=
github.com/go-ini/ini v1.67.0/go.mod h1:ByCAeIL28uOIIG0E3PJtZPDL8WnHpFKFOtgjp+3Ies8=
github.com/go-logr/logr v1.2.2/go.mod h1:jdQByPbusPIv2/zmleS9BjJVeZ6kBagPoEUsqbVz/1A=
@@ -23,16 +50,39 @@ github.com/go-logr/logr v1.4.3 h1:CjnDlHq8ikf6E492q6eKboGOC0T8CDaOvkHCIg8idEI=
github.com/go-logr/logr v1.4.3/go.mod h1:9T104GzyrTigFIr8wt5mBrctHMim0Nb2HLGrmQ40KvY=
github.com/go-logr/stdr v1.2.2 h1:hSWxHoqTgW2S2qGc0LTAI563KZ5YKYRhT3MFKZMbjag=
github.com/go-logr/stdr v1.2.2/go.mod h1:mMo/vtBO5dYbehREoey6XUKy/eSumjCCveDpRre4VKE=
github.com/go-ozzo/ozzo-validation/v4 v4.3.0 h1:byhDUpfEwjsVQb1vBunvIjh2BHQ9ead57VkAEY4V+Es=
github.com/go-ozzo/ozzo-validation/v4 v4.3.0/go.mod h1:2NKgrcHl3z6cJs+3Oo940FPRiTzuqKbvfrL2RxCj6Ew=
github.com/go-sql-driver/mysql v1.4.1 h1:g24URVg0OFbNUTx9qqY1IRZ9D9z3iPyi5zKhQZpNwpA=
github.com/go-sql-driver/mysql v1.4.1/go.mod h1:zAC/RDZ24gD3HViQzih4MyKcchzm+sOG5ZlKdlhCg5w=
github.com/golang-jwt/jwt/v5 v5.2.1/go.mod h1:pqrtFR0X4osieyHYxtmOUWsAWrfe1Q5UVIyoH402zdk=
github.com/golang-jwt/jwt/v5 v5.3.1 h1:kYf81DTWFe7t+1VvL7eS+jKFVWaUnK9cB1qbwn63YCY=
github.com/golang-jwt/jwt/v5 v5.3.1/go.mod h1:fxCRLWMO43lRc8nhHWY6LGqRcf+1gQWArsqaEUEa5bE=
github.com/golang/protobuf v1.3.1/go.mod h1:6lQm79b+lXiMfvg/cZm0SGofjICqVBUtrP5yJMmIC1U=
github.com/golang/protobuf v1.5.4 h1:i7eJL8qZTpSEXOPTxNKhASYpMn+8e5Q6AdndVa1dWek=
github.com/golang/protobuf v1.5.4/go.mod h1:lnTiLA8Wa4RWRcIUkrtSVa5nRhsEGBg48fD6rSs7xps=
github.com/google/go-cmp v0.6.0/go.mod h1:17dUlkBOakJ0+DkrSSNjCkIjxS6bF9zb3elmeNGIjoY=
github.com/google/go-cmp v0.7.0 h1:wk8382ETsv4JYUZwIsn6YpYiWiBsYLSJiTsyBybVuN8=
github.com/google/go-cmp v0.7.0/go.mod h1:pXiqmnSA92OHEEa9HXL2W4E7lf9JzCmGVUdgjX3N/iU=
github.com/google/pprof v0.0.0-20260402051712-545e8a4df936 h1:EwtI+Al+DeppwYX2oXJCETMO23COyaKGP6fHVpkpWpg=
github.com/google/pprof v0.0.0-20260402051712-545e8a4df936/go.mod h1:MxpfABSjhmINe3F1It9d+8exIHFvUqtLIRCdOGNXqiI=
github.com/google/uuid v1.6.0 h1:NIvaJDMOsjHA8n1jAhLSgzrAzy1Hgr+hNrb57e+94F0=
github.com/google/uuid v1.6.0/go.mod h1:TIyPZe4MgqvfeYDBFedMoGGpEw/LqOeaOT+nhxU+yHo=
github.com/grpc-ecosystem/grpc-gateway/v2 v2.28.0 h1:HWRh5R2+9EifMyIHV7ZV+MIZqgz+PMpZ14Jynv3O2Zs=
github.com/grpc-ecosystem/grpc-gateway/v2 v2.28.0/go.mod h1:JfhWUomR1baixubs02l85lZYYOm7LV6om4ceouMv45c=
github.com/hashicorp/golang-lru/v2 v2.0.7 h1:a+bsQ5rvGLjzHuww6tVxozPZFVghXaHOwFs4luLUK2k=
github.com/hashicorp/golang-lru/v2 v2.0.7/go.mod h1:QeFd9opnmA6QUJc5vARoKUSoFhyfM2/ZepoAG6RGpeM=
github.com/hhrutter/lzw v1.0.0 h1:laL89Llp86W3rRs83LvKbwYRx6INE8gDn0XNb1oXtm0=
github.com/hhrutter/lzw v1.0.0/go.mod h1:2HC6DJSn/n6iAZfgM3Pg+cP1KxeWc3ezG8bBqW5+WEo=
github.com/hhrutter/pkcs7 v0.2.0 h1:i4HN2XMbGQpZRnKBLsUwO3dSckzgX142TNqY/KfXg+I=
github.com/hhrutter/pkcs7 v0.2.0/go.mod h1:aEzKz0+ZAlz7YaEMY47jDHL14hVWD6iXt0AgqgAvWgE=
github.com/hhrutter/tiff v1.0.2 h1:7H3FQQpKu/i5WaSChoD1nnJbGx4MxU5TlNqqpxw55z8=
github.com/hhrutter/tiff v1.0.2/go.mod h1:pcOeuK5loFUE7Y/WnzGw20YxUdnqjY1P0Jlcieb/cCw=
github.com/hibiken/asynq v0.26.0 h1:1Zxr92MlDnb1Zt/QR5g2vSCqUS03i95lUfqx5X7/wrw=
github.com/hibiken/asynq v0.26.0/go.mod h1:Qk4e57bTnWDoyJ67VkchuV6VzSM9IQW2nPvAGuDyw58=
github.com/hibiken/asynq/x v0.0.0-20260203063626-d704b68a426d h1:Ld5m8EIK5QVOq/owOexKIbETij3skACg4eU1pArHsrw=
github.com/hibiken/asynq/x v0.0.0-20260203063626-d704b68a426d/go.mod h1:hhpStehaxSGg3ib9wJXzw5AXY1YS6lQ9BNavAgPbIhE=
github.com/inconshreveable/mousetrap v1.1.0 h1:wN+x4NVGpMsO7ErUn/mUI3vEoE6Jt13X2s0bqwp9tc8=
github.com/inconshreveable/mousetrap v1.1.0/go.mod h1:vpF70FUmC8bwa3OWnCshd2FqLfsEA9PFc4w1p2J65bw=
github.com/klauspost/compress v1.18.2 h1:iiPHWW0YrcFgpBYhsA6D1+fqHssJscY/Tm/y2Uqnapk=
github.com/klauspost/compress v1.18.2/go.mod h1:R0h/fSBs8DE4ENlcrlib3PsXS61voFxhIs2DeRhCvJ4=
github.com/klauspost/cpuid/v2 v2.0.1/go.mod h1:FInQzS24/EEf25PyTYn52gqo7WaD8xa0213Md/qVLRg=
@@ -40,6 +90,18 @@ github.com/klauspost/cpuid/v2 v2.2.11 h1:0OwqZRYI2rFrjS4kvkDnqJkKHdHaRnCm68/DY4O
github.com/klauspost/cpuid/v2 v2.2.11/go.mod h1:hqwkgyIinND0mEev00jJYCxPNVRVXFQeu1XKlok6oO0=
github.com/klauspost/crc32 v1.3.0 h1:sSmTt3gUt81RP655XGZPElI0PelVTZ6YwCRnPSupoFM=
github.com/klauspost/crc32 v1.3.0/go.mod h1:D7kQaZhnkX/Y0tstFGf8VUzv2UofNGqCjnC3zdHB0Hw=
github.com/kr/pretty v0.3.1 h1:flRD4NNwYAUpkphVc1HcthR4KEIFJ65n8Mw5qdRn3LE=
github.com/kr/pretty v0.3.1/go.mod h1:hoEshYVHaxMs3cyo3Yncou5ZscifuDolrwPKZanG3xk=
github.com/kr/text v0.2.0 h1:5Nx0Ya0ZqY2ygV366QzturHI13Jq95ApcVaJBhpS+AY=
github.com/kr/text v0.2.0/go.mod h1:eLer722TekiGuMkidMxC/pM04lWEeraHUUmBw8l2grE=
github.com/kylelemons/godebug v1.1.0 h1:RPNrshWIDI6G2gRW9EHilWtl7Z6Sb1BR0xunSBf0SNc=
github.com/kylelemons/godebug v1.1.0/go.mod h1:9/0rRGxNHcop5bhtWyNeEfOS8JIWk580+fNqagV/RAw=
github.com/mattn/go-colorable v0.1.14 h1:9A9LHSqF/7dyVVX6g0U9cwm9pG3kP9gSzcuIPHPsaIE=
github.com/mattn/go-colorable v0.1.14/go.mod h1:6LmQG8QLFO4G5z1gPvYEzlUgJ2wF+stgPZH1UqBm1s8=
github.com/mattn/go-isatty v0.0.21 h1:xYae+lCNBP7QuW4PUnNG61ffM4hVIfm+zUzDuSzYLGs=
github.com/mattn/go-isatty v0.0.21/go.mod h1:ZXfXG4SQHsB/w3ZeOYbR0PrPwLy+n6xiMrJlRFqopa4=
github.com/mattn/go-runewidth v0.0.19 h1:v++JhqYnZuu5jSKrk9RbgF5v4CGUjqRfBm05byFGLdw=
github.com/mattn/go-runewidth v0.0.19/go.mod h1:XBkDxAl56ILZc9knddidhrOlY5R/pDhgLpndooCuJAs=
github.com/meilisearch/meilisearch-go v0.36.1 h1:mJTCJE5g7tRvaqKco6DfqOuJEjX+rRltDEnkEC02Y0M=
github.com/meilisearch/meilisearch-go v0.36.1/go.mod h1:hWcR0MuWLSzHfbz9GGzIr3s9rnXLm1jqkmHkJPbUSvM=
github.com/minio/crc64nvme v1.1.1 h1:8dwx/Pz49suywbO+auHCBpCtlW1OfpcLN7wYgVR6wAI=
@@ -50,42 +112,61 @@ github.com/minio/minio-go/v7 v7.0.98 h1:MeAVKjLVz+XJ28zFcuYyImNSAh8Mq725uNW4beRi
github.com/minio/minio-go/v7 v7.0.98/go.mod h1:cY0Y+W7yozf0mdIclrttzo1Iiu7mEf9y7nk2uXqMOvM=
github.com/munnerz/goautoneg v0.0.0-20191010083416-a7dc8b61c822 h1:C3w9PqII01/Oq1c1nUAm88MOHcQC9l5mIlSMApZMrHA=
github.com/munnerz/goautoneg v0.0.0-20191010083416-a7dc8b61c822/go.mod h1:+n7T8mK8HuQTcFwEeznm/DIxMOiR9yIdICNftLE1DvQ=
github.com/ncruces/go-strftime v1.0.0 h1:HMFp8mLCTPp341M/ZnA4qaf7ZlsbTc+miZjCLOFAw7w=
github.com/ncruces/go-strftime v1.0.0/go.mod h1:Fwc5htZGVVkseilnfgOVb9mKy6w1naJmn9CehxcKcls=
github.com/pdfcpu/pdfcpu v0.11.1 h1:htHBSkGH5jMKWC6e0sihBFbcKZ8vG1M67c8/dJxhjas=
github.com/pdfcpu/pdfcpu v0.11.1/go.mod h1:pP3aGga7pRvwFWAm9WwFvo+V68DfANi9kxSQYioNYcw=
github.com/philhofer/fwd v1.2.0 h1:e6DnBTl7vGY+Gz322/ASL4Gyp1FspeMvx1RNDoToZuM=
github.com/philhofer/fwd v1.2.0/go.mod h1:RqIHx9QI14HlwKwm98g9Re5prTQ6LdeRQn+gXJFxsJM=
github.com/pingcap/errors v0.11.4 h1:lFuQV/oaUMGcD2tqt+01ROSmJs75VG1ToEOkZIZ4nE4=
github.com/pingcap/errors v0.11.4/go.mod h1:Oi8TUi2kEtXXLMJk9l1cGmz20kV3TaQ0usTwv5KuLY8=
github.com/pkg/errors v0.9.1 h1:FEBLx1zS214owpjy7qsBeixbURkuhQAwrK5UwLGTwt4=
github.com/pkg/errors v0.9.1/go.mod h1:bwawxfHBFNV+L2hUp1rHADufV3IMtnDRdf1r5NINEl0=
github.com/pmezard/go-difflib v1.0.0 h1:4DBwDE0NGyQoBHbLQYPwSUPoCMWR5BEzIk/f1lZbAQM=
github.com/pmezard/go-difflib v1.0.0/go.mod h1:iKH77koFhYxTK1pcRnkKkqfTogsbg7gZNVY4sRDYZ/4=
github.com/prometheus/client_golang v1.20.5 h1:cxppBPuYhUnsO6yo/aoRol4L7q7UFfdm+bR9r+8l63Y=
github.com/prometheus/client_golang v1.20.5/go.mod h1:PIEt8X02hGcP8JWbeHyeZ53Y/jReSnHgO035n//V5WE=
github.com/pocketbase/dbx v1.12.0 h1:/oLErM+A0b4xI0PWTGPqSDVjzix48PqI/bng2l0PzoA=
github.com/pocketbase/dbx v1.12.0/go.mod h1:xXRCIAKTHMgUCyCKZm55pUOdvFziJjQfXaWKhu2vhMs=
github.com/pocketbase/pocketbase v0.36.9 h1:x3mXMB4AwhTzJ34JZpZR7IQyUih7Fx1l86r0V/k4oW8=
github.com/pocketbase/pocketbase v0.36.9/go.mod h1:t3sMcAxGHrDAXNcZ+65cZxBMpFP1vBdI9DrghB4n5Gw=
github.com/prometheus/client_golang v1.23.2 h1:Je96obch5RDVy3FDMndoUsjAhG5Edi49h0RJWRi/o0o=
github.com/prometheus/client_golang v1.23.2/go.mod h1:Tb1a6LWHB3/SPIzCoaDXI4I8UHKeFTEQ1YCr+0Gyqmg=
github.com/prometheus/client_model v0.6.1 h1:ZKSh/rekM+n3CeS952MLRAdFwIKqeY8b62p8ais2e9E=
github.com/prometheus/client_model v0.6.1/go.mod h1:OrxVMOVHjw3lKMa8+x6HeMGkHMQyHDk9E3jmP2AmGiY=
github.com/prometheus/client_model v0.6.2 h1:oBsgwpGs7iVziMvrGhE53c/GrLUsZdHnqNwqPLxwZyk=
github.com/prometheus/client_model v0.6.2/go.mod h1:y3m2F6Gdpfy6Ut/GBsUqTWZqCUvMVzSfMLjcu6wAwpE=
github.com/prometheus/common v0.55.0 h1:KEi6DK7lXW/m7Ig5i47x0vRzuBsHuvJdi5ee6Y3G1dc=
github.com/prometheus/common v0.55.0/go.mod h1:2SECS4xJG1kd8XF9IcM1gMX6510RAEL65zxzNImwdc8=
github.com/prometheus/common v0.66.1 h1:h5E0h5/Y8niHc5DlaLlWLArTQI7tMrsfQjHV+d9ZoGs=
github.com/prometheus/common v0.66.1/go.mod h1:gcaUsgf3KfRSwHY4dIMXLPV0K/Wg1oZ8+SbZk/HH/dA=
github.com/prometheus/procfs v0.15.1 h1:YagwOFzUgYfKKHX6Dr+sHT7km/hxC76UB0learggepc=
github.com/prometheus/procfs v0.15.1/go.mod h1:fB45yRUv8NstnjriLhBQLuOUt+WW4BsoGhij/e3PBqk=
github.com/prometheus/procfs v0.16.1 h1:hZ15bTNuirocR6u0JZ6BAHHmwS1p8B4P6MRqxtzMyRg=
github.com/prometheus/procfs v0.16.1/go.mod h1:teAbpZRB1iIAJYREa1LsoWUXykVXA1KlTmWl8x/U+Is=
github.com/redis/go-redis/v9 v9.18.0 h1:pMkxYPkEbMPwRdenAzUNyFNrDgHx9U+DrBabWNfSRQs=
github.com/redis/go-redis/v9 v9.18.0/go.mod h1:k3ufPphLU5YXwNTUcCRXGxUoF1fqxnhFQmscfkCoDA0=
github.com/remyoudompheng/bigfft v0.0.0-20230129092748-24d4a6f8daec h1:W09IVJc94icq4NjY3clb7Lk8O1qJ8BdBEF8z0ibU0rE=
github.com/remyoudompheng/bigfft v0.0.0-20230129092748-24d4a6f8daec/go.mod h1:qqbHyh8v60DhA7CoWK5oRCqLrMHRGoxYCSS9EjAz6Eo=
github.com/robfig/cron/v3 v3.0.1 h1:WdRxkvbJztn8LMz/QEvLN5sBU+xKpSqwwUO1Pjr4qDs=
github.com/robfig/cron/v3 v3.0.1/go.mod h1:eQICP3HwyT7UooqI/z+Ov+PtYAWygg1TEWWzGIFLtro=
github.com/rogpeppe/go-internal v1.14.1 h1:UQB4HGPB6osV0SQTLymcB4TgvyWu6ZyliaW0tI/otEQ=
github.com/rogpeppe/go-internal v1.14.1/go.mod h1:MaRKkUm5W0goXpeCfT7UZI6fk/L7L7so1lCWt35ZSgc=
github.com/rs/xid v1.6.0 h1:fV591PaemRlL6JfRxGDEPl69wICngIQ3shQtzfy2gxU=
github.com/rs/xid v1.6.0/go.mod h1:7XoLgs4eV+QndskICGsho+ADou8ySMSjJKDIan90Nz0=
github.com/russross/blackfriday/v2 v2.1.0/go.mod h1:+Rmxgy9KzJVeS9/2gXHxylqXiyQDYRxCVz55jmeOWTM=
github.com/spf13/cast v1.10.0 h1:h2x0u2shc1QuLHfxi+cTJvs30+ZAHOGRic8uyGTDWxY=
github.com/spf13/cast v1.10.0/go.mod h1:jNfB8QC9IA6ZuY2ZjDp0KtFO2LZZlg4S/7bzP6qqeHo=
github.com/stretchr/testify v1.9.0 h1:HtqpIVDClZ4nwg75+f6Lvsy/wHu+3BoSGCbBAcpTsTg=
github.com/stretchr/testify v1.9.0/go.mod h1:r2ic/lqez/lEtzL7wO/rwa5dbSLXVDPFyf8C91i36aY=
github.com/spf13/cobra v1.10.2 h1:DMTTonx5m65Ic0GOoRY2c16WCbHxOOw6xxezuLaBpcU=
github.com/spf13/cobra v1.10.2/go.mod h1:7C1pvHqHw5A4vrJfjNwvOdzYu0Gml16OCs2GRiTUUS4=
github.com/spf13/pflag v1.0.9/go.mod h1:McXfInJRrz4CZXVZOBLb0bTZqETkiAhM9Iw0y3An2Bg=
github.com/spf13/pflag v1.0.10 h1:4EBh2KAYBwaONj6b2Ye1GiHfwjqyROoF4RwYO+vPwFk=
github.com/spf13/pflag v1.0.10/go.mod h1:McXfInJRrz4CZXVZOBLb0bTZqETkiAhM9Iw0y3An2Bg=
github.com/stretchr/objx v0.1.0/go.mod h1:HFkY916IF+rwdDfMAkV7OtwuqBVzrE8GR6GFx+wExME=
github.com/stretchr/testify v1.4.0/go.mod h1:j7eGeouHqKxXV5pUuKE4zz7dFj8WfuZ+81PSLYec5m4=
github.com/stretchr/testify v1.11.1 h1:7s2iGBzp5EwR7/aIZr8ao5+dra3wiQyKjjFuvgVKu7U=
github.com/stretchr/testify v1.11.1/go.mod h1:wZwfW3scLgRK+23gO65QZefKpKQRnfz6sD981Nm4B6U=
github.com/tinylib/msgp v1.6.1 h1:ESRv8eL3u+DNHUoSAAQRE50Hm162zqAnBoGv9PzScPY=
github.com/tinylib/msgp v1.6.1/go.mod h1:RSp0LW9oSxFut3KzESt5Voq4GVWyS+PSulT77roAqEA=
github.com/xyproto/randomstring v1.0.5 h1:YtlWPoRdgMu3NZtP45drfy1GKoojuR7hmRcnhZqKjWU=
github.com/xyproto/randomstring v1.0.5/go.mod h1:rgmS5DeNXLivK7YprL0pY+lTuhNQW3iGxZ18UQApw/E=
github.com/yuin/goldmark v1.4.13/go.mod h1:6yULJ656Px+3vBD8DxQVa3kxgyrAnzto9xy5taEt/CY=
github.com/yuin/goldmark v1.8.2 h1:kEGpgqJXdgbkhcOgBxkC0X0PmoPG1ZyoZ117rDVp4zE=
github.com/yuin/goldmark v1.8.2/go.mod h1:ip/1k0VRfGynBgxOz0yCqHrbZXhcjxyuS66Brc7iBKg=
github.com/zeebo/xxh3 v1.0.2 h1:xZmwmqxHZA8AI603jOQ0tMqmBr9lPeFwGg6d+xy9DC0=
github.com/zeebo/xxh3 v1.0.2/go.mod h1:5NWz9Sef7zIDm2JHfFlcQvNekmcEl9ekUZQQKCYaDcA=
go.opentelemetry.io/auto/sdk v1.2.1 h1:jXsnJ4Lmnqd11kwkBV2LgLoFMZKizbCi5fNZ/ipaZ64=
go.opentelemetry.io/auto/sdk v1.2.1/go.mod h1:KRTj+aOaElaLi+wW1kO/DZRXwkF4C5xPbEe3ZiIhN7Y=
go.opentelemetry.io/contrib/bridges/otelslog v0.17.0 h1:NFIS6x7wyObQ7cR84x7bt1sr8nYBx89s3x3GwRjw40k=
@@ -108,26 +189,111 @@ go.opentelemetry.io/otel/sdk v1.42.0 h1:LyC8+jqk6UJwdrI/8VydAq/hvkFKNHZVIWuslJXY
go.opentelemetry.io/otel/sdk v1.42.0/go.mod h1:rGHCAxd9DAph0joO4W6OPwxjNTYWghRWmkHuGbayMts=
go.opentelemetry.io/otel/sdk/log v0.18.0 h1:n8OyZr7t7otkeTnPTbDNom6rW16TBYGtvyy2Gk6buQw=
go.opentelemetry.io/otel/sdk/log v0.18.0/go.mod h1:C0+wxkTwKpOCZLrlJ3pewPiiQwpzycPI/u6W0Z9fuYk=
go.opentelemetry.io/otel/sdk/log/logtest v0.18.0 h1:l3mYuPsuBx6UKE47BVcPrZoZ0q/KER57vbj2qkgDLXA=
go.opentelemetry.io/otel/sdk/log/logtest v0.18.0/go.mod h1:7cHtiVJpZebB3wybTa4NG+FUo5NPe3PROz1FqB0+qdw=
go.opentelemetry.io/otel/sdk/metric v1.42.0 h1:D/1QR46Clz6ajyZ3G8SgNlTJKBdGp84q9RKCAZ3YGuA=
go.opentelemetry.io/otel/sdk/metric v1.42.0/go.mod h1:Ua6AAlDKdZ7tdvaQKfSmnFTdHx37+J4ba8MwVCYM5hc=
go.opentelemetry.io/otel/trace v1.42.0 h1:OUCgIPt+mzOnaUTpOQcBiM/PLQ/Op7oq6g4LenLmOYY=
go.opentelemetry.io/otel/trace v1.42.0/go.mod h1:f3K9S+IFqnumBkKhRJMeaZeNk9epyhnCmQh/EysQCdc=
go.opentelemetry.io/proto/otlp v1.9.0 h1:l706jCMITVouPOqEnii2fIAuO3IVGBRPV5ICjceRb/A=
go.opentelemetry.io/proto/otlp v1.9.0/go.mod h1:xE+Cx5E/eEHw+ISFkwPLwCZefwVjY+pqKg1qcK03+/4=
go.uber.org/atomic v1.11.0 h1:ZvwS0R+56ePWxUNi+Atn9dWONBPp/AUETXlHW0DxSjE=
go.uber.org/atomic v1.11.0/go.mod h1:LUxbIzbOniOlMKjJjyPfpl4v+PKK2cNJn91OQbhoJI0=
go.uber.org/goleak v1.3.0 h1:2K3zAYmnTNqV73imy9J1T3WC+gmCePx2hEGkimedGto=
go.uber.org/goleak v1.3.0/go.mod h1:CoHD4mav9JJNrW/WLlf7HGZPjdw8EucARQHekz1X6bE=
go.yaml.in/yaml/v2 v2.4.2 h1:DzmwEr2rDGHl7lsFgAHxmNz/1NlQ7xLIrlN2h5d1eGI=
go.yaml.in/yaml/v2 v2.4.2/go.mod h1:081UH+NErpNdqlCXm3TtEran0rJZGxAYx9hb/ELlsPU=
go.yaml.in/yaml/v3 v3.0.4 h1:tfq32ie2Jv2UxXFdLJdh3jXuOzWiL1fo0bu/FbuKpbc=
go.yaml.in/yaml/v3 v3.0.4/go.mod h1:DhzuOOF2ATzADvBadXxruRBLzYTpT36CKvDb3+aBEFg=
golang.org/x/crypto v0.48.0 h1:/VRzVqiRSggnhY7gNRxPauEQ5Drw9haKdM0jqfcCFts=
golang.org/x/crypto v0.48.0/go.mod h1:r0kV5h3qnFPlQnBSrULhlsRfryS2pmewsg+XfMgkVos=
golang.org/x/net v0.51.0 h1:94R/GTO7mt3/4wIKpcR5gkGmRLOuE/2hNGeWq/GBIFo=
golang.org/x/net v0.51.0/go.mod h1:aamm+2QF5ogm02fjy5Bb7CQ0WMt1/WVM7FtyaTLlA9Y=
golang.org/x/sys v0.41.0 h1:Ivj+2Cp/ylzLiEU89QhWblYnOE9zerudt9Ftecq2C6k=
golang.org/x/sys v0.41.0/go.mod h1:OgkHotnGiDImocRcuBABYBEXf8A9a87e/uXjp9XT3ks=
golang.org/x/text v0.34.0 h1:oL/Qq0Kdaqxa1KbNeMKwQq0reLCCaFtqu2eNuSeNHbk=
golang.org/x/text v0.34.0/go.mod h1:homfLqTYRFyVYemLBFl5GgL/DWEiH5wcsQ5gSh1yziA=
golang.org/x/crypto v0.0.0-20190308221718-c2843e01d9a2/go.mod h1:djNgcEr1/C05ACkg1iLfiJU5Ep61QUkGW8qpdssI0+w=
golang.org/x/crypto v0.0.0-20210921155107-089bfa567519/go.mod h1:GvvjBRRGRdwPK5ydBHafDWAxML/pGHZbMvKqRZ5+Abc=
golang.org/x/crypto v0.13.0/go.mod h1:y6Z2r+Rw4iayiXXAIxJIDAJ1zMW4yaTpebo8fPOliYc=
golang.org/x/crypto v0.19.0/go.mod h1:Iy9bg/ha4yyC70EfRS8jz+B6ybOBKMaSxLj6P6oBDfU=
golang.org/x/crypto v0.23.0/go.mod h1:CKFgDieR+mRhux2Lsu27y0fO304Db0wZe70UKqHu0v8=
golang.org/x/crypto v0.31.0/go.mod h1:kDsLvtWBEx7MV9tJOj9bnXsPbxwJQ6csT/x4KIN4Ssk=
golang.org/x/crypto v0.49.0 h1:+Ng2ULVvLHnJ/ZFEq4KdcDd/cfjrrjjNSXNzxg0Y4U4=
golang.org/x/crypto v0.49.0/go.mod h1:ErX4dUh2UM+CFYiXZRTcMpEcN8b/1gxEuv3nODoYtCA=
golang.org/x/image v0.0.0-20191009234506-e7c1f5e7dbb8/go.mod h1:FeLwcggjj3mMvU+oOTbSwawSJRM1uh48EjtB4UJZlP0=
golang.org/x/image v0.38.0 h1:5l+q+Y9JDC7mBOMjo4/aPhMDcxEptsX+Tt3GgRQRPuE=
golang.org/x/image v0.38.0/go.mod h1:/3f6vaXC+6CEanU4KJxbcUZyEePbyKbaLoDOe4ehFYY=
golang.org/x/mod v0.6.0-dev.0.20220419223038-86c51ed26bb4/go.mod h1:jJ57K6gSWd91VN4djpZkiMVwK6gcyfeH4XE8wZrZaV4=
golang.org/x/mod v0.8.0/go.mod h1:iBbtSCu2XBx23ZKBPSOrRkjjQPZFPuis4dIYUhu/chs=
golang.org/x/mod v0.12.0/go.mod h1:iBbtSCu2XBx23ZKBPSOrRkjjQPZFPuis4dIYUhu/chs=
golang.org/x/mod v0.15.0/go.mod h1:hTbmBsO62+eylJbnUtE2MGJUyE7QWk4xUqPFrRgJ+7c=
golang.org/x/mod v0.17.0/go.mod h1:hTbmBsO62+eylJbnUtE2MGJUyE7QWk4xUqPFrRgJ+7c=
golang.org/x/mod v0.34.0 h1:xIHgNUUnW6sYkcM5Jleh05DvLOtwc6RitGHbDk4akRI=
golang.org/x/mod v0.34.0/go.mod h1:ykgH52iCZe79kzLLMhyCUzhMci+nQj+0XkbXpNYtVjY=
golang.org/x/net v0.0.0-20190603091049-60506f45cf65/go.mod h1:HSz+uSET+XFnRR8LxR5pz3Of3rY3CfYBVs4xY44aLks=
golang.org/x/net v0.0.0-20190620200207-3b0461eec859/go.mod h1:z5CRVTTTmAJ677TzLLGU+0bjPO0LkuOLi4/5GtJWs/s=
golang.org/x/net v0.0.0-20210226172049-e18ecbb05110/go.mod h1:m0MpNAwzfU5UDzcl9v0D8zg8gWTRqZa9RBIspLL5mdg=
golang.org/x/net v0.0.0-20220722155237-a158d28d115b/go.mod h1:XRhObCWvk6IyKnWLug+ECip1KBveYUHfp+8e9klMJ9c=
golang.org/x/net v0.6.0/go.mod h1:2Tu9+aMcznHK/AK1HMvgo6xiTLG5rD5rZLDS+rp2Bjs=
golang.org/x/net v0.10.0/go.mod h1:0qNGK6F8kojg2nk9dLZ2mShWaEBan6FAoqfSigmmuDg=
golang.org/x/net v0.15.0/go.mod h1:idbUs1IY1+zTqbi8yxTbhexhEEk5ur9LInksu6HrEpk=
golang.org/x/net v0.21.0/go.mod h1:bIjVDfnllIU7BJ2DNgfnXvpSvtn8VRwhlsaeUTyUS44=
golang.org/x/net v0.25.0/go.mod h1:JkAGAh7GEvH74S6FOH42FLoXpXbE/aqXSrIQjXgsiwM=
golang.org/x/net v0.52.0 h1:He/TN1l0e4mmR3QqHMT2Xab3Aj3L9qjbhRm78/6jrW0=
golang.org/x/net v0.52.0/go.mod h1:R1MAz7uMZxVMualyPXb+VaqGSa3LIaUqk0eEt3w36Sw=
golang.org/x/oauth2 v0.36.0 h1:peZ/1z27fi9hUOFCAZaHyrpWG5lwe0RJEEEeH0ThlIs=
golang.org/x/oauth2 v0.36.0/go.mod h1:YDBUJMTkDnJS+A4BP4eZBjCqtokkg1hODuPjwiGPO7Q=
golang.org/x/sync v0.0.0-20190423024810-112230192c58/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=
golang.org/x/sync v0.0.0-20220722155255-886fb9371eb4/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=
golang.org/x/sync v0.1.0/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=
golang.org/x/sync v0.3.0/go.mod h1:FU7BRWz2tNW+3quACPkgCx/L+uEAv1htQ0V83Z9Rj+Y=
golang.org/x/sync v0.6.0/go.mod h1:Czt+wKu1gCyEFDUtn0jG5QVvpJ6rzVqr5aXyt9drQfk=
golang.org/x/sync v0.7.0/go.mod h1:Czt+wKu1gCyEFDUtn0jG5QVvpJ6rzVqr5aXyt9drQfk=
golang.org/x/sync v0.10.0/go.mod h1:Czt+wKu1gCyEFDUtn0jG5QVvpJ6rzVqr5aXyt9drQfk=
golang.org/x/sync v0.20.0 h1:e0PTpb7pjO8GAtTs2dQ6jYa5BWYlMuX047Dco/pItO4=
golang.org/x/sync v0.20.0/go.mod h1:9xrNwdLfx4jkKbNva9FpL6vEN7evnE43NNNJQ2LF3+0=
golang.org/x/sys v0.0.0-20190215142949-d0b11bdaac8a/go.mod h1:STP8DvDyc/dI5b8T5hshtkjS+E42TnysNCUPdjciGhY=
golang.org/x/sys v0.0.0-20201119102817-f84b799fce68/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
golang.org/x/sys v0.0.0-20210615035016-665e8c7367d1/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.0.0-20220520151302-bc2c85ada10a/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.0.0-20220722155257-8c9f86f7a55f/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.5.0/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.8.0/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.12.0/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.17.0/go.mod h1:/VUhepiaJMQUp4+oa/7Zr1D23ma6VTLIYjOOTFZPUcA=
golang.org/x/sys v0.20.0/go.mod h1:/VUhepiaJMQUp4+oa/7Zr1D23ma6VTLIYjOOTFZPUcA=
golang.org/x/sys v0.28.0/go.mod h1:/VUhepiaJMQUp4+oa/7Zr1D23ma6VTLIYjOOTFZPUcA=
golang.org/x/sys v0.43.0 h1:Rlag2XtaFTxp19wS8MXlJwTvoh8ArU6ezoyFsMyCTNI=
golang.org/x/sys v0.43.0/go.mod h1:4GL1E5IUh+htKOUEOaiffhrAeqysfVGipDYzABqnCmw=
golang.org/x/telemetry v0.0.0-20240228155512-f48c80bd79b2/go.mod h1:TeRTkGYfJXctD9OcfyVLyj2J3IxLnKwHJR8f4D8a3YE=
golang.org/x/term v0.0.0-20201126162022-7de9c90e9dd1/go.mod h1:bj7SfCRtBDWHUb9snDiAeCFNEtKQo2Wmx5Cou7ajbmo=
golang.org/x/term v0.0.0-20210927222741-03fcf44c2211/go.mod h1:jbD1KX2456YbFQfuXm/mYQcufACuNUgVhRMnK/tPxf8=
golang.org/x/term v0.5.0/go.mod h1:jMB1sMXY+tzblOD4FWmEbocvup2/aLOaQEp7JmGp78k=
golang.org/x/term v0.8.0/go.mod h1:xPskH00ivmX89bAKVGSKKtLOWNx2+17Eiy94tnKShWo=
golang.org/x/term v0.12.0/go.mod h1:owVbMEjm3cBLCHdkQu9b1opXd4ETQWc3BhuQGKgXgvU=
golang.org/x/term v0.17.0/go.mod h1:lLRBjIVuehSbZlaOtGMbcMncT+aqLLLmKrsjNrUguwk=
golang.org/x/term v0.20.0/go.mod h1:8UkIAJTvZgivsXaD6/pH6U9ecQzZ45awqEOzuCvwpFY=
golang.org/x/term v0.27.0/go.mod h1:iMsnZpn0cago0GOrHO2+Y7u7JPn5AylBrcoWkElMTSM=
golang.org/x/text v0.3.0/go.mod h1:NqM8EUOU14njkJ3fqMW+pc6Ldnwhi/IjpwHt7yyuwOQ=
golang.org/x/text v0.3.2/go.mod h1:bEr9sfX3Q8Zfm5fL9x+3itogRgK3+ptLWKqgva+5dAk=
golang.org/x/text v0.3.3/go.mod h1:5Zoc/QRtKVWzQhOtBMvqHzDpF6irO9z98xDceosuGiQ=
golang.org/x/text v0.3.7/go.mod h1:u+2+/6zg+i71rQMx5EYifcz6MCKuco9NR6JIITiCfzQ=
golang.org/x/text v0.7.0/go.mod h1:mrYo+phRRbMaCq/xk9113O4dZlRixOauAjOtrjsXDZ8=
golang.org/x/text v0.9.0/go.mod h1:e1OnstbJyHTd6l/uOt8jFFHp6TRDWZR/bV3emEE/zU8=
golang.org/x/text v0.13.0/go.mod h1:TvPlkZtksWOMsz7fbANvkp4WM8x/WCo/om8BMLbz+aE=
golang.org/x/text v0.14.0/go.mod h1:18ZOQIKpY8NJVqYksKHtTdi31H5itFRjB5/qKTNYzSU=
golang.org/x/text v0.15.0/go.mod h1:18ZOQIKpY8NJVqYksKHtTdi31H5itFRjB5/qKTNYzSU=
golang.org/x/text v0.21.0/go.mod h1:4IBbMaMmOPCJ8SecivzSH54+73PCFmPWxNTLm+vZkEQ=
golang.org/x/text v0.36.0 h1:JfKh3XmcRPqZPKevfXVpI1wXPTqbkE5f7JA92a55Yxg=
golang.org/x/text v0.36.0/go.mod h1:NIdBknypM8iqVmPiuco0Dh6P5Jcdk8lJL0CUebqK164=
golang.org/x/time v0.14.0 h1:MRx4UaLrDotUKUdCIqzPC48t1Y9hANFKIRpNx+Te8PI=
golang.org/x/time v0.14.0/go.mod h1:eL/Oa2bBBK0TkX57Fyni+NgnyQQN4LitPmob2Hjnqw4=
golang.org/x/tools v0.0.0-20180917221912-90fa682c2a6e/go.mod h1:n7NCudcB/nEzxVGmLbDWY5pfWTLqBcC2KZ6jyYvM4mQ=
golang.org/x/tools v0.0.0-20191119224855-298f0cb1881e/go.mod h1:b+2E5dAYhXwXZwtnZ6UAqBI28+e2cm9otk0dWdXHAEo=
golang.org/x/tools v0.1.12/go.mod h1:hNGJHUnrk76NpqgfD5Aqm5Crs+Hm0VOH/i9J2+nxYbc=
golang.org/x/tools v0.6.0/go.mod h1:Xwgl3UAJ/d3gWutnCtw505GrjyAbvKui8lOU390QaIU=
golang.org/x/tools v0.13.0/go.mod h1:HvlwmtVNQAhOuCjW7xxvovg8wbNq7LwfXh/k7wXUl58=
golang.org/x/tools v0.21.1-0.20240508182429-e35e4ccd0d2d/go.mod h1:aiJjzUbINMkxbQROHiO6hDPo2LHcIPhhQsa9DLh0yGk=
golang.org/x/tools v0.43.0 h1:12BdW9CeB3Z+J/I/wj34VMl8X+fEXBxVR90JeMX5E7s=
golang.org/x/tools v0.43.0/go.mod h1:uHkMso649BX2cZK6+RpuIPXS3ho2hZo4FVwfoy1vIk0=
golang.org/x/xerrors v0.0.0-20190717185122-a985d3407aa7/go.mod h1:I/5z698sn9Ka8TeJc9MKroUUfqBBauWjQqLJ2OPfmY0=
gonum.org/v1/gonum v0.16.0 h1:5+ul4Swaf3ESvrOnidPp4GZbzf0mxVQpDCYUQE7OJfk=
gonum.org/v1/gonum v0.16.0/go.mod h1:fef3am4MQ93R2HHpKnLk4/Tbh/s0+wqD5nfa6Pnwy4E=
google.golang.org/appengine v1.6.5 h1:tycE03LOZYQNhDpS27tcQdAzLCVMaj7QT2SXxebnpCM=
google.golang.org/appengine v1.6.5/go.mod h1:8WjMMxjGQR8xUklV/ARdw2HLXBOI7O7uCIDZVag1xfc=
google.golang.org/genproto/googleapis/api v0.0.0-20260209200024-4cfbd4190f57 h1:JLQynH/LBHfCTSbDWl+py8C+Rg/k1OVH3xfcaiANuF0=
google.golang.org/genproto/googleapis/api v0.0.0-20260209200024-4cfbd4190f57/go.mod h1:kSJwQxqmFXeo79zOmbrALdflXQeAYcUbgS7PbpMknCY=
google.golang.org/genproto/googleapis/rpc v0.0.0-20260209200024-4cfbd4190f57 h1:mWPCjDEyshlQYzBpMNHaEof6UX1PmHcaUODUywQ0uac=
@@ -136,8 +302,39 @@ google.golang.org/grpc v1.79.2 h1:fRMD94s2tITpyJGtBBn7MkMseNpOZU8ZxgC3MMBaXRU=
google.golang.org/grpc v1.79.2/go.mod h1:KmT0Kjez+0dde/v2j9vzwoAScgEPx/Bw1CYChhHLrHQ=
google.golang.org/protobuf v1.36.11 h1:fV6ZwhNocDyBLK0dj+fg8ektcVegBBuEolpbTQyBNVE=
google.golang.org/protobuf v1.36.11/go.mod h1:HTf+CrKn2C3g5S8VImy6tdcUvCska2kB7j23XfzDpco=
gopkg.in/check.v1 v0.0.0-20161208181325-20d25e280405 h1:yhCVgyC4o1eVCa2tZl7eS0r+SDo693bJlVdllGtEeKM=
gopkg.in/check.v1 v0.0.0-20161208181325-20d25e280405/go.mod h1:Co6ibVJAznAaIkqp8huTwlJQCZ016jof/cbN4VW5Yz0=
gopkg.in/check.v1 v1.0.0-20201130134442-10cb98267c6c h1:Hei/4ADfdWqJk1ZMxUNpqntNwaWcugrBjAiHlqqRiVk=
gopkg.in/check.v1 v1.0.0-20201130134442-10cb98267c6c/go.mod h1:JHkPIbrfpd72SG/EVd6muEfDQjcINNoR0C8j2r3qZ4Q=
gopkg.in/yaml.v2 v2.2.2/go.mod h1:hI93XBmqTisBFMUTm0b8Fm+jr3Dg1NNxqwp+5A1VGuI=
gopkg.in/yaml.v2 v2.4.0 h1:D8xgwECY7CYvx+Y2n4sBz93Jn9JRvxdiyyo8CTfuKaY=
gopkg.in/yaml.v2 v2.4.0/go.mod h1:RDklbk79AGWmwhnvt/jBztapEOGDOx6ZbXqjP6csGnQ=
gopkg.in/yaml.v3 v3.0.1 h1:fxVm/GzAzEWqLHuvctI91KS9hhNmmWOoWu0XTYJS7CA=
gopkg.in/yaml.v3 v3.0.1/go.mod h1:K4uyk7z7BCEPqu6E+C64Yfv1cQ7kz7rIZviUmN+EgEM=
modernc.org/cc/v4 v4.27.1 h1:9W30zRlYrefrDV2JE2O8VDtJ1yPGownxciz5rrbQZis=
modernc.org/cc/v4 v4.27.1/go.mod h1:uVtb5OGqUKpoLWhqwNQo/8LwvoiEBLvZXIQ/SmO6mL0=
modernc.org/ccgo/v4 v4.32.0 h1:hjG66bI/kqIPX1b2yT6fr/jt+QedtP2fqojG2VrFuVw=
modernc.org/ccgo/v4 v4.32.0/go.mod h1:6F08EBCx5uQc38kMGl+0Nm0oWczoo1c7cgpzEry7Uc0=
modernc.org/fileutil v1.4.0 h1:j6ZzNTftVS054gi281TyLjHPp6CPHr2KCxEXjEbD6SM=
modernc.org/fileutil v1.4.0/go.mod h1:EqdKFDxiByqxLk8ozOxObDSfcVOv/54xDs/DUHdvCUU=
modernc.org/gc/v2 v2.6.5 h1:nyqdV8q46KvTpZlsw66kWqwXRHdjIlJOhG6kxiV/9xI=
modernc.org/gc/v2 v2.6.5/go.mod h1:YgIahr1ypgfe7chRuJi2gD7DBQiKSLMPgBQe9oIiito=
modernc.org/gc/v3 v3.1.2 h1:ZtDCnhonXSZexk/AYsegNRV1lJGgaNZJuKjJSWKyEqo=
modernc.org/gc/v3 v3.1.2/go.mod h1:HFK/6AGESC7Ex+EZJhJ2Gni6cTaYpSMmU/cT9RmlfYY=
modernc.org/goabi0 v0.2.0 h1:HvEowk7LxcPd0eq6mVOAEMai46V+i7Jrj13t4AzuNks=
modernc.org/goabi0 v0.2.0/go.mod h1:CEFRnnJhKvWT1c1JTI3Avm+tgOWbkOu5oPA8eH8LnMI=
modernc.org/libc v1.70.0 h1:U58NawXqXbgpZ/dcdS9kMshu08aiA6b7gusEusqzNkw=
modernc.org/libc v1.70.0/go.mod h1:OVmxFGP1CI/Z4L3E0Q3Mf1PDE0BucwMkcXjjLntvHJo=
modernc.org/mathutil v1.7.1 h1:GCZVGXdaN8gTqB1Mf/usp1Y/hSqgI2vAGGP4jZMCxOU=
modernc.org/mathutil v1.7.1/go.mod h1:4p5IwJITfppl0G4sUEDtCr4DthTaT47/N3aT6MhfgJg=
modernc.org/memory v1.11.0 h1:o4QC8aMQzmcwCK3t3Ux/ZHmwFPzE6hf2Y5LbkRs+hbI=
modernc.org/memory v1.11.0/go.mod h1:/JP4VbVC+K5sU2wZi9bHoq2MAkCnrt2r98UGeSK7Mjw=
modernc.org/opt v0.1.4 h1:2kNGMRiUjrp4LcaPuLY2PzUfqM/w9N23quVwhKt5Qm8=
modernc.org/opt v0.1.4/go.mod h1:03fq9lsNfvkYSfxrfUhZCWPk1lm4cq4N+Bh//bEtgns=
modernc.org/sortutil v1.2.1 h1:+xyoGf15mM3NMlPDnFqrteY07klSFxLElE2PVuWIJ7w=
modernc.org/sortutil v1.2.1/go.mod h1:7ZI3a3REbai7gzCLcotuw9AC4VZVpYMjDzETGsSMqJE=
modernc.org/sqlite v1.48.2 h1:5CnW4uP8joZtA0LedVqLbZV5GD7F/0x91AXeSyjoh5c=
modernc.org/sqlite v1.48.2/go.mod h1:hWjRO6Tj/5Ik8ieqxQybiEOUXy0NJFNp2tpvVpKlvig=
modernc.org/strutil v1.2.1 h1:UneZBkQA+DX2Rp35KcM69cSsNES9ly8mQWD71HKlOA0=
modernc.org/strutil v1.2.1/go.mod h1:EHkiggD70koQxjVdSBM3JKM7k6L0FbGE5eymy9i3B9A=
modernc.org/token v1.1.0 h1:Xl7Ap9dKaEs5kLoOQeQmPWevfnk/DM5qcLcYlA8ys6Y=
modernc.org/token v1.1.0/go.mod h1:UGzOrNV1mAFSEB63lOFHIpNRUVMvYTc6yu1SMY/XTDM=


@@ -10,14 +10,13 @@ import (
// Consumer wraps the PocketBase-backed Consumer for result write-back only.
//
// When using Asynq, the runner no longer polls for work — Asynq delivers
// tasks via the ServeMux handlers. The only Consumer operations the handlers
// need are:
// - FinishAudioTask / FinishScrapeTask — write result back to PocketBase
// - FailTask — mark PocketBase record as failed
// When using Asynq, the runner no longer polls for scrape/audio work — Asynq
// delivers those tasks via the ServeMux handlers. However translation tasks
// live in PocketBase (not Redis), so ClaimNextTranslationTask and HeartbeatTask
// still delegate to the underlying PocketBase consumer.
//
// ClaimNextAudioTask, ClaimNextScrapeTask, HeartbeatTask, and ReapStaleTasks
// are all no-ops here because Asynq owns those responsibilities.
// ClaimNextAudioTask and ClaimNextScrapeTask are no-ops here because Asynq
// owns those responsibilities.
type Consumer struct {
pb taskqueue.Consumer // underlying PocketBase consumer (for write-back)
}
@@ -41,6 +40,10 @@ func (c *Consumer) FinishTranslationTask(ctx context.Context, id string, result
return c.pb.FinishTranslationTask(ctx, id, result)
}
func (c *Consumer) FinishImportTask(ctx context.Context, id string, result domain.ImportResult) error {
return c.pb.FinishImportTask(ctx, id, result)
}
func (c *Consumer) FailTask(ctx context.Context, id, errMsg string) error {
return c.pb.FailTask(ctx, id, errMsg)
}
@@ -55,10 +58,24 @@ func (c *Consumer) ClaimNextAudioTask(_ context.Context, _ string) (domain.Audio
return domain.AudioTask{}, false, nil
}
func (c *Consumer) ClaimNextTranslationTask(_ context.Context, _ string) (domain.TranslationTask, bool, error) {
return domain.TranslationTask{}, false, nil
// ClaimNextTranslationTask delegates to PocketBase because translation tasks
// are stored in PocketBase (not Redis/Asynq) and must still be polled directly.
func (c *Consumer) ClaimNextTranslationTask(ctx context.Context, workerID string) (domain.TranslationTask, bool, error) {
return c.pb.ClaimNextTranslationTask(ctx, workerID)
}
func (c *Consumer) HeartbeatTask(_ context.Context, _ string) error { return nil }
// ClaimNextImportTask delegates to PocketBase because import tasks
// are stored in PocketBase (not Redis/Asynq) and must still be polled directly.
func (c *Consumer) ClaimNextImportTask(ctx context.Context, workerID string) (domain.ImportTask, bool, error) {
return c.pb.ClaimNextImportTask(ctx, workerID)
}
func (c *Consumer) ReapStaleTasks(_ context.Context, _ time.Duration) (int, error) { return 0, nil }
func (c *Consumer) HeartbeatTask(ctx context.Context, id string) error {
return c.pb.HeartbeatTask(ctx, id)
}
// ReapStaleTasks delegates to PocketBase so stale translation tasks are reset
// to pending and can be reclaimed.
func (c *Consumer) ReapStaleTasks(ctx context.Context, staleAfter time.Duration) (int, error) {
return c.pb.ReapStaleTasks(ctx, staleAfter)
}


@@ -7,6 +7,7 @@ import (
"log/slog"
"github.com/hibiken/asynq"
"github.com/libnovel/backend/internal/domain"
"github.com/libnovel/backend/internal/taskqueue"
)
@@ -87,12 +88,42 @@ func (p *Producer) CreateTranslationTask(ctx context.Context, slug string, chapt
return p.pb.CreateTranslationTask(ctx, slug, chapter, lang)
}
// CreateImportTask creates a PocketBase record then enqueues an Asynq job for PDF/EPUB import.
func (p *Producer) CreateImportTask(ctx context.Context, task domain.ImportTask) (string, error) {
id, err := p.pb.CreateImportTask(ctx, task)
if err != nil {
return "", err
}
payload := ImportPayload{
PBTaskID: id,
Slug: task.Slug,
Title: task.Title,
FileType: task.FileType,
ObjectKey: task.ObjectKey,
ChaptersKey: task.ChaptersKey,
}
if err := p.enqueue(ctx, TypeImportBook, payload); err != nil {
// Non-fatal: PB record exists; runner will pick it up on next poll.
p.log.Warn("asynq enqueue import failed (task still in PB, runner will poll)",
"task_id", id, "err", err)
return id, nil
}
return id, nil
}
// CancelTask delegates to PocketBase; Asynq jobs may already be running and
// cannot be reliably cancelled, so we only update the audit record.
func (p *Producer) CancelTask(ctx context.Context, id string) error {
return p.pb.CancelTask(ctx, id)
}
// CancelAudioTasksBySlug delegates to PocketBase to cancel all pending/running
// audio tasks for slug.
func (p *Producer) CancelAudioTasksBySlug(ctx context.Context, slug string) (int, error) {
return p.pb.CancelAudioTasksBySlug(ctx, slug)
}
// enqueue serialises payload and dispatches it to Asynq.
func (p *Producer) enqueue(_ context.Context, taskType string, payload any) error {
b, err := json.Marshal(payload)


@@ -23,6 +23,7 @@ const (
TypeAudioGenerate = "audio:generate"
TypeScrapeBook = "scrape:book"
TypeScrapeCatalogue = "scrape:catalogue"
TypeImportBook = "import:book"
)
// AudioPayload is the Asynq job payload for audio generation tasks.
@@ -44,3 +45,13 @@ type ScrapePayload struct {
FromChapter int `json:"from_chapter"` // 0 unless Kind=="book_range"
ToChapter int `json:"to_chapter"` // 0 unless Kind=="book_range"
}
// ImportPayload is the Asynq job payload for PDF/EPUB import tasks.
type ImportPayload struct {
PBTaskID string `json:"pb_task_id"`
Slug string `json:"slug"`
Title string `json:"title"`
FileType string `json:"file_type"` // "pdf" or "epub"
ObjectKey string `json:"object_key"` // MinIO path to uploaded file
ChaptersKey string `json:"chapters_key"` // MinIO path to pre-parsed chapters JSON
}

View File

@@ -0,0 +1,143 @@
package backend
import (
"archive/zip"
"bytes"
"fmt"
"strings"
)
type epubChapter struct {
Number int
Title string
HTML string
}
func generateEPUB(slug, title, author string, chapters []epubChapter) ([]byte, error) {
var buf bytes.Buffer
w := zip.NewWriter(&buf)
// 1. mimetype — MUST be first, MUST be uncompressed (Store method)
mw, err := w.CreateHeader(&zip.FileHeader{
Name: "mimetype",
Method: zip.Store,
})
if err != nil {
return nil, err
}
mw.Write([]byte("application/epub+zip"))
// 2. META-INF/container.xml
addFile(w, "META-INF/container.xml", containerXML())
// 3. OEBPS/style.css
addFile(w, "OEBPS/style.css", epubCSS())
// 4. OEBPS/content.opf
addFile(w, "OEBPS/content.opf", contentOPF(slug, title, author, chapters))
// 5. OEBPS/toc.ncx
addFile(w, "OEBPS/toc.ncx", tocNCX(slug, title, chapters))
// 6. Chapter files
for _, ch := range chapters {
name := fmt.Sprintf("OEBPS/chapter-%04d.xhtml", ch.Number)
addFile(w, name, chapterXHTML(ch))
}
w.Close()
return buf.Bytes(), nil
}
func addFile(w *zip.Writer, name, content string) {
f, _ := w.Create(name)
f.Write([]byte(content))
}
func containerXML() string {
return `<?xml version="1.0" encoding="UTF-8"?>
<container version="1.0" xmlns="urn:oasis:names:tc:opendocument:xmlns:container">
<rootfiles>
<rootfile full-path="OEBPS/content.opf" media-type="application/oebps-package+xml"/>
</rootfiles>
</container>`
}
func contentOPF(slug, title, author string, chapters []epubChapter) string {
var items, spine strings.Builder
for _, ch := range chapters {
id := fmt.Sprintf("ch%04d", ch.Number)
href := fmt.Sprintf("chapter-%04d.xhtml", ch.Number)
items.WriteString(fmt.Sprintf(` <item id="%s" href="%s" media-type="application/xhtml+xml"/>`+"\n", id, href))
spine.WriteString(fmt.Sprintf(` <itemref idref="%s"/>`+"\n", id))
}
return fmt.Sprintf(`<?xml version="1.0" encoding="UTF-8"?>
<package xmlns="http://www.idpf.org/2007/opf" unique-identifier="uid" version="2.0">
<metadata xmlns:dc="http://purl.org/dc/elements/1.1/">
<dc:title>%s</dc:title>
<dc:creator>%s</dc:creator>
<dc:identifier id="uid">%s</dc:identifier>
<dc:language>en</dc:language>
</metadata>
<manifest>
<item id="ncx" href="toc.ncx" media-type="application/x-dtbncx+xml"/>
<item id="css" href="style.css" media-type="text/css"/>
%s </manifest>
<spine toc="ncx">
%s </spine>
</package>`, escapeXML(title), escapeXML(author), slug, items.String(), spine.String())
}
func tocNCX(slug, title string, chapters []epubChapter) string {
var points strings.Builder
for i, ch := range chapters {
chTitle := ch.Title
if chTitle == "" {
chTitle = fmt.Sprintf("Chapter %d", ch.Number)
}
points.WriteString(fmt.Sprintf(` <navPoint id="np%d" playOrder="%d">
<navLabel><text>%s</text></navLabel>
<content src="chapter-%04d.xhtml"/>
</navPoint>`+"\n", i+1, i+1, escapeXML(chTitle), ch.Number))
}
return fmt.Sprintf(`<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE ncx PUBLIC "-//NISO//DTD ncx 2005-1//EN" "http://www.daisy.org/z3986/2005/ncx-2005-1.dtd">
<ncx xmlns="http://www.daisy.org/z3986/2005/ncx/" version="2005-1">
<head><meta name="dtb:uid" content="%s"/></head>
<docTitle><text>%s</text></docTitle>
<navMap>
%s </navMap>
</ncx>`, slug, escapeXML(title), points.String())
}
func chapterXHTML(ch epubChapter) string {
title := ch.Title
if title == "" {
title = fmt.Sprintf("Chapter %d", ch.Number)
}
return fmt.Sprintf(`<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.1//EN" "http://www.w3.org/TR/xhtml11/DTD/xhtml11.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">
<head><title>%s</title><link rel="stylesheet" href="style.css"/></head>
<body>
<h1 class="chapter-title">%s</h1>
%s
</body>
</html>`, escapeXML(title), escapeXML(title), ch.HTML)
}
func epubCSS() string {
return `body { font-family: Georgia, serif; font-size: 1em; line-height: 1.6; margin: 1em 2em; }
h1.chapter-title { font-size: 1.4em; margin-bottom: 1em; }
p { margin: 0 0 0.8em 0; text-indent: 1.5em; }
p:first-of-type { text-indent: 0; }
`
}
func escapeXML(s string) string {
s = strings.ReplaceAll(s, "&", "&amp;")
s = strings.ReplaceAll(s, "<", "&lt;")
s = strings.ReplaceAll(s, ">", "&gt;")
s = strings.ReplaceAll(s, `"`, "&quot;")
return s
}

View File

@@ -9,6 +9,7 @@ package backend
// handleGetRanking, handleGetCover
// handleBookPreview, handleChapterText, handleChapterTextPreview, handleChapterMarkdown, handleReindex
// handleAudioGenerate, handleAudioStatus, handleAudioProxy, handleAudioStream
// handleTTSAnnounce
// handleVoices
// handlePresignChapter, handlePresignAudio, handlePresignVoiceSample
// handlePresignAvatarUpload, handlePresignAvatar
@@ -44,6 +45,7 @@ import (
"strings"
"time"
"github.com/libnovel/backend/internal/cfai"
"github.com/libnovel/backend/internal/domain"
"github.com/libnovel/backend/internal/kokoro"
"github.com/libnovel/backend/internal/meili"
@@ -568,6 +570,30 @@ func (s *Server) handleReindex(w http.ResponseWriter, r *http.Request) {
writeJSON(w, 0, map[string]any{"slug": slug, "indexed": count})
}
// handleDedupChapters handles POST /api/admin/dedup-chapters/{slug}.
// Removes duplicate chapters_idx records for a book, keeping the latest record
// per chapter number. Returns the number of duplicate records deleted.
func (s *Server) handleDedupChapters(w http.ResponseWriter, r *http.Request) {
slug := r.PathValue("slug")
if slug == "" {
jsonError(w, http.StatusBadRequest, "missing slug")
return
}
deleted, err := s.deps.BookWriter.DeduplicateChapters(r.Context(), slug)
if err != nil {
s.deps.Log.Error("dedup-chapters failed", "slug", slug, "err", err)
writeJSON(w, http.StatusInternalServerError, map[string]any{
"error": err.Error(),
"deleted": deleted,
})
return
}
s.deps.Log.Info("dedup-chapters complete", "slug", slug, "deleted", deleted)
writeJSON(w, 0, map[string]any{"slug": slug, "deleted": deleted})
}
// ── Audio ──────────────────────────────────────────────────────────────────────
// handleAudioGenerate handles POST /api/audio/{slug}/{n}.
@@ -708,12 +734,17 @@ func (s *Server) handleAudioProxy(w http.ResponseWriter, r *http.Request) {
// Fast path: if audio already exists in MinIO, redirects to the presigned URL
// (same as handleAudioProxy) — the client plays from storage immediately.
//
// Slow path (first request): streams MP3 audio directly to the client while
// simultaneously uploading it to MinIO. After the stream completes, any
// pending audio_jobs task for this key is marked done. Subsequent requests hit
// the fast path and skip TTS generation entirely.
// Slow path (first request): streams audio directly to the client while
// simultaneously uploading it to MinIO. After the stream completes, subsequent
// requests hit the fast path and skip TTS generation entirely.
//
// Query params: voice (optional, defaults to DefaultVoice)
// Query params:
//
// voice (optional, defaults to DefaultVoice)
// format (optional, "mp3" or "wav"; defaults to "mp3")
//
// Using format=wav skips the ffmpeg transcode for pocket-tts voices, delivering
// raw WAV frames to the client with lower latency at the cost of larger files.
func (s *Server) handleAudioStream(w http.ResponseWriter, r *http.Request) {
slug := r.PathValue("slug")
n, err := strconv.Atoi(r.PathValue("n"))
@@ -727,7 +758,17 @@ func (s *Server) handleAudioStream(w http.ResponseWriter, r *http.Request) {
voice = s.cfg.DefaultVoice
}
audioKey := s.deps.AudioStore.AudioObjectKey(slug, n, voice)
format := r.URL.Query().Get("format")
if format != "wav" {
format = "mp3"
}
contentType := "audio/mpeg"
if format == "wav" {
contentType = "audio/wav"
}
audioKey := s.deps.AudioStore.AudioObjectKeyExt(slug, n, voice, format)
// ── Fast path: already in MinIO ───────────────────────────────────────────
if s.deps.AudioStore.AudioExists(r.Context(), audioKey) {
@@ -756,23 +797,51 @@ func (s *Server) handleAudioStream(w http.ResponseWriter, r *http.Request) {
return
}
// Open the TTS stream.
// Open the TTS stream (WAV or MP3 depending on format param).
var audioStream io.ReadCloser
if pockettts.IsPocketTTSVoice(voice) {
if s.deps.PocketTTS == nil {
jsonError(w, http.StatusServiceUnavailable, "pocket-tts not configured")
return
if format == "wav" {
if cfai.IsCFAIVoice(voice) {
if s.deps.CFAI == nil {
jsonError(w, http.StatusServiceUnavailable, "cloudflare AI TTS not configured")
return
}
audioStream, err = s.deps.CFAI.StreamAudioWAV(r.Context(), text, voice)
} else if pockettts.IsPocketTTSVoice(voice) {
if s.deps.PocketTTS == nil {
jsonError(w, http.StatusServiceUnavailable, "pocket-tts not configured")
return
}
audioStream, err = s.deps.PocketTTS.StreamAudioWAV(r.Context(), text, voice)
} else {
if s.deps.Kokoro == nil {
jsonError(w, http.StatusServiceUnavailable, "kokoro not configured")
return
}
audioStream, err = s.deps.Kokoro.StreamAudioWAV(r.Context(), text, voice)
}
audioStream, err = s.deps.PocketTTS.StreamAudioMP3(r.Context(), text, voice)
} else {
if s.deps.Kokoro == nil {
jsonError(w, http.StatusServiceUnavailable, "kokoro not configured")
return
if cfai.IsCFAIVoice(voice) {
if s.deps.CFAI == nil {
jsonError(w, http.StatusServiceUnavailable, "cloudflare AI TTS not configured")
return
}
audioStream, err = s.deps.CFAI.StreamAudioMP3(r.Context(), text, voice)
} else if pockettts.IsPocketTTSVoice(voice) {
if s.deps.PocketTTS == nil {
jsonError(w, http.StatusServiceUnavailable, "pocket-tts not configured")
return
}
audioStream, err = s.deps.PocketTTS.StreamAudioMP3(r.Context(), text, voice)
} else {
if s.deps.Kokoro == nil {
jsonError(w, http.StatusServiceUnavailable, "kokoro not configured")
return
}
audioStream, err = s.deps.Kokoro.StreamAudioMP3(r.Context(), text, voice)
}
audioStream, err = s.deps.Kokoro.StreamAudioMP3(r.Context(), text, voice)
}
if err != nil {
s.deps.Log.Error("handleAudioStream: TTS stream failed", "slug", slug, "n", n, "voice", voice, "err", err)
s.deps.Log.Error("handleAudioStream: TTS stream failed", "slug", slug, "n", n, "voice", voice, "format", format, "err", err)
jsonError(w, http.StatusInternalServerError, "tts stream failed")
return
}
@@ -787,11 +856,11 @@ func (s *Server) handleAudioStream(w http.ResponseWriter, r *http.Request) {
go func() {
uploadDone <- s.deps.AudioStore.PutAudioStream(
context.Background(), // use background — request ctx may cancel after client disconnects
audioKey, pr, -1, "audio/mpeg",
audioKey, pr, -1, contentType,
)
}()
w.Header().Set("Content-Type", "audio/mpeg")
w.Header().Set("Content-Type", contentType)
w.Header().Set("Cache-Control", "no-store")
w.Header().Set("X-Accel-Buffering", "no") // disable nginx/caddy buffering
w.WriteHeader(http.StatusOK)
@@ -836,6 +905,227 @@ func (s *Server) handleAudioStream(w http.ResponseWriter, r *http.Request) {
// on its next poll as soon as the MinIO object is present.
}
// handleTTSAnnounce handles GET /api/tts-announce.
//
// Streams a short TTS clip for arbitrary text — used by the UI to announce
// the upcoming chapter number/title through the real <audio> element instead
// of the Web Speech API (which is silently muted on mobile after the audio
// session ends).
//
// Query params:
// - text — the text to synthesize (required, max 300 chars)
// - voice — voice ID (defaults to server default)
// - format — "mp3" or "wav" (default "mp3")
//
// No MinIO caching — announcement clips are tiny and ephemeral.
func (s *Server) handleTTSAnnounce(w http.ResponseWriter, r *http.Request) {
text := r.URL.Query().Get("text")
if text == "" {
jsonError(w, http.StatusBadRequest, "text is required")
return
}
if len(text) > 300 {
text = text[:300]
}
voice := r.URL.Query().Get("voice")
if voice == "" {
voice = s.cfg.DefaultVoice
}
format := r.URL.Query().Get("format")
if format != "wav" {
format = "mp3"
}
contentType := "audio/mpeg"
if format == "wav" {
contentType = "audio/wav"
}
var (
audioStream io.ReadCloser
err error
)
if format == "wav" {
if cfai.IsCFAIVoice(voice) {
if s.deps.CFAI == nil {
jsonError(w, http.StatusServiceUnavailable, "cloudflare AI TTS not configured")
return
}
audioStream, err = s.deps.CFAI.StreamAudioWAV(r.Context(), text, voice)
} else if pockettts.IsPocketTTSVoice(voice) {
if s.deps.PocketTTS == nil {
jsonError(w, http.StatusServiceUnavailable, "pocket-tts not configured")
return
}
audioStream, err = s.deps.PocketTTS.StreamAudioWAV(r.Context(), text, voice)
} else {
if s.deps.Kokoro == nil {
jsonError(w, http.StatusServiceUnavailable, "kokoro not configured")
return
}
audioStream, err = s.deps.Kokoro.StreamAudioWAV(r.Context(), text, voice)
}
} else {
if cfai.IsCFAIVoice(voice) {
if s.deps.CFAI == nil {
jsonError(w, http.StatusServiceUnavailable, "cloudflare AI TTS not configured")
return
}
audioStream, err = s.deps.CFAI.StreamAudioMP3(r.Context(), text, voice)
} else if pockettts.IsPocketTTSVoice(voice) {
if s.deps.PocketTTS == nil {
jsonError(w, http.StatusServiceUnavailable, "pocket-tts not configured")
return
}
audioStream, err = s.deps.PocketTTS.StreamAudioMP3(r.Context(), text, voice)
} else {
if s.deps.Kokoro == nil {
jsonError(w, http.StatusServiceUnavailable, "kokoro not configured")
return
}
audioStream, err = s.deps.Kokoro.StreamAudioMP3(r.Context(), text, voice)
}
}
if err != nil {
s.deps.Log.Error("handleTTSAnnounce: TTS stream failed", "voice", voice, "err", err)
jsonError(w, http.StatusInternalServerError, "tts stream failed")
return
}
defer audioStream.Close()
w.Header().Set("Content-Type", contentType)
w.Header().Set("Cache-Control", "no-store")
w.Header().Set("X-Accel-Buffering", "no")
w.WriteHeader(http.StatusOK)
flusher, canFlush := w.(http.Flusher)
buf := make([]byte, 32*1024)
for {
nr, readErr := audioStream.Read(buf)
if nr > 0 {
if _, writeErr := w.Write(buf[:nr]); writeErr != nil {
return
}
if canFlush {
flusher.Flush()
}
}
if readErr != nil {
break
}
}
}
// handleAudioPreview serves a short audio preview for a chapter.
//
// CF AI voices are batch-only and can take 1-2+ minutes to generate a full
// chapter. This endpoint generates only the FIRST chunk of text (~1 800 chars,
// roughly 1-2 minutes of audio) so the client can start playing immediately
// while the full audio is generated in the background by the runner.
//
// Fast path: if a preview object already exists in MinIO, redirects to its
// presigned URL (no regeneration).
//
// Slow path: generates the first chunk via CF AI, streams the MP3 bytes to the
// client, and simultaneously uploads to MinIO under a "_preview" key so future
// requests hit the fast path.
//
// Only CF AI voices are expected here. Calling this with a Kokoro/PocketTTS
// voice falls back to the normal audio-stream endpoint behaviour.
//
// Query params:
//
// voice (required — must be a cfai: voice)
func (s *Server) handleAudioPreview(w http.ResponseWriter, r *http.Request) {
slug := r.PathValue("slug")
n, err := strconv.Atoi(r.PathValue("n"))
if err != nil || n < 1 {
jsonError(w, http.StatusBadRequest, "invalid chapter")
return
}
voice := r.URL.Query().Get("voice")
if voice == "" {
voice = s.cfg.DefaultVoice
}
if s.deps.CFAI == nil {
jsonError(w, http.StatusServiceUnavailable, "cloudflare AI TTS not configured")
return
}
// Preview key: same as normal key with a "_preview" suffix before the extension.
// e.g. slug/1/cfai:luna_preview.mp3
previewKey := s.deps.AudioStore.AudioObjectKeyExt(slug, n, voice+"_preview", "mp3")
// ── Fast path: preview already in MinIO ──────────────────────────────────
if s.deps.AudioStore.AudioExists(r.Context(), previewKey) {
presignURL, err := s.deps.PresignStore.PresignAudio(r.Context(), previewKey, 1*time.Hour)
if err != nil {
s.deps.Log.Error("handleAudioPreview: PresignAudio failed", "slug", slug, "n", n, "err", err)
jsonError(w, http.StatusInternalServerError, "presign failed")
return
}
http.Redirect(w, r, presignURL, http.StatusFound)
return
}
// ── Slow path: generate first chunk + stream + save ──────────────────────
// Read the chapter text.
raw, err := s.deps.BookReader.ReadChapter(r.Context(), slug, n)
if err != nil {
s.deps.Log.Error("handleAudioPreview: ReadChapter failed", "slug", slug, "n", n, "err", err)
jsonError(w, http.StatusNotFound, "chapter not found")
return
}
text := stripMarkdown(raw)
if text == "" {
jsonError(w, http.StatusUnprocessableEntity, "chapter text is empty")
return
}
// Take only the first ~1 800 characters — one CF AI chunk, roughly 1-2 min.
const previewChars = 1800
firstChunk := text
if len([]rune(text)) > previewChars {
runes := []rune(text)
firstChunk = string(runes[:previewChars])
// Walk back to last sentence boundary (. ! ?) to avoid a mid-word cut.
for i := previewChars - 1; i > previewChars/2; i-- {
r := runes[i]
if r == '.' || r == '!' || r == '?' || r == '\n' {
firstChunk = string(runes[:i+1])
break
}
}
}
// Generate the preview chunk via CF AI.
mp3, err := s.deps.CFAI.GenerateAudio(r.Context(), firstChunk, voice)
if err != nil {
s.deps.Log.Error("handleAudioPreview: GenerateAudio failed", "slug", slug, "n", n, "voice", voice, "err", err)
jsonError(w, http.StatusInternalServerError, "tts generation failed")
return
}
// Upload to MinIO in the background so the next request hits the fast path.
go func() {
if uploadErr := s.deps.AudioStore.PutAudio(
context.Background(), previewKey, mp3,
); uploadErr != nil {
s.deps.Log.Error("handleAudioPreview: MinIO upload failed", "key", previewKey, "err", uploadErr)
}
}()
w.Header().Set("Content-Type", "audio/mpeg")
w.Header().Set("Content-Length", strconv.Itoa(len(mp3)))
w.Header().Set("Cache-Control", "no-store")
w.WriteHeader(http.StatusOK)
_, _ = w.Write(mp3)
}
// ── Translation ────────────────────────────────────────────────────────────────
// supportedTranslationLangs is the set of target locales the backend accepts.
@@ -989,6 +1279,10 @@ func (s *Server) handleTranslationRead(w http.ResponseWriter, r *http.Request) {
return
}
// Translated chapter content is immutable once generated — cache aggressively.
// The browser and any intermediary (CDN, SvelteKit fetch cache) can reuse this
// response for 1 hour without hitting MinIO again.
w.Header().Set("Cache-Control", "public, max-age=3600, stale-while-revalidate=86400")
writeJSON(w, 0, map[string]string{"html": buf.String(), "lang": lang})
}
@@ -1081,6 +1375,166 @@ func (s *Server) handleAdminTranslationBulk(w http.ResponseWriter, r *http.Reque
})
}
// ── Admin Audio ────────────────────────────────────────────────────────────────
// handleAdminAudioJobs handles GET /api/admin/audio/jobs.
// Returns all audio jobs, optionally filtered by slug (?slug=...).
// Sorted by started descending.
func (s *Server) handleAdminAudioJobs(w http.ResponseWriter, r *http.Request) {
tasks, err := s.deps.TaskReader.ListAudioTasks(r.Context())
if err != nil {
s.deps.Log.Error("handleAdminAudioJobs: ListAudioTasks failed", "err", err)
jsonError(w, http.StatusInternalServerError, "failed to list audio jobs")
return
}
// Optional slug filter.
slugFilter := r.URL.Query().Get("slug")
type jobRow struct {
ID string `json:"id"`
CacheKey string `json:"cache_key"`
Slug string `json:"slug"`
Chapter int `json:"chapter"`
Voice string `json:"voice"`
Status string `json:"status"`
WorkerID string `json:"worker_id"`
ErrorMessage string `json:"error_message"`
Started string `json:"started"`
Finished string `json:"finished"`
}
rows := make([]jobRow, 0, len(tasks))
for _, t := range tasks {
if slugFilter != "" && t.Slug != slugFilter {
continue
}
rows = append(rows, jobRow{
ID: t.ID,
CacheKey: t.CacheKey,
Slug: t.Slug,
Chapter: t.Chapter,
Voice: t.Voice,
Status: string(t.Status),
WorkerID: t.WorkerID,
ErrorMessage: t.ErrorMessage,
Started: t.Started.Format(time.RFC3339),
Finished: t.Finished.Format(time.RFC3339),
})
}
writeJSON(w, 0, map[string]any{"jobs": rows, "total": len(rows)})
}
// handleAdminAudioBulk handles POST /api/admin/audio/bulk.
// Body: {"slug": "...", "voice": "af_bella", "from": 1, "to": 100, "skip_existing": true}
//
// Enqueues one audio task per chapter in [from, to].
// skip_existing (default true): skip chapters already cached in MinIO — use this
// to resume a previously interrupted bulk job.
// force: if true, enqueue even when a pending/running task already exists.
// Max 1000 chapters per request.
func (s *Server) handleAdminAudioBulk(w http.ResponseWriter, r *http.Request) {
var body struct {
Slug string `json:"slug"`
Voice string `json:"voice"`
From int `json:"from"`
To int `json:"to"`
SkipExisting *bool `json:"skip_existing"` // pointer so we can detect omission
Force bool `json:"force"`
}
if err := json.NewDecoder(r.Body).Decode(&body); err != nil {
jsonError(w, http.StatusBadRequest, "invalid JSON body")
return
}
if body.Slug == "" {
jsonError(w, http.StatusBadRequest, "slug is required")
return
}
if body.Voice == "" {
body.Voice = s.cfg.DefaultVoice
}
if body.From < 1 || body.To < body.From {
jsonError(w, http.StatusBadRequest, "from must be >= 1 and to must be >= from")
return
}
if body.To-body.From > 999 {
jsonError(w, http.StatusBadRequest, "range too large; max 1000 chapters per request")
return
}
// skip_existing defaults to true (resume-friendly).
skipExisting := true
if body.SkipExisting != nil {
skipExisting = *body.SkipExisting
}
var taskIDs []string
skipped := 0
for n := body.From; n <= body.To; n++ {
// Skip chapters already cached in MinIO.
if skipExisting {
audioKey := s.deps.AudioStore.AudioObjectKey(body.Slug, n, body.Voice)
if s.deps.AudioStore.AudioExists(r.Context(), audioKey) {
skipped++
continue
}
}
// Skip chapters with an active (pending/running) task unless force=true.
if !body.Force {
cacheKey := fmt.Sprintf("%s/%d/%s", body.Slug, n, body.Voice)
existing, found, _ := s.deps.TaskReader.GetAudioTask(r.Context(), cacheKey)
if found && (existing.Status == domain.TaskStatusPending || existing.Status == domain.TaskStatusRunning) {
skipped++
continue
}
}
id, err := s.deps.Producer.CreateAudioTask(r.Context(), body.Slug, n, body.Voice)
if err != nil {
s.deps.Log.Error("handleAdminAudioBulk: CreateAudioTask failed",
"slug", body.Slug, "chapter", n, "voice", body.Voice, "err", err)
jsonError(w, http.StatusInternalServerError,
fmt.Sprintf("failed to create task for chapter %d: %s", n, err))
return
}
taskIDs = append(taskIDs, id)
}
writeJSON(w, http.StatusAccepted, map[string]any{
"enqueued": len(taskIDs),
"skipped": skipped,
"task_ids": taskIDs,
})
}
// handleAdminAudioCancelBulk handles POST /api/admin/audio/cancel-bulk.
// Body: {"slug": "..."}
// Cancels all pending and running audio tasks for the given slug.
func (s *Server) handleAdminAudioCancelBulk(w http.ResponseWriter, r *http.Request) {
var body struct {
Slug string `json:"slug"`
}
if err := json.NewDecoder(r.Body).Decode(&body); err != nil {
jsonError(w, http.StatusBadRequest, "invalid JSON body")
return
}
if body.Slug == "" {
jsonError(w, http.StatusBadRequest, "slug is required")
return
}
cancelled, err := s.deps.Producer.CancelAudioTasksBySlug(r.Context(), body.Slug)
if err != nil {
s.deps.Log.Error("handleAdminAudioCancelBulk: CancelAudioTasksBySlug failed",
"slug", body.Slug, "err", err)
jsonError(w, http.StatusInternalServerError, "failed to cancel tasks")
return
}
writeJSON(w, 0, map[string]any{"cancelled": cancelled})
}
// ── Voices ─────────────────────────────────────────────────────────────────────
// handleVoices returns {"voices": [...]} — the merged voice list from Kokoro and pocket-tts.
func (s *Server) handleVoices(w http.ResponseWriter, r *http.Request) {
@@ -1152,6 +1606,9 @@ func (s *Server) handlePresignVoiceSample(w http.ResponseWriter, r *http.Request
}
key := kokoro.VoiceSampleKey(voice)
if cfai.IsCFAIVoice(voice) {
key = cfai.VoiceSampleKey(voice)
}
// Generate sample on demand when it is not in MinIO yet.
if !s.deps.AudioStore.AudioExists(r.Context(), key) {
@@ -1161,7 +1618,13 @@ func (s *Server) handlePresignVoiceSample(w http.ResponseWriter, r *http.Request
mp3 []byte
err error
)
if pockettts.IsPocketTTSVoice(voice) {
if cfai.IsCFAIVoice(voice) {
if s.deps.CFAI == nil {
jsonError(w, http.StatusServiceUnavailable, "cloudflare AI TTS not configured")
return
}
mp3, err = s.deps.CFAI.GenerateAudio(r.Context(), voiceSampleText, voice)
} else if pockettts.IsPocketTTSVoice(voice) {
if s.deps.PocketTTS == nil {
jsonError(w, http.StatusServiceUnavailable, "pocket-tts not configured")
return
@@ -1538,6 +2001,109 @@ func stripMarkdown(src string) string {
return strings.TrimSpace(src)
}
// ── EPUB export ───────────────────────────────────────────────────────────────
// handleExportEPUB handles GET /api/export/{slug}.
// Generates and streams an EPUB file for the book identified by slug.
// Optional query params: from=N&to=N to limit the chapter range (default: all).
func (s *Server) handleExportEPUB(w http.ResponseWriter, r *http.Request) {
slug := r.PathValue("slug")
if slug == "" {
jsonError(w, http.StatusBadRequest, "missing slug")
return
}
ctx := r.Context()
// Parse optional from/to range.
fromStr := r.URL.Query().Get("from")
toStr := r.URL.Query().Get("to")
fromN, toN := 0, 0
if fromStr != "" {
v, err := strconv.Atoi(fromStr)
if err != nil || v < 1 {
jsonError(w, http.StatusBadRequest, "invalid 'from' param")
return
}
fromN = v
}
if toStr != "" {
v, err := strconv.Atoi(toStr)
if err != nil || v < 1 {
jsonError(w, http.StatusBadRequest, "invalid 'to' param")
return
}
toN = v
}
// Fetch book metadata for title and author.
meta, inLib, err := s.deps.BookReader.ReadMetadata(ctx, slug)
if err != nil || !inLib {
s.deps.Log.Warn("handleExportEPUB: book not found", "slug", slug, "err", err)
jsonError(w, http.StatusNotFound, "book not found")
return
}
// List all chapters.
chapters, err := s.deps.BookReader.ListChapters(ctx, slug)
if err != nil {
s.deps.Log.Error("handleExportEPUB: ListChapters failed", "slug", slug, "err", err)
jsonError(w, http.StatusInternalServerError, "failed to list chapters")
return
}
// Filter chapters by from/to range.
var filtered []epubChapter
for _, ch := range chapters {
if fromN > 0 && ch.Number < fromN {
continue
}
if toN > 0 && ch.Number > toN {
continue
}
// Fetch markdown from MinIO.
mdText, readErr := s.deps.BookReader.ReadChapter(ctx, slug, ch.Number)
if readErr != nil {
s.deps.Log.Warn("handleExportEPUB: ReadChapter failed", "slug", slug, "n", ch.Number, "err", readErr)
// Skip chapters that cannot be fetched.
continue
}
// Convert markdown to HTML using goldmark.
md := goldmark.New()
var htmlBuf bytes.Buffer
if convErr := md.Convert([]byte(mdText), &htmlBuf); convErr != nil {
htmlBuf.Reset()
htmlBuf.WriteString("<p>" + mdText + "</p>")
}
filtered = append(filtered, epubChapter{
Number: ch.Number,
Title: ch.Title,
HTML: htmlBuf.String(),
})
}
if len(filtered) == 0 {
jsonError(w, http.StatusNotFound, "no chapters found in the requested range")
return
}
epubBytes, err := generateEPUB(slug, meta.Title, meta.Author, filtered)
if err != nil {
s.deps.Log.Error("handleExportEPUB: generateEPUB failed", "slug", slug, "err", err)
jsonError(w, http.StatusInternalServerError, "failed to generate EPUB")
return
}
w.Header().Set("Content-Type", "application/epub+zip")
w.Header().Set("Content-Disposition", fmt.Sprintf(`attachment; filename="%s.epub"`, slug))
w.Header().Set("Content-Length", strconv.Itoa(len(epubBytes)))
w.WriteHeader(http.StatusOK)
w.Write(epubBytes)
}
// ── Hardcoded Kokoro voice fallback ───────────────────────────────────────────
// kokoroVoiceIDs is the built-in fallback list of Kokoro voice IDs used when

View File

@@ -0,0 +1,233 @@
package backend
import (
"context"
"encoding/json"
"fmt"
"net/http"
"strings"
"sync"
"time"
"github.com/libnovel/backend/internal/cfai"
"github.com/libnovel/backend/internal/domain"
)
// ── Cancel registry ────────────────────────────────────────────────────────────
// cancelJobsMu guards cancelJobs.
var cancelJobsMu sync.Mutex
// cancelJobs maps a job ID to its CancelFunc. Entries are added when a batch
// job starts and removed when it finishes or is cancelled.
var cancelJobs = map[string]context.CancelFunc{}
func registerCancelJob(id string, cancel context.CancelFunc) {
cancelJobsMu.Lock()
cancelJobs[id] = cancel
cancelJobsMu.Unlock()
}
func deregisterCancelJob(id string) {
cancelJobsMu.Lock()
delete(cancelJobs, id)
cancelJobsMu.Unlock()
}
// ── AI Job list / get / cancel ─────────────────────────────────────────────────
// handleAdminListAIJobs handles GET /api/admin/ai-jobs.
// Returns all ai_job records sorted by started descending.
func (s *Server) handleAdminListAIJobs(w http.ResponseWriter, r *http.Request) {
if s.deps.AIJobStore == nil {
jsonError(w, http.StatusServiceUnavailable, "ai job store not configured")
return
}
jobs, err := s.deps.AIJobStore.ListAIJobs(r.Context())
if err != nil {
s.deps.Log.Error("admin: list ai jobs failed", "err", err)
jsonError(w, http.StatusInternalServerError, "list ai jobs: "+err.Error())
return
}
writeJSON(w, 0, map[string]any{"jobs": jobs})
}
// handleAdminGetAIJob handles GET /api/admin/ai-jobs/{id}.
func (s *Server) handleAdminGetAIJob(w http.ResponseWriter, r *http.Request) {
if s.deps.AIJobStore == nil {
jsonError(w, http.StatusServiceUnavailable, "ai job store not configured")
return
}
id := r.PathValue("id")
job, ok, err := s.deps.AIJobStore.GetAIJob(r.Context(), id)
if err != nil {
jsonError(w, http.StatusInternalServerError, err.Error())
return
}
if !ok {
jsonError(w, http.StatusNotFound, fmt.Sprintf("job %q not found", id))
return
}
writeJSON(w, 0, job)
}
// handleAdminCancelAIJob handles POST /api/admin/ai-jobs/{id}/cancel.
// Marks the job as cancelled in PB and cancels the in-memory context if present.
func (s *Server) handleAdminCancelAIJob(w http.ResponseWriter, r *http.Request) {
if s.deps.AIJobStore == nil {
jsonError(w, http.StatusServiceUnavailable, "ai job store not configured")
return
}
id := r.PathValue("id")
// Cancel in-memory context if the job is still running in this process.
cancelJobsMu.Lock()
if cancel, ok := cancelJobs[id]; ok {
cancel()
}
cancelJobsMu.Unlock()
// Mark as cancelled in PB.
if err := s.deps.AIJobStore.UpdateAIJob(r.Context(), id, map[string]any{
"status": string(domain.TaskStatusCancelled),
"finished": time.Now().Format(time.RFC3339),
}); err != nil {
s.deps.Log.Error("admin: cancel ai job failed", "id", id, "err", err)
jsonError(w, http.StatusInternalServerError, "cancel ai job: "+err.Error())
return
}
s.deps.Log.Info("admin: ai job cancelled", "id", id)
writeJSON(w, 0, map[string]any{"cancelled": true})
}
// ── Auto-prompt ────────────────────────────────────────────────────────────────
// autoPromptRequest is the JSON body for POST /api/admin/image-gen/auto-prompt.
type autoPromptRequest struct {
// Slug is the book slug.
Slug string `json:"slug"`
// Type is "cover" or "chapter".
Type string `json:"type"`
// Chapter number (required when type == "chapter").
Chapter int `json:"chapter"`
// Model is the text-gen model to use. Defaults to DefaultTextModel.
Model string `json:"model"`
}
// autoPromptResponse is returned by POST /api/admin/image-gen/auto-prompt.
type autoPromptResponse struct {
Prompt string `json:"prompt"`
Model string `json:"model"`
}
// handleAdminImageGenAutoPrompt handles POST /api/admin/image-gen/auto-prompt.
//
// Uses the text generation model to create a vivid image generation prompt
// based on the book's description (for covers) or chapter title/content (for chapters).
func (s *Server) handleAdminImageGenAutoPrompt(w http.ResponseWriter, r *http.Request) {
if s.deps.TextGen == nil {
jsonError(w, http.StatusServiceUnavailable, "text generation not configured")
return
}
var req autoPromptRequest
if err := json.NewDecoder(r.Body).Decode(&req); err != nil {
jsonError(w, http.StatusBadRequest, "parse body: "+err.Error())
return
}
if strings.TrimSpace(req.Slug) == "" {
jsonError(w, http.StatusBadRequest, "slug is required")
return
}
if req.Type != "cover" && req.Type != "chapter" {
jsonError(w, http.StatusBadRequest, `type must be "cover" or "chapter"`)
return
}
meta, ok, err := s.deps.BookReader.ReadMetadata(r.Context(), req.Slug)
if err != nil {
jsonError(w, http.StatusInternalServerError, "read metadata: "+err.Error())
return
}
if !ok {
jsonError(w, http.StatusNotFound, fmt.Sprintf("book %q not found", req.Slug))
return
}
model := req.Model
if model == "" {
model = string(cfai.DefaultTextModel)
}
var userPrompt string
if req.Type == "cover" {
userPrompt = fmt.Sprintf(
"Book: \"%s\"\nAuthor: %s\nGenres: %s\n\nDescription:\n%s",
meta.Title,
meta.Author,
strings.Join(meta.Genres, ", "),
meta.Summary,
)
} else {
// For chapter images, use chapter title if available.
chapterTitle := fmt.Sprintf("Chapter %d", req.Chapter)
if req.Chapter > 0 {
chapters, listErr := s.deps.BookReader.ListChapters(r.Context(), req.Slug)
if listErr == nil {
for _, ch := range chapters {
if ch.Number == req.Chapter {
chapterTitle = ch.Title
break
}
}
}
}
userPrompt = fmt.Sprintf(
"Book: \"%s\"\nGenres: %s\nChapter: %s\n\nBook description:\n%s",
meta.Title,
strings.Join(meta.Genres, ", "),
chapterTitle,
meta.Summary,
)
}
systemPrompt := buildAutoPromptSystem(req.Type)
s.deps.Log.Info("admin: image auto-prompt requested",
"slug", req.Slug, "type", req.Type, "chapter", req.Chapter, "model", model)
result, genErr := s.deps.TextGen.Generate(r.Context(), cfai.TextRequest{
Model: cfai.TextModel(model),
Messages: []cfai.TextMessage{
{Role: "system", Content: systemPrompt},
{Role: "user", Content: userPrompt},
},
MaxTokens: 256,
})
if genErr != nil {
s.deps.Log.Error("admin: auto-prompt failed", "err", genErr)
jsonError(w, http.StatusBadGateway, "text generation failed: "+genErr.Error())
return
}
writeJSON(w, 0, autoPromptResponse{
Prompt: strings.TrimSpace(result),
Model: model,
})
}
func buildAutoPromptSystem(imageType string) string {
if imageType == "cover" {
return `You are a professional prompt engineer for AI image generation (Stable Diffusion / FLUX models). ` +
`Given a book's title, genres, and description, write a single vivid image generation prompt ` +
`for a book cover. The prompt should describe the visual composition, art style, lighting, ` +
`and mood without mentioning text or typography. ` +
`Format: comma-separated visual descriptors, 30-60 words. ` +
`Output ONLY the prompt — no explanation, no quotes, no labels.`
}
return `You are a professional prompt engineer for AI image generation (Stable Diffusion / FLUX models). ` +
`Given a book's title, genres, and a specific chapter title, write a single vivid scene illustration prompt. ` +
`Describe the scene, characters, setting, lighting, and art style. ` +
`Format: comma-separated visual descriptors, 30-60 words. ` +
`Output ONLY the prompt — no explanation, no quotes, no labels.`
}


@@ -0,0 +1,117 @@
package backend
import (
"errors"
"net/http"
"github.com/libnovel/backend/internal/storage"
)
// handleAdminArchiveBook handles PATCH /api/admin/books/{slug}/archive.
// Soft-deletes a book by setting archived=true in PocketBase and updating the
// Meilisearch document so it is excluded from all public search results.
// The book data is preserved and can be restored with the unarchive endpoint.
func (s *Server) handleAdminArchiveBook(w http.ResponseWriter, r *http.Request) {
slug := r.PathValue("slug")
if slug == "" {
jsonError(w, http.StatusBadRequest, "missing slug")
return
}
if s.deps.BookAdminStore == nil {
jsonError(w, http.StatusServiceUnavailable, "book admin store not configured")
return
}
if err := s.deps.BookAdminStore.ArchiveBook(r.Context(), slug); err != nil {
if errors.Is(err, storage.ErrNotFound) {
jsonError(w, http.StatusNotFound, "book not found")
return
}
s.deps.Log.Error("archive book failed", "slug", slug, "err", err)
jsonError(w, http.StatusInternalServerError, err.Error())
return
}
// Update the Meilisearch document so the archived flag takes effect
// immediately in search/catalogue results.
if meta, ok, err := s.deps.BookReader.ReadMetadata(r.Context(), slug); err == nil && ok {
if upsertErr := s.deps.SearchIndex.UpsertBook(r.Context(), meta); upsertErr != nil {
s.deps.Log.Warn("archive book: meili upsert failed", "slug", slug, "err", upsertErr)
}
}
s.deps.Log.Info("book archived", "slug", slug)
writeJSON(w, http.StatusOK, map[string]string{"slug": slug, "status": "archived"})
}
// handleAdminUnarchiveBook handles PATCH /api/admin/books/{slug}/unarchive.
// Restores a previously archived book by clearing the archived flag, making it
// publicly visible in search and catalogue results again.
func (s *Server) handleAdminUnarchiveBook(w http.ResponseWriter, r *http.Request) {
slug := r.PathValue("slug")
if slug == "" {
jsonError(w, http.StatusBadRequest, "missing slug")
return
}
if s.deps.BookAdminStore == nil {
jsonError(w, http.StatusServiceUnavailable, "book admin store not configured")
return
}
if err := s.deps.BookAdminStore.UnarchiveBook(r.Context(), slug); err != nil {
if errors.Is(err, storage.ErrNotFound) {
jsonError(w, http.StatusNotFound, "book not found")
return
}
s.deps.Log.Error("unarchive book failed", "slug", slug, "err", err)
jsonError(w, http.StatusInternalServerError, err.Error())
return
}
// Sync the updated archived=false state back to Meilisearch.
if meta, ok, err := s.deps.BookReader.ReadMetadata(r.Context(), slug); err == nil && ok {
if upsertErr := s.deps.SearchIndex.UpsertBook(r.Context(), meta); upsertErr != nil {
s.deps.Log.Warn("unarchive book: meili upsert failed", "slug", slug, "err", upsertErr)
}
}
s.deps.Log.Info("book unarchived", "slug", slug)
writeJSON(w, http.StatusOK, map[string]string{"slug": slug, "status": "active"})
}
// handleAdminDeleteBook handles DELETE /api/admin/books/{slug}.
// Permanently removes all data for a book:
// - PocketBase books record and all chapters_idx records
// - All MinIO chapter markdown objects and the cover image
// - Meilisearch document
//
// This operation is irreversible. Use the archive endpoint for soft-deletion.
func (s *Server) handleAdminDeleteBook(w http.ResponseWriter, r *http.Request) {
slug := r.PathValue("slug")
if slug == "" {
jsonError(w, http.StatusBadRequest, "missing slug")
return
}
if s.deps.BookAdminStore == nil {
jsonError(w, http.StatusServiceUnavailable, "book admin store not configured")
return
}
if err := s.deps.BookAdminStore.DeleteBook(r.Context(), slug); err != nil {
if errors.Is(err, storage.ErrNotFound) {
jsonError(w, http.StatusNotFound, "book not found")
return
}
s.deps.Log.Error("delete book failed", "slug", slug, "err", err)
jsonError(w, http.StatusInternalServerError, err.Error())
return
}
// Remove from Meilisearch — best-effort (log on failure, don't fail request).
if err := s.deps.SearchIndex.DeleteBook(r.Context(), slug); err != nil {
s.deps.Log.Warn("delete book: meili delete failed", "slug", slug, "err", err)
}
s.deps.Log.Info("book deleted", "slug", slug)
writeJSON(w, http.StatusOK, map[string]string{"slug": slug, "status": "deleted"})
}


@@ -0,0 +1,792 @@
package backend
// Catalogue enrichment handlers: tagline, genre tagging, content warnings,
// quality scoring, batch cover regeneration, and per-book metadata refresh.
//
// All generation endpoints are admin-only (enforced by the SvelteKit proxy layer).
// All long-running operations support cancellation via r.Context().Done().
// Batch operations use an in-memory cancel registry (cancelJobs map) so the
// frontend can send a cancel request by job ID.
import (
"context"
"crypto/rand"
"encoding/hex"
"encoding/json"
"fmt"
"net/http"
"strings"
"time"
"github.com/libnovel/backend/internal/cfai"
"github.com/libnovel/backend/internal/domain"
)
// ── Tagline ───────────────────────────────────────────────────────────────
// textGenTaglineRequest is the JSON body for POST /api/admin/text-gen/tagline.
type textGenTaglineRequest struct {
Slug string `json:"slug"`
Model string `json:"model"`
MaxTokens int `json:"max_tokens"`
}
// textGenTaglineResponse is returned by POST /api/admin/text-gen/tagline.
type textGenTaglineResponse struct {
OldTagline string `json:"old_tagline"`
NewTagline string `json:"new_tagline"`
Model string `json:"model"`
}
// handleAdminTextGenTagline handles POST /api/admin/text-gen/tagline.
// Generates a 1-sentence marketing hook for a book.
func (s *Server) handleAdminTextGenTagline(w http.ResponseWriter, r *http.Request) {
if s.deps.TextGen == nil {
jsonError(w, http.StatusServiceUnavailable, "text generation not configured")
return
}
var req textGenTaglineRequest
if err := json.NewDecoder(r.Body).Decode(&req); err != nil {
jsonError(w, http.StatusBadRequest, "parse body: "+err.Error())
return
}
if strings.TrimSpace(req.Slug) == "" {
jsonError(w, http.StatusBadRequest, "slug is required")
return
}
meta, ok, err := s.deps.BookReader.ReadMetadata(r.Context(), req.Slug)
if err != nil {
jsonError(w, http.StatusInternalServerError, "read metadata: "+err.Error())
return
}
if !ok {
jsonError(w, http.StatusNotFound, fmt.Sprintf("book %q not found", req.Slug))
return
}
model := cfai.TextModel(req.Model)
if model == "" {
model = cfai.DefaultTextModel
}
system := `You are a copywriter for a web novel platform. ` +
`Given a book's title, genres, and description, write a single punchy tagline ` +
`(one sentence, under 20 words) that hooks a reader. ` +
`Output ONLY the tagline — no quotes, no labels, no explanation.`
user := fmt.Sprintf("Title: %s\nGenres: %s\n\nDescription:\n%s",
meta.Title,
strings.Join(meta.Genres, ", "),
meta.Summary,
)
s.deps.Log.Info("admin: text-gen tagline requested", "slug", req.Slug, "model", model)
result, genErr := s.deps.TextGen.Generate(r.Context(), cfai.TextRequest{
Model: model,
Messages: []cfai.TextMessage{{Role: "system", Content: system}, {Role: "user", Content: user}},
MaxTokens: 64,
})
if genErr != nil {
s.deps.Log.Error("admin: text-gen tagline failed", "err", genErr)
jsonError(w, http.StatusBadGateway, "text generation failed: "+genErr.Error())
return
}
writeJSON(w, 0, textGenTaglineResponse{
OldTagline: "", // BookMeta has no tagline field yet — always empty
NewTagline: strings.TrimSpace(result),
Model: string(model),
})
}
// ── Genres ────────────────────────────────────────────────────────────────
// textGenGenresRequest is the JSON body for POST /api/admin/text-gen/genres.
type textGenGenresRequest struct {
Slug string `json:"slug"`
Model string `json:"model"`
MaxTokens int `json:"max_tokens"`
}
// textGenGenresResponse is returned by POST /api/admin/text-gen/genres.
type textGenGenresResponse struct {
CurrentGenres []string `json:"current_genres"`
ProposedGenres []string `json:"proposed_genres"`
Model string `json:"model"`
}
// handleAdminTextGenGenres handles POST /api/admin/text-gen/genres.
// Suggests a refined genre list based on the book's description.
func (s *Server) handleAdminTextGenGenres(w http.ResponseWriter, r *http.Request) {
if s.deps.TextGen == nil {
jsonError(w, http.StatusServiceUnavailable, "text generation not configured")
return
}
var req textGenGenresRequest
if err := json.NewDecoder(r.Body).Decode(&req); err != nil {
jsonError(w, http.StatusBadRequest, "parse body: "+err.Error())
return
}
if strings.TrimSpace(req.Slug) == "" {
jsonError(w, http.StatusBadRequest, "slug is required")
return
}
meta, ok, err := s.deps.BookReader.ReadMetadata(r.Context(), req.Slug)
if err != nil {
jsonError(w, http.StatusInternalServerError, "read metadata: "+err.Error())
return
}
if !ok {
jsonError(w, http.StatusNotFound, fmt.Sprintf("book %q not found", req.Slug))
return
}
model := cfai.TextModel(req.Model)
if model == "" {
model = cfai.DefaultTextModel
}
system := `You are a genre classification expert for a web novel platform. ` +
`Given a book's title and description, return a JSON array of 2-6 genre tags. ` +
`Use only well-known web novel genres such as: ` +
`Action, Adventure, Comedy, Drama, Fantasy, Historical, Horror, Isekai, Josei, ` +
`Martial Arts, Mature, Mecha, Mystery, Psychological, Romance, School Life, ` +
`Sci-fi, Seinen, Shoujo, Shounen, Slice of Life, Supernatural, System, Tragedy, Wuxia, Xianxia. ` +
`Output ONLY a raw JSON array of strings — no prose, no markdown, no explanation. ` +
`Example: ["Fantasy","Adventure","Action"]`
user := fmt.Sprintf("Title: %s\nCurrent genres: %s\n\nDescription:\n%s",
meta.Title,
strings.Join(meta.Genres, ", "),
meta.Summary,
)
s.deps.Log.Info("admin: text-gen genres requested", "slug", req.Slug, "model", model)
raw, genErr := s.deps.TextGen.Generate(r.Context(), cfai.TextRequest{
Model: model,
Messages: []cfai.TextMessage{{Role: "system", Content: system}, {Role: "user", Content: user}},
MaxTokens: 128,
})
if genErr != nil {
s.deps.Log.Error("admin: text-gen genres failed", "err", genErr)
jsonError(w, http.StatusBadGateway, "text generation failed: "+genErr.Error())
return
}
proposed := parseStringArrayJSON(raw)
writeJSON(w, 0, textGenGenresResponse{
CurrentGenres: meta.Genres,
ProposedGenres: proposed,
Model: string(model),
})
}
// handleAdminTextGenApplyGenres handles POST /api/admin/text-gen/genres/apply.
// Persists the confirmed genre list to PocketBase.
func (s *Server) handleAdminTextGenApplyGenres(w http.ResponseWriter, r *http.Request) {
if s.deps.BookWriter == nil {
jsonError(w, http.StatusServiceUnavailable, "book writer not configured")
return
}
var req struct {
Slug string `json:"slug"`
Genres []string `json:"genres"`
}
if err := json.NewDecoder(r.Body).Decode(&req); err != nil {
jsonError(w, http.StatusBadRequest, "parse body: "+err.Error())
return
}
if strings.TrimSpace(req.Slug) == "" {
jsonError(w, http.StatusBadRequest, "slug is required")
return
}
meta, ok, err := s.deps.BookReader.ReadMetadata(r.Context(), req.Slug)
if err != nil {
jsonError(w, http.StatusInternalServerError, "read metadata: "+err.Error())
return
}
if !ok {
jsonError(w, http.StatusNotFound, fmt.Sprintf("book %q not found", req.Slug))
return
}
meta.Genres = req.Genres
if err := s.deps.BookWriter.WriteMetadata(r.Context(), meta); err != nil {
s.deps.Log.Error("admin: apply genres failed", "slug", req.Slug, "err", err)
jsonError(w, http.StatusInternalServerError, "write metadata: "+err.Error())
return
}
s.deps.Log.Info("admin: genres applied", "slug", req.Slug, "genres", req.Genres)
writeJSON(w, 0, map[string]any{"updated": true})
}
// ── Content warnings ──────────────────────────────────────────────────────
// textGenContentWarningsRequest is the JSON body for POST /api/admin/text-gen/content-warnings.
type textGenContentWarningsRequest struct {
Slug string `json:"slug"`
Model string `json:"model"`
MaxTokens int `json:"max_tokens"`
}
// textGenContentWarningsResponse is returned by POST /api/admin/text-gen/content-warnings.
type textGenContentWarningsResponse struct {
Warnings []string `json:"warnings"`
Model string `json:"model"`
}
// handleAdminTextGenContentWarnings handles POST /api/admin/text-gen/content-warnings.
// Detects mature or sensitive themes in a book's description.
func (s *Server) handleAdminTextGenContentWarnings(w http.ResponseWriter, r *http.Request) {
if s.deps.TextGen == nil {
jsonError(w, http.StatusServiceUnavailable, "text generation not configured")
return
}
var req textGenContentWarningsRequest
if err := json.NewDecoder(r.Body).Decode(&req); err != nil {
jsonError(w, http.StatusBadRequest, "parse body: "+err.Error())
return
}
if strings.TrimSpace(req.Slug) == "" {
jsonError(w, http.StatusBadRequest, "slug is required")
return
}
meta, ok, err := s.deps.BookReader.ReadMetadata(r.Context(), req.Slug)
if err != nil {
jsonError(w, http.StatusInternalServerError, "read metadata: "+err.Error())
return
}
if !ok {
jsonError(w, http.StatusNotFound, fmt.Sprintf("book %q not found", req.Slug))
return
}
model := cfai.TextModel(req.Model)
if model == "" {
model = cfai.DefaultTextModel
}
system := `You are a content moderation assistant for a web novel platform. ` +
`Given a book's title, genres, and description, detect any content warnings that should be shown to readers. ` +
`Choose only relevant warnings from: Violence, Strong Language, Sexual Content, Mature Themes, ` +
`Dark Themes, Gore, Torture, Abuse, Drug Use, Suicide/Self-Harm. ` +
`If the book is clean, return an empty array. ` +
`Output ONLY a raw JSON array of strings — no prose, no markdown. ` +
`Example: ["Violence","Dark Themes"]`
user := fmt.Sprintf("Title: %s\nGenres: %s\n\nDescription:\n%s",
meta.Title,
strings.Join(meta.Genres, ", "),
meta.Summary,
)
s.deps.Log.Info("admin: text-gen content-warnings requested", "slug", req.Slug, "model", model)
raw, genErr := s.deps.TextGen.Generate(r.Context(), cfai.TextRequest{
Model: model,
Messages: []cfai.TextMessage{{Role: "system", Content: system}, {Role: "user", Content: user}},
MaxTokens: 128,
})
if genErr != nil {
s.deps.Log.Error("admin: text-gen content-warnings failed", "err", genErr)
jsonError(w, http.StatusBadGateway, "text generation failed: "+genErr.Error())
return
}
warnings := parseStringArrayJSON(raw)
writeJSON(w, 0, textGenContentWarningsResponse{
Warnings: warnings,
Model: string(model),
})
}
// ── Quality score ─────────────────────────────────────────────────────────
// textGenQualityScoreRequest is the JSON body for POST /api/admin/text-gen/quality-score.
type textGenQualityScoreRequest struct {
Slug string `json:"slug"`
Model string `json:"model"`
MaxTokens int `json:"max_tokens"`
}
// textGenQualityScoreResponse is returned by POST /api/admin/text-gen/quality-score.
type textGenQualityScoreResponse struct {
Score int `json:"score"` // 1-5
Feedback string `json:"feedback"` // brief reasoning
Model string `json:"model"`
}
// handleAdminTextGenQualityScore handles POST /api/admin/text-gen/quality-score.
// Rates the book description quality on a 1-5 scale with brief feedback.
func (s *Server) handleAdminTextGenQualityScore(w http.ResponseWriter, r *http.Request) {
if s.deps.TextGen == nil {
jsonError(w, http.StatusServiceUnavailable, "text generation not configured")
return
}
var req textGenQualityScoreRequest
if err := json.NewDecoder(r.Body).Decode(&req); err != nil {
jsonError(w, http.StatusBadRequest, "parse body: "+err.Error())
return
}
if strings.TrimSpace(req.Slug) == "" {
jsonError(w, http.StatusBadRequest, "slug is required")
return
}
meta, ok, err := s.deps.BookReader.ReadMetadata(r.Context(), req.Slug)
if err != nil {
jsonError(w, http.StatusInternalServerError, "read metadata: "+err.Error())
return
}
if !ok {
jsonError(w, http.StatusNotFound, fmt.Sprintf("book %q not found", req.Slug))
return
}
model := cfai.TextModel(req.Model)
if model == "" {
model = cfai.DefaultTextModel
}
system := `You are a book description quality reviewer for a web novel platform. ` +
`Rate the provided description on a scale of 1-5 where: ` +
`1=poor (vague/too short), 2=below average, 3=average, 4=good, 5=excellent (engaging/detailed). ` +
`Respond with ONLY a JSON object: {"score": <1-5>, "feedback": "<one sentence explanation>"}. ` +
`No markdown, no extra text.`
user := fmt.Sprintf("Title: %s\nGenres: %s\n\nDescription:\n%s",
meta.Title,
strings.Join(meta.Genres, ", "),
meta.Summary,
)
s.deps.Log.Info("admin: text-gen quality-score requested", "slug", req.Slug, "model", model)
raw, genErr := s.deps.TextGen.Generate(r.Context(), cfai.TextRequest{
Model: model,
Messages: []cfai.TextMessage{{Role: "system", Content: system}, {Role: "user", Content: user}},
MaxTokens: 128,
})
if genErr != nil {
s.deps.Log.Error("admin: text-gen quality-score failed", "err", genErr)
jsonError(w, http.StatusBadGateway, "text generation failed: "+genErr.Error())
return
}
var parsed struct {
Score int `json:"score"`
Feedback string `json:"feedback"`
}
// Strip markdown fences if any.
clean := extractJSONObject(raw)
if err := json.Unmarshal([]byte(clean), &parsed); err != nil {
// Fallback: try to extract a digit.
parsed.Score = 0
for _, ch := range raw {
if ch >= '1' && ch <= '5' {
parsed.Score = int(ch - '0')
break
}
}
parsed.Feedback = strings.TrimSpace(raw)
}
writeJSON(w, 0, textGenQualityScoreResponse{
Score: parsed.Score,
Feedback: parsed.Feedback,
Model: string(model),
})
}
// ── Batch cover regeneration ──────────────────────────────────────────────
// batchCoverEvent is one SSE event emitted during batch cover regeneration.
type batchCoverEvent struct {
// JobID is the opaque identifier clients use to cancel this job.
JobID string `json:"job_id,omitempty"`
Done int `json:"done"`
Total int `json:"total"`
Slug string `json:"slug,omitempty"`
Error string `json:"error,omitempty"`
Skipped bool `json:"skipped,omitempty"`
Finish bool `json:"finish,omitempty"`
}
// handleAdminBatchCovers handles POST /api/admin/catalogue/batch-covers.
//
// Streams SSE events as it generates covers for every book that has no cover
// stored in MinIO. Each event carries progress info. The final event has Finish=true.
//
// Supports from_item/to_item to process a sub-range of the catalogue (0-based indices).
// Supports job_id to resume a previously interrupted job.
// The job can be cancelled by calling POST /api/admin/ai-jobs/{id}/cancel.
func (s *Server) handleAdminBatchCovers(w http.ResponseWriter, r *http.Request) {
if s.deps.TextGen == nil || s.deps.ImageGen == nil {
jsonError(w, http.StatusServiceUnavailable, "image/text generation not configured")
return
}
if s.deps.CoverStore == nil {
jsonError(w, http.StatusServiceUnavailable, "cover store not configured")
return
}
var reqBody struct {
Model string `json:"model"`
NumSteps int `json:"num_steps"`
Width int `json:"width"`
Height int `json:"height"`
FromItem int `json:"from_item"`
ToItem int `json:"to_item"`
JobID string `json:"job_id"`
}
// Body is optional — defaults used if absent.
json.NewDecoder(r.Body).Decode(&reqBody) //nolint:errcheck
allBooks, err := s.deps.BookReader.ListBooks(r.Context())
if err != nil {
jsonError(w, http.StatusInternalServerError, "list books: "+err.Error())
return
}
// Apply range filter.
books := allBooks
if reqBody.FromItem > 0 || reqBody.ToItem > 0 {
from := reqBody.FromItem
to := reqBody.ToItem
if to == 0 || to >= len(allBooks) {
to = len(allBooks) - 1
}
if from < 0 {
from = 0
}
if from <= to && from < len(allBooks) {
books = allBooks[from : to+1]
}
}
// SSE headers.
w.Header().Set("Content-Type", "text/event-stream")
w.Header().Set("Cache-Control", "no-cache")
w.Header().Set("X-Accel-Buffering", "no")
flusher, canFlush := w.(http.Flusher)
sseWrite := func(evt batchCoverEvent) {
b, _ := json.Marshal(evt)
fmt.Fprintf(w, "data: %s\n\n", b)
if canFlush {
flusher.Flush()
}
}
total := len(books)
done := 0
// Create or resume PB ai_job and register cancel context.
var pbJobID string
resumeFrom := 0
ctx, cancel := context.WithCancel(r.Context())
defer cancel()
if s.deps.AIJobStore != nil {
if reqBody.JobID != "" {
if existing, ok, _ := s.deps.AIJobStore.GetAIJob(r.Context(), reqBody.JobID); ok {
pbJobID = reqBody.JobID
resumeFrom = existing.ItemsDone
done = resumeFrom
_ = s.deps.AIJobStore.UpdateAIJob(r.Context(), pbJobID, map[string]any{
"status": string(domain.TaskStatusRunning),
"items_total": total,
})
}
}
if pbJobID == "" {
id, createErr := s.deps.AIJobStore.CreateAIJob(r.Context(), domain.AIJob{
Kind: "batch-covers",
Status: domain.TaskStatusRunning,
FromItem: reqBody.FromItem,
ToItem: reqBody.ToItem,
ItemsTotal: total,
Started: time.Now(),
})
if createErr == nil {
pbJobID = id
}
}
if pbJobID != "" {
registerCancelJob(pbJobID, cancel)
defer deregisterCancelJob(pbJobID)
}
}
// Use pbJobID as the SSE job_id when available, else a random hex fallback.
sseJobID := pbJobID
if sseJobID == "" {
sseJobID = randomHex(8)
ctx2, cancel2 := context.WithCancel(r.Context())
registerCancelJob(sseJobID, cancel2)
defer deregisterCancelJob(sseJobID)
defer cancel2()
cancel() // release the now-unused first context
ctx = ctx2 // continue with the freshly registered context
}
// Send initial event with jobID so frontend can store it for cancellation.
sseWrite(batchCoverEvent{JobID: sseJobID, Done: done, Total: total})
for i, book := range books {
if ctx.Err() != nil {
break
}
// Skip already-processed items when resuming.
if i < resumeFrom {
continue
}
// Check if cover already exists.
hasCover := s.deps.CoverStore.CoverExists(ctx, book.Slug)
if hasCover {
done++
sseWrite(batchCoverEvent{Done: done, Total: total, Slug: book.Slug, Skipped: true})
if pbJobID != "" && s.deps.AIJobStore != nil {
_ = s.deps.AIJobStore.UpdateAIJob(r.Context(), pbJobID, map[string]any{"items_done": done})
}
continue
}
// Build a prompt from the book metadata.
prompt := buildCoverPrompt(book)
// Generate the image via CF AI.
imgBytes, genErr := s.deps.ImageGen.GenerateImage(ctx, cfai.ImageRequest{
Prompt: prompt,
NumSteps: reqBody.NumSteps,
Width: reqBody.Width,
Height: reqBody.Height,
})
if genErr != nil {
done++
s.deps.Log.Error("batch-covers: image gen failed", "slug", book.Slug, "err", genErr)
sseWrite(batchCoverEvent{Done: done, Total: total, Slug: book.Slug, Error: genErr.Error()})
continue
}
// Save to CoverStore.
if saveErr := s.deps.CoverStore.PutCover(ctx, book.Slug, imgBytes, "image/png"); saveErr != nil {
done++
s.deps.Log.Error("batch-covers: save failed", "slug", book.Slug, "err", saveErr)
sseWrite(batchCoverEvent{Done: done, Total: total, Slug: book.Slug, Error: saveErr.Error()})
continue
}
done++
s.deps.Log.Info("batch-covers: cover generated", "slug", book.Slug)
sseWrite(batchCoverEvent{Done: done, Total: total, Slug: book.Slug})
if pbJobID != "" && s.deps.AIJobStore != nil {
_ = s.deps.AIJobStore.UpdateAIJob(r.Context(), pbJobID, map[string]any{"items_done": done})
}
}
if pbJobID != "" && s.deps.AIJobStore != nil {
status := domain.TaskStatusDone
if ctx.Err() != nil {
status = domain.TaskStatusCancelled
}
_ = s.deps.AIJobStore.UpdateAIJob(r.Context(), pbJobID, map[string]any{
"status": string(status),
"items_done": done,
"finished": time.Now().Format(time.RFC3339),
})
}
sseWrite(batchCoverEvent{Done: done, Total: total, Finish: true})
}
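The `omitempty` tags on `batchCoverEvent` keep the SSE stream compact: progress events carry only `done`/`total` plus whichever optional fields apply. This sketch mirrors the struct (fields and tags copied from above; the job ID and slug are invented) and shows the resulting wire frames:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// batchCoverEvent mirrors the SSE event struct defined above.
type batchCoverEvent struct {
	JobID   string `json:"job_id,omitempty"`
	Done    int    `json:"done"`
	Total   int    `json:"total"`
	Slug    string `json:"slug,omitempty"`
	Error   string `json:"error,omitempty"`
	Skipped bool   `json:"skipped,omitempty"`
	Finish  bool   `json:"finish,omitempty"`
}

// sseFrame renders one event as a text/event-stream data frame,
// matching the fmt.Fprintf(w, "data: %s\n\n", b) call in the handler.
func sseFrame(evt batchCoverEvent) string {
	b, _ := json.Marshal(evt)
	return fmt.Sprintf("data: %s\n\n", b)
}

func main() {
	// Initial event carries the job ID; later progress events omit it.
	fmt.Print(sseFrame(batchCoverEvent{JobID: "abc123", Done: 0, Total: 2}))
	fmt.Print(sseFrame(batchCoverEvent{Done: 1, Total: 2, Slug: "my-book", Skipped: true}))
	fmt.Print(sseFrame(batchCoverEvent{Done: 2, Total: 2, Finish: true}))
}
```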
// handleAdminBatchCoversCancel handles POST /api/admin/catalogue/batch-covers/cancel.
// Cancels an in-progress batch cover job by its job ID.
func (s *Server) handleAdminBatchCoversCancel(w http.ResponseWriter, r *http.Request) {
var req struct {
JobID string `json:"job_id"`
}
if err := json.NewDecoder(r.Body).Decode(&req); err != nil || req.JobID == "" {
jsonError(w, http.StatusBadRequest, "job_id is required")
return
}
cancelJobsMu.Lock()
cancel, ok := cancelJobs[req.JobID]
cancelJobsMu.Unlock()
if !ok {
jsonError(w, http.StatusNotFound, fmt.Sprintf("job %q not found", req.JobID))
return
}
cancel()
s.deps.Log.Info("batch-covers: job cancelled", "job_id", req.JobID)
writeJSON(w, 0, map[string]any{"cancelled": true})
}
// ── Refresh metadata (per-book) ────────────────────────────────────────────
// refreshMetadataEvent is one SSE event during per-book metadata refresh.
type refreshMetadataEvent struct {
Step string `json:"step"` // "description" | "tagline" | "cover"
Done bool `json:"done"`
Error string `json:"error,omitempty"`
}
// handleAdminRefreshMetadata handles POST /api/admin/catalogue/refresh-metadata/{slug}.
//
// Runs description → tagline → cover generation in sequence for a single book
// and streams SSE progress. Interruptable via client disconnect (r.Context()).
func (s *Server) handleAdminRefreshMetadata(w http.ResponseWriter, r *http.Request) {
slug := r.PathValue("slug")
if slug == "" {
jsonError(w, http.StatusBadRequest, "slug is required")
return
}
meta, ok, err := s.deps.BookReader.ReadMetadata(r.Context(), slug)
if err != nil {
jsonError(w, http.StatusInternalServerError, "read metadata: "+err.Error())
return
}
if !ok {
jsonError(w, http.StatusNotFound, fmt.Sprintf("book %q not found", slug))
return
}
// SSE headers.
w.Header().Set("Content-Type", "text/event-stream")
w.Header().Set("Cache-Control", "no-cache")
w.Header().Set("X-Accel-Buffering", "no")
flusher, canFlush := w.(http.Flusher)
sseWrite := func(evt refreshMetadataEvent) {
b, _ := json.Marshal(evt)
fmt.Fprintf(w, "data: %s\n\n", b)
if canFlush {
flusher.Flush()
}
}
ctx := r.Context()
// Step 1 — description.
if s.deps.TextGen != nil {
if ctx.Err() == nil {
newDesc, genErr := s.deps.TextGen.Generate(ctx, cfai.TextRequest{
Model: cfai.DefaultTextModel,
Messages: []cfai.TextMessage{
{Role: "system", Content: `You are a book description writer for a web novel platform. Write an improved description. Respond with ONLY the new description text — no title, no labels, no markdown.`},
{Role: "user", Content: fmt.Sprintf("Title: %s\nGenres: %s\n\nCurrent description:\n%s\n\nInstructions: Write a compelling 2-4 sentence description. Keep it spoiler-free and engaging.", meta.Title, strings.Join(meta.Genres, ", "), meta.Summary)},
},
MaxTokens: 512,
})
if genErr == nil && strings.TrimSpace(newDesc) != "" && s.deps.BookWriter != nil {
meta.Summary = strings.TrimSpace(newDesc)
if writeErr := s.deps.BookWriter.WriteMetadata(ctx, meta); writeErr != nil {
sseWrite(refreshMetadataEvent{Step: "description", Error: writeErr.Error()})
} else {
sseWrite(refreshMetadataEvent{Step: "description"})
}
} else if genErr != nil {
sseWrite(refreshMetadataEvent{Step: "description", Error: genErr.Error()})
}
}
}
// Step 2 — cover.
if s.deps.ImageGen != nil && s.deps.CoverStore != nil {
if ctx.Err() == nil {
prompt := buildCoverPrompt(meta)
imgBytes, genErr := s.deps.ImageGen.GenerateImage(ctx, cfai.ImageRequest{Prompt: prompt})
if genErr == nil {
if saveErr := s.deps.CoverStore.PutCover(ctx, slug, imgBytes, "image/png"); saveErr != nil {
sseWrite(refreshMetadataEvent{Step: "cover", Error: saveErr.Error()})
} else {
sseWrite(refreshMetadataEvent{Step: "cover"})
}
} else {
sseWrite(refreshMetadataEvent{Step: "cover", Error: genErr.Error()})
}
}
}
sseWrite(refreshMetadataEvent{Step: "done", Done: true})
}
// ── Helpers ───────────────────────────────────────────────────────────────
// parseStringArrayJSON extracts a JSON string array from model output,
// tolerating markdown fences and surrounding prose.
func parseStringArrayJSON(raw string) []string {
s := raw
if idx := strings.Index(s, "```json"); idx >= 0 {
s = s[idx+7:]
} else if idx := strings.Index(s, "```"); idx >= 0 {
s = s[idx+3:]
}
if idx := strings.LastIndex(s, "```"); idx >= 0 {
s = s[:idx]
}
start := strings.Index(s, "[")
end := strings.LastIndex(s, "]")
if start < 0 || end <= start {
return nil
}
s = s[start : end+1]
var out []string
json.Unmarshal([]byte(s), &out) //nolint:errcheck
return out
}
// extractJSONObject finds the first {...} object in a string.
func extractJSONObject(raw string) string {
start := strings.Index(raw, "{")
end := strings.LastIndex(raw, "}")
if start < 0 || end <= start {
return raw
}
return raw[start : end+1]
}
// buildCoverPrompt constructs a prompt string for cover generation from a book.
func buildCoverPrompt(meta domain.BookMeta) string {
parts := []string{"book cover art"}
if meta.Title != "" {
parts = append(parts, "titled \""+meta.Title+"\"")
}
if len(meta.Genres) > 0 {
parts = append(parts, strings.Join(meta.Genres, ", ")+" genre")
}
if meta.Summary != "" {
summary := meta.Summary
if len(summary) > 200 {
summary = summary[:200]
}
parts = append(parts, summary)
}
return strings.Join(parts, ", ")
}
// randomHex returns a random hex string of n bytes.
func randomHex(n int) string {
b := make([]byte, n)
_, _ = rand.Read(b)
return hex.EncodeToString(b)
}
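The two JSON-salvaging helpers above are pure functions, so their tolerance for fenced or prose-wrapped model output is easy to demonstrate. This sketch reproduces them verbatim from the source and exercises both:

```go
package main

import (
	"encoding/json"
	"fmt"
	"strings"
)

// parseStringArrayJSON is reproduced from the helpers above: it strips
// markdown fences and surrounding prose before decoding a string array.
func parseStringArrayJSON(raw string) []string {
	s := raw
	if idx := strings.Index(s, "```json"); idx >= 0 {
		s = s[idx+7:]
	} else if idx := strings.Index(s, "```"); idx >= 0 {
		s = s[idx+3:]
	}
	if idx := strings.LastIndex(s, "```"); idx >= 0 {
		s = s[:idx]
	}
	start := strings.Index(s, "[")
	end := strings.LastIndex(s, "]")
	if start < 0 || end <= start {
		return nil
	}
	var out []string
	json.Unmarshal([]byte(s[start:end+1]), &out) //nolint:errcheck
	return out
}

// extractJSONObject is reproduced from the helpers above: it returns the
// first {...} span, or the input unchanged if none is found.
func extractJSONObject(raw string) string {
	start := strings.Index(raw, "{")
	end := strings.LastIndex(raw, "}")
	if start < 0 || end <= start {
		return raw
	}
	return raw[start : end+1]
}

func main() {
	fenced := "```json\n[\"Fantasy\", \"Action\"]\n```"
	fmt.Println(parseStringArrayJSON(fenced)) // [Fantasy Action]

	prose := `The rating is {"score": 4, "feedback": "solid"} overall.`
	fmt.Println(extractJSONObject(prose)) // {"score": 4, "feedback": "solid"}
}
```

Note that both helpers fail soft: malformed input yields `nil` or the raw string, which is why the quality-score handler layers a digit-scanning fallback on top rather than treating a parse failure as an error.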


@@ -0,0 +1,645 @@
package backend
import (
"context"
"encoding/base64"
"encoding/json"
"fmt"
"io"
"net/http"
"strconv"
"strings"
"time"
"github.com/libnovel/backend/internal/cfai"
"github.com/libnovel/backend/internal/domain"
)
// handleAdminImageGenModels handles GET /api/admin/image-gen/models.
// Returns the list of supported Cloudflare AI image generation models.
func (s *Server) handleAdminImageGenModels(w http.ResponseWriter, r *http.Request) {
if s.deps.ImageGen == nil {
jsonError(w, http.StatusServiceUnavailable, "image generation not configured (CFAI_ACCOUNT_ID/CFAI_API_TOKEN missing)")
return
}
models := s.deps.ImageGen.Models()
writeJSON(w, 0, map[string]any{"models": models})
}
// imageGenRequest is the JSON body for POST /api/admin/image-gen.
type imageGenRequest struct {
// Prompt is the text description of the desired image.
Prompt string `json:"prompt"`
// Model is the CF Workers AI model ID (e.g. "@cf/black-forest-labs/flux-2-dev").
// Defaults to the recommended model for the given type.
Model string `json:"model"`
// Type is either "cover" or "chapter".
Type string `json:"type"`
// Slug is the book slug. Required for cover; required for chapter.
Slug string `json:"slug"`
// Chapter number (1-based). Required when type == "chapter".
Chapter int `json:"chapter"`
// ReferenceImageB64 is an optional base64-encoded PNG/JPEG reference image.
// When present the img2img path is used.
ReferenceImageB64 string `json:"reference_image_b64"`
// NumSteps overrides inference steps (default 20).
NumSteps int `json:"num_steps"`
// Width / Height override output dimensions (0 = model default).
Width int `json:"width"`
Height int `json:"height"`
// Guidance overrides prompt guidance scale (0 = model default).
Guidance float64 `json:"guidance"`
// Strength for img2img, in the range 0.0 to 1.0; default 0.75.
Strength float64 `json:"strength"`
// SaveToCover when true stores the result as the book cover in MinIO
// (overwriting any existing cover) and sets the book's cover URL.
// Only valid when type == "cover".
SaveToCover bool `json:"save_to_cover"`
}
// imageGenResponse is the JSON body returned by POST /api/admin/image-gen.
type imageGenResponse struct {
// ImageB64 is the generated image as a base64-encoded PNG string.
ImageB64 string `json:"image_b64"`
// ContentType is "image/png" or "image/jpeg".
ContentType string `json:"content_type"`
// Saved indicates whether the image was persisted to MinIO.
Saved bool `json:"saved"`
// CoverURL is the URL the cover is now served from (only set when Saved==true).
CoverURL string `json:"cover_url,omitempty"`
// Model is the model that was used.
Model string `json:"model"`
// Bytes is the raw image size in bytes.
Bytes int `json:"bytes"`
}
// handleAdminImageGen handles POST /api/admin/image-gen.
//
// Generates an image using Cloudflare Workers AI and optionally stores it.
// Multipart/form-data is also accepted so the reference image can be uploaded
// directly; otherwise the reference is expected as base64 JSON.
func (s *Server) handleAdminImageGen(w http.ResponseWriter, r *http.Request) {
if s.deps.ImageGen == nil {
jsonError(w, http.StatusServiceUnavailable, "image generation not configured (CFAI_ACCOUNT_ID/CFAI_API_TOKEN missing)")
return
}
var req imageGenRequest
var refImageData []byte
ct := r.Header.Get("Content-Type")
if strings.HasPrefix(ct, "multipart/form-data") {
// Multipart: parse JSON fields from a "json" part + optional "reference" file part.
if err := r.ParseMultipartForm(32 << 20); err != nil {
jsonError(w, http.StatusBadRequest, "parse multipart: "+err.Error())
return
}
if jsonPart := r.FormValue("json"); jsonPart != "" {
if err := json.Unmarshal([]byte(jsonPart), &req); err != nil {
jsonError(w, http.StatusBadRequest, "parse json field: "+err.Error())
return
}
}
if f, _, err := r.FormFile("reference"); err == nil {
defer f.Close()
refImageData, _ = io.ReadAll(f)
}
} else {
if err := json.NewDecoder(r.Body).Decode(&req); err != nil {
jsonError(w, http.StatusBadRequest, "parse body: "+err.Error())
return
}
if req.ReferenceImageB64 != "" {
var decErr error
refImageData, decErr = base64.StdEncoding.DecodeString(req.ReferenceImageB64)
if decErr != nil {
// Try std without padding
refImageData, decErr = base64.RawStdEncoding.DecodeString(req.ReferenceImageB64)
if decErr != nil {
jsonError(w, http.StatusBadRequest, "decode reference_image_b64: "+decErr.Error())
return
}
}
}
}
if strings.TrimSpace(req.Prompt) == "" {
jsonError(w, http.StatusBadRequest, "prompt is required")
return
}
if req.Type != "cover" && req.Type != "chapter" {
jsonError(w, http.StatusBadRequest, `type must be "cover" or "chapter"`)
return
}
if req.Slug == "" {
jsonError(w, http.StatusBadRequest, "slug is required")
return
}
if req.Type == "chapter" && req.Chapter <= 0 {
jsonError(w, http.StatusBadRequest, "chapter must be > 0 when type is chapter")
return
}
// Resolve model
model := cfai.ImageModel(req.Model)
if model == "" {
if req.Type == "cover" {
model = cfai.DefaultImageModel
} else {
model = cfai.ImageModelFlux2Klein4B
}
}
imgReq := cfai.ImageRequest{
Prompt: req.Prompt,
Model: model,
NumSteps: req.NumSteps,
Width: req.Width,
Height: req.Height,
Guidance: req.Guidance,
Strength: req.Strength,
}
s.deps.Log.Info("admin: image gen requested",
"type", req.Type, "slug", req.Slug, "chapter", req.Chapter,
"model", model, "has_reference", len(refImageData) > 0)
var imgData []byte
var genErr error
if len(refImageData) > 0 {
imgData, genErr = s.deps.ImageGen.GenerateImageFromReference(r.Context(), imgReq, refImageData)
} else {
imgData, genErr = s.deps.ImageGen.GenerateImage(r.Context(), imgReq)
}
if genErr != nil {
s.deps.Log.Error("admin: image gen failed", "err", genErr)
jsonError(w, http.StatusBadGateway, "image generation failed: "+genErr.Error())
return
}
contentType := sniffImageContentType(imgData)
// ── Optional persistence ──────────────────────────────────────────────────
var saved bool
var coverURL string
if req.SaveToCover && req.Type == "cover" && s.deps.CoverStore != nil {
if err := s.deps.CoverStore.PutCover(r.Context(), req.Slug, imgData, contentType); err != nil {
s.deps.Log.Error("admin: save generated cover failed", "slug", req.Slug, "err", err)
// Non-fatal: still return the image
} else {
saved = true
coverURL = fmt.Sprintf("/api/cover/novelfire.net/%s", req.Slug)
s.deps.Log.Info("admin: generated cover saved", "slug", req.Slug, "bytes", len(imgData))
}
}
// Encode result as base64
b64 := base64.StdEncoding.EncodeToString(imgData)
writeJSON(w, 0, imageGenResponse{
ImageB64: b64,
ContentType: contentType,
Saved: saved,
CoverURL: coverURL,
Model: string(model),
Bytes: len(imgData),
})
}
// saveCoverRequest is the JSON body for POST /api/admin/image-gen/save-cover.
type saveCoverRequest struct {
// Slug is the book slug whose cover should be overwritten.
Slug string `json:"slug"`
// ImageB64 is the base64-encoded image bytes (PNG or JPEG).
ImageB64 string `json:"image_b64"`
}
// handleAdminImageGenSaveCover handles POST /api/admin/image-gen/save-cover.
//
// Accepts a pre-generated image as base64 and stores it as the book cover in
// MinIO, replacing the existing one. Does not call Cloudflare AI at all.
func (s *Server) handleAdminImageGenSaveCover(w http.ResponseWriter, r *http.Request) {
if s.deps.CoverStore == nil {
jsonError(w, http.StatusServiceUnavailable, "cover store not configured")
return
}
var req saveCoverRequest
if err := json.NewDecoder(r.Body).Decode(&req); err != nil {
jsonError(w, http.StatusBadRequest, "parse body: "+err.Error())
return
}
if req.Slug == "" {
jsonError(w, http.StatusBadRequest, "slug is required")
return
}
if req.ImageB64 == "" {
jsonError(w, http.StatusBadRequest, "image_b64 is required")
return
}
imgData, err := base64.StdEncoding.DecodeString(req.ImageB64)
if err != nil {
imgData, err = base64.RawStdEncoding.DecodeString(req.ImageB64)
if err != nil {
jsonError(w, http.StatusBadRequest, "decode image_b64: "+err.Error())
return
}
}
contentType := sniffImageContentType(imgData)
if err := s.deps.CoverStore.PutCover(r.Context(), req.Slug, imgData, contentType); err != nil {
s.deps.Log.Error("admin: save-cover failed", "slug", req.Slug, "err", err)
jsonError(w, http.StatusInternalServerError, "save cover: "+err.Error())
return
}
s.deps.Log.Info("admin: cover saved via image-gen", "slug", req.Slug, "bytes", len(imgData))
writeJSON(w, 0, map[string]any{
"saved": true,
"cover_url": fmt.Sprintf("/api/cover/novelfire.net/%s", req.Slug),
"bytes": len(imgData),
})
}
// sniffImageContentType returns the MIME type of the image bytes.
func sniffImageContentType(data []byte) string {
if len(data) >= 4 {
// PNG: 0x89 P N G
if data[0] == 0x89 && data[1] == 0x50 && data[2] == 0x4e && data[3] == 0x47 {
return "image/png"
}
// JPEG: FF D8 FF
if data[0] == 0xFF && data[1] == 0xD8 && data[2] == 0xFF {
return "image/jpeg"
}
// WebP: RIFF....WEBP
if len(data) >= 12 && data[0] == 'R' && data[1] == 'I' && data[2] == 'F' && data[3] == 'F' &&
data[8] == 'W' && data[9] == 'E' && data[10] == 'B' && data[11] == 'P' {
return "image/webp"
}
}
return "image/png"
}
// saveChapterImageRequest is the JSON body for POST /api/admin/image-gen/save-chapter-image.
type saveChapterImageRequest struct {
// Slug is the book slug.
Slug string `json:"slug"`
// Chapter is the 1-based chapter number.
Chapter int `json:"chapter"`
// ImageB64 is the base64-encoded image bytes (PNG or JPEG).
ImageB64 string `json:"image_b64"`
}
// handleAdminImageGenSaveChapterImage handles POST /api/admin/image-gen/save-chapter-image.
//
// Accepts a pre-generated image as base64 and stores it as the chapter illustration
// in MinIO, replacing the existing one if present. Does not call Cloudflare AI.
func (s *Server) handleAdminImageGenSaveChapterImage(w http.ResponseWriter, r *http.Request) {
if s.deps.ChapterImageStore == nil {
jsonError(w, http.StatusServiceUnavailable, "chapter image store not configured")
return
}
var req saveChapterImageRequest
if err := json.NewDecoder(r.Body).Decode(&req); err != nil {
jsonError(w, http.StatusBadRequest, "parse body: "+err.Error())
return
}
if req.Slug == "" {
jsonError(w, http.StatusBadRequest, "slug is required")
return
}
if req.Chapter <= 0 {
jsonError(w, http.StatusBadRequest, "chapter must be > 0")
return
}
if req.ImageB64 == "" {
jsonError(w, http.StatusBadRequest, "image_b64 is required")
return
}
imgData, err := base64.StdEncoding.DecodeString(req.ImageB64)
if err != nil {
imgData, err = base64.RawStdEncoding.DecodeString(req.ImageB64)
if err != nil {
jsonError(w, http.StatusBadRequest, "decode image_b64: "+err.Error())
return
}
}
contentType := sniffImageContentType(imgData)
if err := s.deps.ChapterImageStore.PutChapterImage(r.Context(), req.Slug, req.Chapter, imgData, contentType); err != nil {
s.deps.Log.Error("admin: save-chapter-image failed", "slug", req.Slug, "chapter", req.Chapter, "err", err)
jsonError(w, http.StatusInternalServerError, "save chapter image: "+err.Error())
return
}
s.deps.Log.Info("admin: chapter image saved", "slug", req.Slug, "chapter", req.Chapter, "bytes", len(imgData))
writeJSON(w, 0, map[string]any{
"saved": true,
"image_url": fmt.Sprintf("/api/chapter-image/novelfire.net/%s/%d", req.Slug, req.Chapter),
"bytes": len(imgData),
})
}
// handleHeadChapterImage handles HEAD /api/chapter-image/{domain}/{slug}/{n}.
//
// Returns 200 when an image exists for this chapter, 404 otherwise.
// Used by the SSR loader to check existence without downloading the full image.
func (s *Server) handleHeadChapterImage(w http.ResponseWriter, r *http.Request) {
if s.deps.ChapterImageStore == nil {
w.WriteHeader(http.StatusNotFound)
return
}
slug := r.PathValue("slug")
nStr := r.PathValue("n")
n, err := strconv.Atoi(nStr)
if err != nil || n <= 0 {
w.WriteHeader(http.StatusBadRequest)
return
}
if s.deps.ChapterImageStore.ChapterImageExists(r.Context(), slug, n) {
w.WriteHeader(http.StatusOK)
} else {
w.WriteHeader(http.StatusNotFound)
}
}
// handleGetChapterImage handles GET /api/chapter-image/{domain}/{slug}/{n}.
//
// Serves the stored chapter illustration directly from MinIO.
// Returns 404 when no image has been saved for this chapter.
func (s *Server) handleGetChapterImage(w http.ResponseWriter, r *http.Request) {
if s.deps.ChapterImageStore == nil {
http.NotFound(w, r)
return
}
slug := r.PathValue("slug")
nStr := r.PathValue("n")
n, err := strconv.Atoi(nStr)
if err != nil || n <= 0 {
jsonError(w, http.StatusBadRequest, "invalid chapter number")
return
}
data, contentType, ok, err := s.deps.ChapterImageStore.GetChapterImage(r.Context(), slug, n)
if err != nil {
s.deps.Log.Error("chapter-image: get failed", "slug", slug, "n", n, "err", err)
jsonError(w, http.StatusInternalServerError, "could not retrieve chapter image")
return
}
if !ok {
http.NotFound(w, r)
return
}
w.Header().Set("Content-Type", contentType)
w.Header().Set("Cache-Control", "public, max-age=31536000, immutable")
w.Header().Set("Content-Length", fmt.Sprintf("%d", len(data)))
w.WriteHeader(http.StatusOK)
_, _ = w.Write(data)
}
// handleAdminImageGenAsync handles POST /api/admin/image-gen/async.
//
// Fire-and-forget variant: validates the request, creates an ai_job record of
// kind "image-gen", spawns a background goroutine, and returns HTTP 202 with
// {job_id} immediately. The goroutine calls Cloudflare AI, stores the result
// as base64 in the job payload, and marks the job done/failed when finished.
//
// The admin can then review the result via the ai-jobs page and approve
// (save as cover) or reject (discard) the image.
func (s *Server) handleAdminImageGenAsync(w http.ResponseWriter, r *http.Request) {
if s.deps.ImageGen == nil {
jsonError(w, http.StatusServiceUnavailable, "image generation not configured (CFAI_ACCOUNT_ID/CFAI_API_TOKEN missing)")
return
}
if s.deps.AIJobStore == nil {
jsonError(w, http.StatusServiceUnavailable, "ai job store not configured")
return
}
var req imageGenRequest
var refImageData []byte
ct := r.Header.Get("Content-Type")
if strings.HasPrefix(ct, "multipart/form-data") {
if err := r.ParseMultipartForm(32 << 20); err != nil {
jsonError(w, http.StatusBadRequest, "parse multipart: "+err.Error())
return
}
if jsonPart := r.FormValue("json"); jsonPart != "" {
if err := json.Unmarshal([]byte(jsonPart), &req); err != nil {
jsonError(w, http.StatusBadRequest, "parse json field: "+err.Error())
return
}
}
if f, _, err := r.FormFile("reference"); err == nil {
defer f.Close()
refImageData, _ = io.ReadAll(f)
}
} else {
if err := json.NewDecoder(r.Body).Decode(&req); err != nil {
jsonError(w, http.StatusBadRequest, "parse body: "+err.Error())
return
}
if req.ReferenceImageB64 != "" {
var decErr error
refImageData, decErr = base64.StdEncoding.DecodeString(req.ReferenceImageB64)
if decErr != nil {
refImageData, decErr = base64.RawStdEncoding.DecodeString(req.ReferenceImageB64)
if decErr != nil {
jsonError(w, http.StatusBadRequest, "decode reference_image_b64: "+decErr.Error())
return
}
}
}
}
if strings.TrimSpace(req.Prompt) == "" {
jsonError(w, http.StatusBadRequest, "prompt is required")
return
}
if req.Type != "cover" && req.Type != "chapter" {
jsonError(w, http.StatusBadRequest, `type must be "cover" or "chapter"`)
return
}
if req.Slug == "" {
jsonError(w, http.StatusBadRequest, "slug is required")
return
}
if req.Type == "chapter" && req.Chapter <= 0 {
jsonError(w, http.StatusBadRequest, "chapter must be > 0 when type is chapter")
return
}
// Resolve model.
model := cfai.ImageModel(req.Model)
if model == "" {
if req.Type == "cover" {
model = cfai.DefaultImageModel
} else {
model = cfai.ImageModelFlux2Klein4B
}
}
// Encode request params as job payload so the UI can reconstruct context.
type jobParams struct {
Prompt string `json:"prompt"`
Type string `json:"type"`
Chapter int `json:"chapter,omitempty"`
NumSteps int `json:"num_steps,omitempty"`
Width int `json:"width,omitempty"`
Height int `json:"height,omitempty"`
Guidance float64 `json:"guidance,omitempty"`
Strength float64 `json:"strength,omitempty"`
HasRef bool `json:"has_ref,omitempty"`
}
paramsJSON, _ := json.Marshal(jobParams{
Prompt: req.Prompt,
Type: req.Type,
Chapter: req.Chapter,
NumSteps: req.NumSteps,
Width: req.Width,
Height: req.Height,
Guidance: req.Guidance,
Strength: req.Strength,
HasRef: len(refImageData) > 0,
})
jobID, createErr := s.deps.AIJobStore.CreateAIJob(r.Context(), domain.AIJob{
Kind: "image-gen",
Slug: req.Slug,
Status: domain.TaskStatusPending,
Model: string(model),
Payload: string(paramsJSON),
Started: time.Now(),
})
if createErr != nil {
jsonError(w, http.StatusInternalServerError, "create ai job: "+createErr.Error())
return
}
jobCtx, jobCancel := context.WithCancel(context.Background())
registerCancelJob(jobID, jobCancel)
// Mark running before returning.
_ = s.deps.AIJobStore.UpdateAIJob(r.Context(), jobID, map[string]any{
"status": string(domain.TaskStatusRunning),
})
s.deps.Log.Info("admin: image-gen async started",
"job_id", jobID, "slug", req.Slug, "type", req.Type, "model", model)
// Capture locals for the goroutine.
store := s.deps.AIJobStore
imageGen := s.deps.ImageGen
coverStore := s.deps.CoverStore
logger := s.deps.Log
capturedReq := req
capturedModel := model
capturedRefImage := refImageData
go func() {
defer deregisterCancelJob(jobID)
defer jobCancel()
if jobCtx.Err() != nil {
_ = store.UpdateAIJob(context.Background(), jobID, map[string]any{
"status": string(domain.TaskStatusCancelled),
"finished": time.Now().Format(time.RFC3339),
})
return
}
imgReq := cfai.ImageRequest{
Prompt: capturedReq.Prompt,
Model: capturedModel,
NumSteps: capturedReq.NumSteps,
Width: capturedReq.Width,
Height: capturedReq.Height,
Guidance: capturedReq.Guidance,
Strength: capturedReq.Strength,
}
var imgData []byte
var genErr error
if len(capturedRefImage) > 0 {
imgData, genErr = imageGen.GenerateImageFromReference(jobCtx, imgReq, capturedRefImage)
} else {
imgData, genErr = imageGen.GenerateImage(jobCtx, imgReq)
}
if genErr != nil {
logger.Error("admin: image-gen async failed", "job_id", jobID, "err", genErr)
_ = store.UpdateAIJob(context.Background(), jobID, map[string]any{
"status": string(domain.TaskStatusFailed),
"error_message": genErr.Error(),
"finished": time.Now().Format(time.RFC3339),
})
return
}
contentType := sniffImageContentType(imgData)
b64 := base64.StdEncoding.EncodeToString(imgData)
// Build result payload: include the original params + the generated image.
type resultPayload struct {
Prompt string `json:"prompt"`
Type string `json:"type"`
Chapter int `json:"chapter,omitempty"`
ContentType string `json:"content_type"`
ImageB64 string `json:"image_b64"`
Bytes int `json:"bytes"`
NumSteps int `json:"num_steps,omitempty"`
Width int `json:"width,omitempty"`
Height int `json:"height,omitempty"`
Guidance float64 `json:"guidance,omitempty"`
}
resultJSON, _ := json.Marshal(resultPayload{
Prompt: capturedReq.Prompt,
Type: capturedReq.Type,
Chapter: capturedReq.Chapter,
ContentType: contentType,
ImageB64: b64,
Bytes: len(imgData),
NumSteps: capturedReq.NumSteps,
Width: capturedReq.Width,
Height: capturedReq.Height,
Guidance: capturedReq.Guidance,
})
_ = store.UpdateAIJob(context.Background(), jobID, map[string]any{
"status": string(domain.TaskStatusDone),
"items_done": 1,
"items_total": 1,
"payload": string(resultJSON),
"finished": time.Now().Format(time.RFC3339),
})
logger.Info("admin: image-gen async done",
"job_id", jobID, "slug", capturedReq.Slug,
"bytes", len(imgData), "content_type", contentType)
// Reference coverStore so the capture compiles ("declared and not used")
// even though the async path never saves directly.
_ = coverStore
}()
writeJSON(w, http.StatusAccepted, map[string]any{"job_id": jobID})
}
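The content-type sniffer these handlers rely on is easy to verify against the magic bytes it documents (PNG, JPEG, WebP, with PNG as the fallback). A standalone sketch using invented byte slices:

```go
package main

import "fmt"

// sniffImageContentType mirrors the handler helper: inspect the leading
// magic bytes and fall back to image/png when nothing matches.
func sniffImageContentType(data []byte) string {
	if len(data) >= 4 {
		// PNG: 0x89 'P' 'N' 'G'
		if data[0] == 0x89 && data[1] == 0x50 && data[2] == 0x4e && data[3] == 0x47 {
			return "image/png"
		}
		// JPEG: FF D8 FF
		if data[0] == 0xFF && data[1] == 0xD8 && data[2] == 0xFF {
			return "image/jpeg"
		}
		// WebP: "RIFF" <size> "WEBP"
		if len(data) >= 12 && string(data[0:4]) == "RIFF" && string(data[8:12]) == "WEBP" {
			return "image/webp"
		}
	}
	return "image/png"
}

func main() {
	fmt.Println(sniffImageContentType([]byte{0x89, 'P', 'N', 'G'}))        // image/png
	fmt.Println(sniffImageContentType([]byte{0xFF, 0xD8, 0xFF, 0xE0}))     // image/jpeg
	fmt.Println(sniffImageContentType([]byte("RIFF\x00\x00\x00\x00WEBP"))) // image/webp
}
```

Sniffing bytes rather than trusting a client-supplied Content-Type keeps the stored MIME type honest regardless of what the generator or uploader claims.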


@@ -0,0 +1,234 @@
package backend
import (
"context"
"encoding/json"
"fmt"
"io"
"net/http"
"path/filepath"
"strings"
"time"
"github.com/libnovel/backend/internal/domain"
"github.com/libnovel/backend/internal/storage"
)
type importRequest struct {
Title string `json:"title"`
Author string `json:"author"`
CoverURL string `json:"cover_url"`
Genres []string `json:"genres"`
Summary string `json:"summary"`
BookStatus string `json:"book_status"` // "ongoing" | "completed" | "hiatus"
FileName string `json:"file_name"`
FileType string `json:"file_type"` // "pdf" or "epub"
ObjectKey string `json:"object_key"` // MinIO path to uploaded file
}
type importResponse struct {
TaskID string `json:"task_id"`
Slug string `json:"slug"`
Preview *importPreview `json:"preview,omitempty"`
}
type importPreview struct {
Chapters int `json:"chapters"`
FirstLines []string `json:"first_lines"`
}
func (s *Server) handleAdminImport(w http.ResponseWriter, r *http.Request) {
if s.deps.Producer == nil {
jsonError(w, http.StatusServiceUnavailable, "task queue not configured")
return
}
ct := r.Header.Get("Content-Type")
var req importRequest
var objectKey string
var chaptersKey string
var chapterCount int
if strings.HasPrefix(ct, "multipart/form-data") {
if err := r.ParseMultipartForm(32 << 20); err != nil {
jsonError(w, http.StatusBadRequest, "parse multipart: "+err.Error())
return
}
req.Title = r.FormValue("title")
req.Author = r.FormValue("author")
req.CoverURL = r.FormValue("cover_url")
req.Summary = r.FormValue("summary")
req.BookStatus = r.FormValue("book_status")
if g := r.FormValue("genres"); g != "" {
for _, s := range strings.Split(g, ",") {
if s = strings.TrimSpace(s); s != "" {
req.Genres = append(req.Genres, s)
}
}
}
req.FileName = r.FormValue("file_name")
req.FileType = r.FormValue("file_type")
analyzeOnly := r.FormValue("analyze") == "true"
file, header, err := r.FormFile("file")
if err != nil {
jsonError(w, http.StatusBadRequest, "parse file: "+err.Error())
return
}
defer file.Close()
if req.FileName == "" {
req.FileName = header.Filename
}
if req.FileType == "" {
req.FileType = strings.TrimPrefix(filepath.Ext(header.Filename), ".")
}
data, err := io.ReadAll(file)
if err != nil {
jsonError(w, http.StatusBadRequest, "read file: "+err.Error())
return
}
// Analyze only: count chapters and extract preview lines without importing.
if analyzeOnly {
preview := analyzeImportFile(data, req.FileType)
writeJSON(w, 0, importResponse{
Preview: preview,
})
return
}
// Parse PDF/EPUB on the backend (with timeout) and store chapters as JSON.
// The runner only needs to ingest pre-parsed chapters — no PDF parsing on runner.
parseCtx, parseCancel := context.WithTimeout(r.Context(), 3*time.Minute)
defer parseCancel()
chapters, parseErr := storage.ParseImportFile(parseCtx, data, req.FileType)
if parseErr != nil || len(chapters) == 0 {
msg := "no chapters found"
if parseErr != nil {
msg = parseErr.Error()
}
jsonError(w, http.StatusUnprocessableEntity, "could not parse file: "+msg)
return
}
// Store raw file in MinIO (for reference/re-import).
objectKey = fmt.Sprintf("imports/%d_%s", time.Now().Unix(), header.Filename)
if s.deps.ImportFileStore == nil {
jsonError(w, http.StatusInternalServerError, "storage not available")
return
}
if err := s.deps.ImportFileStore.PutImportFile(r.Context(), objectKey, data); err != nil {
jsonError(w, http.StatusInternalServerError, "upload file: "+err.Error())
return
}
// Store pre-parsed chapters JSON in MinIO so runner can ingest without re-parsing.
chaptersJSON, _ := json.Marshal(chapters)
chaptersKey = fmt.Sprintf("imports/%d_%s_chapters.json", time.Now().Unix(), strings.TrimSuffix(header.Filename, filepath.Ext(header.Filename)))
if err := s.deps.ImportFileStore.PutImportChapters(r.Context(), chaptersKey, chaptersJSON); err != nil {
jsonError(w, http.StatusInternalServerError, "store chapters: "+err.Error())
return
}
chapterCount = len(chapters)
} else {
if err := json.NewDecoder(r.Body).Decode(&req); err != nil {
jsonError(w, http.StatusBadRequest, "parse body: "+err.Error())
return
}
objectKey = req.ObjectKey
}
if req.Title == "" {
jsonError(w, http.StatusBadRequest, "title is required")
return
}
if req.FileType != "pdf" && req.FileType != "epub" {
jsonError(w, http.StatusBadRequest, "file_type must be 'pdf' or 'epub'")
return
}
slug := strings.ToLower(strings.ReplaceAll(req.Title, " ", "-"))
slug = strings.Map(func(r rune) rune {
if (r >= 'a' && r <= 'z') || (r >= '0' && r <= '9') || r == '-' {
return r
}
return -1
}, slug)
taskID, err := s.deps.Producer.CreateImportTask(r.Context(), domain.ImportTask{
Slug: slug,
Title: req.Title,
Author: req.Author,
CoverURL: req.CoverURL,
Genres: req.Genres,
Summary: req.Summary,
BookStatus: req.BookStatus,
FileType: req.FileType,
ObjectKey: objectKey,
ChaptersKey: chaptersKey,
ChaptersTotal: chapterCount,
InitiatorUserID: "",
})
if err != nil {
jsonError(w, http.StatusInternalServerError, "create import task: "+err.Error())
return
}
writeJSON(w, 0, importResponse{
TaskID: taskID,
Slug: slug,
Preview: &importPreview{Chapters: chapterCount},
})
}
// analyzeImportFile parses the file to count chapters and extract preview lines.
func analyzeImportFile(data []byte, fileType string) *importPreview {
count, firstLines, err := storage.AnalyzeFile(data, fileType)
if err != nil || count == 0 {
// Fall back to rough size estimate so the UI still shows something
count = estimateChapters(data, fileType)
}
return &importPreview{
Chapters: count,
FirstLines: firstLines,
}
}
func estimateChapters(data []byte, fileType string) int {
// Rough estimate: ~100KB per chapter for PDF, ~50KB for EPUB
size := len(data)
if fileType == "pdf" {
return size / 100000
}
return size / 50000
}
func (s *Server) handleAdminImportStatus(w http.ResponseWriter, r *http.Request) {
taskID := r.PathValue("id")
if taskID == "" {
jsonError(w, http.StatusBadRequest, "task id required")
return
}
task, ok, err := s.deps.TaskReader.GetImportTask(r.Context(), taskID)
if err != nil {
jsonError(w, http.StatusInternalServerError, "get task: "+err.Error())
return
}
if !ok {
jsonError(w, http.StatusNotFound, "task not found")
return
}
writeJSON(w, 0, task)
}
func (s *Server) handleAdminImportList(w http.ResponseWriter, r *http.Request) {
tasks, err := s.deps.TaskReader.ListImportTasks(r.Context())
if err != nil {
jsonError(w, http.StatusInternalServerError, "list tasks: "+err.Error())
return
}
writeJSON(w, 0, map[string]any{"tasks": tasks})
}
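The slug derivation in handleAdminImport decides the book's URL identity, so it is worth pinning down: lowercase, spaces to hyphens, then drop every rune outside [a-z0-9-]. A standalone sketch of that same mapping (the sample title is invented):

```go
package main

import (
	"fmt"
	"strings"
)

// slugify mirrors the import handler's slug derivation: lowercase the title,
// turn spaces into hyphens, then strip any rune outside [a-z0-9-].
func slugify(title string) string {
	s := strings.ToLower(strings.ReplaceAll(title, " ", "-"))
	return strings.Map(func(r rune) rune {
		if (r >= 'a' && r <= 'z') || (r >= '0' && r <= '9') || r == '-' {
			return r
		}
		return -1 // drop the rune
	}, s)
}

func main() {
	fmt.Println(slugify("The Wandering Inn, Vol. 1")) // the-wandering-inn-vol-1
}
```

One consequence of dropping (rather than replacing) disallowed runes: two titles differing only in punctuation collapse to the same slug, and a fully non-ASCII title yields an empty one.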


@@ -0,0 +1,119 @@
package backend
import (
"encoding/json"
"net/http"
)
// handleDismissNotification handles DELETE /api/notifications/{id}.
func (s *Server) handleDismissNotification(w http.ResponseWriter, r *http.Request) {
id := r.PathValue("id")
if id == "" {
jsonError(w, http.StatusBadRequest, "notification id required")
return
}
if s.deps.NotificationStore == nil {
jsonError(w, http.StatusServiceUnavailable, "notification store not configured")
return
}
if err := s.deps.NotificationStore.DeleteNotification(r.Context(), id); err != nil {
jsonError(w, http.StatusInternalServerError, "dismiss notification: "+err.Error())
return
}
writeJSON(w, 0, map[string]any{"success": true})
}
// handleClearAllNotifications handles DELETE /api/notifications?user_id=...
func (s *Server) handleClearAllNotifications(w http.ResponseWriter, r *http.Request) {
userID := r.URL.Query().Get("user_id")
if userID == "" {
jsonError(w, http.StatusBadRequest, "user_id required")
return
}
if s.deps.NotificationStore == nil {
jsonError(w, http.StatusServiceUnavailable, "notification store not configured")
return
}
if err := s.deps.NotificationStore.ClearAllNotifications(r.Context(), userID); err != nil {
jsonError(w, http.StatusInternalServerError, "clear notifications: "+err.Error())
return
}
writeJSON(w, 0, map[string]any{"success": true})
}
// handleMarkAllNotificationsRead handles PATCH /api/notifications?user_id=...
func (s *Server) handleMarkAllNotificationsRead(w http.ResponseWriter, r *http.Request) {
userID := r.URL.Query().Get("user_id")
if userID == "" {
jsonError(w, http.StatusBadRequest, "user_id required")
return
}
if s.deps.NotificationStore == nil {
jsonError(w, http.StatusServiceUnavailable, "notification store not configured")
return
}
if err := s.deps.NotificationStore.MarkAllNotificationsRead(r.Context(), userID); err != nil {
jsonError(w, http.StatusInternalServerError, "mark all read: "+err.Error())
return
}
writeJSON(w, 0, map[string]any{"success": true})
}
type notification struct {
ID string `json:"id"`
UserID string `json:"user_id"`
Title string `json:"title"`
Message string `json:"message"`
Link string `json:"link"`
Read bool `json:"read"`
}
func (s *Server) handleListNotifications(w http.ResponseWriter, r *http.Request) {
userID := r.URL.Query().Get("user_id")
if userID == "" {
jsonError(w, http.StatusBadRequest, "user_id required")
return
}
if s.deps.NotificationStore == nil {
jsonError(w, http.StatusServiceUnavailable, "notification store not configured")
return
}
items, err := s.deps.NotificationStore.ListNotifications(r.Context(), userID, 50)
if err != nil {
jsonError(w, http.StatusInternalServerError, "list notifications: "+err.Error())
return
}
// Coerce each loosely-typed item into the notification struct via a JSON round-trip.
notifications := make([]notification, 0, len(items))
for _, item := range items {
b, _ := json.Marshal(item)
var n notification
json.Unmarshal(b, &n) //nolint:errcheck
notifications = append(notifications, n)
}
writeJSON(w, 0, map[string]any{"notifications": notifications})
}
func (s *Server) handleMarkNotificationRead(w http.ResponseWriter, r *http.Request) {
id := r.PathValue("id")
if id == "" {
jsonError(w, http.StatusBadRequest, "notification id required")
return
}
if s.deps.NotificationStore == nil {
jsonError(w, http.StatusServiceUnavailable, "notification store not configured")
return
}
if err := s.deps.NotificationStore.MarkNotificationRead(r.Context(), id); err != nil {
jsonError(w, http.StatusInternalServerError, "mark read: "+err.Error())
return
}
writeJSON(w, 0, map[string]any{"success": true})
}
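handleListNotifications coerces each loosely-typed store item into the typed struct by marshalling the map and unmarshalling into the struct; unknown keys are silently dropped. A standalone sketch of that pattern (the sample item is invented):

```go
package main

import (
	"encoding/json"
	"fmt"
)

// notification is a trimmed version of the typed shape the list handler returns.
type notification struct {
	ID    string `json:"id"`
	Title string `json:"title"`
	Read  bool   `json:"read"`
}

// coerce round-trips a loosely-typed store item through JSON, the same trick
// the handler uses: keys that match the struct tags are kept, the rest are
// silently dropped, and decode errors are deliberately ignored.
func coerce(item map[string]any) notification {
	b, _ := json.Marshal(item)
	var n notification
	_ = json.Unmarshal(b, &n) //nolint:errcheck
	return n
}

func main() {
	n := coerce(map[string]any{"id": "n1", "title": "New chapter", "read": true, "extra": "ignored"})
	fmt.Printf("%+v\n", n) // {ID:n1 Title:New chapter Read:true}
}
```

The round-trip is simpler than hand-mapping fields, at the cost of one extra encode/decode per item; for a page of 50 notifications that overhead is negligible.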


@@ -0,0 +1,87 @@
package backend
import (
"encoding/json"
"net/http"
"os"
"github.com/libnovel/backend/internal/storage"
)
// handleGetVAPIDPublicKey handles GET /api/push-subscriptions/vapid-public-key.
// Returns the VAPID public key so the SvelteKit frontend can subscribe browsers.
func (s *Server) handleGetVAPIDPublicKey(w http.ResponseWriter, r *http.Request) {
key := os.Getenv("VAPID_PUBLIC_KEY")
if key == "" {
jsonError(w, http.StatusServiceUnavailable, "push notifications not configured")
return
}
writeJSON(w, 0, map[string]string{"public_key": key})
}
// handleSavePushSubscription handles POST /api/push-subscriptions.
// Registers a new browser push subscription for the authenticated user.
func (s *Server) handleSavePushSubscription(w http.ResponseWriter, r *http.Request) {
store, ok := s.deps.Producer.(*storage.Store)
if !ok {
jsonError(w, http.StatusInternalServerError, "storage not available")
return
}
var body struct {
UserID string `json:"user_id"`
Endpoint string `json:"endpoint"`
P256DH string `json:"p256dh"`
Auth string `json:"auth"`
}
if err := json.NewDecoder(r.Body).Decode(&body); err != nil {
jsonError(w, http.StatusBadRequest, "invalid request body")
return
}
if body.UserID == "" || body.Endpoint == "" || body.P256DH == "" || body.Auth == "" {
jsonError(w, http.StatusBadRequest, "user_id, endpoint, p256dh and auth are required")
return
}
if err := store.SavePushSubscription(r.Context(), storage.PushSubscription{
UserID: body.UserID,
Endpoint: body.Endpoint,
P256DH: body.P256DH,
Auth: body.Auth,
}); err != nil {
jsonError(w, http.StatusInternalServerError, "save push subscription: "+err.Error())
return
}
writeJSON(w, 0, map[string]any{"success": true})
}
// handleDeletePushSubscription handles DELETE /api/push-subscriptions.
// Removes a push subscription by endpoint for the given user.
func (s *Server) handleDeletePushSubscription(w http.ResponseWriter, r *http.Request) {
store, ok := s.deps.Producer.(*storage.Store)
if !ok {
jsonError(w, http.StatusInternalServerError, "storage not available")
return
}
var body struct {
UserID string `json:"user_id"`
Endpoint string `json:"endpoint"`
}
if err := json.NewDecoder(r.Body).Decode(&body); err != nil {
jsonError(w, http.StatusBadRequest, "invalid request body")
return
}
if body.UserID == "" || body.Endpoint == "" {
jsonError(w, http.StatusBadRequest, "user_id and endpoint are required")
return
}
if err := store.DeletePushSubscription(r.Context(), body.UserID, body.Endpoint); err != nil {
jsonError(w, http.StatusInternalServerError, "delete push subscription: "+err.Error())
return
}
writeJSON(w, 0, map[string]any{"success": true})
}
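The save handler expects a flat body, while a browser's PushSubscription.toJSON() nests the key material under "keys". A sketch of the client-side flattening the frontend presumably performs before calling POST /api/push-subscriptions (the nested input is the standard Web Push shape; the frontend code is not shown in this commit, so the flattening step itself is an assumption):

```go
package main

import (
	"encoding/json"
	"fmt"
)

// browserSub is the standard Web Push subscription JSON shape as produced
// by PushSubscription.toJSON() in the browser.
type browserSub struct {
	Endpoint string `json:"endpoint"`
	Keys     struct {
		P256DH string `json:"p256dh"`
		Auth   string `json:"auth"`
	} `json:"keys"`
}

// flatten converts the nested browser shape into the flat body that the
// save handler's anonymous struct decodes (user_id, endpoint, p256dh, auth).
func flatten(raw, userID string) (string, error) {
	var sub browserSub
	if err := json.Unmarshal([]byte(raw), &sub); err != nil {
		return "", err
	}
	body, err := json.Marshal(map[string]string{
		"user_id":  userID,
		"endpoint": sub.Endpoint,
		"p256dh":   sub.Keys.P256DH,
		"auth":     sub.Keys.Auth,
	})
	return string(body), err
}

func main() {
	raw := `{"endpoint":"https://push.example/abc","keys":{"p256dh":"BPubKey","auth":"authSecret"}}`
	out, _ := flatten(raw, "u1")
	fmt.Println(out)
}
```

Go marshals map keys in sorted order, so the output fields appear alphabetically; the handler decodes by tag name, so ordering does not matter on the wire.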


@@ -0,0 +1,141 @@
package backend
import (
"encoding/json"
"fmt"
"net/http"
"strings"
"github.com/libnovel/backend/internal/bookstore"
"github.com/libnovel/backend/internal/domain"
)
// handleAdminSplitChapters handles POST /api/admin/books/{slug}/split-chapters.
//
// Request body (JSON):
//
// { "text": "<full text with --- dividers and optional ## Title lines>" }
//
// The text is split on lines containing only "---". Each segment may start with
// a "## Title" line which becomes the chapter title; remaining lines are the
// chapter content. Sequential chapter numbers 1..N are assigned.
//
// Chapters 1..N are overwritten: WriteChapter upserts by number, so any
// existing chapters beyond N are NOT deleted; use the dedup endpoint
// afterwards if needed.
func (s *Server) handleAdminSplitChapters(w http.ResponseWriter, r *http.Request) {
if s.deps.BookWriter == nil {
jsonError(w, http.StatusServiceUnavailable, "book writer not configured")
return
}
slug := r.PathValue("slug")
if slug == "" {
jsonError(w, http.StatusBadRequest, "slug is required")
return
}
var req struct {
Text string `json:"text"`
}
if err := json.NewDecoder(r.Body).Decode(&req); err != nil {
jsonError(w, http.StatusBadRequest, "parse body: "+err.Error())
return
}
if strings.TrimSpace(req.Text) == "" {
jsonError(w, http.StatusBadRequest, "text is required")
return
}
chapters := splitChapterText(req.Text)
if len(chapters) == 0 {
jsonError(w, http.StatusUnprocessableEntity, "no chapters produced from text")
return
}
for _, ch := range chapters {
var mdContent string
if ch.Title != "" && ch.Title != fmt.Sprintf("Chapter %d", ch.Number) {
mdContent = fmt.Sprintf("# %s\n\n%s", ch.Title, ch.Content)
} else {
mdContent = fmt.Sprintf("# Chapter %d\n\n%s", ch.Number, ch.Content)
}
domainCh := domain.Chapter{
Ref: domain.ChapterRef{Number: ch.Number, Title: ch.Title},
Text: mdContent,
}
if err := s.deps.BookWriter.WriteChapter(r.Context(), slug, domainCh); err != nil {
jsonError(w, http.StatusInternalServerError, fmt.Sprintf("write chapter %d: %s", ch.Number, err.Error()))
return
}
}
writeJSON(w, 0, map[string]any{
"chapters": len(chapters),
"slug": slug,
})
}
// splitChapterText splits text on "---" divider lines into bookstore.Chapter
// slices. Each segment may optionally start with a "## Title" header line.
func splitChapterText(text string) []bookstore.Chapter {
lines := strings.Split(text, "\n")
// Collect raw segments split on "---" dividers.
var segments [][]string
cur := []string{}
for _, line := range lines {
if strings.TrimSpace(line) == "---" {
segments = append(segments, cur)
cur = []string{}
} else {
cur = append(cur, line)
}
}
segments = append(segments, cur) // last segment
var chapters []bookstore.Chapter
chNum := 0
for _, seg := range segments {
// Trim leading/trailing blank lines from the segment.
start, end := 0, len(seg)
for start < end && strings.TrimSpace(seg[start]) == "" {
start++
}
for end > start && strings.TrimSpace(seg[end-1]) == "" {
end--
}
seg = seg[start:end]
if len(seg) == 0 {
continue
}
// Check for a "## Title" header on the first line.
title := ""
contentStart := 0
if strings.HasPrefix(strings.TrimSpace(seg[0]), "## ") {
title = strings.TrimSpace(strings.TrimPrefix(strings.TrimSpace(seg[0]), "## "))
contentStart = 1
// Skip blank lines after the title.
for contentStart < len(seg) && strings.TrimSpace(seg[contentStart]) == "" {
contentStart++
}
}
content := strings.TrimSpace(strings.Join(seg[contentStart:], "\n"))
if content == "" {
continue
}
chNum++
if title == "" {
title = fmt.Sprintf("Chapter %d", chNum)
}
chapters = append(chapters, bookstore.Chapter{
Number: chNum,
Title: title,
Content: content,
})
}
return chapters
}
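The splitting rules used by splitChapterText can be illustrated with a condensed, self-contained sketch. simplifiedSplit below is a hypothetical stand-in: it only extracts titles and splits on exact "\n---\n" boundaries, whereas the real function also trims per-line whitespace around dividers, drops blank segments line by line, and returns full bookstore.Chapter values.

```go
package main

import (
	"fmt"
	"strings"
)

// simplifiedSplit mirrors splitChapterText's title logic for illustration:
// segments are separated by "---" divider lines, an optional leading
// "## Title" line names the chapter, and untitled segments fall back to
// "Chapter N" with sequential numbering.
func simplifiedSplit(text string) []string {
	var titles []string
	num := 0
	for _, seg := range strings.Split(text, "\n---\n") {
		seg = strings.TrimSpace(seg)
		if seg == "" {
			continue
		}
		num++
		title := fmt.Sprintf("Chapter %d", num)
		if strings.HasPrefix(seg, "## ") {
			first := strings.SplitN(seg, "\n", 2)[0]
			title = strings.TrimSpace(strings.TrimPrefix(first, "## "))
		}
		titles = append(titles, title)
	}
	return titles
}

func main() {
	input := "## The Bet\nAlice wagers.\n---\nNo header here.\n---\n## Aftermath\nDust settles."
	fmt.Println(simplifiedSplit(input)) // [The Bet Chapter 2 Aftermath]
}
```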


@@ -0,0 +1,969 @@
package backend
import (
"context"
"encoding/json"
"fmt"
"net/http"
"strings"
"time"
"github.com/libnovel/backend/internal/cfai"
"github.com/libnovel/backend/internal/domain"
)
// chapterNamesBatchSize is the number of chapters sent per LLM request.
// Keeps output well within the 4096-token response limit (~30 tokens/title).
const chapterNamesBatchSize = 100
// handleAdminTextGenModels handles GET /api/admin/text-gen/models.
// Returns the list of supported Cloudflare AI text generation models.
func (s *Server) handleAdminTextGenModels(w http.ResponseWriter, r *http.Request) {
if s.deps.TextGen == nil {
jsonError(w, http.StatusServiceUnavailable, "text generation not configured (CFAI_ACCOUNT_ID/CFAI_API_TOKEN missing)")
return
}
models := s.deps.TextGen.Models()
writeJSON(w, 0, map[string]any{"models": models})
}
// ── Chapter names ─────────────────────────────────────────────────────────────
// textGenChapterNamesRequest is the JSON body for POST /api/admin/text-gen/chapter-names.
type textGenChapterNamesRequest struct {
// Slug is the book slug whose chapters to process.
Slug string `json:"slug"`
// Pattern is a free-text description of the desired naming convention,
// e.g. "Chapter {n}: {brief scene description}".
Pattern string `json:"pattern"`
// Model is the CF Workers AI model ID. Defaults to the recommended model when empty.
Model string `json:"model"`
// MaxTokens limits response length (0 = model default).
MaxTokens int `json:"max_tokens"`
// FromChapter is the first chapter to process (1-based). 0 = start from chapter 1.
FromChapter int `json:"from_chapter"`
// ToChapter is the last chapter to process (inclusive). 0 = process all.
ToChapter int `json:"to_chapter"`
// JobID is an optional existing ai_job ID for resuming a previous run.
// If set, the handler resumes from items_done instead of starting from scratch.
JobID string `json:"job_id"`
}
// proposedChapterTitle is a single chapter with its AI-proposed title.
type proposedChapterTitle struct {
Number int `json:"number"`
// OldTitle is the current title stored in the database.
OldTitle string `json:"old_title"`
// NewTitle is the AI-proposed replacement.
NewTitle string `json:"new_title"`
}
// chapterNamesBatchEvent is one SSE event emitted per processed batch.
type chapterNamesBatchEvent struct {
// JobID is the PB ai_job ID for this run (emitted on the first event only).
JobID string `json:"job_id,omitempty"`
// Batch is the 1-based batch index.
Batch int `json:"batch"`
// TotalBatches is the total number of batches.
TotalBatches int `json:"total_batches"`
// ChaptersDone is the cumulative count of chapters processed so far.
ChaptersDone int `json:"chapters_done"`
// TotalChapters is the total chapter count for this book.
TotalChapters int `json:"total_chapters"`
// Model is the CF AI model used.
Model string `json:"model"`
// Chapters contains the proposed titles for this batch.
Chapters []proposedChapterTitle `json:"chapters"`
// Error is non-empty if this batch failed.
Error string `json:"error,omitempty"`
// Done is true on the final sentinel event (no Chapters).
Done bool `json:"done,omitempty"`
}
// handleAdminTextGenChapterNames handles POST /api/admin/text-gen/chapter-names.
//
// Splits the selected chapters into batches of chapterNamesBatchSize, sends each batch
// to the LLM sequentially, and streams results back as Server-Sent Events so
// the frontend can show live progress. Each SSE data line is a JSON-encoded
// chapterNamesBatchEvent. The final event has Done=true.
//
// Does NOT persist anything — the frontend shows a diff and the user must
// confirm via POST /api/admin/text-gen/chapter-names/apply.
func (s *Server) handleAdminTextGenChapterNames(w http.ResponseWriter, r *http.Request) {
if s.deps.TextGen == nil {
jsonError(w, http.StatusServiceUnavailable, "text generation not configured (CFAI_ACCOUNT_ID/CFAI_API_TOKEN missing)")
return
}
var req textGenChapterNamesRequest
if err := json.NewDecoder(r.Body).Decode(&req); err != nil {
jsonError(w, http.StatusBadRequest, "parse body: "+err.Error())
return
}
if strings.TrimSpace(req.Slug) == "" {
jsonError(w, http.StatusBadRequest, "slug is required")
return
}
if strings.TrimSpace(req.Pattern) == "" {
jsonError(w, http.StatusBadRequest, "pattern is required")
return
}
// Load existing chapter list.
allChapters, err := s.deps.BookReader.ListChapters(r.Context(), req.Slug)
if err != nil {
jsonError(w, http.StatusInternalServerError, "list chapters: "+err.Error())
return
}
if len(allChapters) == 0 {
jsonError(w, http.StatusNotFound, fmt.Sprintf("no chapters found for slug %q", req.Slug))
return
}
// Apply chapter range filter.
chapters := allChapters
if req.FromChapter > 0 || req.ToChapter > 0 {
filtered := chapters[:0] // filter in place; reuses allChapters' backing array
for _, ch := range allChapters {
if req.FromChapter > 0 && ch.Number < req.FromChapter {
continue
}
if req.ToChapter > 0 && ch.Number > req.ToChapter {
break
}
filtered = append(filtered, ch)
}
chapters = filtered
}
if len(chapters) == 0 {
jsonError(w, http.StatusBadRequest, "no chapters in the specified range")
return
}
model := cfai.TextModel(req.Model)
if model == "" {
model = cfai.DefaultTextModel
}
// 4096 tokens comfortably fits 100 chapter titles (~30 tokens each).
maxTokens := req.MaxTokens
if maxTokens <= 0 {
maxTokens = 4096
}
// Index existing titles for old/new diff.
existing := make(map[int]string, len(chapters))
for _, ch := range chapters {
existing[ch.Number] = ch.Title
}
// Partition chapters into batches.
batches := chunkChapters(chapters, chapterNamesBatchSize)
totalBatches := len(batches)
s.deps.Log.Info("admin: text-gen chapter-names requested",
"slug", req.Slug, "chapters", len(chapters),
"batches", totalBatches, "model", model, "max_tokens", maxTokens)
systemPrompt := `You are a chapter title editor for a web novel platform. ` +
`The user provides a list of chapter numbers with their current titles, ` +
`and a naming pattern template. ` +
`Your job: produce one new title for every chapter, following the pattern exactly. ` +
`Pattern placeholders: {n} = the chapter number (integer), {scene} = a very short (2-5 word) scene hint derived from the existing title. ` +
`RULES: ` +
`1. Do NOT include the chapter number inside the title text — the {n} placeholder is already in the pattern. ` +
`2. Do NOT include any prefix like "Chapter X -" or "Chapter X:" inside the title field itself. ` +
`3. The "title" field in your JSON must be the fully-rendered string (e.g. if pattern is "Chapter {n}: {scene}", output "Chapter 3: The Bet"). ` +
`4. Respond ONLY with a raw JSON array — no prose, no markdown fences, no explanation. ` +
`5. Each element: {"number": <int>, "title": <string>}. ` +
`6. Output every chapter in the input list, in order. Do not skip any.`
// Switch to SSE before writing anything.
w.Header().Set("Content-Type", "text/event-stream")
w.Header().Set("Cache-Control", "no-cache")
w.Header().Set("X-Accel-Buffering", "no") // disable nginx/caddy buffering
flusher, canFlush := w.(http.Flusher)
sseWrite := func(evt chapterNamesBatchEvent) {
b, _ := json.Marshal(evt)
fmt.Fprintf(w, "data: %s\n\n", b)
if canFlush {
flusher.Flush()
}
}
// Create or resume an ai_job record for tracking.
var jobID string
resumeFrom := 0
jobCtx := r.Context()
var jobCancel context.CancelFunc
if s.deps.AIJobStore != nil {
if req.JobID != "" {
if existingJob, ok, _ := s.deps.AIJobStore.GetAIJob(r.Context(), req.JobID); ok {
jobID = req.JobID
resumeFrom = existingJob.ItemsDone
_ = s.deps.AIJobStore.UpdateAIJob(r.Context(), jobID, map[string]any{
"status": string(domain.TaskStatusRunning),
"items_total": len(chapters),
})
}
}
if jobID == "" {
jobPayload := fmt.Sprintf(`{"pattern":%q}`, req.Pattern)
id, createErr := s.deps.AIJobStore.CreateAIJob(r.Context(), domain.AIJob{
Kind: "chapter-names",
Slug: req.Slug,
Status: domain.TaskStatusRunning,
FromItem: req.FromChapter,
ToItem: req.ToChapter,
ItemsTotal: len(chapters),
Model: string(model),
Payload: jobPayload,
Started: time.Now(),
})
if createErr == nil {
jobID = id
}
}
if jobID != "" {
jobCtx, jobCancel = context.WithCancel(r.Context())
registerCancelJob(jobID, jobCancel)
defer deregisterCancelJob(jobID)
defer jobCancel()
}
}
var allResults []proposedChapterTitle
chaptersDone := resumeFrom
firstEvent := true
for i, batch := range batches {
if jobCtx.Err() != nil {
return // client disconnected or cancelled
}
// Skip batches already processed in a previous run.
batchEnd := (i + 1) * chapterNamesBatchSize
if batchEnd <= resumeFrom {
continue
}
var chapterListSB strings.Builder
for _, ch := range batch {
chapterListSB.WriteString(fmt.Sprintf("%d: %s\n", ch.Number, ch.Title))
}
userPrompt := fmt.Sprintf("Naming pattern: %s\n\nChapters:\n%s", req.Pattern, chapterListSB.String())
raw, genErr := s.deps.TextGen.Generate(jobCtx, cfai.TextRequest{
Model: model,
Messages: []cfai.TextMessage{
{Role: "system", Content: systemPrompt},
{Role: "user", Content: userPrompt},
},
MaxTokens: maxTokens,
})
if genErr != nil {
s.deps.Log.Error("admin: text-gen chapter-names batch failed",
"batch", i+1, "err", genErr)
evt := chapterNamesBatchEvent{
Batch: i + 1,
TotalBatches: totalBatches,
ChaptersDone: chaptersDone,
TotalChapters: len(chapters),
Model: string(model),
Error: genErr.Error(),
}
if firstEvent {
evt.JobID = jobID
firstEvent = false
}
sseWrite(evt)
continue
}
proposed := parseChapterTitlesJSON(raw)
result := make([]proposedChapterTitle, 0, len(proposed))
for _, p := range proposed {
result = append(result, proposedChapterTitle{
Number: p.Number,
OldTitle: existing[p.Number],
NewTitle: p.Title,
})
}
allResults = append(allResults, result...)
chaptersDone += len(batch)
if jobID != "" && s.deps.AIJobStore != nil {
_ = s.deps.AIJobStore.UpdateAIJob(r.Context(), jobID, map[string]any{
"items_done": chaptersDone,
})
}
evt := chapterNamesBatchEvent{
Batch: i + 1,
TotalBatches: totalBatches,
ChaptersDone: chaptersDone,
TotalChapters: len(chapters),
Model: string(model),
Chapters: result,
}
if firstEvent {
evt.JobID = jobID
firstEvent = false
}
sseWrite(evt)
}
// Mark job as done in PB, persisting results so the Review button works.
// Use context.Background() — r.Context() may be cancelled if the SSE client
// disconnected before processing finished, which would silently drop results.
if jobID != "" && s.deps.AIJobStore != nil {
status := domain.TaskStatusDone
if jobCtx.Err() != nil {
status = domain.TaskStatusCancelled
}
resultsJSON, _ := json.Marshal(allResults)
finalPayload := fmt.Sprintf(`{"pattern":%q,"slug":%q,"results":%s}`,
req.Pattern, req.Slug, string(resultsJSON))
_ = s.deps.AIJobStore.UpdateAIJob(context.Background(), jobID, map[string]any{
"status": string(status),
"items_done": chaptersDone,
"finished": time.Now().Format(time.RFC3339),
"payload": finalPayload,
})
}
// Final sentinel event.
sseWrite(chapterNamesBatchEvent{Done: true, TotalChapters: len(chapters), Model: string(model)})
}
// chunkChapters splits a chapter slice into batches of at most size n.
func chunkChapters(chapters []domain.ChapterInfo, n int) [][]domain.ChapterInfo {
var batches [][]domain.ChapterInfo
for len(chapters) > 0 {
end := n
if end > len(chapters) {
end = len(chapters)
}
batches = append(batches, chapters[:end])
chapters = chapters[end:]
}
return batches
}
// rawChapterTitle is one element of the model's JSON array response.
type rawChapterTitle struct {
Number int `json:"number"`
Title string `json:"title"`
}
// parseChapterTitlesJSON extracts the JSON array from a model response.
// It tolerates markdown fences and surrounding prose.
func parseChapterTitlesJSON(raw string) []rawChapterTitle {
// Strip markdown fences if present.
s := raw
if idx := strings.Index(s, "```json"); idx >= 0 {
s = s[idx+7:]
} else if idx := strings.Index(s, "```"); idx >= 0 {
s = s[idx+3:]
}
if idx := strings.LastIndex(s, "```"); idx >= 0 {
s = s[:idx]
}
// Find the JSON array boundaries.
start := strings.Index(s, "[")
end := strings.LastIndex(s, "]")
if start < 0 || end <= start {
return nil
}
s = s[start : end+1]
var out []rawChapterTitle
json.Unmarshal([]byte(s), &out) //nolint:errcheck
return out
}
// handleAdminTextGenChapterNamesAsync handles POST /api/admin/text-gen/chapter-names/async.
//
// Fire-and-forget variant: validates inputs, creates an ai_job record, spawns a
// background goroutine, and returns HTTP 202 with {job_id} immediately. The
// goroutine runs all batches, stores the proposed titles in the job payload, and
// marks the job done/failed/cancelled when finished.
//
// The client can poll GET /api/admin/ai-jobs/{id} for progress, then call
// POST /api/admin/text-gen/chapter-names/apply once the job is "done".
func (s *Server) handleAdminTextGenChapterNamesAsync(w http.ResponseWriter, r *http.Request) {
if s.deps.TextGen == nil {
jsonError(w, http.StatusServiceUnavailable, "text generation not configured (CFAI_ACCOUNT_ID/CFAI_API_TOKEN missing)")
return
}
var req textGenChapterNamesRequest
if err := json.NewDecoder(r.Body).Decode(&req); err != nil {
jsonError(w, http.StatusBadRequest, "parse body: "+err.Error())
return
}
if strings.TrimSpace(req.Slug) == "" {
jsonError(w, http.StatusBadRequest, "slug is required")
return
}
if strings.TrimSpace(req.Pattern) == "" {
jsonError(w, http.StatusBadRequest, "pattern is required")
return
}
// Load existing chapter list (use request context — just for validation).
allChapters, err := s.deps.BookReader.ListChapters(r.Context(), req.Slug)
if err != nil {
jsonError(w, http.StatusInternalServerError, "list chapters: "+err.Error())
return
}
if len(allChapters) == 0 {
jsonError(w, http.StatusNotFound, fmt.Sprintf("no chapters found for slug %q", req.Slug))
return
}
// Apply chapter range filter.
chapters := allChapters
if req.FromChapter > 0 || req.ToChapter > 0 {
filtered := chapters[:0] // filter in place; reuses allChapters' backing array
for _, ch := range allChapters {
if req.FromChapter > 0 && ch.Number < req.FromChapter {
continue
}
if req.ToChapter > 0 && ch.Number > req.ToChapter {
break
}
filtered = append(filtered, ch)
}
chapters = filtered
}
if len(chapters) == 0 {
jsonError(w, http.StatusBadRequest, "no chapters in the specified range")
return
}
model := cfai.TextModel(req.Model)
if model == "" {
model = cfai.DefaultTextModel
}
maxTokens := req.MaxTokens
if maxTokens <= 0 {
maxTokens = 4096
}
// Index existing titles for old/new diff.
existing := make(map[int]string, len(chapters))
for _, ch := range chapters {
existing[ch.Number] = ch.Title
}
batches := chunkChapters(chapters, chapterNamesBatchSize)
totalBatches := len(batches)
if s.deps.AIJobStore == nil {
jsonError(w, http.StatusServiceUnavailable, "ai job store not configured")
return
}
jobPayload := fmt.Sprintf(`{"pattern":%q}`, req.Pattern)
jobID, createErr := s.deps.AIJobStore.CreateAIJob(r.Context(), domain.AIJob{
Kind: "chapter-names",
Slug: req.Slug,
Status: domain.TaskStatusPending,
FromItem: req.FromChapter,
ToItem: req.ToChapter,
ItemsTotal: len(chapters),
Model: string(model),
Payload: jobPayload,
Started: time.Now(),
})
if createErr != nil {
jsonError(w, http.StatusInternalServerError, "create ai job: "+createErr.Error())
return
}
jobCtx, jobCancel := context.WithCancel(context.Background())
registerCancelJob(jobID, jobCancel)
s.deps.Log.Info("admin: text-gen chapter-names async started",
"job_id", jobID, "slug", req.Slug,
"chapters", len(chapters), "batches", totalBatches, "model", model)
// Mark running before returning so the UI sees it immediately.
_ = s.deps.AIJobStore.UpdateAIJob(r.Context(), jobID, map[string]any{
"status": string(domain.TaskStatusRunning),
})
systemPrompt := `You are a chapter title editor for a web novel platform. ` +
`The user provides a list of chapter numbers with their current titles, ` +
`and a naming pattern template. ` +
`Your job: produce one new title for every chapter, following the pattern exactly. ` +
`Pattern placeholders: {n} = the chapter number (integer), {scene} = a very short (2-5 word) scene hint derived from the existing title. ` +
`RULES: ` +
`1. Do NOT include the chapter number inside the title text — the {n} placeholder is already in the pattern. ` +
`2. Do NOT include any prefix like "Chapter X -" or "Chapter X:" inside the title field itself. ` +
`3. The "title" field in your JSON must be the fully-rendered string (e.g. if pattern is "Chapter {n}: {scene}", output "Chapter 3: The Bet"). ` +
`4. Respond ONLY with a raw JSON array — no prose, no markdown fences, no explanation. ` +
`5. Each element: {"number": <int>, "title": <string>}. ` +
`6. Output every chapter in the input list, in order. Do not skip any.`
// Capture all locals needed in the goroutine.
store := s.deps.AIJobStore
textGen := s.deps.TextGen
logger := s.deps.Log
capturedModel := model
capturedMaxTokens := maxTokens
capturedPattern := req.Pattern
capturedSlug := req.Slug
go func() {
defer deregisterCancelJob(jobID)
defer jobCancel()
var allResults []proposedChapterTitle
chaptersDone := 0
for i, batch := range batches {
if jobCtx.Err() != nil {
_ = store.UpdateAIJob(context.Background(), jobID, map[string]any{
"status": string(domain.TaskStatusCancelled),
"finished": time.Now().Format(time.RFC3339),
})
return
}
var chapterListSB strings.Builder
for _, ch := range batch {
chapterListSB.WriteString(fmt.Sprintf("%d: %s\n", ch.Number, ch.Title))
}
userPrompt := fmt.Sprintf("Naming pattern: %s\n\nChapters:\n%s", capturedPattern, chapterListSB.String())
raw, genErr := textGen.Generate(jobCtx, cfai.TextRequest{
Model: capturedModel,
Messages: []cfai.TextMessage{
{Role: "system", Content: systemPrompt},
{Role: "user", Content: userPrompt},
},
MaxTokens: capturedMaxTokens,
})
if genErr != nil {
logger.Error("admin: text-gen chapter-names async batch failed",
"job_id", jobID, "batch", i+1, "err", genErr)
// Continue — skip errored batch rather than aborting.
continue
}
proposed := parseChapterTitlesJSON(raw)
for _, p := range proposed {
allResults = append(allResults, proposedChapterTitle{
Number: p.Number,
OldTitle: existing[p.Number],
NewTitle: p.Title,
})
}
chaptersDone += len(batch)
_ = store.UpdateAIJob(context.Background(), jobID, map[string]any{
"items_done": chaptersDone,
})
}
// Persist results into payload so the UI can load them for review.
resultsJSON, _ := json.Marshal(allResults)
finalPayload := fmt.Sprintf(`{"pattern":%q,"slug":%q,"results":%s}`,
capturedPattern, capturedSlug, string(resultsJSON))
status := domain.TaskStatusDone
if jobCtx.Err() != nil {
status = domain.TaskStatusCancelled
}
_ = store.UpdateAIJob(context.Background(), jobID, map[string]any{
"status": string(status),
"items_done": chaptersDone,
"finished": time.Now().Format(time.RFC3339),
"payload": finalPayload,
})
logger.Info("admin: text-gen chapter-names async done",
"job_id", jobID, "slug", capturedSlug,
"results", len(allResults), "status", string(status))
}()
writeJSON(w, http.StatusAccepted, map[string]any{"job_id": jobID})
}
// ── Apply chapter names ───────────────────────────────────────────────────────
// applyChapterNamesRequest is the JSON body for POST /api/admin/text-gen/chapter-names/apply.
type applyChapterNamesRequest struct {
// Slug is the book slug to update.
Slug string `json:"slug"`
// Chapters is the list of chapters to save (number + new_title pairs).
// The UI may modify individual titles before confirming.
Chapters []applyChapterEntry `json:"chapters"`
}
type applyChapterEntry struct {
Number int `json:"number"`
Title string `json:"title"`
}
// handleAdminTextGenApplyChapterNames handles POST /api/admin/text-gen/chapter-names/apply.
//
// Persists the confirmed chapter titles to PocketBase chapters_idx.
func (s *Server) handleAdminTextGenApplyChapterNames(w http.ResponseWriter, r *http.Request) {
if s.deps.BookWriter == nil {
jsonError(w, http.StatusServiceUnavailable, "book writer not configured")
return
}
var req applyChapterNamesRequest
if err := json.NewDecoder(r.Body).Decode(&req); err != nil {
jsonError(w, http.StatusBadRequest, "parse body: "+err.Error())
return
}
if strings.TrimSpace(req.Slug) == "" {
jsonError(w, http.StatusBadRequest, "slug is required")
return
}
if len(req.Chapters) == 0 {
jsonError(w, http.StatusBadRequest, "chapters is required")
return
}
refs := make([]domain.ChapterRef, 0, len(req.Chapters))
for _, ch := range req.Chapters {
if ch.Number <= 0 {
continue
}
refs = append(refs, domain.ChapterRef{
Number: ch.Number,
Title: strings.TrimSpace(ch.Title),
})
}
if err := s.deps.BookWriter.WriteChapterRefs(r.Context(), req.Slug, refs); err != nil {
s.deps.Log.Error("admin: apply chapter names failed", "slug", req.Slug, "err", err)
jsonError(w, http.StatusInternalServerError, "write chapter refs: "+err.Error())
return
}
s.deps.Log.Info("admin: chapter names applied", "slug", req.Slug, "count", len(refs))
writeJSON(w, 0, map[string]any{"updated": len(refs)})
}
// ── Book description ──────────────────────────────────────────────────────────
// textGenDescriptionRequest is the JSON body for POST /api/admin/text-gen/description.
type textGenDescriptionRequest struct {
// Slug is the book slug whose description to regenerate.
Slug string `json:"slug"`
// Instructions is an optional free-text hint for the AI,
// e.g. "Write a 3-sentence blurb, avoid spoilers, dramatic tone."
Instructions string `json:"instructions"`
// Model is the CF Workers AI model ID. Defaults to recommended when empty.
Model string `json:"model"`
// MaxTokens limits response length (0 = model default).
MaxTokens int `json:"max_tokens"`
}
// textGenDescriptionResponse is the JSON body returned by POST /api/admin/text-gen/description.
type textGenDescriptionResponse struct {
// OldDescription is the current summary stored in the database.
OldDescription string `json:"old_description"`
// NewDescription is the AI-proposed replacement.
NewDescription string `json:"new_description"`
// Model is the model that was used.
Model string `json:"model"`
}
// handleAdminTextGenDescription handles POST /api/admin/text-gen/description.
//
// Reads the current book metadata, sends it to the LLM, and returns a proposed
// new description. Does NOT persist anything — the user must confirm via
// POST /api/admin/text-gen/description/apply.
func (s *Server) handleAdminTextGenDescription(w http.ResponseWriter, r *http.Request) {
if s.deps.TextGen == nil {
jsonError(w, http.StatusServiceUnavailable, "text generation not configured (CFAI_ACCOUNT_ID/CFAI_API_TOKEN missing)")
return
}
var req textGenDescriptionRequest
if err := json.NewDecoder(r.Body).Decode(&req); err != nil {
jsonError(w, http.StatusBadRequest, "parse body: "+err.Error())
return
}
if strings.TrimSpace(req.Slug) == "" {
jsonError(w, http.StatusBadRequest, "slug is required")
return
}
// Load current book metadata.
meta, ok, err := s.deps.BookReader.ReadMetadata(r.Context(), req.Slug)
if err != nil {
jsonError(w, http.StatusInternalServerError, "read metadata: "+err.Error())
return
}
if !ok {
jsonError(w, http.StatusNotFound, fmt.Sprintf("book %q not found", req.Slug))
return
}
systemPrompt := `You are a book description writer for a web novel platform. ` +
`Given a book's title, author, genres, and current description, write an improved ` +
`description that accurately captures the story. ` +
`Respond with ONLY the new description text — no title, no labels, no markdown, no quotes.`
instructions := strings.TrimSpace(req.Instructions)
if instructions == "" {
instructions = "Write a compelling 2-4 sentence description. Keep it spoiler-free and engaging."
}
userPrompt := fmt.Sprintf(
"Title: %s\nAuthor: %s\nGenres: %s\nStatus: %s\n\nCurrent description:\n%s\n\nInstructions: %s",
meta.Title,
meta.Author,
strings.Join(meta.Genres, ", "),
meta.Status,
meta.Summary,
instructions,
)
model := cfai.TextModel(req.Model)
if model == "" {
model = cfai.DefaultTextModel
}
s.deps.Log.Info("admin: text-gen description requested",
"slug", req.Slug, "model", model)
newDesc, genErr := s.deps.TextGen.Generate(r.Context(), cfai.TextRequest{
Model: model,
Messages: []cfai.TextMessage{
{Role: "system", Content: systemPrompt},
{Role: "user", Content: userPrompt},
},
MaxTokens: req.MaxTokens,
})
if genErr != nil {
s.deps.Log.Error("admin: text-gen description failed", "err", genErr)
jsonError(w, http.StatusBadGateway, "text generation failed: "+genErr.Error())
return
}
writeJSON(w, 0, textGenDescriptionResponse{
OldDescription: meta.Summary,
NewDescription: strings.TrimSpace(newDesc),
Model: string(model),
})
}
// ── Apply description ─────────────────────────────────────────────────────────
// applyDescriptionRequest is the JSON body for POST /api/admin/text-gen/description/apply.
type applyDescriptionRequest struct {
// Slug is the book slug to update.
Slug string `json:"slug"`
// Description is the new summary text to persist.
Description string `json:"description"`
}
// handleAdminTextGenApplyDescription handles POST /api/admin/text-gen/description/apply.
//
// Updates only the summary field in PocketBase, leaving all other book metadata
// unchanged.
func (s *Server) handleAdminTextGenApplyDescription(w http.ResponseWriter, r *http.Request) {
if s.deps.BookWriter == nil {
jsonError(w, http.StatusServiceUnavailable, "book writer not configured")
return
}
var req applyDescriptionRequest
if err := json.NewDecoder(r.Body).Decode(&req); err != nil {
jsonError(w, http.StatusBadRequest, "parse body: "+err.Error())
return
}
if strings.TrimSpace(req.Slug) == "" {
jsonError(w, http.StatusBadRequest, "slug is required")
return
}
if strings.TrimSpace(req.Description) == "" {
jsonError(w, http.StatusBadRequest, "description is required")
return
}
// Read existing metadata so we can write it back with only summary changed.
meta, ok, err := s.deps.BookReader.ReadMetadata(r.Context(), req.Slug)
if err != nil {
jsonError(w, http.StatusInternalServerError, "read metadata: "+err.Error())
return
}
if !ok {
jsonError(w, http.StatusNotFound, fmt.Sprintf("book %q not found", req.Slug))
return
}
meta.Summary = strings.TrimSpace(req.Description)
if err := s.deps.BookWriter.WriteMetadata(r.Context(), meta); err != nil {
s.deps.Log.Error("admin: apply description failed", "slug", req.Slug, "err", err)
jsonError(w, http.StatusInternalServerError, "write metadata: "+err.Error())
return
}
s.deps.Log.Info("admin: book description applied", "slug", req.Slug)
writeJSON(w, 0, map[string]any{"updated": true})
}
// handleAdminTextGenDescriptionAsync handles POST /api/admin/text-gen/description/async.
//
// Fire-and-forget variant: validates inputs, creates an ai_job record of kind
// "description", spawns a background goroutine that calls the LLM, stores the
// old/new description in the job payload, and marks the job done/failed.
// Returns HTTP 202 with {job_id} immediately.
func (s *Server) handleAdminTextGenDescriptionAsync(w http.ResponseWriter, r *http.Request) {
if s.deps.TextGen == nil {
jsonError(w, http.StatusServiceUnavailable, "text generation not configured (CFAI_ACCOUNT_ID/CFAI_API_TOKEN missing)")
return
}
if s.deps.AIJobStore == nil {
jsonError(w, http.StatusServiceUnavailable, "ai job store not configured")
return
}
var req textGenDescriptionRequest
if err := json.NewDecoder(r.Body).Decode(&req); err != nil {
jsonError(w, http.StatusBadRequest, "parse body: "+err.Error())
return
}
if strings.TrimSpace(req.Slug) == "" {
jsonError(w, http.StatusBadRequest, "slug is required")
return
}
// Load current metadata eagerly so we can fail fast if the book is missing.
meta, ok, err := s.deps.BookReader.ReadMetadata(r.Context(), req.Slug)
if err != nil {
jsonError(w, http.StatusInternalServerError, "read metadata: "+err.Error())
return
}
if !ok {
jsonError(w, http.StatusNotFound, fmt.Sprintf("book %q not found", req.Slug))
return
}
model := cfai.TextModel(req.Model)
if model == "" {
model = cfai.DefaultTextModel
}
instructions := strings.TrimSpace(req.Instructions)
if instructions == "" {
instructions = "Write a compelling 2-4 sentence description. Keep it spoiler-free and engaging."
}
// Encode the initial params (without result) as the starting payload.
type initPayload struct {
Instructions string `json:"instructions"`
OldDescription string `json:"old_description"`
}
initJSON, _ := json.Marshal(initPayload{
Instructions: instructions,
OldDescription: meta.Summary,
})
jobID, createErr := s.deps.AIJobStore.CreateAIJob(r.Context(), domain.AIJob{
Kind: "description",
Slug: req.Slug,
Status: domain.TaskStatusPending,
Model: string(model),
Payload: string(initJSON),
Started: time.Now(),
})
if createErr != nil {
jsonError(w, http.StatusInternalServerError, "create ai job: "+createErr.Error())
return
}
jobCtx, jobCancel := context.WithCancel(context.Background())
registerCancelJob(jobID, jobCancel)
_ = s.deps.AIJobStore.UpdateAIJob(r.Context(), jobID, map[string]any{
"status": string(domain.TaskStatusRunning),
})
s.deps.Log.Info("admin: text-gen description async started",
"job_id", jobID, "slug", req.Slug, "model", model)
// Capture locals.
store := s.deps.AIJobStore
textGen := s.deps.TextGen
logger := s.deps.Log
capturedMeta := meta
capturedModel := model
capturedInstructions := instructions
capturedMaxTokens := req.MaxTokens
go func() {
defer deregisterCancelJob(jobID)
defer jobCancel()
if jobCtx.Err() != nil {
_ = store.UpdateAIJob(context.Background(), jobID, map[string]any{
"status": string(domain.TaskStatusCancelled),
"finished": time.Now().Format(time.RFC3339),
})
return
}
systemPrompt := `You are a book description writer for a web novel platform. ` +
`Given a book's title, author, genres, and current description, write an improved ` +
`description that accurately captures the story. ` +
`Respond with ONLY the new description text — no title, no labels, no markdown, no quotes.`
userPrompt := fmt.Sprintf(
"Title: %s\nAuthor: %s\nGenres: %s\nStatus: %s\n\nCurrent description:\n%s\n\nInstructions: %s",
capturedMeta.Title,
capturedMeta.Author,
strings.Join(capturedMeta.Genres, ", "),
capturedMeta.Status,
capturedMeta.Summary,
capturedInstructions,
)
newDesc, genErr := textGen.Generate(jobCtx, cfai.TextRequest{
Model: capturedModel,
Messages: []cfai.TextMessage{
{Role: "system", Content: systemPrompt},
{Role: "user", Content: userPrompt},
},
MaxTokens: capturedMaxTokens,
})
if genErr != nil {
logger.Error("admin: text-gen description async failed", "job_id", jobID, "err", genErr)
_ = store.UpdateAIJob(context.Background(), jobID, map[string]any{
"status": string(domain.TaskStatusFailed),
"error_message": genErr.Error(),
"finished": time.Now().Format(time.RFC3339),
})
return
}
type resultPayload struct {
Instructions string `json:"instructions"`
OldDescription string `json:"old_description"`
NewDescription string `json:"new_description"`
}
resultJSON, _ := json.Marshal(resultPayload{
Instructions: capturedInstructions,
OldDescription: capturedMeta.Summary,
NewDescription: strings.TrimSpace(newDesc),
})
_ = store.UpdateAIJob(context.Background(), jobID, map[string]any{
"status": string(domain.TaskStatusDone),
"items_done": 1,
"items_total": 1,
"payload": string(resultJSON),
"finished": time.Now().Format(time.RFC3339),
})
logger.Info("admin: text-gen description async done", "job_id", jobID, "slug", capturedMeta.Slug)
}()
writeJSON(w, http.StatusAccepted, map[string]any{"job_id": jobID})
}


@@ -30,6 +30,7 @@ import (
sentryhttp "github.com/getsentry/sentry-go/http"
"github.com/libnovel/backend/internal/bookstore"
"github.com/libnovel/backend/internal/cfai"
"github.com/libnovel/backend/internal/domain"
"github.com/libnovel/backend/internal/kokoro"
"github.com/libnovel/backend/internal/meili"
@@ -56,6 +57,9 @@ type Dependencies struct {
// CoverStore reads and writes book cover images from MinIO.
// If nil, the cover endpoint falls back to a CDN redirect.
CoverStore bookstore.CoverStore
// ChapterImageStore reads and writes per-chapter illustration images from MinIO.
// If nil, chapter image endpoints return 404/503.
ChapterImageStore bookstore.ChapterImageStore
// Producer creates scrape/audio tasks in PocketBase.
Producer taskqueue.Producer
// TaskReader reads scrape/audio task records from PocketBase.
@@ -69,6 +73,31 @@ type Dependencies struct {
// PocketTTS is the pocket-tts client (used for voice list only in the backend;
// audio generation is done by the runner).
PocketTTS pockettts.Client
// CFAI is the Cloudflare Workers AI TTS client (used for voice sample
// generation and audio-stream live TTS; audio task generation is done by the runner).
CFAI cfai.Client
// ImageGen is the Cloudflare Workers AI image generation client.
// If nil, image generation endpoints return 503.
ImageGen cfai.ImageGenClient
// TextGen is the Cloudflare Workers AI text generation client.
// If nil, text generation endpoints return 503.
TextGen cfai.TextGenClient
// BookWriter writes book metadata and chapter refs to PocketBase.
// Used by admin text-gen apply endpoints.
BookWriter bookstore.BookWriter
// ImportFileStore uploads raw PDF/EPUB files to MinIO for the runner to process.
// Always wired to the concrete *storage.Store (not the Asynq wrapper).
ImportFileStore bookstore.ImportFileStore
// AIJobStore tracks long-running AI generation jobs in PocketBase.
// If nil, job persistence is disabled (jobs still run but are not recorded).
AIJobStore bookstore.AIJobStore
// BookAdminStore provides admin-only operations: archive, unarchive, hard-delete.
// If nil, the admin book management endpoints return 503.
BookAdminStore bookstore.BookAdminStore
// NotificationStore manages per-user in-app notifications.
// Always wired directly to *storage.Store (not the Asynq wrapper) so
// notification endpoints work regardless of whether Redis/Asynq is in use.
NotificationStore bookstore.NotificationStore
// Log is the structured logger.
Log *slog.Logger
}
@@ -82,6 +111,9 @@ type Config struct {
// Version and Commit are embedded in /health and /api/version responses.
Version string
Commit string
// AdminToken is the bearer token required for all /api/admin/* endpoints.
// When empty a startup warning is logged and admin routes are unprotected.
AdminToken string
}
// Server is the HTTP API server.
@@ -108,9 +140,30 @@ func New(cfg Config, deps Dependencies) *Server {
return &Server{cfg: cfg, deps: deps}
}
// requireAdmin returns a handler that enforces Bearer token authentication.
// When AdminToken is empty all requests are allowed through (with a warning logged
// once at startup via ListenAndServe).
func (s *Server) requireAdmin(next http.HandlerFunc) http.HandlerFunc {
return func(w http.ResponseWriter, r *http.Request) {
if s.cfg.AdminToken == "" {
next(w, r)
return
}
if r.Header.Get("Authorization") != "Bearer "+s.cfg.AdminToken {
jsonError(w, http.StatusUnauthorized, "unauthorized")
return
}
next(w, r)
}
}
// ListenAndServe registers all routes and starts the HTTP server.
// It blocks until ctx is cancelled, then performs a graceful shutdown.
func (s *Server) ListenAndServe(ctx context.Context) error {
if s.cfg.AdminToken == "" {
s.deps.Log.Warn("backend: BACKEND_ADMIN_TOKEN is not set — /api/admin/* endpoints are unprotected")
}
mux := http.NewServeMux()
// Health / version
@@ -164,15 +217,96 @@ func (s *Server) ListenAndServe(ctx context.Context) error {
// Streaming audio: serves from MinIO if cached, else streams live TTS
// while simultaneously uploading to MinIO for future requests.
mux.HandleFunc("GET /api/audio-stream/{slug}/{n}", s.handleAudioStream)
// TTS for arbitrary short text (chapter announcements) — no MinIO caching.
mux.HandleFunc("GET /api/tts-announce", s.handleTTSAnnounce)
// CF AI preview: generates only the first ~1 800-char chunk so the client
// can start playing immediately while the full audio is generated by the runner.
mux.HandleFunc("GET /api/audio-preview/{slug}/{n}", s.handleAudioPreview)
// Translation task creation (backend creates task; runner executes via LibreTranslate)
mux.HandleFunc("POST /api/translation/{slug}/{n}", s.handleTranslationGenerate)
mux.HandleFunc("GET /api/translation/status/{slug}/{n}", s.handleTranslationStatus)
mux.HandleFunc("GET /api/translation/{slug}/{n}", s.handleTranslationRead)
// admin is a shorthand that wraps every /api/admin/* handler with bearer-token auth.
admin := func(pattern string, h http.HandlerFunc) {
mux.HandleFunc(pattern, s.requireAdmin(h))
}
// Admin translation endpoints
admin("GET /api/admin/translation/jobs", s.handleAdminTranslationJobs)
admin("POST /api/admin/translation/bulk", s.handleAdminTranslationBulk)
// Admin audio endpoints
admin("GET /api/admin/audio/jobs", s.handleAdminAudioJobs)
admin("POST /api/admin/audio/bulk", s.handleAdminAudioBulk)
admin("POST /api/admin/audio/cancel-bulk", s.handleAdminAudioCancelBulk)
// Admin image generation endpoints
admin("GET /api/admin/image-gen/models", s.handleAdminImageGenModels)
admin("POST /api/admin/image-gen", s.handleAdminImageGen)
admin("POST /api/admin/image-gen/async", s.handleAdminImageGenAsync)
admin("POST /api/admin/image-gen/save-cover", s.handleAdminImageGenSaveCover)
admin("POST /api/admin/image-gen/save-chapter-image", s.handleAdminImageGenSaveChapterImage)
// Chapter image serving
mux.HandleFunc("GET /api/chapter-image/{domain}/{slug}/{n}", s.handleGetChapterImage)
mux.HandleFunc("HEAD /api/chapter-image/{domain}/{slug}/{n}", s.handleHeadChapterImage)
// Admin text generation endpoints (chapter names + book description)
admin("GET /api/admin/text-gen/models", s.handleAdminTextGenModels)
admin("POST /api/admin/text-gen/chapter-names", s.handleAdminTextGenChapterNames)
admin("POST /api/admin/text-gen/chapter-names/async", s.handleAdminTextGenChapterNamesAsync)
admin("POST /api/admin/text-gen/chapter-names/apply", s.handleAdminTextGenApplyChapterNames)
admin("POST /api/admin/text-gen/description", s.handleAdminTextGenDescription)
admin("POST /api/admin/text-gen/description/async", s.handleAdminTextGenDescriptionAsync)
admin("POST /api/admin/text-gen/description/apply", s.handleAdminTextGenApplyDescription)
// Admin catalogue enrichment endpoints
admin("POST /api/admin/text-gen/tagline", s.handleAdminTextGenTagline)
admin("POST /api/admin/text-gen/genres", s.handleAdminTextGenGenres)
admin("POST /api/admin/text-gen/genres/apply", s.handleAdminTextGenApplyGenres)
admin("POST /api/admin/text-gen/content-warnings", s.handleAdminTextGenContentWarnings)
admin("POST /api/admin/text-gen/quality-score", s.handleAdminTextGenQualityScore)
admin("POST /api/admin/catalogue/batch-covers", s.handleAdminBatchCovers)
admin("POST /api/admin/catalogue/batch-covers/cancel", s.handleAdminBatchCoversCancel)
admin("POST /api/admin/catalogue/refresh-metadata/{slug}", s.handleAdminRefreshMetadata)
// Admin AI job tracking endpoints
admin("GET /api/admin/ai-jobs", s.handleAdminListAIJobs)
admin("GET /api/admin/ai-jobs/{id}", s.handleAdminGetAIJob)
admin("POST /api/admin/ai-jobs/{id}/cancel", s.handleAdminCancelAIJob)
// Auto-prompt generation from book/chapter content
admin("POST /api/admin/image-gen/auto-prompt", s.handleAdminImageGenAutoPrompt)
// Admin data repair endpoints
admin("POST /api/admin/dedup-chapters/{slug}", s.handleDedupChapters)
// Admin book management (soft-delete / hard-delete)
admin("PATCH /api/admin/books/{slug}/archive", s.handleAdminArchiveBook)
admin("PATCH /api/admin/books/{slug}/unarchive", s.handleAdminUnarchiveBook)
admin("DELETE /api/admin/books/{slug}", s.handleAdminDeleteBook)
// Admin chapter split (imported books)
admin("POST /api/admin/books/{slug}/split-chapters", s.handleAdminSplitChapters)
// Import (PDF/EPUB)
admin("POST /api/admin/import", s.handleAdminImport)
admin("GET /api/admin/import", s.handleAdminImportList)
admin("GET /api/admin/import/{id}", s.handleAdminImportStatus)
// Notifications
mux.HandleFunc("GET /api/notifications", s.handleListNotifications)
mux.HandleFunc("PATCH /api/notifications", s.handleMarkAllNotificationsRead)
mux.HandleFunc("PATCH /api/notifications/{id}", s.handleMarkNotificationRead)
mux.HandleFunc("DELETE /api/notifications", s.handleClearAllNotifications)
mux.HandleFunc("DELETE /api/notifications/{id}", s.handleDismissNotification)
// Web Push subscriptions
mux.HandleFunc("GET /api/push-subscriptions/vapid-public-key", s.handleGetVAPIDPublicKey)
mux.HandleFunc("POST /api/push-subscriptions", s.handleSavePushSubscription)
mux.HandleFunc("DELETE /api/push-subscriptions", s.handleDeletePushSubscription)
// Voices list
mux.HandleFunc("GET /api/voices", s.handleVoices)
@@ -185,6 +319,9 @@ func (s *Server) ListenAndServe(ctx context.Context) error {
mux.HandleFunc("GET /api/presign/avatar/{userId}", s.handlePresignAvatar)
mux.HandleFunc("PUT /api/avatar-upload/{userId}", s.handleAvatarUpload)
// EPUB export
mux.HandleFunc("GET /api/export/{slug}", s.handleExportEPUB)
// Reading progress
mux.HandleFunc("GET /api/progress", s.handleGetProgress)
mux.HandleFunc("POST /api/progress/{slug}", s.handleSetProgress)
@@ -330,6 +467,23 @@ func (s *Server) voices(ctx context.Context) []domain.Voice {
}
}
// ── Cloudflare AI voices ──────────────────────────────────────────────────
if s.deps.CFAI != nil {
for _, speaker := range cfai.Speakers() {
gender := "m"
if cfai.IsFemale(speaker) {
gender = "f"
}
result = append(result, domain.Voice{
ID: cfai.VoiceID(speaker),
Engine: "cfai",
Lang: "en",
Gender: gender,
})
}
s.deps.Log.Info("backend: loaded CF AI voices", "count", len(cfai.Speakers()))
}
s.voiceMu.Lock()
s.cachedVoices = result
s.voiceMu.Unlock()


@@ -35,6 +35,11 @@ type BookWriter interface {
// ChapterExists returns true if the markdown object for ref already exists.
ChapterExists(ctx context.Context, slug string, ref domain.ChapterRef) bool
// DeduplicateChapters removes duplicate chapters_idx records for slug,
// keeping only one record per chapter number (the one with the latest
// updated timestamp). Returns the number of duplicate records deleted.
DeduplicateChapters(ctx context.Context, slug string) (int, error)
}
// BookReader is the read side used by the backend to serve content.
@@ -80,9 +85,14 @@ type RankingStore interface {
// AudioStore covers audio object storage (runner writes; backend reads).
type AudioStore interface {
// AudioObjectKey returns the MinIO object key for a cached MP3 audio file.
// Format: {slug}/{n}/{voice}.mp3
AudioObjectKey(slug string, n int, voice string) string
// AudioObjectKeyExt returns the MinIO object key for a cached audio file
// with a custom extension (e.g. "mp3" or "wav").
AudioObjectKeyExt(slug string, n int, voice, ext string) string
// AudioExists returns true when the audio object is present in MinIO.
AudioExists(ctx context.Context, key string) bool
@@ -91,7 +101,7 @@ type AudioStore interface {
// PutAudioStream uploads audio from r to MinIO under key.
// size must be the exact byte length of r, or -1 to use multipart upload.
// contentType should be "audio/mpeg" or "audio/wav".
PutAudioStream(ctx context.Context, key string, r io.Reader, size int64, contentType string) error
}
@@ -148,6 +158,33 @@ type CoverStore interface {
CoverExists(ctx context.Context, slug string) bool
}
// AIJobStore manages AI generation jobs tracked in PocketBase.
type AIJobStore interface {
// CreateAIJob inserts a new ai_job record with the given fields and returns its ID.
CreateAIJob(ctx context.Context, job domain.AIJob) (string, error)
// GetAIJob retrieves a single ai_job by ID.
// Returns (zero, false, nil) when not found.
GetAIJob(ctx context.Context, id string) (domain.AIJob, bool, error)
// UpdateAIJob patches an existing ai_job record with the given fields.
UpdateAIJob(ctx context.Context, id string, fields map[string]any) error
// ListAIJobs returns all ai_job records sorted by started descending.
ListAIJobs(ctx context.Context) ([]domain.AIJob, error)
}
// ChapterImageStore covers per-chapter illustration images stored in MinIO.
// The backend admin writes them; the backend serves them.
type ChapterImageStore interface {
// PutChapterImage stores a raw image for chapter n of slug in MinIO.
PutChapterImage(ctx context.Context, slug string, n int, data []byte, contentType string) error
// GetChapterImage retrieves the image for chapter n of slug.
// Returns (nil, "", false, nil) when no image exists.
GetChapterImage(ctx context.Context, slug string, n int) ([]byte, string, bool, error)
// ChapterImageExists returns true when an image is stored for slug/n.
ChapterImageExists(ctx context.Context, slug string, n int) bool
}
// TranslationStore covers machine-translated chapter storage in MinIO.
// The runner writes translations; the backend reads them.
type TranslationStore interface {
@@ -163,3 +200,61 @@ type TranslationStore interface {
// GetTranslation retrieves translated markdown from MinIO.
GetTranslation(ctx context.Context, key string) (string, error)
}
// Chapter represents a single chapter extracted from PDF/EPUB.
type Chapter struct {
Number int // 1-based chapter number
Title string // chapter title (may be empty)
Content string // plain text content
}
// BookImporter handles PDF/EPUB file parsing and chapter extraction.
// Used by the runner to import books from uploaded files.
type BookImporter interface {
// Import extracts chapters from a PDF or EPUB file stored in MinIO.
// Returns the extracted chapters or an error.
Import(ctx context.Context, objectKey, fileType string) ([]Chapter, error)
}
// BookAdminStore covers admin-only operations for managing books in the catalogue.
// All methods require admin authorisation at the HTTP handler level.
type BookAdminStore interface {
// ArchiveBook sets archived=true on a book record, hiding it from all
// public search and catalogue responses. Returns ErrNotFound when the
// slug does not exist.
ArchiveBook(ctx context.Context, slug string) error
// UnarchiveBook clears archived on a book record, making it publicly
// visible again. Returns ErrNotFound when the slug does not exist.
UnarchiveBook(ctx context.Context, slug string) error
// DeleteBook permanently removes all data for a book:
// - PocketBase books record
// - All PocketBase chapters_idx records
// - All MinIO chapter markdown objects ({slug}/chapter-*.md)
// - MinIO cover image (covers/{slug}.jpg)
// The caller is responsible for also deleting the Meilisearch document.
DeleteBook(ctx context.Context, slug string) error
}
// ImportFileStore uploads raw import files to object storage.
// Kept separate from BookImporter so the HTTP handler can upload the file
// without a concrete type assertion, regardless of which Producer is wired.
type ImportFileStore interface {
PutImportFile(ctx context.Context, objectKey string, data []byte) error
// PutImportChapters stores the pre-parsed chapters JSON under the given key.
PutImportChapters(ctx context.Context, key string, data []byte) error
// GetImportChapters retrieves the pre-parsed chapters JSON.
GetImportChapters(ctx context.Context, key string) ([]byte, error)
}
// NotificationStore manages per-user in-app notifications.
// Always wired directly to the concrete *storage.Store so it works
// regardless of whether the Asynq task-queue wrapper is in use.
type NotificationStore interface {
ListNotifications(ctx context.Context, userID string, limit int) ([]map[string]any, error)
MarkNotificationRead(ctx context.Context, id string) error
MarkAllNotificationsRead(ctx context.Context, userID string) error
DeleteNotification(ctx context.Context, id string) error
ClearAllNotifications(ctx context.Context, userID string) error
}


@@ -39,8 +39,9 @@ func (m *mockStore) ReadChapter(_ context.Context, _ string, _ int) (string, err
func (m *mockStore) ListChapters(_ context.Context, _ string) ([]domain.ChapterInfo, error) {
return nil, nil
}
func (m *mockStore) CountChapters(_ context.Context, _ string) int { return 0 }
func (m *mockStore) ReindexChapters(_ context.Context, _ string) (int, error) { return 0, nil }
func (m *mockStore) DeduplicateChapters(_ context.Context, _ string) (int, error) { return 0, nil }
// RankingStore
func (m *mockStore) WriteRankingItem(_ context.Context, _ domain.RankingItem) error { return nil }
@@ -52,9 +53,10 @@ func (m *mockStore) RankingFreshEnough(_ context.Context, _ time.Duration) (bool
}
// AudioStore
func (m *mockStore) AudioObjectKey(_ string, _ int, _ string) string { return "" }
func (m *mockStore) AudioObjectKeyExt(_ string, _ int, _, _ string) string { return "" }
func (m *mockStore) AudioExists(_ context.Context, _ string) bool { return false }
func (m *mockStore) PutAudio(_ context.Context, _ string, _ []byte) error { return nil }
func (m *mockStore) PutAudioStream(_ context.Context, _ string, _ io.Reader, _ int64, _ string) error {
return nil
}


@@ -0,0 +1,315 @@
// Package cfai provides a client for Cloudflare Workers AI Text-to-Speech models.
//
// The Cloudflare Workers AI REST API is used to run TTS models:
//
// POST https://api.cloudflare.com/client/v4/accounts/{accountID}/ai/run/{model}
// Authorization: Bearer {apiToken}
// Content-Type: application/json
// { "text": "...", "speaker": "luna" }
//
// → 200 audio/mpeg — raw MP3 bytes
//
// Currently supported model: @cf/deepgram/aura-2-en (40 English speakers).
// Voice IDs are prefixed with "cfai:" to distinguish them from Kokoro/pocket-tts
// voices (e.g. "cfai:luna", "cfai:orion").
//
// The API is batch-only (no streaming), so GenerateAudio waits for the full
// response. There is no 100-second Cloudflare proxy timeout because we are
// calling the Cloudflare API directly, not routing through a Cloudflare-proxied
// homelab tunnel.
//
// The aura-2-en model enforces a hard 2 000-character limit per request.
// GenerateAudio transparently splits longer texts into sentence-boundary chunks
// and concatenates the resulting MP3 frames.
package cfai
import (
"bytes"
"context"
"encoding/json"
"fmt"
"io"
"net/http"
"strings"
"time"
)
const (
// DefaultModel is the Cloudflare Workers AI TTS model used by default.
DefaultModel = "@cf/deepgram/aura-2-en"
// voicePrefix is the prefix used to namespace CF AI voice IDs.
voicePrefix = "cfai:"
)
// aura2Speakers is the exhaustive list of speakers supported by aura-2-en.
var aura2Speakers = []string{
"amalthea", "andromeda", "apollo", "arcas", "aries", "asteria",
"athena", "atlas", "aurora", "callista", "cora", "cordelia",
"delia", "draco", "electra", "harmonia", "helena", "hera",
"hermes", "hyperion", "iris", "janus", "juno", "jupiter",
"luna", "mars", "minerva", "neptune", "odysseus", "ophelia",
"orion", "orpheus", "pandora", "phoebe", "pluto", "saturn",
"thalia", "theia", "vesta", "zeus",
}
// femaleSpeakers is the set of aura-2-en speaker names that are female voices.
var femaleSpeakers = map[string]struct{}{
"amalthea": {}, "andromeda": {}, "aries": {}, "asteria": {},
"athena": {}, "aurora": {}, "callista": {}, "cora": {},
"cordelia": {}, "delia": {}, "electra": {}, "harmonia": {},
"helena": {}, "hera": {}, "iris": {}, "juno": {},
"luna": {}, "minerva": {}, "ophelia": {}, "pandora": {},
"phoebe": {}, "thalia": {}, "theia": {}, "vesta": {},
}
// IsCFAIVoice reports whether voice is served by the Cloudflare AI client.
// CF AI voices use the "cfai:" prefix, e.g. "cfai:luna".
func IsCFAIVoice(voice string) bool {
return strings.HasPrefix(voice, voicePrefix)
}
// SpeakerName strips the "cfai:" prefix and returns the bare speaker name.
// If voice is not a CF AI voice the original string is returned unchanged.
func SpeakerName(voice string) string {
return strings.TrimPrefix(voice, voicePrefix)
}
// VoiceID returns the full voice ID (with prefix) for a bare speaker name.
func VoiceID(speaker string) string {
return voicePrefix + speaker
}
// VoiceSampleKey returns the MinIO object key for a CF AI voice sample MP3.
func VoiceSampleKey(voice string) string {
safe := strings.Map(func(r rune) rune {
if (r >= 'a' && r <= 'z') || (r >= 'A' && r <= 'Z') ||
(r >= '0' && r <= '9') || r == '_' || r == '-' {
return r
}
return '_'
}, voice)
return fmt.Sprintf("_voice-samples/%s.mp3", safe)
}
// IsFemale reports whether the given CF AI voice ID (with or without prefix)
// is a female speaker.
func IsFemale(voice string) bool {
speaker := SpeakerName(voice)
_, ok := femaleSpeakers[speaker]
return ok
}
// Speakers returns all available bare speaker names for aura-2-en.
func Speakers() []string {
out := make([]string, len(aura2Speakers))
copy(out, aura2Speakers)
return out
}
// Client is the interface for interacting with Cloudflare Workers AI TTS.
type Client interface {
// GenerateAudio synthesises text using the given voice (e.g. "cfai:luna")
// and returns raw MP3 bytes.
GenerateAudio(ctx context.Context, text, voice string) ([]byte, error)
// StreamAudioMP3 is not natively supported by the CF AI batch API.
// It buffers the full response and returns an io.ReadCloser over the bytes,
// so callers can use it like a stream without special-casing.
StreamAudioMP3(ctx context.Context, text, voice string) (io.ReadCloser, error)
// StreamAudioWAV is not natively supported; the CF AI model returns MP3.
// This method returns the same MP3 bytes wrapped as an io.ReadCloser.
StreamAudioWAV(ctx context.Context, text, voice string) (io.ReadCloser, error)
// ListVoices returns all available voice IDs (with the "cfai:" prefix).
ListVoices(ctx context.Context) ([]string, error)
}
// httpClient is the concrete CF AI HTTP client.
type httpClient struct {
accountID string
apiToken string
model string
http *http.Client
}
// New returns a Client for the given Cloudflare account and API token.
// model defaults to DefaultModel when empty.
func New(accountID, apiToken, model string) Client {
if model == "" {
model = DefaultModel
}
return &httpClient{
accountID: accountID,
apiToken: apiToken,
model: model,
http: &http.Client{Timeout: 5 * time.Minute},
}
}
// GenerateAudio calls the Cloudflare Workers AI TTS endpoint and returns MP3 bytes.
// The aura-2-en model rejects inputs longer than 2 000 characters, so this method
// splits the text into sentence-bounded chunks and concatenates the MP3 responses.
func (c *httpClient) GenerateAudio(ctx context.Context, text, voice string) ([]byte, error) {
if text == "" {
return nil, fmt.Errorf("cfai: empty text")
}
speaker := SpeakerName(voice)
if speaker == "" {
speaker = "luna"
}
chunks := splitText(text, 1800) // stay comfortably under the 2 000-char limit
var combined []byte
for _, chunk := range chunks {
part, err := c.generateChunk(ctx, chunk, speaker)
if err != nil {
return nil, err
}
combined = append(combined, part...)
}
return combined, nil
}
// generateChunk sends a single ≤2 000-character request and returns MP3 bytes.
func (c *httpClient) generateChunk(ctx context.Context, text, speaker string) ([]byte, error) {
body, err := json.Marshal(map[string]any{
"text": text,
"speaker": speaker,
})
if err != nil {
return nil, fmt.Errorf("cfai: marshal request: %w", err)
}
url := fmt.Sprintf("https://api.cloudflare.com/client/v4/accounts/%s/ai/run/%s",
c.accountID, c.model)
req, err := http.NewRequestWithContext(ctx, http.MethodPost, url, bytes.NewReader(body))
if err != nil {
return nil, fmt.Errorf("cfai: build request: %w", err)
}
req.Header.Set("Authorization", "Bearer "+c.apiToken)
req.Header.Set("Content-Type", "application/json")
resp, err := c.http.Do(req)
if err != nil {
return nil, fmt.Errorf("cfai: request: %w", err)
}
defer resp.Body.Close()
if resp.StatusCode != http.StatusOK {
body, _ := io.ReadAll(resp.Body)
return nil, fmt.Errorf("cfai: server returned %d: %s", resp.StatusCode, strings.TrimSpace(string(body)))
}
mp3, err := io.ReadAll(resp.Body)
if err != nil {
return nil, fmt.Errorf("cfai: read response: %w", err)
}
return mp3, nil
}
// splitText splits src into chunks of at most maxChars characters each.
// It tries to break at paragraph boundaries first, then at sentence-ending
// punctuation (. ! ?), and falls back to the nearest space.
func splitText(src string, maxChars int) []string {
if len(src) <= maxChars {
return []string{src}
}
var chunks []string
remaining := src
for len(remaining) > 0 {
if len(remaining) <= maxChars {
chunks = append(chunks, strings.TrimSpace(remaining))
break
}
// Search window: the first maxChars characters of remaining.
// Byte-length comparisons are fine for ASCII; the rune-aware trim below
// ensures a multi-byte character is never split mid-sequence.
window := remaining
if len(window) > maxChars {
// Trim to maxChars runes (not bytes), ensuring we don't split a multi-byte char.
window = runeSlice(remaining, maxChars)
}
cut := -1
// 1. Prefer paragraph break (\n\n or \n).
if i := strings.LastIndex(window, "\n\n"); i > 0 {
cut = i + 2
} else if i := strings.LastIndex(window, "\n"); i > 0 {
cut = i + 1
}
// 2. Fall back to sentence-ending punctuation followed by a space.
if cut < 0 {
for _, punct := range []string{". ", "! ", "? ", ".\n", "!\n", "?\n"} {
if i := strings.LastIndex(window, punct); i > 0 {
candidate := i + len(punct)
if cut < 0 || candidate > cut {
cut = candidate
}
}
}
}
// 3. Last resort: nearest space.
if cut < 0 {
if i := strings.LastIndex(window, " "); i > 0 {
cut = i + 1
}
}
// 4. Hard cut at maxChars runes if no boundary found.
if cut < 0 {
cut = len(window)
}
chunk := strings.TrimSpace(remaining[:cut])
if chunk != "" {
chunks = append(chunks, chunk)
}
remaining = remaining[cut:]
}
return chunks
}
// runeSlice returns the first n runes of s as a string.
func runeSlice(s string, n int) string {
count := 0
for i := range s {
if count == n {
return s[:i]
}
count++
}
return s
}
// StreamAudioMP3 generates audio and wraps the MP3 bytes as an io.ReadCloser.
func (c *httpClient) StreamAudioMP3(ctx context.Context, text, voice string) (io.ReadCloser, error) {
mp3, err := c.GenerateAudio(ctx, text, voice)
if err != nil {
return nil, err
}
return io.NopCloser(bytes.NewReader(mp3)), nil
}
// StreamAudioWAV generates audio (MP3) and wraps it as an io.ReadCloser.
// Note: the CF AI aura-2-en model returns MP3 regardless of the method name.
func (c *httpClient) StreamAudioWAV(ctx context.Context, text, voice string) (io.ReadCloser, error) {
return c.StreamAudioMP3(ctx, text, voice)
}
// ListVoices returns all available CF AI voice IDs (with the "cfai:" prefix).
func (c *httpClient) ListVoices(_ context.Context) ([]string, error) {
ids := make([]string, len(aura2Speakers))
for i, s := range aura2Speakers {
ids[i] = VoiceID(s)
}
return ids, nil
}


@@ -0,0 +1,475 @@
// Image generation via Cloudflare Workers AI text-to-image models.
//
// API reference:
//
// POST https://api.cloudflare.com/client/v4/accounts/{accountID}/ai/run/{model}
// Authorization: Bearer {apiToken}
//
// FLUX.2 models (flux-2-dev, flux-2-klein-4b, flux-2-klein-9b):
//
// Content-Type: multipart/form-data
// Fields: prompt, num_steps, width, height, guidance, image_b64 (optional)
// Response: { "image": "<base64 JPEG>" }
//
// Other models (flux-1-schnell, SDXL, SD 1.5):
//
// Content-Type: application/json
// Body: { "prompt": "...", "num_steps": 20 }
// Response: { "image": "<base64>" } or raw bytes depending on model
//
// Reference-image request (FLUX.2):
//
// Same multipart form; include image_b64 field with base64-encoded reference.
//
// Reference-image request (SD img2img):
//
// JSON body: { "prompt": "...", "image": [r,g,b,a,...], "strength": 0.75 }
//
// Recommended models for LibNovel:
// - Book covers (no reference): flux-2-dev, flux-2-klein-9b, lucid-origin
// - Chapter images (speed): flux-2-klein-4b, flux-1-schnell
// - With reference image: flux-2-dev, flux-2-klein-9b, sd-v1-5-img2img
package cfai
import (
"bytes"
"context"
"encoding/base64"
"encoding/json"
"fmt"
"image"
"image/draw"
"image/jpeg"
_ "image/jpeg" // register JPEG decoder
"image/png"
_ "image/png" // register PNG decoder
"io"
"mime/multipart"
"net/http"
"strings"
"time"
)
// ImageModel identifies a Cloudflare Workers AI text-to-image model.
type ImageModel string
const (
// ImageModelFlux2Dev — best quality, multi-reference. Recommended for covers.
ImageModelFlux2Dev ImageModel = "@cf/black-forest-labs/flux-2-dev"
// ImageModelFlux2Klein9B — 9B params, multi-reference. Good for covers.
ImageModelFlux2Klein9B ImageModel = "@cf/black-forest-labs/flux-2-klein-9b"
// ImageModelFlux2Klein4B — ultra-fast, unified gen+edit. Recommended for chapters.
ImageModelFlux2Klein4B ImageModel = "@cf/black-forest-labs/flux-2-klein-4b"
// ImageModelFlux1Schnell — fastest, text-only. Good for quick illustrations.
ImageModelFlux1Schnell ImageModel = "@cf/black-forest-labs/flux-1-schnell"
// ImageModelSDXLLightning — fast 1024px generation.
ImageModelSDXLLightning ImageModel = "@cf/bytedance/stable-diffusion-xl-lightning"
// ImageModelSD15Img2Img — explicit img2img with flat RGBA reference.
ImageModelSD15Img2Img ImageModel = "@cf/runwayml/stable-diffusion-v1-5-img2img"
// ImageModelSDXLBase — Stability AI SDXL base.
ImageModelSDXLBase ImageModel = "@cf/stabilityai/stable-diffusion-xl-base-1.0"
// ImageModelLucidOrigin — Leonardo AI; strong prompt adherence.
ImageModelLucidOrigin ImageModel = "@cf/leonardo/lucid-origin"
// ImageModelPhoenix10 — Leonardo AI; accurate text rendering.
ImageModelPhoenix10 ImageModel = "@cf/leonardo/phoenix-1.0"
// DefaultImageModel is the default model for book-cover generation.
DefaultImageModel = ImageModelFlux2Dev
)
// ImageModelInfo describes a single image generation model.
type ImageModelInfo struct {
ID string `json:"id"`
Label string `json:"label"`
Provider string `json:"provider"`
SupportsRef bool `json:"supports_ref"`
RecommendedFor []string `json:"recommended_for"` // "cover" and/or "chapter"
Description string `json:"description"`
}
// AllImageModels returns metadata about every supported image model.
func AllImageModels() []ImageModelInfo {
return []ImageModelInfo{
{
ID: string(ImageModelFlux2Dev), Label: "FLUX.2 Dev", Provider: "Black Forest Labs",
SupportsRef: true, RecommendedFor: []string{"cover"},
Description: "Best quality; multi-reference editing. Recommended for book covers.",
},
{
ID: string(ImageModelFlux2Klein9B), Label: "FLUX.2 Klein 9B", Provider: "Black Forest Labs",
SupportsRef: true, RecommendedFor: []string{"cover"},
Description: "9B parameters with multi-reference support.",
},
{
ID: string(ImageModelFlux2Klein4B), Label: "FLUX.2 Klein 4B", Provider: "Black Forest Labs",
SupportsRef: true, RecommendedFor: []string{"chapter"},
Description: "Ultra-fast unified gen+edit. Recommended for chapter images.",
},
{
ID: string(ImageModelFlux1Schnell), Label: "FLUX.1 Schnell", Provider: "Black Forest Labs",
SupportsRef: false, RecommendedFor: []string{"chapter"},
Description: "Fastest inference. Good for quick chapter illustrations.",
},
{
ID: string(ImageModelSDXLLightning), Label: "SDXL Lightning", Provider: "ByteDance",
SupportsRef: false, RecommendedFor: []string{"chapter"},
Description: "Lightning-fast 1024px images in a few steps.",
},
{
ID: string(ImageModelSD15Img2Img), Label: "SD 1.5 img2img", Provider: "RunwayML",
SupportsRef: true, RecommendedFor: []string{"cover", "chapter"},
Description: "Explicit img2img: generates from a reference image + prompt.",
},
{
ID: string(ImageModelSDXLBase), Label: "SDXL Base 1.0", Provider: "Stability AI",
SupportsRef: false, RecommendedFor: []string{"cover"},
Description: "Stable Diffusion XL base model.",
},
{
ID: string(ImageModelLucidOrigin), Label: "Lucid Origin", Provider: "Leonardo AI",
SupportsRef: false, RecommendedFor: []string{"cover"},
Description: "Highly prompt-responsive; strong graphic design and HD renders.",
},
{
ID: string(ImageModelPhoenix10), Label: "Phoenix 1.0", Provider: "Leonardo AI",
SupportsRef: false, RecommendedFor: []string{"cover"},
Description: "Exceptional prompt adherence; accurate text rendering.",
},
}
}
// ImageRequest is the input to GenerateImage / GenerateImageFromReference.
type ImageRequest struct {
// Prompt is the text description of the desired image.
Prompt string
// Model is the CF Workers AI model. Defaults to DefaultImageModel when empty.
Model ImageModel
// NumSteps controls inference quality (default 20). Range: 1–20.
NumSteps int
// Width and Height in pixels. 0 = model default (typically 1024x1024).
Width, Height int
// Guidance controls prompt adherence (default 7.5).
Guidance float64
// Strength for img2img: 0.0 = copy reference, 1.0 = ignore reference (default 0.75).
Strength float64
}
// ImageGenClient generates images via Cloudflare Workers AI.
type ImageGenClient interface {
// GenerateImage creates an image from a text prompt only.
// Returns raw PNG bytes.
GenerateImage(ctx context.Context, req ImageRequest) ([]byte, error)
// GenerateImageFromReference creates an image from a text prompt + reference image.
// refImage should be PNG or JPEG bytes. Returns raw PNG bytes.
GenerateImageFromReference(ctx context.Context, req ImageRequest, refImage []byte) ([]byte, error)
// Models returns metadata about all supported image models.
Models() []ImageModelInfo
}
// imageGenHTTPClient is the concrete CF AI image generation client.
type imageGenHTTPClient struct {
accountID string
apiToken string
http *http.Client
}
// NewImageGen returns an ImageGenClient for the given Cloudflare account.
func NewImageGen(accountID, apiToken string) ImageGenClient {
return &imageGenHTTPClient{
accountID: accountID,
apiToken: apiToken,
http: &http.Client{Timeout: 5 * time.Minute},
}
}
// requiresMultipart reports whether the model requires a multipart/form-data
// request body instead of JSON. FLUX.2 models on Cloudflare Workers AI changed
// their API to require multipart and return {"image":"<base64>"} instead of
// raw image bytes.
func requiresMultipart(model ImageModel) bool {
switch model {
case ImageModelFlux2Dev, ImageModelFlux2Klein4B, ImageModelFlux2Klein9B:
return true
default:
return false
}
}
// GenerateImage generates an image from text only.
func (c *imageGenHTTPClient) GenerateImage(ctx context.Context, req ImageRequest) ([]byte, error) {
req = applyImageDefaults(req)
// FLUX.2 multipart models use "steps"; JSON models use "num_steps".
stepsKey := "num_steps"
if requiresMultipart(req.Model) {
stepsKey = "steps"
}
fields := map[string]any{
"prompt": req.Prompt,
stepsKey: req.NumSteps,
}
if req.Width > 0 {
fields["width"] = req.Width
}
if req.Height > 0 {
fields["height"] = req.Height
}
if req.Guidance > 0 {
fields["guidance"] = req.Guidance
}
return c.callImageAPI(ctx, req.Model, fields, nil)
}
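For reference, the steps-key selection above can be sketched as a standalone helper. This is illustrative only (`buildImageFields` is not part of the package API); it reproduces the rule that FLUX.2 multipart models take `steps` while JSON models take `num_steps`:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// buildImageFields mirrors GenerateImage's field-map construction:
// FLUX.2 models use a multipart "steps" field, JSON models use "num_steps".
func buildImageFields(model, prompt string, steps, width, height int) map[string]any {
	isMultipart := map[string]bool{
		"@cf/black-forest-labs/flux-2-dev":      true,
		"@cf/black-forest-labs/flux-2-klein-4b": true,
		"@cf/black-forest-labs/flux-2-klein-9b": true,
	}[model]
	stepsKey := "num_steps"
	if isMultipart {
		stepsKey = "steps"
	}
	fields := map[string]any{"prompt": prompt, stepsKey: steps}
	if width > 0 {
		fields["width"] = width
	}
	if height > 0 {
		fields["height"] = height
	}
	return fields
}

func main() {
	f := buildImageFields("@cf/black-forest-labs/flux-2-dev", "a castle at dusk", 20, 768, 1024)
	b, _ := json.Marshal(f) // map keys are marshalled in sorted order
	fmt.Println(string(b))
	// {"height":1024,"prompt":"a castle at dusk","steps":20,"width":768}
}
```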
// refImageMaxDim is the maximum dimension (width or height) for reference images
// sent to Cloudflare Workers AI. CF's JSON body limit is ~4 MB; a 768px JPEG
// stays well under that while preserving enough detail for img2img guidance.
const refImageMaxDim = 768
// GenerateImageFromReference generates an image from a text prompt + reference image.
func (c *imageGenHTTPClient) GenerateImageFromReference(ctx context.Context, req ImageRequest, refImage []byte) ([]byte, error) {
if len(refImage) == 0 {
return c.GenerateImage(ctx, req)
}
req = applyImageDefaults(req)
// Shrink the reference image if it exceeds the safe payload size.
refImage = resizeRefImage(refImage, refImageMaxDim)
// FLUX.2 multipart models use "steps"; JSON models use "num_steps".
stepsKey := "num_steps"
if requiresMultipart(req.Model) {
stepsKey = "steps"
}
fields := map[string]any{
"prompt": req.Prompt,
stepsKey: req.NumSteps,
}
if req.Width > 0 {
fields["width"] = req.Width
}
if req.Height > 0 {
fields["height"] = req.Height
}
if req.Guidance > 0 {
fields["guidance"] = req.Guidance
}
if requiresMultipart(req.Model) {
// FLUX.2: reference image sent as base64 form field "image_b64".
fields["image_b64"] = base64.StdEncoding.EncodeToString(refImage)
if req.Strength > 0 {
fields["strength"] = req.Strength
}
return c.callImageAPI(ctx, req.Model, fields, nil)
}
if req.Model == ImageModelSD15Img2Img {
pixels, err := decodeImageToRGBA(refImage)
if err != nil {
return nil, fmt.Errorf("cfai/image: decode reference: %w", err)
}
strength := req.Strength
if strength <= 0 {
strength = 0.75
}
fields["image"] = pixels
fields["strength"] = strength
return c.callImageAPI(ctx, req.Model, fields, nil)
}
// Other FLUX models: image_b64 JSON field.
fields["image_b64"] = base64.StdEncoding.EncodeToString(refImage)
if req.Strength > 0 {
fields["strength"] = req.Strength
}
return c.callImageAPI(ctx, req.Model, fields, nil)
}
// Models returns all supported image model metadata.
func (c *imageGenHTTPClient) Models() []ImageModelInfo {
return AllImageModels()
}
func (c *imageGenHTTPClient) callImageAPI(ctx context.Context, model ImageModel, fields map[string]any, _ []byte) ([]byte, error) {
cfURL := fmt.Sprintf("https://api.cloudflare.com/client/v4/accounts/%s/ai/run/%s",
c.accountID, string(model))
var (
bodyReader io.Reader
contentType string
)
if requiresMultipart(model) {
// Build a multipart/form-data body from the fields map.
// All values are serialised to their string representation.
var buf bytes.Buffer
mw := multipart.NewWriter(&buf)
for k, v := range fields {
var strVal string
switch tv := v.(type) {
case string:
strVal = tv
default:
encoded, merr := json.Marshal(tv)
if merr != nil {
return nil, fmt.Errorf("cfai/image: marshal field %q: %w", k, merr)
}
strVal = strings.Trim(string(encoded), `"`)
}
if werr := mw.WriteField(k, strVal); werr != nil {
return nil, fmt.Errorf("cfai/image: write field %q: %w", k, werr)
}
}
if cerr := mw.Close(); cerr != nil {
return nil, fmt.Errorf("cfai/image: close multipart writer: %w", cerr)
}
bodyReader = &buf
contentType = mw.FormDataContentType()
} else {
encoded, merr := json.Marshal(fields)
if merr != nil {
return nil, fmt.Errorf("cfai/image: marshal: %w", merr)
}
bodyReader = bytes.NewReader(encoded)
contentType = "application/json"
}
req, err := http.NewRequestWithContext(ctx, http.MethodPost, cfURL, bodyReader)
if err != nil {
return nil, fmt.Errorf("cfai/image: build request: %w", err)
}
req.Header.Set("Authorization", "Bearer "+c.apiToken)
req.Header.Set("Content-Type", contentType)
resp, err := c.http.Do(req)
if err != nil {
return nil, fmt.Errorf("cfai/image: http: %w", err)
}
defer resp.Body.Close()
respBody, err := io.ReadAll(resp.Body)
if err != nil {
return nil, fmt.Errorf("cfai/image: read response: %w", err)
}
if resp.StatusCode != http.StatusOK {
msg := string(respBody)
if len(msg) > 300 {
msg = msg[:300]
}
return nil, fmt.Errorf("cfai/image: model %s returned %d: %s", model, resp.StatusCode, msg)
}
// Try to parse as {"image": "<base64>"} first (FLUX.2 and newer models).
// Fall back to treating the body as raw image bytes for legacy models.
var jsonResp struct {
Image string `json:"image"`
}
if jerr := json.Unmarshal(respBody, &jsonResp); jerr == nil && jsonResp.Image != "" {
imgBytes, decErr := base64.StdEncoding.DecodeString(jsonResp.Image)
if decErr != nil {
// Try raw (no padding) base64
imgBytes, decErr = base64.RawStdEncoding.DecodeString(jsonResp.Image)
if decErr != nil {
return nil, fmt.Errorf("cfai/image: decode base64 response: %w", decErr)
}
}
return imgBytes, nil
}
// Legacy: model returned raw image bytes directly.
return respBody, nil
}
func applyImageDefaults(req ImageRequest) ImageRequest {
if req.Model == "" {
req.Model = DefaultImageModel
}
if req.NumSteps <= 0 {
req.NumSteps = 20
}
return req
}
// resizeRefImage down-scales an image so that its longest side is at most maxDim
// pixels, then re-encodes it as JPEG (quality 85). If the image is already small
// enough, or if decoding fails, the original bytes are returned unchanged.
// This keeps the JSON payload well under Cloudflare Workers AI's 4 MB body limit.
func resizeRefImage(data []byte, maxDim int) []byte {
src, format, err := image.Decode(bytes.NewReader(data))
if err != nil {
return data
}
b := src.Bounds()
w, h := b.Dx(), b.Dy()
longest := w
if h > longest {
longest = h
}
if longest <= maxDim {
return data // already fits
}
// Compute target dimensions preserving aspect ratio.
scale := float64(maxDim) / float64(longest)
newW := int(float64(w)*scale + 0.5)
newH := int(float64(h)*scale + 0.5)
if newW < 1 {
newW = 1
}
if newH < 1 {
newH = 1
}
// Nearest-neighbour downsample (no extra deps, sufficient for reference guidance).
dst := image.NewRGBA(image.Rect(0, 0, newW, newH))
for y := 0; y < newH; y++ {
for x := 0; x < newW; x++ {
srcX := b.Min.X + int(float64(x)/scale)
srcY := b.Min.Y + int(float64(y)/scale)
draw.Draw(dst, image.Rect(x, y, x+1, y+1), src, image.Pt(srcX, srcY), draw.Src)
}
}
var buf bytes.Buffer
if format == "jpeg" {
if encErr := jpeg.Encode(&buf, dst, &jpeg.Options{Quality: 85}); encErr != nil {
return data
}
} else {
if encErr := png.Encode(&buf, dst); encErr != nil {
return data
}
}
return buf.Bytes()
}
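The aspect-ratio arithmetic in resizeRefImage is worth seeing on concrete numbers. A standalone sketch of just the dimension calculation (`targetDims` is an illustrative name, not part of the package):

```go
package main

import "fmt"

// targetDims reproduces resizeRefImage's math: scale the longest side down to
// maxDim, round the other side to the nearest pixel, and clamp at 1.
func targetDims(w, h, maxDim int) (int, int) {
	longest := w
	if h > longest {
		longest = h
	}
	if longest <= maxDim {
		return w, h // already fits
	}
	scale := float64(maxDim) / float64(longest)
	newW := int(float64(w)*scale + 0.5)
	newH := int(float64(h)*scale + 0.5)
	if newW < 1 {
		newW = 1
	}
	if newH < 1 {
		newH = 1
	}
	return newW, newH
}

func main() {
	w, h := targetDims(4000, 3000, 768)
	fmt.Println(w, h) // 768 576
}
```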
// decodeImageToRGBA decodes PNG/JPEG bytes to a flat []uint8 RGBA pixel array
// required by the stable-diffusion-v1-5-img2img model.
func decodeImageToRGBA(data []byte) ([]uint8, error) {
img, _, err := image.Decode(bytes.NewReader(data))
if err != nil {
return nil, fmt.Errorf("decode image: %w", err)
}
bounds := img.Bounds()
w := bounds.Max.X - bounds.Min.X
h := bounds.Max.Y - bounds.Min.Y
pixels := make([]uint8, w*h*4)
idx := 0
for y := bounds.Min.Y; y < bounds.Max.Y; y++ {
for x := bounds.Min.X; x < bounds.Max.X; x++ {
r, g, b, a := img.At(x, y).RGBA()
pixels[idx] = uint8(r >> 8)
pixels[idx+1] = uint8(g >> 8)
pixels[idx+2] = uint8(b >> 8)
pixels[idx+3] = uint8(a >> 8)
idx += 4
}
}
return pixels, nil
}


@@ -0,0 +1,253 @@
// Text generation via Cloudflare Workers AI LLM models.
//
// API reference:
//
// POST https://api.cloudflare.com/client/v4/accounts/{accountID}/ai/run/{model}
// Authorization: Bearer {apiToken}
// Content-Type: application/json
//
// Request body (all models):
//
// { "messages": [{"role":"system","content":"..."},{"role":"user","content":"..."}] }
//
// Response (wrapped):
//
// { "result": { "response": "..." }, "success": true }
package cfai
import (
"bytes"
"context"
"encoding/json"
"fmt"
"io"
"net/http"
"time"
)
// TextModel identifies a Cloudflare Workers AI text generation model.
type TextModel string
const (
// TextModelGemma4 — Google Gemma 4, 256k context.
TextModelGemma4 TextModel = "@cf/google/gemma-4-26b-a4b-it"
// TextModelLlama4Scout — Meta Llama 4 Scout 17B, multimodal.
TextModelLlama4Scout TextModel = "@cf/meta/llama-4-scout-17b-16e-instruct"
// TextModelLlama33_70B — Meta Llama 3.3 70B, fast fp8.
TextModelLlama33_70B TextModel = "@cf/meta/llama-3.3-70b-instruct-fp8-fast"
// TextModelQwen3_30B — Qwen3 30B MoE, function calling.
TextModelQwen3_30B TextModel = "@cf/qwen/qwen3-30b-a3b-fp8"
// TextModelMistralSmall — Mistral Small 3.1 24B, 128k context.
TextModelMistralSmall TextModel = "@cf/mistralai/mistral-small-3.1-24b-instruct"
// TextModelQwQ32B — Qwen QwQ 32B reasoning model.
TextModelQwQ32B TextModel = "@cf/qwen/qwq-32b"
// TextModelDeepSeekR1 — DeepSeek R1 distill Qwen 32B.
TextModelDeepSeekR1 TextModel = "@cf/deepseek-ai/deepseek-r1-distill-qwen-32b"
// TextModelGemma3_12B — Google Gemma 3 12B, 80k context.
TextModelGemma3_12B TextModel = "@cf/google/gemma-3-12b-it"
// TextModelGPTOSS120B — OpenAI gpt-oss-120b, high reasoning.
TextModelGPTOSS120B TextModel = "@cf/openai/gpt-oss-120b"
// TextModelGPTOSS20B — OpenAI gpt-oss-20b, lower latency.
TextModelGPTOSS20B TextModel = "@cf/openai/gpt-oss-20b"
// TextModelNemotron3 — NVIDIA Nemotron 3 120B, agentic.
TextModelNemotron3 TextModel = "@cf/nvidia/nemotron-3-120b-a12b"
// TextModelLlama32_3B — Meta Llama 3.2 3B, lightweight.
TextModelLlama32_3B TextModel = "@cf/meta/llama-3.2-3b-instruct"
// DefaultTextModel is the default model used when none is specified.
DefaultTextModel = TextModelLlama4Scout
)
// TextModelInfo describes a single text generation model.
type TextModelInfo struct {
ID string `json:"id"`
Label string `json:"label"`
Provider string `json:"provider"`
ContextSize int `json:"context_size"` // max context in tokens
Description string `json:"description"`
}
// AllTextModels returns metadata about every supported text generation model.
func AllTextModels() []TextModelInfo {
return []TextModelInfo{
{
ID: string(TextModelGemma4), Label: "Gemma 4 26B", Provider: "Google",
ContextSize: 256000,
Description: "Google's most intelligent open model family. 256k context, function calling.",
},
{
ID: string(TextModelLlama4Scout), Label: "Llama 4 Scout 17B", Provider: "Meta",
ContextSize: 131000,
Description: "Natively multimodal, 16 experts. Good all-purpose model with function calling.",
},
{
ID: string(TextModelLlama33_70B), Label: "Llama 3.3 70B (fp8 fast)", Provider: "Meta",
ContextSize: 24000,
Description: "Llama 3.3 70B quantized to fp8 for speed. Excellent instruction following.",
},
{
ID: string(TextModelQwen3_30B), Label: "Qwen3 30B MoE", Provider: "Qwen",
ContextSize: 32768,
Description: "MoE architecture with strong reasoning and instruction following.",
},
{
ID: string(TextModelMistralSmall), Label: "Mistral Small 3.1 24B", Provider: "MistralAI",
ContextSize: 128000,
Description: "Strong text performance with 128k context and function calling.",
},
{
ID: string(TextModelQwQ32B), Label: "QwQ 32B (reasoning)", Provider: "Qwen",
ContextSize: 24000,
Description: "Reasoning model — thinks before answering. Slower but more accurate.",
},
{
ID: string(TextModelDeepSeekR1), Label: "DeepSeek R1 32B", Provider: "DeepSeek",
ContextSize: 80000,
Description: "R1-distilled reasoning model. Outperforms o1-mini on many benchmarks.",
},
{
ID: string(TextModelGemma3_12B), Label: "Gemma 3 12B", Provider: "Google",
ContextSize: 80000,
Description: "Multimodal, 128k context, multilingual (140+ languages).",
},
{
ID: string(TextModelGPTOSS120B), Label: "GPT-OSS 120B", Provider: "OpenAI",
ContextSize: 128000,
Description: "OpenAI open-weight model for production, general purpose, high reasoning.",
},
{
ID: string(TextModelGPTOSS20B), Label: "GPT-OSS 20B", Provider: "OpenAI",
ContextSize: 128000,
Description: "OpenAI open-weight model for lower latency and specialized use cases.",
},
{
ID: string(TextModelNemotron3), Label: "Nemotron 3 120B", Provider: "NVIDIA",
ContextSize: 256000,
Description: "Hybrid MoE with leading accuracy for multi-agent applications.",
},
{
ID: string(TextModelLlama32_3B), Label: "Llama 3.2 3B", Provider: "Meta",
ContextSize: 80000,
Description: "Lightweight model for simple tasks. Fast and cheap.",
},
}
}
// TextMessage is a single message in a chat conversation.
type TextMessage struct {
Role string `json:"role"` // "system" or "user"
Content string `json:"content"` // message text
}
// TextRequest is the input to Generate.
type TextRequest struct {
// Model is the CF Workers AI model ID. Defaults to DefaultTextModel when empty.
Model TextModel
// Messages is the conversation history (system + user messages).
Messages []TextMessage
// MaxTokens limits the output length (0 = model default).
MaxTokens int
}
// TextGenClient generates text via Cloudflare Workers AI LLM models.
type TextGenClient interface {
// Generate sends a chat-style request and returns the model's response text.
Generate(ctx context.Context, req TextRequest) (string, error)
// Models returns metadata about all supported text generation models.
Models() []TextModelInfo
}
// textGenHTTPClient is the concrete CF AI text generation client.
type textGenHTTPClient struct {
accountID string
apiToken string
http *http.Client
}
// NewTextGen returns a TextGenClient for the given Cloudflare account.
func NewTextGen(accountID, apiToken string) TextGenClient {
return &textGenHTTPClient{
accountID: accountID,
apiToken: apiToken,
http: &http.Client{Timeout: 5 * time.Minute},
}
}
// Generate sends messages to the model and returns the response text.
func (c *textGenHTTPClient) Generate(ctx context.Context, req TextRequest) (string, error) {
if req.Model == "" {
req.Model = DefaultTextModel
}
body := map[string]any{
"messages": req.Messages,
}
if req.MaxTokens > 0 {
body["max_tokens"] = req.MaxTokens
}
encoded, err := json.Marshal(body)
if err != nil {
return "", fmt.Errorf("cfai/text: marshal: %w", err)
}
url := fmt.Sprintf("https://api.cloudflare.com/client/v4/accounts/%s/ai/run/%s",
c.accountID, string(req.Model))
httpReq, err := http.NewRequestWithContext(ctx, http.MethodPost, url, bytes.NewReader(encoded))
if err != nil {
return "", fmt.Errorf("cfai/text: build request: %w", err)
}
httpReq.Header.Set("Authorization", "Bearer "+c.apiToken)
httpReq.Header.Set("Content-Type", "application/json")
resp, err := c.http.Do(httpReq)
if err != nil {
return "", fmt.Errorf("cfai/text: http: %w", err)
}
defer resp.Body.Close()
if resp.StatusCode != http.StatusOK {
errBody, _ := io.ReadAll(resp.Body)
msg := string(errBody)
if len(msg) > 300 {
msg = msg[:300]
}
return "", fmt.Errorf("cfai/text: model %s returned %d: %s", req.Model, resp.StatusCode, msg)
}
// CF AI wraps responses: { "result": { "response": "..." }, "success": true }
// Some models (e.g. Llama 4 Scout) return response as an array:
// { "result": { "response": [{"generated_text":"..."}] } }
var wrapper struct {
Result struct {
Response json.RawMessage `json:"response"`
} `json:"result"`
Success bool `json:"success"`
Errors []string `json:"errors"`
}
if err := json.NewDecoder(resp.Body).Decode(&wrapper); err != nil {
return "", fmt.Errorf("cfai/text: decode response: %w", err)
}
if !wrapper.Success {
return "", fmt.Errorf("cfai/text: model %s error: %v", req.Model, wrapper.Errors)
}
// Try plain string first.
var text string
if err := json.Unmarshal(wrapper.Result.Response, &text); err == nil {
return text, nil
}
// Fall back: array of objects with a "generated_text" field.
var arr []struct {
GeneratedText string `json:"generated_text"`
}
if err := json.Unmarshal(wrapper.Result.Response, &arr); err == nil && len(arr) > 0 {
return arr[0].GeneratedText, nil
}
return "", fmt.Errorf("cfai/text: model %s: unrecognised response shape: %s", req.Model, wrapper.Result.Response)
}
// Models returns all supported text generation model metadata.
func (c *textGenHTTPClient) Models() []TextModelInfo {
return AllTextModels()
}


@@ -66,6 +66,18 @@ type PocketTTS struct {
URL string
}
// CFAI holds credentials for Cloudflare Workers AI TTS.
type CFAI struct {
// AccountID is the Cloudflare account ID.
// An empty string disables CF AI generation.
AccountID string
// APIToken is a Workers AI API token with Workers AI Read+Edit permissions.
APIToken string
// Model is the Workers AI TTS model ID.
// Defaults to "@cf/deepgram/aura-2-en" when empty.
Model string
}
// LibreTranslate holds connection settings for a self-hosted LibreTranslate instance.
type LibreTranslate struct {
// URL is the base URL of the LibreTranslate instance, e.g. https://translate.libnovel.cc
@@ -80,6 +92,10 @@ type LibreTranslate struct {
type HTTP struct {
// Addr is the listen address, e.g. ":8080"
Addr string
// AdminToken is the bearer token required for all /api/admin/* endpoints.
// Set via BACKEND_ADMIN_TOKEN. When empty, admin endpoints are unprotected —
// only acceptable when the backend is unreachable from the public internet.
AdminToken string
}
// Meilisearch holds connection settings for the Meilisearch full-text search service.
@@ -111,6 +127,19 @@ type Redis struct {
Password string
}
// VAPID holds Web Push VAPID key pair for browser push notifications.
// Generate a pair once with: go run ./cmd/genkeys (or use the web-push CLI).
// The public key is exposed via GET /api/push-subscriptions/vapid-public-key
// and embedded in the SvelteKit app via PUBLIC_VAPID_PUBLIC_KEY.
type VAPID struct {
// PublicKey is the base64url-encoded VAPID public key (65 bytes, uncompressed EC P-256).
PublicKey string
// PrivateKey is the base64url-encoded VAPID private key (32 bytes).
PrivateKey string
// Subject is the mailto: or https: URL used as the VAPID subscriber contact.
Subject string
}
// Runner holds settings specific to the runner/worker binary.
type Runner struct {
// PollInterval is how often the runner checks PocketBase for pending tasks.
@@ -153,12 +182,14 @@ type Config struct {
MinIO MinIO
Kokoro Kokoro
PocketTTS PocketTTS
CFAI CFAI
LibreTranslate LibreTranslate
HTTP HTTP
Runner Runner
Meilisearch Meilisearch
Valkey Valkey
Redis Redis
VAPID VAPID
// LogLevel is one of "debug", "info", "warn", "error".
LogLevel string
}
@@ -203,13 +234,20 @@ func Load() Config {
URL: envOr("POCKET_TTS_URL", ""),
},
CFAI: CFAI{
AccountID: envOr("CFAI_ACCOUNT_ID", ""),
APIToken: envOr("CFAI_API_TOKEN", ""),
Model: envOr("CFAI_TTS_MODEL", ""),
},
LibreTranslate: LibreTranslate{
URL: envOr("LIBRETRANSLATE_URL", ""),
APIKey: envOr("LIBRETRANSLATE_API_KEY", ""),
},
HTTP: HTTP{
Addr: envOr("BACKEND_HTTP_ADDR", ":8080"),
AdminToken: envOr("BACKEND_ADMIN_TOKEN", ""),
},
Runner: Runner{
@@ -239,6 +277,12 @@ func Load() Config {
Addr: envOr("REDIS_ADDR", ""),
Password: envOr("REDIS_PASSWORD", ""),
},
VAPID: VAPID{
PublicKey: envOr("VAPID_PUBLIC_KEY", ""),
PrivateKey: envOr("VAPID_PRIVATE_KEY", ""),
Subject: envOr("VAPID_SUBJECT", "mailto:admin@libnovel.cc"),
},
}
}


@@ -24,6 +24,9 @@ type BookMeta struct {
// updated in PocketBase. Populated on read; not sent on write (PocketBase
// manages its own updated field).
MetaUpdated int64 `json:"meta_updated,omitempty"`
// Archived is true when the book has been soft-deleted by an admin.
// Archived books are excluded from all public search and catalogue responses.
Archived bool `json:"archived,omitempty"`
}
// CatalogueEntry is a lightweight book reference returned by catalogue pages.
@@ -123,6 +126,8 @@ type ScrapeTask struct {
// ScrapeResult is the outcome reported by the runner after finishing a ScrapeTask.
type ScrapeResult struct {
// Slug is the book slug that was scraped. Empty for catalogue tasks.
Slug string `json:"slug,omitempty"`
BooksFound int `json:"books_found"`
ChaptersScraped int `json:"chapters_scraped"`
ChaptersSkipped int `json:"chapters_skipped"`
@@ -169,3 +174,61 @@ type TranslationResult struct {
ObjectKey string `json:"object_key,omitempty"`
ErrorMessage string `json:"error_message,omitempty"`
}
// ImportTask represents a PDF/EPUB import job stored in PocketBase.
type ImportTask struct {
ID string `json:"id"`
Slug string `json:"slug"` // derived from filename
Title string `json:"title"`
FileName string `json:"file_name"`
FileType string `json:"file_type"` // "pdf" or "epub"
ObjectKey string `json:"object_key,omitempty"` // MinIO path to uploaded file
ChaptersKey string `json:"chapters_key,omitempty"` // MinIO path to pre-parsed chapters JSON
Author string `json:"author,omitempty"`
CoverURL string `json:"cover_url,omitempty"`
Genres []string `json:"genres,omitempty"`
Summary string `json:"summary,omitempty"`
BookStatus string `json:"book_status,omitempty"` // "ongoing" | "completed" | "hiatus"
WorkerID string `json:"worker_id,omitempty"`
InitiatorUserID string `json:"initiator_user_id,omitempty"` // PocketBase user ID who submitted the import
Status TaskStatus `json:"status"`
ChaptersDone int `json:"chapters_done"`
ChaptersTotal int `json:"chapters_total"`
ErrorMessage string `json:"error_message,omitempty"`
Started time.Time `json:"started"`
Finished time.Time `json:"finished,omitempty"`
}
// ImportResult is the outcome reported by the runner after finishing an ImportTask.
type ImportResult struct {
Slug string `json:"slug,omitempty"`
ChaptersImported int `json:"chapters_imported"`
ErrorMessage string `json:"error_message,omitempty"`
}
// AIJob represents an AI generation task tracked in PocketBase (ai_jobs collection).
type AIJob struct {
ID string `json:"id"`
// Kind is one of: "chapter-names", "batch-covers", "chapter-covers", "refresh-metadata".
Kind string `json:"kind"`
// Slug is the book slug for per-book jobs; empty for catalogue-wide jobs.
Slug string `json:"slug"`
Status TaskStatus `json:"status"`
// FromItem is the first item to process (chapter number, or 0-based book index).
// 0 = start from the beginning.
FromItem int `json:"from_item"`
// ToItem is the last item to process (inclusive). 0 = process all.
ToItem int `json:"to_item"`
// ItemsDone is the cumulative count of successfully processed items.
ItemsDone int `json:"items_done"`
// ItemsTotal is the total number of items in this job.
ItemsTotal int `json:"items_total"`
Model string `json:"model"`
// Payload is a JSON-encoded string with job-specific parameters
// (e.g. naming pattern for chapter-names, num_steps for batch-covers).
Payload string `json:"payload"`
ErrorMessage string `json:"error_message,omitempty"`
Started time.Time `json:"started,omitempty"`
Finished time.Time `json:"finished,omitempty"`
HeartbeatAt time.Time `json:"heartbeat_at,omitempty"`
}


@@ -27,6 +27,11 @@ type Client interface {
// waiting for the full output. The caller must always close the ReadCloser.
StreamAudioMP3(ctx context.Context, text, voice string) (io.ReadCloser, error)
// StreamAudioWAV synthesises text and returns an io.ReadCloser that streams
// WAV-encoded audio incrementally using kokoro-fastapi's streaming mode with
// response_format:"wav". The caller must always close the ReadCloser.
StreamAudioWAV(ctx context.Context, text, voice string) (io.ReadCloser, error)
// ListVoices returns the available voice IDs. Falls back to an empty slice
// on error — callers should treat an empty list as "service unavailable".
ListVoices(ctx context.Context) ([]string, error)
@@ -167,6 +172,47 @@ func (c *httpClient) StreamAudioMP3(ctx context.Context, text, voice string) (io
return resp.Body, nil
}
// StreamAudioWAV calls POST /v1/audio/speech with stream:true and response_format:wav,
// returning an io.ReadCloser that delivers WAV bytes as kokoro generates them.
func (c *httpClient) StreamAudioWAV(ctx context.Context, text, voice string) (io.ReadCloser, error) {
if text == "" {
return nil, fmt.Errorf("kokoro: empty text")
}
if voice == "" {
voice = "af_bella"
}
reqBody, err := json.Marshal(map[string]any{
"model": "kokoro",
"input": text,
"voice": voice,
"response_format": "wav",
"speed": 1.0,
"stream": true,
})
if err != nil {
return nil, fmt.Errorf("kokoro: marshal wav stream request: %w", err)
}
req, err := http.NewRequestWithContext(ctx, http.MethodPost,
c.baseURL+"/v1/audio/speech", bytes.NewReader(reqBody))
if err != nil {
return nil, fmt.Errorf("kokoro: build wav stream request: %w", err)
}
req.Header.Set("Content-Type", "application/json")
resp, err := c.http.Do(req)
if err != nil {
return nil, fmt.Errorf("kokoro: wav stream request: %w", err)
}
if resp.StatusCode != http.StatusOK {
_, _ = io.Copy(io.Discard, resp.Body)
resp.Body.Close()
return nil, fmt.Errorf("kokoro: wav stream returned %d", resp.StatusCode)
}
return resp.Body, nil
}
// ListVoices calls GET /v1/audio/voices and returns the list of voice IDs.
func (c *httpClient) ListVoices(ctx context.Context) ([]string, error) {
req, err := http.NewRequestWithContext(ctx, http.MethodGet,


@@ -32,11 +32,15 @@ type Client interface {
// BookExists reports whether a book with the given slug is already in the
// index. Used by the catalogue refresh to skip re-indexing known books.
BookExists(ctx context.Context, slug string) bool
// DeleteBook removes a book document from the search index by slug.
DeleteBook(ctx context.Context, slug string) error
// Search returns up to limit books matching query.
// Archived books are always excluded.
Search(ctx context.Context, query string, limit int) ([]domain.BookMeta, error)
// Catalogue queries books with optional filters, sort, and pagination.
// Returns books, the total hit count for pagination, and a FacetResult
// with available genre and status values from the index.
// Archived books are always excluded.
Catalogue(ctx context.Context, q CatalogueQuery) ([]domain.BookMeta, int64, FacetResult, error)
}
@@ -99,7 +103,7 @@ func Configure(host, apiKey string) error {
return fmt.Errorf("meili: update searchable attributes: %w", err)
}
filterable := []interface{}{"status", "genres", "archived"}
if _, err := idx.UpdateFilterableAttributes(&filterable); err != nil {
return fmt.Errorf("meili: update filterable attributes: %w", err)
}
@@ -128,6 +132,9 @@ type bookDoc struct {
// MetaUpdated is the Unix timestamp (seconds) of the last PocketBase update.
// Used for sort=update ("recently updated" ordering).
MetaUpdated int64 `json:"meta_updated"`
// Archived is true when the book has been soft-deleted by an admin.
// Used as a filter to exclude archived books from all search results.
Archived bool `json:"archived"`
}
func toDoc(b domain.BookMeta) bookDoc {
@@ -144,6 +151,7 @@ func toDoc(b domain.BookMeta) bookDoc {
Rank: b.Ranking,
Rating: b.Rating,
MetaUpdated: b.MetaUpdated,
Archived: b.Archived,
}
}
@@ -161,6 +169,7 @@ func fromDoc(d bookDoc) domain.BookMeta {
Ranking: d.Rank,
Rating: d.Rating,
MetaUpdated: d.MetaUpdated,
Archived: d.Archived,
}
}
@@ -184,13 +193,24 @@ func (c *MeiliClient) BookExists(_ context.Context, slug string) bool {
return err == nil && doc.Slug != ""
}
// DeleteBook removes a book document from the index by slug.
// The operation is fire-and-forget (Meilisearch processes tasks asynchronously).
func (c *MeiliClient) DeleteBook(_ context.Context, slug string) error {
if _, err := c.idx.DeleteDocument(slug, nil); err != nil {
return fmt.Errorf("meili: delete book %q: %w", slug, err)
}
return nil
}
// Search returns books matching query, up to limit results.
// Archived books are always excluded.
func (c *MeiliClient) Search(_ context.Context, query string, limit int) ([]domain.BookMeta, error) {
if limit <= 0 {
limit = 20
}
res, err := c.idx.Search(query, &meilisearch.SearchRequest{
Limit:  int64(limit),
Filter: "archived = false",
})
if err != nil {
return nil, fmt.Errorf("meili: search %q: %w", query, err)
@@ -231,17 +251,15 @@ func (c *MeiliClient) Catalogue(_ context.Context, q CatalogueQuery) ([]domain.B
Facets: []string{"genres", "status"},
}
-	// Build filter
-	var filters []string
+	// Build filter — always exclude archived books
+	filters := []string{"archived = false"}
if q.Genre != "" && q.Genre != "all" {
filters = append(filters, fmt.Sprintf("genres = %q", q.Genre))
}
if q.Status != "" && q.Status != "all" {
filters = append(filters, fmt.Sprintf("status = %q", q.Status))
}
-	if len(filters) > 0 {
-		req.Filter = strings.Join(filters, " AND ")
-	}
+	req.Filter = strings.Join(filters, " AND ")
// Map UI sort tokens to Meilisearch sort expressions.
switch q.Sort {
@@ -318,7 +336,8 @@ func sortStrings(s []string) {
type NoopClient struct{}
func (NoopClient) UpsertBook(_ context.Context, _ domain.BookMeta) error { return nil }
func (NoopClient) BookExists(_ context.Context, _ string) bool { return false }
func (NoopClient) DeleteBook(_ context.Context, _ string) error { return nil }
func (NoopClient) Search(_ context.Context, _ string, _ int) ([]domain.BookMeta, error) {
return nil, nil
}

View File

@@ -241,7 +241,7 @@ func (s *Scraper) ScrapeChapterList(ctx context.Context, bookURL string, upTo in
}
pageURL := fmt.Sprintf("%s?page=%d", baseChapterURL, page)
-	s.log.Info("scraping chapter list", "page", page, "url", pageURL)
+	s.log.Debug("scraping chapter list", "page", page, "url", pageURL)
raw, err := retryGet(ctx, s.log, s.client, pageURL, 9, 6*time.Second)
if err != nil {

View File

@@ -68,7 +68,7 @@ func New(cfg Config, novel scraper.NovelScraper, store bookstore.BookWriter, log
// Returns a ScrapeResult with counters. The result's ErrorMessage is non-empty
// if the run failed at the metadata or chapter-list level.
func (o *Orchestrator) RunBook(ctx context.Context, task domain.ScrapeTask) domain.ScrapeResult {
-	o.log.Info("orchestrator: RunBook starting",
+	o.log.Debug("orchestrator: RunBook starting",
"task_id", task.ID,
"kind", task.Kind,
"url", task.TargetURL,
@@ -90,6 +90,7 @@ func (o *Orchestrator) RunBook(ctx context.Context, task domain.ScrapeTask) doma
result.Errors++
return result
}
+	result.Slug = meta.Slug
if err := o.store.WriteMetadata(ctx, meta); err != nil {
o.log.Error("metadata write failed", "slug", meta.Slug, "err", err)
@@ -97,13 +98,14 @@ func (o *Orchestrator) RunBook(ctx context.Context, task domain.ScrapeTask) doma
result.Errors++
} else {
result.BooksFound = 1
-		result.Slug = meta.Slug
// Fire optional post-metadata hook (e.g. Meilisearch indexing).
if o.postMetadata != nil {
o.postMetadata(ctx, meta)
}
}
-	o.log.Info("metadata saved", "slug", meta.Slug, "title", meta.Title)
+	o.log.Debug("metadata saved", "slug", meta.Slug, "title", meta.Title)
// ── Step 2: Chapter list ──────────────────────────────────────────────────
refs, err := o.novel.ScrapeChapterList(ctx, task.TargetURL, task.ToChapter)
@@ -114,7 +116,7 @@ func (o *Orchestrator) RunBook(ctx context.Context, task domain.ScrapeTask) doma
return result
}
-	o.log.Info("chapter list fetched", "slug", meta.Slug, "chapters", len(refs))
+	o.log.Debug("chapter list fetched", "slug", meta.Slug, "chapters", len(refs))
// Persist chapter refs (without text) so the index exists early.
if wErr := o.store.WriteChapterRefs(ctx, meta.Slug, refs); wErr != nil {

View File

@@ -89,6 +89,8 @@ func (s *stubStore) WriteChapterRefs(_ context.Context, _ string, _ []domain.Cha
return nil
}
func (s *stubStore) DeduplicateChapters(_ context.Context, _ string) (int, error) { return 0, nil }
func (s *stubStore) ChapterExists(_ context.Context, slug string, ref domain.ChapterRef) bool {
s.mu.Lock()
defer s.mu.Unlock()

View File

@@ -59,6 +59,12 @@ type Client interface {
// The caller must always close the returned ReadCloser.
StreamAudioMP3(ctx context.Context, text, voice string) (io.ReadCloser, error)
// StreamAudioWAV synthesises text and returns an io.ReadCloser that streams
// raw WAV audio directly from pocket-tts without any transcoding.
// The stream begins with a WAV header followed by 16-bit PCM frames at 16 kHz.
// The caller must always close the returned ReadCloser.
StreamAudioWAV(ctx context.Context, text, voice string) (io.ReadCloser, error)
// ListVoices returns the available predefined voice names.
ListVoices(ctx context.Context) ([]string, error)
}
@@ -160,6 +166,25 @@ func (c *httpClient) StreamAudioMP3(ctx context.Context, text, voice string) (io
return pr, nil
}
// StreamAudioWAV posts to POST /tts and returns an io.ReadCloser that delivers
// raw WAV bytes directly from pocket-tts — no ffmpeg transcoding required.
// The first bytes will be a WAV header (RIFF/fmt chunk) followed by PCM frames.
// The caller must always close the returned ReadCloser.
func (c *httpClient) StreamAudioWAV(ctx context.Context, text, voice string) (io.ReadCloser, error) {
if text == "" {
return nil, fmt.Errorf("pockettts: empty text")
}
if voice == "" {
voice = "alba"
}
resp, err := c.postTTS(ctx, text, voice)
if err != nil {
return nil, err
}
return resp.Body, nil
}
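Editor's note: the header layout the comments above describe (RIFF/fmt chunk followed by 16-bit PCM at 16 kHz) can be sketched independently of the pocket-tts client. The helper names below (minimalWAVHeader, wavSampleRate) are illustrative only and not part of the codebase:

```go
package main

import (
	"bytes"
	"encoding/binary"
	"fmt"
)

// minimalWAVHeader builds the canonical 44-byte PCM WAV header for the given
// sample rate, channel count, and bits per sample, declaring dataLen payload bytes.
func minimalWAVHeader(sampleRate, channels, bits, dataLen int) []byte {
	var b bytes.Buffer
	byteRate := sampleRate * channels * bits / 8
	blockAlign := channels * bits / 8
	b.WriteString("RIFF")
	binary.Write(&b, binary.LittleEndian, uint32(36+dataLen)) // RIFF chunk size
	b.WriteString("WAVE")
	b.WriteString("fmt ")
	binary.Write(&b, binary.LittleEndian, uint32(16)) // fmt chunk size (PCM)
	binary.Write(&b, binary.LittleEndian, uint16(1))  // audio format 1 = PCM
	binary.Write(&b, binary.LittleEndian, uint16(channels))
	binary.Write(&b, binary.LittleEndian, uint32(sampleRate))
	binary.Write(&b, binary.LittleEndian, uint32(byteRate))
	binary.Write(&b, binary.LittleEndian, uint16(blockAlign))
	binary.Write(&b, binary.LittleEndian, uint16(bits))
	b.WriteString("data")
	binary.Write(&b, binary.LittleEndian, uint32(dataLen))
	return b.Bytes()
}

// wavSampleRate reads the sample-rate field (bytes 24..27) of a canonical header.
func wavSampleRate(hdr []byte) uint32 {
	return binary.LittleEndian.Uint32(hdr[24:28])
}

func main() {
	hdr := minimalWAVHeader(16000, 1, 16, 0)
	fmt.Println(string(hdr[0:4]), string(hdr[8:12]), wavSampleRate(hdr))
	// prints: RIFF WAVE 16000
}
```

A consumer of StreamAudioWAV can check these first 44 bytes before handing the stream to a player.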
// ListVoices returns the statically known predefined voice names.
// pocket-tts has no REST endpoint for listing voices.
func (c *httpClient) ListVoices(_ context.Context) ([]string, error) {

View File

@@ -54,6 +54,7 @@ func (r *Runner) runAsynq(ctx context.Context) error {
mux.HandleFunc(asynqqueue.TypeAudioGenerate, r.handleAudioTask)
mux.HandleFunc(asynqqueue.TypeScrapeBook, r.handleScrapeTask)
mux.HandleFunc(asynqqueue.TypeScrapeCatalogue, r.handleScrapeTask)
mux.HandleFunc(asynqqueue.TypeImportBook, r.handleImportTask)
// Register Asynq queue metrics with the default Prometheus registry so
// the /metrics endpoint (metrics.go) can expose them.
@@ -78,7 +79,7 @@ func (r *Runner) runAsynq(ctx context.Context) error {
// Write /tmp/runner.alive every 30s so Docker healthcheck passes in asynq mode.
// This mirrors the heartbeat file behavior from the poll() loop.
go func() {
-		heartbeatTick := time.NewTicker(r.cfg.StaleTaskThreshold)
+		heartbeatTick := time.NewTicker(r.cfg.StaleTaskThreshold / 2)
defer heartbeatTick.Stop()
for {
select {
@@ -191,6 +192,25 @@ func (r *Runner) handleAudioTask(ctx context.Context, t *asynq.Task) error {
return nil
}
// handleImportTask is the Asynq handler for TypeImportBook (PDF/EPUB import).
func (r *Runner) handleImportTask(ctx context.Context, t *asynq.Task) error {
var p asynqqueue.ImportPayload
if err := json.Unmarshal(t.Payload(), &p); err != nil {
return fmt.Errorf("unmarshal import payload: %w", err)
}
task := domain.ImportTask{
ID: p.PBTaskID,
Slug: p.Slug,
Title: p.Title,
FileType: p.FileType,
ChaptersKey: p.ChaptersKey,
}
r.tasksRunning.Add(1)
defer r.tasksRunning.Add(-1)
r.runImportTask(ctx, task, p.ObjectKey)
return nil
}
// pollTranslationTasks claims all available translation tasks from PocketBase
// and dispatches them to goroutines. Translation tasks don't go through Redis/Asynq
// because they're stored in PocketBase, so we need this separate poll loop.

View File

@@ -19,3 +19,53 @@ func stripMarkdown(src string) string {
src = regexp.MustCompile(`\n{3,}`).ReplaceAllString(src, "\n\n")
return strings.TrimSpace(src)
}
// chunkText splits text into chunks of at most maxChars characters, breaking
// at sentence boundaries (". ", "! ", "? ", "\n") so that the TTS service
// receives natural prose fragments rather than mid-sentence cuts.
//
// If a single sentence exceeds maxChars it is included as its own chunk —
// never silently truncated.
func chunkText(text string, maxChars int) []string {
if len(text) <= maxChars {
return []string{text}
}
// Sentence-boundary delimiters — we split AFTER these sequences.
// Order matters: longer sequences first.
delimiters := []string{".\n", "!\n", "?\n", ". ", "! ", "? ", "\n\n", "\n"}
var chunks []string
remaining := text
for len(remaining) > 0 {
if len(remaining) <= maxChars {
chunks = append(chunks, strings.TrimSpace(remaining))
break
}
// Find the last sentence boundary within the maxChars window.
window := remaining[:maxChars]
cutAt := -1
for _, delim := range delimiters {
idx := strings.LastIndex(window, delim)
if idx > 0 && idx+len(delim) > cutAt {
cutAt = idx + len(delim)
}
}
if cutAt <= 0 {
// No boundary found — hard-break at maxChars to avoid infinite loop.
cutAt = maxChars
}
chunk := strings.TrimSpace(remaining[:cutAt])
if chunk != "" {
chunks = append(chunks, chunk)
}
remaining = strings.TrimSpace(remaining[cutAt:])
}
return chunks
}
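A quick usage sketch of the splitting behaviour described above (chunkText is reproduced verbatim so the example is self-contained; the input string is made up):

```go
package main

import (
	"fmt"
	"strings"
)

// chunkText mirrors the function in the diff above: split after sentence
// delimiters inside the window, hard-breaking only when no boundary fits.
func chunkText(text string, maxChars int) []string {
	if len(text) <= maxChars {
		return []string{text}
	}
	delimiters := []string{".\n", "!\n", "?\n", ". ", "! ", "? ", "\n\n", "\n"}
	var chunks []string
	remaining := text
	for len(remaining) > 0 {
		if len(remaining) <= maxChars {
			chunks = append(chunks, strings.TrimSpace(remaining))
			break
		}
		window := remaining[:maxChars]
		cutAt := -1
		for _, delim := range delimiters {
			idx := strings.LastIndex(window, delim)
			if idx > 0 && idx+len(delim) > cutAt {
				cutAt = idx + len(delim)
			}
		}
		if cutAt <= 0 {
			cutAt = maxChars // no boundary found: hard break
		}
		if chunk := strings.TrimSpace(remaining[:cutAt]); chunk != "" {
			chunks = append(chunks, chunk)
		}
		remaining = strings.TrimSpace(remaining[cutAt:])
	}
	return chunks
}

func main() {
	text := "First sentence here. Second sentence is a bit longer than the first."
	for i, c := range chunkText(text, 50) {
		fmt.Printf("%d: %q\n", i, c)
	}
}
```

With a 50-character window the cut lands after ". ", so each chunk is a whole sentence rather than a mid-word fragment.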

View File

@@ -15,6 +15,7 @@ package runner
import (
"context"
"encoding/json"
"fmt"
"log/slog"
"os"
@@ -27,6 +28,7 @@ import (
"go.opentelemetry.io/otel/codes"
"github.com/libnovel/backend/internal/bookstore"
"github.com/libnovel/backend/internal/cfai"
"github.com/libnovel/backend/internal/domain"
"github.com/libnovel/backend/internal/kokoro"
"github.com/libnovel/backend/internal/libretranslate"
@@ -34,10 +36,27 @@ import (
"github.com/libnovel/backend/internal/orchestrator"
"github.com/libnovel/backend/internal/pockettts"
"github.com/libnovel/backend/internal/scraper"
"github.com/libnovel/backend/internal/storage"
"github.com/libnovel/backend/internal/taskqueue"
"github.com/libnovel/backend/internal/webpush"
"github.com/prometheus/client_golang/prometheus"
)
// Notifier creates notifications for users.
type Notifier interface {
CreateNotification(ctx context.Context, userID, title, message, link string) error
}
// ChapterIngester persists imported chapters for a book.
type ChapterIngester interface {
IngestChapters(ctx context.Context, slug string, chapters []bookstore.Chapter) error
}
// ImportChapterStore retrieves pre-parsed chapter JSON blobs from object storage.
type ImportChapterStore interface {
GetImportChapters(ctx context.Context, key string) ([]byte, error)
}
// Config tunes the runner behaviour.
type Config struct {
// WorkerID uniquely identifies this runner instance in PocketBase records.
@@ -102,6 +121,23 @@ type Dependencies struct {
TranslationStore bookstore.TranslationStore
// CoverStore stores book cover images in MinIO.
CoverStore bookstore.CoverStore
// BookImport handles PDF/EPUB file parsing and chapter extraction.
// Kept for backward compatibility when ChaptersKey is not set.
BookImport bookstore.BookImporter
// ImportChapterStore retrieves pre-parsed chapter JSON blobs from MinIO.
// When set and the task has a ChaptersKey, the runner reads from here
// instead of calling BookImport.Import() (the new preferred path).
ImportChapterStore ImportChapterStore
// ChapterIngester persists extracted chapters into MinIO/PocketBase.
ChapterIngester ChapterIngester
// Notifier creates notifications for users.
Notifier Notifier
// WebPush sends browser push notifications to subscribed users.
// If nil, push notifications are disabled.
WebPush *webpush.Sender
// Store is the underlying *storage.Store; used for push subscription lookups.
// Only needed when WebPush is non-nil.
Store *storage.Store
// SearchIndex indexes books in Meilisearch after scraping.
// If nil a no-op is used.
SearchIndex meili.Client
@@ -112,6 +148,9 @@ type Dependencies struct {
// PocketTTS is the pocket-tts client (CPU, kyutai voices: alba, marius, etc.).
// If nil, pocket-tts voice tasks will fail with a clear error.
PocketTTS pockettts.Client
// CFAI is the Cloudflare Workers AI TTS client (cfai:* prefixed voices).
// If nil, CF AI voice tasks will fail with a clear error.
CFAI cfai.Client
// LibreTranslate is the machine translation client.
// If nil, translation tasks will fail with a clear error.
LibreTranslate libretranslate.Client
@@ -221,6 +260,7 @@ func (r *Runner) runPoll(ctx context.Context) error {
scrapeSem := make(chan struct{}, r.cfg.MaxConcurrentScrape)
audioSem := make(chan struct{}, r.cfg.MaxConcurrentAudio)
translationSem := make(chan struct{}, r.cfg.MaxConcurrentTranslation)
importSem := make(chan struct{}, 1) // Limit concurrent imports
var wg sync.WaitGroup
tick := time.NewTicker(r.cfg.PollInterval)
@@ -240,7 +280,7 @@ func (r *Runner) runPoll(ctx context.Context) error {
// Run one poll immediately on startup, then on each tick.
for {
-		r.poll(ctx, scrapeSem, audioSem, translationSem, &wg)
+		r.poll(ctx, scrapeSem, audioSem, translationSem, importSem, &wg)
select {
case <-ctx.Done():
@@ -265,7 +305,7 @@ func (r *Runner) runPoll(ctx context.Context) error {
}
// poll claims all available pending tasks and dispatches them to goroutines.
-func (r *Runner) poll(ctx context.Context, scrapeSem, audioSem, translationSem chan struct{}, wg *sync.WaitGroup) {
+func (r *Runner) poll(ctx context.Context, scrapeSem, audioSem, translationSem, importSem chan struct{}, wg *sync.WaitGroup) {
// ── Heartbeat file ────────────────────────────────────────────────────
// Touch /tmp/runner.alive so the Docker health check can confirm the
// runner is actively polling. Failure is non-fatal — just log it.
@@ -381,6 +421,39 @@ translationLoop:
r.runTranslationTask(ctx, t)
}(task)
}
// ── Import tasks ─────────────────────────────────────────────────────
importLoop:
for {
if ctx.Err() != nil {
return
}
select {
case importSem <- struct{}{}:
// Slot acquired — proceed to claim a task.
default:
// All slots busy; leave remaining pending tasks for next tick.
break importLoop
}
task, ok, err := r.deps.Consumer.ClaimNextImportTask(ctx, r.cfg.WorkerID)
if err != nil {
<-importSem
r.deps.Log.Error("runner: ClaimNextImportTask failed", "err", err)
break
}
if !ok {
<-importSem
break
}
r.tasksRunning.Add(1)
wg.Add(1)
go func(t domain.ImportTask) {
defer wg.Done()
defer func() { <-importSem }()
defer r.tasksRunning.Add(-1)
r.runImportTask(ctx, t, t.ObjectKey)
}(task)
}
}
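The importSem channel used in the loop above is a counting semaphore: a non-blocking send acquires a slot, a receive releases it. A standalone sketch of that idiom, with capacity 1 to match the import limit (the helper names are illustrative):

```go
package main

import "fmt"

func main() {
	sem := make(chan struct{}, 1) // capacity = max concurrent imports

	// tryAcquire takes a slot without blocking; false means all slots are busy,
	// which is how the poll loop leaves remaining tasks for the next tick.
	tryAcquire := func() bool {
		select {
		case sem <- struct{}{}:
			return true
		default:
			return false
		}
	}
	release := func() { <-sem }

	fmt.Println(tryAcquire()) // true (slot free)
	fmt.Println(tryAcquire()) // false (slot busy)
	release()
	fmt.Println(tryAcquire()) // true again after release
}
```

In the runner the release happens in `defer func() { <-importSem }()` inside the task goroutine, so the slot is freed even if the task panics or returns early.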
// newOrchestrator builds an orchestrator with the Meilisearch post-hook wired in.
@@ -440,16 +513,56 @@ func (r *Runner) runScrapeTask(ctx context.Context, task domain.ScrapeTask) {
log.Warn("runner: unknown task kind")
}
-	if err := r.deps.Consumer.FinishScrapeTask(ctx, task.ID, result); err != nil {
+	// Use a fresh context for the final write so a cancelled task context doesn't
+	// prevent the result counters from being persisted to PocketBase.
+	finishCtx, finishCancel := context.WithTimeout(context.Background(), 15*time.Second)
+	defer finishCancel()
+	if err := r.deps.Consumer.FinishScrapeTask(finishCtx, task.ID, result); err != nil {
log.Error("runner: FinishScrapeTask failed", "err", err)
}
if result.ErrorMessage != "" {
r.tasksFailed.Add(1)
span.SetStatus(codes.Error, result.ErrorMessage)
if r.deps.Notifier != nil {
_ = r.deps.Notifier.CreateNotification(ctx, "admin",
"Scrape Failed",
fmt.Sprintf("Scrape task (%s) failed: %s", task.Kind, result.ErrorMessage),
"/admin/tasks")
}
} else {
r.tasksCompleted.Add(1)
span.SetStatus(codes.Ok, "")
if r.deps.Notifier != nil {
_ = r.deps.Notifier.CreateNotification(ctx, "admin",
"Scrape Complete",
fmt.Sprintf("Scraped %d chapters, skipped %d (%s)", result.ChaptersScraped, result.ChaptersSkipped, task.Kind),
"/admin/tasks")
}
// Fan-out in-app new-chapter notification to all users who have this book
// in their library. Runs in background so it doesn't block the task loop.
if r.deps.Store != nil && result.ChaptersScraped > 0 &&
result.Slug != "" && task.Kind != "catalogue" {
go func() {
notifyCtx, cancel := context.WithTimeout(context.Background(), 30*time.Second)
defer cancel()
title := result.Slug
_ = r.deps.Store.NotifyUsersWithBook(notifyCtx, result.Slug,
"New chapters available",
fmt.Sprintf("%d new chapter(s) added to %s", result.ChaptersScraped, title),
"/books/"+result.Slug)
}()
}
// Send Web Push notifications to subscribed browsers.
if r.deps.WebPush != nil && r.deps.Store != nil &&
result.ChaptersScraped > 0 && result.Slug != "" && task.Kind != "catalogue" {
go r.deps.WebPush.SendToBook(context.Background(), r.deps.Store, result.Slug, webpush.Payload{
Title: "New chapter available",
Body: fmt.Sprintf("%d new chapter(s) added", result.ChaptersScraped),
URL: "/books/" + result.Slug,
Icon: "/icon-192.png",
})
}
}
log.Info("runner: scrape task finished",
@@ -474,7 +587,7 @@ func (r *Runner) runCatalogueTask(ctx context.Context, task domain.ScrapeTask, o
TargetURL: entry.URL,
}
bookResult := o.RunBook(ctx, bookTask)
-		result.BooksFound += bookResult.BooksFound + 1
+		result.BooksFound += bookResult.BooksFound
result.ChaptersScraped += bookResult.ChaptersScraped
result.ChaptersSkipped += bookResult.ChaptersSkipped
result.Errors += bookResult.Errors
@@ -529,6 +642,12 @@ func (r *Runner) runAudioTask(ctx context.Context, task domain.AudioTask) {
if err := r.deps.Consumer.FinishAudioTask(ctx, task.ID, result); err != nil {
log.Error("runner: FinishAudioTask failed", "err", err)
}
if r.deps.Notifier != nil {
_ = r.deps.Notifier.CreateNotification(ctx, "admin",
"Audio Failed",
fmt.Sprintf("Ch.%d of %s (%s): %s", task.Chapter, task.Slug, task.Voice, msg),
fmt.Sprintf("/books/%s", task.Slug))
}
}
raw, err := r.deps.BookReader.ReadChapter(ctx, task.Slug, task.Chapter)
@@ -555,13 +674,25 @@ func (r *Runner) runAudioTask(ctx context.Context, task domain.AudioTask) {
return
}
log.Info("runner: audio generated via pocket-tts", "voice", task.Voice)
} else if cfai.IsCFAIVoice(task.Voice) {
if r.deps.CFAI == nil {
fail("cloudflare AI client not configured (CFAI_ACCOUNT_ID/CFAI_API_TOKEN empty)")
return
}
var genErr error
audioData, genErr = r.deps.CFAI.GenerateAudio(ctx, text, task.Voice)
if genErr != nil {
fail(fmt.Sprintf("cfai generate: %v", genErr))
return
}
log.Info("runner: audio generated via cloudflare AI", "voice", task.Voice)
} else {
if r.deps.Kokoro == nil {
fail("kokoro client not configured (KOKORO_URL is empty)")
return
}
var genErr error
-		audioData, genErr = r.deps.Kokoro.GenerateAudio(ctx, text, task.Voice)
+		audioData, genErr = kokoroGenerateChunked(ctx, r.deps.Kokoro, text, task.Voice, log)
if genErr != nil {
fail(fmt.Sprintf("kokoro generate: %v", genErr))
return
@@ -581,5 +712,173 @@ func (r *Runner) runAudioTask(ctx context.Context, task domain.AudioTask) {
if err := r.deps.Consumer.FinishAudioTask(ctx, task.ID, result); err != nil {
log.Error("runner: FinishAudioTask failed", "err", err)
}
if r.deps.Notifier != nil {
_ = r.deps.Notifier.CreateNotification(ctx, "admin",
"Audio Ready",
fmt.Sprintf("Ch.%d of %s (%s) is ready", task.Chapter, task.Slug, task.Voice),
fmt.Sprintf("/books/%s", task.Slug))
}
log.Info("runner: audio task finished", "key", key)
}
// kokoroGenerateChunked splits text into ~1 000-character sentence-boundary
// chunks, calls Kokoro.GenerateAudio for each, and concatenates the raw MP3
// bytes. This avoids EOF / timeout failures that occur when the Kokoro
// FastAPI server receives very large inputs (e.g. a full imported PDF chapter).
//
// Concatenating raw MP3 frames is valid — MP3 is a frame-based format and
// standard players handle multi-segment files correctly.
func kokoroGenerateChunked(ctx context.Context, k kokoro.Client, text, voice string, log *slog.Logger) ([]byte, error) {
const chunkSize = 1000
chunks := chunkText(text, chunkSize)
log.Info("runner: kokoro chunked generation", "chunks", len(chunks), "total_chars", len(text))
var combined []byte
for i, chunk := range chunks {
data, err := k.GenerateAudio(ctx, chunk, voice)
if err != nil {
return nil, fmt.Errorf("chunk %d/%d: %w", i+1, len(chunks), err)
}
combined = append(combined, data...)
log.Info("runner: kokoro chunk done", "chunk", i+1, "of", len(chunks), "bytes", len(data))
}
return combined, nil
}
// runImportTask executes one PDF/EPUB import task.
// Preferred path: when task.ChaptersKey is set, it reads pre-parsed chapters
// JSON from MinIO (written by the backend at upload time) and ingests them.
// Fallback path: when ChaptersKey is empty, calls BookImport.Import() to
// parse the raw file on the runner (legacy behaviour, not used for new tasks).
func (r *Runner) runImportTask(ctx context.Context, task domain.ImportTask, objectKey string) {
ctx, span := otel.Tracer("runner").Start(ctx, "runner.import_task")
defer span.End()
span.SetAttributes(
attribute.String("task.id", task.ID),
attribute.String("book.slug", task.Slug),
attribute.String("file.type", task.FileType),
attribute.String("chapters_key", task.ChaptersKey),
)
log := r.deps.Log.With("task_id", task.ID, "slug", task.Slug, "file_type", task.FileType)
log.Info("runner: import task starting", "chapters_key", task.ChaptersKey)
hbCtx, hbCancel := context.WithCancel(ctx)
defer hbCancel()
go func() {
tick := time.NewTicker(r.cfg.HeartbeatInterval)
defer tick.Stop()
for {
select {
case <-hbCtx.Done():
return
case <-tick.C:
if err := r.deps.Consumer.HeartbeatTask(ctx, task.ID); err != nil {
log.Warn("runner: heartbeat failed", "err", err)
}
}
}
}()
fail := func(msg string) {
log.Error("runner: import task failed", "reason", msg)
r.tasksFailed.Add(1)
span.SetStatus(codes.Error, msg)
result := domain.ImportResult{ErrorMessage: msg}
if err := r.deps.Consumer.FinishImportTask(ctx, task.ID, result); err != nil {
log.Error("runner: FinishImportTask failed", "err", err)
}
}
var chapters []bookstore.Chapter
if task.ChaptersKey != "" && r.deps.ImportChapterStore != nil {
// New path: read pre-parsed chapters JSON uploaded by the backend.
raw, err := r.deps.ImportChapterStore.GetImportChapters(ctx, task.ChaptersKey)
if err != nil {
fail(fmt.Sprintf("get chapters JSON: %v", err))
return
}
if err := json.Unmarshal(raw, &chapters); err != nil {
fail(fmt.Sprintf("unmarshal chapters JSON: %v", err))
return
}
log.Info("runner: loaded pre-parsed chapters", "count", len(chapters))
} else {
// Legacy path: parse the raw file on the runner.
if r.deps.BookImport == nil {
fail("book import not configured (BookImport dependency missing)")
return
}
var err error
chapters, err = r.deps.BookImport.Import(ctx, objectKey, task.FileType)
if err != nil {
fail(fmt.Sprintf("import file: %v", err))
return
}
log.Info("runner: parsed chapters from file (legacy path)", "count", len(chapters))
}
if len(chapters) == 0 {
fail("no chapters extracted from file")
return
}
// Persist chapters via ChapterIngester.
if r.deps.ChapterIngester == nil {
fail("chapter ingester not configured")
return
}
if err := r.deps.ChapterIngester.IngestChapters(ctx, task.Slug, chapters); err != nil {
fail(fmt.Sprintf("store chapters: %v", err))
return
}
// Write book metadata so the book appears in PocketBase catalogue.
if r.deps.BookWriter != nil {
meta := domain.BookMeta{
Slug: task.Slug,
Title: task.Title,
Author: task.Author,
Cover: task.CoverURL,
Status: task.BookStatus,
Genres: task.Genres,
Summary: task.Summary,
TotalChapters: len(chapters),
}
if meta.Status == "" {
meta.Status = "completed"
}
if err := r.deps.BookWriter.WriteMetadata(ctx, meta); err != nil {
log.Warn("runner: import task WriteMetadata failed (non-fatal)", "err", err)
} else {
// Index in Meilisearch so the book is searchable.
if err := r.deps.SearchIndex.UpsertBook(ctx, meta); err != nil {
log.Warn("runner: import task meilisearch upsert failed (non-fatal)", "err", err)
}
}
}
r.tasksCompleted.Add(1)
span.SetStatus(codes.Ok, "")
result := domain.ImportResult{
Slug: task.Slug,
ChaptersImported: len(chapters),
}
if err := r.deps.Consumer.FinishImportTask(ctx, task.ID, result); err != nil {
log.Error("runner: FinishImportTask failed", "err", err)
}
// Notify the user who initiated the import.
if r.deps.Notifier != nil {
msg := fmt.Sprintf("Import completed: %d chapters from %s", len(chapters), task.Title)
targetUser := task.InitiatorUserID
if targetUser == "" {
targetUser = "admin"
}
_ = r.deps.Notifier.CreateNotification(ctx, targetUser, "Import Complete", msg, "/admin/import")
}
log.Info("runner: import task finished", "chapters", len(chapters))
}

View File

@@ -54,6 +54,10 @@ func (s *stubConsumer) ClaimNextTranslationTask(_ context.Context, _ string) (do
return domain.TranslationTask{}, false, nil
}
func (s *stubConsumer) ClaimNextImportTask(_ context.Context, _ string) (domain.ImportTask, bool, error) {
return domain.ImportTask{}, false, nil
}
func (s *stubConsumer) FinishScrapeTask(_ context.Context, id string, _ domain.ScrapeResult) error {
s.finished = append(s.finished, id)
return nil
@@ -69,6 +73,11 @@ func (s *stubConsumer) FinishTranslationTask(_ context.Context, id string, _ dom
return nil
}
func (s *stubConsumer) FinishImportTask(_ context.Context, id string, _ domain.ImportResult) error {
s.finished = append(s.finished, id)
return nil
}
func (s *stubConsumer) FailTask(_ context.Context, id, _ string) error {
s.failCalled = append(s.failCalled, id)
return nil
@@ -94,6 +103,10 @@ func (s *stubBookWriter) ChapterExists(_ context.Context, _ string, _ domain.Cha
return false
}
func (s *stubBookWriter) DeduplicateChapters(_ context.Context, _ string) (int, error) {
return 0, nil
}
// stubBookReader satisfies bookstore.BookReader — returns a single chapter.
type stubBookReader struct {
text string
@@ -126,6 +139,9 @@ type stubAudioStore struct {
func (s *stubAudioStore) AudioObjectKey(slug string, n int, voice string) string {
return slug + "/" + string(rune('0'+n)) + "/" + voice + ".mp3"
}
func (s *stubAudioStore) AudioObjectKeyExt(slug string, n int, voice, ext string) string {
return slug + "/" + string(rune('0'+n)) + "/" + voice + "." + ext
}
func (s *stubAudioStore) AudioExists(_ context.Context, _ string) bool { return false }
func (s *stubAudioStore) PutAudio(_ context.Context, _ string, _ []byte) error {
s.putCalled.Add(1)
@@ -199,6 +215,14 @@ func (s *stubKokoro) StreamAudioMP3(_ context.Context, _, _ string) (io.ReadClos
return io.NopCloser(bytes.NewReader(s.data)), nil
}
func (s *stubKokoro) StreamAudioWAV(_ context.Context, _, _ string) (io.ReadCloser, error) {
s.called.Add(1)
if s.genErr != nil {
return nil, s.genErr
}
return io.NopCloser(bytes.NewReader(s.data)), nil
}
func (s *stubKokoro) ListVoices(_ context.Context) ([]string, error) {
return []string{"af_bella"}, nil
}

View File

@@ -53,6 +53,12 @@ func (r *Runner) runTranslationTask(ctx context.Context, task domain.Translation
if err := r.deps.Consumer.FinishTranslationTask(ctx, task.ID, result); err != nil {
log.Error("runner: FinishTranslationTask failed", "err", err)
}
if r.deps.Notifier != nil {
_ = r.deps.Notifier.CreateNotification(ctx, "admin",
"Translation Failed",
fmt.Sprintf("Ch.%d of %s (%s): %s", task.Chapter, task.Slug, task.Lang, msg),
fmt.Sprintf("/books/%s", task.Slug))
}
}
// Guard: LibreTranslate must be configured.
@@ -93,5 +99,11 @@ func (r *Runner) runTranslationTask(ctx context.Context, task domain.Translation
if err := r.deps.Consumer.FinishTranslationTask(ctx, task.ID, result); err != nil {
log.Error("runner: FinishTranslationTask failed", "err", err)
}
if r.deps.Notifier != nil {
_ = r.deps.Notifier.CreateNotification(ctx, "admin",
"Translation Ready",
fmt.Sprintf("Ch.%d of %s translated to %s", task.Chapter, task.Slug, task.Lang),
fmt.Sprintf("/books/%s", task.Slug))
}
log.Info("runner: translation task finished", "key", key)
}

View File

@@ -0,0 +1,857 @@
package storage
import (
"archive/zip"
"bytes"
"context"
"fmt"
"io"
"os"
"sort"
"strconv"
"strings"
"github.com/libnovel/backend/internal/bookstore"
"github.com/libnovel/backend/internal/domain"
minio "github.com/minio/minio-go/v7"
"github.com/pdfcpu/pdfcpu/pkg/api"
"github.com/pdfcpu/pdfcpu/pkg/pdfcpu/model"
"golang.org/x/net/html"
)
type importer struct {
mc *minioClient
}
// NewBookImporter creates a BookImporter that reads files from MinIO.
func NewBookImporter(s *Store) bookstore.BookImporter {
return &importer{mc: s.mc}
}
func (i *importer) Import(ctx context.Context, objectKey, fileType string) ([]bookstore.Chapter, error) {
if fileType != "pdf" && fileType != "epub" {
return nil, fmt.Errorf("unsupported file type: %s", fileType)
}
obj, err := i.mc.client.GetObject(ctx, "imports", objectKey, minio.GetObjectOptions{})
if err != nil {
return nil, fmt.Errorf("get object from minio: %w", err)
}
defer obj.Close()
data, err := io.ReadAll(obj)
if err != nil {
return nil, fmt.Errorf("read object: %w", err)
}
if fileType == "pdf" {
return parsePDF(data)
}
return parseEPUB(data)
}
// AnalyzeFile parses the given PDF or EPUB data and returns the detected
// chapter count and up to 3 preview lines (first non-empty line of each of
// the first 3 chapters). It is used by the analyze-only endpoint so users
// can preview chapter count before committing the import.
// Note: uses parsePDF which is backed by pdfcpu ExtractContent — fast, no hang risk.
func AnalyzeFile(data []byte, fileType string) (chapterCount int, firstLines []string, err error) {
var chapters []bookstore.Chapter
switch fileType {
case "pdf":
chapters, err = parsePDF(data)
case "epub":
chapters, err = parseEPUB(data)
default:
return 0, nil, fmt.Errorf("unsupported file type: %s", fileType)
}
if err != nil {
return 0, nil, err
}
chapterCount = len(chapters)
for i, ch := range chapters {
if i >= 3 {
break
}
line := strings.TrimSpace(ch.Content)
if nl := strings.Index(line, "\n"); nl > 0 {
line = line[:nl]
}
if len(line) > 120 {
line = line[:120] + "…"
}
firstLines = append(firstLines, line)
}
return chapterCount, firstLines, nil
}
// decryptPDF strips encryption from a PDF using an empty user password.
// Returns the decrypted bytes, or an error if decryption is not possible.
// This handles the common case of "owner-only" encrypted PDFs (copy/print
// restrictions) which use an empty user password and open normally in readers.
func decryptPDF(data []byte) ([]byte, error) {
conf := model.NewDefaultConfiguration()
conf.UserPW = ""
conf.OwnerPW = ""
var out bytes.Buffer
err := api.Decrypt(bytes.NewReader(data), &out, conf)
if err != nil {
return nil, err
}
return out.Bytes(), nil
}
// ParseImportFile parses a PDF or EPUB and returns chapters.
// Unlike AnalyzeFile it respects ctx cancellation so callers can apply a timeout.
// For PDFs it first attempts to strip encryption with an empty password.
func ParseImportFile(ctx context.Context, data []byte, fileType string) ([]bookstore.Chapter, error) {
type result struct {
chapters []bookstore.Chapter
err error
}
ch := make(chan result, 1)
go func() {
var chapters []bookstore.Chapter
var err error
switch fileType {
case "pdf":
chapters, err = parsePDF(data)
case "epub":
chapters, err = parseEPUB(data)
default:
err = fmt.Errorf("unsupported file type: %s", fileType)
}
ch <- result{chapters, err}
}()
select {
case <-ctx.Done():
return nil, fmt.Errorf("parse timed out: %w", ctx.Err())
case r := <-ch:
return r.chapters, r.err
}
}
// pdfSkipBookmarks lists bookmark titles that are front/back matter, not story chapters.
// These are skipped when building the chapter list.
var pdfSkipBookmarks = map[string]bool{
"cover": true, "insert": true, "title page": true, "copyright": true,
"appendix": true, "color insert": true, "color illustrations": true,
}
// parsePDF extracts text from PDF bytes and returns it as a single chapter.
//
// The full readable text is returned as one chapter so the admin can manually
// split it into chapters via the UI using --- markers.
//
// Strategy:
// 1. Decrypt owner-protected PDFs (empty user password).
// 2. Extract raw content streams for every page using pdfcpu ExtractContent.
// 3. Concatenate text from all pages in order, skipping front matter
// (cover, title page, copyright — typically the first 10 pages).
func parsePDF(data []byte) ([]bookstore.Chapter, error) {
// Decrypt owner-protected PDFs (empty user password).
decrypted, err := decryptPDF(data)
if err == nil {
data = decrypted
}
conf := model.NewDefaultConfiguration()
conf.UserPW = ""
conf.OwnerPW = ""
// Extract all page content streams to a temp directory.
tmpDir, err := os.MkdirTemp("", "pdf-extract-*")
if err != nil {
return nil, fmt.Errorf("create temp dir: %w", err)
}
defer os.RemoveAll(tmpDir)
if err := api.ExtractContent(bytes.NewReader(data), tmpDir, "out", nil, conf); err != nil {
return nil, fmt.Errorf("extract PDF content: %w", err)
}
entries, err := os.ReadDir(tmpDir)
if err != nil {
return nil, fmt.Errorf("read extracted pages: %w", err)
}
if len(entries) == 0 {
return nil, fmt.Errorf("PDF has no content pages")
}
// Parse page number from filename and build ordered text map.
pageTexts := make(map[int]string, len(entries))
maxPage := 0
for _, e := range entries {
pageNum := pageNumFromFilename(e.Name())
if pageNum <= 0 {
continue
}
raw, readErr := os.ReadFile(tmpDir + "/" + e.Name())
if readErr != nil {
continue
}
pageTexts[pageNum] = fixWin1252(extractTextFromContentStream(raw))
if pageNum > maxPage {
maxPage = pageNum
}
}
// Determine front-matter cutoff using bookmarks if available,
// otherwise skip the first 10 pages (cover/title/copyright).
bodyStart := 1
bookmarks, bmErr := api.Bookmarks(bytes.NewReader(data), conf)
if bmErr == nil {
for _, bm := range bookmarks {
title := strings.ToLower(strings.TrimSpace(bm.Title))
if !pdfSkipBookmarks[title] && bm.PageFrom > 0 {
// First non-front-matter bookmark — body starts here.
bodyStart = bm.PageFrom
break
}
}
} else if maxPage > 10 {
bodyStart = 11
}
// Concatenate all body pages.
var sb strings.Builder
for p := bodyStart; p <= maxPage; p++ {
t := strings.TrimSpace(pageTexts[p])
if t == "" {
continue
}
sb.WriteString(t)
sb.WriteString("\n\n")
}
text := strings.TrimSpace(sb.String())
if text == "" {
return nil, fmt.Errorf("could not extract any text from PDF")
}
return []bookstore.Chapter{{
Number: 1,
Title: "Full Text",
Content: text,
}}, nil
}
// pageNumFromFilename extracts the page number from a pdfcpu content-stream
// filename like "out_Content_page_42.txt". Returns 0 if not parseable.
func pageNumFromFilename(name string) int {
// Strip directory prefix and extension.
base := name
if idx := strings.LastIndex(base, "/"); idx >= 0 {
base = base[idx+1:]
}
if idx := strings.LastIndex(base, "."); idx >= 0 {
base = base[:idx]
}
// Find last "_" and parse the number after it.
if idx := strings.LastIndex(base, "_"); idx >= 0 {
n, err := strconv.Atoi(base[idx+1:])
if err == nil && n > 0 {
return n
}
}
return 0
}
// win1252ToUnicode maps the Windows-1252 control range 0x80–0x9F to the
// Unicode characters they actually represent in that encoding.
// Standard Latin-1 maps these bytes to control characters; Win-1252 maps
// them to typographic symbols that appear in publisher PDFs.
var win1252ToUnicode = map[byte]rune{
0x80: '\u20AC', // €
0x82: '\u201A', // ‚ (single low-9 quote)
0x83: '\u0192', // ƒ
0x84: '\u201E', // „
0x85: '\u2026', // …
0x86: '\u2020', // †
0x87: '\u2021', // ‡
0x88: '\u02C6', // ˆ
0x89: '\u2030', // ‰
0x8A: '\u0160', // Š
0x8B: '\u2039', // ‹ (single left angle quote)
0x8C: '\u0152', // Œ
0x8E: '\u017D', // Ž
0x91: '\u2018', // ' (left single quotation mark)
0x92: '\u2019', // ' (right single quotation mark / apostrophe)
0x93: '\u201C', // " (left double quotation mark)
0x94: '\u201D', // " (right double quotation mark)
0x95: '\u2022', // • (bullet)
0x96: '\u2013', // – (en dash)
0x97: '\u2014', // — (em dash)
0x98: '\u02DC', // ˜
0x99: '\u2122', // ™
0x9A: '\u0161', // š
0x9B: '\u203A', // › (single right angle quote)
0x9C: '\u0153', // œ
0x9E: '\u017E', // ž
0x9F: '\u0178', // Ÿ
}
// fixWin1252 replaces Windows-1252 specific bytes (0x80–0x9F) in a string
// that was decoded as raw Latin-1 bytes with their proper Unicode equivalents.
func fixWin1252(s string) string {
// Fast path: if no bytes in the 0x80–0x9F range, return unchanged.
needsFix := false
for i := 0; i < len(s); i++ {
b := s[i]
if b >= 0x80 && b <= 0x9F {
needsFix = true
break
}
}
if !needsFix {
return s
}
var sb strings.Builder
sb.Grow(len(s))
for i := 0; i < len(s); i++ {
b := s[i]
if b >= 0x80 && b <= 0x9F {
if r, ok := win1252ToUnicode[b]; ok {
sb.WriteRune(r)
continue
}
}
sb.WriteByte(b)
}
return sb.String()
}
// extractTextFromContentStream parses a raw PDF content stream and extracts
// readable text from Tj and TJ operators.
//
// TJ arrays may contain a mix of literal strings (parenthesised) and hex glyph
// arrays. Only the literal strings are decoded — hex arrays require per-font
// ToUnicode CMaps and are skipped. Kerning adjustment numbers inside TJ arrays
// are also ignored (they're just spacing hints).
//
// Line breaks are inserted on ET / Td / TD / T* operators.
func extractTextFromContentStream(stream []byte) string {
s := string(stream)
var sb strings.Builder
i := 0
n := len(s)
for i < n {
// TJ array: [ ... ]TJ — collect all literal strings, skip hex & numbers.
if s[i] == '[' {
j := i + 1
for j < n && s[j] != ']' {
if s[j] == '(' {
// Literal string inside TJ array.
k := j + 1
depth := 1
for k < n && depth > 0 {
if s[k] == '\\' {
k += 2
continue
}
if s[k] == '(' {
depth++
} else if s[k] == ')' {
depth--
}
k++
}
lit := pdfUnescapeString(s[j+1 : k-1])
if hasPrintableASCII(lit) {
sb.WriteString(lit)
}
j = k
continue
}
j++
}
// Check if this is a TJ operator (skip whitespace after ']').
end := j + 1
for end < n && (s[end] == ' ' || s[end] == '\t' || s[end] == '\r' || s[end] == '\n') {
end++
}
if end+2 <= n && s[end:end+2] == "TJ" && (end+2 == n || !isAlphaNum(s[end+2])) {
i = end + 2
continue
}
i = j + 1
continue
}
// Single string: (string) Tj
if s[i] == '(' {
j := i + 1
depth := 1
for j < n && depth > 0 {
if s[j] == '\\' {
j += 2
continue
}
if s[j] == '(' {
depth++
} else if s[j] == ')' {
depth--
}
j++
}
lit := pdfUnescapeString(s[i+1 : j-1])
if hasPrintableASCII(lit) {
// Check for Tj operator.
end := j
for end < n && (s[end] == ' ' || s[end] == '\t') {
end++
}
if end+2 <= n && s[end:end+2] == "Tj" && (end+2 == n || !isAlphaNum(s[end+2])) {
sb.WriteString(lit)
i = end + 2
continue
}
}
i = j
continue
}
// Detect end of text object (ET) — add a newline.
if i+2 <= n && s[i:i+2] == "ET" && (i+2 == n || !isAlphaNum(s[i+2])) {
sb.WriteByte('\n')
i += 2
continue
}
// Detect Td / TD / T* — newline within text block.
if i+2 <= n && (s[i:i+2] == "Td" || s[i:i+2] == "TD" || s[i:i+2] == "T*") &&
(i+2 == n || !isAlphaNum(s[i+2])) {
sb.WriteByte('\n')
i += 2
continue
}
i++
}
return sb.String()
}
func isAlphaNum(b byte) bool {
return (b >= 'a' && b <= 'z') || (b >= 'A' && b <= 'Z') || (b >= '0' && b <= '9') || b == '_'
}
func hasPrintableASCII(s string) bool {
for _, c := range s {
if c >= 0x20 && c < 0x7F {
return true
}
}
return false
}
// pdfUnescapeString handles PDF string escape sequences.
func pdfUnescapeString(s string) string {
if !strings.ContainsRune(s, '\\') {
return s
}
var sb strings.Builder
i := 0
for i < len(s) {
if s[i] == '\\' && i+1 < len(s) {
switch s[i+1] {
case 'n':
sb.WriteByte('\n')
case 'r':
sb.WriteByte('\r')
case 't':
sb.WriteByte('\t')
case '(', ')', '\\':
sb.WriteByte(s[i+1])
default:
// Octal escape \ddd: one to three octal digits.
if s[i+1] >= '0' && s[i+1] <= '7' {
end := i + 2
for end < i+4 && end < len(s) && s[end] >= '0' && s[end] <= '7' {
end++
}
val, _ := strconv.ParseInt(s[i+1:end], 8, 16)
sb.WriteByte(byte(val))
i = end
continue
}
sb.WriteByte(s[i+1])
}
i += 2
} else {
sb.WriteByte(s[i])
i++
}
}
return sb.String()
}
// ── EPUB parsing ──────────────────────────────────────────────────────────────
func parseEPUB(data []byte) ([]bookstore.Chapter, error) {
zr, err := zip.NewReader(bytes.NewReader(data), int64(len(data)))
if err != nil {
return nil, fmt.Errorf("open EPUB zip: %w", err)
}
// 1. Read META-INF/container.xml → find rootfile (content.opf path).
opfPath, err := epubRootfilePath(zr)
if err != nil {
return nil, fmt.Errorf("epub container: %w", err)
}
// 2. Parse content.opf → spine order of chapter files.
spineFiles, titleMap, err := epubSpine(zr, opfPath)
if err != nil {
return nil, fmt.Errorf("epub spine: %w", err)
}
if len(spineFiles) == 0 {
return nil, fmt.Errorf("EPUB spine is empty")
}
// Base directory of the OPF file for resolving relative hrefs.
opfDir := ""
if idx := strings.LastIndex(opfPath, "/"); idx >= 0 {
opfDir = opfPath[:idx+1]
}
var chapters []bookstore.Chapter
chNum := 0
for _, href := range spineFiles {
fullPath := opfDir + href
content, err := epubFileContent(zr, fullPath)
if err != nil {
continue
}
text := htmlToText(content)
if strings.TrimSpace(text) == "" {
continue
}
chNum++
title := titleMap[href]
if title == "" {
title = fmt.Sprintf("Chapter %d", chNum)
}
chapters = append(chapters, bookstore.Chapter{
Number: chNum,
Title: title,
Content: text,
})
}
if len(chapters) == 0 {
return nil, fmt.Errorf("no readable chapters found in EPUB")
}
return chapters, nil
}
// epubRootfilePath parses META-INF/container.xml and returns the full-path
// of the OPF package document.
func epubRootfilePath(zr *zip.Reader) (string, error) {
f := zipFile(zr, "META-INF/container.xml")
if f == nil {
return "", fmt.Errorf("META-INF/container.xml not found")
}
rc, err := f.Open()
if err != nil {
return "", err
}
defer rc.Close()
doc, err := html.Parse(rc)
if err != nil {
return "", err
}
var path string
var walk func(*html.Node)
walk = func(n *html.Node) {
if n.Type == html.ElementNode && strings.EqualFold(n.Data, "rootfile") {
for _, a := range n.Attr {
if strings.EqualFold(a.Key, "full-path") {
path = a.Val
return
}
}
}
for c := n.FirstChild; c != nil; c = c.NextSibling {
walk(c)
}
}
walk(doc)
if path == "" {
return "", fmt.Errorf("rootfile full-path not found in container.xml")
}
return path, nil
}
// epubSpine parses the OPF document and returns the spine item hrefs in order,
// plus a map from href → nav title (if available from NCX/NAV).
func epubSpine(zr *zip.Reader, opfPath string) ([]string, map[string]string, error) {
f := zipFile(zr, opfPath)
if f == nil {
return nil, nil, fmt.Errorf("OPF file %q not found in EPUB", opfPath)
}
rc, err := f.Open()
if err != nil {
return nil, nil, err
}
defer rc.Close()
opfData, err := io.ReadAll(rc)
if err != nil {
return nil, nil, err
}
// Build id→href map from <manifest>.
idToHref := make(map[string]string)
// Also keep a href→navTitle map (populated from NCX later).
hrefTitle := make(map[string]string)
// Parse OPF XML with html.Parse (handles malformed XML too).
doc, _ := html.Parse(bytes.NewReader(opfData))
var manifestItems []struct{ id, href, mediaType string }
var spineIdrefs []string
var ncxID string
var walk func(*html.Node)
walk = func(n *html.Node) {
if n.Type == html.ElementNode {
tag := strings.ToLower(n.Data)
switch tag {
case "item":
var id, href, mt string
for _, a := range n.Attr {
switch strings.ToLower(a.Key) {
case "id":
id = a.Val
case "href":
href = a.Val
case "media-type":
mt = a.Val
}
}
if id != "" && href != "" {
manifestItems = append(manifestItems, struct{ id, href, mediaType string }{id, href, mt})
idToHref[id] = href
}
case "itemref":
for _, a := range n.Attr {
if strings.ToLower(a.Key) == "idref" {
spineIdrefs = append(spineIdrefs, a.Val)
}
}
case "spine":
for _, a := range n.Attr {
if strings.ToLower(a.Key) == "toc" {
ncxID = a.Val
}
}
}
}
for c := n.FirstChild; c != nil; c = c.NextSibling {
walk(c)
}
}
walk(doc)
// Build ordered spine href list.
var spineHrefs []string
for _, idref := range spineIdrefs {
if href, ok := idToHref[idref]; ok {
spineHrefs = append(spineHrefs, href)
}
}
// If no explicit spine, fall back to all XHTML items in manifest order.
if len(spineHrefs) == 0 {
sort.Slice(manifestItems, func(i, j int) bool {
return manifestItems[i].href < manifestItems[j].href
})
for _, it := range manifestItems {
mt := strings.ToLower(it.mediaType)
if strings.Contains(mt, "html") || strings.HasSuffix(strings.ToLower(it.href), ".html") || strings.HasSuffix(strings.ToLower(it.href), ".xhtml") {
spineHrefs = append(spineHrefs, it.href)
}
}
}
// Try to get chapter titles from NCX (toc.ncx).
opfDir := ""
if idx := strings.LastIndex(opfPath, "/"); idx >= 0 {
opfDir = opfPath[:idx+1]
}
if ncxHref, ok := idToHref[ncxID]; ok {
ncxPath := opfDir + ncxHref
if ncxFile := zipFile(zr, ncxPath); ncxFile != nil {
if ncxRC, err := ncxFile.Open(); err == nil {
defer ncxRC.Close()
parseNCXTitles(ncxRC, hrefTitle)
}
}
}
return spineHrefs, hrefTitle, nil
}
// parseNCXTitles extracts navPoint label→src mappings from a toc.ncx.
func parseNCXTitles(r io.Reader, out map[string]string) {
doc, err := html.Parse(r)
if err != nil {
return
}
// Collect navPoints: each has a <navLabel><text>…</text></navLabel> and
// a <content src="…"/> child.
var walk func(*html.Node)
walk = func(n *html.Node) {
if n.Type == html.ElementNode && strings.EqualFold(n.Data, "navpoint") {
var label, src string
var inner func(*html.Node)
inner = func(c *html.Node) {
if c.Type == html.ElementNode {
if strings.EqualFold(c.Data, "text") && label == "" {
if c.FirstChild != nil && c.FirstChild.Type == html.TextNode {
label = strings.TrimSpace(c.FirstChild.Data)
}
}
if strings.EqualFold(c.Data, "content") {
for _, a := range c.Attr {
if strings.EqualFold(a.Key, "src") {
// Strip fragment identifier (#...).
src = strings.SplitN(a.Val, "#", 2)[0]
}
}
}
}
for child := c.FirstChild; child != nil; child = child.NextSibling {
inner(child)
}
}
inner(n)
if label != "" && src != "" {
out[src] = label
}
}
for c := n.FirstChild; c != nil; c = c.NextSibling {
walk(c)
}
}
walk(doc)
}
// epubFileContent returns the raw bytes of a file inside the EPUB zip.
func epubFileContent(zr *zip.Reader, path string) ([]byte, error) {
f := zipFile(zr, path)
if f == nil {
return nil, fmt.Errorf("file %q not in EPUB", path)
}
rc, err := f.Open()
if err != nil {
return nil, err
}
defer rc.Close()
return io.ReadAll(rc)
}
// zipFile finds a file by name (case-insensitive) in a zip.Reader.
func zipFile(zr *zip.Reader, name string) *zip.File {
nameLower := strings.ToLower(name)
for _, f := range zr.File {
if strings.ToLower(f.Name) == nameLower {
return f
}
}
return nil
}
// htmlToText converts HTML/XHTML content to plain text suitable for storage.
func htmlToText(data []byte) string {
doc, err := html.Parse(bytes.NewReader(data))
if err != nil {
return string(data)
}
var sb strings.Builder
var walk func(*html.Node)
walk = func(n *html.Node) {
if n.Type == html.TextNode {
text := strings.TrimSpace(n.Data)
if text != "" {
sb.WriteString(text)
sb.WriteByte(' ')
}
}
if n.Type == html.ElementNode {
switch strings.ToLower(n.Data) {
case "p", "div", "br", "h1", "h2", "h3", "h4", "h5", "h6", "li", "tr":
// Block-level: ensure newline before content.
if sb.Len() > 0 {
s := sb.String()
if s[len(s)-1] != '\n' {
sb.WriteByte('\n')
}
}
case "script", "style", "head":
// Skip entirely.
return
}
}
for c := n.FirstChild; c != nil; c = c.NextSibling {
walk(c)
}
if n.Type == html.ElementNode {
switch strings.ToLower(n.Data) {
case "p", "div", "h1", "h2", "h3", "h4", "h5", "h6", "li", "tr":
sb.WriteByte('\n')
}
}
}
walk(doc)
// Collapse multiple blank lines.
lines := strings.Split(sb.String(), "\n")
var out []string
blanks := 0
for _, l := range lines {
l = strings.TrimSpace(l)
if l == "" {
blanks++
if blanks <= 1 {
out = append(out, "")
}
} else {
blanks = 0
out = append(out, l)
}
}
return strings.TrimSpace(strings.Join(out, "\n"))
}
// ── Chapter ingestion ─────────────────────────────────────────────────────────
// IngestChapters stores extracted chapters for a book.
// Each chapter is written as a markdown file in the chapters MinIO bucket
// and its index record is upserted in PocketBase via WriteChapter.
func (s *Store) IngestChapters(ctx context.Context, slug string, chapters []bookstore.Chapter) error {
for _, ch := range chapters {
var mdContent string
if ch.Title != "" && ch.Title != fmt.Sprintf("Chapter %d", ch.Number) {
mdContent = fmt.Sprintf("# %s\n\n%s", ch.Title, ch.Content)
} else {
mdContent = fmt.Sprintf("# Chapter %d\n\n%s", ch.Number, ch.Content)
}
domainCh := domain.Chapter{
Ref: domain.ChapterRef{Number: ch.Number, Title: ch.Title},
Text: mdContent,
}
if err := s.WriteChapter(ctx, slug, domainCh); err != nil {
return fmt.Errorf("ingest chapter %d: %w", ch.Number, err)
}
}
return nil
}
// GetImportObjectKey returns the MinIO object key for an uploaded import file.
func GetImportObjectKey(filename string) string {
return fmt.Sprintf("imports/%s", filename)
}


@@ -109,10 +109,17 @@ func ChapterObjectKey(slug string, n int) string {
return fmt.Sprintf("%s/chapter-%06d.md", slug, n)
}
// AudioObjectKey returns the MinIO object key for a cached audio file.
// AudioObjectKeyExt returns the MinIO object key for a cached audio file
// with a custom extension (e.g. "mp3" or "wav").
// Format: {slug}/{n}/{voice}.{ext}
func AudioObjectKeyExt(slug string, n int, voice, ext string) string {
return fmt.Sprintf("%s/%d/%s.%s", slug, n, voice, ext)
}
// AudioObjectKey returns the MinIO object key for a cached MP3 audio file.
// Format: {slug}/{n}/{voice}.mp3
func AudioObjectKey(slug string, n int, voice string) string {
- return fmt.Sprintf("%s/%d/%s.mp3", slug, n, voice)
+ return AudioObjectKeyExt(slug, n, voice, "mp3")
}
// AvatarObjectKey returns the MinIO object key for a user avatar image.
@@ -127,6 +134,12 @@ func CoverObjectKey(slug string) string {
return fmt.Sprintf("covers/%s.jpg", slug)
}
// ChapterImageObjectKey returns the MinIO object key for a chapter illustration.
// Format: chapter-images/{slug}/{n:06d}.jpg
func ChapterImageObjectKey(slug string, n int) string {
return fmt.Sprintf("chapter-images/%s/%06d.jpg", slug, n)
}
// TranslationObjectKey returns the MinIO object key for a translated chapter.
// Format: {lang}/{slug}/{n:06d}.md
func TranslationObjectKey(lang, slug string, n int) string {
@@ -258,3 +271,28 @@ func coverContentType(data []byte) string {
}
return "image/jpeg"
}
// ── Chapter image operations ───────────────────────────────────────────────────
// putChapterImage stores a chapter illustration in the browse bucket.
func (m *minioClient) putChapterImage(ctx context.Context, key, contentType string, data []byte) error {
return m.putObject(ctx, m.bucketBrowse, key, contentType, data)
}
// getChapterImage retrieves a chapter illustration. Returns (nil, false, nil)
// when the object does not exist.
func (m *minioClient) getChapterImage(ctx context.Context, key string) ([]byte, bool, error) {
if !m.objectExists(ctx, m.bucketBrowse, key) {
return nil, false, nil
}
data, err := m.getObject(ctx, m.bucketBrowse, key)
if err != nil {
return nil, false, err
}
return data, true, nil
}
// chapterImageExists returns true when the chapter image object exists.
func (m *minioClient) chapterImageExists(ctx context.Context, key string) bool {
return m.objectExists(ctx, m.bucketBrowse, key)
}


@@ -26,6 +26,11 @@ import (
// ErrNotFound is returned by single-record lookups when no record exists.
var ErrNotFound = errors.New("storage: record not found")
// pbHTTPClient is a shared HTTP client with a 30 s timeout so that a slow or
// hung PocketBase never stalls the backend/runner process indefinitely.
// http.DefaultClient has no timeout and must not be used for PocketBase calls.
var pbHTTPClient = &http.Client{Timeout: 30 * time.Second}
// pbClient is the internal PocketBase REST admin client.
type pbClient struct {
baseURL string
@@ -66,7 +71,7 @@ func (c *pbClient) authToken(ctx context.Context) (string, error) {
}
req.Header.Set("Content-Type", "application/json")
- resp, err := http.DefaultClient.Do(req)
+ resp, err := pbHTTPClient.Do(req)
if err != nil {
return "", fmt.Errorf("pb auth: %w", err)
}
@@ -104,7 +109,7 @@ func (c *pbClient) do(ctx context.Context, method, path string, body io.Reader)
req.Header.Set("Content-Type", "application/json")
}
- resp, err := http.DefaultClient.Do(req)
+ resp, err := pbHTTPClient.Do(req)
if err != nil {
return nil, fmt.Errorf("pb: %s %s: %w", method, path, err)
}


@@ -53,6 +53,9 @@ var _ bookstore.PresignStore = (*Store)(nil)
var _ bookstore.ProgressStore = (*Store)(nil)
var _ bookstore.CoverStore = (*Store)(nil)
var _ bookstore.TranslationStore = (*Store)(nil)
var _ bookstore.AIJobStore = (*Store)(nil)
var _ bookstore.ChapterImageStore = (*Store)(nil)
var _ bookstore.BookAdminStore = (*Store)(nil)
var _ taskqueue.Producer = (*Store)(nil)
var _ taskqueue.Consumer = (*Store)(nil)
var _ taskqueue.Reader = (*Store)(nil)
@@ -74,12 +77,24 @@ func (s *Store) WriteMetadata(ctx context.Context, meta domain.BookMeta) error {
"rating": meta.Rating,
}
// Upsert via filter: if exists PATCH, otherwise POST.
// Use a conflict-retry pattern to handle concurrent scrapes racing to insert
// the same slug: if POST fails (or another concurrent writer beat us to it),
// re-fetch and PATCH instead.
existing, err := s.getBookBySlug(ctx, meta.Slug)
if err != nil && err != ErrNotFound {
return fmt.Errorf("WriteMetadata: %w", err)
}
if err == ErrNotFound {
- return s.pb.post(ctx, "/api/collections/books/records", payload, nil)
+ postErr := s.pb.post(ctx, "/api/collections/books/records", payload, nil)
+ if postErr == nil {
+ return nil
+ }
+ // POST failed — a concurrent writer may have inserted the same slug.
+ // Re-fetch and fall through to PATCH.
+ existing, err = s.getBookBySlug(ctx, meta.Slug)
+ if err != nil {
+ return postErr // original POST error is more informative
+ }
}
return s.pb.patch(ctx, fmt.Sprintf("/api/collections/books/records/%s", existing.ID), payload)
}
@@ -118,7 +133,23 @@ func (s *Store) upsertChapterIdx(ctx context.Context, slug string, ref domain.Ch
return err
}
if len(items) == 0 {
- return s.pb.post(ctx, "/api/collections/chapters_idx/records", payload, nil)
+ // Set created timestamp on first insert so recentlyUpdatedBooks can sort by it.
+ insertPayload := map[string]any{
+ "slug": slug,
+ "number": ref.Number,
+ "title": ref.Title,
+ "created": time.Now().UTC().Format(time.RFC3339),
+ }
+ postErr := s.pb.post(ctx, "/api/collections/chapters_idx/records", insertPayload, nil)
+ if postErr == nil {
+ return nil
+ }
+ // POST failed — a concurrent writer may have inserted the same slug+number.
+ // Re-fetch and fall through to PATCH (mirrors WriteMetadata retry pattern).
+ items, err = s.pb.listAll(ctx, "chapters_idx", filter, "")
+ if err != nil || len(items) == 0 {
+ return postErr // original POST error is more informative
+ }
}
var rec struct {
ID string `json:"id"`
@@ -127,6 +158,59 @@ func (s *Store) upsertChapterIdx(ctx context.Context, slug string, ref domain.Ch
return s.pb.patch(ctx, fmt.Sprintf("/api/collections/chapters_idx/records/%s", rec.ID), payload)
}
// DeduplicateChapters removes duplicate chapters_idx records for slug.
// For each chapter number that has more than one record, it keeps the record
// with the latest "updated" timestamp and deletes the rest.
// Returns the number of records deleted.
func (s *Store) DeduplicateChapters(ctx context.Context, slug string) (int, error) {
filter := fmt.Sprintf(`slug=%q`, slug)
items, err := s.pb.listAll(ctx, "chapters_idx", filter, "number")
if err != nil {
return 0, fmt.Errorf("DeduplicateChapters: list: %w", err)
}
type record struct {
ID string `json:"id"`
Number int `json:"number"`
Updated string `json:"updated"`
}
// Group records by chapter number.
byNumber := make(map[int][]record)
for _, raw := range items {
var rec record
if err := json.Unmarshal(raw, &rec); err != nil || rec.ID == "" {
continue
}
byNumber[rec.Number] = append(byNumber[rec.Number], rec)
}
deleted := 0
for _, recs := range byNumber {
if len(recs) <= 1 {
continue
}
// Keep the record with the latest Updated timestamp; delete the rest.
keep := 0
for i := 1; i < len(recs); i++ {
if recs[i].Updated > recs[keep].Updated {
keep = i
}
}
for i, rec := range recs {
if i == keep {
continue
}
if delErr := s.pb.delete(ctx, fmt.Sprintf("/api/collections/chapters_idx/records/%s", rec.ID)); delErr != nil {
s.log.Warn("DeduplicateChapters: delete failed", "slug", slug, "number", rec.Number, "id", rec.ID, "err", delErr)
continue
}
deleted++
}
}
return deleted, nil
}
// ── BookReader ────────────────────────────────────────────────────────────────
type pbBook struct {
@@ -143,6 +227,7 @@ type pbBook struct {
Ranking int `json:"ranking"`
Rating float64 `json:"rating"`
Updated string `json:"updated"`
Archived bool `json:"archived"`
}
func (b pbBook) toDomain() domain.BookMeta {
@@ -163,6 +248,7 @@ func (b pbBook) toDomain() domain.BookMeta {
Ranking: b.Ranking,
Rating: b.Rating,
MetaUpdated: metaUpdated,
Archived: b.Archived,
}
}
@@ -192,7 +278,7 @@ func (s *Store) ReadMetadata(ctx context.Context, slug string) (domain.BookMeta,
}
func (s *Store) ListBooks(ctx context.Context) ([]domain.BookMeta, error) {
- items, err := s.pb.listAll(ctx, "books", "", "title")
+ items, err := s.pb.listAll(ctx, "books", "archived=false", "title")
if err != nil {
return nil, err
}
@@ -293,6 +379,84 @@ func (s *Store) ReindexChapters(ctx context.Context, slug string) (int, error) {
return count, nil
}
// ── BookAdminStore ────────────────────────────────────────────────────────────
// ArchiveBook sets archived=true on the book record for slug.
func (s *Store) ArchiveBook(ctx context.Context, slug string) error {
book, err := s.getBookBySlug(ctx, slug)
if err == ErrNotFound {
return ErrNotFound
}
if err != nil {
return fmt.Errorf("ArchiveBook: %w", err)
}
return s.pb.patch(ctx, fmt.Sprintf("/api/collections/books/records/%s", book.ID),
map[string]any{"archived": true})
}
// UnarchiveBook clears archived on the book record for slug.
func (s *Store) UnarchiveBook(ctx context.Context, slug string) error {
book, err := s.getBookBySlug(ctx, slug)
if err == ErrNotFound {
return ErrNotFound
}
if err != nil {
return fmt.Errorf("UnarchiveBook: %w", err)
}
return s.pb.patch(ctx, fmt.Sprintf("/api/collections/books/records/%s", book.ID),
map[string]any{"archived": false})
}
// DeleteBook permanently removes all data for a book:
// - PocketBase books record
// - All PocketBase chapters_idx records for the slug
// - All MinIO chapter markdown objects ({slug}/chapter-*.md)
// - MinIO cover image (covers/{slug}.jpg)
func (s *Store) DeleteBook(ctx context.Context, slug string) error {
// 1. Fetch the book record to get its PocketBase ID.
book, err := s.getBookBySlug(ctx, slug)
if err == ErrNotFound {
return ErrNotFound
}
if err != nil {
return fmt.Errorf("DeleteBook: fetch: %w", err)
}
// 2. Delete all chapters_idx records.
filter := fmt.Sprintf(`slug=%q`, slug)
items, err := s.pb.listAll(ctx, "chapters_idx", filter, "")
if err != nil && err != ErrNotFound {
return fmt.Errorf("DeleteBook: list chapters_idx: %w", err)
}
for _, raw := range items {
var rec struct {
ID string `json:"id"`
}
if json.Unmarshal(raw, &rec) == nil && rec.ID != "" {
if delErr := s.pb.delete(ctx, fmt.Sprintf("/api/collections/chapters_idx/records/%s", rec.ID)); delErr != nil {
s.log.Warn("DeleteBook: delete chapters_idx record failed", "slug", slug, "id", rec.ID, "err", delErr)
}
}
}
// 3. Delete MinIO chapter objects.
if err := s.mc.deleteObjects(ctx, s.mc.bucketChapters, slug+"/"); err != nil {
s.log.Warn("DeleteBook: delete chapter objects failed", "slug", slug, "err", err)
}
// 4. Delete MinIO cover image.
if err := s.mc.deleteObjects(ctx, s.mc.bucketBrowse, CoverObjectKey(slug)); err != nil {
s.log.Warn("DeleteBook: delete cover failed", "slug", slug, "err", err)
}
// 5. Delete the PocketBase books record.
if err := s.pb.delete(ctx, fmt.Sprintf("/api/collections/books/records/%s", book.ID)); err != nil {
return fmt.Errorf("DeleteBook: delete books record: %w", err)
}
return nil
}
// ── RankingStore ──────────────────────────────────────────────────────────────
func (s *Store) WriteRankingItem(ctx context.Context, item domain.RankingItem) error {
@@ -376,6 +540,10 @@ func (s *Store) AudioObjectKey(slug string, n int, voice string) string {
return AudioObjectKey(slug, n, voice)
}
func (s *Store) AudioObjectKeyExt(slug string, n int, voice, ext string) string {
return AudioObjectKeyExt(slug, n, voice, ext)
}
func (s *Store) AudioExists(ctx context.Context, key string) bool {
return s.mc.objectExists(ctx, s.mc.bucketAudio, key)
}
@@ -560,6 +728,118 @@ func (s *Store) CreateTranslationTask(ctx context.Context, slug string, chapter
return rec.ID, nil
}
func (s *Store) CreateImportTask(ctx context.Context, task domain.ImportTask) (string, error) {
payload := map[string]any{
"slug": task.Slug,
"title": task.Title,
"file_name": task.Slug + "." + task.FileType,
"file_type": task.FileType,
"object_key": task.ObjectKey,
"chapters_key": task.ChaptersKey,
"author": task.Author,
"cover_url": task.CoverURL,
"summary": task.Summary,
"book_status": task.BookStatus,
"status": string(domain.TaskStatusPending),
"chapters_done": 0,
"chapters_total": task.ChaptersTotal,
"started": time.Now().UTC().Format(time.RFC3339),
"initiator_user_id": task.InitiatorUserID,
}
if len(task.Genres) > 0 {
payload["genres"] = strings.Join(task.Genres, ",")
}
var rec struct {
ID string `json:"id"`
}
if err := s.pb.post(ctx, "/api/collections/import_tasks/records", payload, &rec); err != nil {
return "", err
}
return rec.ID, nil
}
// CreateNotification creates a notification record in PocketBase.
func (s *Store) CreateNotification(ctx context.Context, userID, title, message, link string) error {
payload := map[string]any{
"user_id": userID,
"title": title,
"message": message,
"link": link,
"read": false,
"created": time.Now().UTC().Format(time.RFC3339),
}
return s.pb.post(ctx, "/api/collections/notifications/records", payload, nil)
}
// ListNotifications returns notifications for a user.
func (s *Store) ListNotifications(ctx context.Context, userID string, limit int) ([]map[string]any, error) {
filter := fmt.Sprintf(`user_id=%q`, userID)
items, err := s.pb.listAll(ctx, "notifications", filter, "-created")
if err != nil {
return nil, err
}
// Parse each json.RawMessage into a map
results := make([]map[string]any, 0, len(items))
for _, raw := range items {
var m map[string]any
if json.Unmarshal(raw, &m) == nil {
results = append(results, m)
}
}
if limit > 0 && len(results) > limit {
results = results[:limit]
}
return results, nil
}
// MarkNotificationRead marks a notification as read.
func (s *Store) MarkNotificationRead(ctx context.Context, id string) error {
return s.pb.patch(ctx, fmt.Sprintf("/api/collections/notifications/records/%s", id),
map[string]any{"read": true})
}
// DeleteNotification deletes a single notification by ID.
func (s *Store) DeleteNotification(ctx context.Context, id string) error {
return s.pb.delete(ctx, fmt.Sprintf("/api/collections/notifications/records/%s", id))
}
// ClearAllNotifications deletes all notifications for a user.
func (s *Store) ClearAllNotifications(ctx context.Context, userID string) error {
filter := fmt.Sprintf(`user_id=%q`, userID)
items, err := s.pb.listAll(ctx, "notifications", filter, "")
if err != nil {
return fmt.Errorf("ClearAllNotifications list: %w", err)
}
for _, raw := range items {
var rec struct {
ID string `json:"id"`
}
if json.Unmarshal(raw, &rec) == nil && rec.ID != "" {
_ = s.pb.delete(ctx, fmt.Sprintf("/api/collections/notifications/records/%s", rec.ID))
}
}
return nil
}
// MarkAllNotificationsRead marks all notifications for a user as read.
func (s *Store) MarkAllNotificationsRead(ctx context.Context, userID string) error {
filter := fmt.Sprintf(`user_id=%q&&read=false`, userID)
items, err := s.pb.listAll(ctx, "notifications", filter, "")
if err != nil {
return fmt.Errorf("MarkAllNotificationsRead list: %w", err)
}
for _, raw := range items {
var rec struct {
ID string `json:"id"`
}
if json.Unmarshal(raw, &rec) == nil && rec.ID != "" {
_ = s.pb.patch(ctx, fmt.Sprintf("/api/collections/notifications/records/%s", rec.ID),
map[string]any{"read": true})
}
}
return nil
}
func (s *Store) CancelTask(ctx context.Context, id string) error {
// Try scraping_tasks first, then audio_jobs, then translation_jobs.
if err := s.pb.patch(ctx, fmt.Sprintf("/api/collections/scraping_tasks/records/%s", id),
@@ -574,6 +854,28 @@ func (s *Store) CancelTask(ctx context.Context, id string) error {
map[string]string{"status": string(domain.TaskStatusCancelled)})
}
func (s *Store) CancelAudioTasksBySlug(ctx context.Context, slug string) (int, error) {
	filter := fmt.Sprintf(`slug=%q&&(status='pending'||status='running')`, slug)
items, err := s.pb.listAll(ctx, "audio_jobs", filter, "")
if err != nil {
return 0, fmt.Errorf("CancelAudioTasksBySlug list: %w", err)
}
cancelled := 0
for _, raw := range items {
var rec struct {
ID string `json:"id"`
}
if json.Unmarshal(raw, &rec) == nil && rec.ID != "" {
if patchErr := s.pb.patch(ctx,
fmt.Sprintf("/api/collections/audio_jobs/records/%s", rec.ID),
map[string]string{"status": string(domain.TaskStatusCancelled)}); patchErr == nil {
cancelled++
}
}
}
return cancelled, nil
}
// ── taskqueue.Consumer ────────────────────────────────────────────────────────
func (s *Store) ClaimNextScrapeTask(ctx context.Context, workerID string) (domain.ScrapeTask, bool, error) {
@@ -612,6 +914,18 @@ func (s *Store) ClaimNextTranslationTask(ctx context.Context, workerID string) (
return task, err == nil, err
}
func (s *Store) ClaimNextImportTask(ctx context.Context, workerID string) (domain.ImportTask, bool, error) {
raw, err := s.pb.claimRecord(ctx, "import_tasks", workerID, nil)
if err != nil {
return domain.ImportTask{}, false, err
}
if raw == nil {
return domain.ImportTask{}, false, nil
}
task, err := parseImportTask(raw)
return task, err == nil, err
}
func (s *Store) FinishScrapeTask(ctx context.Context, id string, result domain.ScrapeResult) error {
status := string(domain.TaskStatusDone)
if result.ErrorMessage != "" {
@@ -652,6 +966,20 @@ func (s *Store) FinishTranslationTask(ctx context.Context, id string, result dom
})
}
func (s *Store) FinishImportTask(ctx context.Context, id string, result domain.ImportResult) error {
status := string(domain.TaskStatusDone)
if result.ErrorMessage != "" {
status = string(domain.TaskStatusFailed)
}
return s.pb.patch(ctx, fmt.Sprintf("/api/collections/import_tasks/records/%s", id), map[string]any{
"status": status,
"chapters_done": result.ChaptersImported,
"chapters_total": result.ChaptersImported,
"error_message": result.ErrorMessage,
"finished": time.Now().UTC().Format(time.RFC3339),
})
}
func (s *Store) FailTask(ctx context.Context, id, errMsg string) error {
payload := map[string]any{
"status": string(domain.TaskStatusFailed),
@@ -668,7 +996,7 @@ func (s *Store) FailTask(ctx context.Context, id, errMsg string) error {
}
// HeartbeatTask updates the heartbeat_at field on a running task.
// Tries scraping_tasks, audio_jobs, translation_jobs, then import_tasks.
func (s *Store) HeartbeatTask(ctx context.Context, id string) error {
payload := map[string]any{
"heartbeat_at": time.Now().UTC().Format(time.RFC3339),
@@ -679,7 +1007,10 @@ func (s *Store) HeartbeatTask(ctx context.Context, id string) error {
if err := s.pb.patch(ctx, fmt.Sprintf("/api/collections/audio_jobs/records/%s", id), payload); err == nil {
return nil
}
if err := s.pb.patch(ctx, fmt.Sprintf("/api/collections/translation_jobs/records/%s", id), payload); err == nil {
return nil
}
return s.pb.patch(ctx, fmt.Sprintf("/api/collections/import_tasks/records/%s", id), payload)
}
// ReapStaleTasks finds all running tasks whose heartbeat_at is either missing
@@ -697,7 +1028,7 @@ func (s *Store) ReapStaleTasks(ctx context.Context, staleAfter time.Duration) (i
}
total := 0
for _, collection := range []string{"scraping_tasks", "audio_jobs", "translation_jobs", "import_tasks"} {
items, err := s.pb.listAll(ctx, collection, filter, "")
if err != nil {
return total, fmt.Errorf("ReapStaleTasks list %s: %w", collection, err)
@@ -790,8 +1121,7 @@ func (s *Store) ListTranslationTasks(ctx context.Context) ([]domain.TranslationT
}
func (s *Store) GetTranslationTask(ctx context.Context, cacheKey string) (domain.TranslationTask, bool, error) {
items, err := s.pb.listAll(ctx, "translation_jobs", fmt.Sprintf("cache_key=%q", cacheKey), "-started")
if err != nil || len(items) == 0 {
return domain.TranslationTask{}, false, err
}
@@ -799,6 +1129,33 @@ func (s *Store) GetTranslationTask(ctx context.Context, cacheKey string) (domain
return t, err == nil, err
}
func (s *Store) ListImportTasks(ctx context.Context) ([]domain.ImportTask, error) {
items, err := s.pb.listAll(ctx, "import_tasks", "", "-started")
if err != nil {
return nil, err
}
tasks := make([]domain.ImportTask, 0, len(items))
for _, raw := range items {
t, err := parseImportTask(raw)
if err == nil {
tasks = append(tasks, t)
}
}
return tasks, nil
}
func (s *Store) GetImportTask(ctx context.Context, id string) (domain.ImportTask, bool, error) {
var raw json.RawMessage
if err := s.pb.get(ctx, fmt.Sprintf("/api/collections/import_tasks/records/%s", id), &raw); err != nil {
if err == ErrNotFound {
return domain.ImportTask{}, false, nil
}
return domain.ImportTask{}, false, err
}
t, err := parseImportTask(raw)
return t, err == nil, err
}
// ── Parsers ───────────────────────────────────────────────────────────────────
func parseScrapeTask(raw json.RawMessage) (domain.ScrapeTask, error) {
@@ -905,6 +1262,66 @@ func parseTranslationTask(raw json.RawMessage) (domain.TranslationTask, error) {
}, nil
}
func parseImportTask(raw json.RawMessage) (domain.ImportTask, error) {
var rec struct {
ID string `json:"id"`
Slug string `json:"slug"`
Title string `json:"title"`
FileName string `json:"file_name"`
FileType string `json:"file_type"`
ObjectKey string `json:"object_key"`
ChaptersKey string `json:"chapters_key"`
Author string `json:"author"`
CoverURL string `json:"cover_url"`
Genres string `json:"genres"` // stored as comma-separated
Summary string `json:"summary"`
BookStatus string `json:"book_status"`
WorkerID string `json:"worker_id"`
InitiatorUserID string `json:"initiator_user_id"`
Status string `json:"status"`
ChaptersDone int `json:"chapters_done"`
ChaptersTotal int `json:"chapters_total"`
ErrorMessage string `json:"error_message"`
Started string `json:"started"`
Finished string `json:"finished"`
}
if err := json.Unmarshal(raw, &rec); err != nil {
return domain.ImportTask{}, err
}
started, _ := time.Parse(time.RFC3339, rec.Started)
finished, _ := time.Parse(time.RFC3339, rec.Finished)
var genres []string
if rec.Genres != "" {
for _, g := range strings.Split(rec.Genres, ",") {
if g = strings.TrimSpace(g); g != "" {
genres = append(genres, g)
}
}
}
return domain.ImportTask{
ID: rec.ID,
Slug: rec.Slug,
Title: rec.Title,
FileName: rec.FileName,
FileType: rec.FileType,
ObjectKey: rec.ObjectKey,
ChaptersKey: rec.ChaptersKey,
Author: rec.Author,
CoverURL: rec.CoverURL,
Genres: genres,
Summary: rec.Summary,
BookStatus: rec.BookStatus,
WorkerID: rec.WorkerID,
InitiatorUserID: rec.InitiatorUserID,
Status: domain.TaskStatus(rec.Status),
ChaptersDone: rec.ChaptersDone,
ChaptersTotal: rec.ChaptersTotal,
ErrorMessage: rec.ErrorMessage,
Started: started,
Finished: finished,
}, nil
}
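The comma-separated genres handling in parseImportTask can be sketched in isolation. This is a minimal standalone version; `splitGenres` is an illustrative helper name, not a function in the repository:

```go
package main

import (
	"fmt"
	"strings"
)

// splitGenres mirrors parseImportTask's handling of the comma-separated
// genres field: split on commas, trim whitespace, and drop empty entries.
func splitGenres(s string) []string {
	var genres []string
	for _, g := range strings.Split(s, ",") {
		if g = strings.TrimSpace(g); g != "" {
			genres = append(genres, g)
		}
	}
	return genres
}

func main() {
	// Empty segments and padding are silently discarded.
	fmt.Printf("%q\n", splitGenres("Fantasy, Action, ,Isekai"))
	// → ["Fantasy" "Action" "Isekai"]
}
```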
// ── CoverStore ─────────────────────────────────────────────────────────────────
func (s *Store) PutCover(ctx context.Context, slug string, data []byte, contentType string) error {
@@ -931,10 +1348,59 @@ func (s *Store) GetCover(ctx context.Context, slug string) ([]byte, string, bool
return data, ct, true, nil
}
// PutImportFile stores an uploaded import file (PDF/EPUB) in MinIO.
func (s *Store) PutImportFile(ctx context.Context, key string, data []byte) error {
return s.mc.putObject(ctx, "imports", key, "application/octet-stream", data)
}
// PutImportChapters stores a pre-parsed chapters JSON blob in MinIO.
func (s *Store) PutImportChapters(ctx context.Context, key string, data []byte) error {
return s.mc.putObject(ctx, "imports", key, "application/json", data)
}
// GetImportChapters retrieves the pre-parsed chapters JSON from MinIO.
func (s *Store) GetImportChapters(ctx context.Context, key string) ([]byte, error) {
data, err := s.mc.getObject(ctx, "imports", key)
if err != nil {
return nil, fmt.Errorf("get chapters object: %w", err)
}
return data, nil
}
func (s *Store) CoverExists(ctx context.Context, slug string) bool {
return s.mc.coverExists(ctx, CoverObjectKey(slug))
}
// ── ChapterImageStore ──────────────────────────────────────────────────────────
func (s *Store) PutChapterImage(ctx context.Context, slug string, n int, data []byte, contentType string) error {
key := ChapterImageObjectKey(slug, n)
if contentType == "" {
contentType = coverContentType(data)
}
if err := s.mc.putChapterImage(ctx, key, contentType, data); err != nil {
return fmt.Errorf("PutChapterImage: %w", err)
}
return nil
}
func (s *Store) GetChapterImage(ctx context.Context, slug string, n int) ([]byte, string, bool, error) {
key := ChapterImageObjectKey(slug, n)
data, ok, err := s.mc.getChapterImage(ctx, key)
if err != nil {
return nil, "", false, fmt.Errorf("GetChapterImage: %w", err)
}
if !ok {
return nil, "", false, nil
}
ct := coverContentType(data)
return data, ct, true, nil
}
func (s *Store) ChapterImageExists(ctx context.Context, slug string, n int) bool {
return s.mc.chapterImageExists(ctx, ChapterImageObjectKey(slug, n))
}
// ── TranslationStore ───────────────────────────────────────────────────────────
func (s *Store) TranslationObjectKey(lang, slug string, n int) string {
@@ -956,3 +1422,273 @@ func (s *Store) GetTranslation(ctx context.Context, key string) (string, error)
}
return string(data), nil
}
// ── AIJobStore ────────────────────────────────────────────────────────────────
func (s *Store) CreateAIJob(ctx context.Context, job domain.AIJob) (string, error) {
payload := map[string]any{
"kind": job.Kind,
"slug": job.Slug,
"status": string(job.Status),
"from_item": job.FromItem,
"to_item": job.ToItem,
"items_done": job.ItemsDone,
"items_total": job.ItemsTotal,
"model": job.Model,
"payload": job.Payload,
"started": job.Started.Format(time.RFC3339),
}
var out struct {
ID string `json:"id"`
}
if err := s.pb.post(ctx, "/api/collections/ai_jobs/records", payload, &out); err != nil {
return "", fmt.Errorf("CreateAIJob: %w", err)
}
return out.ID, nil
}
func (s *Store) GetAIJob(ctx context.Context, id string) (domain.AIJob, bool, error) {
var raw json.RawMessage
if err := s.pb.get(ctx, fmt.Sprintf("/api/collections/ai_jobs/records/%s", id), &raw); err != nil {
if strings.Contains(err.Error(), "404") {
return domain.AIJob{}, false, nil
}
return domain.AIJob{}, false, fmt.Errorf("GetAIJob: %w", err)
}
job, err := parseAIJob(raw)
if err != nil {
return domain.AIJob{}, false, err
}
return job, true, nil
}
func (s *Store) UpdateAIJob(ctx context.Context, id string, fields map[string]any) error {
return s.pb.patch(ctx, fmt.Sprintf("/api/collections/ai_jobs/records/%s", id), fields)
}
func (s *Store) ListAIJobs(ctx context.Context) ([]domain.AIJob, error) {
items, err := s.pb.listAll(ctx, "ai_jobs", "", "-started")
if err != nil {
return nil, fmt.Errorf("ListAIJobs: %w", err)
}
out := make([]domain.AIJob, 0, len(items))
for _, raw := range items {
j, err := parseAIJob(raw)
if err != nil {
continue
}
out = append(out, j)
}
return out, nil
}
func parseAIJob(raw json.RawMessage) (domain.AIJob, error) {
var r struct {
ID string `json:"id"`
Kind string `json:"kind"`
Slug string `json:"slug"`
Status string `json:"status"`
FromItem int `json:"from_item"`
ToItem int `json:"to_item"`
ItemsDone int `json:"items_done"`
ItemsTotal int `json:"items_total"`
Model string `json:"model"`
Payload string `json:"payload"`
ErrorMessage string `json:"error_message"`
Started string `json:"started"`
Finished string `json:"finished"`
HeartbeatAt string `json:"heartbeat_at"`
}
if err := json.Unmarshal(raw, &r); err != nil {
return domain.AIJob{}, fmt.Errorf("parseAIJob: %w", err)
}
parseT := func(s string) time.Time {
if s == "" {
return time.Time{}
}
t, _ := time.Parse(time.RFC3339, s)
return t
}
return domain.AIJob{
ID: r.ID,
Kind: r.Kind,
Slug: r.Slug,
Status: domain.TaskStatus(r.Status),
FromItem: r.FromItem,
ToItem: r.ToItem,
ItemsDone: r.ItemsDone,
ItemsTotal: r.ItemsTotal,
Model: r.Model,
Payload: r.Payload,
ErrorMessage: r.ErrorMessage,
Started: parseT(r.Started),
Finished: parseT(r.Finished),
HeartbeatAt: parseT(r.HeartbeatAt),
}, nil
}
// ── Push subscriptions ────────────────────────────────────────────────────────
// PushSubscription holds the Web Push subscription data for a single browser.
type PushSubscription struct {
ID string
UserID string
Endpoint string
P256DH string
Auth string
}
// SavePushSubscription upserts a Web Push subscription for a user.
// If a record with the same endpoint already exists it is updated in place.
func (s *Store) SavePushSubscription(ctx context.Context, sub PushSubscription) error {
filter := fmt.Sprintf("endpoint=%q", sub.Endpoint)
existing, err := s.pb.listAll(ctx, "push_subscriptions", filter, "")
if err != nil {
return fmt.Errorf("SavePushSubscription list: %w", err)
}
payload := map[string]any{
"user_id": sub.UserID,
"endpoint": sub.Endpoint,
"p256dh": sub.P256DH,
"auth": sub.Auth,
}
if len(existing) > 0 {
var rec struct {
ID string `json:"id"`
}
if json.Unmarshal(existing[0], &rec) == nil && rec.ID != "" {
return s.pb.patch(ctx, fmt.Sprintf("/api/collections/push_subscriptions/records/%s", rec.ID), payload)
}
}
return s.pb.post(ctx, "/api/collections/push_subscriptions/records", payload, nil)
}
// DeletePushSubscription removes a Web Push subscription by endpoint.
func (s *Store) DeletePushSubscription(ctx context.Context, userID, endpoint string) error {
filter := fmt.Sprintf("user_id=%q&&endpoint=%q", userID, endpoint)
items, err := s.pb.listAll(ctx, "push_subscriptions", filter, "")
if err != nil {
return fmt.Errorf("DeletePushSubscription list: %w", err)
}
for _, raw := range items {
var rec struct {
ID string `json:"id"`
}
if json.Unmarshal(raw, &rec) == nil && rec.ID != "" {
_ = s.pb.delete(ctx, fmt.Sprintf("/api/collections/push_subscriptions/records/%s", rec.ID))
}
}
return nil
}
// ListPushSubscriptionsByBook returns all push subscriptions belonging to users
// who have the given book slug in their library (user_library collection).
func (s *Store) ListPushSubscriptionsByBook(ctx context.Context, slug string) ([]PushSubscription, error) {
// Find all users who have this book in their library
libFilter := fmt.Sprintf("slug=%q&&user_id!=''", slug)
libItems, err := s.pb.listAll(ctx, "user_library", libFilter, "")
if err != nil {
return nil, fmt.Errorf("ListPushSubscriptionsByBook list library: %w", err)
}
// Collect unique user IDs
seen := make(map[string]bool)
var userIDs []string
for _, raw := range libItems {
var rec struct {
UserID string `json:"user_id"`
}
if json.Unmarshal(raw, &rec) == nil && rec.UserID != "" && !seen[rec.UserID] {
seen[rec.UserID] = true
userIDs = append(userIDs, rec.UserID)
}
}
if len(userIDs) == 0 {
return nil, nil
}
// Build OR filter for push_subscriptions
parts := make([]string, len(userIDs))
for i, uid := range userIDs {
parts[i] = fmt.Sprintf("user_id=%q", uid)
}
subFilter := strings.Join(parts, "||")
subItems, err := s.pb.listAll(ctx, "push_subscriptions", subFilter, "")
if err != nil {
return nil, fmt.Errorf("ListPushSubscriptionsByBook list subs: %w", err)
}
subs := make([]PushSubscription, 0, len(subItems))
for _, raw := range subItems {
var rec struct {
ID string `json:"id"`
UserID string `json:"user_id"`
Endpoint string `json:"endpoint"`
P256DH string `json:"p256dh"`
Auth string `json:"auth"`
}
if json.Unmarshal(raw, &rec) == nil && rec.Endpoint != "" {
subs = append(subs, PushSubscription{
ID: rec.ID,
UserID: rec.UserID,
Endpoint: rec.Endpoint,
P256DH: rec.P256DH,
Auth: rec.Auth,
})
}
}
return subs, nil
}
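The OR filter assembled in ListPushSubscriptionsByBook grows linearly with the number of followers. A minimal standalone sketch of that construction (`buildUserFilter` is an illustrative name, not part of the repository):

```go
package main

import (
	"fmt"
	"strings"
)

// buildUserFilter reproduces the filter built in ListPushSubscriptionsByBook:
// user_id="a"||user_id="b"||… — one quoted equality clause per user ID,
// joined with PocketBase's || operator.
func buildUserFilter(userIDs []string) string {
	parts := make([]string, len(userIDs))
	for i, uid := range userIDs {
		parts[i] = fmt.Sprintf("user_id=%q", uid)
	}
	return strings.Join(parts, "||")
}

func main() {
	fmt.Println(buildUserFilter([]string{"u1", "u2"}))
	// → user_id="u1"||user_id="u2"
}
```

Note that %q double-quotes each ID, so IDs containing quotes cannot break out of the filter expression.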
// NotifyUsersWithBook creates an in-app notification for every logged-in user
// who has slug in their library. Per-user errors are ignored rather than
// aborting the loop. Returns the number of notifications created.
func (s *Store) NotifyUsersWithBook(ctx context.Context, slug, title, message, link string) int {
userIDs, err := s.ListUserIDsWithBook(ctx, slug)
if err != nil || len(userIDs) == 0 {
return 0
}
var n int
for _, uid := range userIDs {
if createErr := s.CreateNotification(ctx, uid, title, message, link); createErr == nil {
n++
}
}
return n
}
// ListUserIDsWithBook returns the IDs of all users who have slug in their
// user_library. Used to fan-out new-chapter notifications.
// Admin users and users who have opted out of in-app new-chapter notifications
// (notify_new_chapters=false on app_users) are excluded.
func (s *Store) ListUserIDsWithBook(ctx context.Context, slug string) ([]string, error) {
// Collect user IDs to skip: admins + opted-out users.
skipIDs := make(map[string]bool)
excludedItems, err := s.pb.listAll(ctx, "app_users", `role="admin"||notify_new_chapters=false`, "")
if err == nil {
for _, raw := range excludedItems {
var rec struct {
ID string `json:"id"`
}
if json.Unmarshal(raw, &rec) == nil && rec.ID != "" {
skipIDs[rec.ID] = true
}
}
}
filter := fmt.Sprintf("slug=%q&&user_id!=''", slug)
items, err := s.pb.listAll(ctx, "user_library", filter, "")
if err != nil {
return nil, fmt.Errorf("ListUserIDsWithBook: %w", err)
}
seen := make(map[string]bool)
var ids []string
for _, raw := range items {
var rec struct {
UserID string `json:"user_id"`
}
if json.Unmarshal(raw, &rec) == nil && rec.UserID != "" && !seen[rec.UserID] && !skipIDs[rec.UserID] {
seen[rec.UserID] = true
ids = append(ids, rec.UserID)
}
}
return ids, nil
}


@@ -33,9 +33,18 @@ type Producer interface {
// returns the assigned PocketBase record ID.
CreateTranslationTask(ctx context.Context, slug string, chapter int, lang string) (string, error)
// CreateImportTask inserts a new import task with status=pending and
// returns the assigned PocketBase record ID.
// The task struct must have at minimum Slug, Title, FileType, and ObjectKey set.
CreateImportTask(ctx context.Context, task domain.ImportTask) (string, error)
// CancelTask transitions a pending task to status=cancelled.
// Returns ErrNotFound if the task does not exist.
CancelTask(ctx context.Context, id string) error
// CancelAudioTasksBySlug cancels all pending or running audio tasks for slug.
// Returns the number of tasks cancelled.
CancelAudioTasksBySlug(ctx context.Context, slug string) (int, error)
}
// Consumer is the read/claim side of the task queue used by the runner.
@@ -55,6 +64,11 @@ type Consumer interface {
// Returns (zero, false, nil) when the queue is empty.
ClaimNextTranslationTask(ctx context.Context, workerID string) (domain.TranslationTask, bool, error)
// ClaimNextImportTask atomically finds the oldest pending import task,
// sets its status=running and worker_id=workerID, and returns it.
// Returns (zero, false, nil) when the queue is empty.
ClaimNextImportTask(ctx context.Context, workerID string) (domain.ImportTask, bool, error)
// FinishScrapeTask marks a running scrape task as done and records the result.
FinishScrapeTask(ctx context.Context, id string, result domain.ScrapeResult) error
@@ -64,6 +78,9 @@ type Consumer interface {
// FinishTranslationTask marks a running translation task as done and records the result.
FinishTranslationTask(ctx context.Context, id string, result domain.TranslationResult) error
// FinishImportTask marks a running import task as done and records the result.
FinishImportTask(ctx context.Context, id string, result domain.ImportResult) error
// FailTask marks a task (scrape, audio, or translation) as failed with an error message.
FailTask(ctx context.Context, id, errMsg string) error
@@ -100,4 +117,11 @@ type Reader interface {
// GetTranslationTask returns the most recent translation task for cacheKey.
// Returns (zero, false, nil) if not found.
GetTranslationTask(ctx context.Context, cacheKey string) (domain.TranslationTask, bool, error)
// ListImportTasks returns all import tasks sorted by started descending.
ListImportTasks(ctx context.Context) ([]domain.ImportTask, error)
// GetImportTask returns a single import task by ID.
// Returns (zero, false, nil) if not found.
GetImportTask(ctx context.Context, id string) (domain.ImportTask, bool, error)
}


@@ -26,7 +26,11 @@ func (s *stubStore) CreateAudioTask(_ context.Context, _ string, _ int, _ string
func (s *stubStore) CreateTranslationTask(_ context.Context, _ string, _ int, _ string) (string, error) {
return "translation-1", nil
}
func (s *stubStore) CreateImportTask(_ context.Context, _ domain.ImportTask) (string, error) {
return "import-1", nil
}
func (s *stubStore) CancelTask(_ context.Context, _ string) error { return nil }
func (s *stubStore) CancelAudioTasksBySlug(_ context.Context, _ string) (int, error) { return 0, nil }
func (s *stubStore) ClaimNextScrapeTask(_ context.Context, _ string) (domain.ScrapeTask, bool, error) {
return domain.ScrapeTask{ID: "task-1", Status: domain.TaskStatusRunning}, true, nil
@@ -37,6 +41,9 @@ func (s *stubStore) ClaimNextAudioTask(_ context.Context, _ string) (domain.Audi
func (s *stubStore) ClaimNextTranslationTask(_ context.Context, _ string) (domain.TranslationTask, bool, error) {
return domain.TranslationTask{ID: "translation-1", Status: domain.TaskStatusRunning}, true, nil
}
func (s *stubStore) ClaimNextImportTask(_ context.Context, _ string) (domain.ImportTask, bool, error) {
return domain.ImportTask{ID: "import-1", Status: domain.TaskStatusRunning}, true, nil
}
func (s *stubStore) FinishScrapeTask(_ context.Context, _ string, _ domain.ScrapeResult) error {
return nil
}
@@ -46,6 +53,9 @@ func (s *stubStore) FinishAudioTask(_ context.Context, _ string, _ domain.AudioR
func (s *stubStore) FinishTranslationTask(_ context.Context, _ string, _ domain.TranslationResult) error {
return nil
}
func (s *stubStore) FinishImportTask(_ context.Context, _ string, _ domain.ImportResult) error {
return nil
}
func (s *stubStore) FailTask(_ context.Context, _, _ string) error { return nil }
func (s *stubStore) HeartbeatTask(_ context.Context, _ string) error { return nil }
@@ -68,6 +78,10 @@ func (s *stubStore) ListTranslationTasks(_ context.Context) ([]domain.Translatio
func (s *stubStore) GetTranslationTask(_ context.Context, _ string) (domain.TranslationTask, bool, error) {
return domain.TranslationTask{}, false, nil
}
func (s *stubStore) ListImportTasks(_ context.Context) ([]domain.ImportTask, error) { return nil, nil }
func (s *stubStore) GetImportTask(_ context.Context, _ string) (domain.ImportTask, bool, error) {
return domain.ImportTask{}, false, nil
}
// Verify the stub satisfies all three interfaces at compile time.
var _ taskqueue.Producer = (*stubStore)(nil)


@@ -0,0 +1,147 @@
// Package webpush sends Web Push notifications using the VAPID protocol.
package webpush
import (
"context"
"encoding/json"
"fmt"
"log/slog"
"sync"
webpushgo "github.com/SherClockHolmes/webpush-go"
"github.com/libnovel/backend/internal/storage"
)
// Payload is the JSON body delivered to the browser service worker.
type Payload struct {
Title string `json:"title"`
Body string `json:"body"`
URL string `json:"url,omitempty"`
Icon string `json:"icon,omitempty"`
}
// Sender sends Web Push notifications to subscribed browsers.
type Sender struct {
vapidPublic string
vapidPrivate string
subject string
log *slog.Logger
}
// New returns a Sender configured with the given VAPID key pair.
// subject should be a mailto: or https: contact URL per the VAPID spec.
func New(vapidPublic, vapidPrivate, subject string, log *slog.Logger) *Sender {
if log == nil {
log = slog.Default()
}
return &Sender{
vapidPublic: vapidPublic,
vapidPrivate: vapidPrivate,
subject: subject,
log: log,
}
}
// Enabled returns true when VAPID keys are configured.
func (s *Sender) Enabled() bool {
return s.vapidPublic != "" && s.vapidPrivate != ""
}
// Send delivers payload to all provided subscriptions concurrently.
// Errors for individual subscriptions are logged but do not abort other sends.
// Returns the number of successful sends.
func (s *Sender) Send(ctx context.Context, subs []storage.PushSubscription, p Payload) int {
if !s.Enabled() || len(subs) == 0 {
return 0
}
body, err := json.Marshal(p)
if err != nil {
s.log.Error("webpush: marshal payload", "err", err)
return 0
}
var (
wg sync.WaitGroup
mu sync.Mutex
success int
)
for _, sub := range subs {
sub := sub
wg.Add(1)
go func() {
defer wg.Done()
resp, err := webpushgo.SendNotificationWithContext(ctx, body, &webpushgo.Subscription{
Endpoint: sub.Endpoint,
Keys: webpushgo.Keys{
P256dh: sub.P256DH,
Auth: sub.Auth,
},
}, &webpushgo.Options{
VAPIDPublicKey: s.vapidPublic,
VAPIDPrivateKey: s.vapidPrivate,
Subscriber: s.subject,
TTL: 86400,
})
if err != nil {
s.log.Warn("webpush: send failed", "endpoint", truncate(sub.Endpoint, 60), "err", err)
return
}
defer resp.Body.Close() //nolint:errcheck
if resp.StatusCode >= 400 {
s.log.Warn("webpush: push service returned error",
"endpoint", truncate(sub.Endpoint, 60),
"status", resp.StatusCode)
return
}
mu.Lock()
success++
mu.Unlock()
}()
}
wg.Wait()
return success
}
// SendToBook sends a push notification to all subscribers of the given book.
// store is used to list subscriptions for the book's library followers.
func (s *Sender) SendToBook(ctx context.Context, store *storage.Store, slug string, p Payload) {
if !s.Enabled() {
return
}
subs, err := store.ListPushSubscriptionsByBook(ctx, slug)
if err != nil {
s.log.Warn("webpush: list push subscriptions", "slug", slug, "err", err)
return
}
if len(subs) == 0 {
return
}
n := s.Send(ctx, subs, p)
s.log.Info("webpush: sent chapter notification",
"slug", slug,
"recipients", n,
"total_subs", len(subs),
)
}
// GenerateVAPIDKeys generates a new VAPID key pair and prints them.
// Useful for one-off key generation during setup.
func GenerateVAPIDKeys() (public, private string, err error) {
private, public, err = webpushgo.GenerateVAPIDKeys()
if err != nil {
return "", "", fmt.Errorf("generate VAPID keys: %w", err)
}
return public, private, nil
}
func truncate(s string, n int) string {
if len(s) <= n {
return s
}
return s[:n] + "..."
}


@@ -0,0 +1,431 @@
// Migration 1 — full schema baseline.
//
// Creates all 21 collections that were previously bootstrapped by
// scripts/pb-init-v3.sh. Also creates the initial superuser from the
// POCKETBASE_ADMIN_EMAIL / POCKETBASE_ADMIN_PASSWORD env vars (first run only).
//
// This migration is intentionally idempotent: each collection is skipped if it
// already exists. This makes it safe to apply on an existing install without
// running `migrate history-sync` first — existing collections are left untouched
// and migration 2 still runs to add the three fields that were missing.
package migrations
import (
"os"
"github.com/pocketbase/pocketbase/core"
m "github.com/pocketbase/pocketbase/migrations"
)
func init() {
m.Register(func(app core.App) error {
steps := []func(core.App) error{
createBooks,
createChaptersIdx,
createRanking,
createProgress,
createScrapingTasks,
createAudioJobs,
createAppUsers,
createUserSessions,
createUserLibrary,
createUserSettings,
createUserSubscriptions,
createBookComments,
createCommentVotes,
createTranslationJobs,
createImportTasks,
createNotifications,
createPushSubscriptions,
createAIJobs,
createDiscoveryVotes,
createBookRatings,
createSiteConfig,
createInitialSuperuser,
}
for _, step := range steps {
if err := step(app); err != nil {
return err
}
}
return nil
}, func(app core.App) error {
// Down: drop all collections in safe reverse order.
names := []string{
"site_config", "book_ratings", "discovery_votes", "ai_jobs",
"push_subscriptions", "notifications", "import_tasks",
"translation_jobs", "comment_votes", "book_comments",
"user_subscriptions", "user_settings", "user_library",
"user_sessions", "app_users", "audio_jobs", "scraping_tasks",
"progress", "ranking", "chapters_idx", "books",
}
for _, name := range names {
coll, err := app.FindCollectionByNameOrId(name)
if err != nil {
continue // already absent — safe to skip
}
if err := app.Delete(coll); err != nil {
return err
}
}
return nil
})
}
// ── Helpers ───────────────────────────────────────────────────────────────────
// saveIfAbsent saves the collection only when no collection with that name
// exists yet. This makes the migration safe to run on an existing install
// without history-sync — already-created collections are simply skipped.
func saveIfAbsent(app core.App, c *core.Collection) error {
if _, err := app.FindCollectionByNameOrId(c.Name); err == nil {
return nil // already exists — skip
}
return app.Save(c)
}
// ── Collection creators ───────────────────────────────────────────────────────
func createBooks(app core.App) error {
c := core.NewBaseCollection("books")
c.Fields.Add(
&core.TextField{Name: "slug", Required: true},
&core.TextField{Name: "title", Required: true},
&core.TextField{Name: "author"},
&core.TextField{Name: "cover"},
&core.TextField{Name: "status"},
&core.JSONField{Name: "genres"},
&core.TextField{Name: "summary"},
&core.NumberField{Name: "total_chapters"},
&core.TextField{Name: "source_url"},
&core.NumberField{Name: "ranking"},
&core.TextField{Name: "meta_updated"},
&core.BoolField{Name: "archived"},
)
return saveIfAbsent(app, c)
}
func createChaptersIdx(app core.App) error {
c := core.NewBaseCollection("chapters_idx")
c.Fields.Add(
&core.TextField{Name: "slug", Required: true},
&core.NumberField{Name: "number", Required: true},
&core.TextField{Name: "title"},
)
// Enforce uniqueness on (slug, number) — prevents duplicate chapter entries.
c.AddIndex("idx_chapters_idx_slug_number", true, "slug, number", "")
// Allow fast "recently updated books" queries.
c.AddIndex("idx_chapters_idx_created", false, "created", "")
return saveIfAbsent(app, c)
}
func createRanking(app core.App) error {
c := core.NewBaseCollection("ranking")
c.Fields.Add(
&core.NumberField{Name: "rank", Required: true},
&core.TextField{Name: "slug", Required: true},
&core.TextField{Name: "title"},
&core.TextField{Name: "author"},
&core.TextField{Name: "cover"},
&core.TextField{Name: "status"},
&core.JSONField{Name: "genres"},
&core.TextField{Name: "source_url"},
)
return saveIfAbsent(app, c)
}
func createProgress(app core.App) error {
c := core.NewBaseCollection("progress")
c.Fields.Add(
&core.TextField{Name: "session_id", Required: true},
&core.TextField{Name: "slug", Required: true},
&core.NumberField{Name: "chapter"},
&core.TextField{Name: "user_id"},
&core.NumberField{Name: "audio_time"},
)
return saveIfAbsent(app, c)
}
func createScrapingTasks(app core.App) error {
c := core.NewBaseCollection("scraping_tasks")
c.Fields.Add(
&core.TextField{Name: "kind"},
&core.TextField{Name: "target_url"},
&core.NumberField{Name: "from_chapter"},
&core.NumberField{Name: "to_chapter"},
&core.TextField{Name: "worker_id"},
&core.TextField{Name: "status", Required: true},
&core.NumberField{Name: "books_found"},
&core.NumberField{Name: "chapters_scraped"},
&core.NumberField{Name: "chapters_skipped"},
&core.NumberField{Name: "errors"},
&core.TextField{Name: "error_message"},
&core.DateField{Name: "started"},
&core.DateField{Name: "finished"},
&core.DateField{Name: "heartbeat_at"},
)
return saveIfAbsent(app, c)
}
func createAudioJobs(app core.App) error {
c := core.NewBaseCollection("audio_jobs")
c.Fields.Add(
&core.TextField{Name: "cache_key", Required: true},
&core.TextField{Name: "slug", Required: true},
&core.NumberField{Name: "chapter", Required: true},
&core.TextField{Name: "voice"},
&core.TextField{Name: "worker_id"},
&core.TextField{Name: "status", Required: true},
&core.TextField{Name: "error_message"},
&core.DateField{Name: "started"},
&core.DateField{Name: "finished"},
&core.DateField{Name: "heartbeat_at"},
)
return saveIfAbsent(app, c)
}
func createAppUsers(app core.App) error {
c := core.NewBaseCollection("app_users")
c.Fields.Add(
&core.TextField{Name: "username", Required: true},
&core.TextField{Name: "password_hash"},
&core.TextField{Name: "role"},
&core.TextField{Name: "avatar_url"},
&core.TextField{Name: "email"},
&core.BoolField{Name: "email_verified"},
&core.TextField{Name: "verification_token"},
&core.TextField{Name: "verification_token_exp"},
&core.TextField{Name: "oauth_provider"},
&core.TextField{Name: "oauth_id"},
&core.TextField{Name: "polar_customer_id"},
&core.TextField{Name: "polar_subscription_id"},
&core.BoolField{Name: "notify_new_chapters"},
)
return saveIfAbsent(app, c)
}
func createUserSessions(app core.App) error {
c := core.NewBaseCollection("user_sessions")
c.Fields.Add(
&core.TextField{Name: "user_id", Required: true},
&core.TextField{Name: "session_id", Required: true},
&core.TextField{Name: "user_agent"},
&core.TextField{Name: "ip"},
&core.TextField{Name: "device_fingerprint"},
// created_at is a custom text field (not the system `created` date field).
&core.TextField{Name: "created_at"},
&core.TextField{Name: "last_seen"},
)
return saveIfAbsent(app, c)
}
func createUserLibrary(app core.App) error {
c := core.NewBaseCollection("user_library")
c.Fields.Add(
&core.TextField{Name: "session_id", Required: true},
&core.TextField{Name: "user_id"},
&core.TextField{Name: "slug", Required: true},
&core.TextField{Name: "saved_at"},
&core.TextField{Name: "shelf"},
)
return saveIfAbsent(app, c)
}
func createUserSettings(app core.App) error {
c := core.NewBaseCollection("user_settings")
c.Fields.Add(
&core.TextField{Name: "session_id", Required: true},
&core.TextField{Name: "user_id"},
&core.BoolField{Name: "auto_next"},
&core.TextField{Name: "voice"},
&core.NumberField{Name: "speed"},
&core.TextField{Name: "theme"},
&core.TextField{Name: "locale"},
&core.TextField{Name: "font_family"},
&core.NumberField{Name: "font_size"},
&core.BoolField{Name: "announce_chapter"},
&core.TextField{Name: "audio_mode"},
)
return saveIfAbsent(app, c)
}
func createUserSubscriptions(app core.App) error {
c := core.NewBaseCollection("user_subscriptions")
c.Fields.Add(
&core.TextField{Name: "follower_id", Required: true},
&core.TextField{Name: "followee_id", Required: true},
)
return saveIfAbsent(app, c)
}
func createBookComments(app core.App) error {
c := core.NewBaseCollection("book_comments")
c.Fields.Add(
&core.TextField{Name: "slug", Required: true},
&core.TextField{Name: "user_id"},
&core.TextField{Name: "username"},
&core.TextField{Name: "body"},
&core.NumberField{Name: "upvotes"},
&core.NumberField{Name: "downvotes"},
&core.TextField{Name: "parent_id"},
)
return saveIfAbsent(app, c)
}
func createCommentVotes(app core.App) error {
c := core.NewBaseCollection("comment_votes")
c.Fields.Add(
&core.TextField{Name: "comment_id", Required: true},
&core.TextField{Name: "user_id"},
&core.TextField{Name: "session_id"},
&core.TextField{Name: "vote"},
)
return saveIfAbsent(app, c)
}
func createTranslationJobs(app core.App) error {
c := core.NewBaseCollection("translation_jobs")
c.Fields.Add(
&core.TextField{Name: "cache_key", Required: true},
&core.TextField{Name: "slug", Required: true},
&core.NumberField{Name: "chapter", Required: true},
&core.TextField{Name: "lang", Required: true},
&core.TextField{Name: "worker_id"},
&core.TextField{Name: "status", Required: true},
&core.TextField{Name: "error_message"},
&core.DateField{Name: "started"},
&core.DateField{Name: "finished"},
&core.DateField{Name: "heartbeat_at"},
)
return saveIfAbsent(app, c)
}
func createImportTasks(app core.App) error {
c := core.NewBaseCollection("import_tasks")
c.Fields.Add(
&core.TextField{Name: "slug", Required: true},
&core.TextField{Name: "title", Required: true},
&core.TextField{Name: "file_name"},
&core.TextField{Name: "file_type"},
&core.TextField{Name: "object_key"},
&core.TextField{Name: "chapters_key"},
&core.TextField{Name: "author"},
&core.TextField{Name: "cover_url"},
&core.TextField{Name: "genres"},
&core.TextField{Name: "summary"},
&core.TextField{Name: "book_status"},
&core.TextField{Name: "worker_id"},
&core.TextField{Name: "initiator_user_id"},
&core.TextField{Name: "status", Required: true},
&core.NumberField{Name: "chapters_done"},
&core.NumberField{Name: "chapters_total"},
&core.TextField{Name: "error_message"},
&core.DateField{Name: "started"},
&core.DateField{Name: "finished"},
&core.DateField{Name: "heartbeat_at"},
)
return saveIfAbsent(app, c)
}
func createNotifications(app core.App) error {
c := core.NewBaseCollection("notifications")
c.Fields.Add(
&core.TextField{Name: "user_id", Required: true},
&core.TextField{Name: "title", Required: true},
&core.TextField{Name: "message"},
&core.TextField{Name: "link"},
&core.BoolField{Name: "read"},
)
return saveIfAbsent(app, c)
}
func createPushSubscriptions(app core.App) error {
c := core.NewBaseCollection("push_subscriptions")
c.Fields.Add(
&core.TextField{Name: "user_id", Required: true},
&core.TextField{Name: "endpoint", Required: true},
&core.TextField{Name: "p256dh", Required: true},
&core.TextField{Name: "auth", Required: true},
)
return saveIfAbsent(app, c)
}
func createAIJobs(app core.App) error {
c := core.NewBaseCollection("ai_jobs")
c.Fields.Add(
&core.TextField{Name: "kind", Required: true},
&core.TextField{Name: "slug"},
&core.TextField{Name: "status", Required: true},
&core.NumberField{Name: "from_item"},
&core.NumberField{Name: "to_item"},
&core.NumberField{Name: "items_done"},
&core.NumberField{Name: "items_total"},
&core.TextField{Name: "model"},
&core.TextField{Name: "payload"},
&core.TextField{Name: "error_message"},
&core.DateField{Name: "started"},
&core.DateField{Name: "finished"},
&core.DateField{Name: "heartbeat_at"},
)
return saveIfAbsent(app, c)
}
func createDiscoveryVotes(app core.App) error {
c := core.NewBaseCollection("discovery_votes")
c.Fields.Add(
&core.TextField{Name: "session_id", Required: true},
&core.TextField{Name: "user_id"},
&core.TextField{Name: "slug", Required: true},
&core.TextField{Name: "action", Required: true},
)
return saveIfAbsent(app, c)
}
func createBookRatings(app core.App) error {
c := core.NewBaseCollection("book_ratings")
c.Fields.Add(
&core.TextField{Name: "session_id", Required: true},
&core.TextField{Name: "user_id"},
&core.TextField{Name: "slug", Required: true},
&core.NumberField{Name: "rating", Required: true},
)
return saveIfAbsent(app, c)
}
func createSiteConfig(app core.App) error {
c := core.NewBaseCollection("site_config")
c.Fields.Add(
&core.TextField{Name: "decoration"},
&core.TextField{Name: "logoAnimation"},
&core.TextField{Name: "eventLabel"},
)
return saveIfAbsent(app, c)
}
// createInitialSuperuser creates the first PocketBase superuser from env vars.
// It is a no-op if a superuser with that email already exists, or if the env
// vars are not set. This replaces the superuser bootstrap block in
// scripts/pb-init-v3.sh.
func createInitialSuperuser(app core.App) error {
email := os.Getenv("POCKETBASE_ADMIN_EMAIL")
password := os.Getenv("POCKETBASE_ADMIN_PASSWORD")
if email == "" || password == "" {
return nil
}
existing, _ := app.FindFirstRecordByData("_superusers", "email", email)
if existing != nil {
return nil // superuser already exists
}
superusers, err := app.FindCollectionByNameOrId("_superusers")
if err != nil {
return err
}
record := core.NewRecord(superusers)
record.Set("email", email)
record.Set("password", password)
record.Set("passwordConfirm", password)
return app.Save(record)
}
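Every collection constructor above funnels through `saveIfAbsent`, which is not included in this diff hunk. A hypothetical sketch of what it presumably does, assuming the standard PocketBase v0.23+ `core` API (this is an assumption, not the actual implementation):

```go
// saveIfAbsent — HYPOTHETICAL reconstruction; the real helper is defined
// elsewhere in the migration package. Assumed behavior: create the
// collection only when no collection with that name exists yet, which is
// what makes migration 1 safe to re-run against an existing database.
func saveIfAbsent(app core.App, c *core.Collection) error {
	if existing, _ := app.FindCollectionByNameOrId(c.Name); existing != nil {
		return nil // collection already exists — no-op
	}
	return app.Save(c)
}
```

This matches the idempotency claim in the migration 2 header comment, which relies on migration 1 being a no-op for collections that already exist.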



@@ -0,0 +1,71 @@
// Migration 2 — add fields present in code but absent from pb-init-v3.sh.
//
// Discovered by auditing every PocketBase field access in the Go backend
// and SvelteKit UI against the collection definitions in pb-init-v3.sh:
//
// books.rating (number) — written by WriteMetadata but never defined.
// app_users.notify_new_chapters_push (bool) — used in UI push-notification opt-in.
// book_comments.chapter (number) — used to scope comments to a chapter (0 = book-level).
//
// The check for field existence makes this migration safe to re-apply on
// a fresh install where migration 1 already created the collections without
// these fields.
package migrations
import (
"github.com/pocketbase/pocketbase/core"
m "github.com/pocketbase/pocketbase/migrations"
)
func init() {
m.Register(func(app core.App) error {
type addition struct {
collection string
field core.Field
}
additions := []addition{
{"books", &core.NumberField{Name: "rating"}},
{"app_users", &core.BoolField{Name: "notify_new_chapters_push"}},
{"book_comments", &core.NumberField{Name: "chapter"}},
}
for _, a := range additions {
coll, err := app.FindCollectionByNameOrId(a.collection)
if err != nil {
return err
}
if coll.Fields.GetByName(a.field.GetName()) != nil {
continue // already present — idempotent
}
coll.Fields.Add(a.field)
if err := app.Save(coll); err != nil {
return err
}
}
return nil
}, func(app core.App) error {
type removal struct {
collection string
field string
}
removals := []removal{
{"books", "rating"},
{"app_users", "notify_new_chapters_push"},
{"book_comments", "chapter"},
}
for _, r := range removals {
coll, err := app.FindCollectionByNameOrId(r.collection)
if err != nil {
continue
}
f := coll.Fields.GetByName(r.field)
if f == nil {
continue
}
coll.Fields.RemoveById(f.GetId())
if err := app.Save(coll); err != nil {
return err
}
}
return nil
})
}

docker-bake.hcl Normal file

@@ -0,0 +1,106 @@
# docker-bake.hcl — defines all five production images.
#
# CI passes version info as environment variables; locally everything gets :dev tags.
#
# Local build (no push):
# docker buildx bake
#
# CI environment variables: VERSION, MAJOR_MINOR, COMMIT, BUILD_TIME
variable "DOCKER_USER" { default = "kalekber" }
variable "VERSION" { default = "dev" } # e.g. "4.1.6" (no leading v)
variable "MAJOR_MINOR" { default = "dev" } # e.g. "4.1"
variable "COMMIT" { default = "unknown" }
variable "BUILD_TIME" { default = "" }
# ── Shared defaults ───────────────────────────────────────────────────────────
target "_defaults" {
pull = true
# CI overrides to push=true via --set *.output=type=image,push=true
output = ["type=image,push=false"]
cache-to = ["type=inline"]
}
# ── Go targets (share the backend/ build context + builder stage) ─────────────
target "backend" {
inherits = ["_defaults"]
context = "backend"
target = "backend"
tags = [
"${DOCKER_USER}/libnovel-backend:${VERSION}",
"${DOCKER_USER}/libnovel-backend:${MAJOR_MINOR}",
"${DOCKER_USER}/libnovel-backend:latest",
]
cache-from = ["type=registry,ref=${DOCKER_USER}/libnovel-backend:latest"]
args = {
VERSION = VERSION
COMMIT = COMMIT
}
}
target "runner" {
inherits = ["_defaults"]
context = "backend"
target = "runner"
tags = [
"${DOCKER_USER}/libnovel-runner:${VERSION}",
"${DOCKER_USER}/libnovel-runner:${MAJOR_MINOR}",
"${DOCKER_USER}/libnovel-runner:latest",
]
cache-from = ["type=registry,ref=${DOCKER_USER}/libnovel-runner:latest"]
args = {
VERSION = VERSION
COMMIT = COMMIT
}
}
target "pocketbase" {
inherits = ["_defaults"]
context = "backend"
target = "pocketbase"
tags = [
"${DOCKER_USER}/libnovel-pocketbase:${VERSION}",
"${DOCKER_USER}/libnovel-pocketbase:${MAJOR_MINOR}",
"${DOCKER_USER}/libnovel-pocketbase:latest",
]
cache-from = ["type=registry,ref=${DOCKER_USER}/libnovel-pocketbase:latest"]
}
# ── UI (SvelteKit — separate context) ────────────────────────────────────────
target "ui" {
inherits = ["_defaults"]
context = "ui"
tags = [
"${DOCKER_USER}/libnovel-ui:${VERSION}",
"${DOCKER_USER}/libnovel-ui:${MAJOR_MINOR}",
"${DOCKER_USER}/libnovel-ui:latest",
]
cache-from = ["type=registry,ref=${DOCKER_USER}/libnovel-ui:latest"]
args = {
BUILD_VERSION = VERSION
BUILD_COMMIT = COMMIT
BUILD_TIME = BUILD_TIME
}
}
# ── Caddy (custom plugins — separate context) ─────────────────────────────────
target "caddy" {
inherits = ["_defaults"]
context = "caddy"
tags = [
"${DOCKER_USER}/libnovel-caddy:${VERSION}",
"${DOCKER_USER}/libnovel-caddy:${MAJOR_MINOR}",
"${DOCKER_USER}/libnovel-caddy:latest",
]
cache-from = ["type=registry,ref=${DOCKER_USER}/libnovel-caddy:latest"]
}
# ── Default group: all five images ────────────────────────────────────────────
group "default" {
targets = ["backend", "runner", "pocketbase", "ui", "caddy"]
}
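The header comments translate into invocations like the following sketch. The `--set` syntax for CI pushes is taken from the `_defaults` comment; the concrete version values are illustrative:

```shell
# Local: build all five images with :dev tags, nothing pushed
docker buildx bake

# Build a single target only
docker buildx bake ui

# CI-style: inject version metadata and flip every target's output to push
VERSION=4.1.6 MAJOR_MINOR=4.1 \
COMMIT=$(git rev-parse --short HEAD) \
BUILD_TIME=$(date -u +%Y-%m-%dT%H:%M:%SZ) \
docker buildx bake --set '*.output=type=image,push=true'
```

Because every target inherits `_defaults`, the single `--set '*.output=…'` override is enough to switch the whole group from local image output to a registry push.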


@@ -19,6 +19,9 @@ x-infra-env: &infra-env
MEILI_API_KEY: "${MEILI_MASTER_KEY}"
# Valkey
VALKEY_ADDR: "valkey:6379"
# Cloudflare AI (TTS + image generation)
CFAI_ACCOUNT_ID: "${CFAI_ACCOUNT_ID}"
CFAI_API_TOKEN: "${CFAI_API_TOKEN}"
services:
# ─── MinIO (object storage: chapters, audio, avatars, browse) ────────────────
@@ -55,6 +58,8 @@ services:
mc mb --ignore-existing local/audio;
mc mb --ignore-existing local/avatars;
mc mb --ignore-existing local/catalogue;
mc mb --ignore-existing local/translations;
mc mb --ignore-existing local/imports;
echo 'buckets ready';
"
environment:
@@ -62,12 +67,21 @@ services:
MINIO_ROOT_PASSWORD: "${MINIO_ROOT_PASSWORD}"
# ─── PocketBase (auth + structured data) ─────────────────────────────────────
# Custom binary built from backend/cmd/pocketbase — runs Go migrations on every
# startup before accepting traffic, replacing the old pb-init-v3.sh script.
pocketbase:
image: ghcr.io/muchobien/pocketbase:latest
image: kalekber/libnovel-pocketbase:${GIT_TAG:-latest}
build:
context: ./backend
dockerfile: Dockerfile
target: pocketbase
labels:
com.centurylinklabs.watchtower.enable: "true"
restart: unless-stopped
environment:
PB_ADMIN_EMAIL: "${POCKETBASE_ADMIN_EMAIL}"
PB_ADMIN_PASSWORD: "${POCKETBASE_ADMIN_PASSWORD}"
# Used by migration 1 to create the initial superuser on a fresh install.
POCKETBASE_ADMIN_EMAIL: "${POCKETBASE_ADMIN_EMAIL}"
POCKETBASE_ADMIN_PASSWORD: "${POCKETBASE_ADMIN_PASSWORD}"
# No public port — accessed only by backend/runner on the internal network.
expose:
- "8090"
@@ -77,25 +91,12 @@ services:
test: ["CMD", "wget", "-qO-", "http://localhost:8090/api/health"]
interval: 10s
timeout: 5s
retries: 5
# ─── PocketBase collection bootstrap ─────────────────────────────────────────
pb-init:
image: alpine:3.19
depends_on:
pocketbase:
condition: service_healthy
environment:
POCKETBASE_URL: "http://pocketbase:8090"
POCKETBASE_ADMIN_EMAIL: "${POCKETBASE_ADMIN_EMAIL}"
POCKETBASE_ADMIN_PASSWORD: "${POCKETBASE_ADMIN_PASSWORD}"
volumes:
- ./scripts/pb-init-v3.sh:/pb-init.sh:ro
entrypoint: ["sh", "/pb-init.sh"]
retries: 10
start_period: 30s
# ─── Meilisearch (full-text search) ──────────────────────────────────────────
meilisearch:
image: getmeili/meilisearch:latest
image: getmeili/meilisearch:v1.40.0
restart: unless-stopped
environment:
MEILI_MASTER_KEY: "${MEILI_MASTER_KEY}"
@@ -161,8 +162,6 @@ services:
restart: unless-stopped
stop_grace_period: 35s
depends_on:
pb-init:
condition: service_completed_successfully
pocketbase:
condition: service_healthy
minio:
@@ -179,11 +178,12 @@ services:
environment:
<<: *infra-env
BACKEND_HTTP_ADDR: ":8080"
BACKEND_ADMIN_TOKEN: "${BACKEND_ADMIN_TOKEN}"
LOG_LEVEL: "${LOG_LEVEL}"
KOKORO_URL: "${KOKORO_URL}"
KOKORO_VOICE: "${KOKORO_VOICE}"
POCKET_TTS_URL: "${POCKET_TTS_URL}"
GLITCHTIP_DSN: "${GLITCHTIP_DSN}"
GLITCHTIP_DSN: "${GLITCHTIP_DSN_BACKEND}"
OTEL_EXPORTER_OTLP_ENDPOINT: "${OTEL_EXPORTER_OTLP_ENDPOINT}"
OTEL_SERVICE_NAME: "backend"
# Asynq task queue — backend enqueues jobs to local Redis sidecar.
@@ -215,8 +215,6 @@ services:
restart: unless-stopped
stop_grace_period: 135s
depends_on:
pb-init:
condition: service_completed_successfully
pocketbase:
condition: service_healthy
minio:
@@ -246,7 +244,7 @@ services:
KOKORO_URL: "${KOKORO_URL}"
KOKORO_VOICE: "${KOKORO_VOICE}"
POCKET_TTS_URL: "${POCKET_TTS_URL}"
GLITCHTIP_DSN: "${GLITCHTIP_DSN}"
GLITCHTIP_DSN: "${GLITCHTIP_DSN_RUNNER}"
OTEL_EXPORTER_OTLP_ENDPOINT: "${OTEL_EXPORTER_OTLP_ENDPOINT}"
OTEL_SERVICE_NAME: "runner"
healthcheck:
@@ -270,8 +268,6 @@ services:
restart: unless-stopped
stop_grace_period: 35s
depends_on:
pb-init:
condition: service_completed_successfully
backend:
condition: service_healthy
pocketbase:
@@ -290,6 +286,8 @@ services:
POCKETBASE_ADMIN_EMAIL: "${POCKETBASE_ADMIN_EMAIL}"
POCKETBASE_ADMIN_PASSWORD: "${POCKETBASE_ADMIN_PASSWORD}"
AUTH_SECRET: "${AUTH_SECRET}"
BACKEND_ADMIN_TOKEN: "${BACKEND_ADMIN_TOKEN}"
DEBUG_LOGIN_TOKEN: "${DEBUG_LOGIN_TOKEN}"
PUBLIC_MINIO_PUBLIC_URL: "${MINIO_PUBLIC_ENDPOINT}"
# Valkey
VALKEY_ADDR: "valkey:6379"
@@ -298,14 +296,21 @@ services:
PUBLIC_UMAMI_SCRIPT_URL: "${PUBLIC_UMAMI_SCRIPT_URL}"
# GlitchTip client + server-side error tracking
PUBLIC_GLITCHTIP_DSN: "${PUBLIC_GLITCHTIP_DSN}"
# Grafana Faro RUM (browser Web Vitals, traces, errors)
PUBLIC_FARO_COLLECTOR_URL: "${PUBLIC_FARO_COLLECTOR_URL}"
# OpenTelemetry tracing
OTEL_EXPORTER_OTLP_ENDPOINT: "${OTEL_EXPORTER_OTLP_ENDPOINT}"
OTEL_SERVICE_NAME: "ui"
# Allow large PDF/EPUB uploads (adapter-node default is 512KB)
BODY_SIZE_LIMIT: "52428800"
# OAuth2 providers
GOOGLE_CLIENT_ID: "${GOOGLE_CLIENT_ID}"
GOOGLE_CLIENT_SECRET: "${GOOGLE_CLIENT_SECRET}"
GITHUB_CLIENT_ID: "${GITHUB_CLIENT_ID}"
GITHUB_CLIENT_SECRET: "${GITHUB_CLIENT_SECRET}"
# Polar (subscriptions)
POLAR_API_TOKEN: "${POLAR_API_TOKEN}"
POLAR_WEBHOOK_SECRET: "${POLAR_WEBHOOK_SECRET}"
healthcheck:
test: ["CMD", "wget", "-qO-", "http://127.0.0.1:3000/health"]
interval: 15s


@@ -18,6 +18,7 @@
# uptime.libnovel.cc → uptime-kuma:3001
# push.libnovel.cc → gotify:80
# grafana.libnovel.cc → grafana:3000
# faro.libnovel.cc → alloy:12347
services:
@@ -63,7 +64,7 @@ services:
LIBRETRANSLATE_API_KEY: "${LIBRETRANSLATE_API_KEY}"
# ── Asynq / Redis ─────────────────────────────────────────────────────
REDIS_ADDR: "redis:6379"
REDIS_ADDR: "${REDIS_ADDR}"
REDIS_PASSWORD: "${REDIS_PASSWORD}"
KOKORO_URL: "http://kokoro-fastapi:8880"
@@ -81,10 +82,12 @@ services:
RUNNER_SKIP_INITIAL_CATALOGUE_REFRESH: "true"
LOG_LEVEL: "${LOG_LEVEL}"
GLITCHTIP_DSN: "${GLITCHTIP_DSN}"
GLITCHTIP_DSN: "${GLITCHTIP_DSN_RUNNER}"
# OTel — send runner traces/metrics to the local collector (HTTP)
OTEL_EXPORTER_OTLP_ENDPOINT: "http://otel-collector:4318"
# OTel — send runner traces/logs to Alloy (HTTP)
# Alloy forwards traces → OTel collector → Tempo
# logs → Loki
OTEL_EXPORTER_OTLP_ENDPOINT: "http://alloy:4318"
OTEL_SERVICE_NAME: "runner"
healthcheck:
@@ -168,6 +171,11 @@ services:
VALKEY_URL: "redis://valkey:6379/1"
PORT: "8000"
ENABLE_USER_REGISTRATION: "false"
MEDIA_ROOT: "/code/uploads"
volumes:
- glitchtip_uploads:/code/uploads
# Patch: GzipChunk fallback for sentry-cli 3.x raw zip uploads (GlitchTip bug)
- ./glitchtip/files_api.py:/code/apps/files/api.py:ro
healthcheck:
test: ["CMD", "python3", "-c", "import urllib.request; urllib.request.urlopen('http://localhost:8000/api/0/')"]
interval: 15s
@@ -189,6 +197,11 @@ services:
DEFAULT_FROM_EMAIL: "noreply@libnovel.cc"
VALKEY_URL: "redis://valkey:6379/1"
SERVER_ROLE: "worker"
MEDIA_ROOT: "/code/uploads"
volumes:
- glitchtip_uploads:/code/uploads
# Patch: GzipChunk fallback for sentry-cli 3.x raw zip uploads (GlitchTip bug)
- ./glitchtip/files_api.py:/code/apps/files/api.py:ro
# ── Umami ───────────────────────────────────────────────────────────────────
umami:
@@ -346,6 +359,24 @@ services:
timeout: 5s
retries: 5
# ── Grafana Alloy (Faro RUM receiver) ───────────────────────────────────────
# Receives browser telemetry from @grafana/faro-web-sdk (Web Vitals, traces,
# errors). Exposes POST /collect at faro.libnovel.cc via cloudflared.
# Forwards traces to otel-collector (→ Tempo) and logs to Loki directly.
alloy:
image: grafana/alloy:latest
restart: unless-stopped
command: ["run", "--server.http.listen-addr=0.0.0.0:12348", "/etc/alloy/alloy.river"]
volumes:
- ./otel/alloy.river:/etc/alloy/alloy.river:ro
expose:
- "12347" # Faro HTTP receiver (POST /collect)
- "12348" # Alloy UI / health endpoint
- "4318" # OTLP receiver (HTTP) for backend/runner logs
depends_on:
- otel-collector
- loki
# ── OTel Collector ──────────────────────────────────────────────────────────
# Receives OTLP from backend/ui/runner, fans out to Tempo + Prometheus + Loki.
otel-collector:
@@ -354,9 +385,9 @@ services:
volumes:
- ./otel/collector.yaml:/etc/otelcol-contrib/config.yaml:ro
expose:
- "4317" # OTLP gRPC
- "4318" # OTLP HTTP
- "8888" # Collector self-metrics (scraped by Prometheus)
- "14317" # OTLP gRPC (Alloy forwards traces here)
- "14318" # OTLP HTTP (Alloy forwards traces here)
- "8888" # Collector self-metrics (scraped by Prometheus)
depends_on:
- tempo
- prometheus
@@ -522,3 +553,4 @@ volumes:
grafana_data:
pocket_tts_cache:
hf_cache:
glitchtip_uploads:


@@ -0,0 +1,127 @@
"""Port of sentry.api.endpoints.chunk.ChunkUploadEndpoint"""
import logging
from gzip import GzipFile
from io import BytesIO
from django.conf import settings
from django.shortcuts import aget_object_or_404
from django.urls import reverse
from ninja import File, Router
from ninja.errors import HttpError
from ninja.files import UploadedFile
from apps.organizations_ext.models import Organization
from glitchtip.api.authentication import AuthHttpRequest
from glitchtip.api.decorators import optional_slash
from glitchtip.api.permissions import has_permission
from .models import FileBlob
# Force just one blob
CHUNK_UPLOAD_BLOB_SIZE = 32 * 1024 * 1024 # 32MB
MAX_CHUNKS_PER_REQUEST = 1
MAX_REQUEST_SIZE = CHUNK_UPLOAD_BLOB_SIZE
MAX_CONCURRENCY = 1
HASH_ALGORITHM = "sha1"
CHUNK_UPLOAD_ACCEPT = (
"debug_files", # DIF assemble
"release_files", # Release files assemble
"pdbs", # PDB upload and debug id override
"sources", # Source artifact bundle upload
"artifact_bundles", # Artifact bundles contain debug ids to link source to sourcemaps
"proguard",
)
class GzipChunk(BytesIO):
def __init__(self, file):
raw = file.read()
try:
data = GzipFile(fileobj=BytesIO(raw), mode="rb").read()
except Exception:
# sentry-cli 3.x sends raw (uncompressed) zip data despite gzip being
# advertised by the server — fall back to using the raw bytes as-is.
data = raw
self.size = len(data)
self.name = file.name
super().__init__(data)
router = Router()
@optional_slash(router, "get", "organizations/{slug:organization_slug}/chunk-upload/")
async def get_chunk_upload_info(request: AuthHttpRequest, organization_slug: str):
"""Get server settings for chunk file upload"""
path = reverse("api:get_chunk_upload_info", args=[organization_slug])
url = (
path
if settings.GLITCHTIP_CHUNK_UPLOAD_USE_RELATIVE_URL
else settings.GLITCHTIP_URL.geturl() + path
)
return {
"url": url,
"chunkSize": CHUNK_UPLOAD_BLOB_SIZE,
"chunksPerRequest": MAX_CHUNKS_PER_REQUEST,
"maxFileSize": 2147483648,
"maxRequestSize": MAX_REQUEST_SIZE,
"concurrency": MAX_CONCURRENCY,
"hashAlgorithm": HASH_ALGORITHM,
"compression": ["gzip"],
"accept": CHUNK_UPLOAD_ACCEPT,
}
@optional_slash(router, "post", "organizations/{slug:organization_slug}/chunk-upload/")
@has_permission(["project:write", "project:admin", "project:releases"])
async def chunk_upload(
request: AuthHttpRequest,
organization_slug: str,
file_gzip: list[UploadedFile] = File(...),
):
"""Upload one more more gzipped files to save"""
logger = logging.getLogger("glitchtip.files")
logger.info("chunkupload.start")
organization = await aget_object_or_404(
Organization, slug=organization_slug.lower(), users=request.auth.user_id
)
files = [GzipChunk(chunk) for chunk in file_gzip]
if len(files) == 0:
# No files uploaded is ok
logger.info("chunkupload.end", extra={"status": 200})
return
logger.info("chunkupload.post.files", extra={"len": len(files)})
# Validate file size
checksums = []
size = 0
for chunk in files:
size += chunk.size
if chunk.size > CHUNK_UPLOAD_BLOB_SIZE:
logger.info("chunkupload.end", extra={"status": 400})
raise HttpError(400, "Chunk size too large")
checksums.append(chunk.name)
if size > MAX_REQUEST_SIZE:
logger.info("chunkupload.end", extra={"status": 400})
raise HttpError(400, "Request too large")
if len(files) > MAX_CHUNKS_PER_REQUEST:
logger.info("chunkupload.end", extra={"status": 400})
raise HttpError(400, "Too many chunks")
try:
await FileBlob.from_files(
zip(files, checksums), organization=organization, logger=logger
)
except IOError as err:
logger.info("chunkupload.end", extra={"status": 400})
raise HttpError(400, str(err)) from err
logger.info("chunkupload.end", extra={"status": 200})

homelab/otel/alloy.river Normal file

@@ -0,0 +1,81 @@
// Grafana Alloy — Faro RUM receiver + OTel log bridge
//
// Receives browser telemetry (Web Vitals, traces, logs, exceptions) from the
// LibNovel SvelteKit frontend via the @grafana/faro-web-sdk.
//
// Also receives OTLP logs from the backend and runner services, and forwards
// them to Loki in the native push format (solving the OTLP→Loki gap).
//
// Pipeline:
// faro.receiver → receives HTTP POST /collect from browsers
// otelcol.receiver.otlp → receives OTLP logs from backend/runner (HTTP :4318)
// otelcol.exporter.otlphttp → forwards traces to OTel Collector → Tempo
// loki.write → forwards Faro logs/exceptions to Loki
// otelcol.exporter.loki → forwards OTel logs to Loki (native format)
//
// The Faro endpoint is exposed publicly at faro.libnovel.cc via cloudflared.
faro.receiver "faro" {
server {
listen_address = "0.0.0.0"
listen_port = 12347
cors_allowed_origins = ["https://libnovel.cc", "https://www.libnovel.cc"]
}
output {
logs = [loki.write.faro.receiver]
traces = [otelcol.exporter.otlphttp.faro.input]
}
}
// Receive OTLP traces and logs from backend/runner
otelcol.receiver.otlp "otel_logs" {
http {
endpoint = "0.0.0.0:4318"
}
output {
logs = [otelcol.exporter.loki.otel_logs.input]
traces = [otelcol.exporter.otlphttp.otel_logs.input]
}
}
// Convert OTel logs to Loki format and forward to loki.write
otelcol.exporter.loki "otel_logs" {
forward_to = [loki.write.otel_logs.receiver]
}
// Send backend/runner traces to the OTel Collector → Tempo
otelcol.exporter.otlphttp "otel_logs" {
client {
endpoint = "http://otel-collector:4318"
tls {
insecure = true
}
}
}
// Push backend/runner logs to Loki (native push format)
loki.write "otel_logs" {
endpoint {
url = "http://loki:3100/loki/api/v1/push"
}
}
// Forward Faro traces to the OTel Collector (which routes to Tempo)
otelcol.exporter.otlphttp "faro" {
client {
endpoint = "http://otel-collector:4318"
tls {
insecure = true
}
}
}
// Forward Faro logs/exceptions directly to Loki
loki.write "faro" {
endpoint {
url = "http://loki:3100/loki/api/v1/push"
}
}


@@ -17,7 +17,7 @@ processors:
timeout: 5s
send_batch_size: 512
# Attach host metadata to all telemetry
# Attach host metadata to traces/metrics
resourcedetection:
detectors: [env, system]
timeout: 5s
@@ -53,6 +53,15 @@ extensions:
service:
extensions: [health_check, pprof]
telemetry:
metrics:
# otel-collector v0.103+ replaced `address` with `readers`
readers:
- pull:
exporter:
prometheus:
host: 0.0.0.0
port: 8888
pipelines:
traces:
receivers: [otlp]
@@ -64,5 +73,7 @@ service:
exporters: [prometheusremotewrite]
logs:
receivers: [otlp]
processors: [resourcedetection, batch]
# No resourcedetection — preserve service.name from OTel resource attributes
# (backend=backend, runner=runner, Alloy/Faro=no service.name → unknown_service)
processors: [batch]
exporters: [otlphttp/loki]


@@ -1,7 +1,7 @@
{
"uid": "libnovel-backend",
"title": "Backend API",
"description": "Request rate, error rate, and latency for the LibNovel backend. Powered by Tempo span metrics and UI OTel instrumentation.",
"description": "Request rate, error rate, and latency for the LibNovel backend. Powered by Tempo span metrics.",
"tags": ["libnovel", "backend", "api"],
"timezone": "browser",
"refresh": "30s",
@@ -173,7 +173,7 @@
"targets": [
{
"datasource": { "type": "prometheus", "uid": "prometheus" },
"expr": "sum(rate(http_client_request_duration_seconds_count{job=\"ui\", server_address=\"backend\", http_response_status_code=~\"5..\"}[5m])) * 60",
"expr": "sum(rate(traces_spanmetrics_calls_total{service=\"backend\", status_code=\"STATUS_CODE_ERROR\"}[5m])) * 60",
"legendFormat": "5xx/min",
"instant": true
}
@@ -182,7 +182,7 @@
{
"id": 10,
"type": "timeseries",
"title": "Request Rate by Status",
"title": "Request Rate (total vs errors)",
"gridPos": { "x": 0, "y": 4, "w": 12, "h": 8 },
"options": {
"tooltip": { "mode": "multi" },
@@ -191,27 +191,21 @@
"fieldConfig": {
"defaults": { "unit": "reqps", "custom": { "lineWidth": 2, "fillOpacity": 10 } },
"overrides": [
{ "matcher": { "id": "byFrameRefID", "options": "errors" }, "properties": [{ "id": "color", "value": { "fixedColor": "red", "mode": "fixed" } }] }
{ "matcher": { "id": "byName", "options": "errors" }, "properties": [{ "id": "color", "value": { "fixedColor": "red", "mode": "fixed" } }] }
]
},
"targets": [
{
"refId": "success",
"refId": "total",
"datasource": { "type": "prometheus", "uid": "prometheus" },
"expr": "sum(rate(http_client_request_duration_seconds_count{job=\"ui\", server_address=\"backend\", http_response_status_code=~\"2..\"}[5m]))",
"legendFormat": "2xx"
},
{
"refId": "notfound",
"datasource": { "type": "prometheus", "uid": "prometheus" },
"expr": "sum(rate(http_client_request_duration_seconds_count{job=\"ui\", server_address=\"backend\", http_response_status_code=~\"4..\"}[5m]))",
"legendFormat": "4xx"
"expr": "sum(rate(traces_spanmetrics_calls_total{service=\"backend\"}[5m]))",
"legendFormat": "total"
},
{
"refId": "errors",
"datasource": { "type": "prometheus", "uid": "prometheus" },
"expr": "sum(rate(http_client_request_duration_seconds_count{job=\"ui\", server_address=\"backend\", http_response_status_code=~\"5..\"}[5m]))",
"legendFormat": "5xx"
"expr": "sum(rate(traces_spanmetrics_calls_total{service=\"backend\", status_code=\"STATUS_CODE_ERROR\"}[5m]))",
"legendFormat": "errors"
}
]
},
@@ -248,50 +242,30 @@
{
"id": 12,
"type": "timeseries",
"title": "Requests / min by HTTP method (UI → Backend)",
"title": "Request Rate by Span Name (top operations)",
"gridPos": { "x": 0, "y": 12, "w": 12, "h": 8 },
"description": "Throughput broken down by HTTP route / span name from Tempo span metrics.",
"options": {
"tooltip": { "mode": "multi" },
"legend": { "displayMode": "list", "placement": "bottom" }
},
"fieldConfig": {
"defaults": { "unit": "short", "custom": { "lineWidth": 2, "fillOpacity": 5 } }
"defaults": { "unit": "reqps", "custom": { "lineWidth": 2, "fillOpacity": 5 } }
},
"targets": [
{
"datasource": { "type": "prometheus", "uid": "prometheus" },
"expr": "sum(rate(http_client_request_duration_seconds_count{job=\"ui\", server_address=\"backend\"}[5m])) by (http_request_method) * 60",
"legendFormat": "{{http_request_method}}"
"expr": "topk(10, sum(rate(traces_spanmetrics_calls_total{service=\"backend\"}[5m])) by (span_name))",
"legendFormat": "{{span_name}}"
}
]
},
{
"id": 13,
"type": "timeseries",
"title": "Requests / min — UI → PocketBase",
"title": "Latency by Span Name (p95)",
"gridPos": { "x": 12, "y": 12, "w": 12, "h": 8 },
"description": "Traffic from SvelteKit server to PocketBase (auth, collections, etc.).",
"options": {
"tooltip": { "mode": "multi" },
"legend": { "displayMode": "list", "placement": "bottom" }
},
"fieldConfig": {
"defaults": { "unit": "short", "custom": { "lineWidth": 2, "fillOpacity": 5 } }
},
"targets": [
{
"datasource": { "type": "prometheus", "uid": "prometheus" },
"expr": "sum(rate(http_client_request_duration_seconds_count{job=\"ui\", server_address=\"pocketbase\"}[5m])) by (http_request_method, http_response_status_code) * 60",
"legendFormat": "{{http_request_method}} {{http_response_status_code}}"
}
]
},
{
"id": 14,
"type": "timeseries",
"title": "UI → Backend Latency (p50 / p95)",
"gridPos": { "x": 0, "y": 20, "w": 12, "h": 8 },
"description": "HTTP client latency as seen from the SvelteKit SSR layer calling backend.",
"description": "p95 latency per operation — helps identify slow endpoints.",
"options": {
"tooltip": { "mode": "multi" },
"legend": { "displayMode": "list", "placement": "bottom" }
@@ -302,13 +276,8 @@
"targets": [
{
"datasource": { "type": "prometheus", "uid": "prometheus" },
"expr": "histogram_quantile(0.50, sum(rate(http_client_request_duration_seconds_bucket{job=\"ui\", server_address=\"backend\"}[5m])) by (le))",
"legendFormat": "p50"
},
{
"datasource": { "type": "prometheus", "uid": "prometheus" },
"expr": "histogram_quantile(0.95, sum(rate(http_client_request_duration_seconds_bucket{job=\"ui\", server_address=\"backend\"}[5m])) by (le))",
"legendFormat": "p95"
"expr": "topk(10, histogram_quantile(0.95, sum(rate(traces_spanmetrics_latency_bucket{service=\"backend\"}[5m])) by (le, span_name)))",
"legendFormat": "{{span_name}}"
}
]
},
@@ -316,7 +285,7 @@
"id": 20,
"type": "logs",
"title": "Backend Errors",
"gridPos": { "x": 0, "y": 28, "w": 24, "h": 10 },
"gridPos": { "x": 0, "y": 20, "w": 24, "h": 10 },
"options": {
"showTime": true,
"showLabels": false,
@@ -329,7 +298,7 @@
"targets": [
{
"datasource": { "type": "loki", "uid": "loki" },
"expr": "{service_name=\"backend\"} | json | level =~ `(WARN|ERROR|error|warn)`",
"expr": "{service_name=\"backend\"}",
"legendFormat": ""
}
]
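The panels above swap span-level Loki filtering for plain stream selectors, so it is worth sanity-checking the new LogQL before it lands in a dashboard. A minimal sketch of building a Loki `/query_range` request URL by hand, assuming a locally reachable Loki (the `localhost:3100` base URL is an assumption, not from this repo):

```python
from urllib.parse import urlencode

def loki_query_range_url(base: str, logql: str, start: str, end: str, limit: int = 100) -> str:
    """Build a Loki /loki/api/v1/query_range URL for manually testing a LogQL expression."""
    params = urlencode({"query": logql, "start": start, "end": end, "limit": limit})
    return f"{base}/loki/api/v1/query_range?{params}"

# The stream selector used by the "Backend Errors" panel after this change:
url = loki_query_range_url(
    "http://localhost:3100",  # assumed local Loki; adjust for your stack
    '{service_name="backend"}',
    "2026-04-16T00:00:00Z",
    "2026-04-16T01:00:00Z",
)
print(url)
```

Paste the printed URL into curl or a browser to confirm the selector matches streams before trusting an empty panel.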

View File

@@ -1,7 +1,7 @@
{
"uid": "libnovel-catalogue",
"title": "Catalogue & Content Progress",
"description": "Scraping progress, audio generation coverage, and catalogue health derived from runner structured logs.",
"description": "Scraping progress from runner OTel logs in Loki. Logs are JSON: body=message, attributes.slug/chapters/page=fields.",
"tags": ["libnovel", "catalogue", "content"],
"timezone": "browser",
"refresh": "1m",
@@ -12,9 +12,9 @@
"id": 1,
"type": "stat",
"title": "Books Scraped (last 24h)",
"description": "Count of unique book slugs appearing in successful scrape task completions.",
"description": "Count of unique slugs from chapter list fetched messages.",
"gridPos": { "x": 0, "y": 0, "w": 4, "h": 4 },
"options": { "reduceOptions": { "calcs": ["sum"] }, "colorMode": "value", "graphMode": "none" },
"options": { "reduceOptions": { "calcs": ["lastNotNull"] }, "colorMode": "value", "graphMode": "none" },
"fieldConfig": {
"defaults": {
"color": { "fixedColor": "blue", "mode": "fixed" },
@@ -24,7 +24,7 @@
"targets": [
{
"datasource": { "type": "loki", "uid": "loki" },
"expr": "sum_over_time({service_name=\"runner\"} | json | msg=`scrape task done` [24h])",
"expr": "count(count_over_time({service_name=\"runner\"} | json | body=\"chapter list fetched\" [24h])) by (attributes_slug)",
"legendFormat": "books scraped"
}
]
@@ -33,8 +33,9 @@
"id": 2,
"type": "stat",
"title": "Chapters Scraped (last 24h)",
"description": "Count of 'chapter list fetched' events.",
"gridPos": { "x": 4, "y": 0, "w": 4, "h": 4 },
"options": { "reduceOptions": { "calcs": ["sum"] }, "colorMode": "value", "graphMode": "none" },
"options": { "reduceOptions": { "calcs": ["lastNotNull"] }, "colorMode": "value", "graphMode": "none" },
"fieldConfig": {
"defaults": {
"color": { "fixedColor": "blue", "mode": "fixed" },
@@ -44,17 +45,18 @@
"targets": [
{
"datasource": { "type": "loki", "uid": "loki" },
"expr": "sum_over_time({service_name=\"runner\"} | json | unwrap scraped [24h])",
"legendFormat": "chapters scraped"
"expr": "sum(count_over_time({service_name=\"runner\"} | json | body=\"chapter list fetched\" [24h]))",
"legendFormat": "chapter lists fetched"
}
]
},
{
"id": 3,
"type": "stat",
"title": "Audio Jobs Completed (last 24h)",
"title": "Metadata Saved (last 24h)",
"description": "Count of 'metadata saved' events.",
"gridPos": { "x": 8, "y": 0, "w": 4, "h": 4 },
"options": { "reduceOptions": { "calcs": ["sum"] }, "colorMode": "value", "graphMode": "none" },
"options": { "reduceOptions": { "calcs": ["lastNotNull"] }, "colorMode": "value", "graphMode": "none" },
"fieldConfig": {
"defaults": {
"color": { "fixedColor": "green", "mode": "fixed" },
@@ -64,43 +66,18 @@
"targets": [
{
"datasource": { "type": "loki", "uid": "loki" },
"expr": "sum_over_time({service_name=\"runner\"} | json | msg=`audio task done` [24h])",
"legendFormat": "audio done"
"expr": "sum(count_over_time({service_name=\"runner\"} | json | body=\"metadata saved\" [24h]))",
"legendFormat": "metadata saved"
}
]
},
{
"id": 4,
"type": "stat",
"title": "Audio Jobs Failed (last 24h)",
"gridPos": { "x": 12, "y": 0, "w": 4, "h": 4 },
"options": { "reduceOptions": { "calcs": ["sum"] }, "colorMode": "background", "graphMode": "none" },
"fieldConfig": {
"defaults": {
"thresholds": {
"mode": "absolute",
"steps": [
{ "color": "green", "value": null },
{ "color": "yellow", "value": 1 },
{ "color": "red", "value": 5 }
]
}
}
},
"targets": [
{
"datasource": { "type": "loki", "uid": "loki" },
"expr": "sum_over_time({service_name=\"runner\"} | json | msg=`audio task failed` [24h])",
"legendFormat": "audio failed"
}
]
},
{
"id": 5,
"type": "stat",
"title": "Scrape Errors (last 24h)",
"gridPos": { "x": 16, "y": 0, "w": 4, "h": 4 },
"options": { "reduceOptions": { "calcs": ["sum"] }, "colorMode": "background", "graphMode": "none" },
"description": "Count of error severity logs from the runner.",
"gridPos": { "x": 12, "y": 0, "w": 4, "h": 4 },
"options": { "reduceOptions": { "calcs": ["lastNotNull"] }, "colorMode": "background", "graphMode": "none" },
"fieldConfig": {
"defaults": {
"thresholds": {
@@ -116,97 +93,93 @@
"targets": [
{
"datasource": { "type": "loki", "uid": "loki" },
"expr": "sum_over_time({service_name=\"runner\"} | json | msg=`scrape task failed` [24h])",
"legendFormat": "scrape errors"
"expr": "sum(count_over_time({service_name=\"runner\"} | json | severity=\"ERROR\" [24h]))",
"legendFormat": "errors"
}
]
},
{
"id": 6,
"id": 5,
"type": "stat",
"title": "Catalogue Refresh — Books Indexed",
"description": "Total books indexed in the last catalogue refresh cycle (from the ok field in the summary log).",
"gridPos": { "x": 20, "y": 0, "w": 4, "h": 4 },
"options": { "reduceOptions": { "calcs": ["lastNotNull"] }, "colorMode": "value", "graphMode": "none" },
"title": "Rate Limited (last 24h)",
"description": "Count of rate limiting events from Novelfire.",
"gridPos": { "x": 16, "y": 0, "w": 4, "h": 4 },
"options": { "reduceOptions": { "calcs": ["lastNotNull"] }, "colorMode": "background", "graphMode": "none" },
"fieldConfig": {
"defaults": {
"color": { "fixedColor": "purple", "mode": "fixed" },
"thresholds": { "mode": "absolute", "steps": [] }
"thresholds": {
"mode": "absolute",
"steps": [
{ "color": "green", "value": null },
{ "color": "yellow", "value": 5 },
{ "color": "red", "value": 50 }
]
}
}
},
"targets": [
{
"datasource": { "type": "loki", "uid": "loki" },
"expr": "last_over_time({service_name=\"runner\"} | json | op=`catalogue_refresh` | msg=`catalogue refresh done` | unwrap ok [7d])",
"legendFormat": "indexed"
"expr": "sum(count_over_time({service_name=\"runner\"} | json | body=~\"rate limit\" [24h]))",
"legendFormat": "rate limited"
}
]
},
{
"id": 10,
"type": "timeseries",
"title": "Audio Generation Rate (tasks/min)",
"title": "Scrape Rate (books/min)",
"description": "Rate of events per minute.",
"gridPos": { "x": 0, "y": 4, "w": 12, "h": 8 },
"description": "Rate of audio task completions and failures over time.",
"options": {
"tooltip": { "mode": "multi" },
"legend": { "displayMode": "list", "placement": "bottom" }
},
"options": { "tooltip": { "mode": "multi" }, "legend": { "displayMode": "list", "placement": "bottom" } },
"fieldConfig": {
"defaults": { "unit": "short", "custom": { "lineWidth": 2, "fillOpacity": 10 } },
"overrides": [
{ "matcher": { "id": "byName", "options": "failed" }, "properties": [{ "id": "color", "value": { "fixedColor": "red", "mode": "fixed" } }] },
{ "matcher": { "id": "byName", "options": "completed" }, "properties": [{ "id": "color", "value": { "fixedColor": "green", "mode": "fixed" } }] }
]
"defaults": { "unit": "short", "custom": { "lineWidth": 2, "fillOpacity": 10 } }
},
"targets": [
{
"datasource": { "type": "prometheus", "uid": "prometheus" },
"expr": "sum(rate(traces_spanmetrics_calls_total{service=\"runner\", span_name=\"runner.audio_task\", status_code!=\"STATUS_CODE_ERROR\"}[5m])) * 60",
"legendFormat": "completed"
"datasource": { "type": "loki", "uid": "loki" },
"expr": "sum(rate({service_name=\"runner\"} | json | body=\"chapter list fetched\" [5m])) * 60",
"legendFormat": "books/min"
},
{
"datasource": { "type": "prometheus", "uid": "prometheus" },
"expr": "sum(rate(traces_spanmetrics_calls_total{service=\"runner\", span_name=\"runner.audio_task\", status_code=\"STATUS_CODE_ERROR\"}[5m])) * 60",
"legendFormat": "failed"
"datasource": { "type": "loki", "uid": "loki" },
"expr": "sum(rate({service_name=\"runner\"} | json | body=\"metadata saved\" [5m])) * 60",
"legendFormat": "metadata/min"
}
]
},
{
"id": 11,
"type": "timeseries",
"title": "Scraping Rate (tasks/min)",
"title": "Error Rate (errors/min)",
"description": "Rate of error and rate-limit messages over time.",
"gridPos": { "x": 12, "y": 4, "w": 12, "h": 8 },
"description": "Rate of scrape task completions and failures over time.",
"options": {
"tooltip": { "mode": "multi" },
"legend": { "displayMode": "list", "placement": "bottom" }
},
"options": { "tooltip": { "mode": "multi" }, "legend": { "displayMode": "list", "placement": "bottom" } },
"fieldConfig": {
"defaults": { "unit": "short", "custom": { "lineWidth": 2, "fillOpacity": 10 } },
"overrides": [
{ "matcher": { "id": "byName", "options": "failed" }, "properties": [{ "id": "color", "value": { "fixedColor": "red", "mode": "fixed" } }] },
{ "matcher": { "id": "byName", "options": "completed" }, "properties": [{ "id": "color", "value": { "fixedColor": "blue", "mode": "fixed" } }] }
{ "matcher": { "id": "byName", "options": "errors/min" }, "properties": [{ "id": "color", "value": { "fixedColor": "red", "mode": "fixed" } }] },
{ "matcher": { "id": "byName", "options": "rate-limit/min" }, "properties": [{ "id": "color", "value": { "fixedColor": "orange", "mode": "fixed" } }] }
]
},
"targets": [
{
"datasource": { "type": "prometheus", "uid": "prometheus" },
"expr": "sum(rate(traces_spanmetrics_calls_total{service=\"runner\", span_name=\"runner.scrape_task\", status_code!=\"STATUS_CODE_ERROR\"}[5m])) * 60",
"legendFormat": "completed"
"datasource": { "type": "loki", "uid": "loki" },
"expr": "sum(rate({service_name=\"runner\"} | json | severity=\"ERROR\" [5m])) * 60",
"legendFormat": "errors/min"
},
{
"datasource": { "type": "prometheus", "uid": "prometheus" },
"expr": "sum(rate(traces_spanmetrics_calls_total{service=\"runner\", span_name=\"runner.scrape_task\", status_code=\"STATUS_CODE_ERROR\"}[5m])) * 60",
"legendFormat": "failed"
"datasource": { "type": "loki", "uid": "loki" },
"expr": "sum(rate({service_name=\"runner\"} | json | body=~\"rate limit\" [5m])) * 60",
"legendFormat": "rate-limit/min"
}
]
},
{
"id": 20,
"type": "logs",
"title": "Scrape Task Events",
"description": "One log line per completed or failed scrape task. Fields: task_id, kind, url, scraped, skipped, errors.",
"title": "Runner Logs (errors & warnings)",
"description": "Runner log lines containing errors or warnings.",
"gridPos": { "x": 0, "y": 12, "w": 24, "h": 10 },
"options": {
"showTime": true,
@@ -220,7 +193,7 @@
"targets": [
{
"datasource": { "type": "loki", "uid": "loki" },
"expr": "{service_name=\"runner\"} | json | msg =~ `scrape task (done|failed|starting)`",
"expr": "{service_name=\"runner\"} | json | severity=~\"ERROR|WARN\"",
"legendFormat": ""
}
]
@@ -228,12 +201,12 @@
{
"id": 21,
"type": "logs",
"title": "Audio Task Events",
"description": "One log line per completed or failed audio task. Fields: task_id, slug, chapter, voice, key (on success), reason (on failure).",
"title": "Runner Logs (all)",
"description": "All runner log entries.",
"gridPos": { "x": 0, "y": 22, "w": 24, "h": 10 },
"options": {
"showTime": true,
"showLabels": false,
"showLabels": true,
"wrapLogMessage": false,
"prettifyLogMessage": true,
"enableLogDetails": true,
@@ -243,30 +216,7 @@
"targets": [
{
"datasource": { "type": "loki", "uid": "loki" },
"expr": "{service_name=\"runner\"} | json | msg =~ `audio task (done|failed|starting)`",
"legendFormat": ""
}
]
},
{
"id": 22,
"type": "logs",
"title": "Catalogue Refresh Progress",
"description": "Progress logs from the background catalogue refresh (every 24h). Fields: op=catalogue_refresh, scraped, ok, skipped, errors.",
"gridPos": { "x": 0, "y": 32, "w": 24, "h": 8 },
"options": {
"showTime": true,
"showLabels": false,
"wrapLogMessage": false,
"prettifyLogMessage": true,
"enableLogDetails": true,
"sortOrder": "Descending",
"dedupStrategy": "none"
},
"targets": [
{
"datasource": { "type": "loki", "uid": "loki" },
"expr": "{service_name=\"runner\"} | json | op=`catalogue_refresh`",
"expr": "{service_name=\"runner\"}",
"legendFormat": ""
}
]
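The catalogue stats above count unique slugs and event totals out of JSON runner logs. A small Python sketch of the same aggregation over sample log lines (the payloads below are made up for illustration; real OTel records carry more fields):

```python
import json
from collections import Counter

# Hypothetical runner log lines as they land in Loki: body is the message,
# the slug and chapter fields sit under attributes.
lines = [
    '{"body": "chapter list fetched", "attributes": {"slug": "shadow-slave", "chapters": 120}}',
    '{"body": "chapter list fetched", "attributes": {"slug": "shadow-slave", "chapters": 121}}',
    '{"body": "chapter list fetched", "attributes": {"slug": "lord-of-mysteries", "chapters": 1430}}',
    '{"body": "metadata saved", "attributes": {"slug": "shadow-slave"}}',
]

# Mirrors: count_over_time({service_name="runner"} | json | body="chapter list fetched" [24h])
# grouped by attributes_slug.
per_slug = Counter(
    json.loads(line)["attributes"]["slug"]
    for line in lines
    if json.loads(line)["body"] == "chapter list fetched"
)
unique_books = len(per_slug)            # "Books Scraped" counts distinct slugs
total_fetches = sum(per_slug.values())  # "Chapters Scraped" counts the events themselves
print(unique_books, total_fetches)      # → 2 3
```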

View File

@@ -35,7 +35,7 @@
"targets": [
{
"datasource": { "type": "prometheus", "uid": "prometheus" },
"expr": "libnovel_runner_tasks_running",
"expr": "runner_tasks_running",
"legendFormat": "running",
"instant": true
}
@@ -61,7 +61,7 @@
"targets": [
{
"datasource": { "type": "prometheus", "uid": "prometheus" },
"expr": "libnovel_runner_tasks_completed_total",
"expr": "runner_tasks_completed_total",
"legendFormat": "completed",
"instant": true
}
@@ -93,7 +93,7 @@
"targets": [
{
"datasource": { "type": "prometheus", "uid": "prometheus" },
"expr": "libnovel_runner_tasks_failed_total",
"expr": "runner_tasks_failed_total",
"legendFormat": "failed",
"instant": true
}
@@ -126,7 +126,7 @@
"targets": [
{
"datasource": { "type": "prometheus", "uid": "prometheus" },
"expr": "libnovel_runner_uptime_seconds",
"expr": "runner_uptime_seconds",
"legendFormat": "uptime",
"instant": true
}
@@ -159,7 +159,7 @@
"targets": [
{
"datasource": { "type": "prometheus", "uid": "prometheus" },
"expr": "libnovel_runner_tasks_failed_total / clamp_min(libnovel_runner_tasks_completed_total + libnovel_runner_tasks_failed_total, 1)",
"expr": "runner_tasks_failed_total / clamp_min(runner_tasks_completed_total + runner_tasks_failed_total, 1)",
"legendFormat": "failure rate",
"instant": true
}
@@ -215,17 +215,17 @@
"targets": [
{
"datasource": { "type": "prometheus", "uid": "prometheus" },
"expr": "rate(libnovel_runner_tasks_completed_total[5m]) * 60",
"expr": "rate(runner_tasks_completed_total[5m]) * 60",
"legendFormat": "completed"
},
{
"datasource": { "type": "prometheus", "uid": "prometheus" },
"expr": "rate(libnovel_runner_tasks_failed_total[5m]) * 60",
"expr": "rate(runner_tasks_failed_total[5m]) * 60",
"legendFormat": "failed"
},
{
"datasource": { "type": "prometheus", "uid": "prometheus" },
"expr": "libnovel_runner_tasks_running",
"expr": "runner_tasks_running",
"legendFormat": "running"
}
]
@@ -345,7 +345,7 @@
"targets": [
{
"datasource": { "type": "loki", "uid": "loki" },
"expr": "{service_name=\"runner\"} | json | level =~ `(WARN|ERROR|error|warn)`",
"expr": "{service_name=\"runner\"}",
"legendFormat": ""
}
]
@@ -368,7 +368,7 @@
"targets": [
{
"datasource": { "type": "loki", "uid": "loki" },
"expr": "{service_name=\"runner\"} | json",
"expr": "{service_name=\"runner\"}",
"legendFormat": ""
}
]
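The failure-rate panel guards its denominator with `clamp_min(..., 1)` so a freshly started runner with zero tasks reads 0 instead of NaN from a division by zero. The arithmetic, sketched in Python:

```python
def failure_rate(failed: float, completed: float) -> float:
    """Mirror of the panel expression:
    runner_tasks_failed_total / clamp_min(runner_tasks_completed_total + runner_tasks_failed_total, 1)
    max(..., 1) plays the role of clamp_min, keeping the denominator at least 1."""
    return failed / max(completed + failed, 1)

print(failure_rate(0, 0))   # → 0.0 (no division by zero on a fresh runner)
print(failure_rate(5, 95))  # → 0.05
```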

View File

@@ -0,0 +1,804 @@
{
"uid": "libnovel-web-vitals",
"title": "Web Vitals (RUM)",
"description": "Core Web Vitals from @grafana/faro-web-sdk. Data: browser \u2192 Alloy faro.receiver \u2192 Loki ({service_name=unknown_service}). Log format: key=value pairs, e.g. lcp=767.000000 fcp=767.000000. Use | regexp to extract.",
"tags": [
"libnovel",
"frontend",
"rum",
"web-vitals"
],
"timezone": "browser",
"refresh": "1m",
"time": {
"from": "now-24h",
"to": "now"
},
"schemaVersion": 39,
"panels": [
{
"id": 1,
"type": "stat",
"title": "LCP \u2014 p75 (Largest Contentful Paint)",
"description": "Good < 2.5s, needs improvement < 4s, poor >= 4s. Source: Loki {service_name=unknown_service} Faro measurements.",
"gridPos": {
"x": 0,
"y": 0,
"w": 4,
"h": 4
},
"options": {
"reduceOptions": {
"calcs": [
"lastNotNull"
]
},
"colorMode": "background",
"graphMode": "none"
},
"fieldConfig": {
"defaults": {
"unit": "ms",
"decimals": 0,
"thresholds": {
"mode": "absolute",
"steps": [
{
"color": "green",
"value": null
},
{
"color": "yellow",
"value": 2500
},
{
"color": "red",
"value": 4000
}
]
}
}
},
"targets": [
{
"datasource": {
"type": "loki",
"uid": "loki"
},
"expr": "quantile_over_time(0.75, {service_name=\"unknown_service\"} |= \"kind=measurement\" |= \"type=web-vitals\" | regexp `lcp=(?P<lcp>\\d+\\.?\\d*)` | unwrap lcp [1h])",
"legendFormat": "LCP p75",
"instant": true
}
]
},
{
"id": 2,
"type": "stat",
"title": "INP \u2014 p75 (Interaction to Next Paint)",
"description": "Good < 200ms, needs improvement < 500ms, poor >= 500ms.",
"gridPos": {
"x": 4,
"y": 0,
"w": 4,
"h": 4
},
"options": {
"reduceOptions": {
"calcs": [
"lastNotNull"
]
},
"colorMode": "background",
"graphMode": "none"
},
"fieldConfig": {
"defaults": {
"unit": "ms",
"decimals": 0,
"thresholds": {
"mode": "absolute",
"steps": [
{
"color": "green",
"value": null
},
{
"color": "yellow",
"value": 200
},
{
"color": "red",
"value": 500
}
]
}
}
},
"targets": [
{
"datasource": {
"type": "loki",
"uid": "loki"
},
"expr": "quantile_over_time(0.75, {service_name=\"unknown_service\"} |= \"kind=measurement\" |= \"type=web-vitals\" | regexp `inp=(?P<inp>\\d+\\.?\\d*)` | unwrap inp [1h])",
"legendFormat": "INP p75",
"instant": true
}
]
},
{
"id": 3,
"type": "stat",
"title": "CLS \u2014 p75 (Cumulative Layout Shift)",
"description": "Good < 0.1, needs improvement < 0.25, poor >= 0.25.",
"gridPos": {
"x": 8,
"y": 0,
"w": 4,
"h": 4
},
"options": {
"reduceOptions": {
"calcs": [
"lastNotNull"
]
},
"colorMode": "background",
"graphMode": "none"
},
"fieldConfig": {
"defaults": {
"unit": "short",
"decimals": 3,
"thresholds": {
"mode": "absolute",
"steps": [
{
"color": "green",
"value": null
},
{
"color": "yellow",
"value": 0.1
},
{
"color": "red",
"value": 0.25
}
]
}
}
},
"targets": [
{
"datasource": {
"type": "loki",
"uid": "loki"
},
"expr": "quantile_over_time(0.75, {service_name=\"unknown_service\"} |= \"kind=measurement\" |= \"type=web-vitals\" | regexp `cls=(?P<cls>\\d+\\.?\\d*)` | unwrap cls [1h])",
"legendFormat": "CLS p75",
"instant": true
}
]
},
{
"id": 4,
"type": "stat",
"title": "TTFB \u2014 p75 (Time to First Byte)",
"description": "Good < 800ms, needs improvement < 1800ms, poor >= 1800ms.",
"gridPos": {
"x": 12,
"y": 0,
"w": 4,
"h": 4
},
"options": {
"reduceOptions": {
"calcs": [
"lastNotNull"
]
},
"colorMode": "background",
"graphMode": "none"
},
"fieldConfig": {
"defaults": {
"unit": "ms",
"decimals": 0,
"thresholds": {
"mode": "absolute",
"steps": [
{
"color": "green",
"value": null
},
{
"color": "yellow",
"value": 800
},
{
"color": "red",
"value": 1800
}
]
}
}
},
"targets": [
{
"datasource": {
"type": "loki",
"uid": "loki"
},
"expr": "quantile_over_time(0.75, {service_name=\"unknown_service\"} |= \"kind=measurement\" |= \"type=web-vitals\" | regexp `ttfb=(?P<ttfb>\\d+\\.?\\d*)` | unwrap ttfb [1h])",
"legendFormat": "TTFB p75",
"instant": true
}
]
},
{
"id": 5,
"type": "stat",
"title": "FCP \u2014 p75 (First Contentful Paint)",
"description": "Good < 1.8s, needs improvement < 3s, poor >= 3s.",
"gridPos": {
"x": 16,
"y": 0,
"w": 4,
"h": 4
},
"options": {
"reduceOptions": {
"calcs": [
"lastNotNull"
]
},
"colorMode": "background",
"graphMode": "none"
},
"fieldConfig": {
"defaults": {
"unit": "ms",
"decimals": 0,
"thresholds": {
"mode": "absolute",
"steps": [
{
"color": "green",
"value": null
},
{
"color": "yellow",
"value": 1800
},
{
"color": "red",
"value": 3000
}
]
}
}
},
"targets": [
{
"datasource": {
"type": "loki",
"uid": "loki"
},
"expr": "quantile_over_time(0.75, {service_name=\"unknown_service\"} |= \"kind=measurement\" |= \"type=web-vitals\" | regexp `fcp=(?P<fcp>\\d+\\.?\\d*)` | unwrap fcp [1h])",
"legendFormat": "FCP p75",
"instant": true
}
]
},
{
"id": 6,
"type": "stat",
"title": "Measurements / min",
"description": "Number of Faro measurement events in the last 5 minutes (activity indicator).",
"gridPos": {
"x": 20,
"y": 0,
"w": 4,
"h": 4
},
"options": {
"reduceOptions": {
"calcs": [
"lastNotNull"
]
},
"colorMode": "value",
"graphMode": "area"
},
"fieldConfig": {
"defaults": {
"unit": "short",
"thresholds": {
"mode": "absolute",
"steps": [
{
"color": "green",
"value": null
}
]
}
}
},
"targets": [
{
"datasource": {
"type": "loki",
"uid": "loki"
},
"expr": "sum(count_over_time({service_name=\"unknown_service\"} |= \"kind=measurement\" |= \"type=web-vitals\" [5m]))",
"legendFormat": "measurements",
"instant": true
}
]
},
{
"id": 10,
"type": "timeseries",
"title": "LCP over time (p50 / p75 / p95)",
"gridPos": {
"x": 0,
"y": 4,
"w": 12,
"h": 8
},
"options": {
"tooltip": {
"mode": "multi"
},
"legend": {
"displayMode": "list",
"placement": "bottom"
}
},
"fieldConfig": {
"defaults": {
"unit": "ms",
"custom": {
"lineWidth": 2,
"fillOpacity": 10
}
},
"overrides": [
{
"matcher": {
"id": "byName",
"options": "Good (2.5s)"
},
"properties": [
{
"id": "color",
"value": {
"fixedColor": "green",
"mode": "fixed"
}
},
{
"id": "custom.lineStyle",
"value": {
"fill": "dash",
"dash": [
4,
4
]
}
}
]
},
{
"matcher": {
"id": "byName",
"options": "Poor (4s)"
},
"properties": [
{
"id": "color",
"value": {
"fixedColor": "red",
"mode": "fixed"
}
},
{
"id": "custom.lineStyle",
"value": {
"fill": "dash",
"dash": [
4,
4
]
}
}
]
}
]
},
"targets": [
{
"datasource": {
"type": "loki",
"uid": "loki"
},
"expr": "quantile_over_time(0.50, {service_name=\"unknown_service\"} |= \"kind=measurement\" |= \"type=web-vitals\" | regexp `lcp=(?P<lcp>\\d+\\.?\\d*)` | unwrap lcp [5m])",
"legendFormat": "p50"
},
{
"datasource": {
"type": "loki",
"uid": "loki"
},
"expr": "quantile_over_time(0.75, {service_name=\"unknown_service\"} |= \"kind=measurement\" |= \"type=web-vitals\" | regexp `lcp=(?P<lcp>\\d+\\.?\\d*)` | unwrap lcp [5m])",
"legendFormat": "p75"
},
{
"datasource": {
"type": "loki",
"uid": "loki"
},
"expr": "quantile_over_time(0.95, {service_name=\"unknown_service\"} |= \"kind=measurement\" |= \"type=web-vitals\" | regexp `lcp=(?P<lcp>\\d+\\.?\\d*)` | unwrap lcp [5m])",
"legendFormat": "p95"
},
{
"datasource": {
"type": "prometheus",
"uid": "prometheus"
},
"expr": "2500",
"legendFormat": "Good (2.5s)"
},
{
"datasource": {
"type": "prometheus",
"uid": "prometheus"
},
"expr": "4000",
"legendFormat": "Poor (4s)"
}
]
},
{
"id": 11,
"type": "timeseries",
"title": "TTFB over time (p50 / p75 / p95)",
"gridPos": {
"x": 12,
"y": 4,
"w": 12,
"h": 8
},
"options": {
"tooltip": {
"mode": "multi"
},
"legend": {
"displayMode": "list",
"placement": "bottom"
}
},
"fieldConfig": {
"defaults": {
"unit": "ms",
"custom": {
"lineWidth": 2,
"fillOpacity": 10
}
}
},
"targets": [
{
"datasource": {
"type": "loki",
"uid": "loki"
},
"expr": "quantile_over_time(0.50, {service_name=\"unknown_service\"} |= \"kind=measurement\" |= \"type=web-vitals\" | regexp `ttfb=(?P<ttfb>\\d+\\.?\\d*)` | unwrap ttfb [5m])",
"legendFormat": "p50"
},
{
"datasource": {
"type": "loki",
"uid": "loki"
},
"expr": "quantile_over_time(0.75, {service_name=\"unknown_service\"} |= \"kind=measurement\" |= \"type=web-vitals\" | regexp `ttfb=(?P<ttfb>\\d+\\.?\\d*)` | unwrap ttfb [5m])",
"legendFormat": "p75"
},
{
"datasource": {
"type": "loki",
"uid": "loki"
},
"expr": "quantile_over_time(0.95, {service_name=\"unknown_service\"} |= \"kind=measurement\" |= \"type=web-vitals\" | regexp `ttfb=(?P<ttfb>\\d+\\.?\\d*)` | unwrap ttfb [5m])",
"legendFormat": "p95"
}
]
},
{
"id": 20,
"type": "logs",
"title": "Frontend Errors & Exceptions",
"description": "JS exceptions captured by Faro. kind=exception events.",
"gridPos": {
"x": 0,
"y": 12,
"w": 24,
"h": 10
},
"options": {
"showTime": true,
"showLabels": true,
"wrapLogMessage": true,
"prettifyLogMessage": true,
"enableLogDetails": true,
"sortOrder": "Descending",
"dedupStrategy": "none"
},
"targets": [
{
"datasource": {
"type": "loki",
"uid": "loki"
},
"expr": "{service_name=\"unknown_service\"} | regexp `(?P<kind>\\w+)` | kind = \"exception\"",
"legendFormat": ""
}
]
},
{
"id": 21,
"type": "logs",
"title": "Web Vitals Measurements",
"description": "All Faro measurement events.",
"gridPos": {
"x": 0,
"y": 22,
"w": 24,
"h": 10
},
"options": {
"showTime": true,
"showLabels": true,
"wrapLogMessage": false,
"prettifyLogMessage": true,
"enableLogDetails": true,
"sortOrder": "Descending",
"dedupStrategy": "none"
},
"targets": [
{
"datasource": {
"type": "loki",
"uid": "loki"
},
"expr": "{service_name=\"unknown_service\"} | regexp `(?P<kind>\\w+)` | kind = \"measurement\"",
"legendFormat": ""
}
]
},
{
"id": 30,
"type": "row",
"title": "API Performance (Upstream Requests)",
"gridPos": { "x": 0, "y": 32, "w": 24, "h": 1 },
"collapsed": false
},
{
"id": 31,
"type": "timeseries",
"title": "API Request Duration — p50 / p75 / p95 by endpoint",
"description": "Duration of all libnovel.cc/api/* fetch requests captured by Faro faro.performance.resource events. Values in ms.",
"gridPos": { "x": 0, "y": 33, "w": 24, "h": 10 },
"options": {
"tooltip": { "mode": "multi" },
"legend": { "displayMode": "table", "placement": "bottom", "calcs": ["mean", "max", "lastNotNull"] }
},
"fieldConfig": {
"defaults": {
"unit": "ms",
"custom": { "lineWidth": 2, "fillOpacity": 5 }
}
},
"targets": [
{
"datasource": { "type": "loki", "uid": "loki" },
"expr": "quantile_over_time(0.50, {service_name=\"unknown_service\"} |= \"faro.performance.resource\" |= \"libnovel.cc/api/progress/audio-time\" | regexp `event_data_duration=(?P<dur>[0-9.]+)` | unwrap dur [5m])",
"legendFormat": "p50 /api/progress/audio-time"
},
{
"datasource": { "type": "loki", "uid": "loki" },
"expr": "quantile_over_time(0.95, {service_name=\"unknown_service\"} |= \"faro.performance.resource\" |= \"libnovel.cc/api/progress/audio-time\" | regexp `event_data_duration=(?P<dur>[0-9.]+)` | unwrap dur [5m])",
"legendFormat": "p95 /api/progress/audio-time"
},
{
"datasource": { "type": "loki", "uid": "loki" },
"expr": "quantile_over_time(0.50, {service_name=\"unknown_service\"} |= \"faro.performance.resource\" |= \"libnovel.cc/api/presign/audio\" | regexp `event_data_duration=(?P<dur>[0-9.]+)` | unwrap dur [5m])",
"legendFormat": "p50 /api/presign/audio"
},
{
"datasource": { "type": "loki", "uid": "loki" },
"expr": "quantile_over_time(0.95, {service_name=\"unknown_service\"} |= \"faro.performance.resource\" |= \"libnovel.cc/api/presign/audio\" | regexp `event_data_duration=(?P<dur>[0-9.]+)` | unwrap dur [5m])",
"legendFormat": "p95 /api/presign/audio"
},
{
"datasource": { "type": "loki", "uid": "loki" },
"expr": "quantile_over_time(0.50, {service_name=\"unknown_service\"} |= \"faro.performance.resource\" |= \"libnovel.cc/api/progress\" !~ \"audio-time\" | regexp `event_data_duration=(?P<dur>[0-9.]+)` | unwrap dur [5m])",
"legendFormat": "p50 /api/progress"
},
{
"datasource": { "type": "loki", "uid": "loki" },
"expr": "quantile_over_time(0.95, {service_name=\"unknown_service\"} |= \"faro.performance.resource\" |= \"libnovel.cc/api/progress\" !~ \"audio-time\" | regexp `event_data_duration=(?P<dur>[0-9.]+)` | unwrap dur [5m])",
"legendFormat": "p95 /api/progress"
},
{
"datasource": { "type": "loki", "uid": "loki" },
"expr": "quantile_over_time(0.50, {service_name=\"unknown_service\"} |= \"faro.performance.resource\" |= \"libnovel.cc/api/comments\" | regexp `event_data_duration=(?P<dur>[0-9.]+)` | unwrap dur [5m])",
"legendFormat": "p50 /api/comments"
},
{
"datasource": { "type": "loki", "uid": "loki" },
"expr": "quantile_over_time(0.95, {service_name=\"unknown_service\"} |= \"faro.performance.resource\" |= \"libnovel.cc/api/comments\" | regexp `event_data_duration=(?P<dur>[0-9.]+)` | unwrap dur [5m])",
"legendFormat": "p95 /api/comments"
},
{
"datasource": { "type": "loki", "uid": "loki" },
"expr": "quantile_over_time(0.50, {service_name=\"unknown_service\"} |= \"faro.performance.resource\" |= \"libnovel.cc/api/settings\" | regexp `event_data_duration=(?P<dur>[0-9.]+)` | unwrap dur [5m])",
"legendFormat": "p50 /api/settings"
},
{
"datasource": { "type": "loki", "uid": "loki" },
"expr": "quantile_over_time(0.95, {service_name=\"unknown_service\"} |= \"faro.performance.resource\" |= \"libnovel.cc/api/settings\" | regexp `event_data_duration=(?P<dur>[0-9.]+)` | unwrap dur [5m])",
"legendFormat": "p95 /api/settings"
},
{
"datasource": { "type": "loki", "uid": "loki" },
"expr": "quantile_over_time(0.50, {service_name=\"unknown_service\"} |= \"faro.performance.resource\" |= \"libnovel.cc/api/catalogue-page\" | regexp `event_data_duration=(?P<dur>[0-9.]+)` | unwrap dur [5m])",
"legendFormat": "p50 /api/catalogue-page"
},
{
"datasource": { "type": "loki", "uid": "loki" },
"expr": "quantile_over_time(0.95, {service_name=\"unknown_service\"} |= \"faro.performance.resource\" |= \"libnovel.cc/api/catalogue-page\" | regexp `event_data_duration=(?P<dur>[0-9.]+)` | unwrap dur [5m])",
"legendFormat": "p95 /api/catalogue-page"
}
]
},
{
"id": 32,
"type": "barchart",
"title": "API Avg Duration — last 1h",
"description": "Average duration per endpoint over the last hour. Useful for spotting the slowest APIs at a glance.",
"gridPos": { "x": 0, "y": 43, "w": 12, "h": 8 },
"options": {
"orientation": "horizontal",
"legend": { "displayMode": "list", "placement": "bottom" },
"tooltip": { "mode": "single" },
"xTickLabelRotation": 0
},
"fieldConfig": {
"defaults": { "unit": "ms", "color": { "mode": "palette-classic" } }
},
"targets": [
{
"datasource": { "type": "loki", "uid": "loki" },
"expr": "avg_over_time({service_name=\"unknown_service\"} |= \"faro.performance.resource\" |= \"libnovel.cc/api/progress/audio-time\" | regexp `event_data_duration=(?P<dur>[0-9.]+)` | unwrap dur [1h])",
"legendFormat": "/api/progress/audio-time",
"instant": true
},
{
"datasource": { "type": "loki", "uid": "loki" },
"expr": "avg_over_time({service_name=\"unknown_service\"} |= \"faro.performance.resource\" |= \"libnovel.cc/api/presign/audio\" | regexp `event_data_duration=(?P<dur>[0-9.]+)` | unwrap dur [1h])",
"legendFormat": "/api/presign/audio",
"instant": true
},
{
"datasource": { "type": "loki", "uid": "loki" },
"expr": "avg_over_time({service_name=\"unknown_service\"} |= \"faro.performance.resource\" |= \"libnovel.cc/api/progress\" !~ \"audio-time\" | regexp `event_data_duration=(?P<dur>[0-9.]+)` | unwrap dur [1h])",
"legendFormat": "/api/progress",
"instant": true
},
{
"datasource": { "type": "loki", "uid": "loki" },
"expr": "avg_over_time({service_name=\"unknown_service\"} |= \"faro.performance.resource\" |= \"libnovel.cc/api/comments\" | regexp `event_data_duration=(?P<dur>[0-9.]+)` | unwrap dur [1h])",
"legendFormat": "/api/comments",
"instant": true
},
{
"datasource": { "type": "loki", "uid": "loki" },
"expr": "avg_over_time({service_name=\"unknown_service\"} |= \"faro.performance.resource\" |= \"libnovel.cc/api/settings\" | regexp `event_data_duration=(?P<dur>[0-9.]+)` | unwrap dur [1h])",
"legendFormat": "/api/settings",
"instant": true
},
{
"datasource": { "type": "loki", "uid": "loki" },
"expr": "avg_over_time({service_name=\"unknown_service\"} |= \"faro.performance.resource\" |= \"libnovel.cc/api/catalogue-page\" | regexp `event_data_duration=(?P<dur>[0-9.]+)` | unwrap dur [1h])",
"legendFormat": "/api/catalogue-page",
"instant": true
}
]
},
{
"id": 33,
"type": "stat",
"title": "Slowest API call — p95 last 1h",
"description": "p95 duration of the single slowest endpoint in the last hour.",
"gridPos": { "x": 12, "y": 43, "w": 6, "h": 4 },
"options": {
"reduceOptions": { "calcs": ["lastNotNull"] },
"colorMode": "background",
"graphMode": "none"
},
"fieldConfig": {
"defaults": {
"unit": "ms",
"decimals": 0,
"thresholds": {
"mode": "absolute",
"steps": [
{ "color": "green", "value": null },
{ "color": "yellow", "value": 500 },
{ "color": "red", "value": 1000 }
]
}
}
},
"targets": [
{
"datasource": { "type": "loki", "uid": "loki" },
"expr": "max(quantile_over_time(0.95, {service_name=\"unknown_service\"} |= \"faro.performance.resource\" |= \"libnovel.cc/api\" | regexp `event_data_duration=(?P<dur>[0-9.]+)` | unwrap dur [1h]))",
"legendFormat": "p95 max",
"instant": true
}
]
},
{
"id": 34,
"type": "stat",
"title": "API Requests / min",
"description": "Rate of libnovel.cc API requests captured by Faro in the last 5 minutes.",
"gridPos": { "x": 18, "y": 43, "w": 6, "h": 4 },
"options": {
"reduceOptions": { "calcs": ["lastNotNull"] },
"colorMode": "value",
"graphMode": "area"
},
"fieldConfig": {
"defaults": {
"unit": "short",
"thresholds": { "mode": "absolute", "steps": [{ "color": "green", "value": null }] }
}
},
"targets": [
{
"datasource": { "type": "loki", "uid": "loki" },
"expr": "sum(count_over_time({service_name=\"unknown_service\"} |= \"faro.performance.resource\" |= \"libnovel.cc/api\" [5m])) / 5",
"legendFormat": "req/min",
"instant": true
}
]
},
{
"id": 35,
"type": "logs",
"title": "Slow API Requests (>500ms)",
"description": "Individual faro.performance.resource events where duration > 500ms. Useful for debugging outliers.",
"gridPos": { "x": 0, "y": 47, "w": 24, "h": 8 },
"options": {
"showTime": true,
"showLabels": false,
"wrapLogMessage": false,
"prettifyLogMessage": false,
"enableLogDetails": true,
"sortOrder": "Descending",
"dedupStrategy": "none"
},
"targets": [
{
"datasource": { "type": "loki", "uid": "loki" },
"expr": "{service_name=\"unknown_service\"} |= \"faro.performance.resource\" |= \"libnovel.cc/api\" | regexp `event_data_duration=(?P<dur>[0-9.]+)` | unwrap dur | dur > 500",
"legendFormat": ""
}
]
}
]
}

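The three stat panels above all reduce raw `event_data_duration` values with LogQL. The same reductions can be sanity-checked offline; a minimal Python sketch, using made-up durations and a nearest-rank percentile (Loki's `quantile_over_time` interpolates, so real panel values will differ slightly):

```python
import math

# Hypothetical faro.performance.resource durations (ms) from a 5-minute window.
durations_ms = [120, 95, 340, 80, 1250, 610, 75, 430, 220, 990]

def p_quantile(values, q):
    # Nearest-rank percentile; an approximation of what the p95 panels report.
    ranked = sorted(values)
    rank = max(1, math.ceil(q * len(ranked)))
    return ranked[rank - 1]

# Mirrors count_over_time(...[5m]) / 5 from the "API Requests / min" panel.
req_per_min = len(durations_ms) / 5

# Threshold colouring copied from the p95 panel: 500 ms yellow, 1000 ms red.
def threshold_color(ms):
    if ms >= 1000:
        return "red"
    if ms >= 500:
        return "yellow"
    return "green"
```

With these numbers, `p_quantile(durations_ms, 0.95)` lands on the 1250 ms outlier, which the panel's thresholds would paint red.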

@@ -13,6 +13,11 @@
# - RUNNER_SKIP_INITIAL_CATALOGUE_REFRESH=true
# - REDIS_ADDR → rediss://redis.libnovel.cc:6380 (prod Redis via Caddy TLS proxy)
# - LibreTranslate service for machine translation (internal network only)
#
# extra_hosts pins storage.libnovel.cc and pb.libnovel.cc to the prod server IP
# (165.22.70.138) so that large PutObject uploads and PocketBase writes bypass
# Cloudflare's 100-second proxy timeout entirely. TLS still terminates at Caddy
# on prod; the TLS certificate is valid for the domain names so SNI works fine.
services:
libretranslate:
@@ -28,20 +33,21 @@ services:
volumes:
- libretranslate_models:/home/libretranslate/.local/share/argos-translate
- libretranslate_db:/app/db
healthcheck:
test: ["CMD-SHELL", "curl -sf http://localhost:5000/languages || exit 1"]
interval: 30s
timeout: 10s
retries: 5
start_period: 120s
runner:
image: kalekber/libnovel-runner:latest
restart: unless-stopped
stop_grace_period: 135s
labels:
- "com.centurylinklabs.watchtower.enable=true"
depends_on:
libretranslate:
condition: service_healthy
- libretranslate
# Pin prod subdomains to the prod server IP to bypass Cloudflare's 100s
# proxy timeout. Large MP3 PutObject uploads and PocketBase writes go
# directly to Caddy on prod; TLS and SNI still work normally.
extra_hosts:
- "storage.libnovel.cc:165.22.70.138"
- "pb.libnovel.cc:165.22.70.138"
environment:
# ── PocketBase ──────────────────────────────────────────────────────────
POCKETBASE_URL: "https://pb.libnovel.cc"
@@ -70,6 +76,10 @@ services:
# ── Pocket TTS ──────────────────────────────────────────────────────────
POCKET_TTS_URL: "${POCKET_TTS_URL}"
# ── Cloudflare Workers AI TTS ────────────────────────────────────────────
CFAI_ACCOUNT_ID: "${CFAI_ACCOUNT_ID}"
CFAI_API_TOKEN: "${CFAI_API_TOKEN}"
# ── LibreTranslate (internal Docker network) ────────────────────────────
LIBRETRANSLATE_URL: "http://libretranslate:5000"
LIBRETRANSLATE_API_KEY: "${LIBRETRANSLATE_API_KEY}"
@@ -92,7 +102,7 @@ services:
# ── Observability ───────────────────────────────────────────────────────
LOG_LEVEL: "${LOG_LEVEL}"
GLITCHTIP_DSN: "${GLITCHTIP_DSN}"
GLITCHTIP_DSN: "${GLITCHTIP_DSN_RUNNER}"
healthcheck:
test: ["CMD", "/healthcheck", "file", "/tmp/runner.alive", "120"]


@@ -56,14 +56,14 @@ build-svc svc:
# Push all custom images to Docker Hub (requires docker login)
push:
{{doppler}} docker compose push backend runner ui caddy
{{doppler}} docker compose push backend runner ui caddy pocketbase
# Build then push all custom images
build-push: build push
# Pull all images from Docker Hub (uses GIT_TAG from Doppler)
pull-images:
{{doppler}} docker compose pull backend runner ui caddy
{{doppler}} docker compose pull backend runner ui caddy pocketbase
# Pull all third-party base images (minio, pocketbase, etc.)
pull-infra:
@@ -122,6 +122,13 @@ secrets-env:
secrets-dashboard:
doppler open dashboard
# ── Developer setup ───────────────────────────────────────────────────────────
# One-time dev setup: configure git to use committed hooks in .githooks/
setup:
git config core.hooksPath .githooks
@echo "Git hooks configured (.githooks/pre-commit active)."
# ── Gitea CI ──────────────────────────────────────────────────────────────────
# Validate workflow files


@@ -62,6 +62,39 @@ create() {
esac
}
# add_index COLLECTION INDEX_NAME SQL_EXPR
# Fetches current schema, adds index if absent by name, PATCHes collection.
add_index() {
COLL="$1"; INAME="$2"; ISQL="$3"
SCHEMA=$(curl -sf -H "Authorization: Bearer $TOK" "$PB/api/collections/$COLL" 2>/dev/null)
PARSED=$(echo "$SCHEMA" | python3 -c "
import sys, json
d = json.load(sys.stdin)
indexes = d.get('indexes', [])
exists = any('$INAME' in idx for idx in indexes)
print('exists=' + str(exists))
print('id=' + d.get('id', ''))
if not exists:
indexes.append('$ISQL')
print('indexes=' + json.dumps(indexes))
" 2>/dev/null)
if echo "$PARSED" | grep -q "^exists=True"; then
log "index exists (skip): $COLL.$INAME"; return
fi
COLL_ID=$(echo "$PARSED" | grep "^id=" | sed 's/^id=//')
[ -z "$COLL_ID" ] && { log "WARNING: cannot resolve id for $COLL"; return; }
NEW_INDEXES=$(echo "$PARSED" | grep "^indexes=" | sed 's/^indexes=//')
STATUS=$(curl -s -o /dev/null -w "%{http_code}" \
-X PATCH "$PB/api/collections/$COLL_ID" \
-H "Content-Type: application/json" \
-H "Authorization: Bearer $TOK" \
-d "{\"indexes\":${NEW_INDEXES}}")
case "$STATUS" in
200|201) log "added index: $COLL.$INAME" ;;
*) log "WARNING: add_index $COLL.$INAME returned $STATUS" ;;
esac
}
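The check-then-append logic inside `add_index` can be isolated as a pure function. A minimal Python sketch (the function name is illustrative; it mirrors the script's check-by-name-substring behaviour):

```python
def add_index_if_absent(schema, index_name, index_sql):
    """Return (changed, indexes) for a PocketBase collection schema dict.
    'indexes' is a list of raw CREATE INDEX statements; like the shell
    helper, presence is detected by the index name appearing in any entry."""
    indexes = list(schema.get("indexes", []))
    if any(index_name in idx for idx in indexes):
        return False, indexes
    indexes.append(index_sql)
    return True, indexes
```

Running it twice with the same name is a no-op the second time, which is what makes the migration script safe to re-run on an existing install.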
# add_field COLLECTION FIELD_NAME FIELD_TYPE
# Fetches current schema, appends field if absent, PATCHes collection.
# Requires python3 for safe JSON manipulation.
@@ -111,14 +144,16 @@ create "books" '{
{"name":"total_chapters","type":"number"},
{"name":"source_url", "type":"text"},
{"name":"ranking", "type":"number"},
{"name":"meta_updated", "type":"text"}
{"name":"meta_updated", "type":"text"},
{"name":"archived", "type":"bool"}
]}'
create "chapters_idx" '{
"name":"chapters_idx","type":"base","fields":[
{"name":"slug", "type":"text", "required":true},
{"name":"number","type":"number", "required":true},
{"name":"title", "type":"text"}
{"name":"slug", "type":"text", "required":true},
{"name":"number", "type":"number", "required":true},
{"name":"title", "type":"text"},
{"name":"created", "type":"date"}
]}'
create "ranking" '{
@@ -190,14 +225,15 @@ create "app_users" '{
{"name":"oauth_id", "type":"text"}
]}'
create "user_sessions" '{
create "user_sessions" '{
"name":"user_sessions","type":"base","fields":[
{"name":"user_id", "type":"text","required":true},
{"name":"session_id","type":"text","required":true},
{"name":"user_agent","type":"text"},
{"name":"ip", "type":"text"},
{"name":"created_at","type":"text"},
{"name":"last_seen", "type":"text"}
{"name":"user_id", "type":"text","required":true},
{"name":"session_id", "type":"text","required":true},
{"name":"user_agent", "type":"text"},
{"name":"ip", "type":"text"},
{"name":"device_fingerprint", "type":"text"},
{"name":"created_at", "type":"text"},
{"name":"last_seen", "type":"text"}
]}'
create "user_library" '{
@@ -210,12 +246,18 @@ create "user_library" '{
create "user_settings" '{
"name":"user_settings","type":"base","fields":[
{"name":"session_id","type":"text","required":true},
{"name":"user_id", "type":"text"},
{"name":"auto_next","type":"bool"},
{"name":"voice", "type":"text"},
{"name":"speed", "type":"number"},
{"name":"updated", "type":"text"}
{"name":"session_id", "type":"text", "required":true},
{"name":"user_id", "type":"text"},
{"name":"auto_next", "type":"bool"},
{"name":"voice", "type":"text"},
{"name":"speed", "type":"number"},
{"name":"theme", "type":"text"},
{"name":"locale", "type":"text"},
{"name":"font_family", "type":"text"},
{"name":"font_size", "type":"number"},
{"name":"announce_chapter","type":"bool"},
{"name":"audio_mode", "type":"text"},
{"name":"updated", "type":"text"}
]}'
create "user_subscriptions" '{
@@ -259,6 +301,88 @@ create "translation_jobs" '{
{"name":"heartbeat_at", "type":"date"}
]}'
create "import_tasks" '{
"name":"import_tasks","type":"base","fields":[
{"name":"slug", "type":"text", "required":true},
{"name":"title", "type":"text", "required":true},
{"name":"file_name", "type":"text"},
{"name":"file_type", "type":"text"},
{"name":"object_key", "type":"text"},
{"name":"chapters_key", "type":"text"},
{"name":"author", "type":"text"},
{"name":"cover_url", "type":"text"},
{"name":"genres", "type":"text"},
{"name":"summary", "type":"text"},
{"name":"book_status", "type":"text"},
{"name":"worker_id", "type":"text"},
{"name":"initiator_user_id", "type":"text"},
{"name":"status", "type":"text", "required":true},
{"name":"chapters_done", "type":"number"},
{"name":"chapters_total", "type":"number"},
{"name":"error_message", "type":"text"},
{"name":"started", "type":"date"},
{"name":"finished", "type":"date"},
{"name":"heartbeat_at", "type":"date"}
]}'
create "notifications" '{
"name":"notifications","type":"base","fields":[
{"name":"user_id", "type":"text","required":true},
{"name":"title", "type":"text","required":true},
{"name":"message", "type":"text"},
{"name":"link", "type":"text"},
{"name":"read", "type":"bool"},
{"name":"created", "type":"date"}
]}'
create "push_subscriptions" '{
"name":"push_subscriptions","type":"base","fields":[
{"name":"user_id", "type":"text","required":true},
{"name":"endpoint", "type":"text","required":true},
{"name":"p256dh", "type":"text","required":true},
{"name":"auth", "type":"text","required":true}
]}'
create "ai_jobs" '{
"name":"ai_jobs","type":"base","fields":[
{"name":"kind", "type":"text", "required":true},
{"name":"slug", "type":"text"},
{"name":"status", "type":"text", "required":true},
{"name":"from_item", "type":"number"},
{"name":"to_item", "type":"number"},
{"name":"items_done", "type":"number"},
{"name":"items_total", "type":"number"},
{"name":"model", "type":"text"},
{"name":"payload", "type":"text"},
{"name":"error_message", "type":"text"},
{"name":"started", "type":"date"},
{"name":"finished", "type":"date"},
{"name":"heartbeat_at", "type":"date"}
]}'
create "discovery_votes" '{
"name":"discovery_votes","type":"base","fields":[
{"name":"session_id","type":"text","required":true},
{"name":"user_id", "type":"text"},
{"name":"slug", "type":"text","required":true},
{"name":"action", "type":"text","required":true}
]}'
create "book_ratings" '{
"name":"book_ratings","type":"base","fields":[
{"name":"session_id","type":"text", "required":true},
{"name":"user_id", "type":"text"},
{"name":"slug", "type":"text", "required":true},
{"name":"rating", "type":"number", "required":true}
]}'
create "site_config" '{
"name":"site_config","type":"base","fields":[
{"name":"decoration", "type":"text"},
{"name":"logoAnimation", "type":"text"},
{"name":"eventLabel", "type":"text"}
]}'
# ── 5. Field migrations (idempotent — adds fields missing from older installs) ─
add_field "scraping_tasks" "heartbeat_at" "date"
add_field "audio_jobs" "heartbeat_at" "date"
@@ -274,5 +398,22 @@ add_field "app_users" "oauth_provider" "text"
add_field "app_users" "oauth_id" "text"
add_field "app_users" "polar_customer_id" "text"
add_field "app_users" "polar_subscription_id" "text"
add_field "user_library" "shelf" "text"
add_field "user_sessions" "device_fingerprint" "text"
add_field "chapters_idx" "created" "date"
add_field "user_settings" "theme" "text"
add_field "user_settings" "locale" "text"
add_field "user_settings" "font_family" "text"
add_field "user_settings" "font_size" "number"
add_field "user_settings" "announce_chapter" "bool"
add_field "user_settings" "audio_mode" "text"
add_field "books" "archived" "bool"
add_field "app_users" "notify_new_chapters" "bool"
# ── 6. Indexes ────────────────────────────────────────────────────────────────
add_index "chapters_idx" "idx_chapters_idx_slug_number" \
"CREATE UNIQUE INDEX idx_chapters_idx_slug_number ON chapters_idx (slug, number)"
add_index "chapters_idx" "idx_chapters_idx_created" \
"CREATE INDEX idx_chapters_idx_created ON chapters_idx (created)"
log "done"

ui/.gitignore vendored

@@ -22,5 +22,4 @@ Thumbs.db
vite.config.js.timestamp-*
vite.config.ts.timestamp-*
# Generated by CI at build time — do not commit
/static/releases.json


@@ -3,6 +3,7 @@
"nav_library": "Library",
"nav_catalogue": "Catalogue",
"nav_feed": "Feed",
"nav_feedback": "Feedback",
"nav_admin": "Admin",
"nav_profile": "Profile",
@@ -160,6 +161,9 @@
"profile_theme_amber": "Amber",
"profile_theme_slate": "Slate",
"profile_theme_rose": "Rose",
"profile_theme_forest": "Forest",
"profile_theme_mono": "Mono",
"profile_theme_cyber": "Cyberpunk",
"profile_theme_light": "Light",
"profile_theme_light_slate": "Light Blue",
"profile_theme_light_rose": "Light Rose",
@@ -301,6 +305,24 @@
"book_detail_range_queuing": "Queuing\u2026",
"book_detail_scrape_range": "Scrape range",
"book_detail_admin": "Admin",
"book_detail_admin_book_cover": "Book Cover",
"book_detail_admin_chapter_cover": "Chapter Cover",
"book_detail_admin_chapter_n": "Chapter #",
"book_detail_admin_description": "Description",
"book_detail_admin_chapter_names": "Chapter Names",
"book_detail_admin_audio_tts": "Audio TTS",
"book_detail_admin_voice": "Voice",
"book_detail_admin_generate": "Generate",
"book_detail_admin_save_cover": "Save Cover",
"book_detail_admin_saving": "Saving…",
"book_detail_admin_saved": "Saved",
"book_detail_admin_apply": "Apply",
"book_detail_admin_applying": "Applying…",
"book_detail_admin_applied": "Applied",
"book_detail_admin_discard": "Discard",
"book_detail_admin_enqueue_audio": "Enqueue Audio",
"book_detail_admin_cancel_audio": "Cancel",
"book_detail_admin_enqueued": "Enqueued {enqueued}, skipped {skipped}",
"book_detail_scraping_progress": "Fetching the first 20 chapters. This page will refresh automatically.",
"book_detail_scraping_home": "\u2190 Home",
"book_detail_rescrape_book": "Rescrape book",
@@ -347,6 +369,26 @@
"profile_upgrade_monthly": "Monthly \u2014 $6 / mo",
"profile_upgrade_annual": "Annual \u2014 $48 / yr",
"profile_free_limits": "Free plan: 3 audio chapters per day, English reading only.",
"subscribe_page_title": "Go Pro \u2014 libnovel",
"subscribe_heading": "Read more. Listen more.",
"subscribe_subheading": "Upgrade to Pro and unlock the full libnovel experience.",
"subscribe_monthly_label": "Monthly",
"subscribe_monthly_price": "$6",
"subscribe_monthly_period": "per month",
"subscribe_annual_label": "Annual",
"subscribe_annual_price": "$48",
"subscribe_annual_period": "per year",
"subscribe_annual_save": "Save 33%",
"subscribe_cta_monthly": "Start monthly plan",
"subscribe_cta_annual": "Start annual plan",
"subscribe_already_pro": "You already have a Pro subscription.",
"subscribe_manage": "Manage subscription",
"subscribe_benefit_audio": "Unlimited audio chapters per day",
"subscribe_benefit_voices": "Voice selection across all TTS engines",
"subscribe_benefit_translation": "Read in French, Indonesian, Portuguese, and Russian",
"subscribe_benefit_downloads": "Download chapters for offline listening",
"subscribe_login_prompt": "Sign in to subscribe",
"subscribe_login_cta": "Sign in",
"user_currently_reading": "Currently Reading",
"user_library_count": "Library ({n})",
@@ -360,13 +402,21 @@
"admin_nav_scrape": "Scrape",
"admin_nav_audio": "Audio",
"admin_nav_translation": "Translation",
"admin_nav_import": "Import",
"admin_nav_changelog": "Changelog",
"admin_nav_image_gen": "Image Gen",
"admin_nav_text_gen": "Text Gen",
"admin_nav_catalogue_tools": "Catalogue Tools",
"admin_nav_ai_jobs": "AI Jobs",
"admin_nav_notifications": "Notifications",
"admin_nav_feedback": "Feedback",
"admin_nav_errors": "Errors",
"admin_nav_analytics": "Analytics",
"admin_nav_logs": "Logs",
"admin_nav_uptime": "Uptime",
"admin_nav_push": "Push",
"admin_nav_gitea": "Gitea",
"admin_nav_grafana": "Grafana",
"admin_scrape_status_idle": "Idle",
"admin_scrape_status_running": "Running",
@@ -419,5 +469,31 @@
"profile_text_size_sm": "Small",
"profile_text_size_md": "Normal",
"profile_text_size_lg": "Large",
"profile_text_size_xl": "X-Large"
"profile_text_size_xl": "X-Large",
"feed_page_title": "Feed — LibNovel",
"feed_heading": "Following Feed",
"feed_subheading": "Books your followed users are reading",
"feed_empty_heading": "Nothing here yet",
"feed_empty_body": "Follow other readers to see what they're reading.",
"feed_not_logged_in": "Sign in to see your feed.",
"feed_reader_label": "reading",
"feed_chapters_label": "{n} chapters",
"feed_browse_cta": "Browse catalogue",
"feed_find_users_cta": "Discover readers",
"admin_translation_page_title": "Translation \u2014 Admin",
"admin_translation_heading": "Machine Translation",
"admin_translation_tab_enqueue": "Enqueue",
"admin_translation_tab_jobs": "Jobs",
"admin_translation_filter_placeholder": "Filter by slug, lang, or status\u2026",
"admin_translation_no_matching": "No matching jobs.",
"admin_translation_no_jobs": "No translation jobs yet.",
"admin_ai_jobs_page_title": "AI Jobs \u2014 Admin",
"admin_ai_jobs_heading": "AI Jobs",
"admin_ai_jobs_subheading": "Background AI generation tasks",
"admin_text_gen_page_title": "Text Gen \u2014 Admin",
"admin_text_gen_heading": "Text Generation"
}


@@ -1,8 +1,8 @@
{
"$schema": "https://inlang.com/schema/inlang-message-format",
"nav_library": "Bibliothèque",
"nav_catalogue": "Catalogue",
"nav_feed": "Fil",
"nav_feedback": "Retour",
"nav_admin": "Admin",
"nav_profile": "Profil",
@@ -10,7 +10,6 @@
"nav_sign_out": "Déconnexion",
"nav_toggle_menu": "Menu",
"nav_admin_panel": "Panneau admin",
"footer_library": "Bibliothèque",
"footer_catalogue": "Catalogue",
"footer_feedback": "Retour",
@@ -19,7 +18,6 @@
"footer_dmca": "DMCA",
"footer_copyright": "© {year} libnovel",
"footer_dev": "dev",
"home_title": "libnovel",
"home_stat_books": "Livres",
"home_stat_chapters": "Chapitres",
@@ -33,7 +31,6 @@
"home_discover_novels": "Découvrir des romans",
"home_via_reader": "via {username}",
"home_chapter_badge": "ch.{n}",
"player_generating": "Génération… {percent}%",
"player_loading": "Chargement…",
"player_chapters": "Chapitres",
@@ -57,7 +54,6 @@
"player_auto_next_aria": "Suivant auto {state}",
"player_go_to_chapter": "Aller au chapitre",
"player_close": "Fermer le lecteur",
"login_page_title": "Connexion — libnovel",
"login_heading": "Se connecter à libnovel",
"login_subheading": "Choisissez un fournisseur pour continuer",
@@ -67,7 +63,6 @@
"login_error_oauth_state": "Connexion annulée ou expirée. Veuillez réessayer.",
"login_error_oauth_failed": "Impossible de se connecter au fournisseur. Veuillez réessayer.",
"login_error_oauth_no_email": "Votre compte n'a pas d'adresse e-mail vérifiée. Ajoutez-en une et réessayez.",
"books_page_title": "Bibliothèque — libnovel",
"books_heading": "Votre bibliothèque",
"books_empty_title": "Aucun livre pour l'instant",
@@ -77,7 +72,6 @@
"books_last_read": "Dernier lu : Ch.{n}",
"books_reading_progress": "Ch.{current} / {total}",
"books_remove": "Supprimer",
"catalogue_page_title": "Catalogue — libnovel",
"catalogue_heading": "Catalogue",
"catalogue_search_placeholder": "Rechercher des romans…",
@@ -98,7 +92,6 @@
"catalogue_loading": "Chargement…",
"catalogue_load_more": "Charger plus",
"catalogue_results_count": "{n} résultats",
"book_detail_page_title": "{title} — libnovel",
"book_detail_signin_to_save": "Connectez-vous pour sauvegarder",
"book_detail_add_to_library": "Ajouter à la bibliothèque",
@@ -115,13 +108,11 @@
"book_detail_rescrape": "Réextraire",
"book_detail_scraping": "Extraction en cours…",
"book_detail_in_library": "Dans la bibliothèque",
"chapters_page_title": "Chapitres — {title}",
"chapters_heading": "Chapitres",
"chapters_back_to_book": "Retour au livre",
"chapters_reading_now": "En cours de lecture",
"chapters_empty": "Aucun chapitre extrait pour l'instant.",
"reader_page_title": "{title} — Ch.{n} — libnovel",
"reader_play_narration": "Lire la narration",
"reader_generating_audio": "Génération audio…",
@@ -143,7 +134,6 @@
"reader_auto_next": "Suivant auto",
"reader_speed": "Vitesse",
"reader_preview_notice": "Aperçu — ce chapitre n'a pas été entièrement extrait.",
"profile_page_title": "Profil — libnovel",
"profile_heading": "Profil",
"profile_avatar_label": "Avatar",
@@ -160,6 +150,9 @@
"profile_theme_amber": "Ambre",
"profile_theme_slate": "Ardoise",
"profile_theme_rose": "Rose",
"profile_theme_forest": "Forêt",
"profile_theme_mono": "Mono",
"profile_theme_cyber": "Cyberpunk",
"profile_theme_light": "Light",
"profile_theme_light_slate": "Light Blue",
"profile_theme_light_rose": "Light Rose",
@@ -175,7 +168,6 @@
"profile_sessions_heading": "Sessions actives",
"profile_sign_out_all": "Se déconnecter de tous les autres appareils",
"profile_joined": "Inscrit le {date}",
"user_page_title": "{username} — libnovel",
"user_library_heading": "Bibliothèque de {username}",
"user_follow": "Suivre",
@@ -183,13 +175,11 @@
"user_followers": "{n} abonnés",
"user_following": "{n} abonnements",
"user_library_empty": "Aucun livre dans la bibliothèque.",
"error_not_found_title": "Page introuvable",
"error_not_found_body": "La page que vous cherchez n'existe pas.",
"error_generic_title": "Une erreur s'est produite",
"error_go_home": "Accueil",
"error_status": "Erreur {status}",
"admin_scrape_page_title": "Extraction — Admin",
"admin_scrape_heading": "Extraction",
"admin_scrape_catalogue": "Extraire le catalogue",
@@ -207,14 +197,11 @@
"admin_scrape_status_cancelled": "Annulé",
"admin_tasks_heading": "Tâches récentes",
"admin_tasks_empty": "Aucune tâche pour l'instant.",
"admin_audio_page_title": "Audio — Admin",
"admin_audio_heading": "Tâches audio",
"admin_audio_empty": "Aucune tâche audio.",
"admin_changelog_page_title": "Changelog — Admin",
"admin_changelog_heading": "Changelog",
"comments_heading": "Commentaires",
"comments_empty": "Aucun commentaire pour l'instant. Soyez le premier !",
"comments_placeholder": "Écrire un commentaire…",
@@ -228,12 +215,10 @@
"comments_hide_replies": "Masquer les réponses",
"comments_edited": "modifié",
"comments_deleted": "[supprimé]",
"disclaimer_page_title": "Avertissement — libnovel",
"privacy_page_title": "Politique de confidentialité — libnovel",
"dmca_page_title": "DMCA — libnovel",
"terms_page_title": "Conditions d'utilisation — libnovel",
"common_loading": "Chargement…",
"common_error": "Erreur",
"common_save": "Enregistrer",
@@ -247,15 +232,12 @@
"common_no": "Non",
"common_on": "activé",
"common_off": "désactivé",
"locale_switcher_label": "Langue",
"books_empty_library": "Votre bibliothèque est vide.",
"books_empty_discover": "Les livres que vous commencez à lire ou enregistrez depuis",
"books_empty_discover_link": "Découvrir",
"books_empty_discover_suffix": "apparaîtront ici.",
"books_count": "{n} livre{s}",
"catalogue_sort_updated": "Mis à jour",
"catalogue_search_button": "Rechercher",
"catalogue_refresh": "Actualiser",
@@ -288,7 +270,6 @@
"catalogue_scrape_forbidden_badge": "Interdit",
"catalogue_scrape_novel_button": "Extraire",
"catalogue_scraping_novel": "Extraction…",
"book_detail_not_in_library": "pas dans la bibliothèque",
"book_detail_continue_ch": "Continuer ch.{n}",
"book_detail_start_ch1": "Commencer au ch.1",
@@ -301,23 +282,38 @@
"book_detail_range_queuing": "En file d'attente…",
"book_detail_scrape_range": "Plage d'extraction",
"book_detail_admin": "Admin",
"book_detail_admin_book_cover": "Couverture du livre",
"book_detail_admin_chapter_cover": "Couverture du chapitre",
"book_detail_admin_chapter_n": "Chapitre n°",
"book_detail_admin_description": "Description",
"book_detail_admin_chapter_names": "Noms des chapitres",
"book_detail_admin_audio_tts": "Audio TTS",
"book_detail_admin_voice": "Voix",
"book_detail_admin_generate": "Générer",
"book_detail_admin_save_cover": "Enregistrer la couverture",
"book_detail_admin_saving": "Enregistrement…",
"book_detail_admin_saved": "Enregistré",
"book_detail_admin_apply": "Appliquer",
"book_detail_admin_applying": "Application…",
"book_detail_admin_applied": "Appliqué",
"book_detail_admin_discard": "Ignorer",
"book_detail_admin_enqueue_audio": "Mettre en file audio",
"book_detail_admin_cancel_audio": "Annuler",
"book_detail_admin_enqueued": "{enqueued} en file, {skipped} ignorés",
"book_detail_scraping_progress": "Récupération des 20 premiers chapitres. Cette page sera actualisée automatiquement.",
"book_detail_scraping_home": "← Accueil",
"book_detail_rescrape_book": "Réextraire le livre",
"book_detail_less": "Moins",
"book_detail_more": "Plus",
"chapters_search_placeholder": "Rechercher des chapitres…",
"chapters_jump_to": "Aller au Ch.{n}",
"chapters_no_match": "Aucun chapitre ne correspond à « {q} »",
"chapters_none_available": "Aucun chapitre disponible pour l'instant.",
"chapters_reading_indicator": "en cours",
"chapters_result_count": "{n} résultats",
"reader_fetching_chapter": "Récupération du chapitre…",
"reader_words": "{n} mots",
"reader_preview_audio_notice": "Aperçu — audio non disponible pour les livres hors bibliothèque.",
"profile_click_to_change": "Cliquez sur l'avatar pour changer la photo",
"profile_tts_voice": "Voix TTS",
"profile_auto_advance": "Avancer automatiquement au chapitre suivant",
@@ -335,7 +331,6 @@
"profile_updating": "Mise à jour…",
"profile_password_changed_ok": "Mot de passe modifié avec succès.",
"profile_playback_speed": "Vitesse de lecture — {speed}x",
"profile_subscription_heading": "Abonnement",
"profile_plan_pro": "Pro",
"profile_plan_free": "Gratuit",
@@ -347,27 +342,48 @@
"profile_upgrade_monthly": "Mensuel — 6 $ / mois",
"profile_upgrade_annual": "Annuel — 48 $ / an",
"profile_free_limits": "Plan gratuit : 3 chapitres audio par jour, lecture en anglais uniquement.",
"subscribe_page_title": "Passer Pro — libnovel",
"subscribe_heading": "Lisez plus. Écoutez plus.",
"subscribe_subheading": "Passez Pro et débloquez l'expérience libnovel complète.",
"subscribe_monthly_label": "Mensuel",
"subscribe_monthly_price": "6 $",
"subscribe_monthly_period": "par mois",
"subscribe_annual_label": "Annuel",
"subscribe_annual_price": "48 $",
"subscribe_annual_period": "par an",
"subscribe_annual_save": "Économisez 33 %",
"subscribe_cta_monthly": "Commencer le plan mensuel",
"subscribe_cta_annual": "Commencer le plan annuel",
"subscribe_already_pro": "Vous avez déjà un abonnement Pro.",
"subscribe_manage": "Gérer l'abonnement",
"subscribe_benefit_audio": "Chapitres audio illimités par jour",
"subscribe_benefit_voices": "Sélection de voix pour tous les moteurs TTS",
"subscribe_benefit_translation": "Lire en français, indonésien, portugais et russe",
"subscribe_benefit_downloads": "Télécharger des chapitres pour une écoute hors ligne",
"subscribe_login_prompt": "Connectez-vous pour vous abonner",
"subscribe_login_cta": "Se connecter",
"user_currently_reading": "En cours de lecture",
"user_library_count": "Bibliothèque ({n})",
"user_joined": "Inscrit le {date}",
"user_followers_label": "abonnés",
"user_following_label": "abonnements",
"user_no_books": "Aucun livre dans la bibliothèque pour l'instant.",
"admin_pages_label": "Pages",
"admin_tools_label": "Outils",
"admin_nav_scrape": "Scrape",
"admin_nav_audio": "Audio",
"admin_nav_translation": "Traduction",
"admin_nav_changelog": "Modifications",
"admin_nav_feedback": "Retours",
"admin_nav_image_gen": "Image Gen",
"admin_nav_text_gen": "Text Gen",
"admin_nav_catalogue_tools": "Catalogue Tools",
"admin_nav_ai_jobs": "Tâches IA",
"admin_nav_notifications": "Notifications",
"admin_nav_errors": "Erreurs",
"admin_nav_analytics": "Analytique",
"admin_nav_logs": "Journaux",
"admin_nav_uptime": "Disponibilité",
"admin_nav_push": "Notifications",
"admin_scrape_status_idle": "Inactif",
"admin_scrape_full_catalogue": "Catalogue complet",
"admin_scrape_single_book": "Livre unique",
@@ -378,25 +394,21 @@
"admin_scrape_start": "Démarrer l'extraction",
"admin_scrape_queuing": "En file d'attente…",
"admin_scrape_running": "En cours…",
"admin_audio_filter_jobs": "Filtrer par slug, voix ou statut…",
"admin_audio_filter_cache": "Filtrer par slug, chapitre ou voix…",
"admin_audio_no_matching_jobs": "Aucun job correspondant.",
"admin_audio_no_jobs": "Aucun job audio pour l'instant.",
"admin_audio_cache_empty": "Cache audio vide.",
"admin_audio_no_cache_results": "Aucun résultat.",
"admin_changelog_gitea": "Releases Gitea",
"admin_changelog_no_releases": "Aucune release trouvée.",
"admin_changelog_load_error": "Impossible de charger les releases : {error}",
"comments_top": "Les meilleures",
"comments_new": "Nouvelles",
"comments_posting": "Publication…",
"comments_login_link": "Connectez-vous",
"comments_login_suffix": "pour laisser un commentaire.",
"comments_anonymous": "Anonyme",
"reader_audio_narration": "Narration Audio",
"reader_playing": "Lecture en cours — contrôles ci-dessous",
"reader_paused": "En pause — contrôles ci-dessous",
@@ -409,7 +421,6 @@
"reader_voice_applies_next": "La nouvelle voix s'appliquera au prochain « Lire la narration ».",
"reader_choose_voice": "Choisir une voix",
"reader_generating_narration": "Génération de la narration…",
"profile_font_family": "Police",
"profile_font_system": "Système",
"profile_font_serif": "Serif",
@@ -418,5 +429,30 @@
"profile_text_size_sm": "Petit",
"profile_text_size_md": "Normal",
"profile_text_size_lg": "Grand",
"profile_text_size_xl": "Très grand"
"profile_text_size_xl": "Très grand",
"feed_page_title": "Fil — LibNovel",
"feed_heading": "Fil d'abonnements",
"feed_subheading": "Livres lus par vos abonnements",
"feed_empty_heading": "Rien encore",
"feed_empty_body": "Suivez d'autres lecteurs pour voir ce qu'ils lisent.",
"feed_not_logged_in": "Connectez-vous pour voir votre fil.",
"feed_reader_label": "lit",
"feed_chapters_label": "{n} chapitres",
"feed_browse_cta": "Parcourir le catalogue",
"feed_find_users_cta": "Trouver des lecteurs",
"admin_nav_gitea": "Gitea",
"admin_nav_grafana": "Grafana",
"admin_translation_page_title": "Translation — Admin",
"admin_translation_heading": "Machine Translation",
"admin_translation_tab_enqueue": "Enqueue",
"admin_translation_tab_jobs": "Jobs",
"admin_translation_filter_placeholder": "Filter by slug, lang, or status…",
"admin_translation_no_matching": "No matching jobs.",
"admin_translation_no_jobs": "No translation jobs yet.",
"admin_ai_jobs_page_title": "AI Jobs — Admin",
"admin_ai_jobs_heading": "AI Jobs",
"admin_ai_jobs_subheading": "Background AI generation tasks",
"admin_text_gen_page_title": "Text Gen — Admin",
"admin_text_gen_heading": "Text Generation",
"admin_nav_import": "Import"
}


@@ -1,8 +1,8 @@
{
"$schema": "https://inlang.com/schema/inlang-message-format",
"nav_library": "Perpustakaan",
"nav_catalogue": "Katalog",
"nav_feed": "Umpan",
"nav_feedback": "Masukan",
"nav_admin": "Admin",
"nav_profile": "Profil",
@@ -10,7 +10,6 @@
"nav_sign_out": "Keluar",
"nav_toggle_menu": "Menu",
"nav_admin_panel": "Panel admin",
"footer_library": "Perpustakaan",
"footer_catalogue": "Katalog",
"footer_feedback": "Masukan",
@@ -19,7 +18,6 @@
"footer_dmca": "DMCA",
"footer_copyright": "© {year} libnovel",
"footer_dev": "dev",
"home_title": "libnovel",
"home_stat_books": "Buku",
"home_stat_chapters": "Bab",
@@ -33,7 +31,6 @@
"home_discover_novels": "Temukan Novel",
"home_via_reader": "via {username}",
"home_chapter_badge": "bab.{n}",
"player_generating": "Membuat… {percent}%",
"player_loading": "Memuat…",
"player_chapters": "Bab",
@@ -57,7 +54,6 @@
"player_auto_next_aria": "Auto-lanjut {state}",
"player_go_to_chapter": "Pergi ke bab",
"player_close": "Tutup pemutar",
"login_page_title": "Masuk — libnovel",
"login_heading": "Masuk ke libnovel",
"login_subheading": "Pilih penyedia untuk melanjutkan",
@@ -67,7 +63,6 @@
"login_error_oauth_state": "Masuk dibatalkan atau kedaluwarsa. Coba lagi.",
"login_error_oauth_failed": "Tidak dapat terhubung ke penyedia. Coba lagi.",
"login_error_oauth_no_email": "Akunmu tidak memiliki alamat email terverifikasi. Tambahkan dan coba lagi.",
"books_page_title": "Perpustakaan — libnovel",
"books_heading": "Perpustakaanmu",
"books_empty_title": "Belum ada buku",
@@ -77,7 +72,6 @@
"books_last_read": "Terakhir: Bab.{n}",
"books_reading_progress": "Bab.{current} / {total}",
"books_remove": "Hapus",
"catalogue_page_title": "Katalog — libnovel",
"catalogue_heading": "Katalog",
"catalogue_search_placeholder": "Cari novel…",
@@ -98,7 +92,6 @@
"catalogue_loading": "Memuat…",
"catalogue_load_more": "Muat lebih banyak",
"catalogue_results_count": "{n} hasil",
"book_detail_page_title": "{title} — libnovel",
"book_detail_signin_to_save": "Masuk untuk menyimpan",
"book_detail_add_to_library": "Tambah ke Perpustakaan",
@@ -115,13 +108,11 @@
"book_detail_rescrape": "Perbarui",
"book_detail_scraping": "Memperbarui…",
"book_detail_in_library": "Ada di Perpustakaan",
"chapters_page_title": "Bab — {title}",
"chapters_heading": "Bab",
"chapters_back_to_book": "Kembali ke buku",
"chapters_reading_now": "Sedang dibaca",
"chapters_empty": "Belum ada bab yang diambil.",
"reader_page_title": "{title} — Bab.{n} — libnovel",
"reader_play_narration": "Putar narasi",
"reader_generating_audio": "Membuat audio…",
@@ -143,7 +134,6 @@
"reader_auto_next": "Auto-lanjut",
"reader_speed": "Kecepatan",
"reader_preview_notice": "Pratinjau — bab ini belum sepenuhnya diambil.",
"profile_page_title": "Profil — libnovel",
"profile_heading": "Profil",
"profile_avatar_label": "Avatar",
@@ -160,6 +150,9 @@
"profile_theme_amber": "Amber",
"profile_theme_slate": "Abu-abu",
"profile_theme_rose": "Mawar",
"profile_theme_forest": "Hutan",
"profile_theme_mono": "Mono",
"profile_theme_cyber": "Cyberpunk",
"profile_theme_light": "Light",
"profile_theme_light_slate": "Light Blue",
"profile_theme_light_rose": "Light Rose",
@@ -175,7 +168,6 @@
"profile_sessions_heading": "Sesi aktif",
"profile_sign_out_all": "Keluar dari semua perangkat lain",
"profile_joined": "Bergabung {date}",
"user_page_title": "{username} — libnovel",
"user_library_heading": "Perpustakaan {username}",
"user_follow": "Ikuti",
@@ -183,13 +175,11 @@
"user_followers": "{n} pengikut",
"user_following": "{n} mengikuti",
"user_library_empty": "Tidak ada buku di perpustakaan.",
"error_not_found_title": "Halaman tidak ditemukan",
"error_not_found_body": "Halaman yang kamu cari tidak ada.",
"error_generic_title": "Terjadi kesalahan",
"error_go_home": "Ke beranda",
"error_status": "Error {status}",
"admin_scrape_page_title": "Scrape — Admin",
"admin_scrape_heading": "Scrape",
"admin_scrape_catalogue": "Scrape Katalog",
@@ -207,14 +197,11 @@
"admin_scrape_status_cancelled": "Dibatalkan",
"admin_tasks_heading": "Tugas terbaru",
"admin_tasks_empty": "Belum ada tugas.",
"admin_audio_page_title": "Audio — Admin",
"admin_audio_heading": "Tugas Audio",
"admin_audio_empty": "Tidak ada tugas audio.",
"admin_changelog_page_title": "Changelog — Admin",
"admin_changelog_heading": "Changelog",
"comments_heading": "Komentar",
"comments_empty": "Belum ada komentar. Jadilah yang pertama!",
"comments_placeholder": "Tulis komentar…",
@@ -228,12 +215,10 @@
"comments_hide_replies": "Sembunyikan balasan",
"comments_edited": "diedit",
"comments_deleted": "[dihapus]",
"disclaimer_page_title": "Penyangkalan — libnovel",
"privacy_page_title": "Kebijakan Privasi — libnovel",
"dmca_page_title": "DMCA — libnovel",
"terms_page_title": "Syarat Layanan — libnovel",
"common_loading": "Memuat…",
"common_error": "Error",
"common_save": "Simpan",
@@ -247,15 +232,12 @@
"common_no": "Tidak",
"common_on": "aktif",
"common_off": "nonaktif",
"locale_switcher_label": "Bahasa",
"books_empty_library": "Perpustakaanmu kosong.",
"books_empty_discover": "Buku yang mulai kamu baca atau simpan dari",
"books_empty_discover_link": "Temukan",
"books_empty_discover_suffix": "akan muncul di sini.",
"books_count": "{n} buku",
"catalogue_sort_updated": "Diperbarui",
"catalogue_search_button": "Cari",
"catalogue_refresh": "Segarkan",
@@ -288,7 +270,6 @@
"catalogue_scrape_forbidden_badge": "Terlarang",
"catalogue_scrape_novel_button": "Scrape",
"catalogue_scraping_novel": "Scraping…",
"book_detail_not_in_library": "tidak di perpustakaan",
"book_detail_continue_ch": "Lanjutkan bab.{n}",
"book_detail_start_ch1": "Mulai dari bab.1",
@@ -301,23 +282,38 @@
"book_detail_range_queuing": "Mengantri…",
"book_detail_scrape_range": "Rentang scrape",
"book_detail_admin": "Admin",
"book_detail_admin_book_cover": "Sampul Buku",
"book_detail_admin_chapter_cover": "Sampul Bab",
"book_detail_admin_chapter_n": "Bab #",
"book_detail_admin_description": "Deskripsi",
"book_detail_admin_chapter_names": "Nama Bab",
"book_detail_admin_audio_tts": "Audio TTS",
"book_detail_admin_voice": "Suara",
"book_detail_admin_generate": "Buat",
"book_detail_admin_save_cover": "Simpan Sampul",
"book_detail_admin_saving": "Menyimpan…",
"book_detail_admin_saved": "Tersimpan",
"book_detail_admin_apply": "Terapkan",
"book_detail_admin_applying": "Menerapkan…",
"book_detail_admin_applied": "Diterapkan",
"book_detail_admin_discard": "Buang",
"book_detail_admin_enqueue_audio": "Antre Audio",
"book_detail_admin_cancel_audio": "Batal",
"book_detail_admin_enqueued": "Diantre {enqueued}, dilewati {skipped}",
"book_detail_scraping_progress": "Mengambil 20 bab pertama. Halaman ini akan dimuat ulang otomatis.",
"book_detail_scraping_home": "← Beranda",
"book_detail_rescrape_book": "Scrape ulang buku",
"book_detail_less": "Lebih sedikit",
"book_detail_more": "Selengkapnya",
"chapters_search_placeholder": "Cari bab…",
"chapters_jump_to": "Loncat ke Bab.{n}",
"chapters_no_match": "Tidak ada bab yang cocok dengan \"{q}\"",
"chapters_none_available": "Belum ada bab tersedia.",
"chapters_reading_indicator": "sedang dibaca",
"chapters_result_count": "{n} hasil",
"reader_fetching_chapter": "Mengambil bab…",
"reader_words": "{n} kata",
"reader_preview_audio_notice": "Pratinjau — audio tidak tersedia untuk buku di luar perpustakaan.",
"profile_click_to_change": "Klik avatar untuk mengganti foto",
"profile_tts_voice": "Suara TTS",
"profile_auto_advance": "Otomatis lanjut ke bab berikutnya",
@@ -335,7 +331,6 @@
"profile_updating": "Memperbarui…",
"profile_password_changed_ok": "Kata sandi berhasil diubah.",
"profile_playback_speed": "Kecepatan pemutaran — {speed}x",
"profile_subscription_heading": "Langganan",
"profile_plan_pro": "Pro",
"profile_plan_free": "Gratis",
@@ -347,27 +342,48 @@
"profile_upgrade_monthly": "Bulanan — $6 / bln",
"profile_upgrade_annual": "Tahunan — $48 / thn",
"profile_free_limits": "Paket gratis: 3 bab audio per hari, hanya bahasa Inggris.",
"subscribe_page_title": "Jadi Pro — libnovel",
"subscribe_heading": "Baca lebih. Dengarkan lebih.",
"subscribe_subheading": "Tingkatkan ke Pro dan buka pengalaman libnovel sepenuhnya.",
"subscribe_monthly_label": "Bulanan",
"subscribe_monthly_price": "$6",
"subscribe_monthly_period": "per bulan",
"subscribe_annual_label": "Tahunan",
"subscribe_annual_price": "$48",
"subscribe_annual_period": "per tahun",
"subscribe_annual_save": "Hemat 33%",
"subscribe_cta_monthly": "Mulai paket bulanan",
"subscribe_cta_annual": "Mulai paket tahunan",
"subscribe_already_pro": "Anda sudah berlangganan Pro.",
"subscribe_manage": "Kelola langganan",
"subscribe_benefit_audio": "Bab audio tak terbatas per hari",
"subscribe_benefit_voices": "Pilihan suara untuk semua mesin TTS",
"subscribe_benefit_translation": "Baca dalam bahasa Prancis, Indonesia, Portugis, dan Rusia",
"subscribe_benefit_downloads": "Unduh bab untuk didengarkan secara offline",
"subscribe_login_prompt": "Masuk untuk berlangganan",
"subscribe_login_cta": "Masuk",
"user_currently_reading": "Sedang Dibaca",
"user_library_count": "Perpustakaan ({n})",
"user_joined": "Bergabung {date}",
"user_followers_label": "pengikut",
"user_following_label": "mengikuti",
"user_no_books": "Belum ada buku di perpustakaan.",
"admin_pages_label": "Halaman",
"admin_tools_label": "Alat",
"admin_nav_scrape": "Scrape",
"admin_nav_audio": "Audio",
"admin_nav_translation": "Terjemahan",
"admin_nav_changelog": "Perubahan",
"admin_nav_feedback": "Masukan",
"admin_nav_image_gen": "Image Gen",
"admin_nav_text_gen": "Text Gen",
"admin_nav_catalogue_tools": "Catalogue Tools",
"admin_nav_ai_jobs": "Tugas AI",
"admin_nav_notifications": "Notifikasi",
"admin_nav_errors": "Kesalahan",
"admin_nav_analytics": "Analitik",
"admin_nav_logs": "Log",
"admin_nav_uptime": "Uptime",
"admin_nav_push": "Notifikasi",
"admin_scrape_status_idle": "Menunggu",
"admin_scrape_full_catalogue": "Katalog penuh",
"admin_scrape_single_book": "Satu buku",
@@ -378,25 +394,21 @@
"admin_scrape_start": "Mulai scrape",
"admin_scrape_queuing": "Mengantri…",
"admin_scrape_running": "Berjalan…",
"admin_audio_filter_jobs": "Filter berdasarkan slug, suara, atau status…",
"admin_audio_filter_cache": "Filter berdasarkan slug, bab, atau suara…",
"admin_audio_no_matching_jobs": "Tidak ada pekerjaan yang cocok.",
"admin_audio_no_jobs": "Belum ada pekerjaan audio.",
"admin_audio_cache_empty": "Cache audio kosong.",
"admin_audio_no_cache_results": "Tidak ada hasil.",
"admin_changelog_gitea": "Rilis Gitea",
"admin_changelog_no_releases": "Tidak ada rilis.",
"admin_changelog_load_error": "Gagal memuat rilis: {error}",
"comments_top": "Teratas",
"comments_new": "Terbaru",
"comments_posting": "Mengirim…",
"comments_login_link": "Masuk",
"comments_login_suffix": "untuk meninggalkan komentar.",
"comments_anonymous": "Anonim",
"reader_audio_narration": "Narasi Audio",
"reader_playing": "Memutar — kontrol di bawah",
"reader_paused": "Dijeda — kontrol di bawah",
@@ -409,7 +421,6 @@
"reader_voice_applies_next": "Suara baru berlaku pada \"Putar narasi\" berikutnya.",
"reader_choose_voice": "Pilih Suara",
"reader_generating_narration": "Membuat narasi…",
"profile_font_family": "Jenis Font",
"profile_font_system": "Sistem",
"profile_font_serif": "Serif",
@@ -418,5 +429,30 @@
"profile_text_size_sm": "Kecil",
"profile_text_size_md": "Normal",
"profile_text_size_lg": "Besar",
"profile_text_size_xl": "Sangat Besar"
"profile_text_size_xl": "Sangat Besar",
"feed_page_title": "Umpan — LibNovel",
"feed_heading": "Umpan Ikutan",
"feed_subheading": "Buku yang sedang dibaca oleh pengguna yang Anda ikuti",
"feed_empty_heading": "Belum ada apa-apa",
"feed_empty_body": "Ikuti pembaca lain untuk melihat apa yang mereka baca.",
"feed_not_logged_in": "Masuk untuk melihat umpan Anda.",
"feed_reader_label": "membaca",
"feed_chapters_label": "{n} bab",
"feed_browse_cta": "Jelajahi katalog",
"feed_find_users_cta": "Temukan pembaca",
"admin_nav_gitea": "Gitea",
"admin_nav_grafana": "Grafana",
"admin_translation_page_title": "Translation — Admin",
"admin_translation_heading": "Machine Translation",
"admin_translation_tab_enqueue": "Enqueue",
"admin_translation_tab_jobs": "Jobs",
"admin_translation_filter_placeholder": "Filter by slug, lang, or status…",
"admin_translation_no_matching": "No matching jobs.",
"admin_translation_no_jobs": "No translation jobs yet.",
"admin_ai_jobs_page_title": "AI Jobs — Admin",
"admin_ai_jobs_heading": "AI Jobs",
"admin_ai_jobs_subheading": "Background AI generation tasks",
"admin_text_gen_page_title": "Text Gen — Admin",
"admin_text_gen_heading": "Text Generation",
"admin_nav_import": "Import"
}
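The locale files in this diff use simple `{name}` placeholders (e.g. `{n}`, `{year}`, `{username}`). As a minimal sketch of how such messages can be interpolated at runtime (an assumption for illustration — the real inlang/paraglide toolchain compiles messages into typed functions rather than interpolating raw templates):

```typescript
// Substitute {name} placeholders in a message template with values
// from a params object. Unknown placeholders are left untouched so
// missing parameters are visible rather than silently dropped.
function formatMessage(
  template: string,
  params: Record<string, string | number>
): string {
  return template.replace(/\{(\w+)\}/g, (match, key) =>
    key in params ? String(params[key]) : match
  );
}

console.log(formatMessage("{n} hasil", { n: 12 }));              // "12 hasil"
console.log(formatMessage("© {year} libnovel", { year: 2026 })); // "© 2026 libnovel"
```

Note that this naive approach has no plural rules, which is why keys like `books_count` resort to tricks such as `"{n} livro{s}"`.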


@@ -1,8 +1,8 @@
{
"$schema": "https://inlang.com/schema/inlang-message-format",
"nav_library": "Biblioteca",
"nav_catalogue": "Catálogo",
"nav_feed": "Feed",
"nav_feedback": "Feedback",
"nav_admin": "Admin",
"nav_profile": "Perfil",
@@ -10,7 +10,6 @@
"nav_sign_out": "Sair",
"nav_toggle_menu": "Menu",
"nav_admin_panel": "Painel admin",
"footer_library": "Biblioteca",
"footer_catalogue": "Catálogo",
"footer_feedback": "Feedback",
@@ -19,7 +18,6 @@
"footer_dmca": "DMCA",
"footer_copyright": "© {year} libnovel",
"footer_dev": "dev",
"home_title": "libnovel",
"home_stat_books": "Livros",
"home_stat_chapters": "Capítulos",
@@ -33,7 +31,6 @@
"home_discover_novels": "Descobrir Romances",
"home_via_reader": "via {username}",
"home_chapter_badge": "cap.{n}",
"player_generating": "Gerando… {percent}%",
"player_loading": "Carregando…",
"player_chapters": "Capítulos",
@@ -57,7 +54,6 @@
"player_auto_next_aria": "Próximo automático {state}",
"player_go_to_chapter": "Ir para capítulo",
"player_close": "Fechar player",
"login_page_title": "Entrar — libnovel",
"login_heading": "Entrar no libnovel",
"login_subheading": "Escolha um provedor para continuar",
@@ -67,7 +63,6 @@
"login_error_oauth_state": "Login cancelado ou expirado. Tente novamente.",
"login_error_oauth_failed": "Não foi possível conectar ao provedor. Tente novamente.",
"login_error_oauth_no_email": "Sua conta não tem endereço de email verificado. Adicione um e tente novamente.",
"books_page_title": "Biblioteca — libnovel",
"books_heading": "Sua Biblioteca",
"books_empty_title": "Nenhum livro ainda",
@@ -77,7 +72,6 @@
"books_last_read": "Último: Cap.{n}",
"books_reading_progress": "Cap.{current} / {total}",
"books_remove": "Remover",
"catalogue_page_title": "Catálogo — libnovel",
"catalogue_heading": "Catálogo",
"catalogue_search_placeholder": "Pesquisar romances…",
@@ -98,7 +92,6 @@
"catalogue_loading": "Carregando…",
"catalogue_load_more": "Carregar mais",
"catalogue_results_count": "{n} resultados",
"book_detail_page_title": "{title} — libnovel",
"book_detail_signin_to_save": "Entre para salvar",
"book_detail_add_to_library": "Adicionar à Biblioteca",
@@ -115,13 +108,11 @@
"book_detail_rescrape": "Atualizar",
"book_detail_scraping": "Atualizando…",
"book_detail_in_library": "Na Biblioteca",
"chapters_page_title": "Capítulos — {title}",
"chapters_heading": "Capítulos",
"chapters_back_to_book": "Voltar ao livro",
"chapters_reading_now": "Lendo",
"chapters_empty": "Nenhum capítulo extraído ainda.",
"reader_page_title": "{title} — Cap.{n} — libnovel",
"reader_play_narration": "Reproduzir narração",
"reader_generating_audio": "Gerando áudio…",
@@ -143,7 +134,6 @@
"reader_auto_next": "Próximo automático",
"reader_speed": "Velocidade",
"reader_preview_notice": "Prévia — este capítulo não foi totalmente extraído.",
"profile_page_title": "Perfil — libnovel",
"profile_heading": "Perfil",
"profile_avatar_label": "Avatar",
@@ -160,6 +150,9 @@
"profile_theme_amber": "Âmbar",
"profile_theme_slate": "Ardósia",
"profile_theme_rose": "Rosa",
"profile_theme_forest": "Floresta",
"profile_theme_mono": "Mono",
"profile_theme_cyber": "Cyberpunk",
"profile_theme_light": "Light",
"profile_theme_light_slate": "Light Blue",
"profile_theme_light_rose": "Light Rose",
@@ -175,7 +168,6 @@
"profile_sessions_heading": "Sessões ativas",
"profile_sign_out_all": "Sair de todos os outros dispositivos",
"profile_joined": "Entrou em {date}",
"user_page_title": "{username} — libnovel",
"user_library_heading": "Biblioteca de {username}",
"user_follow": "Seguir",
@@ -183,13 +175,11 @@
"user_followers": "{n} seguidores",
"user_following": "{n} seguindo",
"user_library_empty": "Nenhum livro na biblioteca.",
"error_not_found_title": "Página não encontrada",
"error_not_found_body": "A página que você procura não existe.",
"error_generic_title": "Algo deu errado",
"error_go_home": "Ir para início",
"error_status": "Erro {status}",
"admin_scrape_page_title": "Extração — Admin",
"admin_scrape_heading": "Extração",
"admin_scrape_catalogue": "Extrair Catálogo",
@@ -207,14 +197,11 @@
"admin_scrape_status_cancelled": "Cancelado",
"admin_tasks_heading": "Tarefas recentes",
"admin_tasks_empty": "Nenhuma tarefa ainda.",
"admin_audio_page_title": "Áudio — Admin",
"admin_audio_heading": "Tarefas de Áudio",
"admin_audio_empty": "Nenhuma tarefa de áudio.",
"admin_changelog_page_title": "Changelog — Admin",
"admin_changelog_heading": "Changelog",
"comments_heading": "Comentários",
"comments_empty": "Nenhum comentário ainda. Seja o primeiro!",
"comments_placeholder": "Escreva um comentário…",
@@ -228,12 +215,10 @@
"comments_hide_replies": "Ocultar respostas",
"comments_edited": "editado",
"comments_deleted": "[excluído]",
"disclaimer_page_title": "Aviso Legal — libnovel",
"privacy_page_title": "Política de Privacidade — libnovel",
"dmca_page_title": "DMCA — libnovel",
"terms_page_title": "Termos de Serviço — libnovel",
"common_loading": "Carregando…",
"common_error": "Erro",
"common_save": "Salvar",
@@ -247,15 +232,12 @@
"common_no": "Não",
"common_on": "ativado",
"common_off": "desativado",
"locale_switcher_label": "Idioma",
"books_empty_library": "Sua biblioteca está vazia.",
"books_empty_discover": "Livros que você começar a ler ou salvar de",
"books_empty_discover_link": "Descobrir",
"books_empty_discover_suffix": "aparecerão aqui.",
"books_count": "{n} livro{s}",
"catalogue_sort_updated": "Atualizado",
"catalogue_search_button": "Pesquisar",
"catalogue_refresh": "Atualizar",
@@ -288,7 +270,6 @@
"catalogue_scrape_forbidden_badge": "Proibido",
"catalogue_scrape_novel_button": "Extrair",
"catalogue_scraping_novel": "Extraindo…",
"book_detail_not_in_library": "não está na biblioteca",
"book_detail_continue_ch": "Continuar cap.{n}",
"book_detail_start_ch1": "Começar pelo cap.1",
@@ -301,23 +282,38 @@
"book_detail_range_queuing": "Na fila…",
"book_detail_scrape_range": "Intervalo de extração",
"book_detail_admin": "Admin",
"book_detail_admin_book_cover": "Capa do Livro",
"book_detail_admin_chapter_cover": "Capa do Capítulo",
"book_detail_admin_chapter_n": "Capítulo nº",
"book_detail_admin_description": "Descrição",
"book_detail_admin_chapter_names": "Nomes dos Capítulos",
"book_detail_admin_audio_tts": "Áudio TTS",
"book_detail_admin_voice": "Voz",
"book_detail_admin_generate": "Gerar",
"book_detail_admin_save_cover": "Salvar Capa",
"book_detail_admin_saving": "Salvando…",
"book_detail_admin_saved": "Salvo",
"book_detail_admin_apply": "Aplicar",
"book_detail_admin_applying": "Aplicando…",
"book_detail_admin_applied": "Aplicado",
"book_detail_admin_discard": "Descartar",
"book_detail_admin_enqueue_audio": "Enfileirar Áudio",
"book_detail_admin_cancel_audio": "Cancelar",
"book_detail_admin_enqueued": "{enqueued} enfileirados, {skipped} ignorados",
"book_detail_scraping_progress": "Buscando os primeiros 20 capítulos. Esta página será atualizada automaticamente.",
"book_detail_scraping_home": "← Início",
"book_detail_rescrape_book": "Reextrair livro",
"book_detail_less": "Menos",
"book_detail_more": "Mais",
"chapters_search_placeholder": "Pesquisar capítulos…",
"chapters_jump_to": "Ir para Cap.{n}",
"chapters_no_match": "Nenhum capítulo encontrado para \"{q}\"",
"chapters_none_available": "Nenhum capítulo disponível ainda.",
"chapters_reading_indicator": "lendo",
"chapters_result_count": "{n} resultados",
"reader_fetching_chapter": "Buscando capítulo…",
"reader_words": "{n} palavras",
"reader_preview_audio_notice": "Prévia — áudio não disponível para livros fora da biblioteca.",
"profile_click_to_change": "Clique no avatar para mudar a foto",
"profile_tts_voice": "Voz TTS",
"profile_auto_advance": "Avançar automaticamente para o próximo capítulo",
@@ -335,7 +331,6 @@
"profile_updating": "Atualizando…",
"profile_password_changed_ok": "Senha alterada com sucesso.",
"profile_playback_speed": "Velocidade de reprodução — {speed}x",
"profile_subscription_heading": "Assinatura",
"profile_plan_pro": "Pro",
"profile_plan_free": "Gratuito",
@@ -347,27 +342,48 @@
"profile_upgrade_monthly": "Mensal — $6 / mês",
"profile_upgrade_annual": "Anual — $48 / ano",
"profile_free_limits": "Plano gratuito: 3 capítulos de áudio por dia, somente inglês.",
"subscribe_page_title": "Seja Pro — libnovel",
"subscribe_heading": "Leia mais. Ouça mais.",
"subscribe_subheading": "Torne-se Pro e desbloqueie a experiência completa do libnovel.",
"subscribe_monthly_label": "Mensal",
"subscribe_monthly_price": "$6",
"subscribe_monthly_period": "por mês",
"subscribe_annual_label": "Anual",
"subscribe_annual_price": "$48",
"subscribe_annual_period": "por ano",
"subscribe_annual_save": "Economize 33%",
"subscribe_cta_monthly": "Começar plano mensal",
"subscribe_cta_annual": "Começar plano anual",
"subscribe_already_pro": "Você já tem uma assinatura Pro.",
"subscribe_manage": "Gerenciar assinatura",
"subscribe_benefit_audio": "Capítulos de áudio ilimitados por dia",
"subscribe_benefit_voices": "Seleção de voz para todos os mecanismos TTS",
"subscribe_benefit_translation": "Leia em francês, indonésio, português e russo",
"subscribe_benefit_downloads": "Baixe capítulos para ouvir offline",
"subscribe_login_prompt": "Entre para assinar",
"subscribe_login_cta": "Entrar",
"user_currently_reading": "Lendo Agora",
"user_library_count": "Biblioteca ({n})",
"user_joined": "Entrou em {date}",
"user_followers_label": "seguidores",
"user_following_label": "seguindo",
"user_no_books": "Nenhum livro na biblioteca ainda.",
"admin_pages_label": "Páginas",
"admin_tools_label": "Ferramentas",
"admin_nav_scrape": "Scrape",
"admin_nav_audio": "Áudio",
"admin_nav_translation": "Tradução",
"admin_nav_changelog": "Alterações",
"admin_nav_feedback": "Feedback",
"admin_nav_image_gen": "Image Gen",
"admin_nav_text_gen": "Text Gen",
"admin_nav_catalogue_tools": "Catalogue Tools",
"admin_nav_ai_jobs": "Tarefas de IA",
"admin_nav_notifications": "Notificações",
"admin_nav_errors": "Erros",
"admin_nav_analytics": "Análise",
"admin_nav_logs": "Logs",
"admin_nav_uptime": "Uptime",
"admin_nav_push": "Notificações",
"admin_scrape_status_idle": "Ocioso",
"admin_scrape_full_catalogue": "Catálogo completo",
"admin_scrape_single_book": "Livro único",
@@ -378,25 +394,21 @@
"admin_scrape_start": "Iniciar extração",
"admin_scrape_queuing": "Na fila…",
"admin_scrape_running": "Executando…",
"admin_audio_filter_jobs": "Filtrar por slug, voz ou status…",
"admin_audio_filter_cache": "Filtrar por slug, capítulo ou voz…",
"admin_audio_no_matching_jobs": "Nenhum job correspondente.",
"admin_audio_no_jobs": "Nenhum job de áudio ainda.",
"admin_audio_cache_empty": "Cache de áudio vazio.",
"admin_audio_no_cache_results": "Sem resultados.",
"admin_changelog_gitea": "Releases do Gitea",
"admin_changelog_no_releases": "Nenhum release encontrado.",
"admin_changelog_load_error": "Não foi possível carregar os releases: {error}",
"comments_top": "Mais votados",
"comments_new": "Novos",
"comments_posting": "Publicando…",
"comments_login_link": "Entre",
"comments_login_suffix": "para deixar um comentário.",
"comments_anonymous": "Anônimo",
"reader_audio_narration": "Narração em Áudio",
"reader_playing": "Reproduzindo — controles abaixo",
"reader_paused": "Pausado — controles abaixo",
@@ -409,7 +421,6 @@
"reader_voice_applies_next": "A nova voz será aplicada no próximo \"Reproduzir narração\".",
"reader_choose_voice": "Escolher Voz",
"reader_generating_narration": "Gerando narração…",
"profile_font_family": "Fonte",
"profile_font_system": "Sistema",
"profile_font_serif": "Serif",
@@ -418,5 +429,30 @@
"profile_text_size_sm": "Pequeno",
"profile_text_size_md": "Normal",
"profile_text_size_lg": "Grande",
"profile_text_size_xl": "Muito grande"
"profile_text_size_xl": "Muito grande",
"feed_page_title": "Feed — LibNovel",
"feed_heading": "Feed de seguidos",
"feed_subheading": "Livros que seus seguidos estão lendo",
"feed_empty_heading": "Nada aqui ainda",
"feed_empty_body": "Siga outros leitores para ver o que estão lendo.",
"feed_not_logged_in": "Faça login para ver seu feed.",
"feed_reader_label": "lendo",
"feed_chapters_label": "{n} capítulos",
"feed_browse_cta": "Ver catálogo",
"feed_find_users_cta": "Encontrar leitores",
"admin_nav_gitea": "Gitea",
"admin_nav_grafana": "Grafana",
"admin_translation_page_title": "Translation — Admin",
"admin_translation_heading": "Machine Translation",
"admin_translation_tab_enqueue": "Enqueue",
"admin_translation_tab_jobs": "Jobs",
"admin_translation_filter_placeholder": "Filter by slug, lang, or status…",
"admin_translation_no_matching": "No matching jobs.",
"admin_translation_no_jobs": "No translation jobs yet.",
"admin_ai_jobs_page_title": "AI Jobs — Admin",
"admin_ai_jobs_heading": "AI Jobs",
"admin_ai_jobs_subheading": "Background AI generation tasks",
"admin_text_gen_page_title": "Text Gen — Admin",
"admin_text_gen_heading": "Text Generation",
"admin_nav_import": "Import"
}


@@ -1,8 +1,8 @@
{
"$schema": "https://inlang.com/schema/inlang-message-format",
"nav_library": "Библиотека",
"nav_catalogue": "Каталог",
"nav_feed": "Лента",
"nav_feedback": "Обратная связь",
"nav_admin": "Админ",
"nav_profile": "Профиль",
@@ -10,7 +10,6 @@
"nav_sign_out": "Выйти",
"nav_toggle_menu": "Меню",
"nav_admin_panel": "Панель администратора",
"footer_library": "Библиотека",
"footer_catalogue": "Каталог",
"footer_feedback": "Обратная связь",
@@ -19,7 +18,6 @@
"footer_dmca": "DMCA",
"footer_copyright": "© {year} libnovel",
"footer_dev": "dev",
"home_title": "libnovel",
"home_stat_books": "Книги",
"home_stat_chapters": "Главы",
@@ -33,7 +31,6 @@
"home_discover_novels": "Открыть новеллы",
"home_via_reader": "от {username}",
"home_chapter_badge": "гл.{n}",
"player_generating": "Генерация… {percent}%",
"player_loading": "Загрузка…",
"player_chapters": "Главы",
@@ -57,7 +54,6 @@
"player_auto_next_aria": "Автопереход {state}",
"player_go_to_chapter": "Перейти к главе",
"player_close": "Закрыть плеер",
"login_page_title": "Вход — libnovel",
"login_heading": "Войти в libnovel",
"login_subheading": "Выберите провайдера для входа",
@@ -67,7 +63,6 @@
"login_error_oauth_state": "Вход отменён или истёк срок действия. Попробуйте снова.",
"login_error_oauth_failed": "Не удалось подключиться к провайдеру. Попробуйте снова.",
"login_error_oauth_no_email": "У вашего аккаунта нет подтверждённого email. Добавьте его и повторите попытку.",
"books_page_title": "Библиотека — libnovel",
"books_heading": "Ваша библиотека",
"books_empty_title": "Книг пока нет",
@@ -77,7 +72,6 @@
"books_last_read": "Последнее: гл.{n}",
"books_reading_progress": "Гл.{current} / {total}",
"books_remove": "Удалить",
"catalogue_page_title": "Каталог — libnovel",
"catalogue_heading": "Каталог",
"catalogue_search_placeholder": "Поиск новелл…",
@@ -98,7 +92,6 @@
"catalogue_loading": "Загрузка…",
"catalogue_load_more": "Загрузить ещё",
"catalogue_results_count": "{n} результатов",
"book_detail_page_title": "{title} — libnovel",
"book_detail_signin_to_save": "Войдите, чтобы сохранить",
"book_detail_add_to_library": "В библиотеку",
@@ -115,13 +108,11 @@
"book_detail_rescrape": "Обновить",
"book_detail_scraping": "Обновление…",
"book_detail_in_library": "В библиотеке",
"chapters_page_title": "Главы — {title}",
"chapters_heading": "Главы",
"chapters_back_to_book": "К книге",
"chapters_reading_now": "Читается",
"chapters_empty": "Главы ещё не загружены.",
"reader_page_title": "{title} — Гл.{n} — libnovel",
"reader_play_narration": "Воспроизвести озвучку",
"reader_generating_audio": "Генерация аудио…",
@@ -143,7 +134,6 @@
"reader_auto_next": "Автопереход",
"reader_speed": "Скорость",
"reader_preview_notice": "Предпросмотр — эта глава не полностью загружена.",
"profile_page_title": "Профиль — libnovel",
"profile_heading": "Профиль",
"profile_avatar_label": "Аватар",
@@ -160,6 +150,9 @@
"profile_theme_amber": "Янтарь",
"profile_theme_slate": "Сланец",
"profile_theme_rose": "Роза",
"profile_theme_forest": "Лес",
"profile_theme_mono": "Моно",
"profile_theme_cyber": "Киберпанк",
"profile_theme_light": "Light",
"profile_theme_light_slate": "Light Blue",
"profile_theme_light_rose": "Light Rose",
@@ -175,7 +168,6 @@
"profile_sessions_heading": "Активные сессии",
"profile_sign_out_all": "Выйти на всех других устройствах",
"profile_joined": "Зарегистрирован {date}",
"user_page_title": "{username} — libnovel",
"user_library_heading": "Библиотека {username}",
"user_follow": "Подписаться",
@@ -183,13 +175,11 @@
"user_followers": "{n} подписчиков",
"user_following": "{n} подписок",
"user_library_empty": "В библиотеке нет книг.",
"error_not_found_title": "Страница не найдена",
"error_not_found_body": "Запрошенная страница не существует.",
"error_generic_title": "Что-то пошло не так",
"error_go_home": "На главную",
"error_status": "Ошибка {status}",
"admin_scrape_page_title": "Парсинг — Админ",
"admin_scrape_heading": "Парсинг",
"admin_scrape_catalogue": "Парсинг каталога",
@@ -207,14 +197,11 @@
"admin_scrape_status_cancelled": "Отменено",
"admin_tasks_heading": "Последние задачи",
"admin_tasks_empty": "Задач пока нет.",
"admin_audio_page_title": "Аудио — Админ",
"admin_audio_heading": "Аудио задачи",
"admin_audio_empty": "Аудио задач нет.",
"admin_changelog_page_title": "Changelog — Админ",
"admin_changelog_heading": "Changelog",
"comments_heading": "Комментарии",
"comments_empty": "Комментариев пока нет. Будьте первым!",
"comments_placeholder": "Написать комментарий…",
@@ -228,12 +215,10 @@
"comments_hide_replies": "Скрыть ответы",
"comments_edited": "изменено",
"comments_deleted": "[удалено]",
"disclaimer_page_title": "Отказ от ответственности — libnovel",
"privacy_page_title": "Политика конфиденциальности — libnovel",
"dmca_page_title": "DMCA — libnovel",
"terms_page_title": "Условия использования — libnovel",
"common_loading": "Загрузка…",
"common_error": "Ошибка",
"common_save": "Сохранить",
@@ -247,15 +232,12 @@
"common_no": "Нет",
"common_on": "вкл.",
"common_off": "выкл.",
"locale_switcher_label": "Язык",
"books_empty_library": "Ваша библиотека пуста.",
"books_empty_discover": "Книги, которые вы начнёте читать или сохраните из",
"books_empty_discover_link": "Каталога",
"books_empty_discover_suffix": "появятся здесь.",
"books_count": "{n} книг{s}",
"catalogue_sort_updated": "По дате обновления",
"catalogue_search_button": "Поиск",
"catalogue_refresh": "Обновить",
@@ -288,7 +270,6 @@
"catalogue_scrape_forbidden_badge": "Запрещено",
"catalogue_scrape_novel_button": "Парсить",
"catalogue_scraping_novel": "Парсинг…",
"book_detail_not_in_library": "не в библиотеке",
"book_detail_continue_ch": "Продолжить гл.{n}",
"book_detail_start_ch1": "Начать с гл.1",
@@ -301,23 +282,38 @@
"book_detail_range_queuing": "В очереди…",
"book_detail_scrape_range": "Диапазон глав",
"book_detail_admin": "Администрирование",
"book_detail_admin_book_cover": "Обложка книги",
"book_detail_admin_chapter_cover": "Обложка главы",
"book_detail_admin_chapter_n": "Глава №",
"book_detail_admin_description": "Описание",
"book_detail_admin_chapter_names": "Названия глав",
"book_detail_admin_audio_tts": "Аудио TTS",
"book_detail_admin_voice": "Голос",
"book_detail_admin_generate": "Сгенерировать",
"book_detail_admin_save_cover": "Сохранить обложку",
"book_detail_admin_saving": "Сохранение…",
"book_detail_admin_saved": "Сохранено",
"book_detail_admin_apply": "Применить",
"book_detail_admin_applying": "Применение…",
"book_detail_admin_applied": "Применено",
"book_detail_admin_discard": "Отменить",
"book_detail_admin_enqueue_audio": "Поставить в очередь",
"book_detail_admin_cancel_audio": "Отмена",
"book_detail_admin_enqueued": "В очереди {enqueued}, пропущено {skipped}",
"book_detail_scraping_progress": "Загружаются первые 20 глав. Страница обновится автоматически.",
"book_detail_scraping_home": "← На главную",
"book_detail_rescrape_book": "Перепарсить книгу",
"book_detail_less": "Скрыть",
"book_detail_more": "Ещё",
"chapters_search_placeholder": "Поиск глав…",
"chapters_jump_to": "Перейти к гл.{n}",
"chapters_no_match": "Главы по запросу «{q}» не найдены",
"chapters_none_available": "Глав пока нет.",
"chapters_reading_indicator": "читается",
"chapters_result_count": "{n} результатов",
"reader_fetching_chapter": "Загрузка главы…",
"reader_words": "{n} слов",
"reader_preview_audio_notice": "Предпросмотр — аудио недоступно для книг вне библиотеки.",
"profile_click_to_change": "Нажмите на аватар для смены фото",
"profile_tts_voice": "Голос TTS",
"profile_auto_advance": "Автопереход к следующей главе",
@@ -335,7 +331,6 @@
"profile_updating": "Обновление…",
"profile_password_changed_ok": "Пароль успешно изменён.",
"profile_playback_speed": "Скорость воспроизведения — {speed}x",
"profile_subscription_heading": "Подписка",
"profile_plan_pro": "Pro",
"profile_plan_free": "Бесплатно",
@@ -347,27 +342,48 @@
"profile_upgrade_monthly": "Ежемесячно — $6 / мес",
"profile_upgrade_annual": "Ежегодно — $48 / год",
"profile_free_limits": "Бесплатный план: 3 аудиоглавы в день, только английский.",
"subscribe_page_title": "Перейти на Pro — libnovel",
"subscribe_heading": "Читайте больше. Слушайте больше.",
"subscribe_subheading": "Перейдите на Pro и откройте полный опыт libnovel.",
"subscribe_monthly_label": "Ежемесячно",
"subscribe_monthly_price": "$6",
"subscribe_monthly_period": "в месяц",
"subscribe_annual_label": "Ежегодно",
"subscribe_annual_price": "$48",
"subscribe_annual_period": "в год",
"subscribe_annual_save": "Сэкономьте 33%",
"subscribe_cta_monthly": "Начать месячный план",
"subscribe_cta_annual": "Начать годовой план",
"subscribe_already_pro": "У вас уже есть подписка Pro.",
"subscribe_manage": "Управление подпиской",
"subscribe_benefit_audio": "Неограниченные аудиоглавы в день",
"subscribe_benefit_voices": "Выбор голоса для всех TTS-движков",
"subscribe_benefit_translation": "Читайте на французском, индонезийском, португальском и русском",
"subscribe_benefit_downloads": "Скачивайте главы для прослушивания офлайн",
"subscribe_login_prompt": "Войдите, чтобы оформить подписку",
"subscribe_login_cta": "Войти",
"user_currently_reading": "Сейчас читает",
"user_library_count": "Библиотека ({n})",
"user_joined": "Зарегистрирован {date}",
"user_followers_label": "подписчиков",
"user_following_label": "подписок",
"user_no_books": "Книг в библиотеке пока нет.",
"admin_pages_label": "Страницы",
"admin_tools_label": "Инструменты",
"admin_nav_scrape": "Скрейпинг",
"admin_nav_audio": "Аудио",
"admin_nav_translation": "Перевод",
"admin_nav_changelog": "Изменения",
"admin_nav_feedback": "Отзывы",
"admin_nav_image_gen": "Image Gen",
"admin_nav_text_gen": "Text Gen",
"admin_nav_catalogue_tools": "Catalogue Tools",
"admin_nav_ai_jobs": "Задачи ИИ",
"admin_nav_notifications": "Уведомления",
"admin_nav_errors": "Ошибки",
"admin_nav_analytics": "Аналитика",
"admin_nav_logs": "Логи",
"admin_nav_uptime": "Мониторинг",
"admin_nav_push": "Уведомления",
"admin_scrape_status_idle": "Ожидание",
"admin_scrape_full_catalogue": "Полный каталог",
"admin_scrape_single_book": "Одна книга",
@@ -378,25 +394,21 @@
"admin_scrape_start": "Начать парсинг",
"admin_scrape_queuing": "В очереди…",
"admin_scrape_running": "Выполняется…",
"admin_audio_filter_jobs": "Фильтр по slug, голосу или статусу…",
"admin_audio_filter_cache": "Фильтр по slug, главе или голосу…",
"admin_audio_no_matching_jobs": "Заданий не найдено.",
"admin_audio_no_jobs": "Аудиозаданий пока нет.",
"admin_audio_cache_empty": "Аудиокэш пуст.",
"admin_audio_no_cache_results": "Результатов нет.",
"admin_changelog_gitea": "Релизы Gitea",
"admin_changelog_no_releases": "Релизов не найдено.",
"admin_changelog_load_error": "Не удалось загрузить релизы: {error}",
"comments_top": "Лучшие",
"comments_new": "Новые",
"comments_posting": "Отправка…",
"comments_login_link": "Войдите",
"comments_login_suffix": "чтобы оставить комментарий.",
"comments_anonymous": "Аноним",
"reader_audio_narration": "Аудионарратив",
"reader_playing": "Воспроизводится — управление ниже",
"reader_paused": "Пауза — управление ниже",
@@ -409,7 +421,6 @@
"reader_voice_applies_next": "Новый голос применится при следующем нажатии «Воспроизвести».",
"reader_choose_voice": "Выбрать голос",
"reader_generating_narration": "Генерация озвучки…",
"profile_font_family": "Шрифт",
"profile_font_system": "Системный",
"profile_font_serif": "Serif",
@@ -418,5 +429,30 @@
"profile_text_size_sm": "Маленький",
"profile_text_size_md": "Нормальный",
"profile_text_size_lg": "Большой",
"profile_text_size_xl": "Очень большой"
"profile_text_size_xl": "Очень большой",
"feed_page_title": "Лента — LibNovel",
"feed_heading": "Лента подписок",
"feed_subheading": "Книги, которые читают ваши подписки",
"feed_empty_heading": "Пока ничего нет",
"feed_empty_body": "Подпишитесь на других читателей, чтобы видеть, что они читают.",
"feed_not_logged_in": "Войдите, чтобы видеть свою ленту.",
"feed_reader_label": "читает",
"feed_chapters_label": "{n} глав",
"feed_browse_cta": "Каталог",
"feed_find_users_cta": "Найти читателей",
"admin_nav_gitea": "Gitea",
"admin_nav_grafana": "Grafana",
"admin_translation_page_title": "Translation — Admin",
"admin_translation_heading": "Machine Translation",
"admin_translation_tab_enqueue": "Enqueue",
"admin_translation_tab_jobs": "Jobs",
"admin_translation_filter_placeholder": "Filter by slug, lang, or status…",
"admin_translation_no_matching": "No matching jobs.",
"admin_translation_no_jobs": "No translation jobs yet.",
"admin_ai_jobs_page_title": "AI Jobs — Admin",
"admin_ai_jobs_heading": "AI Jobs",
"admin_ai_jobs_subheading": "Background AI generation tasks",
"admin_text_gen_page_title": "Text Gen — Admin",
"admin_text_gen_heading": "Text Generation",
"admin_nav_import": "Import"
}

ui/package-lock.json

@@ -10,6 +10,7 @@
"dependencies": {
"@aws-sdk/client-s3": "^3.1005.0",
"@aws-sdk/s3-request-presigner": "^3.1005.0",
"@grafana/faro-web-sdk": "^2.3.1",
"@inlang/paraglide-js": "^2.15.1",
"@opentelemetry/exporter-logs-otlp-http": "^0.214.0",
"@opentelemetry/exporter-trace-otlp-http": "^0.214.0",
@@ -1689,6 +1690,115 @@
"module-details-from-path": "^1.0.4"
}
},
"node_modules/@grafana/faro-core": {
"version": "2.3.1",
"resolved": "https://registry.npmjs.org/@grafana/faro-core/-/faro-core-2.3.1.tgz",
"integrity": "sha512-htDKO0YFKr0tfntrPoM151vOPSZzmP6oE0+0MDvbI1WDaBW4erXmYi3feGJLWDXt5/vZBg9iQRmZoRzTLTTcOA==",
"license": "Apache-2.0",
"dependencies": {
"@opentelemetry/api": "^1.9.0",
"@opentelemetry/otlp-transformer": "^0.213.0"
}
},
"node_modules/@grafana/faro-core/node_modules/@opentelemetry/otlp-transformer": {
"version": "0.213.0",
"resolved": "https://registry.npmjs.org/@opentelemetry/otlp-transformer/-/otlp-transformer-0.213.0.tgz",
"integrity": "sha512-RSuAlxFFPjeK4d5Y6ps8L2WhaQI6CXWllIjvo5nkAlBpmq2XdYWEBGiAbOF4nDs8CX4QblJDv5BbMUft3sEfDw==",
"license": "Apache-2.0",
"dependencies": {
"@opentelemetry/api-logs": "0.213.0",
"@opentelemetry/core": "2.6.0",
"@opentelemetry/resources": "2.6.0",
"@opentelemetry/sdk-logs": "0.213.0",
"@opentelemetry/sdk-metrics": "2.6.0",
"@opentelemetry/sdk-trace-base": "2.6.0",
"protobufjs": "^7.0.0"
},
"engines": {
"node": "^18.19.0 || >=20.6.0"
},
"peerDependencies": {
"@opentelemetry/api": "^1.3.0"
}
},
"node_modules/@grafana/faro-core/node_modules/@opentelemetry/resources": {
"version": "2.6.0",
"resolved": "https://registry.npmjs.org/@opentelemetry/resources/-/resources-2.6.0.tgz",
"integrity": "sha512-D4y/+OGe3JSuYUCBxtH5T9DSAWNcvCb/nQWIga8HNtXTVPQn59j0nTBAgaAXxUVBDl40mG3Tc76b46wPlZaiJQ==",
"license": "Apache-2.0",
"dependencies": {
"@opentelemetry/core": "2.6.0",
"@opentelemetry/semantic-conventions": "^1.29.0"
},
"engines": {
"node": "^18.19.0 || >=20.6.0"
},
"peerDependencies": {
"@opentelemetry/api": ">=1.3.0 <1.10.0"
}
},
"node_modules/@grafana/faro-core/node_modules/@opentelemetry/sdk-logs": {
"version": "0.213.0",
"resolved": "https://registry.npmjs.org/@opentelemetry/sdk-logs/-/sdk-logs-0.213.0.tgz",
"integrity": "sha512-00xlU3GZXo3kXKve4DLdrAL0NAFUaZ9appU/mn00S/5kSUdAvyYsORaDUfR04Mp2CLagAOhrzfUvYozY/EZX2g==",
"license": "Apache-2.0",
"dependencies": {
"@opentelemetry/api-logs": "0.213.0",
"@opentelemetry/core": "2.6.0",
"@opentelemetry/resources": "2.6.0",
"@opentelemetry/semantic-conventions": "^1.29.0"
},
"engines": {
"node": "^18.19.0 || >=20.6.0"
},
"peerDependencies": {
"@opentelemetry/api": ">=1.4.0 <1.10.0"
}
},
"node_modules/@grafana/faro-core/node_modules/@opentelemetry/sdk-metrics": {
"version": "2.6.0",
"resolved": "https://registry.npmjs.org/@opentelemetry/sdk-metrics/-/sdk-metrics-2.6.0.tgz",
"integrity": "sha512-CicxWZxX6z35HR83jl+PLgtFgUrKRQ9LCXyxgenMnz5A1lgYWfAog7VtdOvGkJYyQgMNPhXQwkYrDLujk7z1Iw==",
"license": "Apache-2.0",
"dependencies": {
"@opentelemetry/core": "2.6.0",
"@opentelemetry/resources": "2.6.0"
},
"engines": {
"node": "^18.19.0 || >=20.6.0"
},
"peerDependencies": {
"@opentelemetry/api": ">=1.9.0 <1.10.0"
}
},
"node_modules/@grafana/faro-core/node_modules/@opentelemetry/sdk-trace-base": {
"version": "2.6.0",
"resolved": "https://registry.npmjs.org/@opentelemetry/sdk-trace-base/-/sdk-trace-base-2.6.0.tgz",
"integrity": "sha512-g/OZVkqlxllgFM7qMKqbPV9c1DUPhQ7d4n3pgZFcrnrNft9eJXZM2TNHTPYREJBrtNdRytYyvwjgL5geDKl3EQ==",
"license": "Apache-2.0",
"dependencies": {
"@opentelemetry/core": "2.6.0",
"@opentelemetry/resources": "2.6.0",
"@opentelemetry/semantic-conventions": "^1.29.0"
},
"engines": {
"node": "^18.19.0 || >=20.6.0"
},
"peerDependencies": {
"@opentelemetry/api": ">=1.3.0 <1.10.0"
}
},
"node_modules/@grafana/faro-web-sdk": {
"version": "2.3.1",
"resolved": "https://registry.npmjs.org/@grafana/faro-web-sdk/-/faro-web-sdk-2.3.1.tgz",
"integrity": "sha512-WMfErl2YSP+CcfcobMpCdK6apX86hc8bymMXsvYLQpBBkQ0KJjIilEQS/YXd+g/cg6F1kwbeweisBKluNNy5sA==",
"license": "Apache-2.0",
"dependencies": {
"@grafana/faro-core": "^2.3.1",
"ua-parser-js": "1.0.41",
"web-vitals": "^5.1.0"
}
},
"node_modules/@grpc/grpc-js": {
"version": "1.14.3",
"resolved": "https://registry.npmjs.org/@grpc/grpc-js/-/grpc-js-1.14.3.tgz",
@@ -7377,6 +7487,32 @@
"node": ">=14.17"
}
},
"node_modules/ua-parser-js": {
"version": "1.0.41",
"resolved": "https://registry.npmjs.org/ua-parser-js/-/ua-parser-js-1.0.41.tgz",
"integrity": "sha512-LbBDqdIC5s8iROCUjMbW1f5dJQTEFB1+KO9ogbvlb3nm9n4YHa5p4KTvFPWvh2Hs8gZMBuiB1/8+pdfe/tDPug==",
"funding": [
{
"type": "opencollective",
"url": "https://opencollective.com/ua-parser-js"
},
{
"type": "paypal",
"url": "https://paypal.me/faisalman"
},
{
"type": "github",
"url": "https://github.com/sponsors/faisalman"
}
],
"license": "MIT",
"bin": {
"ua-parser-js": "script/cli.js"
},
"engines": {
"node": "*"
}
},
"node_modules/undici-types": {
"version": "7.18.2",
"resolved": "https://registry.npmjs.org/undici-types/-/undici-types-7.18.2.tgz",
@@ -7540,6 +7676,12 @@
}
}
},
"node_modules/web-vitals": {
"version": "5.2.0",
"resolved": "https://registry.npmjs.org/web-vitals/-/web-vitals-5.2.0.tgz",
"integrity": "sha512-i2z98bEmaCqSDiHEDu+gHl/dmR4Q+TxFmG3/13KkMO+o8UxQzCqWaDRCiLgEa41nlO4VpXSI0ASa1xWmO9sBlA==",
"license": "Apache-2.0"
},
"node_modules/webidl-conversions": {
"version": "3.0.1",
"resolved": "https://registry.npmjs.org/webidl-conversions/-/webidl-conversions-3.0.1.tgz",


@@ -31,6 +31,7 @@
"dependencies": {
"@aws-sdk/client-s3": "^3.1005.0",
"@aws-sdk/s3-request-presigner": "^3.1005.0",
"@grafana/faro-web-sdk": "^2.3.1",
"@inlang/paraglide-js": "^2.15.1",
"@opentelemetry/exporter-logs-otlp-http": "^0.214.0",
"@opentelemetry/exporter-trace-otlp-http": "^0.214.0",


@@ -97,6 +97,48 @@
--color-success: #16a34a; /* green-600 */
}
/* ── Forest theme — dark green ────────────────────────────────────────── */
[data-theme="forest"] {
--color-brand: #4ade80; /* green-400 */
--color-brand-dim: #16a34a; /* green-600 */
--color-surface: #0a130d; /* custom near-black green */
--color-surface-2: #111c14; /* custom dark green */
--color-surface-3: #1a2e1e; /* custom mid green */
--color-muted: #6b9a77; /* custom muted green */
--color-text: #e8f5e9; /* custom light green-tinted white */
--color-border: #1e3a24; /* custom green border */
--color-danger: #f87171; /* red-400 */
--color-success: #4ade80; /* green-400 */
}
/* ── Mono theme — pure dark with white accent ─────────────────────────── */
[data-theme="mono"] {
--color-brand: #f4f4f5; /* zinc-100 — white accent */
--color-brand-dim: #a1a1aa; /* zinc-400 */
--color-surface: #09090b; /* zinc-950 */
--color-surface-2: #18181b; /* zinc-900 */
--color-surface-3: #27272a; /* zinc-800 */
--color-muted: #71717a; /* zinc-500 */
--color-text: #f4f4f5; /* zinc-100 */
--color-border: #27272a; /* zinc-800 */
--color-danger: #f87171; /* red-400 */
--color-success: #4ade80; /* green-400 */
}
/* ── Cyberpunk theme — dark with neon cyan/magenta accents ────────────── */
[data-theme="cyber"] {
--color-brand: #22d3ee; /* cyan-400 — neon cyan */
--color-brand-dim: #06b6d4; /* cyan-500 */
--color-surface: #050712; /* custom near-black blue */
--color-surface-2: #0d1117; /* custom dark blue-black */
--color-surface-3: #161b27; /* custom dark blue */
--color-muted: #6272a4; /* dracula comment blue */
--color-text: #e2e8f0; /* slate-200 */
--color-border: #1e2d45; /* custom dark border */
--color-danger: #ff5555; /* dracula red */
--color-success: #50fa7b; /* dracula green */
}
html {
background-color: var(--color-surface);
color: var(--color-text);
@@ -106,12 +148,14 @@ html {
:root {
--reading-font: system-ui, -apple-system, sans-serif;
--reading-size: 1.05rem;
--reading-line-height: 1.85;
--reading-max-width: 72ch;
}
/* ── Chapter prose ─────────────────────────────────────────────────── */
.prose-chapter {
max-width: 72ch;
line-height: 1.85;
max-width: var(--reading-max-width, 72ch);
line-height: var(--reading-line-height, 1.85);
font-family: var(--reading-font);
font-size: var(--reading-size);
color: var(--color-muted);
@@ -134,6 +178,12 @@ html {
margin-bottom: 1.2em;
}
/* Indented paragraph style — book-like, no gap, indent instead */
.prose-chapter.para-indented p {
text-indent: 2em;
margin-bottom: 0.35em;
}
.prose-chapter em {
color: var(--color-muted);
}
@@ -147,6 +197,49 @@ html {
margin: 2em 0;
}
/* ── Reading progress bar ───────────────────────────────────────────── */
.reading-progress {
position: fixed;
top: 0;
left: 0;
height: 2px;
z-index: 100;
background: var(--color-brand);
pointer-events: none;
transition: width 0.1s linear;
}
/* ── Paginated reader ───────────────────────────────────────────────── */
.paginated-container {
overflow: hidden;
cursor: pointer;
user-select: none;
-webkit-user-select: none;
}
.paginated-container .prose-chapter {
transition: transform 0.25s cubic-bezier(0.4, 0, 0.2, 1);
will-change: transform;
}
/* ── Hide scrollbars (used on horizontal carousels) ────────────────── */
.scrollbar-none {
scrollbar-width: none; /* Firefox */
-ms-overflow-style: none; /* IE / Edge legacy */
}
.scrollbar-none::-webkit-scrollbar {
display: none; /* Chrome / Safari / WebKit */
}
/* ── Hero carousel fade ─────────────────────────────────────────────── */
@keyframes fade-in {
from { opacity: 0; }
to { opacity: 1; }
}
.animate-fade-in {
animation: fade-in 0.4s ease-out forwards;
}
/* ── Navigation progress bar ───────────────────────────────────────── */
@keyframes progress-bar {
0% { width: 0%; opacity: 1; }
@@ -154,5 +247,68 @@ html {
100% { width: 100%; opacity: 0; }
}
.animate-progress-bar {
animation: progress-bar 8s cubic-bezier(0.1, 0.05, 0.1, 1) forwards;
animation: progress-bar 4s cubic-bezier(0.1, 0.05, 0.1, 1) forwards;
}
/* ── Logo animation classes (used in nav + admin preview) ───────────── */
@keyframes logo-glow-pulse {
0%, 100% { text-shadow: 0 0 6px color-mix(in srgb, var(--color-brand) 60%, transparent); }
50% { text-shadow: 0 0 18px color-mix(in srgb, var(--color-brand) 90%, transparent), 0 0 32px color-mix(in srgb, var(--color-brand) 40%, transparent); }
}
.logo-anim-glow {
animation: logo-glow-pulse 2.4s ease-in-out infinite;
}
@keyframes logo-shimmer {
0% { background-position: -200% center; }
100% { background-position: 200% center; }
}
.logo-anim-shimmer {
background: linear-gradient(
90deg,
var(--color-brand) 0%,
color-mix(in srgb, var(--color-brand) 40%, white) 40%,
var(--color-brand) 50%,
color-mix(in srgb, var(--color-brand) 40%, white) 60%,
var(--color-brand) 100%
);
background-size: 200% auto;
background-clip: text;
-webkit-background-clip: text;
-webkit-text-fill-color: transparent;
animation: logo-shimmer 2.2s linear infinite;
}
@keyframes logo-pulse-scale {
0%, 100% { transform: scale(1); }
50% { transform: scale(1.06); }
}
.logo-anim-pulse {
display: inline-block;
animation: logo-pulse-scale 1.8s ease-in-out infinite;
}
@keyframes logo-rainbow {
0% { filter: hue-rotate(0deg); }
100% { filter: hue-rotate(360deg); }
}
.logo-anim-rainbow {
animation: logo-rainbow 4s linear infinite;
}
/* ── Respect reduced motion — disable all decorative animations ─────── */
@media (prefers-reduced-motion: reduce) {
*,
*::before,
*::after {
animation-duration: 0.01ms !important;
animation-iteration-count: 1 !important;
transition-duration: 0.01ms !important;
}
}
/* ── Footer content-visibility — skip paint for off-screen footer ───── */
footer {
content-visibility: auto;
contain-intrinsic-size: auto 80px;
}
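The `--reading-*` custom properties introduced in this stylesheet are presumably written from the profile settings (font family and text size pickers seen in the locale strings). A minimal sketch of the size mapping — key names and rem values are assumptions, only the `1.05rem` default is taken from the CSS above:

```typescript
// Hypothetical mapping from the stored text-size preference to --reading-size.
// Only the 'md' value mirrors the stylesheet default; the rest are illustrative.
type TextSize = 'sm' | 'md' | 'lg' | 'xl';

const READING_SIZES: Record<TextSize, string> = {
  sm: '0.95rem',
  md: '1.05rem', // matches the :root default in app.css
  lg: '1.15rem',
  xl: '1.3rem'
};

/** Resolve the CSS value for a stored preference key. */
export function readingSizeFor(size: TextSize): string {
  return READING_SIZES[size];
}

// In the browser this would be applied roughly as:
// document.documentElement.style.setProperty('--reading-size', readingSizeFor(pref));
```

Keeping the mapping in one place means the reader page and the settings preview cannot drift apart.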


@@ -3,6 +3,7 @@
<head>
<meta charset="utf-8" />
<meta name="viewport" content="width=device-width, initial-scale=1" />
<link rel="manifest" href="/manifest.webmanifest" />
<link rel="icon" href="/favicon.ico" sizes="16x16 32x32" />
<link rel="icon" type="image/png" href="/favicon-32.png" sizes="32x32" />
<link rel="icon" type="image/png" href="/favicon-16.png" sizes="16x16" />


@@ -1,5 +1,6 @@
import * as Sentry from '@sentry/sveltekit';
import { env } from '$env/dynamic/public';
import { initializeFaro, getWebInstrumentations } from '@grafana/faro-web-sdk';
// Sentry / GlitchTip client-side error tracking.
// No-op when PUBLIC_GLITCHTIP_DSN is unset (e.g. local dev).
@@ -13,4 +14,21 @@ if (env.PUBLIC_GLITCHTIP_DSN) {
});
}
// Grafana Faro RUM — browser performance monitoring (Web Vitals, traces, errors).
// No-op when PUBLIC_FARO_COLLECTOR_URL is unset (e.g. local dev).
if (env.PUBLIC_FARO_COLLECTOR_URL) {
initializeFaro({
url: env.PUBLIC_FARO_COLLECTOR_URL,
app: {
name: 'libnovel-ui',
version: env.PUBLIC_BUILD_VERSION || 'dev',
environment: 'production'
},
instrumentations: [
// Core Web Vitals (LCP, CLS, INP, TTFB, FCP) + JS errors + console
...getWebInstrumentations({ captureConsole: false })
]
});
}
export const handleError = Sentry.handleErrorWithSentry();


@@ -32,8 +32,18 @@
* It only runs once per chapter (guarded by nextStatus !== 'none').
*/
import type { Voice } from '$lib/types';
export type AudioStatus = 'idle' | 'loading' | 'generating' | 'ready' | 'error';
export type NextStatus = 'none' | 'prefetching' | 'prefetched' | 'failed';
/**
* 'stream' Use /api/audio-stream: audio starts playing within seconds,
* stream is saved to MinIO concurrently. No runner task needed.
* 'generate' Legacy mode: queue a runner task, poll until done, then play
* from the presigned MinIO URL. Needed for CF AI voices which
* do not support native streaming.
*/
export type AudioMode = 'stream' | 'generate';
class AudioStore {
// ── What is loaded ──────────────────────────────────────────────────────
@@ -44,12 +54,22 @@ class AudioStore {
voice = $state('af_bella');
speed = $state(1.0);
/**
* Playback mode:
* 'stream' pipe from /api/audio-stream (low latency, saves concurrently)
* 'generate' queue runner task, poll, then play presigned URL (CF AI / legacy)
*/
audioMode = $state<AudioMode>('stream');
/** Cover image URL for the currently loaded book. */
cover = $state('');
/** Full chapter list for the currently loaded book (number + title). */
chapters = $state<{ number: number; title: string }[]>([]);
/** Available voices (populated by the chapter AudioPlayer on mount). */
voices = $state<Voice[]>([]);
// ── Loading/generation state ────────────────────────────────────────────
status = $state<AudioStatus>('idle');
audioUrl = $state('');
@@ -57,6 +77,13 @@ class AudioStore {
/** Pseudo-progress bar value 0–100 during generation */
progress = $state(0);
/**
* True while playing a short CF AI preview clip (~1-2 min) and the full
* audio is still being generated in the background. Set to false once the
* full audio URL has been swapped in.
*/
isPreview = $state(false);
// ── Playback state (kept in sync with the <audio> element) ─────────────
currentTime = $state(0);
duration = $state(0);
@@ -75,6 +102,13 @@ class AudioStore {
*/
seekRequest = $state<number | null>(null);
// ── Sleep timer ──────────────────────────────────────────────────────────
/** Epoch ms when sleep timer should fire. 0 = off. */
sleepUntil = $state(0);
/** When true, pause after the current chapter ends instead of navigating. */
sleepAfterChapter = $state(false);
// ── Auto-next ────────────────────────────────────────────────────────────
/**
* When true, navigates to the next chapter when the current one ends
@@ -82,6 +116,13 @@ class AudioStore {
*/
autoNext = $state(false);
/**
* When true, announces the upcoming chapter number and title via the
* Web Speech API before auto-next navigation fires.
* e.g. "Chapter 12 — The Final Battle"
*/
announceChapter = $state(false);
/**
* The next chapter number for the currently playing chapter, or null if
* there is no next chapter. Written by the chapter page's AudioPlayer.
@@ -129,11 +170,39 @@ class AudioStore {
return this.status === 'ready' || this.status === 'generating' || this.status === 'loading';
}
/**
* When true the persistent mini-bar in +layout.svelte is hidden.
* Set by the chapter reader page when playerStyle is 'float' or 'minimal'
* so the in-page player is the sole control surface.
* Cleared when leaving the chapter page (page destroy / onDestroy effect).
*/
suppressMiniBar = $state(false);
/**
* Position of the draggable float overlay (bottom-right anchor offsets).
* Stored here (module singleton) so the position survives chapter navigation.
* x > 0 = moved left; y > 0 = moved up.
*/
floatPos = $state({ x: 0, y: 0 });
/** True when the currently loaded track matches slug+chapter */
isCurrentChapter(slug: string, chapter: number): boolean {
return this.slug === slug && this.chapter === chapter;
}
// ── Announce-chapter navigation state ────────────────────────────────────
/**
* When true, the <audio> element is playing a short announcement clip
* (not chapter audio). The next `onended` should navigate to
* announcePendingSlug / announcePendingChapter instead of the normal
* auto-next flow.
*/
announceNavigatePending = $state(false);
/** Target book slug for the pending announce-then-navigate transition. */
announcePendingSlug = $state('');
/** Target chapter number for the pending announce-then-navigate transition. */
announcePendingChapter = $state(0);
/** Reset all next-chapter pre-fetch state. */
resetNextPrefetch() {
this.nextStatus = 'none';
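The announce-then-navigate flags documented above imply an `onended` dispatcher along these lines. This is a sketch, not the PR's actual handler: the field subset, route shape, and precedence order are all assumptions.

```typescript
// Minimal shape of the store fields the handler needs (subset of AudioStore above).
interface AnnounceState {
  announceNavigatePending: boolean;
  announcePendingSlug: string;
  announcePendingChapter: number;
  autoNext: boolean;
  sleepAfterChapter: boolean;
}

/**
 * Decide where to go when the <audio> element fires `ended`.
 * Returns a path to navigate to, or null to stay put (pause).
 */
export function nextNavigation(
  s: AnnounceState,
  nextChapterPath: string | null
): string | null {
  if (s.announceNavigatePending) {
    // The clip that just finished was the spoken announcement, not chapter audio:
    // navigate to the pending target instead of running the normal auto-next flow.
    s.announceNavigatePending = false;
    return `/books/${s.announcePendingSlug}/chapters/${s.announcePendingChapter}`;
  }
  if (s.sleepAfterChapter) return null; // sleep timer: pause after this chapter
  if (s.autoNext) return nextChapterPath;
  return null;
}
```

Separating this decision from the DOM handler keeps the announcement/sleep/auto-next precedence testable in isolation.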

File diff suppressed because it is too large.


@@ -0,0 +1,132 @@
<script lang="ts">
import { cn } from '$lib/utils';
interface ChapterMeta {
number: number;
title: string;
}
interface Props {
/** Full chapter list to render and filter. */
chapters: ChapterMeta[];
/** Number of the currently-active chapter (highlighted + auto-scrolled). */
activeChapter: number;
/** z-index class, e.g. "z-[60]" or "z-[80]". Defaults to "z-[60]". */
zIndex?: string;
/** Called when a chapter row is tapped. The overlay does NOT close itself. */
onselect: (chapterNumber: number) => void;
/** Called when the close (✕) button is tapped. */
onclose: () => void;
}
let {
chapters,
activeChapter,
zIndex = 'z-[60]',
onselect,
onclose
}: Props = $props();
let search = $state('');
const filtered = $derived(
search.trim() === ''
? chapters
: chapters.filter((ch) =>
(ch.title || `Chapter ${ch.number}`)
.toLowerCase()
.includes(search.toLowerCase()) ||
String(ch.number).includes(search)
)
);
function handleClose() {
search = '';
onclose();
}
function handleSelect(n: number) {
search = '';
onselect(n);
}
</script>
<!-- svelte-ignore a11y_no_static_element_interactions -->
<div
class="fixed inset-0 flex flex-col {zIndex}"
style="background: var(--color-surface);"
>
<!-- Header -->
<div
class="flex items-center gap-3 px-4 py-3 border-b border-(--color-border) shrink-0"
style="padding-top: max(0.75rem, env(safe-area-inset-top));"
>
<button
type="button"
onclick={handleClose}
class="p-2 rounded-full text-(--color-muted) hover:text-(--color-text) hover:bg-(--color-surface-2) transition-colors"
aria-label="Close chapter picker"
>
<!-- close / ✕ -->
<svg class="w-5 h-5" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M6 18L18 6M6 6l12 12"/>
</svg>
</button>
<span class="text-xs font-semibold text-(--color-muted) uppercase tracking-wider flex-1">Chapters</span>
</div>
<!-- Search -->
<div class="px-4 py-3 shrink-0 border-b border-(--color-border)">
<div class="relative">
<svg class="absolute left-3 top-1/2 -translate-y-1/2 w-4 h-4 text-(--color-muted)" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M21 21l-6-6m2-5a7 7 0 11-14 0 7 7 0 0114 0z"/>
</svg>
<input
type="search"
placeholder="Search chapters…"
bind:value={search}
class="w-full pl-9 pr-4 py-2 text-sm bg-(--color-surface-2) border border-(--color-border) rounded-lg text-(--color-text) placeholder:text-(--color-muted) focus:outline-none focus:border-(--color-brand) transition-colors"
/>
</div>
</div>
<!-- Chapter list -->
<div
class="flex-1 overflow-y-auto overscroll-contain"
style="padding-bottom: env(safe-area-inset-bottom);"
>
{#each filtered as ch (ch.number)}
<button
type="button"
onclick={() => handleSelect(ch.number)}
class={cn(
'w-full flex items-center gap-3 px-4 py-3 border-b border-(--color-border)/40 transition-colors text-left',
ch.number === activeChapter ? 'bg-(--color-brand)/8' : 'hover:bg-(--color-surface-2)'
)}
>
<span class={cn(
'w-8 h-8 shrink-0 rounded-full border-2 flex items-center justify-center tabular-nums text-xs font-semibold transition-colors',
ch.number === activeChapter
? 'border-(--color-brand) bg-(--color-brand) text-(--color-surface)'
: 'border-(--color-border) text-(--color-muted)'
)}>{ch.number}</span>
<span class={cn(
'flex-1 text-sm truncate',
ch.number === activeChapter ? 'font-semibold text-(--color-brand)' : 'text-(--color-text)'
)}>{ch.title || `Chapter ${ch.number}`}</span>
{#if ch.number === activeChapter}
<!-- play icon -->
<svg class="w-4 h-4 shrink-0 text-(--color-brand)" fill="currentColor" viewBox="0 0 24 24">
<path d="M8 5v14l11-7z"/>
</svg>
{/if}
</button>
{/each}
{#if filtered.length === 0}
<p class="px-4 py-8 text-sm text-(--color-muted) text-center">No chapters match "{search}"</p>
{/if}
</div>
</div>
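The overlay's `$derived` filter accepts either a title substring (case-insensitive) or a chapter-number substring. Restated as a plain function for clarity — the trimming/lowercasing details here are a sketch of the component logic above, not a shared utility in the PR:

```typescript
interface ChapterMeta {
  number: number;
  title: string;
}

/** Match chapters on a title substring or a chapter-number substring. */
export function filterChapters(chapters: ChapterMeta[], search: string): ChapterMeta[] {
  const q = search.trim().toLowerCase();
  if (q === '') return chapters; // empty query shows the full list
  return chapters.filter(
    (ch) =>
      // Untitled chapters fall back to "Chapter N", same as the rendered row.
      (ch.title || `Chapter ${ch.number}`).toLowerCase().includes(q) ||
      String(ch.number).includes(q)
  );
}
```

Matching on `String(ch.number)` lets "12" find chapter 12 and also chapters 112, 120, etc., which is why the component pairs it with the title match rather than an exact-number lookup.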


@@ -241,311 +241,321 @@
const totalCount = $derived(
comments.reduce((n, c) => n + 1 + (c.replies?.length ?? 0), 0)
);
// ── Collapsed state ───────────────────────────────────────────────────────
// Hidden by default when there are no comments; expand on user tap.
let expanded = $state(false);
const hasComments = $derived(!loading && comments.length > 0);
// Auto-expand once comments load in
$effect(() => {
if (hasComments) expanded = true;
});
</script>
<div class="mt-10">
<!-- Header + sort controls -->
<div class="flex items-center justify-between gap-3 mb-4 flex-wrap">
<h2 class="text-base font-semibold text-(--color-text)">
{#if !expanded && !hasComments && !loading}
<!-- Collapsed: just a subtle link — no wasted real-estate for empty chapters -->
<button
type="button"
onclick={() => (expanded = true)}
class="flex items-center gap-1.5 text-sm text-(--color-muted) hover:text-(--color-text) transition-colors"
>
<svg class="w-4 h-4" fill="none" stroke="currentColor" viewBox="0 0 24 24" stroke-width="1.5">
<path stroke-linecap="round" stroke-linejoin="round" d="M8.625 12a.375.375 0 11-.75 0 .375.375 0 01.75 0zm0 0H8.25m4.125 0a.375.375 0 11-.75 0 .375.375 0 01.75 0zm0 0H12m4.125 0a.375.375 0 11-.75 0 .375.375 0 01.75 0zm0 0h-.375M21 12c0 4.556-4.03 8.25-9 8.25a9.764 9.764 0 01-2.555-.337A5.972 5.972 0 015.41 20.97a5.969 5.969 0 01-.474-.065 4.48 4.48 0 00.978-2.025c.09-.457-.133-.901-.467-1.226C3.93 16.178 3 14.189 3 12c0-4.556 4.03-8.25 9-8.25s9 3.694 9 8.25z"/>
</svg>
{m.comments_heading()}
{#if !loading && totalCount > 0}
<span class="text-(--color-muted) font-normal text-sm ml-1">({totalCount})</span>
{/if}
</h2>
<!-- Sort tabs -->
{#if !loading && comments.length > 0}
<div class="flex items-center gap-1 text-xs rounded-lg bg-(--color-surface-2)/60 p-1">
<Button
variant="ghost"
size="sm"
class={cn('px-2.5 py-1 h-auto text-xs rounded-md', sort === 'top' ? 'bg-(--color-surface-3) text-(--color-text) hover:bg-(--color-surface-3)' : 'text-(--color-muted) hover:text-(--color-text)')}
onclick={() => (sort = 'top')}
>{m.comments_top()}</Button>
<Button
variant="ghost"
size="sm"
class={cn('px-2.5 py-1 h-auto text-xs rounded-md', sort === 'new' ? 'bg-(--color-surface-3) text-(--color-text) hover:bg-(--color-surface-3)' : 'text-(--color-muted) hover:text-(--color-text)')}
onclick={() => (sort = 'new')}
>{m.comments_new()}</Button>
</div>
{/if}
</div>
<!-- Post form -->
<div class="mb-6">
{#if isLoggedIn}
<div class="flex flex-col gap-2">
<Textarea
bind:value={newBody}
placeholder={m.comments_placeholder()}
rows={3}
/>
<div class="flex items-center justify-between gap-3">
<span class={cn('text-xs tabular-nums', charOver ? 'text-(--color-danger)' : 'text-(--color-muted) opacity-60')}>
{charCount}/2000
</span>
<div class="flex items-center gap-3">
{#if postError}
<span class="text-xs text-(--color-danger)">{postError}</span>
{/if}
<Button
variant="default"
size="sm"
disabled={posting || !newBody.trim() || charOver}
onclick={postComment}
>
{posting ? m.comments_posting() : m.comments_submit()}
</Button>
</div>
</div>
</div>
{:else}
<p class="text-sm text-(--color-muted)">
<a href="/login" class="text-(--color-brand) hover:text-(--color-brand-dim) transition-colors">{m.comments_login_link()}</a>
{m.comments_login_suffix()}
</p>
{/if}
</div>
<!-- Comment list -->
{#if loading}
<div class="flex flex-col gap-3">
{#each Array(3) as _}
<div class="rounded-lg bg-(--color-surface-2)/50 p-4 animate-pulse">
<div class="h-3 w-24 bg-(--color-surface-3) rounded mb-3"></div>
<div class="h-3 w-full bg-(--color-surface-3)/60 rounded mb-2"></div>
<div class="h-3 w-3/4 bg-(--color-surface-3)/60 rounded"></div>
</div>
{/each}
</div>
{:else if loadError}
<p class="text-sm text-(--color-danger)">{loadError}</p>
{:else if comments.length === 0}
<p class="text-sm text-(--color-muted)">{m.comments_empty()}</p>
</button>
{:else}
<div class="flex flex-col gap-3">
{#each comments as comment (comment.id)}
{@const myVote = myVotes[comment.id]}
{@const voting = votingIds.has(comment.id)}
{@const deleting = deletingIds.has(comment.id)}
{@const isOwner = isLoggedIn && currentUserId === comment.user_id}
<!-- Expanded: full comments section -->
<div class="rounded-lg bg-(--color-surface-2)/50 border border-(--color-border)/50 px-4 py-3 flex flex-col gap-2 {deleting ? 'opacity-50' : ''}">
<!-- Header -->
<div class="flex items-center gap-2 flex-wrap">
{#if avatarUrls[comment.user_id]}
<img src={avatarUrls[comment.user_id]} alt={comment.username} class="w-6 h-6 rounded-full object-cover flex-shrink-0" />
{:else}
<div class="w-6 h-6 rounded-full bg-(--color-surface-3) flex items-center justify-center flex-shrink-0">
<span class="text-[9px] font-semibold text-(--color-text) leading-none">{initials(comment.username)}</span>
</div>
{/if}
{#if comment.username}
<!-- Header + sort controls -->
<div class="flex items-center justify-between gap-3 mb-4 flex-wrap">
	<h2 class="text-base font-semibold text-(--color-text)">
		{m.comments_heading()}
		{#if !loading && totalCount > 0}
			<span class="text-(--color-muted) font-normal text-sm ml-1">({totalCount})</span>
		{/if}
	</h2>
	{#if !loading && comments.length > 0}
		<div class="flex items-center gap-1 text-xs rounded-lg bg-(--color-surface-2)/60 p-1">
			<Button
				variant="ghost"
				size="sm"
				class={cn('px-2.5 py-1 h-auto text-xs rounded-md', sort === 'top' ? 'bg-(--color-surface-3) text-(--color-text) hover:bg-(--color-surface-3)' : 'text-(--color-muted) hover:text-(--color-text)')}
				onclick={() => (sort = 'top')}
			>{m.comments_top()}</Button>
			<Button
				variant="ghost"
				size="sm"
				class={cn('px-2.5 py-1 h-auto text-xs rounded-md', sort === 'new' ? 'bg-(--color-surface-3) text-(--color-text) hover:bg-(--color-surface-3)' : 'text-(--color-muted) hover:text-(--color-text)')}
				onclick={() => (sort = 'new')}
			>{m.comments_new()}</Button>
		</div>
	{/if}
</div>
<!-- Post form -->
<div class="mb-6">
	{#if isLoggedIn}
		<div class="flex flex-col gap-2">
			<Textarea
				bind:value={newBody}
				placeholder={m.comments_placeholder()}
				rows={3}
			/>
			<div class="flex items-center justify-between gap-3">
				<span class={cn('text-xs tabular-nums', charOver ? 'text-(--color-danger)' : 'text-(--color-muted) opacity-60')}>
					{charCount}/2000
				</span>
				<div class="flex items-center gap-3">
					{#if postError}
						<span class="text-xs text-(--color-danger)">{postError}</span>
					{/if}
					<Button
						variant="default"
						size="sm"
						disabled={posting || !newBody.trim() || charOver}
						onclick={postComment}
					>
						{posting ? m.comments_posting() : m.comments_submit()}
					</Button>
				</div>
			</div>
		</div>
	{:else}
		<p class="text-sm text-(--color-muted)">
			<a href="/login" class="text-(--color-brand) hover:text-(--color-brand-dim) transition-colors">{m.comments_login_link()}</a>
			{m.comments_login_suffix()}
		</p>
	{/if}
</div>
<!-- Comment list -->
{#if loading}
	<div class="flex flex-col gap-3">
		{#each Array(3) as _}
			<div class="rounded-lg bg-(--color-surface-2)/50 p-4 animate-pulse">
				<div class="h-3 w-24 bg-(--color-surface-3) rounded mb-3"></div>
				<div class="h-3 w-full bg-(--color-surface-3)/60 rounded mb-2"></div>
				<div class="h-3 w-3/4 bg-(--color-surface-3)/60 rounded"></div>
			</div>
		{/each}
	</div>
{:else if loadError}
	<p class="text-sm text-(--color-danger)">{loadError}</p>
{:else}
	<div class="flex flex-col gap-3">
		{#each comments as comment (comment.id)}
			{@const myVote = myVotes[comment.id]}
			{@const voting = votingIds.has(comment.id)}
			{@const deleting = deletingIds.has(comment.id)}
			{@const isOwner = isLoggedIn && currentUserId === comment.user_id}
			<div class={cn('rounded-lg bg-(--color-surface-2)/50 border border-(--color-border)/50 px-4 py-3 flex flex-col gap-2', deleting && 'opacity-50')}>
				<!-- Header -->
				<div class="flex items-center gap-2 flex-wrap">
					{#if avatarUrls[comment.user_id]}
						<img src={avatarUrls[comment.user_id]} alt={comment.username} class="w-6 h-6 rounded-full object-cover flex-shrink-0" />
					{:else}
						<div class="w-6 h-6 rounded-full bg-(--color-surface-3) flex items-center justify-center flex-shrink-0">
							<span class="text-[9px] font-semibold text-(--color-text) leading-none">{initials(comment.username)}</span>
						</div>
					{/if}
					{#if comment.username}
						<a href="/users/{comment.username}" class="text-sm font-medium text-(--color-text) hover:text-(--color-brand) transition-colors">{comment.username}</a>
					{:else}
						<span class="text-sm font-medium text-(--color-muted)">{m.comments_anonymous()}</span>
					{/if}
					<span class="text-(--color-muted) opacity-60 text-xs">&middot;</span>
					<span class="text-xs text-(--color-muted)">{formatDate(comment.created)}</span>
				</div>
				<!-- Body -->
				<p class="text-sm text-(--color-text) leading-relaxed whitespace-pre-wrap break-words">{comment.body}</p>
				<!-- Actions row: votes + reply + delete -->
				<div class="flex items-center gap-3 pt-1 flex-wrap">
					<!-- Upvote -->
					<Button
						variant="ghost"
						size="sm"
						class={cn('h-auto px-1 py-0 gap-1 text-xs', myVote === 'up' ? 'text-(--color-brand)' : 'text-(--color-muted) hover:text-(--color-text)')}
						disabled={voting}
						onclick={() => vote(comment.id, 'up')}
						title={m.comments_vote_up()}
					>
						<svg class="w-3.5 h-3.5" fill="none" stroke="currentColor" viewBox="0 0 24 24" stroke-width="2">
							<path stroke-linecap="round" stroke-linejoin="round" d="M14 10h4.764a2 2 0 011.789 2.894l-3.5 7A2 2 0 0115.263 21h-4.017c-.163 0-.326-.02-.485-.06L7 20m7-10V5a2 2 0 00-2-2h-.095c-.5 0-.905.405-.905.905 0 .714-.211 1.412-.608 2.006L7 11v9m7-10h-2M7 20H5a2 2 0 01-2-2v-6a2 2 0 012-2h2.5"/>
						</svg>
						<span class="tabular-nums">{comment.upvotes ?? 0}</span>
					</Button>
					<!-- Downvote -->
					<Button
						variant="ghost"
						size="sm"
						class={cn('h-auto px-1 py-0 gap-1 text-xs', myVote === 'down' ? 'text-(--color-danger)' : 'text-(--color-muted) hover:text-(--color-text)')}
						disabled={voting}
						onclick={() => vote(comment.id, 'down')}
						title={m.comments_vote_down()}
					>
						<svg class="w-3.5 h-3.5" fill="none" stroke="currentColor" viewBox="0 0 24 24" stroke-width="2">
							<path stroke-linecap="round" stroke-linejoin="round" d="M10 14H5.236a2 2 0 01-1.789-2.894l3.5-7A2 2 0 018.736 3h4.018a2 2 0 01.485.06l3.76.94m-7 10v5a2 2 0 002 2h.096c.5 0 .905-.405.905-.904 0-.715.211-1.413.608-2.008L17 13V4m-7 10h2m5-10h2a2 2 0 012 2v6a2 2 0 01-2 2h-2.5"/>
						</svg>
						<span class="tabular-nums">{comment.downvotes ?? 0}</span>
					</Button>
					<!-- Reply button -->
					{#if isLoggedIn}
						<Button
							variant="ghost"
							size="sm"
							class={cn('h-auto px-1 py-0 gap-1 text-xs', replyingTo === comment.id ? 'text-(--color-brand)' : 'text-(--color-muted) hover:text-(--color-text)')}
							onclick={() => {
								if (replyingTo === comment.id) {
									replyingTo = null;
									replyBody = '';
									replyError = '';
								} else {
									replyingTo = comment.id;
									replyBody = '';
									replyError = '';
								}
							}}
						>
							<svg class="w-3.5 h-3.5" fill="none" stroke="currentColor" viewBox="0 0 24 24" stroke-width="2">
								<path stroke-linecap="round" stroke-linejoin="round" d="M3 10h10a8 8 0 018 8v2M3 10l6 6m-6-6l6-6"/>
							</svg>
							{m.comments_reply()}
						</Button>
					{/if}
					<!-- Delete (owner only) -->
					{#if isOwner}
						<Button
							variant="ghost"
							size="sm"
							class="h-auto px-1 py-0 gap-1 text-xs text-(--color-muted) hover:text-(--color-danger) ml-auto"
							disabled={deleting}
							onclick={() => deleteComment(comment.id)}
							title="Delete comment"
						>
							<svg class="w-3.5 h-3.5" fill="none" stroke="currentColor" viewBox="0 0 24 24" stroke-width="2">
								<path stroke-linecap="round" stroke-linejoin="round" d="M19 7l-.867 12.142A2 2 0 0116.138 21H7.862a2 2 0 01-1.995-1.858L5 7m5 4v6m4-6v6m1-10V4a1 1 0 00-1-1h-4a1 1 0 00-1 1v3M4 7h16"/>
							</svg>
							{m.comments_delete()}
						</Button>
					{/if}
				</div>
				<!-- Inline reply form -->
				{#if replyingTo === comment.id}
					<div class="mt-1 flex flex-col gap-2 pl-2 border-l-2 border-(--color-border)">
						<Textarea
							bind:value={replyBody}
							placeholder={m.comments_placeholder()}
							rows={2}
						/>
						<div class="flex items-center justify-between gap-2">
							<span class={cn('text-xs tabular-nums', replyCharOver ? 'text-(--color-danger)' : 'text-(--color-muted) opacity-60')}>
								{replyCharCount}/2000
							</span>
							<div class="flex items-center gap-2">
								{#if replyError}
									<span class="text-xs text-(--color-danger)">{replyError}</span>
								{/if}
								<Button
									variant="ghost"
									size="sm"
									class="text-(--color-muted) hover:text-(--color-text)"
									onclick={() => { replyingTo = null; replyBody = ''; replyError = ''; }}
								>{m.common_cancel()}</Button>
								<Button
									variant="default"
									size="sm"
									disabled={replyPosting || !replyBody.trim() || replyCharOver}
									onclick={() => postReply(comment.id)}
								>
									{replyPosting ? m.comments_posting() : m.comments_reply()}
								</Button>
							</div>
						</div>
					</div>
				{/if}
				<!-- Replies -->
				{#if comment.replies && comment.replies.length > 0}
					<div class="mt-1 flex flex-col gap-2 pl-3 border-l-2 border-(--color-border)/60">
						{#each comment.replies as reply (reply.id)}
							{@const replyVote = myVotes[reply.id]}
							{@const replyVoting = votingIds.has(reply.id)}
							{@const replyDeleting = deletingIds.has(reply.id)}
							{@const replyIsOwner = isLoggedIn && currentUserId === reply.user_id}
							<div class={cn('rounded-md bg-(--color-surface-2)/30 px-3 py-2.5 flex flex-col gap-1.5', replyDeleting && 'opacity-50')}>
								<!-- Reply header -->
								<div class="flex items-center gap-2 flex-wrap">
									{#if avatarUrls[reply.user_id]}
										<img src={avatarUrls[reply.user_id]} alt={reply.username} class="w-5 h-5 rounded-full object-cover flex-shrink-0" />
									{:else}
										<div class="w-5 h-5 rounded-full bg-(--color-surface-3) flex items-center justify-center flex-shrink-0">
											<span class="text-[8px] font-semibold text-(--color-text) leading-none">{initials(reply.username)}</span>
										</div>
									{/if}
									{#if reply.username}
										<a href="/users/{reply.username}" class="text-xs font-medium text-(--color-text) hover:text-(--color-brand) transition-colors">{reply.username}</a>
									{:else}
										<span class="text-xs font-medium text-(--color-muted)">{m.comments_anonymous()}</span>
									{/if}
									<span class="text-(--color-muted) opacity-60 text-xs">&middot;</span>
									<span class="text-xs text-(--color-muted)">{formatDate(reply.created)}</span>
								</div>
								<!-- Reply body -->
								<p class="text-sm text-(--color-text) leading-relaxed whitespace-pre-wrap break-words">{reply.body}</p>
								<!-- Reply actions -->
								<div class="flex items-center gap-3 pt-0.5">
									<Button
										variant="ghost"
										size="sm"
										class={cn('h-auto px-1 py-0 gap-1 text-xs', replyVote === 'up' ? 'text-(--color-brand)' : 'text-(--color-muted) hover:text-(--color-text)')}
										disabled={replyVoting}
										onclick={() => vote(reply.id, 'up', comment.id)}
										title={m.comments_vote_up()}
									>
										<svg class="w-3 h-3" fill="none" stroke="currentColor" viewBox="0 0 24 24" stroke-width="2">
											<path stroke-linecap="round" stroke-linejoin="round" d="M14 10h4.764a2 2 0 011.789 2.894l-3.5 7A2 2 0 0115.263 21h-4.017c-.163 0-.326-.02-.485-.06L7 20m7-10V5a2 2 0 00-2-2h-.095c-.5 0-.905.405-.905.905 0 .714-.211 1.412-.608 2.006L7 11v9m7-10h-2M7 20H5a2 2 0 01-2-2v-6a2 2 0 012-2h2.5"/>
										</svg>
										<span class="tabular-nums">{reply.upvotes ?? 0}</span>
									</Button>
									<Button
										variant="ghost"
										size="sm"
										class={cn('h-auto px-1 py-0 gap-1 text-xs', replyVote === 'down' ? 'text-(--color-danger)' : 'text-(--color-muted) hover:text-(--color-text)')}
										disabled={replyVoting}
										onclick={() => vote(reply.id, 'down', comment.id)}
										title={m.comments_vote_down()}
									>
										<svg class="w-3 h-3" fill="none" stroke="currentColor" viewBox="0 0 24 24" stroke-width="2">
											<path stroke-linecap="round" stroke-linejoin="round" d="M10 14H5.236a2 2 0 01-1.789-2.894l3.5-7A2 2 0 018.736 3h4.018a2 2 0 01.485.06l3.76.94m-7 10v5a2 2 0 002 2h.096c.5 0 .905-.405.905-.904 0-.715.211-1.413.608-2.008L17 13V4m-7 10h2m5-10h2a2 2 0 012 2v6a2 2 0 01-2 2h-2.5"/>
										</svg>
										<span class="tabular-nums">{reply.downvotes ?? 0}</span>
									</Button>
									{#if replyIsOwner}
										<Button
											variant="ghost"
											size="sm"
											class="h-auto px-1 py-0 gap-1 text-xs text-(--color-muted) hover:text-(--color-danger) ml-auto"
											disabled={replyDeleting}
											onclick={() => deleteComment(reply.id, comment.id)}
											title="Delete reply"
										>
											<svg class="w-3 h-3" fill="none" stroke="currentColor" viewBox="0 0 24 24" stroke-width="2">
												<path stroke-linecap="round" stroke-linejoin="round" d="M19 7l-.867 12.142A2 2 0 0116.138 21H7.862a2 2 0 01-1.995-1.858L5 7m5 4v6m4-6v6m1-10V4a1 1 0 00-1-1h-4a1 1 0 00-1 1v3M4 7h16"/>
											</svg>
											{m.comments_delete()}
										</Button>
									{/if}
								</div>
							</div>
						{/each}
					</div>
				{/if}
			</div>
		{/each}
	</div>
{/if}
{/if}
</div>


@@ -0,0 +1,193 @@
<script lang="ts">
import { browser } from '$app/environment';
import { goto } from '$app/navigation';
import { cn } from '$lib/utils';
interface Book {
slug: string;
title: string;
cover?: string;
author?: string;
genres?: string[] | string;
}
interface ReadingEntry {
book: Book;
chapter: number;
}
interface Props {
/** The slug of the book currently being read — highlighted in the list. */
currentSlug: string;
onclose: () => void;
}
let { currentSlug, onclose }: Props = $props();
let entries = $state<ReadingEntry[]>([]);
let loading = $state(true);
let error = $state('');
// ── Fetch in-progress books on mount ──────────────────────────────────────
$effect(() => {
if (!browser) return;
(async () => {
try {
const res = await fetch('/api/home');
if (!res.ok) throw new Error(`HTTP ${res.status}`);
const data = await res.json() as { continue_reading: ReadingEntry[] };
entries = data.continue_reading ?? [];
} catch {
error = 'Failed to load your reading list.';
} finally {
loading = false;
}
})();
});
// ── Body scroll lock ──────────────────────────────────────────────────────
$effect(() => {
if (!browser) return;
const prev = document.body.style.overflow;
document.body.style.overflow = 'hidden';
return () => { document.body.style.overflow = prev; };
});
// ── Keyboard: Escape closes ───────────────────────────────────────────────
function onKeydown(e: KeyboardEvent) {
if (e.key === 'Escape') onclose();
}
// ── Navigate to a book's current chapter ─────────────────────────────────
function openBook(entry: ReadingEntry) {
onclose();
goto(`/books/${entry.book.slug}/chapters/${entry.chapter}`);
}
function parseGenres(genres: string[] | string | undefined): string[] {
if (!genres) return [];
if (Array.isArray(genres)) return genres;
try { const p = JSON.parse(genres); return Array.isArray(p) ? p : []; } catch { return []; }
}
</script>
<svelte:window onkeydown={onKeydown} />
<!-- Backdrop -->
<!-- svelte-ignore a11y_no_static_element_interactions -->
<div
class="fixed inset-0 z-[70] flex flex-col"
style="background: rgba(0,0,0,0.6); backdrop-filter: blur(4px);"
onpointerdown={(e) => { if (e.target === e.currentTarget) onclose(); }}
>
<!-- Panel — matches SearchModal style -->
<!-- svelte-ignore a11y_no_static_element_interactions -->
<div
class="w-full max-w-2xl mx-auto mt-0 sm:mt-16 flex flex-col bg-(--color-surface) sm:rounded-2xl border-b sm:border border-(--color-border) shadow-2xl overflow-hidden"
style="max-height: 100svh;"
onpointerdown={(e) => e.stopPropagation()}
>
<!-- Header row -->
<div class="flex items-center gap-3 px-4 py-3 border-b border-(--color-border) shrink-0">
<svg class="w-5 h-5 text-(--color-muted) shrink-0" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M12 6.253v13m0-13C10.832 5.477 9.246 5 7.5 5S4.168 5.477 3 6.253v13C4.168 18.477 5.754 18 7.5 18s3.332.477 4.5 1.253m0-13C13.168 5.477 14.754 5 16.5 5c1.747 0 3.332.477 4.5 1.253v13C19.832 18.477 18.247 18 16.5 18c-1.746 0-3.332.477-4.5 1.253"/>
</svg>
<span class="flex-1 text-base font-semibold text-(--color-text)">Currently Reading</span>
<button
type="button"
onclick={onclose}
class="shrink-0 px-3 py-1 rounded-lg text-sm text-(--color-muted) hover:text-(--color-text) hover:bg-(--color-surface-2) transition-colors"
aria-label="Close"
>
Cancel
</button>
</div>
<!-- Scrollable body -->
<div class="flex-1 overflow-y-auto overscroll-contain">
{#if loading}
<!-- Loading skeleton -->
<div class="px-4 pt-3 pb-4 space-y-1">
{#each [1, 2, 3] as _}
<div class="flex items-center gap-3 px-0 py-3 border-b border-(--color-border)/40">
<div class="shrink-0 w-10 h-14 rounded bg-(--color-surface-3) animate-pulse"></div>
<div class="flex-1 space-y-2">
<div class="h-3.5 bg-(--color-surface-3) rounded animate-pulse w-3/4"></div>
<div class="h-3 bg-(--color-surface-3) rounded animate-pulse w-1/2"></div>
</div>
</div>
{/each}
</div>
{:else if error}
<p class="px-5 py-8 text-sm text-center text-(--color-danger)">{error}</p>
{:else if entries.length === 0}
<div class="px-5 py-12 text-center">
<p class="text-sm font-semibold text-(--color-text) mb-1">No books in progress</p>
<p class="text-xs text-(--color-muted)">Books you start reading will appear here.</p>
</div>
{:else}
{#each entries as entry}
{@const genres = parseGenres(entry.book.genres)}
{@const isCurrent = entry.book.slug === currentSlug}
<button
type="button"
onclick={() => openBook(entry)}
class={cn(
'w-full flex items-center gap-3 px-4 py-3 text-left transition-colors border-b border-(--color-border)/40 last:border-0',
isCurrent ? 'bg-(--color-brand)/8' : 'hover:bg-(--color-surface-2)'
)}
>
<!-- Cover thumbnail -->
<div class="shrink-0 w-10 h-14 rounded overflow-hidden bg-(--color-surface-2) border border-(--color-border) relative">
{#if entry.book.cover}
<img src={entry.book.cover} alt="" class="w-full h-full object-cover" loading="lazy" />
{:else}
<div class="w-full h-full flex items-center justify-center">
<svg class="w-5 h-5 text-(--color-muted)/40" fill="currentColor" viewBox="0 0 24 24">
<path d="M18 2H6c-1.1 0-2 .9-2 2v16c0 1.1.9 2 2 2h12c1.1 0 2-.9 2-2V4c0-1.1-.9-2-2-2z"/>
</svg>
</div>
{/if}
</div>
<!-- Info -->
<div class="flex-1 min-w-0">
<div class="flex items-start gap-2">
<p class={cn(
'text-sm font-semibold leading-snug line-clamp-1 flex-1',
isCurrent ? 'text-(--color-brand)' : 'text-(--color-text)'
)}>
{entry.book.title}
</p>
{#if isCurrent}
<span class="shrink-0 text-[10px] font-semibold px-1.5 py-0.5 rounded bg-(--color-brand)/15 text-(--color-brand) border border-(--color-brand)/30 leading-none mt-0.5">
Now
</span>
{/if}
</div>
{#if entry.book.author}
<p class="text-xs text-(--color-muted) mt-0.5 truncate">{entry.book.author}</p>
{/if}
<div class="flex items-center gap-1.5 mt-1 flex-wrap">
<span class="text-xs text-(--color-muted)/60">Ch. {entry.chapter}</span>
{#each genres.slice(0, 2) as g}
<span class="text-[10px] px-1.5 py-0.5 rounded-full bg-(--color-surface-3) text-(--color-muted)">{g}</span>
{/each}
</div>
</div>
<!-- Chevron (dimmed for current, normal for others) -->
<svg class={cn('w-4 h-4 shrink-0', isCurrent ? 'text-(--color-brand)/40' : 'text-(--color-muted)/40')} fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M9 5l7 7-7 7"/>
</svg>
</button>
{/each}
{/if}
</div>
</div>
</div>


@@ -0,0 +1,675 @@
<script lang="ts">
import { audioStore } from '$lib/audio.svelte';
import { cn } from '$lib/utils';
import { goto } from '$app/navigation';
import { fly } from 'svelte/transition';
import type { Voice } from '$lib/types';
import ChapterPickerOverlay from '$lib/components/ChapterPickerOverlay.svelte';
interface Props {
/** Called when the user closes the overlay. */
onclose: () => void;
/** When true, open the chapter picker immediately on mount. */
openChapters?: boolean;
}
let { onclose, openChapters = false }: Props = $props();
// Voices come from the store (populated by AudioPlayer on mount/play)
const voices = $derived(audioStore.voices);
const kokoroVoices = $derived(voices.filter((v) => v.engine === 'kokoro'));
const pocketVoices = $derived(voices.filter((v) => v.engine === 'pocket-tts'));
const cfaiVoices = $derived(voices.filter((v) => v.engine === 'cfai'));
let showVoiceModal = $state(false);
// svelte-ignore state_referenced_locally
let showChapterModal = $state(openChapters && audioStore.chapters.length > 0);
let voiceSearch = $state('');
let samplePlayingVoice = $state<string | null>(null);
let sampleAudio: HTMLAudioElement | null = null;
// ── Pull-down-to-dismiss gesture ─────────────────────────────────────────
let dragY = $state(0);
let isDragging = $state(false);
let dragStartY = 0;
let dragStartTime = 0;
let overlayEl = $state<HTMLDivElement | null>(null);
// Register ontouchmove with passive:false so e.preventDefault() works.
// Svelte 5 does not support the |nonpassive modifier, so we use $effect.
$effect(() => {
const el = overlayEl;
if (!el) return;
el.addEventListener('touchmove', onTouchMove, { passive: false });
// Capture `el` locally so cleanup removes the listener from the same
// element even if `overlayEl` has since been reassigned or nulled.
return () => el.removeEventListener('touchmove', onTouchMove);
});
function onTouchStart(e: TouchEvent) {
// Don't hijack touches that start inside a scrollable element
const target = e.target as Element;
if (target.closest('.overflow-y-auto')) return;
// Don't activate if a modal is open (they handle their own scroll)
if (showVoiceModal || showChapterModal) return;
isDragging = true;
dragStartY = e.touches[0].clientY;
dragStartTime = Date.now();
dragY = 0;
}
function onTouchMove(e: TouchEvent) {
if (!isDragging) return;
const delta = e.touches[0].clientY - dragStartY;
// Only track downward movement
if (delta > 0) {
dragY = delta;
// Prevent page scroll while dragging the overlay down
e.preventDefault();
} else {
dragY = 0;
}
}
function onTouchEnd() {
if (!isDragging) return;
isDragging = false;
const elapsed = Date.now() - dragStartTime;
const velocity = dragY / Math.max(elapsed, 1); // px/ms
// Dismiss if dragged far enough (>130px) or flicked fast enough (>0.4px/ms)
if (dragY > 130 || velocity > 0.4) {
// Animate out: snap to bottom then close
dragY = window.innerHeight;
setTimeout(onclose, 220);
} else {
// Spring back to 0
dragY = 0;
}
}
// ── Voice search filtering ────────────────────────────────────────────────
const voiceSearchLower = $derived(voiceSearch.toLowerCase());
const filteredKokoro = $derived(kokoroVoices.filter((v) => voiceLabel(v).toLowerCase().includes(voiceSearchLower)));
const filteredPocket = $derived(pocketVoices.filter((v) => voiceLabel(v).toLowerCase().includes(voiceSearchLower)));
const filteredCfai = $derived(cfaiVoices.filter((v) => voiceLabel(v).toLowerCase().includes(voiceSearchLower)));
// ── Chapter search ────────────────────────────────────────────────────────
// (search state is managed internally by ChapterPickerOverlay)
// ── Chapter click-to-play ─────────────────────────────────────────────────
function playChapter(chapterNumber: number) {
audioStore.autoStartChapter = chapterNumber;
goto(`/books/${audioStore.slug}/chapters/${chapterNumber}`);
}
function voiceLabel(v: Voice | string): string {
if (typeof v === 'string') {
const found = voices.find((x) => x.id === v);
if (found) return voiceLabel(found);
const id = v as string;
return id.replace(/_/g, ' ').replace(/\b\w/g, (c) => c.toUpperCase());
}
const base = v.id.replace(/_/g, ' ').replace(/\b\w/g, (c) => c.toUpperCase());
return v.lang !== 'en-us' ? `${base} (${v.lang})` : base;
}
function stopSample() {
if (sampleAudio) {
sampleAudio.pause();
sampleAudio.src = '';
sampleAudio = null;
}
samplePlayingVoice = null;
}
async function playSample(voiceId: string) {
if (samplePlayingVoice === voiceId) { stopSample(); return; }
stopSample();
samplePlayingVoice = voiceId;
try {
const res = await fetch(`/api/presign/voice-sample?voice=${encodeURIComponent(voiceId)}`);
if (!res.ok) { samplePlayingVoice = null; return; }
const { url } = (await res.json()) as { url: string };
sampleAudio = new Audio(url);
sampleAudio.onended = () => stopSample();
sampleAudio.onerror = () => stopSample();
sampleAudio.play().catch(() => stopSample());
} catch {
samplePlayingVoice = null;
}
}
function selectVoice(voiceId: string) {
stopSample();
audioStore.voice = voiceId;
showVoiceModal = false;
voiceSearch = '';
}
// ── Speed ────────────────────────────────────────────────────────────────
const SPEED_OPTIONS = [0.75, 1, 1.25, 1.5, 2] as const;
// ── Sleep timer ──────────────────────────────────────────────────────────
const SLEEP_OPTIONS = [15, 30, 45, 60]; // minutes
let sleepRemainingSec = $derived.by(() => {
void audioStore.currentTime; // re-run every second while playing
if (!audioStore.sleepUntil) return 0;
return Math.max(0, Math.floor((audioStore.sleepUntil - Date.now()) / 1000));
});
function cycleSleepTimer() {
if (!audioStore.sleepUntil && !audioStore.sleepAfterChapter) {
audioStore.sleepAfterChapter = true;
} else if (audioStore.sleepAfterChapter) {
audioStore.sleepAfterChapter = false;
audioStore.sleepUntil = Date.now() + SLEEP_OPTIONS[0] * 60 * 1000;
} else {
const remaining = audioStore.sleepUntil - Date.now();
const currentMin = Math.round(remaining / 60000);
const idx = SLEEP_OPTIONS.findIndex((m) => m >= currentMin);
if (idx === -1 || idx === SLEEP_OPTIONS.length - 1) {
audioStore.sleepUntil = 0;
} else {
audioStore.sleepUntil = Date.now() + SLEEP_OPTIONS[idx + 1] * 60 * 1000;
}
}
}
function formatSleepRemaining(secs: number): string {
if (secs <= 0) return 'Off';
const m = Math.floor(secs / 60);
const s = secs % 60;
return m > 0 ? `${m}m${s > 0 ? ` ${s}s` : ''}` : `${s}s`;
}
const sleepLabel = $derived(
audioStore.sleepAfterChapter
? 'End Ch.'
: audioStore.sleepUntil > Date.now()
? formatSleepRemaining(sleepRemainingSec)
: 'Sleep'
);
// ── Format time ──────────────────────────────────────────────────────────
function formatTime(s: number): string {
if (!isFinite(s) || s < 0) return '0:00';
const m = Math.floor(s / 60);
const sec = Math.floor(s % 60);
return `${m}:${sec.toString().padStart(2, '0')}`;
}
// ── Playback controls ────────────────────────────────────────────────────
function seek(e: Event) {
audioStore.seekRequest = Number((e.currentTarget as HTMLInputElement).value);
}
function skipBack() {
audioStore.seekRequest = Math.max(0, audioStore.currentTime - 15);
}
function skipForward() {
audioStore.seekRequest = Math.min(audioStore.duration || 0, audioStore.currentTime + 30);
}
function togglePlay() {
audioStore.toggleRequest++;
}
// Close on Escape
$effect(() => {
function onKey(e: KeyboardEvent) {
if (e.key === 'Escape') {
if (showChapterModal) { showChapterModal = false; }
else if (showVoiceModal) { showVoiceModal = false; voiceSearch = ''; }
else { onclose(); }
}
}
window.addEventListener('keydown', onKey);
return () => window.removeEventListener('keydown', onKey);
});
</script>
<!-- Full-screen listening mode overlay -->
<!-- svelte-ignore a11y_no_static_element_interactions -->
<div
transition:fly={{ y: '100%', duration: 320, opacity: 1 }}
bind:this={overlayEl}
class="fixed inset-0 z-60 flex flex-col overflow-hidden"
style="
background: var(--color-surface);
transform: translateY({dragY}px);
opacity: {Math.max(0, 1 - dragY / 500)};
transition: {isDragging ? 'none' : 'transform 0.32s cubic-bezier(0.32,0.72,0,1), opacity 0.32s ease'};
will-change: transform;
touch-action: pan-x;
pointer-events: auto;
"
ontouchstart={onTouchStart}
ontouchend={onTouchEnd}
>
<!-- ── Blurred background (full-screen atmospheric layer) ───────────── -->
{#if audioStore.cover}
<img
src={audioStore.cover}
alt=""
aria-hidden="true"
class="absolute inset-0 w-full h-full object-cover pointer-events-none select-none"
style="filter: blur(40px) brightness(0.25) saturate(1.4); transform: scale(1.15); z-index: 0;"
/>
{:else}
<div class="absolute inset-0 pointer-events-none" style="background: var(--color-surface-2); z-index: 0;"></div>
{/if}
<!-- Subtle vignette overlay for depth -->
<div
class="absolute inset-0 pointer-events-none"
style="background: radial-gradient(ellipse at center, transparent 40%, rgba(0,0,0,0.55) 100%); z-index: 1;"
aria-hidden="true"
></div>
<!-- ── Header bar ─────────────────────────────────────────────────────── -->
<div class="relative flex items-center justify-between px-4 pt-3 pb-2 shrink-0" style="z-index: 2;">
<button
type="button"
onclick={onclose}
class="p-2 rounded-full text-(--color-text)/70 hover:text-(--color-text) hover:bg-white/10 transition-colors"
aria-label="Close listening mode"
>
<svg class="w-5 h-5" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M19 9l-7 7-7-7"/>
</svg>
</button>
<span class="text-xs font-semibold text-(--color-text)/60 uppercase tracking-wider">Now Playing</span>
<div class="flex items-center gap-2">
<!-- Chapters button -->
{#if audioStore.chapters.length > 0}
<button
type="button"
onclick={() => { showChapterModal = !showChapterModal; showVoiceModal = false; voiceSearch = ''; }}
class={cn(
'flex items-center gap-1.5 px-3 py-1.5 rounded-full text-xs font-medium border transition-colors',
showChapterModal
? 'border-(--color-brand) bg-(--color-brand)/15 text-(--color-brand)'
: 'border-white/20 bg-black/25 text-(--color-text)/70 hover:text-(--color-text) backdrop-blur-sm'
)}
aria-label="Browse chapters"
>
<svg class="w-3.5 h-3.5" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M4 6h16M4 10h16M4 14h10"/>
</svg>
Chapters
</button>
{/if}
<!-- Voice selector button -->
<button
type="button"
onclick={() => { showVoiceModal = !showVoiceModal; showChapterModal = false; }}
class={cn(
'flex items-center gap-1.5 px-3 py-1.5 rounded-full text-xs font-medium border transition-colors',
showVoiceModal
? 'border-(--color-brand) bg-(--color-brand)/15 text-(--color-brand)'
: 'border-white/20 bg-black/25 text-(--color-text)/70 hover:text-(--color-text) backdrop-blur-sm'
)}
>
<svg class="w-3.5 h-3.5" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M19 11a7 7 0 01-7 7m0 0a7 7 0 01-7-7m7 7v4m0 0H8m4 0h4m-4-8a3 3 0 01-3-3V5a3 3 0 116 0v6a3 3 0 01-3 3z"/>
</svg>
<span class="max-w-[80px] truncate">{voiceLabel(audioStore.voice)}</span>
</button>
</div>
</div>
<!-- ── Portrait cover card + track info ───────────────────────────────── -->
<div class="relative flex flex-col items-center gap-4 px-8 pt-2 pb-4 shrink-0" style="z-index: 2;">
<!-- Cover card -->
<div
class="rounded-2xl overflow-hidden shadow-2xl"
style="height: 38svh; min-height: 180px; max-height: 320px; aspect-ratio: 2/3;"
>
{#if audioStore.cover}
<img
src={audioStore.cover}
alt=""
class="w-full h-full object-cover"
/>
{:else}
<div class="w-full h-full bg-(--color-surface-2) flex items-center justify-center">
<svg class="w-16 h-16 text-(--color-muted)/30" fill="currentColor" viewBox="0 0 24 24">
<path d="M18 2H6c-1.1 0-2 .9-2 2v16c0 1.1.9 2 2 2h12c1.1 0 2-.9 2-2V4c0-1.1-.9-2-2-2zm-2 14H8v-2h8v2zm0-4H8v-2h8v2zm0-4H8V6h8v2z"/>
</svg>
</div>
{/if}
</div>
<!-- Track info -->
<div class="text-center w-full">
{#if audioStore.chapter > 0}
<p class="text-[10px] font-bold uppercase tracking-widest text-(--color-brand) mb-0.5">
Chapter {audioStore.chapter}
</p>
{/if}
<p class="text-lg font-bold text-(--color-text) leading-snug line-clamp-2">
{audioStore.chapterTitle || (audioStore.chapter > 0 ? `Chapter ${audioStore.chapter}` : '')}
</p>
<p class="text-sm text-(--color-text)/50 mt-0.5 truncate">{audioStore.bookTitle}</p>
</div>
</div>
<!-- Voice modal (full-screen overlay) -->
{#if showVoiceModal && voices.length > 0}
<!-- svelte-ignore a11y_no_static_element_interactions -->
<div
class="fixed inset-0 z-[80] flex flex-col"
style="background: var(--color-surface);"
>
<!-- Modal header -->
<div class="flex items-center gap-3 px-4 py-3 border-b border-(--color-border) shrink-0" style="padding-top: max(0.75rem, env(safe-area-inset-top));">
<button
type="button"
onclick={() => { stopSample(); showVoiceModal = false; voiceSearch = ''; }}
class="p-2 rounded-full text-(--color-muted) hover:text-(--color-text) hover:bg-(--color-surface-2) transition-colors"
aria-label="Close voice picker"
>
<svg class="w-5 h-5" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M19 9l-7 7-7-7"/>
</svg>
</button>
<span class="text-xs font-semibold text-(--color-muted) uppercase tracking-wider flex-1">Select Voice</span>
</div>
<!-- Search input -->
<div class="px-4 py-3 shrink-0 border-b border-(--color-border)">
<div class="relative">
<svg class="absolute left-3 top-1/2 -translate-y-1/2 w-4 h-4 text-(--color-muted)" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M21 21l-6-6m2-5a7 7 0 11-14 0 7 7 0 0114 0z"/>
</svg>
<input
type="search"
placeholder="Search voices…"
bind:value={voiceSearch}
class="w-full pl-9 pr-4 py-2 text-sm bg-(--color-surface-2) border border-(--color-border) rounded-lg text-(--color-text) placeholder:text-(--color-muted) focus:outline-none focus:border-(--color-brand) transition-colors"
/>
</div>
</div>
<!-- Voice list -->
<div class="flex-1 overflow-y-auto overscroll-contain" style="padding-bottom: env(safe-area-inset-bottom);">
{#each ([['Kokoro', filteredKokoro], ['Pocket TTS', filteredPocket], ['CF AI', filteredCfai]] as [string, Voice[]][]) as [label, group]}
{#if group.length > 0}
<p class="text-[10px] font-semibold text-(--color-muted) uppercase tracking-wider px-4 py-2 sticky top-0 bg-(--color-surface) border-b border-(--color-border)/50">{label}</p>
{#each group as v (v.id)}
<div
class={cn(
'flex items-center gap-3 px-4 py-3 border-b border-(--color-border)/40 transition-colors',
audioStore.voice === v.id ? 'bg-(--color-brand)/8' : 'hover:bg-(--color-surface-2)'
)}
>
<button type="button" onclick={() => selectVoice(v.id)} class="flex-1 flex items-center gap-3 text-left">
<span class={cn(
'w-4 h-4 shrink-0 rounded-full border-2 flex items-center justify-center transition-colors',
audioStore.voice === v.id ? 'border-(--color-brand) bg-(--color-brand)' : 'border-(--color-border)'
)}>
{#if audioStore.voice === v.id}
<svg class="w-2 h-2 text-(--color-surface)" fill="currentColor" viewBox="0 0 24 24"><path d="M9 16.17L4.83 12l-1.42 1.41L9 19 21 7l-1.41-1.41z"/></svg>
{/if}
</span>
<span class={cn('text-sm', audioStore.voice === v.id ? 'font-semibold text-(--color-brand)' : 'text-(--color-text)')}>{voiceLabel(v)}</span>
</button>
<button
type="button"
onclick={() => playSample(v.id)}
class={cn('shrink-0 p-2 rounded-full transition-colors', samplePlayingVoice === v.id ? 'text-(--color-brand) bg-(--color-brand)/10' : 'text-(--color-muted) hover:text-(--color-text) hover:bg-(--color-surface-2)')}
title={samplePlayingVoice === v.id ? 'Stop sample' : 'Play sample'}
aria-label={samplePlayingVoice === v.id ? 'Stop sample' : 'Play sample'}
>
{#if samplePlayingVoice === v.id}
<svg class="w-4 h-4" fill="currentColor" viewBox="0 0 24 24"><path d="M6 4h4v16H6V4zm8 0h4v16h-4V4z"/></svg>
{:else}
<svg class="w-4 h-4" fill="currentColor" viewBox="0 0 24 24"><path d="M8 5v14l11-7z"/></svg>
{/if}
</button>
</div>
{/each}
{/if}
{/each}
{#if filteredKokoro.length === 0 && filteredPocket.length === 0 && filteredCfai.length === 0}
<p class="px-4 py-8 text-sm text-(--color-muted) text-center">No voices match "{voiceSearch}"</p>
{/if}
</div>
</div>
{/if}
<!-- ── Controls area (bottom half) ───────────────────────────────────── -->
<div class="flex-1 flex flex-col justify-end px-6 pb-6 gap-0 shrink-0 overflow-hidden" style="z-index: 2; position: relative;">
<!-- Seek bar -->
<div class="shrink-0 mb-1">
<input
type="range"
aria-label="Seek"
min="0"
max={audioStore.duration || 0}
value={audioStore.currentTime}
oninput={seek}
class="w-full h-1.5 cursor-pointer block"
style="accent-color: var(--color-brand);"
/>
<div class="flex justify-between text-xs text-(--color-muted) tabular-nums mt-1">
<span>{formatTime(audioStore.currentTime)}</span>
<!-- Remaining time in centre -->
{#if audioStore.duration > 0}
<span class="text-(--color-muted)/60">{formatTime(Math.max(0, audioStore.duration - audioStore.currentTime))}</span>
{/if}
<span>{formatTime(audioStore.duration)}</span>
</div>
</div>
<!-- Transport controls -->
<div class="flex items-center justify-between pt-3 pb-4 shrink-0">
<!-- Prev chapter — smaller, clearly secondary -->
{#if audioStore.chapter > 1 && audioStore.slug}
<button
type="button"
onclick={() => playChapter(audioStore.chapter - 1)}
class="p-2 rounded-full text-(--color-muted)/60 hover:text-(--color-text) hover:bg-(--color-surface-2) transition-colors"
title="Previous chapter"
aria-label="Previous chapter"
>
<svg class="w-5 h-5" fill="currentColor" viewBox="0 0 24 24">
<path d="M6 6h2v12H6zm2 6 8.5 6V6z"/>
</svg>
</button>
{:else}
<div class="w-9 h-9"></div>
{/if}
<!-- Skip back 15s — medium -->
<button
type="button"
onclick={skipBack}
class="p-2 rounded-full text-(--color-muted) hover:text-(--color-text) hover:bg-(--color-surface-2) transition-colors"
aria-label="Skip back 15 seconds"
title="Back 15s"
>
<svg class="w-7 h-7" fill="currentColor" viewBox="0 0 24 24">
<path d="M11.99 5V1l-5 5 5 5V7c3.31 0 6 2.69 6 6s-2.69 6-6 6-6-2.69-6-6h-2c0 4.42 3.58 8 8 8s8-3.58 8-8-3.58-8-8-8z"/>
<text x="8.5" y="14.5" font-size="5" font-family="sans-serif" font-weight="bold" fill="currentColor">15</text>
</svg>
</button>
<!-- Play / Pause — largest, centred -->
<button
type="button"
onclick={togglePlay}
class="w-18 h-18 rounded-full bg-(--color-brand) text-(--color-surface) flex items-center justify-center hover:bg-(--color-brand-dim) transition-colors shadow-xl"
style="width: 4.5rem; height: 4.5rem;"
aria-label={audioStore.isPlaying ? 'Pause' : 'Play'}
>
{#if audioStore.isPlaying}
<svg class="w-8 h-8" fill="currentColor" viewBox="0 0 24 24">
<path d="M6 4h4v16H6V4zm8 0h4v16h-4V4z"/>
</svg>
{:else}
<svg class="w-8 h-8 ml-1" fill="currentColor" viewBox="0 0 24 24">
<path d="M8 5v14l11-7z"/>
</svg>
{/if}
</button>
<!-- Skip forward 30s — medium -->
<button
type="button"
onclick={skipForward}
class="p-2 rounded-full text-(--color-muted) hover:text-(--color-text) hover:bg-(--color-surface-2) transition-colors"
aria-label="Skip forward 30 seconds"
title="Forward 30s"
>
<svg class="w-7 h-7" fill="currentColor" viewBox="0 0 24 24">
<path d="M12 5V1l5 5-5 5V7c-3.31 0-6 2.69-6 6s2.69 6 6 6 6-2.69 6-6h2c0 4.42-3.58 8-8 8s-8-3.58-8-8 3.58-8 8-8z"/>
<text x="8.5" y="14.5" font-size="5" font-family="sans-serif" font-weight="bold" fill="currentColor">30</text>
</svg>
</button>
<!-- Next chapter — smaller, clearly secondary -->
{#if audioStore.nextChapter !== null && audioStore.slug}
<button
type="button"
onclick={() => playChapter(audioStore.nextChapter!)}
class="p-2 rounded-full text-(--color-muted)/60 hover:text-(--color-text) hover:bg-(--color-surface-2) transition-colors"
title="Next chapter"
aria-label="Next chapter"
>
<svg class="w-5 h-5" fill="currentColor" viewBox="0 0 24 24">
<path d="M6 18l8.5-6L6 6v12zM16 6v12h2V6h-2z"/>
</svg>
</button>
{:else}
<div class="w-9 h-9"></div>
{/if}
</div>
<!-- Secondary controls: unified single row — Speed · Auto · Announce · Sleep -->
<div class="flex items-center justify-center gap-2 shrink-0 flex-wrap">
<!-- Speed — segmented pill -->
<div class="flex items-center gap-0.5 bg-(--color-surface-2) rounded-full px-1.5 py-1 border border-(--color-border)">
{#each SPEED_OPTIONS as s}
<button
type="button"
onclick={() => (audioStore.speed = s)}
class={cn(
'px-2 py-0.5 rounded-full text-xs font-semibold transition-colors',
audioStore.speed === s
? 'bg-(--color-brand) text-(--color-surface)'
: 'text-(--color-muted) hover:text-(--color-text)'
)}
aria-pressed={audioStore.speed === s}
>{s}×</button>
{/each}
</div>
<!-- Auto-next pill -->
<button
type="button"
onclick={() => (audioStore.autoNext = !audioStore.autoNext)}
class={cn(
'flex items-center gap-1.5 px-3 py-1.5 rounded-full text-xs font-semibold border transition-colors',
audioStore.autoNext
? 'border-(--color-brand) bg-(--color-brand)/15 text-(--color-brand)'
: 'border-(--color-border) bg-(--color-surface-2) text-(--color-muted) hover:text-(--color-text)'
)}
aria-pressed={audioStore.autoNext}
title={audioStore.autoNext ? 'Auto-next on' : 'Auto-next off'}
>
<svg class="w-3.5 h-3.5" fill="currentColor" viewBox="0 0 24 24">
<path d="M6 18l8.5-6L6 6v12zm8.5-6L23 6v12l-8.5-6z"/>
</svg>
Auto
{#if audioStore.autoNext && audioStore.nextStatus === 'prefetching'}
<span class="w-1.5 h-1.5 rounded-full bg-(--color-brand) animate-pulse"></span>
{:else if audioStore.autoNext && audioStore.nextStatus === 'prefetched'}
<span class="w-1.5 h-1.5 rounded-full bg-green-400"></span>
{/if}
</button>
<!-- Announce chapter pill (only meaningful when auto-next is on) -->
<button
type="button"
onclick={() => (audioStore.announceChapter = !audioStore.announceChapter)}
class={cn(
'flex items-center gap-1.5 px-3 py-1.5 rounded-full text-xs font-semibold border transition-colors',
audioStore.announceChapter
? 'border-(--color-brand) bg-(--color-brand)/15 text-(--color-brand)'
: 'border-(--color-border) bg-(--color-surface-2) text-(--color-muted) hover:text-(--color-text)'
)}
aria-pressed={audioStore.announceChapter}
title={audioStore.announceChapter ? 'Chapter announcing on' : 'Chapter announcing off'}
>
<svg class="w-3.5 h-3.5" fill="currentColor" viewBox="0 0 24 24">
<path d="M3 9v6h4l5 5V4L7 9H3zm13.5 3c0-1.77-1.02-3.29-2.5-4.03v8.05c1.48-.73 2.5-2.25 2.5-4.02z"/>
</svg>
Announce
</button>
<!-- Stream / Generate mode toggle -->
<!-- CF AI voices are batch-only and always use generate mode regardless of this setting -->
<button
type="button"
onclick={() => {
if (!audioStore.voice.startsWith('cfai:')) {
audioStore.audioMode = audioStore.audioMode === 'stream' ? 'generate' : 'stream';
}
}}
disabled={audioStore.voice.startsWith('cfai:')}
class={cn(
'flex items-center gap-1.5 px-3 py-1.5 rounded-full text-xs font-semibold border transition-colors',
audioStore.voice.startsWith('cfai:')
? 'border-(--color-border) bg-(--color-surface-2) text-(--color-border) cursor-not-allowed opacity-50'
: audioStore.audioMode === 'stream'
? 'border-(--color-brand) bg-(--color-brand)/15 text-(--color-brand)'
: 'border-(--color-border) bg-(--color-surface-2) text-(--color-muted) hover:text-(--color-text)'
)}
aria-pressed={audioStore.audioMode === 'stream'}
title={audioStore.voice.startsWith('cfai:') ? 'CF AI voices always use generate mode' : audioStore.audioMode === 'stream' ? 'Stream mode — audio starts instantly' : 'Generate mode — wait for full audio before playing'}
>
{#if audioStore.audioMode === 'stream' && !audioStore.voice.startsWith('cfai:')}
<svg class="w-3.5 h-3.5" fill="currentColor" viewBox="0 0 24 24">
<path d="M8 5v14l11-7z"/>
</svg>
{:else}
<svg class="w-3.5 h-3.5" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M4 16v1a3 3 0 003 3h10a3 3 0 003-3v-1m-4-4l-4 4m0 0l-4-4m4 4V4"/>
</svg>
{/if}
{audioStore.audioMode === 'stream' && !audioStore.voice.startsWith('cfai:') ? 'Stream' : 'Generate'}
</button>
<!-- Sleep timer pill -->
<button
type="button"
onclick={cycleSleepTimer}
class={cn(
'flex items-center gap-1.5 px-3 py-1.5 rounded-full text-xs font-semibold border transition-colors',
audioStore.sleepUntil || audioStore.sleepAfterChapter
? 'border-(--color-brand) bg-(--color-brand)/15 text-(--color-brand)'
: 'border-(--color-border) bg-(--color-surface-2) text-(--color-muted) hover:text-(--color-text)'
)}
title="Sleep timer"
>
<svg class="w-3.5 h-3.5" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M20.354 15.354A9 9 0 018.646 3.646 9.003 9.003 0 0012 21a9.003 9.003 0 008.354-5.646z"/>
</svg>
{sleepLabel}
</button>
</div>
</div>
</div>
<!-- Chapter picker rendered OUTSIDE the transformed overlay so that
fixed inset-0 anchors to the real viewport, not the CSS-transformed
containing block (transform: translateY breaks fixed positioning). -->
{#if showChapterModal && audioStore.chapters.length > 0}
<ChapterPickerOverlay
chapters={audioStore.chapters}
activeChapter={audioStore.chapter}
zIndex="z-[80]"
onselect={playChapter}
onclose={() => { showChapterModal = false; }}
/>
{/if}
@@ -0,0 +1,184 @@
<script lang="ts">
import { browser } from '$app/environment';
import { cn } from '$lib/utils';
interface Notification {
id: string;
title: string;
message: string;
link: string;
read: boolean;
}
interface Props {
notifications: Notification[];
userId: string;
isAdmin: boolean;
onclose: () => void;
onMarkRead: (id: string) => void;
onMarkAllRead: () => void;
onDismiss: (id: string) => void;
onClearAll: () => void;
}
let {
notifications,
userId,
isAdmin,
onclose,
onMarkRead,
onMarkAllRead,
onDismiss,
onClearAll,
}: Props = $props();
let filter = $state<'all' | 'unread'>('all');
const filtered = $derived(
filter === 'unread' ? notifications.filter(n => !n.read) : notifications
);
const unreadCount = $derived(notifications.filter(n => !n.read).length);
// Body scroll lock + Escape to close
$effect(() => {
if (browser) {
const prev = document.body.style.overflow;
document.body.style.overflow = 'hidden';
return () => { document.body.style.overflow = prev; };
}
});
function onKeydown(e: KeyboardEvent) {
if (e.key === 'Escape') onclose();
}
const viewAllHref = $derived(isAdmin ? '/admin/notifications' : '/notifications');
</script>
<svelte:window onkeydown={onKeydown} />
<!-- Backdrop -->
<!-- svelte-ignore a11y_no_static_element_interactions -->
<div
class="fixed inset-0 z-[70] flex flex-col"
style="background: rgba(0,0,0,0.6); backdrop-filter: blur(4px);"
onpointerdown={(e) => { if (e.target === e.currentTarget) onclose(); }}
>
<!-- Modal panel — slides down from top -->
<!-- svelte-ignore a11y_no_static_element_interactions -->
<div
class="w-full max-w-2xl mx-auto mt-0 sm:mt-16 flex flex-col bg-(--color-surface) sm:rounded-2xl border-b sm:border border-(--color-border) shadow-2xl overflow-hidden"
style="max-height: 100svh;"
onpointerdown={(e) => e.stopPropagation()}
>
<!-- Header row -->
<div class="flex items-center justify-between px-4 py-3 border-b border-(--color-border) shrink-0">
<div class="flex items-center gap-3">
<span class="text-base font-semibold text-(--color-text)">Notifications</span>
{#if unreadCount > 0}
<span class="text-xs font-semibold px-2 py-0.5 rounded-full bg-(--color-brand) text-black leading-none">
{unreadCount}
</span>
{/if}
</div>
<div class="flex items-center gap-1">
{#if unreadCount > 0}
<button
type="button"
onclick={onMarkAllRead}
class="text-xs text-(--color-muted) hover:text-(--color-text) transition-colors px-2 py-1 rounded hover:bg-(--color-surface-2)"
>Mark all read</button>
{/if}
{#if notifications.length > 0}
<button
type="button"
onclick={onClearAll}
class="text-xs text-(--color-muted) hover:text-red-400 transition-colors px-2 py-1 rounded hover:bg-(--color-surface-2)"
>Clear all</button>
{/if}
<button
type="button"
onclick={onclose}
class="shrink-0 px-3 py-1 rounded-lg text-sm text-(--color-muted) hover:text-(--color-text) hover:bg-(--color-surface-2) transition-colors"
aria-label="Close notifications"
>
Cancel
</button>
</div>
</div>
<!-- Filter tabs -->
<div class="flex gap-0 px-4 py-2 border-b border-(--color-border)/60 shrink-0">
<button
type="button"
onclick={() => filter = 'all'}
class={cn(
'text-xs px-3 py-1.5 rounded-l border border-(--color-border) transition-colors',
filter === 'all'
? 'bg-(--color-brand) text-black border-(--color-brand) font-semibold'
: 'text-(--color-muted) hover:text-(--color-text)'
)}
>All ({notifications.length})</button>
<button
type="button"
onclick={() => filter = 'unread'}
class={cn(
'text-xs px-3 py-1.5 rounded-r border border-l-0 border-(--color-border) transition-colors',
filter === 'unread'
? 'bg-(--color-brand) text-black border-(--color-brand) font-semibold'
: 'text-(--color-muted) hover:text-(--color-text)'
)}
>Unread ({unreadCount})</button>
</div>
<!-- Scrollable list -->
<div class="flex-1 overflow-y-auto overscroll-contain min-h-0">
{#if filtered.length === 0}
<div class="py-16 text-center text-(--color-muted) text-sm">
{filter === 'unread' ? 'No unread notifications' : 'No notifications yet'}
</div>
{:else}
{#each filtered as n (n.id)}
<div class={cn(
'flex items-start gap-1 border-b border-(--color-border)/40 last:border-0 hover:bg-(--color-surface-2) group transition-colors',
n.read && 'opacity-60'
)}>
<a
href={n.link || (isAdmin ? '/admin' : '/')}
onclick={() => { onMarkRead(n.id); onclose(); }}
class="flex-1 px-4 py-3.5 min-w-0"
>
<div class="flex items-center gap-1.5">
{#if !n.read}
<span class="w-1.5 h-1.5 rounded-full bg-(--color-brand) shrink-0"></span>
{/if}
<span class="text-sm font-semibold text-(--color-text) truncate">{n.title}</span>
</div>
<p class="text-sm text-(--color-muted) mt-0.5 line-clamp-2">{n.message}</p>
</a>
<button
type="button"
onclick={() => onDismiss(n.id)}
class="shrink-0 p-3 text-(--color-muted) hover:text-red-400 opacity-0 group-hover:opacity-100 transition-all"
title="Dismiss"
>
<svg class="w-4 h-4" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M6 18L18 6M6 6l12 12"/>
</svg>
</button>
</div>
{/each}
{/if}
</div>
<!-- Footer -->
<div class="px-4 py-3 border-t border-(--color-border)/40 shrink-0">
<a
href={viewAllHref}
onclick={onclose}
class="block text-center text-sm text-(--color-muted) hover:text-(--color-brand) transition-colors"
>View all notifications</a>
</div>
</div>
</div>
@@ -0,0 +1,424 @@
<script lang="ts">
import { browser } from '$app/environment';
import { goto } from '$app/navigation';
import { cn } from '$lib/utils';
interface Props {
onclose: () => void;
}
let { onclose }: Props = $props();
// ── Types ─────────────────────────────────────────────────────────────────
interface SearchResult {
slug: string;
title: string;
cover?: string;
author?: string;
genres?: string[];
status?: string;
chapters?: string; // e.g. "42 chapters"
url?: string; // novelfire source url — present for remote results
}
interface SearchResponse {
results: SearchResult[];
local_count: number;
remote_count: number;
}
// ── State ─────────────────────────────────────────────────────────────────
const RECENTS_KEY = 'search_recents_v1';
const MAX_RECENTS = 8;
function loadRecents(): string[] {
if (!browser) return [];
try {
const raw = localStorage.getItem(RECENTS_KEY);
if (raw) return JSON.parse(raw) as string[];
} catch { /* ignore */ }
return [];
}
function saveRecents(list: string[]) {
if (!browser) return;
try { localStorage.setItem(RECENTS_KEY, JSON.stringify(list)); } catch { /* ignore */ }
}
let recents = $state<string[]>(loadRecents());
let query = $state('');
let results = $state<SearchResult[]>([]);
let localCount = $state(0);
let remoteCount = $state(0);
let loading = $state(false);
let error = $state('');
// For keyboard navigation through results
let selectedIdx = $state(-1);
// Input element ref for autofocus
let inputEl = $state<HTMLInputElement | null>(null);
// ── Autofocus + body scroll lock ──────────────────────────────────────────
$effect(() => {
if (inputEl) inputEl.focus();
if (browser) {
const prev = document.body.style.overflow;
document.body.style.overflow = 'hidden';
return () => { document.body.style.overflow = prev; };
}
});
// ── Keyboard shortcuts (global): Escape closes ────────────────────────────
function onKeydown(e: KeyboardEvent) {
if (e.key === 'Escape') { onclose(); return; }
const total = visibleResults.length;
if (total === 0) return;
if (e.key === 'ArrowDown') {
e.preventDefault();
selectedIdx = (selectedIdx + 1) % total;
} else if (e.key === 'ArrowUp') {
e.preventDefault();
selectedIdx = (selectedIdx - 1 + total) % total;
} else if (e.key === 'Enter' && selectedIdx >= 0) {
e.preventDefault();
navigateTo(visibleResults[selectedIdx]);
}
}
// ── Debounced search ──────────────────────────────────────────────────────
let debounceTimer = 0;
$effect(() => {
const q = query.trim();
selectedIdx = -1;
if (q.length < 2) {
results = [];
localCount = 0;
remoteCount = 0;
loading = false;
error = '';
clearTimeout(debounceTimer);
return;
}
loading = true;
error = '';
clearTimeout(debounceTimer);
debounceTimer = setTimeout(async () => {
try {
const res = await fetch(`/api/search?q=${encodeURIComponent(q)}`);
if (!res.ok) throw new Error(`HTTP ${res.status}`);
const data: SearchResponse = await res.json();
results = data.results ?? [];
localCount = data.local_count ?? 0;
remoteCount = data.remote_count ?? 0;
} catch {
error = 'Search failed. Please try again.';
results = [];
} finally {
loading = false;
}
}, 300) as unknown as number;
// Cancel any in-flight debounce when the query changes again or the component unmounts,
// so a stale response can't overwrite fresh results.
return () => clearTimeout(debounceTimer);
});
// Results visible in the list — same as results (no client-side filter needed)
const visibleResults = $derived(results);
// ── Genre suggestions shown when query is empty ───────────────────────────
const GENRE_SUGGESTIONS = [
'Fantasy', 'Action', 'Romance', 'Cultivation', 'System',
'Reincarnation', 'Sci-Fi', 'Horror', 'Slice of Life', 'Adventure',
];
// ── Navigation helpers ────────────────────────────────────────────────────
function navigateTo(r: SearchResult) {
pushRecent(query.trim());
goto(`/books/${r.slug}`);
onclose();
}
function searchGenre(genre: string) {
goto(`/catalogue?genre=${encodeURIComponent(genre)}`);
onclose();
}
function submitQuery() {
const q = query.trim();
if (!q) return;
pushRecent(q);
goto(`/catalogue?q=${encodeURIComponent(q)}`);
onclose();
}
// ── Recent searches ───────────────────────────────────────────────────────
function pushRecent(q: string) {
if (!q || q.length < 2) return;
const next = [q, ...recents.filter(r => r.toLowerCase() !== q.toLowerCase())].slice(0, MAX_RECENTS);
recents = next;
saveRecents(next);
}
function removeRecent(q: string) {
const next = recents.filter(r => r !== q);
recents = next;
saveRecents(next);
}
function clearAllRecents() {
recents = [];
saveRecents([]);
}
function applyRecent(q: string) {
query = q;
if (inputEl) inputEl.focus();
}
// ── Helpers ───────────────────────────────────────────────────────────────
function parseGenres(genres: string[] | string | undefined): string[] {
if (!genres) return [];
if (Array.isArray(genres)) return genres;
try { const p = JSON.parse(genres); return Array.isArray(p) ? p : []; } catch { return []; }
}
const isRemote = (r: SearchResult) => r.url != null && r.url.includes('novelfire');
</script>
<!-- svelte:window for global keyboard handling -->
<svelte:window onkeydown={onKeydown} />
<!-- Backdrop -->
<!-- svelte-ignore a11y_no_static_element_interactions -->
<div
class="fixed inset-0 z-[70] flex flex-col"
style="background: rgba(0,0,0,0.6); backdrop-filter: blur(4px);"
onpointerdown={(e) => { if (e.target === e.currentTarget) onclose(); }}
>
<!-- Modal panel — slides down from top -->
<!-- svelte-ignore a11y_no_static_element_interactions -->
<div
class="w-full max-w-2xl mx-auto mt-0 sm:mt-16 flex flex-col bg-(--color-surface) sm:rounded-2xl border-b sm:border border-(--color-border) shadow-2xl overflow-hidden max-h-[100svh] sm:max-h-[calc(100svh-8rem)]"
onpointerdown={(e) => e.stopPropagation()}
>
<!-- ── Search input row ──────────────────────────────────────────────── -->
<div class="flex items-center gap-3 px-4 py-3 border-b border-(--color-border) shrink-0">
<!-- Search icon -->
{#if loading}
<svg class="w-5 h-5 text-(--color-brand) animate-spin shrink-0" fill="none" viewBox="0 0 24 24">
<circle class="opacity-25" cx="12" cy="12" r="10" stroke="currentColor" stroke-width="4"/>
<path class="opacity-75" fill="currentColor" d="M4 12a8 8 0 018-8V0C5.373 0 0 5.373 0 12h4z"/>
</svg>
{:else}
<svg class="w-5 h-5 text-(--color-muted) shrink-0" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M21 21l-6-6m2-5a7 7 0 11-14 0 7 7 0 0114 0z"/>
</svg>
{/if}
<input
bind:this={inputEl}
bind:value={query}
type="search"
placeholder="Search books, authors, genres…"
class="flex-1 bg-transparent text-(--color-text) placeholder:text-(--color-muted) text-base focus:outline-none min-w-0 [&::-webkit-search-cancel-button]:hidden [&::-webkit-search-decoration]:hidden"
onkeydown={(e) => { if (e.key === 'Enter') { e.preventDefault(); submitQuery(); } }}
autocomplete="off"
autocorrect="off"
spellcheck={false}
/>
<button
type="button"
onclick={onclose}
class="shrink-0 px-3 py-1 rounded-lg text-sm text-(--color-muted) hover:text-(--color-text) hover:bg-(--color-surface-2) transition-colors"
aria-label="Close search"
>
Cancel
</button>
</div>
<!-- ── Scrollable body ───────────────────────────────────────────────── -->
<div class="flex-1 overflow-y-auto overscroll-contain">
<!-- ── Error state ─────────────────────────────────────────────── -->
{#if error}
<p class="px-5 py-8 text-sm text-center text-(--color-danger)">{error}</p>
<!-- ── Results ─────────────────────────────────────────────────── -->
{:else if visibleResults.length > 0}
<!-- Result count + "see all" hint -->
<div class="flex items-center justify-between px-4 pt-3 pb-1.5">
<p class="text-xs text-(--color-muted)">
{#if localCount > 0 && remoteCount > 0}
<span class="text-(--color-text) font-medium">{localCount}</span> in library
· <span class="text-(--color-text) font-medium">{remoteCount}</span> from Novelfire
{:else if localCount > 0}
<span class="text-(--color-text) font-medium">{localCount}</span> in library
{:else}
<span class="text-(--color-text) font-medium">{remoteCount}</span> from Novelfire
{/if}
</p>
<!-- "All results in catalogue" shortcut -->
<button
type="button"
onclick={submitQuery}
class="text-xs text-(--color-brand) hover:text-(--color-brand-dim) transition-colors"
>
See all in catalogue →
</button>
</div>
{#each visibleResults as r, i}
{@const genres = parseGenres(r.genres)}
{@const remote = isRemote(r)}
<button
type="button"
onclick={() => navigateTo(r)}
class={cn(
'w-full flex items-center gap-3 px-4 py-3 text-left transition-colors border-b border-(--color-border)/40 last:border-0',
selectedIdx === i ? 'bg-(--color-surface-2)' : 'hover:bg-(--color-surface-2)'
)}
>
<!-- Cover thumbnail -->
<div class="shrink-0 w-10 h-14 rounded overflow-hidden bg-(--color-surface-2) border border-(--color-border)">
{#if r.cover}
<img src={r.cover} alt="" class="w-full h-full object-cover" loading="lazy" />
{:else}
<div class="w-full h-full flex items-center justify-center">
<svg class="w-5 h-5 text-(--color-muted)/40" fill="currentColor" viewBox="0 0 24 24">
<path d="M18 2H6c-1.1 0-2 .9-2 2v16c0 1.1.9 2 2 2h12c1.1 0 2-.9 2-2V4c0-1.1-.9-2-2-2z"/>
</svg>
</div>
{/if}
</div>
<!-- Info -->
<div class="flex-1 min-w-0">
<div class="flex items-start gap-2">
<p class="text-sm font-semibold text-(--color-text) leading-snug line-clamp-1 flex-1">
{r.title}
</p>
{#if remote}
<span class="shrink-0 text-[10px] font-semibold px-1.5 py-0.5 rounded bg-(--color-surface-3) text-(--color-muted) border border-(--color-border) leading-none mt-0.5">
Novelfire
</span>
{/if}
</div>
{#if r.author}
<p class="text-xs text-(--color-muted) mt-0.5 truncate">{r.author}</p>
{/if}
<div class="flex items-center gap-1.5 mt-1 flex-wrap">
{#if r.chapters}
<span class="text-xs text-(--color-muted)/60">{r.chapters}</span>
{/if}
{#if r.status}
<span class="text-xs text-(--color-muted)/60 capitalize">{r.status}</span>
{/if}
{#each genres.slice(0, 2) as g}
<span class="text-[10px] px-1.5 py-0.5 rounded-full bg-(--color-surface-3) text-(--color-muted)">{g}</span>
{/each}
</div>
</div>
<!-- Chevron -->
<svg class="w-4 h-4 text-(--color-muted)/40 shrink-0" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M9 5l7 7-7 7"/>
</svg>
</button>
{/each}
<!-- "See all results" footer button -->
<button
type="button"
onclick={submitQuery}
class="w-full flex items-center justify-center gap-2 px-4 py-4 text-sm text-(--color-brand) hover:bg-(--color-surface-2) transition-colors"
>
<svg class="w-4 h-4" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M21 21l-6-6m2-5a7 7 0 11-14 0 7 7 0 0114 0z"/>
</svg>
See all results for "{query.trim()}"
</button>
<!-- ── No results ───────────────────────────────────────────────── -->
{:else if query.trim().length >= 2 && !loading}
<div class="px-5 py-10 text-center">
<p class="text-sm font-semibold text-(--color-text) mb-1">No results for "{query.trim()}"</p>
<p class="text-xs text-(--color-muted) mb-5">Try a different title, author, or browse by genre below.</p>
<div class="flex flex-wrap gap-2 justify-center">
{#each GENRE_SUGGESTIONS as genre}
<button
type="button"
onclick={() => searchGenre(genre)}
class="px-3 py-1.5 rounded-full border border-(--color-border) bg-(--color-surface-2) text-sm text-(--color-muted) hover:text-(--color-text) hover:border-(--color-brand)/50 transition-colors"
>
{genre}
</button>
{/each}
</div>
</div>
<!-- ── Empty state (query too short or empty) ───────────────────── -->
{:else if query.trim().length === 0}
<!-- Recent searches -->
{#if recents.length > 0}
<div class="px-4 pt-4 pb-2">
<div class="flex items-center justify-between mb-2">
<p class="text-xs font-semibold text-(--color-muted) uppercase tracking-wider">Recent</p>
<button
type="button"
onclick={clearAllRecents}
class="text-xs text-(--color-muted) hover:text-(--color-text) transition-colors"
>
Clear all
</button>
</div>
{#each recents as r}
<div class="flex items-center gap-2 group">
<button
type="button"
onclick={() => applyRecent(r)}
class="flex-1 flex items-center gap-2.5 px-1 py-2 rounded-lg text-sm text-(--color-text) hover:bg-(--color-surface-2) transition-colors text-left"
>
<svg class="w-3.5 h-3.5 text-(--color-muted)/50 shrink-0" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M12 8v4l3 3m6-3a9 9 0 11-18 0 9 9 0 0118 0z"/>
</svg>
{r}
</button>
<button
type="button"
onclick={() => removeRecent(r)}
class="shrink-0 p-1 rounded text-(--color-muted)/40 hover:text-(--color-muted) opacity-0 group-hover:opacity-100 transition-all"
aria-label="Remove"
>
<svg class="w-3.5 h-3.5" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M6 18L18 6M6 6l12 12"/>
</svg>
</button>
</div>
{/each}
</div>
<div class="mx-4 my-2 border-t border-(--color-border)/60"></div>
{/if}
<!-- Genre suggestions -->
<div class="px-4 pt-3 pb-5">
<p class="text-xs font-semibold text-(--color-muted) uppercase tracking-wider mb-3">Browse by genre</p>
<div class="flex flex-wrap gap-2">
{#each GENRE_SUGGESTIONS as genre}
<button
type="button"
onclick={() => searchGenre(genre)}
class="px-3 py-1.5 rounded-full border border-(--color-border) bg-(--color-surface-2) text-sm text-(--color-muted) hover:text-(--color-text) hover:border-(--color-brand)/50 transition-colors"
>
{genre}
</button>
{/each}
</div>
</div>
{/if}
</div>
</div>
</div>


@@ -0,0 +1,285 @@
<script lang="ts">
/**
* SeasonalDecoration — full-viewport canvas particle overlay.
*
* Modes:
* snow — white circular snowflakes drifting down with gentle sway
* sakura — pink/white ellipse petals falling and rotating
* fireflies — small glowing dots floating up, pulsing opacity
* leaves — orange/red/yellow tear-drop shapes tumbling down
* stars — white stars twinkling in place (fixed positions, opacity animation)
*/
type Mode = 'snow' | 'sakura' | 'fireflies' | 'leaves' | 'stars';
interface Props { mode: Mode }
let { mode }: Props = $props();
let canvas = $state<HTMLCanvasElement | null>(null);
let raf = 0;
// ── Particle types ──────────────────────────────────────────────────────
interface Particle {
x: number; y: number; r: number;
vx: number; vy: number;
angle: number; vAngle: number;
opacity: number; vOpacity: number;
color: string;
// star-specific
twinkleOffset?: number;
}
// ── Palette helpers ──────────────────────────────────────────────────────
function rand(min: number, max: number) { return min + Math.random() * (max - min); }
function randInt(min: number, max: number) { return Math.floor(rand(min, max + 1)); }
const SNOW_COLORS = ['rgba(255,255,255,0.85)', 'rgba(200,220,255,0.75)', 'rgba(220,235,255,0.8)'];
const SAKURA_COLORS = ['rgba(255,182,193,0.85)', 'rgba(255,200,210,0.8)', 'rgba(255,240,245,0.9)', 'rgba(255,160,180,0.75)'];
const FIREFLY_COLORS = ['rgba(180,255,100,0.9)', 'rgba(220,255,150,0.85)', 'rgba(255,255,180,0.8)'];
const LEAF_COLORS = ['rgba(210,80,20,0.85)', 'rgba(190,120,30,0.8)', 'rgba(220,160,40,0.85)', 'rgba(180,60,10,0.8)', 'rgba(240,140,30,0.9)'];
const STAR_COLORS = ['rgba(255,255,255,0.9)', 'rgba(255,240,180,0.85)', 'rgba(180,210,255,0.8)'];
// ── Spawn helpers ────────────────────────────────────────────────────────
function spawnSnow(W: number, H: number): Particle {
return {
x: rand(0, W), y: rand(-H * 0.2, -4),
r: rand(1.5, 5),
vx: rand(-0.4, 0.4), vy: rand(0.6, 2.0),
angle: 0, vAngle: 0,
opacity: rand(0.5, 1), vOpacity: 0,
color: SNOW_COLORS[randInt(0, SNOW_COLORS.length - 1)],
};
}
function spawnSakura(W: number, H: number): Particle {
return {
x: rand(0, W), y: rand(-H * 0.2, -4),
r: rand(3, 7),
vx: rand(-0.6, 0.6), vy: rand(0.5, 1.6),
angle: rand(0, Math.PI * 2), vAngle: rand(-0.03, 0.03),
opacity: rand(0.6, 1), vOpacity: 0,
color: SAKURA_COLORS[randInt(0, SAKURA_COLORS.length - 1)],
};
}
function spawnFirefly(W: number, H: number): Particle {
return {
x: rand(0, W), y: rand(H * 0.3, H),
r: rand(1.5, 3.5),
vx: rand(-0.3, 0.3), vy: rand(-0.8, -0.2),
angle: 0, vAngle: 0,
opacity: rand(0.2, 0.8), vOpacity: rand(0.008, 0.025) * (Math.random() < 0.5 ? 1 : -1),
color: FIREFLY_COLORS[randInt(0, FIREFLY_COLORS.length - 1)],
};
}
function spawnLeaf(W: number, H: number): Particle {
return {
x: rand(0, W), y: rand(-H * 0.2, -4),
r: rand(4, 9),
vx: rand(-1.2, 1.2), vy: rand(0.8, 2.5),
angle: rand(0, Math.PI * 2), vAngle: rand(-0.05, 0.05),
opacity: rand(0.6, 1), vOpacity: 0,
color: LEAF_COLORS[randInt(0, LEAF_COLORS.length - 1)],
};
}
function spawnStar(W: number, H: number): Particle {
return {
x: rand(0, W), y: rand(0, H),
r: rand(0.8, 2.5),
vx: 0, vy: 0,
angle: 0, vAngle: 0,
opacity: rand(0.1, 0.9),
vOpacity: rand(0.004, 0.015) * (Math.random() < 0.5 ? 1 : -1),
color: STAR_COLORS[randInt(0, STAR_COLORS.length - 1)],
twinkleOffset: rand(0, Math.PI * 2),
};
}
// ── Draw helpers ─────────────────────────────────────────────────────────
function drawSnow(ctx: CanvasRenderingContext2D, p: Particle) {
ctx.beginPath();
ctx.arc(p.x, p.y, p.r, 0, Math.PI * 2);
ctx.fillStyle = p.color;
ctx.globalAlpha = p.opacity;
ctx.fill();
}
function drawSakura(ctx: CanvasRenderingContext2D, p: Particle) {
ctx.save();
ctx.translate(p.x, p.y);
ctx.rotate(p.angle);
ctx.beginPath();
ctx.ellipse(0, 0, p.r * 1.8, p.r, 0, 0, Math.PI * 2);
ctx.fillStyle = p.color;
ctx.globalAlpha = p.opacity;
ctx.fill();
ctx.restore();
}
function drawFirefly(ctx: CanvasRenderingContext2D, p: Particle) {
// Glow effect: large soft circle + small bright core
const grd = ctx.createRadialGradient(p.x, p.y, 0, p.x, p.y, p.r * 4);
grd.addColorStop(0, p.color);
grd.addColorStop(1, 'rgba(0,0,0,0)');
ctx.beginPath();
ctx.arc(p.x, p.y, p.r * 4, 0, Math.PI * 2);
ctx.fillStyle = grd;
ctx.globalAlpha = p.opacity * 0.6;
ctx.fill();
ctx.beginPath();
ctx.arc(p.x, p.y, p.r, 0, Math.PI * 2);
ctx.fillStyle = p.color;
ctx.globalAlpha = p.opacity;
ctx.fill();
}
function drawLeaf(ctx: CanvasRenderingContext2D, p: Particle) {
ctx.save();
ctx.translate(p.x, p.y);
ctx.rotate(p.angle);
ctx.beginPath();
ctx.moveTo(0, -p.r * 1.5);
ctx.bezierCurveTo(p.r * 1.2, -p.r * 0.5, p.r * 1.2, p.r * 0.5, 0, p.r * 1.5);
ctx.bezierCurveTo(-p.r * 1.2, p.r * 0.5, -p.r * 1.2, -p.r * 0.5, 0, -p.r * 1.5);
ctx.fillStyle = p.color;
ctx.globalAlpha = p.opacity;
ctx.fill();
ctx.restore();
}
function drawStar(ctx: CanvasRenderingContext2D, p: Particle, t: number) {
const pulse = 0.5 + 0.5 * Math.sin(t * 0.002 + (p.twinkleOffset ?? 0));
ctx.beginPath();
ctx.arc(p.x, p.y, p.r, 0, Math.PI * 2);
ctx.fillStyle = p.color;
ctx.globalAlpha = p.opacity * pulse;
ctx.fill();
}
// ── Particle count by mode ────────────────────────────────────────────────
const COUNT: Record<Mode, number> = {
snow: 120, sakura: 60, fireflies: 50, leaves: 45, stars: 150,
};
// ── Main effect ──────────────────────────────────────────────────────────
$effect(() => {
if (!canvas) return;
const ctx = canvas.getContext('2d');
if (!ctx) return;
let W = window.innerWidth;
let H = window.innerHeight;
canvas.width = W;
canvas.height = H;
const onResize = () => {
W = window.innerWidth;
H = window.innerHeight;
canvas!.width = W;
canvas!.height = H;
// Reseed stars on resize since they're positionally fixed
if (mode === 'stars') {
particles.length = 0;
for (let i = 0; i < COUNT.stars; i++) particles.push(spawnStar(W, H));
}
};
window.addEventListener('resize', onResize);
const n = COUNT[mode];
const particles: Particle[] = [];
// Pre-scatter initial particles across the full height
for (let i = 0; i < n; i++) {
let p: Particle;
switch (mode) {
case 'snow': p = spawnSnow(W, H); p.y = rand(0, H); break;
case 'sakura': p = spawnSakura(W, H); p.y = rand(0, H); break;
case 'fireflies': p = spawnFirefly(W, H); break;
case 'leaves': p = spawnLeaf(W, H); p.y = rand(0, H); break;
case 'stars': p = spawnStar(W, H); break;
}
particles.push(p);
}
let t = 0;
function tick() {
ctx!.clearRect(0, 0, W, H);
ctx!.save();
for (let i = 0; i < particles.length; i++) {
const p = particles[i];
switch (mode) {
case 'snow': {
// Gentle horizontal sway
p.vx = Math.sin(t * 0.001 + p.y * 0.01) * 0.5;
p.x += p.vx; p.y += p.vy;
if (p.y > H + 10) particles[i] = spawnSnow(W, H);
else drawSnow(ctx!, p);
break;
}
case 'sakura': {
p.vx = Math.sin(t * 0.0008 + p.y * 0.008) * 0.8;
p.x += p.vx; p.y += p.vy;
p.angle += p.vAngle;
if (p.y > H + 20) particles[i] = spawnSakura(W, H);
else drawSakura(ctx!, p);
break;
}
case 'fireflies': {
p.x += p.vx + Math.sin(t * 0.002 + i) * 0.3;
p.y += p.vy;
p.opacity += p.vOpacity;
if (p.opacity >= 1) { p.opacity = 1; p.vOpacity *= -1; }
if (p.opacity <= 0.1) { p.opacity = 0.1; p.vOpacity *= -1; }
if (p.y < -10) particles[i] = spawnFirefly(W, H);
else drawFirefly(ctx!, p);
break;
}
case 'leaves': {
p.vx = Math.sin(t * 0.001 + p.y * 0.01) * 1.2 + p.vx * 0.02;
p.x += p.vx; p.y += p.vy;
p.angle += p.vAngle;
if (p.y > H + 20) particles[i] = spawnLeaf(W, H);
else drawLeaf(ctx!, p);
break;
}
case 'stars': {
drawStar(ctx!, p, t);
break;
}
}
}
ctx!.restore();
t++;
raf = requestAnimationFrame(tick);
}
raf = requestAnimationFrame(tick);
return () => {
cancelAnimationFrame(raf);
window.removeEventListener('resize', onResize);
};
});
</script>
<!--
Fixed full-viewport overlay, pointer-events-none so all clicks pass through.
z-index 40 keeps it below the sticky nav (z-50) but above page content.
-->
<canvas
bind:this={canvas}
class="fixed inset-0 z-40 pointer-events-none"
aria-hidden="true"
></canvas>
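The firefly and star modes above pulse opacity by drifting it each frame and flipping the drift direction at clamped bounds. That "ping-pong" step is easy to isolate; a minimal sketch (the function name `stepOpacity` is ours for illustration, not part of the component):

```typescript
// Opacity "ping-pong": drift by vOpacity each frame, then clamp to the
// bounds [0.1, 1] and reverse the drift direction, producing a pulsing glow.
function stepOpacity(opacity: number, vOpacity: number): { opacity: number; vOpacity: number } {
  opacity += vOpacity;
  if (opacity >= 1) { opacity = 1; vOpacity *= -1; }
  if (opacity <= 0.1) { opacity = 0.1; vOpacity *= -1; }
  return { opacity, vOpacity };
}
```

Keeping the step as a pure function also makes the bounce behavior unit-testable, which the inline per-particle mutation in the tick loop is not.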


@@ -0,0 +1,60 @@
<script lang="ts">
interface Props {
rating: number; // current user rating 0-5 (0 = unrated)
avg?: number; // average rating
count?: number; // total ratings
readonly?: boolean; // display-only mode
size?: 'sm' | 'md';
onrate?: (r: number) => void;
}
let { rating = 0, avg = 0, count = 0, readonly = false, size = 'md', onrate }: Props = $props();
let hovered = $state(0);
const starSize = $derived(size === 'sm' ? 'w-3.5 h-3.5' : 'w-5 h-5');
const display = $derived(hovered || rating || 0);
</script>
<div class="flex items-center gap-2">
<div class="flex items-center gap-0.5">
{#each [1,2,3,4,5] as star}
<button
type="button"
disabled={readonly}
onmouseenter={() => { if (!readonly) hovered = star; }}
onmouseleave={() => { if (!readonly) hovered = 0; }}
onclick={() => { if (!readonly) onrate?.(star); }}
class="transition-transform {readonly ? 'cursor-default' : 'cursor-pointer hover:scale-110 active:scale-95'} disabled:pointer-events-none"
aria-label="Rate {star} star{star !== 1 ? 's' : ''}"
>
<svg
class="{starSize} transition-colors"
fill={star <= display ? 'currentColor' : 'none'}
stroke="currentColor"
stroke-width="1.5"
viewBox="0 0 24 24"
>
<path
stroke-linecap="round"
stroke-linejoin="round"
d="M11.049 2.927c.3-.921 1.603-.921 1.902 0l1.519 4.674a1 1 0 00.95.69h4.915c.969 0 1.371 1.24.588 1.81l-3.976 2.888a1 1 0 00-.363 1.118l1.518 4.674c.3.922-.755 1.688-1.538 1.118l-3.976-2.888a1 1 0 00-1.176 0l-3.976 2.888c-.783.57-1.838-.197-1.538-1.118l1.518-4.674a1 1 0 00-.363-1.118l-3.976-2.888c-.784-.57-.38-1.81.588-1.81h4.914a1 1 0 00.951-.69l1.519-4.674z"
/>
</svg>
</button>
{/each}
</div>
{#if count > 0}
<div class="flex flex-col">
<span class="text-sm font-semibold text-(--color-text)">{avg.toFixed(1)}</span>
<span class="text-xs text-(--color-muted) leading-none">{count} {count === 1 ? 'rating' : 'ratings'}</span>
</div>
{:else if !readonly}
<span class="text-xs text-(--color-muted)">Rate this book</span>
{/if}
</div>
<style>
button[disabled] { pointer-events: none; }
svg { color: #f59e0b; }
</style>
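The `display` derivation above is a simple fallback chain: a hovered star previews the rating, otherwise the saved rating shows, otherwise nothing is filled. Sketched as a plain function (`displayedStars` is our name for illustration):

```typescript
// Which stars to fill: hover preview wins, then the saved rating, then none.
// Relies on 0 being falsy, so an "unrated" value of 0 falls through.
function displayedStars(hovered: number, rating: number): number {
  return hovered || rating || 0;
}
```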


@@ -18,6 +18,13 @@
}
}
function handleBackdropKeyDown(e: KeyboardEvent) {
if ((e.key === 'Enter' || e.key === ' ') && e.target === e.currentTarget) {
open = false;
onclose?.();
}
}
function handleKeyDown(e: KeyboardEvent) {
if (e.key === 'Escape') {
open = false;
@@ -34,7 +41,9 @@
class="fixed inset-0 z-50 flex items-center justify-center bg-black/70 backdrop-blur-sm p-4"
role="dialog"
aria-modal="true"
tabindex="-1"
onclick={handleBackdropClick}
onkeydown={handleBackdropKeyDown}
>
<div class={cn('bg-(--color-surface) rounded-2xl border border-(--color-border) shadow-2xl w-full max-w-sm', className)}>
{@render children?.()}


@@ -2,7 +2,7 @@
> Auto-generated i18n message functions. Import `messages.js` to use translated strings.
Compiled from: `/Users/kalekber/code/libnovel-v2/ui/project.inlang`
Compiled from: `/opt/libnovel-v3/ui/project.inlang`
## What is this folder?


@@ -2,6 +2,7 @@
/** @typedef {import('../runtime.js').LocalizedString} LocalizedString */
export * from './nav_library.js'
export * from './nav_catalogue.js'
export * from './nav_feed.js'
export * from './nav_feedback.js'
export * from './nav_admin.js'
export * from './nav_profile.js'
@@ -149,6 +150,9 @@ export * from './profile_theme_label.js'
export * from './profile_theme_amber.js'
export * from './profile_theme_slate.js'
export * from './profile_theme_rose.js'
export * from './profile_theme_forest.js'
export * from './profile_theme_mono.js'
export * from './profile_theme_cyber.js'
export * from './profile_theme_light.js'
export * from './profile_theme_light_slate.js'
export * from './profile_theme_light_rose.js'
@@ -278,6 +282,24 @@ export * from './book_detail_to_chapter.js'
export * from './book_detail_range_queuing.js'
export * from './book_detail_scrape_range.js'
export * from './book_detail_admin.js'
export * from './book_detail_admin_book_cover.js'
export * from './book_detail_admin_chapter_cover.js'
export * from './book_detail_admin_chapter_n.js'
export * from './book_detail_admin_description.js'
export * from './book_detail_admin_chapter_names.js'
export * from './book_detail_admin_audio_tts.js'
export * from './book_detail_admin_voice.js'
export * from './book_detail_admin_generate.js'
export * from './book_detail_admin_save_cover.js'
export * from './book_detail_admin_saving.js'
export * from './book_detail_admin_saved.js'
export * from './book_detail_admin_apply.js'
export * from './book_detail_admin_applying.js'
export * from './book_detail_admin_applied.js'
export * from './book_detail_admin_discard.js'
export * from './book_detail_admin_enqueue_audio.js'
export * from './book_detail_admin_cancel_audio.js'
export * from './book_detail_admin_enqueued.js'
export * from './book_detail_scraping_progress.js'
export * from './book_detail_scraping_home.js'
export * from './book_detail_rescrape_book.js'
@@ -320,6 +342,26 @@ export * from './profile_upgrade_desc.js'
export * from './profile_upgrade_monthly.js'
export * from './profile_upgrade_annual.js'
export * from './profile_free_limits.js'
export * from './subscribe_page_title.js'
export * from './subscribe_heading.js'
export * from './subscribe_subheading.js'
export * from './subscribe_monthly_label.js'
export * from './subscribe_monthly_price.js'
export * from './subscribe_monthly_period.js'
export * from './subscribe_annual_label.js'
export * from './subscribe_annual_price.js'
export * from './subscribe_annual_period.js'
export * from './subscribe_annual_save.js'
export * from './subscribe_cta_monthly.js'
export * from './subscribe_cta_annual.js'
export * from './subscribe_already_pro.js'
export * from './subscribe_manage.js'
export * from './subscribe_benefit_audio.js'
export * from './subscribe_benefit_voices.js'
export * from './subscribe_benefit_translation.js'
export * from './subscribe_benefit_downloads.js'
export * from './subscribe_login_prompt.js'
export * from './subscribe_login_cta.js'
export * from './user_currently_reading.js'
export * from './user_library_count.js'
export * from './user_joined.js'
@@ -331,13 +373,21 @@ export * from './admin_tools_label.js'
export * from './admin_nav_scrape.js'
export * from './admin_nav_audio.js'
export * from './admin_nav_translation.js'
export * from './admin_nav_import.js'
export * from './admin_nav_changelog.js'
export * from './admin_nav_image_gen.js'
export * from './admin_nav_text_gen.js'
export * from './admin_nav_catalogue_tools.js'
export * from './admin_nav_ai_jobs.js'
export * from './admin_nav_notifications.js'
export * from './admin_nav_feedback.js'
export * from './admin_nav_errors.js'
export * from './admin_nav_analytics.js'
export * from './admin_nav_logs.js'
export * from './admin_nav_uptime.js'
export * from './admin_nav_push.js'
export * from './admin_nav_gitea.js'
export * from './admin_nav_grafana.js'
export * from './admin_scrape_status_idle.js'
export * from './admin_scrape_full_catalogue.js'
export * from './admin_scrape_single_book.js'
@@ -383,4 +433,26 @@ export * from './profile_text_size.js'
export * from './profile_text_size_sm.js'
export * from './profile_text_size_md.js'
export * from './profile_text_size_lg.js'
export * from './profile_text_size_xl.js'
export * from './profile_text_size_xl.js'
export * from './feed_page_title.js'
export * from './feed_heading.js'
export * from './feed_subheading.js'
export * from './feed_empty_heading.js'
export * from './feed_empty_body.js'
export * from './feed_not_logged_in.js'
export * from './feed_reader_label.js'
export * from './feed_chapters_label.js'
export * from './feed_browse_cta.js'
export * from './feed_find_users_cta.js'
export * from './admin_translation_page_title.js'
export * from './admin_translation_heading.js'
export * from './admin_translation_tab_enqueue.js'
export * from './admin_translation_tab_jobs.js'
export * from './admin_translation_filter_placeholder.js'
export * from './admin_translation_no_matching.js'
export * from './admin_translation_no_jobs.js'
export * from './admin_ai_jobs_page_title.js'
export * from './admin_ai_jobs_heading.js'
export * from './admin_ai_jobs_subheading.js'
export * from './admin_text_gen_page_title.js'
export * from './admin_text_gen_heading.js'


@@ -0,0 +1,44 @@
/* eslint-disable */
import { getLocale, experimentalStaticLocale } from '../runtime.js';
/** @typedef {import('../runtime.js').LocalizedString} LocalizedString */
/** @typedef {{}} Admin_Ai_Jobs_HeadingInputs */
const en_admin_ai_jobs_heading = /** @type {(inputs: Admin_Ai_Jobs_HeadingInputs) => LocalizedString} */ () => {
return /** @type {LocalizedString} */ (`AI Jobs`)
};
const ru_admin_ai_jobs_heading = /** @type {(inputs: Admin_Ai_Jobs_HeadingInputs) => LocalizedString} */ () => {
return /** @type {LocalizedString} */ (`AI Jobs`)
};
const id_admin_ai_jobs_heading = /** @type {(inputs: Admin_Ai_Jobs_HeadingInputs) => LocalizedString} */ () => {
return /** @type {LocalizedString} */ (`AI Jobs`)
};
const pt_admin_ai_jobs_heading = /** @type {(inputs: Admin_Ai_Jobs_HeadingInputs) => LocalizedString} */ () => {
return /** @type {LocalizedString} */ (`AI Jobs`)
};
const fr_admin_ai_jobs_heading = /** @type {(inputs: Admin_Ai_Jobs_HeadingInputs) => LocalizedString} */ () => {
return /** @type {LocalizedString} */ (`AI Jobs`)
};
/**
* | output |
* | --- |
* | "AI Jobs" |
*
* @param {Admin_Ai_Jobs_HeadingInputs} inputs
* @param {{ locale?: "en" | "ru" | "id" | "pt" | "fr" }} options
* @returns {LocalizedString}
*/
export const admin_ai_jobs_heading = /** @type {((inputs?: Admin_Ai_Jobs_HeadingInputs, options?: { locale?: "en" | "ru" | "id" | "pt" | "fr" }) => LocalizedString) & import('../runtime.js').MessageMetadata<Admin_Ai_Jobs_HeadingInputs, { locale?: "en" | "ru" | "id" | "pt" | "fr" }, {}>} */ ((inputs = {}, options = {}) => {
const locale = experimentalStaticLocale ?? options.locale ?? getLocale()
if (locale === "en") return en_admin_ai_jobs_heading(inputs)
if (locale === "ru") return ru_admin_ai_jobs_heading(inputs)
if (locale === "id") return id_admin_ai_jobs_heading(inputs)
if (locale === "pt") return pt_admin_ai_jobs_heading(inputs)
return fr_admin_ai_jobs_heading(inputs)
});
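Every generated message file follows the same dispatch shape: resolve the locale from the static override, then the per-call option, then the runtime locale, and let the final `return` act as the fallback branch (`fr` in these files). A hedged sketch of that precedence (the names below are ours, not the generated API):

```typescript
type Locale = 'en' | 'ru' | 'id' | 'pt' | 'fr';

// Mirrors: experimentalStaticLocale ?? options.locale ?? getLocale()
function resolveLocale(
  staticLocale: Locale | null,
  optionLocale: Locale | undefined,
  currentLocale: Locale
): Locale {
  return staticLocale ?? optionLocale ?? currentLocale;
}
```

Note that the `??` chain means an explicit `options.locale` is ignored whenever a static locale is compiled in, which is the intended behavior for statically localized builds.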


@@ -0,0 +1,44 @@
/* eslint-disable */
import { getLocale, experimentalStaticLocale } from '../runtime.js';
/** @typedef {import('../runtime.js').LocalizedString} LocalizedString */
/** @typedef {{}} Admin_Ai_Jobs_Page_TitleInputs */
const en_admin_ai_jobs_page_title = /** @type {(inputs: Admin_Ai_Jobs_Page_TitleInputs) => LocalizedString} */ () => {
return /** @type {LocalizedString} */ (`AI Jobs — Admin`)
};
const ru_admin_ai_jobs_page_title = /** @type {(inputs: Admin_Ai_Jobs_Page_TitleInputs) => LocalizedString} */ () => {
return /** @type {LocalizedString} */ (`AI Jobs — Admin`)
};
const id_admin_ai_jobs_page_title = /** @type {(inputs: Admin_Ai_Jobs_Page_TitleInputs) => LocalizedString} */ () => {
return /** @type {LocalizedString} */ (`AI Jobs — Admin`)
};
const pt_admin_ai_jobs_page_title = /** @type {(inputs: Admin_Ai_Jobs_Page_TitleInputs) => LocalizedString} */ () => {
return /** @type {LocalizedString} */ (`AI Jobs — Admin`)
};
const fr_admin_ai_jobs_page_title = /** @type {(inputs: Admin_Ai_Jobs_Page_TitleInputs) => LocalizedString} */ () => {
return /** @type {LocalizedString} */ (`AI Jobs — Admin`)
};
/**
* | output |
* | --- |
* | "AI Jobs — Admin" |
*
* @param {Admin_Ai_Jobs_Page_TitleInputs} inputs
* @param {{ locale?: "en" | "ru" | "id" | "pt" | "fr" }} options
* @returns {LocalizedString}
*/
export const admin_ai_jobs_page_title = /** @type {((inputs?: Admin_Ai_Jobs_Page_TitleInputs, options?: { locale?: "en" | "ru" | "id" | "pt" | "fr" }) => LocalizedString) & import('../runtime.js').MessageMetadata<Admin_Ai_Jobs_Page_TitleInputs, { locale?: "en" | "ru" | "id" | "pt" | "fr" }, {}>} */ ((inputs = {}, options = {}) => {
const locale = experimentalStaticLocale ?? options.locale ?? getLocale()
if (locale === "en") return en_admin_ai_jobs_page_title(inputs)
if (locale === "ru") return ru_admin_ai_jobs_page_title(inputs)
if (locale === "id") return id_admin_ai_jobs_page_title(inputs)
if (locale === "pt") return pt_admin_ai_jobs_page_title(inputs)
return fr_admin_ai_jobs_page_title(inputs)
});


@@ -0,0 +1,44 @@
/* eslint-disable */
import { getLocale, experimentalStaticLocale } from '../runtime.js';
/** @typedef {import('../runtime.js').LocalizedString} LocalizedString */
/** @typedef {{}} Admin_Ai_Jobs_SubheadingInputs */
const en_admin_ai_jobs_subheading = /** @type {(inputs: Admin_Ai_Jobs_SubheadingInputs) => LocalizedString} */ () => {
return /** @type {LocalizedString} */ (`Background AI generation tasks`)
};
const ru_admin_ai_jobs_subheading = /** @type {(inputs: Admin_Ai_Jobs_SubheadingInputs) => LocalizedString} */ () => {
return /** @type {LocalizedString} */ (`Background AI generation tasks`)
};
const id_admin_ai_jobs_subheading = /** @type {(inputs: Admin_Ai_Jobs_SubheadingInputs) => LocalizedString} */ () => {
return /** @type {LocalizedString} */ (`Background AI generation tasks`)
};
const pt_admin_ai_jobs_subheading = /** @type {(inputs: Admin_Ai_Jobs_SubheadingInputs) => LocalizedString} */ () => {
return /** @type {LocalizedString} */ (`Background AI generation tasks`)
};
const fr_admin_ai_jobs_subheading = /** @type {(inputs: Admin_Ai_Jobs_SubheadingInputs) => LocalizedString} */ () => {
return /** @type {LocalizedString} */ (`Background AI generation tasks`)
};
/**
* | output |
* | --- |
* | "Background AI generation tasks" |
*
* @param {Admin_Ai_Jobs_SubheadingInputs} inputs
* @param {{ locale?: "en" | "ru" | "id" | "pt" | "fr" }} options
* @returns {LocalizedString}
*/
export const admin_ai_jobs_subheading = /** @type {((inputs?: Admin_Ai_Jobs_SubheadingInputs, options?: { locale?: "en" | "ru" | "id" | "pt" | "fr" }) => LocalizedString) & import('../runtime.js').MessageMetadata<Admin_Ai_Jobs_SubheadingInputs, { locale?: "en" | "ru" | "id" | "pt" | "fr" }, {}>} */ ((inputs = {}, options = {}) => {
const locale = experimentalStaticLocale ?? options.locale ?? getLocale()
if (locale === "en") return en_admin_ai_jobs_subheading(inputs)
if (locale === "ru") return ru_admin_ai_jobs_subheading(inputs)
if (locale === "id") return id_admin_ai_jobs_subheading(inputs)
if (locale === "pt") return pt_admin_ai_jobs_subheading(inputs)
return fr_admin_ai_jobs_subheading(inputs)
});


@@ -0,0 +1,44 @@
/* eslint-disable */
import { getLocale, experimentalStaticLocale } from '../runtime.js';
/** @typedef {import('../runtime.js').LocalizedString} LocalizedString */
/** @typedef {{}} Admin_Nav_Ai_JobsInputs */
const en_admin_nav_ai_jobs = /** @type {(inputs: Admin_Nav_Ai_JobsInputs) => LocalizedString} */ () => {
return /** @type {LocalizedString} */ (`AI Jobs`)
};
const ru_admin_nav_ai_jobs = /** @type {(inputs: Admin_Nav_Ai_JobsInputs) => LocalizedString} */ () => {
return /** @type {LocalizedString} */ (`Задачи ИИ`)
};
const id_admin_nav_ai_jobs = /** @type {(inputs: Admin_Nav_Ai_JobsInputs) => LocalizedString} */ () => {
return /** @type {LocalizedString} */ (`Tugas AI`)
};
const pt_admin_nav_ai_jobs = /** @type {(inputs: Admin_Nav_Ai_JobsInputs) => LocalizedString} */ () => {
return /** @type {LocalizedString} */ (`Tarefas de IA`)
};
const fr_admin_nav_ai_jobs = /** @type {(inputs: Admin_Nav_Ai_JobsInputs) => LocalizedString} */ () => {
return /** @type {LocalizedString} */ (`Tâches IA`)
};
/**
* | output |
* | --- |
* | "AI Jobs" |
*
* @param {Admin_Nav_Ai_JobsInputs} inputs
* @param {{ locale?: "en" | "ru" | "id" | "pt" | "fr" }} options
* @returns {LocalizedString}
*/
export const admin_nav_ai_jobs = /** @type {((inputs?: Admin_Nav_Ai_JobsInputs, options?: { locale?: "en" | "ru" | "id" | "pt" | "fr" }) => LocalizedString) & import('../runtime.js').MessageMetadata<Admin_Nav_Ai_JobsInputs, { locale?: "en" | "ru" | "id" | "pt" | "fr" }, {}>} */ ((inputs = {}, options = {}) => {
const locale = experimentalStaticLocale ?? options.locale ?? getLocale()
if (locale === "en") return en_admin_nav_ai_jobs(inputs)
if (locale === "ru") return ru_admin_nav_ai_jobs(inputs)
if (locale === "id") return id_admin_nav_ai_jobs(inputs)
if (locale === "pt") return pt_admin_nav_ai_jobs(inputs)
return fr_admin_nav_ai_jobs(inputs)
});


@@ -5,12 +5,35 @@ import { getLocale, experimentalStaticLocale } from '../runtime.js';
/** @typedef {{}} Admin_Nav_AnalyticsInputs */
const en_admin_nav_analytics = (_inputs = {}) => /** @type {LocalizedString} */ (`Analytics`);
const ru_admin_nav_analytics = (_inputs = {}) => /** @type {LocalizedString} */ (`Аналитика`);
const id_admin_nav_analytics = (_inputs = {}) => /** @type {LocalizedString} */ (`Analitik`);
const pt_admin_nav_analytics = (_inputs = {}) => /** @type {LocalizedString} */ (`Análise`);
const fr_admin_nav_analytics = (_inputs = {}) => /** @type {LocalizedString} */ (`Analytique`);
const en_admin_nav_analytics = /** @type {(inputs: Admin_Nav_AnalyticsInputs) => LocalizedString} */ () => {
return /** @type {LocalizedString} */ (`Analytics`)
};
const ru_admin_nav_analytics = /** @type {(inputs: Admin_Nav_AnalyticsInputs) => LocalizedString} */ () => {
return /** @type {LocalizedString} */ (`Аналитика`)
};
const id_admin_nav_analytics = /** @type {(inputs: Admin_Nav_AnalyticsInputs) => LocalizedString} */ () => {
return /** @type {LocalizedString} */ (`Analitik`)
};
const pt_admin_nav_analytics = /** @type {(inputs: Admin_Nav_AnalyticsInputs) => LocalizedString} */ () => {
return /** @type {LocalizedString} */ (`Análise`)
};
const fr_admin_nav_analytics = /** @type {(inputs: Admin_Nav_AnalyticsInputs) => LocalizedString} */ () => {
return /** @type {LocalizedString} */ (`Analytique`)
};
/**
* | output |
* | --- |
* | "Analytics" |
*
* @param {Admin_Nav_AnalyticsInputs} inputs
* @param {{ locale?: "en" | "ru" | "id" | "pt" | "fr" }} options
* @returns {LocalizedString}
*/
export const admin_nav_analytics = /** @type {((inputs?: Admin_Nav_AnalyticsInputs, options?: { locale?: "en" | "ru" | "id" | "pt" | "fr" }) => LocalizedString) & import('../runtime.js').MessageMetadata<Admin_Nav_AnalyticsInputs, { locale?: "en" | "ru" | "id" | "pt" | "fr" }, {}>} */ ((inputs = {}, options = {}) => {
const locale = experimentalStaticLocale ?? options.locale ?? getLocale()
if (locale === "en") return en_admin_nav_analytics(inputs)
@@ -18,4 +41,4 @@ export const admin_nav_analytics = /** @type {((inputs?: Admin_Nav_AnalyticsInpu
if (locale === "id") return id_admin_nav_analytics(inputs)
if (locale === "pt") return pt_admin_nav_analytics(inputs)
return fr_admin_nav_analytics(inputs)
});
});


@@ -5,12 +5,35 @@ import { getLocale, experimentalStaticLocale } from '../runtime.js';
/** @typedef {{}} Admin_Nav_AudioInputs */
const en_admin_nav_audio = (_inputs = {}) => /** @type {LocalizedString} */ (`Audio`);
const ru_admin_nav_audio = (_inputs = {}) => /** @type {LocalizedString} */ (`Аудио`);
const id_admin_nav_audio = (_inputs = {}) => /** @type {LocalizedString} */ (`Audio`);
const pt_admin_nav_audio = (_inputs = {}) => /** @type {LocalizedString} */ (`Áudio`);
const fr_admin_nav_audio = (_inputs = {}) => /** @type {LocalizedString} */ (`Audio`);
const en_admin_nav_audio = /** @type {(inputs: Admin_Nav_AudioInputs) => LocalizedString} */ () => {
return /** @type {LocalizedString} */ (`Audio`)
};
const ru_admin_nav_audio = /** @type {(inputs: Admin_Nav_AudioInputs) => LocalizedString} */ () => {
return /** @type {LocalizedString} */ (`Аудио`)
};
const id_admin_nav_audio = /** @type {(inputs: Admin_Nav_AudioInputs) => LocalizedString} */ () => {
return /** @type {LocalizedString} */ (`Audio`)
};
const pt_admin_nav_audio = /** @type {(inputs: Admin_Nav_AudioInputs) => LocalizedString} */ () => {
return /** @type {LocalizedString} */ (`Áudio`)
};
const fr_admin_nav_audio = /** @type {(inputs: Admin_Nav_AudioInputs) => LocalizedString} */ () => {
return /** @type {LocalizedString} */ (`Audio`)
};
/**
* | output |
* | --- |
* | "Audio" |
*
* @param {Admin_Nav_AudioInputs} inputs
* @param {{ locale?: "en" | "ru" | "id" | "pt" | "fr" }} options
* @returns {LocalizedString}
*/
export const admin_nav_audio = /** @type {((inputs?: Admin_Nav_AudioInputs, options?: { locale?: "en" | "ru" | "id" | "pt" | "fr" }) => LocalizedString) & import('../runtime.js').MessageMetadata<Admin_Nav_AudioInputs, { locale?: "en" | "ru" | "id" | "pt" | "fr" }, {}>} */ ((inputs = {}, options = {}) => {
const locale = experimentalStaticLocale ?? options.locale ?? getLocale()
if (locale === "en") return en_admin_nav_audio(inputs)
@@ -18,4 +41,4 @@ export const admin_nav_audio = /** @type {((inputs?: Admin_Nav_AudioInputs, opti
if (locale === "id") return id_admin_nav_audio(inputs)
if (locale === "pt") return pt_admin_nav_audio(inputs)
return fr_admin_nav_audio(inputs)
});
});


@@ -0,0 +1,44 @@
+/* eslint-disable */
+import { getLocale, experimentalStaticLocale } from '../runtime.js';
+/** @typedef {import('../runtime.js').LocalizedString} LocalizedString */
+/** @typedef {{}} Admin_Nav_Catalogue_ToolsInputs */
+const en_admin_nav_catalogue_tools = /** @type {(inputs: Admin_Nav_Catalogue_ToolsInputs) => LocalizedString} */ () => {
+	return /** @type {LocalizedString} */ (`Catalogue Tools`)
+};
+const ru_admin_nav_catalogue_tools = /** @type {(inputs: Admin_Nav_Catalogue_ToolsInputs) => LocalizedString} */ () => {
+	return /** @type {LocalizedString} */ (`Catalogue Tools`)
+};
+const id_admin_nav_catalogue_tools = /** @type {(inputs: Admin_Nav_Catalogue_ToolsInputs) => LocalizedString} */ () => {
+	return /** @type {LocalizedString} */ (`Catalogue Tools`)
+};
+const pt_admin_nav_catalogue_tools = /** @type {(inputs: Admin_Nav_Catalogue_ToolsInputs) => LocalizedString} */ () => {
+	return /** @type {LocalizedString} */ (`Catalogue Tools`)
+};
+const fr_admin_nav_catalogue_tools = /** @type {(inputs: Admin_Nav_Catalogue_ToolsInputs) => LocalizedString} */ () => {
+	return /** @type {LocalizedString} */ (`Catalogue Tools`)
+};
+/**
+ * | output |
+ * | --- |
+ * | "Catalogue Tools" |
+ *
+ * @param {Admin_Nav_Catalogue_ToolsInputs} inputs
+ * @param {{ locale?: "en" | "ru" | "id" | "pt" | "fr" }} options
+ * @returns {LocalizedString}
+ */
+export const admin_nav_catalogue_tools = /** @type {((inputs?: Admin_Nav_Catalogue_ToolsInputs, options?: { locale?: "en" | "ru" | "id" | "pt" | "fr" }) => LocalizedString) & import('../runtime.js').MessageMetadata<Admin_Nav_Catalogue_ToolsInputs, { locale?: "en" | "ru" | "id" | "pt" | "fr" }, {}>} */ ((inputs = {}, options = {}) => {
+	const locale = experimentalStaticLocale ?? options.locale ?? getLocale()
+	if (locale === "en") return en_admin_nav_catalogue_tools(inputs)
+	if (locale === "ru") return ru_admin_nav_catalogue_tools(inputs)
+	if (locale === "id") return id_admin_nav_catalogue_tools(inputs)
+	if (locale === "pt") return pt_admin_nav_catalogue_tools(inputs)
+	return fr_admin_nav_catalogue_tools(inputs)
+});


@@ -5,12 +5,35 @@ import { getLocale, experimentalStaticLocale } from '../runtime.js';
 /** @typedef {{}} Admin_Nav_ChangelogInputs */
-const en_admin_nav_changelog = (_inputs = {}) => /** @type {LocalizedString} */ (`Changelog`);
-const ru_admin_nav_changelog = (_inputs = {}) => /** @type {LocalizedString} */ (`Изменения`);
-const id_admin_nav_changelog = (_inputs = {}) => /** @type {LocalizedString} */ (`Perubahan`);
-const pt_admin_nav_changelog = (_inputs = {}) => /** @type {LocalizedString} */ (`Alterações`);
-const fr_admin_nav_changelog = (_inputs = {}) => /** @type {LocalizedString} */ (`Modifications`);
+const en_admin_nav_changelog = /** @type {(inputs: Admin_Nav_ChangelogInputs) => LocalizedString} */ () => {
+	return /** @type {LocalizedString} */ (`Changelog`)
+};
+const ru_admin_nav_changelog = /** @type {(inputs: Admin_Nav_ChangelogInputs) => LocalizedString} */ () => {
+	return /** @type {LocalizedString} */ (`Изменения`)
+};
+const id_admin_nav_changelog = /** @type {(inputs: Admin_Nav_ChangelogInputs) => LocalizedString} */ () => {
+	return /** @type {LocalizedString} */ (`Perubahan`)
+};
+const pt_admin_nav_changelog = /** @type {(inputs: Admin_Nav_ChangelogInputs) => LocalizedString} */ () => {
+	return /** @type {LocalizedString} */ (`Alterações`)
+};
+const fr_admin_nav_changelog = /** @type {(inputs: Admin_Nav_ChangelogInputs) => LocalizedString} */ () => {
+	return /** @type {LocalizedString} */ (`Modifications`)
+};
 /**
+ * | output |
+ * | --- |
+ * | "Changelog" |
+ *
  * @param {Admin_Nav_ChangelogInputs} inputs
+ * @param {{ locale?: "en" | "ru" | "id" | "pt" | "fr" }} options
+ * @returns {LocalizedString}
  */
 export const admin_nav_changelog = /** @type {((inputs?: Admin_Nav_ChangelogInputs, options?: { locale?: "en" | "ru" | "id" | "pt" | "fr" }) => LocalizedString) & import('../runtime.js').MessageMetadata<Admin_Nav_ChangelogInputs, { locale?: "en" | "ru" | "id" | "pt" | "fr" }, {}>} */ ((inputs = {}, options = {}) => {
 	const locale = experimentalStaticLocale ?? options.locale ?? getLocale()
 	if (locale === "en") return en_admin_nav_changelog(inputs)
@@ -18,4 +41,4 @@ export const admin_nav_changelog = /** @type {((inputs?: Admin_Nav_ChangelogInpu
 	if (locale === "id") return id_admin_nav_changelog(inputs)
 	if (locale === "pt") return pt_admin_nav_changelog(inputs)
 	return fr_admin_nav_changelog(inputs)
-});
+});
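The generated modules in this diff all share one dispatch shape: a per-locale constant for each of the five locales, plus an exported resolver in which an explicit `options.locale` wins over the runtime locale and the last branch (`fr`) is the fallback. A minimal runnable sketch of that pattern, with the `getLocale` and `experimentalStaticLocale` imports from `../runtime.js` replaced by hypothetical stubs:

```javascript
// Stubs standing in for the '../runtime.js' imports (assumptions, not the real API).
const experimentalStaticLocale = undefined; // a static locale, when set, overrides everything
const getLocale = () => "en";               // runtime locale detection stub

// One translation function per locale, as in the generated per-locale consts.
const translations = {
  en: () => `Changelog`,
  ru: () => `Изменения`,
  id: () => `Perubahan`,
  pt: () => `Alterações`,
  fr: () => `Modifications`,
};

// Mirrors the exported message function: static locale ?? explicit option ?? runtime,
// with the last locale acting as the fallback branch for unknown values.
const admin_nav_changelog = (inputs = {}, options = {}) => {
  const locale = experimentalStaticLocale ?? options.locale ?? getLocale();
  return (translations[locale] ?? translations.fr)(inputs);
};

console.log(admin_nav_changelog());                     // "Changelog" (runtime locale)
console.log(admin_nav_changelog({}, { locale: "pt" })); // "Alterações" (explicit override)
```

The fallback-to-`fr` branch matches the generated code, where the final `return fr_admin_nav_changelog(inputs)` runs for any locale not matched by the preceding `if` chain.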

Some files were not shown because too many files have changed in this diff.