Compare commits

...

8 Commits

Author SHA1 Message Date
Admin
9dae5e7cc0 fix(infra): add POCKET_TTS_URL to backend and runner services
Some checks failed
CI / Backend (push) Successful in 27s
Release / Test backend (push) Successful in 39s
CI / UI (push) Successful in 48s
Release / Check ui (push) Successful in 25s
CI / UI (pull_request) Successful in 25s
CI / Backend (pull_request) Successful in 45s
Release / Docker / caddy (push) Successful in 1m7s
Release / Docker / backend (push) Successful in 2m17s
Release / Docker / ui (push) Successful in 2m9s
Release / Docker / runner (push) Failing after 3m34s
Release / Gitea Release (push) Has been skipped
Backend was missing POCKET_TTS_URL entirely — pocketTTSClient was nil
so voices() only returned 67 Kokoro voices. Runner already had the var
via Doppler but it was absent from the compose environment block.

Also fix a stray leading space on the backend service's environment: key (a YAML parse error).

Verified: /api/voices now returns 87 voices (67 kokoro + 20 pocket-tts).
2026-03-28 20:54:04 +05:00
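The nil-client failure mode above can be sketched in Go (all type and function names here are hypothetical, not the backend's actual API): a provider whose URL was never configured is a nil client and contributes no voices, so the aggregate list silently shrinks.

```go
package main

import "fmt"

// ttsClient is an illustrative interface for a TTS voice provider.
type ttsClient interface {
	Voices() []string
}

type staticClient struct{ voices []string }

func (c *staticClient) Voices() []string { return c.voices }

// allVoices aggregates voices across providers, skipping nil clients —
// mirroring how a nil pocketTTSClient silently drops its 20 voices.
// (Note: a typed-nil *staticClient stored in the interface would slip
// past this guard — the classic Go nil-interface gotcha.)
func allVoices(clients ...ttsClient) []string {
	var out []string
	for _, c := range clients {
		if c == nil {
			continue
		}
		out = append(out, c.Voices()...)
	}
	return out
}

func main() {
	kokoro := &staticClient{voices: make([]string, 67)}
	var pocket ttsClient // nil: POCKET_TTS_URL absent, client never built
	fmt.Println(len(allVoices(kokoro, pocket))) // 67 — pocket-tts silently missing
	pocket = &staticClient{voices: make([]string, 20)}
	fmt.Println(len(allVoices(kokoro, pocket))) // 87 — once the env var is wired in
}
```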
Admin
908f5679fd fix(ui): defer catalogue filter navigation to explicit Apply button
Some checks failed
CI / Backend (push) Failing after 11s
Release / Check ui (push) Successful in 34s
Release / Test backend (push) Successful in 52s
CI / UI (push) Successful in 55s
Release / Docker / caddy (push) Successful in 48s
CI / UI (pull_request) Successful in 39s
CI / Backend (pull_request) Successful in 43s
Release / Docker / runner (push) Failing after 38s
Release / Docker / ui (push) Successful in 1m55s
Release / Docker / backend (push) Successful in 3m30s
Release / Gitea Release (push) Has been skipped
Removed onchange→navigateWithFilters from all three selects — the
immediate navigation on every change was the root cause of both bugs:
1. Filters applied before the user finished selecting all options.
2. Svelte 5 bind:value updates state after onchange fires, so the
   navigateWithFilters call read stale values → wrong URL params → no results.

Renamed navigateWithFilters to applyFilters (no overrides arg needed).
Added an amber Apply button next to Reset; selects now only update local
state until the user presses Apply.
2026-03-28 19:44:36 +05:00
Admin
f75292f531 fix(homelab): add Google + GitHub OAuth env vars to Fider service
All checks were successful
CI / Backend (pull_request) Successful in 45s
CI / UI (pull_request) Successful in 38s
2026-03-28 19:38:11 +05:00
Admin
2cf0528730 fix(ui): show version+SHA+build time in footer; fix env not reaching runtime image
Some checks failed
CI / Backend (push) Successful in 51s
CI / UI (push) Successful in 27s
Release / Docker / caddy (push) Failing after 23s
Release / Test backend (push) Successful in 38s
Release / Check ui (push) Successful in 39s
CI / Backend (pull_request) Successful in 28s
CI / UI (pull_request) Successful in 36s
Release / Docker / backend (push) Failing after 1m25s
Release / Docker / runner (push) Successful in 2m25s
Release / Docker / ui (push) Successful in 1m51s
Release / Gitea Release (push) Has been skipped
PUBLIC_BUILD_VERSION and PUBLIC_BUILD_COMMIT were set only in the builder
stage ENV — they were never re-declared in the runtime stage, so the Node
server started with them undefined and the badge always showed 'dev'.

Fix: re-declare all three ARGs after the second FROM and set runtime ENVs.
Add PUBLIC_BUILD_TIME (ISO timestamp from gitea.event.head_commit.timestamp)
injected via build-arg in release.yaml. Badge now shows e.g.:
  v2.3.9+abc1234 · 28 Mar 2026 14:30 UTC
2026-03-28 19:33:49 +05:00
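The ARG-scoping rule this commit fixes can be seen in a minimal two-stage Dockerfile sketch (image names illustrative): an ARG declared in one stage is not visible after the next FROM, so it must be re-declared before any runtime ENV can reference it.

```dockerfile
# ── builder stage ──────────────────────────────────────────
FROM node:22 AS builder
ARG BUILD_VERSION=dev            # scoped to this stage only
ENV PUBLIC_BUILD_VERSION=$BUILD_VERSION
# ... npm run build ...

# ── runtime stage ──────────────────────────────────────────
FROM node:22-slim
# ARG values do NOT cross the FROM boundary: without this
# re-declaration, $BUILD_VERSION expands to nothing here and
# the ENV below would bake in an empty string.
ARG BUILD_VERSION=dev
ENV PUBLIC_BUILD_VERSION=$BUILD_VERSION
```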
Admin
428b57732e fix(ui): resolve avatar URL from MinIO; fall back to OAuth provider URL
Some checks failed
CI / Backend (push) Successful in 50s
CI / UI (push) Successful in 34s
Release / Test backend (push) Successful in 37s
Release / Check ui (push) Successful in 45s
Release / Docker / caddy (push) Successful in 35s
CI / Backend (pull_request) Successful in 49s
CI / UI (pull_request) Successful in 40s
Release / Docker / backend (push) Failing after 52s
Release / Docker / runner (push) Failing after 58s
Release / Docker / ui (push) Failing after 55s
Release / Gitea Release (push) Has been skipped
Add resolveAvatarUrl(userId, storedValue) helper that tries MinIO first,
then falls back to the stored HTTP URL for OAuth users (Google/GitHub)
who have never uploaded a custom avatar.

Add getUserById() to pocketbase helpers for batch avatar resolution in
comments. Update all 6 call sites to use the new helper.
2026-03-28 19:23:30 +05:00
Admin
61e77e3e28 ci: remove Docker builds from CI; keep vet/build/test/type-check only
All checks were successful
CI / Backend (push) Successful in 43s
CI / UI (push) Successful in 37s
CI / UI (pull_request) Successful in 35s
CI / Backend (pull_request) Successful in 1m9s
Docker image builds belong to the release workflow (tag-triggered).
CI now runs: go vet, go build (backend/runner/healthcheck), go test,
svelte-check, and the UI Vite build — fast feedback without Docker overhead.
Also triggers on all branches, not just main/master.
2026-03-28 19:08:17 +05:00
Admin
b363c151a5 fix(ui): fix catalogue filters in Svelte 5; improve build badge visibility
Some checks failed
Release / Test backend (push) Successful in 40s
Release / Check ui (push) Successful in 49s
Release / Docker / caddy (push) Failing after 1m0s
CI / Test backend (pull_request) Successful in 40s
CI / Docker / caddy (pull_request) Failing after 21s
CI / Check ui (pull_request) Successful in 48s
Release / Docker / ui (push) Successful in 2m49s
Release / Docker / runner (push) Successful in 3m10s
Release / Docker / backend (push) Successful in 3m47s
Release / Gitea Release (push) Has been skipped
CI / Docker / ui (pull_request) Successful in 1m31s
CI / Docker / backend (pull_request) Successful in 3m13s
CI / Docker / runner (pull_request) Successful in 3m59s
- Catalogue filter selects: replace value= initializer with bind:value +
  onchange goto() so filters navigate immediately on change (no Apply button)
- Add selected= on each <option> for correct DOM initialisation in Svelte 5
- Build badge: give distinct bg-zinc-800 pill, visible zinc-300/400/500 text
  instead of zinc-700/zinc-800 which blended into the footer background
2026-03-28 19:01:58 +05:00
Admin
aef9e04419 fix(runner): harden catalogue scrape against 429s; disable sourcemap upload
Some checks failed
Release / Test backend (push) Successful in 29s
Release / Check ui (push) Successful in 43s
CI / Test backend (pull_request) Successful in 28s
Release / Docker / caddy (push) Successful in 1m8s
CI / Docker / caddy (pull_request) Failing after 39s
CI / Check ui (pull_request) Successful in 55s
Release / Docker / runner (push) Successful in 2m22s
Release / Docker / ui (push) Successful in 2m20s
CI / Docker / backend (pull_request) Successful in 2m32s
CI / Docker / ui (pull_request) Successful in 1m28s
CI / Docker / runner (pull_request) Successful in 2m25s
Release / Docker / backend (push) Successful in 2m33s
Release / Gitea Release (push) Failing after 2s
- scraper.go: ScrapeCatalogue now uses retryGet (9 attempts, 10s base) +
  500–1500ms inter-page jitter instead of bare GetContent. ScrapeMetadata
  also switched to retryGet so a single 429 on a book page is retried rather
  than aborting the whole refresh.
- catalogue_refresh.go: per-book delay is now configurable
  (RUNNER_CATALOGUE_REQUEST_DELAY, default 2s) + up to 50% random jitter
  applied before every metadata fetch. Only metadata is scraped here —
  chapters are fetched on-demand, not during catalogue refresh. Progress
  logged every 50 books instead of 100.
- config.go / runner.go / main.go: add CatalogueRequestDelay field wired
  from RUNNER_CATALOGUE_REQUEST_DELAY env var.
- release.yaml: comment out upload-sourcemaps job and remove it from the
  release needs; GlitchTip auth token needs refreshing after DB wipe.
2026-03-28 16:28:36 +05:00
20 changed files with 262 additions and 187 deletions

View File

@@ -2,20 +2,14 @@ name: CI
on:
push:
branches: ["main", "master"]
paths:
- "backend/**"
- "ui/**"
- "caddy/**"
- "docker-compose.yml"
- ".gitea/workflows/ci.yaml"
pull_request:
branches: ["main", "master"]
paths:
- "backend/**"
- "ui/**"
- "caddy/**"
- "docker-compose.yml"
- ".gitea/workflows/ci.yaml"
concurrency:
@@ -23,10 +17,13 @@ concurrency:
cancel-in-progress: true
jobs:
# ── backend: vet & test ───────────────────────────────────────────────────────
test-backend:
name: Test backend
# ── Go: vet + build + test ────────────────────────────────────────────────
backend:
name: Backend
runs-on: ubuntu-latest
defaults:
run:
working-directory: backend
steps:
- uses: actions/checkout@v4
@@ -36,16 +33,23 @@ jobs:
cache-dependency-path: backend/go.sum
- name: go vet
working-directory: backend
run: go vet ./...
- name: Build backend
run: go build -o /dev/null ./cmd/backend
- name: Build runner
run: go build -o /dev/null ./cmd/runner
- name: Build healthcheck
run: go build -o /dev/null ./cmd/healthcheck
- name: Run tests
working-directory: backend
run: go test -short -race -count=1 -timeout=60s ./...
# ── ui: type-check & build ────────────────────────────────────────────────────
check-ui:
name: Check ui
# ── UI: type-check + build ────────────────────────────────────────────────
ui:
name: UI
runs-on: ubuntu-latest
defaults:
run:
@@ -67,57 +71,3 @@ jobs:
- name: Build
run: npm run build
# ── docker: validate Dockerfiles build (no push) ──────────────────────────────
docker-backend:
name: Docker / backend
runs-on: ubuntu-latest
needs: [test-backend]
steps:
- uses: actions/checkout@v4
- uses: docker/setup-buildx-action@v3
- name: Build
uses: docker/build-push-action@v6
with:
context: backend
target: backend
push: false
docker-runner:
name: Docker / runner
runs-on: ubuntu-latest
needs: [test-backend]
steps:
- uses: actions/checkout@v4
- uses: docker/setup-buildx-action@v3
- name: Build
uses: docker/build-push-action@v6
with:
context: backend
target: runner
push: false
docker-ui:
name: Docker / ui
runs-on: ubuntu-latest
needs: [check-ui]
steps:
- uses: actions/checkout@v4
- uses: docker/setup-buildx-action@v3
- name: Build
uses: docker/build-push-action@v6
with:
context: ui
push: false
docker-caddy:
name: Docker / caddy
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- uses: docker/setup-buildx-action@v3
- name: Build
uses: docker/build-push-action@v6
with:
context: caddy
push: false

View File

@@ -136,52 +136,51 @@ jobs:
cache-to: type=inline
# ── ui: source map upload ─────────────────────────────────────────────────────
# Builds the UI with source maps and uploads them to GlitchTip so that error
# stack traces resolve to original .svelte/.ts file names and line numbers.
# Runs in parallel with docker-ui (both need check-ui to pass first).
upload-sourcemaps:
name: Upload source maps
runs-on: ubuntu-latest
needs: [check-ui]
defaults:
run:
working-directory: ui
steps:
- uses: actions/checkout@v4
- uses: actions/setup-node@v4
with:
node-version: "22"
cache: npm
cache-dependency-path: ui/package-lock.json
- name: Install dependencies
run: npm ci
- name: Build with source maps
run: npm run build
- name: Download glitchtip-cli
run: |
curl -L "https://gitlab.com/glitchtip/glitchtip-cli/-/jobs/artifacts/v0.1.0/raw/artifacts/glitchtip-cli-linux-x86_64?job=build-linux-x86_64" \
-o /usr/local/bin/glitchtip-cli
chmod +x /usr/local/bin/glitchtip-cli
- name: Inject debug IDs into build artifacts
run: glitchtip-cli sourcemaps inject ./build
env:
SENTRY_URL: https://errors.libnovel.cc/
SENTRY_AUTH_TOKEN: ${{ secrets.GLITCHTIP_AUTH_TOKEN }}
SENTRY_ORG: libnovel
SENTRY_PROJECT: libnovel-ui
- name: Upload source maps to GlitchTip
run: glitchtip-cli sourcemaps upload ./build --release ${{ gitea.ref_name }}
env:
SENTRY_URL: https://errors.libnovel.cc/
SENTRY_AUTH_TOKEN: ${{ secrets.GLITCHTIP_AUTH_TOKEN }}
SENTRY_ORG: libnovel
SENTRY_PROJECT: libnovel-ui
# Commented out: GlitchTip project/auth token needs to be recreated after
# the GlitchTip DB wipe. Re-enable once GLITCHTIP_AUTH_TOKEN is updated.
# upload-sourcemaps:
# name: Upload source maps
# runs-on: ubuntu-latest
# needs: [check-ui]
# defaults:
# run:
# working-directory: ui
# steps:
# - uses: actions/checkout@v4
#
# - uses: actions/setup-node@v4
# with:
# node-version: "22"
# cache: npm
# cache-dependency-path: ui/package-lock.json
#
# - name: Install dependencies
# run: npm ci
#
# - name: Build with source maps
# run: npm run build
#
# - name: Download glitchtip-cli
# run: |
# curl -L "https://gitlab.com/glitchtip/glitchtip-cli/-/jobs/artifacts/v0.1.0/raw/artifacts/glitchtip-cli-linux-x86_64?job=build-linux-x86_64" \
# -o /usr/local/bin/glitchtip-cli
# chmod +x /usr/local/bin/glitchtip-cli
#
# - name: Inject debug IDs into build artifacts
# run: glitchtip-cli sourcemaps inject ./build
# env:
# SENTRY_URL: https://errors.libnovel.cc/
# SENTRY_AUTH_TOKEN: ${{ secrets.GLITCHTIP_AUTH_TOKEN }}
# SENTRY_ORG: libnovel
# SENTRY_PROJECT: libnovel-ui
#
# - name: Upload source maps to GlitchTip
# run: glitchtip-cli sourcemaps upload ./build --release ${{ gitea.ref_name }}
# env:
# SENTRY_URL: https://errors.libnovel.cc/
# SENTRY_AUTH_TOKEN: ${{ secrets.GLITCHTIP_AUTH_TOKEN }}
# SENTRY_ORG: libnovel
# SENTRY_PROJECT: libnovel-ui
# ── docker: ui ────────────────────────────────────────────────────────────────
docker-ui:
@@ -219,6 +218,7 @@ jobs:
build-args: |
BUILD_VERSION=${{ steps.meta.outputs.version }}
BUILD_COMMIT=${{ gitea.sha }}
BUILD_TIME=${{ gitea.event.head_commit.timestamp }}
cache-from: type=registry,ref=${{ secrets.DOCKER_USER }}/libnovel-ui:latest
cache-to: type=inline
@@ -261,7 +261,7 @@ jobs:
release:
name: Gitea Release
runs-on: ubuntu-latest
needs: [docker-backend, docker-runner, docker-ui, docker-caddy, upload-sourcemaps]
needs: [docker-backend, docker-runner, docker-ui, docker-caddy]
steps:
- uses: actions/checkout@v4
with:

View File

@@ -152,6 +152,7 @@ func run() error {
OrchestratorWorkers: workers,
MetricsAddr: cfg.Runner.MetricsAddr,
CatalogueRefreshInterval: cfg.Runner.CatalogueRefreshInterval,
CatalogueRequestDelay: cfg.Runner.CatalogueRequestDelay,
SkipInitialCatalogueRefresh: cfg.Runner.SkipInitialCatalogueRefresh,
RedisAddr: cfg.Redis.Addr,
RedisPassword: cfg.Redis.Password,

View File

@@ -126,6 +126,11 @@ type Runner struct {
// is already indexed and a 24h walk would be wasteful.
// Controlled by RUNNER_SKIP_INITIAL_CATALOGUE_REFRESH=true.
SkipInitialCatalogueRefresh bool
// CatalogueRequestDelay is the base delay inserted between per-book metadata
// requests during a catalogue refresh. A random jitter of up to 50% is added
// on top. Defaults to 2s. Increase to reduce 429 pressure on novelfire.net.
// Controlled by RUNNER_CATALOGUE_REQUEST_DELAY (e.g. "3s", "500ms").
CatalogueRequestDelay time.Duration
}
// Config is the top-level configuration struct consumed by both binaries.
@@ -196,6 +201,7 @@ func Load() Config {
MetricsAddr: envOr("RUNNER_METRICS_ADDR", ":9091"),
CatalogueRefreshInterval: envDuration("RUNNER_CATALOGUE_REFRESH_INTERVAL", 0),
SkipInitialCatalogueRefresh: envBool("RUNNER_SKIP_INITIAL_CATALOGUE_REFRESH", false),
CatalogueRequestDelay: envDuration("RUNNER_CATALOGUE_REQUEST_DELAY", 2*time.Second),
},
Meilisearch: Meilisearch{

View File

@@ -13,6 +13,7 @@ import (
"errors"
"fmt"
"log/slog"
"math/rand"
"net/url"
"path"
"strconv"
@@ -55,6 +56,9 @@ func (s *Scraper) SourceName() string { return "novelfire.net" }
// ── CatalogueProvider ─────────────────────────────────────────────────────────
// ScrapeCatalogue streams all CatalogueEntry values across all catalogue pages.
// Each page fetch uses retryGet with 429-aware exponential backoff.
// A small inter-page delay (cataloguePageDelay) is inserted between requests to
// avoid hammering the server when paging through hundreds of catalogue pages.
func (s *Scraper) ScrapeCatalogue(ctx context.Context) (<-chan domain.CatalogueEntry, <-chan error) {
entries := make(chan domain.CatalogueEntry, 64)
errs := make(chan error, 16)
@@ -73,8 +77,18 @@ func (s *Scraper) ScrapeCatalogue(ctx context.Context) (<-chan domain.CatalogueE
default:
}
// Polite inter-page delay — skipped on the very first page.
if page > 1 {
jitter := time.Duration(500+rand.Intn(1000)) * time.Millisecond
select {
case <-ctx.Done():
return
case <-time.After(jitter):
}
}
s.log.Info("scraping catalogue page", "page", page, "url", pageURL)
raw, err := s.client.GetContent(ctx, pageURL)
raw, err := retryGet(ctx, s.log, s.client, pageURL, 9, 10*time.Second)
if err != nil {
errs <- fmt.Errorf("catalogue page %d: %w", page, err)
return
@@ -139,10 +153,11 @@ func (s *Scraper) ScrapeCatalogue(ctx context.Context) (<-chan domain.CatalogueE
// ── MetadataProvider ──────────────────────────────────────────────────────────
// ScrapeMetadata fetches and parses book metadata from the book's landing page.
// Uses retryGet with 429-aware exponential backoff (up to 9 attempts).
func (s *Scraper) ScrapeMetadata(ctx context.Context, bookURL string) (domain.BookMeta, error) {
s.log.Debug("metadata fetch starting", "url", bookURL)
raw, err := s.client.GetContent(ctx, bookURL)
raw, err := retryGet(ctx, s.log, s.client, bookURL, 9, 10*time.Second)
if err != nil {
return domain.BookMeta{}, fmt.Errorf("metadata fetch %s: %w", bookURL, err)
}

View File

@@ -6,17 +6,20 @@ package runner
//
// Design:
// - Runs on its own ticker (CatalogueRefreshInterval, default 24h) inside Run().
// - Also fires once on startup.
// - ScrapeCatalogue streams CatalogueEntry values over a channel — we iterate
// and call ScrapeMetadata for each entry.
// - Per-request random jitter (1–3s) prevents hammering novelfire.net.
// - Cover images are fetched from the URL embedded in BookMeta.Cover and
// stored in MinIO (browse bucket, key: covers/{slug}.jpg).
// - WriteMetadata + UpsertBook are called for every successfully scraped book.
// - Errors for individual books are logged and skipped; the loop continues.
// - The cover URL stored in BookMeta.Cover is rewritten to the internal proxy
// path (/api/cover/novelfire.net/{slug}) so the UI always fetches via the
// backend, which will serve from MinIO.
// - Also fires once on startup (unless SkipInitialCatalogueRefresh is set).
// - ScrapeCatalogue streams CatalogueEntry values over a channel — already has
// its own inter-page jitter + retryGet (see scraper.go).
// - Per-book: only metadata is scraped here (not chapters). Chapters are scraped
// on-demand when a user opens a book or via an explicit scrape task.
// - Between each metadata request a configurable base delay plus up to 50%
// random jitter is applied (CatalogueRequestDelay, default 2s). This keeps
// the request rate well below novelfire.net's rate limit even for ~15k books.
// - ScrapeMetadata itself uses retryGet with 429-aware exponential backoff
// (up to 9 attempts), so transient rate limits are handled gracefully.
// - Cover images are fetched and stored in MinIO on first sight; subsequent
// refreshes skip covers that already exist (CoverExists check).
// - Books already present in Meilisearch are skipped entirely (fast path).
// - Errors for individual books are logged and skipped; the loop never aborts.
import (
"context"
@@ -29,7 +32,7 @@ import (
// runCatalogueRefresh performs one full catalogue walk: scrapes metadata for
// every book on novelfire.net, downloads covers to MinIO, and upserts to
// Meilisearch. Errors for individual books are logged and skipped.
// Meilisearch. Individual book failures are logged and skipped.
func (r *Runner) runCatalogueRefresh(ctx context.Context) {
if r.deps.Novel == nil {
r.deps.Log.Warn("runner: catalogue refresh skipped — Novel scraper not configured")
@@ -40,8 +43,9 @@ func (r *Runner) runCatalogueRefresh(ctx context.Context) {
return
}
delay := r.cfg.CatalogueRequestDelay
log := r.deps.Log.With("op", "catalogue_refresh")
log.Info("runner: catalogue refresh starting")
log.Info("runner: catalogue refresh starting", "request_delay", delay)
entries, errCh := r.deps.Novel.ScrapeCatalogue(ctx)
@@ -51,26 +55,26 @@ func (r *Runner) runCatalogueRefresh(ctx context.Context) {
break
}
// Skip books already present in Meilisearch — they were indexed on a
// previous run. Re-indexing only happens when a scrape task is
// explicitly enqueued (e.g. via the admin UI or API).
// Fast path: skip books already indexed in Meilisearch.
if r.deps.SearchIndex.BookExists(ctx, entry.Slug) {
skipped++
continue
}
// Random jitter between books to avoid rate-limiting.
jitter := time.Duration(1000+rand.Intn(2000)) * time.Millisecond
// Polite delay between metadata requests: base + up to 50% jitter.
// This applies before every fetch so we never fire bursts.
jitter := time.Duration(rand.Int63n(int64(delay / 2)))
select {
case <-ctx.Done():
break
case <-time.After(jitter):
case <-time.After(delay + jitter):
}
// ScrapeMetadata internally retries on 429 with exponential back-off.
meta, err := r.deps.Novel.ScrapeMetadata(ctx, entry.URL)
if err != nil {
log.Warn("runner: catalogue refresh: metadata scrape failed",
"url", entry.URL, "err", err)
log.Warn("runner: catalogue refresh: metadata scrape failed — skipping book",
"slug", entry.Slug, "url", entry.URL, "err", err)
errCount++
continue
}
@@ -81,35 +85,32 @@ func (r *Runner) runCatalogueRefresh(ctx context.Context) {
// Persist to PocketBase.
if err := r.deps.BookWriter.WriteMetadata(ctx, meta); err != nil {
log.Warn("runner: catalogue refresh: WriteMetadata failed",
log.Warn("runner: catalogue refresh: WriteMetadata failed — skipping book",
"slug", meta.Slug, "err", err)
errCount++
continue
}
// Index in Meilisearch.
// Index in Meilisearch (non-fatal).
if err := r.deps.SearchIndex.UpsertBook(ctx, meta); err != nil {
log.Warn("runner: catalogue refresh: UpsertBook failed",
"slug", meta.Slug, "err", err)
// non-fatal — continue
}
// Download and store cover image in MinIO if we have a cover URL
// and a CoverStore is wired in.
// Download cover to MinIO if not already cached (non-fatal).
if r.deps.CoverStore != nil && originalCover != "" {
if !r.deps.CoverStore.CoverExists(ctx, meta.Slug) {
if err := r.downloadCover(ctx, meta.Slug, originalCover); err != nil {
log.Warn("runner: catalogue refresh: cover download failed",
"slug", meta.Slug, "url", originalCover, "err", err)
// non-fatal
}
}
}
ok++
if ok%100 == 0 {
if ok%50 == 0 {
log.Info("runner: catalogue refresh progress",
"scraped", ok, "errors", errCount)
"scraped", ok, "skipped", skipped, "errors", errCount)
}
}

View File

@@ -62,6 +62,10 @@ type Config struct {
// scrapes per-book metadata, downloads covers, and re-indexes everything in
// Meilisearch. Defaults to 24h (expensive — full catalogue walk).
CatalogueRefreshInterval time.Duration
// CatalogueRequestDelay is the base inter-request pause during a catalogue
// refresh metadata walk. Jitter of up to 50% is added on top.
// Defaults to 2s. Set via RUNNER_CATALOGUE_REQUEST_DELAY.
CatalogueRequestDelay time.Duration
// SkipInitialCatalogueRefresh suppresses the immediate catalogue walk that
// otherwise fires at startup. The periodic ticker (CatalogueRefreshInterval)
// still fires normally. Set RUNNER_SKIP_INITIAL_CATALOGUE_REFRESH=true for
@@ -145,6 +149,9 @@ func New(cfg Config, deps Dependencies) *Runner {
if cfg.CatalogueRefreshInterval <= 0 {
cfg.CatalogueRefreshInterval = 24 * time.Hour
}
if cfg.CatalogueRequestDelay <= 0 {
cfg.CatalogueRequestDelay = 2 * time.Second
}
if cfg.MetricsAddr == "" {
cfg.MetricsAddr = ":9091"
}

View File

@@ -154,7 +154,7 @@ services:
# No public port — all traffic is routed via Caddy.
expose:
- "8080"
environment:
environment:
<<: *infra-env
BACKEND_HTTP_ADDR: ":8080"
LOG_LEVEL: "${LOG_LEVEL}"
@@ -224,6 +224,7 @@ services:
# Kokoro-FastAPI TTS endpoint
KOKORO_URL: "${KOKORO_URL}"
KOKORO_VOICE: "${KOKORO_VOICE}"
POCKET_TTS_URL: "${POCKET_TTS_URL}"
GLITCHTIP_DSN: "${GLITCHTIP_DSN}"
OTEL_EXPORTER_OTLP_ENDPOINT: "${OTEL_EXPORTER_OTLP_ENDPOINT}"
OTEL_SERVICE_NAME: "runner"

View File

@@ -222,6 +222,10 @@ services:
EMAIL_SMTP_USERNAME: "${FIDER_SMTP_USER}"
EMAIL_SMTP_PASSWORD: "${FIDER_SMTP_PASSWORD}"
EMAIL_SMTP_ENABLE_STARTTLS: "false"
OAUTH_GOOGLE_CLIENTID: "${OAUTH_GOOGLE_CLIENTID}"
OAUTH_GOOGLE_SECRET: "${OAUTH_GOOGLE_SECRET}"
OAUTH_GITHUB_CLIENTID: "${OAUTH_GITHUB_CLIENTID}"
OAUTH_GITHUB_SECRET: "${OAUTH_GITHUB_SECRET}"
# ── Dozzle ──────────────────────────────────────────────────────────────────
# Watches both homelab and prod containers.

View File

@@ -14,10 +14,12 @@ COPY . .
# Build-time version info — injected by docker-compose or CI via --build-arg.
ARG BUILD_VERSION=dev
ARG BUILD_COMMIT=unknown
ARG BUILD_TIME=unknown
# Expose as PUBLIC_ env vars so SvelteKit's $env/dynamic/public can read them.
ENV PUBLIC_BUILD_VERSION=$BUILD_VERSION
ENV PUBLIC_BUILD_COMMIT=$BUILD_COMMIT
ENV PUBLIC_BUILD_TIME=$BUILD_TIME
RUN npm run build
@@ -40,5 +42,16 @@ ENV NODE_ENV=production
ENV PORT=3000
ENV HOST=0.0.0.0
# Carry build-time metadata into the runtime image so the UI footer can
# display the version, commit SHA, and build timestamp.
# These must be re-declared after the second FROM — ARG values do not
# cross stage boundaries, but ENV values set here persist at runtime.
ARG BUILD_VERSION=dev
ARG BUILD_COMMIT=unknown
ARG BUILD_TIME=unknown
ENV PUBLIC_BUILD_VERSION=$BUILD_VERSION
ENV PUBLIC_BUILD_COMMIT=$BUILD_COMMIT
ENV PUBLIC_BUILD_TIME=$BUILD_TIME
EXPOSE $PORT
CMD ["node", "build"]

View File

@@ -40,8 +40,8 @@ export async function presignAvatarUploadUrl(userId: string, mimeType: string):
}
/**
* Returns a presigned GET URL for a user's avatar, rewritten to the public URL.
* Returns null if no avatar exists.
* Returns a presigned GET URL for a user's avatar from MinIO.
* Returns null if no avatar object exists in MinIO for this user.
*/
export async function presignAvatarUrl(userId: string): Promise<string | null> {
const res = await backendFetch(`/api/presign/avatar/${encodeURIComponent(userId)}`);
@@ -54,6 +54,42 @@ export async function presignAvatarUrl(userId: string): Promise<string | null> {
return data.url ? rewriteHost(data.url) : null;
}
/**
* Resolves the best available avatar URL for a user.
*
* Priority:
* 1. MinIO — if the user has uploaded a custom avatar it will be found here
* (presigned, short-lived GET URL).
* 2. OAuth provider URL — stored in avatar_url when the account was created
* via Google / GitHub OAuth (e.g. https://lh3.googleusercontent.com/...).
* Returned as-is; the browser fetches it directly.
*
* Pass the raw `avatar_url` field from the PocketBase record as `storedValue`
* so this function can distinguish between a MinIO key and a remote URL without
* an extra DB round-trip.
*
* Returns null when neither source yields an avatar.
*/
export async function resolveAvatarUrl(
userId: string,
storedValue: string | null | undefined
): Promise<string | null> {
// 1. Try MinIO first (custom upload takes priority over OAuth picture).
try {
const minioUrl = await presignAvatarUrl(userId);
if (minioUrl) return minioUrl;
} catch {
// MinIO unavailable — fall through to OAuth fallback.
}
// 2. Fall back to OAuth-provided picture URL if it looks like a remote URL.
if (storedValue && storedValue.startsWith('http')) {
return storedValue;
}
return null;
}
/**
* Rewrites the MinIO host in a presigned URL to the public-facing URL.
*

View File

@@ -541,6 +541,19 @@ export async function getUserByUsername(username: string): Promise<User | null>
return listOne<User>('app_users', `username="${username.replace(/"/g, '\\"')}"`);
}
/**
* Look up a user by their PocketBase record ID. Returns null if not found.
*/
export async function getUserById(id: string): Promise<User | null> {
const token = await getToken();
const res = await fetch(`${PB_URL}/api/collections/app_users/records/${encodeURIComponent(id)}`, {
headers: { Authorization: `Bearer ${token}` }
});
if (res.status === 404) return null;
if (!res.ok) return null;
return res.json() as Promise<User>;
}
/**
* Look up a user by email. Returns null if not found.
*/

View File

@@ -433,17 +433,26 @@
<a href="/dmca" class="hover:text-zinc-500 transition-colors">DMCA</a>
<span>&copy; {new Date().getFullYear()} libnovel</span>
</div>
<!-- Build version / commit SHA -->
<div class="text-zinc-700 tabular-nums font-mono">
<!-- Build version / commit SHA / build time -->
{#snippet buildTime()}
{#if env.PUBLIC_BUILD_TIME && env.PUBLIC_BUILD_TIME !== 'unknown'}
{@const d = new Date(env.PUBLIC_BUILD_TIME)}
<span class="text-zinc-500" title="Build time">
· {d.toUTCString().replace(' GMT', ' UTC').replace(/:\d\d /, ' ')}
</span>
{/if}
{/snippet}
<div class="text-xs tabular-nums font-mono px-2 py-0.5 rounded bg-zinc-800 border border-zinc-700">
{#if env.PUBLIC_BUILD_VERSION && env.PUBLIC_BUILD_VERSION !== 'dev'}
<span title="Build version">{env.PUBLIC_BUILD_VERSION}</span>
<span class="text-zinc-300" title="Build version">{env.PUBLIC_BUILD_VERSION}</span>
{#if env.PUBLIC_BUILD_COMMIT && env.PUBLIC_BUILD_COMMIT !== 'unknown'}
<span class="text-zinc-800 select-all" title="Commit SHA"
<span class="text-zinc-500 select-all" title="Commit SHA"
>+{env.PUBLIC_BUILD_COMMIT.slice(0, 7)}</span
>
{/if}
{@render buildTime()}
{:else}
<span class="text-zinc-800">dev</span>
<span class="text-zinc-400">dev</span>
{/if}
</div>
</div>

View File

@@ -1,6 +1,7 @@
import { json, error } from '@sveltejs/kit';
import type { RequestHandler } from './$types';
import { getUserByUsername } from '$lib/server/pocketbase';
import { resolveAvatarUrl } from '$lib/server/minio';
/**
* GET /api/auth/me
@@ -13,10 +14,11 @@ export const GET: RequestHandler = async ({ locals }) => {
}
// Fetch full record from PocketBase to get avatar_url
const record = await getUserByUsername(locals.user.username).catch(() => null);
const avatarUrl = await resolveAvatarUrl(locals.user.id, record?.avatar_url).catch(() => null);
return json({
id: locals.user.id,
username: locals.user.username,
role: locals.user.role,
avatar_url: record?.avatar_url ?? null
avatar_url: avatarUrl
});
};

View File

@@ -5,9 +5,10 @@ import {
listReplies,
createComment,
getMyVotes,
getUserById,
type CommentSort
} from '$lib/server/pocketbase';
import { presignAvatarUrl } from '$lib/server/minio';
import { resolveAvatarUrl } from '$lib/server/minio';
import { log } from '$lib/server/logger';
/**
@@ -38,13 +39,15 @@ export const GET: RequestHandler = async ({ params, url, locals }) => {
replies: repliesPerComment[i]
}));
// Batch-resolve avatar presign URLs for all unique user_ids
// Batch-resolve avatar URLs for all unique user_ids
// MinIO first (custom upload), fall back to OAuth provider picture.
const allComments = [...topLevel, ...allReplies];
const uniqueUserIds = [...new Set(allComments.map((c) => c.user_id).filter(Boolean))];
const avatarEntries = await Promise.all(
uniqueUserIds.map(async (userId) => {
try {
const url = await presignAvatarUrl(userId);
const user = await getUserById(userId);
const url = await resolveAvatarUrl(userId, user?.avatar_url);
return [userId, url] as [string, string | null];
} catch {
return [userId, null] as [string, null];

View File

@@ -1,6 +1,6 @@
import { json, error } from '@sveltejs/kit';
import type { RequestHandler } from './$types';
import { presignAvatarUrl } from '$lib/server/minio';
import { presignAvatarUrl, resolveAvatarUrl } from '$lib/server/minio';
import { updateUserAvatarUrl, getUserByUsername } from '$lib/server/pocketbase';
import { backendFetch } from '$lib/server/scraper';
@@ -63,10 +63,6 @@ export const GET: RequestHandler = async ({ locals }) => {
if (!locals.user) error(401, 'Not authenticated');
const record = await getUserByUsername(locals.user.username).catch(() => null);
if (!record?.avatar_url) {
return json({ avatar_url: null });
}
const avatarUrl = await presignAvatarUrl(locals.user.id);
const avatarUrl = await resolveAvatarUrl(locals.user.id, record?.avatar_url).catch(() => null);
return json({ avatar_url: avatarUrl });
};

View File

@@ -1,7 +1,7 @@
import { json, error } from '@sveltejs/kit';
import type { RequestHandler } from './$types';
import { getPublicProfile, getSubscription } from '$lib/server/pocketbase';
-import { presignAvatarUrl } from '$lib/server/minio';
+import { resolveAvatarUrl } from '$lib/server/minio';
import { log } from '$lib/server/logger';
/**
@@ -15,11 +15,9 @@ export const GET: RequestHandler = async ({ params, locals }) => {
const profile = await getPublicProfile(username);
if (!profile) error(404, `User "${username}" not found`);
-// Resolve avatar presigned URL if set
+// Resolve avatar — MinIO first, fall back to OAuth provider picture
let avatarUrl: string | null = null;
-if (profile.avatar_url) {
-avatarUrl = await presignAvatarUrl(profile.id).catch(() => null);
-}
+avatarUrl = await resolveAvatarUrl(profile.id, profile.avatar_url).catch(() => null);
// Is the current logged-in user subscribed?
let isSubscribed = false;


@@ -1,5 +1,6 @@
<script lang="ts">
import { enhance } from '$app/forms';
+import { goto } from '$app/navigation';
import { navigating } from '$app/state';
import { untrack } from 'svelte';
import type { PageData, ActionData } from './$types';
@@ -7,6 +8,29 @@
let { data, form }: { data: PageData; form: ActionData } = $props();
+// ── Local filter state (mirrors URL params) ──────────────────────────────
+// These are separate from data.* so we can bind them to selects and keep
+// the DOM in sync. They sync back from data whenever a navigation completes.
+let filterSort = $state(untrack(() => data.sort));
+let filterGenre = $state(untrack(() => data.genre));
+let filterStatus = $state(untrack(() => data.status));
+// Keep local state in sync whenever SvelteKit re-runs the load (URL changed).
+$effect(() => {
+filterSort = data.sort;
+filterGenre = data.genre;
+filterStatus = data.status;
+});
+function applyFilters() {
+const params = new URLSearchParams();
+params.set('sort', filterSort);
+params.set('genre', filterGenre);
+params.set('status', filterStatus);
+params.set('page', '1');
+goto(`/catalogue?${params.toString()}`);
+}
// Track which novel card is currently being navigated to
let loadingSlug = $state<string | null>(null);
@@ -389,11 +413,11 @@
<select
id="filter-sort"
name="sort"
-value={data.sort}
+bind:value={filterSort}
class="bg-zinc-900 border border-zinc-700 text-zinc-200 text-sm rounded px-3 py-2 focus:outline-none focus:border-amber-400 w-full"
>
{#each sorts as s}
-<option value={s.value}>{s.label}</option>
+<option value={s.value} selected={s.value === filterSort}>{s.label}</option>
{/each}
</select>
</div>
@@ -403,12 +427,12 @@
<select
id="filter-genre"
name="genre"
-value={data.genre}
+bind:value={filterGenre}
disabled={isRankView}
class="bg-zinc-900 border border-zinc-700 text-zinc-200 text-sm rounded px-3 py-2 focus:outline-none focus:border-amber-400 disabled:opacity-40 disabled:cursor-not-allowed w-full"
>
{#each genres as g}
-<option value={g.value}>{g.label}</option>
+<option value={g.value} selected={g.value === filterGenre}>{g.label}</option>
{/each}
</select>
</div>
@@ -418,12 +442,12 @@
<select
id="filter-status"
name="status"
-value={data.status}
+bind:value={filterStatus}
disabled={isRankView}
class="bg-zinc-900 border border-zinc-700 text-zinc-200 text-sm rounded px-3 py-2 focus:outline-none focus:border-amber-400 disabled:opacity-40 disabled:cursor-not-allowed w-full"
>
{#each statuses as st}
-<option value={st.value}>{st.label}</option>
+<option value={st.value} selected={st.value === filterStatus}>{st.label}</option>
{/each}
</select>
</div>
@@ -438,8 +462,8 @@
Reset
</a>
<button
-type="submit"
-onclick={() => (filtersOpen = false)}
+type="button"
+onclick={applyFilters}
class="px-4 py-2 rounded bg-amber-400 text-zinc-900 text-sm font-semibold hover:bg-amber-300 transition-colors"
>
Apply


@@ -1,7 +1,7 @@
import { fail, redirect } from '@sveltejs/kit';
import type { Actions, PageServerLoad } from './$types';
import { changePassword, listUserSessions, getUserByUsername } from '$lib/server/pocketbase';
-import { presignAvatarUrl } from '$lib/server/minio';
+import { resolveAvatarUrl } from '$lib/server/minio';
import { log } from '$lib/server/logger';
export const load: PageServerLoad = async ({ locals }) => {
@@ -16,13 +16,11 @@ export const load: PageServerLoad = async ({ locals }) => {
log.warn('profile', 'listUserSessions failed (non-fatal)', { err: String(e) });
}
-// Fetch avatar presigned URL if user has one
+// Fetch avatar — MinIO first, fall back to OAuth provider picture
let avatarUrl: string | null = null;
try {
const record = await getUserByUsername(locals.user.username);
-if (record?.avatar_url) {
-avatarUrl = await presignAvatarUrl(locals.user.id);
-}
+avatarUrl = await resolveAvatarUrl(locals.user.id, record?.avatar_url);
} catch (e) {
log.warn('profile', 'avatar fetch failed (non-fatal)', { err: String(e) });
}


@@ -6,7 +6,7 @@ import {
getUserPublicLibrary,
getUserCurrentlyReading
} from '$lib/server/pocketbase';
-import { presignAvatarUrl } from '$lib/server/minio';
+import { resolveAvatarUrl } from '$lib/server/minio';
import { log } from '$lib/server/logger';
export const load: PageServerLoad = async ({ params, locals }) => {
@@ -15,11 +15,9 @@ export const load: PageServerLoad = async ({ params, locals }) => {
const profile = await getPublicProfile(username).catch(() => null);
if (!profile) error(404, `User "${username}" not found`);
-// Resolve avatar
+// Resolve avatar — MinIO first, fall back to OAuth provider picture
let avatarUrl: string | null = null;
-if (profile.avatar_url) {
-avatarUrl = await presignAvatarUrl(profile.id).catch(() => null);
-}
+avatarUrl = await resolveAvatarUrl(profile.id, profile.avatar_url).catch(() => null);
// Subscription state for the logged-in visitor
let isSubscribed = false;